Are you a spammer?

Please note that the first 3 posts you make will need to be approved by a forum Administrator or Moderator before they are publicly viewable.
Each application to join this forum is checked at the Stop Forum Spam website. If the email or IP address appears there when checked, you will not be allowed to join this forum.
If you get past this check and post spam on this forum, your posts will be immediately deleted and your account deactivated. You will then be banned, and your IP will be reported to your ISP to notify them of your spamming. So your spam links will be visible for an hour or two at most. In other words, don't waste your time and ours.

This forum is for the use and enjoyment of the members and visitors looking to learn about and share information regarding the topics listed. It is not a free-for-all advertising venue. Your time would be better spent pursuing legitimate avenues of promoting your websites.

Bob Loblaw's Law Blog

Love it or hate it, drop posts here
Forum rules
Comments or opinions expressed on this forum are those of their respective contributors only. The views expressed on this forum do not necessarily represent the views of Ultimate Edition, its management or employees. Ultimate Edition is not responsible for, and disclaims any and all liability for, the content of comments written by contributors to the forum.


Bob Loblaw's Law Blog

Postby Xanayoshi » Fri Mar 14, 2014 1:19 am

I wrote a paper for class today, and in doing the research I discovered that Watson, the one-time Jeopardy! contestant, was built on SUSE Linux Enterprise Server 11. How cool is that? By nature I am rather skeptical and always looking for the counter-argument. My thesis statement mostly concerned the intelligence question, and I took the non-intelligence route, basically arguing that names like "Cognitive Computer" and "Neurosynaptic Chip" are more implication than description of what is actually occurring.

This does not mean that Watson is not incredibly intriguing, nor that I find this technology without value. I thought it was cool as hell. My argument concerned the overall effectiveness of the tool. I should point out that I was not especially firm in my stance and have no real argument to offer; this was just the only topic in the assigned anthology that I found interesting.

I encourage anyone to peruse this topic, however.

http://www.forbes.com/sites/ibm/2014/02 ... era-of-it/

And now, the completely true facts, as told by an anonymous man hiding behind a screen name:

Packard Bell Pack Mate II 286 Intel 80286 1MB RAM
Xanayoshi
Moderator
 
Posts: 1564
Joined: Thu Oct 18, 2012 1:46 pm
Location: Kitsap County
Age: 45
Operating System: Ultimate Edition 3.4 32 BIT



Re: Bob Loblaw's Law Blog

Postby pam » Fri Mar 14, 2014 2:36 am

Interesting indeed.
The early '90s saw a boom in alternative computing.
Everyone knew then, and knows now, that hardware offers little to no benefit for cognitive computing; simply put, it still sits right there in the future.
There are many names and methodologies which, as you put it, mean gibberish in reality, e.g. DNA computing, quantum computing (qubits), neural networks (circuits built to emulate neurons and synapses), grid and parallel computing, GPGPU, and so on.

IBM backed out of the real computing (hardware) business long ago. They now lay down standards and pioneer the next generation, but yet again, hardware means nothing. It's all quasi-intelligence: not the hardware but the software that understands, rewiring the hardware it runs on to get better inter- and intra-node throughput.

Nobody can predict two years from now. I can't believe it: two years today equals twenty years of development in the past.
AMD has taken the first step, along with the ARM IP licensees.
AMD's HSA (Heterogeneous System Architecture) is just another way of doing things smartly.

It's exactly like Skynet in Terminator: I for sure won't see it coming. 1 Tbps Ethernet is already under way..
http://www.extremetech.com/computing/13 ... s-ethernet
Huawei already has 1 Tbps industrial routers.

DOCSIS 3.0 networks are needed for 4K/8K content, with zettabytes of transfers and minimum net speeds of 50 Mbps. A 1 GiB/s transfer rate will be a laughing matter by 2020. Server network transfer is already faster than an HDD head can write to the platter; that's where RAID comes in. SSDs are a certainty, and HDDs will soon be long gone...
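Back-of-the-envelope, with my own round numbers rather than anything from a spec sheet: a 1 Tbps link carries 1000 Gbps / 8 = 125 GB/s, while a single 7200 rpm HDD sustains sequential writes of roughly 0.15 GB/s. That is 125 / 0.15 ≈ 830 disks' worth of write bandwidth behind one link, which is exactly why striping across a RAID array (or moving to SSDs) is the only way the storage side keeps up.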

Sci-fi idiocy. Reality, but idiocy nonetheless.
One day data transfer will be fast enough to discover the Higgs boson. Fast enough to use photons on home networks for line-of-sight laser data transmission (already done; you could build one yourself!). Fast enough to dematerialize you so you reach work on time. :roll:
DONOTSPAMORTROLL:
http://forumubuntusoftware.info/viewtopic.php?f=9&t=11
Download Ultimate Edition and Oz Unity 3.0 from copy:-
https://www.copy.com/s/oBnDBsDOvxF8jW1EuLKM/Public
Download Ultimate Edition from sourceforge:-
http://sourceforge.net/projects/ultimat ... rce=navbar
Download Oz Unity 3.0 from sourceforge:-
http://sourceforge.net/projects/ueoz/files/UEOz/
Download Ultimate Edition torrents from linuxtracker:-
http://linuxtracker.org/index.php?page= ... 0&active=1
Download Oz Unity 3.0 torrents from linuxtracker:-
http://linuxtracker.org/index.php?page= ... 0&active=1
Visit: http://www.ultimateeditionoz.com
pam
Site Admin
 
Posts: 1087
Joined: Wed May 25, 2011 5:56 am
Location: India
Age: 38
Operating System: Ultimate Edition 3.5 64 BIT



Re: Bob Loblaw's Law Blog

Postby Xanayoshi » Fri Mar 14, 2014 2:17 pm

I almost wish AMD and Intel would get together in some way. I love the Atom processor, and Intel is making decent processors; after all, the fastest supercomputer in the world is built from them (sorry, I reallllly reallly realllllly want to build a cluster).

But yes...it is certainly hard to predict where everything is heading.

A.I. is problematic at the root: if we do not fully understand the brain's functions, we cannot replicate them.

Mountain Dew can be called extreme... but is it? Is it really? Is a cognitive computer truly cognitive?

One day perhaps I can play with the Watson API or a cognitive equivalent. I feel that until I have my hands on something, I will not know it.

It is also perhaps not a good thing. In America we are already very dependent on easy-access technology (Apple) that is very restrictive in use (Apple), with certain pioneers (Steve Wozniak: “People don’t really choose their smartphones based on features. I think Apple is superior at being able to say no.”) who think the future of technology should be simple and feature-free. That seems pretty ridiculous considering their empire was built on features that were largely unavailable at the time. Like it or not, the iPod pushed us in a certain direction, and the iPhone sealed it. I realise they did not do this tech first, but they did change the landscape.

A natural-language machine would only serve to make those who do not understand technology more subservient to it. It is ridiculous that they do not teach standard computing in schools here, as this is the technology their lives revolve around in every conceivable way.



Re: Bob Loblaw's Law Blog

Postby pam » Sat Mar 15, 2014 12:00 am

You read my mind, Xanayoshi.

Xanayoshi wrote:(sorry, I reallllly reallly realllllly want to build a cluster)


Sometimes when I wake up, that's all I want to do. You can try Rocks Cluster; it's based on CentOS, which is rebuilt from RHEL.
http://www.rocksclusters.org/wordpress/?page_id=80
It has a no-nonsense setup process.

I've tried Beowulf, but unless you know how to program in C, you'll end up broke.

My idea of a local cluster/cloud/grid is basically wirelessly networking all devices at the hardware level. Of course, the networking equipment, protocols and the rest of the stack would need layered security; otherwise people would be able to inject code directly into the CPU...

With that in place, all your ARM, AMD, Intel and MIPS (router) CPUs, memory and graphics would automatically form a local grid, virtualizing the combined CPU and GPU power. Clusters already do something like this using MPI (the Message Passing Interface). No, MPI won't let you run GTA V at 2000 fps; that's just an imagined state -- but what if you could?
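To make that concrete, here is about the smallest possible MPI program, in C. A sketch only: it assumes an MPI implementation such as Open MPI or MPICH is installed, and that you build with mpicc hello_mpi.c -o hello_mpi and launch with mpirun -np 4 ./hello_mpi.

/* hello_mpi.c -- every node in the cluster runs a copy of this binary */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, size;
    MPI_Init(&argc, &argv);               /* join the job spanning all nodes */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank); /* which process am I? */
    MPI_Comm_size(MPI_COMM_WORLD, &size); /* how many of us are there? */
    printf("hello from rank %d of %d\n", rank, size);
    MPI_Finalize();                       /* leave the job cleanly */
    return 0;
}

Everything beyond this is ranks exchanging buffers with MPI_Send/MPI_Recv; that message passing is what lets a pile of commodity boxes behave like one machine.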

A.I. is an almost elusive concept. Again, it's just a matter of time; it sits right there in the future. To my mind, AI is highly subjective, not something reached by the active intention of building advanced networks and systems. It'll happen when it needs to.

Apple is an a** of a company, but they have opened up even more since the passing of Jobs.
They created OpenCL (GPGPU compute) to counter Nvidia's proprietary CUDA.
Then they handed it to the Khronos Group as an open, royalty-free standard.
AMD's entire business now depends on OpenCL and HSAIL; HSA uses OpenCL to a broad extent. ARM's (Android's) biggest strength is OpenGL ES and OpenCL. What Windows and Linux desktops could never achieve, Android has done in less than five years on underpowered ARM chips.
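For a taste of what that GPGPU compute actually looks like, here is a minimal OpenCL kernel; the kernel language is a C dialect, which is a big part of why it runs across AMD, Nvidia and ARM GPUs. Just a sketch: the host side still has to create a context, build this source and enqueue it with clEnqueueNDRangeKernel.

/* vadd.cl -- one work-item per array element, scheduled by the GPU */
__kernel void vadd(__global const float *a,
                   __global const float *b,
                   __global float *c)
{
    int i = get_global_id(0); /* my index in the global work range */
    c[i] = a[i] + b[i];       /* thousands of these run in parallel */
}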

Apple uses LLVM (Clang) to build its programs, kernels, etc. Richard Stallman has come out against Clang because it uses a permissive license that, in his view, misinterprets the meaning of freedom... but that's another story. Many developers now want to compile the Linux kernel with Clang instead of GCC. Gallium is built around LLVM, which is not GPL-licensed...

AMD has released the HSA SDK for Linux (GCC): http://www.hsafoundation.com/hsa-developer-tools/

So, for now, any exclusivity claimed for AI, cognitive computing, etc. can be passed off as compute porn.

Xanayoshi wrote:A natural-language machine would only serve to make those who do not understand technology more subservient to it. It is ridiculous that they do not teach standard computing in schools here, as this is the technology their lives revolve around in every conceivable way.


That is an argument, and an internal conflict, that I first encountered in high school while reading up on quantum-well transistors, quantum chromodynamics and string theory :roll: at the library. Back then I was highly into anything quantum and memorized Q is for Quantum by John Gribbin ... http://www.amazon.com/IS-FOR-QUANTUM-En ... 4863154....

Today, languages like Python, Haskell and C# and their vast IDEs (Geany, nano and vim are the best!) have changed the way programmers program.
Python can arguably be called the most advanced of them; it can even be used to write MPI programs. No, it does not let you control the CPU directly, but you can write almost anything else, limited only by imagination.

The problem is not about teaching languages. The teaching staff would have to be much, much more adept at coding, on top of a multitude of other factors (real problems). Professors who teach high-level Unix material work for many companies, and unless paid well they won't care, at least while someone else is willing to work for less (as in Asian countries)... Programming languages today are like 386-generation technology; in about 40 years you might program in English, and may the best Oxford graduate write the best programs. :lol: Then again, Google has done amazing things in speech recognition, and speech recognition is one of AMD's major goals for its APUs.

More than a decade ago, when I was playing with C, assembly language was the only way to truly control a CPU, even though C was the mainstream choice.
I played with mnemonic code on Windows and Linux and it was fun, but I soon realised that writing a kernel in such a language would take nearly 10-15 years. Of course, unless C or C++ were hardcoded into the chip itself, you could always build a higher-level language to program a CPU with... and then multicores came. ARM started adding encoding circuits directly into its chips, unlike Intel, which would sell chips by just upping the clock speeds. Today's chips have hardware H.264 encoding and advanced instruction-set extensions built in: 256-bit AVX, AES-NI, the execute-disable bit for security, and so on.

The best way to see how a computer "understands" language is to take raw C source (a .c file) and pass it through a compiler. Alongside the a.out executable you can keep the intermediate assembly and machine-code object files, which read as gibberish to us but are exactly what goes on inside the CPU. Modern compilers will still show you every stage if you ask, as in the sketch below...
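A minimal sketch, assuming stock GCC (Clang accepts the same flags):

/* hello.c */
#include <stdio.h>

int main(void)
{
    printf("hello\n");
    return 0;
}

/* Each stage of the translation:
 *   gcc -S hello.c    -> hello.s, human-readable assembly mnemonics
 *   gcc -c hello.c    -> hello.o, raw machine code (an object file)
 *   gcc hello.c       -> a.out, the linked executable
 *   objdump -d a.out  -> disassembles the machine code back to mnemonics
 */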

