Posted

Software, IMO. When it comes to the impact of computers on our daily lives (which is ultimately what we're talking about with the concept of the "singularity"), software has become far more important than hardware. The processing power of a PC and the demand for personal data storage have leveled off, and the focus has shifted to mobility and lower cost.

 

But over the same period both the Internet and individual programs have become a lot more useful. Maps and GPS devices, localized resource searching (I'm driving down I-75... where's the nearest grocery store?), and the plethora of business portals and portal-based resource products have overhauled the entire Web in terms of usefulness. We've slammed right through "Web 2.0" and are hard-charging toward 3.0 and 4.0. Meanwhile, Apple says the iPhone has passed a billion app downloads. Who would have thought that a simple telephone could be THAT useful just 2-3 years ago?

 

That's not to say that hardware is no longer developed or that it's no longer significant. I daily bemoan the loss of prominence of computer engineering programs and students interested in pursuing them -- that's going to come back to haunt us. But data-driven software is what's changing people's lives.

 

Just my humble opinion, of course.

Posted

 

So, you think intelligence will come about as software on a Turing machine rather than a hardware analog to the neocortex? Either way, I think we're still a fair way off. At least another 15 years, IMO.

Posted

I think any such concept is sheer speculation, but insofar as such speculations have value, my feeling is that the rising complexity, ease of use, and ease of construction of software is what leads to the singularity.

 

Processing horsepower is just the engine. Maybe the engine needs to get more powerful, I don't know. But complex software, made easier to develop and packaged with a strong motivation to develop it (e.g. iPhone apps), is the GPS nav, the steering wheel, and the five-speed transmission.

Posted

Some background:

 

http://en.wikipedia.org/wiki/Technological_singularity

 

Some better background:

 

http://www-rohan.sdsu.edu/faculty/vinge/misc/singularity.html

 

Vernor Vinge argues there are two ways the Singularity could happen (and they aren't mutually exclusive):

 

  • Artificial Intelligence (AI): computers become smarter than humans
  • Intelligence Amplification (IA): technologies emerge that make the humans who use them much smarter than present-day humans

 

I think hardware is the limiting factor in either of these.

 

It will take a good 20 years before Moore's Law makes it reasonable for a large scientific organization to purchase a supercomputer powerful enough to simulate the entire human brain, even at slower-than-realtime speeds (but fast enough to be useful).

 

I think software will *become* the limiting factor once the hardware is fast enough, but right now, for all intents and purposes, AI researchers are waiting on hardware that's a few orders of magnitude faster than what we have today.
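
As a rough, illustrative sketch of that timeline (the whole-brain compute requirement, the current supercomputer figure, and the doubling period below are assumed round numbers chosen only to make the arithmetic concrete):

# Back-of-envelope check of the "about 20 years of Moore's Law" guess.
# All three inputs are assumptions, purely for illustration.
import math

brain_ops_per_sec = 1e18        # assumed cost of a useful whole-brain simulation
supercomputer_ops_today = 1e15  # assumed ~1 petaflop top machine today
doubling_period_years = 2.0     # assumed Moore's Law doubling period

speedup_needed = brain_ops_per_sec / supercomputer_ops_today
years_needed = doubling_period_years * math.log2(speedup_needed)
print(f"~{speedup_needed:.0f}x speedup -> roughly {years_needed:.0f} years")
# prints: ~1000x speedup -> roughly 20 years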

 

As far as Intelligence Amplification goes, brain-computer interfaces, another hardware technology, seem to be the main form this will take, and they're actually progressing much faster than I thought:

 

http://www.newscientist.com/article/dn17009-innovation-mindreading-headsets-will-change-your-brain.html

Posted

Is speed the limiting factor? I thought that computers were orders of magnitude faster than the brain.

Posted

 

Our brains are made of roughly a hundred billion neurons. A modern-day CPU is incomprehensibly faster than an individual neuron, but neurons outpace modern CPUs in sheer numbers. Nature came up with very good ways of building self-similar, massively parallel systems, a problem CPU designers are only starting to focus on.

 

As things stand, CPU designers have maxed out the transistors they can usefully devote to optimizing sequential execution. The new trend is to use simpler CPU cores and dedicate the extra transistors that Moore's Law affords over time to duplicating those cores and interconnecting them in a mesh network.

 

As time progresses, computers are going to get a lot closer to the levels of parallelism we see in the human brain. Supercomputers will have hundreds, thousands, millions of CPU cores per logical "node", whereas today at the top end they may have 16 or 32.
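
A toy sketch of that shift, in Python: the same "neuron" update done sequentially and then split across several simpler workers. The update rule, array size, and core count are made up purely for illustration.

# Toy illustration of trading one fast sequential pass for many parallel workers.
# The "update" is a made-up decay rule; only the structure matters here.
import numpy as np
from multiprocessing import Pool

def update_chunk(chunk):
    return chunk * 0.9  # stand-in for whatever work each "neuron" needs

def update_sequential(activations):
    return update_chunk(activations)

def update_parallel(activations, cores=8):
    # Split the neurons across cores, mirroring the many-simple-cores trend.
    chunks = np.array_split(activations, cores)
    with Pool(cores) as pool:
        return np.concatenate(pool.map(update_chunk, chunks))

if __name__ == "__main__":
    neurons = np.random.rand(1_000_000)
    assert np.allclose(update_sequential(neurons), update_parallel(neurons))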

Posted (edited)
Will it happen primarily via hardware or software? Why?

 

It will happen with software running on hardware. I think the largest factor is hardware, if nothing else because hardware is improving exponentially. It could well be that this hardware is our own brain, given advances in biotechnology.


Merged post follows:

Consecutive posts merged
Is speed the limiting factor? I thought that computers were orders of magnitude faster than the brain.

 

Computers work by doing a few calculations per cycle, with cycles of about one billionth of a second. Our neurons have far slower cycles, but we have more neurons. Far more. And they are connected in 3D, unlike current 2D chips.

 

http://en.wikipedia.org/wiki/Neuron

The human brain has a huge number of synapses. Each of the 10^11 (one hundred billion) neurons has on average 7,000 synaptic connections to other neurons. It has been estimated that the brain of a three-year-old child has about 10^15 synapses (1 quadrillion). This number declines with age, stabilizing by adulthood. Estimates vary for an adult, ranging from 10^14 to 5 x 10^14 synapses (100 to 500 trillion).[14]
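
Putting those quoted figures next to a single CPU core (the average event rate and the instructions-per-cycle figure below are assumptions for illustration, not from the article):

# Rough comparison of the quoted synapse counts with one modern CPU core.
# The 100 Hz event rate and 4 ops/cycle are assumed values.
synapses = 1e14               # adult brain, low end of the quoted range
event_rate_hz = 100           # assumed average rate of synaptic events
brain_events_per_sec = synapses * event_rate_hz    # ~1e16

cpu_clock_hz = 3e9            # ~3 GHz core
ops_per_cycle = 4             # assumed instructions retired per cycle
cpu_ops_per_sec = cpu_clock_hz * ops_per_cycle     # ~1.2e10

print(f"brain ~{brain_events_per_sec:.0e} events/s vs CPU ~{cpu_ops_per_sec:.0e} ops/s")
print(f"ratio ~{brain_events_per_sec / cpu_ops_per_sec:.0e}")
# ratio ~8e+05: the brain's advantage is parallel count, not per-element speed.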

Edited by Mr Skeptic
Consecutive posts merged.
