28 March 2011

The Singularity -- Is Strong AI enough?

Kurzweil suggests that a computer with super-human intelligence is enough to lead us to a technological and economic singularity.  However, we humans already have lots of experience with super-human intelligence.  Get two or more people together to plan and execute an operation, and you're dealing with super-human intelligence.  When NASA put a man on the moon, that was done by a team of roughly 400,000 humans working together.

So, say we build an exaflops computer and ask it to design the next generation of computer.  Well, that exaflops computer has somewhere around 1,000 to 10,000 times as much processing power as a human -- which is to say, roughly the collective processing power Intel already applies to a chip design.  So it isn't going to design a much better computer than Intel is currently capable of designing.
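
As a back-of-the-envelope check on that factor (the 1e14 to 1e15 operations-per-second brain estimates below are assumptions on my part, not figures from the argument above):

    # Back-of-the-envelope: how many "human brains" is an exaflops machine?
    # Brain throughput estimates vary widely; 1e14-1e15 ops/s is an assumed range.
    exaflops = 1e18                       # ops per second for the hypothetical machine
    brain_low, brain_high = 1e14, 1e15    # assumed ops per second for one human brain

    print(f"Ratio, generous brain estimate: {exaflops / brain_high:,.0f}x")   # ~1,000x
    print(f"Ratio, stingy brain estimate:   {exaflops / brain_low:,.0f}x")    # ~10,000x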

Suppose Intel buys an exaflops computer for each of the humans it hires, raising the available processing power at Intel by a factor of about 1,000.  Can Intel now design in one day what would normally take 3 years to design?  In _The Mythical Man-Month_, Brooks strongly suggests that adding people to a project increases the communication overhead to such an extent that the extra people are not worthwhile.
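
As a rough sketch of why a 1,000x increase in processing power need not translate into a 1,000x schedule compression, here is an Amdahl's-law style calculation; the 5% serial fraction is purely an assumed number for illustration:

    # Sketch: Amdahl's law applied to a 3-year design schedule.
    # The 5% serial (non-parallelizable) fraction is an illustrative assumption.
    project_days = 3 * 365        # ~1,095 days at today's effective staffing
    speedup_ideal = 1000          # an exaflops machine per engineer: ~1,000x more processing power
    serial_fraction = 0.05        # assumed share of the work that cannot be parallelized

    # Amdahl's law: effective speedup = 1 / (serial + parallel / N)
    speedup_real = 1 / (serial_fraction + (1 - serial_fraction) / speedup_ideal)

    print(f"Perfectly parallel:   {project_days / speedup_ideal:.1f} days")   # ~1.1 days
    print(f"With 5% serial work:  {project_days / speedup_real:.0f} days")    # ~56 days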

So the interesting thing isn't that we are building boxes with far more processing power than a brain.  It's that we are building high-bandwidth digital communications between and within those boxes.

Of course, breaking up a complex design task into numerous small pieces is hard, and getting the equivalent of 1,000,000 human brains to produce a new chip design in a day might require more parallelism than we can obtain.
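
Brooks's communication-overhead point can be made concrete by counting pairwise communication paths; the team sizes below are just illustrative:

    # Brooks: with n workers there are n*(n-1)/2 potential communication paths,
    # so coordination cost grows roughly quadratically while raw effort grows linearly.
    def comm_paths(n: int) -> int:
        return n * (n - 1) // 2

    # A small team, the Apollo-scale workforce, and the hypothetical brain-equivalent.
    for n in (10, 400_000, 1_000_000):
        print(f"{n:>9,} workers -> {comm_paths(n):,} pairwise paths")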

Another constraint may arise from needing to interact with the physical world.  It's not just designing a chip; it's waiting for the chip manufacturing machines to be designed and built.  Prototype chips need to be built and tested to verify the models of physical reality that the design software is using.
