So I'd say that the Singularity is pretty much inevitable at this point, barring some global catastrophe. The question of course is how soon will this take place.
The basis for a technological singularity is the creation of a superhuman intelligence, which in some sense we have already done (human brains have been estimated to have the equivalent of 100 teraflops of computing power; IBM's Blue Gene/L hit 280 teraflops last year). It's not so much a matter of processing power as programming, though, which is why Blue Gene/L hasn't already taken over the world. So the singularity could happen pretty soon, depending on whether someone writes the right program. Or it could come another way, such as from the creation of nanobots that swarm together to become one intelligent entity, which will take a bit longer, since we don't really have nanotechnology yet.
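Just to put those two numbers side by side (keeping in mind both are rough, contested estimates, not hard facts), here's a quick back-of-the-envelope comparison:

```python
# Back-of-the-envelope comparison of the figures quoted above.
# Both values are rough estimates, not measured facts.
brain_flops = 100e12        # ~100 teraflops: one common estimate for the human brain
blue_gene_l_flops = 280e12  # ~280 teraflops: Blue Gene/L's reported peak

ratio = blue_gene_l_flops / brain_flops
print(f"Blue Gene/L vs. brain estimate: {ratio:.1f}x")  # → 2.8x
```

So by that (very loose) yardstick, the hardware is already a few times past the brain, which is exactly why the bottleneck looks like software, not silicon.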
What do you guys think?