Unsurprisingly, given my vast interest in all things artificial intelligence, this news story leaped out at me from the BBC News Technology page:
Machines 'to match man by 2029'
The point at which machines reach the same level of intelligence as man is known as the "singularity", and it's something I've been hearing a lot about lately. It was mentioned in last week's Skeptics' Guide podcast, in the context of the crazy "Mayan Apocalypse 2012" topic; apparently one of the ways in which the world might end is through reaching this singularity. Which actually links to the second way it came to my attention - through the Terminator spin-off series The Sarah Connor Chronicles. In both of these incarnations, the singularity is a Very Bad Thing. See also the original Matrix premise regarding the apocalypse brought about by the war between humanity and A.I.
But of course it need not be. The article regarding the possible singularity in 2029 is, it should be stressed, just a prediction - albeit a prediction from a leading expert in the field. I see no reason why his prediction should be wide of the mark; the technology in this area is advancing at a tremendous rate of knots. Reverse-engineering the brain (presumably the human brain) has been identified as one of the 14 major technological challenges facing humanity in this still-young century.
Even if the singularity is not reached by 2029, two things are clear to me: firstly, it is inevitable for as long as research is done and progress is made; secondly, there's going to be an awful lot of very cool stuff going on by that point - much of which is discussed in that BBC article. Nanobots in particular stand out, helping to improve our own intelligence, fight disease, and enhance virtual reality.
As far as I'm concerned, the singularity is coming. I'm personally hoping that it arrives sooner rather than later, because that will be a truly interesting time to be alive.