These authors dismiss the possibility. They sneer at Ray Kurzweil's careful calculations of accelerating technological change. They declare that human biology has a level of sophistication that robots won't come close to matching any time soon.
First they present Kurzweil's argument, namely that computer technology is rapidly advancing and will soon sweep past the limits of biology:
One cubic centimeter of human brain tissue, which would fill a thimble, contains 50 million neurons; several hundred miles of axons, the wires over which neurons send signals; and close to a trillion (that’s a million million) synapses, the connections between neurons.
... That thimble [of brain tissue] would then contain 1,000 gigabytes (1 terabyte) of information. A thousand thimblefuls make up a whole brain, giving us a million gigabytes — a petabyte — of information. To put this in perspective, the entire archived contents of the Internet fill just three petabytes.
...Kurzweil invokes Moore’s Law, the principle that for the last four decades, engineers have managed to double the capacity of chips (and hard drives) every year or two. If we imagine that the trend will continue, it’s possible to guess when a single computer the size of a brain could contain a petabyte. That would be about 2025 to 2030, just 15 or 20 years from now.
This projection overlooks the dark, hot underbelly of Moore’s law: power consumption per chip, which has also exploded since 1985. By 2025, the memory of an artificial brain would use nearly a gigawatt of power, the amount currently consumed by all of Washington, D.C. So brute-force escalation of current computer technology would give us an artificial brain that is far too costly to operate.
Compare this with your brain, which uses about 12 watts, an amount that supports not only memory but all your thought processes. This is less than the energy consumed by a typical refrigerator light, and half the typical needs of a laptop computer. Cutting power consumption by half while increasing computing power many times over is a pretty challenging design standard. As smart as we are, in this sense we are all dim bulbs.
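To see where that "15 or 20 years" figure comes from, here is a rough back-of-envelope sketch in Python. The brain-capacity and power numbers come straight from the quoted passage; the 1-terabyte starting point and the 1.5-to-2-year doubling period are my own illustrative assumptions, not figures given by Kurzweil or the article.

    import math

    # Numbers quoted above: ~1 TB per cubic centimeter of brain tissue,
    # ~1000 cubic centimeters per brain, so ~1 petabyte for a whole brain.
    TB = 10**12                      # bytes (decimal units, close enough here)
    PB = 10**15

    brain_bytes = 1000 * TB
    print(f"Whole-brain estimate: {brain_bytes / PB:.0f} PB")   # -> 1 PB

    # Assumption: start from a ~1 TB machine and double capacity per Moore's law.
    doublings = math.log2(PB / TB)                              # ~10 doublings
    for years_per_doubling in (1.5, 2.0):
        print(f"{doublings:.0f} doublings at {years_per_doubling} years each "
              f"~ {doublings * years_per_doubling:.0f} years out")
    # -> roughly 15 to 20 years, i.e. the article's 2025-2030 window.

    # The power objection: ~1 gigawatt for the artificial brain vs ~12 W for ours.
    print(f"Power ratio: {1e9 / 12:.1e}x")   # ~8e7, nearly eight orders of magnitude

In other words, a petabyte is about ten doublings past a terabyte, which is all the "2025 to 2030" projection really amounts to.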
The article gives many reasons why machines can't match humans anytime soon, and they are valid. Back in the mid-1950s, AI (Artificial Intelligence) researchers confidently predicted smart machines within a few years, a few decades at most. Obviously that didn't happen, so the pessimists "won". But despite the visible failure of AI's confident predictions, computers have slowly, imperceptibly continued to take over more and more sophisticated functions.

Projections like Hans Moravec's timeline of computer evolution are exciting, but Aamodt and Wang argue that they are optimistic projections that quietly ignore troublesome issues which compound as you shrink the electronics. Then again, I remember industry pessimists in the mid-1970s giving reasons why Moore's law couldn't be projected out to 64Kbit memory chips: there were just too many physical limitations to overcome. The limits were real, but ingenuity got around them. Today you can buy 4Gbit memory chips, roughly 64,000 times the density that the pessimists warned would end progress.
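The chip arithmetic checks out; here is a quick sketch, assuming binary prefixes for the chip capacities and the roughly three decades between those mid-1970s warnings and today's 4Gbit parts.

    import math

    # Capacities in bits, using binary prefixes (K = 2**10, G = 2**30).
    kbit_64_chip = 64 * 2**10        # the 64Kbit parts the pessimists doubted
    gbit_4_chip = 4 * 2**30          # a 4Gbit chip you can buy today

    ratio = gbit_4_chip // kbit_64_chip
    doublings = math.log2(ratio)
    print(f"Density increase: {ratio:,}x, or {doublings:.0f} doublings")
    # -> 65,536x and 16 doublings: about one doubling every two years
    #    over the ~30 years since those mid-1970s warnings.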
Personally, I fall in the middle of this debate. I doubt Kurzweil's projection of robots as smart as humans by 2025, but I wouldn't be surprised if such a machine showed up by 2100. Even then, it won't show up as a box on legs running around debating with humans. It will more likely show up in the infrastructure, as functions or services that our "smart environments" handle for us. For example, who ever wonders how many processors or how big a disk farm Google uses to support our searches?
Even more shocking are the technophiles who see a two-way street: not just machines becoming more human-like, but humans taking advantage of technology to become more machine-like (see Wikipedia). They believe we will meet our brother and sister machines somewhere in the middle. Who knows? I for one wish I could be around to see that day, but being the flaky carbon-based lifeform that I am, I won't be. Oh well.