While You Were Tweeting, the Singularity Got Nearer

I first came across the concept of "The Singularity" in 2005, when I read Ray Kurzweil's book The Singularity is Near: When Humans Transcend Biology. The book was an epiphany: Kurzweil combined several separate but related trends of the computer revolution and explained what they mean for humanity in a form that non-engineers could understand.

The idea of "The Singularity" has been around since the early 1950s, when computer/math genius John von Neumann told a friend that all the changes he had seen in his lifetime regarding the "ever-accelerating progress of technology and changes in the mode of human life," made him think that mankind was "approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue."

A second, more concrete example of this "ever-accelerating progress of technology" arrived in 1965, when Intel co-founder Gordon Moore published an article observing that computing power was increasing exponentially while its cost fell steadily. At the time, Moore thought the "law" cutting computing costs in half every two years would hold for about a decade, but the trend has now continued largely unimpeded for an incredible 50 years.
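To get a feel for what that compounding implies, here is a minimal back-of-the-envelope sketch; the 50-year span and two-year halving period come from the paragraph above, while the arithmetic itself is only an illustration:

```python
# A minimal back-of-the-envelope sketch of what a two-year halving of
# computing costs compounds to over 50 years (the span cited above).
# The figures are illustrative assumptions, not measured data.
years = 50                    # roughly 1965 to the mid-2010s
doubling_period = 2           # years per halving of cost
halvings = years / doubling_period
improvement = 2 ** halvings   # total gain in computing per dollar

print(f"{halvings:.0f} halvings over {years} years: "
      f"roughly {improvement:,.0f}x more computing per dollar")
# 25 halvings over 50 years: roughly 33,554,432x more computing per dollar
```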

In the past half-century, computing power formerly available only to the federal government or research universities has migrated into the pockets and purses of 4 billion smartphone users around the world. Kurzweil's great insight was to figure out that if both industry investment and computing-power trends continued well into the 21st Century, it would be possible for a computer to pass a "Turing Test," named after another mid-20th Century computer/math genius, Alan Turing. In 1950, Turing devised a test of a machine's ability to exhibit intelligent behavior indistinguishable from that of a human being, thereby demonstrating artificial intelligence (AI). Kurzweil in 2005 predicted this test would be passed by roughly 2029.

Kurzweil's book argued—through voluminous examples and citations—that a "law of accelerating returns" in information technology has been at work for more than a century and perhaps longer. He then took the deeply radical extra step of arguing that the acceleration would not slow down after passing the Turing test but would continue to grow geometrically, with a powerful superintelligence that could surpass all collective human intelligence arriving around 2045.

What von Neumann meant by a coming world beyond "human affairs" is difficult to discern. Kurzweil believes human-computer integration is the most likely post-Singularity world, with inexpensive software and hardware augmentations turning humans into a race of Six Million Dollar men and women (a 1970s television reference worth investigating) and perhaps even achieving near-immortality.

Since publication of The Singularity is Near in 2005, 12 years have elapsed. Another dozen years and we will reach 2029, the date by which Kurzweil estimated a computer will pass the "Turing Test." At this roughly halfway point, it's worth asking: Are his estimates holding up?

Kurzweil gives nearly 50 separate metrics to support his thesis that fundamental measures of information technology are following a strict, predictable trajectory. Many of these metrics track calculations per second per dollar—the essence of Moore's Law. My best educated guess is that the four easiest-to-understand and most important metrics are DNA sequencing cost, transistors per microprocessor, dynamic random-access memory (DRAM), and supercomputing speed—so let's see how these "four metrics of the Singularity" have progressed in the last 12 years.

Impressively, DNA sequencing costs have fallen well beyond the most aggressive estimates, dropping from almost $10 million for a fully sequenced genome in 2005 to roughly $1,000 today, even faster than Moore's Law. The ramifications of such cheap sequencing, combined with the burgeoning use of CRISPR gene-editing technology, are mind-boggling, with the potential to cure almost all genetically based diseases and perhaps allow cheap genetic augmentation down the road—just the kind of body-computer cyborg world that Kurzweil predicted.
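A rough comparison shows how much faster than Moore's Law this decline has been; the sketch below assumes only the round figures cited above ($10 million in 2005, $1,000 today) and a strict two-year halving for Moore's Law:

```python
import math

# Compare the cited sequencing-cost decline with a Moore's Law pace.
# The round-number costs below are the figures quoted in the paragraph
# above, used here only as assumptions for illustration.
cost_2005 = 10_000_000        # dollars per fully sequenced genome, 2005
cost_today = 1_000            # dollars per genome, roughly today
years = 12

actual_drop = cost_2005 / cost_today          # ~10,000x cheaper
moores_pace = 2 ** (years / 2)                # 64x if costs merely halved every two years
halving_time = years / math.log2(actual_drop) # implied cost-halving period

print(f"Actual decline: {actual_drop:,.0f}x vs. Moore's Law pace: {moores_pace:.0f}x")
print(f"Implied halving time: about {halving_time:.1f} years")
# Actual decline: 10,000x vs. Moore's Law pace: 64x
# Implied halving time: about 0.9 years
```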

Second in terms of keeping pace with Moore's Law is the continued shrinkage of transistors onto roughly 10-nanometer (nm) chips. Intel now reports it can pack 100 million transistors into each square millimeter of a chip, compared with roughly 7.5 million back in 2010. The debate over transistor density continues, and the next move to 5nm chips is seen as the critical breakthrough needed to usher in the self-driving cars, on-board artificial intelligence, and 5G wireless service that virtual reality technology needs to truly find a foothold.
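The same doubling arithmetic can be applied to the density figures just cited; a sketch, assuming only the 7.5 million (2010) and 100 million (today) transistors-per-square-millimeter numbers above:

```python
import math

# Implied doubling time for transistor density, using the figures
# cited in the paragraph above (assumptions for illustration only).
density_2010 = 7.5e6     # transistors per square millimeter, 2010
density_2017 = 100e6     # transistors per square millimeter, cited today
years = 7

doublings = math.log2(density_2017 / density_2010)   # ~3.7 doublings
doubling_time = years / doublings                     # ~1.9 years

print(f"{doublings:.1f} doublings in {years} years: "
      f"density doubles roughly every {doubling_time:.1f} years")
# 3.7 doublings in 7 years: density doubles roughly every 1.9 years
```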

Memory is the bane of every computer geek: you never have enough. Still, improvements to memory chips over the last 20 years have been nearly consistent with Moore's Law. Unfortunately, as of late, the enormous demands of smartphones have created a shortage of DRAM chips, which may force Apple to include only 3 gigabytes of DRAM in its 10th-anniversary iPhone 8 later this year, rather than the originally planned 4 GB. This will likely push the trend line for memory below the Moore's Law target. Don't get too depressed, however. The original iPhone, released in June 2007, had only 128 megabytes of memory and sold for $599. A quick walk down memory lane shows the author's first bona fide desktop computer—an ASUS 486/66 all-in-one that combined a 14-inch screen with its computer tower and cost $1,699 in 1995—had a grand total of 8 megabytes of memory, enough to play chess against the computer and not much else.

The laggard of the four metrics is supercomputing power, which has fallen below the Moore's Law trend in the past three years. The causes of the slowdown start with the flat-lining of "clock speed"—how fast microprocessors execute instructions—due to excessive heat, along with a decision by the industry to reorient its tens of billions of dollars in investment toward other areas such as mobile and the Internet of Things. The hierarchy of needs within the computer industry has changed, and centralized supercomputing power has become less relevant. It's worth remembering that Kurzweil's book first came out more than two years before the first iPhone.

Kurzweil's prognostications engendered a lot of criticism, on both technical and ethical grounds. In the period just after the book's publication, critics like John Rennie and Microsoft co-founder Paul Allen argued against his "law of accelerating returns" on technical grounds, saying it is not a physical law but rather a capital-investment profile that can be changed by economic forces. Sun Microsystems co-founder Bill Joy and others have criticized Kurzweil on ethical grounds, arguing that his Pollyannaish view of the Singularity is naive given what we know of the negative, violent aspects of human nature. The outcome of the Singularity, in their view, is more likely to be a dystopian world of the Terminator and its Skynet overlord than a global society happily married to computers.

It's worth noting that many of these criticisms came in the first several years after the book was published and have diminished in the past five years. Kurzweil joining Google in late 2012 as a director of engineering may have had something to do with this, given Google's enormous resources and the implicit validation of Kurzweil's theory that the hire represented. The unimpeded march of many of Kurzweil's metrics, especially the ones involving DNA technology, may also have quieted skeptics.

The biggest potential threat to Moore's Law has always been the physical limits of silicon, the primary element used to make computer chips. It's likely that the slowing of the curve seen in some of the metrics will become more pronounced in the early 2020s, as transistors become so small that quantum effects start to undermine electrical signals. But steady advances in quantum computing and machine learning over the next 10 years—Google, Microsoft, and Facebook are all spending billions on research in both areas—suggest that Kurzweil's primary prophecy, that the Turing Test will be passed by 2030, is more likely than it was a dozen years ago.

Taken together, there is plenty of evidence for this, from the deep-learning revolution of the past two years to questions over whether the iPhone will even be around for its 20th anniversary in 2027.

If voice commands and augmented-reality glasses replace much of the smartphone interface in the next decade, is it really that radical a step to say a person won't be able to tell the difference between interacting with a human and interacting with a computer?

Think about it. In 2017, humans are already communicating verbally with Apple's Siri, Google Assistant, Microsoft's Cortana, and Amazon's Alexa tens of thousands of times a day. Over time, these interactions will condition many users to be less and less conscious of whether their conversations are with humans or with artificial intelligence.

Come prepared. The Singularity is still near.

William Murray is an editor at RealClearEnergy.
