Ray Kurzweil's writings are the best-known expression of Singularity memes, so I figured it's about time I read his 2005 best-seller The Singularity is Near.

Though earlier users of the term "technological Singularity" used it to refer to the arrival of machine superintelligence (an event beyond which our ability to predict the future breaks down), Kurzweil's Singularity is more vaguely defined:

What, then, is the Singularity? It's a future period during which the pace of technological change will be so rapid, its impact so deep, that human life will be irreversibly transformed.

Kurzweil says that people don't expect the Singularity because they don't realize that technological progress is largely exponential, not linear:

People intuitively assume that the current rate of progress will continue for future periods. Even for those who have been around long enough to experience how the pace of change increases over time, unexamined intuition leaves one with the impression that change occurs at the same rate that we have experienced most recently. From the mathematician's perspective, the reason for this is that an exponential curve looks like a straight line when examined for only a brief duration. As a result, even sophisticated commentators, when considering the future, typically extrapolate the current pace of change over the next ten years or one hundred years to determine their expectations...

But a serious assessment of the history of technology reveals that technological change is exponential... You can examine the data in different ways, on different timescales, and for a wide variety of technologies, ranging from electronic to biological... the acceleration of progress and growth applies to each of them.
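The mathematician's point is just the first-order approximation: over a window short compared to the growth timescale,

$$e^{rt} \approx 1 + rt \qquad (|rt| \ll 1),$$

so an exponential is locally indistinguishable from the straight line drawn through recent data, and extrapolating that line understates later growth.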

Kurzweil has many examples:

Consider Gary Kasparov, who scorned the pathetic state of computer chess in 1992. Yet the relentless doubling of computer power every year enabled a computer to defeat him only five years later...

[Or] consider the biochemists who, in 1990, were skeptical of the goal of transcribing the entire human genome in a mere fifteen years. These scientists had just spent an entire year transcribing a mere one ten-thousandth of the genome. So... it seemed natural to them that it would take a century, if not longer, before the genome could be sequenced. [The complete genome was sequenced in 2003.]

He emphasizes that people often fail to account for how progress in one field will feed on accelerating progress in another:

Can the pace of technological progress continue to speed up indefinitely? Isn't there a point at which humans are unable to think fast enough to keep up? For unenhanced humans, clearly so. But what would 1,000 scientists, each 1,000 times more intelligent than human scientists today, and each operating 1,000 times faster than contemporary humans (because the information processing in their primarily nonbiological brains is faster) accomplish? One chronological year would be like a millennium for them... an hour would result in a century of progress (in today's terms).

Kurzweil's second chapter aims to convince us that Moore's law of exponential growth in computing power is not an anomaly: the "law of accelerating returns" holds for a wide variety of technologies, evolutionary developments, and paradigm shifts. The chapter is full of logarithmic plots for bits of DRAM per dollar, microprocessor clock speed, processor performance in MIPS, growth in Genbank, hard drive bits per dollar, internet hosts, nanotech science citations, and more.

The chapter is a wake-up call to those not used to thinking about exponential change, but one gets the sense that Kurzweil has cherry-picked his examples. Plenty of technologies have violated his law of accelerating returns, and Kurzweil doesn't mention them.

This cherry-picking is one of the two persistent problems with The Singularity is Near. The second persistent problem is detailed storytelling. Kurzweil would make fewer false predictions if he made statements about the kinds of changes we can expect and then gave examples as illustrations, instead of giving detailed stories about the future as his actual predictions.

My third major issue with the book is not a "problem" so much as it is a decision about the scope of the book. Human factors (sociology, psychology, politics) are largely ignored in the book, but would have been illuminating to include if done well — and certainly, they are important for technological forecasting.

It's a big book with many specific claims, so there are hundreds of detailed criticisms I could make (e.g. about his handling of AI risks), but I prefer to keep this short. Kurzweil's vision of the future is more similar to what I expect is correct than most people's pictures of the future are, and he should be applauded for finding a way to bring transhumanist ideas to the mainstream culture.

Comments (43)

Kurzweil's vision of the future is more similar to what I expect is correct than most people's pictures of the future are, and he should be applauded for finding a way to bring transhumanist ideas to the mainstream culture.

I'm not sure if his effect is a net positive, though. In fact, his writings and his public persona were among the main reasons why I didn't take Singularity-related ideas seriously for a long time, until I saw them presented in a far more sensible way on Overcoming Bias. I don't believe my case is unique in this regard.

It isn't. I first heard of the Singularity by reading Wikipedia's list of pseudosciences or something similar. I had decided to read it because there was a chance that something was miscategorized and I reasoned that if there was an idea that people thought was pseudoscientific but was actually correct, it could be important to know that. The Wikipedia article was filled with quotes from Kurzweil and pictures of exponential graphs, so I dismissed the Singularity group of memes, while still believing that AI was achievable and that it would have a profound effect on the world.

[-]djcb

Prize quote:

Faced with a question like “How can we stop death?” or “How can we build a human-level AI?” you learn to respond: “What’s another question that’s easier to answer, and that probably has to be answered anyway before we have any chance on the original one?”

[-][anonymous]

The chapter is a wake-up call to those not used to thinking about exponential change, but one gets the sense that Kurzweil has cherry-picked his examples. Plenty of technologies have violated his law of accelerating returns, and Kurzweil doesn't mention them.

This is much less useful if it isn't specific. It is a common criticism levied against Kurzweil, and while my intuitions would tend to agree with it after listening to several of his talks and reading some of his work, I must ask:

Please give three examples.

Please give three examples.

It's not difficult to think of wide areas of technology where decades of extremely rapid progress have been followed by decades of utter stagnation. For example, transportation technologies are generally in this category. This might change soon with the advent of self-driving cars, but it's still hard to reconcile any "law of accelerating returns" with four decades of no meaningful progress. (And in some cases even retrogression -- the fastest military and passenger aircraft of all time, the Blackbird and the Concorde, were both launched around 40 years ago, and have been retired from service since. The current world record in manned aircraft speed was set in 1976!)

In fact, on the whole, it is hard for me to think of many examples of technologies where we do see anything resembling accelerating returns and exponential growth by any meaningful metrics. There are the integrated circuits, fancy thin displays, and sundry signal-processing technologies that enable fast long-range digital communication -- but except for these and their direct applications, what is so much different compared to 30-40 years ago that it would be meaningful to talk about "accelerating returns"?

(And in some cases even retrogression -- the fastest military and passenger aircraft of all time, the Blackbird and the Concorde, were both launched around 40 years ago, and have been retired from service since. The current world record in manned aircraft speed was set in 1976!)

I don't disagree with your broader point, but I am not sure this is the best example. The Blackbird was a reconnaissance aircraft -- something that has been replaced by the superior technology of imagery satellites. In the case of the Concorde, the plane proved uneconomical, in part because many of the major urban areas that would be natural points of call did not have room to build long enough runways to accommodate its speed. Aeronautical engineering is probably capable of building a faster passenger plane, but there would be no market for it.

For example, transportation technologies are generally in this category. This might change soon with the advent of self-driving cars, but it's still hard to reconcile any "law of accelerating returns" with four decades of no meaningful progress.

Fatal car accidents per million miles driven, and per capita, are down precipitously. Although consumer automotive technology is not becoming faster and faster, using it is becoming safer and safer. In human terms — in terms of its ability to implement our values, which certainly include using it without dying or killing — the technology is advancing steadily.

(Of course, this construes "consumer automotive technology" broadly — including airbags, highway design, and Breathalyzers! — not just things like engine design.)

Fair enough. "No meaningful progress" is an overstatement. But so is "advancing steadily": under any meaningful metric, 2011 cars represent much less progress over 1950s cars than the latter represented relative to the 1890s horse carriages or the early motor cars circa 1900.

And in any case, even steady advances would still be a counterexample to the supposed trend of exponential progress and accelerating returns.

[-][anonymous]

I do tentatively agree with the assessment about stagnation in some fields of technology; I just think that when bringing up such criticism one should also include examples. And including such examples forces one to think in a different way about it.

Please give three examples.

  • Computer clock speed (Kurzweil got this wrong).
  • Nuclear power.
  • Transportation.
  • Agricultural yields.
  • Mining and resource extraction (on a price per unit basis; total production has gone up).
  • Drug discovery (Kurzweil mispredicted this one too).
  • Skyscraper construction.
  • Construction costs generally.
  • Electricity costs.
  • Household cleaning devices.

Graphical calculators: when I was in high school there were these brand new high-tech graphical programmable calculators that got me wondering what great tools students of the future would have. Now, a dozen years later, what do high-schoolers use? The same old TI-89 that I had at the time (OK, smartphones may be taking over that niche soon, and laptops already ate up part of it). So that's one area where I was expecting progress and got nothing.

[-]Shmi

I haven't done a detailed analysis, but my suspicion is that, if one removes Moore's law and its consequences from the [technological] progress, the rest will look nothing like an exponential. Various branches would speed up for some time, then stall, or even regress. The overall rate of progress (how do you even quantify it?) would probably be closer to linear on average, or possibly too all-over-the-map to fit a curve through reliably.

My point is that we might be living in a Moore's bubble, and the consequences of its bursting would be hard to predict.

Economic growth on the whole is exponential; that's why it's expressed as percentage growth.

The rest of technology looks like many separate curves, which do tend to show exponential or logistic shapes, at least for long periods, from agricultural yields to coal plant efficiency to solar. Economies do that too: the norm is exponential growth, n% multiplicative increase per year, rather than linear growth. There are shifts, e.g. the fast exponential for top speeds in aerospace ended once we got into space, while DNA sequencing just increased its growth rate several fold, but it's common for annual change to be multiplicative rather than additive.

Check out the Performance Curve Database for many individual improvement curves.

I spent about 50 hours researching technological forecasting methods in the past few months and the performance curves database does look to be the most exciting thing happening in that field.

Amdahl's law comes to mind in these discussions. The question is to what extent computing power can substitute for other things.

It doesn't help his case that in recent years increases in clock speed have slowed.

Clock speed isn't the only measure of CPU performance. In fact, it isn't much of a measure at all, given that new processors are outperforming Pentium 4 chips (ca. 2005) by the factor you'd expect from Moore's law, despite the fact that their clock speeds are lower by as much as a half.

What really matters (and what Kurzweil emphasizes) is (computation power) / (dollar).

I agree with that, but that leads me to my big problem with Kurzweil.

Whenever I hear him talk about exponential growth in solar power, it's always about installed base, and not watts/dollar. As a practical matter, since we're talking about capturing a finite surface flux, I don't see any way for this to go exponential in watts/dollar for practical purposes, given the cost of land use.

Even if you go exponential in the underlying technology, eventually giving away free mylar type sheets capturing 100% of incident solar flux (and say we're around 20% of flux now), you've still got all the costs of land and electrical distribution that aren't getting exponentially cheaper themselves. And we've really only got a little over 2 doublings of percentage of surface flux to go and we're done.
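A quick check of that figure, taking the ~20% capture fraction at face value:

$$\log_2\!\left(\frac{100\%}{20\%}\right) = \log_2 5 \approx 2.3,$$

so the capture-fraction term really does run out of room after a little more than two doublings; any exponential gains beyond that would have to come from cost, not from capturing more of the flux.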

His plots of exponential installed base just seem a dishonest way of claiming that exponential growth is occurring in solar, and that gives me pause about trusting his other claims.

Similarly, just because we're using silicon manufacturing methods for solar doesn't make it an informational technology subject to exponential growth, but he talks like it does. Digital information technologies can go exponential because they can keep going smaller, but you can't do that if you're trying to capture a surface flux.

As a practical matter, since we're talking about capturing a finite surface flux, I don't see any way for this to go exponential in watts/dollar for practical purposes, given the cost of land use.

Orbital solar power stations. Granted, it's only a temporary extension before the exponential starts stagnating, but it's just one example of how the limit of Earth's land area could be overcome. The eventual limit is capturing all of the sun's energy output, e.g. with a Dyson sphere, with near-perfect efficiency.

Orbital solar power stations.

My point about surface flux wasn't about the limit of surface area to work from, but a limit on how much flux was passing through any square meter. Unlike semiconductor technology, you can't just keep going smaller to improve performance.

And I don't think orbital satellites are likely to be competitive with land on a cost per square meter basis for a while.

But this isn't my bigger concern. I'm more worried about the intellectual error involved, which lessens my trust in his other conclusions. Either he doesn't recognize the mistake, or he does.

http://bigthink.com/ideas/31635?page=all Ray Kurzweil: Solar Will Power the World in 16 Years

During his latest Big Think interview, Kurzweil explained:

"Solar panels are coming down dramatically in cost per watt. And as a result of that, the total amount of solar energy is growing, not linearly, but exponentially. It’s doubling every 2 years and has been for 20 years. And again, it’s a very smooth curve. There’s all these arguments, subsidies and political battles and companies going bankrupt, they’re raising billions of dollars, but behind all that chaos is this very smooth progression."

Notice the sleight of hand. The cost per watt (of a panel, not of the energy delivered to a customer) is "coming down dramatically". Has that been exponential? Where is the curve? What is the doubling time? Also, are the land costs held fixed? Many of the newer, cheaper technologies trade off efficiency of the semiconductor used versus increased area used.

But instead of even showing us data on cost per watt, he starts talking about the aggregate power generation from solar, and how it has been on an exponential progression, and bases his predictions on that. That's either a fundamental mistake or a deception.

Maybe you missed my point. I want to see exponential growth in watts per dollar for total system cost.

There's a paragraph near the end that doesn't make a lot of sense, with a chart "with data only from 2005 to 2009, projected out to 2031." The fact that someone would project out 5 years of data for 27 more years does not fill me with confidence. Even the supposed crossover point is 2020, 11 years after his 5-year window of data.

This is another guy who is looking either dishonest or sloppy, conflating power per dollar for system and panel costs.

Very simple. Show a multidecade plot of power per dollar for system cost - with actual multidecade data, and not just projections. Let's see if it's exponential, and let's see what the doubling time is.

And then given the cost, let's see if Ray's projections for solar power generation look reasonable. I really doubt it.

Here's the National Renewable Energy Laboratory market report.

Page 62 has whole-system costs for 1998-2010, which fall by about half (which doesn't reflect the recent drop of solar cell prices towards production cost, the alleviation of the silicon shortage caused by unexpectedly rapid growth, and the overshoot below cost due to Chinese subsidies).

There are module (not just PV cell, those are earlier) price data back to 1980 on page 60, also with a doubling time around a decade.

You can also look at this paper for more LCOE data:

Thank you for the diligence.

I think the graph on page 63 has the best installation data. Again, same years as the graph on page 62, but breaking out the installation and PV costs separately. The installation costs went down about a third in 12 years. The internal trend looks a lot worse, with 2005-2010 being essentially flat, but let's go with the 35% drop. You wanna get out a calculator to figure out the halving time? Let's say conservatively, 15 years.
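A rough check of that halving time, assuming the decline is a constant exponential (a 35% drop over 12 years):

$$t_{1/2} = 12\,\text{yr} \times \frac{\ln 2}{\ln(1/0.65)} \approx 19\,\text{yr},$$

so 15 years is, if anything, generous to solar.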

Meanwhile, Kurzweil's talk, from 2011, says that solar will rule the world in 16 years. Will another 35% reduction in installation costs make solar "rule the world"?

Looks like we're continuing our previous conversation: http://lesswrong.com/lw/dm5/why_could_you_be_optimistic_that_the_singularity/71cs

From last time, we had $3.30 per watt on installation. At $2.20 per watt in 15 years, and assuming free solar cells, will solar rule the world?

Maybe hopeful. They had coal at $2.10 per watt on the Wikipedia page. Of course, the PVs won't really be free, but it does look competitive.

You've made me a little more hopeful.

I think it's materials science that eventually makes the difference, when we start replacing window panels with gorilla glass solar panels. The difference comes when solar is no longer something extra you add to a building, but part of the structure itself.

Hmmm, I'm not so sure. Yeah, we've got price per watt (power), but it really should be price per kWh averaged over a day, which would include capacity factor, which is a big problem for solar. The panels seem cheap enough, but we need a big breakthrough in installation costs. I think it could happen, but the data doesn't predict it coming in Ray's timeframe.
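To make the capacity-factor point concrete, here is a minimal back-of-the-envelope sketch; the installed cost, capacity factor, and lifetime below are illustrative assumptions (not figures from the NREL report), and financing, O&M, and panel degradation are ignored:

```python
# Rough sketch: converting an installed cost in $/W into an average energy
# cost in $/kWh. All inputs are illustrative assumptions, not data from the
# NREL report; financing, O&M, and degradation are ignored.

installed_cost_per_watt = 3.30  # $/W, whole-system cost (assumed)
capacity_factor = 0.20          # solar delivers ~20% of nameplate output on average (assumed)
lifetime_years = 25             # assumed system lifetime

hours = lifetime_years * 8760                    # operating hours over the lifetime
kwh_per_watt = capacity_factor * hours / 1000    # lifetime kWh delivered per installed watt
solar_cost_per_kwh = installed_cost_per_watt / kwh_per_watt

# The same capital cost at a 100% capacity factor (an idealized baseload plant)
baseload_cost_per_kwh = installed_cost_per_watt / (1.0 * hours / 1000)

print(f"solar:    ${solar_cost_per_kwh:.3f}/kWh")     # ~ $0.075/kWh
print(f"baseload: ${baseload_cost_per_kwh:.3f}/kWh")  # ~ $0.015/kWh
```

The roughly 5x gap between the two numbers is just 1 / capacity factor, which is why a system that looks nearly competitive per watt can still be far from competitive per kWh delivered.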

Demo innovations (i.e. grist for the future, not already in the aggregate data) lately have included robots to do installation and designs with reduced installation requirements. The US DOE SunShot Initiative page has the details of their programs supporting balance-of-system (BOS) cost reductions, easier permitting, and so forth, although they have some interest in spinning a positive picture.

Cheap panels do suggest a risk of slowdown, but there's also some room to shift further tradeoffs in design, i.e. as cost of manufacturing and efficiency become less of an issue, more effort will go into designing panels that work well with the BOS improvements.

and assuming free solar cells, will solar rule the world?

It won't dominate dark areas, or assume 100% load (without batteries that push back dominance later), or price out already built nuclear plants (or coal and gas plants, absent massive carbon taxes). The claim that we might build so much solar as to match today's world electrical output (but not the output of that future time) wouldn't be shocking to me, although my guess would be for it to take longer than Kurzweil predicts (there are more efficiency gains to be had, but efficiency gives you free BOS savings by letting you use fewer panels, and the other areas will have to step up, especially for the later parts of his prediction).

it's only a temporary extension before the exponential starts stagnating

In Little Science, Big Science, Derek J. de Solla Price does a great job talking about the stagnation of exponential scientific growth curves. In general, an exponential growth curve must flatten at some point. He did things like look at the number of scientists in relation to the population. A quote:

It is clear that we cannot go up another two orders of magnitude as we have climbed the last five. If we did, we should have two scientists for every man, woman, child, and dog in the population, and we should spend on them twice as much money as we had. Scientific doomsday is therefore less than a century distant. -Derek J De Solla Price

It's been years since I read it (and it's a good 50 years old), but I remember it as a good book, one of the founding works of scientometrics. I would definitely recommend it.

Kurzweil has never claimed that exponentials go on forever. He has claimed that many exponentials go on longer than the current technology would allow.

Even when there is only an overhead of 100X in which to grow something, knowing it is probably growing exponentially with timescale T gives you a very different sense of the midterm future than thinking it is stagnant, saturated, or linear. And the singularity is all about the midterm.

http://www.pbs.org/wnet/need-to-know/environment/futurist-ray-kurzweil-isnt-worried-about-climate-change/7389/

Ray sez

But doubling every two years means it’s only eight more doublings before it meets a hundred percent of the world’s energy needs. So that’s 16 years. We will increase our use of electricity during that period, so add another couple of doublings: In 20 years we’ll be meeting all of our energy needs with solar, based on this trend which has already been under way for 20 years.

That's what he claims. He does it based on two mistakes - calling solar an information technology, and switching from looking at exponential growth in util/dollar to exponential growth in installed base.
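(For scale: eight doublings is a factor of 2^8 = 256, so the 16-year figure only works out if solar currently supplies roughly 1/256 ≈ 0.4% of world energy use and the two-year doubling of installed base holds for all eight of those doublings.)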

See this article, and the links to the NREL for cost data (which Kurzweil does also talk about). Solar energy output per dollar has been improving with a doubling time of about a decade for several decades. If that trend continues, then it will be cheap relative to existing alternatives by the time Kurzweil projects gigantic market share. And prior to that there are markets in areas where competing electricity is expensive (theft on the lines in India, lack of connection to the grid in poor areas, the correlation of solar with peak load for air conditioning, places with carbon taxes) to absorb a lot of growth.

And cost improvements come not only from efficiency in absorbing flux, but from reduced use of materials, more efficient manufacturing processes, and so forth. Balance-of-system costs have also been going down. Distribution costs apply to other power sources (although distributed solar in some places benefits because homeowners can use the solar themselves, and free ride off the utilities for distribution, an implicit subsidy for early growth). Non-arable desert land is not particularly high value, nor are roofs.

This isn't really true -- clock speed is a really good metric for computing power. If your clock speed doubles, you get a 2x speedup in the amount of computation you can do without any algorithmic changes. If you instead increase chip complexity, e.g., with parallelism, you need to write new code to take advantage of it.

Wrong. A two-fold increase in CPU clock rate implies a twofold increase in CPU cycles per second, and nothing more. Any number of pure hardware improvements - for example, expanding the instruction set, decreasing the number of CPU cycles an instruction takes to execute, improving I/O speed, etc. - can improve performance without changing the clock rate, or even while decreasing the clock rate, without introducing parallel processing cores.
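The standard way to summarize this is the classic processor performance equation:

$$\text{CPU time} = \text{instruction count} \times \text{CPI} \times \text{clock period},$$

so at a fixed (or even lower) clock rate, performance still improves by reducing cycles per instruction (CPI) or by reducing the number of instructions a task needs.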

[-]Shmi

The original Moore's law was about not clock speed but the number of transistors per IC, which has been holding up OK, if you count multicores, although even that metric shows signs of falling below exponential.

What, then, is the Singularity? It's a future period during which the pace of technological change will be so rapid, its impact so deep, that human life will be irreversibly transformed.

This strikes me as a very unambitious definition. It seems to match the scientific, industrial, and (possibly) computing revolutions as well as it matches anything likely to happen in the future.

Kurzweil makes an interesting claim - he claims that, because of information theory, the smallest program which can emulate a human brain is likely to be (not much larger than) the compressed size of the human genome. This doesn't say anything about how easy it would be to actually write this program, or the computing power needed to run it. Also, it assumes that the emulated brain would be exposed to a similar kind of environment that a human would be.

I think that Kurzweil is right here, but I have trouble explaining exactly why I think that. And other people disagree. Any thoughts on this? I may write a LW discussion post on this.

(I don't know whether this particular claim is in the book as I haven't read it yet).

[-]gwern

I agree with the information theoretical claim, and I think you could prove it with the usual language-to-language argument for Kolmogorov complexity; the gist: the brain is the result of a program written in DNA for a certain extremely complex language known as 'the human body', which, when run, takes an extremely complex input known as 'life' (all of which can be modeled in binary thanks to the universe apparently being computable). We want to translate the program written in DNA to a program written in x86; both are languages, so the translation only requires a certain fixed-length prefix interpreting the DNA for x86.
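(The "usual language-to-language argument" is the invariance theorem for Kolmogorov complexity: for any two universal description languages $U$ and $V$ there is a constant $c_{U,V}$, depending only on the languages and not on the string being described, such that $K_U(x) \le K_V(x) + c_{U,V}$ for all $x$; the constant is essentially the length of an interpreter for $V$ written in $U$.)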

I take your link as agreeing that it works in theory but disagreeing about the practice: 'yeah, but all of those parts are extremely extremely complex and that fixed-length prefix is a really big prefix which we aren't even close to writing, and X and Y and Z; hence, Kurzweil's forecasts are way off and ridiculously optimistic'. Notice he doesn't talk in terms of it being impossible, period, but rather of missing parts and explanations of how something will be accomplished:

I presume they understand that if you program a perfect Intel emulator, you don't suddenly get Halo: Reach for free, as an emergent property of the system. You can buy the code and add it to the system, sure, but in this case, we can't run down to GameStop and buy a DVD with the human OS in it and install it on our artificial brain. You're going to have to do the hard work of figuring out how that works and reverse engineering it, as well. And understanding how the processor works is necessary to do that, but not sufficient.

(Having the fixed-length interpreter for the DNA program is necessary but not sufficient; having the input for the program is necessary but not sufficient; etc.)

This is the other case where I think Kurzweil is just in error.

The number of bits that it takes to encode DNA given "the language of the human body" is small, but how many bits to encode the language of the human body, and the language of cells, and the language of chemistry, and the language of the biome?

I can encode Wikipedia with one digital bit, if I allow Wikipedia as one of the units in which I am encoding, and keep that complexity off the books. That's what Ray is doing here - keeping the complexity of atoms and molecules and cells and biomes "off the books".
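(In Kolmogorov-complexity terms, this is the gap between conditional and unconditional complexity: $K(x \mid y)$ can be a single bit while $K(x)$ is enormous, as long as $y$ already carries most of the information. The compressed-genome figure bounds something like $K(\text{brain} \mid \text{cells, chemistry, environment})$, not $K(\text{brain})$.)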

You can do that as long as you're able to deal with the complexity of the final translation from your digital world to your problem domain yourself. If your problem domain is entirely digital, and you just want a simulated brain to work in your simulated digital world, the information content of intelligence would be less than the information content of your virtual DNA given a digital world with the complexity to simulate atoms and molecules and cells..., but we aren't given that digital world.

In the meat world, you could create intelligence by fabricating DNA a piece at a time, inserting it into a human cell, implanting that in a woman, and waiting for 9 months. Or, you could go the old-fashioned manual route of inserting DNA into human eggs without any information complexity at all related to human DNA. But either method relies on the information content in the world to provide the language that converts our information and intent into an intelligence, and I think the point of an AI is to get beyond the reliance on either of these methods of translation.

Like the link, I read you as agreeing the argument is true but not useful.

I think for the claim to make sense, you need to be able to do it without the "interpreting the DNA for x86" component. It seems very likely that such a thing would be way larger than the genome itself (i.e. the "really big prefix" that you mentioned). So I'm not sure in what sense you agree with the information theoretical claim? We're talking about a particular genome, not the general trend as genome size -> infinity.

I don't follow. If it's true for genomes in general, then it's true for particular genomes. Real-world genomes may be small enough that it is not useful in practice, but the theory is still true.

He emphasizes that people often fail to account for how progress in one field will fed on accelerating progress in another:

I'm assuming that's a typo?

Fixed, thanks.