Eliezer said in a speech at the Singularity Summit that he's agnostic about whether technological change is accelerating, and mentioned Michael Vassar and Peter Thiel as skeptics.

I'd vaguely assumed that it was accelerating, but when I thought about it a little, it seemed like a miserably difficult thing to measure. Moore's law just tracks the number of transistors that can be placed inexpensively on an integrated circuit.

Technology is a vaguer thing. Cell phones are an improvement (or at least most people get them) in well-off countries that already have landlines, but they're a much bigger change in regions where cell phones are the first phones available. There's a jump from a cell phone that's just a phone/answering machine/clock to a smartphone, but how do you compare that jump to getting home computers?

Do you have a way of measuring whether technological change is accelerating? If so, what velocity and acceleration do you see?


The thing with technological progress is that it's hard to predict its direction. People in the '50s were expecting space travel and flying cars, not the WWW, Wikipedia, genome sequencing, World of Warcraft, and iPhones. And yet, now that we have them, we do consider them significant technological progress. So perhaps the next things that come up will be equally unexpected but equally useful. It's hard to measure speed when the thing you're measuring changes form every few years.

[-]sfb00

People in the '50s were expecting space travel and flying cars, not the WWW, Wikipedia, genome sequencing

Some people were, but not all. Buckminster Fuller was writing about the trend towards information processing before 1940: http://en.wikipedia.org/wiki/Etherealization

I'd be more interested in the answers to questions about technological progress in narrow domains. Way more information, way less controversial, and greater potential for producing analysis that I can usefully plug into my world model.

The multicore crisis appears to be continuing:

http://smoothspan.wordpress.com/2007/09/06/a-picture-of-the-multicore-crisis/

For a more recent graph, see:

http://orangecone.com/archives/2010/07/peak_mhz.html

According to The Singularity is Near we should have topped 11 GHz last year.

Clock speed is a pretty critical metric for tech progress. Not so exponential, now!

That and its linked posts are fantastic and explain quite a lot about the past few years.
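
A back-of-the-envelope way to see the break in the clock-speed trend discussed above is to fit a doubling time to the years when frequencies were still climbing and extrapolate it forward. A rough sketch (the chips and dates are approximate illustrative figures, not taken from the linked graphs):

```python
import math

# Approximate, illustrative data points (not from the linked posts): a mainstream
# desktop CPU ran at roughly 66 MHz in 1993 and roughly 3000 MHz (3 GHz) in 2003.
f0, year0 = 66.0, 1993      # MHz
f1, year1 = 3000.0, 2003    # MHz

# Doubling time implied by that decade of growth
doubling_time = (year1 - year0) * math.log(2) / math.log(f1 / f0)
print(f"implied doubling time: {doubling_time:.1f} years")          # ~1.8 years

# Naive extrapolation of the same exponential another decade out
f_2013 = f1 * 2 ** ((2013 - year1) / doubling_time)
print(f"extrapolated 2013 clock speed: {f_2013 / 1000:.0f} GHz")    # ~136 GHz

# Actual desktop clock speeds plateaued around 3-5 GHz, which is why the
# "peak MHz" graphs linked above look nothing like a continued exponential.
```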

Y'all need to read your Wittgenstein. Or at least Paul Graham. Define technology rigorously enough and I'll give you a rigorous trendline.

As a metric for the rate of technological change, I would want to look at what fraction of economic activity worldwide takes place using novel technologies. That is, what fraction of what we do is based on 5-year-old technology vs. 50 years vs. 500 years vs. 5000 years?

The trouble is, how do we interpret "based on"? Suppose I grow GM tomatoes. Agriculture is more than 5000 years old, but in much of the world tomatoes are only 500 years old. Do I count the GM part as 5 years, or 50? And suppose I distribute the tomatoes by airfreight and apply adhesive barcode labels to each tomato to assist the retailers. What fraction of my tomato-growing activity gets listed in each technology-age category?

Is the rate accelerating? I think so. It took 50 to 100 years to really exploit the scientific discoveries of the 19th century (electromagnetism, chemistry, evolutionary biology). Science hasn't produced anything quite so dramatic in the 20th century, but we do seem to be exploiting it more rapidly - more like 25 to 50 years for full exploitation. Flight, atomic power, computers, antibiotics, lasers. Plus, a much smaller fraction of mankind is, say, more than 50 years behind the leaders than was the case 100-200 years ago.

My proposal here was simply to weigh things:

http://machine-takeover.blogspot.com/2009/07/measuring-machine-takeover.html

I figure it is easier to weigh things than it is to calculate their economic impact.

What fraction of my tomato-growing activity gets listed in each technology-age category?

Maybe determine the fractions by the value added by each tier of technology? I don't know much about economics and would like someone knowledgeable to chime in.
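
A toy version of that value-added idea, using the GM-tomato example from the parent comment (the shares below are made up purely for illustration):

```python
# Made-up value-added shares for one hypothetical GM tomato operation.
value_added_by_tier = {
    "agriculture (~5000 yr)":         0.30,  # basic growing
    "tomato cultivation (~500 yr)":   0.25,  # crop-specific know-how
    "airfreight + barcodes (~50 yr)": 0.30,  # distribution and retail logistics
    "genetic modification (~5 yr)":   0.15,  # the GM trait itself
}

total = sum(value_added_by_tier.values())
for tier, value in value_added_by_tier.items():
    print(f"{tier}: {value / total:.0%} of value added")

# Aggregating shares like these across the whole economy, and watching how the
# weight shifts toward the younger tiers over time, would give one (very rough)
# measure of the rate of technological change.
```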

I'm generally of the opinion that by most reasonable standards technological change is not accelerating. There have been periods that involved very rapid technological progress. For example, between 1890 and 1905 you have the invention of the radio and the airplane and the spread of the automobile. Aside from direct technology, that period also saw the discovery of special relativity and the discovery of radioactivity. Compared to that time period, almost any period in time looks like a slow crawl. We haven't had anything at all as game-changing as radio or airplanes for some time. Even if one argues that the world wide web counts (which is itself questionable since it is only an addition to the pre-existing internet), that was in the early 1990s. We've gone some time with steady improvement of existing technology, but very little that is impressively new.

[-]sfb30

Where you say radio and radioactivity: I expect the rate of discoveries like that to slow as we use up more of the finite stock of "things to discover".

But as for making use of discoveries, the middle of the 1900s saw lasers (1960), transistors (1950s), optical fibre (1952), credit cards, barcodes, solar cells, hovercraft, superglue, Tipp-Ex, hard disks, and satellites. All things which are arguably game-changing, and giving 1890-1905 a run for its money.

Also, the early 1900s are somewhat distorted by things like jet engines, where turbines were proposed and patented in the 1790s but couldn't be built. As soon as the ability to build more precisely and strongly was developed, a lot of queued inventions popped up.

So it seems there will be a burst of invention after similar enabling technologies become widely available - when we can reliably build "enough" kinds of nanotechnology components, there should be a corresponding burst as the already-waiting low-hanging nanofruit is harvested.

Sure, I'm willing to agree that the 1950s and 1960s saw a lot also. But the question is whether there's any substantial such activity now. Your point about queued inventions is very well taken.

The nanotech point might also be valid. Taken together that doesn't mean that the pace of technological change is accelerating but that we should expect it to start accelerating soon.

Limiting ourselves to game-changing technologies might not be a good way to measure this sort of thing. I suspect they're only distinguishable from hype in retrospect, at least when we're talking about popular technology rather than theory.

We haven't had anything at all as game-changing as radio or airplanes for some time. Even if one argues that the world wide web counts (which is itself questionable since it is only an addition to the pre-existing internet), that was in the early 1990s.

I'd say the social impact of the World Wide Web (even if the technology was there before) and of cell phones are on par with those of the radio and airplanes.

It seems pretty clear that technological change has accelerated over the last few decades or centuries -- most metrics of human technical ability, from generalizations of Moore's Law to measures of maximum speed and motive power, show a choppy but clear exponential curve over time. But to prove or disprove the singularity concept, it additionally seems necessary to identify the factors driving this acceleration.

Kurzweil, among other singularitarian writers of the accelerating change school, is of the opinion that technical sophistication itself drives further sophistication: technology feeds on itself, allowing smart people to do exponentially more with their hitherto more or less constant brainpower. This isn't a bad model. But while it's more or less inarguable that sophisticated tools are often a precondition for further sophistication, and it seems likely that certain types of tools (information storage and retrieval, communication, design, theory, etc.) do contribute to accelerating change, convincingly demonstrating that they contribute an exponential term is much tougher.

Offhand, the best alternative explanation I can come up with is that the dominant term for technological change is population: more people means more geniuses, and larger and more complex civilizations provide opportunities for specialization which allow smart people to innovate more effectively within their fields. If this is the case, we'd expect technological improvement to appear exponential up to the present, but to stagnate when the most populous regions of the world hit their demographic transition a few decades out.

I'm not immediately sure how you'd tell the difference between this hypothesis and Kurzweil's tech-driven model, although the existence of large but technologically stagnant civilizations (like China from the 15th to the 19th centuries) seems to imply that it's incomplete.
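
A toy simulation of the two stories (neither Kurzweil's actual model nor a fitted demographic model; every parameter below is an arbitrary choice for illustration): in one, technology growth is proportional to the technology level itself; in the other, it is proportional to population, which saturates after a demographic transition.

```python
def simulate(years=100):
    tech_self = 1.0   # technology feeds on itself: growth proportional to tech level
    tech_pop = 1.0    # population-driven: growth proportional to population
    population = 1.0
    for year in range(years + 1):
        if year % 25 == 0:
            print(f"year {year:3d}: self-driven {tech_self:6.2f}   pop-driven {tech_pop:6.2f}")
        tech_self += 0.03 * tech_self
        tech_pop += 0.03 * population
        population = min(population * 1.01, 2.0)  # demographic transition caps population

simulate()
# The two curves track each other fairly closely for the first few decades and
# only diverge sharply later - one reason it is hard to distinguish the
# hypotheses from the historical record alone.
```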

As far as Eliezer is concerned (with respect to the singularity) the important measurement is "best available technology" and not "average household technology". If you believe in a singleton scenario for the singularity, it's the most advanced computers and other technologies that will probably be used.

Thanks, though it wouldn't shock me if average household and medical tech makes a contribution to the development of best available tech: if people have better health and more free time, they have a chance to be more inventive.

I regularly see technology that makes me wonder what we'll have five or ten years from now - Wikipedia, 3D MMOs, smartphones, calculators, all the stuff you can buy online. I'm generally disappointed: today, World of Warcraft is the most popular MMO, as it was five years ago. The top-of-the-range calculators (TI-92) are the same as when I was in school, when they had just come out, etc. Wikipedia is neater than it was five years ago, but not extraordinarily so. But still, when I ask my parents, it doesn't seem that in their time they had brand-new nifty impressive stuff hitting the world as often - they weren't wondering "damn, what will it be like in 5 years!".

I think that when it comes to technology, our "foreseeable future" has been shrinking. So I'd say "probably accelerating", but I'm open to evidence.

World of Warcraft would be better if it looked more like their demos do:

"Cataclysm Cinematic": http://vspy.org/?v=Wq4Y7ztznKc

It'd also be markedly less popular, since it wouldn't be accessible to people with 5-year-old computers.

I agree with the statement made elsewhere that "it's the best technology that matters, not what is accessible to most people."

Eliezer said in a speech at the Singularity Summit that he's agnostic about whether technological change is accelerating, and mentioned Michael Vassar and Peter Thiel as skeptics.

This is subject to memory, but I think the point under contention is whether or not technological change is exponentially accelerating. It seems impossible to contest that technological change is accelerating,* but parabolic and exponential trajectories make incredibly different predictions ~20 years out.

*Assuming a long timescale, here. It might be possible to contest for short timescales, but that seems only doable because at short timescales you can lose the signal in the noise.
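
A minimal illustration of how large that difference is (the two toy curves below are arbitrary - a quadratic and an exponential, both normalized so that "progress" equals 1 today after 100 years of growth):

```python
def quadratic(t):
    # Parabolic growth; t is measured in years from today, curve started 100 years ago.
    return ((t + 100) / 100) ** 2

def exponential(t):
    # Exponential growth, doubling every 10 years.
    return 2 ** (t / 10)

for name, f in [("quadratic", quadratic), ("exponential", exponential)]:
    last_20 = f(0) - f(-20)   # progress made over the past 20 years
    next_20 = f(20) - f(0)    # progress predicted for the next 20 years
    print(f"{name:11s}: next 20 years bring ~{next_20 / last_20:.1f}x the change of the last 20")

# The parabolic curve predicts roughly "more of the same" (about 1.2x), while the
# exponential predicts several times as much change ahead as behind (about 4x).
```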

I'm rescuing Miller's comment from oblivion because it was intended to be ironic.

This is easy. We just have to measure how much technology changed from year to year for the last 100 years or so, and then eyeball the curve. Does anyone have that data?

Harder than the Turing test: Being better than humans at identifying irony.

Eliezer said in a speech at the Singularity Summit

At 29:40 BTW.

It will be difficult to measure without a clear definition. Perhaps a taxonomy of definitions based on frequency of application would help, so that important changes over small domains are measured separately from unimportant changes over larger domains; the rate of change in particle physics would then show a measure different from the development of robot vacuum cleaners. Once clean definitions and categories have been determined, a simple count expressed as a quantity per year (or other time frame) would give a rate. Those rates could be used to compare periods of time and determine whether technological change is advancing.

Considerations like those in taw's post using expected utility theory to compare GDPs play a role.

Moore's law just tracks the number of transistors that can be placed inexpensively on an integrated circuit.

This seems like a rather ridiculous metric - if you go asynchronous, you can cover really large areas without too much difficulty. Edit: Moore's law is to do with price and performance.

Per area! :-P

That might make a little more sense (until we get proper 3D chips).

At my level of understanding, thermal budget concerns and other manufacturing defects suggest single-layer 2D chips are the only commercially viable option, and there are a number of challenges when it comes to stacking 2D chips to make a 3D chip. Is there a reason to prefer 3D chips to a large collection of asynchronous 2D chips? Have I misunderstood the technical challenges involved? (Both?)

Increased dimensionality reduces signal distances - and so increases speed.

Brains illustrate the viability of the approach.

OK - I agree with you that it would be a good solution to the packing problem, but I don't think the packing problem is the most relevant one.

[-][anonymous]00

The answer to the question doesn't much help revise our predicted future timeline or the key markers on it, unless we postulate a full reversal of progress.

[-]Miller-10

This is easy. We just have to measure how much technology changed from year to year for the last 100 years or so, and then eyeball the curve. Does anyone have that data?

I'm not sure whether you're being ironic.

Yes, irony. All the burden here is on the commenters, and if they had great ideas they should post them at the top themselves. It's kind of a poisonous time-sink pattern to ask a quantitative question about an extremely fuzzy concept. The math circuits light up and expect to be able to provide an answer. The best philosophers have fallen prey to bad questions. I also like nazgul's reference to Wittgenstein.

If you swapped technology out for GDP, my comment would be straightforward.