New Year's Predictions Thread

18 Post author: MichaelVassar 30 December 2009 09:39PM

I would like to propose this as a thread for people to write in their predictions for the next year and the next decade, when practical with probabilities attached. I'll probably make some in the comments.

Comments (426)

Comment author: PhilGoetz 09 January 2010 03:38:03AM *  1 point [-]

Carry-on luggage on US airlines will be reduced to a single handbag that inspectors can search thoroughly, in 2010 or 2011.

Comment author: gwern 26 August 2010 10:19:38AM 0 points [-]
Comment author: PhilGoetz 26 August 2010 10:46:22PM 0 points [-]

2010 is almost over, so the odds of my being right are now considerably less.

Comment author: gwern 27 August 2010 04:12:52AM 0 points [-]

Well, we're only 8/24 of the way to the end of 2011, so you could still be right. Ganbaru!

Comment author: LucasSloan 05 January 2010 05:23:42AM 0 points [-]

I get into UC Berkeley - 70%

Comment author: gwern 26 August 2010 10:18:44AM 0 points [-]

http://predictionbook.com/predictions/1719 but what date should the prediction terminate on?

Comment author: LucasSloan 26 August 2010 02:06:11PM 0 points [-]

About 3 months ago.

Comment author: gwern 27 August 2010 04:14:37AM 0 points [-]

o.0

OK, did you get in or no?

Comment author: LucasSloan 27 August 2010 06:50:59AM 1 point [-]

Yes.

Comment author: gwern 27 August 2010 07:04:28AM 0 points [-]

Congratulations to the both of us, then.

Comment author: NancyLebovitz 04 January 2010 12:54:11PM 0 points [-]

Secession: If you mean a state trying to leave the US in the next decade, 5%. If you mean a state actually being allowed to leave, I put it at 0%.

Insurrection in the next decade: I'm defining an insurrection as at least 1000 people in the same or closely allied organizations with military weapons taking violent action against the US government: 30%. They'll lose. It's certainly possible that my opinion on this is based on reading too much left wing material which is very nervous about the right. On the other hand, 1000 isn't a lot of people.

All predictions are 10 years out unless otherwise noted.

The rest of the world: Another EU-style organization gets started: 30%. The advantages of having a large population are getting obvious, and it's astonishing to me to see countries begging for a chance to give up some national sovereignty.

Fabbing: Automated custom shoes: 50%. Possibly wishful thinking-- I have feet which aren't quite in the easy-to-fit range.

On one hand, there's a massive market. On the other, it's probable that I wildly underestimate how hard it is to put a shoe together. And shoes are a way of signaling status, so a custom machine shoe for the general market can't look much different from standard shoes.

Custom machine shoes will start out expensive, and be for the athletic market.

Another product: Sous vide cookers (poaching food in vacuum-sealed bags at precisely controlled temperatures: has many good effects and should be especially appealing to geeks) will be down to $200 within 5 years (80%) and half as common as microwaves within 10 years (50%).

Obama will be re-elected unless he is assassinated (5%), there is a major terrorist attack on US soil (I'm not betting on that one, too random, and how he responds will have an unpredictable effect, too), or the economy doesn't improve (I think it will, but don't have a percentage).

Comment author: Kevin 03 January 2011 11:22:32PM *  0 points [-]

You can get a hacker sous vide setup for under $200 today. http://news.ycombinator.net/item?id=2058982

Comment author: NancyLebovitz 03 January 2011 11:37:00PM 0 points [-]

I think you could when I made the prediction-- what I had in mind was a sous vide cooker that you didn't need to put together.

Comment author: mwengler 03 January 2011 09:50:14PM 2 points [-]

Just as a check on 0% for a state being allowed to secede, consider this.

What probability would you put on there being sufficient devastation on the eastern seaboard of the US in the next decade from (for example) bio or nuclear attacks or terrorism? If that happened, what would be the probability that the US would be disbanded as a going concern? I realize you would likely assign very small numbers to these possibilities, but possibly > 0%. If you assign >0% to this, then you assign >0% to a state being allowed to secede. (recapitulating an objection voiced to me by Anna Salamon when I made a claim of extremely small probability for some risk or another).

Comment author: Costanza 03 January 2011 11:54:48PM *  0 points [-]

There's a lot of ruin in a nation. The main Axis nations of World War II -- Germany, Italy, and Japan -- provide some examples of nations that were really, really traumatized and damaged. Out of the three, only Germany split apart, and that only because of competing foreign occupiers. Even then it reunited as soon as it got the chance. I don't think there's enough hostility or just plain difference between most of the states west of the Mississippi to cause them to separate, especially under threat of external attack. If anything, I'd expect them to band together as tightly as possible.

Comment author: NancyLebovitz 03 January 2011 11:34:05PM 0 points [-]

I don't think the US would go away even if the eastern seaboard was nothing but glassy craters and deadly microbes.

That being said, it's conceivable that some technological or ideological change could weaken the central government to the point that states would be let go, though it's hard to imagine something that drastic shaping up in as little as 9 years. I'm also not sure what change could happen which would break the federal government while leaving state governments intact.

Ok, though-- in a decade, something very odd could happen. I don't think a lot of people were predicting the dissolution of the USSR before it happened.

Meanwhile, sous vides don't seem to be a lot cheaper or more popular, but I didn't put as extreme a probability on that one.

Comment author: gwern 26 August 2010 10:16:04AM 0 points [-]
  1. http://predictionbook.com/predictions/1713 & http://predictionbook.com/predictions/1714
  2. http://predictionbook.com/predictions/1715
  3. An EU-style organization - you'll have to be more specific than that. Every region has a bunch of multinational orgs like the UN. Africa has the African Union, Asia has ASEAN, SAARC, BIMSTEC, etc. Maybe you would prefer a prediction like 'at least 10 nations in Asia/Africa/South America will create a new common currency and switch to it'?
  4. http://predictionbook.com/predictions/1716 I agree that this one is wishful thinking on your part. :)
  5. http://predictionbook.com/predictions/1717 & http://predictionbook.com/predictions/1718 I agree that it's perfectly possible (surely right now) to sell a sous-vide cooker for $200; I question that there is demand enough, and really have no idea about the business environment. Cynicism tells me that there is no enormous revolution in American cuisine in the offing to the point where effectively half the middle-class has a sous-vide cooker, though. I mean come on.
  6. http://predictionbook.com/predictions/452 for his re-election
Comment author: NancyLebovitz 26 August 2010 01:37:09PM *  0 points [-]

Thanks, mostly.

I think it would have been fairer to make my predictions "A state will not try to secede" and "A state will not succeed at seceding".

Other than that, it's interesting to see how uncertain I am that some of my predictions are the result of my own thinking rather than emotional effects from people I've been reading.

(3) What I had in mind for an EU-style organization was dropping restrictions on trade and travel. At this point, I'm not as optimistic, but that feels more like mood than new information. I don't know whether dropping restrictions on trade requires a common currency.

(4) Computer-fabbed custom-fitted shoes are a lot easier than AI. If you don't think that's at all likely within 10 years, does this affect any predictions you might have for AI? Your answer is about there not being a market for them-- I'd say that the market isn't perceived. Either way, I don't get the impression that that tech is ready to do it yet.

It might make more sense for a computer to measure the feet and make the pieces, but have human beings put the shoes together. :-/

I'm also assuming shoes would be mailed rather than being shoes on demand-- shoes on demand would be another jump in technology.

Thinking about it a little more, the footprint in stores could be pretty small-- just the measuring device. I'm not sure how much support from store staff it would be apt to need at the beginning.

This sort of development is also dependent on how much capital is available, and I'm not feeling optimistic about that.

(5) The conveyor belt for new aspects of food (perhaps unsurprisingly) seems to be more efficient for prepared food and ingredients than for cooking methods. I still haven't had sous vide food myself, but everything I've heard about it makes it sound wonderful. I think there will be a sudden shift with sous vide food becoming available in mid-range restaurants followed by a lot of people wanting to cook it.

ETA: The website didn't just format the numbers into pretty paragraphing, it "corrected" the numbers.

Comment author: gwern 27 August 2010 04:27:45AM 0 points [-]

For 3, a monetary union isn't necessary; look at the US & Mexico & Canada, thanks to NAFTA. Certainly helps, though. I don't really see any areas which might do this sort of thing. Open borders and no trade barriers is a very Western 1st World sort of thing to do, and the obvious candidates like Japan don't really have an incentive to do so. (Japan has no land borders, so having passport checks doesn't really increase the cost of flying or boating to it.)

For 4: I think custom-fitting is already possible, and has been since the early laser scanners came out in the... '80s? But like the sous-vide, I'm not confident in their uptake. (It's kind of like jetpacks and flying cars and pneumatic postal systems. We have them; we just don't use them.)

ETA: The website didn't just format the numbers into pretty paragraphing, it "corrected" the numbers.

This is part of standard Markdown, which renumbers list items automatically, so you can number each item '1.' if you want! If you want a literal numbered item, you can escape the period with a backslash, or you can do like I did and insert a paragraph after the bullet (newline, and then indent the paragraph by 4-5 spaces).

Comment author: NancyLebovitz 27 August 2010 12:36:02PM 0 points [-]

Damn, on 3 I didn't say what I meant. The genuinely big deal is freedom to relocate and work.

Do you have a source for computerized custom-fitting of shoes? The big deal isn't just the fitting, though, it's reasonably-priced manufacture.

Afaik, jet-packs can be made, but carrying enough fuel for significant travel isn't feasible.

As for flying cars, it finally occurred to people that there were weather and pilot safety issues.

I don't see those sorts of considerations applying to sous vide or computerized custom shoes.

The futuristic prediction which seems to be not happening because people just don't want it is video which shows your face while you're talking on the phone.

Comment author: jimrandomh 30 August 2010 01:25:41PM 0 points [-]

Someone I know has a foot problem. Her orthopedist recommended having a scan done to produce inserts to adjust the shape of her regular shoes, and said if that didn't work, then entirely custom shoes could be made. So computerized custom-fit shoes do exist, but they're considered a medical item which makes them expensive.

Comment author: NancyLebovitz 30 August 2010 03:03:38PM 0 points [-]

That sounds to me as though the inserts are customized, but the custom shoes would be made by humans.

Comment author: RichardKennaway 27 August 2010 04:20:42PM *  0 points [-]

The futuristic prediction which seems to be not happening because people just don't want it is video which shows your face while you're talking on the phone.

That one's already happened. My new iPhone does video calls, and so does Skype on any computer with a webcam. That wasn't driven by demand, though, it was more that the technology all became ubiquitous for other purposes and it was easy to stitch it together to provide videophone functionality, even if it isn't actually used very much.

Comment author: gwern 27 August 2010 01:45:54PM 0 points [-]

Do you have a source for computerized custom-fitting of shoes? The big deal isn't just the fitting, though, it's reasonably-priced manufacture.

IIRC, I read it a long time ago in a mouldering paperback of Alvin Toffler's The Third Wave. (Or was it Future Shock?) But even without having read about clothes in particular, I have read about 3D models of statues etc. being generated through rotating the object while shining a laser on it; thus obviously one can generate a human model (I think CGI already does this), and fit clothes on that model. I would be deeply shocked if no one has ever used laser modeling to fit garments of some kind.

I don't see those sorts of considerations applying to sous vide or computerized custom shoes.

Considerations like expense and minimal benefit don't apply? Mm, well, as Marx said, nous verrons. Figuring out whose perception of reality is clearer is one of the points of recording predictions.

Comment author: whpearson 27 August 2010 02:21:06PM 2 points [-]

I would be deeply shocked if no one has ever used laser modeling to fit garments of some kind.

You don't have to be shocked. Here is one.

I think what user-specific clothing and shoes currently lack is sufficiently advanced robotics. If you do the obvious thing, cutting out bits of material and attaching them together, you have quite a few problems. You have to manipulate non-standard-sized bits of flexible material. A production line deals with many identically sized and shaped bits of material, so you can change molds/tools depending on the size of the shoe.

The knitting machine above removes that consideration as it produces the finished garment in one piece.

I found this pdf on customized shoe production from 2001 (requires login) while trying to find some videos of shoe manufacturing to confirm my ideas. I don't have time to look into it, but seems relevant to the discussion.

Comment author: NancyLebovitz 27 August 2010 01:56:27PM 1 point [-]

The hard part of computerized custom shoes might be designing the shoes rather than measuring the foot. Also note that the shoe has to fit while you're walking, though that seems like just adding difficulty rather than a whole new problem.

I should have been more precise about the difference I see between flying cars and sous vide cooking. Flying cars include infrastructure and group effects in a way that sous vide cookers do not.

Comment author: John_Maxwell_IV 05 January 2010 06:00:24AM 2 points [-]

Secession: If you mean a state trying to leave the US in the next decade, 5%. If you mean a state actually being allowed to leave, I put it at 0%.

Surely you mean "my estimate rounds to 0%"?

Comment author: NancyLebovitz 06 January 2010 01:21:57AM 0 points [-]

I meant 0%, but you probably have a point that I should present the chance as negligible rather than non-existent. Is there a limit, though? Does it make sense to say that there's a non-zero chance that a state will propose secession and be allowed to leave by tomorrow morning?

Comment author: AdeleneDawner 06 January 2010 01:29:37AM 1 point [-]

Does it make sense to say that there's a non-zero chance that a state will propose secession and be allowed to leave by tomorrow morning?

Yep. It even makes sense to say that there's a non-zero chance that a state seceded last month, and that we haven't heard about it yet. The word 'epsilon' is useful in such cases; it means 'nearly zero' or 'too close to zero to calculate'.

Comment author: John_Maxwell_IV 09 January 2010 05:16:10AM 1 point [-]

The word 'epsilon' is useful in such cases; it means 'nearly zero' or 'too close to zero to calculate'.

"Negligible" is a much better word, in my opinion, since epsilon is (conventionally) an arbitrarily small number, not a sufficiently small number. You could use "infinitesimal", but nothing in reality is actually infinitesimally small (including probabilities), so again you'd be inaccurate. I always get frustrated when people misuse precise mathematical words that have lots of syllables in them. The syllables are there to discourage colloquial use! I don't mind if you try to show off your knowledge, but for heaven's sake don't screw up and use that precise brainy term wrong!

Comment author: Tyrrell_McAllister 09 January 2010 05:50:51AM *  3 points [-]

You're straddling a strange line here. You're demanding a certain amount of strictness that is itself short of perfect strictness.

There's no such thing as an "arbitrarily small number". There are numbers chosen when any positive number might have been chosen. In particular, a given epsilon need not be "negligible". Really, to conform to the strict mathematical usage, one shouldn't say "epsilon" without first saying "For every". Once you're not demanding that, you're not using the "precise mathematical words" in the precise mathematical way.
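The strict quantified usage Tyrrell alludes to can be written out explicitly. In the standard definition of a limit, for instance, epsilon only ever appears under a universal quantifier; no particular epsilon is "small":

```latex
\forall \varepsilon > 0 \;\; \exists N \in \mathbb{N} \;\; \forall n > N : \quad |a_n - L| < \varepsilon
```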

I'm not saying that you're on some slippery slope where anything goes. But I wouldn't say that AdeleneDawner is either.

Comment author: John_Maxwell_IV 12 January 2010 06:05:53AM *  0 points [-]

You're demanding a certain amount of strictness that is itself short of perfect strictness.

Actually, I'm fine with people speaking vaguely, I just don't want to see terminology misused.

There's no such thing as an "arbitrarily small number".

"Through adding zeroes between the decimal point and the 7 in the string '.7', the number we are representing can be made arbitrarily small." Is this a misuse of the word "arbitrarily"?

In particular, a given epsilon need not be "negligible". Really, to conform to the strict mathematical usage, one shouldn't say "epsilon" without first saying "For every".

The important thing about an epsilon in a mathematical proof is, conventionally, that it can be made arbitrarily small. This is a human interpretation I am adding on to the proof itself. If the important thing about a variable in a proof was that the variable could become arbitrarily large, my guess is that a variable other than epsilon would be used.

Comment author: Tyrrell_McAllister 12 January 2010 03:22:36PM 1 point [-]

"Through adding zeroes between the decimal point and the 7 in the string '.7', the number we are representing can be made arbitrarily small." Is this a misuse of the word "arbitrarily"?

Your usage is fine, so long as it's clear that "arbitrarily small" is a feature of the set from which you are choosing numbers, or of the process by which you are constructing numbers, and not of any particular number in that set. This is clear with the context that you give above. It wasn't as clear to me when you wrote that "epsilon is (conventionally) an arbitrarily small number".

Comment author: byrnema 09 January 2010 05:44:06AM 1 point [-]

Don't forget "modulo".

Suppose that Nancy meant 0% except for a few special cases that she didn't think should be relevant. Then she could say, '0% modulo some special cases'.

Comment author: ciphergoth 09 January 2010 09:48:56AM 1 point [-]

I often use epsilon in the same informal way AdeleneDawner does, though I'm perfectly aware of the formal use. Still, I think the informal use of "modulo" is more defensible - it maps more closely to the mathematical meaning of "ignoring this particular class of ways of being different".

Comment author: John_Maxwell_IV 12 January 2010 06:09:51AM 1 point [-]

Could you explain this in greater detail? This way of using "modulo" bothers me significantly, and I think it's because I either don't know about one of the ways "modulo" is used in math, or I have an insufficiently deep understanding of the one way I do know that it's used.

Comment author: RobinZ 12 January 2010 01:14:59PM *  6 points [-]

In modular arithmetic, adding or subtracting the base does not change the value. Thus, 12 modulo 9 is the same as 3 modulo 9. So, for example, "my iPhone is working great modulo the Wifi connection" implies that if you can subtract the base ("the Wifi connection") you can transform a description of the current state of my iPhone into "working great".

(For your amusement: modulo in the Jargon File. Epsilon is there too.)

Edit: Actually, in this case, you would have to add the base, because my Wifi isn't working, but the statement remains the same.
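RobinZ's point can be checked directly: values that differ by any multiple of the base are equivalent, which is the sense in which "working great modulo the Wifi" collapses to "working great" once the Wifi issue is subtracted out:

```python
# 12 and 3 differ by 9, so they agree modulo 9.
assert 12 % 9 == 3 % 9 == 3

# Adding or subtracting the base any number of times leaves the
# residue unchanged (Python's % always returns a non-negative
# residue for a positive modulus, even for negative inputs).
for k in (-2, -1, 0, 1, 2):
    assert (12 + 9 * k) % 9 == 3
```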

Comment author: AdeleneDawner 09 January 2010 05:38:22AM 2 points [-]

'Kay.

I'm not the only one you should be ranting at, though - I picked it up here, not in a math class, and I suggested it because it's in common use.

Comment author: John_Maxwell_IV 12 January 2010 06:07:45AM 0 points [-]

Yep, it is probably unrealistic to expect random folks to avoid picking up multisyllable terms in the way they pick up regular words.

Comment author: ejklake 03 January 2010 02:30:21PM 0 points [-]

No predictions about the state of the environment? Is every point of contention too close to call, then?

Comment author: magfrump 03 January 2010 08:01:31AM 1 point [-]

I would say better-than-even chances that sites like Intrade gain prestige in the next decade

and betting on predictions will become common (90% that there is a student at 75% or so of high schools in 2020 who will take bets on future predictions on any subject, 40% that >5% of the US middle class will have made a bet about a future prediction)

naive guesses based largely on http://www.fivethirtyeight.com/2009/11/case-for-climate-futures-markets-ctd.html

I predict further that I will continue to post on LW at least once a month next year (90%) and in 2020 (50%)

Comment author: gwern 25 August 2010 06:16:25AM *  1 point [-]
  1. http://predictionbook.com/predictions/1710

    Kind of vague, but I suppose it's not too hard to do a search and note that the NYT only mentioned Intrade a few times in the 2000s and more in the 2010s.

  2. http://predictionbook.com/predictions/1709

    I have no idea how one would measure this one. I'm sure that at any high school you could find a student willing to wager with you on any damn topic you please.

  3. Not including a prediction for the middle class. Already true if you count sports, as many prediction markets such as Betfair do.
  4. http://predictionbook.com/predictions/1711
  5. http://predictionbook.com/predictions/1712

    Agree with orthonormal that this is seriously over-optimistic. The only site I even use today that I did in 2000 would be Slashdot, and I haven't commented there in a dog's age.

Comment author: magfrump 25 August 2010 10:16:13AM 0 points [-]

I probably meant for claim 3 to exclude sports.

Comment author: gwern 25 August 2010 12:34:03PM 0 points [-]

Well, then you're using a variant definition of prediction market, and before I can feel confident judging any prediction of yours, I need to know what your idiosyncratic interpretation of the phrase is.

Comment author: magfrump 25 August 2010 07:59:25PM 0 points [-]

I agree that I wasn't making the most coherent claim, and since it's been a long time I can't guarantee fidelity of what I originally intended.

But my best guess, trying to phrase this as concretely as possible, is that I meant to predict that either

a) sports betting agencies would expand into non-sports venues and see significant business there

or b) newer betting agencies not created to serve sports would achieve similar success

I would be "disappointed" if "non-sports" meant something like player movement between teams and "excited" if it meant something like unemployment rates and vote shares in elections.

Comment author: orthonormal 03 January 2010 08:31:20AM 8 points [-]

I predict further that I will continue to post on LW at least once a month next year (90%) and in 2020 (50%)

Is there any comparable website that you were posting on in 2000 and continue to post on today? I agree that LW is awesome, but web communities have a short shelf life (and a tendency to be superseded as web technology improves).

Comment author: magfrump 03 January 2010 06:59:37PM 4 points [-]

Probably a good reason to adjust the estimate down. On the other hand, I was 11 in 2000, so I wouldn't have been on this kind of site anyway, and conditional on the prediction that news-betting becomes more prestigious, rationality almost certainly will.

Point taken, with the real point being that I have no sense of how long a decade is, so I'll adjust that down to a 20%

I have been part of a different web community for five years and am still in touch with it, although only barely at the level of once a month. So my odds for awesomeness overcoming shelf-lives may be higher than for most.

Comment author: Morendil 02 January 2010 06:00:06PM *  3 points [-]

I expect that Brain-Computer Interfaces will make their way into consumer devices by the next decade, with disruptive consequences, once people become able to offload some auxiliary cognitive functions into these devices.

Call it 75% - I would be more than mildly surprised if it hadn't happened by 2020.

For what I have in mind, what counts as BCI is the ability to interact with a smartphone-like device in an inconspicuous manner, without using your hands.

My reasoning is similar to Michael Vassar's AR prediction, and based on the iPhone's success. That doesn't seem owed to any particular technological innovation; rather, Apple made things usable that were only previously feasible in the technical sense. A mobile device for searching the Web, finding out your GPS position and compass orientation, and communicating with others was technically feasible years ago. Making these features only slightly less awkward than previously has revealed a hidden demand for unsuspected usages, often combining old features in unexpected ways.

However, in many ways these interfaces are still primitive and awkward. "Sixth Sense" type interfaces are interesting, but still strike me as overly intrusive on others' personal space.

It would make sense to me to be able, say, to subvocalize a command such as "Show me the way to metro station X", then have my smartphone gently "tug" me in the right direction as I turn left and right, using a combination of compass and vibrations. This is only one scenario that strikes me as already easy to implement, requiring only some slightly greater integration of functionality.
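The "tug" scenario really is easy to implement; the hard part is the subvocal input, not the guidance logic. Here is a minimal sketch of the guidance side, where every name and the 10-degree dead zone are hypothetical illustrations, not any real device's API:

```python
def tug_direction(heading_deg: float, bearing_deg: float) -> str:
    """Map compass heading vs. bearing-to-target into a haptic cue.

    heading_deg: direction the user is currently facing (0-360).
    bearing_deg: direction from the user to the destination (0-360).
    Returns 'left', 'right', or 'ahead' for the vibration motor.
    """
    # Signed angular difference, normalized into [-180, 180).
    delta = (bearing_deg - heading_deg + 180) % 360 - 180
    if abs(delta) < 10:   # arbitrary dead zone so we don't buzz constantly
        return "ahead"
    return "right" if delta > 0 else "left"
```

The point of the sketch is Morendil's: nothing here requires new technology, only tying a compass reading and a GPS bearing to a vibration motor.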

I expect such things to be disruptive, because the more transparent the integration between our native cognitive abilities, and those provided by versatile external devices connected to the global network, the more we will effectively turn into "augmented humans".

When we merely have to think of a computation to have it performed externally and receive the result (visually or otherwise), we will be effectively smarter than we are now with calculators (and already essentially able, some would say, to achieve the same results).

I am not predicting with 75% probability that such augmentation will be pervasive by 2020, only that by then some newfangled gadget will have started to reveal hidden consumer demand for this kind of augmentation.

ETA: I don't mind this comment being downvoted, even as shorthand for "I disagree", but I'd be genuinely curious to know what flaws you're seeing in my thinking, or what facts you're aware of that make my degree of confidence seem way off.

Comment author: Morendil 05 June 2013 08:19:45PM 0 points [-]

Three and a half years in, this.

Comment author: shminux 05 June 2013 08:40:54PM 0 points [-]

Any updates to your original prediction?

Comment author: Morendil 30 November 2013 10:21:26AM 2 points [-]

Now this.

Comment author: Morendil 06 June 2013 06:10:35AM 0 points [-]

I think I've come round to Gwern's point of view - this is a bit too vague. The news item I posted makes me feel like we're still on track for it to happen, though I could be a few years off the mark. I might knock it down to 65% or so to account for uncertainty in timing.

Comment author: gwern 24 August 2010 11:07:10AM 1 point [-]

I'm not thrilled about your vagueness about what technologies count as a BCI. Little electrodes? The gaming device that came out last year or so got a lot of hype, but the gamers I've talked to who have actually used it were all deeply unimpressed. Voice recognition? Already here in niches, but not really popular.

If you can't think of what interfaces specifically*, then maybe you should phrase your prediction as a negative: 'by 2020, >50% of the smart cellphone market will use a non-gestural non-keyboard based interface' etc.

* and you really should be able to - just 9 years means that any possible tech has to have already been demonstrated in the lab and have a feasible route to commercialization; R&D isn't that fast a process, and neither is being good & cheap enough to take over the global market to the point of 'pervasive'

Comment author: Morendil 08 September 2010 10:01:12AM 0 points [-]
Comment author: Morendil 28 August 2010 06:37:23PM *  0 points [-]

Yep, electrodes, as in the gaming devices. A headset is the form factor I have in mind, so not necessarily electrodes if this is to be believed. I don't want to commit to burdensome implementation details, but voice isn't what I mean - it doesn't count as "unobtrusive" to my way of thinking.

I envision something where I can just form the thought "nearest MacDonalds" (ETA: or somehow bring up a menu selecting that among even a restricted set) without it being conspicuous for an outside observer, and get some form of feedback from the device leading me in the right direction. Visual overlay would work, but so would a physical tug.

Comment author: spasinsky 03 January 2010 12:17:26AM -1 points [-]

Given the feasibility that currently exists for gadgets that you envision... and Apple's uncanny ability to bring those ideas to market... I say 2015 is a 75% target for the iThought side-processor device. :)

Comment author: blogospheroid 02 January 2010 11:45:25AM 2 points [-]

At least one Asian movie will exceed $400 mn in worldwide box office gross before the end of the decade.

It will most probably not be a wuxia movie. My guess of its genre is urban action or speculative fiction.

Comment author: rahul 03 January 2011 07:36:10AM 0 points [-]

I agree. I especially see a lot of convergence in present day mainstream Bollywood cinema with conventional blockbuster Hollywood fare in terms of both plots and production values. So expect a Moulin Rouge-like crossover musical in English with a major Hollywood box-office draw, an Indian model female lead, rags-to-riches storyline, Inception-like action sequences and CGI by studios in Hyderabad and Bangalore.

Comment author: gwern 24 August 2010 11:02:35AM 0 points [-]

http://predictionbook.com/predictions/1708

Seems like a solid prediction. 2020 allows a lot of growth in China & India, and Bollywood-style movies already play well in the West - look at Slumdog Millionaire which nearly grossed $400M, despite being a British film on Indian matters.

Comment author: orthonormal 02 January 2010 06:00:56AM 13 points [-]

One word: subcultures.

I think we'll see an expansion to most of the First World of the trend we see in cities like San Francisco, where the Internet has allowed people to organize niche cultures (steampunk, furries, pyromaniacs, etc.) like never before. I think that, by and large, people would prefer to seek out a smaller culture based on a common idiosyncratic interest if it were an option, not least because rising in status there is often easier than getting noticed in the local mainstream culture. I think that the main reason the mainstream culture is presently so large, therefore, is that it's hard for a juggling enthusiast in Des Moines to find like-minded people.

I expect that over the next 10 years, more and more niche cultures will arise and begin to sprout their own characteristics, with the measurable effect that cultural products will have to be targeted more narrowly. I expect that the most popular books, music, etc. of the late 2010s will sell fewer copies in the US than the most popular books, music, etc. of the Aughts, but that total consumption of media will go up substantially as a thousand niche bands, niche fiction markets, etc. become the norm. I expect that high schoolers in 2020 will spend less social time with their classmates and more time with the groups they met through the Internet.

And I expect that the next generation of hipsters will find a way to be irritatingly disdainful of a thousand cultures at once.

Comment author: gwern 23 August 2010 02:48:36PM 3 points [-]

What do you make of criticism that sales currently show the exact opposite trend?

Comment author: orthonormal 24 August 2010 05:57:03AM 2 points [-]

Thanks for the link! I didn't know there was already a version of this theory out there, and I didn't know the actual figures.

So what do I make of this data (assuming the veracity of the Wikipedia summary, since I'm not dedicated enough to read the papers)? Well, I'm surprised by it.

Comment author: gwern 24 August 2010 10:29:17AM 1 point [-]

I'm not especially surprised. Aside from possible confounding factors like the rise of Free & free stuff (strongest in subcultures), which obviously wouldn't get counted in commercial metrics, technological and economic development means that mass media can spread even further than Internet-borne stuff can. Cue anecdotes about Mickey Mouse posters in African huts, etc.

The subcultures seem to me to appeal mostly to the restricted 1st World wealthier demographics that powered the mass media you are thinking of; one might caricature it as 'white' stuff. It makes sense that a subculture like anime/manga or FLOSS, which is primarily cannibalizing the 'white' market, can shrink ever more in percentage terms as the old 'white' stuff like Disney expands overseas into South America, Africa, Southeast Asia and so on.

If you had formulated your thesis in absolute numbers ('there will be more FLOSS enthusiasts in 2020 than 2010'), then I think you would be absolutely right. You might be able to get away with restricted areas too ('there will be more otaku in Japan in 2020 than 2010, despite a ~static population'). But nothing more.

Comment author: sketerpot 02 January 2010 06:42:30AM *  5 points [-]

So it's possible that, if we had a really huge, dense, wired city with excellent transportation, we would find a significant subculture of steampunk furries, or vampire gothic lolita hip-hop dance squads? Actually, this sounds a lot like Tokyo.

And I expect that the next generation of hipsters will find a way to be irritatingly disdainful of a thousand cultures at once.

It's easy, really. Practice this phrase: "Man, what weirdos." You just have to selectively overlook the weirdness of your own subculture while recognizing and stigmatizing it in others. It's an elegant approach.

Comment author: Zack_M_Davis 02 January 2010 06:10:54AM *  12 points [-]

the Internet has allowed people to organize niche cultures (steampunk, furries, pyromaniacs, etc.)

You forgot us!

Comment author: scientism 01 January 2010 10:49:22PM -1 points [-]

Next 10 years:

  1. Nativism discredited (80%)

  2. Traditional economics discredited (80%)

  3. Cognitivism/computationalism discredited (70%)

  4. Generative linguistics discredited (60%)

To elaborate somewhat: By #1 I mean that in the fields of biology, psychology and neuroscience the idea that behaviours or ideas or patterns of thought can be "innate" will be marginalised and not accepted by mainstream researchers.

By #2 I mean that, not only will behavioural economics provide accounts of deviations from traditional economic models, but mainstream economists will accept that these models need to be discarded completely and replaced from the ground-up with psychologically-plausible models.

By #3 I mean the idea that the brain can be thought of as a computer and the "mind" as its algorithms will be marginalised. I give this lower odds than nativism being discredited only because the cognitivist tradition has managed to sustain itself through belligerence rather than evidence and is therefore likely to be more persistent and pernicious. Nativism, on the other hand, has persisted because of the difficulty of experimentally demonstrating that certain behaviours are learned rather than innate (as well as belligerence).

By #4 I mean that traditional linguistics, and especially generative grammar, will be marginalised. This one has long puzzled me since the generative grammarians based their ideas on intuition and explicitly deny a role for data or experiment (or the need to reconcile their beliefs with biology). The main problem has been the absence of a viable alternative research program. This is beginning to change.

Comment author: gwern 23 August 2010 02:39:30PM 1 point [-]

I'm not entering any of these into PredictionBook because all 4 strike me as hopelessly argumentative and subjective. (Take #1 - what, you mean stigmatised even more than it already is as the province of racists/sexists/-ists?)

Comment author: DanielLC 02 January 2010 01:08:46AM 0 points [-]

Regarding 3, there's no way to find evidence against it (or for it, for that matter). You can't look at a given system and measure its sentience. The closest to that anyone's ever attempted is to try and test intelligence, but that assumes cognitivism/computationalism, among other things.

I agree with orthonormal, except that I don't have $10,000 to bet.

Comment author: Zack_M_Davis 01 January 2010 11:42:00PM *  4 points [-]

A few thoughts:

  • It would be valuable to do an outside view sanity check: historically, how frequently have research programs of similar prestige been discredited?

  • There are all the standard problems with authority---lots of folks insist that they're in the mainstream and that opposing views have been discredited. Clearly nativism &c. have been discredited in your mind; when do they get canonically discredited? Sometimes I almost think that everyone would be better off if everyone just directly talked about how the world really is rather than swiping at the integrity of each other's research programs, but I'm probably just being naive.

  • Re 3, my domain knowledge is somewhat weak, so everyone ignore me if my very words are confused, but I'm not sure what would count as a refutation of the mind being an algorithm. Surely (surely?) most would agree that the brain is not literally a computer as we ordinarily think of computers, but I understand algorithm in the broadest sense to refer to some systematic mechanism for accomplishing a task. Thought isn't ontologically fundamental; the brain systematically accomplishes something; why shouldn't we speak of abstracting away an algorithm from that? Maybe I've just made computationalism an empty tautology, but I don't ... think so.

  • I don't think the innate/learned dichotomy is fundamental; it's both, everyone knows that it's both, everyone knows that everyone knows that it's both. Like that old analogy, a rectangle's area is a product of length and width. What specific questions of fact are people confused about?

Comment author: scientism 02 January 2010 12:57:27AM 0 points [-]

I think these research programs represent something without a clear historical precedent. Traditional economics and generative linguistics, for example, could be compared to pre-scientific disciplines that were overthrown by scientific disciplines. But both exhibit a high degree of formal and institutional sophistication. I don't think pre-Copernican astronomy had the same level of sophistication. Economics also has data (although so did geocentric astronomy) whereas the generative tradition in linguistics considers data misleading and prefers intuitive judgement. What neither has is a systematic experimental research program or a desire to integrate with the natural sciences.

Cognitivism is essentially Cartesian philosophy with a computer analogy and experiments. In practice it just becomes experimental psychology with some extra jargon. Nativism, too, comes from Cartesian philosophy (Chomsky was quite explicit about this). While cognitivism has experiments it has an interpretation that isn't founded in experiment (the type of computer the brain is supposed to be and the algorithms it could be said to run is not addressed) and an opposition to integration with the natural sciences (the so-called "autonomy of psychology" thesis).

These research programs are similar to pre-scientific research programs but have managed to persist in a world where you have to attempt to "look scientific" in order to secure research grants and they reflect this fact.

You point to many problems and I wouldn't take any bets because of these. It would be too difficult to judge who had won. On the nature/nurture debate: Empiricism evolved into constructivism/interactionism (i.e., the developing organism interacting with the environment with genes driving development), which is the dominant view in biology, and it's not obvious what, precisely, modern Nativists believe. But it is obvious that they still exist since naive nativist talk persists almost everywhere else. It's similarly difficult to figure out what computationalists mean by their analogies and the degree to which they intend them to be analogies vs. literal propositions. This is probably why the natural sciences tend not to base research programs on analogies. What is clear is that they have a particular style of interpreting their results in terms of representations and sequential processing that is clearly at odds with biology and display no interest in addressing the issue.

Comment author: orthonormal 02 January 2010 01:20:29AM 4 points [-]

Nativism, too, comes from Cartesian philosophy (Chomsky was quite explicit about this).

First, this is the genetic fallacy. Secondly, I don't take Chomsky's authority seriously.

The experimental evidence that, say, Steven Pinker presents in How the Mind Works for innate mental traits and for the computational perspective is sound, and has nothing to do with Cartesian dualism.

Comment author: scientism 02 January 2010 01:37:32AM -2 points [-]

The point is that the views have their origins in philosophy rather than experiment. We're not dealing with a research program developed from a set of compelling experimental results but a research program that has inherited a set of assumptions from a non-empirical source. This is more obviously the case with computationalism, where advocates have shown almost no interest in establishing the foundational assumptions of their discipline experimentally, and some claim that to do so would be irrelevant. But it's also true for nativism where almost no thought is given to how nativist mechanisms would be realised biologically.

Comment author: orthonormal 01 January 2010 11:00:32PM *  6 points [-]

If we could agree on a suitable judging mechanism, I would bet up to $10,000 against you on #1 and on #3 at those odds (or even at substantially different odds). I also disagree on the latter claim in #2, but that's not as much of a slam dunk for me as the others.

Comment author: whpearson 01 January 2010 11:00:03PM 6 points [-]

Can you unpack what you mean by innate? I think babies would have a hard time surviving if sucking things wasn't a behaviour that was with them from their genes.

Comment author: Vladimir_Nesov 02 January 2010 12:18:31AM 3 points [-]

And more generally, the distinction innate/learned is overly simplistic in a lot of contexts; rather, there are adaptations that determine the way an organism develops depending on its environment. The standard reference I know of is

J. Tooby & L. Cosmides (1992). `The psychological foundations of culture'. In J. Barkow, L. Cosmides, & J. Tooby (eds.), The adapted mind: Evolutionary psychology and the generation of culture. Oxford University Press, New York.

Comment author: dfranke 01 January 2010 07:05:48PM 1 point [-]
  • By the end of 2013: Either the Iranian regime is overthrown by popular revolution, or there is an overt airstrike against Iran by either the US or Israel, or Israel is attacked by an Iranian nuclear weapon (70%).

  • Essentially seconding mattnewport: the price of gold reaches $3000USD, or inflation of the US dollar exceeds 12% in one year (65%).

  • The current lull in the increase of the speed at which CPUs perform sequential operations comes to an end, yielding a consumer CPU that performs sequential integer arithmetic operations 4x as quickly as a modern 3GHz Xeon (80%).

  • Android-descended smartphones outnumber iPhone-descended smartphones (60%).

  • The number of IMAX theaters in the US triples (40%).

Comment author: gwern 23 August 2010 02:36:36PM 0 points [-]
Comment author: sketerpot 02 January 2010 08:29:59AM 0 points [-]

The current lull in the increase of the speed at which CPUs perform sequential operations comes to an end, yielding a consumer CPU that performs sequential integer arithmetic operations 4x as quickly as a modern 3GHz Xeon (80%).

When you say sequential integer operations, do you mean integer operations that really are sequential? In other words, the instructions can't be performed in parallel because of data dependencies? If not, then this is already possible with a sufficiently wide superscalar processor or really big SIMD units.

But let's assume you really mean sequential integer operations. The only pipeline stage in this example that can't work on several instructions at once is the execute stage, so I'm assuming that's where the bottleneck is here. This means that the speed is limited by the clock frequency. So, here are two ways to achieve your prediction:

  1. Crank up the clock! Find a way to get it up to 12 GHz without burning up.

  2. Make the execute stage capable of running much faster than the rest of the processor does. This is natural for asynchronous processors; in normal operation the integer functional units will be sitting idle most of the time waiting for input, and the bulk of the time and complexity will be in fetching the instructions, decoding them, scheduling them, and in memory access and I/O. But in your contrived scenario, the integer math units could just go hog wild and the rest of the processor would keep them fed. This can be done with current semiconductor technology, I'm pretty sure.

So, either way, kind of an ambitious prediction. I like it.
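The arithmetic behind option 1 can be sketched in a few lines (a toy estimate only; the one-op-per-cycle assumption is the contrived fully-serial scenario described above, and the figures are illustrative):

```python
# Toy estimate: what clock would a 4x speedup on fully serial integer ops imply?
# Assumes exactly one dependent integer op retires per cycle, so throughput
# scales only with clock frequency (the contrived scenario above).

baseline_hz = 3e9        # the 3 GHz Xeon reference point
target_speedup = 4

ops_per_sec = baseline_hz                    # 1 op/cycle, fully serial
required_hz = baseline_hz * target_speedup   # execute stage must cycle 4x faster

print(required_hz / 1e9)  # 12.0 -- the "crank up the clock" option
```

Option 2 relaxes the same arithmetic by letting only the execute stage run at the higher rate while the rest of the pipeline stays slower.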

Comment author: Valkyrie_Ice 02 January 2010 11:20:15AM -1 points [-]

Have you not heard that they discovered a way to use graphene as a one-to-one replacement for copper in chip production? That alone will allow speeds of 12-15GHz.

I would put the odds of faster multicore chips, running at many times current speeds, being available by 2011-2012 at near 100%.

Comment author: Vladimir_Nesov 02 January 2010 12:16:05PM 2 points [-]

Have you not heard that they discovered a way to use graphene as a one-to-one replacement for copper in chip production? That alone will allow speeds of 12-15GHz.

This seems to be still very far from application, a quick search on your claim turned up only this paper that isn't cited by anybody yet, publicized in a few popular articles.

Comment author: sketerpot 02 January 2010 09:33:06PM 1 point [-]

Let's assume they put it into practice and start mass-producing processors with graphene interconnects with better-than-copper resistivity. We've got two things to worry about here: speed and power.

The speed of signal propagation along a wire depends on RC, the product of the resistance and the capacitance. Graphene lowers the resistance of a wire of a given size, but does nothing to lower the capacitance -- that depends on the insulator surrounding the wire and the shape of the wire and its proximity to other wires. The speed gains from graphene look moderate, but significant.

The power dissipated by sending signals through wires will be most of the power of future processors, if current trends continue. Power is a barrier to clocking chips fast. We can overclock processors a lot, but you've got to worry about them burning up. Decreasing resistivity improves the power situation somewhat, but the bulk of the interconnect's influence on power comes from its capacitance. Transistors have to charge and discharge the capacitance of the wires, and that takes power. So on power, graphene will help somewhat, but it's not the slam-dunk that Valkyrie Ice is expecting.

tl;dr: Graphene interconnect sounds good, but not fantastic.
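The asymmetry can be made concrete with a toy calculation (all figures below are illustrative placeholders, not real process parameters; the graphene resistivity in particular is an assumed value):

```python
# Why lower resistivity helps wire delay (RC) but not dynamic power (C*V^2*f).
# All numbers are illustrative, not real process data.

rho_cu = 1.7e-8    # copper resistivity, ohm*m (bulk value)
rho_gr = 1.0e-8    # hypothetical graphene-equivalent resistivity, ohm*m (assumed)
length = 1e-3      # 1 mm wire
area = 1e-14       # cross-sectional area, m^2 (illustrative)
C = 2e-13          # wire capacitance, F -- set by geometry and insulator, not wire material
V = 1.0            # supply voltage, V
f = 3e9            # switching frequency, Hz

def rc_delay(rho):
    R = rho * length / area   # wire resistance
    return R * C              # first-order RC delay

# Delay improves in proportion to resistivity...
print(rc_delay(rho_cu) / rc_delay(rho_gr))   # ~1.7x

# ...but dynamic switching power has no R term at all:
P = C * V**2 * f
print(P)   # identical for copper and graphene wires
```

Since the capacitance term is untouched by swapping wire material, the power picture only improves indirectly (less resistive loss), which is why the gains look moderate rather than fantastic.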

Comment author: Bo102010 02 January 2010 10:29:08PM 0 points [-]

Thank you - I was wanting to write something along similar lines in response to Valkyrie Ice's comment, but wouldn't have ended up with something this compact.

I'll add that clocking is just a piece of the puzzle when it comes to making computers that compute faster.

Comment author: cabalamat 01 January 2010 06:51:24PM 0 points [-]

China is the 2nd biggest economy in 2020 (99%). Note I'm counting the EU as lots of countries, not as one big economy. Counting the EU together, China will be the 3rd biggest.

Pirate Parties will have been in government for a time in at least one country by 2020 (90%)

Pirate Parties will win >=10 seats in the European parliament in 2014 (75%), and <=30 seats (75%).

The Conservatives will win a majority in the next UK general election (60%), there will be no overall majority (37%), or any other outcome (3%).

Comment author: gwern 21 August 2010 09:56:09AM 0 points [-]

China is the 2nd biggest economy in 2020 (99%). Note I'm counting the EU as lots of countries, not as one big economy. Counting the EU together, China will be the 3rd biggest.

Now, that's unfair. You've already won that one, and any look at the numbers would've told you this was like a 99.999% prediction or something.

Comment author: FAWS 21 August 2010 10:11:44AM *  0 points [-]

No, there is a reasonable (IMO >1%) chance China could overtake the USA or EU in the next ten years.

Comment author: gwern 21 August 2010 11:07:15PM 0 points [-]

I'm a little confused what you're predicting. China is already the 2nd biggest economy, my understanding was, unless the EU is counted as a single economy. So your 99% prediction is actually 'China will not become the world's largest economy and will remain #2/#3'?

Comment author: ciphergoth 01 January 2010 08:54:17PM 2 points [-]

The Conservatives will win a majority in the next UK general election (60%), there will be no overall majority (37%), or any other outcome (3%).

Do you have bets on Intrade or Betfair for those guesses? It's probably better for you to bet directly than for me to do arbitrage on you :-) They have around 68% Conservative victory, 26% no overall majority, and around 6% Labour victory.

Betfair

Comment author: James_Miller 01 January 2010 05:34:47PM 6 points [-]

Within ten years either genetic manipulation or embryo selection will have been used on at least 10,000 babies in China to increase the babies’ expected intelligence- 75%.

Within ten years either genetic manipulation or embryo selection will have been used on at least 50% of Chinese babies to increase the babies’ expected intelligence- 15%.

Within ten years the SAT testing service will require students to take a blood test to prove they are not on cognitive enhancing drugs. – 40%

All of the major candidates for the 2016 presidential election will have had samples of their DNA taken and analyzed (perhaps without the candidates’ permission). The results of the analysis for each candidate will be widely disseminated and will influence many people's voting decisions - 70%

While president, Obama will announce support for a VAT tax - 70%.

While president, Obama will announce support for means testing Social Security - 70%

Within ten years the U.S. repudiates its debt either officially or with an inflation rate of over 100% for one year - 20%.

Within five years the Israeli economy will have been devastated because many believe there is a high probability that an atomic bomb will someday be used against Israel – 30%

Within ten years there will be another $200 billion+ Wall Street Bailout - 80%

Comment author: gwern 21 August 2010 09:47:53AM *  6 points [-]
  1. http://predictionbook.com/predictions/1689
  2. http://predictionbook.com/predictions/1690
    I think you are on crack for this one. 15%?! You seriously think there's a 15% chance that embryo selection and/or genetic manipulation for IQ will be developed, commercialized, and turned into an infrastructure capable of modifying roughly 9 million pregnancies a year? Where the hell are all the technicians and doctors going to come from, for one thing? There's a long lead time for that sort of thing.
  3. http://predictionbook.com/predictions/1691
    Ditto - America doesn't have that many phlebotomists, and would go batshit over a Collegeboard requirement like that. There would have to be an enormous national outcry over nootropics, and there's zero sign of that, and tremendous takeup of drugs like modafinil. Even a urine or spit test would encounter tremendous opposition, and the Collegeboard has no incentive for such testing. (Cost, blame for false positives, and possibly dragging down scores which would earn it even more criticism. To name just the very most obvious negatives.)
  4. http://predictionbook.com/predictions/1696
    I think you forgot the part of your prediction where all the candidates went insane and agreed to such an incredibly status-lowering procedure, gave up all privacy, and completely forgot about how past candidates got away with not releasing all sorts of germane records.
  5. http://predictionbook.com/predictions/1576 (Not sure if your wording is exactly the same as Cowen's VAT prediction, but I figure it'll do.)
  6. http://predictionbook.com/predictions/1692
    I recently read a book on old age public policy; amidst the endless details and financial minutiae, I was deeply impressed by how many ways there were to effectively means-test, even inadvertently, without obviously being means-testing or having that name. Judging could be very difficult.
  7. http://predictionbook.com/predictions/1693
    With a probability that high, shouldn't you be desperately diversifying your personal finances overseas? Either fork of your prediction means major pain for US debt, equity, or cash holders.
  8. http://predictionbook.com/predictions/1694
    The odds of an Iranian bomb aren't that terribly high, much less such an outcome happening.
  9. http://predictionbook.com/predictions/1695
    Definitions here are an issue. Some forecasts are for 2-500 billion dollars in defaults on student loans, which likely would provoke another bailout. Would that count? Does a 0% Fed rate and >0% Treasury rate constitute an ongoing bailout? etc.

All in all, this is a set of predictions that makes me think that I really should go on Intrade. I did manage to double my money at the IEM; at the time I assumed it was because I got lucky on picking McCain and Obama for the nominations, but if this is the best a random LWer can do, even aware of biases, basic data, and the basics of probability...

Comment author: James_K 02 January 2010 07:33:37AM *  3 points [-]

"While president, Obama will announce support for means testing Social Security - 70%"

I'd be willing to take those odds, with some refinements.

Comment author: James_Miller 02 January 2010 04:54:26PM 2 points [-]

How about this - I win if before he leaves office I can point to a speech Obama gave in which he advocates means testing Social Security. Otherwise you win. The speech has to be given after today, so you don't fear this is some kind of trick.

If I win I get $100 from you. If you win I give you $233. But with these odds I'm indifferent to making the bet. So for me to be willing to bet I want you to agree that if Obama makes such a speech you have to pay me right away.

Comment author: James_K 03 January 2010 01:03:20AM 2 points [-]

That works for me, with one little change. The end of his term needs to be counted as the end of a presidential election he doesn't win, rather than the inauguration of his successor. This is because the reason I don't think it's very likely is that the political effects on him would be dire, so if he does it as a lame duck president he has nothing to lose. I'm still willing to take the risk on his second term since even a second-term president is subject to some political forces.

And as a clarification, I take "means testing" to mean increasing or decreasing social security payouts based on a person's assets or income. It also has to apply to US citizens to count.

And since I'm not an American, I'd just like to confirm that the bet is in US dollars. That works for me, and I assume it works for you too.

Comment author: James_Miller 03 January 2010 05:45:49PM 4 points [-]

OK, I accept - and yes the bet should be in U.S. dollars.

Please contact me at

EconomicProf@Yahoo.com so we can exchange addresses.

Comment author: ciphergoth 01 January 2010 08:55:28PM *  6 points [-]

All of the major candidates for the 2016 presidential election will have had samples of their DNA taken and analyzed (perhaps without the candidates’ permission). The results of the analysis for each candidate will be widely disseminated and will influence many people's voting decisions - 70%

Within five years the Israeli economy will have been devastated because many believe there is a high probability that an atomic bomb will someday be used against Israel – 30%

Within ten years there will be another $200 billion+ Wall Street Bailout - 80%

I'd take the other side on any of these if we can find a way to make it precise.

Comment author: Vladimir_Golovin 01 January 2010 02:59:10PM *  12 points [-]

I'm 90% confident that the cinematic uncanny valley will be crossed in the next decade. The number applies to movies only, it doesn't apply to humanoid robots (1%) and video game characters (5%).

Edit: After posting this, I thought that my 90% estimate was underconfident, but then I remembered that we started the decade with Jar-Jar Binks and Gollum, and it took us almost ten years to reach the level of Emily and Jake Sully.

Comment author: gwern 21 August 2010 09:21:01AM 1 point [-]

How would you verify a crossing of the uncanny valley? A movie critic invoking it by name and saying a movie doesn't trigger it?

Comment author: Vladimir_Golovin 21 August 2010 11:19:16AM 3 points [-]

An ideal indicator would be a regular movie or trailer screening where the audience failed to detect a synthetic actor who (who?) played a lead role, or at least had significant screen time during the screening.

Comment author: timtyler 21 August 2010 11:34:08AM 1 point [-]

There isn't much financial incentive to CGI a human - if they are just acting like a regular human. That's what actors are for.

Comment author: gwern 21 August 2010 11:04:52PM 2 points [-]

I suppose Avatar is a case in point - it's worth CGIfying human actors because otherwise they would be totally out of place in the SF environment which is completely CGI.

Comment author: timtyler 22 August 2010 07:21:32AM 1 point [-]

''There are a number of shots of CGI humans,'' James Cameron says. ''The shots of [Stephen Lang] in an AMP suit, for instance — those are completely CG. But there's a threshold of proximity to the camera that we didn't feel comfortable going beyond. We didn't get too close.''

Comment author: xamdam 02 July 2010 05:39:10PM 1 point [-]

Interesting - it seems that they are currently further ahead with image synthesis than with voice/speech synthesis.

Comment author: MatthewB 03 January 2010 09:29:37AM 4 points [-]

You don't think that the Valley will be crossed for video games in the next ten years?

Considering how rapidly the digital technologies make it from big screen to small, I'm guessing that we can see the Uncanny Valley crossed (for Video Games) within 2 years of its closure in films (the vast majority of digital films having crossed it).

Part of the reason is that the software packages that do things like Digital Emily (mentioned below) are so easy to buy now. They no longer cost hundreds of thousands, as they did in the early days of CGI; even huge packages like AutoDesk, which used to sell for $25,000, can now be had for only $5,000. That is peanuts when compared to the cost of the people who run that software.

Comment author: Christian_Szegedy 06 January 2010 08:03:48PM 1 point [-]

I agree with you. The uncanny valley refers to rendering human actors only. It is not necessary to render a whole movie from scratch. It is much more work, but only work.

IMO, The Curious Case of Benjamin Button was the first movie that managed to cross the valley.

Comment author: Vladimir_Golovin 03 January 2010 10:00:12AM 0 points [-]

My reply is here. BTW, major CG packages like Autodesk Maya and 3DS Max were at the level of $5000 and below for over a decade.

Comment author: MatthewB 03 January 2010 12:35:25PM 0 points [-]

I've been out of circulation for a while. The last time I priced Autodesk was in the early 90s, and it was still tens of thousands. I'm just now getting caught up to basic AutoCAD, and I hope to begin learning 3DS Max and Maya in the next year or so. I am astounded at how cheap these packages are now (and at how wrong one of my best friends is/was about how quickly these types of software would be available: in 1989, he said it would be 30 to 40 years before we saw the types of graphics displays & software that were, I have discovered, pretty much common by 1995). Thanks for the heads-up, though.

Comment author: Bindbreaker 02 January 2010 10:09:38AM 2 points [-]

In a way, the uncanny valley has already been crossed-- video game characters in some games are sufficiently humanlike that I hesitate to kill them.

Comment author: Vladimir_Golovin 03 January 2010 10:04:04AM *  2 points [-]

I once watched a video of an Iraqi sniper at work, and it was disturbingly similar to what I see in realistic military video games (I don't play them myself, but I've seen a couple.)

Comment author: James_K 02 January 2010 07:28:45AM 5 points [-]

Is there a reason Avatar doesn't count as crossing the threshold already?

Comment author: Vladimir_Golovin 03 January 2010 09:13:02AM 3 points [-]

Avatar and Digital Emily are the reasons why I'm so confident. Digital actors in Avatar are very impressive, and as a (former) CG nerd I do think that Avatar has crossed the valley -- or at least found the way across it -- I just don't think that this is proof enough for general audience and critics.

Comment author: MatthewB 03 January 2010 09:33:11AM 4 points [-]

I think before the critics will be satisfied, one would have to make an entirely CGI film that wasn't Sci Fi, or fantastic in its setting or characters.

Something like a Western that had Clint Eastwood & Lee Van Cleef from their Sergio Leone glory days, alongside modern-day Western stars like Christian Bale, or... that Australian guy who was in 3:10 to Yuma. If we were to see CGI movies, such as I mentioned, with the Avatar tech (or Digital Emily), then I am sure the critics and public would sit up and take notice (and immediately launch into how it was really not CGI at all, but really a conspiracy to hide immortality technology from the greater public).

Comment author: Vladimir_Golovin 03 January 2010 09:48:05AM 0 points [-]

I think before the critics will be satisfied, one would have to make an entirely CGI film that wasn't Sci Fi, or fantastic in its setting or characters.

Exactly. I was thinking about something like an Elvis Presley biopic, but your example will do just fine (except that I don't think that vanilla westerns are commercially viable today).

Comment author: MatthewB 03 January 2010 12:32:05PM -1 points [-]

Vanilla Westerns?!? There is Nothing Vanilla about a Sergio Leone Western! And Clint Eastwood's Unforgiven was an awesome western, as were Silverado and 3:10 to Yuma (and there are even more that have made a fair killing at the box office).

Westerns are not usually thought of as blockbusters, but they do draw a big enough crowd to be profitable.

If one were to draw together Lee Van Cleef, Clint Eastwood, and Eli Wallach from their Sergio Leone days with some of the big names in action flicks today to make a period Western that starred all of these people... I think you'd have a near-blockbuster...

However, the point is really that using this technology one would be able to draw upon stage or film actors of any period or genre (where we had a decent image and voice recording) and to be able to mix actors of the past with those of today.

I just happen to have a passion for a decent Horse Opera. Pity that Firefly was such crap... a decent Horse Opera is really no different from a decent Space Opera -- something like Trigun or Cowboy Bebop.

Comment author: stevage 02 January 2010 09:46:12AM 4 points [-]

Because the giant blue Na'vi people are not human.

Comment author: timtyler 02 January 2010 11:14:07AM 8 points [-]

You mean you didn't notice the shots with the simulated humans in Avatar? ;-)

Comment author: dfranke 01 January 2010 08:43:21PM 2 points [-]

Why such a big gulf between your confidence for cinema and your confidence for video games?

Comment author: Chronos 01 January 2010 09:33:32PM 0 points [-]

The obvious answer would be "offline rendering".

Even if the non-interactivity of pre-rendered video weren't an issue, games as a category can't afford to pre-render more than the occasional cutscene here or there: a typical modern game is much longer than a typical modern movie -- usually by at least an order of magnitude, i.e. 15 to 20 hours of gameplay -- and the storyline often branches as well. In terms of dollars grossed per hour rendered, games simply can't afford to keep up. Hence the rise of real-time hardware 3D rendering in both PC and console gaming.
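To put rough numbers on that budget gap, here is a back-of-the-envelope sketch (all figures are illustrative assumptions, not measurements from any real title):

```python
# A game must finish each frame inside a fixed real-time budget,
# while offline film rendering is often quoted in hours per frame.
FPS = 30                          # assumed console target frame rate
frame_budget_ms = 1000 / FPS      # milliseconds a game has per frame
print(f"real-time budget: {frame_budget_ms:.1f} ms per frame")

# Hypothetical offline pre-render of 18 hours of gameplay at 24 fps,
# at an assumed 1 machine-hour per frame:
frames = 18 * 3600 * 24
render_hours = frames * 1.0
print(f"offline pre-render: {frames:,} frames, ~{render_hours:,.0f} machine-hours")
# real-time budget: 33.3 ms per frame
# offline pre-render: 1,555,200 frames, ~1,555,200 machine-hours
```

A frame budget of ~33 ms versus hours per frame is a gap of five orders of magnitude, which is why games pre-render only short cutscenes, if anything.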

Comment author: mattnewport 06 January 2010 07:24:40AM 4 points [-]

Rendering is not the problem. I would say that the uncanny valley has already been crossed for static images rendered in real time by current 3D hardware (this NVIDIA demo from 2007 gets pretty close). The challenge for video games to cross the uncanny valley is now mostly in the realm of animation. Video game cutscenes rendered in real time will probably cross the uncanny valley with pre-canned animations in the next console generation, but doing so for procedural animations is very much an unsolved problem.

(I'm a graphics programmer in the video games industry, so I'm fairly familiar with the current state of the art.)

Comment author: Chronos 11 January 2010 04:53:30AM 1 point [-]

I wasn't even considering the possibility of static images in video games, because static images generally aren't considered to count in modern games. The world doesn't want another Myst, and I can only imagine one other kind of game where photorealistic, non-uncanny static images would constitute the bulk of the gameplay: some sort of dialog-tree / disguised puzzle game where one or more still characters' faces change in reaction to your dialog choices (i.e. something along the lines of a Japanese-style dating sim).

Comment author: mattnewport 11 January 2010 08:34:59AM 1 point [-]

By 'static images rendered in real time' I meant static images (characters not animated) rendered in real time (all 3D rendering occurring at 30+ fps). Myst consisted of pre-rendered images, which is quite different.

It is possible, on current consumer-level 3D hardware, to render 3D images of humans in real time that move beyond the uncanny valley when viewed as a static screenshot (from a real-time rendered sequence) or as a Matrix-style static scene / dynamic camera bullet-time effect. The uncanny valley has not yet been bridged for procedurally animated humans. The problem is no longer in the rendering but in the procedural animation of human motion.

Comment author: Vladimir_Golovin 01 January 2010 09:01:15PM *  7 points [-]

Movies are 'pre-computed', so you can use a real human actor as a data source for animations, plus you have enough editing time to spot and iron out any glitches. In a video game, facial animations are generated on the fly, so you would need a model that perfectly captures human facial behavior. I don't think it can be realistically imitated by blending between pre-recorded animations, as is done today with mo-cap animations -- e.g. you can't pre-record eye movement for a game character.
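As a toy illustration of the kind of blending described here (joint names and angles are hypothetical; real engines blend full skeletal poses, typically interpolating rotations with quaternions rather than flat angle dicts):

```python
def lerp(a, b, t):
    """Linear interpolation between two values by weight t in [0, 1]."""
    return a + (b - a) * t

def blend_poses(pose_a, pose_b, weight):
    """Blend two pre-recorded poses (dicts of joint -> angle in radians),
    roughly the way mo-cap clips are mixed at runtime."""
    return {joint: lerp(pose_a[joint], pose_b[joint], weight)
            for joint in pose_a}

# Blend halfway between an 'idle' and a 'smile' facial pose.
idle  = {"jaw": 0.00, "brow": 0.00, "mouth_corner": 0.00}
smile = {"jaw": 0.05, "brow": 0.10, "mouth_corner": 0.40}
halfway = blend_poses(idle, smile, 0.5)
print(halfway)  # {'jaw': 0.025, 'brow': 0.05, 'mouth_corner': 0.2}
```

The limitation Vladimir points at is visible even in this sketch: blending can only ever produce mixtures of what was recorded, so behavior that depends on context (like where the eyes should look) can't come out of it.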

As for the robots, they are also real-time, AND they would need muscle / eye / face movement implemented physically (as a machine, not just software), hence the lower confidence level.