It's time to look back at what was predicted a year ago and see how successful those predictions were.

But even more, it's time for fresh predictions for the coming year, 2014.


Nicholas Wade will get fired from the New York Times because of his book A Troublesome Inheritance: Genes, Race, and Human History. (Probability 30%)

From that page:

Correction: Wade tells blogger Luke Ford that he retired from the Times 2 years ago, but still contributes articles to the paper. Neither Wade nor the Times returned earlier requests for comment on the matter.

Luke Ford source http://www.lukeford.net/blog/?p=54601 :

[Wade:] “I retired from the Times about two years ago. There’s a blogosphere story today that the Times didn’t like the book and fired me, but the writer invented the whole thing based on his having seen the words ‘former Science editor’ in the piece I did in Time.”

...Luke: “You will be writing future articles for the New York Times?”

Nicholas: “I assume so. I write for them quite regularly on a contract basis but I am not on their staff any longer.”

I propose the prediction be amended to, "The New York Times will never accept articles from Nicholas Wade again."

I gave it 40% after reading the advance reviews. That was even faster than I had expected.

Apparently the Daily Caller article is mistaken: according to Charles Murray here, Wade took a retirement package two years ago and is now a science writer rather than science editor. So the jury is still out on this one; we will wait and see whether he is fired from being a science writer.

Wade stopped being science editor in 1997 (so that he could write articles), so it's pretty weird that the Time byline was "former science editor," regardless of how Wade's position changed in 2012 or last week.

I agree with this prediction.

[-]satt30

The mean IQ reported in the latest LW survey will be 137 or less (probability 75%).

Thought I'd squeeze that in before Yvain posts the results. Some background. In the 2012 survey LWers reported having a mean IQ of 138.7. I found (and find) that unlikely, and reckon that average was inflated through a mixture of selective reporting, mis-remembering, perhaps the occasional lie, and people taking old or otherwise iffy tests. However, the 2013 survey should suffer much less from that last problem because (if I remember rightly) it asks only about formally assessed IQs. So I expect the 2013 IQ average to be less (because less inflated) than the 2012 IQ average.

I'm not super confident about this, because there's some countervailing evidence (using self-reported SAT & ACT scores from the 2012 survey as IQ proxies suggests higher IQ estimates, not lower estimates), and because a selection effect could actually worsen the inflation in the 2013 survey (people with higher IQs might be more likely to have their IQs formally tested; if so, asking only for properly measured IQs will selectively pick out high-IQ people). But my hunch is that these issues won't matter so much.
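(To make the selective-reporting worry concrete, here's a minimal Monte Carlo sketch; the population mean of 125, the SD of 15, and the logistic reporting curve are all made-up assumptions chosen for illustration, not estimates of anything about LW.)

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical survey-taker population: true mean IQ 125, SD 15 (assumed).
true_iq = rng.normal(125, 15, size=100_000)

# Selective reporting: the higher your score, the likelier you are to
# report it (a logistic curve centered at 125, chosen arbitrarily).
p_report = 1 / (1 + np.exp(-(true_iq - 125) / 10))
reported = true_iq[rng.random(true_iq.size) < p_report]

print(f"true mean:     {true_iq.mean():.1f}")   # ~125
print(f"reported mean: {reported.mean():.1f}")  # ~133: inflated by selection alone
```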

[-]satt20

Prediction failed.

(As well as the headline result that the average turned out to be 138.2, see also section V.B, and Vaniver's comment. Also, it looks like I may have been wrong to take Yvain's SAT-and-ACT-to-IQ conversions from last year at face value in the parent comment.)

When will we get to see the results of the LW survey?

[-]satt00

No idea (Yvain hasn't said yet), but presumably some time this year!

But I can try to guess. In 2012, Yvain apparently closed the survey on November 26 and reported the results on December 7. In 2011, the survey closed on December 3 and Yvain reported results on December 5. And in 2009, Yvain announced the survey on May 3 and posted its results on May 12. So for those surveys, the gap between the survey closing and Yvain posting results has been 11 days, 2 days, and ≈9 days, and I'd guess that 2-11 days after Yvain closes last year's survey (which he hasn't yet done), he'll post its results.
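(For what it's worth, those gaps are easy to check mechanically with the standard library; the 2009 entry uses the announcement date, since the close date isn't stated.)

```python
from datetime import date

# (close/announce date, results date) for each past survey, from the dates above
surveys = {
    2012: (date(2012, 11, 26), date(2012, 12, 7)),
    2011: (date(2011, 12, 3), date(2011, 12, 5)),
    2009: (date(2009, 5, 3), date(2009, 5, 12)),  # announcement date, not close
}

for year, (start, posted) in surveys.items():
    print(year, (posted - start).days, "days")  # 11, 2, 9
```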

(Maybe I should make that a formal prediction? "2-11 days after Yvain closes the current LW survey, he'll post the results (80%).")

A 138 average doesn't seem far-fetched at all to me. A little bit of self-serving bias is inevitable, but I highly doubt the real average is e.g. in the 120s. This random website I found says that the average IQ of an Ivy League student is 142. I go to a school that isn't as good as most Ivies but is better than some of them. I would guess the average IQ of a student here is 135-ish. The average LW poster seems much, much smarter to me than the average person at my school.

[-]satt00

Yeah, I've moved a bit towards your sort of position because of the 2013/14 results. That said, I don't have an impression of LWers being way, way smarter than other students I encounter in real life.

(I'm also still leery of the poor correlation between education level and IQ, which cropped up again in the latest survey. To go into tedious detail: among those aged ≥29, 25 people with a high school education or less gave a mean IQ of 139.5, and 155 people with more education gave a mean IQ of 140.2. And Nornagest's suggestion to look at the high end now gives a less statistically significant result than last year. The 24 oldsters with PhDs gave a mean IQ of 142.4, and the other 156 non-PhDs gave a mean IQ of 139.8.)
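(For anyone who wants to check the PhD comparison: with only group means and sizes you still need a standard deviation to get a p-value. A sketch using scipy, where the SD of 15 is a placeholder assumption, not a figure from the survey:)

```python
from scipy.stats import ttest_ind_from_stats

# Means and group sizes are from the survey breakdown above; the standard
# deviations were not reported there, so 15 is a placeholder assumption.
t, p = ttest_ind_from_stats(mean1=142.4, std1=15, nobs1=24,    # PhDs
                            mean2=139.8, std2=15, nobs2=156,   # non-PhDs
                            equal_var=False)
print(f"t = {t:.2f}, p = {p:.2f}")  # a 2.6-point gap with n=24 is far from significant
```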

[-][anonymous]30

Bitcoin will surpass $5,000 (for example, due to the opening of one or more easily accessible exchange-traded funds).

Release of the consumer version of the Oculus Rift (if it happens in 2014) will bring back the '90s optimism for augmented and virtual reality. Someone will release a "metaverse" client for a peer-to-peer, decentralized, distributed virtual world.

ETA: Clarified "due to" construction of prediction.

Probability estimates?

[-]gjm220

I predict that there will be disappointingly few of these.

And those that there are will be pretty inaccurate (100%).

[-][anonymous]30

I predict there's about 50% chance the price of bitcoin is going to rise and about 50% chance it's going to fall. Anyone wanna bet me on this?

[-][anonymous]30

Better than even odds or I wouldn't have said it. Any more precision than that would be intellectual dishonesty.

How much Bitcoin do you own?

[-][anonymous]80

A few dozen, although I'm burning through it quickly. I make my living paid in bitcoin (I'm a Bitcoin Core developer living off donations), so that isn't so much an investment as my current runway. Since I don't think any ETF will come online until summer at the earliest, and my runway isn't that long, I doubt that I'll make much money off the rise. Assuming I keep getting donations, it'll be just a few months' float at the most.

What about Bitcoin surpassing 5k for any reason?

[-]gjm160

Much less likely, obviously...

[-][anonymous]10

I think you have that reversed. How can the statement "Bitcoin will surpass $5,000" be less probable than the statement "Bitcoin will surpass $5,000, due to X"?

I think you have that reversed.

I strongly suspect it's a joke about the conjunction fallacy.

The probabilities that Bitcoin passes $5k, that there's a Bitcoin ETF, and that both happen are all interesting; but the claim that the conjunction will happen (and specifically that the Bitcoin ETF will be the primary cause of the price increase) seems like a conjunction fallacy prompt: the additional detail is there to make it seem more credible rather than less. (Otherwise, you should use a conditional: if a Bitcoin ETF exists, then the price will likely top $5k.)
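The underlying inequality, for any events $A$ ("Bitcoin tops $5,000") and $B$ ("an ETF opens"), is

$$P(A \wedge B) = P(A)\,P(B \mid A) \le P(A),$$

so the conjoined prediction can never be more probable than the bare one, however much the added detail makes it feel more plausible.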

[-]gjm40

a joke about the conjunction fallacy.

Yep, exactly. (More precisely, bramflakes was making a point about the conjunction fallacy and I was reinforcing it with a bit of irony. Trying to, anyway.)

[-][anonymous]00

I'm not sure the conjunction fallacy should be of operational importance to the giving of predictions, unless there is betting involved. What should I do instead, simply state "Bitcoin will rise to $5,000."? That's completely uninteresting, and I fail to see how anyone could judge that prediction based on anything other than my credentials and their prejudices. Saying "Bitcoin will rise to $5,000 due to X." gives people a window into my thinking and lets them judge for themselves whether my assessment is likely.

You could say “Bitcoin will rise to $5,000 (for example, due to X).”

[-]gjm20

There's nothing at all wrong with making specific predictions. But it should be done with care, in view of our brains' tendency to infer higher rather than lower probability when they see something more specific.

Also, it's nearly impossible to figure out what caused a price movement, even after the fact.

[-][anonymous]10

That's true of index funds, not of individual assets, and demonstrably false in the case of a highly illiquid asset like bitcoin.

The problem with all this 3d headwear seems, to me at least, to be that it doesn't really offer any substantial improvement over a monitor and mouse. Our brains don't need stereoscopic displays to perceive a 3d world. Our brains are very, very good at building up a 3d representation of a world from just a 2d image (something that comes in very handy in the real world when one of your eyes is closed or non-functioning). And moving around a 3d world with your hand seems to be about the same level of difficulty as moving around with your neck, if not easier. And the main disadvantage of headwear is discomfort.

I'd give the Oculus Rift a 50% chance of success.

At least augmented reality headwear (such as Google Glass or the very amazing Meta Spaceglasses: https://www.spaceglasses.com/ ) has something to offer that can't be had from a traditional monitor. However, it still remains to be seen how much people will actually desire those things. I can definitely imagine the Spaceglasses being widely used in creative professions.

EDIT: Changed 'fatigue' to 'discomfort'.

Have you tried an Oculus Rift? I did, and I had the same "this is awesome!" reaction most people seem to report. Having more 3d space show up when you turn your head around is a big deal, as is having the 3d world take over your entire field of view.

There might be fatigue problems that show up with long-term use that we haven't seen yet, and strange cultural reactions to the way the headset user becomes totally isolated from the surrounding real world, but the initial reaction where almost everyone who tries it on thinks it's awesome predicts at least a fad success.

Initial reactions do not seem to be a good predictor of success. After the initial novelty wears off, users do in fact report problems such as low resolution and discomfort (dizziness, headaches, vertigo, and nausea). See for instance this review and many others that can be found with a simple Google search.

3d television users also initially reported "this is awesome!" reactions (for similar reasons), but it does not seem to have caught on (also for similar reasons: poor resolution and discomfort).

As mentioned in that review, it also depends on how the technology is used. Game developers have to take steps to reduce discomfort and to use the technology in novel ways. If that is done, then I agree that the chance of success becomes much larger.

[-][anonymous]40

After the initial novelty wears off, users do in fact report problems such as low resolution and discomfort (dizziness, headaches, vertigo, and nausea).

Most of which is due to limitations in the devkit model (lack of degrees of freedom in head tracking and low resolution), all of which are being fixed in the consumer model. Reviews of the consumer model prototypes tested at conventions / press events have reported these symptoms are gone.

When I made my prediction I called out a Snow Crash-like metaverse as the killer app. More generally, I think we will be seeing applications of head-mounted VR that are surprising, novel, and ultimately far more interesting than gaming. The Oculus Rift will be, I think, a transformative technology in general, even if it ends up controversial or marginalized in gaming.

Reviews of the consumer model prototypes tested at conventions / press events have reported these symptoms are gone.

In that case, the chances of success look much better.

More generally, I think we will be seeing applications of head-mounted VR that are surprising, novel, and ultimately far more interesting than gaming.

Can you give some examples?

[-][anonymous]30

Besides the metaverse I've already mentioned, here's another one:

Through my work I've been fortunate enough to be able to use CAVE environments developed at UC Davis and UC San Diego in the analysis of planetary data. Search for "3d CAVE" on YouTube and you should find plenty of videos showing what this experience is like.

The effect of being able to immersively interact with this data is incredible. The classic example I gave visitors was some of the first published data to come out of the UC Davis computer science / geology visualization partnership: a buckling of subduction zones that was previously unknown despite sufficient data having been available for at least a century. They loaded earthquake data overlaid on a globe basically as a test of the system, and almost immediately discovered the subduction buckling from straight visual inspection.

Analyzing geometric data directly in an immersive 3d environment is so much more productive than traditional techniques, because it exploits the natural machinery built into us for aggregating sensory data and extracting its details. It already sees use in many areas: I once sat next to someone on a plane whose job it was to install these things in oil exploration ships, where the energy companies would use them to quickly analyze the terabytes of data coming in from the sea bed.

I expect that in nearly all fields of engineering, physical science, and biology there are great efficiencies to be gained by utilizing the immersive CAVE experience. But a traditional CAVE will cost you half a million dollars, putting it way outside the reach of most organizations. An Oculus Rift + Kinect + decent graphics card, on the other hand, sets you back less than a thousand dollars.

(BTW, experience in immersive CAVE environments is that, with suitable precision and capability in the technology, motion-sickness-like symptoms disappear for all but a few percent of the population.)

I actually agree with you here. As I mentioned in my first reply, I can easily imagine virtual/augmented reality headsets being used for creative professions, and I can also easily imagine them being used for science/engineering and so on. It's just hard for me to imagine them being widely used in gaming, at least in their current form. Maybe future, more advanced iterations of the technology would have better chances.

[-]wwa00

How about an architect walking his clients through their soon-to-be house?

What makes the Oculus Rift special in that regard? There have been numerous head-mounted VR solutions that have been able to do that for many years. Yet they have not seen any serious use for such purposes.

Have you tried it?

The Rift is different in that it provides a full-hemisphere viewing angle. There is no 'tunnel vision', and you get full peripheral vision. Peripheral vision is important to the human visual system for motion sensation and situational awareness.

It's immediately different as soon as you turn your head; there is a definite wow factor over a monitor.

The tradeoff of course is the terrible resolution, but it's interesting in showing the potential of at least solving most of the other immersion problems.

[-][anonymous]00

The tradeoff of course is the terrible resolution, ...

Solved in the consumer version, which is still being worked on (at least 1080p in each eye).

1080p in each eye is hardly enough to 'solve' the resolution problem. There is a fundamental tradeoff between FOV and effective resolution, which is one reason why other manufacturers haven't attempted full human FOV. For a linear display it's something like 8k x 4k per eye for a full-FOV HMD to have HDTV-equivalent resolution.
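(The 8k x 4k figure is roughly a pixels-per-degree calculation; a back-of-the-envelope sketch, where the 40-degree HDTV viewing angle and the per-eye FOV targets are ballpark assumptions rather than measured values:)

```python
# HDTV reference: 1920 horizontal pixels spanning ~40 degrees of view
# (a commonly recommended viewing angle), i.e. ~48 px/degree.
px_per_deg = 1920 / 40

# Rough per-eye FOV targets for a "full FOV" HMD (ballpark assumptions).
h_fov, v_fov = 170, 90
print(f"~{h_fov * px_per_deg:.0f} x {v_fov * px_per_deg:.0f} px per eye")
# ~8160 x 4320, i.e. roughly the "8k x 4k" cited above
```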

The big advantage over a monitor is immersion. When I tried out an oculus rift I felt like I was inside the virtual space in a way that I've never felt while playing FPSes on a monitor. That's not a small thing.

Another advantage is that it increases how many input axes you have. Think of games where you're flying a spaceship or driving a car and you can freely look in all directions while controlling your vehicle with both hands. That's impossible on a standard monitor.

It's not impossible. Games frequently allow you to use the arrow keys to move around while using the mouse to change the view direction (or vice versa).

I know that; I've played FPSes with that control layout for thousands of hours. I said "while controlling your vehicle with both hands" which means, for example, with a steering wheel, a throttle+joystick, or a keyboard+mouse with the mouse controlling something besides camera angle.

[-]JTHM00

The present value of a commodity reflects the market's best estimate as to the future value of that commodity. You are not smarter than the market; practically nobody is. If the market value of Bitcoin is X, then something not far from X is the best estimate of Bitcoin's near-future value. (The very best guess isn't exactly X because of cost of liquidity and time preferences.)
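(In textbook form, with $r$ a per-period discount rate that absorbs liquidity costs, risk premia, and time preference; a stylized sketch of the efficient-markets claim, not a statement about Bitcoin specifically:

$$P_{\text{now}} \approx \frac{E[P_{\text{future}}]}{1 + r},$$

so absent private information, today's price already embeds the market's forecast of tomorrow's.)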

[-][anonymous]00

People have access to different sets of information (particularly in unregulated markets), come to the table with different priors, and have very different time preferences. Individual investors' estimates are therefore all over the map and often time-dependent, which is why there is any trade at all (if everyone felt the same way and never changed their minds, the market would quickly stabilize and volume would drop to zero). For these reasons it's fallacious to try to aggregate and extrapolate estimates of future value from current market prices.

[-][anonymous]20

.

[This comment is no longer endorsed by its author]

What do you mean by human-level?

[-][anonymous]00

.

[This comment is no longer endorsed by its author]

5% chance of human level AI this year seems extremely high to me. What are you basing that on?

[-][anonymous]00

.

[This comment is no longer endorsed by its author]

Marijuana is legalized/decriminalized in a New England state or another west coast state in the USA (35%).

If you are distinguishing between legalization and decriminalization, hasn't it already been decriminalized in all the states you mention?

Will it still be illegal under federal law? States don't have the power to remove all criminal penalties for marijuana use.

1. Florida's new law on anatomical donations will prevent a cryonicist who dies in that state from going into cryo.

Reference:

http://groups.yahoo.com/neo/groups/New_Cryonet/conversations/messages/5925

  1. An authority figure in the cryonics movement will finally acknowledge that Drexler's "nanotechnology" simply can never exist because it gets the physics wrong, and will recommend that cryonics organizations stop invoking the idea as the revival mechanism. Continuing to rely on "nanotechnology" propaganda leaves cryonics organizations open to accusations of knowingly practicing fraud based on pseudo- and cargo cult science.

  2. We'll see a divergence in public discussions about "the future," where one group continues to promote accelerationist claims, while another asks why "the future" the former keeps promising us hasn't arrived yet.

And I keep asking that myself. My father took me to see Stanley Kubrick's famous film at a theater in Tulsa back in 1968 (quite an adventure for a geeky 8-year-old Okie boy!), and I can remember thinking that the year 2001 seemed like a wondrous, far-off future time. Living thirteen years after that date in reality, I feel a bit cheated.

[-][anonymous]120

An authority figure in the cryonics movement will finally acknowledge that Drexler's "nanotechnology" simply can never exist because it gets the physics wrong.

Please provide an explanation or citation. In the meantime, here is a growing list of peer-reviewed publications of ab initio quantum simulations of Drexler-esque diamondoid mechanosynthesis:

http://www.molecularassembler.com/Nanofactory/Publications.htm

[-][anonymous]60

An authority figure in the cryonics movement will finally acknowledge that Drexler's "nanotechnology" simply can never exist because it gets the physics wrong, and he recommends that cryonics organizations stop invoking the idea as the revival mechanism.

Hasn't Mike Darwin been doing this for years? (And for good reason?)

We'll see a divergence in public discussions about "the future," where one group continues to promote accelerationist claims, while another asks why "the future" the former keeps promising us hasn't arrived yet.

Happened in my circles years ago. With the vast majority of my circle (and me) falling into the latter camp for most of the stereotypical issues/themes.