Consider the following commonly-made argument: cryonics is unlikely to work. Trained rationalists are signed up for cryonics at rates much greater than the general population. Therefore, rationalists must be pretty gullible people, and their claims to be good at evaluating evidence must be exaggerations at best.
This argument is wrong, and we can prove it using data from the last two Less Wrong surveys.
The question at hand is whether rationalist training - represented here by extensive familiarity with Less Wrong material - makes people more likely to believe in cryonics.
We investigate with a cross-sectional study, looking at proto-rationalists versus experienced rationalists. Define proto-rationalists as those respondents to the Less Wrong survey who indicate they have been in the community for less than six months and have zero karma (usually indicative of never having posted a comment). And define experienced rationalists as those respondents to the Less Wrong survey who indicate they have been in the community for over two years and have >1000 karma (usually indicative of having written many well-received posts).
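A minimal sketch of how these two groups might be carved out of the survey data with pandas - the file name and column names here (survey.csv, TimeInCommunityMonths, Karma, etc.) are placeholders for illustration, not the actual survey export:

```python
import pandas as pd

# Hypothetical file and column names - the real survey export may differ.
df = pd.read_csv("survey.csv")

# Proto-rationalists: under six months in the community, zero karma.
proto = df[(df["TimeInCommunityMonths"] < 6) & (df["Karma"] == 0)]

# Experienced rationalists: over two years in the community, >1000 karma.
experienced = df[(df["TimeInCommunityMonths"] > 24) & (df["Karma"] > 1000)]

print(len(proto), len(experienced))  # 93 and 134 under these definitions
```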
By these definitions, there are 93 proto-rationalists, who have been in the community an average of 1.3 months, and 134 experienced rationalists, who have been in the community an average of 4.5 years. Proto-rationalists generally have not read any rationality training material - only 20/93 had read even one-quarter of the Less Wrong Sequences. Experienced rationalists are, well, more experienced: two-thirds of them have read pretty much all the Sequence material.
Proto-rationalists thought that, on average, there was a 21% chance of an average cryonically frozen person being revived in the future. Experienced rationalists thought that, on average, there was a 15% chance of same. The difference was marginally significant (p < 0.1).
Marginal significance is a copout, but this isn't our only data source. Last year, using the same definitions, proto-rationalists assigned a 15% probability to cryonics working, and experienced rationalists assigned a 12% chance. We see the same pattern.
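The post doesn't specify which test produced that p-value; purely as a sketch of how such a comparison could be run, here is one nonparametric two-sample test on the two groups from the earlier snippet, with PCryonics a made-up column name for the probability estimates:

```python
from scipy.stats import mannwhitneyu

# Hypothetical column holding each respondent's P(cryonics works) estimate.
proto_p = proto["PCryonics"].dropna()
experienced_p = experienced["PCryonics"].dropna()

# Two-sided Mann-Whitney U test: do the two groups' estimates differ?
stat, p_value = mannwhitneyu(proto_p, experienced_p, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p_value:.3f}")
```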
So experienced rationalists are consistently less likely to believe in cryonics than proto-rationalists, and rationalist training probably makes you less likely to believe cryonics will work.
On the other hand, 0% of proto-rationalists had signed up for cryonics compared to 13% of experienced rationalists. 48% of proto-rationalists rejected the idea of signing up for cryonics entirely, compared to only 25% of experienced rationalists. So although rationalists are less likely to believe cryonics will work, they are much more likely to sign up for it. Last year's survey shows the same pattern.
This is not necessarily surprising. It only indicates that experienced rationalists and proto-rationalists treat their beliefs in different ways. Proto-rationalists form a belief, play with it in their heads, and then do whatever they were going to do anyway - usually some variant on what everyone else does. Experienced rationalists form a belief, examine the consequences, and then act strategically to get what they want.
Imagine a lottery run by an incompetent official who accidentally sets it up so that the average payoff is far more than the average ticket price. For example, maybe the lottery sells only ten $1 tickets, but the jackpot is $1 million, so that each $1 ticket gives you a 10% chance of winning $1 million.
Goofus hears about the lottery and realizes that his expected gain from playing the lottery is $99,999. "Huh," he says, "the numbers say I could actually win money by playing this lottery. What an interesting mathematical curiosity!" Then he goes off and does something else, since everyone knows playing the lottery is what stupid people do.
Gallant hears about the lottery, performs the same calculation, and buys up all ten tickets.
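For the record, Goofus's (and Gallant's) arithmetic, as a back-of-the-envelope check only:

```python
jackpot = 1_000_000   # the prize if your ticket wins
tickets_sold = 10     # only ten tickets exist
ticket_price = 1      # each ticket costs $1

p_win = 1 / tickets_sold                        # 10% chance per ticket
expected_gain = p_win * jackpot - ticket_price  # expected profit per ticket
print(expected_gain)                            # 99999.0
```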
The relevant difference between Goofus and Gallant is not skill at estimating the chances of winning the lottery. We can even change the problem so that Gallant is more aware of the unlikelihood of winning than Goofus - perhaps Goofus mistakenly believes there are only five tickets, and so Gallant's superior knowledge tells him that winning the lottery is even more unlikely than Goofus thinks. Gallant will still play, and Goofus will still pass.
The relevant difference is that Gallant knows how to take ideas seriously.
Taking ideas seriously isn't always smart. If you're the sort of person who falls for proofs that 1 = 2, then refusing to take ideas seriously is a good way to avoid ending up actually believing that 1 = 2, and a generally excellent life choice.
On the other hand, progress depends on someone somewhere taking a new idea seriously, so it's nice to have people who can do that too. Helping people learn this skill and when to apply it is one goal of the rationalist movement.
In this case it seems to have been successful. Proto-rationalists think there is a 21% chance of a new technology making them immortal - surely an outcome as desirable as any lottery jackpot - consider it an interesting curiosity, and go do something else because only weirdos sign up for cryonics.
Experienced rationalists think there is a lower chance of cryonics working, but some of them decide that even a pretty low chance of immortality sounds pretty good, and act strategically on this belief.
This is not to either attack or defend the policy of assigning a non-negligible probability to cryonics working. This is meant to show only that the difference in cryonics status between proto-rationalists and experienced rationalists is based on meta-level cognitive skills in the latter group, skills whose desirability is orthogonal to the object-level question about cryonics.
(an earlier version of this article was posted on my blog last year; I have moved it here now that I have replicated the results with a second survey)
I don't like this appropriation of the term "rational" (even with the "-ist" suffix), and in fact I find it somewhat offensive.
[ Warning: Trolling ahead ]
But since words are arbitrary placeholders, let's play a little game and replace the word "rationalist" with another randomly generated string, such as "cultist" (which you might possibly find offensive, but remember, it's just a placeholder).
So what does your data say?
Proto-cultists gave a higher average probability of cryonics success than committed cultists.
But this isn't necessarily very informative, because averaging probabilities from different estimators doesn't really tell us much. (Consider scenario A, where half of the respondents say p = 1 and half say p = 0, and scenario B, where all the respondents say p = 0.5. The arithmetic mean is the same in both, but the scenarios are completely different.) The harmonic mean can be a better way of averaging probabilities.
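As a quick numerical illustration of this point (using 0.01 and 0.99 rather than exactly 0 and 1 so the harmonic mean stays defined - these numbers are made up, not taken from the survey):

```python
from statistics import mean, harmonic_mean

scenario_a = [0.99] * 50 + [0.01] * 50  # half say "almost certainly yes", half "almost certainly no"
scenario_b = [0.5] * 100                # everyone is genuinely uncertain

print(mean(scenario_a), mean(scenario_b))                    # both 0.5
print(harmonic_mean(scenario_a), harmonic_mean(scenario_b))  # ~0.0198 vs 0.5
```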
But anyway, let's assume that the distribution of the responses is well-behaved enough that a randomly sampled proto-cultist is more likely to assign a higher probability of cryonics success than a randomly sampled committed cultist (you can test this hypothesis on the data).
On the other hand, proto-cultists are much less likely to be signed up for cryonics than committed cultists (in fact, none of the proto-cultists are signed up).
What is the correlation between belief in cryonics success and being signed up for cryonics? I don't know, since it is reported neither here nor in the survey results post (maybe it was computed but found to be non-significant, since IIUC there was a significance cutoff for correlations in the survey results post).
Do committed cultists who sign up for cryonics do it because they assign a high probability to its success, or despite assigning it a low probability? I have no way of knowing.
Or, actually, I could look at the data, but I won't: you wrote the post trying to make a point from the data, so the burden of providing a meaningful statistical analysis was on you.
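(For what it's worth, the statistic in question would be a one-liner on the hypothetical columns from the sketches above - e.g. a point-biserial correlation between each respondent's probability estimate and a 0/1 signed-up flag, with both column names still made up:)

```python
from scipy.stats import pointbiserialr

# Hypothetical columns: PCryonics (probability estimate) and SignedUp (0/1 flag).
subset = experienced[["PCryonics", "SignedUp"]].dropna()
r, p = pointbiserialr(subset["SignedUp"], subset["PCryonics"])
print(f"r = {r:.2f}, p = {p:.3f}")
```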
Let's try to interpret this finding:
Cryonics is a weird belief. Proto-cultists haven't spent much time researching it or thinking about it; given their typical background (mostly computer science students), they find it somewhat plausible, but they don't really trust their estimate very much.
Committed cultists, on the other hand, have more polarized beliefs. Being in the cult might have actually stripped them of their instrumental rationality (or selected for irrational people), so they decide against their explicit beliefs. Or they respond to social pressures, since cryonics is high status in the cult and has been explicitly endorsed by one of the cult elders, the author of the Sacred Scrip-...Sequences. Or both.
Oops.
[ End of trolling ]
The bottom line of my deliberately uncharitable post is:
Don't use words lightly. Words aren't really just syntactic labels; they convey implicit meaning. Using words with an implicit positive meaning ("experienced rationalist") to refer to the core members of a community naturally suggests a charitable interpretation (that they are smart). Using words with an implicit negative meaning ("committed cultist") suggests an uncharitable interpretation (that they are brainwashed, groupthinking, and too preoccupied with costly status signalling that has no value outside the group).
If you are trying to make a point from data, provide relevant statistics.
Yeah. Suppose we were talking about a new-age-ish cult whose founder has arranged to be flown to Tibet for a sky burial when he dies. They could very well have the exact same statistics on their online forum.