To everyone who just read this and is about to argue with the specific details of the bullet points or the mock argument:
Don't bother, they're (hopefully) not really the point of this.
Focus on the conclusion and the point that LW beliefs have a large inferential distance. The summary of this post which is interesting to talk about is "some (maybe most) LW beliefs will appear to be crackpot beliefs to the general public" and "you can't actually explain them in a short conversation in person because the inferential distance is too large". Therefore, we should be very careful to not get into situations where we might need to explain things in short conversations in person.
Therefore, we should be very careful to not get into situations where we might need to explain things in short conversations in person.
Should I start staying indoors more?
The way to have these conversations is to try to keep them as narrow as possible. You're not trying to explain your worldview, you're just trying to take the other person one step forward in inferential distance. There should be one point that you're trying to make that you want the other person to take away from the conversation, and you should try to make that point as clearly and simply as possible, in a way that will be understandable to the other person. Maybe you can give them a glimpse that there's more to your thinking than just this one point, but only if it doesn't distract from that point.
Bob doesn't do this. He feels that he needs to explain the nature of evidence, he uses an example which is controversial to Rian (and thus is a distraction from the point that Bob is trying to establish with the example), and he responds to every issue that Rian brings up instead of trying to bring the conversation back to the original point. Bob's problem is not that he has particularly unusual or crazy beliefs; it's that he has various views that are different from Rian's, and he lets the conversation bounce from one to another without ever sticking with one point of disagreement long enough to get a clear explanation of his views across.
you are likely to agree with the majority of these statements
If you hadn't amplified the oddness of the beliefs on the list, this would be true. The trouble is, the way you amplified oddness is mostly by changing the substance of what was communicated, not just the style. For example, you use over-general words so that people will hear one connotation when you might have been trying to convey another. And so, why should we agree with statements that say the wrong thing?
Starting out with an incorrect guess about the reader is really bad for the rest of the post. You should start with your message instead, maybe even use personal experience - "I've had conversations where I brought up beliefs spread on LW, and people thought I was a crackpot."
But I also disagree with the thesis that the solution is to try to "hide the crazy." Bayesian Bob doesn't break things down into small enough parts and tries to use too many "impressive" statements that are actually harmful to communication. So a first action might be to stop digging himself deeper, but ultimately I think Bob should try to get better at explaining.
If you fail to conceal your low-status beliefs you'll be punished for it socially.
This shows a lack of understanding of signaling theory.
A poor kid wears middle class clothes so that people will think they're middle class and not poor. A middle class person wears rich clothes so that people will think they're rich and not middle class. A rich person wears whatever they want, because middle class people are already wearing 'rich' clothes and nobody's going to confuse them for being poor while they're matching ripped jeans with Rolex watches. If you and your beliefs are already low status, then having 'crackpot' beliefs will push your status lower. If you are already high status, then eccentric beliefs will increase your status. At the highest levels of status, people will automatically and unconsciously update their beliefs toward yours.
Your story sounds like Rian is much higher status than Bob. Rian's got kung-fu-master-level rationality skills versus Bob's low-level Bayesian judo. Rian also sounds more articulate and intelligent than Bob, although that might be the halo effect talking, since we already established he's higher status. Bob is outgunned on every level and isn't smart enough to extricate himself, so of course he's going to be punished for it socially. It could have been an argument between any two ideological positions and Bob would have lost.
It says nothing about how most of us on Less Wrong should display our beliefs.
It is not "damning". The test diagnoses a particular cognitive style, characterised by precision and attention to detail - this is of no great benefit in social settings, and in extreme cases can lead to difficulty in social interaction and peculiar behaviour. On the other hand, in sciences, engineering and probably philosophy, this style brings major benefits. The overall quality of LW site is a reflection of this.
I classed this as a 'why commonly held LW beliefs are wrong' post when I first saw the list, then skipped to the conclusion (which made a really useful point, for which I upvoted the post). I'm mentioning this because I think that the post would communicate better if you revealed your main point earlier.
The conversation between Rational Rian and Bayesian Bob is uncannily reminiscent of several conversations I had when I first grew infatuated with some of EY's writings and Less Wrong overall. This later led me to very quickly start wondering whether the community would be willing to dedicate some intellectual effort and apply rationality to hiding bad signalling.
I think the OP is worth posting in the main section. But after that, someone should write up something about how to raise the sanity waterline without damaging your own reputation. Now, I know that when people call on someone to do something, it more or less means no one will, especially not me. This is why I've been doing my own thinking on the matter, but I'd first like to know if people on LW are interested at all in this line of thought.
For example: a basic stratagem seems to be to successfully diagnose, perhaps even affirm, some of your acquaintances' beliefs, and then over time present some simple and powerful, if perhaps by now obsolete or superseded, arguments that first started several of LW's more prominent writers (or you yourself) on the path to the current set of beliefs. This naturally isn't rationality building (though it might happen in the process), just spreading beliefs, but the objective here is to change the in-group norms of your social circle.
Then, you can start individually building rationality skills.
It is not nearly as bad as you make it out. Bayesian Bob just seems really bad at explaining.
Rian seems to not consider detectives investigating a crime to be gathering evidence, but Bob does not seem to notice this. We can come up with examples of socially categorized types of evidence and explain why the categories are socially useful.
Absence of Evidence is Evidence of Absence can be explained in scientific terms. If a scientific experiment looking for evidence of a theory produces no results, that is evidence against the theory. This is easier to deal with in a scientific experiment because its controlled nature lets you know how hard the experiment was looking for the evidence, and so calculate how likely it would have been to find that evidence if the theory were correct. Outside that context the principle is harder to apply, because the conditional probability is harder to calculate, but it is still valid.
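To make that concrete, here is a minimal sketch of the update described above; the detection probabilities are made-up illustrative numbers, not anything from the original comment.

```python
# A sketch of "absence of evidence is evidence of absence" as a Bayes update.
# The 0.90 / 0.05 detection probabilities are illustrative assumptions.

def update_on_no_evidence(prior, p_find_if_true, p_find_if_false):
    """Posterior P(theory | experiment found nothing), via Bayes' theorem."""
    p_none_if_true = 1.0 - p_find_if_true
    p_none_if_false = 1.0 - p_find_if_false
    p_none = prior * p_none_if_true + (1.0 - prior) * p_none_if_false
    return prior * p_none_if_true / p_none

# A controlled experiment that would find the evidence 90% of the time if the
# theory were true, and only 5% of the time (spuriously) if it were false:
print(update_on_no_evidence(prior=0.50, p_find_if_true=0.90, p_find_if_false=0.05))
# -> roughly 0.095: a null result substantially lowers the probability of the theory.
```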
Not once did Bob bring up such concepts as likelihood ratios or conditional probability.
And plenty of other comments have noted the problem with "starts out with a 50% probability".
As has also been pointed out already, most of the bullet point statements are either not actually controver...
A statement, any statement, starts out with a 50% probability of being true, and then you adjust that percentage based on the evidence you come into contact with.
Zed, you have earned an upvote (and several more mental ones) from me for this display of understanding on a level of abstraction even beyond what some LW readers are comfortable with, as witnessed by other comments. How prescient indeed was Bayesian Bob's remark:
(I shouldn't have said that 50% part. There's no way that's going to go over well. I'm such an idiot.)
You can be assured that poor Rational Rian has no chance when even Less Wrong has trouble!
But yes, this is of course completely correct. 50% is the probability of total ignorance -- including ignorance of how many possibilities are in the hypothesis space. Probability measures how much information you have, and 50% represents a "score" of zero. (How do you calculate the "score", you ask? It's the logarithm of the odds ratio. Why should that be chosen as the score? Because it makes updating additive: when you see evidence, you update your score by adding to it the number of bits of evidence you see.)
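A small sketch of that "score" (log-odds measured in bits, so 50% is a score of zero and evidence simply adds); the function names and numbers are illustrative, not from the original comment:

```python
import math

def prob_to_score(p):
    """The 'score': log-odds, measured in bits. p = 0.5 gives a score of 0."""
    return math.log2(p / (1.0 - p))

def score_to_prob(score):
    """Inverse of prob_to_score."""
    return 1.0 / (1.0 + 2.0 ** (-score))

print(prob_to_score(0.5))             # 0.0    -- total ignorance
print(score_to_prob(0.0 + 3.0))       # ~0.889 -- after 3 bits of evidence in favor
print(score_to_prob(0.0 + 3.0 - 1.0)) # 0.8    -- then 1 bit of evidence against
```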
Of course, we almost never reach thi...
I would be much happier with that survey if it used the standard five-degrees-of-belief format rather than a flat agree/disagree. Especially later on, it includes many statements which I believe or disbelieve with low confidence, or which I consider irrelevant or so malformed as to be essentially meaningless.
Some of those statements in the list are sufficiently unclear that I can't really agree or disagree with them. Others have multiple different claims in them and I agree with some parts and disagree with others. And some are just false.
Most scientists disagree with this but that's just because it sounds counter-intuitive and scientists are biased against counterintuitive explanations.
This one is false, as some other comments have pointed out.
Besides, the scientific method is wrong because it is in conflict with probability theory. Oh, and probability is created by humans, it doesn't exist in the universe.
(Bayesian) Probability theory doesn't say that the scientific method is wrong. It provides a formal specification of why the scientific method (of changing beliefs based on evidence) is correct and how to apply it. The second sentence refers to the true beliefs explained in Probability is in the Mind and Probability is Subjectively Objective, but it mangles them.
Every fraction of a second you split into thousands of copies of yourself. Of course you cannot detect these copies scientifically, but that's because science is wrong and stupid.
"Science is wrong and stupid&q...
argument ad populum
I think a more correct term in this context would be argumentum ad verecundiam. It's about arguing based on the opinion of a small number of authoritative people, not the general public.
If you have worked your way through most of the sequences you are likely to agree with the majority of these statements
I realize this is not the main point of the post, but this statement made me curious: what fraction of Less Wrong readers become convinced of these less mainstream beliefs?
To this end I made a Google survey! If you have some spare time, please fill it out. (Obviously, we should overlook the deliberately provocative phrasing when answering).
I'll come back two weeks from now and post a new comment with the results.
Here are the crackpot belief survey results.
All in all, 77 people responded. It seems we do drink the Kool-Aid! Of the substantive questions, the most contentious ones were "many clones" and timeless physics, and even they got over 50%. Thanks to everyone who responded!
I want people to cut off my head when I'm medically dead, so my head can be preserved and I can come back to life in the (far far) future.
Agree 73% Disagree 27%
It is possible to run a person on Conway's Game of Life. This would be a person as real as you or me, and wouldn't be able to tell he's in a virtual world because it looks exactly like ours.
Agree 90% Disagree 10%
Right now there exist many copies/clones of you, some of which are blissfully happy and some of which are being tortured and we should not care about this at all.
Agree 53% Disagree 47%
Most scientists disagree with this but that's just because it sounds counter-intuitive and scientists are biased against counterintuitive explanations.
Agree 32% Disagree 68%
Besides, the scientific method is wrong because it is in conflict with probability theory.
Agree 23% Disagree 77%
Oh, and probability is created by humans, it doesn't exist i...
I want to fill it out, I really do, but the double statements make me hesitate.
For example I do believe that there are ~lots of "clones of me" around, but I disagree that we shouldn't care about this. It has significant meaning when you're an average utilitarian, or something approaching one.
It is possible to run a person on Conway's Game of Life. This would be a person as real as you or me, and wouldn't be able to tell he's in a virtual world because it looks exactly like ours.
I don't see why you call this a "crackpot belief". The (extended) Church-Turing Thesis has near-universal acceptance and implies that humans can be simulated by Turing machines. Similarly, it is widely accepted that Conway's Game of Life can run Turing machines. Physicists who don't believe this are widely regarded as controversial.
Right, so whether a belief is low status is (among other things) a property of the audience.
But even if the audience consists of people who "who like philosophy and [are] familiar with the different streams and philosophical dilemmas, who know computation theory and classical physics, who [have] a good understanding of probability and math and somebody who [are] naturally curious reductionists", which is a very educated audience, then the cognitive gap is still so large that it cannot be bridged in casual conversation.
I think it's fair to say a highly educated reductionist audience is considered high status by almost any standard[1]. And my claim is, and my experience is, that if you casually slip in a LW-style argument then because of the cognitive gap you won't be able to explain exactly what you mean, because it's extraordinarily difficult to fall back on arguments that don't depend on the sequences or any other prerequisites.
If you have a belief that you can't explain coherently then I think people will assume that's because your understanding of the subject matter is bad, even though that's not the problem at all. So if you try to explain your beliefs but fail to do so in a manner that makes sense (to the audience) then you face a social penalty.
[1] we can't get away with defining every group that doesn't reason like we do as low-status
This reminds me of some old OB posts, I think, on non-conformity - the upshot being that you can't get away with being public about all the ways you are a maverick, and that to do so is self-sabotaging.
If you have worked your way through most of the sequences you are likely to agree with the majority of these statements:
I have, but I don't. A couple I agree with and there are some others about which I can at least see how they could be used as a straw man. Then there are some which are just way off.
Then there is:
It is possible to run a person on Conway's Game of Life. This would be a person as real as you or me, and wouldn't be able to tell he's in a virtual world because it looks exactly like ours.
That I agree with. And it doesn't qualify as a 'cr...
Concealing unconventional beliefs with high inferential distance to those you are speaking with makes sense. Dismissing those beliefs with the absurdity heuristic does not.
Also, I think you underestimate the utility of rhetorical strategies. For example, you could:
Most of the statements you make are false in their connotations, but there's one statement you make (and attribute to "Bayesian Bob") that seems false no matter what way you look at it, and it's this one: "A statement, any statement, starts out with a 50% probability of being true." Even the rephrasing "in a vacuum we should believe it with 50% certainty" still seems simply wrong. Where in the world did you see that in Bayesian theory?
For saying that, I label you a Level-0 Rationalist. Unless someone's talking about binary dig...
I think Bayesian Bob should just get better at arguing. It's the same thing I tell students when they complain that they can't explain their paper properly in a 4 sentence abstract: The number of possible sentences you might write is very very large. You're going to have to work a lot harder before I'm convinced that no sequence of 4 sentences will suffice.
My experience has been that if I'm arguing about something I know well and I'm very confident about, it never feels like I'm in danger of "losing status".
"Bayesian Bob: ... I meant that in a vacuum we should believe it with 50% certainty..."
No we shouldn't: http://lesswrong.com/lw/jp/occams_razor/
As for proving a negative, I've got two words: Modus Tollens.
Bob does need to go back to math class! ;)
A statement, any statement, starts out with a 50% probability of being true, and then you adjust that percentage based on the evidence you come into contact with.
Suppose we have a statement X, and the only thing we know about X is that it was randomly selected from the set S of statements of 100 characters, with the alphabet consisting of the digits 0 to 9 and the symbols + and =. If
th...
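The comment is cut off here, but a rough Monte Carlo sketch of the kind of check it appears to be setting up suggests the share of true statements in S is nowhere near 50%. Interpreting each string as an arithmetic claim is my assumption, and the trial count is arbitrary:

```python
import random

ALPHABET = "0123456789+="

def is_true_statement(s):
    """Read s as an arithmetic claim such as '12+7=19'. Count it as true only
    if it has exactly one '=', both sides are well-formed sums of plain
    numbers, and the two sides are equal."""
    left_right = s.split("=")
    if len(left_right) != 2:
        return False
    totals = []
    for side in left_right:
        terms = side.split("+")
        if any(t == "" for t in terms):
            return False
        totals.append(sum(int(t) for t in terms))
    return totals[0] == totals[1]

random.seed(0)
trials = 200_000
hits = sum(is_true_statement("".join(random.choices(ALPHABET, k=100)))
           for _ in range(trials))
print(hits / trials)  # ~0.0 -- vanishingly far from a 50% prior
```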
A statement, any statement, starts out with a 50% probability of being true, and then you adjust that percentage based on the evidence you come into contact with.
That's wildly wrong. "50% probability" is what you assign if someone tells you, "One and only one of the statements X or Y is true, but I'm not going to give you the slightest hint as to what they mean" and it's questionable whether you can even call that a statement, since you can't say anything about its truth-conditions.
Any statement for which you have the faintest idea ...
"If you know nothing, 50% prior probability" still strikes me as just plain wrong.
That strikes me as even weirder and wrong. So given a variable A which could be every possible variable, I should assign it... 75% and ~A 25%? or 25%, and make ~A 75%? Or what? - Isn't 50% the only symmetrical answer?
Basically, given a single variable and its negation, isn't 1/2 the max-entropy distribution, just as a collection of n variables has 1/n as the max-ent answer for them?
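A quick numerical check of that claim (a sketch with illustrative numbers, not part of the original comment):

```python
import math

def entropy_bits(dist):
    """Shannon entropy of a discrete distribution, in bits."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

# For a single binary variable, entropy is maximized at p = 0.5 ...
grid = [i / 100 for i in range(1, 100)]
print(max(grid, key=lambda p: entropy_bits([p, 1 - p])))  # 0.5

# ... and over n mutually exclusive possibilities, the uniform 1/n assignment
# has higher entropy than any lopsided one.
print(entropy_bits([0.25] * 4), entropy_bits([0.7, 0.1, 0.1, 0.1]))  # 2.0 vs ~1.36
```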
I don't think it's that bad. Anything at an inferential distance sounds ridiculous if you just matter-of-factly assert it, but that just means that if you want to tell someone about something at an inferential distance, don't just matter-of-factly assert it. The framing probably matters at least as much as the content.
science is wrong
No. Something like "Bayesian reasoning is better than science" would work.
Every fraction of a second you split into thousands of copies of yourself.
Not "thousands". "Astronomically many" would work.
Computers will soon become so fast that AI researchers will be able to create an artificial intelligence that's smarter than any human
That's the accelerating-change school of the singularity, not the intelligence-explosion school. Only the latter is popular around here.
...Also, we sometimes prefer torture to dust-spe
the scientific method is wrong
Is your refrigerator running?
Time isn't real.
Rainbows are real.
Computers will soon become so fast that AI researchers will be able to create an artificial intelligence that's smarter than any human.
Not Kurzweil...anything but that
When this happens humanity will probably be wiped out.
Marxist-style fatalism? Worse than Kurzweil.
truth of all these statements is completely obvious
Obviousness isn't a property of statements or facts. This idiom of thought embodies the mind projection fallacy.
Just because I read the sequences doesn't mean I'm particularly likely to agree with any of them. Some, yes, but not all. Many of the statements you listed are controversial even on LW. If they were unanimously accepted here without further discussion, it would be a worrying sign.
I wonder what a Friendly AI should do when it discovers something that at first sight seems like a "crackpot belief" to its human operators. Let's assume that the AI is far smarter than humans (and the "crackpot belief" requires many logical steps), but is still in a testing phase and humans don't believe in its correctness.
If the AI reports the discovery openly to humans, they will probably turn it off quickly, assuming something went wrong in the program.
On the other hand, if the AI predicts that humans are not ready for this information,...
Think back for a second to your pre-bayesian days. Think back to the time before your exposure to the sequences. Now the question is, what estimate would you have given that any chain of arguments could persuade you the statements above are true? In my case, it would be near zero.
p(convincing argument exists) >= (Number of Bayesians post-sequence - Number of Bayesians pre-sequence) / Number of people reading sequence.
Or, in simpler terms, "the sequences are a convincing chain of argument, and they are effective." I'll admit I'm working with...
I can't handle this site's scrutiny right now.
If you have worked your way through most of the sequences you are likely to agree with the majority of these statements:
In two words: crackpot beliefs.
These statements cover only a fraction of the sequences, and although they're deliberately phrased to incite kneejerk disagreement and ugh-fields, I think most LW readers will find themselves in agreement with almost all of them. And if not, then you can always come up with better examples that illustrate some of your non-mainstream beliefs.
Think back for a second to your pre-bayesian days. Think back to the time before your exposure to the sequences. Now the question is, what estimate would you have given that any chain of arguments could persuade you the statements above are true? In my case, it would be near zero.
You can take somebody who likes philosophy and is familiar with the different streams and philosophical dilemmas, who knows computation theory and classical physics, who has a good understanding of probability and math and somebody who is a naturally curious reductionist. And this person will still roll his eyes and will sarcastically dismiss the ideas enumerated above. After all, these are crackpot ideas, and people who believe them are so far "out there", they cannot be reasoned with!
That is really the bottom line here. You cannot explain the beliefs that follow from the sequences, because they have too many dependencies. And even if you did have time to go through all the necessary dependencies, explaining a belief is still an order of magnitude more difficult than following an explanation written down by somebody else, because in order to explain something you have to juggle two mental models: your own and the listener's.
Some of the sequences touch on the concept of the cognitive gap (inferential distance). We have all learned the hard way that we can't expect people to just understand what we say, and we can't expect short inferential distances. In practice there is just no way to bridge the cognitive gap. This isn't a big deal for most educated people, because people don't expect to understand complex arguments in other people's fields and all educated intellectuals are on the same team anyway (well, most of the time). For crackpot LW beliefs it's a whole different story though. I suspect most of us have found that out the hard way.
Rational Rian: What do you think is going to happen to the economy?
Bayesian Bob: I'm not sure. I think Krugman believes that a bigger cash injection is needed to prevent a second dip.
Rational Rian: Why do you always say what other people think, what's your opinion?
Bayesian Bob: I can't really distinguish between good economic reasoning and flawed economic reasoning because I'm a layman. So I tend to go with what Krugman writes, unless I have a good reason to believe he is wrong. I don't really have strong opinions about the economy, I just go with the evidence I have.
Rational Rian: Evidence? You mean his opinion.
Bayesian Bob: Yep.
Rational Rian: Eh? Opinions aren't evidence.
Bayesian Bob: (Whoops, now I have to either explain the nature of evidence on the spot or Rian will think I'm an idiot with crazy beliefs. Okay then, here goes.) An opinion reflects the belief of the expert. These beliefs can either be uncorrelated with reality, negatively correlated or positively correlated. If there is absolutely no relation between what an expert believes and what is true then, sure, it wouldn't count as evidence. However, it turns out that experts mostly believe true things (that's why they're called experts) and so the beliefs of an expert are positively correlated with reality and thus his opinion counts as evidence.
Rational Rian: That doesn't make sense. It's still just an opinion. Evidence comes from experiments.
Bayesian Bob: Yep, but experts have either done experiments themselves or read about experiments other people have done. That's what their opinions are based on. Suppose you take a random scientific statement, you have no idea what it is, and the only thing you know is that 80% of the top researchers in that field agree with that statement, would you then assume the statement is probably true? Would the agreement of these scientists be evidence for the truth of the statement?
Rational Rian: That's just an argument ad populus! Truth isn't governed by majority opinion! It is just religious nonsense that if enough people believe something then there must be some truth to it.
Bayesian Bob: (Ad populum! Populum! Ah, crud, I should've phrased that more carefully.) I don't mean that majority opinion proves that the statement is true, it's just evidence in favor of it. If there is counterevidence the scale can tip the other way. In the case of religion there is overwhelming counterevidence. Scientifically speaking religion is clearly false, no disagreement there.
Rational Rian: There's scientific counterevidence for religion? Science can't prove non-existence. You know that!
Bayesian Bob: (Oh god, not this again!) Absence of evidence is evidence of absence.
Rational Rian: Counter-evidence is not the same as absence of evidence! Besides, stay with the point, science can't prove a negative.
Bayesian Bob: The certainty of our beliefs should be proportional to the amount of evidence we have in favor of the belief. Complex beliefs require more evidence than simple beliefs, and the laws of probability, Bayes specifically, tell us how to weigh new evidence. A statement, any statement, starts out with a 50% probability of being true, and then you adjust that percentage based on the evidence you come into contact with. (I shouldn't have said that 50% part. There's no way that's going to go over well. I'm such an idiot.)
Rational Rian: A statement without evidence is 50% likely to be true!? Have you forgotten everything from math class? This doesn't make sense on so many levels, I don't even know where to start!
Bayesian Bob: (There's no way to rescue this. I'm going to cut my losses.) I meant that in a vacuum we should believe it with 50% certainty, not that any arbitrary statement is 50% likely to accurately reflect reality. But no matter. Let's just get something to eat, I'm hungry.
Rational Rian: So we should believe something even if it's unlikely to be true? That's just stupid. Why do I even get into these conversations with you? *sigh* ... So, how about Subway?
The moral here is that crackpot beliefs are low status. Not just low-status like believing in a deity, but majorly low status. When you believe things that are perceived as crazy and when you can't explain to people why you believe what you believe then the only result is that people will see you as "that crazy guy". They'll wonder, behind your back, why a smart person can have such stupid beliefs. Then they'll conclude that intelligence doesn't protect people against religion either so there's no point in trying to talk about it.
If you fail to conceal your low-status beliefs you'll be punished for it socially. If you think that they're in the wrong and that you're in the right, then you missed the point. This isn't about right and wrong, this is about anticipating the consequences of your behavior. If you choose to talk about outlandish beliefs when you know you cannot convince people that your belief is justified, then you hurt your credibility and you get nothing for it in exchange. You cannot repair the damage easily, because even if your friends are patient and willing to listen to your complete reasoning you'll (accidentally) expose three even crazier beliefs you have.
An important life skill is the ability to get along with other people and to not expose yourself as a weirdo when it isn't in your interest to do so. So take heed and choose your words wisely, lest you fall into the trap.
EDIT - Google Survey by Pfft
PS: intended for /main but since this is my first serious post I'll put it in discussion first to see if it's considered sufficiently insightful.