Unknowns comments on The Importance of Self-Doubt - Less Wrong

23 Post author: multifoliaterose 19 August 2010 10:47PM

Comment author: Unknowns 21 August 2010 05:26:44AM *  2 points [-]

Unless you've actually calculated the probability mathematically, a probability of one in a billion for a natural language claim that a significant number of people accept as likely true is always overconfident. Even Eliezer said that he couldn't assign a probability as low as one in a billion for the claim "God exists" (although Michael Vassar criticized him for this, showing himself to be even more overconfident than Eliezer).

Comment author: komponisto 23 August 2010 11:25:52AM 5 points [-]

Unless you've actually calculated the probability mathematically, a probability of one in a billion for a natural language claim that a significant number of people accept as likely true is always overconfident.

I'm afraid I have to take severe exception to this statement.

You give the human species far too much credit if you think that our mere ability to dream up a hypothesis automatically raises its probability above some uniform lower bound.

Comment author: Unknowns 23 August 2010 11:51:52AM 0 points [-]

I am aware of your disagreement, for example as expressed by the absurd claims here. Yes, my basic idea, unlike yours, is to give some credit to the human species. I think there's a limit on how much you can disagree with other human beings-- unless you're claiming to be something superhuman.

Did you see the link to this comment thread? I would like to see your response to the discussion there.

Comment author: komponisto 23 August 2010 07:58:47PM 5 points [-]

I think there's a limit on how much you can disagree with other human beings-- unless you're claiming to be something superhuman.

At least for epistemic meanings of "superhuman", that's pretty much the whole purpose of LW, isn't it?

Did you see the link to this comment thread? I would like to see your response to the discussion there.

My immediate response is as follows: yes, dependency relations might concentrate most of the improbability of a religion in a relatively small subset of its claims. But the point is that those claims themselves possess enormous complexity (which may not necessarily be apparent on the surface; cf. the simple-sounding "the woman across the street is a witch; she did it").

Comment author: Unknowns 24 August 2010 03:50:26AM *  7 points [-]

Let's pick an example. How probable do you think it is that Islam is a true religion? (There are several ways to take care of logical contradictions here, so saying 0% is not an option.)

Suppose there were a machine--for the sake of tradition, we can call it Omega--that prints out a series of zeros and ones according to the following rule. If Islam is true, it prints out a 1 on each round, with 100% probability. If Islam is false, it prints out a 0 or a 1, each with 50% probability.

Let's run the machine... suppose on the first round, it prints out a 1. Then another. Then another. Then another... and so on... it's printed out 10 1's now. Of course, this isn't so improbable. After all, there was a 1/1024 chance of it doing this anyway, even if Islam is false. And presumably we think Islam is more likely than this to be false, so there's a good chance we'll see a 0 in the next round or two...

But it prints out another 1. Then another. Then another... and so on... It's printed out 20 of them. Incredible! But we're still holding out. After all, million to one chances happen every day...

Then it prints out another, and another... it just keeps going... It's printed out 30 1's now. Of course, it did have a chance of one in a billion of doing this, if Islam were false...

But for me, this is my lower bound. At this point, if not before, I become a Muslim. What about you?

You've been rather vague about the probabilities involved, but you speak of "double digit negative exponents" and so on, even saying that this is "conservative," which implies possibly three digit exponents. Let's suppose you think that the probability that Islam is true is 10^-20; this would seem to be very conservative, by your standards. According to this, to get an equivalent chance, the machine would have to print out 66 1's.

If the machine prints out 50 1's, and then someone runs in and smashes it beyond repair, before it has a chance to continue, will you walk away, saying, "There is a chance at most of 1 in 60,000 that Islam is true?"

If so, are you serious?
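
For concreteness, here is a minimal sketch in Python of the Bayesian update this machine implies, assuming the likelihoods are exactly as stated; the two priors tried below are figures discussed in this thread, not claims about the true value:

    # Posterior of "Islam is true" after N consecutive 1's, by Bayes' theorem.
    # Likelihoods as stated above: P(1 each round | true) = 1,
    # P(1 each round | false) = 0.5.
    def posterior(prior, n_ones):
        p_evidence = prior + (1 - prior) * 0.5 ** n_ones
        return prior / p_evidence

    for prior in (1e-9, 1e-20):
        for n in (10, 20, 30, 50, 66):
            print(f"prior={prior:.0e}, {n:2d} ones -> P = {posterior(prior, n):.3g}")

With a one-in-a-billion prior, 30 ones land near even odds (Unknowns' stated conversion point); with a 10^-20 prior, 50 ones leave the posterior around 1.1 x 10^-5 -- the same order as the "1 in 60,000" above -- and it takes about 66 ones to approach even odds.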

Comment author: cousin_it 24 August 2010 11:33:54PM *  9 points [-]

Thank you a lot for posting this scenario. It's instructive from the "heuristics and biases" point of view.

Imagine there are a trillion variants of Islam, differing by one paragraph in the holy book or something. At most one of them can be true. You pick one variant at random, test it with your machine and get 30 1's in a row. Now you should be damn convinced that you picked the true one, right? Wrong. Getting this result by a fluke is 1000x more likely than having picked the true variant in the first place. Probability is unintuitive and our brains are mush, that's all I'm sayin'.
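
A quick check of this arithmetic, under the stated assumptions (a one-in-a-trillion prior on having picked the true variant, fair-coin output otherwise):

    # P(picked the true variant | machine printed 30 ones).
    prior = 1e-12
    fluke = 0.5 ** 30      # ~9.3e-10: 30 ones from a false variant by luck
    print(prior / (prior + (1 - prior) * fluke))   # ~0.001: the fluke is ~1000x likelier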

Comment author: Unknowns 25 August 2010 05:41:52AM 1 point [-]

I agree with this. But if the scenario happened in real life, you would not be picking a certain variant. You would be asking the vague question, "Is Islam true," to which the answer would be yes if any one of those trillion variants, or many others, were true.

Yes, there are trillions of possible religions that differ from one another as much as Islam differs from Judaism, or whatever. But only a few of these are believed by human beings. So I still think I would convert after 30 1's, and I think this would be reasonable.

Comment author: cousin_it 25 August 2010 11:20:38AM *  4 points [-]

If a religion's popularity raises your prior for it so much, how do you avoid Pascal's Mugging with respect to the major religions of today? Eternity in hell is more than 2^30 times worse than anything you could experience here; why aren't you religious already?

Comment author: Unknowns 26 August 2010 06:35:02AM 2 points [-]

It doesn't matter whether it raises your prior or not; eternity in hell is also more than 2^3000 times worse etc... so the same problem will apply in any case.

Elsewhere I've defended Pascal's Wager against the usual criticisms, and I still say it's valid given the premises. But there are two problematic premises:

1) It assumes that utility functions are unbounded. This is certainly false for all human beings in terms of revealed preference; it is likely false even in principle (e.g. the Lifespan Dilemma). A sketch after this list shows how a bound changes the calculation.

2) It assumes that humans are utility maximizers. This is false in fact, and even in theory most of us would not want to self-modify to become utility maximizers; it would be a lot like self-modifying to become a Babyeater or a Super-Happy.
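
A minimal sketch of how premise 1) changes the expected-utility comparison; every number below (the probability, the cost, the payoff, the bound) is an illustrative assumption:

    # Pascal's Wager with and without a bound on the utility function.
    p_true = 1e-9                  # assumed probability of the religion
    cost = 1.0                     # utility given up by converting
    payoff = 1e100                 # "eternity", valued without any bound
    bound = 1e6                    # cap for an agent with bounded utility

    print(p_true * payoff - cost)              # ~1e91: unbounded agent takes the wager
    print(p_true * min(payoff, bound) - cost)  # ~-0.999: bounded agent declines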

Comment author: Wei_Dai 25 August 2010 10:24:28PM 1 point [-]

Do you have an answer for how to avoid giving in to the mugger in Eliezer's original Pascal's Mugging scenario? If not, I don't think your question is a fair one (assuming it's meant to be rhetorical).

Comment author: cousin_it 26 August 2010 05:36:03PM *  0 points [-]

I don't have a conclusive answer, but many people say they have bounded utility functions (you see Unknowns pointed out that possibility too). The problem with assigning higher credence to popular religions is that it forces your utility bound to be lower if you want to reject the mugging. Imagining a billion lifetimes is way easier than imagining 3^^^^3 lifetimes. That was the reason for my question.

Comment author: thomblake 25 August 2010 03:17:50PM 1 point [-]

Pascal's Mugging

Oddly, I think you meant "Pascal's Wager".

Comment author: FAWS 25 August 2010 03:28:56PM *  1 point [-]

Pascal's Mugging. Pascal's Wager with something breaking symmetry (in this case observed belief of others).

Comment author: Vladimir_Nesov 25 August 2010 01:22:13PM 0 points [-]

Yes, there are trillions of possible religions that differ from one another as much as Islam differs from Judaism, or whatever. But only a few of these are believed by human beings.

Privileging the hypothesis! That they are believed by human beings doesn't lend them probability.

Comment author: FAWS 25 August 2010 01:55:40PM *  2 points [-]

Well, it does to the extent that lack of believers would be evidence against them. I'd say that Allah is considerably more probable than a similarly complex and powerful god who also wants to be worshiped and is equally willing to interact with humans, but not believed in by anyone at all. Still considerably less probable than the prior of some god of that general sort existing, though.

Comment author: Vladimir_Nesov 25 August 2010 02:09:17PM 0 points [-]

Well, it does to the extent that lack of believers would be evidence against them.

Agreed, but then we have the original situation, if we only consider the set of possible gods that have the property of causing worshiping of themselves.

Comment author: Perplexed 25 August 2010 02:58:05PM *  2 points [-]

Yes, there are trillions of possible religions that differ from one another as much as Islam differs from Judaism, or whatever. But only a few of these are believed by human beings.

Privileging the hypothesis! That they are believed by human beings doesn't lend them probability.

No. It doesn't lend probability, but it seems like it ought to lend something. What is this mysterious something? Let's call it respect.

Privileging the hypothesis is a fallacy. Respecting the hypothesis is a (relatively minor) method of rationality.

We respect the hypotheses that we find in a math text by investing the necessary mental resources toward the task of finding an analytic proof. We don't just accept the truth of the hypothesis on authority. But on the other hand, we don't try to prove (or disprove) just any old hypothesis. It has to be one that we respect.

We respect scientific hypotheses enough to invest physical resources toward performing experiments that might refute or confirm them. We don't expend those resources on just any scientific hypothesis. Only the ones we respect.

Does a religion deserve respect because it has believers? More respect if it has lots of believers? I think it does. Not privilege. Definitely not. But respect? Why not?

Comment author: Vladimir_Nesov 25 August 2010 03:37:25PM 4 points [-]

Privileging the hypothesis is a fallacy. Respecting the hypothesis is a (relatively minor) method of rationality.

No, it's a method of anti-epistemic horror.

Comment author: FAWS 25 August 2010 03:42:00PM *  3 points [-]

You can dispense with this particular concept of respect, since in both your examples you are actually supplied with sufficient Bayesian evidence to justify evaluating the hypothesis, so it isn't privileged. Whether this is also the case for believed-in religions is the very point contested.

Comment author: [deleted] 26 August 2010 06:48:29PM 0 points [-]

Yes, this seems right.

A priori, with no other evidence one way or another, a belief held by human beings is more likely to be true than not. If Ann says she had a sandwich for lunch, then her words are evidence that she actually had a sandwich for lunch.

Of course, we have external reason to doubt lots of things that human beings claim and believe, including religions. And a religion does not become twice as credible if it has twice as many adherents. Right now I believe we have good reason to reject (at least some of) the tenets of all religious traditions.

But it does make some sense to give some marginal privilege or respect to an idea based on the fact that somebody believes it, and to give the idea more credit if it's very durable over time, or if particularly clever people believe it. If it were any subject but religion -- if it were science, for instance -- this would be an obvious point. Scientific beliefs have often been wrong, but you'll be best off giving higher priors to hypotheses believed by scientists than to other conceivable hypotheses.

Comment author: Unknowns 26 August 2010 06:45:32AM *  1 point [-]

Also... if you haven't been to Australia, is it privileging the hypothesis to accept the word of those who say that it exists? There are trillions of possible countries that could exist that people don't believe exist...

And don't tell me they say they've been there... religious people say they've experienced angels etc. too.

And so on. People's beliefs in religion may be weaker than their belief in Australia, but it certainly is not privileging a random hypothesis.

Comment author: Vladimir_Nesov 26 August 2010 09:32:41AM *  0 points [-]

Your observations (of people claiming to have seen an angel, or a kangaroo) are distinct from hypotheses formed to explain those observations. If in a given case you don't have reason to expect statements people make to be related to facts, then the statements people make, taken verbatim, have no special place as hypotheses.

Comment author: thomblake 25 August 2010 04:16:13PM 1 point [-]

Privileging the hypothesis!

Begging the question!

Comment author: Unknowns 25 August 2010 04:10:21PM 1 point [-]

This whole discussion is about this very point. Downvoted for contradicting my position without making an argument.

Comment author: Vladimir_Nesov 25 August 2010 04:14:19PM 0 points [-]

Your position statement didn't include an argument either, and the problem with it seems rather straightforward, so I named it.

Comment author: Cyan 25 August 2010 05:21:16PM 0 points [-]

I disagree; given that most of the religions in question center on human worship of the divine, I have to think that Pr(religion X becomes known among humans | religion X is true) > Pr(religion X becomes known among humans | religion X is false). But I hate to spend time arguing about whether a likelihood ratio should be considered strictly equal to 1 or equal to 1 + epsilon when the prior probabilities of the hypotheses in question are themselves ridiculously small.

Comment author: komponisto 24 August 2010 10:53:27PM *  3 points [-]

If the machine prints out 50 1's, and then someone runs in and smashes it beyond repair, before it has a chance to continue, will you walk away, saying, "There is a chance at most of 1 in 60,000 that Islam is true?"

If so, are you serious?

Of course I'm serious (and I hardly need to point out the inadequacy of the argument from the incredulous stare). If I'm not going to take my model of the world seriously, then it wasn't actually my model to begin with.

Sewing-Machine's comment below basically reflects my view, except for the doubts about numbers as a representation of beliefs. What this ultimately comes down to is that you are using a model of the universe according to which the beliefs of Muslims are entangled with reality to a vastly greater degree than on my model. Modulo the obvious issues about setting up an experiment like the one you describe in a universe that works the way I think it does, I really don't have a problem waiting for 66 or more 1's before converting to Islam. Honest. If I did, it would mean I had a different understanding of the causal structure of the universe than I do.

Further below you say this, which I find revealing:

If this actually happened to you, and you walked away and did not convert, would you have some fear of being condemned to hell for seeing this and not converting? Even a little bit of fear? If you would, then your probability that Islam is true must be much higher than 10^-20, since we're not afraid of things that have a one in a hundred billion chance of happening.

As it happens, given my own particular personality, I'd probably be terrified. The voice in my head would be screaming. In fact, at that point I might even be tempted to conclude that expected utilities favor conversion, given the particular nature of Islam.

But from an epistemic point of view, this doesn't actually change anything. As I argued in Advancing Certainty, there is such a thing as epistemically shutting up and multiplying. Bayes' Theorem says the updated probability is one in a hundred billion, my emotions notwithstanding. This is precisely the kind of thing we have to learn to do in order to escape the low-Earth orbit of our primitive evolved epistemology -- our entire project here, mind you -- which, unlike you (it appears), I actually believe is possible.

Comment author: Wei_Dai 25 August 2010 12:59:57AM 4 points [-]

Has anyone done a "shut up and multiply" for Islam (or Christianity)? I would be interested in seeing such a calculation. (I did a Google search and couldn't find anything directly relevant.) Here's my own attempt, which doesn't get very far.

Let H = "Islam is true" and E = everything we've observed about the universe so far. According to Bayes:

P(H | E) = P(E | H) P(H) / P(E)

Unfortunately I have no idea how to compute the terms above. Nor do I know how to argue that P(H|E) is as small as 10^-20 without explicitly calculating the terms. One argument might be that P(H) is very small because of the high complexity of Islam, but since E includes "23% of humanity believe in some form of Islam", the term for the complexity of Islam seems to be present in both the numerator and the denominator, and the two therefore cancel out.

If someone has done such a calculation/argument before, please post a link?
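
One way to make the cancellation question concrete is to expand P(E) over both hypotheses. In the toy sketch below, the complexity of Islam contributes an assumed factor of 2^-K to P(H), and the answer turns on P(E|~H) -- how likely the belief is to arise even if H is false. All numbers are hypothetical:

    # P(E) = P(E|H) P(H) + P(E|~H) P(~H), with a complexity penalty in P(H).
    K = 66                      # assumed complexity of the hypothesis, in bits
    p_h = 2.0 ** -K             # prior from complexity alone
    p_e_given_h = 1.0           # a true religion would very likely be believed
    for p_e_given_not_h in (1e-3, 1e-10, 2.0 ** -K):
        p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)
        print(f"P(E|~H)={p_e_given_not_h:.1e} -> P(H|E)={p_e_given_h * p_h / p_e:.3g}")
    # If false religions arise easily (P(E|~H)=1e-3), the penalty does not
    # cancel and the posterior stays tiny; it cancels only if believed-in
    # falsehoods are themselves as rare as 2^-K.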

Comment author: FAWS 25 August 2010 04:00:32PM 3 points [-]

the term for the complexity of Islam seems to be present in both the numerator and the denominator, and the two therefore cancel out.

Actually it doesn't: human-generated complexity is different from naturally generated complexity (for instance, it fits into narratives, apparent holes are filled with the sort of justifications a human is likely to think of, etc.). That's one of the ways you can tell stories from real events. Religious accounts contain much of what looks like human-generated complexity.

Comment author: [deleted] 25 August 2010 03:49:31PM 2 points [-]
  1. Here's a somewhat rough way of estimating probabilities of unlikely events. Let's say that an event X with P(X) = about 1-in-10 is a "lucky break." Suppose that there are L(1) ways that Y could occur on account of a single lucky break, L(2) ways that Y could occur on account of a pair of independent lucky breaks, L(3) ways that Y could occur on account of 3 independent lucky breaks, and so on. Then P(Y) is approximately the sum over all n of L(n)/10^n. I have the feeling that arguments about whether P(Y) is small versus extremely small are arguments about the growth rate of L(n). (A sketch after this list illustrates the point.)

  2. I discussed the problem of estimating P("23% of humanity believes...") here. I'd be grateful for thoughts or criticisms.
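
A small sketch of how the growth rate of L(n) drives the estimate in item 1; the three growth laws are hypothetical examples, not claims about any particular Y:

    # P(Y) ~ sum over n of L(n) / 10^n, for assumed growth rates of L(n).
    def p_event(L, n_max=200):
        return sum(L(n) / 10.0 ** n for n in range(1, n_max))

    print(p_event(lambda n: 1))        # constant:           ~0.11
    print(p_event(lambda n: n ** 2))   # polynomial:         ~0.15
    print(p_event(lambda n: 5 ** n))   # exponential (< 10): ~1.0, not small at all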

Comment author: cousin_it 25 August 2010 01:03:20PM *  3 points [-]

P(E) includes the convincingness of Islam to people on average, not the complexity of Islam. These things are very different because of the conjunction fallacy. So P(H) can be a lot smaller than P(E).

Comment author: Wei_Dai 25 August 2010 10:26:06PM 3 points [-]

I don't understand how P(E) does not include a term for the complexity of Islam, given that E contains Islam, and E is not so large that it takes a huge number of bits to locate Islam inside E.

Comment author: Furcas 25 August 2010 11:37:01PM 1 point [-]

I don't think that's true; cousin_it had it right the first time. The complexity of Islam is the complexity of a reality that contains an omnipotent creator, his angels, Paradise, Hell, and so forth. Everything we've observed about the universe includes people believing in Islam, but not the beings and places that Islam says exist.

In other words, E contains Islam the religion, not Islam the reality.

Comment author: Vladimir_Nesov 26 August 2010 09:51:29AM 0 points [-]

It doesn't take a lot of bits to locate "Islam is false" based on "Islam is true". Does it mean that all complex statements have about 50% probability?

Comment author: cousin_it 25 August 2010 10:47:19PM *  -1 points [-]

Whoops, you're right. Now I'm ashamed that my comment got upvoted.

I think the argument may still be made to work by fleshing out the nonstandard notion of "complexity" that I had in my head when writing it :-) Your prior for a given text being true shouldn't depend only on the text's K-complexity. For example, the text "A and B and C and D" has the same complexity as "A or B or C or D", but the former is way less probable. So P(E) and P(H) may have the same term for complexity, but P(H) also gets a "conjunction penalty" that P(E) doesn't get because people are prey to the conjunction fallacy.

EDIT: this was yet another mistake. Such an argument cannot work because P(E) is obviously much smaller than P(H), because E is a huge mountain of evidence and H is just a little text. When trying to reach the correct answer, we cannot afford to ignore P(E|H).
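
The and/or asymmetry mentioned above survives the retraction, and is easy to check for four independent propositions, each assumed true with probability 0.5:

    # Same description length, very different probability.
    p = 0.5
    print(p ** 4)              # P(A and B and C and D) = 0.0625
    print(1 - (1 - p) ** 4)    # P(A or B or C or D)    = 0.9375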

Comment author: gjm 17 June 2015 01:39:01PM 0 points [-]

There are some very crude sketches of shutting-up-and-multiplying, from one Christian and a couple of atheists, here (read the comments as well as the post itself), and I think there may be more with a similar flavour in other blog posts there (and their comments) from around the same time.

(The author of the blog has posted a little on LW. The two skeptics responsible for most of the comments on that post have both been quite active here. One of them still is, and is in fact posting this comment right now :-).)

Comment author: Unknowns 25 August 2010 07:06:33AM 0 points [-]

Wei Dai, exactly. The point that the complexity of the thing is already included in the fact that people believe it is the point I have been making all along. Regardless of what you think the resulting probability is, most of the "evidence" for Islam consists in the very fact that some people think it is true-- and as you show in your calculation, this is very strong evidence.

It seems to me that komponisto and others are taking it to be known with 100% certainty that Islam and the like were generated by some random process, and then trying to determine what the probability would be.

Now I know that most likely Mohammed was insane and the Koran was in effect generated by a random process. But I certainly don't know how you can say that the probability that it wasn't generated randomly is 1 in 10^20 or lower. And in fact, if you're going to assign a probability like this, you should have an actual calculation.

Comment author: [deleted] 25 August 2010 01:37:42AM 0 points [-]

This is a small point but "E includes complex claim C" does not imply that the (for instance, Kolmogorov) complexity of E is as large as the Kolmogorov complexity of C. The complexity of the digits of square root of 2 is pretty small, but they contain strings of arbitrarily high complexity.

Comment author: Wei_Dai 25 August 2010 01:56:07AM 2 points [-]

E includes C implies that K(C) <= K(E) + K(information needed to locate C within E). In this case K(information needed to locate C within E) seems small enough not to matter to the overall argument, which is why I left it out. (Since you said "this is a small point" I guess you probably understand and agree with this.)

Comment author: [deleted] 25 August 2010 02:09:51AM 1 point [-]

Actually no, I hadn't thought of that. But I wonder if the amount of information it takes to locate "lots of people are Muslims" within E is as small as you say. My particular E does not even contain that much information about Islam, and how people came to believe it, but it does contain a model of how people come to believe weird things in general. Is that a misleading way of putting things? I can't tell.

Comment author: Unknowns 25 August 2010 07:13:27AM *  -2 points [-]

I agree that your position is analogous to "shutting up and multiplying." But in fact, Eliezer may have been wrong about that in general -- see the Lifespan Dilemma -- because people's utility functions are likely not unbounded.

In your case, I agree with shutting up and multiplying when we have a way to calculate the probabilities. In this case, we don't, so we can't do it. If you had a known probability (see cousin_it's comment on the possible trillions of variants of Islam) of one in a trillion, then I would agree with walking away after seeing 30 1's, regardless of the emotional effect of this.

But in reality, we have no such known probability. The result is that you are going to have to use some base rate: "things that people believe" or more accurately, "strange things that people believe" or whatever. In any case, whatever base rate you use, it will not have a probability anywhere near 10^-20 (i.e. more than 1 in 10^20 strange beliefs is true etc.)

My real point about the fear is that your brain doesn't work the way your probabilities do-- even if you say you are that certain, your brain isn't. And if we had calculated the probabilities, you would be justified in ignoring your brain. But in fact, since we haven't, your brain is more right than you are in this case. It is less certain precisely because you are simply not justified in being that certain.

Comment author: RichardKennaway 25 August 2010 03:27:58PM 2 points [-]

But for me, this is my lower bound. At this point, if not before, I become a Muslim. What about you?

At this point, if not before, I doubt Omega's reliability, not mine.

Comment author: Pavitra 26 August 2010 06:42:29AM 2 points [-]

It is a traditional feature of Omega that you have confidence 1 in its reliability and trustworthiness.

Comment author: RichardKennaway 26 August 2010 07:31:30AM 3 points [-]

It is a traditional feature of Omega that you have confidence 1 in its reliability and trustworthiness.

Traditions do not always make sense, neither are they necessarily passed down accurately. The original Omega, the one that appears in Newcomb's problem, does not have to be reliable with probability 1 for that problem to be a problem.

Of course, to the purist who says that 0 and 1 are not probabilities, you've just sinned by talking about confidence 1, but the problem can be restated to avoid that by asking for one's conditional probability P(Islam | Omega is and behaves as described).

In the present case, the supposition that one is faced with an overwhelming likelihood ratio raising the probability that Islam is true by an unlimited amount is just a blue tentacle scenario. Any number that anyone who agrees with the general anti-religious view common on LessWrong comes up with is going to be nonsense. Professing, say, 1 in a million for Islam on the grounds that 1 in a billion or 1 in a trillion is too small a probability for the human brain to cope with is the real cop-out, a piece of reversed stupidity with no justification of its own.

The scenario isn't going to happen. Forcing your brain to produce an answer to the question "but what if it did?" is not necessarily going to produce a meaningful answer.

Comment author: Pavitra 26 August 2010 08:21:03AM 2 points [-]

Traditions do not always make sense, neither are they necessarily passed down accurately. The original Omega, the one that appears in Newcomb's problem, does not have to be reliable with probability 1 for that problem to be a problem.

Quite true. But if you want to dispute the usefulness of this tradition, you should address the broader and older tradition of which it is an instance: that thought experiments should abstract away real-world details irrelevant to the main point.

Of course, to the purist who says that 0 and 1 are not probabilities, you've just sinned by talking about confidence 1

This is a pet peeve of mine, and I've wanted an excuse to post this rant for a while. Don't take it personally.

That "purist" is as completely wrong as the person who insists that there is no such thing as centrifugal force. They are ignoring the math in favor of a meme that enables them to feel smugly superior.

0 and 1 are valid probabilities in every mathematical sense: the equations of probability don't break down when passed p=0 or p=1 the way they do with genuine nonprobabilities like -1 or 2. A probability of 0 or 1 is like a perfect vacuum: it happens not to occur in the world that we happen to inhabit, but it is perfectly well-defined, we can do math with it without any difficulty, and it is extraordinarily useful in thought experiments.

When asked to consider a spherical black body of radius one meter resting on a frictionless plane, you don't respond "blue tentacles", you do the math.

Comment author: RichardKennaway 26 August 2010 12:02:45PM 0 points [-]

I agree with the rant. 0 and 1 are indeed probabilities, and saying that they are not is a misleading way of enjoining people to never rule out anything. Mathematically, P(~A|A) is zero, not epsilon, and P(A|A) is 1, not 1-epsilon. Practically, 0 and 1 in subjective judgements mean as near to 0 and 1 as makes no practical difference. When I agree a rendezvous with someone, I don't say "there's a 99% chance I'll be there", I say "I'll be there".

Where we part ways is in our assessment of the value of this thought-experiment. To me it abstracts and assumes away so much that what is left does not illuminate anything. I can calculate 2^{-N}, but asked how large N would have to be to persuade me of some fantastic claim backed by this fantastic machine I simply cannot name any value. I have no confidence that whatever value I named would be the value I would actually use were this impossible scenario to come to pass.

Comment author: Unknowns 26 August 2010 06:39:29AM 0 points [-]

This is a copout.

Comment author: [deleted] 24 August 2010 05:10:16AM *  1 point [-]

You've asked us to take our very small number, and imagine it doubling 66 times. I agree that there is a punch to what you say -- no number, no matter how small, could remain small after being doubled 66 times! But in fact long ago Archimedes made a compelling case that there are such numbers.

Now, it's possible that Archimedes was wrong and something like ultrafinitism is true. I take ultrafinitist ideas quite seriously, and if they are correct then there are a lot of things that we will have to rethink. But Islam is not close to the top of the list of things we should rethink first.

Maybe there's a kind of meta claim here: conditional on probability theory being a coherent way to discuss claims like "Islam is true," the probability that Islam is true really is that small.

Comment author: Unknowns 24 August 2010 05:25:14AM *  0 points [-]

I just want to know what you would actually do, in that situation, if it happened to you tomorrow. How many 1's would you wait for, before you became a Muslim?

Also, "there are such numbers" is very far from "we should use such numbers as probabilities when talking about claims that many people think are true." The latter is an extremely strong claim and would therefore need extremely strong evidence before being acceptable.

Comment author: [deleted] 24 August 2010 07:46:49AM *  3 points [-]

I think after somewhere between 30 and 300 coin flips, I would convert. With more thought and more details about what package of claims is meant by "Islam," I could give you a better estimate. Escape routes that I'm not taking: I would start to suspect Omega was pulling my leg, I would start to suspect that I was insane, I would start to suspect that everything I knew was wrong, including the tenets of Islam. If answers like these are copouts -- if Omega is so reliable, and I am so sane, and so on -- then it doesn't seem like much of a bullet to bite to say "yes, 2^-30 is very small but it is still larger than 2^-66; yes, something very unlikely has happened, but not as unlikely as Islam."

Also, "there are such numbers" is very far from "we should use such numbers as probabilities when talking about claims that many people think are true." The latter is an extremely strong claim and would therefore need extremely strong evidence before being acceptable.

If you're expressing doubts about numbers being a good measure of beliefs, I'm totally with you! But we only need strong evidence for something to be acceptable if there are some alternatives -- sometimes you're stuck with a bad option. Somebody's handed us a mathematical formalism for talking about probabilities, and it works pretty well. But it has a funny aspect: we can take a handful of medium-sized probabilities, multiply them together, and the result is a tiny tiny probability. Can anything be as unlikely as the formalism says 66 heads in a row is? I'm not saying you should say "yes," but if your response is "well, whenever something that small comes up in practice, I'll just round up," that's a patch that is going to spring leaks.

Comment author: Unknowns 24 August 2010 09:26:54AM *  -1 points [-]

Another point, regarding this:

yes, 2^-30 is very small but it is still larger than 2^-66; yes something very unlikely has happened but not as unlikely as Islam.

Originally I didn't intend to bring up Pascal's Wager type considerations here because I thought it would just confuse the issue of the probability. But I've rethought this-- actually this issue could help to show just how strong your beliefs are in reality.

Suppose you had said in advance that the probability of Islam was 10^-20. Then you had this experience, but the machine was shut off after 30 1's (a chance of one in a billion). The chance that Islam is true is now one in a hundred billion, updated from your prior.

If this actually happened to you, and you walked away and did not convert, would you have some fear of being condemned to hell for seeing this and not converting? Even a little bit of fear? If you would, then your probability that Islam is true must be much higher than 10^-20, since we're not afraid of things that have a one in a hundred billion chance of happening.

Comment author: thomblake 25 August 2010 03:28:12PM 0 points [-]

If this actually happened to you, and you walked away and did not convert, would you have some fear of being condemned to hell for seeing this and not converting? Even a little bit of fear? If you would, then your probability that Islam is true must be much higher than 10^-20, since we're not afraid of things that have a one in a hundred billion chance of happening.

This is false.

I must confess that I am sometimes afraid that ghosts will jump out of the shadows and attack me at night, and I would assign a much lower chance of that happening. I have also been afraid of velociraptors. Fear is frequently irrational.

Comment author: Unknowns 24 August 2010 09:14:42AM -1 points [-]

It's good you managed some sort of answer to this. However, 30-300 is quite a wide range: from 1 in 10^9 to 1 in 10^90. If you're going to hope for any sort of calibration at all in using numbers like this, you're going to have to be much more precise...

I wasn't expressing doubts about numbers being a measure of beliefs (although you could certainly question this as well), but about extreme numbers being a measure of our beliefs, which do not seem able to be that extreme. Yes, if you have a large number of independent probabilities, the result can be extreme. And supposedly, the basis for saying that Islam (or reincarnation, or whatever) is very improbable would be the complexity of the claim. But who has really determined how much complexity it has? As I pointed out elsewhere (on the "Believable Bible" comment thread), a few statements, if we knew them to be true, would justify Islam or any other such thing. Which particular statements would we need, and how complex are those statements, really? No one has determined them to any degree of precision, and until they do, you have to use something like a base rate.

Just as astronomers start out with fairly high probabilities for the collision of near-earth asteroids, and only end up with low probabilities after very careful calculation, you would have to start out with a fairly high prior for Islam, or reincarnation, or whatever, and you would only be justified in holding an extreme probability after careful calculation... which I don't believe you've done. Certainly I haven't.

Apart from the complexity, there is also the issue of evidence. We've been assuming all along that there is no evidence for Islam, or reincarnation, or whatever. Certainly it's true that there isn't much. But that there is literally no evidence for such things simply isn't so. The main thing is that we aren't motivated to look at the little evidence that there is. But if you intend to assign probabilities to that degree of precision, you are going to have to take into account every speck of evidence.

Comment author: [deleted] 24 August 2010 04:12:37PM 0 points [-]

I thought the salient feature of Islam was that many people believed it, not that it has less complexity than I thought, or more evidence in its favor than I thought. That might be, but I'm not interested in discussing it.

I don't "feel" beliefs strongly or weakly. Sometimes probability calculations help me with fear and other emotions, sometimes they don't. Again, I'm not interested in discussing it.

So tell me something about how important it is that many people believe in Islam.

Comment author: [deleted] 21 August 2010 07:54:24AM 1 point [-]

The product of two probabilities above your threshold-for-overconfidence can be below your threshold-for-overconfidence. Have you at least thought this through before?

For instance, the claim "there is a God" is not that much less spectacular than the claim "there is a God, and he's going to make the next 1000 times you flip a coin turn up heads." If one-in-a-billion is a lower bound for the probability that God exists, then one-in-a-billion-squared is a generous lower bound for the probability that the next 1000 times you flip a coin will turn up heads. (One-in-a-billion-squared is about 2-to-the-sixty). You're OK with that?
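
The arithmetic behind that bound checks out:

    import math
    p = 1e-9                   # the one-in-a-billion lower bound
    print(p * p)               # 1e-18: the product falls far below the bound
    print(math.log2(p * p))    # ~-59.8, i.e. about 2-to-the-minus-60 as claimed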

Comment author: Unknowns 21 August 2010 03:12:20PM *  1 point [-]

Yes. As long as you think of some not-too-complicated scenario where the one would lead to the other, that's perfectly reasonable. For example, God might exist and decide to prove it to you by effecting that prediction. I certainly agree this has a probability of at least one in a billion squared. In fact, suppose you actually get heads the next 60 times you flip a coin, even though you are choosing different coins, it is on different days, and so on. By that point you will be quite convinced that the heads are not independent, and that there is quite a good chance that you will get 1000 heads in a row.

It would be different of course if you picked a random series of heads and tails: in that case you still might say that there is at least that probability that someone else will do it (because God might make that happen), but you surely cannot say that it had that probability before you picked the random series.

This is related to what I said in the torture discussion, namely that explicitly describing a scenario automatically makes it far more probable that it will actually happen than it was before you described it. So it isn't a problem if the probability of 1000 heads in a row is greater than 1 in 2-to-the-1000. Any series you can mention would be more likely than that, once you have mentioned it.

Also, note that there isn't a problem if the 1000 heads in a row is lower than one in a billion, because when I made the general claim, I said "a claim that significant number of people accept as likely true," and no one expects to get the 1000 heads.

Comment author: [deleted] 21 August 2010 05:13:54PM *  0 points [-]

Probabilities should sum to 1. You're saying moreover that probabilities should not be lower than some threshold. Can I get you to admit that there's a math issue here that you can't wave away, without trying to fine-tune my examples? If you claim you can solve this math issue, great, but say so.

Edit: -1 because I'm being rude? Sorry if so, the tone does seem inappropriately punchy to me now. -1 because I'm being stupid? Tell me how!

Comment author: Unknowns 21 August 2010 06:57:29PM 1 point [-]

I set a lower bound of one in a billion on the probability of "a natural language claim that a significant number of people accept as likely true". The number of such mutually exclusive claims is surely far less than a billion, so the math issue will resolve easily.

Yes, it is easy to find more than a billion claims, even ones that some people consider true, but they are not mutually exclusive claims. Likewise, it is easy to find more than a billion mutually exclusive claims, but they are not ones that people believe to be true, e.g. no one expects 1000 heads in a row, no one expects a sequence of five hundred successive heads-tails pairs, and so on.

I didn't downvote you.

Comment author: [deleted] 21 August 2010 09:03:22PM 0 points [-]

Maybe I see. You are updating on the fact that many people believe something, and are saying that P(A|many people believe A) should not be too small. Do you agree with that characterization of your argument?

In that case, we will profitably distinguish between P(A|no information about how many people believe A) and P(A|many people believe A). Is there a compact way that I can communicate something like "Excepting/not updating on other people's beliefs, P(God exists) is very small"? If I said something like that would you still think I was being overconfident?

Comment author: Unknowns 22 August 2010 06:17:18AM 0 points [-]

This is basically right, although in fact it is not very profitable to speak of what the probability would be if we didn't have some of the information that we actually have. For example, the probability of this sequence of ones and zeros -- 0101011011101110 0010110111101010 0100010001010110 1010110111001100 1110010101010000 -- being chosen randomly, before anyone has mentioned this particular sequence, is one in 2 to the 80. Yet I chose it randomly, using a random number generator (not a pseudo-random number generator, either). But I doubt that you will conclude that I am certainly lying, or that you are hallucinating. Rather, as Robin Hanson points out, extraordinary claims are extraordinary evidence. The very fact that I write down this improbable sequence is extremely extraordinary evidence that I have chosen it randomly, despite the huge improbability of that random choice. In a similar way, religious claims are extremely strong evidence in favor of what they claim; naturally, just as if I hadn't written the number, you would never believe that I might choose it randomly, in the same way, if people didn't make religious claims, you would rightly think them to be extremely improbable.
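
A toy model of why the specific string carries almost no weight by itself; the prior and the "made it up" likelihood below are assumptions for illustration:

    # H = "the author chose the 80-bit string with a true RNG". The exact
    # string has probability 2^-80 under H, but a made-up string is also one
    # of ~2^80 possibilities, so the improbability cancels; the prior decides.
    prior_h = 0.5                    # assumed prior that a true RNG was used
    p_s_given_h = 2.0 ** -80
    p_s_given_not_h = 2.0 ** -80     # assumed: fabrication is no more predictable
    num = p_s_given_h * prior_h
    print(num / (num + p_s_given_not_h * (1 - prior_h)))   # 0.5: the 2^-80s cancel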

Comment author: [deleted] 22 August 2010 08:54:53PM 1 point [-]

It is always profitable to give different concepts different names.

Let GM be the assertion that I'll one day play guitar on the moon. Your claim is that this ratio

P(GM|I raised GM as a possibility)/P(GM)

is enormous. Bayes theorem says that this is the same as

P(I raised GM as a possibility|GM)/P(I raised GM as a possibility)

so that this second ratio is also enormous. But it seems to me that both numerator and denominator in this second ratio are pretty medium-scale numbers--in particular the denominator is not minuscule. Doesn't this defeat your idea?
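
The identity itself is easy to verify with toy numbers; every value below is hypothetical (the 1.825e-5 figure anticipates the estimate Unknowns derives in the reply):

    # Check P(GM|R)/P(GM) == P(R|GM)/P(R), where R = "I raised GM as a possibility".
    p_gm = 1e-12                   # assumed prior on playing guitar on the moon
    p_r_given_gm = 0.5             # if it will happen, it likely gets mentioned
    p_r_given_not_gm = 1.825e-5    # chance of raising it anyway (see the reply)
    p_r = p_r_given_gm * p_gm + p_r_given_not_gm * (1 - p_gm)
    print(p_r_given_gm * p_gm / p_r / p_gm)   # ~27,400
    print(p_r_given_gm / p_r)                 # the same ratio, as Bayes' theorem requires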

Comment author: Unknowns 23 August 2010 12:56:37AM *  0 points [-]

The evidence contained in your asserting GM would be much stronger than the evidence contained in your raising the possibility.

Still, there is a good deal of evidence contained in your raising the possibility. Consider the second ratio: the numerator is quite high, probably more than .5, since in order to play guitar on the moon, you would have to bring a guitar there, which means you'd probably be thinking about it.

The denominator is in fact quite small. If you randomly raise one outlandish possibility of performing some action in some place, each day for 50 years, and there are 10,000 different actions (I would say there are at least that many), and 100,000 different places, then the probability of raising the possibility will be 18,250/(10,000 x 100,000), which is 0.00001825, which is fairly small. The actual probability is likely to be even lower, since you may not be bringing up such possibilities every day for 50 years. Religious claims are typically even more complicated than the guitar claim, so the probability of raising their possibility is even lower.

--one more thing: I say that raising the possibility is strong evidence, not that the resulting probability is high: it may start out extremely low and end up still very, very low, going from say one in a googol to one in a sextillion or so. It is when you actually assert that it's true that you raise the probability to something like one in a billion or even one in a million. Note however that you can't refute me by now going on to assert that you intend to play a guitar on the moon; if you read Hanson's article in my previous link, you'll see that he shows that assertions are weak evidence in particular cases, namely in ones in which people are especially likely to lie: and this would be one of them, since we're arguing about it. So in this particular case, if you asserted that you intended to do so, it would only raise the probability by a very small amount.

Comment author: [deleted] 23 August 2010 03:10:38AM 0 points [-]

I understand that you think the lower bound on probabilities for things-that-are-believed is higher than the lower bound on probabilities for things-that-are-raised-as-possibilities. I am fairly confident that I can change your mind (that is, convince you not to impose lower bounds like this at all), and even more confident that I can convince you that imposing lower bounds like this is mathematically problematic (that is, there are bullets to be bitten) in ways that hadn't occurred to you a few days ago.

I do not see one of these bounds as more or less sound than the other, but am focusing on the things-that-are-raised-as-possibilities bound because I think the discussion will go faster there.

More soon, but tell me if you think I've misunderstood you, or if you think you can anticipate my arguments. I would also be grateful to hear from whoever is downvoting these comments.

Comment author: multifoliaterose 21 August 2010 05:33:43AM 0 points [-]

My estimate does come from some effort at calibration, although there's certainly more that I could do. Maybe I should have qualified my statement by saying "this estimate may be a gross overestimate or a gross underestimate."

In any case, I was not being disingenuous or flippant. I have carefully considered the question of how likely it is that Eliezer will be able to play a crucial role in an FAI project if he continues to exhibit a strategy qualitatively similar to his current one, and my main objection to SIAI's strategy is that I think it extremely unlikely that Eliezer will be able to have an impact if he proceeds as he has up until this point.

I will be detailing why I don't think that Eliezer's present strategy of working toward an FAI is a fruitful one in a later top-level post.

Comment author: steven0461 21 August 2010 05:47:06AM 4 points [-]

Maybe I should have qualified my statement by saying "this estimate may be a gross overestimate or a gross underestimate."

It sounds, then, like you're averaging probabilities geometrically rather than arithmetically. This is bad!
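
A minimal illustration of the difference, with a hypothetical spread of candidate estimates:

    import math
    estimates = [1e-12, 1e-9, 1e-6, 1e-2]   # candidate probabilities, assumed
    arith = sum(estimates) / len(estimates)
    geom = math.exp(sum(math.log(p) for p in estimates) / len(estimates))
    print(arith)   # ~2.5e-3: dominated by the largest candidate
    print(geom)    # ~5.6e-8: far smaller -- the averaging steven0461 warns against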

Comment author: multifoliaterose 21 August 2010 05:51:37AM *  1 point [-]

I understand your position and believe that it's fundamentally unsound. I will have more to say about this later.

For now I'll just say that the arithmetical average of the probabilities that I imagine I might ascribe to Eliezer's current strategy resulting in an FAI is 10^(-9).