orthonormal comments on Open Thread: March 2010 - Less Wrong

5 Post author: AdeleneDawner 01 March 2010 09:25AM




Comment author: Alicorn 02 March 2010 01:56:20AM *  2 points [-]

Yes, but in this situation you have so little information that .5 doesn't seem remotely cautious enough. You might as well ask the members of Strigli as they land on Earth what their probability is that the Red Sox will win at a spelling bee next year - does it look obvious that they shouldn't say 50% in that case? .5 isn't the right prior - some eensy prior that any given possibly-made-up alien thing will happen, adjusted up slightly to account for the fact that they did choose this question to ask over others, seems better to me.

Comment author: orthonormal 02 March 2010 02:17:11AM 3 points [-]

You might as well ask the members of Strigli as they land on Earth what their probability is that the Red Sox will win at a spelling bee next year - does it look obvious that they shouldn't say 50% in that case?

Unless there's some reason they'd suspect it's more likely for us to ask them a trick question whose answer is "No" than one whose answer is "Yes" (although it is probably easier to create trick questions whose answer is "No", and the Striglian could take that into account), 50% isn't a bad probability to assign if asked a completely foreign Yes-No question.

Basically, I think that this and the other problems of this nature discussed on LW are instances of the same phenomenon: when the space of possibilities (for alien culture, Omega's decision algorithm, etc) grows so large and so convoluted as to be utterly intractable for us, our posterior probabilities should be basically our ignorance priors all over again.

Comment author: Alicorn 02 March 2010 02:31:10AM *  7 points [-]

It seems to me that even if you know that there is a Doldun game, played by exactly two teams, of which one is Strigli, which game exactly one team will entirely win, 50% is as high as you should go. If you don't have that much precise information, then 50% is an extremely generous upper bound for how likely you should consider a Strigli win. The space of all meaningful false propositions is hugely larger than the space of all meaningful true propositions. For every proposition that is true, you can also contradict it directly, and then present a long list of indirectly contradictory statements. For example: it is true that I am sitting on a blue couch. It is false that I am not on a blue couch - and also false that I am on a red couch, false that I am trapped in carbonite, false that I am beneath the Great Barrier Reef, false that I'm in the Sea of Tranquility, false that I'm equidistant from the Sun and the star Polaris, false that... Basically, most statements you can make about my location are false, and therefore the correct answer to most yes-or-no questions you could ask about my location is "no".

Basically, your prior should be that everything is almost certainly false!

Comment author: cousin_it 09 March 2010 03:51:33PM 2 points [-]

The odds of a random sentence being true are low, but the odds of the alien choosing to give you a true sentence are higher.

Comment author: thomblake 09 March 2010 08:13:00PM 0 points [-]

A random alien?

Comment author: bogdanb 12 March 2010 01:29:31PM *  0 points [-]

No, just a random alien that (1) I encountered and (2) asked me a question.

The two conditions above enormously restrict the general class of "possible" random aliens. Every condition that restricts possibilities brings information, though I can't see a way of properly encoding this information as a prior about the answer to said question.

[ETA:] Note that I don't necessarily accept cousin_it's assertion; I'm just stating my interpretation of it.

Comment author: orthonormal 02 March 2010 02:41:11AM *  0 points [-]

Well, let's say I ask you whether all "fnynznaqre"s are "nzcuvovna"s. Prior to using rot13 on this question (and hypothesizing that we hadn't had this particular conversation beforehand), would your prior really be as low as your previous comment implies?

(Of course, it should probably still be under 50% for the reference class we're discussing, but not nearly that far under.)
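For anyone who wants to decode the nouns themselves: rot13 shifts each letter 13 places along the alphabet, so applying it twice is the identity, and encoding and decoding are the same operation. A minimal sketch using Python's standard-library codecs module (the printing is left commented out so as not to spoil the question):

```python
import codecs

# rot13 replaces each letter with the one 13 places along the alphabet,
# wrapping around; non-letters pass through unchanged. Since 13 + 13 = 26,
# the same call both encodes and decodes.
def decode(word: str) -> str:
    return codecs.encode(word, "rot13")

nouns = ["fnynznaqre", "nzcuvovna"]
decoded = [decode(w) for w in nouns]
# print(decoded)  # uncomment if you don't mind spoiling the question above
```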

Comment author: Alicorn 02 March 2010 02:43:13AM *  1 point [-]

Given that you chose this question to ask, and that I know you are a human, then screening off this conversation I find myself hovering at around 25% that all "fnynznaqre"s are "nzcuvovna"s. We're talking about aliens. Come on, now that it's occurred to you, wouldn't you ask an E.T. if it thinks the Red Sox have a shot at the spelling bee?

Comment author: orthonormal 02 March 2010 03:00:15AM 0 points [-]

Yes, but I might just as easily choose a question whose answer was "Yes" if I thought that a trick question might be too predictable a strategy.

1/4 seems reasonable to me, given human psychology. If you expand the reference class to all alien species, though, I can't see why the likelihood of "Yes" should go down - that would generally require more information, not less, about what sort of questions the other is liable to ask.

Comment author: Alicorn 02 March 2010 03:03:50AM 1 point [-]

Okay, if you have some reason to believe that the question was chosen to have a specific answer, instead of being chosen directly from questionspace, then you can revise up. I didn't see a reason to think this was going on when the aliens were asking the question, though.

Comment author: orthonormal 02 March 2010 03:07:13AM *  0 points [-]

Hmm. As you point out, questionspace is biased towards "No" when represented in human formalisms (if weighting by length, it's biased by nearly the length of the "not" symbol), and it would seem weird if it weren't so in an alien representation. Perhaps that's a reason to revise down and not up when taking information off the table. But it doesn't seem like it should be more than (say) a decibel's worth of evidence for "No".

ETA: I think we each just acknowledged that the other has a point. On the Internet, no less!

Comment author: Alicorn 02 March 2010 03:10:04AM 1 point [-]

ETA: I think we each just acknowledged that the other has a point. On the Internet, no less!

Isn't it awesome when that happens? :D

Comment author: vinayak 02 March 2010 05:49:23AM 0 points [-]

I think one important thing to keep in mind when assigning prior probabilities to yes/no questions is that the probabilities you assign should at least satisfy the axioms of probability. For example, you should definitely not end up assigning equal probabilities to the following three events -

  1. Strigli wins the game.
  2. It rains immediately after the match is over.
  3. Strigli wins the game AND it rains immediately after the match is over.

I am not sure if your scheme ensures that this does not happen.
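The constraint can be made concrete: a scheme that answers 0.5 to every yes/no question is incoherent, because the axioms force P(A) = P(A and B) + P(A and not-B). A minimal sketch (the `naive_prior` function is hypothetical, standing in for the "always say 50%" scheme under discussion):

```python
def naive_prior(_question):
    """The 'ignorance' scheme under discussion: 0.5 for every yes/no question."""
    return 0.5

p_a = naive_prior("Strigli wins the game")
p_b = naive_prior("it rains immediately after the match")
p_a_and_b = naive_prior("Strigli wins AND it rains after the match")
p_a_and_not_b = naive_prior("Strigli wins AND it does not rain after the match")

# Axiom check 1: a conjunction can never be more probable than either conjunct.
# The naive scheme happens to pass this one (0.5 <= min(0.5, 0.5)).
assert p_a_and_b <= min(p_a, p_b)

# Axiom check 2: P(A) must equal P(A and B) + P(A and not-B).
# Here the naive scheme fails outright: 0.5 != 0.5 + 0.5.
print(abs(p_a - (p_a_and_b + p_a_and_not_b)))  # 0.5, a coherence violation
```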

Also, to me, Bayesianism sounds like an iterative way of forming consistent beliefs, where in each step you gather some evidence and update your probability estimates for the truth or falsity of various hypotheses accordingly. But I don't understand how exactly to start. Or in other words, consider the very first iteration of this whole process, where you do not have any evidence whatsoever. What probabilities do you assign to the truth or falsity of different hypotheses?

One way I can imagine is to assign each hypothesis a prior probability proportional to 2 raised to the negative of its Kolmogorov complexity (the Solomonoff prior). The good thing about this prior is that it satisfies the axioms of probability (strictly speaking, it forms a semimeasure that can be normalized). But I have only seen Kolmogorov complexity defined for strings and such. I don't know how to define the Kolmogorov complexity of complicated things like hypotheses. Also, even if there is a way to define it, I can't completely convince myself that it gives a correct prior probability.
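Kolmogorov complexity itself is uncomputable, but the idea can be illustrated with a crude computable stand-in: the compressed length of a hypothesis's string description. The sketch below (the zlib proxy and all names are illustrative choices, not part of the comment above) assigns each hypothesis weight 2^(-proxy) and normalizes over a finite list, so the resulting numbers satisfy the axioms:

```python
import zlib

def complexity_proxy(hypothesis: str) -> int:
    """Stand-in for uncomputable Kolmogorov complexity: the byte length
    of the zlib-compressed description. Crude, but computable."""
    return len(zlib.compress(hypothesis.encode("utf-8")))

def prior(hypotheses):
    """Weight each hypothesis by 2**(-proxy length), then normalize.
    Simpler (more compressible) descriptions get higher prior probability."""
    weights = {h: 2.0 ** -complexity_proxy(h) for h in hypotheses}
    total = sum(weights.values())
    return {h: w / total for h, w in weights.items()}

ps = prior([
    "Strigli wins the Doldun game",
    "Strigli wins the Doldun game and it rains immediately afterwards",
])
# The longer conjunction gets a strictly smaller prior, and the
# probabilities sum to 1 - consistent with vinayak's three-event example.
assert abs(sum(ps.values()) - 1.0) < 1e-9
```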

Comment author: JGWeissman 02 March 2010 02:38:59AM *  0 points [-]

and also false that I am on a red couch,

But it is true that you are not on a red couch.

Negation is a one-to-one map between true and false propositions.

Comment author: Alicorn 02 March 2010 02:40:46AM 3 points [-]

Since you can understand the alien's question except for the nouns, presumably you'd be able to tell if there was a "not" in there?

Comment author: JGWeissman 02 March 2010 02:52:49AM *  2 points [-]

Yes, I think you have made a convincing argument that, given that a proposition does not involve negation (as in the alien's question), it is more likely to be false than true. (At least, if you have a prior for being presented with questions that penalize complexity. The sizes of the spaces of true and false propositions, however, are the same countable infinity.) (Sometimes I see claims in isolation, and so miss that a slightly modified claim is more correct and still supports the same larger claim.)

ETA: We should also note the absence of any disjunctions. It is also true that Alicorn is sitting on a blue couch or a red couch. (Well, maybe not, some time has passed since she reported sitting on a blue couch. But that's not the point.)

This effect may be screened off if, for example, you have a prior that the aliens first choose whether the answer should be yes or no, and then choose a question to match the answer.