BrienneYudkowsky comments on Probability is in the Mind - Less Wrong

Post author: Eliezer_Yudkowsky 12 March 2008 04:08AM


Comment author: Eliezer_Yudkowsky 24 May 2013 03:26:43AM 6 points

Very low, because B9 has to hypothesize a causal framework involving colors without any way of observing anything but quantitatively varying luminosities. In other words, it must guess that it's looking at the average of three variables instead of at one variable. This may sound simple, but there are many other hypotheses that could also be true: two variables, four variables, or, most likely of all, one variable. B9 will be surprised. This is right and proper. Most physics theories you make up with no evidence behind them will be wrong.

Comment author: BrienneYudkowsky 24 May 2013 04:30:33AM 2 points

I think I'm confused. We're talking about something that's never even heard of colors, so there shouldn't be anything in the mind of the robot related to "blue" in any way. This ought to be like the prior probability from your perspective that zorgumphs are wogle. Now that I've said the words, I suppose there's some very low probability that zorgumphs are wogle, since there's a probability that "zorgumph" refers to "cats" and "wogle" to "furry". But when you didn't even have those words in your head anywhere, how could there have been a prior? How could B9's prior be "very low" instead of "nonexistent"?

Comment author: hairyfigment 24 May 2013 05:35:25AM 3 points

Eliezer seems to be substituting the actual meaning of "blue". Now, if we present the AI with the English statement and ask it to assign a probability, my first impulse is to say it should use a complexity/simplicity prior based on length. This might actually be correct, if shorter message-length corresponds to greater frequency of use. (ETA: you might not even be able to distinguish words within the sentence if faced with a claim in a totally alien language.)
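A toy sketch of the length-based prior hairyfigment gestures at: weight each candidate message geometrically by its length, so shorter strings get more prior mass. The function name, the alphabet size, and the geometric decay are all illustrative assumptions here, not anything from the thread (a real Solomonoff-style prior would weight by program length, not surface string length, and would need normalization over all strings).

```python
def length_prior(message: str, alphabet_size: int = 27) -> float:
    """Unnormalized length-based prior: probability decays geometrically
    with message length. Hypothetical illustration only -- a crude stand-in
    for a complexity prior, using surface length instead of program length.
    """
    # Each extra symbol multiplies the prior weight by 1/alphabet_size.
    return alphabet_size ** (-len(message))

# Shorter claims get strictly more prior weight than longer ones:
assert length_prior("blue") > length_prior("zorgumphs are wogle")
```

On this toy scheme an alien sentence's prior depends only on how long it is, which matches the point in the parenthetical: with no way to segment the sentence into words, length is about the only feature left to condition on.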

Comment author: TheOtherDave 24 May 2013 07:08:17AM 0 points

Well, if nothing else, when I ask B9 "is your ball blue?", I'm thereby providing only a finite amount of evidence that "blue" refers to a property that balls can have or not have. So if B9's prior on "blue" referring to anything at all is vanishingly low, then B9 will continue to believe, even after being asked the question, that "blue" doesn't refer to anything. Which doesn't seem like terribly sensible behavior. That sets a floor on how low the prior on "'blue' is meaningful" can be.