fghjgfu comments on Open Thread February 25 - March 3 - Less Wrong
A little bit of How An Algorithm Feels From Inside:
Why is the Monty Hall problem so horribly unintuitive? Why does it feel like the two remaining doors are equally likely to hide the car (1/2 + 1/2) when actually they are not (1/3 + 2/3)?
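For anyone who wants to check the unintuitive split directly, here is a quick Monte Carlo sketch (my own illustration, not from the thread) of the standard game, where the host knowingly opens a goat door:

```python
import random

def monty_hall(switch, trials=100_000):
    """Simulate the standard Monty Hall game; return the win rate."""
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)    # door hiding the car
        pick = random.randrange(3)   # contestant's first pick
        # Host opens a door that is neither the pick nor the car.
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            # Switch to the one remaining unopened door.
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials
```

Running `monty_hall(False)` gives roughly 1/3 and `monty_hall(True)` roughly 2/3, matching the counterintuitive answer.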
Here are the relevant bits from the Wikipedia article:
[...]
The biases listed in the last paragraph may explain why people choose not to switch doors, but what explains the "equal probability" intuition? Do you have any insight into this?
Another data point is the counterintuitiveness of searching a desk: with each drawer you open while looking for something, the probability of finding it in the next drawer increases, but your probability of ever finding it decreases. The difference seems to whipsaw people; see http://www.gwern.net/docs/statistics/1994-falk
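The drawer result can be made concrete with a small sketch (my own parameters, not from the paper): assume prior probability `p` that the item is in the desk at all, uniform over `n` drawers, and condition on the first `k` drawers coming up empty:

```python
def drawer_probs(p=0.8, n=8):
    """For each number k of empty drawers already opened, return
    (k, P(next drawer has it), P(it's in the desk at all)),
    both conditioned on the first k drawers being empty."""
    rows = []
    for k in range(n):
        p_k_empty = 1 - k * p / n                   # P(first k drawers empty)
        p_next = (p / n) / p_k_empty                # next drawer, given k misses
        p_ever = (p * (n - k) / n) / p_k_empty      # still in desk, given k misses
        rows.append((k, p_next, p_ever))
    return rows
```

As `k` grows, `p_next` rises while `p_ever` falls, which is exactly the whipsaw: each empty drawer makes the next drawer a better bet while making the whole search look more hopeless.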
A bit late, but I think this part of your article was most relevant to the Monty Hall problem:
People probably don't distinguish between their personal probability of the target event and the probabilities of the doors. It feels like the probability of there being a car behind a door is a parameter that belongs to the door or to the car, however you want to phrase it. Since you're only given information about what's behind the doors, and that information can't actually change the reality of what's behind the doors, it feels like the probability can't change either.
I think the Monty Hall problem closely resembles a more natural one in which the probability is 1/2: namely, the one where the host is your opponent and chooses whether to offer you the chance to switch. So evolutionarily-optimized instincts tell us the probability is 1/2.
I'd say it's that it closely resembles the one where the host has no idea which door has the car in it, and picks a door at random.
I do not think this is correct. First, the host should only offer you the chance to switch if you are winning, so the chance should be 0. Second, this example seems too contrived to be something that we would have evolved a good instinct about.
Unless they're trying to trick you. The problem collapses to a yes-or-no question: can one of you guess the level the other one of you is thinking on?
Um, no, the only Nash equilibria are where you never accept the deal. If you ever accept it at all, then they will only offer it when it hurts you.
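The adversarial-host case is easy to check with a simulation (again my own sketch, not from the thread): if the host offers the switch only when your first pick is the car, accepting the offer always loses, and the best you can do is ignore it and keep your 1/3:

```python
import random

def adversarial_monty(switch, trials=100_000):
    """Host offers the switch ONLY when the contestant's first pick
    is the car; otherwise the game ends with the initial pick."""
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)
        pick = random.randrange(3)
        if pick == car:
            # Host opens a goat door and offers the switch.
            opened = next(d for d in range(3) if d != pick and d != car)
            if switch:
                pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials
```

With `switch=True` the win rate is exactly 0 (you only get offers while winning, and you switch away); with `switch=False` it is about 1/3, consistent with never accepting being the equilibrium play.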
I'd probably broaden this beyond 1/2. I think the base case is that the host gives you a chance to gamble on a question or test of skill, and the result depends purely on the player. The swap-box scenario is then an extreme case of that, where the result depends less and less on the skill of the player, eventually reaching a 50% chance of winning. I wouldn't say evolutionary-optimised, but maybe familiarity with game-show tropes falling somewhere along this scale.
Monty Hall is then a twist on this extreme case, which pattern-matches to the more common 50% case with no allowance for the effect of the host's knowledge.