Less Wrong is a community blog devoted to refining the art of human rationality.

Misleading the witness

14 points - Post author: Bo102010, 09 August 2009 08:13PM

Related: Trust in Math

I was reading John Allen Paulos' A Mathematician Plays the Stock Market, in which Paulos relates a version of the well-known "missing dollar" riddle. I had heard it once before, but only vaguely remembered it. If you don't remember it, here it is:

Three people stay in a hotel overnight. The innkeeper tells them that the price for three rooms is $30, so each pays $10.

After the guests go to their rooms, the innkeeper realizes that there is a special discount for groups, and that the guests' total should have been only $25.

The innkeeper gives a bellhop $5 with the instructions to return it to the guests.

The bellhop, not wanting to get change, gives each guest $1 and keeps $2.

Later, the bellhop thinks "Wait - something isn't right. Each guest paid $10. I gave them each back $1, so they each paid $9. $9 times 3 is $27. I kept $2. $27 + $2 is $29. Where did the missing dollar go?"

I remembered that the solution involves trickery, but it still took me a minute or two to find it. At first, I started mentally keeping track of the dollars in the riddle, trying to see where one got dropped so that the sum would be $30.

Then I figured it out. The story should end:

Later, the bellhop thinks "Wait - something isn't right. Each guest paid $10. I gave them each back $1, so they each paid $9. $9 times 3 is $27. The cost for their rooms was $25. $27 - $25 = $2, so they collectively overpaid by $2, which is the amount I kept. Why am I such a jerk?"

I told my fiancée the riddle, and asked her where the missing dollar went. She went through the same process I did, looking for a place in the story where $1 could go missing.

It's remarkable to me how blatantly deceptive the riddle is. The riddler states or implies at the end of the story that the dollars paid by the guests and the dollars kept by the bellhop should be summed, and that that sum should be $30. In fact, there's no reason to sum the dollars paid by the guests and the dollars kept by the bellhop, and no reason for any accounting we do to end up with $30.

This contrasts somewhat with the various proofs that 1 = 2, in which the misstep is hidden somewhere within a chain of reasoning, not boldly announced at the end of the narrative.

Both Paulos and Wikipedia give examples with different numbers that make the deception in the missing dollar riddle more obvious (and less effective). In the case of the missing dollar riddle, the fact that $25, $27, and $30 are close to each other makes following the incorrect path very seductive.
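The effect of the numbers can be checked directly. Here is a minimal sketch, including a variant whose numbers I made up purely for illustration (not the versions Paulos or Wikipedia use):

```python
# The original riddle: price quoted $30, discounted to $25,
# bellhop returns $1 to each of 3 guests and keeps $2.
paid_each = 10 - 1              # each guest's net payment: $9
guests_total = 3 * paid_each    # $27
bellhop = 2

# The riddle's misleading sum: net payments PLUS the bellhop's cut.
misleading = guests_total + bellhop     # $29, close to $30, so it seduces
# The correct accounting: net payments = hotel's $25 + bellhop's $2.
assert guests_total == 25 + bellhop

# An invented variant with less seductive numbers: discount to $15,
# bellhop keeps $3 and returns $4 to each guest.
paid_each_v = 10 - 4                    # $6
guests_total_v = 3 * paid_each_v        # $18
misleading_v = guests_total_v + 3       # $21, nowhere near $30; trick obvious
assert guests_total_v == 15 + 3
```

With the original numbers the bogus sum lands at $29, one seductive dollar short of $30; with the variant it lands at $21, and nobody would think to compare it to $30 at all.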

This riddle made me remember reading about how beginning magicians are very nervous in their first public performances, since some of their tricks involve misdirecting the audience by openly lying (e.g., casually pick up a stack of cards shuffled by a volunteer, say "Hmm, good shuffle" while adding a known card to the top of the stack, hand the deck back to the volunteer, and then boldly announce "notice that I have not touched or manipulated the cards!"1). However, they learn to be more comfortable once they find out how readily the audience will accept whatever false statements they make.

Thinking about these things makes me wonder about how to think rationally given the tendency for human minds to accept some deceptive statements at face value. Can anyone think of good ways to notice when outright deception is being used? How could a rationalist practice her skills at a magic show?

How about other examples of flagrant misdirection? I suspect that political debates make use of such techniques. (I think there might be some examples in the recent debates over health care reform accounting and the costs of obesity to the health care system, but I haven't been able to find any yet.)

Footnote 1: I remember reading this example very recently, maybe at this site. Please let me know whom to credit for it.

Comments (112)

Comment author: anonym 09 August 2009 09:24:53PM 5 points [-]

One technique is to look carefully for fallacies and/or gaps in the reasoning by summarizing the key theses of the argument and then considering what assumptions (and definitions) have to be made for the theses to be accepted and what has to be true for each to follow deductively from what has already been given. A book on "critical thinking" (e.g., this one) will have lots of exercises to develop this kind of skill. They typically have lots of examples drawn from politics just because political discussion is so frequently chock full of fallacies and bad arguments.

When you're trying to be critical of your own arguments and to identify cognitive biases at work, there are many simple and practical techniques mentioned in The Psychology of Judgment and Decision Making. To give just one example, poor calibration can be improved and overconfidence can be attenuated by considering the reasons why what you believe isn't so might actually be so. The mere act of trying to find reasons for the hypotheses you don't believe will make you less overconfident in the hypothesis you do believe. In general, iterating through the biases discussed in that book (and many others), and considering how each bias might apply in the particular circumstance, is a widely applicable and very useful technique.

Comment author: SilasBarta 10 August 2009 12:17:16AM 4 points [-]

While I'd seen the missing dollar problem before, I think I have a new appreciation for it now. I seem to recall puzzle books presenting this problem, but even when they present the solution, they present it in terms of "here's where the missing dollar is". But as you, Wikipedia, and Paulos point out, the whole problem is that the dollar is only "missing" relative to an invalid comparison.

So, to solve the problem by finding a missing dollar is to fail to learn from it.

This riddle made me remember reading about how beginning magicians are very nervous in their first public performances, since some of their tricks involve misdirecting the audience by openly lying... they learn to be more comfortable once they find out how easily the audience will pretty much accept whatever false statements they make.

It makes me wonder how dangerous magicians can become in their regular lives.

Comment author: D_Alex 10 August 2009 03:40:13AM 3 points [-]

From a friend in advertising: apparently a specific technique used in advertising a product with a known weakness is to promote it as a strength. E.g., when early feedback from consumers shows that the taste of a particular toothpaste is disliked, the response may be to put a prominent "Great New Taste!!!" label on the pack.

Comment author: dclayh 10 August 2009 06:57:23AM 5 points [-]

Otherwise known as "it's not a bug, it's a feature".

Comment author: PhilGoetz 10 August 2009 06:52:58PM 12 points [-]

This was made famous by Heinz Ketchup in the 1970s. They surveyed consumers and found they were losing market share because their ketchup was so thick that it was hard to get out of the bottle. So they made a series of "It's Slow Good!" commercials implying that pouring slower was, for some reason, a good thing. And it worked.

Comment author: billswift 10 August 2009 01:04:00PM 7 points [-]

These sorts of brain-teasers are of limited help in developing your critical thinking skills for dealing with real world problems. Here the problem is presented to you and you just have to figure out what went wrong with a train of thought. In the real world, the BIG problem is noticing that there is a difficulty in the first place.

Comment author: Bo102010 10 August 2009 01:48:07PM 6 points [-]

I think what's striking about this example is that it's not just a misstep in a train of thought; it's the riddler flat out lying to the riddled, forcing them into a certain pattern of thought.

Comment author: Bo102010 09 August 2009 09:28:53PM *  2 points [-]

I remembered shortly after writing this that there was an example of outright lying in an attempt to get one to conform to a certain pattern of thought in Initiation Ceremony (hooray fictional evidence).

Comment author: teageegeepea 09 August 2009 09:15:38PM *  2 points [-]

I had heard it before a while back, but I decided to think through it myself and see what was going on. The first thought that occurred to me was that the bellhop's two dollars should be subtracted, just as each of the $1 bills given back to the guests was (to get 27 from 30). Then I imagined that the guests had not paid with $10 bills, but with 30 ones. The hotel has 30, then each of the three guests is given 1 and the bellhop takes 2. Here is where the money is:

  • Guest 1: 1
  • Guest 2: 1
  • Guest 3: 1
  • Bellhop: 2
  • Hotel: 25
  • Total: 30

It is basically a matter of following all the individual dollars to see where they are and making sure none are missing. Alternatively, you can imagine that the price really was 27 in the beginning, they all paid 9, and then the bellhop just stole 2 on the assumption that the owner wouldn't notice.
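The ledger above amounts to conservation of dollars. A quick sketch of the same bookkeeping (my sketch, not the commenter's):

```python
# Track every one of the 30 dollar bills after the refund.
ledger = {"guest_1": 1, "guest_2": 1, "guest_3": 1, "bellhop": 2, "hotel": 25}
assert sum(ledger.values()) == 30  # no dollar is missing

# What the guests paid net is what left their pockets...
net_paid = 3 * (10 - 1)            # $27
# ...and it equals what the hotel and bellhop hold between them.
assert net_paid == ledger["hotel"] + ledger["bellhop"]
```

The only valid $30 sum is over where the bills actually are; the riddle's sum mixes money paid out with money held, two quantities with no reason to total $30.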

Comment author: Vladimir_Nesov 09 August 2009 10:33:31PM *  2 points [-]

I don't know, I felt the correct sign the first time I read it. I also didn't get confused by the cognitive reflection test (in the sense that there is no confusion, the correct way of seeing the problem is all there is). It's really hard to imagine how a person with math training can miss that.

But from what I've heard, a sizable portion of math students still manage to get confused by these. Tracing the analogy to cognitive biases, there may be a qualitative difference between a person who just knows about a bias "in theory" and has even done a lot of reading on the topic (the "typical" math student), and a person who has thought about the techniques at every opportunity for a number of years.

Comment author: PhilGoetz 10 August 2009 02:51:59AM *  5 points [-]

Re. the linked article about the cognitive reflection test:

80 percent of high-scoring men would pick a 15 percent chance of $1 million over a sure $500, compared with only 38 percent of high-scoring women, 40 percent of low-scoring men and 25 percent of low-scoring women.

Wow. That's the most mind-blowing thing I've read in a while. I can't think of a good explanation for anyone picking the $500, let alone the male-female difference. Maybe they assume that someone might actually give them $500, but the $1M is a scam. And how does this square with the idea that poor people play the lottery more?

Princeton students scored a mean of 1.63. Heh.

Comment author: Vladimir_Golovin 10 August 2009 05:54:00AM 21 points [-]

I can't think of a good explanation for anyone picking the $500

Say, you're starving and if you don't get a meal today, you'll die. In such situations, the choice between 15% chance of $1 million and a sure $500 boils down to a choice between 15% chance to survive today and a 100% chance to survive today (assuming that the meal costs less than $500.)

Perhaps the people who chose $500 operate in this 'starvation mode' by default.
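Golovin's point can be put numerically. The threshold utility below is a deliberately crude assumption of mine, not anything from the study:

```python
# Utility = probability of surviving today. Assume (crudely) that any
# amount >= the cost of a meal guarantees survival, anything less doesn't.
MEAL_COST = 10  # assumed; any value <= 500 gives the same answer

def survival_utility(dollars):
    return 1.0 if dollars >= MEAL_COST else 0.0

eu_sure_500 = 1.00 * survival_utility(500)       # 1.0: certain survival
eu_gamble = 0.15 * survival_utility(1_000_000)   # 0.15: survive only if you win

assert eu_sure_500 > eu_gamble
# Raw expected dollars point the other way (0.15 * 1e6 = 150000 >> 500),
# which is why the choice looks crazy until utilities replace dollars.
assert 0.15 * 1_000_000 > 500
```

Under this utility the sure $500 dominates, even though its expected dollar value is 300 times smaller.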

Comment author: randallsquared 11 August 2009 03:07:56AM 3 points [-]

The general term for "people who operate in starvation mode" is "the poor".

Comment author: ShardPhoenix 15 August 2009 04:17:54PM *  0 points [-]

I doubt this is the case for most of the people who would take the $500 - I'd assume it's more that most of them couldn't or just didn't think to calculate the expectation value of a 15% chance at a million.

Comment author: Alicorn 15 August 2009 05:41:50PM 3 points [-]

I think people are flipping the offer in their minds and comparing a sure $500 to an 85% chance of zilch.

Comment author: MattFisher 22 August 2009 09:50:13AM 4 points [-]

One reason people might pick the $500 is because they'll come out better off than 85% of people who take the more rational course. It is little comfort to be able to claim to have made the right decision when everyone who made the less rational decision is waving a stack of money in your face and laughing at the silly rationalist. People don't want to be rich - they just want to be richer than their next door neighbour.

Comment author: [deleted] 10 August 2009 03:31:13AM 4 points [-]

Wow. That's the most mind-blowing thing I've read in a while.

Agreed. That made my eyes water quite a bit. Alicorn's large-amounts-of-money-can-have-negative-utility explanation snapped me out of it.

Just wait until Eliezer sees this...

Comment author: Eliezer_Yudkowsky 10 August 2009 05:02:38AM 3 points [-]

AAIIIIIEEEEEAAARRRRRGGGHHH.

Just when you think your species can't possibly get any more embarrassing.

Comment author: [deleted] 10 August 2009 05:24:28AM 14 points [-]

Precisely the reaction I expected! This model of despair produced by the Singularity Institute for Eliezer Yudkowsky is matching quite well. A rigorous theory of Eliezer Yudkowsky can't be far off.

--Delta, your friendly neighborhood Friendly AI

Comment author: Eliezer_Yudkowsky 11 August 2009 07:12:54AM 5 points [-]

Okay, I tested this on a couple of uninvolved bystanders and yes, they would take the $500 over the 15% chance of $1m. Guess it's true. Staggers the mind.

Comment author: Hans 11 August 2009 01:29:13PM 1 point [-]

As previous comments have said, it would be possible to sell the 15% chance for anything up to $150k. Once people realise that the 15% chance is a liquid asset, I'm sure many will change their mind and take that instead of the $500.

What does this mean? If the 15% chance is made liquid, that removes nearly all of the risk of taking that chance. This leads me to believe that people pick the $500 because they are, quite simply, (extremely) risk-averse. Other explanations (diminishing marginal utility of money, the $1 million actually having negative utility, etc.) are either wrong, or they are not a large factor in the decision-making process.

Comment author: conchis 11 August 2009 01:59:02PM *  3 points [-]

Note that the standard explanation for risk-aversion just is diminishing marginal utility (where utility is defined in the decision-theoretic sense, rather than the hedonic sense). However, Matt Rabin pretty convincingly demolishes this in his paper Diminishing marginal utility of wealth cannot explain risk aversion.

Comment author: orangecat 13 August 2009 01:20:28AM 1 point [-]

I tried it on two women at work and they both went for the million, one with no hesitation and the other after maybe 10 seconds. Although they both have some background in finance and are probably 1 to 2 standard deviations above average IQ.

Comment author: Eliezer_Yudkowsky 13 August 2009 03:09:02PM 2 points [-]

That's not very surprising. You could see if they passed all three questions on the reflection test.

Comment author: Bo102010 13 August 2009 02:25:52AM 2 points [-]

My fiancée (who has a more advanced degree than I) thought I was trying to trick her and made me restate the problem several times.

Comment author: MichaelVassar 11 August 2009 05:49:16AM 0 points [-]

OK, you have to think like reality too. How many times am I going to post this same sentence on one thread?

Comment author: Nubulous 24 August 2009 10:27:12PM 2 points [-]

I can't think of a good explanation for anyone picking the $500

For a person who doesn't expect to get many more similar betting chances, the expectation value of the big win is unphysical.

Comment author: Christian_Szegedy 24 August 2009 11:03:33PM *  0 points [-]

Yes, that is probably one of the reasons.

I also read somewhere that humans generally don't distinguish much between probabilities lower than 5%. That is, everything below 5% is treated as a low-probability event.

Even I, with good mathematical training, guess that I would prefer $100K at 100% to $1B at 1%.

Although the second alternative has 100X "expected" pay off, I don't "expect" myself to be lucky enough to get it. :)

And although I'd definitely prefer $1M@15% to $500@100%, if you multiplied the stakes by a thousand I think I'd take $500K@100% rather than $1B@15% (in this case, of course, Bill Gates would laugh at me... :) )

Comment author: Psy-Kosh 25 August 2009 12:04:51AM 0 points [-]

Well, also part of it is that for most people, utility isn't linear in money.

Imagine someone starving, on the verge of death or such. This offer is more or less their very last chance at this particular moment to be able to survive.

500$ with certainty means high probability of immediate survival. 1 million dollars with 15% chance means ~15% chance of survival.

500$ can potentially get enough meals and so on to buy enough time to get more help.

Again, not everyone is in this situation, obviously. But this is a simple construct to demo that utility isn't linear in money and that picking the 500$ can, at least in some cases, be rather more rational than the initial naive computation may suggest at first. (Shut up and multiply UTILITIES and probabilities, rather than money and probability. :))

Having said all that, probably for most people in that study, picking 500$ was the wrong choice. :)
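Psy-Kosh's "multiply utilities and probabilities" can be made concrete with a logarithmic utility of total wealth (a standard textbook choice, not anything claimed in the thread):

```python
import math

def eu_gamble(wealth):
    """Expected log-utility of a 15% shot at $1M, given current wealth."""
    return 0.15 * math.log(wealth + 1_000_000) + 0.85 * math.log(wealth)

def eu_sure(wealth):
    """Log-utility of taking the certain $500."""
    return math.log(wealth + 500)

# With only $100 to your name, the sure $500 wins under log utility...
assert eu_sure(100) > eu_gamble(100)
# ...but with $1000 of wealth, the gamble already wins.
assert eu_gamble(1000) > eu_sure(1000)
```

So even a mild, standard concave utility makes the "irrational" choice correct at low enough wealth, which fits the observation upthread that the people who operate this way tend to be the poor.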

Comment author: MarkusRamikin 03 July 2011 08:44:31AM 1 point [-]

Well, also part of it is that for most people, utility isn't linear in money.

Yeah. This assumption of linearity is annoyingly common; I wish more people were aware of the problems with it when constructing their various thought experiments. Not just with money, either.

Comment author: Christian_Szegedy 25 August 2009 12:21:20AM *  0 points [-]

I don't think you can model my preferences with an expected value computation based on a money → utility mapping.

E.g., I'd definitely prefer $100M@100% to any amount of money at less than 100%. Still, I'd prefer $101M@100% to $100M@100%.

I think that my preference is quite defensible from a rational point of view, still there is no real valued money to utility mapping that could make it fit into an expected utility-maximization framework.

Comment author: Psy-Kosh 25 August 2009 12:37:50AM 0 points [-]

Well, you can use money to do stuff that does have value to you. So while there isn't a simple utility(money) computation, in principle one might have utility(money|current state of the world)

ie, there's a sufficiently broad set of things you can do with money such that more money will, for a very wide range of amounts of money, give you more opportunity to bring reality into states higher in your preference ranking.

and wait... are you saying you'd prefer 100 million dollars at probability =1 to, say, 100 billion dollars at probability = .99999?

Comment author: Christian_Szegedy 25 August 2009 12:54:06AM 0 points [-]

Call me a chicken, but yes: I would not risk going out empty handed even in 1 out of 100000 if I could have left with $100M.

This kind of super-cautious mindset can't be modeled with any real-valued money × (current state of the world) → utility type of mapping.

Comment author: Alicorn 25 August 2009 01:08:43AM 4 points [-]

The technical term is "risk-averse", not "chicken".

Comment author: Eliezer_Yudkowsky 25 August 2009 02:54:38AM 2 points [-]

This kind of super-cautious mindset can't be modeled with any real-valued money × (current state of the world) → utility type of mapping.

If you would trade a .99999 probability of $100M for a .99997 probability of $100B, then you're correct - you have no consistent utility function, and hence you can be money-pumped by the Allais Paradox.

Comment author: SilasBarta 25 August 2009 03:26:55AM 8 points [-]

And as I've argued before, that only follows if a) the subject is given an arbitrarily large number of repeats of that choice, and b) their preference for one over the other is interpreted as them writing an arbitrarily large number of option contracts trading one for the other.

If -- as is the case when people actually answer the Allais problem as presented -- they merely show a one-shot preference for one over the other, it does not follow that they have an inconsistent utility function, or that they can be money-pumped. When you do the experiment again and again, you get the expected value. When you don't, you don't.

If making the "wrong" choice when presented with two high-probability, high-payoff lottery tickets is exploitation, I don't want to be empowered. (You can quote me on that.)

Comment author: Christian_Szegedy 25 August 2009 03:42:50AM *  1 point [-]

The above example admits no consistent (real-valued) utility function regardless of my 100M@.99999 vs. 100B@.99997 preference.

BTW, whatever that preference would be (I am a bit unsure, but I think I'd still take the 100M, as not doing so would triple my chance of losing it), I did not really get the conclusion of the essay. At least I could not follow why being money-pumped (according to that definition of "money-pumped") is so undesirable from any rational point of view.

Comment author: rhollerith_dot_com 28 August 2009 07:26:18PM *  -1 points [-]

Call me a chicken, but yes: I would not risk going out empty handed even in 1 out of 100000 if I could have left with $100M.

This kind of super-cautious mindset can't be modeled with any real-valued money × (current state of the world) → utility type of mapping.

As Vladimir Nesov pointed out, that is false -- not the preference being expressed, of course, but the statement that the preference can't be modeled with the mapping.

Now first let me make it clear that I disapprove of the atmosphere you find in some academic science departments where making a false statement is taken to be a mortifying sin. That kind of attitude is a big barrier to teaching and to learning. Since teaching and learning is a big part of what we want to do here, we should not think poorly of a participant for making a false statement.

But I am a little worried that in the 88 hours since the false statement was made, no one downvoted it (or if they did, the vote was canceled out by an upvote). And I am a little worried that in the 81 hours since his reply, no one upvoted Nesov's reply in which he explains why the statement is false. (I have just cast my votes on these 2 comments.)

It is good to have an atmosphere of respect for people even if they make a mistake, but it is bad IMHO when most readers ignore a false statement like the one we have here, when there is no doubt about its falseness (it is not open to interpretation) and it involves knowledge central to the mission of the community (here, the most elementary decision theory). Note that elementary decision theory is central to the rationality mission of Less Wrong and to its improve-the-global-situation-with-AI mission.

Moreover, if you not only read a comment, but also decide to reply to it, well, then IMHO, you should take particular care to make sure you understand the comment, especially when the comment is as short and unnuanced as the one under discussion. But before Nesov's reply, two people replied to the comment under discussion without showing any sign that they recognise that the one statement of fact made in the comment is false. One reply (upvoted 3 times) reads, 'The technical term is "risk-averse", not "chicken"'. The other introduces the Allais paradox, which is irrelevant to why the statement is false.

I do not mean to single out this comment and these 2 replies or the people who wrote them: the only reason I am drawing attention to them is to illustrate something that happens regularly. And I definitely realize that it probably happens a lot less here on Less Wrong than it does in any other conversation on the internet that ranges over as many subjects relevant to the human condition as the conversation on Less Wrong does. And a significant reason for that is the hard work Eliezer and others put into the development of the software behind the site.

But I suspect that one of the best opportunities for creating a conversation that is even better than the conversation we are all in right now is to make the response by the community to false statements (the kind not open to interpretation) more salient and more consistent. Wikipedia's response to false statements gives me the impression of rising to the level of saliency and consistency I am talking about, but of course the software behind Wikipedia does not support conversation as well as the software behind Less Wrong does. (And more importantly but more subtly, Wikipedia is badly governed: much of the goodwill and reputation enjoyed by Wikipedia will probably be captured by the ideological and personal agendas of Wikipedia's insiders.)

Comment author: thomblake 28 August 2009 07:41:53PM 3 points [-]

I disagree that false statements are the sorts of things that should be downvoted. I'm all about this being a place where people can happily be false and get corrected, and that means the 'I want to see fewer comments like this" interpretation suggests that I should not downvote comments merely for containing falsehoods.

Comment author: Christian_Szegedy 29 August 2009 12:52:46AM 0 points [-]

My original statement was mathematically true. Maybe Vladimir was sloppy reading it (his utility function satisfied only half of the requirements), but I would not downvote him for that.

Comment author: Vladimir_Nesov 25 August 2009 07:31:16AM *  1 point [-]

This kind of super-cautious mindset can't be modeled with any real-valued money × (current state of the world) → utility type of mapping.

Yes it can: use the mapping U:money->utils such that U(x) is increasing for x<$100M (probably concave) and U(x) = C = const for x>=$100M. Then expected utility EU($100M@100%) = C*1 = C, and also EU($100B@90%) = C*0.9 < EU($100M@100%). But one of the consequences of expected utility representation is that now you must be indifferent between 20% chance at $100M and 20% chance at $100B.
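Nesov's construction can be written out directly; the logarithm below the cap is an arbitrary increasing-concave choice of mine, since he only requires the shape, not a specific formula:

```python
import math

BIG = 100_000_000  # the $100M cap

def u(x):
    # Increasing and concave below $100M, constant (= C) at and above it.
    return math.log(1 + min(x, BIG))

# EU($100M @ 100%) = C beats EU($100B @ 90%) = 0.9 * C ...
assert 1.00 * u(100_000_000) > 0.90 * u(100_000_000_000)
# ...but the same function forces indifference between
# 20% @ $100M and 20% @ $100B, since u is flat past the cap.
assert 0.20 * u(100_000_000) == 0.20 * u(100_000_000_000)
```

This is exactly the consequence Nesov names: the cap buys the super-cautious preference at 100%, at the price of indifference between the two prizes at any equal sub-certain probability.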

Comment author: Christian_Szegedy 29 August 2009 12:48:55AM *  1 point [-]

I also made the requirement that 101M@100% should be preferred to 100M@100%.

Your utility function of U(x)=C for x>100M can't satisfy that.

Comment author: jimrandomh 10 August 2009 12:27:33PM *  5 points [-]

Just because someone tells you that something has a 15% chance does not make it so. If someone offers you a 15% chance at $1M for anything less than $150k, then you should be 95% confident that they will try to cheat somehow.

Comment author: PhilGoetz 10 August 2009 03:20:08PM 11 points [-]

Sure; but it's posed as a hypothetical. The participants know there's no real money involved. Are their conscious selves unable to prevent a subconscious defense against being scammed?

Comment author: Nick_Tarleton 11 August 2009 12:20:03AM 6 points [-]

Sure; but it's posed as a hypothetical.

Maybe they don't have the same concept as we do of a "hypothetical".

Are their conscious selves unable to prevent a subconscious defense against being scammed?

If their conscious selves could shut down the defense, scammers could convince it to. This kind of sphexish paranoia is adaptive, if you're the sort of person who scores low on the cognitive reflection test. Maybe.

Comment author: thomblake 10 August 2009 10:17:06PM 9 points [-]

If it's the right answer in reality, then it's the right answer in a hypothetical. People use their actual cognitive faculties for pondering hypotheticals, not imaginary ones.

Comment author: Jonni 03 September 2011 02:51:01PM 0 points [-]

They may do, but they are still missing many of the physical reactions one might have to genuinely being offered large sums of money - excitement, adrenalin etc - and these are bound to have some effect on people's decision making processes.

Perhaps a way around this would be to conduct several thought experiments with a subject in one sitting, and tell them beforehand that 'one of the offers in these thought experiments is real and you will receive what you choose - although you will not know which one until the end of the experiment'.

This would be a good way to induce their visceral reactions to the situation, and of course, disappointingly perhaps, a more modest-sum-involving thought experiment at the end could provide them with their dividend.

Also worth noting: Deal or No Deal (UK version) demonstrates the variety of reactions and strategies people have to this sort of proposition. The US version is just silly though :)

Comment author: MichaelVassar 11 August 2009 05:47:43AM 4 points [-]

Most people don't really get hypotheticals. Even most high IQ people seem not to.

Comment author: orthonormal 10 August 2009 08:28:44PM 1 point [-]

We've had this argument before, and it still looks to me like this couldn't account for the full effect of risk aversion. The fact that scammers regularly succeed means that people don't usually base their reasoning on that sort of suspicion.

Comment author: randallsquared 11 August 2009 03:15:44AM 3 points [-]

People who feel secure do not, and people who do not feel secure do. Unfortunately, to someone in the latter camp, genuine opportunity really looks like a scam; it's "too good to be true".

Comment author: anonym 15 August 2009 06:00:54PM 0 points [-]

This is a really good point. I'd feel much more confident in these sorts of results if the questions were prefaced by disclaimers stating that there is no chance whatsoever of getting ripped off, that the random decision that determines the win or loss is absolutely guaranteed to be secure and accurate, that the $1M is tax-free, will be given in secret, doesn't need to be reported to the government, etc.

Comment author: thomblake 10 August 2009 10:26:39PM 3 points [-]

I can't think of a good explanation for anyone picking the $500

I agree with Vladimir Golovin. I definitely think this way - I can think immediately of how useful $500 would be to me, but cannot think of many ways to use a 15% chance of $1M.

Well, I can think of one way - I would take the 15% chance of $1M and then sell it to one of you suckers for $100,000.

Comment author: Alicorn 10 August 2009 11:12:28PM 5 points [-]

Show of hands - who really has $100,000 that they could free up to buy this from thomblake? Personally, I don't think I actually know what 100k is worth to me because I have never had my hands on that much money.

Comment author: MichaelVassar 11 August 2009 05:45:00AM *  1 point [-]

Actually, if he has the $1M, I'm in. I don't have $100K liquid in any normal sense but I could certainly raise it for such a deal and divide the risk and return up among a few people.

Comment author: Alicorn 11 August 2009 04:00:39PM 10 points [-]

He does not have (even in this hypothetical situation) a million bucks. Hypothetically, he's being offered a 15% chance of winning a million bucks.

Incidentally, in a staggering display of risk aversion, I asked a friend how much she'd pay for a 15% chance of a million dollars and she said maybe twenty bucks because those did not seem like "very good odds" to her. -.-

Comment author: CronoDAS 10 August 2009 11:39:58PM *  1 point [-]

I don't have $100,000. I only have about $30,000.

Just how much is a one in seven chance of a million dollars worth to everyone here, anyway? (Offer me a near certainty of $70,000 and I'd start to have second thoughts about taking the gamble.)

Comment author: barrkel 11 August 2009 05:30:12AM 1 point [-]

Exactly. If someone has $1.1M, spending $0.1M on a 15% chance of $1M is a good deal. Someone who has $0.05M and has to go into debt to buy the 15% chance is very probably insane.

Comment author: Alicorn 10 August 2009 03:04:53AM 2 points [-]

Another possible sane motivation for taking the $500 is a familiarity with how commonly lottery winners find their lives ruined by the sudden influx of cash.

Comment author: anonym 10 August 2009 03:53:13AM 20 points [-]

It's possible but doesn't seem very likely, since given the choice between $1M outright or $500 outright, those same people would almost certainly take the $1M.

I think a more likely explanation is that they conceptualize the problem as having to choose between "probably getting $0" and "certainly getting $500".

Comment author: bbleeker 11 August 2009 10:33:10AM 4 points [-]

Of course that's it. $500 is a lot to pay for a lottery ticket, even one with as high a chance of winning as this. Change it to a certain $20 and a 15% chance of $40,000, and I bet (heh) that many more people will take the chance then.

Comment author: PhilGoetz 10 August 2009 03:26:44PM *  8 points [-]

Well, I don't buy this in general, but I do know one person who would do this. I was talking with her recently about a lottery winner who won some huge sum, maybe $100M, and she said, "But it ruined his life." And I said, "Why? What happened?" And she said, "Oh, I don't know what happened. But I assume it ruined his life."

In her mind, I think she had already taken her high prior for B, assumed B, and converted the non-incident into further evidence for B. She did laugh at herself when she realized she'd done this, so there is hope. :)

Comment author: simpleton 10 August 2009 03:34:10AM 4 points [-]

Being aware of that tendency should make it possible to avoid ruination without forgoing the money entirely (e.g. by investing it wisely and not spending down the principal on any radical lifestyle changes, or even by giving all of it away to some worthy cause).

Comment author: Alicorn 10 August 2009 05:24:35AM 1 point [-]

Unless there's akrasia involved. I can only imagine how tempting it would be to just outright buy a house if I were suddenly handed a million dollars, no matter how sternly I told myself not to just outright buy a house.

Comment author: simpleton 10 August 2009 06:02:23AM *  9 points [-]

And the best workaround you can come up with is to walk away from the money entirely? I don't buy it.

If you go through life acting as if your akrasia is so immutable that you have to walk away from huge wins like this, you're selling yourself short.

Even if you're right about yourself, you can just keep $1000 [edit: make that $3334, so as to have a higher expected value than a sure $500] and give the rest away before you have time to change your mind. Or put the whole million in an irrevocable trust. These aren't even the good ideas; they're just the trivial ones which are better than what you're suggesting.
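The $3334 in the edit is just the smallest whole-dollar amount to keep whose 15% expected value beats a sure $500; a quick sanity check:

```python
import math

break_even = 500 / 0.15        # $3333.33...: prize at which 15% EV equals $500
keep = math.ceil(break_even)   # $3334: smallest whole dollar strictly above it
print(keep)                    # 3334
assert 0.15 * keep > 500       # keeping $3334 beats a sure $500 in expectation
```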

Comment author: hirvinen 10 August 2009 11:11:22PM 0 points [-]

Ha! Buying a house and even more so moving is hard work, even with hired help. No way I'd do that right away.

Comment author: rwallace 10 August 2009 03:09:24PM 0 points [-]

Given a million-dollar windfall, buying a house at today's depressed prices would be one of the best investments you could make. (An additional benefit would be to make the money less liquid, thereby cutting down the temptation to spend it frivolously.)

Comment author: Alicorn 10 August 2009 03:42:10PM 3 points [-]

Perhaps, but owning a house would be a terrible time investment for me the way my life is working. I suppose I could hire a property manager and rent it out, though.

Comment author: matt 10 August 2009 10:36:43PM 2 points [-]

How sure are you that you know more than the market on this one? What information do you have that (still) rich property speculators don't have?

Comment author: MichaelVassar 11 August 2009 05:51:21AM 1 point [-]

Housing markets aren't even theoretically efficient. Too big, diffuse, illiquid, etc.

Comment author: MichaelBishop 15 August 2009 09:36:49PM 1 point [-]

ok, but are you arguing that Matt's skepticism is unwarranted? are you heavily invested in real estate?

Comment author: CarlShulman 10 August 2009 04:10:32AM *  2 points [-]

So they should keep $10,000 if they win. Or just sell a stake in the winnings to a less risk-averse third party for $10,000, risk free.

Comment author: PhilGoetz 10 August 2009 03:23:50PM *  1 point [-]

What $10,000?

EDIT: Never mind; I thought he said "the $10,000".

Comment author: Alicorn 10 August 2009 03:41:00PM *  5 points [-]

You could probably sell somebody a 15% chance to win a million bucks for 10k. It's worth fifteen times that to a risk-neutral agent.

Comment author: Technologos 11 August 2009 05:41:43AM 1 point [-]

I'm pretty confident that you could sell a true 15% chance to win a million bucks for a lot more than 10k... after all, investment banks make substantially greater gambles regularly.

I'd probably ask for 100k to start and go from there.

Comment author: timtyler 10 August 2009 06:01:44PM -1 points [-]

Re: Maybe they assume that someone might actually give them $500, but the $1M is a scam.

If you were offering this deal, wouldn't the $1M be based on a deceptive manipulation of the stated probabilities? Many participants can probably figure that one out.

Comment author: thomblake 10 August 2009 10:14:30PM 0 points [-]

kudos for linking to Virginia Postrel.

Comment author: MichaelVassar 11 August 2009 05:34:35AM 0 points [-]

Think like reality. If it's hard to imagine how something could happen, update your model.

Comment author: Vladimir_Nesov 11 August 2009 01:09:31PM 0 points [-]

My model says that there is a big difference between formal education and deep understanding that can only be developed by extracurricular appreciation of the subject.

Comment author: Jonathan_Graehl 10 August 2009 09:12:33PM 0 points [-]

I was briefly tempted to answer 10 cents to the first problem.

"one study was done at University of Toledo" -- where the mean score on his test was 0.57 out of a possible 3 -- "and one study was done at Princeton," where the mean was 1.63

I'm just thrilled to think of how dumb our elite (top 10% and top 2% respectively?) are.

Almost a third of high scorers preferred a 1 percent chance of $5,000 to a sure $60.

Maybe those are just the high scorers who got 3/3 (avoiding the tempting surface error) almost by sheer chance.

Comment author: Eneasz 10 August 2009 09:44:30PM 1 point [-]

I got 3/3 and I would take the chance. My rationale is that $60 is almost nothing. I can make that very quickly, and it won't buy much. I won't notice it in my monthly finances. $5000, on the other hand, is actually worth considering. That could change my month significantly (and impact the rest of my year as well). Would you rather have a 100% chance of getting a nickel, or a 0.01% chance of getting a small diamond?

Comment author: JGWeissman 10 August 2009 10:01:29PM 4 points [-]

What if we transform the problem, so that you have the opportunity to pay $60 for a 1% chance to gain $5000?

Comment author: DonGeddis 10 August 2009 10:29:16PM 7 points [-]

Exactly! This is gambling, isn't it? A small expected loss, with a tiny chance of some huge gain.

If your utility for money really is so disproportionate to the actual dollar value, then you probably ought to take a trip to Las Vegas and lay down a few long-odds bets. You'll almost certainly lose your betting money (but you wouldn't "notice it in [your] monthly finances"), while there's some (small) chance that you get lucky and "change [your] month considerably".

It's not hypothetical! You can do this in the real world! Go to Vegas right now.

(If the plane flight is bothering you, I'm sure we could locate some similar online betting opportunities.)

Comment author: orangecat 11 August 2009 11:56:46PM 2 points [-]

That sort of attitude (among my opponents) is very helpful to my poker bankroll. You're giving up $60 for $50 of expected value. Even given your risk-seeking preference, surely you can find a better gamble. Putting it on a single number in roulette would be a better deal.
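orangecat's numbers check out; for comparison, here is the single-number roulette bet (an American double-zero wheel is assumed):

```python
stake = 60

# The offered gamble: pay $60 for a 1% chance at $5000.
ev_offer = 0.01 * 5000                 # $50 expected per $60 staked

# Single-number roulette: 38 slots, a winner is paid 35:1 plus the stake back.
ev_roulette = (1 / 38) * 36 * stake    # ~$56.84 expected per $60 staked

print(ev_offer, round(ev_roulette, 2))  # 50.0 56.84
assert ev_roulette > ev_offer           # roulette loses less, as claimed
```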

Comment author: orthonormal 11 August 2009 02:25:25AM 0 points [-]

By the way, welcome to Less Wrong (I notice you had some comments on Overcoming Bias as well); you should check out the welcome thread if you haven't already.

Comment author: Alicorn 10 August 2009 10:05:43PM *  0 points [-]

I'd be almost guaranteed to lose the diamond before I could liquidate it if I won it. Should I factor that in or not? Diamonds are also notoriously difficult to liquidate unless you are in the relevant cartel...

Comment author: anonym 15 August 2009 05:52:16PM 0 points [-]

I would guess that most people who got the first problem correct also had "10 cents" as their initial thought, for about half a second or so before they had finished reading the question and before they had actually started deliberatively thinking about the problem.

Comment author: SforSingularity 11 August 2009 10:16:30PM *  1 point [-]

Can anyone think of good ways to notice when outright deception is being used? How could a rationalist practice her skills at a magic show?

Most "rationalists" are quite smart people, so tricks that are designed by a trickster to fool the masses rarely work on us. For example, I doubt that many on this site would invest heavily in a pyramid scheme or get fooled by a used car salesman. This is because these tricks are targeted at the average idiot.

However, I have recently noticed that there is, for each of us, a stalker who stalks us and at each and every turn attempts to deceive us, and is just as smart as we are. That stalker/trickster is your own cognitive biases, and by far and away inflicts the greatest material losses on you. This is certainly true in my case.

I cannot even remember the last time I was fooled by someone else, but now that I am working on reducing my losses due to self deception, I realize that basically every day I engage in successful self-deception: I get into some emotional state, myopic, irrational algorithms take over, and I make up little excuses to myself for why they reached the right conclusion.

The real enemy is already inside your head.

Comment author: Annoyance 13 August 2009 03:36:08PM *  3 points [-]

Most "rationalists" are quite smart people, so tricks that are designed by a trickster to fool the masses rarely work on us.

Wrong. Tricksters rely on people making stupid assumptions and failing to check assertions. People with a lot of brainpower can do those things just as easily as people without.

Physicists asked to evaluate paranormal claims do very poorly, yet they are clearly very brainy. It takes more than just brains to be intelligent - you have to use the brains properly.

If I had a dollar for every brainy person who'd been gulled because they thought they were "too smart" to require being skeptical...

Comment author: SforSingularity 13 August 2009 07:11:15PM 2 points [-]

Physicists asked to evaluate paranormal claims do very poorly, yet they are clearly very brainy.

Reference, please. I defy the implied claim that "Physicists asked to evaluate paranormal claims do worse than the average person". I bet 6:1 against this.

If I had a dollar for every brainy person who'd been gulled because they thought they were "too smart" to require being skeptical...

and if I had a dollar for every average idiot who sleepwalked straight into an obvious scam I would make a lot more money.

Comment author: Aurini 22 August 2009 12:08:09AM *  2 points [-]

Project Alpha by James Randi: http://en.wikipedia.org/wiki/Project_Alpha

Scientists tend to be trusting and naive, since neither nature nor their peers are prone to lying. That's why magicians make such great skeptics -- their profession is nothing but lying!

Comment author: Annoyance 01 September 2009 01:48:48PM 0 points [-]

If I had a dollar for every brainy person who'd been gulled because they thought they were "too smart" to require being skeptical...

and if I had a dollar for every average idiot who sleepwalked straight into an obvious scam I would make a lot more money.

Those sets are not disjoint.

Comment author: SforSingularity 01 September 2009 02:27:29PM 1 point [-]

I define "average idiot" to be disjoint from "brainy person". Does that sound reasonable?

Of course, I am sure that there are some very clever people who sleepwalked straight into a really obvious scam without even questioning it, but I am making the empirical claim that this doesn't happen as much as it does for people of below average intelligence.

Comment author: PhilGoetz 10 August 2009 02:36:49AM 0 points [-]

Restating the problem in simpler terms, without narrative, would help with this example.

Comment author: Bo102010 10 August 2009 02:53:21AM *  2 points [-]

Which problem do you mean? The original riddle?

Actor A charges actors B1, B2, and B3 $10 each, for a total charge of $30. Next, A changes the total charge to $25. Next, Actor C gives $1 of the $5 difference to each of the Bs, and keeps $2. After having paid $10 and returned $1, each of the 3 Bs paid $9. $9 times 3 is $27, plus the $2 kept by C is $29. What happened to the extra $1 so that the sum is $30?

The flagrant lying occurs from "plus the $2 kept by C" to the end.

Comment author: PhilGoetz 10 August 2009 06:55:52PM 1 point [-]

You're still telling it as a narrative. If you wrote it out as an Excel spreadsheet, I think the difficulty would vanish.

Comment author: RichardKennaway 10 August 2009 07:28:38PM *  2 points [-]

Spreadsheet?? Just look at it this way: where is the money? The guests have paid $27, of which $25 is with the innkeeper and $2 is with the bellhop. Problem gone.

Comment author: Bo102010 11 August 2009 12:05:51AM *  0 points [-]

The aim of my post is to point out that there is no difficulty until the riddler leads you into thinking there is. Nonetheless, you could do:

1) A: $0, B1: $10, B2: $10, B3: $10, C: $0

2) A: $30, B1: $0, B2: $0, B3: $0, C: $0

3) A: $25, B1: $0, B2: $0, B3: $0, C: $5

4) A: $25, B1: $1, B2: $1, B3: $1, C: $2

And then misdirect by saying "But," then stating,

[B1(1) - B1(4)] + [B2(1) - B2(4)] + [B3(1) - B3(4)] + C = 29

and then asking "Where'd the missing dollar go?"
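The ledger above can be checked mechanically; the only assumption below is that total money in the system stays $30 at every step:

```python
# Ledger for the riddle: each actor's dollars after each step.
steps = [
    {"A": 0,  "B1": 10, "B2": 10, "B3": 10, "C": 0},  # 1) before payment
    {"A": 30, "B1": 0,  "B2": 0,  "B3": 0,  "C": 0},  # 2) guests pay $10 each
    {"A": 25, "B1": 0,  "B2": 0,  "B3": 0,  "C": 5},  # 3) discount; C takes $5
    {"A": 25, "B1": 1,  "B2": 1,  "B3": 1,  "C": 2},  # 4) C returns $1 each
]
for s in steps:
    assert sum(s.values()) == 30  # money is conserved at every step

# The riddle's bogus sum: guests' net payment plus the bellhop's take.
net_paid = 3 * (10 - 1)              # $27
bogus = net_paid + steps[-1]["C"]    # $27 + $2 = $29 -- a meaningless total
correct = net_paid - steps[-1]["A"]  # $27 - $25 = $2, which is what C kept
print(bogus, correct)  # 29 2
```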

Comment author: Aurini 21 August 2009 11:44:53PM 1 point [-]

A while back on here somebody mentioned that they were at a College, and then later on somebody else mentioned MIT, so I drew the conclusion that they were at MIT. This is a power which must be used wisely...

Comment author: Larks 09 August 2009 08:36:53PM 1 point [-]

Can anyone think of good ways to notice when outright deception is being used?

I suspect one of the best ways may be to try to re-create their reasoning from the beginning, so you engage the logical inference part of your brain in trying to (re)reason, rather than the 'listening to and believing' part, as we now know that we first believe, and have to consciously reject, new ideas, rather than the other way around.

Where money is concerned, I suppose you could check whether income matches expenditures.

other examples of flagrant misdirection

I can think of many examples of cases where candidates evade the most basic of economics, but as Politics is the Mind-Killer, I think it's probably best not to bring them up.

Comment author: Larks 19 August 2009 10:11:55PM 0 points [-]

other examples of flagrant misdirection

The National Lottery!

Comment author: MichaelBishop 15 August 2009 09:38:42PM 0 points [-]

Ask people to write/draw the problem and I bet you get far better responses.