Comment author: wedrifid 21 June 2010 01:29:54PM 4 points [-]

You are just wrong. These are people whose utility function does not place a higher utility on "dying but not having to take my meds".

If your preferred theory takes a human, with all its self-contradictions, and compresses it into a simple rational agent with a coherent utility function, you must resolve the contradictions the way the agent would prefer them resolved if it were capable of resolving them intelligently. If your preferred theory does not do this, then it is a crap theory. A map that does not describe the territory. A map that is better used as toilet paper.

Comment author: pricetheoryeconomist 21 June 2010 04:25:19PM *  8 points [-]

"These are people whose utility function does not place a higher utility on 'dying but not having to take my meds'."

Why are you making claims about their utility functions that the data does not back? Either people knowingly prefer less to more, or they are making rational decisions to remain ignorant rather than violate their "ugh" field, which would be costly for them.

How is that any different from a smoker being uncomfortable quitting smoking? (Here I recognize that smoking is obviously a rational behavior for the people who choose to smoke.)

Comment author: pricetheoryeconomist 21 June 2010 12:05:18PM 3 points [-]

You have it all wrong. Your "ugh" field should go into their utility function! Whether or not they invest the resources to overcome that "ugh" field and save their life is endogenous to their situation!

You are making the case for rationality, it seems to me. Your suggestion may be that people are emotional, but not that they are irrational! Indeed, this is what the GMU crowd calls "rational irrationality," which makes perfect sense--think about the perfectly rational decision to get drunk (and therefore be irrational). You evaluate the costs and benefits and decide that going with your emotions is preferable.

I see this comment as not understanding the definition of "rational" in economics, which is simply maximizing utility subject to costs such as incomplete information (and endogenous amounts of information), emotional constraints and costs, etc.

Comment author: pricetheoryeconomist 25 May 2010 11:59:27PM 5 points [-]

Great idea. Very clever.

Perhaps someone has said this already, but it's worth noting that if you did this in the car dealer example, car dealers could sign similar contracts--your deal would not go through.

Then, negotiating with car dealers would have a game-theoretic hawk/dove or snowdrift equilibrium. Similarly with potential wives: they could sign contracts agreeing that they will never sign prenups--another hawk/dove equilibrium.
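The incentive structure being described can be sketched as a standard hawk/dove payoff matrix. The payoff numbers below (a resource worth 2, a fight costing 4) and the function name are invented purely for illustration:

```python
from itertools import product

# Assumed hawk/dove payoffs: resource V = 2, cost of conflict C = 4.
# "Hawk" here is the side that pre-commits via contract; "dove" concedes.
payoff = {
    ("hawk", "hawk"): (-1, -1),   # (V - C) / 2 each: both pay the cost of conflict
    ("hawk", "dove"): (2, 0),     # the committed side takes the whole surplus
    ("dove", "hawk"): (0, 2),
    ("dove", "dove"): (1, 1),     # V / 2 each: the surplus is split
}

def pure_nash(payoff):
    """Return strategy pairs where neither side gains by deviating alone."""
    strats = ["hawk", "dove"]
    eqs = []
    for a, b in product(strats, strats):
        best_a = all(payoff[(a, b)][0] >= payoff[(x, b)][0] for x in strats)
        best_b = all(payoff[(a, b)][1] >= payoff[(a, y)][1] for y in strats)
        if best_a and best_b:
            eqs.append((a, b))
    return eqs

print(pure_nash(payoff))   # [('hawk', 'dove'), ('dove', 'hawk')]
```

The two asymmetric pure equilibria are exactly the point: once both sides can sign commitment contracts, whoever commits first wins, and mutual commitment is the costly (hawk, hawk) outcome.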

Comment author: utilitymonster 08 May 2010 04:18:43PM *  13 points [-]

Nothing is more unjust, however common, than to charge with hypocrisy him that expresses zeal for those virtues which he neglects to practice; since he may be sincerely convinced of the advantages of conquering his passions, without having yet obtained the victory, as a man may be confident of the advantages of a voyage, or a journey..., without having courage or industry to undertake it, and may honestly recommend to others, those attempts which he neglects himself.

--Samuel Johnson

Comment author: pricetheoryeconomist 09 May 2010 04:44:30PM 7 points [-]

Is Samuel Johnson's quote a valid or true statement? I understand your central thrust--the inability to do something personally (such as control one's sexual urges) and the disposition to encourage others to overcome that inability are not necessarily contradictory--indeed, they may fall together naturally.

However, in Samuel Johnson's world, and the world in which this "issue" comes up the most, politics, we might imagine that there exist two types of people: sociopathic individuals hungry for power, and individuals who are sincere.

If sociopathic individuals hungry for power are more often hypocrites, then we might, as an efficient rule of thumb (not being able to distinguish the two save through their observable actions!) condemn hypocrites because they are likely to be power-hungry individuals.

As a Bayesian update: in the world of politics, we expect that hypocrites are more likely to be power-hungry or sociopathic. I see Samuel Johnson's quote as potentially true, but as ignoring a world of imperfect information and signaling.
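A toy Bayes calculation makes the rule of thumb concrete. Every probability below is invented purely for illustration, not an estimate of anything:

```python
# Invented numbers: a prior over politician types, and the likelihood
# of observing hypocrisy from each type.
p_socio = 0.2                # prior: fraction who are power-hungry sociopaths (assumed)
p_hyp_given_socio = 0.8      # P(hypocrite | sociopath), assumed
p_hyp_given_sincere = 0.3    # P(hypocrite | sincere), assumed

# Bayes' rule: P(sociopath | hypocrite).
p_hyp = p_hyp_given_socio * p_socio + p_hyp_given_sincere * (1 - p_socio)
p_socio_given_hyp = p_hyp_given_socio * p_socio / p_hyp
print(f"P(sociopath | hypocrite) = {p_socio_given_hyp:.2f}")   # 0.40, double the prior
```

Even with a modest difference in likelihoods, observing hypocrisy doubles the posterior on "power-hungry" here, which is all the condemnation heuristic needs.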

Comment author: pricetheoryeconomist 09 May 2010 01:39:53PM *  4 points [-]

A reasonable idea for this and other problems that don't seem to suffer from ugly asymptotics would be simply to test them mechanically.

That is to say, it may be more efficient, requiring less brain power, to believe the results of repeated simulations. Having walked through the Monty Hall decision tree and the statistics with people who couldn't really follow either, but who did end up believing the results of a simulation whose code is straightforward to read, I advocate this method--empirical verification over intuition or mathematics that are fallible (because you yourself are fallible in your understanding, not because they contain a contradiction).
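A minimal simulation of the kind described might look like the following (the function name and trial count are my own choices):

```python
import random

def monty_hall(switch, trials=100_000):
    """Estimate the win rate of the stay vs. switch strategy by simulation."""
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)      # door hiding the car
        pick = random.randrange(3)     # contestant's initial pick
        # Host opens a door that is neither the pick nor the car.
        # (When pick == car the host has two choices; which one he
        # opens doesn't affect the win rates, so we take the first.)
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print(f"stay:   {monty_hall(False):.3f}")   # ≈ 1/3
print(f"switch: {monty_hall(True):.3f}")    # ≈ 2/3
```

The code is short enough to audit line by line, which is the point: someone who distrusts the tree diagram can still check that each line is an honest transcription of the game's rules.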

Comment author: pricetheoryeconomist 07 May 2010 04:43:12PM 4 points [-]

I don't see this as a valid criticism, if it is intended to be a dismissal. The addendum "beware this temptation" is worth highlighting. While this is a point worth making, the response "but someone would have noticed" is shorthand for "if your point was correct, others would likely believe it as well, and I do not see a subset of individuals who also are pointing this out."

Let's say there are ideas that are internally inconsistent, irrational, or bad (and, when recognized as such, are not propounded) and ideas that are internally consistent, rational, or good. Each idea comes as a draw from a bin of ideas, with some proportion that are good and some that are bad.

Further, each person has an imperfect signal on whether or not an idea is good or not. Finally, we only see ideas that people believe are good, setting the stage for sample selection.

Therefore, when someone is propounding an idea, the fact that you have not heard it before makes it more likely to have been censored--that is, more likely to have been judged a bad idea internally and thus never suggested. I suggest, as a Bayesian update, that given you have never heard the idea before, it is more likely to be internally inconsistent/irrational/bad than an idea you hear constantly, which has passed many people's internal consistency checks.
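The update can be sketched numerically. The prior and the signal accuracy below are assumed purely for illustration:

```python
# Invented numbers: half of all ideas in the bin are good, and each
# person's private consistency check is right 70% of the time. An idea
# is only ever voiced by someone whose check judged it good.
p_good = 0.5
hit = 0.7     # P(judged good | actually good), assumed
miss = 0.3    # P(judged good | actually bad), assumed

def p_good_given_endorsements(k):
    """P(idea is good | k independent people have each propounded it)."""
    num = (hit ** k) * p_good
    return num / (num + (miss ** k) * (1 - p_good))

for k in (1, 3, 5):
    print(f"{k} independent endorsements: "
          f"P(good) = {p_good_given_endorsements(k):.3f}")
```

With these numbers the posterior climbs from 0.7 for a lone proponent to above 0.9 once a few independent checks have passed, which is the asymmetry between "an idea I hear constantly" and "an idea I have never heard before."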