Eight Short Studies On Excuses

210 Yvain 20 April 2010 11:01PM

The Clumsy Game-Player

You and a partner are playing an Iterated Prisoner's Dilemma. Both of you have publicly pre-committed to the tit-for-tat strategy. By iteration 5, you're going happily along, raking in the bonuses of cooperation, when your partner unexpectedly presses the "defect" button.

"Uh, sorry," says your partner. "My finger slipped."

"I still have to punish you just in case," you say. "I'm going to defect next turn, and we'll see how you like it."

"Well," says your partner, "knowing that, I guess I'll defect next turn too, and we'll both lose out. But hey, it was just a slipped finger. By not trusting me, you're costing us both the benefits of one turn of cooperation."

"True," you respond, "but if I don't do it, you'll feel free to defect whenever you feel like it, using the 'finger slipped' excuse."

"How about this?" proposes your partner. "I promise to take extra care that my finger won't slip again. You promise that if my finger does slip again, you will punish me terribly, defecting for a bunch of turns. That way, we trust each other again, and we can still get the benefits of cooperation next turn."

You don't believe that your partner's finger really slipped, not for an instant. But the plan still seems like a good one. You accept the deal, and you continue cooperating until the experimenter ends the game.

After the game, you wonder what went wrong, and whether you could have played better. You decide that there was no better way to deal with your partner's "finger-slip" - after all, the plan you enacted gave you maximum possible utility under the circumstances. But you wish that you'd pre-committed, at the beginning, to saying "and I will punish finger slips equally to deliberate defections, so make sure you're careful."
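The negotiated scheme can be sketched in code. This is a minimal sketch of my own construction, not from the post: plain tit-for-tat alongside a variant that forgives a single "slipped finger" but answers any repeat with several turns of defection (the `punish_turns` parameter is an assumption for illustration).

```python
def tit_for_tat(their_history):
    """Cooperate on the first turn, then mirror the partner's last move."""
    return "C" if not their_history else their_history[-1]

def forgive_once_punish_hard(their_history, punish_turns=3):
    """Forgive the partner's first defection; after any later defection,
    defect for `punish_turns` turns before resuming cooperation."""
    if their_history.count("D") <= 1:
        return "C"                       # at most one slip: keep cooperating
    # locate the most recent defection and punish for punish_turns turns
    last_d = len(their_history) - 1 - their_history[::-1].index("D")
    turns_since = len(their_history) - 1 - last_d
    return "D" if turns_since < punish_turns else "C"

# One slip is forgiven; a second one triggers sustained punishment.
print(forgive_once_punish_hard(["C", "C", "C", "C", "D"]))       # C
print(forgive_once_punish_hard(["C", "C", "D", "C", "C", "D"]))  # D
```

Note that the harsher punishment only deters "slips" if it is actually credible, which is exactly the pre-commitment the narrator wishes he had made.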


Anti-Akrasia Technique: Structured Procrastination

51 patrissimo 12 November 2009 07:35PM

This idea has been mentioned in several comments, but it deserves a top-level post.  In an ancient, ancient web article (1995!), Stanford philosophy professor John Perry writes:

I have been intending to write this essay for months. Why am I finally doing it? Because I finally found some uncommitted time? Wrong. I have papers to grade, textbook orders to fill out, an NSF proposal to referee, dissertation drafts to read. I am working on this essay as a way of not doing all of those things. This is the essence of what I call structured procrastination, an amazing strategy I have discovered that converts procrastinators into effective human beings, respected and admired for all that they can accomplish and the good use they make of time. All procrastinators put off things they have to do. Structured procrastination is the art of making this bad trait work for you. The key idea is that procrastinating does not mean doing absolutely nothing. Procrastinators seldom do absolutely nothing; they do marginally useful things, like gardening or sharpening pencils or making a diagram of how they will reorganize their files when they get around to it. Why does the procrastinator do these things? Because they are a way of not doing something more important. If all the procrastinator had left to do was to sharpen some pencils, no force on earth could get him to do it. However, the procrastinator can be motivated to do difficult, timely and important tasks, as long as these tasks are a way of not doing something more important.

The insightful observation that procrastinators fill their time with effort, not staring at the walls, gives rise to this form of akrasia aikido, where the urge to not do something is cleverly redirected into productivity.  If you can "waste time" by doing useful things, while feeling like you are avoiding doing the "real work", then you avoid depleting your limited supply of willpower (which happens when you force yourself to do something).

In other words, structured procrastination (SP) is an efficient use of this limited resource, because doing A in order to avoid doing B is easier than making yourself do A.  If A is something you want to get done, then the less willpower it takes, the more you will be able to accomplish.  This only works if A is something that you do want to get done - that's how SP differs from normal procrastination, of course.
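As a toy model (the tasks and importance scores here are my invention, not Perry's), structured procrastination amounts to keeping genuinely worthwhile tasks queued beneath an intimidating top item, so that avoiding the top item still produces useful work:

```python
# An ordered task list: the looming "most important" task sits on top,
# and everything beneath it is real work you can do while avoiding it.
tasks = [
    ("referee the NSF proposal", 10),   # the intimidating top item
    ("grade papers", 7),
    ("write the essay", 6),
    ("sharpen pencils", 1),             # plain, unstructured procrastination
]

# While dodging tasks[0], the procrastinator is still drawn to anything
# useful below it -- only the trivial filler fails the cut.
avoidance_work = [name for name, importance in tasks[1:] if importance > 3]
print(avoidance_work)
```

The design point is simply that the ordering does the motivational work: item 0 is never meant to be done first, only to make items 1 and 2 feel like an escape.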


Why safety is not safe

48 rwallace 14 June 2009 05:20AM

June 14, 3009

Twilight still hung in the sky, yet the Pole Star was visible above the trees, for it was a perfect cloudless evening.

"We can stop here for a few minutes," remarked the librarian as he fumbled to light the lamp. "There's a stream just ahead."

The driver grunted assent as he pulled the cart to a halt and unhitched the thirsty horse to drink its fill.

It was said that in the Age of Legends, there had been horseless carriages that drank the black blood of the earth, long since drained dry. But then, it was said that in the Age of Legends, men had flown to the moon on a pillar of fire. Who took such stories seriously?

The librarian did. In his visit to the University archive, he had studied the crumbling pages of a rare book in Old English, itself a copy a mere few centuries old, of a text from the Age of Legends itself; a book that laid out a generation's hopes and dreams, of building cities in the sky, of setting sail for the very stars. Something had gone wrong - but what? That civilization's capabilities had been so far beyond those of his own people. Its destruction should have taken a global apocalypse of the kind that would leave unmistakable record both historical and archaeological, and yet there was no trace. Nobody had anything better than mutually contradictory guesses as to what had happened. The librarian intended to discover the truth.

Forty years later he died in bed, his question still unanswered.

The earth continued to circle its parent star, whose increasing energy output could no longer be compensated by falling atmospheric carbon dioxide concentration. Glaciers advanced, then retreated for the last time; as life struggled to adapt to changing conditions, the ecosystems of yesteryear were replaced by others new and strange - and impoverished. All the while, the environment drifted further from that which had given rise to Homo sapiens, and in due course one more species joined the billions-long roll of the dead. For what was by some standards a little while, eyes still looked up at the lifeless stars, but there were no more minds to wonder what might have been.


Well-Kept Gardens Die By Pacifism

105 Eliezer_Yudkowsky 21 April 2009 02:44AM

Previously in series: My Way
Followup to: The Sin of Underconfidence

Good online communities die primarily by refusing to defend themselves.

Somewhere in the vastness of the Internet, it is happening even now.  It was once a well-kept garden of intelligent discussion, where knowledgeable and interested folk came, attracted by the high quality of speech they saw ongoing.  But into this garden comes a fool, and the level of discussion drops a little—or more than a little, if the fool is very prolific in their posting.  (It is worse if the fool is just articulate enough that the former inhabitants of the garden feel obliged to respond, and correct misapprehensions—for then the fool dominates conversations.)

So the garden is tainted now, and it is less fun to play in; the old inhabitants, already invested there, will stay, but they are that much less likely to attract new blood.  Or if there are new members, their quality also has gone down.

Then another fool joins, and the two fools begin talking to each other, and at that point some of the old members, those with the highest standards and the best opportunities elsewhere, leave...

I am old enough to remember the USENET that is forgotten, though I was very young.  Unlike the first Internet that died so long ago in the Eternal September, in these days there is always some way to delete unwanted content.  We can thank spam for that—so egregious that no one defends it, so prolific that no one can just ignore it, there must be a banhammer somewhere.

But when the fools begin their invasion, some communities think themselves too good to use their banhammer for—gasp!—censorship.

After all—anyone acculturated by academia knows that censorship is a very grave sin... in their walled gardens where it costs thousands and thousands of dollars to enter, and students fear their professors' grading, and heaven forbid the janitors should speak up in the middle of a colloquium.


GroupThink, Theism ... and the Wiki

-4 byrnema 13 April 2009 05:28PM

In response to The uniquely awful example of theism, I presented myself as a datapoint of someone in the group who disagrees that theism is uncontroversially irrational.

At the cost of considerable time, several karma points, and two bad posts, I now retract my position.


Bystander Apathy

25 Eliezer_Yudkowsky 13 April 2009 01:26AM

The bystander effect, also known as bystander apathy, is that larger groups are less likely to act in emergencies - not just individually, but collectively.  Put an experimental subject alone in a room and let smoke start coming up from under the door.  75% of the subjects will leave to report it.  Now put three subjects in the room - real subjects, none of whom know what's going on.  On only 38% of the occasions will anyone report the smoke.  Put the subject with two confederates who ignore the smoke, and they'll only report it 10% of the time - even staying in the room until it becomes hazy.  (Latane and Darley 1969.)
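The "not just individually, but collectively" point deserves a back-of-the-envelope check: if the three subjects decided independently, at least one of them would report the smoke far more often than a lone subject does, which makes the observed 38% all the more striking.

```python
# If three bystanders each behaved like the lone 75% subject, independently,
# the chance that at least one reports the smoke would be almost certain.
p_alone = 0.75                                  # lone subject reports smoke
p_group_independent = 1 - (1 - p_alone) ** 3    # P(at least one of 3 reports)
print(f"expected if independent: {p_group_independent:.1%}")  # 98.4%
print("observed for 3 real subjects: 38%")
```

The gap between 98% expected and 38% observed is the collective inhibition the two mechanisms below are meant to explain.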

On the standard model, the two primary drivers of bystander apathy are:

  • Diffusion of responsibility - everyone hopes that someone else will be first to step up and incur any costs of acting.  When no one does act, being part of a crowd provides an excuse and reduces the chance of being held personally responsible for the results.
  • Pluralistic ignorance - people try to appear calm while looking for cues, and see... that the others appear calm.

Cialdini (2001):

Very often an emergency is not obviously an emergency.  Is the man lying in the alley a heart-attack victim or a drunk sleeping one off?  ...  In times of such uncertainty, the natural tendency is to look around at the actions of others for clues.  We can learn from the way the other witnesses are reacting whether the event is or is not an emergency.  What is easy to forget, though, is that everybody else observing the event is likely to be looking for social evidence, too.  Because we all prefer to appear poised and unflustered among others, we are likely to search for that evidence placidly, with brief, camouflaged glances at those around us.  Therefore everyone is likely to see everyone else looking unruffled and failing to act.

Cialdini suggests that if you're ever in emergency need of help, you point to one single bystander and ask them for help - making it very clear to whom you're referring.  Remember that the total group, combined, may have less chance of helping than one individual.


Sunk Cost Fallacy

30 Z_M_Davis 12 April 2009 05:30PM

Related to: Just Lose Hope Already, The Allais Paradox, Cached Selves

In economics we have this concept of sunk costs, referring to costs that have already been incurred, but which cannot be recouped. Sunk cost fallacy refers to the fallacy of honoring sunk costs, which decision-theoretically should just be ignored. The canonical example goes something like this: you have purchased a nonrefundable movie ticket in advance. (For the nitpickers in the audience, I will also specify that the ticket is nontransferable and that you weren't planning on meeting anyone.) When the night of the show comes, you notice that you don't actually feel like going out, and would actually enjoy yourself more at home. Do you go to the movie anyway?

A lot of people say yes, to avoid wasting the ticket. But on further consideration, it would seem that these people are simply getting it wrong. The ticket is a sunk cost: it's already paid for, and you can't do anything with it but go to the movie. But we've stipulated that you don't want to go to the movie. The theater owners don't care whether you go; they already have their money. The other theater-goers, insofar as they can be said to have a preference, would actually rather you stayed home, making the theater marginally less crowded. If you go to the movie to satisfy your intuition about not wasting the ticket, you're not actually helping anyone. Of course, you're entitled to your values, if not your beliefs. If you really do place terminal value on using something because you've paid for it, well, fine, I guess. But we should all try to notice exactly what it is we're doing, in case it turns out not to be what we want. Please, think it through.
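The decision-theoretic point can be made concrete with made-up utility numbers (mine, purely for illustration): the ticket price appears in both branches of the decision, so subtracting it changes nothing, and only the outcomes still under your control matter.

```python
# Made-up utilities; the ticket money is unrecoverable in both branches.
ticket_price = 10
options = {
    "go to movie": 6,   # utility of going out tonight
    "stay home": 8,     # utility of staying in, as stipulated above
}

# Decision-theoretically, compare only what is still under your control:
best = max(options, key=options.get)
print(best)

# Subtracting the sunk cost from BOTH options leaves the choice unchanged,
# which is precisely why sunk costs should be ignored.
assert max(options, key=lambda k: options[k] - ticket_price) == best
```

Honoring the sunk cost amounts to adding the ticket price back into only one branch, which is an accounting error, not a preference.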

Dearest reader, if you're now about to scrap your intuition against wasting things, I implore you: don't! The moral of the parable of the movie ticket is not that waste is okay; it's that you should implement your waste-reduction interventions at a time when they can actually help. If you can anticipate your enthusiasm waning on the night of the show, don't purchase the nonrefundable ticket in the first place!


Maybe Theism Is OK -- Part 2

-6 byrnema 11 April 2009 06:32AM

In response to: The uniquely awful example of theism

And Maybe Theism Is OK

Finally, I think I understand where gim and others are coming from when they make statements that I thought represented overly intolerant views of religious belief. I think a good summary of the source of the initial difference in opinion is that while many people in this group aim to eliminate all sources of irrationality, I would like to pick and choose which sources of irrationality I keep, in the optimization of a different problem: general life-hacking.

Probably many people in this group believe that the best life-hack would be to eliminate irrationality. But I'm pretty sure this depends on the person (not everyone is suited for X-rationality), and I'm pretty sure -- though not certain -- that my best life-hack would include some irrationality.

Since my goals are different from those of this forum, many of my views are not relevant here, and there is no need to debate them.

Instead, I would like to present two arguments (1,2) for why it could be rational to hold an irrational belief, and two arguments (3,4) as to why someone could be more accepting of the existence of irrational beliefs (i.e., why not to hate them).

(1) It could be rational to hold an irrational belief if you are aware of your irrational belief and choose to hold it because it is grafted to components of your personality/ psyche that are valuable to you. For example, you may find that

  • eschewing your religious beliefs makes you feel depressed and unable to work productively;
  • your ability to control unwanted impulses is tied to a moral conscience that is inextricably bound up with beliefs about God;
  • your ability to perform an artistic activity that you enjoy is compartmentalized with spiritual beliefs.

I imagine these situations would be the result of an organically developing mind that has made several errors and is possibly unstable. But until we have a full understanding of mental processes/psychology/the physiology of emotions, we cannot expect a rational person to just "tough it out" to optimize rationality while his life falls apart.

Later added: This argument has since been described better, and with a better emphasis, by [this comment](http://lesswrong.com/lw/aq/how_much_thought/6zp).

(2) It could be rational to hold an irrational belief if you choose to hold it because you would like to exercise true control of your mind. Put another way, you may find it to be an aesthetic art of some form to choose a set of beliefs and truly believe them. Why would anyone want to do this? Eliminating all beliefs and becoming rational is a good exercise in controlling your mind. I hazard that a second exercise would be to believe what you consciously choose to.

(3) I think there is another reason to consciously choose to try to believe something that you don't believe rationally: true understanding of the enemy, of the source and the grip of an irrational thought. What irked me most about the negative comments about religious views was the lack of any empathy for those views. It may seem like a contradiction, but while I believe some religious views are irrational, I do not dismiss people who hold them as hopelessly irrational. With empathy, I believe that it is possible to hold religious views and not greatly compromise rationality.

(4) Maybe you are indeed right that any kind of religious view is irrational and that we would be better off without it. However, it is not at all clear that religious views can ever be completely exorcised... Suppose we wanted to create a world in which important parts of people's personalities are never tied to religious views. Are children allowed to daydream? Is a child allowed to daydream they are omnipotent? Are they allowed to pretend there is a God for a day? How will it affect creativity and motivation and development if there is no empathy for an understanding of God?

Awful Austrians

34 Swimmy 12 April 2009 06:06AM

Response to: The uniquely awful example of theism

Why is theism such an ever-present example of irrationality in this community? I think ciphergoth overstates the case. Even theism is not completely immune to evidence, as the acceptance of, say, evolution by so many denominations over time will testify. Theism is a useful whipping boy because it needs no introduction.

But I think the case is overstated for another reason. There are terrible epistemologies out there that are just as bad as theism's. Allow me to tell you a tale, of how I gave up my religion and my association with a school of economics at the same time.

I grew up in a southern Presbyterian church in the U.S. While I was taught standard pseudo-evidential defenses for belief, such as "creation science" and standard critiques of evolution, my church was stringently anti-evidentialist. Their preferred apologetic was something called presuppositionalism. It's certainly a minority apologetic among major defenders of Christianity today, especially compared to the cosmological or morality arguments. But it's a particularly rigorous attempt to defend beliefs against evidence nonetheless.

Presuppositionalism (in some forms) hangs on the problem of induction. We cannot ultimately justify any of our beliefs without first making some assumptions, otherwise we end in solipsism. Christianity, then, justifies itself not on evidence, but on internal consistency. It is ok for an argument to be ultimately circular, because all arguments are ultimately circular. Christianity alone maintains perfect worldview consistency when examined through this lens, and is therefore correct.

Since I've spent a lot of time thinking about this--it can take a considerable effort to change one's mind, after all--I can imagine innumerable things wrong with it, but they're not the focus of this entry. First, I just want to note how close it is to a kind of intro-level Bayesian understanding. Bayesians admit that we must have priors, that it's indeed nonsense to think we can even have an argument with one who doesn't. We must ultimately admit that certain justifications are going to be either recursive or based on priors. We believe that we should update our priors based on evidence, but there's nothing in the math that tells us we can't start with a prior for some position of 0% or 100%. (There is something in the math that tells us such probability assignments are very bad ideas, and we have more than enough cognitive bias literature that tells us we shouldn't be so damn overconfident. But then, what if you have a prior that keeps you from accepting such evidence?) It doesn't have any of the mathematical rigor, but it comes very close on a few major points.
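The parenthetical point about 0% and 100% priors falls straight out of Bayes' rule, and is worth showing directly (this sketch is my addition, not part of the post): the posterior multiplies the prior by the likelihood, so a prior of exactly 0 or 1 cannot be budged by any finite amount of evidence.

```python
def update(prior, likelihood_h, likelihood_not_h):
    """Posterior P(H|E) from prior P(H) and likelihoods P(E|H), P(E|~H)."""
    numerator = prior * likelihood_h
    return numerator / (numerator + (1 - prior) * likelihood_not_h)

# Evidence that is 9x more likely under H than under ~H:
print(update(0.5, 0.9, 0.1))   # an open-minded prior moves to 0.9
print(update(0.0, 0.9, 0.1))   # a dogmatic prior of 0 stays exactly 0
print(update(1.0, 0.1, 0.9))   # likewise, a prior of 1 stays exactly 1
```

This is the mathematical shape of the presuppositionalist's position: once a worldview is assigned probability 1, "examining the evidence" can never dislodge it.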


Toxic Truth

12 MichaelHoward 11 April 2009 11:25AM

For those who haven't heard about this yet, I thought this would be a good way to show the potentially insidious effect of biased, one-sided analysis and presentation of evidence under ulterior motives, and the importance of seeking out counter-arguments before accepting a point, even when the evidence being presented to support that point is true.

"[DHMO] has been a part of nature longer than we have; what gives us the right to eliminate it?" - Pro-DHMO web site.

DHMO (hydroxilic acid), commonly found in excised tumors and lesions of terminal lung and throat cancer patients, is a compound known to occur in second hand tobacco smoke. Prolonged exposure in solid form causes severe tissue damage, and a proven link has been established between inhalation of DHMO (even in small quantities) and several deaths, including many young children whose parents were heavy smokers.

It's also used as a solvent during the synthesis of cocaine, in certain forms of particularly cruel and unnecessary animal research, and has been traced to the distribution process of several cases of pesticides causing genetic damage and birth defects. But there are huge political and financial incentives to continue using the compound.

There have been efforts across the world to ban DHMO - an Australian MP has announced a campaign to ban it internationally - but little progress. Several online petitions to the British prime minister on this subject have been rejected. The executive director of the public body that operates Louisville Waterfront Park was actually criticised for posting warning signs on a public fountain that was found to contain DHMO. Jacqui Dean, New Zealand National Party MP, was similarly told "I seriously doubt that the Expert Advisory Committee on Drugs would want to spend any time evaluating that substance".

If you haven't guessed why, re-read my first sentence then click here.

HT the Coalition to Ban Dihydrogen Monoxide.

[Edit to clarify point:] I'm not saying truth is in any way bad. Truth rocks. I'm reminding you that truth is *not sufficient*. When it's offered treacherously or used recklessly, truth is as toxic as hydroxilic acid.

Follow-up to: Comment in The Forbidden Post.
