The Least Convenient Possible World
Related to: Is That Your True Rejection?
"If you’re interested in being on the right side of disputes, you will refute your opponents’ arguments. But if you’re interested in producing truth, you will fix your opponents’ arguments for them. To win, you must fight not only the creature you encounter; you must fight the most horrible thing that can be constructed from its corpse."
-- Black Belt Bayesian, via Rationality Quotes 13
Yesterday, John Maxwell's post wondered how much the average person would do to save ten people from a ruthless tyrant. I remember asking some of my friends a vaguely related question as part of an investigation of the trolley problem:
You are a doctor in a small rural hospital. You have ten patients, each of whom is dying for the lack of a separate organ; that is, one person needs a heart transplant, another needs a lung transplant, another needs a kidney transplant, and so on. A traveller walks into the hospital, mentioning how he has no family and no one knows that he's there. All of his organs seem healthy. You realize that by killing this traveller and distributing his organs among your patients, you could save ten lives. Would this be moral or not?
I don't want to discuss the answer to this problem today. I want to discuss the answer one of my friends gave, because I think it illuminates a very interesting kind of defense mechanism that rationalists need to be watching for. My friend said:
It wouldn't be moral. After all, people often reject organs from random donors. The traveller would probably be a genetic mismatch for your patients, and the transplantees would have to spend the rest of their lives on immunosuppressants, only to die within a few years when the drugs failed.
On the one hand, I have to give my friend credit: his answer is biologically accurate, and beyond a doubt the technically correct answer to the question I asked. On the other hand, I don't have to give him very much credit: he completely missed the point and missed a valuable opportunity to examine the nature of morality.
So I asked him, "In the least convenient possible world, the one where everyone was genetically compatible with everyone else and this objection was invalid, what would you do?"
He mumbled something about counterfactuals and refused to answer. But I learned something very important from him, and that is to always ask this question of myself. Sometimes the least convenient possible world is the only place where I can figure out my true motivations, or which step to take next. I offer three examples:
Kahneman's Planning Anecdote
Followup to: Planning Fallacy
From "Timid Choices and Bold Forecasts: A Cognitive Perspective on Risk Taking" by Nobel Laureate Daniel Kahneman and Dan Lovallo, in a discussion of "inside and outside views":
In 1976 one of us (Daniel Kahneman) was involved in a project designed to develop a curriculum for the study of judgment and decision making under uncertainty for high schools in Israel. When the team had been in operation for about a year, with some significant achievements already to its credit, the discussion at one of the team meetings turned to the question of how long the project would take. To make the debate more useful, I asked everyone to indicate on a slip of paper their best estimate of the number of months that would be needed to bring the project to a well-defined stage of completion: a complete draft ready for submission to the Ministry of Education. The estimates, including my own, ranged from 18 to 30 months.
At this point I had the idea of turning to one of our members, a distinguished expert in curriculum development, asking him a question phrased about as follows:
"We are surely not the only team to have tried to develop a curriculum where none existed before. Please try to recall as many such cases as you can. Think of them as they were in a stage comparable to ours at present. How long did it take them, from that point, to complete their projects?"