
Leave a Line of Retreat

Post author: Eliezer_Yudkowsky, 25 February 2008 11:57PM (51 points)

"When you surround the enemy
Always allow them an escape route.
They must see that there is
An alternative to death."
        —Sun Tzu, The Art of War, Cloud Hands edition

"Don't raise the pressure, lower the wall."
        —Lois McMaster Bujold, Komarr

Last night I happened to be conversing with a nonrationalist who had somehow wandered into a local rationalists' gathering.  She had just declared (a) her belief in souls and (b) that she didn't believe in cryonics because she believed the soul wouldn't stay with the frozen body.  I asked, "But how do you know that?"  From the confusion that flashed on her face, it was pretty clear that this question had never occurred to her.  I don't say this in a bad way—she seemed like a nice person with absolutely no training in rationality, just like most of the rest of the human species.  I really need to write that book.

Most of the ensuing conversation was on items already covered on Overcoming Bias—if you're really curious about something, you probably can figure out a good way to test it; try to attain accurate beliefs first and then let your emotions flow from that—that sort of thing.  But the conversation reminded me of one notion I haven't covered here yet:

"Make sure," I suggested to her, "that you visualize what the world would be like if there are no souls, and what you would do about that.  Don't think about all the reasons that it can't be that way, just accept it as a premise and then visualize the consequences.  So that you'll think, 'Well, if there are no souls, I can just sign up for cryonics', or 'If there is no God, I can just go on being moral anyway,' rather than it being too horrifying to face.  As a matter of self-respect you should try to believe the truth no matter how uncomfortable it is, like I said before; but as a matter of human nature, it helps to make a belief less uncomfortable, before you try to evaluate the evidence for it."

The principle behind the technique is simple:  As Sun Tzu advises you to do with your enemies, you must do with yourself—leave yourself a line of retreat, so that you will have less trouble retreating.  The prospect of losing your job, say, may seem a lot more scary when you can't even bear to think about it, than after you have calculated exactly how long your savings will last, and checked the job market in your area, and otherwise planned out exactly what to do next.  Only then will you be ready to fairly assess the probability of keeping your job in the planned layoffs next month.  Be a true coward, and plan out your retreat in detail—visualize every step—preferably before you first come to the battlefield.

The hope is that it takes less courage to visualize an uncomfortable state of affairs as a thought experiment, than to consider how likely it is to be true.  But then after you do the former, it becomes easier to do the latter.

Remember that Bayesianism is precise—even if a scary proposition really should seem unlikely, it's still important to count up all the evidence, for and against, exactly fairly, to arrive at the rational quantitative probability.  Visualizing a scary belief does not mean admitting that you think, deep down, it's probably true.  You can visualize a scary belief on general principles of good mental housekeeping.  "The thought you cannot think controls you more than thoughts you speak aloud"—this happens even if the unthinkable thought is false!
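The "count up all the evidence, for and against, exactly fairly" step has concrete arithmetic behind it: in log-odds form, each independent piece of evidence contributes the log of its likelihood ratio, and evidence for and against simply add and subtract. A minimal sketch in Python (the prior and likelihood ratios below are made-up numbers, purely for illustration):

```python
import math

def posterior_probability(prior, likelihood_ratios):
    """Combine a prior with independent pieces of evidence.

    Each likelihood ratio is P(evidence | hypothesis) divided by
    P(evidence | not-hypothesis); ratios above 1 count for the
    hypothesis, ratios below 1 count against it.
    """
    log_odds = math.log(prior / (1.0 - prior))
    for lr in likelihood_ratios:
        log_odds += math.log(lr)  # independent evidence just adds in log-odds
    odds = math.exp(log_odds)
    return odds / (1.0 + odds)

# Made-up numbers: a scary proposition with prior 0.1, one piece of
# evidence for it (LR = 3) and one against it (LR = 0.5).
p = posterior_probability(0.1, [3.0, 0.5])  # odds 1/9 * 3 * 0.5 = 1/6, so p = 1/7
```

The bookkeeping is identical whether or not the proposition is scary; the fear belongs to the visualization step, not to the sum.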

The leave-a-line-of-retreat technique does require a certain minimum of self-honesty to use correctly.

For a start:  You must at least be able to admit to yourself which ideas scare you, and which ideas you are attached to.  But this is a substantially less difficult test than fairly counting the evidence for an idea that scares you.  Does it help if I say that I have occasion to use this technique myself?  A rationalist does not reject all emotion, after all.  There are ideas which scare me, yet I still believe to be false.  There are ideas to which I know I am attached, yet I still believe to be true.  But I still plan my retreats, not because I'm planning to retreat, but because planning my retreat in advance helps me think about the problem without attachment.

But a greater test of self-honesty is to really accept the uncomfortable proposition as a premise, and figure out how you would really deal with it.  When we're faced with an uncomfortable idea, our first impulse is naturally to think of all the reasons why it can't possibly be so.  And so you will encounter a certain amount of psychological resistance in yourself, if you try to visualize exactly how the world would be, and what you would do about it, if My-Most-Precious-Belief were false, or My-Most-Feared-Belief were true.

Think of all the people who say that, without God, morality would be impossible.  (And yes, this topic did come up in the conversation; so I am not offering a strawman.)  If theists could visualize their real reaction to believing as a fact that God did not exist, they could realize that, no, they wouldn't go around slaughtering babies.  They could realize that atheists are reacting to the nonexistence of God in pretty much the way they themselves would, if they came to believe that.  I say this, to show that it is a considerable challenge to visualize the way you really would react, to believing the opposite of a tightly held belief.

Plus it's always counterintuitive to realize that, yes, people do get over things.  Newly minted quadriplegics are not as sad as they expect to be six months later, etc.  It can be equally counterintuitive to realize that if the scary belief turned out to be true, you would come to terms with it somehow.  Quadriplegics deal, and so would you.

See also the Litany of Gendlin and the Litany of Tarski.  What is true is already so; owning up to it doesn't make it worse.  You shouldn't be afraid to just visualize a world you fear.  If that world is already actual, visualizing it won't make it worse; and if it is not actual, visualizing it will do no harm.  And remember, as you visualize, that if the scary things you're imagining really are true—which they may not be!—then you would, indeed, want to believe it, and you should visualize that too; not believing wouldn't help you.

How many religious people would retain their belief in God, if they could accurately visualize that hypothetical world in which there was no God and they themselves have become atheists?

Leaving a line of retreat is a powerful technique, but it's not easy.  Honest visualization doesn't take as much effort as admitting outright that God doesn't exist, but it does take an effort.

(Meta note:  I'm posting this on the advice that I should break up long sequences of mathy posts with non-mathy posts.  (I was actually advised to post something "fun", but I'd rather not—it feels like I have too much important material to cover in the next couple of months.)  If anyone thinks that I should have, instead, gone ahead and posted the next item in the information-theory sequence rather than breaking it up; or, alternatively, thinks that this non-mathy post came as a welcome change; then I am interested in hearing from you in the comments.)

 

Part of the Letting Go subsequence of How To Actually Change Your Mind

Next post: "Crisis of Faith"

Previous post: "No One Can Exempt You From Rationality's Laws"

Comments (65)

Comment author: L._Zoel 26 February 2008 12:20:21AM 11 points [-]

How many rationalists would retain their belief in reason, if they could accurately visualize that hypothetical world in which there was no rationality and they themselves have become irrational?

Comment author: Origin64 05 November 2012 09:34:28PM 0 points [-]

I don't know. But I would. Irrationality is caused by ignorance, so there will always be tangent worlds (while regarding this current one as prime) in which I give up. There will always be a world where anything that is physically possible occurs. (and probably many where even that requirement doesn't hold)

To put it another way, there has been a moment in time when I was not rational. Is that reason to give up rationality forever? Time could be just another dimension, its manipulation as far out of our grasp as that of other possible worlds.

Comment author: faul_sname 12 November 2012 11:59:25PM 11 points [-]

if they could accurately visualize that hypothetical world in which there was no rationality and they themselves have become irrational?

I just attempted to visualize such a world, and my mind ran into a brick wall. I can easily imagine a world in which I am not perfectly rational (and in fact am barely rational at all), and that world looks a lot like this world. But I can't imagine a world in which rationality doesn't exist, except as a world in which no decision-making entities exist. Because in any world in which there exist better and worse options and an entity that can model those options and choose between them with better than random chance, there exists a certain amount of rationality.

Comment author: Benito 02 August 2013 08:42:39PM 0 points [-]

I suppose I'd just think about my life before I met LessWrong. I wouldn't choose that world.

Comment author: Voltairina 14 October 2014 02:13:04PM *  0 points [-]

Well, a world that lacked rationality might be one in which all the events were a sequence of non sequiturs. A car drives down the street. Then disappears. We are in a movie theater with a tyrannosaurus. Now we are a snail on the moon. Then there's just this poster of rocks. Then I can't remember what sight was like, but there's jazz music. Now I fondly remember fighting in World War II, while evading the Empire with Han Solo. Oh! I think I might be boiling water, but with a sense of smell somehow... that's a poor job of describing it -- too much familiar stuff -- but you get the idea. If there was no connection between one state of affairs and the next, talking about what strategy to take might be impossible, or a brief possibility that then disappears when you forget what you are doing and you're back in the movie theater again with the tyrannosaurus. If 'you' is even a meaningful way to describe a brief moment of awareness bubbling into being in that universe. Then again, if at any moment 'you' happen to exist and 'you' happen to understand what rationality means -- I guess now that I think about it, if there is any situation where you can understand what the word rationality means, it's probably one in which rationality exists (however briefly) and is potentially helpful to you; even if there is little useful to do about whatever situation you are in, there might be some useful thing to do about the troubling thoughts in your mind.

Comment author: CCC 14 October 2014 02:34:32PM 0 points [-]

While that is a world without rationality, it seems a fairly extreme case.

Another example of a world without rationality is a world in which, the more you work towards achieving a goal, the longer it takes to reach that goal; so an elderly man might wander distractedly up Mount Everest to look for his false teeth with no trouble, but a team of experienced mountaineers won't be able to climb a small hill. Even if they try to follow the old man looking for his teeth, the universe notices their intent and conspires against them. And anyone who notices this tendency and tries to take advantage of it gets struck by lightning (even if they're in a submarine at the time) and killed instantly.

Comment author: Voltairina 15 October 2014 12:46:08AM 1 point [-]

That reminds me of Hofstadter's Law: "It will always take longer than you think it is going to take. Even when you take into account Hofstadter's Law."

Comment author: Jotto999 27 November 2012 11:33:06AM 3 points [-]

I'm not sure what "no rationality" would mean. Evolutionarily relevant kinds of rationality can still be expected, like preference to sexually fertile mates, fearing spiders/snakes/heights, and if we're still talking about something at all similar to Homo Sapiens, language and cultural learning and such, which require some amounts of rationality to use.

I wonder if you might be imagining rationality in the form of essentialism, allowing you to universally turn the attribute off, but in reality there is no such off switch that is compatible with having decision-making agents.

Comment author: Yosarian2 21 January 2014 02:26:00AM 1 point [-]

That's not the idea that really scares Less Wrong people.

Here's a more disturbing one: try to picture a world where all the rationality skills you're learning on Less Wrong are actually somehow flawed, and actually make it less likely that you'll discover the truth, or make you correct less often, for whatever reason. What would that look like? Would you be able to tell the difference?

I must say, I have trouble picturing that, but I can't prove it's not true (we are basically tinkering with the way our mind works without a software manual, after all).

Comment author: SnappyCrunch 26 February 2008 12:32:58AM 1 point [-]

I enjoy the non-mathy posts. I believe Overcoming Bias is a worthy endeavor, and since this is a relatively new field of study, the math-oriented posts are important. They are often the most succinct and accurate way to convey concepts. With that said, I find the math posts to be dense with information, perhaps overly so. I find myself unconsciously starting to skim instead of read, and I find it difficult to force myself to pay attention.

The mathy posts appeal to people who are serious about moving this burgeoning field forward, and the non-mathy posts appeal to people who are more casually interested in the concepts, and allow you to have a wider audience. You will have a balance between the two no matter what you attempt, the only question is what your intended audience is, and the best way to reach those people.

Comment author: Kriti 26 February 2008 12:39:56AM 3 points [-]

I enjoy all posts here, but would love a post on what it means to be rational. Something introductory, something you can link to when you talk with people who think "if you can justify what someone did, no matter what the justification is, the action becomes rational".

Comment author: Thermopyle 26 February 2008 12:48:48AM 3 points [-]

then I am interested in hearing from you in the comments.

While I appreciate the mathy posts as well as I can, as someone without much training in mathematics I really enjoy these types of posts (I've got a large backlog of your more mathy posts bookmarked for me to work through, whereas your non-mathy posts I read as soon as they show up in my feed reader).

Let us have both!

Comment author: Caledonian2 26 February 2008 01:00:08AM 4 points [-]

The ability to endure cognitive dissonance long enough to find the resolution to the dissonance, rather than just short-circuiting to something that makes no sense but offers relief from the strain, is a necessary precondition for rational thought.

I don't think it can be cultivated, and I don't think there's a substitute. Either you pass through the gauntlet, or you don't.

Comment author: SecondWind 02 May 2013 02:27:37AM 0 points [-]

Couldn't you start with easier cognitive dissonances, and work your way up?

Comment author: Tiiba2 26 February 2008 01:01:27AM 1 point [-]

I just want you to get to that "revelation" of yours already. I thought you were approaching it, if you're talking about neural nets and arithmetic coding. Where does it rank in your schedule? Or is this blog for human reasoning only?

Comment author: Kellopyy 26 February 2008 01:04:02AM 0 points [-]

I was expecting to read yet another mathy post tonight, but I was disappointed. Less mathy stuff is OK, but shouldn't really come at the cost of anything interesting.

I agree with Kriti - introductory essay, post, etc would be useful.

Comment author: O2 26 February 2008 01:21:53AM 1 point [-]

I prefer the less mathy.

Comment author: brent 26 February 2008 01:26:46AM 1 point [-]

I too prefer less mathy - well, to be precise I'll actually read the less mathy stuff in the first place.

More to the point, I've stopped listening to news reports about global warming - and this is harming my ability to think rationally about it. I'll change the channel instead of hearing someone say "You know how we all thought we've got 50 years to live? Turns out it's only 30/25/20."

Comment author: Frank_Hirsch 26 February 2008 02:10:46AM 1 point [-]

[Without having read the comments]

WTF? You say: [...] I was actually advised to post something "fun", but I'd rather not [...]

I think it was fun!

BTW could we increase the probability of people being honest by basing reward not on individual choices, but on the log-likelihood over a sample of similar choices? (For a given meaning of similar.)

Comment author: Roland2 26 February 2008 02:26:21AM 0 points [-]

As a mathematician I like your mathy posts, but this is also very welcome for a reason: it contains practical advice. Some posts are of little direct practical use but this one certainly is.

Keep on the good work!

Comment author: Roland2 26 February 2008 02:28:46AM 0 points [-]

By "this is also very welcome" I'm referring to this post.

Comment author: Frank_Hirsch 26 February 2008 02:42:04AM 0 points [-]

[having read the comments]

Kriti et al: I'd recommend this and this to anybody who hasn't already read it. Otherwise I have not much idea for introductory texts right now.

Comment author: denis_bider 26 February 2008 03:47:28AM 0 points [-]

I think you should go with the advice and post something fun. Especially so if you have "much important material" to cover in following months. No need for a big hurry to lose readers. ;)

Comment author: denis_bider 26 February 2008 03:55:17AM 0 points [-]

I should however note that one of the last mathy posts (Mutual Information) struck a chord with me and caused an "Aha!" moment for which I am grateful.

Specifically, it was this:

I digress here to remark that the symmetry of the expression for the mutual information shows that Y must tell us as much about Z, on average, as Z tells us about Y. I leave it as an exercise to the reader to reconcile this with anything they were taught in logic class about how, if all ravens are black, being allowed to reason Raven(x)->Black(x) doesn't mean you're allowed to reason Black(x)->Raven(x). How different seem the symmetrical probability flows of the Bayesian, from the sharp lurches of logic - even though the latter is just a degenerate case of the former.

Insightful!
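The symmetry the quoted passage points at is easy to verify numerically: computing the mutual information of a joint distribution, then computing it again with the two variables' roles swapped, gives the same number. A small sketch (the joint distribution over Raven(x) and Black(x) below is invented purely for illustration):

```python
import math

def mutual_information(joint):
    """I(Y;Z) in bits, from a joint distribution given as {(y, z): p}."""
    py, pz = {}, {}
    for (y, z), p in joint.items():
        py[y] = py.get(y, 0.0) + p  # marginal of the first variable
        pz[z] = pz.get(z, 0.0) + p  # marginal of the second variable
    return sum(p * math.log2(p / (py[y] * pz[z]))
               for (y, z), p in joint.items() if p > 0)

# A made-up joint distribution in which Raven(x) -> Black(x) holds:
joint = {("raven", "black"): 0.10, ("raven", "not-black"): 0.00,
         ("not-raven", "black"): 0.20, ("not-raven", "not-black"): 0.70}
# Swapping the roles of the two variables gives the same mutual information:
swapped = {(z, y): p for (y, z), p in joint.items()}
```

The logical asymmetry (Raven(x)->Black(x) without the converse) shows up not in the mutual information but in the conditional probabilities: here P(black | raven) is 1 while P(raven | black) is only 1/3.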

Comment author: Benquo 26 February 2008 04:14:52AM 0 points [-]

I agree with SnappyCrunch.

Comment author: Ben_L. 26 February 2008 08:49:53AM 0 points [-]

I like non-mathy posts. I particularly enjoyed this one, as it seems to have a clear practical application.

Comment author: Ulrik 26 February 2008 08:58:03AM 0 points [-]

I liked this post, but then again, I like all your posts Eliezer! (I've just been hiding behind my feedreader, and so not commenting about it before.)

My opinion about mathy/non-mathy is that you should do what you think is most natural. Most days, you'll probably want to get on with the mathy exposition (and I am very much looking forward to the more advanced mathy posts), and then sprinkle in something lighter when the occasions to do so arise. For instance, I like that you based today's post on a recent discussion you had.

I believe this approach would be most conducive to interesting reading.

Comment author: Ben_Jones 26 February 2008 09:23:19AM 0 points [-]

'Newly minted quadriplegics'? What's more fun than that?

Don't worry too much about who wants what when. Like you say, it's all important stuff, and at a post a day no-one's going to complain about the odd vignette. Just keep up the good work.

Comment author: CarlShulman 26 February 2008 01:45:30PM 0 points [-]

When I saw the title I thought you were responding to this: http://www.overcomingbias.com/2008/02/more-moral-wigg.html

Comment author: The_Darkness 26 February 2008 01:55:50PM 0 points [-]

Thank GOD for non-mathy posts ;-)

Comment author: LG 26 February 2008 02:47:36PM 0 points [-]

There's a common literary technique used in most storytelling in which the author writes alternating "up" and "down" scenes -- it provides pacing and context; it also allows us time to digest the "up" scenes.

It seems to me that the technique is appropriate here -- it might be worth making a goal for yourself to write a mathy post, then to follow up with a post on the same topic but without any math in it at all, except maybe references to the previous post. That would be an interesting exercise for you, I think. It's supposed to be accessible work -- how accessible can you make it? Can you write about these mathy topics without numbers?

I don't know, but if you never try to do impossible things...

Comment author: Will_Pearson 26 February 2008 02:47:52PM 3 points [-]

There hasn't been much evidence of atheists forming groups that have the positive aspects that a church/synagogue/mosque holds in the social life of some humans. So you might forgive a theist pretending to be a rationalist for not holding the probability of this happening very high, and for believing that the world would lack said institutions and would be a worse place.

If rationalists truly want to get rid of religions, without getting rid of humans, we would have to ask ourselves, "What do humans get out of being part of a religion?" And then provide that through organisations.

And please no strawmen of the comfort of ignorance, I am talking about reassurance of being with people who are trying to hold the same goal system.

Comment author: Gordon_Worley 26 February 2008 03:13:25PM 0 points [-]

Eliezer,

You know that you can't succeed without the math, and slowing down for posts like this is taking away 24 hours that might have been better used to save humanity. Not that this was a bad post, but I think you would be better off letting others write the fun posts unless you need to write a fun post to recover from teaching.

Comment author: randy 26 February 2008 04:23:30PM 1 point [-]

Eliezer, this was a welcome relief from the long series of mathy posts.

Comment author: Unknown 26 February 2008 07:14:19PM 2 points [-]

Eliezer, suppose it turned out to be the case that:

1) God exists.

2) At some time in the future, tomorrow, for example, God comes to Eliezer Yudkowsky in order to announce His existence.

3) Not only does He announce His existence, but He is willing to have His existence and power tested, and passes every test.

4) He also asserts that according to Eliezer's CEV, although not according to his present knowledge, God's way of acting in the world is perfectly moral, even according to Eliezer's values.

How would you react to these events? Would you write a post about them on OB?

Comment author: Eliezer_Yudkowsky 26 February 2008 07:54:27PM 4 points [-]

Thanks for feedback, all! The consensus appears to favor leavening mathy posts with less mathy ones. I'll bear that in mind, though I make no promises - I do have my own agenda here.

---

Unknown, can't say I've ever thought of that one. I've considered how to kill or rewrite a Judeo-Christian type God, but not that particular scenario you've just described.

I think I would simply reply to number 4, "I don't believe that without an explanation." After all, just because an entity displays great power doesn't mean it will always tell you the truth.

You can't necessarily force me to consider believing number 4 because it involves a moral question and those are not subject to forced visualization (by this rule) in the way that factual scenarios are.

You can invent all kinds of Gods and demand that I visualize the case of their existence, or of their telling me various things, but you can't necessarily force me to visualize the case where I accept their statement that killing babies is a good idea - not unless you can argue it well enough to create a real moral doubt in my mind.

If I myself am in actual doubt on a moral question, then I can visualize it both ways without confusing myself; and then you can demand that I visualize it. But when I am not in doubt, trying to visualize the contrary has the same quality as trying to concretely visualize 2 + 2 = 3, only more so.

I can visualize a mind constructed so as to possess a different morality, of course; but that is not the same as identifying myself with that mind.

---

This reminds me of an item from a list of "horrible job interview questions" we once devised for SIAI:

Would you kill babies if it was intrinsically the right thing to do? Yes/No

If you circled "no", explain under what circumstances you would not do the right thing to do: ___________________

If you circled "yes", how right would it have to be, for how many babies? ___________________

Comment author: conchis 26 February 2008 08:05:18PM 1 point [-]

Alternatively, if you want something super scary, try 1), 2), and 3) without 4).

Comment author: Nick_Tarleton 26 February 2008 08:22:07PM 2 points [-]

I've considered how to kill or rewrite a Judeo-Christian type God

Please make this your next "fun" post. (Speaking of which, I enjoy the digression.)

You can't necessarily force me to consider believing number 4 because it involves a moral question and those are not subject to forced visualization (by this rule) in the way that factual scenarios are.

But "my CEV judges killing babies as good" (unlike "killing babies is good") is a factual proposition. You know what your current moral judgments are, but you can't be certain what the idealized Eliezer would think. You might justifiably judge repugnant volition too unlikely to bother imagining it, but exempt?

Comment author: PK 26 February 2008 08:35:44PM 0 points [-]

This reminds me of an item from a list of "horrible job interview questions" we once devised for SIAI:

Would you kill babies if it was intrinsically the right thing to do? Yes/No

If you circled "no", explain under what circumstances you would not do the right thing to do: ___________________

If you circled "yes", how right would it have to be, for how many babies? ___________________

What a horrible, horrible question. My answer is... what do you mean when you say "intrinsically the right thing to do"? The "right thing" according to whom? If it was the right thing according to an authority figure but I disagreed, I probably would not do it. If the circumstances were so extreme that I truly believed it's the right thing (e.g. not killing a baby results in the baby's death anyway + 1 million babies) then I would kill babies (assuming I could overcome my aversion to killing).

Actually, I don't really know how I would react. This is how I wish I would act. Calmly theorising in front of the computer, never having experienced circumstances remotely as awful, is not the same as being in those circumstances when the fear and dread overtake you. There would probably be a significant shift from what I consider and feel is "me" right now to the "me" I would become in that hypothetical situation.

Comment author: Tom_McCabe2 26 February 2008 08:52:44PM 1 point [-]

"This reminds me of an item from a list of "horrible job interview questions" we once devised for SIAI:"

Could you post these?

Comment author: Psy-Kosh 26 February 2008 09:46:22PM 0 points [-]

"I've considered how to kill or rewrite a Judeo-Christian type God"

Okay, now I'm curious what you've concluded with regards to that. :)

Probably not worth doing more than just talking 'bout it in comments, if that, unless you feel like doing a post on that just for fun.

But as far as this post, I also liked it. Useful to have actual suggestions for mental practices to practice to help one debias oneself.

Comment author: Joe_Marier 27 February 2008 12:05:39AM 1 point [-]

Why do the work of hypothesizing the world without God? It's not like Nietzsche, Sartre, Camus, Marx, Shaw, Derrida, etc. haven't done a much better job of it than me, because they were better philosophers than me. However, I also consider Aquinas to be the better philosopher than the aforementioned. Is that so unreasonable?

Comment author: Maribel_Hawkins 27 February 2008 01:21:50AM 0 points [-]

Thanks for reminding me of The Art of War from your quote. You might be interested in this great translation - http://www.sonshi.com/huynh.html

Comment author: Wendy_Collings 27 February 2008 02:27:16AM 0 points [-]

"The mathy posts appeal to people who are serious about moving this burgeoning field forward, and the non-mathy posts appeal to people who are more casually interested in the concepts" - (Snappycrunch)

Beware of mistaking mathematical thinking for rational thinking; math is a tool like any other, to be used rationally or irrationally. Nassim Taleb demonstrates this very well in his book "Fooled by Randomness".

There's nothing casual about being interested in the concepts of rational thinking; even the mathematically minded (who will naturally be more interested in the mathy posts) need the concepts to understand what framework to put the math into.

Comment author: TGGP4 27 February 2008 04:03:20AM 0 points [-]
Comment author: Mike5 27 February 2008 04:20:14AM 0 points [-]

How does one go about visualizing a world without souls? Or, rather, a world in which nobody believes in souls -- and how would this visualization have any bearing on "reality"? It seems like the thought experiment is really: What would I do if everything were the same except I didn't have a soul?

Comment author: Anna 27 February 2008 07:21:59AM 0 points [-]

Regardless of all previous posts.

I think you write better when you are expressing your beliefs and inner thoughts as opposed to the mathematical equation that leads you there.

“Do not dwell in the past, do not dream of the future, concentrate the mind on the present moment.”

Just a thought. Anna

Comment author: Ben_Jones 27 February 2008 09:56:14AM 1 point [-]

slowing down for posts like this is taking away 24 hours that might have been better used to save humanity.

Sarcasm? Humour? Sincerity?

I've considered how to kill or rewrite a Judeo-Christian type God

Please make this your next "fun" post.

Seconded!

Comment author: steven 27 February 2008 02:05:43PM 0 points [-]

I've considered how to kill or rewrite a Judeo-Christian type God

Obligatory Pascal: Ah, but what if there's a tiny chance that He's reading along to figure out our tactics?

Comment author: Eliezer_Yudkowsky 27 February 2008 04:48:57PM 2 points [-]

Steven: To kill or rewrite a Judeo-Christian God, obviously, the technique has to work even if the God can read your planning thoughts. It's a lot easier than dealing with an UFAI, though, because the Judeo-Christian God has anthropomorphic cognitive vulnerabilities and a considerable response time delay. ("You ate the apple?")

Naturally you prefer to rewrite the God if possible - shame to waste all that power.

Comment author: steven 27 February 2008 05:17:52PM 0 points [-]

Heh, so how do you know that it is not the case that this hypothetical JCG reads overcomingbias but not people's private thoughts?

Comment author: steven 27 February 2008 05:38:33PM 0 points [-]

(Of course as long as we're under these weird assumptions then not discussing tactics could be a fatal mistake too, etc etc)

Comment author: Paul_Gowder 27 February 2008 10:59:48PM 0 points [-]

I'm skeptical about the possibility of really carrying out this kind of visualization (or, more broadly, imaginary leap). Here's why.

I might be able to say that I can imagine the existence of a god, and what the world would be like if, say, it were the Christian one. But I can't imagine myself in that world -- in that world, I'm a different person. For in that world, either I hold the counterfactually true belief that there is such a god, or I don't. If I don't hold that belief, then my response to that world is the same as my response to this world. If I do hold it, well, how can I model that?

This point is related to a point that Eliezer made in the comments, that I think just absolutely nails the problem, for a narrower class of the true set of states for which the problem exists:

You can invent all kinds of Gods and demand that I visualize the case of their existence, or of their telling me various things, but you can't necessarily force me to visualize the case where I accept their statement that killing babies is a good idea - not unless you can argue it well enough to create a real moral doubt in my mind.

Exactly.

But I maintain that you can't model the existence of a God with the right properties (including omnipotence, omniscience, and omnibenevolence) without being able to model that acceptance.

And likewise, the woman who believed in the soul couldn't model her reaction to a world without a soul without being able to experience herself as a person who genuinely doesn't believe in a soul. But she can only have that experience by becoming such a person.

I think this is just a limitation of human psychology. Cf. Thomas Nagel's great article, What is it like to be a bat? The argument doesn't directly apply, but the intuition does.

Comment author: Mike_Blume 27 July 2008 09:00:32PM 4 points [-]

This reminds me of an item from a list of "horrible job interview questions" we once devised for SIAI:

Would you kill babies if it was intrinsically the right thing to do? Yes/No

If you circled "no", explain under what circumstances you would not do the right thing to do: I assume by "intrinsically right thing to do", you do not intend something straightforward like "here are five babies carrying a virus which, if left unchecked, will wipe out half the population of the planet. There is no means by which they can be quarantined, the virus can cross even the cold reaches of space. The only way to save us is to kill them". I assume rather, that you, Eliezer Yudkowsky, hand me a booklet, possibly hundreds of pages long. On page 0 are listed my most cherished moral truths, and on page N is written: "thus, it is right and decent to kill as many babies as possible, whenever the opportunity arises. Any man who walks past a mother pushing a stroller, and does not immediately throttle the infant where it lies, is nothing more than a moral coward." For all n between 1 and N inclusive, the statements on page n seem to me to follow naturally and self-evidently from my acceptance of the statements on page n-1. As I look up, astonishment etched on my face, I see you standing before me, grinning broadly. You hand me a long, curved blade, and tell me the staff of the SIAI are taking the afternoon off to raid the local nursery, and would I like to join?

Under these circumstances I would assign high probability to the idea that you are morally ill, and wish to murder infants for your own enjoyment. That somewhere in the proof you have given me is a logical error - the moral equivalent of dividing by zero. I would imagine, not that morality led me astray, but that my incomplete knowledge of morality led me not to spot this error. I would show the proof to as many moral philosophers as I could, ones whose intelligence and expertise in the field I respected, and held to be above my own, and who were initially as unenthusiastic as I am at the prospect of infanticide. I would ask them if they could point me to an error in the proof, and explain to me clearly and fully why this step, which had seemed so simple to me, is not a legal move in the dance at that point. If they could not explain this to me to my satisfaction, I would devote much of my time from then on to the study of morality so that I could better understand it, and until I could, would distrust any moral conclusions I came to on my own. If none of them could find an error, I would still assign high probability to the notion that somewhere in the proof is an error which we humans have not advanced sufficiently in the study of metamorality to discover. I would consider it one of the most important outstanding problems in the field, and would, again, distrust any major moral decisions which did not clearly add up to normality until it was solved.

Just as the mathematical "proof" that 2=1 would, if accepted, destroy the foundations of mathematics itself, and must therefore be doubted until we can discover its error, so your proof that killing babies is good would, if accepted, destroy the foundations of my morality, and so I must doubt it until I can find an error.

I am well aware that a fundamentalist could take my previous paragraph, replace "killing babies" with "oral sex" and thus make his prudery unassailable by argument. So much the worse for him, I say. If he considers the prohibition of a mutually beneficial and joyful act to be at the foundation of his morality, then he is a miserable creature and all my rationality will not save him from himself.

I have tried indirectly to answer your question. To answer it directly I will have to resort to what seems a paradox. I would not do "the right thing to do" if I know, at bottom, that it simply is not the right thing to do.

If you circled "yes", how right would it have to be, for how many babies? N/A

So, would I get the job?

Comment author: Eliezer_Yudkowsky 27 July 2008 09:44:38PM 0 points [-]

I would show the proof to as many moral philosophers as I could

Boy, I sure wouldn't. Ever read Cherniak's "The Riddle of the Universe and Its Solution"?

I am well aware that a fundamentalist could take my previous paragraph, replace "killing babies" with "oral sex" and thus make his prudery unassailable by argument. So much the worse for him, I say.

I sympathize, but I don't think that really solves the dilemma.

Comment author: Hopefully_Anonymous 27 July 2008 10:25:46PM 0 points [-]

Post what you want to post most. The advice that you should go against your own instincts and pander is bad, in my opinion. The only things you should force yourself to do are: (1) try to post something every day, and (2) try to edit and delete comments as little as possible. I believe the result will be an excellent and authentic blog with the types of readers you want most (and that are most useful to you).

Comment author: CarlShulman 28 July 2008 12:47:33AM 0 points [-]

Eliezer,

I think there is pretty overwhelming evidence that moral philosophers are almost never moved to do anything nearly so onerous and dangerous as killing babies by their moral views. See Unger, Singer, Parfit, etc.

Comment author: Raw_Power 31 October 2010 11:01:38AM 3 points [-]

That title confused me. I expected an article on how, when debating, it was better to leave the opponent a line of retreat so that they would not feel dialectically cornered and start panicking. Of course, along that line of retreat, your arguments would be waiting for them. Socrates apparently was a true master of this little dance. This is especially useful if you have a lot of time and you are trying to actually change the way your opponent thinks, rather than changing that of an audience.

Comment author: timtyler 23 April 2011 04:23:53PM *  0 points [-]

I am pretty sure that is what the term "leaving a line of retreat" in the context of an argument or disagreement should be used to refer to.

The meaning being proposed in this post is counter-intuitive. I classify it as being undesirable terminology.

Comment author: TheStevenator 13 December 2011 05:06:00AM 0 points [-]

Great post!

I think the greatest test of self-honesty (maybe it ties with honestly imagining the world you wish weren't real) would be admitting to yourself that the world looks an awful lot like the hypothetical world you just vividly imagined. I think if anyone who believes in god or homeopathy or what-have-you honestly imagined what the world would look like if their belief were wrong, and they had enough courage, they'd admit to themselves that the world looks a lot like that already.

Comment author: Origin64 05 November 2012 09:30:54PM 1 point [-]

You really should write a book. Seriously. I could probably propose teaching Rationality as a first-year course (as a follow-up to Logic) instead of the useless "password" classes I've received at my college. Having a book I could wave around to convince people that maybe being rational is important when you're a scientist would help a lot. At least I'd start printing and distributing it.

You could also just put the primary sequences of this website into a (e)book format, and release it. You might reach a wider audience that way, which would of course be Winning.

Comment author: thomblake 05 November 2012 09:44:49PM 0 points [-]

A serious book on Rationality has been in the works for some time.

Comment author: Nornagest 05 November 2012 09:49:57PM 0 points [-]

There's a couple of ebook versions of the Sequences floating around. I believe an official release is still in the works, but links to several unofficial ones may be found here.

Comment author: [deleted] 09 November 2012 10:11:57PM 1 point [-]

The trouble with the sequences is that each was written in the course of a day, and most were unrevised since then. They're obviously rich and interesting, but far from publishable material. The sequences meet every standard you could want for being insightful, but they fall far short of most standards of factual accuracy, organization, contact with contemporary discussions, etc.

Comment author: Origin64 05 November 2012 09:48:51PM 1 point [-]

The hope is that it takes less courage to visualize an uncomfortable state of affairs as a thought experiment, than to consider how likely it is to be true. But then after you do the former, it becomes easier to do the latter.

And again you manage to condense a wise life lesson to two sentences. I should really write them down.