
Crisis of Faith

Post author: Eliezer_Yudkowsky 10 October 2008 10:08PM

Followup to: Make an Extraordinary Effort, The Meditation on Curiosity, Avoiding Your Belief's Real Weak Points

"It ain't a true crisis of faith unless things could just as easily go either way."
       —Thor Shenkel

Many in this world retain beliefs whose flaws a ten-year-old could point out, if that ten-year-old were hearing the beliefs for the first time.  These are not subtle errors we are talking about.  They would be child's play for an unattached mind to relinquish, if the skepticism of a ten-year-old were applied without evasion. As Premise Checker put it, "Had the idea of god not come along until the scientific age, only an exceptionally weird person would invent such an idea and pretend that it explained anything."

And yet skillful scientific specialists, even the major innovators of a field, even in this very day and age, do not apply that skepticism successfully.  Nobel laureate Robert Aumann, of Aumann's Agreement Theorem, is an Orthodox Jew:  I feel reasonably confident in venturing that Aumann must, at one point or another, have questioned his faith.  And yet he did not doubt successfully.  We change our minds less often than we think.

This should scare you down to the marrow of your bones.  It means you can be a world-class scientist and conversant with Bayesian mathematics and still fail to reject a belief whose absurdity a fresh-eyed ten-year-old could see.  It shows the invincible defensive position which a belief can create for itself, if it has long festered in your mind.

What does it take to defeat an error which has built itself a fortress?

But by the time you know it is an error, it is already defeated.  The dilemma is not "How can I reject long-held false belief X?" but "How do I know if long-held belief X is false?"  Self-honesty is at its most fragile when we're not sure which path is the righteous one.  And so the question becomes:

How can we create in ourselves a true crisis of faith, that could just as easily go either way?

Religion is the trial case we can all imagine.  (Readers born to atheist parents have missed out on a fundamental life trial, and must make do with the poor substitute of thinking of their religious friends.)  But if you have cut off all sympathy and now think of theists as evil mutants, then you won't be able to imagine the real internal trials they face.  You won't be able to ask the question:

"What general strategy would a religious person have to follow in order to escape their religion?"

I'm sure that some, looking at this challenge, are already rattling off a list of standard atheist talking points—"They would have to admit that there wasn't any Bayesian evidence for God's existence", "They would have to see the moral evasions they were carrying out to excuse God's behavior in the Bible", "They need to learn how to use Occam's Razor—"

WRONG!  WRONG WRONG WRONG!  This kind of rehearsal, where you just cough up points you already thought of long before, is exactly the style of thinking that keeps people within their current religions.  If you stay with your cached thoughts, if your brain fills in the obvious answer so fast that you can't see originally, you surely will not be able to conduct a crisis of faith.

Maybe it's just a question of not enough people reading "Gödel, Escher, Bach" at a sufficiently young age, but I've noticed that a large fraction of the population—even technical folk—have trouble following arguments that go this meta.  On my more pessimistic days I wonder if the camel has two humps.

Even when it's explicitly pointed out, some people seemingly cannot follow the leap from the object-level "Use Occam's Razor!  You have to see that your God is an unnecessary belief!" to the meta-level "Try to stop your mind from completing the pattern the usual way!"  Because in the same way that all your rationalist friends talk about Occam's Razor like it's a good thing, and in the same way that Occam's Razor leaps right up into your mind, so too, the obvious friend-approved religious response is "God's ways are mysterious and it is presumptuous to suppose that we can understand them."  So for you to think that the general strategy to follow is "Use Occam's Razor", would be like a theist saying that the general strategy is to have faith.

"But—but Occam's Razor really is better than faith!  That's not like preferring a different flavor of ice cream!  Anyone can see, looking at history, that Occamian reasoning has been far more productive than faith—"

Which is all true.  But beside the point.  The point is that you, saying this, are rattling off a standard justification that's already in your mind.  The challenge of a crisis of faith is to handle the case where, possibly, our standard conclusions are wrong and our standard justifications are wrong.  So if the standard justification for X is "Occam's Razor!", and you want to hold a crisis of faith around X, you should be questioning if Occam's Razor really endorses X, if your understanding of Occam's Razor is correct, and—if you want to have sufficiently deep doubts—whether simplicity is the sort of criterion that has worked well historically in this case, or could reasonably be expected to work, etcetera.  If you would advise a religionist to question their belief that "faith" is a good justification for X, then you should advise yourself to put forth an equally strong effort to question your belief that "Occam's Razor" is a good justification for X.

(Think of all the people out there who don't understand the Minimum Description Length or Solomonoff Induction formulations of Occam's Razor, who think that Occam's Razor outlaws Many-Worlds or the Simulation Hypothesis.  They would need to question their formulations of Occam's Razor and their notions of why simplicity is a good thing.  Whatever X in contention you just justified by saying "Occam's Razor!", I bet it's not the same level of Occamian slam dunk as gravity.)

If "Occam's Razor!" is your usual reply, your standard reply, the reply that all your friends give—then you'd better block your brain from instantly completing that pattern, if you're trying to instigate a true crisis of faith.

Better to think of such rules as, "Imagine what a skeptic would say—and then imagine what they would say to your response—and then imagine what else they might say, that would be harder to answer."

Or, "Try to think the thought that hurts the most."

And above all, the rule:

"Put forth the same level of desperate effort that it would take for a theist to reject their religion."

Because, if you aren't trying that hard, then—for all you know—your head could be stuffed full of nonsense as ridiculous as religion.

Without a convulsive, wrenching effort to be rational, the kind of effort it would take to throw off a religion—then how dare you believe anything, when Robert Aumann believes in God?

Someone (I forget who) once observed that people had only until a certain age to reject their religious faith.  Afterward they would have answers to all the objections, and it would be too late.  That is the kind of existence you must surpass.  This is a test of your strength as a rationalist, and it is very severe; but if you cannot pass it, you will be weaker than a ten-year-old.

But again, by the time you know a belief is an error, it is already defeated.  So we're not talking about a desperate, convulsive effort to undo the effects of a religious upbringing, after you've come to the conclusion that your religion is wrong.  We're talking about a desperate effort to figure out if you should be throwing off the chains, or keeping them.  Self-honesty is at its most fragile when we don't know which path we're supposed to take—that's when rationalizations are not obviously sins.

Not every doubt calls for staging an all-out Crisis of Faith.  But you should consider it when:

  • A belief has long remained in your mind;
  • It is surrounded by a cloud of known arguments and refutations;
  • You have sunk costs in it (time, money, public declarations);
  • The belief has emotional consequences (note this does not make it wrong);
  • It has gotten mixed up in your personality generally.

None of these warning signs are immediate disproofs.  These attributes place a belief at risk for all sorts of dangers, and make it very hard to reject when it is wrong.  But they hold for Richard Dawkins's belief in evolutionary biology just as much as for the Pope's Catholicism.  This does not say that we are only talking about different flavors of ice cream.  Only the unenlightened think that all deeply-held beliefs are on the same level regardless of the evidence supporting them, just because they are deeply held.  The point is not to have shallow beliefs, but to have a map which reflects the territory.

I emphasize this, of course, so that you can admit to yourself, "My belief has these warning signs," without having to say to yourself, "My belief is false."

But what these warning signs do mark, is a belief that will take more than an ordinary effort to doubt effectively.  So that if it were in fact false, you would in fact reject it.  And where you cannot doubt effectively, you are blind, because your brain will hold the belief unconditionally.  When a retina sends the same signal regardless of the photons entering it, we call that eye blind.

When should you stage a Crisis of Faith?

Again, think of the advice you would give to a theist:  If you find yourself feeling a little unstable inwardly, but trying to rationalize reasons the belief is still solid, then you should probably stage a Crisis of Faith.  If the belief is as solidly supported as gravity, you needn't bother—but think of all the theists who would desperately want to conclude that God is as solid as gravity.  So try to imagine what the skeptics out there would say to your "solid as gravity" argument.  Certainly, one reason you might fail at a crisis of faith is that you never really sit down and question in the first place—that you never say, "Here is something I need to put effort into doubting properly."

If your thoughts get that complicated, you should go ahead and stage a Crisis of Faith.  Don't try to do it haphazardly, don't try it in an ad-hoc spare moment.  Don't rush to get it done with quickly, so that you can say "I have doubted as I was obliged to do."  That wouldn't work for a theist and it won't work for you either.  Rest up the previous day, so you're in good mental condition.  Allocate some uninterrupted hours.  Find somewhere quiet to sit down.  Clear your mind of all standard arguments, try to see from scratch.  And make a desperate effort to put forth a true doubt that would destroy a false, and only a false, deeply held belief.

Elements of the Crisis of Faith technique have been scattered over many posts:

  • Avoiding Your Belief's Real Weak Points—One of the first temptations in a crisis of faith is to doubt the strongest points of your belief, so that you can rehearse your good answers.  You need to seek out the most painful spots, not the arguments that are most reassuring to consider.
  • The Meditation on Curiosity—Roger Zelazny once distinguished between "wanting to be an author" versus "wanting to write", and there is likewise a distinction between wanting to have investigated and wanting to investigate.  It is not enough to say "It is my duty to criticize my own beliefs"; you must be curious, and only uncertainty can create curiosity.  Keeping in mind Conservation of Expected Evidence may help you Update Yourself Incrementally:  For every single point that you consider, and each element of new argument and new evidence, you should not expect your beliefs to shift more (on average) in one direction than another—thus you can be truly curious each time about how it will go.  (The identity behind this is written out just after this list.)
  • Cached Thoughts and Pirsig's Original Seeing, to prevent standard thoughts from rushing in and completing the pattern.
  • The Litany of Gendlin and the Litany of Tarski:  People can stand what is true, for they are already enduring it.  If a belief is true you will be better off believing it, and if it is false you will be better off rejecting it.  You would advise a religious person to try to visualize fully and deeply the world in which there is no God, and to, without excuses, come to the full understanding that if there is no God then they will be better off believing there is no God.  If one cannot come to accept this on a deep emotional level, they will not be able to have a crisis of faith.  So you should put in a sincere effort to visualize the alternative to your belief, the way that the best and highest skeptic would want you to visualize it.  Think of the effort a religionist would have to put forth to imagine, without corrupting it for their own comfort, an atheist's view of the universe.
  • Make an Extraordinary Effort, for the concept of isshokenmei, the desperate convulsive effort to be rational that it would take to surpass the level of Robert Aumann and all the great scientists throughout history who never let go of their religions.
  • The Genetic Heuristic:  You should be extremely suspicious if you have many ideas suggested by a source that you now know to be untrustworthy, but by golly, it seems that all the ideas still ended up being right.  (E.g., the one concedes that the Bible was written by human hands, but still clings to the idea that it contains indispensable ethical wisdom.)
  • The Importance of Saying "Oops"—it really is less painful to swallow the entire bitter pill in one terrible gulp.
  • Singlethink, the opposite of doublethink.  See the thoughts you flinch away from, that appear in the corner of your mind for just a moment before you refuse to think them.  If you become aware of what you are not thinking, you can think it.
  • Affective Death Spirals and Resist the Happy Death Spiral.  Affective death spirals are prime generators of false beliefs that it will take a Crisis of Faith to shake loose.  But since affective death spirals can also get started around real things that are genuinely nice, you don't have to admit that your belief is a lie, to try and resist the halo effect at every point—refuse false praise even of genuinely nice things.  Policy debates should not appear one-sided.
  • Hold Off On Proposing Solutions until the problem has been discussed as thoroughly as possible without proposing any; make your mind hold off from knowing what its answer will be; and try for five minutes before giving up, both generally, and especially when pursuing the devil's point of view.
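
A minimal formal statement of the Conservation of Expected Evidence point referenced in the list above (H is the belief, E the anticipated evidence; this is standard probability, not anything specific to the posts linked):

```latex
% The prior is the probability-weighted average of the posteriors:
\[
P(H) \;=\; P(H \mid E)\,P(E) \;+\; P(H \mid \neg E)\,P(\neg E)
\]
% Hence, before you see which way the evidence falls, the expected
% shift in belief is exactly zero:
\[
\mathbb{E}\big[\,P(H \mid \text{outcome})\,\big] \;-\; P(H) \;=\; 0
\]
```

That is why each new point can leave you genuinely curious: on average, you cannot predict the direction of the update.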

A number of other standard techniques are particularly relevant here as well.

But really there's rather a lot of relevant material, here and there on Overcoming Bias.  The Crisis of Faith is only the critical point and sudden clash of the longer isshokenmei—the lifelong uncompromising effort to be so incredibly rational that you rise above the level of stupid damn mistakes.  It's when you get a chance to use your skills that you've been practicing for so long, all-out against yourself.

I wish you the best of luck against your opponent.  Have a wonderful crisis!

 

Part of the Letting Go subsequence of How To Actually Change Your Mind

Next post: "The Ritual"

Previous post: "Leave a Line of Retreat"

Comments (148)

Comment author: RobinHanson 10 October 2008 10:40:06PM 8 points [-]

This is an unusually high quality post, even for you Eliezer; congrats!

Comment author: Bo 10 October 2008 11:04:14PM 3 points [-]

It seems that it takes an Eliezer-level rationalist to make an explicit account of what any ten-year-old can do intuitively. For those not quite Eliezer-level or not willing to put in the effort, this is really frustrating in the context of an argument or debate.

Comment author: roko3 10 October 2008 11:25:03PM 2 points [-]

I suspect that there are many people in this world who are, by their own standards, better off remaining deluded. I am not one of them; but I think you should qualify statements like "if a belief is false, you are better off knowing that it is false".

It is even possible that some overoptimistic transhumanists/singularitarians are better off, by their own standards, remaining deluded about the potential dangers of technology. You have the luxury of being intelligent enough to be able to utilize your correct belief about how precarious our continued existence is becoming. For many people, such a belief is of no practical benefit yet is psychologically detrimental.

This creates a "tragedy of the commons" type problem in global catastrophic risks: each individual is better off living in a fool's paradise, but we'd all be much better off if everyone faced up to the dangers of future technology.

Comment author: Anti-reductionist 10 October 2008 11:43:39PM -3 points [-]

Many in this world retain beliefs whose flaws a ten-year-old could point out

Very true. Case in point: the belief that "minimum description length" or "Solomonoff induction" can actually predict anything. Choose a language that can describe MWI more easily than Copenhagen, and they say you should believe MWI; choose a language that can describe Copenhagen more easily than MWI, and they say you should believe Copenhagen. I certainly could have told you that when I was ten...

Comment author: [deleted] 25 August 2009 10:30:16AM 5 points [-]

The argument in this post is precisely analogous to the following:

Bayesian reasoning cannot actually predict anything. Choose priors that result in the posterior for MWI being greater than that for Copenhagen, and it says you should believe MWI; choose priors that result in the posterior for Copenhagen being greater than that for MWI, and it says you should believe Copenhagen.

The thing is, though, choosing one's own priors is kind of silly, and choosing one's own priors with the purpose of making the posteriors be a certain thing is definitely silly. Priors should be chosen to be simple but flexible. Likewise, choosing a language with the express purpose of being able to express a certain concept simply is silly; languages should be designed to be simple but flexible.
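
To make the analogy concrete, here is a minimal sketch of that failure mode in code (the hypothesis labels and all the numbers are purely illustrative): when the likelihoods do not discriminate between the hypotheses, the posterior ordering is just the prior ordering echoed back.

```python
# Toy Bayesian update: with non-discriminating likelihoods, whichever
# prior you chose is exactly what the "conclusion" reports back.
def posteriors(prior_a: float, prior_b: float,
               likelihood_a: float, likelihood_b: float) -> tuple:
    """Return normalized posterior probabilities for two hypotheses."""
    unnormalized = (prior_a * likelihood_a, prior_b * likelihood_b)
    total = sum(unnormalized)
    return tuple(u / total for u in unnormalized)

# Same likelihoods, two different priors (illustrative numbers only):
print(posteriors(0.8, 0.2, 0.5, 0.5))  # (0.8, 0.2): "believe MWI"
print(posteriors(0.2, 0.8, 0.5, 0.5))  # (0.2, 0.8): "believe Copenhagen"
```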

Comment author: cousin_it 25 August 2009 10:38:47AM *  5 points [-]

It seems to me that you're waving the problem away instead of solving it. For example, I don't know of any general method for devising a "non-silly" prior for any given parametric inference problem. Analogously, what if your starting language accidentally contains a shorter description of Copenhagen than MWI?

Comment author: [deleted] 27 August 2009 12:56:02AM 1 point [-]

If you're just doing narrow AI, then look at your hypothesis that describes the world (e.g. "For any two people, they have some probability X of having a relationship we'll call P. For any two people with relationship P, every day, they have a probability Y of causing perception A."), then fill in every parameter (in this case, we have X and Y) with reasonable distributions (e.g. X and Y independent, each with a 1/3 chance of being 0, a 1/3 chance of being 1, and a 1/3 chance of being the uniform distribution).

Yes, I said "reasonable". Subjectivity is necessary; otherwise, everyone would have the same priors. Just don't give any statement an unusually low probability (e.g. a probability practically equal to zero that a certain physical constant is greater than Graham's number), nor any statement an unusually high probability (e.g. a 50% probability that Christianity is true). I think good rules are that the language your prior corresponds to should not have any atoms that can be described reasonably easily (perhaps 10 atoms or less) using only other atoms, and that every atom should be mathematically useful.

If the starting language accidentally contains a shorter description of Copenhagen than MWI? Spiffy! Assuming there is no evidence either way, Copenhagen will be more likely than MWI. Now, correct me if I'm wrong, but MWI is essentially the idea that the set of things causing wavefunction collapse is empty, while Copenhagen states that it is not empty. Supposing we end up with a 1/3 chance of MWI being true and a 2/3 chance that it's some other simple thing, is that really a bad thing? Your agent will end up designing devices that will work only if a certain subinterpretation of the Copenhagen interpretation is true and try them out. Eventually, most of the simple, easily-testable versions of the Copenhagen interpretation will be ruled out--if they are, in fact, false--and we'll be left with two things: unlikely versions of the Copenhagen interpretation, and versions of the Copenhagen interpretation that are practically identical to MWI.
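
A minimal sketch of the mixture prior described above, assuming the commenter's hypothetical parameters X and Y (nothing here is a standard construction; the 1/3-1/3-1/3 split is taken directly from the comment):

```python
import random

# Each parameter is 0 with probability 1/3, 1 with probability 1/3,
# and otherwise drawn uniformly from (0, 1), per the comment above.
def sample_parameter(rng: random.Random) -> float:
    branch = rng.random()
    if branch < 1 / 3:
        return 0.0
    if branch < 2 / 3:
        return 1.0
    return rng.uniform(0.0, 1.0)

rng = random.Random(0)
X = sample_parameter(rng)  # chance two people stand in relationship P
Y = sample_parameter(rng)  # chance P causes perception A on a given day
print(X, Y)
```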

(Do I get a prize for saying "e.g." so much?)

Comment author: Alicorn 27 August 2009 01:03:19AM 10 points [-]

(Do I get a prize for saying "e.g." so much?)

Yes. Here is an egg and an EEG.

Comment author: potato 13 August 2011 09:18:05AM *  0 points [-]

The minimum description length formulation doesn't allow for that at all. You are not allowed to pick whatever language you want, you have to pick the optimal code. If in the most concise code possible, state 'a' has a smaller code than state 'b', then 'a' must be more probable than 'b', since the most concise codes possible assign the smallest codes to the most probable states.

So if you wanna know what state a system is in, and you have the ideal (or close to ideal) code for the states in that system, the probability of that state will be strongly inversely correlated with the length of the code for that state.
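
As a quick illustration of that inverse relationship, here is a minimal sketch (the four-state distribution is invented for the example): an optimal prefix code assigns state x a codeword of roughly -log2(p(x)) bits, so a code length can be read back as an implied probability.

```python
import math

# Shannon code lengths for a toy four-state system: the most probable
# states get the shortest codewords.
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

for state, p in probs.items():
    bits = -math.log2(p)  # ideal code length in bits
    print(f"state {state}: p = {p:<6} ideal code length = {bits:.0f} bits")
    # Reading the code back as a prior: probability implied by length.
    assert abs(2 ** -bits - p) < 1e-12
```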

Comment author: Oscar_Cunningham 13 August 2011 11:31:20AM 1 point [-]

Aren't you circularly basing your code on your probabilities but then taking your priors from the code?

Comment author: potato 13 August 2011 12:07:52PM 0 points [-]

Yep, but that's all the proof shows: the more concise your code, the stronger the inverse correlation between the probability of a state and the code length of that state.

Comment author: paper-machine 13 August 2011 12:14:18PM *  2 points [-]

You are not allowed to pick whatever language you want, you have to pick the optimal code. If in the most concise code possible, state 'a' has a smaller code than state 'b', then 'a' must be more probable than 'b', since the most concise codes possible assign the smallest codes to the most probable states.

I haven't read anything like this in my admittedly limited readings on Solomonoff induction. Disclaimer: I am only a mere mathematician in a different field, and have only read a few papers surrounding Solomonoff.

The claims I've seen revolve around "assembly language" (for some value of assembly language) being sufficiently simple that any biases inherent in the language are small (some people claim constant multiple on the basis that this is what happens when you introduce a symbol 'short-circuiting' a computation). I think a more correct version of Anti-reductionist's argument should run, "we currently do not know how the choice of language affects SI; it is conceivable that small changes in the base language imply fantastically different priors."

I don't know the answer to that, and I'd be very glad to know if someone has proved it. However, I think it's rather unlikely that someone has proved it, because 1) I expect it will be disproven (on the basis that model-theoretic properties tend to be fragile), and 2) given the current difficulties in explicitly calculating SI, finding an explicit, non-trivial counter-example would probably be difficult.

Note that

Choose a language that can describe MWI more easily than Copenhagen, and they say you should believe MWI; choose a language that can describe Copenhagen more easily than MWI, and they say you should believe Copenhagen.

is not such a counter-example, because we do not know if "sufficiently assembly-like" languages can be chosen which exhibit such a bias. I don't think the above thought-experiment is worth pursuing, because I don't think we even know a formal (on the level of assembly-like languages) description of either CI or MWI.

Comment author: potato 13 August 2011 12:36:57PM *  0 points [-]

Not Solomonoff, minimum description length, I'm coming from an information theory background, I don't know very much about Solomonoff induction.

Comment author: paper-machine 13 August 2011 12:39:08PM *  0 points [-]

OP is talking about Solomonoff priors, no? Is there a way to infer on minimum description length?

Comment author: potato 13 August 2011 12:42:01PM 0 points [-]

What is OP?

Comment author: Vladimir_Nesov 13 August 2011 12:47:09PM 0 points [-]

EY

Comment author: paper-machine 13 August 2011 12:49:03PM 0 points [-]

I meant Anti-reductionist, the person potato originally replied to... I suppose grandparent would have been more accurate.

Comment author: potato 13 August 2011 12:52:08PM 0 points [-]

He was talking about both.

the belief that "minimum description length" or "Solomonoff induction" can actually predict anything

Comment author: paper-machine 13 August 2011 12:56:20PM 1 point [-]

So how do you predict with minimum description length?

Comment author: g 10 October 2008 11:47:19PM 6 points [-]

Bo, the point is that what's most difficult in these cases isn't the thing that the 10-year-old can do intuitively (namely, evaluating whether a belief is credible, in the absence of strong prejudices about it) but something quite different: noticing the warning signs of those strong prejudices and then getting rid of them or getting past them. 10-year-olds aren't specially good at that. Most 10-year-olds who believe silly things turn into 11-year-olds who believe the same silly things.

Eliezer talks about allocating "some uninterrupted hours", but for me a proper Crisis of Faith takes longer than that, by orders of magnitude. If I've got some idea deeply embedded in my psyche but am now seriously doubting it (or at least considering the possibility of seriously doubting it), then either it's right after all (in which case I shouldn't change my mind in a hurry) or I've demonstrated my ability to be very badly wrong about it despite thinking about it a lot. In either case, I need to be very thorough about rethinking it, both because that way I may be less likely to get it wrong and because that way I'm less likely to spend the rest of my life worrying that I missed something important.

Yes, of course, a perfect reasoner would be able to sit down and go through all the key points quickly and methodically, and wouldn't take months to do it. (Unless there were a big pile of empirical evidence that needed gathering.) But if you find yourself needing a Crisis of Faith, then ipso facto you *aren't* a perfect reasoner on the topic in question.

Wherefore, I at least don't have the *time* to stage a Crisis of Faith about every deeply held belief that shows signs of meriting one.

I think there would be value in some OB posts about resource allocation: deciding which biases to attack first, how much effort to put into updating which beliefs, how to prioritize evidence-gathering versus theorizing, and so on and so forth. (We can't Make An Extraordinary Effort every single time.) It's a very important aspect of practical rationality.

Comment author: Hopefully_Anonymous 10 October 2008 11:50:35PM -1 points [-]

Some interesting, useful stuff in this post. Minus the status-cocaine of declaring that you're smarter than Robert Aumann about his performed religious beliefs and the mechanics of his internal mental state. In that area, I think Michael Vassar's model for how nerds interpret the behavior of others is your God. There's probably some 10 year olds that can see through it (look everybody, the emperor has no conception that people can believe one thing and perform another). Unless this is a performance on your part too, and there's shimshammery all the way down!

Comment author: Roland2 10 October 2008 11:51:09PM 4 points [-]

"How do I know if long-held belief X is false?"

Eliezer, I guess if you already are asking this question you are well on your way. The real problem arises when you didn't even manage to pinpoint the possibly false belief. And yes, I was a religious person for many years before realizing that I was on the wrong path.

Why didn't I question my faith? Well, it was so obviously true to me. The thing is: did you ever question heliocentrism? No? Why not? When you ask the question "How do I know if heliocentrism is false?" you are already on your way. The thing is, your brain needs a certain amount of evidence to pinpoint the question.

How did I overcome my religion? I noticed that something was wrong with my worldview, like seeing a déjà vu in the Matrix every now and then. This on an intellectual level, not as a visible thing, but much more subtle and less obvious, so you really have to be attentive to notice it, to notice that there is a problem in the pattern. Things aren't the way they should be.

But over time I became more and more aware that the pieces weren't fitting together. But from there to arrive at the conclusion that my basic assumptions were wrong was really not easy. If you live in the Matrix and see strange things happening, how will you arrive at the conclusion that this is because you are in a simulation?

Your posts on rationality were a big help, though. They always say: "Jesus will make you free." Unfortunately that didn't work out for me. Well, I finally am free after a decade of false believing, and in all the time I was a believer I never was as happy as I am now.

Comment author: PK 10 October 2008 11:51:26PM 0 points [-]

Good post but this whole crisis of faith business sounds unpleasant. One would need Something to Protect to be motivated to deliberately venture into this masochistic experience.

Comment author: Vladimir_Nesov 11 October 2008 01:00:15AM 1 point [-]

All these posts present techniques for applying a simple principle: check every step on the way to your belief. They adapt this principle to be more practically useful, allowing a person to start on the way lacking necessary technical knowledge, to know which errors to avoid, which errors come with being human, where not to be blind, which steps to double-check, what constitutes a step and what a map of a step, and so on. All the techniques should work in background mode, gradually improving the foundations, propagating the consequences of the changes to more and more dearly held beliefs, shifting the focus of inquiry.

Crisis of faith finds a target to attack, boosting the priority of checking the foundations of a specific belief. I'm not sure how useful forcing this process could be; major shifts in defining beliefs take time, and probably deservedly so. The effects of a wrong belief should be undone by the holes in the network supporting those beliefs, not by an executive decision declaring the belief wrong. Even though the executive decision is based on the same grounds, it's hard to move more than one step of inferential distance without shooting yourself in the foot, before you train yourself to intuitively perceive the holes, or rather the repaired fabric. So I guess the point of the exercise is in making the later gradual review more likely to seriously consider the evidence, to break the rust, not in changing the outlook overnight. Changing the outlook is the natural conclusion of a long road; it doesn't take you by surprise. One day you just notice the old outlook to be dead, and so leave it in the past.

Comment author: Cyan2 11 October 2008 02:05:53AM 0 points [-]

Fact check: MDL is not Bayesian. Done properly, it doesn't even necessarily obey the likelihood principle. Key term: normalized maximum likelihood distribution.

Comment author: Doug_S. 11 October 2008 02:17:00AM 2 points [-]

My father is an atheist with Jewish parents, and my mother is a (non-practicing) Catholic. I was basically raised "rationalist", having grown up reading my father's issues of Skeptical Inquirer magazine. I find myself in the somewhat uncomfortable position of admitting that I acquired my belief in "Science and Reason" in pretty much the same way that most other people acquire their religious beliefs.

I'm pretty sure that, like everyone else, I've got some really stupid beliefs that I hold too strongly. I just don't know which ones they are!

Comment author: Bob_Unwin13 11 October 2008 05:00:11AM 0 points [-]

Great post. I think that this sort of post on rationality is extremely valuable. While one can improve everyday judgment and decision making by learning about rationality from philosophy, econ and statistics, I think that these informal posts can also make a significant difference to people.

The recent posts on AI theorists and EY's biography were among my least favorite on OB. If you have a choice, please spend more time on either technical sequences (e.g. stuff on concepts/concept space, evolutionary bio, notion of bias in statistics) or stuff on rationality like this.

Comment author: Brandon_Reinhart 11 October 2008 05:53:25AM 2 points [-]

A good reminder. I've recently been studying anarcho-capitalism. It's easy to get excited about a new, different perspective that has some internal consistency and offers alternatives to obvious existing problems. Best to keep these warnings in mind when evaluating new systems, particularly when they have an ideological origin.

Comment author: Normal_Anomaly 17 January 2011 02:59:12PM *  0 points [-]

EDIT: This comment is redacted.

Replace "anarcho-capitalism" with "singularitarianism" and that's the experience I'm having. It's not so much wondering if a long-held belief is false as wondering if the new belief I'm picking up is false.

Comment author: mtraven 11 October 2008 07:32:08AM 1 point [-]

"Try to think the thought that hurts the most."

This is exactly why I like to entertain religious thoughts. My background, training, and inclination are to be a thoroughgoing atheist materialist, so I find that trying to make sense of religious ideas is good mental exercise. Feel the burn!

In that vein, here is an audio recording of Robert Aumann speaking on "The Personality of God".

Also, the more seriously religious had roughly the same idea, or maybe it's the opposite idea. The counterfactuality of religious ideas is part of their strength, apparently.

Comment author: MichaelG 11 October 2008 08:15:10AM 1 point [-]

Here's a doubt for you: I'm a nerd, I like nerds, I've worked on technology, and I've loved techie projects since I was a kid. Grew up on SF, all of that.

My problem lately is that I can't take Friendly AI arguments seriously. I do think AI is possible, that we will invent it. I do think that at some point in the next hundreds of years, it will be game over for the human race. We will be replaced and/or transformed.

I kind of like the human race! And I'm forced to conclude that a human race without that tiny fraction of nerds could last a good long time yet (tens of thousands of years) and would change only slowly, through biological evolution. They would not do much technology, since it takes nerds (in the broadest sense) to do this. But, they would still have fulfilling, human, lives.

On the other hand, I don't think a human race with nerds can forever avoid inventing a self-destructive technology like AI. As much as I have been brought up to think of politicians and generals as destroyers, and scientists and other nerds as creators, I have to admit that it's the other way around, ultimately.

The non-nerds can't destroy the human race. Only we nerds can do that.

That's my particular crisis of faith. Care to take a side?

Comment author: JohnH 22 April 2011 06:56:49PM 3 points [-]

The non-nerds can't destroy the human race.

Have you ever heard of the term hubris?

If you can't imagine ways in which the human race can be destroyed by non-nerds, that shows a lack of imagination, not that it can not be done. Also, it isn't like nerds and non-nerds are actually a different species; people that do not have a natural aptitude for a subject are still capable of learning the subject. If nerds all moved to nerdtopia, other people would study what material there was on the subject and attempt to continue on. If this is not possible, then you have applied the term nerd too broadly, such that it contains the majority of people, and all that would be left are people that are incapable of fully taking care of themselves without some form of outside assistance, and would thus destroy the human race by sheer ineptitude at basic survival skills.

Comment author: Will_Pearson 11 October 2008 09:00:52AM 1 point [-]

I'd be interested in a list of questions you had decided to have a crisis of faith over. If I get round to it I might try and have one over whether a system can recursively self-improve in a powerful way or not.

Comment author: Michael_Rooney 11 October 2008 09:57:02AM -2 points [-]

A lot of truths in EY's post. Though I also agree with Hopefully Anon's observations -- as is so often the case, Eliezer reminds me of Descartes -- brilliant, mathematical, uncowed by dogma, has his finger on the most important problems, is aware of how terrifyingly daunting those problems are, thinks he has a universal method to solve those problems.

Comment author: Caledonian2 11 October 2008 11:34:22AM 0 points [-]

Trying to set up an artificial crisis in which one outcome is as likely as another is a very bad idea.

If your belief is rationally unjustifiable, a 'crisis' in which one has only a fifty-fifty chance of rejecting the belief is not an improvement in rationality. Such a crisis is nothing more than picking a multiple-choice answer at random -- and with enough arbitrarily-chosen options, the chance of getting the *right* one becomes arbitrarily small.

A strategy that actually works is setting your specific beliefs aside and returning to a state of uncertainty, then testing one possibility against the other on down to first principles. Uncertainty != each possibility equally likely.

Comment author: tut 18 June 2009 07:06:48AM *  1 point [-]

Uncertainty != each possibility equally likely

I think he meant that each possibility appears equally likely before you look at the evidence. Basically reset your prior, if that were possible.

Comment author: Paul_Rhodes 11 October 2008 12:35:26PM 1 point [-]

Thank you for this post, Eliezer. I must painfully question my belief that a positive Singularity is likely to occur in the foreseeable future.

Comment author: Manon_de_Gaillande 11 October 2008 01:34:06PM 1 point [-]

Nazir Ahmad Bhat, you are missing the point. It's not a question of identity, like which ice cream flavor you prefer. It's about truth. I do not believe there is a teapot orbiting around Jupiter, for the various reasons explained on this site (see _Absence of evidence is evidence of absence_ and the posts on Occam's Razor). You may call this a part of my identity. But I don't need people to believe in a teapot. Actually, I want everyone to know as much as possible. Promoting false beliefs is harming people, like slashing their tires. You don't believe in a flying teapot: do you need other people to?

Comment author: Paul_Rhodes 11 October 2008 01:42:06PM 1 point [-]

Nazir, must there be atheists in order for you to believe in a god? The "identity" of those who believe that the world is round does not depend on others believing that the world is flat, or vice versa. Truth does not require disagreement.

Comment author: Matthew_C.2 11 October 2008 02:20:07PM -3 points [-]

Excellent post, Eliezer. Along with your comments on MR about the financial crisis, definitely good stuff worth reading.

I would submit that, for you, the belief you are unable to question is materialistic reductionism. I would suggest reading Irreducible Mind which will acquaint you with a great deal of evidence that reality is different from the current model of it you hold in your mind. I would suggest that you begin with chapter 3 which presents a vast body of observational and research evidence from medicine that simply doesn't fit into your current belief system. Start with the introduction, read the entire introduction (which is very good and fits with many of the more conceptual posts you have made here about avoiding pitfalls along the path of rationality), and then read chapter 3 about empirical findings of the relationship between mind and body.

Comment author: Carl_Shulman 11 October 2008 03:06:17PM 2 points [-]

Matthew C.,

You've been suggesting that for a while:

http://www.overcomingbias.com/2007/01/godless_profess.html#comment-27993437 http://www.overcomingbias.com/2008/09/psychic-powers.html#comment-130445874

Those who have read it (or the hundreds of pages available on Google Books, which I have examined) don't seem to be impressed.

Why do you think it's better than Broderick's book? If you want to promote it more effectively in the face of silence (http://www.overcomingbias.com/2007/02/what_evidence_i.html), why not pay for a respected reviewer's time and a written review (in advance, so that you're not accused of bribing to ensure a favorable view)? Perhaps from a statistician?

Comment author: Vladimir_Gritsenko 11 October 2008 03:21:48PM 0 points [-]

Do these methods actually work? There were a few posts here on how more evidence and bias awareness don't actually change minds or reduce bias, at least not without further effort. Can a practical "Deduce the Truth in 30 Days" guide be derived from these methods, and change the world?

Comment author: Caledonian2 11 October 2008 05:08:58PM 1 point [-]

A fifty-fifty chance of choosing your previous belief does not constitute a reasonable test. If your belief is unreasonable, why would treating it as equally plausible as the alternative be valid?

The trick is to suspend belief and negate the biasing tendencies of belief when you re-evaluate, not to treat all potentials as equal.

Comment author: Phil_Goetz4 11 October 2008 05:48:03PM 3 points [-]

Eliezer:

If a belief is true you will be better off believing it, and if it is false you will be better off rejecting it.

I think you should try applying your own advice to this belief of yours. It is usually true, but it is certainly not always true, and reeks of irrational bias.

My experience with my crisis of faith seems quite opposite to your conceptions. I was raised in a fundamentalist family, and I had to "make an extraordinary effort" to keep believing in Christianity from the time I was 4 and started reading through the Bible, and finding things that were wrong; to the time I finally "came out" as a non-Christian around the age of 20. I finally gave up being Christian only when I was worn out and tired of putting forth such an extraordinary effort.

So in some cases your advice might do more harm than good. A person who is committed to making "extraordinary efforts" concerning their beliefs is more likely to find justifications to continue to hold onto their belief, than is someone who is lazier, and just accepts overwhelming evidence instead of letting it kick them into an "extraordinary effort." In other words, you are advocating a combative, Western approach; I am bringing up a more Eastern approach, which is not to be so attached to anything in the first place, but to bend if the wind blows hard enough.

Comment author: [deleted] 17 January 2011 03:31:22PM 4 points [-]

Agreed.

Every time I changed my mind about something, it felt like "quitting," like ceasing the struggle to come up with evidence for something I wanted to be true but wasn't. Realizing "It's so much easier to give up and follow the preponderance of the evidence."

Examples: taking an economics class made it hard to believe that government interventions are mostly harmless. Learning about archaeology and textual analysis made it hard to believe in the infallibility of the Bible. Hearing cognitive science/philosophy arguments made it hard to believe in Cartesian dualism. Reading more papers made it hard to believe that looking at the spectrum of the Laplacian is a magic bullet for image processing. Extensive conversations with a friend made it hard to believe that I was helping him by advising him against pursuing his risky dreams.

When something's getting hard to believe, consider giving up the belief. Just let the weight fall. Be lazy. If you're working hard to justify an idea, you're probably working too hard.

Comment author: JohnH 22 April 2011 07:29:52PM 0 points [-]

One of the problems with your examples in both economics and archeology is that less is known on the subject than what you think is known, especially if you have just taken introductory courses on the subject.

Comment author: wallowinmaya 17 May 2011 07:21:02PM 5 points [-]

From "Twelve virtues of rationality" by Eliezer:

The third virtue is lightness. Let the winds of evidence blow you about as though you are a leaf, with no direction of your own. Beware lest you fight a rearguard retreat against the evidence, grudgingly conceding each foot of ground only when forced, feeling cheated. Surrender to the truth as quickly as you can. Do this the instant you realize what you are resisting; the instant you can see from which quarter the winds of evidence are blowing against you.

Eliezer uses almost the same words as you do. (Oh, and this document is from 2006, so he has not copied your lines.) Some posts earlier, Eliezer accused you of not reading his writings and just making stuff up regarding his viewpoints...

Comment author: Kenny 10 June 2013 12:13:21PM 0 points [-]

The posts on making an extraordinary effort didn't explicitly exclude preserving the contents of one's beliefs as an effort worth being made extraordinarily, so you've definitely identified a seeming loophole, and yet you've simultaneously seemed to ignore all of the other posts about epistemic rationality.

Comment author: Kaj_Sotala 11 October 2008 06:35:26PM 1 point [-]

MichaelG:

On the other hand, I don't think a human race with nerds can forever avoid inventing a self-destructive technology like AI.

The idea is that if we invent Friendly AI *first*, it will become powerful enough to keep later, Unfriendly ones in check (either alone, or with several other FAIs working together with humanity). You don't need to avoid inventing one forever: it's enough to avoid inventing one as the first thing that comes up.

Comment author: Phil_Goetz4 11 October 2008 07:59:32PM 0 points [-]

If a belief is true you will be better off believing it, and if it is false you will be better off rejecting it.

It is easy to construct at least these 2 kinds of cases where this is false:

  • You have a set of beliefs optimized for co-occurrence, and you are replacing one of these beliefs with a more-true belief. In other words, the new true belief will cause you harm because of other untrue (or less true) beliefs that you still hold.
  • If an entire community can be persuaded to adopt a false belief, it may enable them to overcome a tragedy-of-the-commons or prisoners'-dilemma situation.

If you still aren't convinced whether you are always better-off with a true belief, ask yourself whether you have ever told someone else something that was not quite true, or withheld a truth from them, because you thought the full truth would be harmful.

Comment author: Will_Pearson 11 October 2008 08:25:26PM 2 points [-]

I was raised in a Christian family, fairly liberal Church of England, and my slide into agnosticism started when I was about 5-7, when I asked if Santa Claus and God were real. I refused to get confirmed and stopped going to church when I was 13ish, I think.

In other words, you are advocating a combative, Western approach; I am bringing up a more Eastern approach, which is not to be so attached to anything in the first place, but to bend if the wind blows hard enough.

The trouble is that you cannot break new ground this way. You can't do Einstein-like feats. You should follow the direction of the wind, but engage nitrous to follow that direction, occasionally stopping and sticking a finger out the window to make sure you are going the right direction.

Comment author: steven 11 October 2008 08:32:05PM -1 points [-]

If an entire community can be persuaded to adopt a false belief, it may enable them to overcome a tragedy-of-the-commons or prisoners'-dilemma situation.

In a PD, agents hurt *each other*, not *themselves*. Obviously false beliefs in my enemy can help me.

Comment author: Alan_Crowe 11 October 2008 10:08:59PM 1 point [-]

Study this deranged rant. Its ardent theism is expressed by its praise of the miracles God can do, if he chooses.

And yet... there is something not quite right here. Isn't it merely cloakatively theistic? Isn't the ringing denunciation of "Crimes against silence" militant atheism at its most strident?

So here is my idea: Don't try to doubt a whole core belief. That is too hard. Probe instead for the boundary. Write a little fiction, perhaps a science fiction of first contact, in which you encounter a curious character from a different culture. Write him a borderline belief, troublingly odd to both sides in a dispute about which your own mind is made up. He sits on one of our culture's fences. What is his view like from up there?

Is he "really" on your side, or "really" on the other side. Now there is doubt you can actually be curious about. You have a thread to pull on; what unravels if you tug?

Comment author: TGGP4 11 October 2008 11:51:21PM 0 points [-]

If a belief is true you will be better off believing it, and if it is false you will be better off rejecting it.

I think evolution facilitated self-delusion precisely because that is not the case.

I was a Fred Phelps style ultra-Calvinist and my transition involved scarcely any effort.

Comment author: TGGP4 11 October 2008 11:52:18PM 0 points [-]

Also, anti-reductionist, that's the first comment you've made I felt was worth reading. You may take it as an insult but I felt compelled to give you kudos.

Comment author: Eliezer_Yudkowsky 12 October 2008 12:00:10AM 3 points [-]

I suspect that there are many people in this world who are, by their own standards, better off remaining deluded. I am not one of them; but I think you should qualify statements like "if a belief is false, you are better off knowing that it is false".

Of course I deliberately did not qualify it. Frankly, if you're still qualifying the statement, you're not the intended audience for a post about how to make a convulsive effort to be rational using two dozen different principles.

Comment author: christopherj 17 December 2013 02:58:47AM 0 points [-]

And you seriously believe that, in all circumstances and for all people with any false belief, those people are better off believing the truth concerning that belief? The obvious counterexample is the placebo effect, where a false belief is scientifically proven to have a benefit. The beneficial effects of false beliefs are so powerful, that you can't conduct a pharmaceutical study without accounting for them. And you are no doubt familiar with that effect. Another example would be believing that you're never better off believing a false belief, because then you have more incentive to investigate suspicious beliefs.

Comment author: Eliezer_Yudkowsky 17 December 2013 08:43:21PM 1 point [-]

The difficult epistemic state to get into is justifiably believing that you're better off believing falsely about something without already, in some sense, knowing the truth about it.

Comment author: christopherj 22 December 2013 03:35:57AM 0 points [-]

It's actually very easy and common to believe that you're better off believing X, whether or not X is true, without knowing the truth about it. This is also well-justified in decision theory, and by your definition of rationality, if believing X will help you win. A common example is choosing to believe that your date has an "average romantic history" and choosing not to investigate.

If you think you can't do this, I propose this math problem. Using a random number generator over all American citizens, I have selected Bob (but not identified him to you). If you can guess Bob's IQ (with margin of error +/- 5 points), you get a prize. Do you think it is possible for Bob's IQ to be higher or lower than you expected, and if so, do you believe you're better off not having any expectation at all rather than a potentially false expectation? See, as soon as a question is asked, you fill in the answer with [expected answer range of probabilities] rather than [no data].

It's much easier to believe something and not investigate, than to investigate and try to deceive yourself. And unless you add as an axiom "unlike all other humans for me false beliefs are never beneficial" (which sounds like a severe case of irony), then a rationalist on occasion must be in favor of said false beliefs. Just out of curiosity, why the switch from "rational" to "epistemic"?
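
For what it's worth, the Bob's-IQ game above can be sketched directly. Assuming, purely for illustration, the conventional normalization of IQ as normal with mean 100 and standard deviation 15, guessing the mean wins about a quarter of the time, while a guess far from the mean almost never does; having an expectation beats having none.

```python
import random

# Monte Carlo estimate of the win rate in the "guess Bob's IQ within
# +/- 5 points" game, under a hypothetical Normal(100, 15) population.
random.seed(0)
TRIALS = 100_000

def win_rate(guess: float) -> float:
    wins = sum(abs(random.gauss(100, 15) - guess) <= 5
               for _ in range(TRIALS))
    return wins / TRIALS

print(win_rate(100))  # ~0.26: the best single guess under this prior
print(win_rate(130))  # ~0.04: a guess far from the mean rarely wins
```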

Comment author: AnnaSalamon 12 October 2008 01:06:52AM 0 points [-]

Eliezer, what do you mean here? Do you mean:

(A1) Individuals in the reference class really are always better off with the truth, with sufficient probability that the alternative does not bear investigating;

(A2) Humans are so unreliable as judges of what we would and would not benefit from being deceived about that the heuristic "we're always better off with the truth" is more accurate than the available alternatives;

(B) Individuals must adopt the Noble Might-be-truth "I'm always better off with the truth" to have a chance at the Crisis of Faith technique?

Comment author: michael_vassar3 12 October 2008 03:26:33AM 1 point [-]

Eliezer: The position that people may be better off deluded in some situations is VERY compelling. If your audience is people who are literally NEVER better off deluded then I sincerely doubt that it includes you or anyone else. Obviously not every belief need receive all appropriate qualifications every single time, but when someone else points out a plausible qualification you should, as a rationalist, acknowledge it.

I'm very open to Anna's (A1), especially given the special difficulties of this sort of investigation, but only with respect to themselves. I would expect someone as smart as me who knew me well enough to some day come upon a situation where I should, by my values, be deceived, at least for some period.

Comment author: NAZIR_AHMAD_BHAT 12 October 2008 04:49:49AM -3 points [-]

Mr. Paul Rhodes: thanks; this may also serve as the response to the other friend. The atheist believes in rationality much on scientific terms, and desires to see God as easily as some physical thing in hand. It is appreciated that atheists are not common believers and do claim to have the force of critical thinking behind them in testing things. Yet while assuming in their favour as crude a scientific argument as will disprove the existence of God, the atheist fails to apply a similar kind of test to prove the existence of God. For instance, on the discovery of gravitation, Newton assumed from the facts that gravitation could be there, though it was neither seen nor touched. He said, "It is incomprehensible that inanimate and insensitive matter can exert a force of attraction on another without any [visible] contact, without any medium between them" (refer to the works of Bentley, vol. 3, p. 221). In the sphere of critical thinking one can say that a universe so designed, without error, must have its designer; this theory of the theists depends much on the same lines of scientific observation as are relied upon by atheists in other matters. However, in the post "Crisis of Faith" the learned author has already criticised those scientists who, in spite of being scientists, do believe in God. And finally: for atheists, if science is the measuring rod, which is constantly undergoing change, the atheists never hold that their scientific belief in disbelieving God is ad hoc as of today; the same science and critical thinking may tomorrow hold that God exists, and they shall have to believe in it. Then why this propaganda on finality of argument? Thanks

Comment author: Eliezer_Yudkowsky 12 October 2008 05:35:41AM 5 points [-]

@Anna:

I mean that you've given up trying to be clever.

@Vassar:

The position that people may be better off deluded in some situations is VERY compelling.

The position that people may be optimally deluded, without a third alternative, is much less compelling.

The position that realistic human students of rationality can be trying to do their best (let alone do the impossible), while trying to deliberately self-delude, strikes me as outright false. It would be like trying to win a hot-dog eating contest while keeping a golf ball in your mouth.

It is this outright falsity that I refer to when I say that by the time you attempt to employ techniques at this level, you should already have given up on trying to be clever.

As someone once said to Brennan:

She reared back in mock-dismay. "Why, Brennan, surely you don't expect me to just tell you!"

Brennan gritted his teeth. "Why not?"

"What you're feeling now, Brennan, is called curiosity. It's an important emotion. You need to learn to live with it and draw upon its power. If I just give you the information, why, you won't be curious any more." Her eyes turned serious. "Not that you should prefer ignorance. There is no curiosity that does not want an answer. But, Brennan, tradition doesn't say I have to hand you knowledge on a silver platter."

It's easy to visualize Jeffreyssai deciding to not say something - in fact, he does that every time he poses a homework problem without telling the students the answer immediately. Can you visualize him lying to his students? (There are all sorts of clever-sounding reasons why you might gain a short-term benefit from it. Don't stop thinking when you come to the first benefit.) Can you imagine Jeffreyssai deliberately deciding that he himself is better off not realizing that X is true, therefore he is not going to investigate the matter further?

Clearly, if everyone was always better off being in immediate possession of every truth, there would be no such thing as homework. But the distinction between remaining silent, and lying, and not wanting to know the truth even for yourself, suggests that there is more at work here than "People are always better off being in immediate possession of every truth."

Comment author: pdf23ds 12 October 2008 06:36:18AM 2 points [-]

The problem with the idea that sometimes people are better off not knowing is that it has no practical impact on how an ideal rationalist should behave, even assuming it's true. By the time you've learned something you'd be better off not knowing, it's too late to unlearn it. Humans can't really do doublethink, and especially not at the precision that would be required to be extremely rational while using it.

Comment author: pdf23ds 12 October 2008 06:38:48AM 0 points [-]

it has no practical impact on how an ideal rationalist should behave

With respect to themselves, not necessarily to others. Withholding information or even lying can be rational.

Comment author: Phil_Goetz4 12 October 2008 06:51:36AM 0 points [-]

In other words, you are advocating a combative, Western approach; I am bringing up a more Eastern approach, which is not to be so attached to anything in the first place, but to bend if the wind blows hard enough.

The trouble is that you cannot break new ground this way. You can't do Einstein-like feats. You should follow the direction of the wind, but engage the nitrous to follow that direction, occasionally stopping and sticking a finger out the window to make sure you are going in the right direction.

I think Einstein is a good example of both bending with the wind (when he came up with relativity), and of not bending with the wind (when he refused to accept quantum mechanics).

By "bending with the wind" I don't mean "bending with public opinion". I mean not being emotionally attached to your views.

In a PD, agents hurt *each other*, not *themselves*.

In a PD, everyone having accurate information about the payoff matrix leads to a worse outcome for everyone, than some false payoff matrices you could misinform them with. That is the point.
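
A minimal sketch of that claim in Python (the payoff numbers are the standard illustrative PD values, not anything from this thread): players who believe the true matrix both defect, while players fed a false matrix in which cooperation dominates both cooperate, and end up better off by the true payoffs.

    # Hypothetical payoffs: (my move, their move) -> my payoff.
    TRUE = {('C', 'C'): 3, ('C', 'D'): 0, ('D', 'C'): 5, ('D', 'D'): 1}
    # A fabricated matrix in which cooperation weakly dominates.
    FALSE = {('C', 'C'): 3, ('C', 'D'): 3, ('D', 'C'): 1, ('D', 'D'): 1}

    def dominant(p):
        # Return the weakly dominant move under payoff dict p, if any.
        if all(p[('D', o)] >= p[('C', o)] for o in 'CD'):
            return 'D'
        if all(p[('C', o)] >= p[('D', o)] for o in 'CD'):
            return 'C'
        return None

    for label, believed in (('true beliefs', TRUE), ('false beliefs', FALSE)):
        m = dominant(believed)  # both players reason identically
        print(label, '-> both play', m, '- each really gets', TRUE[(m, m)])
    # true beliefs -> both play D - each really gets 1
    # false beliefs -> both play C - each really gets 3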

Comment author: JamesAndrix 12 October 2008 06:53:15AM 0 points [-]

In retrospect, I was certainly leaving Christianity the day I decided that if god existed, he could not possibly object to me honestly trying to determine The Truth. Doubts stopped feeling like sins.

I think your something-to-protect must be the accuracy of your map, not anything on the map. (at least for a moment)

If someone says you need to fire a revolver at your something-to-protect, you will raise objections based on strongly held beliefs about the effects of revolvers. It's so hard to risk those beliefs because, on your current belief set, someone who lacked those beliefs would lose their something-to-protect. You can't stop believing as long as you believe the cost of disbelieving is any kind of hell.

I was about to say I was lucky to have such a god, but no: I constructed a god just nice enough to let me relieve that cognitive tension.

It's relatively easy to invent and defend very contrarian ideas when you start off joking. This could be a technique if you're confident you can later break a good idea out of the silly-idea prison.

Comment author: steven 12 October 2008 08:23:12AM 0 points [-]

In a PD, everyone having accurate information about the payoff matrix leads to a worse outcome for everyone, than some false payoff matrices you could misinform them with. That is the point.

Do you agree that in a PD, it is not the case for any individual that that individual is harmed by that individual's knowledge? Your point goes through if we somehow think of the collective as a single "you" with beliefs and preferences, but that raises all sorts of issues and anyway isn't what Eliezer was talking about.

Comment author: Will_Pearson 12 October 2008 09:06:22AM 0 points [-]

I think Einstein is a good example of both bending with the wind (when he came up with relativity)

I'm not sure what you mean by bending with the wind. I thought it was the evidence that provided the air pressure, but there was no evidence to support Einstein's theory above the theories of the day. He took an idea and ran with it to its logical conclusions. Then the evidence came; he had been running ahead of the evidential wind.

If the wind is following Occam's or something internal, then it can be blowing in the wrong direction...

Comment author: pdf23ds 12 October 2008 09:29:04AM 0 points [-]

If the wind is following Occam's or something internal, then it can be blowing in the wrong direction...

Isn't that the subject of The Ritual?

Comment author: steven 12 October 2008 09:58:08AM 0 points [-]

One more argument against deceiving epistemic peers when it seems to be in their interest is that if you are known to have the disposition to do so, this will cause others to trust your non-deceptive statements less; and here you could recommend that they shouldn't trust you less, but then we're back into doublethink territory.

Comment author: Will_Pearson 12 October 2008 10:19:04AM 0 points [-]

Phil Goetz, whom I was replying to, was saying that type of thought should be unnecessary if you don't hang on to your ideas tightly.

Not hanging on to ideas tightly is great for engineers and experimental scientists. It doesn't matter to a chemist whether MWI or Bohm is right. He can use either, switching back and forth between the viewpoints as he sees fit.

A theoretical quantum physicist, though, has to have some way of determining at which face of the knowledge mine to work; he has to pick one or the other. If the reason is not strong, he might split his work and get less far with either.

For this sort of person it makes sense to pick one direction and run with it, getting invested in it etc. At least until he comes across reasons that maybe he should take the opposite direction or neither direction, then the crisis of faith might be needed.

Comment author: JulianMorrison 12 October 2008 02:09:00PM -1 points [-]

Well, I've just sat down and done one of those, and it was really difficult. Not so much because I was pushing against established beliefs (I had strong beliefs both ways, so it was more that any movement pushed somewhere) but because the largest worry I had, "Is this a fad?", is hard to answer specifically because I've recently changed to become so much more Bayesian. I used to do daft things like "giving ideas a chance". Consequently, I can't look to my long and undistinguished history in order to glean hints. I already don't do the obvious wrong stuff.

So the problem has to be phrased as "what sorts of irrationality would not be obvious to a beginner Bayesian?"

That's a real poser. Just by being one, I'm in the worst possible place to guess.

(FWIW, the outcome was "continue, for now".)

Comment author: Eliezer_Yudkowsky 12 October 2008 02:45:00PM 3 points [-]

"I am better off deluding myself into believing my cherished late spouse was faithful to me."

1) Do you believe this is true for you, or only other people?

2) If you know that someone's cherished late spouse cheated on them, are you justified in keeping silent about the fact?

3) Are you justified in lying to prevent the other person from realizing?

4) If you suspect for yourself (but are not sure) that the cherished late spouse might have been unfaithful, do you think that you will be better off, both for the single deed, and as a matter of your whole life, if you refuse to engage in any investigation that might resolve your doubts one way or the other? If there is no resolving investigation, do you think that exerting some kind of effort to "persuade yourself", will leave you better off?

5) Would you rather associate with friends who would (a) tell you if they discovered previously unsuspected evidence that your cherished late spouse had been unfaithful, or who would (b) remain silent about it? Which would be a better human being in your eyes, and which would be a better friend to you?

Comment author: Phil_Goetz 12 October 2008 03:12:00PM 1 point [-]

I think Einstein is a good example of both bending with the wind (when he came up with relativity)

I'm not sure what you mean by bending with the wind. I thought it was the evidence that provided the air pressure, but there was no evidence to support Einstein's theory above the theories of the day. He took an idea and ran with it to its logical conclusions. Then the evidence came; he had been running ahead of the evidential wind.

You do know roughly what I mean, which is that strenuous effort is only part of the solution; not clinging to ideas is the other part of the solution. Focusing on the strenuous effort part can lead to people making strenuous effort to justify bad ideas. Who makes the most strenuous effort on the question of evolution? Creationists.

Einstein had evidence; it just wasn't experimental evidence. The discovery that your beliefs contain a logical inconsistency is a type of evidence.

Comment author: Will_Pearson 12 October 2008 03:14:00PM -2 points [-]

I am better off (in most circumstances) deluding myself into believing that the weather in Maine on the 23rd of June 1865 was near what I think the seasonal average might be for that decade, rather than memorising the exact temperature and rainfall if they were presented to me.

I believe this is true for most people, apart from climatologists.

I would rather not be around people who kept telling me true minutiae about the world and the cosmos, if they have no bearing on the problems I am trying to solve.

Am I justified in giving people a guess at the average temp if someone had told me earlier what the exact temp was? Yes. If I didn't discard data, then even assuming I had a 100% truth detector, people could quite easily DoS me by truth-flooding me, overrunning my memory buffers and preventing me from doing useful things.

There is an extremely large number of truths, some more valuable than others.

There is no way to differentiate externally between someone telling a lie and someone forgetting. They have the exact same consequence: people give less accurate information than they could have done.

Comment author: JulianMorrison 12 October 2008 03:25:00PM -1 points [-]

From where I'm standing, the spouse thing looks like obvious nonsense (of the category: not looking for a third alternative). You'd be far better off learning to share - especially since, if your spouse died, you'd have someone to talk to.

Comment author: Recovering_irrationalist 12 October 2008 03:34:00PM 2 points [-]

Nazir, a secret hack to prevent Eliezer from deleting your posts is here. #11.6 is particularly effective.

Comment author: mtraven 12 October 2008 04:23:00PM 0 points [-]

Religion is the classic example of a delusion that might be good for you. There is some evidence that being religious increases human happiness, or social cohesion. Its universality in human culture suggests that it has adaptive value.

Comment author: tut 18 June 2009 07:55:41AM 8 points [-]

Nope. There is some evidence that Christians in the USA are happier than atheists in the USA. But since that correlation doesn't hold up in Europe, I prefer to interpret it as: America is bad for atheists.

Comment author: Cyan2 12 October 2008 04:40:00PM 0 points [-]

1) Do you believe this is true for you, or only other people?

I don't fit the premise of the statement -- my cherished spouse is not yet late, so it's hard to say.

2) If you know that someone's cherished late spouse cheated on them, are you justified in keeping silent about the fact?

Mostly yes.

3) Are you justified in lying to prevent the other person from realizing?

Mostly no.

4) If you suspect for yourself (but are not sure) that the cherished late spouse might have been unfaithful, do you think that you will be better off, both for the single deed, and as a matter of your whole life, if you refuse to engage in any investigation that might resolve your doubts one way or the other?

Depends on the person. Some people would be able to leave their doubts unresolved and get on with their life -- others would find their quality of life affected by their persistent doubts.

If there is no resolving investigation, do you think that exerting some kind of effort to "persuade yourself", will leave you better off?

No. You can count that as a win if you like -- "deluding myself" is too strong. "I am better off *remaining deluded* ..." is more likely to be true for some people.

5) Would you rather associate with friends who would (a) tell you if they discovered previously unsuspected evidence that your cherished late spouse had been unfaithful, or who would (b) remain silent about it?

Supposing I am emotionally fragile and might harm myself if I discovered that my spouse had been unfaithful, (b). Supposing that I am emotionally stable and that I place great weight on having an accurate view of the circumstances of my life, (a). Other situations, other judgment calls.

Which would be a better human being in your eyes, and which would be a better friend to you?

Depends on how I can reasonably be expected to react.

Comment author: Matthew_C. 12 October 2008 05:25:00PM -3 points [-]

Carl Schuman,

I keep posting the link, for a very simple reason.

Eliezer continues to post about the certainty of reductionism, while he has completely failed to investigate the evidence that reductionism cannot account for all of the observations.

He also continues to post snide remarks about the reality of psi phenomena. Again, he has completely failed to investigate the best evidence that he is wrong about this.

The post he wrote here shows a great commitment to intellectual integrity. And I honestly believe he means what he wrote here.

I suspect at some point Eli's desire for the truth will overcome his ego identification with his current beliefs as well as his financial interest in preserving them.

I happen to have come across a PDF of Irreducible Mind which is temporarily available here.

Start with the introduction (the best part of the intro begins on page 23 (xxiii)), then read chapter 3, which covers in detail a vast panoply of medical phenomena, seen in clinical practice and in research, that simply do not fit into the reductionistic framework.


Of course there are lots of other good books and thousands of important research papers, many of which are cited in the appendices of Irreducible Mind. But the advantage of this book, and especially chapter 3, is that the inability of the standard reductionistic dogmas to account for the evidence simply becomes crushingly obvious.

Comment author: Caledonian2 12 October 2008 05:58:00PM 2 points [-]

Eliezer continues to post about the certainty of reductionism, while he has completely failed to investigate the evidence that reductionism cannot account for all of the observations.

Reducing things to the simplest description possible -- where 'possible' refers to the ability to accurately model things -- by definition accounts for all veridical observations.

His point is necessarily correct, as well as empirically so.

He also continues to post snide remarks about the reality of psi phenomena. Again, he has completely failed to investigate the best evidence that he is wrong about this.

No, he hasn't. The best evidence strongly indicates that there are no 'unusual' phenomena that require explanation, and that psi does not exist.

The fact that you have deluded yourself into believing otherwise does not constitute a failure on Eliezer's part.

Comment author: Phil_Goetz 12 October 2008 08:40:00PM 1 point [-]

I guess I am questioning whether making a great effort to shake yourself free of a bias is a good or a bad thing, on average. Making a great effort doesn't necessarily get you out of biased thinking. It may just be like speeding up when you suspect you're going in the wrong direction.

If someone else chose a belief of yours for you to investigate, or if it were chosen for you at random, then this effort might be a good thing. However, I have observed many cases where someone chose a belief of theirs to investigate thoroughly, precisely because it was an untenable belief that they had a strong emotional attachment to, or a strong inclination toward, and wished to justify. If you read a lot of religious conversion stories, as I have, you see this pattern frequently. A non-religious person has some emotional discontent, and so spends years studying religions until they are finally able to overcome their cognitive dissonance and make themselves believe in one of them.

After enough time, the very fact that you have spent time investigating a premise without rejecting it becomes, for most people, their main evidence for it.

I don't think that, from the inside, you can know for certain whether you are trying to test, or trying to justify, a premise.

Comment author: Phil_Goetz 12 October 2008 08:55:00PM 0 points [-]

Religion is the classic example of a delusion that might be good for you. There is some evidence that being religious increases human happiness, or social cohesion. Its universality in human culture suggests that it has adaptive value.

See last week's Science, Oct. 3 2008, p. 58-62: "The origin and evolution of religious prosociality". One chart shows that, in any particular year, secular communes are four times as likely to dissolve as religious communes.

Comment author: Matthew_C. 12 October 2008 10:34:00PM -2 points [-]

Caledonian,

Read chapter 3, then come back and explain why a reductionistic explanation best accounts for the phenomena described there. Because if you are not conversant with the evidence, you simply have no rational basis to make any comment whatsoever.

You also seem to be playing some kind of semantic games with the word "reductionism" which I'll just note and ignore.

Comment author: steven 12 October 2008 11:48:00PM 1 point [-]

It's important in these crisis things to remind yourself that 1) P does not imply "there are no important generally unappreciated arguments for not-P", and 2) P does not imply "the proponents of P are not all idiots, dishonest, and/or users of bad arguments". You can switch sides without deserting your favorite soldiers. IMO.

Comment author: Phil_Goetz 13 October 2008 01:57:00AM 2 points [-]

Matthew C -

You are advocating nonreductionism and psi at the same time.

Supposing that you are right requires us to suppose that there is both a powerful argument against reductionism, and a powerful argument in favor of psi.

Supposing that you are a crank requires only one argument, and one with a much higher prior.

In other words, if you were advocating one outrageous theory, someone might listen. The fact that you are advocating two simultaneously makes dismissing all of your claims, without reading the book you recommend, the logical response. We thus don't have to read it to have a rational basis to dismiss it.

Comment author: Ben_Jones 13 October 2008 08:36:00AM 1 point [-]

I would rather not be around people who kept telling me true minutiae about the world and the cosmos, if they have no bearing on the problems I am trying to solve.

Will, not wishing to be told pointless details is not the same as deluding yourself.

I was discussing the placebo effect with a friend last night, though, and found myself arguing that this could well be an example of a time when more true knowledge could hurt. Paternalistic issues aside, people appear to get healthier when they believe falsehoods about the effectiveness of, say, homeopathy or sugar pills.

Would I rather live in a world where doctors seek to eliminate the placebo effect by disseminating more true knowledge; or one where they take advantage of it, save more lives, but potentially give out misinformation about what they're prescribing? I honestly don't know.

Comment author: Nick_Tarleton 13 October 2008 09:38:00AM 0 points [-]

Phil: One of psi or non-reductionism being true would be a powerful argument in favor of the other.

Ben: great example.

Comment author: Caledonian2 13 October 2008 06:03:00PM 0 points [-]

Phil: One of psi or non-reductionism being true would be a powerful argument in favor of the other.

No, it really wouldn't. Neither implies the other, or even suggests that the other is more likely than it would be otherwise.

Comment author: Zubon 13 October 2008 08:28:00PM 0 points [-]

Caledonian, maybe you had arguments on this thread previously, but it seems more like the place for that sub-debate.

Comment author: Pierre-André_Noël 14 October 2008 12:06:00AM 0 points [-]

First of all, great post Eliezer. If somebody holding that kind of standard thinks that cryonics is a good investment, I should someday take some time to investigate the question more deeply than I have.

Now, without lessening the previous praise, I would like to make the following remarks about friendly AI:
- The belief has long remained in your mind;
- It is surrounded by a cloud of known arguments and refutations;
- You have sunk costs in it (time, money, public declarations).
I do not know if it has emotional consequences for you or if it has gotten mixed up in your personality generally.

I think the following questions translate my line of thought better than any explanation I could formulate. Given a limited amount of EY's resources:
- is friendly AI the best bet for "saving mankind"?
- if this "crisis of faith technique" (or similar rational approaches) were more popular, could alternatives other than FAI be envisioned?
- if FAI is required (the sole viable alternative), would it be worth the cost to invest time in "educating people" in such rational approaches (writing books, publicising, etc.) in order to gather resources/manpower to achieve FAI?

Maybe you have already gone through such reasoning and come to the answer that the time you currently invest in OB is the optimal amount of publicity...

Comment author: Caledonian2 14 October 2008 03:18:00PM 2 points [-]

In fact most human beings at most times (including today) have accepted both propositions as real.

That should be a tipoff for you. Of all the things most humans have accepted as real, how many of them do we currently recognize as real? The general acceptance of a position is a virtual guarantee that it's completely wrong.

As for your text -- no one who seriously suggests that mental states affecting health and shamanistic death spells are evidence for either 'non-reductionism' or psi is worth taking the time to refute in detail.

(Hint: there are perfectly suitable existing explanations for both of those things. Immune system cells are highly responsive to neurotransmitters, and are thought to be either the evolutionary progenitors or descendants of neurons. There are obvious benefits to distributing resources differently when organisms are under stress, and the immune system is an obvious resource drain. As for the voodoo explanation, it's called "the vagus nerve".)

The nifty thing about science is that, when it possesses dogmas, it does a pretty good job of overturning them. Psi advocates have had plenty of opportunities to demonstrate real phenomena. They have failed. They have repeatedly, demonstrably, empirically failed. If you do not believe their failure constitutes a valid reason to reject their premises, what premises DO you believe science has had reason to reject?

Comment author: Paul_Murray 15 October 2008 04:32:00AM 2 points [-]

I was raised a theist and came to no longer believe as an adult. One of the turning points was reading the Anglican confession of faith, and supposing what my own beliefs might look like to an Anglican, who was also a Christian, saved by Jesus just like me - just of a different variety.

Eventually I began to wonder what my life experiences might look like to an atheist - religion is above all an interpretive filter that we use to make sense of our lives. Although I knew that my beliefs in God were right, what would my life look like to me if I did not believe it?

Eventually, I could not help noticing that the nonbeliver point of view made better sense of the world.

If I had to attach labels to the personal qualities that changed my mind (excuse me if I sound vain), I'd say: curiosity - a drive to know the truth, whatever it may be; humility - from the first, I was prepared to accept that the Anglicans might be right and I wrong (and then the Catholics, the Muslims, the Hindus, and finally the atheists); refraining from judgment - being prepared to tolerate an open question; perhaps even courage. And ... a decision to trust myself to come to a right conclusion - something that religions actively discourage. Perhaps we might call it "integrity".

But deliberately setting out to have a crisis of faith? I can't imagine doin... actually, yes I can. I did it every time I asked myself "what would an atheist think of this miracle, this prophecy, this teaching, this world event".

No: that's not the key. The key is not "what would an atheist think...", but "what would *I* think, if I were an atheist?". Admitting the possibility of change. Fully owning, if only for a moment, another point of view. Seeing the world with your own eyes from someone else's point of view. Or at least, making an honest effort to.

Comment author: Zubon 15 October 2008 04:27:00PM 5 points [-]

Matthew C, I read the introduction and chapters 1 and 3. Are you sure you meant chapter 3? It does not seem to say what you think it says. Most of it is a description of placebos and other psychosomatic effects. It also discusses some events that are unlikely in isolation but seem trivially within the realm of chance given 100 years and approaching 7 billion people. There is also a paragraph with no numbers saying it can't just be chance.

It feels kind of like asking everyone in the country to flip a coin 25 times, then calling the 18 or so people who have continuous streaks psychics. And ignoring that all-heads and all-tails both count. And maybe also counting the people who got HHHHHTTTTTHHHHHTTTTTHHHHH or HTHTHTHTHTHTHTHTHTHTHTHTH or such. Survivorship and publication bias and all that.

There were a few things that might have fallen outside those obvious mistakes, but given the quality of analysis, I did not feel a pressing need to check that they reported their sources properly, that their sources reported theirs properly, and that there was no deception etc. involved. This Stevenson fellow might be worth pursuing, but it seems likely that he is just the archivist of the one-in-a-million events that continuously happen with billions of people. I feel compelled to read on, however, by the promise that not only identity but also skills can survive bodily death. I am picturing a free-floating capacity for freecell or ping-pong, just looking for somewhere to reincarnate. Sadly, I do not expect the text to be that fun.
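
Zubon's "18 or so" arithmetic checks out; here is a quick sanity check in Python, assuming a US-scale population of roughly 300 million (a figure not stated in the comment):

    population = 300_000_000
    p_streak = 0.5 ** 25                   # one specific 25-flip sequence
    expected = population * p_streak * 2   # count all-heads and all-tails
    print(round(expected, 1))              # ~17.9, i.e. "18 or so" people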

Comment author: taryneast 29 May 2011 08:42:35PM 1 point [-]

If I could give you extra points I would. Many thanks for actually having read this stuff and then giving us a clear explanation of what it entails... so we don't have to bother :)

Comment author: Odd 15 October 2008 06:44:00PM 0 points [-]

Odd, I'm a Christian daughter of two atheists. I guess I didn't miss out after all.

Comment author: taryneast 29 May 2011 08:44:13PM 0 points [-]

I agree. I was raised atheist... went through a "religious phase" then figured myself out again. Being raised atheist doesn't mean you haven't been through all those crises too. :)

Comment author: Also_odd 15 October 2008 07:21:00PM -1 points [-]

I became a Christian because I was a Bayesian first. I know there are others like me. I saw and experienced evidence that caused me to positively update my belief.

Now if you don't like that argument, then please tell me how can anyone become an atheist via Bayesian updating? Can your posterior really go to a point mass at zero (belief in God)? If so, please tell me what prior you were using. If not, please tell me how you define atheism.

Comment author: Eliezer_Yudkowsky 15 October 2008 07:46:00PM 9 points [-]

Atheism is believing that the state of evidence on the God question is similar to the state of evidence on the werewolf question.

Comment author: Richard_Hollerith 15 October 2008 08:39:00PM 2 points [-]

s/werewolf/Easter bunny/ IMHO.

Comment author: Also_odd 15 October 2008 08:52:00PM 0 points [-]

Would that apply to someone with a particularly high prior on the werewolf question? So, you would agree that anyone who believes that the state of evidence on "the God question" is more positive than the state of evidence on the "werewolf question" should consider labeling themselves an agnostic? theist?

And, I presume that you believe that one's current belief in the state of evidence would be controlled by 1) verifiable general evidence, 2) experience, and 3) priors on both questions?

Then we're in agreement: you should (apparently) call yourself an atheist, and I should call myself a Christian, as we differ on #2 and #3. (not that theism = Christian, but that goes back to #2).

Comment author: Paul_Murray 16 October 2008 07:39:00AM 2 points [-]

I became a Christian because I was a Bayesian first. I know there are others like me. I saw and experienced evidence that caused me to positively update my belief.

Now if you don't like that argument, then please tell me how can anyone become an atheist via Bayesian updating? Can your posterior really go to a point mass at zero (belief in God)? If so, please tell me what prior you were using. If not, please tell me how you define atheism.
And how can *your* probability go to one? You erect a straw man, sir. My probability that there is a god is not exactly zero, any more than yours is exactly one. If God were to send an actual angel down, right now, and make my dinner vanish with his magic stick (it's in Genesis, somewhere), then that would shift my probability.

But as things stand, I am confident that there are no gods.
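
A minimal sketch of this point in Python (the prior and the likelihood ratio are arbitrary illustrative numbers): finitely many pieces of finite-strength evidence drive a posterior toward zero without ever reaching it, so neither side's probability need be a point mass.

    prior_odds = 1.0        # Pr(god) = 0.5, purely for illustration
    lr = 0.1                # each observation is 10:1 evidence against
    for n in (1, 5, 20):
        odds = prior_odds * lr ** n
        print(n, odds / (1 + odds))   # shrinks toward 0, never equals 0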


Comment author: Also_odd 16 October 2008 01:17:00PM -1 points [-]

@ Paul Murray

There is no straw man. You've presumed that I meant that Christian = "Pr(god)=1". That was never my claim. It had seemed that atheist was being used as Atheist="Pr(god)=0", but E. clarified his position. I think that agnostic (in the literal sense) is always a better term than atheist, but that's just semantics.

The real issue (to me) is what Christians (or other "people of faith") think of the atheistic position, and vice versa. Christians are often derided here as uneducated or un-Bayesian.

My point is not to convince you to believe, but to ask whether you think that a rational Bayesian can ever become a Christian (or a person of another faith), given that we have different life experiences and different priors. Can it be so? And if so, then why the derision? Is that not an irrational bias?

I'll leave it up to God to care about the space-time location of your dinner.

Comment author: steven 16 October 2008 01:28:00PM 0 points [-]

Eliezer, that's a John McCarthy quote.

Comment author: Doug_S. 17 October 2008 06:59:00AM 4 points [-]

I don't need to read the book. I believe that psi effects are not real, because if they were, they would already be part of accepted science.

It's not a matter of being closed-minded or open-minded. I'm just not accepting your book author as a legitimate authority. Most things I believe, I believe because they are asserted by authorities I accept. For example, I have never personally seen an experiment performed that establishes that the Sun is made primarily of hydrogen and helium, that an atom of gold contains seventy-nine protons, that George Washington was the first President of the United States, that light is quantized, or many other things I learned in school.

My criterion is simple: on matters in which I have no special expertise or direct knowledge, I simply accept the view of the majority of those I consider legitimate experts. If you want to persuade me that "psi" is real, go persuade the Nobel Prize committee; anyone who can establish it through controlled, repeatable experiments would certainly be deserving of the Nobel Prize in Physics.

In other words, I rely on people like James Randi and Joe Nickell to do the investigating for me. Convince them, and I'll believe in psi. Until then, don't go shoving your data in my face, because I'll just conclude that your data, or your interpretation of it, is wrong.

Comment author: Egor_Duda 17 October 2008 10:10:00AM 2 points [-]

@Matthew C.

Do you mean by "remote staring experiments" those of Wiseman/Schlitz?

It seems that when properly controlled, they produced no statistically significant effect: http://forums.randi.org/archive/index.php/t-43727.html

Comment author: Jo 07 November 2008 10:15:00AM 10 points [-]

So here I am having been raised in the Christian faith and trying not to freak out over the past few weeks because I've finally begun to wonder whether I believe things just because I was raised with them. Our family is surrounded by genuinely wonderful people who have poured their talents into us since we were teenagers, and our social structure and business rests on the tenets of what we believe. I've been trying to work out how I can 'clear the decks' and then rebuild with whatever is worth keeping, yet it's so foundational that it will affect my marriage (to a pretty special man) and my daughters who, of course, have also been raised to walk the Christian path.

Is there anyone who's been in this position - really, really invested in a faith and then walked away?

Comment author: AndyCossyleon 07 September 2010 09:28:14PM 2 points [-]

Daniel Everett was a missionary to the Piraha of Brazil and a husband and father.

Comment author: JoshuaZ 07 September 2010 09:49:50PM *  6 points [-]

Quite a few. Dan Barker was a Christian minister before he walked away. But the truth is, it is much harder to walk away from a religion when one is married and has a family. And sometimes, it can destroy families to even voice doubts.

Christianity isn't the only religion that has this aspect. Among Orthodox Jews there's a common refrain that too many are leaving the faith, and a standard suggested solution is to have kids marry earlier, because once they are married they are much more likely to stay in the faith.

But whenever this sort of thing comes up it is important to ask how much the social structures really depend on the religion. Will your husband love you less if you tell him you don't believe? Will your friends no longer be friendly? Will they stop providing social support? And if they will stop being friendly on such a basis, what makes you see them as genuine friends in the first place?

There's no question that these issues are deep and difficult and should probably be handled slowly. I'd recommend maybe sending a version of your question to The Friendly Atheist - one of the writers there has a column (Ask Richard) where he regularly answers questions much like yours, and if your question gets posted it is likely to get a large amount of input in the comment threads from people who went through similar circumstances (it might be worth looking in the archives as well to see if they've had similar letters in the past; I think they have, but I don't have a link to one off the top of my head).

Comment author: TimFreeman 03 June 2011 04:55:23PM 1 point [-]

So here I am having been raised in the Christian faith and trying not to freak out over the past few weeks because I've finally begun to wonder whether I believe things just because I was raised with them. Our family is surrounded by genuinely wonderful people who have poured their talents into us since we were teenagers, and our social structure and business rests on the tenets of what we believe.

This is exactly the situation where the Litany of Gendlin seems most questionable to me. I haven't been in your situation. One option for dealing with the situation might be to learn to lie really well. It might be the compassionate thing to do, if you believe that the people you interact with would not benefit from hearing that you no longer believe.

(Yes, I'm aware that I'm responding to a stale post.)

Comment author: Kenny 10 June 2013 04:36:04PM 0 points [-]

I don't believe I should lie to you (or anyone) because there might be one way you might not benefit from my honest and forthright communication. So, unfortunately, I've decided to reply to you and tell you that your advice is terrible, however well-intentioned. You seem to think that if you can imagine even one possible short-term benefit from lying or not disclosing something, then that's sufficient justification to do so. But where exactly is the boundary dividing those things that, however uncomfortable or even devastating, must be said or written from those things about which one can deceive or dupe those one loves and respects?

'Radical honesty' isn't obviously required, but I would think that honesty about fundamental beliefs would be more important than what is normally considered acceptable dishonesty or non-disclosure for social purposes.

Comment author: TimFreeman 22 June 2013 07:55:59PM *  1 point [-]

You seem to think that if you can imagine even one possible short-term benefit from lying or not-disclosing something, then that's sufficient justification to do so.

That's not what I said. I said several things, and it's not clear which one you're responding to; you should use quote-rebuttal format so people know what you're talking about. Best guess is that you're responding to this:

[learning to lie really well] might be the compassionate thing to do, if you believe that the people you interact with would not benefit from hearing that you no longer believe.

You sharpened my "might be" to "is" just so you could disagree.

But where exactly is the boundary dividing those things that, however uncomfortable or even devastating, must be said or written from those things about which one can deceive or dupe those one loves and respects?

This is a rhetorical question, and it only makes sense in context if your point is that in the absence of such a boundary with an exact location that makes it clear when to lie, we should be honest. But if you can clearly identify which side of the boundary the alternative you're considering is on because it is nowhere close to the boundary, then the fact that you don't know exactly where the boundary is doesn't affect what you should do with that alternative.

You're committing the slippery slope fallacy.

Heretics have been burned at the stake before, so compassion isn't the only consideration when you're deciding whether to lie to your peers about your religious beliefs. My main point is that the Litany of Gendlin is sometimes a bad idea. We should be clear that you haven't cast any doubt on that, even though you're debating whether lying to one's peers is compassionate.

Given that religious relatives tend to fubar cryonics arrangements, the analogy with being burned at the stake is apt. Religious books tend to say nothing about cryonics, but the actual social process of religious groups tends to be strongly against it in practice.

(Edit: This all assumes that the Litany of Gendlin is about how to interact with others. If it's about internal dialogue, then of course it's not saying that one should or should not lie to others. IMO it is too ambiguous.)

Comment author: nshepperd 22 June 2013 09:29:31PM 6 points [-]

The Litany of Gendlin is specifically about what you should or should not believe, and your feelings about reality. It says nothing about telling people what you think is true — although "owning up to it" is confusingly an idiom that normally means admitting the truth to some authority figure, whereas in this case it is meant to indicate admitting the truth to yourself.

Comment author: Kenny 23 June 2013 04:01:00PM -1 points [-]

That's not what I said.

And that's why I wrote "You seem to think that ..."; I was describing why I thought you would privilege the hypothesis that lying would be better.

You're absolutely right that learning to lie really well and actually lying to one's family, the "genuinely wonderful people" they know, everyone in one's "social structure" and business, as well as one's husband and daughter MIGHT be the "compassionate thing to do". But why would you pick out exactly that option among all the possibilities?

This is a rhetorical question ...

Actually it wasn't a rhetorical question. I was genuinely curious how you'd describe the boundary.

The reason why I think it's a justified presumption to be honest with others is in fact because of a slippery slope argument. Human beings' minds run on corrupted hardware, and deception is dangerous (for one reason) because it's not always easy to cleanly separate one's lies from one's true beliefs. But your implication (that lying is sometimes right) is correct; there are some obvious or well-known Schelling fences on that slippery slope, such as lying to the Nazis when they come to your house while you're hiding Jews.

Your initial statement seemed rather cavalier and didn't seem to be the product of sympathetic consideration of the original commenter's situation.

Have you considered Crocker's rules? If you care about the truth or you have something to protect then the Litany of Gendlin is a reminder of why you might adopt Crocker's rules, despite the truth possibly not being the "compassionate thing to do".

Comment author: Jiro 23 June 2013 04:47:04PM *  1 point [-]

Rationality can't be wrong, but it can be misused.

"People can stand what is true, for they are already enduring it." is technically correct, but omits factors relevant to the situations when most people consider lying to be necessary. The fact that you know something is true is itself a truth.

So if you reason "they have to endure the truth whether I tell them it or not", you also have to acknowledge that by telling them you've added a second-order truth, and they now have to endure that second-order truth that they didn't before. The implication that telling someone the truth doesn't change anything because it didn't change the original truth... isn't true.

Of course most people don't think in terms of "telling someone a truth adds another truth", but if you try to analyze it, it turns out that it does.

If you care about the truth ... then

Virtually nobody "cares about the truth" in the absolute sense needed to make that statement logically correct. Most people care about the truth as one of several things that they care about, which need to be balanced against each other.

Comment author: wedrifid 24 June 2013 02:05:55PM *  0 points [-]

If you care about the truth ... then

Virtually nobody "cares about the truth" in the absolute sense needed to make that statement logically correct.

As a matter of logic nobody caring about the truth (in whatever sense is meant by the claim) is sufficient to ensure that the statement is always correct (the part replaced by the ellipsis need not be resolved). (The problem is that it is then probably useless.)

Comment author: TimFreeman 23 June 2013 04:49:14PM *  0 points [-]

You're absolutely right that learning to lie really well and actually lying to one's family, the "genuinely wonderful people" they know, everyone in one's "social structure" and business, as well as one's husband and daughter MIGHT be the "compassionate thing to do". But why would you pick out exactly that option among all the possibilities?

Because it's a possibility that the post we're talking about apparently did not consider. The Litany of Gendlin was mentioned in the original post, and I think that when interpreted as a way to interact with others, the Litany of Gendlin is obviously the wrong thing to do in some circumstances.

Perhaps having these beautifully phrased things with a person's name attached is a liability. If I add a caveat that it's only about one's internal process, or it's only about communication with people that either aspire to be rational or that you have no meaningful relationship with, then it's not beautifully phrased anymore, and it's not the Litany of Gendlin anymore, and it seems hopeless for the resulting Litany of Tim to get enough mindshare to matter.

But where exactly is the boundary dividing those things that, however uncomfortable or even devastating, must be said or written from those things about which one can deceive or dupe those one loves and respects?

Actually it wasn't a rhetorical question. I was genuinely curious how you'd describe the boundary.

I'm not curious about that, and in the absence of financial incentives I'm not willing to try to answer that question. There is no simple description of how to deal with the world that's something a reasonable person will actually want to do.

Comment author: JohnH 22 April 2011 08:35:55PM -2 points [-]

God's ways are mysterious and it is presumptuous to suppose that we can understand them.

"I do not feel obliged to believe that that same God who has endowed us with senses, reason: and intellect has intended to forgo their use". Gods "thoughts are higher then out thoughts" but "this is life eternal, that they might know thee the only true God, and Jesus Christ, whom thou hast sent." and we are approaching "A time to come in which nothing shall be withheld".

The easiest way to have a crisis of faith is to go out and commit sins, but then the question is: are you actually questioning the faith, or justifying your sins?

As for everyone who says that delusion is better than truth - perhaps not for you, but for others - what makes you think you are different from anyone else? Why do you want truth but think others shouldn't?

Comment author: Curiouskid 03 June 2011 04:28:02PM 0 points [-]

"Self-honesty is at its most fragile when we're not sure which path is the righteous one."

Are we ever "sure" of anything (especially ethics)?

Comment author: nwthomas 11 June 2011 10:37:20AM 7 points [-]

For the past three days I have been repeatedly performing the following mental operation:

"Imagine that you never read any documents claimed to be produced by telepathy with extraterrestrials. Now gauge your emotional reaction to this situation. Once calm, ask yourself what you would believe about the world in this situation. Would you accept materialism? Or would you still be seeking mystical answers to the nature of reality?"

I am still asking myself this question. Why? I am struggling to figure out whether or not I am wrong.

I believe things that raise a lot of red flags for "crazy delusion." Things like:

"I came from another planet, vastly advanced in spiritual evolution relative to Earth, in order to help Earth transition from the third dimension to the fourth dimension. My primary mission is to generate as much light and love as possible, because this light and love will diffuse throughout Earth's magnetic fields and reduce the global amount of strife and suffering while helping others to achieve enlightenment. I am being aided in this mission by extraterrestrials from the fourth dimension who are telepathically beaming me aid packages of light and love."

These beliefs, and many others like them, are important to my worldview and I use them to decide my actions. Because I like to think of myself as a rational person, it is a matter of great concern to me to determine whether or not they are true.

I have come across nobody who can put forth an argument that makes me question these beliefs. Nobody except for one person: Eliezer Yudkowsky. This man did what no other could: he made me doubt my basic beliefs. I am still struggling with the gift he gave me.

This gift is that he made me realize, on a gut level, that I might be wrong, and gave me motivation to really figure out the truth of the matter.

So many intelligent people believe patently absurd things. It is so difficult to escape from such a trap once you have fallen into it. If I am deluded, I want to be one of the fortunate ones who escaped from his insanity.

The thing is, I really don't know whether or not I am deluded. I have never before been so divided on any issue. Does anybody have anything they'd like to add, which might stimulate my thinking towards resolving this confusion?

Comment author: arundelo 11 June 2011 03:29:25PM 1 point [-]

Good luck! It may help to remember that this sort of thing seems to be a failure mode of the human mind. I know someone who had a manic episode during which he believed he was destined to bring enlightenment to the world. (He also believed he could control the weather.)

In case you haven't come across this already, go here and read the paragraph that starts "But it is possible to do better, even if your brain malfunctions on you."

Comment author: Alicorn 11 June 2011 06:13:02PM 12 points [-]

There are several things to ask about beliefs like this:

  1. Do they make internal sense? (e.g. "What is the fourth dimension?")

  2. Do they match the sort of evidence that you would expect to have in the case of non-delusion? (e.g. "Do you have any observable physical traits indicating your extraterrestrial origin? Would someone looking into records of your birth find discrepancies in your records indicating forgery?")

  3. Do they try to defend themselves against testing? (e.g. "Do you expect to illuminate a completely dark room at night by generating light? Would you expect to exist happily in psychological conditions that would harm normal humans by subsisting on aid packages full of love?")

  4. Do they have explanatory power? (e.g. "Has there, as a matter of historical fact, been a sudden and dramatic reduction in global strife and suffering since the date of your supposed arrival?")

  5. Do they have a causal history that can be reasonably expected to track with truth across the entire reference class from an outside view? (e.g. "Did you receive your information via private mental revelation or a belief from as long ago as you can remember, similar to the beliefs of people you do consider crazy?")

Comment author: XiXiDu 11 June 2011 06:48:53PM 1 point [-]

"What is the fourth dimension?"

I understand your point, but it also reminded me of this :-)

Comment author: nwthomas 12 June 2011 07:12:49AM 3 points [-]

Hi, Alicorn!

  1. Yes. They are drawn from the material at http://lawofone.info/ . The philosophy presented there is internally consistent, to the best of my understanding.

  2. There is no physical evidence. All of the "evidence" is in my head. This is a significant point.

  3. There are a variety of points in the source document which could be interpreted as designed to defend its claims against testing. This is a significant point.

  4. I am not aware of any physically testable predictions that these beliefs make. This is a significant point.

  5. The causal history of these beliefs is that I read the aforementioned document, and eventually decided that it was true, mainly on the basis of the fact that it made sense to my intuition and resonated personally with me. This is a significant point.

Thanks for asking!

Comment author: MixedNuts 13 June 2011 10:28:35AM 1 point [-]

Currently reading Law of One. I'm not sure what the mechanism is, but it seems to involve people receiving telepathic messages (from an entity named Ra) and speaking them aloud. I would like to note that I have experienced messages coming into my head, seemingly from outside (either as voices or as an impulse to write), and can even occasionally cause it voluntarily. Their content can be partially unexpected, but it never contains information I could test independently. I consider this an entertaining misbug in my brain, not evidence of an external telepathic entity.

Comment author: handoflixue 11 June 2011 12:23:26PM 1 point [-]

Not every doubt calls for staging an all-out Crisis of Faith. But you should consider it when: [snip list]

It's interesting realising how many of these generally apply to the idea "I don't want a sex change" (and "I'm happy with my sexual orientation / current relationship / current job / current social circle"), but specifically I've noticed that transitioning from one sex to another seems to require that sort of heroic rational effort.

Comment author: Arandur 01 August 2011 04:56:19AM 2 points [-]

My belief in the tenets of the Church of Jesus Christ of Latter-Day Saints has these warning signs.

Comment author: anonymous300 07 October 2012 08:42:54PM 4 points [-]

Years later, I'm reading this.

I've been reading through the Major Sequences, and I'm getting towards the end of "How to actually change your mind" with this growing sense of unease that there are all these rational principles that I could use but know not to what end - and I think I might finally have found somewhere: my political convictions.

I'm not going to say which direction I lean, but I lean that direction very strongly. My friends lean that way. My parents do not lean that way as much as I do but they are in that general direction.

And I realize that I may not lean that way because it is the rational way to approach a well-run country, but because I am merely used to it.

So perhaps, one of these weeks, I will sit down and have a long hard think on my politics. I think I would prefer to stay leaned the way I currently am - but I would, wouldn't I?

Comment author: chaosmosis 07 October 2012 09:00:36PM 2 points [-]

You'll be much more vulnerable to biases if you do this alone.

It would be ideal if you could find someone more moderate than you, or of the opposite view, who also wants to consider politics objectively, and talk things over with them. This could backfire if they're not very patient, and you shouldn't let it become an excuse for delaying, but if you can find someone suitable then I think that would be better.

Comment author: Rixie 27 August 2013 07:37:11PM 1 point [-]

I started letting go of my faith when I realized that there really isn't much Bayesian evidence for it. Realizing that the majority of the evidence needed to believe something is used just to isolate that something out of all the other possible beliefs finished it off. But I do have one question: If Jesus wasn't magic, where did the Bible even come from? Lee Strobel "proves" that Jesus died and came back from the dead, but his proofs are based on the Bible. Why was the Bible so widely accepted if there wasn't anything extra-special about Jesus after all?

Comment author: Benito 27 August 2013 08:11:36PM 1 point [-]

Asking similar questions about the Quran and various other religions' holy texts, and about the general popularity of many cults, makes you realise that an idea, or set of ideas, has no requirement to be true in order to be popular. In fact, looking at the self-help section in a bookstore reminds you of this (see the first post of Lukeprog's self-help sequence). I also believe that Richard Carrier has a book called 'Not the Impossible Faith' which discusses this question, although definitely check that if you're thinking of buying it for that purpose.

Comment author: Nornagest 27 August 2013 08:18:48PM *  1 point [-]

If Jesus wasn't magic, where did the Bible even come from?

Well, if you make the assumption that Jesus existed and behaved as described in the New Testament, this reduces to Lewis's trilemma. The criticisms section of that page outlines some of the possible responses.

The option I personally find most compelling is that there's plenty of room for distortion and myth-making between Jesus's ministry and the writing of the earliest Christian works we know about: at least four decades [ETA: got this wrong earlier; see downthread], possibly more depending on how generous you're being. Knowing what we do about how myths form, that's more than enough time for the supernaturalism in the Gospels to have accumulated. Look at it this way and it's no longer a question of "lunatic, liar, or Lord", but rather a colossal game of Telephone played among the members of a fragmented and frequently persecuted sect, many of whom would have had an incentive to play up the significance of the founding events. There are more recent religious innovations you can look at for comparison: Mormonism, for example, or Rastafarianism.

Some have even used this to argue against the historicity of Jesus, although I don't think doing so is necessary to a secular interpretation of the New Testament.

Comment author: Protagoras 27 August 2013 11:23:53PM *  0 points [-]

How do you get "between a hundred and twenty and two hundred" years? The standard story puts the death of Jesus around 30 CE and dates the composition of the earliest gospel to around 70 CE. Admittedly, the standard story is certainly not beyond question,[1] but I'd be interested to know whether you have any specific reasons for advocating a different timeline. Of course, 40 years is more than sufficient for pretty much unlimited distortion and mythmaking anyway.

[1] The chain of reasoning for dating the composition is, sadly, too often along these lines: we know that A was certainly written before date X, because A must be before B. We know this because B contains a vague reference that kind of looks like it refers to A, and it doesn't look all that likely that B was tampered with by later scholars to insert the reference. B must be before C for similar reasons, and C before D, and D before E, and E actually contains some fairly specific references to being written around date Y which we again don't think are all that likely to have been tampered with by later copyists. It is unlikely at each stage that the next writer acquired and made use of the text as soon as it was written, so we subtract a few years from Y for each stage for the transmission of the text and arrive at X as the latest possible date for A to have been written.

Comment author: Nornagest 27 August 2013 11:47:21PM *  1 point [-]

My mistake: I was thinking of non-Christian references to the life of Jesus (and didn't have the dates quite right there either; Tacitus wrote in the early second century and Josephus late in the first, though both references are rather brief). As best I can tell, you're right about the chronology of the Christian writings; Mark is thought to be the earliest of the surviving Gospels, and it was probably written around 70 CE. The hypothetical Q source may have come somewhat earlier, but it seems to have been a collection of sermons and proverbs rather than a gospel as such, if its projected influence on later works is anything to go by.

Edited to correct. But yes, forty years is a large enough gap to explain a lot of drift.

Comment author: Protagoras 28 August 2013 01:18:09AM 0 points [-]

Anyway, besides a handful of people who question the usual gospel dating and try to argue that it was really considerably later, I know there's also a tiny minority of scholars who date the life of Jesus much earlier - as much as a century or more before what the standard story reports. Hence I'd wondered whether you subscribed to one of those theories. It means having to assume that some of the references to contemporary events in the gospels are simply wrong, but honestly the standard story also has to do that; it just has a different set of mistakes to explain away. Still, it's a pretty tiny minority theory, and I haven't really investigated what the evidence for it is supposed to be.

Comment author: Salemicus 27 August 2013 08:20:22PM *  2 points [-]

If Jesus wasn't magic, where did the Bible even come from?

Some people wrote it down. That's also the Christian story of where the Bible came from.

There probably was something extra-special about Jesus, in the sense that he was highly charismatic, or persuasive, and so on. And his followers probably really did think that he'd come back from the dead, or at least that his body had mysteriously vanished. But none of that adds up to magic or divinity. Look at people in the present day - convinced (rightly or wrongly) of the existence of aliens, or homeopathy, or whatever else. "If L. Ron Hubbard wasn't magic, where did Dianetics come from?"

Alternatively, consider Joseph Smith. He's far more recent and far better attested than Jesus, and he too had a loyal group of followers who swore blind that they'd seen miracles - even the ones who later broke with him - and who, after his death, carried on his teachings and founded a religion with the utmost seriousness, in the face of extreme hardship and sacrifice. Yet chances are you're not a Mormon (or, if you are a Mormon, consider Mohammed ibn Abdullah). Apply the same thinking to Jesus's life as you do to Joseph Smith's, and see where it takes you.