This is an unusually high-quality post, even for you, Eliezer; congrats!
It seems that it takes an Eliezer-level rationalist to make an explicit account of what any ten-year-old can do intuitively. For those not quite Eliezer-level or not willing to put in the effort, this is really frustrating in the context of an argument or debate.
I suspect that there are many people in this world who are, by their own standards, better off remaining deluded. I am not one of them; but I think you should qualify statements like "if a belief is false, you are better off knowing that it is false".
It is even possible that some overoptimistic transhumanists/singularitarians are better off, by their own standards, remaining deluded about the potential dangers of technology. You have the luxury of being intelligent enough to be able to utilize your correct belief about how precarious our continue...
Many in this world retain beliefs whose flaws a ten-year-old could point out
Very true. Case in point: the belief that "minimum description length" or "Solomonoff induction" can actually predict anything. Choose a language that can describe MWI more easily than Copenhagen, and they say you should believe MWI; choose a language that can describe Copenhagen more easily than MWI, and they say you should believe Copenhagen. I certainly could have told you that when I was ten...
The argument in this post is precisely analogous to the following:
Bayesian reasoning cannot actually predict anything. Choose priors that result in the posterior for MWI being greater than that for Copenhagen, and it says you should believe MWI; choose priors that result in the posterior for Copenhagen being greater than that for MWI, and it says you should believe Copenhagen.
The thing is, though, choosing one's own priors is kind of silly, and choosing one's own priors with the purpose of making the posteriors be a certain thing is definitely silly. Priors should be chosen to be simple but flexible. Likewise, choosing a language with the express purpose of being able to express a certain concept simply is silly; languages should be designed to be simple but flexible.
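For concreteness, here is a minimal sketch of the analogy in Python (the numbers are invented for illustration, not anyone's actual credences): when two hypotheses assign the same likelihood to every observation, the posterior ranking is simply whatever the prior ranking was, so a prior rigged to produce a desired ranking is doing all the work; once the hypotheses actually disagree about the data, the accumulated likelihood ratio swamps any fixed, non-dogmatic prior.

```python
# A minimal sketch (illustrative numbers only): when two hypotheses assign the
# same likelihood to every observation, the posterior ranking is fixed entirely
# by the prior -- so picking the prior (or the description language) to get a
# desired ranking does all the work, and the "update" does none. When the
# hypotheses do disagree about the data, the accumulated likelihood ratio
# eventually swamps any fixed, non-dogmatic prior.

def posterior_odds(prior_odds, likelihood_ratios):
    """Posterior odds = prior odds times the product of the likelihood ratios."""
    odds = prior_odds
    for lr in likelihood_ratios:
        odds *= lr
    return odds

# Case 1: empirically equivalent hypotheses (likelihood ratio = 1 for all data).
# Whoever picks the prior picks the conclusion.
print(posterior_odds(prior_odds=4.0, likelihood_ratios=[1.0] * 100))   # 4.0  -> favors A
print(posterior_odds(prior_odds=0.25, likelihood_ratios=[1.0] * 100))  # 0.25 -> favors B

# Case 2: hypotheses that disagree about observations (each datum favors A 2:1).
# Even prior odds of 1,000,000:1 against A are overwhelmed after ~20 observations.
print(posterior_odds(prior_odds=1e-6, likelihood_ratios=[2.0] * 30))   # ~1074 -> favors A
```

The same goes for description languages under minimum description length: a language rigged to favor one of two empirically equivalent theories is doing the choosing by itself, which is why "simple but flexible" is the operative constraint in both cases.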
It seems to me that you're waving the problem away instead of solving it. For example, I don't know of any general method for devising a "non-silly" prior for any given parametric inference problem. Analogously, what if your starting language accidentally contains a shorter description of Copenhagen than MWI?
(Do I get a prize for saying "e.g." so much?)
Yes. Here is an egg and an EEG.
Bo, the point is that what's most difficult in these cases isn't the thing that the 10-year-old can do intuitively (namely, evaluating whether a belief is credible, in the absence of strong prejudices about it) but something quite different: noticing the warning signs of those strong prejudices and then getting rid of them or getting past them. 10-year-olds aren't especially good at that. Most 10-year-olds who believe silly things turn into 11-year-olds who believe the same silly things.
Eliezer talks about allocating "some uninterrupted hours", but for me a proper Crisis of Faith takes longer than that, by orders of magnitude. If I've got some idea deeply embedded in my psyche but am now seriously doubting it (or at least considering the possibility of seriously doubting it), then either it's right after all (in which case I shouldn't change my mind in a hurry) or I've demonstrated my ability to be very badly wrong about it despite thinking about it a lot. In either case, I need to be very thorough about rethinking it, both because that way I may be less likely to get it wrong and because that way I'm less likely to spend the rest of my life worrying that I missed somethin...
Some interesting, useful stuff in this post. Minus the status-cocaine of declaring that you're smarter than Robert Aumann about his performed religious beliefs and the mechanics of his internal mental state. In that area, I think Michael Vassar's model for how nerds interpret the behavior of others is your God. There are probably some 10-year-olds who can see through it (look, everybody, the emperor has no conception that people can believe one thing and perform another). Unless this is a performance on your part too, and there's shimshammery all the way down!
"How do I know if long-held belief X is false?"
Eliezer, I guess if you already are asking this question you are well on your way. The real problem arises when you didn't even manage to pinpoint the possibly false belief. And yes, I was a religious person for many years before realizing that I was on the wrong path.
Why didn't I question my faith? Well, it was so obviously true to me. The thing is: did you ever question heliocentrism? No? Why not? When you ask the question "How do I know if heliocentrism is false?" you are already on your way. The thing is, your brain needs a certain amount of evidence to pinpoint the question.
How did I overcome my religion? I noticed that something was wrong with my worldview, like seeing a déjà vu in the Matrix every now and then. This happened on an intellectual level, not as a visible thing, but as something much more subtle and less obvious, so you really have to be attentive to notice it, to notice that there is a problem in the pattern. Things aren't the way they should be.
But over time I became more and more aware that the pieces weren't fitting together. But from there to arrive at the conclusion that my basic assumptions were wrong was really ...
Good post but this whole crisis of faith business sounds unpleasant. One would need Something to Protect to be motivated to deliberately venture into this masochistic experience.
All these posts present techniques for applying a simple principle: check every step on the way to your belief. They adapt this principle to be more practically useful, allowing a person to start on the way lacking necessary technical knowledge, to know which errors to avoid, which errors come with being human, where not to be blind, which steps to double-check, what constitutes a step and what a map of a step, and so on. All the techniques should work in background mode, gradually improving the foundations, propagating the consequences of the changes to m...
Fact check: MDL is not Bayesian. Done properly, it doesn't even necessarily obey the likelihood principle. Key term: normalized maximum likelihood distribution.
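For readers unfamiliar with the key term, the normalized maximum likelihood (NML) distribution can be sketched as follows, for a finite sample space and a parametric model class (this is the standard textbook form, stated here from memory rather than from any particular source):

$$
p_{\mathrm{NML}}(x) \;=\; \frac{p_{\hat\theta(x)}(x)}{\sum_{y} p_{\hat\theta(y)}(y)},
\qquad
\hat\theta(x) \;=\; \arg\max_{\theta}\, p_\theta(x).
$$

The log of the denominator is the model class's parametric complexity, and because $p_{\mathrm{NML}}$ is in general not a Bayesian mixture $\int p_\theta(x)\,\pi(\theta)\,d\theta$ for any prior $\pi$, MDL built on it is not simply Bayes with a cleverly chosen prior, which seems to be the comment's point.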
My father is an atheist with Jewish parents, and my mother is a (non-practicing) Catholic. I was basically raised "rationalist", having grown up reading my father's issues of Skeptical Inquirer magazine. I find myself in the somewhat uncomfortable position of admitting that I acquired my belief in "Science and Reason" in pretty much the same way that most other people acquire their religious beliefs.
I'm pretty sure that, like everyone else, I've got some really stupid beliefs that I hold too strongly. I just don't know which ones they are!
Great post. I think that this sort of post on rationality is extremely valuable. While one can improve everyday judgment and decision making by learning about rationality from philosophy, econ and statistics, I think that these informal posts can also make a significant difference to people.
The recent posts on AI theorists and EY's biography were among my least favorite on OB. If you have a choice, please spend more time on either technical sequences (e.g. stuff on concepts/concept space, evolutionary bio, notion of bias in statistics) or stuff on rationality like this.
A good reminder. I've recently been studying anarcho-capitalism. It's easy to get excited about a new, different perspective that has some internal consistency and offers alternatives to obvious existing problems. Best to keep these warnings in mind when evaluating new systems, particularly when they have an ideological origin.
"Try to think the thought that hurts the most."
This is exactly why I like to entertain religious thoughts. My background, training, and inclination are to be a thoroughgoing atheist materialist, so I find that trying to make sense of religious ideas is good mental exercise. Feel the burn!
In that vein, here is an audio recording of Robert Aumann speaking on "The Personality of God".
Also, the more seriously religious had roughly the same idea, or maybe it's the opposite idea. The counterfactuality of religious ideas is part of their strength, apparently.
Here's a doubt for you: I'm a nerd, I like nerds, I've worked on technology, and I've loved techie projects since I was a kid. Grew up on SF, all of that.
My problem lately is that I can't take Friendly AI arguments seriously. I do think AI is possible, that we will invent it. I do think that at some point in the next hundreds of years, it will be game over for the human race. We will be replaced and/or transformed.
I kind of like the human race! And I'm forced to conclude that a human race without that tiny fraction of nerds could last a good long tim...
I'd be interested in a list of questions you had decided to have a crisis of faith over. If I get round to it I might try and have one over whether a system can recursively self-improve in a powerful way or not.
A lot of truths in EY's post. Though I also agree with Hopefully Anon's observations -- as is so often the case, Eliezer reminds me of Descartes -- brilliant, mathematical, uncowed by dogma, has his finger on the most important problems, is aware of how terrifyingly daunting those problems are, thinks he has a universal method to solve those problems.
Trying to set up an artificial crisis in which one outcome is as likely as another is a very bad idea.
If your belief is rationally unjustifiable, a 'crisis' in which one has only a fifty-fifty chance of rejecting the belief is not an improvement in rationality. Such a crisis is nothing more than picking a multiple-choice answer at random -- and with enough arbitrarily chosen options, the chance of getting the right one becomes arbitrarily small.
A strategy that actually works is setting your specific beliefs aside and returning to a state of uncertainty, then testing one possibility against the other on down to first principles. Uncertainty != each possibility equally likely.
Thank you for this post, Eliezer. I must painfully question my belief that a positive Singularity is likely to occur in the foreseeable future.
Nazir Ahmad Bhat, you are missing the point. It's not a question of identity, like which ice cream flavor you prefer. It's about truth. I do not believe there is a teapot orbiting around Jupiter, for the various reasons explained on this site (see Absence of evidence is evidence of absence and the posts on Occam's Razor). You may call this a part of my identity. But I don't need people to believe in a teapot. Actually, I want everyone to know as much as possible. Promoting false beliefs is harming people, like slashing their tires. You don't believe in a flying teapot: do you need other people to?
Nazir, must there be atheists in order for you to believe in a god? The "identity" of those who believe that the world is round does not depend on others believing that the world is flat, or vice versa. Truth does not require disagreement.
Matthew C.,
You've been suggesting that for a while:
http://www.overcomingbias.com/2007/01/godless_profess.html#comment-27993437 http://www.overcomingbias.com/2008/09/psychic-powers.html#comment-130445874
Those who have read it (or the hundreds of pages available on Google Books, which I have examined) don't seem to be impressed.
Why do you think it's better than Broderick's book? If you want to promote it more effectively in the face of silence (http://www.overcomingbias.com/2007/02/what_evidence_i.html), why not pay for a respected reviewer's time and a writ...
Do these methods actually work? There were a few posts here on how more evidence and bias awareness don't actually change minds or reduce bias, at least not without further effort. Can a practical "Deduce the Truth in 30 Days" guide be derived from these methods, and change the world?
A fifty-fifty chance of choosing your previous belief does not constitute a reasonable test. If your belief is unreasonable, why would treating it as equally plausible as the alternative be valid?
The trick is to suspend belief and negate the biasing tendencies of belief when you re-evaluate, not to treat all potentials as equal.
Eliezer:
If a belief is true you will be better off believing it, and if it is false you will be better off rejecting it.
I think you should try applying your own advice to this belief of yours. It is usually true, but it is certainly not always true, and reeks of irrational bias.
My experience with my crisis of faith seems quite opposite to your conceptions. I was raised in a fundamentalist family, and I had to "make an extraordinary effort" to keep believing in Christianity from the time I was 4 and started reading through the Bible, and findin...
From "Twelve virtues of rationality" by Eliezer:
The third virtue is lightness. Let the winds of evidence blow you about as though you are a leaf, with no direction of your own. Beware lest you fight a rearguard retreat against the evidence, grudgingly conceding each foot of ground only when forced, feeling cheated. Surrender to the truth as quickly as you can. Do this the instant you realize what you are resisting; the instant you can see from which quarter the winds of evidence are blowing against you.
Eliezer uses almost the same words as you do. (Oh, and this document is from 2006, so he has not copied your lines.) Some posts earlier, Eliezer accused you of not reading his writings and just making stuff up regarding his viewpoints...
MichaelG:
On the other hand, I don't think a human race with nerds can forever avoid inventing a self-destructive technology like AI.
The idea is that if we invent Friendly AI first, it will become powerful enough to keep later, Unfriendly ones in check (either alone, or with several other FAIs working together with humanity). You don't need to avoid inventing one forever: it's enough to avoid inventing one as the first thing that comes up.
If a belief is true you will be better off believing it, and if it is false you will be better off rejecting it.
It is easy to construct at least these 2 kinds of cases where this is false:
I was raised in a Christian family, fairly liberal Church of England, and my slide into agnosticism started when I was about 5-7, when I asked if Santa Claus and God were real. I refused to get confirmed and stopped going to church when I was 13ish, I think.
In other words, you are advocating a combative, Western approach; I am bringing up a more Eastern approach, which is not to be so attached to anything in the first place, but to bend if the wind blows hard enough.
The trouble is that you cannot break new ground this way. You can't do Einstein-like feats. Y...
If an entire community can be persuaded to adopt a false belief, it may enable them to overcome a tragedy-of-the-commons or prisoners'-dilemma situation.
In a PD, agents hurt each other, not themselves. Obviously, false beliefs in my enemy can help me.
Study this deranged rant. Its ardent theism is expressed by its praise of the miracles God can do, if he chooses.
And yet... there is something not quite right here. Isn't it merely cloakatively theistic? Isn't the ringing denunciation of "Crimes against silence" militant atheism at its most strident?
So here is my idea: Don't try to doubt a whole core belief. That is too hard. Probe instead for the boundary. Write a little fiction, perhaps a science fiction of first contact, in which you encounter a curious character from a different culture. Wri...
If a belief is true you will be better off believing it, and if it is false you will be better off rejecting it.
I think evolution facilitated self-delusion precisely because that is not the case.
I was a Fred Phelps style ultra-Calvinist and my transition involved scarcely any effort.
Also, anti-reductionist, that's the first comment you've made I felt was worth reading. You may take it as an insult but I felt compelled to give you kudos.
I suspect that there are many people in this world who are, by their own standards, better off remaining deluded. I am not one of them; but I think you should qualify statements like "if a belief is false, you are better off knowing that it is false".
Of course I deliberately did not qualify it. Frankly, if you're still qualifying the statement, you're not the intended audience for a post about how to make a convulsive effort to be rational using two dozen different principles.
Eliezer, what do you mean here? Do you mean:
(A1) Individuals in the reference class really are always better off with the truth, with sufficient probability that the alternative does not bear investigating;
(A2) Humans are so unreliable as judges of what we would and would not benefit from being deceived about that the heuristic "we're always better off with the truth" is more accurate than the available alternatives;
(B) Individuals must adopt the Noble Might-be-truth "I'm always better off with the truth" to have a chance at the Crisis of Faith technique?
Eliezer: The position that people may be better off deluded in some situations is VERY compelling. If your audience is people who are literally NEVER better off deluded then I sincerely doubt that it includes you or anyone else. Obviously not every belief need receive all appropriate qualifications every single time, but when someone else points out a plausible qualification you should, as a rationalist, acknowledge it.
I'm very open to Anna's (A1), especially given the special difficulties of this sort of investigation, but only with respect to themselves. I would expect someone as smart as me who knew me well enough to some day come upon a situation where I should, by my values, be deceived, at least for some period.
@Anna:
I mean that you've given up trying to be clever.
@Vassar:
The position that people may be better off deluded in some situations is VERY compelling.
The position that people may be optimally deluded, without a third alternative, is much less compelling.
The position that realistic human students of rationality can be trying to do their best (let alone do the impossible), while trying to deliberately self-delude, strikes me as outright false. It would be like trying to win a hot-dog eating contest while keeping a golf ball in your mouth.
It is this outright falsity that I refer to when I say that by the time you attempt to employ techniques at this level, you should already have given up on trying to be clever.
As someone once said to Brennan:
She reared back in mock-dismay. "Why, Brennan, surely you don't expect me to just tell you!" ... Brennan gritted his teeth. "Why not?"
"What you're feeling now, Brennan, is called curiosity. It's an important emotion. You need to learn to live with it and draw upon its power. If I just give you the information, why, you won't be curious any more." Her eyes turned serious. "Not that you should prefer ign
The problem with the idea that sometimes people are better of not knowing is that it has no practical impact on how an ideal rationalist should behave, even assuming it's true. By the time you've learned something you'd be better off not knowing, it's too late to unlearn it. Humans can't really do doublethink, and especially not at the precision that would be required to be extremely rational while using it.
it has no practical impact on how an ideal rationalist should behave
With respect to themselves, not necessarily to others. Withholding information or even lying can be rational.
I think Einstein is a good example of both bending with ...
In other words, you are advocating a combative, Western approach; I am bringing up a more Eastern approach, which is not to be so attached to anything in the first place, but to bend if the wind blows hard enough.
The trouble is that you cannot break new ground this way. You can't do Einstein-like feats. You should follow the direction of the wind, but engage the nitrous to follow that direction, occasionally stopping and sticking a finger out the window to make sure you are going in the right direction.
In retrospect, I was certainly leaving christianity the day I decided that if god existed, he could not possibly object to me honestly trying to determine The Truth. Doubts stopped feeling like sins.
I think your something-to-protect must be the accuracy of your map, not anything on the map. (at least for a moment)
If someone says you need to fire a revolver at your something-to-protect, you will raise objection based on strongly held beliefs about the effects of revolvers. It's so hard to risk those beliefs because, with your current belief set, someone who...
In a PD, everyone having accurate information about the payoff matrix leads to a worse outcome for everyone than some false payoff matrices you could misinform them with. That is the point.
Do you agree that in a PD, it is not the case for any individual that that individual is harmed by that individual's knowledge? Your point goes through if we somehow think of the collective consisting as a single "you" with beliefs and preferences, but raises all sorts of issues and anyway isn't what Eliezer was talking about.
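A toy sketch (with payoffs invented for illustration) of the collective-level claim being debated: if each player plays whatever is dominant under the payoffs they believe, then feeding both players a false matrix can leave both better off under the true payoffs, even though, as noted just above, no individual player is harmed by that individual's own knowledge.

```python
# Toy sketch (invented payoffs): in a one-shot Prisoner's Dilemma, agents who each
# pick their dominant strategy under the TRUE payoffs end up worse off than agents
# who were fed a FALSE payoff matrix under which cooperation looks dominant.

# Row player's payoff, indexed by (my_move, their_move); 'C' = cooperate, 'D' = defect.
TRUE_PAYOFF = {('C', 'C'): 3, ('C', 'D'): 0, ('D', 'C'): 5, ('D', 'D'): 1}

# A false matrix we might misinform both players with: "defection is heavily fined."
FALSE_PAYOFF = {('C', 'C'): 3, ('C', 'D'): 0, ('D', 'C'): 1, ('D', 'D'): -2}

def dominant_move(payoff):
    """Return a strictly dominant move under the given (believed) payoffs, if one exists."""
    for mine, other in (('C', 'D'), ('D', 'C')):
        if all(payoff[(mine, theirs)] > payoff[(other, theirs)] for theirs in ('C', 'D')):
            return mine
    return None

for label, believed in (("true beliefs", TRUE_PAYOFF), ("false beliefs", FALSE_PAYOFF)):
    move = dominant_move(believed)       # both players reason identically
    actual = TRUE_PAYOFF[(move, move)]   # but reality pays out the true matrix
    print(f"{label}: both play {move}, each actually receives {actual}")

# true beliefs:  both play D, each actually receives 1
# false beliefs: both play C, each actually receives 3
```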
I think Einstein is a good example of both bending with the wind (when he came up with relativity)
I'm not sure what you mean by bending with the wind. I thought it was the evidence that provided the air pressure, but there was no evidence to support Einstein's theory above the theories of the day. He took an idea and ran with it to its logical conclusions. Then the evidence came; he was running ahead of the evidential wind.
If the wind is following Occam's razor or something internal, then it can be blowing in the wrong direction...
If the wind is following Occam's razor or something internal, then it can be blowing in the wrong direction...
Isn't that the subject of The Ritual?
One more argument against deceiving epistemic peers when it seems to be in their interest is that if you are known to have the disposition to do so, this will cause others to trust your non-deceptive statements less; and here you could recommend that they shouldn't trust you less, but then we're back into doublethink territory.
Phil Goetz, who I was replying to, was saying that type of thought should be unnecessary, if you don't hang on to your ideas tightly.
Not hanging on to ideas tightly is great for engineers and experimental scientists. It doesn't matter to a chemist whether MWI or Bohm is right. He can use either, switching back and forth between the viewpoints as he sees fit.
For a theoretical quantum physicist, he has to have some way of determining at which face of the knowledge mine to work, he has to pick one or the other. If it is not a strong reason then he might split his wor...
Well, I've just sat down and done one of those, and it was really difficult. Not so much because I was pushing against established beliefs (I had strong beliefs both ways, so it was more that any movement pushed somewhere) but because the largest worry I had, "Is this a fad?", is hard to answer specifically because I've recently changed to become so much more Bayesian. I used to do daft things like "giving ideas a chance". Consequently, I can't look to my long and undistinguished history in order to glean hints. I already don't do the o...
"I am better off deluding myself into believing my cherished late spouse was faithful to me."
1) Do you believe this is true for you, or only other people?
2) If you know that someone's cherished late spouse cheated on them, are you justified in keeping silent about the fact?
3) Are you justified in lying to prevent the other person from realizing?
4) If you suspect for yourself (but are not sure) that the cherished late spouse might have been unfaithful, do you think that you will be better off, both for the single deed, and as a matter of you...
I think Einstein is a good example of both bending with the wind (when he came up with relativity)
I'm not sure what you mean by bending with the wind. I thought it was the evidence that provided the air pressure, but there was no evidence to support Einstein's theory above the theories of the day. He took an idea and ran with it to its logical conclusions. Then the evidence came; he was running ahead of the evidential wind.
You do know roughly what I mean, which is that strenuous effort is only part of the solution; not clinging to ideas is the ot...
I am better off (in most circumstances) deluding myself into believing that the weather in Maine on the 23rd of June 1865 was near what I think the seasonal average might be for that decade, rather than memorising the exact temperature and rainfall if it were presented to me.
I believe this is true for most people, apart from climatologists.
I would rather not be around people who kept telling me true minutiae about the world and the cosmos, if they have no bearing on the problems I am trying to solve.
Am I justified in giving people a guess of the average ...
From where I'm standing, the spouse thing looks like obvious nonsense (of the category: not looking for a third alternative). You'd be far better off learning to share - especially since, if your spouse died, you'd have someone to talk to.
Nazir, a secret hack to prevent Eliezer from deleting your posts is here. #11.6 is particularly effective.
Religion is the classic example of a delusion that might be good for you. There is some evidence that being religious increases human happiness, or social cohesion. Its universality in human culture suggests that it has adaptive value.
Nope. There is some evidence that Christians in the USA are happier than atheists in the USA. But since that correlation doesn't hold up in Europe, I prefer to interpret it as: America is bad for atheists.
1) Do you believe this is true for you, or only other people?
I don't fit the premise of the statement -- my cherished spouse is not yet late, so it's hard to say.
2) If you know that someone's cherished late spouse cheated on them, are you justified in keeping silent about the fact?
Mostly yes.
3) Are you justified in lying to prevent the other person from realizing?
Mostly no.
4) If you suspect for yourself (but are not sure) that the cherished late spouse might have been unfaithful, do you think that you will be better off, both for the single deed, and as a...
Carl Schuman,
I keep posting the link, for a very simple reason.
Eliezer continues to post about the certainty of reductionism, while he has completely failed to investigate the evidence that reductionism cannot account for all of the observations.
He also continues to post snide remarks about the reality of psi phenomena. Again, he has completely failed to investigate the best evidence that he is wrong about this.
The post he wrote here shows a great commitment to intellectual integrity. And I honestly believe he means what he wrote here.
I suspect at some ...
Eliezer continues to post about the certainty of reductionism, while he has completely failed to investigate the evidence that reductionism cannot account for all of the observations.
Reducing things to the simplest description possible -- where 'possible' refers to the ability to accurately model things -- by definition accounts for all veridical observations.
His point is necessarily correct, as well as empirically so.
He also continues to post snide remarks about the reality of psi phenomena. Again, he has completely failed to investigate the best eviden...
I guess I am questioning whether making a great effort to shake yourself free of a bias is a good or a bad thing, on average. Making a great effort doesn't necessarily get you out of biased thinking. It may just be like speeding up when you suspect you're going in the wrong direction.
If someone else chose a belief of yours for you to investigate, or if it were chosen for you at random, then this effort might be a good thing. However, I have observed many cases where someone chose a belief of theirs to investigate thoroughly, precisely because it was an ...
Religion is the classic example of a delusion that might be good for you. There is some evidence that being religious increases human happiness, or social cohesion. Its universality in human culture suggests that it has adaptive value.
See last week's Science, Oct. 3 2008, p. 58-62: "The origin and evolution of religious prosociality". One chart shows that, in any particular year, secular communes are four times as likely to dissolve as religious communes.
Caledonian,
Read chapter 3, then come back and explain why a reductionistic explanation best accounts for the phenomena described there. Because if you are not conversant with the evidence, you simply have no rational basis to make any comment whatsoever.
You also seem to be playing some kind of semantic games with the word "reductionism" which I'll just note and ignore.
It's important in these crisis things to remind yourself that 1) P does not imply "there are no important generally unappreciated arguments for not-P", and 2) P does not imply "the proponents of P are not all idiots, dishonest, and/or users of bad arguments". You can switch sides without deserting your favorite soldiers. IMO.
Matthew C -
You are advocating nonreductionism and psi at the same time.
Supposing that you are right requires us to suppose that there is both a powerful argument against reductionism, and a powerful argument in favor of psi.
Supposing that you are a crank requires only one argument, and one with a much higher prior.
In other words, if you were advocating one outrageous theory, someone might listen. The fact that you are advocating two simultaneously makes dismissing all of your claims, without reading the book you recommend, the logical response. We thus don't have to read it to have a rational basis to dismiss it.
I would rather not be around people who kept telling me true minutiae about the world and the cosmos, if they have no bearing on the problems I am trying to solve.
Will, not wishing to be told pointless details is not the same as deluding yourself.
I was discussing the placebo effect with a friend last night though, and found myself arguing that this could well be an example of a time when more true knowledge could hurt. Paternalistic issues aside, people appear to get healthier when they believe falsehoods about the effectiveness of, say, homeopathy or s...
Phil: One of psi or non-reductionism being true would be a powerful argument in favor of the other.
Ben: great example.
Phil: One of psi or non-reductionism being true would be a powerful argument in favor of the other.
No, it really wouldn't. Neither implies the other, or even suggests that the other is more likely than it would be otherwise.
Caledonian, maybe you had arguments on this thread previously, but it seems more like the place for that sub-debate.
First of all, great post, Eliezer. If somebody holding that kind of standard thinks that cryogenics is a good investment, I should someday take some time to investigate the question more deeply than I have.
Now, without lessening the previous praise, I would like to make the following remarks about friendly AI:
- The belief has long remained in your mind;
- It is surrounded by a cloud of known arguments and refutations;
- You have sunk costs in it (time, money, public declarations).
I do not know if it has emotional consequences for you or if it has gotten mixed u...
In fact most human beings at most times (including today) have accepted both propositions as real.
That should be a tipoff for you. Of all the things most humans have accepted as real, how many of them do we currently recognize as real? The general acceptance of a position is a virtual guarantee that it's completely wrong.
As for your text -- no one who seriously suggests that mental states affecting health and shamanistic death spells are evidence for either 'non-reductionism' or psi is worth taking the time to refute in detail.
(Hint: there are perfec...
I was raised a theist and came to no longer believe as an adult. One of the turning points was reading the Anglican confession of faith, and supposing what my own beliefs might look like to an Anglican, who was also a Christian, saved by Jesus just like me - just a different variety of.
Eventually I began to wonder what my life experiences might look like to an atheist - religion is above all an interpretive filter that we use to make sense of our lives. Although I knew that my beliefs in God were right, what would my life look like to me if I did not believe...
Matthew C, I read the introduction and chapters 1 and 3. Are you sure you meant chapter 3? It does not seem to say what you think it says. Most of it is a description of placebos and other psychosomatic effects. It also discusses some events that are unlikely in isolation but seem trivially within the realm of chance given 100 years and approaching 7 billion people. There is also a paragraph with no numbers saying it can't just be chance.
It feels kind of like asking everyone in the country to flip a coin 25 times, then calling the 18 or so people who ...
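A rough sketch of the arithmetic being gestured at, assuming a population of about 300 million (roughly the 2008 US population):

```python
# Rough sketch of the coin-flip arithmetic above (population size is an assumption).
population = 300_000_000
flips = 25

p_all_heads = 0.5 ** flips                 # ~2.98e-8, about 1 in 33.5 million
expected_all_heads = population * p_all_heads
expected_all_heads_or_tails = population * 2 * p_all_heads

print(f"expected people with 25 straight heads:          {expected_all_heads:.1f}")          # ~8.9
print(f"expected people with 25 straight heads or tails: {expected_all_heads_or_tails:.1f}")  # ~17.9
```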
Odd, I'm a Christian daughter of two atheists. I guess I didn't miss out after all.
I became a Christian because I was a Bayesian first. I know there are others like me. I saw and experienced evidence that caused me to positively update my belief.
Now if you don't like that argument, then please tell me how can anyone become an atheist via Bayesian updating? Can your posterior really go to a point mass at zero (belief in God)? If so, please tell me what prior you were using. If not, please tell me how you define atheism.
Atheism is believing that the state of evidence on the God question is similar to the state of evidence on the werewolf question.
Would that apply to someone with a particularly high prior on the werewolf question? So, you would agree that anyone who believes that the state of evidence on "the God question" is more positive than the state of evidence on the "werewolf question" should consider labeling themselves an agnostic? A theist?
And, I presume that you believe that one's current belief in the state of evidence would be controlled by 1) verifiable general evidence, 2) experience, and 3) priors on both questions?
Then we're in agreement: you should (apparently) call yourself an atheist, and I should call myself a Christian, as we differ on #2 and #3. (not that theism = Christian, but that goes back to #2).
I became a Christian because I was a Bayesian first. I know there are others like me. I saw and experienced evidence that caused me to positively update my belief.
Now if you don't like that argument, then please tell me how can anyone become an atheist via Bayesian updating? Can your posterior really go to a point mass at zero (belief in God)? If so, please tell me what prior you were using. If not, please tell me how you define atheism.
And how can your probability go to one? You erect a straw man, sir. My probability that there is a god ...
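One way to write out the point both sides seem to accept, in odds form:

$$
\frac{P(G \mid E)}{P(\neg G \mid E)}
\;=\;
\frac{P(G)}{P(\neg G)}
\cdot
\frac{P(E \mid G)}{P(E \mid \neg G)}.
$$

As long as the prior odds are neither zero nor infinite and every likelihood ratio is finite and nonzero, the posterior stays strictly between 0 and 1; on that reading, "atheist" or "theist" labels a posterior pushed very close to one extreme (or, as Eliezer puts it above, a judgment about the state of the evidence), not a point mass at 0 or 1.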
@ Paul Murray
There is no straw man. You've presumed that I meant that Christian = "Pr(god)=1". That was never my claim. It had seemed that atheist was being used as Atheist="Pr(god)=0", but E. clarified his position. I think that agnostic (in the literal sense) is always a better term than atheist, but that's just semantics.
The real issue (to me) is what Christians (or other "people of faith") think of the atheistic position, and vice versa. Christians are often derided here as uneducated or un-Bayesian.
My point is not to conv...
I don't need to read the book. I believe that psi effects are not real, because if they were, they would already be part of accepted science.
It's not a matter of being closed-minded or open-minded. I'm just not accepting your book author as a legitimate authority. Most things I believe, I believe because they are asserted by authorities I accept. For example, I have never personally seen an experiment performed that establishes that the Sun is made primarily of hydrogen and helium, that an atom of gold contains seventy-nine protons, that George Washington ...
@Matthew C.
Do you mean by "remote staring experiments" those of Wiseman/Schlitz?
It seems that when properly controlled, they produced no statistically significant effect: http://forums.randi.org/archive/index.php/t-43727.html
So here I am having been raised in the Christian faith and trying not to freak out over the past few weeks because I've finally begun to wonder whether I believe things just because I was raised with them. Our family is surrounded by genuinely wonderful people who have poured their talents into us since we were teenagers, and our social structure and business rests on the tenets of what we believe. I've been trying to work out how I can 'clear the decks' and then rebuild with whatever is worth keeping, yet it's so foundational that it will affect my marriage (to a pretty special man) and my daughters who, of course, have also been raised to walk the Christian path.
Is there anyone who's been in this position - really, really invested in a faith and then walked away?
Quite a few. Dan Barker was a Christian minister before he walked away. But the truth is, it is much harder to walk away from a religion when one is married and has a family. And sometimes, it can destroy families to even voice doubts.
Christianity isn't the only religion that has this aspect. Among Orthodox Jews there's a common refrain that too many are leaving the faith and a standard suggested solution for this is to make kids marry earlier because once they are married they are much more likely to stay in the faith.
But whenever this sort of thing comes up, it is important to ask: how much do the social structures really depend on the religion? Will your husband love you less if you tell him you don't believe? Will your friends no longer be friendly? Will they stop providing social support? And if they will stop being friendly on such a basis, what makes you see them as genuine friends in the first place?
There's no question that these issues are deep and difficult and should probably be handled slowly. I'd recommend maybe sending a version of your question to The Friendly Atheist - one of the writers there has a column (Ask Richard) where he regularly answers questions much like yo...
The Litany of Gendlin is specifically about what you should or should not believe, and your feelings about reality. It says nothing about telling people what you think is true — although "owning up to it" is confusingly an idiom that normally means admitting the truth to some authority figure, whereas in this case it is meant to indicate admitting the truth to yourself.
God's ways are mysterious and it is presumptuous to suppose that we can understand them.
"I do not feel obliged to believe that that same God who has endowed us with senses, reason: and intellect has intended to forgo their use". Gods "thoughts are higher then out thoughts" but "this is life eternal, that they might know thee the only true God, and Jesus Christ, whom thou hast sent." and we are approaching "A time to come in which nothing shall be withheld".
The easiest way to have a crisis of faith is to go out an...
"Self-honesty is at its most fragile when we're not sure which path is the righteous one."
Are we ever "sure" of anything (especially ethics)?
For the past three days I have been repeatedly performing the following mental operation:
"Imagine that you never read any documents claimed to be produced by telepathy with extraterrestrials. Now gauge your emotional reaction to this situation. Once calm, ask yourself what you would believe about the world in this situation. Would you accept materialism? Or would you still be seeking mystical answers to the nature of reality?"
I am still asking myself this question. Why? I am struggling to figure out whether or not I am wrong.
I believe things that raise a lot of red flags for "crazy delusion." Things like:
"I came from another planet, vastly advanced in spiritual evolution relative to Earth, in order to help Earth transition from the third dimension to the fourth dimension. My primary mission is to generate as much light and love as possible, because this light and love will diffuse throughout Earth's magnetic fields and reduce the global amount of strife and suffering while helping others to achieve enlightenment. I am being aided in this mission by extraterrestrials from the fourth dimension who are telepathically beaming me aid packages of light and love.&...
There are several things to ask about beliefs like this:
Do they make internal sense? (e.g. "What is the fourth dimension?")
Do they match the sort of evidence that you would expect to have in the case of non-delusion? (e.g. "Do you have any observable physical traits indicating your extraterrestrial origin? Would someone looking into records of your birth find discrepancies in your records indicating forgery?")
Do they try to defend themselves against testing? (e.g. "Do you expect to illuminate a completely dark room at night by generating light? Would you expect to exist happily in psychological conditions that would harm normal humans by subsisting on aid packages full of love?")
Do they have explanatory power? (e.g. "Has there, as a matter of historical fact, been a sudden and dramatic reduction in global strife and suffering since the date of your supposed arrival?")
Do they have a causal history that can be reasonably expected to track with truth across the entire reference class from an outside view? (e.g. "Did you receive your information via private mental revelation or a belief from as long ago as you can remember, similar to the beliefs of people you do consider crazy?")
Not every doubt calls for staging an all-out Crisis of Faith. But you should consider it when: [snip list]
It's interesting realising how many of these generally apply to the idea "I don't want a sex change" (and "I'm happy with my sexual orientation / current relationship / current job / current social circle"), but specifically I've noticed that transitioning from one sex to another seems to require that sort of heroic rational effort.
My belief in the tenets of the Church of Jesus Christ of Latter-Day Saints has these warning signs.
Years later, I'm reading this.
I've been reading through the Major Sequences, and I'm getting towards the end of "How to Actually Change Your Mind" with this growing sense of unease that there are all these rational principles I could use but know not to what end. I think I might finally have found somewhere to apply them: my political convictions.
I'm not going to say which direction I lean, but I lean that direction very strongly. My friends lean that way. My parents do not lean that way as much as I do but they are in that general direction.
And I realize that I may not lean that way because it is the rational way to approach a well-run country, but because I am merely used to it.
So perhaps, one of these weeks, I will sit down and have a long, hard think about my politics. I think I would prefer to stay leaning the way I currently am - but I would, wouldn't I?
I started letting go of my faith when I realized that there really isn't much Bayesian evidence for it. Realizing that the majority of the evidence needed to believe something is used just to isolate that something out of all the other possible beliefs finished it off. But I do have one question: If Jesus wasn't magic, where did the Bible even come from? Lee Strobel "proves" that Jesus died and came back from the dead, but his proofs are based on the Bible. Why was the Bible so widely accepted if there wasn't anything extra-special about Jesus after all?
I really am having trouble doubting my conviction in rational thought. I can't visualize an alternative. I can visualize an alternative to my atheist philosophy though, since if God descended from heaven and handed me a bunch of concrete evidence that He exists, I wouldn't say 'ah, rationality was wrong.' I would say 'Oh, so you exist. I'll eat my hat on that one and concede that my confidence in your non-existence has been defeated, but to be fair until just now you've given me no rational reason to believe in you.' I'm a rational atheist because all of t...
For a person who has already escaped from religion, the thought "What general strategy would a religious person have to follow in order to escape their religion?" is like a vegetarian wondering how to make all people on Earth stop eating meat. Not a very constructive thought. If one starts thinking about such a general strategy, then one implicitly sets a goal of somehow assisting all religious persons to escape from their religion. But that kind of goal is not necessarily a good one :) Instead, a person who has already (or from the start) escaped religion can spend that time looking for more people with similar minds.
It seems to me that in our civilisation we have a quite nice way of dealing with a deficiency of crises of faith: assuming the narrative of epistemological societal progress, the people with poor epistemic hygiene, along with a smaller mix of those with better hygiene, die off, and a new generation is generally more able to look at the issues with a fresh set of eyes.
I'm not sure, however, how true it is that accurate memes tend to live and propagate - there are quite a few cases that are still disputed despite having been settled for hundreds of years, although I may be looking at too small a time frame here.
Saint Peter of Verona, patron saint of inquisitors, practiced this method when dealing with suspected heretics. By allowing himself to have a crisis of faith when confronted with the sincerity of his opposition, his beliefs came out stronger in the end and were often persuasive. Saint Peter not only never lost his faith, but through his example, inspired his assassin to devote his life to the Church.
I suggest instead finding an unforgivable sin within the religion you are seeking to escape. Then committing that sin gives you a personal incentive to bui...
no one questions the historicity of say Pythagoras
Really?
Here's a newspaper review whose author says Pythagoras "may well be a mythical amalgam of various forgotten sages". The book under review itself says "Sadly, it is now almost universally assumed by classical scholars that Pythagoras never existed". I suspect this is partly tongue in cheek, since the other information I can find doesn't seem consistent with what it says on its face, but if it's a joke I think it's the sort that depends on not being too far from the truth. Here's...
Which wasn't Christian while it was doing the conquering.
Fair point: Christianity in Europe is more "persuade the ruler and others will fall into line" than "spread by the sword". (Though IIRC there's some reason to think that the ruler was willing to be persuaded partly because substantial fractions of his people already had been.)
The problem with faith is that for many people it has become a part of their identity. The brain cells are intertwined and when someone attacks their faith, their self-protection mechanism kicks in and their rational thinking turns off.
It's basically like Plato's Allegory of the Cave, where prisoners choose to disbelieve the real world and go back to their own fake reality.
On my more pessimistic days I wonder if the camel has two humps.)
Link is dead. Is this the new link?
About ten years late to the party here, but regarding Aumann, I think you do him an injustice - he is well aware of the conflict between rationality and God. Here is an interview with him that goes in depth into these issues:
http://www.ma.huji.ac.il/~hart/papers/md-aumann.pdf?
He says: "Religion is an experience, mainly an emotional and aesthetic one. It is not about whether the earth is 5,765 years old." He goes into more detail. For him, the question of whether or not God really exists is almost irrelevant to his religion. He then delves int...
For me, I'd already absorbed all the right arguments against my religion, as well as several years' worth of assiduously devouring the counterarguments (which were weak, but good enough to push back my doubts each time). What pushed me over the edge, the bit of this that I reinvented for myself, was:
"What would I think about these arguments if I hadn't already committed myself to faith?"
Once I asked myself those words, it was clear where I was headed. I've done my best to remember them since.
Why do you consider religious faiths to be obviously untrue? "They would be child's play for an unattached mind to relinquish, if the skepticism of a ten-year-old were applied without evasion." Why do you consider the questions of a ten-year-old to be unanswerable except through evasion? On the contrary, such questions are almost invariably easily answerable by anyone who has the slightest knowledge of the philosophy of religion and the doctrine of their particular religion. It would be silly to be guided by the questions of a 10-year-old instead of the a...
I came back to this post to draw inspiration from it and found several issues with it that I now spot as a much older and more mature adult, almost 30.
Many in this world retain beliefs whose flaws a ten-year-old could point out, if that ten-year-old were hearing the beliefs for the first time. These are not subtle errors we’re talking about. They would be child's play for an unattached mind to relinquish, if the skepticism of a ten-year-old were applied without evasion. As Premise Checker put it, "Had the idea of god not come along until the scientific age, only an exceptionally weird person would invent such an idea and pretend that it explained anything."
And yet skillful scientific specialists, even the major innovators of a field, even in this very day and age, do not apply that skepticism successfully. Nobel laureate Robert Aumann, of Aumann’s Agreement Theorem, is an Orthodox Jew: I feel reasonably confident in venturing that Aumann must, at one point or another, have questioned his faith. And yet he did not doubt successfully. We change our minds less often than we think.
This should scare you down to the marrow of your bones. It means you can be a world-class scientist and conversant with Bayesian mathematics and still fail to reject a belief whose absurdity a fresh-eyed ten-year-old could see. It shows the invincible defensive position which a belief can create for itself, if it has long festered in your mind.
What does it take to defeat an error that has built itself a fortress?
But by the time you know it is an error, it is already defeated. The dilemma is not “How can I reject long-held false belief X?” but “How do I know if long-held belief X is false?” Self-honesty is at its most fragile when we’re not sure which path is the righteous one. And so the question becomes:
Religion is the trial case we can all imagine.[2] But if you have cut off all sympathy and now think of theists as evil mutants, then you won’t be able to imagine the real internal trials they face. You won’t be able to ask the question:
I’m sure that some, looking at this challenge, are already rattling off a list of standard atheist talking points—“They would have to admit that there wasn’t any Bayesian evidence for God’s existence,” “They would have to see the moral evasions they were carrying out to excuse God’s behavior in the Bible,” “They need to learn how to use Occam’s Razor—”
Wrong! Wrong wrong wrong! This kind of rehearsal, where you just cough up points you already thought of long before, is exactly the style of thinking that keeps people within their current religions. If you stay with your cached thoughts, if your brain fills in the obvious answer so fast that you can't see originally, you surely will not be able to conduct a crisis of faith.
Maybe it’s just a question of not enough people reading Gödel, Escher, Bach at a sufficiently young age, but I’ve noticed that a large fraction of the population—even technical folk—have trouble following arguments that go this meta.[3] On my more pessimistic days I wonder if the camel has two humps.
Even when it’s explicitly pointed out, some people seemingly cannot follow the leap from the object-level “Use Occam’s Razor! You have to see that your God is an unnecessary belief!” to the meta-level “Try to stop your mind from completing the pattern the usual way!” Because in the same way that all your rationalist friends talk about Occam’s Razor like it’s a good thing, and in the same way that Occam’s Razor leaps right up into your mind, so too, the obvious friend-approved religious response is “God’s ways are mysterious and it is presumptuous to suppose that we can understand them.” So for you to think that the general strategy to follow is “Use Occam’s Razor,” would be like a theist saying that the general strategy is to have faith.
“But—but Occam’s Razor really is better than faith! That’s not like preferring a different flavor of ice cream! Anyone can see, looking at history, that Occamian reasoning has been far more productive than faith—”
Which is all true. But beside the point. The point is that you, saying this, are rattling off a standard justification that’s already in your mind. The challenge of a crisis of faith is to handle the case where, possibly, our standard conclusions are wrong and our standard justifications are wrong. So if the standard justification for X is “Occam’s Razor!” and you want to hold a crisis of faith around X, you should be questioning if Occam’s Razor really endorses X, if your understanding of Occam’s Razor is correct, and—if you want to have sufficiently deep doubts—whether simplicity is the sort of criterion that has worked well historically in this case, or could reasonably be expected to work, et cetera. If you would advise a religionist to question their belief that “faith” is a good justification for X, then you should advise yourself to put forth an equally strong effort to question your belief that “Occam’s Razor” is a good justification for X.[4]
If “Occam’s Razor!” is your usual reply, your standard reply, the reply that all your friends give—then you’d better block your brain from instantly completing that pattern, if you’re trying to instigate a true crisis of faith.
Better to think of such rules as, “Imagine what a skeptic would say—and then imagine what they would say to your response—and then imagine what else they might say, that would be harder to answer.”
Or, “Try to think the thought that hurts the most.”
And above all, the rule:
Because if you aren’t trying that hard, then—for all you know—your head could be stuffed full of nonsense as bad as religion.
Without a convulsive, wrenching effort to be rational, the kind of effort it would take to throw off a religion—then how dare you believe anything, when Robert Aumann believes in God?
Someone (I forget who) once observed that people had only until a certain age to reject their religious faith. Afterward they would have answers to all the objections, and it would be too late. That is the kind of existence you must surpass. This is a test of your strength as a rationalist, and it is very severe; but if you cannot pass it, you will be weaker than a ten-year-old.
But again, by the time you know a belief is an error, it is already defeated. So we’re not talking about a desperate, convulsive effort to undo the effects of a religious upbringing, after you’ve come to the conclusion that your religion is wrong. We’re talking about a desperate effort to figure out if you should be throwing off the chains, or keeping them. Self-honesty is at its most fragile when we don’t know which path we’re supposed to take—that’s when rationalizations are not obviously sins.
Not every doubt calls for staging an all-out Crisis of Faith. But you should consider it when:
None of these warning signs are immediate disproofs. These attributes place a belief at risk for all sorts of dangers, and make it very hard to reject when it is wrong. And they hold for Richard Dawkins’s belief in evolutionary biology, not just the Pope’s Catholicism.
Nor does this mean that we’re only talking about different flavors of ice cream. Two beliefs can inspire equally deep emotional attachments without having equal evidential support. The point is not to have shallow beliefs, but to have a map that reflects the territory.
I emphasize this, of course, so that you can admit to yourself, “My belief has these warning signs,” without having to say to yourself, “My belief is false.”
But what these warning signs do mark is a belief that will take more than an ordinary effort to doubt effectively. It will take more than an ordinary effort to doubt in such a way that if the belief is in fact false, you will in fact reject it. And where you cannot doubt in this way, you are blind, because your brain will hold the belief unconditionally. When a retina sends the same signal regardless of the photons entering it, we call that eye blind.
When should you stage a Crisis of Faith?
Again, think of the advice you would give to a theist: If you find yourself feeling a little unstable inwardly, but trying to rationalize reasons the belief is still solid, then you should probably stage a Crisis of Faith. If the belief is as solidly supported as gravity, you needn’t bother—but think of all the theists who would desperately want to conclude that God is as solid as gravity. So try to imagine what the skeptics out there would say to your “solid as gravity” argument. Certainly, one reason you might fail at a crisis of faith is that you never really sit down and question in the first place—that you never say, “Here is something I need to put effort into doubting properly.”
If your thoughts get that complicated, you should go ahead and stage a Crisis of Faith. Don’t try to do it haphazardly; don’t try it in an ad-hoc spare moment. Don’t rush to get it done with quickly, so that you can say, “I have doubted, as I was obliged to do.” That wouldn’t work for a theist, and it won’t work for you either. Rest up the previous day, so you’re in good mental condition. Allocate some uninterrupted hours. Find somewhere quiet to sit down. Clear your mind of all standard arguments; try to see from scratch. And make a desperate effort to put forth a true doubt that would destroy a false—and only a false—deeply held belief.
Elements of the Crisis of Faith technique have been scattered over many essays:
And these standard techniques, discussed in How to Actually Change Your Mind and Map and Territory, are particularly relevant:
But really, there’s rather a lot of relevant material, here and on Overcoming Bias. There are ideas I have yet to properly introduce. There is the concept of isshokenmei—the desperate, extraordinary, convulsive effort to be rational. The effort that it would take to surpass the level of Robert Aumann and all the great scientists throughout history who never broke free of their faiths.
The Crisis of Faith is only the critical point and sudden clash of the longer isshokenmei—the lifelong uncompromising effort to be so incredibly rational that you rise above the level of stupid damn mistakes. It’s when you get a chance to use the skills that you’ve been practicing for so long, all-out against yourself.
I wish you the best of luck against your opponent. Have a wonderful crisis!
[1] See “Occam’s Razor” (in Map and Territory).
[2] Readers born to atheist parents have missed out on a fundamental life trial, and must make do with the poor substitute of thinking of their religious friends.
[3] See “Archimedes’s Chronophone” (http://lesswrong.com/lw/h5/archimedess_chronophone) and “Chronophone Motivations” (http://lesswrong.com/lw/h6/chronophone_motivations).
[4] Think of all the people out there who don’t understand the Minimum Description Length or Solomonoff induction formulations of Occam’s Razor, who think that Occam’s Razor outlaws many-worlds or the simulation hypothesis. They would need to question their formulations of Occam’s Razor and their notions of why simplicity is a good thing. Whatever X in contention you just justified by saying “Occam’s Razor!” is, I bet, not the same level of Occamian slam dunk as gravity.