Less Wrong is a community blog devoted to refining the art of human rationality. Please visit our About page for more information.

Anticipation vs. Faith: At What Cost Rationality?

Post author: Wei_Dai 13 October 2009 12:10AM 8 points

Anticipation and faith are both aspects of the human decision process, in a sense just subroutines of a larger program, but they also generate subjective experiences (qualia) that we value for their own sake. Suppose you ask a religious friend why he doesn't give up religion. He might say something like: "Having faith in God comforts me and I think it is a central part of the human experience. Intellectually I know it's irrational, but I want to keep my faith anyway. My friends and the government will protect me from making any truly serious mistakes as a result of having too much faith (like falling into dangerous cults or refusing to give medical treatment to my children)."

Personally I've never been religious, so this is just a guess at what someone might say. But these are the kinds of thoughts I have when faced with the prospect of giving up the anticipation of future experiences (after being prompted by Dan Armak). We don't know for sure yet that anticipation is irrational, but it's hard to see how it can be patched up to work in an environment where mind copying and merging are possible, and in the meantime we have a decision theory (UDT) that seems to work fine, but does not involve any notion of anticipation.

What would you do if true rationality requires giving up something even more fundamental to the human experience than faith? I wonder if anyone is actually willing to take this step, or is this the limit of human rationality, the end of a short journey across the space of possible minds?

Comments (105)

Comment author: PhilGoetz 13 October 2009 01:13:16AM 17 points [-]

You have to have a core of arational desires/goals/values. Otherwise, you're just a logic engine with nothing to prove.

Comment author: [deleted] 13 October 2009 02:28:29PM *  7 points [-]

Upvoted. To extend Eliezer's favorite metaphor, rationality makes it easy to get where you're going by making the map closely match the territory, but merely having a good map won't tell you where you want to go.

Comment author: UnholySmoke 15 October 2009 12:56:33PM 3 points [-]

Also upvoted, and very succinctly put.

Rationality is a tool we use to get to our terminal values. And what do we do when that tool tells us our terminal value is irrational?

Never ask that question.

Comment author: DanArmak 13 October 2009 04:12:02PM *  2 points [-]

I agree, and I think that pretty much answers the post's question:

What would you do if true rationality requires giving up something even more fundamental to the human experience than faith?

"True rationality" is, pretty much by definition, the best way of achieving your goals. The above question should be written as: do you have goals that are so important, that you would agree to give up something fundamental to the human experience to achieve them?

My personal answer is: my only such goals are of the form "do not undergo a huge amount of torture/suffering".

Comment author: Christian_Szegedy 20 October 2009 12:02:35AM 6 points [-]

If someone came and told you: "I can't fall in love. I've never fallen in love, and I also think it is wrong, harmful, crazy and irrational to fall in love," what would you say?

Most people would opt for one of the following answers:

  • Cool, that's great, good for you, finally a sane person!
  • That's pitiful, you're missing something important!

Falling in love has a lot of parallels to real religious conviction: it's irrational; it's not supported by evidence, yet it can't be argued away; it can be harmful to the person, is easily exploited, and often it is not mutual. :)

Comment author: RobinZ 20 October 2009 12:11:18AM 2 points [-]

As a clarification of the subjective nature of religious faith, this is most helpful ... but it doesn't really answer Wei_Dai's question.

Comment author: Christian_Szegedy 20 October 2009 12:24:28AM *  2 points [-]

No, it's not a clarification, it's just an extension.

You can ask the same questions about falling in love as about religious faith. Whether you come to similar or different conclusions, you learn something on the way.

Comment author: RobinZ 20 October 2009 02:20:16AM 1 point [-]

That's ... not really accurate. With love, the question we in this community would ask is "would seeking to maintain or expand this relationship be a good idea?" With religion, it's "does this being with whom a relationship is suggested actually exist?"

Comment author: Christian_Szegedy 20 October 2009 06:03:05PM 3 points [-]

In a lot of cases, one asks later whether the person one fell in love with actually exists.

Comment author: RobinZ 20 October 2009 06:30:26PM 3 points [-]

No, they don't. They may say the words, "was the person I fell in love with ever real", but their actual question is, "was I mistaken about the character of the human being I formed a relationship with". Not the same question.

Comment author: Jonii 13 October 2009 12:59:46AM *  5 points [-]

One such thing could be a conscious mind. You want to be rational and efficient, and you're offered a deal by a local neurosurgeon to reprogram your brain so that you become a perfect Bayesian who does everything to further your values. No akrasia, no bias blind spots, no hindrance at all. The only downside of the transformation process is that something that enables our conscious experiences (be it something like self-reflection, or conflicts within the system) is lost. Wanna do it?

I'm well aware that people here might disagree if this would be possible even in principle, but as far as I know, it could be, so this could then work as an example of a huge sacrifice made for furthering rationality.

Edit: Also, it would seem to me that "eliminating anticipation" is a case of explaining away.

Comment author: MichaelVassar 16 October 2009 09:48:22PM 4 points [-]

Not happily, but I won't be unhappy about it for long, so sure. I'm pretty sure that one such person is all the world would need, and, you know that epigram about 'what profit a man'? Well, it profits his utility function a lot.

Comment author: Alicorn 13 October 2009 01:06:31AM 5 points [-]

The transformation is at least partially self-undermining, if one of your values is conscious experience.

Comment author: Eliezer_Yudkowsky 13 October 2009 03:39:41PM 3 points [-]

Can I get xeroxed and have the transformation performed on one randomly selected version of me before it wakes up?

Comment author: DanArmak 13 October 2009 04:07:24PM *  1 point [-]

To have the "pure rational" version serve the human one? Have it achieve goals that aren't defined by ownership (e.g., 'reduce total suffering'), because it's better at those things? This sounds reasonable and I don't see why we shouldn't do this - assuming it's easier or otherwise preferable to creating a GAI from scratch to achieve these goals.

Comment author: aausch 16 October 2009 02:02:06AM *  0 points [-]

Would you enter a lottery where a small portion (0.01%) of the entrants are selected as losers at random, and transformed without being copied?

Comment author: DanArmak 16 October 2009 02:07:14AM *  0 points [-]

Being modified like this is much the same as death. I would lose personhood and the modified remains would not be serving my original goals, but those of the winners in the lottery.

So this is equivalent to asking: would I enter a lottery with a 0.01% chance of instant death and a prize for everything else? I might, depending on the prize. If the prize is service by the modified individuals, and they aren't based on myself but on other people (so they're not very well suited to advancing my personal goals), and I have to timeshare them with all the other winners (every winner receives about 0.01 / 99.99 ≈ 10^-4 of one servant's time, i.e. roughly 0.01%), that doesn't seem worthwhile.
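The service-time arithmetic in that last parenthesis can be sanity-checked in a couple of lines (illustrative only; the variable names are mine, not from the thread):

```python
# 0.01% of entrants are transformed into servants; they serve the other 99.99%.
losers = 0.0001           # fraction of entrants transformed
winners = 0.9999          # fraction of entrants who receive service
share_per_winner = losers / winners   # fraction of one servant's time per winner
# share_per_winner comes out to about 1e-4, i.e. each winner gets
# roughly 0.01% of one modified person's time
assert abs(share_per_winner - 1e-4) < 1e-6
```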

Comment author: aausch 16 October 2009 02:16:55AM 0 points [-]

Modified question in place.

Comment author: DanArmak 16 October 2009 01:22:35PM 0 points [-]

Modified answer in place...

Comment author: Tyrrell_McAllister 13 October 2009 01:21:45AM *  3 points [-]

What would be the consequence of giving up the idea of a subjective thread of consciousness?

I wonder if believers in subjective threads of consciousness can perform a thought experiment like Chalmers' qualia-zombie thought experiment. I gather that advocates of the subjective thread hold that it is something more than just having certain clumps of matter existing at different times holding certain causal relationships with one another. (Otherwise you couldn't decide which of two future copies of yourself gets to inherit your subjective thread). So, advocates, does this mean that you can imagine an alternate universe in which matter is arranged in the same way as in our own throughout time, but in which no subjective threads bind certain clumps together? That is, do you think that "subjective-thread zombies" are possible in principle?

Just as in the Chalmers thought experiment, subjective-thread zombies would go around insisting that they have subjective threads. After all, their brains and lips would be participating in the same causal processes that lead you to say such things in this universe. And yet they would be wrong. They would not be saying these things because they have subjective threads, since they don't. And so, it seems, your insistence that you have a subjective thread also cannot have anything to do with whether you in fact do.

It seems that the idea of subjective-thread zombies is subject to all the problems that qualia zombies have. How do advocates of the subjective thread address or evade these problems?

Comment author: Mycroft65536 15 October 2009 09:12:27PM 2 points [-]

Faith is easy to dismiss because it can fairly be defined as "belief without evidence".

What exactly is meant by "anticipation"?

Comment author: Jack 13 October 2009 05:16:59PM 2 points [-]

I haven't kept up with all the decision theory stuff, but can someone demonstrate to me that anticipation might be irrational, and not just that anticipation has to be reduced and re-understood?

Comment author: Wei_Dai 13 October 2009 06:29:49PM 2 points [-]

I'm still trying to understand the nature of anticipation well enough to either produce a conclusive demonstration that it's irrational, or show how it can be re-understood to fit into a rational decision process. Others are probably working on the same thing (I hope). So far, the best argument I have that anticipation might be irrational is the one I gave in the post.

Comment author: gjm 13 October 2009 09:12:06AM 2 points [-]

I don't see any reason why you should give up anticipation of future experiences. It's possible that in situations involving duplication and merging of minds you should accept that anticipation is an unreliable guide for decision-making, but that's not at all the same thing. (It's more like a religious person agreeing that if he gets seriously ill he should see a doctor rather than relying on his god to cure him -- which, in fact, religious people generally do.) At least for the present, the sort of switching and copying and merging and muddling of minds that would make our usual anticipation-based thinking fail badly doesn't happen to any appreciable extent.

Comment author: Jonathan_Graehl 13 October 2009 12:22:38AM 2 points [-]

Do you think an intellectual argument is all you need to self-modify so drastically? Or is this just in anticipation of some future mind-surgery technology?

Comment author: Wei_Dai 13 October 2009 12:56:31AM 1 point [-]

That's part of what I'm asking, I guess. Does anyone value rationality for its own sake, enough to give up anticipation if it turns out to be irrational, purely on intellectual grounds? And what would you do if mind copy/merging technology becomes a reality in the future (which I assume most of us here think is more likely than not)?

Comment author: Eliezer_Yudkowsky 13 October 2009 01:26:07AM 15 points [-]

Does anyone value rationality for its own sake, enough to give up anticipation if it turns out to be irrational, purely on intellectual grounds?

You don't make a conscious decision to give up something like that, if it needs giving up. You learn more, see that what you once thought was sense was in fact nonsense, and in the moment of realization, you have already lost that which you never had. Really this is the wrong way to phrase the question: you should properly ask, "If the idea of anticipation is complete nonsense and all our thoughts about it are mere helpless clinging to our own confusion, would you rather know what was really going on?" and to this I answer "Yes."

If someone offered to tell me the Real Story, saying, "Once you learn the Real Story, you will lose your grasp of that which you once called 'anticipation'; the concept will dissolve, and you will find it difficult to remember why you ever once believed such a notion could be coherent; just as you once lost 'time'," I would indeed reply "Tell me, tell me!"

Comment author: Wei_Dai 13 October 2009 09:17:26AM *  6 points [-]

I think when I wrote my previous response I may have missed your point somewhat. I guess what you're really saying is that, if anticipation is truly irrational, then once we sufficiently understand why it's irrational, we won't value it anymore, and it won't require any particular "effort" to give it up. Is this a better summary of your position?

If so, are you really sure it's true, that the human mind has that much flexibility and meta-rationality? Why? (Why do you believe this? And why would evolution have that much apparent foresight?)

Comment author: Eliezer_Yudkowsky 13 October 2009 03:30:43PM 3 points [-]

It is a better summary; and I can give no better answer than, "It's always worked that way for me before." I think the real difficulty would come for someone who was told that they had to give up anticipation, rather than seeing it for themselves in a thunderbolt of dissolving insight.

Comment author: Wei_Dai 13 October 2009 06:01:35PM 2 points [-]

My reasoning here is that evolution in general has very limited foresight, therefore there must be a limit to human rationality somewhere that is probably far short of ideal rationality. "It's always worked that way for me before" doesn't seem like very strong evidence in comparison to that argument.

Comment author: CronoDAS 13 October 2009 08:59:47AM *  6 points [-]

So, it might very well be the case that "time" is something you can get rid of from fundamental physics equations and still get the right answers.

But clocks still work. Even if time is only an "emergent" phenomenon (tee hee), it's still something that's there...

Comment author: aausch 16 October 2009 02:11:40AM 3 points [-]

If someone offered to tell me the Real Story, saying, "Once you learn the Real Story, you will lose your grasp of that which you once called 'anticipation'; the concept will dissolve, and you will find it difficult to remember why you ever once believed such a notion could be coherent; just as you once lost 'time'," I would indeed reply "Tell me, tell me!"

What about this situation:

"As a significant shortcut to developing an understanding of the Real Story, you can follow a formula which begins with a forced loss of your grasp of that which you once called 'anticipation'. I can promise that, once you do understand the Real Story, you will find it difficult to remember why you ever once believed the notion of 'anticipation' could be coherent. I have never found it useful to think about reversing the shortcut formula, so I cannot promise that the process is reversible"

Comment author: MichaelVassar 16 October 2009 09:50:35PM 2 points [-]

I'll do that too. Lots of chemicals can help with this.

Comment author: Jonathan_Graehl 13 October 2009 06:36:25PM 3 points [-]

The possibility of losing the natural feeling of anticipation, or time, isn't really on the table (yet). Knowing the real nature of things intellectually is always good, but does knowing that a feeling is an illusion remove its interference with comfort in the face of a rational decision?

Part of the thrill in bungee jumping is in the overriding. Are you saying that you can manipulate your decision making so that counterproductive instincts fade away?

Comment author: Wei_Dai 13 October 2009 03:10:42AM *  2 points [-]

You don't make a conscious decision to give up something like that, if it needs giving up.

I don't agree with this. I tend to make all other important decisions consciously. What's so special about this one? (ETA: Also, one potential way of giving it up is to edit my brain using some future technology. I think I definitely want to make that decision consciously.)

The rest of your comment seems to be saying that you're not yet convinced that anticipation is irrational. That's fair enough, but doesn't really address the main point of my post, which is that we regard some parts of our decision making process as having terminal values, and may decide to keep them (as luxuries, more or less) even if we come to believe that they no longer have positive instrumental value as decision subroutines.

Comment author: MichaelVassar 16 October 2009 09:49:28PM 0 points [-]

I'd like to vote this up several times.

Comment author: Wei_Dai 16 October 2009 11:07:52PM 3 points [-]

Can you explain why? I personally can't get used to this writing style, and it took me a few hours to figure out what Eliezer was getting at. I also don't understand why he chose to use a tone of high confidence, on something that he has rather flimsy evidence about.

Comment author: Douglas_Knight 16 October 2009 11:34:56PM 1 point [-]

If you spent hours to figure out what something meant, it's probably worth writing it out in your own words. At least it should help people who find the first style natural understand and communicate with you.

Comment author: Wei_Dai 16 October 2009 11:50:46PM 2 points [-]

I did. See here.

Comment author: Psychohistorian 17 October 2009 01:23:59AM *  0 points [-]

Does anyone value rationality for its own sake, enough to give up anticipation if it turns out to be irrational, purely on intellectual grounds?

Anticipation is an experience. I don't really see how one could decide to give up anticipation because it's irrational any more than they could decide to give up hunger just because it's irrational.

At least, so long as anticipation refers to "the good (bad) feeling one gets when thinking about an upcoming good (bad) event." I'm not really sure what else you'd mean by it, and I'm not sure how the truth could hope to destroy it.

Comment author: pengvado 17 October 2009 02:25:58PM *  0 points [-]

I'm pretty sure that everyone here who's considering giving up "anticipation" uses that term to mean not just any thinking about future experiences, but a consistent method of assigning concrete probabilities to future experiences. And the hypothesis about the irrationality of experiencing such anticipation is merely a corollary of that factual hypothesis: if your probability estimates on some particular question consistently fail to describe good bets, then binding emotions to those probabilities motivates bad decisions.
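To make "probabilities that fail to describe good bets" concrete, here is a toy sketch (my own construction, not from the thread): a coin flip decides whether you are copied into two minds (heads) or left as one (tails), and every resulting mind is offered the same even-money bet on heads. The per-mind anticipation says P(heads | I exist) = 2/3, but whether betting at those odds is good depends entirely on how winnings aggregate across copies, which is exactly the ambiguity that copying introduces:

```python
import random

def simulate(trials=100_000, pool_winnings=True, seed=0):
    """Average payoff per coin flip when every resulting mind
    bets $1 on heads at even odds."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        heads = rng.random() < 0.5
        copies = 2 if heads else 1          # heads -> you exist as 2 minds
        per_copy = 1.0 if heads else -1.0   # each mind's bet outcome
        if pool_winnings:
            # winnings from all copies accrue to one shared estate
            total += per_copy * copies
        else:
            # score each mind separately (one payoff per original person)
            total += per_copy
    return total / trials
```

At even odds the pooled payoff averages about +0.5 per flip while the per-person payoff averages 0, so the 2/3 "anticipation" describes good bets only under the pooled scheme; neither probability is simply *the* right one until the betting scheme is fixed.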

Comment author: SilasBarta 13 October 2009 02:30:24AM 1 point [-]

Suppose you ask a religious friend why he doesn't give up religion. He might say something like: "Having faith in God comforts me and I think it is a central part of the human experience. Intellectually I know it's irrational, but I want to keep my faith anyway. My friends and the government will protect me from making any truly serious mistakes as a result of having too much faith (like falling into dangerous cults or refusing to give medical treatment to my children)."

Um, no. If you were close with that friend, and he proved himself to be pretty intelligent, and he downed a few beers and you kept prying, his answer would be something more like,

"Yeah, I know all that God stuff is a load of garbage, but the public profession of faith in 'God' provides the social glue that allows a welfare-maximizing mutualist group to form, involving people of varying intelligence levels who take this stuff literally and not literally, which grants me access to a large social network with enforcement mechanisms for the prisoner's dilemma, and the ability to trade labor for labor, such as for babysitting, at more favorable rates than cash purchases would allow. Also, it uses psychological mechanisms that allow me to believe strongly enough in my healing to invoke the placebo effect in the body, which gives me real healing. Finally, my price for joining was low enough.

"Show me an atheist group that does all that, and I'm in. *hic* [passes out]."

Comment author: PhilGoetz 13 October 2009 03:22:13PM *  9 points [-]

No, that isn't what they would say. I was a Christian for many years. Most of them sincerely believe everything they're supposed to believe - at least in conservative churches, which I think comprise a large majority of US churches.

Comment author: SilasBarta 13 October 2009 03:36:04PM 0 points [-]

No, they don't really believe it; their actions are severely suboptimal for that belief set. They might have a greater belief that they believe it, however.

Comment author: Peter_de_Blanc 13 October 2009 08:28:20PM 8 points [-]

Oh, come on. Show me a human whose actions aren't severely suboptimal for their belief set.

Comment author: SilasBarta 13 October 2009 08:52:03PM 2 points [-]

Fair point, but here the difference is at its most obvious, if the professed beliefs accurately represent their internal predictive model of reality, which I claim they don't.

Comment author: GuySrinivasan 13 October 2009 04:10:09PM 5 points [-]

In the past I believed it, and was sad when "the flesh was weak" and I took actions severely suboptimal for that belief set. I wish I had only believed I believed it. I guess it's possible I only now believe I believed it and in fact in the past I merely believed I believed it, but I'm guessing no.

Humans are well known to be terrible at taking actions that aren't severely suboptimal for their beliefs, and for holding contradictory beliefs, and for holding beliefs which imply actions which are suboptimal for their other, contradictory beliefs.

Comment author: PhilGoetz 14 October 2009 01:43:56AM 4 points [-]

I understand the point about how people can "only believe they believe something," but the way you phrased it sounded like they are consciously faking it, and I think that is in most cases not true.

Comment author: Tyrrell_McAllister 13 October 2009 05:25:31PM 3 points [-]

Unless you are a person to whom faith ever came naturally, you should be very skeptical that your mind contains an accurate model of "that belief set" or of the minds that profess it.

Faith never came naturally to me. After long interaction with "faithful" people, I've developed some tentative hypotheses about how some of them think on some things. Cautious though these hypotheses are, they are enough to rule out the applicability of your characterization to most cases.

Comment author: SilasBarta 13 October 2009 05:43:52PM *  0 points [-]

If I'm right and they're just cynically going through the motions, do you think they're going to tell you that? Do you think they're even going to give evidence consistent with that? Of course not! They'll just keep up the charade. They'd only admit it if you were a close friend, and they were drunk at the time, like in my example.

The social benefits break down when you make your genuine beliefs become public knowledge.

Comment author: Tyrrell_McAllister 13 October 2009 05:45:59PM *  4 points [-]

If I'm right and they're just cynically going through the motions, do you think they're going to tell you that? Do you think they're even going to give evidence consistent with that? Of course not! They'll just keep up the charade.

If it's as hard to gather evidence as you claim, then you should be all the more skeptical of your own conclusions.

ETA: And if it's so crucial to avoid letting the slightest hint of doubt creep out, then we should expect evolution to find the simplest way to keep that from happening: Make a mind with the capacity to genuinely believe this stuff.

Comment author: SilasBarta 13 October 2009 06:15:44PM 1 point [-]

When anthropologists study religion, they focus mostly on the rituals, the social cohesion, the punishment of defectors (in the PD sense), and formation of authority structures, and not so much on the factual content of the adherents' purported beliefs.

My position is just: do that.

Comment author: Tyrrell_McAllister 13 October 2009 07:34:53PM 3 points [-]

When anthropologists study religion, they focus mostly on the rituals, the social cohesion, the punishment of defectors (in the PD sense), and formation of authority structures, and not so much on the factual content of the adherents' purported beliefs.

My position is just: do that.

I'm not suggesting that you rely only on their portrayal of their own beliefs. On the contrary, I'm suggesting long and careful observation of their behavior (including professions of belief) before you reach any confident conclusion.

And even after you've gathered many such observations, you will still be misled if you use the wrong approach in incorporating those observations into a model. Many natural-born atheists use the following fallacious approach to understanding the religious: They think to themselves, "What would it take to make me act like that and say those things? Well, for that to happen, I'd need to have the following things going on inside my mind: <...>. Therefore, those things must also be going on in the minds of theists (or at least of the intelligent ones)."

The flaw with this approach is that you're modeling the mind of a theist using the mind of a natural-born atheist, a mind which almost certainly works differently from a theist's mind when it comes to theological issues, almost by definition. That is why you should be skeptical that your mind contains an accurate model of a theist's mind.

Comment author: SilasBarta 13 October 2009 08:58:59PM 0 points [-]

I'm not suggesting that you rely only on their portrayal of their own beliefs. On the contrary, I'm suggesting long and careful observation of their behavior (including professions of belief) before you reach any confident conclusion.

...And even after you've gathered many such observations, you will still be misled if you use the wrong approach in incorporating those observations into a model. Many natural-born atheists use the following fallacious approach to understanding the religious: They think to themselves, "What would it take to make me act like that and say those things? ...

Well, I'm already relying on a large data set, and I was born into a Catholic family. My theory still makes more sense. Here are some more data points:

-The parallels between religion and politics: how they force people into teams, say whatever it takes to defend the team, and look for cues about whether you're on their team when they ask about your beliefs.

-The history of religious warfare. It makes no sense to view these people as going out to die for inscrutable theological doctrines, but complete sense to view their motives the same as they would be if you replaced the religion with some other memetic group.

Comment author: Tyrrell_McAllister 13 October 2009 10:50:22PM *  5 points [-]

My theory still makes more sense. Here are some more data points:

-The parallels between religion and politics: how they force people into teams, say whatever it takes to defend the team, and look for cues about whether you're on their team when they ask about your beliefs.

I'd say that this data point supports my position. Get an extreme left-winger or right-winger drunk and you're not going to hear them say, "yeah, those extreme political positions I espouse, I don't really think they're true. I just pretend to because of the social benefits I reap." On the contrary, you're going to hear them spout even more extreme views, views that they'd realize they ought to keep to themselves had they been sober.

-The history of religious warfare. It makes no sense to view these people as going out to die for inscrutable theological doctrines, but complete sense to view their motives the same as they would be if you replaced the religion with some other memetic group.

I agree. I'm not saying that every action ostensibly justified by religious beliefs is really done because of those beliefs. But that says nothing about whether those beliefs are sincerely held.

Comment author: PhilGoetz 13 October 2009 08:36:42PM *  2 points [-]

No; then you would have done that, rather than making an assertion about what they believed.

Perhaps you only believe that you believe that. :)

Comment author: SilasBarta 13 October 2009 09:00:32PM 1 point [-]

Well, I am doing that, in the sense of judging religions by the factors anthropologists study rather than focusing on how well I can disprove the claim that the earth is 6000 years old.

Comment author: bogus 13 October 2009 06:27:40PM 1 point [-]

Yes. Confucianism is the prototypical example of a "religion" which has no cosmological beliefs per se, but still provides for community cohesion (i.e. protection from perceived threats), an ethical code (the analects of Confucius are often quoted as proverbs/dogmas), a focus on authority figures and so forth.

Comment author: UnholySmoke 15 October 2009 01:07:37PM 3 points [-]

Beware of generalising across people you haven't spent much time around, however tempting the hypothesis. Drawing a map of the city from your living room etc.

My first 18 years were spent attending a Catholic church once a week. To the extent that we can ever know what other people actually believe (whatever that means), most of them have genuinely internalised the bits they understand. Like, really.

We can call into question what we mean by 'believe', but I can't agree that a majority of the world population is just cynically going with the flow. Finally, my parish priest is one of the most intelligent people I've ever met, and he believed in his god harder/faster/whatever than I currently believe anything. Scary thought, right?

Comment author: Jack 13 October 2009 05:00:51PM 0 points [-]

their actions are severely suboptimal for that belief set.

I agree that this is the case. However, that they don't really believe the tenets of Christianity only follows from this if we have a strictly behaviorist definition of "belief". I doubt many people hold that view, and I'm not sure why anyone should.

Comment author: thomblake 13 October 2009 03:04:11PM 3 points [-]

I've invoked similar arguments in favor of organized religion. While atheists could in principle get together every week and sing together, I don't know any who actually do, and I think we're worse off for it. Probably part of the appeal of humanistic churches.

Comment author: LauraABJ 13 October 2009 03:13:16PM 4 points [-]

I recommend the Unitarian Universalist church. I went to one as a child, and the focus was on humanism and morality, not god and faith. The Sunday school taught a different religion every weekend, making it nearly impossible to believe any of them were true. Most of the people there didn't really believe in god anyway, but were there for the reasons you name.

Comment author: Alicorn 13 October 2009 03:16:02PM 4 points [-]

And there's the less ubiquitous Ethical Culture Society, which is even less religious than Unitarianism.

Comment author: Eliezer_Yudkowsky 13 October 2009 03:35:56PM 2 points [-]

I wonder, though, if an x-rationalist would get a feeling of belonging there.

Comment author: Alicorn 13 October 2009 03:41:28PM 2 points [-]

What are the circumstances under which an x-rationalist would get a feeling of belonging? If there are no such circumstances, this is hardly a critique of Ethical Culture in particular.

Comment author: SilasBarta 13 October 2009 03:38:36PM 1 point [-]

Note the functions that I listed: the singing isn't strictly necessary; any bonding/reinforcement mechanism would work, but singing is very effective. If you could get the general mutualist functions down, then you'd have a competitive option.

Comment author: AllanCrossman 13 October 2009 04:41:43PM 0 points [-]

Ugh. The horrible music is the worst thing about church. Give me sermons about fire and brimstone any day.

Comment author: komponisto 13 October 2009 04:50:24PM 1 point [-]

Well, we would have better music, of course!

Comment author: CronoDAS 13 October 2009 06:55:02PM 2 points [-]
Comment author: Furcas 13 October 2009 06:44:56AM 3 points [-]

You seem to think that intelligent religious people are less crazy than dumb religious people. They're not.

Comment author: LauraABJ 13 October 2009 12:16:59PM 10 points [-]

Yes, and I would say actual faith is a cognitive error more akin to deja-vu than double think, in that it is a feeling of knowledge for which adequate logical justification may not exist. A friend of mine once said, "I'm sorry that I'm so bad at explaining this [the existence of God], but I just know it, and once you do too, you'll understand."

People can have experiences of faith in non-religious contexts, such as having faith (a sense of certainty or foreknowledge) that a critically ill loved one will pull through. Intuition and gut feelings may be considered faith-light, but I think certainty is part of the faith experience, and just because that certainty is false doesn't make the feeling any less real.

Comment author: Eliezer_Yudkowsky 13 October 2009 03:37:57PM 2 points [-]

I would say actual faith is a cognitive error more akin to deja-vu than double think, in that it is a feeling of knowledge for which adequate logical justification may not exist.

It looks to me like greater intelligence pulls people away from deja-vu faith and toward doublethink faith, but this is a generalization based on little data. Still, that little data seems to show that smart people who think about their religions end up with Escher-painting minds.

Comment author: LauraABJ 13 October 2009 04:13:46PM 2 points [-]

I don't have a large enough sample either, but I think what you interpret as doublethink and 'Escher-painting minds' may be the result of rationalizing a faith that at its core is an emotional attachment to a cognitive error. The friend I mentioned probably doesn't have an IQ much below the median for the readers of this blog: double major in biochem and philosophy at an Ivy League school, head of a libertarian club (would probably agree with Robin Hanson on almost everything).

Comment author: Eliezer_Yudkowsky 13 October 2009 05:46:31PM 2 points [-]

Well, yes, lots of rationalization is exactly how you end up with an Escher-painting mind. Even human beings aren't born that twisted.

See also: Occam's Imaginary Razor.

Comment author: SilasBarta 13 October 2009 03:56:43PM 1 point [-]

How is it crazy to cynically go along with rituals for the social benefits? Risky, maybe, but crazy?

Comment author: Furcas 13 October 2009 05:10:43PM 4 points [-]

It isn't crazy at all. I was saying that your intelligent, religious, and very drunk friend would never say those words, because there's no religious person who believes them. All these reasons may be the ultimate cause of religious beliefs, but that doesn't mean religious believers are aware of them, consciously or even subconsciously.

Comment author: SilasBarta 14 October 2009 10:28:24PM 1 point [-]

and very drunk friend would never say those words, because there's no religious person who believes them.

Just to clarify, what do you mean by "religious" here? Do you define it by whether they're active in a church?

If so, how much money would you bet I can't find you a counterexample?

Comment author: Furcas 15 October 2009 12:22:43AM *  2 points [-]

I mean a person who holds self-deceptive beliefs that serve as the basis for a moral code of some sort. Church attendance is irrelevant.

I know there are some people who act religious and call themselves religious but aren't religious at all, but I don't think that's the kind of person you were talking about, since such a person couldn't benefit from the placebo effect. You're talking about the kind of person who has successfully fooled himself into holding religious beliefs, and yet is still so fully aware that it's all self-deception that he calls it "a load of garbage".

There may be real religious believers who would say something like what you've written, but I'm certain that it would just be a rationalization for them, a way to hide the ridiculousness of their beliefs behind a veneer of fake instrumental rationality.

Considering I'm currently unemployed and have very little money left in my bank account, I would bet a thousand Canadian dollars that you can't find a real religious believer who will say those words and honestly mean them.

EDIT:

And if you were talking about people who completely fake being religious, well, in my experience most of them don't ever admit to themselves that they're really atheists in their heart of hearts. I suppose there must be exceptions, though.

Comment author: Alicorn 15 October 2009 12:29:45AM 2 points [-]

You can benefit from the placebo effect even if you know you're taking a placebo.

Comment author: thomblake 15 October 2009 01:30:00PM 1 point [-]

Relevant article in Wired: Placebos are getting stronger - researchers are starting to study the placebo response to see how it can be better utilized to aid in healing.

Comment author: Furcas 15 October 2009 12:40:24AM *  0 points [-]

I'm guessing that by the word "know" you mean "acknowledge that the evidence is strongly in favor of", which doesn't necessarily entail belief, as many religious believers have demonstrated.

If that isn't what you mean, I have no clue what you're talking about.

Comment author: Alicorn 15 October 2009 01:08:16AM 4 points [-]

No. I mean you can swallow a sugar pill, in full knowledge of, belief in, and acknowledgment-of-evidence-for the fact that it is a sugar pill, and still improve relative to not taking a sugar pill. It's not obvious to me why psychological "sugar pills" wouldn't work the same way.

Comment author: Furcas 15 October 2009 01:13:44AM 0 points [-]

I mean you can swallow a sugar pill, in full knowledge of, belief in, and acknowledgment-of-evidence-for the fact that it is a sugar pill,

... and belief that sugar pills don't cure diseases / alleviate symptoms?

and still improve relative to not taking a sugar pill.

I thought the placebo effect had to do with belief.

Comment author: SilasBarta 15 October 2009 03:05:27AM 1 point [-]

Considering I'm currently unemployed and have very little money left in my bank account, I would bet a thousand Canadian dollars that you can't find a real religious believer who will say those words and honestly mean them.

Okay, see, we're going in circles here: I'm trying to ask about the existence of someone who knows "it's all a load of garbage", heck, maybe even contributes to this very board, but cynically joins a church to get the social benefits.

And then you keep saying, no, such people don't exist, if you mean people who are also really religious. But that's the very point under discussion: how many people go through the motions of formal religion, say the right applause lights, etc., for the social benefits, while holding the conscious belief that there is no literal God in the sense the people there espouse?

And if you were talking about people who completely fake being religious, well, in my experience most of them don't ever admit to themselves that they're really atheists in their heart of hearts. I suppose there must be exceptions, though.

I don't see the difference. If you take the LW rationalist position on God, doesn't that mean you're an atheist? So what does it matter if you admit it to yourself. Is there some internal psychological ritual now? If you believe you're a duck, you're a duck...self-believer.

Comment author: Furcas 15 October 2009 03:26:56AM *  1 point [-]

Okay, see, we're going in circles here: I'm trying to ask about the existence of someone who knows "it's all a load of garbage", heck, maybe even contributes to this very board, but cynically joins a church to get the social benefits.

All right. I was misled by the fact that your first comment was a reply to Wei Dai, who was talking about real religious people. I thought you believed that (most?) intelligent people who say they're religious aren't really religious.

I don't see the difference. If you take the LW rationalist position on God, doesn't that mean you're an atheist? So what does it matter if you admit it to yourself.

It's the difference between your average forthright atheist and someone like Karen Armstrong, who believes that God "is merely a symbol that points beyond itself to an indescribable transcendence". If you look past the flowery language she's no more a theist than Richard Dawkins is. However, she likes to think of herself as a religious believer, so you'll never get her to admit the true reasons for her profession of belief, no matter how much alcohol she drinks, because she doesn't even admit it to herself.

Comment author: SilasBarta 15 October 2009 03:31:56AM 2 points [-]

All right. I was misled by the fact that your first comment was a reply to Wei Dai, who was talking about real religious people. I thought you believed that (most?) intelligent people who say they're religious aren't really religious.

Aren't religious in the sense of consciously taking it all literally, correct, that's my position.

It's the difference between your average forthright atheist and someone like Karen Armstrong, who believes that God "is merely a symbol that points beyond itself to an indescribable transcendence".

So, let's see, she gets benefit of approval from the numerous religious groups by saying all of the applause lights, while maintaining rationality about the literal God hypothesis.

Does that count as intelligent or foolish? I'll leave that as an exercise for the reader.

Comment author: Kaj_Sotala 14 October 2009 06:17:41PM 1 point [-]

I still feel the occasional temptation to start believing, and it has nothing to do with social benefits or a desire to heal my body.

Comment author: LauraABJ 13 October 2009 02:07:15AM 1 point [-]

I believe this idea has been played out in numerous works of science fiction (Kurt Vonnegut's Tralfamadorians, for example). It's difficult to comprehend what it would be like to know/understand the future so well as to not anticipate it, and thus I cannot say if I would 'give it up.' I think Eliezer is right, though: there would probably be no dramatic choice in the matter; either we understand or we don't.

Comment author: loqi 14 October 2009 12:22:41AM 0 points [-]

This seems analogous to giving up the belief that free will is somehow ontologically basic. The experience of having made a "choice" can be arbitrarily superimposed on pretty much any action I perform. I value the experience of choosing, but recognize it as subjective fiction. Similarly, I find your suggestion

perhaps it can run some computations on the side to generate the qualia of anticipation and continuity

to be highly intuitively acceptable, but I feel that I'm missing something, perhaps a compelling counter-example.

Comment author: timtyler 13 October 2009 06:40:32AM 0 points [-]

No notion of "anticipation"?

That's only because it subcontracts that work to a "mathematical intuition subroutine" that allows the formation of beliefs about the likely consequences of actions.

Comment author: Wei_Dai 13 October 2009 07:32:50AM 0 points [-]

True, but the output of the mathematical intuition subroutine is beliefs about mathematical propositions, relationships, structures, and the like, not expectations of future experiences, which is what we seem to value having. So unless you have some point that I'm not getting, your comment doesn't really seem to address my questions.

Comment author: timtyler 13 October 2009 08:04:27AM *  0 points [-]

"Anticipation" refers to feelings associated with the contemplation of future events. All reasonable decision theories contemplate future events - so is it the associated feelings that you are querying? You are suggesting that a "zombie" or "spock" agent might perform just as well? Maybe so - though one might claim that the evaluations of the desirability of the future events such agents contemplate is closely analogous to the "feelings" of more conventionally constructed creatures.