In response to Cached Selves
Comment author: kurige 23 March 2009 12:08:49AM *  9 points [-]

Great post.

Here's some additional reading that supports your argument:

Distract yourself. You're more honest about your actions when you can't exert the mental energies necessary to rationalize your actions.

And the (subconscious) desire to avoid appearing hypocritical is a huge motivator.

I've noticed this in myself often. I faithfully watched LOST through the third season, explaining to my friends who had lost interest around the first season that it was, in fact, an awesome show. And then I realized it kind of sucked.

Comment author: kurige 22 March 2009 08:33:10PM *  28 points [-]

Picture of Eliezer in monk's robes (That is you, right?), stories about freemason-esque rituals, specific vocabulary with terms like, "the Bayesian conspiracy".

It's all tongue in cheek, and I enjoy it. But if you're trying to not look like a cult, then you're doing it wrong.

Comment author: kurige 21 March 2009 10:44:27PM *  2 points [-]

In the modern world, karate is unlikely to save your life. But rationality can.

The term "Bayesian black-belt" has been thrown around a number of times on OB and LW... this, in my mind, seems misleading. As far as I can tell there are two ways in which Bayesian reasoning can be applied directly: introspection and academia. Within those domains, sure, the metaphor makes sense... in meatspace life-and-death situations? Not so much.

"Being rational" doesn't develop your quick-twitch muscle fibers or give you a sixth sense.

Perhaps, where you live, you are never in danger of being physically accosted. If so, you are in the minority. Rationality may help you avoid such situations, but never with a 100% success rate. When you do find yourself in such a situation, you may find yourself wishing you'd studied up on a little Tae Kwon Do.

On at least two occasions - one only a year past - my life was at serious risk because I was not thinking clearly. ... As a gambler I don't like counting on luck, and I'd much rather be rational enough to avoid serious mistakes.

Can you give an example of how being "more rational" could have avoided the accidents?

Of course, properly applying rational techniques will bleed over into all areas of your life. Having a more accurate map of the territory means that you will make better decisions. The vast majority of these decisions, however, can be written off as common sense. Just because I drink coffee when I drive at night to stay alert doesn't make me a master of the "martial art of rationality".

Comment author: Cameron_Taylor 19 March 2009 10:45:00AM 1 point [-]

Perhaps there's something fundamental I'm missing here, but the linearity of events seems pretty clear. If Omega really did calculate that I would give him the $100 then either he miscalculated, or this situation cannot actually occur.

That's absolutely true. In exactly the same way, if Omega really did calculate that I wouldn't give him the $100 then either he miscalculated, or this situation cannot actually occur.

The difference between your counterfactual instance and my counterfactual instance is that yours just has a weird guy hassling you with a deal you want to reject, while my counterfactual is logically inconsistent for all values of 'me' that I identify as 'me'.

Comment author: kurige 19 March 2009 11:08:10AM 4 points [-]

Thank you. Now I grok.

So, if this scenario is logically inconsistent for all values of 'me' then there really is nothing that I can learn about 'me' from this problem. I wish I hadn't thought about it so hard.

Comment author: MBlume 19 March 2009 10:02:53AM *  23 points [-]

There are various intuition pumps to explain the answer.

The simplest is to imagine that a moment from now, Omega walks up to you and says "I'm sorry, I would have given you $10000, except I simulated what would happen if I asked you for $100 and you refused". In that case, you would certainly wish you had been the sort of person to give up the $100.

Which means that right now, with both scenarios equally probable, you should want to be the sort of person who will give up the $100, since if you are that sort of person, there's half a chance you'll get $10000.

If you want to be the sort of person who'll do X given Y, then when Y turns up, you'd better bloody well do X.
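The expected-value arithmetic behind this argument can be made explicit. A minimal sketch (the `expected_value` function and the fair-coin assumption are mine, for illustration): before the coin flip, being the sort of person who pays comes out ahead on average.

```python
# Expected value of precommitting to pay vs. refusing, evaluated
# BEFORE Omega's fair coin flip. Heads: Omega pays you $10000, but
# only if you would have paid $100 on tails. Tails: Omega asks you
# for $100.

def expected_value(would_pay: bool) -> float:
    p_heads = 0.5
    p_tails = 0.5
    gain_on_heads = 10000 if would_pay else 0  # paid only to "payers"
    loss_on_tails = -100 if would_pay else 0   # payers hand over $100
    return p_heads * gain_on_heads + p_tails * loss_on_tails

print(expected_value(True))   # 0.5 * 10000 + 0.5 * (-100) = 4950.0
print(expected_value(False))  # 0.0
```

The point of the comment is exactly this asymmetry: ex ante, the "payer" disposition is worth $4950 on average versus $0 for the "refuser", even though ex post (after a losing flip) paying looks like a pure loss.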

Comment author: kurige 19 March 2009 10:34:18AM *  5 points [-]

That's not the situation in question. The scenario laid out by Vladimir_Nesov does not allow for an equal probability of getting $10000 and paying $100. Omega has already flipped the coin, and it's already been decided that I'm on the "losing" side. Join that with the fact that me giving $100 now does not increase the chance of me getting $10000 in the future because there is no repetition.

Perhaps there's something fundamental I'm missing here, but the linearity of events seems pretty clear. If Omega really did calculate that I would give him the $100 then either he miscalculated, or this situation cannot actually occur.

-- EDIT --

There is a third possibility after reading Cameron's reply... If Omega is correct and honest, then I am indeed going to give up the money.

But it's a bit of a trick question, isn't it? I'm going to give up the money because Omega says I'm going to give up the money and everything Omega says is gospel truth. However, if Omega hadn't said that I would give up the money, then I wouldn't have given up the money. Which makes this a bit of an impossible situation.

Assuming the existence of Omega, his intelligence, and his honesty, this scenario is an impossibility.

Comment author: Eliezer_Yudkowsky 19 March 2009 06:23:57AM 11 points [-]

We're assuming Omega is trustworthy? I'd give it the $100, of course.

Comment author: kurige 19 March 2009 09:44:21AM 8 points [-]

Can you please explain the reasoning behind this? Given all of the restrictions mentioned (no iterations, no possible benefit to this self) I can't see any reason to part with my hard earned cash. My "gut" says "Hell no!" but I'm curious to see if I'm missing something.

Comment author: kurige 19 March 2009 08:35:38AM *  13 points [-]

This post goes hand in hand with Crisis of Faith. Eliezer's post is all about creating an internal crisis and your post is all about applying that to a real world debate. Like peanut-butter and jelly.

If you want to correct and not just refute then you cannot bring to the table evidence that can only be seen as evidence from your perspective. I.e., you cannot directly use evolution as evidence when the opposing party has no working knowledge of evolution. Likewise, a Christian cannot convince an atheist of the existence of God by talking about the wonders of His creation. If you picture your belief system and your opponent's as Venn diagrams, then the discussion must start where they overlap, no matter how small that sliver of common knowledge might be. Hopefully, if you and your opponent employ crisis-of-faith properly, those two circles will slowly converge.

Comment author: kurige 18 March 2009 04:14:16AM *  16 points [-]

There is an excellent example of "priming" the mind here.

The idea is that specific prior knowledge drastically changes the way we process new information. You listen to a sine-wave modulated recording that is initially unintelligible. You then listen to the original recording. You are now primed. Listen again to the modulated recording and suddenly the previously unintelligible recording is clear as day.

I first listened to all of the samples on December 8th, when the link was posted on kottke.org. If I'm not mistaken that means it's been exactly 100 days since I last heard, or even thought about, these recordings. I listened to them again just a few minutes ago and understood every single one of them perfectly.

I can't decide if this is impressive or terrifying.

Comment author: kurige 14 March 2009 11:01:49AM 1 point [-]

Just did a quick search of this page and it didn't turn up... so, by far, the most memorable and referred-to post I've read on OB is Crisis of Faith.

Comment author: Tyrrell_McAllister 09 March 2009 06:31:42PM 19 points [-]

I hope that Kurige comes back to verify this, but I'll bet that when he said

I chose to believe in the existence of God - deliberately and consciously. This decision, however, has absolutely zero effect on the actual existence of God.

he did not mean, "My belief isn't correlated with reality". Rather, I'll bet, he meant exactly what you meant when you said

telling yourself X doesn't make X true

By saying that his choice had no effect on reality, I expect that he meant that his control over his belief did not entail control over the subject of that belief, i.e., the fact of the matter.

His attribution of Orwellian doublethink to himself is far more confusing. I have no idea what to make of that. Maybe your advice in this post is on point there. But the "absolutely zero effect" quote seems unobjectionable.

Comment author: kurige 10 March 2009 09:15:45AM *  5 points [-]

His attribution of Orwellian doublethink to himself is far more confusing. I have no idea what to make of that. Maybe your advice in this post is on point there. But the "absolutely zero effect" quote seems unobjectionable.

From the original comment:

One thing I've come to realize that helps to explain the disparity I feel when I talk with most other Christians is the fact that somewhere along the way my world-view took a major shift away from blind faith and landed somewhere in the vicinity of Orwellian double-think.

I don't have the original text handy, but a quick search on wikipedia brings up this quote from the book defining the concept:

The power of holding two contradictory beliefs in one's mind simultaneously, and accepting both of them. … To tell deliberate lies while genuinely believing in them, to forget any fact that has become inconvenient, and then, when it becomes necessary again, to draw it back from oblivion for just so long as it is needed, to deny the existence of objective reality and all the while to take account of the reality which one denies.

The first sentence and the first sentence alone is the definition I had in my mind when I wrote the comment. It has been quite a while since I last read 1984 and I had forgotten the connotation that to "double-think" is to "deny the existence of objective reality." This was not my intention at all, although, upon reflection, it should have been obvious.

This was bad homework on my part; I should have looked the quote up before writing the comment. Instead of focusing on the example of morality that I used in the original comment I'm going to try to step back a bit to clarify my original point... Instead of blind faith in religious tenets, my world-view currently accommodates two traditionally exclusive systems of belief: religion and science.

These two beliefs are not contradictory, but the complexity lies in reconciling the two.

If one does not agree with the other then my understanding of one or the other is flawed.
