
lmm comments on Rationalist Fiction - Less Wrong

27 Post author: Eliezer_Yudkowsky 19 March 2009 08:22AM




Comment author: lmm 07 January 2014 11:28:17PM 2 points

I view Dirk Gently as a kind of wonderfully effective strawman, and his stories were a great aid to realizing I was an atheist, because at first he seems correct: surely, rather than a "localized meteorological phenomenon", it makes more sense that the guy who's been rained on for 14 straight years is some kind of rain god.

And then you think about what would happen in the real world, and realize that no, even if someone had been rained on for 14 years straight, you would not believe that they were a rain god. Because rain gods are actually impossible.

That part hit me like a punch in the gut.

Comment author: TheOtherDave 08 January 2014 04:01:09AM 5 points

In the real world, these are mostly just games we play with words.

Someone who has been rained on for 14 years straight has an extremely surprising property.

The label we assign that property matters a little, since it affects our subsequent behavior with respect to it. If I call it "rain god" I may be more inclined to worship it; if I label it a "localized meteorological phenomenon" I might be more inclined to study it using the techniques of meteorology; if I label it an extremely unlikely coincidence I might be more inclined not to study it at all; if I label it the work of pranksters with advanced technology I might be more inclined to look for pranksters, etc.

Etc.

But other things matter far more.

Do they have any other equally unlikely observable attributes, for example?
Did anything equally unlikely occur 14 years ago?

Etc.

Worrying overmuch about labels can distract us from actually observing what's in front of us.

Comment author: VAuroch 08 January 2014 04:47:42AM 0 points

So it wouldn't be possible to convince you that 2+2=3? No matter the evidence?

Suppose someone claimed to be a rain god, or was credibly claimed to be one on the strength of previous evidence, and tested this: they pass through an EMP, strip, and generally remove any plausible technological means of influence; they are then transported, in a medically-induced coma, to a series of destinations in large deserts not disclosed to them in advance; and at all times they are directly under, in, or above rainclouds, defying all the meteorological patterns that the best models predicted just before the trip. I find it hard to see how you could then reasonably fail to assign significant probability to a model that makes the same predictions as "this person is a rain god".

Comment author: lmm 10 January 2014 12:38:07PM 0 points

It'd be possible, but it would take more evidence than someone having been rained on for 14 years.

If you're talking about models and predictions you've already made the relevant leap, IMO. Even if you're calling the person a "god", you're still taking a fundamentally naturalistic approach; you're not assuming basic mental entities, you're not worshiping.

Comment author: VAuroch 11 January 2014 01:26:21AM -1 points

Calling someone a rain god is making the prediction "If I worship this person, rain will occur at the times I need it more often than it would if I did not worship this person." Worship doesn't stop being worship just because it works.

Comment author: hyporational 08 January 2014 06:39:52AM 0 points

Where does personal insanity become a factor in your probability estimates?

Comment author: VAuroch 09 January 2014 04:37:11AM 1 point

In some sense, basically everywhere there is a very-low or very-high probability belief, since obviously I can't be more confident in any belief than I can be in the reliability of my system of reasoning. I definitely consider this when I'm evaluating the proper strength of nearly-certain beliefs. In another sense, almost nowhere.

I don't know exactly how confident I should be in my sanity, except that the probability of insanity is small. Also, I'm not confident there would be any evidence distinguishing 'sane and rational' from 'insane but apparently rational'. I model a logical-insane VAuroch as being like the anti-inductors; following different rules which, according to their own standards, are self-consistent.

Since I can't determine how to quantify it, my response has been to treat all other beliefs as conditioned on "my reasoning process is basically sound", which gives a fair number of my beliefs tacit probability 1; if I find reason to question any of these beliefs, I will have to rederive every belief from the original evidence as much as possible, because that would expose a significant flaw in the means by which I determine what beliefs to hold. Largely this consists of mathematical proofs, but also things like "there is not currently a flying green elephant in this room" and "an extant rain god is mutually incompatible with reductionism".
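(A minimal sketch of the arithmetic behind this, not from the original comment: by the law of total probability, a belief held with tacit probability 1 conditional on sound reasoning still inherits the uncertainty about soundness itself. The fallback probability of 0.5 given unsound reasoning is an assumed placeholder.)

```python
def unconditional_confidence(p_given_sound, p_sound, p_given_unsound=0.5):
    """Law of total probability: P(B) = P(B|S)P(S) + P(B|~S)P(~S),
    where S is "my reasoning process is basically sound"."""
    return p_given_sound * p_sound + p_given_unsound * (1 - p_sound)

# Even certainty conditional on sound reasoning is capped by P(S):
print(unconditional_confidence(1.0, 0.99))  # 0.995
```

This illustrates the earlier point that no belief can be held more confidently than the reasoning system that produced it.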

Comment author: ialdabaoth 09 January 2014 04:52:24AM 0 points

Since I can't determine how to quantify it, my response has been to treat all other beliefs as conditioned on "my reasoning process is basically sound", which gives a fair number of my beliefs tacit probability 1; if I find reason to question any of these beliefs, I will have to rederive every belief from the original evidence as much as possible, because that would expose a significant flaw in the means by which I determine what beliefs to hold. Largely this consists of mathematical proofs, but also things like "there is not currently a flying green elephant in this room" and "an extant rain god is mutually incompatible with reductionism".

This is an amazingly apt description of the mind-state that Robert Anton Wilson called "Chapel Perilous".

Comment author: VAuroch 09 January 2014 05:19:52AM -1 points

It is interesting that you think so, but I can't make head or tail of his description of the state, and other descriptions of it don't bear any particular resemblance to the state of mind I describe.

My position on the matter boils down to "All my beliefs may be unjustified, but until I have evidence suggesting they are, I should provisionally assume the opposite, because worrying about it is counterproductive."