RobinZ comments on The Fundamental Question - Less Wrong

43 Post author: MBlume 19 April 2010 04:09PM


Comment author: RobinZ 23 April 2010 07:44:20PM 0 points [-]

Taboo "Pascal's wager", please.

Comment author: byrnema 23 April 2010 08:22:50PM *  0 points [-]

Sure.

Here's an argument:

Suppose there is a dichotomy of beliefs, X and Y, with probabilities Px and Py, and suppose the utilities of holding each belief are Ux and Uy. Then the average utility of holding belief X is Px*Ux and the average utility of holding belief Y is Py*Uy. You "should" choose the belief (or set of beliefs) that maximizes average utility, because holding a belief is itself an action, and you should choose actions that maximize utility.
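The decision rule in that argument can be sketched in a few lines; the particular probabilities and utilities below are made-up placeholders, not anything from the thread:

```python
# Pick the belief whose average (expected) utility P * U is largest,
# treating belief adoption as just another action to optimize.

def best_belief(options):
    """options: dict mapping belief name -> (probability, utility)."""
    return max(options, key=lambda b: options[b][0] * options[b][1])

# Placeholder numbers with a Pascal's-wager shape: Y is far less
# probable than X, but its utility is large enough to dominate.
choice = best_belief({"X": (0.9, 1.0), "Y": (0.1, 100.0)})
print(choice)  # "Y", since 0.1 * 100 = 10 beats 0.9 * 1 = 0.9
```

Note how the rule rewards a sufficiently large utility regardless of how improbable the belief is, which is exactly the structure the thread goes on to dispute.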

What is the flaw in this argument?

For me, the flaw one should identify is that you should choose the beliefs that are most likely to be true, rather than those which maximize average utility. But this is a normative objection, not a logical flaw in the argument.

Comment author: Vladimir_Nesov 23 April 2010 08:37:40PM 3 points [-]

Normally, you should keep many competing beliefs, with associated levels of belief in each. The mindset of choosing the action with the best estimated expected utility doesn't apply: actions are mutually exclusive, while mutually contradictory beliefs can be maintained concurrently. Even when you consider which action to carry out, all promising candidates should be kept in mind until the moment of execution.

Comment author: mattnewport 23 April 2010 08:46:00PM 1 point [-]

This is complicated in the case of religious beliefs where the deity will judge you by your beliefs and not just your actions.

Comment author: RobinZ 23 April 2010 08:49:34PM 1 point [-]

It is also complicated in the case of religious beliefs where other human beings will judge you by your beliefs, which is one reason why abandoning religions is so hard. But that is off-topic, particularly as you can just lie.

Comment author: mattnewport 23 April 2010 09:00:43PM -1 points [-]

While we're being off topic: I'm of the opinion that if you accept that you should one-box, then you should also accept Pascal's wager. I think both are wrong, but most people here seem to accept that one-boxing is correct while rejecting Pascal's wager. I don't care enough about either to work the argument out in detail, though.

Comment author: SilasBarta 23 April 2010 10:06:08PM *  4 points [-]

Newcomb's problem is just a case of making decisions when someone else, who "knows you very well" has already made a decision based on expectation of your decision. There are numerous real-world examples of this. Newcomb's problem only differs in that it takes the limit of the "how well they know you" variable as it approaches "perfect". There needn't be an actual Omega, just a decision theory that is robust for all values of the variable up to and including perfect.
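That limit-taking can be made concrete by writing the expected payoffs as a function of predictor accuracy p. This sketch assumes the standard payoffs from the usual statement of Newcomb's problem ($1,000,000 in the opaque box if one-boxing was predicted, $1,000 in the transparent box), which are not spelled out in this thread:

```python
# Expected utility of one-boxing vs. two-boxing as a function of
# predictor accuracy p -- the "how well they know you" variable.

def eu_one_box(p):
    # With probability p the predictor foresaw one-boxing,
    # so the opaque box contains $1,000,000.
    return p * 1_000_000

def eu_two_box(p):
    # With probability p the predictor foresaw two-boxing,
    # so the opaque box is empty and only $1,000 is gained.
    return p * 1_000 + (1 - p) * (1_000_000 + 1_000)

for p in (0.5, 0.6, 0.99):
    better = "one-box" if eu_one_box(p) > eu_two_box(p) else "two-box"
    print(f"p={p}: {better}")
```

Under these numbers one-boxing pulls ahead as soon as p exceeds about 0.5005, so a decision theory that one-boxes is robust across nearly the whole range of the variable, not just at the "perfect Omega" endpoint.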

Comment author: mattnewport 23 April 2010 10:28:50PM 0 points [-]

Newcomb's problem is just a case of making decisions when someone else, who "knows you very well" has already made a decision based on expectation of your decision.

Which sounds a lot like Pascal's wager to me, when your decision is whether to believe in god and god is the person who "knows you very well" and is deciding whether to let you into heaven based on whether you believe in him or not.

There are situations which I guess are what you would describe as 'Newcomb-like' where I would do the equivalent of one-boxing. If Omega shows up this evening though I will be taking both his boxes, because there is too big an epistemic gap for me to cross to reach the point of thinking that one-boxing is sensible in this universe.

Comment author: RobinZ 23 April 2010 10:48:03PM *  0 points [-]

But the plausibility of a hypothetical is unrelated to the correct resolution of the hypothetical. One could equally say that two-boxing implies that you should push the man off the bridge in the trolley problem - the latter is just as unphysical as Newcomb. The proper objection to unreasonable hypotheticals is to claim that they do not resemble the real-world situations one might compare them to in the relevant aspects.

Comment author: mattnewport 23 April 2010 10:54:42PM 0 points [-]

I actually think that implausible hypotheticals are unhelpful and probably actively harmful which is why I usually don't involve myself in discussions about Omega. I wish I'd stuck with that policy now.

Comment author: RobinZ 23 April 2010 09:13:27PM 0 points [-]

I think you're mistaken, therefore I would like to see your proof. It would be a shame if I missed an opportunity to be more correct. ;)

Comment author: mattnewport 23 April 2010 09:52:05PM 2 points [-]

They both have an element of privileging the hypothesis. If I had some reason to think I lived in a universe with an Omega/God then I might agree I should one-box/believe in god but since I don't have any reason to think I live in such a universe why am I wasting my time even considering this particular implausible scenario?

Comment author: RobinZ 23 April 2010 10:00:12PM *  1 point [-]

I see what you mean, but the symmetry runs into one of two problems.

First, the most annoying form of Pascal's Wager is the epistemological version: "Believing that God exists has positive expected utility, so you should do so". This argument fails logically, for reasons SilasBarta listed, and it is usually this form being refuted when people say, "Pascal's Wager fails".

Second, the form of Pascal's Wager concerning worship, "Believing in God, who is known to exist, has positive utility", has moral complexities which are absent from Newcomb's dilemma. Objections in this case usually arise from the normative argument that you should not believe things which are false.

Comment author: byrnema 23 April 2010 10:09:40PM *  0 points [-]

First, the most annoying form of Pascal's Wager is the epistemological version: "Believing that God exists has positive expected utility, so you should do so". This argument fails logically, for reasons SilasBarta listed, and it is usually this form being refuted when people say, "Pascal's Wager fails".

I disagree that it fails logically. The argument, written as a modus ponens, is:

"If believing in God has positive expected utility, then you should do so."

If you don't believe that believing in God has positive expected utility, then this is not a disagreement with the logic of Pascal's Wager. Pascal's Wager would equally say, "If believing in God has negative expected utility, then you should not do so."

Comment author: byrnema 23 April 2010 08:51:31PM *  0 points [-]

Good point; I edited my form of the argument to include 'sets of beliefs'. If holding a set of beliefs maximizes your utility, then holding the set is what you "should" do, I think, in the spirit of the argument.

Comment author: RobinZ 23 April 2010 08:27:49PM 0 points [-]