Logos01 comments on Practicing what you preach - Less Wrong

Post author: TwistingFingers 23 October 2011 06:12PM


Comments (294)


Comment author: Logos01 26 October 2011 07:36:24AM 0 points

Logos01, we seem to be using different definitions of "rational behavior". So far I can't tell if this stems from a political dispute, or a factual disagreement, or just an argument that took on a life of its own.

We are, and I noted this at the outset when I distinguished "rational behavior" from "instrumentally rational behavior".

Please try to state your first claim (from the great-grandparent comment) without using the word "rational" or "reason" or any synonym thereof.

That request is definitionally impossible, since the topic at hand was precisely "what is rational behavior".

For my own position: if you choose not to do what works or "wins", then complaining about how you conformed to the right principles will accomplish nothing (except in cases where complaining does help). It will not change the outcome, nor will it increase your knowledge.

No contest.

In my case, what I was getting at is that one can construct a counterfactual scenario in which doing what "loses" is rationally necessary. For this to occur, the processes of rationality available to Louis-the-Loser would have to operate on a set of goals and then conclude that violating all of them is required.

Let's assume that Louis-the-Loser has a supergoal of giving all humans more happiness over time. Louis then comes to the factual conclusion (with total certainty) that humans are dead set on reducing their happiness to negative levels -- permanently; and further, that they have the capacity to do so. Perhaps you now think that Louis's sole remaining rational decision is to fail his supergoal: to kill all humans. That would yield maximal happiness for humans, since it prevents negative happiness, yes? (And would therefore "win".)

But there's a second answer to this question, and it is equally rationally viable: give the humans what they want. Allow them to constantly and continuously maximize their unhappiness; perhaps even facilitate them in that endeavor. Why is this a reasonable thing to do? Because even total certainty can be wrong, and even P=1 statements can be revised.

However, it does require Louis to actually lose.

Comment author: hairyfigment 26 October 2011 07:49:19AM 3 points

To focus on one point that seems straightforward:

even P=1 statements can be revised.

My first response was: no, they can't. I'll change that to, "how, exactly?"

Comment author: wedrifid 26 October 2011 09:07:57AM 0 points

I suppose the verbal utterance of "P=1" can be revised. But a belief system that actually contains a P=1 cannot revise it without external hacking (or bad updating).
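The updating point above can be sketched numerically. Here is a minimal, illustrative Bayes-rule calculation for a binary hypothesis (the function name `bayes_update` and the example numbers are my own assumptions, not from the thread): a prior of exactly 1 is a fixed point that no evidence, however strong, can move.

```python
# Minimal sketch of binary-hypothesis Bayesian updating.
# The function name and example numbers are illustrative assumptions.

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return posterior P(H | E) for a binary hypothesis H given evidence E."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    if denominator == 0:
        # The belief system assigned the observed evidence probability 0;
        # Bayes' rule is undefined here (the "external hacking" case).
        raise ZeroDivisionError("observed evidence was assigned probability 0")
    return numerator / denominator

# An ordinary prior moves under strong counter-evidence:
print(bayes_update(0.9, 0.01, 0.99))  # ~0.083

# A prior of exactly 1 does not move at all, whatever the evidence:
print(bayes_update(1.0, 0.01, 0.99))  # 1.0
```

The same holds for a prior of exactly 0: within standard updating, P=1 and P=0 are irreversible by evidence alone.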

Comment author: Logos01 26 October 2011 08:10:47AM -1 points

I'll change that to, "how, exactly?"

By being wrong when you made said statement. Or by a fundamental shift in reality.

Comment author: dbaupp 26 October 2011 08:38:28AM 0 points

I don't think you understand what P(X)=1 means. It doesn't just mean that X is going to happen if the laws of the universe remain the same; it doesn't even just mean that X is going to happen if 3^^^3 coins are flipped and at least one lands on heads.

It means that X happens in every possible version of our universe from this point onward. Including ones where the universe is a simulation that explicitly disallows X.

(The only time P(X) = 1 makes sense is in mathematics, e.g. P(two random lines in 2D space are not parallel) = 1)
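The parenthetical example above can be illustrated with a quick simulation, under an assumed model (my own simplification: each line is identified by a slope drawn uniformly from a continuous interval, vertical lines ignored). "Parallel" then requires exactly equal slopes, an event of probability 0, so "not parallel" has probability 1 even though it is not logically guaranteed.

```python
import random

random.seed(0)  # deterministic for reproducibility

trials = 100_000
# Model (an illustrative assumption): each line is identified by a slope
# drawn uniformly from a continuous interval; two lines are parallel
# iff their slopes are exactly equal.
parallel = sum(
    1
    for _ in range(trials)
    if random.uniform(-1e6, 1e6) == random.uniform(-1e6, 1e6)
)
# Exact equality of two continuous draws has probability ~0,
# so the count is essentially always 0.
print(parallel)
```

In measure-theoretic terms this is an "almost sure" event: the complement (exactly equal slopes) is possible but has probability 0.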

Comment author: Logos01 26 October 2011 08:42:45AM -2 points

I don't think you understand what P(X)=1 means. [...]

Ergo, for P(X)=1 to be revised requires the person making that assertion be wrong, or for there to be a fundamental shift in reality.

Comment author: dbaupp 26 October 2011 09:13:31AM 1 point

Yeah, the person making the assertion can be wrong.

for there to be a fundamental shift in reality.

Huh? Did you read what I wrote:

It doesn't just mean X is going to happen if the laws of the universe remain the same [...] It means that X happens in every possible version of our universe from this point onward

Every. Possible. Universe. This accounts for "fundamental shift[s] in reality".

Comment author: Logos01 26 October 2011 09:17:38AM -2 points

Huh? Did you read what I wrote:

Yup, I most assuredly did.

Every. Possible. Universe. This accounts for "fundamental shift[s] in reality".

Save for those in which the principle you related is altered. Don't try to wrap your head around it. It's a paradox.

Comment author: dbaupp 26 October 2011 09:39:59AM 0 points

Which principle?

Comment author: Logos01 26 October 2011 09:42:15AM 0 points

Which principle?

"[P(X)=1] doesn't just mean that X is going to happen if the laws of the universe remain the same; it doesn't even just mean that X is going to happen if 3^^^3 coins are flipped and at least one lands on heads.

It means that X happens in every possible version of our universe from this point onward. Including ones where the universe is a simulation that explicitly disallows X."

Comment author: dbaupp 26 October 2011 09:47:13AM 2 points

There is no paradox. Mathematics is independent of the physics of the universe in which it is being discussed; e.g., "the integers" satisfy the same properties as they do for us even if there are 20 spatial dimensions and 20 temporal ones.

Sure, you can change the axioms you start with, but then you are talking about different objects.