hairyfigment comments on Practicing what you preach - Less Wrong
For God's sake, people.
Logos01, we seem to be using different definitions of "rational behavior". So far I can't tell if this stems from a political dispute, or a factual disagreement, or just an argument that took on a life of its own.
Please try to state your first claim (from the great-grandparent comment) without using the word "rational" or "reason" or any synonym thereof. Perhaps then we can figure out what, if anything, we disagree on.
For my own position: if you choose not to do what works or "wins", then protesting that you conformed to the right principles will accomplish nothing (except in cases where the protest itself helps). It will not change the outcome, nor will it increase your knowledge.
We are, and I initially noted this when I parsed "rational behavior" from "instrumentally rational behavior".
This is a request that is definitionally impossible, since the topic at hand was "what is rational behavior".
No contest.
In my case, what I was getting at was the notion that it is possible to present a counterfactual scenario where doing what "loses" is rationally necessary. For this to occur, the rational processes available to Louis-the-Loser would need a set of goals, and would have to conclude that violating all of them is necessary.
Let's assume that Louis-the-Loser has a supergoal of giving all humans more happiness over time. Over time, Louis comes to the factual conclusion (with total certainty) that humans are dead set on reducing their happiness to negative levels -- permanently; and that further they have the capacity to do so. Perhaps you now think that Louis's sole remaining rational decision is to fail his supergoal: to kill all humans. And this would be maximal happiness for humans since he'd prevent negative happiness, yes? (And therefore, "win")
But there's a second answer to this question. And it is equally rationally viable. Give the humans what they want. Allow them to constantly and continuously maximize their unhappiness; perhaps even facilitate them in that endeavor. Now, why is this a reasonable thing to do? Because even total certainty can be wrong, and even P=1 statements can be revised.
However, it does require Louis to actually lose.
To focus on one point that seems straightforward:
My first response was, no they can't. I'll change that to, "how, exactly?"
I suppose the verbal utterance of "P=1" can be revised. Just not any belief system with a P=1 in it without external hacking (or bad updating).
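The point about a belief system with P=1 being frozen follows directly from Bayes' theorem. A minimal sketch (the numbers are hypothetical, chosen only to illustrate the mechanics):

```python
def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    """Posterior P(X | evidence) via Bayes' theorem.

    prior               -- P(X) before seeing the evidence
    likelihood_if_true  -- P(evidence | X)
    likelihood_if_false -- P(evidence | not-X)
    """
    numerator = prior * likelihood_if_true
    denominator = numerator + (1 - prior) * likelihood_if_false
    return numerator / denominator

# An agent that assigns P(X) = 1 cannot be moved by any evidence,
# even evidence a thousand times likelier if X is false: the
# (1 - prior) term is zero, so the posterior is always 1.
print(bayes_update(1.0, 0.001, 0.999))       # -> 1.0

# A merely near-certain agent (P = 0.999999) can still update;
# its posterior dips below 1 when the evidence cuts against X.
print(bayes_update(0.999999, 0.001, 0.999))
```

This is why revising a "P=1" belief requires something outside ordinary updating ("external hacking, or bad updating"): no conditioning step can move it.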
By being wrong when you made said statement. Or by a fundamental shift in reality.
I don't think you understand what P(X)=1 means. It doesn't just mean X is going to happen if the laws of the universe remain the same; it doesn't just mean X is going to happen if 3^^^3 coins are flipped and at least one lands on heads.
It means that X happens in every possible version of our universe from this point onward. Including ones where the universe is a simulation that explicitly disallows X.
(The only time P(X) = 1 makes sense is in mathematics, e.g. P(two random lines in 2D space are not parallel) = 1)
Ergo, for P(X)=1 to be revised requires the person making that assertion be wrong, or for there to be a fundamental shift in reality.
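The parenthetical parallel-lines example above can be made precise; this is the standard measure-theoretic observation (a clarifying aside, not part of the original exchange). If the two lines' slopes $m_1, m_2$ are drawn independently from any continuous distribution, then

```latex
P(\text{parallel}) \;=\; P(m_1 = m_2) \;=\; 0
\quad\Longrightarrow\quad
P(\text{not parallel}) \;=\; 1,
```

because the diagonal $\{(m, m)\}$ has measure zero in the plane of slope pairs. Note the subtlety: the event holds "almost surely" -- parallel outcomes are not logically impossible, they merely occupy zero probability mass. So even in mathematics, P(X)=1 is weaker than "X in every possible outcome."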
Yeah, the person making the assertion can be wrong.
Huh? Did you read what I wrote:
Every. Possible. Universe. This accounts for "fundamental shift[s] in reality".
Yup, I most assuredly did.
Saving for those in which the principle you related is altered. Don't try to wrap your head around it. It's a paradox.
Which principle?
"[P(X)=1] doesn't just mean X is going to happen if the laws of the universe remain the same, it doesn't just mean X is going to happen if 3^^^3 coins are flipped and at least one lands on heads.
It means that X happens in every possible version of our universe from this point onward. Including ones where the universe is a simulation that explicitly disallows X."