# thomblake comments on Rationality and Winning - Less Wrong

04 May 2012 06:31PM


Comment author: 04 May 2012 07:39:37PM 9 points

You didn't phrase it as though it were an example; you phrased it as a summary. Your comment states that Luke's point is about the Singularity, which was not mentioned in the post.

Comment author: 05 May 2012 08:31:44AM * -3 points

> You didn't phrase it as though it were an example; you phrased it as a summary.

Phew, I certainly didn't expect that. I thought it was completely obvious to everyone that the post does not talk about the Singularity, and that therefore my comment couldn't possibly be about the Singularity either.

Let's analyze my comment:

> 1a) Your post is basically saying that if you believe that a negative Singularity is likely and that a positive Singularity has lots of expected utility,...

Since his original post did not talk about the Singularity, it is instantly obvious that the above sentence can be read as:

> 1b) Your post is basically saying that if you hold belief X and that belief X is the right thing to do,...

> 2a) ...then if you work to achieve a positive Singularity you are rational (consistency) and therefore winning.

The end of that sentence makes it clear that I was actually talking about the original post by referring to the consistency of acting according to your beliefs. It could be read as:

> 2b) ...then if you act according to belief X you are rational (consistency) and therefore winning.

> 3a) And since nobody can disprove your claim that the Singularity is near, until the very end of the universe, you will be winning winning winning... without actually achieving anything ever.

That sentence shows how anyone could choose any belief about the future, frame it as an unprovable prediction, act accordingly, and yet fit the definition of rationality outlined in the original post. It could be read as:

> 3b) And since nobody can disprove belief X, you will be winning winning winning... without actually achieving anything ever.

Comment author: 06 May 2012 06:22:42AM 7 points

> I thought it was completely obvious to everyone that the post does not talk about the Singularity, and that therefore my comment couldn't possibly be about the Singularity either.

The problem is that you have a history of bringing Singularity issues into posts that are not about the Singularity. (Or at least, have a history of making comments that look like that.) Two examples that spring readily to mind are using a post about Leverage Research to critique SIAI and bringing in post-Singularity scenarios when commenting on a post about current-day issues. With such a history, it's not obvious that your comment couldn't have been about the Singularity.

Comment author: 05 May 2012 05:54:12PM * 2 points

You have succeeded in mixing a baseless personal accusation together with a difficult epistemic problem. The complexity of the problem makes it hard to pinpoint exactly what is inappropriate about the offense... but it is obviously there; readers see it and downvote accordingly.

The epistemic problem is basically this: feeling good is an important part of everyone's utility function. If a belief X makes one happy, shouldn't it be rational (as in: increasing expected utility) to believe it, even if it's false? Especially if the belief is unfalsifiable, so that the happiness it causes will never be countered by the sadness of falsification.

And then you pick Luke as an example, accusing him of doing exactly this (a kind of psychological self-wireheading). Since what Luke is doing is a group value here, you have added a generous dose of mindkilling to a question that is rather difficult even without it. And even apart from that, it's unnecessarily personally offensive.

The correct answer is along these lines: if Luke also has something else in his utility function, a false belief may prevent him from getting it. (He might wait for the Singularity to provide that thing, which would never happen; without the belief, he might have pursued the goal directly and achieved it.) If the expected utility of achieving those other goals is greater than the expected utility of feeling good by thinking false thoughts, then the false belief is a net loss, and one the belief itself prevents him from noticing and fixing. But this explanation can be countered by further epistemic problems, and so on.
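To make that comparison concrete, here is a minimal sketch of the tradeoff; the symbols are mine, not from the thread. Let $u_f$ be the utility of feeling good about the false belief, $u_g$ the utility of the other goal, and $p$ the probability of achieving that goal if it is pursued directly rather than deferred to a Singularity that never arrives. The net value of holding the false belief is then

$$\Delta U = u_f - p\,u_g,$$

and holding it is a net loss exactly when $p\,u_g > u_f$.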

For now, let me just state openly that I would prefer to discuss difficult epistemic problems in a thread without this kind of contribution. Maybe even on a website without this kind of contribution.

Comment author: 05 May 2012 08:09:28PM * 1 point

> it is instantly obvious

I would say the karmic reaction disagrees.