Vladimir_Nesov comments on Value Deathism - Less Wrong

Post author: Vladimir_Nesov 30 October 2010 06:20PM (26 points)




Comment author: Vladimir_Nesov 30 October 2010 06:40:55PM, 2 points

The decision not to try incorporates feasibility (and can well be correct, just as expecting defeat may well be correct), but the value judgment doesn't, and shouldn't be updated on that lack of feasibility. It's not OK to not take over the world.

I even think trying to achieve it would be counter-productive.

There is no value in trying.

Comment author: DSimon 30 October 2010 08:04:01PM, 2 points

I think that if I took over the world it might cause me to go Unfriendly; that is, there's a nontrivial chance that the values of a DSimon that rules the world would diverge from my current values sharply and somewhat quickly.

Basically, I just don't think I'm immune to corruption, so I don't personally want to rule the world. However, I do wish that the world had an effective ruler that shared my current values.

Comment author: Vladimir_Nesov 30 October 2010 08:08:30PM, 7 points

See this comment. The intended meaning is managing to get your values to successfully optimize the world, not for your fallible human mind to issue orders.

Your actions are pretty "Unfriendly" even now, to the extent they don't further your values because of poor knowledge of what you actually want and poor ability to form efficient plans.

Comment author: RobinHanson 30 October 2010 06:46:07PM, 2 points

I don't think you know what "OK" means.

Comment author: Vladimir_Nesov 30 October 2010 06:50:31PM, 0 points

Yes, that was some rhetorical applause-lighting on my part, with little care about whether you meant what my post seemed to assume you meant. I think the point is worth making (with the deathist interpretation of "OK"), even if it doesn't actually apply to your position or Ben's.

Comment author: wedrifid 30 October 2010 06:56:09PM, 1 point

"It's not OK to not take over the world."

Unless you know you're kind of a git or, more generally, your value system itself doesn't rate 'you taking over the world' highly. I agree with your position though.

It is interesting to note that Robin's comment is entirely valid when considered independently. The error he makes is in presenting it as a reply to your argument: "should" is not determined by "probably will".

Comment author: Vladimir_Nesov 30 October 2010 07:23:38PM, 3 points

"It's not OK to not take over the world."

"Unless you know you're kind of a git or, more generally, your value system itself doesn't rate 'you taking over the world' highly."

It's an instrumental goal; it doesn't have to be valuable in itself. If you don't want your "personal attitude" to apply to the world as a whole, that reflects the fact that your values disagree with your personal attitude, and you prefer the world to be controlled by your values rather than by your personal attitude.

Taking over the world as a human ruler is certainly not what I meant, and I expect is a bad idea with bad expected consequences (apart from independent reasons like being in a position to better manage existential risks).

Comment author: wedrifid 30 October 2010 07:42:57PM, 3 points

"It's an instrumental goal, it doesn't have to be valuable in itself."

The point being that it can be a terminal anti-goal. People could (and some of them probably do) value not-taking-over-the-world very highly. Similarly, there are people who actually do want to die after the normally allotted years, completely independently of sour-grapes updating. I think they are silly, but it is their values that matter to them, not my evaluation thereof.

Comment author: Vladimir_Nesov 30 October 2010 07:52:12PM, 3 points

"People could (and some of them probably do) value not-taking-over-the-world very highly."

This is a statement about valuation of states of the world, a valuation that is best satisfied by some form of taking over the world (probably much more subtle than what gets classified so by the valuation itself).

"I think they are silly, but it is their values that matter to them, not my evaluation thereof."

It's still your evaluation of their situation that says whether you should consider their opinion on the matter of their values, or know what they value better than they do. What is the epistemic content of your thinking they are silly?

Comment author: wedrifid 30 October 2010 07:53:47PM, 3 points

I do not agree.