
Kaj_Sotala comments on The curse of identity - Less Wrong

125 points. Post author: Kaj_Sotala, 17 November 2011 07:28PM


Comments (298)

You are viewing a single comment's thread.

Comment author: Kaj_Sotala 17 November 2011 02:51:20PM * 6 points

For one, status-seeking is a zero-sum game and only indirectly causes overall gains. The world would be a much better place if people actually cared about things like saving the world or even helping others, and put a little thought into it.

Also, mismatches between our consciously held goals and our behavior cause plenty of frustration and unhappiness, as in the case of the person who keeps stressing out because their studies aren't progressing.

Comment author: Vaniver 17 November 2011 07:08:16PM 1 point

> For one, status-seeking is a zero-sum game and only indirectly causes overall gains. The world would be a much better place if people actually cared about things like saving the world or even helping others, and put a little thought into it.

If I actually cared about saving the world and about conserving my resources, it seems like I would choose some rate of world-saving, A.

If I actually cared about saving the world, about conserving my resources, and about the opinion of my peers, it seems like I would choose some rate of world-saving, B. For reasonable scenarios, B would be greater than A, because I can also earn respect from my peers: when demand rises while supply stays constant, the quantity supplied increases.

That is, I understand that status-seeking causes faking behavior that's a drain. (Status conflicts also lower supply, but it's not clear by how much.) I don't think it's clear that the mechanism of status-seeking conflicts with actually caring about other goals, or that it detracts from them on net.

Comment author: Jonathan_Graehl 17 November 2011 06:50:27PM 0 points

I'm sure you've considered that "X is a zero-sum game" doesn't always mean you should unilaterally avoid playing that game entirely. It does mean you'll want to engineer environments where X taxes you at a lower rate.

Comment author: [deleted] 17 November 2011 03:35:14PM -2 points

> For one, status-seeking is a zero-sum game and only indirectly causes overall gains.

But if status-seeking is what you really want, as evidenced by your decisions, how can you say it's bad that you do it? Can't I just go and claim any goal you're not optimizing for as your "real" goal you "should" have? Alternatively, can't I claim that you only want us to drop status-seeking to get rid of the competition? Where's your explanatory power?

Comment author: Kaj_Sotala 17 November 2011 04:04:44PM * 2 points

> But if status-seeking is what you really want, as evidenced by your decisions, how can you say it's bad that you do it?

By the suffering it causes, and also by the fact that whenever I've realized I'm doing it, I've stopped doing (that particular form of) it.

Comment author: XiXiDu 17 November 2011 03:08:10PM * -1 points

> For one, the world would be a much better place if people actually cared about things like saving the world or even helping others, and put a little thought into it.

Why do you want to save the world? To allow people to keep doing what they like for much longer than they would otherwise be able to. Status-seeking is one of the things people are especially fond of.

Ask yourself: would you have written this post after a positive Singularity? Would it matter then if some people were engaged in status games all day long?

What you are really trying to tell people is that they should want to help solve Friendly AI, because it is universally instrumentally useful.

If you want to argue that status-seeking is bad no matter what, under any circumstances, you have to explain why that is so. And if you are unable to ground utility in something physically measurable, like the maximization of certain brain states, then I don't think you can convincingly demonstrate that it is a relatively undesirable human activity.

Comment author: Kaj_Sotala 17 November 2011 04:07:57PM * 3 points

Umm. Sure, status-seeking may be fine once we have solved all possible problems anyway and we're living in a perfect utopia. But that's not very relevant if we want to discuss the world as it is today.

Comment author: XiXiDu 17 November 2011 04:59:43PM -2 points

> But that's not very relevant if we want to discuss the world as it is today.

It is very relevant, because the reason we want to solve Friendly AI in the first place is to protect the complex values given to us by the Blind Idiot God.

Comment author: Kaj_Sotala 17 November 2011 05:20:17PM 0 points

If we're talking about Friendly AI design, sure. I wasn't.