Oscar_Cunningham comments on Human values differ as much as values can differ - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (205)
I agree. I can almost hear Eliezer saying (correctly) that it's daft to try to tell the FAI what to do; you just give it its values and let it rip. (If you knew what to do, you wouldn't need the AI.) All this post has brought up is a problem that an AI might potentially have to solve. And sure, it looks like a difficult problem, but it doesn't feel like one that can't be solved at all. I can think of several rubbish ways to make a bunch of humans think that they all have high status; a brain the size of a planet would think of an excellent one.
Isn't that what society currently does? Come up with numerous ways to blur and obscure the reality of where exactly you fall in the ranking, yet let you plausibly believe you're higher than you really are?
Isn't it that we each care about a particular status hierarchy? The WOW gamer doesn't care about the status hierarchy defined by physical strength and good looks. It's all about his 10 level 80 characters with maxed-out gear, and his awesome computer with an Intel Core i7 975 Quad-Core 3.33GHz CPU, 12GB of tri-channel DDR3, dual SLIed GeForce GTX 260 graphics cards, two 1TB hard drives, Blu-ray, and liquid cooling.
This issue came up on crookedtimber.org before in reply to a claim by Will Wilkinson that free market societies decrease conflict by having numerous different hierarchies so that everyone can be near the top in one of them. (Someone google-fu this?)
The CT.org people replied that these different hierarchies actually exist within a meta-hierarchy that flattens it all out and retains a universal ranking for everyone, dashing the hopes that everyone can have high status. The #1 WOW player, in other words, is still below the #100 tennis player.
Despite the ideological distance I have from them, I have to side with the CT.org folks on this one :-/
ETA: Holy Shi-ite! That discussion was from October '06! Should I be worried or encouraged by the fact that I can remember things like this from so long ago?
The Crooked Timber post is here. On first glance it seems like a matter of degree: to the extent that there is such a universal ranking, it only fully defeats Wilkinson's point if the universal ranking and its consequences are the only ranking anyone cares about. As long as different people care differently about (the consequences of) different rankings, which it seems to me is often the case, everyone can rise in their favorite ranking and benefit more than others are harmed.
ETA: though maybe the more hierarchies there are, the less good it feels to be #100 on any of them.
Okay, to substantiate my position (per a few requests), I dispute that you can actually achieve the state where people only care about a few particular hierarchies, or even that people have significant choice in which hierarchies they care about. We're hardwired to care about status; this drive is not "up for grabs", and if you could turn off your caring for part of the status ranking, why couldn't you turn it all off?
Furthermore, I'm highly skeptical that e.g. the WOW superstar is actually fully content to remain in the position that being #1 in WOW affords him; rather, he's doing the best he can given his abilities, and this narrow focus on WOW is a kind of resignation. In a way I can kind of relate: in high school, I used to dominate German competitions and classes involving math or science. While that was great, it just shifted my attention to the orchestra classes and math/debate competitions that I couldn't dominate.
Now, you can dull the social influence on yourself that makes you care about status by staying away from the things that will make you compare yourself to the broader (e.g. non-WoW) society, but this is a devil's bargain: it has the same kind of effect on you as solitary confinement, just of a lesser magnitude. (And I can relate there too, if anyone's interested.)
I think the WOW superstar would, if he could, trade his position for one comparable to the #100 tennis player's in a heartbeat. And how many mistresses does the #1 spot in WoW convert to?
I don't know about in-game WoW superstars, but I knew an admin of an "unofficial" Russian server of a major AAA MMORPG, and he said that basically all female players of that server he met in real life wanted to go to bed with him. This might have been an exaggeration, but I can confirm at least one date. BTW, I wouldn't rate the guy as attractive.
In the 1990s I happened upon a game of Vampire (a live-action role-playing game) being played outdoors at night on the campus of UC Berkeley. After the game, I happened to be sitting around at Durant Food Court (a cluster of restaurants near campus) when I overheard one of the female players throw herself at one of the organizers: "How many experience points would I need to go to bed with you?" she asked playfully. (The organizer threw me a juicy grin on the side a few moments later, which I took as confirmation that the offer was genuine.)
I am guessing that in the environment of evolutionary adaptation, political success and political advantage consisted largely of things very much like being able to get a dozen people to spend an evening in some organized activity that you run.
ADDED. Now that I have had time to reflect, what she probably said is, "how many experience points do I get for . . .", which is a wittier come-on than the one I originally wrote and which jibes with the fact that one of the organizer's jobs during the game is to award experience points to players.
Interesting; I guess I underestimated the position of unofficial Russian WoW server admins in the meta-hierarchy -- in part because I didn't expect as many desirable Russian women to play WoW.
If the server population is a couple thousand players, and there are 5% of females among them, that leaves you with about 100 females, 10 of which will likely be attractive to you -- and if you run a dozen servers or so, that's definitely not a bad deal if you ask me :)
Take a less extreme version of the position you are arguing against: the WOWer cares about more than the WOW hierarchy, but the meta-hierarchy he sets up is still slightly different from the meta-hierarchy that the 100th best tennis player sets up. The tennis player would rank (1st in tennis, 2nd in WOW) higher than (2nd in tennis, 1st in WOW), but the WOWer would flip the ranking. Do you find this scenario all that implausible?
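The flipped ordering above can be made concrete with a small sketch. This is my own illustration, not anything from the thread: the names and weights are hypothetical, and I'm assuming each person scores an outcome as a weighted sum of their rank numbers in each hierarchy (lower is better), with the weights reflecting how much they care about each hierarchy.

```python
def meta_score(ranks, weights):
    """Weighted penalty over rank numbers; a lower score means a preferred outcome."""
    return sum(w * r for w, r in zip(weights, ranks))

# Outcomes as (tennis_rank, wow_rank) pairs, as in the comment above.
outcomes = {
    "A": (1, 2),  # 1st in tennis, 2nd in WOW
    "B": (2, 1),  # 2nd in tennis, 1st in WOW
}

tennis_player = (0.6, 0.4)  # weights tennis slightly more
wow_player = (0.4, 0.6)     # weights WOW slightly more

for name, weights in [("tennis player", tennis_player), ("WOW player", wow_player)]:
    best = min(outcomes, key=lambda k: meta_score(outcomes[k], weights))
    print(name, "prefers outcome", best)
```

Even a small difference in weights is enough to reverse the preference between the two outcomes, which is all the "less extreme version" needs.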
It's plausible, but irrelevant. The appropriate comparison is how the WoWer would regard a position near the #100 tennis player's in the meta-hierarchy.
If he doesn't yearn for a high ranking in tennis, it's because of the particulars of tennis, not out of a lack of interest in a higher ranking in the meta-hierarchy.
Well, it's not relevant if the WOWer would still rather be the 100th best tennis player and suck at WOW than his current position - which is plausible, but there are probably situations where this sort of preference does matter.
He's certainly interested in the meta-hierarchy, but why can't he value the status gained from WOW slightly higher than the status gained from tennis, irrespective of how much he likes tennis and WOW in themselves?
Yes, I get that someone might plausibly not care about tennis per se. That's irrelevant. What's relevant is whether he'd trade his current position for one with a meta-hierarchy position near the #100 tennis player -- not necessarily involving tennis! -- while also being something he has some interest in anyway.
What I dispute is that people can genuinely not care about moving up in the meta-hierarchy, since it's so hardwired. You can achieve some level of contentedness, sure, but not total satisfaction. The characterization steven gave of the #1 WoW player's state of mind is not realistic.
I'm interested. How can you relate? What was your situation?
Without going into too many personal details (PM or email me if you're interested in that), for a while I lived a lifestyle where my in-person socialization was limited, as were most of my links to the broader society (e.g. no TV), though I made a lot of money (at least relative to the surrounding community).
I also found myself frequently sad, which was very strange, as I felt all of my needs and wants were being met. It was only after a long time that I noticed the correlation between "being around other people" and "not being sad" -- and I'm an introvert!
Here is the article you are looking for
er... why?
ETA: my counterpoint would be essentially what steven said, but you didn't seem to give an argument.
See my reply to steven.
Create a lot of human-seeming robots = Give everyone a volcano = Fool the humans = Build the Matrix.
To quote myself:
In other words, it isn't valid to analyze the sensations that people get when their higher status is affirmed by others, and then recreate those sensations directly in everyone, without anyone needing to have low status. If you did that, I can think of only 3 possible interpretations of what you would have done, and I find none of them acceptable:
Consciousness is not dependent on computational structure (this leads to vitalism); or
You have changed the computational structure their behaviors and values are part of, and therefore changed their conscious experience and their values; or
You have embedded them each within their own Matrix, in which they perceive themselves as performing isomorphic computations.
I agree that these are all rubbish ideas, which is why we let the AI solve the problem. Because it's smarter than us. If this post were about how we should make the world a better place on our own, then these issues are indeed a (small) problem, but since it was framed in terms of FAI, it's asking the wrong questions.
You're missing the main point of the post. Note the bullet points are ranked in order of increasing importance. See the last bullet point.
BTW, how do you let the AI solve the problem of what kind of AI to build?
What kind of AI to be. That's the essence of being a computationally complex algorithm, and a decision-making algorithm in particular: you always learn something new about what you should do, and what you'll actually do, and not just learn it, but make it so.
...or more likely, this won't be a natural problem-category to consider at all.