MrMind comments on An attempt in layman's language to explain the metaethics sequence in a single post. - Less Wrong

1 Post author: Bound_up 12 October 2016 01:57PM

Comments (61)


Comment author: MrMind 17 October 2016 01:48:03PM *  0 points [-]

To be honest, Eliezer made a slightly different argument:
1) humans share (because of evolution) a psychological unity that is not affected by regional or temporal distinctions;
2) this unity entails a set of values that is inescapable for every human being, and whose collective effect on human cognition and action we dub "morality";
3) Clippy, Elves and Pebblesorters, being fundamentally different, share a different set of values that guide their actions and what they care about;
4) those values are perfectly coherent and sound for those who entertain them, but we should nonetheless not call them "Clippy's, Elves' or Pebblesorters' morality", because words should be used in such a way as to maximize their usefulness in carving reality: since we cannot step outside our programming and conceivably find ourselves motivated by eggnog or primality, we should not use the term "morality" for them, and should instead use "primality" or other words.
That's it: you can debate any single point, but I think the difference is only formal. The underlying understanding, that "motivating set of values" is a two-place predicate, is the same; Yudkowsky simply preferred to use different words for different partially applied predicates, on the grounds of points 1 and 4.

Comment author: TheAncientGeek 19 October 2016 04:48:39PM *  1 point [-]

those values are perfectly coherent and sound for those who entertain them, but we should nonetheless not call them "Clippy's, Elves' or Pebblesorters' morality", because words should be used in such a way as to maximize their usefulness in carving reality: since we cannot step outside our programming and conceivably find ourselves motivated by eggnog or primality, we should not use the term "morality" for them, and should instead use "primality" or other words.

So my car is a car because it motor-vates me around, but your car is no car at all, because it motor-vates you around, but not me. And yo mama ain't no Mama cause she ain't my Mama!

Yudkowsky isn't being rigorous; he is instead appealing to an imaginary rule, one that is not seen in any other case.

And it's not like the issue isn't important, either... obviously the permissibility of imposing one's values on others depends on whether they are immoral, amoral, differently moral, etc. Differently moral is still a possibility, for the same reason that you are differently mothered, not unmothered.

Comment author: MrMind 20 October 2016 07:53:42AM 0 points [-]

So my car is a car because it motor-vates me around, but your car is no car at all, because it motor-vates you around, but not me.

The difference is not between two cars, yours and mine, but between a passenger ship and a cargo ship, built for two different purposes and two different classes of users.

Yudkowsky isn't being rigorous; he is instead appealing to an imaginary rule, one that is not seen in any other case.

On this we surely agree; I just find the new rule better than the old one. But this is the least important part of the whole discussion.

obviously the permissibility of imposing one's values on others depends on whether they are immoral, amoral, differently moral, etc. Differently moral is still a possibility, for the same reason that you are differently mothered, not unmothered.

This is well explored in "Three Worlds Collide". Yudkowsky's vision of morality is such that it assigns different moralities to different aliens, and the same morality to the same species (I'm using your convention). When different worlds collide, it is moral for us to stop the Babyeaters from eating babies, and it is moral for the Superhappies to happify us. I think Eliezer is correct in showing that the only solution is avoiding contact altogether.

Comment author: TheAncientGeek 20 October 2016 01:20:04PM *  1 point [-]

The difference is not between two cars, yours and mine, but between a passenger ship and a cargo ship, built for two different purposes and two different classes of users.

That seems different to what you were saying before.

This is well explored in "Three Worlds Collide". Yudkowsky's vision of morality is such that it assigns different moralities to different aliens, and the same morality to the same species (I'm using your convention). When different worlds collide, it is moral for us to stop the Babyeaters from eating babies, and it is moral for the Superhappies to happify us. I think Eliezer is correct in showing that the only solution is avoiding contact altogether.

There's not much objectivity in that.

Why is it so important that our morality is the one that motivates us? People keep repeating it as though it's a great revelation, but it's equally true that Babyeater morality motivates Babyeaters, so the situation comes out looking symmetrical and therefore relativistic.

Comment author: entirelyuseless 17 October 2016 02:10:18PM 0 points [-]

"words should be used in such a way to maximize their usefulness in carving reality"

That does not mean that we should not use general words, but that we should have both general words and specific words. That is why it is right to speak of morality in general, and human morality in particular.

As I stated in other replies, it is not true that this disagreement is only about words. In general, when people disagree about how words should be used, that is because they disagree about what should be done; when you use words differently, you are likely to end up doing different things. And I gave concrete examples of where I disagree with Eliezer about what should be done, in ways that correspond to how I disagree with him about morality.

In general I would describe the disagreement in the following way, although I agree that he would not accept this characterization: Eliezer believes that human values are intrinsically arbitrary. We just happen to value a certain set of things, and we might have happened to value some other random set. In whatever situation we found ourselves, we would have called those things "right," and that would have been a name for the concrete values we had.

In contrast, I think that we value the things that are good for us. What is "good for us" is not arbitrary, but an objective fact about relationships between human nature and the world. Now there might well be other rational creatures and they might value other things. That will be because other things are good for them.

Comment author: TheAncientGeek 23 October 2016 08:12:08AM 0 points [-]

But not everything people value is actually good for them. You are retaining the problem of equating morality with values.

Comment author: entirelyuseless 23 October 2016 03:15:19PM 1 point [-]

I agree that not everything in particular that people value is good for them. I say that everything that they value in a fundamental way is good for them. If you disagree, and think that some people value things that are bad for them in a fundamental way, how are they supposed to find out that those things are bad for them?

Comment author: hairyfigment 23 October 2016 12:18:57AM *  0 points [-]

I'm going to assume you mean what you say and are not just arguing about definitions. In that case:

You would be an apologist for HP Lovecraft's Azathoth, at best, if you lived in his universe. There's no objective criterion you could give to explain why that wouldn't be moral, unless you beg the question and bring in moral criteria to judge a possible 'ground of morality.' Yes, I'm saying Nyarlathotep should follow morality instead of the supposed dictates of his alien god. And that's not a contradiction but a tautology.

While I'm on the subject, Aquinian theology is an ugly vulgarization of Aristotle's, the latter being more naturally linked to HPL's Azathoth or the divine pirates of Pastafarianism.

Comment author: MrMind 18 October 2016 07:26:22AM *  0 points [-]

That is why it is right to speak of morality in general, and human morality in particular.

I prefer Eliezer's way because it makes evident, when talking to someone who hasn't read the Sequence, that there are different sets of self-consistent values, but it's an agreement that people should reach before starting to debate, and I personally would have no problem talking about different moralities.

Eliezer believes that human values are intrinsically arbitrary

But does he? Because that would be demonstrably false. Maybe arbitrary in the sense of "occupying a tiny space in the whole set of all possible values", but since our morality is shaped by evolution, it will surely contain some historical accidents but also a lot of useful heuristics.
No human can value drinking poison, for example.

What is "good for us" is not arbitrary, but an objective fact about relationships between human nature and the world

If you were to unpack "good", would you include other meanings besides "what helps our survival"?

Comment author: entirelyuseless 19 October 2016 03:20:05AM 0 points [-]

"There are different sets of self-consistent values." This is true, but I do not agree that all logically possible sets of self-consistent values represent moralities. For example, it would be logically possible for an animal to value nothing but killing itself; but this does not represent a morality, because such an animal cannot exist in reality in a stable manner. It cannot come into existence in a natural way (namely by evolution) at all, even if you might be able to produce one artificially. If you do produce one artificially, it will just kill itself and then it will not exist.

This is part of what I was saying about how, when people use words differently, they hope to accomplish different things. I speak of morality in general not to mean "logically consistent set of values", but a set that could reasonably exist in the real world in a real intelligent being. In other words, restricting morality to human values is an indirect way of promoting the position that human values are arbitrary.

As I said, I don't think Eliezer would accept that characterization of his position, and you give one reason why he would not. But he has a more general view where only some sets of values are possible for merely accidental reasons, namely because it just happens that things cannot evolve in other ways. I would say the contrary -- it is not an accident that the value of killing yourself cannot evolve, but this is because killing yourself is bad.

And this kind of explains how "good" has to be unpacked. Good would be what tends to cause tendencies towards itself. Survival is one example, but not the only one, even if everything else will at least have to be consistent with that value. So e.g. not only is survival valued by intelligent creatures in all realistic conditions, but so is knowledge. So knowledge and survival are both good for all intelligent creatures. But since different creatures will produce their knowledge and survival in different ways, different things will be good for them in relation to these ends.

Comment author: TheAncientGeek 20 October 2016 04:49:37AM *  2 points [-]

Good would be what tends to cause tendencies towards itself. Survival is one example

Any virulently self-reproducing meme would be another.

Comment author: entirelyuseless 20 October 2016 01:29:15PM -2 points [-]

This would be a long discussion, but there's some truth in that, and some falsehood.