Comment author: Bound_up
12 October 2016 06:02:16PM
0 points
Okay. By saying "If they have failed to grasp that morality is obligatory, have they understood it at all? They might continue caring more about eggnog, of course. That is beside the point... morality means what you should care about, not what you happen to do."
it seems you have not understood the idea. Were there any parts of the post that seemed unclear, which you think I might make clearer?
Because the whole point is that to say something is moral = you should do it = it is valued according to the morality equation.
For an Elf to agree something is moral is also to agree that they should do it. When I say they agree it's moral and don't care, that also means they agree they should do it and don't care.
Something being Christmas Spiritey = you Spiritould do it. Humans might agree that something is Christmas Spirit-ey, and agree that they spiritould do it, they just don't care about what they spiritould do, they only care about what they should do.
moral is to Christmas spiritey what "should" is to (make up a word like) "spiritould"
Obligatory is just a kind of "should." Elves agree that some things are obligatory, and don't care, they care about what's ochristmastory.
Likewise, to say that today's morality equation is the "best" is to say that today's morality equation is the equation which is most like today's morality equation. Tautology.
Best = most good, and good = valued by the morality equation.
it seems you have not understood the idea. Were there any parts of the post that seemed unclear, which you think I might make clearer?
Almost everything. You explain morality by putting forward one theory. Under those circumstances, most people would expect to see some critique of other theories, and explanation of why your theory is the One True Theory. You don't do the first, and it is not clear that you are even trying to do the second.
Because the whole point is that to say something is moral = you should do it = it is valued according to the morality equation.
And to say that only humans have morality. But if there is something the Elves should do, then morality applies to them, contradicting that claim.
For an Elf to agree something is moral is also to agree that they should do it. When I say they agree it's moral and don't care, that also means they agree they should do it and don't care.
That doesn't help. For one thing, humans don't exactly want to be moral: their moral fibre has to be buttressed by various punishments and rewards. For another, "should" and "want to" are not synonyms, but "moral" and "what you should do" are. So if there is something the Elves should do, at that point you have established that morality applies to the Elves, and the fact that they don't want to do it is a side-issue. (And of course they could tweak their own motivations by constructing punishments and rewards.)
Something being Christmas Spiritey = you Spiritould do it. Humans might agree that something is Christmas Spirit-ey, and agree that they spiritould do it, they just don't care about what they spiritould do, they only care about what they should do.
OK. Now you seem to be saying, without quite making it explicit, that morality is by definition unique to humans, because the word "moral" just labels what motivates humans, in the way that "Earth" or "Terra" labels the planet where humans live. That claim isn't completely incomprehensible, it's just strange and arbitrary, and what is stranger still is the way you feel no need to defend it against alternative theories -- the main alternative being that morality is multiply instantiable, that other civilisations could have their own versions of it, in the way they could have their own versions of houses or money.
You state it as though it is obvious, yet it has gone unnoticed for thousands of years.
Suppose I were to announce that dark matter is angels' tears. Doesn't it need some expansion? That's how your claim reads; that's the outside view.
Obligatory is just a kind of "should." Elves agree that some things are obligatory, and don't care, they care about what's ochristmastory.
Obligatory is a kind of "should" that shouldn't be overridden by other considerations. (A failure to do what is obligatory is possible, of course, but it is important to remember that it is seen as a lapse, as something wrong, not a valid choice.) Yet the Elves are overriding it, casting doubt on whether they have actually understood the concept of "obligatory".
Likewise, to say that today's morality equation is the "best" is to say that today's morality equation is the equation which is most like today's morality equation. Tautology.
Since anyone can say that at any time, that breaks the meaning of "best", which is supposed to pick out something unique. That would be a reductio ad absurdum of your own theory.
Comment author: Bound_up
15 October 2016 12:47:01PM
-1 points
No, no, no...
Every possible creature, and every process of physics SHOULD do XYZ. But practically nothing is moved by that fact.
This sentence means: It is highly valued in the morality equation for XYZ to be the state of affairs, independently of who/what causes it to be so.
Likewise, everything Spiritould do ABC, but only Elves are moved by that fact.
These are objective equations which apply to everything. To say should, spiritould, clipperould, etc., is just to say about different things that they are valued by this equation or that one. It's an objective truth that they are valued by this equation or that one.
It's just that humans are not moved by almost any of the possible equations. They ARE moved by the morality equation.
Humans and Elves should AND spiritould do whatever. They are both equally obligated and ochristmasated. But one species finds one of those facts moving and not the other, and the other finds the other moving and not the one.
Perhaps now it is clear?
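The picture being pressed here can be caricatured in a few lines of Python (a toy illustration — the function names, keys, and numbers are all invented, not anything from the original post):

```python
# Toy model: each "shouldness equation" is just a function that scores
# states of affairs. Every equation yields a verdict on every state,
# no matter who computes it; that much is "objective" in this sketch.
def morality(state):           # the equation humans happen to be moved by
    return state.get("suffering_prevented", 0)

def christmas_spirit(state):   # the equation Elves happen to be moved by
    return state.get("eggnog_shared", 0)

EQUATIONS = {"should": morality, "spiritould": christmas_spirit}

# Being *moved* by an equation is a separate fact about each species.
MOVED_BY = {"human": "should", "elf": "spiritould"}

def verdicts(state):
    """Compute every equation's verdict on a given state of affairs."""
    return {name: eq(state) for name, eq in EQUATIONS.items()}

state = {"suffering_prevented": 3, "eggnog_shared": 7}
print(verdicts(state))   # {'should': 3, 'spiritould': 7}
print(MOVED_BY["elf"])   # spiritould
```

Both entries of `verdicts(state)` hold for everyone; `MOVED_BY` records which verdict each species actually cares about, which is the separation the comment is insisting on.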
It is not a clear expression of something that can be seen to work.
Version 1.
I am obligated to both do and not do any number of acts by any number of shouldness-equations.
If that is the case, anything resembling objectivism is out of the window. If I am obligated to do X, and I do X, then my action is right. If I am obligated not to do X, and I do X, my action is wrong. If I am both obligated and not obligated to do X, then my action is somehow both right and wrong; that is, it has no definite moral status.
But that's not quite what you were saying.
Version 2.
There are lots of different kinds of morality, but I am only obligated by human morality.
That would work, but it's not what you mean. You are explicitly embracing...
Version 3.
There are lots of different kinds of morality, but I am only motivated by human morality.
There's only one word of difference between that and version 2, which is the substitution of "motivated" for "obligated". As we saw under version 1, it's the existence of multiple conflicting obligations which stymies ethical objectivism. And motivation can't fix that problem, because it is a different thing from obligation. In fact it is orthogonal, because:
You can be motivated to do what you are not obligated to do.
You can be obligated to do what you are not motivated to do.
Or both.
Or neither.
Because of that, version 3 implies version 1, and has the same problem.
All of this is why Eliezer's morality sequence is wrong. Version 2 is basically right. The Baby-Eaters were not immoral but moral, according to different morals. That is not subjectivism, because it is an objective fact that Baby-Eaters are what they are, and are obligated by Baby-Eater morality, and humans are humans, and are obligated by human morality.
But Eliezer (and Bound_up) do not admit this, nonsensically asserting that non-humans should be obligated by human morality.
Comment author: MrMind
17 October 2016 01:48:03PM
0 points
To be honest, Eliezer made a slightly different argument:
1) humans share (because of evolution) a psychological unity that is not affected by regional or temporal distinctions;
2) this unity entails a set of values that is inescapable for every human being; the collective effect of these values on human cognition and action is what we dub "morality";
3) Clippy, Elves and Pebblesorters, being fundamentally different, share a different set of values that guide their actions and what they care about;
4) those are perfectly coherent and sound for those who entertain them; we should nevertheless not call them "Clippy's, Elves' or Pebblesorters' morality", because words should be used in such a way as to maximize their usefulness in carving reality: since we cannot step out of our programming and conceivably find ourselves motivated by eggnog or primality, we should not use the term "morality" and should instead use "primality" or other words.
That's it: you can debate any single point, but I think the difference is only formal. The underlying understanding, that "motivating set of values" is a two-place predicate, is the same; Yudkowsky simply preferred to use different words for different partially applied predicates, on the grounds of points 1 and 4.
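On that reading, the "only formal" difference can be shown with a partially applied two-place predicate (a toy sketch; the table and names are invented for illustration):

```python
from functools import partial

# Hypothetical table: which value systems move which agents.
AGENT_VALUES = {"human": {"morality"}, "clippy": {"paperclipness"}}

def motivates(value_system, agent):
    """Two-place predicate: does this value system move this agent?"""
    return value_system in AGENT_VALUES[agent]

# The naming convention at issue, on this reading: reserve a dedicated
# word for each partial application, so "moral" tracks only the human one.
moral_for = partial(motivates, "morality")

print(moral_for("human"))    # True
print(moral_for("clippy"))   # False
```

Whether to name the full predicate `motivates` or only its partial applications like `moral_for` is exactly the terminological choice the comment describes.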
those are perfectly coherent and sound for those who entertain them, we should though do not call them "Clippy's, Elves' or Pebblesorters' morality", because words should be used in such a way to maximize their usefulness in carving reality: since we cannot go out of our programming and conceivably find ourselves motivated by eggnog or primality, we should not use the term and instead use primality or other words.
So my car is a car because it motor-vates me, but your car is no car at all, because it motor-vates you around, but not me. And yo mama ain't no Mama cause she ain't my Mama!
Yudkowsky isn't being rigorous; he is instead appealing to an imaginary rule, one that is not seen in any other case.
And it's not like the issue isn't important, either: obviously the permissibility of imposing one's values on others depends on whether they are immoral, amoral, differently moral, etc. Differently moral is still a possibility, for the same reason that you are differently mothered, not unmothered.
Comment author: MrMind
20 October 2016 07:53:42AM
0 points
So my car is a car because it motor-vates me, but your car is no car at all, because it motor-vates you around, but not me.
The difference is not between two cars, yours and mine, but between a passenger ship and a cargo ship, built for two different purposes and two different classes of users.
Yudkowsky isn't being rigorous; he is instead appealing to an imaginary rule, one that is not seen in any other case.
On this we surely agree, I just find the new rule better than the old one. But this is the least important part of the whole discussion.
obviously the permissibility of imposing one's values on others depends on whether they are immoral, amoral, differently moral, etc. Differently moral is still a possibility, for the same reason that you are differently mothered, not unmothered.
This is well explored in "Three Worlds Collide". Yudkowsky's vision of morality is such that it assigns a different morality to different aliens, and the same morality to the same species (I'm using your convention). When different worlds collide, it is moral for us to stop the babyeaters from eating babies, and it is moral for the superhappies to happify us. I think Eliezer is correct in showing that the only solution is avoiding contact altogether.
The difference is not between two cars, yours and mine, but between a passenger ship and a cargo ship, built for two different purposes and two different classes of users.
That seems different to what you were saying before.
This is well explored in "Three Worlds Collide". Yudkowsky's vision of morality is such that it assigns a different morality to different aliens, and the same morality to the same species (I'm using your convention). When different worlds collide, it is moral for us to stop the babyeaters from eating babies, and it is moral for the superhappies to happify us. I think Eliezer is correct in showing that the only solution is avoiding contact altogether.
There's not much objectivity in that.
Why is it so important that our morality is the one that motivates us? People keep repeating it as though it's a great revelation, but it's equally true that babyeater morality motivates babyeaters, so the situation comes out looking symmetrical, and therefore relativistic.
"words should be used in such a way to maximize their usefulness in carving reality"
That does not mean that we should not use general words, but that we should have both general words and specific words. That is why it is right to speak of morality in general, and human morality in particular.
As I stated in other replies, it is not true that this disagreement is only about words. In general, when people disagree about how words should be used, that is because they disagree about what should be done. Because when you use words differently, you are likely to end up doing different things. And I gave concrete places where I disagree with Eliezer about what should be done, ways that correspond to how I disagree with him about morality.
In general I would describe the disagreement in the following way, although I agree that he would not accept this characterization: Eliezer believes that human values are intrinsically arbitrary. We just happen to value a certain set of things, and we might have happened to value some other random set. In whatever situation we found ourselves, we would have called those things "right," and that would have been a name for the concrete values we had.
In contrast, I think that we value the things that are good for us. What is "good for us" is not arbitrary, but an objective fact about relationships between human nature and the world. Now there might well be other rational creatures and they might value other things. That will be because other things are good for them.
Comment author: MrMind
18 October 2016 07:26:22AM
0 points
That is why it is right to speak of morality in general, and human morality in particular.
I prefer Eliezer's way because it makes evident, when talking to someone who hasn't read the Sequences, that there are different sets of self-consistent values; but it's an agreement that people should reach before starting to debate, and I personally would have no problem in talking about different moralities.
Eliezer believes that human values are intrinsically arbitrary
But does he? Because that would be demonstrably false. Maybe arbitrary in the sense of "occupying a tiny space in the whole set of all possible values", but since our morality is shaped by evolution, it will surely contain some historical accidents, but also a lot of useful heuristics.
No human can value drinking poison, for example.
What is "good for us" is not arbitrary, but an objective fact about relationships between human nature and the world
If you were to unpack "good", would you insert other meanings besides "what helps our survival"?
"There are different sets of self-consistent values." This is true, but I do not agree that all logically possible sets of self-consistent values represent moralities. For example, it would be logically possible for an animal to value nothing but killing itself; but this does not represent a morality, because such an animal cannot exist in reality in a stable manner. It cannot come into existence in a natural way (namely by evolution) at all, even if you might be able to produce one artificially. If you do produce one artificially, it will just kill itself and then it will not exist.
This is part of what I was saying about how when people use words differently they hope to accomplish different things. I speak of morality in general, not to mean "logically consistent set of values", but a set that could reasonably exist in the real world with a real intelligent being. In other words, restricting morality to human values is an indirect way of promoting the position that human values are arbitrary.
As I said, I don't think Eliezer would accept that characterization of his position, and you give one reason why he would not. But he has a more general view where only some sets of values are possible for merely accidental reasons, namely because it just happens that things cannot evolve in other ways. I would say the contrary -- it is not an accident that the value of killing yourself cannot evolve, but this is because killing yourself is bad.
And this kind of explains how "good" has to be unpacked. Good would be what tends to cause tendencies towards itself. Survival is one example, but not the only one, even if everything else will at least have to be consistent with that value. So e.g. not only is survival valued by intelligent creatures in all realistic conditions, but so is knowledge. So knowledge and survival are both good for all intelligent creatures. But since different creatures will produce their knowledge and survival in different ways, different things will be good for them in relation to these ends.
Comment author: Bound_up
16 October 2016 10:17:48PM
0 points
They eat innocent, sentient beings who suffer and are terrified because of it. That's wrong, no matter who does it.
It may not be un-baby-eater-ey, but it's wrong.
Likewise, not eating babies is un-baby-eater-ey, no matter who does it. It might not be wrong, but it is un-baby-eater-ey.
We have two species who agree on the physical effects of certain actions. One species likes the effects of the action, and the other doesn't. The difference between them is what they value.
"Right" just means "in harmony with this set of values." Baby-eater-ey means "in harmony with this other set of values."
There's no contradiction in saying that something can be in harmony with one set of values and not in harmony with another set of values. Hence, there's no contradiction in saying that eating babies is wrong, and is also baby-eater-ey. You can also note that the action is found compelling by one species and not compelling by another, and there is no contradiction in this, either.
What could "right" mean if we have "right according to these morals" AND "right according to these other, contradictory morals?"
I see one possibility: "right" is taken to mean " in harmony with any set of values." Which, of course, makes it meaningless. Do you see another possibility?
I disagree that it is wrong for them to do that. And this is not just a disagreement about words: I disagree that Eliezer's preferred outcome for the story is better than the other outcome.
"Right" is just another way of saying "good", or anyway "reasonably judged to be good." And good is the kind of thing which naturally results in desire. Note that I did not say it is "what is desired" any more than you want to say that someone values at a particular moment is necessarily right. I said it is what naturally results in desire. This definition is in fact very close to yours, except that I don't make the whole universe revolve around human beings by saying that nothing is good except what is good for humans. And since different kinds of things naturally result in desire for different kinds of beings (e.g. humans and babyeaters), those different things are right for different kinds of beings.
That does not make "right" or "good" meaningless. It makes it relative to something. And this is an obvious fact about the meaning of the words; to speak of good is to speak of what is good for someone. This is not subjectivism, since it is an objective fact that some things are good for humans, and other things are good for other things.
Nor does this mean that right means "in harmony with any set of values." It has to be in harmony with some real set of values, not an invented one, nor one that someone simply made up -- for the same reasons that you do not allow human morals to be simply invented by a random individual.
Returning to the larger point, as I said, this is not just a disagreement about words, but about what is good. People maintaining your theory (like Eliezer) hope to optimize the universe for human values. I have no such hope, and I think it is a perverse idea in the first place.
"Right" is just another way of saying "good", or anyway "reasonably judged to be good."
No, moral rightness and wrongness have implications about rule following and rule breaking, reward and punishment, that moral goodness and badness don't. Giving to charity is virtuous, but not giving to charity isn't wrong and doesn't deserve punishment.
Similarly, moral goodness and hedonic goodness are different.
I'm not sure what you're saying. I would describe giving to charity as morally good without implying that not giving is morally evil.
I agree that moral goodness is different from hedonic goodness (which I assume means pleasure), but I would describe that by saying that pleasure is good in a certain way, but may or may not be good all things considered, while moral goodness means what is good all things considered.
Comment author: Bound_up
19 October 2016 03:27:15AM
0 points
I think I get it.
You're saying that "right" just means "in harmony with any set of values held by sentient beings?"
So, baby-eating is right for baby-eaters, wrong for humans, and all either of those statements means is that they are/aren't consistent with the fundamental values of the two species?
That is most of it. But again, I insist that the disagreement is real. Because Eliezer would want to stomp out baby-eater values from the cosmos. I would not.
Comment author: Bound_up
16 October 2016 10:20:08PM
-1 points
If you are interested, I might recommend trying to write up what you think this idea is, and see if you find any holes in your understanding that way. I'm not sure how to make it any clearer right now, but, for what it's worth, you have my word that you have not understood the idea.
We are not disagreeing about something we both understand; you are disagreeing with a series of ideas you think I hold, and I am trying to explain the original idea in a way that you find understandable and, apparently, not yet succeeding.
If you are interested, I might recommend trying to write up what you think this idea is, and see if you find any holes in your understanding that way.
I believe I just did something like that. Of course, I attributed the holes to the theory not working. If you want me to attribute them to my not having understood you, you need to put forward a version that works.
Okay. By saying "If they have failed to grasp that morality is obligatory, have they understood it at all? They might continue caring more about eggnog, of course. That is beside the point... morality means what you should care about, not what you happen to do."
it seems you have not understood the idea. Were there any parts of the the post that seemed unclear that you think I might make clearer?
Because the whole point is that to say something is moral = you should do it = it is valued according to the morality equation.
For an Elf to agree something is moral is also to agree that they should do it. When I say they agree it's moral and don't care, that also means they agree they should do it and don't care.
Something being Christmas Spiritey = you Spiritould do it. Humans might agree that something is Christmas Spirit-ey, and agree that they spiritould do it, they just don't care about what they spiritould do, they only care about what they should do.
moral is to Christmas spiritey what "should" is to (make up a word like) "spiritould"
Obligatory is just a kind of "should." Elves agree that some things are obligatory, and don't care, they care about what's ochristmastory.
.
Likewise, to say that today's morality equation is the "best" is to say that today's morality equation is the equation which is most like today's morality equation. Tautology.
Best = most good, and good = valued by the morality equation.
Almost everything. You explain morality by putting forward one theory. Under those circumstances, most people would expect to see some critique of other theories, and explanation of why your theory is the One True Theory. You don't do the first, and it is not clear that you are even trying to do the second.
And to say that only humans have morality. But if there is something the Elves should do, then morality applies to them., contradicting that claim.
That doesn't help. For one thing, humans don't exactly want to be moral...their moral fibre has to be buttressed bty various punishments and rewards. For another "should" and "want to" are not synonyms..but "moral" and "what you should do" are. So if there is something the Elves should do, at that point you have established that morality applies to the Elves, and the fact that they don't want to do it is a side-issue. (And of course they could tweak their own motivations by constructing punishments and rewards).
OK. Now you seem to be saying..without quite making it quite explicit of course, ..that morality is by definition unique to humans, because the word "moral" just labels what motivates humans, in the way that "Earth" or "Terra" labels the planet where humans live. That claim isn't completely incomprehensible, it's just strange and arbitrary, and what is considerably strange is the way you feel no need to defend it against alternative theories -- the main alternative being that morality is multiply instantiable, that other civilisations could have their own versions. like they have their own versions , in the way they could have their own versions of houses or money.
You state it as though it is obvious, yet it has gone unnoticed for thousands of years.
Suppose I were to announce that dark matter is angels' tears. Doesn't it need some expansion? That's how your claim reads, that' the outside view.
Obligatory is a kind of "should" *that shouldn't be overridden by other considerations. (A failure to do what is obligatory is possible, of course, but it is important to remember that it is seen as a lapse, as something wrong, not a valid choice). Yet the Elves are overriding it, casting doubt on whether they have actually understood the concept of "obligatory"
Since anyone can say that at any time, that breaks the meaning of "best", which is supposed to pick out something unique. That would be a reductio ad absurdum of your own theory.
No, no, no...
Every possible creature, and every process of physics SHOULD do XYZ. But practically nothing is moved by that fact.
This sentence means: It is highly valued in the morality equation for XYZ to be the state of affairs, independently of who/what causes it to be so.
Likewise, everything Spiritould do ABC, but only Elves are moved by that fact.
These are objective equations which apply to everything. To say should, spiritould, clipperould, etc., is just to say about different things that they are valued by this equation or that one. It's an objective truth that they are valued by this equation or that one.
It's just that humans are not moved by almost any of the possible equations. They ARE moved by the morality equation.
Humans and Elves should AND spiritould do whatever. They are both equally obligated and ochristmasated. But one species finds one of those facts moving and not the other, and the other finds the other moving and not the one.
Perhaps now it is clear?
It is not a clear expression of something that can be seen to work
Version 1.
I am obligated to both do and not do any number of acts by any number of shouldness-equations
If that is the case, anything resembling objectivism is out of the window. If I am obligate to do X, and I do X, then my action is right. If I am obligated not do to X, and I do X, my action is wrong. if I am both obligated and not obligated to do X, then my action is somehow both right and wrong..that is, it has no definite moral status.
But that's not quite what you were saying.
Version 2.
There are lots of different kinds of morality, but I am only obligated by human morality.
That would work, but it's not what you mean. You are explicitly embracing...
Version 3.
There are lots of different kinds of morality, but I am only motivated by human morality
There's only one word of difference between that and version 2, which is the substitution of "motivated" for "obligated". As we saw under version 1, it's the existence of multiple conflicting obligations which stymies ethical objectivism. And motivation can't fix that problem, because it is a different thing to obligation. In fact it is orthogonal, because:
You can be motivated to do what you are not obligated to do. You can be obligated to d what your are not motivated to do. Or both. Or neither.
Because of that, version 3 implies version 1, and has the same problem.
All of this is why Eliezer's morality sequence is wrong. Version 2 is basically right. The Baby-Eaters were not immoral, but moral, but according to a different morals. That is not subjectivism, because it is an objective fact that Baby-Eaters are what they are, and are obligated by Baby-Eater morality, and humans are humans, and are obligated by human morality.
But Eliezer (and Bound-Up) do not admit this, nonsensically asserting that non-humans should be obligated by human morality.
To be honest, Eliezer made a slightly different argument:
1) humans share (because of evolution) a psychological unity that is not affected by regional or temporal distinctions;
2) this unity entails a set of values that is inescapable for every human beings, its collective effect on human cognition and actions we dub "morality";
3) Clippy, Elves and Pebblesorters, being fundamentally different, share a different set of values that guide their actions and what they care about;
4) those are perfectly coherent and sound for those who entertain them, we should though do not call them "Clippy's, Elves' or Pebblesorters' morality", because words should be used in such a way to maximize their usefulness in carving reality: since we cannot go out of our programming and conceivably find ourselves motivated by eggnog or primality, we should not use the term and instead use primality or other words.
That's it: you can debate any single point, but I think the difference is only formal. The underlying understanding, that "motivating set of values" is a two place predicate, is the same, Yudkowski preferred though to use different words for different partially applied predicates, on the grounds of point 1 and 4.
So my car is a car becuse it motor-vates me, but your car is no car at all, because it motor-vates you around, but not me. And yo mama ain't no Mama cause she ain't my Mama!
Yudkowsky isn't being rigourous, he is instead appealing to an imaginary rule, one that is not seen in any other case.
And it's not like the issue isn't important, either .. obviously the premissibility of imposing ones values on others depends on whether they are immoral, amoral, differently moral , etc. Differrently moral is still a possibilirt, for the reasons that you are differently mothered, not unmohtered.
The difference is not between two cars, yours and mine, but between a passegner ship and a cargo ship, built for two different purpose and two different class of users.
On this we surely agree, I just find the new rule better than the old one. But this is the least important part of the whole discussion.
This is well explored in "Three worlds collide". Yudkowski vision of morality is such that it assigns different morality to different aliens, and the same morality to the same species (I'm using your convention). When different worlds collide, it is moral for us to stop babyeaters from eating babies, and it is moral for the superhappy to happify us. I think Eliezer is correct in showing that the only solution is avoiding contact at all.
That seems different to what you were saying before.
There's not much objectivity in that.
Why is it so important that our morality is the one that motivates us? People keep repeating it as though it's a great revelation, but it's equally true that babyeater morality motivates babyeaters, so the situation comes out looking symmetrical and therefore relativistic.
"words should be used in such a way to maximize their usefulness in carving reality"
That does not mean that we should not use general words, but that we should have both general words and specific words. That is why it is right to speak of morality in general, and human morality in particular.
As I stated in other replies, it is not true that this disagreement is only about words. In general, when people disagree about how words should be used, that is because they disagree about what should be done. Because when you use words differently, you are likely to end up doing different things. And I gave concrete places where I disagree with Eliezer about what should be done, ways that correspond to how I disagree with him about morality.
In general I would describe the disagreement in the following way, although I agree that he would not accept this characterization: Eliezer believes that human values are intrinsically arbitrary. We just happen to value a certain set of things, and we might have happened to value some other random set. In whatever situation we found ourselves, we would have called those things "right," and that would have been a name for the concrete values we had.
In contrast, I think that we value the things that are good for us. What is "good for us" is not arbitrary, but an objective fact about relationships between human nature and the world. Now there might well be other rational creatures and they might value other things. That will be because other things are good for them.
I prefer Eliezer's way because it makes evident, when talking to someone who hasn't read the Sequences, that there are different sets of self-consistent values; but it's an agreement that people should reach before starting to debate, and I personally would have no problem talking about different moralities.
But does he? Because that would be demonstrably false. Maybe arbitrary in the sense of "occupying a tiny space in the whole set of all possible values," but since our morality is shaped by evolution, it will surely contain some historical accidents but also a lot of useful heuristics.
No human can value drinking poison, for example.
If you were to unpack "good", would you insert other meanings besides "what helps our survival"?
"There are different sets of self-consistent values." This is true, but I do not agree that all logically possible sets of self-consistent values represent moralities. For example, it would be logically possible for an animal to value nothing but killing itself; but this does not represent a morality, because such an animal cannot exist in reality in a stable manner. It cannot come into existence in a natural way (namely by evolution) at all, even if you might be able to produce one artificially. If you do produce one artificially, it will just kill itself and then it will not exist.
This is part of what I was saying about how, when people use words differently, they hope to accomplish different things. I speak of morality in general not to mean "a logically consistent set of values," but a set that could reasonably exist in the real world with a real intelligent being. In other words, restricting morality to human values is an indirect way of promoting the position that human values are arbitrary.
As I said, I don't think Eliezer would accept that characterization of his position, and you give one reason why he would not. But he has a more general view where only some sets of values are possible for merely accidental reasons, namely because it just happens that things cannot evolve in other ways. I would say the contrary -- it is not an accident that the value of killing yourself cannot evolve, but this is because killing yourself is bad.
And this kind of explains how "good" has to be unpacked. Good would be what tends to cause tendencies towards itself. Survival is one example, but not the only one, even if everything else will at least have to be consistent with that value. So e.g. not only is survival valued by intelligent creatures in all realistic conditions, but so is knowledge. So knowledge and survival are both good for all intelligent creatures. But since different creatures will produce their knowledge and survival in different ways, different things will be good for them in relation to these ends.
Any virulently self-reproducing meme would be another.
They eat innocent, sentient beings who suffer and are terrified because of it. That's wrong, no matter who does it.
It may not be un-baby-eater-ey, but it's wrong.
Likewise, not eating babies is un-baby-eater-ey, no matter who does it. It might not be wrong, but it is un-baby-eater-ey.
We have two species who agree on the physical effects of certain actions. One species likes the effects of the action, and the other doesn't. The difference between them is what they value.
"Right" just means "in harmony with this set of values." Baby-eater-ey means "in harmony with this other set of values."
There's no contradiction in saying that something can be in harmony with one set of values and not in harmony with another set of values. Hence, there's no contradiction in saying that eating babies is wrong, and is also baby-eater-ey. You can also note that the action is found compelling by one species and not compelling by another, and there is no contradiction in this, either.
What could "right" mean if we have "right according to these morals" AND "right according to these other, contradictory morals?"
I see one possibility: "right" is taken to mean " in harmony with any set of values." Which, of course, makes it meaningless. Do you see another possibility?
I disagree that it is wrong for them to do that. And this is not just a disagreement about words: I disagree that Eliezer's preferred outcome for the story is better than the other outcome.
"Right" is just another way of saying "good", or anyway "reasonably judged to be good." And good is the kind of thing which naturally results in desire. Note that I did not say it is "what is desired" any more than you want to say that someone values at a particular moment is necessarily right. I said it is what naturally results in desire. This definition is in fact very close to yours, except that I don't make the whole universe revolve around human beings by saying that nothing is good except what is good for humans. And since different kinds of things naturally result in desire for different kinds of beings (e.g. humans and babyeaters), those different things are right for different kinds of beings.
That does not make "right" or "good" meaningless. It makes it relative to something. And this is an obvious fact about the meaning of the words; to speak of good is to speak of what is good for someone. This is not subjectivism, since it is an objective fact that some things are good for humans, and other things are good for other beings.
Nor does this mean that right means "in harmony with any set of values." It has to be in harmony with some real set of values, not an invented one, nor one that someone simply made up -- for the same reasons that you do not allow human morals to be simply invented by a random individual.
Returning to the larger point, as I said, this is not just a disagreement about words, but about what is good. People maintaining your theory (like Eliezer) hope to optimize the universe for human values. I have no such hope, and I think it is a perverse idea in the first place.
No, moral rightness and wrongness have implications about rule following and rule breaking, reward and punishment, that moral goodness and badness don't. Giving to charity is virtuous, but not giving to charity isn't wrong and doesn't deserve punishment.
Similarly, moral goodness and hedonic goodness are different.
I'm not sure what you're saying. I would describe giving to charity as morally good without implying that not giving is morally evil.
I agree that moral goodness is different from hedonic goodness (which I assume means pleasure), but I would describe that by saying that pleasure is good in a certain way, but may or may not be good all things considered, while moral goodness means what is good all things considered.
I'm saying it's a bad idea to collapse together the ideas of moral obligation, moral advisability, and pleasure.
I think I get it.
You're saying that "right" just means "in harmony with any set of values held by sentient beings"?
So, baby-eating is right for baby-eaters and wrong for humans, and all either of those statements means is that it is or isn't consistent with the fundamental values of the two species?
That is most of it. But again, I insist that the disagreement is real. Because Eliezer would want to stomp out baby-eater values from the cosmos. I would not.
Metaethically, I don't see a disagreement between you and Eliezer. Ethically, I do.
Eliezer says he values babies not being eaten more than he values letting a sentient being eat babies just because it wants to.
You say you don't, that's all. Different values.
Are you serious, though? What if you had enough power to stop them from eating babies without having to kill them? Can we just give them fake babies?
If you are interested, I might recommend trying to write up what you think this idea is, and see if you find any holes in your understanding that way. I'm not sure how to make it any clearer right now, but, for what it's worth, you have my word that you have not understood the idea.
We are not disagreeing about something we both understand; you are disagreeing with a series of ideas you think I hold, and I am trying to explain the original idea in a way that you find understandable and, apparently, not yet succeeding.
I believe I just did something like that. Of course, I attributed the holes to the theory not working. If you want me to attribute them to my not having understood you, you need to put forward a version that works.