Without commenting on whether this presentation matches the original metaethics sequence (with which I disagree), this summary argument seems both unsupported and unfalsifiable.
Would this be an accurate summary of what you take the metaethics sequence to be saying? I feel you captured the important bits, but I also feel we disagree on some aspects:
V(Elves, _) = Christmas spirit
V(Pebblesorters, _) = primality
V(Humans, _) = morality
If V(Humans, Alice) =/= V(Humans, _), that doesn't make morality subjective, it is rather i...
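To make the V(…) notation concrete, here is a minimal sketch in Python. It is purely illustrative: the two-argument V, the "_" placeholder for the species-typical member, and the fallback-to-default behaviour are my assumptions about how the notation is meant to work, not anything stated in the sequence.

```python
# Illustrative sketch: V(species, individual) -> value system.
# "_" stands for the species-typical (shared) member, per the
# convention in the comment above.

VALUE_SYSTEMS = {
    ("Elves", "_"): "Christmas spirit",
    ("Pebblesorters", "_"): "primality",
    ("Humans", "_"): "morality",
}

def V(species: str, individual: str = "_") -> str:
    """Return the value system of `individual` of `species`.

    An individual may deviate from the species-typical values; in this
    toy model we fall back to the species default when no individual
    override is recorded.
    """
    return VALUE_SYSTEMS.get((species, individual),
                             VALUE_SYSTEMS[(species, "_")])

# V("Humans", "Alice") may differ from V("Humans", "_"); on the reading
# above, that marks Alice's deviation from the shared referent, not
# that morality itself is subjective.
```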
Unpacking "should" as " morally obligated to" is potentially helpful, so inasmuch as you can give separate accounts of "moral" and "obligatory".
The elves are not moral. Not just because I, and humans like me, happen to disagree with them, no, certainly not. The elves aren't even trying to be moral. They don't even claim to be moral. They don't care about morality. They care about "The Christmas Spirit," which is about eggnog and stuff.
That doesn't generalise to the point that non-humans have no morality. You have m...
Morality binds and blinds. People derive moral claims from emotional and intuitive notions. It can feel good and moral to do immoral things. Objective morality has to be tied to evidence about what human wellbeing really is, not to moral intuitions that are adaptations benefiting one's ingroup, or to post hoc thought experiments about knowledge.
The difference is not between two cars, yours and mine, but between a passenger ship and a cargo ship, built for two different purposes and two different classes of users.
On this we surely agree; I just find the new rule better than the old one. But this is the least important part of the whole discussion.
This is well explored in "Three Worlds Collide". Yudkowsky's vision of morality is such that it assigns different moralities to different aliens, and the same morality to the same species (I'm using your convention). When different worlds collide, it is moral for us to stop the Babyeaters from eating babies, and it is moral for the Superhappies to happify us. I think Eliezer is correct in showing that the only solution is avoiding contact altogether.
That seems different to what you were saying before.