MrMind comments on An attempt in layman's language to explain the metaethics sequence in a single post. - Less Wrong

1 Post author: Bound_up 12 October 2016 01:57PM

Comments (65)

You are viewing a single comment's thread. Show more comments above.

Comment author: MrMind 20 October 2016 07:53:42AM 0 points [-]

So my car is a car because it motor-vates me, but your car is no car at all, because it motor-vates you around, but not me.

The difference is not between two cars, yours and mine, but between a passenger ship and a cargo ship, built for two different purposes and two different classes of users.

Yudkowsky isn't being rigorous; he is instead appealing to an imaginary rule, one that is not seen in any other case.

On this we surely agree, I just find the new rule better than the old one. But this is the least important part of the whole discussion.

obviously the permissibility of imposing one's values on others depends on whether they are immoral, amoral, differently moral, etc. Differently moral is still a possibility, for the same reason that you are differently mothered, not unmothered.

This is well explored in "Three Worlds Collide". Yudkowsky's vision of morality is such that it assigns different moralities to different aliens, and the same morality to the same species (I'm using your convention). When different worlds collide, it is moral for us to stop the babyeaters from eating babies, and it is moral for the superhappies to happify us. I think Eliezer is correct in showing that the only solution is avoiding contact at all.

Comment author: TheAncientGeek 20 October 2016 01:20:04PM *  1 point [-]

The difference is not between two cars, yours and mine, but between a passenger ship and a cargo ship, built for two different purposes and two different classes of users.

That seems different to what you were saying before.

This is well explored in "Three Worlds Collide". Yudkowsky's vision of morality is such that it assigns different moralities to different aliens, and the same morality to the same species (I'm using your convention). When different worlds collide, it is moral for us to stop the babyeaters from eating babies, and it is moral for the superhappies to happify us. I think Eliezer is correct in showing that the only solution is avoiding contact at all.

There's not much objectivity in that.

Why is it so important that our morality is the one that motivates us? People keep repeating it as though it's a great revelation, but it's equally true that babyeater morality motivates babyeaters, so the situation comes out looking symmetrical and therefore relativistic.