DanielLC comments on Separate morality from free will - Less Wrong

6 Post author: PhilGoetz 10 April 2011 02:35AM


Comment author: DanielLC 10 April 2011 08:01:18PM 0 points [-]

I believe the "free will" requirement exists because without it, you could talk about whether or not a rock is moral. You could just ask whether or not the universe is moral.

I consider morality to be an aspect of the universe (a universe with happier people is better, even if nobody's responsible), so I don't see why free will matters.

Comment author: rabidchicken 10 April 2011 11:36:52PM 2 points [-]

I don't understand. You can't talk about whether a rock is moral?

Given that a rock appears to have no way to receive input from the universe, create a plan to satisfy its goals, and act, I would consider a rock morally neutral, in the same way that I consider someone morally neutral when they fail to prevent a car from being stolen while they are in a coma in another country.

Comment author: Perplexed 11 April 2011 03:04:52PM 0 points [-]

I believe you are missing Kant's point regarding free will. People have free will. Rocks don't. And that is why it makes moral sense for you to want a universe with happy people, and not a universe with happy rocks!

People deserve happiness because they are morally responsible for causing happiness. Rocks take no responsibility; hence those of us who do take responsibility are under no obligation to worry about the happiness of rocks.

Utilitarians of the LessWrong variety tend to think that possession of consciousness is important in determining whether some entity deserves our moral respect. Kant tended to think that possession of free will is important.

As a contractarian regarding morals, I lean toward Kant's position, though I would probably express the idea in different language.

Comment author: jimrandomh 11 April 2011 05:07:14PM *  2 points [-]

Generally speaking, I'm uneasy about any reduction from a less-confused concept to a more-confused concept, and free will is a more confused concept than moral significance. Also, I can imagine things that would change my perspective on free will without also changing my perspective on moral significance. For example, if we interpret free will as unsolvability by rivals, then the birth of a superintelligence would cause everyone to lose their free will, but would have no effect on anyone's moral significance.