The author says a moral theory should:
- "Cover how one should act in all situations" (instead of dealing only with 'moral' ones)
- Contain no contradictions
- "Cover all situations in which somebody should perform an action, even if this “somebody” isn’t a human being"
In other words, a decision theory, complete with an algorithm (so you can actually use it), plus a full set of terminal goals. That is not what anyone else means by "moral theory".