"Moral uncertainty (or normative uncertainty) is uncertainty about how to act given the diversity of moral doctrines. For example, suppose that we knew for certain that a new technology would enable more humans to live on another planet with slightly less well-being than on Earth. An average utilitarian would consider these consequences bad, while a total utilitarian would endorse such technology. If we are uncertain about which of these two theories are right, what should we do?" (see: https://wiki.lesswrong.com/wiki/Moral_uncertainty)

You're invited to the Macroscope to discuss moral uncertainty. Please read this short piece, and at least skim some of the articles in its "Further reading" section, so we have good foundations for our discussion: https://concepts.effectivealtruism.org/concepts/moral-uncertainty/.
