Manfred comments on Weighting the probability of being a mind by the quantity of the matter composing the computer that calculates that mind - Less Wrong
If you make a robot that explicitly cares about things differently depending on how heavy it is, then sure, it can take actions as if it cared about things more when it's heavier.
But that is done using the same probabilities as normal, merely a different utility function. Changing your utilities without changing your probabilities has no impact on the "probability of being a mind."
We don't have to program the machine to explicitly care about things differently depending on how heavy they are. Instead, we program the machine to care simply about how many systems exist--but wait! It turns out we don't know what we mean by that, according to yttrium.