If you mean that others' mental states matter equally, then I agree (though this distracts from the point of the experience machine hypothetical). Nothing else could possibly matter.
An unsupported strong claim. If it were assumed true, it would carry dozens of implications and necessary conditions in evolutionary psychology. No justification, no arguments; only one or two weak points looked up by the claim's proponent.
I think you may be confusing labels and concepts. Maximizing hedonistic mental states means, to the best of my knowledge, programming a hedonistic imperative directly into DNA so that each human is in a maximally pleasurable state constantly from birth, regardless of conditions or situations, and then stacking up as many humans as possible so that as many of them as possible feel as good as possible. If any of the humans move, they could endanger the efficient operation of this system, so letting them move becomes a net negative. It follows that, in the process of optimization, all human mobility should be removed; for a superintelligence, removing limbs and any other means of mobility from "human" DNA is probably trivial.
But since they're all feeling the best they could possibly feel, then it's all good, right? It's what they like (having been programmed to like it), so that's the ideal world, right?
Edit: See Wireheading for a more detailed explanation and context of the possible result of a happiness-maximizer.
You haven't read or paid much attention to the metaethics sequence yet, have you? Or do you simply disagree with pretty much all the major points of the first half of it?
Also relevant: Joy in the merely real
I remember starting it, and putting it away precisely because I disagreed with so many things, especially on the present subject: I couldn't find any argument for the insistence on satisfying wants rather than improving experience. I'll read it in full next week.