Comment author: DaFranker 24 August 2012 01:56:15PM 1 point

You haven't read or paid much attention to the metaethics sequence yet, have you? Or do you simply disagree with pretty much all the major points of the first half of it?

Also relevant: Joy in the Merely Real

Comment author: koning_robot 24 August 2012 03:32:31PM 0 points

I remember starting it and putting it away because, yes, I disagreed with so many things. Especially on the present subject: I couldn't find any arguments for the insistence on placating wants rather than improving experience. I'll read it in full next week.

Comment author: DaFranker 24 August 2012 12:54:40PM * 1 point

If you mean that others' mental states matter equally much, then I agree (but this distracts from the point of the experience machine hypothetical). Anything else couldn't possibly matter.

An unsupported strong claim. Dozens of implications and necessary conditions in evolutionary psychology follow if the claim is assumed true. No justification. No arguments. Only one or two weak points looked up by the claim's proponent.

I think you may be confusing labels and concepts. Maximizing hedonistic mental states means, to the best of my knowledge, programming a hedonistic imperative directly into DNA for a constant, maximal state from birth, regardless of conditions or situations, and then stacking up humans as densely as possible so that as many of them as possible feel as good as possible. If any of the humans move, they could endanger the efficient operation of this system, so letting them move becomes a net negative; it follows that, in the process of optimization, all human mobility should be removed, since for a superintelligence removing limbs and any other means of mobility from "human" DNA is probably trivial.

But since they're all feeling the best they could possibly feel, it's all good, right? It's what they like (having been programmed to like it), so that's the ideal world, right?

Edit: See Wireheading for a more detailed explanation and context for the possible result of a happiness-maximizer.

Comment author: koning_robot 24 August 2012 02:08:59PM 0 points

An unsupported strong claim. Dozens of implications and necessary conditions in evolutionary psychology follow if the claim is assumed true. No justification. No arguments. Only one or two weak points looked up by the claim's proponent.

This comment has justification. I don't see how this would affect evolutionary psychology. I'm not sure if I'm parsing your last sentence here correctly; I didn't "look up" anything, and I don't know what the weak points are.

If the scenario you paint is plausible and the optimal way to get there, then yeah, that's where we should be headed. One of the explicit truths of your scenario is that "they're all feeling the best they could possibly feel". But your scenario is a bad intuition pump: you deliberately constructed it to manipulate me into judging what the inhabitants experience as less than that, appealing to some superstitious notion of true/pure/honest/all-natural pleasure.

You may be onto something when you say I might be confusing labels and concepts, but I am not saying that the label "pleasure" refers to something simple. I am only saying that the quality of mental states is the only thing we should care about (note the word should, I'm not saying that is currently the way things are).

Comment author: nshepperd 24 August 2012 12:12:39PM 3 points

Anything else couldn't possibly matter.

Why's that?

Comment author: koning_robot 24 August 2012 01:43:47PM 0 points

A priori, nothing matters. But sentient beings cannot help but make value judgements regarding some of their mental states. This is why the quality of mental states matters.

Wanting something out there in the world to be some way, regardless of whether anyone will ever actually experience it, is different. A want is a proposition about reality whose apparent falsehood makes you feel bad. Why should we care about arbitrary propositions being true or false?

Comment author: nshepperd 22 August 2012 03:33:42PM 2 points

"Desire" denotes your utility function (things you want). "Pleasure" denotes subjectively nice-feeling experiences. These are not necessarily the same thing.

Surely you would have to be superstitious to refuse!

There's nothing superstitious about caring about stuff other than your own mental state.

Comment author: koning_robot 24 August 2012 10:41:49AM -2 points

"Desire" denotes your utility function (things you want). "Pleasure" denotes subjectively nice-feeling experiences. These are not necessarily the same thing.

Indeed they are not necessarily the same thing, which is why my utility function should not value that which I "want" but that which I "like"! The top-level post all but concludes this. The conclusion the author draws just does not follow from what came before. The correct conclusion is that we may still be able to "just" program an AI to maximize pleasure. What we "want" may be complex, but what we "like" may be simple. In fact, that would be better than programming an AI to make the world into what we "want" but not necessarily "like".

There's nothing superstitious about caring about stuff other than your own mental state.

If you mean that others' mental states matter equally much, then I agree (but this distracts from the point of the experience machine hypothetical). Anything else couldn't possibly matter.

Comment author: koning_robot 22 August 2012 08:54:10AM -1 points

In the last decade, neuroscience has confirmed what intuition could only suggest: that we desire more than pleasure. We act not for the sake of pleasure alone. We cannot solve the Friendly AI problem just by programming an AI to maximize pleasure.

Either this conclusion contradicts the whole point of the article, or I don't understand what is meant by the various terms "desire", "want", "pleasure", etc. If pleasure is "that which we like", then yes we can solve FAI by programming an AI to maximize pleasure.

The mistake you (lukeprog, but also Eliezer) are apparently making worries me very much. It is irrelevant what we desire or want, as is what we act for. The only thing that is relevant is that which we like. Tell me, if the experience machine gave you that which you like ("pleasure" or "complex fun" or whatchamacallit), would you hook up to it? Surely you would have to be superstitious to refuse!

Comment author: koning_robot 08 March 2012 09:48:14AM * 0 points

I am going to try to be there. I'll be traveling from Maastricht.

Edit: I decided not to go after all.
