nyan_sandwich comments on We Don't Have a Utility Function - Less Wrong

43 [deleted] 02 April 2013 03:49AM




Comment author: [deleted] 02 April 2013 04:56:37PM 1 point [-]

On what basis do you assert you were "reasoned out" of that position?

I'll admit it's rather shaky, and I'd be saying the same thing if I'd merely been brainwashed. It doesn't feel like it was precipitated by anything other than legitimate moral argument, though. If I can be brainwashed out of my "terminal values" so easily, and it really doesn't feel like something to resist, then I'd like a sturdier foundation for my moral reasoning.

For example, what about your change of mind causes you to reject a conversation metaphor?

What is a conversation metaphor? I'm afraid I don't see what you're getting at.

The other way you might respond is that you have realized that you still value freedom, but have recently realized it is not a terminal value. But that makes the example less useful in figuring out how actual terminal values work.

I still value freedom in what feels like a fundamental way; I just also value hierarchy and social order now. What is gone is the extreme feeling of ickiness attached to authority, the feeling of sacredness attached to freedom, and the belief that these things were terminal values.

The point is that the things I'm likely to identify as "terminal values", especially in the context of disagreements, are simply not that fundamental, and are much closer to derived surface heuristics or even tribal affiliation signals.

I feel like I'm not properly responding to your comment though.

Comment author: Furslid 02 April 2013 05:18:32PM 6 points [-]

Nyan, I think your freedom example is a little off. The converse of freedom is not bowing down to a leader. It's being made to bow. People choosing to bow can be beautiful and rational, but I fail to see any beauty in someone bowing when their values dictate they should stand.

Comment author: TimS 02 April 2013 05:24:52PM *  2 points [-]

What is a conversation metaphor? I'm afraid I don't see what you're getting at.

My fault for failing to clarify. There are roughly three ways one can talk about changes to an agent's terminal values.

(1) Such changes never happen. (At a societal level, this proposition appears to be false.)

(2) Such changes happen through rational processes (i.e. reasoning).

(3) Such changes happen through non-rational processes (e.g. tribal affiliation + mindkilling).

I was using "conversion" as a metaphorical shorthand for the third type of change.

Comment author: Eugine_Nier 03 April 2013 06:10:25AM 3 points [-]

I was using "conversion" as a metaphorical shorthand for the third type of change.

BTW, you might want to change "conversation" to "conversion" in the grandparent.

Comment author: TimS 03 April 2013 01:44:27PM 1 point [-]

Ah! Thanks.

Comment author: [deleted] 02 April 2013 06:33:33PM 1 point [-]

Ok. Then my answer to that is roughly this:

I'll admit it's rather shaky, and I'd be saying the same thing if I'd merely been brainwashed. It doesn't feel like it was precipitated by anything other than legitimate moral argument, though. If I can be brainwashed out of my "terminal values" so easily, and it really doesn't feel like something to resist, then I'd like a sturdier foundation for my moral reasoning.

This could of course use more detail, unless you understand what I'm getting at.

Comment author: private_messaging 04 April 2013 10:41:08AM 1 point [-]

If I can be brainwashed out of my "terminal values" so easily, and it really doesn't feel like something to resist, then I'd like a sturdier foundation for my moral reasoning.

Are you sure you aren't simply trading open ended beliefs for those that circularly support themselves to a greater extent? When you trust in an authority which tells you to trust in that authority, that's sturdier.

Comment author: Strange7 04 April 2013 10:23:41AM 1 point [-]

I still value freedom in what feels like a fundamental way, I just also value hierarchy and social order now.

Gygax would say your alignment has shifted a step toward Lawful. I tend to prefer the Exalted system, which could represent such a shift through the purchase of a third dot in the virtue of Temperance.

Comment author: TimS 02 April 2013 05:21:04PM 1 point [-]

The point is that the things I'm likely to identify as "terminal values", especially in the context of disagreements, are simply not that fundamental, and are much closer to derived surface heuristics or even tribal affiliation signals.

That's certainly a serious risk, especially if terminal values work like axioms. There's a strong incentive in debate or policy conflict to claim an instrumental value was terminal just to insulate it from attack. And then, through the failure mode identified in Keep Your Identity Small, one is likely to come to believe that the value actually is a terminal value for oneself.

I feel like I'm not properly responding to your comment though.

I took your essay as trying to make a meta-ethical point about "terminal values" and how using the term with an incoherent definition causes confusion in the debate. Parallel to when you said that if we interact with an unshielded utility, it's over: we've committed a type error. If that was not your intent, then I've misunderstood the essay.

Comment author: [deleted] 02 April 2013 06:31:47PM 1 point [-]

I took your essay as trying to make a meta-ethical point about "terminal values" and how using the term with an incoherent definition causes confusion in the debate. Parallel to when you said that if we interact with an unshielded utility, it's over: we've committed a type error. If that was not your intent, then I've misunderstood the essay.

Oops, it wasn't really about how we use terms or anything. I'm trying to communicate that we are not as morally wise as we sometimes pretend to be, or think we are. That Moral Philosophy is an unsolved problem, and we don't even have a good idea how to solve it (unlike, say, physics, where it's unsolved but the problem is understood).

This is in preparation for some other posts on the subject, the next of which will be posted tonight or soon.

Comment author: Eugine_Nier 03 April 2013 06:08:22AM 0 points [-]

That Moral Philosophy is an unsolved problem, and we don't even have a good idea how to solve it

That said, there have been centuries of work on the subject, which Eliezer unfortunately threw out because VNM-utilitarianism is so mathematically elegant.