TimS comments on We Don't Have a Utility Function - Less Wrong

43 [deleted] 02 April 2013 03:49AM


Comment author: TimS 02 April 2013 05:24:52PM *  2 points [-]

What is a conversation metaphor? I'm afraid I don't see what you're getting at.

My fault for failing to clarify. There are roughly three ways one can talk about changes to an agent's terminal values.

(1) Such changes never happen. (At a societal level, this proposition appears to be false.)

(2) Such changes happen through rational processes (i.e. reasoning).

(3) Such changes happen through non-rational processes (e.g. tribal affiliation + mindkilling).

I was using "conversion" as a metaphorical shorthand for the third type of change.

Comment author: Eugine_Nier 03 April 2013 06:10:25AM 3 points [-]

I was using "conversion" as a metaphorical shorthand for the third type of change.

BTW, you might want to change "conversation" to "conversion" in the grandparent.

Comment author: TimS 03 April 2013 01:44:27PM 1 point [-]

Ah! Thanks.

Comment author: [deleted] 02 April 2013 06:33:33PM 1 point [-]

Ok. Then my answer to that is roughly this:

I'll admit it's rather shaky, and I'd be saying the same thing if I'd merely been brainwashed. It doesn't feel like it was precipitated by anything other than legitimate moral argument, though. If I can be brainwashed out of my "terminal values" so easily, and it really doesn't feel like something to resist, then I'd like a sturdier foundation for my moral reasoning.

This could of course use more detail, unless you understand what I'm getting at.