Jonii comments on The true prisoner's dilemma with skewed payoff matrix - Less Wrong

0 Post author: Jonii 20 November 2010 08:37PM




Comment author: Jonii 21 November 2010 02:29:02PM 0 points [-]

But I'd think that if I only said "It doesn't have moral value in itself", you'd still have to trace back through similar steps to find the property cluster that we assign value to. I tried to convey both ideas by using the word soul and claiming a lack of moral value.

Comment author: Vladimir_Nesov 21 November 2010 02:35:10PM 0 points [-]

you'd still have to trace back through similar steps to find the property cluster that we assign value to. I tried to convey both ideas by using the word soul and claiming a lack of moral value.

What property cluster/why I'd need to find it/which both ideas?

Comment author: Jonii 21 November 2010 03:19:44PM 0 points [-]

Those properties that we think make happy humans better than totally artificial smiling humans mimicking happy humans. You'd need to find it in order to grasp what it means to have a being that lacked moral value, and "both ideas" refers to the distinct ways of explaining what sort of paperclip maximizer we're talking about.

Comment author: Vladimir_Nesov 21 November 2010 03:29:48PM 0 points [-]

Those properties that we think make happy humans better than totally artificial smiling humans mimicking happy humans.

This I guessed.

You'd need to find it in order to grasp what it means to have a being that lacked moral value,

Why? "No moral value" has a clear decision-theoretic meaning, and referring to particular patterns that have moral value doesn't improve on that understanding. Also, the examples of things that have moral value are easy to imagine.

"both ideas" refers to the distinct ways of explaining what sort of paperclip maximizer we're talking about.

This I still don't understand. You'd need to name two ideas, and my intuition for grasping the intended meaning often fails me. One relevant idea that I see is that the paperclip maximizer lacks moral value. What's the other, and how is it relevant?