
Pavitra comments on Lifeism, Anti-Deathism, and Some Other Terminal-Values Rambling - Less Wrong Discussion

4 points | Post author: Pavitra | 07 March 2011 04:35AM



You are viewing a single comment's thread.

Comment author: Pavitra 10 March 2011 10:22:09PM 0 points

Hurting others is ethically problematic, not morally. For example, I would probably be okay with hurting someone else at their own request. Avoidance of torture is a question of an entirely different type: what I value, not how I think it's appropriate to go about getting it.

I don't have a formalization of my terminal values, but roughly:

I have noticed that sometimes I feel more conscious than other times -- not just awake/dreaming/sleeping, but between different "awake" times. I infer that consciousness/sentience/sapience/personhood/whatever you want to call it, you know, that thing we care about is not a binary predicate, but a scalar. I want to maximize the degree of personhood that exists in the universe.

Comment author: DanielLC 12 March 2011 05:49:03PM 0 points

> Hurting others is ethically problematic, not morally.

What's the difference between ethics and morals?

> I want to maximize the degree of personhood that exists in the universe.

So, if you create a person, and torture them for their entire life, that's worth it?

Comment author: Pavitra 12 March 2011 08:00:35PM 0 points

> What's the difference between ethics and morals?

By morals, I mean terminal values. By ethics, I mean advanced forms of strategy involving things like Hofstadter's superrationality. I'm not sure what the standard LW jargon is for this sort of thing, but I think I remember reading something about deciding as though you were deciding on behalf of everyone who shares your decision theory.

>> I want to maximize the degree of personhood that exists in the universe.

> So, if you create a person, and torture them for their entire life, that's worth it?

If the most conscious person possible would be unhappy, I'd rather create them than not. The consensus among science fiction writers seems to be with me on this: a drug that makes you happy at the expense of your creative genius is generally treated as a bad thing.

Comment author: DanielLC 13 March 2011 05:03:16AM 0 points

> By ethics, I mean advanced forms of strategy involving things like Hofstadter's superrationality. I'm not sure what the standard LW jargon is for this sort of thing

Sounds like decision theory.

Comment author: Pavitra 13 March 2011 05:36:18AM 0 points

That link was what I needed. By ethics I mean, roughly, the difference between causal decision theory and the right answer.
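[Editor's note: for readers unfamiliar with the distinction being gestured at, here is a minimal Python sketch of how a causal decision theorist and a Hofstadter-style superrational agent diverge in a symmetric Prisoner's Dilemma. The payoff numbers and function names are illustrative assumptions, not anything from this thread or from a specific LW post.]

    # Illustrative payoffs for a symmetric Prisoner's Dilemma (row player's payoff).
    # These numbers are assumptions chosen only to make the example concrete.
    PAYOFF = {
        ("C", "C"): 3,  # mutual cooperation
        ("C", "D"): 0,  # I cooperate, you defect
        ("D", "C"): 5,  # I defect, you cooperate
        ("D", "D"): 1,  # mutual defection
    }

    def causal_choice():
        """Causal decision theory: treat the opponent's move as fixed and
        causally independent of mine, so pick the action that dominates."""
        best = {}
        for their_move in ("C", "D"):
            best[their_move] = max("CD", key=lambda my_move: PAYOFF[(my_move, their_move)])
        # Defecting is better no matter what they do, so CDT defects.
        return "D" if set(best.values()) == {"D"} else "C"

    def superrational_choice():
        """Hofstadter-style superrationality: a symmetric opponent running the
        same reasoning will pick whatever I pick, so compare the diagonal."""
        return max("CD", key=lambda move: PAYOFF[(move, move)])

    if __name__ == "__main__":
        print("CDT picks:", causal_choice())                        # D
        print("Superrationality picks:", superrational_choice())    # C

The gap between those two outputs is roughly the gap Pavitra describes: in symmetric situations, the "right answer" can differ from the causally dominant one.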

Comment author: TheOtherDave 12 March 2011 08:10:20PM 0 points

Do you mean to equate here the degree to which something is a person, the degree to which a person is conscious, and the degree to which a person is a creative genius?

That's what it reads like, but perhaps I'm reading too much into your comment.

That seems unjustified to me.

Comment author: Pavitra 12 March 2011 08:34:30PM 0 points

I don't mean to equate them. They're each a rough approximation to the thing I actually care about.