Jack comments on Less Wrong views on morality? - Less Wrong

1 Post author: hankx7787 05 July 2012 05:04PM


Comment author: [deleted] 05 July 2012 05:42:30PM *  17 points [-]

There seems to be some division on this point.

I might be mistaken, but I got the feeling that there's not much of a division. The picture I've got of LW on meta-ethics is something along the lines of: values exist in people's heads, and those are real, but if there were no people there wouldn't be any values. Values are to some extent universal; since most people care about similar things, some values behave as if they were objective. If you want to categorize it (though I don't know what you would get out of that), it's a form of nihilism.

An appropriate question when discussing objective and subjective morality is:

  • What would an objective morality look like, vs. a subjective one?
Comment author: Jack 05 July 2012 07:30:30PM *  19 points [-]

People here seem to share anti-realist sensibilities but then balk at the label and do weird things for anti-realists like treat moral judgments as beliefs, make is-ought mistakes, argue against non-consequentialism as if there were a fact of the matter, and expect morality to be describable in terms of a coherent and consistent set of rules instead of an ugly mess of evolved heuristics.

I'm not saying it can never be reasonable for an anti-realist to do any of those things, but it certainly seems like belief in subjective or non-cognitive morality hasn't filtered all the way through people's beliefs.

Comment author: TimS 05 July 2012 07:42:17PM 2 points [-]

I attribute this behavior in part to the desire to preserve the possibility of universal provably Friendly AI. I don't think a moral anti-realist is likely to think an AGI can be friendly to me and to Aristotle. It might not even be possible to be friendly to me and any other person.

Comment author: Jack 05 July 2012 08:02:28PM 18 points [-]

I attribute this behavior in part to the desire to preserve the possibility of universal provably Friendly AI

Well that seems like the most dangerous instance of motivated cognition ever.

Comment author: ChristianKl 06 July 2012 06:25:09PM 1 point [-]

It seems like an issue that's important to get right. Is there a test we could run to see whether it's true?

Comment author: Eugine_Nier 07 July 2012 05:28:08AM *  1 point [-]

Yes, but only once. ;)

Comment author: RobertLumley 07 July 2012 03:54:56PM 0 points [-]

Did you mean to link to this comment?

Comment author: Eugine_Nier 07 July 2012 11:16:08PM 0 points [-]

Thanks, fixed.

Comment author: [deleted] 06 July 2012 10:23:38AM 1 point [-]

People here seem to share anti-realist sensibilities but then balk at the label

When I explain my meta-ethical standpoint to people in general, I usually avoid phrases such as "there is no objective morality" or "nihilism" because they carry a lot of emotional baggage; oftentimes people go "ah, so you think everything is permitted," which is not really what I'm trying to convey.

do weird things for anti-realists like treat moral judgments as beliefs, make is-ought mistakes, argue against non-consequentialism as if there were a fact of the matter, and expect morality to be describable in terms of a coherent and consistent set of rules instead of an ugly mess of evolved heuristics.

In a lot of cases you are absolutely correct, but there are times when I think people on LW try to answer "what do I think is right?" This becomes a question of self-knowledge, e.g. to what degree am I aware of what motivates me, or can I formulate what I value?

Comment author: Jack 06 July 2012 01:27:35PM 3 points [-]

When I explain my meta-ethical standpoint to people in general, I usually avoid phrases such as "there is no objective morality" or "nihilism" because they carry a lot of emotional baggage; oftentimes people go "ah, so you think everything is permitted," which is not really what I'm trying to convey.

Terms like "moral subjectivism" are often associated with 'naive undergraduate moral relativism' and I suspect a lot of people are trying to avoid affiliating with the latter.

Comment author: mwengler 10 July 2012 06:57:41AM 2 points [-]

When I explain my meta-ethical standpoint to people in general, I usually avoid phrases such as "there is no objective morality" or "nihilism" because they carry a lot of emotional baggage; oftentimes people go "ah, so you think everything is permitted," which is not really what I'm trying to convey.

So you don't think everything is permitted?

How do you convey thinking there is no objective truth value to any moral statement and then convey that something is forbidden?

Comment author: [deleted] 10 July 2012 10:51:33AM *  0 points [-]

How do you convey thinking there is no objective truth value to any moral statement and then convey that something is forbidden?

Sure, I can. Doing something that is forbidden results in harsh consequences (that other agents impose); that is the only meaningful definition I can come up with. Can you come up with any other useful definition?

Comment author: mwengler 10 July 2012 08:06:40PM 1 point [-]

I like to stick with other people's definitions and not come up with my own. Merriam-Webster for example:

1: not permitted or allowed

Thanks for being my straight man! :)

Comment author: [deleted] 22 July 2012 09:03:01PM *  1 point [-]

While reading your response the first time, frankly speaking, I got a bit annoyed. So I decided to answer it later, when I wouldn't just scream blue!

I might have misinterpreted your meaning, but it seems like you present a straw man of my argument. I was trying to make concepts like "forbidden" and "permitted" pay rent, even in a world where there is no objective morality, as well as show that our (or at least my) intuition about "forbiddenness" and "permittedness" is derived from the kind of consequences that they result in. It's not like something can be not permitted in a group yet have no bad consequences if performed.

Comment author: mwengler 22 July 2012 11:28:25PM 0 points [-]

The largest rent I can ever imagine getting from terms which are in wide and common use is to use them to mean the same things everybody else means when using them. To me, it seems coming up with private definitions for public words decreases the value of these words.

I was trying to make concepts like "forbidden" and "permitted" pay rent, even in a world where there is no objective morality,

There are many words used to make moral statements. When you declare that no moral statement can be objectively true, then I don't think it makes sense to redefine all these words so they now get used in some other way. I doubt you will ever convince me to agree to the redefining of words away from their standard definitions because to me that is just a recipe for confusion.

I have no idea what is "straw man" about any of my responses here.

Comment author: Viliam_Bur 06 July 2012 08:37:24AM 1 point [-]

treat moral judgments as beliefs, make is-ought mistakes, argue against non-consequentialism

A few examples could help me understand what you mean, because right now I don't have a clue.

expect morality to be describable in terms of a coherent and consistent set of rules instead of an ugly mess of evolved heuristics

I guess the goal is to simplify the mess as much as possible, but no more: to find the smallest set of rules that would generate a similar result.

Comment author: komponisto 05 July 2012 10:54:37PM 0 points [-]

Well said.

Comment author: bryjnar 05 July 2012 10:41:14PM 0 points [-]

I agree. I can't figure out exactly what Eliezer's metaethics is, but there definitely seem to be latent anti-realist sympathies floating around.