Will_Newsome comments on Rationalists don't care about the future - Less Wrong

Post author: PhilGoetz 15 May 2011 07:48AM




Comment author: Will_Newsome 23 May 2011 05:09:22AM 0 points

This idea of "rational value" that you think is incoherent is perhaps a straw man. Let's instead say that some people think the methods you are using to discard instrumental values as irrational, or to find and endorse arational terminal values, might be generalized beyond what is obvious, might rest on mistaken assumptions, or might be approximations of rules that are more explicitly justifiable.

For example, I think a lot of people use a simple line of reasoning like "okay, genetic evolution led me to like certain things, memetic evolution led me to like other things, and maybe quirks of events during my development led me to like still other things; some of these intuitively seem more justified, or feel more justified upon introspection, or seem from the outside as if there would be more selection pressure for their existence, so those are probably the real values, ..." and then they basically stop thinking, or stop examining the intuitions they used to do that kind of thinking, or continue thinking but remain very confident in their conclusions despite all of the known cognitive biases that make such thinking rather difficult.

Interestingly, very few people ponder the ontology of agency, or timeless control, or the complex relationship between disposition and justification, or spirituality and transpersonal psychology; and among the people who do ponder these things, it seems to me that very few stop and think "wait, maybe I am more confused about morality than I had thought". It seems rather unlikely to me that this is because humans have reached diminishing marginal returns in the field of meta-ethics.

Comment author: Rain 23 May 2011 01:14:00PM 0 points

My "straw-man" does appear to have defenders, though we seem to agree you aren't one of them. I've admitted great confusion regarding ethics, morality, and meta-ethics, and I agree that rationality is one of the most powerful tools we have for dissecting and analyzing them.

Comment author: Friendly-HI 26 May 2011 12:37:10AM 0 points

What other valid tools for dissecting and analyzing morality are there again?

I'm not facetiously nit-picking, just wondering about your answer if there is one.

Comment author: Rain 26 May 2011 01:26:38AM 0 points

Before rationality can be applied, there has to be something there to say 'pick rationality'. Some other options might include intuition, astrology, life wisdom, or random walk.

You asked for a very narrow subset of possibilities ("valid tools for analyzing and dissecting"), so I'm sure the above options aren't included in what you would expect; it seems to me that you've got an answer already and are looking for a superset.

Comment author: Friendly-HI 26 May 2011 11:25:10AM 0 points

Thanks for your reply. The sentence "rationality is one of the most powerful tools we have to dissect and analyze [morality]" seemed to imply that you thought there were other "equally powerful" (powerful = reliably working) tools for arriving at true conclusions about morality.

As far as I'm concerned, rationality is the whole superset, so I was curious about your take on it. And yes, your options above are surely not included in what I would consider to be "powerful tools to arrive at true conclusions". Ultimately I think we don't actually disagree about anything; this is just another "but does it really make a sound" pitfall.

Comment author: Will_Newsome 23 May 2011 02:12:12PM 0 points

My "straw-man" does appear to have defenders, though we seem to agree you aren't one of them.

To some extent I am one such defender, in the sense that I probably expect there to be a lot more of something like rationality in our values than you do. I was just saying that it's not necessary for that to be the case. Either way the important thing is that values are in the territory where you can use rationality on them.

Comment author: Vladimir_Nesov 23 May 2011 02:27:11PM 0 points

Either way the important thing is that values are in the territory where you can use rationality on them.

For reference, this point was discussed in this post:

Rationality begins by asking how-the-world-is, but spreads virally to any other thought which depends on how we think the world is.

Comment author: Rain 23 May 2011 02:24:13PM 0 points

The point at which I think rationality enters our values is when those values are self-modifying, at which point you must provide a function for updating them. Perhaps we only differ on the percentage of our values we believe to be self-modifying.