LauraABJ comments on Virtue Ethics for Consequentialists - Less Wrong

33 Post author: Will_Newsome 04 June 2010 04:08PM


Comment author: LauraABJ 04 June 2010 04:31:53PM 14 points

I agree that these virtue ethics may help some people with their instrumental rationality. In general, I have noticed a trend at Less Wrong in which popular modes of thinking are first shunned as irrational and not based on truth, only to be readopted later as more functional for achieving one's stated goals. I think this process is important, because it allows one to rationally evaluate which 'irrational' models lead to the best outcomes.

Comment author: fburnaby 04 June 2010 06:42:56PM 3 points

This also fits my (non-LW) experience very well.

There's that catchy saying: "evolution is smarter than you are". I think it probably also extends somewhat to cultural evolution. Given that our behaviour is strongly influenced by both, I think we should expect to 'rediscover' many of our own biases and intuitions as useful heuristics for increasing instrumental rationality under some fairly familiar-looking utility function.

Comment author: thomblake 04 June 2010 06:53:34PM 2 points

> Given that our behaviour is strongly influenced by these, I think we should expect to 'rediscover' much of our own biases and intuitions as useful heuristics for increasing instrumental rationality under some fairly familiar-looking utility function.

Sadly, there's good reason to think that many of these familiar heuristics and biases were very good for acting optimally in tribes on the savanna during a particular period of time, and it's likely that they'll lead us into more trouble the further we go from that environment.

Comment author: fburnaby 04 June 2010 07:51:45PM 2 points

You are right. I was wrong, or at least far too sloppy. I agree that we should not presume that any given mismatch between our rational evaluation and a more 'folksy' one can be attributed to a problem in our map. Rationality is interesting precisely because it does better than my intuition in situations that my ancestors didn't often encounter.

But the point I'm trying and so far failing to get at is that, for the purposes of instrumental rationality, we are equipped with some interesting information-processing gear. Certainly, letting it run amok won't benefit me, but rationally exploiting my intuitions where appropriate is kind of a cool mind-hack. Will_Newsome's post, as I understood it, does a good job of making this point. He says, "Moral philosophy was designed for humans, not for rational agents," and suggests that we should exploit that where appropriate.

The post resonated with how I try to do science, for example. I adopt a very naive form of scientific realism when I'm learning new scientific theories: I take the observations and proposed explanatory models to be objective truths, picturing them in my mind's eye. There's something about that which is just psychologically easier. The skepticism and clearer epistemological thinking can be switched on later, once I've got my head wrapped around the idea.

Comment author: gwern 06 June 2010 09:29:48PM 1 point

As one of the rationalist quote threads said,

"To become properly acquainted with a truth, we must first have disbelieved it, and disputed against it."

Comment author: RobinZ 07 June 2010 06:58:50PM 0 points

Which one? I can't find it now.

Comment author: gwern 07 June 2010 09:13:58PM 0 points

Hm, you know what? I think I might've gotten that Novalis quote just from browsing Wikiquote, although it certainly does seem like something I would've picked up from the quote threads.