Sideways2 comments on (Moral) Truth in Fiction? - Less Wrong

Post author: Eliezer_Yudkowsky 09 February 2009 05:26PM

You are viewing a single comment's thread.

Comment author: Sideways2 09 February 2009 10:36:56PM 2 points

HughRistik:

Speaking as a new reader of Overcoming Bias myself--I think that the sort of people who read this blog are more likely to miss how dangerous the Superhappies are, because we've considered ways that human suffering could be reduced or eliminated while still letting humans develop properly. Then, when people who already have ideas about how to reduce suffering read that the Superhappies want to eliminate suffering, they assume that the Superhappies' plans are the same as their own. (I'm not sure if this is a previously discussed and named bias, but it sure ought to be.)

As far as I can tell, the Superhappies don't care about proper human development, and are not even curious as to what it is. They want us to be happy; being "good people" doesn't enter into it. I'd say the Superhappies are "paperclip maximizers" for happiness--though their idea of happiness is more complicated than a paperclip, the same principle is at work.

I would have said that the Superhappy proposal to find a happy middle ground between their values and the Babyeaters' by having everyone eat thousands of nonsentient babies was a preposterous straw-man for moral relativists, if that proposal hadn't actually been even more preposterously defended in the comments. Even if it's morally neutral to eat thousands of nonsentient babies, doesn't it seem... well, kind of ridiculous?

Which leads me to a point about the subject of this post, one that I don't think has been brought up yet: sometimes, people understand something more easily and more completely if they can see an example of it. Which is easier, to explain to someone what a cracker is, or to just show them a cracker? It's not practical to build a paperclipper and show it to everyone -- and that's where fiction comes in.

Comment author: johnlawrenceaspden 23 October 2012 09:50:30AM 0 points

Isn't any rational agent a paperclip maximiser for something? I thought that was what 'rational' was supposed to mean round here.
