Jiro comments on Lesswrong, Effective Altruism Forum and Slate Star Codex: Harm Reduction - Less Wrong

Post author: diegocaleiro 08 June 2015 04:37PM




Comment author: katydee 09 June 2015 01:06:08AM 6 points

In terms of weird fixations, there are quite a few strange things that the LW community seems to have as part of its identity - polyamory and cryonics are perhaps the best examples of things that seem to have little to do with rationality but are widely accepted as norms here.

If you think rationality leads you to poly or to cryo, I'm fine with that, but I'm not fine with it becoming such a point of fixation or an element of group identity.

For that matter, I think atheism falls into the same category. Religion is basically politics, and politics is the mind-killer, but people here love to score cheap points by criticizing religion. The fact that things like the "secular solstice" have become part of rationalist community norms and identity is indicative of serious errors IMO.

For me, one of the most appealing things about EA (as opposed to rationalist) identity is that it's not wrapped up in all this unnecessary weird stuff.

Comment author: Jiro 09 June 2015 06:02:34PM 12 points

For me, one of the most appealing things about EA (as opposed to rationalist) identity is that it's not wrapped up in all this unnecessary weird stuff.

I'd consider EA itself to be one of those strange things that LW has as part of its identity. It's true that EA involves rationality, but the premises that EA is based on are profoundly weird. I have no desire to maximize utility for the entire human race in such a way that each person's utility counts equally, and neither does just about anyone else outside of the LW-sphere. I prefer to increase utility for myself, my family, friends, neighbors, and countrymen in preference to increasing the utility of arbitrary people. And you'll find that pretty much everyone else outside of here does too.

Comment author: jsteinhardt 09 June 2015 06:25:32PM 3 points

I don't view this as inconsistent with EA. I basically share the same preferences as you (except that I don't think I care about countrymen more than arbitrary people). On the other hand, I care a non-zero amount about arbitrary people, and I would like whatever resources I spend helping them to be spent efficiently. (Also, given the sheer number of other people, things like scientific research that would potentially benefit everyone at once feel pretty appealing to me.)

Comment author: Jiro 09 June 2015 07:43:52PM 1 point

Well, that's a matter of semantics. I could say "I don't want to maximize utility added up among all people", or I could say "I assign greater utility to people closer to me, and I want to maximize utility given that assignment". Is that EA? If you phrase it the second way, it sort of is, but if you phrase it the first, it isn't.

Also, I probably should add "and people who think like me" after "countrymen". For instance, I don't really care about the negative utility some people get when others commit blasphemy.

Comment author: Clarity 10 June 2015 01:01:58PM 1 point

This was an unhelpful comment, removed and replaced by the comment you are now reading.

Comment author: ChristianKl 10 June 2015 01:51:19PM 0 points

I prefer to increase utility for myself, my family, friends, neighbors, and countrymen in preference to increasing the utility of arbitrary people. And you'll find that pretty much everyone else outside of here does too.

I think there are plenty of people out there who do care to some extent about saving starving African children.

Comment author: Jiro 10 June 2015 02:46:32PM 3 points

Yes, they care to some extent, but they would still prefer saving their own child from starvation to saving another child in a distant continent from starvation. Caring to some extent is not equally preferring.

Comment author: ChristianKl 10 June 2015 02:52:43PM 1 point

I don't think there are any EA people who wouldn't care more about their own child. To me that seems like a strawman.

Comment author: Jiro 10 June 2015 03:03:58PM 3 points

The argument usually goes in reverse: since you'd care about your own child, surely you should care equally about this child in Africa who's just as human. It's presented as a reason to care more for the distant child, not care less for your own child. But it still implies that you should care equally about them, not care more about your own.

Comment author: ChristianKl 10 June 2015 03:25:21PM 2 points

I don't know any EA who says that they have a utility function that treats every child 100% equally.