
lukeprog comments on The Singularity Institute's Arrogance Problem - Less Wrong Discussion

63 Post author: lukeprog 18 January 2012 10:30PM


Comments (307)

Comment author: lukeprog 19 January 2012 01:26:54AM *  1 point [-]

> The claim made that donating to the SIAI is the charity donation with the highest expected return* always struck me as rather arrogant

I feel like I've heard this claimed, too, but... where? I can't find it.

> Can't remember the exact wording but that was the takeaway of a headline in the last fundraiser.

Here is the latest fundraiser; which line were you thinking of? I don't see it.

Comment author: [deleted] 19 January 2012 01:35:19AM 12 points [-]

> I feel like I've heard this claimed, too, but... where? I can't find it.

Question #5.

Comment author: lukeprog 19 January 2012 01:53:28AM *  5 points [-]

Yup, there it is! Thanks.

Eliezer tends to be more forceful on this than I am, though. I seem to be less certain than he is about how much x-risk reduction is purchased by donating to SI as opposed to donating to FHI or GWWC (because GWWC's members are significantly x-risk focused). But when this video was recorded, FHI wasn't working as much on AI risk as it is now, and GWWC barely existed.

I am happy to report that I'm more optimistic about the x-risk reduction purchased per dollar when donating to SI now than I was 6 months ago. Because of stuff like this. We're getting the org into better shape as quickly as possible.

Comment author: curiousepic 19 January 2012 04:42:07PM *  9 points [-]

> because GWWC's members are significantly x-risk focused

Where is this established? As far as I can tell, one cannot donate "to" GWWC, and none of their recommended charities are x-risk focused.

Comment author: Thrasymachus 05 February 2012 07:44:33PM 2 points [-]

(Belated reply): I can only offer anecdotal data here, but as a member of GWWC I can say that many of the members are interested in x-risk. Also, from listening to the directors, most of them are interested in x-risk issues as well.

You are right that GWWC isn't a charity (although it is likely to turn into one), and their recommended charities are not x-risk focused. Their rationale for recommending charities depends on reliable data, and x-risk is one of those areas where a robust "here's how much more likely a happy singularity will be if you give to us" analysis looks very hard.

Comment author: Barry_Cotter 19 January 2012 01:35:38AM 1 point [-]

> I feel like I've heard this claimed, too, but... where? I can't find it.

Neither can I, but IIRC Anna Salamon did an expected-utility calculation which came out to eight lives saved per dollar donated, no doubt impressively caveated and with error bars aplenty.

Comment author: lukeprog 19 January 2012 01:52:57AM 4 points [-]

I think you're talking about this video. Without watching it again, I can't remember whether Anna says that donation to SI specifically could buy something like eight lives per dollar, or whether donation to x-risk reduction in general could.