wilkox comments on Mini-camp on Rationality, Awesomeness, and Existential Risk (May 28 through June 4, 2011) - Less Wrong

Post author: AnnaSalamon, 24 April 2011 08:10AM




Comment author: wilkox 26 April 2011 08:28:02AM 2 points [-]

Why is the Singularity Institute paying for this?

We're trying to reduce existential risk -- to increase the odds that an eventual Singularity is good, from the perspective of humane values. To do this, we need more rational, effective people -- people who can train to do the needed research, who can fund that or other work, and who can otherwise exert influence toward good outcomes.

I'd be interested in hearing more about how you foresee graduates of these camps working to reduce existential risk, especially as a donor to the SIAI. Is there a long term plan in place or are you just trying some things out?

Comment author: Jonathan_Graehl 26 April 2011 09:54:52PM 7 points [-]

At the very least, people who personally benefit from this program are far more likely to donate for the rest of their lives, even if (especially if) they make no direct research or advocacy contribution.

Comment author: wilkox 27 April 2011 01:05:42AM 3 points [-]

That's a good point. An increase in donations from a specific group of people should be easy to measure too, so the SIAI could use it to directly assess the effectiveness of these programs.

Comment author: [deleted] 27 April 2011 01:08:15AM *  1 point [-]

Holding a program for the purpose of increasing donation revenue makes me feel uncomfortable. Maybe we should stick to the party line about raising the sanity waterline.

Comment author: wilkox 27 April 2011 01:22:06AM 4 points [-]

The idea of holding a program to increase donations actually made me more comfortable, as it seems more like a long-term investment in reducing existential risk than money squandered on something fun but not obviously essential.

Comment author: [deleted] 27 April 2011 01:51:06AM *  0 points [-]

You'll have to run that calculation by me. I don't see how the expected utility of the former is higher than the latter.

Comment author: wilkox 27 April 2011 02:16:49AM 6 points [-]

By way of analogy, suppose a cancer charity has $10,000 to spend. It could invest the money directly into research, for a marginal expected return in decreased cancer suffering, or it could spend it on a glitzy event where potential donors get to "try their hand" at working in a research lab for a day. The second option could sound like a waste of money, as the donors probably won't do anything worthwhile in a day of messing around in a lab. However, if they go on to contribute $100,000 more to the charity than they otherwise would have, that money can be reinvested in research for a 9x greater return on investment than investing the original $10,000 directly into research would have yielded (ignoring discount rates and assuming linear return on research investment). If any of the participants did happen to go on and become great cancer researchers, this would just be an excellent bonus effect.
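The arithmetic behind the analogy can be made explicit. This is a minimal sketch using only the hypothetical figures from the comment ($10,000 spent either way, $100,000 in additional donations); it is not real charity data:

```python
# ROI comparison from the cancer-charity analogy.
# All figures are the hypothetical ones from the comment, not real data.

direct_spend = 10_000        # option 1: invest the $10,000 directly in research
event_cost = 10_000          # option 2: spend the $10,000 on the donor event
extra_donations = 100_000    # additional donations attributed to the event

# Research dollars ultimately funded under each option
# (ignoring discount rates, assuming linear returns on research investment).
research_direct = direct_spend
research_via_event = extra_donations

# The event route yields 10x the research funding in absolute terms,
# which is a 9x *greater* (i.e. additional) return than spending directly.
multiple = research_via_event / research_direct
additional_multiple = (research_via_event - research_direct) / direct_spend

print(multiple)             # 10.0
print(additional_multiple)  # 9.0
```

So "9x greater return" here means nine times the original $10,000 on top of what direct spending would have produced; the break-even point is simply extra donations equal to the event's cost.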

The idea that this program will result in increased donations makes me more comfortable because it seems this is a more likely way the program will directly reduce existential risk than the vaguer goal of 'raising the sanity waterline'. If it does succeed in raising the sanity waterline in a way that reduces existential risk, that would be an excellent bonus.

Comment author: [deleted] 27 April 2011 02:23:39AM 2 points [-]

Your analogy makes sense, but why do you think the numbers will go that way?

Comment author: wilkox 27 April 2011 02:31:59AM 5 points [-]

I don't know that they will -- see my comment above suggesting the SIAI actually measure donations from program participants. It does seem more likely now, however, that the program will at least break even on reducing existential risk, hence my increased comfort with the idea.

Comment author: [deleted] 27 April 2011 02:34:21AM 3 points [-]

Does it seem that the program will break even because you've anchored yourself to 9x ROI?