Less Wrong is a community blog devoted to refining the art of human rationality. Please visit our About page for more information.

2012 Winter Fundraiser for the Singularity Institute

30 Post author: lukeprog 06 December 2012 10:41PM

Cross-posted here.

(The Singularity Institute maintains Less Wrong, with generous help from Trike Apps, and much of the core content is written by salaried SI staff members.)

Thanks to the generosity of several major donors, every donation to the Singularity Institute made between now and January 20th (deadline extended from the 5th) will be matched dollar-for-dollar, up to a total of $115,000! So please, donate now!

Now is your chance to double your impact while helping us raise up to $230,000 to help fund our research program.

(If you're unfamiliar with our mission, please see our press kit and read our short research summary: Reducing Long-Term Catastrophic Risks from Artificial Intelligence.)

Now that Singularity University has acquired the Singularity Summit, and SI's interests in rationality training are being developed by the now-separate CFAR, the Singularity Institute is making a major transition.  Most of the money from the Summit acquisition is being placed in a separate fund for a Friendly AI team, and therefore does not support our daily operations or other programs.

For 12 years we've largely focused on movement-building — through the Singularity Summit, Less Wrong, and other programs. This work was needed to build up a community of support for our mission and a pool of potential researchers for our unique interdisciplinary work.

Now, the time has come to say "Mission Accomplished Well Enough to Pivot to Research." Our community of supporters is now large enough that qualified researchers are available for us to hire, if we can afford to hire them. Having published 30+ research papers and dozens more original research articles on Less Wrong, we certainly haven't neglected research. But in 2013 we plan to pivot so that a much larger share of the funds we raise is spent on research.


Accomplishments in 2012

Future Plans You Can Help Support

In the coming months, we plan to do the following:

  • As part of Singularity University's acquisition of the Singularity Summit, we will be changing our name and launching a new website.
  • Eliezer will publish his sequence Open Problems in Friendly AI.
  • We will publish nicely-edited ebooks (Kindle, iBooks, and PDF) for many of our core materials, to make them more accessible: The Sequences, 2006-2009, Facing the Singularity, and The Hanson-Yudkowsky AI Foom Debate.
  • We will publish several more research papers, including "Responses to Catastrophic AGI Risk: A Survey" and a short, technical introduction to timeless decision theory.
  • We will set up the infrastructure required to host a productive Friendly AI team and try hard to recruit enough top-level math talent to launch it.

(Other projects are still being surveyed for likely cost and strategic impact.)

We appreciate your support for our high-impact work! Donate now, and seize a better-than-usual chance to move our work forward. Credit card transactions are securely processed using either PayPal or Google Checkout. If you have questions about donating, please contact Louie Helm at (510) 717-1477 or louie@intelligence.org.

$115,000 of total matching funds has been provided by Edwin Evans, Mihaly Barasz, Rob Zahra, Alexei Andreev, Jeff Bone, Michael Blume, Guy Srinivasan, and Kevin Fischer.

I will mostly be traveling (for AGI-12) for the next 25 hours, but I will try to answer questions after that.

Comments (126)

Comment author: blob 07 December 2012 07:06:04AM 44 points [-]

I donated 650$ and will donate the same amount to the CFAR fundraiser.

Comment author: JoshuaFox 07 December 2012 07:12:20AM 41 points [-]

I have been donating $100 monthly on a subscription payment and will continue to do so.

Easier on the cash-flow than a lump donation. More fuzzies per year, too.

Comment author: pengvado 07 December 2012 02:46:21AM *  100 points [-]

I donated 20,000$ now, in addition to 110,000$ earlier this year.

Comment author: lukeprog 07 December 2012 12:41:09PM 8 points [-]

Thanks very much!!

Comment author: MixedNuts 08 December 2012 05:59:37PM 8 points [-]

Holy pickled waffles on a pogo stick. Thanks, dude.

Is there anything you're willing to say about how you acquired that dough? My model of you has earned less in a lifetime.

Comment author: pengvado 08 December 2012 09:51:53PM *  22 points [-]

I value my free time far too much to work for a living. So your model is correct on that count. I had planned to be mostly unemployed with occasional freelance programming jobs, and generally keep costs down.

But then a couple years ago my hobby accidentally turned into a business, and it's doing well. "Accidentally" because it started with companies contacting me and saying "We know you're giving it away for free, but free isn't good enough for us. We want to buy a bunch of copies." And because my co-founder took charge of the negotiations and other non-programming bits, so it still feels like a hobby to me.

Both my non-motivation to work and my willingness to donate a large fraction of my income have a common cause, namely thinking of money in far-mode, i.e. not alieving The Unit of Caring on either side of the scale.

Comment author: MixedNuts 08 December 2012 11:14:51PM 3 points [-]

Yeah, I know exactly who you are, I just didn't want to bust privacy or drop creepy hints. I didn't know that VideoLAN projects were financially independent of each other, so that explains where the profit comes from. It's just that I didn't expect two guys in a basement to make that much, and you're too young (and didn't have much income before anyway) to have significant savings. So there's more money in successful codecs than I guessed.

Comment author: pengvado 09 December 2012 03:32:38AM *  10 points [-]

you're too young (and didn't have much income before anyway) to have significant savings.

Err, I haven't yet earned as much from the lazy entrepreneur route as I would have if I had taken a standard programming job for the past 7 years (though I'll pass that point within a few months at the current rate). So don't go blaming my cohort's age if they haven't saved and/or donated as much as me. I'm with Rain in spluttering at how people can have an income and not have money.

Comment author: [deleted] 09 December 2012 11:54:26AM 1 point [-]

i.e. not alieving The Unit of Caring on either side of the scale

I don't, either -- possibly because I've never experienced real economic hardship; I think if I had grown up in a poorer family I probably would. (I do try to be frugal because so far I've lived almost exclusively on my parents' income and it seems unfair towards them to waste their money, though.)

Comment author: Kawoomba 07 December 2012 01:05:36PM 7 points [-]

(At the time of this comment) 27 karma for a $20k donation, 13 karma for $250, 9 karma for $20 (and a joke) ... something's amiss with the karma-$ currency exchange rate!

Comment author: AlexMennen 07 December 2012 05:26:37PM 7 points [-]

Assume that being rewarded with karma can motivate someone to make a donation, but that once they decide to donate, karma does not affect how much they give. Under that assumption, upvoting every donation is the best policy for maximizing money to SI. I'm not sure how realistic that model is, but it seems intuitive to me.

Comment author: Kindly 08 December 2012 01:05:11AM 6 points [-]

It might motivate someone to donate $20 rather than $5 if there is a karma difference; probably not $20000 rather than $20, though.

Comment author: Kindly 07 December 2012 04:21:27PM *  7 points [-]

What do you expect to happen? We don't have enough users giving karma for donation to sustain a linear exchange rate in the [$20, $20000] range. Unless, I suppose, we give up any attempt at fine resolution over the [$1, $500] range.

In practice, what most people are probably doing is picking a threshold (possibly $0) beyond which they give karma for a donation. This could be improved: you could pick a large threshold beyond which you give 1 karma, and give fractional karma (by flipping a biased coin) below that threshold. However, if the large threshold were anywhere close to $20000, and your fractional karma scales linearly, then you would pretty much never give karma to the other donations.

Edit: after doing some simulations, I'm no longer sure the fractional approach is an improvement. It gives interesting graphs, though!

If we knew the Singularity Institute's approximate budget, we could fix this by assuming log-utility in money, but this is complicated.
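The biased-coin scheme Kindly describes can be sketched in a few lines. This is only an illustration of the idea; `maybe_upvote` and its `threshold` parameter are hypothetical names, not anything Less Wrong actually implements:

```python
import random

def maybe_upvote(amount, threshold=500.0, rng=random):
    """Award karma with probability proportional to donation size.

    Donations at or above `threshold` always get an upvote; smaller
    donations get one with probability amount/threshold, so the
    *expected* karma awarded scales linearly with dollars donated.
    """
    if amount >= threshold:
        return True
    # Flip a biased coin: a $20 donation against a $500 threshold
    # earns an upvote 4% of the time.
    return rng.random() < amount / threshold
```

With a $500 threshold, expected karma stays proportional to dollars below the threshold, while every donation above it is upvoted outright; as Kindly notes, pushing the threshold toward $20,000 would make karma for ordinary-sized donations vanishingly rare.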

Comment author: PhilipL 07 December 2012 02:40:46PM *  1 point [-]

Reversed scope insensitivity?

Comment author: CronoDAS 07 December 2012 02:53:32AM 6 points [-]

Really?

Comment author: Eliezer_Yudkowsky 08 December 2012 01:35:02AM 22 points [-]

Yes.

Comment author: CronoDAS 08 December 2012 11:01:43PM 8 points [-]

Wow.

Comment author: Qiaochu_Yuan 07 December 2012 03:05:49AM *  26 points [-]

"No, she wouldn't say anything to me about Lucius afterwards, except to stay away from him. So during the Incident at the Potions Shop, while Professor McGonagall was busy yelling at the shopkeeper and trying to get everything under control, I grabbed one of the customers and asked them about Lucius."

Draco's eyes were wide again. "Did you really?"

Harry gave Draco a puzzled look. "If I lied the first time, I'm not going to tell you the truth just because you ask twice."

Comment author: CronoDAS 08 December 2012 11:14:56PM 16 points [-]

Nice quote.

"Really?" is more polite to say than "I find that hard to believe, can you provide confirming evidence" or "[citation needed]", though. Also, sometimes people actually will say "No, I was kidding" if you ask them.

Comment author: Kindly 08 December 2012 11:53:06PM 5 points [-]

Also, sometimes people actually will say "No, I was kidding" if you ask them.

Or "Oops, I accidentally typed an extra zero. Twice."

Comment author: Qiaochu_Yuan 09 December 2012 07:40:20AM 0 points [-]

That is unlikely owing to the placement of the commas.

Comment author: Kindly 09 December 2012 03:23:17PM 2 points [-]

No, that just makes it worse, because 20,00$ could be referring to donating 20 dollars.

Comment author: Qiaochu_Yuan 09 December 2012 09:28:03PM *  2 points [-]

Ah, right. I had forgotten that some people use commas where I would expect periods. Adding an extra zero twice is still somewhat unlikely, though. My current hypotheses about the distribution of LW users make it more plausible that the tail of high income can afford fairly large donations.

Comment deleted 07 December 2012 05:35:54AM *  [-]
Comment deleted 07 December 2012 08:37:47AM *  [-]
Comment deleted 07 December 2012 09:16:41AM *  [-]
Comment deleted 07 December 2012 05:13:41PM *  [-]
Comment deleted 07 December 2012 06:18:23PM *  [-]
Comment author: Alicorn 08 December 2012 01:59:29AM 8 points [-]

There is a largely innocuous conversation below this comment which has been banned in its entirety. Who did this? Why?

Comment deleted 08 December 2012 12:09:58PM *  [-]
Comment author: NancyLebovitz 08 December 2012 12:43:04PM 13 points [-]

In general, I'd say that people's desire to be anonymous should be respected unless there's a very good reason to override it, and solving a puzzle is not a very good reason.

Comment author: [deleted] 09 December 2012 11:50:52AM 0 points [-]

Anyway, he pretty much admitted who he is now.

Comment author: ArisKatsaris 08 December 2012 03:30:29PM *  33 points [-]

Just donated 400 €.

Comment author: lukeprog 09 December 2012 08:50:21AM 3 points [-]

Thanks so much!

Comment author: ArisKatsaris 03 January 2013 03:46:19AM 3 points [-]

My new year's resolution is tithing, to be split roughly half-and-half between "serious" causes and things like supporting my favorite webcomics/fansubbers/whatever. As part of the former, I decided to add 1000 € to the above donation.

Comment author: wmorgan 14 December 2012 08:16:59PM 30 points [-]

$1,340.00

Comment author: moshez 06 December 2012 09:04:04PM 25 points [-]

I am looking forward to the ebooks. I hope you'll provide them in ePub format, for those of us who prefer that. [I was pleased to donate $40, which should soon be matched by my employer as part of the employee-match program, thus getting me double-matched!]

Comment author: Kutta 07 December 2012 11:40:55AM *  24 points [-]

I donated 250$.

Update: No, I apparently did not. For some reason the transfer from Google Checkout got rejected, and now PayPal too. Does anyone have an idea what might've gone wrong? I've a Hungarian bank account. My previous SI donations were fine, even with the same credit card if I recall correctly, and I'm sure that my card is still perfectly valid.

Comment author: philh 07 December 2012 07:38:05PM *  17 points [-]

I'm having the same problem. I used the card to buy modafinil yesterday, which might raise a red flag in fraud detection software? But if you're having it too, I'd update in the direction of it being a problem on SIAI's end.

Has anyone successfully donated since Kutta posted?

edit - Amazon is declining my card as well.

edit 2 - It's sorted out now, just donated £185.

Comment author: MichaelAnissimov 08 December 2012 12:24:52AM 5 points [-]

I'm looking into this now, can you send me an email at michael@intelligence.org so we can share any further details necessary to work out the problem?

Comment author: philh 08 December 2012 01:21:52AM 3 points [-]

Email sent.

Comment author: MichaelAnissimov 08 December 2012 04:06:00AM 7 points [-]

After investigating the issue, we found it to be a problem on Kutta's side, not ours.

Comment author: Kutta 08 December 2012 08:33:02AM *  5 points [-]

Thanks for your effort. I'll contact my bank.

Comment author: MichaelAnissimov 08 December 2012 12:43:14AM 6 points [-]

I just verified that donations in general are working via PayPal and Google Checkout. We'll investigate this specific issue to see where the problem is.

Comment author: drethelin 10 January 2013 05:35:39AM 23 points [-]

Still donating 500 a month.

Comment author: Eliezer_Yudkowsky 10 January 2013 10:15:14PM 10 points [-]

Five cheers for this! Those who are steadily donating should get applause every time.

Comment author: DeevGrape 14 December 2012 06:42:16PM 23 points [-]

Just donated $500 (with the Singularity credit card, so it's really more like $505 ^_^).

Comment author: John_Maxwell_IV 07 December 2012 10:39:46AM 21 points [-]

This is great news. Thanks to Edwin Evans, Mihaly Barasz, Rob Zahra, Alexei Andreev, Jeff Bone, Michael Blume, Guy Srinivasan, and Kevin Fischer for providing matching funds!

Comment author: [deleted] 06 January 2013 07:39:39PM 20 points [-]

OK, I think I just set up a $1000 monthly donation.

Comment author: JGWeissman 04 January 2013 03:26:42PM 19 points [-]

I mailed a check for $20,000.

I'm excited about the pivot to research.

Comment author: Yvain 18 December 2012 06:22:09AM *  19 points [-]

I have some money that I was saving for something like this, but I also just saw Eliezer's (very convincing) request for CFAR donations yesterday and heard a rumor that SIAI was trying to get people to donate to CFAR because they needed it more.

This seems weird to me because I would expect that with SIAI's latest announcement they have shifted from waterline-raising/community-building to more technical areas where CFAR success would be of less help to them, but I'd be very interested in hearing from an SIAI higher-up whether they really want my money or whether they would prefer I give it to CFAR instead.

Comment author: Eliezer_Yudkowsky 20 December 2012 12:11:57AM 17 points [-]

1) In the long run, for CFAR to succeed, it has to be supported by a CFAR donor base that doesn't funge against SIAI money. I expect/hope that CFAR will have a substantially larger budget in the long run than SIAI. In the long run, then, marginal x-risk minimizers should be donating to SIAI.

2) But since CFAR is at a very young and very vital stage in its development and has very little funding, it needs money right now. And CFAR really really needs to succeed for SIAI to be viable in the long-term.

So my guess is that a given dollar is probably more valuable at CFAR right this instant, and we hope this changes very soon (due to CFAR having its own support base)...

...but...

...SIAI has previously supported CFAR, is probably going to make a loan to CFAR in the future, and therefore it doesn't matter as much exactly which organization you give to right now, except that if one maxes out its matching funds you probably want to donate to the other until it also maxes...

...and...

...even the judgment about exactly where a marginal dollar spent is more valuable is, necessarily, extremely uncertain to me. My own judgment favors CFAR at the current margins, but it's a very tough decision. Obviously! SIAI has given money to CFAR. If it had been obvious that this amount should've been shifted in direction A or direction B to minimize x-risk, we would've necessarily been organizationally irrational, or organizationally selfish, about the exact amount. SIAI has been giving CFAR amounts on the lower side of our error bounds because of the hope (uncertainty) that future-CFAR will prove effective at fundraising. Which rationally implies, and does actually imply, that an added dollar of marginal spending is more valuable at CFAR (in my estimates).

The upshot is that you should donate to whichever organization gets you more excited, like Luke said. SIAI is donating/loaning round-number amounts to CFAR, so where you donate $2K does change marginal spending at both organizations - we're not going to be exactly re-fine-tuning the dollar amounts flowing from SIAI to CFAR based on donations of that magnitude. It's a genuine decision on your part, and has a genuine effect. But from my own standpoint, "flip a coin to decide which one" is pretty close to my own current stance. For this to be false would imply that SIAI and I had a substantive x-risk-estimate disagreement which resulted in too much or too little funding (from my perspective) flowing to CFAR. Which is not the case, except insofar as we've been giving too little to CFAR in the uncertain hope that it can scale up fundraising faster than SIAI later. Taking this uncertainty into account, the margins balance. Leaving it out, a marginal absolute dollar of spending at CFAR does more good (somewhat) (in my estimation).

Comment author: Yvain 25 December 2012 06:08:05AM 12 points [-]

Thank you; that helps clarify the issue for me. Since people who know more seem to think it's a tossup and SIAI motivates me more, I gave $250 to them.

Comment author: wedrifid 20 December 2012 01:53:58AM *  7 points [-]

And CFAR really really needs to succeed for SIAI to be viable in the long-term.

That's an extremely strong claim. Is that actually your belief? Not merely that CFAR success would be useful to SIAI success? There is no alternate plan for SIAI to be successful that doesn't rely on CFAR?

Comment author: Eliezer_Yudkowsky 20 December 2012 03:24:29AM 12 points [-]

I have backup plans, but they tend to look a lot like "Try founding CFAR again."

I don't know of any good way to scale funding or core FAI researchers for SIAI without rationalists. There's other things I could try, and would if necessary try, but I spent years trying various SIAI-things before LW started actually working. Just because I wouldn't give up no matter what, doesn't mean there wouldn't be a fairly large chunk of success-probability sliced off if CFAR failed, and a larger chunk of probability sliced off if I couldn't make any alternative to CFAR work.

I realize a lot of people think it shouldn't be impossible to fund SIAI without all that rationality stuff. They haven't tried it. Lots of stuff sounds easy if you haven't tried it.

Comment author: wedrifid 20 December 2012 03:43:45AM 2 points [-]

Thank you, Eliezer. I'm fascinated by the reasoning and analysis that you're hinting at here. It helps put the decisions you and SIAI have made in perspective.

Could you give a ballpark estimate of how much of the importance of successful rationality spin offs is based on expectations of producing core FAI researchers versus producing FAI funding?

Comment author: Eliezer_Yudkowsky 20 December 2012 03:59:18AM 2 points [-]

I've tried less hard to get core FAI researchers than funding. I suspect that given sufficient funding produced by magic, it would be possible to solve the core-FAI-researchers issue by finding the people and talking to them directly - but I haven't tried it!

Comment author: Halfwit 20 December 2012 08:48:45AM *  6 points [-]

How much money would you need magicked to allow you to shed fundraising and infrastructure, etc, and just hire and hole up with a dream team of hyper-competent maths wonks? Restated, at what funding level would SIAI be comfortably able to aggressively pursue its long-term research?

Comment author: hairyfigment 16 January 2013 07:47:31AM 0 points [-]

He once mentioned a figure of US $10 million / year. Feels like he's made a similar remark more recently, but it didn't show in my brief search.

Comment author: Nick_Tarleton 07 June 2013 09:04:45PM 2 points [-]

So my guess is that a given dollar is probably more valuable at CFAR right this instant, and we hope this changes very soon (due to CFAR having its own support base)...

an added dollar of marginal spending is more valuable at CFAR (in my estimates).

Is this still your view?

Comment author: lukeprog 19 December 2012 02:14:38AM *  6 points [-]

[SI has now] shifted from waterline-raising/community-building to more technical areas where CFAR success would be of less help to them

Remember that the original motivation for the waterline-raising/community-building stuff at SI was specifically to support SI's narrower goals involving technical research. Eliezer wrote in 2009 that "after years of bogging down [at SI] I threw up my hands and explicitly recursed on the job of creating rationalists," because Friendly AI is one of those causes that needs people to be "a bit more self-aware about their motives and the nature of signaling, and a bit more moved by inconvenient cold facts."

So, CFAR's own efforts at waterline-raising and community-building should end up helping SI in the same way Less Wrong did, even though SI won't capture all or even most of that value, and even though CFAR doesn't teach classes on AI risk.

I've certainly found it to be the case that on average, people who get in contact with SI via an interest in rationality tend to be more useful than people who get in contact with SI via an interest in transhumanism or the singularity. (Though there are plenty of exceptions! E.g. Edwin Evans, Rick Schwall, Peter Thiel, Carl Shulman, and Louie Helm came to SI via the singularity materials.)

If someone has pretty good rationality skills, then it usually doesn't take long to persuade them of the basics about AI risk. But if someone is filtered instead for a strong interest in transhumanism or the singularity (and not necessarily rationality), then the conclusions they draw about AI risk, even after argument, often appear damn-near random.

There's also the fact that SI needs unusually good philosophers, and CFAR-style rationality training has some potential to help with that.

I'd be very interested in hearing from an SIAI higher-up whether they really want my money or whether they would prefer I give it to CFAR instead.

My own response to this has generally been that you should give to whichever organization you're most excited to support!

Comment author: TheOtherDave 19 December 2012 03:42:08AM 3 points [-]

My own response to this has generally been that you should give to whichever organization you're most excited to support!

Why is that your response?

More precisely... do you actually believe that I should base my charitable giving on my level of excitement? Or do you assert that despite not believing it for some reason?

Comment author: lukeprog 19 December 2012 04:16:45AM *  8 points [-]

Why...?

Oh, right...

Basically, it's because I think both organizations Do Great Good with marginal dollars at this time, but the world is too uncertain to tell whether marginal dollars do more good at CFAR or SI. (X-risk reducers confused by this statement probably have a lower estimate of CFAR's impact on x-risk reduction than I do.) For normal humans who make giving decisions mostly by emotion, giving to the one they're most excited about should cause them to give the maximum amount they're going to give. For weird humans who make giving decisions mostly by multiplication, well, they've already translated "whichever organization you're most excited to support" into "whichever organization maximizes my expected utility [at least, with reference to the utility function which represents my philanthropic goals]."

Comment author: Gastogh 15 December 2012 04:50:44PM 18 points [-]

Gave 200 $ this time.

Comment author: Qiaochu_Yuan 25 December 2012 02:55:31AM 17 points [-]

I just donated $1,000... to CFAR. Does that still count?

Comment author: lukeprog 26 December 2012 01:17:15AM *  0 points [-]

Thanks! That counts for CFAR's drive.

Comment author: katydee 25 December 2012 03:23:39AM -1 points [-]

Yes.

Comment author: Furcas 16 December 2012 07:14:54AM *  17 points [-]

I've just donated 500 Canadian dollars to the Singularity Institute (at the moment, 1 Canadian dollar = 1.01 US dollar).

[edited]

Comment author: [deleted] 06 December 2012 11:47:08PM *  17 points [-]

I assume a mailed cheque will work?

This post made me super excited. I was just thinking about donating before I found this. Now I really have to. Thanks for the initiative.

Comment author: lukeprog 07 December 2012 12:48:30AM *  6 points [-]

Certainly. Please see the instructions under 'Donate by Check' on the donate page. Thanks very much!

Comment author: [deleted] 13 December 2012 02:37:39AM 40 points [-]

Check in the mail for $3k

(Took me long enough.)

Now give me my karma.

Comment author: lukeprog 13 December 2012 02:27:57PM 2 points [-]

Thanks very much!!

Comment author: Giles 09 December 2012 09:28:21PM 13 points [-]

As part of Singularity University's acquisition of the Singularity Summit, we will be changing our name and ...

OK, this is big news. Don't know how I missed this one.

Comment author: Morendil 07 December 2012 07:41:33AM *  31 points [-]

Sent in 100€. Merry Newtonmas!

Comment author: Rain 08 December 2012 02:23:34AM 47 points [-]

I continue to donate $1000 a month, and intend to reduce my retirement savings next year so I can donate more.

Comment author: MixedNuts 08 December 2012 03:49:22PM 9 points [-]

That's a hell of a gamble, kid. Rock on.

Comment author: Rain 08 December 2012 05:42:23PM 12 points [-]

The singularity is my retirement plan.

-- tocomment, in a Hacker News post

Comment author: Halfwit 09 December 2012 07:43:32AM 5 points [-]

The quantum lottery is my retirement plan, my messy messy retirement plan.

Comment author: Alexei 08 December 2012 09:36:26PM 4 points [-]

I'm glad I'm not the only one that thinks like that. :)

Comment author: Benja 04 January 2013 03:09:15PM 10 points [-]

Donated $150. One more day! Please donate, too!

Comment author: CronoDAS 07 December 2012 02:56:21AM 24 points [-]

I donated $20, roughly the price of a cheap hardcover novel.

Comment author: fizzfaldt 07 December 2012 03:14:46AM 36 points [-]

I just donated $250. Can't afford as much as last year; I switched to a lower-paying job that makes me happier.

Comment author: Dusk 09 December 2012 12:55:56PM 12 points [-]

I have donated $30 in payback for a free dinner hosted by the Melbourne LessWrong meetup.

Comment author: Furcas 07 December 2012 04:38:10AM 11 points [-]

Does agreeing to display my name in the public donor list help the SI in any way?

Comment author: JoshuaFox 07 December 2012 07:10:43AM 18 points [-]

Social proof. Very useful.

Comment author: Furcas 08 December 2012 01:32:11AM 2 points [-]

Okay, thanks.

Another question: Will my donation be matched even if I donate to the Singularity Institute For AI Canada Association?

Comment author: Furcas 16 December 2012 07:04:21AM *  5 points [-]

I asked Louie Helm, as advised by Joshua Fox. Below is [edit] was his reply, which he's asked me to remove.

Comment author: Eliezer_Yudkowsky 17 December 2012 04:28:12AM 2 points [-]

No.

Comment author: Furcas 01 January 2013 07:59:57PM *  1 point [-]

FYI, the SIAI Canada page on the Singularity Institute website still says this:

SIAI-CA is the Canadian ‘on-ramp’ for supporters of SIAI. We exist to facilitate Canadians in supporting the charitable objectives of SIAI. We do this in two ways: tax relief and oversight.

http://singularity.org/siai-canada/

I know if I hadn't asked Louie before donating to SIAI I would have donated to SIAI Canada, thinking it would have the same consequences except I'd get a tax break. I wonder how many thousands of dollars you've lost this way?

Comment author: lukeprog 01 January 2013 09:27:33PM -1 points [-]

Not much, at least not since I took over SI in November 2011. SIAI-CA executed our recommendation for how to spend the last $5k they've spent since November 2011 — though it can be quite a lot of effort to find a good way for SIAI-CA to spend the money from Canada. Even more importantly, we know who all our biggest supporters in Canada are, so we've explained the situation to them personally and they generally donate directly rather than through SIAI-CA.

Comment author: JoshuaFox 08 December 2012 06:42:28PM 1 point [-]

I suggest you ask Louie Helm.

Comment author: Alexei 08 December 2012 09:38:44PM 8 points [-]

It helps people like me, who look at it almost like a competition. The more people competing, the merrier.

Comment author: Rain 08 December 2012 10:14:02PM 19 points [-]

Yeah, I wanted to catch Jaan Tallinn on the Top Donors page to prove some random middle-class person could do better charity than the rich types, but he keeps pulling further ahead and I dropped a couple places in the rankings :-/ Gotta work harder!

Comment author: PhilipL 10 January 2013 02:19:18PM 4 points [-]

Not sure if anyone else noticed, but the end date was pushed back until Jan 20. Although personally, I would rather donate to CFAR (and have done so, $500, and another $500 before the fundraiser timeframe.)

Comment author: ArisKatsaris 24 December 2012 12:24:43AM 3 points [-]

Unless the donation bar is lagging, slightly less than a third of the hoped-for sum has been raised, with only about 11 days remaining. That's rather worrisome.

Comment author: lukeprog 25 December 2012 01:05:15AM 1 point [-]

The donation bar lags somewhat, and it's normal for most of the funds to come in "at the last minute."

Comment author: JMiller 06 December 2012 08:51:29PM *  6 points [-]

Luke, the link in the third line "Now is your chance to double your impact while helping us raise up to $230,000 to help fund our research program" does not work.

Comment author: lukeprog 06 December 2012 09:04:20PM 1 point [-]

Meant to go to singularity.org/research. Will an editor please fix? I'm working from my phone now.

Comment author: Vladimir_Nesov 06 December 2012 09:17:22PM 12 points [-]

Fixed.

Comment author: Qiaochu_Yuan 08 December 2012 08:45:02AM 3 points [-]

Do we get some kind of reasonable guarantee that there won't in the future be an even better matching offer (say a tripling of our impact), or is the idea here that the value of an SIAI donation is heavily time discounted?

Comment author: lukeprog 08 December 2012 10:58:14AM *  9 points [-]

We've never done such a drive in the past and have no current plans for one. We do have a pretty heavy discount rate. Sorry I can't say more.

Comment author: [deleted] 13 December 2012 02:59:23AM 3 points [-]

Sorry I can't say more.

oooooohhhh, super secret time pressure.

Maybe I should donate more...

Comment author: Kevin 09 December 2012 10:10:44AM 6 points [-]

Given that historically SI has completed all matching drives to 100%, I wouldn't even recommend waiting for a 2x match to donate.

Comment author: John_Maxwell_IV 09 December 2012 10:59:16AM *  9 points [-]

Probably the best of all is to be a matching drive sponsor.

Comment author: Kevin 09 December 2012 11:01:45AM 2 points [-]

Probably the best of all is to be a matching drive sponsor.

I can't argue with that!

Comment author: John_Maxwell_IV 09 December 2012 11:05:48AM 7 points [-]

Yep!

To stay honest though, if someone is reading this thread and planning to do this, they should contact SI now with the amount they're willing to match during a future drive... otherwise they're highly liable to fall prey to donor akrasia.

Comment author: Benja 08 December 2012 07:15:33PM *  5 points [-]

There doesn't seem to be anything SIAI would gain from running such a program. If big donors are willing to give $N to match donations, then with dollar-for-dollar matching SIAI can reasonably hope to raise $2N in the fundraiser; with two-dollars-for-every-dollar matching, SIAI will only get $(3/2)N. Unless, of course, the big donors would donate more if SIAI set up the second type of matching program, but why would they?

The only scenario I can see where this would make sense is if SIAI expects small donors to donate less than $(1/2)N in a dollar-for-dollar scheme, so that its total gain from the fundraiser would be below $(3/2)N, but expects to get the full $(3/2)N in a two-dollars-for-every-dollar scheme. But not only does this seem like a very unlikely story, even if it did happen it seems that you should want to donate in the current fundraiser if you're willing to do so at all, since that leaves more matching funds available in the later two-dollars-for-every-dollar fundraiser to attract the other people who, by hypothesis, aren't willing to donate at dollar-for-dollar.
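The arithmetic in the two paragraphs above can be sketched in a few lines. This is only an illustration of the reasoning, not anything SIAI published; the `total_raised` function and the $100,000 pool are hypothetical, and it assumes big donors commit a fixed matching pool that is exhausted in both schemes.

```python
def total_raised(matching_pool, match_ratio, small_donations):
    """Total funds raised when big donors commit a fixed matching pool.

    match_ratio is dollars of matching per dollar donated
    (1.0 for 1:1, 2.0 for 2:1). Matching stops once the pool runs out.
    """
    # Small donations are matched only until the pool is exhausted.
    matched = min(small_donations, matching_pool / match_ratio)
    return small_donations + matched * match_ratio

N = 100_000  # hypothetical matching pool committed by big donors

# If small donors fully exhaust the match in both schemes:
one_to_one = total_raised(N, 1.0, N)      # small donors give $N -> $2N total
two_to_one = total_raised(N, 2.0, N / 2)  # small donors give $N/2 -> $(3/2)N total
```

With the pool exhausted in both cases, 1:1 matching yields $2N while 2:1 yields only $(3/2)N, which is the comparison the comment relies on.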

Comment author: Benja 24 December 2013 04:41:13AM 2 points [-]

The only scenario I can see where this would make sense is if SIAI expects small donors to donate less than $(1/2)N in a dollar-for-dollar scheme, so that its total gain from the fundraiser would be below $(3/2)N, but expects to get the full $(3/2)N in a two-dollars-for-every-dollar scheme. But not only does this seem like a very unlikely story [...]

One year later, the roaring success of MIRI's Winter 2013 Matching Challenge, which is offering 3:1 matching for new large donors (people donating >= $5K who have donated less than $5K in total in the past) -- almost $232K out of the $250K maximum donated by the time of writing, with more than three weeks left, whereas the Winter 2012 Fundraiser the parent is commenting on only reached its goal of $115K after a deadline extension, and the Summer 2013 Matching Challenge only reached its $200K goal around the time of the deadline -- means that I pretty much need to eat my hat on the "very unlikely story" comment above. (There's clearly an upward growth curve as well, but it does seem clear that lots of people wanted to take advantage of the 3:1.)

So far I still stand by the rest of the comment, though:

[...] even if it did happen it seems that you should want to donate in the current fundraiser if you're willing to do so [at 1:1 matching], since this means that more matching funds would be available in the later two-dollars-for-every-dollar fundraiser for getting the other people to donate who we are postulating aren't willing to donate at dollar-for-dollar.

Comment author: GuySrinivasan 10 December 2012 01:03:38AM 0 points [-]

I seem to recall reading a study that concluded that the multiplier on the match (above 0.5x) doesn't change the increase in donations much. Cursory searching didn't turn it up again, though.

Comment deleted 15 December 2012 10:16:12PM *  [-]
Comment author: gwern 16 December 2012 02:56:35AM *  5 points [-]

What report is that? A site-search for "140,000" turns up a number of figures but none from EY; the latest Form 990 I know of lists his compensation at ~$104k (pg7, summing both columns) or ~50% less than your number.

Comment author: Eliezer_Yudkowsky 16 December 2012 06:49:50AM 5 points [-]

I've sometimes earned more than my SIAI base salary from speaking fees, but I've never earned $140K in any year, and will cheerfully exhibit my tax returns if Luke, Holden, or any other sufficiently reputable entity requests them. I've also got no idea what that "estimated extra compensation" line is about, unless it's health insurance or something - per the wishes of Peter Thiel, SIAI never pays $100k in any year to any employee, including bonuses.

(Note that, as usual when a poster has received many sufficiently extreme downvotes in their history, I designate them a troll and delete their comments at will.)

Comment author: drethelin 01 January 2013 10:31:40PM 1 point [-]

How encouraging is it to people to see comments saying people donated? To me it just seems like kinda self-aggrandizing karma whoring. Have you read this thread and been influenced to donate or to donate more?

Comment author: Qiaochu_Yuan 10 January 2013 02:26:38AM *  6 points [-]

I was influenced both to donate and to donate more. Social proof is very powerful. I also would not have posted if I didn't think it would encourage people to donate or donate more.

Comment author: ArisKatsaris 01 January 2013 11:10:53PM 4 points [-]

self-aggrandizing karma whoring

If I didn't hope it would help encourage others, I wouldn't post about my donation. I can't think of any reason that knowing of a donation of mine might discourage others from donating, so I believe it will help encourage them, even if only minimally so.

Have you read this thread and been influenced to donate or to donate more?

Generalizing to "this and similar threads", I think the answer is yes in my case.

Comment author: Halfwit 10 January 2013 02:08:14AM *  0 points [-]

I highly support changing your name--there's all sorts of bad juju associated with the term "singularity". My advice, keep the new name as bland as possible, avoiding anything with even a remote chance of entering the popular lexicon. The term "singularity" has suffered the same fate as "cybernetics".

Comment author: MugaSofer 10 January 2013 11:23:49AM -2 points [-]

I note that you've retracted your post, but I still feel the need to ask: shouldn't the name reflect what they do?

Comment author: Halfwit 10 January 2013 05:45:26PM *  2 points [-]

In terms of minimizing the status loss for academics affiliating with SIAI, a banal minimally-descriptive name may be superior. People often overestimate the value of the piquant. Beige may not excite, but it doesn't offend. Any term which has the potential to become a buzzword, or acquire alternative definitions, should be avoided. The more exciting the term, the higher the chance of appropriation.

This was the point I was trying to make; on rereading it after posting, I realized it was remarkably poorly written and wasn't even clearly conveying what I was thinking when I wrote it. I didn't have time to edit it then, so I retracted.

Comment author: [deleted] 10 January 2013 05:51:00PM 0 points [-]

BTW, here's an interesting blog post about considerations relevant to naming stuff.

Comment author: MugaSofer 11 January 2013 01:22:15PM -1 points [-]

Thank you for clarifying.