SIAI firing Eliezer would be like Nirvana firing Kurt Cobain. Most of the money and public attention will follow Eliezer, not stay with SIAI.
You're not alone in wanting Eliezer to start publishing new results already. But there's also the problem that he likes secrecy way too much. Alexandros Marinos once compared his attitude to staying childless: every childless person came from an unbroken line of people who reproduced (=published their research), and couldn't exist otherwise.
For example, our decision-theory-workshop group is pretty much doing its own thing now. I believe it diverged from Eliezer's ideas a while ago, when we started thinking about UDT-ish theorem provers instead of TDT-ish causal graph thingies. I don't miss Eliezer's guidance, but I sure miss his input - it could be very valuable for the topics that interest us. But our discussions are open, so I guess it's a no go.
This is something I've never really understood. I can understand wanting to keep any moves directly towards creating an AI quiet - if you create 99% of an AI and someone else does the other 1%, goodbye world. It may not be optimal, but it's a comprehensible position. But the work on decision theory is presumably geared towards codifying Friendliness in such a way that an AI could be 'guaranteed Friendly'. That seems like the kind of thing that would be aided by having many eyeballs looking at it, while being useless to anyone who wanted to cobble together a quick-results AI.
Please refer to the updated document here: http://lesswrong.com/lw/5il/siai_an_examination/
This version is an old draft.
NOTE: Analysis here will be updated as people point out errors! I've tried to be accurate, but this is my first time looking at these (somewhat hairy) non-profit tax documents. Errors will be corrected as soon as I know of them! Please double-check and criticize this work so that it might improve.
Document History:
Todo:
Disclaimer:
Acting on gwern's suggestion in his Girl Scout Cookie analysis, here is a first pass at looking at SIAI funding, suggestions for a funding task-force, etc.
The SIAI's Form 990s are available at GuideStar and Foundation Center. You must register in order to access the files at GuideStar.
Overview
Analysis:
Revenue
Analysis:
Expenses
Analysis:
Big Donors
Analysis:
Officer Compensation
Prior to doing this investigation, I had some expectation that the Singularity Summit was a money losing operation. I had an expectation that Eliezer probably made around $70k (programmer money discounted for being paid by a non-profit). I figured the SIAI had a broader donor base. I was off base on all counts.* I am not currently an SIAI supporter. My findings have greatly increased the probability that I will donate in the future.
Overall, the allocation of funds strikes me as highly efficient. I don't know exactly how much the SIAI is spending on food and fancy tablecloths at the Singularity Summit, but I don't think I care: it's growing and it's nearly breaking even. An attendee can have a very confident expectation that their fee covers their cost to the organization. If you go and contribute you add pure value by your attendance.
At the same time, the organization has been able to expand services without draining the coffers. A donor can hold a strong expectation that the bulk of their donation will go toward actual work in the form of salaries for working personnel or events like the Visiting Fellows Program.
Eliezer's compensation is slightly more than I thought. I'm not sure what upper bound I would have balked at or would balk at. I do have some concern about the cost of recruiting additional Research Fellows. The cost of additional RFs has to be weighed against new programs like Visiting Fellows.
The organization appears to be managing its cash reserves well. It would be good to see the SIAI build up some asset reserves so that it could operate comfortably in years when public support dips, or so that it could take advantage of unexpected opportunities.
The organization relies heavily on major donor support. I would expect the 2010 filing to reveal a broadening of revenue and continued expansion of services, but I do not expect the organization to have become independent of big donor support. Things are much improved from 2006, and without Peter Thiel's initial support the SIAI would not be able to provide the services it has; still, it would be good to see the SIAI's operating capacity grow larger than any one donor's annual contribution. It is important for Less Wrong to begin a discussion of broadening SIAI revenue sources.
Where to Start?
There is low-hanging fruit to be found. The SIAI's annual revenue is well within the range where we can have a significant impact. These suggestions aren't all equal in their promise; they are just things that come to my mind.