No. Yudkowsky is paid by the SI, so he could effectively donate to the SI simply by accepting a lower salary.
He claims that any single dollar of extra funding the SI receives could make the difference between an exceptionally positive scenario (Friendly superhuman AI, intergalactic civilization, immortality, etc.) and an exceptionally negative one (evil robots who kill us all). He asks other people to forfeit a substantial part of their income to secure the positive scenario and avert the negative one. He claims to be working to literally save the world, and therefore to be working on his very own survival.
And yet he draws from the SI resources that could be used to hire additional staff and do more research, just to support his lifestyle of relative luxury.
He could live in a smaller house, or move himself and the SI to a less expensive area (Silicon Valley is one of the most expensive areas in the world, and there doesn't seem to be a compelling reason for the SI to be located there). If he is honest about his claimed beliefs, if he has "confronted them, rationally, full-on", how could he possibly trade any part of the bright future of our (and his) intergalactic descendants, how could he trade the chance of his own survival, for a nice house in an expensive neighborhood?
I'm not suggesting he should move to a slum in Calcutta and live on a subsistence wage, but he doesn't seem willing to make any sacrifice at all for what he claims to believe, especially while asking other people to make such sacrifices.
Of course, I'm sure he can come up with a thousand rationalizations for that behavior. He could say that any lifestyle less luxurious than his current one would hurt the productivity of his supremely important work. I won't buy it, but everyone is entitled to their opinion.
> No. Yudkowsky is paid by the SI, so he could effectively donate to the SI simply by accepting a lower salary.
Oh, right. That makes sense, I guess. Of course, as you say, he may have reasons he hasn't shared for this lifestyle. Low prior probability of them being good reasons, though.
Please refer to the updated document here: http://lesswrong.com/lw/5il/siai_an_examination/
This version is an old draft.
NOTE: The analysis here will be updated as people point out errors! I've tried to be accurate, but this is my first time looking at these (somewhat hairy) non-profit tax documents. Errors will be corrected as soon as I know of them! Please double-check and criticize this work so that it can improve.
Document History:
Todo:
Disclaimer:
Acting on gwern's suggestion in his Girl Scout Cookie analysis, here is a first pass at looking at SIAI funding, with suggestions for a funding task force, etc.
The SIAI's Form 990s are available at GuideStar and Foundation Center. You must register in order to access the files at GuideStar.
Overview
Analysis:
Revenue
Analysis:
Expenses
Analysis:
Big Donors
Analysis:
Officer Compensation
Prior to doing this investigation, I had some expectation that the Singularity Summit was a money losing operation. I had an expectation that Eliezer probably made around $70k (programmer money discounted for being paid by a non-profit). I figured the SIAI had a broader donor base. I was off base on all counts. I am not currently an SIAI supporter. My findings have greatly increased the probability that I will donate in the future.
Overall, the allocation of funds strikes me as highly efficient. I don't know exactly how much the SIAI is spending on food and fancy tablecloths at the Singularity Summit, but I don't think I care: it's growing and it's nearly breaking even. An attendee can be quite confident that their fee covers their cost to the organization. If you go and contribute, you add pure value by your attendance.
At the same time, the organization has been able to expand services without draining the coffers. A donor can hold a strong expectation that the bulk of their donation will go toward actual work in the form of salaries for working personnel or events like the Visiting Fellows Program.
Eliezer's compensation is slightly more than I had guessed. I'm not sure what upper bound I would have balked at, or would balk at now. I do have some concern about the cost of recruiting additional Research Fellows; the cost of additional RFs has to be weighed against new programs like the Visiting Fellows Program.
The organization appears to be managing its cash reserves well. It would be good to see the SIAI build up some asset reserves so that it could operate comfortably in years when public support dips, or take advantage of unexpected opportunities.
The organization relies heavily on major donor support. I expect the 2010 filing to reveal broader revenue and continued expansion of services, but I do not expect the organization to have become independent of big donors. Things are much improved from 2006, and without Peter Thiel's initial support the SIAI would not be able to provide the services it has; still, it would be good to see the SIAI's operating capacity exceed any one donor's annual contribution. It is important for Less Wrong to begin a discussion of broadening the SIAI's revenue sources.
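To make the concern concrete, here is a minimal sketch of the kind of check I have in mind, using entirely hypothetical figures (these are not taken from the SIAI's Form 990s):

```python
# Hypothetical donor figures for illustration only -- NOT the SIAI's
# actual numbers from its Form 990s.
donations = {
    "Major donor": 280_000,
    "Donor B": 45_000,
    "Donor C": 20_000,
    "Small donors (combined)": 55_000,
}

total_revenue = sum(donations.values())
largest_gift = max(donations.values())
operating_costs = 350_000  # hypothetical annual operating budget

print(f"Total revenue:       ${total_revenue:,}")
print(f"Largest single gift: ${largest_gift:,} "
      f"({largest_gift / total_revenue:.0%} of revenue)")

# The organization is exposed if losing its largest donor would push
# revenue below operating costs.
exposed = (total_revenue - largest_gift) < operating_costs
print("Could absorb loss of largest donor:", not exposed)
```

On these made-up numbers, 70% of revenue comes from one donor and the budget cannot be covered without them; that is exactly the exposure a broader donor base would eliminate.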
Where to Start?
There is low-hanging fruit to be found. The SIAI's annual revenue is well within a range where we can have a significant impact. These suggestions aren't all equally promising; they are just things that come to mind.