By check? Can you PM or email me with the name? The reason I ask is so that I can figure out how close HPMOR is to the 4-day update threshold, add it into my calculations in advance, and make sure it doesn't get double-counted when the actual check arrives. (BTW, do you want credit with my thousands of fanatic readers for bringing the threshold closer?)
Feels counterintuitive, but if just 50 people establish arrangements like this one, SingInst gets a reliable supply of funding at its current spending level, independent of funding rallies or big-sum donors.
50 people is a lot. Certainly a large number of people here simply cannot afford that sort of commitment to any charity. There are a lot of grad students here for example, some of whom are getting less than that for their monthly salaries. In fact when I saw Rain's comment my first thought was "how the heck does Rain have that kind of money?"
my first thought was "how the heck does Rain have that kind of money?"
Low cost of living and a good job. I've always wondered the opposite, "how the heck does nobody else have any money?" I have so much left over every month, I wondered what to do with it for a long time before deciding on making a better future in the best way I know how.
I don't know if you'll be able to translate to SEK, but here's my Canadian dollar budget:
3000/month income after tax
-100/month food
-400/month housing
-300/month personal spending
The rest (2200) is for savings and SI (not that I've organized a monthly $1k yet or anything).
$100 for food: people are consistently amazed at this one. Oatmeal + milk + granola for breakfast. Eggs + English muffins + cheese + mayonnaise + celery + peanut butter + carrots + leftovers for lunch. Cheap meat and veggies and rice and such for dinner. I shop at the local grocer for meat and veggies, and Real Canadian Superstore for everything else.
The trick is to be strict about it. Put your money in a box at the beginning of the month, and eat fucking beans and rice for a week if you blow the budget. You learn quick this way. Only problem is cooking; it eats up like 4 hours a week.
$400 for housing: live with roommates, and rent.
$300/mo personal: that's actually a lot of money, but you do have to be careful. You can't buy a new jacket every month, or you won't be able to buy anything else. Again, strict budgeting.
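For what it's worth, the budget above boils down to simple arithmetic; here's a minimal sketch using the CAD figures from this comment:

```python
# Monthly budget from the comment above (CAD, after tax).
income = 3000
expenses = {"food": 100, "housing": 400, "personal": 300}

# Whatever is left over goes to savings and SI.
leftover = income - sum(expenses.values())
print(f"Left over each month: ${leftover}")  # → $2200
```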
I hope this helps people become more effective altruists!
50 people is still a number that fits into human imagination and feels ordinary, unlike a sum of $600,000 or a single donation of $200,000.
Lots of people in the US/UK make several thousand a month, and anything on the order of 10% of income is usually expendable. Of course, depending on income, a lower pledge works out proportionally.
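The arithmetic behind the $600,000 figure is easy to check; a quick sketch, assuming the $1k/month pledge mentioned upthread (the figures are illustrative):

```python
# 50 donors each pledging $1,000/month, as discussed above.
donors = 50
monthly_pledge = 1000
annual_total = donors * monthly_pledge * 12
print(f"${annual_total:,}/year")  # → $600,000/year
```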
Hiring Luke full time would be an excellent choice for the SIAI. I spent time with Luke at mini-camp and can provide some insight.
Luke is an excellent communicator and agent for the efficient transmission of ideas. More importantly, he has the ability to teach these skills to others. Luke has shown this skill publicly on Less Wrong and also on his blog, with this distilled analysis of Eliezer's writing "Reading Yudkowsky."
Luke is a genuine modern-day renaissance man, a true polymath. However, Luke is keenly aware of his limitations and has devoted significant work to finding ways of removing or mitigating them. For example, any person with a broad range of academic interests could fall prey to never acquiring useful skills in any of those interest areas. Luke sees this as a serious concern and wants to maximize the efficiency of searching the academic space of ideas. Again, for Luke this is a teachable skill. His session "Productivity and Scholarship" at minicamp outlined techniques for efficient research and reducing akrasia. None of that material would be particularly surprising for a regular reader of Less Wrong -- because Luke
Was planning on waiting 'til the last day to decide with maximum info (in particular, whether the maximum match amount was met). If enough other people think like me, SIAI should see a rush of cash in the last few days of the contest.
But Eliezer forced my hand with this from MoR:
Thus this fic will next update at 7pm Pacific Time, on August 30th 2011, unless the Summer Challenge reaches $50,000 or more before then, in which case the fic will update sooner (but still at 7pm, because I'm not cruel).
Also we need more Luke.
So that's another $1000 for SIAI.
So, no plans for providing any substantiation of the mini-camp's purported success? (Some want to know.) Or of people who have increased their level of life success as a result of the winning at life guides?
Among the ultimate criteria for the minicamps is their impact on long-term life success. To assess this, both minicamp participants and a control group completed a long, anonymous survey containing many indicators of life success (income, self-reported happiness and anxiety levels, many questions about degree of social connectedness and satisfaction with relationships, etc.); we plan to give it again to both groups a year after minicamp, to see whether minicampers improved more than controls. I'm eager to see those results and update on them, but we're only a couple months into the year's waiting period. (The reason we decided ahead of time to wait a year is that minicamp aimed to give participants tools for personal change; and, for example, it takes time for improved social skills, strategicness, and career plans to translate into income.)
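The comparison described here is essentially a difference-in-differences estimate. A minimal sketch, with made-up numbers standing in for the (unpublished, anonymous) survey scores:

```python
# Difference-in-differences sketch of the planned before/after comparison.
# All scores below are invented for illustration; they are NOT survey results.
def mean(xs):
    return sum(xs) / len(xs)

minicamp_before = [5.0, 6.0, 4.5]  # hypothetical pre-camp scores on one indicator
minicamp_after  = [6.5, 7.0, 5.5]  # hypothetical scores a year later
control_before  = [5.5, 5.0, 6.0]
control_after   = [5.8, 5.2, 6.1]

# Did minicampers improve more than the controls did over the same year?
effect = (mean(minicamp_after) - mean(minicamp_before)) - (
    mean(control_after) - mean(control_before)
)
print(f"estimated effect: {effect:+.2f}")
```

A positive `effect` would suggest minicampers improved more than controls, though without randomized admission (see the discussion below about screening) it cannot cleanly separate the camp's impact from selection effects.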
Meanwhile, we’re working with self-report measures because they are what we have. But they are more positive than I anticipated, and that can’t be a bad sign. I was also positively surprised by the number of rationality, productivity, and social effectiveness habits that participants reported using regularly, in response to my email asking, t...
both minicamp participants and a control group
How was the control group selected? Did you select a pool of candidates larger than you could accept, then randomly take a subset as a control? If not, then calling it a 'control group' is borderline at best.
The prior expectation for the influence of one week of training on personal success over a year is far lower than that of various personal and environmental qualities in the individual. That being the case, it is more reasonable to attribute differences in progress between the groups to the higher growth potential of the chosen minicampers. This primarily reflects well on the ability of the SingInst rationality trainers to identify indicators of future success - a rather important skill in its own right!
A good point. The control group was of folks who made it through the initial screening but not the final screening, so, yes, there are differences. We explicitly discussed the possibility of randomizing admissions, but, for our first go, elected to admit the 25 people we most wanted, and to try randomizing some future events if the first worked well enough to warrant follow-ups (which it did). It is a bit of a hit to the data-gathering, but it wasn't growth potential as such that we were selecting for -- for example, younger applicants were less likely to have cool accomplishments and therefore less likely to get in, although they probably have more growth potential -- so there should still be evidence in the results.
Also, we marked down which not-admitted applicants were closest to the cut-off line (there were a number of close calls; I really wished we had more than 25 spaces), so we can gain a bit of data by seeing if they were similar to the minicamp group or to the rest of the controls.
I have a real hard time deciding how seriously I should take this survey.
The halo effect from doing anything around awesome people, like those found in a selected group of LessWrongers, is probably pretty strong. I fear at least some of the participants may have mixed up being with awesome people with becoming awesome. Don't get me wrong: being with awesome people in and of itself will work ... for a while, until you leave that group.
I'm not that sceptical of the claims, but from the outside it's hard to tell the difference between this scenario and the rationality camps working as intended.
You're right to suspect that this could have happened. That said: I was a mini-camp participant, and I actually became more awesome as a result. Since mini-camp, I've:
I stuck around in California for the summer, and gained a lot from long conversations with other SIAI-related people. The vigor and insight of the community was a major factor in showing me how much more was possible and helping me stick to plans I initiated.
But, that said - the points listed above appear to be a direct result of the specific things I learned at mini-camp.
Yes; what I meant by "success" was more like a successful party or conference; Luke pulled off an event that nearly all the attendees were extremely glad they came to, gave presentations that held interest and influenced behavior for at least the upcoming weeks, etc. It was successful enough that, when combined with Luke's other accomplishments, I know we want Luke, for his project-completion, social effectiveness, strategicness, fast learning curves, and ability to fit all these qualities into SingInst in a manner that boosts our overall effectiveness. I don't mean "Minicamp definitely successfully created new uber-rationalists"; that would be a weird call from this data, given priors.
SilasBarta,
We collected lots of data before and during minicamp. We are waiting for some time to pass before collecting followup data, because it takes time for people's lives to change, if they're going to change. Minicamp was only a couple months ago.
Minicampers are generally still in contact, and indeed we are still gathering data. For example, several minicampers sent me before and after photos concerning their fashion (which was part of the social effectiveness section of the minicamp) and I'm going to show them to people on the street and ask them to choose which look they prefer (without indicating which is 'before' and which one is 'after').
So yes: by all qualitative measures, minicamp seems to have been a success. The early quantitative measures have been taken, but before-and-after results will have to wait a while.
As for future rationality training, we are taking the data gathered from minicamp and boot camp and also from some market research we did and trying to build a solid curriculum. To my knowledge, four people are seriously working on this project, and Eliezer is one of them.
Cheers,
Luke
So it's early enough to call it an unqualified success, but too soon for evidence to exist that it was a success? If I have to be patient for the evidence to come back, shouldn't you be a little more patient about judging it a success?
Edit: I gave a list of information you could post. The fashion part isn't surprising enough to count as strong evidence, and was a relatively small part of the course that, in any case, you previously claimed could be accomplished by looking at a few fashion magazines.
I mean 'evidence' in the Bayesian sense, not the scientific sense. I have significant Bayesian evidence that minicamp was a success on several measures, but I can't know more until we collect more data.
Thanks for providing a list of information we could post. One reason for not posting more information is that doing so requires lots of staff hours, and we don't have enough of those available. We're also trying to, for example, develop a rationality curriculum and write a document of open problems in FAI theory.
If you're anxious to learn more about the rationality camps before SI has time to publish about that data, you're welcome to contact the people who attended; many of them have identified themselves on Less Wrong.
I'm fairly confident that campers got more out of my fashion sessions than what they can learn only from looking at a few fashion magazines.
Cheers,
Luke
Good grief, people. There are conspiracies that need ferreting out, but they do not revolve around generating fake data about the effectiveness of an alpha version of a rationality training camp that was offered for free to a grateful public.
I went to the minicamp, I had a great time, I learned a lot, and I saw shedloads of anecdotal evidence that the teachers are striving to become as effective as possible. I'm sure they will publish their data if and when they have something to say.
Meanwhile, consider re-directing your laudable passion for transparency toward a publicly traded company or a medium-sized city or a research university. Fighting conspiracies is an inherently high-risk activity, both because you might be wrong about the conspiracies' existence, and because even if the conspiracy exists, you might be defeated by its shadowy and awful powers. Try to make sure the risks you run are justified by an even bigger payoff at the end of the tunnel.
There are conspiracies that need ferreting out, but they do not revolve around generating fake data about the effectiveness of an alpha version of a rationality training camp that was offered for free to a grateful public.
I don't think anybody is accusing the minicamp folks of anything of the kind. But public criticism and analysis of conclusions is the only reliable way to defend against overconfidence and wishful thinking.
When I ended my term as an SIAI Visiting Fellow, I too felt like the experience would really change my life. In reality, most of the effects faded away within some months, though a number of factors combined to permanently increase my average long-term happiness level.
Back then the rationality exercises were still being worked out and Luke wasn't around, so it's very plausible that the minicamp is a lot more effective than the Visiting Fellow program was for me. But the prior for any given self-help program having a permanent effect is small, even if participants give glowing self-reports at first, so deep skepticism is warranted. No conspiracies are necessary, just standard wishful thinking biases.
Though I think this was the third time that Silas raised the question before finally getting a reply, despite his comment being highly upvoted each time. If some people are harboring suspicions of SIAI covering up information, well, I can't really say I'd blame them after that.
I wouldn't be that surprised. Explicit rationality exercises were only starting to be developed during the last month of my stay, and at that point they mostly fell into the category of "entertaining, but probably not hugely useful". The main rationality boost came from being around others with a strong commitment to rationality, but as situationist psychology would have it, the effect faded once I was out of that environment.
this is a pretty weak claim: the camp beat expectations
"Please give us money" and "Co-organized and taught sessions for a highly successful one-week Rationality Minicamp" are stronger claims.
Oh, sure. The reason is easy to communicate. We explicitly told minicampers that their feedback on the exit survey would be private and anonymous, for maximal incentive to be direct and honest. We are not going to violate that agreement. The testimonials were given via a separate form, with permission granted to publish THAT data publicly.
A summary of the data can be published, for example median scores for measured values. But the data can't be published in raw form.
Not sure if raw testimonial data has been published yet. We do have data beyond testimonials and exit surveys, but that, too, requires precious staff hours to compile and write up, and it is still in the process of being collected.
Typing this stuff from a phone, pardon the brevity...
(I disapprove of downvoting the parent (which I just found at '-2'). It continues the same conversation as the previous Silas's posts, pointing out what does look like rationalization. If raising a possibility of interlocutor's rationalizing in defense of their position is considered too rude to tolerate, we'll never fix such problems.)
I suspect that most of the downvotes came from the very last sentence, which struck me as more than a little snarky. "Cheers" might not be necessary, but it is a gesture of politeness and was probably added in an attempt to convey a positive tone (which is important but somewhat tricky in text). I wouldn't say "not necessary" if someone held the door for me, even if it is obviously true.
Agree with you that the actual substance of the post was in no way downvote-worthy.
Because "signing" comments is not customary here, doing so signals a certain aloofness or distance from the community, and thus can easily be interpreted as a passive-aggressive assertion of high status. (Especially coming from Luke, who I find emits such signals rather often -- he may want to be aware of this in case it's not his intention.)
I interpret Silas's "Not necessary" as roughly "Excuse me, but you're not on Mount Olympus writing an epistle to the unwashed masses on LW down below".
I just looked through several pages of lukeprog's most recent comments, and the only ones that were signed were direct replies to SilasBarta.
I get annoyed by people who "sign" posts in the text like that, especially when they do it specifically on replies to me. It really isn't necessary. I'm interested in substance, not pleasantries, as I was three months ago when I asked how the mini-camp was a success.
Great, so did I! Now communicate that evidence. If it can't be communicated, I don't think you should be so confident in it.
Here:
It may take time for the participants to report back, but not for you to tabulate the results.
No, it definitely takes time to tabulate the results and write a presentable post about them. I've personally spent 3 hours on it already, but the project is unfinished.
Do you want your audience to be people who just take your word on something like this?
Ah. I may not have communicated this clearly: I think your skepticism concerning the success of the minicamp is warranted, because almost no evidence is available to you. You're welcome to not take my word for it. When I have another 5-10 hours to finish putting together the results and write a post with more details about minicamp, I will, but I'm m...
Given the large amount of effort it took to get to the minicamps, all four of these could be easily explained by cognitive dissonance.
What evidence would you expect them to have if the "bootcamp" was a genuine success?
Bootcamp? I found the wording Eliezer used fascinating:
Co-organized and taught sessions for a highly successful one-week Rationality Minicamp, and taught sessions for the nine-week Rationality Boot Camp.
Have they actually claimed anywhere here that the bootcamp was successful?
There is a good reason. A lot of things people know can't contribute to forming reliable public knowledge for all sorts of practical reasons. And you know the reference for the arguments about this question: Scientific Evidence, Legal Evidence, Rational Evidence.
Here are excerpts from the Minicamp testimonials (which were written to be shown to the public), with a link to the full list at the end:
“The week I spent in minicamp had by far the highest density of fun and learning I have ever experienced. It's like taking two years of college and condensing it to a week: you learn just as much and you have just as much fun. The skills I've learned will help me set and achieve my own life goal, and the friends I've made will help me get there.” --Alexei
“This was an intensely positive experience. This was easily the most powerful self-modification I've ever made, in all of the social, intellectual, and emotional spheres. I'm now a more powerful person than I was a week ago -- and I can explain exactly how and why this is true.
At mini-camp, I've learned techniques for effective self-modification -- that is, I have a much deeper understanding of how to change my desires, gather my willpower, channel my time and cognitive resources, and model and handle previously confusing situations. What's more, I have a fairly clear map of how to build these skills henceforth, and how to inculcate them in others. And all this was presented in such a wa...
Great! Much of the minicamp data is private and anonymous, so I can't share that with you, but I definitely have tasks for volunteers to do that will free up time for me to write up a minicamp report - some of those tasks are even directly relevant to minicamp. Please email me at lukeprog at gmail if you'd like to help.
SilasBarta,
I need to write up the results myself because I personally ran the minicamp with Anna Salamon and Andrew Critch. But I have tons of stuff I could have you do that would free up more of my time to get around to writing up results of minicamp data. If you're interested in helping, that would be awesome. You can contact me at lukeprog [at] gmail.
"Sure, I'd love to! (I thought I didn't qualify to volunteer for SIAI?)"
LOL. Way to play up the role of the passive-aggressive outsider.
I agree with 'more testing and evidence, please', but you often come across as adversarial and I think that generally makes it harder for you to convince the people you want to convince.
As an aside, remember that the minicamp was a relatively unplanned event; it came about because SIAI had extra time and space. I will be more concerned if the megacamp has a similar lack of testing.
I agree with 'more testing and evidence, please', but you often come across as adversarial and I think that generally makes it harder for you to convince the people you want to convince.
Well, I hope they're not relying on "Silas is a meanie" as their intellectual "covering fire" for not substantiating this claim. And it's not that I want more testing and evidence, I just want to see what they think proves its success.
As an aside, remember that the minicamp was a relatively unplanned event; it came about because SIAI had extra time and space.
True, but I wouldn't be asking for any of this if leaders didn't try to paint it afterwards as a major success. If they want to take a risky venture, fine. If they want to play, "I meant to do that", let's see what it accomplished.
I'm not talking about 'covering fire'. If your goal is to win an argument or appear righteous, then your strategy is alright. If your goal is to actually get SIAI to change their behavior, then your language is hurting your cause. You want to make it as easy as possible for them to change their behavior, and it's psychologically much easier to do something because an ally asks than because an adversary asks.
You have seen evidence: both Guy (link) and I (link) posted 'lessons learned' for the minicamp. You are right to say this is not especially strong evidence, but it is evidence. I think it would have been good to video tape some of the sessions and post them and post the exit surveys (they took testimonials too).
One of the many things I updated on as a result of the 9-week bootcamp is the importance of tone. I'm sympathetic to your data-crusade, but the way in which you're prosecuting it is leading me to dislike you.
You've made a number of posts indicating that you place a high priority on finding and joining a rationalist community. That will be more difficult if you are generally perceived by rationalists as a hostile conversationalist; you should be more strategic about achieving your goals.
Upvoted, and props for sticking to your guns on this.
I appreciate that substantiating this sort of thing is non-trivial, but I would like to see at least an effort at some sort of evaluation.
At the very least I would like to see some kind of plan for evaluating future ventures... it may be too late now for anything but post-hoc qualitative evaluation of the mini-camp (or mega-camp?). Except that we did take a survey just before the camp, whose general idea I remember but not the specific questions, and I think there are plans to send it out again 10 months from now. Unfortunately that survey is available online IIRC, but I won't go look at it.
Something about Help Fund Luke, the Summer Challenge, and HPMOR just got a $50 donation out of a 20-year-old starving student. Feels good.
I've been low on cash recently, so I can't donate as much as I used to, but I resumed the 10 EUR/month regular donation I previously had going, which got cancelled for some reason I forget. (Hopefully, I'll keep that going indefinitely.)
I felt a remarkable resistance to donating such a small amount, feeling that it's even more embarrassing than not donating at all. But then I came to my senses and figured I'd post this comment to encourage others to also make a small donation rather than no donation. If you're considering giving a sum which seems too small to be worth it, there's no shame in that! I did it too!
Also, go Luke! You're awesome, and have the kind of amazing energy to work on these issues that I can only dream of having. Just be careful not to burn yourself out.
Also mentioned in the email that went out to the SIAI mailing list, a pledge of monthly donations for the next 12 months counts at the full year's value for purposes of matching.
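If I understand that rule correctly, a monthly pledge gets counted at twelve times its face value for matching purposes. A sketch, with an illustrative pledge amount:

```python
# Sketch of the matching rule described above: a 12-month pledge counts at its
# full-year value toward the match. The $100/month figure is illustrative.
monthly_pledge = 100
counted_toward_match = monthly_pledge * 12
print(f"${counted_toward_match} counted toward the Summer Challenge match")
```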
I lurk on this site every day, and this is the first I've heard about the Summer Challenge. Close call! I just added singinst.org/blog to my feed reader, and sent $1000 Luke's way.
I love funds-matching opportunities! And yet I didn't get an email or anything?
I think a lot of the hubbub in this thread is due to different interpretations of SIAI related folks saying that the minicamp was 'successful'. I think many people here have interpreted 'success' as meaning something like "definitely improved the rationality of the attendants lastingly" and I think SIAI folks intended to say something like "was competently executed and gives us something promising to experiment with in the future".
I just updated the Challenge Grant total to $39,695, or about 32% of our total. THANK YOU to everyone who donated in the last couple of days; many of you did!
May I suggest that, with regard to Eliezer's HPMOR challenge, the page get updated at least once every 24 hours, possibly even more often?
Donated $285, unrestricted, although funding Luke sounds like a fine thing to do with it. Also, will Bitcoin donations be matched by their dollar exchange rate?
Should this really be under main, and promoted at that? My impression was that main posts, and especially promoted ones were supposed to be reserved for posts discussing rationality and its applications, meant to be held up as examples of our best work. I have nothing against lukeprog, the SIAI, or this effort, but I don't think this really qualifies.
Should this really be under main, and promoted at that?
Yes, absolutely. Calls for action in support of causes associated with this site are material for promoted front page articles. It is valuable to send the message that participating, actually donating money rather than just thinking "That's nice, SIAI is hiring another research fellow", is important by giving prominence to the announcement.
But there aren't causes associated with the site.
That is simply false. LW was created by SIAI with the purpose of generating rationalists interested in reducing existential risks, and accepting and even encouraging that it might produce rationalists interested in other causes.
In the early days, we specifically avoided talking about SIAI, FAI, and existential risks because we didn't want shiny discussions about those topics to overwhelm our work on rationality. Now that we are more established, we no longer do that. From the beginning, that policy was meant to be temporary.
We need to be promoting articles that say why the SIAI is doing good work, and discussing the rationality behind supporting it.
False dichotomy. We have had lots of discussions about the effectiveness of SIAI. We can also have announcements of when they have a project that needs funding. There are people here who are already convinced SIAI is an effective charity worth supporting, but need some encouragement to actually support it. That is why this kind of announcement is important.
That is simply false.
You're right. That was a horribly crafted sentence, in many ways. They are clearly associated. But the site is about rationality, not the SIAI. That was my point. (The statement is also patently false if you take "rationality" as a cause, which is entirely reasonable.)
False dichotomy. We have had lots of discussions about the effectiveness of SIAI. We can also have announcements of when they have a project that needs funding. There are people here who are already convinced SIAI is an effective charity worth supporting, but need some encouragement to actually support it. That is why this kind of announcement is important.
Sure. But that doesn't mean it needs to be in Main and promoted...
From About Less Wrong
Once you have 20 or more karma points, you're allowed to make posts to the main community blog. (Click 'Create New Article' and change 'Post to' to 'Less Wrong'.) This section is intended for posts about rationality theory or practice that display well-edited writing, careful argument, and new material.
If posts like these are acceptable, then that statement is patently false and should be changed. If it is not false, then posts like this are inappropriate on Main.
What does the description "is acceptable" refer to? Acceptable by what criterion? The real question is whether things like this should be encouraged or discouraged, using whatever methods are at our disposal, including establishing a policy for moving "off-topic" posts out of Main. Instead, you seem to be appealing to an existing social attitude, which shouldn't be a major factor, as it too can be influenced by good arguments and other means.
I just made a donation using Google Checkout. (AFAICT, it should be an unrestricted donation - I won't tell you how best to spend your money.)
I wonder if anyone here shares my hesitation to donate (only a small amount, since I unfortunately can't afford anything bigger) due to thinking along the lines of "let's see, if I donate $100, that may buy a few meals in the States, especially CA, but on the other hand, if I keep it, I can live ~2/3 of a month on that, and since I also (aspire to) work on FAI-related issues, isn't this a better way to spend the little money I have?"
But anyway, since even the smallest donations matter (tax laws an' all that, if I'm not mistaken) and $5 isn't going to kill me, I've just made this tiny donation...
Your direct pleas for money often work best against my akrasia, Eliezer. Maybe some new lack of fallacies in my thinking has convinced me to give you money for ideas that some consider foolish, but here is a Benjamin for the cause.
Continue to develop his metaethics sequence, the conclusion of which will be a sort of Polymath Project for collaboratively solving open problems in metaethics relevant to FAI development.
This in itself, well-run, is worth a salary in expectation. Luke, have you talked to Michael Nielsen or Tim Gowers or Terence Tao or Gil Kalai about things that worked or didn't in the polymath projects they've run? I'm certain at least Nielsen will be very interested.
Edit: it feels like it's worth it. I didn't do any math.
I'm confused. Why are donations for this separate from other kinds of SIAI donations? Some kind of psychological trick to increase total donations? Or will his wage be directly proportional to how many donations mention him?
From the article, my semi-guess is that the SIAI has a number of things they'd like to fund, one of which is hiring Luke. If you think that hiring Luke is more important than other things the SIAI could be doing with immediate income right now, then you can specify that in your donation and make it happen.
Which brings up a question: what are the other things the SIAI would be spending new money on other than hiring Luke? Nothing against Luke, who is clearly awesome, but it's hard to build an ordered list of preferences if every choice but one is labelled "???".
There are a number of ways by which I could imagine SIAI choosing between the things they'd like to fund. A popular vote where only one of the options is specified isn't one of them.
I'd guess this is an attempt to get more donations, plus an experiment on the effectiveness of such an approach.
I had been planning on donating right before New Years when I do all my other charitable donation, but now that I'm aware of the matching I'm moving things up.
This really should have been done as a Kickstarter project. If SIAI suddenly decides it doesn't have enough money to fund lukeprog, what is going to happen to the people donating "unrestricted" but with the intent to fund lukeprog? Why should SIAI waste resources administrating the fundraiser while a perfectly good third-party product exists?
All of this is surprisingly effective in overcoming my akrasia, especially Nisan's way, so on top of my donation I wanted to subscribe monthly - unfortunately it seems that a credit card is necessary for that. Any ideas how to circumvent this? I do not want to get a (regular) credit card.
Saying "the rationality minicamp was highly successful" before you have analyzed the data you have gathered to assess the success of the rationality minicamp is irrational.
If success at the minicamp is important - as suggested by its being listed first in Eliezer's recommendation - why not wait until you CAN analyze the data, to see whether it really was successful, before recommending hiring Luke? Doing so means a) you can make a more persuasive case to donors, and b) if the minicamp WASN'T successful, you can reconsider the hire.
The fact this plu...
Singularity Institute desperately needs someone who is not me who can write cognitive-science-based material. Someone smart, energetic, able to speak to popular audiences, and with an excellent command of the science. If you’ve been reading Less Wrong for the last few months, you probably just thought the same thing I did: “SIAI should hire Lukeprog!” To support Luke Muehlhauser becoming a full-time Singularity Institute employee, please donate and mention Luke (e.g. “Yay for Luke!”) in the check memo or the comment field of your donation - or if you donate by a method that doesn’t allow you to leave a comment, tell Louie Helm (louie@intelligence.org) your donation was to help fund Luke.
Note that the Summer Challenge that doubles all donations will run until August 31st. (We're currently at $31,000 of $125,000.)
During his stint as a Singularity Institute Visiting Fellow, Luke has already:
As a full-time Singularity Institute employee, Luke could:
If you’d like to help us fund Luke Muehlhauser to do all that and probably more, please donate now and include the word “Luke” in the comment field. And if you donate before August 31st, your donation will be doubled as part of the 2011 Summer Singularity Challenge.