handoflixue comments on Help Fund Lukeprog at SIAI - Less Wrong
Good grief, people. There are conspiracies that need ferreting out, but they do not revolve around generating fake data about the effectiveness of an alpha version of a rationality training camp that was offered for free to a grateful public.
I went to the minicamp, I had a great time, I learned a lot, and I saw shedloads of anecdotal evidence that the teachers are striving to become as effective as possible. I'm sure they will publish their data if and when they have something to say.
Meanwhile, consider re-directing your laudable passion for transparency toward a publicly traded company, a medium-sized city, or a research university. Fighting conspiracies is an inherently high-risk activity, both because you might be wrong about the conspiracy's existence, and because even if the conspiracy exists, you might be defeated by its shadowy and awful powers. Try to make sure the risks you run are justified by a big enough payoff at the end.
The positive endorphin rush from you and lukeprog sends signals that look just like the enthusiastic gushing I see from any week-long "how to fix your life in five easy steps!" seminar. Smart people get caught up in biased thinking all the time. I had a good friend quit AI research to sell a self-help book, so I may be particularly sensitive to this :)
Objective data means I can upgrade this from "oh bunnies, another self-help meme" to "oooh, fascinating and awesome thing that I want to steal for myself." As long as it signals like a self-help meme, I'm going to shoot it down just like I'd shoot down any similar meme that tried to sell itself here on LessWrong.
All right, but there's a fine line between shooting down self-help memes and unnecessarily discouraging project-builders from getting excited about their work. It's not fun or helpful for a pioneer to have his or her every first step be met with boundless skepticism. Your concerns sound real enough to me, but even an honest concern can be rude, and even a rationalist can validly trade off a tiny little bit of honesty for a whole lot of politeness and sympathy.
Why do I say "a tiny little bit of honesty"? Well, if the minicamp were being billed as "finished," "polished," "complete," "famous," "proven," or "demonstrably successful," as many self-help programs are, then it would make sense to demand data supporting those claims.
Instead, the PR blurb says that "Starting on May 28th, the Singularity Institute ran a one-week Rationality Training Camp. Our exit survey shows that the camp was a smashing success, surpassing the expectations of the organizers and the participants."
Leaving aside the colorful language that can and should characterize most press releases, this is a pretty weak claim: the camp beat expectations. Do you really need to see data to back that up?
"Please give us money" and "Co-organized and taught sessions for a highly successful one-week Rationality Minicamp" are stronger claims.