Mass_Driver comments on Help Fund Lukeprog at SIAI - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
This comes off very strongly as the typical bureaucratic protectiveness - a business doesn't want to share raw data, because raw data is a valuable resource. If you came out and said this was the reason, I'd be more understanding, but it would still feel like a major violation of community norms to be so secretive.
If simple secrecy is indeed the case, I would urge you, please, be honest about this motive and say so explicitly! At least then we are having an honest discussion, and the rest of this comment can be disregarded.
In short, what is the reason you can't share this RAW data, which you state you collected, and which you've presumably found sufficient for your own preliminary conclusions? I don't think Silas is asking for or expecting an elegant PowerPoint presentation or a concise statistical analysis - I know I would personally love to simply see raw data.
Is there truly not a single spreadsheet or writeup that you could put up for us to study while you collect the rest of the data?
Good grief, people. There are conspiracies that need ferreting out, but they do not revolve around generating fake data about the effectiveness of an alpha version of a rationality training camp that was offered for free to a grateful public.
I went to the minicamp, I had a great time, I learned a lot, and I saw shedloads of anecdotal evidence that the teachers are striving to become as effective as possible. I'm sure they will publish their data if and when they have something to say.
Meanwhile, consider re-directing your laudable passion for transparency toward a publicly traded company or a medium-sized city or a research university. Fighting conspiracies is an inherently high-risk activity, both because you might be wrong about the conspiracies' existence, and because even if the conspiracy exists, you might be defeated by its shadowy and awful powers. Try to make sure the risks you run are justified by an even bigger payoff at the end of the tunnel.
I don't think anybody is accusing the minicamp folks of anything of the kind. But public criticism and analysis of conclusions is the only reliable way to defend against overconfidence and wishful thinking.
When I ended my term as an SIAI Visiting Fellow, I too felt like the experience would really change my life. In reality, most of the effects faded away within a few months, though a number of factors combined to permanently increase my average long-term happiness level.
Back then the rationality exercises were still being worked out and Luke wasn't around, so it's very plausible that the minicamp is a lot more effective than the Visiting Fellow program was for me. But the prior for any given self-help program having a permanent effect is small, even if participants give glowing self-reports at first, so deep skepticism is warranted. No conspiracies are necessary, just standard wishful thinking biases.
Though I think this was the third time that Silas raised the question before finally getting a reply, despite his comment being highly upvoted each time. If some people are harboring suspicions of SIAI covering up information, well, I can't really say I'd blame them after that.
For the record, I for one don't recall reading any of SilasBarta's earlier comments on this topic.
It seems rather unlikely to me that being a mini-camp participant would have more of an effect on someone's life than being a Visiting Fellow, new techniques or not-- and if I am wrong, I would very much want to encounter these new techniques!
I wouldn't be that surprised. Explicit rationality exercises were only starting to be developed during the last month of my stay, and at that point they mostly fell into the category of "entertaining, but probably not hugely useful". The main rationality boost came from being around others with a strong commitment to rationality, but as situationist psychology would have it, the effect faded once I was out of that environment.
The positive endorphin rush from you and lukeprog sends signals that look just like the enthusiastic gushing I see from any week-long "how to fix your life in five easy steps!" seminar. Smart people get caught up in biased thinking all the time. I had a good friend quit AI research to sell a self-help book, so I may be particularly sensitive to this :)
Objective data means I can upgrade this from "oh bunnies, another self-help meme" to "oooh, fascinating and awesome thing that I want to steal for myself." As long as it signals like a self-help meme, I'm going to shoot it down just like I'd shoot down any similar meme that tried to sell itself here on LessWrong.
All right, but there's a fine line between shooting down self-help memes and unnecessarily discouraging project-builders from getting excited about their work. It's not fun or helpful for a pioneer to have his or her every first step be met with boundless skepticism. Your concerns sound real enough to me, but even an honest concern can be rude, and even a rationalist can validly trade off a tiny little bit of honesty for a whole lot of politeness and sympathy.
Why do I say "a tiny little bit of honesty"? Well, if the minicamp were being billed as "finished," "polished," "complete," "famous," "proven," or "demonstrably successful," as many self-help programs are, then it would make sense to demand data supporting those claims.
Instead, the PR blurb says that "Starting on May 28th, the Singularity Institute ran a one-week Rationality Training Camp. Our exit survey shows that the camp was a smashing success, surpassing the expectations of the organizers and the participants."
Leaving aside the colorful language that can and should characterize most press releases, this is a pretty weak claim: the camp beat expectations. Do you really need to see data to back that up?
"Please give us money" and "Co-organized and taught sessions for a highly successful one-week Rationality Minicamp" are stronger claims.
For me, this isn't about making SIAI transparent; it does quite enough in that regard. It's about stopping an information cascade genie that's already out of the bottle.
Let me put it this way: right now the ratio of "relying on the assumption of the minicamp's success for decision making" to "available evidence for its success" is about 20-to-1. As I warned before, it's quickly becoming something "everyone knows" despite the lack of evidence (and the major suspicions of many people, going in, that it wouldn't succeed). And that belief will keep feeding on itself unless someone traces it back to its original evidence.
It doesn't reassure me that I'm told I have to keep waiting before anything's conclusive, yet they can declare it a success now.
I just want the reliable evidence they claim to have, rather than dime-a-dozen self-help testimonials. They collected hard data, and I gave them a list of things they could provide that are easy to gather, don't compromise privacy, and are much more likely to be present if the success were real than if it were not. Even after AnnaSalamon's circling of the wagons, I don't see that.
I think this is largely a case of people reading different things into 'success'.