Related to: Science: Do It Yourself, How To Fix Science, Rationality and Science posts from this sequence, Cargo Cult Science, "citizen science"
You think you have a good map; what you really have is a working hypothesis
You've done some thinking on human rationality, perhaps spurred by intuition or personal experience. Building it up, you did your homework and stood on the shoulders of other people's work, giving proper weight to expert opinion. You write an article on LessWrong; it gets upvoted, debated, and perhaps accepted and promoted as part of a "sequence". But now you'd like to do that thing that's been nagging you since the start: you don't want to be one of those insight junkies who consume fun, plausible ideas and never get around to testing them. Let's see how the predictions made by your model hold up! You dive into the literature in search of experiments that have conveniently already tested your idea.
It is possible there simply isn't any such experimental material, or that it is unavailable. Don't get me wrong: if I had to bet on it, I would say it is more likely than not that there is at least something similar to what you need. I would also bet that some things we wish were done haven't been so far and are unlikely to be for a long time. In the past I've wondered whether we can expect CFAR or LessWrong to do experimental work in the future to test many of the hypotheses we've come up with based on fresh but unreliable insight, anecdotal evidence, and long, fragile chains of reasoning. This will not happen on its own.
With mention of CFAR, the mind jumps to them doing expensive experiments or giving long questionnaires to small samples of students and then publishing papers, like everyone else does. It is the respectable thing to do, and it may or may not be worth their effort. It seems doable. The idea of LWers getting into the habit of testing their ideas on human rationality beyond the anecdotal, though, seems utterly impractical. Or is it?
That ordinary people can band together to rapidly produce new knowledge is anything but a trifle
How useful would it be if we had a site visited by thousands or tens of thousands of people, filling out forms or participating in experiments submitted by LessWrong posters or CFAR researchers? Something like this site. How useful would it be if we made such a data set publicly available? What if, in addition to this data, we could mine how people use apps or an online rationality class? At this point you might be asking yourself whether building knowledge this way is even possible in fields that take years to study. A fair question, especially for tasks that require technical competence, but the answer is yes.
I'm sure many, at this point, have started wondering what kinds of problems biased samples might create for us. It is important to keep in mind what kind of sample of people you get to participate in an experiment or fill out your form, since this influences how confident you are allowed to be about generalizations. Learning things about very specific kinds of people is useful too. Recall that this is hardly a unique problem; you can't really get away from it in the social sciences. WEIRD samples aren't weird in academia. And I didn't say the thousands or tens of thousands of people would need to come from our own little corner of the internet; indeed, they probably couldn't. There are many approaches to recruiting them and making the sample as good as we can. Sites like yourmorals.org have tried a variety of approaches, and we could learn from them. Even hiring people from Amazon Mechanical Turk can work out surprisingly well.
LessWrong Science: We do what we must because we can
The harder question is whether the resulting data would be used at all. As we currently are? I don't think so. There are many publicly available data sets and plenty of opportunities to mine data online, yet we see little if any original analysis based on them here. Either we don't have norms encouraging this, or we don't have enough people comfortable with statistics doing it. Problems like this aren't immutable. The Neglected Virtue of Scholarship noticeably changed our community in a similarly profound way, with positive results. Feeling that more is possible, I think it is time for us to move in this direction.
Perhaps just creating a way to get the data will attract the right crowd; the quantified-self people are not out of place here. Perhaps LessWrong should become less of a site and more of a blogosphere. I'm not sure how, and I think for now that question is a distraction anyway. What clearly can be useful is to create a list of models and ideas we've already assimilated that haven't really been tested, or that are based on research that still awaits replication. At the very least this will help us be ready to update if relevant future studies show up. But I think that identifying any low-hanging fruit, designing some experiments or attempts at replication, and then going out there and trying to perform them can get us so much more. If people have enough pull to get these done inside academia without community help, great; if not, we should seek alternatives.
A young, still-learning member calling reading papers "fun" without a second thought is already impressive progress compared to the epistemic attitude of most people around us, I'd say.
LW posters have noticed many times that the most instrumentally rational people, hailed for making the world better or at any rate leaving a mark on it (Page & Brin, Warren Buffett, Linus Torvalds, maybe Thiel; among politicians either Gandhi, Churchill or Lee Kuan Yew - they wouldn't have got along! - and maybe some older ones like Alexander II of Russia or the people behind the Meiji Restoration...), rarely behave as Eliezer or Traditional Rationality would want them to. They exploited some peculiar factors - innate or unintentionally acquired advantages (genes, lucky upbringing, broad life experience) - that LW attempts to emulate through some written advice and group meetings. Most haven't even heard of Bayes or can't name a couple of fallacies! :)
At this stage, if an LW user actually uses the letter and spirit of LW materials to gain rent in some complicated, important area (like education, career, interpersonal relations, "Luminosity", fighting akrasia) - well, that's a pleasant surprise with a low prior. And some might not even pretend to heed the advice. E.g. my choice of education and career (Social Sciences) directly contradicts the common LW wisdom that much of it is pure woo and will be made irrelevant in the transhuman world anyway. I can't even formulate a "rationalist" argument against that wisdom, besides some vague guesses that principles of social organization and grand-scale value conflict, like farmers vs. foragers - what LW likes to dismiss as "politics" - might stay important after we handle FAI, death or scarcity. For all the LW consensus knows, I might be insane for choosing to blow the next few years on empty talk instead of going the 80000 Hours way, or raising x-risk awareness by writing fiction, or something else "rationalist".
Even our smallest real gains (openness, changing one's mind, "luminosity", intellectual rigor) are impressive, given just how ineffective or double-edged most deliberate attempts at instrumental rationality are. New Atheism, "pragmatic" politics (along the lines of moldbuggery), "PUA", theology-based intellectual traditions like the Jewish ones - all claim to make you wiser, more truth-oriented, with better heuristics... yet all can have specific, awful, all-too-commonly seen negative effects on their real audiences.
Ack, I noticed some tribe blindness in myself here. Out of the examples you list in your last paragraph, I can immediately think of negative effects each of these ideas has on its audience, except for the first one, New Atheism. Of course (remarkable coincidence!) that happens to be the one I have a personal association with. Can you elaborate on the negative effects you were thinking of when you mentioned New Atheism?