As I've been talking about on my shortform, I'd be excited about attracting more "programmer's programmers". AFAICT, a lot of LW users are programmers, but a large fraction of these users either are more interested in transitioning into theoretical alignment research or just don't really post about programming. As a small piece of evidence for this claim, I've been consistently surprised to see the relatively lukewarm reaction to Martin Sustrik's posts on LW. I read Sustrik's blog before he started posting and consistently find his posts there and here pretty interesting (I am admittedly a bit biased because I was already impressed by Sustrik's work on ZeroMQ).
I think that's a bit of a shame because I personally have found LW-style thinking useful for programming. My debugging process has especially benefited from applying some combination of informal probabilistic reasoning and "making beliefs pay rent", which enabled me to make more principled decisions about which hypotheses to falsify first when finding root causes. For a longer example, see this blog post about reproducing a deep RL paper, which discusses how noticing confusion helped the author make progress (CFAR is specifically mentioned). LW-style thinking has also helped me stop obsessing over much of the debate around some of the more mindkiller-y topics in programming like "should you always write tests first", "are type-safe languages always better than dynamic ones". In my ideal world, LW-style thinking applied to fuzzier questions about programming would help us move past these "wrong questions".
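To make "more principled decisions about which hypotheses to falsify first" concrete, here is a toy sketch of the kind of informal probabilistic reasoning I mean. All of the hypotheses, priors, and costs are invented for the example; the point is just ranking hypotheses by probability mass eliminated per unit of testing effort.

```python
# Toy model: pick which debugging hypothesis to falsify first by
# weighing prior plausibility against the cost of testing it.
hypotheses = [
    # (hypothesis, prior probability it's the root cause, minutes to test)
    ("off-by-one in the pagination logic", 0.40, 5),
    ("stale cache returning old results", 0.30, 2),
    ("race condition in the worker pool", 0.20, 60),
    ("compiler/runtime bug", 0.01, 120),
]

def payoff(h):
    """Crude proxy for information gained per minute of effort:
    prior probability eliminated divided by cost to test."""
    _, prior, minutes = h
    return prior / minutes

# Test the cheap-and-plausible hypotheses first, exotic ones last.
for name, prior, minutes in sorted(hypotheses, key=payoff, reverse=True):
    print(f"{name}: prior={prior:.2f}, cost={minutes}min, payoff={prior / minutes:.3f}")
```

Note that this ordering differs from both "test the most likely thing first" and "test the cheapest thing first"; the stale-cache hypothesis wins here despite a lower prior because it is so cheap to rule out.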
Programming already has a few other internet locuses such as Hacker News and lobste.rs, but I think those places have fewer "people who know how to integrate evidence and think probabilistically in confusing domains."
Assuming this seems appealing, one way to approach getting more people of the type I'm talking about would be to reach out to prominent bloggers who seem like they're already somewhat sympathetic to the LW memeplex and see if they'd be willing to cross-post their content. Examples of the sorts of people I'm thinking about include:
Hillel Wayne: writes about empiricism in software engineering and formal methods.
Jimmy Koppel: writes about insights for programming he's gleaned from his "day job" as a programming-tools researcher (I think he has a LW account already).
Julia Evans: writes about programming practice and questions she's interested in. A blog post of hers that seems especially LW-friendly is What does debugging a program look like?
Last, I do want to add a caveat to all this which I think applies to reaching out to basically any group: there's a big risk of culture clash/dilution if the outreach effort succeeds (see Geeks, MOPs, and sociopaths for one exploration of this topic). How to mitigate this is probably a separate question, but I did want to call it out in case it seems like I'm just recommending blindly trying to get more users.
Jimrandomh recently had the interesting observation that there might have been legitimately fewer rationalists in the world prior to the invention of programming, because it actually forces you to notice when your model is broken, form new hypotheses, and test them, all with short feedback loops.
I think that's a bit of a shame because I personally have found LW-style thinking useful for programming. My debugging process has especially benefited from applying some combination of informal probabilistic reasoning and "making beliefs pay rent", which enabled me to make more principled decisions about which hypotheses to falsify first when finding root causes.
As someone who landed on your comment specifically by searching for what LW has said about software engineering in particular, I'd love to read more about your methods, experiences, and thoughts on the subject. Have you written about this anywhere?
(minor note: the Jimmy and Julia links didn't work properly, because external links need to be prefaced with "https://www.")
lobste.rs,
This link is broken. It goes to: https://www.lesswrong.com/posts/KFBhguD7dSjtmRLeg/lobste.rs
In multiple LessWrong surveys, biorisk was ranked as a more probable existential risk than AGI. At the same time, there's very little written on LessWrong about biorisk. If we could recruit people into our community who could represent that topic well, I think it would be very valuable.
Minor conflict of interest disclaimer: I've recently become much more interested in computational biology and therefore have a personal interest in having more content related to biology in general on LW.
I'd be excited about having more representation from the experimental sciences, e.g. biology, certain areas of physics, chemistry, on LessWrong. I don't have a good sense of how many total LW users come from these fields, but it certainly doesn't seem like many prominent posters/commenters do. The closest thing to a prominent poster who talks about experimental science is Scott Alexander.
My sense from random conversations I've had over the years is that there's a lot of tacit but important knowledge about how to do experimental research and lab work well that isn't written down anywhere and could make for interesting complementary content to the wealth of content on LW about the connection between rationality and doing theory well. There's also an untapped treasure trove of stories about important discoveries in these areas that could make for good LW post series. I'd love to see someone take me through the history of Barbara McClintock's discoveries or the development of CRISPR from a rationalist perspective (i.e. what were the cognitive strategies that went along with discovering these things). There are books on discoveries like this of course, but there are also books on most of the material in the Sequences.
Having more LWers from experimental sciences could also provide a foundation for more detailed discussion of X-risks outside of transformative AI, bio-risks in particular.
In terms of attracting these sorts of people, one challenge is that younger researchers in these areas in particular tend to have long hours due to the demands of lab work and therefore may have less time to post on LW.
Biology-nerd LWer here (or ex-biology-nerd? I do programming as a job now, but still talk and think about bio as a fairly-high-investment hobby). BS in entomology. Disclaimer that I haven't done grad school or much research; I have just thought about doing it and talked with people who have.
I suspect one thing that might appeal to these sorts of people, which we have a chance of being able to provide, is an interesting applied-researcher-targeted semi-plain-language (or highly-visual, or flow-chart/checklist, or otherwise accessibly presented) explanation of certain aspects of statistics that are particularly likely to be relevant to these fields.
ETA: A few things I can think of as places to find these people are "research" and "conferences." There are google terms they're going to use a lot (due to research), and also a lot of them are going to be interested in publishing and conferences as a way to familiarize themselves with new research in their fields and further their careers.
Leaning towards the research funnel... here's some things I understand now that I did not understand when I graduated, many of which I got from talking/reading in this community, which I think a "counter
...We used to have posts like https://www.lesswrong.com/posts/pWi5WmvDcN4Hn7Bo6/even-if-you-have-a-nail-not-all-hammers-are-the-same , so quite a few people would read it.
Finance. Trading specifically.
I'd be interested in you saying more words about this – both about why it seems like a particularly promising area, and also if you have recommendations of who to approach or how to go about it.
One question might be: are there particular trading bloggers who seem "LW-adjacent" that we could get to cross-post here?
I'd be very interested to see someone talk about how many forces in finance are driven by superstition about superstition: for instance, how you can have situations where nobody really believes tulips are valuable, but disastrous things must now happen as a result of everyone believing that others believe that others believe that [...seemingly ad infinitum...] tulips are valuable. Where do these beliefs come from? How can they be averted? This kind of question seems very much in this school's domain.
There would have to be some speculation abou...
I would be interested in seeing more applied fields, and also specializations which operate at the intersection of multiple fields. Some examples include:
The adjacent memeplex of Effective Altruism seems to have a bunch of operations and finance people in it.
We might consider trying to target people who are connected to teaching or instruction in their area of expertise somehow. I expect the average level of engagement with a new subject is quite a bit deeper here than in most other communities, so we might be in a position to offer an audience of motivated learners as an enticement to them. Simultaneously, the instruction experience will help with the problem of technical posts having too high a threshold to engage with them.
I'd make an argument for 'soft-sciences' and humanities. Philosophy, cultural anthropology, history, political science, sociology, literature, and maybe even gender studies. Computer science, mathematics, economics, and other STEM-heavy fields are already pretty well represented within the current LW community.
The focus on group rationality and developing a thriving community seems like it could benefit from the expertise these fields bring to the table. This might also reduce the amount of 'reinventing the wheel' that goes on (which I don't necessarily think is a bad thing but also consumes scarce cognitive resources).
Further, I think there's a case to be made that a lot of the goals of the rationalist movement could be furthered by strengthening connections to serious academic fields that are less likely to come into memetic contact with rationalist ideas. If nothing else, it would probably help raise the sanity waterline.
Any reasonable scholar in gender studies faces a high reputational risk if they were to debate their field on LessWrong in a reasonable way. Any field that has dogmas that aren't allowed to be publicly debated has a problem with the kind of open discussion we are having here.
Again, I'm not super confident in this and I think there is a decent chance that this will wind up being pointless but it still seems worth spending a little time investigating.
The question is not just whether it's pointless but about whether it's potentially harmful.
We lost a room in which we held LW meetups in Berlin because LW discusses topics that shouldn't be discussed. The discussion in itself is 'unsafe' regardless of how you discuss or what conclusions are reached.
That's norms for using a meeting room. When it comes to norms that the gender studies community expects its own members to follow, a person who has a reputational stake in the community has a lot more to lose from violating norms in that way.
This isn't even a question of the academic quality of their discourse. r/atheism doesn't attack people in a way that destroys careers and isn't dangerous to anyone. This is different here. I wouldn't want a lone reasonable person in the gender studies field to lose their social capital and/or career for associating with this place.
The standard way LW historically handled politics is by discouraging its discussion. SSC did things differently and paid a price for it.
That's all separate from the actual quality of the academic discourse but it matters. As far as the discourse goes https://quillette.com/2019/09/17/i-basically-just-made-it-up-confessions-of-a-social-constructionist/ is an article by an insider where he reflects on the low standards he used over the decades.
IQ is around 50% heritable, the other 50% also matters, though.
This sounds like it's written by a person who's not quite clear on what "X percent heritable" means. Apart from that, making up numbers like this for rhetorical purposes and treating them as if they are factual is bad form.
The right answer to the nature vs. nurture debate isn't "it's 50-50" but: that's a bad question and a bad frame for understanding reality.
Instead of debating nature vs. nurture, one should look at the empirical findings we have and build up a view of the world based on them.
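For what it's worth, the standard technical meaning of "X percent heritable" is about variance in a population, not about any individual: narrow-sense heritability is Var(genetic)/Var(phenotype). A toy simulation under a purely additive model (all numbers invented for illustration) makes the distinction concrete:

```python
import random

random.seed(0)

# Toy additive model: phenotype = genetic contribution + environmental noise.
# Both components are drawn with variance ~1, so heritability should be ~0.5.
n = 100_000
genetic = [random.gauss(0, 1) for _ in range(n)]
environment = [random.gauss(0, 1) for _ in range(n)]
phenotype = [g + e for g, e in zip(genetic, environment)]

def variance(xs):
    mean = sum(xs) / len(xs)
    return sum((x - mean) ** 2 for x in xs) / len(xs)

# Heritability h^2 = Var(G) / Var(P): the share of population-level
# phenotypic variance attributable to genetic variance.
h2 = variance(genetic) / variance(phenotype)
print(round(h2, 2))
```

Note that h² is a property of this particular population in this particular range of environments; change the environmental variance and the "heritability" changes even though the genetics haven't, which is part of why the 50-50 framing misleads.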
I'm afraid that's the case.
Alright, a different angle then. If we did find some academic feminists or gender studies researchers who were willing to engage in good-faith, serious discussion without trying to be activist or throwing around accusations of -isms or -phobias, would you object to their presence in the community? The hostility you've shown towards an entire field is something I find deeply concerning.
Perhaps you and I just have fundamentally different approaches towards outgroups since I honestly cannot think of a single group I would treat the way you've been treating feminists in this discussion.
New age pagans, reactionaries, anarchists, neoliberals, small-c-conservatives, and even the alt-right; I consider these to be among my outgroups and I could make major criticisms of their core philosophies as well as how they generally conduct themselves in discourse. But if a member of any one of them actually wanted to engage me in a real discussion in good faith I would take them up on it (time permitting, of course) and if they brought up evidence I had overlooked or perspectives I hadn't considered then I would gladly update my views in response.
It...
I contest that those are not actually claims made by sociologists. Or if they are, they are minority opinions (in which case there would be other sociologists debunking them).
As a test, if you provide links to sociologists (or academic feminists/gender studies researchers) making each of those claims I will try to find others within the same field arguing against them.
Aside from what's already here, I can think of a few "character profiles" of fields that would benefit from LessWrong infrastructure:
I don't think the bottleneck is lack of recruitment though, the problem is that content has no place to go. As you rightly point out, things that aren't interesting to the general LW audience get crickets. I have unusual things I really want to show on LessWrong that are on their 5th rewrite because I have to cross so many inferential gaps and somehow make stuff LW doesn't care about appealing enough to stay on the front page.
I was curious what fields of science were underrepresented here, so I googled "List of sciences" and this Wikipedia article comes up.
It lists the formal sciences as: decision theory, logic, mathematics, statistics, systems theory, and computer science. All of those are well represented on LessWrong, with the exception of systems theory.
Of the natural sciences, it lists physics, chemistry, Earth science, ecology, oceanography, geography, meteorology, astronomy, and biology. While physics and biology are talked about quite a bit here, pretty much none of the other natural sciences are.
LessWrong is about learning rationality, and applying rationality to interesting problems.
An issue is that solving interesting problems often requires fairly deep technical knowledge of a field. To use rationality to help solve problems (especially as a group), you need both people who have skills in probability/meta-cognition/other rationality skills, and people with the skills directly applicable to whatever problem is under discussion.
But if you show up on LW and post something technical (or even just "specialized") in a field that isn't already well represented on the forum, it'll be hard to have meaningful conversations about it.
Elsewhere on the internet there are probably forums focused on whatever-your-specialization is, but those places won't necessarily have people who know how to integrate evidence and think probabilistically in confusing domains.
So far the LW userbase has a cluster of skills related to AI alignment, some cognitive science, decision theory, etc. If a technical post isn't in one of those fields, you'll probably get better reception if it's somehow "generalist technical" (i.e. in some field that's relevant to a bunch of other fields), or if it somehow starts one inferential unit away from the overall LW userbase.
A plausibly good strategy is to try to recruit a number of people from a given field at once, to try to increase the surface area of "serious" conversations that can happen here.
It might make most sense to recruit from fields that are close enough to the existing vaguely-defined-LW memeplex that they can also get value from existing conversations here.
Anyone have ideas on where to do outreach in this vein? (Separately, perhaps: how to do outreach in this vein?). Or, alternately, anyone have a vague-feeling-of-doom about this entire approach and have alternate suggestions or reasons not to try?