LessWrong is about learning rationality, and applying rationality to interesting problems.
An issue is that solving interesting problems often requires fairly deep technical knowledge of a field. To use rationality to help solve problems (especially as a group), you need both rationality skills (probability, meta-cognition, and so on) and the skills directly applicable to whatever problem is under discussion.
But if you show up on LW and post something technical (or even just "specialized") in a field that isn't already well represented on the forum, it'll be hard to have meaningful conversations about it.
Elsewhere on the internet there are probably forums focused on whatever-your-specialization is, but those places won't necessarily have people who know how to integrate evidence and think probabilistically in confusing domains.
So far the LW userbase has a cluster of skills related to AI alignment, some cognitive science, decision theory, etc. If a technical post isn't in one of those fields, it'll probably get a better reception if it's somehow "generalist technical" (i.e. in some field that's relevant to a bunch of other fields), or if it somehow starts only one inferential unit away from the overall LW userbase.
A plausibly good strategy is to recruit a number of people from a given field at once, to increase the surface area of "serious" conversations that can happen here.
It might make most sense to recruit from fields that are close enough to the existing (vaguely defined) LW memeplex that newcomers can also get value from existing conversations here.
Anyone have ideas on where to do outreach in this vein? (Separately, perhaps: how to do outreach in this vein?) Or, alternatively, does anyone have a vague feeling of doom about this entire approach, and alternative suggestions or reasons not to try?
Biology-nerd LWer here (or ex-biology-nerd? I do programming as a job now, but still talk and think about bio as a fairly-high-investment hobby). BS in entomology. Disclaimer that I haven't done grad school or much research; I have just thought about doing it and talked with people who have.
I suspect one thing that might appeal to these sorts of people, and which we have a chance of being able to provide, is an interesting explanation, targeted at applied researchers, of the aspects of statistics most likely to be relevant to their fields, presented accessibly (semi-plain-language, highly visual, flow-chart/checklist, or similar).
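To gesture at the genre, here's a toy sketch in Python (the scenario, numbers, and framing are all my own invention, not from any real study): the kind of explainer that shows what a small-sample result does and doesn't license, using a Beta-Binomial posterior instead of a bare p-value.

```python
# Purely illustrative, invented example: reading a small pilot experiment
# probabilistically rather than as a binary "significant or not" verdict.
import numpy as np

rng = np.random.default_rng(0)
N = 100_000  # Monte Carlo samples from each posterior

# Hypothetical pilot experiment: 7/10 treated insects survive vs. 4/10 controls.
# With a uniform Beta(1, 1) prior, the posterior over each survival rate is
# Beta(1 + successes, 1 + failures).
treated = rng.beta(1 + 7, 1 + 3, size=N)
control = rng.beta(1 + 4, 1 + 6, size=N)

print(f"P(treatment > control) ~= {np.mean(treated > control):.2f}")
# Prints roughly 0.90: suggestive, but far from the near-certainty a naive
# reading of "7/10 vs. 4/10" might imply.
```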
ETA: Two places I can think of to find these people are research and conferences. There are Google search terms they're going to use a lot (due to research), and a lot of them will be interested in publishing and in conferences as ways to keep up with new research in their fields and further their careers.
Leaning towards the research funnel... here are some things I understand now that I did not understand when I graduated, many of which I got from talking/reading in this community, and which I think a "counterfactual researcher me" would have benefited from a lucid explanation of:
Things I think we've done that seem appealing from a researcher perspective include...
(...damn, is Scott really carrying the team here, or is this a perception filter and I just really like his blog?)
Small sample size, but I think people in the biology reference class bounce off of Eliezer's writing style more often than people in the programming reference class do (fairly typical "reads-as-arrogant" stuff; I didn't personally bounce off it, so I'm transmitting this secondhand). I don't think there's anything to be done about this; just sharing the impression. Personally, I've felt moments of annoyance with random LWers who really don't have an intuitive feel for the nuances of evolution, but Eliezer is actually one of the people who seems to have a really solid grasp of this particular topic.
(I've tended to like Eliezer's stuff on statistics, and I respected him pretty early on because he's one of the (minority of) people on here with a really solid grasp of what evolution is and isn't, and what it does and doesn't do. Respect for his understanding of a field of study I did understand rubbed off, by default, into respect for him in fields he understood better than I did (e.g. ML), at least until my knowledge caught up enough that I could reason about those on my own.)
((FWIW, I suspect people in finance might feel similarly about "Inadequate Equilibria," and I suspect they wouldn't be as turned off by the writing style. They're likely to be desirable recruits for other reasons: finance at its best is fast-turnaround and ruthlessly empirical, it's often programming or programming-adjacent, EA is essentially "charity for quantitatively minded people who think about black swans," and there's something of a cultural fit.))
Networking- and career-development-wise... quite frankly, I think we have some, but not a ton, to offer biologists directly. Maybe some EA grants for academics and future academics who are good at self-advocacy and open to moving. I've met maybe a dozen rationalists I could talk heavy bio with, over half of whom are now primarily in some other field. Whereas we have a ton to offer programmers, and at earlier stages of their careers.
(I say this partially from personal experience, although it's slightly out of date: I started my stay in the Berkeley rationalist community ~4 years ago with a biology-type degree and a strong interest in biorisk, virology in particular. I still switched into programming. There weren't many resources pointed towards early-career people in bio at the time (this may have changed; a group of bio-minded people including myself got a grant to host a group giving presentations on the topic, and we were recently able to get a grant to host a conference), and what did exist was pointed at getting people to go to grad school. Given that I had a distaste for academia and no intention of going to grad school, I eventually realized that the level of resources or support I could access around this at the time was effectively zero, so I did the rational thing and switched to something that pays well and plugs into a massive network of community support. And yes, I'm a tad bitter about this. But that's partially because I just had miscalibrated expectations, which I'm trying to help someone else avoid.)