AI content for specialists
There is a lot of AI content on the site recently, and much of it requires specialized technical knowledge that I (an ordinary software developer) do not have. Similarly, articles on decision theory are often written in a way that assumes a lot of background knowledge I lack. As a result, there are many articles I don't even click on, and if I accidentally do, I just sigh and close them.
This is not necessarily a bad thing. As a field develops, inferential distances increase. So maybe, as a community, we are developing a new science, and I simply cannot keep up with it. -- Or maybe it is all crackpottery; I wouldn't know. (Would you? Are some of us upvoting content we are not sure about, just because we assume it must be important? That could go horribly wrong.) This is a bit of a problem for me, because I can no longer recommend Less Wrong in good faith as a source of rational thinking. Not because I see obviously wrong things, but because there are many things where I have no idea whether they are right or wrong.
We have had AI content and decision theory here since the beginning. But the articles Eliezer wrote back then were quite easy to understand, at least for me. For example, "How An Algorithm Feels From Inside" requires nothing beyond high-school knowledge. Compare that to "Hypothesis: gradient descent prefers general circuits". Probably something important, but I simply do not understand it.
Just as MIRI and CFAR historically split into two organizations, maybe Less Wrong should too.
Feeling of losing momentum
I miss the feeling that something important is happening right now (and that I can be a part of it). Perhaps it was just an illusion, but in the first years of Less Wrong it felt like we were doing something important -- building the rationalist community, inventing the art of everyday rationality, with the prospect of raising the general sanity waterline.
It seems to me that we gave up on the sanity waterline first. AI is near; we need to focus on the people who will make a difference (whom we could recruit for AI research); there is no time to care about the general population.
Recently, though, this baton was picked up by the Rational Animations team!
Is the rationalist community still growing? Offline, I guess it depends on the country. In Bratislava, where I live, it seems that ~ no one cares about rationality. Or effective altruism. Or Astral Codex Ten. Having five people at a meetup is a big success. Nearby Vienna is doing better, but it is merely climbing back to pre-COVID levels, not growing. Perhaps it is better in other parts of the world.
Online, new people are still coming. Good.
Also, big thanks to all people who keep this website running.
But it no longer feels like I am here to change the world. It is just another form of procrastination, albeit a very pleasant one. (Maybe that is because I do not understand the latest AI and decision theory articles; maybe all the exciting things are happening there.)
Etc.
Some dialogs were interesting, but most are meh.
My greatest personal pet peeve was solved: people no longer talk uncritically about Buddhism and meditation. (Instead of talking about them more critically, they simply stopped talking about them at all. Works for me, although I had hoped for some rational conclusion.)
It is difficult for me to disentangle what happens in the rationalist community from what happens in my personal life. Since I have kids, I have less free time. If I had more free time, I would probably be recruiting for the local rationality (+adjacent) community, spending more time with other rationalists, maybe even writing some articles... so it is possible that my overall impression would be quite different.
(Probably forgot something; I may add some points later.)
Sadly, LW isn't a community that I would say I am a part of. I say that begrudgingly, as LW seems to have been, and still is, 'a decent place on the internet'.
The issue with being decent is that it doesn't work long term, at least not for me.
Why did other people leave LW before? I'm not sure. Why do I want to leave? And what drew me here in the first place?
I came here seeking people with integrity, people who think outside the box, who are highly intelligent and willing both to pursue their individuality and to take and give feedback among peers, with the intention of getting help, but also of supporting the growth of my own and others' rationality/general intelligence/EQ/bigger goals, in a congruous, open-ended, honest, sincere and cooperative environment.
Taking ideas and concepts to their logical conclusion is something I care about, and I was hoping to find a community that is Congruous and Coherent according to its own explicit ideas and values, with enough discernment to make it work. This is a tall order perhaps, but when I found this place, I was hoping it was closer to that ideal.
From what I've seen, there might be a slightly higher concentration of the kinds of people I'm looking for here, but on the other hand, there is a wide gulf between what those people want and need in order to thrive, and the kind of environment LW provides.
I'm not the most articulate in writing, but I wrote about this gulf, of who LW is for, in some comments, and also in a post called "The LW crossroads of purpose".
And I see it as a very pressing matter: not only because laissez-faire seems to ruin subcultures, but because there are so many places on the internet where your average Joe can go, and so few where those who crave high-end personal, rational and emotional development can actually get support, and support each other.
A place where integrity, respect and cooperation are fundamental practices, and where things aren't solved through "democracy" but by finding the best way forward. A place that supports the creation of the very good/best, and not the decently/average+ good.
I don't know whether those who 'left' LW went somewhere more coherent in this regard. Substack seems to be one destination, but is there a 'community' out there waiting? Not that I am aware of. Which is why I would rather write this: on the off chance that this idea gets traction and LW develops a "serious dojo" for rationality, with a high bar to entry, in a high-trust environment that grows organically and slowly, I'll at least hear of it, and might even want to join.
I wouldn't even mind if it had a subscription fee of sorts, and some of the members got paid. Why sweat the small stuff.
For now, I'll stay in the shadows, and maybe look at older posts to see who was here before. Maybe some of them are people I want to talk to.
Kindly,
Caerulea-Lawrence