AI content for specialists
There has been a lot of AI content recently, and it is sometimes of the kind that requires specialized technical knowledge, which I (an ordinary software developer) do not have. Similarly, articles on decision theories are often written in a way that assumes a lot of background knowledge that I don't have. As a result, there are many articles I don't even click on, and if I accidentally do, I just sigh and close them.
This is not necessarily a bad thing. As something develops, inferential distances increase. So maybe, as a community, we are developing a new science, and I simply cannot keep up with it. -- Or maybe it is all crackpottery; I wouldn't know. (Would you? Are some of us upvoting content we are not sure about, just because we assume that it must be important? This could go horribly wrong.) Which is a bit of a problem for me, because now I can no longer recommend Less Wrong in good faith as a source of rational thinking. Not because I see obviously wrong things, but because there are many things where I have no idea whether they are right or wrong.
We have had AI content and decision theory here since the beginning. But those articles written back then by Eliezer were quite easy to understand, at least for me. For example, "How An Algorithm Feels From Inside" doesn't require anything beyond high-school knowledge. Compare it to "Hypothesis: gradient descent prefers general circuits". Probably something important, but I simply do not understand it.
Just like historically MIRI and CFAR split into two organizations, maybe Less Wrong should too.
Feeling of losing momentum
I miss the feeling that something important is happening right now (and that I can be a part of it). Perhaps it was just an illusion, but in the first years of Less Wrong it felt like we were doing something important -- building the rationalist community, inventing the art of everyday rationality, with the prospect of raising the general sanity waterline.
It seems to me that we gave up on the sanity waterline first. The AI is near, we need to focus on the people who will make a difference (whom we could recruit for AI research), and there is no time to care about the general population.
Although recently, this baton was taken over by the Rational Animations team!
Is the rationalist community still growing? Offline, I guess it depends on the country. In Bratislava, where I live, it seems that almost no one cares about rationality. Or effective altruism. Or Astral Codex Ten. Having five people at a meetup is a big success. Nearby Vienna is doing better, but it is merely climbing back to pre-COVID levels, not growing. Perhaps it is better in other parts of the world.
Online, new people are still coming. Good.
Also, big thanks to all people who keep this website running.
But it no longer feels like I am here to change the world. It is just another form of procrastination, albeit a very pleasant one. (Maybe that is because I do not understand the latest AI and decision theory articles; maybe all the exciting things are there.)
Etc.
Some dialogues were interesting, but most are meh.
My greatest personal pet peeve was solved: people no longer talk uncritically about Buddhism and meditation. (Instead of talking about them more critically, they just stopped talking about them at all. Works for me, although I hoped for some rational conclusion.)
It is difficult for me to disentangle what happens in the rationalist community from what happens in my personal life. Since I have kids, I have less free time. If I had more free time, I would probably be recruiting for the local rationality (+adjacent) community, spending more time with other rationalists, maybe even writing some articles... so it is possible that my overall impression would be quite different.
(Probably forgot something; I may add some points later.)
I feel pretty good about LessWrong. The amount of attention I give to LW tends to ebb and flow in phases. I'm currently in a phase where I've given it less attention (in large part due to the war in Israel), and now I'm probably going to enter a phase of giving it a lot of attention because of the 2022 review.
I think the team is doing a great job with the site, both in terms of features and moderation, and the site keeps getting better.
I do feel the warping effect of the AI topic on the site, and I'm ambivalent about it. On the one hand, I do think it's an important topic that should be discussed here; on the other hand, it floods out everything else (I've changed my filters to deal with it), and a lot of it is low quality. I also see and feel the pressure to make everything somehow related to AI, which again, I'm ambivalent about. On the one hand, if it's so significant and important, then it makes sense to connect many things to it; on the other hand, I'm not sure it does much good to the writing on the site.
I also wish the project to develop the art of rationality got more attention, as I think it is still important and there's a lot of progress to be made and work to be done. But I also wish that whatever attention it got were of higher quality - there are very few rationality posts from the last few years that were on the level of the old essays from Eliezer and Scott.
Perhaps the problem is that good writers just don't stay on LessWrong, preferring to go to their own platforms or to Twitter, where they can get more attention and make money from their writing. One idea I have to deal with that is to implement a gifting feature (with real money), perhaps using plural funding. I think it could incentivize people to write better things, and incentivize good writers to also post on LW. I definitely know it would motivate me, at least.
Another thing I would like, which would help deal with the fact that lots of writing that's relevant to LW isn't on LW, is to improve the way linkposts work. Currently, I come across a lot of writing that I want to share on LW, but it would be drowned out if I shared it in the open thread or a shortform post, and I don't want to share it as a linkpost because I don't want it to be displayed on my page as one of my posts (and drown out the rest of my posts). I also don't feel like I deserve all the karma it would get, so it feels a bit... dirty? Here's what I have in mind instead - have a clear distinction between crossposts and linkposts:
I think these two features would greatly help LW be a place where great writing can be found and discussed, and hopefully that disproportionately includes writing on the art of rationality.
My main aversion is that I don't want them to drown out my own posts on my user page.