I'm an admin of LessWrong. Here are a few things about me.
Randomly: If you ever want to talk to me for an hour about anything you like, I am happy to be paid $1k to do that.
FWIW I think it's pretty common for organizers of an event to feel exhausted and beleaguered by the attendees. It's pretty tiring; attendees make tons of requests, and sometimes they act kind of counter to the spirit of the event.
+4. This 2024 post stuck with me distinctly, and often when I mention it to people they have already read it. I think that during Covid, challenge trials seemed like a mystical thing from fantasy novels that would happen in dath ilan or some superior civilization. So it was a pleasant surprise to read an actual account of one from someone I already know, especially from a biologist who could teach me about this basic disease as well.
+4. In some regards it's sad to have to rehash this argument, but it has been going around in the public discourse, so it's worthwhile to write up a thorough account of what's naive about it and how to move past it. My sense is that it has become less prevalent since the essay; perhaps the essay helped.
Many folks have a distaste for Eliezer's style, or for his perhaps implying that a weak-man argument is fully representative of the positions he disagrees with; I think some of these criticisms are valid, but they don't mean the essay isn't (a) pretty right in many places and well-written, or (b) the best existing rebuttal to this perspective that I'm aware of. If anyone has a better one to link to then please do, and I'll reduce my vote for this to +1!
+4. I recall learning about dimensional analysis as a teenager, and it's still a basic element of my thinking, though I should practice it more. Fermi estimates too. Anyway, this is a fantastic little explainer and a fun introduction to these fundamental ways of thinking; I'm pretty confident it should make the top 50 list for the year.
And, man, I wish LessWrong caused me to practice doing these sorts of arithmetic more often.
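(For instance, here's the kind of quick estimate I have in mind: a minimal sketch in Python, where the numbers are rough assumptions of mine rather than anything from the post.)

```python
# Toy Fermi estimate: roughly how many words does a near-daily blogger write in a year?
# All inputs are rough assumptions for illustration only.

posts_per_year = 300      # posts/year: roughly daily, minus some days off
words_per_post = 700      # words/post: a typical shortish blog post

words_per_year = posts_per_year * words_per_post  # (posts/year) * (words/post) = words/year
print(f"~{words_per_year:,} words per year")      # ~210,000 words per year

# The dimensional-analysis check: the "post" units cancel,
# leaving words/year, which is the quantity we actually wanted.
```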
FWIW the prestigious Clarion West writers' workshop gives out a lot of scholarships; I think this is pretty normal for writers' events:
> The tuition for the 2026 Six-Week Workshop is $3,600. We’re committed to helping our accepted students get to the workshop, and we do our best to meet the financial needs of everyone who requests tuition assistance. In recent years, up to 94% of our class has received scholarship aid.
My process for financial aid was essentially to accept everyone whom we clearly wanted based on their writing, and then figure out what they could afford. (To be clear, I was unwilling to offer anyone a full scholarship; everyone had to pay something, and I believe I understood their financial situations well enough to be confident they were genuinely paying what they could afford: I understand the finances of students, of people in certain industries, of people between jobs, etc.) Then we looked at the marginal cases and selected in favor of those who were able to pay full or near-full price. Looking through my initial votes on people now, I don't see anyone I thought was marginal who didn't pay the full price or at least >50% of it. And to be clear, the marginal cost of such Inkhaven Residents isn't that high, as long as their contributions to the program are net positive.
My point being: I think there was little to be gained by being stricter on the margin, and much value to lose in terms of interesting and valuable writing from those who wouldn't have joined the cohort.
Splendid post. I am generally pro CFAR being alive; I was also pleased to read that the new workshops will still be roughly two-thirds good content from the original workshops rather than exclusively new and experimental stuff, which makes me more confident in encouraging people to go (i.e. the floor on the experience will still be quite good). Many things in this post seem to me to accurately address pathologies in CFAR 1.0; here's to CFAR 2.0 having even more success in developing an art of human rationality than CFAR 1.0 did.
> Ways to help CFAR or to connect to CFAR besides donating:
> [...]
> - Book our venue (or help a friend realize they’d enjoy booking the venue, if they would)
I'll also let people know that I had a great experience renting the CFAR venue in Bodega Bay. I took the Inkhaven residents there for a weekend off-site, and the participants rated it really highly. It was a great bonding experience to all be together in a single house, we had nice daily walks to the ocean, and Jack & Sunny were lovely hosts with their two adorable kittens. Endorsed as a good getaway space for up to ~50 people.
> (And we do still need money to be viable, because being a custodian of such a community requires staff time and money for food/lodging/staff flights/etc.)
As a minor issue, I think I'm failing to understand this parenthetical. I already believe that many good non-profits need donations to survive and cannot sustain themselves fully on sales and revenue. But this read to me as though CFAR is justifying its need for funds primarily in terms of sustaining a community. Slightly earlier you wrote about the alumni community, which you felt was originally quite generative and then became lower quality, and which you'd like to breathe some life into again. But I don't think you mean to imply that the alumni community is the sole purpose of donations. What did you mean here?
I'm sorry to hear about your health/fatigue. That's a very unfortunate turn of events, for everyone really.
It’s actually been this way the whole time. When I first met Eliezer 10 years ago at a decision theory workshop at Cambridge University, I asked him over lunch what his AI timelines were; he promptly blew a raspberry as his answer and then fell asleep.
@dirk Anti-reacts aren't for disagreement; they're for "this is an inappropriate use of the react" (e.g. if someone writes "haha" on something that wasn't meant as a joke, or someone hits "typo" on something that is actually correctly spelled).
So please don't anti-react my "Plus One" react if you strongly disagree with it. You can just react to the claim with your epistemic state (as you have done with your disagree-react).
I'm not interested in making such a request for you to expand on it, but thanks for the offer. (I'm not asking you not to, to be clear.)
To respond to your point: you may be aware that there's a large class of Singerian EAs who pathologically self-guilt and take personal responsibility for the bad things in the world, and it was kind to some of them to point out what was believed to be a true argument for why responsibility did not fall on them here. I don't think the post is primarily explained by self-serving motivation; as evidence, you can see from the comments that Eliezer was perfectly open to evidence that he was mistaken (he encouraged Habryka to post their chat publicly, where Habryka gave counterevidence). So I think it's unfair to read poor intent into this, as opposed to genuine empathy/sympathy for people who are renowned for beating themselves up about things in the world that they are barely responsible for and have relatively little agency over.
I think I am a little confused by the fact that CFAR 2.0 is still doing workshops in roughly the same format as CFAR 1.0. If I were trying to build and teach an art of rationality from first principles, I would explore the space of formats more: teaching ongoing weekly courses, doing individual coaching, running clearly measurable experiments, or lots of other things as well as workshops. On the other hand, it makes sense to keep doing the things you've specced into, be that for good or for ill.
Anyway, I am most likely just missing something simple. I am interested to know: why are you doing roughly the same shape of thing?