Seems like the forces that drive people crazy are the same ones that lead people to do anything good and interesting at all. At least for EA, a core function of orgs/elites/high-status community members is to make the kind of signaling you describe highly correlated with actually doing good. Of course it seems impossible to make them correlate perfectly, and that's why settings with super high social optimization pressure (like FTX) are gonna be bad regardless.

But (again, for EA specifically) I suspect the forces you describe would actually be good to increase on the margin for people not living in Berkeley and/or in a group house, a group that is probably a majority of self-identified EAs but only a strong minority of the people-hours OP interacts with irl.