So many people seem eager to rush to sell their souls, without first checking to see if the Devil’s willing to fulfill his end of the bargain.
In cases like this I assume the point is to prove one's willingness to make the hard choice, not to be effective (possibly to the extent of being ineffective on purpose). This can be just proving it to oneself, out of fear of being the kind of person who's not able to make the hard choice — if I'm not in favor of torturing the terrorist, that might be because I'm squeamish (= weak, or limited in what I can think (= unsafe)), so I'd better favor doing it without thought of whether it's a good idea.
Plus
the media runs with exactly the sort of story you'd expect it to run with
I haven't closely followed it, but my impression is that media coverage has been less unfavorable than one might have predicted / than this suggests, tending to recognize the Zizians as a rejected weird fringe phenomenon and sympathetically quote rationalists saying so.
I don't know best practice here at all, but putting the overall target actually seems reasonable to me; Lightcone did so using $1M (their first goal) and Oli might have thoughts.
Our old workshop had a hyper/agitated/ungrounded energy running through it: “do X and you can be cool and rational like HPMOR!Harry”; “do X and you can maybe help with whether we’ll all die.”
This also seems like an important factor in making it easier for alumni to get pulled into cults — upvoting an urgency/desperation to Fix Something ⇒ finding more appeal in questionable exotic offers of power. (Not unique to CFAR, of course — that urgency/desperation is a deeper thread in rationalist culture + something that people might come in with from the start — but I would think CFAR / this energy acted as a vector for it.)
... actually, the rest of that reply is a good comment on "ambiguous impact on health":
That said, I think that the rationality project broadly construed has often fallen into a failure mode of trying to do radically ambitious stuff without first solidly mastering the boring, bog-standard basics. This led us to often undershoot not just our ambitions but also those more boring baselines.
We aimed to be faster than science. But, in practice, I think we often didn't meet the epistemic standards of a reasonably healthy scientific subfield.
If I invest substantial effort in rationality development in the future, I intend to first focus on doing the basics really well before trying for superhuman rationality.
Adele argued recently that a rationality curriculum worthy of the name would leave folks less vulnerable to psychosis, and that many current rationalists (CFAR alums and otherwise) are appallingly vulnerable to psychosis. After thinking about it some, I agree.
I want to quote (and endorse, and claim as important) the start of @Eli Tyre's reply at the time:
For what it's worth, I think this is directionally correct, and important, but I don't necessarily buy it as worded.
Sometimes advanced techniques / tools allow power users to do more than they otherwise would be able to, but also break basic-level stuff for less advanced users. Some people are able to get a lot more out of their computers with a Linux install, but for most people, trying to use and work with Linux can totally interfere with pretty basic stuff that "just worked" when using Windows, or (if you do it wrong) just break your machine, without their having the tools to fix it.
It's correspondingly not that surprising to me if power tools for making big changes to people's epistemologies sometimes have the effect of making some people worse at the basics. (Though obviously, if this is the case, a huge priority needs to be attending to and mitigating this dynamic.)
Since I didn't know this until a few days ago: the maximum deduction for state and local taxes is much higher this year ($40K, up from $10K), so more people's itemized deductions will exceed the standard deduction even before donating anything.
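The arithmetic can be sketched roughly as follows (figures are assumed placeholders for illustration, not tax advice; `STANDARD_DEDUCTION` in particular is a made-up round number, and real returns involve many more line items):

```python
# Hypothetical sketch: with the state-and-local-tax (SALT) cap raised from
# $10K to $40K, a filer with high state/local taxes can beat the standard
# deduction before making any charitable donations at all.

STANDARD_DEDUCTION = 30_000  # assumed placeholder figure, not an official amount

def itemized_total(state_local_taxes, salt_cap, mortgage_interest=0, donations=0):
    """Sum of itemized deductions, with state/local taxes capped at salt_cap."""
    return min(state_local_taxes, salt_cap) + mortgage_interest + donations

# A filer with $45K in state/local taxes and no other deductions:
old = itemized_total(45_000, salt_cap=10_000)  # capped at 10,000
new = itemized_total(45_000, salt_cap=40_000)  # capped at 40,000

# Under the old cap, itemizing (10,000) loses to the standard deduction;
# under the new cap, itemizing (40,000) wins before any donations,
# so each marginal donated dollar is deductible.
print(old, new, new > STANDARD_DEDUCTION)
```

Once itemizing already beats the standard deduction, donations reduce taxable income dollar for dollar, which is the point of the original comment.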
Homeopathy is (very strongly) antipredicted by science as we understand it, not just not-predicted.
Also, how many psychological techniques or informal theories are actively predicted to work by mainstream scientific theory? How much of folk psychology or social common sense is? (This isn't to say that there's no epistemic difference between e.g. Focusing and folk psychology, obviously one has much more unscientific validation, but "this doesn't follow from science as we understand it" doesn't match usage or practice.)
(I care about this discussion but feel a little bad about having it near the top of the comments section of an unrelated post.)
Some people seemingly just have a detail-less prior that the USG is more powerful than anyone else in whatever domain (e.g. has broken all encryption algorithms); without further information I'd assume this is more of that.