- 7 weeks ago, I precommitted that censoring a post or comment on LessWrong would cause a 0.0001% increase in existential risk.
- Earlier today, Yudkowsky censored a post on LessWrong.
- 20 minutes later, existential risk increased by 0.0001% (by my best estimate).
In that case, they should present their evidence and/or a strong argument for it, rather than attempting to blackmail moderators.
I actually said explicitly what Oscar said in the discussion of the precommitment.
I also posted my reasoning for it.
Both are in the "precommitted" link in my article.