- 7 weeks ago, I precommitted that censoring a post or comment on LessWrong would cause a 0.0001% increase in existential risk.
- Earlier today, Yudkowsky censored a post on LessWrong.
- 20 minutes later, existential risk increased by 0.0001% (to the best of my estimation).
I'm just some random lurker, but I'd be very interested in these articles. I share your view on cryonics, and I'd like to read more clarification on what you mean by "compartmentalization failure," along with some examples of rejecting the outside view.
Here's my view of the current LessWrong situation.
On compartmentalization failure and related issues, there are two schools of thought present on LessWrong:
Right now there doesn't seem to be any hope of reaching Aumann agreement...