I have to add that there is (informally) an even smaller purple team, which thinks that climate change could happen sooner and in a more violent form, like runaway global warming. The idea has similarities with the idea of self-improving AI, as in both cases an unstoppable process with positive feedback results in human extinction in the 21st century.

Personally I think people often do write rigorous criticisms of various points of rationality and EA consensus. It's not an under-debated topic. Maybe some of the very deep assumptions are less debated, e.g. some of the basic assumptions of humanism. But I think that's just because no one finds them faulty.

I thought that's what Lumifer is doing ;)

Sigh. It used to be called science. Just science. But in our enlightened age, in order to do plain-vanilla science, you need to reframe it as a war between the Blues and the Reds?

Point taken, and I agree.

Edit: generalizing, I think it should be said that rather than needing a big red team, there is really no room for a blue team. Everyone should be red teaming their own beliefs and generally accepted "truths." That is part and parcel of what it means to be a rationalist. To practice rationality is to approach everything with a "red team" mindset.