Today's post, Your Strength as a Rationalist, was originally published on 11 August 2007. A summary (taken from the LW wiki):

A hypothesis that forbids nothing permits everything, and thereby fails to constrain anticipation. Your strength as a rationalist is your ability to be more confused by fiction than by reality. If you are equally good at explaining any outcome, you have zero knowledge.
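
The "zero knowledge" claim has a direct Bayesian reading; the following is a minimal sketch (a gloss, not part of the wiki summary), with $H$ standing for the hypothesis and $E$ for the observed outcome:

\[
\frac{P(H \mid E)}{P(\lnot H \mid E)} \;=\; \frac{P(E \mid H)}{P(E \mid \lnot H)} \cdot \frac{P(H)}{P(\lnot H)}
\]

If the hypothesis "explains" every possible outcome equally well, then $P(E \mid H) = P(E \mid \lnot H)$ for whatever $E$ actually occurs, the likelihood ratio is 1, and the posterior odds equal the prior odds: the observation provides no evidence either way.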

Discuss the post here (rather than in the comments to the original post).

This post is part of the Rerunning the Sequences series, in which we're going through Eliezer Yudkowsky's old posts in order, so that people who are interested can (re-)read and discuss them. The previous post was The Apocalypse Bet, and you can use the sequence_reruns tag or RSS feed to follow the rest of the series.

Sequence reruns are a community-driven effort. You can participate by re-reading the sequence post, discussing it here, posting the next day's sequence reruns post, or summarizing forthcoming articles on the wiki. Go here for more details, or to have meta discussions about the Rerunning the Sequences series.

3 comments:

Your strength as a rationalist is your ability to be more confused by fiction than by reality.

Does that lead to the conclusion that Newcomb's problem is irrelevant? Mind-reading aliens are pretty clearly fiction. Anyone who says otherwise is much more likely to be schizophrenic than to have actual information about mind-reading aliens.

[anonymous]:

I'm pretty sure Eliezer thinks this heuristic should be applied to events that occurred in the past, not ones that will occur in the future--it's a way of assessing whether a piece of evidence should be trusted or whether we should defy it. It's also a way of weeding out hypotheses that don't actually make experimental predictions. I don't think he's trying to say that we should ignore things that seem weird, particularly because he speaks out against the absurdity heuristic later on.

Does that lead to the conclusion that Newcomb's problem is irrelevant? Mind-reading aliens are pretty clearly fiction.

This reminds me of the very first comment on the Pascal's Mugging post.

Thought experiments are good "to ask how meaningful someone’s position on an issue would be if it were taken to its logical extreme".