A simple way to avoid the burden of empirical criticism is to frame a belief as a prediction.
Dragon slayer: At some point there will be a dragon in my garage, and you had better take that possibility seriously, because you might be eaten.
Dragon skeptic: Dragons are mythical creatures; I doubt that a dragon will ever appear in your garage.
Dragon slayer: Given what we know about physics, dragons are possible. And since being eaten is extremely negative, even if you assign a low probability to a dragon appearing in my garage, you should take that possibility seriously. Just think about your possible children, and your children's children, and their children. You have to save those people!
Dragon skeptic: Ummm, okay. To refine my estimates regarding your dragon: what do you anticipate will happen before the dragon appears in your garage? Is there any way to update on evidence before it arrives?
Dragon slayer: No, I don't know enough about dragons to be more specific about my prediction. I will know it when I see the dragon, though.
Dragon skeptic: Hmm. Could you then tell me what led you to believe that a dragon might appear in your garage, and why it would be dangerous?
Dragon slayer: We know that once upon a time giant reptiles roamed the earth. We also know that flamethrowers are technically feasible. Further, most giant animals do not care about human well-being, which makes them extremely dangerous. Well, okay, elephants and whales are not dangerous, but you can't reasonably expect most giant flame-throwing reptiles to be like that...
Dragon skeptic: Okay, let's assume such a thing is possible. Why would it appear in your garage? I mean, sure, evolution might result in something like a dragon at some point, but...
Dragon slayer: I didn't say it will happen tomorrow; I don't like to talk about time frames.
Title: [SEQ RERUN] Belief in Belief
Tags: sequence_reruns

Today's post, Belief in Belief, was originally published on July 29, 2007. A summary (taken from the LW wiki):
Discuss the post here (rather than in the comments to the original post).
This post is part of the Rerunning the Sequences series, where we'll be going through Eliezer Yudkowsky's old posts in order so that people who are interested can (re-)read and discuss them. The previous post was Making Beliefs Pay Rent (in Anticipated Experiences), and you can use the sequence_reruns tag or RSS feed to follow the rest of the series.
Sequence reruns are a community-driven effort. You can participate by re-reading the sequence post, discussing it here, posting the next day's sequence reruns post, or summarizing forthcoming articles on the wiki. Go here for more details, or to have meta discussions about the Rerunning the Sequences series.