Today's post, Science Doesn't Trust Your Rationality, was originally published on 14 May 2008. A summary (taken from the LW wiki):
The reason Science doesn't always agree with the exact, Bayesian, rational answer, is that Science doesn't trust you to be rational. It wants you to go out and gather overwhelming experimental evidence.
Discuss the post here (rather than in the comments to the original post).
This post is part of the Rerunning the Sequences series, where we'll be going through Eliezer Yudkowsky's old posts in order so that people who are interested can (re-)read and discuss them. The previous post was The Dilemma: Science or Bayes?, and you can use the sequence_reruns tag or rss feed to follow the rest of the series.
Sequence reruns are a community-driven effort. You can participate by re-reading the sequence post, discussing it here, posting the next day's sequence reruns post, or summarizing forthcoming articles on the wiki. Go here for more details, or to have meta discussions about the Rerunning the Sequences series.
How does one make sure that this "probability-theoretic calculation" is not a "different armchair reasoning"?
This seems like a safe assumption. On the other hand, trusting in your powers of Solomonoff induction and Bayesianism doesn't seem like one: what if you are bad at estimating priors and too unimaginative to account for all the likely alternatives?
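The worry about badly estimated priors can be made concrete with a toy Bayes'-theorem calculation (my own illustration, not from the original post; the numbers are made up): the same evidence yields very different posteriors depending on the prior you started with.

```python
def posterior(prior_h, likelihood_h, likelihood_not_h):
    """P(H | E) via Bayes' theorem for a binary hypothesis H vs not-H."""
    joint_h = prior_h * likelihood_h
    joint_not_h = (1.0 - prior_h) * likelihood_not_h
    return joint_h / (joint_h + joint_not_h)

# Identical evidence (a 4:1 likelihood ratio favouring H), two different priors:
even_prior = posterior(0.5, 0.8, 0.2)    # -> 0.8
skewed_prior = posterior(0.01, 0.8, 0.2) # -> ~0.039
```

With an even prior the evidence pushes you to 80% confidence; with a prior of 1% against H, the very same evidence leaves you under 4%. If your prior was badly estimated in the first place, the "exact, rational answer" inherits the error, which is one reading of why Science demands overwhelming evidence instead.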
Again, a straw-collapse. No one believes in faster-than-light quantum "collapse", except perhaps some philosophers of physics.
Totally agreed. The thing is, Solomonoff induction is in general incomputable; how much more reason do you need not to trust yourself to carry it out correctly? Clearly you can't have a process that relies on correctly computing incomputable things. I'm becoming increasingly convinced, whether via confirmation bias or via proper updates, that Eliezer skipped a great many fundamentals.