I think Yudkowsky's analysis here isn't putting enough weight on the social aspects. "Science", as we know it, is a social process, in a way that Bayesian reasoning is not.
The point of science isn't to convince yourself -- it's to convince an audience of skeptical experts.
A large group of people with different backgrounds, experiences, and so on aren't going to agree on their priors. As a result, there won't be any one probability attached to a given idea: different readers have different background knowledge, and that can make a given hypothesis seem more or less believable.
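To make this concrete, here is a minimal sketch (the numbers are invented for illustration) of two readers applying Bayes' rule to the same evidence but starting from different priors, and so ending at different posteriors:

```python
def posterior(prior_h, p_e_given_h, p_e_given_not_h):
    """Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E)."""
    p_e = p_e_given_h * prior_h + p_e_given_not_h * (1 - prior_h)
    return p_e_given_h * prior_h / p_e

# Both readers see the same evidence, which favors H by 4:1...
likelihood_h, likelihood_not_h = 0.8, 0.2

# ...but they start from different priors, reflecting different
# background knowledge, so their posteriors disagree.
skeptic = posterior(0.05, likelihood_h, likelihood_not_h)   # ≈ 0.174
believer = posterior(0.50, likelihood_h, likelihood_not_h)  # = 0.8
```

Neither reader is making a mistake; the same update rule applied to different starting points simply yields different endpoints.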
(This isn't avoidable, even in principle. The Solomonoff prior of an idea is not uniquely defined, since it depends on the choice of encoding, and encodings of ideas aren't unique. You and the reviewers are not necessarily wrong to put different priors on an idea even if you are both using a Solomonoff prior. And the problem wouldn't go away even if you and the reviewers had identical knowledge, which you don't.)
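A toy illustration of the encoding-dependence (this is not a real Solomonoff prior, which is uncomputable; off-the-shelf compressors stand in for the choice of universal machine):

```python
import bz2
import zlib

# The same "hypothesis" gets different description lengths under
# different encoders, just as different universal machines assign
# different program lengths to the same idea.
hypothesis = ("all ravens are black; " * 4).encode()

len_zlib = len(zlib.compress(hypothesis))
len_bz2 = len(bz2.compress(hypothesis))

# A complexity prior of the form 2^(-description_length) therefore
# depends on which encoder you committed to, and neither choice is
# the uniquely correct one.
prior_zlib = 2.0 ** -len_zlib
prior_bz2 = 2.0 ** -len_bz2
```

The invariance theorem bounds how far two such priors can diverge (by a constant depending on the pair of machines), but it does not make them equal.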
Yudkowsky is right that this makes science much more cautious in updating than a pure Bayesian would be. But I think that's desirable in practice. There is a lot of value in having a scientific community that shares the same theoretical language and the same set of canonical examples, and it's expensive (in both human time and money) to retrain a lot of people. Societies cannot change their minds as quickly or easily as their members can, so it makes sense to move more slowly as long as the previous theory is still useful.
Another issue is that the process should be difficult to subvert, whether maliciously or non-maliciously (by rationalization of erroneous belief). That requirement produces a boatload of features that may be frustrating to anyone wanting to introduce unjustified, untestable propositions for fun and profit (or to justify erroneous beliefs).
Today's post, Science Doesn't Trust Your Rationality, was originally published on 14 May 2008. A summary (taken from the LW wiki):
Discuss the post here (rather than in the comments to the original post).
This post is part of the Rerunning the Sequences series, where we'll be going through Eliezer Yudkowsky's old posts in order so that people who are interested can (re-)read and discuss them. The previous post was The Dilemma: Science or Bayes?, and you can use the sequence_reruns tag or RSS feed to follow the rest of the series.
Sequence reruns are a community-driven effort. You can participate by re-reading the sequence post, discussing it here, posting the next day's sequence reruns post, or summarizing forthcoming articles on the wiki. Go here for more details, or to have meta discussions about the Rerunning the Sequences series.