http://vimeo.com/22099396
What do people think of this, from a Bayesian perspective?
It is a talk given to the Oxford Transhumanists; their previous speaker was Eliezer Yudkowsky. An audio version and past talks are available here: http://groupspaces.com/oxfordtranshumanists/pages/past-talks
I don't like attributing to people false ideas they never actually wrote. I think that's a recipe for disaster. Do you disagree?
I wasn't talking about silent corrections either.