Will_Newsome comments on Intellectual Hipsters and Meta-Contrarianism - LessWrong

147 Post author: Yvain 13 September 2010 09:36PM

Comment author: Will_Newsome 18 September 2010 08:08:25AM 3 points

Existentially dangerous doesn't mean the benefits don't outweigh the costs. If there's a 95% chance that uFAI kills us all, that's still a whopping 5% chance at unfathomably large amounts of utility. Technology still ends up having been a good idea after all.

Each level adds necessary nuance. Unfortunately, each level is also a new chance for unnecessary nuance. Strong epistemic rationality is the only thing that can shoulder the weight of the burdensome details.

Added: Your epistemic rationality is limited by your epistemology. There's a whole bunch of pretty and convincing mathematics that says Bayesian epistemology is the Way. We trust in Bayes because we trust in that math: the math shoulders the weight. A question, then: when is Bayesianism not the ideal epistemology? For humans the answer is 'limited resources'. But what if you had unlimited resources? At the limit, where doesn't Bayes hold?
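For reference, the core of the math being trusted is just Bayes' rule. A minimal sketch, with illustrative numbers of my own choosing (the comment itself names no specific update):

```python
# Bayes' rule: P(H | E) = P(E | H) P(H) / P(E),
# where P(E) = P(E | H) P(H) + P(E | ~H) P(~H).
def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Return the posterior P(H | E) given the prior and both likelihoods."""
    numerator = p_e_given_h * prior
    evidence = numerator + p_e_given_not_h * (1 - prior)
    return numerator / evidence

# Illustrative example: a 50% prior, and evidence twice as likely
# under the hypothesis as under its negation.
posterior = bayes_update(0.5, 0.8, 0.4)
print(posterior)  # 0.666..., i.e. 2/3
```

The update itself is cheap; the 'limited resources' objection bites because real agents must maintain and update this over enormous hypothesis spaces, not because any single application of the rule is hard.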