
cousin_it comments on Philosophy that can be "taken seriously by computer scientists" - Less Wrong Discussion

12 points · Post author: lukeprog · 27 December 2011 02:39AM


Comments (15)


Comment author: cousin_it 27 December 2011 04:08:52PM *  6 points

The manifesto has a nice paragraph where Glymour lists the contributions of many mathematical philosophers. This might be relevant to UDT:

Philosophers and statisticians alike want to posit probabilities over sentences, but how would that work with a language adequate to science and mathematics, say first order logic? Haim Gaifman told us, and worked out the implications for what is and what is not learnable.

Comment author: lukeprog 27 December 2011 04:34:28PM *  6 points

Yup. This is why I was so surprised in January 2011 that Less Wrong had never before mentioned formal philosophy, which is the branch of philosophy most relevant to the open research problems of Friendly AI. See, for example, Self-Reference and the Acyclicity of Rational Choice or Reasoning with Bounded Resources and Assigning Probabilities to Arithmetical Statements.

Comment author: cousin_it 27 December 2011 05:10:27PM *  2 points

Thanks for the links. I just read those two papers and they don't seem to be saying anything new to me :-(

Comment author: JonathanLivengood 29 December 2011 01:22:34AM 1 point

In your linked piece, you were talking about formal epistemology. Here you say "formal philosophy." Is that a typo, or do you think that formal epistemology exhausts formal philosophy? (I would hope not the latter, since lots of formal work gets done in philosophy outside epistemology!)

Comment author: lukeprog 29 December 2011 04:25:17AM 1 point

Formal epistemology is a subfield within formal philosophy, probably the largest.

Comment author: JonathanLivengood 29 December 2011 06:13:42AM 0 points

Larger than logic? Hmm ... maybe you're thinking about "formal philosophy" in a way that I am unfamiliar with.

Comment author: Will_Newsome 28 December 2011 08:04:49PM *  2 points

This is pretty much unrelated, but do you think maybe you could write a short post about the relevance of algorithmic probability to human rationality? There's a really common error 'round these parts where people say a hypothesis (e.g. God, psi, etc.) is a priori unlikely because it is a "complex" hypothesis according to the universal prior. Obviously the universal prior says no such thing; people are just taking whatever cached category of hypotheses they think is more probable for other, unmentioned reasons and then labeling that category "simple", which might have to do with coding theory but has nothing to do with algorithmic probability. Since this appeal to simplicity is one of the most common attempted argument-stoppers, discouraging the error might benefit the local sanity waterline. Fewer "priors", more evidence.

ETA: I feel obliged to say that though algorithmic probability isn't that useful for describing humans' epistemic states, it's very useful for talking about FAI ideas; among other things, it's a tool for transforming indexical information about observations into logical information about programs, and also proofs, thanks to the Curry–Howard isomorphism, which is pretty cool.
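[Editor's note: the point about the universal prior above can be made concrete with a toy sketch. The real Solomonoff prior is uncomputable; this toy version just weights each hypothesis by 2^(−L), where L is the length in bits of its shortest description in some fixed language. The hypothesis names and bitstring encodings below are purely hypothetical stand-ins; the point is that the prior depends on exhibited description lengths, not on an intuitive "simple"/"complex" label.]

```python
def toy_universal_prior(descriptions):
    """Toy stand-in for a length-weighted prior (NOT the real, uncomputable
    universal prior). `descriptions` maps each hypothesis name to a list of
    candidate bitstring encodings in some fixed description language.
    Each hypothesis gets unnormalized weight 2^(-L), where L is the length
    of its shortest given encoding; weights are then normalized to sum to 1."""
    weights = {
        h: 2.0 ** -min(len(d) for d in encs)
        for h, encs in descriptions.items()
    }
    total = sum(weights.values())
    return {h: w / total for h, w in weights.items()}

# Two hypothetical hypotheses with made-up shortest encodings.
# Note the prior favors H1 only because we actually exhibited a shorter
# description for it -- not because of any informal "simplicity" label.
prior = toy_universal_prior({
    "H1": ["0101"],        # shortest description: 4 bits -> weight 2^-4
    "H2": ["01010101"],    # shortest description: 8 bits -> weight 2^-8
})
```

Under these assumed encodings, H1 gets 16/17 of the prior mass and H2 gets 1/17; relabeling H2 "simple" changes nothing unless a shorter program for it is actually produced.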

Comment author: cousin_it 28 December 2011 08:43:06PM *  3 points

I already have a post about that. Unfortunately I screwed up the terminology and was rightly called on it, but the point of the post is still valid.

Comment author: Will_Newsome 28 December 2011 08:52:44PM 2 points

Thanks. I actually found your amendment more enlightening. Props again for your focus on the technical aspects of rationality; stuff like that is the saving grace of LW.