RobbBB comments on A forum for researchers to publicly discuss safety issues in advanced AI - Less Wrong

Post author: RobbBB 13 December 2014 12:33AM


Comment author: RobbBB 14 December 2014 07:43:35PM * 3 points

> So who, in (contemporary, analytical) philosophy talks about true essences?

How is that relevant?

> But that's inefficient. It's wasted effort to quantify what doesn't work conceptually. It may be impossible to always get the conceptual stage right first time, but one can adopt a policy of getting it as firm as possible...

Writing your intuitions up in a formal, precise way can often help you better understand what they are, and whether they're coherent. It's a good way to inspire new ideas and spot counter-intuitive relationships between old ones, and it's also a good way to do a sanity check on an entire framework. So I don't think steering clear of math and logic notation is a particularly good way to enhance the quality of philosophical thought; I think using formal tools to quickly test your ideas' coherence and univocality is frequently more efficient.

Comment author: TheAncientGeek 14 December 2014 08:57:58PM 0 points

> So who, in (contemporary, analytical) philosophy talks about true essences?
>
> How is that relevant?

It's relevant to my preference for factually based critique.

> Writing your intuitions up in a formal, precise way can often help you better understand what they are, and whether they're coherent.

Indeed. I was talking about quantification, not formalisation.

Comment author: RobbBB 15 December 2014 10:35:23AM 2 points

'Formalization' and mathematical logic are closer to what MIRI has in mind when it says 'mathematics'. See http://intelligence.org/research-guide.