
skeptical_lurker comments on The Unique Games Conjecture and FAI: A Troubling Obstacle - Less Wrong Discussion

0 Post author: 27chaos 20 January 2015 09:46PM




Comment author: skeptical_lurker 21 January 2015 10:16:24PM

I don't think the difficulties in adapting our moral intuitions into a self-consistent formal system (e.g. the Repugnant Conclusion) are a problem of insufficient optimisation power per se; it's more that there are multiple systems within the brain (morality, intuitions, logic) and these are not operating in perfect harmony. This doesn't mean that each individual system isn't working well in its own way.

Surely designing an engine, or writing a novel, or (... you get the gist) are complex optimisation problems with many constraints, and yet it seems that humans can at least approximately solve these problems far faster than brute-force trial and error would.
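To make "approximately solve far faster than brute force" concrete, here is a minimal sketch (my own illustration, not anything from the post) of the classic local-search 1/2-approximation for Max-Cut, a problem whose approximability is exactly what the UGC constrains. Brute force over 2^n partitions is exponential; this local search runs in polynomial time and still guarantees a cut containing at least half the edges:

```python
import random

def local_search_max_cut(edges, n, seed=0):
    """Greedy local search for Max-Cut: start from a random partition and
    repeatedly flip any vertex whose move to the other side increases the
    cut, until no single flip helps. A local optimum cuts >= |E|/2 edges."""
    rng = random.Random(seed)
    side = [rng.choice([0, 1]) for _ in range(n)]  # which side each vertex is on

    def cut_size():
        return sum(1 for u, v in edges if side[u] != side[v])

    improved = True
    while improved:
        improved = False
        for v in range(n):
            # Edges at v that cross the cut vs. edges that stay inside one side.
            cross = sum(1 for a, b in edges if v in (a, b) and side[a] != side[b])
            inside = sum(1 for a, b in edges if v in (a, b) and side[a] == side[b])
            if inside > cross:  # flipping v gains (inside - cross) cut edges
                side[v] ^= 1
                improved = True
    return side, cut_size()

# Toy instance: a 4-cycle. The guarantee promises a cut of at least 2 edges.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
side, value = local_search_max_cut(edges, 4)
```

Each flip strictly increases the cut, so the loop terminates, and at a local optimum every vertex has at least half its edges crossing, giving the |E|/2 bound. The UGC's (conditional) claim is only that beating the best known polynomial-time approximation ratio for problems like this is hard, not that cheap approximations like the above stop working.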

AFAICT the UGC says that there exist problems which are intractable (even to approximate well), not that such problems are actually common. It seems to me like Gödel's incompleteness theorem: there are statements which are true but which cannot be proved, but this doesn't mean that no statement can be proved, or that mathematics is pointless. At the end of the day, regardless of whether the fundamental underpinnings of mathematics are on shaky ground, or whether there are unprovable theorems and unsolvable problems, the actual mathematics that allows us to build aeroplanes works, and the planes fly.