
DanArmak comments on Computation complexity of AGI design - Less Wrong Discussion

Post author: Squark 02 February 2015 08:05PM (6 points)



Comment author: DanArmak 03 February 2015 11:22:49AM 0 points

As is well known, in scenarios with hard steps that are overcome anthropically, the hard steps are expected to be distributed approximately uniformly over the available timeline. This seems to conflict with the most intuitive location of the intelligence hard step: somewhere between chimp and human.

This is basically the Doomsday argument. Any of its proposed resolutions would also resolve this conflict, and would let us place a single hard (i.e. evolutionarily unlikely) step in our recent history.
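The uniformity claim quoted above can be checked with a quick simulation. This is a minimal sketch with illustrative parameters (not from the post): each hard step has an exponential waiting time much longer than the available window, and we condition on all steps happening in time. Under that conditioning, the step times behave roughly like uniform order statistics over the window, so the earliest of N steps lands near T / (N + 1) rather than near its unconditional mean.

```python
import random

# Monte Carlo sketch of the hard-steps uniformity result.
# All parameters here are illustrative assumptions, not taken from the post.
random.seed(0)

T = 1.0            # available window (normalized)
MEAN_WAIT = 10.0   # expected waiting time per hard step (>> T)
N_STEPS = 2
TRIALS = 1_000_000

first_step_times = []
for _ in range(TRIALS):
    t = 0.0
    times = []
    for _ in range(N_STEPS):
        # waiting time for the next hard step
        t += random.expovariate(1.0 / MEAN_WAIT)
        if t > T:
            break          # ran out of time: this history is discarded
        times.append(t)
    else:
        # all steps completed within the window: an "anthropic survivor"
        first_step_times.append(times[0])

# For uniform order statistics the earliest of N_STEPS points has mean
# T / (N_STEPS + 1) = 1/3 here, even though the unconditional mean
# waiting time for the first step is ~10.
mean_first = sum(first_step_times) / len(first_step_times)
print(f"successful runs: {len(first_step_times)}, "
      f"mean first-step time: {mean_first:.2f}")
```

Successful runs are rare (roughly (T / MEAN_WAIT)^N of all trials), which is exactly the observation-selection effect: among the rare histories where everything happened in time, the steps look evenly spread rather than bunched near where they were "likely".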

Comment author: Squark 05 February 2015 07:53:52AM 0 points

As I said in a reply to Mark Friedenbach, I somewhat regret the naive treatment of anthropics in this post. Hopefully I'll write a UDT-based analysis later.