
timtyler comments on Hanson Debating Yudkowsky, Jun 2011 - Less Wrong Discussion

Post author: XiXiDu, 03 July 2011 04:59PM




Comment author: timtyler, 04 July 2011 06:56:13AM (1 point)

There's something more basic about the analogy I object to. Google, Friendster, Altavista and Facebook were not founded at the beginning of a singularity.

What do you mean by that? Do you think Google isn't going to go on to develop machine intelligence? Surely they are among the front-runners, though admittedly there are some other players in the game. This is not a case of hanging around for some future event.

  • The first creatures to develop human-level intelligence came to dominate all other creatures.
  • The first humans to develop agriculture did not come to dominate all other humans.

The wording of the question was:

Compared to the farming and industrial revolutions, intelligence explosion first-movers will quickly control a much larger fraction of their new world.

It doesn't say the control is kept indefinitely. So, for instance, Sergey and Larry might die, but they will still have quickly come to control a larger fraction of the world than any farmer or industrialist did.