
Thomas comments on Q&A with experts on risks from AI #1 - Less Wrong Discussion

Post author: XiXiDu 08 January 2012 11:46AM · 29 points


Comments (66)


Comment author: Thomas 09 January 2012 10:03:31AM 0 points

Those (three) people are not in the AI field, at least for my taste. But:

at least not until they get computers powerful enough to brute-force AGI by simulated evolution or some other method.

Why do you think present computers are not fast enough for a digital evolution of X?

Comment author: cousin_it 09 January 2012 12:55:15PM 1 point

A mind designed by evolution could be big and messy, about as complex as the human brain. Right now we have no computer powerful enough to simulate even a single human brain, and evolution requires many of those. Of course there are many possible shortcuts, but we don't seem to be there yet.

Comment author: Thomas 10 January 2012 09:45:51AM 1 point

The question really is: can a program with an evolutionary algorithm at its core do something better than a small elite of talented humans (with the help of computer programs) can?

The answer is yes: it can do it today, and it already does.
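The kind of evolutionary algorithm under discussion is simple to state: keep a population of candidate solutions, score them with a fitness function, and breed the best ones with mutation and crossover. A minimal, illustrative sketch (the OneMax toy problem — evolving a bitstring toward all ones; all names and parameter values here are arbitrary choices, not anyone's specific system):

```python
import random

random.seed(0)

GENOME_LEN = 32      # bits per candidate solution
POP_SIZE = 50
MUTATION_RATE = 0.02
GENERATIONS = 200

def fitness(genome):
    """Count of 1-bits: the classic OneMax toy objective."""
    return sum(genome)

def mutate(genome):
    # Flip each bit independently with small probability.
    return [bit ^ (random.random() < MUTATION_RATE) for bit in genome]

def crossover(a, b):
    # Single-point crossover between two parent genomes.
    point = random.randrange(1, GENOME_LEN)
    return a[:point] + b[point:]

# Start from random genomes and evolve toward the all-ones string.
population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]

for gen in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == GENOME_LEN:
        break
    parents = population[:POP_SIZE // 2]   # truncation selection (elitist)
    offspring = [
        mutate(crossover(random.choice(parents), random.choice(parents)))
        for _ in range(POP_SIZE - len(parents))
    ]
    population = parents + offspring

best = max(population, key=fitness)
print(fitness(best))
```

Real applications replace the toy fitness function with an expensive simulation (a circuit, an antenna, a program), which is where the CPU-time objection in the thread comes from: the loop is trivial, but each fitness evaluation can be costly.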

People here on this list are mostly dismissive of it as "stupid evolution anybody can do, but it's a waste of CPU time."

See!

or

All of these were evolved in a digital environment with no additional expert knowledge from humans. Sooner or later we will be evolving pretty much everything, all the big AI talk from some web experts aside.

Comment author: Thomas 09 January 2012 05:04:26PM 0 points

we don't seem to be there yet

By the time it seems we are there, we'll already be beyond it.