James_Miller comments on The Robots, AI, and Unemployment Anti-FAQ - Less Wrong

Post author: Eliezer_Yudkowsky 25 July 2013 06:46PM




Comment author: bluej100 24 July 2013 06:17:02AM 4 points

"There's a thesis (whose most notable proponent I know is Peter Thiel, though this is not exactly how Thiel phrases it) that real, material technological change has been dying."

Tyler Cowen's The Great Stagnation (http://www.amazon.com/The-Great-Stagnation-Low-Hanging-ebook/dp/B004H0M8QS) is again relevant here, though I think he considers the stagnation less cultural than Thiel does.

"We only get the Hansonian scenario if AI is broadly, steadily going past IQ 70, 80, 90, etc., making an increasingly large portion of the population fully obsolete in the sense that there is literally no job anywhere on Earth for them to do instead of nothing, because for every task they could do there is an AI algorithm or robot which does it more cheaply."

As someone working in special-purpose software rather than general-purpose AI, I think you drastically overestimate the difficulty of outcompeting humans in significant portions of low-wage jobs.

"The concrete illustration I often use is that a superintelligence asks itself what the fastest possible route is to increasing its real-world power, and...just moves atoms around into whatever molecular structures or large-scale structures it wants....The human species would end up disassembled for spare atoms"

I also think you overestimate the ease of fooming. Computers are already helping us design their successors (see http://www.qwantz.com/index.php?comic=2406), and even a 300-IQ AI will be starting from the human knowledge base, competing with microbes for chemical energy at the nano scale and with humans for energy at the macro scale. I think a 300-IQ AI dropped on Earth today would take five years to dominate scientific output.

Comment author: James_Miller 24 July 2013 09:29:03PM 10 points

An IQ of 300 is more than ten standard deviations above the mean (over thirteen, at the conventional SD of 15). So picture a trillion planets, each with a trillion humans on them, take the smartest person out of all of them, transport him to our reality, and make it very easy for him to quickly clone himself. Do you really think it would take this guy five full years to dominate scientific output?
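Under the textbook normal model of IQ (mean 100, SD 15; an assumption for illustration only, since real-world IQ tails are fatter than Gaussian), the rarity figures in this exchange can be sanity-checked with the standard normal tail P(Z > z) = erfc(z/√2)/2:

```python
# Sanity check of the rarity arithmetic, assuming IQ ~ Normal(100, 15).
# This scaling is a modeling convention, not a fact about real populations.
from math import erfc, sqrt

def upper_tail(z):
    """P(Z > z) for a standard normal Z, stable far into the tail."""
    return 0.5 * erfc(z / sqrt(2))

z_300 = (300 - 100) / 15      # z-score of IQ 300: about 13.3 SD
p_300 = upper_tail(z_300)     # tail probability on the order of 1e-40

# "A trillion planets each with a trillion humans" is 10^24 people.
# The smartest of that many sits roughly where P(Z > z) = 1e-24,
# which is close to z = 10:
p_10sd = upper_tail(10.0)     # tail probability on the order of 1e-23

print(f"IQ 300 is z = {z_300:.2f}, tail probability {p_300:.1e}")
print(f"z = 10 tail probability: {p_10sd:.1e}")
```

So a trillion trillion people gets you a roughly 10-SD outlier; an IQ of 300 under this model is rarer still, which only strengthens the point of the thought experiment.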

Comment author: Jack 24 July 2013 10:20:49PM 13 points

So picture a trillion planets each with a trillion humans on them

There is almost no way this hypothetical provokes accurate intuitions about an IQ of 300. It's hard to ask someone to picture something they are literally incapable of picturing, and I suspect people hearing this will just default to "someone a little smarter than the smartest person I know of".

Comment author: Normal_Anomaly 25 July 2013 08:41:57PM 4 points

I know I'm doing that, and I can't stop doing it. "A trillion planets each with a trillion humans on them" is clearly an important quantity, but I can't visualize it at all.

Comment author: Baughn 27 July 2013 07:23:58PM 1 point

I'm picturing someone with the optimization power of the entire human civilization, which seems a little more tractable.

It's also based on nothing whatsoever, but it's at least in the right direction? I hope.