Lumifer comments on Open Thread March 28 - April 3, 2016 - Less Wrong

7 Post author: Clarity 28 March 2016 02:06AM

Comment author: Lumifer 29 March 2016 04:50:46PM *  4 points [-]

EY arguing that a UFAI threat is worth considering -- as a response to Bryan Caplan's scepticism about it. I think it's a repost from Facebook, though.

ETA: Caplan's response to EY's points. EY answers in the comments.

Comment author: Algon 30 March 2016 10:40:54AM 2 points [-]

But isn't this what he's been saying for years? What's the point in posting about it?

Comment author: Vaniver 30 March 2016 01:23:29PM 2 points [-]

Caplan posted that he was skeptical, and Yudkowsky responded with "which part of this argument do you disagree with?"

Comment author: V_V 30 March 2016 08:15:29PM -2 points [-]

EY warns against extrapolating current trends into the future. Seriously?

Comment author: knb 30 March 2016 10:26:24PM 1 point [-]

Why does that surprise you? None of EY's positions seem to be dependent on trend-extrapolation.

Comment author: entirelyuseless 31 March 2016 03:49:45PM 1 point [-]

Trend extrapolation is more reasonable than invoking something that hasn't happened at all yet, and then claiming, "When this happens, it will become an unstoppable trend."

Comment author: knb 02 April 2016 02:21:35AM 0 points [-]

It would be more reasonable to use trend extrapolation if this were a field where you would necessarily be able to discern a trend. Yudkowsky argues there could be sharp discontinuities. Personally, I don't really feel qualified to have a strong opinion, and I would not be able to discern a trend even if one exists.

Comment author: V_V 30 March 2016 10:41:15PM -1 points [-]

Other than a technological singularity with an artificial intelligence explosion to a god-like level?

Comment author: knb 30 March 2016 11:49:19PM 1 point [-]

I don't believe that prediction is based on trend-extrapolation. Nothing like that has ever happened, so there's no trend to draw from.

Comment author: Lumifer 31 March 2016 02:26:47PM 2 points [-]

You are right about the singularity, but the underlying trend extrapolation is that of technical progress and, specifically, of software getting smarter.

Nowadays people have gotten used to rapid technical progress and often consider it, um, inevitable. A look at history should disabuse one of that notion, though.

Comment author: knb 02 April 2016 02:08:34AM 0 points [-]

Yudkowsky explicitly doesn't believe in rapid technical progress. He's said that he believes in the Great Stagnation (a slowdown in scientific/technological/economic progress), which is possibly a good thing since it may retard the creation of AGI, giving people a better shot at working on friendliness first.

Comment author: Lumifer 02 April 2016 03:13:29AM 0 points [-]

Yudkowsky explicitly doesn't believe in rapid technical progress.

Links? What is "rapid"? Did he look at his phone recently?

The Great Stagnation is a phenomenon on the time scale of decades. How about the time scale of centuries?

Comment author: knb 02 April 2016 04:40:08AM *  0 points [-]

Comment author: Lumifer 02 April 2016 09:12:52PM -2 points [-]

Yes, he believes in the Great Stagnation. That does not imply he doesn't believe in rapid technological progress. Again, what is "rapid"?