asr comments on Open Thread for January 8 - 16 2014 - Less Wrong

5 Post author: tut 08 January 2014 12:14PM


Comment author: Stabilizer 13 January 2014 05:19:54PM *  7 points [-]

I haven't seen any discussion of this blog post by Mike Travers.

His point is that people working on Friendly AI are drawn to it because it's an "easy", abstract problem set well into the future. He contends that we are already taking significant damage from artificially created human systems, such as the financial system, which can be ascribed agency and whose goals are quite different from improving human life. These systems are akin to "Hostile AI". Dealing with them, he contends, is the really hard problem.

Here is a quote from the blogpost (which is from a Facebook comment he made):

I am generally on the side of the critics of Singulitarianism, but now want to provide a bit of support to these so-called rationalists. At some very meta level, they have the right problem — how do we preserve human interests in a world of vast forces and systems that aren’t really all that interested in us? But they have chosen a fantasy version of the problem, when human interests are being fucked over by actual existing systems right now. All that brain-power is being wasted on silly hypotheticals, because those are fun to think about, whereas trying to fix industrial capitalism so it doesn’t wreck the human life-support system is hard, frustrating, and almost certainly doomed to failure.

It's a short post, so you can read it quickly. What do you think about his argument?

Comment author: asr 13 January 2014 06:25:17PM *  14 points [-]

It's a short post, so you can read it quickly. What do you think about his argument?

I think it's silly. I suspect MIRI and every other singularitarian organization, along with every individual working on the challenges of unfriendly AI, could fit comfortably in a 100-person auditorium.

In contrast, "trying to fix industrial capitalism" is one of the main topics of political dispute everywhere in the world. "How to make markets work better" is one of the main areas of research in economics. The American Economic Association has 18,000 members. We have half a dozen large government agencies, with budgets of hundreds of millions of dollars each, devoted to protecting people from hostile capitalism. (The SEC, the OCC, the FTC, etc., are all ultimately about trying to curb capitalist excess. Each of these organizations has a large enforcement bureaucracy, and also a number of full-time salaried researchers.)

The resources and human energy devoted to unfriendly AI are tiny compared to the amount expended on politics and economics. So it's strange to complain about the diversion of resources.

Comment author: Stabilizer 13 January 2014 06:29:31PM *  7 points [-]

Excellent point. I'm surprised this did not occur to me. This reminds me of Scott Aaronson's reply when someone suggested that quantum computational complexity is quite unimportant compared to experimental approaches to quantum computing and therefore shouldn't get much funding:

I find your argument extremely persuasive—assuming, of course, that we’re both talking about Bizarro-World, the place where quantum complexity research commands megabillions and is regularly splashed across magazine covers, while Miley Cyrus’s twerking is studied mostly by a few dozen nerds who can all fit in a seminar room at Dagstuhl.

Comment author: [deleted] 14 January 2014 05:34:29PM 1 point [-]

I think it's silly. I suspect MIRI and every other singularitarian organization, along with every individual working on the challenges of unfriendly AI, could fit comfortably in a 100-person auditorium.

It looks to me like the room in this picture contains more than 100 people.

Comment author: asr 14 January 2014 07:24:44PM 1 point [-]

Yes. I will revise upward my estimate of how many people are working on Singularity topics. That said, not everybody who showed up at the summit was working on singularity problems. Some were just interested bystanders.