Eliezer_Yudkowsky comments on Snowdenizing UFAI - Less Wrong

Post author: JoshuaFox 05 December 2013 02:42PM


Comment author: Eliezer_Yudkowsky 05 December 2013 08:30:11PM 4 points

What good would a Snowden do? The research would continue.

Comment author: JoshuaFox 05 December 2013 09:05:15PM 4 points

Yes, but fear of a Snowden would make project leaders distrustful of their own staff.

And if many top researchers in the field were known to be publicly opposed to any unsafe project that the agencies are likely to create, it would shrink their recruiting pool.

The idea is to create a moral norm in the community. The norm can be violated, but it would put a crimp in the projects as compared to a situation where there is no such moral norm.

Comment author: oooo 05 December 2013 09:41:16PM 2 points

This presupposes that the AGI community is, on average, homogeneous across the world and would behave accordingly. What if the political climates, traditions, and cultures of certain (powerful) countries make them less likely to be fearful of their own AGI pool?

In other words, if country A distrusts its staff more than country B does, due to political, economic, or cultural factors, country A would fall behind in the AGI arms race, which would lead to an "even if I hold onto my morals, we're still heading into the abyss" attitude. I could see organizations or governments rationalizing against the community's moral pledge in this way, by highlighting the futility of slowing down the research.

Comment author: JoshuaFox 05 December 2013 10:08:34PM 3 points

> AGI community is, on average, homogenous

The AGI community is tiny today. As it grows, its future composition will be determined by the characteristics of the tiny seed from which it expands.

I won't claim that the future AGI community will be homogeneous, but it may be possible to establish norms starting today.

Comment author: David_Gerard 05 December 2013 09:40:49PM 1 point

Indeed. Just imagine the fear of the next Snowden in the NSA, and the agency trying to work out just how many past Snowdens it has had who took their secrets to the enemy rather than to the public.

Comment author: JoshuaFox 05 December 2013 10:06:41PM 1 point

Yes, exactly.

You've made my point clearly--and perhaps I didn't make it clearly enough in my post. I was focusing not on a leak in itself, but on what suspicion can do to an organization. As I described it, the suspicion would "cast a shadow" and "hover over" the project.

By now, the NSA may well be on the lookout for anyone who has expressed hacker/cypherpunk/copyfighter sentiments. Not that such sentiments need to disqualify someone from serving in the NSA, but the agency is probably quite suspicious.

Comment author: passive_fist 05 December 2013 10:57:37PM 1 point

I would like to agree with you, but experience says otherwise. Tyrants have always been able to find enough professionals with dubious morals to further their plans.

Comment author: JoshuaFox 06 December 2013 08:44:08AM 1 point

In World War I, German Jewish scientists contributed to the German war effort. In World War II, refugee scientists contributed to the Allied war effort. Tyrants can shoot themselves in the foot quite effectively.

A few top physicists were left in Germany, including Heisenberg, but it was not enough to move the project forward, and it's suspected that Heisenberg may have deliberately sabotaged the project.

But you have a point. So long as AGI is at the cutting edge, only a handful of top people can move it forward. As Moore's Law of Mad Science takes effect, "ordinary" scientists will be enough.

(And to make it clear, I am not suggesting that the US government is tyrannical.)

Comment author: ChristianKl 17 December 2013 03:10:40PM 0 points

> Tyrants have always been able to find enough professionals with dubious morals to further their plans.

There are plenty of cases where a government puts a bunch of incompetent people on a project and the project fails.

Comment author: JoshuaFox 24 December 2013 03:35:20PM 0 points

If the project does not take safety into account, we want exactly this -- so long as it doesn't get close enough to success that failure involves paper-clipping the world.