handoflixue comments on Why Politics are Important to Less Wrong... - Less Wrong

Post author: OrphanWilde 21 February 2013 04:24PM




Comment author: handoflixue 22 February 2013 12:11:42AM 4 points [-]

Your edit pretty much captures my point, yes :) If nothing else, a Weak Friendly AI should eliminate a ton of the trivial distractions like war and famine, and I'd expect that humans have a much more unified volition when we're not constantly worried about scarcity and violence. There aren't many current political problems I'd expect to stay relevant in a post-AI, post-scarcity, post-violence world.

Comment author: Dre 22 February 2013 05:21:43PM 1 point [-]

The problem is that we have to guarantee that the AI doesn't do something really bad while trying to stop these problems; what if it decides it suddenly needs more resources, or needs to spy on everyone, even briefly? And it seems (to me at least) that preventing it from having bad side effects is pretty close to, if not equivalent to, Strong Friendliness.

Comment author: handoflixue 22 February 2013 07:20:25PM 0 points [-]

I should have made that clearer: I still think Weak Friendliness is a very difficult problem. My point is simply that we only need an AI that solves the big problems, not an AI that can do our taxes. My second point was that humans seem to already implement Weak Friendliness, barring a few historical exceptions, whereas so far we've completely failed at implementing Strong Friendliness.

I'm using Weak vs Strong here in the sense of Weak being a "SysOP" style AI that just handles catastrophes, whereas Strong is the "ushers in the Singularity" sort that usually gets talked about here, and can do your taxes :)

Comment author: OrphanWilde 22 February 2013 12:41:13AM *  1 point [-]

This... may be an amazing idea. I'm noodling on it.

Comment author: Rukifellth 22 February 2013 05:35:15AM 0 points [-]

I know this wasn't the spirit of your post, but I wouldn't refer to war and famine as "trivial distractions".

Comment author: Rukifellth 22 February 2013 01:39:29AM *  0 points [-]

Wait, if you're regarding the elimination of war, famine and disease as consolation prizes for creating a wFAI, what are people expecting from a sFAI?

Comment author: Fadeway 22 February 2013 03:43:59AM 1 point [-]

God. Either with or without the ability to bend the currently known laws of physics.

Comment author: Rukifellth 22 February 2013 05:17:41AM 1 point [-]

No, really.

Comment author: RichardKennaway 22 February 2013 02:26:16PM 3 points [-]

Really. That really is what people are expecting of a strong FAI. Compared with us, it will be omniscient, omnipotent, and omnibenevolent. Unlike currently believed-in Gods, there will be no problem of evil because it will remove all evil from the world. It will do what the Epicurean argument demands of any God worthy of the name.

Comment author: Rukifellth 22 February 2013 02:44:20PM 0 points [-]

Are you telling me that if a wFAI were capable of eliminating war, famine and disease, it wouldn't be developed first?

Comment author: RichardKennaway 22 February 2013 06:13:38PM 2 points [-]

Well, I don't take seriously any of these speculations about God-like vs. merely angel-like creations. They're just a distraction from the task of actually building them, which no-one knows how to do anyway.

Comment author: Rukifellth 22 February 2013 06:40:17PM 0 points [-]

But still, if a wFAI was capable of eliminating those things, why be picky and try for sFAI?

Comment author: RomeoStevens 22 February 2013 09:41:25PM *  1 point [-]

Because we have no idea how hard either is to specify. If, along the way, it turns out to be easy to specify wFAI and risky to specify sFAI, then the reasonable course is to build the wFAI. Doubly so since a wFAI would almost certainly be useful in helping specify a sFAI.

Seeing as human values are a minuscule target, though, it seems probable that specifying wFAI is harder than sFAI.

Comment author: Rukifellth 25 February 2013 05:05:53AM 0 points [-]

"Specify"? What do you mean?