Juno_Watt comments on The genie knows, but doesn't care - Less Wrong

54 Post author: RobbBB 06 September 2013 06:42AM


Comment author: linkhyrule5 10 September 2013 11:04:30PM 1 point [-]

You need non-cyclical reasoning. Which would generally be something where you aren't the one having to explain to people that the achievement in question is profound.

This bit confuses me.

That aside:

You think Yudkowsky is not a crank, so you think the folks that play that silly game with him are intelligent and rational

Non sequitur. From the posts they make, everyone on this site seems to me to be sufficiently intelligent to make "selling snake oil" impossible, in a cut-and-dry case like the AI box. Yudkowsky's own credibility doesn't enter into it.

Comment author: Juno_Watt 12 September 2013 06:53:17AM 0 points [-]

Some folks on this site have accidentally bought unintentional snake oil in The Big Hoo Hah That Shall Not Be Mentioned. Only an intelligent person could have bought that particular puppy.

Comment author: linkhyrule5 12 September 2013 07:28:57AM 0 points [-]

Granted. And it may be that additional knowledge/intelligence makes you more vulnerable as a Gatekeeper.

Comment author: Peterdjones 12 September 2013 08:24:20AM 0 points [-]

Trying to think this out in terms of levels of smartness alone is very unlikely to be helpful.

Comment author: linkhyrule5 12 September 2013 05:10:12PM 0 points [-]

Well yes. It is a factor, no more, no less.

My point is, there is a certain level of general competence after which I would expect convincing someone with an OOC motive to let an IC AI out to be "impossible," as defined below.