JGWeissman comments on About the AI-Box experiment - Less Wrong

-1 [deleted] 21 February 2011 07:45PM




Comment author: JGWeissman 21 February 2011 08:11:31PM 2 points [-]

Eliezer has assured us that:

There was no super-clever special trick that let me get out of the Box using only a cheap effort. I didn't bribe the other player, or otherwise violate the spirit of the experiment. I just did it the hard way.

Further, you don't seem to actually understand the rules. Such as:

The AI party may not offer any real-world considerations to persuade the Gatekeeper party.

This rules out your approach of threatening the gatekeeper player.

Comment author: XiXiDu 21 February 2011 08:42:53PM *  3 points [-]

Eliezer has assured us that...

That's my main problem with the whole test, and with other issues as well.

Eliezer has assured us that...

  • the SIAI works to benefit humanity.
  • idea XY is dangerous.
  • he is honest.

...bona fides.

Comment author: timtyler 24 February 2011 09:08:36PM *  2 points [-]

I expect Sméagol told Déagol that he loved him. He may even have meant it.

Comment author: Miller 21 February 2011 08:50:28PM 0 points [-]

I don't believe he used a super-clever cheap trick. However, if lying were considered net beneficial to the future, Eliezer would do it.

Comment author: FAWS 21 February 2011 09:48:46PM 0 points [-]

He at least claims that he wouldn't: if he were known to be willing to lie to save the world, he couldn't be trusted whenever the world was known to be at stake. http://lesswrong.com/lw/v2/prices_or_bindings/

Comment author: Timwi 21 February 2011 08:45:16PM -2 points [-]

No, it doesn’t rule it out. I can’t be rationally convinced of anything that went on in the chat until I see it.

Comment author: TheOtherDave 21 February 2011 09:22:18PM 4 points [-]

Nor even if you do see it. After all, the possibility exists that you're being handed a fake chatlog.

Short of performing the experiment right in front of you, it's not clear what could rationally convince you, if your priors are low enough.

Comment author: JGWeissman 21 February 2011 08:56:05PM 3 points [-]

I can’t be rationally convinced of anything that went on in the chat until I see it.

The details of the chats will not be publicly released. If you want to maintain that you don't trust the results of the experiment because the gatekeeper players might have been confederates of Eliezer, then you should have said so. You claimed, however, that this clever trick was possible under the rules that were followed, and that claim was wrong.