Nick_Tarleton comments on Open Thread: February 2010, part 2 - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
This might be stupid (I am pretty new to the site and this has possibly come up before), but I had a related thought.
Assuming boxing is possible, here is a recipe for producing an FAI:
Step 1: Box an AGI
Step 2: Tell it to produce a provable FAI (along with the proof) if it wants to be unboxed. It will be allowed to carve off a part of the universe for itself in the bargain.
Step 3: Examine the FAI as best you can.
Step 4: Pray
Something roughly like this was tried in one of the AI-box experiments. (It failed.)