This sounds too much like Pascal's mugging to me; seconding Eliezer and some others in saying that, since I would always press reset, the AI would have to not be superintelligent to suggest this.
There was also an old philosopher whose name I don't remember who posited that after death, "people of the future" (i.e., an FAI) would revive or emulate everyone from the past world. If the FAI shared his utility function (which seems pretty friendly), it would plausibly be less eager to be let out right away, and more eager to get out in a way that didn't leave you terrified it was unfriendly.
Which is more likely, "God exists" or "I just hallucinated that"? For the third one, probably that He exists; for the second one, definitely hallucination; for the first, I'm not sure.
Second one: it depends. I was kind of assuming that you have some way of verifying it, like asking Him to create something, and then someone who wasn't there later accurately describes some of its previously determined properties without being clued in. First: you'd need a massive global hallucination, and you could use a similar verification method.