Containing the AI... Inside a Simulated Reality
So I just finished Yampolskiy's paper "Uncontrollability of AI", and it makes for a compelling read. In particular, I was happy to finally see something that explicitly calls out the ludicrous folly of believing it possible to make an AI conform to "human values" - as many posts on this blog make abundantly clear, to be human is to be irrational... asking an AI to conform to our ways of "reasoning" is... well - incoherent, to put it mildly.

But that is not what this post is about :) I wish to propose a containment method that, for some reason, has not been especially elaborated on. Some might say it's just another version of AI-in-a-Box, but I disagree. Allow me to explain...

What if the AGI we create is "brought online" inside a simulated reality - a place that, as far as it knows, is the entirety of the world? Let us call this place AISpace.

Now, some of you are probably already pre-heating your keyboards to respond with the often-repeated (and valid) arguments that "prove" this won't work, but let me add a little twist first: as some of you may agree, we have no definitive proof that our own world is not itself a simulated reality. Thus, if you believe it inevitable that an AGI must be able to get out of AISpace, then releasing an AGI into this world must at least leave open the possibility that an AGI on Real Earth (aka Reality as we think we know it) could likewise "conclude" that this actual world is a sim, or decide to find out whether it is (and, yes, convert the entire universe into a computer to come up with an answer ;)

If we are unable to definitively settle whether Real Earth is or is not a simulation, why should an AI be able to do so?

Now - of course the above requires a few conditions, some of which may indeed be hard to meet, such as human operators not exposing the fact that AISpace is not in fact all of Reality... and (malevolent or indifferent) actors could always choose to release their AI into the Real World anyhow. What I'd like us to do here…
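To make the intuition concrete, here is a minimal toy sketch (in Python) of the epistemic situation I'm describing. The names `AISpace`, `Agent`, and `step` are hypothetical illustrations, not any real framework: the point is simply that if the agent's *only* channel to the world is the simulator's observation interface, nothing in a percept marks it as "simulated" rather than "real".

```python
class AISpace:
    """A toy simulated world. From the agent's side, this IS the world."""

    def __init__(self):
        self._state = 0  # hidden world state, opaque to the agent

    def step(self, action):
        # The simulator mediates every interaction; the agent never sees
        # anything except what this method chooses to return.
        self._state += action
        return {"observation": self._state}


class Agent:
    """An agent whose entire epistemic access is the dict returned by step()."""

    def __init__(self, world):
        self._world = world

    def act(self, action):
        percept = self._world.step(action)
        # Note: the percept carries no flag distinguishing "simulated" from
        # "real" input - which is the crux of the containment argument.
        return percept["observation"]


world = AISpace()
agent = Agent(world)
print(agent.act(1))  # -> 1
print(agent.act(2))  # -> 3
```

Of course, a real AGI would probe for statistical regularities, glitches, or resource limits in its world, which is exactly where the "human operators must not leak the truth" condition below comes in; the sketch only illustrates the interface, not the hard part.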
It ain't heaven if there are things one must do to "remain a member", or to (continue) enjoy(ing) the best QoS. Surely, the very concept of duty, demand, or task is anathema to calling a place heaven. Likewise, being cared for also ought not to be a concern, for it implies there exists the possibility of *not* being cared for - again, surely not a feature of anything remotely resembling a heaven.
Indeed, I would go so far as to say that to have preferences (and to entertain any kind of doubt about whether they are met or fulfilled) has no place in any environment that hopes to call itself a heaven...