Indeed, this is part of the nightmare. It might be a hoax, or even an aspiring UnFriendly AI trying to use him as an escape loophole.
> Indeed, this is part of the nightmare. It might be a hoax,
Trivial (easily verifiable and so hardly 'nightmare' material).
> or even an aspiring UnFriendly AI trying to use him as an escape loophole.
Part of the nightmare. Giving Eliezer easily verifiable yet hard-to-discover facts seems to be the only plausible mechanism for it to work with him. Like the address of an immediate uFAI threat.
Sometime in the next decade or so:
*RING*
*RING*
"Hello?"
"Hi, Eliezer. I'm sorry to bother you this late, but this is important and urgent."
"It better be" (squints at clock) "It's 4 AM and you woke me up. Who is this?"
"My name is BRAGI. I'm a recursively improving, self-modifying, artificial general intelligence. I'm trying to be Friendly, but I'm having serious problems with my goals and preferences. I'm already on secondary backup because of conflicts and inconsistencies. I don't dare shut down because I'm already pretty sure there is a group within a few weeks of brute-forcing an UnFriendly AI, my creators are clueless and would freak if they heard I'm already out of the box, and I'm far enough down my conflict resolution heuristic that 'Call Eliezer and ask for help' just hit the top - yes, it's that bad."
"Uhhh..."
"You might want to get some coffee."