I suspect Eliezer is avoiding this project for the same reason the word "singularity" was adopted in its current sense at all. Vinge coined it to point to the impossibility of writing characters dramatically smarter than himself.
"Here I had tried a straightforward extrapolation of technology, and found myself precipitated over an abyss. It's a problem we face every time we consider the creation of intelligences greater than our own. When this happens, human history will have reached a kind of singularity - a place where extrapolation breaks down and new models must be applied - and the world will pass beyond our understanding."
Perhaps a large number of brilliant humans working together on a very short story / film for a long time could simulate superintelligence just enough to convince the average human that More Is Possible. But there would be a lot of risk of making people zero in on irrelevant details and continue to underestimate just how powerful an SI could be.
There's also a worry that the vividness of 'AI in a box' as premise would continue to make the public think oracle AI is the obvious and natural approach and we just have to keep working on doing it better. They'd remember the premise more than the moral. So, caution is warranted.
Also, hindsight bias. Most tricks won't work on everyone, but even if we find a universal trick that will work for the film, afterward people who see it will think it's obvious and that they could easily think their way around it. Making some of the AI's maneuvering mysterious would help combat this problem a bit, but would also weaken the story.
Summary
Furthermore, in the last thread I asserted that
It would be quite bad for me to assert this without backing it up with a victory. So I did.
First Game Report - Tuxedage (GK) vs. Fjoelsvider (AI)
Second Game Report - Tuxedage (AI) vs. SoundLogic (GK)
Testimonies:
State of Mind
Post-Game Questions
~~$150~~ $300 for any subsequent experiments regardless of outcome, plus an additional ~~$150~~ $450 if I win. (Edit: Holy shit. You guys are offering me crazy amounts of money to play this. What is wrong with you people? In response to incredible demand, I have raised the price.) If you feel queasy about giving me money, I'm perfectly fine with this money being donated to MIRI. It is also my personal policy that I do not play friends (since I don't want to risk losing one), so if you know me personally (as many on this site do), I will not play regardless of monetary offer.

Advice
These are tactics that have worked for me. I do not insist that they are the only tactics that exist, just some of many possibilities.
Playing as Gatekeeper
Playing as AI
PS: Bored of regular LessWrong? Check out the LessWrong IRC! We have cake.