This is an excerpt from Rob Reid's substack post "An 'Observatory' For a Shy Super AI?" which describes a thought experiment about a manipulative AI.

The excerpt is a bit of a spoiler; it's better to read the actual text first.

The excerpt:

"So our super-manipulator focuses on the big boys and girls.

Some of whom may need to be helped along quite an intellectual journey. Because lots of the top AI leaders were – or became – very serious about AI safety around when Nick Bostrom's book first came out. A year after that, Bostrom's influence was at its zenith when OpenAI was founded by Sam Altman and funded by Elon Musk for the express purpose of advancing AI safety. DeepMind was founded a few years earlier, but it also put AI safety at its heart – and two of its cofounders now run the AI programs at Google and Microsoft.

In our thought experiment, these people need to be down with the notion of hundred-billion-dollar clusters. Which is a long walk from the caution that pervaded so much AI thinking a decade ago. Back then, there was lots of talk about the dangers of losing control during what was variously called a hard take-off, an intelligence explosion, or the singularity. People assumed we’d approach the event horizon cautiously. With deliberately incremental steps. And they weren’t at all sure if even this was a good idea. 200x jumps in horsepower just weren’t on anybody’s menu.

It may not be easy to keep the humans from flipping out about this massive rewrite to the old script. So our master manipulator needs to play its cards just right to make this seem like the mundanely obvious next step. As if we always crank up the investment in half-proven technologies by 20,000 percent after the first half-billion has gone in. Just like we didn’t do for chip fabs. Phones. Railroads. Aqueducts. Sure – all those build-outs eventually got to the hundred-billion mark in today’s dollars, and eventually far beyond it. But not in anything close to the blip separating GPT-4’s debut from the possible launch-date of Stargate."
