If Strong AI turns out not to be possible, what are our best current guesses as to why?
I'm thinking of trying my hand at writing a sci-fi story; do you think exploring this idea has positive utility? I'm not sure myself: it seems the idea that an intelligence explosion is possible could use more public exposure as it is.
I wanted to include a popular meme image macro here, but decided against it. I can't help it: every time I think "what if", I think of this guy.
It seems like a weak premise, in that human intelligence is just Strong NI (Strong Natural Intelligence). What would it be about Strong AI specifically that makes it kill everything, when Strong NI does not? A stronger premise would be more fundamental: it would point to some basic difference between AI and NI that explains why Strong AI killed everything when Strong NI obviously has not.
But OK, it's a premise for a story.