If Strong AI turns out not to be possible, what are today's best guesses as to why?
I'm thinking of trying my hand at writing a sci-fi story; do you think exploring this idea would have positive utility? I'm not sure myself: as it stands, the idea that an intelligence explosion is possible could use more public exposure.
I wanted to include a popular meme image macro here, but decided against it. I can't help it: every time I think "what if", I think of this guy.
One possibility: there is no such thing as computationally tractable general intelligence (even in humans), only a bundle of hacks that each work well enough in a given context.
Nobody said it has to work in every context. AGI just means something roughly as versatile as a human.