If Strong AI turns out not to be possible, what are our best current guesses as to why?
I'm thinking of trying my hand at writing a sci-fi story; do you think exploring this idea has positive utility? I'm not sure myself: as it stands, the idea that an intelligence explosion is possible could probably use more public exposure.
I wanted to include a popular meme image macro here, but decided against it. I can't help it: every time I think "what if", I think of this guy.
Be sure not to rule out the evolution of Human-Level AI on neurological computers, using just nucleic acids and a few billion years...
That's another possibility I didn't think of.
I guess the question I was really interested in is: "Why might Strong AI turn out to be impossible for human civilization to build within a century or ten?"