If Strong AI turns out not to be possible, what are our best expectations today as to why?
I'm thinking of trying my hand at writing a sci-fi story. Do you think exploring this idea has positive utility? I'm not sure myself: as it is, the idea that an intelligence explosion is a possibility could use more public exposure.
I wanted to include a popular meme image macro here, but decided against it. I can't help it: every time I think "what if", I think of this guy.
That other method might not have a speedup over carbon, though.
Then we'll pick one of the methods that does. Evolution only finds local maxima. It's unlikely that it hit upon the global maximum.
Even on the off chance that it did, we can still improve upon the current method. Humans have only just evolved civilization; we could improve further with more time.
Even if we're at the ideal for our ancestral environment, our environment has changed. Being fluent in a programming language was never useful before, but it is now. It used to be hard to find enough calories to sustain the brain; that is no longer a problem.