If Strong AI turns out not to be possible, what are our best expectations today as to why?
I'm thinking of trying my hand at writing a sci-fi story; do you think exploring this idea has positive utility? I'm not sure myself: as it stands, the idea that an intelligence explosion is possible could use more public exposure.
I wanted to include a popular meme image macro here, but decided against it. I can't help it: every time I think "what if", I think of this guy.
Still doesn't address the underlying problem. The Milky Way is about 100,000 light years across, but billions of years old. It is extremely unlikely that some non-terrestrial strong AI just happened to arise at the exact same time that modern humans evolved, and is spreading throughout the universe at near the speed of light but just hasn't reached us yet.
Note that "moving at the speed of light" is not the issue here. Even predictions of how long it would take to colonize the galaxy with procreating humans and 20th century technology still say that the galaxy should have been completely tiled eons ago.
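A quick back-of-the-envelope calculation makes the timescale argument concrete. The expansion speed below (0.1% of light speed) is an illustrative assumption, not a figure from the thread; the point is that even an absurdly slow wavefront crosses the galaxy in a small fraction of its age.

```python
# Illustrative Fermi-paradox arithmetic: crossing time vs. galactic age.
# The expansion speed here is a hypothetical, deliberately sluggish value.

galaxy_diameter_ly = 100_000      # Milky Way diameter, light years
galaxy_age_yr = 10e9              # rough age of the galactic disk, years
speed_fraction_c = 0.001          # assumed expansion at 0.1% of light speed

# Time to span the galaxy at that speed (distance in ly / speed in c
# gives years directly).
crossing_time_yr = galaxy_diameter_ly / speed_fraction_c

print(f"Crossing time: {crossing_time_yr:.0f} years")
print(f"Fraction of galactic age: {crossing_time_yr / galaxy_age_yr:.2%}")
```

At 0.1% of light speed the crossing takes on the order of 100 million years, roughly 1% of the galaxy's age, so a colonizing civilization arising at almost any point in galactic history would have tiled the whole galaxy long before now.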
Imagine that 99.9999999999999% of the universe (and 100% of most galaxies) is under the control of strong AIs, and they expand at the speed of light. Observers such as us would live in the part of the universe not under their control and would see no evidence of strong AIs.
The universe (not necessarily just the observable universe) is very big, so I don't agree. It would be true if you wrote "galaxy" instead of "universe".