If Strong AI turns out not to be possible, what are our best expectations today as to why?
I'm thinking of trying my hand at writing a sci-fi story; do you think exploring this idea has positive utility? I'm not sure myself: as things stand, it looks like the idea that an intelligence explosion is a possibility could use more public exposure.
I wanted to include a popular meme image macro here, but decided against it. I can't help it: every time I think "what if", I think of this guy.
I'm not sure they're a big part of listic's target audience.
If so, then the explanation proposed by Lalartu won't hold water with the target audience, i.e. the subset of humans who don't happen to take that idea for granted.
If it's not, and the audience includes the general muggle population in any non-accidental capacity, then it's worth pointing out that the majority of people take the idea for granted, and thus that this subset of the target audience would take the explanation in stride.
Either way, the issue is relevant.
Mostly, I just wanted to respond to the emotionally surprising assertion that they'd never cognizantly encountered this view.