Saying no FAI exists in design space that could satisfy us is equivalent to saying nothing can satisfy us. In other words, if you are correct then the AI isn't the problem and humanity would be "straitjacketed" anyway.
Saying we could never build an AI that would satisfy us because of the technical difficulty is plausible, but I don't think that's what you are saying.
> Saying no FAI exists in design space that could satisfy us is equivalent to saying nothing can satisfy us. In other words, if you are correct then the AI isn't the problem and humanity would be "straitjacketed" anyway.
I don't see how not being fully satisfied is a straitjacket. I'm saying that our (mankind's) maximum satisfaction may come when straitjacketed, because mankind isn't sane, and there may not be any truly sane morality system. (Edit, to clarify: if there is a truly sane morality system, then mankind can be cured of its insanity.)
I laughed: SMBC comic.