This was posted after your comment, but I think it's close enough:
And the idea that intelligent systems will inevitably want to take over, dominate humans, or just destroy humanity through negligence is preposterous.
They would have to be specifically designed to do so.
Whereas we will obviously design them to not do so.
Unless I'm misunderstanding you or MIRI, that's not their primary concern at all: