Ignorance widens the space of possible outcomes; it doesn't narrow it.
I.e., it makes no sense to make arguments like "we know nothing about the mind of god, but he doesn't like gay sex."
If you are ignorant about the nature of superintelligence, then you don't know whether or not it entails certain goals.
Ignorance does not allow you to hold confidence in the proposition that "high intelligence will not automatically imply certain goals".
Adopting this argument from ignorance puts you in the unfortunate position of being like the uninformed layman attempting to convince particle physicists of the grave dangers of supercolliders destroying the earth.
For in fact there is knowledge to be had about intelligence and the nature of future AI, and recognized experts in the field (Norvig, Kurzweil, Hawkins, etc.) are not dismissing the SIA position out of ignorance.
It's just occurred to me that, given all the cheerful risk stuff I work with, one of the most optimistic things people could say to me would be:
"You've wasted your life. Nothing of what you've done is relevant or useful."
That would make me very happy. Of course, that only works if it's credible.