I think you're reading things into what he said that he never intended to put there.
His central claim is certainly not a reassuring, relaxing one: disaster is "a near certainty". He says it'll be at least a hundred years before we have "self-sustaining colonies in space" (note the words "self-sustaining"; he is not talking about a colony on Mars that "will die off without supplies") and that this means "we have to be very careful in this period".
Yes, indeed, the timescale on which he said disaster is a near certainty is "the next thousand or ten thousand years". I suggest that this simply indicates that he's a cautious scientific sort of chap and doesn't like calling something a "near certainty" if it's merely very probable. Let's suppose you're right about nuclear war and artificial viruses, and let's say there's a 10% chance that one of those causes a planetary-scale disaster within 50 years. (That feels way too pessimistic to me, for what it's worth.) Then the time until there's a 99% chance of such disaster -- which, for me, is actually not enough to justify the words "near certainty" -- is 50 log(0.01)/log(0.9) years ... or about 2000 years. Well done, Prof. Hawking!
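For anyone who wants to check that arithmetic, here is a minimal sketch of the calculation, assuming (as above) independent 50-year periods each carrying a 10% disaster probability:

```python
import math

# Assumed inputs from the comment above: 10% chance of a
# planetary-scale disaster in each independent 50-year period.
p_per_period = 0.10
period_years = 50

# We want the time until the cumulative disaster probability hits 99%,
# i.e. until the survival probability (1 - p)^n drops to 1%:
#   (1 - p)^n = 0.01  =>  n = log(0.01) / log(1 - p)
target_survival = 0.01
n_periods = math.log(target_survival) / math.log(1 - p_per_period)
years = period_years * n_periods

print(round(years))  # → 2185, i.e. "about 2000 years"
```

So the "thousand or ten thousand years" framing is consistent even with fairly pessimistic per-century odds.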
Indeed, the statement that "migration to other planets automatically means salvation" is false. But that goes beyond what he actually said. A nuclear war or genetically engineered flu virus that wiped out most of the population on earth probably wouldn't also wipe out the population of a colony on, say, Mars. (You say "nuclear missiles would reach [Mars] as well", but why? Existing nuclear missiles certainly aren't close to having that capability, and there's a huge difference from the defensive point of view between "missiles have been launched and will hit us in a few minutes if not intercepted" and "missiles have been launched and will hit us in a few months if not intercepted".)
You ask "Why namely 100 years?" but, at least in the article you link to, Hawking is not quoted as saying that there are no AI risks on timescales shorter than that. Maybe he's said something similar elsewhere?
I really don't think many people are going to read that article and come away feeling more relaxed about humanity's prospects for survival.
I am commenting on the impression he conveys to the public: that the risks are remote and space colonies will save us. He may privately hold other views on the topic, but that does not matter. We could reconstruct his line of reasoning, but most people will not. Even if he thinks the risk is 1 per cent in the next 50 years, that works out to about 87 per cent over 10,000 years. I think attempts to reconstruct someone's line of reasoning with the goal of reaching a more comforting result are biased.
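The 87 per cent figure follows directly if you treat the 1% risk as applying independently to each 50-year period; a quick sketch under that assumption:

```python
# Assumption: an independent 1% disaster risk per 50-year period.
p = 0.01
periods = 10_000 // 50  # 200 periods in 10,000 years

# Cumulative probability that at least one period goes badly.
cumulative = 1 - (1 - p) ** periods
print(f"{cumulative:.0%}")  # → 87%
```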
For example, one may say: "I want to kill children". But we know apri...
If it's worth saying, but not worth its own post (even in Discussion), then it goes here.
Notes for future OT posters:
1. Please add the 'open_thread' tag.
2. Check if there is an active Open Thread before posting a new one. (Immediately before; refresh the list-of-threads page before posting.)
3. Open Threads should be posted in Discussion, and not Main.
4. Open Threads should start on Monday, and end on Sunday.