If it's worth saying, but not worth its own post (even in Discussion), then it goes here.
Notes for future OT posters:
1. Please add the 'open_thread' tag.
2. Check if there is an active Open Thread before posting a new one. (Immediately before; refresh the list-of-threads page before posting.)
3. Open Threads should be posted in Discussion, and not Main.
4. Open Threads should start on Monday and end on Sunday.
That doesn't work. AGI is unlikely to be the Great Filter: an AGI expanding at well below light speed would be visible to us, and expansion at close to light speed is unlikely. Note that if AGI is a serious existential threat, space colonies will not be sufficient to stop it. Colonization works well against nuclear war, nanotech problems, epidemics, and some astronomical threats, but not against artificial intelligence.
Good point about AGI probably not being the Great Filter. I didn't mean that space colonization would prevent existential risk from AI, though, just threats in general.
So we've established that an existential catastrophe (ignoring heat death, if it counts as one) will very probably occur within 1000 years, but can we get more specific?