Attempt at the briefest content-full Less Wrong post:
Once AI is developed, it could "easily" colonise the universe. So the Great Filter (preventing the emergence of star-spanning civilizations) must strike before AI could be developed. If AI is easy, we could conceivably have built it already, or we could be on the cusp of building it. So the Great Filter must predate us, unless AI is hard.
Thank you. An interesting read. I found your treatment very thorough given its premises and approach. Sadly, we disagree on a point which you seem to take as given without further treatment, but which I question:
The ability and energy to set up infrastructure to exploit interplanetary resources with sufficient net energy gain to mine Mercury at scale (much less build a Dyson sphere).
The problem here is that I do not have references to actually back my opinion on this, and I have not yet had time to put my complexity-theoretic and thermodynamic arguments into a sufficiently presentable form.
http://lesswrong.com/lw/ii5/baseline_of_my_opinion_on_lw_topics/
We already have solar panel setups with roughly the required energy efficiency.
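To illustrate why the net-energy objection may be weaker than it looks, here is a hedged back-of-envelope sketch: does a solar panel launched from Mercury repay the energy cost of lifting its own mass to escape velocity? All figures (areal density, efficiency, flux) are rough assumptions for illustration, not numbers from the original post.

```python
# Back-of-envelope energy payback for a panel lifted off Mercury.
# All constants are rough assumptions, not figures from the thread.

MERCURY_ESCAPE_VELOCITY = 4250.0  # m/s
SOLAR_FLUX_AT_MERCURY = 9100.0    # W/m^2, roughly 6.7x Earth's 1361 W/m^2
PANEL_EFFICIENCY = 0.20           # fraction of flux converted to electricity
PANEL_AREAL_DENSITY = 5.0         # kg/m^2, assumed lightweight space panel

# Kinetic energy needed to accelerate 1 m^2 of panel to escape velocity
launch_energy = 0.5 * PANEL_AREAL_DENSITY * MERCURY_ESCAPE_VELOCITY**2  # J

# Electrical power delivered by that square metre once deployed
power_output = SOLAR_FLUX_AT_MERCURY * PANEL_EFFICIENCY  # W

payback_seconds = launch_energy / power_output
print(f"Launch energy: {launch_energy / 1e6:.1f} MJ per m^2")
print(f"Payback time: {payback_seconds / 3600:.1f} hours")
```

Under these assumptions the panel repays its launch energy in well under a day, leaving its remaining lifetime as net gain; the conclusion is insensitive to the exact constants within an order of magnitude.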