Attempt at the briefest content-full Less Wrong post:
Once AI is developed, it could "easily" colonise the universe. So the Great Filter (whatever prevents the emergence of star-spanning civilizations) must strike before AI can be developed. If AI is easy, we could conceivably have built it already, or we could be on the cusp of building it, which leaves almost no room for a Filter to strike between us and AI. So the Great Filter must predate us, unless AI is hard.
One caveat: especially if the builders are concerned about unintended consequences, an AI's final goal might be relatively narrow and easily achieved, yet still result in the wiping out of the builder species. Such an AI would have no reason to become star-spanning, so it could itself act as a Filter.