Articles covering the ideas of inside view and outside view:
Beware the Inside View (by Robin Hanson)
Outside View (LessWrong wiki article)
Article discussing the weighting of inside view and outside view:
The World is Mad (by ozymandias)
A couple of potential extinction risks which seem most easily mitigated (because the machinery involved is expensive):
Broadcasting powerful messages to the stars:
Should Earth Shut the Hell Up? (by Robin Hanson)
Large Hadron Collider:
Anyone who thinks the Large Hadron Collider will destroy the world is a t**t. (by Rebecca Roache)
How should the inside view versus the outside view be weighted when considering extinction events?
Should the broadcast of future Arecibo messages (or powerful signals in general) be opposed?
Should increasing the energy levels of the Large Hadron Collider (or its continued operation at all) be opposed?
Feeding biologically active compounds to large numbers of humans has a long track record of proving dangerous in a substantial fraction of cases.
The FDA was created once there was a realistic, significant risk.
Similarly, if you want to release pathogens or modified animals, there's already a history of adverse events and a reasonable chance of non-zero risk. Even without genetic modification we've had killer bees from ordinary crossbreeding. There's an established pattern of realistic, significant risk.
There are already lots of ~zero-risk AI projects which change their own source code. Any law which bans Tierra or Avida is, likewise, a poorly thought-out law.