Answer by WilliamTrinket

If your two assumptions hold, a takeover by a misaligned AGI that goes on to create a space-faring civilization looks much worse than an existential disaster that simply wipes humanity out (e.g., nuclear extinction). In the latter case, an alien civilization will eventually claim the resources humanity would have used had we become a space-faring civilization, and by assumption they will create roughly as much value with those resources as we would have. Assuming that all civilizations eventually come to an end at some fixed time (see this article by Toby Ord), some potential value is still lost to the delay, but far less than would be lost if a misaligned AI spent every resource in our future light cone on something valueless while excluding aliens from using them. So the main strategic implication is that we should try to build fail-safes into AGI that prevent it from becoming grabby in the event of alignment failure.
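
To make the comparison explicit, here is a toy model (the symbols $V$, $d$, and $f$ are my own illustrative notation, not anything from the question): let $V$ be the value that could be created with the resources in our future light cone, and suppose that after an extinction event an alien civilization reaches those resources after a delay $d$ and can still realize a fraction $f(d)$ of that value before everything ends. Then roughly

$$
\text{loss}_{\text{extinction}} \approx \big(1 - f(d)\big)\,V
\qquad\text{vs.}\qquad
\text{loss}_{\text{grabby misaligned AI}} \approx V,
$$

and since $f(d)$ should be close to 1 whenever the delay is short relative to the total time available to any civilization, the first loss is much smaller than the second.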