In this case "be blackmailed" means "contribute to creating the damn AI".
To quote someone else here: "Well, in the original formulation, Roko's Basilisk is an FAI that decided the good from bringing an FAI into the world a few days earlier (saving ~150,000 lives per day earlier it gets here)". The AI acausally blackmails people into building it sooner, not into building it at all. So failing to give in to the blackmail means the AI still gets built, just later, and it is still capable of punishing people.
I see.
This is the gist of the AI Box experiment, no?
No. Bribes and rational persuasion are fair game too.