If you don't mean Skynet as in the Skynet from Terminator, what do you mean by "Skynet"?
Yeah, I was talking about the Terminator scenario in the sense that an AI gains ultimate control and uses its power against humans, but is defeated. It doesn't have to include time-travelling cyborgs.
How do you measure, beyond gut feeling, how realistic these kinds of scenarios are? There is no way to assign probabilities accurately; all we can and should do is imagine as many consequences as possible.
I've been returning to my "reduced impact AI" approach, and am currently working on an idea.
What I need are ideas for features that might distinguish between an excellent FAI outcome and a disaster. The more abstract and general the ideas, the better. Anyone got suggestions? Don't worry about quality at this point; originality is prized more!
I'm looking for something generic that is easy to measure. At a crude level, if the only options were "paperclipper" vs FAI, we could distinguish those worlds by counting steel content.
So basically, some more or less objective measure under which good outcomes occur in a higher proportion than the baseline.
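To make "a higher proportion of good outcomes than the baseline" concrete, here is a toy Python sketch. Everything in it is invented for illustration: the "steel fraction" feature, the distributions for good and bad worlds, and the threshold are all made-up stand-ins, not a real proposal for measuring outcomes.

```python
import random

random.seed(0)

def sample_world(good):
    """Return a made-up 'steel fraction' feature for a simulated world.

    Assumption for illustration only: a paperclipper converts almost all
    accessible matter to steel, while a good outcome uses far less.
    """
    if good:
        return random.uniform(0.0, 0.3)   # good worlds: low steel fraction
    return random.uniform(0.9, 1.0)       # paperclipper worlds: nearly all steel

# Simulate a 50/50 mix of good and bad worlds.
worlds = [(True, sample_world(True)) for _ in range(500)]
worlds += [(False, sample_world(False)) for _ in range(500)]

# Baseline: proportion of good outcomes with no measurement at all.
baseline = sum(good for good, _ in worlds) / len(worlds)

# Condition on the measure: keep only worlds below a steel threshold.
threshold = 0.5
flagged = [(good, f) for good, f in worlds if f < threshold]
conditional = sum(good for good, _ in flagged) / len(flagged)

print(f"baseline P(good) = {baseline:.2f}")
print(f"P(good | low steel) = {conditional:.2f}")
```

The point of the sketch is just the shape of the criterion: a feature qualifies if conditioning on it raises the proportion of good worlds above the unconditioned baseline, even when the feature itself says nothing directly about human values.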