DaFranker comments on Humor: GURPS Friendly AI - Less Wrong

9 Post author: hankx7787 04 February 2013 04:38PM


Comment author: DaFranker 04 February 2013 07:18:10PM *  5 points [-]

The idea, if I parse correctly, is that in order to fail that hard you have to at least know part of what you're doing, and automatic failures are always regular failures (Boom!). However, your implementation failed in some detail somewhere, and now the FAI is being weird. Not necessarily entirely bad, just something unexpected or not-quite-what-we-wanted.

I think the FAI Critical Success Table would look more like:

Roll 1d6.

  1. Everyone immediately obtains universally consistent root access without the Core Wars, wherein the laws of the universe start literally bending to accommodate even the most contradictory concepts, such as "Torture 4^^^4 sentient beings for 5^^^5 years, with them experiencing pain solely for my personal enjoyment" not generating any kind of negative utility for anyone, including the sentient beings being tortured, and actions that lower any utility below optimal levels simply being timelessly non-existent (i.e. there is always some reason why they're never desired, never implemented, never enforced, which also happens to be a strictly dominant strategy for all implemented agents).

2 ... 6. (variations on the same theme of ultimate transdimensional / unboxed godhood)