
wedrifid comments on [LINK] Another "LessWrongers are crazy" article - this time on Slate - Less Wrong Discussion

Post author: CronoDAS | 18 July 2014 04:57AM | 9 points




Comment author: Viliam_Bur | 18 July 2014 12:09:04PM | 3 points

Also, in Newcomb's problem, the goal is to walk away with as much money as possible. So it's obvious what to optimize for.
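(Aside on that optimization target: here is a minimal sketch of the standard Newcomb payoff comparison, using the conventional $1,000,000 / $1,000 amounts and an assumed 99%-accurate predictor. These numbers are the usual illustrative ones, not anything stated in the comment.)

```python
# Expected winnings in Newcomb's problem under the conventional payoffs.
# All numbers here are the standard illustrative assumptions, not taken
# from the comment above.

OPAQUE_BOX = 1_000_000   # filled iff the predictor expects you to one-box
CLEAR_BOX = 1_000        # always contains $1,000
ACCURACY = 0.99          # assumed probability the predictor guesses right

def expected_payoff(one_box: bool) -> float:
    """Expected dollars taken home, given the player's choice."""
    if one_box:
        # Predictor right -> opaque box is full; wrong -> it's empty.
        return ACCURACY * OPAQUE_BOX
    # Two-boxing: the clear box for sure, plus the $1M only if the
    # predictor wrongly expected one-boxing.
    return CLEAR_BOX + (1 - ACCURACY) * OPAQUE_BOX

print(f"one-box: ${expected_payoff(True):,.0f}")   # one-box: $990,000
print(f"two-box: ${expected_payoff(False):,.0f}")  # two-box: $11,000
```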

What exactly is the goal with the Basilisk? To give it as much money as possible, just to build an evil machine that would torture you unless you gave it as much money as possible, but luckily you did, so you kinda... "win"? You and your five friends are the chosen ones who get the enjoyment of watching the rest of humanity tortured forever? (Sounds like how some early Christians imagined Heaven: only the few most virtuous ones get saved, and watching the suffering of the damned in Hell increases the joy of their own salvation.)

That completely ignores the problem that just throwing a lot of money around doesn't create a safe recursively self-improving superhuman AI. (Quoting the Sequences: "There's a fellow currently on the AI list who goes around saying that AI will cost a quadrillion dollars—we can't get AI without spending a quadrillion dollars, but we could get AI at any time by spending a quadrillion dollars.") So these guys working on this evil machine... hungry, living in horrible conditions, never taking a vacation or going on a date, never seeing a doctor, probably having mental breakdowns all the time, because they are writing the code that would torture them if they did any of that... is this the team we could trust to make sane and good decisions and get all the math right? If not, then we are pretty much fucked regardless of whether we donated to the Basilisk, because soon we are all getting transformed into paperclips anyway; the only difference is that 99.9999999% of us will get tortured first.

How about, you know, just not building the whole monster in the first place? Um... could the solution to this horrible problem really be so easy?

Comment author: wedrifid | 18 July 2014 04:29:38PM | 4 points

"How about, you know, just not building the whole monster in the first place? Um... could the solution to this horrible problem really be so easy?"

Yes.