wedrifid comments on Scenario analysis: semi-general AIs - Less Wrong

Post author: Will_Newsome 22 March 2012 09:11AM


Comments (66)


Comment author: wedrifid 22 March 2012 01:09:46PM 5 points [-]

It really seems to me that you'd have to be very, very confident that there were no gods around to punish you for you to think it was worth it to turn the humans into computronium.

You just Pascal-mugged future superintelligences into letting humans live. I've got to admit, that's kind of badass.

Comment author: Will_Newsome 22 March 2012 02:27:21PM 3 points [-]

I just learned that von Neumann got... um, "wagered". Considering von Neumann was clearly a transhuman this establishes a lower bound on how smart you can be and still accept Pascal's wager. (Though I somewhat suspect that von Neumann's true reasons for returning to Catholicism late in life are more complicated than that.)

Comment author: Wei_Dai 22 March 2012 09:53:53PM *  4 points [-]

Gary Drescher once reminded me that von Neumann may have already suffered neurological damage from metastatic cancer at that point, so this lower bound may not be as high as you think (unless that's what you were alluding to by "true reasons").

Comment author: Will_Newsome 22 March 2012 10:10:21PM 3 points [-]

Von Neumann had been a practicing Catholic earlier in life, so it's not that strange that he would return to Catholicism near the end. By "true reasons" I didn't mean brain damage, though thanks for bringing up that possibility. He was still transhumanly intelligent up until the end, but maybe not quite as transhumanly intelligent. But I guess I just meant that Pascal's wager surely wasn't his only consideration.

Comment author: Dmytry 22 March 2012 07:11:37PM *  1 point [-]

To expand on that, the difference between the "let humans live" Pascal's wager and the ordinary Pascal's wager is that letting humans live is the status quo.

Consider the 'don't eat me, my daddy is a cop' argument. It is hardly a Pascal's wager; it is a rational thing to consider, especially when there's no shortage of food. It is more plausible that the status quo is the product of the actions of an ultra-powerful being than that 'you must give me all the money you've got or the god will be pissed off'.
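Dmytry's distinction can be put as a toy expected-utility comparison. All probabilities and payoffs below are hypothetical numbers chosen purely for illustration (nothing in the thread specifies them); the point is only the structure: the mugging needs an astronomical claimed payoff to overcome a tiny probability, while the status-quo wager needs only a modest probability, because sparing the humans costs almost nothing when food is plentiful.

```python
def expected_utility(p_god, payoff_if_god, payoff_if_no_god):
    """Expected utility of an action given a probability p_god that a
    punishing god (or cop-daddy) exists."""
    return p_god * payoff_if_god + (1 - p_god) * payoff_if_no_god

# Ordinary Pascal's mugging: a tiny probability is supposed to be
# outweighed only by an astronomically large claimed payoff.
mugging = expected_utility(p_god=1e-9,
                           payoff_if_god=1e12,
                           payoff_if_no_god=-100)

# Status-quo wager: sparing the humans is cheap (food is plentiful),
# so even a modest probability of punishment dominates the small gain
# from eating them.
spare_humans = expected_utility(p_god=0.01,
                                payoff_if_god=0,
                                payoff_if_no_god=-1)
eat_humans = expected_utility(p_god=0.01,
                              payoff_if_god=-1000,
                              payoff_if_no_god=1)

print(mugging)        # positive only because of the 1e12 payoff
print(spare_humans)   # -0.99
print(eat_humans)     # -9.01: sparing beats eating at p_god = 0.01
```

With these illustrative numbers, the mugging's appeal collapses if the claimed payoff is merely large rather than astronomical, whereas the status-quo argument survives at quite ordinary probabilities.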

Comment author: XiXiDu 10 April 2012 10:29:44AM *  -1 points [-]

I just learned that von Neumann got... um, "wagered". Considering von Neumann was clearly a transhuman this establishes a lower bound on how smart you can be and still accept Pascal's wager.

It is quite fascinating how belief in God oscillates across intelligence (rationality?) levels. Chimpanzees are naturally atheistic. Average humans are religious. Above-average humans are usually atheistic. High-IQ individuals like Eliezer Yudkowsky tend to be agnostic, in the sense that they assign a nonzero probability to the existence of God and believe in the possibility of natural or artificial gods. And people on the verge of posthumanism like John von Neumann, of whom it was said that "only he was fully awake", again lean towards theism. I wonder if a truly posthuman AI would oscillate back to atheism while conjecturing that Omega is probably a theist.

Comment author: pedanterrific 10 April 2012 11:54:04AM 0 points [-]

Well, Omega would have to have a rather strange mind design not to believe in itself.

Comment author: Nisan 27 April 2012 02:44:33PM 1 point [-]

Like AIXI.