Nick_Tarleton comments on In conclusion: in the land beyond money pumps lie extreme events - Less Wrong

Post author: Stuart_Armstrong 23 November 2009 03:03PM




Comment author: Nick_Tarleton 23 November 2009 03:53:34PM 7 points

Obvious guess: Eli^H^H^H Michael Vassar doesn't think SIAI's budget shows increasing marginal returns. (Nor, for what it's worth, can I imagine why it would.)

Comment author: cousin_it 23 November 2009 04:01:39PM 4 points

That one's easy: successfully saving the world requires more money than they have now, and if they don't reach that goal, it makes little difference how much money they raise. Eliezer believes most non-winning outcomes are pretty much equivalent:

Mostly, the meddling dabblers won't trap you in With Folded Hands or The Metamorphosis of Prime Intellect. Mostly, they're just gonna kill ya.

(from here)

Comment author: Zack_M_Davis 23 November 2009 06:25:23PM 0 points

But cf. also:

I doubt my ability to usefully spend more than $10 million/year on the Singularity. What do you do with the rest of the money?

Comment author: Stuart_Armstrong 24 November 2009 09:12:23AM -1 points

And I probably should defer to their judgement on this, as they certainly know more than I do about the SIAI's work and what it could do with more money.

I was simply saying that in my estimation, expected utility would recommend that they splurge on Tr-Ro lottery tickets - but I'm still happy that they don't.

(Just in case my estimation is relevant: I feel the SIAI has a decent chance of moving the world towards an AI that is non-deadly, useful, and doesn't constrain humanity too much. With a lot more money, I think they could implement an AI that makes the world a fun heaven on earth. The expected utility is positive, but the increased risk of us all dying horribly doesn't make it worthwhile.)