XiXiDu comments on How many people here agree with Holden? [Actually, who agrees with Holden?] - Less Wrong

4 Post author: private_messaging 14 May 2012 11:44AM




Comment author: XiXiDu 14 May 2012 06:52:40PM *  6 points

Do you feel this conflicts with opinions expressed on your blog? If not, why not?

Your question demands a thoughtful reply, and I don't have the time to give one right now.

Maybe the following snippet from a conversation with Holden can shed some light on what is really a very complicated subject:

I believe that SIAI, even given its shortcomings, is valuable. It makes people think, especially the AI/CS crowd, and sparks debate.

I certainly do not envy you for having to decide if it is a worthwhile charity.

What I am saying is that I wouldn't mind if it kept its current funding. But if I believed there was even a small chance that they could build the kind of AI they envision, I would probably actively try to make them lose funding.

My position is probably inconsistent and highly volatile.

Just think about it this way. If you asked me whether I desire a world in which people like Eliezer Yudkowsky are able to think about AI risks, I would say yes. If you asked me why I wouldn't instead allocate the money to protect poor people against malaria, I can only admit that I don't have a good answer. That is an extremely difficult problem.

As I said, I am glad that people like you are thinking about these questions. And if I had to choose between funding you, thinking about charitable giving in general, and Eliezer Yudkowsky, thinking about AI risks, I would certainly fund you.

END OF EMAIL

I think it would be worth a substantial investment (not 1% of GDP, but more than $500,000 a year) to decrease the likelihood of an agent coming about that is far smarter than humans and hostile to them.

That doesn't mean I believe that "this is crunch time for the entire human species". If it were up to me to allocate the world's resources, I would also fund David Chalmers to think about consciousness.

I wrote that I "would be pleased" if they were to keep their current level of funding. I did not say that I recommend that people contribute money to SIAI, or that I would personally donate.

I might change my mind at any time though. I am still at the beginning of the exploration phase.

Comment author: Rain 14 May 2012 07:02:57PM *  3 points

My position is probably inconsistent and highly volatile.

Okay.