JonahSinick comments on A Proposed Adjustment to the Astronomical Waste Argument - Less Wrong

Post author: Nick_Beckstead | 27 May 2013 03:39AM | 19 points




Comment author: JonahSinick 27 May 2013 07:28:19PM 5 points

It's important to note that:

  1. There may be highly targeted interventions (other than x-risk reduction efforts) which could produce large trajectory changes (including indirectly improving humans' ability to address x-risks).

  2. With consideration #1 in mind, in deciding whether to support x-risk interventions, one has to consider room for more funding and diminishing marginal returns on investment.

(I recognize that the claims in this comment aren't present in the comment that you responded to, and that I'm introducing them anew here.)

Comment author: ygert 27 May 2013 07:59:34PM 4 points

> There may be highly targeted interventions (other than x-risk reduction efforts) which can have big trajectory changes (including indirectly improving humans' ability to address x-risks).

This is, more or less, the intended purpose behind spending all this energy on studying rationality rather than directly researching FAI. I'm not saying I agree with that reasoning, by the way, but it was the initial reasoning behind Less Wrong, for better or worse. Would we be farther ahead if, rather than working on rationality, Eliezer had started working immediately on FAI? Maybe, but likely not; I could see it being argued both ways. In any case, this is an actual, very concrete example of this kind of intervention.

Comment author: Eliezer_Yudkowsky 27 May 2013 07:58:38PM 8 points

Mm, I'm not sure what the intended import of your statement is; can we be more concrete? This sounds like something I would say in explaining why I directed some of my life effort toward CFAR, along with: "Because I found that, really, actually, in practice, the number of rationalists seemed like a sharp limiting factor on the growth of x-risk efforts; if I'd picked something lofty-sounding in theory that was supposed to have a side impact, I probably wouldn't have guessed as well" and "Keeping in mind that the top people at CFAR are explicitly x-risk aware and think of that impact as part of their job."

Comment author: JonahSinick 27 May 2013 08:10:09PM 2 points

Something along the lines of CFAR could fit the bill. I suspect CFAR could have a bigger impact if it targeted people with a stronger focus on global welfare, and/or people with greater influence, than the typical CFAR participant. But I recognize that CFAR is still at a nascent stage, so it's necessary to co-optimize for content development and growth.

I believe that there are other interventions that would also fit the bill, which I'll describe in later posts.

Comment author: Eliezer_Yudkowsky 27 May 2013 08:49:50PM 7 points

CFAR is indeed co-optimizing in that way and trying to maximize net impact over time; if you think that a different mix would produce a greater net impact, make the case! CFAR isn't a side-effect project where you just have to cross your fingers and hope that sort of thing happens by coincidence while the leaders are thinking about something else; it's explicitly aimed that way.