Bugmaster comments on Thoughts on the Singularity Institute (SI) - Less Wrong

Post author: HoldenKarnofsky 11 May 2012 04:31AM




Comment author: Bugmaster 15 May 2012 10:01:31PM 3 points

In your comment above, you said:

...I can't afford to take a taxi to and from the eye doctor, which means I spend 1.5 hrs each way changing buses to get there, and spend less time being productive on x-risk. That is totally not worth it. Future civilizations would look back on this decision as profoundly stupid.

You also quoted Eliezer saying something similar.

This outlook implies strongly that whatever SIAI is doing is of such monumental significance that future civilizations will not only remember its name, but also reverently preserve every decision it made. You are also quite fond of saying that the work that SIAI is doing is tantamount to "saving the world"; and IIRC Eliezer once said that, if you have a talent for investment banking, you should make as much money as possible and then donate it all to SIAI, as opposed to any other charity.

This kind of grand rhetoric presupposes not only that SIAI is correct in its risk assessment regarding AGI, but also that it is uniquely qualified to address this potentially world-ending problem, and that, over the ages, no one more qualified could possibly come along. All of this could be true, but it is far from the certainty that your writing seems to imply.

Comment author: jacob_cannell 16 May 2012 09:43:41AM 2 points

This outlook implies strongly that whatever SIAI is doing is of such monumental significance that future civilizations will not only remember its name, but also reverently preserve every decision it made.

In the SIAI/Transhumanist outlook, if civilization survives, a large fraction (perhaps a majority) of extant human minds will survive as uploads. As a result, all of their memories will likely be stored, dissected, shared, searched, judged, and so on. Much will be preserved in such a future. And even without uploading, plenty of people have maintained websites since the early days of the internet with no loss of information, and that is quite likely to remain true far into the future if civilization survives.

Comment author: lukeprog 15 May 2012 11:59:22PM 0 points

I'm not seeing how the above implies the thing you said:

[You assume] our choices are limited to only two possibilities: "Support SIAI, save the world", and "Don't support SIAI, the world is doomed".

(Note that I don't necessarily endorse things you report Eliezer as having said.)

Comment author: Bugmaster 16 May 2012 09:21:09PM 2 points

You appear to be very confident that future civilizations will remember SIAI in a positive way, and care about its actions. If so, they must have some reason for doing so. Any reason would do, but the most likely reason is that SIAI will accomplish something so spectacularly beneficial that it will affect everyone in the far future. SIAI's core mission is to save the world from UFAI, so it's reasonable to assume that this is the highly beneficial effect that the SIAI will achieve.

I don't have a problem with this chain of events, just with your apparent confidence that (a) it's going to happen in exactly that way, and (b) your organization is the only one qualified to save the world in this specific fashion.

(EDIT: I forgot to say that, if we follow your reasoning to its conclusion, then you are indeed implying that donating as much money or labor as possible to SIAI is the only smart move for any rational agent.)

Note that I have no problem with your main statement, i.e. "lowering the salaries of SIAI members would bring us too much negative utility to compensate for the monetary savings". This kind of cost-benefit analysis is done all the time, and future civilizations rarely enter into it.

Comment author: ciphergoth 16 May 2012 09:29:13AM 1 point

Well no, of course it's not a certainty. All efforts to make a difference are decisions under uncertainty. You're attacking a straw man.

Comment author: Bugmaster 16 May 2012 09:06:46PM 2 points

Please substitute "certainty minus epsilon" for "certainty" wherever you see it in my post. It was not my intention to imply 100% certainty; just a confidence value so high that it amounts to the same thing for all practical purposes.

Comment author: ciphergoth 17 May 2012 05:35:24AM 0 points

And where do SI claim even that? Obviously some of their discussions are implicitly conditioned on the fundamental assumptions behind their mission being true, but that doesn't mean that they have extremely high confidence in those assumptions.

Comment author: dlthomas 16 May 2012 09:34:18PM 0 points

I don't think "certainty minus epsilon" improves much. It moves the claim from a theoretical impossibility to a practical one, but looking that far out, I expect "likelihood" might be the better term.

Comment author: Bugmaster 17 May 2012 01:49:45AM 2 points

I don't understand your comment... what's the practical difference between "extremely high likelihood" and "extremely high certainty"?