Alicorn, a deontologist, wishes that a certain consequence (the salvation of the world) obtain, whether or not she is involved in producing it.
Giles, presumably a consequentialist, phrases his own wish so as to egoistically place himself at the site of the action.
The juxtaposition carries a certain irony.
Before seeing this subthread, I interpreted it almost exactly the opposite way. I thought of "I want the world to be saved" as just that, but "I want to save the world" as meaning "I want the world to be saved, and I am willing to work toward this goal myself." Sort of along the lines of this exchange from Terry Pratchett's The Wee Free Men:
‘Ah. Something bad is happening.’
Tiffany looked worried.
‘Can I stop it?’
‘And now I’m slightly impressed,’ said Miss Tick. ‘You said, “Can I stop it?”, not “Can anyone stop it?” or “Can we stop it?” That’s good. You accept responsibility. That’s a good start.’
When I say that I want to save the world, that's what I try to mean.
In general, though, I am sceptical that "producing world-saving actions" is what we should be aiming for. Maybe I am biased by the fact that I am a cautious person, but I think that if only we could make everyone a lot more cautious...
Aaaand not to put too fine a point on it, but how much research is that caution getting done, exactly? Philanthropic donations produced by this philosophy? Anything?
I don't actively want to be involved in doing it. I would be quite happy to be among the masses of the saved by someone else's hand. I'm willing to help when ways to do that present themselves, since ignoring ways to make things I want to happen happen would be pretty dumb.
This comment inspired me to make a donation to Village Reach. Your right action just got $350 worth of preventative medical care for kids, plus this praising comment.
I will extol thee, my fellow LessWronger, O SIAI donor, and I will bless thy name until June 1. Every day I will bless thee; and I will praise thy name until June 1. Great is Rain, and greatly to be praised; and eir greatness is searchable and indexed by Google.
Your action is particularly right in not requiring that every user limit the amount of praise to one comment.
I do a virtual Rain dance to honor this right action.
Further, I compound this by donating an additional $30 myself to SIAI right now.
I'll pat myself on the back for coming up with this idea, which has promised $340 to SIAI as of me submitting this comment.
Where can I exchange snide remarks for constructive criticism?
In all seriousness, I don't think Giles is trying to get much applause here, so much as make it easier for people to coordinate their efforts.
I think (correct me if I'm wrong) that he knows that he doesn't know the specific steps to take in order to accomplish his goals. Which is why he wants to talk to these people.
I think that he has done a pretty bad job of PR, and should have more concrete ideas and plans before he continues posting on the subject. Furthermore, he's continuing to use the heavily loaded phrase "save the world" in ways which probably discredit it, and this site.
That being said, I think that this comment is almost entirely destructive, and makes no progress towards anything other than continuing to tear Giles down. Which the current karma system is already doing.
I'm planning to save the world by accumulating a large amount of money and donating it to the most effective charity that I can find.
Two reasons why I currently think this path is best for me:
1) I think that my mind is much better suited to accumulating money than directly working on really hard problems. Decision theory just makes my head hurt.
2) If I change my mind about which charity I consider effective, being a donor allows me to immediately act on my updated beliefs without wasting my past learning. Ex: If I became an FAI researcher and then (after I had spent years learning how to be an effective FAI researcher) decided that life-extension technologies were more effective, I would have to study a bunch of new stuff. If I'm donating, I just send the money to a different place. Curious note: The influence of this factor on my final decision is inversely related to my confidence level in my current judgement.
Edit: I may be wrong about #2; the instrumental utility it grants may be smaller than I'm estimating. However, I think I have enough of a comparative advantage in making money that even if #2 grants me only a small amount of utility, my decision is...
I would very strongly advise that you donate something while you're trying to accumulate money. Otherwise I would bet against a generic person in your situation ever following through (Outside View).
If I have a choice between actions, and one of them is more likely to save the world than the other, I will take the one that is more likely to save the world.
Even I don't live up to that every time, not even close, but it sure sounds a lot scarier than "wanting to save the world", doesn't it?
Yesterday I went on vacation from LW, but today I thought I'd see how this post was going, since it had the potential to produce something new... Alas, in about 12 hours, it has sunk from -1 to -6, as the mob decides it is about nothing but "applause lights" and votes it down. This is a failure of imagination and it's about to become a lost opportunity. It is not every day that someone shows up wanting to organize the world-savers, and in this case, I see definite potential. Or is it really the case that all those altruists have no need for support? End of lecture, back to vacation.
Yes, I strongly prefer that earth-originating humane life survive and thrive and spread throughout the universe and make it much more fun and awesome to the fullest extent of what the laws of physics will allow, and I intend to use my life for this purpose.
(Though I'm curious, what kind of cooperation are you talking about, beyond what's already facilitated by entities like SIAI, LW, FHI, and the Existential Risk Reduction Career Network?)
I am nauseated by the very thought of being included in your list, despite my own practical plans in that direction. What is it with empty applause-generating exhortations these days? Ick. Double ick.
PS: Being put on a list of people with Dorikka's line of thought would not be psychologically distressing to me in the least. It is not nearly so creepy sounding.
Looks like if you want to save the world, you've gotta accept that you're going to lose some karma.
A call to action should come with a definite goal IMHO. This call to action comes with not much more than a collection of vague motherhood statements.
I want the world to not need to be saved, but will settle for it being saved. The reality of existential risk is such an inconvenience. I want to help, but probably won't have, recognize, and successfully act on the opportunity to do so.
The scenarios I can imagine where a list like this would be useful are farfetched.
Well, I think most of us want to save the world, or at least help to save it. The BIG problem is finding an efficient strategy to do so. We should make concrete proposals, not merely profess our altruism. ...and, not to be too hypocritical, here are my naive proposals:
I want the world to be saved, and am willing to take action to make that happen so long as the actions I take don't make me feel like a victim. I tend to feel like a victim if I take an action that reduces my standard of living, if I contribute to a lost cause, or in a few other scenarios that don't seem relevant here.
I presently feel that SIAI is blocking itself by apparently believing that solving the FAI problem is blocked on any or all of the following:
"I am concerned that this statement feels extreme and arrogant even if technically accurate; I really don't want my identity so publicly associated with this position." Could you remove my name from the list please?
ETA: Thanks!
I want the world to be saved. If that means I have to do something about it, then I have to do something about it.
I want to increase the probability of world survival. This I intend to do by choosing a career which has some impact on existential risk and by donating money to SIAI. I also believe that promoting cryonics decreases existential risk indirectly - if you expect to be around 1000 years from now, that tends to give a longer-term view on matters.
The effort it takes to keep up with the amount of analysis and meta analysis done here is quite exhausting.
The Lifeboat Foundation has built a list of people, some high-status, who have said that they want the world saved. They have done nothing else, but this list is a good thing to have.
I want to help save the world just as much as I want the world to be saved, but either would be amazing from my perspective.
I want the world (i.e., civilization) to survive. I would choose a lower standard of living for myself and a lower probability of personal survival to increase the probability of global survival.
Except for rather minor exertions (such as devoting a minor fraction of my time and energy over a couple of years to making sure that my rather strange set of values had at least one advocate in the singularitarian conversation -- something I stopped doing about Apr 2009) I have not actually done anything for my civilization because I am so ridiculously disabled by chronic illness that with p=.95 I must allocate almost all of my resources into solving that bitch of a problem before I can be any significant use to myself or the world.
But equally clearly, the list [of people who want to save the world] will not include everyone.
What are you basing this claim on?
"Save the world" is a subset of "improve the world" where saving is improving by a lot in a way that the world really needs it. "Improving the world" can mean settling for a smaller improvement, but probably doesn't mean "improving in every way so it will include saving the world". If people stop wanting to "save the world" because they weighted their desire to improve it in lesser ways anywhere near their desire to save it, to sound less egotistical, to avoid the applause light, or to dissociate from peopl...
But maybe some of us can find somewhere to talk that's a little quieter.
I guess we could have an irc meetup or something? To talk about what specifically we're doing, and what we can help each other with.
I think this sort of thing is quite common:
Rescuing things is widely regarded as being good - and the whole world acts as a superstimulus.
Comparing a disliked belief to a religious one has all the universal applicability of repeating what they say in a high-pitched tone of voice.
Post any "meta" (i.e. anything that's not "I want to save the world") under here to keep things tidy. Thanks.
"Save the world" has icky connotations for me. I also suspect that it's too vague for there to be much benefit to people announcing that they would like to do so. Better to discuss concrete problems, and then ask who is interested/concerned with those problems and who would like to try to work on them.
Well I just want to rule the world. To want to abstractly "save the world" seems rather absurd, particularly when it's not clear that the world needs saving. I suspect that the "I want to save the world" impulse is really the "I want to rule the world" impulse in disguise, and I prefer to be up front about my motives...
Everyone wants to save something, don't you think?
(ETA: I've realized that my comment isn't helpful.)
As someone who accepts both the doomsday argument and EDT (as opposed to TDT), I don't think the world can be saved.
I want to improve the world.
atucker wants to save the world.
ciphergoth wants to save the world.
Dorikka wants to save the world.
Eliezer_Yudkowsky wants to save the world.
I want to save the world.
Kaj_Sotala wants to save the world.
lincolnquirk wants to save the world.
Louie wants to save the world.
paulfchristiano wants to save the world.
Psy-Kosh wants to save the world.
Clearly the list I've given is incomplete. I imagine most members of the Singularity Institute belong here; otherwise their motives are pretty baffling. But equally clearly, the list will not include everyone.
What's my point? My point is that these people should be cooperating. But we can't cooperate unless we know who we are. If you feel your name belongs on this list then add a top-level comment to this thread, and feel free to add any information about what this means to you personally or what plans you have. Or it's enough just to say, "I want to save the world".
This time, no-one's signing up for anything. I'm just doing this to let you know that you're not alone. But maybe some of us can find somewhere to talk that's a little quieter.