Increase how much coercive power we hold over how many of our fellow human beings: the ability to make them do things or else.
I don't think this sort of power is very useful in the real world at all. It's great for Jack Bauer fantasies, but when was the last time you got something you really wanted from a prison inmate? And the amount of manpower and resources it takes to control people at that level is wastefully inefficient. Coalitions of willing allies are much more powerful.
While I was thinking of a way to express my thoughts on this, this chart from Yvain occurred to me:
http://lesswrong.com/lw/6nz/approving_reinforces_loweffort_behaviors/
In terms of getting more people to do X, ideally people should want X to occur, should like X occurring, and should approve of X occurring.
Coercion seems like it usually implies at least one of the following: Tha...
In this post, I'll try to tackle the question of whether this community and its members should focus more efforts and resources on improving their strength as individuals and as a community, than on directly tackling the problem of singularity. I'll start off with a personal anecdote, because, while I know it's not indispensable, I think anecdotes help the reader to think in near rather than far mode, and this post's topic is already too easily thought of in far mode in the first place.
The other day, I was in an idle conversation with a cab driver when he asked me: "What would you do if you won the lottery? Is there some particular dream you have, such as travelling the world or something?" I said (and I apologize in advance for the grandiosity and egotism of what follows, mostly because it might show a poor appraisal of my own competence and ability)
My reply surprised both of us. Him, because it was atypical: apparently most people would spend the money on luxury items and so on; that is, they would spend their newfound wealth on signalling that they have it. I think the mistake comes from seeing rich people do this and then assuming that it's what you should do if you become rich, the only other option apparently being saving it all up in an account. That a modern rationalist came up with an atypical answer to such a question is only to be expected.
But I was surprised too, because I found it strange that what I thought I ought to do and what I wanted to do coincided so perfectly. I wasn't even expecting those last two points; they sort of naturally came out in the spur of the moment. Upon further thought, I was also surprised that this turned out to be merely an exaggerated version of my pre-existing plan, which I am already attempting to follow with far less material means. That is to say, the dramatic change in money did not fundamentally change what I wanted to do with my (currently limited) lifetime.
But then I asked myself: if my priority is reducing existential risk, why am I not giving all the money to my favourite nonprofits immediately?
And that's where it hit me: I wanted to make myself stronger. And the point I'm trying to make is that, well, so should we all. Why?
There's a strong selfish component to that (not that there's anything wrong with healthy selfishness), but for someone who considers existential risk an extremely important factor, enlightened self-interest might still be on the side of donating immediately.
But it might also be a sound strategy, perhaps a sounder one, to exponentially increase our ability to help fight existential risk (in terms of fear) and to improve the general level of human rationality (in terms of desire: I believe we would all be happier in a world with more rational people, for many, many reasons, not all of which are altruistic). So, how would we go about this? I submit to you this tentative strategy draft.
A lot of effort has already been expended by the community in working on these first steps. But there's a third step that isn't getting worked on much, perhaps because of aesthetic values, perhaps because it's one of the most dangerous to wield, both to the world and to ourselves and our own personal integrity:
Those are partly selfish goals in themselves: power means freedom to do what you want, and both power and high social status are already very enjoyable for their own sake. Additionally, the more of us achieve them (and the larger the capacity in which we achieve them), the more resources we can get assigned and the more support we can gather (or force) for the sake of efforts towards preventing existential risk. But I suggest that they be mainly planned, optimized and instrumentalized for Step Four, the most dangerous of all:
Which has the following advantages I can think of, listed without regard for altruism or selfishness:
Does achieving Step Four mean humanity will actually be in less danger of self-destructing at that point? It's not a rhetorical question, and I don't think its answer is trivial: in particular, having many half-rationalists (I might well still be one myself) running around could represent a considerable danger, one that might be sustained over time. However, projects such as Methods of Rationality and the Center for Modern Rationality, as well as this site's very existence, seem to hint that some of the smartest among us are willing to take the risk.
So, the immediate question I ask of you in earnest, the whole point of this post: How do we go about spending our money and effort in the most effective way to prevent existential risk? How much do we expend directly attacking the problem as we are now, and how much do we expend on actually making ourselves stronger?
In sillier terms: Should the Z Warriors go and confront Cell right now, before he grows too strong to beat, or should they avoid the fight and go train instead? (Assume that they do nothing with their lives but fight, train to prepare for fights, or run away from fights they are not yet prepared for.)