Related to: People who want to save the world
I have recently been diagnosed with cancer, for which I am currently being treated with a good prognosis. I've been reevaluating my life plans and priorities in response. To be clear, I estimate that the cancer accounts for much less than half of the total danger to my life. The universal risks - X-risks, diseases I don't yet have, traffic accidents, and so on - are worse.
I would like to affirm my desire to Save Myself (and Save The World For Myself). Saving the world is a prerequisite simply because the world I live in is in danger. I believe my values are well aligned with those of the LW community; wanting to Save The World is a good applause light, but I believe most people want to do so at least partly for selfish reasons.
I would also like to ask LW members: why do you prefer to contribute (in part) towards humankind-wide X-risk problems rather than narrower but personally important issues? How do you determine the time and risk tradeoffs between things like saving money for healthcare and investing money in preventing an unfriendly AI FOOM?
It is common advice here to focus on earning money and donating it to research, rather than contributing in kind. How do you decide what portion of your income to donate to SIAI, what portion to SENS, and what portion to keep as money for purely personal problems that others won't invest in? There's no conceptual difficulty here, but I have no idea how to quantify the risks involved.
What absurdity? Here's you, Neil, and that's Tyler. It's possible to tell which of you two is Neil and which is not. A copy-Neil might be about the same thing as Neil, but this doesn't interfere with the simplicity of telling that Tyler is not the same thing. You may well care about Neil-like things more than about Tyler-like things. It's plausible from an evolutionary psychology standpoint that humans care about themselves more than about other people. By "myself" I mean "a thing like the one I'm pointing at", and the rest is a process of evaluating this symbolic reference into a more self-contained definition; this simple reference is sufficient to specify the intended meaning.
What I meant was that the common situation whereby a person both (i) believes in persisting subjective identity (sameness of Cartesian Theater over time) and (ii) attaches massive importance to it (e.g. using words like 'death' to refer to its extinction) doesn't obviously or frequently give rise to irrational decision-making until we start talking about things like cryonics, teleportation and cloning.
I apologise for the unclarity of my final sentence if you took me to be saying something stronger.