I wanted to love this post, but stylistic issues got in the way.
It read too much like a gwern essay: certainly interesting, but in need of a summary and a guide to how it can be applied in practice. A string of highlights and commentary with no clear underlying organization or conclusion is not optimally useful.
That being said, I appreciate you taking the time to create this post, as well as your call for constructive criticism.
Hmmm. Your ideas about cryonics funding do sound promising, but to be safe I would talk to Alcor directly, specifically Diane Cremeens, to confirm ahead of time that your arrangements will work for them.
That does sound about right, but with two potential caveats. One is that individual circumstances might also matter in these calculations. For example, my risk of dying in a car accident is much lower because I don't drive and only rarely ride in cars, while my risk of dying of heart disease is raised by a strong family history.
There may also be financial considerations. Cancer almost always takes time to kill, and heart disease and stroke often do too. If you were paying for cryonics out-of-pocket, this wouldn't matter, but if you were paying with life insuranc...
I don't think it's been asked before on Less Wrong, and it's an interesting question.
It depends on how much you value not dying. If you value it very strongly, the risk of a sudden, terminal, but not immediately fatal injury or illness, as mentioned by paper-machine, might be unacceptable to you and would point toward joining Alcor sooner rather than later.
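To put rough numbers on "sooner rather than later": here's a minimal sketch, assuming a made-up constant annual mortality rate (substitute the figure from an actual life table for your own age, sex, and risk factors), of how the chance of dying before you've signed up accumulates with each year of delay.

```python
# Illustrative only, not real actuarial data: an assumed annual all-cause
# mortality rate for a healthy younger adult. Replace with the life-table
# figure for your own age, sex, and risk factors.
ASSUMED_ANNUAL_MORTALITY = 0.001  # ~0.1% chance of dying in any given year


def p_die_before_signup(years_of_delay: int,
                        annual_rate: float = ASSUMED_ANNUAL_MORTALITY) -> float:
    """Probability of dying at some point during the delay, assuming a
    constant, independent mortality rate each year."""
    return 1.0 - (1.0 - annual_rate) ** years_of_delay


for delay in (1, 5, 10, 20):
    print(f"{delay:>2} year(s) of delay: "
          f"{p_die_before_signup(delay):.2%} chance of dying before signing up")
```

Even with these optimistic assumptions the risk compounds with delay, and the sudden-injury scenario above is exactly the slice of that risk you can't insure against after the fact.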
The marginal increase your support would add to the probability of Alcor surviving as an institution might also matter to you selfishly, since this would increase the probability that there will exist...
Good idea. (Note, if you haven't seen the film, here's a spoiler-heavy synopsis).
My line of thought:
Gur pngnfgebcuvp (yngre erirnyrq gb or rkvfgragvny) evfx bs gur svyz vf gur svpgvbany bar bs tvnag zbafgref, be xnvwh, nggnpxvat uhzna cbchyngvba pragref, ohg greebevfz hfvat shgher jrncbaf, nagvovbgvp erfvfgnapr, pyvzngr punatr, rgp. pna pyrneyl or fhofgvghgrq va.
Tvnag zbafgref pna jbex nf na rfcrpvnyyl ivfpreny qenzngvmngvba bs pngnfgebcuvp evfxf, nf abg bayl Cnpvsvp Evz ohg gur bevtvany, harqvgrq, Wncnarfr-ynathntr irefvba bs gur svyz Tbwven (gur frzvany...
Google Translate gets me "flight of death" or "wants death". "Flight of death" might refer to AK. More interestingly, "wants death" would make no sense in reference to himself wanting death, but it would make sense in reference to Voldemort wanting the deaths of others. There's some possible support for your interpretation there.
Your heuristic for getting the news checks out in my experience, so that seems worth trying.
I wouldn't be surprised if we've both seen plenty of Snowden/NSA on Hacker News.
Thanks for the links.
And while I agree with you that quitting the news would likely be intellectually hygienic and emotionally healthy, it would probably also work as an anti-akrasia tactic in the specific case of cutting out something I often turn to in order to avoid actual work. Similar to the "out of sight, out of mind" principle, but more "out of habit, out of mind".
To add to Principle #5, in a conversational style: "If something exists, it can be quantified. Beauty, love, and joy are concrete and measurable; you just fail at measuring them. To be fair, you lack the scientific and technological means of doing so, but failure is still failure. Your failing at quantification does not devalue something of value."
Try some exposure therapy for whatever it is you're often afraid of. Can't think of anything? I'd be surprised if you're completely immune to every common phobia.
Very interested.
Also, here's a bit of old discussion on the topic I found interesting enough to save.
I recently noticed "The Fable of the Dragon-Tyrant" under the front page's Featured Articles section, which made me realize that there's more to Featured Articles than the Sequences alone. This particular article (an excellent one, by the way) is also not from Less Wrong itself, yet is obviously relevant to it; it's hosted on Nick Bostrom's personal site.
I'm interested in reading high-quality non-Sequences articles (I'm making my way through the Sequences separately using the [SEQ RERUN] feature) relevant to Less Wrong that I might have miss...
Michaelcurzi's How to avoid dying in a car crash is relevant. Bentarm's comment on that thread makes an excellent point regarding coronary heart disease.
There are also Eliezer Yudkowsky's You Only Live Twice and Robin Hanson's We Agree: Get Froze, both on cryonics.
I have a few questions, and I apologize if these are too basic:
1) How concerned is SI with existential risks versus catastrophic risks?
2) If SI is solely concerned with x-risks, am I right to assume that you also think about how cat. risks can relate to x-risks (certain cat. risks might raise or lower the likelihood of other cat. risks, certain cat. risks might raise or lower the likelihood of certain x-risks, etc.)? It must be hard avoiding the conjunction fallacy! Or is this sort of thing more what the FHI does?
3) Is there much ten...
I saw this same query in the last open thread. I suspect you aren't getting any responses because the answer is long and involved. I don't have time to give you the answer in full either, so I'll give you the quick version:
I am in the process of signing up with Alcor because, after ten years of both observing cryonics organizations myself and reading what other people say about them, I've seen Alcor give a series of cues that it is the more professional cryonics organization.
So, the standard advice is: if you are young, healthy with a long life expectancy, ...