Look, while nothing you're saying here is particularly objectionable in my opinion (not that I agree, it's just that the disagreement is not violent), I've just gone over your comment history, and your comments have all been along the lines of "I don't believe I've gained any benefit from reading this post", "I don't think there's much worth in discussing this", "I'm not very convinced by the arguments made in this post", etc. It goes on like this for about half a year.
Which gets me thinking: okay, so you didn't like LessWrong from the very beginning -- but then why spend time telling everybody so? It doesn't make sense to make an account just to periodically express your dissatisfaction with the content posted -- I mean, when I believe a website to be boring and useless, I prefer not to bother with it and click the red X instead. Do you do this for every other site you stumble upon and come to dislike? Because that would be quite a lot of time wasted on places that just aren't worth it.
Presumably aceofspades thinks getting people to stop reading LW is a valuable use of his time (e.g. because it frees up the time of smart and talented people who could be doing more productive things) and/or that his leaving would be an unacceptable example of evaporative cooling.
That or, y'know, someone is wrong on the internet.
There are probably several points on which I would broadly agree with you; however, your post would be much better without the condescending tone. And perhaps without all the non sequiturs:
If the rest of the world is underconfident about these ideas, then these investments would surely have an enormous expected rate of return.
Why? If people don't believe that cryonics will work, you can't sell it to them for a lot of money even if they are wrong. (Disclaimer: I haven't signed up for cryonics.)
How many people responding to this survey have actually made significant personal preparations for survival, like a fallout shelter with food and so on which would actually be useful under most of the different scenarios listed?
(Disclaimer: I don't especially fear future global catastrophes, and moreover I don't think we can predict them significantly better than by random guessing.)
The questions on dust specks vs torture and Newcomb's Problem are so unlikely to ever be relevant in reality that I view discussion about them as worthless.
Relevant to what? It seems that those discussions were intended as illustrations of theoretical problems with common utilitarian and decision-theoretic intuitions. Learning that one's intuitions have a bounded domain and don't work well in extreme, unrealistic scenarios perhaps isn't a life-changing achievement, but it is at least interesting. Perhaps not interesting to you, but "not interesting to you" and "worthless" are different things. (Disclaimer: I don't think that having the correct answer to Newcomb's Problem and dust specks is going to be practically important in and of itself.)
Other people here apparently disagree, but if the rest of the world is undervaluing cryonics at the moment then why do those here with privileged information not invest heavily in the formation of new for-profit cryonics organizations, or start them alone, or invest in technology which will soon develop to make the revival of cryonics patients possible? If the rest of the world is underconfident about these ideas, then these investments would surely have an enormous expected rate of return.
I think you're relying on a whole slew of assumptions here which obviously do not hold.
How many people responding to this survey have actually made significant personal preparations for survival, like a fallout shelter with food and so on which would actually be useful under most of the different scenarios listed? I generously estimate 5% have made any such preparations.
What existential threat would a fallout shelter help with? There's a long list of existential threats which one could be worried about, you know... For someone enamored of economic judgment, you don't seem to be doing it very well.
My judgment of this site as of now is that way too much time is spent discussing subjects of such low expected value (usually because of absurdly low expected probability of occurring) for using this site to be worthwhile.
Possibly, but criticizing people for not doing cryonics startups or digging shelters is not going to prove such assertions.
I have never received evidence that I am less likely to be overconfident about things than people in general or that any other particular person on this site is.
You've never caught yourself in the act of falling for a cognitive bias detailed on this site?
My judgment of this site as of now is that way too much time is spent discussing subjects of such low expected value (usually because of absurdly low expected probability of occurring) for using this site to be worthwhile. In fact I hypothesize that this discussion actually causes overconfidence related to such things happening, and at a minimum I have seen insufficient evidence for the value of using this site to continue doing so.
I'm curious about what other web sites satisfy similarly high expectations. No snark intended.
RE: Cryonics - that particular reverse Kool-Aid doesn't come in my flavor yet, but I enjoy that a notable minority are willing to put their money where their mouth is. It gives discussions of futuristic edge cases a novel weight.
why do those here with privileged information not invest heavily in the formation of new for-profit cryonics organizations, or start them alone, or invest in technology which will soon develop to make the revival of cryonics patients possible?
"The market can stay irrational longer than you can stay solvent." (John Maynard Keynes)
Also, at what amount of money would you be indifferent between being put to death and receiving that much (to do as you please with, as directed in your last will if you wish), and staying alive but not receiving that amount?
The proposed dollar value of a human life to the economy and the amount where a person would be indifferent to dying and being able to dispose of that amount as they saw fit in their will, or living and not receiving it, are not the same.
The value of a person's life to the government and/or economy is more like the value of that person's life to themself, minus all consumer surplus.
The proposed dollar value of a human life to the economy
VSL isn't a measure of value "to the economy"; it's a measure of the value people place on risks to their own lives, relative to other consumption choices they could make. It maps onto things like people's willingness to pay for safety features in cars, to trade wages for job risk, and so forth.
However, there is still a wedge between VSL and
the amount where a person would be indifferent to dying and being able to dispose of that amount as they saw fit in their will, or living and not receiving it.
A person who would accept a 50% risk of death in exchange for a billion dollars (to spend on hedonism) in the event of survival could be unconcerned with the fates of her heirs or any other uses for an estate after death.
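For anyone unfamiliar with how a VSL figure is actually backed out of choices like those, here is a minimal sketch; the willingness-to-pay amount and the risk reduction below are invented purely for illustration, not taken from any real survey.

```python
# Minimal sketch of how a VSL estimate is derived from willingness to pay.
# Both numbers below are invented for illustration, not real survey data.

willingness_to_pay = 700.0    # dollars a person would pay for a safety feature
risk_reduction = 1 / 10_000   # reduction in annual probability of death it buys

implied_vsl = willingness_to_pay / risk_reduction
print(f"Implied VSL: ${implied_vsl:,.0f}")   # Implied VSL: $7,000,000
```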
Thanks for the correction.
I'm a bit surprised, though. Value-to-the-economy may not be a very good proxy for the value of a human life, but at least it's a coherent one, whereas I would be fairly shocked if the amounts people in general are willing to pay to mitigate risks to their lives turned out to be coherent on a money-per-unit-of-risk basis.
To take one of the metrics from the linked page:
Another method economists can use to estimate the VSL is by simply asking people (perhaps through questionnaires) how much they would be willing to pay for a reduction in the likelihood of dying, perhaps by purchasing safety improvements.
I'd be willing to bet good money that if you performed such a survey, and another survey in which you posited a certain number of deaths per year due to terrorism and asked how much tax money ought to go to fighting terrorism, the extrapolated value they assign to mitigating terrorist risk would be inconsistent with their stated value of home safety.
Certainly people's "revealed preferences" do not appear to indicate that they're consistent according to such a metric.
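To make the bet concrete, here's a toy consistency check of the sort being described; both hypothetical survey answers and all the dollar and risk figures are invented for the example.

```python
# Toy check of whether two hypothetical survey answers imply a consistent
# dollars-per-unit-of-risk valuation. All numbers are invented.

# Answer 1: pays $500/year for home safety that cuts annual death risk by 1 in 20,000.
implied_vsl_home = 500.0 / (1 / 20_000)        # $10,000,000

# Answer 2: supports $400/year in taxes against a personal terrorism death risk
# of 1 in 5,000,000 per year.
implied_vsl_terror = 400.0 / (1 / 5_000_000)   # $2,000,000,000

# If the implied valuations differ by orders of magnitude, the answers are not
# coherent on a money-per-unit-of-risk basis.
print(f"home: ${implied_vsl_home:,.0f}, terrorism: ${implied_vsl_terror:,.0f}")
```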
Newcomb's Problem and Specks vs. Torture... yes. The theory behind them, used to program an AI... not so much. Eliezer is generous enough to write in English for those who don't speak C++/pseudocode/math.
This post was pretty useless to me, by the way. I've heard these very same opinions before, on this very website (put much more politely and eloquently, and in some cases argued better), so this post doesn't affect my beliefs. Downvoting.
Edit: By the way, I'm here mostly because I enjoy spending time here. I didn't start reading this website to learn about existential risk OR cryonics OR Newcomb's problem.
While it's a fair point that this forum has a lot of noise and nonsense, the insights not easily available elsewhere justify spending some of my time here, at least for me. It only takes a moment to ignore/downvote posts you don't care for, anyway. YMMV.
It's not just the material there, but the references to material elsewhere that pay dividends.
I came here out of interest in the references to Jaynes and Korzybski. People who take those guys seriously are ahead of the curve, IMO. So they also take Cialdini and Kahneman seriously. And Hugh Everett. And some of them, Mencius Moldbug.
All guys with valuable insights.
I don't share the fascination with Newcomb problems either. So I don't read them. End of problem. It seems like the OP is busy looking for what he doesn't value instead of what he does. That's too bad for him. This place is a nexus of a lot of good ideas.
Evidently 72.9% of people here are at least considering signing up. I think the chance of cryonics working [...]
Doom avoidance and death avoidance evidently go together.
Belief in big technological advances over the next century or so would bump up both the chance of some kind of technological catastrophe and of cryonic revival being possible. On the other hand, a higher estimate of (cryonics-disrupting) catastrophe should reduce the expected payoff of cryonics.
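A toy expected-value calculation shows how those two effects pull against each other; every probability and dollar figure below is made up for illustration, not an estimate anyone here has endorsed.

```python
# Toy model of the trade-off above: faster technology may raise both the chance
# of revival and the chance of a cryonics-disrupting catastrophe.
# All inputs are invented for illustration.

p_revival = 0.05                # chance revival tech eventually works, absent catastrophe
p_catastrophe = 0.20            # chance a catastrophe destroys preserved patients first
value_if_revived = 10_000_000   # subjective dollar value placed on revival
cost = 200_000                  # rough lifetime cost of signing up

expected_payoff = (1 - p_catastrophe) * p_revival * value_if_revived
print(expected_payoff - cost)   # raising p_catastrophe lowers this net value
```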
Link to those results: http://lesswrong.com/lw/fp5/2012_survey_results/
I've been basically lurking this site for more than a year now and it's incredible that I have actually taken anything at all on this site seriously, let alone that at least thousands of others have. I have never received evidence that I am less likely to be overconfident about things than people in general or that any other particular person on this site is.
Yet in spite of this apparently 3.7% of people answering the survey have actually signed up for cryonics which is surely greater than the percent of people in the entire world signed up for cryonics. The entire idea seems to be taken especially seriously on this site. Evidently 72.9% of people here are at least considering signing up. I think the chance of cryonics working is trivial, for all practical purposes indistinguishable from zero (the expected value of the benefit is certainly not worth several hundred thousand dollars in future value considerations). Other people here apparently disagree, but if the rest of the world is undervaluing cryonics at the moment then why do those here with privileged information not invest heavily in the formation of new for-profit cryonics organizations, or start them alone, or invest in technology which will soon develop to make the revival of cryonics patients possible? If the rest of the world is underconfident about these ideas, then these investments would surely have an enormous expected rate of return.
There is also a question asking about the relative likelihood of different existential risks, which seems to imply that any of these risks are especially worth considering. This is not really a fault of the survey itself, as I have read significant discussion on this site related to these ideas. In my judgment this reflects a grand level of overconfidence in the probabilities of any of these occurring. How many people responding to this survey have actually made significant personal preparations for survival, like a fallout shelter with food and so on which would actually be useful under most of the different scenarios listed? I generously estimate 5% have made any such preparations.
I also see mentioned in the survey and have read on this site material related to in my view meaningless counterfactuals. The questions on dust specks vs torture and Newcomb's Problem are so unlikely to ever be relevant in reality that I view discussion about them as worthless.
My judgment of this site as of now is that way too much time is spent discussing subjects of such low expected value (usually because of absurdly low expected probability of occurring) for using this site to be worthwhile. In fact I hypothesize that this discussion actually causes overconfidence related to such things happening, and at a minimum I have seen insufficient evidence for the value of using this site to continue doing so.