Yes, if he had said "I think there is a small-but-reasonable probability that FAI could affect way way more than 3^^^3 people", I wouldn't have had a problem with that (modulo certain things about how big that probability is).
Well, small-but-reasonable times infinite equals infinite. Which is indeed way, way bigger than 3^^^3.
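For readers unfamiliar with the notation: 3^^^3 is Knuth's up-arrow notation. A minimal sketch of the recursion, usable only for tiny cases (3^^^3 itself is a tower of 7,625,597,484,987 threes and is far beyond computing):

```python
def up(a, n, b):
    """Compute a ↑^n b in Knuth's up-arrow notation.

    n = 1 is ordinary exponentiation; each extra arrow iterates
    the previous operation. Only feasible for very small inputs.
    """
    if n == 1:
        return a ** b
    if b == 0:
        return 1
    return up(a, n - 1, up(a, n, b - 1))

# Small sanity checks:
# up(3, 2, 2) = 3^^2 = 3**3 = 27
# up(2, 3, 3) = 2^^^3 = 2^^4 = 2**2**2**2 = 65536
```

Even 2^^^3 is already 65,536; 3^^^3 dwarfs anything physically representable, which is what makes its appearance in expected-value arguments so strange.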
If we [respond strongly to all low-probability threats], we spend 10 times GDP.
10 times current GDP perhaps. Motivating organization can do wonders for productivity. We are hardly at capacity.
Thermal radiation...?
If I were building a Dyson sphere, I'd want to collimate all the radiation toward a single direction, perhaps gating it periodically. Make it look like a pulsar.
don’t overemphasise any difficulties
Tautologically true, a truism. The "over" in "overemphasise" reduces your advice to "don't emphasise more than you should emphasise", which is a non-statement. The crux of the argument is, of course, to elicit how much is too much and to find that right balance: "Do the correct action!" isn't helpful advice, in my opinion.
The rest of your advice, building up some authority first so that the argument you're most interested in will be taken seriously, is instrumentally useful but epistemically fragile: "Become an authority so that people tend to believe you by default" has a bit of a dark-arts ring to it.
While in general it is a valid Bayesian inference to predict that someone who has turned out to be correct a bunch of times will continue to do so, if that history of being correct was built up mainly to lend credence to that final and crucial argument, the argumentum ad auctoritatem fails. Like many a snake oil salesman, building rep to make the crucial sale is effective, but it shouldn't work: it's building a "yes, yes, yes" loop when your final argument should stand on its own merits.
While in general it is a valid Bayesian inference to predict that someone who has turned out to be correct a bunch of times will continue to do so, if that history of being correct was built up mainly to lend credence to that final and crucial argument, the argumentum ad auctoritatem fails
You're right that the argument should stand on its own merit if heard to completion.
The point here is that heuristics can kick in early and the listener, either due to being irrational or due to time considerations, might not give the argument the time and attention to finish. This is about how to craft an argument so that it is more likely to be followed to completion.
We can't use our current economic theories to effectively model such a situation.
I'm still unclear, why not? Once the sphere is built, while the raw energy available is fixed, we can still have growth in computation per unit energy, right?
Is there anything specific about your case, or is the same procedure likely to help a lot of people to improve their hearing?
This should only help people who currently have earwax obstructions.
Something else. I'm not too familiar with the newspaper, so I wouldn't know whether it is right-libertarian biased (JoshuaZ says it's not, and the fact that it is based in SF is also evidence that it's not). The article had enough specifics that it's pretty hard to attribute all of its claims to bias anyway. And 1 isn't quite right either; the New England states are easy counterexamples. I was simply noting that a lot of the problems in San Francisco are fairly similar to the ones right-libertarians are often concerned about. I guess my comment was a bit vague; sorry.
It's possible that many of the right-libertarians ended up that way because of SF's problems.
I can't remember the article where this was stated, but we have instincts for morality because following them made our ancestors more successful. They're there for our benefit, not each other's. It seemed to your ancestors that killing someone and taking their stuff would be a net benefit, and if they didn't have a built-in aversion they'd do it, and they would likely get caught and punished.
This does not actually speak to the utility of such instincts to individuals. Rather, it indicates their utility to the gene bundle, by increasing the genes' probability of propagating. A tribe that stole from itself would not get very far through time.