Carl_Shulman

Comments

I.e., I agree with your analysis that they (and artemisinin treatment) are great and worth doing if the local governments don't tax or steal them (in various ways) too intensively.

Douglas,

It's $1,000 per life, not per net, because in most cases a net or a course of treatment won't avert a death.
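As a rough sketch of the arithmetic (the per-net cost and deaths-averted-per-net figures below are purely hypothetical placeholders, not numbers from the comment):

```python
# Hypothetical illustration of per-life vs. per-net cost.
# Both input figures are made-up placeholders for the arithmetic only.
cost_per_net = 10.0            # assumed dollars per net distributed
deaths_averted_per_net = 0.01  # assumed ~1 death averted per 100 nets

cost_per_life_saved = cost_per_net / deaths_averted_per_net
print(f"Cost per life saved: ${cost_per_life_saved:,.0f}")  # -> $1,000
```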

g,

There's plenty of room to work on vaccines and drugs for tropical diseases, improved strains of African crops like cassava, drip irrigation devices, charcoal technology, etc.

http://en.wikipedia.org/wiki/Amy_Smith http://web.mit.edu/newsoffice/2008/lemelson-sustainability-0423.html

kebko,

The best interventions today seem to cost about $1,000 per life saved. Much of the trillion dollars was Cold War payoffs, bribing African leaders not to go Communist, so the fact that it was stolen or wasted wasn't that much of a concern.

I tend to prefer spending money on developing cheaper treatments and Africa-suitable technologies, then putting them in the public domain. That produces value but nothing to steal.

Regarding g's point, I note that there's a well-established market niche for this sort of thing: it's like the popularity of Ward Connerly among conservatives as an opponent of affirmative action, or Ayaan Hirsi Ali (not to downplay the murderous persecution she has suffered, or necessarily to attack her views) among advocates of war against Muslim countries. She'll probably sell a fair number of books, get support from conservative foundations, and some nice speaking engagements.

g,

This is based on the diavlog with Tyler Cowen, who did explicitly say that decision theory and other standard methodologies don't apply well to Pascalian cases.

Pablo,

Vagueness might leave you subjectively unable to distinguish the probabilities, but you would still expect that an idealized reasoner using Solomonoff induction, with unbounded computing power and your sensory information, would not view the probabilities as exactly balancing, which gives further study of the question infinite information value.

The idea that further study wouldn't unbalance a human's estimates is empirically false, given the number of smart people whose estimates did shift after undertaking such study, and it looks like another rationalization.

The fallacious arguments against Pascal's Wager are usually followed by motivated stopping.

"that equally large tiny probabilities offer opposite payoffs for the same action (the Muslim God will damn you for believing in the Christian God)." Utilitarian would rightly attack this, since the probabilities almost certainly won't wind up exactly balancing. A better argument is that wasting time thinking about Christianity will distract you from more probable weird-physics and Simulation Hypothesis Wagers.

A more important criticism is that humans just don't, physiologically, have any emotions that scale linearly. To the extent that we approximate utility functions, we approximate ones with bounded utility; even utilitarians have only a bounded concern with acting, aspiring to act, or believing that they aspire to act, as though their concern for good consequences were close to linear in those consequences, i.e. a bounded interest in 'shutting up and multiplying.'
