satt comments on A critique of effective altruism - Less Wrong

64 Post author: benkuhn 02 December 2013 04:53PM



Comment author: atucker 02 December 2013 07:44:26PM, 3 points

It seems easier to evaluate "is trying to be relevant" than "has XYZ important long-term consequence". For instance, investing in asteroid detection may not be the most important long-term thing, but it's at least plausibly related to x-risk (and would be confusing for it to be actively harmful), whereas third-world health has confusing long-term repercussions, but is definitely not directly related to x-risk.

Even if third-world health is important to x-risk through secondary effects, it still seems that any effect it has on x-risk will necessarily be mediated through some object-level x-risk intervention. It doesn't matter what starts the chain of events that leads to decreased asteroid risk, but the chain has to pass through some relatively small family of interventions that deal with the risk on an object level.

Insofar as current society isn't involved in object-level x-risk interventions, it seems strange to expect that bringing third-world living standards closer to our own would lead to more involvement in x-risk intervention unless object-level x-risk interventions also become more widely available.

(Not that I care particularly much about asteroids, but it's a particularly easy example to think about.)

Comment author: satt 03 December 2013 02:58:41AM, 1 point

investing in asteroid detection may not be the most important long-term thing, but it's at least plausibly related to x-risk (and would be confusing for it to be actively harmful), whereas third-world health has confusing long-term repercussions, but is definitely not directly related to x-risk.

I'm inclined to agree. A possible counterargument does come to mind, but I don't know how seriously to take it:

  1. Global pandemics are an existential risk. (Even if they don't kill everyone, they might serve as civilizational defeaters that prevent us from escaping Earth or the solar system before something terminal obliterates humanity.)

  2. Such a pandemic is much more likely to emerge and become a threat in less developed countries, because of worse general health and other conditions more conducive to disease transmission.

  3. Funding health improvements in less developed countries would improve their level of general health and impede disease transmission.

  4. From the above, investing in the health of less developed countries may well be related to x-risk.

  5. Optional: asteroid detection, meanwhile, is mostly a solved problem.

Point 4 seems to follow from points 1-3. To me point 2 seems plausible; point 3 seems qualitatively correct, but I don't know whether it's quantitatively strong enough for the argument's conclusion to follow; and point 1 feels a bit strained. (I don't care so much about point 5 because you were just using asteroids as an easy example.)

Comment author: atucker 03 December 2013 05:35:05AM, 2 points

Though I can come up with a pretty convincing argument for the opposite.

Diseases only become drug-resistant through natural selection in environments where drugs intended to treat the disease are in use.

Third-world countries have trouble distributing drugs and treatments to everyone in the population, so diseases are likely not to be completely eradicated, but instead to persist in an environment where drugs are in use. Even within individual patients, there are problems with treating the disease consistently, so treatment is likely to exert selective pressure on the disease without curing it.

On the other hand, diseases rarely become drug-resistant when they're not exposed to the drugs.

Therefore, treating people in third-world countries increases the probability of producing drug-resistant strains of existing diseases.