ciphergoth comments on Other Existential Risks - Less Wrong

32 Post author: multifoliaterose 17 August 2010 09:24PM

Comment author: ciphergoth 18 August 2010 07:55:23AM 2 points [-]

WRT point D, it should be possible to come up with some sort of formula that gives the relative utility according to maxipok of working on various risks. Something that takes into account

  • The current probability of a particular risk causing existential disaster
  • The total resources in dollars currently expended on that risk
  • The relative reduction in risk that a 1% increase in resources on that risk would bring
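The three quantities above combine into a rough marginal cost-effectiveness figure: under maxipok, a dollar spent on a risk is worth roughly the risk's current probability, times the relative reduction a 1% funding increase buys, divided by the dollar cost of that 1% increase. A minimal sketch (the function name and all numbers are illustrative assumptions, not figures from this thread):

```python
def marginal_risk_reduction_per_dollar(p_risk, resources_usd, rel_reduction_per_1pct):
    """Expected absolute reduction in existential-disaster probability
    per extra dollar spent on this risk.

    p_risk              -- current probability the risk causes existential disaster
    resources_usd       -- total dollars currently spent on the risk
    rel_reduction_per_1pct -- fraction by which the risk shrinks if resources
                              rise by 1% (i.e. by 0.01 * resources_usd dollars)
    """
    extra_dollars = 0.01 * resources_usd  # cost of a 1% funding increase
    return p_risk * rel_reduction_per_1pct / extra_dollars

# Illustrative (made-up) numbers for two risks:
ai   = marginal_risk_reduction_per_dollar(0.05, 10e6, 0.002)  # -> 1e-9
nano = marginal_risk_reduction_per_dollar(0.01, 2e6, 0.004)   # -> 2e-9
# Maxipok says donate to whichever risk yields the larger figure.
```

On these made-up inputs the less-funded nanotech risk wins per donated dollar, despite its lower probability, because a dollar buys a larger proportional change there.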

These I think are all that are needed when considering donations. When considering time rather than money, you also need to take into account:

  • The dollar value of one hour of a well-suited person's leisure time spent on the risk
  • The relative value of one hour of one's own time spent on the risk, compared to that of the well-suited person measured against

This is to take into account that it might be rational to work on AI risk even while donating to, say, a nanotech-related risk organisation, if your skillset were particularly well suited to it.
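The time comparison can be sketched the same way (function name and all numbers are hypothetical): one of your own hours is worth the well-suited person's hourly dollar value, scaled by your effectiveness relative to that person, converted through the risk's per-dollar figure.

```python
def own_hour_risk_reduction(per_dollar, hour_value_usd, relative_skill):
    # Risk reduction from one hour of your own work on a risk:
    # the well-suited person's hourly dollar value, scaled by how
    # effective you are relative to them, converted via the risk's
    # marginal risk-reduction-per-dollar figure.
    return per_dollar * hour_value_usd * relative_skill

# Illustrative numbers: assumed per-dollar figures for two risks, and a
# skillset well suited to AI work (1.5x the reference person) but not
# to nanotech (0.3x).
ai_hour   = own_hour_risk_reduction(1e-9, 100, 1.5)  # -> 1.5e-7
nano_hour = own_hour_risk_reduction(2e-9, 100, 0.3)  # -> 6e-8
```

Even though the nanotech risk wins per donated dollar here (2e-9 vs 1e-9), your own hours do more good on AI, which is exactly the donate-to-one-while-working-on-another situation described above.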

Comment author: taw 18 August 2010 12:45:13PM -2 points [-]
  • The current probability of a particular risk causing existential disaster
  • The total resources in dollars currently expended on that risk
  • The relative reduction in risk that a 1% increase in resources on that risk would bring

How can #1 and especially #3 be anything more than ass pulls? I don't even see how to calculate #2 in a reasonable way for most risks.

Comment author: ciphergoth 18 August 2010 07:08:06PM 1 point [-]

What superior method of comparing such charities are you measuring this against?