Eliezer_Yudkowsky comments on Money: The Unit of Caring - Less Wrong

Post author: Eliezer_Yudkowsky 31 March 2009 12:35PM


Comment author: Johnicholas 31 March 2009 04:52:07PM  13 points

I tried to make this observation before, but my point doesn't seem to have been addressed in this follow-up.

Throwing money in the direction of a problem without checks and balances to ensure that the money is actually spent productively is wrong.

For example, suppose that Dark Side Charity's message is just like Light Side Charity's message: "give me money to save the world". However, Dark Side Charity doesn't spend the money on saving the world, but on sending out more and more requests. Giving money to Dark Side Charity would be wrong. Because the two charities' requests are identical, giving money to Light Side Charity based only on the request is also wrong.

You might argue that you just need to estimate the probability that you are talking to the Light Side. However, remember that Dark Side Charity will grow when someone sends it money, changing the frequency with which Dark Side Charity's requests are encountered. If (as might well be the case) the system is already at equilibrium, then your probability estimate will depend primarily on the force that stopped the positive feedback - e.g. the cost of sending the request. Spam is frequent primarily because it is cheap to send.
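The feedback loop here can be sketched as a toy model (my construction, not Johnicholas's; all numbers and names are hypothetical): a charity mails requests, reinvests every donation in next round's mailing, and the growth or collapse of the volume is governed entirely by the ratio of expected revenue per request to the cost of sending one.

```python
def simulate_requests(n0, response_rate, donation, cost, rounds=50):
    """Request volume per round when all revenue is reinvested in mailing."""
    n = float(n0)
    history = [n]
    for _ in range(rounds):
        revenue = n * response_rate * donation  # donations received this round
        n = revenue / cost                      # how many requests revenue buys
        history.append(n)
    return history

# Growth factor per round is (response_rate * donation) / cost, so volume
# explodes whenever a request's expected revenue exceeds its cost:
cheap = simulate_requests(1000, 0.01, 10.0, cost=0.05)   # factor 2.0: grows
pricey = simulate_requests(1000, 0.01, 10.0, cost=0.50)  # factor 0.2: dies out
```

Cheap requests are exactly the ones that saturate the environment before anything else stops them, which is the email-spam situation: the equilibrium frequency of requests is set by the sending cost, not by the sender's honesty.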

My suggestion: Incorporate this idea into the request for money, and proffer evidence that the money is being spent well. A list of "this is how we spent last year's money" isn't sufficient - Dark Side Charity could easily make such a list. An independent third-party auditor's stamp of approval might help. Successes broadcast to the world might help. Accepting volunteers even though it seems inefficient might help.

Comment author: Eliezer_Yudkowsky 31 March 2009 11:31:31PM 13 points

Did you just prove that in the absence of trustworthy auditors believed to be trustworthy, the Dark Side always wins because it invests more resources into future growth?

Comment author: MichaelVassar 31 March 2009 11:33:34PM 3 points

wasn't that obvious?

Comment author: Eliezer_Yudkowsky 31 March 2009 11:52:00PM 5 points

It is never obvious that the Dark Side wins.

If we're talking about replicators versus fun theorists, then in this circumstance the rational fun theorist will grow as fast as possible (since it is possible) up until exponential growth hits a barrier, and only then begin devoting any resources to fun. It still loses to the replicator but not by much.
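This scenario can be put in code (a coarse sketch of my own, not Eliezer's model; the growth rate and barrier are arbitrary): both agents reinvest everything until resources hit a barrier, after which the replicator idles while the fun theorist diverts its income stream into fun. In this simplified version they tie exactly in size; the "loses but not by much" gap would come from whatever small edge the replicator squeezes out at the barrier, which this sketch omits.

```python
def run(spends_on_fun, r=0.5, cap=1e6, steps=60):
    """Grow exponentially until `cap`; then either idle or buy fun."""
    resources, fun = 1.0, 0.0
    for _ in range(steps):
        income = resources * r
        if resources < cap:
            resources += income      # reinvest everything in growth
        elif spends_on_fun:
            fun += income            # growth saturated: convert income to fun
    return resources, fun

replicator_res, replicator_fun = run(spends_on_fun=False)
fun_theorist_res, fun_theorist_fun = run(spends_on_fun=True)
# Identical trajectories up to the barrier, so identical final size here;
# only the fun theorist ends up with any fun.
```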

Among humans the Light Side will generally use different tactics and will seek other advantages.

Comment author: PhilGoetz 01 April 2009 04:00:50AM 18 points

In this particular case, the Light charity is like a bacterium that you've engineered to produce a desired protein that is not needed for its own survival. When you put these bacteria in a bioreactor, mutations inevitably revert some of them to the wild type, which doesn't make that protein but puts all its energy into reproduction. The wild type quickly takes over the bioreactor and drives the "altruistic" bacteria to extinction. This is not a Prisoner's Dilemma case where some equilibrium arises between exploitation and cooperation. Without some countervailing force not specified here, exploitation wins.
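The bioreactor dynamic is standard replicator dynamics with mutation, and a minimal sketch makes the takeover vivid (my illustration; the growth rates and mutation rate are invented for the example): producers pay a growth-rate cost for making the protein, a tiny per-generation mutation rate seeds the wild type, and the wild type's compounding advantage does the rest.

```python
def bioreactor(generations=200, mu=1e-4, r_producer=1.0, r_wild=1.1):
    """Producer fraction after repeated growth, back-mutation, and dilution."""
    producer, wild = 1.0, 0.0        # start from a pure producer culture
    for _ in range(generations):
        grown_producer = producer * r_producer
        grown_wild = wild * r_wild + grown_producer * mu  # mutation to wild type
        grown_producer -= grown_producer * mu
        total = grown_producer + grown_wild
        producer, wild = grown_producer / total, grown_wild / total  # dilution
    return producer

final_producer_fraction = bioreactor()
# Even from a pure producer culture, the wild type's 10% growth edge
# compounds and the producer fraction collapses toward zero.
```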

Comment author: Johnicholas 01 April 2009 02:16:23PM 1 point

Auditors are only one of the ways that the Light Side Charity can distinguish itself.

I think this is a signalling problem; the Light Side Charity needs to find a visible activity that it can do more cheaply than the Dark Side Charity, and invest sufficient effort into that activity to distinguish itself.
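The requirement is a separating equilibrium in the costly-signalling sense, and it reduces to a simple inequality (a hedged numeric sketch; every number here is hypothetical): a charity rationally sends a signal only if its cost is below the donations it unlocks, so the Light charity needs an activity cheap enough for itself but too expensive for the Dark charity to imitate.

```python
def worth_signaling(signal_cost, donation_gain):
    """A signal is rational only if it costs less than what it unlocks."""
    return signal_cost < donation_gain

donation_gain = 100.0  # hypothetical: donations unlocked by a credible signal
light_cost = 40.0      # hypothetical: publishing outcomes it genuinely has
dark_cost = 150.0      # hypothetical: fabricating equivalent evidence

separating = (worth_signaling(light_cost, donation_gain)
              and not worth_signaling(dark_cost, donation_gain))
# separating is True: only the Light charity sends the signal,
# so the signal actually carries information for donors.
```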