Comment author: MichaelVassar 07 January 2010 01:05:19AM 0 points [-]

In practice, if you are only talking about the 70 most important steps that people are prone to messing up, that could easily be correct. Not to mention the probability of doing harm. Certainly there are a lot more than 10 steps that people are prone to messing up which reduce value by more than 80% in practice.

Comment author: dansmith 07 January 2010 01:27:11AM 1 point [-]

I suppose it depends on what kinds of decisions you're talking about making (e.g., keeping AIs from destroying humanity). I was thinking along the lines of day-to-day decision making, in which people generally manage to survive for decades in spite of ridiculously flawed beliefs -- so it seems there are lots of situations where performance doesn't appear to degrade nearly so sharply.

At any rate, I guess I'm with ciphergoth: the more interesting question is why 99% accurate is "maybe maybe" okay, but 95% is "hell no". Where do those numbers come from?

Comment author: bogus 06 January 2010 02:02:02AM *  9 points [-]

I've learned that ironic self-referential humor has a surprisingly low chance of making it across the Internet gap.

<sarcasm>You don't say?</sarcasm>

I'm not really sure about Furcas's remark. There is a real correlation between having a "thick skin" and a propensity to be mean to others, and far too many people seem to think that the former entitles them to the latter. This is why Crocker's Rules have been so widely misinterpreted.

Comment author: dansmith 07 January 2010 12:35:10AM 6 points [-]

On the other hand, people also often seem to think that a propensity to be nice to others entitles them to have an overly thin skin.

Comment author: MichaelVassar 06 January 2010 11:42:44PM 1 point [-]

I can't either, but my basic reaction is simply that, in practice, purity is critical here. If acting correctly requires a person to do more than 70 cognitive things right, then their expected value falls by roughly half for every 1% chance of error per step.
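The arithmetic behind this claim can be sketched as follows (an editorial illustration, not from the original comment; it assumes ~70 independent steps, a uniform per-step error rate, and zero value if any step fails):

```python
# Illustrative sketch: expected value when success requires ~70
# independent cognitive steps, each with the same chance of going wrong,
# and any single failure drives the value to zero.

def expected_value(steps: int, error_rate: float) -> float:
    """Probability that all `steps` go right, i.e. (1 - error_rate) ** steps."""
    return (1 - error_rate) ** steps

# At a 1% per-step error rate, value is roughly halved:
print(expected_value(70, 0.01))  # ~0.495

# Each additional 1% roughly halves it again:
print(expected_value(70, 0.02))  # ~0.243

# At a 5% per-step error rate, almost nothing survives:
print(expected_value(70, 0.05))  # ~0.027
```

This is consistent with the "99% maybe, 95% hell no" intuition above: 99% per-step accuracy preserves about half the value, while 95% preserves under 3%.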

Comment author: dansmith 07 January 2010 12:19:13AM 0 points [-]

Assuming any action anywhere short of optimal results in zero value, sure. In practice?

Comment author: Rain 31 December 2009 06:30:07PM *  3 points [-]

What grant project listed there will produce the most utilons?

My first guess would be the General Fund, since SIAI employees would then be free to choose the topics they feel are most effective at achieving their goals, and they have the most motivation and knowledge about the topics at hand.

However, they have insider views, which may skew their assessment of the value some projects will produce. It also appears that they either want to signal a willingness to accept outside guidance in the form of donations to specific projects, or are attempting to solicit donations from people who would not otherwise have donated without knowing where their money was going.

The latter doesn't appear to be working very well, considering the low dollar figure of donations outside the general fund, and it may in fact discourage the few who attempted to fund their pet projects. If so, I would suggest allowing donations to specific projects only as a special case.

Comment author: dansmith 31 December 2009 08:27:13PM 2 points [-]

Regarding the last point, at the very least SIAI would be better off not advertising that several projects are only partially funded, to the tune of $5 out of thousands. It doesn't exactly motivate one to open one's own wallet for a similarly small donation.

If you need to keep contributors updated on the extent to which their projects have received funding, perhaps do so privately by email on request?

Comment author: dansmith 29 December 2009 10:28:38AM 1 point [-]

In this model, people aren't just seeking status, they're (also? instead?) seeking a state of affairs that allows them to believe they have status.

It seems like most situations that this theory covers are already explained by either:
(a) people seek status not only in the context of society at large but also in the context of small groups, or
(b) for the cases where no one else knows, ego -- people seek to feel good about themselves (including believing that they are smart).

Perhaps the (b) cases are explained better by the "seeking plausible belief in own status" model, but I'm not sure that that's clear, at least from what's been written so far.