One of my frequent criticisms of LessWrong denizens is that they are very quick to say "This is too confused" when they should be saying "I don't understand and don't care to take the time to try to understand".
The burden of clarity falls on the writer. Not all confusion is the writer's fault, but confused writing is a major problem in philosophy - I would say a bigger problem than outright falsehood. There's no shame in being confused; almost everyone is, especially around complex topics like morality. But you can't expect to make novel contributions of any value until you've untangled the usual confusions and understood the progress that's already been made.
Or can you really not picture someone over-optimizing their pursuit of money at the expense of their happiness, or of the rest of their life?
If someone sacrifices happiness to seek money, the problem is not that they're doing too good a job of earning money; it's that they're optimizing the wrong thing entirely. An AI given your advice against over-optimizing wouldn't redirect resources toward making people happy. Instead, it would waste some of its money to make sure it never had "too much."
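A toy sketch of that failure mode (all the numbers, payoffs, and objective functions here are hypothetical, chosen only for illustration): an agent that maximizes a proxy (money) ignores happiness entirely, and bolting on a "don't have too much" cap leaves it just as indifferent to happiness - burning the surplus scores exactly as well as earning less.

```python
EFFORT = 10   # units of effort available (hypothetical numbers throughout)
WAGE = 10     # money earned per unit of effort spent working
CAP = 60      # the "too much money" line drawn by the anti-over-optimizing patch

def outcome(work, burn):
    """work units go to earning; leftover effort becomes leisure (happiness);
    burn is money deliberately destroyed to stay under the cap."""
    money = WAGE * work - burn
    happiness = EFFORT - work
    return money, happiness

# Every plan the toy agent can consider: an effort split plus an amount to burn.
plans = [(w, b) for w in range(EFFORT + 1) for b in range(0, WAGE * w + 1, WAGE)]

def optima(objective):
    """All plans that maximize the given objective."""
    scores = {p: objective(*outcome(*p)) for p in plans}
    top = max(scores.values())
    return {p for p, s in scores.items() if s == top}

money_only = lambda m, h: m                      # the proxy objective
capped     = lambda m, h: m if m <= CAP else -1  # proxy + "don't over-optimize" patch
true_util  = lambda m, h: m * h                  # what was actually wanted:
                                                 # money and happiness as complements

assert optima(money_only) == {(10, 0)}  # all work, zero happiness
assert (10, 40) in optima(capped)       # earn everything, burn the surplus...
assert (6, 0) in optima(capped)         # ...ties with earning less: the patch
                                        # never rewards happiness at all
assert optima(true_util) == {(5, 0)}    # the right objective trades earning
                                        # for happiness, and burns nothing
```

The point of the two middle assertions: capping the proxy produces a whole set of equally "optimal" plans, including the one that earns maximally and destroys the excess; nothing in the patched objective ever pushes the agent toward happiness.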
In the spirit of Asimov's 3 Laws of Robotics, it is my contention that Yudkowsky's CEV converges to the following 3 points:
I further contend that, if this CEV is translated into the 3 Goals above and implemented in a Yudkowskian Benevolent Goal Architecture (BGA), the result would be a Friendly AI.
It should be noted that evolution and history suggest that cooperation and ethics are stable attractors, while submitting to slavery (when you don't have to) is not. This formulation expands Singer's Circles of Morality as far as they will go and tries to eliminate irrational Us-Them distinctions based on anything other than optimizing goals for everyone: the same direction humanity already seems headed in, and exactly where current SIAI proposals come up short.
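The "cooperation is a stable attractor" claim can be illustrated with the standard iterated prisoner's dilemma (this is my own textbook-style sketch, not from the post; the payoffs are the conventional ones): once reciprocal cooperators like tit-for-tat are common, an unconditional defector cannot out-score them, so cooperation, once established, resists invasion.

```python
R, T, S, P = 3, 5, 0, 1   # reward, temptation, sucker, punishment payoffs
ROUNDS = 20               # length of the repeated game (arbitrary)

def payoff(me, other):
    """One round of the prisoner's dilemma from my point of view."""
    if me == "C" and other == "C": return R
    if me == "D" and other == "C": return T
    if me == "C" and other == "D": return S
    return P

def play(strat_a, strat_b, rounds=ROUNDS):
    """Total scores for two strategies; each sees only the opponent's history."""
    hist_a, hist_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        a = strat_a(hist_b)
        b = strat_b(hist_a)
        score_a += payoff(a, b)
        score_b += payoff(b, a)
        hist_a.append(a)
        hist_b.append(b)
    return score_a, score_b

tit_for_tat = lambda opp: "C" if not opp else opp[-1]  # cooperate, then mirror
always_defect = lambda opp: "D"

tft_vs_tft, _ = play(tit_for_tat, tit_for_tat)       # mutual cooperation: 60 each
dfct_score, _ = play(always_defect, tit_for_tat)     # defector vs. reciprocator: 24

# In a population of reciprocal cooperators, cooperating pays strictly better
# than defecting: the cooperative equilibrium is stable against invasion.
assert tft_vs_tft > dfct_score
```

This is only the invasion-resistance half of the story, of course: the same payoffs also make a population of pure defectors locally stable, which is exactly why getting *to* the cooperative attractor matters.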
Once again, this is cross-posted here on my blog. (Unlike my last article, I have no idea whether this one will be karma'd out of existence or not ;-)