Nick_Tarleton comments on Anthropomorphic Optimism - Less Wrong

25 Post author: Eliezer_Yudkowsky 04 August 2008 08:17PM


Comment author: Nick_Tarleton 05 August 2008 06:03:02PM 0 points [-]

If there's any slightest chance that it will result in infinite bad, then the problem is much more complicated.

There's always a nonzero chance that any action will result in infinite bad, and also a nonzero chance that it will result in infinite good. Even with finite but unbounded utility functions, the expected utility diverges.
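A minimal numerical sketch of the divergence claim (toy numbers, not from the comment itself): in a St. Petersburg-style lottery, outcome n has probability 2^-n and utility 2^n. Every individual outcome is finite, yet each term contributes 1 to the expected utility, so the partial sums grow without bound.

```python
def partial_expected_utility(n_terms):
    """Sum of p_n * u_n over the first n_terms outcomes,
    with p_n = 2**-n and u_n = 2**n (each term equals 1)."""
    return sum((2.0 ** -n) * (2.0 ** n) for n in range(1, n_terms + 1))

# The partial sums increase linearly with the number of outcomes
# considered, so the full expectation is infinite.
for k in (10, 100, 1000):
    print(k, partial_expected_utility(k))
```

Truncating the lottery at any finite depth gives a finite expectation, but no truncation is principled: the expectation exceeds any bound you name.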

Utility maximization together with an unbounded utility function necessarily leads to what Nick calls fanaticism.

Bounded utility functions have counterintuitive results as well. Most of these show up only in rare (but still realistic) global "what sort of world should we create" situations, but there can be local effects too: as I believe Carl Shulman pointed out, a bounded utility function causes your decisions to be dominated by low-probability hypotheses under which there are few people (so that your actions can have a large effect).
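A toy sketch of the Shulman-style effect (the specific numbers and the choice of utility function are my assumptions, not from the comment): suppose utility is the fraction of existing people who are helped, which is bounded in [0, 1]. Helping one person then moves utility by 1/N, so a low-probability hypothesis with a tiny population can dominate the expected-utility calculation.

```python
def expected_utility_gain(prob, population):
    """Expected gain in bounded (fraction-helped) utility
    from helping one person, under a hypothesis with the
    given probability and population size."""
    return prob * (1.0 / population)

# Hypothetical numbers: a near-certain huge world vs. an
# unlikely tiny one.
big_world = expected_utility_gain(0.99, 10**10)  # ~1e-10
small_world = expected_utility_gain(0.01, 100)   # 1e-4

print(big_world, small_world, small_world > big_world)
```

Despite being 99 times less probable, the small-world hypothesis contributes roughly a million times more expected utility, which is the domination effect described above.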