No write-up. The idea is that you can decide between two situations by choosing the one with greater information or complexity. The trickiness is in deciding how to measure information or complexity, and in deciding what to measure the complexity of. You probably don't want to conclude that, in a closed system, the ethically best thing to do is nothing because doing anything increases entropy. (Perhaps using a measure of computation performed, instead of a static measure of entropy, would address that.)
This immediately gives you a number of ethical principles that are otherwise difficult to justify, such as valuing evolution, knowledge, diversity, and the environment, and condemning (non-selective) destruction and censorship. Also, whereas most ethical systems tend toward extreme points of view, the development of complexity is greatest when control parameters take on intermediate values. Conservatives value stasis; progressives value change; those who wish to increase complexity aim for a balance between the two.
(The equation in my comment is not specific to that idea, so it may be distracting you.)
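As a toy sketch of the decision rule above — with Shannon entropy standing in for whatever measure of information or complexity one actually adopts, and species lists standing in for "situations" (both are my illustrative assumptions, not part of the proposal itself):

```python
from collections import Counter
from math import log2

def shannon_entropy(states):
    """Shannon entropy (in bits) of an empirical distribution over states."""
    counts = Counter(states)
    n = len(states)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def prefer(outcome_a, outcome_b):
    """Toy decision rule: pick the outcome whose distribution of states
    carries more information. Which measure to use, and what to measure
    it over, is exactly the hard open question noted above."""
    if shannon_entropy(outcome_a) >= shannon_entropy(outcome_b):
        return outcome_a
    return outcome_b

# A monoculture vs. a diverse ecosystem, crudely encoded as species lists:
monoculture = ["wheat"] * 8
ecosystem = ["wheat", "oak", "fox", "moss", "ant", "owl", "fern", "bee"]
print(prefer(monoculture, ecosystem))  # the diverse outcome has higher entropy
```

Note that a static entropy measure like this still inherits the closed-system objection; a measure of computation performed would need a different encoding entirely.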
Or more succinctly and broadly, learn to:
pay attention
correct bias
anticipate bias
estimate well
With any single, specific enumeration of means to accomplish these competencies, you risk ignoring other possible curricula, and you encourage the same blind spots across the entire community of aspiring rationalists so educated.