timtyler comments on Journal of Consciousness Studies issue on the Singularity - Less Wrong Discussion
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Does this make sense? How much does the scan cost? How long does it take? What are the costs and risks of the treatment? Essentially, are the facts as you state them?
I don't think so. Are you thinking of utilitarianism? If so, expected utility maximization != utilitarianism.
OK, what's the difference here? By "utilitarianism", do you mean the old straw-man version of utilitarianism with a bad utility function and no ethical injunctions?
I usually take utilitarianism to be consequentialism + max(E(U)) + sane human-value metaethics. Am I confused?
The term "utilitarianism" refers to maximising the combined happiness of all people. The page says:
So: that's a particular class of utility functions.
"Expected utility maximization" is a more general framework from decision theory. You can use any utility function with it - and you can use it to model practically any agent.
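The distinction can be made concrete with a toy sketch. This is a hypothetical illustration, not anything from the thread: `expected_utility` and `best_action` implement the general decision-theoretic framework, and the two utility functions below (a utilitarian one summing everyone's happiness, and an egoist one counting only the agent's own) are interchangeable inputs to it. All action names, outcomes, and probabilities are made up for the example.

```python
def expected_utility(action, outcomes, utility):
    """E[U | action] = sum over outcomes of P(outcome | action) * U(outcome)."""
    return sum(p * utility(o) for o, p in outcomes(action))

def best_action(actions, outcomes, utility):
    """Expected utility maximization: pick the action with the highest E[U]."""
    return max(actions, key=lambda a: expected_utility(a, outcomes, utility))

# Hypothetical toy world: outcomes are (my_happiness, others_happiness) pairs.
def outcomes(action):
    if action == "selfish":
        return [((10, 0), 1.0)]            # certain outcome
    else:  # "share"
        return [((4, 8), 0.9), ((2, 2), 0.1)]

# Utilitarian utility function: total happiness of all people.
utilitarian = lambda o: o[0] + o[1]
# Egoist utility function: only my own happiness. Same framework, different U.
egoist = lambda o: o[0]

print(best_action(["selfish", "share"], outcomes, utilitarian))  # share
print(best_action(["selfish", "share"], outcomes, egoist))       # selfish
```

The framework is indifferent to which utility function you plug in; "utilitarianism" is the claim that the right one is something like the `utilitarian` function above, aggregated over everyone.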
Utilitarianism is a pretty nutty personal moral philosophy, IMO. It is certainly very unnatural - due partly to its selflessness and lack of nepotism. It may have some merits as a political philosophy (but even then...).
Thanks.
Is there a name for expected utility maximisation over a consequentialist utility function built from human value? Does "consequentialism" usually imply normal human value, or is it usually a general term?
See http://en.wikipedia.org/wiki/Consequentialism for your last question (it's a general term).
The answer to your "Is there a name..." question is "no" - AFAIK.
I get the impression that most people around here approach morality from that perspective; it seems like something that ought to have a name.