Manfred comments on What would you do with infinite willpower? - Less Wrong

9 points | Post author: D_Malik 03 June 2011 12:22PM


Comment author: Manfred 03 June 2011 12:39:32PM 4 points

Why have infinite willpower if not to use it to satisfy preferences? Smell the roses, play games, chat with friends. The only reason not to do this on any large scale is if there were something you had to do now that could have huge returns later. Code a self-improving AI, discover immortality, that sort of thing. However, even with infinite willpower I don't think everyone is cut out for that, so for most people I'd say: make enough money to hit diminishing returns on investing it in research, invest it in research, and live the good life.

Comment author: D_Malik 03 June 2011 01:09:14PM 8 points

Smell the roses

While you smell the roses, 100 people die horrible painful deaths. What would you like to do next?

My point is that, until the Singularity (if even then), other people's suffering will outweigh rose-smelling by people who can control their emotions at will anyway (and by other people), even if they have invested in research down to zero marginal utility.

Comment author: prase 03 June 2011 05:43:19PM 1 point

So any pleasure is considered a luxury until death is eradicated, and finite willpower is the only excuse that can justify not concentrating all one's efforts on fighting death?

Imagine that aging were curable and you were essentially immortal - except that there was an annual chance of 10^(-8) that you would die a painful death in some accident. Would you forgo all trivial pleasures if that spared you the risk? In such a world, with a present population on the order of 10^10, 100 people would still die painful deaths each year.
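The arithmetic behind that figure, as a quick check (assuming the annual risk is independent across people, so expected deaths ≈ population × per-person risk):

```latex
\underbrace{10^{10}}_{\text{population}} \times \underbrace{10^{-8}}_{\text{annual risk per person}} = 10^{2} = 100 \text{ expected painful deaths per year}
```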

Comment author: D_Malik 03 June 2011 06:25:35PM 1 point

I would try to figure out how painful the deaths are and how trivial the pleasures are.

I don't think this world is an edge case yet, though - decreasing pleasure now still seems like it would increase long-term expected pleasure, e.g. by working more and using the money to make FAI more likely, or by giving it to any effective charity.

Comment author: Manfred 03 June 2011 08:41:49PM 1 point

Just because people are dying doesn't mean you shouldn't do a cost/benefit calculation. That sounds terrible - until you notice that, I'd bet, you haven't donated everything you can to charity either. Now it just sounds human to me.

Comment author: steven0461 03 June 2011 09:07:45PM 1 point

"Enough to hit diminishing returns" doesn't mean anything until you specify how strongly diminishing.

Comment author: Duke 04 June 2011 04:50:42PM 0 points

Redacted because I misread Manfred's comment the first time.

Comment author: Alicorn 04 June 2011 05:22:30PM 0 points

After a while of doing that, you will no longer need willpower to prevent you from chatting with friends.