Raemon comments on Abandoning Cached Selves to Re-Write My Source Code Partially, I've Become Unstable

Post author: diegocaleiro 10 October 2012 05:47PM 6 points


Comment author: Raemon 10 October 2012 06:18:00PM 7 points

A year ago, I thought honestly about my long-term and short-term goals. I concluded that "utilitarian concern for global human flourishing" was something like maybe 5-10% of my personal utility function. The rest is a combination of desire for personal happiness and artistic development. "Personal happiness" includes feeling like a good person, which I track separately from actually being a good person according to utilitarian ethics. (It's a lot easier to feel like a good person than to be one, even when I know that feeling is all it is.)

I think utilitarian types need to be honest about what their values are. It lets you make more informed choices about which courses of action are sustainable in the long term. Then you can decide either to craft a long-term plan you can manage (in which you continuously do reasonably good things for the world), or to figure out how to spend your life going on various "binges" (i.e. spend a few months working full-time on an Effective Altruist project, then spend a few months vagabonding, or what have you).

One of my primary goals is to grow the Effective Altruist community in a responsible manner, and one thing I've been wondering is: can and should EA people be created, or do you need to find people who naturally gravitate towards EA goals and just help them sort out their priorities?

I was once a non-EA person, and I became an EA person over the course of seven years. So it's clearly possible for people who don't currently identify as world-savers to change their values, or at least to think that they're changing their values. But trying to manage even that 5-10% of my utility function has been stressful for me, and I'm not sure I'd have wanted someone to turn me into my present self without my consent.

This post seems like a pretty important data point. Though I'm not sure exactly how I should be updating.

Regardless, Diego - you seem like you've already made up your mind, but I do support you going off and vagabonding for a while (I plan to do the same myself sometime in the future). I hope that afterwards you find some better long-term solutions that preserve all of your values.

Comment author: drethelin 10 October 2012 07:29:48PM 5 points

I strongly approve of this. I think you can get way more value from a larger number of people who stick with being good to a smaller extent than from a tiny number who burn out.

Comment author: NancyLebovitz 10 October 2012 08:11:31PM 4 points

Especially considering that other people might notice the burnouts and decide not to try being good.

Comment author: Giles 17 October 2012 10:27:09PM 0 points

I think utilitarian types need to be honest about what their values are.

Can you dissolve/unpack this a little?

One related thing I would have benefited from in the past (and possibly still would) is being more honest about how difficult it would be to put my values into practice.

Comment author: Raemon 17 October 2012 10:54:51PM -1 points

"Values" is a tricky word (so tricky, in fact, that I think it would be reasonable to say that "values" aren't actually a real thing in the first place). I'm using it approximately to mean "things that you care about."

I want humanity to flourish and unnecessary suffering to end. But there's a limit to my caring energy, and I have to divide it between "universal utilitarian good" and "things I personally want for myself." Right now, UUG gets about 5-10% of my caring energy.

I would take a pill that increased UUG's share to 15-20% of my caring energy. (And in real life, this takes the form of investing myself in altruist communities, which reinforces my self-image as someone who does good things.) But I honestly have no interest in becoming 100% altruist.

Part of me wants to be able to say "I'd take a pill that makes me 100% altruist, so that I'd only feel motivation to do the bare minimum of selfish things needed to survive, and would otherwise direct my energy to whatever accomplishes the most good." It's a nice thing to believe about myself. But it's not true. (If I became 5% more altruist, I might want to become an additional 5% more altruist, and maybe the cycle would repeat. I'm not sure. But if I got to take exactly one pill, with no more pills after, I don't think I'd choose to become more than 50% altruist.)

One related thing I would have benefited from in the past (and possibly still would) is being more honest about how difficult it would be to put my values into practice.

That is also important, and slightly different.

Comment author: Giles 29 October 2012 11:00:06PM 0 points

Thanks - your comment implied a concrete example of what I was after: someone who thinks that they would take the 100% altruism pill when in fact they wouldn't, isn't being honest about their values. I found this helpful.

I think it would be reasonable to say that "values" aren't actually a real thing in the first place. I'm using it approximately to mean "things that you care about."

I'd hazard a guess that calling it something different doesn't make it any realer, but we don't need to get into that right now ;-)

EDIT: what I meant by "concrete" in this case was "without reifying values/preferences/caring etc."

Comment author: diegocaleiro 20 October 2012 02:42:51AM 0 points

That is how I feel. I chose to test, for two months, how it feels to be the maximum-percent altruist; I got to nearly 70%. But after the two-month precommitment ended, the whole story of this post started to emerge.

Now 0% and 80% sound emotionally alike, because I dug too deep.