David_Gerard comments on Q: What has Rationality Done for You? - Less Wrong

11 points. Post author: atucker, 02 April 2011 04:13AM


Comment author: XiXiDu 02 April 2011 09:54:35AM, 10 points

Most of all it just made me sad and depressed, the whole "expected utility" thing being the worst part. If you take it seriously, you'll forever procrastinate having fun, because you can always imagine that postponing some terminal goal and doing something instrumental instead will yield even more utility in the future. So if you enjoy mountain climbing, you'll postpone it until it is safer, or until after the Singularity, when you can have much safer mountain climbing. And then after the Singularity you won't be able to do it, because the resources of a galactic civilization are better used to fight hostile aliens and afterwards to fix the heat death of the universe.

There's always more expected utility in fixing problems; it is always about expected utility, never about gathering or experiencing utility. And if you don't believe in risks from AI, then there is some other existential risk, and if there is no risk, then it is poverty in Obscureistan. And if there is nothing at all, then you should try to update your estimates, because if you're wrong you'll lose more than you would by trying to figure out whether you're wrong. You never hit diminishing returns. And in the end all your complex values are replaced by the tools and heuristics that were originally meant to help you achieve them. It's like having to become one of those people who work all their lives saving money for retirement, and by the time they are old they have lost most of their interests.

Comment author: David_Gerard 02 April 2011 10:32:09AM, 8 points

(Who voted down this sincere expression of personal feeling? Tch.)

This is why remembering to have fun along the way is important. Remember: you are an ape. The Straw Vulcan is a lie. The unlived life is not so worth examining. Remember to be human.

Comment author: XiXiDu 02 April 2011 03:03:16PM, 12 points

This is why remembering to have fun along the way is important.

I know that argument, but I can't get hold of it. What can I do, play a game? I'll have to examine everything in terms of expected utility. If I want to play a game, I'll have to remind myself that I really want to solve friendly AI, and therefore have to regard "playing a game" as an instrumental goal rather than a terminal goal. And in that sense, can I justify playing a game? You don't die if you are unhappy; I could just work overtime as a street builder to earn even more money and donate it to the SIAI. There is no excuse to play a game, because being unhappy for a few decades cannot outweigh the expected utility of a positive Singularity, and being unhappy doesn't reduce your efficiency as much as playing games and going to movies does. There is simply no excuse to have fun. And that will be the same after the Singularity too.

Comment author: ciphergoth 03 April 2011 08:02:54AM, 12 points

As a result of this thinking, are you devoting every moment of your time and every joule of your energy to avoiding a negative Singularity?

No?

No, me neither. If I were to reason this way, the inevitable result for me would be that I couldn't bear to think about it at all, and I'd live my whole life neither happily nor productively; I suspect the same is true for you. The risk of burning out and forgetting about the whole thing is high, and that doesn't maximize utility either. You will be able to bring about bigger changes much more effectively if you look after yourself. So, sure, it's worth wondering whether you can do more to bring about a good outcome for humanity, but don't make gigantic changes that could lead to burnout. Start from where you are, and step things up as you are able.

Comment author: Mycroft65536 05 April 2011 03:54:03AM, 8 points

Let's say the Singularity is likely to happen in 2045, as Kurzweil says, and you want to maximize the chances that it's positive. The idea is that you should get to work making as much money as you can to donate to SIAI, or that you should start researching friendly AGI (depending on your talents). What you do tomorrow doesn't matter; what matters is your average output over the next 35 years.

This is important because a strategy where you have an emotional breakdown in 2020 fails. If you get so miserable that you kill yourself, you've failed at your goal. You need to make sure that this fallible agent, XiXiDu, stays at a very high level of productivity for the next 35 years. That almost never happens if you're not fulfilling the needs your monkey brain demands.

Immediate gratification isn't a terminal goal, as you've figured out, but it does work as an instrumental goal on the path to a greater goal.
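The argument above is an expected-value claim, and it can be made concrete with a toy model. The sketch below compares two strategies over a 35-year horizon; all the specific numbers (effort levels, burnout probabilities) are made up purely for illustration, not drawn from the thread:

```python
# Toy model of Mycroft65536's point: what matters is average output
# over 35 years, not output tomorrow. A strategy with higher yearly
# output but a higher chance of permanent burnout can have lower
# expected total output. All numbers are hypothetical.

YEARS = 35

def expected_output(effort_per_year, burnout_risk_per_year):
    """Expected total output over YEARS, if each year you produce
    `effort_per_year` units but permanently burn out with probability
    `burnout_risk_per_year` at the end of each year."""
    total = 0.0
    survival = 1.0  # probability you haven't burned out yet
    for _ in range(YEARS):
        total += survival * effort_per_year
        survival *= 1.0 - burnout_risk_per_year
    return total

# Maximal effort: 1.0 units/year, 15% yearly chance of breakdown.
# Sustainable effort: 0.6 units/year, 1% yearly chance of breakdown.
maximal = expected_output(1.0, 0.15)
sustainable = expected_output(0.6, 0.01)
# The sustainable strategy comes out well ahead in expectation,
# even though it looks "lazier" in any single year.
```

Under these made-up numbers the sustainable pace roughly doubles expected lifetime output, which is the quantitative shape of the "don't burn out" advice.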

Comment author: MatthewBaker 11 July 2011 07:23:21PM, 0 points

Ditto

Comment author: David_Gerard 02 April 2011 07:06:36PM, 16 points

It's important because it counts as basic mental maintenance, just as eating reasonably and exercising a bit count as basic bodily maintenance. You cannot achieve any goal without basic self-care.

For the problem of solving friendly AI in particular: the current leader in the field has noticed that his work suffers if he doesn't allow himself play time. You are allowed play time.

You are not a moral failure for not personally achieving an arbitrary degree of moral perfection.

You sound depressed, which would mean your hardware is even more corrupt and biased than usual. This won't help achieve a positive Singularity either. Driving yourself crazier with guilt at not being able to work for a positive Singularity won't help your effectiveness, so you need to stop doing that.

You are allowed to rest and play. You need to let yourself rest. Take a deep breath! Sleep! Go on holiday! Talk to friends you trust! See your doctor! Please do something. You sound like you are dashing your mind to pieces against the rock of the profoundly difficult, and you are not under any obligation to do such a thing, to punish yourself so.

Comment author: Gray 03 April 2011 05:17:59AM, 5 points

One thing I've come up with when thinking about personal budgeting, of all things, is the concept of granularity. For someone who is poor, the situation is analogous to yours. The dad of the household, let's say, might be having a similar attack of conscience as you are over whether he should buy a candy bar at the gas station when there are bills that can't be paid.

But it turns out that a small enough purchase, such as a really cheap candy bar (for the sake of argument), doesn't actually make any difference. No bill is going to go from unpaid to paid because the candy bar went unbought.

So relax. Buy a candy bar every once in a while. It won't make a difference.

Comment author: atucker 03 April 2011 04:13:52AM, 3 points

I took too long to link to this.

Comment author: Eliezer_Yudkowsky 03 April 2011 12:16:21AM, 3 points

I don't tell people this very often. In fact, I'm not sure I can recall ever telling anyone this before, though I wouldn't necessarily remember it. But yes, in this case and in these exact circumstances, you need to get laid.

Comment author: NancyLebovitz 03 April 2011 11:13:09AM, 8 points

Could you expand on why offering this advice makes sense to you in this situation, when it hasn't otherwise?