Unnamed comments on Rationality quotes: October 2010 - Less Wrong

4 Post author: Morendil 05 October 2010 11:38AM


Comment author: Unnamed 05 October 2010 08:48:57PM 13 points [-]

God, grant me the serenity
To accept the things I cannot change;
Courage to change the things I can;
And wisdom to know the difference.

-- adapted from Reinhold Niebuhr

Is this a piece of traditional deep wisdom that's actually wise?

Comment author: Eliezer_Yudkowsky 07 October 2010 10:33:19AM 12 points [-]

God grant me the strength to change the things I can,

The intelligence to know what I can change,

And the rationality to realize that God isn't the key figure here.

Comment author: xamdam 07 October 2010 06:47:57PM 0 points [-]

Cute, but you just undermined "strength" :)

Comment author: Unnamed 06 October 2010 02:00:55AM *  9 points [-]

What I like about the serenity prayer (at least the way I interpret it) is that it puts the priority on changing things; serenity is just a second-best option for things that are unchangeable.

In that respect, it's like a transhumanist slogan. With something like life extension, I want to point to the serenity prayer and say: we can change this, which means we need the courage to change it. Death at the end of the current lifespan isn't something we should serenely accept, because we can change it. The serenity prayer calls for courage and action to follow through and make those changes.

Part of the difficulty is that the wisdom to know the difference also requires the wisdom to change your mind. Once people accept that something cannot be changed, then their serenity-producing mechanisms prevent them from reconsidering the evidence and recognizing that maybe it really can (and should) be changed.

If I were going to alter the serenity prayer, that's one thing I'd add. In Alicorn's version, that means the strength as a rationalist to distinguish what I can and cannot change, and to update those categorizations as new evidence arises.

Comment author: Mass_Driver 06 October 2010 04:12:53AM 12 points [-]

Friends, help me build the serenity to accept the things I cannot change; the courage to change the things I can; and the wisdom to continually update which is which based on the best available evidence.

Comment author: Alicorn 05 October 2010 09:46:49PM 17 points [-]

I think the local version would be something like, "May my strength as a rationalist give me the ability to discern what I can and cannot change, and the determination to make a desperate effort at the latter when remaining uncertainty allows that this has the highest expected utility."

Comment author: wedrifid 08 October 2010 12:07:02AM *  1 point [-]

(Where leaving out or replacing 'strength as a rationalist' makes the quote a whole lot more appealing to me if nobody else. Heck, even the jargon term 'luminosity' would feel better.)

Comment author: [deleted] 07 October 2010 04:07:40AM *  1 point [-]

That was beautiful. :)

Comment author: arch1 06 October 2010 09:15:06PM 4 points [-]

Er, how about the wisdom to know whether a thing should be changed in the first place?

Comment author: wedrifid 08 October 2010 12:08:08AM 2 points [-]

A good point... although I would remove the 'should' and instead emphasise the coherence and self-awareness to know which things I want.

Comment author: James_K 06 October 2010 05:40:04AM 2 points [-]

I think it genuinely wise; it contains three related, important concepts: 1) You should try to make the world a better place, 2) You shouldn't waste your effort in attempting 1 in situations when you will almost certainly fail, 3) in order to succeed at 1 & 2 you need to be able to understand the world around you; a desire to effect change isn't enough.

The only thing that's missing from it is something about having the insight to distinguish good changes from bad ones.

Comment author: wedrifid 08 October 2010 12:01:36AM 3 points [-]

You shouldn't waste your effort in attempting 1 in situations when you will almost certainly fail,

Not quite. You want to consider the expected value of the attempt, not the raw probability of success. A 0.1% chance of curing cancer or 'old age' is to be preferred over an 80% chance of winning the X-Factor (particularly given that the latter applies to yourself).
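The comparison above is just expected-value arithmetic. A minimal sketch, with entirely made-up utility numbers chosen only to illustrate the point:

```python
# Illustrative expected-value comparison. The probabilities follow the
# comment above; the utility figures are arbitrary placeholders, not
# claims about the actual value of either outcome.
def expected_value(p_success: float, utility: float) -> float:
    """Expected utility of an attempt: chance of success times its payoff."""
    return p_success * utility

# A 0.1% shot at an enormous payoff (curing cancer or 'old age',
# utility ~10^9 in arbitrary units)...
long_shot = expected_value(0.001, 1_000_000_000)

# ...versus an 80% shot at a modest, mostly-personal payoff
# (winning the X-Factor, utility ~10^3 in the same units).
safe_bet = expected_value(0.80, 1_000)

# The long shot dominates despite its tiny raw probability of success.
assert long_shot > safe_bet
```

The raw probabilities point the other way (0.1% vs 80%); only multiplying by the stakes reveals which attempt is worth the effort.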

It would definitely be foolish to waste effort attempting something that will certainly fail.

Comment author: James_K 08 October 2010 04:25:29AM 0 points [-]

I agree with your qualifications; I was oversimplifying. And the reason I didn't say "certainly fail" is that I try to avoid using the word "certain" unless I'm dealing with purely logical systems.

Comment author: wedrifid 08 October 2010 04:38:55AM 1 point [-]

And the reason I didn't say "certainly fail" is that I try to avoid using the word "certain" unless I'm dealing with purely logical systems.

A worthy goal. Usually that will prevent you from making claims that are technically wrong despite being inspired by good thinking. This seems to be a rare case where defaulting to not using an absolute introduces the technical problem.

Comment author: James_K 08 October 2010 10:46:26AM 0 points [-]

Just an indication that one should avoid absolutes: even an absolute directive to avoid absolutes ;)

Comment author: soreff 09 October 2010 05:22:10PM *  1 point [-]

I don't think that

1) You should try to make the world a better place

is actually implied by the original wording. Clippy could also view

God, grant me the serenity
To accept the things I cannot change;
Courage to change the things I can;
And wisdom to know the difference.

as wise, though in vis case, "the things I cannot change" would be closer to "the resources I am unable to apply to paperclips". One can't expect too much specificity from a 25-word quote... I'm taking your point

The only thing that's missing from it is something about having the insight to distinguish good changes from bad ones.

(which I agree with) as meaning that one should have the insight to distinguish instrumental subgoals that actually will advance one's ultimate goals from subgoals that don't accomplish this. (This is separate from differences in ultimate goals.)

Comment author: James_K 09 October 2010 08:42:38PM 1 point [-]

That all sounds right to me.

Comment author: Scott78704 06 October 2010 03:13:12PM 0 points [-]

Except for the "God grant me" part, yeah.

Comment author: xamdam 07 October 2010 06:46:36PM -2 points [-]

I think Mike Vassar said something like "you should not have preferences over the current states of the world, only over your emotional dispositions". It's a second-hand quote, but seems like a good way of putting it.

Comment author: Document 10 July 2011 11:14:14PM 0 points [-]

Are you sure you don't have his comment backwards?

Comment author: xamdam 07 October 2010 08:58:47PM -1 points [-]

I didn't expect much karma for this, but WTF with the downvote?

Comment author: gwern 07 October 2010 11:11:32PM 2 points [-]

Because the quote seems to be endorsing wireheading, which is pretty universally condemned here, and seems of little relevance anyway.

Comment author: xamdam 08 October 2010 12:49:06AM 2 points [-]

As far as the false suspicion of wireheading, I am not sure about the attitudes here, but isn't it just a value? I mean I don't think I am interested in wireheading, but if someone truly thinks it's for them, why would we condemn? I thought the forum is about being rational, not about a specific set of values.

Comment author: wedrifid 08 October 2010 12:58:39AM *  4 points [-]

As far as the false suspicion of wireheading, I am not sure about the attitudes here, but isn't it just a value? I mean I don't think I am interested in wireheading, but if someone truly thinks it's for them, why would we condemn? I thought the forum is about being rational, not about a specific set of values.

Your point is valid.

Where it does make sense to call another's choice to wirehead a mistake (rather than just a difference in values) is when that person thinks that wireheading is what they want but they are actually mistaken about their own values or how to achieve them.

It is a little counterintuitive but even though values are entirely subjective people are actually not the absolute authority on what their subjective preferences are. Subjective preferences are objective facts in as much as they are represented by the physical state of the universe (particularly that part of the universe that is the person's head). People's beliefs about that part of the universe and the implications thereof can be (and often are) wrong. This particularly applies to abstract concepts: we aren't very good at wiring up our abstract beliefs with the rest of our desires.

Comment author: xamdam 08 October 2010 01:32:02AM 0 points [-]

It is a little counterintuitive but even though values are entirely subjective people are actually not the absolute authority on what their subjective preferences are.

Absolutely. In a way we owe this understanding to Freud: he popularized the notion that people do not know what they are really pursuing. Of course, he thought they were pursuing sex with their mother...

Comment author: wedrifid 08 October 2010 02:06:37AM 1 point [-]

Absolutely. In a way we owe this understanding to Freud

Could we instead say "this understanding is predated by Freud's popularized notion..."? There is no debt if the concept was arrived at independently, and this is a general philosophical point not limited specifically to humans, while Freud's is proto-psychology.

Comment author: wedrifid 07 October 2010 11:50:32PM *  2 points [-]

+(-1)

Did Vassar really say something like that? I didn't think he was, well, silly.

Comment author: gwern 07 October 2010 11:59:07PM 0 points [-]

I didn't either; fortunately, no source has been presented, so I don't need to believe he said that and can postulate that he actually said the opposite or was engaged in criticizing such a position.

Comment author: LucasSloan 08 October 2010 12:37:03AM 5 points [-]

I can confirm he said something like it. However, what he meant by it was that our emotions should be keyed to how we act, not to how the universe is. We should be rewarded for acting to produce the best outcome possible. We don't control what the universe is, just our actions, so we shouldn't be made to feel bad (or good) because of something we couldn't control.

For example, if we imagine a situation where 10 people were going to die, but you managed to save 5 of them, your emotional state shouldn't be sad, because your emotions should reward the fact that you saved 5 people. Equivalently, you shouldn't really be all that happy that a thousand people get something that makes them really happy when your actions reduced the number of people who received whatever it is by 500. Just because the people are better off, you shouldn't be emotionally rewarded, because you reduced the number who would be happy.

If the best you can make the universe is horrible, you shouldn't be depressed about it, because being depressed only adds to the disutility in the universe and doesn't incentivize acting to bring the best situation about. Conversely, if the worst you can do is pretty damn good, you shouldn't be happy about it, because you shouldn't incentivize leaving utility on the table. Basically, it's an endorsement of virtue ethics for human-type minds.

Comment author: xamdam 08 October 2010 12:47:45AM 2 points [-]

Thanks, that is a deeper understanding than I got from it second-hand (though I did not think it meant wireheading). I understood it as a warning against having and reacting to a false sense of control, which I often see: "accepting that there are (many) things you cannot change".

Comment author: wedrifid 08 October 2010 12:51:18AM *  1 point [-]

Equivalently, you shouldn't really be all that happy that a thousand people get something that makes them really happy when your actions reduced the number of people who received whatever it is by 500.

I've got no problem with being happy that a thousand people get a bunch of utility (assuming they are people for whom I have altruistic interest). I would not be glad about the fact that I somehow screwed up (or was unlucky) and prevented even more altruistic goodies, but I could be glad (happy) that some action of mine or external cause resulted in the boon for the 1,000.

I have neither the need nor desire to rewire my emotions such that I could unload a can of Skinner on my ass.