Comment author: moridinamael 09 September 2014 08:46:40PM 32 points

I have found that the more I use my simulation of HPMOR!Quirrell for advice, the harder it is to shut him up. As with any mental discipline, thinking in particular modes wears thought-grooves into your brain's hardware, and before you know it you've performed an irreversible self-modification. Consequently, I would definitely recommend that anybody attempting to supplant their own personality (for lack of a better phrasing) with a model of some idealized reasoner try to make sure that the idealized reasoner shares your values as thoroughly as possible.

Comment author: BrienneYudkowsky 17 September 2014 01:26:54PM 6 points

This whole comment thread is utterly delightful.

Comment author: xnn 09 September 2014 09:11:48AM 10 points

I wish this were posted in main.

Comment author: BrienneYudkowsky 09 September 2014 09:51:08PM 13 points

Wish granted!

Comment author: lukeprog 08 September 2014 07:43:05PM *  29 points

What I want to know is: why didn't Anna tell me about her technique for managing decision fatigue earlier?

Comment author: BrienneYudkowsky 08 September 2014 09:44:43PM 14 points

Please do report back on whether it helps you!

Comment author: BrienneYudkowsky 22 August 2014 06:04:51PM 2 points

This is one of the most valuable things I've read in months. Thank you!

Comment author: So8res 09 May 2014 01:12:23AM 10 points

Keep in mind here that I'm steelmanning someone else's argument, perhaps improperly. I don't want to put words in anyone else's mouth. That said, I used the term 'purity' in loose analogy to a 'pure' programming language, wherein one exception is sufficient to remove much of the possible gains.

Continuing the steelmanning, however, I'd say that while no human can achieve epistemic perfection, there's a large class of epistemic failures that you only recognize if you're striving for perfection. Striving for purity, not purity itself, is what gets you the gains.
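The programming side of the analogy can be made concrete with a small sketch (my own illustration, not from the thread): many of purity's gains, such as safe caching of results, hold only if *every* function involved is pure, so a single impure exception silently voids them.

```python
# Sketch: a memoization cache is sound only under the assumption that
# the cached function is pure (same arguments -> same result, always).
from functools import wraps

def memoize(fn):
    """Cache results, assuming fn is pure."""
    cache = {}
    @wraps(fn)
    def wrapper(*args):
        if args not in cache:
            cache[args] = fn(*args)
        return cache[args]
    return wrapper

@memoize
def square(x):
    # Pure: caching is always safe here.
    return x * x

rate = 2  # hidden mutable state

@memoize
def convert(amount):
    # Impure: the result depends on a global, not just the argument.
    return amount * rate

first = convert(100)   # computed with rate == 2, then cached
rate = 3               # the world changes...
second = convert(100)  # cache returns the stale result: silently wrong
```

One impure function among many pure ones is enough to make the whole cached system unreliable, which is the sense in which "one exception is sufficient to remove much of the possible gains."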

Comment author: BrienneYudkowsky 09 May 2014 10:25:49PM 3 points

I can already predict, though, that much of my response will include material from here and here.

Comment author: brazil84 08 May 2014 09:07:40AM 9 points

Ultimately, I think beliefs are inputs for predictions

As Robin Hanson has pointed out, beliefs are also a way of showing something about oneself. Tribal membership, moral superiority, etc. A good Cimmerian believes in Crom, the grim gloomy unforgiving god.

Often, when we attempt to accept contradictory statements as correct, it causes cognitive dissonance--that nagging, itchy feeling in your brain that won't leave you alone until you admit that something is wrong.

My impression is that most people never admit that their beliefs are contradictory; instead, they either lash out at whoever is bringing the contradictions to the forefront of their mind or start ignoring him.

But I was wrong. And that mattered. Having accurate beliefs is a ridiculously convergent incentive. Every utility function that involves interaction with the territory--interaction of just about any kind!--benefits from a sound map.

Can you give three examples of improvements in your life since your epiphany?

Comment author: BrienneYudkowsky 09 May 2014 02:40:15AM 2 points

Can you give three examples of improvements in your life since your epiphany?

Sure!

1) My conversations with friends are more efficient and illuminating.
2) I learn more quickly from mistakes.
3) I prevent more mistakes before they get the chance to happen.

If I hadn't given those examples, could you have predicted positive changes resulting from having generally more accurate beliefs? It really doesn't seem that surprising to me that someone's life would improve in a zillion different ways if they weren't wrong so much.

Comment author: So8res 09 May 2014 01:12:23AM 10 points

Keep in mind here that I'm steelmanning someone else's argument, perhaps improperly. I don't want to put words in anyone else's mouth. That said, I used the term 'purity' in loose analogy to a 'pure' programming language, wherein one exception is sufficient to remove much of the possible gains.

Continuing the steelmanning, however, I'd say that while no human can achieve epistemic perfection, there's a large class of epistemic failures that you only recognize if you're striving for perfection. Striving for purity, not purity itself, is what gets you the gains.

Comment author: BrienneYudkowsky 09 May 2014 02:34:10AM 9 points

So8res, you're completely accurate in your interpretation of my argument. I'm going to read some more of your previous posts before responding much to your first comment here.

Comment author: [deleted] 08 May 2014 04:00:11PM 12 points

You'd better remove Scott's real last name from your post before search engines index it, because he doesn't want it to be easy to find his blog given his full name.

In response to comment by [deleted] on A Dialogue On Doublethink
Comment author: BrienneYudkowsky 08 May 2014 06:15:41PM 11 points

done. sorry, didn't know.

Comment author: ChrisHallquist 06 May 2014 09:31:32PM 0 points

...my motivation has been "I see people around me succeeding by these means where I have failed, and I want to be like them".

Seems like noticing yourself wanting to imitate successful people around you should be an occasion for self-scrutiny. Do you really have good reasons to think the things you're imitating them on are the cause of their success? Are the people you're imitating more successful than other people who don't do those things, but who you don't interact with as much? Or is this more about wanting to affiliate with the high-status people you happen to be in close proximity to?

Comment author: BrienneYudkowsky 06 May 2014 11:40:29PM *  7 points

It is indeed a cue to look for motivated reasoning. I am not neglecting to do that. I have scrutinized extensively. It is possible to be motivated by very simple emotions while constraining the actions you take to the set endorsed by deliberative reasoning.

The observation that something fits the status-seeking patterns you've cached is not strong evidence that nothing else is going on. If you can write off everything anybody does by saying "status" and "signaling" without making predictions about their future behavior--or even looking into their past behavior to see whether they usually fit the patterns--then you're trapped in a paradigm that's only good for protecting your current set of beliefs.

Yes, I do have good reasons to think the things I'm imitating are causes of their success. Yes, they're more successful on average than people who don't do the things, and indeed I think they're probably more successful with respect to my values than literally everybody who doesn't do the things. And I don't "happen" to be in close proximity to them; I sought them out and became close to them specifically so I could learn from them more efficiently.

I am annoyed by vague, fully general criticisms that don't engage meaningfully with any of my arguments or musings, let alone steelman them.

Comment author: BrienneYudkowsky 05 May 2014 04:03:47AM *  13 points

I was not signaling. Making it a footnote instead of just editing it outright was signaling. Revering truth, and stating that I do so, was not.

Now that I've introspected some more, I notice that my inclination to prioritize the accuracy of information I attend to above its competing features comes from the slow accumulation of evidence that excellent practical epistemology is the strongest possible foundation for instrumental success. To be perfectly honest, deep down, my motivation has been "I see people around me succeeding by these means where I have failed, and I want to be like them".

I have long been more viscerally motivated by things that are interesting or beautiful than by things that correspond to the territory. So it's not too surprising that toward the beginning of my rationality training, I went through a long period of being so enamored with a-veridical instrumental techniques that I double-thought myself into believing accuracy was not so great.

But I was wrong, you see. Having accurate beliefs is a ridiculously convergent incentive, so whatever my goal structure, it was only a matter of time before I'd recognize that. Every utility function that involves interaction with the territory--interaction of just about any kind!--benefits from a sound map. Even if "beauty" is a terminal value, "being viscerally motivated to increase your ability to make predictions that lead to greater beauty" increases your odds of success.

Recognizing only abstractly that map-territory correspondence is useful does not produce the same results. Cultivating a deep dedication to ensuring every motion precisely engages reality with unfailing authenticity prevents real-world mistakes that noting the utility of information, just sort of in passing, will miss.

For some people, dedication to epistemic rationality may most effectively manifest as excitement or simply diligence. For me, it is reverence. Reverence works in my psychology better than anything else. So I revere the truth. Not for the sake of the people watching me do so, but for the sake of accomplishing whatever it is I happen to want to accomplish.

"Being truth-seeking" does not mean "wanting to know ALL THE THINGS". It means exhibiting patterns of thought and behavior that consistently increase calibration. I daresay that is, in fact, necessary for being well-calibrated.
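Calibration in this sense is a measurable quantity. As one illustration (the choice of metric is mine, not the comment's), the Brier score rewards forecasters whose stated confidence matches their long-run frequency of being right:

```python
# Sketch: scoring calibration with the Brier score (lower is better).
# A forecaster who says "90%" but is right only half the time scores
# worse than one who says "50%" and is right half the time.

def brier_score(forecasts):
    """forecasts: list of (stated_probability, outcome), outcome 0 or 1."""
    return sum((p - o) ** 2 for p, o in forecasts) / len(forecasts)

# Overconfident: claims 0.9 on every question, right only half the time.
overconfident = [(0.9, 1), (0.9, 0), (0.9, 1), (0.9, 0)]

# Calibrated: claims 0.5 and is indeed right half the time.
calibrated = [(0.5, 1), (0.5, 0), (0.5, 1), (0.5, 0)]

assert brier_score(calibrated) < brier_score(overconfident)
```

Patterns of thought that shrink this score over many predictions are exactly what "consistently increase calibration" cashes out to.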
