Comment author: shminux 27 April 2013 06:40:13PM 6 points

This is true, but why privilege rationality? Integrity, kindness and other desirable traits scale the same way and are probably just as important.

Comment author: alfredmacdonald 27 April 2013 10:16:00PM 1 point

Kindness will only affect decisions where altruistic behavior wouldn't occur without it. About integrity I'm even less sure. Rationality could affect any decision involving bias or fuzzy reasoning, which is almost every decision.

The Upward Scaling Importance of Rationality

5 alfredmacdonald 27 April 2013 06:30PM

I've read about a quarter of the Sequences, but I'm not sure whether this topic has been addressed on LessWrong before. If it has, let me know.

The Upward Scaling Importance of Rationality goes like this:

The more influence your thought process and decisions have, the more important it is that you're a rationalist. In the grand scheme of things, it is relatively unimportant that a barback at a restaurant is a rationalist, and I say this having been one. It is extremely important that the leader of a highly influential company, or the president of a university or country, is a rationalist. Their decisions affect thousands if not millions of people.

The more influential you are, the more potential your decisions have to screw over other people. Influence doesn't necessarily come from a management position: elementary school teachers and police officers are highly influential, even though they aren't in control of an organization. Influence can even come by virtue of the people you reach. A famous person with a large fanbase and the parent of a child prodigy both have the capacity to influence the world with their decisions.

Arguably, this extends to anyone who votes.

So rationality scales upward: the more influential someone is, the more important it is that they're a rationalist. Neglecting this can have bad consequences.

Comment author: alfredmacdonald 27 April 2013 06:20:17PM 13 points

You should get a Ph.D. in Philosophy if you consider the material studied in philosophy to be an end in itself. Philosophy is a truth-seeking discipline, so if you find that inherently rewarding and could imagine doing it for a large part of your life, it's a good decision. Don't worry about the wariness toward philosophy here: I can guarantee you that the criticisms leveled against philosophy on this site have been addressed tenfold in actual philosophy departments, by people with sympathies closer to Luke's than you'd think.

That said, a lot of people go into graduate programs for bad reasons. Here are two I've been tempted by:

1.

Minimizing Status Risk. A lot of people think about risk in terms of financial gain or loss, but few think about risk in terms of status, even though it's a real concern for many people. Graduating from college can be intimidating, especially from a prestigious college, because you're about to be stripped of your hierarchical standing among people your age. If you've attended, say, Harvard for four years, you've spent those four years thinking of yourself as at the top of the food chain relative to other college students.

Once you're out of college, this is no longer true, and you're measured by what kind of job you have. It's extremely tempting to avoid this by applying to graduate school, because graduate school lets you maintain the hierarchical standing you've imagined for yourself over the past few years. Eventually you'll get a Ph.D. and be at the top of the intellectual food chain. This has nothing to do with "avoiding the real world", because "the real world" as an employment area is conspicuously centered on office jobs, or whatever the majority of people happen to do for money. (I wonder if farmers consider everyone else to have a "fake" job. Probably.)

It's a way of shielding your status from vulnerability, because working as a clerk or receptionist or barista or server or whatever after college is generally not prestigious and makes you feel like your intellect isn't worth anything. That's an uncomfortable feeling, sure, but make sure you're not eyeing a Ph.D. just to avoid it.

2.

Even if you're not avoiding Status Risk, make sure you're not getting a Ph.D. just to feel like an intellectual hotshot anyway. A lot of people reason about competence in binary terms (expert or non-expert) even though competence obviously exists on a spectrum, so it's tempting to get a title that places you immediately at the "expert" end of any discussion. That way, you can throw your weight around whenever there's a clash of words.

Philosophy especially is enigmatic to a lot of people; it's a mystery what you're actually learning in an advanced program. So a Ph.D. looks like a "certified smart person" badge, and that's tempting. Make sure you're not getting it for that reason either.


Here's the litmus test. Ask yourself: "Would I self-study this material anyway if I had the next three to five years paid for? Would it occupy a large part of my time regardless of what I'm doing?" If so, it's worth it.

Comment author: [deleted] 11 February 2013 03:09:28AM 0 points

The proactive thing to do, naturally, is to try to minimize how many mistakes you make.

To me, there seems to be something kind of off about this sentence. Suppose I'm trying to get better at a game like Starcraft. Starcraft is sufficiently complicated that attaining a basic level of skill (by which I mean "macromanagement": being able to ensure that all your resources are being used somehow, without worrying about using them well) takes hours and hours of practice. And during that practice, you will inevitably make mistakes; the only way to avoid making mistakes is not to practice. Indeed, every mistake teaches you something, so I'm tempted to say that what you want to do is maximize the number of mistakes you make.

In short, it seems to me that minimizing the number of mistakes you make doesn't serve the purpose of making you more skilled. So what purpose does it serve?

In response to comment by [deleted] on The Wrongness Iceberg
Comment author: alfredmacdonald 11 February 2013 09:20:19PM 0 points

Sure, in the very short run (starting from absolutely no knowledge of the game) you'd have to make mistakes to learn anything at all. But getting better is a gradual decrease in the frequency of those mistakes. You'd want to minimize your mistakes as much as possible as you get better, because the frequency of your mistakes is strongly correlated with how much you lose.

I think you're seeing "try to minimize how many mistakes you make" and reading it as "try to make no mistakes." There are certainly mistakes you'll have to make to get better, but there are also superfluous mistakes that some people make while others don't, and catastrophic mistakes that make you look really bad, which you'd definitely want to avoid. The pool of possible mistakes runs much deeper than the necessary ones, in other words.

Comment author: alfredmacdonald 08 February 2013 05:53:46AM 2 points

I really liked this post, and I think a lot of people aren't giving you enough credit. I've felt similarly before -- not to the point of suicide, and I think you might want to find someone you can confide those anxieties to -- but about being angered by someone's dismissal of rationalist methodology. Ultimately, it's the methodology that makes someone a rationalist, not necessarily a set of beliefs. The categorizing of emotions as being in opposition to logic, for example, is something I've been frustrated with for quite some time, because emotions aren't anti-logical so much as alogical. (In my personal life, I'm an archetype of someone who gets emotional about logical issues.)

What I suspect was going on is that you felt this person was being dismissive of the methodology and did not believe reason to be an arbiter of disagreements. That reads to me like saying "I'm not truth-seeking, and I think my gut perception of reality is more important than the truth" -- a stance that sounds both arrogant and immoral. I've run across people like this too, and every time I feel like someone is de-prioritizing the truth in favor of their kneejerk reaction, it's extremely insulting. Perhaps that's what you felt?

Comment author: wwa 05 February 2013 01:12:24PM 3 points

Isn't anxiety the primary problem? The obvious way to make fewer mistakes is to gather data and figure stuff out, but you're not asking about that. You're not afraid of making mistakes. You're afraid of people (including yourself) discovering your alleged incompetence. The obvious solution to that is to fix the anxiety problem. Yes, it might be hard and/or require external help, but you said it yourself:

the anxiety has been catastrophic

In response to comment by wwa on The Wrongness Iceberg
Comment author: alfredmacdonald 08 February 2013 05:41:19AM 1 point

I don't currently work at a restaurant, so at the moment I'm afraid of nothing.

But for the purposes of the example, it's not about discovering mistakes or incompetence -- it's about your level of incompetence being much greater than you previously estimated, for reasons you were unaware of until you were exposed to them.

Comment author: BerryPick6 04 February 2013 09:59:11AM 4 points

What about the iceberg iceberg, where noticing your first iceberg makes you realize there's a metric ton of icebergs under the iceberg?

Or a recursive iceberg, where you realize there's a whole nautical mile's worth of rabbit hole left to go down?

Comment author: alfredmacdonald 04 February 2013 10:42:27AM 3 points

I find that similar to the concept of fractal wrongness. What distinguishes an iceberg from a fractal is that with an iceberg, someone is resisting exposing the whole thing for one reason or another. In the dishonesty scenario, one lie reveals many others, but only because that person has left you a tidbit of information that cracks their facade and allows you to infer just how deeply they've lied to you -- or, in the case of attraction, an event or action that would only occur if they had a much greater level of attraction below the surface.

The Wrongness Iceberg

20 alfredmacdonald 04 February 2013 09:02AM

As soon as I got out of college, I got a job at a restaurant. I had never worked in a restaurant before, but my mom knew the owners, so I felt obligated not to perform badly. Yet inevitably I did perform badly, and how that performance was evaluated greatly affected the way I perceived my mistakes.

If you're entrenched in an organization, there's a good chance you have an idea of what you're supposed to do and what mistakes you will or won't make. But suppose you're in a position like this one: by way of your ignorance, you know you're going to make a lot of mistakes, and it's just a question of when and how many. Further, you know that if you make too many mistakes, you make people you care about look bad. And finally, there are a lot of unknown unknowns: you don't know what possible mistakes and acts of ignorance exist to begin with, so you'll be blind to many of the mistakes you make.

The proactive thing to do, naturally, is to try to minimize how many mistakes you make.

There are two key ways to gauge the depth of a mistake you're told you've made. The first is to take the mistake literally, as if no other mistakes exist and any that did would be pointed out to you. So if you correct this mistake, everything else should be fine. This is how you'd expect to take mistakes if you were, say, under the supervision of an editor.

But the second kind is where the title of this writeup comes in. Not everyone is literal, or critical enough to notice every mistake. Much of the time, you'll only hear about a mistake if many other mistakes are already afoot and this one happens to stand out from the set. And since you don't know what mistakes you could be making, you don't know whether there are many more below your level of awareness that you could be correcting for, but aren't.

In short, you're tasked with avoiding a wrongness iceberg: a mistake indicative of a nautical mile of mistakes below the surface, beneath your level of awareness.

This is a debilitating position to be in, because your mental map of your performance prior to discovering the iceberg needs to be completely rewritten; in addition to accounting for all the new areas you need to work on, you'll likely have to contend with the embarrassment of realizing that your period of unaware incompetence has opened up a whole new frontier of mistakes to reflect on.

While I don't think it's impossible that people exist who have never been in a situation like this, I think anyone who dives into a new field or skill knows this feeling of brief yet total incompetence. And if you're in a field with enough depth and subjective calls to allow for a wrongness iceberg, there might not be much you can do to prevent it. The most you can do is brace yourself for the inevitable.

That's why I've created this mental model: to think about the situation constructively. In every case where I've faced a wrongness iceberg, the anxiety has been catastrophic. If you can at least name what's happening, you can see why you're anxious and what's going on with your assessment of your own mistakes. From experience, knowing that I'm worried about making an iceberg-revealing mistake helps mitigate my stress. And if you can somehow preempt an iceberg, that's even better.

Side note: I've extended this concept to other domains, and it works well. A "dishonesty iceberg" is when one lie reveals a nautical mile of lies below the surface, and an "attraction iceberg" is when one person's expression of attraction toward you is indicative of a much greater level of internal attraction.

Comment author: phane 07 May 2009 02:09:27PM 21 points

I don't think "Not sending in your $200 rebate" and "not writing in an article to Overcomingbias" are the same phenomena at all.

It's not that the people now writing all these LW posts felt it was too much of a hassle to send an email to Overcoming Bias; it's that deliberately and unusually sticking your neck out to contribute has a different social connotation than simply participating in expected community behavior.

Contributing to Overcoming Bias is like getting on stage: walking up to the stage is a socially loaded act in and of itself. "Hey, everyone, I'm going to stand out here and say something." LessWrong, since the entire site is built around community posting, practically invites you to post as you please. There's nothing out of the ordinary about it. How could there be? The tools to do so are embedded right there in the infrastructure of the site. It must be expected for me to do that!

Comment author: alfredmacdonald 01 January 2013 07:01:56PM 3 points

I think LessWrong actually has a higher barrier for contribution -- at least for articles -- because you're expected to have 20 comment karma before you can submit. This means that, if you're honest anyway, you'll have to spend your time in the pit interacting with people who could potentially shout you down, or call you a threat to their well-kept garden, or whatever.

I have at least three articles in draft form that I want to submit once I reach that total, but I don't comment on discussions much because most of what I would say has usually already been said in one comment or another. For people like me, the barrier of "must email someone" is actually lower, since contributing to discussions requires knowing how the community works, intuiting what the community deems a good comment, and posting along those lines.

Comment author: alfredmacdonald 15 December 2012 04:04:59PM 0 points

Luke, I was curious: where does informal logic fit into this? It's the principal method of reasoning tested on the LSAT's logical reasoning section, and I'd say the most practical form of reasoning one can engage in, since most everyday arguments use informal logic one way or another. Honing it is valuable, and LSAT percentiles suggest that far fewer people are good at it than should be.
