Comment author: shminux 27 April 2013 06:40:13PM 6 points [-]

This is true, but why privilege rationality? Integrity, kindness and other desirable traits scale the same way and are probably just as important.

Comment author: alfredmacdonald 27 April 2013 10:16:00PM 1 point [-]

Kindness only affects decisions where altruistic behavior wouldn't occur without it. Integrity I'm even less sure about. Rationality can affect any decision where bias or fuzzy reasoning is involved, which is almost every decision.

Comment author: alfredmacdonald 27 April 2013 06:20:17PM *  13 points [-]

You should get a Ph.D. in Philosophy if you consider the material studied in philosophy to be an end in itself. Philosophy is a truthseeking discipline, so if you find that inherently rewarding and can imagine doing it for a large part of your life, it's a good decision. Don't be put off by the wariness about philosophy here: I can guarantee you that the criticisms leveled here against philosophy have been addressed tenfold in actual philosophy departments, by people with sympathies closer to Luke's than you'd think.

That said, a lot of people go into graduate programs for bad reasons. Here are two I've been tempted by:

1.

Minimizing Status Risk. A lot of people think about risk in terms of financial gain or loss, but few think about risk in terms of status, even though it's a real concern for many people. Graduating college can be intimidating, especially from a prestigious college, because you're about to be stripped of your hierarchical standing among people your age. If you've attended, say, Harvard for four years, you've spent those four years thinking of yourself as at the top of the food chain relative to other college students.

Once you're out of college, this is no longer true, and you're measured by what kind of job you have. It's extremely tempting to avoid this by applying to graduate school, because graduate school allows you to continue the imagined hierarchical standing that you've had for the past few years. Eventually you'll get a Ph.D. and be on top of the intellectual food chain. This has nothing to do with "avoiding the real world", because "the real world" as an employment area is conspicuously centered on office jobs or whatever the majority of people happen to do for money. (I wonder if farmers consider everyone else to have a "fake" job. Probably.)

It's a way of shielding your status from vulnerability, because working as a clerk or receptionist or barista or server or whatever after college is generally not prestigious and makes you feel like your intellect isn't worth anything. That's an uncomfortable feeling, sure, but make sure you're not eyeing a Ph.D. just to avoid it.

2.

Even if you're not avoiding Status Risk, make sure you're not getting a Ph.D. just to feel like an intellectual hotshot anyway. A lot of people reason about competence in binary terms (expert or non-expert) even though competence obviously exists on a spectrum, so it's tempting to get a title that places you immediately at the "expert" end of any discussion. That way, you can throw your weight around whenever there's a clash of words.

Philosophy especially is enigmatic to a lot of people; there's a mystery about what you're actually learning in an advanced program. So a Ph.D. looks like a "certified smart person" badge, and that's tempting. Make sure you're not getting it for that reason either.


Here's the litmus test. Ask yourself: "would I self-study this material anyway if I had the next three to five years paid for? Would it occupy a large part of my time regardless of what I'm doing?" If so, it's worth it.

Comment author: [deleted] 11 February 2013 03:09:28AM 0 points [-]

The proactive thing to do, naturally, is to try to minimize how many mistakes you make.

To me, there seems to be something kind of off about this sentence. Suppose I'm trying to get better at a game like Starcraft. Starcraft is sufficiently complicated that attaining a basic level of skill (by which I mean "macromanagement": being able to ensure that all your resources are being used somehow, without worrying about using them well) takes hours and hours of practice. And during that practice, you will inevitably make mistakes; the only way to avoid making mistakes is to not practice. Indeed, every mistake teaches you something, so I'm tempted to say that what you want to do is maximize the number of mistakes you make.

In short, it seems to me that minimizing the number of mistakes you make doesn't serve the purpose of making you more skilled. So what purpose does it serve?

In response to comment by [deleted] on The Wrongness Iceberg
Comment author: alfredmacdonald 11 February 2013 09:20:19PM 0 points [-]

Sure, in the very short run (starting from absolutely no knowledge of the game) you'd have to make mistakes to learn anything at all. But the process of getting better is a gradual decrease in the frequency of those mistakes. You'd want to minimize your mistakes as much as possible as you improve, because the frequency of your mistakes is strongly correlated with how much you lose.

I think you're seeing "try to minimize how many mistakes you make" and reading it as "try to make no mistakes." There are certainly mistakes you'll have to make to get better, but there are also superfluous mistakes that some people make while others won't, and catastrophic mistakes that would make you look really bad, which you'd definitely want to avoid. The pool of possible mistakes goes much deeper than the necessary ones, in other words.

Comment author: alfredmacdonald 08 February 2013 05:53:46AM 2 points [-]

I really liked this post, and I think a lot of people aren't giving you enough credit. I've felt similarly before -- not to the point of suicide, and I think you might want to find someone you can confide those anxieties in -- but about anger at someone's dismissal of rationalist methodology. Because ultimately, it's the methodology that makes someone a rationalist, not necessarily a set of beliefs. The categorization of emotions as in opposition to logic, for example, is something that has frustrated me for quite some time, because emotions aren't anti-logical so much as they are alogical. (In my personal life, I'm an archetype of someone who gets emotional about logical issues.)

What I suspect was going on is that you felt this person was being dismissive of the methodology and did not believe reason to be an arbiter of disagreements. That reads to me like saying "I'm not truth-seeking, and I think my gut perception of reality is more important than the truth" -- a stance that sounds both arrogant and immoral. I've run across people like this too, and every time I feel like someone is prioritizing their kneejerk reaction over the truth, it's extremely insulting. Perhaps that's what you felt?

Comment author: wwa 05 February 2013 01:12:24PM 3 points [-]

Isn't anxiety the primary problem? The obvious way to make fewer mistakes is to gather data and figure stuff out, but you're not asking about that. You're not afraid of making mistakes; you're afraid of people (including yourself) discovering your alleged incompetence. The obvious solution to that is to fix the anxiety problem. Yes, it might be hard and/or require external help, but you said it yourself:

the anxiety has been catastrophic

In response to comment by wwa on The Wrongness Iceberg
Comment author: alfredmacdonald 08 February 2013 05:41:19AM 1 point [-]

I don't currently work at a restaurant, so at the moment I'm afraid of nothing.

But for the purposes of the example, it's not about discovering mistakes or incompetence -- it's about your level of incompetence being much greater than you previously estimated, for reasons you were unaware of prior to being exposed to those reasons.

Comment author: BerryPick6 04 February 2013 09:59:11AM 4 points [-]

What about the iceberg iceberg, where, upon noticing your first iceberg, you realize there's a metric ton of icebergs under it?

Or a recursive iceberg, where you realize there's a whole nautical mile worth of rabbit hole left to go down?

Comment author: alfredmacdonald 04 February 2013 10:42:27AM *  3 points [-]

I find that similar to the concept of fractal wrongness. What distinguishes an iceberg from a fractal is that with an iceberg, someone is resisting exposing the whole thing for one reason or another. In the dishonesty scenario, one lie reveals many others only because that person has left you a tidbit of information that cracks their facade and lets you infer just how deeply they've lied to you -- or in the case of attraction, an event or action that would only occur if a much greater level of attraction existed below the surface.

Comment author: phane 07 May 2009 02:09:27PM 21 points [-]

I don't think "not sending in your $200 rebate" and "not writing in an article to Overcomingbias" are the same phenomenon at all.

It's not that people who are now writing all these LW posts felt like it was too much of a hassle to send an email to Overcomingbias; it's that deliberately and unusually sticking your neck out to contribute has a different social connotation than simply participating in the expected community behavior.

Contributing to Overcomingbias is like getting on stage: walking up to the stage is a socially loaded act in and of itself. "Hey, everyone, I'm going to stand out here and say something." LessWrong, since the entire site is built around community posting, practically invites you to post as you please. There's nothing out of the ordinary about it. How could there be? The tools to do so are right there, embedded in the infrastructure of the site. It must be expected for me to do that!

Comment author: alfredmacdonald 01 January 2013 07:01:56PM 3 points [-]

I think LessWrong actually has a higher barrier for contribution -- at least for articles -- because you're expected to have 20 comment karma before you can submit. This means that, if you're honest anyway, you'll have to spend your time in the pit interacting with people who could potentially shout you down, or call you a threat to their well-kept garden, or whatever.

I have at least three articles in draft form that I want to submit once I reach that total, but I don't comment on discussions much because most of what I would say is usually said in one comment or another. For people like me, the barrier of "must email someone" is actually easier to clear, since contributing to discussions requires knowing how the community works, intuiting what the community deems a good comment, and posting along those lines.

Comment author: alfredmacdonald 15 December 2012 04:04:59PM 0 points [-]

Luke, I was curious: where does informal logic fit into this? It is the principal method of reasoning tested on the LSAT's logical reasoning section, and I would say the most practical form of reasoning one can engage in, since most everyday arguments use informal logic in one way or another. Honing it is valuable, and the LSAT percentiles suggest that not nearly as many people are as good at it as they should be.

Comment author: Peterdjones 07 December 2012 11:05:16AM 0 points [-]

I think it's a task Luke isn't up to. To single-handedly reform teaching like that you would have to be a renowned philosopher or educationalist, a Dewey or Erasmus, not a twenty-something blogger. His understanding of philosophy is barely up to undergraduate level. Sorry, but that's the way it is.

Comment author: alfredmacdonald 15 December 2012 03:48:28PM 1 point [-]

His understanding of philosophy is barely up to undergraduate level. Sorry, but that's the way it is.

I feel like the phrasing "barely up to undergraduate level" works like calling something "basic" or "textbook" not because it actually is, but because it insinuates there's an ocean of knowledge your opponent has yet to cross. If Luke is "barely undergraduate," then I know a lot of philosophy undergrads who might as well not call themselves that.

While I agree that reform is far more likely to be done by a Dewey or Erasmus, your reasoning gives me a very "you must be accepted into our system if you want to criticize it" vibe.

Comment author: Epiphany 15 December 2012 09:48:13AM -7 points [-]

Not if LessWrong values truthseeking activities more than the general population does, or considers lying/truth-fabrication a greater sin than the general population does, or if LessWrong attracts fewer sociopaths than the general population. If over 1000 fitness enthusiasts take a test about weight, the statistics on obesity are not going to reflect the general population's. Considering the CRT scores of LessWrong and this website's admiration for introspection and truthseeking, I doubt that LW would be reflective of the general population in this way.

That is why I used the wording "statistically speaking" - it is understood to mean that I am working from statistics generated on the overall population as opposed to the specific population in question. You are completely ignoring my point, which is that you have chosen a position that is going to be more or less impossible to defend. That position was:

I don't think anyone on Less Wrong has lied about their IQ.

It's considered very rude to completely ignore someone's argument and nitpick their wording. That is what you just did.

Lies are more than untrue statements; at least, in the context of self-reports,

Now it's like you're trying to make up a new definition of the word "lying" so you can continue to hold your ridiculous assessment that:

To lie about your IQ would mean you'd have to know to some degree what your real IQ is

By the common definition of the word "lie," producing a number when you do not know the number definitely qualifies as a lie. You're not fooling me by trying to redefine the word "lie" in this context. This behavior just looks ridiculous to me.

Mensa doesn't need to be a professional IQ testing center for their normings to be accurate, however.

But they do need to provide a professional IQ testing service if they want their norms to mean something. IQTest.dk might turn out to be a better indicator of visual-spatial ability than of IQ, or it might discriminate against autistics, of whom LW might have an unusually large number (seeing as there are a lot of CS people here).

However, it's inaccurate to say that because someone puts their number in the box from IQTest.dk that they're "equally flawed" to the other intelligence questions.

Here you go twisting my wording. I specifically said:

In that way, they're equally flawed to the other intelligence questions...

The only reason I'm responding to you is because I am hoping you will see that you need to do more work on your rationality. Please consider getting some rationality training or something.

Comment author: alfredmacdonald 15 December 2012 10:26:08AM *  5 points [-]

The general population would contain 50 sociopaths per 1000; I don't think LessWrong contains 50 sociopaths per 1000. Rationality is a truth-seeking activity at its core, and I suspect a community of rationalists would do their best to avoid lying consciously.

I am not sure what "the common definition of the word 'lie'" is, especially since there are many differing interpretations of what it means to lie. I do know that wrong answers are distinct from lies, however. I think a lot of LessWrong people may have put an IQ that doesn't reflect an accurate result, but I doubt many put a deliberately inaccurate one. Barring "the common definition" (I don't know what that is), I'm defining a lie as "stating something when you know what you are stating is false": someone can put down a number without knowing for sure that it's the true number, but also without knowing that it's false.

I don't know what you mean by "mean something" with respect to Mensa Denmark's normings. They will probably be less accurate than a professional IQ testing service's, but I don't know why they would be inaccurate or "meaningless" merely because the organization isn't a professional IQ testing service.

The only way I can think of in which the self-reported numbers would be more accurate than the IQTest.dk numbers is if the LW respondents knew that their IQ numbers were from a professional testing service and they had gone to this service recently. But since the test didn't specify how they obtained this self-report, I can't say, nor do I think it's likely.

IQTest.dk uses Raven's Progressive Matrices, which is a standard way to measure IQ across cultures; large splits between verbal and spatial IQ are not that common. It wouldn't discriminate against autistics; if anything, it discriminates in their favor, since people with disorders on the autism spectrum are likely to score higher, not lower.

I'm not sure how bolding "in that way" bolsters your argument. Paraphrased, it would be "in the way that the user types the IQ score into the survey box themselves, the IQTest.dk questions are equally flawed to the other intelligence questions." But this neglects that the source of the number is different: both are self-reports in the sense that the number is up to someone to recall, but if someone types in their IQTest.dk number, you know it came from IQTest.dk. If someone types in their IQ without specifying the source, you have no idea where the number came from -- it could be an estimate, a childhood test score, and so on.

Please consider getting some rationality training or something.

Remarks like these are unnecessary, especially since I've just joined the site.
