abstractapplic

You wrote once that "rationality is about becoming right instead of wrong". I replied along the lines of "what, no it's not", and suggested the following schema:

Step 1: Figure out what's worth spending your limited time becoming right about.

Step 2: Become right about those things.

Step 3: Remember to actually use your right-ness, like, at all.

Step 4: Apply your right-ness well, and thereby achieve your goals.

(To this I would in retrospect add Step X: Don't be crazy. By this I mean: have a goal in the first place, don't fall for any of the dozens of biases haunting your monkey brain, don't believe things just because they're written down, etc etc etc.)

abstractapplic

To start with: do you think that's a valid set of joints at which to carve? And how would you taxonomize things differently?

lsusr

I think that's a reasonable set of joints at which to carve, but I feel you may be making some assumptions I don't share.

For starters, what is "rationality"? I don't mean "how do you be a good rationalist?" I mean: what distinguishes rationality from competing philosophies and value systems? We all want to achieve our goals, but not everyone is a "rationalist".

abstractapplic

I'm mostly using the word "rationality" in the "systematized winning" sense. I guess the main thing that separates LW rats from most people-who-try-to-do-things is that we place an unusual emphasis on Step 2, and are unusually free re: what methods we use to accomplish it.

lsusr

You define rationality as "systematized winning"?

abstractapplic

How do you define it, if not like that?

lsusr

"The art of becoming legibly correct."

abstractapplic

To be fair, I think that is close to how most people outside LW would define it. Subcultures like ours tend to take common terms and make them mean slightly different things.

lsusr

It is true that subcultures tend to develop their own idiosyncratic jargons, but I don't think that's what's going on here. I think the difference in definitions reflects a true crux of disagreement between us about what rationality even is.

But perhaps I'm wrong. Do you think we're saying the same thing with different words?

abstractapplic

I think Step 2 in my taxonomy isn't valuable without the rest. And I think knowing that they're going to face Steps 3 and 4, and then eventual Consequences, tends to help people get better at Step 2.

(This is true both because it facilitates feedback loops, and because it snaps them from Far mode to Near mode, and probably for some other reasons I don't apprehend. Contact with reality is a panacea.)

lsusr

What is the terminal value of rationality? If you had to pick between winning and being right, which would you choose?

abstractapplic

This seems kind of abstract. There are situations where you can spend your resources either on building really good models of problems or on fixing them; a better model helps you figure out how to spend those resources, but if you focus purely on being right, there's a point past which that stops making sense.

abstractapplic

I'd say rationality (as opposed to just-winning) is a tendency to emphasize winning in ways where being right about things is a key step on the path, and an acknowledgement of how much smoother things get when you prioritize truth highly.

lsusr

Then rationality ≠ "systematized winning"?

abstractapplic

As most people use it, and as I'm choosing to use it now. (From here on I'll use "winnery" or something when I want to talk about rationality-as-rats-use-the-term, which my four-and-a-half steps aim to represent.)

lsusr

Earlier, you used the word "rationality" to mean "systematized winning". But now you're using the word "rationality" to refer to something else. What is that something else?

abstractapplic

The confusion here is mostly down to you changing my mind / reminding me that most people outside LW don't use "rationality" to mean "winnery" like we do, so I should use the term the way I think most people do instead of how our subculture does.


abstractapplic

I brought this topic up because I noticed your (very good!) videos emphasize Steps 2 and X super hard, to the exclusion of the others. I don't think that's a mistake, but I do think it's a limitation.

abstractapplic

(Lest it sound like I'm grilling you, I freely confess that my own efforts lean super hard on Steps 2 and 4 to the exclusion of the others - Step 3 in particular is ruled out by the fact that I directly tell people to use the provided information when working through my scenarios - and furthermore that I was originally aiming for Step X but managed to crit-fail so hard I succeeded at something unrelated.)

lsusr

It warms my heart that you like my videos. As an aside, I actually like getting grilled, as long as it's done in the right way. That's why I created those videos. I wanted people to grill me in them. But to my surprise, nobody did.

I only publish a small fraction of what I record. One of my favorite moments was when I called someone out for not doing Step 3. He exclaimed "Fuck you too" at me with a giant grin on his face.

I think one of the things that distinguishes my value system from yours is that—to me—winning is instrumental to being right, whereas—to you—being right is instrumental to winning.

abstractapplic

There's an interesting ambiguity in what you just said: "winning is instrumental to being right" could mean "winning is how you check whether you're right", "winning is how you motivate yourself to be right", or "if you're not winning you might not be able to afford the expensive books".

I'm guessing you agree with all three interpretations?

lsusr

I intended the first meaning, but I don't object to the other interpretations. Many things are instrumental to becoming right.

abstractapplic

I don't think I understand your values here. Presumably you wouldn't value memorizing whether every seven-digit number is prime . . . or maybe you would, but I doubt you'd value it as much as more generally useful information.

lsusr

I don't need to memorize whether every seven-digit number is prime. I already know that 1,000,000 is non-prime. Therefore not every seven-digit number is prime.

abstractapplic

s/every/each

Good catch. Point stands, though.

lsusr

You are correct that I do not care whether 1,590,201 is prime.

abstractapplic

. . . because you can already tell from the digits it's divisible by 9?

lsusr

It is?

lsusr

<checking>
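
(The rule abstractapplic is invoking: a number is divisible by 9 exactly when its digit sum is, and here 1 + 5 + 9 + 0 + 2 + 0 + 1 = 18. A minimal Python sketch of that check, assuming one wanted to verify it mechanically; the digit_sum helper is illustrative, not anything from the conversation:)

```python
def digit_sum(n: int) -> int:
    """Sum of the decimal digits of a non-negative integer."""
    return sum(int(d) for d in str(n))

n = 1_590_201
# Casting out nines: n is divisible by 9 iff its digit sum is.
print(digit_sum(n))   # 18, which is divisible by 9
print(n % 9 == 0)     # True: 1,590,201 = 9 * 176,689
```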

abstractapplic

Is that NOT why you put the 2 in there?

(I saw you start with 1,590,001 in the editor, then change it.)

lsusr

Nope. Totally random. Unlike you, I do not burn compute cycles trying to factor random 7-digit numbers. 😜

abstractapplic

So . . . what differentiates a truth you value from a truth you don't? The identities of the primes are some of the most fundamental, non-contingent truths there are. If you value truth just for its own sake, it seems like you'd value that.

lsusr

Some information is useful. Other information is interesting. Both kinds are of value to me. But what I get obsessive over is when I notice I'm wrong about something, but don't know where the error might be.

abstractapplic

That's more 'coherency' than 'truth' imo; more Step X than Step 2.

lsusr

You are correct. What I'm spotting is incoherency, but my target is truth.

The fundamental problem is that being wrong feels too similar to being right. The difference is that being wrong (sometimes) produces incoherency whereas being right does not. So when I notice an incoherency, I latch onto it. It's like spotting ripples on the surface of a still ocean. It tells me there's something lurking underneath for me to catch.

abstractapplic

That's true, important and correct imo; but it doesn't answer my question. What fish are worth catching in the first place, if not those you intend to make use of?

lsusr

I live in a world where great powers beyond my comprehension—governments, corporations, religions, ideologies, cultural evolution, memetic evolution, biological evolution itself—have installed malware in my brain that corrupts my perception of reality and twists me to their ends. I want to remove it or, at the very least, flag it.

abstractapplic

So - in my taxonomy - you're using Step 2 to facilitate Step X, and the other steps to facilitate Step 2. That makes sense.

I think the main thing I want to get across is just how much the other steps help with Step X: brains tend to be about an order of magnitude less deceptive when falsehood has near-term consequences.

(I say, not knowing how I'd quantify or prove that, and accidentally providing an example of the exact kind of inexactitude I'm railing against. But I don't think I'm wrong!)

lsusr

Exactly. Victory is instrumental.

3 comments

"Victory is instrumental."

Something seems off to me about this view, and by off I mean it sounds like nonsense to me.

What's pinging for me is that I take "victory" to mean successfully getting what you want: not just shallowly getting it, but deeply getting it, in the way you want to get it and without any undesirable side effects. So there's no real sense in which victory is instrumental; it's the ideal state of getting your desires met.

My read is that you're trying to say something like victory only matters if it's achieved by endorsed means, but that doesn't mean victory is instrumental, only that it's incomplete if achieved the wrong way.

"The art of becoming legibly correct."

I like this, but when you put it this way, I notice that "legibly" is to some degree subjective. The same process may be called rational by some and irrational by others, depending on whether they can read your reasoning.

Rationality (epistemic): The collection of techniques[1] that help one obtain a (more) correct model of the world from observations.

Rationality (instrumental): The collection of techniques that help one achieve goals in most environments. This collection includes most of epistemic rationality, because knowing what environment you're in is often useful.

Rationality (what people usually mean): The stuff The Sequences and Thinking, Fast and Slow are about. Arguably, the most useful meaning of the three.

  1. ^

    Or the property of an agent that it uses these techniques. Different types, but isomorphic enough that I propose an implicit conversion.