School taught me to write banal garbage because people would thumbs-up it anyway. That approach has been interfering with my attempts to actually express my plans in writing, because my mind keeps simulating some imaginary prof who will look it over and go "ehh, good enough".

Looking good enough isn't actually good enough! I'm trying to build an actual model of the world and a plan that will actually work.

Granted, school isn't necessarily all like this. In mathematics, you need to actually solve the problem. In engineering, you need to actually build something that works. But even in engineering reports, you can get away with a surprising amount of shoddy reasoning. A real example:

Since NodeJS uses the V8 JavaScript engine, it has native support for the common JSON (JavaScript Object Notation) format for data transfer, which means that interoperability between SystemQ and other CompanyX systems can still be fairly straightforward (Jelvis, 2011).

This excerpt is technically totally true, but it's also garbage, especially as a reason to use NodeJS. Sure, JSON is native to JS, but every major web programming language supports JSON. The pressure to provide citable justifications for decisions that were actually made for reasons more like "I enjoy JavaScript and am skilled with it" produces some deliberately confirmation-biased writing. This is just one pattern; there are many others.
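To make that concrete (this snippet is mine, not from the original report): the "native JSON" advantage the report leans on amounts to a one-liner in essentially any mainstream web language. Here's a sketch of the Python standard-library equivalent of JavaScript's JSON.parse and JSON.stringify; the example payload fields are made up for illustration.

    import json

    # Parsing JSON, the supposed NodeJS advantage, is one standard-library call:
    payload = json.loads('{"system": "SystemQ", "status": "ok"}')
    print(payload["system"])  # -> SystemQ

    # Serializing back out is just as short:
    print(json.dumps({"system": "SystemQ", "status": "ok"}))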

I feel like I need to add a disclaimer here or something: I'm a ringed engineer, and I care a lot about the ethics of design, and I don't think any of my shoddy thinking has put any lives (or well-being, etc) at risk. I also don't believe that any of my shoddy thinking in design reports has violated academic integrity guidelines at my university (e.g. I haven't made up facts or sources).

But a lot of it was still shoddy. Most students are familiar with the process of stating a position, googling for a citation, then citing some expert who happened to agree. And it was shoddy because nothing in the school system was incentivizing me to make it otherwise, and I reasoned it would have cost more to only write stuff that I actually deeply and confidently believed, or to accurately and specifically present my best model of the subject at hand. I was trying to spend as little time and attention as possible working on school things, to free up more time and attention for working on my business, the productivity app Complice.

What I didn't realize was the cost of practising shoddy thinking.

Having finished the last of my school obligations, I've launched myself into some high-level roadmapping for Complice: what's the state of things right now, and where am I headed? And I've discovered a whole bunch of bad thinking habits. It's obnoxious.

I'm glad to be out.

(Aside: I wrote this entire post in April, when I had finished my last assignments & tests. I waited a while to publish it, until I'd safely graduated. Wasn't super worried, but didn't want to take chances.)

Better Wrong Than Vague

So today.

I was already aware of a certain aversion I had to planning. So I decided to make things a bit easier with this roadmapping document, and base it on one my friend Oliver Habryka had written about his main project. He had created a 27-page outline in Google Docs, shared it with a bunch of people, and gotten some really great feedback and other comments. Oliver's introduction includes the following paragraph, which I decided to quote verbatim in mine:

This document was written while continuously repeating the mantra “better wrong than vague” in my head. When I was uncertain of something, I tried to express my uncertainty as precisely as possible, and when I found myself unable to do that, I preferred making bold predictions to vague statements. If you find yourself disagreeing with part of this document, then that means I at least succeeded in being concrete enough to be disagreed with.

In an academic context, at least up to the undergrad level, students are usually incentivized to follow the opposite rule: "better vague than wrong". If you say something the slightest bit wrong, it'll produce a little "-1" in red ink.

And if you and the person grading you disagree, a vague claim is more likely to be interpreted favorably. There's a limit, of course: you usually can't just say "some studies have shown that some people sometimes found X to help". But still.

Practising being "good enough"

Nate Soares has written about the approach of whole-assed half-assing:

Your preferences are not "move rightward on the quality line." Your preferences are to hit the quality target with minimum effort.

If you're trying to pass the class, then pass it with minimum effort. Anything else is wasted motion.

If you're trying to ace the class, then ace it with minimum effort. Anything else is wasted motion.

My last two yearly review blog posts have followed a structure of talking about my year on the object level (what I did), the process level (how I did it), and the meta level (my more abstract approach to things). I think it's helpful to apply the same model here.

There are lots of things that humans often wish their neurology naturally optimized for. One thing that it does optimize for, though, is minimum energy expenditure. This is a good thing! Brains are costly, and they'd function worse if they always ran at full power. But this has side effects. Here, the relevant side effect is that if you practise a certain process for a while, and it achieves the desired object-level results, you might lose awareness of the bigger-picture approach that you're trying to employ.

So in my case, I was practising passing my classes with minimum effort, and not wasting motion, following the meta-level approach of whole-assed half-assing. But while the meta-level approach of "hitting the quality target with minimum effort" is a good one in all domains (some of which will have much, much higher quality targets), the process of doing the bare minimum to create something with no obvious glaring flaws is not a process that you want to be employing in your business. Or in trying to understand anything deeply.

Which I am now learning to do. And, in the process, unlearning the shoddy thinking I've been practising for the last 5 years.

Related LW post: Guessing the Teacher's Password

(This article is crossposted from my blog.)

Comments

This post has been helpful. I have gathered a few things:

  1. Look for shoddy thinking and unlearn it - not entirely sure how to do that, but I will be thinking about it for the future.
  2. Better wrong than vague - be clear and specific enough to get things wrong. I have started to notice that sometimes I express things vaguely enough to "never be wrong"; it's happened since hitting LessWrong, and now I think it's more important than I did before to correct this sooner.
  3. Reminding me of Nate Soares' whole-ass-half-ass ideas - a great idea of "how to use laziness to its optimum position".

so thanks!

School taught me to write banal garbage because people would thumbs-up it anyway.

Is going back to school, for a more advanced degree, at a university with stricter academic standards, a viable retraining choice for you at this point?

[anonymous]:

'We need 2 Stalins!'

Ehh, a more advanced degree would probably help somewhat, although incentives are pretty messed up there too, in different ways. The university I went to and the program I took are both highly regarded (I'm biased, of course, but I've heard this from various less-biased sources).

I'm instead going to train my thinking skills on things like my business, because there I get actual feedback from the world on how successful I am. Also because the feedback there is made of dollars.