I agree, and this seems like a special case of what I sometimes think of as "extrapolating too far", which occurs quite often in reasoning of all kinds, and particularly when discussing the future.
An example would be the assumption, which people sometimes argue for, that some scarce material resource will eventually just "run out" more or less suddenly. In such cases, it's almost always the case that scarcity increases gradually and feeds into a feedback loop of searching for alternatives. But if one extrapolates the "this resource will eventually run out" idea in isolation, without taking into account the effects this process has on the rest of the world, one can arrive at the conclusion that the supply eventually just "hits 0".
Of course, extrapolating in more reasonable ways is often extremely difficult, as the systems involved are hard or impossible to fully predict. But when one makes isolated extrapolations, it's at least helpful to keep in mind that they necessarily come with simplifications that won't hold, and that the world around the particular thing in question will likely react to the changes the extrapolation entails. Or, as in your post, the extrapolated thing itself may not follow a linear trajectory to begin with.
> I agree, and this seems like a special case of what I sometimes think of as "extrapolating too far", which occurs quite often in reasoning of all kinds, and particularly when discussing the future.
Yeah, I was somewhat on the fence about how to title the post. "Beware Overextrapolating" does feel more joint-carving, but "Don't Overdose Locally Beneficial Changes" seemed more likely to prompt the reader's mind in the direction of what the post is talking about.
Similarly, changing your mind is considered a good thing on LW, but there is also a virtue of... uhm... not going crazy just because someone told you a new insight that has shattered your previous beliefs.
There is value in changing your beliefs and in resisting peer pressure, but that doesn't mean you should immediately throw away all your sanity and outside view just because Vassar or Ziz told you something cool.
As a small aside, in reference to the meditation thing: I think I saw in the comments of one of Scott Alexander's blog posts a long time ago (I know, such good provenance; can't find it atm) that a certain percentage of people are psychologically vulnerable to meditation. I'm fairly certain I'm one of those people. I can't handle psychedelics, including weed, and I get paranoid and anxious when meditating.
cf. https://www.lesswrong.com/posts/fhL7gr3cEGa22y93c/meditation-is-dangerous
> I can't handle psychedelics, including weed, and I get paranoid and anxious when meditating.
Do you generally find silence and not having anything to do uncomfortable/anxiety-inducing?
Not having anything to do, no, because I always find something to do. I can tolerate not having any electronics or books or anything for several hours; I will just think of things. That said, I would guess I'd be in the bottom 15% for resistance to going crazy in complete isolation. I actually like silence. I was surprised by how much I liked it when I visited an isolated area of Yellowstone National Park where there was (as far as my ears could tell) literally zero ambient noise. It was very relaxing.
But I do get anxious when I realize that I am dreaming, and immediately start thinking about dream characters morphing into demons and turning against me. I also don't like going to sleep without listening to a video.
When hitting dead-on is too small a target, you can either over-correct or under-correct. Which mistake do people make more?
I think people under-correct, by far, and are too scared of over-correcting. It's more frustrating to double back than to make uniform progress from one side. That doesn't mean avoiding overshooting is efficient.
> I think people under-correct, by far, and are too scared of over-correcting.
I agree that this is probably the case. In general, people have a normalcy bias (including the tendency to keep their behavior within the distribution of behaviors of the people around them) that holds them back from applying more dakka.
But also, like, sometimes avoiding overshooting is efficient, and it is often very efficient to just stop and think for 5 minutes about whether you should apply more of the thing. The biases toward overshooting and undershooting are both bad, and they mostly don't cancel each other out.
(Not sure how much of a pushback against your comment this is.)
[Alternative title: apply More Dakka incrementally and carefully.]
If you are very overweight, then you should aim to cut down your daily caloric intake. This doesn't mean your optimal daily caloric intake is 100 kcal.
If you are very underweight, then you should aim to ramp up your daily caloric intake. This doesn't mean your optimal daily caloric intake is 10,000 kcal.
In general, if something is good to do in some amount in some context, this doesn't mean that you should go as all-in on it as you can possibly manage. The utility of a change is context-dependent, and as you apply more of the change, the context itself shifts, and the marginal utility of the change may shift along with it (up or down).
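To make the "marginal utility can flip sign" point concrete, here is a toy model (my own illustration, with made-up symbols; nothing in the post depends on it). Suppose the benefit of applying a "dose" $x \geq 0$ of some change is

$$U(x) = ax - bx^2, \qquad a, b > 0.$$

Then the marginal utility is $U'(x) = a - 2bx$: positive for $x < a/(2b)$ and negative for $x > a/(2b)$. Near $x = 0$, every additional unit helps, so "more dakka" is locally correct; past the optimum $x^* = a/(2b)$, the same change strictly hurts. The mistake described here is reading the positive slope at the origin as a license to push $x$ as high as possible.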
...
This seems dead obvious, but I've been noticing various places to which this dead obvious point applies, yet where many people seem to go by "seems good so far, so let's go all in" regardless.
For example: it's good to pull the mind's brakes, but that doesn't mean it's good to just stop it outright.
Some currents of thought latch onto the fact that certain changes to one's mind are, in general, clearly beneficial, and extrapolate all the way, proclaiming that the state of mind modified maximally along this axis is the most desirable one.
Last year, I interacted with a practitioner of Buddhism who expressed a strange view to me, one I can now only vaguely recall. As far as I remember, the view was that as humans interact with each other, with other living beings, and even with the rest of the non-living world around them, they are not passively allowing things to manifest as they are, but rather imposing concepts on the Other, fitting the Other into preconceived frames. This is bad, the person said, because it puts us in "conflict" with the world.[1] The right choice is to abandon all our concepts, as they are "violent". If abandoning all concepts means the annihilation of the mind, so be it.
Listening to people trying to make sense of this after the Buddhist's departure made me think that this is an example of a broad pattern: someone notices a good mental movement or change to one's mind and goes on to (implicitly) consider it absolutely good, something to be applied all the way.
One can gain insight, through various sorts of practice, that getting one's concepts to loosen their grip on the world, and letting the world manifest itself through the cracks left by that loosening, can be good. See: Naturalism, Seeing with Fresh Eyes, Trapped Priors As A Basic Problem Of Rationality,[2] etc. This doesn't mean that you can just abandon all your concepts,[3] because, in order to perceive in the first place, you need some concepts to make sense of the incoming information. A blank slate is not a mind.
[Caveat: I'm not saying that all Buddhist-ish practice is bad, and I am not claiming that this is the view that the Buddha (or whichever specific major figure in the movement) held.]
To give a few more examples:
The above is an excerpt from Eliezer's old post When None Dare Urge Restraint. The issue I'm pointing at is something like: none-dare-urge-restraint-ness dynamics can also occur intra-personally.[4]
[1] One of the things I asked the person was: "Why call it 'conflict' rather than 'tension', which seems like a clearly more apt term to me? It's unclear to me that this needs to lead to any conflict, whereas there is some tension between, roughly, bottom-up processing and top-down processing, though it's unclear to me why that would be a proper tension between the perceiver and the perceived." As far as I could tell, the person didn't offer a response.
[2] In a sense, the entire point of this post could be described as "a positive evaluation of an available action can become a trapped prior, and the consequences can be catastrophic".
[3] I guess a better term than "concepts" would be something like "mental structures", but I'll limit esotericism by sticking to the more common term.
[4] Maybe it makes sense to think about it in terms of myopic subagent power-seeking, a cancerous sort of goal (speculating, low confidence).