Someone mentioned Paul Feyerabend in response to this post. He was in favor of having slack in science, and I resonate strongly with some of these descriptions:
Feyerabend was critical of any guideline that aimed to judge the quality of scientific theories by comparing them to known facts. He thought that previous theory might influence natural interpretations of observed phenomena. Scientists necessarily make implicit assumptions when comparing scientific theories to facts that they observe. Such assumptions need to be changed in order to make the new theory compatible with observations. The main example of the influence of natural interpretations that Feyerabend provided was the tower argument. The tower argument was one of the main objections against the theory of a moving earth. Aristotelians assumed that the fact that a stone which is dropped from a tower lands directly beneath it shows that the earth is stationary. They thought that, if the earth moved while the stone was falling, the stone would have been "left behind". Objects would fall diagonally instead of vertically. Since this does not happen, Aristotelians thought that it was evident that the earth did not move. If one uses ancient theories of impulse and relative motion, the Copernican theory indeed appears to be falsified by the fact that objects fall vertically on earth. This observation required a new interpretation to make it compatible with Copernican theory. Galileo was able to make such a change about the nature of impulse and relative motion. Before such theories were articulated, Galileo had to make use of ad hoc methods and proceed counterinductively. So, "ad hoc" hypotheses actually have a positive function: they temporarily make a new theory compatible with facts until the theory to be defended can be supported by other theories.
Feyerabend commented on the Galileo affair as follows:
The church at the time of Galileo was much more faithful to reason than Galileo himself, and also took into consideration the ethical and social consequences of Galileo's doctrine. Its verdict against Galileo was rational and just, and revisionism can be legitimized solely for motives of political opportunism.
The following is also a nice thing to keep in mind, though it's less about slack and more about the natural pull to use tools like science to further political/moral aims.
According to Feyerabend, new theories came to be accepted not because of their accord with scientific method, but because their supporters made use of any trick – rational, rhetorical or ribald – in order to advance their cause. Without a fixed ideology, or the introduction of religious tendencies, the only approach which does not inhibit progress (using whichever definition one sees fit) is "anything goes": "'anything goes' is not a 'principle' I hold... but the terrified exclamation of a rationalist who takes a closer look at history." (Feyerabend, 1975).
The following is more controversial, and I don't fully agree with it. But it contains some interesting thought nuggets.
Feyerabend described science as being essentially anarchistic, obsessed with its own mythology, and as making claims to truth well beyond its actual capacity. He was especially indignant about the condescending attitudes of many scientists towards alternative traditions. For example, he thought that negative opinions about astrology and the effectivity of rain dances were not justified by scientific research, and dismissed the predominantly negative attitudes of scientists towards such phenomena as elitist or racist. In his opinion, science has become a repressing ideology, even though it arguably started as a liberating movement. Feyerabend thought that a pluralistic society should be protected from being influenced too much by science, just as it is protected from other ideologies.
Starting from the argument that a historical universal scientific method does not exist, Feyerabend argues that science does not deserve its privileged status in western society. Since scientific points of view do not arise from using a universal method which guarantees high quality conclusions, he thought that there is no justification for valuing scientific claims over claims by other ideologies like religions. Feyerabend also argued that scientific accomplishments such as the moon landings are no compelling reason to give science a special status. In his opinion, it is not fair to use scientific assumptions about which problems are worth solving in order to judge the merit of other ideologies. Additionally, success by scientists has traditionally involved non-scientific elements, such as inspiration from mythical or religious sources.
My more charitable interpretation is that science is a nicely rigorous method for truth-seeking, but because of its standards of rigor, it ends up missing things (like the 'ki' example from In praise of fake frameworks).
Also, I sense that the elitist attitudes coming from science / rationality / EA are not entirely justified. (Possibly this elitism is even counter to the stated goals of each.) I feel like I often witness 'science' or 'rationality' getting hijacked for goals unrelated to truth-seeking. And I'm currently a tiny bit skeptical of the confidence with which EA claims moral authority.
The opening Feyerabend quote sounds very similar to (Scott's review of) Kuhn's Structure of Scientific Revolutions. Related: Jacob's post on the Copernican revolution from the inside.
Attempt at definition.
If I have less slack in my belief system, that means I have more constraints on what counts as 'evidence' for a given statement, or more preconceptions about what can count as 'true' or 'real'.
Either I can be looking for specific signs/evidence/proofs/data ("I will only consider X if you can prove Y." "I will only consider X if you show me a person who flosses with their shoelace.").
Or I can be looking for certain categories or classes of evidence ("I will only consider X if there are studies showing X." "I will only consider X if your argument takes a certain form." "I will only consider X if 5 experts agree." Etc.)
Sometimes, it's better to have less slack. It makes sense for certain fields of mathematics to have very little slack.
Other times, it hinders progress.
Are you trying to define Slack in Your Belief System as "this is what those words together naturally mean" or are you defining it as "this is a useful concept to think about and this is what I choose to name it"?
Before reading your take, I thought about what 'slack in your belief system' would mean to me, and I came up with a lot of different things it could mean. Mostly my System-1 response was that SIYBS links back into Anna's concept that flinching away from truth is about protecting the epistemology: What beliefs could you change without changing everything? What must you defend lest huge chains of logic unravel and mindsets shift in disruptive ways? But also the simple, 'how tightly am I holding onto these beliefs' type of thing, a kind of uncertainty, how much you would update on new evidence in an area at all. That does go hand in hand with what types of evidence you'd update on. Often I think people have places where the 'wrong' kind of evidence is allowed, because there isn't 'right' evidence that needs to be dislodged, so you have more slack in those places, and so on. Kind of a levels-of-evidence thing. Also could be thought of as how well your system can adjust to counterfactuals or fake frameworks or ad argumentos, and still give reasonable answers.
I do think there's a specific kind of lack of Slack where you decide that something is Scientific and therefore you can only change your beliefs based on Proper Scientific Studies, and that this is very easy to take too far (evidence is always evidence). What this is really saying is that your prior is actually damn close to 0 or 1 at this point, so other types of evidence aren't likely to cut it, and/or that you think people are trying to trick you, so you have to disregard such evidence?
Anyway, it's certainly an interesting thing to think about, and this is already pretty rambly, so I'll stop here.
Promoted to the frontpage. I like this post, and think it's an interesting extension and clarification of the Slack concept, though I still didn't walk away with a very precise sense of what "Slack in one's belief system" actually means, and I still have a few competing interpretations of the details. But I like the point overall. I would be glad to see people in the comments or in future posts try to give a more rigorous definition of Slack in this context.
I was also somewhat unsure about the Slack-In-Belief-System definition and thought it'd have been nice to open with a sentence or two clarifying that.
It sounded something like a combination of "having some chunk of probability space for 'things I haven't thought of' or 'I might be wrong'", as well as something vaguer like "being a bit flexible about how you come to believe things."
it'd have been nice to open with a sentence or two clarifying [the definition]
Yes! And it should come right between these two lines:
You can have Slack in your life. But you can also have Slack in your belief system.
Initially, this seems like it might be bad.
Is it bad? I don't know. You haven't told me what you mean by Slack in belief system yet!
This is one of those posts where I wish for three examples of the thing described, because I see two options:
1. This is a weakman of the position I hold, in which I seek ways to draw a map that corresponds to the territory, have my own estimations of what works and what doesn't, and disagree with someone about that. And that someone, instead of providing evidence that their method provides good predictions or insights, just says I should have more slack.
All your descriptions of why to believe things sound anti-Bayesian. It's not a boolean believe/disbelieve; update yourself incrementally! If I believe something provides zero evidence, I will not update; if the evidence is dubious, I will update only a little. And then the question is how much credence you assign to which evidence, and what methods you use to find evidence. (See the Bayes sketch after this comment.)
2. It's a different-worlds situation, where the post writer encountered a problem I didn't.
And I have no way to judge that without at least one (and preferably more) actual examples of the interaction, ideally linked to rather than described by the author.
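A minimal sketch of the incremental-updating point above, using Bayes' rule in odds form (the numbers are illustrative only): the likelihood ratio stands in for "how much credence you assign to what evidence", and a prior already close to 0 or 1 barely moves under ordinary evidence.

```python
# A minimal sketch of incremental Bayesian updating (odds form).
# The likelihood ratio P(E | H) / P(E | not H) measures the strength of the
# evidence: a ratio of 1 is zero evidence, a ratio near 1 is dubious evidence,
# and only a large ratio moves a belief a lot.

def update(prior: float, likelihood_ratio: float) -> float:
    """Return the posterior probability of H after seeing evidence E."""
    prior_odds = prior / (1.0 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1.0 + posterior_odds)

print(update(0.50, 1.0))   # zero evidence: no update, stays at 0.50
print(update(0.50, 1.2))   # dubious evidence: only a small nudge, ~0.55
print(update(0.50, 10.0))  # strong evidence: a real jump, ~0.91
print(update(0.99, 0.5))   # a prior near 1 barely moves on weak counter-evidence, ~0.98
```

The odds form is used here because it turns the strength of the evidence into a single multiplicative factor, which makes the "update only a little on dubious evidence" behavior explicit.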
Follow-up to Zvi’s post on Slack
You can have Slack in your life. But you can also have Slack in your belief system.
Initially, this seems like it might be bad.
Won't Slack result in a lack of precision? If I give myself Slack to believe in whatever, won't I just end up with a lot of wrong beliefs? Shouldn't I always be trying to decrease the amount of Slack in my beliefs, always striving to walk the narrow, true path?
Claims:
[ I want to note that I fully believe I could be wrong about all four claims here, or be thinking about this in entirely the wrong way. So fight me. ]
Now, I'm going to specifically discuss Slack in one's meta process.
So, while I can apply the concept of Slack to individual beliefs themselves (aka "holding beliefs lightly"), I am applying the concept more to the question of "How do I come to know/understand anything or call a thing true?"
So, I'm not discussing examples of "I believe X, with more or less Slack." I'm discussing the difference between, "Doing a bunch of studies is the only way to know things" (less Slack) vs. "Doing a bunch of studies is how I currently come to know things, but I'm open to other ways" (more Slack).
The less Slack there is in your process for forming beliefs, the more constraints you have to satisfy before being able to claim you've come to understand something.
Examples of such constraints include: requiring that there be studies showing the thing, requiring that arguments take a certain form, or requiring that enough experts agree.
Note that sometimes, it is good to have such constraints, at least for now.
Not everyone can interact with facts, claims, and beliefs without some harm to their epistemics. In fact, most people cannot, I claim. (And further, I believe this to be one of the most important problems in rationality.)
That said, I see a lot of people's orientations as:
"My belief-forming process says this thing isn't true, and in fact this entire class of thing is likely false and not worth digging into. You seem to be actively engaging with [class of thing] and claiming there is truth in it. That seems highly dubious—there is something wrong with your belief-forming process."
This is a reasonable stance to take.
After all, lots of things aren't worth digging into. And lots of people have bad truth-seeking processes. Theirs may very well be worse than yours; you don't have to consider something just because it's in front of you.
But if you notice yourself unwilling to engage with [entire class of thing]... to me this indicates something is suboptimal.
Over time, it seems good to aim for being able to engage with more classes of things, rather than fewer.
If something is politically charged, yes, your beliefs are at risk, and you may be better off avoiding the topic altogether. But—wouldn't it be nice if, one day, you could wade through the mire of politics and come out the other side, clean? Epistemics intact? Even better, you come out the other side having realized new truths about the world?
I guess if I'm going to be totally honest, the reason I am saying this is because I feel annoyed when people dismiss entire [classes of thing] for reasons like, "That part of the territory is really swampy and dangerous! Going in there is bad, and you're probably compromised."
At least some of the time, what is actually going on is that the person has simply figured out how to navigate swamps.
But instead, I feel like the person doing the dismissing lacks Slack in their belief-forming process and is also trying to enforce this lack of Slack onto others.
From the inside, I imagine this feels like, "No one can navigate swamps, and anyone who says they can is probably terribly mistaken or naive about how truth-seeking works, so I should inform them of the danger."
From the inside, Slack will feel incorrect or potentially dangerous. Without constraints, the person may feel like they'll go off the rails—maybe they'll even end up believing in *gasp* horoscopes or *gasp* the existence of a Judeo-Christian God.
My greatest fear is not having false beliefs. My greatest fear is getting trapped into a particular definition of truth-seeking, such that I permanently end up with many false beliefs or large gaps in my map.
The two things I do to avoid this are:
a) Learn more skills for navigating tricky territories. For example, one of the skills is noticing a belief that's in my mind because it would be beneficial for me to believe it, i.e. it makes me feel good in a certain way or I expect good things to happen as a result—say, it'd make a person like me more if I believed it. This likely requires a fair amount of introspective capacity.
b) Be open to the idea that other people have truth-seeking methods that I don't. That they're seeing entire swaths of reality I can't see. Be curious about that, and try to learn more. Develop taste around this. Maintain some Slack, so I don't become myopic.