To develop methods of teaching rationality skills, you need to learn to focus on mental events that occur in 5 seconds or less. Most of what you want to teach is directly on this level; the rest consists of chaining together skills on this level.
As our first example, let's take the vital rationalist skill, "Be specific."
Even with people who've had moderate amounts of exposure to Less Wrong, a fair amount of my helping them think effectively often consists of my saying, "Can you give me a specific example of that?" or "Can you be more concrete?"
A couple of formative childhood readings that taught me to be specific:
"What is meant by the word red?"
"It's a color."
"What's a color?"
"Why, it's a quality things have."
"What's a quality?"
"Say, what are you trying to do, anyway?"You have pushed him into the clouds. If, on the other hand, we habitually go down the abstraction ladder to lower levels of abstraction when we are asked the meaning of a word, we are less likely to get lost in verbal mazes; we will tend to "have our feet on the ground" and know what we are talking about. This habit displays itself in an answer such as this:
"What is meant by the word red?"
"Well, the next time you see some cars stopped at an intersection, look at the traffic light facing them. Also, you might go to the fire department and see how their trucks are painted."-- S. I. Hayakawa, Language in Thought and Action
and:
"Beware, demon!" he intoned hollowly. "I am not without defenses."
"Oh yeah? Name three."-- Robert Asprin, Another Fine Myth
And now, no sooner does someone tell me that they want to "facilitate communications between managers and employees" than I say, "Can you give me a concrete example of how you would do that?" Hayakawa taught me to distinguish the concrete and the abstract; and from that small passage in Asprin, I picked up the dreadful personal habit of calling people's bluffs, often using the specific phrase, "Name three."
But the real subject of today's lesson is how to see skills like this on the 5-second level. And now that we have a specific example in hand, we can proceed to try to zoom in on the level of cognitive events that happen in 5 seconds or less.
Over-abstraction happens because it's easy to be abstract. It's easier to say "red is a color" than to pause your thoughts for long enough to come up with the example of a stop sign. Abstraction is a path of least resistance, a form of mental laziness.
So the first thing that needs to happen on a timescale of 5 seconds is perceptual recognition of highly abstract statements unaccompanied by concrete examples, accompanied by an automatic aversion, an ick reaction - this is the trigger which invokes the skill.
Then, you have actionable stored procedures that associate to the trigger. And "come up with a concrete example" is not a 5-second-level skill, not an actionable procedure, it doesn't transform the problem into a task. An actionable mental procedure that could be learned, stored, and associated with the trigger would be "Search for a memory that instantiates the abstract statement", or "Try to come up with hypothetical examples, and then discard the lousy examples your imagination keeps suggesting, until you finally have a good example that really shows what you were originally trying to say", or "Ask why you were making the abstract statement in the first place, and recall the original mental causes of your making that statement to see if they suggest something more concrete."
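As a loose analogy (my own illustration, not part of the original essay), the trigger-and-stored-procedure structure can be sketched in code: a recognizer fires on an overly abstract statement, and the associated procedures are tried in turn until one yields a concrete example. The predicate, the marker phrases, and the example data are all hypothetical stand-ins for perceptual and memory processes that real minds do very differently.

```python
# Illustrative analogy only: "trigger -> stored procedures" as code.
# Marker phrases and example data are hypothetical stand-ins.

def looks_too_abstract(statement: str) -> bool:
    # Toy trigger: a stand-in for perceptually recognizing an
    # abstract claim with no concrete example attached.
    abstract_markers = ("is a color", "is a quality", "facilitate")
    return any(marker in statement for marker in abstract_markers)

# Stored, actionable procedures associated with the trigger.
def search_memory(statement):
    """Procedure (a): search for a memory instantiating the statement."""
    memories = {"red is a color": "the red stop sign on my street corner"}
    return memories.get(statement)

def invent_hypothetical(statement):
    """Procedure (b): try to invent a specific hypothetical scenario."""
    if statement == "red is a color":
        return "a fire engine parked outside the station"
    return None

STORED_PROCEDURES = [search_memory, invent_hypothetical]

def concretize(statement: str) -> str:
    """If the trigger fires, run stored procedures until one succeeds."""
    if not looks_too_abstract(statement):
        return statement
    for procedure in STORED_PROCEDURES:
        example = procedure(statement)
        if example is not None:
            return f"{statement}; for example, {example}"
    return statement  # no concrete example found; the skill failed

print(concretize("red is a color"))
```

The point of the sketch is only the control flow: the trigger is cheap and always-on, and the procedures are concrete enough to actually execute, which is what "actionable" means here.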
Or to be more specific on the last mental procedure: Why were you trying to describe redness to someone? Did they just run a red traffic light?
(And then what kind of exercise can you run someone through, which will get them to distinguish red traffic lights from green traffic lights? What could teach someone to distinguish red from green?)
When you ask how to teach a rationality skill, don't ask "How can I teach people to be more specific?" Ask, "What sort of exercise will lead people through the part of the skill where they perceptually recognize a statement as overly abstract?" Ask, "What exercise teaches people to think about why they made the abstract statement in the first place?" Ask, "What exercise could cause people to form, store, and associate with a trigger, a procedure for going through hypothetical examples until a good one or at least adequate one is invented?"
Coming up with good ways to teach mental skills requires thinking on the 5-second level, because until you've reached that level of introspective concreteness, that fineness of granularity, you can't recognize the elements you're trying to teach; you can't recognize the patterns of thought you're trying to build inside a mind.
To come up with a 5-second description of a rationality skill, I would suggest zooming in on a concrete case of a real or hypothetical person who (a) fails in a typical fashion and (b) successfully applies the skill. Break down their internal experience into the smallest granules you can manage: perceptual classifications, contexts that evoke emotions, fleeting choices made too quick for verbal consideration. And then generalize what they're doing while staying on the 5-second level.
Start with the concrete example of the person who starts to say "Red is a color" and cuts themselves off and says "Red is what that stop sign and that fire engine have in common." What did they do on the 5-second level?
1. Perceptually recognize a statement they made as overly abstract.
2. Feel the need for an accompanying concrete example.
3. Be sufficiently averse to the lack of such an example to avoid the path of least resistance where they just let themselves be lazy and abstract.
4. Associate to and activate a stored, actionable, procedural skill, e.g.:
  - 4a. Try to remember a memory which matches that abstract thing you just said.
  - 4b. Try to invent a specific hypothetical scenario which matches that abstract thing you just said.
  - 4c. Ask why you said the abstract thing in the first place and see if that suggests anything.

and:

0. Before even 1: They recognize that the notion of "concrete" means things like folding chairs, events like a young woman buying a vanilla ice cream, and the number 17, i.e., specific enough to be visualized; and they know "red is a color" is not specific enough to be satisfying. They perceptually recognize (this is what Hayakawa was trying to teach) the cardinal directions "more abstract" and "less abstract" as they apply within the landscape of the mind.
If you are thinking on this level of granularity, then you're much more likely to come up with a good method for teaching the skill "be specific", because you'll know that whatever exercise you come up with, it ought to cause people's minds to go through events 1-4, and provide examples or feedback to train perception 0.
Next example of thinking on the 5-second scale: I previously asked some people (especially from the New York LW community) the question "What makes rationalists fun to be around?", i.e., why is it that once you try out being in a rationalist community you can't bear the thought of going back? One of the primary qualities cited was "Being non-judgmental."

Two different people came up with that exact phrase, but it struck me as being not precisely the right description - rationalists go around judging and estimating and weighing things all the time. (Noticing small discordances in an important description, and reacting by trying to find an exact description, is another one of those 5-second skills.)

So I pondered, trying to come up with a more specific image of exactly what it was we weren't doing, i.e. Being Specific, and after further visualization it occurred to me that a better description might be something like this: If you are a fellow member of my rationalist community and you come up with a proposal that I disagree with - like "We should all practice lying, so that we feel less pressure to believe things that sound good to endorse out loud" - then I may argue with the proposal on consequentialist grounds. I may judge. But I won't start saying in immense indignation what a terrible person you must be for suggesting it.
Now I could try to verbally define exactly what it is we don't do, but this would fail to approach the 5-second level, and probably also fail to get at the real quality that's important to rationalist communities. That would merely be another attempt to legislate what people are or aren't allowed to say, and that would make things less fun. There'd be a new accusation to worry about if you said the wrong thing - "Hey! Good rationalists don't do that!" followed by a debate that wouldn't be experienced as pleasant for anyone involved.
In this case I think it's actually easier to define the thing-we-avoid on the 5-second level. Person A says something that Person B disagrees with, and now in Person B's mind there's an option to go in the direction of a certain poisonous pleasure, an opportunity to experience an emotional burst of righteous indignation and a feeling of superiority, a chance to castigate the other person. On the 5-second level, Person B rejects this temptation, and instead invokes the procedure of (a) pausing to reflect and then (b) talking about the consequences of A's proposed policy in a tone that might perhaps be worried (for the way of rationality is not to refuse all emotion) but nonetheless is not filled with righteous outrage and indignation which demands that all others share that indignation or be likewise castigated.
(Which in practice, makes a really huge difference in how much rationalists can relax when they are around fellow rationalists. It's the difference between having to carefully tiptoe through a minefield and being free to run and dance, knowing that even if you make a mistake, it won't socially kill you. You're even allowed to say "Oops" and change your mind, if you want to backtrack (but that's a whole 'nother topic of 5-second skills)...)
The point of 5-second-level analysis is that to teach the procedural habit, you don't go into the evolutionary psychology of politics or the game theory of punishing non-punishers (by which the indignant demand that others agree with their indignation), which is unfortunately how I tended to write back when I was writing the original Less Wrong sequences. Rather you try to come up with exercises which, if people go through them, cause them to experience the 5-second events - to feel the temptation to indignation, and to make the choice otherwise, and to associate alternative procedural patterns such as pausing, reflecting, and asking "What is the evidence?" or "What are the consequences?"
What would be an exercise which develops that habit? I don't know, although it's worth noting that a lot of traditional rationalists not associated with LW also have this skill, and that it seems fairly learnable by osmosis from watching other people in the community not be indignant. One method that seems worth testing would be to expose people to assertions that seem like obvious temptations to indignation, and get them to talk about evidence or consequences instead. Say, you propose that eating one-month-old human babies ought to be legal, because one-month-old human babies aren't as intelligent as pigs, and we eat pigs. Or you could start talking about feminism, in which case you can say pretty much anything and it's bound to offend someone. (Did that last sentence offend you? Pause and reflect!) The point being, not to persuade anyone of anything, but to get them to introspectively recognize the moment of that choice between indignation and not-indignation, and walk them through an alternative response, so they store and associate that procedural skill. The exercise might fail if the context of a school-exercise meant that the indignation never got started - if the temptation/choice were never experienced. But we could try that teaching method, at any rate.
(There's this 5-second skill where you respond to mental uncertainty about whether or not something will work, by imagining testing it; and if it looks like you can just go test something, then the thought occurs to you to just go test it. To teach this skill, we might try showing people a list of hypotheses and asking them to quickly say on a scale of 1-10 how easy they look to test, because we're trying to teach people a procedural habit of perceptually considering the testableness of ideas. You wouldn't give people lots of time to think, because then that teaches a procedure of going through complex arguments about testability, which you wouldn't use routinely in real life and would end up associating primarily to a school-context where a defensible verbal argument is expected.)
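The time-pressure constraint in that parenthetical can be made concrete with a small sketch (my own illustration; the hypotheses, the 5-second limit, and the function names are all assumptions). The key design choice, per the text, is that slow answers are discarded rather than accepted, because deliberated ratings would train the wrong procedure.

```python
import time

# Hypothetical drill harness for the "quick testability rating" exercise.
# Hypotheses and the 5-second limit are illustrative assumptions.

HYPOTHESES = [
    "Water boils at a lower temperature at high altitude.",
    "People are happier in years with more sunspots.",
    "My officemate is secretly a government agent.",
]

def run_drill(rate, time_limit=5.0):
    """Present each hypothesis; discard ratings that took too long,
    since slow answers train deliberation rather than perception."""
    results = []
    for hypothesis in HYPOTHESES:
        start = time.monotonic()
        rating = rate(hypothesis)  # should return 1-10 within the limit
        elapsed = time.monotonic() - start
        results.append((hypothesis, rating if elapsed <= time_limit else None))
    return results

# Example: a snap-judgment rater supplied by the trainee.
# (In a real drill, `rate` would read keyboard input under a timer.)
snap_ratings = iter([9, 4, 2])
results = run_drill(lambda hypothesis: next(snap_ratings))
```

A rating recorded as `None` would be feedback in itself: the trainee argued instead of perceiving.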
I should mention, at this point, that learning to see the 5-second level draws heavily on the introspective skill of visualizing mental events in specific detail, and maintaining that introspective image in your mind's eye for long enough to reflect on it and analyze it. This may take practice, so if you find that you can't do it right away, instinctively react by feeling that you need more practice to get to the lovely reward, instead of instinctively giving up.
Has everyone learned from these examples a perceptual recognition of what the "5-second level" looks like? Of course you have! You've even installed a mental habit that when you or somebody else comes up with a supposedly 5-second-level description, you automatically inspect each part of the description to see if it contains any block units like "Be specific" which are actually high-level chunks.
Now, as your exercise for learning the skill of "Resolving cognitive events to the 5-second level", take a rationalist skill you think is important (or pick a random LW post from How To Actually Change Your Mind); come up with a concrete example of that skill being used successfully; decompose that usage to a 5-second-level description of perceptual classifications and emotion-evoking contexts and associative triggers to actionable procedures etcetera; check your description to make sure that each part of it can be visualized as a concrete mental process and that there are no non-actionable abstract chunks; come up with a teaching exercise which seems like it ought to cause those sub-5-second events to occur in people's minds; and then post your analysis and proposed exercise in the comments. Hope to hear from you soon!
As a separate point, I have always argued against the validity of a certain argument against theists: that they are obligated to say what would constitute evidence sufficient to change their minds. The demand is an argument from ignorance. Nonetheless, being able to articulate what sufficiently contradictory evidence would look like is a point in an arguer's favor, even though the inability to do so is not fatal.
In this case, I'd say the question is somewhat ill-formed for two reasons. First, many entirely different things would be sufficient evidence to get me to change my mind, but if other things were also the case, they would no longer be sufficient. Certain statements by the CIA might be sufficient, but not if there were also other statements from the FBI.
Second, there are many sorts of mind-changing possible. The more sane conspiracy theorists simply say the official account is not credible. The others articulate theories that, even granting all of their premises, are still less likely than the official story. A related point is what it means to be wrong in different senses. If I believe in Coca-Cola's version of Santa Claus and also believe that Kobe Bryant is left-handed, in one sense there is no "Kobe Bryant" in the same way that there is no "Santa Claus". In a more useful sense, we say "Kobe Bryant really exists, but is right-handed, and Santa Claus does not exist." This is so even though there is nothing preventing us from saying "Santa Claus is really young, not old; tall and thin, not fat; has no beard and shaves his head; is black, not white; plays shooting guard for the Lakers under the alias 'Kobe Bryant'; and does nothing unusual on Christmas." Whether you say things I learn falsify the official story or merely modify it is a matter of semantics, but certain elements, like the involvement of al-Qaeda, are more central to it than others. These elements are better established by existing evidence and would take correspondingly more evidence to dislodge.
So the answer to "what evidence would be strong enough to change your mind?" varies a lot depending on exactly what is being asked.
I think it is notable and important that the different but similar things you said got different responses. One was downvoted to the point of automatic hiding (the threshold is set to hide comments at -3 or lower by default). The other was downvoted much more. We can speculate as to why, but it's important to acknowledge the different community responses to different behavior (I won't prejudge it by saying "different degrees of going against social beliefs").
On to speculation: one problem with the video as evidence for explosions was a certain kind of jumping to conclusions. The guy said he heard explosions, but this skips a step. I could just as well say I heard people in a box, when I had actually heard sound waves emitted by a speaker attached to a computer. The guy's insistence that explosions were causing the sound is very strange, even granting that he had heard explosions before and that the sounds he heard may have sounded exactly like them. Likewise for his claim that they were coming from beneath him, considering what was going on.
Similarly, your assumption about the reason for your downvotes certainly skips steps. Most noticeable is that you don't distinguish what you are being socially punished for among your several downvoted posts, even though the responses to them were so different.
It's not so simple as that you were "go[ing] against their beliefs". Not everyone uses the voting function identically, but assuming many others use it as I do, I can offer an analysis. I use it to push things to where I think they should be, rather than as an expression that I was glad I read a post in the hope that others will do the same, such that votes reflect what individuals were glad to have read (I believe something like this was the intent of the system's creators). I see -4 and -15 as not inappropriate final marks for your posts, and so didn't weigh in on them through the voting mechanism.
The problem with your first post was that it unfairly pushed the work of argument onto Eliezer. This is the same problem as with the poll sent out by the fundamentalists to philosophers a few months ago (I couldn't find it, but it included questions such as "Do you agree: life begins at conception?" and "Do you agree: humans are unique and unlike the other animals?"). The problem with such a question is that the work, in number of words, needed to adequately disentangle and answer it exceeds the work required to ask it. Your question also didn't start from anywhere; you would have gotten a better response if you had said whether you thought the beliefs were actually right, or wrong but not insane.
The tl;dr is that it was a passive-aggressive question. A small sin, for which it gets a -4: implicitly, the one voicing such a question disagrees with the position and is going against the communal norm. How important that factor was, I can't know.
The video evidence was a larger sin, as it was basically a waste of time to listen to it. First, the guy emphasized that he certainly heard explosions beneath him, as if by disbelieving that one would be calling him a liar. Like I said above, this is the same thing ghost observers do: I don't necessarily disbelieve that you heard what you heard and saw what you saw; I'm just unsure about the original cause of that noise, especially considering how humans hear what they hear based on what they are familiar with hearing and expect to hear (the multiple-drafts model of cognition).
What's more, when the advocate of a position has an opportunity to direct someone to evidence supporting his or her position and must elect to give them one piece of evidence in an attempt to spread the belief, I expect them to go with their best argument, which in turn ought to sound pretty impressive, as even incorrect positions often have one compelling argument in their favor.
If I had come across the video you showed as the first video I saw in the course of randomly watching accounts of 9/11 survivors (if a random sample of survivors were filmed and archived), it would perhaps be somewhat suspicious. As a video cherry-picked by someone trying to justify skepticism, it's catastrophically weak - shockingly so, actually. I expect cherry-picked evidence in favor of any conspiracy to at least induce a physiological response, e.g.: OMG, Bush has reptilian eyes, he's a reptile, he's a lizard person... oh wait, that's stupid; it's an artifact of light being shone on dozens of presidents millions of times, and this video was cherry-picked.