Also, learn to differentiate between genuine curiosity and what I like to call pseudo-curiosity - basically, being satisfied by conclusions rather than concepts. Don't let the two overlap. This is especially hard when conclusions are so often readily available, frequently as the first item in a Google search. In terms of genuine curiosity, Google has been the bane of my existence - I will start off moderately curious, but instead of moving to that higher stage of curiosity, I will be sated by facts and conclusions without actually learning anything (similar to guessing the teacher's password). After a couple of hours of doing this, I feel very scholarly and proud of my ability to parse so much information, when in reality all I did was collect a bunch of meaningless symbols.
To combat this, I started keeping a "notebook of curiosities". The moment I get curious, I write whatever it is I'm curious about, and then write everything I know about it. At this point, I determine whether or not anything I know is a useful springboard; otherwise, I start from scratch. Then I circle my starting node and start the real work, with the following rules:
"A bat and a ball cost $1.10 in total. The bat costs $1.00 more than the ball. How much does the ball cost?"
I had the following (in rapid succession): 10 cents; whoops, that adds up to 120 cents; aha, 5 cents; adds up to 110; done.
It doesn't really matter what stupid heuristic you try if you verify the result. I can of course do: let a + b = 1.1, a = b + 1, b + 1 + b = 1.1, 2b = 0.1, b = 0.05, but it takes a lot longer to write, and to think - and note the absence of a verification step there.
The "No! Algebra" is sure fire way to do things slower. Verification and double checking is the key imo. Algebra is for unwieldy problems where you can't test guesses quickly, failed to guess, have to use pencil and paper, etc. When you rely on short term memory you really could be best off trying to intuitively get the answer, then checking it, then rewarding yourself when correct (if verification is possible)
Having worked on the Voynich Manuscript (which you namecheck above) for over a decade now, I'd say that uncertainty isn't just a feeling: rather, it's the default (and indeed natural) state of knowledge, whereas certainty is normally a sign that we've somehow failed to grasp and appreciate the limits and nature of our knowledge.
Until you can eradicate the itch that drives you to want to make knowledge final, you can never be properly curious. Real knowledge doesn't do "final" or "the last word on a subject": it's conditional, partial, constrained, and heuristic. I contend that you should train your ape-brain to stay permanently curious: almost all certain knowledge is either fake or tautologous.
Exercise 2.2: Make plans for different worlds... Maybe you live in a world where you'd improve your cognitive function by taking nootropics, or maybe you live in a world where the nootropics would harm you.
On the bright side, this is pretty much the thought process I go through whenever I don't know the right answer to something. On the other hand ("on the dark side"?), I think my automatic instinct is "there's no scientific consensus on this that I've read about in my textbooks... therefore this is a Permanent Blank in my map and I just h...
Once, I explained the Cognitive Reflection Test to Riley Crane by saying it was made of questions that tempt your intuitions to quickly give a wrong answer. For example:
This could use spoiler tags, or ideally some substitute: it's useful for people to have a chance to be administered the CRT unawares (lest they imagine by hindsight bias that they would not have been misled, or others lose the chance to test them).
For feeling that you do not know the answer, Luke suggests: "Think of things you once believed but were wrong about." Why not take it a step further and say
1.3 When thinking about a time when you were wrong, think about how right being wrong feels, up until the moment you realize you are wrong.
In reflecting on times when I have been wrong, what I find most disturbing is not what I was wrong about, but the degree to which being wrong is cognitively similar to being right. In college, I went to an Elizabeth Loftus lecture where she shockingly announc...
Curiosity is one possible motivation that forces you to actually look at evidence. Fear is more reliable and can be used when curiosity is hard to manufacture.
If fear paralyzes, maybe it's best used in bursts at times when you don't immediately need anything done and can spend some time reevaluating basic assumptions. I wonder if there should be a genre of fiction that's analogous to horror except aimed at promoting epistemic paranoia. I've heard the RPG Mage: the Ascension cited in that context. I guess there are also movies like the Matrix series, The Truman Show, and Inception. One could have an epistemic counterpart to Halloween.
I just watched The Truman Show a few days ago. I interpreted it as a story about a schizophrenic who keeps getting crazier, eventually experiencing a full-blown break and dying of exposure. The scenes with the production crew and audience are actually from the perspective of the schizophrenic's imagination as he tries to rationalize why so many apparently weird things keep happening. The scenes with Truman in them are Truman's retrospective exaggerations and distortions of events that were in reality relatively innocuous. All this allows you to see how real some schizophrenics think their delusions are.
What I had in mind was replacing rituals involving the fear of being hurt with rituals involving the fear of being mistaken. So in a more direct analogy, kids would go around with signs saying "you have devoted your whole existence to a lie", and threaten (emptily) to go into details unless they were given candy.
Last Halloween I dressed as a P-zombie. I explained to anybody who would listen that I had the same physical composition as a conscious human being, but was not in fact conscious. I'm not sure that any of them were convinced that I really was in costume.
I consistently fail several times over at this. I always feel I DO know everything worth knowing, and while that's obviously wrong, I can't come up with any salient counterexamples. Probably related to memory problems I have - I don't seem able to come up with examples or counterexamples of anything, ever.
And when I do consider multiple possibilities, they never seem to matter for what actions I should take, which drains any motivation to find out the answer if it takes more than 30 seconds of googling, or if I happen to not be at my computer when the question occurs.
All...
Good. Let's see if we can make progress.
Have a jar of mini chocolate chips by your desk and pop one in your mouth every time you google an interesting question on Scholar or Wikipedia.
Is there any evidence this works? 1) Does the brain treat these discretionary pleasures as reinforcement? 2) If it does, do attribution effects undermine the efficacy? Research on attribution effects shows that extrinsic rewards sometimes undermine intrinsic interest, i.e., curiosity: "Negative effects are found on high-interest tasks when the rewards are tangible, expected (offered beforehand), and loosely tied to level of performance."
I approve strongly! Publicly-posted exercises may yield practice, practice yields habit, and habit yields changed behavior. Developing deeper, more-focused curiosity would be a grand step towards becoming more awesome. But!
(Summary: It is important to practice this skill at appropriate times - when it is useful and feasible to work on answering the given question - and not just at random, or whenever it's convenient to schedule the practice. I plan to attach a reminder to my research to-do list.)
Alright, says I, this exercise seems plausible enough. So...
Another idea from Anna Salamon is just to brainstorm a ton of questions on the topic you want to get curious about for a predetermined period of N minutes. Very limited data suggests this method works significantly better for me.
Am I the only one who searched the phrase "I see you start to answer a question, and then you stop, and I see you get curious." to see who it referred to?
Closing my eyes gives me only the feeling of having defensively headed a long ball in soccer a few hours ago. Sometimes I try to think and nothing seems to happen :)
VoI shouldn't be abbreviated (even with a hyperlink).
Thinking about how I've been mistaken in the past feels pretty bad for me - akin to true embarrassment. But I suppose it's almost the only reason I'm ever cautiously uncertain, and that seems sad.
I really value your suggestion to purposefully cultivate delight-based exploration, instead of merely looking to minimize regret (even fairly assigned regret at coming up short of boundedly-optimal-rational, without confusing outcome for expected outcome in hindsight).
Setting step one as "Feel that you don't already know the answer" fits with Loewenstein's (1994) "gap theory of curiosity," summarized by Cooney (2010):
...[Loewenstein's] theory is that curiosity happens when people feel a gap in their knowledge about something... Laying out a question and inviting others to ponder it will help keep the individual's attention, because it gets them mentally involved and because there's an element of unexpectedness. This is why cliffhangers are often used at the end of television soap operas, to get viewer
Curious about what though? It seems like a very important piece of the above lesson is missing if we have no guidance as to what we should be curious about. It does me no good, perhaps no small amount of harm, to be intensely curious about the details of a fictional world. I ought not be curious about the personal life of my neighbor. And while curiosity about insects may serve some, it's unlikely to do most people any good at all. I think we have no good reason to believe that we're generally curious about the right sorts of things.
And there seems to be a...
...it will make you light and eager, and give purpose to your questioning and direction to your skills.
And this article rekindled that for me. I have a motivation to explore I have not felt in quite some time. Thanks for writing this, Luke!
If you have beliefs about the matter already, push the "reset" button and erase that part of your map. You must feel that you don't already know the answer.
It seems like a bad idea to intentionally blank part of your map. If you already know things, you shouldn't forget what you already know. On the other hand, if you have reason to doubt what you think you know, you should blank the suspect parts of your map because you have reason to doubt them, and not artificially as part of a procedure for generating curiosity.
I think what you may be trying t...
This is all good stuff, but it makes curiosity sound complicated. I thought that the point of using curiosity as a hook into epistemic rationality is that once you feel the emotion of curiosity, your brain often just knows what to do next.
Also curiosity feels good.
Bug report: step 2, exercise 2.1. If the consequences of my current best guess being wrong are much less dire than the consequences of being wrong on recomputing, my social circle thinks that the plan based on this current best guess is very important, and I hate the people who disagree, then I'm terrified of trying to recompute.
People try very hard to ignore the consequences of being wrong. Fear in this case is dangerous, because it causes stagnation and breaks curiosity.
- lessdazed
- Bruce Lee
Recently, when Eliezer wanted to explain why he thought Anna Salamon was among the best rationalists he knew, he picked out one feature of Anna's behavior in particular:

I see you start to answer a question, and then you stop, and I see you get curious.
For me, the ability to reliably get curious is the basic front-kick of epistemic rationality. The best rationalists I know are not necessarily those who know the finer points of cognitive psychology, Bayesian statistics, and Solomonoff Induction. The best rationalists I know are those who can reliably get curious.
Once, I explained the Cognitive Reflection Test to Riley Crane by saying it was made of questions that tempt your intuitions to quickly give a wrong answer. For example:

"A bat and a ball cost $1.10 in total. The bat costs $1.00 more than the ball. How much does the ball cost?"
If you haven't seen this question before and you're like most people, your brain screams "10 cents!" But elementary algebra shows that can't be right. The correct answer is 5 cents. To get the right answer, I explained, you need to interrupt your intuitive judgment and think "No! Algebra."
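For readers who want the "No! Algebra" step spelled out, here is one way the working might look (a minimal sketch; the variable names are mine, not from the post):

```latex
\begin{align}
  a + b &= 1.10 && \text{bat plus ball cost \$1.10 in total} \\
  a &= b + 1.00 && \text{the bat costs \$1.00 more than the ball} \\
  (b + 1.00) + b &= 1.10 && \text{substitute the second equation into the first} \\
  2b &= 0.10 \\
  b &= 0.05 && \text{the ball costs 5 cents, so the bat costs \$1.05}
\end{align}
```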
A lot of rationalist practice is like that. Whether thinking about physics or sociology or relationships, you need to catch your intuitive judgment and think "No! Curiosity."
Most of us know how to do algebra. How does one "do" curiosity?
Below, I propose a process for how to "get curious." I think we are only just beginning to learn how to create curious people, so please don't take this method as Science or Gospel but instead as an attempt to Just Try It.
As with my algorithm for beating procrastination, you'll want to practice each step of the process in advance so that when you want to get curious, you're well-practiced on each step already. With enough practice, these steps may even become habits.
Step 1: Feel that you don't already know the answer.
If you have beliefs about the matter already, push the "reset" button and erase that part of your map. You must feel that you don't already know the answer.
Exercise 1.1: Import the feeling of uncertainty.
Exercise 1.2: Consider all the things you've been confident but wrong about.
Step 2: Want to know the answer.
Now, you must want to fill in this blank part of your map.
You mustn't wish it to remain blank due to apathy or fear. Don't avoid getting the answer because you might learn you should eat less pizza and more half-sticks of butter. Curiosity seeks to annihilate itself.
You also mustn't let your desire that your inquiry have a certain answer block you from discovering how the world actually is. You must want your map to resemble the territory, whatever the territory looks like. This enables you to change things more effectively than if you falsely believed that the world was already the way you want it to be.
Exercise 2.1: Visualize the consequences of being wrong.
Exercise 2.2: Make plans for different worlds.
Exercise 2.3: Recite the Litany of Tarski.
The Litany of Tarski can be adapted to any question. If you're considering whether the sky is blue, the Litany of Tarski is:

If the sky is blue,
I desire to believe "the sky is blue."
If the sky is not blue,
I desire to believe "the sky is not blue."
Exercise 2.4: Recite the Litany of Gendlin.
The Litany of Gendlin reminds us:

What is true is already so.
Owning up to it doesn't make it worse.
Not being open about it doesn't make it go away.
And because it's true, it is what is there to be interacted with.
Anything untrue isn't there to be lived.
People can stand what is true,
for they are already enduring it.
Step 3: Sprint headlong into reality.
If you've made yourself uncertain and then curious, you're now in a position to use argument, empiricism, and scholarship to sprint headlong into reality. This part probably requires some domain-relevant knowledge and an understanding of probability theory and value of information calculations. What tests could answer your question quickly? How can you perform those tests? If the answer can be looked up in a book, which book?
These are important questions, but I think the first two steps of getting curious are more important. If someone can master steps 1 and 2, they'll be so driven by curiosity that they'll eventually figure out how to do step 3 for many scenarios. In contrast, most people who are equipped to do step 3 pretty well still get the wrong answers because they can't reliably execute steps 1 and 2.
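To make the phrase "value of information calculations" in step 3 concrete, here is a minimal toy sketch (the scenario, numbers, and names are my own illustration, not from the post): it compares the expected payoff of acting on your current beliefs with the expected payoff of acting after a test reveals the unknown state; the difference is the most the test is worth to you.

```python
# Toy value-of-(perfect)-information calculation - illustrative sketch only.
# You must choose between two actions whose payoffs depend on an unknown
# binary state of the world. A test would reveal the state before you choose.

p_state_true = 0.7  # current credence that the world is in state "true"

# payoffs[action][state]: utility of each action in each possible state
payoffs = {
    "act_A": {"true": 10.0, "false": -5.0},
    "act_B": {"true": 2.0, "false": 3.0},
}

def expected_payoff(action: str) -> float:
    """Expected utility of an action under current beliefs."""
    return (p_state_true * payoffs[action]["true"]
            + (1 - p_state_true) * payoffs[action]["false"])

# Without the test: pick the action with the best expected payoff now.
value_without_info = max(expected_payoff(a) for a in payoffs)

# With the test: learn the state first, pick the best action in each state,
# then average over how likely each state is.
value_with_info = (
    p_state_true * max(payoffs[a]["true"] for a in payoffs)
    + (1 - p_state_true) * max(payoffs[a]["false"] for a in payoffs)
)

value_of_information = value_with_info - value_without_info
print(f"Without test: {value_without_info:.2f}")   # 5.50
print(f"With test:    {value_with_info:.2f}")      # 7.90
print(f"Value of information: {value_of_information:.2f}")  # 2.40
```

If the test costs less than that difference (here, 2.40 units of utility), it is worth running; otherwise, act on your current best guess.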
Conclusion: Curiosity in Action
A burning itch to know is higher than a solemn vow to pursue truth. If you think it is your duty to doubt your own beliefs and criticize your own arguments, then you may do this for a while and conclude that you have done your duty and you're a Good Rationalist. Then you can feel satisfied and virtuous and move along without being genuinely curious.
In contrast,
My recommendation? Practice the front-kick of epistemic rationality every day. For months. Train your ape-brain to get curious.
Rationality is not magic. For many people, it can be learned and trained.