MixedNuts's comment reminded me of a good resource for such techniques, and, indeed, for generally improving one's effectiveness at reading: How To Read A Book
It ain't what you don't know that gets you into trouble. It's what you know for sure that just ain't so.
-- Mark Twain
Clearly Dennett has his sources all mixed up.
Voted up mainly for the Greg Egan recommendations.
But the problem is worse than that because "Sometimes, crows caw" actually does allow you to make predictions in the way "electricity!" does not.
The problem is even worse than that, because "Sometimes, crows caw" predicts both the hearing of a caw and the non-hearing of a caw. So it does not explain either (at least, based on the default model of scientific explanation).
If we go with "Crows always caw and only crows caw" (along with your extra premises regarding lungs, sound and ears etc), then we might end up wit...
Huh, I thought there was a fair bit of evidence around showing that people perform basically just as badly on tests which exploit cognitive biases after being told about them as they do in a state of ignorance.
I found Drive Yourself Sane useful for similar reasons.
I've been meaning to take a stab at Korzybski's Science and Sanity (available on the interwebs, I believe) for a while, but I've heard it's fairly impenetrable.
It's a wonderful thing to be clever, and you should never think otherwise, and you should never stop being that way. But what you learn, as you get older, is that there are a few million other people in the world all trying to be clever at the same time, and whatever you do with your life will certainly be lost - swallowed up in the ocean - unless you are doing it with like-minded people who will remember your contributions and carry them forward. That is why the world is divided into tribes.
-- Neal Stephenson, The Diamond Age
I neglected to record from which character the quote came.
Rationality is highly correlated with intelligence

According to the research of K. E. Stanovich, this is not the case:
Intelligence tests measure important things, but they do not assess the extent of rational thought. This might not be such a grave omission if intelligence were a strong predictor of rational thinking. But my research group found just the opposite: it is a mild predictor at best, and some rational thinking skills are totally dissociated from intelligence.
The classic example of riding a bicycle comes to mind. No amount of propositional knowledge will allow you to use a bike successfully on the first go. Theories about the gyroscopic effects of the wheels and so forth all come to nothing until you hop on and try (and fail, repeatedly) to ride the damn thing.
Conversely, most people never consciously grasp the propositional knowledge that in order to steer the bike left, you must turn the handlebars right (at least initially, and at high speeds). But they do it unconsciously nonetheless.
But once procedural knowledge is had, it also incorporates things like body memory and pure automatic habit, which, when observed in oneself, are just as likely to be rationalized after the fact as they are to be antecedently planned for sound reasons. It's also easy to forget the initial propositions about a mastered procedure.
I've also noticed this kind of thing in my martial arts training.
For instance, high-level black belts will oftentimes be incredibly successful at a particular technique but unable to explain the procedure they use (or at least,...
This tendency can be used for good, though. As long as you're aware of the weakness, why not take advantage of it? Intentional self-priming, anchoring, rituals of all kinds can be repurposed.
Most of these bad Philosophers were encountered during the few classes I took to get a Philosophy minor.
Initially I thought you were talking about professional Philosophers, not students. This clears that up, but it would be better to refer to them as Philosophy students. Most people wouldn't call Science undergrads "Scientists".
My experience with Philosophy has been the opposite. Almost all the original writing we've read has been focused on how and why the original authors were wrong, and how modern theories address their errors. Admittedly,...
I would guess that it's because comments are shorter and tend to express a single idea. Posts tend to have a series of ideas, which means a voter is less likely to think all of them are good/worthy of an upvote.
Thirded. I completed half of my degree in CS before switching to Philosophy. I'm finding it significantly more stimulating. I don't think I learned anything in my CS classes that I couldn't easily have taught myself (and had more fun doing so).
According to this post, doing so would be "against blog guidelines". The suggested approach is to do top-level book review posts. I haven't seen any of these yet, though.
That sorted it, thanks.
Having recently received a couple of Amazon gift certificates, I'm looking for recommendations of 'rationalist' books to buy. (It's a little difficult to separate the wheat from the chaff.)
I'm looking mainly for non-fiction that would be helpful on the road to rationality. Anything from general introductory type texts to more technical or math oriented stuff. I found this OB thread which has some recommendations, but I thought that:
Nothing terrible will happen to Wednesday if she deconverts
The terrible thing has already happened at this stage. Telling your children that lies are true (i.e., that Mormonism is true), when they have no better way of discerning the truth than simply believing what you say, is abusive and anti-moralistic. It is fundamentally destructive of a person's ability to cope with reality.
I have never heard a story of deconversion that was painless. Everyone I know who has deconverted from a religious upbringing has undergone large amounts of internal (and often...
What do you do with the answer, though? I have a fair idea of why most of my procrastination occurs (if I leave something until the last minute and make a hash of it, I have a convenient excuse to protect my ego), but that has never seemed to help me actually overcome it.
I've always enjoyed Lewis Carroll's talk of maps:
..."That's another thing we've learned from your Nation," said Mein Herr, "map-making. But we've carried it much further than you. What do you consider the largest map that would be really useful?"
"About six inches to the mile."
"Only six inches!" exclaimed Mein Herr. "We very soon got to six yards to the mile. Then we tried a hundred yards to the mile. And then came the grandest idea of all! We actually made a map of the country, on the scale of a mile to the mile!"
I'm not confident I could do a good job of it. He proposes that most problems in relationships come from our mythologies about ourselves and others. In order to have good relationships, we have to be able to be honest about what's actually going on underneath those mythologies. Obviously this involves work on ourselves, and we should help our partner to do the same (not by trying to change them, but by assisting them in discovering what is actually going on for them). He calls his approach to this kind of communication the "Real-Time Relationship."
I've found the work of Stefan Molyneux to be very insightful with regards to this (his other work has also been pretty influential for me).
You can find his books for free here. I haven't actually read his book on this specific topic ("Real-Time Relationships: The Logic of Love") since I was following his podcasting and forums pretty closely while he was working up to writing it.
If you don't know about relative motion and inertia, then it does seem like the sun moves around the earth (even when you know, it still looks that way). Prior to the "Copernican" revolution, it was generally thought that our sense experience of everyday life was sufficient to expose the truth to us. Those two things combined make a major roadblock in establishing that the earth rotates.
Now we can fully appreciate that it doesn't even make sense to make an absolute statement either way. If the earth is taken to be stationary, then the sun does move around it (interestingly, this was Tycho Brahe's solution to the problem of shifting to a heliocentric view).
I find it hard to believe that you haven't thought about the following, but you haven't mentioned it so I will. Conventional wisdom says:
1) Being at a healthy weight/having a 'healthy lifestyle' will (accidents and terminal genetic disorders aside) result in you living a longer life. This means more time to work on FAI stuff.
2) Exercise and good diet tend to increase feelings of well being and energy levels. This means better/more effective work on FAI stuff.
Discounting physical health and concentrating on intellectual life seems to me to be a status symbo...
Blood chokes still take several minutes to effect brain damage/death. I find the idea of accidentally throttling someone to death fairly suspicious. Besides, if it was truly an accident then where does Browne's guilt come from? I don't think the story suggested it was an accident.
I've recently started reading a book on the changes which Zen meditation seems to cause on neurology and consciousness, authored by a neurologist. The premise seems to fit with what you're saying.
I've heard that some meditative states (as measured by brain wave patterns) can be induced through the use of devices employing flashing lights and audio interference at certain frequencies ("binaural beats"). I've never really spent the time to investigate it seriously and there seems to be a fair amount of new-agey crap surrounding the idea, but it ma...
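For what it's worth, the audio side of this is simple to sketch: a binaural beat is just a slightly different pure tone in each ear, with the difference between the two frequencies being the perceived beat. Here's a minimal Python sketch, assuming an illustrative 200 Hz carrier and a 10 Hz beat (the specific values are my assumptions, not a recommendation):

```python
import math

SAMPLE_RATE = 44100  # samples per second (CD quality)

def sine_wave(freq_hz, duration_s, sample_rate=SAMPLE_RATE):
    """Generate a mono sine tone as a list of float samples in [-1, 1]."""
    n = int(duration_s * sample_rate)
    return [math.sin(2 * math.pi * freq_hz * t / sample_rate)
            for t in range(n)]

def binaural_beat(carrier_hz, beat_hz, duration_s):
    """Return (left, right) channels whose frequencies differ by beat_hz.

    Played through headphones, the brain perceives the difference
    (beat_hz) as a slow beat, even though neither channel contains it.
    """
    left = sine_wave(carrier_hz, duration_s)
    right = sine_wave(carrier_hz + beat_hz, duration_s)
    return left, right

# Illustrative values: 200 Hz carrier, 10 Hz beat (alpha-wave range).
left, right = binaural_beat(200.0, 10.0, 1.0)
```

To actually hear it you'd need to write the two channels out as a stereo file (e.g. with the standard-library `wave` module) and listen on headphones; whether it induces anything meditative is exactly the empirical question I haven't investigated.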
I was indeed thinking of the Mentats and Bene Gesserit. As you both point out, there was a significant mystical aspect to it. I suppose I was thinking more of the approach taken to mental training (within the world's internally consistent, but mystical, framework) rather than any specific techniques or events.
Mentats on the other hand have "minds developed to staggering heights of cognitive and analytical ability" (thanks Wikipedia) which would seem to fit the bill.
On the other hand, I suppose that neither of these instances are quite what Eliezer was after, as "you can't go out and do it at home".
The Dune series and Neal Stephenson's latest novel Anathem both come to mind. The Dune series includes a number of plot devices involving mental discipline (although it's all semi-mystical). The world of Anathem, on the other hand, is split into two factions, one of which is specifically rationalist. It gets pretty philosophical and weird toward the end, but it mostly involves rationalist characters using math/science/etc to overcome the hurdles in their way. The world it describes sounds pretty similar to what I've read of Eliezer's Bayesian Conspiracy.