Great post.
If only we could flesh this out into a four year long class and call it high school...
One of the more common irrationalities I see on the internet is the sin of foolish consistency: people post something without much thought, and then, when evidence or arguments arise that undermine their statement, they are overly dismissive because they don't want to appear inconsistent.
Actually make written-down predictions about the future. If you don't make real, written-down predictions, you will never know how bad you are at predicting the world around you.
The thing is, if you actually make a written-down prediction, you're more likely to alter your actions purely for the sake of making it come true.
If you don't foresee/predict bad things happening, then you can't do anything to prevent them. UnFriendly AI anyone?
This reminds me of an architectural designer I used to work for. When we were laying out what needed to be done, I would try to foresee what could go wrong so I could head it off. I finally quit doing that with him because he kept accusing me of "negativity," as though thinking about bad things makes them happen. (Since I was the one actually doing most of the work, I still tried to predict what could go wrong and head it off; I just quit talking to him about it.)
It's the poor craftsman who blames his tools!
(Or as Gibbon says, 'The wind and waves are always on the side of the ablest navigators.')
Your work context may provide you with frequent opportunities to do that.
For instance, if you are a programmer, you can make predictions about how long a given task is going to take, or alternately how many tasks you can take on in a given period.
If you train or teach people, you can predict what they will have understood at the end of a given session, and test those predictions by asking questions at the end of the session.
More generally, predictions of the form "I will achieve objective X by time T" are a useful class, as you normally have quite a lot of the relevant information, which ought to narrow your confidence bounds.
ETA: keeping appointments is another similar class. If you're never late, you're probably underconfident. (See Umeshisms.) You should have a general degree of confidence in your timeliness, e.g. "I will seek to show up on time 80% of the time." You may adjust that depending on criticality in given contexts, e.g. "...except that I hate to disappoint employers, so I'll show up to work on time 95% of the time".
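The bookkeeping this kind of practice requires can be as simple as a list and a tally. A minimal sketch in Python (all statements, confidence levels, and outcomes below are invented for illustration):

```python
# Hypothetical calibration tracker: record each prediction with a stated
# confidence, mark its outcome later, then compare stated confidence
# against the observed frequency of being right in each bucket.

from collections import defaultdict

# Each record: (statement, stated confidence, whether it came true)
predictions = [
    ("I'll show up to work on time this week", 0.95, True),
    ("This task will take under two hours",    0.80, False),
    ("I'll be on time to the dentist",         0.80, True),
    ("The trainees will recall the key point", 0.60, True),
]

def calibration_by_bucket(records):
    """Group predictions by stated confidence and return, for each
    confidence level, the fraction that actually came true."""
    buckets = defaultdict(list)
    for _, confidence, outcome in records:
        buckets[confidence].append(outcome)
    return {
        conf: sum(outcomes) / len(outcomes)
        for conf, outcomes in sorted(buckets.items())
    }

for conf, observed in calibration_by_bucket(predictions).items():
    print(f"stated {conf:.0%} -> observed {observed:.0%}")
```

If your stated 80% bucket comes true far more often than 80% of the time, you are underconfident there (the Umeshism point above); far less often, overconfident.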
How many people actually bother to do this?
Written predictions seem like they would be a lot of work, but maybe this is indeed worth trying.
PredictionBook is a really promising app, but it unfortunately has lots of problems, and it doesn't look like it's being improved much at all.
Among the problems are:
Yes, it's definitely convenient. I guess I am a bit frustrated with PB because I thought it had tremendous potential and was very excited about using it. After using it for a while now and making more than a hundred predictions with it, the problems have worn me down and there seems little chance that it'll be improved, since they've said they'll only improve it if it's used a lot (and it doesn't get very much use because of all the problems).
The issue with judgments of public predictions not being separable per user is that many public predictions include indexicals, and multiple people vote on them (I've done it myself, before I realized what it leads to). There will probably continue to be plenty of such predictions, since PB gives no hint that public predictions should avoid indexicals, or that people should avoid providing estimates on public predictions that include them.
What happens is that multiple people make predictions, and then the judgment swings back and forth between Right and Wrong as different people judge it Right or Wrong for them, many of them probably thinking that they're rendering judgment for themselves and not for everybody else as well. Now, I make everything private to avoid these kinds of problems, but a site like that with most content private is much less useful than it would be if things were more public, and people could get ideas about things to make predictions on from other people and could comment on each others' predictions.
Another problem: every time I see a prediction with an indexical (of which there are many) that I would like to add an estimate for, I have to create a new prediction and copy/paste the text or type it out again. That's too much of a hassle -- especially given how slow everything is. I don't care how it's implemented, but I should just have to add an estimate and click a button. Anything more than that is too much work when the exact wording for the prediction I want to make already exists and I'm looking at it. Perhaps they could add another button to allow making a private estimate for that prediction, and then allow a private judgment for it that is only visible to the user as well.
And no, I didn't vote in the feedback. Requiring your users to sign up for a different account in order to provide feedback is just obnoxious.
Dude, we're trying to help on many fronts. We host OB at no charge; we developed and host LW at no charge; we wrote PB and offer it at no charge. If you tried a little harder, do you think you could come up with an explanation for why we'd use an external feedback service other than that we're obnoxious?
ETA 08:39:51 UTC: Sorry - that was overly snarky. You obviously want to be a passionate user but are being let down by our lack of time to tune and improve the site. Watch for a top level post on PB and its future coming soon.
I didn't say you or anybody else developing PB is obnoxious. I said a certain behavior (requiring signing up for 2 accounts) was obnoxious. And since behaviors aren't intrinsically obnoxious or not, I obviously meant that I judge that requiring users to sign up for a second account to give feedback is obnoxious. Colloquially, this just means that I find it annoying, and it doesn't imply that you're trying to annoy anyone or say anything about you as a person. I find the behavior annoying, which I gave as an explanation for why I didn't bother to provide feedback.
I can of course imagine plenty of reasons why you'd use an external feedback service, just like I can imagine plenty of reasons that the performance would be what it is, none of them involving any kind of malevolent intention or lack of skill on your part. Nevertheless, I and quite a few others find PB frustrating to use, which is a real shame for an app that holds so much promise.
For what it's worth, I applaud your pro bono work for OB and LW, and I hope you keep up the good work. I think PB holds incredible promise, and I hope that you do find a way to improve it.
No apology necessary. I'd probably react similarly if I felt that somebody was being unconstructively critical of an app that I created.
I was just frustrated, as you guessed, because I really care about the idea and the app, and I see so much promise there. I should have just said to the original poster that I didn't provide feedback because I didn't want to sign up for a second account, but my frustration made me get snarky, which I apologize for.
Thanks again to you and everybody else at Tricycle.
It's not really that there's much time involved.
Let's say you are waiting in line at the supermarket to pay. It costs you no additional time to jot down on paper an estimate of the amount you will have to pay.
It's probably rather like quitting smoking: going through life while being fuzzy about your expectations is just easier than making predictions.
However, all that talk about cognitive biases doesn't do much if you just gather knowledge but don't change any of your deep-seated habits.
It would cost me effort and thought cycles; I would pay $100 a year at least to have this prediction/calibration thing done by magic.
Though, one alternative that seems to work is making bets with people. It seems to work because the thought of gaining coolness/status points over the other person overcomes the resistance to putting in the effort of thinking about the prediction. Apparently some major financial firms (Renaissance?) have a culture where you are actively encouraged to do this.
Actually want an accurate map, because you have Something to protect.
Why does protection have to be everyone's Capitalized goal?
Or more succinctly and broadly, learn to:
pay attention
correct bias
anticipate bias
estimate well
With a single specific enumeration of means to accomplish these competencies you risk ignoring other possible curricula. And you encourage the same blind spots for the entire community of aspiring rationalists so educated.
I want to add "be wary of conclusions which make you feel safer or require less action", but that may just be one of the "standard biases". (I have come to the conclusion that I don't have time to read the referenced book just now, but I suppose I should be suspicious of that conclusion, because the alternative requires more work and may challenge the validity of this comment, thus making me feel less safe in making it...)
I like the idea of a list -- maybe it should really be limited to some fixed number -- say 13 to clarify the rationalist stance on superstition :)
Anyway, perhaps the current list is somewhat unbalanced -- for example, before including analytic philosophy, I think familiarity with, e.g., game theory seems much more important.
Also, the first couple of points are simple rules to follow, while many of the later ones point to fields of knowledge rather than giving something short to keep in mind. There's something to be said for both, but it might be clearer to have separate lists: a list of short rules one can remember ("The map is not the territory"), and a list of important fields, such as parts of economics, game theory, information theory, and so on.
Bravo. Were this a religion, I'd be a member. Wait, I already am. Or is that self-contradictory?
An excellent way to improve one's skill as a rationalist is to identify one's strengths and weaknesses, and then expend effort on the things one can most effectively improve (which are often the areas where one is weakest). This seems especially useful if one is very specific about the parts of rationality, describing them in detail.
In order to facilitate improving my own and others' rationality, I am posting this list of 11 core rationalist skills, thanks almost entirely to Anna Salamon.