My deconversion from Christianity had a large positive impact on my life. I suspect it had a small positive impact on the world, too. (For example, I no longer condemn gays or waste time and money on a relationship with an imaginary friend.) And my deconversion did not happen because I came to understand the Bayesian concept of evidence or Kolmogorov complexity or Solomonoff induction. I deconverted because I encountered some very basic arguments for non-belief, for example those in Dan Barker's Losing Faith in Faith.
Less Wrong has at least two goals. One goal is to raise the sanity waterline. If most people understood just the basics (Occam's razor, what constitutes evidence and why, general trends of science, reductionism, and cognitive biases), the world would be greatly improved. Yudkowsky's upcoming books are aimed at this first goal of raising the sanity waterline. So are most of the sequences. So are learning-friendly posts like References & Resources for LessWrong.
A second goal is to attract some of the best human brains on the planet and make progress on issues related to the Friendly AI problem, the problem with the greatest leverage in the universe. I have suggested that Less Wrong would make faster progress toward this goal if it worked more directly with the community of scholars already tackling the exact same problems. I don't personally work toward this goal because I'm not mathematically sophisticated enough to do so, but I'm glad others are!
Still, I think the first goal could be pursued more explicitly. There are many people like myself and jwhendy who can be massively impacted for the better not by coming to a realization about algorithmic learning theory, but by coming to understand the basics of rationality: probability, the proper role of belief, and reductionism.
Reasons for Less Wrong to devote more energy to the basics
- Such efforts to spread the basics will have a short-term impact on more people than will efforts toward Friendly AI, and these impacted people will in turn impact others, hopefully for the better.
- Some LWers may feel they have little to contribute because they aren't masters of Solomonoff induction or algorithmic learning theory. But they will be able to contribute to raising the sanity waterline by spreading the basics of rationality.
- Providing more basic resources will attract a wider base of readers to Less Wrong, leading to (1) more new rationalists and (2) more donations to SIAI for solving the Friendly AI problem.
- Even for experienced rationalists, it can be easy to forget the basics at times. Humans are not naturally rational, and revert to pre-rationality rather quickly without ongoing training and practice.
How to do it
Let me put some meat on this. What does more focus on the basics look like? Here are some ideas:
- The sequences are great, but some people are too busy or lazy to read even those. Some of the sequences could be summarized into single posts crafted so as to have no prerequisites. These posts could be linked widely, and entered in relevant blog carnivals.
- There is another huge community who will watch a 10-minute video but will not read a short post. So the YouTube lectures are a great idea. But they could be improved. As of today, three of the videos show a presenter against a whiteboard. To make this work well requires lots of resources: (1) a good camera, (2) a shotgun or lavalier microphone, (3) a teleprompter, and (4) an experienced and enthusiastic presenter. That's hard to do! But videos in the familiar PowerPoint style or the Khan Academy style are easier to do well. All they require is some free presentation software, free screen-capture software, and a $70 high-quality USB mic like the Blue Snowball. This approach would also allow more people to participate in making videos on the basics of rationality.
- Sometimes, a basic concept of rationality will only "click" with somebody if presented in a certain way. Some will need a story that illustrates the concept. There are some of these on Less Wrong already, or in something like Harry Potter and the Methods of Rationality, but there could be more. For others, perhaps a Cartoon Guide to Bayes' Theorem or a Cartoon Guide to Reductionism would make it "click." Those who are more ambitious might attempt to create an animation explaining some core rationalist concept, à la this visualization of special relativity.
- Write "Introduction to X" or "How to Use X" posts.
- Keep developing the wiki, obviously.
- Develop a rationality workbook.
Also, it helps to have someone with a good voice.
I think it may be useful to look at infrastructure changes that could promote these ends. Would it be worth having a video section (I'm not sure whether the karma value per upvote should be less or more than 10)? Separate basic and advanced sections? It would be nice for the site to be useful both to beginning rationalists and to people who are interested in Solomonoff induction, without the two groups tripping over each other.
The other benefit of having an explicit 'basic' section is that restatements of other posts (ideally linked in the new post) could be encouraged. One basic fact of education is that different people approach similar concepts in different ways, so two posts with nearly identical content but different presentations can be quite valuable and help things click for a larger group of people. It would also be cool (though I don't know how easy it would be to build) to have a concept index, where I could type in "belief in belief" and get a list of all the basic posts explaining belief in belief, sorted by upvotes. (A less explicit way of doing this is the tag system we already have, but I doubt it would do the job as well as a dedicated index. Then again, if the tags were specific to the basic section, that might work out well; a rough sketch of the idea follows.)
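For concreteness, here is a minimal sketch of such a concept index in Python. The Post and ConceptIndex types are hypothetical, invented purely for illustration and not drawn from Less Wrong's actual codebase; the point is just the lookup behavior described above: posts tagged in the basic section feed the index, and a query returns matching posts sorted by upvotes.

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    """A hypothetical basic-section post with its concept tags."""
    title: str
    upvotes: int
    tags: set[str] = field(default_factory=set)

class ConceptIndex:
    """Maps concept names to the basic posts that explain them."""

    def __init__(self) -> None:
        self._index: dict[str, list[Post]] = {}

    def add(self, post: Post) -> None:
        # File the post under every concept tag it carries.
        for tag in post.tags:
            self._index.setdefault(tag, []).append(post)

    def lookup(self, concept: str) -> list[Post]:
        # Return all posts explaining the concept, highest-voted first.
        return sorted(self._index.get(concept, []),
                      key=lambda p: p.upvotes, reverse=True)

# Example: two posts with nearly identical content but different presentations.
index = ConceptIndex()
index.add(Post("Belief in Belief", 120, {"belief in belief"}))
index.add(Post("Belief in Belief, Illustrated", 45, {"belief in belief"}))
for post in index.lookup("belief in belief"):
    print(post.upvotes, post.title)
```

The same behavior could plausibly be layered on top of the existing tag system; the only real requirements are that lookups are keyed by concept name, restricted to the basic section, and ordered by votes.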
Good thoughts. Would you do us a favor: think about this in some more detail and write a discussion post?