I have a terrifying confession to make: I believe in God.
This post has three prongs:
First: This is a tad meta for a full post, but do I have a place in this community? The abstract, non-religious aspect of this question can be phrased, "If someone holds a belief that is irrational, should they be fully ousted from the community?" I can see a handful of answers to this question and a few of them are discussed below.
Second: I have nothing to say about the rationality of religious beliefs. What I do want to say is that the rationality of particular irrationals is not a question that is fully settled once their irrationality is exposed. They may be underneath the sanity waterline, but there are multiple levels of rationality hell, and some are deeper than others. This part discusses one way to view irrationals in a manner that encourages growth.
Third: Is it possible to make the irrational rational? Is it possible to take those close to the sanity waterline and raise them above? Or, more personally, is there hope for me? I assume there is. What is my responsibility as an aspiring rationalist? Specifically, when the community complains about a belief, how should I respond?
My Place in This Community
So, yeah. I believe in God. I figure my particular beliefs are a little irrelevant at this point. This isn't to say that my beliefs aren't open for discussion, but here and now I think there are better things to discuss: namely, whether talking to people like me is within the purpose of LessWrong. The relevant questions have to do with my status and position at LessWrong. The short list:
- Should I have kept this to myself? What benefit does an irrational person gain from confessing their irrationality? (Is this even possible? Is this post an attempted ploy?) I somewhat expect this post and the ensuing discussion to completely wreck my credibility as a commentator and participant.
- Presumably, there is an enforced level of entry to LessWrong. Does this level include filtering out certain beliefs and belief systems, or is the system merit-based via karma and community voting? My karma is well above the level needed to post, and my comments generally do better than worse. A merit-based system would, in effect, prevent me from posting anything about religion or other irrational things, but is there a deeper problem? (More discussion below.) Should LessWrong /kick people who fail at rationality? Who makes the decision? Who draws the sanity waterline?
- Being religious, I assume I am far below the sanity waterline that the community desires. How did I manage to scrape up over 500 karma? What have I demonstrated that would be good for other people to demonstrate? Have I acted appropriately as a religious person curious about rationality? Is there a problem with the system that lets someone like me get so far?
- Where do I go from here? In the future, how should I act? Do I need to change my behavior as a result of this post? I am not calling out for any responses to my beliefs in particular, nor am I calling to other religious people at LessWrong to identify themselves. I am asking the community what they want me to do. Leave? Keep posting? Comment but don't post? Convert? Read everything posted and come back later?
The Wannabe Sanity Waterline
This post has little to do with actual beliefs. I get the feeling that most discussions about the beliefs themselves are not going to be terribly useful. I originally titled this post "The Religious Rational," but figured the opening line was inflammatory enough, and as I began editing I realized that the religious aspect is merely an example of a larger group of irrationals. I could have admitted to chasing UFOs or buying lottery tickets. What I wanted to talk about is the same.
That being said, I fully accept all criticisms offered about whatever you feel is appropriate, even if the criticism is just ignoring me or an admin deleting the post and banning me. I am not trying to dodge the subject of my religious beliefs; I offered myself as an example for convenience and to make the conversation more interesting. I have something relevant and useful to discuss regarding rationalistic communities and the act of spawning rationalists from within fields other than rationalism. Whether it directly applies to LessWrong is for you to decide.
How do you approach someone below the sanity waterline? Do you ignore them and look for people above the line? Do you teach them until they drop their irrational deadweight? How do you know which ones are worth pursuing and which are a complete waste of time? Is there a better answer than generalizing at the waterline and turning away everyone who gets wet? The easiest response to these people is to put the burden of rationality on their shoulders: let them teach themselves. I think there is a better way. Some people are closer to the waterline than others, and deciding to group everyone below the line together makes the job of teaching rationalism harder.
I, for example, can look at my fellow theists and immediately draw up a shortlist of people I consider relatively rationalistic. Compared to the given sanity waterline, all of us are deep underwater due to certain beliefs. But compared to the people on the bottom of the ocean, we're doing great. This leads into the question: "Are there different levels of irrationality?" And also, "Do you approach people differently depending on how far below the waterline they are?"
More concretely, is it useful to make a distinction between two types of theists? Is it possible to create a sanity waterline for the religious? A theist may be way off on a particular subject while their basic worldview is otherwise consistent and intact. Is there a religious sanity waterline? Are there rational religious people? Is a Wannabe Rational a good place to start?
The reason I ask these questions is not to excuse any particular belief while feeling good about everything else in my belief system. If there is a theist struggling to verify all beliefs but those that involve God, then they are no true rationalist. But if said theist really, really wanted to become a rationalist, it makes sense for them to drop the sacred, most treasured beliefs last. Can rationalism work on a smaller scale?
Quoting from Outside the Laboratory (emphasis not mine):
Now what are we to think of a scientist who seems competent inside the laboratory, but who, outside the laboratory, believes in a spirit world? We ask why, and the scientist says something along the lines of: "Well, no one really knows, and I admit that I don't have any evidence - it's a religious belief, it can't be disproven one way or another by observation." I cannot but conclude that this person literally doesn't know why you have to look at things.
One difference between myself and this spirit-believing scientist is that my beliefs are from a younger time, and I have things I would rather do than gallop through that area of the territory checking my accuracy. Namely, I am still trying to discover what the correct map-making tools are.
Also, admittedly, I am unjustifiably attached to that area of my map. It's going to take a while to figure out why I am so attached and what I can do about it. I am not fully convinced that rationalism is the silver bullet that will solve Life, the Universe, and Everything, so I am not letting this new thing near something I hold precious. This is a selfish act and will get in the way of my learning, but that sacrifice is something I am willing to make. Hence I am below the LessWrong waterline. Hence I am a Wannabe Rational.
Instead, what I have done is take my basic worldview and chase down the dogma. Given the set of beliefs I would rather not think about right now, where do they lead? While this is pure anathema to the true rationalist, I am not a true rationalist. I have little idea what I am doing. I am young in your ways and have much to learn and unlearn. I am not starting at the top of my system; I am starting at the bottom. I consider myself a quasi-rational theist not because I am rational compared to the community of LessWrong, but because I am rational compared to other theists.
To return to the underlying question: Is this distinction valid? If it is valid, is it useful or self-defeating? As a community, does a distinction between levels of irrationality help or hinder? I think it helps. Obviously, I would like to consider myself more rational than not. I would also like to think that I can slowly adapt and change into something even more rational. Asking you, the community, is a good way to find out if I am merely deluding myself.
There may be a wall that I hit and cannot cross. There may be an upper bound on my rationalism. Right now, there is a cap due to my theism. Unless that cap is removed, there will likely be a limit to how well I integrate with LessWrong. Until then, rationalism has open season on other areas of my map. It has produced excellent results and, as it gains my trust, its tools gain more and more access to my map. As such, I consider myself below the LessWrong sanity waterline and above the religious sanity waterline. I am a Wannabe Rational.
Why This Helps
The advantage of distinguishing between different sanity waterlines is that it allows you to compare individuals within a group when scanning for potential rationalists. A particular group may all drop below the general waterline but, given their irrational map, some of them may be remarkably accurate for being irrational. After accounting for dumb luck, does anyone show a talent for reading territory outside of their too-obviously-irrational-for-excuses belief?
Note that this is completely different from questioning where the waterline is actually drawn. This is about people clearly below the line. But an irrational map can have rational areas, and the more rational areas in the map, the more evidence there is that some of the mapmaker's tools and tactics are working well. Such a mapmaker is above the sanity waterline for that particular group of irrational mapmakers. In other words, this mapmaker is worth conversing with as long as the conversation doesn't drift into the irrational areas of the map.
This allows you to give people below the waterline an attractive target to hit. Walking up to a theist and telling them they are below the waterline is depressing. They do need to hear it, which is why the waterline exists in the first place: their level of sanity is too low for them to achieve a particular status. But after the chastising, you can tell them that other areas of their map are good enough for them to become more rational within those areas. They don't need to throw everything away to become a Wannabe Rational. They will still be considered irrational, but at least their map is more accurate than it was. It is at this point that someone begins their journey to rationalism.
If we have any good reason to help others become more rational, it seems as though this would count toward that goal.
Conversion
This last bit is short. Taking myself as an example, what should I be doing to make my map more accurate? My process right now is something like this:
- Look at the map. What are my beliefs? What areas are marked in the ink of science, evidence, rationalism, and logic? What areas aren't, and what ink is being used there?
- Look at the territory. Beliefs are great, but which ones are working? I quickly notice that certain inks work better. Why am I not using those inks elsewhere? Some inks work better for certain areas, obviously, but some don't seem to be useful at all.
- Find the right ink. Contrasting and comparing the new mapmaking methods with the old ones should produce a clear winner. Keep adding tools to the toolbox once you find a use for them; take tools out of the toolbox when they are replaced by better, more accurate ones. Inks such as "My elders said so" and "Well, it sounds right" are significantly less useful. Sometimes we have the right ink but use it incorrectly. Sometimes we find a new way to use an old ink.
- Revisit old territory. When you throw out an old ink, examine the areas of the map where that ink was used and revisit the territory with your new tools handy. Some territory is too hard to access now (beliefs about your childhood), and some areas on your map don't have corresponding territories (beliefs about the gender of God).
These things, in my opinion, are learning the ways of rationality. I have a few areas of my map marked "Do this part later." I have a few inks labeled "Favorite colors." These are what keep me below the sanity waterline. As time moves forward I pick up new favorite colors, and eventually I will come to the areas saved for later. Maybe then I will rise above the waterline. Maybe then I will be a true rationalist.
Not so much on the individual beliefs, as on what your thought processes are and in what ways you might want to improve them.
We do not possess isolated beliefs, but networks of beliefs. And a belief isn't, by itself, irrational; what is irrational is the process whereby beliefs are arrived at, or maintained, in the face of evidence.
I am an atheist, but I'm far from certain that none of my current beliefs, or the ways I maintain them, would be deemed "irrational" if they came up for discussion and were judged as harshly as theism seems to be.
My intent in participating here is to improve my own thinking processes. Some of the ways this happens are a) coming across posts which describe common flaws in thinking, whereupon I can examine myself for evidence of these flaws; b) coming across posts which describe tools or techniques I can try out on my own; c) perhaps most interesting, seeing other people apply their own thinking processes to interesting issues and learning from their successes (and sometimes their failures).
The karma system strikes me as an inadequate filtering solution, but better than nothing. I'm now routinely browsing LW with the "anti-kibitzing" script in an effort to improve the quality of my own feedback in the form of up- and downvotes. On a first reading of a comment from you, I would be looking for insights valuable in one of the three ways above. Perhaps if your comment struck me as inexplicably obscure, I might check out your user name or karma.
By becoming a more active commenter and poster, I hope to learn as others give me feedback on whether my contributions are valuable in one of these ways. The karma system has had significant and subtle effects on the ways I choose to engage others here - for good or ill, on balance, I'm still not sure.
Is it possible to glimpse or understand someone's thought processes without delving into their particular beliefs? I assume yes. Since religion is something of a touchy subject, I offer everything else I say as evidence of my thought processes. Is that enough?