Lumifer comments on Open thread, Nov. 3 - Nov. 9, 2014 - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Well, if posting on LW is no longer fun, shouldn't we try to go more meta and fix the problem?
Of course, this shouldn't be Eliezer's top priority. And generally, it shouldn't be left to Eliezer to fix every single detail.
I think it would be good to have some kind of psychological task force on LessWrong. By which I mean people who actually study and apply the stuff, in the same way we have math experts here.
The next step in the Art could be to make rationality fun. And I don't mean "do funny things that signal your membership in the LW community" but rather to invent systematic ways to make instrumentally rational things feel better, so you alieve they are good.
More generally, to overcome the disconnection between what we believe and how we feel. I think many people here are committing reversed stupidity. We have learned that letting our emotions drive our thoughts is wrong, so the solution was to disconnect emotions from thoughts. That is a partial solution which works, but it has a costly impact on motivation. Eliezer wrote that it is okay to accept some emotions if they are compatible with rational thought. But the full solution would be to let our thoughts drive our emotions: not merely to accept a rational feeling if it happens to exist, but to engineer it, by changing our internal and external environments. (On the other hand, this is just another way insufficiently rational people could hurt themselves.)
Perhaps it would be best to learn from psychology. Psychology has shown that there's very little you can do to make yourself 'more rational.' Knowing about biases does little to prevent them from happening, and you can't force yourself to enjoy something you don't enjoy. Further, it takes a lot of conscious, slow effort to be rational. In the face of real-life problems, true rationality is often pretty much impossible, as it would take more computing power than is available in the universe. It's pretty clear that our irrationality is a mechanism for coping with the information overload of the real world by making approximate guesses.
It's because of things like this that I think maybe LW has gone severely overboard with the instrumental rationality thing. Note that knowing about biases is a noble goal that we should strive towards, but trying to fix them often backfires. The best we can usually hope for is to try to identify biases in our thinking and other people's.
But anyway, a lot of this site's issues could simply be a matter of technical fixes. It was never really a good idea to base a rationality forum on a Reddit template. Instead of the 'everyone gets to vote' system, I prefer a system with a handful of moderators. Moderators would be selected by the community and would not be allowed to moderate discussions they themselves are participating in. This is the system Slashdot follows, and it seems to work extremely well.
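The conflict-of-interest rule proposed above can be sketched in a few lines. This is a hypothetical illustration of the rule as stated in the comment (community-selected moderators who may not moderate threads they participate in), not Slashdot's actual implementation; the function and variable names are invented for the example.

```python
def can_moderate(user: str, moderators: set[str], thread_participants: set[str]) -> bool:
    """A user may moderate a thread only if they are a community-selected
    moderator AND have not participated in that thread."""
    return user in moderators and user not in thread_participants

# Example: Alice is a moderator but commented in this thread, so she is barred;
# Bob is a moderator who stayed out of the thread, so he may moderate it.
moderators = {"alice", "bob"}
thread_participants = {"alice", "carol"}
print(can_moderate("bob", moderators, thread_participants))    # True
print(can_moderate("alice", moderators, thread_participants))  # False
```

The point of the rule is simply to separate the judging role from the participating role on a per-thread basis, rather than removing voting or moderation entirely.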
Citation needed.
Not to mention that what an average person can or can not do isn't particularly illuminating for non-representative subsets like LW.
I am not sure that is possible. Instrumental rationality is just making sure that what you are doing is useful in getting to wherever you want to go. What does "severely overboard" mean in this context?
Read Daniel Kahneman's work. He has spent his entire lifetime studying this and won a Nobel Prize for it too. A good summary is given in http://www.newyorker.com/tech/frontal-cortex/why-smart-people-are-stupid Here's an excerpt:
'as the scientists note, “people who were aware of their own biases were not better able to overcome them.” This finding wouldn’t surprise Kahneman, who admits in “Thinking, Fast and Slow” that his decades of groundbreaking research have failed to significantly improve his own mental performance. “My intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy”—a tendency to underestimate how long it will take to complete a task—“as it was before I made a study of these issues,” he writes. '
In fact it is; there is no substantial difference between highly educated and uneducated people when it comes to trying to control biases.
There is nothing wrong with 'making sure that what you are doing is useful in getting to wherever you want to go'. The problem is the idea of trying to 'fix' your behavior through self-imposed procedures, trial and error, and self-reporting. Experience shows that this often backfires, as I said. It's pretty amazing that "I tried method X, and it seemed to work well, I suggest you try it!" (look at JohnMaxwellIV's comment below for just one example) is taken as constructive information on a site dedicated to rationality.
First, rationality is considerably more than just adjusting for biases.
Second, in your quote Kahneman says (emphasis mine): "My *intuitive* thinking is just as prone...". The point isn't that your System 1 changes much; the point is that your System 2 knows what to look for and compensates as best it can.
Sigh. Citation needed.
And what is the problem, exactly? I am also not sure what the alternative is. Do you want to just assume your own behaviour is immutable? Magically determined, without you being able to do anything about it? Do you think you need someone else to change your behaviour for you? What?
Disagree. See comments in http://lesswrong.com/lw/d1u/the_new_yorker_article_on_cognitive_biases/
I'm not talking about the bias blind spot. I agree that more educated people are better able to discern biases in their own thoughts and others'. In fact that's exactly what I said, not once but twice.
I'm talking about the ability to control one's own biases.
Are you distinguishing between "control one's own biases" and "adjusting and compensating for one's own biases"?
Huh? So what are more intelligent, and more educated, people doing, exactly, if not controlling their biases?