Comment author: Yvain 31 March 2009 08:07:42PM 6 points [-]

Why is that bad?

It's not, if you know you're doing it.

Are you sure that this isn't all about signaling being a truth-seeker?

Pretty sure. If I wanted to signal, I'd be a lot more high-falutin' about it. Actually, my comments do sound a bit high-falutin' (I was looking for a better word than "truth seeker", but couldn't find one), but that wasn't exactly what I wanted to express. The untangling-wires metaphor works a little better. Nominull's "I only seek to be right because I hate being wrong" works too. It's less of a "I vow to follow the pure light of Truth though it lead me to the very pits of Hell" and more of an "Aaargh, my brain feels so muddled right now, how do I clear this up?"

Also, this would be a terrible community to signal truth-seeking in, considering how entrenched the "rationality as win" metaphor is. As I mentioned in the hair example, I think a lot more people here are signaling a burning interest in real-world application than really have one.

So, if you're saying we should seek truth just because it's the truth, and not because it brings practical benefit or pleasure or sends good signals, then what is the use of seeking truth?

Um...this line of argument applies to everything, doesn't it? What is the use of seeking money, if it doesn't bring pleasure or send good signals? What is the use of seeking love, if it doesn't bring pleasure or send good signals? What is the use of seeking 'practical benefits', if they don't bring pleasure or send good signals?

Darned if I know. That's the way my utility function works. And it certainly is mediated by pleasure and good signals, but I prefer not to say it's about pleasure and good signals because I'd rather not be turned into orgasmium just yet.

Comment author: Demosthenes 31 March 2009 10:35:00PM 2 points [-]

Yvain:

Do you really believe that you engage in Truth-Seeking for utilitarian reasons? I get the impression that you don't.

Would you be willing to enter a computer simulation where you got to investigate higher math puzzles (or metaphysics) with no applications? Spend your days on a fantastic and never-ending Truth-Seeking project (we'll throw great sex, food, and housing into the holodeck for you as well)?

I liked this better at the beginning when you were prodding people who say that they see rationalism as a means to an end! You seem to be going back to consequentialism!

I don't believe that rationalists WIN, because I don't believe that winning WINS.

Comment author: AlexU 31 March 2009 03:10:32PM 6 points [-]

Yes. I've been a semi-regular reader of OCB for about a year. I think it's an interesting blog. But have I learned anything useful from it? Has it made any practical difference in the choices I make, either day-to-day or long-term? The answer is no. Admittedly, this may be my own fault. But I recall a post, not too long ago, soliciting people's feedback on "the most important thing you learned from OCB in the past year," or something of that sort. And while there were lots of people excitedly posting about how much OCB has taught them, the examples they gave were along the lines of "I learned the power of the fundamental attribution error!" or "I learned the importance of continually adjusting my priors!" with curiously few examples of real differences OCB made in anyone's practical choices. This raises the question: if tweaking our rationality has no appreciable effect on anything, then how can we say we're really tweaking our rationality at all? Perhaps we're just swapping new explanations for fundamentally irrational processes that are far too buried and obscure to be accessible to us.

That said, I think things like the recent posts on akrasia are strong moves in the right direction. Intellectually interesting, but with easy-to-grasp real-world implications.

Comment author: Demosthenes 31 March 2009 04:24:21PM 1 point [-]

This debate has already played out in attacking and defending Pragmatism.

A lot of the rubrics by which to judge whether or not rationalism wins or whether or not rationalism is an end in itself involve assigning meaning and value on a very abstract level. Eliezer's posts outline a reductionist, materialist standpoint with some strong beliefs about following the links of causality. Rationalism follows, but rationalism isn't going to prove itself true.

Deciding that rationalism is the best answer for your axiomatic belief system requires taking a metaphysical stand; I think that if you are looking for a definite metaphysical reason that you should practice rationalism, then you are interested in something that the practice of rationalism is not going to help much.

Comment author: Demosthenes 30 March 2009 12:48:54AM 0 points [-]

The Nacirema are actually just gaseous meat sacks:

http://www.youtube.com/watch?v=gaFZTAOb7IE

Comment author: ciphergoth 29 March 2009 08:25:04AM 4 points [-]

A lawyer's expertise is in rationalization, not rationality. Of course, many lawyers may also be excellent rationalists, but my experience is that they're not generally very sciency people.

Comment author: Demosthenes 29 March 2009 06:48:17PM 0 points [-]

That was my point; I was taking a dig at the goals of argumentative atheists looking for a support group vs. people who might want to advance rationalist goals.

Comment author: Demosthenes 29 March 2009 06:30:56PM 12 points [-]

In a nutshell: it might be cool to make a website and organization that promotes data collection and debate.

Rationalism requires access to high quality empirical evidence. Holding your hypotheses up to constantly changing data is a major theme of this site.

We can only rationally discuss our hypotheses and beliefs when we have something to test, and the quality of the datasets floating around on the internet is often low, or the data inaccessible.

A good rationalist project might be to highlight resources for empirical evidence; run "data debates" where experts attack and defend each other's datasets; build a wiki for best practices in data collection; or build a wiki for navigating popular issues through good datasets (try, as a nonexpert, to find which studies on taxation and inequality are best, and you can end up running in circles).

I think you would want to tailor this kind of project toward non-experts, giving people (especially journalists) a good starting place for finding meaningful, well-collected data that can form a jumping-off point for rational analysis.

A project like this also leaves the door open to many interpretations and many goals, so it isn't necessarily cutting down on the number of voices out there.

I would also be interested in cataloging failed attempts. More and more I have been trying to look at survivorship biases (http://en.wikipedia.org/wiki/Survivorship_bias) behind all my beliefs.

Are there any good examples of projects like this in existence? Maybe we can leverage the community here to throw our weight behind one.

In response to Church vs. Taskforce
Comment author: Yvain 28 March 2009 12:11:13PM *  7 points [-]

How is this significantly different from the Lions Club and Kiwanis, crossed with the local atheist organization?

I see how it's more rationalist-oriented than the Kiwanis, and more service-oriented than the Atheist Club. And they could probably get more charitable value for money by focusing on high-utility causes - if the rationalists were high-level enough, which the sort of people who respond to "rationalist club" ads might not be. But does "altruist rationalists" correspond to such a significant cluster in personspace that they need their own club? And is this just "we should start a fraternal organization"?

These clubs are interesting and do some good work, but I don't hear people speaking of them in the same breath as religion (except maybe when they get mystical, like the Freemasons).

In response to comment by Yvain on Church vs. Taskforce
Comment author: Demosthenes 29 March 2009 04:04:29AM 2 points [-]

Yvain is spot on; secular service organizations already exist and function. I have occasionally attended meetings at a Rotary club, and they usually involve eating, a list of ongoing activities, community highlights, and recognition of visiting members.

What is special about the way a rationalist helps people? Maybe starting a program to fund probability and philosophy of science classes in the community?

Law school sounds like the best option for finding fellow argumentative atheists.

Comment author: Erik 28 March 2009 05:51:52AM 7 points [-]

I think you may very well be correct in your interpretation of the original author's intention. However, I think Yvain's interpretation better accounts for the majority of the upvotes the comment got.

Comment author: Demosthenes 29 March 2009 03:35:12AM 4 points [-]

In his youth, Steve Jobs went to India to be enlightened. After seeing that the nation claiming to be the source of this great spiritual knowledge was full of hunger, ignorance, squalor, poverty, prejudice, and disease, he came back and said that the East should look to the West for enlightenment.

...or maybe the quotation, and by extension the entire comment, was meant to suggest that traditionally materialist concerns like sanitation, wealth, and longevity are more deserving of the title "enlightenment," and that our restricting "enlightenment" to mean only the spiritual is not entirely accurate. Expressing wonder at a reductionist, material understanding of the universe shouldn't be new to this crowd. Expressing value judgements does not a Dark Art make.

...or maybe it was meant to ignore all Indian claims to enlightenment....

There is a lot of nonsense on OB and LW about separating content from style; the occasional attempts to translate posts into positivist, verifiable claims, or into examples of Dark Arts, often say more about the person doing the translating than they do to illuminate the text for the reader.

Yvain obviously interpreted this in a very specific way, and he has a good basis for asking Phil to clarify the issues. These sorts of things are more valuable as discussions, but instead this one was turned into a broadcast.

This is not a criticism, just a suggestion that the world of give-and-take, persuasion, and rebuttal can be a lot more valuable than posting an instantiation of the comment's meaning that is highly suspect at best.

Comment author: Demosthenes 24 March 2009 03:09:13PM 2 points [-]

If anyone wants to do some background reading on the test before commenting, this paper addresses some of the common criticisms:

http://www.psy.utexas.edu/psy/FACULTY/Markman/jpsp01.pdf

"How do indirect measures of evaluation work? Evaluating the inference of prejudice in the IAT"

Comment author: Yvain 23 March 2009 04:13:02PM *  7 points [-]

From a poster's perspective: it is very hard to tell which ideas your audience considers beginner-level and which they consider advanced-level. Especially when the audience is as diverse and self-selected as at LW. I've posted a few times asking "Hey, does everyone here already know X or not?" and I've rarely gotten the answer I expected.

Responses to my post last night ranged from "this is obvious" to "this is wrong" to "this acronym could be useful" to "this was one of my favorite posts yet". I don't quite know what to do with that. Right now I am erring on the side of caution; I'd rather write something obvious to everyone than skip an inferential distance somewhere.

Upvoting ought to be the main feedback mechanism here, but right now I worry that a well-written true (but obvious) article will get voted up just because it's well-written and true, and everyone figures it will probably help someone else. Maybe make a rule that you should not upvote a post unless it teaches you something? Or maybe end a post whose difficulty level you're not sure of with "Please rate this as too obvious, okay, or too hard"?

EDIT: It's also hard to remember if something has already been covered on Overcoming Bias (see: source confusion). There's not any nice list of Robin or the other writers' posts like there is of Eliezer's, is there?

Comment author: Demosthenes 23 March 2009 10:20:25PM *  3 points [-]

Yes! Something like a table of contents?

The tag cloud is a good way to start, but once you've been generating 10 posts a day for long enough, it is no longer a useful navigation tool.

Something like this maybe: <http://drupal.org/project/hypergraph>

Drupal can also automatically generate "related content" based on whatever criteria you define as important or manually entered links. Adding more and more blocks to the page might not be good for efficiency, but providing more diverse paths to explore the content on these sites would be great.

In the long run, the more crosslinking there is, the easier it will be to visualize the stronger nodes and to find highly cited posts. At that point, good posts get even more citation. Good navigation is the critical first step.
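A minimal sketch of the citation-counting idea above: given a list of crosslinks between posts, rank posts by how often they are cited. The post slugs and link data here are hypothetical, purely for illustration.

```python
from collections import Counter

def rank_by_citations(links):
    """Rank posts by inbound crosslinks.

    `links` is an iterable of (source_post, cited_post) pairs;
    returns posts sorted from most-cited to least-cited.
    """
    inbound = Counter(cited for _, cited in links)
    return [post for post, _ in inbound.most_common()]

# Hypothetical crosslink data (source cites target):
links = [
    ("cached-selves", "cached-thoughts"),
    ("church-vs-taskforce", "cached-thoughts"),
    ("akrasia", "cached-thoughts"),
    ("akrasia", "church-vs-taskforce"),
]
print(rank_by_citations(links))  # → ['cached-thoughts', 'church-vs-taskforce']
```

A real site would pull the link pairs from its database (e.g. Drupal's link tables) rather than a hardcoded list, but the ranking step is this simple.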

I spend a lot of time these days fishing through older posts on Overcoming Bias, looking for something to read, but it is definitely not set up as a repository of knowledge.

In response to comment by Demosthenes on Cached Selves
Comment author: topynate 22 March 2009 11:13:38PM 2 points [-]

How someone could do enough compartmentalizing of their identity to pull off either of these tasks escapes me.

The motive behind these prescriptions is to make the decision we want to make for our current selves, so there's another way which non-rationalists use all the time. Suppose you make a New Year's Resolution to exercise more; you genuinely do want to exercise more. But when the equipment is installed in your living room, you don't feel like it any more. In fact, you'll end up convincing yourself that you were never really serious about your resolution in the first place, if you allow yourself to. I think that a person's past-self-concept does exert quite an influence on behaviour, but that current preferences can also alter the past-self-concept to fit. Consistency between past-self-concept and current self seems to be the overriding preference.

Of course this is a form of willing self-deception, so our overriding preference should be to actually do 3c and 3d, which are not self-deceptions, even if it does feel like compartmentalizing. I think one has to really convince oneself that such a perspective is not "compartmentalization"; that to disregard one's past preferences is not a betrayal of one's current self.

In response to comment by topynate on Cached Selves
Comment author: Demosthenes 23 March 2009 07:58:15PM *  1 point [-]

Has anyone brought up this study by Bruner and Potter (1964) before? I think it would relate to intertemporal beliefs and how we sometimes perceive them to be more sound than they really are:

<http://www.ahs.uwaterloo.ca/~kin356/bpdemo.htm>

In this demonstration, you will see nine different pictures. The pictures will get clearer and clearer. Make a guess as to what is being shown for each of the pictures, and write down your guess. Note the number of the picture where you were first able to recognize what was being shown. Then go backwards - press the "BACK" button on the browser - and see at which point you can no longer identify the picture. Are your "ascending" and "descending" points the same?

========

IF YOU HAVE TRIED THE STUDY:

Pictures of common objects, coming slowly into focus, were viewed by adult observers. Recognition was delayed when subjects first viewed the pictures out of focus. The greater or more prolonged the initial blur, the slower the eventual recognition. Interference may be accounted for partly by the difficulty of rejecting incorrect hypotheses based on substandard cues.

It would be interesting to think of your intertemporal frame of mind as discontinuous, running at 24 frames per second (like a film). Maybe your consciousness gives your beliefs a false sense of flowing like a movie from one time state to the next.
