The karma system discourages people from bringing up things not already known to the other members. People will be less interested, and more likely to ding them for "not being relevant" (or because they misunderstand the point). For instance, the recent post on E-Prime was relevant, very short, interesting, and even practical. Yet it's still sitting there at 0.
I've noticed that very smart people often go to great lengths to spend time with other very smart people, and then, instead of listening to them, try to talk as much as they can, which defeats much of the purpose of spending time with very smart people. This suggests that the motivation for joining such groups is as much to impress their members as to learn from them.
It would be very surprising if groups did not spend most of their time discussing things most of them already know. That would mean that the groups consisted mainly of people who were interested in a subject, yet uninformed about it.
If one person points out a previously-unknown implication of mutually-known facts to the group, does that still count as discussing the information shared by members?

> I've noticed that very smart people often go to great lengths to spend time with other very smart people, and then, instead of listening to them, try to talk as much as they can, which defeats much of the purpose of spending time with very smart people.
I think this is an interesting observation, but I find myself wondering if it really does defeat the purpose. When conversing with someone I suspect is smarter than me, I tend to value direct, critical responses to my own statements and speculations more than whatever smart thing happens to be on their mind. It's far easier (for me) to learn from having been wrong than it is just from hearing something that turns out to be right. I can't elicit as many of those responses if all I do is listen.
Good point. Though, often, the smart person doing the talking doesn't seem to be in doubt about his/her ideas.
I suspect people are interested in talking to smart people because they expect/hope that the smart people will tell them that they, too, are smart, and they value this statement from a smart person more than from some random individual of lesser intelligence.
I would go an extra step beyond Iogi. You can learn more from smart people, but you can also teach them far more easily and enjoyably, and they often get more out of it. Finally being with someone who can understand what you're talking about can be a great relief, even if, and sometimes especially if, you're not learning anything at the moment. This also has the side benefit of impressing them and/or helping you gain high status, so the two motives can get intermingled.
There's also the fact that they usually want to talk a lot too, especially if you give them food for thought, so you have to go a long way before you risk monopolizing the conversation.
Unless you are saying that the karma system actually displaces such motivations as giving others something worthwhile to think about, and getting feedback on one's ideas, I don't see how a small (even zero) non-negative karma rating would discourage anyone from posting an unusual idea. If we are more concerned with writing posts that gain karma than with expressing our ideas, we are in trouble. What would we want the karma for anyway?
A related request: there are a lot of goals common enough that better achieving these goals should be of interest to a large-ish portion of LW. I'm thinking here of: happiness; income; health; avoiding auto accidents; reading more effectively; building better relationships with friends, family, dating partners, or co-workers; operationalizing one's goals to better track progress; more easily shedding old habits and gaining new ones.
Could we use our combined knowledge base, and our ability to actually value empirical data and consider counter-evidence, to find and share some of the better-known strategies for achieving these goals? (Strategies that have already been published or empirically validated, but that many of us probably haven't heard of?) We probably don't want loads and loads of specific-goaled articles or links, because we don't want to look like just any old random internet self-help site. But a medium amount of high-quality research, backed by statistics, with the LW community's help noticing the flaws or counter-arguments -- this sounds useful to me. Really useful. Much of the advantage of rationality comes from actually using that rationality to sort through what's known and to find and implement existing best practices. And truth being singular, there's no reason we should each have to repeat this research separately, at least for the goals many of us share.
Though I guess Eliezer’s caution is worth attention.
> A related request: there are a lot of goals common enough that better achieving these goals should be of interest to a large-ish portion of LW. I'm thinking here of: happiness; income; health; avoiding auto accidents; reading more effectively; building better relationships with friends, family, dating partners, or co-workers; operationalizing one's goals to better track progress; more easily shedding old habits and gaining new ones.
I think this would be very helpful (esp. income, as for most people it would seem to be the whole of success at utilitarianism).
I'd just like to point out that this is a generally excellent post. It contains a link to an empirical study involving a rationality-related topic, followed by some analysis and links to LW/OB, followed by specific questions for discussion.
More posts like this!
> Each of us has, or should have, a secret identity.
What, so now I need to have impressive achievements just to read about rationality? Look, I'm sorry I'm so stupid and useless, but at least let me bask in your glory, okay?
On the contrary. If you even have a particularly unusual way of reading about rationality, that's useful information.
Interesting. Reading your post, it struck me that this is precisely how my best friendships operate.
That is, I seldom see them more than once a week, and although they're true brothers who I know will have my back, if we don't have anything interesting to discuss we'd rather not see each other, even when we miss hanging out.
If there are any psychologists/sociologists on here, perhaps they could expand on this.
From Marginal Revolution:
A result that shouldn't surprise this group. I've noticed obvious attempts to avoid this tendency on Less Wrong (for instance, Yvain's avoiding further Christian-bashing). We've had at least one post asking specifically for unique information. And I don't know about the rest of you, but I've already had plenty of new food for thought on Less Wrong.
But are we tapping the full potential? Each of us has, or should have, a secret identity. The nice thing about those identities is that they give us access to unique knowledge. We've been asked (though I can't find the link) to avoid large posts applying learned rationality techniques to controversial topics, for fear of killing minds, which seems reasonable to me. Is there a better way to allow discipline-specific knowledge to be shared among Less Wrong readers without setting off our politicosensors? It seems beneficial not only for improved rationality training, but also to enhance our secret identities. For instance, I, as an economist-in-training, would like to know not just what an anthropologist can tell me, but what a Bayesian-trained anthropologist can tell me.