Less Wrong is a community blog devoted to refining the art of human rationality.

Eliezer_Yudkowsky comments on The noncentral fallacy - the worst argument in the world? - Less Wrong

157 Post author: Yvain 27 August 2012 03:36AM

Comments (1744)

Comment author: TheOtherDave 13 September 2012 04:36:03PM 10 points [-]

Mostly he's coming across to me as having lost patience with the community not being what he wants it to be, and having decided that he can fix that by changing the infrastructure, and not granting much importance to the fact that more people express disapproval of this than approval.

Comment author: Eliezer_Yudkowsky 14 September 2012 01:43:22AM 7 points [-]

Keep in mind that it's not "more people"; it's more "people who participate in meta threads on Less Wrong". I've observed a tremendous divergence between the latter set and "what LWers seem to think during real-life conversations" (e.g. the July Minicamp private discussions of LW, which is where the anti-troll-thread ideas were discussed, and asking what people thought about recent changes at Alicorn's most recent dinner party). I'm guessing there's some sort of effect where only people who disagree bother to keep looking at the thread, and hence bother to comment.

Some "people" were claiming that we ought to fix things by moderation instead of making code changes, which does seem worth trying; so I've said to Alicorn to open fire with all weapons free, and am trying this myself while code work is indefinitely in progress. I confess I did anticipate that this would also be downvoted even though IIRC the request to do that was upvoted last time, because at this point I've formed the generalization "all moderator actions are downvoted", either because only some people participate in meta threads, and/or the much more horrifying hypothesis "everyone who doesn't like the status quo has already stopped regularly checking LessWrong".

I'm diligently continuing to accept feedback from RL contact and attending carefully to this non-filtered source of impressions and suggestions, but I'm afraid I've pretty much written off trying to figure out what the community-as-a-whole wants by looking at "the set of people who vigorously participate in meta discussions on LW". That set is so much unlike the reactions I got when ideas for improving LW were being discussed at the July Minicamp, or the distribution of opinions at Alicorn's last dinner party, that I presume any other unfiltered source of reactions would find this conversation similarly unrepresentative.

Comment author: Yvain 14 September 2012 06:16:20PM *  13 points [-]

I will be starting another Less Wrong Census/Survey in about three weeks; in accordance with the tradition I will first start a thread asking for question ideas. If you can think of a good list of opinions you want polled in the next few weeks, consider posting them there and I'll stick them in.

Comment author: komponisto 14 September 2012 04:28:34AM 38 points [-]

Let me see if I understand you correctly: if someone cares about how Less Wrong is run, what they should do is not comment on Less Wrong -- least of all in discussions on Less Wrong about how Less Wrong is run ("meta threads"). Instead, what they should do is move to California and start attending Alicorn's dinner parties.

Have I got that right?

Comment author: wedrifid 14 September 2012 11:40:38AM 25 points [-]

Let me see if I understand you correctly: if someone cares about how Less Wrong is run, what they should do is not comment on Less Wrong -- least of all in discussions on Less Wrong about how Less Wrong is run ("meta threads"). Instead, what they should do is move to California and start attending Alicorn's dinner parties.

That's how politics usually works, yes.

Comment author: Alicorn 14 September 2012 04:34:25AM *  17 points [-]

Also, you have to attend dinner parties on a day when Eliezer is invited and doesn't decline due to being on a weird diet that week.

Comment author: fubarobfusco 14 September 2012 05:08:15AM 15 points [-]

Can we call this the social availability heuristic?

Comment author: SilasBarta 18 September 2012 11:05:16PM 4 points [-]

Don't worry, I'm sure that venue's attendees are selected neutrally.

Comment author: Eliezer_Yudkowsky 14 September 2012 11:18:07AM 0 points [-]

All you have to do is run into me in any venue whatsoever where the attendees weren't filtered by their interest in meta threads. :)

Comment author: [deleted] 14 September 2012 04:07:41PM 11 points [-]

But now that you've stated this, you have the ability to rationalize any future IRL meta discussion...

Comment author: DaFranker 18 September 2012 08:38:08PM 7 points [-]

Can "Direct email, skype or text-chat communications to E.Y." count as a venue? Purely out of curiosity.

Comment author: Eliezer_Yudkowsky 18 September 2012 08:56:06PM 0 points [-]

The problem is that if you initiate it, it's subject to the Loss Aversion effect where the dissatisfied speak up in much greater numbers.

Comment author: komponisto 19 September 2012 10:08:45AM 25 points [-]

I don't see what this has to do with "loss aversion" (the phenomenon where people think losing a dollar is worse than failing to gain a dollar they could have gained), though that's of course a tangential matter.

The point here is -- and I say this with all due respect -- it looks to me like you're rationalizing a decision made for other reasons. What's really going on here, it seems to me, is that, since you're lucky enough to be part of a physical community of "similar" people (in which, of course, you happen to have high status), your brain thinks they are the ones who "really matter" -- as opposed to abstract characters on the internet who weren't part of the ancestral environment (and who never fail to critique you whenever they can).

That doesn't change the fact that this is an online community, and as such, is for us abstract characters, not your real-life dinner companions. You should be taking advice from the latter about running this site to about the same extent that Alicorn should be taking advice from this site about how to run her dinner parties.

Comment author: Alicorn 19 September 2012 05:45:51PM 3 points [-]

Alicorn should be taking advice from this site about how to run her dinner parties.

Do you have advice on how to run my dinner parties?

Comment author: Bugmaster 19 September 2012 06:27:43PM 8 points [-]

Vaniver and DaFranker have both offered sensible, practical, down-to-earth advice. I, on the other hand, have one word for you: Airship.

Comment author: shminux 19 September 2012 08:43:09PM 1 point [-]

I, on the other hand, have one word for you: Airship.

Not plastics?

Comment author: DaFranker 19 September 2012 06:23:48PM 2 points [-]

Consider seating logistics, and experiment with having different people decide who sits where (or next to whom). Dinner parties tend to turn out differently with different arrangements, but different subcultures will have different algorithms for establishing optimal seating, so the experimentation is usually necessary (and having different people decide serves both as a form of blinding and as a way to turn up evidence to isolate the algorithm faster).

Comment author: Alicorn 19 September 2012 06:52:47PM 0 points [-]

Huh, I haven't been assigning seats at all except for reserving the one with easiest kitchen access for myself. I've just been herding people towards the dining table.

Comment author: Vaniver 19 September 2012 05:57:32PM 3 points [-]

Consider eating Roman-style to increase the intimacy / as a novel experience. Unfortunately, this is made way easier with specialized furniture- but you should be able to improvise with pillows. As well, it is a radically different way to eat that predates the invention of the fork (and so will work fine with hands or chopsticks, but not modern implements).

Comment author: RichardKennaway 19 September 2012 12:59:26PM -2 points [-]

since you're lucky enough to be part of a physical community of "similar" people (in which, of course, you happen to have high status), your brain thinks they are the ones who "really matter" -- as opposed to abstract characters on the internet who weren't part of the ancestral environment (and who never fail to critique you whenever they can).

Was Eliezer "lucky" to have cofounded the Singularity Institute and Overcoming Bias? "Lucky" to have written the Sequences? "Lucky" to have founded LessWrong? "Lucky" to have found kindred minds, both online and in meatspace? Does he just "happen" to be among them?

Or has he, rather, searched them out and created communities for them to come together?

That doesn't change the fact that this is an online community, and as such, is for us abstract characters, not your real-life dinner companions. You should be taking advice from the latter about running this site to about the same extent that Alicorn should be taking advice from this site about how to run her dinner parties.

The online community of LessWrong does not own LessWrong. EY owns LessWrong, or some combination of EY, the SI, and whatever small number of other people they choose to share the running of the place with. To a limited extent it is for us, but its governance is not at all by us, and it wouldn't be LessWrong if it was. The system of government here is enlightened absolutism.

Comment author: komponisto 19 September 2012 01:22:31PM *  16 points [-]

since you're lucky enough to be part of a physical community of "similar" people

Was Eliezer "lucky" to have cofounded the Singularity Institute and Overcoming Bias?

The causes of his being in such a happy situation (is that better?) were clearly not the point here, and, quite frankly, I think you knew that.

But if you insist on an answer to this irrelevant rhetorical question, the answer is yes. Eliezer_2012 is indeed quite fortunate to have been preceded by all those previous Eliezers who did those things.

EY owns LessWrong

Then, like I implied, he should just admit to making a decision on the basis of his own personal preference (if indeed that's what's going on), instead of constructing a rationalization about the opinions of offline folks being somehow more important or "appropriately" filtered.

Comment author: DaFranker 19 September 2012 06:30:07PM 0 points [-]

(...) on the basis of his own personal preference (...)

I would replace preference with hypothesis of what constitutes the optimal rationality-refining community.

They are sensibly the same, but I find the latter to be a more useful reduction that is more open to being refined in turn.

Comment author: thomblake 19 September 2012 01:49:37PM 4 points [-]

The system of government here is enlightened absolutism.

This is a community blog. If your community has a dictator, you should overthrow him.

Comment author: ArisKatsaris 19 September 2012 06:18:22PM 2 points [-]

Is the overthrowing of dictators a terminal value to you, or is it that you associate it with good consequences?

Comment author: wedrifid 20 September 2012 10:41:01AM *  0 points [-]

This is a community blog. If your community has a dictator, you should overthrow him.

With the caveats:

  • If the dictator isn't particularly noticed to be behaving in that kind of way, it is probably not worth enforcing the principle. I.e. it is fine for people to have the absolute power to do whatever they want regardless of the will of the people, as long as they don't actually use it. A similar principle would also apply if the President of the United States started issuing pardons for whatever he damn well pleased. If US television informs me correctly (and it may not) then he is technically allowed to do so, but I don't imagine that power would remain if it were used frequently for his own ends. (And I doubt the reaction against excessive abuse of power would be limited to just not voting for him again.)
  • The 'should' is weak. I.e. it applies all else being equal, but with a huge "if it is convenient to do so and you haven't got something else you'd rather do with your time" implied.
Comment deleted 19 September 2012 06:22:17PM [-]
Comment author: RichardKennaway 20 September 2012 12:37:18PM *  3 points [-]

Yudkowsky's luck consisted of having a billionaire friend (Peter Thiele) who bankrolled SIAI

How did he acquire such a friend, and who convinced him to bankroll SIAI?

Comment author: gwern 19 September 2012 09:52:47PM 3 points [-]

SIAI over its history (you can look at the Form 990s if you want) has gotten maybe half or less its budget from Thiel. Where's the rest coming from? Lady Luck's charitable writeoffs?

Still, at least you seem to have dropped your claim that SIAI or LW is a homeschooling propaganda front...

Comment author: ArisKatsaris 20 September 2012 11:23:28AM 2 points [-]

It's my impression that "front group" as typically used refers to a hidden/covert connection. LessWrong on the other hand has the logos/links for CFAR, SI and the Future of Humanity Institute displayed prominently.

Comment author: [deleted] 20 September 2012 11:13:07AM 1 point [-]

<nitpick>Thiel</nitpick>

Comment author: shminux 19 September 2012 08:45:24PM *  -1 points [-]

What would have happened if he didn't? How many times, do you think, other potential sponsors decided to pass? Seems like this is one of those cases where a person makes his own luck.

Comment author: DaFranker 18 September 2012 09:04:53PM *  4 points [-]

True. For that to be an effective communication channel, there would need to be a control group. As for how to create that control group or run any sort of blind (let alone double-blind) testing... yeah, I have no idea. Definitely a problem.

ETA: By "I have no idea", I mean "Let me find my five-minute clock and I'll get back to you on this if anything comes up".

Comment author: DaFranker 19 September 2012 02:15:06PM *  2 points [-]

So I thought for five minutes, then looked at what's been done in other websites before.

The best I have is monthly surveys with randomized questions from a pool of stuff that matters for LessWrong (according to the current or then-current staff, I would presume) with a few community suggestions, and then possibly later implementation of a weighing algorithm for diminishing returns when multiple users with similar thread participation (e.g. two people that always post in the same thread) give similar feedback.

The second part is full of holes and horribly prone to "Death by Poking With Stick", but an ideal implementation of this seems like it would get a lot more quality feedback than what little gets through low-bandwidth in-person conversations.
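The diminishing-returns weighting described above could be sketched roughly as follows. This is purely illustrative: the `weighted_feedback` function, the use of Jaccard overlap between users' thread-participation sets as the similarity measure, and the `1/(1 + overlap)` discount are all assumptions of this sketch, not anything actually proposed or implemented for LW:

```python
def weighted_feedback(responses, participation):
    """Aggregate survey feedback while discounting near-duplicate voices.

    responses: dict mapping user -> numeric feedback score
    participation: dict mapping user -> set of thread ids the user posts in

    Users whose thread participation heavily overlaps are treated as
    correlated, so each additional similar voice counts for less
    (diminishing returns).
    """
    users = list(responses)
    weights = {}
    for u in users:
        # Total Jaccard overlap of this user's participation with
        # every other respondent's participation.
        overlap = 0.0
        for v in users:
            if v == u:
                continue
            a, b = participation[u], participation[v]
            if a or b:
                overlap += len(a & b) / len(a | b)
        # More overlap with other respondents -> smaller weight.
        weights[u] = 1.0 / (1.0 + overlap)
    total = sum(weights.values())
    return sum(responses[u] * weights[u] for u in users) / total
```

For example, two respondents who post in exactly the same threads would each end up with half weight, while a respondent with disjoint participation keeps full weight, so a duplicated viewpoint no longer counts double.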

There are other, less practical (but possibly more accurate) alternatives, of course. Like picking random LW users every so often, appearing at their front door, giving them a brain-scan headset (e.g. an Emotiv Epoc), and having them wear the headset while being on LW so you can collect tons of data.

I'd stick with live feedback and simple surveys to begin with.

Comment author: DevilWorm 19 September 2012 08:27:33PM *  4 points [-]

it's subject to the Loss Aversion effect where the dissatisfied speak up in much greater numbers

But Eliezer Yudkowsky, too, is subject to the loss aversion effect. Just as those dissatisfied with changes overweight change's negative consequences, so does Eliezer Yudkowsky overweight his dissatisfaction with changes initiated by the "community." (For example, increased tolerance of responding to "trolling.")

Moreover, if you discount the result of votes on rules, why do you assume votes on other matters are more rational? The "community" uses votes on substantive postings to discern a group consensus. These votes are subject to the same misdirection through loss aversion as are procedural issues. If the community has taken a mistaken philosophical or scientific position, people who agree with that position will be biased to vote down postings that challenge that position, a change away from a favored position being a loss. (Those who agree with the newly espoused position will be less energized, since they weight their potential gain less than their opponents weigh their potential loss.)

If you think "voting" is so highly distorted that it fails to represent opinion, you should probably abolish it entirely.

Comment author: Nornagest 14 September 2012 04:07:16AM *  14 points [-]

I've moderated a few forums before, and with that experience in mind I'd have to agree that there's a huge, and generally hugely negative, selection bias at play in online response to moderator decisions. It'd be foolish to take those responses as representative of the entire userbase, and I've seen more than one forum suffer as a result of such a misconception.

That being said, though, I think it's risky to write off online user feedback in favor of physical. The people you encounter privately are just as much a filtered set as those who post feedback here, though the filters point in different directions: you're selecting people involved in the LW interpersonal community, for one thing, which filters out new and casual users right off the bat, and since they're probably more likely to be personally friendly to you we can also expect affect heuristics to come into play. Skepticism toward certain LW norms may also be selected against, which could lead people to favor new policies reinforcing those norms. Moreover, I've noticed a trend in the Bay Area group -- not necessarily an irrational one, but a noticeable one -- toward treating the online community as low-quality relative to local groups, which we might expect to translate into antipathy towards its status quo.

I don't know what the weightings should be, but if you're looking for a representative measure of user preferences I think it'd be wise to take both groups into account to some extent.

Comment author: Alicorn 14 September 2012 04:32:51AM 12 points [-]

You... know I don't optimize dinner parties as focus groups, right? The people who showed up that night were people who like chili (I had to swap in backup guests for some people who don't) and who hadn't been over too recently. A couple of the attendees from that party barely even post on LW.

Comment author: wedrifid 14 September 2012 12:01:10PM *  17 points [-]

You... know I don't optimize dinner parties as focus groups, right?

Perhaps more importantly, dinner parties are optimised for status and social comfort. Actually giving honest feedback rather than guessing passwords would be a gross faux pas.

Getting feedback at dinner parties is a good way to optimise the social experience of getting feedback and translate one's own status into the agreement of others.

Comment author: [deleted] 14 September 2012 03:06:27PM 7 points [-]

FWIW, I eat chili but I don't think the strongest of the proposed anti-troll measures are a good idea.

Comment author: CCC 14 September 2012 08:03:03AM 5 points [-]

If I were to guess, I'd guess that the main filter criterion for your dinner parties is geographical; when you have a dinner party in the Bay area, you invite people who can reasonably be expected to be in the Bay area. This is not entirely independent of viewpoint - memes which are more common local to the Bay area will be magnified in such a group - but the effect of that filter on moderation viewpoints is probably pretty random (similarly, the effect of the 'people who like chili' filter on moderation viewpoints is probably also pretty random).

So the dinner party filter exists, but it less likely to pertain to the issue at hand than the online self-selection filter.

Comment author: komponisto 14 September 2012 09:08:17AM 5 points [-]

The problem with the dinner party filter is not that it is too strong, but that it is too weak: it will for example let through people who aren't even regular users of the site.

Comment author: Eliezer_Yudkowsky 14 September 2012 11:18:43AM 0 points [-]

You... know I don't optimize dinner parties as focus groups, right?

That's kinda the point.

Comment author: MBlume 14 September 2012 06:26:03PM 4 points [-]

At risk of failing to JFGI: can someone quickly summarize what remaining code work we'd like done? I've started wading into the LW code, and am not finding it quite as impenetrable as last time, so concrete goals would be good to have.

Comment author: Eliezer_Yudkowsky 15 September 2012 03:31:01AM 3 points [-]

Comment author: Bugmaster 14 September 2012 02:41:47AM 9 points [-]

That's fair, and your strategy makes sense. I also agree with DaFranker, below, regarding meta-threads.

This said, however, at the time when I joined Less Wrong, my model of the site was something like, "a place where smart people hold well-reasoned discussions on a wide range of interesting topics" (*). TheOtherDave's comment, in conjunction with yours, paints a different picture of what you'd like Less Wrong to be; let's call it Less Wrong 2.0. It's something akin to, "a place where Eliezer and a few of his real-life friends give lectures on topics they think are important, with Q&A afterwards".

Both models have merit, IMO, but I probably wouldn't have joined Less Wrong 2.0. I don't mean that as any kind of an indictment; if I were in your shoes, I would definitely want to exclude people like this Bugmaster guy from Less Wrong 2.0, as well.

Still, hopefully this one data point was useful in some way; if not, please downvote me !

(*) It is possible this model was rather naive.

Comment author: TheOtherDave 14 September 2012 03:08:59AM 6 points [-]

EY has always seemed to me to want LW to be a mechanism for "raising the sanity waterline". To the extent that wide-ranging discussion leads to that, I'd expect him to endorse it; to the extent that wide-ranging discussion leads away from that, I'd expect him to reject it. This ought not be a surprise.

Nor ought it be surprising that much of the discussion here does not noticeably progress this goal.

That said, there does seem to be a certain amount of non-apple selling going on here; I don't think there's a cogent model of what activity on LW would raise the sanity waterline, so attention is focused instead on trying to eliminate the more blatant failures: troll-baiting, for example, or repetitive meta-threads.

Which is not a criticism; it is what it is. If I don't know the cause, that's no reason not to treat the symptoms.

Comment author: Emile 14 September 2012 08:34:43AM 4 points [-]

This said, however, at the time when I joined Less Wrong, my model of the site was something like, "a place where smart people hold well-reasoned discussions on a wide range of interesting topics" (*). TheOtherDave's comment, in conjunction with yours, paints a different picture of what you'd like Less Wrong to be; let's call it Less Wrong 2.0. It's something akin to, "a place where Eliezer and a few of his real-life friends give lectures on topics they think are important, with Q&A afterwards".

No; you're conflating "Eliezer considers he should have the last word on moderation policy" and "Eliezer considers LessWrong's content should be mostly about what he has to say".

The changes of policy Eliezer is pushing have no effect on the "main" content of the site, i.e. posts that are well-received, and upvoted. The only disagreement seems to be about sprawling threads and reactions to problem users. I don't know where you're getting "Eliezer and a few of his real-life friends give lectures on topics they think are important" out of that, it's not as if Eliezer has been posting many "lectures" recently.

Comment author: Bugmaster 14 September 2012 06:47:00PM 2 points [-]

I was under the impression that Eliezer agreed with TheOtherDave's comment upthread:

Mostly [Eliezer is] coming across to me as having lost patience with the community not being what he wants it to be...

Combined with Eliezer's rather aggressive approach to moderation (f.ex. deleting downvoted comments outright), this did create the impression that Eliezer wants to restrict LessWrong's content to a narrow list of specific topics.

Comment author: DaFranker 14 September 2012 02:00:10AM *  7 points [-]

Sometimes AKA the "Forum Whiners" effect, well known in the PC games domain:

When new PC games are released, almost inevitably the main forums for the game will become flooded with a large surge of complaints, negative reviews, rage, rants, and other negative stuff. This is fully expected and the absence of such is actually a bad sign. People that are happy with the product are playing the game, not wasting their time looking for forums and posting comments there - while people who have a problem or are really unhappy often look for an outlet or a solution to their issues (though the former in much greater numbers, usually). If no one is bothering to post on the forums, then that's evidence that no one cares about the game in the first place.

I see a lot of similarities here, so perhaps that's one thing worth looking into? I'd expect some people somewhere to have done the math already on this feedback (possibly by comparing to overall sales, survey results and propagation data), though I may be overestimating the mathematical propensity of the people involved.

Regarding the stop-watching-threads thing, I've noticed that I pretty much always stop paying attention to a thread once I've gotten the information I wanted out of it, and will only come back to it if someone directly replies to one of my comments (since that shows up in the inbox). This has probably been suggested before, but maybe a "watchlist" for marking threads whose new comments then show up visibly somewhere, and/or a way to have grandchild comments of your own comments show up somehow, could help? I often miss it when someone replies to a reply to my comment.

Comment author: Bugmaster 14 September 2012 02:20:46AM 7 points [-]

Upvoted for the "watchlist" idea, I really wish Less Wrong had it.

Comment author: [deleted] 14 September 2012 10:24:28PM 4 points [-]

Each individual post/comment has its own RSS feed (below your user name, karma scores etc. and above “Nearest meetups” in the right sidebar).

Comment author: Rain 14 September 2012 02:40:55AM *  9 points [-]

I very much appreciate the attempts at greater moderation, including the troll penalty. Thank you.

Comment author: Sarokrae 14 September 2012 06:11:18AM 9 points [-]

Me too. Troll posts and really wrong people are too distracting without some form of intervention. Not sure the current solution is optimal (but this point has been extensively argued elsewhere), but I applaud the effort to actually stick one's neck out and try something.

Comment author: Eliezer_Yudkowsky 14 September 2012 11:19:51AM 5 points [-]

Thank you both. Very much, and sincerely.

Comment author: Will_Newsome 14 September 2012 05:25:39PM *  10 points [-]

Accepting thanks with sincerity, while somewhat-flippantly mostly-disregarding complaints? ...I must be missing some hidden justification?

Comment author: wedrifid 14 September 2012 05:29:59PM 5 points [-]

Accepting thanks with sincerity, while somewhat-flippantly mostly-disregarding complaints? ...I must be missing some hidden justification?

He is thanking them for their support, not their information.

Comment author: philh 14 September 2012 05:40:00PM 5 points [-]

People who agree are more likely to keep quiet than people who disagree. Rewarding them for speaking up reduces that effect, which means comments get closer to accurately representing consensus.

Comment author: TheOtherDave 14 September 2012 05:47:29PM 1 point [-]

Can you summarize your reasons for believing that people who agree are more likely to keep quiet than people who disagree?

Comment author: philh 14 September 2012 06:03:23PM 2 points [-]

It's the impression I've got from informal observation, and it's true when talking about myself specifically. (If I disagree, I presumably have something to say that has not yet been said. If I agree, that's less likely to be true. I don't know if that's the whole reason, but it feels like a substantial part of it.)

http://lesswrong.com/lw/3h/why_our_kind_cant_cooperate/ provides an anecdote, and suggests that Eliezer has also gotten the same impression.

Comment author: TheOtherDave 14 September 2012 06:15:08PM *  3 points [-]

I certainly agree with your last sentence.

My own experience is that while people are more likely to express immediate disagreement than agreement in contexts where disagreement is expressed at all, they are also more likely to express disagreement with expressed disagreement in such forums, from which agreement can be inferred (much as I can infer your agreement with EY's behavior from your disagreement with Will_Newsome). The idea that they are more likely to keep quiet in general, or that people are more likely to anonymously downvote what they disagree with than upvote what they agree with, doesn't jibe with my experience.

And in contexts where disagreement is not expressed, I find the Asch results align pretty well with my informal expectations of group behavior.

Comment author: shminux 14 September 2012 06:04:30PM 1 point [-]

Do you doubt that content people whine less?

Comment author: TheOtherDave 14 September 2012 06:08:16PM 0 points [-]

No, I don't doubt that content people whine less.

Comment author: [deleted] 14 September 2012 06:39:32PM 4 points [-]

In case you need assurance from the online sector: I wholeheartedly welcome any increase in the prevalence of the banhammer, and the "pay 5 karma" thing seems good too.

During that Eridu fiasco, I kept hoping a moderator would do something like "this thread is locked until Eridu taboos all those nebulous affect-laden words."

Benevolent dictators who aren't afraid of dissent are a huge win, IMO.

Comment author: TheOtherDave 14 September 2012 02:16:44AM 1 point [-]

Fair enough. All I see is the vote-counts and online comments, but the real-life commenters are of course also people, and I can understand deciding to attend more to them.

Comment author: shminux 14 September 2012 02:20:59AM 0 points [-]

I think his point is that there is less selection bias IRL.

Comment author: TimS 14 September 2012 02:25:00AM 6 points [-]

But that's almost certainly false. IRL input has distinct selection bias from viewing meta threads, but not no selection bias.

Comment author: TheOtherDave 14 September 2012 02:59:38AM 4 points [-]

Yeah, exactly. Which is why I took it to mean a simple preference for considering the community of IRL folks. Which is not meant as a criticism; after all, I also take more seriously input from folks in my real life than folks on the internet.

Comment author: komponisto 14 September 2012 08:50:20AM 5 points [-]

I also take more seriously input from folks in my real life than folks on the internet.

Even when the topic on which you are receiving input is how to run an internet forum (on which the real-life folks don't post)?

Comment author: TheOtherDave 14 September 2012 01:59:47PM 3 points [-]

Well, I don't do that, clearly, since I don't run such an Internet forum.

Less trivially, though... yeah, I suspect I would do so. The tendency to take more seriously people whose faces I can see is pretty strong. Especially if it were a case like this one, where what the RL people are telling me synchronizes better with what I want to do in the first place, and thus gives me a plausible-feeling justification for doing it.

I suspect you're not really asking me what I do, though, so much as implicitly suggesting that what EY is doing is the wrong thing to do... that the admins ought to attend more to commenters and voters who are actually participating on the thread, rather than attending primarily to the folks who attend the minicamp or Alicorn's dinner parties.

If so, I don't think it's that simple. Fundamentally it depends on whether LW's sponsors want it to be a forum that demonstrates and teaches superior Internet discourse or whether it wants to be a forum for people interested in rational thinking to discuss stuff they like to discuss. If it's the latter, then democracy is appropriate. If it's the former, then purging stuff that fails to demonstrate superior Internet discourse is appropriate.

LW has seemed uncertain about which role it is playing for as long as I've been here.

Comment author: mrglwrf 14 September 2012 07:01:49PM 0 points [-]

LW has seemed uncertain about which role it is playing for as long as I've been here.

Yes, that's certainly the single largest problem. If the LW moderators decided on their goals for the site, and committed to a plan for achieving those goals, the meta-tedium would be significantly reduced. The way it's currently being done, there's too much risk of overlap between run of the mill moderation squabbles and the pernicious Eliezer Yudkowsky cult/anticult squabbles.

Comment author: shminux 14 September 2012 02:43:00AM 0 points [-]

Then he is OK with this particular selection bias :)