What else remains? What other plausible function does it serve?
Entertainment?
And this is intentional. I even found this in their community standards section:
http://rationalwiki.org/wiki/RationalWiki:Community_Standards Point of view
RationalWiki does not use Wikipedia's well-known "Neutral Point of View". We have our own version: SPOV. SPOV means two things: Snarky point of view — This is the meaning most people refer to. It means that, to keep our articles from being dry and boring, we spice it up with humor, sarcasm, skepticism, satire, and wit.
So they do announce they are trying to be funny.
Its problem is that it is an ammunition depot to aid in winning debates.
It feels more like combustibles for a fireworks display.
It feels more like combustibles for a fireworks display.
That is a good alternative interpretation. But I don't think most of its readers treat it that way.
When I come across a pseudoscience I haven't seen before, I usually go to Google first to check its position with regard to reality.
Then I go to its RationalWiki article for entertainment. This is essential if I don't want to spend the rest of the day fuming at how many people "actually believe in that stuff".
I do. In certain places I couldn't pretend to believe they're trying to be serious even if I tried.
Its problem is that it is an ammunition depot to aid in winning debates.
To answer your question, this is pretty much what it's actually for. (I'm quite pleased with this one, for example, and this one has worked well in practice to make the world a slightly better place for a while.) Trent (the local Jimbo) is also working on making a new site explicitly for original works of skeptical interest, and expanding the RWF to do more than just run an amusing and slightly useful website.
There are two salient things to which this quote could apply, and I somewhat endorse applying it to both of them :-)
You can’t prescribe decently for something you hate. It will always come out wrong. You can’t prescribe decently for something you despair in. If you despair of humankind, you’re not going to have good policies for nurturing human beings. I think people ought to give prescriptions who have ideas for improving things, ought to concentrate on the things that they love and that they want to nurture.
What else remains? What other plausible function does it serve?
It keeps the riff-raff out of LessWrong.
A pleasantly pithy remark, but also a nasty one to present without actual evidence. We don't need cheerleading, especially at the expense of people who probably could contribute effectively to LW if they felt like it.
Ok, I was engaging in a bit of their proudly proclaimed SPOV. If they hand it out they'd better be able to take it as well.
As context for the joke, the most middle-class of the national supermarket chains in the UK is Waitrose. (Middle-class as opposed to working-class; the upper class would not be seen in such places.) Number two is Sainsburys. So, some comedian remarked, what's the point of Sainsburys? To keep the riff-raff out of Waitrose.
Is that a good thing? We can hardly raise the sanity waterline if only the most sane people hang out here.
Honestly I'm much more concerned about LessWrong staying as sane as it is. Overall I think there are several negative indications.
Honestly I'm much more concerned about LessWrong staying as sane as it is. Overall I think there are several negative indications.
I agree. I always thought "raising the sanity waterline" was a substitute for "refining the art of human rationality", not a complement to it. Imagine if the Logical Empiricists had tried recruiting everyday people rather than the scientific elite (and other scientifically literate philosophers). I fear that HPMOR, while I've gotten a lot of value from it, was the beginning of the end for this place. At the very least, I think these outreach efforts should be spun off as much as possible (and without links leading back here). I say, let self-selection effects do their thing.
The think the more sane getting more sane raises the water line.
EDIT - Even I don't know what the hell I was saying here.
I think of them as the Sith to our Jedi.
The Jedi say "We will give you ancient knowledge of vast power. But you must promise to use it only for pure truth-seeking and the good of all mankind. You must not use it to serve your own personal ends, or you will be consumed by it. You cannot possibly imagine what dangerous forces you are playing with!"
The Sith say "Ancient knowledge of vast power?! Awesome! Got to remember to use this to win any fights I get in from now on!"
Despite that I don't dislike them as much as I used to. People were relatively helpful to me when a local troll tried to smear me and even the guy responsible eventually apologized. At the same time I learned almost all of the anti-LW-ism on there is just one guy who really really hates us for some reason.
I do sometimes worry that they sometimes fall victim to the "If other people say this probability is only 1%, I will be even more virtuous than they are if I say it's only 0.0001%" fallacy, but they're probably better than most people and I'd hate to go all narcissism of minor differences on them.
Two guys, fwiw. LW burnouts have also been showing up. Many RW regulars quite like LW (and particularly Yvain), though the apparently-silly bits are in fact regarded as silly.
The only reason this article we're commenting on exists is because RW - which is piss-insignificant - is the only place on the Internet that pays LW even that much attention. Insofar as this is a problem, the problem is that no-one else pays LW even that much attention. (Of course, the question is then whether LW actually wants that attention, because press coverage in general is fundamentally shit and is really not worth touting for unless you have an actual thing you want it to publicise.)
LW paying RW this much attention while also claiming that the entire future of human value itself is at stake looks on the surface like a failure of apportionment of cognitive resources, but perhaps I've missed something.
LW paying RW this much attention while also claiming that the entire future of human value itself is at stake looks on the surface like a failure of apportionment of cognitive resources, but perhaps I've missed something.
What do you mean by "this much attention"? If Konkvistador's links at the top are reasonably comprehensive (and a quick search doesn't turn up much more), there have been 2 barely-upvoted discussion posts about RW in four years, which hardly seems like much attention. For comparison, LW has devoted several times as much energy to dating advice.
Is there a lot of discussion of RW that I'm missing, or are you claiming that even two posts in Discussion is totally excessive?
I suppose I'm really thinking of an LW regular telling me in conversation that they consider RW a serious existential risk. You know, serious enough to even think about compared to everything else in the world.
This article is a response to this comment, which was actually mostly about this comment. Posting an entire article in response to half of that comment does strike me as an overreaction. (I'd be interested in Konkvistador's similar-length response to Jade's comment, though; there's a body of work there raising quite apposite concerns about problems with LW as a social environment - specifically, the existing real world problem of creepers at LW meetups - that won't disappear by merely downvoting them.)
Thanks for linking to the context! In fairness, though, if people are citing RationalWiki as proof that LessWrong has a "reputation", then devoting a discussion-level post to it doesn't strike me as excessive.
(On a related note: I hadn't read Jade's comments, but I did after you flagged them as interesting; they struck me as totally devoid of value. Would you mind explaining what you think the valid concern he/she's expressing is?)
Well, for one thing, Jade appears to be a "she". But never mind, I'm sure it'll all work out fine.
Fixed, sorry! (I'm female and that mistake doesn't bother me at all, but I know it really annoys some people. I'll be more careful in future.)
I completely agree that characterizing RW as contributing to existential risk is absurd.
For comparison, LW has devoted several times as much energy to dating advice.
I'm not sure this comparison supports your point terribly well. Dating advice itself is incredibly instrumentally useful. The problems with the dating advice threads are the lack of quality content and the focus on irrelevant conflict. Lesswrong being unqualified to discuss a topic is a very different thing from a thing being insignificant or unworthy of attention.
Perhaps RW is more relevant than I thought. Rachael Briggs announced via an edit that she isn't doing the TDT paper for SIAI any more. (XiXiDu emailed and verified it was her.)
It strikes me that SIAI/LW may be at the stage of needing a proper PR strategy and someone whose job that is.
Briggs decided not to spend further time writing the TDT paper. However, SI is now paying her hourly to give us feedback on the TDT paper that Alex Altair is developing. She's very good at that, and appears to be enjoying the process.
According to XiXiDu, "she believed it would be unlikely for her to produce an article that would be satisfactory to both her and SIAI". Does that mean she didn't think it could be formalized, that it could be but that it would end up being obviously inferior to CDT/EDT or otherwise a bad decision theory, that she isn't able to do it but that someone else could, or something else entirely?
As many times as he's quoted people out of context, wantonly misrepresented their positions in order to support his agenda, and so on, I wouldn't be surprised if the above isn't the whole story.
Without really convincing citations, that's just mudslinging.
(I also saw the email from Briggs and it's accurate in wording and IMO context.)
But what is the context here? I mean, I'd find it perfectly plausible for a chosen academic to turn down a SI grant because she doesn't want to be associated - but Briggs already had accepted the grant, apparently in all seriousness.
Ah. Well, my general question still stands: it can't be as simple as just not wanting to be associated, since then she would not have accepted in the first place, so what changed? It's hard to imagine that XiXi's idée fixe, the basilisk, would make her turn it down, and I can't think of any recent scandals like newspaper headlines screaming 'Yudkowsky caught acausally molesting catgirls!' which might do the trick.
So my best guess is the other possibility mentioned: she didn't think she could do anything worthwhile with TDT. That's interesting to me, since I've read a few of her papers and they were pretty good, and other people who seem smarter than me and much better at decision theory think TDT is interesting, novel, and a good starting point for more work!
There's a large chasm between "Briggs confirmed she's not writing the TDT paper" and his editorializing:
Would you really feel good having your name that close to crackpot ideas like the Roko basilisk? Status is important within academia. Having "Singularity Institute" in your bio doesn't look good.
That has been his hobby horse for many years now, and these have been his methods. Or am I the only person who remembers shityudkowskysays.tumblr.com?
The statement you were responding to was: 'According to XiXiDu, "she believed it would be unlikely for her to produce an article that would be satisfactory to both her and SIAI".' You expressed doubt as to this, implying Kruel was lying. When called on that, you backed off to a general claim that he quoted people out of context a lot. In this case he was doing no such thing, so you've moved to general mudslinging. It's not clear that this constitutes a worthwhile mode of discussion.
I firmly disagree with your interpretation of this thread, and also find further discussion not worthwhile.
Perhaps RW is more relevant than I thought. Rachael Briggs announced via an edit that she isn't doing the TDT paper for SIAI any more. (XiXiDu emailed and verified it was her.)
It is relevant enough that, if people google their own name, mentions therein can appear in the search results.
The only times I've heard about RationalWiki have been on Less Wrong. I'm not really sure what the point is; the wiki format is not really suited for reading as entertainment or for discovering unknown unknowns, it's really not suited to community building and socializing, and the topic space is too narrow (as compared to Wikipedia or a Google search) for focused research. Those are pretty much my only use cases; it ends up in the same niche as the Less Wrong wiki, which I never use either.
There's another problem, which is that there's no good filtering mechanism there to decide what's worth reading - there's no upvoting, nor even visible author names. Notice that, when this article complains about some content on it, the complaints are directed at RationalWiki as a whole rather than at one particular person who writes there. And it pretty much has to be, because you have to dig into edit logs to get the author names, and that's not practical if you're just looking for something to read. The Less Wrong wiki would have the same problem, but it at least is made of mostly links, getting you to articles with scores and author names so you can decide what's worth reading.
the wiki format is not really suited for reading as entertainment
The wiki format is definitely suited for reading as entertainment for some people. I can compulsively read for hours on TvTropes to slack off, and there is a name for that behavior which has been linked on TvTropes:
The Problem With Rational Wiki
The new batch of comments polluting the recent comments page makes me inclined to suggest a "Do not feed the (tribe of) trolls!" policy at the post level!
I'm somewhat less forgiving about their casual approach to epistemology and their vulnerability to cargo cult science, as long as it is peer reviewed cargo cult science.
RatWiki operates as a mobocracy. Anyone who challenges the mob's assumptions about "science," which, for them, means "whatever we, the rational people, believe," is persona non grata. Yes, cargo cult science. In the example I know most about, the assumptions aren't "peer reviewed"; the SPOV firmly enforced on RatWiki hasn't shown up under peer review for almost a decade, while the contrary has been -- in the journals -- mainstreamed. But they don't know that, and won't read sources and arguments based on sources. Tl;dr. And if one is brief, it's still, "Crank, go away, shut up."
My conclusions from about eight months of activity there. My main interest is wiki structure, or I wouldn't have bothered.
Yeah, what I wrote was indeed about cold fusion, but I'm totally new to LW, am a bit in awe at the level of discourse here, and don't want to import a dispute from there to here. I'm happy to have found LW because of Gerard's comments on RatWiki.
This isn't quite as trollish as your first comment (now deleted) but a wall-o-text rant containing nothing concrete is not the way to post here.
The less attention we give to this absurd effort the better.
Ahem. It hadn't been further responded to for almost a year when you posted to it twice in one day.
RationalWiki is a troll site, but that could be a good thing - it gives LW the opportunity to show exactly why it isn't a cult. Yvain makes very convincing counter-arguments and other posters have done a superb job of keeping their cool while answering various inquiries. Some of the questions directed at SIAI and Eliezer are not without merit and can't be dismissed so easily.
Anecdotally speaking, I've been around a few unimaginative 'bro-grammer' types fixated on SIAI - and particularly Eliezer. The individuals I'm thinking of know just enough about computers to feign credibility on reddit while making fun of a group similar to them, yet more eccentric. Mocking Eliezer, a successful nerd with more celebrity than they could hope for, gives these guys a snarky sense of superiority. Kinda dumb.
This is just my impression of where some of this might be coming from.
Rational wiki has bias and is not a good source to go to for objective information. Snopes is no good either.
So easy to see patterns of misinformation everywhere.
people who identify as left wing politically form a large majority here
Except they don't: more than 60% of the survey participants identified as liberal or libertarian, and those are right-wing ideologies.
On the other hand, liberalism includes both classical liberalism, which is right-wing, and social liberalism, which is moderately left. I think that the "liberalism" option of the survey ended up including them both. Although it might have been useful to actually make a distinction between them.
Related to: RationalWiki's take on LW, David Gerard's Comments, Vladimir_M's comments, Public Drafts
I wanted to bring more attention to this argument because I've run into related discussion several times in the comment section, and because it demonstrates a failure mode that LessWrong may find itself vulnerable to.
Since it has been cited as a source, especially on the reputation LessWrong may or may not have elsewhere, I think readers should be aware that Rational Wiki has a certain reputation here as well. I'm not talking about the object-level disagreements such as cryonics, existential risk, the many-worlds interpretation, and artificial intelligence, because we have some reasonable disagreement on those here as well. Even its cheeky tone, while not helping its stated goals, can be amusing. I'm somewhat less forgiving about their casual approach to epistemology and their vulnerability to cargo cult science, as long as it is peer reviewed cargo cult science.
While factually it is about as accurate as Wikipedia, it is very selective about the facts that it is interested in. For example, what would you expect a site calling itself "Rational Wiki" to have on its page about charity? Do you expect information on how much good charity actually does? What kinds of charities do not do what they say on the label? How to avoid getting misled? The ethics of charity? The psychology, sociology or economics of charity?
I'm sorry to disappoint you, but the article consists of some haphazardly arranged facts and stats on how much members of some religions give or are supposed to give to charity, a dig against Christianity, and a non-sequitur unfavourable comparison of the US to Sweden. Contrast this with what you can find on the topic on sites like LessWrong or 80,000 Hours. Basically, the material presented is what a slightly left-of-centre atheist needs to win an internet debate. As is much of the rest of the site.
Indeed some entries have a clear ideological bias that is quite startling to behold on a "rational wiki" and it has been noted by some.
Now, to avoid any misunderstandings: there are good articles, a few LWers are contributors to the rational wiki, and there is certainly nothing wrong with being a left-of-centre atheist! Nearly everyone on this site is an atheist, and people who identify as left wing politically form a large majority here. The tribal markers and its political agenda aren't the biggest problem. Sites with all sorts of agendas, even political ones, promoting rationality are a good thing.
Its problem is that it is an ammunition depot to aid in winning debates. Very specific kinds of debates, too. This may sound harsh, but consider: How many people reading the site that aren't already atheists will change their mind on religion? How many people who follow a "crankish" belief won't do so afterwards? While I'm sure it happens, the site obviously isn't optimized for this. How many people will read the wiki and try to find errors and biases in their own thinking to debug it, instead of breaking it further with confirmation bias or using it as a club? How many will apply this knowledge to help them with any real world problems? Truth seeking? As a source or community that could aid in that quest it is less useful and reliable than Wikipedia, which, while a rather good and extensive encyclopaedia (despite snickering to the contrary), has a subtly but importantly different stated goal.
What else remains? What other plausible function does it serve?