If you want to promote the republishing of LW articles, I think you'd be well advised to drop the Singularity/Futurism bits from the tagline. They're alienating and off-message, I think.
Also, when I try to share an article on G+, G+ pulls the following text for summary: "Less Wrong Discussion Future of Humanity InstituteSingularity Institute for Artificial Intelligence. Main. Posts; Comments. Discussion. Posts; Comments. Less Wrong is a community blog devoted to refin...". No good.
So I think LW's admins/web designers should seriously consider replacing the rationality tagline.
As far as I'm concerned, rationality is what LessWrong is about, and the tagline says exactly that. I'm not interested in reading a Singularity/AI/transhumanism forum, and those exist elsewhere for those who are. I assume that the SIAI and FHI logos have to be there to acknowledge the funding, but if it weren't for that, I'd rather see them removed.
OTOH, it does in fact have a singularitarian agenda: to create or attract people sufficiently rational to do real work on FAI.
I assume that the SIAI and FHI logos have to be there to acknowledge the funding, but if it wasn't for that, I'd rather see them removed.
Surely it's better to declare your affiliations and sponsors rather than hide them, no?
That's what I was saying. LW gets their money, they get a place in the masthead. I don't have a problem with that. In fact, I'm surprised not to see an explicit acknowledgement of the sponsorship anywhere.
ETA: The foot of the ABOUT page briefly mentions that LW is "associated with" these organisations and that the founder, EY, works at the SI.
If people are claiming the rationality tagline is code for transhumanism and technophilia, why wouldn't they also claim that for a more savory tagline?
I believe the reasoning is: savoriness of tagline -> unconscious gut-assessment of site legitimacy -> choice of side to rationalize.
I'm mainly concerned about the effect the tagline will have on people who are interested, or potentially interested, in refining human rationality. I'm not sure that there's a better way to attract such people than advertising that this is actually what we're about.
I do. As a rule of thumb, anyone booming the cause of truth, rationality, or common sense is usually trying to smuggle some particular belief in unargued on the back of it. (I've been around LW long enough that that prior is no longer relevant for this particular case. It does have an agenda, but one that is explicitly argued for.)
Likewise, anyone booing truth, rationality, or common sense is usually trying to do the same, just by different means.
Voted up because it's intriguing that saying anything in favor of or against truth, rationality, or common sense is a danger sign.
That's plausible, it's just that my mind doesn't work that way. My filter is "is that interesting and plausible?" rather than "what might they be up to?".
Wait a minute, you're saying that talking about truth or rationality at all is suspicious? That'd be a pity. But now I understand the reactions of some of my acquaintances.
My mother, for instance, often says that my philosophical views stem from a desire to control everything, or even from plain fear (of death? of the unknown?). These look like personal attacks (by dissolving my beliefs into their personal, historical causes, she dismisses the beliefs themselves), but now I'm wondering if they're only rationalizations for the bottom line "those beliefs are weird, and scary, and authoritarian and cold, and I don't like them".
I suspect a lot of people do, but how strong it is varies. I would take a look before, with an eye out for crankish or biased tendencies. Of course by the time I got here I knew this was legit.
A possible way to defuse this would be to add a second tagline that motivates understanding rationality for the purposes of building it. Then the agenda becomes clearer: yes, it really is rationality, in itself, for a specific purpose.
Cut the crap, nobody cares about rationality in the abstract.
But rationality in the abstract is so interesting! (To a certain subset of the population, and maybe the people who don't belong to that subset feel alienated by ideas that seem boring to them, but that doesn't mean it's wrong to be interested in rationality because it's cool to think about.)
No. I read LW for several years exactly because I am interested in rationality in the abstract.
I find it hard to decipher what exactly you propose and why. Do you agree with the second quoted text (i.e. that rationality is a ruse to propagate transhumanism)? Do you suggest that LW should be "honest" and replace the current tagline with "transhumanist propaganda website"? Or, on the other hand, do you suggest that the ruse is imperfect and the true purpose should be better hidden?
Wouldn't dropping the rationality tagline instead convince people even more thoroughly that it's not actually about rationality, but rather something else?
That being said, I agree with those concerns. LW doesn't have an agenda per se (beyond being sponsored by the SIAI), but the LW majority clearly does. While harsh, "a bunch of fringe technophiles" accurately describes a significant, and vocal, fraction of people here.
I wonder if there is such a thing as a non-fringe technophile. Who are they and what do they think/do?
Tech is cool, but further improvement is scary? Tech is cool, but it'll never be able to do (something that sounds scary to many people but is entirely reasonable from a reductionist/materialist perspective)?
Nowhere have I seen the dangers of AI given more focus than on LessWrong; as such, I don't think LessWrong could be accurately summarized as particularly technophilic.
I'd like to see what you would describe as particularly technophilic.
Have you ever met a technophobe? Opposition to life extension. Fear of nanotubes. Tolkien's aesthetics. Attraction to "natural remedies" and distrust of conventional medicine. Shouting 'hubris!'. Saying "you can't" in tones of revulsion - not "it's not possible, look at my model" or "it would make you a bad person", just a gut rejection. Earlier, opposition to anaesthesia during childbirth and to industrialisation.
I would say something like Ray Kurzweil's "The Singularity is Near" is particularly technophilic; it embraces basically every belief that has ever been associated with the word singularity, while sweeping essentially all concerns of danger or ethics under the rug with hand-waving.
I would certainly say that LessWrongians tend to be at least slightly more technophilic than average people, but not much more than I would expect for a group that is centered around a website. Having conversations with other people, I tend to find that they appreciate modern technology, are cautiously optimistic about near-future technology, and are concerned about medium- to long-term technology; the standard LessWrong positions seem to me to be fairly in line with this, except for being far more thought-out.
EDIT: obviously my anecdotal evidence is highly incomplete and should mean very little.
LW doesn't have an agenda per se (beyond being sponsored by the SIAI), but the LW majority clearly does.
You mean that there is no official mission statement? I'm pretty sure that it is part of a mission, though.
Seems to me you could replace "fringe technophiles" with 'white guys not named Harold,' and have just as valid a statement.
I assume you mean that your description gets at what a random member of the public would likely notice first. This does seem close to the truth. (Although a more literal account of what they'd perceive first would involve some word like 'rationality' or 'reason', perhaps in connection with the term 'worship'.) But the term "fringe technophiles," by itself, would not lead anyone to expect a community that asks you to justify your belief in detail if you say technology will not destroy the world.
Except the agenda of the fraction I'm speaking about is not "technology will not destroy the world". It's "friendly AI and uploads will lead us to a bright perfect techno-utopia". And as I said multiple times before, I don't buy it.
There are several things you've said you disliked; most vocally, predictions of a (positive?) Singularity and HPMOR. However, you haven't argued against them much, just said you disliked them.
If what you're trying to do is just put up signs reading "Not everyone on LW likes this", this probably works. But I (and presumably most people who either like or dislike those things) would like to hear your arguments in more detail, preferably with some back-and-forth if you're willing to engage. What's in it for you is that it can actually improve the consensus, as opposed to sticking a little [disputed] banner on it.
Besides what MixedNuts said: your description does not get at our most LW-specific beliefs. It doesn't even get at all the claims you've said you disagreed with.
So what would be more savory? Actually: what would be more savory and not misrepresent us at the same time? I mean, there is genuine interest in rationality around here. Sure, there's also above average representation of atheism and "fringe technophile ideas", and it's no accident that this is so, but that doesn't mean the interest in rationality is a thin veneer to mask a more political or ideological cause.
Yes, it's a plausible reaction. Someone potentially interested in rationality, but used to everyone trying to "sell something", is unwilling to suppose that a site claiming to be devoted to rationality is what it claims to be, and on that basis refuses to check the site out properly.
But is that really the tagline's fault? Is it possible to do better with a different tagline? If we claim to be about something other than rationality, we just mislead potential readers and members who are interested in rationality rather than in that something else.
If I honestly want to talk about A, and you believe I'm trying to sell B, I don't think I can do any better than to keep trying to show that I'm genuinely interested in A. There's little to gain from my surrendering to your disbelief and agreeing that I'm trying to sell B, or inventing some (perhaps more plausible) C.
I actually disagree; the "Rationality" tagline is highly appropriate, especially as the singularitarians here are devoted to rational thought processes in AI research anyway.
I would, however, like to point out that the site seems to get mixed up with other groups as well:
"RationalWiki" - the highly politically partisan pro-science website (kind of an opponent to LW, apparently).
Rationalism (philosophy) - Spinoza, Leibniz, Descartes (but they pretty much ignored empiricism, so they have little to do with LW thinking)
Rationalism (Ayn Rand) - the worst kind. LW isn't a social-engineering organization, especially one committed to absolute "rational" selfishness.
So the tagline is good, but we should try not to get mixed up with those groups.
Yeah, I know, but when I talk to people about rationality they're often like, "Don't be like Ayn Rand." I think there's been a post about it somewhere.
"RationalWiki" - the highly politically partisan pro-science website (kind of an opponent to LW, apparently).
I suspect that one cause is that they get confused for LW and vice versa!
Another might be the Bayes vs. science thing.
Indeed it is written:
If you speak overmuch of the Way you will not attain it.
Concerning Less Wrong's tagline, consider this plausible reaction of someone looking at LW for the first time:
And here are two real quotes from 2009:
And in reply:
The quoted text speaks for itself, really. So I think LW's admins/web designers should seriously consider replacing the rationality tagline with something more savory.