Followup to: Why Our Kind Can't Cooperate, Cultish Countercultishness

I used to be a lot more worried that I was a cult leader before I started reading Hacker News.  (WARNING:  Do not click that link if you do not want another addictive Internet habit.)

From time to time, on a mailing list or IRC channel or blog which I ran, someone would start talking about "cults" and "echo chambers" and "coteries".  And it was a scary accusation, because no matter what kind of epistemic hygiene I try to practice myself, I can't look into other people's minds.  I don't know if my long-time readers are agreeing with me because I'm making sense, or because I've developed creepy mind-control powers.  My readers are drawn from the nonconformist crowd—the atheist/libertarian/technophile/sf-reader/Silicon-Valley/early-adopter cluster—and so they certainly wouldn't admit to worshipping me even if they were.

And then I ran into Hacker News, where accusations in exactly the same tone were aimed at the site owner, Paul Graham.

Hold on.  Paul Graham gets the same flak I do?

  • Paul Graham has written a word or two about rationality... in a much more matter-of-fact style.
  • Paul Graham does not ask his readers for donations.  He is independently wealthy.
  • Paul Graham is not dabbling in mad-science-grade AI.  He runs Y Combinator, a seed-stage venture fund.
  • Paul Graham is not trying to save the world.  He's trying to help a new generation of entrepreneurs.

I've never heard of Paul Graham saying or doing a single thing that smacks of cultishness.  Not one.

He just wrote some great essays (that appeal especially to the nonconformist crowd), and started an online forum where some people who liked those essays hang out (among others who just wandered into that corner of the Internet).

So when I read someone:

  1. Comparing the long hours worked by Y Combinator startup founders to the sleep-deprivation tactic used in cults;
  2. Claiming that founders were asked to move to the Bay Area startup hub as a cult tactic of separation from friends and family;

...well, that outright broke my suspension of disbelief.

Something is going on here which has more to do with the behavior of nonconformists in packs than whether or not you can make a plausible case for cultishness or even cultishness risk factors.

But there are aspects of this phenomenon that I don't understand, because I'm not feeling what they're feeling.

Behold the following, which is my true opinion:

"Gödel, Escher, Bach" by Douglas R. Hofstadter is the most awesome book that I have ever read.  If there is one book that emphasizes the tragedy of Death, it is this book, because it's terrible that so many people have died without reading it.

I know people who would never say anything like that, or even think it: admiring anything that much would mean they'd joined a cult (note: Hofstadter does not have a cult).  And I'm pretty sure that this negative reaction to strong admiration is what's going on with Paul Graham and his essays, and I begin to suspect that not a single thing more is going on with me.

But I'm having trouble understanding this phenomenon, because I myself feel no barrier against admiring Gödel, Escher, Bach that highly.

In fact, I would say that by far the most cultish-looking behavior on Hacker News is people trying to show off how willing they are to disagree with Paul Graham.  Let me try to explain how this feels when you're the target of it:

It's like going to a library, and when you walk in the doors, everyone looks at you, staring.  Then you walk over to a certain row of bookcases—say, you're looking for books on writing—and at once several others, walking with stiff, exaggerated movements, select a different stack to read in.  When you reach the bookshelves for Dewey decimal 808, there are several other people present, taking quick glances out of the corner of their eye while pretending not to look at you.  You take out a copy of The Poem's Heartbeat: A Manual of Prosody.

At once one of the others present reaches toward a different bookcase and proclaims, "I'm not reading The Poem's Heartbeat!  In fact, I'm not reading anything about poetry!  I'm reading The Elements of Style, which is much more widely recommended by many mainstream writers."  Another steps in your direction and nonchalantly takes out a second copy of The Poem's Heartbeat, saying, "I'm not reading this book just because you're reading it, you know; I think it's a genuinely good book, myself."

Meanwhile, a teenager who just happens to be there, glances over at the book.  "Oh, poetry," he says.

"Not exactly," you say.  "I just thought that if I knew more about how words sound—the rhythm—it might make me a better writer."

"Oh!" he says, "You're a writer?"

You pause, trying to calculate whether the term does you too much credit, and finally say, "Well, I have a lot of readers, so I must be a writer."

"I plan on being a writer," he says.  "Got any tips?"

"Start writing now," you say immediately.  "I once read that every writer has a million words of bad writing inside them, and you have to get it out before you can write anything good.  Yes, one million.  The sooner you start, the sooner you finish."

The teenager nods, looking very serious.  "Any of these books," gesturing around, "that you'd recommend?"

"If you're interested in fiction, then definitely Jack Bickham's Scene and Structure," you say, "though I'm still struggling with the form myself.  I need to get better at description."

"Thanks," he says, and takes a copy of Scene and Structure.

"Hold on!" says the holder of The Elements of Style in a tone of shock.  "You're going to read that book just because he told you to?"

The teenager furrows his brow.  "Well, sure."

There's an audible gasp, coming not just from the local stacks but from several other stacks nearby.

"Well," says the one who took the other copy of The Poem's Heartbeat, "of course you mean that you're taking into account his advice about which books to read, but really, you're perfectly capable of deciding for yourself which books to read, and would never allow yourself to be swayed by arguments without adequate support.  Why, I bet you can think of several book recommendations that you've rejected, thus showing your independence.  Certainly, you would never go so far as to lose yourself in following someone else's book recommendations—"

"What?" says the teenager.

If there's an aspect of the whole thing that annoys me, it's that it's hard to get that innocence back, once you even start thinking about whether you're independent of someone.  I recently downvoted one of PG's comments on HN (for the first time—a respondent had pointed out that the comment was wrong, and it was).  And I couldn't help thinking, "Gosh, I'm downvoting one of PG's comments"—no matter how silly that is in context—because the cached thought had been planted in my mind from reading other people arguing over whether or not HN was a "cult" and defending their own freedom to disagree with PG.

You know, there might be some other things that I admire highly besides Gödel, Escher, Bach, and I might or might not disagree with some things Douglas Hofstadter once said, but I'm not even going to list them, because GEB doesn't need that kind of moderation.  It is okay for GEB to be awesome.  In this world there are people who have created awesome things and it is okay to admire them highly!  Let this Earth have at least a little of its pride!

I've been flipping through ideas that might explain the anti-admiration phenomenon.  One of my first thoughts was that I evaluate my own potential so highly (rightly or wrongly is not relevant here) that praising Gödel, Escher, Bach to the stars doesn't feel like making myself inferior to Douglas Hofstadter.  But upon reflection, I strongly suspect that I would feel no barrier to praising GEB even if I weren't doing anything much interesting with my life.  There's some fear I don't feel, or some norm I haven't acquired.

So rather than guess any further, I'm going to turn this over to my readers.  I'm hoping in particular that someone used to feel this way—shutting down an impulse to praise someone else highly, or feeling that it was cultish to praise someone else highly—and then had some kind of epiphany after which it felt, not allowed, but rather, quite normal.

 

Part of the sequence The Craft and the Community

Next post: "On Things that are Awesome"

Previous post: "Tolerate Tolerance"

121 comments

I recently read an article on charitable giving which mentioned how people split up their money among many different charities to, as they put it, "maximize the effect", even though someone with this goal should donate everything to the single highest-utility charity. And this seems a bit like the example you cited where, if blue cards come up randomly 75% of the time and red cards 25% of the time, people bet on blue 75% of the time even though the optimal strategy is to bet on blue 100% of the time. All this seems to come from concepts like "Don't put all your eggs in one basket", which is a good general rule for things like investing but can easily break down.
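The arithmetic behind that card example is easy to check. A minimal sketch, using just the 75/25 split from the example above (the payoff of a correct guess is taken as 1, an assumption for simplicity):

```python
# Blue comes up 75% of the time, red 25%.
p_blue = 0.75

# Probability matching: guess blue 75% of the time, red 25% of the time.
# P(correct) = P(guess blue)*P(blue) + P(guess red)*P(red)
matching = p_blue * p_blue + (1 - p_blue) * (1 - p_blue)

# Maximizing: always guess the more likely color.
maximizing = max(p_blue, 1 - p_blue)

print(matching, maximizing)  # 0.625 0.75 -- matching loses
```

So the matcher is right only 62.5% of the time, versus 75% for the all-in strategy.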

I find myself having to fight this rule for a lot of things, and one of them is beliefs. If all of my opinions are Eliezer-ish, I feel like I'm "putting all my eggs in one basket", and I need to "diversify". You use book recommendations as a reductio, but I remember reading about half the books on your recommended reading list, thinking "Does reading everything off of one guy's reading list make me a follower?" and then thinking "Eh, as soon as he stops recommending such good boo... (read more)

I tried to start a Hofstadter cult once. The first commandment was "Thou shalt follow the first commandment." The second commandment was "Thou shalt follow only those even-numbered commandments that do not exhort thee to follow themselves." I forget the other eight. Needless to say it didn't catch on.

You just didn't give it enough time. Remember, it always takes longer than you expect!

I find myself having to fight this rule for a lot of things, and one of them is beliefs. If all of my opinions are Eliezer-ish, I feel like I'm "putting all my eggs in one basket", and I need to "diversify"

See also Robin Hanson's post on Echo Chamber Confidence.

You use book recommendations as a reductio, but I remember reading about half the books on your recommended reading list, thinking "Does reading everything off of one guy's reading list make me a follower?"

I think that of all the people who have ever recommended books to me, Eliezer has the most recommendations which I've actually followed. In most of my social circles, I'm the "smart one", but I'm nowhere near as smart as Eliezer (or most other people on LessWrong, it seems). So I do admire EY a lot. I want to be as smart as he is, and so I try reading all the books he has read.

And it kills me, because I also remember his post about novice editors copying the surface behavior of master editors, without integrating the deep insight, and I know that by reading the same science fiction novels EY has read, I'm committing exactly the same sin. But I don't know what else I can do to try to improve myself.

how people split up their money among many different charities to, as they put it, "maximize the effect", even though someone with this goal should donate everything to the single highest-utility charity.

If I have complete or near-complete trust in the information available to me about the charity's utility, as well as its short-term sustainability, that seems like the right decision to make.

But if I don't - if I'm inclined to treat data on overhead and estimates of utility as very noisy sources of data, out of skepticism or experience - is it irrational to prefer several baskets?

Similarly with knowledge and following reading lists, ideologies and the like.

Yes, even with great uncertainty, you should still put all your eggs into your best basket.

thomblake (+4, 15y):
Did you mean this as a general rule, or specifically about this topic? The literal example of eggs seems to indeed work well with multiple baskets, especially if they're all equally good.

Specifically on this topic.

The expected number of eggs lost is least if you choose the best basket and put all your eggs in it, but because of diminishing returns, you're better off sacrificing a few eggs to reduce the variance. However, your charitable donations are such a drop in the ocean that the utility curve is locally pretty much flat, so you just optimise for maximum expected gain.
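That tradeoff is easy to make concrete. A minimal sketch with made-up numbers (two identical all-or-nothing baskets, each independently succeeding with probability 0.9 and doubling its share): under a locally flat, linear utility curve — the "drop in the ocean" case — the split is irrelevant, while under a concave, diminishing-returns curve, splitting wins by cutting variance:

```python
import math

stake = 100.0   # total to allocate
p = 0.9         # each basket independently "succeeds" and doubles its share

def expected_utility(in_a, utility):
    """Expected utility of putting in_a in basket A and the rest in B."""
    a, b = in_a, stake - in_a
    outcomes = [
        (p * p,             2 * (a + b)),  # both succeed
        (p * (1 - p),       2 * a),        # only A succeeds
        ((1 - p) * p,       2 * b),        # only B succeeds
        ((1 - p) * (1 - p), 0.0),          # both fail
    ]
    return sum(prob * utility(x) for prob, x in outcomes)

linear = lambda x: x               # locally flat utility curve
concave = lambda x: math.log1p(x)  # diminishing returns

# Linear utility: the expected value is the same however you split.
assert abs(expected_utility(100.0, linear) - expected_utility(50.0, linear)) < 1e-9

# Concave utility: a 50/50 split beats all-in, because it reduces variance.
assert expected_utility(50.0, concave) > expected_utility(100.0, concave)
```

With linear utility both allocations have expected value 180, so you just pick the basket with the better odds; with log utility the split strictly wins.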

Paul Crowley (+2, 15y):
This follows from the expected utility of the sum being the sum of the expected utility?

It follows from the assumption that you're not Bill Gates, don't have enough money to actually shift the marginal expected utilities of the charitable investment, and that charities themselves do not operate in an efficient market for expected utilons, so that the two top charities do not already have marginal expected utilities in perfect balance.

And that you care only about the benefits you confer, not the log of the benefits, or your ability to visualize someone benefited by your action, etc.

Paul Crowley (+5, 15y):
I don't see how either of these affects this result - unless you're saying it's easier to visualise one person with clean water and another with a malaria net than it is two people with clean water?

Eliezer Yudkowsky (+7, 15y):
The sum of the affect raised is greater.

Paul Crowley (0, 15y):
I don't understand I'm afraid, can you unpack that a bit please? Thanks.

Consider scope insensitivity. The amount of "warm fuzzies" one gets from helping X numbers of individuals with a given problem does not scale even remotely linearly with X. Different actions to help with distinct problems, however, sum in a much closer to linear fashion (at least up to some point).

Ergo, "one person with clean water and another with a malaria net" feels intuitively like you're doing more than "two people with clean water".

orthonormal (+3, 15y):
Well, not when you compare them against each other, but only when each is considered on its own: it's like this phenomenon.

arundelo (+3, 15y):
I think it means: the sum of the feel-good points of giving one person clean water and another a malaria net will, for most people, be higher than the feel-good points of giving two people clean water.

Paul Crowley (+5, 15y):
I'd like to get right whatever it is I'm doing wrong here, so if anyone would like to comment on any problems they see with this or the parent comment (which are both scored 0) I'd be grateful for your input. EDIT: since this was voted down, but I didn't receive an explanation, I'm assuming it's just an attack, and so I don't need to modify what I do - thanks!

Anatoly_Vorobey (0, 15y):
I suspect that the ability to visualize someone benefited by your action is often a proxy for being certain that your action actually helped someone, and that people often place additional value on that certainty. They might not be acting as perfectly rational economic agents in such cases, but I'm not sure I'd call such behavior irrational.

[anonymous] (-1, 12y):
It doesn't matter what we call a behavior. If it can be improved, it should be.

private_messaging (-10, 12y)

But if I don't - if I'm inclined to treat data on overhead and estimates of utility as very noisy sources of data, out of skepticism or experience - is it irrational to prefer several baskets?

Very much so. Rational behavior is to maximize expected utility. When rational agents are risk-averse, they are risk-averse with respect to something that suffers from diminishing returns in utility, so that the possibility of negative surprises outweighs the possibility of positive surprises. "Time spent reading material from good sources" is a plausible example of something that has diminishing returns in utility so you want to spread it among baskets. Utility itself does not suffer from diminishing returns in utility. (Support to a charity might, but only if it's large relative to the charity. Or large relative to the things the charity might be doing to solve the problem it's trying to solve, I guess.)

[anonymous] (+6, 15y):
In the case of reading, I can see the benefit of not putting all of your eggs in one basket. All of us have biases, however hard we try not to, and by reading the same books you are perhaps allowing your biases to be shaped along the same lines as Eliezer's. By having more of your formative reading come from outside this, you increase your chance of being able to challenge those biases. This is especially true if you want to write in the same area as Eliezer, as it increases your ability to contribute in a different way.

AnnaSalamon (+5, 15y):
Do you have a mechanistic unpacking (even a guess would be helpful) of what it is to be a "sheeple" or a "cult", and of what harms come from being a "sheeple"? Given Aumann, I'm more inclined to say that if two people have different beliefs, at least one of them isn't thinking. That said, your point about respecting outside views is reasonable. Are you trying to avoid replacing the outside-presumed "badness" of cults/sheeple with understood mechanisms, so as to retain any usefulness that might be in the received heuristics and that you might not understand the mechanisms behind?

Scott Alexander (+9, 15y):
By sheeple and cult, I mean people whose good judgment is clouded by the mechanisms described in the Affective Death Spiral sequence.

pnkflyd831 (0, 15y):
It would be great to add a link to the article on charitable giving you refer to, to see if it already concludes or dismisses my idea on the issue. From observations of those around me, I tend to see the reason behind charitable giving as something other than maximizing the utility of the gift. I postulate that people give to many different charities as a social signal. The contributor is signaling to those receiving the gift that they sympathize with the cause. The contributor is also signaling to those around them that they are a caring and compassionate person. The quantity of the gift has an almost negligible effect on this signaling. So giving more times, to more charities, lets someone signal positive social mores more often and to a larger audience, raising their social status higher than if they gave all their expendable money to one charity a limited number of times.

PG runs a discussion site. He's using it as a sort of wide-flung net to catch worthy candidates for the "inner circle" - startup founders who get into his YC program - and is quite open about it (e.g. he explicitly says that YC submissions will, among other things, be judged on how well their authors are known as HN commenters and how worthy their comments have been judged to be). Why is it surprising that this creates a cult atmosphere of sorts?

Before Hacker News, PG was already famous in the relevant community for his essays, which are often credited, among other things, for the modern revival of interest in Lisp (this is probably an exaggeration). Nobody called him a cult leader back then.

Joel Spolsky is a famous blogger in the programming/CS/IT niche; he has an active discussion forum on his site. Lots of people respect him, lots of other people look down on his posts. Nobody calls him a cult leader.

RMS doesn't even have a discussion forum, and doesn't write a blog. He browses the web through an email-mediated wget; that's not even Web 1.0, it's Web -0.5 or something. He's widely considered to be a cult leader.

I'd guess that to make people think you're behaving like a c... (read more)

Are you aware of the irony in saying Eliezer "won't shut up" about a topic he has demanded everybody shut up about?

I am. I view it as evidence that he recognizes the filtering effect these topics have brought to OB, and intends LW to build a community diverse and independent enough to not let itself be dominated by these topics, unless it so chooses. It's a smart decision.

John_Maxwell (+5, 15y):
One small step that Eliezer could take with regard to (4), I think, would be to renounce his right to decide which posts are featured and make it entirely dependent on post score.

The "top" page is already entirely dependent on post score. I'd strongly prefer that there stay some kind of editorial filter on some aspect of LW; we're doing great right now as a community, but many online communities start out high-quality and then change as their increased popularity changes the crowd and the content.

CarlShulman (+1, 15y):
IAWYC, no 'but.'

CarlShulman (+5, 15y):
I generally agree with your points, and draw special attention to the last sentence: "you're also prone to present criticism against you as a result of a trendy choice to stand up to a perceived cult leader (this is a dangerous stance for oneself to adopt, even when true)." I'm not sure to what extent this is a double instance of the recency effect (Anatoly's last sentence, and referring to Eliezer's most recent post), but it's something to be avoided regardless.

Paul Crowley (+3, 15y):
Can you give an example where EY has been the first to bring up the whole cult thing?

I don't know if that ever happened, and I didn't mean to imply he had been. Suppose someone tells you that you've been acting like a cult leader. Even if you don't agree with the claim, you've just obtained a convenient meta-explanation of why people disagree with you: they're consciously standing up to the cult that isn't there; they're being extra contrarian on purpose to affirm their cherished independence. What I was trying to say is that it's generally dangerous to adopt this meta-explanation; you're better off refusing to employ it altogether or at least guard its use with very stringent empirical criteria.

Eliezer Yudkowsky (+3, 15y):
I wish I could agree with that, but you can't actually refuse to employ explanations. You might be able to refuse to talk about it, but you don't get a choice of which of several causal explanations gets to be true.

You can try to correct for the self-serving temptation to overapply a certain explanation.

Anatoly_Vorobey (+6, 15y):
Why not? Sometimes I manage to refuse to employ as many as five explanations before breakfast. You can't pretend that the explanation doesn't exist if it occurred to you. But you certainly can refuse to act upon it, not just talk about it. Which among competing explanations for human behavior is true is almost never certain; it's perfectly possible to bias yourself against one common explanation and by doing so avoid the more harmful, and very probable, outcome of oversubscribing to it.

[anonymous] (+3, 15y):
You can try to correct for the temptation for the self-serving application to overapply a certain explanation.

So... just for the record... this post got up to #1 on HN, and then HN crashed, and is, so far as I can tell, still down a couple of hours later.

When you consider that the Less Wrong site format was inspired by HN, that LW is based on Reddit source code, and that Reddit is a Y Combinator company, I guess that writing about Paul Graham and then getting voted up on Hacker News exceeded the maximum recursion depth of the blogosphere.

This would be an excellent time for a "stack overflow" joke, if only Spolsky could be worked in somehow.

And here you are commenting on HN going down, and here's the guy who submitted this to HN replying to your comment.

"I guess that writing about Paul Graham and then getting voted up on Hacker News exceeded the maximum recursion depth of the blogosphere."

Just wait until PG writes an essay about all this...

Picture of Eliezer in monk's robes (That is you, right?), stories about freemason-esque rituals, specific vocabulary with terms like, "the Bayesian conspiracy".

It's all tongue in cheek, and I enjoy it. But if you're trying to not look like a cult, then you're doing it wrong.

PrometheanFaun (0, 10y):
I disagree. I think it's so easy for a community with widespread, genuine conviction as to their shared radicals to look like a cult that, well, anyone willing to go through the rather extreme rigors of preventing anyone from seeing you as cult-like... methinks they protest too much. I say we are, though far from being a cult, cultlike. We are weird, and passionate, and that's all it takes.

It seems to me that only a few groups get the label "cultish", so it's not like people put the label on any group with an apparent leader. Such selective labels probably contain a lot of info, so it seems worth figuring out just what that info is. It is not wise to just find one group that gets the label which you think is fine, and then decide to ignore the label.

The straightforward approach would be to collect a dataset of groups, described by various characteristics, including how often folks call them "cultish." Then one would be in a position to figure out what this label actually indicates about a group.

I find myself moved to break possibly the greatest taboo amongst our kind, but if this act of status suicide moves just one reader to action, the sacrifice is worth it.

OK, here goes...

"Gödel, Escher, Bach" by Douglas R. Hofstadter is the most awesome book that I have ever read.

Me too!

This whole concept is confusing to me. I enjoy Eliezer's writing because it makes sense and is useful, so it becomes part of my identity. I haven't found as many of his newer posts to be useful, so fewer of them are drafted into my identity. My 'self' is largely a collection of ideas and thoughts transmitted to me from other people, and I don't find anything wrong with this. I do hope to produce useful knowledge myself, but for right now I am educating myself to that point.

If I find a useful tool lying on the ground then I pick it up and use it, I do not try to recreate the tool from scratch in order to make it 'mine', which I feel is a meaningless concept. As long as my beliefs and skills pay for themselves in terms of useful benefits to my life I don't see the point in throwing them away because they came from someone else. I don't care who I am and I am not attached to any specific view of my self other than to try to pick the most effective tools to accomplish some core goals and values.

thomblake (+4, 15y):
Then you're a very odd person. So you regard your 'core goals and values' to not be part of your self? And you're really implying that you can just pick things about yourself simply as tools to accomplish your goals? I don't think I've ever met a person that can do that. Usually, there are facts about who we are and we need to work very hard to do anything about them.

IAWYC but I think you forgot to include something about jealousy in your analysis, even if few people would admit it's part of it.

I think it's very possible to greatly admire someone and at the same time feel some form of jealousy that inhibits the clear expression of that admiration. By saying that someone else is better (much better) than you are - especially at something that you value - you are in effect admitting to a lower status.

So all the forced disagreements and claims of independence are in effect just trying to signal that your status is high and you're not submissive, or something like that.

[anonymous] (15y):

When you read someone's writings or follow the things they do but don't actually KNOW them, it's very easy to get sucked into a sort of 'larger-than-life' belief about them.

Because they're famous (and they must be famous, because you've heard of them), they're obviously different and special and above regular, normal people. I've found it takes conscious effort to remember that no matter how famous or smart or talented they are, in the end they're just some guy or girl, with the same flaws as everyone else.

And when you think someone's larger-than-life, it's easy to praise highly, because you're not thinking of them as a normal person, you're thinking of them as ABOVE normal people. That they are special. In light of this, it's easy to see why praise for someone or something, no matter what it is, can be seen as cultish, and how you can fall into the trap of believing praise for anything is cultish.

Regarding this, it's really helpful when Eliezer mentions that he borrowed this or that part of his philosophy from a piece of anime fanfiction. It helps humanize him, or worse.

For what it's worth I don't think you've deliberately set out to become a "cult leader" -- you seem like a sincere person who just happens to be going about life in a rather nonstandard fashion. You've got some issues with unacknowledged privilege and such, and I've gotten impressions from you of an undercritical attraction to power and people who have power, but that's hardly unique.

I think mostly it's that you confuse people via sending off a lot of signals they don't expect -- like they think you must have some weird ulterior motive for not having gone to college, and instead of seeing public discussion of your own intellect as merely the result of somewhat atypical social skills, it's seen as inexcusable arrogance.

That said, because of my own negative experience(s) with people who've seemed, shall I say, rather "sparkly" at first, but who HAVE turned out to be seeking puppy-dog supplicants (or worse), I tend to be very very cautious these days when I encounter someone who seems to attract a fan club.

With you I've gone back and forth in my head many times as to whether you are what you first struck me as (a sincere, if a bit arrogant, highly ambitious guy) ... (read more)

HughRistik (+2, 15y):
I'm interested in what you mean here. Could you give examples?

AnnaSalamon (+1, 15y):
What benefits might you expect?

Well, for one thing, privilege is a major source of bias, and when a person doesn't even realize they (or those they admire) have particular types/levels of privilege, they're going to have a harder time seeing reality accurately.

E.g., when I was younger, I used to think that racism didn't exist anymore (that it had been vanquished by Martin Luther King, or something, before I was even born) and didn't affect anyone, and that if someone didn't have a job, they were probably just lazy. Learning about my own areas of privilege made it possible for me to see that things were a lot more complicated than that.

Of course it's possible for people to go too far the other way, and end up totally discounting individual effort and ability, but that would fall under the category of "reversed stupidity" and hence isn't what I'm advocating.

(And that's all I'm going to say in this thread for now - need to spend some more time languaging my thoughts on this subject.)

IAWY, and I actually already replied to your question about this in a comment, but:

One of the prime issues for me as a rationalist trying to learn about marketing (especially direct/internet marketing) was having to get over the fear of being a "dupe" pulled into a "scam" and "cult" situation. Essentially, if you have learned that some group you scorn (e.g. "suckers" or "fools" or whatever you call them) exhibit joining behavior, then you will compulsively avoid that behavior yourself.

I got over it, of course, but you have to actually be self-aware enough to realize that you chose this attitude/behavior for yourself... although it usually happens at a young enough age and under stressful enough conditions that you weren't thinking very clearly at the time.

But once you've examined the actual evidence used, it's possible to let go of the judgments involved, and then the feelings go away.

In other words, persons who have this issue (like me, before) have had one or more negative social experiences linking these behaviors to a disidentified group -- a group the person views negatively and doesn't want to be a part of. It's a powerfu... (read more)

Alright! A few points that I can sort of disagree on, or that I feel were omitted in the essay. I'm being skeptical, not a cultist at all!

My fears aren't really that you're trying to foster a cult, or that it's cultish to agree with you. I got worried when you said that you wanted more people to vocalize their agreement with you and actually work towards having a unified rationalist front. For some reason, I had this mental picture of you as a supervillain declaring your intention to take over the world. So I reflected that I was doing things, somewhat unco... (read more)

topynate (15y, 2 points)
Maybe you've been primed? (see the end of the post)

I agree with your conclusion, and I love your library allegory. It's pretty clear that America fears strong emotions in general, and also that "our type" learns cached patterns of ritually approved nonconformity.

That said, some may be balking, not at admiring someone hugely, but at forming nearly their entire manner of evaluating ideas from a single person, without independent sources of evidence that can label that person "trustworthy". Anne Corwin reports fearing networks of abstractions that distance people from their own concre... (read more)

"But upon reflection, I strongly suspect that I would feel no barrier to praising Gödel, Escher, Bach even if I weren't doing anything much interesting with my life."

You don't feel yourself to be in status competition with Hofstadter, do you? Or with E.T. Jaynes, for that matter. Think about effusively praising Nick Bostrom as the last best hope for the survival of humane values, instead.

"I'm hoping in particular that someone used to feel this way - shutting down an impulse to praise someone else highly, or feeling that it was cultish to praise so... (read more)

Eliezer Yudkowsky (15y, 4 points)
Point taken. But that example isn't generic status competition, it's role competition. Are so many people calling PG a cult leader really in role competition with him? For what? Are there so many commenters at this site in role competition with me? I think you have a valid point here about a factor that would make admiration directed at another seem bad, but can it plausibly be that particular factor which is at work here? (Edited to make clear the difference between status competition and role competition.)
CarlShulman (15y, 6 points)
They're socially engaged with him and his web community. Status competition doesn't have to mean preparation for direct overthrow, it can also mean efforts to reduce the size of status gaps relative to current superiors. Demonstrating or admitting inferiority to someone in the immediate social hierarchy pushes the low lower, while successfully tearing down a superior raises the tearer while lowering the torn, even if only marginally.
topynate (15y, 4 points)
Great point, and I think that the "competition", if there is competition, isn't with PG or EY but with everyone else.
Paul Crowley (15y, 7 points)
There is an essay about this by Pavel Curtis, creator of LambdaMOO - he would frequently find that newcomers would respond to his perceived status by being exaggeratedly rude to his character, showing off that they were prepared to stand up to the Man, so long as of course they could do it in perfect safety under the cover of anonymity.
Paul Crowley (13y, 7 points)
Found it: Mudding: Social Phenomena in Text-Based Virtual Realities
Rain (13y, 0 points)
'Exaggerated rudeness' could also be a product of the greater internet dickwad theory. Attitudes toward MUD wiz teams are also partly political, since the leaders are often dictators of the local environment.

Figured, since this was linked to again, that I might as well say some of what I think on this:

My reaction is more, well, a couple of things, but part of it could be described like this: Yes, I do indeed admire you and think you're cool... and my natural instinctive reaction to you is kinda, well, fanboyish, I guess. Hence I try to moderate that... TO AVOID BEING ANNOYING... that is, to avoid, say, annoying you, for instance.

If you can do that quietly without anyone noticing, you're doing it right. If you make a big deal out of it to prove something to other people, you're doing it wrong. Should be obvious, really.

Psy-Kosh (14y, 3 points)
Well, yes. :)

If you have 'teachings' rather than suggestions or opinions, and you can't support those claims in a systematic and explicit way, then it doesn't much matter whether you intended to propagate a cult - that's precisely what you're doing.

I'm afraid to read GEB now. It's been built up so high that the only reactions I could possibly have are "as good as everybody else thinks it is" or "didn't live up to expectations", with the latter being far more likely.

Let me try to help you. Many people who praise GEB in the highest terms and recommend that everyone read it never finished it. Many read all the dialogues, but only some of the chapters. I have absolutely no data to support turning either of the previous "many" to "most", but wouldn't be surprised by either possibility.

GEB's most important strength, by far, is in giving you a diverse set of metaphors, thought-patterns, paradoxes and ways to resolve them, unexpected connections between heretofore different domains. It enlarges your mental vocabulary - quite forcefully and wonderfully if you haven't encountered these ideas and metaphors before. It's like a very, very entertaining and funny dictionary of ideas.

The exposition of various topics in theory of computation, AI, etc. that it also contains is not as important by comparison, and isn't the best introduction to these topics (it's still good and may well be very enjoyable, depending on your background and interest).

So there's no reason to fear reading GEB. You'll chuckle with recognition at the jokes, metaphors, notions that you've already learned elsewhere, and will be delighted at those you've never seen before. Read all the dialogues; if some of the chapters bore you, resist guilt tripping and skip a few - you'll come back to them later if you need them.

scotherns (15y, 3 points)
I'm about halfway through. I am finding the chapters to be much more interesting than the dialogues. The style the dialogues are written in seems rather stilted/forced and grates somewhat. They do seem to be useful metaphors for understanding some of the trickier chapters, so I can see the merit in them.
gwern (15y, 7 points)
/me looks up from the 'Crab Canon'. Wait, what?
CronoDAS (15y, 5 points)
I thought GEB was a very good book, but much of it consisted of things I had already learned from several other sources (formal logic, Gödel's incompleteness theorem, and some others), and I also found reading it to be slow and effortful - not the kind of book that keeps me compulsively turning pages, hungry for more. I also don't know how well it conveys its concepts to someone who has never heard of them before, because I already understood many of the concepts that GEB painstakingly explains to an audience that hasn't already taken three university courses on formal logic. (And I did read every last word of it!)

GEB has been described as "an entire humanistic education in one volume", and that's not far from the truth; I could easily imagine using it as the textbook for several consecutive semesters' worth of university classes. It starts with some basic concepts that the proverbial smart 12-year-old could understand, and builds on them, step by step, until it is covering concepts and applications at the level of graduate school. The book is obscenely ambitious in its scope - imagine a single book that assumes the reader has never taken a course in algebra and aims to teach that reader enough that, by the end, he or she can understand and discuss university-level calculus - and you'll picture something much like GEB.

So, yeah, GEB really is amazing, but it's also a headache-inducing monstrosity that will try to cram your head with concepts until it explodes.
thomblake (15y, 2 points)
I agree with some of the sentiments below. Also, I became a philosopher before reading GEB, and didn't really find anything particularly enlightening in it. I still recommend "The Future and Its Enemies" much more highly, and it's more of a fun read (even though it's a bit dated now).

IMO being accused of wanting to be a cult leader is a pure double bind. You either say "yes, I do" and then you're a cult leader, or you say "what? that's crazy because of X, Y, Z..." and then people point at your protestations as evidence that their arguments have some minimal credibility (I am sure someone will do this to EY at some point). It is, prima facie, evident to me that talking to people on the internet about rationality is a poor method of getting acolytes (and even if it were a good one for some people, the Objectivists alr... (read more)

gwern (15y, 3 points)
Of course, being accused of being a cult is itself weak Bayesian evidence of being a cult (much like being accused of child abuse). Being accused would raise people's estimates, and the only way to lower them is a good defense; that their estimates might not go all the way back to the originals is not enough for the choice to deny or not deny cultishness to be a double-bind, I think.

If there's an aspect of the whole thing that annoys me, it's that it's hard to get that innocence back, once you even start thinking about whether you're independent of someone.

Cross-referencing my comment on a different post for a related idea:

Your brain remembers which "simple" predictor best described your decision [. . .]

Your brain learns to predict other people's judgments by learning which systems of predictive categories other people count as "natural". If you have to predict other people's judgments a lot, your brain starts

... (read more)

Ok, I'm coming out and will admit that I admire you, Eliezer, very highly. I think you are the one who has taught me the most about rationality and what intelligence is all about.

Now, I admit that in my past I have fallen into the "adore the guru" trap, so I still have this fear in my head and am cautious not to make the same mistake again. The cult threads here are helping me to evaluate my position carefully.

But I like what you wrote about that innocence of being able to experience real admiration and excitement. I think if you let your critical think... (read more)

You've changed my beliefs and thinking more than anyone outside my family, by a pretty huge margin. This makes me far more likely to raise something to the level of being worth paying attention to just because you've recommended it (as it should), but it also makes me careful on a gut level every time I'm considering adopting yet another belief from you.

I think this is partly because of what you describe in this post, but partly because I know a lot of the existing beliefs I have that will be inclining me to accept the new belief came themselves from you. I'm... (read more)

My theory about myself:

I don't think people will believe me if they recognize my views as the typical LW-cluster views. They'll just dismiss them.

Which is really rational of them, actually. I think I use the same heuristic. Once I see that someone's beliefs come from a political affiliation, they're weaker evidence to me.

Like... if someone's trying to convince me out of global warming, but then I learn that she's also against affirmative action and immigration and regulation on finance. At first I might have thought she read convincing scientific arguments... (read more)

The word "cult" seems to be used in a very vague sense by everyone, and people have different definitions. Here's something I wrote about Paul Graham's and a few other "cults". It's only vaguely relevant, as I used the label "cult" differently.

If you are not into Paul Graham's cult / meme complex, and you hear people who really are - talking about how working 100 hours a week on a built-to-sell startup is the best way to prove your worth as a hacker and a human being - they really sound like "cult" members.

Eliezer Yudkowsky (15y, 2 points)
Can you link to an example of someone who sounds like a Paul Graham cult victim?
Anatoly_Vorobey (15y, 2 points)
A recent example.

Explanation: Emotional overexcitability, a trait common to gifted people (and yes, there is good reason to believe that most LessWrongers are gifted), may cause LW and Hacker News fans to be extra excitable and intense. You've probably heard that gifted people tend to be more emotional? Well, on your LessWrong survey, your respondents claimed an average IQ in the 140s, well beyond the minimum for every IQ-based definition of giftedness. If these readers are unusually emotionally intense, as gifted people tend to be, it's likely their unusual "electricity"... (read more)

[anonymous] (11y, 1 point)
The IQ survey is obviously flawed.
Epiphany (11y, 6 points)
We all know that a survey is not proof. It was intended as a thought-provoking clue, not as proof. Everyone here is capable of considering whether LW and Hackernews are full of gifted people. This is niggling.
[anonymous] (11y, −7 points)

(note: Hofstadter does not have a cult)

Douglas Hofstadter's research group is apparently quite cultish: it's close-knit, dominated by a single person, intolerant of disagreement, and has little intellectual interaction with the remainder of the field.

This doesn't make GEB less than excellent. It merely partially explains why they haven't made much progress since.

My personal test for whether you're my "cult leader" or just a good teacher is how I react when I think you're wrong. If you are merely a teacher, then I will sit down and work out exactly why you're right from base principles, and I'll admit it if I'm confused or if I think you are genuinely wrong. Given how many times in the sequences I've spent a few hours working things out, I feel safe here.

A good teacher says "here is something worth understanding" rather than "here is the teacher's password" - it is a willingness to... (read more)

[anonymous] (15y, 20 points)

I know the opposite of stupidity is still stupidity, but every time I see some idiotic attempt to gain status by pointing out how "everyone else except me seems to revere Eliezer too much", I have to restrain myself from reacting in the other direction and worshiping the guy.

I used to have the idea that finding flaws in something (a piece of writing or entertainment or an idea or a person) made me better than the person or the creator of the thing I was criticizing. Then I realized two things which got me to stop: 1) Critics are parasites; they don't generally produce anything that valuable and entertaining themselves, and even beautifully written reviews are pretty low on my list of things to read for edification or fun. 2) When I go around finding flaws in everything, I stop enjoying it, and living a life where I can't enj... (read more)

Critics are parasites; they don't generally produce anything that valuable and entertaining themselves

Debunking mistaken hypotheses is just as important as coming up with new ones. Otherwise our heads would be so filled with confused theories that we could never develop the correct ones.

Scott Alexander (15y, 2 points)
Having recently posted on the relevance of Pope's poetry to rationalism, I can't help quoting him one more time here:

[anonymous] (15y, 10 points)

Leaving aside the valid points about overrating particular experts when you have limited exposure to opposing viewpoints on the subject matter: cult-like behavior doesn't even require an intentional cult leader. Paul Graham doesn't have to willfully cultivate that type of following for some of it to arise spontaneously out of the social structures and participants around him.

Frequently agreeing with someone who has a lot of good ideas, and who also has high status in a community that you're a member of, is not inherently bad. But once you ... (read more)

[anonymous] (15y, −60 points, collapsed)