I am your hero!
I am your master!
Learn my arts,
Seek my way.

Learn as I learned,
Seek as I sought.

Envy me!
Aim at me!
Rival me!
Transcend me!

Look back,
Smile,
And then
Eyes front!

I was never your city,
Just a stretch of your road.

 


Are people biased on average to follow someone else, rather than to make their own path? It is not obvious to me. Yes, many great failings have come from groups dedicatedly following a leader. But surely many other failings have come from groups not dedicatedly following a leader.

These recent posts will be very useful to point to next time someone accuses Singularitarians of being a cult.

Or, Nick, a great source of irony for those people. "For a site called 'Overcoming Bias'..."

I suspect that people tend towards following versus leading, much in the way that pack wolves have leaders and followers.

Robin, this verse is about following someone else - just following with intent to overtake, rather than following with intent to worship. When is it ever appropriate to do the latter?

The Litany describes a dilemma that should only appear in the arts, not the sciences. In a true science with many contributors, you wouldn't follow any single hero, unless you thought some portion of their work had been left undone.

Nick, that is not their purpose.

I know, but they can serve that purpose - how many actual cult leaders write about how to avoid becoming a cult?

What about the Guru who wrote 'Why work towards the Singularity'? It is a text with a distinctly Messianic feel. Or, to be more generous, a Promethean feel. While it is true that Hom Sap has a nasty itch to create anything that can be created, regardless, there's no need for such pseudo-valuations as the following: "If there's a Singularity effort that has a strong vision of this future and supports projects that explicitly focus on transhuman technologies such as brain-computer interfaces and self-improving Artificial Intelligence, then humanity may succeed in making the transition to this future a few years earlier, saving millions of people who would have otherwise died. Around the world, the planetary death rate is around fifty-five million people per year (UN statistics) - 150,000 lives per day, 6,000 lives per hour. These deaths are not just premature but perhaps actually unnecessary. At the very least, the amount of lost lifespan is far more than modern statistics would suggest." Who says that continuing the lives of us dull old farts, to the inevitable detriment of the unborn, has any positive value? I'd say that's monstrous. The transhuman AI may be an unavoidable consequence of our Luciferian inclination to meddle. That doesn't mean it's a cause. Any chance of it becoming a cult?
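As a quick arithmetic check on the death-rate figures quoted above:

$$55{,}000{,}000 \ \text{per year} \div 365 \approx 150{,}000 \ \text{per day}, \qquad 150{,}000 \div 24 \approx 6{,}300 \ \text{per hour},$$

so the per-day figure matches the annual total, and the quoted per-hour figure is, if anything, slightly understated.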

"Envy me! Aim at me! Rival me! Transcend me!"

Eliezer,

Can you name 3 people who have transcended you in particular areas of rationality, and those areas? How about Spearman's g? Capacity/willpower for altruistic self-sacrifice? Conscientiousness? Tendency not to be overconfident about disastrous philosophical errors? Philosophical creativity? Mathematical creativity? Same questions with respect to 'rivaled.'

Also, your use of poetry and talk of the 'Way of Rationality' seems to be counter-signaling.

Nick,

Plenty of religious and political organizations accuse outsiders and heretics of various kinds of bias and irrationality, and 'apply' the same criteria to themselves. The problem is that they do so in a biased fashion.

"Also, your use of poetry and talk of the 'Way of Rationality' seems to be counter-signaling."

In what sense is this counter-signaling?

I realize it's not likely to convince someone who's already committed to seeing Singularitarianism as a cult, but it might help someone who's relatively unfamiliar with the territory and getting a slight cultish feel.

@Chris: "Who says that continuing the lives of us dull old farts, to the inevitable detriment of the unborn, has any positive value?"

I hear this argument against life extension and transhuman technologies over and over, and I think it is the absolute height of hypocrisy. Why? Well, if you care so much about the unborn ( = potential people, of whom there are infinitely many), then why aren't you eagerly campaigning for the immediate colonization of the solar system, followed by the galaxy? Remember, there are always more potential people left to be realized, and the best way of realizing them is by continually increasing the rate at which new people come into existence.

Surprisingly enough, the creation of a safe and powerful AI is probably the most effective way of accomplishing this increase in new-people-creation that will benefit the unborn. Chris, if you're really interested in the rights of potential persons [as I am], you should wholeheartedly support and work towards positive, safe technological acceleration.

I don't know about you guys, but if there was only one country in the entire universe, I'd rather it be Monaco than Congo.

"Why? Well, if you care so much about the unborn ( = potential people, of whom there are infinitely many), then why aren't you eagerly campaigning for the immediate colonization of the solar system, followed by the galaxy? Remember, there are always more potential people left to be realized, and the best way of realizing them is by continually increasing the rate at which new people come into existence."

'Unborn' does not equal 'potential people'. I see little point in trying to exhaustively explore the space of potential people. But given that there will be people coming after us, I fail to see the purpose in extending the lives of this generation at their expense.

"Can you name 3 people who have transcended you in particular areas of rationality, and those areas? How about Spearman's g? Capacity/willpower for altruistic self-sacrifice? Conscientiousness? Tendency not to be overconfident about disastrous philosophical errors? Philosophical creativity? Mathematical creativity? Same questions with respect to 'rivaled.'"

Daniel Kahneman undoubtedly knows more about heuristics and biases than I do; E. T. Jaynes was superior in manipulating and applying Bayesian calculus; Robyn Dawes has taught more students of rationality; Von Neumann was probably brighter than I am; Gandhi endured more for less; Edison put in longer hours; Epicurus seemed pretty skeptical; if Siddhārtha Gautama was a real person, he was one hell of an imaginative philosopher; Conway has probably created more math than I've learned.

Caledonian: If a longer lifespan is bad, surely a shorter one must be good? It would be a pretty unlikely coincidence if the current average of 67.2 years just happened to be morally optimal - and even if that coincidence were true now, it won't be for much longer.

So if you really believe what you're saying, then stop extending your own life; go kill yourself and knock that number down a notch. But there's the rub - you don't really believe it, and I'd bet that as soon as radical life extension comes on the market you'll go for it even while ridiculing others doing the same.

You just use irrational ideas as a way of sounding "cool", because there are all too many moronic humans who eat that crap up, thinking that anything must be right if it goes against the "establishment". If a poll was done, probably more than half of all people would claim to be non-conformists.

Obviously you're not going to find very many such idiots here, so who's your real audience? Do you show your posts to all your (nominally) progress-hating friends, gushing over how you dealt such a huge blow to The Man? Or are you such a sad, pathetic creature that you do all this purely to prove your own coolness to yourself?

Eliezer,

That wasn't a very strong signal of non-guru status. Six out of those nine people are dead (why choose the dead?) and can't condemn your ideas or compete for current authority with you, making for a less informative signal of non-guru status. You praise Kahneman for academic knowledge of heuristics and biases, but notably not for actually overcoming bias. Mentioning Dawes' total output of students, given his line of work and greater age, is very different from praising his ability to actually convey rationality.

A guru could say those things and still consistently claim to be the most generally intelligent and personally rational do-gooder currently living on the planet Earth, a view which is false for most. Are you ready to explicitly reject that proposition with respect to yourself? To say that people who do not agree with you on some important matters of fact and of value (e.g. relating to your work), and who might hinder your accumulation of supporters and resources, are your rivals or superiors in general rationality? To specify significant ways in which you have been persistently (and harmfully on balance) more biased than interlocutors concerned with rationality like Nick or Robin?

You could easily address such questions in a much more informative fashion than in the list above.

Carl, I'm not trying to signal non-guru status. If I was trying to signal non-guru status, I wouldn't write verse! But a verse to remember and repeat as a mantra might be useful to someone trying to resist the slide into cultishness ("I was never your city" keeps going through my own mind). I have little compunction about "looking like a guru" if it conveys information nicely. So long as I'm not actually a guru.

Regarding the rest of your question, I acknowledge no superior in my own specialty, and would be expected to have many superiors anywhere outside my own specialty.

"I have little compunction about "looking like a guru" if it conveys information nicely. So long as I'm not actually a guru." You're also in good company on verse with the MIT AI koans:

"A novice was trying to fix a broken Lisp machine by turning the power off and on. Knight, seeing what the student was doing, spoke sternly: "You cannot fix a machine by just power-cycling it with no understanding of what is going wrong." Knight turned the machine off and on. The machine worked." http://en.wikipedia.org/wiki/Hacker_koan

"So long as I'm not actually a guru."

Why not? The word guru doesn't necessarily have negative connotations for me. This isn't a gripe over lexical definitions. I think most of the nine people listed could have been described as a guru - the last certainly was. They had devoted followers and imparted their knowledge to them. A guru does not a cult create - that honour's reserved for those who aren't comfortable with the prospect of being usurped or overtaken.

I have limitless admiration for someone who can teach all they know, and look on with nothing but pride as their protégés go on to surpass their achievements. I know I'd have trouble with that.

"But a verse to remember and repeat as a mantra might be useful to someone trying to resist the slide into cultishness"

I suspect that ritualistic behavior is unlikely to aid in resisting the slide into cultishness. Quite the opposite.

Perhaps you should study the people who are exposed to teachings that seem to favor cultishness, find value in the teachings, but do not enter the cult.

Maybe I'll take up the mantle of adversary, Eli, when the circumstances are right. You are far ahead, but I think I can catch up. Who else will learn and overtake, instead of idly chatting?

"Von Neumann was probably brighter than I am;"

That one made me chuckle ... Good for you Eliezer! I do enjoy your posts. But that comment cracked me up. So I can only presume it was in jest, of course. It would be a somewhat ironic attempt at modesty to compare yourself to one of the greatest minds in history.

Caledonian: ritual behavior is only cultish if the ritual reinforces non-thought. Given that humans are so obviously born hungry for ritual, I'd be inclined to think rational/scientific culture is making far too little use of it, and more would be better. If anything, starving yourself of ritual will make you easy prey for cults.

Eliezer: your PDF in the previous post changed the way I think about AI as a concept, stripping off much anthropomorphism. So to that extent you do get to be a guru to me, at least until I get good enough to make advances of my own ;-P

Couldn't resist adding a complaint about the abuse of the term 'guru' as a term of ...abuse. It represents in fact an exponent of a perfectly respectable form of expertise transmission in non-rational domains. Drift into abuse of authority by such an exponent is perhaps more likely because the method relies on authority rather than argument, but that doesn't mean that the concept is invalid, or indeed that there is any other method possible in those domains.

Goplat, can't answer for Caledonian, but as I'm pretty sad & pathetic myself, I'll take a stab. The unborn represent variety and potentiality. More of the same represents sterility. Sure I'd like to live 500 productive & happy years, but am in my better moments conscious that with present biotechnology this is unlikely. With SIAI-improved biotechnology, who knows? However, my totally uninformed intuition is that however superproductive & long-lived the ultra-new curly-wurly chromosomes that my friendly neighbourhood SIAI will give me are, they would do better (in accordance with their interest) endowing them on the young of the species. Your argument that we now are happy living 80 years where our ancestors were lucky to make 40 is pertinent, but adding years after 40 still doesn't increase the productive lifespan of a mathematician. Jesus died at 30 (or was it 33?). Mother Teresa was doing productive caring work into advanced old age. So perhaps youth = creativity, age = caring. A 'Self Improving' AI would surely privilege the first option. For better or for worse. Personally I'm for balance, and am all for the increase of life expectancy at a rate which is compatible with human capacity to adapt. I wrote a piece on the Impossibility of a 'Friendly' SIAI which I may inflict on the world someday.

Just had a response to Goplat rejected as spam. Wonder what the biases built into the new antispam filter are?

"Von Neumann was probably brighter than I am;"

"That one made me chuckle ... Good for you Eliezer! I do enjoy your posts. But that comment cracked me up. So I can only presume it was in jest, of course. It would be a somewhat ironic attempt at modesty to compare yourself to one of the greatest minds in history."

Carl asked for someone with superior Spearman's g, which is more widely known as g-factor. Not "least upper bound", just "superior".

Spearman's g is tricky. It's easy for me to see that Jaynes is better at Bayesian calculus than I am, but that doesn't mean I can infer that Jaynes was doing it through superior g-factor (nor that he wasn't).

Traditional IQ tests sensibly and reliably measure a range of around 60-140. Richard Feynman's measured IQ was 137, but you have to translate that as "outside the range of the IQ test", not "80 IQ points dumber than Marilyn vos Savant".

There have been attempts to devise measures of "genius IQ" but I'm not impressed with any of their validation measures, and in any case, I haven't taken any.

Von Neumann was famous as a genius who scared other geniuses. I still added the qualifier "probably" because I don't actually know that von Neumann did his stuff via g-factor per se, rather than, say, by working so hard that he scared other hardworking mathematicians. It does seem likely that von Neumann had one of the highest Spearman's-g of the 20th century, but it's not certain. Anyone above a certain range tends to specialize in modes of cognition, and they do what they do by choosing tasks that fit their peculiar genius, not necessarily by being generally "better" than other geniuses in any directly comparable sense. Was Einstein smarter than Newton? I don't know; they applied different kinds of genius. So I picked von Neumann as the archetype - his genius wasn't necessarily the most effectively applied of the twentieth century, but he comes to mind as having a damned high g-factor.

If you just say "smart", or something like that, then you're really asking after a sort of generalized status ranking, in which case merely to compare oneself to von Neumann would be an act of great social audacity. Perhaps this is what made you laugh? But Carl didn't ask about life accomplishment or social status, he asked about Spearman's g, which is a very specific request about a characteristic that's very hard to infer above the IQ 140 range.
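A minimal sketch, assuming only the standard norming of IQ scores to mean 100 and standard deviation 15, of why measurement breaks down in the far tail: the fraction of any norming sample above a given score collapses so fast that there is almost no one left to calibrate against.

```python
# How rare each IQ score is under the standard normal model (mean 100, SD 15).
# Past ~140 a test's norming sample contains essentially nobody, so scores
# out there are extrapolations rather than measurements.
from scipy.stats import norm

for iq in (120, 140, 160, 180):
    rarity = norm.sf((iq - 100) / 15)   # upper-tail probability
    print(f"IQ {iq}: about 1 in {1 / rarity:,.0f}")
```

At 140 the model already implies roughly one person in 260; at 160, about one in 30,000.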

Are you talking to yourself, or is there something wrong with the name on this post?

One presumes that the second paragraph ought to be italicized, to indicate that it is a quotation of HighlyAmused's earlier comment. Interestingly, the Internet Archive's record of this thread as it appeared on its old home at Overcoming Bias does display the italics correctly; it is certainly odd that the move to Less Wrong should result in such an idiosyncratic error.

"Traditional IQ tests sensibly and reliably measure a range of around 60-140. Richard Feynman's measured IQ was 137, but you have to translate that as 'outside the range of the IQ test'"

No. It is far more likely that the qualities that made Feynman a genius were not those that were measured by IQ tests.

It's not a matter of his intellect being outside of a range. Intellect has a dimensionality far greater than IQ tests measure, period.

I have a suspicion that very high IQ is like comparing cheetahs to dogs. The dog isn't worse, he's just less of a specialist. High IQ means using the same wetware differently. More computation effort is devoted to a particular range of tasks. When you get into the ultra-genius range, you are actually starting to chip away at features used by the rest of the system. A narrow focus on one mode of cognition is unavoidable.

Eliezer or Carl:

Is reading "General Intelligence," Objectively Determined and Measured (http://psychclassics.yorku.ca/Spearman/) the best way to understand what you mean by "Spearman's g?"

Anon, Spearman's original paper is pretty old. Try Wikipedia, or search Gene Expression, or this page seems to have a lot of resources. Jensen had a nice intro paper at Psycoloquy, but Psycoloquy seems to be down at the moment.
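For a concrete handle on what a single "general factor" means, here is a minimal sketch, with entirely synthetic data and scikit-learn standing in for a real psychometric toolkit: simulate a battery of tests that all draw on one latent ability, then recover that ability's loadings with a one-factor model.

```python
# Toy illustration of a Spearman-style general factor: five test scores
# share one latent ability, and a one-factor model recovers the loadings.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_people = 5000

g = rng.normal(size=n_people)                     # latent general ability
loadings = np.array([0.8, 0.7, 0.6, 0.75, 0.65])  # invented test loadings
scores = np.outer(g, loadings) + rng.normal(scale=0.5, size=(n_people, 5))

fa = FactorAnalysis(n_components=1).fit(scores)
print(fa.components_.round(2))  # estimated loadings, close to the true ones (up to sign)
```

Real g is extracted the same way from the correlations among actual test batteries; the point of the toy is just that "g" names the shared factor, not any single test's score.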

Chris:

I'd agree that mathematicians probably peak somewhere around 40. I can see two things contributing to the subsequent decline: mental degeneration, and an increasingly irrelevant skill set; if your specialty is narrow enough, you may have solved all the easy problems in your microfield. I expect life extension technologies to fix the former problem. For the latter problem, Feynman recommends changing fields every 7 years. Of course, he's Feynman; maybe 10 years is better for us ordinary folks.

Is there a hidden meaning to this? I only grasp the exterior feel of this shiny poem.

"...Although, do please make the check out to 'Cash'."