Yesterday, "Overcoming Cryonics" wrote:

Eliezer, enough with your nonsense about cryonicism, life-extensionism, trans-humanism, and the singularity.  These things have nothing to do with overcoming bias... if you're going to enforce the comments policy then you should also self-enforce the overcoming bias posting policy instead of using posts to blithely proselytize your cryonicism / life-extensionism / trans-humanism / singularity religion.

One, there is nothing in the Overcoming Bias posting policy against transhumanism.

Two, as a matter of fact, I do try to avoid proselytizing here.  I have other forums in which to vent my thoughts on transhumanism.  When I write a blog post proselytizing transhumanism, it looks like this, this, or this.

But it's hard for me to avoid all references to transhumanism.  "Overcoming Cryonics" commented on a post in which there was exactly one reference to a transhumanist topic.  I had said:

The first time I gave a presentation - the first time I ever climbed onto a stage in front of a couple of hundred people to talk about the Singularity - I briefly thought to myself:  "I bet most people would be experiencing 'stage fright' about now.  But that wouldn't be helpful, so I'm not going to go there."

What, exactly, am I supposed to do about that?  The first time I ever got up on stage, I was in fact talking about the Singularity!  That's the actual history!  Transhumanism is not a hobby for me, it's my paid day job as a Research Fellow of the Singularity Institute.  Asking me to avoid all mentions of transhumanism is like asking Robin Hanson to avoid all mentions of academia.

Occasionally, someone remarks that I seem to take notions like the Singularity on faith, because I mention them but don't defend them.

I don't defend my views here, because I know that not everyone is interested in the considerable volume of work I have produced on transhumanism - which you can find on yudkowsky.net.

If, however, you don't like any mention of transhumanism, even as an illustration of some other point about rationality - well, this is a blog.  These are blog posts.  They are written in the first person.  I am occasionally going to use anecdotes from my history, or even, y'know, transcribe my thought processes a little?

Given the amount of time that I spend thinking about transhumanism, I naturally tend to think of transhumanist illustrations for my points about rationality.  If I had spent the last eleven years as a geologist, I would find it easy to illustrate my ideas by talking about rocks.  If you don't like my illustrations and think you can do better, feel free to invent superior illustrations and post them in the comments.  I may even adopt them.

On some transhumanist topics, such as cryonics, I haven't written all that much myself.  But there is plenty about cryonics at Alcor or the Cryonics Institute.  Also, the Transhumanist FAQ has some nice intros.  If you don't want it discussed here, then why are you asking?

I will probably post explicitly on cryonics at some point, because I think there are some points about sour grapes for which I would have difficulty finding an equally strong illustration.  Meanwhile, yes, I sometimes do mention "cryonics" as the archetype for a socially weird belief which happens to be true.  No matter what I use as an example of "socially weird but true", some people are going to disagree with it.  Otherwise it wouldn't be an example.  And weird-but-true is certainly an important topic in rationality - otherwise there would be a knockdown argument against ever dissenting.

Even after checking the referenced sources, you might find that you - gasp! - still disagree with me.  Oh, the horror!  The horror!  Clearly you don't read any other blogs where one of the authors occasionally disagrees with you.

Just because this blog is called Overcoming Bias, it does not mean that any time any author says something you disagree with, you should comment "OMG!  How biased!  I am sooo disappointed in you I thought you would do better."  Part of the art of rationality is having extended discussions with people you disagree with.  "OMG U R BIASED!" does not present much basis for continuing discussion.

It is a good rule of thumb that you should never flatly accuse someone of being "biased".  Name the specific bias that attaches to the specific problem.  Conjunction fallacy?  Availability?
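
(To make the first of those concrete: the conjunction fallacy is judging a conjunction to be more probable than one of its conjuncts alone - as in Tversky and Kahneman's famous "Linda" experiments - even though probability theory requires

$$P(A \wedge B) \le P(A).$$

Pointing to the specific inequality being violated gives the accused something concrete to check.)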

If you disagree with someone, you presumably think they're doing something wrong.  Saying "You are like so totally biased, dude" is not helpful.  If you strike a tragic, sorrowful pose and go, "Oh, alas, oh, woe, I am so disappointed in you," it is still not helpful.  If you point to a specific belief that you disagree with, and say, "See, that belief is biased," then that doesn't convey any additional information beyond "I disagree with that belief."  Which bias?  There are quite a lot of possibilities.

If you think that "rationality" means people will agree with you on their first try, so that anyone who doesn't do this can be dismissed out of hand as a poseur, you have an exaggerated idea of how obvious your beliefs are.

So stop telling me, or Robin Hanson, "Why, you... you... you're not absolutely rational!"  We already know that.

Just because I try to be rational doesn't mean I think I'm a god.

Well, sure, I want to be a god when I grow up, but that is like a totally different issue from that first part.

Except that both goals involve Bayesian methods.

(And are intertwined in other ways you won't realize until it's too late to turn back.)

Thank you.

Yours in the darkest abyssal depths of sincerity,
Eliezer Yudkowsky.


Thanks Eliezer. My previous post on the 'CounterCultishness' thread would have been more relevant here. This is a good opportunity to give you, Robin, and any other occasional posters a vote of thanks for your (always) thought-provoking and (mostly :-) ) incisive posts, whose interest keeps me, at least, coming back here, whatever my feelings about the SI.

Re: people accusing you of bias. I offer this likely irrelevant observation: I think it's the tone of the blog. Although you do an amazing job of explaining very complicated things to us cretins, it seems to me that the feeling here is that you are speaking to the plebs from a very great height indeed (which is true, I am guessing). However, being the simple apes that we are, it's easy to get upset and accuse you of hypocrisy.

Other blogs written by other extremely smart people (possibly even smarter people, if that's possible) do not induce this feeling in me (and therefore possibly others), so perhaps that's something you need to work on. Also, regarding the issue of cryonics etc., I also find it pretty distracting. For much the same reason as you warn against bringing contemporary politics into these sorts of discussions (i.e. of the biases themselves), it gets my feelings worked up, and it detracts from thinking about the actual biases.

Also, I get the feeling (again, probably wrongly) that you sort of think that people are biased/irrational if they don't agree with these somewhat unusual proposals. Now this is not surprising, I guess; it is a common Utilitarian/Economic argument that 'well you accept P, but this, you may not realise, actually also implies Q' (some counter-intuitive moral point) (actually Bostrom's reversal test paper is another good example). While this is often a powerful form of argument, as the likes of Singer show, it often seems to make one feel a bit manipulated. But again, maybe that's just me. Please don't take these as slams; I'm just reporting my reactions as an attempt to explain what other people are saying & hopefully help improve things. I think it is a pretty damn interesting blog and the level of discussion is amazingly high, but like everything it could be better.

PS The walking out on your lonesome from the conference thing was a good point, but the way you illustrated it was a tad self-aggrandizing, I think.

Go Eliezer! You rock! Keep doing what you're doing, as it's very interesting and very fun to read!

Actually, maybe you had better tone down the awesomeness of your posts a bit. (Maybe you could do that by not mentioning the Singularity so much.) If you don't tone it down, I fear I'm in danger of careening off into a Super Happy Death Spiral based on your awesomely cool ideas.

I think it would be well worth it for Eliezer, myself, or contributors here to write a post explaining why they still embrace some particular "weird" belief in the face of wide disagreement. Such a post should not just review the positive reasons for the belief, but also the reasons for thinking others to be mistaken, even when they have heard and rejected those reasons.

The reason Nicholas gets the impression that Eliezer thinks that people are biased if they disagree with his unusual ideas is that Eliezer does in fact think this.

Likewise, as Eliezer points out, as a general rule, those who disagree with him immediately jump to the conclusion that he is biased.

These reactions seem to illustrate two things: first, we assume that every disagreement is a persistent disagreement until shown otherwise; i.e. if someone disagrees with us from the start, we assume that he will continue to disagree with us even after we have presented all of our reasons. This seems to be a fairly reasonable assumption based on our experience; people rarely change their minds due to argument.

Second, that we consciously or subconsciously believe that Robin's position on disagreement is correct, i.e. that a persistent disagreement is irrational. Since we are unwilling to admit that we are ourselves irrational, we must attribute the irrationality to the other.

More briefly: Any disagreement is assumed to be persistent; any persistent disagreement is assumed to be irrational; and any irrationality is assumed to belong to the other guy. Thus any disagreement is assumed to involve irrationality on the part of the other.

Since most people are irrational most of the time, and even the more rational are irrational quite often, it's easy to see why when given the choice between rethinking our cherished positions and lowering our opinions of those who disagree with us, we choose to lower our opinions. It's not only ego-sparing, it's quite probable as well.

Fortunately, those wishing to overcome their own biases are well aware of these tendencies within themselves, and can distinguish between justified belief, contingency planning, wishful thought, and genuinely strong arguments.

Unknown,

There's an 'evaporative cooling' effect for weakly held beliefs among people who frequently encounter opposing views, where beliefs that one can be easily argued out of tend to be eliminated over time as one falls into stronger attractors. If someone has read most of the major literature on view X, talked to experts in the field, observed polling or survey information on its prevalence, and sought out critiques from expert disinterested parties, it's unlikely that hearing that any particular nonexpert disagrees should shift her views very much. She already expects a certain proportion of such disagreement in the population and has accounted for it in her estimates.

"I briefly thought to myself: 'I bet most people would be experiencing 'stage fright' about now. But that wouldn't be helpful, so I'm not going to go there.'"

This is part of E's history? Both this and his reaction to 9/11, ticking off a series of thoughts in robotic fashion, strike me as unlikely, given my experience being a human and viewing others.

I don't know what to label this, whether it is an attempt to establish authority by seeming exceedingly rational, remembering events in a way that pleases him, something close to the truth, or something else completely different.

But it does strike me as both odd and unlikely.

Carl, I agree with that completely. It simply shows some of the partial reasonableness in the assumptions I was talking about; we assume the other has considered objections, and therefore that he will not change his position when confronted with our reasons. This is usually a reasonable assumption. But it remains true that we deny our own irrationality, and therefore we place it on the other. And this is not always so reasonable, although it sometimes is.

""Overcoming Cryonics" commented to a post in which there was exactly one reference to a transhumanist topic." I too was very frustrated by this and most of the further comments on this post.

There seems to be a consistent bunch of commenters who are really only trying to attack Eliezer instead of having a reasonable discussion or counter-response to something stated in a post. I sincerely wish these commenters would tone it down a bit or make their own damn blog where they can be as offensive as they like.

The comment I wrote yesterday was motivated by how Eliezer blithely mocks traditional religion while promoting the cryonicism / life-extensionism / trans-humanism / singularity religion.

The people I know of the cryonicism / life-extensionism / trans-humanism / singularity faith are libertarians, which seems like a non sequitur. What does the apotheosis of Aubrey de Grey have to do with a political philosophy premised on the protection of private property? To be a full-fledged member of the cult maybe you have to adopt both. Similarly, there are Republicans who adopt an incoherent party platform that conflates religious fundamentalism with lower income taxes. Would a rational, unbiased person migrate to both beliefs, or does a blend of intellectual dishonesty and imitating the group bring someone there?

A pet peeve of mine that manifested itself in what I wrote is that I'll visit a left-leaning politics/economics blog and some very smart people will act as if they are holders of Truth regarding the nefariousness of the political Right and the divinity of the political Left. Then I'll go to a right-leaning blog and those very smart people with similar overconfidence will make diametrically opposed assertions. On this blog that putatively is about overcoming bias I thought it hypocritical that Eliezer mocks all traditional beliefs while then arbitrarily promoting his own "religion."

Why do we believe what we believe? I think Robin Hanson says you should optimally shade your view towards the crowd wisdom to avoid bias. Does Eliezer have a strong enough basis for his iconoclastic belief in the cryonicism / life-extensionism / trans-humanism / singularity religion? Has he avoided the biases he so very eloquently writes about here? Does this site do for philosophy what Brad DeLong's site does for political economics, where a bunch of likeminded smart people get together and overconfidently reinforce each other's worldviews to the point of delusion, even as equally smart people do the same thing with opposite views elsewhere? Who is right, and how is the disagreement reconciled?

"The comment I wrote yesterday was motivated by how Eliezer blithely mocks traditional religion while promoting the cryonicism / life-extensionism / trans-humanism / singularity religion." OC,

Just to be clear, you wouldn't try to claim that any traditional religion is plausible, right?

Way to get trolled, Eliezer. The fact that OC's comment had nothing to do with the post it was attached to should have tipped you off that he's only interested in pushing your buttons.

This is pretty much your and Robin's blog, write whatever you want. You don't have to make excuses.

Unknown, good summary.

Carl, yes, but in addition you probably infer that someone on the other side is a bit less rational than someone on your side.

Caledonian, "those wishing to overcome their own biases ... can distinguish between ... wishful thought, and genuinely strong arguments" sounds to me more like wishful thought.

Overcoming, I do hope we at least rise above the DeLong standard.

Caledonian, "those wishing to overcome their own biases ... can distinguish between ... wishful thought, and genuinely strong arguments" sounds to me more like wishful thought.

The easiest person to fool is yourself. The wise recognize this, and hold their own beliefs to a higher standard than even those they impose on others.

The hubristic, however, like to talk about how much cleverer and wiser they are than everyone else.

I want to apologize for juxtaposing this blog with Brad DeLong's blog. I referenced DeLong's blog as an extreme caricature of intellectual dishonesty and delusion. This blog rises above that more than any other site I have come across, and I was only cautioning against the tendency here to occasionally demonstrate the negative traits with which DeLong's blog is characterized. I in no way meant to directly compare the two and apologize if it came off that way. I shouldn't have used such an extraordinarily distasteful example.

Just read 'The Reversal Test'. A good, honest, decent paper, but does little to address the issue. It only considers modifications in one parameter. I'd like to see a reversal test for modifying one parameter out of 100, when the 100 parameters are in some sort of equilibrium, potentially unstable, and the equilibrium is one which you don't understand too well. Even given that the status quo equilibrium is by all accounts pretty lousy.

Unknown, that was interesting and spot on, I guess. I must say, however, that even before I read this blog I personally didn't think it made sense to say you could 'agree to disagree' about factual matters given the same evidence. That doesn't seem to make sense to me in an elementary way. So I have no problem with cryonics, transhumanism etc. being possibly 'true' - although I haven't the foggiest; most of these things seem plausible to me. Perhaps what I should have said viz. the 'other people being biased' thing is that it seems to include values, i.e. that we should share the values of EY. This jars somewhat with the meta-ethical anti-realism which is part and parcel of this sort of thought. Now again, I guess the response is that often we actually do have the same values, but we haven't extrapolated far enough in a coherent way.

"...a bunch of likeminded smart people get together and to the point of delusion overconfidently reinforce each others' worldviews..."

OC - if this is what you think goes on here, you haven't read enough of the comments. Even the posters don't tend to agree.

I agree with Nominull - back to the interesting stuff please!

OC - if this is what you think goes on here, you haven't read enough of the comments. Even the posters don't tend to agree.

I don't think you've read the comments carefully enough. There are distinct groups: one large one that frequently praises Our Hosts, and various individuals/small alliances that frequently criticize.

The comments are usually more on-topic to what the blog is supposed to be about than the posts!

EY wrote, "People who grow up believing certain things, even if they later stop believing them, may not quite realize how the beliefs sound to outsiders."

I agree with this. Transhumanism, cryonicism, Scientology, and the singularity sound to me like the stories of Santa Claus and the Tooth Fairy.

Perhaps this post should have just been a comment responding to OC in the previous post rather than a post on its own.

I thought Aubrey de Grey was not a libertarian. Anyone know the score there?

This is part of E's history? Both this and his reaction to 9/11, ticking off a series of thoughts in robotic fashion, strike me as unlikely, given my experience being a human and viewing others.

My introspection is similar to Eliezer's. A coherent chain of thoughts is quite common for me, including recognizing emotional reactions and deciding whether to go with them. I share your experience with many others who do not react this way, but some of us do.

I thought Aubrey de Grey was not a libertarian. Anyone know the score there?

I heard he uploaded mice brains to computers so that the mice could be immortal. A leap forward for both singularitarians and life-extensionists, but libertarian Ron Paul won't allow federal funding.

Eliezer, I don't think your story would have been appreciably weakened if you'd just deleted the words "to talk about the Singularity". On the other hand, I also don't see any reason why you should have to avoid mentioning Your Strange Beliefs either. Also: surely the conjunction fallacy is not a bias, but a symptom of a bias. (The bias in question being more or less a special case of the availability heuristic: the more detail we're provided with, the easier it is to imagine whatever-it-is.)

burger flipper and Zubon, I think (1) Eliezer's thought processes are quite unusual and (2) his claimed thought processes on the two occasions mentioned by bf are (no more than) quite unusual, which to my mind makes them unsurprising and unsuspicious.

Overcoming Cryonics, singularitarianism seems to me to lack a number of important characteristics that almost all things commonly called religions share, so however wrong or irrational it may be the term "religion" seems unhelpful. (I find that applying the term "religion" to things that aren't commonly regarded as religions generally produces more heat than light.) Likewise for cryonics, life-extensionism and transhumanism more generally. All of which, incidentally, seem to me to be quite separate things, which I agree makes it interesting that they seem usually to get accepted or rejected as a group.

... If they do, that is; I realise that my evidence for this is very thin. Anyone have any figures, or even more extensive anecdotal evidence? Do people who sign up for cryonics believe in the Singularity more often than they should if the only factor is that the Singularity might make signing up for cryonics a better bet? (Etc.)

Incidentally, Al Cellier, what on earth is scientology doing in your list? It has (so far as I can see) nothing in common with the other items in the list, either in terms of shared beliefs or shared adherents. Are you just trying to annoy any singularitarians and transhumanists who are reading what you write?

EY: I'd just like to point out that there's an extra hyphen added onto the second link to the singinst blog. Luckily enough the site doesn't totally die on it, but it would still be better if it were fixed :)

I hope I'll come up with some more insightful comments in the future than little bug fixes. Keep up the great work!

Maksym: I wonder how that happened! Fixed.

For the record, I support the goals of life extension, but I don't anticipate seeing substantial human life extension before self-improving AI.

There are a large number of transhumanists who are socialists, not libertarians. In fact, as far as I can tell "libertarian transhumanism" is a distinctly American phenomenon. Saying that most transhumanists you know are libertarians may be true, but assuming that their experiences define the entire span of transhumanist belief would be an invalid generalization from too little evidence.

"Just because I try to be rational doesn't mean I think I'm a god.

Well, sure, I want to be a god when I grow up, but that is like a totally different issue from that first part.

Except that both goals involve Bayesian methods."

Thanks for pointing me in the direction of those Bayesian methods. I have been consumed with studying them recently. They will, perhaps, replace even the scientific method as the poster child for science one day. The math is the easy part - accepting it as a philosophy is what I'm sure is difficult for some.
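
To spell out the easy part - presumably what's meant is Bayes' theorem, the rule for updating a prior belief P(H) in a hypothesis on new evidence E:

$$P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E)}$$

One line of algebra; committing to apply it to everything you believe is the part that takes work.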

The whole libertarianism vs. socialism thing is one area where transhumanism imports elements of cultishness. If you are already a libertarian and you become familiar with transhumanism, you will probably import your existing arguments against socialism into your transhumanist perspective. Same for socialism. So you see various transhumanist organizations having political leadership struggles between socialist and libertarian factions who would probably be having the same struggles if they were part of an international chess club or some such other group.

The whole thing becomes entrenched in debates about things like Transition Guides and what amounts to "how to implement transhumanist policy in a [socialist/libertarian] way that's best for everyone." I always thought these discussions were what amounted to discourse at the "fandom" level of the transhumanist community, but after reading some of Eliezer's posts about his own experiences at transhumanist/singularitarian events I see that it happens at all levels.

Half-formed thought I need to pursue more offline but I'll write it down now: If you say "I am a transhumanist" and you say "I am a libertarian" and then you try to find libertarian ways to meet transhumanist goals you have made your transhumanism subservient to your libertarianism. I think it is better to find transhumanist ways to meet libertarian goals. The fact that a group of transhumanists would derail a debate by getting into politics seems to express to me that the group has made transhumanism the subservient value. Which seems inelegant given that transhumanism is probably the simpler value. Seems like there's a possible post for my own blog brewing in there, but I have to think about it some.

Riddle me this: if a fellow at the Singularity Institute for Artificial Intelligence had to estimate an over/under for the year of the Singularity, would the estimate be objective, biased upwards, or biased downwards? I suspect it would be biased downwards compared to a presumably objective over/under derived from a prediction market on the topic. We must overcome bias or we will spend too much energy trying to protect ourselves from the day of the Singularity. I'm much more likely to be hit by a car as a pedestrian than I am to be assaulted by a hybrid car / human 50 years from now.

Nicholas, Sam Harris's study of belief and disbelief employing MRI scanning seems to indicate that the human brain considers moral statements to be true or false in the same way that it considers mathematical or factual statements to be true or false.

Of course one can argue that the brain is wrong, but since one will continue to possess a human brain, one will still think and act as though the statements were true or false in the same sense. So humans will treat disagreements about values in the same way as other disagreements, even if on a meta-level they deny the equivalence. Eliezer pointed this out himself when he said that making a moral claim like "murder is wrong" feels like stating an objective fact; it feels this way because on the level of the brain, it is exactly the same as stating an objective fact.

Why is everyone linking together cryonicism, life-extensionism, trans-humanism, and the singularity? The cryonicists I know view transhumanists as their nemeses. I'm not sure these are shared beliefs.

Christian Humanist Cryonics Club is likely viable. Christian Humanism is opposed to Trans Humanism and Secular Humanism. Cryonics is quite consistent with Christian/Classical philosophy but I'm the only voice for that-- which strikes me as very odd.

http://christianhumanistcryonicsclub.blogspot.com/

Why is everyone linking together cryonicism, life-extensionism, trans-humanism, and the singularity?

Because they are all extreme-minority positions whose advocates often seem to have lost sight of the distinctions between "hypothetically plausible", "foreseeable", and "inevitable".

They are also all subjects where the most enthusiastic supporters are almost certainly deluding themselves.

The reason people are annoyed at the screeds is that rather than trying to construct a representation of the processes we can use to distinguish self-delusion from rational thought, this blog does little but hold up specific examples and assert that they are rational. This is obnoxious enough on its own, but when the examples are of what is generally recognized as havens for bias, people get irritable.

Q: Why is everyone linking together cryonicism, life-extensionism, trans-humanism, and the singularity? In addition to Caledonian's irritability, I would add: A: Because the two main posters here seem to subscribe to the extreme desirability of all three (counting trans-humanism and the singularity as one item, translating as Self-Improving AI), in a nexus centred on the Singularity Institute.

Personal take: a) Cryonics: couldn't care less. b) Radical life extension: playing with fire. c) Self-improving AI: burning the house down.

Philossifur,

I'm fascinated. Where will your soul be when you die? Where will it be when your head is reanimated?

The standards of Brad DeLong himself would be something to aspire to. Sometimes they're exceeded here, often not. It's good to have a benchmark, but don't kid yourselves.

Ben Jones: Dead Christian Transhumanists' souls await the Final Day, just like all the rotting Christian corpses' souls. If you thaw out their heads and turn them back on, the souls are right where they were before, to the extent they ever were anywhere, and might as well pick up where they left off. When the head finally gets used up, the soul may be presumed to be where the other corpses' souls are.

The real question is what happens when you scan the frozen head into a non-frozen-head simulator, and start it up: can the simulation exchange packets with the waiting soul? If you start up two (or a million) simulators, can they all use it, or only the first? Or do they have to take turns, as with a time-share condo?

I think we would have to conclude that the soul (its "stamp", if you like) is encoded in details of the frozen head, and, if the non-frozen-head simulator were precise enough, you'd be running a simulation of the soul in each run, while the real soul sits twiddling its metaphorical thumbs waiting for the End Times. If the non-frozen-head simulator can simulate a non-frozen-head well enough, surely its simulation of the soul ought to be good enough too. After all, what does a soul have to do? Some people would say a cantaloupe provides a perfectly adequate simulation today, but they're probably not Christians.

Does anyone know a good cryonics center for pets? Thank you.

Clarification, please, from the audience

It's 1823. I own a printing press. I publish a monthly gazette entitled "Man will fly like birds in the future."

Substantially all the math required to design airfoils was completed by 1822. That this statement is true will not be generally evident for another 75 years.

Should my contemporaries classify me as irrational, delusional, or just plain wrong? What evidence would or should you provide to back such assertions?

How should the following compound statements (made, of course, by YHS in 1823) be evaluated?

  • "Man will fly in the future, but the necessary math is incomplete"
  • "Man will fly in the future, and the necessary math is complete"
  • "Man will fly in the future, and I'm working out the math"

Don't we need to be very careful when discussing beliefs about contingencies?

Religion and cryonics are NOT mutually exclusive-- http://www.alcor.org/Library/index.html#religion

For an unbiased, science-based model of the Singularity, there is no better source than Jared Diamond, "Collapse" and "The Third Chimpanzee". The Americas and many Pacific islands have faced mini-singularities in the recent past. Among the more poignant is Easter Island, where every last tree was cut down.

Diamond identifies a round dozen worldwide crises, any one of which may cause a corresponding worldwide collapse in the next century. Among them are nuclear-weapon proliferation, water shortages, devastation of fisheries, global warming, elimination of tropical forests, and peak oil. He addresses the prospects of "technically innovating" our way out of all twelve, and the outlook is bleak.

Could a fancy AI help? Only if we followed its advice. We are already getting much better advice than we seem able to follow.

Nathan Myers, the Singularity could be the cause of the downfall of civilization and not necessarily our salvation. Ever watch The Matrix?!

Al: Diamond's point is, indeed, that the best approximation to a Singularity we know of is a downfall. He identifies dozens of examples in history and pre-history, from Petra to Yucatan to Easter Island, of exponential development causing collapse. The singular difference between the modern case and his examples, and the one cause for hope, is that we know the previous examples and can quantify the process. That knowledge, thus far, is having little effect.

On the flip side, all the previous collapses were local; the Easter Islanders cut down all their own trees, but not everyone else's besides.

Some of Diamond's examples are of collapses consciously averted. New Guinea highlanders evidently noticed, 6000 years after they began doing intensive agriculture, that they were about to eliminate crucial tree species, and instituted woodlots. The Japanese Shogunate did the same a few centuries later. The Shogunate had the authority to enforce its strictures. On New Guinea, perhaps tribes that kept woodlots were able to defeat neighboring tribes that didn't. Neither scenario suggests a method to address modern global crises.

Keep using whatever examples and anecdotes you think best make your points, Eliezer. If that person doesn't like what you write, he/she can just skip it.

Eh heh heh, things sure have turned around.

The link to the Transhumanist FAQ is broken. It's now http://humanityplus.org/philosophy/transhumanist-faq/

When I write a blog post proselytizing transhumanism, it looks like this, this, or this.

The first two links are broken.