I just watched Transcendent Man, about the singularity and Ray Kurzweil in particular. It's well-made, full-length, and includes the most popular criticisms of Kurzweil: that his prediction timeframes are driven by his own hope for immortality, that the timescales of his other predictions are too optimistic, that his predictions about the social outcomes of revolutionary technology are naively optimistic, and so on. Ben Goertzel and others get a good deal of face time.

You can rent or buy it on iTunes.

Rain:

Transcendent Man is now available on Netflix instant streaming. Not sure if it's worth watching for this crowd.

that his prediction timeframes are driven by his own hope for immortality

This is not an important criticism; it is ad hominem in its purest form.

Overall, specific errors in reasoning should generally be highlighted instead of arguing that the other person is biased. One reason is that such an accusation is an ad hominem attack -- I think that such indirect methods of analyzing the rationality of an argument have an alarming potential to provoke mind-killing.

The more obvious and important reason is that citing a logical error, fallacy, or bad interpretation of data is much more reliable than trying to read emotional cues as to whether someone is biased; this is especially true considering the lack of insight we have into each other's minds.

A more correct (less wrong?) restatement of the criticism is that Kurzweil seemingly allows his own hope for immortality to bias his predictions as to when immortality will be achieved.

Isn't it? If there are significant causal pressures on his conclusions other than the reality of the thing he's trying to draw conclusions about, then it's not clear how his conclusions would become/remain entangled with reality.

This is not an important criticism; it is ad hominem in its purest form.

Prediction: Given the right incentive and five minutes to think, Eliezer would be able to give an example of a criticism that is a purer form of fallacious ad hominem. I am only slightly less confident that a randomly selected 15-year-old student could do the same, allowing the 'five minutes' to include an explanation of what ad hominem means if necessary.

This is not an important criticism; it is ad hominem in its purest form.

In certain contexts, where someone is relying on another's expertise and lacks the resources to evaluate the details of a claim, relying on experts makes sense. If a given potential expert has a reason to be biased, that's a reason to rely on that expert less.

True. I should have said 'popular.' I've updated.

Perhaps a better criticism is that his prediction timeframes are the opposite of conservative estimates.

How so?

You mean that when criticizing his timeframes one should actually point out real flaws instead of just pointing out how nicely they align with his life expectancy?

At first glance I totally fail to see the ad hominem; maybe a second will help.

That his prediction timeframes are driven by his own hope for immortality

Probably Kevin Kelly's point.

XiXiDu:

...the most popular criticisms of Kurzweil: that his prediction timeframes are driven by his own hope for immortality...

Interestingly, one could argue the same for Yudkowsky, and more strongly. While Kurzweil is rich enough to sustain himself, Yudkowsky lives off the money of people who have to believe in his predictions and in him as the one who deserves the money.

This has been pointed out a couple of times, but Eliezer and the other leaders of SIAI don't have nearly as high a standard of living as they could easily have working for a big tech company, a route that doesn't require any extraordinary level of skill. Given that established programmers have been impressed with EY's general intelligence and intuition, I find it highly likely that he could have gone this route if he'd wanted.

Now, you could allege that he instead became the poorly paid head of an x-risk charity in order to feel more self-important. But suggesting the motive of greed is nonsensical.

(edited to delete snark)

  • I have no problem with how much he earns.
  • If he needs to spend contributed money for his personal fun that is completely justifiable.
  • If he'd need a big car to impress donors that is justifiable.
  • I called him the most intelligent person I know a few times, it is all on record.
  • I said a few times that I'm less worried about dying permanently because there is someone like Yudkowsky who contains all my values and much more.

That you people constantly try to accuse me of base motives only reinforces my perception that there is not enough doubt and criticism here. All I'm trying to argue is that if you people take low-probability, high-risk possibilities seriously, then I'm surprised nobody ever talks about the possibility that Yudkowsky or the SIAI might be a risk themselves, and that one could take simple measures to reduce this possibility. Given your set of beliefs, these people are going to code and implement the goal system of a fooming AI, yet everyone only talks about the friendliness of the AI and not about the humans who are paid to create it with your money.

I'm not the person you should worry about, although I have no particular problem with musing about the possibility that I work for some Cthulhu institute. That doesn't change much about what I am arguing, though.

By the way, I was out of line with my last sentence in the grandparent. Sorry about that.

Kurzweil's money does not materialize out of thin air.

I don't know if he is rich enough to sustain himself, but he is certainly not giving away his futurism books, his longevity supplements, or his lectures for free. The people paying for Kurzweil's products and services also have to believe in his statements and predictions, and that Kurzweil is the one who deserves their money.

If I were to single out one of these two parties for having a greater financial incentive in the perpetuation of their ideas, my money would be on the businessman/entrepreneur, not on the research fellow working for a charity.


I find your phrasing to be dishonest, especially because you do provide arguments.

Your comment doesn't make sense to me. Are you perhaps missing a "not"?