Well-done documentary on the singularity: 'Transcendent Man'

3 Post author: lukeprog 04 March 2011 11:07PM

I just watched Transcendent Man, a documentary about the singularity and Ray Kurzweil in particular. It's well-made, full-length, and includes the most popular criticisms of Kurzweil: that his prediction timeframes are driven by his own hope for immortality, that the timescales of his other predictions are too optimistic, that his predictions about the social outcomes of revolutionary technology are naively optimistic, and so on. Ben Goertzel and others get a good deal of face time.

You can rent or buy it on iTunes.

Comments (34)

Comment author: Rain 10 June 2011 09:55:22PM *  3 points [-]

Transcendent Man is now available on Netflix instant streaming. Not sure if it's worth watching for this crowd.

Comment author: Eliezer_Yudkowsky 05 March 2011 12:49:25AM 2 points [-]

that his prediction timeframes are driven by his own hope for immortality

This is not an important criticism; it is ad hominem in its purest form.

Comment author: Dorikka 05 March 2011 04:11:42AM 8 points [-]

Overall, specific errors in reasoning should generally be highlighted instead of arguing that the other person is biased. One reason is that such an accusation is an ad hominem attack -- I think that such indirect methods of analyzing the rationality of an argument have an alarming potential to provoke mind-killing.

The more obvious and important reason is that citing a logical error, fallacy, or bad interpretation of data is much more reliable than trying to read emotional cues about whether someone is biased; this is especially true considering how little insight we have into each other's minds.

Comment author: katydee 06 March 2011 05:50:46AM 5 points [-]

A more correct (less wrong?) restatement of the criticism is that Kurzweil seemingly allows his own hope for immortality to bias his predictions as to when immortality will be achieved.

Comment author: JoshuaZ 05 March 2011 04:48:11AM 5 points [-]

This is not an important criticism; it is ad hominem in its purest form.

In certain contexts, where someone is relying on another person's expertise and lacks the resources to evaluate the details of a claim, relying on experts makes sense. If a given potential expert has a reason to be biased, that's a reason to rely on that expert less.

Comment author: Pavitra 05 March 2011 02:40:18AM 7 points [-]

Isn't it? If there are significant causal pressures on his conclusions other than the reality of the thing he's trying to draw conclusions about, then it's not clear how his conclusions would become/remain entangled with reality.

Comment author: wedrifid 06 March 2011 08:26:14AM *  4 points [-]

This is not an important criticism; it is ad hominem in its purest form.

Prediction: Given the right incentive and five minutes to think, Eliezer would be able to give an example of a criticism that is a purer form of fallacious ad hominem. I am only slightly less confident that a randomly selected 15-year-old student could do the same, allowing the 'five minutes' to include an explanation of what ad hominem means if necessary.

Comment author: lukeprog 05 March 2011 02:17:33AM *  1 point [-]

True. I should have said 'popular.' I've updated.

Comment author: Normal_Anomaly 05 March 2011 01:31:04AM 1 point [-]

Perhaps a better criticism is that his prediction timeframes are the opposite of conservative estimates.

Comment author: MartinB 07 March 2011 04:43:04PM 0 points [-]

How so?

You mean when criticizing his timeframes one should actually point out real flaws instead of just pointing out how they nicely align with his life expectancy?

At first glance I totally fail to see the ad hominem; maybe a second glance will help.

Comment author: XiXiDu 05 March 2011 10:33:46AM *  0 points [-]

...the most popular criticisms of Kurzweil: that his prediction timeframes are driven by his own hope for immortality...

Interestingly, one could argue the same for Yudkowsky, and much more so. While Kurzweil is rich enough to sustain himself, Yudkowsky lives off the money of people who have to believe in his predictions and in him as the one who deserves the money.

Comment author: orthonormal 09 March 2011 07:30:16AM *  3 points [-]

This has been pointed out a couple times, but Eliezer and the other leaders of SIAI don't have nearly as high a standard of living as they could easily have working for a big tech company, which doesn't require any extraordinary level of skill. Given that established programmers have been impressed with EY's general intelligence and intuition, I find it highly likely that he could have gone this route if he'd wanted.

Now, you could allege that he instead became the poorly paid head of an x-risk charity in order to feel more self-important. But suggesting the motive of greed is nonsensical.

(edited to delete snark)

Comment author: XiXiDu 09 March 2011 09:48:16AM 2 points [-]
  • I have no problem with how much he earns.
  • If he needs to spend contributed money for his personal fun that is completely justifiable.
  • If he'd need a big car to impress donors that is justifiable.
  • I called him the most intelligent person I know a few times, it is all on record.
  • I said a few times that I'm less worried by dying permanently because there is someone like Yudkowsky who contains all my values and much more.

That you people constantly try to accuse me of base motives only reinforces my perception that there is not enough doubt and criticism here. All I'm trying to argue is that if you take low-probability, high-risk possibilities seriously, then I'm surprised nobody ever talks about the possibility that Yudkowsky or the SIAI might be a risk themselves, and that one could take simple measures to reduce this possibility. Given your set of beliefs, those people are going to code and implement the goal system of a fooming AI, yet everyone only talks about the friendliness of the AI and not of the humans who are paid to create it with your money.

I'm not the person you should worry about, although I have no particular problem with musing about the possibility that I work for some Cthulhu institute. That doesn't change much about what I am arguing, though.

Comment author: orthonormal 24 March 2011 02:25:39PM 1 point [-]

By the way, I was out of line with my last sentence in the grandparent. Sorry about that.

Comment author: childofbaud 07 March 2011 10:52:38AM *  1 point [-]

Kurzweil's money does not materialize out of thin air.

I don't know if he is rich enough to sustain himself, but he is certainly not giving away his futurism books, his longevity supplements, or his lectures for free. The people paying for Kurzweil's products and services also have to believe in his statements and predictions, and that Kurzweil is the one who deserves their money.

If I were to single out one of these two parties for having a greater financial incentive in the perpetuation of their ideas, my money would be on the businessman/entrepreneur, not on the research fellow working for a charity.

Comment author: SimonF 06 March 2011 07:24:44PM 1 point [-]

I find your phrasing to be dishonest, especially because you do provide arguments.

Comment author: Pavitra 06 March 2011 07:57:52PM 4 points [-]

Your comment doesn't make sense to me. Are you perhaps missing a "not"?

Comment author: timtyler 05 March 2011 12:03:12PM *  0 points [-]

That his prediction timeframes are driven by his own hope for immortality

Probably Kevin Kelly's point.