
[Video Link] PostHuman: An Introduction to Transhumanism

2 Post author: Joshua_Blaine 15 November 2013 02:04AM

http://youtu.be/bTMS9y8OVuY

I think this is a nice introduction to Transhumanism, inspired by the style of many well-known YouTube educators. Given how much LessWrong likes these ideas, I thought it was worth sharing.

The group also has a Kickstarter here to fund an entire series of videos of this kind. I think they deserve to be backed, and LW could probably influence the video creators in a helpful way.

Comments (11)

Comment author: knb 15 November 2013 06:30:58AM 4 points

It's a really well-made video. However, they make the claim that the doubling period for Moore's Law is shrinking, which I think is false.
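
The disputed claim can be made concrete with a toy model (my illustration, not anything from the video or the thread): a constant doubling period gives ordinary exponential growth, while a doubling period that itself shrinks with each cycle gives super-exponential growth. All numbers below are made up purely for illustration.

```python
# Toy comparison: fixed vs. shrinking doubling periods (illustrative only).

def fixed_doubling(years, period=2.0):
    """Growth multiplier after `years` with a constant doubling period."""
    return 2 ** (years / period)

def shrinking_doubling(years, period=2.0, shrink=0.9):
    """Growth multiplier when each successive doubling takes
    `shrink` times as long as the previous one."""
    elapsed, doublings = 0.0, 0
    while elapsed + period <= years:
        elapsed += period
        period *= shrink   # next doubling arrives sooner
        doublings += 1
    return 2 ** doublings

print(fixed_doubling(15))      # ~181x over 15 years (7.5 doublings)
print(shrinking_doubling(15))  # 8192x: 13 doublings fit in the same 15 years
```

With the shrinking period, the total time for infinitely many doublings is a convergent geometric series, which is why this form of the argument reads like a "singularity" claim; the empirical question is whether the semiconductor data actually shows the period shrinking.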

Comment author: Mestroyer 15 November 2013 07:50:52AM 4 points

I also don't like that argument (even when presented correctly) because it ignores software and assumes you can brute-force everything.

Comment author: timtyler 15 November 2013 11:31:16AM 2 points

Shane Legg prepared this graph.

It was enough to convince him that there was some super-exponential synergy.

Comment author: Joshua_Blaine 15 November 2013 03:03:51PM 1 point

I don't disagree with the idea of an A.I. intelligence explosion, but I do think their argument is the wrong one. It is a fairly common idea, though, especially among semi-laypeople, so I'm not surprised to have seen it in their video.

Everything else in the video seemed roughly accurate, though.

Comment author: somervta 15 November 2013 11:41:32AM 0 points

I'm pretty sure that was Kurzweil's claim originally. Can't remember if they attribute it to him in the video or make it themselves, though.

Comment author: somervta 15 November 2013 06:29:18AM 0 points

Thanks for posting this - I had seen it elsewhere, but forgot to watch it, and I never knew about the Kickstarter

Comment author: knb 15 November 2013 08:44:35AM 0 points

I also wonder if the extreme "abolitionist" position (with regard to suffering) should be listed alongside anti-aging and intelligence enhancement. Abolitionism seems like it might be off-putting even to people who would otherwise support the first two planks.

Comment author: davidpearce 14 January 2014 12:27:16PM 2 points

"Health is a state of complete [sic] physical, mental and social well-being": the World Health Organization definition of health. Knb, I don't doubt that sometimes you're right. But Is phasing out the biology of involuntary suffering really too "extreme" - any more than radical life-extension or radical intelligence-amplification? When talking to anyone new to transhumanism, I try also to make the most compelling case I can for radical superlongevity and extreme superintelligence - biological, Kurzweilian and MIRI conceptions alike. Yet for a large minority of people - stretching from Buddhists to wholly secular victims of chronic depression and chronic pain disorders - dealing with suffering in one guise or another is the central issue. Recall how for hundreds of millions of people in the world today, time hangs heavy - and the prospect of intelligence-amplification without improved subjective well-being leaves them cold. So your worry cuts both ways.

Anyhow, IMO the makers of the BIOPS video have done a fantastic job. Kudos. I gather future episodes of the series will tackle different conceptions of posthuman superintelligence - not least from the MIRI perspective.

Comment author: somervta 15 November 2013 11:42:13AM 0 points

It... was, wasn't it? Or did I mistake the bit about David Pearce and the elimination of suffering?

Comment author: knb 15 November 2013 08:24:36PM 0 points

It was. I'm saying I don't think it should have been.