Audio version of Rationality: From AI to Zombies out of beta

10 Rick_from_Castify 25 November 2015 08:19PM

All three volumes of the audiobook Rationality: From AI to Zombies are complete and available for purchase on Castify.  A big thanks to our narrator, George Thomas.  The total recording time is over 51 hours.

Five math-heavy essays in the second volume were left unrecorded because they didn't make for good listening.  To read them, you can download the ebook on a "pay-what-you-want" basis from intelligence.org.

Thanks again to the original Kickstarter backers; we wouldn't have been able to do this without you.

Direct Links:

Volume 1

Volume 2

Volume 3

Rationality: From AI to Zombies

80 RobbBB 13 March 2015 03:11PM

 

Eliezer Yudkowsky's original Sequences have been edited, reordered, and converted into an ebook!

Rationality: From AI to Zombies is now available in PDF, EPUB, and MOBI versions on intelligence.org (link). You can choose your own price to pay for it (minimum $0.00), or buy it for $4.99 from Amazon (link). The contents are:

  • 333 essays from Eliezer's 2006-2009 writings on Overcoming Bias and Less Wrong, including 58 posts that were not originally included in a named sequence.
  • 5 supplemental essays from yudkowsky.net, written between 2003 and 2008.
  • 6 new introductions by me, spaced throughout the book, plus a short preface by Eliezer.

The ebook's release has been timed to coincide with the end of Eliezer's other well-known introduction to rationality, Harry Potter and the Methods of Rationality. The two share many themes, and although Rationality: From AI to Zombies is (mostly) nonfiction, it is decidedly unconventional nonfiction, freely drifting in style from cryptic allegory to personal vignette to impassioned manifesto.

The 333 posts have been reorganized into twenty-six sequences, lettered A through Z. In order, these are titled:

  • A — Predictably Wrong
  • B — Fake Beliefs
  • C — Noticing Confusion
  • D — Mysterious Answers
  • E — Overly Convenient Excuses
  • F — Politics and Rationality
  • G — Against Rationalization
  • H — Against Doublethink
  • I — Seeing with Fresh Eyes
  • J — Death Spirals
  • K — Letting Go
  • L — The Simple Math of Evolution
  • M — Fragile Purposes
  • N — A Human's Guide to Words
  • O — Lawful Truth
  • P — Reductionism 101
  • Q — Joy in the Merely Real
  • R — Physicalism 201
  • S — Quantum Physics and Many Worlds
  • T — Science and Rationality
  • U — Fake Preferences
  • V — Value Theory
  • W — Quantified Humanism
  • X — Yudkowsky's Coming of Age
  • Y — Challenging the Difficult
  • Z — The Craft and the Community

Several sequences and posts have been renamed, so you'll need to consult the ebook's table of contents to spot all the correspondences. Four of these sequences (marked in bold) are almost completely new. They were written at the same time as Eliezer's other Overcoming Bias posts, but were never ordered or grouped together. Some of the others (A, C, L, S, V, Y, Z) have been substantially expanded, shrunk, or rearranged, but are still based largely on old content from the Sequences.

One of the most common complaints about the old Sequences was that there was no canonical default order, especially for people who didn't want to read the entire blog archive chronologically. Despite being called "sequences," their structure looked more like a complicated, looping web than like a line. With Rationality: From AI to Zombies, it will still be possible to hop back and forth between different parts of the book, but this will no longer be required for basic comprehension. The contents have been reviewed for consistency and in-context continuity, so that they can genuinely be read in sequence. You can simply read the book as a book.

I have also created a community-edited Glossary for Rationality: From AI to Zombies. You're invited to improve on the definitions and explanations there, and add new ones if you think of any while reading. When we release print versions of the ebook (as a six-volume set), a future version of the Glossary will probably be included.

Kickstarting the audio version of the upcoming book "The Sequences"

31 Rick_from_Castify 16 December 2014 01:01AM

LessWrong is getting ready to release an actual book that covers most of the material found in the Sequences. 

There have been a few posts about it in the past; here are two: the title debate and content optimization.

We've been asked if we'd like to produce the audiobook version and the answer is yes. This is a large undertaking. The finished product will probably be over 35 hours of audio.

To help mitigate our risk, we've decided to fund the audiobook through Kickstarter.  This lets us pre-sell it, so we're not stuck with a large production cost and no revenue.

The kickstarter campaign is here: https://www.kickstarter.com/projects/1267969302/lesswrong-the-sequences-audiobook

If you haven't heard of us before: we've already produced several sequences as audiobooks.  You can see them, and listen to samples indicative of the audio quality, here.

The Promoted Posts and the Metaethics sequence now available in audio

16 Rick_from_Castify 03 June 2014 02:00AM

We are proud to announce audio versions of the Less Wrong Promoted Posts and the Metaethics major sequence, both now available as Castify podcasts.

The Less Wrong Promoted Posts feed will carry every new promoted post tagged with the Creative Commons Attribution License.  We'll aim to have each one read and delivered to you via the podcast within 48 hours.  We've found this to be a good way to keep up with Less Wrong, especially for longer articles like last month's long-form post "A Dialogue on Doublethink" by BrienneStrohl.

The Metaethics Sequence is the latest sequence we've produced in audio.  We now have seven Less Wrong sequences in audio, with more on the way.

As always we appreciate your support and your feedback: support@castify.co.

 

Links:

Promoted Posts Subscription: http://castify.co/channels/51-less-wrong

Metaethics sequence: http://castify.co/channels/50-metaethics

Channels page: http://castify.co/channels

 

Audio Version of "How to Actually Change Your Mind"

14 Rick_from_Castify 18 February 2014 08:23PM

The audio version of the mega-sequence "How to Actually Change Your Mind" is now available.  It's the biggest sequence we've done yet, coming in at over 8 hours of audio.  Here at Castify we think it's the most important one we've produced so far.

On the wiki it says, "The most important technique that Less Wrong can offer you is 'How to Actually Change Your Mind'".  I couldn't agree more.  If you haven't read it already, I'd strongly recommend it.  If you have more free "ear time" than "eye time", or if you learn better by listening, this audio version is for you.

Feedback is welcome: support@castify.co

What I've learned from Less Wrong

79 Louie 20 November 2010 12:47PM

Related to: Goals for which Less Wrong does (and doesn’t) help

I've been compiling a list of the top things I've learned from Less Wrong in the past few months. If you're new here, or haven't been here since the beginning of this blog, perhaps my personal experience of reading the backlog of articles known as the Sequences can introduce you to some of the more useful insights you might get from reading and using Less Wrong.

1. Things can be correct - Seriously, I forgot. For the past ten years or so, I politely agreed with the “deeply wise” convention that truth could never really be determined or that it might not really exist or that if it existed anywhere at all, it was only in the consensus of human opinion. I think I went this route because being sloppy here helped me “fit in” better with society. It’s much easier to be egalitarian and respect everyone when you can always say “Well, I suppose that might be right -- you never know!”

2. Beliefs are for controlling anticipation (Not for being interesting) - I think in the past, I looked to believe surprising, interesting things whenever I could get away with the results not mattering too much. Also, in a desire to be exceptional, I naïvely reasoned that believing similar things to other smart people would probably get me the same boring life outcomes that many of them seemed to be getting... so I mostly tried to have extra random beliefs in order to give myself a better shot at being the most amazingly successful and awesome person I could be.
