SilasBarta comments on Less Wrong Q&A with Eliezer Yudkowsky: Ask Your Questions - Less Wrong

16 Post author: MichaelGR 11 November 2009 03:00AM




Comment author: SilasBarta 12 November 2009 12:06:41AM 7 points [-]

Okay: Goedel, Escher, Bach. You like it. Big-time.

But why? Specifically, what insights should I have assimilated from reading it that are vital for AI and rationalist arts? I personally feel I learned more from Truly Part of You than all of GEB, though the latter might have offered a little (unproductive) entertainment.

Comment author: Kutta 13 November 2009 01:18:37AM *  4 points [-]

Why? I think maybe because GEB integrates form, style, and thematic content into a seamless whole in a unique and genuinely artistic way, while still being essentially non-fiction. And GEB is probably second to nothing at conveying the notion of an intertwined reality. It also provides a very intelligent and intuitive introduction to a whole lot of different areas. Sometimes you can't do all the work of conveying extremely complex ideas in a succinct essay; just look at the epic amount of writing Eliezer had to do merely to establish a bare framework for FAI discussion. (Besides, from the fact that Eliezer likes GEB it does not follow that GEB should be recommended reading for AI or rationalist arts. It just means that Eliezer thinks it's a good book.)

Comment author: SilasBarta 13 November 2009 03:18:30AM 0 points [-]

That doesn't answer my question. Again, what rationalist/AI mistake would reading GEB keep me from making that a shorter work could not?

Comment author: Kutta 13 November 2009 11:39:58AM *  0 points [-]

As I said, there is not necessarily any kind of rationalist/AI content in GEB directly relevant to us. It could well be simply a good book.

Comment author: SilasBarta 16 November 2009 07:23:20PM 0 points [-]

But would Eliezer view it as that durn good (i.e., it being a tragedy that people die without reading it) if it were just entertaining fluff with no insights into AI and rationality?

Comment author: Yorick_Newsome 29 November 2009 11:48:46AM *  2 points [-]

I'm not Eliezer, and perhaps not being an AGI researcher means my answer is irrelevant, but I think things can have a deep aesthetic value or meaning from which one could gain insights into things more important than AI or rationality. One of these things may be the 'something to protect' that Eliezer wrote about. Others may be intrinsic values to discover, values that give your rationality purpose. If I could keep only one, a copy of the Gospels of Buddha or a copy of MITECS, I would keep the Gospels of Buddha, because it reminds me of the importance of terminal values like compassion. When I read GEB, the ideas of interconnectedness, of patterns, and of meaning all left me with a clearer thought process than did reading Eliezer's short paper on Coherent Extrapolated Volition, which was enjoyable but just didn't resonate in the same way. Calling these things 'entertaining fluff' may be losing sight of Eliezer's 11th virtue: "The Art must have a purpose other than itself, or it collapses into infinite recursion."
That is all, of course, my humble opinion. Maybe having everyone read about and understand the dangers of black swans and unfriendly AI would be more productive than having them read about and understand the values of compassion and altruism; for if people do not understand the former, there may be no world left for the latter.