If it's worth saying, but not worth its own post, then it goes here.
Notes for future OT posters:
1. Please add the 'open_thread' tag.
2. Check if there is an active Open Thread before posting a new one. (Immediately before; refresh the list-of-threads page before posting.)
3. Open Threads should start on Monday, and end on Sunday.
4. Unflag the two options "Notify me of new top level comments on this article" and "
Okay, I finished reading the book, and then I also looked at the wiki. So...
A few years ago I suspected that the biggest danger for the rationalist movement could be its own success. I mean, as long as no one gives a fuck about rationality, the few nerds are able to meet somewhere in a corner of the internet, debate their hobby, and try to improve themselves if they so desire. But if somehow the word "rationality" becomes popular, all the crackpots and scammers will notice it and start producing their own versions -- and since they won't care about actual rationality, they will have more degrees of freedom, so they will probably produce more attractive versions. Well, Gleb Tsipursky is already halfway there, and this Athene guy seems to be fully there... except that instead of "rationality", his applause light is "logic". Same difference.
Instead of nitpicking a hundred small details, I'll try to get right to what I perceive as the fundamental difference between LW and "logic nation":
According to LW, rationality is hard. It's hard because our monkey brains were never designed by evolution to be rational in the first place -- just to use tools and win tribal politics. That's what we are good at. The path to rationality is full of a thousand biases, and often requires going against your own instincts. This is why most people fail. This is why most smart people fail. This is why even most of the smartest ones fail. Humans are predictably irrational, their brains have systematic biases, and even smart people believe stupid things for predictable reasons. Korzybski called it "map and territory", other people call it "magical thinking", here at LW we talk about "mysterious answers to mysterious questions" -- this all points in approximately the same direction: human brains have a predictable tendency to just believe some stupid shit, because from the inside it seems perfectly real, actually even better than the real thing. And smarter people just do it in more sophisticated ways. So you have to really work hard, study hard, and even then you have only a tiny chance of becoming fully sane; but without current research and hard work, your chances are zero for all practical purposes.
"Logic nation" has exactly the opposite approach. There is this "one weird trick", when you spend a few hours or weeks doing a mental exercise that will associate your positive emotions with "logic", and... voilà... you have achieved a quantum leap, and from now on all you have to do is to keep this emotional state, and everything will be alright. Your faith in logic will save you. And the first thing you have to do, of course, is to call your friends and tell them about this wonderful new thing, so they also get the chance to "click". As long as you keep worshiping "logic", everything will be okay. Mother Logic loves you, Mother Logic cares for you, Mother Logic will protect you, Mother Logic created this universe for you... and when you fully understand your true nature, you will see that actually Mother Logic is you. (Using my own words here, but this is exactly how what I have seen so far seems to me.)
Well, to me this smells like exactly the kind of predictable irrationality humans habitually do. Take something your group accepts as high-status and start worshiping it. Imagine that all your problems will magically disappear if you just keep believing hard. Dissolve yourself in some nebulous concept. How is this different from what the average New Age hippie believes? Oh yes, your goddess is called Logic, not Gaia. I rest my case.
I know that the topic of AI is too removed from our everyday lives, and most people's opinion on it will have absolutely no consequence on anything, but even there: Athene just waves his hand and says it will all be magically okay, because an AI smarter than us will of course automatically invent morality. (Another piece of predictable human irrationality, called "anthropomorphisation". Yeah, the AI will be just another human, just like the god of rain is just another human. What else could there be but a human?)
Speaking of instrumental rationality, the book you linked provides a lot of good practical advice. I was impressed. I admit I didn't expect to see this level of sanity outside LessWrong. Some parts of the book could be converted into 5 or 10 really good posts on LW. I mean that as a compliment. But ultimately, that seems to be all there is, and the rest is just huge hype around it. (Recently LW has been kind of dying, so to get an idea of what really high-quality content looks like, see e.g. the articles written by lukeprog.) But speaking of epistemic rationality, the "logic nation" is far below the LW level. It's all just hand-waving. And salesmanship.
Also, I dislike how Athene provides scientific citations for very specific claims, but when he describes a whole concept, he doesn't bother hinting that the concept was already invented by someone else. For example, the wiki contains his bastardized version of the Tegmark Multiverse + Solomonoff Induction, but it is written up as something he just made up, using "logic". You see, science is only useful for providing footnotes for his book. Science supports Athene, not the other way round.
Eliezer, for all his character flaws, may perhaps describe himself as the smartest being in the universe (I am exaggerating here (but not by much)), but he still tells you about Kahneman and Solomonoff and Jaynes and others, and encourages you to go and read their books.
Etc. The summary is that Athene provides a decent checklist of instrumental rationality in his book, but everything else is just hype. And his target audience is people who believe in "one weird trick".
Try reading the Sequences and maybe you will see what I was trying to describe here. That is a book that often moves people to a higher level of clarity in their thinking, where things that previously seemed awesome become "oh, now I see how this is just another instance of this cognitive error". I believe what Athene is doing is built on such errors; but you need to recognize them as errors first. Again, I am not saying he is completely wrong; he does have useful things to offer. (I haven't listened to his podcasts yet; if they expand on the material from the book, they could be valuable. Although I strongly prefer written text.) It's just that there is so much hype about something that was already done better. So obviously people on this website are not going to be very impressed. But it may be incredibly impressive to someone not familiar with the rationalist community.
If you are familiar with mathematics, what do you think about this part: https://logicnation.org/wiki/A_simple_click#Did_God_create_logic.3F Is it falsifiable? There was an interesting talk about how something can arise out of nothing and how that relates to the present moment, which one can never quite grasp, but I will have to condense it for you guys later.