Dilbert creator and bestselling author Scott Adams recently wrote a LessWrong-compatible advice book that even contains a long list of cognitive biases. Adams told me in a phone interview that he is a lifelong consumer of academic studies, which perhaps accounts for why his book jibes so well with LessWrong teachings. Along with HPMOR, How to Fail at Almost Everything and Still Win Big should be among your first choices when recommending books to novice rationalists. Below are some of the main lessons from the book, followed by a summary of my conversation with Adams about issues of particular concern to LessWrong readers.
My favorite passage describes a talk Adams gave to a fifth-grade class, in which he asked everyone to finish the sentence “If you play a slot machine long enough, eventually you will…” The students all shouted “WIN!” because, Adams suspects, they had the value of persistence drilled into them and confused it with success.
“WIN!” would have been the right answer if you didn’t have to pay to play but the machine still periodically gave out jackpots. Adams thinks you can develop a system to turn your life into a winning slot machine that doesn’t require money but does require “time, focus, and energy” to repeatedly pull the lever.
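To see why “WIN!” really is the right answer for a free machine, here is a minimal sketch in Python (my own illustration, not from the book), assuming each pull pays out independently with some small fixed probability p:

```python
def chance_of_a_win(p: float, n: int) -> float:
    """Probability of at least one jackpot in n independent pulls,
    each winning with probability p: 1 - P(every pull loses)."""
    return 1 - (1 - p) ** n

# With free pulls, persistence wins eventually, almost surely:
for n in (10, 100, 1000):
    print(f"p=0.01, pulls={n:4d}: P(at least one win) = {chance_of_a_win(0.01, n):.3f}")
# p=0.01, pulls=  10: P(at least one win) = 0.096
# p=0.01, pulls= 100: P(at least one win) = 0.634
# p=0.01, pulls=1000: P(at least one win) = 1.000
```

The probability of never winning, (1 - p)^n, shrinks toward zero as n grows, which is why Adams treats “time, focus, and energy” for repeated pulls, rather than money, as the scarce resource.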
Adams argues that maximizing your energy level through proper diet, exercise, and sleep should take priority over everything else. Even if your only goal is to help others, be selfish with respect to your energy level, because it will determine your capacity for doing good. Adams has convinced me that appropriate diet, exercise, and sleep should be the starting point for effective altruists. Adams believes we have limited willpower and argues that if you make being active every single day a habit, you won’t have to consume any precious willpower to motivate yourself to exercise.
Since most pulls of the life slot machine will win you nothing, Adams argues that lack of fear of embarrassment is a key ingredient for success. Adams would undoubtedly approve of CFAR’s comfort zone expansion exercises.
Adams lists skills that increase your chances of success: public speaking, psychology, business writing, accounting, design, conversation, overcoming shyness, a second language, golf, proper grammar, persuasion, technology (at a hobby level), and proper voice technique. He gives a bit of actionable advice on each, basically ideas for becoming more awesome. I wish my teenage self had been told of Adams’ theory that a shy person can frequently achieve better social outcomes by pretending that he is an actor playing the part of an extrovert.
Adams believes we should rely on systems rather than goals, and indeed he thinks that “goals are for losers.” If, when playing the slot machine of life, your goal is winning a jackpot, then you will feel like a loser each time you don’t win. But if, instead, you are systems-oriented, then you can be in a constant state of success, something that will probably make you happier.
Adams claims that happiness “isn’t as dependent on your circumstances as you might think,” and “anyone who has experienced happiness probably has the capacity to spend more time at the top of his or her personal range and less time near the bottom.” His suggestions for becoming happier include improving your exercise, diet, and sleep; having a flexible work schedule; being able to imagine a better future; and being headed towards a brighter future.
The part of the book most likely to trouble LessWrong readers is when Adams recommends engaging in self-delusion. He writes:
“Athletes are known to stop shaving for the duration of a tournament or to wear socks they deem lucky. These superstitions probably help in some small way to bolster their confidence, which in turn can influence success. It’s irrelevant that lucky socks aren’t a real thing…Most times our misconceptions about reality are benign and sometimes, even helpful. Other times, not so much.”
For me, being rational means having accurate beliefs and useful emotions. But what if these two goals conflict? For example, a college friend of mine who used to be a book editor wrote “Most [authors] would do better by getting a minimum wage job and spending their entire paycheck on lottery tickets.” I know this is true for me, yet I motivated myself to write my last book in part by repeatedly dreaming of how it would be a best seller, and such willful delusions did, at the very least, make writing the book more enjoyable.
To successfully distort reality you probably need to keep two separate mental books: the false one designed to motivate yourself, and the accurate one to keep you out of trouble. If you forget that you don’t really have a “reality distortion field”, that you can’t change the territory by falsifying your map, you might make a Steve Jobs level error by, say, voluntarily forgoing lifesaving medical care because you think you can wish your problem away.
The strangest part of the book concerns affirmations, which Adams defines as the “practice of repeating to yourself what you want to achieve while imagining the outcome you want.” Adams describes his fantastic success with his affirmations, which included becoming a famous cartoonist, having a seemingly hopeless medical problem fixed, and scoring at exactly the 94th percentile on the GMAT. Adams writes that the success of affirmations, for him and others, seems to go beyond what positive thinking alone could achieve. Thankfully he rules out magic as an explanation and suggests that the apparent success of affirmations might be due to selective memory, false memories, optimists tending to notice opportunities, a selection effect among people who make affirmations, or science we don’t yet understand. His support of affirmations seems to contradict his dislike of goals.
Our Phone Conversation
I managed to get a 15-minute phone interview with Adams.
He has heard of LessWrong, but doesn’t have any specific knowledge of us. He thinks the Singularity is a very probable outcome for mankind. He believes it will likely turn out all right due to what he calls “Adams’ Law of Slow-Moving Disasters” which says that “any disaster we see coming with plenty of advance notice gets fixed.” To the extent that Adams is correct, we will owe much to Eliezer and associates for providing us with loud early warnings of how the Singularity might go wrong.
I failed in my brief attempt to get Adams interested in cryonics. He likes the idea of brain uploading and thinks it will make saving the biological part of us unnecessary. I was unable to convince him that cryonics is a good intermediate step until someone develops the technology for uploading. He mentioned that, so long as the Internet survives, a huge amount of him basically will as well.
Recalling Adams’ claim that having a high tolerance for embarrassment is a key attribute for success, I asked him about my theory of how American dating culture, in which it’s usually the man who risks rejection by asking the woman to go on the first date, gives men an entrepreneurial advantage because it eventually makes men more tolerant of rejection. Adams didn’t directly respond to my theory, but brought up evolutionary psychology, saying that men would have encountered much rejection because of their preference for variety and their role as hunters. Adams stressed that this was just speculation and not backed up by evidence.
Adams has heard of MetaMed. He is very optimistic about the ability of medicine to become more rational and to handle big data. When I pointed out that doctors often don’t know basic statistics, he said that this probably doesn’t impede their ability to treat patients.
After I explained the concept of akrasia to Adams, he mentioned the book “The Power of Habit” and told me that we can use science to develop better habits. (Kaj_Sotala recently wrote a highly upvoted LessWrong post on “The Power of Habit.”)
Adams suggested that if you have trouble accomplishing a certain task, focus on the part you can do: for example, if spending an hour at the gym seems too difficult, just think about putting on your sneakers.
Although he didn’t use these exact words, Adams basically defended himself against the charge of “other-optimizing.” He explained that it would be very difficult to describe a horse to an alien. But once you had succeeded in describing the horse, it would be much easier to describe a zebra, because you could do so in part by making references to the horse. Adams said he knows his advice isn’t ideal for everyone, but that it provides a useful template you can use to develop a plan better optimized for yourself.
At the end of the interview Adams said he was surprised I had not brought up assisted suicide, given his recent blog post on the topic. In the post Adams wrote:
“My father, age 86, is on the final approach to the long dirt nap (to use his own phrase). His mind is 98% gone, and all he has left is hours or possibly months of hideous unpleasantness in a hospital bed. I'll spare you the details, but it's as close to a living Hell as you can get...”
“If you're a politician who has ever voted against doctor-assisted suicide, or you would vote against it in the future, I hate your fucking guts and I would like you to die a long, horrible death. I would be happy to kill you personally and watch you bleed out. I won't do that, because I fear the consequences. But I'd enjoy it, because you motherfuckers are responsible for torturing my father. Now it's personal.”
Based on this blog post I suspect that Yvain would agree with Adams about the magnitude and intensity of the evil of outlawing assisted suicide for the terminally ill.
If you are unwilling to buy Adams’ book, I strongly recommend you at least read his blog, which normally has a much softer and less angry tone than the passage I just cited.
Comments

Bought the book based on these recommendations. Now I'd be curious to hear what people think about what Adams says about goals vs. systems (the “goals are for losers” advice summarized above).
This seems to somewhat contradict the advice in the massively-upvoted Humans are not automatically strategic, in which Anna Salamon suggests a list of concrete steps we should take to pursue our goals strategically.
On the other hand, some of the advice in "not automatically strategic" could also be seen as suggestions for how to evaluate your systems and set them up in a way that actually serves your aims... so the two approaches aren't necessarily as contradictory as they might seem at first.
Given that people untrained in the art of rationality don't do well with goals because they are not automatically strategic, the possible solutions are to forget about goals and instead use systems, or to take a more rational approach towards goals.