Previous post: Fighting the allure of depressive realism

My blog entries are about a personal battle against depression and anxiety, from the point of view of someone who has been immersed in rationalist/LW ideas and culture for a few years now.

I want to illustrate a particular, recurring battle I have with scrupulosity. (I'm not the best at dialogues, so bear with me for a moment.)

Me: Alright, it's time to be productive and get to my homework.
???: Hold on! How can you possibly justify that if you haven't solved ethics yet?
Me: What? Who are you?
SD: Allow me to introduce myself, I'm your Skeptic Demon. I whisper ethical concerns in your ear to make sure you're always doing the right thing.
Me: That sounds more like a daemon than a demon to me.
SD: Demon. Trust me.
Me: Solving ethics can't possibly be expected of a single person, demon.
SD: Right you are! But you've looked around the world enough to know that everything you do could have ripple effects that might be disastrous. So how can you possibly feel good about working on your homework without accounting for that?
Me: What? It's just homework.
SD: Oh, no it isn't. Doing well on homework means sending a better signal to employers means more people want to hire you down the line, including for unscrupulous activities. And you've done not-great things before, so we can't be sure you'll resist. In fact, the existence of first-, second-, third-, and nth-order effects implies you might not even realize when such an offer is being made.
Me: Erm... Well, it's true that things have unintended consequences, but--
SD: No "buts"! You want to be a good person, right? So we gotta reason this out.
Me: I guess you have a point...
SD: Alright. So let's get started.
(hours pass)
SD: Okay. You're on thin ice with some of these considerations. I'm not totally convinced you won't be tempted by the money to go and do something net harmful yourself, but I will give you a one-time pass. You may proceed to start your assignment.
Me: I'm exhausted and I just want to go to sleep now.
SD: Then my work is done here. *disappears in a puff of shaky logic*

This kind of conversation happens to me all the time. Why?

On one level, it's easy to see what the skeptic demon is doing. He's trolling. He's keeping me from doing the actual productive work I want to do, and, very curiously, never pops up to ask whether my watching TV or even my eating meat is ill-advised.

But he's trolling with a legitimate issue - the fact that we can't actually predict all of the possible consequences of our actions. It feels wrong to say that someone should be held ethically responsible for the sum total of that butterfly effect, but it feels equally wrong to deny they have any stake in it whatsoever. Trolls are worst when they find an issue that is near and dear to your heart and poke at it.

What to do? I'd like to at least justify why I think it's okay to ignore this little guy.

I think we can get a lot of mileage here out of the old Carl Sagan heuristic, "Extraordinary claims require extraordinary evidence." Here, it becomes: extraordinary ethics require extraordinary arguments. And the idea that I should sabotage my own career out of the fear that I might accidentally harm someone down the line due to my own weakness is one heck of an extraordinary ethic.

For one, this ethic immediately fails my pop-philosophy understanding of the categorical imperative. If everyone acted like this, modern society and all of its woes would crumble, but so would its many, many, many benefits.

It also fails my understanding of why we usually give self-interest a seat at the table in ethics, even if we worry about its excesses: A world in which everyone spends all of their energy trying to make other people happy but never takes time for themselves is a world where everyone runs themselves ragged and is uniformly miserable.

We could make the argument that people are far less morally responsible for second-, third-, etc. order effects from many different angles, one of my favorites being local validity. And so on.

I'm not sure how far I can take this heuristic before it breaks, but I think it's a wise starting point when it comes to issues of scrupulosity.

6 comments:

Butterfly effects are essentially unpredictable, given your partial knowledge of the world. Sure, your doing homework could cause a tornado in Texas, but it's equally likely to prevent one. To actually predict which, you would have to calculate the movement of every gust of air around the world. Otherwise you're shuffling an already well-shuffled pack of cards. Bear in mind that you have no reason to distinguish the particular action of "doing homework" from a vast set of other actions. If you really did know what actions would stop the Texas tornado, they might well look like random thrashing.
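
To make the "already well-shuffled deck" point concrete, here is a toy simulation (the model and numbers are invented purely for illustration, not a claim about real causal chains): when the far-off outcome depends on a long run of steps you cannot observe, conditioning on "did homework" versus "didn't" leaves its probability essentially unchanged.

```python
import random

# Toy model (illustrative only): the "tornado" is determined by the parity of
# one early choice plus a long run of hidden coin flips you cannot observe.
def tornado_happens(did_homework: bool, hidden_steps: int = 100) -> bool:
    hidden_flips = sum(random.random() < 0.5 for _ in range(hidden_steps))
    return (int(did_homework) + hidden_flips) % 2 == 0

trials = 20_000
p_with = sum(tornado_happens(True) for _ in range(trials)) / trials
p_without = sum(tornado_happens(False) for _ in range(trials)) / trials
print(f"P(tornado | homework)    ~ {p_with:.3f}")     # ~0.5
print(f"P(tornado | no homework) ~ {p_without:.3f}")  # ~0.5: the deck is already shuffled
```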

What you can calculate is the reliable effects of doing your homework. So, given bounded rationality, you are probably best off basing your decisions on those. The fact that this only comes up for homework might suggest that you have an internal conflict between a part of yourself that thinks about careers and a short-term procrastinator.

Most people who aren't particularly ethical still do more good than harm. (If everyone looks out for themselves, everyone has someone to look out for them. The law stops most of the bad mutual defections in prisoner's dilemmas.) Evil geniuses trying to trick you into doing harm are much rarer than moderately competent nice people trying to get your help to do good.

Personally, I just dismiss scrupulosity as an error. I don't need a justification for doing this, any more than I need a justification for concluding that if, when doing some mathematics, I derive a contradiction, then I must have made an error somewhere. Call this the absurdity heuristic, call it a strong prior, but obsessing over the unknowable potentially enormous consequences of every breath I take is an obvious failure mode, and I don't do obvious failure modes. Instead, I do what looks like the right thing for me to do, the thing that only I will do. (That is just a rough description of a rule of thumb, not something with a detailed philosophical analysis behind it.)

This probably makes me a bad person by the standards of the typical EA. I have no interest in dissuading them from their calling, but it is not my calling.

A way I would suggest looking at it: your scrupulosity dæmon has its own estimates of prior probabilities for such things as “you being a fundamentally decent person” and “you being on the cusp of accomplishing dreadful evil, maybe by accident”, and those estimates are, respectively, low and high, because the dæmon shares your mental substrate and inherits the effects of depression.

As with us, the dæmon’s priors guide its theorising. It expects you to accomplish dreadful evil. How? Well, if your prior probability for something is high, and there isn’t a simple explanation, then there must be a complicated explanation! And people are very good at producing complicated hypotheses out of nothing; we’ve been doing it for millennia.
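
To put toy numbers on that (the priors and likelihood ratio below are purely illustrative): the same weak, ambiguous piece of evidence barely moves a healthy prior, but looks like confirmation to a prior already convinced of the worst.

```python
# Illustrative Bayes update: the same weak evidence, two different priors.
def posterior(prior_evil: float, likelihood_ratio: float = 1.2) -> float:
    """likelihood_ratio = P(evidence | dreadful evil) / P(evidence | decent person)."""
    prior_odds = prior_evil / (1 - prior_evil)
    post_odds = prior_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

print(posterior(0.05))  # healthy prior: ~0.06, shrug and move on
print(posterior(0.70))  # the daemon's prior: ~0.74, "see, there must be an explanation"
```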

One thing that's helped me is to more proactively cache these internal dialogues, and when they start again, to ask myself: "Do I have any new significant information on this topic compared to the last time I spent a few hours on it?" If not, I can skip to the end and use the one-time pass before I'm exhausted.

Also, after enough repetition and variation, I'm pretty convinced that "it all adds up to normal" applies to non-extreme ethical choices as well as it does to non-extreme physics.


To generalize this heuristic a bit, and to really highlight where its weaknesses lie: An ethical argument that you should make a significant change to your lifestyle should be backed up more strongly in proportion to that change.

For example, to most people, the GWWC 10% pledge is a mildly extraordinary claim. (My parents actually yelled at me when I donated my Christmas money at 17.) But I think it does meet our bar of evidence: 10% of income is usually no great hardship if you plan for it, and the arguments that the various EAs put forward for it are often quite strong.

Where this heuristic breaks down is left as an exercise for the reader. :)

A few ideas:

A. Utility maximization - if you donate a bunch of money to an effective charity and they save 100 lives, but in the course of delivering, say, medical supplies to somewhere (which is how the 100 lives were saved), one of the volunteers falls out of a boat and drowns:

1) It couldn't have been predicted in advance.

2) The gains outweigh the losses.

So the real question shouldn't be "Will something bad happen?" but "Is it remotely plausible that the bad will outweigh the good?" Yes, doing homework might, once in a million person-years, result in some terrible tragedy, and if you're really worried about paper then go electronic. The scenario I outlined sounds kind of ridiculous. Most actions don't have big impacts by themselves, let alone destroy the world. If you determine the course of action with the highest utility, it might not be as high as you think, but it's still the best thing you can do. (Using time effectively might involve lots of planning, but that might currently be less important than doing what you know you have to do right now.)
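
As a rough back-of-envelope version of that question (every number below is a guess, reusing the 100 lives from the scenario above and a "once in a million" rate for the freak accident): even under pessimistic assumptions, the plausible bad is nowhere near the reliable good.

```python
# Back-of-envelope expected value; every number here is a guess for illustration.
lives_saved = 100          # the reliable effect of the donation (from the scenario)
p_freak_accident = 1e-6    # guessed chance of the drowning happening at all
lives_lost_if_accident = 1

expected_net = lives_saved - p_freak_accident * lives_lost_if_accident
print(expected_net)  # ~100: "will the bad outweigh the good?" -- not remotely plausibly
```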

B. Is the "Skeptic Demon" helping? 'Cause if it's never done anything useful, and if it's going on about something it's gone on about before, it probably doesn't have anything new to say. (Writing stuff down might help. It seems like it'd be more useful to apply this to inaction, if that makes sense?)