Inspired in part by Robin Hanson's excellent article on paternalism a while back, and in response to the various akrasia posts.

In C.S. Lewis's fourth Narnia book, The Silver Chair, the protagonists (two children and a Marsh-wiggle) are faced with a dilemma regarding the title object.  To wit, they meet an eloquent and quite sane-seeming young man, who after a while reveals that he has a mental disorder: for an hour every night, he loses his mind and must be restrained in the Silver Chair; if he were released during that time he would become a giant, evil snake (it is a fantasy novel, after all).  The heroes determine to witness this, and the young man calmly straps himself into the chair.  After a few moments, a change comes over him and he begins struggling and begging for release, claiming the other personality is the false one.  The children are nonplussed: which person(ality) should they believe?  And (a separate question) which should they help?

In the book this dilemma is resolved by means of a cheat*, but we in real life have no such thing.  We do, however, have an abundance of Silver Chairs, in the form of psychotropic drugs from alcohol to hallucinogens to fancy antidepressants and antipsychotics. Of course not every person who takes such drugs is in a Silver Chair situation, but consider for instance the alcoholic who insists he doesn't have a problem, or the paranoid schizophrenic who fears that any drug is an attempt to poison him.  Now we as observers or authorities may know from statistics or even from their personal histories that the detoxed/drugged-up versions of these people would be happy for the change and not want to return to the previous state, but does that mean it's right (in a paternalistic sense, meaning for their own good) to force them towards what we call mental health?

I would say it is not, and that our preference for one side of the Silver Chair over the other is simple bias in favor of mental states similar to our own.  From our places near normality we can't imagine wanting to be in these bizarre mental states, so we assume that the people who are in them don't really want to be either.  They might claim to, sure, but why believe them?  After all, they're crazy.  For two amusing thought experiments in this line which have been considered in detail by others, let the bizarre mental state in question take the values "religious belief" and "sense of humor".  For a sobering real-world application, consider the fate of homosexuals until a few decades ago.  And then think about how, as Eliezer has said, the future, like the past, will be filled with people whose values we would find abhorrent.

This idea has internal relevance as well.  You could easily consider, for instance, the self introspecting at home who wants to lose weight and the self in a restaurant who wants to order cheesecake as two sides of a Silver Chair**. And I think that view is more helpful than just calling it "akrasia", because it presents the situation as two aspects of your personality which happen to want different things, instead of some "weakness" which is interfering with your "true will". Then instead of castigating yourself for weakness of will, you merely think "I suppose my desire for cheesecake was stronger than I anticipated.  When I return to a state where my desire to lose weight is dominant, I shall have to make stricter plans."

Again, I see a bias: we think that the desires (and in fact the entire mental state) which we have while, e.g., sitting alone calmly in a quiet room are the "true" ones, or even the "right" ones in some moral sense, and that feelings or thoughts we have at other times are "lesser" or akratic, simply because at the time when we're introspecting we can't feel the power of those other situations†, and of course we rightly privilege our calm-quiet thinking for its prowess in answering objective questions.  We spend (presumably) the bulk of our lives not engaged in quiet introspection, so why should we defer to what our desires are then?

Of course, one can always say "When I calmly introspect and plan things in advance, I end up happier/more successful than if I were to give in to my impulses".  To which I would respond "That's fine.  If happiness or success is what you want, and that method is effective, then go for it."  My point is that, just as you shouldn't condemn someone else for not conforming to the desires or thought patterns you think they ought to have, much less force them to conform, neither should you condemn yourself.  Your utils come from doing what you want, not being happy or successful, or finding the most efficient way to satisfy as many of your desires as possible, or anything else.

This idea also seems to have relevance to the topic-which-shall-not-be-named, but I guess this isn't the time for that.

* Specifically, the chairbound personality invokes the Holy Name of God, which breaks the symmetry. Not a solution many readers of this site would go for, I think.

** That phrasing is admittedly quite awkward; I guess the two sides would be "(sitting) in" and "out" of the chair.

† I once read that brain scans show that one cannot remember the sensation of sex/orgasms in the same way one can remember other, more ordinary sensations.  That doesn't jibe with my personal experience, but if true I think it gives interesting evidence.  A related phenomenon sometimes mentioned by poets (and which I have experienced) is that as you fall in love with someone, you actually find it harder to remember what they look like.

‡ One can also object that impulsive desires are incoherent: e.g. hyperbolic discounting.  But I would say that incoherence is a property of epistemic systems, i.e. things that must be explained by other things.  A desire doesn't need to be explained by anything or agree with anything; it merely is.  And paradoxes of wanting both X and !X don't seem to arise (or if they do, you can always kick in some rationality at that point).
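
For concreteness, here is a minimal toy sketch in Python of the preference reversal that makes hyperbolic discounting look inconsistent; the helper function, reward sizes, arrival times, and discount constant are all made up for illustration:

```python
# Toy illustration of hyperbolic discounting: the preference between a
# smaller-sooner and a larger-later reward flips as the smaller one draws near.
# All numbers below are invented for illustration.

def hyperbolic_value(reward, delay, k=1.0):
    """Subjective value of a reward arriving `delay` time units from now."""
    return reward / (1.0 + k * delay)

small_reward, small_time = 10, 6    # e.g. the cheesecake, available fairly soon
large_reward, large_time = 30, 15   # e.g. the payoff from sticking to the diet

for now in range(small_time + 1):
    v_small = hyperbolic_value(small_reward, small_time - now)
    v_large = hyperbolic_value(large_reward, large_time - now)
    pick = "small/soon" if v_small > v_large else "large/late"
    print(f"t={now}: small={v_small:.2f}, large={v_large:.2f} -> prefer {pick}")

# At t=0..2 the large, late reward is preferred; from t=3 onward the small,
# soon reward wins. With a single exponential discount rate no such reversal
# can happen, which is why hyperbolic discounters look "incoherent" from the
# calm, quiet-room perspective.
```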

40 comments

This idea has internal relevance as well. You could easily consider, for instance, the self introspecting at home who wants to lose weight and the self in a restaurant who wants to order cheesecake as two sides of a Silver Chair**. And I think that view is more helpful than just calling it "akrasia", because it presents the situation as two aspects of your personality which happen to want different things, instead of some "weakness" which is interfering with your "true will". Then instead of castigating yourself for weakness of will, you merely think "I suppose my desire for cheesecake was stronger than I anticipated. When I return to a state where my desire to lose weight is dominant, I shall have to make stricter plans."

IAWYC, but I have trouble with this particular example. Quite often when I do eat that piece of cheesecake, I'm thinking "Oh no, I hate myself, I really shouldn't be doing this". On the other hand, I have no such feelings when the diet-self wins out over the dessert-self. That suggests that there is some fundamental asymmetry, and not just two different but equal selves involved.

Perhaps this is a better example: I've noticed that the ability to draw well is extremely useful, and I used to say that I wanted to be able to draw. Lots of people say the same thing. Well, yes; I want to be able to draw. But after thinking about it, I don't want to put in the necessary effort to learn to draw. In all the moments of my day that I would have to give up, for many years, to do so, I'm doing something that I'd rather be doing than learning to draw. The "I want to read Less Wrong" me beats the "I want to learn to draw" me.

As I was teaching myself to draw, I got immense satisfaction from every incremental improvement in my skill, no matter how small. I think this sort of childlike attitude is a necessity for teaching yourself a complex skill in the presence of distractions.

Yeah, me too. There is a consistent asymmetry between the two sides of the Chair for me. The long-term goal is usually what I want to want to do, the short-term pleasure is usually what I want to do and want to not want to do. I think evbio explanations of adaptations maladapted for a world with different pleasure availability and a higher discount rate, where the long view is right and the short view is wrong, make a lot more sense than trying to treat the two sides symmetrically.

That said, there are certainly situations where the argument applies. When I wake up hungover, I may curse my previous night's partying, but as I'm only experiencing the downside at that moment it is not always clear to me whether I will, in a day, wish to have gotten drunk or not. But when I eat sugar, or read LW instead of working, I am usually wishing I was not doing it even while doing it, let alone days or weeks later.

I certainly agree that metawanting is involved in this phenomenon; that's part of why I requested more discussion on the topic (which Alicorn is attempting to do).

Ok I'll react. I for one have wound up following my desires as they are.

I mean, far too often I could see the two sides of a coin, like "chocolate tastes good | that's going to ruin my health", and realize that indeed, the first part was stronger than the second one. Since I incur a net willpower cost on behalf of the weaker side whenever I try to fight off the stronger impulse, I slowly but surely learned the habit of instantly bowing and submitting to my impulse, rather than wasting resources on what would turn out to be a lost cause in the end anyway.

At first I did it only on those occasions where it didn't look like it was even relevant to push myself. Then I expanded the set of occasions where I'd submit, again and again. I could often see how it was either harmless to concede ground, or irrelevant to keep it. And now I'm pretty much ridden with akrasia and laziness.

The state where you follow your "rational" judgment is more fragile and artificial than the one where you follow the natural impulse. It's easier to fall out of it than back in. And if you don't train it constantly, and if you grow into the habit of yielding, then you'll get even better at repeating the same pattern next time.

I'll give just one more general example: when you think something along the lines of "oh, I want X, but I shouldn't; ok, I'll just do a little bit of it, then I'll stop", where X can be a lot of things, like this apple pie, gaming time while you're studying, getting a wee bit too close to that special person you can't be close to, etc. Well, if it's already difficult to fight it off at first, it won't get easier once you've had a bit of it. To the contrary. And if you're sophisticated enough to generalize past experiences, pleasurable ones, then any such "mistake" will help you choose the impulse over the rational choice next time.

In an absolute sense, I'd tend to believe that something fragile needs to be protected. Also, not a few of the actions I'll take against my best judgment, I know to be nefarious in one way or another.

Yes, I damn well want those things I think I shouldn't want. In some cases, to the point where I'll actually defend them against myself or others' interference, even though I know that if I want to be consistent with my own values, I shouldn't defend them; rather, I should fight them. That, in general, sounds almost like a plea for help from the part of the mind that knows it's right, but is too fragile to survive once it has made a few mistakes and has to compete with something much more vivid. If we end up saying that whichever side wins you over is right, as long as you like it, then we're only selecting for the stronger of the two opponents. Does might make right?

Does this look any different in the light of hyperbolic discounting?

The funny thing about regret is that it's better to regret something you have done than something that you haven't done. I feel very fortunate in that I regret vanishingly little of what I have done - but I regret so much of what I haven't done.

I'll regret a lot when I get older, as I'll probably end up crippled physically, mentally and economically by my lifestyle. I am also forfeiting my chances at something better, post-singularity (or post longevity escape velocity, or whatever), if I can't make it to there. That's a lot of stuff I couldn't do, to be regretted. Fact is, however, that what I know intellectually doesn't move me anymore, or only very little. It doesn't connect to my feelings. There's no dread, no sense of responsibility, no hope, nothing, that can push me forward. But there's still something to keep me down. Any effort still feels like an effort. Doing something without any will to do it feels like an effort. A permanent one. And then there's little time and attention to spare for "important stuff" when there's a constant drone in the back of your mind, reminding you of all the little, meaningless things you could do to get some immediate pleasure. That trumps long-term projects, for which I feel much less motivated than I do for short-term ones.

I kinda knew about hyperbolic discounting, for a long time. It was fairly evident, from observing others as well as myself, that the natural tendency was to discount even large incentives applying to tomorrow in favor of those applying today. But I didn't discount my future so much when I was younger. In point of fact, I rarely did. I was ready to forfeit pleasure and even give up on a normal life, to develop myself, learn, study, spare money, eat healthily, etc.

Maybe I overdid it, and am suffering some kind of burnout. Or maybe I had enabling conditions that disappeared later on, like my family; or maybe it was that I was a child, and therefore didn't quite reason like an adult, and my neurology was different. Or maybe I received a few hard blows when I was a young teen, and that was more than I could handle.

So, no, it doesn't look too different in light of hyperbolic discounting. I'm stuck, and knowledge alone doesn't seem to be enough to help. Helping myself looks in principle as easy as "just doing it". In practice I can't, and I don't even see why. It's almost as if I was resisting getting better, as if I was protecting that flaw that's grown in me. For that, I must say the "stuck in the middle" article rang a bell too.

I kinda knew about hyperbolic discounting, for a long time.

You knew you were discounting the future, sure, but I don't think that's the same as knowing about hyperbolic discounting.

I am very much pro-pleasure; no-one should have to give up on that. I guess I'm lucky in that my favourite pleasures are relatively safe ones like sex and drugs, so I don't have to discount the future to enjoy them.

I'm really sorry to hear you feel so bleak about yourself and the situation you find yourself in; it sounds like you're not alone here.

The earliest description of akrasia that I'm aware of is by Paul, in the epistle to the Romans, chapter 7 verse 15: "For what I want to do I do not do, but what I hate I do" (NIV). The Christians explained these feelings as due to deliberate temptation by external (and internal) evil forces. The appropriate response to temptation is willpower. (A more doctrinal response might be faith and prayer, but in practice having faith seems a lot like giving yourself a willpower placebo.) That's probably where our ideas about how to deal with akrasia come from.

The best advice I've heard about silver chairs is that, if you're torn in two different ways about a decision, it's because both have high utility or both have low utility, but the difference in utility as far as you can see is negligible. And that means that you're indifferent; so you shouldn't worry about it.

I have some experience with dieting and exercise. I think that a lot of people who feel themselves in a silver chair over dieting are really suffering from merely believing that they want to diet (see earlier discussions by EY on OB); or from having set up "lose weight" as a final goal rather than as an instrumental goal. Once I really believed that dieting and exercise would get me what I wanted, it suddenly became easy. Which means that I was only doing what I wanted to do all along. The me in the silver chair eating the cheesecake wasn't really a different me. It was, like in the story, the more fully-aware me; and once I really wanted to lose weight, instead of merely believing that I wanted to lose weight, it cooperated happily.

Akrasia, both the term and its first instance, are from Plato (Wikipedia backs me up on this ;) ).

Wikipedia's summary is clearer, if less true to the text, than the one I wrote, so, basically, Plato says akrasia is impossible because: "A person never chooses to act poorly or against his better judgment; actions that go against what is best are only a product of being ignorant of facts or knowledge of what is best or good."

As for the weight loss thing, very few people desire "losing weight"; they desire "being fit" or "looking good." "Losing weight" is merely a means to those ends. It would be like saying you desire going to the dentist; almost no one wants to go to the dentist, but they do want to have good teeth, and it's necessary to that end. If you look at the problem this way, it makes more sense, though I agree that eating the cheesecake is not necessarily akratic.

As for the weight loss thing, very few people desire "losing weight"; they desire "being fit" or "looking good."

Actually, very few people desire "being fit" or "looking good" -- they are avoiding being fat and looking bad! This is not a trivial distinction, unfortunately.

More precisely, I should say that "few people who are trying and failing to lose weight desire being fit or looking good." Desiring those things is correlated with success, whereas avoiding other things is correlated with failure.

For one thing, avoidance-based motivation is cyclical: the more weight you lose, the less motivated you'll be to stay on your diet, since the thing you're avoiding is further away now.

For another, avoidance motivation is non-specific: it leads you to choose your weight loss methods according to what will least inconvenience you, rather than what will produce the best results. You will also not be committed to any particular technique, since your true goal is to get away from something, rather than moving towards any particular end-state.

Third and finally: avoidance-based motivation is inherently stressful, and stress is not conducive to healthy weight loss.

Two of these issues apply to avoidance-motivation for almost any goal, for almost any person. The third is dependent somewhat on mindset; growth-minded individuals aren't as stressed by avoidance motivation, presumably because they don't allow it to become chronic, or possibly because they don't view what they're avoiding as reflecting on them to the same extent that fixed-mindset people do.

Avoidance motivation is useful for noticing there's a problem, but it needs to be quickly turned into the choice of a particular desired direction and goal in order to be useful. This is why so many self-help books harp on defining goals.

Unfortunately, most of them do not explain that it is insufficient to reverse the verbalization of your goals to make them positive. Translating "I wish I weren't so fat and ugly" to "I want to weigh X pounds by Y date" is not merely a matter of changing the words you're using. The inner experience to which the words apply, must also be replaced.


For another, avoidance motivation is non-specific: it leads you to choose your weight loss methods according to what will least inconvenience you, rather than what will produce the best results.

IAWYC, but I don't see how this follows. If I want to get away from New York, I choose the fastest road, the same as if I were going to a point in that direction.

If I want to get away from New York, I choose the fastest road, the same as if I were going to a point in that direction.

If you have only ONE dimension of appropriate choice, sure. But the cognitive architecture for avoiding pain doesn't seem to make the same kind of trade-offs that the subsystem for obtaining pleasure does. We're willing to experience pain to get pleasure, but not as willing to trade one pain for reduction in another. We'll prefer to wait around for something that promises to eliminate the pain without adding any new ones.

That's why "easy" sells to people trying to lose weight, but "hard" sells to people who are trying to gain strength or build their body. Just look at the marketing for exercise products that boast just how tired their workout is going to make you, vs. the ones that emphasize how easy it's going to be; the correlation is with the prospect's direction of motivation, either towards or away-from.

If we were truly consistent in our motivated decision-making, everyone would advertise that their products are easy, as well as effective. In practice, advertising either targets easiness or toughness -- with toughness being used as a proxy for effectiveness.

The "easy" products emphasize ease, comfort, and relief, while treating the results as almost incidental... and they also emphasize just how fat and ugly people's "before" is. "Hard" products put more emphasis on their "afters", sometimes not even bothering with any "before" pictures!

So, whether it makes logical sense or not, the marketers have figured out that we actually do think this way, and have rationally adapted to maximize their utility. ;-)

Hm. I suppose I've never studied exercise literature that closely.

Akrasia, both the term and its first instance, are from Plato (Wikipedia backs me up on this ;) ).

Not surprising, since Paul is sometimes called the Platonizer of Christianity.


I would say it is not, that our preference for one side of the Silver Chair over the other is simple bias in favor of mental states similar to our own.

Interesting. I am reminded of how annoying it is to recall states of happiness and mirth when I'm upset about something, and vice versa.

I think you raise a very interesting question and I have strong sympathies for your conclusion. That having been said, there is a point at which it's possible to tell that someone is behaving irrationally in an absolute sense. By that, I mean that it can be determined that someone's behavior is systematically incompatible with achieving some important items among their desires (whatever those desires may be). I don't mean the desires they would have if something were done to get them out of the Silver Chair - I mean the desires they actually do have at the time, that they're not achieving because there is something interfering.

For instance, the drunk who does not believe he has a problem does not harbor a secret desire to destroy his family or die of cirrhosis at an early age or kill people in car accidents by driving under the influence. His belief that he does not have a problem is interfering with his avoidance of those nondesiderata. It's not that either his rational mind or the drinking have changed his desires re: kin estrangement/his liver/automobile accidents; it's that his desire for alcohol has hijacked his thought process to the point where he can't connect the excess of booze to the nondesiderata. If it's possible to excise the hijacking desire, other desires which are also important to the drunk will be in a better position to be fulfilled.

It's when we start being paternalistic, to achieve ends that the subject does not share at all and to save no one but the subject, that it becomes deeply murky territory.

How can you distinguish an "overridden thought process" from a merely very strong, overriding desire? I think many alcoholics would at least claim to be aware of the risks.

And I think that view is more helpful than just calling it "akrasia", because it presents the situation as two aspects of your personality which happen to want different things, instead of some "weakness" which is interfering with your "true will".

Nice point.

BTW, the Silver Chair contains the defense of Christianity that I've probably heard Christians cite more than any other single source: "Suppose we have only dreamed, or made up, all of those things—trees and grass and sun and moon and stars and Aslan himself. Suppose we have. Then all I can say is that, in that case, the made-up things seem a good deal more important than the real ones."


An important counterargument is, "If made-up things are that important, then the making-things-up process is even more important." Christianity's "made-up things" socially require you not to understand, improve, or re-extrapolate the making-things-up process.

Similarly, if Pascal's Wager is important, then the process that lets you find and make Pascal's Wagers is even more important. Christianity's Pascal's Wager socially requires you to not search for, see, or make other Pascal's Wagers more important than Christianity's Pascal's Wager.

People are mentioning different "I am currently conflicted between two decisions" scenarios in the comments, but I don't think those are very good examples of the problem described in the post. In that case, it is a single person who feels torn to two different directions. He has difficulty deciding between those two, yes, but neither of the forces pulling at him is by itself conscious and self-aware (as far as we know). The desire of a sentient being isn't being violated when somebody manages to make a difficult decision.

People are mentioning different "I am currently conflicted between two decisions" scenarios in the comments, but I don't think those are very good examples of the problem described in the post.

Yep. NLP calls the main subject of this post "sequential incongruity" (having differently-motivated states that are separate in time) to distinguish it from garden-variety incongruity (indecision/active inner conflict at a single point in time).


In at least some of these cases there's a likely asymmetry between the two Sides of the Chair, which may make it harder to maintain that we shouldn't favour one side's preferences over the other's. Namely, in some cases one of the two states is not sustainable. For instance, it's impossible and/or impractical to be high on (say) heroin for more than a smallish fraction of your life. (I think that's true, but I am not a drug expert. If it happens not to be true about heroin, that probably just means I picked a bad example. [EDIT: as David Nelson points out, that was in fact a bad example, but replacing heroin with hallucinogens fixes it. END OF EDIT]) So doing something that gratifies you-while-high but more severely harms you-while-not-high is likely to be bad for you-on-average.

In some other cases, one state is indefinitely sustainable but only at a cost that both versions might (if they could be convinced that the cost is real) find unacceptable. For instance, one can doubtless stay crazy for ever, but if one could explain to crazy-you that taking the antipsychotic drugs at least most of the time will make it possible to have a job, a partner-of-the-appropriate-sex, a life that doesn't involve being locked up all the time, etc., then I suspect that even crazy-you might agree that the price is worth paying. (Of course, if crazy-you is crazy enough then this is a hopeless project. Too bad; can't win 'em all.) Or: even when you're guzzling chocolate cake, you might well agree that you have to moderate your appetite most of the time because even cake-guzzling-you doesn't want to be a ball of lard six feet in diameter.

I don't claim that this asymmetry makes it obviously definitely right to enforce the wishes of you-in-the-sustainable-state even when you're in the unsustainable state and have very different wishes; but it does seem to offer a better justification for doing so than "simple bias in favour of mental states similar to our own". In some cases, at least.

Heroin/opiates are a bad example. Addicts with a steady supply, or chronic pain patients, are able to go years without skipping a day. I can't track it down right now, but I read a study a few years ago from some European country where they decided to try just giving a group of addicts all the heroin they wanted. The majority used every day, held down jobs, stayed out of jail, and were relatively healthy. Of course, the government spending money to give addicts drugs was an outrage, so the program got shut down early.

Hallucinogens would be a better example, there would be no way to function in society if you were constantly on one.

The book Licit and Illicit Drugs points out that one of the founders of Johns Hopkins was a heroin addict. Being a doctor, he was able to take it in pill form for many years and nobody was the wiser in terms of his productivity.

There's been more than one such experiment - this Google search finds results about one in Liverpool but mentions others in Sweden and other countries.


Thanks for improving my poor choice of example :-).

For instance, one can doubtless stay crazy for ever, but if one could explain to crazy-you that taking the antipsychotic drugs at least most of the time will make it possible to have a job, a partner-of-the-appropriate-sex, a life that doesn't involve being locked up all the time, etc., then I suspect that even crazy-you might agree that the price is worth paying.

Don't be sure about that. One of the reasons schizophrenics are so problematic is that they can't be trusted to take their meds. Apparently, in a subset of schizophrenics, the condition causes them to see the world as 'vivid', bright, popping out at them in great detail; and the medications bring them down to grey, dull, normal perceptions - which they don't like, and sometimes they will skip their medications to retrieve that vividness. Even if that does mean risking the normal life they've managed to piece together.

So doing something that gratifies you-while-high but more severely harms you-while-not-high is likely to be bad for you-on-average.

So can we derive a general principle from this? Perhaps: 'we should do what benefits us over the long run, and not the short'. So to take the cheesecake example: the duration of me which suffers for having eaten the cheesecake (both directly and for having broken my diet) is much, much longer than the duration of me enjoying the cheesecake, and so we shouldn't eat the cheesecake. If we leave out mention of peaks entirely, then we avoid issues like 'but the hallucinogen feels awesome!' Or to take the wireheading example: if the wireheading could be sustained over a long time, comparable to a normal lifespan, and we're not talking the Niven scenario of dying in a few weeks, why wouldn't we permit the wireheading? It is almost by definition not harmful. (And arguments about higher pleasures of life than direct neural stimulation are, I think, addressed by Mill's old quip about 'better to be Socrates dissatisfied'.) Basing our decisions on duration seems to me to deal with silver chairs satisfactorily. We appeal to decisions made 'sitting quietly' because that's what the rest of our life, averaged out, is like, and per the foregoing we should privilege the long-term over the short; a toy numerical version of this weighting is sketched below.

(And this privileging of duration even has some historical precedent; in Indian philosophy, one point of contention between Sankara/Vedantin and the Yogacara darsana was over being & perception; the Yogacara argued that we had no way to say that dreams were less real than reality, since we perceived both. Sankara's arguments consigning dreams to falsity were that waking life is longer than dreams, and waking life endures and contradicts dreams, while the reverse did not occur. Our argument here is somewhat analogous: which desire, the long-term average of our many selves, or the short-term current desire, is more 'real' and thus the one we should follow?)
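
To make the duration-weighting concrete, here is a toy comparison; the intensities and durations are invented for illustration, and the point is only that the answer turns on how a short, intense peak is weighed against a long, mild aftermath:

```python
# Toy duration-weighted utility comparison (invented numbers): treat total
# utility as intensity multiplied by duration, and compare the cheesecake
# episode against its longer-lasting aftermath.

pleasure_intensity, pleasure_minutes = 10.0, 15    # eating the cheesecake
regret_intensity, regret_minutes = -0.1, 60 * 24   # mild regret over the next day

total = pleasure_intensity * pleasure_minutes + regret_intensity * regret_minutes
print(total)  # 10*15 - 0.1*1440 = 150 - 144 = 6: the cheesecake narrowly wins here

# Halve the peak intensity, or stretch the regret over two days, and the sign
# flips; everything hinges on how strong the short-lived desire really is,
# which is exactly what is hard to measure from outside (or from the quiet room).
```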

So doing something that gratifies you-while-high but more severely harms you-while-not-high is likely to be bad for you-on-average.

Well, this gets to a point I made explicit in an earlier draft of this post: viz., how can an outside observer determine the strength of a desire? Meaning, if the heroin addict continues to get high, how can you say those short periods don't outweigh all the rest? (Or to use the example from that earlier draft: how can you say that the burst of satisfaction a suicide feels as he dies doesn't outweigh the entire remaining life he would have had?)

The obvious answer is brain scans or similar, which might easily show a maximum possible intensity of satisfaction, but I have a (probably irrational) distrust of such methods.

Do you see that asymmetry in wireheading? (Suppose that you have a modest trust fund.)


Dunno; but the fact that you probably couldn't without the trust fund seems relevant.

Your utils come from doing what you want, not being happy or successful, or finding the most efficient way to satisfy as many of your desires as possible, or anything else.

I agree with your main point. I disagree about this small bit, however, as it appears to presuppose that giving in to an impulsive or compulsive desire is "what you want", and perhaps also that you cannot change "what you want". Ainslie notes that even many addictive desires are not actually "wanted", and most of the field of NLP, as well as my own work, shows that it's actually not that hard to change "what you want".

That having been said, I agree that condemning what you want is useless. Unless you incorporate information about your actual utilon sources into your decision-making, you will not be able to change with any consistency.

(Or, to put it another way: if you refuse to specifically acknowledge what you're giving up by changing your behavior, it's unlikely that you'll succeed in extinguishing the old behavior, or preventing substitute behaviors from arising.)

I certainly agree that it's possible to change what you want, if you want to, but analyzing such meta-desires seems tricky. (I'm reminded of the passage in Carmen where the title character says something to the effect of "I love you now less than I did, and soon I won't love you at all.")

This is a very interesting post, and regardless of any errors some may feel it has, I think it has the right idea. One of the problems with attributing failure to akrasia is that it doesn't always feel like you would have made the right choice if you had willpower. For an example from my own life, sometimes rather than working on things that I do want to work on, I play video games, often rationalizing to myself that I just don't have the mental energy to do the work. But, really, it's more that I want to play the games, maybe more than I want to do the work, than that I failed to have the willpower to do the work (although I wish I had had the willpower to overcome my desire to play the games and do the work).

You've definitely given us more precise language to use when discussing "failures" of willpower.

The me of the future is a different person. But screw that guy, I'm sticking up for the me of now!

Your proposition is: "Utils come from you doing what you want, not being happy or successful, therefore, forcing people to do something they do not directly want to do is necessarily bad for their utility." This claim appears to contradict reality, or else it redefines utility into something new and quite different.

A necessary implication of this view would be that, given the constraints they faced, everyone is at their maximum possible utility, i.e. there is no choice that you (or anyone) would go back and change. Period. Full stop. Nothing you could have done would make your utility higher than it is.

This may be true for you personally, but I sincerely doubt that. Indeed, I doubt it's true for anyone with a working long-term memory. Thus, your definition is either empirically false or it redefines utility into something unrecognizable.

As your premise is false, the conclusion is trivial and unsupported.

Of course, if I misunderstand you, or if this theory can be supported by a true premise with a conventional concept of utility, please do so; the general idea is interesting and the post is certainly thought-provoking.

A couple of defenses/explanations for the "Your utility is already maximized" conclusion:

- Coercion: changing facts about choices (i.e. removing coercion) might increase your utility, but if you can't change the constraints you faced, you would not have changed a single decision.

- Information: One could argue you'd now change your choice due to better information. But if better information would increase your utility, then your utility stemmed from the consequences of your decision, not from your making the decision itself, thus falsifying the original proposition.

- Unconsidered alternatives: it is possible that your utility would be higher if you had realized an alternative choice that never came to mind. Thus, for all decisions you made, you chose the best of the options you considered; were you aware of more options, you might have actually done better. It's unclear from the phrasing of this proposition whether this would count as utility-increasing. Nevertheless, I doubt there's anyone who does not think his utility would be higher had he made a different choice that had come to mind at the time.


I am struggling with the general point, but I think in some situations it is clear that one is in a "bad" state and needs improvement. Here is an example (similar to Chris Argyris's XY case).

A: "I don't think I'm being effective. How can I be of more help to X?"

B: "Well, just stop being so negative and pointing out others' faults. That just doesn't work and tends to make you look bad."

Here, B is giving advice on how to act, while at the same time acting contrary to that advice. The values B wants to follow are clearly not the values he is actually following; furthermore, B doesn't realize that this is happening (or he wouldn't act that way).

This seems to be a state that is clearly "bad", and shouldn't be seen as just different. If I am demonstrably and obliviously acting against my values as I would express them at the time, then I clearly need help. Note that this is different from saying that I am acting against some set of values I would consider good if I were in a different/better state of mind. The values I am unknowingly transgressing are the ones I think I'm currently trying to fulfill.

Does this make sense? What are your reactions?

By the way, this is a common situation; people feeling stress, threat, or embarrassment often start acting in this way.


I don't think hypocrisy is as fundamentally different as you think. If it interferes with your other goals (e.g. by making you less rational than you need to be) then work on it, but if hypocrisy gives you warm fuzzies (which it seems to do for many people) then go ahead, although you shouldn't be surprised if other people judge you (or don't trust you) because of it.
