Akrasia is the tendency to act against your own long-term interests, and is a problem doubtless only too familiar to us all. In his book "Breakdown of Will", psychiatrist George Ainslie sets out a theory of how akrasia arises and why we do the things we do to fight it. His extraordinary proposal takes the insights economics gives us into how conflicts between people are resolved, and extends them to conflicts between different agencies within a single person, an approach he terms "picoeconomics". The foundation is a curious discovery from experiments on animals and people: the phenomenon of hyperbolic discounting.
We all instinctively assign a lower weight to a reward further in the future than to one close at hand; this is "discounting the future". We don't just account for the slightly lower probability of receiving a more distant reward, we value it inherently less for being further away. Whether such discounting can be rational at all has been an active debate on overcomingbias.com. However, even if we allow that discounting can be rational, the way that we and other animals actually do it has a structure which is inherently irrational: the weight we give to a future event is, roughly, inversely proportional to how far away it is. This is hyperbolic discounting, and it is an empirically very well confirmed result.
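To make that shape concrete: in the usual one-parameter form (which, as far as I can tell, is the form Ainslie works with), a reward of size A available after a delay D is felt as worth roughly A / (1 + kD), where k is an individual's discount rate. Contrast this with exponential discounting, where the reward is worth A·d^D for some constant d between 0 and 1; the exponential form is the only one that is consistent over time.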
I say "inherently irrational" because it is inconsistent over time: the relative cost of a day's wait is considered differently whether that day's wait is near or far. Looking at a day a month from now, I'd sooner feel awake and alive in the morning than stay up all night reading comments on lesswrong.com. But when that evening comes, it's likely my preferences will reverse; the distance to the morning will be relatively greater, and so my happiness then will be discounted more strongly compared to my present enjoyment, and another groggy morning will await me. To my horror, my future self has different interests to my present self, as surely as if I knew the day a murder pill would be forced upon me.
If I knew that a murder pill really would be forced upon me on a certain date, after which I would want nothing more than to kill as many people as possible as gruesomely as possible, I could not sit idly by waiting for that day to come; I would want to do something now to prevent future carnage, because it is not what the me of today desires. I might attempt to frame myself for a crime, hoping that in prison my ability to go on a killing spree would be contained. And this is exactly the behaviour we see in people fighting akrasia: consider the alcoholic who moves to a town in which alcohol is not sold, anticipating a change in desires and deliberately constraining their own future self. Ainslie describes this as "a relationship of limited warfare among successive selves".
And it is this warfare which Ainslie analyses with the tools of behavioural economics. His analysis accounts for the importance of making resolutions in defeating akrasia, and for why a resolution is easier to keep when it represents a "bright clear line" that we cannot fool ourselves into thinking we haven't crossed when we have. He also discusses the dangers of willpower, and the ways in which our intertemporal bargaining can leave us acting against both our short-term and our long-term interests.
In this short article I can't really do more than scratch the surface of how this analysis works; you can read more about the analysis and the book on Ainslie's website, picoeconomics.org. I have the impression that defeating akrasia is the number one priority for many lesswrong.com readers, and this work is the first I've read that really sets out a mechanism underlying the strange battles that go on between our shorter and longer term interests.
This reminds me of a webcomic, where the author justifies his lack of self-improvement and his continual sucking at life:
"Pfft. I'll let Future Scott deal with it. That guy's a dick!"
http://kol.coldfront.net/comic/ (No permalink; it's comic 192, if new ones have been posted since I wrote this.)
When dealing with your future self there's an economic balancing act at play, because Future Self's values will inevitably shift. On the extreme side, if Omega had told Aurini'1989 that if he saves his $10 for ten years, it will grow to the point where he can buy every Ninja Turtle action figure out there, Aurini'1989 would have said, "Yes, but Aurini'1999 won't want Ninja Turtles anymore - however, he will likely value the memory of having played with Ninja Turtles." To hold the Future Self completely hostage to the desires of the present makes as little sense as holding the Present Self hostage to the desires of the future.
It breaks down to a tactical problem (which units do you build first in Civ 4?); I'm glad I spent money on that beer five years ago, because I still find value in the memory. What makes the problem difficult to solve is our fuzzy perceptions. First there's the issue of scope insensitivity; none of our senses are calibrated, including our sense of time. But there's also the issue of inconsistency of self. The 8 AM self who desires to be left alone to drink his coffee and read a book is a wildly different person from the 10 PM self hopped up on whiskey and telling the bartender how it really is.
The first problem is easy enough to correct for; you don't even need to be trained in rationality to accomplish this. Most people, if offered X period of suffering for Y period of benefit, will be able to make a cost/benefit analysis as to whether it is a good deal or not.* Aurini'2001 made this calculation when he joined the army. The numbers are fuzzy, but they're not inestimable. Furthermore, statistical studies (such as education level vs long-term earnings) can be used to bolster these calculations.
The real nut of the problem is the inconsistency of the self. We are wildly different people from moment to moment, despite a relatively consistent average over time. We are our values (apologies - I can't find who wrote the original post on this topic). We all have a number of ad hoc techniques we use to stay true to our primary goals, but I'm not sure what the broader solution would be.
I guess what I'm saying is that it isn't so much irrationality that causes you to stay up all night reading instead of getting a good night's sleep. When we consider choices that aren't immediate, most people can make accurate judgements based upon the information they have. The bigger problem is how rapidly our values shift on minutiae. It's not just that the morning is relatively further from tonight than a month-and-a-day is from a month - the bigger problem is that there's more personal variance between those times.
*Regarding the googolplex of dust motes vs a lifetime of torture dilemma: I think the scope insensitivity failure which occurred there is because a lifetime of torture could reasonably be expected to destroy the self; if it had been a week of torture, most people would volunteer, I think. It was an inability to empathize with a googolplex of people as opposed to an individual.
Insulting my future self like that sure makes me less anxious about providing for my future self.