There is a lot of talk here about sophisticated rationality failures - priming, overconfidence, and so on. There is much less talk about what I think is the more common reason people fail to act rationally in the real world - something most people outside this community would agree is the most common rationality failure mode: acting emotionally. (pjeby has just begun to discuss this, but I don't think it's the main thrust of his post.)
While there can be sound evolutionary reasons for having emotions (the thirst for revenge as a Doomsday Machine being the easiest to understand), and while we certainly don't want to succumb to the fallacy that rationalists are emotionless Spock-clones, I think overcoming (or at least learning to control) our emotions would, for most people, be a more important first step toward acting rationally than overcoming biases.
If I could avoid saying things I'll regret later when angry, avoid putting down colleagues out of jealousy, avoid procrastinating out of laziness, and avoid shying away from correct decisions out of fear, I think this would do a lot more to make me into a winner than figuring out how to correctly calibrate my beliefs about trivia questions, or even getting rid of my unwanted Implicit Associations.
So the question is: do we have good techniques for preventing our emotions from making bad decisions for us? Something as simple as "count to ten before you say anything when angry" is useful if it works. Something as sophisticated as "become a Zen Master" is probably unattainable, but might at least point us in the right direction - and then there's everything in between.
This touches on what is, for me, one of the big open questions about what it means to act rationally. I question the common position that the kinds of 'irrational' decisions you describe are actually all that irrational. Many such decisions seem to be rational decisions for an agent with a high time preference at the moment of decision. They may seem irrational from the perspective of a future self who looks back on them when dealing with the consequences, but I see the problem as one of conflicting interests between present selves and past/future selves rather than strictly one of rationality. As the recent post discussed, rationality doesn't provide goals; it only offers a system for achieving goals. Many apparently irrational decisions are, I suspect, rational responses to short-term goals that conflict with longer-term goals.
If I decide to eat a chocolate bar now to satisfy a current craving, I am not really acting irrationally. I have a powerful short-term drive to eat chocolate, and there is nothing irrational in my actions to satisfy that short-term goal. Later on I may look at the scales and regret eating the chocolate, but that reflects either a conflict between short-term and long-term goals or a conflict between the goals of my present self and my past self (really just alternative ways of looking at the same problem). It is not a failure of rationality in short-term decision making; it is a problem of incentives not aligning across time frames, or between present and future selves. In order to find solutions to such dilemmas, it seems more useful to look to microeconomics and the design of incentive structures that align incentives across time scales than to ways to improve the rationality of decisions. The steps I take to acquire chocolate are perfectly rational; the problem is with the conflicts in my incentive structure.
It doesn't optimize for "you"; it optimizes for the gene that increases the chance of cheating. The "future" has very little "you" in it.