Actions and Words: Akrasia and the Fruit of Self-Knowledge

10 Annoyance 15 April 2009 03:27PM

Knowing other people requires intelligence,

but knowing yourself requires wisdom.

Those who overcome others have force,

but those who overcome themselves have power.

- Tao Te Ching, Chapter 33:  Without Force, Without Perishing

Much has been written here about the issue of akrasia.  People often report that they really, sincerely want to do something, that they recognize that certain courses of action are desirable/undesirable and that they should choose them -- but when the time comes to decide, they do otherwise.  Their choices don't match what they said their choices would be.

While I'm sure many people are less than honest in reporting their intentions to others, and possibly even more aren't honest with themselves, there are still plenty of people who are presumably sincere and honest.  So how can they make their actions match their understanding of what they want?  How can their choices reflect their own best judgment?

Isn't that really the wrong question?


Proverbs and Cached Judgments: the Rolling Stone

15 Annoyance 01 April 2009 03:40PM

People have long noted that individuals diagnosed as schizophrenic usually manifest disturbances of language, communication, and abstract thought.  One way to examine that disturbance is to ask patients to interpret various common proverbs, as psychiatrists have done since before the turn of the century.  (Interested readers can find a layperson-suitable discussion of this method's utility in the modern day at the following link: AAPL newsletter.)

Originally, patients' responses were evaluated by their correctness.  Now they're graded on their degree of abstraction.  Responses that interpret the sayings literally or in simplistically concrete terms are generally considered signs of a failure to abstract, although illiterate or mentally challenged individuals also tend to respond that way, and individuals encountering a proverb for the first time are less likely to recognize its symbolic meaning.  It seems clear that cultural exposure to proverbial forms, to the idiomatic usage of phrases and scenarios, affects how we recognize such methods of communication.

But why was the 'correctness' criterion dropped?  Because perfectly normal people, whom no one would consider schizophrenic, often gave interpretations that wildly conflicted with what the interviewer considered to be the correct one.  Which interpretations were 'correct' depended heavily on the traditions and cultures that the listeners came from.

Let's consider a classic example of a proverb often given divergent interpretations:

The rolling stone gathers no moss.

People from societies where stability and slowly-developed connections are valued consider this saying to be a warning about the dangers of activity and change.  Unless you stay still, beautiful moss won't grow.  People from societies where activity and change are valued, however, consider it to be a prescription for how to avoid decay and degeneration.  If you don't keep moving, you'll be covered by moss!

When asked to explain their interpretation, people typically present the value of moss growth as desirable or undesirable, depending on the meaning they're defending.  But if you start out by asking people whether moss is something to seek or avoid, there's no clear preference outside of specific contexts.  People generally have no aesthetic preference either way; overall, people simply don't care.

So the symbolic meaning of the mossy growth doesn't determine how people interpret the saying; people invest the moss with meaning to justify the judgment they had already reached.  This may be an example of what people at this site would call a 'cached thought'.  Rather than giving a reason for their judgment, people reply with rationalizations that have nothing to do with why they reached their conclusion.  Rather than thinking about why they decided as they did, people bring out a ready smokescreen.

What's the actual logical structure of the saying? Rational analysis sheds a great deal of light on the question.  The meaning can be stated in various ways, all equivalent.

Stability is required for the development of certain states.  Activity is incompatible with the development of certain states.  (Desirable/undesirable) states can be (encouraged/prevented) by (engaging in/avoiding) (necessary precursors/incompatible conditions).

The saying encodes a pattern that expresses a relationship, but the pattern is devoid of evaluation.  It's a blank screen upon which people project their pre-existing values and judgments.  To truly understand the proverb, it's necessary to recognize which aspects of our perception are the saying itself, and which are our own ideas projected onto it.

Recognizing the Candlelight as Fire: Joshu Washes the Bowl

-11 Annoyance 29 March 2009 06:13PM

Joshu Washes the Bowl

A monk told Joshu: `I have just entered the monastery. Please teach me.'

Joshu asked: `Have you eaten your rice porridge?'

The monk replied: `I have eaten.'

Joshu said: `Then you had better wash your bowl.'

At that moment the monk was enlightened.

Mumon's Comment: Joshu is the man who opens his mouth and shows his heart. I doubt if this monk really saw Joshu's heart. I hope he did not mistake the bell for a pitcher.

It is too clear and so it is hard to see.
A dunce once searched for fire with a lighted lantern.
Had he known what fire was,
He could have cooked his rice much sooner.

When It's Not Right to be Rational

4 Annoyance 28 March 2009 04:15PM

By now I expect most of us have acknowledged the importance of being rational.  But as vital as it is to know what principles generally work, it can be even more important to know the exceptions.

As a process of constant self-evaluation and -modification, rationality is capable of adopting new techniques and methodologies even if we don't know how they work.  An 'irrational' action can be rational if we recognize that it functions.  So in an ultimate sense, there are no exceptions to rationality's usefulness.

In a more proximate sense, though, does it have limits?  Are there ever times when it's better *not* to explicitly understand your reasons for acting, when it's better *not* to actively correlate and integrate all your knowledge?

I can think of one such case:  It's often better not to look down.

People who don't spend a lot of time living precariously at the edge of long drops don't develop methods of coping.  When they're unexpectedly forced to such heights, they often look down.  When they do, subcortical instincts activate that cause them to freeze and panic, overriding their conscious intentions.  This tends to prevent them from accomplishing whatever goals brought them to that location, and in situations where balance is required for safety, the panic instinct can even cause them to fall.

If you don't look down, you may know intellectually that you're above a great height, but at some level your emotions and instincts aren't as triggered.  You don't *appreciate* the height on a subconscious level, and so while you may know you're in danger and be appropriately nervous, your conscious intentions aren't overridden.  You don't freeze.  You can keep your conscious understanding compartmentalized, not bringing to mind information which you possess but don't wish to be aware of.

The general principle seems to be that it is useful to avoid fully integrated awareness of relevant data if acknowledging that data dissolves your ability to regulate your emotions and instincts.  If they run amok, your reason will be unseated.  Careful application of doublethink, and avoiding confronting emotionally-charged facts that aren't absolutely necessary to respond appropriately to the situation, is probably the best course of action.

If you expect that you're going to be dealing with heights in the future, you can train yourself not to fall into vertigo.  But if you don't have opportunities for training down your reactions, not looking down is the next best thing.

On Seeking a Shortening of the Way

10 Annoyance 27 March 2009 05:11PM

"The most instructive experiences are those of everyday life."  - Friedrich Nietzsche

What is it that the readers of lesswrong are looking for?  One claim that's been repeated frequently is that we're looking for rationality tricks, shortcuts and clever methods for being rational.  Problem is:  there aren't any.

People generally want novelty and gimmicks.  They're exciting and interesting!  Useful advice tends to be dull, tedious, and familiar.  We've heard it all before, and it sounded like a lot of hard work and self-discipline.  If we want to lose weight, we don't do the sensible and quite difficult thing and eat a balanced diet while increasing our levels of exercise.  We try fad diets and eat nothing but grapefruits for a week, or we gorge ourselves on meats and abhor carbohydrates so that our metabolisms malfunction.  We lose weight that way, so clearly it's just as good as exercising and eating properly, right?

We cite Zen stories but don't take the time and effort to research their contexts, while at the same time sniggering at the actual beliefs inherent in that system.  We wax rhapsodic about psychedelics and dismiss the value of everyday experiences as trivial - and handwave away praise of the mundane as utilization of "applause lights".

We talk about the importance of being rational, but don't work out what we actually need to do to become so.

Some of the greatest thinkers of the past had profound insights after paying attention to parts of everyday life that most people don't give a second thought.  Archimedes realized how to determine the volume of a complex solid while lounging in a bath.  Galileo recognized that pendulums could be used to reliably measure time while letting his mind drift in a cathedral.

Sure, we're not geniuses, so why try to pay attention to ordinary things?  Shouldn't we concern ourselves with the novel and extraordinary instead?

Maybe we're not geniuses because we don't bother paying attention to ordinary things.

Beginning at the Beginning

5 Annoyance 11 March 2009 07:23PM

I can't help but notice that some people are utilizing some very peculiar and idiosyncratic meanings for the word 'rational' in their posts and comments.  In many instances, the correctness of rationality is taken for granted; in others, the process of being rational is not only ignored, but dispensed with altogether, and rational is defined as 'that which makes you win'.

That's not a very useful definition.  If I went to someone looking for help selecting between options, and was told to choose "the best one", or "the right one", or "the one that gives you the greatest chance of winning", what help would I have received?  If I had clear ideas about how to determine the best, the right, or the one that would win, I wouldn't have come looking for help in the first place.  The responses provide no operational assistance.

There is a definite lack of understanding here of what rationality is, much less why it is correct, and this general incomprehension can only cripple attempts to discuss its nature or how to apply it.  One might expect this site to try to dispel the fog surrounding the concept.  Remarkably, a blog established to help in "refining the art of human rationality" neither explains nor defines rationality.

Those are absolutely critical goals if lesswrong is to accomplish what it advertises itself as attempting.  So let's try to reach them.


The human mind is at the same time both extremely sophisticated and shockingly primitive.  Most of its operations take place beneath the level of explicit awareness; we don't know how we reach conclusions and make decisions, we're merely presented with the results along with an emotional sense of rightness or confidence.

Despite these emotional assurances, we sometimes suspect that such feelings are unfounded.  Careful examination shows that to be precisely the case.  We can and do develop confidence in results, not because they are reliable, but for a host of other reasons. 

Our approval or disapproval of some properties can cross over into our evaluation of others.  We can fall prey to shortcuts while believing that we've been thorough.  We tend to interpret evidence in terms of our preferences, perceiving what we want to perceive and screening out evidence we find inconvenient or uncomfortable.  Sometimes, we even construct evidence out of whole cloth to support something we want to be true.

It's very difficult to detect these flaws in ourselves as we make them.  It is somewhat easier to detect them in others, or in hindsight while reflecting upon past decisions which we are no longer strongly emotionally involved in.  Without knowing how our decisions are reached, though, we're helpless in the face of impulses and feelings of the moment, even while we're ultimately skeptical about how our judgment functions.

So how can we try to improve our judgment if we don't even know what it's doing?

How did Aristotle establish the earliest-known examination of the principles of justification?  If he originated the foundation of the systems we know as *logic*, how could that be accomplished without the use of logic?

As Aristotle noted, the principles he made into a set of formal rules already existed.  He observed the arguments of others, noting how people defended positions and attacked the positions of others, and how certain arguments had flaws that could be pointed out while others seemed to possess no counters.  His attempts to organize people's implicit understandings of the validity of arguments led to an explicit, formal system.  The principles of logic were implicit before they were understood explicitly.

The brain is capable of performing astounding feats of computation, but our conscious grasp of mathematics is emulated and lacks the power of the system that creates it.  We can intuitively comprehend how a projectile will move from just a glimpse of its trajectory, although solving the explicit differential equations that describe that motion is terrifically difficult, and virtually impossible to accomplish in real-time.  Yet our explicit grasp of mathematics makes it possible for us to solve problems and comprehend ideas completely beyond the capacity of our hunter-gatherer ancestors, even though the processing power of our brains does not appear to have changed from those early days.

In the same way, our models of what proper thought means give us options and opportunities far beyond what our intuitive, unconscious reasoning makes possible, even though the conscious understanding works with far fewer resources than the unconscious.

When we consciously and deliberately model the evolution of one statement into another according to elementary rules that make up the foundation of logical consistency, something new and exciting happens.  The self-referential aspects of that modeling permit us to compare the decisions presented to us by the parts of our minds beneath the threshold of our awareness and override them.  We can evaluate our own evaluations, reaching conclusions that our emotions don't lead us to and rejecting some of those that they do.

That's what rationality is:  having explicit and conscious standards of validity, and applying them in a systematic way.  It doesn't matter if we possess an inner conviction that something is true - if we can't demonstrate that it can be generated from basic principles according to well-defined rules, it's not valid.

What makes this so interesting is that it's self-correcting.  If we observe an empirical relationship that our understanding doesn't predict, we can treat it as a new fact.  For example, let's say that we find that certain manipulations of tarot decks permit us to predict the weather, even though we have no idea of why the two should be correlated at all.  With rationality, we don't need to know why.  Once we've recognized that the relationship exists, it becomes rational for us to use it.  Likewise, if a previously-useful relationship suddenly ceases to be, even though we have no theoretical grounds for expecting that to happen, we simply acknowledge the fact.  Once we've done so, we can justify ignoring that which we previously considered to be evidence.
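The decision rule in the paragraph above can be sketched in code: trust an empirical predictor while its observed track record holds up, and withdraw that trust once the relationship decays, without ever needing a theory of why it worked.  This is only an illustrative sketch; the class, its parameters, and the chosen window and threshold are all hypothetical, not anything proposed in the post.

```python
from collections import deque


class EmpiricalPredictor:
    """Track a predictor's recent hit rate and decide whether it remains
    rational to rely on it.  All names and numbers here are illustrative."""

    def __init__(self, window=50, threshold=0.6):
        self.window = window        # how many recent outcomes to remember
        self.threshold = threshold  # minimum hit rate to keep trusting it
        self.outcomes = deque(maxlen=window)

    def record(self, predicted, actual):
        # We don't need to know *why* the predictor works;
        # we only track whether it does.
        self.outcomes.append(predicted == actual)

    def hit_rate(self):
        if not self.outcomes:
            return 0.0
        return sum(self.outcomes) / len(self.outcomes)

    def still_trustworthy(self):
        # Once the observed relationship decays below threshold,
        # we stop treating the predictor's output as evidence.
        return len(self.outcomes) >= 10 and self.hit_rate() >= self.threshold


tracker = EmpiricalPredictor()
for _ in range(30):
    tracker.record(predicted="rain", actual="rain")  # relationship holds
print(tracker.still_trustworthy())  # prints True while the correlation persists
```

The point of the sketch is that both adopting and abandoning the tarot-deck predictor are the same operation: updating on its track record, with no theoretical commitment either way.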

Human reasoning is especially plagued by superstitions, because it's easy for us to accept contradictory principles without acknowledging the inconsistency.  But when we're forced to construct step-by-step justifications for our beliefs, contradiction is thrown into sharp relief, and can't be ignored.

Arguments that are not made explicitly, with conscious awareness of how each point is derived from fundamental principles and empirical observations, may or may not be correct.  But they're never rational.  Rational reasoning does not guarantee correctness; rational choice does not guarantee victory.  What rationality offers is self-knowledge of validity.  If rational standards are maintained when thinking, the best choice as defined by the knowledge we possess will be made.  Whether it will be best when we gain new knowledge, or in some absolute sense, is unknown and unknowable until that moment comes.

Yet those who speak here often of the value of human rationality frequently don't do so by rational means.  They make implicit arguments with hidden assumptions and do not acknowledge or clarify them.  They emphasize the potential for rationality to bootstrap itself to greater and greater levels of understanding, yet don't concern themselves with demonstrating that their arguments arise from the most basic elements of reason.  Rationality starts when we make a conscious attempt to understand and apply those basic elements, to emulate in our minds the principles that make the existence of our minds possible.

Are we doing so?
