David_Allen comments on Welcome to Less Wrong! (2010-2011) - Less Wrong

42 points · Post author: orthonormal 12 August 2010 01:08AM

Comment author: David_Allen 13 August 2010 12:20:49AM 7 points

My search began when I realized that I was confused. I was confused by what people did and what they said. I was confused by my responses to other people, how interacting with other people affected me. And I was confused about how I worked. Why I did the things I did, why I felt the way I did, why sometimes things were easy for me, and sometimes they were hard.

I learned very early in my life that I needed to critically analyze what other people told me. Not simply to identify truth or falsehood, but to identify useful messages in lies and harmful messages hidden in apparently truthful statements.

At the age of 11 I taught myself to program on a TRS-80, and in the process I discovered how to learn through play and exploration. Of course I had been learning in this way all along, but this was when I discovered the truth about how I learned. This realization has changed my approach to everything.

Computer programming confused me, so my search continued. By focusing on how I thought about programming, I quickly became very skilled. I learned how to explore problems and dissolve them into useful pieces. I learned how to design and express solutions in many programming languages and environments. I learned the theory of computation and how it is tied to philosophy, logic, mathematics and natural languages.

I worked in industry for 20 years, starting with internships. I've worked on large and small systems in low level and high level languages. I've done signal processing for engineering systems and developed web interfaces. I've worked alone, and in teams. I've run software teams launching companies.

Programming still confused me. I was frustrated and confused by how difficult it was to do programming well. In general it is very difficult to implement a simple idea, in a simple way that is simple to use. Even under ideal circumstances and in the best designed system, complexity grows faster than the code base. This dooms many projects to failure.

I am now coming to grips with the true nature of this problem, and with its solution. The problem rests in the nature of knowledge and meaning. The implications extend far beyond computer science and I intend to write articles on this topic for Less Wrong.

A core idea that I am exploring is the context principle. Traditionally, this states that a philosopher should always ask for a word's meaning in terms of the context in which it is being used, not in isolation.

I've redefined this to make it more general: Context creates meaning and in its absence there is no meaning.

And I've added the corollary: Domains can only be connected if they have contexts in common. Common contexts provide shared meaning and open a path for communication between disparate domains.

Comment author: Perplexed 15 August 2010 03:12:16PM 1 point

A core idea that I am exploring is the context principle. Traditionally, this states that a philosopher should always ask for a word's meaning in terms of the context in which it is being used, not in isolation.

I've redefined this to make it more general: Context creates meaning and in its absence there is no meaning.

And I've added the corollary: Domains can only be connected if they have contexts in common. Common contexts provide shared meaning and open a path for communication between disparate domains.

Some examples: In programming, an argument or message can be passed only if sender and receiver agree on the datatype of the argument (i.e. on how the bits should be interpreted). In Bayesian inference, all probabilities are conditional on background knowledge. In natural deduction (logic), complex sentences in simple contexts are decomposed into simple sentences in complex contexts.
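The datatype example can be made concrete: the very same bits yield different meanings under different interpretive contexts. A minimal Python sketch (the specific bit pattern is chosen for illustration):

```python
import struct

# The same four bytes, read under two different "contexts" (datatypes).
raw = struct.pack("<i", 1065353216)  # pack as a little-endian 32-bit integer

as_int = struct.unpack("<i", raw)[0]    # context: signed integer
as_float = struct.unpack("<f", raw)[0]  # context: IEEE-754 single-precision float

print(as_int)    # 1065353216
print(as_float)  # 1.0
```

Without agreement on which interpretation applies, the bytes carry no meaning at all; the datatype is the shared context.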

In all cases, there are rules for transferring information between context and "content". But you can never completely eliminate the context. You are always left with a residual context which may take the form of assumed axioms, rules of inference, grammars, or alphabets. That is, the residual is our way of representing the simplest possible context. I think that it is an interesting research program to examine how more complex contexts can be specified using the same core machinery of axioms, alphabets, grammars, and rules.

Comment author: David_Allen 03 September 2010 11:56:06PM 2 points

In Bayesian inference, all probabilities are conditional on background knowledge.

Absolutely. The interpretation of the evidence depends entirely on its meaning within the context at hand. This is why different observers can come to different conclusions given the same evidence; they have adopted different contexts.

For example: "...humans are making decisions based on how we think the world works, if erroneous beliefs are held, it can result in behavior that looks distinctly irrational."

So when we observe a person with behavior or beliefs that appear to be irrational, we are probably using a different context than they are. If we want to understand or to change this person's beliefs, we need to establish a common context with them, creating a link between their context and ours. This is essentially the goal of Nonviolent Communication.
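How different background contexts produce different conclusions from the same evidence can be sketched numerically with Bayes' rule; the priors and likelihoods below are invented for illustration:

```python
def posterior(prior, likelihood_h, likelihood_not_h):
    """Bayes' rule: P(H|E) = P(E|H) P(H) / P(E)."""
    evidence = likelihood_h * prior + likelihood_not_h * (1 - prior)
    return likelihood_h * prior / evidence

# Identical evidence (the likelihoods), different contexts (the priors).
skeptic = posterior(prior=0.01, likelihood_h=0.9, likelihood_not_h=0.2)
believer = posterior(prior=0.50, likelihood_h=0.9, likelihood_not_h=0.2)

print(round(skeptic, 3))   # 0.043
print(round(believer, 3))  # 0.818
```

Both observers update rationally on the same data, yet end up far apart, because the evidence only acquires its force relative to the background knowledge each brings.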

I also see ideas in Buddhism that can be phrased in terms of the context principle. Suffering (dukkha) is context-dependent. We may suffer under conditions that bring another joy. My wife, for example, dislikes most of the TV shows I watch. If she realizes that I am happy to put on headphones to spare her from exposure, she can experience gratitude instead of resentment.

In all cases, there are rules for transferring information between context and "content".

This is a key insight. If you can split a system arbitrarily between context and content, how do you decide where to make the split? In programming, which part of the problem is represented in the program, and which part in the data?
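A toy illustration of the program/data split (the policy and names are invented for the example): the same behavior can put the rules in the code, or move them into data served by a generic lookup. Where to draw that line is exactly the context/content decision.

```python
# Split 1: the rules live in the program; the code itself is the context.
def discount_in_code(customer_type):
    if customer_type == "student":
        return 0.10
    if customer_type == "senior":
        return 0.15
    return 0.0

# Split 2: the rules live in data; the program is a generic interpreter.
DISCOUNTS = {"student": 0.10, "senior": 0.15}

def discount_in_data(customer_type):
    return DISCOUNTS.get(customer_type, 0.0)

# Same observable behavior under either split.
assert discount_in_code("senior") == discount_in_data("senior") == 0.15
```

The data-driven split makes the rules easy to change without touching code, but the interpreter itself becomes a residual context that can never be moved into the data.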

This task can be arbitrarily hard. As I stated above:

In general it is very difficult to implement a simple idea, in a simple way that is simple to use.

The Daily WTF contains many examples of simple ideas implemented poorly.

But you can never completely eliminate the context. You are always left with a residual context which may take the form of assumed axioms, rules of inference, grammars, or alphabets. That is, the residual is our way of representing the simplest possible context.

In computer science you can ground certain abstractions in terms of themselves. For example the XML Schema Definition Language can be used to define a schema for itself.

The observable universe appears to be our residual common context. If we want to come up with a theory of everything (TOE) that explains this context, perhaps we need to look for one that can be defined in terms of itself.

I think that it is an interesting research program to examine how more complex contexts can be specified using the same core machinery of axioms, alphabets, grammars, and rules.

This sounds similar to what I am working on. I am working on a methodology for creating a network of common contexts that can operate on each other to build new contexts. There is a core abstraction that all contexts can be projected into.

Key ideas for this approach come from Language-oriented programming and Aspect-oriented programming.