half of all starting configurations are unable to end with all cards up and in fact will end with UUUUUUUUUUUUUUUUUUUD.
That is why the starting configuration is explicitly given as: "Twenty random cards are placed in a row all face down". If the number of starting face-down cards is odd, the process terminates with the last card face down and the rest face up; if it is even, it terminates with all the cards face up.
Oops, misread the problem statement. Of course you're right. (Though I think the problem is made slightly more interesting if you allow starting with an arbitrary configuration.)
In summary, the goal of this post is to start a discussion about the current meaning of 'rationality' as it is defined on Less Wrong. I am specifically trying to find out:
I think that the description below, from the What Do We Mean By 'Rationality' post, sums up the current meaning of 'rationality' as it is used on this site:
Now, I think that the definition or description of 'rationality' above is pretty good. In fact, if I wanted to introduce someone to the concept of rationality, I would probably refer to it, but I would explain that it is a working definition: it conveys the general idea, and most of the time this suffices. I have no problems with working definitions. One of my favorite ideas on Less Wrong is that words and concepts are pointers to areas in concept space. This idea allows you to use working definitions and to not waste your time on semantic issues. But an often neglected aspect of this idea is that you still need to ensure that the words you use point to the right, suitably restricted areas in concept space. Saying "I am not here to argue about definitions" does not abdicate your responsibility to create decent definitions. It is like saying: "I know that this definition is not perfect, but I think it is close enough to convey the general idea I am getting at". If that is all you are trying to do, then avoiding refining your definitions is fine. But it should be noted that the more important and widely cited a concept becomes, the more necessary it is to improve its definitions.
I think the definition of rationality above has two major problems:
Perhaps the biggest issue I have with the definition is not anything to do with how it currently is, but with how hard it is to improve. This sounds like a good thing, but it's not. The definition is hard to improve, not because it is perfect, but because instrumental rationality is just too big. Any plausible idea for improving the definition is likely to be quickly discarded, because it can be made to fall into the instrumental rationality category.
I am not going to provide a better definition of 'rationality', as the goal of this post is just to start a discussion. But I do think that many of the problems mentioned above are best solved by first choosing a simple core definition of what it means to be rational, and then having a separate myriad of areas in which improvements lead to increases in rationality. In general, the more granular, results-orientated and verified each of these areas is, the better.
A possible parent or base definition for 'rationality' is already on the wiki, which says that 'rationality' is the "characteristic of thinking and acting optimally". This seems like a pretty good starting point to me, although I admit that the definition itself is too concise and doesn't really tell us much, since 'optimal' is also hard to define.
That is not to say that we have no idea of what 'optimal' means; we do. It is just that this understanding (logic, probability theory, decision theory, etc.) mostly relates to the normative sense of the word. This is a problem because we are agents with certain limitations and adaptations that often make our attempts to do things the normative way impractical, cumbersome and flawed. It is for this reason that any definition of 'rationality' should be about more than just 'get closer to the normative model'. Of course, getting closer to the results of the normative model is always good, but I still think that a decent definition of 'rationality' should take into account, for example, ecological and bounded rationality, as well as the values of the agent under consideration.