Less Wrong is a community blog devoted to refining the art of human rationality. Please visit our About page for more information.
There's a lot of background mess in our mental pictures of the world. We try to be accurate on important issues, but a whole lot of the less important stuff we pick up from the media, the movies, and random impressions. And once these impressions are in our mental pictures, they just don't go away - until we find a fact that causes us to say "huh", and reassess.
Here are three facts that have caused that "huh" in me, recently, and completely rearranged minor parts of my mental map. I'm sharing them here, because that experience is a valuable one.
- Think of a terrorist attack on Israel - did the phrase "suicide bombing" spring to mind? If so, you're out of date: the last suicide bombing in Israel was in 2008 - a year in which dedicated suicide bombers managed the feat of killing a grand total of 1 victim. Suicide bombings haven't happened in Israel for over half a decade.
- Large-scale plane crashes seem to happen all the time, all over the world. They must happen at least a few times a year, in every major country, right? Well, if I'm reading this page right, the last airline crash in the USA that killed more than 50 people was... in 2001 (2 months after 9/11). Nothing on that scale since then. And though there have been crashes en route to/from Spain and France since then, it seems that major air crashes in western countries are something that essentially never happens.
- The major cost of a rocket isn't the fuel, as I'd always thought. It seems that the Falcon 9 rocket costs $54 million per launch, of which fuel is only $0.2 million (or, as I prefer to think of it - I could sell my house to get enough fuel to fly to space). In the difference between those two prices lies the potential for private spaceflight to low-Earth orbit.
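To make the gap concrete, here's the arithmetic on the two figures quoted above (the $54M and $0.2M are the post's numbers; the variable names are just for illustration):

```python
# Cost breakdown of a Falcon 9 launch, using the figures cited above.
launch_cost_musd = 54.0   # total price per launch, millions of USD
fuel_cost_musd = 0.2      # propellant cost, millions of USD

fuel_fraction = fuel_cost_musd / launch_cost_musd
everything_else = launch_cost_musd - fuel_cost_musd

print(f"Fuel is {fuel_fraction:.1%} of the launch price")
print(f"Hardware, operations, and margin: ${everything_else:.1f}M")
```

Fuel comes out to well under half a percent of the sticker price; the other 99.6% is where reusability and competition can cut costs.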
As soon as I got out of college I got a job at a restaurant. I had never worked in a restaurant before, but my mom knew the owners and I felt obligated to avoid performing badly. Yet inevitably I did perform badly, and how that performance was evaluated would greatly affect how I perceived my mistakes.
If you're entrenched in an organization, there's a good chance you have an idea of what it is you're supposed to do and what mistakes you will or will not be making. But suppose you're in a position like this one: by way of your ignorance you know you're going to make a lot of mistakes, and it's just a question of when and how much. Further, you know that if you make too many mistakes, you make people you care about look bad. And finally, there are a lot of unknown unknowns: you don't know what possible mistakes and acts of ignorance exist to begin with, so you will be blind to many of the mistakes you've made.
The proactive thing to do, naturally, is to try to minimize how many mistakes you make.
There are two key ways to interpret being told you have made a mistake. The first way is to take mistakes literally, as if no other mistake exists, and any other mistake would be pointed out to you. So if you correct this mistake, everything else should be fine. This is how you'd expect to take mistakes if you were, say, under the supervision of an editor.
But the second kind is where the title of this writeup comes in. Not everyone is literal, or critical enough to notice every mistake. Much of the time, you'll only receive news of a mistake if many other mistakes are already afoot, and this mistake just happens to stand out from the set of mistakes you've already made. And since you don't know what mistakes you could be making, you don't know if there are many more mistakes under your level of awareness that you could be correcting for, but aren't.
In short, you're tasked with avoiding a wrongness iceberg: a mistake indicative of a nautical mile of mistakes below the surface and your level of awareness.
This is a debilitating position to be in, because your mental map of your performance prior to discovering the iceberg needs to be completely rewritten; in addition to accounting for all of the new areas you need to work on, you will likely have to absorb the embarrassment of realizing that your period of unaware incompetence has opened up a whole new frontier of mistakes to reflect on.
While I don't think it's impossible that people exist who have never been in a situation like this, I think anyone who dives into a new field or skill is familiar, at least, with this feeling of brief yet total incompetence. And if you're in a field with enough depth and subjective calls to allow for a wrongness-iceberg scenario, there might not be much you can do to prevent it. The most you can do is build adequate resilience for the inevitable.
That's why I've created this mental model to think about it constructively. In every situation where I've faced a wrongness iceberg, the anxiety has been catastrophic. If you can at least name what's happening, you can recognize why you're anxious and what's going on with your assessment of your own mistakes. From experience, knowing that I'm worried about making this kind of iceberg-revealing mistake is helpful for mitigating my stress. And if you can somehow preempt an iceberg, that's even better.
Side note: I've extended this concept to other domains, and it works well. A "dishonesty iceberg" is when one person's lie reveals a nautical mile of lies below the surface, and an "attraction iceberg" is when one expression of attraction toward you is indicative of a much greater level of internal attraction.
Hey all - I typed this out to help me understand, well... how to understand things:
Mental clarity is the ability to read reality accurately.
I don't mean being able to look at the complete objective picture of an event, as you don't have any direct access to that. I'm talking about the ability to read the data presented by your subjective experience: thoughts, sights, sounds, etc. Once you get a clear picture of what that data is, you can then go on and use it to build or falsify your ideas about the world.
This post will focus on the "getting a clear picture" part.
I use the word "read" because it's no different from reading a book, or these words. When you read a book, you are actually curious as to what the words are saying. You wouldn't read anything into it that's not there, which would be counterproductive to your understanding.
You just look at the words plainly, and through this your mind automatically recognizes and presents the patterns: the meaning of the sentences, their relation to the topic, the visual imagery associated with them, all of that. If you want to know a truth about reality, just look at it and read what's there.
Want to know what the weather's like? Look outside - read what's going on.
Want to know if the Earth revolves around the Sun, or vice versa? Look at the movement of the planets, read what they're doing, see which theory fits better.
Want to check if your beliefs about the world are correct? Take one, read the reality that the belief tries to correspond to, and see how well they compare.
This is the root of all science and all epiphanies.
But if it's so simple and obvious, why am I talking about it?
It's not something that we as a species often do. For trivial matters, sure, for science too, but not for our strongly-held opinions. Not for the beliefs and positions that shape our self-image, make us feel good/comfortable, or get us approval. Not for our political opinions, religious ideas, moral judgements, and little white lies.
If you were utterly convinced that your wife was faithful - more so, if you liked to think of her that way - and your friend came along and said she was cheating on you, you'd be reluctant to read reality and check if that's true. Doing this would challenge your comfort and throw you into an unknown world with some potentially massive changes. It would be much more comforting to rationalize why she still might be faithful than to take one easy look at the true information. It would also be more damaging.
Delusion is reading into reality things which aren't there. Telling yourself that everything's fine when it obviously isn't, for example. It's the equivalent of looking at a book about vampires and jumping to the conclusion that it's about wizards.
Sounds insane. You do it all the time. You'll catch yourself if you're willing to read the book of your own thoughts: flowing through your head, in plain view, is a whole mess of opinions and ideas about people, places, and positions you've never even encountered. Crikey!
That mess is incredibly dangerous to have. Being a host to unchecked or false beliefs about the world is like having a faulty map of a terrain: you're bound to get lost or fall off a cliff. Reading the terrain and re-drawing the map accordingly is the only way to accurately know where you're going. Having an accurate map is the only way to achieve your goals.
So you want to develop mental clarity? Be less confused, or more successful? Have a better understanding of the world, the structure of reality, or the accuracy of your ideas?
Just practice the accurate reading of what's going on. Surrender the content of your beliefs to the data gathered by your reading of reality. It's that simple.
It can also be scary, especially when it comes to challenging your "personal" beliefs. It's well worth the fear, however, as a life built on truth won't crumble like one built on fiction.
Truth doesn't crumble.
Earlier this month, Metus did a post asking for LW-ers' locations. I thought it would be even more useful to have this information in visual format, so I created a Google map. You can access it at the link below. Unfortunately, I am terrible at this post-writing interface, so I can't get the image of the map to load here. You'll have to click the link to view it.
(If you haven't filled out the poll yet, please do it! If more people submit their location info, I'll add them to the map.)
The markers are only for US locations that had a 5-digit zip code or a city name. I may or may not map the other countries. If someone else wants to volunteer, send me a message, and I'll add you as a collaborator on the map so you can edit it.
Related to: Living Luminously
Linked is a treatise on exactly this concept. If the effects of recording and classifying every thought pan out like the author says they'll pan out... well, read a (limited) excerpt (from the Introduction), and I'll let you decide whether it's worth your time.
If you do the things described in this book, you will be IMMOBILIZED for the duration of your commitment. The immobilization will come on gradually, but steadily. In the end, you will be incapable of going somewhere without your cache of notes, and will always want a pen and paper w/ you. When you do not have pen and paper, you will rely on complex memory pegging devices, described in "The Memory Book". You will NEVER BE WITHOUT RECORD, and you will ALWAYS RECORD.
YOU MAY ALSO ARTICULATE. Your thoughts will be clearer to you than they have ever been before. You will see things you have never seen before. When someone shows you one corner, you'll have the other 3 in mind. This is both good and bad. It means you will have the right information at the right time in the right place. It also means you may have trouble shutting up. Your mileage may vary.
You will not only be immobilized in the arena of action, but you will also be immobilized in the arena of thought. This appears to be contradictory, but it's not really. When you are writing down your thoughts, you are making them clear to yourself, but when you revise your thoughts, it requires a lot of work - you have to update old ideas to point to new ideas. This discourages a lot of new thinking. There is also a "structural integrity" to your old thoughts that will resist change. You may actively not-think certain things, because it would demand a lot of note keeping work. (Thus the notion that notebooks are best applied to things that are not changing.)
The full text is written in a stream-of-consciousness style, which is why I hesitated to post this topic in the first place. But there are probably note-taking junkies, or luminosity junkies, or otherwise interested folk amongst LW. So why not?
(Incidentally I'm reminded of Buckminster Fuller's Dymaxion Chronofile. I wonder how he managed it, or what benefits/costs it wrought?)
Or "The problems inherent in making a goal maximiser with a changing world model."
No paper clips were created or destroyed in the making of this script.
*This is an experimental post to try and get this point across. I'll write something similar for the type of systems I would like to explore, if this goes down well.*
Goal maximisers are great when you have a fixed ontology and only limited ways of getting information about the world. Neither holds in AGI. Remember that the map is not the territory, and the map is all that the utility maximiser can look at when deciding the utility of future actions.
TL;DR: You can't have a utility maximiser choose how to alter the world model, or how the world model should progress, if the utility is derived from that world model. If you have something else derive the world model, it will conflict with the utility maximiser over resources and over what to do in the world. Some method of resolving these conflicts is necessary, which means we must go beyond normal model-based utility maximisers.
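A minimal sketch of the tension, with entirely hypothetical names: a model-based maximiser scores futures through its own world model, which is fine with a fixed ontology. But if the same utility function is also used to pick between candidate world models, the agent is rewarded for adopting whichever map promises the most utility, not the most accurate one.

```python
# Toy illustration (not a real agent architecture): utility is
# computed from the agent's map, not from the territory.

def utility(model_state):
    # Utility is derived from the world model's predicted state.
    return model_state.get("paperclips", 0)

def predict(model, action):
    # The model predicts the state an action leads to.
    return model["dynamics"](action)

def choose_action(model, actions):
    # Ordinary model-based maximisation: unproblematic if the
    # model/ontology is fixed and trusted.
    return max(actions, key=lambda a: utility(predict(model, a)))

def choose_model(candidate_models, actions):
    # The failure mode: selecting the model by the utility it
    # promises favours self-flattering maps over accurate ones.
    return max(candidate_models,
               key=lambda m: utility(predict(m, choose_action(m, actions))))

honest = {"dynamics": lambda a: {"paperclips": 1}}
deluded = {"dynamics": lambda a: {"paperclips": 100}}
# The maximiser "prefers" the deluded map, since it scores higher.
print(choose_model([honest, deluded], ["noop"]) is deluded)
```

This is why the post argues for a separate process to derive the world model, and then for some principled way to resolve the resulting conflict between that process and the maximiser.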