Meetup : London Meetup - Effective Altruism
Discussion article for the meetup : London Meetup - Effective Altruism
LessWrong London are having another awesome meetup and we would love for you to come!
Since I recently attended the UK and USA Effective Altruism Summits (http://www.effectivealtruismsummit.com/), I thought that could be the topic of this event. I'll try to answer questions about it and describe what went on when we gathered a load of charities together for a long weekend. I'm sure some people will want to split off and chat about other things, and that is fine too.
The plan is to meet at The Shakespeare Inn, 200m from Holborn Underground, at 2pm on Sunday 31st August. We will officially finish at 4pm but many people stay longer. We will have a bit of paper with the LessWrong logo on it so you can find us easily.
If you have any questions, or are thinking of coming, feel free to email me (james) at kerspoon+lw@gmail.com or call me on 07429552244. Otherwise, just turn up!
Hope to see you there, James
P.S. err on the side of turning up, we're friendly, and it's fun :)
Goal Setting and Goal Achievement
The reason people succeed is well-focused, regular effort.
The reason for failure is more often a lack of effort than the direction of the effort. For example, more people gain weight because they couldn't stick to their diet than because they chose a diet that wouldn't have caused weight loss.
1. How the Brain Decides Which Task You Will Perform
Your subconscious decides on your motivation for a task, but you can consciously choose to use your self-control and not do the easiest task. For example, my subconscious says that watching TV requires the least motivation, but I have chosen to use up some of my self-control and write this article instead.
I'm going to discuss a few tactics to reduce the self-control required to complete a task (such as setting up a trigger) as well as ways of making your subconscious disfavour your distractions (for example by removing the plug from your TV).
1.1 Motivation
A good guideline for the motivation that your subconscious gives you for a task is given by the equation below:

Motivation = (Expectancy × Value) / (Impulsiveness × Delay)
The Expectancy and Value can be thought of in terms of gambling: the Value is how much you get paid if you win, and the Expectancy is the probability of winning. The Value of a task changes constantly, as it is how much you value completing the task at any given moment. Delay is the amount of time before you get the pay-off: for writing an essay it is likely to come only after the entire thing is written, while watching TV is an almost instant hit. Impulsiveness is a more complex factor, covered in Piers Steel's The Procrastination Equation.
If you are struggling to put effort into a task then it is probably because your motivation for it is lower than for its alternatives. For example, there are many things I could do at the moment: get some food, play computer games, go for a walk, write this essay, etc., and each has a certain level of motivation based on the above equation. If I make sure I'm not hungry then the first task is of such low value that it won't be a problem. If I delete my computer games so that I have to find the CD to play them, the increased delay makes them less of an issue.
Every time you get distracted, take note of the task and see why you have a higher motivation for it than for what you should be doing. Then go through each part of the equation in turn to see if you can reduce the motivation for tasks you don't want to do.
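As a toy illustration, the comparison above can be sketched in code using Steel's equation, Motivation = (Expectancy × Value) / (Impulsiveness × Delay). All task names and numbers here are invented purely for illustration; only the relative comparison matters.

```python
# Illustrative sketch of the procrastination equation. Every number below
# is made up; the point is only how environment changes shift the ranking.

def motivation(expectancy, value, impulsiveness, delay):
    """Motivation = (Expectancy * Value) / (Impulsiveness * Delay)."""
    return (expectancy * value) / (impulsiveness * delay)

# Watching TV: near-certain, modest payoff, almost instant hit.
tv = motivation(expectancy=0.95, value=3, impulsiveness=1.0, delay=0.1)

# Writing an essay: less certain, high payoff, but the reward is weeks away.
essay = motivation(expectancy=0.6, value=10, impulsiveness=1.0, delay=20)

# Hiding the TV plug raises the delay before the TV pays off, dropping
# its motivation below that of the essay.
tv_hidden = motivation(expectancy=0.95, value=3, impulsiveness=1.0, delay=15)

print(tv > essay)         # True: by default the TV wins
print(tv_hidden < essay)  # True: after changing the environment, the essay wins
```

The same comparison works for any distraction: raise its delay or lower its value until the task you care about comes out on top.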
1.2 Self Control
Self-control is the effort you put into a task to make sure you do it instead of one with a higher motivation. For example, getting a snack might have a higher motivation than continuing with my work, but I can use up some self-control to make sure I keep working.
Self-control is like a muscle: do too much and you will exhaust it, but if you use it successfully and regularly it will get stronger.
2. How to Set Goals to Increase Chances of Success
Below are three methods to increase the chances of succeeding at a task. They are based on the ideas in section one.
- The first method (triggers and mechanical starting points) works because you quickly program your subconscious to perform a task at a given time. It means you can sidestep the subconscious calculation of motivation until the task has started, and starting is often the hardest part.
- The second (mental contrasting) provides you with a more accurate guess for the value and expectancy of the task. This means you can work out whether you are likely to succeed before you have started. It also increases your motivation while doing the task.
- The third (prevention or promotion) frames the goal so that you get more motivation when you need it most. But you have to choose when you need it; there is a tradeoff.
It will still take a lot of self-control to achieve worthwhile tasks, but by using mechanisms such as these you can increase your chances.
2.1 Triggers and Mechanical Starting Points
The key to this method of goal setting is to set up simple triggers and a mechanical starting point. A trigger is something that tells you when to start working on your goal; a mechanical task is one that can be done with no thinking, or at least without conscious thought. By thinking of your goal in terms of triggers and starting points you can form a habit much more quickly. For example, if your goal is to become more flexible, you should convert it into one with a trigger and a mechanical starting point.
The trigger should specify exactly where and when you start. A good example would be "straight after brushing your teeth every weekday morning" (this is only a good example if you already have the habit of brushing your teeth every morning). A bad example would be "at 8:15am": what happens if you wake up a bit late and are still brushing your teeth at that time? A further problem with purely time-based triggers is that you won't be able to tell it is exactly 8:15 unless you set an alarm, and in a lot of cases it is not practical to have many alarms going off through the day.
A mechanical starting point is something that gets you started without having to think. Setting a timer for 10 minutes and swinging your leg back and forward would be a good starting point for the goal of becoming more flexible. If your goal was to write a book, a good starting point would be to get out your notebook and read through the last bit of what you wrote last time. A bad starting point would be to continue writing where you left off, as that requires conscious thought.
Some goals do not have a time-based trigger at all; for example, "not snacking" applies all the time. You still want to form a habit that your subconscious can follow without depleting your self-control. In that case I would suggest framing the goal in terms of "if ... then ...", e.g. "if I feel the urge to snack then I will eat a few nuts and seeds and drink a glass of water". For goals like this you can also factor in the brain taking the easiest option: if you make it a real effort to get to the snacks, you will be less likely to do so.
2.2 Mental Contrasting
The next key point is to consider alternating points of view of the goal. You want to be overly pessimistic about how hard it will be (i.e. think it will be difficult) but optimistic about your chance of success. The idea is to get to a point where you think:
"This will be hard to achieve but I know, with a lot of self control and effort I can do it, and it really will be worth it."
The way to do this is to alternate between thinking of reasons why the task will be difficult (or why you could fail) and thinking of why it is important to you to achieve it.
When you are thinking of the difficulties, imagine yourself in a realistic situation where you are likely to give up. Think up a bad day where you are hungry, angry, lonely, and/or tired (H.A.L.T.). After you have put yourself in that mindset, imagine what would tip you over the edge into failing your goal in some way. Picturing your mental state will help you form a more realistic view of your chances of success.
Then imagine that you have already achieved the goal. Think of the reasons why it helped you or made your life better. You can also imagine what it would be like not to achieve the goal. Really think about whether this would make any difference or not. This will help you work out how important the goal is to you. It is important to be brutally honest.
Repeat this cycle two or three times and you should have a good idea of whether you are likely to put in the effort when it counts, and whether that effort is worth it. You shouldn't need very long to do this; just make sure you do each part in turn (think first of the difficulties, then of the reasons).
You might realise that when the chips are down and you are hungry, angry, lonely, tired, or just having a bad day, you are likely to give up. If this is the case (remember to be a bit pessimistic about this) then don't bother working on the goal. You have already decided that it is not worth the effort. If you think you cannot succeed while you are still planning, you will have no hope when some unexpected challenge pops up.
It is also worth quickly thinking through an "if-then" reaction for the sticking points you have identified. For example, suppose you are trying to run every day but you know that some days you feel tired and find it difficult to start. In that case you could set the response to be "If I ever feel too tired to start running then I will put my running shoes on and leave the house, whether I actually run or not." The problem is likely to be self-control depletion rather than exhaustion; by getting to the point of actually running you have lowered the mental effort of starting.
2.3 Prevention or Promotion
The different ways you can think about a goal change where you will put effort into it. You can either have a goal where you must not fail (prevention) or a goal you want to conquer (promotion). The way you phrase your goal can change it from one to the other and give you more motivation when you need it. For example, let's say you go rock climbing for fun with some friends. If you are worried that you are holding the group back and so must train harder, your focus is prevention. If you want to beat everyone else, your focus is promotion.
If you have a prevention focus then you get more motivation when you are failing but less when you are doing well. Taking the example of rock climbing: if you had a really bad training session where you were actually worse than before, you would feel you had to train even harder, because you really cannot afford to fail. On the other hand, you will put in less effort when succeeding; after all, your goal is simply not to fail, and if you are doing great then you can afford to put in less effort.
If you have a promotion focus you get more motivation when you are doing well but are likely to give up when failing. For example, if you have had a great training session and managed to beat someone on a new climb, you will feel elated and want to do more. Yet if you do badly then you are likely to put in less effort, do even worse, and give up. If you were trying to win but it looks like you can't, why bother putting the effort in?
You can use this to your advantage. Think about whether you want more motivation when failing or when succeeding and frame your goal accordingly. For any task you can choose promotion or prevention, not both. If you decide you want a promotion goal and start to fail, you should notice it, feel bad about it, and make a conscious decision to put more effort in.
3. Summary
The subconscious gives you a motivation value for everything you could do. You can change the motivation by working out why your subconscious is choosing distracting things and changing those factors. By spending a few minutes examining your goals you can increase the likelihood of achieving them. The three ways discussed were:
- "If-then" triggers,
- Mental contrasting,
- Changing the wording of the goal to give either a prevention or promotion focus.
Remember that even with this type of goal setting you are vulnerable to self-control depletion. Sometimes you have bad days, and on those days it will take a lot of mental effort, and a lot of thinking through why you wanted to achieve the goal, to actually succeed. The check-list below turns the ideas above into questions that improve the chances of succeeding.
I suggest keeping a written list of goals somewhere you will look at them regularly, for example on a bedside table where you can see them morning and night; it depends on your routine. There is no point in having these triggers if they are not regularly refreshed in your mind, and the best way to do that is to read them and act on them regularly.
3.1 Goal Check-list
- I want more motivation when [failing/succeeding], therefore my focus will be [prevention/promotion].
- It will be done when …
- I am doing this because …
- My sticking point will be …
- I want this to be done because …
- I risk failure when …
- The trigger is …
- The very first step is …
3.2 Distraction Check-list
- Notice you are becoming distracted.
- Examine each factor of the motivation equation in turn for both the distracting task and the task you want to perform.
- Think how to change the goal or your environment to change the factors in your favour.
- If you are regularly coming up against distractions for the same goal then re-evaluate the value and expectancy. It might be worth changing or dropping the goal.
PostScript
The sources are listed below (plus two years of small tweaks by myself):
- What Color Is Your Parachute? by Richard N. Bolles
- The Seven Habits of Highly Effective People by Stephen Covey
- Succeed: How We Can Reach Our Goals by Heidi Grant Halvorson
- The Procrastination Equation by Piers Steel (http://procrastinus.com/)
- http://gettingresults.com/wiki/How_To_-_Set_Goals_and_Achieve_Them
I wrote this article before reading the more recent work on how mindset changes whether people perform as if self-control was a resource that could be depleted. I plan to make a new article that contains this information as well as a lot more on mindset and emotional responses but I want to test it out on the London LW community first.
Although this is based on research, it diverges somewhat; it is more a record of what has worked well for me. If you want the research itself, read the papers referenced in the back of the books listed above.
This post is also available on my personal blog
Meetup : London Social Meetup (and AskMeAnything about the CFAR workshop)
Discussion article for the meetup : London Social Meetup (and AskMeAnything about the CFAR workshop)
LessWrong London are having another awesome meetup and we would love for you to come! This one will mostly be a social chat, but I have just returned from a very interesting week in San Francisco where I went to a CFAR workshop (rationalist training camp) and stayed at Leverage Research (a group of people living together and trying to improve the world). Because of that, I thought people might want to hear all the exciting things they told me.

The plan is to meet at The Shakespeare Inn, 200m from Holborn Underground, at 2pm on Sunday 27th. We will officially finish at 4pm but honestly people tend to enjoy it so much they want to stay much longer, and regularly do. We will have a sign with the LessWrong logo on it so you can find us easily.

If you have any questions, or are thinking of coming, feel free to email me (James) at kerspoon+lw@gmail.com. Otherwise, just turn up!

Hope to see you there, James

P.S. err on the side of turning up, we're friendly, and it's fun :)
An Introduction To Rationality
This article is an attempt to summarize basic material, and thus probably won't have anything new for the hard core posting crowd. It'd be interesting to know whether you think there's anything essential I missed, though.
Summary
We have a mental model of the world that we call our beliefs. This model does not always reflect reality very well, as our perceptions distort the information coming from our senses. The same is true of our desires, in that we do not accurately know why we desire something, and it may also be true of other parts of our subconscious. We, as conscious beings, can notice when there is a discrepancy and try to correct for it. This is known as epistemic rationality; seen this way, science is a formalised version of epistemic rationality. Instrumental rationality is the act of doing what we value, whatever that may be.
Both forms of rationality can be learnt, and the aim of this document is to convince you that learning them is both a good idea and worth your time.
Hill Walker's Analogy
Reality is the physical hills, streams, roads and landmarks (the territory) and our beliefs are the map we use to navigate them. We know that the map is not the same thing as the territory but we hope it is accurate enough to understand things about it all the same. Errors in the map can easily lead to us doing the wrong thing.

The Lens That Distorts

We see the world through a filter or lens called perception. This is necessary; there is too much information in the world to map fully but it is very important to quickly recognise certain things. For example it is useful to recognise that a lion is about to attack you.
To go back to our hill walker's analogy: we navigate more quickly with a hill walker's map than with an aerial photograph, because the map has been filtered to show only what walkers commonly need. Too much information can slow our judgement or confuse us. On the other hand, incorrect or missing information is just as bad. If, when out walking, we are told to turn left at the second stream, yet the hill has more streams than the map shows, we will get lost. The map was not adequate for the situation.
In the same way, our own maps can be wrong due to errors in our perceptual filter. Optical illusions are a nice example of this phenomenon: a distortion between map and territory.
An Example
Light from the sun bounces off our shoelaces, which are untied, and hits our eye (reality). These signals get perceived (the lens) as an untied shoelace, and thus we have the belief (map) that our shoelaces are untied. In this case the territory and map reflect the same thing, but the map contains a condensed version: the exact positions of the laces were not deemed important and so were filtered out by our perception.
Truth
"The sentence 'snow is white' is true if, and only if, snow is white." -- Alfred Tarski
What is being said here is that if, in reality, snow is white, then we should believe that 'snow is white'. In other words, we should try to make our beliefs match reality. Unfortunately we cannot directly tell whether snow is white in reality, but given enough evidence we should believe it to be true.
By default, our subconscious believes what it sees; it has to. You have no time to question your beliefs if a lion is coming towards you, and we could not have survived this long as a species if we questioned and tested everything. Yet there are times when we are reproducibly, predictably irrational. That is, we do not update our beliefs correctly based upon evidence.
Rationality
- Epistemic rationality: believing, and updating on evidence, so as to systematically improve the correspondence between your map and the territory. The art of obtaining beliefs that correspond to reality as closely as possible. This correspondence is commonly termed "truth" or "accuracy".
- Instrumental rationality: achieving your values. Not necessarily "your values" in the sense of being selfish values or unshared values: "your values" means anything you care about. The art of choosing actions that steer the future toward outcomes ranked higher in your preferences. Sometimes referred to as "winning".
These definitions are the two kinds of rationality used here. Epistemic rationality is making the map match the territory. An example of an epistemic rationality error would be believing a cloud of steam is a ghost. Instrumental rationality is about doing what you should based upon what you value. An instrumental rationality error would be working rather than going to your friend's birthday when your values say you should have gone.
Why Should We Be Rational?
- Curiosity – an innate desire to know things, e.g. "how does that work?"
- Pragmatism – we can better achieve what we want in the world if we understand it better, e.g. "I want to build an aeroplane; therefore I need to know about lift and aerodynamics", "I want some milk; I need to know whether to go to the fridge or store". If we are irrational then our beliefs may cause a plane crash, or a walk to the store when there is milk in the fridge.
- Morality – the belief that truth-seeking is important to society, and therefore that it is a moral requirement to be rational.
What Rationality Is And Is Not
Being rational does not mean thinking without emotion. Being rational and being logical are different things. It may be more rational for you to go along with something you believe to be incorrect if it fits with your values.
For example, in an argument with a friend it may be logical to stick to what you know is true, but rational to concede a point and help them, even if you think it is not in their best interest. It all depends on your values, and following your values is part of the definition of instrumental rationality.
If you say “the rational thing to do is X but the right thing is Y” then you are using a different definition for rationality than is intended here. The right thing is always the rational thing by definition.
Rationality also differs for different people. In other words the same action may be rational for one person and irrational for another. This could be due to:
- Different realities. For example, living in a country with deadly spiders gives you more reason to be afraid of them. Hence the belief that spiders are scary is only rational in certain countries.
- Different values. For example, two people may agree on how unlikely it is to win the lottery, but one may still value the prize enough to enter. Hence playing the lottery can be either rational or irrational depending on your values.
- Incorrect beliefs. If you believe that a light-bulb will work without electricity and do not have sufficient evidence to support the claim, then your belief is wrong. In fact, if you believe anything without sufficient evidence then you are wrong; but what constitutes sufficient evidence is down to your values, and hence is another valid reason for beliefs to differ.
When confronting someone who has different beliefs, the difference could be down to any of the points above, in which case it is worth trying to see the world through their lens to understand how their belief came about. You may still conclude that their belief is wrong. There is no problem with this; not all beliefs are based in rationality.
An important thing to consider in such discussions is whether you are actually arguing over different points or just using different definitions. For example, two people may argue about how many people live in New York because each is using a slightly different definition of New York. The same sort of thing happens with the old saying "if a tree falls in the woods and no one is around to hear it, does it make a sound?". People may answer this differently based upon their definition of "makes a sound", but their expected experiences are generally the same: that there are sound waves but no perception of sound.
Science and Rationality
Science is a system of acquiring knowledge based on scientific method, and the organized body of knowledge gained through such research. In other words it is the act of testing theories and gaining information from those tests. A theory proposes an explanation for an event, that's all it is. This theory may or may not be useful; a useful theory is one that is:
- Logically consistent,
- Usable as a tool for prediction.
If this sounds similar to rationality, you are correct. Science attempts to obtain a true map of reality using a specific set of techniques. It is, in essence, a more formalised version of rationality.
A theory is very strongly linked to a belief. Indeed, both should have the two traits listed above: to be logically consistent and to be a predictor of events. Just like a theory, a belief proposes an explanation for events.
The Lens That Sees Its Flaws
We, as conscious beings, have the ability to correct the distortion to our perceptual filter. We may not be able to see through an optical illusion (our lens will always be flawed) but we can choose to believe that it is an illusion based upon other evidence and our own thoughts.
In the image on the left below (the Müller-Lyer illusion) we may see the lines as different lengths, but through other evidence choose not to believe what our senses are telling us. The image on the right (the Munker illusion) is even harder to believe: the red- and purple-looking colours in the top part are actually the same, as are the green- and turquoise-looking ones in the bottom part. I hope you check this for yourself. Even after you convince yourself it is an illusion, it will still be very hard to see them as the same colour. We may never be able to fix the flaws in our perception, but by being aware of them we can reduce the mistakes we make because of them.

Our Modular Brain
We don't know what many of our desires are, we don't know where they come from, and we can be wrong about our own motivations. For example, we may think the desire to give oral sex is for the pleasure of our partner. However our bodies create this desire as a test for health, fertility and infidelity. We consciously feel the desire and ascribe reason to it but it can be a different reason to the subconscious one.
Because of this we should treat the signals coming from our subconscious as distorted by yet another lens, one that hides much of what is behind it. We (speaking as the conscious part) notice our subconscious desires and try to infer why they have occurred, yet this understanding may be wrong. Again, these flaws in our lens can be corrected. You probably already temper the amount of sugar and fat you eat, even though you have a subconscious desire to eat more.
Another example: there are no seriously dangerous spiders in the UK, so there is no rational reason to be afraid of them, yet the fear seems to be a universal trait. On seeing a spider and feeling afraid, we can recognise that particular flaw in our subconscious reasoning and choose to act differently by not running away.
What should we do?
To be epistemically rational we must look for systematic flaws in the perceptual filters (lenses) between reality and our brain, as well as within different parts of the brain. We must then train ourselves to recognise and correct them when they occur. Simply talking or thinking about them is not enough; active training in recognising and dealing with such errors is needed.
To be instrumentally rational we must define our values, update them as needed, and learn how to achieve them. Simply having well-defined values is not enough; everything we do should work towards achieving them in some way. There are methods that can be learnt to help with achieving goals, and it would make sense to learn these, especially if you are prone to procrastination. The links below give a few examples on procrastination, self-help, and achieving goals.
- http://lesswrong.com/lw/3nn/scientific_selfhelp_the_state_of_our_knowledge/
- http://lesswrong.com/lw/2p5/humans_are_not_automatically_strategic
One of the most important concepts to grasp is how best to update our beliefs based upon what we experience (the evidence). Thomas Bayes formulated this mathematically, such that if we were behaving rationally we would expect our updates to follow Bayes' Theorem.
Research shows that, in some cases at least, people do not generally follow this model. Hence one of our first goals should be to think about evidence in a Bayesian way. See http://yudkowsky.net/rational/bayes for an intuitive explanation of Bayes' Theorem.
One important concept comes out of the theorem that I will briefly introduce here: evidence must update our existing beliefs, not replace them. If a test comes up positive for cancer, the probability that you have it depends on the accuracy of the test AND the prevalence of cancer in the general population. This is likely to seem strange unless you think of it in a Bayesian way.
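As a quick sketch of this idea, the cancer-test example can be worked through with Bayes' Theorem. The numbers below (1% prevalence, a test that catches 90% of real cases and falsely flags 9% of healthy people) are invented purely for illustration:

```python
# Hypothetical numbers: 1% of people have the cancer, the test catches 90%
# of real cases, and falsely flags 9% of healthy people.

def posterior(prior, sensitivity, false_positive_rate):
    """P(cancer | positive test), via Bayes' Theorem."""
    p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
    return (sensitivity * prior) / p_positive

p = posterior(prior=0.01, sensitivity=0.9, false_positive_rate=0.09)
print(round(p, 3))  # 0.092: even after a positive test, cancer is unlikely
```

Because the prior (the prevalence) is so low, most positive results come from the large pool of healthy people, which is exactly why the evidence must update the prior rather than replace it.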
Summary
- The physical world is reality.
- Inside our brains we have beliefs.
- Our beliefs are meant to mirror reality.
- A good belief, like a good scientific theory, is:
  - Logically consistent – fits in with every other good belief.
  - A predictor of events – helps you predict the future and explain past events.
- Our perception of the world can distort our beliefs.
- We can change how we perceive things through conscious thought.
- This can reduce the error between reality and our beliefs.
- This is called rationality.
Meetup : London Meetup - Achieving Better Goals
Discussion article for the meetup : London Meetup - Achieving Better Goals
LessWrong London are having another awesome meetup and we would love for you to come!
The plan is to meet at The Shakespeare Inn, 200m from Holborn Underground at 2pm on Sunday 4th August. We will officially finish at 4pm but honestly people tend to enjoy it so much they want to stay longer, and regularly do. We will have a sign with the LessWrong logo on it so you can find us easily.
This meetup is all about creating goals and turning them into actionable objectives. If you went to Rikk's Meetup then hopefully you already have a few goals in mind but don't worry if you didn't.
If you have any questions, or are thinking of coming, feel free to email me (James) at kerspoon+lw@gmail.com. Otherwise, just turn up!
Hope to see you there, James
P.S. err on the side of turning up, we're friendly, and it's fun :)
"Through rationality we shall become awesome, and invent and test systematic methods for making people awesome, and plot to optimize everything in sight, and have fun."
Meetup : London Special Guests: Jaan Tallinn and Michael Vassar of MetaMed
Discussion article for the meetup : London Special Guests: Jaan Tallinn and Michael Vassar of MetaMed
We have three special guests! Jaan, Michael, and Cat (see below) are flying in to the UK and will be joining us for a meetup. As usual, it is simply a few people with common interests chatting for a few hours. Anyone can come along; don't feel like you need to have read the sequences. It's a fun way to spend a few hours, and it's also nice to have people to bounce ideas off - we are a friendly bunch.

Hope to see you there,

For more information see our google group (link below) or message me (kerspoon)

Guests:
- Jaan Tallinn, who participated in the development of Skype and Kazaa.
- Michael Vassar, the former president of the Singularity Institute and current Chief Science Officer of MetaMed.
- Cat Lavigne, CFAR instructor.

See:
https://groups.google.com/forum/?fromgroups=#!topic/lesswronglondon/_h6lGBjnO9Q
http://lesswrong.com/r/discussion/lw/had/michael_vassar_in_europe/
http://lesswrong.com/r/discussion/lw/h95/want_to_have_a_cfar_instructor_visit_your_lw_group/
I believe it's doublethink
This is my attempt to provide examples and a summarised view of the posts on "Against Doublethink" on the page How To Actually Change Your Mind.
What You Should Believe
Let's assume I am sitting down with my friend John and we each have incomplete and potentially inaccurate maps of a local mountain. When John says "My map has a bridge at grid reference 234567", I should add a note to my map saying "John's map has a bridge at grid reference 234567", *not* actually add the bridge to my map.
The same is true of beliefs. If Sarah tells me "the sky is green" I should, assuming she is not lying, add to my set of beliefs "Sarah believes the sky is green". What happens too often is that we directly add "The sky is green" to our beliefs. It is an overactive optimisation that works in most cases but causes occasional problems.
Taking the analogy a step further we can decide to question John about why he has drawn the bridge on his map. Then, depending on the reason, we can choose to draw the bridge on our map or not.
We can give our beliefs the same treatment. Upon asking Sarah why she believes the sky is green, if she said "someone told me" and couldn't provide further information, I wouldn't choose to believe it. If, however, she said "I have seen it for myself", then I may choose to believe it, depending on my priors.
I Believe You Believe
The curious case is when someone says "I believe X". This can be meant a few ways:
- I have low confidence in this belief. e.g. "I believe that my friend Bob's eyes are hazel, but I'm not sure".
- I have this belief but have reasons to think you won't share it. e.g. "I believe she is attractive".
- I have the fact 'I believe the sky is green' in my mental model of the world. e.g. "I believe god exists."
The first case I do not have a problem with. It means your probability density has not yet shown a clear winner but you are giving me the answer that is in the lead at the moment. In this case I should add a note saying "John believes there is a bridge here, but he is not very confident in the belief".
I don't have a problem with the second case either. I can have the belief "Angelina Jolie is attractive", someone else can lack that belief, and we can both be rational. This is because we are using different criteria for "attractive". Restated with a consistent definition, there is no problem: the phrase "Angelina Jolie is regularly voted among the top 100 most attractive people in the world" doesn't require the prefix 'I believe...'.
The last case is even more curious. Let's assume that John (from our first example) says "I believe there is a bridge at grid reference 234567" but means it in the third sense. I should add a note to my map saying "John has the following note on his map: 'I believe there is a bridge at grid reference 234567'". You would hope that the reason he has that note is that there is actually a bridge on his map. Unfortunately people are not that rational. You can have a cached belief that says "I believe X" even if you do not have "X" as a belief. By querying why they have that belief you should be able to work out whether you should believe it, or even whether they should.
To use the example from religion: you can have the belief "I believe god exists" even if you do not have the belief "god exists".
Recommendations
I'm going to put myself on the line and give some recommendations:
- When we are told or recite a fact, try to remember why it was added to our beliefs. The reason will often turn out to be poor.
- When telling others facts, tell them the reason you believe it, e.g. say "I think there is a bridge here because I overheard someone talking about it". This should help you weed out cached beliefs in yourself and give the other person a better basis for adding to their own beliefs.
- When being told something, ask why they hold the belief. It also helps if you recite it back to them as if you are trying to understand, for example: "I see. You think there is a bridge here. Why do you think that?".
- When we hear "I believe" or "I think" try to classify the statement as one of the three options above.
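The bookkeeping these recommendations describe - noting who asserts a claim and why, rather than adopting the claim directly - can be sketched as a tiny belief store. This is purely illustrative: the `BeliefStore` class, its method names, and the "first-hand" rule are my own invented example, not anything prescribed by the post.

```python
# A minimal sketch of keeping provenance alongside claims: record *who*
# asserted a fact and *why*, instead of copying the fact straight onto
# your own "map". All names here are illustrative assumptions.

class BeliefStore:
    def __init__(self):
        self.notes = []  # list of (claim, source, reason) triples

    def record(self, claim, source, reason=None):
        """Note that `source` asserts `claim`, optionally with their reason."""
        self.notes.append((claim, source, reason))

    def should_adopt(self, claim):
        """Toy rule: adopt a claim only if some source saw it first-hand."""
        return any(c == claim and r == "first-hand"
                   for c, _s, r in self.notes)

store = BeliefStore()
store.record("bridge at 234567", "John", reason="someone told me")
store.record("sky is green", "Sarah", reason="first-hand")

print(store.should_adopt("bridge at 234567"))  # hearsay only -> False
print(store.should_adopt("sky is green"))      # seen first-hand -> True
```

In practice the "rule" for adopting a belief would be a judgement call about the quality of the reason, as the post argues; the point of the sketch is only that the source and reason are stored, not merged into your own beliefs.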