
The Value of Uncertainty Reduction - references to academic literature appreciated

2 MichaelBishop 29 April 2014 01:51PM

My intuition (probably widely shared) suggests that uncertainty about the future is stress-inducing, and that reducing uncertainty about the future is helpful because it allows us to plan. So I started trying to invent thought experiments that would begin to help me quantify how much I (and others) value uncertainty reduction... and then I began to get confused. Below I'll share two examples focused on knowledge about the next five years of one's career, but similar psychological/philosophical issues would arise in many other contexts.

Thought Experiment #1: the risk of job loss
Imagine the true probability that you involuntarily lose your job at some point in the next 5 years is either 0% or 50%, and that your current best guess is that you have a 25% chance of losing your job. For a price, an oracle will tell you whether the truth is 0% or 50%. How much will you pay?

If you think you can answer this question, please do so. Part of my confusion is that knowing the probability I will lose my job seems certain to affect the probability that I lose my job. If you told me the probability was 50%, then I'd do a combination of working overtime and looking for other jobs that should reduce that probability; and if the probability nonetheless remains 50%, then I'm in a much less pleasant situation than I would be if the probability were 50% but I went on assuming it's a more reasonable 25%.
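
To make the instrumental side of the question concrete, here is a toy value-of-information calculation in Python. Every number in it (the utilities, the cost of extra effort, how much that effort would cut a genuine 50% risk) is made up purely for illustration, and the model deliberately ignores the stress/psychological value of certainty that I'm also asking about; it only captures the planning value of the oracle's answer.

```python
# Toy value-of-information calculation for Thought Experiment #1.
# All utilities, costs, and probabilities below are made-up illustrative numbers.

U_KEEP_JOB = 100.0       # utility of keeping the job over the next 5 years
U_LOSE_JOB = 40.0        # utility of involuntarily losing it at some point

EFFORT_COST = 5.0        # utility cost of overtime plus a job search
RISK_WITH_EFFORT = 0.30  # what that effort turns a genuine 50% risk into


def expected_utility(p_loss, effort):
    """Expected utility given the true loss probability and an effort choice."""
    p = RISK_WITH_EFFORT if (effort and p_loss > 0) else p_loss
    eu = p * U_LOSE_JOB + (1 - p) * U_KEEP_JOB
    return eu - (EFFORT_COST if effort else 0.0)


def best_utility(p_loss):
    """Utility if I knew the true probability and chose effort optimally."""
    return max(expected_utility(p_loss, effort) for effort in (False, True))


def utility_without_oracle():
    """I believe p = 0.25 (an even mix of 0% and 50%) and commit to one effort choice."""
    def eu_given_choice(effort):
        return 0.5 * expected_utility(0.0, effort) + 0.5 * expected_utility(0.5, effort)
    return max(eu_given_choice(False), eu_given_choice(True))


# With the oracle I learn the truth (0% or 50%) first, then decide about effort.
utility_with_oracle = 0.5 * best_utility(0.0) + 0.5 * best_utility(0.5)

value_of_information = utility_with_oracle - utility_without_oracle()
print(f"Value of the oracle's answer: {value_of_information:.2f} utility units")
```

Under these made-up numbers the oracle's answer is worth 2.5 utility units, entirely because it tells me whether the costly effort is worth making; with different numbers the value could easily be zero.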

Thought Experiment #2: uncertainty about future earnings
Imagine your estimate of your total income over the next 10 years is unbiased, and that the random error in your estimate is normally distributed. (Admittedly, a normally distributed error term is unrealistic in this problem, but bear with me for simplicity.) What's a reasonable standard deviation? Let's say 3 years' worth of income. How much would you pay to reduce that standard deviation to 1.5 years of income?

Once again, go ahead and answer this if you can, but I've got myself confused here as well... I'm trying to get at the present value of reducing uncertainty about the future, but in this example it appears I'm getting an offer to reduce the actual risk of *experiencing* a much lower than expected income, at the expense of reducing the chances that I make a much higher than expected income, not just an offer to reduce uncertainty.
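
One way to see why this feels more like insurance than information is to price the variance reduction with a risk-averse utility function. The sketch below assumes CARA (exponential) utility, under which a normally distributed income has the closed-form certainty equivalent CE = mu - (a/2) * sigma^2; the risk-aversion coefficient is an arbitrary illustrative choice, and the calculation says nothing about any value of learning the outcome sooner, which is the thing I'm actually trying to isolate.

```python
# Toy pricing of the variance reduction in Thought Experiment #2.
# Assumes CARA (exponential) utility u(x) = -exp(-a * x), under which a
# Normal(mu, sigma^2) income has certainty equivalent  CE = mu - (a / 2) * sigma^2.
# The risk-aversion coefficient below is an arbitrary illustrative choice.

MEAN_INCOME = 10.0   # expected 10-year income, measured in years of income
SIGMA_HIGH = 3.0     # original standard deviation (3 years of income)
SIGMA_LOW = 1.5      # reduced standard deviation (1.5 years of income)
RISK_AVERSION = 0.2  # CARA coefficient "a" (made up for illustration)


def certainty_equivalent(mu, sigma, a):
    """Certainty equivalent of Normal(mu, sigma^2) income under CARA utility."""
    return mu - 0.5 * a * sigma ** 2


ce_high = certainty_equivalent(MEAN_INCOME, SIGMA_HIGH, RISK_AVERSION)
ce_low = certainty_equivalent(MEAN_INCOME, SIGMA_LOW, RISK_AVERSION)

# Under CARA the willingness to pay is just the difference in certainty equivalents.
willingness_to_pay = ce_low - ce_high
print(f"CE with sigma = 3.0 years: {ce_high:.2f} years of income")
print(f"CE with sigma = 1.5 years: {ce_low:.2f} years of income")
print(f"Willingness to pay:        {willingness_to_pay:.2f} years of income")
```

With these illustrative numbers I'd pay roughly 0.68 years of income for the variance reduction, but that is a payment for reduced risk to the outcome itself, not for reduced uncertainty per se.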

Any insight into what's going on with my thought experiments would be greatly appreciated. I see some parallels between them and Newcomb's Paradox, but I'm not sure what to make of Newcomb's Paradox either. If people have relevant references to the philosophy literature, that's great... relevant references to the judgment and decision-making or economics literature would be even better.

Seeking examples of people smarter than me who got hung up

10 MichaelBishop 13 January 2013 04:40PM

I'm looking for historical examples of scientists who
a) were very intelligent and yet
b) continued to put themselves behind a theory in their discipline long after it was rejected. Maybe they got too attached to it, refused to be wrong, or got emotional; somehow they let their hang-ups get in the way.

Maybe I'm being too demanding, but resist if you can: give me fewer cranks, less pseudoscience, and fewer weird sociopolitical commitments, and more theories that were credible until they became incredible to all but their big, fancy, until-then-respected proponent.

To get you started: 
* Fred Hoyle against the Big Bang
* Lord Kelvin and Hoyle on microbes from space
* Tesla against relativity and other chunks of modern physics.
* Heaviside against relativity
* George Gaylord Simpson against plate tectonics
* Newton on alchemy

More?

Posted on behalf of a friend. Thanks.

Test your forecasting ability, contribute to the science of human judgment

3 MichaelBishop 05 May 2012 03:07PM

As XFrequentist mentioned last August, a forecasting tournament is being run by the Intelligence Advanced Research Projects Activity (IARPA) with the goal of improving forecasting methods for global events of national (US) interest. One of the teams (The Good Judgement Team) is recruiting volunteers to have their forecasts tracked. Volunteers will receive an annual honorarium ($150), and it appears there will be ongoing training to improve one's forecast accuracy (not sure exactly what form this will take).

You can pre-register here.

Last year, approximately 2400 forecasters were assigned to one of eight experimental conditions.  I was the #1 forecaster in my condition.  It was fun, and I learned a lot, and eventually they are going to give me a public link so that I can brag about this until the end of time.  I'm participating again this year, though I plan to regress towards the mean.

I'll share the same info XFrequentist did last year below the fold because I think it's all still relevant.


Software for Critical Thinking, Prof. Geoff Cumming

3 MichaelBishop 30 March 2011 05:13PM

Prof. Geoff Cumming has done some interesting work.  Of particular relevance to the LW community, he has studied software for enhancing critical thinking.


My past research: I worked on Computer tools for enhancing critical thinking, with Tim van Gelder. We studied argument mapping, and Tim’s wonderful Reason!Able software for critical thinking. This has proved very effective in university and school classrooms as the basis for effective enhancement of critical thinking. In an ARC-funded project we evaluated the software and Tim’s related educational materials. We found evidence that a one semester critical thinking course, based on Reason!Able, gives a very substantial increase—considerably greater than reported in previous evaluations of critical thinking courses—in performance on standardised tests.

Tim’s software has been further developed by his company Austhink Software, and is now available commercially as Rationale and bCisive: both are fabulous! http://www.austhink.org/ http://bcisive.austhink.com/