tkocian comments on Open Thread: September 2011 - LessWrong
I keep running into problems with various versions of what I internally refer to as the "placebo paradox", and can't find a solution that doesn't lead to Regret Of Rationality. Simple example follows:
You have an illness from which you'll either recover or die. Due to the placebo effect/positive thinking, your actual probability of recovery is exactly half of what you estimate it to be. Before learning this, you have 80% confidence in your recovery. Since you estimate 80%, your actual chance is 40%, so you update to that. Since your estimate is now 40%, the actual chance is 20%, so you update again. Then it's 10%, and so on, until both your estimated and actual chances of recovery are 0. Then you die.
An irrational agent, on the other hand, upon learning this could self-delude into 100% certainty of recovery, and thereby have a 50% chance of actually recovering.
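The update spiral described above can be sketched as a short simulation. This is only an illustration of the stated dynamic (actual chance = half the current estimate); the function names and round counts are my own, not anything from the discussion.

```python
def honest_updates(estimate, rounds):
    """Simulate an honest agent whose actual recovery probability
    is always half of its current estimate, and who keeps updating
    its estimate to match the actual probability."""
    history = [estimate]
    for _ in range(rounds):
        actual = estimate / 2   # the world halves whatever you believe
        estimate = actual       # the honest agent updates to the truth
        history.append(estimate)
    return history

# 80% -> 40% -> 20% -> 10% -> ...: a geometric decay toward 0,
# the only fixed point of p = p/2.
print(honest_updates(0.8, 5))

# The self-deluding agent instead pins its estimate at 1.0,
# keeping an actual recovery chance of 0.5 forever.
deluded_actual = 1.0 / 2
print(deluded_actual)
```

The honest agent's only stable belief is 0, while the deluded agent does strictly better, which is exactly the "regret of rationality" being complained about.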
This is actually causing me real-world problems, such as an inability to use techniques based on positive thinking, and a lot of cognitive dissonance.
Another version of this problem features in HP:MoR, in the scene where Harry is trying to influence the behaviour of the Dementors.
And to show this isn't JUST a quirk of human mind design, one can envision Omega setting up an isomorphic problem for any kind of AI.
Can you see what an absurdly implausible scenario you must use as a ladder to demonstrate rationality as a liability, rather than as a strike against strict adherence to reality? The fact that we have to stretch so hard to paint it this way further legitimizes the pursuit of rationality.
Except I happen to, as far as I can tell, be in that "implausible" scenario IRL, or at least an isomorphic one.
I mean no disrespect for your situation, whatever it may be. I gave this some additional thought. You are saying that you have an illness in which the rate of recovery is increased by fifty percent due to a positive outlook and the placebo effect this mindset produces, or that an embrace of the facts of your condition leads to an exponential decline at a rate of fifty percent. Is it depression, or some other form of mental illness? If it is, then the cause of death would likely be suicide. I am forced to speculate because you were purposefully vague.
For the sake of argument, I will go with my speculative scenario. It is very common for those with bipolar disorder and clinical depression to fall into a self-reinforcing loop of negative thinking which worsens their situation in the way you have highlighted. But it wouldn't carry those exact percentages of decline (indeed, no illness would decline at that exact rate based merely on the thoughts in the patient's head). Given your claim that the illness declines exponentially, wouldn't the solution be knowledge of this reality? It seems the delusion lies in accepting that an illness can be treated with positive thinking alone. The illness is made worse by an acceptance not of rationality, but of this unsupported claim, which by my understanding is irrational.
I am very skeptical of your scenario, merely because I do not know of any illness which carries this level of health decline due to the absence of a placebo. If you have one, please tell me what it is, as I would like to begin researching it now.
It's probably not depression or bipolar disorder, but for the purposes of this discussion the difference is likely irrelevant.
I never claimed the 50% figure was anything other than a gross simplification to make the math easier. Obviously it's much more complicated than that, with other factors, less extreme numbers, and so on, but the end result is still isomorphic to it. Maybe the decline is even polynomial rather than exponential, but it's still a huge problem.
Can you actually describe the scenario you really are in? I can think of ways I'd address a lot of real-world analogues, but none of them are actually isomorphic to the example you gave. The solutions generally rely on the lack of a true isomorphism, too.
I'd rather not, due to it being extremely personal and embarrassing as well as a huge weak spot.