M. Scott Peck, in _The Road Less Traveled_, says something like, "Mental health is a commitment to reality at any cost."
Gendlin's take on it is spot on too.
I'm writing an article on that for my emotionsforengineers blog on Blogspot.
I will definitely link to this site as well.
"Mental health is a commitment to reality at any cost." Depression is considered a mental illness. The depressed are less biased in their self-assessments than the population as a whole. Personally, I agree with Caplan and Szasz that "mental illness" is a poor borrowing from medicine to psychiatry and is usually unfalsifiable.
The depressed are less biased in their self-assessments than the population as a whole.
Can that be true? What about their assessments of the rest of the world outside themselves? My experience with depressed people runs very counter to that. Do you have any references to that?
Depressive realism is an incredibly, well, depressing fact about the world.
Is there something we're missing about it though? Is the world actually such that understanding it better makes you sad, or is it rather that for whatever reason sad people happen to be better at understanding the world?
And if it is in fact that understanding makes you sad... what does this mean for rationality?
Depressive realism is an incredibly, well, depressing fact about the world.
It's not that depressing. If it was lack of bias that caused the depression, that would be bad, but I'm pretty certain it's the other way around.
So you're saying you think that while maybe typically happy people are more irrational, it's still possible to be rational and happy.
I guess I agree with that. But sometimes I feel like I may just hope this is true, and not actually have good evidence for it.
I'm saying that the truth is not so horrifying that it will cause you to go into depression. If the only way to become rational involves depression, this just means that becoming rational sucks. It doesn't mean that the world sucks.
I'm saying that the truth is not so horrifying that it will cause you to go into depression.
This is what I hope and desire to be true. But what I'm asking for here is evidence that this is the case, to counteract the evidence from depressive realism that would seem to say that no, actually the world is so terrible that depression is the only rational response.
What reason do we have to think that the world doesn't suck?
We have lived this far. Our forefathers lived here, successfully satisfying their wishes. Our children will also live here. That is the evidence, the reason, and the inspiration to face a world that sucks and make it more comfortable.
I'm pretty rational and I chose to become happy, and now I feel happy most of the time. I'm continuously choosing to be happy.
Idk if that counts as valid evidence for you (or if you even care after 10 years, lol); you'd have to believe me that I'm rational and that I'm actually happy, but there you go :D
Once in 6th grade, my teacher read us a story about a man who chose to be happy. I was like "holy shit you can do that?" and then I was happy for like the next 7 years.
Then life became difficult in various ways and I haven't been as happy since. I can still locally choose to be happy on the timescale of hours, but it doesn't feel sustainable.
How exactly did life become difficult for you?
I have local lows when I encounter difficulties, but those pass pretty quickly as I approach or solve those difficulties.
Do you have any references to that? Yes. They do underestimate the probability that their depression will end, however (I'll see if I can find the link to where I read that; it was likely another GMU blogger). I don't know about other cognitive biases in the depressed.
It looks like there's still some serious controversy on the issue.
But suppose for a moment that it's true: Suppose that depressed people really do have more accurate beliefs, and that this really is related to their depression.
What does this mean for rationality? Is it more rational to be delusional and happy or to be accurate and sad? Or can we show that even in light of this data there is a third option, to actually be accurate and happy?
If you're an egoist, it's best to be delusional and happy. If you're not, the needs of others outweigh your own. Of course, even if depressed people are more accurate, that doesn't mean that they're more productive. Then again, they may be able to use their more accurate beliefs to find a better charity and make up the difference. Of course, you could just have a depressed philanthropist tell you where to donate.
Depressive people are definitely accurate about certain things. But they have lost their hope. They are not seeing the whole picture. We must live, and we must help others to live.
It seems to me - and I'm a depressive - that even if depressed people really do have more accurate self-assessment, your third option is still the most likely.
One recurrent theme on this site is that humans are prone to indulge cognitive biases which _make them happy_. We try to avoid the immediate hedonic penalty of admitting errors, foreseeing mistakes, and so on. We judge by the availability heuristic, not by probability, when we imagine a happy result like winning the lottery.
When I'm in a depressed state, I literally _can't_ imagine a happy result. I imagine that all my plans will fail and striving will be useless.
This is still not a rational state of mind. It's not _inherently_ more accurate. But it's a state of mind that's inherently more resistant to certain specific errors - such as over-optimistic probability assessment or the planning fallacy.
These errors of optimism are common, especially in self-assessment. Which might well be the reason depressed people make more accurate self-assessments - humans as a whole have a cognitive bias to personal overconfidence.
But it's also inherently more resistant to optimistic conclusions, _even when they're backed by the evidence_.
(It's more rational to be accurate and sad than delusional and happy - because happiness based on delusion frequently crashes into real-world disasters, whereas if you're accurate and sad you can _use_ the accuracy to reduce the things you're sad about.)
This quote sounds nice and is useful if taken metaphorically, but it is technically less universally applicable than it appears, because there is a map/territory confusion. Surely there exist hypothetical agents in certain environments who, by more accurately modeling themselves, directly cause their own destruction. In other words, yes, we are already living with reality, but no, we are NOT all already "living with living with reality". Accepting the quote uncritically, one might feel a false sense of assurance that this plan cannot fail. Rob Zahra
What is true is already so. Owning up to it doesn't make it worse.
That's just not true, for some social environments. If you and your friends all believe X, and believing X identifies people as being members of that group, then discovering that X is false and owning up to it might make you lose a lot of friends. Depending on what you need those friends for, that might be a serious problem.
Other alternatives are:
- Tell yourself the truth and lie to your friends if needed. Many people find it difficult to lie consistently for a long time; I don't think I can.
- Find friends you don't need to lie to.
- Take the lead and try to bring your existing friends with you as you change your mind.
There's always the default option, which is to deceive yourself.
Maybe you're lucky and X isn't really a membership-belief to start with, and these friends are already friends you don't need to lie to.
You're conflating something here. The statement only refers to "what is true", not your situation; each pronoun refers only to "what is true":
What is true is already true. Owning up to the truth doesn't make the truth any worse.
The statement only refers to "what is true", not your situation; each pronoun refers only to "what is true":
"What is true" includes everything about my situation. Whether the truth is better or worse is a statement about my judgment of goodness of the truth, which surely includes my judgment of my situation. Whether I own up to the truth has immediate consequences on my situation, unless I can cheaply suppress behavior change deriving from that knowledge.
On the face of it, what you're saying seems to be obviously false. It's more likely that I misunderstand you somehow, but I can't imagine how right now.
"What is true" does not refer to the entire universe. In "owning up to it doesn't make it worse", it refers to the specific thing "what is true" that you are trying to change your mind about. "Owning up to P doesn't make P worse", because your state of mind is not causally connected to P. In the specific example of finding that X is false:
"X is false" is already true. Owning up to it doesn't make "X is false" worse.
Clearly whatever bad things are brought about by a state of affairs where X is false are already occurring -- because it is in fact false! Changing your mind about X should have no effect on these affairs. Your social situation, on the other hand, is a completely different thing.
The Litany of Gendlin is meant to neutralize fears like "but if god didn't exist, that would be terrible!" resulting in clinging to faith in a god, or "I can't be ill, that's too bad to imagine!" resulting in not going to the doctor.
Makes sense, if the universe can be chopped up that way. If "what is true" overlaps enough with your social situation and you aren't good at lying, it might not make sense. I suppose the Litany of Gendlin was not meant to be universally applicable.
Take an example: coming out to a homophobic friend. Now, I'm gay - due to conditioning, I may feel bad about "I am straight" being false. Owning up to being gay won't make ""I am straight" is false" any worse, cause it's already true. This is the limit of the Litany of Gendlin, because my homophobic friend doesn't know I'm gay. So "X thinks I am straight" is true, not false, and owning up to it WILL make it worse, because it changes my friend's belief from true to false (and then they will act upon that belief).
Acknowledging the truth of ""I am straight" is false" doesn't make anything worse.
Acknowledging the truth of "X thinks I'm straight" doesn't make anything worse.
Telling X that you're gay could make things worse for you, but that's not the type of thing that the Litany of Gendlin applies to: It's taking an action, not acknowledging a truth.
(I think that's what you meant, but your wording seems to have gotten confused toward the end if so.)
(I think that's what you meant, but your wording seems to have gotten confused toward the end if so.)
Indeed it did.
For example:
Say I live in a bad neighborhood, but I'm kind of clueless and don't really want to believe it. I hear gunshots sometimes, but rationalize that it must just be cars backfiring. I hear my neighbors fighting, but tell myself it must be a TV program that someone has on really loud. I see people hanging around outside, selling who-knows-what, but tell myself that it must just be the local culture, and it's not my place to say that other people can't spend time outside, that's just silly.
The probability of the police breaking my door down because someone taking anonymous tips about drug activity misheard an apartment number is not any better in that situation than in the one where I admit to what's going on; my beliefs don't change the police's behavior. And in the situation where I acknowledge what's going on, I can do something about it, like finding somewhere else to live.
Acknowledging it is less comfortable - being afraid of one's neighbors is not fun, and the first situation avoids that - but feeling less fear doesn't mean there's actually less danger.
I can see the objection there, however, partly because I sort of have this issue. I've never been attacked, or mugged, or generally made to feel genuinely unsafe; those few incidents that have unsettled me have affected me far less than the social pressure I've felt to feel unsafe: people telling me "are you sure you want to walk home alone?" or "don't forget to lock the door at all times!"
I fight against that social pressure. I don't WANT the limitations and stress that come with being afraid, and the lower opinion it implies I should have of the world around me. I value my lack of fear quite highly, overall.
That said, is it really to my advantage to have a false sense of security? Obviously not. I don't want to be assaulted or hurt or robbed. If the world really is a dangerous place, there is no virtue in pretending it isn't.
What I should do is work to separate my knowledge from my actions. If I really want to go home alone, I can do this without fooling myself about how risk-free it is; I can choose instead to value the additional freedom I get from going over the additional safety I'd get from not going. And if I find I don't value my freedom that highly after all, then I should change my behaviour with no regrets. And if I'm afraid that thinking my neighbourhood is unsafe will lead me to be a meaner person overall, well, I don't have to let it. If being a kind person is worth doing at all, it's worth doing in a dangerous world.
(this has the additional advantage that if I do this correctly, actually getting mugged might not change my behaviour as radically as it would if I were doing all that stuff out of a false sense of security)
Of course, the truth is that it isn't that simple: our brains being what they are, we cannot completely control the way we are shaped by our beliefs. As earlier commenters have pointed out, admitting to yourself that you're gay won't affect the fact that you are gay, and it doesn't imply you should worsen your situation by telling your homophobic friends; but our brains happen to be not that good at living a sustained lie, so in practice it probably will force you to change your behaviour.
Still, I don't think this makes the litany useless. I think that it is possible, when we analyse our beliefs, to figure out not only how true they are but also the extent to which changing them would really force us to change our behaviour. It probably won't lead to a situation where we choose to adopt a false belief (the concept strikes me as rather contradictory), but at the end of the exercise we'd know better which behaviours we really value, and we might figure out ways to hold on to them even as our beliefs change.
Acknowledging it is less comfortable
Some people might view "less comfortable" as worse.
You're conflating something here. The statement only refers to "what is true", not your situation; each pronoun refers only to "what is true"
In that case saying "Owning up to the truth doesn't make the truth any worse" is correct, but doesn't settle the issue at hand as much as people tend to think it does. We don't just care about whether someone owning up to the truth makes the truth itself worse, which it obviously doesn't. We also care about whether it makes their or other people's situation worse, which it sometimes does.
I did some background reading on this quote recently. It's funny how it means something rather more restricted in its original context. Gendlin was writing about honestly sharing one's thoughts and feelings about another person with them rather than telling them what one thinks they want to hear or what one wishes were true of one's own feelings. That's what the "owning up", "not being open", and "already enduring it" refer to. But now that it's been pulled out of context, it seems to say quite a lot more.
From information security analyst Joshua Goller, the "Tarski-Gendlin Litany of Information Security":
What is vulnerable to exploitation is already so;
facing it doesn't make it worse,
and ignoring it doesn't make it secure.
Because it's vulnerable, it is what is there to be attacked,
and anything not found to be vulnerable isn't there to be hacked (yet).
The developers can stand to know what is exploitable,
for they unknowingly wrote it to be exploited,
and the users can stand to know what is insecure,
for they are currently using it insecurely.
If the software contains an exploitable bug,
I desire to believe that the software contains an exploitable bug.
If the software does not contain an exploitable bug,
I desire to believe that I haven't found any exploitable bugs in it yet,
And I had better keep looking.
Let me not become attached to beliefs I may not want.
If the software does not contain an exploitable bug,
I desire to believe that I haven't found any exploitable bugs in it yet,
And I had better keep looking.
This part seems like a mistake. If this component actually does not contain an exploitable bug, my time would be better spent looking for exploitable bugs in other components. Otherwise I can never audit the whole code base.
Yes. Also, if there is no exploitable bug, then I would want to believe that, not merely that I haven't found any yet.
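To put a toy number on that: here's a minimal sketch (in Python, with made-up figures) of how "audited for a while and found nothing" should push belief toward "there is no exploitable bug", under the assumption that each audit-hour has some fixed chance of finding a bug if one exists. The prior and the per-hour detection rate below are illustrative assumptions, not real data.

```python
# Toy Bayesian update: how "searched and found nothing" should move belief.
# All numbers below are illustrative assumptions, not measurements.

def posterior_after_fruitless_audit(prior: float, p_find_per_hour: float, hours: int) -> float:
    """P(bug exists | no bug found after `hours` of auditing)."""
    p_no_find_given_bug = (1 - p_find_per_hour) ** hours
    # If there is no bug, you never find one, so that branch contributes (1 - prior) in full.
    p_no_find = prior * p_no_find_given_bug + (1 - prior)
    return prior * p_no_find_given_bug / p_no_find

prior_bug = 0.30   # assumed prior probability that the component has an exploitable bug
p_find = 0.05      # assumed chance that one audit-hour finds the bug, given that it exists

for hours in (0, 10, 50, 200):
    p = posterior_after_fruitless_audit(prior_bug, p_find, hours)
    print(f"{hours:4d} fruitless audit-hours -> P(bug) = {p:.3f}")
```

Under those assumptions, after enough fruitless searching, "I haven't found any exploitable bugs yet" and "there probably isn't one" converge anyway, which is also the argument for moving your audit time to another component.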
I think this could be considered one of the very basics of rational thinking. Like, if someone asked what rationality/being rational means and wanted a short answer, this Litany would be a pretty good summary.
A simpler way to put this is "Reality does not change whether you believe in it or not, but it affects you regardless."
Edit: LW actually made a song/album out of this one!