Eitan_Zohar comments on I need a protocol for dangerous or disconcerting ideas. - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (154)
I have. It definitely isn't. It may have been exacerbated by biochemical causes, but it wasn't caused by them alone. (Sertraline did help me, just never as much as nullifying an existential problem.)
So you accept the argument?
I have no idea what you are trying to say, beyond "listen to your instincts because they are more suited for the real world than your intellect."
The fact that taking drugs for your mental issues doesn't nullify your concerns about existential problems in no way implies that your worries about those problems don't come as a result of mental health issues.
Sure, but I can say that I wouldn't be depressed at all if not for those existential problems. I mean, I would be depressed but in a general, background sort of way.
You can say that, and of course it seems true to you. It's just like how it feels true to the schizophrenic that the CIA is out to get him, and that his paranoia is due to the CIA trying to get him rather than due to the fact that he's a schizophrenic.
Psychological research in general suggests that people are quite good at finding ways to rationalize their emotions. There's a strong outside view suggesting that rationalizations are usually not the root cause.
I've considered it at various points over the last seven years. I think I've justified it properly.
The nature of outside views is that they are going to be wrong eventually.
Of course you do, as the pressures for internal mental consistency are very strong.
This isn't an argument, it's Descartes' demon.
Understanding mental biases and how our brains play tricks on us is a core part of LW. It hasn't much to do with logical argument but with modern psychological research.
It's no easy skill to notice when your emotions prevent you from clearly thinking about an issue.
Saying "The nature of outside views is that they are going to be wrong eventually" is also very peculiar. If I'm testing gravity by repeating scientific experiments in which I drop balls, I'm engaging in the outside view. Science is all about the outside view rather than subjective experience.
When one is subject to a mental illness that is generally known to make one think irrationally about an issue, it's useful not to trust one's reasoning and instead to seek help for the mental illness from trustworthy people. Bootstrapping trust isn't easy. There are valid reasons why you might not trust the average psychologist enough to take their judgement over your own.
The general approach is to find trustworthy in-person friends. For LW-type ideas, you find them at LW meetups. You likely don't want to pull all your information from people at a LW meetup, but if your LW friends say that you are irrational about an issue, your mainstream psychologist tells you that you are irrational about the issue, and other social contacts also tell you that you are irrational, then no matter how strongly it feels like you are right, you should assume that you aren't.
Well, I definitely know that my depression is causally tied to my existential pessimism. I just don't know if it's the only factor, or if fixing something else will stop it for good. But as I said, I don't necessarily want to default to ape mode.
Out of curiosity, how do you know that this is the direction of the causal link? The experiences you have mentioned in the thread seem to also be consistent with depression causing you to get hung up on existential pessimism.
I think he was trying to make a map-territory distinction. You have a mental model of how your brain computes value. You also have your brain, computing value however it actually computes value. Since our values are quite complex, and likely due to a number of different physical causes, it is reasonable to conclude that our mental model is at best an imperfect approximation.
I don't think he's trying to say "listen to your heart" so much as "the map is not the territory, but both are inside your brain in this instance. Because of this, it is possible to follow the territory directly, rather than following your imperfect map of the territory."
That said, we are now a couple meta-levels away from your original question. To bring things back around, I'd suggest that you try and keep in mind that any odd, extreme predictions your mental models make may be flaws in an oversimplified model, and not real existential disasters. In some cases, this may not seem to be the case given other pieces of evidence, but hopefully in other instances it helps.
The greater the inferential distance you have to go to reach an uncomfortable conclusion, the higher the likelihood that there is a subtle logical flaw somewhere, or (much more common) some unknown-unknown that isn't even being taken into account. LessWrong tends to deal with highly abstract concepts many steps removed from observations and scientifically validated truths, so I suspect that a large fraction of such ideas will be discredited by new evidence. Consider shifting your probability estimates for such things down by an order of magnitude or more, if you have not already done so. (That last paragraph was an extremely compressed form of what should be a much larger discussion. This hits on a lot of key points, though.)
That does sound like reasonable advice... however I now have empirical evidence for Dust Theory. Still, most of the horrible problems in it seem to have been defused.
What is your empirical evidence for dust theory?
Point 2: http://lesswrong.com/lw/mgd/the_consequences_of_dust_theory/ck0q
That doesn't even remotely meet the bar for 'evidence' from my standpoint. At best, you could say that it's a tack-on to the original idea to make it match reality better.
Put another way, it's not evidence that makes the idea more likely, it's an addition that increases the complexity yet still leaves you in a state where there are no observables to test or falsify anything.
In common terms, that's called a 'net loss'.
Why do we dream? Because a large number of conscious beings join the measure of beings who can. That's why we find ourselves as pre-singularity humans. I'd say that's empirical evidence.
Sorry, but evidence doesn't really work that way. Even if we allow it, it is exceptionally weak evidence, and not enough to distinguish 'dust theory' from any other of the countless ideas in that same category. Again, it looks to me like a tack-on to the original idea that is needed simply to make the idea compatible with existing evidence.
As for why we dream, it's actually because of particles, forces, and biochemistry. A mundane explanation for a mundane process. No group hive mind of spirit energy or "measure of beings" required.
Dreaming is a very specific process that seems optimized to the scenario I described with DT. Do these other ideas predict the same?
So you are saying that humans or humanlike minds are the most common type of consciousness that is mathematically possible?
"Dreaming is a very specific process that seems optimized to demonstrate the existence of a dream realm."
"Dreaming is a very specific process that seems optimized to recharge the Earth Spirit that is Mother Gaia."
"Dreaming is a very specific process by which Wyvren allows us to communicate with Legends."
I have literally no idea how you could possibly draw that conclusion from the statement that dreaming has a mundane physics-based explanation. The two things aren't even remotely related.
Sertraline has insomnia listed as a very common (>10%) side effect. If you're currently on it, this is a more parsimonious explanation for your difficulty sleeping than your philosophical beliefs about how sleeping interacts with subjective experience.
I'm not on it. I don't have difficulty falling asleep, it's just traumatizing to get in bed.