There is a pattern here, and part of it looks like this. You contemplate an idea X and it bothers you. You circulate your concerns among a number of people who are good at thinking and interested in ideas like X. None of them is bothered by it; none of them seems to see it the same way as you do. And, in every case, you conclude that all those people have failed to understand your idea.
Now, I think there are two kinds of explanation for this. First, we have (to put it crudely) the ones in which you are right and everyone else is wrong.
And then we have (to put it crudely, again) the ones in which...
Another armchair diagnosis here. Clearly, far-out ideas affect you more than they affect other, equally (or more) intelligent people. This is almost certainly a flaw in how your brain functions, not an indication of the problem's severity. If it were, some of those smart people you contacted would be considering it seriously. If you concede that this is a problem with your brain, then you should consult the experts on fixing brains, not on fixing the future.
I think what you're doing is something that in psychology is called "catastrophizing". In essence, you're taking a mere unproven conjecture or possibility, exaggerating the negative severity of its implications, and then reacting emotionally as if this worst-case scenario were true, or significantly more likely than it actually is.
The proper protocol then is to re-familiarize yourself with Bayes Theorem (especially the concepts of evidence and priors), compartmentalize things according to their uncertainty, and try to step back and look at your actual beliefs and how they make you feel.
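As a quick refresher (the worked numbers below are my own illustration, not anyone's actual estimates): Bayes' Theorem tells you how a prior degree of belief in a hypothesis H should shift after seeing evidence E.

```latex
% Bayes' Theorem: posterior belief in H given evidence E.
P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E \mid H)\,P(H) + P(E \mid \neg H)\,P(\neg H)}
% Illustration: prior P(H) = 0.1 that a frightening conjecture is true;
% E = "several smart, interested people are unbothered by it";
% assume P(E | H) = 0.2 and P(E | not-H) = 0.8.
P(H \mid E) = \frac{0.2 \times 0.1}{0.2 \times 0.1 + 0.8 \times 0.9}
            = \frac{0.02}{0.74} \approx 0.027
```

On those (made-up) numbers, the calm of informed people is itself evidence, and the rational response is to fear the conjecture less, not to conclude that everyone else has missed the point.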
Rationality is more than just recognizing that something could be true; it is also assigning appropriate degrees of belief to ideas spanning a wide range of certainty and probability. What I am seeing repeatedly in your posts about the "dangers" of certain ideas is that you're assigning far more fear to these things than other people are.
To use an overused quote: "Fear is the mind-killer."
Try to look at the consequences of these ideas as dispassionately as possible. You cannot control everything that happens to you, but you can, to an extent, control your response t...
The next mistake was opening the door to solipsism and Brain-in-a-Vat arguments. This was so traumatic to me that I spent years in a manic depression.
Consider the possibility that the manic depression was coincidental. When people have mental episodes for fundamentally biochemical reasons, they often misattribute them to the most plausible-seeming non-biochemical cause they can think of. Exposure to ideas can exacerbate an existing problem, but it is unlikely that the lowest-hanging fruit here has anything at all to do with the ideas themselves. Instead of looking at how you engage with stressful ideas, consider looking into other aspects of your life which might be reducing your resilience.
With that said...
You started with a set of values and preferences and an ontology. When you encountered dust theory, you discovered that one of the concepts your values were defined in terms of - the notion of personal identity - wasn't fully coherent. You then tried to substitute a different definition in its place - an alternative notion of personal identity, one which might not carry across a sleep/wake cycle. This alternate notion of identity is not the thing you care about. A small, philosophically minded portion of your brain has decided that it is what you care about, and is now in conflict with the other parts of your brain, which don't accept the altered values. Listen to them; while those brain-parts aren't good at explaining things, they have knowledge, and in this case they are right.
I am going to perpetrate a little bit of the sin of amateur psychological diagnosis over the Internet. Sorry about that.
I'm not sure that the substance of the philosophical and cosmological concepts here is what is afflicting you. After all, many people engage with cosmological horror recreationally — see, for instance, the continued popularity of writers such as Lovecraft, Stross, Banks, or the "SCP Foundation" folks.
Exposure to weird cosmological horror does not cause most humans to freak out, at least not for very long. Most people more-or-less instinctively take Egan's Law into account ("it all adds up to normality") — to the extent that this Law is only needed as a reminder for people who don't automatically do so.
It sounds like you are having trouble disengaging from these ideas. So you might want to go seek treatment specifically for anxiety. This doesn't mean "stop thinking about these issues and thereby give up any possibility of coming up with good solutions to them"; it means "become able to stop thinking about these issues when it's getting loopy and unproductive, and get back to ape mode — and remember, ape mode is acceptable; we've b...
I think what you need to realize is that it is not a question of proving that all of those things are false, but rather of seeing that it makes no difference whether they are or not. For example, when you go to sleep and wake up, it feels just the same whether it is still you or a different person, so it doesn't matter at all.
I think there should be a discussion about the more general idea of "needing a protocol for discussing dangerous or disconcerting ideas" in addition to the discussion of this specific circumstance.
I think that I understand your feelings. I have had the same periods of existential fear about most of the things you describe. Two of them I discussed in my post about AI failure levels, which, to my surprise, didn't get any comments.
But it is also possible to have existential euphoria. The first instance came when I proved to myself the idea of quantum immortality. The second came when I understood that I will become a god in my own branch of the universe. The latter needs a more complex explanation, which I will omit for now.
But as I became older I got less and less fe...
the idea that during sleep my mind declines enough to merge into other experiences and I awake into a world I would consider alien, with perfectly consistent memories.
An entity with self-consistent memories is astronomically more likely to be found in a world which matches those memories, than in some mismatched world. The latter has a complexity penalty equal to all the extra mismatched complexity.
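To make "astronomically" concrete, here is a back-of-the-envelope version under a Solomonoff-style prior (my framing; the comment itself gives no formula): weight each world by two to the minus its description length in bits.

```latex
% A world matching the memories needs K bits to specify; a mismatched world
% needs Delta extra bits to encode every unexplained discrepancy between
% the memories and the environment.
\frac{P(\text{mismatched world})}{P(\text{matching world})}
  \approx \frac{2^{-(K+\Delta)}}{2^{-K}} = 2^{-\Delta}
```

Even a modest mismatch of Delta = 100 bits suppresses the alien-world scenario by a factor of roughly 10^-30.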
...Finally, we come to an absolutely terrifying idea I had a few days ago, which I naively assumed would catch the attention of any rational person. An extrapolation of Dust Theory [3] implied that you might die upon going to sleep, not immediately, but through degeneration, and that the person who wakes up in the morning is simply a different observer, who has an estimated lifespan of however long he remains awake. Rationally, anyone should therefore sign up for cryonics and then kill themselves, forcing their measure to continue into post-Singularity worlds
My logic was sound, but he substituted abstractly intuitive concepts in place of it.
Sound logic doesn't help when you start off with bad assumptions.
One of them insisted that I needed to explain how 'causality' could be violated; isn't that the whole point of acausal systems?
No, it isn't. The fact that the word 'acausal' exists on LW doesn't mean that the people who use it don't believe in causality. It's used when speaking about agents that follow a specific decision theory.
You furthermore simply pointed at arguments without being explicit...
Let X be something bad. If X is true, it is something you and nearly every other person should rightly fear. If, however, no one but you fears X, then either (1) you are mistaken, or (2) you have some special information or insight that everyone else lacks. Logically, it's almost certainly (1). So if you fear X but Eliezer, Bostrom, and Hanson don't appear to, take comfort from their lack of fear even if you don't understand it.
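To put rough numbers on "almost certainly (1)" (the figures are mine, purely illustrative): in odds form, Bayes' rule multiplies prior odds by a likelihood ratio. Suppose the prior odds of "unique insight" against "mistake" are 1:100 for any given scary conclusion, and that calm among well-informed experts is three times likelier if you are mistaken than if you alone see the danger.

```latex
% Posterior odds = prior odds x likelihood ratio (odds form of Bayes' rule).
% Illustrative inputs: prior odds insight:mistake = 1:100;
% P(experts calm | mistake) = 0.9, P(experts calm | insight) = 0.3.
\frac{P(\text{insight} \mid \text{calm})}{P(\text{mistake} \mid \text{calm})}
  = \frac{1}{100} \times \frac{0.3}{0.9} = \frac{1}{300}
```

On those inputs, observing the experts' calm should push you further toward (1), not leave you where you started.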
Like many people, I felt compelled to distinguish myself by solving your problem while playing by your rules (rules which aren't completely clear). But after all ... and I guess I should offer an apology if this doesn't help, but why should any of that change anything? Picture someone who for his whole life thought he had free will, then discovered that the universe is deterministic, with all that entails about ideas like "free will" as normal people envision it. This sounds pretty similar to your situation. You discovered that you may at any poi...
I have a talent for reasoning my way into terrifying and harmful conclusions.
I often worked through them myself, always by refuting the horrible consequences of them to my own satisfaction
This is a good start. Assemble your data.
Catalogue the terrifying conclusions which have troubled you, and record the life cycle of each.
Currently still terrified? If not, how was the terror resolved, and how long did that resolution take from when the thought first terrified you?
Stop reasoning and take data. Often patterns become obvious once data is tidily assembled...
If you come to think of your thought process as an accumulation of information layers, each resting on those beneath it, it should not be surprising that introducing a possibly devastating new thought, one which threatens the foundations of the whole structure, is counterproductive or depressing. I am speaking from my personal experience with solipsism, which came not from exposure but from my own self-destructive thought process; I only looked it up afterwards and learned that it was called solipsism. The introduction of these ideas at your pace, as you've experienced yourself, is very...
I have the exact same problem. It's nice to know I'm not alone. I've been scared to mention my fears on lesswrong because I didn't think anyone would understand.
I'm mainly concerned about the many-worlds interpretation being true. I don't take dust theory seriously. Unless I understand it wrongly, it removes causation and just assumes that the information itself is what matters. I really recommend you read Causal Universes. It's one of my favorite Lesswrong posts.
I also think dust theory leads to absurd and obviously wrong conclusions. Like, how do you interpret so...
An extrapolation of Dust Theory [3] implied that you might die upon going to sleep, not immediately, but through degeneration, and that the person who wakes up in the morning is simply a different observer, who has an estimated lifespan of however long he remains awake.
If that were true, wouldn't a lot of people be dying in their sleep so that we'd be seeing their corpses?
You are talking about rationality and about fear. Your protocol could have several independent layers. You seem to think that your ideas produce your fear, but it could also be the opposite: your fear could produce your ideas (and it is definitely very probable that fear has an impact on your ideas, at least on their content). So you could analyze the rational questions on lesswrong and, independently, work on the irrational part (the fear and so on) with therapists. There could be physical or chemical reasons why you worry more than other people do. Your protocol for dangerou...
The next big failure was my resolution to the Doomsday argument.
Are you aware of the self-indication assumption?
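For anyone who isn't: the self-indication assumption (SIA) is the standard counterweight to the Doomsday argument, and the cancellation takes only a few lines (the population figures below are the usual textbook illustrations, not numbers from this thread). The self-sampling assumption (SSA) treats your birth rank r as uniform over the total number of humans N, which makes a small-population "doom soon" hypothesis look far more likely; SIA additionally weights each hypothesis's prior by N, cancelling that shift exactly.

```latex
% Equal priors over N_s = 2*10^11 total humans ("doom soon") and
% N_l = 2*10^14 ("doom late"); observed birth rank r ~ 10^11.
% SSA alone: a uniform rank strongly favors the small world.
\frac{P(r \mid N_s)}{P(r \mid N_l)} = \frac{1/N_s}{1/N_l} = \frac{N_l}{N_s} = 1000
% SIA reweights each prior in proportion to its number of observers,
% which cancels the 1/N likelihood factor exactly:
\frac{P(N_s \mid r)}{P(N_l \mid r)} \propto \frac{N_s \cdot (1/N_s)}{N_l \cdot (1/N_l)} = 1
```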
Why think about these sorts of things?
Personally, death really messes with my mind, and I try not to think about it (and related bad things) in the short-to-medium term. I don't see that I'm in a position to do much to avoid death or related bad things right now, so I don't see much benefit in thinking about them right now. The cost to me is that it makes me mildly unhappy and risks moments of extreme unhappiness.
Do you have a link to Max Tegmark's rebuttal? What I've read so far seemed like a confused dodge.
I'm pretty much immune to infinity angst. PM the "dust theory" problem to me. I'm curious how it could be worse psychologically than modal realism: AFAICT dust theory implies that all subjective experiences exist, so I don't see how the two could differ in psychological impact.
I have a talent for reasoning my way into terrifying and harmful conclusions. The first was modal realism as a fourteen-year-old. Of course I did not understand most of its consequences, but I disliked the fact that existence was infinite. It mildly depressed me for a few days. The next mistake was opening the door to solipsism and Brain-in-a-Vat arguments. This was so traumatic to me that I spent years in a manic depression. I could have been healed in a matter of minutes if I had talked to the right person or read the right arguments during that period, but I didn't.
Lesswrong has been a breeding ground of existential crises for me. The Doomsday argument (which I thought up independently), ideas based on acausal trade (one example was already well known; one I invented myself), quantum immortality, the simulation argument, and finally my latest and worst epiphany: the potentially horrible consequences of losing awareness of your reality under Dust Theory. I don't know that that's an accurate term for the problem, but it's the best I can think of.
This isn't to say that my problems were never solved; I often worked through them myself, always by refuting their horrible consequences to my own satisfaction and never through any sort of 'acceptance.' I don't think that my reactions are a consequence of an already depressed mind-state (which I certainly have anyway), because the moment I refute them I feel emotionally as if it never happened. It no longer wears on me. I have OCD, but if it's what's causing me to ruminate, then I think I prefer having it, as opposed to irrationally suppressing a rational problem. Finding solutions would have taken much longer if I hadn't been thinking about them constantly.
I've come to realize that this site, due perhaps to a confluence of problems, was extremely unhelpful in working through any of my issues, even when they were brought about by Lesswrong ideas and premises. My acausal problem [1] I sent to about five or six people, and none of them had anything conclusive to say; they simply referred me to Eliezer, who didn't respond, even though this sort of thing is apparently important to him. This whole reaction struck me as disproportionate to the severity of the problem, but that was the best response I've had so far.
The next big failure was my resolution to the Doomsday argument. [2] I'm not very good yet at conveying these kinds of ideas, so I'm not sure it was entirely the fault of the Lesswrongers, but still. One of them insisted that I needed to explain how 'causality' could be violated; isn't that the whole point of acausal systems? My logic was sound, but he substituted abstractly intuitive concepts in place of it. I would think that there would be something in the Sequences about that.
The other posters were only marginally more helpful. Some of them challenged the self-sampling assumption, but why even bother, if the problem I'm trying to solve requires it to be true? In the end, not one person even seemed to consider the possibility that it might work, even though it is a natural extrapolation from other ideas which are taken very, very seriously by Lesswrong. Instead of discussing my resolution, they discussed the DA itself, or AI, or whatever they found more interesting.
Finally, we come to an absolutely terrifying idea I had a few days ago, which I naively assumed would catch the attention of any rational person. An extrapolation of Dust Theory [3] implied that you might die upon going to sleep, not immediately, but through degeneration, and that the person who wakes up in the morning is simply a different observer, who has an estimated lifespan of however long he remains awake. Rationally, anyone should therefore sign up for cryonics and then kill themselves, forcing their measure to continue into post-Singularity worlds that no longer require them to sleep (not that I would have ever found the courage to do this). [4] In the moments when I considered it most plausible I gave it no more than a 10% chance of being true (although it would have been higher if I had taken Dust Theory for granted), and it still traumatized me in a way I've never experienced before. Always during my worst moments, sleep came as a relief and an escape. Now I cannot go to sleep. Only slightly less traumatizing was the idea that during sleep my mind declines enough to merge into other experiences and I awake into a world I would consider alien, with perfectly consistent memories.
My inquiries on different threads were almost completely ignored, so I eventually created my own. After twenty-four hours there were nine posts, and now there are twenty-two. All of them either completely miss the point (without ever realizing it) or show complete ignorance of what Dust Theory is. The idea that this requires any level of urgency does not seem to have occurred to anyone. Finally, the second part of my question, which asked about the six-year-old post "getting over Dust Theory," was completely ignored, despite that post having ninety-five comments from people who seem to already understand it.
I resolved both issues, but not to my own satisfaction: while I now consider the death outcome unlikely enough to dismiss, the reality-jumping still somewhat worries me. For the next few months, maybe longer, I will not be able to go to sleep without fear, and my mental and physical health will deteriorate. Professional help or a hotline is out of the question, because I will not inflict these ideas on people who are not equipped to deal with them, and also because I regard psychologists as charlatans or, at best, practitioners of a deeply unhealthy field. The only option I have for resolving the issues is talking to someone who can discuss them rationally.
This post [5] by Eliezer, however unreliable he might be, convinced me that he might actually know what he is talking about (though I still don't know how Max Tegmark's rebuttal to quantum immortality is refuted, because it seems pretty airtight to me). More disappointing is Nick Bostrom's argument that mind-duplicates will experience two subjective experiences; he does not understand the idea of measure, i.e. that we exist in all universes that account for our experiences, but more in some than others. Still, I think there has to be someone out there who is capable of following my reasoning - all the more frustrating, because the more people misapprehend my ideas, the clearer and sharper they seem to me.
Who do I talk to? How do I contact them? I doubt that going around emailing these people will be effective, but something has to change. I can't go insane, as much as that would be a relief, and I can't simply ignore it. I need someone sane to talk to, and this isn't the place to find that.
Sorry if any of this comes off as ranting or incoherent. That's what happens when someone is pushed to all extremes and beyond. I am not planning on killing myself whatsoever and do not expect that to change. I just want help.
[1] http://lesswrong.com/lw/l0y/i_may_have_just_had_a_dangerous_thought/ (I don't think that the idea is threatening anymore, though.)
[2] http://lesswrong.com/lw/m8j/a_resolution_to_the_doomsday_argument/
[3] http://sciencefiction.com/2011/05/23/science-feature-dust-theory/
[4] http://lesswrong.com/lw/mgd/the_consequences_of_dust_theory/
[5] http://lesswrong.com/lw/few/if_mwi_is_correct_should_we_expect_to_experience/7sx3
(The insert-link button is greyed out, for whatever reason.)