Less Wrong is a community blog devoted to refining the art of human rationality. Please visit our About page for more information.

Comment author: Daniel_Burfoot 16 June 2017 11:25:42PM 1 point [-]

Given that many of the most successful countries are small and self-contained (Singapore, Denmark, Switzerland, Iceland, arguably the other Scandinavian countries), and also the disasters visited upon humanity by large unified nation-states, why are people so attached to the idea of large-scale national unity?

Comment author: gilch 17 June 2017 02:48:07AM 0 points [-]

I'm not sure what idea you're talking about. Are you talking about intranational unity or international unity? Can you give examples?

Comment author: compiledwrong 16 June 2017 04:46:56AM 1 point [-]

Do you believe that "the mind works the way the mind thinks that it works"? (I do.)

The most specific example of this that comes to mind for me is: if you believe that exercising willpower drains your willpower, then that is true. And if you believe that exercising willpower strengthens your willpower, then it does! http://news.stanford.edu/news/2007/february7/dweck-020707.html http://mindsetscholarsnetwork.org/wp-content/uploads/2015/09/What-We-Know-About-Growth-Mindset.pdf

Comment author: gilch 17 June 2017 02:41:12AM 1 point [-]

No, I don't, or only a little bit. See Moravec's Paradox: the easiest tasks to program a computer to do are those we are most conscious of. There are parts of the brain that are remarkably plastic, but there is a lot of background processing that we are not aware of.

If you don't believe that exercising willpower drains your willpower, then it actually still does, but you just don't notice it as soon. This has been tested. It's also true that certain mental abilities can be improved with practice, this is just plasticity. Think of it like exercising a muscle. If you overexert yourself, a strong instinct will try to stop you from hurting yourself. Trained athletes can overcome this instinct to some extent, but they still have real physical limits. And of course, appropriate exercise can improve performance over the long term, but again there are real physical limits to how much.

Comment author: turchin 12 June 2017 08:23:43PM 0 points [-]

You may also not identify with the person who will wake up in your body tomorrow after a night's sleep. There will be large informational and biochemical changes in your brain, as well as a discontinuity of the stream of consciousness during deep sleep.

I mean that an attempt to deny identity with your copies will result in even larger paradoxes.

Comment author: gilch 13 June 2017 05:07:36AM *  1 point [-]

I don't buy it. Why don't you wake up as Britney Spears instead? Clearly there's some information in common between your mind patterns. She is human after all (at least I'm pretty sure).

Clearly there is a sufficient amount of difference that would make your copy no longer you.

I think it is probable that cryonics will preserve enough information, but I think it is nigh impossible that my mere written records could be reconstructed into me, even by a superintelligence. There is simply not enough data.

But given Many Worlds, a superintelligence certainly could attempt to create every possible human mind by using quantum randomization. Only a fraction of these could be realized in any given Everett branch, of course. Most possible human minds would be insane, since their memories would make no sense.

Given the constraint of "human mind" this could be made more probable than Boltzmann Brains. But if the Evil AI "upgrades" these minds, then they'd no longer fit that constraint.

Comment author: turchin 11 June 2017 04:40:52PM 6 points [-]

But killing oneself has tail risks too. One is that hell exists in our simulation, and suicide is a sin :)

Another is that quantum immortality is true AND that you will survive any suicide attempt, but seriously injured. Personally, I don't think this is a tail outcome; I give it a high probability, but most people give it a very low one.

Paradoxically, only cryonics could protect you against these tail outcomes: even a small chance that you will be successfully cryopreserved and returned to life (say 1 per cent) dominates the chance that you will be infinitely dying but unable to die (say 0.00001 per cent) across all branches of the multiverse. So you will experience returning from cryostasis 100,000 times more often than suffering infinitely because of quantum immortality.

The chance that you will be resurrected by an evil AI only to be tortured is much smaller, say 1 per cent of all cryobranches of the multiverse.

It means that if you choose suicide (and believe in QI) you have 100 times more chance of eternal suffering than if you chose cryonics.
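The arithmetic above can be sketched directly, taking the comment's figures at face value. Note that both probabilities are illustrative guesses by the commenter, not measured quantities:

```python
# Illustrative figures from the comment above -- guesses, not data.
p_cryo_revival = 0.01        # 1%: cryopreservation succeeds and you are revived
p_qi_suffering = 0.0000001   # 0.00001%: QI leaves you dying but unable to die

# Ratio of experienced branches: waking from cryostasis vs. suffering endlessly
ratio = p_cryo_revival / p_qi_suffering
print(round(ratio))  # -> 100000, matching the "100 000" figure in the comment
```

This only checks that the stated numbers imply the stated ratio; whether those probabilities are remotely right is a separate question.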

Comment author: gilch 11 June 2017 07:03:47PM 1 point [-]

One is that hell exists in our simulation, and suicide is a sin :)

Pascal's mugging. One could just as easily imagine a simulation such that suicide is necessary to be saved from hell. Which is more probable? We cannot say.

Another is that quantum immortality is true AND that you will survive any suicide attempt, but seriously injured. Personally, I don't think this is a tail outcome; I give it a high probability, but most people give it a very low one.

I also think this is more likely than not. Subjective Immortality doesn't even require Many Worlds. A Tegmark I multiverse is sufficient. Assuming we have no immortal souls and our minds are only patterns in matter, then "you" are simultaneously every instantiation of your pattern throughout the multiverse. Attempting suicide will only force you into living only in the bad outcomes where you don't have control over your life anymore, and thus cannot die. But this is exactly what the suicidal are trying to avoid.

Comment author: entirelyuseless 11 June 2017 05:46:25PM 3 points [-]

We already know that most people who attempt suicide survive. This is true even from the standard viewpoint of an external observer. This is already a good reason not to attempt suicide.

Comment author: gilch 11 June 2017 06:48:08PM 1 point [-]

[Citation needed]

Do most only survive their first attempt and try again, or do most live a natural lifespan after a failed attempt? What proportion of these suffer debilitating injuries for their efforts?

Stupid Questions June 2017

3 gilch 10 June 2017 06:32PM

This thread is for asking any questions that might seem obvious, tangential, silly or what-have-you. Don't be shy, everyone has holes in their knowledge, though the fewer and the smaller we can make them, the better.

Please be respectful of other people's admitting ignorance and don't mock them for it, as they're doing a noble thing.

To any future monthly posters of SQ threads, please remember to add the "stupid_questions" tag.

Comment author: madhatter 02 June 2017 01:07:59AM 0 points [-]

Anything not too technical about nanotechnology? (Current state, forecasts, etc.)

Comment author: gilch 02 June 2017 05:57:23AM 0 points [-]

Engines of Creation is the classic. It's much less technical than Nanosystems.

Comment author: gilch 29 May 2017 12:46:11AM 0 points [-]

Does "Bi-Weekly" mean twice a week or every-other week?

Comment author: qmotus 22 May 2017 01:18:16PM 1 point [-]

Are people close to you aware that this is a reason that you advocate cryonics?

Comment author: gilch 22 May 2017 10:29:30PM 0 points [-]

I'm not sure what you're implying. Most people close to me are not even aware that I advocate cryonics. I expect this will change once I get my finances sorted out enough to actually sign up for cryonics myself, but for most people, cryonics alone already flunks the Absurdity heuristic. Likewise with many of the perfectly rational ideas here on LW, including the logical implications of quantum mechanics and cosmology, like Subjective Immortality. Linking more "absurdities" seems unlikely to help my case in most instances. One step at a time.

Comment author: strangepoop 15 May 2017 11:36:23PM *  4 points [-]

Why does patternism [the position that you are only a pattern in physics and any continuations of it are you/you'd sign up for cryonics/you'd step into Parfit's teleporter/you've read the QM sequence]

not imply

subjective immortality? [you will see people dying, other people will see you die, but you will never experience it yourself]

(contingent on the universe being big enough for lots of continuations of you to exist physically)

I asked this on the official IRC, but only feep was kind enough to oblige (and had a unique argument that I don't think everyone is using)

If you have a completely thought out explanation for why it does imply that, you ought never to be worried about what you're doing leading to your death (maybe painful existence, but never death), because there would be a version of you that would miraculously escape it.

If you bite that bullet as well, then I would like you to formulate your argument cleanly, then answer this (rot13):

jul jrer lbh noyr gb haqretb narfgurfvn? (hayrff lbh pbagraq lbh jrer fgvyy pbafpvbhf rira gura)

ETA: This is slightly different from a Quantum Immortality question (although resolutions might be similar) - there is no need to involve QM or its interpretations here, even in a classical universe (as long as it's large enough), if you're a patternist, you can expect to "teleport" to another exact clone somewhere that manages to live.

Comment author: gilch 20 May 2017 12:33:24AM 2 points [-]

I think it does imply subjective immortality. I'll bite that bullet. Therefore, you should sign up for cryonics.

Consciousness isn't continuous. There can be interruptions, like falling asleep or undergoing anesthesia. A successor mind/pattern is a conscious pattern that remembers being you. In the multiverse, any given mind has many many successors. It doesn't have to follow immediately, or even have to follow at all, temporally. At the separations implied even for a Tegmark Level I multiverse, past and future are meaningless distinctions, since there can be no interactions.

You are your mind/pattern, not your body. A mind/pattern is independent of substrate. Your unconscious, sleeping self is not your successor mind/pattern. It's an unconscious object that has a high probability of creating your successor (i.e. it can wake up). Same with your cryonically-preserved corpsicle, though the probability is lower.

Any near-death event will cause grievous suffering to any barely-surviving successors, and grief and loss to friends and relatives in branches where you (objectively) don't survive. I don't want to suffer grievous injury, because that would hurt. I also don't want my friends and relatives to suffer my loss. Thus, I'm reluctant to risk anything that may cause objective death.

But, the universe being a dangerous place, I can't make that risk zero. By signing up for cryonics, I can increase the measure of successors that have a good life, even after barely surviving.

In the Multiverse, death isn't all-or-none, black or white. A successor is a mind that remembers being you. It does not have to remember everything. If you take a drug that causes you to not form long-term memory of any event today, have you died by the next day? Objectively, no. Your friends and relatives can still talk to "you" the next day. Subjectively, partially. Your successors lack certain memories. But people forget things all the time.

Being mortal in the multiverse, you can expect that your measure of successors will continue to diminish as your branches die, but the measure never reaches absolute zero. Eventually all that remain are Boltzmann Brains and the like. The most probable Boltzmann brain successors only live long enough to have a "single" conscious quale of remembering being you. The briefest of conscious thoughts. Their successors remember that thought and may have another random thought. You can eventually expect an eternity of totally random qualia and no control at all over your experience.

This isn't Hell, but Limbo. Suffering is probably only a small corner of possible qualia-space, but so is eudaimonia. After an eternity you might stumble onto a small Boltzmann World where you have some measure of control over your utility for some brief time, but that world will die, and your successors will again be only Boltzmann brains.

I can't help that some of my successors from any given moment are Boltzmann brains. But I don't want my only successors to be Boltzmann Brains, because they don't increase my utility. Therefore, cryonics.

See the Measure Problem of cosmology. I'm not certain of my answer, and I'd prefer not to bet my life on it, but it seems more likely than not. I do not believe that Boltzmann Brains can be eliminated from cosmology, only that they have lesser measure than evolved beings like us. This is because of the Trivial Theorem of Arithmetic: almost all natural numbers are really damn huge. The universe doesn't have to be infinite to get a Tegmark Level I multiverse. It just has to be sufficiently large.
