Comment author: Eitan_Zohar 08 July 2015 06:14:32PM *  0 points [-]

Sign up for cryonics. All of your subjective future will continue into quantum worlds that care enough to revive you, without regard for worlds where the cryonics organization went bankrupt or there was a nuclear war.

Doesn't this mean that you should deliberately avoid finding out whether cryonics can actually preserve your information in a retrievable way? If it can't, that would eliminate the vast majority of the worlds that would have brought you back, whereas if you don't know, it remains undetermined. Am I getting this right?

Comment author: Eliezer_Yudkowsky 14 July 2015 06:35:16PM 2 points [-]

You're confusing subjective probability and objective quantum measure. If you flip a quantum coin, half your measure goes to worlds where it comes up heads and half goes to where it comes up tails. This is an objective fact, and we know it solidly. If you don't know whether cryonics works, you're probably still already localized by your memories and sensory information to either worlds where it works or worlds where it doesn't; all or nothing, even if you're ignorant of which.

Comment author: G0W51 07 June 2015 08:39:12PM 0 points [-]

The article said the leverage penalty "[penalizes] hypotheses that let you affect a large number of people, in proportion to the number of people affected." If this is all the leverage penalty does, then it doesn't matter if it takes 3^^^3 atoms or units of computation, because atoms and computations aren't people.

That said, the article doesn't precisely define what the leverage penalty is, so there could be something I'm missing. So, what exactly is the leverage penalty? Does it penalize how many units of computation, rather than people, you can affect? This sounds much less arbitrary than the vague definition of "person" and sounds much easier to define: simply divide the prior of a hypothesis by the number of bits flipped by your actions in it and then normalize.
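The computation-based penalty proposed above can be sketched in a few lines. This is a toy illustration only: the hypothesis names, raw priors, and bit counts are invented, and real hypothesis spaces are not finite dictionaries.

```python
# Toy sketch of a computation-based leverage penalty: divide each raw
# prior by the number of bits the agent's actions would flip under that
# hypothesis, then renormalize. All numbers below are made up.

def leverage_adjusted_priors(hypotheses):
    """hypotheses maps name -> (raw_prior, bits_flipped)."""
    penalized = {name: prior / bits_flipped
                 for name, (prior, bits_flipped) in hypotheses.items()}
    total = sum(penalized.values())
    return {name: p / total for name, p in penalized.items()}

hypotheses = {
    "mundane": (0.9, 10),       # your actions affect ~10 bits
    "mugger":  (0.1, 10**30),   # claims you affect astronomically many bits
}
print(leverage_adjusted_priors(hypotheses))
```

Even a modest raw prior on the high-leverage hypothesis is crushed by the penalty, which is the intended behavior: the posterior mass on "mugger" ends up around 10^-30.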

Comment author: Eliezer_Yudkowsky 07 June 2015 09:35:29PM 2 points [-]

We can even strip out the part about agents and carry out the reasoning on pure causal nodes; the chance of a randomly selected causal node being in a unique-100 position on a causal graph with respect to 3↑↑↑3 other nodes ought to be at most 100/3↑↑↑3 for finite causal graphs.
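The 100/3↑↑↑3 bound is just the probability that a uniformly selected node lands in one of at most 100 special positions. A toy simulation illustrates this; 3↑↑↑3 is of course uninstantiable, so the stand-in graph size N and trial count here are arbitrary.

```python
import random

random.seed(0)  # deterministic toy run

N = 10**6                 # small stand-in for 3↑↑↑3 nodes
trials = 200_000
special = set(random.sample(range(N), 100))  # the 100 "unique" positions

# Empirical frequency of a random node landing in a special position.
hits = sum(random.randrange(N) in special for _ in range(trials))
print(hits / trials, "vs. bound", 100 / N)
```

The empirical rate hovers around the 100/N bound, exactly as the counting argument predicts.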

Comment author: JonahSinick 07 June 2015 08:00:45AM *  2 points [-]

I'm puzzled: is there a way to read his comment

People described him to me as resembling a young Bill Gates. His estimated expected future wealth based on that data if pursuing entrepreneurship, and informed by the data about the relationship of all of the characteristics I could track with it, was in the 9-figure range. Then add in that facebook was a very promising startup (I did some market sizing estimates for it, and people who looked at it and its early results were reliably impressed).

other than as him doing it at the time?

Comment author: Eliezer_Yudkowsky 07 June 2015 07:50:33PM 1 point [-]

Yes, as his post facto argument.

Comment author: Eliezer_Yudkowsky 07 June 2015 07:16:03AM 3 points [-]

You have not understood correctly regarding Carl. He claimed, in hindsight, that Zuckerberg's potential could've been distinguished in foresight, but he did not do so.

Comment author: Eliezer_Yudkowsky 07 June 2015 06:59:28AM 9 points [-]

Moved to Discussion.

Comment author: G0W51 07 June 2015 02:23:54AM 1 point [-]

What if the mugger says he will give you a single moment of pleasure that is 3^^^3 times more intense than a standard good experience? Wouldn't the leverage penalty not apply and thus make the probability of the mugger telling the truth much higher?

I think the real reason the mugger shouldn't be given money is that people are more likely to be able to attain 3^^^3 utils by donating the five dollars to an existential risk-reducing charity. Even though the current universe presumably couldn't support 3^^^3 utils, there is a chance of being able to create or travel to vast numbers of other universes, and I think this chance is greater than the chance of the mugger being honest.

Am I missing something? These points seem too obvious to miss, so I'm assigning a fairly large probability to me either being confused or that these were already mentioned.
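The charity-vs-mugger argument above reduces to a probability comparison: if both routes pay off the same astronomically large number of utils, that common factor cancels and only the probabilities matter. The probabilities below are invented purely for illustration.

```python
# Sketch of the expected-utility comparison above. 3^^^3 utils is the
# same (unrepresentably large) payoff on both routes, so it cancels and
# the decision turns on probability alone. Both numbers are made up.

p_mugger_honest = 1e-20   # assumed chance the mugger delivers
p_charity_route = 1e-12   # assumed chance the donation enables 3^^^3 utils

# E[mugger] = p_mugger_honest * U and E[charity] = p_charity_route * U
# for the same U, so comparing probabilities compares expected utilities.
donate = p_charity_route > p_mugger_honest
print("donate to charity" if donate else "pay the mugger")
```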

Comment author: Eliezer_Yudkowsky 07 June 2015 06:51:42AM 0 points [-]

I don't think you can give me a moment of pleasure that intense without using 3^^^3 worth of atoms on which to run my brain, and I think the leverage penalty still applies then. You definitely can't give me a moment of worthwhile happiness that intense without 3^^^3 units of background computation.

Comment author: Unknowns 05 May 2015 07:25:25AM 0 points [-]

This seems to test your claim that you won't accept bets where losing would call into question your ability to pay.

What do you think is the probability (given the fixed assumption that Christianity is false) that, sometime before 2045, you will have the psychological experience of a vision of Christ claiming to be risen from the dead?

Comment author: Eliezer_Yudkowsky 09 May 2015 11:46:54PM 1 point [-]

Preeeeeeeeeeeetty small, and I nonetheless won't accept any bets that I couldn't pay off if I lost, because that's deontologically dishonorable.

Comment author: lerjj 04 March 2015 09:57:17PM 2 points [-]

It's actually the same tactic the Weasley twins used to bury the "engaged to Ginevra Weasley" story: plant so many fake newspaper reports that everyone gets confused. And it kind of happens again after the Hermione/Draco incident. I guess Eliezer likes the theme that people can't discern the truth from wild rumours if the truth is weird enough.

Comment author: Eliezer_Yudkowsky 05 March 2015 05:47:40AM 12 points [-]

Oh, trust me, they can't discern the truth from wild rumors even if it's normal. (I am speaking of real life, here.)

Comment author: tim 05 March 2015 04:14:52AM 5 points [-]

The biggest problem for me is that when I imagine myself reading these events and Voldemort going, "A nice try but I can sense whatever transfiguration trick it is that you're using. Thank you, that will take me some time to perfect in my eternity," I don't feel surprised.

Throwing additional stipulations and conditions into the situation doesn't change the fact that the way in which Voldemort loses is not convincing.

It doesn't feel like Harry earned the win because I can just as easily imagine Voldemort laughing at Harry's childish tricks and killing him. For the finale to truly be satisfying, there has to be a specific, pre-established reason why Voldemort was unable to defend against Harry's tactics and, at least in my mind, this was not the case.

Simply being unaware of partial Transfiguration doesn't cut it. This is a person who casts nearly thirty charms to discuss sensitive information in Mary's Room and recognizes the value of ambush and surprise attacks. Yet in his final moment of securing his eternity, he is unable to sense transfigured material winding its way around himself and his followers, material transfigured by a person he has a known magical resonance with.

It simply does not feel reasonable for Voldemort to lose like this, no matter how many addendums are added to justify his behavior in these final chapters.

Comment author: Eliezer_Yudkowsky 05 March 2015 04:33:38AM 7 points [-]

I do remark that Dumbledore was unable to detect Harry doing an ongoing Transfiguration while he looked into Harry's prison cell in Azkaban.

Comment author: Eliezer_Yudkowsky 04 March 2015 05:50:47PM 8 points [-]

A lot of people think that Voldemort was going too easy on Harry, making this a "Coil vs. Taylor in the burning building" violation of suspension-of-disbelief for some of them. I am considering rewriting 113 with the following changes:

  • Most Death Eaters are watching the surrounding area, not Harry; Voldemort's primary hypothesis for how Time might thwart him involves outside interference.
  • Voldemort tells Harry to point his wand outward and downward at the ground, then has a Death Eater paralyze Harry (except heart/lungs/mouth/eyes) in that position before the Unbreakable Vow. This would also require a retroedit to 15 or 28 to make it clear that Transfiguration does not require an exact finger position on the wand.

