Comment author: buybuydandavis 16 March 2015 12:10:11AM *  7 points [-]

The presence of Hermione's character totally changes the tone of the story, and reading this one, it became really clear how sorely the Sunshine General was missed from the last third or so of the story arc. Eliezer writes her very well, and seems to enjoy writing her too.

Harry's world was bleak without Hermione. Harry's love for Hermione, and even love for Humanity in general, had been missing for a while. He largely went into young Tom Riddle mode for a long time, without Hermione's influence.

Harry:

Being friends with you means that my life doesn't have to go the way Voldemort's did.

Recall Quirrell:

“Then here is what I might have done at your age, if there had been anyone to do it for—”

Hermione showed Harry the possibility of both love and understanding. He had love from his parents, and understanding from Quirrell, but both from Hermione. The world became a different place for Harry when he came to know Hermione.

Maybe I was expecting too much adulthood from Harry, but he loves Hermione in every meaningful way except the romantic, and his evasion of that admission was disappointing, if not entirely out of character.

Comment author: TylerJay 17 March 2015 07:47:49PM 3 points [-]

And just a few lines before your last quote:

Quirrell:

"I did not have any friends like that when I was young." Still the same emotionless voice. "What would have become of you, I wonder, if you had been alone?"

Comment author: JenniferRM 16 March 2015 05:46:48AM *  14 points [-]

I think the most interesting part of this ending (the thing that really surprised me the most) was the idea of Dumbledore not holding an idiot ball, nor being crazy, nor even being "apparently crazy just for the sake of complex strategically cultivated opacity"... but instead being the embodiment of the biggest point of departure from canon: he knows every prophecy, and thereby caused many other points of departure semi-intentionally.

Also, having Dumbledore essentially become the half-understanding servant of whatever it is that causes prophecies turns the whole story into something that is fundamentally about time travel, in a way I really wasn't expecting.

Maybe I should have. Eliezer's notes have mentioned that he thinks very highly of HP and the Wastelands of Time, but I thought that the time traveling themes would mostly be restricted to time turners, and time turners wouldn't be very powerful, because otherwise it would disrupt the rationality theme...

This makes me think that it would be moderately rewarding to read HPMOR itself again to try to examine Dumbledore's actions more carefully. Like... what if he said what he said during the feast on the first night (when Harry was drinking comed-tea) because it was what the prophecies said he had to do? How constrained was he? Was there really a "crazy act" on his part, mixed into the prophecy hacking, to hide the prophecy hacking better? How much free agency did he have left over? And for that matter, how much did Eliezer track such issues?

If this were just the finish of a first draft, rather than the complete and final ending of the series, I'd expect editing to shore up the coherence of the necessities of time travel.

Knowing that the plotting was worked out the way a TV series is written, it seems that early content was probably optimized more to hook readers than to align with the rigors of plot. But still, my guess is that the core reason for Dumbledore to seem crazy was already in Eliezer's mind in the first few chapters. Sadly, there will be no more data to settle the question honestly, but it was a fun game while it lasted. I'm sad the data source has shut down, but happy to have played :-)

EDIT: Oh! Also it makes Dumbledore being outside of time (instead of actually dead) more interesting. Presumably he cannot be "raised from the dead" from this position. Also, it appears that there is some room for him to be causally related to the source of prophecies, from his position outside time... maybe? ;-)

Comment author: TylerJay 17 March 2015 07:43:41PM 4 points [-]

What I love about this twist is how it changes the interpretation of so many other things that were said throughout the story. For example:

"Purposeless?" said Professor Quirrell. "Oh, but the madness of Dumbledore is not that he is purposeless, but that he has too many purposes.

It turns out PQ was right: the madness of Dumbledore was not purposelessness, however much his going around "snipping all the threads of destiny" to constrain future events would look, to anyone without all his knowledge of prophecy, like many divergent purposes. Even Dumbledore himself didn't know how or why some of them fit into the whole picture. But it was all done in service of his one true goal. And if the service of that goal had involved killing Harry or framing Hermione? Well,

"Who knows what the Headmaster thinks he has reason to do, when he has found reason to do so many strange things already."

Comment author: TylerJay 15 March 2015 09:09:27AM 2 points [-]

I couldn't be happier with the ending. So perfect.

"I think that you always were, from the day I met you, my mysterious old wizard."

Thank you so much Eliezer. It's been an amazing journey.

Comment author: TylerJay 15 March 2015 08:48:18AM *  3 points [-]

I derived Bayes' Theorem and the basic rule for conditional probabilities today while trying to find the answer to a question.

I had seen Bayes' Theorem before, but never really understood it and certainly had never memorized it. The only relevant knowledge I had was:

  • That the syntax for "probability of A given B" is p(A|B)
  • That you can multiply independent probabilities to do AND, but aren't allowed to do that if they are dependent

I was surprised at how it followed directly from intuition and the last bullet point above. I put together a toy problem involving the probability that I go outside depending on if it's raining or not and was able to derive Bayes' Theorem essentially from first principles. Not a groundbreaking accomplishment or anything, but I know I'll never forget any of it now!

The line of reasoning was really simple once I was able to distill it:

  1. I can simulate the probability that I'm outside given that it's raining by first rolling a die to see if it's raining, and if it is, then rolling another die to see if I'm outside, since I already know it's raining.
  2. These are clearly independent events, so I should be able to multiply their probabilities to get the probability of their conjunction.
  3. What does the conjunction mean? Intuitively, I'm outside and it's raining exactly when each of those die rolls turned up True, so this multiplication must give the probability that I'm outside and it's raining: p(outside|raining) * p(raining) = p(outside AND raining)
  4. And if that works, then symmetrically, this same probability p(outside AND raining) should also be equal to the probability that it's raining given that I'm outside times the probability that I'm outside.
  5. Isolate p(outside|raining) on one side of the equation and Boom: Bayes' Theorem.

p(A|B) * p(B) = p(A & B) = p(B|A) * p(A)

p(A|B) * p(B) = p(B|A) * p(A)

p(A|B) = [p(B|A) * p(A)] / p(B)
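The two-die simulation in step 1 can actually be run. Here is a minimal sketch (the probabilities 0.3, 0.2, 0.7 are made-up toy parameters): it estimates p(raining | outside) empirically by the rolling procedure described above, then compares against the value Bayes' Theorem gives analytically from the same inputs.

```python
import random

def estimate_p_rain_given_outside(trials=200_000, p_rain=0.3,
                                  p_out_rain=0.2, p_out_dry=0.7):
    """Empirically estimate p(raining | outside) via the two-die procedure:
    first roll for weather, then roll for going outside given the weather."""
    outside_count = 0
    rain_and_outside = 0
    rng = random.Random(0)  # fixed seed so the estimate is reproducible
    for _ in range(trials):
        raining = rng.random() < p_rain                    # first die: weather
        p_out = p_out_rain if raining else p_out_dry
        outside = rng.random() < p_out                     # second die: outside?
        outside_count += outside
        rain_and_outside += raining and outside
    # p(raining | outside) = p(raining AND outside) / p(outside)
    return rain_and_outside / outside_count

# Bayes' Theorem computed analytically from the same inputs:
# p(rain|out) = p(out|rain) * p(rain) / p(out)
p_out_total = 0.3 * 0.2 + 0.7 * 0.7        # law of total probability
analytic = (0.2 * 0.3) / p_out_total       # about 0.109
empirical = estimate_p_rain_given_outside()
```

The empirical estimate and the analytic Bayes value agree up to sampling noise, which is the whole content of the derivation above.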

Comment author: Viliam_Bur 10 March 2015 03:31:01PM *  12 points [-]

We have a baby girl! (we = me + BarbaraB)

Name: Ivana. Date of birth: March 8th. No complications, the baby seems healthy.

She was born in Hainburg in Austria, which is near Bratislava, because frankly we trust Austrian health care more than Slovak (based on both personal experience and research; in short: health care in Slovakia is typically understaffed and overmedicates, while doctors in Austria are typically much more polite and friendly, and what they say seems more compatible with international research, although there are exceptions too). Now we are back in Bratislava.

family photo

Comment author: TylerJay 15 March 2015 07:52:16AM 0 points [-]

Congratulations!

Out of curiosity: What do you think of Czech healthcare? I got appendicitis while visiting the Czech Republic and had to have my appendix out there, in a hospital that was built in the 1300s.

Comment author: TobyBartels 08 March 2015 08:56:21PM 0 points [-]

Just to be clear, the Reddit thread didn't say that you can fit 48 hours in a day, it just didn't say that you couldn't. And it probably had that Dumbledore quotation too, it's just me saying that we can't know for sure what that quote means.

Comment author: TylerJay 09 March 2015 07:22:53AM 0 points [-]

Haha, yup, I gotcha. Thanks for the info.

Comment author: johnswentworth 07 March 2015 06:06:37PM *  2 points [-]

Right, that much makes sense. The problem is the "perfectly simulate C3PO" part toward the end. If we really want to see what it would do, then we need a perfect simulation of the environment in addition to C3PO* itself. Any imperfection, and C3PO* might realize it's in a simulated environment. All else equal, once C3PO* knows it's in a simulated environment, it would presumably try to get out. Since its utility function is different from C3PO, it would sometimes be motivated to undermine C3PO (or us, if we're the ones running the simulation).

Comment author: TylerJay 08 March 2015 01:47:45AM 3 points [-]

Just remember that this isn't a boxing setup. This is just a way of seeing what an AI will do under a false belief. From what I can tell, the concerns you brought up about it trying to get out aren't any different between the scenario where we simulate C3PO* and the one where we simulate C3PO. The problem of making a simulation indistinguishable from reality is a separate issue.

Comment author: johnswentworth 05 March 2015 10:58:03PM 2 points [-]

One high-level concern. If I'm reading this right (and please let me know if I'm not), then this is talking about handling counterfactuals by spawning a copy of the AI with a different utility function.

Just on general principles, spawning a copy of the AI with a different utility function seems really, really dangerous. The new copy would be motivated to trade off anything at all in the no-thermodynamic-miracle scenario in order to increase utility in the event of a thermodynamic miracle. In particular, if the AI were boxed (as we might expect for counterfactual processing) then it would presumably try to get out of that box.

Comment author: TylerJay 07 March 2015 04:16:53AM *  2 points [-]

Here's my explanation of it. Let me know if this helps with your concerns at all:

Imagine we have an AI design we want to test. Call this AI C3PO, and let its utility function be U(A) where A is a world-state from the set of all possible world-states. And let the super-unlikely-event-happening-at-the-specified-time described in the post be w such that w = true if it happens and w = false if it doesn't happen. Then let A* be a world state in the subset of all world-states A in which w = true. Basically, A* is A given that w happened (this is how we simulate a "false belief" by only allowing the AI to consider worlds in which w = true). Finally, let C be a constant.

The proposal is that we create a variant of C3PO, C3PO* that has the utility function:

U*(A) = P(!w) * C + P(w) * U(A*)

If the AI is boxed such that it cannot affect the probability of w occurring and it cannot modify its own utility function, then maximizing U* is exactly the same as maximizing U once event w has occurred (i.e. with false belief w). In this way, we are able to perfectly simulate C3PO to find out what it would do if w were true, but we don't actually have to convince it that w is true.
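A toy sketch of why maximizing U* reproduces C3PO's behavior under the false belief. All names here (U, C, p_w, the candidate world-states and their utilities) are made up for illustration; this is not code from any real agent framework. The key point is that the P(!w) * C term is the same for every action, so it never changes which action wins.

```python
C = 0.0        # constant utility assigned to every world where w is false
p_w = 1e-9     # tiny probability that the unlikely event w occurs

def U(world):
    """Stand-in for C3PO's original utility over world-states."""
    return world["utility"]

def U_star(world):
    """The modified utility C3PO* maximizes:
    U*(A) = P(!w) * C + P(w) * U(A*)."""
    return (1 - p_w) * C + p_w * U(world)

# Candidate world-states reachable by different actions, conditioned on w = true:
candidates = [
    {"name": "a", "utility": 3.0},
    {"name": "b", "utility": 7.0},
    {"name": "c", "utility": 5.0},
]

# Since (1 - p_w) * C is identical across actions, the argmax under U* is
# the argmax under U: C3PO* chooses exactly what C3PO would if w were true.
best_by_U_star = max(candidates, key=U_star)
best_by_U = max(candidates, key=U)
```

However small p_w is, the ordering of actions under U* matches the ordering under U, which is the "maximizing U* is exactly the same as maximizing U given w" claim above.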

Comment author: Stuart_Armstrong 06 March 2015 12:21:49PM 3 points [-]

C need not be a low constant, btw. The only requirement is that u(false, action a, A) = u(false, action b, A) for all actions a and b and all A, i.e. nothing the AI does affects the utility of worlds where w is false, so this does not constrain its actions.

Basically the AI observes the ON signal going through, and knows that either a) the signal went through normally, or b) the signal was overwritten by coincidence by exactly the same signal. Its actions have no consequences in the first case, so it ignores it, and acts "as if" it were certain there had been a thermodynamic miracle.

Comment author: TylerJay 07 March 2015 03:38:31AM *  3 points [-]

Thanks. I understand now. Just needed to sleep on it, and today, your explanation makes sense.

Basically, the AI's actions don't matter if the unlikely event doesn't happen, so it will take whatever actions would maximize its utility if the event did happen. This maximizes expected utility.

Maximizing [P(no TM) * C + P(TM) * u(TM, A)] is the same as maximizing u(A) under assumption TM.

Comment author: TobyBartels 06 March 2015 06:23:20AM 1 point [-]

Thanks, but that's not actually definitive. Somewhere on Reddit is a list of information about time turners, and while this was quoted there, nothing else suggested that you couldn't go back 6 hours, wait 12 hours, then go back 6 hours with a new time turner.

But there was a clear statement by Eliezer that you couldn't go back 6 hours, wait only 6 hours, and then go back again. This imposes a hard limit of 48 hours per day.

Comment author: TylerJay 07 March 2015 03:21:26AM 0 points [-]

Nope, you're right. It's not definitive. In my original comment, I just said I thought I remembered reading somewhere that you couldn't fit >30 hrs into a day, and the passage I quoted is where I got that impression. If /r/hpmor thinks it's possible that TTs let you fit up to 48 hrs in a day, then I have high confidence there wasn't anything explicitly forbidding it in the story.

View more: Prev | Next