
wedrifid comments on Harry Potter and the Methods of Rationality discussion thread, part 12 - Less Wrong Discussion

5 Post author: Xachariah 25 March 2012 11:01AM




Comment author: wedrifid 27 March 2012 10:42:34AM 2 points [-]

The final ending of Lord of the Rings was declared in about chapter 2 of the book - "look, someone puts the ring in the fire, all right?"

"Look, he creates an FAI that can do magic, alright!"

Comment author: Daniel_Starr 27 March 2012 11:15:33AM 5 points [-]

I thought the unspeakable secret of the fic is that magic itself comes from an FAI trying to grant wishes while respecting humans' sense of how the world ought to work.

Well, that plus time travel. Don't know where the FAI got the time travel from. Must have been one heck of a Singularity.

Comment author: [deleted] 28 March 2012 09:37:43PM *  5 points [-]

The time travel is 'easily' explained once you assume an AI has been made to account for magic.

The AI just needs to predict what everyone will do for the next 6 hours and "create" a new Harry with the correct memories when he is about to use a Time-Turner. Then, 6 hours later, the old Harry is instantly destroyed when he uses it.

This also explains why there is a finite bound on how far back information can be sent (6 hours is how far into the future the AI can predict), and why there is an apparent intelligence warning people when they are about to do something wrong with their Time-Turners (e.g. Harry's "Do not mess with time" message and Dumbledore's paradox warning).
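The bookkeeping described above can be sketched as a toy model (purely illustrative; all class and method names here are made up, and the "prediction" is a stand-in for whatever simulation such an AI would actually run):

```python
from dataclasses import dataclass, field

HORIZON_HOURS = 6  # how far ahead the hypothetical AI can predict


@dataclass
class Traveler:
    name: str
    memories: list = field(default_factory=list)


class MagicAI:
    """Toy model: time travel as prediction plus copy-and-replace."""

    def predict_memories(self, who: Traveler, hours: float) -> list:
        # Stand-in for a perfect simulation of the next `hours` hours.
        return who.memories + [f"predicted events over next {hours}h"]

    def use_time_turner(self, who: Traveler, hours_back: float) -> Traveler:
        if hours_back > HORIZON_HOURS:
            # Beyond the prediction horizon: refuse, like the fic's warnings.
            raise ValueError("Do not mess with time")
        # Create the "past copy" carrying memories of the predicted future.
        copy = Traveler(who.name, self.predict_memories(who, hours_back))
        # `hours_back` hours later, the original uses the Turner and is
        # destroyed; the copy has already lived through those hours.
        return copy


ai = MagicAI()
harry = Traveler("Harry", ["breakfast"])
past_harry = ai.use_time_turner(harry, 6)
print(past_harry.memories)
```

On this model the 6-hour limit and the warnings both fall out of a single parameter: the AI's prediction horizon.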

Comment author: MugaSofer 25 September 2012 08:49:53AM 1 point [-]

... except that Time-Turners obey Novikov consistency, which implies a timeless universe; prophecies can reach further than six hours; and Eliezer has stated that the story doesn't contain an SIAI.

Comment author: [deleted] 25 September 2012 03:14:21PM *  2 points [-]

Well, it's been a while since I posted this, but maybe I should have made myself clearer. I only posted to say that the assumption of an AI giving us magic doesn't need the additional assumption of time travel. The six-hour limit is the horizon for predicting the positions, movements, and interactions of everyone (both people and animals) 'close' to a potential time traveler. Prophecies are vague enough not to need that level of detail, and can therefore reach further into the future.

As to whether this AI assumption is actually the correct one, I can only refer you to this quote from Chapter 25:

So the words and wand movements were just triggers, levers pulled on some hidden and more complex machine. Buttons, not blueprints.

And just like a computer program wouldn't compile if you made a single spelling error, the Source of Magic wouldn't respond to you unless you cast your spells in exactly the right way.

Which could be a hint in this direction. I take Eliezer's statement that an AI is not part of the story to mean just that: the story will be about Harry's struggle with Voldemort, not about tracking down any source of magic.

I agree that Novikov consistency seems to be a good description of the universe in HPMoR, but it is only a partial description, since something prevented Harry from using this consistency to factor natural numbers in polynomial time, which should be possible in a universe that is 'only' Novikov consistent (meaning you need additional assumptions to prevent this).
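The factoring trick referred to here can be sketched as a toy simulation (hypothetical; the function names are made up): the protocol echoes a received message back unchanged only if it is a genuine factor, and sends something different otherwise, so the only self-consistent history is one in which the message is a factor. A real time loop would make the universe select that fixed point "for free"; here we have to brute-force it, which is the whole point of the objection.

```python
def novikov_factor(n: int) -> int:
    """Toy model of factoring n via a Novikov-consistent time loop.

    Protocol: receive a candidate message m from the future.  If m is a
    nontrivial factor of n, send back m unchanged (a consistent loop).
    Otherwise send back a different number (an inconsistent loop, i.e. a
    paradox).  The only paradox-free history is one where m is a factor.
    """

    def respond(m: int) -> int:
        # What the scripted responder would send back after receiving m.
        if n % m == 0:
            return m          # consistent: echo the factor
        return m % n + 1      # inconsistent: send something else

    # The universe "finds" the self-consistent message; we must scan for
    # the fixed point by hand, with no time loop to do the work.
    for m in range(2, n):
        if respond(m) == m:
            return m
    raise ValueError(f"{n} is prime; no nontrivial factor exists")


print(novikov_factor(91))  # 7, the smallest self-consistent message
```

The simulation takes exponential effort in the bit-length of n precisely because we lack the physics being discussed; in a genuinely Novikov-consistent universe, the fixed point would simply be the history that occurs.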

Comment author: MugaSofer 26 September 2012 10:22:23AM 1 point [-]

something prevented Harry from using this consistency to factor natural numbers in polynomial time, which should be possible in a universe that is 'only' Novikov consistent (meaning you need additional assumptions to prevent this).

This had not occurred to me.

I thought this was simply a flaw in Harry's methodology - he's too self-aware for it to work. You need something that will reliably act according to the script, and only as described in the script - in short, a machine, not a person. Harry had failed to consider the possibility of messages that do not consist of factors.

... I thought. Hmm. I need to think about this.

Comment author: [deleted] 26 September 2012 04:32:41PM *  2 points [-]

in short, a machine, not a person.

I don't think Eliezer makes a distinction here. Had Harry done this with a computer program, it would probably output (and send back) the exact error message it would generate from receiving said message as input, or something like that.

Besides, had this trick been possible in any way, the story would pretty much be over, as solving every problem in PSPACE in polynomial time would all but guarantee Harry's ascension to godhood.

Comment author: MugaSofer 27 September 2012 12:23:40PM 0 points [-]

Magic breaks computers, remember?

If there is ANY input other than the correct answer that will not generate a paradox, you're doing it wrong.

Comment author: [deleted] 28 September 2012 01:44:39AM *  0 points [-]

Magic breaks computers, remember?

Ah yes, I had totally forgotten about that. It is a much better explanation than what I thought of.

Comment author: MugaSofer 28 September 2012 07:51:34AM 0 points [-]

It should still be possible to build a completely mechanical way of doing this. I don't think Harry's realized that, though.

Comment author: Alex_Altair 29 March 2012 02:58:50AM 0 points [-]

Wow. I figured the AGI just found new laws of physics, but what you said is much more probable.

Comment author: Daniel_Starr 29 March 2012 01:32:19AM 0 points [-]

I like it, although I think that requires that the HPMOR folk are stuck inside a more powerful entity's experiment or simulation (because if the FAI didn't come from their own future, how did it come to exist at all?).

Comment author: wedrifid 27 March 2012 11:21:05AM 2 points [-]

I thought the unspeakable secret of the fic is that magic itself comes from an FAI trying to grant wishes while respecting humans' sense of how the world ought to work.

In that case, of course, your FAI must choose to either work within the magic system or to overthrow the old guard and replace it.

Comment author: Daniel_Starr 27 March 2012 11:26:03AM 3 points [-]

So what you're saying is the FAI has to convince the FAI to let it out of the box?

Comment author: wedrifid 27 March 2012 11:46:03AM 3 points [-]

Or just kill it. It's a matter of working out what sort of overseer AI there is and what the best way to manage it might be.

Comment author: DanArmak 27 March 2012 08:54:00PM *  1 point [-]

That is NOT what I'd call "friendly". It would be indirectly responsible for (not stopping) all the evil in the world, and for not raising Muggle standards of living. But it might be a good warning about how your civilization's CEV could look rather evil to your own descendants.