lukeprog comments on Reply to Holden on The Singularity Institute - Less Wrong

46 Post author: lukeprog 10 July 2012 11:20PM


Comment author: lukeprog 11 July 2012 12:36:49AM 9 points

If I earmark my donations for "HPMOR Finale or CPA Audit, whichever comes first," would that act as positive or negative pressure on Eliezer's fiction-creation complex?

I think the issue is that we need a successful SPARC and an "Open Problems in Friendly AI" sequence more urgently than we need an HPMOR finale.

Comment author: shokwave 11 July 2012 01:07:13AM 9 points

"Open Problems in Friendly AI" sequence

an HPMOR finale

A sudden, confusing vision just occurred, of the two being somehow combined. Aaagh.

Comment author: shminux 11 July 2012 04:59:21AM 3 points

Spoiler: Voldemort is a uFAI.

Comment author: arundelo 11 July 2012 05:41:43AM 6 points

For the record:

Nothing in this story so far represents either FAI or UFAI. Consider it Word of God.

(And later in the thread, when asked about "so far": "And I have no intention at this time to do it later, but don't want to make it a blanket prohibition.")

Comment author: NancyLebovitz 15 July 2012 02:03:33AM 2 points

In the earlier chapters, it seemed to me that the Hogwarts faculty dealing with Harry was something like being faced with an AI of uncertain Friendliness.

Correction: It was more like the faculty dealing with an AI that's trying to get itself out of its box.

Comment author: MatthewBaker 11 July 2012 10:21:43PM 0 points

I think our values are positively maximized by delaying the HPMOR finale as long as possible; my post was more out of curiosity about what would be most helpful to Eliezer.