MatthewBaker comments on Reply to Holden on The Singularity Institute - Less Wrong

Post author: lukeprog 10 July 2012 11:20PM


Comment author: MatthewBaker 11 July 2012 12:14:33AM 5 points

If I earmark my donations for "HPMOR Finale or CPA Audit whichever comes first" would that act as positive or negative pressure towards Eliezer's fiction creation complex? (I only ask because bugging him for an update has been previously suggested to reduce update speed)

Furthermore, Oracle AI and Nanny AI both seem to fail the heuristic of "the other country is about to beat us in a war — should we remove the safety programming?" that I use quite often when debating AI with nearly everyone outside the LW community. Thank you both for writing such concise yet detailed responses; they helped me understand the problem areas of Tool AI better.

Comment author: lukeprog 11 July 2012 12:36:49AM 9 points

If I earmark my donations for "HPMOR Finale or CPA Audit whichever comes first" would that act as positive or negative pressure towards Eliezer's fiction creation complex?

I think the issue is that we need a successful SPARC and an "Open Problems in Friendly AI" sequence more urgently than we need an HPMOR finale.

Comment author: shokwave 11 July 2012 01:07:13AM 9 points

"Open Problems in Friendly AI" sequence

an HPMOR finale

A sudden, confusing vision just occurred, of the two being somehow combined. Aaagh.

Comment author: shminux 11 July 2012 04:59:21AM 3 points

Spoiler: Voldemort is a uFAI.

Comment author: arundelo 11 July 2012 05:41:43AM 6 points

For the record:

Nothing in this story so far represents either FAI or UFAI. Consider it Word of God.

(And later in the thread, when asked about "so far": "And I have no intention at this time to do it later, but don't want to make it a blanket prohibition.")

Comment author: NancyLebovitz 15 July 2012 02:03:33AM 2 points

In the earlier chapters, it seemed to me that the Hogwarts faculty dealing with Harry was something like being faced with an AI of uncertain Friendliness.

Correction: It was more like the faculty dealing with an AI that's trying to get itself out of its box.

Comment author: MatthewBaker 11 July 2012 10:21:43PM 0 points

I think our values are positively maximized by delaying the HPMOR finale as long as possible. My post was more out of curiosity, to see what would be most helpful to Eliezer.

Comment author: David_Gerard 13 July 2012 08:44:06AM 7 points

In general: never earmark donations. It's a stupendous pain in the arse to deal with. If you trust an organisation enough to donate to them, trust them enough to use the money for whatever they see a need for. Contrapositive: if you don't trust them enough to use the money for whatever they see a need for, don't donate to them.

Comment author: MatthewBaker 13 July 2012 06:01:02PM 2 points

I never have before, but this CPA Audit seemed like a logical thing that would encourage my wealthy parents to donate :)