Manfred comments on A cynical explanation for why rationalists worry about FAI - Less Wrong

Post author: aaronsw 04 August 2012 12:27PM


Comment author: Manfred 04 August 2012 03:01:38PM 0 points

You're right that you can interpret FAI as motivated reasoning. I guess I should have considered alternate interpretations more.

"Eliezer concluded the singularity was the most important thing to work on and then decided the best way to get other people to work on it was to improve their general rationality."

Well, kinda. Eliezer concluded the singularity was the most important thing to work on, and then decided the best way to work on it was to code an AI as fast as possible, with no particular regard for safety.

"I also don't see how I conflated LW and SI"

"[...] arguing about ideas on the internet" is what I was thinking of: it's a sentence describing LW in a passage that otherwise isn't about LW. And likewise "Why rationalists worry about FAI" rather than "Why SI worries about FAI."

Comment author: aaronsw 04 August 2012 03:07:14PM 3 points

Two people have been confused by the "arguing about ideas" phrase, so I changed it to "thinking about ideas."

Comment author: Manfred 04 August 2012 05:57:00PM 2 points

It's more polite, and usually more accurate, to say "I sent a message I didn't want to, so I changed X to Y."

Comment author: Decius 05 August 2012 01:17:02AM 1 point

Most accurate would be "feedback indicates that a message was received that I didn't intend to send, so..."