Manfred comments on A cynical explanation for why rationalists worry about FAI - Less Wrong
You're right that you can interpret FAI as motivated reasoning. I guess I should have considered alternate interpretations more.
Well, kinda. Eliezer concluded the singularity was the most important thing to work on and then decided the best way to work on it was to code an AI as fast as possible, with no particular regard for safety.
"[...] arguing about ideas on the internet" is what I was thinking of. It's an LW-describing sentence in a non-LW-related area. Oh, and "Why rationalists worry about FAI" rather than "Why SI worries about FAI."
Two people have been confused by the "arguing about ideas" phrase, so I changed it to "thinking about ideas".
It's more polite, and usually more accurate, to say "I sent a message I didn't want to, so I changed X to Y."
Most accurate would be "feedback indicates that a message was received that I didn't intend to send, so..."