wedrifid comments on Reply to Holden on The Singularity Institute - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
As an example data point, Eliezer's reply to Holden caused a net decrease (not necessarily an enormous one) in both my positive affect for the organisation and my abstract evaluation of its merit, based on one particularly bad argument that shocked me. It prompted some degree of updating (again, not necessarily a large degree) towards the possibility that SingInst could suffer the same kind of mind-killed thinking and behavior I expect from other organisations in the class of pet-cause idealistic charities. (And that matters more for FAI-oriented charities than for save-the-puppies charities, given the whole think-right-or-destroy-the-world thing.)
When allowing for the possibility that I am wrong and Eliezer is right, you have to expect most other supporters to be wrong a non-trivial proportion of the time too, so too much talking is going to have negative side effects.
Which issue are you talking about? Is there already a comments thread about it on Eliezer's post?
Found it. It was nested too deep in a comment tree.
The particular line was:
The position is something I think it is best I don't mention again until (unless) I get around to writing the post "Predicting Failure Without Details" to express the position clearly with references and what limits apply to that kind of reasoning.
Isn't it just straight-up outside view prediction?