Phil_Goetz comments on Ends Don't Justify Means (Among Humans) - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Um, conditional independence, that is.
I want to know whether my being killed by Eliezer's AI hinges on how often observables of interest tend to be conditionally dependent.
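For readers unfamiliar with the distinction the comment is invoking, here is a minimal sketch: two observables can be dependent overall yet independent once a common cause is conditioned on. The joint distribution below is invented purely for illustration and does not come from the post or the comment thread.

```python
# Illustrative only: A and B are dependent marginally but conditionally
# independent given C. The distribution is hand-picked for this example.
import itertools

def p_abc(a, b, c):
    """Joint P(A=a, B=b, C=c): C is a fair coin; given C, A and B are
    independent flips whose bias depends on C."""
    p_c = 0.5
    bias = 0.9 if c == 1 else 0.1          # P(A=1 | C=c) = P(B=1 | C=c)
    p_a = bias if a == 1 else 1 - bias
    p_b = bias if b == 1 else 1 - bias
    return p_c * p_a * p_b

def p(event):
    """Probability of `event`, summing the joint over all assignments."""
    return sum(p_abc(a, b, c)
               for a, b, c in itertools.product([0, 1], repeat=3)
               if event(a, b, c))

# Marginal dependence: P(A=1, B=1) != P(A=1) * P(B=1)
print(p(lambda a, b, c: a == 1 and b == 1))                    # 0.41
print(p(lambda a, b, c: a == 1) * p(lambda a, b, c: b == 1))   # 0.25

# Conditional independence: P(A=1, B=1 | C=1) == P(A=1|C=1) * P(B=1|C=1)
pc1 = p(lambda a, b, c: c == 1)
print(p(lambda a, b, c: a == 1 and b == 1 and c == 1) / pc1)   # 0.81
print((p(lambda a, b, c: a == 1 and c == 1) / pc1) *
      (p(lambda a, b, c: b == 1 and c == 1) / pc1))            # 0.81
```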