I took the Dark factor test and got a very low score, but I kept second-guessing myself on the answers because I wasn't sure what my actions would be in a real-life scenario. Even though I had good intentions and I believe that other people's well-being has inherent value, I would put a high probability on getting at least a slightly higher score if this were a real-world test that I didn't know I was taking. That makes me pessimistic about the data the authors cite in this article. If (for example) "over 16% of people agree or strongly agree that they 'would like to make some people suffer even if it meant that I would go to hell with them'" when they know they are being tested for malevolent traits, how many people actually would do that given the choice? Also, for people who believe in hell, I hope agreement with that question is a scale-insensitivity problem, since an infinite time being tortured seems to me to have infinite negative utility: you would need to value harming others more than helping yourself to agree with that statement.
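To spell out the utility comparison I have in mind (a toy sketch; the symbols and utility assignments here are my own illustrative assumptions, not anything from the test): let $s$ be the finite value you place on making those people suffer, and treat eternal hell as utility $-\infty$. Then

$$U(\text{agree}) = s + (-\infty) = -\infty \qquad \text{vs.} \qquad U(\text{decline}) = 0,$$

so declining dominates for any finite $s$, and agreeing only makes sense if the value you place on the harm itself is unbounded, i.e. it outweighs any amount of your own well-being.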
I agree, but so many other things are different in this fan-fic, and Eliezer is smart enough that I wouldn't be surprised if it turns out to be like that for a reason.
I think this comment is either incredibly stupid or extremely insightful and borderline genius. I honestly can't tell (although I'm leaning towards the former, and the karma seems to agree with me), and that scares me just a little.
I voted up every other comment starting from the top, then voted down every third one. I may or may not have continued. I have no idea if anyone will ever see this, but if they do: I have been reading LessWrong for a long time and only now created an account to continue this chain.