wadavis comments on Exams and Overfitting - Less Wrong Discussion

Post author: robot-dreams, 06 January 2015 07:35PM

Comment author: wadavis 09 January 2015 06:16:03PM, 0 points

Are we comfortable saying that this is a conflict between ethical altruism and ethical egoism?

I acknowledge that the arguments are sound from the altruist perspective. If I argue against them, my arguments will not be altruistic. Let's table this discussion for elsewhere, as a "convince me altruism is better" discussion, without limiting it to post-secondary testing. There is a popular perspective that if you are rational, you will agree that altruism is the answer. I'm not convinced of that yet.

If altruism/egoism is too narrow, we can use wants-to-kill-Moloch versus Moloch-can't-be-killed-so-make-your-sacrifice.

Comment author: someonewrongonthenet 11 January 2015 08:14:21PM (edited), 0 points

I'm comfortable with that. I don't think rationality == altruism, but I do think that if altruism is your preference, then it's irrational not to be altruistic, and I further think the typical human prefers to be altruistic even if they don't realize it yet. I think altruistic humans are happier than non-altruistic ones, and the "warm fuzzy" variants of altruism cause happiness. (Cheating is like the anti-warm-fuzzy. It is a cold slimy.)

Like I said:

Rationality is winning, but winning is having the world arranged according to your preferences, and most people's preferences include moral preferences.

Absent that last clause, you can get into a debate about when altruism is and is not rational (and at that point we're no longer talking about morality but about game theory, so we should stop using the word "altruism" and instead use "cooperation"). But since we're all human beings here, I implicitly took altruism as a terminal value. I agree that there can be rational minds that do not work that way.