AmagicalFishy comments on On Caring - Less Wrong

99 points. Post author: So8res, 15 October 2014 01:59AM


Comment author: Decius 15 October 2014 07:27:02AM 1 point

If you don't feel like you care about billions of people, and you recognize that the part of your brain that cares about small numbers of people has scope sensitivity, what observation causes you to believe that you do care about everyone equally?

Serious question; I traverse the reasoning the other way, and since I don't care much about the aggregate six billion people I don't know, I divide and say that I don't care more than one six-billionth as much about the typical person that I don't know.

People that I do know, I do care about; but I don't have to multiply to figure my total caring, I have to add.

Comment author: AmagicalFishy 23 November 2014 10:39:05PM 1 point

I second this question: Maybe I'm misunderstanding something, but part of me craves a set of axioms to justify the initial assumptions. That is: Person A cares about a small number of people who are close to them. Why does this equate to Person A having to care about everyone who isn't close to them?

Comment author: lalaithion 23 November 2014 11:32:52PM 1 point

For me, personally, I know that you could choose a person at random in the world, write a paragraph about them, and give it to me, and by reading it I would care about them a lot more than before, even though reading that paper hadn't changed anything about them. Similarly, becoming friends with someone doesn't usually change the person that much, but it increases how much I care about them an awful lot.

Therefore, I look at all 7 billion people in the world, and even though I barely care about them, I know that it would be trivial for me to increase how much I care about any one of them, and therefore I should care about them as if I had already completed that process, even if I haven't.

Maybe a better way of putting this is that I know that all of the people in the world are potential carees of mine, so I should act as though I already care about these people, in deference to possible future-me.

Comment author: AmagicalFishy 24 November 2014 05:30:36AM 2 points

For the most part, I follow—but there's something I'm missing. I think it lies somewhere in: "It would be trivial for me to increase how much I care about one of them, and therefore I should care about them as if I had already completed that process, even if I hadn't."

Is the underlying "axiom" here that you wish to maximize the number of effects that come from the caring you give to people, because that's what an altruist does? Or that you wish to maximize your caring for people?

To contextualize the above question, here's a (nonsensical, but illustrative) parallel: I get cuts and scrapes when running through the woods. They make me feel alive; I like this momentary pain stimulus. It would be trivial for me to woods-run more and get more cuts and scrapes. Therefore I should just get cuts and scrapes.

I know it's silly, but let me explain: A person usually doesn't want to maximize their cuts and scrapes, even though cuts and scrapes might be appreciated at some point. Thus, the above scenario's conclusion seems silly. Similarly, I don't feel a necessity to maximize my caring—even though caring might be nice at some point. Caring about someone is a product of my knowing them, and I care about a person because I know them in a particular way (if I knew a person and thought they were scum, I would not care about them). The fact that I could know someone else, and thus hypothetically care about them, doesn't make me feel as if I should.

If, on the other hand, the axiom is true—then why bother considering your intuitive "care-o-meter" in the first place?

I think there's something fundamental I'm missing.

(Upon further thought, is there an agreed-upon intrinsic value to caring that my ignorance of some LW culture has led me to miss? This would also explain wanting to maximize caring.)

(Upon further-further thought, is it something like the following internal dialogue? "I care about people close to me. I also care about the fate of mankind. I know that the fate of mankind as a whole is far more important than the fate of the people close to me. Since I value internal consistency, in order for my caring-mechanism to be consistent, my care for the fate of mankind must be proportional to my care for the people close to me. Since my caring mechanism is incapable of actually computing such a proportionality, the next best thing is to be consciously aware of how much it should care if it were able, and act accordingly.")

Comment author: Decius 24 November 2014 11:59:59PM 0 points

(Upon further-further thought, is it something like the following internal dialogue? "I care about people close to me. I also care about the fate of mankind. I know that the fate of mankind as a whole is far more important than the fate of the people close to me. Since I value internal consistency, in order for my caring-mechanism to be consistent, my care for the fate of mankind must be proportional to my care for the people close to me. Since my caring mechanism is incapable of actually computing such a proportionality, the next best thing is to be consciously aware of how much it should care if it were able, and act accordingly.")

I care about self-consistency, but being self-consistent is something that must happen naturally; I can't self-consistently say, "This feeling is self-inconsistent, therefore I will change this feeling to be self-consistent."

Comment author: AmagicalFishy 25 November 2014 01:12:50AM 0 points

... Oh.

Hm. In that case, I think I'm still missing something fundamental.

Comment author: Decius 28 November 2014 06:11:40AM 0 points

I care about self-consistency because an inconsistent self is very strong evidence that I'm doing something wrong.

It's not very likely that if I take the minimum steps to make the evidence of the error go away, I will make the error go away.

The general case of "find a self-inconsistency, make the minimum change to remove it" is not error-correcting.

Comment author: lalaithion 25 November 2014 05:11:07PM 0 points

I actually think that your internal dialogue was a pretty accurate representation of what I was failing to say. And as for self-consistency having to be natural, I agree; but if you're aware that you're being inconsistent, you can still alter your actions to try to correct for that fact.

Comment author: Decius 24 November 2014 11:58:13PM 0 points

I look at a box of 100 bullets, and I know that it would be trivial for me to be in mortal danger from any one of them, but the box is perfectly safe.

It is trivial-ish for me to meet a trivial number of people and start to care about them, but it is certainly nontrivial to encounter a nontrivial number of people.