Wei_Dai comments on Max Tegmark on our place in history: "We're Not Insignificant After All" - Less Wrong

18 [deleted] 04 January 2010 12:02AM

Comment author: byrnema 04 January 2010 05:11:38PM 4 points [-]

And I also puzzle over why I appear to be in such an atypical position.

And I was wondering why I was in such an atypical position of not caring.

You write of pushing the universe towards net improvements. By 'improvement', you mean relative to your particular or general human values. At a large and far scale, why should we have any loyalty to those values, especially if they are arbitrary (that is, not sufficiently general across mind space)? If the universe is meant to be dominated by the different values of other minds, why would I do anything but shrug my shoulders about that?

Comment author: Wei_Dai 04 January 2010 05:49:29PM 2 points [-]

I think just by asking the question "why should I care?", you probably already care more than most, who just go on doing what they always did without a second thought.

If I ask myself "why do I care?", the answer is that I don't seem to care much about the standard status symbols and consumer goods (bigger houses, faster cars, etc.), so what is left? Well for one thing, I care about knowledge, i.e., finding answers to questions that puzzle me, and I think I can do that much better in some futures than in others.

Comment author: AdeleneDawner 04 January 2010 06:48:35PM 1 point [-]

Er... if you answered why you care, I'm failing to find where you did so. Listing what you care about doesn't answer the question.

In the case of terminal values, I don't think it's controversial that 'why do you care about that' is either unanswerable, or answerable only in terms of something like evolution or neurochemistry.

Comment author: byrnema 04 January 2010 07:43:40PM *  0 points [-]

Listing what you care about doesn't answer the question.

There is a subtext to this question, which is that I believe we typically assume -- until it is demonstrated otherwise -- that our values are similar or overlap significantly, so it is natural to ask 'why do you value this' when maybe we really mean 'what terminal value do you think you're optimizing with this'? Disagreements about policy or 'what we should care about' are then often based more on different beliefs about what achieves what than on different values. It is true that if our difference in caring turns out to be based upon different values, or upon weighting values differently, then there's nothing much to discuss. Since I do value knowledge too, I wanted to further qualify how Wei Dai values knowledge, because I don't see how nudging the far future one way or another is going to increase Wei Dai's total knowledge.

Comment author: Wei_Dai 04 January 2010 07:30:40PM 0 points [-]

Byrnema had a specific objection to human values that are "arbitrary", and I think my response addressed that. To be more explicit, all values are vulnerable to the charge of being arbitrary, but seeking knowledge seems less vulnerable than others, and that seems to explain why I care more about the future than the average person. I was also trying to point out to Byrnema that perhaps she already cares more about the future than most, but didn't realize it.

Comment author: byrnema 04 January 2010 06:03:55PM *  0 points [-]

To what extent does your caring about the future depend upon you being there to experience it?

Then my next question would be, how important is your identity to this value? For example, do you have a strong preference whether it is "you" that gains more and more knowledge of the universe, or any other mind?

Comment author: Wei_Dai 04 January 2010 07:46:12PM *  3 points [-]

I might change my mind in the future, but right now my answers are "to a large extent" and "pretty important".

Why do you care what my values are, though, or why they are what they are? I find it fascinating that "value-seeking" is a common behavior among rationalist-wannabes (and I'm as guilty of it as anyone). It's almost as if the most precious resource in this universe isn't negentropy, but values.

ETA: I see you just answered this in your reply to Adelene Dawner:

I wanted to further qualify how Wei Dai values knowledge, because I don't see how nudging the far future one way or another is going to increase Wei Dai's total knowledge.

I expect that I can survive indefinitely in some futures. Does that answer your question?

Comment author: byrnema 04 January 2010 07:58:56PM *  2 points [-]

It's almost as if the most precious resource in this universe isn't negentropy, but values.

That's an amusing observation, with some amount of truth to it.

The reason I was asking is that I was seeking to understand why you care and I don't.

Given your reply, I think our difference in caring can be explained by the fact that when I imagine the far future, I don't imagine myself there. I'm also less attached to my identity; I wouldn't mind experiencing the optimization of the universe from the point of view of an alien mind, with different values. (This last bit is relevant if you want the future to be good just for the sake of it being good, even if you're not there.)