cousin_it comments on Open Thread, April 2011 - Less Wrong

5 Post author: ata 02 April 2011 06:43PM




Comment author: TheCosmist 02 April 2011 08:47:08PM *  3 points [-]

(I'm new here and don't have enough karma to create a thread, so I am posting this question here. Apologies in advance if this is inappropriate.)

Here is a topic I haven’t seen discussed on this forum: the philosophy of “Cosmicism”. If you’re not familiar with it, check Wikipedia, but the quick summary is that it’s the philosophy invented by H. P. Lovecraft which posits that humanity’s values have no cosmic significance or absolute validity in our vast cosmos; to some alien species we might encounter or AI we might build, our values would be as meaningless as the values of insects are to us. Furthermore, all our creations and efforts are ultimately futile in a universe of increasing entropy and astrophysical annihilation. Lovecraft’s conclusion is: “good, evil, morality, feelings? Pure ‘Victorian fictions’. Only egotism exists.”

Personally I find this point of view difficult to refute – it seems as close to the truth about “life, the universe and everything” as one can get while remaining consistent with our current understanding of the universe. At the same time, such a philosophy is rather frightening in that a world of egomaniacal cosmicists who consider human values to be meaningless would seem to be highly unstable and insane.

I don’t claim to be an exceptionally rational person, so I’m asking the rationalists of this forum: what is your response to Cosmicism?

Comment author: cousin_it 02 April 2011 08:55:01PM *  16 points [-]

The standard reply here is that duh, values are a property of agents. I'm allowed to have values of my own and strive for things, even if the huge burning blobs of hydrogen in the sky don't share the same goals as me. The prospect of increasing entropy and astrophysical annihilation isn't enough to make me melt and die right now. Obligatory quote from HP:MOR:

"There is no justice in the laws of Nature, Headmaster, no term for fairness in the equations of motion. The universe is neither evil, nor good, it simply does not care. The stars don't care, or the Sun, or the sky. But they don't have to! We care! There is light in the world, and it is us!"

Comment author: TheCosmist 02 April 2011 09:15:55PM 0 points [-]

So in other words you agree with Lovecraft that only egotism exists?

Comment author: cousin_it 02 April 2011 09:17:18PM *  16 points [-]

Wha? There's no law of nature forcing all my goals to be egotistical. If I saw a kitten about to get run over by a train, I'd try to save it. The fact that insectoid aliens may not adore kittens doesn't change my values one bit.

Comment author: Vladimir_M 02 April 2011 10:49:35PM *  7 points [-]

That's certainly true, but from the regular human perspective, the real trouble is that in case of a conflict of values and interests, there is no "right," only naked power. (Which, of course, depending on the game-theoretic aspects of the concrete situation, may or may not escalate into warfare.) This has some unpleasant implications not just when it comes to insectoid aliens, but also in regular human conflicts.

In fact, I think there is a persistent thread of biased thinking on LW in this regard. People here often write as if sufficiently rational individuals would surely be able to achieve harmony among themselves (this often cited post, for example, seems to take this for granted). Whereas in reality, even if they are so rational as to leave no possibility of factual disagreement, if their values and interests differ -- and they often will -- it must be either "good fences make good neighbors" or "who-whom." In fact, I find it quite plausible that a no-holds-barred dissolving of socially important beliefs and concepts would exacerbate conflict, since this fact would only become more obvious.

Comment author: cousin_it 02 April 2011 11:08:17PM *  10 points [-]

Negative-sum conflicts happen due to factual disagreements (mostly inaccurate assessments of relative power), not value disagreements. If two parties have accurate beliefs but different values, bargaining will be more beneficial to both than making war, because bargaining can avoid destroying wealth but still take into account the "correct" counterfactual outcome of war.

Though bargaining may still look like "who whom" if one party is much more powerful than the other.
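A toy numeric sketch of the argument above (the numbers are made up for illustration): war burns real resources, so a bargain that splits the stakes at the "correct" counterfactual war outcome leaves both sides strictly better off, even though the split itself can be very lopsided when power is lopsided.

```python
# Hypothetical numbers: two parties contest wealth of 100; A would win
# a war with probability 0.75; fighting costs A 20 and B 10.

def war_payoffs(wealth, p_a_wins, cost_a, cost_b):
    """Expected payoffs if A and B fight over `wealth`."""
    return (p_a_wins * wealth - cost_a,
            (1 - p_a_wins) * wealth - cost_b)

def bargain_payoffs(wealth, p_a_wins):
    """Split the undestroyed wealth at the counterfactual war outcome."""
    return (p_a_wins * wealth, (1 - p_a_wins) * wealth)

w_a, w_b = war_payoffs(wealth=100, p_a_wins=0.75, cost_a=20, cost_b=10)
b_a, b_b = bargain_payoffs(wealth=100, p_a_wins=0.75)

# Both sides prefer the bargain (75, 25) to the war lottery (55, 15),
# yet the bargain still hands A three quarters of everything: it can
# look like "who whom" while remaining a Pareto improvement over war.
assert b_a > w_a and b_b > w_b
```

The saved war costs (20 + 10 here) are the surplus that bargaining preserves; how that surplus is divided between the parties is a separate question from whether fighting is avoided.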

Comment author: Vladimir_M 03 April 2011 12:06:28AM 9 points [-]

How strong do the perfect-information assumptions need to be to guarantee that rational decision-making can never lead both sides in a conflict to precommit to escalation, even in a situation where their behavior has signaling implications for other conflicts in the future? (I don't know the answer to this question, but my hunch is that even if this is possible, the assumptions would have to be unrealistic for anything conceivable in reality.)

And of course, as you note, even if every conflict is resolved by perfect Coasian bargaining, if there is a significant asymmetry of power, the practical outcome can still be little different from defeat and subjugation (or even obliteration) in a war for the weaker side.

Comment author: AlephNeil 03 April 2011 05:36:09PM 0 points [-]

Negative-sum conflicts happen due to factual disagreements (mostly inaccurate assessments of relative power), not value disagreements.

By 'negative-sum' do you really mean 'negative for all parties'? Because, taking 'negative-sum' literally, we can imagine a variant of the Prisoner's Dilemma where A defecting gains 1 and costs B 2, and where B defecting gains 3 and costs A 10.
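The distinction can be checked directly with the hypothetical payoff numbers above: in this variant, mutual defection is negative-sum in total, yet B still comes out ahead.

```python
# The Prisoner's Dilemma variant described above: A's defection gains
# A 1 and costs B 2; B's defection gains B 3 and costs A 10.

def payoffs(a_defects, b_defects):
    a = b = 0
    if a_defects:
        a += 1   # A's gain from defecting
        b -= 2   # ...at B's expense
    if b_defects:
        b += 3   # B's gain from defecting
        a -= 10  # ...at A's expense
    return a, b

a, b = payoffs(True, True)   # mutual defection: (-9, 1)
assert a + b < 0             # literally negative-sum...
assert b > 0                 # ...but not negative for B
```

So "negative-sum" (total payoff below the cooperative baseline) and "negative for all parties" really do come apart, which is the point of the question.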

Comment author: cousin_it 03 April 2011 06:55:41PM 0 points [-]

I suppose I meant "Pareto-suboptimal". Sorry.

Comment author: Vladimir_M 03 April 2011 09:33:28PM *  1 point [-]

I suppose I meant "Pareto-suboptimal".

How does that make sense? You are correct that under sufficiently generous Coasian assumptions, any attempt at predation will be negotiated into a zero-sum transfer, thus avoiding a negative-sum conflict. But that is still a violation of Pareto optimality, which requires that nobody ends up worse off.

Comment author: David_Gerard 03 April 2011 08:58:38AM *  3 points [-]

As I commented on What Would You Do Without Morality?:

I expect I'll keep on doing what I'm doing, which is trying to work out what I actually want. [...] So far I haven't lapsed into nihilist catatonia or killed everyone or destroyed the economy. This suggests that assuming a morality is not a requirement for not behaving like a sociopath. I have friends and it pleases me to be nice to them, and I have a lovely girlfriend and a lovely three-year-old daughter on whom I spend most of my life's efforts, trying to bring her up and to meet the prerequisites of that.

Without an intrinsic point to the universe, it seems likely to me that people would go on behaving with the same sort of observable morality they had before. I consider this supported by the observed phenomenon that Christians who turn atheist seem to still behave as ethically as they did before, without a perception of God to direct them.

This may or may not directly answer your question of what the correct moral engine to have in one's mind is (assuming there is a single correct one, and assuming that what's in one's mind has a tremendous effect on one's observed ethical behaviour, rather than that behaviour largely being evolved behaviour going back millions of years before the mind), but I don't actually care about that except insofar as it affects the observed behaviour.