Comment author: Manon_de_Gaillande 11 January 2009 12:52:16PM 3 points

I don't see how removing getting-used-to is close to removing boredom. IANA neurologist, but on a surface level, they do seem to work differently - boredom is reading the same book every day and getting tired of it; habituation is getting a new book every day and not thinking "Yay, new fun" anymore.

I'm reluctant to keep habituation because, at least in some cases, it is evil. When the emotion is appropriate to the event, it's wrong for it to diminish - you have a duty to rage against the dying of the light. (Of course we need it for survival; we can't be mourning all the time.) It also looks linked to status quo bias.

Maybe, like boredom, habituation is an incentive to make life better; but it's certainly not optimal.

In response to You Only Live Twice
Comment author: Manon_de_Gaillande 13 December 2008 03:16:00PM 10 points

I'm going to stick out my neck. Eliezer wants everyone to live. Most people don't.

People care about their and their loved ones' immediate survival. They discount heavily for long-term survival. And they don't give a flying fuck about the life of strangers. They say "Death is bad," but the social norm is not "Death is bad"; it's "Saying 'Death is bad' is good."

If this is not true, then I don't know how to explain why they dismiss cryonics out of hand with arguments about how death is not that bad that are clearly not their true rejection. The silliness heuristic explains believing it would fail, or that it's a scam - not rejecting the principle. Status quo and naturalistic bias explain part of the rejection, but surely not the whole thing.

And it would explain why I was bewildered, thinking "Why would you want a sucker like me to live?" even though I know Eliezer truly values life.

In response to Mundane Magic
Comment author: Manon_de_Gaillande 31 October 2008 06:37:40PM 2 points

Actually, the Mystic Eyes of Depth Perception are pretty underwhelming. You can tell how far away things are with one eye most of the time. The difference is big enough to give a significant advantage, but nothing near superpower level. My own depth perception is crap (better than one eye though), and I don't often bump into walls.

In response to Crisis of Faith
Comment author: Manon_de_Gaillande 11 October 2008 01:34:06PM 1 point

Nazir Ahmad Bhat, you are missing the point. It's not a question of identity, like which ice cream flavor you prefer. It's about truth. I do not believe there is a teapot orbiting around Jupiter, for the various reasons explained on this site (see _Absence of evidence is evidence of absence_ and the posts on Occam's Razor). You may call this a part of my identity. But I don't need people to believe in a teapot. Actually, I want everyone to know as much as possible. Promoting false beliefs is harming people, like slashing their tires. You don't believe in a flying teapot: do you need other people to?

Comment author: Manon_de_Gaillande 30 August 2008 06:19:54PM -2 points

Eliezer, sure, but that can't be the *whole* story. I don't care about some of the stuff most people care about. Other people whose utility functions differ in similar but different ways from the social norm are called "psychopaths", and most people think they should either adopt their morals or be removed from society. I agree with this.

So why should I make a special exception for myself, just because that's who I happen to be? I try to behave as if I shared common morals, but it's just a gross patch. It feels tacked on, and it is.

I expected (though I had no idea how) you'd come up with an argument that would convince me to fully adopt such morals. But what you said would apply to *any* utility function. If a paperclip maximizer wondered about morality, you could tell it: "'Good' means 'maximizes paperclips'. You can think about it all day long, but you'd just end up making a mistake. Is that worth forsaking the beauty of tiling the universe with paperclips? What do you care that there exist, somewhere in mindspace, minds that drag children off train tracks?" and it would work just as well. Yet if you could, I bet you'd choose to make the paperclip maximizer adopt your morals.

In response to The Meaning of Right
Comment author: Manon_de_Gaillande 30 July 2008 03:34:00PM 1 point

Constant: "Give a person power, and he no longer needs to compromise with others, and so for him the raison d'etre of morality vanishes and he acts as he pleases."

If you could do so easily and with complete impunity, would you organize fights to the death for your pleasure? Would you even want to? Moreover, humans are often tempted to do things they know they shouldn't, because they also have selfish desires. AIs don't, unless you build such desires into them. If they really do ultimately care about humanity's well-being, and take no pleasure in making people obey them, they will go on caring.

In response to The Meaning of Right
Comment author: Manon_de_Gaillande 30 July 2008 11:53:00AM 1 point

I'm confused. I'll try to rephrase what you said, so that you can tell me whether I understood.

"You can change your morality. In fact, you do it all the time, when you are persuaded by arguments that appeal to other parts of your morality. So you may try to find the morality you really should have. But - "should"? That's judged by your current morality, which you can't expect to improve by changing it (you expect a particular change would improve it, but you can't tell in what direction). Just like you can't expect to win more by changing your probability estimate to win the lottery.

Moreover, while there is such a fact as "the number on your ticket matches the winning number", there is no ultimate source of morality out there, no way to judge Morality_5542 without appealing to another morality. So not only can you not jump to another morality, you also have no reason to want to: you're not trying to guess some true morality.

Therefore, just keep whatever morality you happen to have, including your intuitions for changing it."

Did I get this straight? If I did, it sounds a lot like a relativistic "There is no truth, so don't try to convince me" - but there is indeed no truth, as in, no objective morality.

In response to The Meaning of Right
Comment author: Manon_de_Gaillande 29 July 2008 09:05:19AM 4 points

This argument sounds too good to be true - when you apply it to your own idea of "right". It also works for, say, a psychopath unable to feel empathy who gets a tremendous kick out of killing. How is there not a problem with that?

Comment author: Manon_de_Gaillande 27 July 2008 10:49:41PM 0 points

No! The problem is not reductionism, or that morality is or isn't about my brain! The problem is that what morality actually computes is "What should you feel-moral about in order to maximize your genetic fitness in the ancestral environment?". Unlike math, which is more like "What axioms should you use in order to develop a system that helps you in making a bridge?" or "What axioms should you use in order to get funny results?". I care about bridges and fun, not genetic fitness.

Actually, "Whatever turns y'all on" is a pretty damn good morality. Because it makes sense on an intuitive level (it looks like what selfishness would be if other people were you). Because it doesn't care too much where your mind comes from, as it maximizes *whatever* turns you on. Because it mostly adds up to normality. Possibly because it's what I used, so I'm biased. Though I don't think you quite get normality - killing is a minor offense here, because people don't get to experience it.

Comment author: Manon_de_Gaillande 18 July 2008 11:27:58PM 1 point

Folks, we covered that already! "You should open the door before you walk through it" means "Your utility function ranks 'open the door, then walk through it' above 'walk through the door without opening it'". *YOUR* utility function. "You should not murder" is *not* just reminding you of your own preferences. It's more like "(the 'morality' term of) my utility function ranks 'you murder' below 'you don't murder'", and most "sane" moralities tend to regard "this morality is universal" as a good thing.
