TheOtherDave comments on The $125,000 Summer Singularity Challenge - Less Wrong

Post author: Kaj_Sotala 29 July 2011 09:02PM


Comment author: TheOtherDave 25 April 2013 06:35:45PM 4 points [-]

Frankly, if I try to imagine living in a world in which I am as confident that that many people exist as I am that 7 billion people exist today, I'm not sure I wouldn't kill off billions for a cookie.

I mean, if I try to imagine living in a world where only 10,000 people exist, I conclude that I would be significantly more motivated to extend the lives of an arbitrary person (e.g., by preventing them from starving) than I am now. (Leaving aside any trauma related to the dieback itself.)

If a mere six orders of magnitude difference in population can reduce my motivation to extend an arbitrary life to that extent, it seems likely that another twenty or thirty orders of magnitude would reduce me to utter apathy when it comes to an arbitrary life. Add another ten orders of magnitude and utter apathy when it comes to a billion arbitrary lives seems plausible.

What if it's billions who they've never met, and are never going to meet?

I presumed this.
If it's billions of friends instead, I no longer have any confidence in any statement about my preferences, because any system capable of having billions of friends is sufficiently different from me that I can't meaningfully predict it.
If it's billions of people including a friend of mine, I suspect that my friend is worth about as much as they are in the 7billion-person world, + (billions-1) people who I'm apathetic about. I suspect I either get really confused at this point, or compartmentalize fiercely.

Comment author: CCC 25 April 2013 07:04:34PM *  2 points [-]

If it's billions of people including a friend of mine, I suspect that my friend is worth about as much as they are in the 7billion-person world, + (billions-1) people who I'm apathetic about. I suspect I either get really confused at this point, or compartmentalize fiercely.

Thinking about this has caused me to realise that I already compartmentalise pretty fiercely. Some of the lines along which I compartmentalise are a little surprising when I investigate them closely... friend/non-friend is not the sharpest line of the lot.

One pretty sharp line is probably-trying-to-manipulate-me/probably-not-trying-to-manipulate-me. But I wouldn't want to kill anyone on either side of that line (I wouldn't even want to be rude to them without reason (though 'he's a telemarketer' is reason for hanging up the phone on someone mid-sentence)). My brain seems to insist on lumping "have never met or interacted with, likely will never meet or interact with" in more-or-less the same category as "fictional".

Comment author: MugaSofer 26 April 2013 11:30:24AM *  -2 points [-]

My brain seems to insist on lumping "have never met or interacted with, likely will never meet or interact with" in more-or-less the same category as "fictional".

That sounds more like some sort of scope insensitivity than a revealed preference.

Comment author: CCC 26 April 2013 01:18:54PM *  1 point [-]

I don't think it's scope insensitivity in this particular case, because I'm considering one-on-one interactions in this compartmentalisation.

Of course, this particular case did come to my mind as a side-effect of a discussion on scope insensitivity.

Comment author: MugaSofer 26 April 2013 02:39:09PM -2 points [-]

Sorry, I was replying to the last bit. Edited.

Comment author: CCC 26 April 2013 03:31:05PM 0 points [-]

That edit does make your meaning clearer. It does so by highlighting that my phrasing was sloppy, so let me try to explain myself better.

Let us say that I hear of someone being mugged. My emotional reaction changes as a function of my relationship to the victim. If the victim is a friend, I am concerned and rush to check that he is OK. If the victim is an acquaintance, I am concerned and check that he is OK the next time I see him. If the victim is someone whom I have never met or interacted with, and am unlikely to meet or interact with, I am mildly perturbed. If the victim is a fictional character, I am also mildly perturbed.

When considering only one person, those last two categories blur together in my mind somewhat.

Comment author: [deleted] 26 April 2013 04:57:57PM *  0 points [-]

If the victim is someone whom I have never met or interacted with, and am unlikely to meet or interact with, I am mildly perturbed. If the victim is a fictional character, I am also mildly perturbed.

If the victim is someone whom I have never met or interacted with, and am unlikely to meet or interact with, I shrug and think ‘so what? so many people get mugged every day, why should I worry about this one in particular?’ If it's a fictional character, it depends on whether the author is good enough to switch me from far-mode to near-mode thinking.

Comment author: TheOtherDave 26 April 2013 05:29:51PM 2 points [-]

Well, but this elides differences in the object with differences in the framing. I certainly agree that an author can change how I feel about a fictional character, but an author can also change how I feel about a real person whom I have never met or interacted with, and am unlikely to meet or interact with.

Comment author: MugaSofer 29 April 2013 10:10:51AM *  -1 points [-]

If the victim is someone whom I have never met or interacted with, and am unlikely to meet or interact with, I shrug and think ‘so what? so many people get mugged every day, why should I worry about this one in particular?’

Am I the only person here who is in any way moved by accounts of specific victims? Nonfiction writers can switch you to near-mode too, or at least they can for me.

Comment author: CCC 30 April 2013 01:39:50PM 2 points [-]

Like army1987, I can be moved by accounts of specific victims, whether they are fictional or not. There is a bug here, and the bug is this: I am moved the same amount by an otherwise identical fictional or nonfictional account, where the nonfictional account contains no-one with whom I have ever interacted.

That is, simply knowing that an account is non-fictional doesn't affect my emotional reaction, one way or another. (This doesn't mean I am entirely without sympathy for people I have never met - it simply means that I have equivalent sympathy for fictional characters). This is a bug; ideally, my emotional reaction should take into account such an important detail as whether or not something really happened. After all, what detail could be more important?

Comment author: Kawoomba 30 April 2013 02:13:46PM *  2 points [-]

It's not a bug, it's a feature (in some contexts).

Suppose you were playing two games of online chess against an anonymous opponent. You barely lose the first one. Now you're feeling the spirit of competition, your blood boiling for revenge! Should you force yourself to relinquish the thrill of the contest, because "it doesn't really matter"? That would be no fun! :-(

If you're reading a work of fiction, knowing it is fiction, why are you doing so? Because emotional investment is fun? Why would you then sabotage your enjoyment by trying to downsize your emotional investment, since "it's not real"? Also no fun! :-(

If the flawed heuristic you are employing in a certain context works in your favor in that context, switching it off would be dumb (although being vaguely aware of it would not be).

Comment author: CCC 03 May 2013 02:17:20PM 1 point [-]

Should you force yourself to relinquish the thrill of the contest, because "it doesn't really matter"? That would be no fun!

Oh, it does matter. There's a real opponent there. That's reality.

If you're reading a work of fiction, knowing it is fiction, why are you doing so? Because emotional investment is fun?

You make a good point.

Comment author: MugaSofer 01 May 2013 01:53:16PM -1 points [-]

I'm not sure I'd characterize that as a "bug", more a feature we need to be aware of and take into account.

If you weren't moved by fictional scenarios, you wouldn't be able to empathize with people in those scenarios - including your future self! We mostly predict other people's actions by using our own brain as a black box, imagining ourselves in their situation and how we would react, so there goes any situation featuring other humans. And we couldn't daydream or enjoy fiction, either.

Would it be useful to turn it off? Maaaybe, but as long as you don't start taking hypothetical people's wishes into account, and stop reading stuff that triggers you, you're fine - I bet the consequences for misuse would be higher than the marginal benefits.

Comment author: CCC 03 May 2013 02:16:09PM 0 points [-]

I don't think that empathising with fictional characters should be turned off. I just think that properly calibrated emotions should take all factors into account, with properly relevant weightings. I notice that my emotions do not seem to be taking the 'reality' factor into account, and I therefore conclude that my emotions are poorly calibrated.

My future self would be a potentially real scenario, and thus would deserve all the emotional investment appropriate for a situation that may well come to pass. (He also gets the emotional investment for being me, which is quite large).

I'm not sure whether I should be feeling more sympathy for strangers, or less sympathy for fictional people.

Comment author: MugaSofer 12 May 2013 08:45:35PM 0 points [-]

So ... are you saying that they're poorly calibrated, but that's fine and nothing to worry about as long as we don't forget it and start giving imaginary people moral weight? Because if so, I agree with you on this.

Comment author: [deleted] 29 April 2013 11:57:13AM 1 point [-]

If the account is detailed enough, it does move me, but not much more than an otherwise identical account that I know is fictional.

Comment author: MugaSofer 29 April 2013 10:12:01AM *  -2 points [-]

That edit does make your meaning clearer. It does so by highlighting that my phrasing was sloppy, so let me try to explain myself better.

Fair enough.

If the victim is someone whom I have never met or interacted with, and am unlikely to meet or interact with, I am mildly perturbed. If the victim is a fictional character, I am also mildly perturbed.

That depends on how much you know about/empathize with them, right?

Comment author: CCC 30 April 2013 01:31:50PM 0 points [-]

That depends on how much you know about/empathize with them, right?

Yes; but I can know as much about a fictional character as about a non-fictional character whom I have not interacted with. The dependency has nothing to do with the fictionality or lack thereof of the character.

Comment author: MugaSofer 01 May 2013 02:03:33PM -1 points [-]

Right, hence me quoting both the section on fictional and non-fictional characters.

To be honest, our brains don't really seem to distinguish between fiction and non-fiction at all; it's merely a question of context. Hence our reactions to fictional evidence and so forth. Lotta awkward biases you can catch from that what with our tendency to "buy in" to compelling narratives.

Comment author: Kawoomba 26 April 2013 01:38:48PM 0 points [-]

It's not a bias if you value an additional dollar less once all your needs are met.

It's not a bias if you value a random human life less if there are billions of others, compared to if there are only a few others.

You may choose for yourself to value a $10 bill the same whether you're dirt poor, or a millionaire. Same with human lives. But you don't get to "that's a bias" others who have a more nuanced and context-sensitive estimation.

Comment author: CCC 26 April 2013 01:32:39PM *  0 points [-]

Add another ten orders of magnitude and utter apathy when it comes to a billion arbitrary lives seems plausible.

A billion is nine orders of magnitude. As a very rough estimate, then, adding an order of magnitude to the number of lives in existence divides the motivation to extend an arbitrary stranger's life by an order of magnitude. And the same for any other multiplier.

That is, if G is chosen such that f(x)-f(x-1)=G, then f(Mx)-f(Mx-1)=G/M for any given x and any multiplier M. If I then define my hedons such that f(0)=0 and f(1)=1...

...then I get that f(x) is the harmonic series.

For 10,000 people, on this entirely arbitrary (and extremely large) scale, I get a value f(x) between 9 and 10; for seven billion, f(x) lies between 23 and 24 (source).
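The arithmetic above checks out; here is a minimal Python sketch of it (the function and variable names are mine, not from the thread):

```python
import math

def harmonic(n):
    """Partial sum of the harmonic series: f(n) = 1 + 1/2 + ... + 1/n."""
    return sum(1.0 / k for k in range(1, n + 1))

# The boundary conditions from the comment: f(0) = 0 and f(1) = 1.
assert harmonic(0) == 0.0
assert harmonic(1) == 1.0

# For a population of 10,000 the value lands between 9 and 10, as stated.
f_10k = harmonic(10_000)

# Summing seven billion terms directly is slow, so use the standard
# approximation H(n) ~ ln(n) + gamma (Euler-Mascheroni constant),
# which is accurate to O(1/n).
gamma = 0.5772156649
f_7bn = math.log(7e9) + gamma

print(f_10k)  # ~9.79, i.e. between 9 and 10
print(f_7bn)  # ~23.25, i.e. between 23 and 24
```

The marginal value of one more life on this scale is f(x) - f(x-1) = 1/x, which is exactly the "each order of magnitude of population divides the motivation by an order of magnitude" property described above.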

Comment author: [deleted] 26 April 2013 05:12:21PM 3 points [-]

...then I get that f(x) is the harmonic series.

That's pretty much the natural logarithm of x (plus a constant, plus a term O(1/x)).

Comment author: CCC 27 April 2013 05:36:03PM 1 point [-]

Hm. Yes, to the level of approximation I'm using here, I could as easily have used a log function. And would have, if I'd thought of it; the log function is used enough that I'd expect its properties to be easier for whoever reads my post to imagine.

Comment author: MugaSofer 26 April 2013 11:28:42AM *  -1 points [-]

I mean, if I try to imagine living in a world where only 10,000 people exist, I conclude that I would be significantly more motivated to extend the lives of an arbitrary person (e.g., by preventing them from starving) than I am now. (Leaving aside any trauma related to the dieback itself.)

Well, if the population is that low saving people is guarding against an existential risk, so I would feel the same. Does your introspection yield anything on why smaller numbers matter more?

ETA: your brain can't grasp numbers anywhere near as high as a billion. How sure are you murder matters now?

Comment author: TheOtherDave 26 April 2013 01:12:36PM 3 points [-]

How sure are you murder matters now?

It's pretty clear that individual murder doesn't matter to me.

I mean, someone was murdered just now, as I write this sentence, and I care about that significantly less than I care about the quality of my coffee. I mean, I just spent five seconds adjusting the quality of my coffee, which is at least a noticeable quantity of effort if not a significant one. I can't say the same about that anonymous murder.

Oh look, there goes another one. (Yawn.)

The metric I was using was not "caring whether someone is murdered", which it's clear I really don't, but rather "being willing to murder someone," which it's relatively clear that I do, but not nearly as much as I could. (Insert typical spiel here about near/far mode, etc.)

Comment author: nshepperd 26 April 2013 02:11:49PM 0 points [-]

I think the resolution to that is that you don't have to have an immediate emotional reaction to care about it. There are lots of good and bad things happening in the world right now, but trying to feel all of them would be pointless, and a bad fit for our mental architecture. But we can still care, I think.

Comment author: TheOtherDave 26 April 2013 03:38:08PM 0 points [-]

Well, I certainly agree that I don't have to have an emotional reaction to each event, or indeed a reaction to the event at all, in order to be motivated to build systems that handle events in that class in different ways. I'm content to use the word "care" to refer to such motivation, either as well as or instead of referring to such emotional reactions. Ditto for "matters" in questions like "does murder matter", in which case my answer to the above would change, but that certainly isn't how I understood MugaSofer's question.

Comment author: [deleted] 26 April 2013 05:03:50PM *  1 point [-]

So the question now is: if you could prevent someone you would most likely never otherwise interact with from being murdered, but that would make your coffee taste worse, what would you do?

Comment author: shminux 26 April 2013 06:00:41PM *  7 points [-]

Don't we make this choice daily by choosing our preferred brand over Ethical Bean at Starbucks?

Comment author: Eliezer_Yudkowsky 26 April 2013 11:26:06PM 7 points [-]

I hear the ethics at Starbucks are rather low-quality and in any case, surely Starbucks isn't the cheapest place to purchase ethics.

Comment author: gwern 27 April 2013 12:23:31AM *  18 points [-]

Bah! Listen, Eliezer, I'm tired of all your meta-hipsterism!

"Hey, let's get some ethics at Starbucks" "Nah, it's low-quality; I only buy a really obscure brand of ethics you've probably never heard of called MIRI". "Hey man, you don't look in good health, maybe you should see a doctor" "Nah, I like a really obscure form of healthcare, I bet you're not signed up for it, it's called 'cryonics'; it's the cool thing to do". "I think I like you, let's date" "Oh, I'm afraid I only date polyamorists; you're just too square". "Oh man, I just realized I committed hindsight bias the other day!" "I disagree, it's really the more obscure backfire effect which just got published a year or two ago." "Yo, check out this thing I did with statistics" "That's cool. Did you use Bayesian techniques?"

Man, forget you!

/angrily sips his obscure mail-order loose tea, a kind of oolong you've never heard of (Formosa vintage tie-guan-yin)

Comment author: Eliezer_Yudkowsky 27 April 2013 08:24:01AM 6 points [-]

If you can't pick something non-average to meet your optimization criteria, you can't optimize above the average.

This comment has been brought to you by my Dvorak keyboard layout.

Comment author: [deleted] 27 April 2013 12:42:20AM 2 points [-]

Ouch, that cuts a bit close to home...

Comment author: [deleted] 27 April 2013 10:14:48AM *  1 point [-]

(Had to google “backfire effect” to find out whether you had made it up on the spot.)

EDIT: Looks like I had already heard of that effect, and I even seem to recall E.T. Jaynes giving a theoretical explanation of it, but I didn't remember whether it had a name.

Comment author: Vaniver 27 April 2013 01:14:53AM 1 point [-]

"Yo, check out this thing I did with statistics" "That's cool. Did you use Bayesian techniques?"

I can't tell if I should feel good or bad that this was the only one where I said "well, actually..."

Comment author: [deleted] 27 April 2013 10:12:18AM *  1 point [-]

BTW, for some reason, certain “fair trade” products at my supermarket are astoundingly cheap (as in, I've bought very similar but non-“fair trade” stuff for more); I notice that I'm confused.

Comment author: TheOtherDave 26 April 2013 05:27:29PM 3 points [-]

Judging from experience, the answer is that it depends on how the choice is framed.

That said, I'd feel worse afterwards about choosing the tastier coffee.