Rationality Quotes September 2012

7 Post author: Jayson_Virissimo 03 September 2012 05:18AM

Here's the new thread for posting quotes, with the usual rules:

  • Please post all quotes separately, so that they can be voted up/down separately.  (If they are strongly related, reply to your own comments.  If strongly ordered, then go ahead and post them together.)
  • Do not quote yourself
  • Do not quote comments/posts on LW/OB
  • No more than 5 quotes per person per monthly thread, please.

Comments (1088)

Comment author: Jayson_Virissimo 01 September 2012 08:18:04AM *  12 points [-]

Conspiracy Theory, n. A theory about a conspiracy that you are not supposed to believe.

-L. A. Rollins, Lucifer's Lexicon: An Updated Abridgment

Comment author: Jayson_Virissimo 01 September 2012 08:25:27AM *  34 points [-]

Infallible, adj. Incapable of admitting error.

-L. A. Rollins, Lucifer's Lexicon: An Updated Abridgment

Comment deleted 01 September 2012 09:33:16AM *  [-]
Comment author: gwern 01 September 2012 06:47:39PM 4 points [-]
Comment author: Ezekiel 01 September 2012 09:16:48PM 8 points [-]

... which one wish, carefully phrased, could also provide.

Comment author: Jonathan_Graehl 03 September 2012 03:58:34AM 9 points [-]
Comment author: Eliezer_Yudkowsky 03 September 2012 05:19:56AM 14 points [-]

Er... actually the genie is offering at most two rounds of feedback.

Sorry about the pedantry, it's just that as a professional specialist in genies I have a tendency to notice that sort of thing.

Comment author: roland 03 September 2012 09:21:48AM 0 points [-]

Why only 2 rounds of feedback if you have 3 wishes?

Comment author: RomanDavis 03 September 2012 09:24:30AM 4 points [-]

The third one's for keeps: you can't wish the consequences away.

Comment author: roland 03 September 2012 09:34:02AM 1 point [-]

Right, but the consequences still qualify as feedback, no?

Comment author: RomanDavis 03 September 2012 09:47:56AM *  0 points [-]

I always imagine the genie just goes back into his lamp to sleep or whatever, so in the hypothetical as it exists in my head, no. But I guess there could be a highly ambitious Genie looking for feedback after your last wish, so maybe.

I think in this case, Eliezer is talking about a genie like in Failed Utopia 4-2 who grants his wish, and then keeps working, ignoring feedback, because he just doesn't care, because caring isn't part of the wish.

The genie doesn't care about consequences, he just cares about the wishes. The second wish and third wish are the feedback.

Comment author: wedrifid 03 September 2012 10:03:20AM 3 points [-]

I always imagine the genie just goes back into his lamp to sleep or whatever, so in the hypothetical as it exists in my head, no. But I guess there could be a highly ambitious Genie looking for feedback after your last wish, so maybe.

The feedback is for you, not what you happen to say to the genie.

Comment author: wedrifid 03 September 2012 10:33:55AM *  3 points [-]

Sorry about the pedantry, it's just that as a professional specialist in genies I have a tendency to notice that sort of thing.

Rather than a technical correction you seem just to be substituting a different meaning of 'feedback'. The author would certainly not agree that "You get 0 feedback from 1 wish".

Mind you, I am wary of the fundamental message of the quote. Feedback? One of the most obviously important purposes of getting feedback is to avoid catastrophic failure. Yet catastrophic failures are exactly the kind of thing that will prevent you from using the next wish. So this is "Just Feedback" that can Kill You Off For Real despite the miraculous intervention you have access to.

I'd say "What the genie is really offering is a wish and two chances to change your mind---assuming you happen to be still alive and capable of constructing corrective wishes".

Comment author: Morendil 03 September 2012 11:00:10AM *  4 points [-]

"What the genie is really offering is a wish and two chances to change your mind---assuming you happen to be still alive and capable of constructing corrective wishes".

One well-known folk tale is based on precisely this interpretation. Probably more than one.

Comment deleted 01 September 2012 09:35:51AM *  [-]
Comment author: Ezekiel 02 September 2012 12:06:06PM 3 points [-]

Open question: Do you care about what (your current brain predicts) your transhuman self would want?

Comment author: fiddlemath 02 September 2012 10:48:28PM 2 points [-]

Yes, I think so. It surely depends on exactly how I extrapolate to my "transhuman self," but I suspect that its goals will be like my own goals, writ larger.

Comment author: Eliezer_Yudkowsky 03 September 2012 05:22:24AM 10 points [-]

If you don't, you're really going to regret it in a million years.

Comment author: Ezekiel 03 September 2012 05:50:51AM *  0 points [-]

The chance of human augmentation reaching that level within my lifespan (or even within my someone's-looking-after-my-frozen-brain-span) is, by my estimate, vanishingly low. But if you're so sure, could I borrow money from you and pay you back some ludicrously high amount in a million years' time?

More seriously: Seeing as my current brain finds regret unpleasant, that's something that reduces to my current terminal values anyway. I do consider transhuman-me close enough to current-me that I want it to be happy. But where their terminal values actually differ, I'm not so sure - even if I knew I were going to undergo augmentation.

Comment author: wedrifid 03 September 2012 07:48:10AM 3 points [-]

If you don't, you're really going to regret it in a million years.

I'm rather skeptical about that, even conditioning on Ezekiel being around to care. I expect that the difference between him having his current preferences and his current preferences+more caring about future preferences will not result in a significant difference in the outcome the future Ezekiel will experience.

Comment author: Will_Newsome 01 September 2012 10:17:14AM *  11 points [-]

Proceed only with the simplest terms, for all others are enemies and will confuse you.

— Michael Kirkbride / Vivec, "The Thirty Six Lessons of Vivec", Morrowind.

Comment author: Ezekiel 03 September 2012 03:17:59PM 5 points [-]

Am I the only one who thinks we should stop using the word "simple" for Occam's Razor / Solomonoff's Whatever? In 99% of use-cases by actual humans, it doesn't mean Solomonoff induction, so it's confusing.

Comment author: Ezekiel 01 September 2012 11:27:29AM *  58 points [-]

"Wait, Professor... If Sisyphus had to roll the boulder up the hill over and over forever, why didn't he just program robots to roll it for him, and then spend all his time wallowing in hedonism?"
"It's a metaphor for the human struggle."
"I don't see how that changes my point."

Comment author: Eugine_Nier 03 September 2012 02:53:57AM *  29 points [-]

Well, his point only makes any sense when applied to the metaphor since a better answer to the question

"Wait, Professor... If Sisyphus had to roll the boulder up the hill over and over forever, why didn't he just program robots to roll it for him, and then spend all his time wallowing in hedonism?"

is:

"where would Sisyphus get a robot in the middle of Hades?"

Edit: come to think of it, this also works with the metaphor for human struggle.

Comment author: Alejandro1 03 September 2012 03:01:13AM 5 points [-]

Borrowing one of Hephaestus', perhaps?

Comment author: Ezekiel 03 September 2012 05:54:26AM 19 points [-]

Now someone just has to write a book entitled "The Rationality of Sisyphus", give it a really pretentious-sounding philosophical blurb, and then fill it with Grand Theft Robot.

Comment author: taelor 03 September 2012 04:42:35AM 15 points [-]

Answer: Because the Greek gods are vindictive as fuck, and will fuck you over twice as hard when they find out that you wriggled out of it the first time.

Comment author: buybuydandavis 03 September 2012 11:04:20AM *  13 points [-]

Who was the guy who tried to bargain the gods into giving him immortality, only to get screwed because he hadn't thought to ask for youth and health as well? He ended up being a shriveled, crab-like thing in a jar.

My high school English teacher thought this fable showed that you should be careful what you wished for. I thought it showed that trying to compel those with great power through contract was a great way to get yourself fucked good and hard. Don't think you can fuck with people a lot more powerful than you are and get away with it.

EDIT: The myth was of Tithonus. The goddess Eos was keeping him as a lover, and tried to bargain with Zeus for his immortality, without asking for eternal youth too. Ooops.

Comment author: Ezekiel 03 September 2012 03:14:29PM 21 points [-]

Don't think you can fuck with people a lot more powerful than you are and get away with it.

I'm no expert, but that seems to be the moral of a lot of Greek myths.

Comment author: RomanDavis 01 September 2012 12:18:46PM *  6 points [-]

A scientific theory

Isn't just a hunch or guess

It's more like a question

That's been put through a lot of tests

And when a theory emerges

Consistent with the facts

The proof is with science

The truth is with science

-- They Might Be Giants

Comment author: Yvain 01 September 2012 02:20:44PM 56 points [-]

Do unto others 20% better than you expect them to do unto you, to correct for subjective error.

-- Linus Pauling

Comment author: gwern 01 September 2012 07:14:46PM 18 points [-]

Citation for this was hard; the closest I got was Etzioni's 1962 The Hard Way to Peace, pg 110. There's also a version in the 1998 Linus Pauling on peace: a scientist speaks out on humanism and world survival : writings and talks by Linus Pauling; this version goes

I have made a modern formulation of the Golden Rule: "Do unto others 20 percent better than you would be done by - the 20 percent is to correct for subjective error."

Comment author: DanielLC 02 September 2012 07:13:46PM 1 point [-]

How about doing unto others what maximizes total happiness, regardless of what they'd do unto you?

Comment author: RomanDavis 02 September 2012 07:45:27PM 2 points [-]

By acting in a way that discourages them from hurting you, and encouraging them to help you, you are playing your part in maximizing total happiness.

Comment author: DanielLC 02 September 2012 10:50:18PM 0 points [-]

Yeah, but it's not necessarily the ideal way to act. Perhaps you should act generally better than that, or perhaps you should try to amplify it more. Do what you can to find out the optimal way to act. At least pay attention if you find new information. Don't just make a guess and assume you're correct.

Comment author: RomanDavis 03 September 2012 05:21:38AM 0 points [-]

You don't think you should discourage others from hurting you? I think that seems sort of obvious. Now, if you could somehow give a person a strong incentive to help you/ not hurt, while simultaneously granting them a shitload of happiness, that seems ideal. This doesn't really exclude that, it's just on the positive side of doing/ being done unto.

Comment author: DanielLC 03 September 2012 05:56:32AM 0 points [-]

You should probably discourage others from hurting you. It's just not clear how much.

Comment author: RomanDavis 03 September 2012 06:12:04AM 1 point [-]

As much as possible for the least amount of harm possible and the least amount of wasted time and resources, obviously. Which varies on a case by case basis.

I mean if it was practical, you'd give your friends 2 billion units of happiness, and then after turning the cheek to your enemies, grant them 1.9 billion units of happiness, but living on planet earth, giving you 80% of the crap you gave me seems about right.

Comment author: CCC 03 September 2012 08:43:11AM 0 points [-]

Not necessarily. If I horribly torture Jim because Jim stepped on my toes, then I am not maximizing total happiness; the unhappiness given to Jim by the torture outweighs the unhappiness in me that is prevented by having no-one step on my toes.

Comment author: RomanDavis 03 September 2012 09:21:10AM 4 points [-]

That's a lot of effort and pain to prevent someone stepping on your toes.

Also, I'm not sure that'd be a terribly effective way to prevent harm to yourself. I mean, to the extent possible, once everyone knows you tortured Jim, people will be scared shitless to step on your toes, but Jim and Jim's family are very likely to murder you, or at least sue you for all your money and put you in jail for a long time.

Comment author: Kindly 02 September 2012 09:06:42PM 2 points [-]

It's a nice sentiment, but the optimization problem you suggest is usually intractable.

Comment author: DanielLC 02 September 2012 10:51:15PM 2 points [-]

It's better to at least attempt it than just find an easier problem and do that. You might have to rely on intuition and such to get any answer, but you're not going to do well if you just find something easier to optimize.

Comment author: Kindly 02 September 2012 11:04:22PM 3 points [-]

Yes, but there's no way a pithy quote is going to solve the problem for you. It might, however, contain a useful heuristic.

Comment author: prase 02 September 2012 09:46:29PM 6 points [-]

The former is computationally far more feasible.

Comment author: CronoDAS 02 September 2012 11:56:56PM *  2 points [-]
Comment author: DanielLC 03 September 2012 12:13:53AM *  0 points [-]

It's impossible to find a strategy that produces happiness better than trying to produce happiness, since if you knew of one, you'd try to produce happiness by following that strategy. If this method is what works best, then in doing what works best, you'd follow this method.

Also, linking to TVTropes tends to fall under generalizing from fictional evidence.

Comment author: CronoDAS 03 September 2012 02:40:49AM 1 point [-]

Art imitates life. ;)

And it's not hard to think of real life examples of atrocities "justified" on utilitarian grounds that the rest of the world thinks are anything but justifiable. The Reign of Terror during the French Revolution, for example, is generally regarded as having gone too far.

Comment author: wedrifid 03 September 2012 07:53:58AM 0 points [-]

How about doing unto others what maximizes total happiness, regardless of what they'd do unto you?

You may do that if you must, I recommend against it.

Comment author: DanielLC 03 September 2012 05:38:39PM 0 points [-]

Why do you recommend against it? Do you have a more complicated utility function?

Comment author: Caspian 03 September 2012 07:15:29AM 3 points [-]

Did you take "expect" to mean as in prediction, or as in what you would have them do, like the Jesus version?

Comment author: Daniel_Burfoot 01 September 2012 03:57:48PM 24 points [-]

It is now clear to us what, in the year 1812, was the cause of the destruction of the French army. No one will dispute that the cause of the destruction of Napoleon's French forces was, on the one hand, their advance late in the year, without preparations for a winter march, into the depths of Russia, and, on the other hand, the character that the war took on with the burning of Russian towns and the hatred of the foe aroused in the Russian people. But then not only did no one foresee (what now seems obvious) that this was the only way that could lead to the destruction of an army of eight hundred thousand men, the best in the world and led by the best generals, in conflict with a twice weaker Russian army, inexperienced and led by inexperienced generals; not only did no one foresee this, but all efforts on the part of the Russians were constantly aimed at hindering the one thing that could save Russia, and, on the part of the French, despite Napoleon's experience and so-called military genius, all efforts were aimed at extending as far as Moscow by the end of summer, that is, at doing the very thing that was to destroy them.

  • Leo Tolstoy, "War and Peace", trans. Pevear and Volokhonsky
Comment author: RichardKennaway 01 September 2012 04:03:28PM 18 points [-]

Nothing can be soundly understood
If daylight itself needs proof.

Imām al-Ḥaddād (trans. Moṣṭafā al-Badawī), "The Sublime Treasures: Answers to Sufi Questions"

Comment author: gwern 01 September 2012 05:25:57PM *  10 points [-]
Comment author: Jay_Schweikert 02 September 2012 05:39:39PM 4 points [-]

This also made me think of the aphorism "if water sticks in your throat, with what will you wash it down?"

Comment author: gwern 02 September 2012 05:43:35PM 0 points [-]

Or "if salt loses its savor", although I wonder if they're really making the same philosophical point about relative weights of evidence on two sides of a contradiction/paradox.

Comment author: siodine 02 September 2012 06:00:59PM *  4 points [-]

Richard Carrier on solipsism, but not nearly as pithy:

Solipsism still requires an explanation for what you are cognating. There are only two logically possible explanations: random chance, or design.

It’s easy to show that the probability that your stream of consciousness is a product of random chance is absurdly low (see Boltzmann brains, for example). In simple form, if we assume no prior knowledge or assumptions (other than logic and our raw uninterpreted experience), the prior probability of solipsism becomes 0.5 but the likelihood of the evidence on solipsism is then vanishingly small (approaching zero), since chance events would sooner produce a relative chaos than an organized stream of complex consciousness, whereas the likelihood of that same evidence on a modest scientific realism is effectively 100%. Work the math and the probability of chance-based solipsism is necessarily vanishingly small (albeit not zero, but close enough for any concern). Conclusion: random solipsism would sooner produce a much weirder experience.

That leaves some sort of design hypothesis, namely your mind is cleverly making everything up, just so. Which requires your mind to be vastly more intelligent and resourceful and recollectful than you experience yourself being, since you so perfectly create a reality for yourself that remains consistent and yet that you can’t control with your mind. So you control absolutely everything, yet control next to nothing, a contradiction in terms, although an extremely convoluted system of hypotheses could eliminate that contradiction with some elaborate device explaining why your subconscious is so much more powerful and brilliant and consistent and mysterious than your conscious self is. The fact that you have to develop such a vastly complex model of how your mind works, just to get solipsism to make the evidence likely (as likely as it already is on modest scientific realism), necessarily reduces the prior probability by as much, and thus the probability of intelligent solipsism is likewise vanishingly small. Conclusion: intelligent solipsism would sooner result in your being more like a god, i.e. you would have vast or total control over your reality.

One way to think of the latter demarcation of prior probability space is similar to the thermodynamic argument against our having a Boltzmann brain: solipsism is basically a Cartesian demon scenario, only the demon is you; so think of all the possible Cartesian demons, from “you can change a few things but not all,” to “you can change anything you want,” and then you’ll see the set of all possible solipsistic states in which you would have obvious supernatural powers (the ability to change aspects of reality) is vastly larger than the set of all possible solipsistic states in which you can’t change anything except in exactly the same way as a modest scientific realism would produce. In other words, we’re looking at an incredible coincidence, where the version of solipsism that is realized just “happens” to be exactly identical in all observed effects to non-solipsism. And the prior probability space shared by that extremely rare solipsism is a vanishingly small fraction of all logically possible solipsisms. Do the math and the probability of an intelligent solipsism is vanishingly small.

This all assumes you have no knowledge making any version of solipsism more likely than another. And we are effectively in that state vis-a-vis normal consciousness. However we are not in that state vis-a-vis other states of consciousness, e.g. put “I just dropped acid” or “I am sleeping” in your background knowledge and that entails a much higher probability that you are in a solipsistic state, but then that will be because the evidence will be just as such a hypothesis would predict: reality starts conforming to your whim or behaving very weirdly in ways peculiar to your own desires, expectations, fears, etc. Thus “subjective” solipsism is then not a vanishingly small probability. But “objective” solipsism would remain so (wherein reality itself is a product of your solipsistic state), since for that to explain all the same evidence requires extremely improbable coincidences again, e.g. realism explains why you need specific conditions of being drugged or sleeping to get into such a state, and why everything that happens or changes in the solipsistic state turns out not to have changed or happened when you exit that state, and why the durations and limitations and side effects and so on all are as they are, whereas pure solipsism doesn’t come with an explanation for any of that, there in that case being no actual brain or chemistry or “other reality” to return to, and so on, so you would have to build all those explanations in to get objective solipsism to predict all the same evidence, and that reduces the prior. By a lot.

There is no logically consistent way to escape the conclusion that solipsism is exceedingly improbable.
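Carrier's "work the math" step can be made concrete with a toy Bayes calculation. The likelihood numbers below are invented stand-ins for his "vanishingly small" and "effectively 100%", so only the orders of magnitude carry the point:

```python
# Toy Bayes calculation for chance-based solipsism vs. modest realism.
# The likelihood values are made up for illustration; Carrier only
# claims "vanishingly small" vs. "effectively 100%".
prior_solipsism = 0.5          # the even prior Carrier grants
prior_realism = 0.5

likelihood_solipsism = 1e-30   # chance would sooner produce chaos
likelihood_realism = 1.0       # realism predicts exactly this evidence

joint_s = prior_solipsism * likelihood_solipsism
joint_r = prior_realism * likelihood_realism
posterior_solipsism = joint_s / (joint_s + joint_r)

print(posterior_solipsism)     # tiny, but (as he says) not zero
```

Whatever small-but-nonzero likelihood you plug in, the posterior ends up on the same order as the likelihood ratio, which is all the verbal argument needs.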

Comment author: gwern 02 September 2012 11:23:49PM 15 points [-]

I think that's actually a really terrible bit of arguing.

There are only two logically possible explanations: random chance, or design.

We can stop right there. If we're all the way back at solipsism, we haven't even gotten to defining concepts like 'random chance' or 'design', which presume an entire raft of external beliefs and assumptions, and we surely cannot immediately say there are only two categories unless, in response to any criticism, we're going to include a hell of a lot under one of those two rubrics. Which probability are we going to use, anyway? There are many more formalized versions than just Kolmogorov's axioms (which brings us to the analytic and synthetic problem).

And much of the rest goes on in a materialist vein which itself requires a lot of further justification (why can't minds be ontologically simple elements? Oh, your experience in the real world with various regularities has persuaded you that is inconsistent with the evidence? I see...) Even if we granted his claims about complexity, why do we care about complexity? And so on.

Yes, if you're going to buy into a (very large) number of materialist non-solipsist claims, then you're going to have trouble making a case in such terms for solipsism. But if you've bought all those materialist or externalist claims, you've already rejected solipsism and there's no tension in the first place. And he doesn't do a good case of explaining that at all.

Comment author: siodine 03 September 2012 01:11:27AM 1 point [-]

Good points, but then likewise how do you define and import the designations of 'hand' or 'here' and justify intuitions or an axiomatic system of logic (and I understood Carrier to be referring to epistemic solipsism like Moore -- you seem to be going metaphysical)? (or were you not referring to Moore's argument in the context of skepticism?)

Comment author: simplicio 01 September 2012 04:06:40PM 20 points [-]

...a good way of thinking about minimalism [about truth] and its attractions is to see it as substituting the particular for the general. It mistrusts anything abstract or windy. Both the relativist and the absolutist are impressed by Pilate's notorious question 'What is Truth?', and each tries to say something useful at the same high and vertiginous level of generality. The minimalist can be thought of turning his back on this abstraction, and then in any particular case he prefaces his answer with the prior injunction: you tell me. This does not mean, 'You tell me what truth is.' It means, 'You tell me what the issue is, and I will tell you (although you will already know, by then) what the truth about the issue consists in.' If the issue is whether high tide is at midday, then truth consists in high tide being at midday... We can tell you what truth amounts to, if you first tell us what the issue is.

There is a very powerful argument for minimalism about truth, due to the great logician Gottlob Frege. First, we should notice the transparency property of truth. This is the fact that it makes no difference whether you say that it is raining, or it is true that it is raining, or true that it is true that it is raining, and so on forever. But if 'it is true that' introduced some substantial, robust property of a judgment, how could this be so? Consider, for example, a pragmatism that attempts some equation between truth and utility. Then next to the judgment 'it is raining' we might have 'it is useful to believe that it is raining.' But these are entirely different things! To assess the first we direct our attention to the weather. To assess the second we direct our attention to the results of believing something about the weather - a very different investigation.

Let us return to Pilate. Where does minimalism about truth leave him? It suggests that when he asked this question, he was distracting himself and his audience from his real job, which was to find out whether to uphold certain specific historical charges against a defendant. Thus, if I am innocent, and I come before a judge, I don't want airy generalities about the nature of truth. I want him to find that I did not steal the watch if I did not steal the watch. I want him to rub his nose in the issue. I want a local judgment about a local or specific event, supposed to have happened in a particular region of time and space.

(Simon Blackburn, Truth)

Comment author: Alejandro1 01 September 2012 05:14:07PM 7 points [-]

The pithiest definition of Blackburn's minimalism I've read is in his review of Nagel's The Last Word:

We can see why this is so if we put it in terms of what we can call Ramsey’s ladder. This takes us from p to it is true that p, to it is really true that p, to it is really a fact that it is true that p, and if we like to it is really a fact about the independent order of things ordained by objective Platonic normative structures with which we resonate in harmony that it is true that p. For the metatheoretical minimalist, Ramsey’s ladder is horizontal. The view from the top is just the same as the view from the bottom, and the view is p.

It is followed by an even pithier response to how Nagel refutes relativism (pointing that our first-order conviction that 2+2=4 or that murder is wrong is more certain than any relativist doubts) and thinks that this establishes a quasi-Platonic absolutism as the only alternative:

This is… taking advantage of the horizontal nature of Ramsey’s ladder to climb it, and then announce a better view from the top.

Comment author: buybuydandavis 03 September 2012 11:12:35AM 1 point [-]

"What is truth" is a pretty good question, though a better one is "what do we do with truths?"

We do a lot of things with truths; they can serve a lot of different functions. The problem comes when people doing different things with their truths talk to each other.

Comment author: [deleted] 01 September 2012 04:17:29PM 10 points [-]

Life is like a box of crayons. Most people are the 8-color boxes, but what you're really looking for are the 64-color boxes with the sharpeners on the back. I fancy myself to be a 64-color box, though I've got a few missing. It's ok though, because I've got some more vibrant colors like periwinkle at my disposal. I have a bit of a problem though in that I can only meet the 8-color boxes. Does anyone else have that problem? I mean there are so many different colors of life, of feeling, of articulation... so when I meet someone who's an 8-color type... I'm like, 'hey girl, magenta!' and she's like, 'oh, you mean purple!' and she goes off on her purple thing, and I'm like, 'no - I want magenta!'

John Mayer

Comment author: Kaj_Sotala 01 September 2012 06:08:27PM 49 points [-]

The person who says, as almost everyone does say, that human life is of infinite value, not to be measured in mere material terms, is talking palpable, if popular, nonsense. If he believed that of his own life, he would never cross the street, save to visit his doctor or to earn money for things necessary to physical survival. He would eat the cheapest, most nutritious food he could find and live in one small room, saving his income for frequent visits to the best possible doctors. He would take no risks, consume no luxuries, and live a long life. If you call it living. If a man really believed that other people's lives were infinitely valuable, he would live like an ascetic, earn as much money as possible, and spend everything not absolutely necessary for survival on CARE packets, research into presently incurable diseases, and similar charities.

In fact, people who talk about the infinite value of human life do not live in either of these ways. They consume far more than they need to support life. They may well have cigarettes in their drawer and a sports car in the garage. They recognize in their actions, if not in their words, that physical survival is only one value, albeit a very important one, among many.

-- David D. Friedman, The Machinery of Freedom

Comment author: DanielLC 02 September 2012 08:03:58PM 0 points [-]

He's just showing that those people don't give infinite value, not that it's nonsense. It's nonsense because, even if you consider life infinitely more intrinsically valuable than a green piece of paper, you'd still trade a life for green pieces of paper, so long as you could trade them back for more lives.

Comment author: RobinZ 02 September 2012 10:23:33PM 2 points [-]

If life were of infinite value, trading a life for two new lives would be a meaningless operation - infinity times two is equal to infinity. Unless, that is, by "life has infinite value" you actually mean "everything else is worthless".

Comment author: fiddlemath 02 September 2012 10:44:58PM *  6 points [-]

Not quite so! We could presume that value isn't restricted to the reals + infinity, but say that something's value is a value among the ordinals. Then, you could totally say that life has infinite value, but two lives have twice that value.

But this gives non-commutativity of value. Saving a life and then getting $100 is better than getting $100 and saving a life, which I admit seems really screwy. This also violates the Von Neumann-Morgenstern axioms.

In fact, if we claim that a slice of bread is of finite value, and, say, a human life is of infinite value in any definition, then we violate the continuity axiom... which is probably a stronger counterargument, and tightly related to the point DanielLC makes above.
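The ordinal picture above can be sketched concretely (my illustration, not from the thread): take a life to be worth \(\omega\) and \$100 to be worth \(1\). Ordinal addition is not commutative, and that is exactly the order-dependence described above:

```latex
% A life is worth \omega; \$100 is worth 1.
\omega + 1 > \omega \quad \text{(save a life, then get \$100)}
1 + \omega = \omega \quad \text{(get \$100 first; the finite term is absorbed)}
% Continuity failure: with a slice of bread worth 1, the axiom demands a
% real probability p with a lottery value of p\,\omega + (1-p)\,0 = 1,
% but any reasonable reading makes that value \omega-sized for p > 0 and
% 0 for p = 0, so no such p exists.
```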

Comment author: DanielLC 02 September 2012 10:55:37PM *  3 points [-]

You could use hyperreal numbers. They behave pretty similarly to reals, and have reals as a subset. Also, if you multiply any hyperreal number besides zero by a real number, you get something isomorphic to the reals, so you can multiply by infinity and it will still work the same.

I'm not a big fan of the continuity axiom. Also, if you allow for hyperreal probabilities, you can still get it to work.
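One way to see how "something of infinite value plus something of finite value" can still behave sensibly is to model each value as a pair (infinite part, finite part) compared lexicographically. This is a toy sketch of my own, standing in for the hyperreals mentioned above, not an implementation of them:

```python
from functools import total_ordering

@total_ordering
class Value:
    """A value a*H + b for a fixed infinite unit H.
    Comparison is lexicographic: infinite parts dominate."""
    def __init__(self, infinite=0, finite=0.0):
        self.infinite = infinite
        self.finite = finite

    def __add__(self, other):
        return Value(self.infinite + other.infinite,
                     self.finite + other.finite)

    def __eq__(self, other):
        return (self.infinite, self.finite) == (other.infinite, other.finite)

    def __lt__(self, other):
        return (self.infinite, self.finite) < (other.infinite, other.finite)

life = Value(infinite=1)     # one life: infinitely valuable
bread = Value(finite=1.0)    # a slice of bread: finite value

# Two lives really are worth more than one (unlike reals plus a single
# absorbing "infinity"):
assert life + life > life
# But no amount of bread ever reaches the value of one life:
assert Value(finite=1e12) < life
```

Unlike the ordinal version, this addition is commutative, which matches the hyperreals; the continuity axiom still fails, since no lottery over bread alone is ever indifferent to a life.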

Comment author: Eugine_Nier 03 September 2012 03:03:49AM 1 point [-]

You could use hyperreal numbers.

At which point why not just re-normalize everything so that you're only dealing with reals?

Comment author: DanielLC 03 September 2012 04:33:34AM 0 points [-]

You could have something have infinite value and something else have finite value. Since this has an infinitesimal chance of actually mattering, it's a silly thing to do. I was just pointing out that you could assign something infinite utility and have it make sense.

Comment author: Decius 03 September 2012 03:32:35AM 0 points [-]

Also, if you multiply any hyperreal number besides zero by a real number, you get something isomorphic to the reals,

True

so you can multiply by infinity and it still will work the same.

Only if you have a way to describe infinity in terms of a real number.

Comment author: DanielLC 03 September 2012 04:35:58AM 1 point [-]

Only if you have a way to describe infinity in terms of a real number.

You just pick some infinite hyperreal number and multiply all the real numbers by that. What's the problem?

Comment author: peter_hurford 01 September 2012 06:16:01PM 15 points [-]

"Our planet is a lonely speck in the great enveloping cosmic dark. In our obscurity -- in all this vastness -- there is no hint that help will come from elsewhere to save us from ourselves. It is up to us." - Sagan

Comment author: buybuydandavis 03 September 2012 11:21:44AM *  9 points [-]

Rorschach: You see, Doctor, God didn't kill that little girl. Fate didn't butcher her and destiny didn't feed her to those dogs. If God saw what any of us did that night he didn't seem to mind. From then on I knew... God doesn't make the world this way. We do.

EDIT: Quote above is from the movie.

Comment author: Ezekiel 03 September 2012 02:19:52PM *  8 points [-]

Verbatim from the comic:

It is not God who kills the children. Not fate that butchers them or destiny that feeds them to the dogs. It's us.
Only us.

I personally think that Watchmen is a fantastic study* of all the different ways people react to that realisation.

("Study" in the artistic sense rather than the scientific.)

Comment author: peter_hurford 01 September 2012 06:18:48PM 20 points [-]

"In a society in which the narrow pursuit of material self-interest is the norm, the shift to an ethical stance is more radical than many people realize. In comparison with the needs of people starving in Somalia, the desire to sample the wines of the leading French vineyards pales into insignificance. Judged against the suffering of immobilized rabbits having shampoos dripped into their eyes, a better shampoo becomes an unworthy goal. An ethical approach to life does not forbid having fun or enjoying food and wine, but it changes our sense of priorities. The effort and expense put into buying fashionable clothes, the endless search for more and more refined gastronomic pleasures, the astonishing additional expense that marks out the prestige car market from the market in cars for people who just want a reliable means of getting from A to B, all these become disproportionate to people who can shift perspective long enough to take themselves, at least for a time, out of the spotlight. If a higher ethical consciousness spreads, it will utterly change the society in which we live." -- Peter Singer

Comment author: Dolores1984 01 September 2012 09:54:11PM 3 points [-]

An ethical approach to life does not forbid having fun or enjoying food and wine

I'm not at all convinced of this. It seems to me that a genuinely ethical life requires extraordinary, desperate asceticism. Anything less is to place your own wellbeing above those of your fellow man. Not just above, but many orders of magnitude above, for even trivial luxuries.

Comment author: MixedNuts 01 September 2012 10:08:42PM 19 points [-]

Julia Wise would disagree, on the grounds that this is impossible to maintain and you do more good if you stay happy.

Comment author: Dolores1984 02 September 2012 06:00:50AM 0 points [-]

That sounds to me like exactly the sort of excuse a bad person would use to justify valuing their selfish whims over the lives of other people. If we're holding our ideas to scrutiny, I think the claim that the 'Sunday Catholic' school of ethics is consistent deserves a long, hard look.

Comment author: Desrtopa 02 September 2012 07:45:15PM 16 points [-]

Julia Wise holds the distinction of having actually tried it though. Few people are selfless enough to even make the attempt.

Comment author: [deleted] 02 September 2012 10:28:28PM *  25 points [-]

We're talking about a person who, along with her partner, gives to efficient charity twice as much money as she spends on herself. There's no way she doesn't actually believe what she says and still does that.

Comment author: prase 03 September 2012 12:09:24AM 4 points [-]

That she gives more than most others doesn't imply that her belief (that giving even more is practically impossible) isn't hypocritical. Yes, she very likely believes it, thus it is not a conscious lie, but only a small minority of falsities are conscious lies.

Comment author: Eliezer_Yudkowsky 03 September 2012 05:31:51AM 19 points [-]

Yeah, but there's also a certain plausibility to the heuristic which says that you don't get to second-guess her knowledge of what works for charitable giving until you're - not giving more - but at least playing in the same order of magnitude as her. Maybe her pushing a little bit harder on that "hypocrisy" would cause her mind to collapse, and do you really want to second-guess her on that if she's already doing more than an order of magnitude better than what your own mental setup permits?

Comment author: [deleted] 03 September 2012 08:10:17AM 2 points [-]

There's an Italian proverb “Everybody is a faggot with other people's asses”, meaning more or less “everyone is an idealist when talking about issues that don't directly affect them, or situations they have never experienced personally”.

Comment author: [deleted] 03 September 2012 08:01:35AM 1 point [-]

You're using hypocritical in a weird way -- I'd only normally use it to mean ‘lying’, not ‘mistaken’.

Comment author: faul_sname 03 September 2012 12:43:57AM 8 points [-]

That sounds to me like exactly the sort of excuse a bad person would use to justify valuing their selfish whims over the lives of other people.

Is it justified? Pretend we care nothing for good and bad people. Do these "bad people" do more good than "good people"?

Comment author: [deleted] 03 September 2012 12:52:21AM 7 points [-]

Do you live a life of extraordinary, desperate asceticism? If not, why not? If so, are you happy?

Comment author: katydee 02 September 2012 11:39:57PM 3 points [-]

And the great philosopher Diogenes would disagree with her.

Comment author: RomanDavis 03 September 2012 05:53:39AM 8 points [-]

So, how many lives did he save again?

Clever guy, but I'm not sure if you want to follow his example.

Comment author: prase 02 September 2012 09:35:19PM *  19 points [-]

As it is probably intended, the more reminders like this I read, the more ethical I should become. As it actually works, the more of this I read, the less I become interested in ethics. Maybe I am extraordinarily selfish and this effect doesn't happen to most, but it should be at least considered that constant preaching of moral duties can have counterproductive results.

Comment author: RobinZ 02 September 2012 10:31:15PM 18 points [-]

xkcd reference.

Not to mention the remarks of Mark Twain on a fundraiser he attended once:

Well, Hawley worked me up to a great state. I couldn't wait for him to get through [his speech]. I had four hundred dollars in my pocket. I wanted to give that and borrow more to give. You could see greenbacks in every eye. But he didn't pass the plate, and it grew hotter and we grew sleepier. My enthusiasm went down, down, down - $100 at a time, till finally when the plate came round I stole 10 cents out of it. [Prolonged laughter.] So you see a neglect like that may lead to crime.

Comment author: NancyLebovitz 03 September 2012 02:24:49AM *  4 points [-]

It might be worth taking a look at Karen Horney's work. She was an early psychoanalyst who wrote that if a child is abused, neglected, or has normal developmental stages overly interfered with, they are at risk of concluding that just being a human being isn't good enough, and will invent inhuman standards for themselves.

I'm working on understanding the implications (how do you get living as a human being right? :-/ ), but I think she was on to something.

Comment author: Eliezer_Yudkowsky 03 September 2012 05:27:27AM 17 points [-]

I wasn't abused or neglected. Did she check experimentally that abuse or neglect is more prevalent among rationalists than in the general population?

Of course that's not something a human would ordinarily do to check a plausible-sounding hypothesis, so I guess she probably didn't, unless something went horribly wrong in her childhood.

Comment author: NancyLebovitz 03 September 2012 06:00:47AM 1 point [-]

I was thinking about prase in particular, who sounds as though he might have some problems with applying high standards in a way that's bad for him.

Horney died in 1952, so she might not have had access to rationalists in your sense of the word.

When I said it might be worth taking a look at Horney's work, I really did mean I thought it might be worth exploring, not that I'm very sure it applies. It seems to be of some use for me.

Comment author: NancyLebovitz 03 September 2012 11:42:35AM 3 points [-]

Second thought: Maybe I should have not mentioned her theory about why people adopt inhuman standards, and just focused on the idea that inhuman standards are likely to backfire, as Viliam_Bur did.

Also-- if I reread I'll check this-- I think Horney focused on inhuman standards of already having a quality, which is not quite the same thing as having inhuman standards about what one ought to achieve, though I think they're related.

Comment author: Viliam_Bur 03 September 2012 09:18:02AM *  19 points [-]

I suspect it's because authors of "ethical reminders" are usually very bad at understanding human nature.

What they essentially do is associate "ethical" with "unpleasant", because as long as you have some pleasure, you are obviously not ethical enough; you could do better by giving up some more pleasure, and it's bad that you refuse to do so. The attention is drawn away from good things you are really doing, to the hypothetical good things you are not doing.

But humans are usually driven by small incentives, by short-term feelings. The best thing our rationality can do is better align these short-term feelings with our long-term goals, so we actually feel happy when contributing to our long-term goals. And how exactly are these "ethical reminders" contributing to the process? Mostly by undercutting your short-term ethical motivators, by always reminding you that what you did was not enough, therefore you don't deserve the feelings of satisfaction. Gradually they turn these motivators off, and you no longer feel like doing anything ethical, because they convinced you (your "elephant") that you can't.

Ethics without understanding human nature is just a pile of horseshit. Of course that does not prevent other people from admiring those who speak it.

Comment author: peter_hurford 01 September 2012 06:19:13PM 3 points [-]

"Is this a victory or a defeat? Is this justice or injustice? Is it gallantry or a rout? Is it valor to kill innocent children and women? Do I do it to widen the empire and for prosperity or to destroy the other's kingdom and splendor? One has lost her husband, someone else a father, someone a child, someone an unborn infant... What's this debris of the corpses?" -- Ashoka

Comment author: peter_hurford 01 September 2012 06:19:37PM 30 points [-]

"He who receives an idea from me, receives instruction himself without lessening mine; as he who lights his candle at mine, receives light without darkening me. No one possesses the less of an idea, because every other possesses the whole of it." - Jefferson

Comment author: Eliezer_Yudkowsky 01 September 2012 07:56:47PM 20 points [-]

"Nontrivial measure or it didn't happen." -- Aristosophy

(Who's Kate Evans? Do we know her? Aristosophy seems to have rather a lot of good quotes.)

Comment author: Alicorn 01 September 2012 08:08:35PM 22 points [-]

*cough*

"I made my walled garden safe against intruders and now it's just a walled wall." -- Aristosophy

Comment author: RomanDavis 01 September 2012 10:30:40PM *  19 points [-]

Attachment? This! Is! SIDDHARTHA!

Is that you? That's ingenious.

For more rational flavor:

Live dogmatic, die wrong, leave a discredited corpse.

This should be the summary for entangled truths:

To find the true nature of a thing, find the true nature of all other things and look at what is left over.

How to seem and be deep:

Blessed are those who can gaze into a drop of water and see all the worlds and be like who cares that's still zero information content.

Dark Arts:

The master said: "The master said: "The master said: "The master said: "There is no limit to the persuasive power of social proof.""""

More Dark Arts:

One wins a dispute, not by minimising potential counterarguments' plausibility, but by maximising their length.

Luminosity:

Have you accepted your brain into your heart?

Comment author: Alicorn 01 September 2012 10:42:11PM 6 points [-]

No, I'm not her. I don't know who she is, but her Twitter is indeed glorious. (And Google Reader won't let me subscribe to it the way I'm subscribed to other Twitters, rar.)

Comment author: RomanDavis 01 September 2012 10:51:42PM *  15 points [-]

She's got to be from here. Here's "learning biases can hurt people":

Heuristics and biases research: gaslighting the human race?

Cryonics:

"Are you signed up for Christonics?" "No, I'm still prochristinating."

I'm starting to think this is someone I used to know from tvtropes.

Comment author: Unnamed 02 September 2012 07:24:06AM 0 points [-]
Comment author: Alicorn 02 September 2012 05:16:03PM 0 points [-]

I found that, but it won't let me subscribe to it with Google Reader, only with other things I don't use.

Comment author: Unnamed 02 September 2012 08:05:04PM 0 points [-]

That's odd - I did subscribe to it with Google Reader, right before I posted the link.

Comment author: Alicorn 02 September 2012 08:12:02PM 0 points [-]

My bookmarklet says "can't find a feed", and the dropdown menu doesn't offer Google Reader as far as I can tell. How did you do it?

Comment author: Unnamed 02 September 2012 09:41:17PM 0 points [-]

"Google" was auto-selected in my dropdown menu, so it was straightforward for me, same as always. Two clicks, one on Subscribe, the second to indicate Google Reader rather than Google Homepage.

Not sure how much troubleshooting help I can give you. Does that page at least show the recent tweets? Are you logged into Google? Maybe try going to your Google Reader page and entering the url there in the subscribe-to-a-new-feed place?

Comment author: Alicorn 02 September 2012 09:50:24PM 0 points [-]

Does that page at least show the recent tweets?

Yes.

Are you logged into Google?

Yes.

Maybe try going to your Google Reader page and entering the url there in the subscribe-to-a-new-feed place?

Doesn't work, says it can't find it. I don't know why; this is how I've subscribed to Twitters in the past.

Comment author: [deleted] 01 September 2012 09:06:54PM 0 points [-]

The best way for doubters to control a questionable new technology is to embrace it, lest it remain wholly in the hands of enthusiasts who think there is nothing questionable about it.

Stewart Brand

(from Bret Victor's excellent quotes page)

Comment author: Zvi 01 September 2012 09:10:38PM 17 points [-]

Subway ad: "146 people were hit by trains in 2011. 47 were killed."

Guy on Subway: "That tells me getting hit by a train ain't that dangerous."

  • Nate Silver, on his Twitter feed @fivethirtyeight
Comment author: [deleted] 02 September 2012 12:27:26AM 9 points [-]

Wait, 32% probability of dying “ain't that dangerous”? Are you f***ing kidding me?
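
The 32% is just the two numbers from the ad divided; a quick sanity check using the figures as quoted:

```python
# Figures from the subway ad: 146 people hit by trains, 47 killed
hit, killed = 146, 47
fatality_rate = killed / hit
print(f"{fatality_rate:.0%}")  # 32%
```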

Comment author: [deleted] 02 September 2012 12:37:46AM 22 points [-]

If I expect to be hit by a train, I certainly don't expect a ~68% survival chance. Not intuitively, anyways.

Comment author: radical_negative_one 02 September 2012 04:25:22PM 17 points [-]

I'm guessing that even if you survive, your quality of life is going to take a hit. Accounting for this will probably bring our intuitive expectation of harm closer to the actual harm.

Comment author: [deleted] 02 September 2012 10:20:58PM 3 points [-]

Hmmm, I can't think of any way of figuring out what probability I would have guessed if I had to guess before reading that. Damn you, hindsight bias!

(Maybe you could spell out and rot-13 the second figure in the ad...)

Comment author: faul_sname 02 September 2012 11:20:29PM *  2 points [-]

I would expect something like that chance. Being hit by a train will be very similar to landing on your side or back after falling 3 to 10 meters (I'm guessing most people hit by trains are at or near a train station, so the impacts will be relatively slow). So the fatality rate should be similar.

Of course, that prediction gives a fatality rate of only 5-20%, so I'm probably missing something.

Comment author: khafra 03 September 2012 12:36:48AM 4 points [-]

There's the whole crushing and high voltage shock thing, depending on how you land.

Comment author: Ezekiel 02 September 2012 01:16:02AM 28 points [-]

My brain technically-not-a-lies to me far more than it actually lies to me.

-- Aristosophy (again)

Comment author: J_Taylor 02 September 2012 03:33:00AM *  11 points [-]

Major Greene this evening fell into some conversation with me about the Divinity and satisfaction of Jesus Christ. All the argument he advanced was, "that a mere creature or finite being could not make satisfaction to infinite justice for any crimes," and that "these things are very mysterious."

Thus mystery is made a convenient cover for absurdity.

  • John Adams
Comment author: katydee 02 September 2012 09:11:35AM *  2 points [-]

It's not easy to learn a new language. We are all used to speaking in a vague verbal language when expressing degrees of belief. In daily life, this language serves us quite well and the damage caused by its ambiguity is minor, but for important decisions it is helpful to use numbers to express degrees of belief. It may be more difficult to elicit numbers, but it is much more efficient. We understand each other better, numerical expressions are more sensitive to small differences in our feelings, and in the end, our decision processes will be better.

From "An Elementary Approach to Thinking Under Uncertainty," by Ruth Beyth-Marom, Shlomith Dekel, Ruth Gombo, & Moshe Shaked.

Comment author: Kaj_Sotala 02 September 2012 07:12:41PM 2 points [-]
Comment author: Jayson_Virissimo 03 September 2012 09:41:21AM *  3 points [-]

Arguably, assigning a particular floating point number between 0.0 and 1.0 to represent subjective degrees of belief is a specialized skill and it could take years of practice in order to become fluent in numerical-probability-speak.* Another possibility is that it merely adds a kind of pseudo-precision without any benefit over natural language.

In any case, it seems to be an empirical question and so should be answered with empirical data. I guess we won't really know until we have a good-sized number of people using things such as PredictionBook for extended periods of time. I'll keep you posted.

*There do exist rigorously defined verbal probabilities, but as far as I know they haven't been used much since the Late Middle Ages/Early Modern Period.

Comment author: Jay_Schweikert 02 September 2012 05:48:43PM *  26 points [-]

Qhorin Halfhand: The Watch has given you a great gift. And you only have one thing to give in return: your life.

Jon Snow: I'd gladly give my life.

Qhorin Halfhand: I don’t want you to be glad about it! I want you to curse and fight until your heart’s done pumping.

--Game of Thrones, Season 2.

Comment author: Ezekiel 02 September 2012 10:54:07PM 8 points [-]

And you only have one thing to give in return: your life.

Also effort, expertise, and insider information on one of the most powerful Houses around. And magic powers.

Comment author: RomanDavis 03 September 2012 05:25:55AM 0 points [-]

He has magic powers?

Comment author: Ezekiel 03 September 2012 05:52:19AM 1 point [-]

Rot13'd for minor spoiling potential: Ur'f n jnet / fxvapunatre.

Comment author: Rhwawn 03 September 2012 12:20:30AM 26 points [-]

Reminds me of Patton:

No man ever won a war by dying for his country. Wars were won by making the other poor bastard die for his.

Comment author: [deleted] 02 September 2012 06:19:19PM 1 point [-]

We must remember that, strictly speaking, "formal" does not mean merely "rigorous", but "according to form". Meaning need be ascribed only to the result of a formal process. It is not needed to guide the process itself. We ascribe meaning to intermediate formal states primarily, nay solely, to reassure ourselves.

Guy Steele

Comment author: simplicio 02 September 2012 07:32:28PM 1 point [-]

Can you elaborate on what this is getting at?

Comment author: RomanDavis 02 September 2012 07:47:40PM *  1 point [-]

You shouldn't be deceived by the use of the word formal as an applause light.

Comment author: [deleted] 02 September 2012 07:53:32PM *  2 points [-]

I think the message is pretty similar to this quote. Put another way: be careful to not favor the letter of the law over the spirit of the law. Which is hard because brains prize anything that spares them work, and the letter of the law is (I'm guessing) more compressible than its spirit.

Comment author: katydee 02 September 2012 06:51:54PM 15 points [-]

When we were first drawn together as a society, it had pleased God to enlighten our minds so far as to see that some doctrines, which we once esteemed truths, were errors; and that others, which we had esteemed errors, were real truths. From time to time He has been pleased to afford us farther light, and our principles have been improving, and our errors diminishing.

Now we are not sure that we are arrived at the end of this progression, and at the perfection of spiritual or theological knowledge; and we fear that, if we should once print our confession of faith, we should feel ourselves as if bound and confin'd by it, and perhaps be unwilling to receive farther improvement, and our successors still more so, as conceiving what we their elders and founders had done, to be something sacred, never to be departed from.

Michael Welfare, quoted in The Autobiography of Benjamin Franklin

Comment author: katydee 02 September 2012 09:01:21PM 32 points [-]

Lady Average may not be as good-looking as Lady Luck, but she sure as hell comes around more often.

Anonymous

Comment author: wallowinmaya 02 September 2012 10:30:35PM 34 points [-]

Nobody is smart enough to be wrong all the time.

Ken Wilber

Comment author: CronoDAS 03 September 2012 12:28:48AM 2 points [-]

Unless you're a fictional character. Or possibly Mike "Bad Player" Flores:

There is an episode of Seinfeld where George—a lifelong screw up—decides to do the opposite of his natural instincts and impulses at every turn. He has a great day, lands his job at the Yankees, etc.

I was a superb Onslaught drafter, but there was probably a reason my buddy Scott had a dim confidence in my game play. So I decided to draft normally but pull a George and do the opposite of everything I was inclined to in game.

The result was a Day 2 with a terrible Sealed deck and 3-0 / 6-0 in my first draft. I needed 2-1 for Top 8.

The "Even Steven" part is that at that point I was so full of myself I forgot to do the opposite of what I wanted to do and made about three important mistakes... Exactly enough to land myself one point out of Top 8 at Grand Prix Boston (Kibler won).

Comment author: MileyCyrus 03 September 2012 03:11:36AM 15 points [-]

Lol, my professor would give a 100% to anyone who answered every exam question wrong. There were a couple people who pulled it off, but most scored somewhere between 0 and 10.

Comment author: Decius 03 September 2012 03:26:56AM 11 points [-]

I'm assuming a multiple-choice exam, and invalid answers don't count as 'wrong' for that purpose?

Otherwise I can easily miss the entire exam with "Tau is exactly six." or "The battle of Thermopylae" repeated for every answer. Even if the valid answers are [A;B;C;D].

Comment author: Alejandro1 03 September 2012 03:35:59AM 29 points [-]

"But I tell you he couldn't have written such a note!" cried Flambeau. "The note is utterly wrong about the facts. And innocent or guilty, Dr Hirsch knew all about the facts."

"The man who wrote that note knew all about the facts," said his clerical companion soberly. "He could never have got 'em so wrong without knowing about 'em. You have to know an awful lot to be wrong on every subject—like the devil."

"Do you mean—?"

"I mean a man telling lies on chance would have told some of the truth," said his friend firmly. "Suppose someone sent you to find a house with a green door and a blue blind, with a front garden but no back garden, with a dog but no cat, and where they drank coffee but not tea. You would say if you found no such house that it was all made up. But I say no. I say if you found a house where the door was blue and the blind green, where there was a back garden and no front garden, where cats were common and dogs instantly shot, where tea was drunk in quarts and coffee forbidden—then you would know you had found the house. The man must have known that particular house to be so accurately inaccurate."

--G.K. Chesterton, "The Duel of Dr. Hirsch"

Comment author: RomanDavis 03 September 2012 05:23:47AM 1 point [-]

Reversed malevolence is intelligence?

Comment author: Viliam_Bur 03 September 2012 08:18:07AM *  24 points [-]

Inverted information is not random noise.

Comment author: Bruno_Coelho 03 September 2012 12:15:45AM 1 point [-]

There is not a man living whom it would so little become to speak from memory as myself, for I have scarcely any at all, and do not think that the world has another so marvellously treacherous as mine.

-- Montaigne

Comment author: ChrisHallquist 03 September 2012 06:22:54AM *  23 points [-]

“Why do you read so much?”

Tyrion looked up at the sound of the voice. Jon Snow was standing a few feet away, regarding him curiously. He closed the book on a finger and said, “Look at me and tell me what you see.”

The boy looked at him suspiciously. “Is this some kind of trick? I see you. Tyrion Lannister.”

Tyrion sighed. “You are remarkably polite for a bastard, Snow. What you see is a dwarf. You are what, twelve?”

“Fourteen,” the boy said.

“Fourteen, and you’re taller than I will ever be. My legs are short and twisted, and I walk with difficulty. I require a special saddle to keep from falling off my horse. A saddle of my own design, you may be interested to know. It was either that or ride a pony. My arms are strong enough, but again, too short. I will never make a swordsman. Had I been born a peasant, they might have left me out to die, or sold me to some slaver’s grotesquerie. Alas, I was born a Lannister of Casterly Rock, and the grotesqueries are all the poorer. Things are expected of me. My father was the Hand of the King for twenty years. My brother later killed that very same king, as it turns out, but life is full of these little ironies. My sister married the new king and my repulsive nephew will be king after him. I must do my part for the honor of my House, wouldn’t you agree? Yet how? Well, my legs may be too small for my body, but my head is too large, although I prefer to think it is just large enough for my mind. I have a realistic grasp of my own strengths and weaknesses. My mind is my weapon. My brother has his sword, King Robert has his warhammer, and I have my mind… and a mind needs books as a sword needs a whetstone, if it is to keep its edge.” Tyrion tapped the leather cover of the book. “That’s why I read so much, Jon Snow.”

--George R. R. Martin, A Game of Thrones

Comment author: ArisKatsaris 03 September 2012 08:37:29AM 6 points [-]

I think the quote could be trimmed to its last couple of sentences and still maintain the relevant point.

Comment author: ChrisHallquist 03 September 2012 08:43:42AM *  6 points [-]

Oh, totally. But I prefer the full version; it's really a beautifully written passage.

Comment author: RobinZ 03 September 2012 04:22:57PM 29 points [-]

I disagree, in fact. That books strengthen the mind is baldly asserted, not supported, by this quote - the rationality point I see in it is related to comparative advantage.

Comment author: Jayson_Virissimo 03 September 2012 08:59:50AM 33 points [-]

...beliefs are like clothes. In a harsh environment, we choose our clothes mainly to be functional, i.e., to keep us safe and comfortable. But when the weather is mild, we choose our clothes mainly for their appearance, i.e., to show our figure, our creativity, and our allegiances. Similarly, when the stakes are high we may mainly want accurate beliefs to help us make good decisions. But when a belief has few direct personal consequences, we in effect mainly care about the image it helps to project.

-Robin Hanson, Human Enhancement

Comment author: buybuydandavis 03 September 2012 10:45:47AM 15 points [-]

I think he's mischaracterizing the issue.

Beliefs serve multiple functions. One is modeling accuracy, another is signaling. It's not whether the environment is harsh or easy, it's which function you need. There are many harsh environments where what you need is the signaling function, and not the modeling function.

Comment author: Matt_Caulfield 03 September 2012 03:55:04PM 16 points [-]

It may be of course that savages put food on a dead man because they think that a dead man can eat, or weapons with a dead man because they think a dead man can fight. But personally I do not believe that they think anything of the kind. I believe they put food or weapons on the dead for the same reason that we put flowers, because it is an exceedingly natural and obvious thing to do. We do not understand, it is true, the emotion that makes us think it is obvious and natural; but that is because, like all the important emotions of human existence it is essentially irrational.

  • G. K. Chesterton