Rationality Quotes March 2012

Post author: Thomas 03 March 2012 08:04AM

Here's the new thread for posting quotes, with the usual rules:

  • Please post all quotes separately, so that they can be voted up/down separately.  (If they are strongly related, reply to your own comments.  If strongly ordered, then go ahead and post them together.)
  • Do not quote yourself
  • Do not quote comments/posts on LW/OB
  • No more than 5 quotes per person per monthly thread, please.

 

Comments (525)

Comment author: Thomas 01 March 2012 08:36:37AM 6 points [-]

How extremely stupid not to have thought of that!

Thomas Henry Huxley - about Darwin's theory of evolution

Comment author: [deleted] 01 March 2012 10:47:51AM 30 points [-]

Meh. That's just hindsight bias.

All truths are easy to understand when they are revealed; what's hard is to find them out.

Galileo Galilei (translated by me)

Comment author: Thomas 01 March 2012 12:04:50PM 0 points [-]

Generally, yes. But in this particular case we can trust that the man later known as Darwin's bulldog really felt that way and that this was a justified statement. He obviously understood the matter well.

All those English animal breeders had a good insight. It was more or less a wild generalization for them. Not so wild for Huxley.

Comment author: ciphergoth 01 March 2012 03:56:06PM 0 points [-]

So that I can google for it - what's the original text? Thanks!

Comment author: [deleted] 01 March 2012 04:36:56PM *  6 points [-]

The version I've read is "Tutte le verità sono facili da capire quando sono rivelate, il difficile è scoprirle!" But that sounds like suspiciously modern Italian to me, so I wouldn't be surprised to find out that it's itself a paraphrase.

ETA: Apparently it was quoted in Criminal Minds, season 6, episode 11, and I suspect the Italian dubbing backtranslated the English version of the show rather than looking for the original wording by Galileo. (Which would make my version above a third-level translation.)

ETA2: In the original version of Criminal Minds, it's "All truths are easy to understand once they are discovered; the point is to discover them" according to Wikiquote. (How the hell did point become difficile? And why were the two instances of discover translated with different verbs? That's why I always watch shows and films in the original language!)

ETA3: And Wikiquote attributes that as “As quoted in Angels in the workplace : stories and inspirations for creating a new world of work (1999) by Melissa Giovagnoli”.

Comment author: Eliezer_Yudkowsky 03 March 2012 08:10:32AM 18 points [-]

With the great historical exception of quantum mechanics.

Comment author: Thomas 03 March 2012 08:48:22AM 4 points [-]

In fact, most people don't understand relativity. Most still reject evolution. It wasn't easy to understand the Copernican system in Galileo's time.

When a major new breakthrough is made, it is easy to understand for a handful, and it seems obvious only to a few. Galileo was wrong. It may be easier, but not "easy to understand once a truth is revealed".

Comment author: [deleted] 03 March 2012 10:27:24AM *  1 point [-]

It wasn't easy to understand the Copernican system in Galileo's time.

I suppose people didn't understand it because they didn't want to, not because they couldn't manage to. (Same with evolution -- what the OP was about. I might agree about relativity, though I guess for some people at least the absolute denial macro does play some part.)

Galileo was wrong.

More like stuff that was true back then is no longer true now.

Comment author: Thomas 03 March 2012 10:55:37AM 2 points [-]

I suppose people didn't understand it because they didn't want to

I suppose not. Why? People either have an inborn concept of an absolute up-down direction, or they develop it early in life. Updating to a round (let alone moving and rotating) Earth is not easy or trivial for the naive mind of a child or for a medieval man.

A new truth is usually hard to understand for everybody. Had it not been so, science would progress faster.

Comment author: [deleted] 03 March 2012 11:25:15AM *  2 points [-]

I don't see how that contradicts my claim that it's not that people couldn't understand the meaning of the statement “the Earth revolves around the Sun”, but rather they disagreed with it because it was at odds with what they thought of the world. iħ∂|Ψ⟩/∂t = Ĥ|Ψ⟩, now that's a statement most people won't even understand enough to tell whether they think it's true or false.

Comment author: [deleted] 03 March 2012 10:33:44AM *  2 points [-]

Historical? I know you count many worlds as “understanding”, but I wouldn't until this puzzle is figured out. (Or maybe it's that I like Feynman's (in)famous quote so much I want to keep on using it, even if this means using a narrower meaning for understand.)

Comment author: Giles 04 March 2012 03:05:55PM 4 points [-]

I would say instead that many truths are easy to understand once you understand them. But still hard to explain to other people.

Comment author: Stabilizer 01 March 2012 09:29:18AM 17 points [-]

To be a good diagnostician, a physician needs to acquire a large set of labels for diseases, each of which binds an idea of the illness and its symptoms, possible antecedents and causes, possible developments and consequences, and possible interventions to cure or mitigate the illness. Learning medicine consists in part of learning the language of medicine. A deeper understanding of judgments and choices also requires a richer vocabulary than is available in everyday language. The availability of a diagnostic label for [the] bias... makes it easier to anticipate, recognize and understand.

-Daniel Kahneman, Thinking, Fast and Slow

Comment author: khafra 01 March 2012 01:10:25PM 4 points [-]

Yeah, a good compression algorithm -- a dictionary that has short words for the important stuff -- is vital to learning just about anything. I've noticed that in the martial arts; there's no way to learn a parry, entry, and takedown without a somatic vocabulary for their subparts, and the definitions of your "words" affect both the ease of learning and the effectiveness of execution.

Comment author: Stabilizer 01 March 2012 07:06:15PM 0 points [-]

Interesting. So by somatic vocabulary, you basically mean composing long complicated moves from short, repeatable sub-moves?

Comment author: khafra 01 March 2012 08:04:24PM 1 point [-]

Basically, yes. Much of the vocabulary has very long descriptions in English, but shorter ones in different arts' parlance; some of it doesn't really have short descriptions anywhere but in the movements of people who've mastered it. The Epistemic Viciousness problem makes it difficult, in general, to find and cleave at the joints.

Comment author: Stabilizer 02 March 2012 03:03:24AM 3 points [-]

Also, wouldn't it be better to call it a hash table or a lookup table rather than a compression algorithm? The key is swift and appropriate recall. Example: compare a long-time practicing theoretical physicist with a physics grad student. Both know most of basic quantum mechanics. But the experienced physicist knows when to whip out which equation in which situation. So the knowledge content is not necessarily compressed (I'm sure there is some compression) as much as the usability of the knowledge is much greater.

Comment author: Cthulhoo 01 March 2012 10:18:27AM 20 points [-]

When I disagree with a rational man, I let reality be our final arbiter; if I am right, he will learn; if I am wrong, I will; one of us will win, but both will profit.

Ayn Rand

Comment author: florian 01 March 2012 12:11:53PM 32 points [-]

Making the (flawed) assumption that in a disagreement, they cannot both be wrong.

Comment author: peter_hurford 01 March 2012 05:22:16PM 16 points [-]

Also, they could be wrong about whether they actually disagree.

Comment author: [deleted] 02 March 2012 09:08:48PM 3 points [-]

IME that's the case in a sizeable fraction of disagreements between humans; but if they “let reality be [their] final arbiter” they ought to realize that in the process.

Comment author: shokwave 02 March 2012 12:24:16AM 3 points [-]

one of us might win, but both will profit.

I have also heard it quoted like this.

Comment author: Giles 04 March 2012 02:39:43PM *  5 points [-]

I think that if the other person convinces you that they are right and they are right, then it should count as "winning the argument". It's the idea that has lost, not you.

Comment author: NancyLebovitz 01 March 2012 12:34:26PM *  4 points [-]

I wouldn’t say that we defy the limit, I’d say that we reexamine it, by very carefully considering the set of cases that actually matter.

--- pseudonym

Comment author: bungula 01 March 2012 01:30:50PM 13 points [-]

It's the Face of Boe. I'm absolutely certain about this, absolutely positive. Of course I'll probably turn out to be incorrect.

Sam Hughes, talking about the first season finale of Doctor Who, differentiating between the subjective feeling of certainty and the actual probability estimate.

Comment author: James_Miller 01 March 2012 03:22:58PM 14 points [-]

Had no idea so much strategy was possible in Rock, Paper, Scissors? The rules of the game itself may be simple, but the human mind is not.

Natalie Wolchover

Comment author: Giles 04 March 2012 02:35:08PM 3 points [-]

I saw on TV some kid lose convincingly against a RPS champion when the kid had been given a prepared (random) list of moves to make ahead of time. That can't be explained by strategy - it was either coincidence or it's possible to cheat by seeing which way your opponent's hand is unfolding and change your move at the last moment.
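Giles's point can be made quantitative: against a genuinely uniform random list of moves fixed in advance, no strategy has an edge in expectation, so a convincing loss needs another explanation. A minimal simulation sketch (the setup and names are illustrative, not from the broadcast):

```python
import random

# In rock-paper-scissors, each move beats exactly one other move.
BEATS = {"rock": "scissors", "paper": "rock", "scissors": "paper"}

def score(mine, theirs):
    """+1 win, 0 tie, -1 loss, from my perspective."""
    if mine == theirs:
        return 0
    return 1 if BEATS[mine] == theirs else -1

random.seed(0)
moves = list(BEATS)
# The kid's prepared list: uniformly random moves chosen ahead of time.
prepared = [random.choice(moves) for _ in range(100_000)]

# Any strategy that can't see the list -- here, always playing rock --
# nets roughly zero per game in expectation.
avg = sum(score("rock", m) for m in prepared) / len(prepared)
print(round(avg, 3))
```

Since the expected score is zero no matter what the opponent plays, a convincing loss points to something outside the game itself, e.g. reading which way the hand is unfolding mid-throw.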

Comment author: Desrtopa 04 March 2012 02:45:36PM 5 points [-]

The latter is definitely possible. Back when I was still playing RPS as a kid, I was fairly good at it; enough for somewhere upwards of 70% of my plays to be wins.

You don't want to change your move at the last moment though so much as you want to keep your hand in a plausibly formless configuration you can turn into a move at the last moment. Less likely to be called out for cheating.

Comment author: James_Miller 04 March 2012 08:50:42PM 6 points [-]

Or the losers were unintentionally signaling their moves beforehand.

Comment author: [deleted] 01 March 2012 04:00:15PM 8 points [-]

Outrage indicates how outraged individuals want the world to be; evidence tells everyone how the world is.

Tauriq Moosa

Comment author: [deleted] 01 March 2012 04:04:23PM 6 points [-]

Sometimes the only thing left is to know that it’s okay to be uncomfortable.

Wendy Braitman

Comment author: gRR 01 March 2012 04:21:08PM 6 points [-]

The winner is the one who makes the next-to-last mistake.

Ksawery Tartakower

Comment author: Jonathan_Graehl 01 March 2012 07:40:34PM 1 point [-]

I like this even though it violates the correct standard of "mistake": was the choice expected-optimal, before the roll of the die?

I like that it suggests continuing to focus on the rest of the game rather than beating yourself up over a past mistake.

Comment author: bentarm 02 March 2012 01:43:23PM 4 points [-]

I like this even though it violates the correct standard of "mistake": was the choice expected-optimal, before the roll of the die?

Tartakower was a chess player.

Comment author: Jonathan_Graehl 02 March 2012 06:44:56PM 0 points [-]

Somehow I'd imagined chess without really knowing.

The roll of the die is still in effect: unanticipated consequences of only-boundedly-optimal moves by each player can't make the original move more or less of a true mistake.

Comment author: [deleted] 02 March 2012 08:45:28PM 2 points [-]

I like that it suggests continuing to focus on the rest of the game rather than beating yourself up over a past mistake.

Indeed; Tartakower also said "No one ever won a game by resigning".

Comment author: steven0461 02 March 2012 09:05:19PM *  3 points [-]

Suppose White gives away a pawn, and then the next move White accidentally lets Black put him in checkmate. White made the next-to-last mistake, but lost, so the saying must be false in a mundane sense. Is there an esoteric sense in which the saying is true?

Comment author: gRR 03 March 2012 01:56:01AM *  1 point [-]

Hmm, I suppose a "mistake" in the technical sense is defined in terms of minimax position evaluation, assuming infinite computing power:

eval(position) = -1 (loss), 0 (tie), or +1 (win)
IsFatalMistake(move) = eval(position before move) > eval(position after move)
                       AND eval(position after move) == -1

With this definition, either giving away the pawn or missing the checkmate (or both) wasn't a fatal mistake, since the game was already lost before the move :)
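This definition can be made concrete on a toy game. Below is a minimal Python sketch using take-1-or-2 Nim (last stone wins); the game choice and names like `eval_position` are my illustration, not anything from the thread:

```python
from functools import lru_cache

# Toy game: a pile of n stones; players alternate removing 1 or 2 stones;
# whoever takes the last stone wins. No ties, so eval is -1 or +1.

@lru_cache(maxsize=None)
def eval_position(n):
    """Minimax value for the player to move: +1 = win, -1 = loss."""
    if n == 0:
        return -1  # the previous player took the last stone
    return max(-eval_position(n - take) for take in (1, 2) if take <= n)

def is_fatal_mistake(n_before, take):
    """The move worsens the eval AND leaves the mover in a lost game."""
    before = eval_position(n_before)         # mover's view before moving
    after = -eval_position(n_before - take)  # mover's view after moving
    return before > after and after == -1

print(is_fatal_mistake(4, 2))  # True: 4 was winning; leaving 2 loses
print(is_fatal_mistake(3, 1))  # False: 3 was already lost for the mover
```

Note how no move from an already-lost position (a pile that is a multiple of 3) ever registers as a fatal mistake, matching the point about the pawn example above.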

Comment author: Will_Newsome 03 March 2012 01:58:36AM 0 points [-]

On a purely empirical level, most amateur games, once they reach critical positions, are blunderfests punctuated by a few objectively strong moves that decide the game; many complex positions near the end of games are similar blunderfests even among masters. If you assume that the majority of moves are blunders, then Tartakower's point is generally true. But I don't think that's what he meant.

Comment author: [deleted] 03 March 2012 03:01:58PM *  3 points [-]

The winner is the one who makes the next-to-last mistake.

I read this as implying that the loser is the one who makes the last mistake — the mistake that allows his opponent to win.

But yeah, I think the quote is kinda sloppy — it assumes that the opponents take turns in making mistakes.

Comment author: [deleted] 03 March 2012 03:39:51PM *  1 point [-]

the opponents take turns in making mistakes

This is true if you only count as mistakes moves which turn a winning position into a losing position, as gRR said elsethread. (I think I picked up this meaning from Chessmaster 10's automatic analyses, and was implicitly assuming it when reading the Tartakower quote.)

Comment author: [deleted] 04 March 2012 12:30:13AM 1 point [-]

Does "you shouldn't give up after a mistake, because many chess games involve both players, even the winner, making multiple mistakes" count as esoteric?

Comment author: [deleted] 01 March 2012 04:56:33PM 22 points [-]

“Anne!” Anne was seated on the springboard; she turned her head. Jubal called out, “That new house on the far hilltop — can you see what color they’ve painted it?”

Anne looked in the direction in which Jubal was pointing and answered, “It’s white on this side.”

Robert Heinlein, Stranger In A Strange Land

Comment author: HonoreDB 01 March 2012 04:58:22PM 23 points [-]

"Are you trying to tell me that there are sixteen million practicing wizards on Earth?" "Sixteen million four hundred and--" Dairine paused to consider the condition the world was in. "Well it's not anywhere near enough! Make them all wizards."

--Diane Duane, High Wizardry

Comment author: peter_hurford 01 March 2012 05:19:45PM *  31 points [-]

The problem, often not discovered until late in life, is that when you look for things in life like love, meaning, motivation, it implies they are sitting behind a tree or under a rock. The most successful people in life recognize, that in life they create their own love, they manufacture their own meaning, they generate their own motivation. For me, I am driven by two main philosophies, know more today about the world than I knew yesterday. And lessen the suffering of others. You'd be surprised how far that gets you.

-- Neil DeGrasse Tyson

Comment author: DanielLC 01 March 2012 07:09:07PM 5 points [-]

For me, I am driven by two main philosophies

I think he'd do better if he just made up his mind. I'd go with the second one.

Comment author: pedanterrific 01 March 2012 07:41:37PM 27 points [-]

watch out folks, we got a badass over here

Comment author: fortyeridania 02 March 2012 02:01:50AM 4 points [-]

Fits this one, two out of three.

Comment author: gwern 01 March 2012 05:55:59PM 4 points [-]

"In practice replacing digital computers with an alternative computing paradigm is a risky proposition. Alternative computing architectures, such as parallel digital computers have not tended to be commercially viable, because Moore's Law has consistently enabled conventional von Neumann architectures to render alternatives unnecessary. Besides Moore's Law, digital computing also benefits from mature tools and expertise for optimizing performance at all levels of the system: process technology, fundamental circuits, layout and algorithms. Many engineers are simultaneously working to improve every aspect of digital technology, while alternative technologies like analog computing do not have the same kind of industry juggernaut pushing them forward."

--Benjamin Vigoda, "Analog Logic: Continuous-Time Analog Circuits for Statistical Signal Processing" (2003 PhD thesis)

Comment author: RichardKennaway 01 March 2012 08:26:52PM 7 points [-]

Alternative computing architectures, such as parallel digital computers have not tended to be commercially viable, because Moore's Law has consistently enabled conventional von Neumann architectures to render alternatives unnecessary.

And the very next year, Intel abandoned its plans to make 4 GHz processors, and we've been stuck at around 3 GHz ever since.

Since when, parallel computing has indeed had the industry juggernaut behind it.

Comment author: gwern 01 March 2012 08:57:16PM *  0 points [-]

Yep, and that's why we all have dual-core or more now rather than long ago. Parallel computers of various architectures have been around since at least the '50s (mainframes had secondary processors for IO operations, IIRC), but were confined to niches until the frequency wall was hit and the juggernaut had to do something else with the transistors Moore's law was producing.

(I also read this quote as an indictment of the Lisp machine and other language-optimized processor architectures, and more generally, as a Hansonesque warning against 'not invented here' thinking; almost all innovation and good ideas are 'not invented here' and those who forget that will be roadkill under the juggernaut.)

Comment author: gwern 01 March 2012 05:57:07PM 21 points [-]

"All logic texts are divided into two parts. In the first part, on deductive logic, the fallacies are explained; in the second part, on inductive logic, they are committed."

--Morris Raphael Cohen, quoted by Cohen in "The Earth Is Round (p < 0.05)"

Comment author: gwern 01 March 2012 05:58:00PM 25 points [-]

"It's easy to think of yourself as being quite a nice person so long as you live on your own and are the only witness to yourself."

--Alain de Botton

Comment author: Grognor 03 March 2012 08:45:03AM 0 points [-]

How poignant, since every last bit applies to me.

Comment author: gwern 01 March 2012 05:58:13PM 18 points [-]

"Hope always feels like it's made up of a set of reasons: when it's just sufficient sleep and a few auspicious hormones."

--Alain de Botton

Comment author: Will_Newsome 03 March 2012 01:51:57AM 8 points [-]

(Perhaps this individual quote is insightful (I can't tell), but this sort of causal analysis leads to basic confusions of levels of organization more often than it leads to insight.)

Comment author: Alejandro1 01 March 2012 06:23:41PM 15 points [-]

The demons told me that there is a hell for the sentimental and the pedantic. They are abandoned in an endless palace, more empty than full, and windowless. The condemned walk about as if searching for something, and, as we might expect, they soon begin to say that the greatest torment consists in not participating in the vision of God, that moral suffering is worse than physical suffering, and so on. Then the demons hurl them into the sea of fire, from where no one will ever take them out.

Adolfo Bioy Casares (my translation)

Comment author: Incorrect 01 March 2012 06:32:51PM 10 points [-]

The condemned walk about as if searching for something, and, as we might expect, they soon begin to say that the greatest torment consists in not participating in the vision of God, that moral suffering is worse than physical suffering, and so on

Why don't they just play tag with each other? Sounds like it would be fun.

Comment author: fubarobfusco 01 March 2012 10:22:26PM 4 points [-]

Because they're jerks.

Comment author: Alejandro1 02 March 2012 09:06:29PM 5 points [-]

Indeed. The kind of people who would go "Whee! Let's play tag!" in this situation do not find themselves in Hell (at least in this particular one) in the first place.

Comment author: GLaDOS 01 March 2012 07:07:58PM 12 points [-]

I have sometimes seen people try to list what a real intellectual should know. I think it might be more illuminating to list what he shouldn’t.

--Gregory Cochran, in a comment here

Comment author: [deleted] 01 March 2012 09:11:09PM 12 points [-]

Also good, from that comment's OP:

One of the main reasons that I shy away from modern liberalism is a strong commitment to interchangeability and identity across all individuals and populations as a matter of fact, rather than equality as a matter of legal commitment.

Razib Khan

Comment author: GLaDOS 01 March 2012 10:50:43PM *  4 points [-]

Yes but I didn't at first want to post that because it is slightly political. Though I guess the rationality core does outweigh any mind-killing.

Comment author: [deleted] 02 March 2012 03:13:24AM 5 points [-]

You have a Rationality Core, too?

Comment author: [deleted] 02 March 2012 06:45:55AM 7 points [-]

Mine tastes kind of like nougat.

Comment author: NancyLebovitz 03 March 2012 01:01:43PM 7 points [-]

This has 6 karma points, so I'm left curious about whether people have anything in mind about what real intellectuals shouldn't know.

Comment author: Eugine_Nier 04 March 2012 12:39:51AM 2 points [-]

I interpret the quote as saying that to be a "good intellectual" one needs to not know the problems with the positions "good intellectuals" are expected to defend.

Comment author: player_03 04 March 2012 12:46:17AM *  2 points [-]

I could be interpreting it entirely wrong, but I'd guess this is the list Cochran had in mind:

Comment author: FiftyTwo 04 March 2012 06:49:33PM 1 point [-]

My immediate thought was a 'real intellectual' shouldn't fill their brain with random useless information, (e.g. spend their time reading tvtropes).

Comment author: [deleted] 01 March 2012 08:02:07PM 9 points [-]

On our kind not cooperating:

When somebody is doing the right thing, you don't mess with them.

Michelle Obama

Comment author: FiftyTwo 04 March 2012 07:22:34PM 2 points [-]

Sounds like a counter to "Never interrupt your enemy when he is making a mistake." (Attributed but seemingly falsely to Napoleon Bonaparte)

Comment author: arundelo 01 March 2012 08:03:29PM 11 points [-]

When reading, you win if you learn, not if you convince yourself that you know something the author does not know.

-- Reg Braithwaite (raganwald)

Comment author: [deleted] 01 March 2012 09:09:23PM 35 points [-]

False opinions are like false money, struck first of all by guilty men and thereafter circulated by honest people who perpetuate the crime without knowing what they are doing.

--Joseph de Maistre, Les soirées de Saint-Pétersbourg, Ch. I

Comment author: Thomas 02 March 2012 02:15:44PM 4 points [-]

Some guilt also falls onto those who are not eager enough to verify those opinions or the money they circulate.

The man at the top (at the beginning) is NOT guilty of everything.

Comment author: TheOtherDave 02 March 2012 05:38:16PM 15 points [-]

To my way of thinking, it's quite possible for me to be fully responsible for a chain of events (for example, if they would not have occurred if not for my action, and I was aware of the likelihood of them occurring given my action, and no external forces constrained my choice so as to preclude acting differently) and for other people upstream and downstream of me to also be fully responsible for that chain of events. This is no more contradictory than my belief that object A is to the left of object B from one perspective and simultaneously to the right of object B from another. Responsibility is not some mysterious fluid out there in the world that gets portioned out to individuals; it's an attribute that we assign to entities in a mental and/or social model.

You seem to be claiming that models wherein total responsibility for an event is conserved across the entire known causal chain are superior to mental models where it isn't, but I don't quite see why I ought to believe that.

Comment author: philh 01 March 2012 10:32:08PM *  10 points [-]

The Princess Bride:

Man in Black: Inhale this, but do not touch.
Vizzini: [sniffs] I smell nothing.
Man in Black: What you do not smell is called iocane powder. It is odorless, tasteless, dissolves instantly in liquid, and is among the more deadlier poisons known to man.
[He puts the goblets behind his back and puts the poison into one of the goblets, then sets them down in front of him]
Man in Black: All right. Where is the poison? The battle of wits has begun. It ends when you decide and we both drink, and find out who is right... and who is dead.
[Vizzini stalls, then eventually chooses the glass in front of the man in black. They both drink, and Vizzini dies.]
Buttercup: And to think, all that time it was your cup that was poisoned.
Man in Black: They were both poisoned. I spent the last few years building up an immunity to iocane powder.

Comment author: shokwave 02 March 2012 12:18:19AM *  8 points [-]

Man in Black: All right. Where is the poison? The battle of wits has begun.
Vizzini: But it's so simple. All I have to do is divine from what I know of you: are you the sort of man who would put the poison into his own goblet or his enemy's? Now, a clever man would put the poison into his own goblet, because he would know that only a great fool would reach for what he was given. I am not a great fool, so I can clearly not choose the wine in front of you. But you must have known I was not a great fool. You would have counted on it, so I can clearly not choose the wine in front of me.
Man in Black: You've made your decision then?
Vizzini: Not remotely! Because iocane comes from Australia, as everyone knows! And Australia is entirely peopled with criminals. And criminals are used to having people not trust them, as you are not trusted by me, so I can clearly not choose the wine in front of you.
Man in Black: Truly, you have a dizzying intellect.
Vizzini: And you must have suspected I would have known the powder's origin, so I can clearly not choose the wine in front of me.
Man in Black: You're just stalling now.
Vizzini: You'd like to think that, wouldn't you?! You've beaten my giant, which means you're exceptionally strong, so you could've put the poison in your own goblet, trusting on your strength to save you, so I can clearly not choose the wine in front of you! But, you've also bested my Spaniard, which means you must have studied, and in studying you must have learned that man is mortal, so you would have put the poison as far from yourself as possible, so I can clearly not choose the wine in front of me!
...
Man in Black: Then make your choice.
Vizzini: I will, and I choose- ...

Vizzini of the Princess Bride, on the dangers of reasoning in absolutes - both logically ("this is proof it's not in my goblet") and propositionally (the implicit assumption Vizzini has that one and only one wine goblet is poisoned - P or ~P, as it were).

Comment author: philh 02 March 2012 01:49:57AM 24 points [-]

I don't agree that Vizzini is trying to reason in logical absolutes. He talks like he is, but he doesn't necessarily believe the things he's saying.

Man in Black: You're trying to trick me into giving away something. It won't work.
Vizzini: It has worked! You've given everything away! I know where the poison is!

My interpretation is that he really is trying to trick the man.

Later he distracts the man and swaps the glasses around; then he pretends to choose his own glass. He makes sure the man drinks first. I think he's reasoning/hoping that the man would not deliberately drink from the poisoned cup. So when the man does drink he believes his chosen cup is safe. If the man had been unwilling to drink, Vizzini would have assumed that he now held the poisoned glass, and perhaps resorted to treachery.

He's overconfident, but he's not a complete fool.

(I don't have strong confidence in this analysis, because he's a minor character in a movie.)

Comment author: shokwave 02 March 2012 04:40:56AM *  1 point [-]

Well, yes, he only pretends to reason in logical absolutes...

... which was why I wrote "and propositionally" - because he does actually reason in propositional absolutes. I agree with your analysis but note that it is only a good strategy if it's true that one and only one cup contains poison (or the equivalent, that one and only one cup will kill the Man in Black).

On re-reading I may have lost that subtlety in the clumsy (parenthetical-filled) expression of the final line.

Comment author: shokwave 02 March 2012 04:46:33AM 3 points [-]

(I don't have strong confidence in this analysis, because he's a minor character in a movie.)

That the Man in Black describes it as a battle of wits - and not a puzzle - agrees with you.

Comment author: Alejandro1 02 March 2012 01:36:55AM 18 points [-]

The reason you can't rigidly separate positive from normative economics is that you can't rigidly separate claims of fact from claims of value in general. Human language is too laden with thick concepts that mix the two. The claim that someone is a "slut" or a "bitch", for example, melds together factual claims about a woman's behavior with a lot of deeply embedded normative concepts about what constitutes appropriate behavior for a woman. The claim that financial markets are "efficient" is both an effort to describe their operation and a way of valorizing them. The idea of a "recession" or "full employment" or "potential output" all embed certain ideas about what would constitute a normal arrangement of human economic activity (...) You could try to rigorously purge your descriptions of the economy of anything that vaguely smells of a thick moral concept, but you'd find yourself operating with an impoverished vocabulary unable to describe human affairs in any kind of reasonable way.

--Matt Yglesias

Comment author: Nominull 02 March 2012 09:53:45AM 13 points [-]

I found that very poignant, but I'm not sure I agree with his final claim. I think he's committing the usual mistake of claiming impossible what seems hard.

Comment author: RichardKennaway 02 March 2012 12:20:09PM *  8 points [-]

Is it even hard? JFDI, or as we might say here, shut up and do the impossible. Is "efficient" a tendentious word? Taboo it. Is discussion being confused by mixing normative and positive concepts? DDTT.

The quote smells like rationalising to me.

Comment author: TheOtherDave 02 March 2012 05:42:07PM 4 points [-]

Yeah, agreed. It's entirely possible to describe a system of economic agents without using such value-laden terms (though in some cases we may have to make up new terms). We don't do it, mostly because we don't want to. Which IMHO is fine; there's no particular reason why we should.

Comment author: magfrump 05 March 2012 08:38:27AM 1 point [-]

The first thought that I have when considering how to describe the economy without using normative language is that all of the values that are commonly measured (i.e. GDP, unemployment, etc.) are chosen to be measured because they are proxies for things that people value.

In fact, the whole study of economics seems to me like the study of things people value and how they are distributed. If you choose proxies for value you're having a profound effect on what gets measured (consider the recent discussions of statistical significance as a proxy for evidence) and if you try to list everything that everyone values you end up butting up against unsolved problems.

Comment author: komponisto 02 March 2012 02:17:24AM *  5 points [-]

By studying the masters, not their pupils.

-- Niels Henrik Abel, on how he developed his mathematical ability.

Comment author: michaelcurzi 02 March 2012 04:01:53AM *  1 point [-]

When the perishable puts on the imperishable, and the mortal puts on immortality, then shall come to pass the saying that is written: “Death is swallowed up in victory.” “O death, where is your victory? O death, where is your sting?” The sting of death is sin, and the power of sin is the law.

1 Corinthians 15:54-57

(I like this quote, as long as it's shamelessly presented without context of the last line: "But thanks be to God, who gives us the victory through our Lord Jesus Christ." )

Comment author: NancyLebovitz 03 March 2012 01:04:42AM 2 points [-]

The sting of death is sin, and the power of sin is the law.

How do you interpret that line?

Comment author: gyokuro 02 March 2012 04:39:18AM *  16 points [-]

"I've never ever felt wise," Derk said frankly. "But I suppose it is a temptation, to stare into distance and make people think you are."
"It's humbug," said the dragon. "It's also stupid. It stops you learning more."

Diana Wynne Jones, Dark Lord of Derkholm

Comment author: DSimon 02 March 2012 05:50:57AM *  29 points [-]

T-Rex: Our bodies are amazing things! Check it, everyone!
We use our mouths to talk. We invent, remember and teach entire languages with which to do the talking! And if that fails, we can talk with our hands. We build planes and boats and cars and spaceships, all by either using our bodies directly, or by using instruments invented by our bodies. We compose beautiful music and tell amazing stories, all with our bodies, these fleshy bags with spooky skeletons inside.
And yet, if we have a severe enough peanut allergy, we can be killed in seconds by a friggin' legume. And hey, 70% of our planet is water, but what happens if we spend too much time in it? We drown. Game over, man!
I used to make fun of Green Lantern for being vulnerable to the color yellow. Then I choked on my orange juice one morning and nearly suffocated.

-- Dinosaur Comics

Comment author: Will_Newsome 02 March 2012 09:55:32AM *  18 points [-]

If you want to know how decent people can support evil, find a mirror.

Mencius Moldbug, A gentle introduction to Unqualified Reservations (part 2) (yay reflection!)

Comment author: satt 03 March 2012 02:53:15PM 21 points [-]

If only it were all so simple! If only there were evil people somewhere committing evil deeds, and it were necessary only to separate them from the rest of us and destroy them. But the line dividing good and evil cuts through the heart of every human being. And who is willing to destroy a piece of his own heart?

— Aleksandr Solzhenitsyn, The Gulag Archipelago

Comment author: baiter 02 March 2012 12:52:37PM *  59 points [-]

"...I always rejoice to hear of your being still employ'd in experimental Researches into Nature, and of the Success you meet with. The rapid Progress true Science now makes, occasions my regretting sometimes that I was born so soon. It is impossible to imagine the Height to which may be carried, in a thousand years, the Power of Man over Matter. We may perhaps learn to deprive large Masses of their Gravity, and give them absolute Levity, for the sake of easy Transport. Agriculture may diminish its Labor and double its Produce; all Diseases may by sure means be prevented or cured, not excepting even that of Old Age, and our Lives lengthened at pleasure even beyond the antediluvian Standard. O that moral Science were in as fair a way of Improvement, that Men would cease to be Wolves to one another, and that human Beings would at length learn what they now improperly call Humanity!"

-- Benjamin Franklin, Letter to Joseph Priestley, 8 Feb 1780

Comment author: Stabilizer 04 March 2012 05:54:09AM *  10 points [-]

One of the first transhumanists?

Comment author: Will_Newsome 04 March 2012 12:14:24PM *  -2 points [-]

"Be perfect, like an FAI is perfect." -- Jesus

Comment author: Jayson_Virissimo 05 March 2012 08:29:56AM 6 points [-]

The hard core of transhumanism goes back to at least the Middle Ages, possibly earlier.

Comment author: Stabilizer 05 March 2012 08:35:52AM 2 points [-]

Interesting. Which philosophers in particular do you have in mind?

Comment author: Jayson_Virissimo 05 March 2012 09:44:52AM 13 points [-]

Primarily, I had the Arabic-speaking philosophical alchemists in mind, but there are others. If there is significant interest, then I will elaborate further.

Comment author: Will_Newsome 05 March 2012 09:10:21AM 1 point [-]

Does Imitation of Christ count as transhumanism, or is it too ideologically distinct?

Comment author: Jayson_Virissimo 05 March 2012 09:41:13AM 2 points [-]

I would say no, because there isn't enough emphasis on technology as the means of achieving post-humanity.

Comment author: EllisD 02 March 2012 02:24:29PM *  13 points [-]

Whether a mathematical proposition is true or not is indeed independent of physics. But the proof of such a proposition is a matter of physics only. There is no such thing as abstractly proving something, just as there is no such thing as abstractly knowing something. Mathematical truth is absolutely necessary and transcendent, but all knowledge is generated by physical processes, and its scope and limitations are conditioned by the laws of nature.

-David Deutsch, The Beginning of Infinity.

Comment author: TimS 02 March 2012 02:56:07PM 1 point [-]

The Pythagorean theorem isn't proved or even checked by measuring right triangles and noticing that a^2 + b^2 = c^2. Is the Pythagorean theorem not knowledge?

Comment author: khafra 02 March 2012 02:59:56PM 14 points [-]

I don't think Deutsch means that mathematical proofs are all inductive. I think he means that proofs are constructed and checked on physical computing devices like brains or GPGPUs, and that, because of this, mathematical knowledge is not in a different ontological category than empirical knowledge.

Comment author: TimS 02 March 2012 03:58:59PM *  1 point [-]

I feel quite confident saying that mathematics will never undergo paradigm shifts, to use the terminology of Kuhn.

The same is not true for empirical sciences. Paradigm shifts have happened, and I expect them to happen in the future.

Comment author: benelliott 02 March 2012 04:42:16PM 3 points [-]

Would the whole Russell's paradox incident count as a mathematical paradigm shift?

Comment author: TimS 02 March 2012 05:28:22PM 0 points [-]

Reading Wikipedia, it looks like a naive definition of a set turns out to be internally inconsistent. Does that mean the concept of set was abandoned by mathematicians the way epicycles have been abandoned by physicists? That's not my sense, so I hesitate to say redefining set in a more coherent way is a paradigm shift. But I'm no mathematician.

Comment author: benelliott 02 March 2012 05:40:25PM 1 point [-]

It's a matter of degree rather than an absolute line. However, I would say a time when even the very highest experts in a field believed something of great importance to their field with quite high confidence, and then turned out to be wrong, probably counts.

Comment author: TimS 02 March 2012 05:42:54PM 0 points [-]

I don't think "everyone in field X made an error" is the same thing as saying "Field X underwent a paradigm shift."

Comment author: Bugmaster 02 March 2012 06:56:07PM 0 points [-]

Why not ? That sounds like a massive shift in the core beliefs of the field in question. If that's not a paradigm shift, then what is ?

Comment author: TimS 02 March 2012 07:01:15PM 0 points [-]

The "non-expressible in the new concept-space" thing that you think never actually happens.

Comment author: Morendil 02 March 2012 04:58:00PM 0 points [-]

mathematics will never undergo paradigm shifts,

What would count as one?

Comment author: TimS 02 March 2012 05:38:24PM 0 points [-]

As I understand it, a paradigm shift would include the abandonment of a concept. That is, the concept cannot be coherently expressed using the new terminology. For example, there's no way to express coherent concepts in things like Ptolemy's epicycles or Aristotle's impetus. I think Kuhn would say that these examples are evidence that empirical science is socially mediated.

I'm not aware of any formerly prominent mathematical concepts that can't even be articulated with modern concepts. Because mathematics is non-empirical and therefore non-social, I would be surprised if they existed.

Comment author: Morendil 02 March 2012 05:53:39PM 1 point [-]

That is, the concept cannot be coherently expressed using the new terminology. For example, there's no way to express coherent concepts in things like Ptolemy's epicycles or Aristotle's impetus.

I'm not seeing how the second sentence is an example of the criterion in your first sentence. That criterion seems too strict, too: in general the new paradigm subsumes the old (as in the canonical example of Newtonian vs relativistic physics).

I'm also not seeing what the attributes "empirical" and "non-social" have to do (causally) with the ability to form coherent concepts.

Maybe you should also unpack what you mean by "coherent"?

I'm not a mathematician, but from my outside perspective I would cheerfully qualify something like Wilf-Zeilberger theory as the math equivalent to a paradigm shift in the empirical sciences.

WP lists "non-euclidean geometry" as a paradigm shift, BTW.

Comment author: TimS 02 March 2012 06:15:55PM 0 points [-]

That is, the concept cannot be coherently expressed using the new terminology. For example, there's no way to express coherent concepts in things like Ptolemy's epicycles or Aristotle's impetus. I'm not seeing how the second sentence is an example of the criterion in your first sentence.

Using modern physics, there is no way to express the concept that Ptolemy intended when he said epicycles. More casually, modern physicists would say "Epicycles don't exist." By contrast, the concept of set is still used in Cantor's sense, even though his formulation contained a paradox. So I think the move from geocentric theory to heliocentric theory is a paradigm shift, but adjusting the definition of set is not.

I'm also not seeing what the attributes "empirical" and "non-social" have to do (causally) with the ability to form coherent concepts.

I'm using the word science as synonymous with "empirical studies" (as opposed to making stuff up without looking). That's not intended to be controversial in this community. What is controversial is the assertion that studying the history of science shows examples of paradigm shifts.

One possible explanation of this phenomenon is that science is socially mediated (i.e. affected by social factors when the effect is not justified by empirical facts).

I'm asserting that mathematics is not based on empirical facts. Therefore, one would expect that it could avoid being socially mediated by avoiding interacting with reality (that is, I think a sufficiently intelligent Cartesian skeptic could generate all of mathematics). IF I am correct that they are caused by the socially mediated aspects of the scientific discipline and IF mathematics can avoid being socially mediated by virtue of its non-empirical nature, then I would expect that no paradigm shifts would occur.

This whole reference to paradigm shifts is an attempt to show a justification for my belief that mathematics is non-empirical, contrary to the original quote. If you don't believe in paradigm shifts (as Kuhn meant them, not as used by management gurus), then this is not a particularly persuasive argument.


WP lists "non-euclidean geometry" as a paradigm shift, BTW.

If Wikipedia says that, I don't think it is using the word the way Kuhn did.

Comment author: Bugmaster 02 March 2012 06:36:43PM *  1 point [-]

Using modern physics, there is no way to express the concept that Ptolemy intended when he said epicycles.

As I'd mentioned elsewhere, there's actually a pretty easy way to express that, IMO: "Ptolemy thought that planets move in epicycles, and he was wrong for the following reasons, but if we had poor instruments like he did, we might have made the same mistake".

IF I am correct that they are caused by the socially mediated aspects of the scientific discipline and IF mathematics can avoid being socially mediated by virtue of its non-empirical nature, then I would expect that no paradigm shifts would occur.

The abovementioned non-euclidean geometry is one such shift, as far as I understand (though I'm not a mathematician). I'm not sure what the difference is between the history of this concept, and what Kuhn meant.

But there were other, more powerful paradigm shifts in math, IMO. For example, the invention of (or discovery of, depending on your philosophy) zero (or, more specifically, a positional system for representing numbers). Irrational numbers. Imaginary numbers. Infinite sets. Calculus (contrast with Zeno's Paradox). The list goes on.

I should also point out that many, if not all, of these discoveries (or "inventions") either arose as a solution to a scientific problem (f.ex. Calculus), or were found to have a useful scientific application after the fact (f.ex. imaginary numbers). How can this be, if mathematics is entirely "non-empirical" ?
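The calculus-vs-Zeno contrast above can be made concrete with a toy computation (a sketch of my own, using exact rational arithmetic, not anything from the thread): Zeno's dichotomy prescribes infinitely many half-steps, but the partial sums converge, which is exactly what the limit concept formalizes.

```python
from fractions import Fraction

# Zeno's dichotomy: each step covers half the remaining distance to a
# goal one unit away. Track the partial sums exactly with rationals.
partial = Fraction(0)
for n in range(1, 51):
    partial += Fraction(1, 2**n)  # step n has length 1/2^n

remaining = 1 - partial  # distance still uncovered after 50 steps
print(f"remaining after 50 steps: {float(remaining):.3e}")
```

The remaining distance after n steps is exactly 1/2^n, so the partial sums converge to 1; the "paradox" dissolves once limits are available.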

Comment author: TimS 02 March 2012 06:43:02PM 1 point [-]

Hmm, I'll have to think about the derivation of zero, the irrational numbers, etc.

I should also point out that many, if not all, of these discoveries (or "inventions") either arose as a solution to a scientific problem (f.ex. Calculus), or were found to have a useful scientific application after the fact (f.ex. imaginary numbers). How can this be, if mathematics is entirely "non-empirical"

The motivation for derivation of mathematical facts is different from the ability to derive them. I don't know why the Cartesian skeptic would want to invent calculus. I'm only saying it would be possible. It wouldn't be possible if mathematics were not independent of empirical facts (because the Cartesian skeptic is isolated from all empirical facts except the skeptic's own existence).

Comment author: Morendil 02 March 2012 08:24:59PM 0 points [-]

socially mediated (i.e. affected by social factors when the effect is not justified by empirical facts).

Hmm, "justified" generally has a social component, so I doubt that this definition is useful.

there is no way to express the concept that Ptolemy intended when he said epicycles

So this WP page doesn't exist? ;)

My position, FWIW, is that all of science is socially mediated (as a consequence of being a human activity), mathematics no less than any other science. Whether a mathematical proposition will be assessed as true by mathematicians is a property ultimately based on physics - currently the physics of our brains.

Comment author: komponisto 02 March 2012 10:45:31PM 2 points [-]

WP lists "non-euclidean geometry" as a paradigm shift, BTW.

If Wikipedia says that, I don't think it is using the word the way Kuhn did.

For Kuhn, the word was, if anything, a sociological term -- not something referring to the structure of reality itself. (Kuhn was not himself a postmodernist; he still believed in physical reality, as distinct from human constructs.) So it seems to me that it would be entirely consistent with his usage to talk about paradigm shifts in mathematics, since the same kind of sociological phenomena occur in the latter discipline (even if you believe that the nature of mathematical reality itself is different from that of physical reality).

Comment author: Manfred 02 March 2012 06:11:02PM *  4 points [-]

there's no way to express coherent concepts in things like Ptolemy's epicycles or Aristotle's impetus

There are perfectly fine ways to express those things. Epicycles might even be useful in some cases, since they can be used as a simple approximation of what's going on.

The reason people don't use epicycles any more isn't because they're unthinkable, in the really strong "science is totally culture-dependent" sense. It's because using them was dependent on whether we thought they reflected the structure of the universe, and now we don't. Ptolemy's claim behind using epicycles was that circles were awesome, so it was likely that the universe ran on circles. This is a fact that could be tested by looking at the complexity of describing the universe with circles vs. ellipses.

So this paradigm shift stuff doesn't look very unique to me. It just looks like the refutation of an idea that happened to be central to using a model. Then you might say that math can have no paradigm shifts because it constructs no models of the world. But this isn't quite true - there are models of the mathematical world that mathematicians construct that occasionally get shaken up.

Comment author: TimS 02 March 2012 06:24:19PM *  -1 points [-]

My point was that trying to express epicycles in the new terminology is not possible. That is, modern physicists say, "Epicycles don't exist."

Obviously, it is possible to use sociological terminology to describe epicycles. You yourself said that they were useful at times. But that's not the language of physics.

Since you mentioned it, I would endorse "Science is substantially culturally dependent", NOT "Science is totally culturally dependent." So culturally dependent that there is no reason to expect correspondence between any model and reality. Better science makes better predictions, but it's not clear what a "better" model would be if there's no correspondence with reality.

I brought all this up not to advocate for the cultural dependence of science. Rather, I think it would be surprising for a discipline independent of empirical facts to have paradigm shifts. Thus, the absence of paradigm shifts is a reason to think that mathematics is independent of empirical facts.

If you don't think science is substantially culturally dependent, then there's no reason my argument should persuade you that mathematics is independent of empirical facts.

Comment author: Manfred 02 March 2012 07:14:42PM *  7 points [-]

My point was that trying to express epicycles in the new terminology is not possible.

But it is! You simply specify the position as a function of time and you've done it! The reason why that seems so strange isn't because modern physics has erased our ability to add circles together, it's because we no longer have epicycles as a fundamental object in our model of the world.

So if you want the Copernican revolution to be a paradigm shift, the idea needs to be extended a bit. I think the best way is to redefine paradigm shift as a change in the language that we describe the world in. If we used to model planets in terms of epicycles, and now we model them in terms of ellipses, that's a change of language, even though ellipses can be expressed as sums of epicycles, and vice versa.

In fact, in every case of inexpressibility that we know of, it's been because one of the ways of thinking about the world didn't give correct predictions. We have yet to find two ways of thinking about the world that let you get different experimental results if you plan the experiment two different ways. In these cases, the paradigm shift included the falsification of a key claim.

Rather, I think it would be surprising for a discipline independent of empirical facts to have paradigm shifts

I don't think it's necessarily true (for example, you can imagine an abstract game having a revolution in how people thought about what it was doing), but it seems reasonable for math, depending on how you define "math." I think people are just giving you a hard time because you're trying to make this general definitional argument (generally not worth the effort) on pretty shaky ground.

Comment author: TimS 02 March 2012 07:25:27PM *  4 points [-]

Thanks, that's quite clear. Should I reference abandonment of fundamental objects as the major feature of a paradigm shift?

In fact, in every case of inexpressibility that we know of, it's been because one of the ways of thinking about the world didn't give correct predictions.

Yes, every successful paradigm shift. Proponents of failed paradigm shifts are usually called cranks. :)

My position is that the repeated pattern of false fundamental objects suggests that we should give up on the idea of fundamental objects, and simply try to make more accurate predictions without asserting anything else about the "accuracy" of our models.

Comment author: komponisto 02 March 2012 10:50:10PM *  8 points [-]

My point was that trying to express epicycles in the new terminology is not possible.

This is false in an amusing way: expressing motion in terms of epicycles is mathematically equivalent to decomposing functions into Fourier series -- a central concept in both physics and mathematics since the nineteenth century.
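The equivalence is easy to check numerically. A hedged sketch of my own (parameters and variable names are mine, not komponisto's): an ellipse traversed at uniform angular rate decomposes into exactly two counter-rotating epicycles, i.e. two Fourier components of a complex-valued path.

```python
import numpy as np

# Sample an elliptical "orbit" as a complex-valued path, take its FFT,
# and keep only the dominant frequency components ("epicycles").
N = 256
t = np.linspace(0, 2 * np.pi, N, endpoint=False)
orbit = 3 * np.cos(t) + 1j * 2 * np.sin(t)  # ellipse, semi-axes 3 and 2

coeffs = np.fft.fft(orbit) / N       # one coefficient per candidate epicycle
rates = np.fft.fftfreq(N, d=1 / N)   # integer rotation rates

# Zero out everything but the two largest epicycles.
keep = np.argsort(np.abs(coeffs))[-2:]
few = np.zeros_like(coeffs)
few[keep] = coeffs[keep]

# Rebuild the path as a sum of rotating circles (deferent plus epicycle).
recon = np.zeros_like(orbit)
for i in keep:
    recon = recon + few[i] * np.exp(1j * rates[i] * t)
err = float(np.max(np.abs(recon - orbit)))
print(f"max error with 2 epicycles: {err:.1e}")
```

Here 3 cos t + 2i sin t = 2.5 e^{it} + 0.5 e^{-it}, so two circles reproduce this ellipse exactly; more complicated periodic orbits simply need more terms in the series.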

Comment author: Bugmaster 02 March 2012 10:57:48PM 0 points [-]

To be perfectly fair, AFAIK Ptolemy thought in terms of a finite (and small) number of epicycles, not an infinite series.

Comment author: Bugmaster 02 March 2012 06:26:05PM 0 points [-]

For example, there's no way to express coherent concepts in things like Ptolemy's epicycles or Aristotle's impetus.

I disagree, as, I suspect, you already know :-)

But I have a further disagreement with your last sentence:

Because mathematics is non-empirical and therefore non-social...

What do you mean, "and therefore" ? As I see it, "empirical" is the opposite of "social". Gravity exists regardless of whether I like it or not, and regardless of how many passionate essays I write about Man's inherent freedom to fly by will alone.

Comment author: TimS 02 March 2012 06:30:32PM 1 point [-]

Yes, non-empirical is the wrong word. I mean to assert that mathematics is independent of empirical fact (and therefore non-social; a sufficiently intelligent Cartesian skeptic could derive all of mathematics in solitude).

Comment author: Bugmaster 02 March 2012 06:43:45PM *  0 points [-]

A sufficiently intelligent Cartesian skeptic could derive all of mathematics in solitude...

I don't know whether this is true or not; arguments could be (and have been) made that such a skeptic could not exist in a non-empirical void. But that's a bit offtopic, as I still have a problem with your previous sentence:

I mean to assert that mathematics is independent of empirical fact ... and therefore non-social.

Are you asserting that all things which are "dependent on empirical fact" are "social" ? In this case, you must be using the word "social" in a different way than I am.

If we lived in a culture where belief in will-powered flight was the norm, and where everyone agreed that willing yourself to fly was really awesome and practically a moral imperative... then people would still plunge to their deaths upon stepping off of skyscraper roofs.

Comment author: TimS 02 March 2012 06:55:21PM 0 points [-]

I don't know whether this is true or not; arguments could be (and have been) made that such a skeptic could not exist in a non-empirical void.

:) It is the case that the coherence of the idea of the Cartesian skeptic is basically what we are debating.


I'm specifically asserting that things that are independent of empirical facts are non-social.

I think that things that are subject to empirical fact are actually subject to social mediation, but that isn't a consequence of my previous statement.


What does rejection of the assertion "If you think you can fly, then you can" have to do with the definition of socially mediated? I don't think post-modern thinking is committed to the anti-physical realism position, even if it probably should endorse the anti-physical models position. The ability to make accurate predictions doesn't require a model that corresponds with reality.

Comment author: ChristianKl 03 March 2012 06:35:23PM 0 points [-]

Didn't Gödel show that nobody can derive all of mathematics in solitude, because you can't have a complete and consistent mathematical framework?

Comment author: NancyLebovitz 03 March 2012 06:42:54PM 1 point [-]

Gödel showed that no one can derive all of mathematics at all, whether in solitude or in a group, because any consistent system of axioms can't lead to all the true statements from their domain.

Anyone know whether it's proven that there are guaranteed to be non-self-referential truths which can't be derived from a given axiom system? (I'm not sure whether "self-referential" can be well-defined.)

Comment author: [deleted] 02 March 2012 10:57:24PM 4 points [-]

Aristole's impetus

A totally trivial nit pick, I admit, but there's no such thing as the Aristotelian theory of impetus. The theory of impetus was an anti-Aristotelian theory developed in the middle ages. Aristotle has no real dynamical theory.

Comment author: Bugmaster 02 March 2012 10:59:38PM 0 points [-]

Thanks, I did not actually know that. But I should have known.

Comment author: TimS 03 March 2012 04:10:54AM 0 points [-]

Thanks. Did not know that.

Comment author: [deleted] 02 March 2012 05:16:15PM *  6 points [-]

I feel quite confident saying that mathematics will never undergo paradigm shifts, to use the terminology of Kuhn.

I believe it already has. Consider the Weierstrass revolution. Before Weierstrass, it was commonly accepted that while a continuous function may lack a derivative at a set of discrete points, it still had to have a derivative somewhere. Then Weierstrass developed a counterexample, which I think satisfies the Kuhnian "anomaly that cannot be explained within the current paradigm."

Another quick example: during the pre-War period, most differential geometry was concerned with embedded submanifolds of Euclidean space. However, this formulation made it difficult to describe or classify surfaces -- I seem to believe, but don't have time to verify, that even deciding whether two sets of algebraic equations determine isomorphic varieties is NP-hard. Hence the post-War shift to intrinsic properties and descriptions.

EDIT: I was wrong, or at least imprecise. Isomorphism of varieties can be decided with Gröbner bases, the reduction of which is still doubly exponential in time, as far as I can tell. Complexity classes aren't in my domain; I shouldn't have said anything about them without looking it up. :(
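For readers who want to see the anomaly directly, here is a numerical sketch of my own (parameters chosen to satisfy Weierstrass's condition ab > 1 + 3π/2, not taken from the thread): partial sums of W(x) = Σ aⁿ cos(bⁿπx) are continuous, yet the difference quotients at 0 keep growing as the step size shrinks.

```python
import math

A, B, TERMS = 0.5, 13, 40  # 0 < A < 1 and A*B = 6.5 > 1 + 3*pi/2

def weierstrass(x):
    """Partial sum of the Weierstrass series (the full series is
    continuous everywhere but differentiable nowhere)."""
    return sum(A**n * math.cos(B**n * math.pi * x) for n in range(TERMS))

quotients = []
for k in range(1, 6):
    h = B ** -k  # shrink h along the function's natural scale
    q = abs(weierstrass(h) - weierstrass(0)) / h
    quotients.append(q)
    print(f"h = {h:.2e}  |W(h) - W(0)| / h = {q:.1f}")
```

If W had a derivative at 0, these quotients would settle toward a limit; instead each refinement multiplies them by roughly A·B = 6.5.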

Comment author: TimS 02 March 2012 05:32:05PM *  0 points [-]

Reading the wiki page, it looks like Weierstrass corrected an error in the definition or understanding of limits. But mathematicians did not abandon the concept of limit the way physicists abandoned the concept of epicycle, so I'm not sure that qualifies as a paradigm shift. But I'm no mathematician, so my understanding may be seriously incomplete.

I can't even address your other example due to my failure of mathematical understanding.

Comment author: [deleted] 02 March 2012 07:35:06PM 2 points [-]

Reading the wiki page, it looks like Weierstrass corrected an error in the definition or understanding of limits.

Hindsight bias. The old limit definition was not widely considered either incorrect or incomplete.

But mathematicians did not abandon the concept of limit the way physicists abandoned the concept of epicycle, so I'm not sure that qualifies as a paradigm shift.

They abandoned reasoning about limits informally, which was de rigueur beforehand. For examples of this, see Weierstrass' counterexample to the Dirichlet principle. Prior to Weierstrass, some people believed that the Dirichlet principle was true because approximate solutions exist in all natural examples, and therefore the limit of approximate solutions will be a true solution.

Comment author: TimS 02 March 2012 07:54:56PM *  1 point [-]

That's pretty clear, thanks. Obviously, experts aren't likely to think there is a basic error before it has been identified, but I'm not in a position to have a reliable opinion on whether I'm suffering from hindsight bias.

Still, what fundamental object did mathematics abandon after Weierstrass' counter-example? How is this different from the changes to the definition of set provoked by Russell's paradox?

Comment author: [deleted] 02 March 2012 08:28:28PM 2 points [-]

I don't recall where it is said that such an object is necessary for a Kuhnian revolution to have occurred. There was a crisis, in the Kuhnian sense, when the old understanding of limit (perhaps labeling it as limit1 will be clearer) could not explain the existence of, e.g., continuous functions without derivatives anywhere, or counterexamples to the Dirichlet principle. Then Weierstrass developed limit2 with deltas and epsilons. Limit1 was then abandoned in favor of limit2.

Comment author: Eugine_Nier 03 March 2012 04:03:50AM 3 points [-]

Hindsight bias. The old limit definition was not widely considered either incorrect or incomplete.

Not true. The "old limit definition" was non-existent beyond the intuitive notion of limit, and people were fully aware that this was not a satisfactory situation.

Comment author: [deleted] 03 March 2012 09:20:41PM 0 points [-]

We need to clarify what time period we're talking about. I'm not aware of anyone in the generation of Newton/Leibniz and the second generation (e.g., Daniel Bernoulli and Euler) who felt that way, but it's not as if I've read everything these people ever wrote.

The earliest criticism I'm aware of is Berkeley in 1734, but he wasn't a mathematician. As for mathematicians, the earliest I'm aware of is Lagrange in 1797.

Comment author: TimS 04 March 2012 03:55:46AM 0 points [-]

I'm also curious about this history.

Comment author: [deleted] 02 March 2012 06:03:42PM 4 points [-]

Wikipedia gives the acceptance of non-Euclidean geometry as a "classical case" of a paradigm shift. I suspect that there were several other paradigm shifts involved from Euclid's math to our math: for instance, coordinate geometry, or the use of number theory applied to abstract quantities as opposed to lengths of line segments.

Comment author: ChristianKl 03 March 2012 06:22:04PM 0 points [-]

The frequentist vs. Bayesian debate is a debate between competing mathematical paradigms. True mathematicians, however, shun statistics. They don't like the statistical paradigm ;)

Gödel's discovery ended a certain mathematical paradigm of wanting to construct a complete mathematics from the ground up.

I could imagine a future paradigm shift away from the ideal of mathematical proofs toward more experimental math. Neural nets or quantum computers might give you answers to mathematical questions that are better than the answers that axiom-and-proof-based math provides.

Comment author: Eugine_Nier 04 March 2012 12:46:16AM 1 point [-]

Gödel's discovery ended a certain mathmatical pradigm of wanting to construct a complete mathematics from the ground up.

Except, in practice mathematics still works this way.

Comment author: Bugmaster 02 March 2012 06:55:00PM -1 points [-]

No, that's not how you prove it, but you can check it pretty easily with right triangles. Similarly, if you believe that Pi == 3, you only need a large wheel and a piece of string to discover that you're wrong. This won't tell you the actual value of Pi, nor would it constitute a mathematical proof, but at least the experience would point you in the right direction.

Comment author: TimS 02 March 2012 06:59:28PM 2 points [-]

If you find a right triangle with sides (2.9, 4, 5.15) rather than (3, 4, 5), are you ever entitled to reject the Pythagorean theorem? Don't measurement error and the non-Euclidean nature of the actual universe completely explain your experience?

In short, it seems like you can't empirically check the Pythagorean theorem.

Comment author: Bugmaster 02 March 2012 09:17:42PM 0 points [-]

If you find a right triangle with sides (2.9, 4, 5.15) rather than (3, 4, 5), are you ever entitled to reject the Pythagorean theorem?

That is not what I said. I said, regarding Pi == 3, "this won't tell you the actual value of Pi, nor would it constitute a mathematical proof, but at least the experience would point you in the right direction". If you believe that a^2 + b^2 = c^5, instead of c^2; and if your instruments are accurate down to 0.2 units, then you can discover very quickly that your formula is most probably wrong. You won't know which answer is right (though you could make a very good guess, by taking more measurements), but you will have enough evidence to doubt your theorem.

The words "most probably" in the above sentence are very important. No amount of empirical measurements will constitute a 100% logically consistent mathematical proof. But if your goal is to figure out how the length of the hypotenuse relates to the lengths of the two sides, then you are not limited to total ignorance or total knowledge, with nothing in between. You can make educated guesses. Yes, you could also get there by pure reason alone, and sometimes that approach works best; but that doesn't mean that you cannot, in principle, use empirical evidence to find the right path.
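
Bugmaster's point can be made concrete with a toy simulation (a minimal sketch; the side lengths, the ±0.2-unit noise model, and the `residuals` helper are all illustrative assumptions, not anything from the thread): measure many noisy right triangles and see which hypothesis, a^2 + b^2 = c^2 or a^2 + b^2 = c^5, better fits the data.

```python
import math
import random

random.seed(0)  # deterministic run for reproducibility

def residuals(exponent, trials=1000, noise=0.2):
    """Mean |a^2 + b^2 - c^exponent| over noisy measurements of right triangles."""
    total = 0.0
    for _ in range(trials):
        a = random.uniform(1, 10)
        b = random.uniform(1, 10)
        c = math.hypot(a, b)  # the true hypotenuse length
        # each instrument reading is off by at most ~0.2 units
        ma = a + random.uniform(-noise, noise)
        mb = b + random.uniform(-noise, noise)
        mc = c + random.uniform(-noise, noise)
        total += abs(ma**2 + mb**2 - mc**exponent)
    return total / trials

# The c^2 hypothesis fits the noisy data vastly better than c^5,
# even though no single measurement constitutes a proof.
assert residuals(2) < residuals(5)
```

No run of this tells you the exponent is *exactly* 2, which is Bugmaster's caveat: the measurements point you in the right direction without ever amounting to a mathematical proof.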

Comment author: MaoShan 04 March 2012 06:57:53AM -1 points [-]

Peer review. If the next two hundred scientists who measure your triangle get the same measurements from other rulers by different manufacturers, you'd be completely justified in rejecting the Pythagorean theorem.

My challenge to you: go out and see if you can find a right triangle with those measurements.

Comment author: Eugine_Nier 04 March 2012 08:37:15PM 4 points [-]

Sure, how about a triangle just outside a black hole.

Comment author: MaoShan 05 March 2012 04:19:51AM -1 points [-]

That was a quick trip. Which black hole was it?

Comment author: Vaniver 03 March 2012 01:05:54AM *  0 points [-]

The Pythagorean theorem isn't proved or even checked by measuring right triangles and noticing that a^2 + b^2 = c^2.

I am having trouble with this as a statement of historical fact. Isn't that how they did it?

Comment author: ChristianKl 03 March 2012 06:35:14PM 2 points [-]

You could call it a paradigm shift that we today don't like how they did it ;)

Comment author: RichardKennaway 03 March 2012 05:25:23PM *  8 points [-]

There is no such thing as abstractly proving something

Of course there is. A proof of a mathematical proposition is just as much itself a mathematical object as the proposition being proved; it exists just as independently of physics. The proof as written down is a physical object standing in the same relation to the real proof as the digit 2 before your eyes here bears to the real number 2.

But perhaps in the context Deutsch isn't making that confusion. What scope and limitations on mathematical knowledge, conditioned by the laws of nature, does he draw out from these considerations?

Comment author: wallowinmaya 02 March 2012 08:16:07PM 6 points [-]

Faith: not wanting to know what is true.

Friedrich Nietzsche

Comment author: Eugine_Nier 03 March 2012 07:12:25AM 4 points [-]

I don't think that is a good description of what people mean by "faith".

For a better idea of the concept of faith start here.

Summary: Theory is to faith as our concept of physical necessitation is to that of social obligation.

Comment author: DSimon 04 March 2012 05:57:42AM *  2 points [-]

It's not what people intend "faith" to mean, but nevertheless it often ends up being its effective definition. (EDIT: To clarify, by "it" I am referring to Nietzsche's definition.)

Comment author: wallowinmaya 02 March 2012 08:25:53PM 5 points [-]

All sciences are now under the obligation to prepare the ground for the future task of the philosopher, which is to solve the problem of value, to determine the true hierarchy of values.

Friedrich Nietzsche, foreseeing the CEV-problem? (Just kidding, of course)

Comment author: XFrequentist 02 March 2012 09:01:52PM 11 points [-]

May the best of your todays, be the worst of your tomorrows

  • Jay-Z, Forever Young

[Taking the lyrics literally, the whole thing is a pretty sweet transhumanist anthem.]

Comment author: gwern 03 March 2012 07:53:42AM 5 points [-]

"A full tour through the modern critics of the competitive organization of society would be a truly exhausting trip. It would include the drama, the novel, the churches, the academies, the lesser intellectual establishments, the socialists and communists and Fabians and a swarm of other dissenters. One is reminded of Schumpeter’s remark that the Japanese earthquake of 1924 had a remarkable aspect: it was not blamed on capitalism. Suddenly one realizes how impoverished our society would be in its indignation, as well as in its food, without capitalism."

--George F. Stigler, "Economics or Ethics?"

Comment author: shminux 03 March 2012 08:00:39AM 3 points [-]

Here's my advice: If you meet an economist, ask him to adjust your spine so you no longer get the common cold. Then ask him for some specific investment tips and do exactly what he recommends. Let me know which one works out best.

Scott Adams

Comment author: Grognor 03 March 2012 08:57:49AM 37 points [-]

“Stupider” for a time might not have been a real word, but it certainly points where it’s supposed to. The other day my sister used the word “deoffensify”. It’s not a real word, but that didn’t make it any less effective. Communication doesn’t care about the “realness” of language, nor does it often care about the exact dictionary definitions. Words change through every possible variable, even time. One of the great challenges of communication has always been making sure words mean the same thing to you and your audience.

-Michael "Kayin" O'Reilly

Comment author: Bugmaster 03 March 2012 09:54:54AM 15 points [-]

Or, as the Language Log puts it:

The first thing to say is that the only possible way to settle a question of grammar or style is to look at relevant evidence. I suppose there really are people who believe the rules of grammar come down from some authority on high, an authority that has no connection with the people who speak and write English; but those people have got to be deranged.

Comment author: Nominull 03 March 2012 09:58:49AM 5 points [-]

Swap out "grammar" and "style" for "morality" and "ethics"?

Comment author: [deleted] 03 March 2012 11:03:54AM *  6 points [-]

the Language Log

It's Language Log, without the, goddammit!

Comment author: DSimon 04 March 2012 05:55:45AM 2 points [-]

One of my favorite things about many constructed languages is that they get rid of this distinction entirely. You don't have to worry about whether or not "Xify" is a so-called real word for any given value X; you only have to check whether X's type fits the pattern. This happens merely because it's a lot easier, when you're working from scratch anyways, to design the language that way than to have to come up with a big artificial list of -ify words.
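
The rule-based approach DSimon describes can be sketched in a few lines (a toy illustration; the lexicon entries and the `is_valid_word` helper are invented for the example, not taken from any actual conlang):

```python
# Toy lexicon mapping roots to grammatical types (all entries are hypothetical).
LEXICON = {"beauty": "noun", "solid": "adjective", "run": "verb"}

def is_valid_word(word):
    """A word is valid if it is a bare root, or a root whose type fits a
    productive derivation pattern -- no master list of derived words needed."""
    if word in LEXICON:
        return True
    if word.endswith("ify"):
        root = word[:-3]
        # in this toy grammar, '-ify' productively derives verbs
        # from nouns and adjectives, but not from other verbs
        return LEXICON.get(root) in ("noun", "adjective")
    return False

assert is_valid_word("solidify")         # adjective root: pattern fits
assert not is_valid_word("runify")       # verb root: pattern doesn't fit
```

The checker never consults a list of -ify words; it only checks the root's type, which is exactly the distinction between productive morphology and a dictionary of "real" words.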

Comment author: Viliam_Bur 03 March 2012 12:57:48PM *  0 points [-]

"Do you believe in revolution
Do you believe that everything will change
Policemen to people
And rats to pretty women
Do you think they will remake
Barracks to bar-rooms
Yperite to Coca-Cola
And truncheons to guitars?

Oh-oh, my naive
It will never be like that
Oh-oh, my naive
Life is like it is

Do you think that ever
Inferiority complexes will change to smiles
Petržalka to Manhattan
And dirty factories to hotels
Do you think they will elevate
Your idols to gods
That you will never have to
Bathe your sorrow with alcohol?

Oh-oh, my naive...

Do you think that suddenly
Everyone will reconcile with everyone
That no one will write you off
If you will have holes in your jeans
Do you think that in everything
Everyone will help you
That you will never have to be
Afraid of a higher power?

Oh-oh, my naive..."

My translation of a 1990s Slovak punk-rock song, "Slobodná Európa: Nikdy to tak nebude" ("It will never be like that"). Is it an example of an outside view, or just trying to reverse stupidity?

Comment author: NancyLebovitz 03 March 2012 01:05:31PM 9 points [-]

It is more important to know what is true today, than to have been right yesterday

Found here.

Comment author: [deleted] 03 March 2012 03:25:48PM *  22 points [-]

•••

Comment author: Spectral_Dragon 04 March 2012 02:13:50AM 0 points [-]

"The only sovereign I can allow to rule me is reason. The first law of reason is this: what exists exists; what is is. From this irreducible, bedrock principle, all knowledge is built. This is the foundation from which life is embraced. Reason is a choice. Wishes and whims are not facts, nor are they a means to discovering them. Reason is our only way of grasping reality--it is our basic tool of survival. We are free to evade the effort of thinking, to reject reason, but we are not free to avoid the penalty of the abyss we refuse to see."

-- Terry Goodkind, Faith of the Fallen. I know quite a few here dislike the author, but there's still a lot of good material, like this one, or the Wizard's Rules.

Comment author: Stabilizer 04 March 2012 05:50:49AM 21 points [-]

Society changes when we change what we're embarrassed about.

In just fifty years, we've made it shameful to be publicly racist.

In just ten years, someone who professes to not know how to use the internet is seen as a fool.

The question, then, is how long before we will be ashamed at being uninformed, at spouting pseudoscience, at believing thin propaganda? How long before it's unacceptable to take something at face value? How long before you can do your job without understanding the state of the art?

Does access to information change the expectation that if you can know, you will know?

We can argue that this will never happen, that it's human nature to be easily led in the wrong direction and to be willfully ignorant. The thing is, there are lots of things that used to be human nature, but due to culture and technology, no longer are.

-Seth Godin

Comment author: djcb 04 March 2012 09:56:58AM 8 points [-]

There is a spookier possibility. Suppose it is easy to send messages to the past, but that forward causality also holds (i.e. past events determine the future). In one way of reasoning about it, a message sent to the past will "alter" the entire history following its receipt, including the event that sent it, and thus the message itself. Thus altered, the message will change the past in a different way, and so on, until some "equilibrium" is reached--the simplest being the situation where no message at all is sent. Time travel may thus act to erase itself (an idea Larry Niven fans will recognize as "Niven's Law").

-- Hans Moravec Time Travel and Computing
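
Moravec's "equilibrium" is a fixed point: a history is stable only when the message it causes to be sent back is the same message that produced it. A toy model (entirely my own sketch; the `settle` helper and the update rule are invented to illustrate the idea, not anything from Moravec's paper):

```python
def settle(update, message, max_steps=100):
    """Iterate the history-update rule until the message sent to the past
    is self-consistent, i.e. reproduces itself (Moravec's equilibrium)."""
    for _ in range(max_steps):
        new = update(message)
        if new == message:  # fixed point: history no longer changes
            return message
        message = new
    raise RuntimeError("no equilibrium reached")

# Toy update rule: receiving any warning alters history so that the
# disaster never happens, hence no warning gets sent.  The only fixed
# point is the empty message -- time travel "erases itself".
update = lambda msg: ""
assert settle(update, "don't board the ship") == ""
```

The empty message being the only stable outcome is exactly the "simplest equilibrium" in the quote: a self-erasing use of time travel, per Niven's Law.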

Comment author: Woodbun 04 March 2012 12:02:38PM *  19 points [-]

"One of the great commandments of science is, 'Mistrust arguments from authority'. (Scientists, being primates, and thus given to dominance hierarchies, of course do not always follow this commandment.)"

-Carl Sagan, The Demon Haunted World

Comment author: Will_Newsome 04 March 2012 12:10:37PM 12 points [-]

The world is paved with good intentions; the road to Hell has bad epistemology mixed in.

Steven Kaas

Comment author: Voltairina 04 March 2012 10:35:25PM *  0 points [-]

“I don’t know what you mean by ‘glory,’ ” Alice said. Humpty Dumpty smiled contemptuously. “Of course you don’t—till I tell you. I meant ‘there’s a nice knock-down argument for you!’ ” “But ‘glory’ doesn’t mean ‘a nice knock-down argument’,” Alice objected. “When I use a word,” Humpty Dumpty said, in rather a scornful tone, “it means just what I choose it to mean—neither more nor less.” “The question is,” said Alice, “whether you can make words mean so many different things.” “The question is,” said Humpty Dumpty, “which is to be master -- that’s all.”

-Charles Dodgson (Lewis Carroll), Through the Looking-Glass

Comment author: TimS 04 March 2012 10:51:35PM 3 points [-]

Isn't Humpty Dumpty wrong, if the goal is intelligible conversation?

Comment author: TheOtherDave 04 March 2012 11:29:06PM 5 points [-]

Absolutely. But if the goal is to establish dominance, as Humpty Dumpty (appears to) suggest, its technique often works.

Comment author: Voltairina 04 March 2012 10:51:52PM *  6 points [-]

Courage is what it takes to stand up and speak; courage is also what it takes to sit down and listen.

Winston Churchill

Comment author: antigonus 05 March 2012 07:33:53AM 2 points [-]

I tell you that as long as I can conceive something better than myself I cannot be easy unless I am striving to bring it into existence or clearing the way for it.

-- G.B. Shaw, "Man and Superman"

Shaw evinces a really weird, teleological view of evolution in that play, but in doing so expresses some remarkable and remarkably early (1903) transhumanist sentiments.

Comment author: MarkusRamikin 05 March 2012 08:09:44AM 8 points [-]

I love that quote, but if it carries a rationality lesson, I fail to see it. Seems more like an appeal to the tastes of the audience here.

Comment author: antigonus 05 March 2012 09:48:42AM 4 points [-]

Yeah, you're correct. Wasn't thinking very hard.

Comment author: DSimon 05 March 2012 10:27:39AM 1 point [-]

I have to disagree; the lesson in the quote is "Win as hard as you can", which is very important if not very complicated.

Comment author: MarkusRamikin 05 March 2012 11:10:48AM 2 points [-]

I don't see the connection. It's not obvious that bringing a being superior to myself into existence is the maximum win for me. Not everyone values the Superman the way Shaw's Don Juan does.

Comment author: NancyLebovitz 05 March 2012 01:26:22PM *  8 points [-]

My father was a psychologist and a lifelong student of human behavior, and when I brought him my report card he often used to say: “This tells me something about you, something about your teacher, and something about myself.”

Lynne Murray