Rationality Quotes March 2012
Here's the new thread for posting quotes, with the usual rules:
- Please post all quotes separately, so that they can be voted up/down separately. (If they are strongly related, reply to your own comments. If strongly ordered, then go ahead and post them together.)
- Do not quote yourself
- Do not quote comments/posts on LW/OB
- No more than 5 quotes per person per monthly thread, please.
Comments (525)
Thomas Henry Huxley - about Darwin's theory of evolution
Meh. That's just hindsight bias.
Galileo Galilei (translated by me)
Generally, yes. But in this particular case we can trust that Huxley, later Darwin's bulldog, really felt that way and that this was a justified statement. He obviously understood the matter well.
All those English animal breeders had a good insight, but for them it was more or less a wild generalization. Not so wild for Huxley.
So that I can google for it - what's the original text? Thanks!
The version I've read is "Tutte le verità sono facili da capire quando sono rivelate, il difficile è scoprirle!" But that sounds like suspiciously modern Italian to me, so I wouldn't be surprised to find out that it's itself a paraphrase.
ETA: Apparently it was quoted in Criminal Minds, season 6, episode 11, and I suspect the Italian dubbing backtranslated the English version of the show rather than looking for the original wording by Galileo. (Which would make my version above a third-level translation.)
ETA2: In the original version of Criminal Minds, it's "All truths are easy to understand once they are discovered; the point is to discover them" according to Wikiquote. (How the hell did point become difficile? And why were the two instances of discover translated with different verbs? That's why I always watch shows and films in the original language!)
ETA3: And Wikiquote attributes that as “As quoted in Angels in the workplace : stories and inspirations for creating a new world of work (1999) by Melissa Giovagnoli”.
With the great historical exception of quantum mechanics.
In fact, most people don't understand relativity. Most still reject evolution. It wasn't easy to understand the Copernican system in Galileo's time.
It is easy to understand for a handful, and it seems obvious only to a few, when a new major breakthrough is made. Galileo was wrong. It may be easier, but not "easy to understand once a truth is revealed".
I suppose people didn't understand it because they didn't want to, not because they couldn't manage to. (Same with evolution -- what the OP was about. I might agree about relativity, though I guess for some people at least the absolute denial macro does play some part.)
More like stuff that was true back then is no longer true now.
I suppose not. Why? People either have an inborn concept of the absolute up-down direction, or they develop it early in life. Updating to a round (let alone moving and rotating) Earth is not that easy and trivial for the naive mind of a child or for a medieval man.
A new truth is usually hard to understand for everybody. Were it not so, science would progress faster.
I don't see how that contradicts my claim that it's not that people couldn't understand the meaning of the statement “the Earth revolves around the Sun”, but rather they disagreed with it because it was at odds with what they thought of the world. iħ∂|Ψ⟩/∂t = Ĥ|Ψ⟩, now that's a statement most people won't even understand enough to tell whether they think it's true or false.
Historical? I know you count many worlds as “understanding”, but I wouldn't until this puzzle is figured out. (Or maybe it's that I like Feynman's (in)famous quote so much I want to keep on using it, even if this means using a narrower meaning for understand.)
I would say instead that many truths are easy to understand once you understand them. But still hard to explain to other people.
-Daniel Kahneman, Thinking, Fast and Slow
Yeah, a good compression algorithm--a dictionary that has short words for the important stuff--is vital to learning just about anything. I've noticed that in the martial arts; there's no way to learn a parry, entry, and takedown without a somatic vocabulary for the subparts of that; and the definitions of your "words" affects both the ease of learning and the effectiveness of its execution.
Interesting. So by somatic vocabulary, you basically mean composing long complicated moves from short, repeatable sub-moves?
Basically, yes. Much of the vocabulary has very long descriptions in English, but shorter ones in different arts' parlance; some of it doesn't really have short descriptions anywhere but in the movements of people who've mastered it. The Epistemic Viciousness problem makes it difficult, in general, to find and cleave at the joints.
Also, wouldn't it be better to call it a hash table or a lookup-table rather than a compression algorithm. The key is swift and appropriate recall. Example: Compare a long-time practicing theoretical physicist with a physics grad student. Both know most of basic quantum mechanics. But the experienced physicist would know when to whip out which equation in which situation. So, the knowledge content is not necessarily compressed (I'm sure there is some compression) as much as the usability of the knowledge is much greater.
Ayn Rand
Making the (flawed) assumption that in a disagreement, they cannot both be wrong.
Also, they could be wrong about whether they actually disagree.
IME that's the case in a sizeable fraction of disagreements between humans; but if they “let reality be [their] final arbiter” they ought to realize that in the process.
I have also heard it quoted like this.
I think that if the other person convinces you that they are right and they are right, then it should count as "winning the argument". It's the idea that has lost, not you.
--- pseudonym
Sam Hughes, talking about the first season finale of Doctor Who, differentiating between the subjective feeling of certainty and the actual probability estimate.
Natalie Wolchover
I saw on TV some kid lose convincingly against a RPS champion when the kid had been given a prepared (random) list of moves to make ahead of time. That can't be explained by strategy - it was either coincidence or it's possible to cheat by seeing which way your opponent's hand is unfolding and change your move at the last moment.
The latter is definitely possible. Back when I was still playing RPS as a kid, I was fairly good at it; enough for somewhere upwards of 70% of my plays to be wins.
You don't want to change your move at the last moment though so much as you want to keep your hand in a plausibly formless configuration you can turn into a move at the last moment. Less likely to be called out for cheating.
Or the losers were unintentionally signaling their moves beforehand.
Tauriq Moosa
Wendy Braitman
Ksawery Tartakower
I like this even though it violates the correct standard of "mistake": was the choice expected-optimal, before the roll of the die?
I like that it suggests continuing to focus on the rest of the game rather than beating yourself up over a past mistake.
Tartakower was a chess player.
Somehow I'd imagined chess without really knowing.
The roll of the die is still in effect: unanticipated consequences of only-boundedly-optimal moves by each player can't make the original move more or less of a true mistake.
Tartakower also said "No one ever won a game by resigning" indeed.
Suppose White gives away a pawn, and then the next move White accidentally lets Black put him in checkmate. White made the next-to-last mistake, but lost, so the saying must be false in a mundane sense. Is there an esoteric sense in which the saying is true?
Hmm, I suppose a "mistake" in a technical sense is defined in terms of minimax position evaluation, assuming infinite computing power:
eval(position) = -1 (loss), 0 (tie), or +1 (win)
IsFatalMistake(move) = (eval(position before the move) > eval(position after the move)) AND (eval(position after the move) == -1)
With this definition, either giving away the pawn or missing the checkmate (or both) wasn't a fatal mistake, since the game was already lost before the move :)
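A minimal Python sketch of that definition, under stated assumptions: a toy game tree of nested lists stands in for chess, exhaustive minimax stands in for "infinite computing power", and all names and positions here are invented for illustration.

```python
# Toy sketch of the "fatal mistake" definition: a move is fatal only if
# it turns a non-lost position into a lost one under exhaustive minimax.

def evaluate(node, white_to_move):
    """Minimax value of a position from White's perspective: -1, 0, or +1.
    A leaf (int) is a finished game; an inner node is a list of children."""
    if isinstance(node, int):
        return node
    values = [evaluate(child, not white_to_move) for child in node]
    return max(values) if white_to_move else min(values)

def is_fatal_mistake(before, after, white_moves):
    """IsFatalMistake from the comment: the eval drops AND the resulting
    position is lost for the mover (here, White)."""
    v_before = evaluate(before, white_moves)
    v_after = evaluate(after, not white_moves)
    return v_before > v_after and v_after == -1

# White to move; one child line holds the draw, the other loses.
position = [[0, 1], [-1, -1]]
safe, blunder = position[0], position[1]
print(is_fatal_mistake(position, blunder, True))  # True: throws away the draw
print(is_fatal_mistake(position, safe, True))     # False: keeps the draw

# Matching the comment's point: in an already-lost position,
# no move counts as a fatal mistake.
lost_position = [[-1], [-1, -1]]
print(is_fatal_mistake(lost_position, lost_position[1], True))  # False
```

This also reproduces the smiley's point above: once eval(position) is already -1, nothing you play can satisfy the definition.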
On a purely empirical level, most amateur games, once they reach critical positions, are blunderfests punctuated by a few objectively strong moves that decide the game; many complex positions near the end of games are similar blunderfests even among masters. If you're assuming that the majority of moves are blunders, then Tartakower's point is generally true. But I don't think that's what he meant.
I read this as implying that the loser is the one who makes the last mistake — the mistake that allows his opponent to win.
But yeah, I think the quote is kinda sloppy — it assumes that the opponents take turns in making mistakes.
This is true if you only count as mistakes moves which turn a winning position into a losing position, as gRR said elsethread. (I think I picked up this meaning from Chessmaster 10's automatic analyses, and was implicitly assuming it when reading the Tartakower quote.)
Does "you shouldn't give up after a mistake, because many chess games involve both players, even the winner, making multiple mistakes" count as esoteric?
Robert Heinlein, Stranger In A Strange Land
--Diane Duane, High Wizardry
-- Neil DeGrasse Tyson
I think he'd do better if he just made up his mind. I'd go with the second one.
watch out folks, we got a badass over here
Fits this one, two out of three.
--Benjamin Vigoda, "Analog Logic: Continuous-Time Analog Circuits for Statistical Signal Processing" (2003 PhD thesis)
And the very next year, Intel abandoned its plans to make 4 GHz processors, and we've been stuck at around 3 GHz ever since.
Since then, parallel computing has indeed had the industry juggernaut behind it.
Yep, and that's why we all have dual-core or more now rather than long ago. Parallel computers of various architectures have been around since at least the '50s (mainframes had secondary processors for IO operations, IIRC), but were confined to niches until the frequency wall was hit and the juggernaut had to do something else with the transistors Moore's law was producing.
(I also read this quote as an indictment of the Lisp machine and other language-optimized processor architectures, and more generally, as a Hansonesque warning against 'not invented here' thinking; almost all innovation and good ideas are 'not invented here' and those who forget that will be roadkill under the juggernaut.)
--Morris Raphael Cohen, quoted by Cohen in "The Earth Is Round (p < 0.05)"
--Alain de Botton
How poignant for me since every last bit applies to me.
--Alain de Botton
(Perhaps this individual quote is insightful (I can't tell), but this sort of causal analysis leads to basic confusions of levels of organization more often than it leads to insight.)
Adolfo Bioy Casares (my translation)
Why don't they just play tag with each other? Sounds like it would be fun.
Because they're jerks.
Indeed. The kind of people who would go "Whee! Let's play tag!" in this situation do not find themselves in Hell (at least in this particular one) in the first place.
--Gregory Cochran, in a comment here
Also good, from that comment's OP:
Razib Khan
Yes but I didn't at first want to post that because it is slightly political. Though I guess the rationality core does outweigh any mind-killing.
You have a Rationality Core, too?
Mine tastes kind of like nougat.
This has 6 karma points, so I'm left curious about whether people have anything in mind about what real intellectuals shouldn't know.
I interpret the quote as saying that to be a "good intellectual" one needs to not know the problems with the positions "good intellectuals" are expected to defend.
I could be interpreting it entirely wrong, but I'd guess this is the list Cochran had in mind:
•
My immediate thought was that a 'real intellectual' shouldn't fill their brain with random useless information (e.g. spend their time reading tvtropes).
On our kind not cooperating:
Michelle Obama
Sounds like a counter to "Never interrupt your enemy when he is making a mistake." (Attributed but seemingly falsely to Napoleon Bonaparte)
-- Reg Braithwaite (raganwald)
--Joseph de Maistre, Les soirées de Saint-Pétersbourg, Ch. I
Some guilt also falls on those who are not eager enough to verify those opinions, or the money they circulate.
The man at the top (at the beginning) is NOT guilty of everything.
To my way of thinking, it's quite possible for me to be fully responsible for a chain of events (for example, if they would not have occurred if not for my action, and I was aware of the likelihood of them occurring given my action, and no external forces constrained my choice so as to preclude acting differently) and for other people upstream and downstream of me to also be fully responsible for that chain of events. This is no more contradictory than my belief that object A is to the left of object B from one perspective and simultaneously to the right of object B from another. Responsibility is not some mysterious fluid out there in the world that gets portioned out to individuals; it's an attribute that we assign to entities in a mental and/or social model.
You seem to be claiming that models wherein total responsibility for an event is conserved across the entire known causal chain are superior to mental models where it isn't, but I don't quite see why I ought to believe that.
The Princess Bride:
Man in Black: Inhale this, but do not touch.
Vizzini: [sniffs] I smell nothing.
Man in Black: What you do not smell is called iocane powder. It is odorless, tasteless, dissolves instantly in liquid, and is among the more deadly poisons known to man.
[He puts the goblets behind his back and puts the poison into one of the goblets, then sets them down in front of him]
Man in Black: All right. Where is the poison? The battle of wits has begun. It ends when you decide and we both drink, and find out who is right... and who is dead.
[Vizzini stalls, then eventually chooses the glass in front of the man in black. They both drink, and Vizzini dies.]
Buttercup: And to think, all that time it was your cup that was poisoned.
Man in Black: They were both poisoned. I spent the last few years building up an immunity to iocane powder.
Vizzini of the Princess Bride, on the dangers of reasoning in absolutes - both logically ("this is proof it's not in my goblet") and propositionally (the implicit assumption Vizzini has that one and only one wine goblet is poisoned - P or ~P, as it were)
I don't agree that Vizzini is trying to reason in logical absolutes. He talks like he is, but he doesn't necessarily believe the things he's saying.
Man in Black: You're trying to trick me into giving away something. It won't work.
Vizzini: It has worked! You've given everything away! I know where the poison is!
My interpretation is that he really is trying to trick the man.
Later he distracts the man and swaps the glasses around; then he pretends to choose his own glass. He makes sure the man drinks first. I think he's reasoning/hoping that the man would not deliberately drink from the poisoned cup. So when the man does drink he believes his chosen cup is safe. If the man had been unwilling to drink, Vizzini would have assumed that he now held the poisoned glass, and perhaps resorted to treachery.
He's overconfident, but he's not a complete fool.
(I don't have strong confidence in this analysis, because he's a minor character in a movie.)
Well, yes, he only pretends to reason in logical absolutes...
... which was why I wrote "and propositionally" - because he does actually reason in propositional absolutes. I agree with your analysis but note that it is only a good strategy if it's true that one and only one cup contains poison (or the equivalent, that one and only one cup will kill the Man in Black).
On re-reading I may have lost that subtlety in the clumsy (parenthetical-filled) expression of the final line.
That the Man in Black describes it as a battle of wits - and not a puzzle - agrees with you.
--Matt Yglesias
I found that very poignant, but I'm not sure I agree with his final claim. I think he's committing the usual mistake of claiming impossible what seems hard.
Is it even hard? JFDI, or as we might say here, shut up and do the impossible. Is "efficient" a tendentious word? Taboo it. Is discussion being confused by mixing normative and positive concepts? DDTT.
The quote smells like rationalising to me.
Yeah, agreed. It's entirely possible to describe a system of economic agents without using such value-laden terms (though in some cases we may have to make up new terms). We don't do it, mostly because we don't want to. Which IMHO is fine; there's no particular reason why we should.
The first thought that I have when considering how to describe the economy without using normative language is that all of the values that are commonly measured (i.e. GDP, unemployment, etc.) are chosen to be measured because they are proxies for things that people value.
In fact, the whole study of economics seems to me like the study of things people value and how they are distributed. If you choose proxies for value you're having a profound effect on what gets measured (consider the recent discussions of statistical significance as a proxy for evidence) and if you try to list everything that everyone values you end up butting up against unsolved problems.
-- Niels Henrik Abel, on how he developed his mathematical ability.
1 Corinthians 15:54-57
(I like this quote, as long as it's shamelessly presented without context of the last line: "But thanks be to God, who gives us the victory through our Lord Jesus Christ." )
How do you interpret that line?
Diana Wynne Jones, Dark Lord of Derkholm
-- Dinosaur Comics
Mencius Moldbug, A gentle introduction to Unqualified Reservations (part 2) (yay reflection!)
— Aleksandr Solzhenitsyn, The Gulag Archipelago
-- Benjamin Franklin, Letter to Joseph Priestley, 8 Feb 1780
One of the first transhumanists?
"Be perfect, like an FAI is perfect." -- Jesus
The hard core of transhumanism goes back to at least the Middle Ages, possibly sooner.
Interesting. The particular philosophers you have in mind?
Primarily, I had the Arabic-speaking philosophical alchemists in mind, but there are others. If there is significant interest, then I will elaborate further.
Does Imitation of Christ count as transhumanism, or is too ideologically distinct?
I would say no, because there isn't enough emphasis on technology as the means of achieving post-humanity.
-David Deutsch, The Beginning of Infinity.
The Pythagorean theorem isn't proved, or even checked, by measuring right triangles and noticing that a^2 + b^2 = c^2. Is the Pythagorean theorem not knowledge?
I don't think Deutsch means that mathematical proofs are all inductive. I think he means that proofs are constructed and checked on physical computing devices like brains or GPGPUs, and that because of that, mathematical knowledge is not in a different ontological category than empirical knowledge.
I feel quite confident saying that mathematics will never undergo paradigm shifts, to use the terminology of Kuhn.
The same is not true for empirical sciences. Paradigm shifts have happened, and I expect them to happen in the future.
Would the whole Russell's paradox incident count as a mathematical paradigm shift?
Reading Wikipedia, it looks like a naive definition of a set turns out to be internally inconsistent. Does that mean the concept of set was abandoned by mathematicians the way epicycles have been abandoned by physicists? That's not my sense, so I hesitate to say redefining set in a more coherent way is a paradigm shift. But I'm no mathematician.
It's a matter of degree rather than an absolute line. However, I would say a time when even the very highest experts in a field believed something of great importance to their field with quite high confidence, and then turned out to be wrong, probably counts.
I don't think "everyone in field X made an error" is the same thing as saying "Field X underwent a paradigm shift."
Why not? That sounds like a massive shift in the core beliefs of the field in question. If that's not a paradigm shift, then what is?
The "non-expressible in the new concept-space" thing that you think never actually happens.
What would count as one?
As I understand it, a paradigm shift would include the abandonment of a concept. That is, the concept cannot be coherently expressed using the new terminology. For example, there's no way to coherently express concepts like Ptolemy's epicycles or Aristotle's impetus. I think Kuhn would say that these examples are evidence that empirical science is socially mediated.
I'm not aware of any formerly prominent mathematical concepts that can't even be articulated with modern concepts. Because mathematics is non-empirical and therefore non-social, I would be surprised if they existed.
I'm not seeing how the second sentence is an example of the criterion in your first sentence. That criterion seems too strict, too: in general the new paradigm subsumes the old (as in the canonical example of Newtonian vs relativistic physics).
I'm also not seeing what the attributes "empirical" and "non-social" have to do (causally) with the ability to form coherent concepts.
Maybe you should also unpack what you mean by "coherent"?
I'm not a mathematician, but from my outside perspective I would cheerfully qualify something like Wilf-Zeilberger theory as the math equivalent to a paradigm shift in the empirical sciences.
WP lists "non-euclidean geometry" as a paradigm shift, BTW.
Using modern physics, there is no way to express the concept that Ptolemy intended when he said epicycles. More casually, modern physicists would say "Epicycles don't exist." By contrast, the concept of set is still used in Cantor's sense, even though his formulation contained a paradox. So I think the move from geocentric theory to heliocentric theory is a paradigm shift, but adjusting the definition of set is not.
I'm using the word science as synonymous with "empirical studies" (as opposed to making stuff up without looking). That's not intended to be controversial in this community. What is controversial is the assertion that studying the history of science shows examples of paradigm shifts.
One possible explanation of this phenomenon is that science is socially mediated (i.e. affected by social factors when the effect is not justified by empirical facts).
I'm asserting that mathematics is not based on empirical facts. Therefore, one would expect that it could avoid being socially mediated by avoiding interacting with reality (that is, I think a sufficiently intelligent Cartesian skeptic could generate all of mathematics). IF I am correct that they are caused by the socially mediated aspects of the scientific discipline and IF mathematics can avoid being socially mediated by virtue of its non-empirical nature, then I would expect that no paradigm shifts would occur.
This whole reference to paradigm shifts is an attempt to show a justification for my belief that mathematics is non-empirical, contrary to the original quote. If you don't believe in paradigm shifts (as Kuhn meant them, not as used by management gurus), then this is not a particularly persuasive argument.
If Wikipedia says that, I don't think it is using the word the way Kuhn did.
As I'd mentioned elsewhere, there's actually a pretty easy way to express that, IMO: "Ptolemy thought that planets move in epicycles, and he was wrong for the following reasons, but if we had poor instruments like he did, we might have made the same mistake".
The abovementioned non-euclidean geometry is one such shift, as far as I understand (though I'm not a mathematician). I'm not sure what the difference is between the history of this concept, and what Kuhn meant.
But there were other, more powerful paradigm shifts in math, IMO. For example, the invention of (or discovery of, depending on your philosophy) zero (or, more specifically, a positional system for representing numbers). Irrational numbers. Imaginary numbers. Infinite sets. Calculus (contrast with Zeno's Paradox). The list goes on.
I should also point out that many, if not all, of these discoveries (or "inventions") either arose as a solution to a scientific problem (f.ex. Calculus), or were found to have a useful scientific application after the fact (f.ex. imaginary numbers). How can this be, if mathematics is entirely "non-empirical" ?
Hmm, I'll have to think about the derivation of zero, the irrational numbers, etc.
The motivation for deriving mathematical facts is different from the ability to derive them. I don't see why the Cartesian skeptic would want to invent calculus; I'm only saying it would be possible. It wouldn't be possible if mathematics were not independent of empirical facts (because the Cartesian skeptic is isolated from all empirical facts except the skeptic's own existence).
Hmm, "justified" generally has a social component, so I doubt that this definition is useful.
So this WP page doesn't exist? ;)
My position, FWIW, is that all of science is socially mediated (as a consequence of being a human activity), mathematics no less than any other science. Whether a mathematical proposition will be assessed as true by mathematicians is a property ultimately based on physics - currently the physics of our brains.
For Kuhn, the word was, if anything, a sociological term -- not something referring to the structure of reality itself. (Kuhn was not himself a postmodernist; he still believed in physical reality, as distinct from human constructs.) So it seems to me that it would be entirely consistent with his usage to talk about paradigm shifts in mathematics, since the same kind of sociological phenomena occur in the latter discipline (even if you believe that the nature of mathematical reality itself is different from that of physical reality).
There are perfectly fine ways to express those things. Epicycles might even be useful in some cases, since they can be used as a simple approximation of what's going on.
The reason people don't use epicycles any more isn't because they're unthinkable, in the really strong "science is totally culture-dependent" sense. It's because using them was dependent on whether we thought they reflected the structure of the universe, and now we don't. Ptolemy's claim behind using epicycles was that circles were awesome, so it was likely that the universe ran on circles. This is a fact that could be tested by looking at the complexity of describing the universe with circles vs. ellipses.
So this paradigm shift stuff doesn't look very unique to me. It just looks like the refutation of an idea that happened to be central to using a model. Then you might say that math can have no paradigm shifts because it constructs no models of the world. But this isn't quite true - there are models of the mathematical world that mathematicians construct that occasionally get shaken up.
My point was that trying to express epicycles in the new terminology is not possible. That is, modern physicists say, "Epicycles don't exist."
Obviously, it is possible to use sociological terminology to describe epicycles. You yourself said that they were useful at times. But that's not the language of physics.
Since you mentioned it, I would endorse "Science is substantially culturally dependent", NOT "Science is totally culturally dependent", i.e. so culturally dependent that there is no reason to expect correspondence between any model and reality. Better science makes better predictions, but it's not clear what a "better" model would be if there's no correspondence with reality.
I brought all this up not to advocate for the cultural dependence of science. Rather, I think it would be surprising for a discipline independent of empirical facts to have paradigm shifts. Thus, the absence of paradigm shifts is a reason to think that mathematics is independent of empirical facts.
If you don't think science is substantially culturally dependent, then there's no reason my argument should persuade you that mathematics is independent of empirical facts.
But it is! You simply specify the position as a function of time and you've done it! The reason why that seems so strange isn't because modern physics has erased our ability to add circles together, it's because we no longer have epicycles as a fundamental object in our model of the world.
So if you want the copernican revolution to be a paradigm shift, the idea needs to be extended a bit. I think the best way is to redefine paradigm shift as a change in the language that we describe the world in. If we used to model planets in terms of epicycles, and now we model them in terms of ellipses, that's a change of language, even though ellipses can be expressed as sums of epicycles, and vice versa.
In fact, in every case of inexpressibility that we know of, it's been because one of the ways of thinking about the world didn't give correct predictions. We have yet to find two ways of thinking about the world that let you get different experimental results if you plan the experiment two different ways. In these cases, the paradigm shift included the falsification of a key claim.
I don't think it's necessarily true (for example, you can imagine an abstract game having a revolution in how people thought about what it was doing), but it seems reasonable for math, depending on how you define "math." I think people are just giving you a hard time because you're trying to make this general definitional argument (generally not worth the effort) on pretty shaky ground.
Thanks, that's quite clear. Should I reference abandonment of fundamental objects as the major feature of a paradigm shift?
Yes, every successful paradigm shift. Proponents of failed paradigm shifts are usually called cranks. :)
My position is that the repeated pattern of false fundamental objects suggest that we should give up on the idea of fundamental objects, and simply try to make more accurate predictions without asserting anything else about the "accuracy" of our models.
This is false in an amusing way: expressing motion in terms of epicycles is mathematically equivalent to decomposing functions into Fourier series -- a central concept in both physics and mathematics since the nineteenth century.
To be perfectly fair, AFAIK Ptolemy thought in terms of a finite (and small) number of epicycles, not an infinite series.
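To make the equivalence concrete, here is a minimal sketch (mine, not from the thread): each term of a complex Fourier series is a point moving on a circle at uniform speed, so a finite Fourier sum is literally a deferent plus epicycles. A 2:1 ellipse, for instance, is rebuilt exactly from just two such circles.

```python
# Illustration of "epicycles = Fourier series": sample an elliptical
# orbit, compute its discrete Fourier coefficients by hand, and rebuild
# it from two uniformly rotating circles. Everything here is a toy.
import cmath
import math

N = 64
ts = [2 * math.pi * n / N for n in range(N)]
# A 2:1 ellipse traced in the complex plane: x(t) = 2 cos t, y(t) = sin t.
orbit = [complex(2 * math.cos(t), math.sin(t)) for t in ts]

def dft_coeff(samples, k):
    """Coefficient c_k in the discrete Fourier series sum_k c_k e^{ikt}:
    the radius and phase of the k-th epicycle."""
    n = len(samples)
    return sum(z * cmath.exp(-1j * k * 2 * math.pi * m / n)
               for m, z in enumerate(samples)) / n

# An ellipse needs exactly two circles: frequencies +1 and -1.
c_plus = dft_coeff(orbit, 1)    # radius 1.5
c_minus = dft_coeff(orbit, -1)  # radius 0.5

rebuilt = [c_plus * cmath.exp(1j * t) + c_minus * cmath.exp(-1j * t)
           for t in ts]
err = max(abs(a - b) for a, b in zip(rebuilt, orbit))
print("epicycle radii:", round(abs(c_plus), 3), round(abs(c_minus), 3))
print("max reconstruction error:", err)  # ~ machine precision
```

Ptolemy's finitely many circles correspond to truncating this series after a few terms; adding more epicycles is exactly adding more Fourier modes.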
I disagree, as, I suspect, you already know :-)
But I have a further disagreement with your last sentence:
What do you mean, "and therefore"? As I see it, "empirical" is the opposite of "social". Gravity exists regardless of whether I like it or not, and regardless of how many passionate essays I write about Man's inherent freedom to fly by will alone.
Yes, non-empirical is the wrong word. I mean to assert that mathematics is independent of empirical fact (and therefore non-social. A sufficiently intelligent Cartesian skeptic could derive all of mathematics in solitude).
I don't know whether this is true or not; arguments could (and have) been made that such a skeptic could not exist in a non-empirical void. But that's a bit offtopic, as I still have a problem with your previous sentence:
Are you asserting that all things which are "dependent on empirical fact" are "social"? In this case, you must be using the word "social" in a different way than I am.
If we lived in a culture where belief in will-powered flight was the norm, and where everyone agreed that willing yourself to fly was really awesome and practically a moral imperative... then people would still plunge to their deaths upon stepping off of skyscraper roofs.
:) It is the case that the coherence of the idea of the Cartesian skeptic is basically what we are debating.
I'm specifically asserting that things that are independent of empirical facts are non-social.
I think that things that are subject to empirical fact are actually subject to social mediation, but that isn't a consequence of my previous statement.
What does rejection of the assertion "If you think you can fly, then you can" have to do with the definition of socially mediated? I don't think post-modern thinking is committed to the anti-physical realism position, even if it probably should endorse the anti-physical models position. The ability to make accurate predictions doesn't require a model that corresponds with reality.
Didn't Gödel show that nobody can derive all of mathematics in solitude, because you can't have a complete and consistent mathematical framework?
Gödel showed that no one can derive all of mathematics at all, whether in solitude or in a group, because no consistent system of axioms can prove all the true statements in its domain.
Anyone know whether it's proven that there are guaranteed to be non-self-referential truths which can't be derived from a given axiom system? (I'm not sure whether "self-referential" can be well-defined.)
A totally trivial nitpick, I admit, but there's no such thing as the Aristotelian theory of impetus. The theory of impetus was an anti-Aristotelian theory developed in the Middle Ages. Aristotle had no real dynamical theory.
Thanks, I did not actually know that. But I should have known.
Thanks. Did not know that.
I believe it already has. Consider the Weierstrass revolution. Before Weierstrass, it was commonly accepted that while a continuous function might lack a derivative at a set of isolated points, it still had to have a derivative somewhere. Then Weierstrass constructed a counterexample -- a function continuous everywhere but differentiable nowhere -- which I think satisfies the Kuhnian "anomaly that cannot be explained within the current paradigm."
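The counterexample can be sketched numerically (a hypothetical toy, using the textbook parameters a = 1/2, b = 13, which satisfy ab > 1 + 3π/2): partial sums of the Weierstrass series converge uniformly, so the limit is continuous, yet the oscillation per unit length keeps growing as you zoom in, which is the fingerprint of nowhere-differentiability.

```python
import math

# Partial sums of the Weierstrass function
#     W(x) = sum_{n>=0} a^n * cos(b^n * pi * x),  0 < a < 1,  a*b > 1 + 3*pi/2,
# the classic function that is continuous everywhere but differentiable
# nowhere. Parameters are the textbook choice a = 1/2, b = 13.

A, B = 0.5, 13

def weierstrass(x, n_terms=30):
    # The tail of the series is bounded by A**n_terms / (1 - A), so 30 terms
    # give ~1e-9 accuracy; uniform convergence is why the limit is continuous.
    return sum(A**n * math.cos(B**n * math.pi * x) for n in range(n_terms))

x0 = 0.1
ratios = []
for k in range(1, 6):
    h = B ** (-k)  # zoom in geometrically, matched to the term frequencies
    xs = [x0 + j * h / 50 for j in range(51)]
    vals = [weierstrass(x) for x in xs]
    # Oscillation per unit length over a window of width h; for a
    # differentiable function this would level off near |W'(x0)|.
    ratios.append((max(vals) - min(vals)) / h)
print(ratios)
```

Instead of settling down, the printed ratios grow roughly like (ab)^k at each zoom level, mirroring the unbounded difference quotients of the limit function.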
Another quick example: during the pre-War period, most differential geometry was concerned with embedded submanifolds in Euclidean space. However, this formulation made it difficult to describe or classify surfaces -- I seem to recall but don't have time to verify that even deciding whether two sets of algebraic equations determine isomorphic varieties is NP-hard. Hence the post-War shift to intrinsic properties and descriptions.
EDIT: I was wrong, or at least imprecise. Isomorphism of varieties can be decided with Gröbner bases, the reduction of which is still doubly exponential in time, as far as I can tell. Complexity classes aren't in my domain; I shouldn't have said anything about them without looking it up. :(
Reading the wiki page, it looks like Weierstrass corrected an error in the definition or understanding of limits. But mathematicians did not abandon the concept of limit the way physicists abandoned the concept of epicycle, so I'm not sure that qualifies as a paradigm shift. But I'm not a mathematician, so my understanding may be seriously incomplete.
I can't even address your other example due to my failure of mathematical understanding.
Hindsight bias. The old limit definition was not widely considered either incorrect or incomplete.
They abandoned reasoning about limits informally, which was de rigueur beforehand. For an example of this, see Weierstrass' counterexample to the Dirichlet principle. Prior to Weierstrass, some people believed that the Dirichlet principle was true because approximate solutions exist in all natural examples, and therefore the limit of approximate solutions will be a true solution.
That's pretty clear, thanks. Obviously, experts aren't likely to think there is a basic error before it has been identified, but I'm not in position to have a reliable opinion on whether I'm suffering from hindsight bias.
Still, what fundamental object did mathematics abandon after Weierstrass' counter-example? How is this different from the changes to the definition of set provoked by Russell's paradox?
I don't recall where it is said that such an object is necessary for a Kuhnian revolution to have occurred. There was a crisis, in the Kuhnian sense, when the old understanding of limit (perhaps labeling it as limit1 will be clearer) could not explain the existence of e.g., continuous functions without derivatives anywhere, or counterexamples to the Dirichlet principle. Then Weierstrass developed limit2 with deltas and epsilons. Limit1 was then abandoned in favor of limit2.
Not true. The "old limit definition" was non-existent beyond the intuitive notion of limit, and people were fully aware that this was not a satisfactory situation.
We need to clarify what time period we're talking about. I'm not aware of anyone in the generation of Newton/Leibniz and the second generation (e.g., Daniel Bernoulli and Euler) who felt that way, but it's not as if I've read everything these people ever wrote.
The earliest criticism I'm aware of is Berkeley in 1734, but he wasn't a mathematician. As for mathematicians, the earliest I'm aware of is Lagrange in 1797.
I'm also curious about this history.
Wikipedia gives the acceptance of non-Euclidean geometry as a "classical case" of a paradigm shift. I suspect that there were several other paradigm shifts involved from Euclid's math to our math: for instance, coordinate geometry, or the use of number theory applied to abstract quantities as opposed to lengths of line segments.
The frequentist vs. Bayesian debate is a debate between competing mathematical paradigms. True mathematicians, however, shun statistics. They don't like the statistical paradigm ;)
Gödel's discovery ended a certain mathematical paradigm of wanting to construct a complete mathematics from the ground up.
I could imagine a future paradigm shift away from the ideal of mathematical proofs toward more experimental math. Neural nets or quantum computers might give you answers to the mathematical questions you ask that are better than the answers that axiom-and-proof-based math provides.
Except, in practice mathematics still works this way.
No, that's not how you prove it, but you can check it pretty easily with right triangles. Similarly, if you believe that Pi == 3, you only need a large wheel and a piece of string to discover that you're wrong. This won't tell you the actual value of Pi, nor would it constitute a mathematical proof, but at least the experience would point you in the right direction.
If you find a right triangle with sides (2.9, 4, 5.15) rather than (3, 4, 5), are you ever entitled to reject the Pythagorean theorem? Don't measurement error and the non-Euclidean nature of the actual universe completely explain your experience?
In short, it seems like you can't empirically check the Pythagorean theorem.
That is not what I said. I said, regarding Pi == 3, "this won't tell you the actual value of Pi, nor would it constitute a mathematical proof, but at least the experience would point you in the right direction". If you believe that a^2 + b^2 = c^5, instead of c^2; and if your instruments are accurate down to 0.2 units, then you can discover very quickly that your formula is most probably wrong. You won't know which answer is right (though you could make a very good guess, by taking more measurements), but you will have enough evidence to doubt your theorem.
The words "most probably" in the above sentence are very important. No amount of empirical measurements will constitute a 100% logically consistent mathematical proof. But if your goal is to figure out how the length of the hypotenuse relates to the lengths of the two sides, then you are not limited to total ignorance or total knowledge, with nothing in between. You can make educated guesses. Yes, you could also get there by pure reason alone, and sometimes that approach works best; but that doesn't mean that you cannot, in principle, use empirical evidence to find the right path.
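The point about educated guesses can be sketched in a small simulation (all numbers invented for illustration, nothing here is a proof): noisy "measurements" of right triangles discriminate sharply between a^2 + b^2 = c^2 and a rival rule like a^2 + b^2 = c^5.

```python
import math
import random

# Simulated noisy measurements of right triangles, comparing how well two
# candidate rules fit: the Pythagorean a^2 + b^2 = c^2 versus the invented
# rival a^2 + b^2 = c^5. Ruler noise and triangle sizes are made up.

random.seed(0)

def measure(true_value, noise=0.1):
    """A 'ruler reading': the true length plus uniform noise."""
    return true_value + random.uniform(-noise, noise)

def mean_error(exponent, n_triangles=200):
    """Average |ma^2 + mb^2 - mc^exponent| over noisy measured triangles."""
    err = 0.0
    for _ in range(n_triangles):
        a = random.uniform(1, 10)
        b = random.uniform(1, 10)
        c = math.hypot(a, b)  # the true hypotenuse length
        ma, mb, mc = measure(a), measure(b), measure(c)
        err += abs(ma**2 + mb**2 - mc**exponent)
    return err / n_triangles

err_sq = mean_error(2)
err_fifth = mean_error(5)
print(err_sq, err_fifth)  # the exponent-5 rule misses by orders of magnitude
```

No finite number of such measurements proves the theorem, but the residuals make one hypothesis vastly more probable than the other, which is all the commenter is claiming.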
Peer review. If the next two hundred scientists who measure your triangle get the same measurements from other rulers by different manufacturers, you'd be completely justified in rejecting the Pythagorean theorem.
My challenge to you: go out and see if you can find a right triangle with those measurements.
Sure, how about a triangle just outside a black hole.
That was a quick trip. Which black hole was it?
I am having trouble with this as a statement of historical fact. Isn't that how they did it?
You could call it a paradigm shift that we today don't like how they did it ;)
Of course there is. A proof of a mathematical proposition is just as much itself a mathematical object as the proposition being proved; it exists just as independently of physics. The proof as written down is a physical object standing in the same relation to the real proof as the digit 2 before your eyes here bears to the real number 2.
But perhaps in the context Deutsch isn't making that confusion. What scope and limitations on mathematical knowledge, conditioned by the laws of nature, does he draw out from these considerations?
Friedrich Nietzsche
I don't think that is a good description of what people mean by "faith".
For a better idea of the concept of faith start here.
It's not what people intend "faith" to mean, but nevertheless it often ends up being its effective definition. (EDIT: To clarify, by "it" I am referring to Nietzsche's definition.)
Friedrich Nietzsche, foreseeing the CEV-problem? (Just kidding, of course)
[Taking the lyrics literally, the whole thing is a pretty sweet transhumanist anthem.]
--George F. Stigler, "Economics or Ethics?"
Scott Adams
-Michael "Kayin" O'Reilly
Or, as the Language Log puts it:
Swap out "grammar" and "style" for "morality" and "ethics"?
It's Language Log, without the, goddammit!
One of my favorite things about many constructed languages is that they get rid of this distinction entirely. You don't have to worry about whether or not "Xify" is a so-called real word for any given value of X; you only have to check whether X's type fits the pattern. This happens merely because it's a lot easier, when you're working from scratch anyway, to design the language that way than to come up with a big artificial list of -ify words.
"Do you believe in revolution
Do you believe that everything will change
Policemen to people
And rats to pretty women
Do you think they will remake
Barracks to bar-rooms
Yperit to Coca-Cola
And truncheons to guitars?
Oh-oh, my naive
It will never be like that
Oh-oh, my naive
Life is like it is
Do you think that ever
Inferiority complexes will change to smiles
Petržalka to Manhattan
And dirty factories to hotels
Do you think they will elevate
Your idols to gods
That you will never have to
Bathe your sorrow with alcohol?
Oh-oh, my naive...
Do you think that suddenly
Everyone will reconcile with everyone
That no one will write you off
If you will have holes in your jeans
Do you think that in everything
Everyone will help you
That you will never have to be
Afraid of a higher power?
Oh-oh, my naive..."
My translation of a 1990s Slovak punk-rock song, "Slobodná Európa: Nikdy to tak nebude". Is it an example of an outside view, or just an attempt to reverse stupidity?
Found here.
-- Terry Goodkind, Faith of the Fallen. I know quite a few here dislike the author, but there's still a lot of good material, like this one, or the Wizard's Rules.
-Seth Godin
-- Hans Moravec Time Travel and Computing
-Carl Sagan, The Demon Haunted World
Steven Kaas
-Charles Dodgson (Lewis Carroll), Through the Looking-Glass
Isn't Humpty Dumpty wrong, if the goal is intelligible conversation?
Absolutely. But if the goal is to establish dominance, as Humpty Dumpty suggests (or appears to), his technique often works.
Winston Churchill
-- G.B. Shaw, "Man and Superman"
Shaw evinces a really weird, teleological view of evolution in that play, but in doing so expresses some remarkable and remarkably early (1903) transhumanist sentiments.
I love that quote, but if it carries a rationality lesson, I fail to see it. Seems more like an appeal to the tastes of the audience here.
Yeah, you're correct. Wasn't thinking very hard.
I have to disagree; the lesson in the quote is "Win as hard as you can", which is very important if not very complicated.
I don't see the connection. It's not obvious that bringing into existence a being superior to myself is maximum win for me. Not everyone values the Superman the way Shaw's Don Juan does.
Lynne Murray