Rationality Quotes May 2014
Another month has passed and here is a new rationality quotes thread. The usual rules are:
- Please post all quotes separately, so that they can be upvoted or downvoted separately. (If they are strongly related, reply to your own comments. If strongly ordered, then go ahead and post them together.)
- Do not quote yourself.
- Do not quote from Less Wrong itself, HPMoR, Eliezer Yudkowsky, or Robin Hanson. If you'd like to revive an old quote from one of those sources, please do so here.
- No more than 5 quotes per person per monthly thread, please.
- Provide sufficient information (URL, title, date, page number, etc.) to enable a reader to find the place where you read the quote, or its original source if available. Do not quote with only a name.
-- Joseph P. Simmons, The Reformation: Can Social Scientists Save Themselves
Context: The quotes here are taken from the C.S. Lewis sci-fi novel Perelandra, in which the protagonist, Ransom, travels to an idyllic Venus to hold philosophical debates and box with a man possessed by a demon.
These quotes come from the beginning of the novel when Ransom is attempting to describe the experience of having been transported through space by extraterrestrial means which had augmented his body to protect it from cold and hunger and atrophy for the duration of the journey.
This discussion (taking place in a debate over the Christian afterlife) touches upon certain sentiments about how the augmentation (or, for Lewis, glorification) of modern human bodies does not lessen us as humans but instead only improves that which is there.
C.S. Lewis, Perelandra, p. 29.
This feels like a combination of words that are supposed to sound Wise, but don't actually make sense. (I guess Lewis uses this technique frequently.)
How specifically could being "definite" be a problem for language? Take any specific thing, apply an arbitrary label, and you are done.
There could be a problem when a person X experienced some "qualia" that other people have never experienced, so they can't match the verbal description with anything in their experience. Or worse, they have something similar, which they match instead, even when told not to. And this seems like a situation described in the text. -- But then the problem is not having the shared experience. If they did, they would just need to apply an arbitrary label, and somehow make sure they refer to the same thing when using the label. The language would have absolutely no problem with that.
Since any attempt to defend the quote itself will only come off as a desire to shoehorn my chosen author into the rationality camp, I'll just give the simple reason why I chose to include that quote instead of stopping with the two previous:
I felt it touched on the subject of inferential distance and discussing reality using labels in a manner that was worthy of attention.
I tend to think of language as a symbolic system to denote/share/communicate these experiences with other brains. Of course, there's the inherent challenge that seldom are two experiences the same (even in an experiment on electrons). It's one of the reasons one of my favourite sci-fi scenarios is brain-brain interfaces that figure out some way to interpret and transfer the empirical heuristic rules about a probability distribution (of any given event) from one person to another. Or maybe I'm just being too idealistic about people always having such heuristics in their heads (even if they are not aware of it). :-)
This remark seems to flow from an oversimplified view of how language works. In the context of, for example, a person or a chair, this paradigm seems pretty solid... at least, it gets you a lot. You can ostend the thing ('take' it, as it were) and then apply the label. But in the case of lots of "objects" there is nothing analogous to such 'taking' as a prior, discrete step from talking. For example, "objects" like happiness, or vagueness or definiteness themselves.
I think you may benefit from reading Wittgenstein, but maybe you'd just hate it. I think you need it though!
While we are quoting Perelandra
A parallel passage from 1984:
'You will understand that I must start by asking you certain questions. In general terms, what are you prepared to do?'
'Anything that we are capable of,' said Winston.
O'Brien had turned himself a little in his chair so that he was facing Winston. He almost ignored Julia, seeming to take it for granted that Winston could speak for her. For a moment the lids flitted down over his eyes. He began asking his questions in a low, expressionless voice, as though this were a routine, a sort of catechism, most of whose answers were known to him already.
'You are prepared to give your lives?'
'Yes.'
'You are prepared to commit murder?'
'Yes.'
'To commit acts of sabotage which may cause the death of hundreds of innocent people?'
'Yes.'
'To betray your country to foreign powers?'
'Yes.'
'You are prepared to cheat, to forge, to blackmail, to corrupt the minds of children, to distribute habit-forming drugs, to encourage prostitution, to disseminate venereal diseases--to do anything which is likely to cause demoralization and weaken the power of the Party?'
'Yes.'
'If, for example, it would somehow serve our interests to throw sulphuric acid in a child's face--are you prepared to do that?'
'Yes.'
'You are prepared to lose your identity and live out the rest of your life as a waiter or a dock-worker?'
'Yes.'
'You are prepared to commit suicide, if and when we order you to do so?'
'Yes.'
'You are prepared, the two of you, to separate and never see one another again?'
'No!' broke in Julia.
It appeared to Winston that a long time passed before he answered. For a moment he seemed even to have been deprived of the power of speech. His tongue worked soundlessly, forming the opening syllables first of one word, then of the other, over and over again. Until he had said it, he did not know which word he was going to say. 'No,' he said finally.
In an opinion piece in the Boston Globe called "At MIT, the humanities are just as important as STEM" by Deborah K. Fitzgerald, Apr 30, 2014
The Slashdot poster AthanasiusKircher goes on to ask
See slashdot post
Some of these things are not like the others...
Which are the odd ones out?
To a first approximation:
{ critical thinking skills; an ability to work with and interpret numbers and statistics; a willingness to experiment, to open up to change }
vs.
{ knowledge of the past and other cultures; access to the insights of great writers and artists }
Then you've got this one by itself because what the heck does it even mean:
{ the ability to navigate ambiguity }
This is part of critical thinking. Taking a vaguely defined or ambiguous problem, parsing out what it means and figuring out an approach.
I'm rather curious;
If you take people across a big swath of humanities, and ask them about subjects where there is a substantial amount of debate and not a lot of decisive evidence - say, theories of a historical Jesus - how many of those people are going to describe one of those theories as more likely than not?
Like, if you have dozens of theories that you've studied and examined closely, are we going to see people assigning >50% to their favored theory? Or will people be a lot more conservative with their confidence?
Could you have picked an example where one side isn't composed entirely of crackpots?
Depends on your definition of crackpots. I don't think most Jesus scholars are crackpots, just most likely overly credulous of their favored theories.
What I'm curious about is if people in these fields that are starved for really decisive evidence still feel compelled to name a >50% confidence theory, or if they are comfortable with the notion that their most-favored hypothesis indicated by the evidence is still probably wrong, and just comparatively much better than the other hypotheses that they have considered.
I think he meant "jesus myth" proponents, who IIRC are ... dubious.
Well, hence "historical Jesus". If I were talking about Jesus mythicists, I would have said that. I ignorantly assume there aren't that many Jesus mythicist camps fighting each other out over specific theories of mythicism...
I'm actually looking forward to Richard Carrier's book on that, but I do not expect it to decide mythicism.
Which side are you claiming to be crackpots?
Modern day people who believe there was no real historical preacher, probably named Yeshua or something like that, wandering around Palestine in the first century, and on whom the Gospels are based, are crackpots. Their position is strongly refuted by the available evidence. You don't have to be a theist or a Christian to accept this. See, for example, pretty much any of the works of Bart Ehrman, particularly "Did Jesus Exist?"
There are legitimate disputes about this historical figure. How educated was he? Was he more Jewish or Greek in terms of philosophy and theology? (That he was racially Jewish is undenied.) Was he a Zealot? etc. However that he existed has been very well established.
Seriously, I can't see how anyone could claim that Jesus was ahistorical who isn't some combination of doing reverse-stupidity on Christianity or taking an absurd contrarian position for the sake of taking an absurd contrarian position.
Edit: fixed typo.
I would think that believing Jesus didn't exist would be just as absurd as thinking that all or almost all of the events in the Gospels literally happened. Yet the latter view is held by a significant number of practicing Biblical scholars. And the majority of Biblical scholars who don't think the Gospels are almost literally true still have a form of Jesus-worship going on, as they are practicing Christians. It would be hard to think that Jesus both came back from the dead and also didn't exist; meaning that it would be very hard to remain a Christian while also claiming that Jesus didn't exist, and most Biblical scholars were Christians before they were scholars.
The field is biased in a non-academic way against one extreme position while giving cover and legitimacy to the opposite extreme position.
BTW, the probability that the Jesus character in the four Gospels was based on a real person would be a great question to ask in the next LW census/survey.
I predict you'd get a minority of people using it as a proxy for atheism, another minority favoring it simply because it's an intensely contrarian position, and the majority choosing whatever the closest match to "I don't know" on the survey is.
Was Bram Stoker's Dracula "based on" a real person? Possibly, given an extremely weak interpretation of "based on".
What does it take for a fictional character to be based on a real person? Does it suffice to have a similar name, live in a similar place at a similar time? Do they have to perform similar actions as well? This has to be made clear before the question can be meaningfully answered.
That's an extraordinarily weak "based on". The Dracula/Tepes connection in Bram Stoker's work doesn't go much beyond Stoker borrowing what he thought was a cool name with exotic, ominous associations (and that "exotic" is important; Eastern Europe in Stoker's time was seen as capital-F Foreign to Brits, which comes through quite clearly in the book). Later authors played on it a bit more.
The equivalent here would be saying that there was probably someone named Yeshua in the Galilee area around 30 AD.
Was Yeshua that uncommon of a name? You're setting the bar pretty low here. (That being said, my understanding is that there's a strong scholarly consensus that there was a Jew named Yeshua who lived in Galilee, founded a cult which later became Christianity, and was crucified by the Romans controlling the area. So these picky ambiguities about "based on" aren't really relevant anyway)
That is true, and intentional. It is far from obvious that the connection between the fictional Jesus and the (hypothetical?) historical one is any less tenuous than that (1). The comparison also underscores the pointlessness of the debate: just as evidence for Vlad Dracul's existence is at best extremely weak evidence for the existence of vampires, so too is evidence for a historical Jesus at best extremely weak evidence for the truth of Christianity.
(1) Keep in mind that there are no contemporary sources that refer to him, let alone to anything he did.
{ the ability to navigate ambiguity }
I think this is one of the most important skills you get from the humanities. I have a friend who's a history professor. He's very used to hearing 20 different accounts of the same event told by different people, most of whom are self-serving if not outright lying, and working out what must actually have gone on, which looks like a strength to me.
He has a skill I'd like to have, but don't, and he got it from studying history, (and playing academic politics).
Statistics is precisely that, but with numbers.
That only works if you have numbers.
Luckily, you can make numbers.
"Making numbers" is unlikely to produce useful numbers.
"Making" is not "making up".
When you flip a coin a bunch of times and decide that it's fair, you've made numbers. There are no numbers in the coin itself, but you reasonably can state the probability of the coin coming up heads and even state your certainty in this estimate. These are numbers you made.
As a more general observation, in the Bayesian approach the prior represents information available to you before data arrives. The prior rarely starts as a number, but you must make it a number before you can proceed further.
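The coin-flipping point above can be sketched numerically. A minimal Beta-Binomial update (a standard Bayesian treatment of coin flips, offered here as an illustration rather than anything specific to the comment; the function name and the Beta(1, 1) uniform prior are my own choices) turns observed flips into "numbers you made" - an estimate of the heads-probability plus a measure of certainty in that estimate:

```python
# Bayesian estimate of a coin's heads-probability from observed flips,
# using a Beta(1, 1) (uniform) prior: the prior expressed as numbers.

def beta_binomial_update(heads, tails, prior_a=1.0, prior_b=1.0):
    """Return (posterior mean, posterior std) for P(heads)."""
    a = prior_a + heads   # posterior Beta parameters after seeing the data
    b = prior_b + tails
    mean = a / (a + b)
    var = (a * b) / ((a + b) ** 2 * (a + b + 1))
    return mean, var ** 0.5

# 50 heads in 100 flips: the coin looks fair, and the standard deviation
# quantifies how certain we are of that.
mean, std = beta_binomial_update(heads=50, tails=50)
print(round(mean, 3), round(std, 3))  # → 0.5 0.049
```

There are no numbers "in the coin"; the posterior mean and its spread are produced by combining the prior with the tally of flips.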
Not necessarily.
Relevant Slate Star Codex post: “If It’s Worth Doing, It’s Worth Doing With Made-Up Statistics”
How did he know that his judgment of what actually had gone on was correct? How did he verify his conclusion?
Perhaps the ability to work with poorly-defined objectives? Including how to get some idea of what someone wants and use that to ask useful questions to refine it?
— Errol Morris
Steven Pinker
This lacks a ring of truth for me.
A lot of folks seem to expect the science of human beings to reinforce their bitterness and condemnation of human nature (roughly, "people are mostly crap"). I kinda suspect that if you asked "sophisticated people" (whoever those are) to name some important psychology experiments, those who named any would come up with Zimbardo's Stanford prison experiment and Milgram's obedience experiments pretty early on. Not a lot of emotional uplift there.
As for the arts — horror films where everyone dies screaming seem to be regarded as every bit as lowbrow as feel-good comedies.
It's not obvious that one is better off with the truth. Assume that for some desirable thing X:
P(X|I believe X will happen) = 49%
P(X|I believe X won't happen) = 1%
It seems I can't rationally believe that X will happen. Perhaps I would be better off being deluded about it.
Sorry, I don't understand - why does the sum of probabilities not equal 100% in your example? I assume that you missed a "5" in "P(X|I believe X won't happen) = 1%".
But for what reason?
These probabilities are not required to sum to 1, because they are not mutually exclusive and exhaustive outcomes of an experiment. A more obvious example to illustrate:
P(6-sided die coming up as 6 | today is Monday) = 1/6
P(6-sided die coming up as 6 | today is not Monday) = 1/6
1/6 + 1/6 != 1
I think your example is not suitable for the situation above - there I can see only two possible outcomes: X happens or X doesn't happen. We don't know anything more about X. And P(X|A) + P(X|~A) = 1, isn't it so?
No. You may have confused it with P(X|A) + P(~X|A) = 1 (note the tilde). In my case, either the 6-sided die comes up as 6, or it doesn't.
Yes, either X happens or X doesn't happen. P(X) + P(~X) = 1, so therefore P(X | A) + P(~X | A) = 1. Both formulations are stating the probability of X. But one is adjusting for the probability of X given A; so either X given A happens or X given A doesn't happen (which is P(~X | A) not P(X | ~A)).
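The distinction in this exchange can be checked mechanically. Using a hypothetical joint distribution over X and A (the four numbers below are made up purely for illustration), P(X|A) + P(~X|A) always sums to 1, while P(X|A) + P(X|~A) generally does not:

```python
# Joint probabilities for events X and A (illustrative numbers summing to 1).
joint = {
    ("X", "A"): 0.10, ("X", "~A"): 0.30,
    ("~X", "A"): 0.20, ("~X", "~A"): 0.40,
}

def cond(x, a):
    """P(x | a) computed from the joint table."""
    p_a = sum(p for (xi, ai), p in joint.items() if ai == a)
    return joint[(x, a)] / p_a

# Conditioned on the same event A, X and ~X exhaust the outcomes:
assert abs(cond("X", "A") + cond("~X", "A") - 1.0) < 1e-12

# But P(X|A) + P(X|~A) conditions on two different events, so it
# is not a sum over exhaustive outcomes and need not equal 1:
print(cond("X", "A") + cond("X", "~A"))  # 1/3 + 3/7 ≈ 0.762, not 1
```

Swapping in other joint tables changes the second sum freely; only the first identity is guaranteed.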
When Pinker said "better off", I assumed he included goal achievement. It's plausible that people are more motivated to do something if they're more certain than they should be based on the evidence. They might not try as hard otherwise, which will influence the probability that the goal is attained. I don't really know if that's true, though.
The thing may be worth doing even if the probability isn't high that it will succeed, because the expected value could be high. But if one isn't delusionally certain that one will be successful, it may no longer be worth doing because the probability that the attempt succeeds is lower. (That was the point of my first comment.)
There could be other psychological effects of knowing certain things. For example, maybe it would be difficult to handle being completely objective about one's own flaws and so on. Being objective about people you know may (conceivably) harm your relationships. Having to lie is uncomfortable. Knowing a completely useless but embarrassing fact about someone but pretending you don't is uncomfortable, not simply a harmless, unimportant update of your map of the territory. Etc.
I'm not saying I know of any general way to avoid harmful knowledge, but that doesn't mean it doesn't exist.
Nassim Taleb
Accident, n. An inevitable occurrence due to the action of immutable natural laws.
I tend to disagree. I have done some things which I thought were experiments but which did not yield any clear conclusion after the experiment and analysis. On rewriting the thesis it turned out there were a lot more implicit assumptions inside the hypothesis than I was aware of. I think it was a badly designed experiment and it was rather unproductive in retrospective analysis. I suppose one could argue that it brought to light the implicit assumptions, and that was a useful result. Somehow (not sure how or why) I find that a low standard for considering something an experiment.
Experiments can fail if they are executed or planned improperly. If both the control and the experimental group are given sugar pills, for example, or the equipment fails in a shower of sparks, the experiment has provided no evidence by which one can update. It is a small quibble, and probably not what the quote meant to illustrate (I'm guessing that the experiment provided evidence which downgraded the probability of the hypothesis), but something to note nonetheless: experiments are not magic knowledge-providers.
I think Ferguson would call those "results," and from those you would have learned about performing experiments, not about the original hypothesis you were interested in.
If anything, I think a really failed experiment is one that makes you think you've learned something that is in fact wrong, which is the result of flaws in the experiment that you never become aware of.
Ferguson's proposed new language is a downgrade. Being unable to identify something as a failure when the outcome sucks is fatalism and not particularly useful.
— Robert Morris, quoted in Brian Snow's "We Need Assurance!"
An experiment is supposed to teach you the truth. If you run the experiment badly and, say, get a false positive, then the experiment failed.
I don't think there's such a thing as "unmediated experience of the world".
(I like the quotation a lot for giving a plausible, lucid reason why Zen might spurn the usual sort of analytical discourse. But it's so clear an explanation of an idea that I think it's revealed a basic problem with the idea, namely that it points towards a non-existent goal.)
There is such a thing as a less mediated experience of the world.
Can you give some examples of more and less mediated experiences?
Reasoning inductively rather than deductively, over uncompressed data rather than summaries.
Mediated: "The numbers between 3 and 7" Unmediated: "||| |||| ||||| |||||| |||||||"
That's an interesting question-- "mediated" should probably be modified by "of what?" and "by what?".
It's definitely possible for perceptions to become less mediated by focusing on small details so that prototypes aren't dominant. It's possible to become a lot more perceptive about color, and Drawing on the Right Side of the Brain is about seeing angles, lengths, shading, curves, etc. rather than objects and thus being able to draw accurately.
If you get some distance on your emotions through meditation and/or CBT, is your experience of your emotions less mediated? More mediated? Wrong questions? I think meditators assume that the calm you achieve is already there-- you just weren't noticing it until you meditated enough, so your emotions are more mediated and your calm is less mediated, but now that I've put it into words, I'm not sure what you would use for evidence that the calm was always there rather than created by meditation.
Thank you for the evidence that it's possible to get 12 karma points for something that doesn't exactly make sense.
Because? People who claim it are lying? You don't have it, and your mind is typical?
Or maybe they and satt mean different things by “unmediated”.
Because causal mechanisms to relay information from the world to one's brain are a necessary prerequisite for "experience of the world", so one's "experience of the world" is always mediated by those causal mechanisms.
And it's not possible for just the cognitive mechanisms to shut down, and leave the perceptual ones?
If you shut down the cognitive mechanisms completely, would you even remember what you have perceived? Or even that you have perceived something?
Maybe not. That matches some reports of nonordinary experience.
I doubt it's possible. I'm sceptical that one can cleanly sort every experience-related bodily mechanism into a "cognitive" category xor a "perceptual" category. Intuitively, for example, I might think of my eyes as perceptual, and the parts of my brain that process visual signals as cognitive, but if all of those bits of my brain were cut out, I'd expect to see nothing at all, not an "unmediated" view of the world — which implies my brain is perceptual as well as cognitive. So I expect the idea of just shutting down the cognitive mechanisms and leaving the perceptual mechanisms intact is incoherent.
(Often there're also external physical mechanisms which are further mediators. You can't see an object without light going from the object to your eye, and you can't hear something without a medium between the source and your ear.)
So are people who claim unmediated experience lying?
Or using a different definition of "unmediated", or confused about their experience, or...
My best guess is that the vast majority of them are sincere. Being correct vs. being a liar is a false dichotomy.
So are they sincerely mistaken about that they think unmediated experience is, or about what you think it is?
(Presumably your first "that" is meant to be a "what"?) That question implies a false dichotomy too. The mistaken people might not be mistaken about what anyone thinks unmediated experience is; perhaps everyone pretty much agrees on what it is, and the mistaken people are simply misremembering or misinterpreting their own experiences.
This conversation might be more productive if you switch from Socratic questioning to simply presenting a reasonable definition of "unmediated experience" according to which unmediated experience exists. After all, your true objection seems to be that I'm using a bad definition.
There are also people who claim that they feel God's presence in their heart, you know.
I believe them. I don't believe in God, but I do believe that it's possible to have the subjective experience of a divine presence -- there's too much agreement on the broad strokes of how one feels, across cultures and religions, for it to be otherwise. Though on the other hand, some of the more specific takes on it might be bullshit, and basic cynicism suggests that some of the people talking about feeling God's presence are lying.
Seems reasonable to extend the same level of credulity to claims about enlightenment experiences. That's not to say that Buddhism is necessarily right about how they hash out in terms of mental/spiritual benefits, or in terms of what they actually mean cognitively, of course.
I don't disagree with any of that. Who knows, could be even one and the same experience which people raised in one culture interpret as God's presence, and in another as enlightenment.
The research summarized in this book seems to suggest that this is indeed the case.
And people who claim to see cold fusion and canals on Mars.
There is a happy medium between treating empirical evidence as infallible, and dismissing it as not conforming to your favourite theory.
It's like neutrality on Wikipedia. You'll never attain neutrality, but there is such a thing as less and more, and you want to head in the "more" direction.
I think I see what you mean; if I mentally substitute "is closer to an" for "involves the", and "that state would have" for "that state has", the practice the quotation describes makes more sense to me. (I'm leery of the idea that it's better to head in the direction of less mediation — taking off my glasses doesn't give me a clearer view of the world — but that's a different objection.)
So while the original quotation talked about not thinking at all, your revised version urges that we think as little as possible. How does it qualify as a "rationality quote"?
It can be rationally beneficial to realise how much mediation is involved in perception, in the same way it is useful to replace naive realism with scientific realism.
Relatively unmediated perception is also aesthetically interesting, and therefore of terminal value to many.
You tell me; I have to squint pretty hard to make it read as telling me something useful about rationality.
You can construe the goal as non existent, but that is an uncharitable reading.
Whether the goal exists is an empirical question, no...? I don't understand where (a lack of) charity enters into it.
The principle of charity relates to what people mean by what they say. Unmediated experience might be empirically nonexistent under one interpretation of "unmediated" but not under another. If someone claims to have had unmediated experience, that is evidence relating to what they mean by their words.
Maybe the PoC would be an easier sell if it were phrased in terms of the "typical semantics fallacy".
I see. What more charitable interpretation of "unmediated experience" would you prefer?
Words are used to point to places. The thing that comes to your mind when you hear the words "unmediated experience of the world" might not exist. That doesn't mean that there aren't people who use that phrase to point to something real.
Couldn't you say exactly that to anyone who doubts the existence of anything?
You could. And the way to resolve a dispute over the existence of, say, unicorns, would be to determine what is being meant by the word, in terms of what observations their existence implies that you will be more likely to see. Then you can go and make those observations.
The problem with talk of mental phenomena like "unmediated perception" is that it is difficult to do this, because the words are pointing into the mind of the person using them, which no-one else can see. Or worse, the person isn't pointing anywhere, but repeating something someone else has said, without having had personal experience. How can you tell whether a disagreement is due to the words being used differently, the minds being actually different, or the words and the minds being much the same but the people having differing awareness of their respective minds?
This is a problem I have with pretty much everything I have read about meditation. I can follow the external instructions about sitting, but if I cannot match up the description of the results to be supposedly obtained with my experience, there isn't anywhere to go with that.
The assumptions in that sentence are interesting. It presupposes that a debate is an interaction where you compete against the other person by proving them wrong. I would rather offer a friendly way to improve understanding. Whether or not the other person accepts it is their choice.
In cases like this it's very useful to think about what people mean with words and not go with your first impression of what they might mean.
I don't think so. I just meant to point out that what you said was a triviality. If you intended it as a protreptic triviality, that's fine, I have no objection and that's justification enough for me.
Could you define what you mean with "triviality"?
I mean something which follows from anything. I don't intend it as a term of disapprobation: trivialities are often good ways of expressing a thought, if not literally what was said. If you intended this: "In cases like this it's very useful to think about what people mean with words and not go with your first impression of what they might mean" then I agree with you, and with the need to say it. I just missed your point the first time around (and if you were to ask me, you put the point much better when you explained it to me).
Yes, that's roughly what I mean. However, there might be no way for you to know what they mean if you lack certain experiences.
If a New Agey person speaks about how the observer effect in quantum physics means X, his problem is that he doesn't have any idea what "observer" means for a physicist. Actually getting the person to understand what "observer" means to a physicist isn't something that you accomplish in an hour if the person has a total lack of physics background.
The same is true in reverse. It's not straightforward for the physicist to understand what the New Agey person means. Understanding people with a very different mindset than you is hard.
You seem to be saying two things here:
This entails that it is possible to simply explain what you mean, even across very large inferential gaps.
Yet here you seem to entertain the idea that it's sometimes impossible to explain what you mean, because a certain special experience is necessary.
I endorse the first of these two points, and I'm extremely skeptical about the second. It also seems to me that physicists tend to hold to the first, and new agers tend to hold to the second, and that this constitutes much of the difference in their epistemic virtue.
Nick Szabo
The very narrow choice of values and their seemingly libertarian phrasing implies some hidden criteria for what constitutes "a good answer" - which enables whoever follows this advice to immediately dismiss a proposal based on some unspecified "good"-ness of the answer without further thought or discussion, and dramatically downgrade their opinion of the proposer in the bargain. This seems detrimental to the rational acquisition of ideas and options.
EDIT: Criticism has since been withdrawn in response to context provided below.
The quote doesn't give that impression in context, including the comments - it's actually a statement about the importance of the rule of law. From the comments, Nick notes:
Acknowledged, and criticism withdrawn.
Trivially true, since one who cannot point out the difference is ignorant of the field of legal systems. I guess that is not what is meant?
Wikipedia:Don't stuff beans up your nose
There is a shorter version :-)
"Kids, while we're away, don't lock the cat in the fridge", said the parents.
"Ooooh, that's a great idea", said the kids...
That's not necessarily a bad result. If he's busy stuffing beans up his nose, then this might keep him out of greater trouble; everything else that's listed before (and which apparently he did before) seems worse. That might be just what his mother planned.
When another asserted something that I thought an error, I deny'd myself the pleasure of contradicting him abruptly, and of showing immediately some absurdity in his proposition; and in answering I began by observing that in certain cases or circumstances his opinion would be right, but in the present case there appear'd or seem'd to me some difference, etc.
I soon found the advantage of this change in my manner; the conversations I engag'd in went on more pleasantly. The modest way in which I propos'd my opinions procur'd them a readier reception and less contradiction; I had less mortification when I was found to be in the wrong, and I more easily prevail'd with others to give up their mistakes and join with me when I happened to be in the right.
Benjamin Franklin
I would love to hear what Richard Dawkins would say in reply to this quote.
Personally, I think it's great advice--challenging people immediately and directly is often not a good long-term strategy.
Dawkins, in arguments with theists, homeopaths, etc., is not trying to convince his interlocutors; nor are most of the other well-known atheist public figures. The aim is to convince bystanders — the private atheist who is unsure whether to "come out", the theist who's all but lost his faith but isn't sure whether atheism is a position one may take publicly, the person who's lukewarm on religious arguments but has always had a rather benign and respectful view of religion, etc.
In private conversations with someone whose opinions are of concern to you, Franklin's advice makes sense. The public arguments of Dawkins & Co. are more akin to performances than conversations. I think he achieves his aim admirably. I, for one, have little interest in watching people get on a public stage and have exchanges laden with "in certain cases or circumstances..." and other such mealy-mouthed nonsense.
Unfortunately this self-debasing style of contradiction has become the norm, and the people I talk to can instantly notice when I am pouring sugar on top of a serving of their own ass. Perhaps they are simply noticing changes in my tone of voice or body language, but in sufficiently intellectual partners I've noticed that abruptly contradicting them startles them into thinking more often, though I avoid this in everyday conversation with non-intellectuals for fear of increasing resentment.
-- Scott Aaronson
The same is true for a lot of intellectual concepts outside of math.
What like?
For my part, I've found the economic notions of opportunity cost and marginal utility to be like this.
That's maths too.
The specific application of the math does add value.
Most obviously for opportunity cost: on the math side you only have to understand the "minus" symbol, which pretty much everyone already does. With marginal utility you have to understand the "derivative", but you still have to apply it in a situation outside of math class.
It's applied math, not the pure math that the OP was talking about. Furthermore, these can be useful ideas even when used purely qualitatively; then it's not even applied math (except in a sense that everything is math, if we make the math sufficiently imprecise).
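To make the subthread's two examples concrete, here is a minimal sketch. The utility function and the numbers are made up for illustration; nothing here comes from the thread itself.

```python
import math

# Hypothetical log utility: each extra unit of a good adds less
# utility than the one before (diminishing marginal utility).
def utility(x):
    return math.log(x)

def marginal_utility(x, h=1e-6):
    # Numerical derivative of the utility function.
    return (utility(x + h) - utility(x - h)) / (2 * h)

# Marginal utility falls as consumption rises:
print(marginal_utility(1))   # ~1.0
print(marginal_utility(10))  # ~0.1

# Opportunity cost really is "just a minus sign": the value of the
# best forgone alternative (hypothetical payoffs).
options = {"study": 50, "work": 40, "rest": 20}
chosen = "study"
opportunity_cost = max(v for k, v in options.items() if k != chosen)
print(opportunity_cost)  # 40, the best alternative given up
```

The point made above stands: the math is trivial; the value is in noticing where these subtractions and derivatives apply outside of class.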
"Nothing in Biology Makes Sense Except in the Light of Evolution"
— Theodosius Dobzhansky
The fact that a theory that can be stated in ten words frames an entire discipline is quite incredible. Compared to group theory and probability, it sure seems like an easier uploading process as well.
What are the ten words or less in which evolution can be stated?
"We have what replicated better; noise permanently affects replicative ability"?
warped by random change
what replicates stays around
always evolving
(More constraints! More constraints!)
change without motion
the lament of the red queen
coevolution
"Multiply, vary, let the strongest live and the weakest die."
-Charles Darwin, The Origin of Species
I think that Darwin would himself acknowledge that "fittest" is a more accurate rendition than "strongest," but whether the quote can be rendered in this way without breaking the ten words constraint comes down to a question of whether "unfittest" counts as a legit word.
Maladapted, as an adjective? Though I suppose that's cheating a bit since it's a sense of adaptation that draws on an evolutionary metaphor.
I think "fit" has become a free-floating standard rather than meaning "fitting into a particular environment".
Natural Selection: the differential survival of replicators with heritable variation.
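That definition is compact enough to simulate. The following is a toy sketch, not a biological model: the population size, mutation scale, and generation count are arbitrary choices for illustration.

```python
import random

random.seed(0)

# "Differential survival of replicators with heritable variation":
# each replicator carries a heritable fitness value; fitter ones
# are more likely to be copied into the next generation, and each
# copy mutates slightly.
population = [0.5] * 100  # 100 replicators, all starting equal

def next_generation(pop):
    children = []
    for _ in range(len(pop)):
        parent = random.choices(pop, weights=pop)[0]  # differential survival
        child = parent + random.gauss(0, 0.02)        # heritable variation
        children.append(max(child, 0.001))            # fitness stays positive
    return children

for _ in range(200):
    population = next_generation(population)

# Mean fitness drifts upward under selection, even though
# mutation itself is directionless.
print(sum(population) / len(population))
```

Nothing in the loop "wants" anything; the upward drift falls out of weighting reproduction by fitness, which is the whole content of the ten-word framing.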
"Mathematics is about proving theorems based on axioms and other theorems" also frames a whole discipline.
A frame tells you something about a discipline, but it doesn't tell you everything.
A good deal of the sequences seem to fall in this category. Conservation of expected evidence, for instance.
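Conservation of expected evidence is one of the few items here that reduces to a one-line identity: the prior must equal the expectation of the posterior over possible observations. A quick numerical check, with made-up probabilities:

```python
# Conservation of expected evidence:
# P(H) = P(H|E) * P(E) + P(H|~E) * P(~E)
# Illustrative numbers only.

p_h = 0.3            # prior P(H)
p_e_given_h = 0.8    # likelihood P(E|H)
p_e_given_not_h = 0.4

p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)
posterior_if_e = p_e_given_h * p_h / p_e
posterior_if_not_e = (1 - p_e_given_h) * p_h / (1 - p_e)

expected_posterior = posterior_if_e * p_e + posterior_if_not_e * (1 - p_e)
print(expected_posterior)  # 0.3, exactly the prior
```

Whatever you expect to believe after looking at the evidence, you should believe already; the identity holds for any choice of the three input probabilities.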
When it comes to general concepts, cybernetics is something to which a lot of people on LW don't have much exposure, yet it is as central as knowing probability theory for understanding how the world works.
Basically any subject in which I invested a decent amount of thought produces lessons that are applicable to other topics. I even learned a lot in an activity like Salsa dancing that's useful in other contexts.
What introductory material about it would you recommend?
Unfortunately I don't have a good recommendation. Formally I learned about it in a physiology lecture at university and the professor said that there isn't a good textbook that he could use to teach us.
While searching around I found An Introduction to Cybernetics by W. Ross Ashby. It might not be perfect, but I think it's probably a good enough introduction.
If only we could put together, say, a four-year college degree course intended to have this effect ...
I think that's a super idea. I'd like to design it and I'd like to take it. The ideas that underlie everything else. Like a whole university course devoted to A-level maths, but covering every simple underlying idea. We should start by trying to work out what the syllabus should be.
(one 16 lecture course on each topic, and we'll have three courses per term so that's 36 courses in total)
Off the top of my head we should have: groups, calculus, dimensional analysis, estimation, probability (inc. Bayes), relativity, quantum mechanics, electronics, programming, chemistry, evolution, evolutionary psychology, heuristics and biases, law, public speaking, creative writing, economics, logic, game theory, game-of-life, how-to-win-friends-and-influence-people, history, cosmology, geography, atomic theory, molecular biology ...
All taught with immediate direct applications to actual things in the immediate environment and if you can't come up with simple examples that a child would find interesting and could understand then it doesn't make the cut.
Any more suggestions? If we get loads let's make a post on 'The ideal 4-year university course'.
Here's a related post, though it doesn't have that many suggestions: http://lesswrong.com/lw/l7/the_simple_math_of_everything/
The joke was that this is precisely what a liberal arts degree was meant to be; the main problem is that liberal arts degrees haven't kept up with the times.
-- Alan Lightman
Every 100 million years or so, an asteroid or comet the size of a mountain smashes into the earth, killing nearly everything that lives. If ever we needed proof of Nature’s indifference to the welfare of complex organisms such as ourselves, there it is. The history of life on this planet has been one of merciless destruction and blind, lurching renewal.
Sam Harris, Mother Nature is Not Our Friend, in response to the Edge Annual Question 2008
http://www.samharris.org/site/full_text/the-edge-annual-question-20081#sthash.IBMyMOQN.dpuf
Captain James Tiberius Kirk dodging an appeal to nature and the "what the hell" effect, to optimize for consequences instead of virtue.
That clip is a brilliant example of Shatner's much-mocked characteristic acting-speak.
Real probabilities about the structure and properties of the cosmos, and its relation to living organisms on this planet, can be reach’d only by correlating the findings of all who have competently investigated both the subject itself, and our mental equipment for approaching and interpreting it — astronomers, physicists, mathematicians, biologists, psychologists, anthropologists, and so on. The only sensible method is that of assembling all the objective scientifick data of 1931, and forming a fresh chain of partial indications bas’d exclusively on that data and on no conceptions derived from earlier and less ample arrays of data; meanwhile testing, by the psychological knowledge of 1931, the workings and inclinations of our minds in accepting, connecting, and making deductions from data, and most particularly weeding out all tendencies to give more than equal consideration to conceptions which would never have occurred to us had we not formerly harboured provisional and capricious ideas of the universe now conclusively known to be false. It goes without saying that this realistic principle fully allows for the examination of those irrational feelings and wishes about the universe, upon which idealists so amusingly base their various dogmatick speculations.
-- H.P. Lovecraft, Selected Letters, 1932-1934.
Consider my priors for knowledge of Bayes-fu by wise predecessors to be significantly raised.
What's with bas'd and dogmatick? Is Lovecraft aiming at some antique effect, or did he write in a non-standard dialect?
Yes and yes. Lovecraft was writing in early 20th century New England, but he typically affected the forms of late 1700s British English, or at least tried to. Partly this was for stylistic effect, but I get the sense that he also thought of his native idiom as intellectually debased.
The aesthetics of tradition were kind of a thing with Lovecraft, although in other ways he was thoroughly modern. Not that these affectations were exclusive to Lovecraft by any means; William Hope Hodgson for example wrote The Night Land (a seminal 1912 horror/SF story and notable Lovecraft influence) in an excruciating pseudo-17th-century dialect.
--Kevan Lee
- Herman Chernoff (pg 34 of Past, Present, and Future of Statistical Science, available here)
Actually, if you do this with something besides a test, this sounds like a really good way to teach a third-grader probabilities.
I'm sure this has been discussed before, but my attempts at searches for those discussions failed, so...
Why is this thread in Main and not Discussion?
Tradition, I guess.
In the Age of Sequences, Eliezer sometimes posted rationality quotes, in the article text (1, 2, 3, etc.). Things written by Eliezer in that era are probably automatically considered Main-level. And the new Rationality Quotes threads don't seem worse than the traditional ones -- if we look at the highly voted quotes.
Well, Discussion didn't exist back then.
Last month I posted the rationality quotes in discussion. Someone complained and said it belonged in main so I moved it there. This month I just started it in Main.
--- The Black Opera by Mary Gentle
"The power of accurate observation is commonly called cynicism by those who have not got it." -George Bernard Shaw
Or naivety, depending on how cynical the critic is.
And of course, inaccurate observations are commonly called cynical and/or naive as well...
"Man is not going to wait passively for millions of years before evolution offers him a better brain."
--Corneliu E. Giurgea, the chemist who synthesized Piracetam and coined the term 'Nootropic'
This is from Greg Egan's 1999 novel Teranesia; since there are no hits for ‘Teranesia’ in the Google custom search, I'm inferring that it hasn't been posted before.
Here's a little background. This is a spoiler for some events early in the novel, but it is early; it's not a spoiler for the really big stuff (not even in this chapter). So Prabir lives alone with his father (‘Baba’) and mother (and baby sister Madhusree who is not in this scene), and their garden has been sown with mines for some very interesting reasons that needn't concern us, and Baba has discovered this by being blown up by one. But he's still alive, so mother and Prabir have laid a ladder atop some boxes across the garden, and she's crawled along the ladder to rescue Baba without setting off more mines. But this is harder than anticipated.
(taken from the American hardback edition, pages 50&51)
[Edit: grammar in the text written by me]
It is a good quote, and it works in context, but often it pays to (temporarily) believe that "what you'd like to be true" actually is, and try your hardest (even at the seemingly impossible) to figure out how you got there. "Yes, we can do it." could be the first step toward figuring out the "how" part.
-- Lucien Zell (can't find an authoritative attribution)
I'm really not clear on what this is actually supposed to be a metaphor for.
It's clearly not something you would literally want to do, since the night is temporary and the light provided by the map is dim and brief. But maybe this is a metaphorical long-lasting night and bright burning map?
Destroying something that would be useful or even necessary in the future so that you can better get through, or perhaps survive, the present.
Going to the same college as your high school sweetheart for example. Perhaps it will work out and you won't need the map.
-- Marcus Aurelius, Meditations, pg. 76
-- Carlos Bueno, Mature Optimization, pg. 14. Emphasis mine.
“I refuse to answer that question on the grounds that I don't know the answer.”
― Douglas Adams
I like this quote, but it occurs to me that "I don't know" is often a reasonable answer to a question.
How about this:
"I refuse to answer that question on the grounds that I can't think of an answer which I am confident will not put me in a negative light."
That just seems like overly honest politicking to me.
― Terry Pratchett, The Wee Free Men (Discworld, #30)
If you want to use your selfishness to help others, then you're not selfish.
Selfishness seems to be referred to primarily as a mindset or attitude, and helping others as an outcome. I think the two can co-exist at the same time, for example in Adam Smith's invisible hand in capitalism.
Of course you're not. But human nature is supposedly selfish, and if your true goals are altruistic, you will have to find a way to turn it around.
Emphasis on "supposedly", since the popular hypotheses about "selfish human nature" are far too simplistic to reflect any actual results of psychological research.
Of course they are. Unlike those about Pratchett's witches, though. They reflect the 'locally-selfish-globally-altruistic' concept surprisingly well.