Rationality Quotes January 2013
Happy New Year! Here's the latest and greatest installment of rationality quotes. Remember:
- Please post all quotes separately, so that they can be voted up/down separately. (If they are strongly related, reply to your own comments. If strongly ordered, then go ahead and post them together.)
- Do not quote yourself
- Do not quote comments/posts on LessWrong or Overcoming Bias
- No more than 5 quotes per person per monthly thread, please
"I've never seen the Icarus story as a lesson about the limitations of humans. I see it as a lesson about the limitations of wax as an adhesive."
-- Randall Munroe, in http://what-if.xkcd.com/30/ (What-if xkcd, Interplanetary Cessna)
--Benjamin Franklin
"Just because you no longer believe a lie, does not mean you now know the truth."
Mark Atwood
--"Sid" a commenter from HalfSigma's blog
--Sir Francis Galton
-G. K. Chesterton
Vannevar Bush
The Last Psychiatrist (http://thelastpsychiatrist.com/2010/10/how_not_to_prevent_military_su.html)
Keith E. Stanovich, What Intelligence Tests Miss: The Psychology of Rational Thought
For instance, you need analytical thinking to design your heuristics. Let your heuristics build new heuristics, and a 2% failure rate per generation compounds to a roughly 50% failure rate within a few tens of generations.
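That compounding claim is easy to check numerically; this sketch assumes nothing beyond the comment's own 2%-per-generation figure:

```python
per_gen_success = 0.98  # the comment's 2% failure rate per generation

def cumulative_success(n: int) -> float:
    """Probability that no failure has crept in after n generations."""
    return per_gen_success ** n

# First generation at which the compounded failure rate passes 50%:
n = next(k for k in range(1, 1000) if cumulative_success(k) < 0.5)
print(n, round(cumulative_success(n), 3))  # 35 0.493
```

So "a few tens of generations" checks out: at 2% failure per step, reliability halves after about 35 steps.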
Possibly, but Stanovich thinks that most heuristics were basically given to us by evolution and rather than choose among heuristics what we do is decide whether to (use them and spend little energy on thinking) or (not use them and spend a lot of energy on thinking).
What is analytical thinking, but a sequence of steps of heuristics well vetted not to lead to contradictions?
-Buttercup Dew (@NationalistPony)
Never in my life did I expect to find myself upvoting a comment quoting My Nationalist Pony.
C. S. Lewis, Mere Christianity
Caveat: this is not at all how the majority of the religious people I know would use the word "faith". In fact, this passage turned out to be one of the earliest things that helped bring me to think critically about, and ultimately discard, my religious worldview.
I have had "extreme temporary loss of foundational beliefs," where I briefly lost confidence in beliefs such as the nonexistence of fundamentally mental entities (I would describe this experience as "innate but long dormant animist intuitions suddenly start shouting,") but I've never had a mood where Christianity or any other religion looked probable, because even when I had such an experience, I was never enticed to privilege the hypothesis of any particular religion or superstition.
I answered "sometimes" thinking of this as just Christianity, but I would have answered "very often" if I had read your gloss more carefully.
I'm not quite sure how to explicate this, as it's something I've never really thought much about and had generalized from one example to be universal. But my intuitions about what is probably true are extremely mood- and even fancy-dependent, although my evaluation of particular arguments and such seems to be comparatively stable. I can see positive and negative aspects to this.
Oh whoops, I didn't read the parenthetical either. Not sure if it changes my answer.
I put "never", but "not anymore" would be more accurate.
This. Took a while to build that foundation, and a lot of contemplation in deciding what needed to be there... but once built, it's solid, and not given to reorganization on whim. That's not because I'm closed-minded or anything; it's because stuff like a belief that the evidence provided by your own senses is valid really is kind of fundamental to believing anything else, at all. Not believing that implies not believing a whole host of other things, and develops into some really strange philosophies. As a philosophical position, this is called "empiricism", and it's actually more fundamental than belief in only the physical world (i.e., disbelief in spiritual phenomena, "materialism"), because you need a thing that says what evidence is considered valid before you have a thing that says "and based on this evidence, I conclude".
I answered "rarely," but I should probably qualify that. I've been an atheist for about 5 years, and in the last 2 or 3, I don't recall ever seriously thinking that the basic, factual premises of Christianity were any more likely than Greek myths. But I have had several moments -- usually following some major personal failing of mine, or maybe in others close to me -- where the Christian idea of man-as-fallen living in a fallen world made sense to me, and where I found myself unconsciously groping for something like the Christian concept of grace.
As I recall, in the first few years after my deconversion, this feeling sometimes led me to think more seriously about Christianity, and I even prayed a few times, just in case. In the past couple of years that hasn't happened; I understand more fully exactly why I'd have those feelings even without anything like the Christian God, and I've thought more seriously about how to address them without falling back on old habits. But certainly that experience has helped me understand what would motivate someone to either seek or hold onto Christianity, especially if they didn't have any training in Bayescraft.
I am fascinated by all of the answers that are not "never," as this has never happened to me. If any of the answerers were atheists, could any of you briefly describe these experiences and what might have caused them? (I am expecting "psychedelic drugs," so I will be most surprised by experiences that are caused by anything else.)
Erm...when I was a lot younger, when I considered doing something wrong or told a lie I had the vague feeling that someone was keeping tabs. Basically, when weighing utilities I greatly upped the probability that someone would somehow come to know of my wrongdoings, even when it was totally implausible. That "someone" was certainly not God or a dead ancestor or anything supernatural...it wasn't even necessarily an authority figure.
Basically, the superstition was that someone who knew me well would eventually come to find out about my wrongdoing, and one day they would confront me about it. And they'd be greatly disappointed or angry.
I'm ashamed to say that in the past I might have actually done actions which I myself felt were immoral, if it were not for that superstitious feeling that my actions would be discovered by another individual. It's hard to say in retrospect whether the superstitious feeling was the factor that pushed me back over that edge.
Note that I never believed the superstition...it was more of a gut feeling.
I'm older now and am proud to say that I haven't given serious consideration to doing anything which I personally feel is immoral for a very, very long time. So I do not know whether I still carry this superstition. It's not really something I can test empirically.
I think part of it is that as I grew older my mind conceptually merged "selfish desire" and "morality" neatly into one single "what is the sum total of my goals" utility function construct (though I wasn't familiar with the term "utility function" at the time).
This shift occurred sometime in high school, and it happened around the same time that I overcame mind-body dualism at a gut level. Though I've always had generally atheist beliefs, it wasn't until this shift that I really understood the implications of a logical universe.
Once these dichotomies broke down, I no longer felt the temptation to "give in" to selfish desire, nor was I warded off by "guilt" or the superstitious fear. I follow morals because I want to follow them, since they are a huge part of my utility function. Once my brain understood at a gut level that going against my morality was intrinsically against my interests, I stopped feeling any temptation to do immoral actions for selfish reasons. On the flip side, the shift also allows me to be selfish without feeling guilty. It's not that I'm a "better person" thanks to the shift in gut instinct...it's more that my opposing instincts don't fight with each other by using temptation, fear, and guilt anymore.
I think there is something about that "shift" experience I described (anecdote indicates that a lot of smart people go through this at some point in life, but most describe it in less than articulate spiritual terms) which permanently alters your gut feelings about reality, morality, and similar topics in philosophy.
I'm guessing those who answered "never" either did not carry the illusions in question to begin with and therefore did not require a shift in thought, or they did not factor in how they felt pre-shift into their introspection.
Occasionally the fundamental fact that all our inferences are provisional creeps me out. The realization that there's no way to actually ground my base belief that, say, I'm not a Boltzmann brain, combined with the fact that it's really quite absurd that anything exists rather than nothing at all given that any cause we find just moves the problem outwards is the closest thing I have to "doubting existence".
I put sometimes.
I believe all kinds of crazy stuff and question everything when I'm lying in bed trying to fall asleep, most commonly that death will be an active and specific nothing that I will exist to experience, and be bored, frightened, and upset by forever. Something deep in my brain believes a horrible cosmology as wacky and specific as any religion, but not nearly as cheerful. When my faculties are weakened it feels as if I directly know it to be true, and any attempt to rehearse my reasons for materialism feels like rationalizing.
I'm neither very mentally healthy nor very neurotypical, which may be part of why this happens.
Hasn't happened to me in years. Typically involved desperation about how some aspect of my life (only peripherally related to the beliefs in question, natch) was going very badly. Temptation to pray was involved. These urges really went away when I discovered that they were mainly caused by garden variety frustration + low blood sugar.
I think that in my folly-filled youth, my brain discovered that "conversion" experiences (religious/political) are fun and very energizing. When I am really dejected, a small part of me says "Let's convert to something! Clearly your current beliefs are not inspiring you enough!"
I am firmly atheist right now, lounging in my mom's warm living room in a comfy armchair, tipity-typing on my keyboard. But when I go out to sea, alone, and the weather turns, a storm picks up, and I'm caught out after dark, and thanks to a rusty socket only one bow light works... well, then, I pray to every god I know starting with Poseidon, and sell my soul to the devil while at it.
I'm not sure why I do it.
Maybe that's what my brain does to occupy the excess processing time? In high school, when I still remembered it, I used to recite the litany against fear. But that's not quite it. When waves toss my little boat around and I ask myself why I'm praying, the answer invariably comes out: "It's never made things worse. So the Professor God isn't punishing me for my weakness. Who knows... maybe it will work? Even if not, prayer beats panic as a system idle process."
I answered Sometimes. For me the 'foundational belief' in question is usually along the lines: "Goal (x) is worth the effort of subgoal/process (y)." These moods usually last less than 6 months, and I have a hunch that they're hormonal in nature. I've yet to systematically gather data on the factors that seem most likely to be causing them, mostly because it doesn't seem worth the effort right now. Hah. Seriously, though, I have in fact been convinced that I need to work out a consistent utility function, but when I think about the work involved, I just... blah.
My own response was “rarely”; had I answered when I was a Christian ten years ago, I would probably have said “sometimes”; had I answered as a Christian five years ago I might have said “often” or “very often” (eventually I allowed some of these moments of extreme uncertainty to become actual crises of faith and I changed my mind, though it happened in a very sloppy and roundabout way and had I had LessWrong at the time things could’ve been a lot easier.)
And still, I can think of maybe two times in the past year when I suddenly got a terrifying sinking feeling that I have got everything horribly, totally wrong. Both instances were triggered whilst around family and friends who remain religious, and both had to do with being reminded of old arguments I used to use in defense of the Bible which I couldn’t remember, in the moment, having explicitly refuted.
Neither of these moods was very important and both were combated in a matter of minutes. In retrospect, I’d guess that my brain was conflating fear of rejection-from-the-tribe-for-what-I-believe with fear of actually-being-wrong.
Not psychedelic drugs, but apparently an adequate trigger nonetheless.
I'm a bit late here, but my response seems different enough to the others posted here to warrant replying!
My brain is abysmally bad at storing trains of thought/deduction that lead to conclusions. It's very good at having exceptionally long trains of thoughts/deductions. It's quite good at storing the conclusions of my trains of thoughts, but only as cached thoughts and heuristics. It means that my brain is full of conclusions that I know I assign high probabilities to, but don't know why off the top of my head. My beliefs end up stored as a list of theorems in my head, with proofs left as an exercise to the reader. I occasionally double-check them, but it's a time-consuming process.
If I'm having a not very mentally agile day, I can't off the top of my head re-prove the results I think I know, and a different result seems tempting, I basically get confused for a while until I re-figure out how to prove the result I know I've proven before.
Basically on some days past-me seems like a sufficiently different person that I no longer completely trust her judgement.
Interesting. I've only had this experience in very restricted contexts, e.g. I noticed recently that I shouldn't trust my opinions on movies if the last time I saw them was more than several years ago because my taste in movies has changed substantially in those years.
I have been diagnosed with depression in the past, so it's not terribly surprising to me that "My life is worth living", considered as a foundational belief, has its confidence fade in and out quite a lot. In this case, the drugs would actually restore me back to a more normal level.
Although, considering the frequency with which it is still happening, I may want to reconsult with my doctor. Saying "I have been diagnosed with mental health problems, and I'm on pills, but really, I still have some pretty bad mental health problems" pattern matches rather well to "Perhaps I should ask my doctor about updating those pills."
Yep. Medical professionals often err on the side of lesser dosage anyway, even for life-threatening stuff. After all, "we gave her medication but she died anyway, the disease was too strong" sounds like abstract, chance-and-force-of-nature-and-fate stuff, and like a statistic on a sheet of paper.
"Doctor overdoses patient", on the other hand, is such a tasty scoop I'd immediately expect my grandmother to be gossiping about it and the doctor in question to be banned from medical practice for life, probably with their diplomas revoked.
They also often take their guidelines from organizations like the FDA, which are renowned for explicitly delaying, for five years, medications that have a 1 in 10000 side-effect mortality rate versus an 80% cure-and-survival rate for diseases that kill 10k+ annually (bogus example, but I'm sure someone more conscientious than me can find real numbers).
Anyway, sorry for the possibly undesired tangent. It seems usually-optimal to keep returning to your doctor persistently as much as possible until medication really does take noticeable effect.
Sometimes, I am extremely unconvinced of the utility of "knowing stuff" or "understanding stuff" when confronted with the inability to explain it to suffering people who seem like they want to stop suffering but refuse to consider the stuff that has potential to help them stop suffering. =/
Interesting. My confidence in my beliefs has never been tied to my ability to explain them to anyone, but then again I'm a mathematician-(in-training), so...
Well, it's not that I'm not confident that they're useful to me. They are! They help me make choices that make me happy. I'm just not confident in how useful pursuing them is in comparison to various utilitarian considerations of helping other people be not miserable.
For example, suppose I could learn some more rationality tricks and start saving an extra $100 each month by some means, while in the meantime someone I know is depressed and miserable and seemingly asking for help. Instead of going to learn those rationality tricks to make an extra $100, I am tempted to sit with them and tell them all the ways I learned to manage my thoughts in order to not make myself miserable and depressed. And when this fails spectacularly, eating my time and energy, I am left inclined to do neither because that person is miserable and depressed and I'm powerless to help them so how useful is $100 really? Blah! So, to answer the question, this is the mood in which I question my belief in the usefulness of knowing and doing useful things.
I am also a computer science/math person! high five
Aren't useful things kind of useful to do kind of by definition? (I know this argument is often used to sneak in connotations, but I can't imagine that "is useful" is a sneaky connotation of "useful thing.")
What you describe sounds to me like a failure to model your friend correctly. Most people cannot fix themselves given only instructions on how to do so, and what worked for you may not work for your friend. Even if it might, it is hard to motivate yourself to do things when you are miserable and depressed, and when you are miserable and depressed, hearing someone else say "here are all the ways you currently suck, and you should stop sucking in those ways" is not necessarily encouraging.
In other words, "useful" is a two- or even three-place predicate.
I thought the most truthful answer for me would be "Rarely", given all possible interpretations of the question. I think that it should have been qualified "within the past year", to eliminate the follies of truth-seeking in one's youth. Someone who answers "Never" cannot be considering when they were a five-year-old. I have believed or wanted to believe a lot of crazy things. Even right now, thinking as an atheist, I rarely have those moods, and only rarely due to my recognized (and combated) tendency toward magical thinking. However, right now, thinking as a Christian, I would have doubts constantly, because no matter how much I would like to believe, it is plain to see that most of what I am expected to have faith in as a Christian is complete crap. I am capable of adopting either mode of thinking, as is anyone else here. We're just better at one mode than the other.
--Freefall
-- Greg Egan, "Border Guards".
— Umberto Eco, The Search for the Perfect Language
In reference to Occam's razor:
--from Machine Learning by Tom M. Mitchell
Interesting how a concept seems more believable if it has a name...
-- Sterren with a literal realization that the territory did not match his mental map in The Unwilling Warlord by Lawrence Watt-Evans
(a few verbal tics were removed by me; the censorship was already present in the version I heard)
I'd vote this up, but I can't shake the feeling that the author is setting up a false dichotomy. Living forever would be great, but living forever without arthritis would be even better. There's no reason why we shouldn't solve the easier problem first.
Sure there is. If you have two problems, one of which is substantially easier than the other, then you still might solve the harder problem first if 1) solving the easier problem won't help you solve the harder problem and 2) the harder problem is substantially more pressing. In other words, you need to take into account the opportunity cost of diverting some of your resources to solving the easier problem.
In general this is true, but I believe that in this particular case the reasoning doesn't apply. Solving problems like arthritis and cancer is essential for prolonging productive biological life.
Granted, such solutions would cease to be useful once mind uploading is implemented. However, IMO mind uploading is so difficult -- and, therefore, so far in the future -- that, if we did chose to focus exclusively on it, we'd lose too many utilons to biological ailments. For the same reason, prolonging productive biological life now is still quite useful, because it would allow researchers to live longer, thus speeding up the pace of research that will eventually lead to uploading.
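The disagreement above is really about numbers, not logic; a toy model (every figure here is invented purely for illustration) shows how the opportunity-cost argument can cut either way:

```python
def plan_utilons(cure_first: bool, horizon: float = 100.0,
                 cure_years: float = 10.0, upload_years: float = 60.0) -> float:
    """Invented payoffs: 1 utilon/yr once arthritis is cured, 100 utilons/yr
    once uploading works. Curing arthritis first delays uploading by the
    cure's research time."""
    upload_done = upload_years + (cure_years if cure_first else 0.0)
    total = (horizon - cure_years) * 1.0 if cure_first else 0.0
    total += max(0.0, horizon - upload_done) * 100.0
    return total

print(plan_utilons(cure_first=True))   # 3090.0
print(plan_utilons(cure_first=False))  # 4000.0
```

With these made-up numbers the harder problem dominates; shrink the payoff gap, or let the easy cure speed up the hard research (as the comment above argues it would), and the ordering flips.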
Using punctuation that is normally intended to match ({[]}) confused me. Use the !%#$ing other punctuation for that.
Sympathetic, but ultimately, we die OF diseases. And the years we do have are more or less valuable depending on their quality.
Physicians should maximize QALYs, and extending lifespan is only one way to do it.
The question is whether that's a useful paradigm. Aubrey de Grey argues that it isn't.
-- Tim Kreider, The Quiet Ones
...but why wait until they'd almost gotten to Boston?
Perhaps because at that point, one is not faced with the prospect of spending several hours in close proximity to people with whom one has had an unpleasant social interaction.
No one wants to appear rude, of course. As this was almost the end of the ride, the person who rebuked them minimized the time he'd have to endure in the company of people who might consider him rude because of his admonishment, whether or not they agreed with him. I wonder if this is partly a cultural thing.
The passage states that he'd already spoken to them twice.
"This is how it sometimes works", I would have said. Anything more starts to sound uncomfortably close to "the lurkers support me in email."
I don't know the circumstances, but I would have tried to make eye contact and just blatantly stare at them for minutes straight, maybe even hamming it up with a look of slightly unhinged interest. They would have become more uncomfortable and might have started being anxious that a stranger was eavesdropping on them, causing them to want to be more discreet, depending on their disposition. I've actually tried this before, and it seems to sometimes work if they can see you staring at them. Give a subtle, slight grin, like you might be sexually turned on. If you won't see them again then it's worth a try.
Since this has got 22 upvotes I must ask: What makes this a rationality quote?
Every actual criticism of an idea/behaviour is likely to imply a much larger quantity of silent doubt/disapproval.
Sometimes, but you need to take into account what P(voices criticism | has criticism) is. Otherwise you'll constantly cave to vocal minorities (situations where the above probability is relatively large).
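A toy version of that correction (`p_voice` is an invented parameter for how likely a dissatisfied person is to actually speak up):

```python
def implied_critics(voiced: int, p_voice: float) -> float:
    """If each person holding a criticism voices it independently with
    probability p_voice, `voiced` complaints imply ~voiced/p_voice critics."""
    return voiced / p_voice

print(implied_critics(1, 0.05))  # reserved audience: ~20 silent doubters
print(implied_critics(1, 0.80))  # vocal minority: 1.25, almost no one silent
```

The same single complaint implies twenty sympathizers in a quiet crowd but barely more than the speaker in a vocal one, which is exactly why caving to every voiced criticism overweights vocal minorities.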
I'd say it comes under the 'instrumental rationality' heading. The chatter was clearly bothering the writer, but - irrationally - neither he nor the others (bar one) actually got up and said anything.
You could argue that the silence of the author and the woman behind the couple is an example of the bystander effect.
-Bas van Fraassen, The Scientific Image
What does that mean?
Believing large lies is worse than believing small ones; basically, it's arguing against the What-The-Hell Effect as applied to rationality. Or so I presume; I did not read the original.
I had noticed that effect myself, but I didn't know it had a name.
I had noticed it and mistakenly attributed it to the sunk cost fallacy but on reflection it's quite different from sunk costs. However, it was discovering and (as it turns out, incorrectly) generalising the sunk cost fallacy that alerted me to the effect and that genuinely helped me improve myself, so it's a happy mistake.
One thing that helped me was learning to fear the words 'might as well,' as in, 'I've already wasted most of the day so I might as well waste the rest of it,' or 'she'll never go out with me so I might as well not bother asking her,' and countless other examples. My way of dealing it is to mock my own thought processes ('Yeah, things are really bad so let's make them even worse. Nice plan, genius') and switch to a more utilitarian way of thinking ('A small chance of success is better than none,' 'Let's try and squeeze as much utility out of this as possible' etc.).
I hadn't fully grasped the extent to which I was sabotaging my own life with that one, pernicious little error.
Lambs are young sheep; they have less meat & less wool.
The punishment for livestock rustling being identical no matter what animal is stolen, you should prefer to steal a sheep rather than a lamb.
--Rory Miller
Nice juxtaposition.
osewalrus
I try to get around this by assuming that self-interest and malice, outside of a few exceptional cases, are evenly distributed across tribes, organizations, and political entities, and that when I find a particularly self-interested or malicious person that's evidence about their own personality rather than about tribal characteristics. This is almost certainly false and indeed requires not only bad priors but bad Bayesian inference, but I haven't yet found a way to use all but the narrowest and most obvious negative-valence concepts to predict group behavior without inviting more bias than I'd be preventing.
(If you wonder where the "two hundred and forty-two miles" of shortening of the river came from: it was the straightening of its original meandering path to improve navigation.)
http://xkcd.com/605/
"De notre naissance à notre mort, nous sommes un cortège d’autres qui sont reliés par un fil ténu."
Jean Cocteau
("From our birth to our death, we are a procession of others whom a fine thread connects.")
"We are living on borrowed time and abiding by the law of probability, which is the only law we carefully observe. Had we done otherwise, we would now be dead heroes instead of surviving experts." –Devil's Guard
-- Steven Brust, spoken by Vlad, in Iorich
Seems to describe well the founder of this forum. I wonder if this quote resonates with a certain personal experience of yours.
-http://writebadlywell.blogspot.com/2010/05/write-yourself-into-corner.html
I would argue that the lesson is that when something valuable is at stake, we should focus on the simplest available solutions to the puzzles we face, rather than on ways to demonstrate our intelligence to ourselves or others.
Speaking of writing yourself into a corner...
According to TV Tropes, there was one show, "Sledge Hammer", which ended its first season with the main character setting off a nuclear bomb while trying to defuse it. They didn't expect to be renewed for a second season, so when they were, they had a problem. This is what they did:
Previously on Sledge Hammer:
[scene of nuclear explosion]
Tonight's episode takes place five years before that fateful explosion.
Story ... too awesome ... not to upvote ...
Not sure why it's rational, though.
Éowyn explaining to Aragorn why she was skilled with a blade. The Lord of the Rings: The Two Towers, the 2002 movie.
-Paul Graham
Partial duplicate
-- TVTropes
Edit (1/7): I have no particular reason to believe that this is literally true, but either way I think it holds an interesting rationality lesson. Feel free to substitute 'Zorblaxia' for 'Japan' above.
Interesting; is this true?
Yes, my Japanese teacher was very insistent about it, and IIRC would even take points off for talking about someone's mental state without the proper qualifiers.
I think you're missing a word here :P
Specific source: Useful Notes: Japanese Language on TV Tropes
TV Tropes is unreliable on Japanese culture. While it's fond of Japanese media, connection demographics show that Japanese editors are disproportionately rare (even after taking the language barrier into account); almost all the contributors to a page like that are likely to be language students or English-speaking Japanophiles, few of whom have any substantial experience with the language or culture in the wild. This introduces quite a bit of noise; for example, the site's had problems in the past with people reading meanings into Japanese words that don't exist or that are much more specific than they are in the wild.
I don't know myself whether the ancestor is accurate, but it'd be wise to take it with a grain of salt.
http://www.exmormon.org/whylft18.htm
This is part of why it's important to fight against all bad arguments everywhere, not just bad arguments on the other side.
Another interpretation: Try to figure out which side has more intelligent defenders and control for that when evaluating arguments. (On the other hand, the fact that all the smart people seem to believe X should probably be seen as evidence too...)
Yes, argument screens off authority, but that assumes that you're in a universe where it's possible to know everything and think of everything, I suspect. If one side is much more creative about coming up with clever arguments in support of itself (much better than you), who should you believe if the clever side also has all the best arguments?
Person 1: "I don't understand how my brain works. But my brain is what I rely on to understand how things work." Person 2: "Is that a problem?" Person 1: "I'm not sure how to tell."
-Today's xkcd
--Wendy Cope, He Tells Her from the series ‘Differences of Opinion’
--George Eliot
Apologies to Jayson_Virissimo.
Silas Dogood
I'd have thought from observation that quite a lot of human club is just about discussing the rules of human club, excess meta and all. Philosophy in daily practice being best considered a cultural activity, something humans do to impress other humans.
I dunno; "philosophy", at least, doesn't seem to be about discussing the rules of the human club, or maybe it's discussing a very specific part of the rules (but then, so is Maths!). Family gossip and stand-up comedy seem much closer to "discussing the rules of the human club".
Most of this is done by people that don't understand the rules of human club...
— Gregory Wheeler, "Formal Epistemology"
Is there a concrete example of a problem approached thus?
Viewing the interactions of photons as both a wave and a billiard ball. Both are wrong, but by seeing which traits remain constant in all models, we can project what traits the true model is likely to have.
Does that work? I don't know enough physics to tell if that makes sense.
It doesn't give you all the information you need, but that's how the problem was originally tackled. Scientists noticed that they had two contradictory models for light, which had a few overlapping characteristics. Those overlapping areas allowed them to start formulating new theories. Of course it took ridiculous amounts of work after that to figure out a reasonable approximation of reality, but one has to start somewhere.
-- Eric Hoffer, The True Believer
Orson Scott Card, The Lost Gate
As my math teacher always said,
-- Penn Jillette
Disagree about the fashion.
lanyard
"Two roads diverged in a wood. I took the one less traveled by, and had to eat bugs until the park rangers rescued me."
Duplicate.
Wasn't that poem sarcastic anyway? Until the last stanza, the poem says how the roads were really identical in all particulars -- and in the last stanza the narrator admits that he will be describing this choice falsely in the future.
I was rereading HP Lovecraft's The Call of Cthulhu lately, and the quote from the Necronomicon jumped out at me as a very good explanation of exactly why cryonics is such a good idea.
(Full disclosure: I myself have not signed up for cryonics. But I intend to sign up as soon as I can arrange to move to a place where it is available.)
The quote is simply this:
So strange that this quote hasn't already been memed to death in support of cryonics.
RationalWiki is extremely sceptical of cryonics and still it has quoted that.
Er... logical fallacy of fictional evidence, maybe? I wince every time somebody cites Terminator in a discussion of AI. It doesn't matter if the conclusion is right or wrong, I still wince because it's not a valid argument.
The original quote has nothing to do with life extension/immortality for humans. It just happens to be an argument for cryonics, and it seems to be a valid one: death as failure to preserve rather than cessation of activity, mortality as a problem rather than a fixed rule.
Yeah, I wasn't saying that it should be used because it's "so true." Just that it's easy to appropriate.
-Jobe Wilkins (Whateley Academy)
David Wong, 6 Harsh Truths That Will Make You a Better Person. Published in Cracked.com
This article greatly annoyed me because of how it tells people to do the correct practical things (Develop skills! Be persistent and grind! Help people!) yet gives atrocious and shallow reasons for it - and then Wong says how if people criticize him they haven't heard the message. No, David, you can give people correct directions and still be a huge jerk promoting an awful worldview!
He basically shows NO understanding of what makes one attractive to people (especially romantically) and what gives you a feeling of self-worth and self-respect. What you "are" does in fact matter - both to yourself and to others! - outside of your actions; they just reveal and signal your qualities. If you don't do anything good, it's a sign of something being broken about you, but just mechanically bartering some product of your labour for friendship, affection and status cannot work - if your life is in a rut, it's because of some deeper issues and you've got to resolve those first and foremost.
This masochistic imperative to "Work harder and quit whining" might sound all serious and mature, but it does not in fact have the power to make you a "better person"; rather, you'll know you've changed for the better when you can achieve more stuff and don't feel miserable.
I wanted to write a short comment illustrating how this article might be the mirror opposite of some unfortunate ideas in the "Seduction community" - it's "forget all else and GIVE to people, to obtain affection and self-worth" versus "forget all else and TAKE from people, to obtain affection and self-worth" - and how, for a self-actualized person, needs, one's own and others', should dictate the taking and giving, not some primitive framework of barter or conquest - but I predictably got too lazy to extend it :)
I've taken a crack at what's wrong with that article.
The problem is, there's so much wrong with it from so many different angles that it's rather a large topic.
Yep.
As with most self-help advice, it is like an eyeglass prescription - only good for one specific pathology. It may correct one person's vision, while making another's massively worse.
Also, I remember what it was like to be (mildly!) depressed, and my oh my would that article not have helped.
My complaint about the article is that it has the same problem as most self-help advice. When you read it, it sounds intelligent, you nod your head, it makes sense. You might even think to yourself "Yeah, I'm going to really change now!"
But as everyone who's tried to improve himself knows, it's difficult to change your behavior (and thoughts) on a basis consistent enough to really make a long-lasting difference.
It's a misleading claim. Studies of how parents influence their kids generally conclude that the "being" of the parent is more important than what they specifically do with the kids.
From the article:
The author of the article doesn't seem to understand that there is such a thing as good listening. If a girl tells you about some problem in her life, it can be more effective to empathize with her than to go and solve the problem.
If someone says "It's what's on the inside that matters!" a much better response would be to ask: what makes you think that your inside is so much better than the inside of other people?
Could you explain this? Or link to info about such studies? (Or both?)
If a parent has low self-esteem, their child is also likely to have low self-esteem. The low-self-esteem parent might try to do a lot for his child in order to prove to himself that he's worthy.
There's a drastic difference between a child observing "Mommy hugs me because she read in a book that good mothers hug their children and she wants to prove to herself that she's a good mother" and "Mommy hugs me because she loves me".
On paper, the woman who spends a lot of energy doing the stuff that good mothers are supposed to do is doing more for her child than a mother who isn't investing that much energy because she's more secure in herself. But being secure in herself increases the chance that she will do the right things at the right time and signal her self-confidence to the child. A child who sees that her mother is self-confident then also has a reason to believe that everything is all right.
As far as studies go, unfortunately I don't keep good records of what information I read from what sources :( (I would add that hugging is an example I use here to illustrate the point, rather than a reference to a specific study about hugging.)
Yes... and studies show that this is largely due to genetic similarity, much less so to parenting style.
Which still means that it boils down to what the mother does.
The thing is, no one can see what you "are" except by what you do. Your argument seems to be "doing things for the right reason will lead to doing the actual right thing, instead of implementing some standard recommendation of what the right thing is". Granted. But the thing that matters is still the doing, not the being. "Being" is relevant only to the extent that it makes you do.
Oh, and as for this:
There's a third possibility: "Mommy doesn't hug me, but I know she loves me anyway". Sometimes that's worse than either of the other two.
Yep.
I'm rather curious how parents can "be" something to children without doing anything, since presumably children don't know their parents before their first contact (after birth, I mean).
I think I have heard of such studies, but the conclusion is different.
Who the parents are matters more than things like which school the kids go to, or which neighborhood they live in, etc.
But in my view, that's only because being something (let's say, a sportsman) will make you do things that influence your kids to pursue a similar path.
For Instance It Makes You Write With Odd Capitalization.
It's probably a section title.
I wish my 17-year-old self had read that article.
-- Kyubey (Puella Magi Madoka Magica)
For you, I'll walk this endless maze...
Jimmy the rational hypnotist on priming and implicit memory:
John Locke, Essay Concerning Human Understanding
-- Eric Hoffer, The True Believer
A decent quote, except I am minded to nitpick that there is no such thing as unbelief as a separate category from belief. We just have credences.
Many futile conversations have I seen among the muggles, wherein disputants tried to make some Fully General point about unbelief vs belief, or doubt vs certainty.
-- John Rawls, Justice as Fairness: A Restatement.
-Woody Allen EDIT: Fixed formatting.
-- Ricardo, publicly saying "oops" in his restrained Victorian fashion, in his essay "On Machinery".
-- P. W. Bridgman, ‘‘The Struggle for Intellectual Integrity’’
-- Groucho Marx
I'm not sure that's great advice. It will result in you trying to try to live forever. The only way to live forever or die trying is to intend to live forever.
Beatrice the Biologist
-Alfie Kohn, "Punished By Rewards"
-- Eric Hoffer, The True Believer
-- Eric Hoffer, The True Believer
-- Larry Wall
See the Mouse Universe.
--Daniel Dennett, Breaking the Spell (discussing the differences between the "intentional object" of a belief and the thing-in-the-world inspiring that belief)
"The generation of random numbers is too important to be left to chance."
-- Robert R. Coveyou, Oak Ridge National Laboratory
It's not easy to find rap lyrics that are appropriate to be posted here. Here's an attempt.
-- someone on Usenet replying to someone deriding Kurzweil
In general, though, that argument is the Galileo gambit and not a very good argument.
There's a more charitable reading of this comment, which is just "the absurdity heuristic is not all that reliable in some domains."
What makes this the Galileo Gambit is that the absurdity factor is being turned into alleged support (by affective association with the positive benefits of air travel and frequent flier miles) rather than just being neutralized. Contrast to http://lesswrong.com/lw/j1/stranger_than_history/ where absurdity is being pointed out as a fallible heuristic but not being associated with positives.
Dr. Seuss
I don't think change can be planned. It can only be recognized.
jad abumrad, a video about the development of Radio Lab and the amount of fear involved in doing original work
South Park, Se 16 ep 4, "Jewpacabra"
note: edited for concision. script
Even though this quote is focusing on religion, I think it applies to any beliefs people have that they think are "harmless" but greatly influence how they treat others. In short, since no person is an island, we have a duty to critically examine the beliefs we have that influence how we treat others.
--Scott Derrickson
While affirming the fallacy-of-composition concerns, I think we can take this charitably to mean "The universe is not totally saturated with only indifference throughout, for behold, this part of the universe called Scott Derrickson does indeed care about things."
That's the way I interpreted it, too. There's a speech in HP:MOR where Harry makes pretty much the same point.
“There is light in the world, and it is us!”
Love that moment.
That's exactly the sentiment I was aiming for with the quote.
-- Closing lines of Crimes and Misdemeanors, script by Woody Allen.
Scott Derrickson is indifferent. How do I know this? I know because Scott Derrickson's skin cells are part of Scott Derrickson, and Scott Derrickson's skin cells are indifferent.
If you interpret “X is indifferent” as “no part of X cares”, the original quote is valid and yours isn't.
-- Steve Smith, American Dad!, season 1, episode 7 "Deacon Stan, Jesus Man", on the applicability of this axiom.
-- Scenes From A Multiverse
-- thedaveoflife
--John Derbyshire
While this is all very inspiring, is it true? Yes, truth in and of itself is something that many people value, but what this quote is claiming is that there is a class of people (whom he calls "dissidents") who specifically value it above and beyond anything else. It seems a lot more likely to me that truth is something that all or most people value to one extent or another, and as such, sometimes if the conditions are right people will sacrifice stuff to achieve it, just like for any other thing they value.
David Brin
Lots of people in Weimar Germany got angry at the emerging fascists - and went out and joined the Communist Party. It was tough to be merely a liberal democrat.
I suspect you have your causation backwards. People created / joined the Freikorps and other quasi-fascist institutions to fight the threat of Communism. Viable international Communism (~1917) predates the fall of the Kaiser - and the Freikorps had no reason to exist when the existing authorities were already willing and capable of fighting Communism.
More generally, the natural reading of the Jedi moral rules is that the risk of evil from strong emotions was so great that liberal democrats should be prohibited from feeling any (neither anger-at-injustice nor love)
I don't know why you would think the causation would be only in one direction.
Now I'm confused. What is the topic of discussion? Clarification of Weimar Republic politics is not responsive to the Jedi-moral-philosophy point. Anger causing political action, including extreme political action, is a reasonable point, but I don't actually think anger-at-opponent-unjust-acts was the cause of much Communist or Fascist membership.
You might think anger-at-social-situation vs. anger-at-unjust-acts is excessive hair-splitting. But I interpreted your response as essentially saying "Anger-at-injustice really does lead fairly directly to evil." Your example does not support that assertion. If I've misinterpreted you, please clarify. I often seem to make these interpretative mistakes, and I'd like to do better at avoiding these types of misunderstandings in the future.
If the memories of my youth serve me, anger 'leads to the dark side of the force' via the intermediary 'hate'. That is, it leads you to go around frying things with lightning and choking people with a force grip. This is only 'evil' when you do the killing in cases where killing is not an entirely appropriate response. Unfortunately humans (and furry green muppet 'Lannik') are notoriously bad at judging when drastic violation of inhibitions is appropriate. Power---likely including the power to kill people with your brain---will almost always corrupt.
Not nearly as much as David Brin perverts Lucas's message. I do in fact reject the instructions of Yoda, but I reject what he actually says. I don't need to reject a straw caricature thereof.
Automatically. Immediately. Where did this come from? Yoda is 900 years old, wizened and gives clear indications that he thinks of long term consequences rather than being caught up in the moment. We also know he's seen at least one such Jedi to Sith transition with his own eyes (after first predicting it). Anakin took years to grow from a whiny little brat into an awesome badass (I mean... "turn evil"). That is the kind of change that Yoda (and Lucas) clearly have in mind.
That seems unlikely. It also wasn't claimed by the Furry Master. Instead, what can be expected is that opinions and political beliefs will change in predictable ways---most notably in the direction of endorsing the acquisition and use of power in ways that happen to benefit the self. Maybe the corrupted will change from a Blue to a Green, but more likely they'll change into a NavyBlue and consider it Right to kill Greens with their brain, take all their stuff and ravage their womenfolk (or menfolk, or asexual alien humanoids, depending on generalized sexual orientation).
Except that Lucas in the very same movie has Darth Vader turn back to the Light and throw Palpatine down some shaft due to loyalty to his son. Perhaps Lucas isn't presenting the moral lesson that Brin believes he is presenting.
Agreed generally, but I will quibble about your last paragraph. Vader's redemption is being presented as a Heroic Feat; it is no more representative of normal moral or psychological processes in this universe than blowing up the Death Star with a single shot is representative of normal tactics.
-Pirkei Avot (5:15)
-- Henri Poincaré
Deep wisdom indeed. Some people believe the wrong things, and some believe the right things, some people believe both, some people believe neither.
To me, it expresses the need to pay attention to what you are learning, and decide which things to retain and which to discard. E.g. one student takes a course in Scala and memorizes the code for generics, while the other writes the code but focuses on understanding the notion of polymorphism and what it is good for.
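To make the contrast concrete, here's a minimal sketch of the distinction (my own illustration, in Python rather than Scala, since the idea is language-independent): the generic signature is the part you could memorize, but the point worth understanding is that a single definition works for every element type.

```python
from typing import List, TypeVar

T = TypeVar("T")

def first(items: List[T]) -> T:
    """Parametric polymorphism: one definition works for any element type."""
    return items[0]

# The same code handles ints and strings without modification.
print(first([1, 2, 3]))        # -> 1
print(first(["a", "b", "c"]))  # -> a
```

The memorizing student can reproduce the `TypeVar` boilerplate; the understanding student can explain why no second copy of `first` is ever needed.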
Randall Munroe
This is a duplicate.
And to think, I was just getting on to post this quote myself!
SMBC comics: a metaphor for deathism.
While I am a fan of SMBC, in this case he's not doing existentialism justice (or not understanding existentialism). Existentialism is not the same thing as deathism. Existentialism is about finding meaning and responsibility in an absurd existence. While mortality is certainly absurd, biological immortality will not make existential issues go away. In fact, I suspect it will make them stronger.
edit: on the other hand, "existentialist hokey-pokey" is both funny and right on the mark!
-- Jonathan Haidt
-- Yvain, on why brinkmanship is not stupid