Rationality Quotes October 2011
Here's the new thread for posting quotes, with the usual rules:
- Please post all quotes separately, so that they can be voted up/down separately. (If they are strongly related, reply to your own comments. If strongly ordered, then go ahead and post them together.)
- Do not quote yourself.
- Do not quote comments/posts on LW/OB.
- No more than 5 quotes per person per monthly thread, please.
--Oliver Heaviside
--Thomas Carlyle
I love this quote, but it really isn't true. People frequently forgo the first one.
What is most awful is how often people do things for no reason at all.
Why is that awful?
I interpret "good reason" as "'good' reason".
I don't think the first one ever gets generated unless someone else asks them why they did it.
It must be nice to be clever enough to generate good reasons in real time, rather than having to spend all your spare cycles preemptively coming up with justifications for your actions.
I just ran into a surprisingly candid example of Richard Feynman talking about when he did that. He worked on the atomic bomb to make sure that Nazis didn't get it first, but then he kept working on it even after the Nazis had been defeated.
I love it too and I like to have an evil reason as well. That keeps things in perspective. And a right reason - which balances the 'good' with the 'evil' according to my ethical sentiment. But that's just a (morally ambiguous) ideal. The real reason, that which Carlyle mentions, is something else again.
I have a different angle - I like to have a stupid reason, to amuse my friends with.
--Thomas Carlyle
--Thomas Carlyle
Indefinitely, anyway. I am reminded of another Carlyle quote that Moldbug quoted with approval (but then doesn't he always):
Sharon Fenick
Reading is merely a surrogate for thinking for yourself; it means letting someone else direct your thoughts. Many books, moreover, serve merely to show how many ways there are of being wrong, and how far astray you yourself would go if you followed their guidance. You should read only when your own thoughts dry up, which will of course happen frequently enough even to the best heads; but to banish your own thoughts so as to take up a book is a sin against the holy ghost; it is like deserting untrammeled nature to look at a herbarium or engravings of landscapes.
-Schopenhauer
Many of us are not Schopenhauer, and could stand to have our thoughts directed sometimes.
Theodore Roszak
--Principia Discordia (surprisingly, not quoted yet)
Maybe because it has little to do with rationality?
(deleted to humor gwern, though probably ineffectually)
I thought the relevance was extremely obvious to LW; before I explain, I'd like to hear how you interpret it (or whether you don't find it relevant at all).
My guess (I haven't read RobinZ's) is that vg'f nobhg pbasvezngvba ovnf. Knowing what Discordianism is probably makes it more obvious.
If I were to guess in rot13, I'd say it was about pbasvezngvba ovnf, napubevat, naq bgure fhpuyvxr curabzran - but I would still agree with shminux that it's not very good.
-- Jesse Custer, Preacher (Garth Ennis)
-- Ayn Rand
--Douglas Hofstadter
Anon
--Dan Dennett, Breaking the Spell
Eppure si muovo?
According to Wikipedia, it's unlikely Galileo actually said this.
I think that the legend works best as a legend if it's known to be untrue. After all, the point is that whether he said it or not, the earth kept moving.
I'll grant that, for sure.
Aside from my abject failure at the Italian language, I think my objection can be sustained. Semmelweis, for instance, was fired for his continued insistence that hand washing by doctors prevented disease, and met his end in a sanitarium. He saved many lives by insisting on hand washing, even though he predated the germ theory of disease, and there was probably something akin to a utilitarian calculation in his giving up his own welfare for that of many others.
So, one does not go to one's death for the truth of the propositions one doesn't understand, but rather for the way the implications of those propositions affect one's terminal values. This brings us much closer to the religious who believe that very bad things happen if they recant their creed.
Indeed, because it's ungrammatical. The phrase that Galileo may (not) have said is:
(EDIT: Unless, of course, what was meant was:
i.e., "And yet....yes, I move!")
E.T. Jaynes's "Bayesian Methods: General Background"
-- Herodotus
The problem with that quote is that human biases often go the other way, i.e., we'd rather blame bad consequences on bad luck than admit we made a bad decision.
The quote may still have some use when applied to humans other than oneself.
I tried to track this down, and this seems to be Jaynes's paraphrase of Herodotus; pg 2 of "Bayesian Methods: General Background". (I looked through one translation, http://classics.mit.edu/Herodotus/history.mb.txt , and was unable to locate it.)
I got it out of "Data Analysis: A Bayesian Tutorial", pg. 4, where it is attributed to Herodotus.
After some more searching and a pointer on Straight Dope, I think I've found it in Book 7 of the Histories when Artabanus is trying to dissuade Xerxes from launching his ill-fated war against the Greeks, where it is, as one would expect from Jaynes's paraphrase, different:
Or in another translation:
-- G.K. Chesterton
I so adore cliches. They create an expectation to subvert.
Do that too much and you'll end up with a "high brow" piece that's incomprehensible to anyone not familiar with the cliches you're subverting.
The short story in question is "The Dagger with Wings", originally published in The Incredulity of Father Brown.
That said, I don't quite understand why this constitutes a Rationality Quote.
To me, the lesson is that when someone appeals to your intuitions - you can just say no.
"Don't you feel there must be a supreme being, that everything has a purpose and a place in the grand order of things?"
"No."
(Fun story, incidentally.)
-- Roissy in DC
Why am I not a special little snowflake?
Also, that's not an equilibrium. If everyone acts like a snowflake, it will stop creating positive impressions, and people who can afford to will start acting humble to countersignal snowflakiness. Unless very few people can afford to, in which case the decision is isomorphic to the prisoner's dilemma and Roissy is telling people to defect.
--Roissy in DC
Who is Roissy?
Roissy is a PUA whose old blog is now only available via the Internet Archive. Apparently this is his current blog.
In a comment from April 2010, I said:
However, I liked the grandparent and (a bit less) the other Roissy quote Konkvistador posted.
Surrendering to the barbarians, are we?
To paraphrase Roissy: feel free to try to save this doomed civilization; I'll be poolside getting a tan.
Except that we are all part of this "doomed civilization", and if it collapses in civil war, nuclear apocalypse or even just a gigantic economic collapse, being poolside won't keep you safe. So we have to save it, or to fix it. Now you can say that building a friendly AI is a much more efficient way of saving or fixing it than getting involved in politics. That's something I can fully respect. But saying that you don't care or don't want to try is irresponsible.
For myself, saving it is very close to the thing I have to protect, so I won't skip any single way of trying to save it that is under my own power: from understanding the world better, to raising the sanity waterline around me, to giving to charity, to taking the train or walking instead of driving, to getting involved in politics even if it's "dirty". Because what matters is to win.
I'm very open to any argument that "this would be a more efficient way to save it", which would make me stronger in defending what I have to protect, but "don't try to save it, have fun, and if everything collapses, too bad" is not acceptable; it doesn't help my terminal values.
His argument is that the modern world was doomed before we were born; there is nothing one can really do to reform or save it. There can be no "have to" when there is a fairly strong possibility that nothing can be done, because incentives, biases and plain ignorance are aligned in such a way that effective positive action will bring overwhelming response against it. Anyone who thinks voting will solve anything has quite a bit of a way to go, in my mind.
When civil war/nuclear apocalypse/gigantic economic collapse comes, Roissy will at least still have a tan. The activist won't.
Actually I do think it is by far the most productive course of action, and I do support that effort as much as I can. But should that in itself raise some alarm bells in our minds? Getting Friendly AI right before it is too late is such a long shot by most estimates. If contributing to this is indeed the best option for maximising desirable mid-term future states of the universe for an individual or small group, we should pause to think about just how little certainty and influence a person has on a system composed of 7 billion people, their machines and the natural environment.
For some games the only way to win is not to play. I am quite certain the average LWer will do the world much more good if he tries to promote rational thinking and tries, as best as he can, to detach and disinvest himself, both emotionally and resource-wise, from daily politics and ideology.
I am not saying the tiny influence a person has on the world automatically doesn't matter if a huge payoff is at all possible. I am saying that people are over-invested in politics, far beyond the point of diminishing returns, because our brains and our society trick us into believing we matter far more in the process of government than we actually do. Remember the opportunity cost of involvement in politics!
I partially endorse the poolside-tan recommendation, because I'm actually convinced that taking a 30-minute swim in the pool each morning rather than reading political commentary will give the world more utility, because of your improved well-being and productivity in other endeavours. It's likely not the optimal use of your time, but don't let the perfect become the enemy of the good.
This seems suspiciously convenient for someone who already prefers poolside tanning to saving the modern world.
The idea that you actually can save the modern world is also convenient for people with a certain self-perception.
Yes, and there are endless crackpots who believe themselves to be doing just that. Someone who is genuinely out to save the world will (unfortunately) share this same feature with crackpots; they will have to distinguish themselves from crackpots in other ways.
Allow me a small nitpick in a great commentary: I would advise against the tan because of the dangers of skin cancer and skin damage. Tanning has become a cultural obsession in the West (not so much in Asian countries), but it is generally unhealthy for white skin. You should only get enough sun to produce the necessary vitamin D; other than that, avoid it! Go for the swim though, it is healthy!
Agreed!
Wow, my thoughts, I'm surprised to read them here. ;)
If folks are over-invested in politics, there are two ways of making the situation more nearly optimal.
1. Reduce investment in politics. This is Roissy's recommendation. This option does not directly affect political issues, but it does free up effort/resources to be spent on more productive things (science, business, personal enjoyment, etc.).
2. Raise the social payoff of investment in politics. This entails promoting reform and change in political processes so as to make them more deliberative, less ideological, more conducive to efficient outcomes, etc. This option yields a direct payoff by improving political outcomes.
I'm not saying that (2) is easy. But a site having as mission statement "refining the art of human rationality" should definitely take an interest in the issue, since so much of human deliberation occurs in the public sphere.
In the absence of research into good governance, and into the conditions that affect change of government, that is an order of magnitude better than what is currently available, I would say promoting point two is in practice harmful advice.
Strategy 1. has a guaranteed pay-off but requires an individual to admit to himself that investment so far has been wasted. Strategy 2. can be used to rationalize any escalation of investment and past investment, a very comforting idea.
But I wish to stress something: people who disinvest from politics can invest, if they really want to improve governance at any cost, in quality rationalist research that is sorely lacking. In fact, I claim that a community comprised exclusively of involved and politically active citizens can never come up with what would amount to "useful social science" on certain issues (say, the effects of governance).
I would thus argue that a site dedicated to "the refinement of human rationality" has not only thrived because of the no-mind-killer rule; it might, if it put its resources to it, radically improve the quality of government precisely by disinvesting, emotionally and resource-wise, from politics, partially mitigating the perverse incentives involved in the endeavour.
You do not consider "quality rationalist research" into good governance to fall under political involvement? Yes, it is very different from day-to-day involvement in political practice, ideology, etc. But then again, I have not seen the latter advocated much in this thread, or at all on LW.
I for one am quite wary of any "escalation of investment", expressly because affecting any stable system (natural or social) is unfeasible without a thorough, rational understanding of its structure and leverage points. And so most effort into political activism is indeed "wasted"[1]. But given that so many folks apparently are emotionally invested into changing governance in some way, I don't think there's anything wrong with helping such folks achieve desirable outcomes.
[1] I am obviously disregarding exceptional cases such as the "Arab Spring" uprisings; but even the Tea Party has had negligible effects (e.g. the leading R nominations for the 2012 election are widely seen as mediocre, and TP candidates are not faring well), and I expect little better from the 'Occupy' effort given how unfocused it is.
None of these statements imply any of the others.
That is a plainly false claim. The second statement implies the first for a start.
There are ways to have politics in your personal life that aren't talking about ideology.
Could you give an example?
Silently thinking about ideology. Acting on the results in your personal life (you're a cop; do you arrest Mandela?). Thinking about who to vote for. Joining a union. Going on strike. Disobeying a law you don't like. Storming a significant building. Fighting in a civil war.
For politics beyond the municipal level I just don't see how. Politics is termed the art of the possible; this means one must necessarily signal aggressively to build coalitions. At the level of a state, one must build coalitions around class, ideological, ethnic and even religious affiliations.
Humans are built as hypocrites for a reason: some forms of signalling are easier and more safely done if you honestly believe (while keeping adaptive behaviour that doesn't quite go together with your beliefs). The biases and failures of our minds will shift your opinions closer to your stated opinions even if you guard against this, or at the very least your children will inherit them (see crypto-Jews or the Kakure Kirishitans to get a feeling for how hard this is to avoid).
Wondering why people are downvoting this, considering that I and other LWers have advocated political disengagement, in the context of life in first-world (and other) states, as a recipe for personal happiness and improved productivity, and such comments have been upvoted in the past.
Not a downvoter (and I doubt this is why people are downvoting, but I suppose it could be), but I somewhat disagree. There is a (very small) group of friends with whom I can discuss political topics without them becoming mind-killing. Frequently this is because of admitted information gaps between us, and it's about learning the specifics of an issue. It's certainly not easy though, and often involves us switching sides on each other when we see one of us using dark arts. Similarly, I would expect most LWers to be able to (somewhat) rationally discuss politics. Which is not to say that they would enjoy it or should do it, just that I would expect them to be far more able (more so than my friends) to do it productively.
If I had to guess why people are downvoting, though, it would be because discussing politics can bring you more happiness, and indeed does for many people, even if it's just yelling at each other. Although I feel very uncertain (p = 0.35) that this is actually the reason for downvotes.
I just upvoted it. It is extremely good advice.
It was actually Lesswrong that made me realize politics is mostly bunk, when I saw that we could talk about nearly anything and it was completely irrelevant. And we often speak of big things. Daily politics, anything shorter than a several-decade-long trend or a revolution, will simply not affect you, and is a waste of cognitive resources and often a source of frustration.
You are not living in a hunter-gatherer tribe where your voice matters; the pay-offs are so low that your actions influence how your country is run about as much as they influence plate tectonics, and you probably couldn't tell the difference. Social interactions will inform you of any change that specifically requires you to change behaviour; no need to watch the news.
Really following this advice in the past two years has saved me hundreds of hours that I've been able to spend on leisure and productive endeavours. This is ignoring the emotional investment that I would also have wasted. If I'm going to waste that, I may as well root for my favourite sports team rather than a political party! More entertaining, and at least that won't cut me off from interesting people or career opportunities, or leave me vulnerable to silly beliefs with repercussions beyond being mistaken about my team "deserving" to win.
Lesswrong, please listen, it is OK to put on your cynic hat when that is the best thing to do!
It's not quite that simple. You shouldn't just ask yourself, "Will my one little voice make things better? Or would exactly the same things happen if I said the opposite, all else being equal?"
If UDT or TDT are at all on the right track, then you should ask, "If everyone who decides according to the same logic as mine decides to say this, will it make things better?" The number of people who share your logic may scale with total population, so it might still make sense to speak, even if you are individually an infinitesimal fraction of the total population.
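The arithmetic behind this argument can be made concrete with a toy model (my own illustration, not from the thread; the fraction `f_correlated` is an assumed free parameter): your causal influence shrinks as 1/N, but if a fixed fraction of the population decides by the same logic as you, your "logical" influence stays constant no matter how large N gets.

```python
# Toy model of the UDT/TDT-style voting argument. Assumptions: each voter has
# one vote, and a fraction f_correlated of voters share your decision logic.

def causal_influence(n_voters: int) -> float:
    """Share of the vote you control if only your own ballot counts."""
    return 1 / n_voters

def logical_influence(n_voters: int, f_correlated: float) -> float:
    """Share of the vote decided by everyone who shares your logic."""
    return (f_correlated * n_voters) / n_voters

# Causal influence vanishes as the electorate grows; logical influence doesn't.
for n in (10**3, 10**6, 10**9):
    print(f"N={n}: causal={causal_influence(n):.2e}, "
          f"logical={logical_influence(n, 0.01):.2e}")
```

Under these assumptions, at a billion voters your causal share is 1e-9 while the correlated share is still 1% of the electorate, which is the sense in which "it might still make sense to speak."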
AFAICT, this is simply wrong. There are many political trends (on a scale of several months to a few years) which will noticeably affect the typical person. This is not to say that affecting such trends is easy or that conventional "political involvement" is useful. But this should be regarded as an open problem, not as a reason to renounce and surrender any kind of involvement with these issues.
The typical person will be informed of such trends by other typical people.
Insofar as politics is mostly bunk, I would say that it's because politicians are only about as sane as everyone else. If you want politics to accomplish anything useful, try to raise the sanity waterline.
Which as we know is very very hard to do by doing politics.
A fully cynical approach, however, would likely still find that a certain level of interest in politics is necessary so that one could optimize one's political opinions (and when and how to express them) so as to maximize their signaling value.
Moreover, I disagree with this statement:
In cases of major economic and political instability, it can be tremendously valuable to be able to anticipate the coming trouble and undertake damage control as early as possible. Clearly, this isn't doable if you become aware of it only when you're struck with it completely unprepared.
Now of course, one could argue that the present system in the Western world is so stable that the probability of such trouble is infinitesimal, or that there is no known method for predicting such trouble with any accuracy. However, this must be established separately from the issue of futility of trying to influence politics by personal activism or voting.
Ironically, if you are apolitical, people will assume your politics match theirs, or at least fall somewhere on the spectrum of respectable consensus, so the difference between this and consciously optimising your opinions might be smaller than it seems at first glance. There are indeed professions in the Western world where it is basically your job, formally or informally, to master such signalling and "know politics". But these are far from the majority, even among the educated classes. Maximising political signalling will do you some good, but consider the non-trivial amount of time, effort and energy spent on it. There is a very real opportunity cost here.
I would agree with this. I tried to emphasise this here:
Looking back I think I should have elaborated on this black swan more.
I would argue that even in times of great instability and war it is relatively easy for most "common people" (who are either relatively politically apathetic or reactionary) to side post facto with the winning side, and identifying it doesn't take much time or cognitive resources. If this were not the case, several countries today would be less populous than they are.
But of course people who find themselves in classes that suddenly become politicised but weren't until recently (like people who wore glasses under the Khmer Rouge) would find themselves in quite a bit of trouble if they happen to follow such strategies.
I should have been more explicit in separating these two. But the thing is, I don't think our minds really understand being informed about politics while not being able to influence it in at least a small way, with the possible exception of when we are really low status in our tribe (something which will frustrate us if we don't feel we deserve this low status according to other signals).
(I think you meant to link to "Generalizing From One Example" instead of "The Mind Projection Fallacy", which is something different.)
You are right, thank you for pointing that out.
Skimming through the article I can see why I remembered it as "Typical Mind Fallacy" rather than "Generalizing From One Example". I will fix the link.
I agree, though in societies that are ideologized to a high degree, it takes non-trivial knowledge to recognize all topics and opinions that will be taken as political, so that paradoxically you need some knowledge of politics to be safely apolitical. Similarly, in such societies, the range of professions that don't require at least some expression of ideological rectitude can be surprisingly narrow, and it may exclude practically all high-status professions, even those that are supposed to be strictly technical. It seems to me (though it would of course be a controversial question in its own right) that Western societies have been moving in this direction for quite a while now.
Now, people, especially smart people, usually have an instinct to synchronize unconsciously with the respectable opinion (or rather with some particular position within the range of the respectable opinion). They will obtain the necessary knowledge without conscious effort, and they will normally be safe as long as they don't say anything that strikes them as overtly controversial. But if your synchronization mechanism doesn't work very well, it's definitely advisable to spend some effort on self-education to make sure you don't commit a dangerous faux pas.
You don't even need to reach for such extreme examples as the Khmer Rouge. In ethnic conflicts, for example, if you belong to the wrong ethnicity in the wrong place, you are typically given no option to avoid harassment, dispossession, expulsion, or even death, no matter what loyalties you choose to profess.
Moreover, some forms of political instability cause sweeping damage akin to a natural disaster, for example wartime destruction or asset price crashes. Anticipating these just slightly ahead of time gives you an immense advantage; even if you save just pennies on the dollar, it can mean the difference between a difficult but bearable situation and utter destitution.
Knowing politics can't save you here in any case. At the least, it will do you no more good than, say, the simple heuristic of sticking to others of your own ethnicity (which should keep you reasonably safe). And since you are sticking to them, they will again inform you of any potential trouble via daily interactions. Just remember to be more paranoid than the norm, and don't be afraid to change countries.
How much better is someone involved in the political process going to be at predicting this, compared to someone who apolitically looks at the broad trends and reigning ideology? The latter person knows there is a certain probability of such a blow-up, though he may miss it because of too-infrequent updating; the former will probably only realize it is possible a few weeks or even days before the event. How these two approaches compare to each other depends on how fast one updates on new information, I suppose. I would argue that a long-term strategy of preparation for such possible "man-made disasters" might outdo the rushed preparations of someone responding to politics as it happens.
Perfect blindness both to daily politics and to the real mechanisms of how one's society functions is naturally perilous. But consider the context of this discussion. The "real" mechanisms would be perhaps controversial, but mostly not covered under the "no mind-killers" rule. Politics in the sense that most obviously triggers it is useless.
I think we're having a misunderstanding about what exactly we mean by the "no mind killers" rule. Clearly, getting into mind-killing debates with people is worse than useless; that much we can agree on. On the other hand, making a correct decision to bail out ahead of trouble requires that you face the very worst mind-killing issues head-on and make correct judgments about them. (It is possible that such attempts are ultimately futile or not worth the opportunity costs when all probabilities are considered, but there's a Catch-22 situation there, because consideration of at least some highly mind-killing topics is necessary in order to establish this.)
However, when people speak about avoiding mind-killers on LW, they often have in mind complete cessation of thinking about such topics and living under the assumption that the status quo will continue indefinitely, or all until some grand technological game-changer. (Worse yet, sometimes they go further and privilege the hypotheses on controversial questions favored by the respectable opinion and official intellectual institutions, and consider attacks on these, but not adherence to them, as mind-killing.)
What I meant by this was things like idly speculating about the election. Indulging in offhand remarks about Gawddamn Liberals and Bible-Thumping Conservatives. Frowning seriously and speaking about some politician's misconduct. Debating the particularities of certain laws. Endorsing candidates, criticizing candidates. Taking the parties' stated platforms seriously, ad hominems against the demographics that support a certain position, various other Dark Arts, etc.
In short everything that immediately triggers tribal feelings in those who are basically politically active average Joe "good citizens".
These are of course ideological mind-killers. I would argue that currently there is some room for intelligent debates on LW about various such issues, the sore thumb being gender relations/sexual conduct. People are not obviously mind-killed by discussing, say, group differences or questioning Democracy (OK, many are, but a substantial and not at all fringe fraction of LWers have thought about this question and take some deep criticism of it quite seriously), though a kind of paranoia and a strained feeling of someone saying "too much" does persist. Considering its demographics, its constant stream of new, unacclimatised participants, and the very aggressive signalling on things like charity and altruism (concepts always heavily defined and shaped by the underlying fundamentals of a society), it is admirable that the LW rationalist community can go as far as it does.
I say people can survive without daily politics just fine, because signalling only requires that they understand the ideological fundamentals; indeed, they might do just as well if they try to understand the ideological fundamentals from the outside, without getting into the messy details. I think my "model crazy society's expectations as a black box without bothering with the details of how they think their crazy works" satisfices.
I do however agree with your concern here:
But to think about such issues critically and objectively, there is in fact no need to even know what has been going on in say the past year. The tabooed fundamentals and key axioms have been the same for quite a bit longer and will not be changed by political action in the context of a modern Western "representative democracy".
People who value truth-seeking and truth in itself have a higher than average probability of having a damaged mechanism. One does need to know ideology, but this doesn't translate into voting, watching Fox News and CNN, reading newspapers, discussing politics, commenting on Facebook about who will win the Democratic nomination this year, or caring who your congressman is.
Knowing what the basic ideological structure of your society is does not translate into "doing politics" or "caring about politics", and not even exactly into "knowing politics". In fact, since you bring up ideology, I will say that, taking an outside view of dominant Western ideology, one can conclude it is remarkably easy to figure out its results, compared to the extensive processing one must do within the framework provided by that ideology to get the same output.
It is trivial to predict the correct position on nearly anything by following a few simple rules. Building a black box seems the most reasonable course of action. Naturally, you can't really state the rules emulating the black box, or people will object, since the signalling is all messed up and it may ruin important narratives. Following the rule set isn't without its problems: you will get a few very false negatives, but comparatively many false positives (which persist as false positives because power structures haven't yet had need to levy them in their never-ending quest for ...uh... power), but if anything you will end up seeming too orthodox for your own good. As long as you maintain your apolitical demeanour and aren't passionate about "your opinions" (how could you be? you are getting the result without the empowering rationalizations, remember!), this will never get you into trouble. And the best part is that you will often end up being "right" a few years or a decade or two later; a few true believers who know you over a long period of time might even notice this and end up respecting you for being "forward-thinking".
Actually, I'd say the "Reason as Memetic Immune Disorder" hypothesis has it basically correct: to the extent that one can be "too orthodox," one ultimately has to fall into one or another heterodoxy. Most anybody who strives to take 90% of secular Western ideology seriously, consistently, and literally is going to end up libertarian or communist or transhumanist or the like, and of course (in the society we're discussing) only nerds do this. I think you're right to observe that most everybody's aware that if one's goal is to get by socially with minimum effort, taking official ideology at face value, like taking religion at face value, is insane; but then of course different people have different goals.
Could you please state these rules?
Rule 1: assume all judgments that things are [ETA: or are] not of equal value are due to motivated thinking by people writing their bottom line according to a weighted primeval in-group/out-group equation, and in response one should compensate along necessary opposite vectors [ETA: or find an ingroup member to inform you about their group].
Rule 2: Rule 1 does not apply to the extent the in-group in question is constructed around complying with these rules.
Note 1: There will not necessarily be unique solutions to these rules, for example, evidence that men and women think differently in important ways can either be dismissed entirely or have its interpretation arranged so that the tasks women are better at are more important.
Note 2: These rules only apply to conclusions in line with primeval in-group/out-group thinking, for example, no one cares if their non-African scientists discover that all modern non-Africans are descended from Neanderthals, because the in-group is allowed to say things that some perceived moral systems would see as making them inferior. The opposite would have been a different situation.
Note 3: Any conclusion that results from compensating more than one did may be honestly disagreed with; the presence of honest disagreement marks the system as tolerant and makes those who apparently dishonestly disagree beyond the pale and not even worth arguing with. Any conclusion at variance with the rules system due to its compensating significantly less isn't just wrong, but evidence of primitive rationalization and/or moral failure, in accordance with Rule 1, and its advocates are different in kind from those merely disagreed with.
[ETA: Note 4: These rules are recursive. Whether someone is valuing things as equal (or not) is to be judged by an interpreter of the rules according to the rules.
These rules constitute a black box with which one can discover socially respectable positions without understanding facts and apparently underlying issues. Information about such facts may distract from unbiased use of the black box, and result in unacceptable opinions.]
It is basically the most mind-killing thing that I can think of in the context of LW public discussion.
Also, the simple rules are only simple in comparison to what they replace. They are currently scattered through various notes and correspondences and a reference chart that's probably only understandable to me. I don't have a go-to response or prepared mail; I do write about individual points when they come up. To properly introduce them fully to other LWers would probably take one or two top-level-quality articles, rather than a throwaway comment in an obscure thread.
Not without getting into highly mind-killing territory.
Consider racefail09, where sophisticated people with great skill in expressing themselves and intimate knowledge of our politics nonetheless found themselves in grave and potentially career-threatening trouble for violating obscure and difficult-to-detect taboos, despite determined and terrified effort to conform.
Non-political people are always getting in trouble for ideological violations - for example using "gay" or "twat" as a curse word rather than "prick" - and as racefail09 demonstrates, even highly political people who purport to have all the correct politics can and do regularly get in trouble.
Racefail09 is suggestive of the Maoist self criticism movement. When collectivization was considerably less successful and complete than it had been officially decreed to be, Mao concluded that ten percent of the party were traitors, so it became necessary to find and punish that many traitors, regardless of whether they existed or not, and no amount of knowledge of what was necessary to conform ideologically could save one.
Not at all. The people who complained about supposed racism in the original post should have been trolled hard for disputing the author's motives in wishing to write fiction about minority folks--and seeking to do it "right", i.e. minimizing outgroup biases. Their original arguments were nonsensical and should have been exposed as such. Instead, what we got from the folks on the author's side was lots and lots of arguments about how minority people should have no say in the matter, and how pseudonymous/anonymous critics should be disregarded for not making their identity known (even though unprivileged critics have lots of reasons for being pseudonymous). The complainers' faction replied by correctly accusing the authors' side of racist bias, and that was that. It could no longer be sensibly argued that the OP authors were in the right when seeking to write about cultural outgroups in an unbiased way, so the debate was effectively lost.
Hardly a marker of "sophisticated people with great skill in expressing themselves and intimate knowledge of our politics".
No argument there, however getting to the point where you have an outside view is by itself a vast and difficult project in political and ideological self-education, which requires successful grappling with many extreme mind-killing issues without getting mind-killed yourself. You make it sound much easier than it really is!
You are right, it is much harder than I made it sound.
But I am convinced that many LWers, if they could be made clearly aware of how important this is for a clear picture, and if they committed to fighting aggressively against several very hard-to-root-out biases, could make the transition, or perhaps at least trust another LWer who has done some of the legwork once they saw the predictive power of the model.
Naturally the impulse of wanting to grab someone and shake violently until they realize the importance of something they have been missing their entire lives does little good. It is hard to communicate in many or few words, due to various complications, just how utterly vital this difficult and even dangerous (intellectually and perhaps emotionally) journey is in order to understand society.
I sometimes fear it just cannot be done.
Conversely, the Lesswrong community's attitude that politics is mostly bunk was one of the main things that convinced me that Lesswrong was a den of sanity.
(I would in fact propose this as an excellent general litmus test for rationality, at least among the intelligent and informed.)
"Conversely, the Lesswrong community's attitude that politics is mostly bunk was one of the main things that convinced me that Lesswrong was a den of sanity.
(I would in fact propose this as an excellent general litmus test for rationality, at least among the intelligent and informed.)"
Really? How's this for a rationality quote:
(Attributed to Pericles, Greek politician.)
Stated differently, conflicts among folks in any society are inevitable. Politics is simply a way of de-escalating such conflicts and making sure they're dealt with peacefully, when simpler solutions like ethical debate become impractical (due to growing social complexity) and established law is contentious or not directly applicable.
Please note: thinking politics is bunk is not the same thing as not being interested in politics.
Or, to use a line I've always dreamed of using, after shocking somebody with unexpected political knowledge: "I said I was apathetic. I didn't say I was ignorant."
(And no, the Pericles quote is not a good rationality quote. It's blatant propaganda from the mouth of a politician.)
OK, then what does it mean? If you mean that de-facto political practice (I'd rather call this "politics-as-usual", for clarity) is not worth getting involved with, does not promote good outcomes, or something else which could be described as "is bunk", then I will probably agree. But again, this is not a sensible reason for disclaiming and renouncing any kind of involvement in politics. Instead, we might (and perhaps should) see this as an opportunity for raising the sanity waterline in this domain by promoting more effective styles of political involvement and argumentation.
In this context, "not signaling allegiance to a standard political faction". Or more generally, "not looking at the world through the prism of a standard political ideology".
I might agree with you, except that factions (hence, folks needing to declare allegiance to some faction) have instrumental value in political processes, and it's hard to see what might replace them. So even though ideologies can be bad (since they often lead to absolute-sounding, black-and-white thinking), the best antidote to them is compromise and careful deliberation--as opposed to withdrawing from politics entirely and ceding any debate to the faction with the briefest sound-bites and the most adherents.
That could have used anonymity.
If other people think political ideology is highly relevant to status, doesn't that make it probably at least somewhat so?
I was actually searching for your thread, but didn't find it among articles tagged "quotes". I would have used it if it had been there.
Some pretty low-status people have been quoted in rationality threads in the past, including people with really odious ideologies. Are downvoters convinced that Roissy is even lower status than those individuals, or are they rather concerned because Roissy (as the blog was back in 2007-2009) was read by quite a few members of the OB/LW community (including Robin Hanson, who still links to him)?
Basically is this a fear that while Roissy's beliefs pay rent in anticipated experience they are evil (as in espousing different values) and thus shouldn't be allowed to influence fellow LWers who are clearly not good enough thinkers to handle this?
If this is so this may be a confirmation of Vladimir_M's take on the state of gender related debates on LW.
Or it could just be a bad quote.
It isn't low status as much as it is "out group". Low status doesn't warrant that kind of attention.
Posters like dedalus2u (if I recall this right) have argued that out group is basically just lowest possible status.
I disagree with posters like dedalus2u. Practically speaking I would far prefer to be the out-group villain that people desperately try to lower in status than the person that actually has low status within the group who gets treated with utter indifference.
I think I agree.
So, I am sure, would that Roissy fellow. :)
"Don’t ask yourself if something is fair. Ask someone else--a stranger in the street, for example." -Lemony Snicket
Why? How does knowing about this 'fairness' thing help me? (This was the line I was expecting the quote to go after the first sentence.)
If you want a truly amoral reason to care, it is this: most other people do, and these are the people you will have to convince of any proposal you want to make about anything, ever. If you propose something unfair, and are called on it, you will lose status and your proposal is unlikely to be adopted.
I would be deeply surprised if you did not care at all about fairness. I tend to think that at least some regard for fairness is part of the common mental structures of humans (there's a sequence post about this but I can't find it).
I agree with this in a sense, but only in a sense. It seems to me that every culture has a slightly different idea of what 'fairness' means, to the point where the word itself doesn't really translate from one language to another. (Or perhaps I'm thinking of 'justice', which still seems similar enough to count.) The tendency to have something-like-fairness seems pretty universal, though, even if the specific concepts involved aren't.
There is enough neuroatypicality here that I am only barely surprised when someone deviates significantly from typical human morality.
But mostly in the form of aspergers-like attributes, and this specific form of non-typicality isn't supposed to be very different to "normal people" in terms of moral feelings, as far as I've been told, anyway. (And in fact I vaguely remember reading an article on how "aspies" tended to care about morality more than the normals... ETA: found it. Doesn't look like a particularly trustworthy source though.)
I love fairness. "Ethical Inhibitions" may be the one you are thinking of (or of interest anyway). Possibly my favorite post.
Agreed, but I don't think the quote necessarily disagrees with you. I interpreted it to mean, "If you want to know if something is fair, you can't just consult yourself." This says nothing about whether fairness is helpful or desirable, it's just warning against committing the typical mind fallacy with respect to fairness.
David Hume
Roughly true, but downvoted for being basic (by LW standards) to the point of being an applause light. Good Rationality Quotes are ones we can learn from, not just agree with.
--#122 Assorted Opinions and Maxims, Friedrich Nietzsche
Upvotes for irony if anyone can find an earlier version of the quote from a European source.
-- AJ Milne
Not to be confused with A. A. Milne, who wrote Winnie the Pooh.
For some context, this is a response to allegations of scientism, a word with remarkably overt anti-epistemological connotations.
Really, I see it as describing a family of genuine failure modes that people trying to be "scientific" often fall into. For example:
a) attempting to argue by definition that something is "science" and therefore right.
b) arguing that just because some evidence isn't scientific, that it's not valid evidence.
c) insisting that the results of the latest scientific research are right, despite results in the relevant field having a very poor replication rate.
In case people try to argue that these errors rarely get made, here is a comment by Yvain with 22 karma that makes errors (b) and (c).
Can you point out where Yvain makes those comments that you think violate b and c? Reading that post it looks to me like Yvain's points are a little more nuanced than that.
Note incidentally that while you might be able to use the word that way, the vast majority of people who use it seem to use it in a way closer to what sketerpot is talking about. If one interacts at all with either young earth creationists or homeopaths for example it often doesn't take long before the term is thrown around.
In my experience, scientists arguing with creationists (I haven't looked at arguments with homeopaths) frequently make the mistakes I list above, as well as a few related ones. In particular, using the AJ Milne quote ciphergoth cited in an argument against creationism is itself at best a straw man: after all, the creationist also cares about getting the facts right; in fact, that's why he's arguing with the scientist, because he believes the scientist has his facts wrong.
In any case, the underlying argument in the AJ Milne quote is: all people care about truth; therefore, you should believe what science has to say about subject X.
This is an example of either (1) or (2) depending on how the implicit premises are made precise.
Actually, the underlying argument is not: 'all people care about truth; therefore, you should believe what science has to say about subject X'.
The underlying argument actually is: attacking someone else's argument on the basis that said argument is apparently unreasonably concerned with something so naive as the actual facts of the matter, and smearing this as 'scientism', is purely misdirection, and utterly without logical basis. It's a culturally-based ploy that works only if one has been convinced that determining the actual facts of the matter is an exclusive and unreasonable obsession that only follows from being afflicted with this apparent disease 'scientism', and that, apparently, reasonable people not so obsessed really don't worry about such trifles as factuality.
It's a mite peculiar, to me, that you can read a comment that merely specifically says, in fact, that concerns with factual correctness are not the exclusive domain of science (and it was, in fact, a comment on a false dichotomy of exactly this nature--again, the context is at the link), and assume that what it means, apparently, is 'science by definition is right'. This assumption is utter nonsense. I've no idea where you pulled that from, but it sure as hell wasn't from my quote.
Here are some excerpts from Yvain's comments that exhibit the problems I mentioned, (as well as others that maybe I should add).
This is essentially error (b) with elements of (c). From a Bayesian perspective, "saying there are likely flaws in mainstream medical research" does mean one should decrease the weight one assigns to all medical findings; thus one should assign more (relative) weight to other, non-scientific evidence, e.g., evidence likely to be based on anecdotes.
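The quantitative version of this point can be sketched with a toy odds calculation. This is only an illustration, not anyone's actual numbers: the field base rates and the likelihood ratio of a single study are made up.

```python
def p_finding_true(base_rate: float, likelihood_ratio: float) -> float:
    """Posterior probability that a published finding is true, given
    the field's base rate of true findings (the prior) and the
    evidential strength of the study itself (a likelihood ratio)."""
    prior_odds = base_rate / (1.0 - base_rate)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1.0 + posterior_odds)

# The same study (likelihood ratio 10) read in two fields with
# different replication records:
print(p_finding_true(0.5, 10.0))  # about 0.91
print(p_finding_true(0.2, 10.0))  # about 0.71
```

Lowering the field-wide base rate lowers the posterior for every finding in it, which is exactly the sense in which doubts about a field "decrease the weight" of each of its results.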
This argument violates conservation of expected evidence.
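For reference, conservation of expected evidence says the prior must equal the probability-weighted average of the posteriors. A minimal numeric check, with arbitrary made-up numbers:

```python
# If you expect to update toward H on seeing E, you must expect to
# update away from H on not seeing E, in exactly offsetting amounts.
p_e = 0.3              # P(evidence observed)
p_h_given_e = 0.8      # P(hypothesis | evidence)
p_h_given_not_e = 0.5  # P(hypothesis | no evidence)

prior = p_h_given_e * p_e + p_h_given_not_e * (1.0 - p_e)
print(prior)  # ~0.59: the prior is pinned down by the posteriors
```

The violation being alleged is treating confirming outcomes as evidence while refusing to let disconfirming outcomes count the other way, which this identity forbids.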
[Here follow several paragraphs describing how much he discourages people from being afraid to take statins, along with some references to "good doctors" and "correctly prescribed statins" that seem to be there to help set up a potential No True Scotsman.] If my doctor recommends I take a statin, I don't care about the base rates for statins "correctly prescribed" by "good doctors"; I care about the base rate of statins as actually prescribed by actual doctors.
Then Nancy tells her anecdote
Yvain's reply begins:
Funny how he didn't see fit to mention this in his first post, while he spent several paragraphs arguing for why statins are perfectly safe.
I'm not sure but somehow I suspect these numbers assume the statin was prescribed "correctly". Furthermore, they certainly don't take into account the base rate for medical studies being false. Also, he next says:
Somehow I suspect the numbers he gives in the preceding paragraph assumed no drug interactions.
I don't read most of that the way you've read it. For example, Yvain said "Saying that there are likely flaws in mainstream medical research doesn't license you to discount any specific medical finding unless you have particular reason to believe that finding is false." Discount is much stronger language than simply reducing weight in the claim.
No, it doesn't. It only violates that if, in the alternate case where Yvain knew that almost all new studies turn out to be right, he would point to this as a success of the method. I suspect that in that counterfactual, he likely would. But that's still not a (b) or a (c) type violation.
Most of the reply to Nancy, while potentially problematic, doesn't fall into (b) and (c). But I don't think you are being fair when you say:
The standard of "safe" is very different from listing every well-known side-effect, especially if they only happen in a fraction of the population. I don't see a contradiction here, and if there is one, it doesn't seem to fall under (b) or (c) in any obvious way.
It's not clear what Yvain intended to mean by "discount"; however, the rest of his argument assumes he can disregard the base rate unless you have specific evidence.
Speaking of which, lukeprog discusses the idea of "reclaiming" the word scientism on his blog.
Richard Dawkins, The Selfish Gene, ch. 8
From the film The Maggie. The quote is excerpted from here.
Background: Earlier part of the 20th century, the west coast of Scotland. Marshall, an American, is in a small chartered aircraft chasing a Clyde puffer captained by Mactaggart, with whom he has business. He and the pilot have just caught sight of her in the sea below. Night is approaching.
Marshall: Where do you reckon they're making for?
Pilot: It looks like they're putting into Inverkerran for the night.
Marshall: Tell me, if they thought I thought they were going to Inverkerran, where do you reckon they would make for then?
Pilot: Strathcathaig, maybe.
Marshall: This sounds silly, but if they thought I'd think they were going to Strathcathaig because it looked as if they were going to Inverkerran -- where would they go then?
Pilot: My guess would be Pennymaddy.
Marshall: If there's such a thing as a triple bluff, I bet Mactaggart invented it. Okay, Pennymaddy.
--Cut to aboard the puffer--
Mactaggart: Aye, he'll have guessed we're making for Inverkerran.
Hamish: Will he not go there himself, then?
Mactaggart: Oh, no. He'll know we know he's seen us, so he'll be expecting us to head for Strathcathaig instead.
Hamish: Will I set her for Pennymaddy, then?
Mactaggart: No, If it should occur to him that it's occurred to us that he's expecting us to go to Strathcathaig, he would think we'll be making for Pennymaddy.
Hamish: Well, then, shall I set her for Penwhannoy?
Mactaggart: No. We'll make for Inverkerran just as we planned. It's the last thing he's likely to think of.
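The nested second-guessing in this exchange is what game theorists call level-k reasoning. A minimal sketch; the chain of guesses is taken from the dialogue, and the wrap-around indexing is my own assumption for illustration:

```python
# Each level of sophistication moves one step along the guess chain.
PORTS = ["Inverkerran", "Strathcathaig", "Pennymaddy", "Penwhannoy"]

def predict(level: int) -> str:
    """A level-k reasoner assumes the opponent reasons at level k-1
    and best-responds by shifting one port along the chain."""
    return PORTS[level % len(PORTS)]

print(predict(2))  # Pennymaddy: where Marshall's triple-bluff logic lands
print(predict(0))  # Inverkerran: Mactaggart's "last thing he'd think of"
```

Mactaggart wins not by out-leveling Marshall but by betting that Marshall will stop at level 2, which makes the level-0 action the one Marshall has reasoned himself out of considering.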
Lives of quiet desperation paradoxically may surface as ebullient market bubbles.
Peter Thiel, The Optimistic Thought Experiment
From the first episode of Dexter, season 6:
Batista: "...it's all about faith..."
Dexter: "Mmm..."
Batista: "It's something you feel, not something you can explain. It's very hard to put into words."
Dexter smiles politely, while thinking to himself: Because it makes no sense.
Dexter is an atheist? Maybe I should see that show after all...
People put plenty of things into words that make no sense. Words are only words; that's why humanity invented mathematics.
Sure, but putting nonsense into words opens it to attack; fear of a justified attack may present as "it's hard to put into words".
--William James, "The Will to Believe" (section VII)
--Sappho #7; trans. Barnard (seen on http://www.nada.kth.se/%7Easa/Quotes/immortality )
Combine this with Nietzsche's "God is dead."
Ralph Waldo Emerson, Self-Reliance
This isn't a bad quote, but I downvoted it anyway, because it's practically a cliche at this point in our popular culture. Sorry :-(
No worries! I think it kind of illustrates what bias is quite nicely, though. I haven't been so "exposed" to it personally, but I guess that's because I'm not from an English-speaking country. I'll try to think of a quote that would actually add something to the list next time. Thanks for stating your reason for downvoting!
Cheers!
I've never heard it...
Pierre Duhem, The Aim and Structure of Physical Theory
“Uniformity is death. Diversity is life”
Mikhaïl Bakounine
"Perfection isn't when there is nothing left to add, but when there is nothing left to take away."
Antoine de Saint-Exupery
Repeat.
http://lesswrong.com/lw/mx/rationality_quotes_3/
Also really badly needs to be applied to itself. So many words!
I disagree. The symmetry of the "nothing left to add / nothing left to take away" phrasing is important to the poetry of the phrase. That matters.
Warrigal previously suggested "Perfection is lack of excess."
Perfection is efficiency.
Perfection's fast.
I think "fast" is qualitatively different from "efficient" to the point where the meaning is lost. OTOH,
"Perfection's efficient."
The problem with this notion: which is more perfect, "i c wat u dd thar" or "I see what you did there"?
"Nothing left to take away," if it doesn't imply that perfection is the absence of anything at all, contains an implicit "without causing disfunctionality or other problems." ""i c wat u dd thar" is arguably not even an English sentence. It's also arguably an aesthetic affront (as is the at first accidental alliteration).
I agree with what you're saying in general, but I'm compelled to point out that, in some specific cases, "i c wat u dd thar" would actually be preferable. For example, such cases include -- just off the top of my head -- humor, parody, satire, and characterization (in a fictional narrative).
ADBOC. Well, if you have something against textspeak (or txtspk, compare newspeak) how about acronyms, such as 'laser'? The analogy seems to hold: as long as you agree beforehand on their meaning - as, indeed, must be done with all words - the brevity would be a virtue. Though, I suppose, YMMV.
Agree with the first, strongly disagree with the second.
"A scientific theory should be as simple as possible, but no simpler."
Einstein
Sounds good, but may not be meaningful outside of physics, where by "theory" you usually mean model, and a model can be made simpler or more complex as the occasion demands.
Considering my brain is too small for the universe, making the theory as simple as possible sounds like a good strategy when dealing with hard problems.
It's a good strategy if "possible" means "not too simple to function correctly". You can model a human as a point mass affected only by gravity, and this model is presumably too simple for most purposes, but it's not clear in what sense it's an impossible model.
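To make the example concrete: the point-mass-under-gravity model is perfectly possible to write down and run; it is just missing everything (air resistance, posture, ground contact) that most purposes care about. A minimal sketch:

```python
G = 9.81  # gravitational acceleration, m/s^2

def fall_distance(t: float) -> float:
    """Distance (m) a point mass falls from rest in t seconds,
    ignoring air resistance and everything else about the 'human'."""
    return 0.5 * G * t * t

print(fall_distance(2.0))  # 19.62
```

The model is "impossible" only in the sense of being useless for the question at hand, which is Einstein's "no simpler" clause doing its work.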
Isaac Asimov
Tim Kreider, Artist's Note for The Pain
Fred Clark, August 9
Alternative link, for anyone else who had problems with the typepad one.
House, episode 2x24, "No Reason"
Or if something doesn't make sense, you may not have learned to think like reality.
Great read thanks!
Isn't it possible that you've just asked a Wrong Question? Although I guess you could claim that you have then made an assumption that the question could be answered . . .
"Real magic is the kind of magic that is not real, while magic that is real (magic that can actually be done), is not real magic."
-Lee Siegel
-Confucius
-A Softer World
That's terrible advice. Far better to spend that time thinking of a better attack plan. Make sure it includes contingencies to deal with anyone who may wish to avenge whoever you are killing.
Boy, Less Wrong can be really literal-minded sometimes.
"Literal-minded", "optimally efficient"; "to-mah-to", "to-may-to"...
I know this is a joke, but I want to take it seriously. Really, literal-mindedness is actually an issue in certain kinds of cases.
And I realize what I'm about to do, and so I'll just stop here.
My understanding is that the advice is to be aware that you could also end up dead, so you should dig an extra grave for yourself. It's not practical advice, it's a warning that revenge is dangerous and not worth it.
I think the point of the quote is that it's yet better to spend that time doing productive things unrelated to revenge, given that generating enough such contingencies is pretty costly.
Edit: Actually, wedrifid is right.
No, it isn't. That is another point that could be made in the general area of "Boo Revenge". The most useful point that is conveyed, via assuming it as a premise, is that taking revenge is dangerous.
I argue that quotes don't (or rather shouldn't) get credit for all possible supporting arguments for the general position they are applauding.
Apple
There's a nice quote from George Bernard Shaw on the same subject: "The reasonable man adapts himself to the world; the unreasonable one persists in trying to adapt the world to himself. Therefore all progress depends on the unreasonable man."
It's more demonstrative imho ^^
I dislike that quote. It used to be in my quotesfile, but I removed it sometime recently. It starts out on a bad premise, namely, that the reasonable man adapts himself to the world. There's no justification for that, and if you reverse "reasonable" and "unreasonable" the quote is pointless.
See my comment here:
http://lesswrong.com/lw/7wm/rationality_quotes_october_2011/52wq
Yeah, I noticed that after I posted mine.
Ah, that one.
I may lack the context to properly appreciate this quote, but evaluating it on its own merits, I've always thought it's unfair - I think the judgemental aspect isn't necessarily warranted.
It's unreasonable to want to adapt the world to ourselves, now? In many cases I think it's just a good idea, and there are plenty of examples that I don't think anyone would feel any need to disagree with. Humankind changed the world when they eliminated smallpox, for example.
I may be missing the point.
Maybe the key to understanding the quote is that "reasonable" and "unreasonable" are social judgements: society would rather have people conform to the norms/world than have them change it. At least that's the way I read the quote.
Oh, I see! That makes sense.
"Reason" in general seems to be a good set of heuristics. Trying to be reasonable will help you make financial decisions, plan ahead for common contingencies, work hard yet sustainably, get into stable relationships, etc. Another good point of reason is that its failings tend to be known or easy to predict; for example, it tends to select low-variance strategies, discount excitement, and underestimate the duration and magnitude of personality changes. That makes it easier to evaluate: use it much more for mortgages than for romance.
The ones who do are a proper subset of the ones who think they can, and there are serious costs to being in the difference between the two sets.
Traditional saying.