Rationality Quotes October 2011
Here's the new thread for posting quotes, with the usual rules:
- Please post all quotes separately, so that they can be voted up/down separately. (If they are strongly related, reply to your own comments. If strongly ordered, then go ahead and post them together.)
- Do not quote yourself.
- Do not quote comments/posts on LW/OB.
- No more than 5 quotes per person per monthly thread, please.
Comments (532)
Unknown
Jean-Paul Sartre, Nausea
Samuel Florman
I'm very confused by the downvotes, could someone explain?
Didn't downvote, but:
“Uniformity is death. Diversity is life”
Mikhaïl Bakounine
-Confucius
That's terrible advice. Far better to spend that time thinking of a better attack plan. Make sure it includes contingencies to deal with anyone who may wish to avenge whoever you are killing.
I think the point of the quote is that it's yet better to spend that time doing productive things unrelated to revenge, given that generating enough such contingencies is pretty costly.
Edit: Actually, wedrifid is right.
No, it isn't. That is another point that could be made in the general area of "Boo Revenge". The most useful point that is conveyed, via assuming it as a premise, is that taking revenge is dangerous.
I argue that quotes don't (or rather shouldn't) get credit for all possible supporting arguments for the general position they are applauding.
-A Softer World
-They Might Be Giants "Science is Real"
Ralph Waldo Emerson, Self-Reliance
Hermann Grassmann, Die Ausdehnungslehre (translation by, I think, Michael J. Crowe)
Bertrand Russell
Reading is merely a surrogate for thinking for yourself; it means letting someone else direct your thoughts. Many books, moreover, serve merely to show how many ways there are of being wrong, and how far astray you yourself would go if you followed their guidance. You should read only when your own thoughts dry up, which will of course happen frequently enough even to the best heads; but to banish your own thoughts so as to take up a book is a sin against the holy ghost; it is like deserting untrammeled nature to look at a herbarium or engravings of landscapes.
-Schopenhauer
Many of us are not Schopenhauer, and could stand to have our thoughts directed sometimes.
"Perfection isn't when there is nothing left to add, but when there is nothing left to take away."
Antoine de Saint-Exupery
House, episode 2x24, "No Reason"
-- Ayn Rand
Lives of quiet desperation paradoxically may surface as ebullient market bubbles.
Peter Thiel, The Optimistic Thought Experiment
James Burke
That guy needs to train his gut instincts more. Because I find mine damn useful and seldom 'dangerously wrong'.
In order to train gut instincts, wouldn't you already have to understand the thing that you were having gut instincts about, in order to know whether or not your instincts were telling you the right thing?
--Roissy in DC
None of these statements imply any of the others.
That is a plainly false claim. The second statement implies the first for a start.
There are ways to have politics in your personal life that aren't talking about ideology.
Could you give an example?
Silently thinking about ideology. Acting on the results in your personal life (you're a cop; do you arrest Mandela?). Thinking about who to vote for. Joining a union. Going on strike. Disobeying a law you don't like. Storming a significant building. Fighting in a civil war.
For politics beyond the municipal level I just don't see how. Politics is termed the art of the possible; this means one must necessarily signal aggressively to build coalitions. On the level of a state one must build coalitions around class, ideological, ethnic and even religious affiliations.
Humans are built as hypocrites for a reason: some forms of signalling are easier and more safely done if you honestly believe (while keeping adaptive behaviour that doesn't quite go together with your beliefs). Biases and failures of our minds will shift your opinions closer to your stated opinions even if you guard against this, or at the very least your children will inherit them (see crypto-Jews or Kakure Kirishitans to get a feeling for how hard it is to avoid this).
I'm wondering why people are downvoting this, considering that I and other LWers have advocated political disengagement, in the context of life in first-world (and other) states, as a recipe for personal happiness and improved productivity, and such comments have been upvoted in the past.
I just upvoted it. It is extremely good advice.
It was actually Lesswrong which made me realize politics is mostly bunk, when I saw that we could talk about nearly anything and it was completely irrelevant. And we often speak of big things. Daily politics (anything shorter than a several-decade-long trend or a revolution) will simply not affect you; it is a waste of cognitive resources and often a source of frustration.
You are not living in a hunter-gatherer tribe where your voice matters; the payoffs are so low that your actions influence how your country is run about as much as they influence plate tectonics, and you probably couldn't tell the difference. Social interactions will inform you of any change which specifically requires you to change behaviour; there is no need to watch the news.
Really following this advice in the past two years has saved me hundreds of hours that I've been able to spend on leisure and productive endeavours. This is ignoring the emotional investment that I would have also wasted. If I'm going to waste that, I may as well root for my favourite sports team rather than a political party! It's more entertaining, and at least it won't cut me off from interesting people or career opportunities, or leave me vulnerable to silly beliefs with repercussions beyond being mistaken about my team "deserving" to win.
Lesswrong, please listen, it is OK to put on your cynic hat when that is the best thing to do!
Conversely, the Lesswrong community's attitude that politics is mostly bunk was one of the main things that convinced me that Lesswrong was a den of sanity.
(I would in fact propose this as an excellent general litmus test for rationality, at least among the intelligent and informed.)
Really? How's this for a rationality quote:
(Attributed to Pericles, Greek politician.)
Stated differently, conflicts among folks in any society are inevitable. Politics is simply a way of de-escalating such conflicts and making sure they're dealt with peacefully, when simpler solutions like ethical debate become impractical (due to growing social complexity) and established law is contentious or not directly applicable.
Please note: thinking politics is bunk is not the same thing as not being interested in politics.
Or, to use a line I've always dreamed of using, after shocking somebody with unexpected political knowledge: "I said I was apathetic. I didn't say I was ignorant."
(And no, the Pericles quote is not a good rationality quote. It's blatant propaganda from the mouth of a politician.)
OK, then what does it mean? If you mean that de-facto political practice (I'd rather call this "politics-as-usual", for clarity) is not worth getting involved with, or not promoting good outcomes, or something else which could be described as "is bunk", then I will probably agree. But again, this is not a sensible reason for disclaiming and renouncing any kind of involvement in politics. Instead, we might (and perhaps should) see this as an opportunity for raising the sanity waterline in this domain by promoting more effective styles of political involvement and argumentation.
In this context, "not signaling allegiance to a standard political faction". Or more generally, "not looking at the world through the prism of a standard political ideology".
I might agree with you, except that factions (hence, folks needing to declare allegiance to some faction) have instrumental value in political processes, and it's hard to see what might replace them. So even though ideologies can be bad (since they often lead to absolute-sounding, black-and-white thinking), the best antidote to them is compromise and careful deliberation--as opposed to withdrawing from politics entirely and ceding any debate to the faction with the briefest sound-bites and the most adherents.
Insofar as politics is mostly bunk, I would say that it's because politicians are only about as sane as everyone else. If you want politics to accomplish anything useful, try to raise the sanity waterline.
Which, as we know, is very, very hard to do by doing politics.
AFAICT, this is simply wrong. There are many political trends (on a scale of several months to a few years) which will noticeably affect the typical person. This is not to say that affecting such trends is easy or that conventional "political involvement" is useful. But this should be regarded as an open problem, not as a reason to renounce and surrender any kind of involvement with these issues.
The typical person will be informed of such trends by other typical people.
It's not quite that simple. You shouldn't just ask yourself, "Will my one little voice make things better? Or would exactly the same things happen if I said the opposite, all else being equal?"
If UDT or TDT are at all on the right track, then you should ask, "If everyone who decides according to the same logic by which I decide decides to say this, will it make things better?" The number of people who share your logic may scale with total population, so it might still make sense to speak, even if you are individually an infinitesimal fraction of the total population.
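The scaling argument above can be put in toy-model form. This is my own illustrative sketch, not anything from the thread: the 2% cohort fraction and the electorate sizes are arbitrary assumptions, and the model naively equates influence with the share of votes a block controls.

```python
# Toy model of the TDT/UDT-style scaling argument: compare a lone voter's
# share of the electorate with the share controlled by a "decision cohort"
# (everyone who decides by the same logic you do).

def swing(block_size: int, electorate: int) -> float:
    """Fraction of the electorate a voting block controls."""
    return block_size / electorate

# A lone voter's share shrinks as the electorate grows...
lone_small = swing(1, 10_000)          # 0.0001
lone_large = swing(1, 10_000_000)      # 0.0000001

# ...but if (say) 2% of voters share your decision procedure, the cohort's
# share stays constant no matter how large the electorate gets.
cohort_small = swing(int(0.02 * 10_000), 10_000)
cohort_large = swing(int(0.02 * 10_000_000), 10_000_000)

assert lone_large < lone_small                 # individual influence vanishes
assert cohort_small == cohort_large == 0.02    # correlated influence scales
```

The point is only that the cohort's share is invariant under population growth, which is why the parent comment says speaking may still make sense even when one voice is individually negligible.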
A fully cynical approach, however, would likely still find that a certain level of interest in politics is necessary so that one could optimize one's political opinions (and when and how to express them) so as to maximize their signaling value.
Moreover, I disagree with this statement:
In cases of major economic and political instability, it can be tremendously valuable to be able to anticipate the coming trouble and undertake damage control as early as possible. Clearly, this isn't doable if you become aware of it only when you're struck with it completely unprepared.
Now of course, one could argue that the present system in the Western world is so stable that the probability of such trouble is infinitesimal, or that there is no known method for predicting such trouble with any accuracy. However, this must be established separately from the issue of futility of trying to influence politics by personal activism or voting.
Ironically, if you are apolitical, people will assume your politics match theirs, or at least fall somewhere on the spectrum of respectable consensus, so the difference between this and consciously maximising your opinions might be smaller than it seems at first glance. There are indeed, in the Western world, professions where it is basically your job, formally or informally, to master such signalling and "know politics". But these are far from the majority, even among the educated classes. Maximising political signalling will do you some good, but consider the non-trivial amount of time, effort and energy spent on it. There is a very real opportunity cost here.
I would agree with this. I tried to emphasise this here:
Looking back I think I should have elaborated on this black swan more.
I would argue that even in times of great instability and war it is relatively easy for most "common people" (who are either relatively politically apathetic or reactionary) to side post facto with the winning side, and identifying it doesn't take much time or cognitive resources. If this were not the case, several countries today would be less populous than they are.
But of course people who find themselves in classes that suddenly become politicised but weren't until recently (like people who wore glasses under the Khmer Rouge) would find themselves in quite a bit of trouble if they happen to follow such strategies.
I should have been more explicit in separating these two. But the thing is, I don't think our minds really cope with being informed about politics while being unable to influence it in at least a small way, with the possible exception of when we are really low status in our tribe (something which will frustrate us if we don't feel we deserve this low status according to other signals).
(I think you meant to link to "Generalizing From One Example" instead of "The Mind Projection Fallacy", which is something different.)
You are right, thank you for pointing that out.
Skimming through the article I can see why I remembered it as "Typical Mind Fallacy" rather than "Generalizing From One Example". I will fix the link.
I agree, though in societies that are ideologized to a high degree, it takes non-trivial knowledge to recognize all topics and opinions that will be taken as political, so that paradoxically you need some knowledge of politics to be safely apolitical. Similarly, in such societies, the range of professions that don't require at least some expression of ideological rectitude can be surprisingly narrow, and it may exclude practically all high-status professions, even those that are supposed to be strictly technical. It seems to me (though it would of course be a controversial question in its own right) that Western societies have been moving in this direction for quite a while now.
Now, people, especially smart people, usually have an instinct to synchronize unconsciously with the respectable opinion (or rather with some particular position within the range of the respectable opinion). They will obtain the necessary knowledge without conscious effort, and they will normally be safe as long as they don't say anything that strikes them as overtly controversial. But if your synchronization mechanism doesn't work very well, it's definitely advisable to spend some effort on self-education to make sure you don't commit a dangerous faux pas.
You don't even need to reach for such extreme examples as the Khmer Rouge. In ethnic conflicts, for example, if you belong to the wrong ethnicity in the wrong place, you are typically given no option to avoid harassment, dispossession, expulsion, or even death, no matter what loyalties you choose to profess.
Moreover, some forms of political instability cause sweeping damage akin to a natural disaster, for example wartime destruction or asset price crashes. Anticipating these just slightly ahead of time gives you an immense advantage; even if you save just pennies on the dollar, it can mean the difference between a difficult but bearable situation and utter destitution.
People who value truth-seeking and truth in itself have a higher than average probability of having a damaged mechanism. One does need to know the ideology, but this doesn't translate into voting, watching Fox News and CNN, reading newspapers, discussing politics, commenting on Facebook about who will win the Democratic nomination this year, or caring who your congressman is.
Knowing what the basic ideological structure of your society is does not translate into "doing politics" or "caring about politics" and not even exactly to "knowing politics". In fact since you bring up ideology, I will say that taking an outside view of dominant Western ideology one can conclude it is remarkably easy to figure out its result, compared to the extensive processing one must do within the framework provided by this ideology to get the same output.
It is trivial to predict the correct position on nearly anything by following a few simple rules. Building a black box seems the most reasonable course of action. Naturally you can't really state the rules emulating the black box, or people will object, since the signalling is all messed up and it may ruin important narratives. Following the rule set isn't without its problems: you will get a few very false negatives, but comparatively many false positives (which persist as false positives because power structures haven't yet had need to levy them in their never-ending quest for ...uh... power), but if anything you will end up seeming too orthodox for your own good. As long as you maintain your apolitical demeanour and aren't passionate about "your opinions" (how could you be? you are getting the result without the empowering rationalizations, remember!), this will never get you into trouble. And the best part is that you will often end up being "right" a few years or a decade or two later; a few true believers who know you over a long period of time might even notice this and end up respecting you for being "forward-thinking".
Consider racefail09, where sophisticated people with great skill in expressing themselves and intimate knowledge of our politics nonetheless found themselves in grave and potentially career-threatening trouble for violating obscure and difficult-to-detect taboos, despite determined and terrified effort to conform.
Non-political people are always getting in trouble for ideological violations, for example using "gay" or "twat" rather than "prick" as a curse word, and as racefail09 demonstrates, even highly political people who purport to have all the correct politics can and do regularly get in trouble.
Racefail09 is suggestive of the Maoist self criticism movement. When collectivization was considerably less successful and complete than it had been officially decreed to be, Mao concluded that ten percent of the party were traitors, so it became necessary to find and punish that many traitors, regardless of whether they existed or not, and no amount of knowledge of what was necessary to conform ideologically could save one.
Not at all. The people who complained about supposed racism in the original post should have been trolled hard for disputing the author's motives in wishing to write fiction about minority folks--and seeking to do it "right", i.e. minimizing outgroup biases. Their original arguments were nonsensical and should have been exposed as such. Instead what we got from the folks on the author's side was lots and lots of arguments about how minority people should have no say in the matter, and how pseudonymous/anonymous critics should be disregarded for not making their identity known (even though unprivileged critics have lots of reasons for being pseudonymous). The complainers' faction replied by correctly accusing the authors' side of racist bias, and that was that. It could no longer be sensibly argued that the OP authors were in the right when seeking to write about cultural outgroups in an unbiased way, so the debate was effectively lost.
Hardly a marker of "sophisticated people with great skill in expressing themselves and intimate knowledge of our politics".
Could you please state these rules?
Not without getting into highly mind-killing territory.
Rule 1: assume all judgments that things are [ETA: or are] not of equal value are due to motivated thinking by people writing their bottom line according to a weighted primeval in-group/out-group equation, and in response one should compensate along necessary opposite vectors [ETA: or find an ingroup member to inform you about their group].
Rule 2: Rule 1 does not apply to the extent the in-group in question is constructed around complying with these rules.
Note 1: There will not necessarily be unique solutions to these rules, for example, evidence that men and women think differently in important ways can either be dismissed entirely or have its interpretation arranged so that the tasks women are better at are more important.
Note 2: These rules only apply to conclusions in line with primeval in-group/out-group thinking, for example, no one cares if their non-African scientists discover that all modern non-Africans are descended from Neanderthals, because the in-group is allowed to say things that some perceived moral systems would see as making them inferior. The opposite would have been a different situation.
Note 3: Any conclusion that results from compensating more than one did may be honestly disagreed with; the presence of honest disagreement marks the system as tolerant and makes those who apparently dishonestly disagree beyond the pale and not even worth arguing with. Any conclusion at variance with the rules system due to its compensating significantly less isn't just wrong, but evidence of primitive rationalization and/or moral failure, in accordance with Rule 1, and its advocates are different in kind from those merely disagreed with.
[ETA: Note 4: These rules are recursive. Whether someone is valuing things as equal (or not) is to be judged by an interpreter of the rules according to the rules.
These rules constitute a black box with which one can discover socially respectable positions without understanding facts and apparently underlying issues. Information about such facts may distract from unbiased use of the black box, and result in unacceptable opinions.]
It is basically the most mind-killing thing that I can think of in the context of LW public discussion.
Also, the simple rules are only simple in comparison to what they replace. They are currently scattered through various notes and correspondence and a reference chart that's probably only understandable to me. I don't have a go-to response or prepared mail; I do write about individual points when they come up. To properly introduce them fully to other LWers would probably take one or two top-level-quality articles, rather than a throwaway comment in an obscure thread.
Actually, I'd say the "Reason As Memetic Immune Disorder" hypothesis has it basically correct: to the extent that one can be "too orthodox," one ultimately has to fall into one or another heterodoxy. Most anybody who strives to take 90% of secular Western ideology seriously, consistently, and literally is going to end up libertarian or communist or transhumanist or the like, and of course (in the society we're discussing) only nerds do this. I think you're right to observe, and most everybody's aware, that if one's goal is to get by socially with minimum effort, taking official ideology at face value, like taking religion at face value, is insane; but then of course different people have different goals.
No argument there, however getting to the point where you have an outside view is by itself a vast and difficult project in political and ideological self-education, which requires successful grappling with many extreme mind-killing issues without getting mind-killed yourself. You make it sound much easier than it really is!
You are right, it is much harder than I made it sound.
But I am convinced that many LWers, if they could be made clearly aware of the importance of this for a clear picture and committed to very aggressively fighting several very hard-to-root-out biases, could make the transition, or perhaps at least trust another LWer who has done some of the legwork once they saw the predictive power of the model.
Naturally the impulse of wanting to grab someone and shake violently until they realize the importance of something they have been missing their entire lives does little good. It is hard to communicate in many or few words, due to various complications, just how utterly vital this difficult and even dangerous (intellectually and perhaps emotionally) journey is in order to understand society.
I sometimes fear it just can not be done.
Knowing politics can't save you here in any case. At least, it will do you no more good than, say, a simple heuristic of sticking to others of your own ethnicity (which should keep you reasonably safe). And since you are sticking to them, they will again inform you of any potential trouble via daily interactions. Just remember to be more paranoid than the norm and not afraid to change countries.
How much better is someone involved in the political process going to be at predicting this, compared to someone who apolitically looks at the broad trends and reigning ideology? The latter person knows there is a certain probability of such a blow-up, though he may miss it due to too-infrequent updating; the former will probably only realize it is possible a few weeks or even days before the event. How these two approaches compare depends on how fast one updates on new information, I suppose. I would argue that a long-term strategy of preparation for such possible "man-made disasters" might outdo the rushed preparations of someone responding to politics as it happens.
Perfect blindness to both daily politics and the real mechanisms of how one's society functions is naturally perilous. But consider the context of this discussion. The "real" mechanisms would be perhaps controversial but mostly not covered under the "no mind-killers" rule. Politics, as in the politics that most obviously triggers it, is useless.
I think we're having a misunderstanding about what exactly we mean by the "no mind killers" rule. Clearly, getting into mind-killing debates with people is worse than useless; that much we can agree on. On the other hand, making a correct decision to bail out ahead of trouble requires that you face the very worst mind-killing issues head-on and make correct judgments about them. (It is possible that such attempts are ultimately futile or not worth the opportunity costs when all probabilities are considered, but there's a Catch-22 situation there, because consideration of at least some highly mind-killing topics is necessary in order to establish this.)
However, when people speak about avoiding mind-killers on LW, they often have in mind complete cessation of thinking about such topics and living under the assumption that the status quo will continue indefinitely, or all until some grand technological game-changer. (Worse yet, sometimes they go further and privilege the hypotheses on controversial questions favored by the respectable opinion and official intellectual institutions, and consider attacks on these, but not adherence to them, as mind-killing.)
What I meant by this was things like idly speculating about the election. Indulging in offhand remarks about Gawddamn Liberals and Bible-Thumping Conservatives. Frowning seriously and speaking about some politician's misconduct. Debating the particularities of certain laws. Endorsing candidates, criticizing candidates. Taking a party's stated platform seriously, ad hominems on the demographics that support a certain position, various other Dark Arts, etc.
In short everything that immediately triggers tribal feelings in those who are basically politically active average Joe "good citizens".
These are of course ideological mind-killers. I would argue that currently there is some room for intelligent debate on LW about various such issues, the sore thumb being gender relations/sexual conduct. People are not obviously mind-killed by discussing, say, group differences or questioning democracy (OK, many are, but a substantial and not at all fringe fraction of LWers have thought about this question and take some deep criticism of it quite seriously), though a kind of paranoia and strained feeling of someone saying "too much" does persist. Considering its demographics, its constant stream of new unacclimatised participants, and the very aggressive signalling on things like charity and altruism (which are concepts always heavily defined and shaped by the underlying fundamentals of a society), it is admirable that the LW rationalist community can go as far as it does.
I say people can survive without daily politics just fine. Because signalling only requires they understand the ideological fundamentals, and even further they might do just as well if they try and understand the ideological fundamentals from the outside, without getting into the messy details. I think my "model crazy society's expectations as a black box without bothering with the details of how they think their crazy works" satisfices.
I do however agree with your concern here:
But to think about such issues critically and objectively, there is in fact no need to even know what has been going on in say the past year. The tabooed fundamentals and key axioms have been the same for quite a bit longer and will not be changed by political action in the context of a modern Western "representative democracy".
Who is Roissy?
Roissy is a PUA whose old blog is now only available via the Internet Archive. Apparently this is his current blog.
In a comment from April 2010, I said:
However, I liked the grandparent and (a bit less) the other Roissy quote Konkvistador posted.
Surrendering to the barbarians, are we?
To paraphrase Roissy: feel free to try to save this doomed civilization; I'll be poolside getting a tan.
Except that we are all part of this "doomed civilization", and if it collapses in civil war, nuclear apocalypse or even just a gigantic economic collapse, being poolside won't keep you safe. So we have to save it, or to fix it. Now you can say that building a Friendly AI is a much more efficient way of saving it/fixing it than getting involved in politics. That's something I can fully respect. But saying that you don't care or don't want to try is irresponsible.
For myself, saving it is very close to the thing I have to protect, so I won't skip any means within my power of trying to save it: from understanding the world better, to raising the sanity waterline around me, to giving to charity, to taking the train or walking instead of driving a car, to getting involved in politics even if it's "dirty". Because what matters is to win.
I'm very open to any argument of the form "this would be a more efficient way to save it" that would make me stronger in defending what I have to protect, but "don't try to save it, have fun, and if everything collapses, too bad" is not acceptable; it doesn't help my terminal values.
His argument is that the modern world was doomed before we were born; there is nothing one can really do to reform or save it. There can be no "have to" when there is a fairly strong possibility that nothing can be done, because incentives, biases and plain ignorance are aligned in such a way that effective positive action will bring overwhelming response against it. Anyone who thinks voting will solve anything has quite a bit of a way to go, in my mind.
When civil war/nuclear apocalypse/gigantic economic collapse comes, Roissy will at least still have his tan. The activist won't.
Actually I do think it is by far the most productive course of action, and I do support that effort as much as I can. But should that in itself raise some alarm bells in our minds? Getting Friendly AI right before it is too late is such a long shot by most estimates. If contributing to this is indeed the best option for maximising desirable mid-term future states of the universe for an individual or small group, we should pause to think about just how little certainty and influence a person has on a system composed of 7 billion people, their machines and the natural environment.
For some games the only way to win is not to play. I am quite certain the average LWer will do the world much more good if he tries to promote rational thinking and tries, as best as he can, to detached and disinvest himself both emotionally and resource-wise from daily politics and ideology.
I am not saying the tiny influence a person has on the world automatically doesn't matter if a huge payoff is at all possible. I am saying that people are over-invested in politics, far beyond the point of diminishing returns, because our brains and our society trick us into believing we matter far more in the process of government than we actually do. Remember the opportunity cost of involvement in politics!
I partially endorse the poolside getting-a-tan recommendation, because I'm actually convinced that taking a 30-minute swim in the pool each morning rather than reading political commentary will give the world more utility, because of your improved well-being and productivity in other endeavours. It's likely not the optimal use of your time, but don't let the perfect become the enemy of the good.
Allow me a small nitpick in a great commentary: I would advise against the tan because of the dangers of skin cancer and skin damage. Tanning has become a cultural obsession in the West (less so in Asian countries), but it is generally unhealthy for white skin. You should only get enough sun to produce the necessary Vitamin D; other than that, avoid it! Go for the swim though, it is healthy!
Agreed!
This seems suspiciously convenient for someone who already prefers poolside tanning to saving the modern world.
The idea that you actually can save the modern world is also convenient for people with a certain self-perception.
Yes, and there are endless crackpots who believe themselves to be doing just that. Someone who is genuinely out to save the world will (unfortunately) share this same feature with crackpots; they will have to distinguish themselves from crackpots in other ways.
If folks are over-invested into politics, there are two ways of making the situation more optimal.
Reduce investment into politics. This is Roissy's recommendation. This option does not directly affect political issues, but it does free up effort/resources to be spent on more productive things (science, business, personal enjoyment etc.)
Raise the social payoff of investment into politics. This entails promoting reform and change in political processes so as to make them more deliberative, less ideological, more conducive to efficient outcomes etc. This option yields a direct payoff by improving political outcomes.
I'm not saying that (2) is easy. But a site having as mission statement "refining the art of human rationality" should definitely take an interest in the issue, since so much of human deliberation occurs in the public sphere.
In the absence of research into good governance, and into the conditions that affect changes of government, that is an order of magnitude better than what is currently available, I would say promoting point two is in practice harmful advice.
Strategy 1. has a guaranteed pay-off but requires an individual to admit to himself that investment so far has been wasted. Strategy 2. can be used to rationalize any escalation of investment and past investment, a very comforting idea.
But I wish to stress something: people who disinvest from politics can, if they really want to improve governance at any cost, invest in quality rationalist research, which is sorely lacking. In fact I claim that a community composed exclusively of involved and politically active citizens can never come up with what would amount to "useful social science" on certain issues (say, the effects of governance).
I would thus argue that a site dedicated to "the refinement of human rationality" has not only thrived because of the no-mind-killer rule; it might, if it put its resources to it, radically improve the quality of government precisely by disinvesting emotionally and resource-wise from politics, partially mitigating the perverse incentives involved in the endeavour.
You do not consider "quality rationalist research" into good governance to fall under political involvement? Yes, it is very different from day-to-day involvement in political practice, ideology, etc. But then again, I have not seen the latter advocated much in this thread, or at all on LW.
I for one am quite wary of any "escalation of investment", expressly because affecting any stable system (natural or social) is unfeasible without a thorough, rational understanding of its structure and leverage points. And so most effort into political activism is indeed "wasted"[1]. But given that so many folks apparently are emotionally invested into changing governance in some way, I don't think there's anything wrong with helping such folks achieve desirable outcomes.
[1] I am obviously disregarding exceptional cases such as the "Arab Spring" uprisings; but even the Tea Party has had negligible effects (e.g. the leading R nominations for the 2012 election are widely seen as mediocre, and TP candidates are not faring well), and I expect little better from the 'Occupy' effort given how unfocused it is.
Wow, my thoughts, I'm surprised to read them here. ;)
Scott Aaronson
J. K. Rowling
Edit: Wasn't expecting downvotes. Maybe the distinction between the attributions is obvious, but I still don't see it.
Edit 2: Downvotes explained; thanks.
I've read the source and context of that and it's really not impressing me as a rational thing to do... it's a clever/smartass thing to do, but in what way did Ilyssa win? Surely she didn't expect Eric to enlighten her on the subject in some way she hadn't thought about before, and now she is "miserable about Eric", and didn't get to enjoy Hamlet.
The "I can't stop myself" says it all - she can't choose not to defect. That's not a strength.
Agreed. All the things she finds "interesting" and "valid" to say seem to be shocking to other people. That's not a problem of being too honest; it's a problem of intentionally trying to drive people away (or of being someone's bulbous caricature of a "rationalist").
And, of course, rational agents maximize their current utility functions.
David Hume
Roughly true, but downvoted for being basic (by LW standards) to the point of being an applause light. Good Rationality Quotes are ones we can learn from, not just agree with.
From the first episode of Dexter, season 6:
Batista: "...it's all about faith..."
Dexter: "Mmm..."
Batista: "It's something you feel, not something you can explain. It's very hard to put into words."
Dexter smiles politely, while thinking to himself: Because it makes no sense.
Three proposed derogatory labels from Dilbert creator Scott Adams:
Labelass: A special kind of idiot who uses labels as a substitute for comprehension.
Binarian: A special kind of idiot who believes that all people who hold a different view from oneself have the same views as each other.
Masturdebator: One who takes pleasure in furiously debating viewpoints that only exist in the imagination.
What's the word for someone who sees errors as defining character attributes that only occur in "idiots" and not decent, sensible people like themselves and their friends and readers?
--Douglas Hofstadter
Apple
There's a nice quote from George Bernard Shaw on the same subject: "The reasonable man adapts himself to the world; the unreasonable one persists in trying to adapt the world to himself. Therefore all progress depends on the unreasonable man."
It's more demonstrative imho ^^
Ayn Rand
This view is much too binary. There are a myriad variety of choices of what to focus on, what aspect of it to focus on, and how much effort to apply to the focus. Someone can be purposefully aware of a very specific task, say a high speed race, with the bulk of their thinking down at the level of pattern matching. Someone can do highly abstract symbolic manipulations while half asleep and still recognize when they bump into the right set of manipulations to solve the problem.
It's too bad this has already dropped off the front page. Someone should request sticky threads here, although I don't care enough.
-- The Last Psychiatrist, "The Rise and Fall of Atypical Antipsychotics"
Carl Linderholm, Mathematics Made Difficult.
William Lawrence Bragg
Professor Albus Percival Wulfric Brian Dumbledore
Not sure I want to take that from someone who died.
Is all wisdom about living by anyone who is no longer alive made worthless by that fact? That seems rather arbitrary!
-- Discordian saying
Fred Clarke, August 9
-Scott Aaronson, from here
Tim Kreider, Artist's Note for The Pain
From the film The Maggie. The quote is excerpted from here.
Background: Earlier part of the 20th century, the west coast of Scotland. Marshall, an American, is in a small chartered aircraft chasing a Clyde puffer captained by Mactaggart, with whom he has business. He and the pilot have just caught sight of her in the sea below. Night is approaching.
Marshall: Where do you reckon they're making for?
Pilot: It looks like they're putting into Inverkerran for the night.
Marshall: Tell me, if they thought I thought they were going to Inverkerran, where do you reckon they would make for then?
Pilot: Strathcathaig, maybe.
Marshall: This sounds silly, but if they thought I'd think they were going to Strathcathaig because it looked as if they were going to Inverkerran -- where would they go then?
Pilot: My guess would be Pennymaddy.
Marshall: If there's such a thing as a triple bluff, I bet Mactaggart invented it. Okay, Pennymaddy.
--Cut to aboard the puffer--
Mactaggart: Aye, he'll have guessed we're making for Inverkerran.
Hamish: Will he not go there himself, then?
Mactaggart: Oh, no. He'll know we know he's seen us, so he'll be expecting us to head for Strathcathaig instead.
Hamish: Will I set her for Pennymaddy, then?
Mactaggart: No. If it should occur to him that it's occurred to us that he's expecting us to go to Strathcathaig, he would think we'll be making for Pennymaddy.
Hamish: Well, then, shall I set her for Penwhannoy?
Mactaggart: No. We'll make for Inverkerran just as we planned. It's the last thing he's likely to think of.
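The exchange above is a tidy instance of iterated "he knows that I know" reasoning (level-k thinking). A minimal sketch, assuming just the three candidate ports named in the dialogue and a simplifying rule of my own invention, that each extra level of second-guessing shifts the predicted destination one port along the cycle:

```python
# Toy model of the level-k guessing game in the Maggie dialogue.
# Assumption (mine, not the film's): each deeper level of
# "he thinks that I think..." advances the guess to the next port.

ports = ["Inverkerran", "Strathcathaig", "Pennymaddy"]

def next_guess(current):
    """One more level of second-guessing moves to the next port."""
    return ports[(ports.index(current) + 1) % len(ports)]

guess = "Inverkerran"  # level 0: the obvious destination
for level in range(1, 4):
    guess = next_guess(guess)
    print(f"Level {level}: {guess}")
# With three ports, three levels of bluffing cycle back to
# Inverkerran -- which is exactly where Mactaggart heads.
```

With an odd number of options and this cycling rule, the "triple bluff" lands back on the naive choice, which is why Mactaggart's "last thing he's likely to think of" is the first thing anyone would guess.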
"It is startling to realize how much unbelief is necessary to make belief possible. What we know as blind faith is sustained by innumerable unbeliefs."
What is unbelief?
T-Rex: If I lived in the past I'd have different beliefs, because I'd have nobody modern around to teach me anything else!
FACT.
And I find it really unlikely that I would come up with all our modern good stuff on my own, running around saying "You guys! Democracy is pretty okay. Also, women are equal to men, and racism? Kind of a dick move." If I was raised by racist and sexist parents in the middle of a racist and sexist society, I'm pretty certain I'd be racist and sexist! I'm only as enlightened as I am today because I've stood on the shoulders of giants.
Right. So that raises the question: Is everyone from that period in Hell, or is Heaven overwhelmingly populated by racists?
-- T-Rex, Dinosaur Comics
All that's needed is a belief in purgatory.
We'd probably all end up there too, based on the near certainty that we're doing things that people in the future will correctly consider as obviously immoral.
I intend to anticipate as many of those harsh-judgement-of-future-generations things as possible, do the right thing now, and breeze through purgatory so much faster than the rest of those chumps. Bwahaha.
On that note, does anybody want to speculate about what people in the future will correctly regard as immoral that we're doing now? The time to think about this is before we get to the future and/or purgatory.
Some low-hanging fruit, for example, would be the widespread mistreatment of people with gender identity disorder, or squandering money on forms of charity that are actually harmful, e.g. destroying poor countries' textile industries by flooding the market with cheap donated clothes.
Agreed, and it's why I'm vegan.
I can't tell whether this is deadpan humor or not.
I think closed borders will be considered a great evil in the future, but that's probably another way of saying that not enough people are agreeing with me now.
That's an interesting interpretation of "we". Unless you do those things...?
I meant "we" as in "we, as a society", or more specifically "we, the society that I happen to find myself in." Please pardon the ambiguity.
I was being facetious. Please pardon the ambiguity.
Seems to imply, at least to me, a function of "we" that includes "I". Plus, it seems a more interesting question to ask what you're doing that might come to be considered immoral - it's rather unlikely that you're really perfect, isn't it?
It's easy to say "that thing that all those other people are doing, and which I already think is immoral, will come to be considered immoral by our descendants." That's just saying "I'm better than you, neener neener."
Ah, I misread DSimon's use of "we", and the misunderstandings cascaded from there. My mistake. To clarify, I would like to hear things that I may be doing that future generations may be justified in disapproving of. It's an interesting and relevant question. Gloating about my own supposed superiority (neener neener) hadn't even crossed my mind.
Like the vast majority here, I aim to improve myself.
Looking back, my last post came out rather more accusatory than I intended, for which I apologize.
To get back on topic, how about "not cryonically preserving people against their will"?
I think the obvious answer would be that Heaven is overwhelmingly populated by ex-racists. Once they get there, they'd have people around to teach them better stuff.
Who would teach them? The more severe racists from periods even further back?
I think the assumption is that divine beings would be there.
Maybe the dead of other races, provably ensouled and with barriers to communication magically removed.
Sam Harris, "Lying"
For a second I read that as "putrefying."
Greater than signs are only necessary at the beginning of the paragraph, by the way.
Thanks, fixed.
I think this is actually a myth. It's appealing, to us who love truth so much, to think that deviating from the path of the truth is deadly and dangerous and leads inevitably to dark side epistemology. But there is a trick to telling lies, such that they only differ from the truth in minor, difficult to verify ways. If you tell elegant lies, they will cling to the surface of the truth like a parasite, and you will be able to do almost anything with them that you could do with the truth. You just have to remember a few extra bits that you changed, and otherwise behave as a normal honest person would, given those few extra bits.
It is customary to add at the end of such confessions, "or so I'm told", which is technically not a lie but merely an implicature.
Being embarrassed about your knowledge is anathema to rational conversation. You can see it in drug policy debates, where nobody talks about how relatively harmless marijuana is, for fear that people might know that they smoke it. You can see it in censorship debates, where no community member is going to stand up and say "hey, this porno doesn't violate my standards, in fact it's pretty hot". We can stand around pretending to be good people, or we can get at the truth.
I'm more willing to admit to lying here, because I trust you guys more than most people to take that admission only for what it is, and no more.
You sound like you're advocating radical honesty. It seems like there should be a middle ground of making sure relevant information is introduced, but doing it in a way that minimizes derailing self-disclosure (or self-disclosure that could cost you in status).
Also, arguing from personal experience can be a form of defection, shifting the conversation to an arena where one's convincingness is proportional to one's willingness to lie. (I think I have some comments saved that say that better than I can.)
You're not actually disagreeing with Harris. Crafting efficient lies that behave as you describe is hard, particularly on the spot during conversation. Practice helps, and having your interlocutor's trust can compensate for a lot of imperfections, but it's still a lot of work compared to just sharing everything you know.
Worse, you can simply let people catch you, then get angry with them and bully them into accepting your claims not to have lied out of a mix of imperfect certainty and conflict avoidance. By doing this you condition them to accept the radical form of dominance where they have the authority to tell you what you are morally entitled to believe.
--Sappho #7; trans. Barnard (seen on http://www.nada.kth.se/%7Easa/Quotes/immortality )
Combine this with Nietzsche's "God is dead."
--Principia Discordia (surprisingly, not quoted yet)
Maybe because it has little to do with rationality?
André Gide
-- Roissy in DC
"Real magic is the kind of magic that is not real, while magic that is real (magic that can actually be done), is not real magic."
-Lee Siegel
Ran Prieur
Adolf Hitler
What misfortune for all that those in power don't either.
Robert A. Heinlein
Bertrand Russell
Hydrogen atoms have exactly one proton.
What do you mean by "hydrogen atom" and "have" and "exactly" and "proton". ("One" I can deal with for now, but quantum physics makes the rest of your sentence meaningless (i.e. it makes your sentence an inexact high level description.))
By "proton" I mean a thingy that creates a potential well where an electron bops around, and by "hydrogen atom" I mean a single one of these with a single electron in it, and by "have" I mean that when the electron has high enough energy you don't call it a hydrogen atom but "a proton here and an electron over there". This is of course a tautology.
By "one" I mean S(0) (and by "0" I mean the empty set), which is also a tautology. And if you don't know what I mean by "exactly" then you don't understand the parent quote anyway.
Admittedly a good counterexample would involve an exact truth that is not a tautology.
There are exactly zero unicorns.
-- Hans. The Troll Hunter
Kaboom!
W. Stanley Jevons
Dragon Quest Monsters: Joker 2
Wow, that one is actually brilliant!
George E. Forsythe
-- AJ Milne
Not to be confused with A. A. Milne, who wrote Winnie the Pooh.
For some context, this is a response to allegations of scientism, a word with remarkably overt anti-epistemological connotations.
Speaking of which, lukeprog discusses the idea of "reclaiming" the word scientism on his blog.
Really, I see it as describing a family of genuine failure modes that people trying to be "scientific" often fall into. For example:
a) attempting to argue by definition that something is "science" and therefore right.
b) arguing that just because some evidence isn't scientific, it isn't valid evidence.
c) insisting that the results of the latest scientific research are right, despite results in the relevant field having a very poor replication rate.
In case people try to argue that these errors rarely get made, here is a comment by Yvain with 22 karma that makes errors (b) and (c).
Can you point out where Yvain makes those comments that you think violate b and c? Reading that post it looks to me like Yvain's points are a little more nuanced than that.
Note incidentally that while you might be able to use the word that way, the vast majority of people who use it seem to use it in a way closer to what sketerpot is talking about. If one interacts at all with either young earth creationists or homeopaths for example it often doesn't take long before the term is thrown around.
In my experience scientists arguing with creationists (I haven't looked at arguments with homeopaths) frequently make the mistakes I list above, as well as a few related ones. In particular, using the AJ Milne quote ciphergoth cited in an argument against creationism is itself at best a straw man; after all, the creationist also cares about getting the facts right. In fact, that's why he's arguing with the scientist: because he believes the scientist has his facts wrong.
In any case the underlying argument in the AJ Milne quote is: all people care about truth; therefore, you should believe what science has to say about subject X.
This is an example of either (1) or (2) depending on how the implicit premises are made precise.
Actually, the underlying argument is not: 'all people care about truth; therefore, you should believe what science has to say about subject X'.
The underlying argument actually is: attacking someone else's argument on the basis that said argument is apparently unreasonably concerned with something so naive as the actual facts of the matter, and smearing this as 'scientism', is purely misdirection, and utterly without logical basis. It's a culturally-based ploy that works only if one has been convinced that determining the actual facts of the matter is an exclusive and unreasonable obsession that only follows from one being afflicted with this apparent disease 'scientism', and that, apparently, reasonable people not so obsessed really don't worry about such trifles as factuality.
It's a mite peculiar, to me, that you can read a comment that merely says, specifically, that concerns with factual correctness are not the exclusive domain of science (and it was, in fact, a comment on a false dichotomy of exactly this nature--again, the context is at the link), and assume that what it means, apparently, is 'science by definition is right'. This assumption is utter nonsense. I've no idea where you pulled that from, but it sure as hell wasn't from my quote.
Here are some excerpts from Yvain's comments that exhibit the problems I mentioned, (as well as others that maybe I should add).
This is essentially error (b) with elements of (c). From a Bayesian perspective, "saying there are likely flaws in mainstream medical research" does mean one should decrease the weight one assigns to all medical findings, and thus assign more relative weight to other, non-scientific evidence, e.g., evidence based on anecdotes.
This argument violates conservation of expected evidence.
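Conservation of expected evidence says that before observing E, the probability-weighted average of your possible posteriors must equal your prior; you cannot expect to update in a predictable direction. A minimal numerical sketch, with illustrative probabilities of my own (none of these numbers come from Yvain's statin discussion):

```python
# Conservation of expected evidence:
#   P(H) = P(E) * P(H|E) + P(~E) * P(H|~E)
# i.e. the expected posterior equals the prior.

p_h = 0.3           # prior probability of hypothesis H (illustrative)
p_e_given_h = 0.8   # P(E|H), chance of the evidence if H is true
p_e_given_not_h = 0.2

# Total probability of observing E.
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)

# Posteriors via Bayes' theorem, for each possible observation.
p_h_given_e = p_e_given_h * p_h / p_e
p_h_given_not_e = (1 - p_e_given_h) * p_h / (1 - p_e)

# Average the posteriors, weighted by how likely each observation is.
expected_posterior = p_e * p_h_given_e + (1 - p_e) * p_h_given_not_e
print(expected_posterior)  # equals the prior, 0.3
```

Seeing E raises the posterior above 0.3 and seeing ~E lowers it, but the two shifts exactly cancel in expectation; that cancellation is the property Yvain is being accused of violating.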
[Here follow several paragraphs describing how much he discourages people from being afraid to take statins, along with some references to "good doctors" and "correctly prescribed statins" that seem to be there to help set up a potential No True Scotsman.] If my doctor recommends I take a statin, I don't care about the base rates for statins "correctly prescribed" by "good doctors"; I care about the base rate of statins as actually prescribed by actual doctors.
Then Nancy tells her anecdote
Yvain's reply begins:
Funny how he didn't see fit to mention this in his first post, while he spent several paragraphs arguing for why statins are perfectly safe.
I'm not sure but somehow I suspect these numbers assume the statin was prescribed "correctly". Furthermore, they certainly don't take into account the base rate for medical studies being false. Also, he next says:
Somehow I suspect the numbers he gives in the preceding paragraph assumed no drug interactions.
I don't read most of that the way you've read it. For example, Yvain said "Saying that there are likely flaws in mainstream medical research doesn't license you to discount any specific medical finding unless you have particular reason to believe that finding is false." Discount is much stronger language than simply reducing weight in the claim.
No, it doesn't. It only violates that if, in the alternate case where Yvain knew that almost all new studies turn out to be right, he would point to this as a success of the method. I suspect that in that counterfactual, he likely would. But that's still not a (b) or (c) type violation.
Most of the reply to Nancy while potentially problematic doesn't fall into b and c. But I don't think you are being fair when you say:
The standard of "safe" is very different from listing every well-known side effect, especially if they only happen in a fraction of the population. I don't see a contradiction here, and if there is one, it doesn't seem to fall under (b) or (c) in any obvious way.
It's not clear what Yvain intended to mean by "discount"; however, the rest of his argument assumes he can disregard the base rate unless you have specific evidence.
Brandon Watson
--Oliver Heaviside
--Thomas Carlyle
I love this quote, but it really isn't true. People frequently forego the first one.
I interpret "good reason" as "'good' reason".
I don't think the first one ever gets generated unless someone else asks them why they did that something.
It must be nice to be clever enough to generate good reasons in real time, rather than having to spend all your spare cycles preemptively coming up with justifications for your actions.
I just ran into a surprisingly candid example of Richard Feynman talking about when he did that. He worked on the atomic bomb to make sure that Nazis didn't get it first, but then he kept working on it even after the Nazis had been defeated.
I love it too and I like to have an evil reason as well. That keeps things in perspective. And a right reason - which balances the 'good' with the 'evil' according to my ethical sentiment. But that's just a (morally ambiguous) ideal. The real reason, that which Carlyle mentions, is something else again.
I have a different angle - I like to have a stupid reason, to amuse my friends with.
--Thomas Carlyle
--Thomas Carlyle
Indefinitely, anyway. I am reminded of another Carlyle quote that Moldbug quoted with approval (but then doesn't he always):
Sharon Fenick
Theodore Roszak
-- Jesse Custer, Preacher (Garth Ennis)
Anon
--Dan Dennet: Breaking the Spell
E.T. Jaynes's "Bayesian Methods: General Background"
-- Herodotus
The problem with that quote is that human biases often go the other way, i.e., we'd rather blame bad consequences on bad luck than admit we made a bad decision.
The quote may still have some use when applied to humans other than oneself.
I tried to track this down, and this seems to be Jaynes's paraphrase of Herodotus; pg 2 of "Bayesian Methods: General Background". (I looked through one translation, http://classics.mit.edu/Herodotus/history.mb.txt , and was unable to locate it.)
I got it out of "Data Analysis A Bayesian Tutorial" pg 4 where it is attributed to Herodotus
After some more searching and a pointer on Straight Dope, I think I've found it in Book 7 of the Histories when Artabanus is trying to dissuade Xerxes from launching his ill-fated war against the Greeks, where it is, as one would expect from Jaynes's paraphrase, different:
Or in another translation:
-- G.K. Chesterton
I so adore cliches. They create an expectation to subvert.
Do that too much and you'll end up with a "high brow" piece that's incomprehensible to anyone not familiar with the cliches you're subverting.
The short story in question is "The Dagger with Wings", originally published in The Incredulity of Father Brown.
That said, I don't quite understand why this constitutes a Rationality Quote.
To me, the lesson is that when someone appeals to your intuitions - you can just say no.
"Don't you feel there must be a supreme being, that everything has a purpose and a place in the grand order of things?"
"No."
(Fun story, incidentally.)