Suppose you're a Protestant, and you want to convince other people to do what the Bible says to do. Would you persuade them by showing them that the Bible says they should?

Now suppose you're a rationalist, and you want to convince other people to be rational.  Would you persuade them with a rational argument?

If not, how?

ADDED: I'm not talking about persuading others who already accept reason as the final arbiter to adopt Bayesian principles, or anything like that. I mean persuading Joe on the street, who does whatever feels good, and feels pretty good about that. Or a doctor of philosophy who believes that truth is relative and reason is a social construct. Or a Christian who believes that the Bible is God's Word, and that things that contradict the Bible must be false.

Christians don't place a whole segment of the population off-limits and say, "These people are unreachable; their paradigms are too different." They go after everyone. There is no class of people with whom they are unsuccessful.

Saying that we have to play by a set of self-imposed rules in the competition for the minds of humanity, while our competitors don't, means we will lose.  And isn't rationality about winning?

ADDED: People are missing the point that the situation is symmetrical for religious evangelists. For them to step outside of their worldview and use reason to gain converts is as epistemically dangerous for them as it is for us to gain converts using something other than reason. Contemporary Christians consider themselves on good terms with reason; but if you look back in history, you'll find that many of the famous and influential Christian theologians (starting with Paul) made explicit warnings against the temptation of reason. The proceedings from Galileo's trial contain some choice bits on the relation between reason and faith.

Using all sorts of persuasive techniques that are not grounded in religious truth, and hence are epistemically repulsive to them and corrosive to their belief system, has proven a winning strategy for all religions.  It's a compromise; but these compromises did not weaken those religions.  They made them stronger.

Most people (even those who aren't rationalists) consider rationality to be usually a good thing. Most people who aren't Christians don't consider following the Bible to be particularly useful or meritorious. So I find your analogy unconvincing. But:

You should convince them by appealing to whatever they understand, in a way you can do with integrity. If there is no such way, then you're probably both wasting your time. I'd have thought that saying more than this would require consideration of the special characteristics of particular situations.

To expand on this: one major arrow in rationality's quiver is that practically everyone (a few genuine postmodernists excepted) values some basic concept of rationality. If this weren't so, then political actors wouldn't get such mileage out of showing inconsistencies, biases and (purported) fallacies in their opponents.

Furthermore, the vast majority of people believe themselves to be epistemically rational, now excepting some fideists of various types as well (but even these usually have arguments for doing so that appeal to some sense of second-order rationality).

I'm not sure sweet reason will work. I do remember a sophisticated theist friend coming back from visiting our local Church of England, which is low enough church to be about one notch above the local Pentecostal churches, and includes recruits from the higher end of there. People who quite literally believe in a simplistic karmic model of the world, where good things happen to good people, so if bad things or good things happen to you it's because you're bad or good. He was horrified. I then ran EY's concentric series of retcons theory of religion past him and he had to concur to some degree. I've since given him The God Delusion to read, after I caught him bitching about it without having read it. I am now biding my time. Muwahahaha.

You should convince them by appealing to whatever they understand, in a way you can do with integrity. If there is no such way, then you're probably both wasting your time.

Yet Christians manage the same trick, on a large scale.

There is a Mahayana Buddhist doctrine - it might have to do with the "doctrine of the lesser vehicle", but I forget - that says (paraphrased), "No one can be persuaded of the truth of Buddhism unless they already understand the truth of Buddhism. Therefore for their own good you may deceive them, and tell them that the study of this doctrine will give them the lesser things that they in their ignorance desire, to persuade them to follow it unto understanding."

I think a much-too-large fraction of how Christians manage it is by means that most people here would deplore: not merely because they appeal to something other than reason, but because they're actually anti-rational.

If you wish to proceed in that way, go ahead. My guess is that (1) rationalists in general will not do well using techniques that go so badly against the grain, (2) rationalists who do what it takes to use such techniques will tend to corrupt their own rationalism in so doing (because, e.g., the most effective way to fool others is to fool yourself first), and (3) the loss -- e.g., from people noticing that they've been tricked and deciding not to trust anything you've ever told them -- might well turn out to be greater than the gain anyway.

I remain unconvinced of the need, anyway: most people agree, at least in theory and in general, that rationality is good and useful. Convincing someone to be rationalist might be harder; so focus instead on showing them how to be rational more effectively in particular cases where they are agreed that being rational is good. The principles generalize, after all.

They cheat. Persuasion per se is not involved.

Is it cheating to suggest to a theist that the tools of rational thought can help them more fully understand God?

It would be a truth in the denotation and a dirty trick in the connotation, but it isn't what I meant by "cheating".

How do they cheat? Can/should we cheat in a similar fashion?

They exploit brain hacks. Teaching kids. Guilt or shame and the promise of absolution. Peer pressure. Tribal cohesiveness. Force, fear, and pain-reinforcement. Mere exposure effect. Etc etc.

Basically you're asking "is the dark side stronger", and I refer you to Yoda.

Appealing to fictional evidence is dangerous too. Invoking the dark side of Star Wars will elicit cached thoughts. The Force is a fictional contraption devised by people for a story, and it doesn't work the same way rationality does.

That said, is it still OK to rob a bank to give to charity? We must be damn sure of our truth, and of the nobleness of our purposes, to lie others into the same understanding as ours.

I wonder if there's a bit of Aumann agreement in there. We might disagree with other people, but just hacking their brains cancels any useful updates we might have gotten from their unique knowledge.

That is much too complicated to be solved in one sentence. Ultimately, though, we'll make a bet on the assurance that we must be right. If we indeed are, it makes sense to convert other people to our worldview, provided their objectives are similar to ours, since that will help them.

Historically, though, people who believed they were right have often turned out to be not even close. Would we be repeating that mistake if we said that what we advocate is the truth?

What do we advocate, anyway? It seems our vision of truth is much more flexible than any other seen so far. We don't even have a fixed vision; anything we believe at this point is liable to be rewritten.

It seems to me that to be a good rationalist, you should ideally not need someone else to show you unique knowledge that might change your mind. You should be able to do it yourself. But that idea can be potentially abused too.

Yoda is an unabashedly religious moral realist. In his world, you can measure someone's goodness by the color of their lightsaber.

It is irrational to label a set of tools "dark arts" and place them off limits to us. EY has a justification for not using the "dark arts", but it's (my interpretation) supposed to be a lot more sophisticated than just calling them evil - and hence has many more possible exceptions or failure points.

I'm sure a rationalist society would teach its kids. That hack is hardly avoidable - people have to start from somewhere.

The other stuff has an obvious downside: it makes the victim dumber. Zombies are useful to theists but not to us. Also, it tangles the dark-sider in nonsense that they must subsequently defend. It makes them a practicing anti-rationalist in order to shore up their gains. In the end and with a sufficiently smart victim, it's simply fated to collapse, leaving bad odor all around.

Also, it tangles the dark-sider in nonsense that they must subsequently defend.

Actually, I think that might be the best part: somebody starts to notice that it's nonsense, you take them aside and say, "Congratulations! Most of what I taught you was lies, and, of course, you can't trust me to say which is which. You'll just have to look at the evidence and figure it out for yourself."

I think the same argument could be made against using anything other than Biblical principles to win converts to Christianity. A Christian church that believed those arguments would lose.

And aren't rationalists supposed to win?

Rationality is supposed to score a win (whenever one is possible). Rationalists only try to use rationality, to the best of their capability, to win. They may or may not succeed.

Looks to me like Christianity has the more winning strategy (where winning = gaining converts).

It may well be that Christianity is winning (in that sense). That doesn't mean that it has a winning strategy: it might (and clearly does) have other advantages which rationalism doesn't have and either couldn't or shouldn't get.

I'm going to take the downvote I got for that as indicating that I wasn't clear enough and explain a bit further.

Suppose A beats B at some game. (Here A is Christianity, B is rationalism, the game is having as many people as possible onside.) It could be that this is because A is playing the game better than B. But A could also be winning for reasons that have nothing to do with how A and B are playing.

Example 1: two people are trying to outdo one another in getting many sexual partners. (I make no comment on the wisdom or morals of playing this game.) A might be winning by being physically more attractive, or by having a pile of inherited money and therefore more scope for generous gestures.

(... Perhaps Christianity just is more appealing to most people than rationalism; see, e.g., Pascal Boyer's theories about what sorts of belief tend to lodge in people's minds and form religious doctrines. Perhaps Christianity benefits from having been officially adopted by the Roman Empire and plenty of other empires since then, and spread by the sword or by economic intimidation.)

Example 2: two people are playing the game of making as much money as possible. A might be winning by virtue of getting lucky early on and therefore having more resources for the rest of the game.

(... Perhaps Christianity has many adherents now simply because it had many in the past, and people tend to pass on their religion to their children and to others around them.)

Example 3: two people are playing a game of tennis. A might be winning because she's friends with the referee, who calls balls in or out dishonestly to favour A.

(... Perhaps Christianity has many adherents because powerful people and institutions are Christian and others are intimidated or impressed by their status. Roman Empire, again -- or the US today.)

It's not hard to come up with further examples, but I'll leave it there. Rationalism doesn't have the option of being something different and more appealing, or changing history so as to have the advantage of lots of existing members; perhaps rationalists could somehow contrive to gain enough power to intimidate, or enough influence in schools etc. to brainwash, but it might not be possible to do that without becoming corrupted and ceasing to be rationalist.

These are all ways in which Christianity could "win" whether or not it employs a "winning strategy".

I didn't say that it was winning. I said it looked to me like it had a more winning strategy. Their strategy is to win converts by any means, as opposed to the rationalist strategy several people here are endorsing, which says that we can't use irrational persuasive methods. Comparing those two strategies, I predict the first will win.

Yes, where winning equates to gaining converts. But gaining converts, for us, ought to be only instrumental to a greater purpose. Many strategies may win in the short or mid term, being more explosive or efficient, but still lead to a dead end.

So what religion uses to gain converts, may not work for us, as it destroys our long term purposes. Though I find it difficult to disentangle what in those methods we could use, and what we couldn't.

I would call 2000 years long term. (In the set of strategy histories observed so far.)

Part of my point is that the methods they use to gain converts are also against their long term purpose. The fact that thoroughly-evolved religions do this indicates it is adaptive, despite the short-term hit to their worldview.

What use is a dumbed down, brain-hacked convert? Are you using them to keep score, or something?

What same argument? I don't follow.

Rationality is to a Christian somewhat as the Dark Arts are to us. Christians have often made conversions based on reason, even though giving reason legitimacy makes their converts "dumber" and less able to resist the temptation of reason.

They haven't said "these practices are off-limits to us". They strive for an optimal tradeoff between winning converts and corrupting their religion. We can consider their policies to have been selected by evolution. So we should be suspicious of claims that we, using reason, can find tradeoffs better than 2000 years of cultural evolution can. Particularly when our tradeoff ax + by involves suspicious numbers like a=0 and b=1.

Actually quite a few Christians are very rational people. It is possible to use only some of the tools of rationality to dig your own grave even deeper than you could if you knew nothing of it.

Becoming a more sophisticated debater, for instance.

Those people don't consider "rationality" as something negative, far from it. They have their own idea of what rationality is, of course, but that idea overlaps ours enough that those two concepts can be considered to be similar.

I'm oversimplifying; but if you go back into church history, especially pre-Enlightenment, you'll find that most of the major church fathers made statements explicitly condemning rationality.

I would muster all their biases in my favor. I would try every trick in the book to influence the unconscious part of their mind because that's where it all happens.

For the conscious part, I'd tell them useful stuff and try to demonstrate it on the spot.

For new age people, I'd try to get them to try to attract rationality.

For new age people, I'd try to get them to try to attract rationality.

I must admit, I laughed out loud there.

Dark Side Epistemology exercise: go to Google, find a really painfully written web ad for a Pick-Up-Artist manual or something comparably cheesy and dark-side-susceptible. Rewrite the ad to be an ad for rationality techniques.

(Don't feel you have to show it to anyone, this is just an exercise.)

Some examples of what I think you're looking for:

  1. Vassar's proposed shift from saying "this is the best thing you can do" to "this is a cool thing you can do" because people's psychologies respond better to this
  2. Operant conditioning in general
  3. Generally, create a model of the other person, then use standard rationality to explore how to most efficiently change them. Obviously, the Less Wrong and Overcoming Bias knowledge base is very relevant for this.

create a model of the other person, then use standard rationality to explore how to most efficiently change them

Standard rationality tells me it's most efficient to lie to them from a young age.

(In this case, to tell the truth to them, we hope.)

Thanks! Those are good examples. Although the fact that they wouldn't make me feel dirty makes me suspect we should go farther.

I think WW Bartley was striving to achieve just this goal in "The Retreat to Commitment". His approach (after much discussion of the Protestants' approach to philosophy as background) is to ask what the goal of thinking about thinking should be. He concludes that anyone who thinks the question is interesting must be looking for techniques to help find out what is true about the world we live in. If you and your interlocutor can agree on that, then you're well on your way to being able to establish correspondence with reality as the metric for better and worse choices of how to think.

Bartley's success in the book is in arguing well that you don't have to take any particular theory, approach, or metric as primary in the struggle to decide what works best. At any particular moment you have to have a place to stand, but if there are reasons to doubt the foundations you are using, you can stand somewhere else for a while and inspect those foundations.

At the end you want to come back to contemplation of the question as to which approaches seem to lead to the best understanding of the truth of the world.

This comment is in reply to some ideas in the comments below.

In my opinion, my rationality is as faith-based as is a religious person's religious belief.

Among my highest values is "being right" in the sense of being able to instrumentally effect or predict the world. I want to be able to communicate across long distances, to turn combustible fuel into safe transportation, to correctly predict what an interstellar probe will find and to be able to build an interstellar probe that will work. Looking at the world, I see much more success in endeavors like these from science and rationality than from religiosity or appeals to god. And so I adopt rationality as it supports my values.

I also want to raise healthy, happy, "good" children. I am pretty sure I could "help" my one child who dabbles in alcohol, drugs, and petty theft by going to church with him. I've known many people who are effective at doing things I see as good because, it seems, of their religious beliefs and participation in churches and religious communities. I liked being a Lutheran for a few years. One night I told our pastor that I just didn't believe in god. He told me he thought half the church had that happening. Even so I couldn't stay engaged.

I feel the loss of religious faith as a sorrow, or a pain, or a burr under my saddle, or something. But I can't justify it, or more importantly, I can only pretend to believe, actual belief does not seem to me to be a real option anymore.

And it turns out I have enough "faith" in scientific rationalism that I won't even pretend I believe in god. I choose to believe that staying consistent with rational principles will pay off more for me and those I care about than falling back to the more accessible morality of religious faith would. It is a leap of faith, especially in light of "rationalists win." If my son were to become a heroin addict and devote his life to petty theft, jail, and shooting up, AND I could have prevented that by bringing him to church, I will have paid a price for my faith, as much as any Christian martyr who was harmed, or whose family was harmed, because he did not deny his Christian belief.

People who think their rationality does not come from a faith they possess remind me of religious people who think their belief in god is just right, that it does not come from a faith that they possess or have chosen.

Taboo "faith", what do you mean specifically by that term?

Taboo "faith", what do you mean specifically by that term?

Good idea. I mean that EVERYBODY, rationalist atheist and Christian alike, starts with an axiom or assumption.

In the case of rationalist atheists (or at least some, such as myself) the axioms started with are things like 1) truth is inferred with semi-quantifiable confidence from evidence supporting hypotheses, 2) explanations like "god did it" or "alpha did it" or "a benevolent force of the universe did it" are disallowed. I think some people are willing to go circular, allow the axioms to remain implicit and then "prove" them along the way: I see no evidence for a conscious personality with supernatural powers. But I do claim that is circular: you can't prove anything without knowing how you prove things, and so you can't prove how you prove things by applying how you prove things without being circular.
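
As a concrete illustration of axiom 1), here is a minimal Bayes-rule sketch of what "semi-quantifiable confidence from evidence" can look like; all the numbers are hypothetical:

    # Minimal sketch: updating confidence in a hypothesis H after evidence E.
    # All numbers are hypothetical, chosen only to show the mechanics.
    prior_H = 0.01          # prior credence in H
    p_E_given_H = 0.9       # how likely the evidence is if H is true
    p_E_given_not_H = 0.05  # how likely the evidence is if H is false

    # Law of total probability, then Bayes' rule.
    p_E = p_E_given_H * prior_H + p_E_given_not_H * (1 - prior_H)
    posterior_H = p_E_given_H * prior_H / p_E
    print(round(posterior_H, 3))  # 0.154: the evidence raised confidence, quantifiably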

So for me, I support my rationalist atheist point of view by appealing to the great success it has in advancing engineering and science. By pointing to the richness of the connections to data, the "obvious" consistency of geology with a 4 billion year old earth, the "obvious" consistency of evolution from common ancestors of similar structures across species right down to the ADP-ATP cycle and DNA.

But a theist is doing the same thing. They START with the assumption that there is a powerful conscious being running both the physical and the human worlds. They marvel at the brilliance of the design of life to support their claim even though it can't prove their axioms. They marvel at the richness of the human moral and emotional world as more support for the richness and beauty of conscious and good creation.

Logically, there is no logic without assumptions. Deduction needs something to deduce from. I like Occam's razor and naturalism because my long exposure to them leaves me feeling very satisfied with their ability to describe many things I think are important. Other people like theism because their long exposure to it leaves them feeling very satisfied with its ability to describe and even prescribe the things they think are important.

I am not aware of a definitive way to challenge axioms, and I don't think there is one at the level I think of it.

It took me a long time to respond to this because I found the question resistant to analysis. My immediate impulse is to shout, "But, dammit, my rational argument is really truly actually valid and your bible quotation isn't!" This is obviously irrelevant since, by hypothesis, my goal is to be convincing rather than correct.

After thinking about it, I've decided that the reason the question was hard to analyze is because that hypothesis is so false for me. You haven't placed any constraints at all; in particular, you haven't said that my goal is

  • to convince others to be more rational via a correct argument, or
  • to convince others to be more rational, provided that this is true, or
  • to convince others to be more rational, as long as I don't have to harm anyone to do it, or
  • to convince others to be more rational, as long as I can maintain my integrity in the process.

If I take "convince others to be more rational" as a supergoal, then of course I should lie and brain-hack and do whatever is in my power to turn others into rationalists. But in reality, many of my highest values have less to do with the brain-states of others, than with what comes out of my own mouth. Turning others into rationalists at the price of becoming a charlatan myself would not be such a great tradeoff.

I regularly "lose" debates because I'm not willing to use rhetoric I personally find unconvincing. (Though I'm probably flattering myself to suppose that I would "win" otherwise.) To give a specific example, I am deeply opposed to drug prohibition, while openly predicting that more people will be addicted to drugs if they are legally available. This is a very difficult position to quickly relay to someone who doesn't already agree with me, but any simplification would be basically dishonest. I could invent instrumental reasons why I shouldn't use a basically dishonest argument in this case, but the truth is that I just hate lying to people, even more than I hate letting them walk around with false, deadly ideas.

I imagine Eliezer and Robin run into this themselves, when they say that a certain unusual medical procedure only has a small probability of success, but should be undergone anyway because the payoff is so high. Many people will hear "low probability of success" and stop listening, and many of those people will therefore die unnecessarily. Does this mean Eliezer and Robin should start saying that there is a high probability of success after all, in order to better save lives?
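
For concreteness, the underlying expected-value arithmetic, with made-up numbers:

    # Toy expected-value sketch; every number here is invented for illustration.
    p_success = 0.05          # hypothetical low probability the procedure works
    value_if_success = 1000   # hypothetical payoff, in arbitrary utility units
    cost = 10                 # hypothetical cost of undergoing the procedure

    expected_value = p_success * value_if_success - cost
    print(expected_value)     # 40.0: positive, so the gamble is worth it on expectation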

Now maybe your point here is that yes, we all should be lying for the sake of our goals: that we should throw out our rules of engagement and wage a total war on epistemic wrongness. I have considered this myself, and honestly I don't have a good rebuttal. I can only say that I'm not ready to be that kind of warrior.

I think coming to agreement on terms through a dialectic is something most everyone can agree to engage in, and I don't think it's offensive to or beyond the scope of rationality. Socrates' way is the sort of meta-winning way, the way that, if fully pursued, will arrive at the conclusion of rationality.

For instance, in any one of those cases, I could start with a dialectic about problem-solving in everyday life, or at least in general cases, and proceed to the principle that rationality is the best way. I'd try to come to agreement about the methods we use to diagnose a car problem, calculate how much we owe in taxes, or decide to enter an intersection, and extrapolate to epistemology from there. The philosopher, the Christian, and the hedonist all use reason, not will-to-power, faith, or desire, to fix and drive their cars and pay their taxes, and this gives the evangelist of reason a method of proving the epistemological assertion that there is such a thing as truth, which we encounter in passing, and that rationality is the optimal way to approach it.

People are irrational largely because they're stupid. I have yet to be convinced that "rationality" is something entirely distinct from intelligence itself, such that you can appeal to someone to become significantly more "rational" without simultaneously effecting the seemingly tougher feat of boosting IQ a standard deviation or so.

For some evidence to the contrary (and the beginnings of a theory about when cognitive ability will correlate with rationality and when it won't) try this:

  • Stanovich, K. E., & West, R. F. (2008). On the relative independence of thinking biases and cognitive ability. Journal of Personality and Social Psychology, 94, 672-695.

The approach Jaynes takes in the opening chapters of Probability Theory: The Logic of Science, based on Cox's theorem, was very persuasive to me and to the few others I've mentioned it to. The basic idea is to start with a few criteria that just seem like common sense, which everyone should agree are desirable in a reasoning system. Then Jaynes shows that probability theory is the only possible system that fulfills these criteria.
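
For reference, a compressed sketch of the desiderata and the rules they force, paraphrased from memory (so treat the details with caution):

    % Jaynes's desiderata, roughly: (I) degrees of plausibility are real
    % numbers; (II) qualitative correspondence with common sense; (III)
    % consistency: equivalent states of knowledge receive equal plausibilities.
    % From these he derives the product rule and the sum rule:
    \[
      P(A \wedge B \mid C) = P(A \mid B \wedge C)\, P(B \mid C),
      \qquad
      P(A \mid C) + P(\neg A \mid C) = 1.
    \]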

Is there a similar approach that can be used to argue for rationality in general? I like to appeal to universality: different subjects should be governed by the same rules, and science and rationality have been enormously successful, so why shouldn't they be applied universally? Unfortunately, this approach can easily be abused. Can we formulate a good approach of this sort that isn't just leading people to say what we want?

I hate everything Jaynes has written about Cox's theorem. He glosses over assumptions and pretends the assumptions that he admits to are weaker than they are.

Go back and read it again. Cox's theorem isn't anywhere near as strong as Jaynes makes it out to be.

Will do. I just obtained a hard copy of P:tLoS to study it a little more seriously.

Are there any specific issues to watch out for? Is there a better source for understanding Cox's theorem you could point me to?

http://www.stats.org.uk/cox-theorems/Halpern1999a.pdf

Halpern gives a correct proof of one of the rigorous variations on Cox's theorem, and gives a counterexample to Cox's theorem for a set of propositions that's too small to satisfy the density requirement.

There're a couple things he seemed to gloss over, but those seemed more like "boilerplate that was 'more of the same' for certain bits, IIRC" rather than "significant things that we're missing significant bits of"... but then, I guess "glossing over" is a problem because it makes things seem like that. :)

Anyways, I happen to be a fan of vulnerability/coherence/Dutch-book-style arguments. I mean, for cleanliness/simplicity, those just win hands down (a touch of, at most, linear algebra vs. the functional analysis of Cox's theorem :)). And in some forms they build up decision theory right at the same time!
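
For anyone who hasn't seen one, here is a minimal Dutch-book sketch; the credences are made up, and the point generalizes to any agent whose credences in A and not-A sum to something other than 1:

    # Minimal Dutch book: an agent prices "pay 1 if X" tickets at its credence in X.
    credence_A = 0.7      # hypothetical credence in A
    credence_not_A = 0.5  # hypothetical credence in not-A (incoherent: 0.7 + 0.5 > 1)

    total_paid = credence_A + credence_not_A  # agent buys both tickets at "fair" prices

    for A_is_true in (True, False):
        # Exactly one of the two tickets pays out, whichever way A turns out.
        payout = (1 if A_is_true else 0) + (0 if A_is_true else 1)
        print(f"A={A_is_true}: paid {total_paid:.2f}, received {payout}, "
              f"net {payout - total_paid:+.2f}")
    # Both branches show net -0.20: a guaranteed loss, so the credences are exploitable.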

Although, now I'm wondering... just how much weaker is Cox's theorem than Jaynes makes it sound?

The proof in Jaynes shows that if you want to assign plausibilities to propositions, and those plausibilities are going to be real numbers, and P(a^b|c) is a function of P(a|b^c) and P(b|c), and P(not a) is a function of P(a), and all those functions are twice differentiable, and P satisfies a certain density requirement, then P has to be isomorphic to probability.
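
For concreteness, the associativity step that does much of the work, in my paraphrase (not Jaynes's exact development):

    % Write the conjunction assumption as
    \[
      P(a \wedge b \mid c) = F\bigl(P(a \mid b \wedge c),\, P(b \mid c)\bigr).
    \]
    % Associativity of conjunction, (a \wedge b) \wedge d = a \wedge (b \wedge d),
    % then forces the functional equation
    \[
      F\bigl(F(x, y),\, z\bigr) = F\bigl(x,\, F(y, z)\bigr),
    \]
    % whose suitably regular solutions admit a monotone f with
    % f(F(x, y)) = f(x) f(y), i.e. F is isomorphic to multiplication,
    % which is where the product rule comes from.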

It just doesn't have the same philosophical punch as "a few criteria that just seem like common sense that everyone should agree are desirable in a reasoning system" when you actually spell out the assumptions and they contain seemingly unjustified technical things like differentiability and density.

There are a bunch of rigorous variations on Cox's theorem, but as far as I know there is nothing that lives up to the hype.

Well, some of those criteria at least seem perfectly reasonable.

As for which thing was a function of what, he kinda went through that, discussing some examples and basically outlining an argument for what sort of things could depend on what vs. what would lead to absurdities, so the "this is a function of this and that" wasn't, IIRC, pulled out of thin air.

I'm not talking about convincing people who believe in reason to use probability theory. I meant people who don't accept reason as the final arbiter in arguments. Which may still be most people.

I may have been unclear. I only meant Jaynes's approach as an analogy. I was speculating whether an approach based on common-sense desiderata would work as well for rationality in general as it does for probability.

I acknowledge the symmetry and appreciate the conceptual novelty of your argument to my idea set. It's a very powerful interpretation of some behaviour I have observed.

Has it been experimentally tested?

ADDED: People are missing the point that the situation is symmetrical for religious evangelists. For them to step outside of their worldview and use reason to gain converts is as epistemically dangerous for them as it is for us to gain converts using something other than reason.

Disagree. Most people know very little of logic and reason, and will not bother to do their homework upon being approached with Christian arguments based on reason.

A moral argument is often a good start.

Do you mean a utilitarian argument? Rational behavior helps us not kill each other and build more cool stuff, something like that?

A purely moral argument seems difficult; it would go something like, "Rationalism is good. Period."

Instrumental rationality as a tool for moral ends. Epistemological rationality as a tool for instrumental ends. Oh look, now we have to revise our worldview.

In fact I've never seen the relationship between epistemic rationality, instrumental rationality, and real-world objectives so clearly stated. Would it be OK to collect such material in the wiki? Like all those short, concise, illuminating quotes we stumble upon in here?

No objection here.

Sneaky. I like it.

That is, after all, how Christianity got sucker-punched the first time around.