Feed confirmatory evidence to others; give them tests to run that you know beforehand will come out confirmatory.
This is not really a way to take advantage of confirmation bias. Confirmation bias means that people look for confirming evidence for their preferred theories and ignore disconfirming evidence. That process is not much affected by your adding extra confirmatory evidence; they can find plenty on their own. Instead, it is a way to fool rational people: Bayesians who update on evidence will update wrongly if fed a biased stream of it. So it doesn't really fit here.
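The point about fooling Bayesians can be made concrete with a small sketch (a hypothetical scenario, not from the original post): an agent updates correctly by Bayes' rule on reported coin flips, but an adversary reports only the heads. The arithmetic is sound; the evidence stream is filtered, so the posterior ends up confidently wrong.

```python
# Hypothetical illustration: correct Bayesian updating on cherry-picked evidence.
# The coin is actually fair, but the adversary reports only the heads flips.

def update_on_heads(prior, p_heads_if_biased=0.8, p_heads_if_fair=0.5):
    """One Bayes update after a single reported 'heads'."""
    numerator = p_heads_if_biased * prior
    return numerator / (numerator + p_heads_if_fair * (1 - prior))

posterior = 0.5  # honest prior: 50/50 that the coin is heads-biased
for _ in range(10):  # ten cherry-picked 'heads' reports from a fair coin
    posterior = update_on_heads(posterior)

print(round(posterior, 3))  # ~0.991: near-certainty in a false hypothesis
```

Each reported heads multiplies the odds by 0.8/0.5 = 1.6, so ten filtered reports push a perfectly rational updater to about 99% confidence that a fair coin is biased.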
The way to actually use confirmation bias to convince people of things is to present the beliefs you want to transmit as evidence for things they already believe. Confirmation bias will then lead them to accept this new "evidence" without question, because they want it to be true: it confirms what they already think.
Another way to take advantage of confirmation bias is exemplified by horoscopes: offering people predictions that are sufficiently vague that no matter what happens, people can find a way to interpret the prediction as having come true.
Similarly, someone who wants to be widely respected could write semi-nuanced opinion pieces that can plausibly be read as favoring either side of a debate. In the "best" case, supporters of both sides will read the text and like you for being on their side.
You may have heard about IARPA's Sirius Program, a proposal to develop serious games that would teach intelligence analysts to recognize and correct their cognitive biases. The intelligence community has a long history of interest in debiasing, and even produced a rationality handbook based on internal CIA publications from the 1970s and '80s. Creating games that systematically improve our thinking skills has enormous potential, and I would highly encourage the LW community to consider this as a way to promote rationality more broadly.
While developing those particular games will require thought and programming, the proposal inspired the NYC LW community to play a game of our own. Working from a list of cognitive biases, we broke up into groups no larger than four and spent five minutes discussing each bias with regard to three questions.
The Sirius Program specifically targets Confirmation Bias, Fundamental Attribution Error, Bias Blind Spot, Anchoring Bias, Representativeness Bias, and Projection Bias. To this list, I decided to add the Planning Fallacy, the Availability Heuristic, Hindsight Bias, the Halo Effect, Confabulation, and the Overconfidence Effect. We ran the exercise Pomodoro style: six five-minute rounds, a quick break, another six rounds, then a longer break and a group discussion of the exercise.
Results of this exercise are posted below the fold. I encourage you to try the exercise for yourself before looking at our answers.
Caution: Dark Arts! Explicit discussion of how to exploit bugs in human reasoning may lead to discomfort. You have been warned.
Confirmation Bias
Fundamental Attribution Error
Bias Blind Spot
Anchoring Bias
Representativeness Bias
Projection Bias
Planning Fallacy
Availability Heuristic
Hindsight Bias
Halo Effect
Confabulation
Overconfidence Effect
Summary
How long do you think it should take to solve a major problem if you are not wasting any time? Everything written above was created in a sum total of one hour of work. How many of these ideas had never even occurred to us before we sat down and thought for five minutes? Take five minutes right now and write down which areas of your life you could optimize to make the biggest difference. You know what to do from there. This is the power of rationality.