Change blindness is the phenomenon whereby people fail to notice changes in a visual scene when they're not directed to pay attention to them. There are countless videos online demonstrating this effect (one of my favorites here, by Richard Wiseman).
One of the most audacious and famous experiments is known informally as "the door study": an experimenter asks a passerby for directions but is interrupted by a pair of construction workers carrying an unhinged door, which conceals another person who replaces the experimenter as the door passes between them. Incredibly, the person giving directions rarely notices they are now talking to a completely different person. This effect was reproduced by Derren Brown on British TV (here's an amateur re-enactment).
Subsequently, a pair of Swedish researchers familiar with sleight-of-hand magic conceived a new twist on this line of research, arguably even more audacious: have participants make a choice, then quietly swap that choice for something else. People not only fail to notice the change, but confabulate reasons why they had preferred the counterfeit choice (video here). They called their new paradigm "choice blindness".
Just recently the same Swedish researchers published a new study that is even more shocking. Rather than demonstrating choice blindness by having participants choose between two photographs, they demonstrated the same effect with moral propositions. Participants completed a survey asking them to agree or disagree with statements such as "large scale governmental surveillance of e-mail and Internet traffic ought to be forbidden as a means to combat international crime and terrorism". When they reviewed their copy of the survey, their responses had been covertly changed, yet 69% failed to notice at least one of two changes, and when asked to explain their answers, 53% argued in favor of what they falsely believed was their original choice, even though they had previously indicated the opposite moral position (study here, video here).
Dark Tactic:
This one makes me sick to my stomach.
Imagine some horrible person wants to start a cult. So they get a bunch of people together and survey them with statements like:
"I don't think that cults are a good thing." "I'm not completely sure that (horrible person) would be a good cult leader."
and covertly switch the responses to:
"I think that cults are a good thing." "I'm completely sure that (horrible person) would be a good cult leader."
Then the horrible person shows the whole room the results of the altered surveys, demonstrating that there's a consensus that cults are a good thing and that most people are completely sure that (horrible person) would be a good cult leader.
Next, the horrible person asks individuals to explain their apparent conclusions: why cults are a good thing, and why (horrible person) would make a good leader.
Then the horrible person starts asking for donations and commitments, etc.
Who do we tell about these things? There are organizations for reporting security vulnerabilities in computer systems so the professionals hear about them... where do you report security vulnerabilities in the human mind?
If you start a cult, you don't tell people that you're starting a cult. You tell them: look, there's this nice meetup. All the people at that meetup are cool. The people in that group think differently than the rest of the world. They are better. Then there are those retreats where people spend a lot of time together and become even better and more different from the average person on the street.
Most people in the LessWrong community don't see it as a cult, and the same is true of the members of most organisations that outsiders do see as cults.