An existing social norm basically says that everyone has the right to an opinion on anything, no matter how little they happen to know about the subject.
But what if we had a social norm saying that by default, people do not have the right to an opinion on anything? To earn such a right, they would have to have familiarized themselves with the topic. The familiarization wouldn't necessarily have to be anything very deep, but on e.g. controversial political issues, they'd have to have read at least a few books' worth of material discussing the question (preferably material from both sides of the political fence). On scientific questions requiring more advanced knowledge, they ought to at least have studied the field somewhat. Extensive personal experience with a subject would also be a way to become qualified, even without having studied the issue academically.
The purpose of this would be to enforce epistemic hygiene. Conversations on things such as public policy are frequently overwhelmed by loud declarations of opinion from people who, quite honestly, don't know anything about the subject they have a strong opinion on. If we had in place a social norm demanding an adequate amount of background knowledge before anyone voiced an opinion they expected to be taken seriously, the signal/noise ratio might be somewhat improved. This kind of social norm does seem to already be somewhat in place in many scientific communities, but it would do good to spread it to the general public.
At the same time, there are several caveats. As I am myself a strong advocate of freedom of speech, I find it important to note that this must remain a *social* norm, not a government-advocated one or anything that is in any way codified into law. Also, the standards must not be set *too* high - even amateurs should be able to engage in the conversation, provided that they know at least the basics. Likewise, one must be careful that the principle isn't abused, with "you don't have a right to have an opinion on this" becoming a generic argument used to dismiss any opposing claims.
The piece you may have missed is that even if the situation can be changed, it is still sufficient to use positive reinforcement to motivate action, and in human beings, positive reinforcement is generally the most useful way to motivate positive action.
This is because, on the human platform at least, positive reinforcement leads to exploratory, creative, and risk-taking behaviors, whereas negative reinforcement leads to defensive, risk-avoidant, and passive behaviors. So if the best way to change a situation is to avoid it, then by all means, use negative reinforcement.
However, if the best way to change the situation is to engage with it, then negative emotions and "shoulds" are your enemy, not your friend, as they will cause your mind and body to suggest less useful behaviors (and send less useful signals to others).
IAWYC, modulo the use of "should": at least with the connotations assumed on Less Wrong, it isn't associated with compulsion or emotional load; it merely denotes preference. "Ought" would be closer.