A lot of people probably saw this on Hacker News, but I thought I'd share it anyway: People are biased against creative ideas, studies find
To sum up, most people dislike uncertainty so much that they'll reject pretty much anything new, good or not. The article states that "Anti-creativity bias can be so subtle that people are unaware of it, which can interfere with their ability to recognize a creative idea." By "creative idea" I of course mean lawful creativity; the article seems to suggest that, if you're biased enough, every creative suggestion eventually starts to sound about as useful as "let's put pictures of purple unicorns on the wall to help ourselves be more productive."
What's a good way to fight this? Solving the problem of being creative in the first place is obviously a separate matter. But I would suggest the usual "imagine you're a different person injected into your own life to improve things" approach: start by taking every single suggestion seriously and thinking it through as if you were encountering the issue for the very first time, and then, as time goes on, get better at making quick, unbiased evaluations of creative ideas.
This article is a journalistic synopsis of a scientific paper but does not link to said paper. Nothing unusual there, but it does limit the ability to discuss it intelligently; the further you are from the empirical evidence, the greater the potential distortion of the signal.
That being said, I would be interested to see whether this is directed specifically at creativity or merely at uncertainty. If it's simply about how we digest new information, well... not even Bayes' theorem says you should be entirely without bias; new information that contradicts the old still has to be weighed against the old in the new assessment of probabilities. That's biasing, definitionally.
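To make that concrete, here's a toy example (my own numbers, nothing from the article): even fairly favorable evidence for a new idea only moves you so far when your prior against it is strong, and that's the update working as intended.

```python
# Toy Bayesian update: the prior "biases" the posterior by design.
# All numbers are made up for illustration.

prior = 0.2                 # P(idea is good) before hearing the pitch
p_evidence_if_good = 0.7    # P(pitch sounds this promising | idea is good)
p_evidence_if_bad = 0.3     # P(pitch sounds this promising | idea is bad)

# Bayes' theorem: P(good | evidence) = P(evidence | good) * P(good) / P(evidence)
p_evidence = p_evidence_if_good * prior + p_evidence_if_bad * (1 - prior)
posterior = p_evidence_if_good * prior / p_evidence

print(round(posterior, 3))  # 0.368 -- the evidence helps, but the low prior still drags it down
```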
Typically I handle uncertainty in a couple of ways. One is a kind of learned apathy: I seem innately more comfortable with uncertainty than most people. Two: when I knowingly introduce uncertainty, I do so by estimating the 'marginal value' of the act and the threshold at which a minimum allocation of time/resources would likely yield a positive gain. I then compartmentalize off an amount up to that threshold, but no further. (Risk mitigation.)
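In code-ish terms, the second strategy looks roughly like this (a sketch with made-up numbers; the function and figures are mine, purely illustrative):

```python
# Rough sketch of the "allocate up to the break-even threshold" idea.
# All numbers are hypothetical; the point is the shape of the calculation.

def break_even_hours(payoff_if_it_works, probability_it_works, value_per_hour):
    """Largest time investment that still has non-negative expected value."""
    expected_payoff = payoff_if_it_works * probability_it_works
    return expected_payoff / value_per_hour

# e.g. an uncertain side project: $2000 payoff, 25% chance of working,
# my time valued at $50/hour
limit = break_even_hours(2000, 0.25, 50)
print(limit)  # 10.0 -- commit up to ~10 hours, compartmentalized, and no more
```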
I don't know that this is useful to others, however.
http://digitalcommons.ilr.cornell.edu/articles/450/