The most useful thinking skill I've taught myself, which I think should be more widely practiced, is writing what I call "fact posts." I write a bunch of these on my blog. (I write fact posts about pregnancy and childbirth here.)
To write a fact post, you start with an empirical question, or a general topic. Something like "How common are hate crimes?" or "Are epidurals really dangerous?" or "What causes manufacturing job loss?"
It's okay if this is a topic you know very little about. This is an exercise in original seeing and showing your reasoning, not finding the official last word on a topic or doing the best analysis in the world.
Then you open up a Google doc and start taking notes.
You look for quantitative data from conventionally reliable sources. CDC data for the incidence of diseases and other health risks in the US; WHO data for global health issues; Bureau of Labor Statistics data for US employment; and so on. Published scientific journal articles, especially from reputable journals and large randomized studies.
You explicitly do not look for opinion, even expert opinion. You avoid news, and you're wary of think-tank white papers. You're looking for raw information. You are taking a sola scriptura approach, for better and for worse.
And then you start letting the data show you things.
You see things that are surprising or odd, and you note that.
You see facts that seem to be inconsistent with each other, and you look into the data sources and methodology until you clear up the mystery.
You orient towards the random, the unfamiliar, the things that are totally unrepresented in your experience. One of the major exports of Germany is valves? When was the last time I even thought about valves? Why valves, what do you use valves in? OK, show me a list of all the different kinds of machine parts, by percent of total exports.
And so, you dig in a little bit, to this part of the world that you hadn't looked at before. You cultivate the ability to spin up a lightweight sort of fannish obsessive curiosity when something seems like it might be a big deal.
And you take casual notes and impressions, though you keep track of all the numbers and their sources in your notes.
You do a little bit of arithmetic to compare things to familiar reference points. How does this source of risk compare to the risk of smoking or going horseback riding? How does the effect size of this drug compare to the effect size of psychotherapy?
You don't really want to do statistics. You might take percents, means, standard deviations, maybe a Cohen's d here and there, but nothing fancy. You're just trying to figure out what's going on.
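To make that "nothing fancy" concrete, here's a minimal sketch of the kind of statistic you might compute -- a Cohen's d from two groups of numbers. The group names and values here are invented for illustration, not real data:

```python
from statistics import mean, stdev

def cohens_d(group_a, group_b):
    """Effect size: difference of means, in units of pooled standard deviation."""
    n_a, n_b = len(group_a), len(group_b)
    # Pooled variance, weighted by each group's degrees of freedom.
    pooled_var = ((n_a - 1) * stdev(group_a) ** 2 +
                  (n_b - 1) * stdev(group_b) ** 2) / (n_a + n_b - 2)
    return (mean(group_a) - mean(group_b)) / pooled_var ** 0.5

# Invented example: symptom scores for a treated vs. an untreated group.
treated = [4, 5, 3, 6, 4, 5]
control = [7, 6, 8, 7, 6, 9]
print(round(cohens_d(treated, control), 2))
```

That's about as heavy as the machinery needs to get: enough to say "this effect is big" or "this effect is tiny" when comparing, say, a drug to psychotherapy.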
It's often a good idea to rank things by raw scale. What is responsible for the bulk of deaths, the bulk of money moved, etc? What is big? Then pay more attention to, and ask more questions about, the things that are big. (Or disproportionately high-impact.)
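A ranking like that is just a sort plus a percent-of-total column. A minimal sketch -- the categories and counts below are invented for illustration, not real statistics:

```python
# Invented example counts, not real data.
deaths_by_cause = {
    "heart disease": 695_000,
    "cancer": 605_000,
    "accidents": 225_000,
    "stroke": 165_000,
}

total = sum(deaths_by_cause.values())
# Largest first: the things at the top are the things worth asking questions about.
for cause, n in sorted(deaths_by_cause.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{cause}: {n:,} ({n / total:.0%} of total)")
```

The point isn't the code; it's the habit of always asking what dominates the total before getting absorbed in the small categories.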
You may find that this process gives you contrarian beliefs, but often you won't, you'll just have a strongly fact-based assessment of why you believe the usual thing.
There's a quality of ordinariness about fact-based beliefs. It's not that they're never surprising -- they often are. But if you do fact-checking frequently enough, you begin to have a sense of the world overall that stays in place, even as you discover new facts, instead of swinging wildly around at every new stimulus. For example, after doing lots and lots of reading of the biomedical literature, I have sort of a "sense of the world" of biomedical science -- what sorts of things I expect to see, and what sorts of things I don't. My "sense of the world" isn't that the world itself is boring -- I actually believe in a world rich in discoveries and low-hanging fruit -- but the sense itself has stabilized, feels like "yeah, that's how things are" rather than "omg what is even going on."
In areas where I'm less familiar, I feel more like "omg what is even going on", which sometimes motivates me to go accumulate facts.
Once you've accumulated a bunch of facts, and they've "spoken to you" with some conclusions or answers to your question, you write them up on a blog, so that other people can check your reasoning. If your mind gets changed, or you learn more, you write a follow-up post. You should, on any topic where you continue to learn over time, feel embarrassed by the naivety of your early posts. This is fine. This is how learning works.
The advantage of fact posts is that they give you the ability to form independent opinions based on evidence. It's a sort of practice of the skill of seeing. They likely aren't the optimal way to get the most accurate beliefs -- listening to the best experts would almost certainly be better -- but you, personally, may not know who the best experts are, or may be overwhelmed by the swirl of controversy. Fact posts give you a relatively low-effort way of coming to informed opinions. They make you into the proverbial 'educated layman.'
Being an 'educated layman' makes you much more fertile in generating ideas, for research, business, fiction, or anything else. Having facts floating around in your head means you'll naturally think of problems to solve, questions to ask, opportunities to fix things in the world, applications for your technical skills.
Ideally, a group of people writing fact posts on related topics could learn from each other and share how they think. I have the strong intuition that this is valuable. It's a bit more active than a "journal club", and quite a bit more casual than "research". It's just the activity of learning and showing one's work in public.
A metaphor: Knowledge is a jigsaw puzzle, and the search for truth is a process of trial and error, fitting new pieces alongside those you already have. The more pieces you have in place, the quicker you can accept or reject new ones; the more granular the detail you perceive in their edges, the better you can identify the exact shape of holes in the puzzle and make new discoveries.
And if there's a misshapen piece you absolutely refuse to move, it will screw up the entire puzzle and you'll never get it right. This method is great -- generally reliable sources that fit together are free pieces, the foundation you need to even get started.
Unfortunately, it's often easy and natural to force contradictory new data into your existing model even when it really doesn't fit -- patching the conflicts without ever noticing the dissonance, and overfitting your theory without actually restructuring your beliefs. One useful trick for checking yourself: before you look anything up, explicitly ask yourself "what do I expect this figure or fact to say?" at each step of the project. If you go in with reasonably confident expectations and the data comes back wildly out of bounds, maybe you've found a major hole in your understanding of the issue, maybe the info is bad, or maybe that figure is saying something very different from what you thought.