A common response in the recent LessWrong threads about UFOs is that rationalists immediately want to translate the news into probabilities for the existence of aliens, instead of taking the facts for what they are and thinking about what should happen based on what has been revealed.
According to Ross Coulthart, David Grusch gave the ICIG, Congress, and the Senate the locations where the vehicles are stored and the names of the people who control access to those programs.
While I would like to know whether or not aliens have visited Earth, I think it's more useful to simply take the stance "I don't know" instead of thinking in terms of probability.
From the "I don't know" stance, the next step is obvious: there need to be congressional hearings where the people who were named as controlling access to those programs are asked in public about the nature of those programs.
Given that there seem to be powerful people in the intelligence community who want to block public exposure of those programs, whatever their nature, it's important that there's public pressure on Congress to investigate and hold public hearings that go into the details.
The mental move of directly rounding down from "my priors against aliens are high" -> "no aliens" -> "no need to do anything" is bad, because if enough people make it, we won't get more evidence.
That also mistakes how the human mind and human intelligence work. Our brains are not made to think in terms of probability.
I think most people who believe their beliefs are completely coherent are deluding themselves. Assuming that completely coherent beliefs are the natural state of the human mind misunderstands a lot about what goes on in human minds. For a long time, people building AI believed that an AI would likely have coherent beliefs. With GPT, we see that the best intelligence we can build on computers doesn't seem to have coherent beliefs either.
Julia Galef writes about how noticing confusion is a key rationalist skill. The state of noticing confusion is one where you see that the evidence doesn't really fit and you don't yet have a good idea of the right hypothesis.
Confusion calls for more investigation. It's normal not to have a clear hypothesis when you investigate while you are confused.
Thomas Kuhn writes about how new scientific paradigms always start with noticing anomalies and investigating them. If you don't engage in that investigation because you lack a hypothesis with a decent probability of explaining how the facts fit together, you are not going to find new paradigms, because finding them involves spending a decent amount of time in a space with a lot of unknowns.