Here’s a paper from FHI from 2016 on a cost-benefit analysis of GoF research:
https://www.fhi.ox.ac.uk/wp-content/uploads/GoFv9-1.pdf
I don’t know how carefully you’ve quantified “most of the EA think tanks,” but maybe worth adding some precision here?
The problem is that the paper doesn't actually do a cost-benefit analysis; instead it says, "However, in the case of potential pandemic pathogens, even a very low probability of accident could be unacceptable given the consequences of a global pandemic."
The base rate, given that the last pandemic caused by a lab leak happened less than 50 years earlier, suggests that at the time it wasn't a very low probability. The paper then goes on to call for a generalized solution to the problem. To a reader, the recommendation comes across as: "gain-of-function" research is a political topic, so we can push for general safety solutions that are also useful in other areas where the risks trouble us.
A non-generalized solution would be to say: "No biosafety level 3/4 labs in cities. Put them all in remote areas and require researchers leaving them to undergo a 14-day quarantine." The fact that Baric developed his SARS 2.0 outside of a biosafety level 4 lab is mind-boggling.
When talking about asteroids as an X-risk, speaking of very low probabilities makes sense. Skipping the cost-benefit analysis and speaking of a very low probability here gives readers of FHI papers a false impression of the risks.
Well, you may not like their approach, but the original argument you were making, I think, was that EA think tanks weren’t addressing this issue. This paper certainly dealt with the topic in more depth than the listicle, not that that’s saying much, and it did it 2-3 years earlier. Also it took me all of 10 seconds to find it. So again, can you be a little more precise in saying what you mean by “most EA think tanks?”
Or are you mainly saying that you'd have liked to see EA screaming against gain-of-function research at a giant, obvious, institutional level, rather than writing some tidy policy papers?
Also, in early 2019, Kelsey Piper's article "Biologists are trying to make bird flu easier to spread. Can we not?" was published at Vox (Future Perfect).
A physics professor at the University of Hamburg, in Germany, spent over a year putting together research papers and publications from labs dedicated to that kind of research, like the one in Wuhan, and published significant excerpts of them along with commentary to instigate public discussion about Covid and the need, or lack thereof, for research aimed at making viruses more dangerous. He raises a good point: even in the best case, the research done so far on bat coronaviruses manipulated to be more dangerous in a lab has not helped us prevent a pandemic, so why continue to risk a breach?
This is a very condensed summary in English: https://www.uni-hamburg.de/en/newsroom/presse/2021/pm8.html. It links to the complete document in German and English; even if you don't understand German, all the snippets of published articles are in English, so you can follow along even without the commentary.
He provides circumstantial evidence that this pandemic was the result of that research. His purpose, however, doesn't seem to be assigning blame on political grounds, but rather making sure that a very plausible origin of the virus is not dismissed as a conspiracy theory, which would leave future prevention measures suboptimal.
The post about 10 deadly viruses created in labs was published on January 23, 2019:
It seems like Listverse outperformed most of the EA think tanks on the question of thinking about pandemics pre-COVID.