This is evidence against the claim that debating opinions which are not widely held, or even considered "conspiracy theories", just gives them a platform and strengthens their credibility in the eyes of the public.
I'm not sure this really applies here, since Lab Leak was never really treated as a crazy/fringe idea among rationalists. In fact, it looks like it was the majority opinion before the debate and the ACX posts.
This, and also most people on ACX respect Scott and his opinions, so if he demonstrates that he has put a lot of thought into a question and then reaches a conclusion, it will sound convincing to most.
Basically, we need to consider not just how many people believe some idea, but also how strongly. The typical situation with a conspiracy theory is that we have a small group that believes X very strongly, and a large group that believes non-X with various degrees of strength, from strongly to almost zero. What happens then is that people with a strong belief typically don't change their mind, while people with zero belief (who until now just took one side by default, because they never heard about the other) will flip a coin. Therefore the typical outcome is that the conspiracy theory becomes better known.
Or maybe the zero belief is not literally "never heard about the theory" but "never met an actual person who also believes the theory", and as the debate starts, they find each other, and thus the conspiracy theory becomes socially acceptable (even being in a minority feels very different from being alone).
Once the conspiracy theory is widely known, and everyone already knows a few believers, most of the damage has already been done.
The rationalist-adjacent community is often the opposite of wider society, in that mainstream beliefs are low-status, and we need to be reminded that they sometimes actually exist for a good reason. There is always the suspicion that people who hold mainstream beliefs are simply too stupid to think independently. Therefore a debate will improve the case for the mainstream belief.
Isn't a lot of the effect just "people who read ACX trust Scott Alexander"? The survey selects for the most "passionate" readers, those willing to donate their free time to Scott's research with ~nothing in return. Him publicly stating on his platform "I am now much less certain of X" is likely to make that group of people less certain of X.
This is clever. Looking at readership views on a subject Scott posted about during the survey wasn't something I'd even thought about, but it feels obvious in hindsight, which is an excellent sign of cleverness in my book. And your results are not just significant but at the "hot damn, look at that chart" level.
Thank you for thinking of this and writing it up!
I also found a huge effect in my twitter poll https://twitter.com/warty_dog/status/1773479101568786736 , though that has worse potential selection issues than the ACX poll.
ACX recently posted about the Rootclaim Covid origins debate, coming out in favor of zoonosis. Did the post change the minds of those who read it, or not? Did it change their judgment in favor of zoonosis (as was probably the goal of the post), or conversely did it make them think Lab Leak was more likely (as the "Don't debate conspiracy theorists" theory claims)?
I analyzed the ACX survey to find out, by comparing responses from before and after the post came out. The ACX survey asked readers whether they think the origin of Covid is more likely natural or Lab Leak. The survey went out March 26th and was open until about April 10th. The Covid origins post came out March 28th, and the comment highlights on April 9th. So we can compare people who responded before the origins post came out to those who responded after[1]. We should be careful, though, since those who filled out the survey earlier could differ from those who filled it out later, and this could create a correlation that isn't causal.
I used a Regression Discontinuity Design on the time of each response to see if there was a break in the trend of responses right at the time the Covid post went up. In effect, this compares respondents "right before" the post to those "right after", which helps assuage the confounding fears.
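For concreteness, here is a minimal sketch of that regression in Python (pandas + statsmodels). The file name and the columns `order` and `origin` are placeholders, not my actual code:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical input: survey responses sorted by submission time, with
# `order` = response order and `origin` = the Covid origins answer.
df = pd.read_csv("acx_survey.csv")

CUTOFF = 4002  # response order at which the Covid origins post went up
df["after"] = (df["order"] >= CUTOFF).astype(int)
df["running"] = df["order"] - CUTOFF  # running variable, centered at the cutoff

# Linear fit on both sides of the cutoff; the coefficient on `after`
# estimates the jump in responses right at the time of the post.
model = smf.ols("origin ~ after + running + after:running", data=df).fit()
print(model.summary())
```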
I find that the post made readers more likely to think that the origin was indeed zoonosis, and the effect is highly statistically significant. Here are the results, in charts.
Analysis
Here is the number of responses over time, with the timings of the posts highlighted. We'll mostly just need the timing of the Covid origins post, which is around response 4,002.
I'm assuming that readers who responded to the survey after the post went up had read the post before responding. Here is the post engagement data[1], which shows that within a few days of posting, most views of the post had already taken place.
The ACX Survey asked respondents what they thought about Covid origins.
I subtracted 3 from the questionnaire response to analyze a centered scale, for convenience. Here are the sliding window averages of 1,000 responses.
There are some fluctuations, but quite clearly there is a break in the trend at the time of the post, with readers starting to give scores more towards zoonosis. It looks like the post lowered responses by about 0.5 points (the transition in the chart is gradual because of the sliding window). There's not enough data to eyeball anything about the Comment Highlights post.
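In code, the sliding-window chart is roughly this (continuing from the sketch above, with the same placeholder names):

```python
import matplotlib.pyplot as plt

# Center the scale at zero (subtract 3, as above), drop non-responses,
# then take a 1,000-response sliding-window average.
answered = df.dropna(subset=["origin"]).copy()
answered["centered"] = answered["origin"] - 3
rolling_mean = answered["centered"].rolling(window=1000).mean()

plt.plot(answered["order"], rolling_mean)
plt.axvline(CUTOFF, linestyle="--", color="gray", label="Covid origins post")
plt.xlabel("Response order")
plt.ylabel("Mean origins score (1,000-response window)")
plt.legend()
plt.show()
```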
Another way to look at the same data is to use not a sliding window but a cumulative sum, where the local slope is the average response. I detrended this so that it has zero slope before the Covid post, again just for convenience.
We very clearly see the break in the trend, and the slope comes out to -0.52 points, similar to before. This is almost half a standard deviation, which is a pretty large effect, and needless to say it is extremely statistically significant. In fact, this effect made the Covid origins question the most highly correlated with response order of all survey questions.
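The detrended cumulative sum is a couple of lines on top of the same data; the post-cutoff slope then reads off the average shift:

```python
import numpy as np

# Subtract the pre-post mean so the cumulative sum is flat (slope ~0)
# before the cutoff; after the cutoff, the slope is the average change.
pre_mean = answered.loc[answered["order"] < CUTOFF, "centered"].mean()
detrended = (answered["centered"] - pre_mean).cumsum()

post = answered["order"] >= CUTOFF
slope = np.polyfit(answered.loc[post, "order"], detrended[post], 1)[0]
print(f"post-cutoff slope: {slope:.2f} points per response")
```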
As a placebo test, I also checked whether this effect exists for other responses, even ones correlated with Covid origins before the post, like views on Abortion or the Political Spectrum. I found nothing that looks nearly this clear: the effects are much smaller, if present at all, and not highly significant.
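The placebo check is just the same jump regression rerun on other questions; a sketch, with hypothetical column names standing in for the actual survey fields:

```python
# Same discontinuity regression, applied to other survey questions.
for col in ["abortion", "political_spectrum"]:
    fit = smf.ols(f"{col} ~ after + running + after:running", data=df).fit()
    print(f"{col}: jump = {fit.params['after']:.3f}, p = {fit.pvalues['after']:.3f}")
```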
I was curious whether the post also had a polarizing effect, where readers become more likely to hold a stronger view after the post, i.e. Lab Leak proponents becoming more certain of Lab Leak and zoonosis proponents becoming more certain of zoonosis. I don't find much support for this: the sliding window standard deviation of responses does not increase after the post. I'm not sure this is the perfect test for this hypothesis, so tell me if you have better ideas and I'm happy to implement them.
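That test is essentially the sliding-window chart again, with the standard deviation in place of the mean:

```python
# If the post polarized readers, dispersion should widen after the cutoff.
rolling_std = answered["centered"].rolling(window=1000).std()

plt.plot(answered["order"], rolling_std)
plt.axvline(CUTOFF, linestyle="--", color="gray")
plt.xlabel("Response order")
plt.ylabel("Std. dev. of origins score (1,000-response window)")
plt.show()
```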
The post also didn't seem to change the probability that respondents answered the question at all.
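This check is the same regression with an answered/not-answered indicator as the outcome; a coefficient on `after` near zero means the answer rate didn't move:

```python
# Did the post change whether people answered the question at all?
df["answered_flag"] = df["origin"].notna().astype(int)
fit = smf.ols("answered_flag ~ after + running + after:running", data=df).fit()
print(f"jump in answer rate: {fit.params['after']:.3f} (p = {fit.pvalues['after']:.3f})")
```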
Conclusion
In this case, it seems that the post convinced readers that zoonosis is more likely than they had previously thought. This is evidence against the claim that debating opinions which are not widely held, or even considered "conspiracy theories", just gives them a platform and strengthens their credibility in the eyes of the public.
Scott has been kind enough to share the timestamps of the survey responses and the engagement data with me for this analysis.