This depends on who's predicting, who's listening, and various actors' models of how other actors will react to the prediction. For this reason, it's very hard to accurately predict conflict type and severity - war is not completely zero-sum, but it's close enough to be anti-inductive.
Conditional predictions about who would be hurt, and by how much, in a war _might_ reduce the chance of wars. Better knowledge of how badly one would lose (or be hurt even in a win) can certainly make participants more willing to submit rather than fight.
Makes me think of the story where they simulate war and then have their own citizens killed to represent what would have happened in an actual war (the Star Trek episode "A Taste of Armageddon", if I remember right). Imagine this, but without the killing part. It might remove the negative-sum aspect of wars.
My prior is that predictions are useful in general, and I have the impression this also holds for wars, but I really don't know; I've only spent a few minutes thinking about it.
And I imagine it changes depending on the war dynamic. For example, predictions about mutually assured destruction would probably escalate really quickly (although I'm not sure where they would stabilize; not necessarily high). Reason: if a prediction market puts a slightly too-high probability on a great power launching a nuclear first strike, predictors will update slightly toward the other great power also striking first to avoid being on the receiving end, and so on, in a way that can spiral and increase the overall risk of a first strike.
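A minimal sketch of that feedback loop (my own toy model, purely illustrative - the baseline and reaction strength `k` are made-up parameters, not anything from real markets): each power's first-strike probability next round is a baseline plus a reaction to the market's current estimate of the *other* power striking first.

```python
# Toy escalation loop (hypothetical parameters): each side's next
# first-strike probability = baseline + k * (other side's current one).
def simulate(baseline=0.01, k=0.8, rounds=20):
    """k measures how strongly each side reacts to the market's
    estimate of the other side striking first."""
    p_a = p_b = baseline
    for t in range(rounds):
        # simultaneous update from each other's previous-round probability
        p_a, p_b = (min(1.0, baseline + k * p_b),
                    min(1.0, baseline + k * p_a))
        print(f"round {t:2d}: P(A strikes)={p_a:.3f}  P(B strikes)={p_b:.3f}")

simulate(k=0.8)   # converges to baseline / (1 - k) = 0.05: stabilizes low
simulate(k=1.2)   # runs away toward 1: the spiral
```

On this toy model, whether the spiral happens depends entirely on the assumed reaction strength: with k < 1 the probabilities settle at baseline / (1 - k), which can stay low (matching the "not necessarily high" caveat); with k >= 1 they escalate toward certainty.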
On the other hand, if prediction markets signal that wars would be more likely if countries traded less with each other, then that prediction would usefully inform countries to increase trade, and hence reduce the risk of wars.
I think it depends on the amount of money involved in the predictions! If the reward for correct predictions is high enough, people with political power might be incentivised to perform a military version of 'insider trading' and escalate/call off a war.
Good point.
Alternatively, maybe weaker countries would bet that they will go to war, as a hedge. Possibly this would be the equivalent of paying more powerful countries in exchange for them not going to war with you (which is maybe zero-sum instead of negative-sum).
[retracted: I read the question too quickly, misunderstood it]
My impression, after some thought and discussion (over the last ~1 year or so), is that people being smarter / predicting better will probably decrease the number of wars and make them less terrible. That said, there are of course tails; perhaps some specific wars could be far worse (one country being much better at destroying another).
As I understand it, many wars started in part due to overconfidence: both sides were overconfident about their odds of success (for many reasons). If they were properly calibrated, they would be more likely to make immediate trades/concessions or similar, rather than take fights, which are rather risky (a toy version of this argument is sketched below).
Similarly, I wouldn't expect different AGIs to physically fight each other often at all.
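Here's the sketch: a toy version of the standard bargaining model of war (Fearon-style; the numbers and the function are my own, purely illustrative). Because fighting burns value, calibrated beliefs leave a range of peaceful splits both sides prefer to war, while mutual overconfidence can make that range empty.

```python
# Toy bargaining model of war (hypothetical numbers): the prize is worth 1,
# fighting costs each side c, p_a is A's estimate of P(A wins), and p_b is
# B's estimate of P(A wins). A prefers a split x over fighting iff
# x >= p_a - c; B prefers it iff x <= p_b + c.
def bargaining_range(p_a: float, p_b: float, c: float = 0.1) -> str:
    lo, hi = p_a - c, p_b + c
    if lo <= hi:
        return f"deal possible: any split of the prize in [{lo:.2f}, {hi:.2f}]"
    return "no mutually acceptable split: both sides prefer to fight"

print(bargaining_range(p_a=0.5, p_b=0.5))  # calibrated: deal in [0.40, 0.60]
print(bargaining_range(p_a=0.8, p_b=0.3))  # mutual overconfidence: war
```

On this toy model, better prediction helps precisely by pulling p_a and p_b together, which reopens the bargaining range.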
Still interesting and relevant I think! It answers: "What are the externalities of GOOD / BAD predictions on wars?"
To what extent, if at all, does (probabilistically) predicting peace increase or decrease peace in expectation?
To what extent, if at all, does (probabilistically) predicting war increase or decrease war in expectation?
Do those predictions have any other significant impact on the world?
Answers might very well depend on the type of war.