Robin Hanson notes that the existence of a stock market can also give rise to an incentive to e.g. bomb a company's offices, yet such things very rarely actually happen.
Large stock market gains are trackable. See the investigation of people who bought puts on airline stocks before 9/11, for example. (The investigation didn't end up finding anyone guilty, but my point is that it could have; the information was there.)
If prediction markets were required to ban anonymous users, then it might be comparable to the stock market.
Do the proponents of futarchy think predicting should be anonymous?
I infer from the timing of Szabo's post (maybe he says so explicitly; I didn't read it carefully, since this is an old criticism of PMs) that it was prompted by the recent progress of Truthcoin/Augur, a prediction market which is now running prototypes on the Ethereum blockchain.
Truthcoin is fully distributed (judgments are distributed among all traders, and honesty is incentivized by a clever majority algorithm), so whether or not one thinks predicting should be anonymous, it will be pseudonymous.
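To make the pseudonymous-judgment point concrete, here is a minimal sketch of a stake-weighted majority rule. This is my own simplification, not Truthcoin's actual consensus algorithm (which is considerably more sophisticated): pseudonymous reporters stake reputation on an outcome, the stake-weighted majority report becomes the official result, and dissenters forfeit part of their stake to the majority, which is what makes honest reporting the profitable strategy.

```python
# Toy stake-weighted majority judgment (a stand-in for Truthcoin's real
# consensus algorithm): reporters who vote against the stake-weighted
# majority forfeit part of their stake to those who voted with it.

def judge(reports, penalty=0.2):
    """reports: dict of reporter -> (vote: bool, stake: float)."""
    yes_stake = sum(s for v, s in reports.values() if v)
    no_stake = sum(s for v, s in reports.values() if not v)
    outcome = yes_stake >= no_stake

    forfeited = sum(s * penalty for v, s in reports.values() if v != outcome)
    majority_stake = yes_stake if outcome else no_stake

    new_stakes = {}
    for name, (vote, stake) in reports.items():
        if vote == outcome:
            # majority reporters split the forfeited stake pro rata
            new_stakes[name] = stake + forfeited * (stake / majority_stake)
        else:
            new_stakes[name] = stake * (1 - penalty)
    return outcome, new_stakes

outcome, stakes = judge({
    "alice": (True, 50.0),
    "bob":   (True, 30.0),
    "carol": (False, 10.0),
})
print(outcome, stakes)  # True; carol loses 20% of her stake to alice and bob
```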
the existence of the market also gives rise to parallel external incentives.
Yes, this is generally true; so what? By that logic, a sports betting market is necessarily a market for breaking players' legs, any country's stock and bond markets are markets in the political assassination of its leaders, and so on.
Interesting point! It seems obvious in hindsight that if you reward people for making predictions that correspond to reality, they can benefit either by fitting their predictions to reality or by fitting reality to their predictions. Certainly, it is an issue that comes up even in real life in the context of sports betting. That said, this particular spin on things hadn't occurred to me, so thanks for sharing!
"The best way to predict the future is to invent it." -- Alan Kay (and/or others.)
Seems a pretty straightforward extension of Goodhart's Law. Every measure is an incentive, with all the alignment and agency problems that come along with it.
The Alan Kay quote reminds me of my thermo prof saying, "Engineers love to control a process. That way, we don't have to understand it." I've always loved that one, but Alan Kay says it better.
I agree that simple, single-stage game models do not usually predict important real-world outcomes. I also agree that markets change players' incentives to act outside of that market.
However, society usually notices these blind spots and addresses them in one way or another. Szabo describes two real world problems:
1) Audits ignore outside accounts/trading activity
For public companies, deception of this kind is usually illegal. Key employees of a company certify on its annual report that “this report does not contain any untrue statement of a material fact or omit to state a material fact” or something to that effect. Also, insider trading can carry stiff penalties.
2) Prediction markets may become assassination markets
Murder is illegal. Also, profits obtained by illegal acts are subject to disgorgement in the United States.
…a prediction market on a certain person's death is also an assassination market. Which is why a pre-Gulf-War-II DARPA-sponsored experimental "prediction market" included a prop bet on Saddam Hussein's death, but excluded such trading on any other, more politically correct world leaders.
An “assassination market” on normal people exists today. It is called the life settlement market.
I want to see if I understand: you're saying that participants in prediction markets use microeconomic and macroeconomic tools. And, because microeconomic tools can overfit and macroeconomic tools are subject to the Lucas critique, prediction markets have kinks to work out?
while a prediction market does incentivize feeding accurate information into the system
Not quite. Prediction markets allow you to bet on your forecasts, and their aggregate output is just the capital-weighted opinion of the participants. They incentivize being honest about one's forecasts, but that's a very different thing from "feeding accurate information".
No. If you forecast that the price of gold will go up, and the price instead goes down, then being honest about your forecast loses you money. Prediction markets reward people for making accurate predictions. Whether those predictions were an accurate reflection of beliefs is irrelevant.
Pretty much everything in life "reward[s] people for making accurate predictions". That's not the issue.
The problem is that to "supply accurate information" you need to know what is "accurate" ex ante and you don't. At the time you submit your bet to a prediction market you're operating on the basis of expectations -- you have no access to the Truth about the outcome, you only have access to your own beliefs. Accordingly, you don't tell the prediction market what is the correct choice, you tell it what you believe is the correct choice. Prediction markets aggregate beliefs, not truth values.
Nick Szabo writes about the dangers of taking assumptions that are valid in small, self-contained games and applying them to larger, real-world "games," a practice he calls a small-game fallacy.
This last point, which he expands on later in the post, will be of particular interest to some readers of LW. The idea is that while a prediction market does incentivize feeding accurate information into the system, the existence of the market also gives rise to parallel external incentives. As Szabo glibly puts it,
Futarchy, it seems, will have some kinks to work out.