Mandating Information Disclosure vs. Banning Deceptive Contract Terms
Economists are very into the idea of mutually beneficial exchange. The standard argument is that if two parties voluntarily agree to a deal, then they must be better off with the deal than without it, otherwise they wouldn't have agreed. And if the terms of that deal don't harm any third parties,* then the deal must be welfare-improving, and any regulatory restrictions on making it must be bad.
One objection to this argument is that it's not always clear what is and what is not "voluntary." I once had a well-published economist friend argue that there are no gradations of voluntariness: either a deal was made under some kind of compulsion or it wasn't. I asked him if he would be OK letting his then pre-adolescent son make any schoolyard deal he wanted as long as it was not made under any overt threat, and I think (but am not totally sure) that he has since backed off this position. So there is an argument for purely paternalistic restrictions on freedom of contract.
Another objection, one which economists tend to take more seriously, relates to information. Specifically, there is the idea that maybe one party to the contract is not fully informed about its terms. For this reason, many economists are willing to entertain policies by which firms are required to disclose certain information, and to do so in a way that is comprehensible to consumers. So for example we now have "Schumer boxes" that govern the ways in which credit card companies present certain information in promotional materials. This seems to many people to be a reasonable remedy: if the problem was that one side of the transaction was ignorant, then a regulation that eliminates that ignorance, while at the same time not interfering with their freedom to engage in mutually beneficial exchange, must be a good thing.
The continued misuse of the Prisoner's Dilemma
Related to: The True Prisoner's Dilemma, Newcomb's Problem and Regret of Rationality
In The True Prisoner's Dilemma, Eliezer Yudkowsky pointed out a critical problem with the way the Prisoner's Dilemma is taught: the distinction between utility and avoided-jail-time is not made clear. The payoff matrix is supposed to represent the former, even as its numerical values happen to coincidentally match the latter. And worse, people don't naturally assign utility as per the standard payoff matrix: their compassion for the friend in the "accomplice" role means they wouldn't feel quite so good about a "successful" backstabbing, nor quite so bad about being backstabbed. ("Hey, at least I didn't rat out a friend.")
For that reason, you rarely encounter a true Prisoner's Dilemma, even an iterated one. The above complications prevent real-world payoff matrices from working out that way.
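The gap between jail time and utility can be made concrete with a small sketch. The numbers below are made up for illustration: the payoffs are the textbook jail-time matrix, and `compassion` is a hypothetical weight on the partner's payoff.

```python
# Classic Prisoner's Dilemma payoffs (higher = better), indexed by
# (my move, their move). These are the textbook numbers, standing in
# for years of jail avoided.
jail_payoff = {
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def utility(my_move, their_move, compassion=0.5):
    """Hypothetical utility: my payoff plus a compassion-weighted
    share of my partner's payoff."""
    mine = jail_payoff[(my_move, their_move)]
    theirs = jail_payoff[(their_move, my_move)]
    return mine + compassion * theirs

# With zero compassion, a "successful" backstab beats mutual cooperation,
# as in the textbook matrix:
assert utility("D", "C", compassion=0) > utility("C", "C", compassion=0)

# With enough fellow-feeling, mutual cooperation wins, and the game is
# no longer a true Prisoner's Dilemma:
assert utility("C", "C", compassion=0.8) > utility("D", "C", compassion=0.8)
```

The point of the sketch is only that the matrix of jail terms and the matrix of utilities are different objects; once compassion enters the utilities, the dominance structure that defines the dilemma can disappear.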
Which brings us to another unfortunate example of this misunderstanding being taught.
Joint Distributions and the Slow Spread of Good Ideas
A few years ago a well-known economist named David Romer published a paper in a top economics journal* arguing that professional football teams don't "go for it" nearly often enough on fourth down. The question, of course, is how this can persist in equilibrium. If Romer is correct, wouldn't teams have a strong incentive to change their strategies? Of course it's possible that he is correct, but that no one ever knew it before the paper was published. But then would the fact that the recommendation has not been widely adopted** constitute strong evidence that he is not correct? The paper points out two possible reasons why not. First, the objective function of the decision-makers may not be to maximize the probability of winning the game. Second and more relevant for our purposes, there may be some biases at work. The key point is this quote from the article (page 362):
"Many skills are more important to running a football team than a command of mathematical and statistical tools. And it would hardly be obvious to someone without knowledge of those tools that they could have any significant value in football."
Less wrong economic policy
Yesterday I heard an interesting story on the radio about US President Obama's pick to head the Office of Information and Regulatory Affairs, Cass Sunstein. I recommend checking out the story, but here are a few key excerpts.
Cass Sunstein, President Obama's pick to head the Office of Information and Regulatory Affairs, is a vocal supporter of [...] economic policy that shapes itself around human psychology. Sunstein is just one of a number of high-level appointees now working in the Obama administration who favors this kind of approach.
[...]
Through their research, Kahneman and Tversky identified dozens of these biases and errors in judgment, which together painted a certain picture of the human animal. Human beings, it turns out, don't always make good decisions, and frequently the choices they do make aren't in their best interest.
Beware Trivial Inconveniences
The Great Firewall of China. A massive system of centralized censorship purging the Chinese version of the Internet of all potentially subversive content. Generally agreed to be a great technical achievement and political success even by the vast majority of people who find it morally abhorrent.
I spent a few days in China. I got around it at the Internet cafe by using a free online proxy. Actual Chinese people have dozens of ways of getting around it with a minimum of technical knowledge or just the ability to read some instructions.
The Chinese government isn't losing any sleep over this (although they also don't lose any sleep over murdering political dissidents, so maybe they're just very sound sleepers). Their theory is that by making it a little inconvenient and time-consuming to view subversive sites, they will discourage casual exploration. No one will bother to circumvent it unless they already seriously distrust the Chinese government and are specifically looking for foreign websites, and these people probably know what the foreign websites are going to say anyway.
Think about this for a second. The human longing for freedom of information is a terrible and wonderful thing. It delineates a pivotal difference between mental emancipation and slavery. It has launched protests, rebellions, and revolutions. Thousands have devoted their lives to it, thousands of others have even died for it. And it can be stopped dead in its tracks by requiring people to search for "how to set up proxy" before viewing their anti-government website.
Sunk Cost Fallacy
Related to: Just Lose Hope Already, The Allais Paradox, Cached Selves
In economics we have this concept of sunk costs, referring to costs that have already been incurred and cannot be recouped. The sunk cost fallacy is the error of honoring such costs, which decision-theoretically should simply be ignored. The canonical example goes something like this: you have purchased a nonrefundable movie ticket in advance. (For the nitpickers in the audience, I will also specify that the ticket is nontransferable and that you weren't planning on meeting anyone.) When the night of the show comes, you notice that you don't actually feel like going out, and would actually enjoy yourself more at home. Do you go to the movie anyway?
A lot of people say yes, to avoid wasting the ticket. But on further consideration, it would seem that these people are simply getting it wrong. The ticket is a sunk cost: it's already paid for, and you can't do anything with it but go to the movie. But we've stipulated that you don't want to go to the movie. The theater owners don't care whether you go; they already have their money. The other theater-goers, insofar as they can be said to have a preference, would actually rather you stayed home, making the theater marginally less crowded. If you go to the movie to satisfy your intuition about not wasting the ticket, you're not actually helping anyone. Of course, you're entitled to your values, if not your beliefs. If you really do place terminal value on using something because you've paid for it, well, fine, I guess. But we should all try to notice exactly what it is we're doing, in case it turns out not to be what we want. Please, think it through.
Dearest reader, if you're now about to scrap your intuition against wasting things, I implore you: don't! The moral of the parable of the movie ticket is not that waste is okay; it's that you should implement your waste-reduction interventions at a time when they can actually help. If you can anticipate your enthusiasm waning on the night of the show, don't purchase the nonrefundable ticket in the first place!
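The irrelevance of the sunk cost can be shown in two lines of arithmetic. The utilities below are made-up numbers; the only thing that matters is that the price paid appears identically in both branches.

```python
# The movie-ticket decision with hypothetical utilities. The ticket
# price is gone whether you go or stay, so it shows up in both totals.
ticket_price = 10
enjoy_movie = 4   # made-up utility of a night at the theater
enjoy_home = 7    # made-up utility of staying in

utility_go = enjoy_movie - ticket_price    # -6
utility_stay = enjoy_home - ticket_price   # -3

# Subtracting the same sunk cost from both options cannot change the
# ranking; only the remaining terms decide:
assert (utility_stay > utility_go) == (enjoy_home > enjoy_movie)
```

Because the sunk cost is a constant added to every branch of the decision, it drops out of every comparison, which is the whole content of the fallacy.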
Awful Austrians
Response to: The uniquely awful example of theism
Why is theism such an ever-present example of irrationality in this community? I think ciphergoth overstates the case. Even theism is not completely immune to evidence, as the acceptance of, say, evolution by so many denominations over time will testify. Theism is a useful whipping boy because it needs no introduction.
But I think the case is overstated for another reason. There are terrible epistemologies out there that are just as bad as theism's. Allow me to tell you a tale, of how I gave up my religion and my association with a school of economics at the same time.
I grew up in a southern Presbyterian church in the U.S. While I was taught standard pseudo-evidential defenses for belief, such as "creation science" and standard critiques of evolution, my church was stringently anti-evidentialist. Their preferred apologetic was something called presuppositionalism. It's certainly a minority apologetic among major defenders of Christianity today, especially compared to the cosmological or morality arguments. But it's a particularly rigorous attempt to defend beliefs against evidence nonetheless.
Presuppositionalism (in some forms) hangs on the problem of induction. We cannot ultimately justify any of our beliefs without first making some assumptions, otherwise we end in solipsism. Christianity, then, justifies itself not on evidence, but on internal consistency. It is ok for an argument to be ultimately circular, because all arguments are ultimately circular. Christianity alone maintains perfect worldview consistency when examined through this lens, and is therefore correct.
Since I've spent a lot of time thinking about this--it can take a considerable effort to change one's mind, after all--I can imagine innumerable things wrong with it, but they're not the focus of this entry. First, I just want to note how close it is to a kind of intro-level Bayesian understanding. Bayesians admit that we must have priors, that it's indeed nonsense to think we can even have an argument with one who doesn't. We must ultimately admit that certain justifications are going to be either recursive or based on priors. We believe that we should update our priors based on evidence, but there's nothing in the math that tells us we can't start with a prior for some position of 0% or 100%. (There is something in the math that tells us such probability assignments are very bad ideas, and we have more than enough cognitive bias literature that tells us we shouldn't be so damn overconfident. But then, what if you have a prior that keeps you from accepting such evidence?) It doesn't have any of the mathematical rigor, but it comes very close on a few major points.
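The point about extreme priors follows directly from Bayes' rule: a prior of exactly 0 or 1 is algebraically immune to any evidence. A minimal sketch, with illustrative likelihood numbers:

```python
# Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E),
# where P(E) = P(E|H) * P(H) + P(E|~H) * (1 - P(H)).
def update(prior, p_e_given_h, p_e_given_not_h):
    """Posterior P(H|E) from the prior and the two likelihoods."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# Evidence 20x more likely under H moves a modest prior substantially...
posterior = update(0.1, 0.8, 0.04)  # roughly 0.69

# ...but priors of exactly 0 or 1 never move, no matter the evidence:
assert update(0.0, 0.8, 0.04) == 0.0
assert update(1.0, 0.8, 0.04) == 1.0
```

This is the mathematical face of the problem noted above: an agent whose prior assigns 0% or 100% to some proposition will, by the update rule itself, never accept any evidence against it.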
Dead Aid
Followup to So You Say You're an Altruist:
Today Dambisa Moyo's book "Dead Aid: Why Aid Is Not Working and How There Is a Better Way for Africa" was released.
From the book's website:
In the past fifty years, more than $1 trillion in development-related aid has been transferred from rich countries to Africa. Has this assistance improved the lives of Africans? No. In fact, across the continent, the recipients of this aid are not better off as a result of it, but worse—much worse.
In Dead Aid, Dambisa Moyo describes the state of postwar development policy in Africa today and unflinchingly confronts one of the greatest myths of our time: that billions of dollars in aid sent from wealthy countries to developing African nations has helped to reduce poverty and increase growth.
In fact, poverty levels continue to escalate and growth rates have steadily declined—and millions continue to suffer. Provocatively drawing a sharp contrast between African countries that have rejected the aid route and prospered and others that have become aid-dependent and seen poverty increase, Moyo illuminates the way in which overreliance on aid has trapped developing nations in a vicious circle of aid dependency, corruption, market distortion, and further poverty, leaving them with nothing but the “need” for more aid.
From the Global Investor Bookshop:
Dead Aid analyses the history of economic development over the last fifty years and shows how Aid crowds out financial and social capital and directly causes corruption; the countries that have caught up did so despite rather than because of Aid. There is, however, an alternative. Extreme poverty is not inevitable. Dambisa Moyo also shows how, with improved access to capital and markets and with the right policies, even the poorest nations could be allowed to prosper. If we really do want to help, we have to do more than just appease our consciences, hoping for the best, expecting the worst. We need first to understand the problem.