- Sister Y's The Right to Marry
- A Really, Really, Really Long Post About Gay Marriage That Does Not, In The End, Support One Side Or The Other, also recommended by CharlieSheen
Now suppose the existence of an amoral, demiomnipotent third party that can determine if a person understands the implications of an agreement and is free from coercion, will formalize any contract iff all parties understand the implications of said contract and are free from coercion, and enforces all formalized contracts only at the request of any party to the contract. Is that UFAI, FAI, or neither?
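The hypothetical entity's rule can be sketched as a small decision procedure. This is only an illustration of the logic as stated above; `understands` and `uncoerced` are invented stand-ins for the entity's (assumed infallible) judgments, not anything specified in the scenario:

```python
class ContractEnforcer:
    """Sketch of the hypothetical entity: formalize iff all parties
    understand and are uncoerced; enforce only formalized contracts,
    and only at the request of a party to the contract."""

    def __init__(self):
        self.formalized = set()

    def formalize(self, contract, parties, understands, uncoerced):
        # Formalize iff every party understands the implications
        # of the contract and is free from coercion.
        if all(understands(p, contract) and uncoerced(p, contract)
               for p in parties):
            self.formalized.add(contract)
            return True
        return False

    def enforce(self, contract, requester, parties):
        # Enforce only contracts that were formalized, and only
        # when a party to the contract requests it.
        return contract in self.formalized and requester in parties
```

Note that the rule is content-blind: nothing in it inspects what the contract does, which is exactly the gap the replies below poke at.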
It's less unfriendly than fubarobfusco's example, but still not quite optimal, since refusing to enforce some contracts - most obviously, contracts that inflict technical externalities on third parties - can increase utility.
You could weaken this conclusion by assuming that the AI can drive all transaction and contracting costs to zero, since then all Coasian-optimal contracts would be made. But even that result assumes, e.g., that inequalities in marginal utility are not relevant (since otherwise a utilitarian AI would want to "redistribute" wealth - broadly understood - and use imperfect contract enforcement to do so).
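The Coasian point can be made concrete with a toy numeric case. The numbers are invented for illustration: with zero transaction costs, the parties bargain to the surplus-maximizing outcome regardless of which side initially holds the right.

```python
def coasian_outcome(polluter_gain, victim_loss):
    # With zero transaction and contracting costs, bargaining reaches
    # the surplus-maximizing outcome no matter who holds the right:
    # whichever side values the outcome more buys out the other.
    return "pollute" if polluter_gain > victim_loss else "abate"

# Illustrative (invented) payoffs:
assert coasian_outcome(100, 60) == "pollute"  # victim sells the right for 60-100
assert coasian_outcome(100, 150) == "abate"   # polluter accepts a 100-150 buyout
```

The distributional caveat is visible here too: the outcome is the same under either rights assignment, but who pays whom is not, which is why a utilitarian AI sensitive to marginal-utility differences would still care about more than efficiency.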
Information asymmetries may also be a problem: it's possible that Coasian reasoning can be extended to yield a constrained Pareto optimum in such cases, but I'm not at all sure about that. Even then, what if the AI is better informed than the agents are?
Agents with self-control problems can incur "internalities" on themselves. Of course, self-control issues can be mitigated if the agent alters her own behavioral tendencies and sets up appropriate incentives (acting as a "principal" to herself in a principal/agent setup); nevertheless, if such possibilities are inherently limited, then imperfect enforcement of contracts could increase the agents' utility in the long term.
Strategic considerations also pose a severe challenge to Coasian reasoning, and to freedom of contract more generally: if all contracts are allowed, then extortion attempts may qualify as contracts, so agents will want to extort one another or evade shakedowns, and will pay resource costs to do so.
Plus there might be other stuff I haven't thought of.
What would be an example of a penalty clause that 'inflict(s) technical externalities on third parties'? I might add the stipulation that those parties must also be parties to the contract.
I'm not asking that this entity actually do anything beyond the specific tasks related to contract enforcement that it has been assigned. It isn't intended to bring about immortality or make perfect predictions about the future or prove that it is physically possible to fulfill a contract (I assume that every formalized contract would have a penalty clause which is prova...