I’d begin by limiting the number of rounds allowed in the negotiation. I’ve spent the last decade in sales, and the best negotiations are typically the ones in which both parties have a hard deadline and have to get it done quickly.
Agreed. In an ideal world there's no benefit to having more than one round, since all information that's going to be shared can be shared up front. I'm not sure if real life considerations change that.
I would appreciate it if people shared why they're downvoting this post. It's very discouraging to spend time writing up a detailed question and then just get mass downvoted for no apparent reason.
Edit: For context, I wrote this when the post was at -6 votes.
It does seem odd. From what I've observed, some high-effort posts get downvoted and some low-effort posts get upvoted by a lot in the first few hours.
It might be some kind of coordinated trolling, bot spam, or something else, since it seems possible for a single individual to make multiple accounts. I just treat every karma score within 50 points of zero as roughly the same.
Yudkowsky has written about The Ultimatum Game. It has also been referenced here (1, 2).
When somebody offers you a 7:5 split, instead of the 6:6 split that would be fair, you should accept their offer with slightly less than 6/7 probability. Their expected value from offering you 7:5, in this case, is 7 * slightly less than 6/7, or slightly less than 6.
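As a rough illustration of that acceptance rule (a sketch of my own, assuming a 12-unit pie with a 6:6 fair split; the small epsilon is just an arbitrary stand-in for "slightly less than"):

```python
import random

def accept_probability(their_share, fair_share=6, epsilon=0.01):
    """Accept an unfair offer with probability just below fair_share / their_share,
    so the proposer's expected take stays just below what a fair split gives them."""
    if their_share <= fair_share:
        return 1.0  # fair or generous offers are always accepted
    return max(0.0, fair_share / their_share - epsilon)

def respond(their_share, fair_share=6):
    """Randomly accept or reject according to the rule above."""
    return random.random() < accept_probability(their_share, fair_share)

# A 7:5 offer: accepted with probability just under 6/7 (~0.847), so the
# proposer's expected value is just under 6, no better than offering 6:6.
print(accept_probability(7))
```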
It's worth acknowledging that "strategy-proof" is a pretty limited definition. Second-price auctions are awesome in cases where there are more bidders than supply, and the supplier isn't trying to maximize their revenue (or is prevented from using less-transparent differential pricing). Even then, it's only strategy-proof in the one-shot case. If there will be future auctions or negotiations, the incentive to hide one's price-sensitivity returns.
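To make that repeated-game point concrete, here's a toy construction of my own (not from the comment above): the seller sets the next round's reserve price equal to the last winning bid, so revealing your full value in round 1 costs you in round 2.

```python
# Assumes our bidder wins round 1 and the rival's round-2 bid stays below the reserve.
def two_round_surplus(bid_round1, rival_bid, true_value):
    price_round1 = rival_bid                  # second-price: winner pays the rival's bid
    reserve_round2 = bid_round1               # seller exploits the revealed bid
    surplus_round1 = true_value - price_round1
    surplus_round2 = max(true_value - reserve_round2, 0)  # buys at the reserve, if worth it
    return surplus_round1 + surplus_round2

# Truthful bid of 100 (the bidder's true value) vs. a shaded bid of 60, rival bids 50:
print(two_round_surplus(100, 50, 100))  # 50 + 0  = 50
print(two_round_surplus(60, 50, 100))   # 50 + 40 = 90: shading pays off over two rounds
```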
I also react very badly to a teaser post, framed as a question. AT LEAST set up the bargaining scenario (who's trading for what kinds of things, with what substitutability, quality knowledge imbalances, and repeatability of transaction).
edit: a "teaser post" is one which says "I have a clever/useful/interesting idea, but I'm not going to explain it yet, just make some vague claims about it." Not only is it annoying, it almost NEVER results in an actually interesting explanatory post.
Thanks for the edit. It wasn't my intention to "tease" people; my idea isn't the focus of this post, I'm hoping other people will suggest better ones. I just wanted to mention that I had an idea as a way of showing that there exist plausible solutions, and to signal that I had put some thought into it myself and wasn't just "asking people to do my homework" as it were.
This isn't downvoted as I expected, so maybe I'm overreacting. I don't find the setup clear enough to answer (mixing very different aspects of discovery and price-setting, switching from auctions to very-low-volume asymmetric-information transactions, confusing the theory of incentive-compatibility and strategy-proofness with the practical annoyance of car-salesman tactics). But maybe it's just me - I look forward to your actual post that explains which aspects of things your idea addresses, and how.
Did you ever share your actual idea? I didn't see a post, and don't see a comment that lays it out here. It seems to be about used-car sales, which makes me suspect the idea misses a lot of the incentives and desires of the two parties, especially the information asymmetry and the legal framework of recourse in the case of fraud (otherwise, why aren't you running a business with the idea, rather than just bringing it up here?).
Took me longer than I expected, but I just did. Here you go: An attempt at a "good enough" solution for human two-party negotiations
Aren't the other used cars available nearby, and the potential other buyers should you walk away, relevant to that negotiation?
Yeah, that could be relevant, but the system might be able to factor that in. For example, maybe it could be modeled as decreasing the maximum price the buyer is willing to pay (since they know they can have more attempts), and the system factors that in. I want it to be able to handle a broad range of real-world negotiations, so the exact details ideally shouldn't matter that much.
As evand said, a used-car buyer or seller can just walk away to another counterparty if they're unsatisfied.
You're going to have to provide a specific example that's actually limited to one-on-one, or else it's too open-ended to answer.
I think you're missing the point. Designing a bespoke system for an individual negotiation that takes into account the exact dynamics of that particular situation doesn't seem at all feasible. I'm talking about a general system that's "good enough".
You haven't defined 'good enough'; without that, it's either impossible to answer or trivial.
Hence our confusion as to what you're asking.
e.g. a lot of common-sense things will work to varying degrees, like just setting a non-negotiable price and walking away if they decline.
Some types of negotiations are strategyproof: they're designed such that the optimal strategy for each player is to be truthful. For example, in a Vickrey auction there's no incentive to lie or bid less than your maximum; doing so would only put you at a disadvantage.
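To make that concrete, here's a minimal sketch (my own toy example, with made-up bidder names) of why truthful bidding is weakly dominant in a sealed-bid second-price auction:

```python
def vickrey_winner_and_price(bids):
    """Sealed-bid second-price (Vickrey) auction: the highest bidder wins
    but pays the second-highest bid."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1]  # second-highest bid sets the price
    return winner, price

# Your bid never sets the price you pay; it only determines whether you win.
# So bidding your true maximum is weakly dominant.
print(vickrey_winner_and_price({"alice": 120, "bob": 100, "carol": 90}))
# -> ('alice', 100): alice pays bob's bid whether she bid 120 or 500.
```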
Unfortunately, when it comes to negotiations between a single buyer and a single seller, it's been proven that there is no strategyproof solution. (See Lying in negotiations: a maximally bad problem.) The seller is always incentivized to overrepresent the value of the item, and the buyer to underrepresent it. This can lead to brinksmanship, where both parties try to set a firm "take it or leave it" price in order to force the other party to accept, at the risk of no deal occurring at all.
Ideally, the item would sell at whatever price maximizes total utility across both players. But when it comes to real humans in the real world, it's very easy for either side to lie about their own utility curve, so there's no good way for the parties to enforce this.
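As a toy illustration of that incentive problem (my own construction, not a mechanism proposed here): suppose reported valuations are taken at face value and the deal closes at the midpoint. Shading your report then shifts the surplus in your favor, at the risk of killing the deal if you shade too far.

```python
def split_the_difference(buyer_report, seller_report):
    """Naive mechanism: trade iff buyer_report >= seller_report, at the midpoint price."""
    if buyer_report >= seller_report:
        return (buyer_report + seller_report) / 2
    return None  # no deal

true_buyer_value, true_seller_cost = 10_000, 8_000

# Truthful reports: price 9,000, so each side gains 1,000 in surplus.
print(split_the_difference(true_buyer_value, true_seller_cost))  # 9000.0

# If the buyer shades their report down to 8,500, the price drops to 8,250 and
# the buyer captures 1,750 of the surplus; shading below 8,000 kills the deal.
print(split_the_difference(8_500, true_seller_cost))  # 8250.0
```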
The typical way humans go about these negotiations is with emotional manipulation, extortion, artificial self-restrictions, social coercion, etc. (e.g. think of the stereotypical car salesman). This seems generally bad for epistemics, and as someone who has to negotiate a lot, I also find it personally very annoying.
I'd like to design a system that allows these negotiations to take place in a more incentive-compatible way: one that's faster to execute, doesn't reward skill at manipulating other people, and is less likely to lead to bad feelings afterwards. Obviously it can't be perfect and there will be some way of gaming the system, but humans aren't superintelligences. If the system makes the optimal strategy hard to calculate, and ensures that providing one's true valuation doesn't put a player at much of a disadvantage, I expect most people will comfortably fall back on being truthful.
How would you design such a system?
(I have an idea that I'll share later, but I don't want to prime people with a specific kind of approach from the beginning.)