Taking another tack - human beings are prone to failure. Maybe the system should accommodate some degree of failure as well, instead of punishing it.
I think one obvious measure would be a cap on the maximum percentage of a given user's upvotes/downvotes that may target any single other user, particularly over a given timeframe. Ideally, once the cap is reached, just prevent the user from upvoting/downvoting further on that user's posts or comments. This would help deal with the major failure mode of people hating one another.
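As a rough illustration of the cap idea, here is a minimal sketch in Python. All the names and thresholds (`MAX_SHARE`, `MIN_VOTES`, `vote_allowed`) are hypothetical, and the "recent votes" window is assumed to come from wherever the site stores vote history.

```python
# Hypothetical sketch of a per-pair vote cap: a voter may account for at
# most MAX_SHARE of their recent votes on any single author's content.
from collections import Counter

MAX_SHARE = 0.10   # assumed cap: 10% of a voter's recent votes
MIN_VOTES = 20     # don't enforce the cap until there's enough history

def vote_allowed(recent_vote_authors, target_author):
    """recent_vote_authors: authors targeted by this voter's recent votes."""
    total = len(recent_vote_authors)
    if total < MIN_VOTES:
        return True  # too little history to judge a pattern
    share = Counter(recent_vote_authors)[target_author] / total
    return share < MAX_SHARE

# Example: 19 of the voter's last 100 votes already target "alice",
# so a further vote on her content is blocked.
history = ["alice"] * 19 + ["others"] * 81
print(vote_allowed(history, "alice"))  # False
```

A percentage-based cap rather than an absolute count has the nice property that prolific voters aren't penalized for volume, only for concentration on one person.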
Another might be, as suggested somewhere else, preventing users from downvoting responses to their own posts/comments (and maybe prevent them from upvoting responses to those responses). That should cut off a major source of grudges. (It's absurdly obvious when people do this, and they do this knowing it is obvious. It's a way of saying to somebody "I'm hurting you, and I want you to know that it's me doing it.")
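The restriction above amounts to a simple parent-author check at vote time. A minimal sketch, assuming a comment is represented as a dict with hypothetical `author` and `parent_author` fields (`parent_author` being the author of the post or comment being replied to):

```python
# Hypothetical sketch: block downvotes on direct replies to one's own
# posts/comments. Field names are assumptions, not the real LW schema.

def downvote_permitted(voter, comment):
    # A user may not downvote a direct reply to their own material.
    return comment["parent_author"] != voter

reply_to_alice = {"author": "bob", "parent_author": "alice"}
print(downvote_permitted("alice", reply_to_alice))  # False: Bob replied to Alice
print(downvote_permitted("carol", reply_to_alice))  # True: uninvolved voter
```

The suggested extension (preventing upvotes on responses to those responses) would be the same check applied one level higher in the reply tree.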
A third would be - hide or disable user-level karma scores entirely. Just do away with them. It'd be painful to lose that badge of honor for longstanding users, but maybe the emphasis should be on the quality of the content rather than the quality (or at least the duration) of the author anyway.
Sockpuppets aren't the only failure mode. A system which encourages grudge-making is its own failure.
I agree with you that grudge-making should be discouraged by the system.
Another might be, as suggested somewhere else, preventing users from downvoting responses to their own posts/comments
Hmm. I think downvoting a response to one's material is typically a poor idea, but I don't yet think that case is typical enough to prevent it outright.
I am curious now about the interaction between downvoting a comment and replying to it. If Alice posts something and Bob responds to it, a bad situation from the grudge-making point of view is Alice both downvoting Bo...
Thanks to the reaction to this article and some conversations, I'm convinced that it's worth trying to renovate and restore LW. Eliezer, Nate, and Matt Fallshaw are all on board and have empowered me as an editor to see what we can do about reshaping LW to meet what the community currently needs. This involves a combination of technical changes and social changes, which we'll try to make transparently and non-intrusively.
Technical Changes
Changes will be tracked as issues on the LW issue tracker here. Volunteer contributions are very welcome and will be rewarded with karma; if you'd like to be paid for spending a solid block of high-priority time on this, get in touch with me. If you'd like to help, for now I recommend setting up a dev environment (as laid out here and here).
Some technical changes (links to the issues in the issue tracker):
--Nick_Tarleton
This is something I care about quite a bit! Ideally, the three people above would scrutinize every change and determine whether or not it's worthwhile. In practice, they're all extremely busy, and as I'm only very busy, I've been deputized to decide whether or not a change will be accepted. If you're unsure about a change, talk to me.
Trike still maintains the site, so it's still a Trike dev's call when a change makes its way to production (or whether it's too buggy to accept). We have a turnaround-time guarantee from Matt for any time-sensitive changes (though I imagine few changes will be).
Social Changes
The rationalist community is a different beast than it was years ago, and many people have shifted away from Less Wrong. Bringing them back needs to involve more than asking nicely, or the same problems will appear again.
Epistemic rationality will remain a core focus of LessWrong, and the sorts of confusion that you find elsewhere will continue to not fly here. But the forces that push people from Main to Discussion to Open Threads to other sites need to be explicitly counteracted.
One aspect is that just as emotion is part of rationality, informality is part of the rationalist community.
--Alicorn
Another aspect is dealing with the community's deepening and specializing interests.
A third aspect is focusing on effective communication. One of the core determinants of professional and personal success is being able to communicate challenging topics and emotions effectively with other humans. The applications for both instrumental and epistemic rationality are clear, and explicitly seeking to cultivate this skill without losing the commitment to rationality will both make LW a more pleasant place to visit and (one hopes) allow LWers to win more in their lives. But this is a long project, whose details this paragraph is too short to contain. I don't have a current anticipated date for when I'll be ready to talk more about this.
I expect to edit this post over the coming days, and as I do, I'll make comments to highlight the changes. Thanks for reading!