Also note that we significantly increased loading speeds on post pages, for the first time bringing them to a level that I feel is acceptable. Faster render speeds for the frontpage should be coming in the following week.
I'm going to use this space to float a potential voting-rule change that we've talked about a bit internally, but seemed like the sort of idea that'd be good to get feedback on before we got around to implementing.
People tend not to downvote nearly as often as they upvote.
In an ideal world, for purposes of detecting which users are epistemically trustworthy and/or insightful, we want neutrally-okay comments to yield 0 karma, bad comments to be negative, good comments positive. (So far, so obvious?)
[EDIT since I think people were misinterpreting this: I think a typical average comment should display positive-karma on the comment itself – the question is how it should contribute to a user's _total_ karma, and/or their total-ability-to-influence-the-site]
Because people are biased against downvoting, posting on LW tends to be net-positive, and as such you can "grind karma" just by being active.
From a motivational standpoint this isn't necessarily bad [Edit: indeed, is quite good – I think it's good for people to have a smooth reward curve as they get involved with the site]. But in terms of using karma for higher-level trust actions (such as moderating frontpage posts), it's somewhat worrying.
So, something we were considering was to have downvotes function normally for a given comment, but to apply the downvote at double or triple weight to the user's total karma. (Since people seem roughly 2-3 times as hesitant to downvote as to upvote.)
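To make that concrete, here's a minimal sketch of the rule (hypothetical function names and an assumed 3x multiplier, not actual LW code): the downvote hits the comment's displayed score normally, but counts extra against the author's total karma.

```python
# Hypothetical sketch of the proposed rule, not the actual LW implementation.
DOWNVOTE_USER_MULTIPLIER = 3  # assumed value; the proposal floats 2-3x

def apply_vote(comment_score: int, author_total_karma: int, vote_power: int):
    """vote_power is positive for an upvote, negative for a downvote."""
    comment_score += vote_power  # the comment's displayed score works as it does today
    if vote_power < 0:
        author_total_karma += vote_power * DOWNVOTE_USER_MULTIPLIER  # downvotes count extra
    else:
        author_total_karma += vote_power
    return comment_score, author_total_karma
```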
I strongly disagree.
I almost didn't make an account here at all because I was put off by how removed and cold the community seemed. I know of at least one person who wants to post, but is still too nervous. Magnifying downvotes would worsen this problem. If the average poster submitting decent (but not earth-shattering) content averages 0 karma per comment, the negative ones are going to sting. We all know about risk aversion, and I imagine it holds even more for status in a highly-critical community. In turn, this probably will drive people from the site (if 0 is the new average karma).
To me, 0 karma implies "no one cares" - not "average"; likewise, negative doesn't just connote "subpar", but actually terrible. I think it's fantastic that people can post and gain karma and feel good. It'd probably just be better to scrutinize post histories more when making moderation decisions.
The issue is specifically that there's a distinction between "how much we want people to feel rewarded as they engage with the site" and "how much power they should have over the culture and attentional direction of the site." I think the former should be quite liberal, because that's how motivation curves work. The latter currently seems inflated.
One (perhaps false) assumption I had was that people didn't pay as much attention to total karma (esp. on LW2.0 where you have to go out of your way to see it on your profile), so it wouldn't feel as much like a punishment.
By now I've updated a bit about which downvotes are most relevant (see Robby's comment), but some additional possibilities include:
FYI I am still pretty attentive to my total karma, mostly because I want to know where I am in relation to the cutoffs for extra voting power.
I like looking at my total karma too, partly because I like knowing about votes on my comments and if total karma hasn't changed then I assume I haven't received any votes recently.
I discussed this a bit with Turntrout, and one solution (that comes with its own problems) is some kind of karma-decay function that has you lose some fraction of your karma on a continuous basis. This would make the karma more similar to ranking systems in many online games, where the rate at which you can gain points determines your final stable ranking, instead of just the amount of time you play.
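As a rough sketch of what that could look like (my own toy numbers and function names, nothing implemented anywhere): each user's karma decays by a constant fraction per day, so steady-state karma tracks the rate of karma gain rather than total time on the site.

```python
import math

DAILY_DECAY = 0.005  # assumed: lose 0.5% of karma per day

def decayed_karma(karma: float, days_elapsed: float) -> float:
    # Continuous exponential decay of a user's karma total.
    return karma * math.exp(-DAILY_DECAY * days_elapsed)

# At steady state, karma is roughly (average daily karma gain) / DAILY_DECAY, which gives
# the "ranking reflects rate of gain, not time played" property described above.
```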
I really like the idea of a karma floor. Under some threshold (100, maybe higher), downvotes don't count against you. You could provide a report to admins showing posters who would be hugely negative without this, so they can take administrative action against the serious problems. I'd love to see per-post and per-comment floors as well - no comment should get much below -15 unless it's pure spam or hatred, in which case the moderator should remove it.
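A minimal sketch of what I have in mind, using the thresholds above (the numbers and function names are just my suggestion, not anything implemented):

```python
USER_KARMA_FLOOR = 100      # below this, downvotes stop counting against total karma
COMMENT_SCORE_FLOOR = -15   # a comment's displayed score never drops below this

def apply_downvote_to_user(total_karma: int, vote_power: int) -> int:
    # vote_power is negative for a downvote.
    if total_karma <= USER_KARMA_FLOOR:
        return total_karma  # the floor absorbs the hit entirely
    return max(USER_KARMA_FLOOR, total_karma + vote_power)

def clamp_comment_score(raw_score: int) -> int:
    # Admins could still see raw_score in a report, to act on seriously negative posters.
    return max(COMMENT_SCORE_FLOOR, raw_score)
```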
I think it’s fantastic that people can post and gain karma and feel good.
I can hardly overstate how profoundly wrong and misguided this attitude seems to me. As far as I can tell, this is antithetical to the entire point of a forum like this.
For whatever it's worth, the sentence above seems true to me, so I would be interested in you elaborating your point. It seems clear to me that we want a positive reward curve for contributions. Without a reward (of some form or another) people will not post to the site. Being part of that reward is the whole point of karma on Reddit and many other websites that use karma systems (and Reddit is after all where our karma system comes from). In our case, we are also adding an additional dimension of giving power to users with a lot of karma, but the dimension of karma serving as positive incentive for posting is definitely still an important function.
I actually had the thought of "I think Said will disagree with me on this, but I forget why" while writing that comment. Like habryka, I'm interested in hearing why you find that misguided; I'll keep an eye out for a reply to either comment (if you choose to explain further).
My instinct is often to upvote or downvote comments/posts based on how much karma I think they should display. E.g., maybe I think two comments by new users both deserve about 10 karma, but one is currently at 10 while the other is currently at 18. I might then strong-downvote the latter comment to bring it to 10, while ignoring the former comment. This is all well and good, except under your system, it would lead to two equally good comments conferring +9 karma on one new user and somewhere between -7 and -15 karma on another.
The ideal solution to this might be for me to try to retrain my voting habits rather than modify the system to accommodate them. This is harder if my voting habits are shared by others, though.
One option might be to weight downvotes more heavily the lower the post/comment's karma was when the downvote occurred? I'm a lot more willing to downvote (and strong-downvote) something that currently has +70 karma than something that currently has +10 karma, because I'm likelier to think that the +70 is an overestimate and that lowering that total a bit is harmless. But that greater willingness means that my average downvote of a +70 post means a lot less than my average downvote of a +10 post.
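As a rough illustration of what I mean (an arbitrary weighting curve, purely hypothetical), a downvote on something at +70 could count for much less against the author's karma than the same downvote on something at +10:

```python
def downvote_weight(current_score: int) -> float:
    # Assumed curve: full weight at or below +10, tapering down to 0.2 by around +70.
    if current_score <= 10:
        return 1.0
    return max(0.2, 1.0 - (current_score - 10) / 75)

# downvote_weight(10) == 1.0, downvote_weight(70) == 0.2
```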
People tend not to downvote nearly as often as they upvote. In an ideal world, for purposes of detecting which users are epistemically trustworthy and/or insightful, we want neutrally-good comments to yield 0 karma, bad comments to be negative, good comments positive... But, because people are biased against downvoting, posting on LW tends to be net-positive
It sounds to me like you think the first sentence implies the last one. It doesn't - if most comments are good, it makes sense that you'll get more upvotes than downvotes. This would only fail to be true if "neutrally-good" were the empirical average comment quality, either by definition or just as a fact about commenters - which seems, respectively, undesirable or unlikely to me.
I don't think the first sentence necessarily implies the last one, but I do think in practice it's often the case. I noticed this in particular in my own voting patterns, in that I wasn't willing to downvote comments that I thought were actively making the site bad because it seemed too mean.
(that said, I think Robby's articulation of the thing seemed closer to the issue – that downvoting behavior is specifically distorted when a comment's karma is low)
I want to be clear that I don't think an average comment should intrinsically get 0 karma. But a comment that provides zero Bayesian evidence about your posting quality should receive 0 karma (somewhat tautologically).
I wasn't willing to downvote comments that I thought were actively making the site bad because it seemed too mean.
Can we bring this up to a top-level understanding in our voting system? It seems like we're confusing stocks and flows in a way that makes it hard to model. There are a bunch of dimensions in the decision to downvote, which get collapsed into one decision:
1) is the post good or bad
2) how has the post been judged so far (current post karma)
3) how will the post be judged in the future (for newish posts, what's its equilibrium karma)
4) how much karma does the poster have, and how will he/she react to my downvote
Mostly, I'd like to ask what's wrong with the situation where you're not willing to downvote comments actively making the site bad because it seems mean? If it's already sufficiently downvoted, I'd argue it _IS_ mean to further downvote it, and you _SHOULD_ choose not to be mean. I really wish I could downvote all the downvotes on a mildly-bad and extremely-unpopular comment on a popular topic.
Background model: my biggest threat model for ways LW can die is commenters that are juuust under the line of "bad enough that it naturally feels right to downvote them." There are a lot of new commenters where, if I see a single comment of theirs, I want to give them the benefit of the doubt. But then it becomes clear that there's a pattern there, and that if the site filled up with comments like theirs it'd rapidly become an unfun place to be (i.e. comments that are slightly clueless, slightly confusing, or slightly uncharitable).
This doesn't necessarily have to be resolved with downvotes. Given infinite moderator-time, you could resolve it with PMs to each individual person. But given the resources we have that's not really practical.
That deserves a top-level post: "what's your threat model for the future of LW". The death spiral that worries me most is that new commenters/posters are discouraged because their first post inevitably misses some background, common knowledge, or unstated rule, and gets downvoted more than it "should" be. Existing posters eventually get discouraged (or just exhaust their interest), and the site is an echo chamber for the few people smart enough to get upvotes but not smart enough to realize that nobody cares.
Alternately, the site becomes filled with karma whores trying to guess the password to optimize their scores, and diluting the real value of content-based discussion.
first post inevitably misses some background, common knowledge, or unstated rule, and gets downvoted more than it "should" be
There are definitely (at least) two possible failure modes to fall into here. But I'm currently not that worried about newcomers getting downvoted because it empirically doesn't happen that often (we recently set up moderator-sidebar tools that keep mods in the loop about new comments – we see all comments that end up with one-or-less karma, so we have some sense of what trends are common there).
The biggest complaint we hear about is about overly-critical comments (which I do think contributes to downvotes feeling harsher). But most of those overly-critical comments are from lower-to-mid-karma users. One of the problems I'm hoping to solve here is to avoid making "superficial criticism" a reliable way to grind karma and gain disproportionate control over the site.
Meanwhile, we have a clear case study of how LessWrong died the first time, and this was largely because the experienced users started drifting off – a process that accelerated as the overall quality of the site went down. (This is discussed in some detail on Habryka's Strategic Overview post, with Scott Alexander's quote being a succinct description of the problem)
Based on lots of user interviews, it seems like a top priority is making sure LW is a productive place for the top contributors to engage in discussion. This provides the core of content that continues to attract new users in the first place.
I think this is also my perspective. I don't think that the average comment should receive 0 karma, the average comment is clearly quite strongly net positive. Posting on LW should get you some kind of net-positive reward, though it's not clear that that reward should be direct power over other people's experience of the site.
I am not sure the solution is ideal, but I agree that this needs a solution. The “grind karma” problem is real. If you can’t think of another way to solve that problem (and I can’t, at the moment—if I think of something, I’ll post!), then I think your solution is a good one.
I really hate the focus on karma. My preference would be just to use +1/-1 on posts, and not store totals at all (or to cap at 100 or so).
That said, it's an interesting puzzle how to encourage the behaviors you want (and how to even describe the behaviors you want, in measurable terms). I think I want neutrally-OK comments (and posts) to be positive karma - more content is better, and I want to encourage participation. A zero-karma comment is an indication to me that I've wasted people's time with no benefit at all.
Some problems you haven't considered (or at least haven't talked about) are:
1) how to distinguish between lack of votes and mixed up- and down-votes. If 100 people have viewed a comment and nobody has voted, it's a boring but not harmful post. If it has 40 upvotes and 40 downvotes (or even +30, −40), it's a high-value comment that people care about.
2) relatedly, how to capture the value of simple existence of content that's not good (or bad) enough to get votes. If 50 people have read a comment, and nobody has voted (or the votes balance out), it should still get some credit.
3) strategic voting. I'm far more likely to downvote a post or comment if it seems mediocre but has high karma than if it seems mediocre and has low karma. Same for upvoting - I don't bother with things already upvoted by others.
4) Time. New posts/comments are going to have low karma because nobody's had a chance to vote. New comments on old posts can be in this state for a long long time. For posters, it's impossible to distinguish between an old-timer who's made a whole lot of small-value comments and posts vs someone who's made a few very-high-value posts. This is even worse for karma from the old site, where the same content was worth 10x as much as a post as it was as a comment.
One idea might be for people to set level rather than direction for their votes - each user enters their desired level for a post/comment to reach, and their directional vote is always in the direction of that level. If I set a level of 20 when a comment is at 5, it's an upvote. It remains an upvote unless the value goes above 20, when it becomes a downvote.
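A minimal sketch of how I picture that working (made-up function name, not a concrete spec): each voter records a target level, and their vote counts as up or down depending on whether the comment's current score is below or above that target.

```python
def effective_vote(target_level: int, current_score: int) -> int:
    # Returns +1 (upvote), -1 (downvote), or 0, based on the voter's target level.
    if current_score < target_level:
        return +1
    if current_score > target_level:
        return -1
    return 0  # the comment is already at the voter's desired level
```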
Also, make karma decay (perhaps not to 0, but maybe to 1000) so people active years ago don't have unfixably-high karma today.
3) strategic voting. I'm far more likely to downvote a post or comment if it seems mediocre but has high karma than if it seems mediocre and has low karma. Same for upvoting - I don't bother with things already upvoted by others.
I've noticed many people saying this, and I don't see the value in voting that way. Your vote should carry information about your preferences about posts, not your preferences about the displayed information about the collective preferences about posts. That's the best way to capture the collective preferences, isn't it? As an example of the flaws of the latter perspective, changing the view order of a post can change its final karma, even though averages don't have an order.
Edit: strategic voting in general seems similar to Defect: it gives the individual greater effectiveness at conveying their preferences, but makes the collective preference estimates less accurate. (It's a negative-sum choice.)
What Dagon said. Your advice makes sense if the main signal people received is "this received one -5 vote, two -4 votes, one -1 vote, three +1 votes, and five +2 votes", but not if people are just receiving a "net upvotes" summary number. By default, the aggregate effect of everyone trying to "vote according to what's really in their heart" and disregard current vote totals is that either (a) lots of content gets absurdly, unwarrantedly high/low karma totals because people's opinions are correlated, or (b) lots of content gets no upvotes or downvotes at all because people are trying to correct for the possibility that things will be over-voted (even though they can see with their own eyes whether a vote total is currently too high or too low).
Perhaps this is a reason to replace the "net upvotes" system with one that lists the number of votes (at different levels).
I briefly had it display totals of each vote type, and immediately found myself having a bad experience whenever I saw downvotes. With the current setup, even though I can still roughly infer the downvotes from the total number of votes and the total score, it feels fine instead of upsetting.
Not sure if that generalizes but it's the reason we have the current configuration.
I suspect your reaction is a common one, but not universal and I have no way to guess whether it's the majority. I STRONGLY prefer to see my downvotes, and at times have sought to make posts that were controversial (got both up- and down-votes) rather than popular.
lots of content gets absurdly, unwarrantedly high/low karma totals because people's opinions are correlated
How is this absurd and unwarranted? The numerical value doesn't have any inherent meaning (as it would if any posters who received at least 1000 karma were given moderating powers, ferex). Is it that it produces an unusual vote distribution? A possible solution would be to adjust total votes downward by a factor that increases with total votes, if this is a problem, but I disagree that any solution is needed.
"People's opinions are correlated" is just another way of stating "many people agree about X", and that sounds like something that the voting system should be able to record (perhaps the only thing - isn't the voting system just a way of recording public opinion on a post?).
Votes determine comment order, which is a counterexample to my claim of "the numbers don't really matter," but that's scale-invariant so my point holds. But perhaps low-value posts are inflated more than high-value posts? But if LW voters tend to upvote low-value posts, then I think there's a larger issue that can't be solved by people occasionally throwing a wrench into inflating post-votes (how do we know that the strategic downvotes will correlate with low-value posts when upvotes can't do the same?).
lots of content gets no upvotes or downvotes at all because people are trying to correct for the possibility that things will be over-voted (even though they can see with their own eyes whether a vote total is currently too high or too low).
All the votes, past and future, are combined into one total. If someone aims for the final vote to be X, and their expected final vote is X, then why should they vote? That's just rational strategic voting. (However, possible details that reverse the optimal decision: everybody acts the same way and nobody votes. But wouldn't people realize that they all think that way? Alternatively, snowballing votes means that your vote could tip the final vote to either zero or larger than your wanted value, but only if people look at the total vote when deciding how to vote, which wouldn't happen in this hypothetical scenario where people vote from their hearts.)
I second this. This is a good expression of the view that underlies a lot of what I’ve said on this topic.
Your vote should carry information about your preferences about posts, not your preferences about the displayed information about the collective preferences about posts.
Nope. My vote is not an individual message to the poster. It's simply a change to the total that the poster sees. I submit my vote as a direction, but it's received as a sum.
Since I know that the signal received is a sum, I prefer to influence the sum rather than just picking a direction which may lead to a very different sum than I want.
I agree that people have preferences about vote sums, and that each individual's preferences can be better realized through strategic voting. The crux is that I think that strategic voting worsens our ability to estimate the collective opinion of LW on a post. Strategic voters act to absorb votes past a certain point that they choose, which means that the added presence of non-strategic voters may not have any effect on the total vote. (Also note the order dependence, which probably indicates something wrong.) Perhaps we could accept these costs in exchange for some gain, but I don't see what collective gain there is from strategic voting.
Perhaps we could accept these costs in exchange for some gain, but I don’t see what collective gain there is from strategic voting.
Vote total from strategic voting can better reflect a post or comment's quality as opposed to its quality*readership (number of people who read a post/comment), which is what you would get if people did non-strategic voting. With the latter, it's hard to tell whether a post's vote total is high because people think it's very high quality, or if it's just moderately high quality but read (and hence voted on) by a lot of people.
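Here's a toy model of that distinction (my own construction, with made-up parameters): non-strategic voters each vote their opinion, so the total scales with readership, while strategic voters stop voting once the total reaches their target, so the total settles near the consensus target regardless of how many people saw the post.

```python
import random

def nonstrategic_total(quality: float, readers: int) -> int:
    # Each reader upvotes with probability equal to the post's quality (in [0, 1]).
    return sum(1 for _ in range(readers) if random.random() < quality)

def strategic_total(target: int, readers: int) -> int:
    # Voters push the total toward a shared quality-based target, then stop voting.
    total = 0
    for _ in range(readers):
        if total < target:
            total += 1
    return total

# A quality-0.5 post read by 1000 people ends up far above a quality-0.9 post read by
# 50 people under non-strategic voting; under strategic voting the totals track quality.
```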
I've changed my mind - I think strategic voting might send more information than karma-blind voting. It counteracts visibility spirals as you describe. There might also be another effect: consider a community of identical, deterministic, karma-blind voters. Disregarding visibility spirals, everything gets sorted into five categories (corresponding to the number of ways for a single user to vote). In reality, deterministic and karma-blind voters aren't identical, so karma still varies smoothly. But is "people are different" the only way information should be sent? Doesn't a group of identical voters hold more than a quint of useful information? This is why I have a vague suspicion that strategic voters can send more information - they send more information in a degenerate case.
The thing is that any system needs to be resilient against people voting strategically (with people having different goals and opinions about how to vote and why). Given that at least a nontrivial chunk of people vote this way, if we don't want them to, it seems like the preferred solution is to change the voting system so that it no longer incentivizes that in the first place.
I agree with this statement, but I don't think you have as much control as you'd like over people's perception of votes, and over what the incentive actually is for diverse individuals.
My incentives in voting are:
I think people perceive different types of vote systems fairly differently. I'm actually talking about bigger changes than you're probably thinking of.
If I wanted to incentivize voters to "vote their true beliefs without regard for the rest of social consensus" (what Wakalix seems to be describing, and which I think is orthogonal to the set of things you're talking about – you're pointing at what posting behavior the votes should incentivize, and I think Wakalix is talking more about voting behavior), I would:
a) hide the karma before they vote
b) use something closer to a 1-5 star rating system (not precisely that, but closer). I think this would dramatically change the relationship with voting, and would much more naturally output the voting behavior that Wakalix describes.
Strong-upvoted, mostly because I had a positive system-1 reaction to the line "I really hate the focus on karma." I was going to straightforwardly agree with it, but then thought about it some more and came up with the following:
@ mostly the mod team: I'm not necessarily sure I grok why this site seems to care so much about karma, and I'm curious about it. I'm getting the impression that maybe it's more important than I thought though, as a tool for guiding site culture. Like, every time I see a post talking about changing the karma system, my first thought is "Whoah, isn't this way overthinking it? Why is this such an important issue?" Then I remind myself "oh, yeah, maybe it's for shaping site culture, which is important I guess, so maybe this is important." But then the next time I have the same system 1 reaction of "Why bother caring about karma so much?" Now I'm new here, and wasn't around for the death of old-LW (I came in around the time LW2 started), so maybe this is just due to the fact that "LW dying" isn't a particularly salient possibility for me, so I'm not worried so much about the nitty-gritty details of how to shape incentive gradients on the site so that doesn't happen.
I would also say my intuitive reaction is that low-positive karma seems the right place for neutral comments. I'm not sure I like the "levels" idea, just because I don't know how to determine what level I want a comment to be at on a scale that goes from -infinity to +infinity.
Appreciate spelling out your reasons and thoughts here.
I think, ideally, karma would fade into the background and not be something people overly worry about. BUT we still need a way to determine what posts get what level of visibility.
I think you mostly answer the question the way I would have – we think maintaining (and improving!) the site culture is really important, and this should ideally be as seamless a part of the site as possible.
I basically agree about low-positive karma being correct for what you see on a given comment. The question is about how the karma for that comment should translate into your longterm ability to influence the site. The easiest way to hand out higher-level privileges is automatically based on total karma. There are other ways one could think about this.
[Edit: I'd add that I see us as fortunate enough to have a dedicated userbase that cares about getting online discussion right, and this provides a rare and valuable opportunity to experiment.
Looking at most of the internet, you can see how technology shapes discussion. The sort of conversation that twitter incentivizes is different from what facebook incentivizes, which is different from what reddit incentivizes – but all of those also share certain features by optimizing along a sort of "lowest common denominator" axis. Simple karma systems encourage posts with mass appeal, which isn't necessarily the same as high-quality discussion.
LessWrong has the potential to deliberately engineer a platform and culture that is robustly focused on high quality discourse. Our approaches to this manifest as a lot of discussion of karma, but one of the underlying pieces of that is an approach to experimentation]
This might make people even more reluctant to downvote. If a downvote removed a thousand points of karma, I would almost never use one. I'm more comfortable giving metaphorical slaps on the wrist the lighter the slap, so to speak. It's possible I'm typical minding here.
That said, if you did this and did not announce it, my downvote habits wouldn't change and this would work more or less as intended.
LessWrong already has much stronger downvotes than StackOverflow, and on StackOverflow I think that system works well. I think it's okay to give people more moderation power simply by virtue of participating, because people who participate a lot also develop a good idea of what the local standards happen to be.
(Epistemic status: conjecture and what I remember from other people discussing karma.)
They're trying to be more selective in deciding who has moderation power than "old and active users," since this has multiple failure modes. LW1 died due to a feedback loop of decreasing quality, and we should design LW2 to avoid this failure mode.
It does! Quickly double-tapping just zooms the screen, but I figured it out pretty quick. Thanks.
I haven't spoken up much about this UI, as I assume my situation is unusual. Most of the time I'm on a standard desktop or mobile browser, and all is fine. However, I'm somewhat regularly (few times a week) on a touchscreen UI remotely executing a desktop browser. This is read-mostly for me because the "keyboard" is painful, and it doesn't really bother me that I can't control my upvotes - I'd rather not care about the difference, so whichever default is fine.
Git Commit: 6a0c7bf54a29f2909ab50fc76da3b43854d0e112
We deployed recently, mostly with under-the-hood bugfixes and refactors that are aimed at making it easier to fork the LW codebase and restyle it.
Two concrete user-facing changes are: