steven0461 comments on What bothers you about Less Wrong? - Less Wrong Discussion

18 Post author: Will_Newsome 19 May 2011 10:23AM

Comment author: NihilCredo 21 May 2011 07:06:52PM 11 points

In roughly decreasing order of annoyance:

A varying degree of belief in utilitarianism (ranging from a confused arithmetic altruism to hardcore Benthamism) seems to be often taken for granted, and rarely challenged. The feeling I get when reading posts and comments that assume the above is very similar to what an atheist feels when frequenting a community of religious people. The fix is obvious, though: I should take the time to write a coherent, organised post outlining my issues with that assumption.

A little Singularitarianism, specifically the assumption that self-improving AI = InstantGod®, and that donating to SIAI is the best possible EV for your cash. This isn't a big deal because they tend to be confined to their own threads. (Also, in the thankfully rare instance that someone brings up the Friendly AI Rapture even when it brings nothing to the conversation, I get to have fun righteously snarking at them, and usually get cheap karma too, perhaps from the other non-Singularitarians like me.) But it does make me feel less attached and sympathetic to other LessWrongers.

Of late, there's a lot of concern about what content should be on this site and about how to promote the site and its mentality to the 'muggles'. This kind of puzzles me, because I treat LW as just a place where mostly smart INTJ people hang out and flex their philosophical muscles when they feel like it, and I don't feel particularly interested in missionary work. While I do find it desirable to make more people more rational, I thought everyone here - except for those who get their paycheck from SIAI/FHI, I guess - had better and more efficient purposes to which to dedicate their precious, precious willpower-to-do-stuff-I-don't-enjoy than writing posts they don't really feel like writing. If providing "hardcore" content to LW feels like a chore, then we have a tragedy of the commons on our hands, and the real question is whether the site is important enough to warrant implementing one of the standard workarounds to that problem.

Eliezer's reduced presence. Other contributors' posts are even more productive and useful than his, but none are quite as enjoyable to read.

Some top contributors regularly get double-digit karma for utterly trivial comments. Can't think of a fix that would be less annoying than the issue.

More Anglo prevalence than I would have expected for a site like this.

No Auto-Pager script for people's histories.

Comment author: steven0461 21 May 2011 10:59:39PM 2 points

A varying degree of belief in utilitarianism (ranging from a confused arithmetic altruism to hardcore Benthamism) seems to be often taken for granted, and rarely challenged.

This simply isn't true. See, for example, the reception of this post.

Comment author: NihilCredo 22 May 2011 12:15:33AM 4 points

Altruism is a common consequence of utilitarian ideas, but it's not altruism per se (which is discussed in the linked post and comments) that irks me; rather, it's the idea that you can measure, add, subtract, and multiply desirable and undesirable events as if they were hard, fungible currency.

Just to pick the most recent post where this issue comes up: here is a thread that starts with a provocative scenario and challenges people to take a look at what exactly their ethical systems are founded on, but - with only a couple of exceptions, which include the OP - people just automatically skip to wondering "how could I save the most people?" (decision theory talk), or "what counts as 'people', i.e. those units of which I should obviously try to save as many as possible?". There's an implicit assumption that any sentient being whatsoever = 1 'moral weight unit', and that it's as simple as that. To me, that's insane.

Edit: The next one I spotted was this one, which is unabashedly utilitarian in outlook, and strongly tied to the Repugnant Conclusion.

Comment author: steven0461 22 May 2011 12:25:31AM 0 points

Fair enough; I guess komponisto's comment in this thread primed me to misinterpret that part of your comment as primarily a complaint about utilitarian altruism.