PlaidX

I don't like contentless discussions of art either, but spewing paragraph after paragraph of awkward, stilted jargon about your hypothetical personal feelings isn't content, especially when those feelings relate to a movie you haven't even seen!

If my friend says "That movie sucked", and I disagree, I ask "why".

If my friend says "I liked the animation, but the timing is terrible. Everyone telegraphs their reactions", that's a discussion of the film that's actually going somewhere.

If my friend says "Like everyone, I enjoy the physical experience of laughter, but-" and five minutes later they're still talking, I take a moment to look back at my life and wonder how I possibly thought it would be a good idea to see a movie with this person.

PlaidX

The latter. Actually, I guess I still consume a lot of unknown things, but now almost exclusively online, where when the thing sucks, you can instantly move on to something else.

Much better to download a movie and watch five minutes of it and delete it than to coordinate going to the theater with someone, buy overpriced popcorn, watch a bunch of ads, then sit through an hour and a half of something you don't really like.

I can't really tell whether this is me failing to appreciate some aspect of human experience, or just that the way people tend to do things is stupid.

PlaidX

Yeah, really what I find to be the ugliest thing about lesswrong by far is the sense of self-importance, which contributed to the post deletion quite a bit as well.

Maybe it's the combination of these factors that's the problem. When I read mainstream philosophical discourse about pushing a fat man in front of a trolley, it just seems like a goofy hypothetical example.

But lesswrong seems to believe that it carries the world on its shoulders, and when they talk about deciding between torture and dust specks, or torture and alien invasion, or torture and more torture, I get the impression people are treating this at least in part as though they actually expect to have to make this kind of decision.

If all the situations you think about involve horrible things, regardless of the reason for it, you will find your intuitions gradually drifting into paranoia. There's a certain logic to "hope for the best, prepare for the worst", but I get the impression that for a lot of people, thinking about horrible things is simply instinctual and the reasons they give for it are rationalizations.

PlaidX

Considering this style of thinking has led lesswrong to redact whole sets of posts out of (arguably quite delusional) cosmic horror, I think there's plenty of neurosis to go around, and that it runs all the way to the top.

I can certainly believe not everybody here is part of it, but even then, it seems in poor taste. The moral problems you link to don't strike me as philosophically illuminating; they just seem like something to talk about at a bad party.

PlaidX

I've found that I have the opposite problem. When given the opportunity to try something new, I take it, thinking "maybe this time", and invariably regret doing so.

Now I order the same food every time in restaurants, never go to shows, and am a happier person for it.

PlaidX

Someone even more cynical might say that lesswrong only departs from mainstream skeptical scientific consensus in ways that coincidentally line up exactly with the views of eliezer yudkowsky, and that it's basically an echo chamber.

That said, rational thinking is a great ideal, and I think it's awesome that lesswrong even TRIES to live up to it.

PlaidX

I haven't read TOO much mainstream philosophy, but in what I have, I don't recall even a single instance of torture being used to illustrate a point.

Maybe that's what's holding them back from being truly rational?

PlaidX

I've heard that before, and I grant that there's some validity to it, but that's not all that's going on here. 90% of the time, torture isn't even relevant to the question the what-if is designed to answer.

The use of torture in these hypotheticals generally seems to have less to do with ANALYZING cognitive algorithms, and more to do with "getting tough" on cognitive algorithms. Grinding an axe or just wallowing in self-destructive paranoia.

If the point you're making really only applies to torture, fine. But otherwise, it tends to read like "Maybe people will understand my point better if I CRANK MY RHETORIC UP TO 11 AND UNCOIL THE FIREHOSE AND HALHLTRRLGEBFBLE"

There are a number of things that make me not want to self-identify as a lesswrong user, and not bring up lesswrong with people who might otherwise be interested in it, and this is one of the big ones.
