I've read a couple of Lou Keep's essays in this series and I find his writing style very off-putting. It seems like there's a deep idea about society and social-economic structures buried in there, but it's obscured by a hodgepodge of thesis-antithesis and vague self-reference.

As best I can tell, his point is that irrational beliefs, like belief in magic (specifically, protection from bullets), can be useful to a community (by encouraging everyone to resist attackers together) even though they don't benefit the individual (magic doesn't actually prevent death when shot). He relates this to Seeing Like A State: any attempt by the state to increase legibility by spelling out the practice's benefits makes those benefits disappear.

He further points out that political and economic policies tend to focus on measurable effects, whereas the ultimate point of governments and economies is to improve people's subjective wellbeing (happiness, although he says that's just a stand-in for something else he doesn't feel like explaining).

Extending that, he thinks we have probably lost some key cultural traditions that were very important to the quality of people's lives, but weren't able to thrive in a modern economic setting. He doesn't give any examples of that, although he mentions marriages and funerals as examples of traditions that have survived. Still, it seems plausible.

Overall, it reminds me of Scott Alexander's essay How the West was Won, about the advance of universalist (capitalist) culture and its ability to out-compete traditional systems whether or not it actually improves people's lives. Moloch is also relevant.

It's very likely I've missed a key aspect here. If anyone knows what it is, please let me know.

It seems like there's a deep idea about (...) buried in there, but it's obscured by a hodgepodge of thesis-antithesis and vague self-reference.

This is a standard technique for appearing deeper than one is. If you never say exactly what your idea is, no one can find a mistake in it. If people agree with you, they will find an interpretation that makes sense to them. (If the interpretation is good, you can take credit. If the interpretation is wrong, you can blame the interpreter for the lack of nuance.) If people disagree with you, they cannot quote you, so you can accuse them of attacking a strawman.

Or it simply buys you time and outsources research. You can play with the idea, observe what is popular and what is not, gradually converge on something, and pretend that this is what you meant from the beginning. (Note: there is nothing wrong with throwing a few random ideas at the wall and seeing what sticks, as long as you admit that this is what you are in fact doing.)

There is also nothing wrong with feeling out an idea: trying to figure out which parts of it look OK and which don't, checking how it fits with other ideas and where the corners catch, etc. Lack of precision is bad at the end of the process but highly desirable at the beginning. ("Premature optimization is the root of all evil" :-D)

[anonymous] 7y

Yeah, that's a very good summary of what I think he's pointing to; much better than I could have done.

As far as essays go, Lou's stuff seems rambly, but it's also novel enough to pique my interest, I guess? It's not as far off the deep end as Ribbonfarm, and it has enough ideas that are novel (to me), like the cultural traditions argument (I especially like how he points out that inferring the past dominance of certain traits from their prevalence today could be faulty), that I enjoy it.

Yeah, I have a lot of difficulty understanding Lou's essays as well. Nonetheless, there appear to be enough interesting ideas there that I will probably reread them at some point. I suspect that writing a summary as I go of the point he is making might help clarify things.

[anonymous] 7y

Quick summary: an interesting look at how benefits to the group can lead individuals to hold irrational beliefs, and the implications this has for the kind of social structure where such beliefs arise, set in the context of witchdoctors and "bullet-proofing" rituals.

(This was on deluks' Rational Feed, but I found it interesting enough to merit its own discussion.)