Comments

Armchair psych, agreed. The one new bit was that patients performed poorly on the Cognitive Reflection Test, but yeah, that's not news. I'm sure most people in most circumstances would perform poorly on the CRT. System 1 is like original sin: it's the default state that takes effort to overcome.

Done. Fairly high confidence that I'm still the lone Filipino LessWronger.

Upvote, valuable insight. And meta thinking is switching from Tiger style to Crane style as the situation warrants. Good idea to have a set of modules ready to go.

I guess signalling non-agency is tactical level; protective camouflage, poker bluffing etc. Agenty thinking as above is essentially strategic, winning with moves that are creative, devious, hard to predict or counter, going meta, gaming the system. Pretending to be a loyal citizen of Oceania is a good tactic while you covertly work towards other goals.

For cultural agency, the Wikipedia page on locus of control is one place to start. There's also the Power Distance Index in Gladwell's Outliers.

Just finished reading K.J. Parker's Devices and Desires. What struck me at first was "Eh, no, medieval people didn't think like that," but after mentally shifting gears to thinking of it as an author tract like HPMOR, with modern characters in a quasi-historical setting, it was much more enjoyable.

Schelling's Strategy of Conflict says that in some cases, advertising non-agency can be useful: something like "If you cross this threshold, that will trigger punitive retaliation, regardless of cost-benefit; I have no choice in the matter."

Hello again. Used to post as "ZoneSeek" but switched to my real name. I'm from the science/science fiction/atheist/traditional rationality node, got linked to LW years ago through Kaj Sotala back in the Livejournal days. I have high confidence that I am the only LessWronger in the Philippines.

Upvote. The Drake Equation and SETI seem at least as relevant as, say, Pascal's Mugging. GIGO, sure, but a standard dismissal in statistics is to say there's not enough data, more research needed. Isn't this where Bayes is supposed to win over frequentism, when it comes to imperfect or incomplete information?

Babyeater FAI would be very different, but it could still give us big hints on how to make human FAI. It's the standard science process: instead of reinventing the wheel, stand on the shoulders of giants and learn what other smart people who've come before have figured out.

Yo. I've been around a couple of years, posted a few times as "ZoneSeek," and re-registered this year under my real name as part of a Radical Honesty thing.
