What they mean by "trust," is roughly, an expectation that someone will model their own interests with pretty good accuracy and generally work to fulfill them. They will act as an agent seeking their fully general benefit.
That's what I was referring to. Not "do I trust you to do X?" but "are you my ally?"
In this model, if the set of things I need to get right is not the same as the set of things a given normal needs to get right, they may give me dangerously bad advice and have no notion that there's anything wrong with this. Someone who will happily mislead me is not a good ally.
If you're a nerd, you might read all this and think I'm being hard on normal people (how can you say such awful things about them, e.g. that they're not logically consistent and that they don't ponder before answering questions?)...
Well, yes. This isn't just saying normal people aren't logically consistent, or that they don't put much effort into logical consistency. It's saying that they have no concept of truth. (Gut reaction: how could anyone ever trust such a person?)
while if you're a normal person reading t...
Gut reaction: how could anyone ever trust such a person?
You can't trust(1) them, i.e. count on them to accurately report their likely future behavior or give you an accurate account of what they know. You can trust(2) them, i.e. believe that they're part of your coalition and expect them to act to favor that coalition over outsiders, by observing their costly signals of commitment: e.g. adherence to the narrative even when it's inconvenient, or hard-to-fake signals of affection and of valuing your well-being.
Related: Authenticity and instant readouts
The "boiling point of nitrogen" conversational norm may be original to Boston, but the descriptive phrase "boiling point of nitrogen discussion" for the sort of thing the norm aims to avoid was in use at Caltech around 2005 and I'm fairly sure it originated there.
Would it be feasible to get consistency by using only LocalStorage and no cookie?
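One caveat worth noting: localStorage is per-origin and, unlike a cookie, is never sent to the server automatically, so the client would have to transmit the stored identifier itself on each request. A minimal sketch of the client-side half, with hypothetical names (`clientId`, `getOrCreateClientId`) and a tiny in-memory stand-in for localStorage so the logic runs outside a browser:

```javascript
// Use the real localStorage in a browser; fall back to an in-memory
// stand-in (assumption: this shim only exists so the sketch runs anywhere).
const store = (typeof localStorage !== "undefined")
  ? localStorage
  : (() => {
      const m = new Map();
      return {
        getItem: (k) => (m.has(k) ? m.get(k) : null),
        setItem: (k, v) => { m.set(k, String(v)); },
      };
    })();

// Return a stable per-origin identifier, creating it on first use.
function getOrCreateClientId(key = "clientId") {
  let id = store.getItem(key);
  if (id === null) {
    // Stand-in for a real UUID generator.
    id = Math.random().toString(36).slice(2);
    store.setItem(key, id);
  }
  return id;
}

const first = getOrCreateClientId();
const second = getOrCreateClientId(); // same value on repeat calls
```

To get server-side consistency, the page would then attach this ID to requests explicitly (e.g. in a header or request body), which is the work a cookie would otherwise do for free.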