It's actually not just about lie detection, because the technology starts to shade over into outright mind reading.
But even simple lie detection is an example of a class of technology that needs to be totally banned, yesterday[1]. In or out of court and with or without "consent"[2]. The better it works, the more reliable it is, the more it needs to be banned.
If you cannot lie, and you cannot stay silent without adverse inferences being drawn, then you cannot have any secrets at all. The chance that you could stay silent, in nearly any important situation, would be almost nil.
If even simple lie detection became widely available and socially acceptable, then I'd expect many, many people's personal relationships to devolve into constant interrogation about undesired actions and thoughts. Refusing such interrogation would be treated as "having something to hide" and would result in immediate termination of the relationship. Oh, and secret sins that would otherwise cause no real trouble would blow up people's lives.
At work, you could expect to be checked for a "positive, loyal attitude toward the company" on as frequent a basis as was administratively convenient. It would not be enough that you were doing a good job, hadn't done anything actually wrong, and expected to keep it that way. You'd be ranked straight up on your Love for the Company (and probably on your agreement with management, and very possibly on how your political views comported with business interests). The bottom N percent would be "managed out".
Heck, let's just have everybody drop in at the police station once a month and be checked for whether they've broken any laws. To keep it fair, we will of course have to apply all laws (including the stupid ones) literally and universally.
On a broader societal level, humans are inherently prone to witch hunts and purity spirals, whether the power involved is centralized or decentralized. An infallible way to unmask the "witches" of the week would lead to untold misery.
Other than wishful thinking, there's actually no reason to believe that people in any of the above contexts would lighten up about a given sin if they discovered it was common. People have an enormous capacity to reject others for perceived sins.
This stuff risks turning personal and public life into utter hell.
You might need to make some exceptions for medical use on truly locked-in patients. The safeguards would have to be extreme, though. ↩︎
"Consent" is a slippery concept, because there's always argument about what sorts of incentives invalidate it. The bottom line, if this stuff became widespread, would be that anybody who "opted out" would be pervasively disadvantaged to the point of being unable to function. ↩︎
Yes, this is why I put "decentralized" in the title even though it doesn't really fit. What I was going for with the post is that you read the paper yourself, except whenever the author writes about law, you think for yourself about substituting the various applications that you care about (not courts) into the complex caveats that the author was writing about (while they were thinking about courts). Ideally I would have distilled it, but the paper is a bit long.
This credibly demonstrates that the world we live in is more flexible than it might appear. And on the macro-civilizational scale, this particular tech looks like it will, on net, place honest souls higher up, which everyone prefers. People can establish norms of remaining silent on particular matters, although the process of establishing those norms will be stacked toward people who can honestly say "I think this makes things better for everyone" or "I think this is a purity spiral," and away from those who can't.
high-trust friend groups
I'm having a hard time imagining a scenario in which I would find this valuable in my friend groups. If I were ever unsure whether I could trust the word of a friend on an important matter, I'd think that would represent deeper issues than a mere lack of information a scan of their brain could provide. Perhaps I'm naive, or particular in some way in how I filter people.
Do you have examples for how this would aid friendships? Or the other domains you mentioned?
I could see it being very valuable but I also find the idea very frightening, and I am not someone who lies.
The traditional technology used for similar purposes in some cultures is alcohol. The idea is that as alcohol impairs thinking, it impairs the ability to lie convincingly even more. Especially considering that even if one drunk person lies successfully to another drunk person, the next day the other person can reflect on the parts they remember with a sober mind.
Thus, alcohol is an imperfect lie detector with a few harmful side effects; and in cultures where it is popular, groups of friends do this together, and conspicuously avoiding it will provide evidence against your sincerity.
If I were ever unsure whether I could trust the word of a friend on an important matter, I'd think that would represent deeper issues than a mere lack of information a scan of their brain could provide.
Friendships exist on a scale. If you switch from "a stranger" to "100% trusted person" too quickly, you probably have some unpleasant surprises waiting for you in the future. Also, friendship is not transitive, and sometimes you need to know whether you can trust a friend of a friend (even when your friend says "yes"). I know some people whom I trust, but I definitely do not trust their judgment about other people.
I am sceptical about the role you describe for alcohol as a form of lie detector and the dynamics around it, but I know there's a range of social dynamics I haven't necessarily been exposed to in my culture.
I have been in various groups that heavily drink on occasion, but I've never seen any evidence of people being viewed as having something to hide were they not to drink.
I think alcohol might make people more honest, but usually about things they already wanted to divulge and merely lacked the courage or sense of emotional intimacy to share, which alcohol can provide. It's hard for me to imagine alcohol playing a similar role as a lie detector for significant factual information people strongly want to hide.
Could you offer any examples of where a real lie detector would be valuable in friendships or potential friendships?
A lot of the things I might want to know seem challenging to address via a lie detector. "Will you do anything violent, or steal, or intentionally damage my property?" People likely to do those things might honestly intend not to.
I could see it potentially being useful for people having sex more on the casual side.
Every subculture I've participated in has lowkey bad actors. The harms this causes are underrated imo.
There's bad actors who infiltrate, deceptively align, move laterally, and purge talented people (see Geeks, Mops, and Sociopaths) but I think that trust is a bigger issue.
High-trust environments don't exist today in anything with medium or high stakes, and if they did then "sociopaths" would be able to share their various talents without being incentivized to hurt anyone, geeks could let more people in without worrying about threats, and people could generally evaluate each other and find the place where their strengths resonate with others.
That kind of wholesome existence is something that we've never seen on Earth, and we might be able to reach out and grab it (if we're already in an overhang for decentralized lie detectors).
Although the emergence of functional lie detection would be an obvious total paradigm shift for the entire court system, the author didn't seem to realize that this is also an obvious total paradigm shift for much bigger things, e.g. hiring, high-trust friend groups, an immune system for deceptively aligned humans, and discouraging harmful behavior, nihilistic profiteering, and/or excessively extreme self-interest.
Although centralization of power is a classic concern, broader open decentralized access could easily facilitate a new era of high-trust social dynamics within and between smaller groups.
So bear in mind, whenever the author says this is about courts, you're allowed to think about whatever other use cases might come to mind, not just courts. This is about trust and cooperation taken to an extreme level that we have never seen before on Earth, and plausibly can't imagine until it happens. What kind of questions could you ask someone (or have them ask you) if you wanted to honestly build trust and collaborate? What problems in the world would vanish, perhaps preemptively? Probably a large majority.
We might even already be deep into a technological overhang, where transformation will materialize after a few technical breakthroughs, or perhaps just any effort at all towards applied research by serious, thoughtful, and pragmatic individuals. The current reputation of lie detectors not working basically revolves around the fact that they were first invented ~a century ago (hence the name "polygraph"; it was the coolest thing you could possibly do at the time with multiple graphs). So it's not particularly surprising that our civilization remained mediocre: such incredible capabilities exist, but they were so alluring that they were pursued way too early.
This paper (Using Brain Imaging for Lie Detection: Where Science, Law and Research Policy Combine) was published in 2012, and the world has changed a lot since then (sadly, with no room-temperature superconductors, which would facilitate large-scale deployment of fMRI hats). Since this is a snapshot of the state of brain research at a specific point in time (albeit a truly, truly excellent snapshot), I've sprinkled reminders throughout that this is from 2012.