Sunday, February 05, 2006


Try this on for size; repeat each of these words out loud, quickly:

red green blue purple orange yellow red blue purple yellow

Something like that is supposedly called the "Stroop task." It's supposed to be really difficult, according to this article in today's NY Times magazine.
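As an aside: the classic Stroop task is only hard when the words are printed in mismatched ink colors and you must name the ink, not read the word — something plain text on a page can't show. Purely as an illustration (the function and color list here are my own, not from the article), here's a minimal sketch of how one might generate such trials:

```python
import random

COLORS = ["red", "green", "blue", "purple", "orange", "yellow"]

def make_trials(n, congruent=False, seed=0):
    """Generate (word, ink_color) pairs for a Stroop-style task.

    In a congruent trial the word matches its ink color; in an
    incongruent trial (the classically difficult case) the word
    spells one color while being shown in another.
    """
    rng = random.Random(seed)
    trials = []
    for _ in range(n):
        word = rng.choice(COLORS)
        if congruent:
            ink = word
        else:
            # pick any ink color except the one the word spells
            ink = rng.choice([c for c in COLORS if c != word])
        trials.append((word, ink))
    return trials

for word, ink in make_trials(5):
    print(f"say the ink color ({ink}), not the word ({word})")
```

The difficulty comes from holding two competing responses in mind at once — which, as the article goes on to argue, is exactly what makes it a model for lying.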

The author of that piece, writing about a contrived experiment on reporting the value of a particular playing card, says:

Telling a spontaneous lie is similar to the Stroop task in that it involves holding two things in mind simultaneously — in this case, the truth and the lie — and making a choice about which one to apply.

There are limits to the human ability to report what one experiences; the "bearing of false witness" is thought to be immoral when done intentionally, and arguably, from a biblical standpoint, unintentionally as well.

In Buddhism, the precepts on telling the truth are not so absolute; they readily admit exceptions where telling the truth would cause harm (as do some forms of Christianity, Catholicism in particular).

Also arguably, the test involved isn't actually a lie; a real lie would be one the subject initiated himself about the test, without the collusion of the researchers...

Also interesting here, though, is the relationship between mindfulness and fidelity of reporting; if one thing can be kept in mind, and one simple task defined, it is not difficult.

But this "quantization problem" readily complicates the search for the ultimate "lie detector" (an ethical quandary for you: would we all get to use this hypothetical device?); first it has to be defined just what kinds of lies one wishes to detect. (The classic lie detector is, to reuse the metaphor from yesterday, like leeches, of course.)

Harvard research psychologist Stephen Kosslyn evidently agrees:

Even as these small bits of data emerge through functional-M.R.I. imagery, however, Kosslyn remains skeptical about the brain-mapping enterprise as a whole. "If I'm right, and deception turns out to be not just one thing, we need to start pulling the bird apart by its joints and looking at the underlying systems involved," he said. A true understanding of deception requires a fuller knowledge of functions like memory, perception and visual imagery, he said, aspects of neuroscience investigations not directly related to deception at all.

In Kosslyn's view, brain mapping and lie detection are two different things. The first is an academic exercise that might reveal some basic information about how the brain works, not only during lying but also during other high-level tasks; it uses whatever technology is available in the sophisticated neurophysiology lab. The second is a real-world enterprise, best accomplished not necessarily by using elaborate instruments but by encouraging people "to use their two eyes and brains." Searching for a "lie zone" of the brain as a counterterrorism strategy, he said, is like trying to get to the moon by climbing a tree. It feels as if you're getting somewhere because you're moving higher and higher. But then you get to the top of the tree, and there's nowhere else to go, and the moon is still hundreds of thousands of miles away. Better to have stayed on the ground and really figured out the problem before setting off on a path that looks like progress but is really nothing more than motion. Better, in this case, to discover what deception looks like in the brain by breaking it down into progressively smaller elements, no matter how artificial the setup and how tedious the process, before introducing a lie-detection device that doesn't really get you where you want to go.

The rest of the article's worth reading - but keep in mind this isn't likely to be ready for use by "the government" any time soon, or at least, there are going to be high error rates if they do deploy such "technology."

On the other hand, let's face it: most folks don't trust certain politicians, and for good reason. The proof's been in the pudding way too long for serious doubt by now.
