You can’t hide your lyin’ mind

The lie detector test, or polygraph, is viewed skeptically in the courtroom, suspect enough to be subject to a patchwork of federal, state and local laws about whether the results can even be admitted as evidence.

Psychology professor Jennifer Vendemia and graduate student Jimmy Nye might help change that. They think that the new scientific approaches they’re using to study human deception could make lie detection a more reliable source of legal evidence in the next decade or so. They and their colleagues are using neuroscience to get an inside look at the mind as it works to deceive, an act that demands brainpower not just in the thinking but also in the doing, and one that may even carry consequences for a person’s ability to tell the truth.

To get a clear view of the lying mind, the researchers rely on two tools that provide complementary information: the electroencephalogram (EEG) and functional magnetic resonance imaging (fMRI). An EEG records electrical activity at the scalp, reflecting the firing of neurons in brain regions near the skull, and it can resolve that activity on a millisecond time scale. fMRI provides far more spatial information, pinpointing blood flow within the brain in three-dimensional detail, but its timing is much coarser: an image is available only every few seconds, and each one reflects a time-averaged picture of blood flow over that interval.
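That tradeoff is easy to see in a toy simulation. The sketch below (all numbers invented for illustration, not drawn from the study) compares a millisecond-resolution recording with a measurement that only reports a time-averaged value every couple of seconds, the way an fMRI volume does:

```python
# Toy illustration of temporal resolution: EEG-like vs. fMRI-like sampling.
# All values here are hypothetical, chosen only to make the contrast visible.

fs_eeg = 1000                       # EEG-like: 1000 samples per second (millisecond scale)
duration_s = 6
signal = [0.0] * (fs_eeg * duration_s)

# A hypothetical 50 ms burst of brain activity starting at t = 2.100 s.
start, length = 2100, 50
for i in range(start, start + length):
    signal[i] = 1.0

# EEG-like view: the burst's onset is recoverable to the nearest millisecond.
eeg_onset_s = next(i for i, v in enumerate(signal) if v > 0) / fs_eeg

# fMRI-like view: one time-averaged value every 2 seconds (a typical volume
# interval), so the burst is smeared across whichever window contains it.
tr_s = 2
window = tr_s * fs_eeg
fmri = [sum(signal[i:i + window]) / window for i in range(0, len(signal), window)]

print(eeg_onset_s)   # 2.1  -- timing preserved
print(fmri)          # [0.0, 0.025, 0.0]  -- timing reduced to "sometime in seconds 2-4"
```

The EEG-like trace says exactly when the burst happened; the fMRI-like trace can only say it happened somewhere inside a two-second window, which is why the researchers need both tools.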


Together, these tools open a new window onto the act of deception. Today’s polygraph measures only the after-effects of telling a lie — physiological responses such as changes in heart rate, breathing and sweating. With EEG and fMRI, Vendemia and Nye are able to watch the formulation of the intent to lie, which they have shown can often take a lot more work than telling the truth.

A truth-teller largely accesses parts of the brain associated with memory, Nye says, presumably to recall the facts requested. Someone intent on deception, however, shows a more complex response, also engaging regions that may be involved in monitoring the lie for consistency or in damping the emotional discomfort most people feel about deliberately deceiving someone. Liars also face considerable cognitive work in not letting anything slip. Concocting a plausible lie about what you were up to last Thursday is just one step; to get away with the deception, you also have to avoid mentioning anything you know only because of what you actually did that night, and building mental fences around those facts takes effort.

Even the act of speaking an untruth is difficult. “We’ve found that the motor part of the lie, the actual telling of it, can take more time than truth-telling,” Nye says. “We’ve seen that from measuring reaction times after a question was asked.”

That remains true even when people are given time to practice a lie. “The motor parts of deception are difficult even when people were allowed time to prepare and practice the lie,” he says.
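The kind of comparison Nye describes, measuring how long subjects take to respond after a question, can be sketched in a few lines. The reaction times below are made up for illustration; only the direction of the effect (lies taking longer) comes from the article:

```python
import statistics

# Hypothetical reaction times in milliseconds -- invented numbers,
# not data from Vendemia and Nye's experiments.
truth_rts = [620, 580, 640, 605, 590, 615]   # responses when telling the truth
lie_rts = [720, 760, 695, 740, 710, 755]     # responses when telling a lie

mean_truth = statistics.mean(truth_rts)
mean_lie = statistics.mean(lie_rts)

# The quantity of interest: the extra time the "motor part" of lying takes.
deception_cost_ms = mean_lie - mean_truth

print(f"mean truthful RT: {mean_truth:.0f} ms")
print(f"mean deceptive RT: {mean_lie:.0f} ms")
print(f"deception cost: {deception_cost_ms:.0f} ms")
```

A real analysis would compare many trials per subject and test the difference statistically, but the core measurement is just this: the same clock, started at the question, stopped at the answer, under both conditions.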

What’s more, a person telling a series of lies appears to have increasing difficulty in unexpected ways.

“This is still preliminary, but what we’re starting to see is that the more lies you tell doesn’t really change how hard a lie was,” Nye says. “What really changes, though, the more lies you tell, is that it makes it harder to tell the truth.”

Having to maintain a lie throughout an interrogation might thus be reflected in difficulty in honestly answering even innocuous questions, which could be quantitatively detectable by a forensic examiner. “That’s something we really want to test, these long-term lies,” Nye says. “That’s where I see this going.”
