The lying game - Toward a clearer understanding of how humans behave when they bend the truth


University of Huddersfield investigative psychology lecturer Dr. Chris Street is making breakthroughs that are leading towards a clearer understanding of how humans tell lies and how their deceptions can be detected. For more than 30 years it has been said that we should trust our hunches and unconscious knowledge of body language. Yet his work, described in a new journal article, shows that we would be better off consciously relying on a single "cue," such as whether or not a person is plainly thinking hard.


But gathering reliable research data is a tricky proposition. To begin with, a set of lies and truths needs to be collected. Ideally, participants should not know they are taking part in an experiment about truth and lies. So Street and his colleague devised an ingenious and well-intentioned deception of their own: they hired a film studio in London and persuaded passers-by to be interviewed for a "documentary" on tourism.


Research assistants stationed outside the studio told them that the filmmakers were running out of time and asked whether, in addition to describing genuine travel experiences, they would talk about places they had never actually visited. Inside the studio, the speakers were then interviewed by a director who, they supposed, was unaware that they had agreed to lie on film.


"The idea was that they were lying to someone that they could potentially deceive. They were lying on behalf of another person, but the lie was spontaneous and told with an intention to mislead," said Street. The sequence of filmed interviews that resulted from the experiment constitutes a valuable body of material that is being made available to other researchers in what is still the relatively new field of human lie detection.


For more than 30 years, the standard approach to tapping the unconscious has been to use the "indirect lie detection" method.


"People are asked to rate some behavior that is indirectly related to deception," explained Street. "For example, does the speaker appear to be thinking hard or not? The researcher then converts all thinking-hard judgments into lie judgments and all not-thinking-hard judgments into truth judgments."


The fact that these indirect judgments give better accuracy than asking people to directly and explicitly rate statements as truth or lies has been taken as evidence that people have innate, unconscious knowledge about human deception. Street and his co-researcher and author Dr. Daniel Richardson, of University College London, have developed a different explanation, which they explore in their new article in the Journal of Experimental Psychology: Applied.


"Indirect lie detection does not access implicit knowledge, but simply focuses the perceiver on more useful cues," write the authors. It is an argument that could have real-world significance, in the training of interrogators, for example.


"There has been a push in the literature suggesting that indirect lie detection works and the reason is that it is unconscious—so people should not be making reasoned judgments but relying on their gut feeling," said Street. "But if our account is correct, that is a very bad way to go."


He readily concedes that human lie detection—while a fascinating subject—requires a great deal more research and is a long way from infallibility.


"Typical accuracy rates are around 54 percent, reaching up to around 60 percent with training. So there is unlikely to be a one-size-fits-all strategy that gives us accuracy rates anything like what we would want in a legal setting. The field needs to start considering how to improve clues to deception, how to prevent raters from using less reliable clues, and to better understand how information about the current context plays into that judgment.


"We often think of nonverbal behavior when we think of deception," continued Street. "But it would be better to focus on the content of the tale people are selling us, and asking if it is consistent with other facts we know. But even then there is a large amount of room for error."


If human lie detection has a long way to go and there is probably a cap on the accuracy that can be achieved, could the polygraph machine fill the gap? No, asserts Street, who notes that the British Psychological Society is among the bodies that have dismissed the polygraph as a tool that will never be useful.


It purports to work by detecting anxiety. "But are liars more anxious than truth tellers?" said Street. "The reality is no, because often the reason we lie is that to tell the truth would be very difficult and more anxiety-provoking than a lie."


Note: Material may have been edited for length and content. For further information, please contact the cited source.

University of Huddersfield press release


Publication

Street CNH, Richardson DC. The Focal Account: Indirect Lie Detection Need Not Access Unconscious, Implicit Knowledge. Journal of Experimental Psychology: Applied. Published August 24, 2015. doi: 10.1037/xap0000058