The conversation earlier today between Morning Edition co-host Steve Inskeep and NPR’s Shankar Vedantam about software that can reportedly detect when a CEO might be trying to hide something during a conference call with investment analysts sent us off on a search for more about the research that Shankar was discussing.
Layered Voice Analysis technology, according to researchers from Duke University and the University of Illinois, seems to be able to pick up on the “vocal dissonance markers” in the tone of a CEO’s voice that signal he or she might be shading the truth, trying not to say something or even lying. And the technology, according to a paper the researchers have produced, seems to do a better job of that than humans — the analysts taking part in such conference calls — can.
Cognitive dissonance, as researchers Jessen Hobson, William Mayhew and Mohan Venkatachalam say, “is a state of psychological arousal and discomfort occurring when an individual takes actions that contrast with a belief, such as cheating while believing oneself to be honest.”
They caution that:
“LVA is an emerging technology and, as with most commercial products, its inner workings are proprietary. While our laboratory results suggest the LVA dissonance metrics capture aspects of the construct of cognitive dissonance, we are unable to document the mechanisms by which LVA is able to do so.”
If this all sounds like the plot of a TV show, you’re right: Fox’s now-canceled Lie to Me.