There are certain limits to what we can learn by analyzing someone’s face
For a smile to be truly genuine, two things must happen:
- The zygomaticus major muscle must contract, pulling the corners of the lips upward into a smile.
- The orbicularis oculi muscle, which surrounds the eye, must also activate, raising the cheeks while slightly lowering the eyebrows.
If we don’t detect both movements, the smile is probably false.
It’s not a smile that occurs involuntarily; it’s one we control. A forced smile, in other words.
It might be difficult for a person to detect these muscle differences. A fake smile can go undetected by the human eye, but can an AI detect it?
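In facial coding terms, the two muscle movements above correspond to FACS Action Units: AU12 (lip corner puller, zygomaticus major) and AU6 (cheek raiser, orbicularis oculi). As a rough sketch of how software might apply this rule, assuming AU intensity scores already come from some upstream detector (the threshold value here is purely illustrative):

```python
# Sketch: labeling a smile as genuine (Duchenne) vs. posed.
# Assumes AU intensities (0.0-5.0) from an upstream detector;
# the threshold is an illustrative placeholder, not a tuned value.

def classify_smile(au_intensities: dict[str, float],
                   threshold: float = 1.0) -> str:
    """AU12 = lip corner puller (zygomaticus major),
    AU6 = cheek raiser (orbicularis oculi)."""
    lip_corners_up = au_intensities.get("AU12", 0.0) >= threshold
    eyes_engaged = au_intensities.get("AU6", 0.0) >= threshold

    if lip_corners_up and eyes_engaged:
        return "genuine"   # both muscle groups active
    if lip_corners_up:
        return "posed"     # mouth smiles, but the eyes don't join in
    return "no smile"

print(classify_smile({"AU12": 2.5, "AU6": 2.0}))  # genuine
print(classify_smile({"AU12": 2.5, "AU6": 0.3}))  # posed
```

A real system would estimate those AU intensities from video and calibrate the threshold per person, but the decision rule itself is this simple: no AU6, no genuine smile.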
But first, what can we find out by looking at people’s facial gestures? And once we understand what lies behind face movements, can we detect lies and deceptions with facial coding?
Detecting emotions that are written on your face
We usually associate facial analysis with ID control for security reasons, but there’s much more information we can get from a person’s face.
The basic idea of emotional facial coding is this: our face shows our internal emotional states and by looking at our facial expressions we’ll know what’s happening inside (on an emotional level).
Usually, we don’t control our facial gestures. We don’t go around talking to people while intentionally moving our facial muscles to express what we’re feeling inside. That just happens on its own.
However, there are times when we want to conceal our emotions, and we consciously adjust our reactions. Now, there are several ways that emotional facial expressions can be intentionally manipulated:
- Simulated: an expression that is not accompanied by a genuine emotion. We make a face we don’t really mean.
- Masked: when the expression that accompanies the true emotion is replaced by a false expression corresponding to a different emotion. For instance, we might laugh while deep down we’re drowning in sorrow.
- Neutralized: when the expression of a true emotion is inhibited and the face remains neutral. For instance, we might feel angry but we maintain a poker face to hide our true emotions.
Now, when trying to hide information, the true emotion might emerge in different ways. It can be shown by a set of micro-expressions (too fast to catch), subtle gestures (that might be ignored), or as more lasting gestures.
People inevitably reveal their genuine emotion, even when they’re eager to cover it up.
Can we detect these different facial expressions?
When it comes to micro-expressions, a study found that micro-gestures of true emotion appear within the first fraction of a second of a reaction (between 40 and 200 milliseconds) before being suppressed by one of the three strategies mentioned above.
Facial coding software capable of processing hundreds of frames per second can detect these fleeting movements. Plus, it can learn over time to become even more precise in its results.
AI might be more powerful than human beings in detecting these gestures, simply because of its processing power.
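That processing-power argument can be made concrete. At a capture rate of 200 frames per second, a 40–200 ms micro-expression spans roughly 8 to 40 frames, which a machine can scan for exhaustively. A minimal sketch, assuming we already have a per-frame expression intensity signal (the signal, frame rate, and threshold here are all illustrative assumptions):

```python
# Sketch: flagging micro-expressions in a per-frame intensity trace.
# Assumes high-speed capture (here 200 fps) and one expression-intensity
# value per frame; the 0.5 threshold is an illustrative placeholder.

FPS = 200
MIN_MS, MAX_MS = 40, 200  # micro-expression duration window

def find_micro_expressions(intensity, threshold=0.5, fps=FPS):
    """Return (start_frame, end_frame) spans where the signal stays
    above the threshold for 40-200 ms, i.e. too briefly to be a
    deliberate, sustained expression."""
    spans, start = [], None
    for i, value in enumerate(intensity):
        if value >= threshold and start is None:
            start = i                      # burst begins
        elif value < threshold and start is not None:
            duration_ms = (i - start) * 1000 / fps
            if MIN_MS <= duration_ms <= MAX_MS:
                spans.append((start, i))   # burst was micro-length
            start = None
    return spans

# 30 active frames at 200 fps = 150 ms -> flagged as a micro-expression
trace = [0.0] * 20 + [0.9] * 30 + [0.0] * 20
print(find_micro_expressions(trace))  # [(20, 50)]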
Then you’ve got subtle expressions. According to a recent study, humans are still better at recognizing these emotions than facial coding software is. When presented with images of faces showing subtle emotional expressions, human participants classified them more accurately than the software did.
Part of the problem is that the training databases are filled with stereotypical, posed facial expressions, so when the software is confronted with faces in natural environments, it performs poorly.
Maybe it’s only a matter of time before AI surpasses our ability to detect and classify these subtleties.
And finally, lasting or more pronounced gestures.
Some emotions are easier to fake than others. People are more convincing when imitating a genuine smile than when imitating negative emotions, likely because we get far more daily practice faking smiles.
Furthermore, negative emotions recruit more muscles, which makes them even harder to control than a smile. Controlling a set of muscles that usually reacts involuntarily is a tough challenge, right?
Being able to detect lies is a trained skill
Whether it’s a human or a computer, in order to detect lies you have to develop a certain skillset.
Anyone can tell when a toddler is lying, but can you say the same about an adult? It’s not easy, especially when you’ve only met them once or twice.
When using facial recognition software to detect lies, you must train it on a large dataset. The reference database must contain both spontaneous (involuntary) and posed (voluntary) expressions, so they can be contrasted with your sample during analysis.
Computers can detect human expressions that occur in milliseconds, but to detect lies more accurately they should combine several sources of information beyond facial features: body language, tone of voice, and semantic analysis of what is said.
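One simple way to combine those sources is late fusion: score each channel independently, then take a weighted average. The channel names, scores, and weights below are illustrative placeholders, not outputs of any real system:

```python
# Sketch: weighted late fusion of per-channel deception scores.
# All scores and weights are illustrative assumptions in [0, 1].

def fuse_deception_scores(scores: dict[str, float],
                          weights: dict[str, float]) -> float:
    """Weighted average of per-channel deception scores."""
    total_weight = sum(weights[name] for name in scores)
    weighted_sum = sum(scores[name] * weights[name] for name in scores)
    return weighted_sum / total_weight

channels = {"face": 0.7, "voice": 0.6, "language": 0.4}
weights  = {"face": 0.5, "voice": 0.3, "language": 0.2}

print(round(fuse_deception_scores(channels, weights), 2))  # 0.61
```

Real systems would learn the weights from labeled data (or fuse with a trained model rather than a fixed average), but the principle is the same: a face-only score of 0.7 gets tempered or reinforced by what the voice and the words say.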
Lies and deception are quite a challenge to uncover sometimes.
The following example might help illustrate that a more comprehensive approach is needed to investigate this phenomenon.
Best way to uncover the truth: use of multiple sources of information
The United States and Canadian governments are implementing a new lie detection system at their border crossings called AVATAR (Automated Virtual Agent for Real-Time Truth Assessment).
Travelers interact with a virtual agent in an interview with pre-selected questions. The system analyzes facial features, tone of voice, and a transcript of the interview, then issues a verdict on whether there has been any kind of lie or deception.
Whoever does not pass the interview is referred to a security officer, who assesses the person more thoroughly.
The goal is to make border controls faster and more efficient by relying on the power of artificial intelligence. Currently, the system’s lie-detection accuracy is between 80% and 85%, much higher than the roughly 54% accuracy of human judges.
Looks like we’ll be interacting with AI wherever we go.
Our face reveals more about ourselves than we sometimes care to share with others, and facial coding software is taking advantage of this.
We tend to show expressions that are mostly involuntary, but we are able to conceal reactions at least in three different ways.
We can simulate an emotion that’s not there, mask it with another one, or try to inhibit an emotion that’s trying to get out.
These emotional reactions can show up on our faces as micro-expressions, subtle cues, or longer-lasting gestures. And it’s up to skilled people or trained software to detect whether we’re trying to deceive.
In some cases humans are better at detecting emotions written on our faces and in other instances, computers can surpass us.
AI software is already being used for lie detection, and the more sources of information it has, the more accurate it can be.
The saying is that the truth always comes out, but will AI ultimately be the one hearing it on the other end?
Are you curious about the world of sound and music? Learn how music can enhance a plant’s growth, the way sound changes our sense of taste, understand the music industry, and much more! Join my newsletter to embark on this journey of sound awareness.
If you were interested in this article, I recommend you follow this link: Future