In the experiment, the scientists showed subjects black-and-white photographs of 160 men and women taken from dating sites and asked them to estimate each person's income. All of the people in the photos had neutral facial expressions. Half of them earned $150,000 a year; the other half earned less than $35,000 a year. In the end, the participants identified who was wealthy with 68% accuracy.
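To see why 68% is a meaningful result rather than luck, we can check how far it sits above the 50% chance baseline for a two-way guess. This is an illustrative sketch, not an analysis from the study itself: it assumes 160 independent judgments with half the faces in each income group, which matches the setup only loosely.

```python
from math import comb

# Assumption (not from the article): 160 independent two-way judgments,
# with pure guessing giving a 50% success rate.
n = 160                      # photographs judged
correct = round(0.68 * n)    # about 109 correct guesses at 68% accuracy

# One-sided binomial test: probability of getting at least `correct`
# hits out of n by chance alone (p = 0.5 per guess)
p_value = sum(comb(n, k) for k in range(correct, n + 1)) / 2 ** n

print(f"{correct}/{n} correct; probability under pure guessing: {p_value:.1e}")
```

Under these assumptions the chance probability comes out vanishingly small, which is why an accuracy of 68%, modest as it sounds, indicates a real signal in the faces.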
Although the subjects judged a person's income from the face as a whole, the researchers found that the lips, and to a lesser extent the eyes, were what allowed people to recognize income correctly.
After many repetitions of the experiment, the researchers concluded that people at least partly guessed someone was rich when that person's neutral face seemed to subtly reflect happiness and well-being. At the same time, when the people in the photos were smiling, the participants could no longer determine their financial situation accurately.
Finally, the scientists found that the subjects' deliberation typically took about half a second. So in reality, judgments about a person's financial situation are made very quickly, almost subconsciously.
Artificial intelligence has already learned to recognize people's emotions, even when they are hidden behind a neutral expression. In 2017, the Libratus algorithm, created by scientists at Carnegie Mellon University in Pittsburgh (USA), beat professional players at Texas hold'em, one of the most popular forms of poker. The program can not only calculate card permutations but also recognize the players' gestures. It won almost $2 million from professional players.