Researchers have succeeded in teaching artificial intelligence to understand our subjective perceptions of what makes faces attractive, and to generate new images that match those standards, according to the science site ScienceDaily. Researchers from the Universities of Helsinki and Copenhagen used artificial intelligence to interpret brain signals and combined the resulting brain-computer interface with a generative model of artificial faces. This system made it possible to create facial portraits matching individual preferences.
Michiel Spapé, senior lecturer and researcher at the Department of Psychology and Logopedics, University of Helsinki, says: “In our previous studies, we designed models that could identify and control simple portrait features, such as hair color and emotion. However, people largely agree on who is blond and who smiles.”
The biggest challenge
Attractiveness is a more challenging subject to study, as it is tied to cultural and psychological factors that likely play unconscious roles in our individual preferences. Indeed, we often find it very difficult to explain what makes something, or someone, beautiful.
Preferences the brain detects
Initially, the researchers tasked a generative adversarial network (GAN) with creating hundreds of artificial portraits. The images were shown, one by one, to 30 volunteers who were asked to pay attention to the faces they found attractive while their brain responses were recorded via electroencephalography (EEG). “It worked somewhat like the dating app Tinder,” Spapé explains: “participants ‘swiped right’ when they came across an attractive face. Here, however, they did not have to do anything but look at the images. We measured their immediate brain response to the images.” The researchers analyzed the EEG data with machine-learning techniques and connected the individual EEG data, through a brain-computer interface, to the generative neural network. “A brain-computer interface such as this is able to interpret users’ opinions on the attractiveness of a range of images,” says academy research fellow and associate professor Tuukka Ruotsalo, who heads the project. “By interpreting them, the AI model interpreting brain responses and the generative neural network modeling the face images can together produce an entirely new face image by combining what a particular person finds attractive.”
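The pipeline described above can be sketched in code. The following is only an illustrative simplification, not the study's actual implementation: the latent dimensionality, the threshold, and the stand-in "EEG scores" are all assumptions, and the real system derived its per-image signal from machine-learning analysis of EEG responses.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each GAN-generated face corresponds to a latent
# vector z in the generator's latent space (assumed 512-dimensional here).
n_images, latent_dim = 240, 512
latents = rng.normal(size=(n_images, latent_dim))

# Stand-in for the EEG classifier output: one score per image indicating
# how strongly the recorded brain response resembled a response to an
# attractive face. Randomly generated here purely for illustration.
eeg_scores = rng.uniform(size=n_images)

# Flag the images whose brain responses crossed an (assumed) threshold.
attractive = eeg_scores > 0.7

# Combine the preferences: average the latent vectors of the flagged
# images, yielding a single personalized latent vector.
personal_latent = latents[attractive].mean(axis=0)

# Feeding personal_latent back through the GAN's generator would then
# render a new face tailored to this participant's preferences.
print(personal_latent.shape)
```

The averaging step is the key idea: directions in a GAN's latent space correspond to facial features, so the mean of "attractive" latents points toward the features the participant's brain responded to.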
Testing the model's validity
To test the validity of their model, the researchers generated a new image for each participant, predicting that the participant would find it personally attractive. Testing the images in a double-blind procedure against matched controls, they found that the new images matched the participants' preferences with over 80% accuracy.
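The accuracy figure above can be understood as the fraction of blind trials in which participants preferred their personalized image over a control. The sketch below simulates such an evaluation; the participant count matches the article, but the trial count and the 83% preference rate are invented purely to illustrate the computation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical outcome data: for each of the 30 participants, a series of
# blind pairwise trials (personalized image vs. matched control).
# True = participant preferred the personalized image. The preference
# rate used here is illustrative, not the study's data.
n_participants, n_trials = 30, 20
preferred = rng.random((n_participants, n_trials)) < 0.83

# Per-participant accuracy, then the group mean that a claim like
# "matched preferences with over 80% accuracy" would summarize.
per_participant = preferred.mean(axis=1)
overall = per_participant.mean()
print(f"mean accuracy: {overall:.1%}")
```

Because the procedure is double-blind, neither the participant nor the experimenter knows which image in a pair was personalized, so a rate well above 50% indicates the model captured real individual preferences.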