EmoNet - Machines can now read human feelings

Can a machine tell at a glance the difference between joyful and sad images? Can it distinguish a romantic comedy from a horror film in a few seconds? Questions like these are being answered by research conducted at the University of Colorado Boulder, where neuroscientists have developed a system called EmoNet.
Machine learning technology already works well at categorizing images by the kinds of objects they contain, said Tor Wager, a professor of psychology and neuroscience at the University of Colorado who worked on the study.
This study of machine learning and the human brain's processing of images focused on applying neural networks to analyze and categorize emotions. The question was how and where the emotions evoked by images are represented in the human brain. Philip Kragel, a research associate at the Institute of Cognitive Science, noted that the visual cortex also plays an important role in the processing and experience of emotion.

EmoNet's creation

Kragel started from AlexNet, an existing neural network that enables computers to recognize objects by processing information in a stream of images in a way that loosely mimics human vision (the system is fed images containing objects such as a chair or a pen and learns to classify them by object).
Once the network could recognize objects well, the next challenge was identifying and categorizing the emotions evoked by images. Kragel retrained the network to predict what a person would feel when seeing a particular image, and called the result EmoNet: a network that classifies emotions in a way that loosely mimics the human brain.
EmoNet was initially trained on 25,000 images, ranging from nature scenes to erotic photographs, and asked to sort them into 20 emotion categories such as fear and excitement. It managed to classify 11 of those categories accurately and consistently, and the results were plausible, though accuracy varied from category to category, with some emotions recognized much better than others.
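The retraining described above is a form of transfer learning: the pretrained object-recognition layers are kept fixed and only a new emotion-classification head is trained. The sketch below illustrates that idea with synthetic numbers in place of real AlexNet features and human emotion labels; every array here is an illustrative stand-in, not the study's actual data or pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative stand-ins: in the real study the features would come from a
# pretrained AlexNet-style backbone and the labels from human annotations.
n_images, n_features, n_classes = 200, 64, 20
features = rng.normal(size=(n_images, n_features))  # frozen backbone output
labels = rng.integers(0, n_classes, size=n_images)  # emotion category ids

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# The new classification head is the only part that gets trained.
W = np.zeros((n_features, n_classes))
b = np.zeros(n_classes)
onehot = np.eye(n_classes)[labels]

for _ in range(300):                     # plain gradient descent
    probs = softmax(features @ W + b)
    grad = (probs - onehot) / n_images   # cross-entropy gradient
    W -= 0.5 * features.T @ grad
    b -= 0.5 * grad.sum(axis=0)

accuracy = (probs.argmax(axis=1) == labels).mean()
print(f"training accuracy on synthetic data: {accuracy:.2f}")
```

Freezing the backbone and training only the head is what lets a network learned on objects be repurposed for emotions with comparatively little new data.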

EmoNet's first results

For example, it identified erotic images with 95% accuracy, but struggled with subtler emotions such as surprise, which can shade into joy or anger. In addition, emotions such as love, amusement, and joy appeared heavily intertwined to the system.
Colors also affected the classification: when EmoNet sees a black screen, for example, it signals feelings of anxiety. And the network was not limited to identifying emotions; it could also estimate how strong they are, measuring the intensity of emotion in images.
The researchers then tested it on short film clips, asking whether each was a romantic comedy, a horror film, or an action film. It answered correctly two-thirds of the time.
Marianne Reddan, a neuroscientist at Oxford University, said on encountering EmoNet that because the network can separate feelings of surprise from fear, it is not merely identifying facial expressions but learning something important about the expressions and reactions behind them.

What you see and how you feel

To test EmoNet further and improve it, the researchers compared it against the human brain. Eighteen participants were placed in a functional MRI scanner to measure their brain activity while 112 images were each flashed for 4 seconds; EmoNet saw and analyzed the same images in parallel, giving the researchers activity patterns from both the brains and the network to compare.
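One standard way to compare activity patterns from a brain and a network is representational similarity analysis: build, for each system, a matrix of how dissimilarly it responds to every pair of images, then correlate the two matrices. The study's exact analysis may differ; the sketch below uses synthetic data (the EmoNet patterns are constructed to share signal with the "voxel" patterns) purely to show the comparison machinery.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins for the 112 images viewed by both "observers":
# rows are images, columns are fMRI voxels or EmoNet unit activations.
n_images = 112
brain_patterns = rng.normal(size=(n_images, 500))  # simulated voxel responses
# Simulated network activations, correlated with the brain by construction:
emonet_patterns = brain_patterns[:, :300] + rng.normal(size=(n_images, 300))

def rdm(patterns):
    """Representational dissimilarity matrix: 1 - Pearson r between images."""
    return 1.0 - np.corrcoef(patterns)

def compare(rdm_a, rdm_b):
    """Correlate the upper triangles of two RDMs (second-order similarity)."""
    iu = np.triu_indices(rdm_a.shape[0], k=1)
    return np.corrcoef(rdm_a[iu], rdm_b[iu])[0, 1]

similarity = compare(rdm(brain_patterns), rdm(emonet_patterns))
print(f"brain-vs-EmoNet representational similarity: {similarity:.2f}")
```

Because the comparison happens at the level of pattern geometry rather than raw signals, it works even though voxels and network units live in completely different spaces.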
Kragel said that although EmoNet is only a computer program that mimics the human brain, it has learned to recognize emotions in a plausible way, pointing to similarities between patterns of brain activity in the visual cortex and the units of EmoNet that represent emotions.
Kragel also noted that emotions are a big part of our daily lives. Even if a network like this cannot analyze them fully, and so offers only a limited window into how the brain works, the surprise is that it managed the task at all, however imperfectly, and that is a good start.

EmoNet in the future

EmoNet could be used to improve human-computer interaction and to advance research on emotion, and it could also serve as a content filter, letting users block images and videos they would rather not see.
Kragel also wants to investigate whether a neural network like EmoNet can categorize emotions from sounds and voices.
In the end, does this model really feel anything?
"It doesn't feel," Reddan said. "It just sorts emotions into categories; it doesn't have the complexity of human feeling. Maybe one day it will."
