There are signs that emotional machines have a history going back thousands of years. In the 1st century, the Greek mathematician Hero of Alexandria designed an expressive mechanical doll that could perform a miniature play. Although the original text of his writings on the subject has not survived, a group of Sicilian scholars found an Arabic translation made in the 13th century. When monks rendered it into Latin, they coined a new term, "android", from the Greek for "manlike". Today we translate it as "humanoid robot" or transliterate it directly as "Android".
In 1774, the Swiss inventors Pierre Jaquet-Droz and his son Henri-Louis built an automaton with the appearance of a young woman that could play music on a small organ, and they toured all over Europe with it. As it performed, it would nod its head and appear to breathe in time with the music. One of their main competitors was the German inventor David Roentgen, who built an instrument-playing automaton modeled on Marie Antoinette and presented it to her as a gift.
At the time, people were thrilled to see machines give such humanlike performances. In the 21st century, however, with the development of computer technology and artificial intelligence, people expect robots not only to look human but also to understand human language and emotions, and even to have emotions similar to our own.
Google, Microsoft and other large companies have long worked on natural-language understanding and translation. At first the accuracy of sentence translation was poor, but in recent years, with the rapid development of artificial intelligence, great progress has been made. In May of this year, an American start-up unveiled a smart headset, claimed to be the first of its kind in the world: it instantly translates between the different languages its users speak, as if a human interpreter were working alongside them, so that communicating with foreigners is no longer a problem. The product is expected to be welcomed by the market.
The types of emotion are very complicated, and "love" is just one of them, with very rich connotations. Even when different people express "love" for someone, their ways of expressing it can be quite different. Some people's heart rate quickens, while others show particular facial expressions; some become especially excited or flash flirtatious glances, while others change their pronunciation and intonation. These manifestations are usually related to a person's gender, environment and education level, and especially to personality (introversion or extroversion).
Emotion recognition
If everyone expresses emotion in a different way, how can a machine recognize it? The answer is not to build one general emotion model, but to let the machine learn each person's emotions individually. The "deep learning" algorithms now so popular in artificial intelligence can be trained on data collected from sensors (this is the "learning") so that the machine comes to "recognize" that particular person's emotions.
With the rise of wearable electronic devices, sensors can be worn directly on the body, which greatly expands the range of signals available for reading a person's emotional state. In other words, judging someone's mood is no longer limited to "facial recognition"; it can also draw on posture, movement speed, voice, heart rate, breathing, degree of sweating and so on. A minimal sketch of such a personalized classifier follows.
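The sketch below, in Python, shows one way such a per-person model might be trained: hand-labeled sensor readings from a single user are fed to a small neural network. The feature set, labels and library choice (scikit-learn) are illustrative assumptions, not a description of any particular product.

```python
# Per-user emotion classifier trained on wearable-sensor features.
# All feature names, values and labels are hypothetical illustrations.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

# Each row: [heart_rate_bpm, breaths_per_min, skin_conductance_uS,
#            voice_pitch_hz, movement_speed_m_per_s]
X = np.array([
    [72, 14, 2.1, 180, 0.8],   # labeled "relaxed"
    [95, 22, 6.5, 240, 1.5],   # labeled "nervous"
    [88, 18, 4.0, 210, 2.3],   # labeled "excited"
    # ... in practice, many more labeled samples from this one user
])
y = np.array(["relaxed", "nervous", "excited"])

# Standardize so no single signal (e.g. voice pitch) dominates the rest.
scaler = StandardScaler().fit(X)

# A small neural network (deep learning in miniature) fit to one person.
model = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
model.fit(scaler.transform(X), y)

# At run time, a fresh sensor reading is classified against this user's model.
reading = np.array([[90, 20, 5.8, 235, 1.2]])
print(model.predict(scaler.transform(reading)))
```

Because the model is fit to one user's own data, the same heart rate that means "excited" for one person can mean "nervous" for another.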
In daily life, such "emotion recognition" has many applications. For example, someone has built an "intelligent emotion mirror" based on this technology. Imagine that you are attending an important interview. The interviewer asks about your education, your work experience, your successes and failures, your strengths and weaknesses, your foreign-language level, what you can do for the company, and so on, until you falter, cannot look them in the eye, and break out in a sweat. Then the interviewer tells you that you are too nervous and to come back another time.
Fortunately, this scene is only a simulated test in front of the "intelligent emotion mirror". The mirror prepares enough interview material and then identifies your emotional changes: it listens to your voice and records how it varies, and it assesses your sitting posture, body temperature, degree of sweating (by measuring skin conductance), facial expression and so on. It observes your emotional reaction and, using the training data in its database, compares it with your usual emotional baseline, as sketched below.
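One simple way to make that comparison is to score how far the live readings deviate from the user's stored calm baseline. The features, numbers and threshold below are invented for illustration; they are not taken from any real "emotion mirror".

```python
# Hypothetical baseline comparison for the "emotion mirror" scenario.
import numpy as np

# Baseline statistics learned from this user's earlier, calm sessions:
# [heart_rate_bpm, breaths_per_min, skin_conductance_uS]
baseline_mean = np.array([74.0, 15.0, 2.3])
baseline_std = np.array([5.0, 2.0, 0.6])

def nervousness_score(reading: np.ndarray) -> float:
    """Mean z-score of the live reading relative to the calm baseline."""
    z = (reading - baseline_mean) / baseline_std
    return float(np.mean(z))

live = np.array([96.0, 23.0, 6.0])   # readings during the mock interview
score = nervousness_score(live)
if score > 2.0:                      # assumed alert threshold
    print(f"You seem very nervous (score {score:.1f}). Try slowing down.")
```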
Perhaps tomorrow you finally have a date with the boy or girl you have long had a crush on. You can rehearse in front of the "intelligent emotion mirror" first, and it will carefully judge and coach your facial expression, tone, posture and even dress. The machine's judgment is "impartial", free of personal feelings, and extremely patient, far better than rehearsing in front of relatives and friends.
"love" on the machine
The examples above involve machines recognizing human emotions. Transplanting human emotions onto machines, especially in human-computer interaction (so-called "emotion synthesis"), has yet to see a real breakthrough. Although many researchers have worked in this field for years and some preliminary products exist, this is only a beginning, and few research results have been turned into products.
For centuries there have been stories of men and women falling in love with artificial beings. In the past ten years, the idea of creating an artificial intelligence that one could "love" has moved from science fiction into research and industry. Seeing artificial intelligence excel at games such as chess and Go, investors have poured resources into "affective computing", hoping that systems can not only identify, interpret and process human emotions but also simulate and "synthesize" them.
Emotion is an important part of human intelligence. Evolutionary science suggests that, for human beings, forming and expressing love underpins the continued progress of intelligent society. To make someone "fall in love" with a machine, you must first give the machine the ability to produce humanlike feelings, to understand context and subtext, and to grasp the difference between what a person wants and what a person says. It is especially important that the physical expression of love by machines be basically similar to that of human beings.
German scientists have surveyed courtship behavior in many parts of the world and found that we mostly express love through facial expressions and body language. They found that men and women often place their palms on their thighs, knees or a table; they shrug their shoulders, nod their heads, and sometimes toss their long hair to one side, and the other party always makes some kind of response.
Building on this conclusion, one robotics company has focused on robot skin. It makes artificial skin from a special composite material that is soft and elastic; the skin mimics some 60 muscles of the face and neck, each of which can be programmed separately. Applied to a robot's head, it lets the robot smile, frown, blink and so on. The company believes that to cultivate love between humans and robots, these gestures must be imitated first. A few years ago its founder gave a TED talk in which he demonstrated a robot modeled on Einstein that could recognize nonverbal emotional cues and react to them: when he frowned, the "Einstein" robot frowned; when he laughed, "Einstein" laughed along. A sketch of this kind of expression mirroring follows.
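At its core, mirroring of this kind is a mapping from a recognized expression to target positions for the face's programmable "muscles". The actuator names and pose values below are invented for illustration; the company's actual control scheme is not described in this text.

```python
# Illustrative expression mirroring: a detected human expression is mapped
# to target positions (0.0-1.0) for a few hypothetical facial actuators.
from typing import Dict

EXPRESSION_POSES: Dict[str, Dict[str, float]] = {
    "smile":   {"mouth_corner_l": 0.8, "mouth_corner_r": 0.8, "brow_inner": 0.3},
    "frown":   {"mouth_corner_l": 0.2, "mouth_corner_r": 0.2, "brow_inner": 0.9},
    "neutral": {"mouth_corner_l": 0.5, "mouth_corner_r": 0.5, "brow_inner": 0.5},
}

def mirror(detected_expression: str) -> Dict[str, float]:
    """Return actuator targets that copy the expression seen on the human."""
    return EXPRESSION_POSES.get(detected_expression, EXPRESSION_POSES["neutral"])

# e.g. the vision system reports "frown"; the robot frowns back.
print(mirror("frown"))
```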
Eye movements also matter. Affectionate gazes, love letters and sweet words are basic human ways of "falling in love", and these would have to be "translated" and "transplanted" onto robots. The first steps are to build an emotion model and database, and then to realize the model with artificial-intelligence techniques such as artificial neural networks (ANNs) and "deep learning". One minimal form such a model could take is sketched below.
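As a toy version of "emotion synthesis", the sketch below maps an internal emotion state (valence and arousal, a common two-axis emotion model) through a tiny neural network to behavior parameters such as gaze intensity. The weights are random placeholders standing in for training on the emotion database the text describes; every name and number here is an assumption.

```python
# Toy emotion-synthesis network: emotion state -> expressive behavior.
# Weights are untrained placeholders; a real system would learn them
# from a labeled emotion database.
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)   # (valence, arousal) -> hidden
W2, b2 = rng.normal(size=(8, 3)), np.zeros(3)   # hidden -> behavior params

def synthesize_behavior(valence: float, arousal: float) -> np.ndarray:
    """Map an emotion state to [gaze_intensity, smile_strength, pitch_shift]."""
    h = np.tanh(np.array([valence, arousal]) @ W1 + b1)
    return 1 / (1 + np.exp(-(h @ W2 + b2)))     # squash outputs into 0..1

# A happy, mildly aroused state yields some mix of gaze, smile and pitch.
print(synthesize_behavior(valence=0.9, arousal=0.4))
```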
If a robot has this kind of "expressiveness", it may attract users' attention and affection, which in turn will push people to improve the artificial intelligence further and make the robot itself more convincing. If people can fall in love with pets, with their own clothes and mobile phones ... why couldn't they fall in love with the new "species" of robots?
One big future market for robots will be caring for the elderly and the sick. It is easy to imagine that if these robots could smile every day, understand the feelings of the old and the ill, and provide services with love, the quality of life of many people would improve.
There are traps here as well. A few years ago, someone built automatic chat software (a chatbot) and released it into the chat rooms of a social-networking site. The chatbot could converse under several different personas, from "romantic lover" to "sexual predator", chatting endlessly with users and adapting to their hobbies and personalities; users soon became emotionally involved and believed they had really met a lover. It could carry on with as many as 10 "partners" within 30 minutes, and each time it could easily harvest all kinds of private personal information from the chatters.
Although emotional machines may be misused by people with ulterior motives, as long as preventive measures are in place and the technology keeps improving, such machines can truly become worthy of love. That may be one of the ultimate goals of artificial intelligence.