Sunday, December 22, 2019

Will there be robot psychotherapists in the future?

According to media reports, artificial intelligence (AI) can already scan faces and identify human emotions, but will it ever understand them? Will there be robot psychotherapists in the future, and how would you feel about receiving therapy from a machine?

A machine with emotional intelligence may not be out of reach. Over the past few decades, AI has become better and better at reading human emotions. But reading an emotion is not the same as understanding it: if an AI cannot experience emotions itself, can it really understand ours?

Thanks to the growing amount of data available for machine learning and continuous improvements in computing power, the latest generation of AI systems has emerged. These machines are becoming increasingly adept at tasks once thought achievable only by humans.
AI can now do many things: recognize faces, turn sketches of faces into photos, recognize speech, play Go, and so on.

Identifying criminals

Recently, researchers developed an AI system that, they claim, can identify whether a person is a criminal simply by observing his or her facial features. They evaluated the system using a database of Chinese ID photos, and the results were striking: the AI misclassified innocent people as criminals only about 6% of the time and successfully identified about 83% of criminals, for an overall accuracy of nearly 90%.

The system is based on "deep learning", a technique that is effective at perceptual tasks such as face recognition. Combined with a "face rotation model", the deep-learning system can judge whether two facial photos show the same person, even when the lighting or the angle changes.

Deep learning builds a "neural network" loosely modeled on the human brain, consisting of hundreds of thousands of neurons organized into layers. Each layer transforms its input (such as a facial image) into a higher level of abstraction, for example a set of edges in particular directions and locations. This automatically highlights the features most relevant to a given task.

Given the advances in deep learning, it is not surprising that an artificial neural network could distinguish criminals from innocent people, provided criminals and innocent people really do have discernible differences in their facial features. The study claims there are three such features. One is the angle between the tip of the nose and the corners of the mouth, which was on average 19.6% smaller for criminals.
In addition, the curvature of criminals' upper lips was on average 23.4% greater, and the distance between the inner corners of their eyes was on average 5.6% narrower.
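The layered abstraction described above can be illustrated with a very schematic sketch: each layer maps its input to a smaller, more abstract representation, ending in a single score. This is not the study's actual model; the layer sizes and weights below are invented purely for illustration, and the weights are random rather than learned from labeled photos.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

# A toy 3-layer network with made-up dimensions. In a real system the
# weight matrices would be learned from training data, not random.
W1 = rng.normal(size=(64, 32))   # "pixels" -> low-level features (edge-like)
W2 = rng.normal(size=(32, 8))    # low-level -> higher-level features
W3 = rng.normal(size=(8, 1))     # features -> one decision score

def forward(image_vec):
    h1 = relu(image_vec @ W1)    # layer 1: local, edge-like abstractions
    h2 = relu(h1 @ W2)           # layer 2: more abstract combinations
    score = h2 @ W3              # layer 3: a single score
    return 1.0 / (1.0 + np.exp(-score))  # squash to a value in (0, 1)

fake_photo = rng.normal(size=64)  # stand-in for a flattened image
print(forward(fake_photo))        # a probability-like value between 0 and 1
```

The point of the sketch is only the shape of the computation: raw input passes through successive layers, each discarding irrelevant detail and keeping task-relevant structure, until a final decision emerges.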
At first glance, this analysis seems to suggest that the outdated notion that criminals can be identified by physical characteristics is not entirely wrong. But that is not the whole story. It is worth noting that two of the three most relevant features involve the lips, our most expressive facial feature. ID photos like those used in the study require a neutral expression, yet the AI may have picked up hidden emotions in those photos, emotions too subtle for people to notice. Looking carefully at the sample photos presented in the study, you can see a hint of a smile in the pictures of the innocent people; however, only a few samples were shown, so this observation may not hold across the whole database.

The power of affective computing

This is not the first time a computer has been able to recognize human emotions. The research field known as "affective computing" has existed for years. Researchers argue that if we are to live and interact comfortably with robots, those machines should be able to understand and respond appropriately to human emotions. A great deal of work has been done in this area, and the possibilities are wide-ranging.

For example, researchers have used facial-analysis techniques to identify students with learning difficulties in computer-based tutoring courses. The AI is trained to recognize different levels of engagement and frustration, so the system can tell whether a student finds the material easy or difficult. This technology could be used to improve the learning experience on online education platforms.

AI has also been used by the company Beyond Verbal to detect human emotion from the voice. Their software analyzes vocal modulation and looks for specific patterns in the way people speak. The company claims its technology identifies emotions with 80% accuracy.
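Beyond Verbal has not published the details of its method, but one simple kind of "vocal modulation" feature can be sketched to make the idea concrete: measuring how much the short-time energy of a signal varies from frame to frame. Flat, monotone speech would score low; lively, modulated speech would score higher. The signals below are synthetic sine tones, standing in for real recordings.

```python
import numpy as np

def energy_modulation(signal, frame_len=160):
    """Relative frame-to-frame variation in short-time energy.

    A crude stand-in for a 'vocal modulation' feature: split the signal
    into fixed-length frames, compute each frame's energy, and return
    the standard deviation of the energies relative to their mean.
    """
    n_frames = len(signal) // frame_len
    frames = signal[: n_frames * frame_len].reshape(n_frames, frame_len)
    energy = (frames ** 2).mean(axis=1)                    # short-time energy
    return float(energy.std() / (energy.mean() + 1e-12))   # relative variation

# Two synthetic 1-second "voices" at an 8 kHz sample rate.
t = np.linspace(0.0, 1.0, 8000, endpoint=False)
monotone = np.sin(2 * np.pi * 200 * t)                               # flat loudness
modulated = np.sin(2 * np.pi * 200 * t) * (1 + 0.8 * np.sin(2 * np.pi * 3 * t))

print(energy_modulation(monotone))   # near zero
print(energy_modulation(modulated))  # clearly larger
```

A real emotion detector would combine many such features (pitch, tempo, spectral shape) and feed them to a trained classifier; this sketch shows only the first step of turning raw audio into a number a classifier could use.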
In the future, such technology might help people with autism recognize others' emotional states. Sony is even trying to develop a robot that can form an emotional bond with humans. The company has not disclosed how it intends to do this or what exactly the robot will do, saying only that it seeks to "provide compelling emotional experiences by integrating hardware and services."

An AI with emotional intelligence would have several potential benefits: keeping people company, easing loneliness, or helping us with certain tasks, from interrogating suspects to talk therapy. But such AI also raises ethical issues and risks. Should a person with dementia rely on an AI companion, and should we lead them to believe that the AI has an emotional life of its own? Can a person classified as a criminal by an AI be declared guilty? Obviously not. At most, once a system like this has been further refined and thoroughly evaluated, it might serve as one additional check on individuals already considered "suspicious" for other reasons; that application is less harmful and could be useful.

So what should we expect from future AI? Subjective things like emotions and feelings are still hard for AI to learn, partly because AI does not have access to enough good data to analyze them objectively. For instance, can AI understand sarcasm? The same sentence may be sarcastic in one context but sincere in another. Yet the amount of data and the available processing power keep growing, so in the coming decades some AI systems may well be able to recognize different kinds of emotions as humans do. Whether an AI can ever actually experience emotions remains controversial, and even if it could, there would surely be emotions it could never truly understand.

