On February 23, Taiwan's Kinmen Airport announced that the emotional robot Pepper would join its terminal service team as an airport guide to welcome passengers.

"The question is not whether intelligent machines can have any emotions, but whether machines can be intelligent without any emotions," said Marvin Minsky, a father of artificial intelligence. Human emotions are hard to quantify, and emotional robots are therefore hard to build: a robot's "emotions" are the result of quantifying invisible emotions into data or values that a machine can understand and express. Can people really resist the temptation of a machine that "reads minds"? And once a robot has consciousness and emotions, is it "human"?

What is "sentiment analysis"? How does a robot learn "affective computing"? Where do emotional robots come from?

With the rising level of domestic artificial intelligence, Ai, an artificial intelligence program developed by Alibaba Cloud, was able to predict the champion of "I Am a Singer" last year. "AlphaGo has no way to enter people's lives, because ordinary people do not use AlphaGo's technology. One of our most important tasks is to turn artificial intelligence technology into products that people can enjoy in their lives and that can help them from time to time," said Jian Renxian, former deputy dean of the Microsoft Asia Engineering Academy.

In fact, robots with rich, human-like emotions have been enthusiastically received as soon as they launched. Pepper, the world's first emotional robot, developed by Japan's SoftBank, went on sale online early last year at 198,000 yen (about 11,000 yuan); the first 1,000 units sold out within a minute.

At the end of the 20th century, Rosalind Picard, an MIT professor, proposed the concept of "affective computing."
She approached it from two directions. From the physiological side, various physiological parameters of the human body, such as heartbeat, pulse, and brainwaves, are measured, and the person's emotional state is computed from them. From the psychological side, the robot receives and processes environmental information through various sensors and computes its own emotional state accordingly.

Current research on sentiment analysis can be summarized under four headings: construction of emotional resources, extraction of emotional elements, sentiment classification, and sentiment analysis application systems.

Artificial intelligence experts point out that machines have nothing corresponding to the parts of the human brain that carry emotional functions, such as the amygdala, hippocampus, and cingulate gyrus. Machine emotion recognition is therefore bound to be very different from human emotion, and the computer cannot be required to have a real emotional experience as part of the recognition process. Machines do, however, have advantages that humans lack, such as fast computation and the comprehensive data obtainable through the Internet of Things.

Unlike AlphaGo, which defeated Lee Sedol, the machine learning algorithms that read emotions rely on supervised learning and deal with regression and classification problems. "We feed the machine many, many groups of labeled data, telling it which units changed and which emotion that indicates. After the machine digests these data, it spits out a model, and that model can then be applied to other expressions, voices, or changes in posture," an industry expert said, explaining the meaning of "supervised."

What does it mean to have emotion?

The significance of giving robots emotion should not be underestimated. For humans, emotional robots can provide warmer and more humane services; for robots themselves, emotions may well awaken a sense of autonomy. Compared with the former, the latter may deserve more attention.
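The "supervised" workflow the expert describes can be sketched in a few lines: labeled examples go in, a model comes out, and the model classifies new readings. The feature values (heart rate, skin conductance) and labels below are invented for illustration; a real system would use far richer signals and a trained statistical model rather than this minimal nearest-centroid sketch.

```python
# Minimal sketch of supervised emotion recognition: feed the machine
# labeled data, let it "digest" the data into a model (per-label
# centroids), then apply the model to unseen readings.
# All feature values and labels here are illustrative, not real data.

import math

def train(samples):
    """samples: list of (features, label). Returns one centroid per label."""
    sums, counts = {}, {}
    for features, label in samples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, v in enumerate(features):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in acc]
            for label, acc in sums.items()}

def predict(model, features):
    """Return the label whose centroid is closest (Euclidean distance)."""
    return min(model, key=lambda label: math.dist(model[label], features))

# Hypothetical training data: (heart_rate, skin_conductance) -> emotion
labeled = [
    ((110, 0.9), "angry"), ((115, 0.8), "angry"),
    ((70, 0.2), "calm"),   ((65, 0.3), "calm"),
]
model = train(labeled)
print(predict(model, (108, 0.85)))  # a fast, sweaty reading -> "angry"
```

The same structure carries over when the inputs are facial action units or acoustic features instead of physiological readings; only the feature extraction changes.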
In the 1970s, the American psychologist Paul Ekman proposed a way of encoding facial emotion: the Facial Action Coding System (FACS). By combining different coded action units, a robot can automatically recognize and synthesize complex changes of expression, such as happiness, anger, and sadness; motion-analysis models and acoustic models work in a similar way. Unlike a person, a robot does not need a real psychological experience in order to express a given emotion.

Take Pepper, the world's first robot that perceives emotion and responds to it. SoftBank CEO Masayoshi Son recalled that, watching Astro Boy as an adolescent, he wanted to give a robot a soul with rich emotions; Pepper's design philosophy is to accompany humans. Unlike the "robots" already widely used in industry and the service sector, Pepper is not good at cleaning, is unsuited to cooking, and has no shop-floor skills. It is, however, equipped with speech recognition, articulated-joint technology, and emotion recognition, covering the sensory channels humans grasp most intuitively: sound, touch, and affect. It can identify human expressions, tone of voice, joy, anger, and so on, and respond according to the person's emotion.

Today, nearly 200 emotional applications have been launched for Pepper. Pepper's diary, for example, can take pictures at family events, write diary entries, and store family members' memories like a smart photo album; it can also guess a person's psychological state at the moment, then cut into the scene to chat with you and tell jokes. More than 10,000 Peppers are currently serving people in Japan and Europe, and the United States is interested in introducing them.

The Han robot developed by Hong Kong's Hanson Robotics can not only understand a user's emotions but also express emotional feedback through simulated facial expressions.
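The FACS idea of building expressions out of coded action units can be illustrated with a small lookup: an observed face is a set of active action units (AUs), and combinations of AUs map to emotion labels. The AU numbers below follow commonly cited FACS descriptions (e.g. AU6 "cheek raiser" plus AU12 "lip corner puller" for happiness), but this tiny table is a simplified assumption, not a complete coding system.

```python
# Sketch of FACS-style recognition: each emotion is a combination of
# numbered facial action units (AUs); a face matches an emotion when all
# of that emotion's AUs are active. The table is illustrative only.

EMOTIONS = {
    frozenset({6, 12}): "happiness",    # cheek raiser + lip corner puller
    frozenset({4, 5, 7, 23}): "anger",  # brow lowerer, lid raiser/tightener, lip tightener
    frozenset({1, 4, 15}): "sadness",   # inner brow raiser, brow lowerer, lip corner depressor
}

def classify(active_units):
    """Return the emotion whose AU pattern is contained in the observed units."""
    observed = frozenset(active_units)
    for pattern, emotion in EMOTIONS.items():
        if pattern <= observed:         # all required AUs are active
            return emotion
    return "unknown"

print(classify([6, 12]))         # -> happiness
print(classify([1, 4, 15, 17]))  # extra AUs still match sadness
```

Synthesis runs the same table in reverse: to display sadness, the robot actuates the units in the sadness pattern. No inner experience is required, which is exactly the point the article makes.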
The domestic company Gowild has also launched the "Xiaobai" robot, which offers life-assistant services and enhanced social and emotional communication for young people. In fact, none of these products starts from the cognitive mechanism; they judge human emotions through external signals: words, expressions, and body movement.

Will emotions become the core?

"Intelligent robots will become a very expressive field in the future, especially where they touch daily life, and they will become ever more 'humanized.' The Xiaobai robot is not a traditional robot but a growing companion, warm-hearted and even a little 'comforting,'" said Qiu Nan, founder of Gowild Intelligent Technology, the maker of the robot.

On the application side, combining sentiment analysis with artificial intelligence will generate a series of applications with broad potential business value; much is expected in medical care, public services, research, and the smart home, for example identifying a user's emotion in a chatbot and offering emotional comfort. Further ahead, sentiment analysis could be applied to the appreciation of articles and poems, automatically generating the machine's own opinions, positions, and emotions, and thus moving toward strong artificial intelligence.

"Artificial intelligence is not a single technology, and there is no stopping point for its research and development. It is a continuing simulation of the brain's behavior and thinking, evolving along with humans; that is the essence of artificial intelligence," Jian Renxian believes. If, as the Internet forecaster Kevin Kelly puts it, artificial intelligence is the defining technology of the next 20 years, then, experts speculate, emotional interaction is gradually becoming the core technology of future artificial intelligence.
Robots understand human emotions in three steps

Experts explain that a robot comes to understand human emotion in three steps: recognition, understanding, and feedback.

Recognition covers human expressions and sounds, including contextual judgment and natural language processing, and divides into expression recognition and tone recognition. Expression recognition can be roughly divided into several common modes: happy, excited, angry, and so on. Each mode can convert into another, but the probability of each conversion differs. Tone recognition can likewise be divided into common modes, but recognizing mood is much harder: it is not enough to analyze semantics with natural language processing; the robot must really understand.

Understanding means the robot comprehensively analyzes human behavior and accurately determines the person's emotional state. Technically, this means collecting a large number of samples describing human psychology; as more and more text is collected, the robot's recognition accuracy rises.

Once it understands the human mind, the next step is to respond. In this respect, Japan's humanoid robots walk ahead.
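The three-step loop above can be sketched as a toy pipeline: a recognition stage maps raw cues to a coarse emotion, an understanding stage refines the guess using context, and a feedback stage picks a response. The keyword cues and canned replies are invented for illustration; a real system would use trained models at each stage.

```python
# Toy sketch of the recognition -> understanding -> feedback loop.
# Cue words and responses are made up for illustration only.

RECOGNITION_CUES = {          # step 1: raw cue -> coarse emotion
    "!": "excited", "sigh": "sad", "great": "happy",
}

RESPONSES = {                 # step 3: feedback chosen per emotion
    "happy": "That's wonderful to hear!",
    "sad": "I'm sorry. Do you want to talk about it?",
    "excited": "Tell me more!",
    "neutral": "I see.",
}

def recognize(utterance):
    for cue, emotion in RECOGNITION_CUES.items():
        if cue in utterance.lower():
            return emotion
    return "neutral"

def understand(emotion, context):
    # step 2: refine using context (here, the previously seen emotion)
    if emotion == "neutral" and context:
        return context[-1]    # assume the earlier mood persists
    return emotion

def respond(utterance, context=()):
    emotion = understand(recognize(utterance), context)
    return emotion, RESPONSES[emotion]

print(respond("sigh, long day"))
```

The contextual carry-over in `understand` is a stand-in for the article's point that recognition alone is not enough; the robot must also judge context before choosing feedback.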