SoftBank, one of the biggest mobile tech companies in Japan, has collaborated with Aldebaran Robotics and Foxconn to once again push social robotics a step forward. Say hello to Pepper, an interactive robot that can understand the emotions of whomever it speaks to.
Social robots interact with humans by mimicking human actions and roles in society. Pepper is no different; however, it has the added ability to read and interpret the emotional state of the person it is speaking to. Using an array of sensors that pick up facial expressions, tone of voice, and other cues, it tries to gauge how a person feels, learns from their reactions, and forms a sort of personality that continues to grow and evolve as it interacts with more people.
For example, if a certain action causes a person to respond more positively to Pepper, the robot builds a point-based assessment of that action. It then records the action and stores the assessment in an online database, where the data is analyzed to help Pepper handle similar social situations more effectively.
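To make that feedback loop concrete, here is a rough Python sketch of how such point-based scoring and cloud storage might work. Everything in it (the names ActionAssessment, score_reaction, and upload_assessment, as well as the weighting of sensor cues) is hypothetical and only illustrates the idea; SoftBank has not published Pepper's actual implementation.

    # Illustrative sketch only: not Pepper's real software.
    from dataclasses import dataclass

    @dataclass
    class ActionAssessment:
        action: str            # e.g. "tell_joke" or "wave" (hypothetical action names)
        reaction_score: float  # higher means the person responded more positively

    def score_reaction(facial_positivity: float, voice_warmth: float) -> float:
        # Combine hypothetical sensor readings (each scaled 0..1) into one score;
        # the 0.6/0.4 weighting is an assumption for illustration.
        return 0.6 * facial_positivity + 0.4 * voice_warmth

    def upload_assessment(assessment: ActionAssessment, database: list) -> None:
        # Stand-in for syncing the assessment to the shared online database
        # the article mentions; here it is just an in-memory list.
        database.append(assessment)

    # Example: a joke lands well, so "tell_joke" earns a high score and is stored.
    cloud_db: list = []
    score = score_reaction(facial_positivity=0.9, voice_warmth=0.7)
    upload_assessment(ActionAssessment(action="tell_joke", reaction_score=score), cloud_db)

    # Later, the highest-scoring action could be preferred in a similar situation.
    best = max(cloud_db, key=lambda a: a.reaction_score)
    print(f"Preferred action so far: {best.action} ({best.reaction_score:.2f})")

In a real system the "database" would be a cloud service shared across many robots, so an action that works well for one Pepper could inform the behavior of others, but that detail is inferred from the article rather than documented.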
Unlike other social humanoid robots that try to imitate every aspect of being human, Pepper concentrates squarely on its human-interaction AI. This is evident in its limited physical design: for instance, it doesn't walk on legs, and it has only 20 motors throughout its body for movement.
http://vr-zone.com/articles/pepper-r...eel/78899.html