Whenever there is speculation of a robot passing the Turing Test, it makes the news. On the other hand, there is very little chatter when a human fails it, as I suspect I nearly did after a long day on the phones in a former life as a customer success representative. After 256 repetitions of, “Hi, this is Alison, and you’ve reached Tech Company.com,” convincing a caller that I was inhuman was hardly a challenge.
We all know that monotonous, repetitive labor can wear away at the humanity in us all, but in many ways, this presents an opportunity for us at SoftBank.
Restoring humanity to human occupations is just one of the ways our robots can be powerful agents of change. As we continue developing Nao and Pepper, we are working to create robots that can supply unlimited patience and even bring empathy to difficult situations.
Why Robots Are Excellent Listeners
Studies show that 45% of our waking hours are spent listening. And, while we like to think that we are great listeners as a species, we humans have our limits. Just ask anyone who has worked at a mall, or manned an information booth at a music festival. This is where robots can provide interesting solutions to difficult problems.
Not only do robots have an unlimited capacity to listen, but their inputs are not colored by their own emotions, fatigue, or memory. Robots are perfectly happy to answer the same question 500 times a day, answer the same question in a variety of ways, and answer the same question in a variety of languages.
Additionally, robots are free of the judgment that often comes with being human. A robot does not care that it’s 2018 and you still don’t know how to play your music through your portable speaker or set up the Bluetooth audio in your car. Thus, people are often less inhibited when talking to robots. Consider this: which is a more candid and personal record of what you want and value, what you put on your resume or your Google browsing history?
The ability to listen without judgment or frustration is a critical value-add for robots. As we continue to build out our arsenal at SoftBank, here are some ways we hope our robots will provide support without limits.
Robots In Therapy and Mental Health
In early experiments with Eliza, one of the first chatbots, built in 1966, creator Joseph Weizenbaum was surprised by the number of people who attributed human-like feelings to the computer program. This goes to show how willing humans are to form connections with robots.
Today, robots have already proven their use in a variety of therapeutic settings. In Japan, Pepper is one of many robots that have been used to provide comfort and support to nursing home residents looking for companionship and for ways to keep their minds sharp and engaged. Companion pet robots have also become popular, giving the elderly a chance to love an animal even after they are no longer physically able to care for live pets.
For people who are uncomfortable seeing a live therapist -- due to perceived stigma or social anxiety -- robots can be non-judgmental listeners. A recent NPR article discussed at length the eagerness with which humans open up to robots. This eagerness has even led to a documentary being “shot” by robots. These “blabdroids,” which are responsible for interviewing the documentary’s subjects, have been incredibly successful in getting their interviewees to open up about their deepest hopes, fears, and dreams.
Because people are so comfortable telling a robot things they would never tell another human, there is a huge opportunity for robots to help people work through issues without stigma.
Robots as Educational Supports
Education -- particularly in the case of special needs students -- is another critical application for the patience and empathy of robotics. For example, the Ask Nao Program, launched in 2012, was specifically geared to help kids on the autism spectrum who struggled to thrive in a standard classroom.
When working with special needs students, robots can help teachers balance the needs of an entire classroom with the needs of the individual. For students who need additional support, robots like Nao can repeat information and leverage built-in educational applications inspired by various behavioral approaches and models (ABA, PECS, TEACCH, DENVER, SCERTS) to give students extra support in their learning. Robots don’t get frustrated or feel bad when their efforts fail, and they don’t mind spending an entire day working on one concept with a student.
In the case of Nao, the robot was actually able to facilitate faster learning for all students by combining visual and audio support to help bring messages home. For example, in a story about a guitar, Nao could show a guitar animation and play a chord every time the guitar was mentioned, providing multiple paths to understanding.
Robots in Customer Service
Customer service is an obvious place for robotics, and as chatbot technologies have improved, they have been widely adopted in a variety of industries. Over the phone or computer, robots have been incredibly successful in helping companies quickly answer routine Tier I and Tier II questions, saving human representatives for the more challenging or interesting cases.
Not only do robots drive efficiency and speed through an organization, but they can also improve the customer experience. Robots are never in a bad mood, are an easily scalable resource, and never run out of energy. They can provide a larger breadth of information and help answer questions fast by combing through large amounts of structured and unstructured data to extract relevant information.
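The idea of combing through unstructured text to surface a relevant answer can be pictured with a toy keyword-overlap search. This is an illustrative sketch only; the FAQ snippets, the question, and the scoring function are all invented for the example and are not part of any SoftBank product.

```python
# Toy retrieval sketch: rank FAQ snippets against a customer question
# by counting shared words (case-insensitive). Real systems use far
# richer matching, but the shape of the problem is the same.

def score(question, document):
    """Count distinct words the question and document have in common."""
    q = set(question.lower().split())
    d = set(document.lower().split())
    return len(q & d)

faqs = [
    "How do I pair my phone with the bluetooth speaker",
    "What are the store opening hours this weekend",
    "How can I reset my online banking password",
]

question = "How can I reset my banking password"
best = max(faqs, key=lambda doc: score(question, doc))
# 'best' is the snippet sharing the most words with the question.
```

A human agent scanning a knowledge base is doing a slower version of the same ranking; the robot’s advantage is doing it across thousands of documents in milliseconds.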
Bringing this level of customer support to the real world is a critical goal for us at SoftBank, and Pepper is already being used by a number of customers in retail, banking, and hospitality, to augment the skills of onsite staff, help get questions answered faster, and generally improve the customer experience.
Forging the Path to Success
As robots are increasingly used to support humans in deeply human capacities, there are some areas of development that are top of mind. One is mood recognition: humans can deal with ambiguity and handle things like sarcasm in a way robots can’t yet.
Today, if someone tells Pepper, “I’m disappointed,” she knows what that means, but the next step is putting words and facial expression together to infer intent. That’s an area of interest for us on the Studio team, as we teach Pepper to combine facial expression, word choice, voice sentiment, and body posture to be more effective in emotional situations.
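One simple way to picture this kind of multimodal fusion is as a weighted average of per-modality sentiment scores. Everything below (the function name, the modalities, the scores, and the weights) is a hypothetical sketch for illustration, not Pepper’s actual perception pipeline.

```python
# Hypothetical sketch: fuse per-modality sentiment scores (each in [-1, 1])
# into a single mood estimate via a weighted average. Modality names,
# scores, and weights are invented for the example.

def fuse_mood(scores, weights):
    """Return the weighted average of per-modality sentiment scores."""
    total = sum(weights[m] for m in scores)
    return sum(scores[m] * weights[m] for m in scores) / total

# "I'm disappointed" said with a smile and an upbeat tone: the words
# read negative, but face and voice read positive, hinting at sarcasm.
scores = {"words": -0.8, "face": 0.6, "voice": 0.5, "posture": 0.1}
weights = {"words": 1.0, "face": 2.0, "voice": 2.0, "posture": 0.5}

mood = fuse_mood(scores, weights)  # positive overall despite negative words
```

When the fused estimate disagrees in sign with the words-only score, that mismatch is itself a useful signal that the literal words may not reflect the speaker’s intent.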
We are getting closer to helping Pepper understand the kind of tone that disambiguates an intention that’s not expressed in the words. Once that milestone is achieved, there is a lot of opportunity for Pepper to be a unique solution in a variety of fields where it was never possible before.