Shopping In A Galaxy Not So Far Away: AI And Robots In E-Commerce

by Erica Vialardi

Unconscious imprinting from growing up a Gen-Xer during the original Star Wars years tells my generation that robots are futuristic. Except that is no longer so, and hasn’t been for a while now.

One of the features that made robots from Star Wars so awe-inspiring when we were children was their ability to understand and express human sentiments and behavior. Think about little R2-D2 rolling away offended, or C-3PO’s exaggerated politeness building bonds and empathy with viewers.

AI and robots are “invading” our daily lives, especially in shopping and e-commerce. Here are three robots and related technologies that are starting to integrate meta-communication, going beyond purely utilitarian language to make their interactions with consumers more human.

Meta-communication: When chatbots speak from another dimension

If you’ve ever been stuck with a customer support chatbot commanding you to “simplify your question,” you probably wondered why it even exists, since it appears to comprehend only the questions listed in the company’s FAQ.

What if, instead, bots could pick up on the emotional state behind a customer support request, such as whether a customer is angry? One interesting concept is Charly the Chatbot being taught to understand emotional communication and react accordingly. That kind of intuition could lighten the mood between customers and brands with entertaining, small-talk conversations, and understanding secondary, unspoken meanings would make chatbot communication and problem-solving with customers much more efficient.
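
To give a flavor of what that could look like under the hood, here is a minimal, purely illustrative Python sketch. It is not Charly’s actual implementation; the keyword list and function names are invented stand-ins for a real sentiment model, but they show how a detected emotion could change the bot’s reply instead of pointing everyone to the same FAQ answer.

```python
# Toy sketch: routing a support request based on detected emotion.
# The keyword heuristic below is a stand-in for a real sentiment model,
# and the function and response names are hypothetical, not Charly's API.

ANGRY_MARKERS = {"angry", "furious", "ridiculous", "unacceptable", "worst"}

def detect_emotion(message: str) -> str:
    """Very rough emotion guess from keywords (illustrative only)."""
    words = {w.strip(".,!?").lower() for w in message.split()}
    return "angry" if words & ANGRY_MARKERS else "neutral"

def reply(message: str) -> str:
    """Pick a tone for the reply instead of a one-size-fits-all FAQ answer."""
    if detect_emotion(message) == "angry":
        return ("I'm sorry this has been so frustrating. "
                "Let me get a human colleague to look at your order right away.")
    return "Happy to help! Could you tell me your order number?"

if __name__ == "__main__":
    print(reply("This is unacceptable, my package is two weeks late!"))
    print(reply("Hi, where can I track my delivery?"))
```

In practice the keyword check would be replaced by a trained sentiment classifier, but the routing idea stays the same: the emotional reading, not just the literal question, decides what the bot says next.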

Body language: An army of clones could help

You may have already encountered an in-store robot assistant in some larger retailers’ stores. However, interacting with a robot in a public space still holds a dose of awkwardness.

Speaking live with another person means using words, tone of voice, facial expressions, and gestures. But what “face” is one supposed to display when talking to a robot in front of other people? One quality that makes the humanoid robot Pepper so well-received is its capacity to use and mimic some body language.

Pepper can move its arms, turn its head toward you, and even use its eyes (not a weird built-in scanner) to scan your coupons. User-friendly features like these can delight customers and even serve up some nostalgic vibes for Gen-Xers and older who dreamed of communicating with C-3PO.

Cultural awareness: The whole galaxy matters

When Siri was first released a few years ago, it seemed that her main function was to have people shout unspeakable sentences at her to test her array of smart-aleck, generic retorts.

Now that voice and conversational commerce are gaining a foothold in our everyday lives, the time has come to move on from mono-cultural voice assistants that express all-purpose phrases. The way we relate to shopkeepers and salespeople can vary greatly by geography and culture.

This is something to consider carefully when programming voice devices, especially when attempting to make the robots more “human” through casual chit-chat. For example, some cultures may prefer different modes of communication based on gender or age, or an AI may need to expressly announce that a statement is intended as humor.
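
As a purely hypothetical illustration (the locale profiles and flags below are invented for this sketch, not drawn from any particular voice platform), culture-aware behavior can start with something as simple as keying the assistant’s small talk off a per-locale configuration:

```python
# Hypothetical sketch: adapting an assistant's small talk to cultural settings.
# The locale profiles and the "label_humor" flag are invented for illustration;
# a real voice platform would derive these rules from localization research.

from dataclasses import dataclass

@dataclass
class CultureProfile:
    formal_address: bool  # prefer formal forms of address
    label_humor: bool     # explicitly flag jokes as jokes

PROFILES = {
    "en-US": CultureProfile(formal_address=False, label_humor=False),
    "ja-JP": CultureProfile(formal_address=True, label_humor=True),
}

def small_talk(locale: str, joke: str) -> str:
    # Fall back to the most cautious style for unknown locales.
    profile = PROFILES.get(locale, CultureProfile(True, True))
    greeting = "Good day." if profile.formal_address else "Hey there!"
    quip = f"(Just joking:) {joke}" if profile.label_humor else joke
    return f"{greeting} {quip}"

if __name__ == "__main__":
    print(small_talk("en-US", "I promise not to take over the galaxy."))
    print(small_talk("ja-JP", "I promise not to take over the galaxy."))
```

The point is not the specific rules, which real products would get from localization research, but that the same joke or greeting may need to be delivered, softened, or explicitly labeled differently depending on who is listening.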

The current excitement around robotics and artificial intelligence is starting to provide answers about the best ways for robots to interact with customers, and where to draw the line between a frustrating or awkward shopping experience and a satisfactory, mutually beneficial one.

After all, as C-3PO said: “He’s quite clever, you know… for a human being.”
