- Canadian startup claims to have developed the world’s first context-aware AI
- The focus is on retail and entertainment, with plans to expand into other sectors
- Will this technology make human sales assistants obsolete?
They say that flattery will get you everywhere, and that statement is true more often than not, particularly in the retail world. How many times have you bought a piece of clothing just because a salesperson said it looked good on you? It’s a simple but proven sales technique that’s difficult for anyone to resist, and it’s been used to get people to buy things for thousands of years. But how would you feel if the flattery didn’t come from a human but from a digital sales ‘person’? Thanks to TwentyBN, we may soon find out.
Canadian startup claims to have developed the world’s first context-aware AI
Canadian startup TwentyBN recently unveiled an AI-powered avatar called Millie, designed specifically to get you to spend more money in retail stores by showering you with compliments. Millie is a life-sized digital companion that uses computer vision technology to understand human actions and interact with consumers in real time. “Millie is a context-aware avatar that we’re showing for the very first time,” said Roland Memisevic, the company’s co-founder and CEO, when Millie was unveiled at the recent NeurIPS conference in Montreal, Canada. “For this particular use case a person stands in front of Millie, and what Millie does is she has a conversation. Here we’ve put several pairs of sunglasses on the table and Millie is there to encourage you to try them out.”
Millie uses advanced computer vision technology to observe the world around her, detect what people are doing, and understand the context. “The deep scientific premise behind this project is that we believe that in order to teach a system to become smarter, to even have a conversation about things that are going on in the real world, you have to start with vision, you have to start with a camera, that’s what we did here,” adds Memisevic. The company uses NVIDIA Tesla GPUs and the cuDNN-accelerated PyTorch deep learning framework to train Millie’s computer vision system to detect and recognise physical actions. To gather the training data, TwentyBN built a proprietary crowd-acting platform and used it to collect millions of human engagements: rather than filming the ‘dirty work’ itself, the company paid ordinary people to record themselves performing common actions in and around stores, such as picking up products, opening freezers, and pushing shopping carts. So far, it has collected more than 220,000 video clips of people performing basic actions with everyday objects, all of which were fed into the system. The dataset was also made available free of charge for research purposes.
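To give a feel for what training such an action-recognition system involves, here is a minimal PyTorch sketch of a 3D-convolutional video classifier trained on short labelled clips. The architecture, the class count, and the tensor shapes are illustrative assumptions for this sketch, not TwentyBN’s actual model.

```python
import torch
import torch.nn as nn

class TinyActionNet(nn.Module):
    """A toy 3D-CNN that maps a short video clip to action-class logits."""
    def __init__(self, num_actions=174):  # class count is an assumption
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(3, 16, kernel_size=3, padding=1),  # convolve over time and space
            nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),                     # pool to one vector per clip
        )
        self.classifier = nn.Linear(16, num_actions)

    def forward(self, clips):
        # clips: (batch, channels, frames, height, width)
        return self.classifier(self.features(clips).flatten(1))

model = TinyActionNet()
clips = torch.randn(2, 3, 8, 64, 64)   # two 8-frame RGB clips, 64x64 pixels
labels = torch.tensor([0, 5])          # e.g. "picking something up"
logits = model(clips)
loss = nn.functional.cross_entropy(logits, labels)
loss.backward()                        # gradients for one training step
print(logits.shape)                    # torch.Size([2, 174])
```

In practice a model like this would be trained for many epochs over the full crowd-acted clip collection, with real optimiser steps between batches.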
The focus is on retail and entertainment, with plans to expand into other sectors
Right now, Millie can detect when a person enters the store, make eye contact with them, and follow them around with her gaze. She can also detect which product people are looking at and respond by encouraging them to pick it up or showing them how it works. While her communication skills are still quite unpolished, she can understand and answer simple questions and engage customers in basic conversation, thanks to built-in speech recognition and natural language processing software. Furthermore, facial recognition software allows her to remember people and address them by name the next time she sees them. “It’s really something new and exciting that shoppers are going to want to see,” says Memisevic. “It is novelty that draws attention, but then you really start to feel a connection to her.”
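The remember-and-greet behaviour described above boils down to a lookup keyed on a face identity. The following sketch is purely hypothetical: the function name, the face-ID strings, and the greeting text are stand-ins for illustration, not TwentyBN’s implementation.

```python
def greet(known_faces, face_id, name=None):
    """Return a personalised greeting if this face has been seen before.

    known_faces: dict mapping a face ID (from a recognition model) to a name.
    face_id:     ID produced for the shopper currently in view (hypothetical).
    name:        name learned during this visit, stored for next time.
    """
    if face_id in known_faces:
        return f"Welcome back, {known_faces[face_id]}!"
    if name:
        known_faces[face_id] = name  # remember the shopper for future visits
    return "Hi there! Feel free to try on a pair of sunglasses."

known = {"face-42": "Alex"}
print(greet(known, "face-42"))          # Welcome back, Alex!
print(greet(known, "face-99", "Sam"))   # first visit: generic greeting
print(greet(known, "face-99"))          # Welcome back, Sam!
```

The real system would sit behind a camera pipeline producing the face IDs, but the store of remembered shoppers follows the same shape.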
According to TwentyBN, Millie could have a wide variety of applications in retail and entertainment. She could, for instance, perform the role of an intelligent store assistant, smart advertiser, concierge, or personal trainer. In the future, the company hopes to expand Millie’s skill set and human understanding even further, which would allow her to assume additional roles in other sectors as well. She could be a dance instructor, workplace assistant, or a child’s learning partner. Millie’s appearance and behaviour are fully customisable, allowing her to take on the most appropriate form for the desired context.
Will this technology make human sales assistants obsolete?
The company is currently negotiating trials with several retailers. Still, those who decide to give Millie a chance should proceed with caution. After all, not everyone likes the idea of being watched and having their every move monitored, which is what Millie needs to do to fulfil her function. That means that some people might find the whole experience unsettling, rather than compelling. “While this kind of tech is still novel, it is a way to get people into the store, but it might not be for everyone,” says Natalie Berg, the founder of the UK retail consultancy NBK Retail. Some people are also concerned that this technology could make human sales assistants obsolete, but Memisevic disagrees. While salespeople’s roles might change slightly, he believes that AI assistants like Millie will augment their jobs rather than take them over, allowing human workers to focus on deeper interactions with customers.
As e-commerce continues to grow in popularity, brick-and-mortar retailers are constantly forced to look for new ways to increase their in-store sales. To provide a more enjoyable and personalised consumer experience, retailers are increasingly turning to new technologies like artificial intelligence and computer vision. While it’s still too early to tell whether AI-powered digital assistants like Millie will catch on, it’s not too difficult to imagine a future where a life-sized avatar will greet us by name as we walk into a store, guiding us through every step of the shopping process.