Tech Show 2017: Advanced Bots & Brain-Interfacing Retail
Spotlighting the growing importance of live, conversational and more intuitive forms of real-time commerce, UK trade publication Retail Week’s inaugural Tech conference took place in London last week (September 13-14). The event tackled both here-and-now and futuristic concepts driven by artificial intelligence (AI), with key developments including the prospect of brain-controlled interfaces and robots. Here are the highlights.
- Individualised Fashion Advice Upgrades Service Bot: UK fashion e-tailer Very’s retail and technology director Jon Rudoe revealed that the company is working with IBM’s Watson AI software to upgrade its bot-based Assistant app, launched in late 2016. The evolved service will respond directly to consumers’ questions to provide individualised fashion advice, such as which shoes suit a specific outfit. At present, the bot only deals with more basic ‘pick from a menu’-type queries, such as tracking a parcel or handling returns, easing pressure on customer service staff.
- Shoppers Search, Brands Get to Look On: Tracy Issel, general manager of worldwide retail at Microsoft, shared the company’s plans to launch interactive AI bot Cami with British electronics brand Dixons Carphone in November 2017. Cami will let customers check what’s in stock, help them research big-ticket purchases such as TVs or white goods by asking questions, and autonomously find and save specific products to their online profile. It will also allow store staff to see what each customer has been searching for, giving them a realistic idea of what’s building momentum.
- Dawn of ‘Brain-Computer’ Interfaces: Naji El-Arifi, head of innovation at British strategic marketing agency Salmon, discussed the more futuristic evolution of real-time retail. He flagged Facebook’s 2017 development work on a “brain-computer interface” that allows users to type 100 words per minute using just their minds, without invasive implants. Instead, the team plans to use optical imaging to scan the brain 100 times per second to detect ‘internal speaking’ and translate it into text. It’s currently working on the project with universities including UC San Francisco, UC Berkeley, Johns Hopkins University’s Applied Physics Laboratory and Washington University School of Medicine.
Facebook isn’t the only organisation exploring brain-computer interfaces. Scientists at Massachusetts Institute of Technology’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and Boston University are developing a brain-controlled robot using data from an electroencephalography (EEG) monitor that records brain activity. The aim, says CSAIL director Daniela Rus, is to instigate actions instantaneously without needing to type or even speak a command – “a streamlined approach that would improve our ability to supervise factory robots and driverless cars, and other technologies we haven’t even invented yet”. Such concepts could fuel some of the ideas pinpointed in Retail: Workforce Tech Innovations, 2017.
In spring 2017, US tech entrepreneur and PayPal co-founder Elon Musk launched Neuralink to explore this form of computing. Its first application is in medicine, helping people who are paralysed or have motor difficulties. But the technology could potentially unlock new possibilities in the growing realm of shopping via ambient interfaces – even transcending voice-activated concepts. For example, it could allow someone to order more coffee simply by consciously thinking it. However, while investment in the area is growing, the technology is several years away from being robust enough to operate in a commercial or home environment.