Brief Published: 11 Dec 2015

Olly: Robot Helper with Personality


London-based start-up Emotech launched a voice-controlled robot assistant with a unique personality at technology event TechCrunch Disrupt, held in London on December 7-8. Still in prototype form, Olly resembles a large animated eyeball set in a cup-shaped base that swivels to look at the user as they speak.

The responsive robot uses artificial intelligence and machine learning to establish its owner's preferred way of communicating, meaning it will interact in a way that specifically suits the owner's character and needs. "For example... I'm quite curious about everything, so my Olly is more pro-active, talks fast and always tries to give me more information," co-founder Chelsea Chen told TechCrunch. "But if the person is more serious, more logical, then theirs will not be like my more emotional Olly, but very data-driven instead."

Olly can instantly access requested information, sound morning wake-up calls and act as a hub for controlling multiple smart devices in the connected home. As the robot learns its owner's habits, it will also be able to anticipate certain lifestyle needs without any verbal prompt – for example, playing music at certain times. The device is set to launch publicly in around a year and a half, depending on a crowdfunding campaign the company plans to launch in early 2016.

Consumers increasingly expect brands and devices to learn from their behaviour and carry out tasks intuitively. See Predictive Tech and Digital Worlds Update: The Consumer of 2030 for more.