Earlier this month we were featured in HomeWork, Workspace’s in-house magazine (Spring/Summer 2018 issue), giving our views on machine learning. We were in the great company of DigitalMR, Qumodo and Filament.
You can read the magazine here or pick up a copy at any Workspace location ;)
But for now, here’s the article “Can AI live up to the Hype?”
We currently live in an age of weak AI: artificial intelligence focused on one narrow task, like Apple’s Siri. However, strong AI, or general artificial intelligence – a machine capable of performing a range of intellectual tasks at least as well as a human – is thought by many in the field to be achievable within 10–20 years, though estimates vary widely.
This astonishing progress, which is already underway, is driving what is being called the fourth Industrial Revolution: a fusion of the digital and physical worlds that will, with the help of robotics and nanotechnology, fundamentally change the way we live and work. How will AI make an impact? Arthur House asks some Workspace customers at the forefront of AI innovation to share their thoughts.
According to research firm Gartner’s annual Hype Cycle, an index that tracks the progress of emerging technology trends, machine learning and deep learning – two of the most widespread AI techniques – reached peak hype in July last year. By Gartner’s own model, these two terms should have fallen into a “trough of disillusionment” by now, whereby interest wanes as experiments fail to deliver. And yet AI is still big news. Barely a day goes by without superintelligent computers or killer robots popping up in our newsfeeds. So, what exactly is AI and how are businesses using it? And can it ever live up to the hype?
If you’ve ever used a personal assistant like Siri or Alexa, listened to Spotify recommendations or received a fraud notification from your bank, you have used AI in your everyday life. For businesses and organisations, there is a dizzying array of AI applications, from customer-service chatbots to financial-market analysis. All these examples employ forms of machine learning. Its simplest version, known as supervised machine learning or predictive analytics, is basically an advanced form of statistics, in which humans feed a computer algorithm with training data to enable it to infer things about other data. The more you train an algorithm, the better its predictions tend to get.
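The idea of supervised learning described above can be sketched in a few lines. This toy example fits a one-variable least-squares line to labelled training examples and then predicts a value the model has never seen; the data and variable names are invented for illustration, and real systems use far richer models.

```python
# Toy supervised learning: "train" on labelled (x, y) pairs, then predict.

def fit_line(xs, ys):
    """Fit y = a*x + b by ordinary least squares."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - a * mean_x
    return a, b

# Invented training data: ad spend vs. sales, roughly y = 2x.
xs = [1, 2, 3, 4, 5]
ys = [2.1, 3.9, 6.2, 8.0, 9.9]

a, b = fit_line(xs, ys)
prediction = a * 6 + b   # predict for unseen input x = 6
```

With only a handful of points the fitted slope already lands near 2; more (clean) training data would pin it down further, which is the sense in which more training improves predictions.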
DigitalMR, a company based at Workspace’s Vox Studios in Vauxhall, uses machine learning to ‘listen’ to online conversations for market-research purposes. Various algorithms are in play at once. One eliminates ‘noise’ around certain keywords (for example, if you’re interested in posts mentioning Apple products, you don’t want results involving fruit). Other algorithms analyse text and annotate it with sentiment – positive, negative, neutral – and emotions such as love, hate, joy and sadness. DigitalMR has also won several grants from Innovate UK, the government-affiliated innovation agency, to develop new machine-learning models.
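The keyword ‘noise’ filter mentioned above can be illustrated with a rule-based toy: keep posts where “apple” appears in a product context, discard ones that are really about fruit. DigitalMR’s actual filters are trained models; the cue words and example posts below are invented purely to show the idea.

```python
# Toy keyword disambiguation: is this "apple" the brand or the fruit?

BRAND_CUES = {"iphone", "ipad", "mac", "ios", "store"}
FRUIT_CUES = {"pie", "orchard", "juice", "tree", "crumble"}

def is_brand_mention(post):
    words = set(post.lower().split())
    if "apple" not in words:
        return False
    # Judge the surrounding context, not the keyword itself.
    return len(words & BRAND_CUES) >= len(words & FRUIT_CUES)

posts = [
    "new apple iphone announced at the store",
    "grandma's apple pie with fresh juice",
]
kept = [p for p in posts if is_brand_mention(p)]
```

A trained classifier generalises far beyond fixed cue lists, but the job is the same: score the context around a keyword and drop the noise.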
These include the magic captioner, a tool that can not only recognise a brand’s logo within an image, but also automatically caption it with a sentence describing what’s in the image – for example: “This is a group of friends drinking Coca-Cola at a picnic”. This capability enables brands to learn from social-media photos when and how their products are being consumed, without having to look at the images themselves.
AI-generated image captioning is cutting-edge stuff. “Apart from us, only Google and Microsoft have this capability right now,” says Michalis Michael, DigitalMR’s CEO. “It is not easy to synthesise the objects in an image and come up with a sentence that makes sense.”
For its more advanced AI technology, DigitalMR uses a technique known as deep learning. This uses neural networks, which, as their name suggests, mimic the structure of the human brain and offer far more powerful – and often unpredictable – results than simpler algorithms trained by humans. Deep learning can also be unsupervised, which essentially means that the machine can learn by itself. London-based company DeepMind caused a sensation in 2016 when its AlphaGo computer, trained partly on human games, beat Korean grandmaster Lee Sedol at the board game Go. In October 2017, the company unveiled a new version, AlphaGo Zero, which taught itself entirely through self-play rather than learning from human examples. It beat the previous version 100–0. Neural networks are the future, paving the way towards strong AI, but for the moment there’s still a lot to discover about how they work and what they can do.
So what sort of businesses can take advantage of machine learning? In short, any business that needs to process data. “Everybody deals with data and there’s lots of it around any business,” says Michael. “So businesses can be using AI to crunch data a lot faster and reach insights that they couldn’t get their hands on before.” Machine learning makes it possible to sift through and find patterns in quantities of data that would take humans years to examine manually. In order to find out which topics are driving conversations on social media, DigitalMR is able to drill down into millions of posts and know what they are about without reading them. “Ten years ago, you would have had to read it,” says Michael. “Now we can push millions of posts through these tools in a few minutes and the posts come out on the other side annotated with emotions, sentiment or topics.”
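The annotation pipeline Michael describes – posts go in, posts come out tagged with sentiment – can be sketched with a lexicon lookup. Production systems use trained machine-learning models rather than word lists; the lexicons and posts below are invented and only illustrate the shape of the pipeline.

```python
# Toy bulk annotation: tag each post positive, negative or neutral.

POSITIVE = {"love", "great", "joy", "excellent"}
NEGATIVE = {"hate", "awful", "sad", "broken"}

def annotate(post):
    words = post.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    sentiment = "positive" if score > 0 else "negative" if score < 0 else "neutral"
    return {"text": post, "sentiment": sentiment}

posts = [
    "I love this great phone",
    "the screen is broken awful experience",
    "arrived on Tuesday",
]
annotated = [annotate(p) for p in posts]
```

Because each post is processed independently, the same loop scales from three posts to millions – the “push millions of posts through these tools” step is exactly this, batched and parallelised.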
MyDrive Solutions is another Workspace customer that uses machine learning to make sense of big data. The company, which is headquartered at The Leather Market in London Bridge, collects and analyses data from drivers’ smartphones to understand how safely they are driving. It creates scores for different aspects of driving, such as acceleration, braking, cornering or speed, and gives them to the driver (via a smartphone app) and the driver’s insurance company. The aim is to encourage safer driving, which leads to lower premiums for the motorist and fewer payouts for the insurer. It’s the familiar ‘black box’ insurance model but using smartphone technology rather than onboard hardware.
“The tech in an iPhone or Android is way in advance of anything inside the old black boxes fitted in cars,” says Gavin Heavyside, MyDrive’s CTO. “You’ve got GPS, accelerometers, gyroscopes, all kinds of sensors to understand when and how people are driving.”
Harvesting this huge volume of second-by-second information from thousands of drivers is one thing, but analysing it is another – and that’s where machine learning comes in. MyDrive also uses machine learning to exclude data that does not come from a car (similar to health apps that can tell whether someone is walking or running). It is also currently developing an AI model to guard against users being dangerously distracted by their phone while driving.
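Two of the tasks above – excluding data that doesn’t come from a car, and scoring driving behaviour such as braking – can be sketched with simple thresholds. MyDrive’s real models learn from many sensor channels at once; the thresholds, sample data and function names here are all invented for illustration.

```python
# Toy sensor analysis: is this trip a car journey, and how harsh was the braking?

WALKING_MAX_KMH = 12      # sustained speeds above this are unlikely on foot
HARSH_BRAKE_MS2 = -3.0    # deceleration threshold in m/s^2 (invented value)

def looks_like_driving(speeds_kmh):
    """True if the median speed exceeds a walking pace."""
    ordered = sorted(speeds_kmh)
    median = ordered[len(ordered) // 2]
    return median > WALKING_MAX_KMH

def harsh_brakes(accels_ms2):
    """Count samples that decelerate harder than the threshold."""
    return sum(a < HARSH_BRAKE_MS2 for a in accels_ms2)

trip_speeds = [0, 25, 40, 38, 42, 30]        # km/h samples from one trip
trip_accels = [0.5, -1.2, -4.1, 0.8, -3.5]   # m/s^2 samples

driving = looks_like_driving(trip_speeds)
events = harsh_brakes(trip_accels)
```

A fixed threshold is brittle – a cyclist downhill or a car in a traffic jam both fool it – which is precisely why a learned model over many sensors beats hand-written rules here.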
As for deep learning, MyDrive’s data scientists are researching its use in autonomous-vehicle control systems and computer vision, such as automatic detection of road signs and speed limits. Although some businesses are already employing deep learning in live products, Heavyside says that for now “there are a lot more people talking about it or experimenting with it than using it, but that is changing fast”.