In October, Gartner released a report on the Top 10 Strategic Technology Trends for 2020. In somewhat prophetic fashion, Gartner identified “Hyperautomation” as the #1 trend for 2020 – as we plan for the post-COVID commercial environment, with leadership looking to create more streamlined organisations, their timing couldn’t have been better.
“Hyperautomation” sits at the intersection of Robotic Process Automation (RPA) and Machine Learning (ML) – it combines RPA’s approach to automating business processes with ML’s ability to drive insight from data. It has the potential to significantly reduce costs by automating processes that may include intelligent decisioning, at a time when leaders everywhere need to create more efficient, smarter organisations.
In many ways, there is nothing new here – a key goal of data science investment for the last ~10 years has been to automate decision-making processes based on AI and ML. As an organisation, we are frequently asked to deliver initiatives that aim to automate decision making processes using a combination of data science and software engineering.
What is new here is the perspective – the “RPA-first” approach that underpins “Hyperautomation” is another tool in the arsenal when we look at automating processes, and it drives increased collaboration across analytic and IT functions.
Perhaps the most important aspect of the rise of “Hyperautomation” is its impeccable timing. Not only do we need to create more streamlined organisations (due to COVID and the impending recession), but it comes at a time when (in some quarters) serious questions are being raised about the value generated from investment in data science. With talk of an impending AI winter, and anecdotal stories of data science teams struggling to deliver realisable business value, “Hyperautomation” provides a great opportunity – a chance to deliver on the potential of analytics to drive measurable cost reduction.
“Hyperautomation” is an opportunity to capture the imagination and focus of the business – to engage with them more deeply, in a collaborative fashion, to explore processes that could be automated. And when we find a high-value process, we have more tools in our arsenal with which to build a solution.
To use a recent example, we were engaged by a client to automate their “price comparison” process, in which customers would email details of a quote and ask whether our client could beat it. Using a mixture of technologies and machine learning, we were able to dynamically read and understand the given quote, generate a comparative quote, and use NLP to compose a response in an appropriate tone of voice. The initial automation “success” rate was low, with only 8% of cases being fully automated. However, that already delivered sufficient business value to demonstrate a return on investment within a few months. Moreover, the data generated by the “manual” cases is already being used to improve the model, leading to an increased success rate and further savings. All in all, this “humans pretending to be AI pretending to be humans” model provides a platform for ongoing efficiency gains and cost reductions.
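The pattern described above – attempt automation, fall back to a human when confidence is low, and keep the manual cases as training data for the next model iteration – can be sketched as follows. This is a minimal illustration only: the function names are hypothetical, and the regex-based “parser” and simple price lookup stand in for the real document-understanding and quote-generation models.

```python
import re
from dataclasses import dataclass
from typing import Optional

@dataclass
class Quote:
    product: str
    competitor_price: float

def parse_quote(email_body: str) -> Optional[Quote]:
    """Extract the product and quoted price from an email.

    A stand-in for an ML document-understanding step; here a simple
    pattern match. Returning None signals "not understood"."""
    match = re.search(r"quote for (\w+) at £(\d+(?:\.\d+)?)", email_body)
    if not match:
        return None
    return Quote(match.group(1), float(match.group(2)))

def handle_email(email_body: str, our_prices: dict) -> tuple:
    """Automate the reply when confident; otherwise route the case to a
    human, whose handling becomes training data for the model."""
    quote = parse_quote(email_body)
    if quote is None:
        return ("manual", None)  # unreadable quote -> human fallback
    our_price = our_prices.get(quote.product)
    if our_price is None or our_price >= quote.competitor_price:
        return ("manual", None)  # can't beat it -> human judgement
    # In the real system an NLP model composes the reply in an
    # appropriate tone of voice; a template suffices for the sketch.
    return ("automated", f"Good news – we can beat that quote: £{our_price:.2f}")
```

Even with a low initial automation rate, every case routed down the “manual” branch is logged alongside the human’s eventual response, which is what allows the success rate to climb over time.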
As businesses emerge post-COVID, we’re all going to be in a difficult financial position, in an ultra-competitive landscape with lots of unknowns. To get through, companies will be looking to drive cost efficiencies wherever they can, making it a great time to talk about “Hyperautomation” as a way to reduce the unnecessary day-to-day burden of process-heavy tasks.
Reducing Costs With AI & Hyperautomation in a Post-COVID World Webinar
Join Rich Pugh as he provides insight into AI and Hyperautomation – how businesses will adopt this technique as they strive to reduce costs.