Predictive Analytics Should Work in Tandem with Decision Automation

As organizations implement analytics to detect or predict problems, they quickly realize that prediction itself solves only half the problem. Analytics tells you about anomalies and the probability that a problem will occur imminently. But predictive analytics doesn’t help determine what should be done next.

Some teams have implemented predictive analytics in tandem with decision automation. Today, however, the decision is most often made manually by end users assigned to investigate the problem. Manual decision-making stretches the capacity of most enterprises’ support services, because the process is designed around the normal capacity needed to react to known problems. Predictive analytics creates greater work volumes because of:

  • A high volume of signals from problems that are likely to occur and require investigation to resolve
  • High signal noise from false positives while the prediction models are being refined
  • A process shift from reacting to problems to proactively identifying and resolving them

An accurate prediction lets a team investigate and resolve a problem more quickly than a reactive approach does. By that measure, workload should be equal between the predictive system and the reactive one. But there is more workload with predictive, even after factoring out signal noise. If customers have a bad experience, for example, they may never call in to complain. Predictive analytics applied to customer experience, by contrast, is typically designed to fix the experience before it turns bad, and that creates more workload than a reaction-oriented design.

The easiest way to solve the capacity problem is to put logic in place that filters out anything but the highest-probability scores before moving a problem into a queue. When that happens, however, the full value of the investment in predictive analytics is never realized, because the prediction models can’t improve unless investigation and resolution outcomes are fed back into the data.
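A minimal sketch of this filtering pattern, with hypothetical field names and a hypothetical threshold, shows what gets lost: everything below the cutoff is discarded, along with the investigation outcomes that could have improved the model.

```python
# Hypothetical example: gate predicted problems on probability score
# before queuing them for investigation.

THRESHOLD = 0.9  # only the highest-probability predictions pass


def filter_predictions(predictions, threshold=THRESHOLD):
    """Split predictions into those queued for investigation and those dropped.

    Dropped predictions never get investigated, so their outcomes
    never flow back into the training data.
    """
    queued, dropped = [], []
    for p in predictions:
        (queued if p["score"] >= threshold else dropped).append(p)
    return queued, dropped


predictions = [
    {"id": 1, "score": 0.95},
    {"id": 2, "score": 0.62},
    {"id": 3, "score": 0.91},
]
queued, dropped = filter_predictions(predictions)
```

Here the mid-probability prediction (id 2) is silently discarded, which is exactly the feedback the model would need to get better at scoring such cases.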

This slows time to value and leaves the organization covering both the cost of the predictive analytics investment and the cost of a still sub-optimal process.

Combining Predictive Analytics with Decision Automation

It makes more sense to design predictive analytics in combination with prescriptive techniques that automate decisions wherever possible and recommend logical next actions where there is uncertainty. The goal of using these techniques in tandem is to continuously improve the quality of both the prediction and the resolution, without increasing labor costs.
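One hedged sketch of this pairing, using hypothetical thresholds and action names: high-confidence predictions are resolved automatically, uncertain ones are routed to a person along with suggested next actions, and everything is logged so outcomes can feed back into the models.

```python
# Hypothetical sketch: pair a prediction with a decision policy that
# automates high-confidence cases and suggests next actions otherwise.

AUTOMATE_AT = 0.9  # confident enough to act without a person
REVIEW_AT = 0.6    # uncertain: route to a person with suggestions


def choose_action(prediction):
    """Map a scored prediction to an automated or assisted decision."""
    score = prediction["score"]
    if score >= AUTOMATE_AT:
        # No human in the loop: execute the associated playbook.
        return {"action": "auto_resolve", "playbook": prediction["fix"]}
    if score >= REVIEW_AT:
        # Human in the loop, but with recommended next steps attached.
        return {"action": "recommend",
                "next_steps": [prediction["fix"], "escalate"]}
    # Too uncertain to act on; keep watching.
    return {"action": "monitor"}
```

Each decision and its eventual outcome can then be captured and used both to retrain the prediction model and to tune the thresholds, which is the continuous-improvement loop described above.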

That means managing decisions becomes as important as managing predictions. Rules engines and decision services are not yet widespread in IoT, customer experience management and other transformation initiatives, but there is growing recognition that decision automation and decision support are needed areas of operational investment. Rules software abstracts the conditional, decision-oriented logic from system and application logic. Rules are then wired to decisions, making those decisions easier to automate.
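A hedged illustration of that separation, with hypothetical rule conditions and decision names: the decision logic lives in an editable list of rules evaluated by a small engine, rather than in if/else blocks scattered through application code, so the rules can change without an application release.

```python
# Hypothetical example: decision rules kept as data and evaluated by a
# tiny rules engine, instead of conditionals hard-coded in the application.

RULES = [
    # (condition, decision) pairs that can be edited without touching
    # the surrounding application logic.
    (lambda e: e["severity"] == "critical" and e["score"] >= 0.9,
     "dispatch_technician"),
    (lambda e: e["score"] >= 0.7, "open_ticket"),
]


def evaluate_rules(event, rules=RULES, default="log_only"):
    """Return the decision of the first rule whose condition matches."""
    for condition, decision in rules:
        if condition(event):
            return decision
    return default


decision = evaluate_rules({"severity": "critical", "score": 0.95})
```

Commercial rules engines add authoring tools, versioning and audit trails on top of this idea, but the core abstraction is the same: decisions become managed assets rather than buried code paths.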

While this type of reasoning is often embedded as code in systems, the richness of decision automation is missed, because code-embedded rules are usually too simple to deliver the full benefit of automation. Abstracting rules into decision services makes it possible to change them rapidly as situations change. Subject-matter experts, rather than developers, become the managers of decision assets; without rules, developers must modify conditional logic embedded in systems through change requests.

By viewing predictions and decisions as equal assets, there is an opportunity to significantly improve processes, while lowering costs. Harnessing data for training both the prediction and assessing the quality of the decision creates an opportunity to speed up innovation by improving quality, increasing automation and embedding continuous improvement into the process.


Maureen Fleming

Maureen Fleming is Program Vice President for IDC's Business Process Management and Middleware research area. In this role, Ms. Fleming examines the products and processes used for building, integrating, and deploying applications within an extended enterprise system. With more than 20 years of industry and analyst experience, Ms. Fleming most recently came from Symantec, where she worked in the strategy and planning group. Ms. Fleming was also an analyst at Gartner, where she researched technologies that allowed enterprises to create and manage information, particularly real-time information and the associated enabling technologies.
