Despite more powerful (big) data infrastructures and the availability of sophisticated predictive analytics, AI and machine learning methodologies, companies still struggle to make better, data-driven decisions. We explore why, and how companies can make optimal decisions based on prescriptive analytics algorithms.
There are two main drivers for the lack of progress in data-driven decision-making: a skills gap (prescriptive analytics requires a different and broader methodological expertise than predictive analytics) and a trust gap (executives often do not trust the data or the recommendations of analytics models). Let’s examine these drivers in more detail.
A Google search for “Predictive Analytics” yields over 7.8 million results, whereas “Prescriptive Analytics” yields only 431,000: a solid indicator that prescriptive analytics has not yet received enough attention and focus. Certainly not as much as its predictive sibling!
So…what is prescriptive analytics exactly?
While predictive analytics employs mathematical methodologies to analyze data and forecast future events, prescriptive analytics exploits data to identify the best course of action and achieve a goal. Forecasts built with predictive analytics methodologies are often the input of prescriptive analytics models.
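The handover from forecast to decision can be made concrete with a small sketch. Assume a hypothetical retailer whose predictive model has produced three equally likely demand forecasts; a prescriptive step (newsvendor-style) then picks the order quantity that maximizes expected profit. All names and numbers below are invented for illustration:

```python
# Illustrative sketch: a forecast (predictive) feeding an order decision (prescriptive).
# All numbers are hypothetical.

def best_order_quantity(demand_scenarios, unit_cost, unit_price):
    """Pick the order quantity maximizing expected profit over a set of
    equally likely demand forecasts (newsvendor-style)."""
    candidates = range(0, max(demand_scenarios) + 1)

    def expected_profit(q):
        # Sell min(q, d) units in each scenario; pay for all q ordered.
        profits = [unit_price * min(q, d) - unit_cost * q for d in demand_scenarios]
        return sum(profits) / len(profits)

    return max(candidates, key=expected_profit)

# Hypothetical forecast: three equally likely demand outcomes.
scenarios = [80, 100, 120]
q = best_order_quantity(scenarios, unit_cost=4, unit_price=10)
print(q)
```

The forecast alone does not tell the retailer what to do; the prescriptive step turns it into an actionable order quantity.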
While prescriptive analytics is a relatively new term, the idea behind it is rooted in operations research, a discipline established in the 1930s, and in the concept of constrained optimization. Optimization is about translating business goals into an objective function to be maximized (e.g. revenues or performance) or minimized (e.g. costs or travel distance), while translating into mathematical form the set of constraints that ultimately determine whether or not a solution is acceptable.
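As a minimal sketch of this idea (with hypothetical project data), a brute-force search over a tiny feasible set shows how an objective (maximize revenue) and a constraint (stay within a budget) interact:

```python
# Minimal constrained-optimization sketch: enumerate a small feasible set,
# keep only solutions satisfying the constraint, maximize the objective.
# Project names, revenues and costs are hypothetical.
from itertools import combinations

projects = {          # name: (expected_revenue, cost)
    "A": (60, 40),
    "B": (45, 30),
    "C": (30, 25),
    "D": (25, 15),
}
BUDGET = 70           # constraint: total cost must not exceed the budget

def total(names, idx):
    return sum(projects[n][idx] for n in names)

# Feasible set: every project portfolio that respects the budget constraint.
feasible = [s for r in range(len(projects) + 1)
            for s in combinations(projects, r)
            if total(s, 1) <= BUDGET]

# Objective: maximize expected revenue over the feasible set.
best = max(feasible, key=lambda s: total(s, 0))
print(sorted(best), total(best, 0))
```

Real problems are far too large for exhaustive enumeration, which is exactly why the mathematical programming and heuristic techniques discussed below exist; the structure, however, is the same.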
Let’s take a concrete example from KPMG’s recent project experience supporting various sport organizations globally in optimizing their tournament schedules. In these use cases, the most frequent objective is minimizing the time that teams spend traveling (and/or the related travel costs). The constraints typically include the availability of the venues where the qualification pools are to be played, the airline schedules, the preferences expressed by some of the hosting countries (which might be in a position to host a tournament in specific periods of the year but not in others), and so on.
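A heavily simplified sketch of this scheduling idea, with invented pools, venues, travel distances and availability constraints (real engagements involve far richer data and specialized solvers), might look like this:

```python
# Toy sketch of the scheduling idea: assign qualification pools to venues
# so that total team travel is minimized, subject to venue availability.
# All pools, venues, distances and availabilities are invented.
from itertools import permutations

pools = ["Pool A", "Pool B", "Pool C"]
venues = ["Lisbon", "Oslo", "Tokyo"]

# travel[pool][venue]: total km flown by that pool's teams (hypothetical)
travel = {
    "Pool A": {"Lisbon": 8000, "Oslo": 9500, "Tokyo": 21000},
    "Pool B": {"Lisbon": 12000, "Oslo": 7000, "Tokyo": 15000},
    "Pool C": {"Lisbon": 18000, "Oslo": 16000, "Tokyo": 6000},
}

# Constraint: a hosting venue may only be available for certain pools.
available = {"Lisbon": {"Pool A", "Pool B"},
             "Oslo": {"Pool A", "Pool B", "Pool C"},
             "Tokyo": {"Pool B", "Pool C"}}

# Feasible plans: one venue per pool, respecting availability.
plans = [dict(zip(pools, perm)) for perm in permutations(venues)
         if all(p in available[v] for p, v in zip(pools, perm))]

def cost(plan):
    return sum(travel[p][v] for p, v in plan.items())

best = min(plans, key=cost)
print(best, cost(best))
```

Even in this toy version, the infeasible assignments (e.g. a pool sent to a venue that cannot host it) are filtered out before the objective is ever evaluated, mirroring how constraints shape the search space in the real engagements.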
Deep domain knowledge: Mathematical modelers need a thorough understanding of the business and its vision. Such understanding is necessary to align the goals of the prescriptive analytics project with the objectives set by management.
Deep technical domain knowledge is also necessary to build an effective mathematical model. Simplifying assumptions can often greatly reduce the complexity of a problem. However, assessing the repercussions of such assumptions on the business is as difficult as determining their impact on the problem’s computational complexity. It is not infrequent to see optimization models run for days, and sometimes weeks, before producing acceptable results.
Prescriptive = Predictive + operations research: The success of prescriptive analytics projects depends on the availability of a broad set of methodological expertise, including mathematical optimization techniques such as classical mathematical programming, meta-heuristics, evolutionary algorithms and reinforcement learning. There is no silver bullet: the choice of the right mathematical optimization technique can depend on many factors, from the structure and size of the problem to the time available to compute an acceptable solution.
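As one example from that toolbox, here is a bare-bones simulated annealing sketch (a meta-heuristic); the objective, the neighborhood move and the cooling schedule are purely illustrative, chosen so the snippet is self-contained:

```python
# Bare-bones simulated annealing sketch (a meta-heuristic).
# Objective, neighbor move and cooling schedule are illustrative only.
import math
import random

def anneal(objective, start, neighbor, steps=5000, t0=1.0):
    random.seed(0)                      # fixed seed so the sketch is reproducible
    x, fx = start, objective(start)
    best, fbest = x, fx
    for k in range(1, steps + 1):
        t = t0 / k                      # simple cooling schedule
        y = neighbor(x)
        fy = objective(y)
        # Accept improvements always; accept worse moves with a probability
        # that shrinks as the temperature falls.
        if fy < fx or random.random() < math.exp((fx - fy) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
    return best, fbest

# Toy objective with a known minimum at x = 3.
obj = lambda x: (x - 3) ** 2
sol, val = anneal(obj, start=20.0, neighbor=lambda x: x + random.uniform(-1, 1))
print(round(sol, 2), round(val, 4))
```

A classical mathematical program would solve this toy objective exactly; meta-heuristics like this one earn their keep on large, messy problems where exact methods become intractable, which is precisely the "no silver bullet" trade-off.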
Ensuring the availability of such a broad portfolio of methodological expertise requires a focused hiring strategy and the ability to acquire professional profiles with vastly heterogeneous backgrounds, outside of the standard data science curriculum.
Industrialized analytics: Getting the mathematics right is not, alone, enough. Mathematical models need to be packaged as resilient microservices, equipped with logging and monitoring, and integrated within existing business processes. To get there, data science processes need to rest on solid DevOps and systems engineering foundations.
The trust gap: In the past three years, 67 percent of CEOs have overlooked insights provided by data analytics or computer-driven models because these contradicted their own experience or intuition. Adoption of prescriptive analytics will be slowed if companies do not focus on building trust in their data and models.
Building trust in data and analytics is a long journey that entails complex organizational change: trust goes hand in hand with accountability. Organizations therefore need to clearly define who is responsible for ensuring the trustworthiness and accuracy of advanced analytics and models.
The development of prescriptive analytics solutions that translate data and forecasts into optimal decisions requires a different and broader methodological expertise than is required for predictive analytics. The lack of relevant skillsets hinders the exploitation of new data platforms and of increasingly accurate algorithms. Predictive algorithms are of little use if companies cannot make optimal decisions based on accurate forecasts.
At the same time, prescriptive analytics might fail to realize its goals and benefits if companies do not trust the data, or the course of action that the prescriptive model recommends. Bridging the trust gap is a complex organizational journey that requires C-suite endorsement and support.
Boyd, Stephen, and Lieven Vandenberghe. Convex Optimization. Cambridge University Press, 2004.
Luke, Sean. Essentials of Metaheuristics. Lulu, 2011.
Eiben, Agoston E., and James E. Smith. Introduction to Evolutionary Computing. Berlin: Springer, 2003.
Sutton, Richard S., and Andrew G. Barto. Introduction to Reinforcement Learning. Cambridge: MIT Press, 1998.
Fowler, Susan J. Production-Ready Microservices: Building Standardized Systems Across an Engineering Organization. O’Reilly Media, 2016.
Growing pains – 2018 Global CEO Outlook, KPMG.
Guardians of trust: Who is responsible for trusted analytics in the digital age?, KPMG.