Hocus pocus or necessity?
Reporting is currently one of the functions requiring the most outlay in treasury, while the benefit it delivers remains straightforward. This is due above all to the amount of manual work involved, but also to the still-limited use of analytics. Simple aggregation of historical data is now up against advanced analytics, which ranges from linear regression on simple data cubes to multivariable algorithms and forecasts of complex risk indicators.
However, with such a high concentration of quantitative competencies in treasury, one can assume that the department has an affinity for analytical methods and applications – the ideal prerequisites for the step towards analytics.
What is meant by analytics?
One possible definition is: 'Business analytics (BA) is an iterative, systematic investigative method for company data which focuses on statistical analyses. BA is used by companies that aim to base decision-making on data.' Analytics thus serves to derive specific measures and decisions from findings, and sits upstream of the decision itself. Once a decision has been taken, however, it comes back into play: it helps to determine whether the decision or measure has led to the desired outcome, and whether it did so effectively and efficiently.
A cash position thus does not come under analytics, nor does liquidity planning derived from a mix of posted receivables and liabilities and corporate planning.
Measures can be derived by converting data into information that can be interpreted and assessed for validity (allowing for uncertainty). Furthermore, historical and current data are examined for patterns in order to derive statements about the future. Despite all uncertainty about the future, this remains the only feasible method in many practical applications.
So what specifically comes under analytics?
Practical examples can be found in all functional areas of treasury, including the compliance function. Let's look at three examples:
We can see that rules-based machines, or even systems with artificial intelligence, are better suited to such practical applications than people are, since they can process large amounts of data and information simultaneously. Over time, they can make consistent and logical decisions and better assess their impact.
If we go a bit further, we arrive in the world of big data.
Big data analytics means analysing large volumes of data of different types (big data) to discover hidden patterns, unknown correlations and other useful information. The above example for liquidity planning already takes a step in this direction. Big data analytics would mean also incorporating unstructured data from social media feeds, internet articles and forecast data on economic development or the weather.
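The enrichment step described above can be sketched in a few lines. The following is a purely illustrative example (all dates, amounts and the sentiment score are invented): internal cash flow records are joined with an external daily signal before any analysis, which is the kind of merge a big data pipeline would perform at far larger scale.

```python
# Hypothetical internal net cash flows (EUR thousands) keyed by date
internal = {"2018-06-04": 120.0, "2018-06-05": 135.0}

# Hypothetical external signal, e.g. a daily market sentiment score
external = {"2018-06-04": -0.2, "2018-06-05": 0.6}

# Join the external signal onto each internal record; missing days default to 0.0
enriched = [
    {"date": d, "net_flow": flow, "sentiment": external.get(d, 0.0)}
    for d, flow in sorted(internal.items())
]
print(enriched)
```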
Other questions that could be addressed through big data might be:
For all these practical examples, it is clear that the issue is less about reducing manual outlay and more about gaining insights that would previously not have been possible without access to, and analysis of, data in this form.
With today's technology, however, using big data would in most cases be too resource-intensive to yield a quantifiable net benefit.
First, the issue must be defined in as much detail as possible. Describing it with all relevant (known) interdependencies helps, above all, to establish a comprehensive link to the underlying business process.
The next step is to create the necessary database for the analytics application: data from one or more source systems is transferred into a data model on which the necessary analyses can then be conducted. The complexity of the statistical forecast models, from simple regression models to multivariate neural networks, is freely scalable and is governed by the required forecast accuracy and the available data.
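At the simple end of that scale, a trend forecast can be built with nothing more than ordinary least squares. The sketch below uses invented weekly net cash flows to fit a linear trend and extrapolate it; more sophisticated models can be substituted behind the same two steps (fit, then forecast).

```python
def fit_linear_trend(values):
    """Ordinary least squares fit of y = a + b*t over t = 0..n-1."""
    n = len(values)
    t_mean = (n - 1) / 2
    y_mean = sum(values) / n
    cov = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(values))
    var = sum((t - t_mean) ** 2 for t in range(n))
    b = cov / var              # slope: average change per period
    a = y_mean - b * t_mean    # intercept
    return a, b

def forecast(values, horizon):
    """Extrapolate the fitted trend for the next `horizon` periods."""
    a, b = fit_linear_trend(values)
    n = len(values)
    return [a + b * (n + h) for h in range(horizon)]

# Hypothetical weekly net cash flows (EUR thousands) from the source systems
history = [120, 135, 128, 142, 150, 158, 149, 165]
print(forecast(history, 3))
```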
For example, in the case of cash flow forecasts, public holidays, payment run plans or trends can be mapped without great difficulty. However, if additional external source data is integrated into the forecast model, complex modelling and validation become necessary. For instance, commodity price indices may have only a delayed impact on specific planning items in individual companies or even countries. Moreover, increasing complexity means that causal relations become less transparent and more difficult to understand. Nevertheless, it is important to understand the models, as both users and management are, after all, supposed to trust the resulting forecasts.
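The deterministic calendar effects mentioned above, public holidays and payment runs, are the easy part. A toy illustration (the holiday date, payment run days and amounts are all assumptions) of layering them onto a baseline daily outflow:

```python
import datetime as dt

HOLIDAYS = {dt.date(2018, 6, 7)}     # assumed public holiday
PAYMENT_RUN_DAYS = {15, 30}          # assumed monthly payment run days

def daily_forecast(day, baseline, payment_run_amount):
    """Adjust a baseline daily outflow for simple calendar effects."""
    if day.weekday() >= 5 or day in HOLIDAYS:
        return 0.0                   # no clearing on weekends or holidays
    if day.day in PAYMENT_RUN_DAYS:
        return baseline + payment_run_amount
    return baseline

# Print one sample week starting Monday, 4 June 2018
start = dt.date(2018, 6, 4)
for i in range(7):
    d = start + dt.timedelta(days=i)
    print(d, daily_forecast(d, baseline=50.0, payment_run_amount=400.0))
```

External drivers such as commodity indices would not fit this rules-based shape; they require a statistical model with proper validation, which is exactly where the complexity discussed above begins.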
Ultimately, a suitable reporting tool must be used to prepare and visualise past and forecast data so that statements and findings are immediately recognisable. The market for business intelligence applications offers numerous user-friendly applications for these purposes, including some that don't require life-long support from an advisor.
Even if there is no limit to the imagination when it comes to treasury analytics applications, and deploying them makes fundamental sense, the cost-benefit trade-off must of course be clear. The question is therefore: what are the implementation and operating costs, and what quantifiable benefit can be derived from the application? Experience suggests the answer differs between, say, a retailer whose rapidly changing business model demands very precise liquidity planning, and a medium-sized food producer with a complex commodity exposure model.
Source: KPMG Corporate Treasury News, Edition 81, June 2018
Author: Börries Többens, Senior Manager, Finance Advisory, firstname.lastname@example.org
© 2019 KPMG AG Wirtschaftsprüfungsgesellschaft, a member of the KPMG network of independent member firms affiliated with KPMG International Cooperative ("KPMG International"), a Swiss entity. All rights reserved.