Indispensable adviser or pointless number games?
As computers become ever more powerful, and in some respects more intelligent, they are becoming indispensable to many business processes. Computer models have assumed a special role here: they can be used to simulate, and therefore forecast, nearly anything, even the entire world if need be. One need only think of the increasingly detailed climate models. The energy industry is a good example of how forecasting models in particular have become an integral part of planning and executing many business processes, but also of where their shortcomings lie and where the journey is heading. In this newsletter, starting with this edition, we will examine in loose succession the models used in the energy industry, some of which are of great relevance to other industry sectors too. We begin with the most comprehensive and elementary exemplars: the long-term fundamental models of the electricity market.
What are long-term forecasting models needed for in the first place? There are many answers to this question. Politicians need long-term forecasts in order to create reliable framework conditions. Industry associations use forecasts to help shape, one could also say influence, those framework conditions. One need only think of the recently published and widely discussed BDI study "Klimapfade für Deutschland" (Climate Paths for Germany), in which the BDI outlined an industry-friendly route to achieving Germany's climate goals for 2050. This study is based, amongst other things, on the electricity market model from Prognos.
The application we will examine more closely here serves a different purpose, however: planning future earnings in companies and the analysis that such planning requires. In most industries, planning horizons of three to five years are usual. This time frame is well covered by futures markets in many sectors: you do not need to give much thought to price fluctuations, you simply draw on the collective wisdom of the market. The energy industry is different, with planning cycles that are still significantly longer. In the past, when large-scale power stations were the norm, planning periods of 30 to 40 years were not uncommon. Even a modern wind farm has a technical service life of 20 years. The earnings value over the remaining service life must be determined both when making investment decisions and for valuations performed as part of the annual financial statements or in the event of an acquisition or sale. The price development is naturally crucial for this. But since there is no futures market for electricity delivered in 20 years, the next best option is a forecasting model.
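The valuation task described above amounts to discounting the margins a plant is expected to earn over its remaining service life, driven by a forecast price path. The following is a minimal sketch of that idea; the plant figures, output, costs and discount rate are purely hypothetical and chosen for illustration only.

```python
# Hypothetical sketch of an earnings-value (discounted cash flow) calculation
# over a plant's remaining service life. All figures are illustrative.

def earnings_value(annual_prices, annual_output_mwh, annual_costs, discount_rate):
    """Discount each year's margin (revenue minus costs) back to today."""
    value = 0.0
    for year, price in enumerate(annual_prices, start=1):
        margin = price * annual_output_mwh - annual_costs
        value += margin / (1 + discount_rate) ** year
    return value

# A wind farm with 20 years of remaining life and a flat assumed
# price path of 45 EUR/MWh (in reality the path would come from a model).
prices = [45.0] * 20
value = earnings_value(prices, annual_output_mwh=50_000,
                       annual_costs=900_000, discount_rate=0.06)
print(round(value))
```

With a flat price path and constant margins this reduces to a standard annuity; the point of the model-based price path is precisely that real forecasts are not flat.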
But what actually happens inside such a long-term forecasting model before it finally produces a time series for the electricity price? The way these models work can be summed up quickly; the complexity stems mainly from the myriad input parameters and the sheer quantity of data involved. After all, the entire European integrated grid is generally modelled at a resolution of one hour over a time horizon of decades. In essence, the model works as follows. In the first step of a two-step optimisation, it determines whether the forecast demand for electricity can be covered by the existing generating plants and storage facilities. If not (e.g. because a power station has been decommissioned due to age), the model adds capacity using the most economical generating technology. In many cases this decision is constrained, for example by a prohibition on new coal-fired power stations. In the second step, the hourly demand is covered using the most cost-efficient power stations, and the electricity price is set by the most expensive power station needed to satisfy the current demand. Some models even apply surcharges to reflect an artificial scarcity of generation resources, justified by the need for flexible power stations that may have to recoup their long-run marginal costs over a very limited number of operating hours. Real models are usually much more complex than outlined here, of course, taking into account cross-border connections, flexible consumers, technical restrictions of the power stations and much more. Beyond that, different scenarios are considered in order to account for uncertainties in the assumptions. The basic idea, however, remains to cover the demand for electricity in each hour at the lowest cost.
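The second step, in which the most expensive plant needed to meet demand sets the price, is the classic merit-order principle. It can be sketched in a few lines; the plant list and demand figures below are illustrative assumptions, not real market data.

```python
# Minimal sketch of the merit-order pricing step described above.
# Plant data and demand figures are illustrative, not real market data.

def merit_order_price(plants, demand_mw):
    """Dispatch plants from cheapest to most expensive until demand is met;
    the marginal cost of the last plant dispatched sets the price."""
    dispatched = 0.0
    for name, capacity_mw, marginal_cost in sorted(plants, key=lambda p: p[2]):
        dispatched += capacity_mw
        if dispatched >= demand_mw:
            return marginal_cost  # the price-setting plant
    raise ValueError("demand cannot be covered by available capacity")

# Illustrative plant list: (name, capacity in MW, marginal cost in EUR/MWh)
plants = [
    ("wind", 20_000, 0.0),
    ("nuclear", 10_000, 10.0),
    ("lignite", 15_000, 25.0),
    ("hard_coal", 20_000, 35.0),
    ("gas_ccgt", 15_000, 50.0),
    ("gas_turbine", 5_000, 90.0),
]

print(merit_order_price(plants, 60_000))  # -> 35.0, coal on the margin
print(merit_order_price(plants, 70_000))  # -> 50.0, gas becomes price-setting
```

A real fundamental model solves this dispatch as one large optimisation with network and plant constraints, but the pricing logic is the same: demand shifts along the merit order, and the marginal plant sets the hourly price.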
This underlying concept also gives rise to a shortcoming of these models. Although they generate electricity price curves with hourly resolution, as needed to evaluate a flexible power station, these curves are not comparable with real spot market price curves, yet they are often used for valuations anyway. A look at the model logic makes clear where the forecast price curves and the real ones diverge: the model accounts for neither unexpected events nor sub-optimal or irrational behaviour. As a result, the volatility of the modelled price curves is significantly lower than in reality. For a nuclear power station that produces power continuously without major output adjustments, volatility matters little. For a highly flexible gas turbine that operates for only a few hours and whose market success relies largely on that flexibility, accurate volatility information is crucial. For this reason, price curves from a fundamental model should not be used to evaluate such a gas turbine, or indeed most other modern generation and storage facilities; the underestimated volatility can have a significantly negative effect on the valuation.
So what should you do? For rough estimates, a linear extrapolation of futures market data is often sufficient. If one examines the range of scenarios in a fundamental model, the difference between the high and low electricity price often exceeds 100 percent after just ten years; the extrapolated futures market price nearly always lies within these bounds. For more exact valuations, we recommend combining an annual average price from the fundamental model with volatility derived from historical data. While this neglects the increasing flexibility of electricity consumers and other developments that affect volatility, the results are nevertheless far more realistic than a valuation based on the hourly electricity price curves of a fundamental model.
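One simple way to realise this combination, sketched below under the assumption that preserving the historical price shape is acceptable, is to rescale a historical hourly profile so that its mean matches the fundamental model's forecast annual average. The toy price series is purely illustrative.

```python
# Hedged sketch of the recommended combination: overlay historical hourly
# volatility on the fundamental model's annual average price.
# The toy price series below is illustrative, not real market data.

def overlay_volatility(historical_hourly, forecast_annual_mean):
    """Scale a historical hourly price profile so its mean equals the
    forecast annual mean, preserving the historical shape (and hence
    its relative volatility)."""
    hist_mean = sum(historical_hourly) / len(historical_hourly)
    factor = forecast_annual_mean / hist_mean
    return [p * factor for p in historical_hourly]

historical = [28.0, 22.0, 45.0, 80.0, 35.0, 30.0]  # toy hourly prices, EUR/MWh
scaled = overlay_volatility(historical, forecast_annual_mean=60.0)

print(round(sum(scaled) / len(scaled), 2))  # -> 60.0, mean matches the forecast
```

More refined variants would adjust the shape itself, for example to reflect growing solar feed-in, but even this simple rescaling retains far more of the price spread than a smoothed model curve.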
If the electricity price paths from fundamental models are inadequate for valuation purposes, do fundamental models still have a reason to exist at all? This question can be answered with a firm "yes". As outlined at the beginning, their purpose is not just to generate long-term electricity price paths, but is far more comprehensive: the entire system of generation plants, storage and consumers is projected into the future. Their great value lies in making the future a little more tangible and in examining the future impact of decisions made today. Long-term models naturally suffer from great uncertainty, and even the high-performance computers of tomorrow will not be able to resolve this. But if you already knew today what the world will look like in 50 years, life would be pretty boring.
Source: KPMG Corporate Treasury News, Edition 76, February 2018
Author: Joachim Hermann, Manager, Finance Advisory, email@example.com
© 2019 KPMG AG Wirtschaftsprüfungsgesellschaft, a member of the KPMG network of independent member firms affiliated with KPMG International Cooperative ("KPMG International"), a legal entity under Swiss law. All rights reserved.