Forecasting: Should the Mean APE Rule the Accuracy Planet?

By Hans Levenbach, PhD, CPDF, author of Change & Chance Embraced

Hans Levenbach, PhD is Executive Director, CPDF Training and Certification Professional Development Programs. He conducts hands-on Workshops on Demand Forecasting for multi-national supply chain companies worldwide. He is group manager of the LinkedIn groups (1) Demand Forecaster Training and Certification, Blended Learning, Predictive Visualization, and (2) New Product Forecasting and Innovation Planning, Cognitive Modeling, Predictive Visualization.

He invites you to join if you have interest in sharing conversations on these topics.

LinkedIn profile: Hans Levenbach

Planners and managers in supply chain organizations are accustomed to using the Mean Absolute Percentage Error (MAPE) as their best (and sometimes only) answer to measuring forecast accuracy. It is so ubiquitous that it is hardly questioned. Yet among the practitioners who participate in forecasting workshops around the world, I find no consensus even on the definition of forecast error. For most, Actual (A) minus Forecast (F) is the forecast error; for others, it is just the opposite.

Among practitioners, it is a jungle out there trying to understand the role of the APEs in the measurement of forecast accuracy. Forecast accuracy is commonly measured and reported by just the Mean Absolute Percentage Error (MAPE), which is the same no matter which definition of forecast error one uses: the absolute value discards the sign.
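
To see why, here is a minimal sketch in Python; the numbers are invented for illustration:

```python
# MAPE is unchanged by the error-sign convention, because the
# absolute value discards the sign. Data are illustrative only.
actuals   = [100, 120, 80, 150]
forecasts = [ 90, 130, 85, 140]

def mape(A, F):
    """Mean Absolute Percentage Error: mean of |A - F| / A, times 100."""
    return 100 * sum(abs(a - f) / a for a, f in zip(A, F)) / len(A)

def mape_reversed(A, F):
    """The same measure under the opposite convention, F minus A."""
    return 100 * sum(abs(f - a) / a for a, f in zip(A, F)) / len(A)

print(mape(actuals, forecasts))           # 7.8125
print(mape_reversed(actuals, forecasts))  # 7.8125, identical
```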

Bias is the other component of accuracy, but it is not consistently defined either. If bias is the difference between actuals and forecasts, what should the sign of a reported underforecast or overforecast be? Who is right, and why?

Outliers in forecast errors and other sources of unusual data values should never be ignored in the accuracy measurement process. For a measurement of bias, for example, the calculation of the mean forecast error ME (the arithmetic mean of Actual (A) minus Forecast (F)) will be driven towards the outlier. An otherwise unbiased pattern of performance can be distorted by a single unusual value.
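
A small illustration with invented errors shows the effect:

```python
from statistics import mean, median

# Illustrative forecast errors (A - F): roughly unbiased, plus one outlier.
errors = [2, -3, 1, -2, 3, -1, 40]   # the 40 could be a stockout month

print(mean(errors))    # ~5.7: suggests a strong underforecasting bias
print(median(errors))  # 1: the outlier-resistant measure sees almost none
```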

When an outlier-resistant measure is close to the conventional measure, you should report the conventional measure. If not, the analyst should examine the APEs for anything that appears unusual, and then work with domain experts to find a credible rationale (stockouts, weather, strikes, etc.).

Are There More Reliable Measures Than the MAPE?  

The M-estimation method, introduced in Chapter 2 of my new book, can be used to automatically reduce the effect of outliers by appropriately down-weighting values ‘far away’ from a typical MAPE. The method is based on an estimator that makes repeated use of the underlying data in an iterative procedure. In the case of the MAPE, a family of robust estimators, called M-estimators, is obtained by minimizing a specified function of the absolute percentage errors (APEs). Alternate forms of the function produce the various M-estimators. Generally, the estimates are computed by iterated weighted least squares.

It is worth noting that the bisquare weighting scheme is more severe than the Huber weighting scheme. In the bisquare scheme, all data for which 0 < |ei| ≤ Ks have a weight less than 1. Data with weights greater than 0.9 are not considered extreme; data with weights less than 0.5 are regarded as extreme; and data with zero weight are, of course, ignored. To counteract the impact of outliers, the bisquare estimator gives zero weight to data whose forecast errors are quite far from zero.
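
A sketch of the two weighting schemes in Python; the cutoff Ks is set to an arbitrary value here simply to compare their severity:

```python
# Huber and bisquare weights as functions of the forecast error e,
# given a cutoff Ks (tuning constant K times a robust scale s).
def huber_weight(e, Ks):
    """Full weight inside the cutoff, then declining as Ks/|e|."""
    return 1.0 if abs(e) <= Ks else Ks / abs(e)

def bisquare_weight(e, Ks):
    """Smoothly down-weights all nonzero errors; zero beyond the cutoff."""
    return 0.0 if abs(e) >= Ks else (1 - (e / Ks) ** 2) ** 2

# The bisquare scheme is visibly the more severe of the two:
for e in [0.5, 1.0, 2.0, 3.9, 6.0]:
    print(e, round(huber_weight(e, 4.0), 3), round(bisquare_weight(e, 4.0), 3))
```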

What we need, as best practice, are robust/resistant procedures: resistant to outlying values and robust against non-normal characteristics in the data distribution, so that they yield estimates that are more reliable and credible than those based on normality assumptions.

Taking a data-driven approach with APE data to measure precision, we can create more useful TAPE (Typical APE) measures. However, we recommend that you start with the Median APE (MdAPE) for the first iteration. Then use the Huber scheme for the next iteration and finish with one or two more iterations of the bisquare scheme. This Huber-Bisquare-Bisquare Typical APE (HBB TAPE) measure has worked quite well for me in practice and can be readily automated, even in a spreadsheet; a sketch follows below. It is worth testing with your own data to convince yourself whether the Mean APE should remain King of the accuracy jungle!
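
A minimal Python sketch of such an HBB iteration; the MAD-based scale estimate and the tuning constants are my assumptions for illustration, not prescriptions from the book:

```python
from statistics import median

def huber(u, k):
    return 1.0 if abs(u) <= k else k / abs(u)

def bisquare(u, k):
    return 0.0 if abs(u) >= k else (1 - (u / k) ** 2) ** 2

def hbb_tape(apes, k_huber=1.345, k_bisq=4.685, n_bisq=2):
    """Huber-Bisquare-Bisquare Typical APE, sketched.

    Start at the Median APE, take one Huber-weighted step, then finish
    with bisquare-weighted steps. Scale is the median absolute deviation
    (MAD); the constants are common defaults, assumed here.
    """
    estimate = median(apes)                          # iteration 0: MdAPE
    for i in range(1 + n_bisq):
        resid = [a - estimate for a in apes]
        s = median(abs(r) for r in resid) or 1e-9    # robust scale (MAD)
        w = [huber(r / s, k_huber) if i == 0 else bisquare(r / s, k_bisq)
             for r in resid]
        estimate = sum(wi * a for wi, a in zip(w, apes)) / sum(w)
    return estimate

apes = [3.1, 4.7, 2.9, 5.3, 4.1, 38.0]   # one wild APE, e.g. a stockout month
print(round(median(apes), 2), round(hbb_tape(apes), 2))  # MdAPE vs HBB TAPE
```

The outlier-resistant HBB TAPE settles near the bulk of the APEs, where a plain Mean APE would be dragged towards the wild value.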

Details may be found in Chapter 4 of Change & Chance Embraced: Achieving Agility with Demand Forecasting in the Supply Chain.

The Triple A of Art, Artificial Intelligence & Actuals vs Budget

By Timo Wienefoet, Managing Partner at IMPLEXA GmbH

Hito Steyerl was recently crowned the most powerful contemporary artist by ArtReview. The professor at the renowned UdK Berlin co-founded the UdK Research Centre for Proxy Politics. Her latest works at the Documenta and the Skulptur Projekte continue to dissect the digital world. It was a critique of hers that inspired me to span the bridge from Art to Artificial Intelligence (AI) to Corporate Planning and Analysis:

“Statistics have moved from constructing models and trying to test them using empirical data to just using the data […] They keep repeating that correlation replaces causation. But correlation is entirely based on identifying surface patterns, right? The questions–why are they arising? why do they look the way they look? – are secondary now. If something just looks like something else, then it is with a certain probability identified as this “something else,” regardless of whether it is really the “something else” or not.” (Hito Steyerl, full article)


AI-driven decisions as a "Black Box"

That AI-driven decisions are only partially understood is widely discussed. Two examples focus on the societal effects of, and the responsibility for:

  • Social media creating echo chambers – the algorithmic incarnation of confirmation bias, deployed to generate advertising revenues – and handling hate speech,
  • Compensation claims when machine-learned driverless cars are involved in accidents.

How can the risks these algorithms pose be economically grasped if the algorithm is understood as a black box immune to insight and legal claims? This is one question to ask in the coming planning cycles, when these concepts are to be integrated into the value chains.


Can the Technology-Driven Methods Bring Value to FP&A?

Another aspect is considering AI in the FP&A process itself. The AFP's Survey on Budgets ranks the logistics team last in “value perceived from budgeting”, yet logistics is the corporate unit most exposed to AI. The communicative, non-deterministic aspects of corporate planning are undervalued when planning mainly concerns dead matter. The key question is: can these technology-driven methods bring value to the very lively corporate budgeting and forecasting? They can, and they should, because AI provides the basic heuristics that evolution has proven most fit. This includes their misuse, as in the German saying about using a cannon to shoot a sparrow: “mit Kanonen auf Spatzen schiessen”.


AI in FP&A: Exploration-Exploitation Tradeoff

One intersection of artificial intelligence and budgeting is the exploration-exploitation tradeoff. Exploration is the acquisition and collection of data, while exploitation is making use of it. The selection of a casino slot machine involves this consideration: explore by spending coins on many machines, then exploit your lucky one until the luck leaves. The conundrum lies in the interaction of the two: continued exploration requires re-evaluation of the exploitation. Two valuable FP&A insights emerge from the tradeoff: there is no optimal strategy, and the timeframe is crucial.
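
For readers who want to see the mechanics, here is a minimal sketch of the tradeoff as an epsilon-greedy strategy over three hypothetical machines; the payout probabilities are invented for illustration:

```python
import random

random.seed(7)
payout_prob = [0.20, 0.35, 0.50]  # hidden win rates of three machines (assumed)
pulls, wins = [0, 0, 0], [0, 0, 0]
epsilon = 0.1                     # fraction of coins spent exploring

def observed_rate(a):
    # Untried machines get an optimistic 1.0, which forces a first try.
    return wins[a] / pulls[a] if pulls[a] else 1.0

for coin in range(1000):
    if random.random() < epsilon:
        arm = random.randrange(3)                  # explore
    else:
        arm = max(range(3), key=observed_rate)     # exploit the lucky one
    pulls[arm] += 1
    wins[arm] += random.random() < payout_prob[arm]

print(pulls)  # most coins end up in the best machine over time
```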

The next cycle resets the existing one and makes it obsolete. Additionally, the timeframe of a corporate plan exceeds most Las Vegas stays, and the same holds for the complexity of the planning scope compared to choosing “your” machine in a casino hall. The lesson can be applied to investment decisions as decisions on exploitation: the decision process should reflect explorative changes early rather than late. That is a strong argument for the Rolling Forecast, where the near future weighs heavier than the distant one. One AI representation reflecting this is called “Least Regret”. It converts the valuation from maximizing future potential gains to minimizing future regret. Least Regret is not about chasing the best, but about preventing the worst. Regular investment reviews and divide-and-conquer approaches to big projects are representations of this method. Exploitation must start to yield results, although in a month – don't make it a year – exploration may turn the tide.
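
A toy minimax-regret calculation (numbers invented) makes the switch from maximizing gains to minimizing regret concrete:

```python
# Payoffs of three investment options under two scenarios (numbers invented).
payoff = {
    "expand":   {"boom": 120, "bust": -40},
    "maintain": {"boom":  60, "bust":  10},
    "divest":   {"boom":  20, "bust":  30},
}
scenarios = ["boom", "bust"]
best = {s: max(p[s] for p in payoff.values()) for s in scenarios}

# Regret = best achievable payoff in a scenario minus the option's payoff;
# Least Regret picks the option whose worst-case regret is smallest.
worst_regret = {opt: max(best[s] - payoff[opt][s] for s in scenarios)
                for opt in payoff}
print(worst_regret)                             # expand 70, maintain 60, divest 100
print(min(worst_regret, key=worst_regret.get))  # 'maintain' prevents the worst
```

Note that chasing the best case would pick “expand”; minimizing regret picks the option that can never go badly wrong.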


Neomania in FP&A

People over-explore. The former “quant” Nassim Taleb described Neomania as a focus on “to be justified” innovations at the expense of proven methods; the concept was described in detail as part of the not-so-obvious facets of budgeting and forecasting. Does AI qualify for over-exploration? Over-exploration is a human survival instinct, as unknown options have a strong upside with limited downside. The math behind the bias was formalized by John Gittins, Professor of Operations Research at Oxford University. The Gittins index structures decisions on which path into the unknown to take; it helps to identify the exploitable casino machine. The index is calculated from the available explored information on each option/machine, and Gittins embedded a concept called discounting of future rewards, which should sound familiar to FP&A.

The results apply well to gambling: a winning turn urges you to stay, and a streak justifies a couple of losses before switching to more promising machines the gambler knows less about. It shows there is value in finding out. Results are suboptimal, however, where there are switching costs and timely calculation requirements, which are precisely the characteristics of Capital Expenditure decisions: well calculated, because switching is costly. Flexibility by design diminishes this. One example of regulatory-induced flexibility is the Data Portability concept in European Data Protection Law; it seems to improve the odds if a vendor selection goes sour.
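
The discounting Gittins embedded is the same mechanics FP&A knows from present-value calculations; a toy sketch with an assumed discount factor:

```python
def discounted_value(rewards, gamma=0.9):
    """Present value of a reward stream under geometric discounting."""
    return sum(r * gamma ** t for t, r in enumerate(rewards))

# A known machine paying steadily vs. an unknown one that may pay off later.
known   = [10, 10, 10, 10, 10]
unknown = [0, 0, 0, 25, 25]     # exploration costs now, rewards come late

print(round(discounted_value(known), 1))    # ~41.0
print(round(discounted_value(unknown), 1))  # ~34.6: discounting penalizes waiting
```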

Two options are cited to optimize under the mentioned limitations. The first is the mentioned Least Regret. The second is simplification: as an alteration to the Gittins index, it skips discounting future rewards calculated from past information and focuses solely on the potential of the new. These upper confidence bound algorithms are a perfect metaphor for Neomania: the emphasis is on exploration, and the unknown is strongly favored. Incubators, Accelerators and Break-Outs are organizational means to harness the next big thing hidden in plain sight. A stringent funding methodology coupled with experience should navigate these means into the drawer or out into the business world.
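
A minimal sketch of the UCB1 scoring rule with invented data shows the optimism bonus at work:

```python
import math

# UCB1 score: the observed mean reward plus an exploration bonus that
# grows for options tried rarely relative to the total number of trials.
def ucb1(mean_reward, n_option, n_total):
    return mean_reward + math.sqrt(2 * math.log(n_total) / n_option)

# Three initiatives: (average payoff so far, times tried). Numbers invented.
options = {"core product": (0.60, 80), "new market": (0.50, 15), "moonshot": (0.40, 5)}
n_total = sum(n for _, n in options.values())

for name, (mu, n) in options.items():
    print(name, round(ucb1(mu, n, n_total), 2))
# The rarely tried 'moonshot' scores highest: Neomania, quantified.
```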


Combining the Now and the Future requires a well-prepared FP&A team

In general, the explore-exploit tradeoff offers more practicable insights for FP&A:

  1. For the past, good teams stay away from exploring late in the game and exploit the best option. For FP&A, the task is then to lay out the efficacy of handling the restless reality of past decisions, telling the story via the financial statements.
  2. The reality of tomorrow is best prepared for with a restless mind that exploits late while exploring the less mature fields. In reality, 90% of FP&A is assigned to the first task, omitting the potential to better understand the future economically.
  3. Artificial intelligence has tackled the explore-exploit tradeoff but has not solved this dilemma so far, and it is believed it never fully will.

Future coordination is a “brainer”, not a no-brainer. Combining the Now and the Future requires a well-prepared FP&A team. AI in the form of Robotic Process Automation (RPA) can support the “no-brainers”. Looking ahead requires diligence and thinking time, which RPA can create space for. Also, computer algorithms prescribe to keep exploring for future gains. This includes looking to the Arts and to Artificial Intelligence developments.

Four Steps Towards Analytic Maturity

The volume of data available today is staggering. It is estimated that every 60 seconds around 1,820 terabytes of data are created and 700,000 Google searches are performed, along with 98,000+ tweets and 11 million instant messages. Of course, not all of it will be relevant to your organisation, but if data represents the ever-changing requirements of our customers, shouldn't we be interested? Business Analytics provides the only way to quickly and effectively sift through an ever-increasing mountain of data to highlight hidden trends and nuggets of information that could be vital to survival and growth.

There is no point in aimlessly analysing data in the hope that something will jump out at you. It won't, and all you will do is waste vast quantities of time and effort. Like any search, there must be an objective and a plan to reach that objective. This is where a mature approach to analytics comes in.

Stage 1: Review existing BI/analytic needs

The first step is to look at your current BI/analytic usage to see if it covers the key questions in the business:

  • What actually happened?
  • How efficient and effective are our business processes?
  • What is likely to happen if we carry on as we are?
  • Where should we be aiming given where the market is heading?
  • What could we do differently and how much would it cost?

This set of questions will almost certainly require different models (see my blog on 7 Key Analytic models). Check what you have and note which areas are weak or non-existent. For each weak area, ask yourself what the value to the organisation would be if you knew the answer. You may want to ask colleagues/senior management the same question, which could be framed in terms of the threats and opportunities of ‘not knowing’.

Stage 2: Focus on the area with the most impact

From the above, take the one area that has the biggest potential for improving performance (or mitigating disaster) and determine what data would be needed to provide the answer. Remember, it's not about what data you have but what you need.
Once you've done this, go and see what data exists, making a note of where it is, the cost of acquiring it and, if it is not available, how to get a reliable estimate. Knowing the status/cost of crucial data is a key element in creating a strategic approach to business analytics. Just because you have data doesn't mean it is useful – it could just be a distraction from what you really need to know.

Stage 3: Gain experience in using a modern BI/Analytics system

Business analytic systems have changed tremendously over the years. Today, most systems can analyse huge amounts of disparate data, quickly and with little knowledge of IT systems required. When evaluating a potential system, keep in mind the types of analyses you'll need to answer the key business questions. Then try out the product's capabilities that support those analyses on a sample set of your data. Check it for usability, its collaboration potential with other users, and how it links into planning. The cost of acquiring, developing and supporting an analytic system should be placed in the context of the benefits that will be obtained.

Stage 4: Create a Business Analytics strategic plan for the organisation

Armed with the above information, you should be able to create a plan that outlines the development of business analytics throughout the organisation. Analytic models are never static and will always require tweaking or re-development. That's because the data being analysed will generate new data requirements and analyses that will be needed to move forwards.

A good way forward is to establish a group of people to monitor the impact and potential of business analytics: people who understand how the organisation functions and its strategic goals, and who have up-to-date knowledge of the latest developments in analytic systems. An ideal place for this group is within the FP&A department.

The article was originally published on the prevero Blog.

FP&A Strategy Execution Readiness

By Niels Van Hove, S&OP/IBP expert at Truebridges

Your company has a strategy. Are you confident it is ready to be deployed and executed? Most likely not, as 60%-90% of strategy implementations fail and only 14% of executives are satisfied with the execution of their strategy. This post introduces a framework to assess your company's strategy execution readiness, in order to align leaders around how to bridge your strategy-to-execution gaps. I also provide a free strategy execution checklist to assess your readiness.

Before a strategy can be deployed and executed effectively, you need to have leadership alignment around execution readiness. Your leadership team has to ask themselves at least the following questions:

  1. Do we have the best possible circumstances to execute our strategy?
  2. If not, what are the gaps to effectively execute our strategy?
  3. Are we aligned around our strategy execution gaps and strengths?
  4. How and when are we going to address our gaps?

Answers to these questions will build understanding among your leaders of how they perceive the risk profile of executing the strategy. They will give your leadership team insight into whether they feel ready to execute. But most importantly, they will create alignment on how your company can create the best possible circumstances to execute the strategy successfully.

To help executives understand their gaps to effective strategy execution, I researched 7 strategy implementation frameworks and 75 leading strategy execution articles published over the last 30 years. I looked for the most common strategy execution factors, and distilled and grouped them into 5 key success factors.

I developed a 40-question checklist to assess business confidence across the 5 strategy execution key success factors. If you take this survey with your leadership team, it provides answers to powerful questions like:

  1. What are our perceived strategy execution gaps?
  2. What are our perceived strategy execution strengths?
  3. Are we aligned around strategy execution readiness?

What follows is a brief description of these 5 ultimate strategy execution success factors.


Leadership Alignment

There is collective leadership and a common language around purpose, vision, behaviours, strategic capabilities, the balanced scorecard and budget objectives. This supports focused decision making, resource allocation and issue resolution.

60% of organizations do not have strategic initiatives in the budget

Mindset and Behaviours

A resilient, positive, growth-minded culture, with effective behaviours and group dynamics, supports alignment, integration and strategy execution. Such cultures are proven to outperform negative, fearful cultures with aggressive behaviours. People & Culture get a mention in 6 of the 7 researched strategy frameworks, however:

30% of managers mention cross-unit working as the greatest challenge to strategy execution

Performance and Appraisal

The strategy is cascaded to the individual performance level. Objectives, rewards and consequences are clear and include the ‘What’ as well as the ‘How’ of what we do. Recruitment policies are aligned with values and behaviours, and action is taken on performance issues. Reward systems are mentioned in 6 out of 7 researched strategy frameworks, however:

For 70% of middle management and 90% of front-line employees, incentives are not linked to strategy

Organization Change

The organizational structure supports the strategy. There is strategic capability building, clear roles and responsibilities and continuous formal, informal and two-way strategy communication to engage employees. Organizational structure is mentioned in 7 out of 7 researched strategy frameworks, however:

In 38% of companies, managers do not inform their team about the chosen strategic direction

Integrated Planning & Monitoring

There is a strategy implementation plan. A periodic rolling forecast provides visibility into gaps to budget and supports enterprise resource re-allocation. Strategic initiatives, goals, measurements and targets are periodically monitored. Control, process, information systems and goals are mentioned in 5 out of 7 researched strategy frameworks, however:

92% of companies do not report on strategic lead performance indicators

If you and your leadership team ask yourselves the right questions around these five key strategy execution themes, you will create understanding around your strategy-to-execution gaps. You will start a conversation together about your strategy execution readiness. You will hold up a mirror and ask: are we ready to execute?

You can download the Strategy Execution Survey here. Good luck with aligning your leadership team around strategy execution.

Data-driven FP&A: the tip of the iceberg

By Irina Steenbeek, Founder at Data Crossroads

In Part 1 of the series ‘FP&A and the three-headed serpent’, you got acquainted with the ‘data – business planning – tools & techniques’ model, which is the basis for FP&A practices. The combination of words ‘data-driven’ seems to imply that you should start with data, and to some extent this is true.

As an FP&A professional, you deal with data that is being delivered to you by various business stakeholders. Your data travels a long way from its source through applications to your reporting system.

The data you deal with is the tip of the iceberg. The major part of data processing is hidden beneath the water, and this is also where all the data quality issues originate.

You do not have much influence on this hidden part; it lies outside your area of accountability. Unfortunately, it is the most important part of making a business data-driven. The dilemma resembles the ‘chicken or the egg’ question: which comes first, and where do you start? You want to make data manageable and should become a sponsor of this idea within your company, but you know it cannot happen overnight. There is a long way to go before your company can achieve this goal.

Still, there is a lot you can do now, at the tip of the iceberg, to move closer to data-driven FP&A. These actions are not only data-related but can also involve the business processes and techniques you are using.

Define who really needs which data

As an FP&A professional, you have probably built a strong business partnership with top and operational management.

Insight 2 from the global FP&A research stipulates that FP&A teams aspire to be more strategic, quote: ‘Companies are wasting valuable analytical talent on low-value adding activities such as data reconciliation, data cleansing, reporting reconciliation etc.’.

Very often, FP&A staff simply deliver standard reports to their main stakeholders with a certain frequency. The first step is to check whether all the reports you deliver are really necessary; reports themselves are merely containers of information. The second step is to specify the data needs and requirements of your main stakeholders. These requirements may include the critical information needed for decision making, the frequency of report delivery, or the way reports are to be delivered.

Optimize reports and information you deliver

It is possible that the actual needs of the stakeholders are quite different from the reports you have been delivering all this time. In this case, you should start by analyzing your current reporting practices. Review the main reports first. Hopefully, you will find a way to optimize the number and content of these reports; you might find that different reports contain similar information. Concentrate on the data that is critical for decision making. An effective technique in report optimization is working out report flows. You might find some closed loops in reports circulating even within your own department.

Speak the ‘same language’ as your colleagues

After you have optimized the number and content of your reports, you should have a look at the language you use in them. The terms that are clear to you and your FP&A colleagues might not be that understandable to other stakeholders. In one of my earlier articles, I provided an example with the definition of the term ‘revenue’: ‘In accounting, revenue is the income that a business has from its normal business activities, usually from the sale of goods and services to customers. Revenue is also referred to as sales or turnover.’ If you have time for a spare project, spend it on creating a business glossary that will ease the communication between you and your colleagues.

Improve information delivery

As soon as you have refreshed the information needs and requirements, you might start thinking about the way to deliver the data. At this point, many of you deliver information in the form of reports, either on paper or in electronic form.
A while ago, I had an interesting conversation with a data manager from a large Dutch company. I was fascinated by the way he had managed to organize the ‘tip of the iceberg’ in the company. They developed a system of KPIs to be used by stakeholders from different departments for decision making at different management levels. After that, they implemented dashboards across the whole company. Such an approach allowed them to eliminate the production of many unnecessary reports. Only after cleaning up the ‘top’ did they turn to the hidden part of the iceberg.
All the steps discussed above concern data and optimization of data-related processes at your level of accountability.
In our following blog, we will discuss what data is actually right for FP&A. Stay tuned!