“The same companies that vow to respond quickly to market shifts cling to budgeting – a process that slows the response to market developments until it is too late.”
- Jeremy Hope and Robin Fraser, co-founders of BBRT
With analytical driver-based models, FP&A’s processes become more flexible and dynamic. A new generation of systems allows for agility, easy management of FP&A routines and collaborative planning. FP&A systems are now more often managed in the finance department, which helps to eliminate the problems of ‘black boxes’ and heavy reliance on expensive programmers and IT departments.
Today, we are seeing the emergence of a new generation of FP&A professionals, who are different from traditional accountants. These practitioners can see the big picture, understand key business drivers, build models and generate valuable business insights. Most importantly, they can communicate the insight for effective decision-making and also inspire people across the organisation.
The basis for this evolution is a changing business culture. The traditional, outdated budgeting mentality is being challenged more and more. Target negotiations, political games and biased planning processes cost companies a lot of money. These conventional budgeting practices are non-value adding and need to be abandoned if organisations want to stay competitive in this dynamic business environment.
While the process of moving away from a traditional budgeting mentality is painful and slow, it is happening around the globe. The question is, are we ready to abandon the traditional budget completely?
Going Beyond Budgeting
The proponents of the ‘beyond budgeting’ movement are convinced that the budget attempts to fill too many roles, namely:
- Setting targets.
- Resource allocation.
- Performance management.
One of the biggest conflicts of interest lies in trying to combine planning with target negotiations. It is a well-known paradigm: the budgeting process is a process of heavy target negotiation, and targets are linked to remuneration. This makes traditional budgeting a very emotional process: it is a battle for the participants’ next-year pay packages. Can it still be objective and unbiased? Probably not.
This is a game in which objectivity is often forgotten. When this subject is discussed at FP&A Club events in different countries, it invariably produces smiles on FP&A professionals’ faces: they can strongly relate this behaviour to inefficiencies in their own processes. Unfortunately, the traditional process is still so heavily embedded in business practices around the globe that it is often very difficult to challenge the status quo.
Luckily, change is already underway. More and more companies are realising that the budget is out of date before it is even finalised and is, therefore, not that relevant for the business.
Some companies have even started to think about abandoning the budgeting process altogether. Norway’s Statoil and Sweden’s Handelsbanken are among those to have done so, demonstrating that it is possible to survive and prosper without a budget.
However, this cannot happen overnight. Going beyond budgeting requires serious changes in business culture. It is well-known that such changes are the slowest and most painful.
What is Beyond Budgeting?
Bjarte Bogsnes, vice president of performance management development at Statoil, defines beyond budgeting thus: “The main purpose is liberation from dictatorship, micromanagement, number worshiping, calendar periods, hierarchies, secrecy, stick and carrot.”
The main problems of traditional budgeting may be summarised as follows:
- Planning and forecasting are mixed with the target setting process.
- Compensation targets are fixed and heavily negotiated during the planning and budgeting process.
- The budget for the year is fixed. It is perceived as each department head’s annual allowance: if it is not spent this year, the budget may be cut next year.
- The capital budget is set during the budgeting process and is often fixed for the next financial year (in both project and money terms). If an interesting business opportunity arises during the year, it may not be considered at all if there is no budget for the new project.
Looking at these problems, many realise that traditional budgeting is not fit for purpose. But are we ready to abandon it?
Many FP&A professionals find the idea of abandoning the budget difficult to comprehend. Typical responses include:
- How will we report to the market analysts?
- How will we compensate our sales force?
- How will we control our expenses?
The reality is that the beyond budgeting philosophy does not leave an empty space. Abandoning the budget brings new management tools that allow companies to overcome these inefficiencies and the culture of dysfunctional behaviour.
Beyond Budgeting, Step-by-Step
The first step should involve separating the three processes that are historically combined in the traditional budgeting routine: target setting, forecasting and resource allocation.
The next step should consist of improving each of these processes:
- The target should become more ambitious, but relative. For example, it should be connected to external industry benchmarks. Fixed targets do not support holistic performance management practices.
- The forecast should be driver-based and analytical, and assumptions should be logical. Only a logical, analytical process can help create an unbiased planning and forecasting process.
- Resource allocation should be flexible, not fixed. If a new opportunity arises during the year, it should not be killed because of the lack of capital budgeting.
The instruments of beyond budgeting are different: in general, it rests on rolling forecasts, balanced scorecards, analytical driver-based models, modern FP&A systems and efficient FP&A departments.
Above all, the most important factor for beyond budgeting is business culture. While some organisations are not yet ready for this change, they may well become so over time, and the process will be evolutionary.
Many companies are currently in a stage of transition: deeply dissatisfied with their FP&A processes, overworked and bombarded by deadlines, and looking for new FP&A talents that can be difficult to find. They are beginning to take the first positive steps towards a better FP&A business culture: implementing rolling forecasts, driver-based models and modern FP&A systems.
However, the culture is often driven from the top of the organisation. Chief financial officers (CFOs) and chief executives (CEOs) have lived with budgets for too many years.
A simple test to understand how embedded the "traditional budgeting culture" is in your processes is to ask your FP&A team the following questions:
- Are you ready to challenge your CFO/CEO when he/she sets unrealistic targets? Are you ready to point out when a plan is biased and not in line with the trends and driver-based models you have?
- Do you have enough courage and power to eliminate ‘hockey-stick forecasts’ in your company, which often demonstrate how unrealistic and biased the process is?
If you can answer ‘yes’ to the above questions, then probably you are ready for the beyond budgeting journey.
If not, good luck in your search. Please remember, though, that if you want your FP&A department to become more strategic and influential, you have to challenge the status quo.
This article was first published on http://www.gtnews.com and http://www.afponline.org/
Integrated Business Planning (IBP) is often seen as a natural progression from Sales and Operations Planning (S&OP), which came to life in the 1980s to align sales and operations. Because S&OP found its origin in the supply chain, IBP is often coloured by supply chain terminology and reasoning. It can be argued that current IBP development is still driven by a supply chain bias. With this lack of diverse thinking, IBP innovation runs the risk of not being truly ‘integrated’.
In contrast to most of the currently defined IBP maturity phases one can find on the internet, we can also define IBP maturity phases from a more strategic angle. Many experts agree that IBP provides a monthly check and balance against the budget and the strategic intentions of a business. A well-executed IBP cycle will therefore provide monthly visibility and measure progress against business objectives and strategy in the long-term horizon. Furthermore, a business strategy and the required strategic resources and capabilities have the goal of getting a company closer to its vision.
According to Collins and Porras, a company vision consists of its core values, core purpose, a BHAG (big, hairy, audacious goal) and a vivid description. The core purpose is the reason for being; it captures the soul of the organization. Whereas you can fulfill a strategy, you can never fulfill a purpose. Core values define what the company stands for; a company will stick to them even if they become a competitive disadvantage in certain situations. Well defined, integrated and truly lived, purpose and values will drive companywide behavior. Embedded company behaviors will drive a sustainable company culture which will last over time. A well-defined, achievable BHAG with a vivid description provides employees with an envisioned future they can identify with, and creates an emotional attachment which makes them go the extra mile. As CEO Bob McDonald said of the emotional component and innovation at P&G: ‘People will innovate for financial gain or for competitive advantage, but this can be self-limiting. There is a need for an emotional component as well – a source of inspiration that motivates people.’
If a company wants to track its budget and strategy and we use this vision framework and IBP as the planning process to support the business, IBP can be defined with the following maturity phases:
1. Integrated planning:
In this phase, companies start to focus on integrated planning between previously siloed functional areas. Some functions are more advanced than others: a company might have focused on state-of-the-art finance processes and systems, but doesn’t reap the full benefits due to a lack of integration of other functional areas into the finance process. Some integration exists, but not across all functional areas, and there is not enough integration with finance to make a monthly financial prediction at EBIT level in the long-term horizon. S&OP, as most define it, sits in this phase.
2. Dynamic budget planning:
In this phase, enough functional areas plan in an integrated way for the process to provide their input to the P&L and create a fully loaded, forward-projected P&L. Finance understands the ‘volume’ input and the other functional areas understand the financial ‘value’ planning. This gives the company monthly visibility at EBIT level on how it is tracking versus the budget or annual operating plan. Why EBIT level? Because I have heard the argument too many times in a boardroom, when only gross profit was on the table: ‘we can’t decide on this because we don’t have an EBIT number’. We can also expect these companies to deliver monthly balance sheet and cash flow predictions. For these companies, there is no separate budgeting or forecasting cycle; every month can be the budgeting cycle. ‘Dynamic’ indicates that opportunity and risk scenarios across all functional areas are integrated into the financial projection.
3. Dynamic strategy and capability planning:
In this phase, the company has defined its strategic goals, measurements and targets, and is capable of checking and communicating monthly whether it is on track to meet the strategy in the horizon beyond the budget. The strategic intent, which can be defined at lower levels such as product segment, country or business unit, will also guide decision-making on the budget horizon.
The company has also defined the core strategic capabilities needed to meet its strategy. Many strategic capabilities are possible. Ideally, a company shouldn’t have more than a handful, as defining more will erode the focus on these capabilities. Some examples are:
- Risk management: for companies with extended and complex networks that are sensitive and dependent on changes in global and geopolitical events. For example, companies with global supply chains, but also the finance industry.
- Innovation: for companies that, in a highly competitive market, can outpace their competitors through innovation and new product development. Often seen in the technology industry and CPG.
- Commodity trading: for companies that are highly influenced by commodity cycles, as commodities can be more than 80% of the COGS of their core products. For example, food and beverage companies that have crops and livestock as core raw materials.
- Demand-driven supply chain: for companies that can gain competitive advantage from driving their business from the front end of the supply chain. For example, the food, beverage and consumer packaged goods industries, often in retail and consumer environments that are promotion-driven and where POS information is available.
- Knowledge management: for companies that are highly dependent on knowledge workers and the exchange of knowledge between people and business units, and that have IP to integrate, sell and protect. For example, consultancies and the software industry.
- Supply exploration: for companies that have to spend large amounts of capital to find new supply of their core product or to increase it. For example, the oil and mining industries.
- Collaboration: collaborative IBP can be a separate phase for companies that see strategic advantage in collaborating with their suppliers and customers in the longer horizon and therefore want to integrate their business plans. For companies that hold the power in the supply chain through size or uniqueness of offering, this will most likely not be a strategic focus.
The list can go on and on: technology, sustainability, etc. Once a company has defined its strategic capabilities, along with goals, measurements and targets for them, it needs plans to implement or improve these capabilities. An update on status, progress, risks and mitigations for those plans will be part of the IBP cycle in this phase. ‘Dynamic’ indicates that sensitivity analysis around the plans to reach the goals of each strategic capability is part of the update.
4. Integrated vision & purpose:
In this phase, companies have a well-defined purpose, values and an achievable BHAG with a vivid description that people can identify with and which creates an emotional attachment. The company aims to integrate this with the IBP cycle. A company can decide not to pursue strategic opportunities because doing so would compromise its core purpose or values. A large multi-billion-dollar beverage company, for example, decided not to enter the very lucrative market of premium RTDs (ready-to-drink alcoholic beverages) because the alcohol content was too high. Although the opportunity was achievable and the margins very attractive, the alcohol content would not have been in line with their core purpose of ‘bringing more sociability and wellbeing to our world’. The purpose guided decision-making in the strategic horizon.
The company values and the emotional attachment will be tracked in the monthly IBP process and have actions, goals and measurements. Executives follow progress to understand whether employees believe in and identify with the company’s values, BHAG and purpose, and show emotional attachment. This can be done through 360-degree feedback, engagement surveys or roundtable discussions between executives and employees. Executives also have to lead by example in behavior and actions; their own behavior will have goals and measurements, and progress is tracked.
For all phases, communication is important, although it can be argued that it is most important when developing an emotional connection. An IBP document on the key decisions, outcomes, progress and wins in the IBP cycle can be communicated to a well-defined stakeholder group in the company. This gives stakeholders an understanding of business performance, priorities, improvement opportunities and successes, and keeps them engaged with the company vision, purpose and the IBP process. Executives have to realize and appreciate that this communication document is the result of all the hard work by middle and lower management to gather the IBP information the executives need to make decisions in the IBP meeting. This communication makes sure the IBP meeting is not seen as a ‘black hole’ which only sucks up information and provides no feedback.
Once a company masters these four phases, it tracks and plans on a monthly basis the budget, the strategic intent and strategic capabilities, the company values and purpose, and the emotional attachment of its employees. If a company then links these plans with shorter-term control plans and execution, we might call it real Integrated Business Planning.
Would these four phases be IBP innovation?
Inside this article are the bread-and-butter ratios of financial statements. Some gauge effective use of assets; some report the financial condition of the company. It is with these ratios (relationships) that most financial analysis begins. They are powerful in developing forecast statements, assessing performance and validating model output. What I am outlining here is the very core of virtually all financial analysis output.
Recapping, the purpose of corporate finance is to maximize shareholder wealth. Maximizing shareholder wealth requires asset management, investment management and financing management. Sometimes corporate finance is managing “as is” assets in place. Sometimes corporate finance is managing what it believes the future may hold so it may properly position itself today (investment and future growth opportunities). And to execute asset, investment and financing decision making we must accurately assess both parts of our fundamental valuation equation: CF/r.
Valuation is, by concept, forward-looking. Therefore, both CF and r require forecast figures. Forecast figures require a base of observations from which to draw inferences for the estimation of future events and circumstances. Internally, our base of observations is our historic statements. The inferences are our drivers. These drivers are most often expressed as ratios (% of X), adjusted actuals (X(1+g)), or a combination of both.
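As a minimal sketch (all figures below are illustrative, not taken from any real statement), the two driver forms just described can be expressed as:

```python
# Two common driver forms for forecasting a line item.
# All figures are illustrative assumptions.

def pct_of_driver(driver_value, ratio):
    """Ratio driver: forecast = % of X (e.g. COGS as a % of revenue)."""
    return driver_value * ratio

def adjusted_actual(prior_actual, growth):
    """Adjusted-actual driver: forecast = X * (1 + g)."""
    return prior_actual * (1 + growth)

revenue_forecast = adjusted_actual(1000.0, 0.05)       # prior revenue grown 5%
cogs_forecast = pct_of_driver(revenue_forecast, 0.60)  # COGS at 60% of revenue

print(revenue_forecast, cogs_forecast)  # 1050.0 630.0
```

A real model would, of course, take the ratio and growth assumptions from the historic statement analysis described above rather than hard-coding them.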
Drivers must be based on proven relationships which have stood the test of time (the “use test” as referenced in Basel II) and must be adjusted as warranted (“subjective overlay”). This combines fundamental storm weathered statement analysis technique and subject matter expertise.
Please take note that the words “big data”, “transformation”, “analytics” will not appear in this text. This is because:
The single greatest empowerment to FP&A in the 21st century is the fundamental financial theory and statement analysis developed over the past 80 years.
- Cue in on published material and executive guidance
- Observe trends and existing relationships
- Use your textbook ratios for analysis, KPIs and output validation
- Common size
- Look year over year
- All costs are semi-fixed, ultimately
- Normalize figures by using informed adjustments (SME)
A Word on Drivers, Ratios and KPIs
- Drivers steer $ or utilization figures/metrics as inputs. A driver may also be a key performance indicator, but they are usually used from an input perspective (independent variable). A driver or metric may be keyed in on by executive management, but only to the extent that a KPI was not met.
- KPIs measure performance output. They are generally dependent variables.
- There can be crossover, but keep in mind that the term "KPI" is a relatively new naming convention.
- Drivers and KPIs are often expressed as ratios. Some ratios may be neither drivers nor KPIs, but they relate important information concerning the health and position of a firm and can be used as model output validation. A KPI to one party (a debt holder, for example) may not hold the same weight or meaning as a KPI to another stakeholder.
Starting a Forecast: How Do We Begin?
Start high, go low as data allows, and improve insight over time. Start with fundamental drivers and statement formats. Stats and drill down database techniques will someday further enhance the process downstream in your model. Here is an income statement summary. Start here, pull apart later on.
Common sizing statements is a good start for financial statement forecasting.
From Investopedia: “A common size income statement is an income statement in which each account is expressed as a percentage of the value of sales. This type of financial statement can be used to allow for easy analysis between companies or between time periods of a company.”
One obvious result of the common-sized income statement is the set of sub-margins and net margins. These make for good MD&A discussion and general trend analysis, especially for variable costs.
Also useful is a common-sized balance sheet. It will flush out causal relationships quickly.
“Horizontal” common-sized statements show year-over-year changes, or “growth”. They are very simple, easy to execute and powerful as sanity checks and in more formal validation. Trends become apparent, as do anomalies. The chart below is the work of Robert A. Weigand, Ph.D. It is simple, clear and concise.
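As a toy illustration of vertical and horizontal common-sizing (the two-year figures below are invented, not taken from any real statement):

```python
# Vertical common-size (% of sales) and horizontal (year-over-year growth)
# for a toy two-year income statement. All figures are illustrative.

statement = {
    "Revenue": [1000.0, 1100.0],
    "COGS":    [600.0,  671.0],
    "SG&A":    [200.0,  210.0],
}

# Vertical: each line as a % of that year's revenue
vertical = {
    line: [v / statement["Revenue"][i] for i, v in enumerate(vals)]
    for line, vals in statement.items()
}

# Horizontal: year-over-year growth per line
horizontal = {line: vals[1] / vals[0] - 1 for line, vals in statement.items()}

print(vertical["COGS"])                 # [0.6, 0.61]
print(round(horizontal["Revenue"], 3))  # 0.1
```

Here the vertical view shows COGS creeping from 60% to 61% of sales even while the horizontal view shows healthy 10% revenue growth, which is exactly the kind of anomaly these statements surface.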
Classification of Ratios/KPIs and Definition of Terms
Ratios can be divided into four main categories. These may also be used as KPIs, depending on need, circumstance and executive directive:
- Activity Ratios
- Liquidity Ratios
- Debt and Solvency Ratios
- Profitability Ratios
These ratios provide checks for development, statement management and validation. Is model output reasonable and does it follow accepted patterns? Ratios not only reflect the financial position of a firm but may also reveal a model’s calculation errors. These ratios reflect causal relationships, time series relationships or naïve (flat) relationships.
| Ratio | Formula |
| --- | --- |
| Operating Expense % of Revenue | Operating Expenses / Revenue |
| Receivables Turnover | Revenue / [(A/R Beg. Balance + A/R Ending Bal.) / 2] |
| Fixed Asset Turnover | Revenue / [(Fixed Asset Beg. Balance + Fixed Asset Ending Bal.) / 2] |
| Total Asset Turnover | Revenue / [(Total Assets Beg. Balance + Total Assets Ending Bal.) / 2] |
| *Risk / Leverage Ratios* | |
| Current Ratio | Current Assets / Current Liabilities |
| Debt / Equity | Total Interest-Bearing Debt / Total Equity |
| Interest Coverage | EBIT / Interest Expense |
| Operating Margin (EBIT %) | EBIT / Revenue |
| Net Margin | Net Income / Revenue |
| NCF % of Revenue | Net Cash Flow / Revenue |
| NCF % of Net Income | Net Cash Flow / Net Income |
| Return on Assets | Net Income / [(Total Assets Beg. Balance + Total Assets Ending Bal.) / 2] |
| Return on Equity | Net Income / [(Equity Beg. Balance + Equity Ending Bal.) / 2] |
| Revenue Growth | (Revenue2 / Revenue1) - 1 |
| Earnings Growth (EBIT) | (EBIT2 / EBIT1) - 1 |
| Earnings Growth (Net Income) | (Net Income2 / Net Income1) - 1 |
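A few of the ratios above, computed in a short sketch with invented balances; average balances use (beginning + ending) / 2, as in the formulas:

```python
# Selected ratios computed from toy balances. All inputs are illustrative.

def average(beg, end):
    """Average balance: (beginning + ending) / 2."""
    return (beg + end) / 2

revenue, net_income, ebit, interest = 1000.0, 80.0, 150.0, 30.0
assets_beg, assets_end = 750.0, 850.0
equity_beg, equity_end = 380.0, 420.0

total_asset_turnover = revenue / average(assets_beg, assets_end)  # 1.25
interest_coverage = ebit / interest                               # 5.0
net_margin = net_income / revenue                                 # 0.08
roa = net_income / average(assets_beg, assets_end)                # 0.1
roe = net_income / average(equity_beg, equity_end)                # 0.2

print(total_asset_turnover, interest_coverage, net_margin, roa, roe)
```

In practice these calculations would run across every model period, as a validation layer on top of the forecast statements.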
With causal relationships, analysts examine the cause-and-effect relationship of a variable with other relevant variables, such as the level of consumer confidence or an income statement or balance sheet item. Below are examples of common causal relationship calculations:
- Position calculations represent a company's financial position regarding earnings, cash flow, assets or capitalization. Calculations can be expressed as a dollar amount, a percentage or a comparison. Position calculations are often referred to as “common-sized” when they are uniformly applied to a whole statement.
  - Cash % of Assets
  - Debt / Equity
- Metric calculations assess financial position relative to a non-financial figure such as days, transactions or number of customers.
  - Transaction figures (units sold per day, transaction days, etc.)
  - Utilization %
The Hertz 10K, for example, includes some great metrics to use as starting points in a forecast.
Here are some well-used financial statement metrics used to develop line items for forecasts:
- Cash % of revenue
- Cash % of assets
- Inventory % of sales
- Accounts receivable % of sales
- Accounts receivable based on days outstanding
- Short term securities % of debt
- Short term securities % of cash
- Depreciation and amortization function of first cost
- Payables % of cost of goods sold
- Debt function of D:E target
- Dividend result of D:E target
- Revenue = price x volume
- Cost of goods sold % of revenue
- Selling expenses % of revenue
- Depreciation function of first cost of asset
- Interest expense % of average debt balance
- Taxes at tax rate
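As a hedged illustration, here is a minimal income statement assembled from a handful of the driver rules above; every driver value is an assumption for the example only:

```python
# A minimal income statement built from driver rules.
# All drivers and figures are illustrative assumptions.

price, volume = 10.0, 120.0
drivers = {
    "cogs_pct_rev": 0.55,
    "selling_pct_rev": 0.12,
    "depreciation": 40.0,           # a function of asset first cost (given here)
    "interest_pct_avg_debt": 0.05,
    "avg_debt": 300.0,
    "tax_rate": 0.25,
}

revenue = price * volume                          # Revenue = price x volume
cogs = revenue * drivers["cogs_pct_rev"]          # COGS % of revenue
selling = revenue * drivers["selling_pct_rev"]    # Selling expenses % of revenue
ebit = revenue - cogs - selling - drivers["depreciation"]
interest = drivers["avg_debt"] * drivers["interest_pct_avg_debt"]
pretax = ebit - interest
net_income = pretax * (1 - drivers["tax_rate"])   # taxes at tax rate

print(round(net_income, 2))  # 255.75
```

Each line item is driven either causally (% of revenue), by a balance relationship (interest on average debt) or by a flat rule (tax rate), mirroring the list above.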
A widely known causal method is regression analysis, a technique used to develop a mathematical model showing how a set of variables is related. Regression analysis that employs one independent variable and one dependent variable, and approximates the relationship between the two by a straight line, is called simple linear regression. Regression analysis that uses two or more independent variables to forecast values of the dependent variable is called multiple regression analysis.
Time Series Relationships
Time series relationships move from period to period, whether daily, weekly, monthly, quarterly, seasonal or cyclical. Here are some other financial statement metrics used to develop line items for forecasts as a result of year-over-year changes:
- Cost of Goods Sold (Price x Volume)
- Non-recurring and one-time expenses such as legal fees
- Not all costs can be evenly divided between fixed and variable, though the same underlying methodologies and techniques apply.
- Here is the classic formula, per Reference.com: total cost equals fixed costs plus variable costs multiplied by quantity, or TC = FC + VC(Q). Fixed costs do not change with factors such as production levels, whereas variable costs change with production.
- This simple formula can be further refined by breaking fixed and variable costs apart further, weighted.
- Keep in mind that, conceptually, no cost is entirely fixed or entirely variable.
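The classic fixed-plus-variable formula can be sketched directly; the figures are illustrative:

```python
# Total cost under the fixed-plus-variable formula TC = FC + VC * Q.
# All figures are illustrative.

def total_cost(fixed, variable_per_unit, quantity):
    """TC = FC + VC * Q."""
    return fixed + variable_per_unit * quantity

# e.g. a production line: 5,000 fixed per month, 12 per unit, 400 units
print(total_cost(5000.0, 12.0, 400))  # 9800.0
```

A weighted refinement would split the cost pool into several fixed and variable components, each with its own rate, and sum them.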
Breaking Down Semi-Variable Costs
The fixed portion of a semi-variable cost is incurred repeatedly, while the variable portion occurs as a function of activity volume. Management may analyze different activity levels by flexing the variable portion with the activity level. A semi-variable cost structure with lower fixed costs is favorable for a business, as the breakeven point is lower.
Examples of Semi-Variable Costs
Overtime on a production line has semi-variable features. If a certain level of labor is required for production line operations, that is the fixed cost; any additional production volume requiring overtime results in variable expenses that depend on the activity level. In a typical cellphone billing contract, a monthly flat rate is charged in addition to overage charges based on excess bandwidth usage. A business likely experiences a similar structure when charged for utilities. Likewise, a salesperson's compensation typically has a fixed component, such as a salary, and a variable portion, such as commission.
A business experiences semi-variable costs in relation to the operation of fleet vehicles. Certain costs, such as monthly vehicle loan payments, insurance, depreciation and licensing, are fixed and independent of usage. Other expenses including gasoline and oil are related to the use of the vehicle and reflect the variable portion of the cost.
The fixed portion of a semi-variable cost is fixed up to a certain production volume. This means certain costs are fixed for a range of activity and may change for different activity levels. For example, rent expense for a production facility may be $2,000 per month. However, if production doubled and an additional facility is rented, the new fixed rent charge may be $3,500. This charge remains fixed even though the dollar amount changed because the production volume was adjusted.
Accounting for Semi-Variable Costs
Generally accepted accounting principles (GAAP) do not require a distinction between fixed and variable costs, and these costs are not distinguished on a company's financial statements. A semi-variable cost may therefore be classified into any expense account, such as utilities or rent. The separation and analysis of a semi-variable cost's components is a managerial accounting function for internal use only.
Normalized and Recast Statement Adjustments
The purpose of normalizing financial statements is to adjust the financial statements of a business to more closely reflect its true economic financial position and results of operations on a historical and current basis. Common normalizing adjustments include balance sheet adjustments to bring asset values to current market values, and income statement adjustments to reflect standardized revenues and expenses.
Recast statements reflect the unique financial statement structure of an entity, taking into account unusual or non-recurring revenue and expenses.
Here is an example from a valuation for the proposed sale of a business:
Ratios, metrics and KPIs serve little purpose if not adjusted for abnormal and non-recurring events, especially on a comparative basis.
Benchmarks as Drivers and Validation Indicators
- Prior Year Results
- Alternative Estimates
- Subjective Reasoning
- Industry Norms
- General Economic Activity
Drivers and Forecasting: An Example
Traditional Statement Rules
Here are some time tested financial statement techniques and rules of thumb:
| Item | Rule |
| --- | --- |
| Balance sheet | Always balances. |
| Working capital | A function of operations, with few exceptions such as the current portion of long-term debt. |
| Debt | Built from assets’ ability to obtain financing; possibly drawn down to zero or maintained at a certain debt-to-equity ratio. Acts as the “keel” to the balance sheet. |
| Equity | Cash is not run up; dividends are used to balance assets against liabilities plus equity. |
| Cash flow | A function of the balance sheet and income statement. Often produces a “hypothetical” dividend, as companies may choose to retain excess cash rather than issue a dividend. |
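The "dividend as the balancing plug" rule above can be sketched as follows; all balances are invented:

```python
# Balancing a toy balance sheet with a "hypothetical" dividend plug.
# All figures are illustrative.

assets = 1000.0
debt = 400.0
equity_begin = 550.0
net_income = 90.0

equity_pre = equity_begin + net_income   # equity before any dividend
# Dividend is the plug that keeps assets = liabilities + equity
dividend = (debt + equity_pre) - assets
equity_end = equity_pre - dividend

assert assets == debt + equity_end       # the balance sheet always balances
print(dividend, equity_end)  # 40.0 600.0
```

If the plug came out negative, the model would instead retain cash (or draw debt), which is why the dividend here is "hypothetical" rather than a payout recommendation.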
Both conceptually and mechanically, integrated statements, by acknowledging the balance sheet-income statement-cash flow interdependency, demonstrate a core understanding of corporate finance synthesis and provide a foundation for both accounting and modeling structure. This is why the investment banks and Big Four consulting firms go to great lengths to develop this skill in their incoming analysts. Without integrated statement forecasts, accompanying DCFs are unlikely to hold up to scrutiny.
Integrated statements can create circular references such as this:
- Interest expense on the income statement is a function of an average debt balance on the balance sheet.
- Current net income is dependent upon interest expense.
- Balance sheet equity is dependent upon current period net income.
- Debt is dependent upon total capitalization.
- Total capitalization is dependent upon equity.
Though conceptually correct, circularity should be avoided: circular references within circular references can trip up Excel (yes, I have seen it happen). This can be solved with a goal seek macro, for instance one that maintains a target debt-to-equity level through all model periods.
However this is handled, the best integrating approach will reflect the actual management approach employed in the organization.
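One way to resolve the circularity described above is simple fixed-point iteration, in the spirit of a goal seek macro. The sketch below (all inputs, including the 2:1 debt-to-equity target, are illustrative assumptions) iterates until interest, net income, equity and the debt balance are mutually consistent:

```python
# Iterative (goal-seek style) resolution of the interest/debt circularity:
# interest depends on average debt; ending debt holds a target D/E ratio;
# equity depends on net income, which depends on interest.
# All inputs are illustrative.

ebit = 200.0
tax_rate = 0.30
rate = 0.06              # interest rate on the average debt balance
debt_begin = 500.0
equity_begin = 240.0
target_de = 2.0          # target debt / equity

debt_end = debt_begin    # initial guess
for _ in range(100):
    interest = rate * (debt_begin + debt_end) / 2
    net_income = (ebit - interest) * (1 - tax_rate)
    equity_end = equity_begin + net_income
    new_debt_end = target_de * equity_end
    if abs(new_debt_end - debt_end) < 1e-9:   # converged
        break
    debt_end = new_debt_end

print(round(debt_end, 2), round(equity_end, 2))
```

Because the feedback through interest is small relative to the balances, the loop contracts quickly; a spreadsheet achieves the same effect with iterative calculation or a goal seek macro.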
Building the Bottom of the Balance Sheet
βeta - Operating and Financial Risk
- Fixed assets act similarly to debt when it comes to risk: they are a fixed cost.

Both sides of the balance sheet can be divided into fixed and variable costs. Fixed costs create leverage. They also create reward once covered, since everything else then falls to the bottom line. This can make your statement forecasting somewhat easier than might be envisioned. Match your debt to the assets it supports.
The hierarchy below follows what is called the Pecking Order Theory. Developed by Stewart Myers, this theory says that businesses follow a hierarchy of financing sources, preferring internal financing first. In effect, the theory starts at the top of the balance sheet (cash) and works its way down through secured and unsecured debt to common stock.
1. Internal financing
2. Then debt
3. Finally, issuing equity
Each asset has its own debt-to-equity ratio. For example, a car can easily be financed with debt. Ideas and intangibles cannot. It’s really that simple. The bottom of the balance sheet is a function of the weighted debt to equity of the assets at the top of the balance sheet.
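As a hedged illustration of that weighting, the sketch below rolls each asset class's supportable debt-to-equity up into a balance-sheet-level capital structure. The asset classes, values and ratios are invented for the example:

```python
# Each asset class carries its own supportable debt-to-equity.
# Figures are purely illustrative.
assets = [
    {"name": "vehicles",    "value": 300.0, "d_to_e": 3.0},  # easily financed
    {"name": "receivables", "value": 200.0, "d_to_e": 1.0},
    {"name": "intangibles", "value": 500.0, "d_to_e": 0.0},  # hard to finance
]

total_value = sum(a["value"] for a in assets)
# Convert each asset's D/E into a debt fraction D/(D+E), weight by value.
weighted_debt_fraction = sum(
    a["value"] / total_value * (a["d_to_e"] / (1 + a["d_to_e"]))
    for a in assets
)
implied_debt = weighted_debt_fraction * total_value
implied_equity = total_value - implied_debt
```

On these invented numbers, a heavy intangibles base drags the supportable debt down even though the vehicles alone could be levered 3:1, which is the point of the weighting.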
Probability Expected Values
Ideally, forecast cash flows should be probability-weighted. In other words, they should be “statistically expected” values. This is not always practiced. The discount rate in the denominator reflects a risk-return requirement that assumes a distribution of outcomes; the cash flow in the numerator should do the same.
Only one outcome can actually occur, and thousands of years of human experience tell us that realized cash flows tend to come in lower than the “most likely” forecast. In a symmetrical distribution, the most likely outcome equals the probability-weighted outcome. Asset capacity and elasticity constraints (utilization rates), however, usually prevent such a distribution from occurring in real life.
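A small numerical sketch makes the point. In the skewed scenario set below (figures invented), the capped upside pulls the probability-weighted cash flow below the "most likely" outcome:

```python
# A skewed set of cash-flow scenarios: the "most likely" outcome is 100,
# but downside scenarios pull the probability-weighted expectation below
# it because capacity constraints cap the upside. Figures are illustrative.
scenarios = [
    (0.15,  40.0),   # severe downside
    (0.25,  80.0),   # mild downside
    (0.45, 100.0),   # most likely (the mode)
    (0.15, 110.0),   # capped upside: utilization limits how good it gets
]

assert abs(sum(p for p, _ in scenarios) - 1.0) < 1e-12  # probabilities sum to 1

expected_cf = sum(p * cf for p, cf in scenarios)
most_likely_cf = max(scenarios, key=lambda s: s[0])[1]
# expected_cf < most_likely_cf: discounting the mode overstates value.
```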
- Incorporating only a few possible cash flow outcomes into the analysis. No one is suggesting that a Monte Carlo simulation be run for every analysis, but having a clear understanding of sensitivities to variables and the possibility of more than a few outcomes is key.
- Model errors
- Model omissions, such as P&L impacts, ignoring the PV of new opportunities projects may present, etc.
- Compounded growth
- Incorrect application of theory (tax effects, mismatched cash flows and discount rates, TV, etc.)
Here is a great textbook for learning and reviewing common financial statement ratios and analysis:
The Analysis and Use of Financial Statements, White, Sondhi and Fried.
About the Author
Rob Trippe is a highly accomplished, MBA-qualified, investment-bank-trained corporate finance leader for strategy and planning at Fortune 10 and Fortune 200 companies, with a focus on model risk, analytics and project finance. Rob has had the fortune of working with and learning from some of the finest companies in the world: GE, Houlihan Lokey and Hertz.
Rob is a "wing to wing" corporate finance model and project manager. He consults, lectures, and writes about corporate finance and financial modeling exclusively and extensively. Rob has developed financial models spanning every balance sheet asset and capitalization class, numerous industries, and a variety of decision-making and regulatory purposes and applications. The commonality across these dimensions? Core theoretical concepts, model governance and best-in-class model structure.
Myths about Target Setting during your Budgeting Process
The world of management is full of myths and rituals. Many seem to have been with us forever, often explained and justified as “we have always done it like this”. Some have a shorter history, often introduced because “everybody else is doing it”.
One of these rituals definitely feels like it falls into the first category. Target setting seems always to have been there. In most organisations, this is actually not true. If we go back 20-30 years, the situation was quite different. There were far fewer targets around. Yes, there were (and still are in many organisations) detailed budgets which often represented targets. These were mainly financial numbers, however. The birth of the Balanced Scorecard in the early nineties led to a massive increase in non-financial metrics, resulting not only in much more measurement through KPIs but also in much more target setting on these. This new way of managing later spread to the public sector, where “New Public Management” has also led to a massive increase in measurement and target setting.
Along the way, we seem to have forgotten that “the target is not the target”. It is not about hitting a number. What we really want is the best possible performance, given the circumstances. Setting targets is one way of achieving this, but not the only way, and too often not the best way.
The target myths
Rituals aren’t necessarily a problem. Myths are different because they are typically not true. There are three strong myths surrounding targets, all seemingly indisputable justifications for this popular management practice.
- Without targets people won’t know what to do
- Without targets people will not be motivated to perform
- Without targets we are unable to evaluate performance
These myths are, however, not just disputable. They are simply not true. Here is why.
Let me first make a clarification. The targets and the target setting I challenge here are numerical targets. They are very specific, different from more general ambition statements with rounded numbers: “in the range of”, or “towards”. In addition, they are absolute, not relative, where one’s own performance is compared with others’. They are typically set from above, in order to control, punish and reward. I have fewer problems with targets people set for themselves to learn and improve, although these too can be problematic if taken too literally.
“Without targets, people won’t know what to do”. Not true. Words can often address direction and expectations much more clearly and intelligently than any single number can.
“Without targets, people will not be motivated to perform”. Not true. Many, including myself, are much more fired up by the right words, igniting our hearts in a very different way than those clinical and decimal-loaded numbers which only reach our brains.
“Without targets, we are unable to evaluate performance”. Not true. This one might be the most solid myth to bust, so let us dive a bit deeper here.
No target - no performance evaluation?
A target is trying to describe what good performance looks like at one point in time down the road, for instance at year-end in the case of an annual target. This is often quite difficult, especially when there is a lot of uncertainty that we can’t control. Where is the market going? What will competitors do? And what about the oil price and exchange rates? We have to make a number of assumptions about all these uncertainties, forcing us to be quite subjective even if we don’t want to. But when we finally land on a specific number, the subjectivity seems to disappear. Now it has all become more orderly, and we can focus on measuring whether we are hitting the number or not.
What we have been through, however, is actually a premature performance evaluation, and quite a difficult one, due to all that uncertainty forcing us to make all those guesses. Wouldn’t it make more sense to do this job only afterwards, when all the uncertainty is gone? We then know what happened with markets and competitors. We know how the oil price and exchange rates moved, and a lot more. Why should we let a yardstick decided twelve months ago be the judge when we can now look at facts instead of relying on all those guesses we made back then? Most of us know what good performance looks like when we see it.
Targets work. That’s the problem.
Many will still argue that target setting works. “What gets measured gets done”. Yes, targets do work. That is actually the problem. Managers hitting their target is, however, no guarantee whatsoever that this was their best possible performance, given the circumstances. Maybe some could have performed even better because assumptions changed. Others had a lot of headwinds and might have performed great even if they didn’t hit their target. But we only know this afterwards!
Targets are often expressed through KPIs. Those still insisting on setting targets should not forget that the “I” in KPI stands for “Indicator”. KPIs seldom tell the full truth, which is probably why they are not called “KPTs”: Key Performance Truths. We have to look behind the indications before we can conclude.
Zooming in blindly on targets can, therefore, be highly problematic, even dangerous, even when there is no change in assumptions. Volkswagen and Wells Fargo are recent and sad examples of blind KPI target management running the show.
A holistic performance evaluation
If we do set KPI targets, both of the uncertainties discussed require that we take hindsight insights into account:
- Indicator uncertainty - how big is the “I”?
- Target uncertainty - what is the right number?
In addition, there are other questions that should be asked in such a holistic performance evaluation. How did we achieve our results? Which risks were taken? How sustainable are the delivered results?
Some would argue that all this assessment on top of measurement makes the performance evaluation too subjective. They prefer the objectivity of only looking at actual versus target, end of story. But as we have just discussed, this is an illusion of objectivity, given all the subjectivity that went into the target setting in the first place.
The longing for full objectivity might also have something to do with managerial laziness. It is obviously much easier to compare two numbers only and conclude. Making a deeper performance assessment by looking at what really happened and digging behind measured results in order to reveal the true underlying performance takes an effort. It takes leadership. Some find that cumbersome, even difficult. But we need leaders with competence beyond the ability to compare numbers. Leadership is not meant to be easy.
If you are new to, but intrigued by, these reflections, you might wonder if it’s all nice theory and wishful thinking. It’s not. Many organisations have either skipped or never introduced targets. Check out, for instance, Handelsbanken or Miles, two great companies that not only operate without targets but also without most other management rituals, including budgeting.
There is a better way!
P.S. Linking incentives like the individual bonus to targets will only turbocharge the many challenges described above. It also opens up another management myth: what is it that really motivates us? This topic is so big that it deserves its own article. Stay tuned!
According to Wikipedia, prescriptive analysis is, after descriptive and predictive analysis, the third and final stage of business analytics. Gartner plots prescriptive analysis as the final and most difficult stage of data analytics. This article draws an analogy with the mapping and car industries to suggest that prescriptive analysis, as an opportunity to support Integrated Business Planning (IBP) and business optimization, is not a final stage but just the beginning of a new planning era.
Where descriptive analysis answers the question ‘what happened and why’, and predictive analysis answers the question ‘what will happen’, prescriptive analysis also suggests actions that benefit from the predictions and shows the implication of each option. Or, as Anne Robinson, a past president of INFORMS, a society for analytics and operations research professionals, tells us: “Prescriptive tells you the best way to get to where you want to be”.
Integrated Business Planning as GPS
Companies like River Logic model financial and operational business constraints and use prescriptive analysis to optimize a business objective. The business objective can be to maximize an asset, maximize throughput or minimize capital use and cost, or all of the above at once. In combination with Integrated Business Planning to provide visibility and manage changes in model assumptions and constraints, this creates a powerful and holistic scenario and decision-making capability for executives.
Integrated Business Planning is like a GPS for a business
Robinson’s definition sounds pretty much like using a GPS system in your car. It will tell you the best way to get where you want to be. Integrated Business Planning is like a GPS for a business. IBP done well will provide a company with a valid and reliable periodic rolling forecast and strategy status. This, in turn, provides executives with visibility of gaps versus budget and strategic intent, so they can steer the company to ‘where they want it to be’.
The evolution of map technology
If we take this analogy a little further, Integrated Business Planning and prescriptive analysis can learn from GPS and the evolution of the underlying map technology. Only 15 years ago we still used paper maps in our cars. The first commercial GPS systems were launched in the 80s. TomTom, a Dutch mapmaker and traffic company, launched its first portable GPS system in 2002. In December 2015, TomTom launched the world’s first commercially available near real-time updated map. Let’s have a look at how they got there and what is next. Let’s open our minds, look at trends in the mapping, traffic and car industries, and see what prescriptive analysis and IBP can learn from them.
Dynamic data input
Map making and maintenance used to be an expensive exercise, requiring lots of cars to drive around every street over and over again. TomTom made deals with tech giants like Apple and Uber to provide automated data input for its maps. Millions of iPhone users and Uber drivers continuously send information to the TomTom database, which is used for mapping, traffic and route analysis to update standard maps almost in real time.
We can easily imagine a prescriptive analytics phase where the optimization model is automatically updated with macro assumptions and constraints like GDP, consumer spending, oil prices and total market capacity. Dynamic micro assumptions like capacity, downtime and material availability can come directly from intelligent machines. This creates dynamic constraints in the optimization model.
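As a simple sketch of what dynamically constrained optimization could look like, the Python below re-optimizes a product mix when machines report reduced capacity. The products, margins and hours are invented, and a single-constraint greedy allocation stands in for a full optimization model:

```python
# One binding resource (machine hours); allocate to products by contribution
# margin per hour, respecting demand caps. Re-running the model whenever the
# capacity feed changes is what "dynamic constraints" would look like.
def optimise_mix(machine_hours, products):
    """products: list of (name, margin_per_unit, hours_per_unit, max_demand)."""
    plan, remaining = {}, machine_hours
    # Highest margin per machine hour first (optimal for a single,
    # divisible constraint with demand caps).
    for name, margin, hours, demand in sorted(
            products, key=lambda p: p[1] / p[2], reverse=True):
        units = min(demand, remaining / hours)
        plan[name] = units
        remaining -= units * hours
    return plan

# A nightly feed from the machines revises available hours; the plan updates.
products = [("A", 30.0, 2.0, 100), ("B", 24.0, 1.0, 150)]
plan_full = optimise_mix(400.0, products)  # normal capacity
plan_down = optimise_mix(250.0, products)  # downtime reported by sensors
```

A production model would use a proper LP/MIP solver over many resources at once; the point here is only that refreshed constraint data flows straight into a re-optimized plan.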
Alerts and intervention assistance
Advanced Driver Assistance Systems (ADAS) are already capable of warning drivers about upcoming traffic jams, and many mid-range cars already operate with automatic brake control in city traffic. Some cars, like the Mercedes E-Class, can take this further and automatically take over the steering wheel when a driver loses control of the car.
With dynamic data input into the optimization model, prescriptive analysis can create alerts and intervene before a certain threshold of a critical resource or constraint is reached. An asset can be almost at capacity or underused, cash can dry up, or throughput can reach unsustainable limits.
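A minimal sketch of such an alert, assuming utilization readings arrive from the dynamic data feed (the resource names and threshold are invented):

```python
# Flag resources approaching a critical threshold before the constraint binds.
def capacity_alerts(readings, threshold=0.9):
    """readings: dict of resource -> utilization (0..1). Return breaches."""
    return {name: util for name, util in readings.items() if util >= threshold}

# Illustrative readings from the feed.
alerts = capacity_alerts({"plant_1": 0.95, "plant_2": 0.70, "cash_buffer": 0.92})
```

In a real system the alert would trigger a re-optimization or an intervention rather than just a flag, but the threshold check is the trip wire.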
TomTom has presented 3D maps of Germany and Michigan, in support of the German and US car industries. It has 3D maps of Silicon Valley in support of US tech companies and is partnering with Bosch, a world-leading German electronics supplier to the car industry, to make rapid progress in 3D-mapping other parts of the world. 3D maps provide a 360-degree view of the road, so a car knows where every important landmark, lane, traffic sign, building and traffic light is, and they include road geometry like curves and slopes. 3D maps are required for highly automated driving (HAD) and, in combination with sensors on the car, for autonomous driving (AD).
Optimizing our own business with prescriptive analysis is like a 2D map. Now add to that the impact competitors, suppliers and customers have on your business within your industry. Now imagine the impact of other industries, countries, the weather and geopolitics. In our connected, interdependent global world, similar to a 3D map, we will require multidimensional network prescription and optimizing.
Cars like the Mercedes E-Class and Tesla Model S can already change lanes autonomously and park without the driver being in the car. Audi will launch a HAD version of its flagship A8 in 2017. Elon Musk recently told analysts that Tesla is two years away from self-driving cars. Google has accumulated millions of kilometres of autonomous driving. McKinsey recently predicted that 15 percent of new cars sold in 2030 could be fully autonomous. Soon humans won’t have to touch a steering wheel.
We already know from research that humans had better not touch a short-term forecast, as we are riddled with bias and emotion. We already let demand sensing take over short-term forecasting and replenishment decisions. It is not hard to imagine prescriptive algorithms taking over tactical and strategic decisions in the distant future. This will create prescriptive execution.
Self-sustaining prescriptive algorithms
In some existing self-driving cars, the driver has to touch the steering wheel every now and then to let the car know he is still there and awake. In the future this will no longer be needed. Cars sit idle more than 90% of the time. All this wasted capacity will one day be used by intelligent systems. Every car will be connected. In cities, fewer people will own cars. Public ride shares and taxi bots will rule the cities, take you where you need to go, drop you off and move on to their next ride. This requires artificial intelligence (AI) and deep machine learning.
Google’s AI team has beaten a top human player at the game of Go, a 2,500-year-old game that is exponentially more complex than chess. IBM’s self-learning machine Watson, capable of beating top human champions at Jeopardy, is available commercially for any business. It might be a small step for an AI algorithm to not only prescribe the best solution and take the decision to implement it, but also to prescribe changes to the optimization algorithm afterwards, incorporating learnings and newly detected circumstances. The prescriptive algorithm will become self-learning and self-sustaining.
Collaboration to reach the holy grail
Only 15 years ago we were using a paper map to find our way around in our car. Due to a common vision for the holy grail in the car industry – the autonomous car - map makers, tech companies, the car industry and suppliers to the car industry all joined forces. There is also urgency due to healthy competition, as the car industry itself feels threatened by the tech giants’ entry into their industry. Due to this collaboration, competition and urgency, commercially available self-driving cars will become a reality in the next five years. Who would have imagined that 15 years ago?
Prescriptive analysis and execution might just be the holy grail of integrated business planning and business optimization. Maybe the biggest learning from the mapping and car industry is what can be achieved if powerful coalitions are made to chase a holy grail.
The future of IBP and prescriptive analysis
To produce a rolling forecast and strategy status update, an IBP process periodically needs to review its plans and resources and update its underlying assumptions and business constraints. An IBP process can provide these same inputs periodically as boundaries into a prescriptive algorithm and, in return, get suggested decisions for an optimized rolling forecast.
My online research indicates that over the last three years collaborative planning with customers and suppliers has increased, but most businesses still use IBP within the boundaries of their own company walls. And although more than half of my survey participants indicate they integrate financial planning and budgeting as a key task in their IBP process, it is likely that this is supported by slightly disjointed predictive analysis. Therefore, most companies will operate in the bottom-left corner of the figure below.
In terms of understanding constraints, risks and opportunities, the future of IBP will be to plan across the whole value chain, rather than within the company walls. A final step in IBP scope will be to understand and incorporate constraints from interdependent value chains, commodities and countries to create a global view.
Progress from predictive to prescriptive analysis creates the opportunity for a business to develop a periodic optimized plan within its IBP scope. Once resources and constraint data can be dynamically input into a prescriptive algorithm, near real-time optimized plans with global constraints can be achieved. At that point, IBP can still govern the choices of data input and the decisions the prescriptive algorithm makes.
This will end when artificially intelligent, self-learning systems tune the optimization algorithm and automatically govern and change the constraints and assumptions it uses. Similar to the driver in a highly automated driving car, who has to touch the steering wheel to let the software know he is still awake, executives might have to let the AI algorithm know they’re watching the integrated business planning decisions and numbers it produces.
We have a long way to go before we’re at that stage, but if we use our imagination, we can see how prescriptive analysis is only the start of a new IBP era.