Applied Corporate Financial Modeling

By Rob Trippe, MBA, Financial Modelling Veteran

Rob Trippe is a financial modeling veteran. With over fifteen years’ experience, Rob has developed corporate finance models for valuation, M&A, forecasting and performance monitoring. He is widely respected for his deep understanding of corporate finance theory, lectures at university and has worked with some of the world’s largest and most respected firms, including the investment bank Houlihan Lokey, Hertz and GE Capital. He was also the managing director and operator of a coveted professional sports franchise at a young age. Rob was accredited in valuation in 2008 and holds an MBA in Finance from Boston College.

Publishing Accomplishments

  • Rob’s research while at the investment bank Houlihan Lokey Howard & Zukin was published in the Wall Street Journal and USA Today.
  • Rob’s cash flow model (his design) while at the Hertz Corporation was published in SEC filings and quarterly press releases.
  • Rob has contributed to FP&A-Trends on numerous occasions.
  • Authored a state university finance exam.
  • Currently working on a commissioned case study for a major academic publishing house.

The Financial Modeling Landscape Is Changing And So Are The Demands Put Upon Business Analysts.

Federal guidance spurred by the Great Recession, enhanced IT capabilities, the need for greater security, an increased acceptance of financial theory and the overall pursuit of best practice have all elevated the level of detail and precision that financial modeling in the business world must deliver.

Learning Excel, a widely used and accepted computer application, is one essential skill required of a strong financial modeler. But these days, the business world requires more:

1. An understanding of modern model governance standards and management,

2. The FP&A analyst gold standard of developing fully integrated forecast financial statements as opposed to simply ad hoc spreadsheet analysis, and

3. Knowledge of proper validation and monitoring techniques.

This article addresses those needs.

Many types of modeling approaches exist. This article focuses on forecasting an integrated set of financial statements, line item by line item, using what is called a “deterministic disaggregated component” approach.

To the corporate insider, this type of forecast model is invaluable. CEOs, CFOs, business planning, treasury and investor relations all rely heavily on forecast statements, line by line. Such forecasting is used to monitor departments and business units and key figures are often communicated to outside investors and the general public.

Industry analysts too use the same deterministic forecasting approach to model financial statements and subsequent valuations.

Large companies can forecast with remarkable accuracy. Often, tolerance levels from budget are as low as 3-5%. If we were simply buying a stock as an investor, other approaches might suffice with less effort. Here we are not – we are managing an entire company.

1. What is a Model?

2. Introduction to Model Governance

3. Best Practice Modeling for Excel

4. Forecast Statement Design and Analysis

5. Validation and Acceptance

6. Additional Resources


1. What is a Model?

Federal Reserve guidance to banks (SR 11-7, among other documents) defined a model for the purposes of model risk management in the wake of the Great Recession of 2008. It provides a framework for best-in-class financial model development, validation and monitoring.

SR 11-7 outlines the criteria a financial tool must possess to be considered a model:

1. The analysis must employ academic theory

2. Three components: inputs, calculation processes, outputs

3. The analysis must transform data into useful business information

4. Must be used repetitively

5. Output must be quantitative in nature

6. Subject matter expertise is required

7. Possible qualitative model inputs or output


Models are not limited to one programming language environment or application. The term “model” should be viewed “wing to wing,” from initial data inputs to final model output. Numerous programming environments may be involved, as may numerous data sources. “Downstream” models should also be identified.



What Are Corporate Finance Models Used For?

Financial models inform decision makers and also assist in complying with regulatory requirements. The three primary corporate finance activities are:

1. Investment Decisions – estimating and comparing cash flows, risks and returns to requirements and expectations.

2. Financing Decisions – determining capital structure; what type of instruments will we use to finance the assets, what are the terms of this financing (e.g. length) and to what degree can each asset be levered?

3. Dividend Policy – should we invest for growth and replacement, or should we pay cash flows out to investors as dividends, and what are investors’ expectations?



Influencing Model Factors

Models improve in integrity and effectiveness as the result of several guiding influences. 



2. Introduction to Model Governance

Model Governance concerns itself with:

A. Identifying models

B. Developing internal policy and procedure

C. Developing and enforcing model standards

D. Interpreting and abiding by regulatory guidance and rule

E. Researching and identifying best practice

F. Ensuring proper documentation

G. Assisting in tracking models and ensuring proper remediation when required

H. Overseeing model testing

I. Risk management

J. Change management


Model Risk Management

Virtually all of the activities mentioned in this article relate to model development and subsequent model risk management. Model risk refers to the chance of unintended consequences resulting from model development, inputs or outputs.


The Model Lifecycle

Models follow what is called the “model lifecycle”. The model lifecycle has four components. In order, they are:



Model Participants

There are numerous players and business units involved through the model lifecycle. They include:

Model Developers. Those who develop the model using systems, theories, formulas and data.

Model Owners. Those responsible for model use in “real time”.

Model Users. Those who implement the model and its output.

Model Testers. Usually IT, who test model development results and manage model change.

Model Validators. Those who confirm a model is working as intended, often using mathematical approaches.

Business Units. Business Units implementing the model and utilizing the model output.

Model Governance. Those who see that models are tracked, monitored, inventoried and comply with standards.

Risk Management. Those who monitor and manage the risk element of a business’s activities.

Compliance. Those who confirm activities meet regulatory requirements.

Audit. A last internal line of defense.


Model Standards

Here is one set of standard topics for non-regulated corporate finance models. These standards ensure ownership, improve accuracy and reduce key-person dependence. Regulatory requirements may come into play as well for certain financial institutions, which ups the ante.


Model Documentation

Good documentation is a key component of model risk management and should be viewed as such, not simply as a compliance exercise. Advisory Bulletin 2013-07 supports this assertion and adds that documentation should be the responsibility of the model developer (development) and owner (operating procedures, ongoing monitoring), thereby providing a first line of defense and control between model developers, users and model owners. Proper documentation is cumbersome and takes time, but it is critical to a model’s overall success, reducing risks and costs. Benefits include:

  • Reduced key person dependence
  • Mitigated transitional challenges as models pass from user to user and from one output need to another as conditions and requirements change
  • A more streamlined validation and audit process
  • Reduction in potential user error
  • Avoidance of potential misuse
  • Effective communication of challenging theoretical concepts – “comfort”

The documentation of financial models is a critical element of model risk management. It behooves developers, users and owners to spend the time and effort to properly document a model through its life cycle. Documentation, though time consuming and cumbersome, aids in validation, monitoring and audit, and provides a platform for model developers to effectively communicate their model and its conceptual soundness to all areas of an institution, from data downloaders to executive management. Federal agencies will look favorably upon models supported by comprehensive and well-conceived documentation.


Model Flow Charting

Flow charts are powerful tools. They can come in many forms, and there is no one exact manner in which to flow a model. From my experience, two flow charts stand out as invaluable:

1. System flow chart “wing to wing” (input to output)

2. Development flow chart

A system flow chart will show, left to right, a model’s data inputs and IT/business unit environments, calculation processes, and model output and IT/business unit environments. A development flow chart will visualize the exact mathematical methodologies and techniques employed to transform input data into useful business information and will generally focus on only one business environment.

Flow charting financial statement builds provides the user a quick gauge of what data is at hand versus what is needed to complete a model. A flow chart will help identify the required builds and the environments from which data will be acquired. Flow charting also helps visualize core components within and across models and is an ideal medium for identifying and communicating risk and control points. Flow charts will dramatically improve model buy-in and provide a path for solid structure. Dead ends, duplication and unmanageable audit trails become visible.

Flow chart


Model Definitions

Here are some great definitions to know:

Back-test. Use of historic data as a test to model output validity.

Benchmark. The comparison of model output to the output of an outside and independent source.

Emerging Risk. Unforeseeable risk arising further in time and model execution.

FAST. Set of rules for financial model design. Flexible, Appropriate, Structured and Transparent.

Impact Analysis. Assessment of cost, timing, scope and quality of a model-consequence.

In-Sample. Historical data used in model development.

Model. Quantitative method, system, or approach that applies statistical, economic, financial, or mathematical theories, techniques, and assumptions to process input data into quantitative estimates.

Out of Sample. Historical data not used in model development.

Outcomes Analysis. The comparison of model output to actual outcomes. Back-testing is one example.

Parameter. Numerical characteristic of a set or population of numbers.

Calibration. Adjustment of data and assumptions.

Residual Risk. Remaining risk after a risk mitigation action has been performed.

Risk Appetite. Largest tolerable degree of uncertainty acceptable.

Scenario. Multiple changes to inputs to reflect a given set of circumstances.

Secondary Risk. Risk arising from a risk response.

Sensitivity. Impact of a change to an input relative to the change in output.

Stress Test. Assessment of model stability by employing hypothetical data inputs or drivers.

Threshold. Measure of uncertainty or impact worthy of attention.

Tolerance. Degree of deviation within which a model still functions properly.

Validation. A set of processes and activities intended to verify that a model performs as intended and as expected.


3. Best Practice Modeling for Excel

Numerous organizations have developed modeling rules for the use of Excel. These rules provide structure and commonality in spreadsheet design and use. Many rules and guidelines can apply to other languages as well. Technically, Excel is an application and not a language, and it is only one of many possible languages and applications that may be incorporated into a model. Due to its widespread use and familiarity to recent college graduates, this section focuses on best practice for Excel.

Excel Standards Organizations

  • FAST (Flexible, Appropriate, Structured, Transparent)
  • Smart Financial Modeling
  • BPM (Best Practice Modeling)

Common Rule Categories

  • Workbook Design
  • Worksheet Design
  • Line Item Design
  • Cell Design and Use
  • Assumption Design and Use
  • Use of Features
  • Formatting Rules

The purpose of these rules is to provide clarity, accuracy, safety and a clear audit trail. Only create work which you would gladly show on a blackboard in a court of law.



Use color protocol

  • Use color sparingly
  • Inputs are blue
  • Calcs are black
  • External links are green
  • Trouble spots are red

Saving and Printing

  • Use naming protocol
  • Use saving protocol
  • Print format for all pages
  • Print then proof; never proof off the screen only


When creating a model, you are telling a story. As the FAST standards say, you are really writing sentences, paragraphs and chapters. That approach will help get your model “read” and more easily accepted. Models are not political tools.



  • Keep formulas short
  • Assumptions shown on same page as calc
  • Keep audit trails short
  • Avoid daisy chains
  • Never replicate math, except as proofs
  • Avoid external links; use exported tabs instead
  • Highlight external links when required to use
  • Avoid circular references when possible

Rows, Columns and Tabs

  • One column, one purpose
  • One row, one purpose
  • No line or column breaks between calculations
  • Avoid merging cells
  • Forecast periods move left to right
  • Statements move top to bottom
  • No breaks between rolling forecast periods
  • Don’t hide rows or columns
  • Tabs flow left to right
  • Use summary sheets, clean of build ups

Formulas, Calculations and Functions

  • Use statement accounting sign convention as you would read from a 10-K
  • Subtotals presented below
  • Use balance and sanity checks
  • Show all math
  • Anchor repetitive calcs left to right and top to bottom
  • Macros should be used sparingly
  • Avoid named ranges
  • Create dedicated calculation areas
  • Anchor “drivers”


Common Statement Rules

Here are some time-tested financial statement techniques and rules of thumb (a small balancing sketch follows the list):

Balance Sheet. Always balances.

Working Capital. A function of operations, with few exceptions such as the current portion of long-term debt.

Debt. Built from assets’ ability to obtain financing, possibly drawn down to zero or maintained at a certain debt-to-equity ratio.

Debt: Equity. Acts as the “keel” to the balance sheet.

Equity. Cash is not run up; dividends are used to balance assets against liabilities plus equity.

Cash flow. A function of the balance sheet and income statement. Often produces a “hypothetical” dividend, as companies may choose to retain excess cash rather than issue a dividend.
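
To make the balancing rules concrete, here is a minimal sketch (Python is my choice of language, not the author's; all figures are illustrative assumptions) of equity rolled forward with a dividend plug and a balance check:

```python
# Minimal sketch: dividend used as the plug that balances the balance sheet.
# All figures are illustrative assumptions, not taken from the article.

assets_next = 1_150.0          # forecast total assets
liabilities_next = 600.0       # forecast total liabilities (incl. debt)
equity_prior = 480.0           # prior-period equity
net_income_next = 90.0         # forecast net income

# Equity must equal assets minus liabilities for the sheet to balance.
equity_required = assets_next - liabilities_next

# Dividend is the plug: whatever is left of prior equity plus earnings.
dividend = equity_prior + net_income_next - equity_required

# Balance and sanity checks, as the rules above recommend.
equity_next = equity_prior + net_income_next - dividend
assert abs((liabilities_next + equity_next) - assets_next) < 1e-6, "Balance sheet does not balance"
assert dividend >= 0, "Negative dividend implies an equity raise or debt draw is needed"

print(f"Required equity: {equity_required:.1f}, plug dividend: {dividend:.1f}")
```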


4. Forecast Statement Design and Analysis

Five key approaches to a sound forecast model:

1. Integrate Statements

2. Utilize Quantitative Forecast Techniques

3. Roll - Period to Period and Model to Model

4. Use Expected Values

5. Validate Results


Types of Models

Deterministic models use assumptions (independent variables) which are believed to be “known”, that is they are static. Run a deterministic model ten times and you will get the same result.

Stochastic models use assumptions (independent variables) which are only believed to be “known”, within a set of parameters or “boundaries”. Run a stochastic model ten times and you will get ten different results. Dependent variable output therefore has a random element to it. Models are often then run using Monte Carlo simulation for numerous results and an average of these results is used as the final model output.


Deterministic Forecast Modeling

This article focuses on one type of modeling called “deterministic” modeling. Another common modeling approach is called “stochastic” modeling. Here is a brief explanation of both.
Deterministic models are mathematical models in which outcomes are determined through known relationships among data and assumptions, without room for random variation. In a deterministic model, using the same input data and assumptions will always produce the same output. There is no random element.

Stochastic models have a distribution of potential outcomes since they allow for random variations in one or more inputs. The random variation is usually based on fluctuations observed in historical data for a chosen period using standard time-series techniques. In other words, assumptions are not precisely known and may vary through time.

If possible, see whether your model can combine both types: key assumptions developed through stochastic processes and incorporated into a deterministic model can provide the best of both worlds.
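
As an illustration of the difference, consider the following hedged sketch (Python, with a made-up revenue model and assumed growth figures): the deterministic run returns the same answer every time, while the stochastic version draws growth from a distribution and averages many Monte Carlo trials.

```python
import random

BASE_REVENUE = 100.0   # illustrative assumption
YEARS = 5

def deterministic_forecast(growth=0.04):
    """Same inputs always give the same output: no random element."""
    revenue = BASE_REVENUE
    for _ in range(YEARS):
        revenue *= (1 + growth)
    return revenue

def stochastic_forecast(mean_growth=0.04, stdev=0.02, trials=10_000):
    """Growth is only known within boundaries; average many Monte Carlo runs."""
    results = []
    for _ in range(trials):
        revenue = BASE_REVENUE
        for _ in range(YEARS):
            revenue *= (1 + random.gauss(mean_growth, stdev))
        results.append(revenue)
    return sum(results) / len(results)

print(deterministic_forecast())   # identical on every run
print(stochastic_forecast())      # varies slightly run to run, converges to the mean
```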

Other Model Classifications

Extrapolative Modeling utilizes prior period results in time series fashion, year over year growth, for example.

Index Modeling utilizes outside data to estimate future internal results.

Disaggregated Modeling uses disparate data (line items on financial information, for example) to estimate future outcomes.

Forecast Statement Development

Forecast models do not produce guesses, but rather estimates. They are developed using real-world observations (often actuals) and may be modified as necessary. Fortunately, fundamental statement analysis provides a perfect framework through the use of common financial statement metrics and observations. Some observations may be internal (last year’s sales, number of customers) and some may be external (GDP growth, housing starts, as examples).

Metrics will drive certain line items on each financial statement, which in turn may drive other financial statements’ line items using integrated modeling techniques. Interest expense as a percentage of debt is an example.

Other ratios can be used to assess performance and judge the validity of forecast results.

Integrating Statements

Both conceptually and mechanically, integrated statements, by acknowledging the balance sheet-income statement-cash flow interdependency, demonstrate a core understanding of corporate finance synthesis and provide a foundation for both accounting and modeling structure. This is why investment banks and Big Four consulting firms go to great lengths to develop this skill in their incoming analysts. Without integrated statement forecasts, accompanying DCFs are unlikely to hold up to scrutiny.

Integrated statements can create circular references such as this:

  • Interest expense on the income statement is a function of an average debt balance on the balance sheet.
  • Current net income is dependent upon interest expense.
  • Balance sheet equity is dependent upon current period net income.
  • Debt is dependent upon total capitalization
  • Total capitalization is dependent upon equity.

Though such circularity is conceptually correct, both the FAST and Best Practice Modeling organizations recommend avoiding it. Circular references within circular references can trip up Excel (yes, I have seen it happen). This can be solved with a goal seek macro, for instance, maintaining a debt-to-equity level through all model periods.
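
One way to picture how that circularity can be resolved outside of Excel's iterative calculation is the fixed-point sketch below (Python; the rates, balances and target ratio are illustrative assumptions, and period-end debt stands in for an average balance for brevity):

```python
# Fixed-point iteration over the interest <-> debt circular reference.
# Figures and the target debt-to-equity ratio are illustrative assumptions.

EBIT = 200.0
TAX_RATE = 0.25
INTEREST_RATE = 0.06
EQUITY_PRIOR = 800.0
TARGET_D_TO_E = 0.50      # debt held at a fixed ratio to equity, the "keel"

debt = EQUITY_PRIOR * TARGET_D_TO_E   # initial guess

for _ in range(50):                   # converges in a handful of passes
    interest = INTEREST_RATE * debt   # period-end debt used instead of an average balance
    net_income = (EBIT - interest) * (1 - TAX_RATE)
    equity = EQUITY_PRIOR + net_income            # ignoring dividends for brevity
    new_debt = equity * TARGET_D_TO_E             # debt follows total capitalization
    if abs(new_debt - debt) < 1e-9:               # stop once the loop has settled
        break
    debt = new_debt

print(f"Debt: {debt:.2f}, interest: {interest:.2f}, net income: {net_income:.2f}")
```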

However this is handled, the best integrating approach will reflect the actual management approach employed in the organization.

Common Quantitative Forecast Techniques

Budgeting and forecasting methods can be divided into two broad categories: qualitative and quantitative. Listed below are common quantitative forecast techniques for financial statement modeling.

1. Time series change (price, for example)

2. Causal relationships (cash and receivables as a % of sales)

3. Naïve – beginning balances equaling prior-period ending balances, or flat-lined (fixed assets, debt)

Qualitative techniques may be used to adjust quantitative forecast output, based upon subject matter expertise.
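
A compact sketch of the three techniques, one line item each (Python used for illustration; the drivers and values are assumed, not taken from the article):

```python
# Illustrative forecast of three line items, one per technique above.

prior = {"price": 10.0, "volume": 500.0, "sales": 5_000.0,
         "receivables": 600.0, "fixed_assets": 2_000.0}

# 1. Time-series change: price grows at an assumed year-over-year rate.
price_next = prior["price"] * (1 + 0.03)

# 2. Causal relationship: receivables forecast as a percentage of sales.
receivables_pct_sales = prior["receivables"] / prior["sales"]     # 12%
sales_next = price_next * prior["volume"]
receivables_next = receivables_pct_sales * sales_next

# 3. Naive: fixed assets flat-lined at the prior-period balance.
fixed_assets_next = prior["fixed_assets"]

print(price_next, receivables_next, fixed_assets_next)
```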

Time Series

A time series is just a set of observations measured at successive points in time or over successive periods of time. Time series uses past trends of a variable.


Any recurring sequence of points above and below the trend line that lasts more than a year is considered to constitute the cyclical component of the time series. For example, while the trend line for gross domestic product (GDP) is upward sloping, output growth displays cyclical behavior around the trend line.


Seasonal components capture the regular pattern of variability in the time series within one-year periods. Many economic variables display seasonal patterns. Seasonality will require you to forecast by quarter and/or month.


A calendar component is similar to a seasonal one but follows a calendar path, such as weeks or quarters (Q1, Q2, Q3, Q4).

Time Series Refinements

Smoothing methods are appropriate when a time series displays no significant trend, cyclical or seasonal effects (often called a stable time series). In this case, the goal is to smooth out the non-recurring components of the time series by using an averaging process. This relates closely to valuation’s use of “normalized” statements. The purpose of normalizing financial statements is to adjust the statements of a business so that they more closely reflect its true economic position and results of operations on a historical and current basis.


The term "moving" refers to the way averages are calculated—the forecaster moves through the time series to pick observations to calculate using an average of a fixed number of observations. In calculating moving averages to generate forecasts, the forecaster may experiment with different-length moving averages. She will choose the length that yields the highest accuracy for the forecasts generated. Weights may be assigned to time periods.


With causal relationships, analysts examine the cause-and-effect relationships of a variable with other relevant variables such as the level of consumer confidence, an income statement or balance sheet item. Below are examples of common causal relation calculations:

1. Position calculations represent a company's financial position regarding earnings, cash flow, assets or capitalization. Calculations can be expressed as a dollar amount, a percentage, or a comparison. Position calculations are often referred to as “common-sized” when they are uniformly applied to a whole statement.

  • Cash % Assets
  • Debt / Equity

2. Metric calculations assess financial position relative to a non-financial figure such as days, transactions or number of customers.

  • Transaction Figures (Units Sold Per Day, Transaction Days, etc.)
  • Utilization %

A widely known causal method is regression analysis, a technique used to develop a mathematical model showing how a set of variables is related. Regression analysis that employs one independent variable and one dependent variable, approximating the relationship between the two with a straight line, is called simple linear regression. Regression analysis that uses two or more independent variables to forecast values of the dependent variable is called multiple regression analysis.
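
For instance, a simple linear regression of receivables on sales might look like the sketch below (Python with numpy; the data points and forecast sales level are illustrative assumptions):

```python
import numpy as np

# Illustrative history: sales (independent) and receivables (dependent).
sales = np.array([4_000, 4_400, 4_900, 5_300, 5_800], dtype=float)
receivables = np.array([480, 530, 600, 640, 700], dtype=float)

# Fit receivables = slope * sales + intercept (simple linear regression).
slope, intercept = np.polyfit(sales, receivables, deg=1)

# Use the fitted line to forecast receivables for a forecast sales level.
sales_forecast = 6_200.0
receivables_forecast = slope * sales_forecast + intercept
print(f"Forecast receivables: {receivables_forecast:.0f}")
```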

Causal Metrics (Drivers)

Here are some commonly used financial statement metrics used to develop line items for forecasts (a short build sketch follows these lists):


  • Cash % of revenue
  • Cash % of assets
  • Inventory % of sales
  • Accounts receivable % of sales
  • Accounts receivable based on days outstanding
  • Short term securities % of debt
  • Short term securities % of cash
  • Depreciation and amortization function of first cost


  • Payables % of cost of goods sold
  • Debt function of D:E target
  • Dividend result of D:E target

Income Statement

  • Revenue = price x volume
  • Cost of goods sold % of revenue
  • Selling expenses % of revenue
  • Depreciation function of first cost of asset
  • Interest expense % of average debt balance
  • Taxes at tax rate
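
As a sketch of how such drivers translate into forecast line items (Python used for illustration; every driver value below is an assumption, not the author's), a one-period income statement build could look like this:

```python
# Build a one-period income statement from the causal drivers listed above.
drivers = {
    "price": 12.0,                 # time-series driven
    "volume": 1_000.0,             # time-series driven
    "cogs_pct_revenue": 0.60,
    "selling_pct_revenue": 0.15,
    "depreciation": 40.0,          # function of first cost of assets (given here)
    "interest_rate": 0.05,
    "avg_debt_balance": 500.0,
    "tax_rate": 0.25,
}

revenue = drivers["price"] * drivers["volume"]
cogs = drivers["cogs_pct_revenue"] * revenue
selling = drivers["selling_pct_revenue"] * revenue
ebit = revenue - cogs - selling - drivers["depreciation"]
interest = drivers["interest_rate"] * drivers["avg_debt_balance"]
pretax = ebit - interest
taxes = drivers["tax_rate"] * pretax
net_income = pretax - taxes

print(f"Revenue {revenue:,.0f}  EBIT {ebit:,.0f}  Net income {net_income:,.0f}")
```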

Time Series Metrics (Drivers)

Here are some other financial statement metrics used to develop line items for forecasts which are a result of year over year changes:
  • Price
  • Volume
  • Cost of Goods Sold (PxV)
  • SG&A
  • Non-recurring and one-time expenses such as legal fees

Other considerations

Roll the Model Period to Period

Rolling the model means allowing time-series drivers to work directly off of prior-period balances. Furthermore, ending balance sheet balances should flow directly into the next period's beginning balances. Try developing your model's statements using these line items first, then move to causal line items. Never cut and paste-special unless importing data.
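
A minimal sketch of the roll (Python, illustrative balances): each period's ending balance becomes the next period's beginning balance, and the interest driver works off the prior (average) balance.

```python
# Roll a debt balance across periods: ending balance feeds the next beginning balance.
beginning_debt = 500.0          # illustrative opening balance
scheduled_repayment = 50.0      # illustrative assumption per period

for period in range(1, 5):
    ending_debt = beginning_debt - scheduled_repayment
    interest = 0.05 * (beginning_debt + ending_debt) / 2   # off the average balance
    print(f"Period {period}: begin {beginning_debt:.0f}, interest {interest:.1f}, end {ending_debt:.0f}")
    beginning_debt = ending_debt    # the roll: ending flows to next beginning
```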

Use Expected Values

Though asset capacity and elasticity constraints (utilization rates) usually prevent a symmetrical distribution of outcomes from occurring in real life, forecasts, as planning tools, most often assume a bell-shaped curve. In such a symmetrical distribution, the most likely outcome equals the probability-weighted outcome. This provides compatibility with most downstream models, such as valuation models, which are most often built on the assumption of expectation. Expected values are not shaded by optimism, pessimism or hope.
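
For instance (an illustrative sketch with assumed scenario values, not data from the article), the probability-weighted outcome under a symmetrical set of scenarios equals the most likely case:

```python
# Probability-weighted (expected) revenue under a symmetrical set of scenarios.
scenarios = [          # (probability, revenue) - illustrative assumptions
    (0.25, 900.0),     # downside
    (0.50, 1_000.0),   # most likely
    (0.25, 1_100.0),   # upside
]
expected_revenue = sum(p * v for p, v in scenarios)
print(expected_revenue)   # 1,000: equals the most likely outcome when symmetrical
```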


Assumption Examples

Table 1


Table 2


Table 3


Table 4


Table 5


Table 6


5. Validation and Acceptance

Model Validation Techniques

Validation of model results can be performed several ways:

Back-testing – Use of historic data as a test of model output validity. This is also a great way to develop a model: simply build your model using historic data and solve for a known answer (a small back-testing sketch follows this list).

Benchmarking – As previously discussed, this verifies your model’s output by comparing it to another source, either internal or external.

Scenario Analysis – Run your model using various inputs and assumptions to test for model integrity and reasonableness.

Sensitivity Analysis – Vary inputs and assumptions, compare the resulting output changes against other input and assumption changes, and check that the proper correlations exist in model output.

Use Test – Has the model stood the test of time?
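
A small back-testing sketch (Python; the model, inputs and tolerance are illustrative assumptions): rebuild the forecast from historic inputs, compare it to the known actuals, and flag anything outside tolerance.

```python
# Back-test: run the model on historic inputs and compare output to known actuals.
TOLERANCE = 0.05    # 5% tolerance from actuals, an illustrative threshold

def forecast_revenue(prior_revenue, growth):
    return prior_revenue * (1 + growth)

historic_inputs = [(1_000.0, 0.04), (1_040.0, 0.05), (1_092.0, 0.03)]   # illustrative
actuals = [1_045.0, 1_090.0, 1_128.0]                                    # known outcomes

for (prior, growth), actual in zip(historic_inputs, actuals):
    modeled = forecast_revenue(prior, growth)
    deviation = abs(modeled - actual) / actual
    status = "OK" if deviation <= TOLERANCE else "INVESTIGATE"
    print(f"modeled {modeled:,.0f} vs actual {actual:,.0f} -> {deviation:.1%} {status}")
```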

Common Validation Ratios

These ratios provide sanity checks for validation. Is model output reasonable and does it follow accepted patterns? Ratios not only reflect the financial position of a firm but may also reveal a model's calculation errors.

These metrics may also help verify inputs and assumptions and validate a model's results:

Validation Benchmarks

Model output can be compared to numerous benchmarks. They include:

  • Prior performance
  • Expectation
  • Competitor results
  • Industry standards
  • Analyst forecasts
  • Internal consensus

Sensitivity Analysis

For deterministic models, simple line charts flush out model output inconsistencies. There is no “scatter” around a fitted line, since deterministic charts have no random variable.
Graph 1
Graph 2


Assume we use a three-year average to forecast accounts receivable for 2018. Is our result consistent with out-of-sample actuals at a tolerable level? Here, it appears so.


Validation Trends

Through a company's life cycle, there are several common patterns of model output, depending on assumptions and their effect on earnings and/or cash flow growth. Look for irregularities (sudden drops or steep inclines) and see whether your model follows traditional forecast paths.

Ongoing Monitoring

Ongoing monitoring plays a vital role in model integrity. Are models back-tested annually? What business conditions and/or data sources have changed? What external factors may have changed, affecting the model's relevancy? Is the regulatory environment adopting new rules and standards?

Model Acceptance

Models are hard work. But, in the end, are they accepted both in build and results? Here are some key thoughts to remember:
  • Does your model comply with accepted model development standards (e.g. FAST), or is it simply proprietary?
  • Was the model developed with management buy-in?
  • Do key model users buy into the theory behind the model? If not, should the model attempt to better support the theory driving model calculations?
  • Does model development theory still hold?
  • Does model output “marry” to downstream use? If not, how can this be improved?
  • What approval processes are in place for the ongoing use and approval of the model?
  • Does the data input and model final output remain relevant?


6. Additional Resources

Here are links to model documentation and other guidance:

SR 11-7 

FHFA AB 2013-07          

FHFA AB 2009-03

OCC 2011-12

FAST modeling standards

The Global Association of Risk Professionals has numerous articles on model risk and validation.



Corporate FP&A Done Right

By Carl Seidman, Business Strategist | CFO Advisor at Seidman Global LLC


According to recent findings by CFO Magazine, Chief Financial Officers are increasingly disappointed with the return on investment received from their financial planning & analysis (FP&A) function.   This is no surprise.  Companies that don’t get the most from their FP&A functions tend to focus meticulously on accounting data, its presentation, and what the numbers mean.  While data accuracy is critical to successful data-driven analytics, HOW that information is used is what differentiates an exceptional return on FP&A investment from a modest one.

When companies put greater weight on what the numbers mean, they often rely more heavily on financial reporting and perceive FP&A as a ‘nice to have’.  But it’s quite the contrary.  FP&A should be used to assist company leadership in problem-solving, risk management, and growth planning including the following:

  • Anticipate forthcoming decisions
  • Evaluate “what-if” scenarios and contingencies
  • Contrast new insights with conventional wisdom

The reasons leadership doesn’t use FP&A to its full potential (or at all) are three-fold:

  • A lack of trust in data
  • A poor understanding of how to interpret and use financial information
  • A culture of pervasive, gut-instinct decision-making

Lack of Trust

An overwhelming number of decision-makers interpret data incorrectly because of bad information.  The adage of garbage in, garbage out rings true here and leads to leadership getting continuously burned.  The fault often lies with inaccurate, untimely, or irrelevant financial reporting stemming from poor systems and unqualified personnel.

If the rubbish output becomes a chronic issue, leadership will adapt to making decisions without data – a logical, but dangerous response.  If and when the company realizes success, over time, leadership may begin to rationalize:  “We’ve gotten this far without access to good data, so we must be doing something right.”  

Poor Understanding of How to Interpret and Use Financial Information

Most owners tend not to come from deep financial backgrounds but possess enough knowledge to understand financial health and what’s going on in their businesses.  But there’s a major difference between having an objective understanding of the finances and creatively knowing how to use the information in making future decisions.   Often it isn’t that leadership doesn’t see a need for FP&A; they have a poor understanding of how to interpret and use financial information and therefore don’t know how to build a function to support it.

Pervasive Gut-Instinct Decision-Making

Business builders can be a confident bunch.  When a business grows and profitability increases, it gives decision-makers the impression they’re on the right track.  Echoing the above, they explain: “We’ve gotten this far, so we must be doing something right.”  There is a tendency to be self-assured, often deceptively, in their abilities to interpret figures and make smart decisions.  Who needs FP&A when gut-instinct has proved effective?

The combination of a lack of trust in data, poor understanding of how to interpret and use financial information, and over-confidence gained from prior wins results in misuse of available information and poor decision-making.

Truth and Consequences

Historical achievements are not always a signal of future success.  Accomplishments of the past may give rise to risks and competitive interference never previously seen.  A ‘shoot-from-the-hip’ mentality will eventually lead to disastrous consequences – some recoverable, some not.  However, intuition, when combined with financial intelligence and analysis, lowers risk exposure and gives rise to better decisions.  FP&A is not meant to replace the experience and instinctual nature of leadership; it’s meant to complement it. 

First, the CFO, Finance VP, or Controller professional must build out the enterprise resource planning (ERP) system or financial platforms to ensure information is timely and accurate.  Those responsible for generating the numbers should not only understand what they are gathering and reporting, but why they are doing it and how it will be used for making important decisions.  This may introduce a cultural shift that should be reinforced by training, coaching, and demonstration of its importance.  Without trust in the information, leaders won’t use it and will instead rely on their intuition.  

Second, instead of relying extensively on objective financial reporting, the business must adopt a more subjective, forward-looking mindset. It is a move from being reactive to being proactive. FP&A should readily be able to provide an all-inclusive view of the business and where it might go. Analyses of near-term and future decisions should be conducted with ‘what-if’ outcomes reflecting alternatives and uncertainties. Evaluation of these scenarios should be considered mandatory, not just a ‘nice-to-have’. Outputs generated from FP&A will assist in more credible negotiations and discussions with lenders, vendors, customers, and other stakeholders.

Making Educated Decisions

Business leaders should beware of overzealousness and of believing that what got them here will get them there. Complacency, aimless strategy, and poor decisions can all lead to missed opportunities, stagnation, or distress. We’ve witnessed in countless instances the damage caused by a lack of foresight and a failure to react to changing circumstances. A strong FP&A function emboldens leadership to make educated decisions rooted in genuine information well in advance. In its absence, decision-makers place greater faith in back-of-the-envelope calculations and gut instinct. No doubt, the shift from reactive to proactive thinking takes courage and investment, but it supports the business’s ability to be nimble and avert future pitfalls.

Has zero-based budgeting been given a second life?

By Daniele Tedesco, CEO at Apliqo AG / Cubewise CH

The Author, Daniele Tedesco is a former CFO and investment banker and has successfully led a vast number of M&A transactions (buy- and sell-side) as well as driven value based management approaches in different companies.

He has more than 15 years’ experience in corporate finance and was the head of Technology Equity Research of a large Swiss bank.

Growth challenges, cost pressure and new competition from the Far East – the challenges for companies seldom come alone. CFOs are therefore faced with a frantic fall planning season. Above all, there is currently high demand for new methods of efficient and flexible budgeting. An old trend that dates back to the US in the 1970s is reemerging, seemingly right on cue. Back then, Peter Pyhrr, a former manager at Texas Instruments, came up with the idea of zero-based budgeting and successfully implemented it in his company.

Start from zero

In zero-based budgeting, the budget is recalculated from scratch every year. All expenditures for the next period must be justified anew. In other words, the planning stage always starts from zero and assumes that every function must be regularly reviewed with regard to its cost and benefit – unlike most common processes in which planning always begins with the current budget. The advantages are obvious: The various budget items are calculated based on the immediate needs of the different departments for the following year and not simply based blindly on the previous year’s budget or expense level. This enables inefficient and rigid structures to be dismantled. The objective is to thoroughly review and justify every planned expenditure so that all budget items are only released and approved if they are indeed required to the given extent.

Sustainable cost management

It may sound simple, but it's actually quite profound. Zero-based budgeting involves far more than simply creating a budget from scratch. ZBB is a circular process with the clear objective of establishing sustainable cost management within the company. This is facilitated by the fact that ZBB is applicable to every type of cost. Whether it be investments, operating expenses, marketing costs or sales expenditures – all corporate items can be rebalanced from scratch. At the same time, a method dating back to the 1970s may obviously have difficulty depicting the complexities of today’s business world. Modern solutions are required so that the ZBB process can be implemented in an efficient, transparent and effective manner. 

Identifying and justifying expenses – a specific example

Let us assume a company in the engineering industry implements zero-based budgeting. This process requires closer scrutiny of costs in the company's manufacturing department. The CFO subsequently realizes that the costs of certain components, which are used in its end products and have been outsourced to another manufacturer, increase by 5% every year. However, the company has the know-how and technology to manufacture these components in-house. After carefully weighing the positives and negatives, the company determines that it can produce the components less expensively than its external supplier and that it should do so internally.
In this case, the company may in the past have simply increased the budget for purchasing the corresponding components and cleverly disguised the increase. By applying zero-based budgeting, however, it was able to identify potential alternatives and assess whether it would be better to manufacture the parts itself. Cost drivers within departments cannot be identified using traditional budgeting methods. In contrast, ZBB is a detailed process that aims to unmask and justify expenses. It is important that the costs of the ZBB process themselves always be weighed against the savings, as zero-based budgeting usually involves more time and effort.
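
To make the arithmetic concrete, here is an illustrative make-versus-buy comparison (a sketch with assumed volumes and costs, not figures from the case):

```python
# Illustrative zero-based make-vs-buy check for an outsourced component.
units_per_year = 10_000
supplier_price = 8.00          # current external price per unit
supplier_escalation = 0.05     # observed 5% annual price increase

in_house_variable_cost = 6.50  # assumed internal cost per unit
in_house_fixed_cost = 12_000   # assumed incremental annual fixed cost

horizon_years = 3
buy_cost = sum(units_per_year * supplier_price * (1 + supplier_escalation) ** y
               for y in range(horizon_years))
make_cost = horizon_years * (units_per_year * in_house_variable_cost + in_house_fixed_cost)

print(f"Buy: {buy_cost:,.0f}  Make: {make_cost:,.0f}")
print("Manufacture in-house" if make_cost < buy_cost else "Keep buying")
```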

Planning tools as the basis for ZBB

It only makes sense to implement ZBB if the entire process can be executed efficiently, transparently and effectively. Planning solutions provide the basis for this, as such analysis tools can be used to identify key demand and cost drivers. These value drivers allow detailed bottom-up sub-plans to be drawn up for sales, material costs, personnel costs, operating expenses and any other relevant expense items that ultimately result in integrated planning. This enables a holistic view of the income statement, balance sheet and cash flow, which are based on ZBB methods and therefore efficiently allocate costs.


Exceptional EPM/CPM Systems are an Exception

By Gary Cokins, Founder and CEO: Analytics-Based Performance Management LLC

Quite naturally, many organizations over-rate the quality of their enterprise and corporate performance management (EPM/CPM) practices and systems. In reality, those practices fall short in how comprehensive and integrated they are. For example, when you ask executives how well they measure and report either costs or non-financial performance measures, most proudly boast that they are very good. Yet this conflicts with surveys in which anonymous replies from mid-level managers candidly score them as “needs much improvement.”

Every organization cannot be above average!

What makes exceptionally good EPM/CPM systems exceptional?

Let’s not attempt to be a sociologist or psychologist and explain the incongruities between executives boasting superiority while anonymously answered surveys reveal inferiority.  Rather let’s simply describe the full vision of an effective EPM/CPM system that organizations should aspire to possess.

First, we need to clarify some terminology and related confusion. EPM/CPM is not solely a system or a process. It is instead the integration of multiple managerial methods – and most of them have been around for decades arguably even before there were computers. EPM/CPM is also not just a CFO initiative with a bunch of scorecard and dashboard dials. It is much broader. Its purpose is not about monitoring the dials but rather moving the dials.

What makes for exceptionally good EPM/CPM is that its multiple managerial methods are not only individually effective, but they are also seamlessly integrated and embedded with analytics of all flavors. Examples of analytics are segmentation, clustering, regression, and correlation analysis.      

EPM/CPM is like musical instruments in an orchestra

I like to think of the various EPM/CPM methods as an analogy of musical instruments in an orchestra. An orchestra’s conductor does not raise their baton to the strings, woodwinds, percussion, and brass and say, “Now everyone plays loud.” They seek balance and guide the symphony composer’s fluctuations in harmony, rhythm and tone. 

Here are my six main groupings of the EPM/CPM methods – its musical instrument sections:

  • Strategic planning and execution – This is where a strategy map and its associated balanced scorecard fits in. Together they serve to translate the executive team’s strategy into navigation aids necessary for the organization to fulfill its vision and mission. The executives’ role is to set the strategic direction to answer the question “Where do we want to go?” Through the use of correctly defined key performance indicators (KPIs) with targets then the employees’ priorities, actions, projects, and processes are aligned with the executives’ formulated strategy.
  • Cost visibility and driver behavior – For commercial companies, this is where profitability analysis fits in for products, standard services, channels, and customers. For public sector and government organizations, this is where they come to understand how processes consume resource expenses in the delivery of services and report the costs, including the per-unit cost, of those services. Activity-based costing (ABC) principles model cause-and-effect relationships based on business and cost drivers. This involves progressive, not traditional, managerial accounting such as ABC rather than broadly averaged cost factors without causal relationships.
  • Customer Management Performance – This is where powerful marketing and sales methods are applied to retain, grow, win-back, and acquire profitable, not unprofitable, customers. The tools are often referenced as customer relationship management (CRM) software applications. But the CRM data is merely a foundation. Analytical tools, supported by software, that leverage CRM data can further identify actions that will create more profit lift from customers. These actions simultaneously shift customers from not only being satisfied to being loyal supporters. 
  • Forecasting, planning, and predictive analytics – Data mining typically examines historical data “through the rear-view mirror.” This EPM/CPM group directs attention forward, looking at the road through the windshield. The benefit of more accurate forecasts is reduced uncertainty. Forecasts of the future volume and mix of customer-purchased products and services are core independent variables. Because so many dependent variables have relationships with those forecasts, process-related costs derived from resource expenses can be calculated and managed. Examples of dependent variables are the future headcount workforce and spending levels. CFOs increasingly look to driver-based budgeting and rolling financial forecasts grounded in ABC principles using this group.
  • Enterprise risk management (ERM) – This cannot be omitted from the main group of EPM/CPM. ERM serves as a brake on the potentially unbridled gas pedal that EPM/CPM methods are designed to step hard on. Risk mitigation projects and insurance require spending, which reduces profits and diverts expenses from resources the executive team would prefer to devote to earning larger compensation bonuses. So it takes discipline to ensure adequate attention is placed on appropriate risk management practices.
  • Process improvement – This is where lean management and Six Sigma quality initiatives fit in. Their purpose is to remove waste and streamline processes to accelerate and reduce cycle-times. They create productivity and efficiency improvements.

EPM/CPM as integrated suite of improvement methods

CFOs often view financial planning and analysis (FP&A) as synonymous with EPM/CPM. It is better to view FP&A as a subset. And although better cost management and process improvements are noble goals, an organization cannot reduce its costs forever to achieve long-term prosperity.

The important message here is that EPM/CPM is not just about the CFO’s organization; it is also the integration of all the often silo-ed functions like marketing, operations, sales, and strategy. Look again at the six main EPM/CPM groups I listed above. Imagine if the information produced and analyzed in each of them were seamlessly integrated. Imagine if each were embedded with analytics – especially predictive analytics. Then powerful decision support is provided for insight, foresight, and actions. That is the full vision of EPM/CPM to which we should aspire in order to achieve the best possible performance.

Today exceptional EPM/CPM systems are an exception despite what many executives proclaim. If we all work hard and smart enough, in the future they will be standard practices. 

Strategic Planning is about ‘Talks & Figures’

By Richard Reinderhoff, Freelance Consultant Strategy & Operations​

A ‘financial’ strategist is a strategist first and a financial second. For decades financials have been applying solutions to become a strategic business partner for the C-suite, from financial engineering and tax planning to centralising (global) operations and deep analytics today. To avoid drilling deeper and still finding nothing, reverse engineering the strategic role of the financial will show another route to being of value and to increasing IRR or profits by double digits…

Reality check

If the aim of CFOs and FDs is to improve the decision-making process of the C-suite, meaning add value to the business, accounting and compliance aren't helping this quest. In fact, if you read the report by the American Institute of CPAs (AICPA) and the Chartered Institute of Management Accountants (CIMA), Joining the Dots: Decision Making for a New Era, you'll be in shock:

  • “80% of respondents admit that their organisation used flawed information to make a strategic decision at least once in the last three years. One-third (32%) of respondents say big data has made things worse, not better…”
  • “72% of companies have had at least one strategic initiative fail in the last three years because of delays in decision-making, while 42% say they have lost competitive advantage because they have been slower to make decisions than more agile competitors.”

It is all about information, yet more details or more of the same data will give you the same answers, only in more detail or at a higher spend (Capex). The trick is to reverse the direction the financial is looking: from ‘stargazing into a black hole’ to ‘storytelling based on facts’. For this to happen, the strategy of the business needs to be placed into the ‘accounting’ systems. 

A simplified example will be used to show, how this ‘street smart’ solution was encountered (step 1) and how it is set-up (step 2).

Step 1. Talk to Sales & Operations

A strategy is often a group of plans, where the numbers disappear in PowerPoint presentations, Excel sheets or BI software. Many financials are hooked by their screens, yet the story in the plans is ‘lost in translation’. 

For example, what does the following overview tell you?

Not much, it just shows you the composition of spending, not the strategic intent. As a board member, you might even be tempted to reduce Consultancy fees and Temps, as part of a company-wide ‘savings initiative’ as a response to market pressures.

The first step to increasing your understanding is to talk to Sales & Operations and ask what drives and blocks their business. In the same example, the “Accounts” have been decomposed and reshuffled into “Business Drivers”; this shows how each business is planning to be developed.

When the result is lagging, which question will you ask now?

Of course, as a board member, you will ask where and why performance is lagging. This is how the quality of the decision is increased, avoiding ‘one size fits all’ solutions. And normally, it’s the financials that should have provided the CEO, CFO or GM with the right answer.

  • In a real case, 14 companies worldwide were managed based on these kinds of expense reports, which built up into Return On Capital Employed (ROCE) and Value Added ratios. This was also enough for a local Finance Director to manage the expenses until the business focus changed from selling ‘bulk’ to ‘custom build’. Now, with a highly segmented market approach, a cost management system had to be devised to meet this differentiation and his reporting needs, meaning he had to get involved with Sales & Operations and monitor each segment.

Step 2. Implement Project Accounting

As a financial, you have talked to Sales & Operations (managers, directors, and global VPs) and they have given you a jointly agreed list of key “business drivers” for each market. This list should match their business plans or strategy. One problem: there are no such descriptions in the Chart of Accounts or as Cost Centres. Here enters project accounting.

What is a project? Basically, it is a sequential flow of various tasks. Each project task can contain any kind of spending, following the Chart of Accounts, and different ‘Cost Centres’ can book on a project activity, when working (or purchasing) together.

Standard project: 

Within project management, each project task or activity is called a Work-Breakdown-Structure (WBS), with a WBS-description and WBS-number. To transform project accounting into a strategy storyboard, you give each project task a WBS-description of a “business driver”, and have a term which lasts e.g. 20 years or more. Now the strategy is in the accounting system!

Strategy storyboard: 

The only instruction you have to give to the budget owners is that their assistant books each transaction and allocation with one additional code: the WBS-number, which is given by the budget owner and related to one of the business drivers. 
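
A minimal sketch of the idea (Python; the WBS-numbers, drivers and amounts are hypothetical): every booking carries one extra code, the WBS-number, and the report simply aggregates spend by business driver instead of by account.

```python
from collections import defaultdict

# Hypothetical WBS-numbers mapped to the business drivers agreed with Sales & Operations.
wbs_drivers = {
    "WBS-100": "Enter new segment A",
    "WBS-200": "Retain key accounts",
    "WBS-300": "Automate order handling",
}

# Ordinary bookings (account, cost centre, amount) plus the one extra code: the WBS-number.
bookings = [
    ("Consultancy fees", "Marketing",  20_000, "WBS-100"),
    ("Temps",            "Operations",  8_000, "WBS-300"),
    ("Travel",           "Sales",       5_000, "WBS-200"),
    ("Consultancy fees", "Operations", 12_000, "WBS-300"),
]

# 'Use of funds' by strategic driver rather than by account.
spend_by_driver = defaultdict(float)
for account, cost_centre, amount, wbs in bookings:
    spend_by_driver[wbs_drivers[wbs]] += amount

for driver, spend in spend_by_driver.items():
    print(f"{driver}: {spend:,.0f}")
```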

Now that you have the strategy translated into a storyboard in your accounting system and amounts are being booked, this is what you get:

  1. The local manager will get a report enabling him/her to tell the strategic story to the regional director.
  2. The regional director can follow the roll-out and effectiveness of different strategies across the region, and explain and advise the global VP where and how best to allocate or reduce spend going forward.
  3. Global VP’s can present a ‘Use-of-Funds’ overview to their CEO, telling the real strategic story on how the money was used by their business (versus strategic intent) and which additional initiatives were taken to improve performance.
  4. The CEO will be able to match the initial detailed ‘Use-of-Funds’ overview (= the strategic plan or pitchbook) presented to boards and investors with the quarterly updated ‘Use-of-Funds’ overviews, to defend any change (or not) in business focus and the actions taken on major events impacting the company.
  5. Now boards can constructively participate in discussing and making the strategy work, without the need to just trust the PowerPoint of the CEO or worry about the report of their Audit Committee.

Project accounting has further reporting advantages: it ‘overwrites’ organisational boundaries (no cross-border limits), bookings can be split between different WBS-numbers, and various consolidation hierarchies are possible.

  • In another real case, around 8 different Marketing Managers each had their marketing plan translated and linked to business drivers defined by their own regional Marketing Director. The local FP&A specialist applied project accounting to all these plans and reporting needs. As a consequence, finance stopped receiving inquiries from Marketing Directors controlling the spending of their Marketing Managers, and the Marketing Managers could deviate from global ‘reduction initiatives’, securing their prioritised spend and meeting or exceeding their targets.

How does this increase the yield by double digits? Re-allocation of capital!

A company using a traditional business planning method will learn quickly. After management sees where the money really went and notices it spends less time understanding the numbers, it will start to permanently reduce spend on non-value-added activities and fund only the best new opportunities.

Those companies familiar with Beyond Budgeting, Driver-Based Planning, or (strategic) Zero-Based Budgeting, will immediately see the real advantages:

  • No strategic initiative will be wasted (= effectiveness of capital).
  • The allocation of capital can easily be prioritised and changed in accordance with the business environment (= market leadership).
  • Strategic changes will not be hindered by the planning cycle (= transparency and agility).
  • Changing KPI’s will result in faster solution finding and best-practices (= cost savings).
  • Re-assigning people to value-added activities increases motivation (= retention of talent).
  • Business focus, growth and development will be watched closely by boards (= continuity).
  • M&A must show where the synergy will be (or is) happening (= accountability, goodwill, impairment). 

The increase in returns comes from the effective execution of the strategy and adapting it in accordance with the business environment. By linking the strategy with accounting, and not the other way around(!), project accounting is writing the real success story, every month!

Epilogue: Strategic Planning is about ‘talks & figures’

Independent of the accounting, ERP or BI system installed, using project accounting can turbocharge any financial into a strategic business partner without the need for any significant investment. When FP&A is placed within this bigger picture, the link to strategic planning becomes evident. By translating the strategic intent of the company into business drivers made visible through the ‘use-of-funds’, the execution of the strategy becomes fact-based, transparent and verifiable: ‘talks & figures’. Just think about it, and add real value to your role and your company.