SR 11-7 and Corporate Finance Modelling: Managing Risk and Promoting Success
Through the financial crisis, the advent of drill-down database capabilities, and direction from the Federal government, financial modeling has evolved into both a defined art and a science. The idea that financial modeling simply means knowing Excel and working with numbers has been superseded by a financial model governance framework, one which requires the proper employment of academic theory, collaborative development, identification and management of risks and controls, and verification of final model output through validation techniques and ongoing monitoring practices.
In the wake of the Great Recession, the Federal Reserve issued “SR 11-7,” which defines what a model is, how a model is developed and implemented, and how a model should be proven accurate and effective. The audience for SR 11-7 is financial institutions’ model developers, owners, and users, but much can be learned from the document by the far broader corporate modeling world. It is invaluable in giving substance to the overused and often undefined phrase “best practice” in everyday corporate life.
What is a Model?
SR 11-7 begins by eliminating the idea that models can pretty much mean anything to anyone. By definition, a model employs:
- Quantitative theory such as statistical, financial or accounting theory
- Three components: inputs-calculation processes-outputs
- Transformation of data into useful information (a reporting component)
- Quantitative output
- Possibly qualitative model inputs (assumptions), calculations or outputs so long as the model remains quantitative in nature, thereby having some uncertainty in outputs
- Repetitive use
- Subject matter expertise
A few other characteristics which may help in identifying models and ranking their importance are:
- Risk potential of model use
- Reporting impact of changes resulting from model output
Development, Implementation and Use
SR 11-7 sets forth general guidelines to ensure the model development approach is disciplined, knowledge-based and properly implemented.
Banks employ dedicated development teams and resources for key models. Under this approach, models operate on the concept of leverage: just as with operating leverage, substantial upfront time and effort can lead to losses, but also to extraordinary gains through the quality of output and confidence in that output. Consider investing the time and expense of developing your model upfront: research methodologies and calculation techniques, give design and structure top billing, and take the time to validate through back-testing and benchmarking. Model development using lower leverage will see a more limited reward, but with reduced risk. Models developed and improved on an “as needed” basis over time can result in confusion, key-person risk, extraneous bulk, and circuitous audit trails.
Form and Structure
Thought to form and structure is critical for all models. From experience, we know such thought is not always given. All financial models follow the same logical order. They:
- Compile data
- Adjust and conform data as required
- Transform data through calculation
- Present final model output such as a value, a table or a forecast
Use this commonality to develop discipline in how models are built and structured among and across users and functional areas. Create an intuitive and easy to follow workflow, such as using tabs left-to-right in Excel. Models in MatLab code can leverage replicable building blocks.
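The four-step flow above can be sketched in code. This is a minimal illustration, not a prescribed implementation; the function names, the tiny revenue data set, and the 25% EBITDA margin are all hypothetical assumptions for demonstration only.

```python
# Sketch of the common model flow: compile -> adjust -> transform -> present.
# All data and parameters below are invented for illustration.

def compile_data():
    # Step 1: gather raw inputs (hard-coded monthly revenue, in USD)
    return [{"month": "Jan", "revenue": 100.0},
            {"month": "Feb", "revenue": None},   # missing value from source
            {"month": "Mar", "revenue": 120.0}]

def adjust(rows):
    # Step 2: conform the data -- drop records the model cannot use
    return [r for r in rows if r["revenue"] is not None]

def transform(rows, margin=0.25):
    # Step 3: calculate -- apply an assumed 25% EBITDA margin
    return [{**r, "ebitda": r["revenue"] * margin} for r in rows]

def present(rows):
    # Step 4: present the final output as a simple table
    return [(r["month"], round(r["ebitda"], 2)) for r in rows]

output = present(transform(adjust(compile_data())))
print(output)  # [('Jan', 25.0), ('Mar', 30.0)]
```

Keeping each step in its own component, as here, is what makes a model adaptable when sources and requirements change.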
I am a fan of the FAST standards which can be found at the link below. The FAST standards view Excel financial models not simply as mathematical calculation tools, but communication tools. I find that powerful, and think any veteran financial modeler will ultimately gravitate towards models which possess well defined structures, clarity in logic, and brief audit trails. FAST views financial models as narratives; with sentences, paragraphs, and chapters. Here is the link to the FAST organization: www.fast-standard.org.
Flow charting financial statement builds gives the user a quick gauge of what data is at hand versus what is needed to complete a model. A flow chart helps identify the required builds and the environments from which data will be acquired, which is helpful in managing large projects. In M&A, a banker or seller’s representative will often provide information which does not align to a valuation model’s builds. With a flow chart, it is easier to identify and manage what is required versus what is available, and to focus on the point where most models break down: the middle. Flow charting also helps visualize which components are core and “plug ‘n play” across models versus those unique to the particular model at hand. A good example of reusable core components is a set of financial statements (balance sheet, income statement, and cash flow) in consolidated format: these are applicable to a range of models (capital allocation, valuation, forecasting), with each model tailored to provide additional output.
Though your model output is needed urgently, things change, and what you solve for today might not be the same as what is required tomorrow. Solid, component-based model building provides built-in flexibility and adaptability. Such models can shed one-time output with reduced risk and react more easily to changing sources and systems.
In financial modelling, no one functional area reigns supreme. For example, a financial model that solves for a balance sheet answer means keeping accountants involved in the model process, wing-to-wing. Statistical methodologies and techniques may be employed in data transformation, but statistics are just part of a broader model framework. At every possible step, speak in the language of the model’s final output environment (finance, accounting, etc.). Functional areas will often use the same terminology to mean different things; be aware of this and agree on definitions first. When a term is used for the first time, stop and define it for all involved. One company I worked with made investments in equity securities using a portion of equity. Imagine the potential for confusion as model output is passed from entity to entity to entity! “Corporate”, owners, and business units do not always coordinate and come to terms on basic terminology and builds, and often work under differing constraints.
Below is a link covering Excel design content and protocol (Wall Street Prep). Every workplace across functional areas should be having conversations on modeling topics such as these: www.wallstreetprep.com/knowledge/financial-modeling-best-practices-and-conventions/
Good documentation serves several powerful purposes: it allows one to communicate a model’s structure, design, and output across functions and academic disciplines with confidence; it provides comfort that a model was accurately developed; and it forces the model developer to think longer and harder about both academic standards and quality of design. A common approach to model documentation would cover the three primary components of SR 11-7:
- Development, Implementation, and Use
- Validation and Ongoing Monitoring
- Governance and Controls
SR 11-7 discusses documentation as a critical component of both development and validation. Documentation ensures smoother use of a model as owners and users change over time.
Flow chart documents are powerful tools. Flow charts come in many forms and there is no single correct way to flow a model. From my experience, two stand out as invaluable:
- System flow chart “wing to wing” (input to output)
- Development flow chart
A system flow chart will show, left to right, a model’s data inputs and IT/business unit environments, calculation processes, and model output and IT/business unit environments. A development flow chart will visualize the exact mathematical methodologies and techniques employed to transform input data into useful business information, and will generally focus on only one business environment. Flow charts will dramatically improve model buy-in and provide a path to solid structure. Dead ends, duplication, and unmanageable audit trails become visible.
Here is a link to a PwC document which shows the documentation of a cash flow model with integrated statements.
Model validation is a set of processes and activities intended to verify that a model performs as intended and as expected. All components of a model (inputs, calculation processes, outputs) should be subject to this verification. Validation should be commensurate with the potential risk in the model’s use. Validation does not end once a model is implemented: the same validation tools used during development can be employed in an ongoing manner. SR 11-7 recommends establishing periodic reviews (seldom seen in the corporate finance world) and setting thresholds and tolerance levels for when model output deviates from expectations or actuals.
The tools SR 11-7 suggests for validation are back-testing and benchmarking.
- Back-testing uses historic data as a proof of model output. In development, building a model with actuals and using known and reliable metrics as building blocks is a sound approach, aligning the model’s build to already established and accepted metrics. On a forward-looking, ongoing basis, once the forecast period becomes actual, back-testing again comes into play.
- Benchmarking is the comparison of model output to the output of an outside, independent source. One example would be reconciling a DCF’s output to a market multiples approach, taking care to explain variations between methodologies.
Both require a degree of independence from the model developers and owners, though that would vary case by case in a non-regulatory environment. Seminars, workshops, and certification are available for model validation.
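Back-testing, at its simplest, is a comparison of historical forecasts against actuals. The sketch below is one hedged illustration of the idea; the periods and figures are invented, and the percentage-error measure is just one reasonable choice.

```python
# Back-testing sketch: compare past model forecasts to actual outcomes and
# report the percentage error per period. All figures are hypothetical.

def backtest(forecasts, actuals):
    """Return per-period percentage error of forecast vs. actual."""
    errors = {}
    for period, actual in actuals.items():
        forecast = forecasts.get(period)
        if forecast is not None and actual:
            errors[period] = (forecast - actual) / actual
    return errors

forecasts = {"2022": 105.0, "2023": 118.0}
actuals   = {"2022": 100.0, "2023": 125.0}

errors = backtest(forecasts, actuals)
for period, err in sorted(errors.items()):
    print(f"{period}: {err:+.1%}")  # 2022 overshoots; 2023 undershoots
```

Persistent one-sided errors in a report like this are exactly the kind of signal that should trigger a review of the model's methodology.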
The “Use Test” adds qualitative validity and is mentioned in Basel II. Model validation through sensitivity and scenario analysis rounds out the validation process. Sensitivity analysis tests for the impact of a change to an input relative to the change in output. Scenario analysis would involve multiple changes to inputs to reflect a given set of circumstances.
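The distinction between sensitivity analysis (one input at a time) and scenario analysis (several inputs moved together) can be shown on a toy valuation. This sketch uses a simple Gordon-growth perpetuity; the cash flow, growth, and discount figures are purely illustrative assumptions.

```python
# Sensitivity vs. scenario analysis on a toy perpetuity valuation.
# All inputs below are hypothetical.

def value(cash_flow, growth, discount):
    # Gordon growth perpetuity: CF * (1 + g) / (r - g)
    return cash_flow * (1 + growth) / (discount - growth)

base = value(100.0, growth=0.02, discount=0.10)

# Sensitivity: vary one input at a time and observe the change in output
for g in (0.01, 0.02, 0.03):
    print(f"growth={g:.0%}: value={value(100.0, g, 0.10):,.0f}")

# Scenario: change several inputs together to reflect one set of circumstances
downside = value(90.0, growth=0.00, discount=0.12)  # a recession scenario
print(f"base={base:,.0f}, downside={downside:,.0f}")
```

Note how a one-point change in growth moves the valuation disproportionately; documenting that kind of sensitivity is part of communicating a model's limitations.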
The Global Association of Risk Professionals has numerous articles on model risk and validation. The link is https://www.garp.org.
For starters, know exactly what your model solves for. Define your final model output upfront and in painstaking detail. What is the answer conceptually, and how will it be expressed? What environment will the final model output be in, and will it affect downstream models? For example, if you are developing a net cash flow model, make sure your model solves for a net figure (all economic benefits less all economic detriments resulting from implementation of the planned activity). Another example is fair value versus fair market value: fair value is a recently conceived accounting measure for balance sheet reporting, while fair market value is an age-old valuation concept. Lawyers go to remarkable lengths to define financial statement items and their calculations for the purposes of debt covenants in securities documentation. Model developers should as well. It is arduous, but it sets an impressive example.
Acknowledging conceptual limitations is critical to model integrity. All theory is limited, and so too are the models which depend upon it. A classic example is CAPM: simple, intuitive, widely adopted, uncannily accurate, yet flawed in its claim to measure expected returns. Another example is IRR. Solving for IRR is extremely useful when comparing similar projects and investments, but IRR also comes equipped with built-in pitfalls. If you know your model’s limitations upfront and have articulated them comprehensively, you will have a far greater chance of model acceptance and adoption than if you are broadsided during challenge and cross-examination.
The Fed asks that model developers give thought to various theories and approaches (for example, DCF versus normalized earnings, or market multiples versus an income approach). Documentation provides a platform to communicate conceptual soundness; academic theory and empirical evidence should be cited, as should alternative approaches. Qualitative judgment should be challenged and put to the test to ensure that subjective adjustments to the model are not simply compensating for an equal but opposite model error.
Alignment to Academic Standard
Once, I opened a weekly forecast model to find it lacking any standard professional structure or protocol. Fair enough: this had been someone’s individual work assignment, and they knew it well. The harder challenge to the model was the conceptually incorrect methodology it employed in attempting to forecast EBITDA. The model extrapolated a small percentage of monthly revenue into a full-month figure. The population base (i.e., the number of customers) behind the revenue streams was small (about 40) and dissimilar. This is not sound (just because it rained the first two days of the month does not mean it will rain all month). The adjustments required to reach an accurate estimate were cumbersome and undermined model credibility. So, seek advice from other business units and from corporate on model approaches, if needed. Coordination from corporate to business units is not always as strong as one might hope. In the absence of methodological guidance, the business plan should be the starting position for any forecast in corporate finance. This also serves a second purpose: keeping your business plan development in check.
Another example of aligning to academic standard is financial statement structure and terminology. Over time, companies develop statements and statement terminology that are meaningful internally. Taking the time to document and align to common statement structure and terminology will dramatically improve a model’s adoption by future users and outside parties. EBIT, EBI, and EBT, for example, should each be clearly labeled as such. “Cash flow” is a generic term, so define your cash flow as you would read it in a textbook, such as “net debt-free cash flow to equity holders”.
Ongoing monitoring is highlighted in SR 11-7 to ensure a model continues to function as intended and to evaluate whether any external changes require model alteration. Ongoing monitoring will also ensure that changes by a model user separate from the developer do not affect the model’s intended output. These would include overrides, partial formulae, etc.
Sensitivity analysis and benchmarking are specifically cited for monitoring purposes, as is outcomes analysis. Outcomes analysis is the comparison of model output to actual outcomes. Back-testing, which was previously mentioned, is a type of outcomes analysis and may serve as an excellent model development approach as well.
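The thresholds and tolerance levels SR 11-7 recommends can be made concrete with a small outcomes-analysis sketch: flag any period where model output deviates from actuals by more than an agreed tolerance. The 10% tolerance and the quarterly figures below are hypothetical policy choices for illustration.

```python
# Ongoing outcomes analysis sketch: flag periods where model output deviates
# from actuals beyond a tolerance threshold. All figures are hypothetical.

TOLERANCE = 0.10  # flag deviations beyond +/-10% (an assumed policy setting)

def monitor(model_output, actuals, tolerance=TOLERANCE):
    """Return (period, deviation) pairs that breach the tolerance."""
    breaches = []
    for period in sorted(actuals):
        deviation = (model_output[period] - actuals[period]) / actuals[period]
        if abs(deviation) > tolerance:
            breaches.append((period, deviation))
    return breaches

model_output = {"Q1": 100.0, "Q2": 135.0, "Q3": 98.0}
actuals      = {"Q1": 102.0, "Q2": 110.0, "Q3": 99.0}

breaches = monitor(model_output, actuals)
for period, dev in breaches:
    print(f"{period} breached tolerance: {dev:+.1%}")  # only Q2 breaches
```

A breach report like this gives the periodic review a concrete trigger, rather than leaving "the model seems off" to individual judgment.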
Governance and Controls
Policy and procedure formalize risk management activities for implementation. SR 11-7 recommends an emphasis be placed on testing and analysis with a key goal of promoting accuracy. These roles and responsibilities can be divided among ownership, control, and compliance.
Strong governance coordinates processes and model output across functional areas and validates the final model output. It provides a venue for sharing ideas across areas and business units. Governance activities may include:
- Challenge to model development and verification of validation testing
- Developing and disseminating model standards which identify the critical elements of model development, implementation, use and monitoring and provide a road map of expectations for success.
- Naming models and ensuring they are secured.
- Model inventorying, taken on a periodic basis, noting both each model’s rank of importance and where it sits in the development-use-monitoring lifecycle.
- Implementation and use governance (use of links, data dumps, input identification).
- Shared control tools and techniques for users; watch windows, balancing checks, backups, etc.
- Flow chart guidance which will visually point out risk and control points which may not be obvious to a user in a live environment.
- Qualitative risk analysis as a starting point in identifying risk points and controls. Consider brainstorming, the Nominal Group Technique, or the Delphi Technique, including both upstream and downstream model owners and users.
Become an informed, insightful, and invaluable employee by utilizing SR 11-7’s guidance, even in non-regulatory environments. Excellent executive decision-making demands excellent modelling and analysis.
SR 11-7: www.federalreserve.gov/bankinforeg/srletters/sr1107.htm
Rob Trippe is a financial modelling veteran. With over fifteen years’ experience, Rob has developed corporate finance models for valuation, M&A, forecasting and performance monitoring. He is widely respected for his deep understanding of corporate finance theory, lectures at university and has worked with some of the world’s largest and most respected firms. His research while at the investment bank Houlihan Lokey Howard & Zukin was published in the Wall Street Journal and USA Today. His cash flow model while at the Hertz Corp. was published in SEC and quarterly press release filings. Rob was accredited in valuation in 2008 and holds an MBA, Finance from Boston College.