By Randall Bolten, longtime Silicon Valley CFO, author of "Painting with Numbers: Presenting Financials and Other Numbers So People Will Understand You," and adjunct professor at U.C. Berkeley Extension
On one point, there is no argument: You will be considered a great FP&A professional only if you can communicate clearly, effectively, and eloquently.
Regardless of what you’re communicating, your audience will form opinions about the content of your information – and about you personally – from what you present, how you present it, and how you behave while you’re presenting. They’ll form opinions about your intelligence, your professionalism, your grasp of the subject matter, your respect for your audience, your work ethic, your integrity, and your honesty. And those opinions are intertwined: the credibility of your content affects your credibility as a professional, and vice versa.
The fact that you're communicating numbers doesn't change anything – in fact, you face even more presentation choices than you do when presenting words. The most basic of those choices is whether to use a table or a graph. To make that choice intelligently, it's critical that you answer four key questions, in order.
The bottom line: If you’re an FP&A professional responsible for producing a regular periodic reporting package – such as monthly, quarterly, or yearly – consider using tables for the basic information that everyone is expecting. Use graphs sparingly, to make your most important points, and only when those points lend themselves to a visual presentation.
Obviously, the most important goal of almost all numbers presentations is to inform your audience. But every good FP&A professional should have another, more selfish goal: to showcase your critical thinking skills. If you keep both goals in mind, achieving each will make the other more likely.
This article was first published on the prevero Blog.
RANDALL BOLTEN grew up in Washington, D.C., the son of a CIA intelligence officer and a history professor. He is passionate about the importance of presenting financials and other numerical information in a cogent and effective way, and in his current life is the author of Painting with Numbers: Presenting Financials and Other Numbers So People Will Understand You (John Wiley & Sons, 2012).
He is a seasoned financial executive, with many years directing the financial and other operations of high-technology companies. His experience includes nearly twenty years as a chief financial officer of software companies.
He has held the CFO position at public companies BroadVision and Phoenix Technologies, and at private companies including Arcot Systems, BioCAD, and Teknekron. Before his CFO positions, he held senior financial management positions at Oracle and Tandem Computers.
He received his AB from Princeton University, headed west to earn an MBA at Stanford University, and ended up staying in Silicon Valley.
In addition to writing Painting with Numbers, he currently operates Lucidity, a consulting and executive coaching practice focused on organizing and presenting complex financial information. He divides his work time between Glenbrook, NV and Washington, DC, and maintains an office in Menlo Park, CA.
Framing is how situations are presented to people, and how a situation is presented affects the decisions people make. Framing therefore has a role in the work of FP&A practitioners.
FP&A practitioners can work through narrow frames. On an income statement, narrow frames appear as line items such as revenues from specific products, executive salaries, and equipment depreciation; on a balance sheet, as work-in-process inventory, interest payable, and common stock. Narrow frames give FP&A practitioners an opportunity to employ a bottom-up approach to their work, which improves precision in deliverables like financial plans.
FP&A practitioners can also work through broad frames. On an income statement, a broad frame appears as net income; on a balance sheet, as total assets, total liabilities, and total equity. Broad frames give FP&A practitioners an opportunity to employ a top-down approach, which improves their ability to think and learn about a business's value proposition.
Finance is about wealth, and wealth can be measured in a number of ways; how it is measured depends on framing. Narrow frames like revenues from new customers or products, the average collection period, or the average age of inventory can provide insight into a business's ability to create wealth. Broad frames like net income, return on assets, and return on equity can provide that insight as well. The key for FP&A practitioners is to manage biases, i.e. errors in judgment, when assessing wealth. One way to do that is to know which method of framing, narrow or broad, is more appropriate under a given set of circumstances.
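To make the contrast concrete, here is a minimal Python sketch computing two narrow-frame and two broad-frame measures of wealth creation; all figures are invented placeholders, not drawn from any real business.

```python
# Illustrative, invented figures -- not from any real business.
revenue = 1_200_000.0            # annual revenue (used here as a proxy for credit sales)
cost_of_goods_sold = 700_000.0
net_income = 150_000.0
total_assets = 900_000.0
total_equity = 500_000.0
accounts_receivable = 200_000.0
inventory = 90_000.0

# Narrow frames: focused measures of specific activities.
avg_collection_period = 365 * accounts_receivable / revenue      # days to collect
avg_age_of_inventory = 365 * inventory / cost_of_goods_sold      # days inventory is held

# Broad frames: whole-business measures of wealth creation.
return_on_assets = net_income / total_assets
return_on_equity = net_income / total_equity

print(f"Average collection period: {avg_collection_period:.0f} days")
print(f"Average age of inventory:  {avg_age_of_inventory:.0f} days")
print(f"Return on assets:          {return_on_assets:.1%}")
print(f"Return on equity:          {return_on_equity:.1%}")
```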
FP&A is called upon to stimulate thinking and learning about how organizations earn revenues, incur expenses, and generate cash flows. Thinking and learning are processes that are subject to errors. Understanding the concept of framing can help FP&A practitioners minimize the effects of errors in their work.
By Rob Trippe, MBA, Financial Modelling Veteran
Financial model definitions can be tricky. Financial models are often dependent upon numerous functional areas and academic disciplines, such as accounting, finance and statistics. These disciplines may have differing uses of the same terminology. Model risk management has also drawn on numerous disciplines in its evolution. The result can be communicating at cross purposes.
No single academic discipline can lay claim to how a financial model's terminology is defined. A model's output is often either a corporate finance concept or an accounting concept, while the driving calculation process may be statistical. Therefore, terminology should be defined among developers, owners, and users as early as possible. A data dictionary may be a required element of financial models that fall under regulatory scrutiny; add a dictionary of model term definitions too. This effort can be spearheaded by the data manager. When financial model output will be compared to other figures, the definitions, both numerical and non-numerical, should be identical. Non-narrative depiction of a definition can be extremely useful, so show builds and flows when possible. Here are twenty financial modelling definitions worth memorizing and employing:
1 Back-testing
Use of historic data as a test of model output validity.
2 Benchmarking
The comparison of model output to the output of an outside and independent source.
3 Emerging Risk
Risk, unforeseeable at the outset, that arises later in time and in model execution.
4 FAST
A set of rules for financial model design: Flexible, Appropriate, Structured, and Transparent.
5 Impact Analysis
Assessment of the consequences of a model change for cost, timing, scope, and quality.
6 In Sample
Historical data used in model development.
7 Model
A quantitative method, system, or approach that applies statistical, economic, financial, or mathematical theories, techniques, and assumptions to process input data into quantitative estimates. Models provide an explanatory framework for real-world observations.
8 Out of Sample
Historical data not used in model development.
9 Outcomes Analysis
The comparison of model output to actual outcomes. Back-testing is one example.
10 Parameter
A numerical characteristic of a set or population of numbers.
11 Recalibration
Adjustment of data and/or assumptions.
12 Residual Risk
The risk remaining after a risk mitigation action has been performed.
13 Risk Appetite
The largest tolerable degree of uncertainty acceptable.
14 Scenario Analysis
Multiple changes to inputs, made together, to reflect a given set of circumstances.
15 Secondary Risk
Risk arising from a risk response.
16 Sensitivity Analysis
The change in model output relative to a change in a single input.
17 Stress Test
Assessment of model stability by employing hypothetical data inputs or drivers.
18 Threshold
A measure of uncertainty or impact worthy of attention.
19 Tolerance
The degree of deviation within which a model still functions properly.
20 Validation
A set of processes and activities intended to verify that a model performs as intended and as expected.
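To make a few of these terms concrete, here is a minimal Python sketch, with invented figures, that applies sensitivity analysis (16), scenario analysis (14), and back-testing (1) to a deliberately tiny revenue model; it is an illustration of the definitions, not a prescribed implementation.

```python
# A tiny, hypothetical revenue model: revenue = units * price.
def revenue_model(units: float, price: float) -> float:
    return units * price

base_units, base_price = 1_000.0, 50.0
base_output = revenue_model(base_units, base_price)

# Sensitivity analysis: change one input, hold the rest, watch the output.
shocked = revenue_model(base_units * 1.10, base_price)
sensitivity = (shocked - base_output) / base_output
print(f"+10% units -> {sensitivity:.0%} change in revenue")

# Scenario analysis: multiple inputs changed together to reflect one set
# of circumstances (here, a downturn: fewer units at a lower price).
downturn = revenue_model(base_units * 0.85, base_price * 0.95)
print(f"Downturn scenario revenue: {downturn:,.0f}")

# Back-testing: compare model output against historic actuals.
actuals = [48_500.0, 51_200.0, 49_800.0]
predictions = [50_000.0, 50_000.0, 50_000.0]
errors = [abs(a - p) / a for a, p in zip(actuals, predictions)]
print(f"Mean absolute error vs. history: {sum(errors) / len(errors):.1%}")
```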
By Michael Huthwaite, Founder and CEO at FinanceSeer LLC
The long-standing narrative of Enterprise Performance Management (EPM/CPM) has been squarely focused on the effort to steer organizations away from spreadsheets by embracing Enterprise Performance Management suites (i.e. platforms).
Yet, the dirty secret that is rarely spoken about is that most organizations continue to remain heavily reliant on spreadsheets even after spending huge sums of money on EPM solutions.
So, why are so many organizations still deeply dependent on spreadsheets? The answer to this question lies at the Edge.
Any network, whether a social network or a computer network, has activity occurring at both a centralized (Core) level and a decentralized (Edge) level. This is because data lives in the Core, but it is often conceived at the Edge.
Information that is managed at the Core level tends to represent data that can be highly leveraged (i.e. reused or accessed by others). Examples of Core data might be the latest forecast for an established product line or business unit.
Conversely, Edge level data tends to represent high-growth opportunities, which are often the lifeblood of your organization's future. An example of Edge level data might be the evaluation of a new investment or acquisition opportunity, or the risk assessment of a big swing in the market (a proverbial black swan).
Despite the distinct differences between Core and Edge, you’ll never eliminate the symbiotic relationship that they share. Therefore, in order to optimize the overall network, it’s best that organizations take a holistic view by addressing the need to harmonize both the Core and Edge.
The concept of “core and edge” has been well documented for decades. The notion exists in many technical areas, including, but not limited to Enterprise Performance Management.
Recent examples of "core and edge" harmonization can be seen in the rise of Mobile, the Internet of Things (IoT), and Wearable technology. In all three cases, it's the people and devices operating at the Edge that create incremental value, while the centralized Core infrastructure acts to leverage or unify an established ecosystem. In my opinion, this helps explain why, as individuals, we are often hopelessly addicted to our devices and mobile apps: our brains pick up on the perceived incremental value we create at the Edge level, while the Core level helps organize and maintain our lives over time through robust centralized infrastructure.
In general, the telltale signs of Core and Edge look like this:
At the Core level, technologies are often marketed as "platforms" and require a great deal of IT involvement and specialized administration to operate. These solutions tend to have a higher number of users, remain relatively static in their configuration, and focus to a large degree on database storage (often associated with a "single version of the truth").
Conversely, Edge level applications are just that: applications. They are not platforms per se, but self-service solutions that support small teams of individuals working together to drive new ideas. Edge level applications are typically installed locally (much like a smartphone app or a desktop application). They are not web portals that let users enter and submit data, but highly focused solutions that let end-users evaluate and try alternative configurations in order to maximize value creation.
In addition to having full control of these Edge level applications, end-users must also have the ability to freely share their ideas and findings among their small team of peers in order to build consensus. The speed and fluidity at which these business processes occur mean that IT and System Admins aren’t directly involved, but their ability to indirectly enforce governance must remain in place, if the ecosystem is to thrive.
Edge level data differs from Core level data because the business processes occurring at the edge simply don’t require data to be centrally stored or accessed at such an early stage. Rather, the speed and flexibility of the data is what enables end-users to think more creatively and begin the process of building credibility around that data.
Integration is quite literally the silent partner in the Core and Edge Paradigm. Integration is often, but not always, managed by IT or System Admins and often requires stronger technical skills that are not necessary for end users. A great deal of intelligence and validation must go into establishing a strong integration approach. Integration includes data mapping, but should also include a broader range of communication between the Core and Edge.
As the proliferation of Core and Edge increases, I think we'll start to see more integration-level concepts flourish, such as Artificial Intelligence (AI) and Predictive Analytics, where information captured at the Core is suggested to devices at the Edge, which in turn enables end-users to take action. For example, when Google Maps tells you that there is traffic on your current route, that is a great example of tight integration between Core and Edge: slow-moving traffic is captured on mobile devices, sent to Google's Core servers, and then passed back to the Edge devices of other users heading along the same route. Those motorists can then elect to take a different route or ignore the suggestion altogether.
The Core and Edge Paradigm is prevalent in all areas of business planning.
This is the one area of Business Planning and Analytics with a strong record of embracing the Core and Edge Paradigm. Self-service solutions that capture data and empower end-users to visualize it, running their own unique queries and applying their own custom filters and formatting so the data can be fully discussed and debated, are quite prevalent in the market today.
As a result, we should look to these solutions as examples of how businesses can begin to adopt better Edge level applications for business planning. Sure, the technology won't be the same across all areas of business planning, but there is a great deal business planning can learn from Edge Analytics to move the needle forward.
Financial Planning at the Edge
I think it's fair to say that everyone has a natural disdain for Budgeting. It takes up too much time and is outdated the minute it is published. So maybe we shouldn't give Budgeting all the credit in the world, but does that mean we should completely abolish it?
This debate reminds me of my days as a student, when I learned that the Income Statement is based on accrual accounting, not cash. I was shocked, but does that mean we should abolish the Income Statement? Probably not, but we should recognize it for what it is and what it isn't.
What is important is the need to set a unified plan (Core level functionality) while at the same time empowering individuals and small teams to recognize whether the ongoing market turbulence is significant enough to warrant a reforecast or not (Edge level functionality).
Having the proper discussion and debate at the Edge level can help to regroup the individuals and small teams responsible for sounding the reforecasting alarm. This saves the organization a ton of effort by enabling unaffected parts of the business to continue on without getting caught up in lengthy and continuous planning exercises.
Some people may advocate for a Rolling Forecast approach, which initially appears to eliminate the Core and Edge Paradigm. However, based on my experience, it probably doesn't. Rather, this middle-ground approach is probably a significant reason why so many Rolling Forecast initiatives don't achieve the level of success they hoped for.
Operational Planning at the Edge
Spreadsheets are currently responsible for a lot of the analysis that end-users perform in the realm of Operational Planning. To be fair, a good deal of these spreadsheets should probably be replaced with Core level technologies. This would give organizations more "Connected Planning" capabilities and allow for better horizontal planning. But it will still never eliminate the need for Operational Planning at the Edge.
At the "task" level, for example, there will always be a need for individuals and small teams to tie their daily activities to their personal and team operational targets. This could be achieved with greater integration with numerous Edge level applications that would be managed and updated by end-users (not System Admins).
Strategic Planning at the Edge
Strategic Planning is an exercise that is largely performed at the Edge. This is why most companies continue to rely so heavily on spreadsheets for Strategic Planning, regardless of their investment in EPM.
The need to evaluate competing strategies (concurrent) or combine alternative scenario variations together (inclusive) is not the sweet spot of EPM. These days, every EPM suite is quick to refer to its solutions as "models," but does that mean end-users are experiencing greater "modeling" capabilities? No. Modeling is an Edge level activity that requires rapid reconfiguration of models in order to screen for potential business opportunities.
Once the strategic plan is set, the need to execute strategy (using Strategy Management tools and methodologies) switches the conversation back to Core level functionality.
If the Core and Edge Paradigm were so simple to achieve, the EPM market would already contain a wide variety of Core and Edge solutions to choose from (which it clearly does not). So, what is holding us back?
I think the answer cannot be pinned on a single reason; rather, it's a shared problem that needs to be addressed across business users, IT, and solution vendors.
Business users waste an enormous amount of time trying to handle Edge-related business processes with inadequate tools. Sure, spreadsheets, presentation tools and other personal productivity applications can enable business users to get the job done, but at what cost? Business users need to recognize the advantage of Edge level solutions and start creating greater demand for these alternatives.
Like spreadsheets, Edge level solutions put end-users in control of their business processes, but they can do it in a way that provides more integrity and pre-built intelligence that will save end-users an immense amount of time and effort.
IT has numerous considerations to take into account when dealing with the purchase of new software technologies. ROI, security and ongoing support are just a few of the most common considerations.
As a result, it’s natural for IT to want to limit the number of solutions. This makes perfect sense when dealing with “point solutions”. Point Solutions are software applications that address specific problems without trying to address the concerns of any related or adjacent business processes. This creates disjointed processes that often increase both total cost of ownership and end-user inefficiencies.
However, it's important to realize that Edge level solutions are not point solutions. Rather, they are solutions that operate on different architectural levels and, as a result, actually complement each other by furthering capabilities that cannot be optimally achieved by a single-level architecture.
Therefore, it's important for IT to consider a multi-level architecture (Core and Edge) as part of their formal IT policy (sometimes referred to as Line of Business applications). Failing to do so will only limit creativity and increase reliance on generic personal productivity tools.
If any blame is to be placed for the lack of Edge level solutions, I would put the onus squarely on software vendors. Their natural tendency to focus on database-driven technology and old Enterprise Sales models skews their view of the Core and Edge Paradigm. As a result, they are too heavily focused on defending their existing Core level solutions rather than embracing and actively developing Edge solutions and opening secure APIs to third-party Edge applications.
Traditional on-premises vendors often treat Edge level requirements as ad hoc activities that can be addressed using Office add-ins. However, Edge level applications are not one-time business processes.
Cloud-based solutions are often quick to point out their intuitive UIs, which make it easier for users to perform logic changes, yet their centralized architecture still addresses Core functionality. Just because there is less of a barrier to making logic changes doesn't mean that all end-users are free to do so.
As I pointed out in the beginning, the result of not addressing Edge level architecture is that spreadsheets continue to have a strong foothold in Business Planning and Analytics.
Yet reliance on spreadsheets is not the optimal solution for end-users, IT, or software vendors.
End-users will still suffer from inefficient and error-prone spreadsheets, causing a tremendous amount of wasted time.
IT will continue to lack the security and governance that spreadsheets cannot provide.
Software vendors will never fully realize the universal adoption of EPM/CPM that they have been promising for the past 20+ years.
This problem will only truly be solved when end-users, IT and software vendors come together and establish a balanced response to dealing with both the Core and the Edge.
Like most initiatives, it all starts with awareness. End-users need to do a better job of articulating their pain points. IT needs to clearly establish its policy on "core and edge." And software vendors need to challenge themselves to develop safe and secure technology that clearly addresses both the Core and the Edge, and to stop relying on slick marketing messages to pull the wool over their customers' eyes.
Until then, I'm afraid the spreadsheet will remain an integral and error-prone element of our Business Planning process.
By James Myers, Global Finance Executive and Finance Transformation Consultant
It’s the middle of the year and it’s time to take charge of your “data destiny” before the budgeting and planning season starts. What exactly is your data destiny? No, it’s not the new Netrunner card game where the objective is to control all the data in the Universe.
It's understanding how you can leverage all your internal and external financial and operational data to gain a strategic advantage. Relevant data is the key to understanding it all, but too often that data is dispersed throughout your organization.
Globally, 39% of CFOs are pursuing a significant upgrade in their company's information, data, and communications systems in 2017 (1). This trend is real, and I'm already seeing companies move swiftly to take advantage of new BI trends and hire more data-centric talent into their Finance teams. More than a "trend," it's a "shift."
FP&A's data maturity model is a step-by-step process to help you move along the value/maturity curve until you achieve your ultimate goal of game-changing data analytics.
Step 1: Aggregation = Single source
Having all your data in one place is the first major step toward data nirvana. This often requires a lot of IT heavy lifting, but it doesn't have to. There is new technology out there that will help, but my advice is to start small: find the most critical data and focus on that, running through all the steps before onboarding the next set of data. This will enable you to prove out the solution and demonstrate its value quickly, and once you start adding value it will be easier to find more investment.
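As a rough illustration of starting small, the following Python sketch consolidates two hypothetical source extracts into a single file; the file and column names are placeholders, and a real pipeline would add validation and scheduling.

```python
import pandas as pd

# Hypothetical file names -- placeholders for your most critical data extracts.
sources = ["erp_revenue.csv", "crm_bookings.csv"]

frames = []
for path in sources:
    df = pd.read_csv(path)
    df["source_file"] = path   # keep lineage so every figure can be traced back
    frames.append(df)

# One table, one place: the starting point for every later step.
combined = pd.concat(frames, ignore_index=True)
combined.to_csv("single_source.csv", index=False)
```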
Step 2: Accuracy = Strong executive support
This step is all about agreeing on terminology and rules. It relies heavily on all the stakeholders agreeing to a single naming convention, methodology, and taxonomy; it's all about standardization. In mature, large companies this can be one of the hardest tasks, as it often requires change management. To ensure this part succeeds you will need strong executive support. You will also need to validate your data and bring any offline logic into the system. Accuracy is key, as there is nothing worse than spending more time debating the numbers themselves than their meaning. You will need everyone bought into the idea that your single source of data is the only source; otherwise you will fail in the next steps.
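Here is a minimal sketch of what enforcing an agreed naming convention can look like in practice, assuming a purely hypothetical taxonomy; the point is that anything outside the agreed terms fails loudly instead of slipping into your single source.

```python
# Hypothetical mapping from legacy field names to the agreed taxonomy.
CANONICAL_NAMES = {
    "rev": "revenue",
    "revenue_usd": "revenue",
    "hc": "headcount",
    "head_count": "headcount",
}
AGREED_TAXONOMY = {"revenue", "headcount"}

def standardize(record: dict) -> dict:
    """Rename fields to the agreed taxonomy; reject anything undefined."""
    out = {}
    for key, value in record.items():
        canonical = CANONICAL_NAMES.get(key, key)
        if canonical not in AGREED_TAXONOMY:
            raise ValueError(f"'{key}' is not in the agreed taxonomy")
        out[canonical] = value
    return out

print(standardize({"rev": 125000, "head_count": 42}))
# {'revenue': 125000, 'headcount': 42}
```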
Step 3: Access = Mastering dashboards
This is typically the easiest step because of the development of powerful BI tools, but unless there is strong governance it can easily get out of hand (see Mastering Dashboards to Build Value). Good dashboards are designed to drive strategic initiatives into the company and to ensure users can identify issues quickly and know what corrective actions to take. Try to avoid dashboard designs that result in confusion, in groups competing against each other, or, more commonly, in dashboards simply being ignored. Security is key at this stage: making sure that the right users have access to the right data at the right time.
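As a minimal sketch of the "right users, right data" principle, dashboard access can be modeled as an explicit role-to-dataset mapping; the roles and datasets below are hypothetical.

```python
# Hypothetical role-to-dataset grants; real systems would live in the BI tool.
ACCESS = {
    "sales_ops": {"pipeline", "bookings"},
    "fpa": {"pipeline", "bookings", "opex", "headcount"},
}

def can_view(role: str, dataset: str) -> bool:
    """True only when the role has been explicitly granted the dataset."""
    return dataset in ACCESS.get(role, set())

assert can_view("fpa", "opex")          # FP&A sees spend data
assert not can_view("sales_ops", "opex")  # Sales Ops does not
```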
Step 4: Actionable = Think!
Building dashboards that keep us informed is interesting but doesn't add a whole lot of value. Take revenue, for example: this is a typical metric you will see on any dashboard, e.g. how much we have booked to date, how we are performing against targets, or year-over-year growth. What we really want is to drive actions into the organization. The best way to do that is to focus on the underlying drivers of revenue, e.g. your pipeline conversion rate, your pipeline coverage, or the impact of your marketing promotions.
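For illustration, here is a minimal Python sketch that computes two of the driver metrics mentioned above, pipeline conversion rate and pipeline coverage, from invented figures; exact definitions of these metrics vary by organization.

```python
# Invented figures -- placeholders, not real data.
open_pipeline = 4_000_000.0      # open qualified pipeline today
bookings = 1_000_000.0           # closed-won in the period
pipeline_created = 2_500_000.0   # new pipeline generated in the period
revenue_target = 1_500_000.0     # target for the coming period

# Drivers of revenue, not revenue itself.
conversion_rate = bookings / pipeline_created  # share of new pipeline that converts
coverage = open_pipeline / revenue_target      # pipeline relative to target

print(f"Pipeline conversion rate: {conversion_rate:.0%}")
print(f"Pipeline coverage: {coverage:.1f}x")
```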
Step 5: Accountability = Support
The key to driving insight is starting with the business problem. Too often I see people bombarding their stakeholders with data, normally in the form of 70-page PowerPoint decks with detailed tables and 11pt type. That only alienates them. Take a step back and take the time to understand your stakeholders and what's really important to them. Spend time understanding the drivers behind this, and create dashboards that give your stakeholders key insights that help them see the outcomes of their actions. You'll know this support is working when they come back and ask for more.
This is the time for data leadership. Discerning what your data means and communicating it to the organization in a meaningful way is one of the most important jobs you will ever do! Use FP&A's Data Maturity Model to accelerate your FP&A transformation. The time to get ahead of this is right now.
1. The CFO Alliance, 2017 CFO Sentiment Study: 39% of respondents reported they will pursue a significant upgrade in their company's information, data, and communications systems in 2017, up from 29% in 2016.
Copyright, ©2017 fpa-trends.com. All rights reserved.