The cost of too much data

By Emmanuel Jibodu, FP&A Manager at Stericycle Inc.

"We are drowning in data but starving for actionable insight"

- Unknown                                                                                    

The Financial Planning & Analysis (FP&A) function has become more prominent in the decision-making process at most large organizations because of the emergence of big data (large volumes of structured and unstructured data). Today, most companies have access to vast amounts of financial and operational data and must figure out how to use it to drive growth and profitability.

The ability of organizations to synthesize the large volumes of data in their possession was expected to be a source of competitive advantage. Access to that data was expected to make it possible to identify trends, spot risks, pinpoint causal relationships, and recognize historical patterns that affect an organization's performance.

Today, the ability of FP&A teams to provide actionable insights to senior executives is of paramount importance as a result of global competitive pressures and fast-moving markets. Big data was expected to transform the work of FP&A professionals involved with organizations' budgeting, forecasting, and planning cycles: to improve forecasting accuracy, add agility to analytical processes, and make forecasting methodologies more dynamic.

You might be reading this hoping it is not another article on the big data phenomenon. Most people have been inundated with documentaries, articles, and news stories about big data. Data, whether financial or operational, has to be organized, analyzed, and synthesized before it can tell a story that provides actionable insight. Simply providing data to business executives does not optimize the decision-making process for them, and the emergence of big data does not necessarily mean organizations are extracting the value that can be derived from it. Outlined below are four tenets to espouse in order to avoid being bogged down by too much data during an analytical process:

  • Thesis
  • Simplicity
  • Qualitative Factors
  • Perfection is NOT the goal


Thesis

At a university or college, a paper or project must have a thesis, or it is likely to be convoluted. Likewise, it is prudent to have a clear idea of what the focus of an analysis is. Analysis without focus is likely to consume too much time and fail to provide actionable insight upon its conclusion. A few questions analysts could ask themselves before conducting analysis are: What is the point of this analysis? What questions should it answer? Who is going to use it? In most finance teams, financial analyses focus primarily on questions such as: What activities should be funded? How much funding should an activity consume? Should something be sold, acquired, or developed? Should a product or service be eliminated?


Simplicity

The analysis of financial and operational data has to be comprehensive and contain sufficient detail for it to be validated. However, it is common for analysts to become engrossed in complex concepts and models during the course of their work. For example, the revenues of a company (or business unit, product line, or geographic market) are likely to be affected by drivers. These drivers could include macroeconomic indicators such as GDP growth, the unemployment rate, the inflation rate, interest rates, and housing starts. When building a model that incorporates the effect these indicators have on sales, it is prudent to identify the three or four indicators that markedly affect sales. Using too many indicators risks incorporating ones that correlate with sales but have a weak causal relationship, and insight derived from weak drivers is unlikely to increase enterprise value if acted upon.
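The screening described above can be sketched in a few lines of Python. The data below is entirely synthetic and illustrative (the article names the candidate indicators but supplies no figures): sales are generated so that GDP growth and the unemployment rate are the genuine drivers, while inflation and housing starts are noise. Ranking candidates by the strength of their correlation with sales is one simple, assumed screening method, not the article's prescribed technique.

```python
import numpy as np

# Hypothetical quarterly history: sales plus four candidate macro drivers.
# All figures are illustrative assumptions, not data from the article.
rng = np.random.default_rng(0)
n = 200
gdp_growth = rng.normal(2.0, 0.5, n)
unemployment = rng.normal(5.0, 0.8, n)
inflation = rng.normal(2.5, 0.6, n)
housing_starts = rng.normal(1.2, 0.2, n)

# By construction, only GDP growth and unemployment actually move sales.
sales = 100 + 8 * gdp_growth - 5 * unemployment + rng.normal(0, 1.5, n)

drivers = {
    "gdp_growth": gdp_growth,
    "unemployment": unemployment,
    "inflation": inflation,
    "housing_starts": housing_starts,
}

# Rank candidate drivers by the absolute correlation with sales,
# then keep only the strongest few for the model.
correlations = {
    name: float(np.corrcoef(series, sales)[0, 1])
    for name, series in drivers.items()
}
ranked = sorted(correlations, key=lambda k: abs(correlations[k]), reverse=True)
strong_drivers = ranked[:2]
print(strong_drivers)
```

A screen like this only flags statistical association; as the paragraph above notes, the analyst still has to judge whether the relationship is causal before building the driver into a forecast.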


Qualitative Factors

Organizations usually require business cases centred on numbers during a decision-making process. Certain parameters are used to determine whether a project or capital expenditure will increase enterprise value. Analysts use tools such as Net Present Value (NPV), Discounted Cash Flow (DCF) models, and hurdle rates to optimize decision-making for executives and managers. However, it is advisable to consider the qualitative factors that could affect a project before performing the mathematical calculations. Qualitative factors provide context to quantitative information and can serve to illuminate the causal relationships, patterns, and trends in the numbers.
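For readers less familiar with the NPV calculation mentioned above, here is a minimal sketch. The cash flows and the 10% hurdle rate are assumptions chosen for illustration, not figures from the article: a project clears the hurdle when the sum of its cash flows, each discounted back to today at the hurdle rate, is positive.

```python
# Hypothetical project: a 1,000 outlay today followed by four years of inflows.
# All figures, including the 10% hurdle rate, are illustrative assumptions.
cash_flows = [-1000.0, 300.0, 400.0, 500.0, 200.0]  # index = year
hurdle_rate = 0.10

# Discount each year's cash flow at the hurdle rate and sum.
npv = sum(cf / (1 + hurdle_rate) ** t for t, cf in enumerate(cash_flows))
print(round(npv, 2))  # positive NPV -> project clears the hurdle
```

The arithmetic is mechanical; the point of the paragraph above is that the qualitative judgment behind those projected cash flows deserves attention before the discounting is done.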


Perfection is NOT the goal

Financial analysis is about being good, not perfect. Attributes of good analysis include timeliness, accuracy, and practicality. Algorithms, models, drivers, and variables are unlikely to forecast future business performance precisely, with no variation. Deviations from expected performance are likely to occur because corporations operate in a global environment fraught with exchange rate fluctuations, hyper-competition, volatile commodity and capital markets, and regulatory pressures.

The synthesis of the enormous amounts of data organizations have access to could potentially be a source of competitive advantage. However, analytics teams should not expect that simply mining massive amounts of data will generate actionable insights. Actionable insights, when effectively acted upon by executives, can serve to increase stakeholder wealth.
