Michael Huthwaite

Passionate advisor and thought leader in the area of Strategic Finance. A seasoned consultant with 20 years of experience, Mike has helped numerous mid-market and large multinational organizations enhance their strategic planning capabilities.

Mike is an advocate for Cash Flow and Discounted Cash Flow (DCF) analytics. He is a regular speaker and blogger on the topics of uncertainty, Real Options (advanced scenario analysis) and various financial modeling technologies.


Closing the Strategy Gap for Good

By Michael J. Huthwaite, Founder and CEO of FinanceSeer LLC 

For many organizations, the strategy gap is a major obstacle that systematically prevents businesses from truly maximizing their Strategic Planning efforts and sustainably creating value for their organization. 

So, what are the underlying causes that routinely lead us down this same perilous path year after year? Should all the blame be placed on competitive and/or volatile markets? Or are we failing to apply the right management processes to a recurring and well-known business problem?

I believe the answer to this question lies in how we choose to view and embrace uncertainty.  

By changing our attitude and approach to uncertainty, perhaps we can produce both meaningful and agile strategies that can actually be executed each and every year, thus eliminating the strategy gap as we know it once and for all.

The Fear of Uncertainty

The natural inclination for anything we fear is to avoid it, and such is the case with uncertainty.

Rather than embrace uncertainty, we often throw our hands up or, worse, provide mindless projections (e.g. escalation rates) that no one believes in or takes seriously.

This default approach to uncertainty not only limits our thinking, it also limits our business potential.

Understanding Uncertainty

So, what is uncertainty? The mistake many people make when defining uncertainty is to take the prefix “un” to mean “not”. Under this thinking, “uncertainty” would mean “not certain”, which would equate to a blanket statement for anything that is not known.

In reality, the prefix “un” in uncertainty means “a lack of”, or an absence of, certainty. This provides a much more meaningful definition, as it establishes a range or spectrum of predictability.

What lies in-between certainty and uncertainty is what we call probability (that which has some measurable degree of certainty).

The Window of Probability

In FP&A terms, this spectrum can also be associated with a time range, what I like to call a “window of probability”.

The window of probability may differ by organization, but for most it typically equates to a 12-18 month view into the future. Everything beyond that tends to look too murky to predict.

Of course, it’s possible for the window to grow and shrink from year to year, but for the most part, healthy organizations and the markets they operate in tend to keep the size of this window constant.

But regardless of the size of the window, what’s important to realize is that for FP&A, the window of probability is where the fertile ground is located. This is because probability represents measurable risk; anything that is measurable is manageable, and process management is what FP&A does best.

By building up strong command-and-control processes, FP&A can work to squeeze out as much performance as possible, all the while keeping the tolerance for variability low.

Understanding the Value of Uncertainty

As we attempt to move out into longer-term planning (strategic planning), we tend to find ourselves operating squarely in the realm of uncertainty. 

Uncertainty is a completely different beast than probability. Uncertainty is by definition impossible to predict and is always changing, like a computer-generated password that is re-scrambled continuously.

But uncertainty is also where the majority of the value lies. Ignoring uncertainty, or trying to address it with the same processes and tools used for the window of probability, is where most organizations falter.

In short, uncertainty is where strategy is developed.  Probability is where strategy is executed. 

How should organizations approach uncertainty?

How organizations approach uncertainty is the key to building meaningful and attainable goals.  Below are 11 ways in which organizations can pivot from a predictable world to a world of uncertainty. 

1 - Switch to Screening for Real Options

Uncertainty is not only impossible to predict, but it is also continuously changing. Therefore, the best way to approach uncertainty is by switching to Real Options. A Real Option often represents the potential outcome of an entire strategic business unit (SBU) or asset. This is different from traditional planning, where the focus is on line-item forecasting. Real Options are particularly valuable because you’re always screening/testing multiple options at a time in order to understand which direction is the best way to proceed.

2 - Develop Multiple Views of the Future

In addition to screening for Real Options, it’s important to maintain multiple views of the future as well. A simple way to do this is to evaluate different Real Options under different risk scenarios (expected, downturn, deep downturn, etc.). Having this point of view will enable you to make strategic choices that will still pan out even if the economy takes a downturn in the future. A sketch of this kind of screening follows.
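
To make this concrete, here is a minimal sketch in Python. The option names, cash flows and 10% discount rate are all invented for illustration, not drawn from any real plan; the point is simply that every option gets valued under every view of the future before any commitment is made:

```python
# Toy screen of hypothetical real options under three risk scenarios.
# All names, cash flows and the discount rate are illustrative assumptions.

def npv(cash_flows, rate=0.10):
    """Net present value of year-end cash flows at a flat discount rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

# Projected annual cash flows (in $M) per option, per view of the future.
options = {
    "expand_plant":     {"expected": [10, 14, 18], "downturn": [6, 8, 10], "deep_downturn": [2, 3, 4]},
    "enter_new_market": {"expected": [5, 20, 35],  "downturn": [2, 6, 12], "deep_downturn": [-4, -2, 1]},
    "acquire_rival":    {"expected": [8, 12, 15],  "downturn": [5, 9, 12], "deep_downturn": [3, 5, 7]},
}

# Screen every option under every scenario; favor options that still
# pan out in a deep downturn, not just the best expected case.
for name, scenarios in options.items():
    values = {s: round(npv(cfs), 1) for s, cfs in scenarios.items()}
    verdict = "robust" if min(values.values()) > 0 else "fragile"
    print(f"{name}: {values} -> {verdict}")
```

Under these made-up numbers, the option with the highest expected value (entering the new market) is also the only one that destroys value in a deep downturn, which is exactly the kind of tradeoff this screening is meant to surface.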

3 - Focus on Moving Constraints

The upside to uncertainty is that you often have several years in front of you. We need to focus that time on breaking down barriers for ourselves and creating obstacles for our competitors. This is what strategy is all about. Short-term tactical planning benefits from a greater level of probability, but it is virtually impossible to move constraints in the short term.

4 - Look for Long-Term Value

Organizations can create value for their shareholders if they are able to take advantage of uncertainty.

Efficiency can help grow the bottom line temporarily, but it’s not where the sustainable value is created.

5 – Focus on Calibration

One of the biggest challenges when trying to embrace uncertainty is understanding what level of detail is necessary to help guide you. Unfortunately, there isn’t a magic answer to this question, but organizations must continuously look to optimize their models in order to stay relevant.

Resist the urge to collect more data in a failed attempt to eliminate doubt.

6 – Know what you are Measuring

Uncertainty represents an opportunity for organizations to deliver world class strategy and that should always be tied back to shareholder value (share price).

Conversely, Earnings represent a partial view of the world and take the focus away from making strategic investments. As a result, Earnings are a far better measure for tactical planning.

7 – Be Ready to Tell Your Story

Uncertainty is all about making strategic investments and measuring cash flow over time. 

Don’t rely on KPIs, which can be misleading: they rarely take into account the time value of money and often rely heavily on accounting treatments. The sketch below shows how two identical-looking totals can hide very different values.
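
Here is a minimal sketch of the underlying arithmetic. The two cash-flow streams are invented for illustration; an undiscounted total (the kind of figure many KPIs reduce to) scores them identically, while a simple discounted view does not:

```python
# Two projects with identical undiscounted totals but different timing.
# All figures are illustrative assumptions.

def npv(cash_flows, rate=0.10):
    """Discount year-end cash flows at a flat 10% rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

early_cash = [60, 30, 10]  # cash arrives early
late_cash  = [10, 30, 60]  # same total, cash arrives late

print(sum(early_cash), sum(late_cash))  # 100 100 -- an undiscounted KPI sees a tie
print(round(npv(early_cash), 1))        # 86.9
print(round(npv(late_cash), 1))         # 79.0 -- worth less once time is priced in
```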

8 – Pick the right tools

In order to address uncertainty, you need tools that are not only agile but also promote discussion and debate around alternatives. This usually means “Edge”-based tools.

“Core” tools are great for centralizing the discussion and are therefore optimal for tactical planning, but these solutions are often limited when it comes to dealing with uncertainty.

9 – Adopt a Culture of Collaboration

The best way to address uncertainty is by enabling your peers to “suggest” alternatives.  This is a critical form of collaboration under uncertainty because it is the key to promoting discussion and debate around alternative perspectives. 

This is the opposite approach to “submission” based planning where alternative ideas are excluded from the process. 

10 – Become an Influencer

Uncertainty favors individuals who have the skills to successfully influence their peers and help create lasting value for their organizations.

Sure, we’re all familiar with presenting our findings, but under uncertainty, the real work happens long before the presentation starts.

11 – Question the Status Quo

Sometimes the best approach to uncertainty is to ask questions.  Asking questions can help uncover problems and ultimately help teams produce optimal plans under uncertainty.

Spending all your time simply justifying outcomes can be a dangerous approach when dealing with uncertainty as it often restricts future agility and cohesion.

Conclusion

Closing the strategy gap for good requires organizations to put forth meaningful and attainable strategic targets that inspire and reward those who are responsible for their execution. However, it also requires building management processes that can navigate uncertainty. Failing to identify uncertainty, or hiding from it, will cause organizations to lose sight of what is really important.

The article was first published in Prevero Blog.

Drawing the Line Between Strategic and Tactical Planning

By Michael J. Huthwaite, Founder and CEO of FinanceSeer LLC 

Driver-based planning and rolling forecasts have been two of the hottest subjects in FP&A in recent years. Fueled by the need for directional insight and agility, most FP&A teams now consider these modeling techniques to be must-haves for their organization.

Despite the fact that these concepts represent obvious choices for better predictive planning, many companies are still finding it difficult to implement them successfully. This is troubling, especially as the next frontier of prescriptive (AI) planning is already upon us. Falling too far behind could be a real problem.

So why are so many companies still struggling to master driver-based planning and rolling forecasts?

Dialing in the right level of calibration

A common problem many organizations encounter when implementing driver-based planning/rolling forecasts has to do with establishing the right level of calibration. Organizations often want enough sophistication to take action, yet at the same time they don’t want to take on too much complexity for fear it will reduce their ability to remain agile or run rapid what-if scenarios.
As a result, FP&A teams often fall into Goldilocks thinking, where the model shouldn’t be too complex yet not too high-level either. Unfortunately, this can be the worst place to be, as it’s not strategic enough to grow the business yet not tactical enough to manage the business moving forward.

Standing in the middle of the court

The best way for organizations to avoid standing in the middle of the court (a no-no in tennis) is by clearly drawing the line between Strategic and Tactical planning.
The better we are able to detect this line, the better we are able to avoid straddling it. Of course, this doesn’t mean that we should focus on one approach over the other. Nor does it mean that we can forgo strong integration between Strategic and Tactical planning. Rather, what is critical to realize is that trying to use a one-size-fits-all solution for two distinct planning processes is invariably going to limit your capability and efficiency over time.
By segmenting out Strategic and Tactical planning, organizations can focus on addressing the people, processes and technology that are uniquely different between the two, regardless of the fact that both approaches rely on driver-based planning and rolling forecast constructs.

Two-model approach is the norm

For organizations thinking about implementing a new driver-based/rolling forecast solution, it’s important to realize that most companies continue to maintain separate tactical and strategic models regardless of the technology available to them. Trying to move towards a single-model approach is simply not the norm, despite suggestions to the contrary from experts and software vendors.
Some software vendors will go as far as to suggest that organizations can add Strategic Planning later, once they have successfully implemented tactical planning. However, based on my experience, these organizations rarely realize any meaningful benefit down the road.

The real differences between strategic and tactical planning

The number of time periods (duration) and the amount of line item detail are two obvious visible differences between strategic and tactical models.  However, it’s important to realize that these are merely the visible effects of two completely different processes.
Here are 5 concrete signs you should look out for when drawing the line between optimal Strategic and Tactical planning.

1 – When you’re looking to grow value, not find efficiency

True strategy is all about exploiting new markets and opportunities by breaking down boundaries and creating barriers for others to follow. This is what shareholder value creation is all about: establishing higher cash flow returns than your competitors and maintaining that advantage over longer periods of time.
On the other hand, Tactical planning is all about hitting or exceeding targets based on the limited resources available to you in the most cost-efficient way possible. It’s about managing issues as they come up and executing on the strategy in an efficient manner.

2 – When you need to evaluate multiple outcomes simultaneously

Strategy is not about planning for a single outcome; rather, it’s about opening the discussion up to multiple opportunities and tradeoffs and then using that data to influence peers through discussion and debate.
On the other hand, Tactical planning may require multiple reforecasts over time, but the objective should remain singular in nature (hit the target with a high degree of certainty).

3 - When the underlying strategy or approach is highly unique

When you’re forced to copy another company’s strategy under the same constraints with no meaningful differentiators, then your only hope of winning is via efficiency.
This is by definition another form of tactical planning. Rather than using your own Strategic plan, you’re essentially borrowing the strategic plan of your competitor. In this case, you don’t need to think strategically at all. You simply need to focus on execution, and there is a big difference between the two. Unfortunately, too many companies rely on operational efficiencies to run their business.

4 - When the workflow rules are not pre-defined

Strategy is about influencing others based on your own perspectives and beliefs.  It’s about building consensus across peers, which requires free-flowing discussion and debate around alternative ideas and scenarios.  In order for this process to occur, users must be free to establish their own workflow rules in an ad-hoc and highly collaborative way.  The workflow pattern in Strategic planning often resembles a network map (pattern) where distinct peer teams interact with other peer teams in order to gather critical insight and achieve buy-in in order to establish a unified Strategic plan.
Tactical planning tends to leverage pre-defined workflow rules that often mirror hierarchical org charts within the organization.   Furthermore, Tactical planning is heavily reliant on the submission/approval process as part of an overall highly orchestrated and rigid communication process. 

5 – When you’re dealing with uncertainty not probability 

Strategic planning is full of uncertainty, yet literally anything is possible with the right level of investment, effort and time. This is quite different from probability, where resource limitations reduce the number of potential outcomes, enabling modelers to think in terms of either discrete or continuous distributions.
What makes planning under uncertainty so unique is that it is best evaluated based on an effective discussion and debate across a series of alternative outcome combinations.  This requires not only having the ability to quickly reconfigure alternative scenarios, but it also requires modelers to invite peers and subject matter experts to suggest alternatives and/or provide critical feedback to weed out risky or less advantageous options.  This is the cornerstone of a highly functioning strategic planning process.  
On the other hand, Tactical planning benefits greatly from assigning probabilities to short-term targets.  This helps manage short-term risk and enables companies to apply the proper short-term resources most efficiently.      

Conclusion

For organizations looking to get the most out of their driver-based planning and rolling forecasting initiatives, it is critical to realize that these terms apply in both Strategic and Tactical planning. Yet the people, processes and technology applied to these two domains are quite different. Developing a one-size-fits-all answer to these distinct areas is where many FP&A organizations go wrong.
Creating an environment that incorporates both strategic and tactical planning as separate, yet highly integrated processes is how organizations get the most out of their financial planning efforts.

 

The article was first published in Prevero Blog.

Seven Essential Tips for Strategic Finance

By Michael J. Huthwaite, Founder and CEO of FinanceSeer LLC 

Over the past 15 years, I have had the amazing opportunity to work with numerous Corporate Finance teams from around the world in an effort to help them get the most out of their strategic modeling practices. Over this time, I have uncovered a common pattern of recurring misconceptions and pitfalls that I believe routinely inhibit companies from maximizing their strategic modeling capabilities.

In hopes of eliminating these common mistakes, I want to share with you a presentation that I often give to individuals looking to enhance their way of thinking about the very important area of Strategic Finance. 

#1 — Why the strategy SHOULD hang in the balance

I tend to find that although most people can probably agree on a definition for “Strategic Planning”, agreeing on what form of analysis should be performed is often a bit more contentious. 

That's because, for most people, Strategic Planning is characterized as a “Strategic Guidance” exercise, where the focus is on performing market research, evaluating benchmarks/comps, understanding competitive advantages and analyzing a business’s strengths and weaknesses.  Typical owners of an organization’s Strategy Guidance document might be the Chief Strategy Officer (CSO) or even the CEO. These people tend to have deep knowledge in their industry and might use any number of tools or services to help them develop their guidance documents. 

However, organizations that solely equate Strategic Guidance to Strategic Planning also run a real risk of creating a “strategy gap”, where the content/context in the guidance documents doesn’t provide enough information to accurately set achievable targets for the Budgeting process, or the guidance is ultimately deemed “pie-in-the-sky” because it hasn’t been thoroughly tested. As a consequence, Strategic Guidance on its own can often just sit on the shelf, regardless of the amount of thought and skill that went into producing it.

To avoid a strategy gap, Finance departments need to take the lead in the parallel exercise of Strategic Finance. This is primarily a top-down financial modeling exercise where assumptions from the Strategic Guidance and elsewhere are used to drive integrated financial statements that (once agreed upon) should act as a high-level target for the all-important Budgeting process. A stylized sketch of such a model follows.
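
As a minimal illustration of what such a top-down model can look like, here is a toy sketch in Python; every driver and the starting revenue figure are invented for the example, not taken from any real plan:

```python
# Toy top-down driver model: a handful of high-level drivers rolls a
# starting revenue base forward into summary financials that could act
# as targets for budgeting. All figures are illustrative assumptions.

drivers = {
    "revenue_growth": 0.08,  # annual top-line growth assumption
    "ebitda_margin":  0.22,  # operating profitability target
    "capex_pct_rev":  0.06,  # reinvestment as a share of revenue
    "tax_rate":       0.25,
}

revenue = 500.0  # starting revenue, in $M
for year in range(1, 6):
    revenue *= 1 + drivers["revenue_growth"]
    ebitda = revenue * drivers["ebitda_margin"]
    capex = revenue * drivers["capex_pct_rev"]
    # Crude free-cash-flow proxy: taxed EBITDA less reinvestment.
    fcf = ebitda * (1 - drivers["tax_rate"]) - capex
    print(f"Year {year}: revenue={revenue:6.1f}  ebitda={ebitda:5.1f}  fcf={fcf:5.1f}")
```

The output is deliberately coarse: a handful of agreed-upon drivers, not hundreds of line items, is what lets the model be rerun and debated quickly.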

The most common areas of analysis for Strategic Finance are long-range planning/target setting, acquisition/investment screening, treasury planning, executive compensation planning and valuation analysis. 

One of the keys to successful Strategic Planning is to maintain both a good balance yet distinct separation of Strategic Guidance and Strategic Finance.  This often requires strong communication and coordination between Finance and other internal/external industry experts responsible for producing the Strategic Guidance. 

#2 – Why Goldilocks got it all wrong

Bottom-up budgeting and top-down planning (Strategic Finance) sit at the opposite ends of the financial planning spectrum and serve two different yet necessary purposes.    

The purpose of top-down planning is to set long-range operational targets that justify investment and capital needs, while bottom-up budgeting’s primary purpose is to increase the likelihood in the short term that those targets are met or exceeded by establishing accountability at lower levels within the organization.

Taking a middle-of-the-road approach to planning sounds great, but in practice, it is often the worst approach to take.  It’s not detailed enough to assign accountability and measurability to actuals yet too detailed to quickly set targets and become directionally correct when running scenario analysis.

It’s important to note that there should always be room to evaluate and optimize the level of precision of both bottom-up and top-down tools.  Having agile tools in place that allow you to make changes over time is critical.  In fact, I have never run into a company that got it right the first time. 

However, one must always resist the temptation to compromise in the middle.

#3 – Christmas in July

There is often a great deal of confusion around the term “top-down” planning. It’s not uncommon for people to think it’s an approach taken by management to enter numbers at a higher level and then force them down the throats of the subject matter experts (SMEs) sitting at the Cost Center level. However, this is not the case, and trying to do so would lead to disaster in the long run.

It’s important to realize that top-down planning is done because the leaders at the top know that when they accept capital from shareholders and banks, they are beholden to provide a return that will increase shareholder value in the long term and keep the company afloat. Therefore, it makes sense that top-down planning is about setting clear and measurable targets without placing restrictions on how the SME is going to reach those targets.

This is why it is important to realize that top-down planning must be a cascading exercise, where small teams discuss, debate and form consensus at the top and then hand off those targets to the next level down the management hierarchy, where a new leadership team can discuss ways in which they could use their knowledge to reach those targets by setting their own targets for their lines of business (LoBs).

I like to illustrate this point using a Christmas tree, where the star at the top is typically Corporate Finance and the bulbs below represent secondary cascading management units.  It would be chaotic to have 40-50 people all running scenario analysis concurrently.  Instead, it’s more often a case of 5-10 peers debating targets before it is cascaded to the next small team.  

When it comes to new business ideas, investment proposals and acquisitions, it’s OK to have BUs submit their own ideas (as they are probably highly knowledgeable about the products and markets relating to the deal). However, I have also seen many cases where the cascading target includes additional acquisition placeholders that outline the size and shape of the business that should be acquired without a particular target in mind.

#4 – Why “Simple” does not mean “Simplistic”

Recently there has been a lot of buzz about making software “simple”. The term conveys a sense of efficiency and intuitiveness, which is critical for busy users who don’t have the time to invest in becoming a super-user on day one. In the minds of many business professionals, “simple” is now synonymous with “effortless”.

However, the natural impulse is to also think that simple means simplistic (or less content). Yet this is not the case. “Simple” is actually a design language that takes complex subject matter and makes the experience look effortless to the end-user. In fact, accomplishing this requires more thought and effort on the part of the developers, usually resulting in more functionality or smarter usage patterns. As a result, “simple” actually means “smarter”.

For example, recall the netbook craze of a few years ago. These (often smaller) PCs were marketed as internet-focused machines but, in reality, were nothing more than low-powered PCs. Along came the tablet, with touch screens and purpose-built apps, and it blew away the netbook.

The pitfall I see all too often when dealing with Strategic Finance is that companies try to take bottom-up budgeting tools and simply apply less content to them in hopes of achieving an intuitive or effortless top-down model.  Yet, bottom-up tools don’t provide additional intelligence to the top-down process.  As a result, companies aren’t getting more with less, they are just getting less.

#5 – Why are we constantly “dealing” with ad hoc analysis

Anyone who has taken part in Strategic Finance knows that a large part of the analysis is looking at new deals (new business ideas, investments, acquisitions).  These deals can come and go in rapid succession or they can sometimes languish around for months before either getting eliminated or green-lighted. 

This issue poses a big challenge for server-based technology (the technology behind most budgeting tools). That’s because server-based technology often centralizes the data, forcing users to create placeholders for the data before entering it. That approach doesn’t make sense if the deal has a chance of being discarded quickly or if the people who need to review the deal differ from deal to deal. This form of agility is not what centralized budgeting tools were designed for.

The reality is that when it comes to Strategic Finance, most companies still gravitate toward spreadsheets despite the well-known pain points.  One of the few benefits to a spreadsheet is that it is a document-based technology.  When dealing with a document, users are free to save and share with whomever they want or, of course, they can throw it away or archive it and recall it later.  This approach is far more flexible for the needs of Strategic Finance.  

#6 — The truth about “Single Version of the Truth”

In the business world, we hate the unknown.  So in the short-run, we try to eliminate it by holding people accountable to certain expectations, all the while making sure we can drill down into the details to find out what’s not going to plan.  Predictability and accountability are a good thing in Finance. 

However, in the long run, it becomes more difficult to eliminate the unknown.  As a result, we have to think more creatively and opportunistically and realize this environment creates more room for discussion, debate/influence, and consensus building.  Paving over this basic communication requirement with rigid workflow management functionality and centralized data submissions is not helpful for Strategic Finance. 

We need to embrace solutions that complement both our short-term and long-term requirements so that we can optimally plan for all time periods and stop continuing to apply a “short-term fix”. 

#7 — Spend less time getting there and more time looking around

Many people think that Strategic Finance is an exercise that is typically performed either just before or just after the annual Budgeting process and that’s it, back on the shelf it goes.

Although this can certainly be the case, successful Strategic Finance teams often benefit the most when they are able to continuously screen for opportunities and threats that are a result of fluid changes in market conditions. This is not to say that companies should be constantly updating their strategy, but instead they should be in a perpetual screening process, ready to take advantage of opportunities when they arise.

Black Swans are a perfect example of the need to screen for opportunity and risk. By definition, Black Swans cannot be detected by statistical software: their likelihood would deem them outliers, and without any real context around the situation, they are either ignored or go undetected. The only way to find a Black Swan is to look for it.

Sometimes we spend so much time focusing on climbing the mountain that we fail to spend the time looking around once we get to the top.  In all likelihood, that vantage point might just be the competitive advantage we need.

Conclusion

I hope the topics covered in my article cause you to reflect on your own Strategic Finance activities.  Is your Strategic Finance effort getting the level of attention it deserves? Or like many organizations, is it getting crowded-out and distorted by other adjacent activities? 

If done correctly, Strategic Finance can represent an immense value-add to your organization while at the same time reducing the costly time and effort of the Budgeting process. Make a conscious effort to consider these seven essential tips and I’m confident you will see the benefits.

Why Your Business Planning Process Needs More Edge

By Michael Huthwaite, Founder and CEO at FinanceSeer LLC

The long-standing narrative of Enterprise Performance Management (EPM/CPM) has been squarely focused on the effort to steer organizations away from spreadsheets by embracing Enterprise Performance Management suites (i.e. platforms).

Yet, the dirty secret that is rarely spoken about is that most organizations continue to remain heavily reliant on spreadsheets even after spending huge sums of money on EPM solutions.  

So, why are so many organizations still deeply dependent on spreadsheets?  The answer to this question lies at the Edge.  

Understanding the Relationship Between Core and Edge

Any network, whether it be a social network or a computer network, has activity occurring on both a centralized (Core) and a decentralized (Edge) level. This is because data lives in the Core, but it’s often conceived at the Edge.

Information that is managed at the Core level tends to represent data that can be highly leveraged (i.e. reused or accessed by others).  Examples of Core data might be the latest forecast for an established product line or business unit.  

Conversely, Edge level data tends to represent high-growth opportunities, which are often the lifeblood of your organization’s future. An example of Edge level data might be the evaluation of a new investment/acquisition opportunity or the risk assessment of a big swing in the market (a proverbial black swan).

Despite the distinct differences between Core and Edge, you’ll never eliminate the symbiotic relationship that they share.  Therefore, in order to optimize the overall network, it’s best that organizations take a holistic view by addressing the need to harmonize both the Core and Edge.  

Real-World Examples of Core and Edge

The concept of “core and edge” has been well documented for decades.  The notion exists in many technical areas, including, but not limited to Enterprise Performance Management.  

Recent proliferations of “core and edge” harmonization can be seen in the rise of Mobile, the Internet of Things (IoT) and Wearable technology. In all three cases, it’s the people and devices operating at the Edge that create incremental value, while the centralized or Core infrastructure acts to leverage or unify an established ecosystem. In my opinion, this helps explain why, as individuals, we are often hopelessly addicted to our devices and mobile apps. Our brains pick up on the perceived incremental value that we create at an Edge level, while the Core level helps organize and maintain our lives over time through a robust centralized infrastructure.

What are the Hallmarks of Core and Edge?

In general, the telltale signs of Core and Edge look like this:

Core
At the Core level, technologies are often marketed as “platforms” and require a great deal of IT involvement and specialized admins to operate. Furthermore, these solutions tend to have a higher number of users, remain relatively static in their configuration and, to a large degree, focus on database storage (often associated with a “single version of the truth”).

Edge
Conversely, Edge level applications are just that: applications. They’re not platforms per se, but self-service solutions that focus on supporting small teams of individuals working together to drive new ideas. Edge level applications are typically installed locally (much like a smartphone app or a desktop application). These applications are not web portals that enable users to enter and submit data but, rather, highly focused solutions that enable end-users to evaluate or try alternative configurations in order to maximize value creation.

In addition to having full control of these Edge level applications, end-users must also have the ability to freely share their ideas and findings among their small team of peers in order to build consensus.  The speed and fluidity at which these business processes occur mean that IT and System Admins aren’t directly involved, but their ability to indirectly enforce governance must remain in place if the ecosystem is to thrive.  

Edge level data differs from Core level data because the business processes occurring at the Edge simply don’t require data to be centrally stored or accessed at such an early stage. Rather, the speed and flexibility of the data are what enable end-users to think more creatively and begin the process of building credibility around that data.

Integration
Integration is quite literally the silent partner in the Core and Edge Paradigm. Integration is often, but not always, managed by IT or System Admins and often requires stronger technical skills than are necessary for end-users. A great deal of intelligence and validation must go into establishing a strong integration approach. Integration includes data mapping, but should also include a broader range of communication between the Core and Edge.

As the proliferation of Core and Edge increases, I think we’ll start to see more integration-level concepts flourish, such as Artificial Intelligence (AI) and Predictive Analytics, where information captured at the Core is suggested to devices at the Edge, which in turn enables end-users to take action. For example, when Google Maps tells you that there is traffic on your current route, that is a great example of tight integration between Core and Edge. Slow-moving traffic is captured on mobile devices and sent to Google’s Core servers, then passed back to the Edge devices of other users heading along the same route. These motorists can then elect to take a different route or ignore the suggestion altogether. A toy sketch of this loop follows.
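
To make the loop concrete, here is a minimal sketch in Python. All names and thresholds are invented for the analogy; this is not Google's actual API, just the report-aggregate-suggest pattern described above:

```python
# Toy illustration of the Core-and-Edge traffic loop described above.
# All names and figures are invented; an analogy, not a real service.

core_reports = []  # the "Core": centralized store of edge reports

def edge_report(device_id, route, speed_kmh):
    """An Edge device sends its observation to the Core."""
    core_reports.append({"device": device_id, "route": route, "speed": speed_kmh})

def core_suggest(route):
    """The Core aggregates reports and pushes a suggestion back to the Edge."""
    speeds = [r["speed"] for r in core_reports if r["route"] == route]
    if speeds and sum(speeds) / len(speeds) < 20:
        return "heavy traffic - consider another route"
    return "route clear"

edge_report("phone-1", "route-66", 12)
edge_report("phone-2", "route-66", 9)
print(core_suggest("route-66"))  # heavy traffic - consider another route
```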

How Does Edge Work in Business Planning and Analytics?

The Core and Edge Paradigm is prevalent in all areas of business planning.

Edge Analytics
This is the one area of Business Planning and Analytics that has a strong record of embracing the Core and Edge Paradigm. Self-service solutions that capture data and empower end-users to visualize it, running their own unique queries and applying their own custom filters and formatting in a way that uncovers the data so it can be fully discussed and debated, are quite prevalent in the market today.

As a result, we should look to these solutions as examples of how businesses can begin to adopt better Edge level applications for business planning. Sure, the technology won’t be the same across all areas of business planning, but there are plenty of similarities that business planning can learn from Edge Analytics to push the needle forward.

Financial Planning at the Edge
I think it’s fair to say that everyone has a natural disdain for Budgeting.  It takes up too much time and is outdated the minute it is published.  So maybe we should not give Budgeting all the credit in the world, but does that mean we should completely abolish it?  

This debate reminds me of my days as a student when I learned that the Income Statement is based on accounting, not cash.  I was shocked, but does that mean that we should abolish the Income Statement?   Probably not, but we should recognize it for what it is and what it isn’t.  

What is important is the need to set a unified plan (Core level functionality) while at the same time empowering individuals and small teams to recognize whether the ongoing market turbulence is significant enough to warrant a reforecast or not (Edge level functionality).  

Having the proper discussion and debate at the Edge level can help to regroup the individuals and small teams responsible for sounding the reforecasting alarm.  This saves the organization a ton of effort by enabling unaffected parts of the business to continue on without getting caught up in lengthy and continuous planning exercises.  

Some people may advocate for a Rolling Forecast approach that initially appears to eliminate the Core and Edge Paradigm.  However, based on my experience, I would argue that it probably doesn’t.  Rather this middle-ground approach is probably a significant reason why so many Rolling Forecast initiatives don’t achieve the level of success they were hoping for.

Operational Planning at the Edge
Spreadsheets are currently responsible for a lot of the analysis that end-users perform in the realm of Operational Planning. To be fair, a good deal of these spreadsheets should probably be replaced with Core level technologies. This will give organizations more “Connected Planning” capabilities that allow for better horizontal planning. But that will still never eliminate the need for Operational Planning at the Edge.

At the “task” level, for example, there will always be a need for individuals and small teams to tie their daily activities with their personal and team operational targets. This could be achieved with greater integration to numerous Edge level applications that would be managed and updated by end-users (not System Admins).

Strategic Planning at the Edge
Strategic Planning is an exercise that is largely performed at the Edge. This is why most companies continue to rely so heavily on spreadsheets for Strategic Planning, regardless of their investment in EPM.

The need to evaluate competing strategies (concurrent) or combine alternative scenario variations together (inclusive) is not the sweet spot of EPM. These days, every EPM suite is quick to refer to its solutions as “models”, but does that mean that end-users are experiencing greater “modeling” capabilities? No. Modeling is an Edge level activity that requires rapid reconfiguration of models in order to screen for potential business opportunities.

Once the strategic plan is set, the need to execute strategy (using Strategy Management tools and methodologies) switches the conversation back to a Core level functionality.  

What are the Challenges of the Core and Edge Paradigm?

If the Core and Edge Paradigm were so simple to achieve, then I think the EPM market would already contain a wide variety of Core and Edge solutions to choose from (which it clearly does not).  So, what is holding us back?  

I think the answer to this question cannot be pinned on one single reason, but rather it’s a shared problem that needs to be addressed across business users, IT and solution vendors.

Business Users
Business users waste an enormous amount of time trying to deal with Edge related business processes using inadequate tools.  Sure, spreadsheets, presentation tools and other personal productivity applications can enable business users to get the job done, but at what cost?  Business users need to recognize the advantage of Edge level solutions and start creating a greater demand for these alternatives.  

Like spreadsheets, Edge level solutions put end-users in control of their business processes, but they can do it in a way that provides more integrity and pre-built intelligence that will save end-users an immense amount of time and effort.  

IT
IT has numerous considerations to take into account when dealing with the purchase of new software technologies.  ROI, security and ongoing support are just a few of the most common considerations.  

As a result, it’s natural for IT to want to limit the number of solutions. This makes perfect sense when dealing with “point solutions”.  Point Solutions are software applications that address specific problems without trying to address the concerns of any related or adjacent business processes. This creates disjointed processes that often increase both total cost of ownership and end-user inefficiencies.  

However, it’s important to realize that Edge level solutions are not point solutions. Rather, they are solutions that operate at a different architectural level and, as a result, actually complement each other by furthering capabilities that cannot be optimally achieved with a single-level architecture.

Therefore, it’s important for IT to consider a multi-level architecture (Core and Edge) as part of their formal IT policy (sometimes referred to as Line of Business applications). Failing to do so will only limit creativity and increase reliance on generic personal productivity tools.

Software Vendors
If there were any blame to be placed for the lack of Edge level solutions, I would have to put the onus squarely on software vendors. Their natural tendency to focus on database-driven technology and old Enterprise Sales models skews their view of the Core and Edge Paradigm. As a result, they are too heavily focused on defending their existing Core level solutions rather than embracing and actively developing Edge solutions and opening secure APIs to third-party Edge applications.

Traditional on-premises vendors often dismiss Edge level requirements as ad hoc activities that can be addressed using Office add-ins. However, Edge level applications are not one-time business processes.

Cloud-based solutions are often quick to point out their intuitive UI, which makes it easier for users to perform logic changes, yet their centralized architecture still addresses Core functionality. Just because there is less of a barrier to making logic changes doesn’t mean that all end-users are free to do so.

What Should Be Done to Address the Edge?

As I pointed out in the beginning, the result of not addressing Edge level architecture means that spreadsheets continue to have a strong foothold in Business Planning and Analytics.  

Yet reliance on spreadsheets is not the optimal solution for end-users, IT or software vendors.

End-users will still suffer from inefficient and error-prone spreadsheets, causing a tremendous amount of wasted time. 

IT will continue to lack the security and governance that elude them with spreadsheets.

Software vendors will never fully realize the universal adoption of EPM/CPM that they have been promising for the past 20+ years.

This problem will only truly be solved when end-users, IT and software vendors come together and establish a balanced response to dealing with both the Core and the Edge.  

Like most initiatives, it all starts with awareness. End-users need to do a better job articulating their pain points. IT needs to clearly establish their policy on “core and edge”, and software vendors need to challenge themselves to develop safe and secure technology that clearly addresses both the Core and the Edge, and to stop relying on slick marketing messages to pull the wool over the eyes of their customers.

Until then, I’m afraid the spreadsheet will remain an integral and error-prone element of our Business Planning process.  

The Evolution of Scenario Planning

By Michael J. Huthwaite, Founder and CEO of FinanceSeer LLC 

We live in a world of uncertainty.  But in that uncertainty, lies a great deal of opportunity for those organizations capable of successfully executing a winning plan. 

To help navigate this challenge, we often turn to scenario planning to guide us.  Yet, despite all the advances in technology, we are still stuck in the past when it comes to the evolution of scenario planning. 

Variance analysis, corridor planning, checkerboard diagrams and various other forms of scenario planning are just a few of the techniques routinely discussed as ways in which organizations could evaluate future business outcomes. 

However, before FP&A can begin to master any of these techniques, it is important that we understand where we are in the functional evolution of scenario planning and where we need to go in order to make scenario planning a truly powerful force for combating uncertainty.

Modeling for Uncertainty

Models essentially come in two distinct forms.

Stochastic models deal with probability algorithms: regression analysis, Monte Carlo analysis or even machine learning (AI).

Deterministic models, on the other hand, deal with specific assumptions or driver-based inputs.

Stochastic or probabilistic modeling operates best when you’re dealing with constraints such as time (short-term planning) or other resource constraints (the number of salespeople in the field). These natural limitations reduce or narrow the number of potential outcomes and therefore enable users to think in terms of probabilities (potential outcomes).

On the other hand, deterministic modeling doesn’t necessarily share that same narrowing of resources.  In fact, strategy modeling is often about trying to expand boundaries such as time horizons and funding limitations in order to evaluate growth opportunities.

With this in mind, I’m choosing to focus this discussion on deterministic scenario models only. The sketch below illustrates the difference between the two forms.
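
A minimal sketch of the contrast, with all figures invented for illustration:

```python
# Stochastic vs. deterministic, side by side. Figures are illustrative.
import random

# Stochastic: sample constrained short-term demand (a toy Monte Carlo)
# and reason in probabilities.
random.seed(42)
samples = [random.gauss(1000, 80) for _ in range(10_000)]
p_hit = sum(s >= 950 for s in samples) / len(samples)
print(f"P(demand >= 950) ~= {p_hit:.0%}")  # roughly 73%

# Deterministic: explicit driver assumptions, one recalculation per scenario.
def revenue(units, price):
    return units * price

print(revenue(units=1000, price=12.50))  # base case
print(revenue(units=1400, price=11.00))  # expansion scenario
```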

The Evolution

Save As (The Spreadsheet Answer)

Spreadsheets are the ultimate personal productivity tool.  We all have them and we all use them.  So, we’re all familiar with running scenarios this way. 

The process starts off so simple.  You change a few assumptions, evaluate the results and then click “Save as” to record your changes. 

However, if you are planning on collaborating with other team members or combining other scenarios, then you’re in for an error-prone, uphill battle. That’s because “Save as” is essentially a technique for taking scenarios “offline”.

Versioning

Over the last 20+ years, companies have begun to shift over to Enterprise Performance Management (EPM) solutions to help streamline their financial planning capabilities. 

EPM solutions (which often rely on OLAP or hybrid OLAP technologies) use “Versioning” as their primary form of scenario management. Versioning works especially well for tactical planning, where the focus is on variance analysis (e.g. “Plan to Budget”, “Budget to Latest Forecast” and “Latest Forecast to Actuals”).

Unlike scenarios performed in spreadsheets, Versioning remains an online activity, which enables everyone to work together on a given scenario. 

Yet, despite the online benefits of Versioning, it also has some significant limitations.  Perhaps the most important limitation with regards to Versioning is the inability to address meaningful strategic scenarios. 

In strategy, scenario analysis is all about managing multiple views of the future so that organizations can understand which scenarios could potentially be combined to establish an optimal strategy.

This is difficult to do with Versioning, as each scenario represents a complete standalone dataset. Trying to maintain multiple datasets at the same time is a nightmare. Furthermore, because each version is a complete dataset, it’s difficult to combine or merge different datasets to settle on an optimal plan.

Sensitivity Scenarios

One way to begin to address scenario planning in a more strategic way is via Sensitivity scenarios. 

Sensitivity scenarios (sometimes called Overlays) work by allowing users to store alternative inputs that essentially override the underlying or initial assumption in the model. 

I like to think of them as clear transparencies that overlay an underlying dataset (often called a “Baseline”), enabling the user to simply recalculate the model on the fly (using in-memory technology).

This approach is far more efficient than Versioning, as there is no need to maintain duplicate datasets: all you need to store is the handful of assumptions that differ from the original dataset (baseline).

Because Sensitivity scenarios are easy to create and maintain, they are often used for evaluating multiple concurrent scenarios (a must for strategic modeling).  Modelers can easily create corridor or contingency plans that stay relevant throughout the entire planning process, giving modelers multiple views of the future, rather than a single version of the truth. 

In addition to maintaining multiple scenarios at the same time, sensitivity scenarios also enable users to evaluate combined scenarios (two or more scenarios occurring at the same time).  This works by simply layering Sensitivity scenarios on top of each other.  

Sensitivity scenarios are amazingly simple and work well in a true in-memory modeling engine, but they also have limitations.

When combining Sensitivity scenarios, the approach works best when the assumption overlays are unique (no conflicting intersections). Generally, the only way to address conflicting intersections is by allowing the user to select which overlay wins a conflict (this is often handled by ordering sensitivities in a way that prioritizes assumption conflicts), as in the sketch below.
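
Here is a minimal sketch of the overlay mechanics; the baseline, overlay names and contribution-margin logic are all invented for illustration:

```python
# Sensitivity scenarios as assumption overlays on a baseline, with
# conflicts resolved by overlay order. All figures are illustrative.

baseline = {"units": 1000, "price": 10.0, "cost_per_unit": 6.0}

# Each overlay stores only the assumptions that differ from the baseline.
price_increase = {"price": 11.0}
volume_push    = {"units": 1200, "price": 9.5}  # discounts price to drive volume

def recalc(a):
    """Recalculate the (toy) model: contribution margin."""
    return a["units"] * (a["price"] - a["cost_per_unit"])

def apply_overlays(base, overlays):
    """Layer overlays onto the baseline; later overlays win conflicts,
    mirroring how users order sensitivities to prioritize assumptions."""
    merged = dict(base)
    for overlay in overlays:
        merged.update(overlay)
    return merged

print(recalc(baseline))                                    # 4000.0
print(recalc(apply_overlays(baseline, [price_increase])))  # 5000.0
# "price" conflicts; volume_push is ordered last, so its price wins:
print(recalc(apply_overlays(baseline, [price_increase, volume_push])))  # 4200.0
```

Note that each overlay stores only its handful of deltas from the baseline, which is exactly what makes maintaining many concurrent scenarios cheap.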

Impact Scenarios

To get the most out of scenario planning, organizations must also be able to understand the impact of a scenario as it relates to an overall strategy.  This is difficult to do with Sensitivity scenarios as we are simply swapping out assumptions and recalculating the model. 

This is where Impact Scenarios come into play.  They look and feel like Sensitivity scenarios, but they take the process one step further by calculating the financial impact.    

This enables us to visualize the impact of scenario planning in a really powerful way. 

Normally, when looking at financials, we can view the data across time or across entities. With impact scenarios, users can also pivot the data by scenario contribution, where the “baseline” represents your default outcome and each additional scenario contribution is added on top.

Because each scenario is captured and evaluated incrementally, users are free to include or exclude scenarios, or even delay (“time shift”) a scenario by several months, to arrive at an optimal strategic plan of action.

Impact Scenarios are created in much the same way as Sensitivity scenarios, but rather than overlaying the assumptions and recalculating the model, the model recalculates each scenario separately and then stores the delta impact of that scenario.   

Not only is this analytically superior to Sensitivity scenarios, it also overcomes the problem of conflicting or overlapping assumptions from scenario to scenario.

Impact Scenarios are ideal for evaluating checkerboard-style strategies where different initiatives can be combined under different risk profiles (the optimal way to plan in a world of uncertainty). A minimal sketch of the mechanics follows.
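
Here is a minimal sketch of those mechanics; the baseline, scenario deltas and time-shift helper are all invented for illustration:

```python
# Impact Scenarios: each scenario is recalculated separately against the
# baseline and only its delta is stored, so scenarios can be included,
# excluded or time-shifted independently. All figures are illustrative.

YEARS = 5
baseline = [100.0] * YEARS  # baseline cash flow per year, in $M

# Stored delta impact per scenario (model recalculated with that scenario
# alone, minus the baseline).
impacts = {
    "new_product":  [-5, 10, 25, 30, 30],  # investment first, payoff later
    "cost_program": [4, 6, 6, 6, 6],
}

def time_shift(delta, months=0):
    """Delay a yearly delta stream by whole years (months rounded down)."""
    shift = months // 12
    return [0.0] * shift + delta[: YEARS - shift]

# Compose a strategy: include both scenarios, launching the product a year late.
plan = list(baseline)
for delta in [time_shift(impacts["new_product"], months=12), impacts["cost_program"]]:
    plan = [p + d for p, d in zip(plan, delta)]

print(plan)  # [104.0, 101.0, 116.0, 131.0, 136.0]
```

Because only deltas are stored, dropping a scenario is just a matter of leaving it out of the combination; nothing has to be re-merged by hand.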

Conclusion

In a world of uncertainty, scenario planning is the key to evaluating what is possible. The best strategies are rarely achieved by constructing a single scenario plan. Rather, the best strategies are the ones that enable peers to take action based on a combination of quantifiable scenarios.

Putting your organization in a position to measure the impact of alternative scenarios is the key to achieving long-term success.  But, in order to get there, we need to continue to evolve our capabilities for meaningful scenario planning. 

The article was first published in Prevero Blog.