The Role of Databases for Strategic Planning – Some General Remarks

Large databases, traditionally the domain of the financial departments, are increasingly entering the world of strategic planners. Under the label “business intelligence”, database software and data mining tools are marketed to strategic planners, and their acceptance is quite obviously on the rise. Contributing factors could be a generational change among planners, the availability of more user-friendly tools, increasing technological experience among those contributing data (who often have a background in marketing rather than technology) and a narrowing cultural gap between strategic management and the technology people necessarily involved in setting up and running such databases.

The main driver behind the spread of strategic management information systems, decision support systems and strategic planning cockpits, however, is the decision makers’ insatiable hunger for definitive answers, clear recommendations and solid data. Where traditional strategic concepts like portfolios or SWOT analyses are highly aggregated and deliberately vague in their conclusions, a strategic database can assign aggregated discounted cash flow numbers to a selection of potential future products, based on data from product and region experts across the company. We have to be aware, however, that the origins of such information about the future remain essentially the same: extrapolation, projection, estimates and, more often than not, educated guesses.

Working with databases in strategic planning offers some obvious advantages:

  • Databases help to avoid the chaos of versions and formats that often occurs when strategic information is traded within the company using standard office tools like tables or presentations. The data can be located on a central server or even an external cloud under the control of the corporate IT experts and governed by corporate IT security guidelines. Adequate access rights for the different users can be set individually or by standard rules.
  • Database user interfaces and data mining programs provide convenient tools to aggregate and visualize the gathered data, speeding up the process of generating bite-size information for decision makers and potentially reducing the workload in planning departments typically short of resources.
  • The standardization of data going into the database and the tools employed to fill it force contributors to address a certain minimum of questions in their planning process, adhere to common conventions and summarize their results in a predefined form.
  • Everybody discussing a decision can argue based on one agreed set of data, representing the best available, up-to-date information from experts across the company’s network, which may include external sales partners, market researchers and consultants.

These advantages, however, come at a price:

  • The clarity of versions and formats is not so much the result of the database itself, but of the strictly implemented strategic planning process that necessarily comes with it. If the thoughts behind a changed estimate in the database or a quick summary for an executive still end up being communicated in spreadsheets sent by e-mail, the advantage is eroded and the database becomes just one more data format users have to deal with.
  • The reduced workload resulting from the use of business intelligence tools has to be compared to the additional resources needed to set up and run the systems. The needed expertise will often not be available within the company, and even for the most user-friendly tools, the actual planning cockpits will in many cases be programmed by external consultants.
  • While standardized data structures to be filled define a minimum of questions to be addressed in generating the data, they also discourage any planning going beyond that, which may not fit into the database. Such standardization is particularly detrimental to any qualitative, critical or out-of-the-box thinking that could be priceless as an indicator of possible yet unknown threats or as a source of ideas for future growth not included in current planning.
  • The uniform view of the future defined by a planning database tends to reduce the awareness that the actual future will always be uncertain. The fact that the one future (or, at best, the generic base/best/worst case structure) defined in the database has been built from the input of many contributors and has been agreed upon between different departments makes it particularly difficult to argue against the results and ask the necessary “what ifs”.

Some of these challenges can be addressed early in the process of setting up the database. Looking for synergies with database solutions already in use in the company, for example in controlling, can reduce the workload and accelerate the learning curve in the introduction phase. However, it may also introduce a bias towards processes and structures that are not ideal for information that contains estimates for an uncertain future rather than numbers from a well-accounted past. Leaving space for unstructured information within the database costs technical efficiency, but it may end up containing the one piece of information that avoids the need for parallel data exchange by e-mail or the decisive warning about an external threat that might otherwise have gone unheard. Asking in time whether external support is to work as a consultant or merely as a programmer can save time and effort later and can avoid implementing potentially inefficient structures.

It is important to be aware that databases, data mining tools and even strategic planning cockpits can be an interesting source of information to be taken into account in a decision, but they are not decision tools. Asking the many “what ifs”, evaluating alternative strategies, testing for different external scenarios or analyzing potential competitors’ strategies can be done including information from such a database, but these, the actually decisive steps of strategic planning, are not done by the database. In most cases, the user interfaces employed are optimized for visualizing what’s in the database and are not even very well suited for interactively calculating the effects of assumptions that go beyond the scope of the underlying data structures.

It is, however, possible to develop tools to interactively calculate the impact of many different “what ifs” on the agreed planning basis, draw all the necessary information from the database and even write results for different scenarios back to the database, usually in separate but linked structures. The implementation will depend on the framework used, which will usually be either relational databases or multidimensional cubes. Furthermore, it depends on whether a separate data mining interface is used to access and visualize the data and if it should also provide the interface to the simulation and calculation tool.
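To make the pattern a little more tangible, here is a minimal sketch of the relational case: it reads the agreed planning basis, applies a simple what-if adjustment and writes the result into a separate scenario table linked to the base data. The table and column names (planning_items, scenario_results, planned_sales, planned_cost) and the adjustment factors are purely illustrative assumptions, and SQLite stands in for whatever database system is actually in use.

```python
# Minimal sketch of a what-if calculation on top of a relational planning database.
# All table and column names are illustrative assumptions.
import sqlite3

def run_what_if(conn, scenario_name, demand_factor, cost_factor):
    """Read the agreed planning basis, apply simple what-if factors and write
    the results back into a separate scenario table linked to the base data."""
    cur = conn.cursor()
    cur.execute("""CREATE TABLE IF NOT EXISTS scenario_results (
                       scenario        TEXT,
                       item_id         INTEGER REFERENCES planning_items(id),
                       adjusted_margin REAL)""")
    base = cur.execute(
        "SELECT id, planned_sales, planned_cost FROM planning_items").fetchall()
    for item_id, sales, cost in base:
        adjusted = sales * demand_factor - cost * cost_factor
        cur.execute("INSERT INTO scenario_results VALUES (?, ?, ?)",
                    (scenario_name, item_id, adjusted))
    conn.commit()

# Usage example with an in-memory database standing in for the corporate system.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE planning_items (id INTEGER PRIMARY KEY, "
             "planned_sales REAL, planned_cost REAL)")
conn.executemany("INSERT INTO planning_items VALUES (?, ?, ?)",
                 [(1, 120.0, 90.0), (2, 80.0, 70.0)])
run_what_if(conn, "weak_demand", demand_factor=0.9, cost_factor=1.0)
print(conn.execute("SELECT * FROM scenario_results").fetchall())
```

In a multidimensional cube environment, the same logic would typically live in a calculation script or rule set rather than SQL, but the principle of keeping scenario results in separate, linked structures remains the same.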

In the upcoming weeks, we will look at two case studies on such interactive planning tools linked to pre-existing databases, both allowing the same scenario and strategic alternative evaluations on the same data, but in different database environments. One will be a relational database accessed through a data mining tool, the other a multidimensional cube providing its own user interface. We will look at similarities and differences of the two implementations and suggest ways to work around their respective limitations.

Dr. Holm Gero Hümmler
Uncertainty Managers Consulting GmbH

The Difficulty of Estimating Probabilities

The May issue of the German Harvard Business Manager shows the results of a survey among corporate leaders on the importance of various risk factors (Die Sorgen der Konzernlenker, page 22). The survey was conducted at the World Economic Forum in Davos and asked about the estimated probability and expected effects of 50 predefined global risks. In the article, the results are plotted as the number of answers, color-coded in 5×5 grids (probability horizontal, effects vertical).

The striking result is not so much the five main problems identified in the article, but the similarity of the distributions of answers, particularly on the probability axis. Whereas the effects of some risks are actually (although not dramatically) seen as larger or smaller than those of others, the probability estimates show a strong tendency towards the center of the scale. In other words, the 469 participating global leaders consider all but a handful of the 50 global risks about equally probable. In fact, there are very similar distributions of probability estimates for things that are already happening, like the proliferation of weapons of mass destruction, deeply rooted organized crime or a rise of chronic diseases (the inevitable consequence of rising life expectancy), and for largely imaginary risks like the vulnerability to geomagnetic storms or unwanted results of nanotechnology.

On the one hand, this shows a general problem with the use of standardized scales in surveys. There is, of course, a tendency towards the middle of the scale, and, if asked for numbers, different participants might actually assign very different values to a “medium” probability. Even the same participant might think of very different numbers for a “medium” probability depending on the category the risk is assigned to.

But the strange probability distributions also show something else: We are simply not very good at estimating the probability of future events, especially if they are rare or even unprecedented. In fact, the only future events that we can assign a probability to with a certain degree of confidence are events that recur quite regularly and where there is no reason to expect the underlying pattern to change. And, of course, any complicated probability calculations, most notably the value-at-risk approach commonly used in risk management, are voodoo if they hide the fact that the input probabilities are wild guesses.
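To illustrate the last point with numbers, the short sketch below computes a textbook parametric value-at-risk under the usual normal-distribution assumption and simply varies the assumed volatility. The portfolio value, confidence level and volatility figures are made up; the point is only that the seemingly precise VaR figure scales directly with an input that, for rare or unprecedented events, is often no more than a guess.

```python
# Parametric (normal) value-at-risk: a precise-looking number built on guessed inputs.
from statistics import NormalDist

def parametric_var(portfolio_value, annual_vol, horizon_days=10, confidence=0.99):
    """One-sided VaR assuming normally distributed returns (itself a strong assumption)."""
    z = NormalDist().inv_cdf(confidence)                  # standard normal quantile
    horizon_vol = annual_vol * (horizon_days / 250) ** 0.5
    return portfolio_value * z * horizon_vol

# Three equally defensible guesses for the volatility give three very different "exact" answers.
for guessed_vol in (0.10, 0.20, 0.40):
    var = parametric_var(10_000_000, guessed_vol)
    print(f"assumed annual volatility {guessed_vol:.0%}: 10-day 99% VaR = {var:,.0f}")
```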

Dr. Holm Gero Hümmler
Uncertainty Managers Consulting GmbH

Preparing for an Uncertain Future in a Small Company: Coffee Table Talk at the World Skeptics Congress

After my talk at the World Skeptics Congress, I had an interesting conversation with the owner of a small, technology-driven company. The whole conversation lasted no longer than a cup of coffee. The main question was how a small to mid-size company can leverage the principles for dealing with uncertainty that I had outlined at the end of my conference talk. Here are the results of our talk (plus some explanations) in a few quick points:

  1. Work with the knowledge that exists in your company. If you needed outside experts to tell you how your own market works, you would have probably gone out of business already. Outside help may, however, be useful (and sometimes necessary) to moderate the planning and decision process, to calculate financial impacts and to ask critical questions. The more technical and market expertise a business leader displays, especially if he or she is also the owner or founder of the company, the more hesitant many employees may be to bring up risks they see or feel looming on the horizon.
  2. Make it clear what the purpose of the analysis is. Are there strategic decisions to make – if so, what are the options? If you want to test the viability of an ongoing strategy, what are the areas in which you could make adjustments? In addition, define a reasonable time range you want to plan. If you plan to build a production facility, that time range will be much longer than if you develop mobile phone software.
  3. Look at uncertainties inside-out, going from effects to possible causes. The number of things that can happen in the world around you is infinite. The number of significantly different impacts on your business is rather small. Start with the baseline plan for your business – you have one, explicitly or implicitly. Look systematically at what could change, for example in a tree structure: Sales or cost could be impacted. On the sales side, demand or your ability to supply could change. A change in demand could come from the market size or your market share. Market size can change via volume or price level. Get rid of branches in the tree that are (even after critical questions) unrealistic, have negligible impact or would not be affected by the strategic options or adjustments you are evaluating. If you can’t prepare for it, there’s no point in planning it. Also, don’t continue into branches that have identical or very similar impact on your actual business.
  4. Identify key drivers of uncertainty and find possible values. Each branch of the tree gives you a driver of uncertainty. Compare their impact and get rid of the minor ones. You should end up with no more than ten key drivers (or ten for each market you are in, if there are several). Then assign two to five possible values each driver could have in the future. Think of normal as well as unusual developments, but try to avoid the generic base/best/worst assumptions. (A small worked sketch of steps 4 to 6 follows after this list.)
  5. Condense possible developments into scenarios. The different values of the drivers open up a cone (see graph below) of possible future developments. Scenarios are roads through that cone. The real future will not be identical to any one scenario, but should be somewhere around or between them. A scenario contains one possible value for each driver. Start out by looking for reasonable combinations of values, then build scenarios around them. There should be at least three scenarios, and more than five or six are rarely necessary. At least one scenario should cover the center of the cone of possible developments mentioned above, but others should lead to the more extreme corners, as well.
    Isolate single factors that don’t fit into the scenario logic, either because they are independent from anything else (oil prices or tax rates sometimes fit in that category) or because there is a feedback (for example, in a local market, competitors may react to the strategy you choose). There should be no more than two or three such factors. Keep them separated and at the end of the whole process, check if varying them within reasonable limits changes the results of your analysis.
  6. Derive the impact on your strategy and options. In a larger company, I would develop an interactive business plan simulation to do this, but in smaller companies, a grid of results with manual calculations or estimates should do. One axis of the grid is your strategic options or your current strategy and possible adjustments. The other axis is the scenarios. Write down (with some basic numbers!) where your company will be at the end of the planning period with each combination of strategies and scenarios. If a strategy looks catastrophic in one scenario or survivable but bad in several scenarios, you may want to stay away from it. If you decide on a strategy and find that it runs into trouble in one of the scenarios, derive which of the scenario’s values in the key drivers could function as an early warning indicator.
  7. Do it! Here’s your main advantage over some of the multi-billion-Euro companies out there: Once you have come to a conclusion, actually implement it. Write down which steps you have to take to make it happen and check them off. If you have come up with early warning indicators, hang them on your office wall, put them on your computer desktop or into your calendar at regular intervals and test them. The only bad thing you can do with this analysis is to let it rot in your drawer.
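For steps 4 to 6, the sketch below runs through the same logic with made-up numbers for a fictitious company: a handful of key drivers with possible values, three scenarios as consistent combinations of those values, and a grid of rough end-of-period results for two strategic options (“expand_capacity” versus “stay_lean”). Every driver, value and formula is an illustrative assumption; in a small company, the same grid can just as well be filled in by hand.

```python
# Illustrative sketch of steps 4 to 6 for a fictitious company.
# Every driver, value, strategy and formula below is a made-up assumption.

# Step 4: key drivers of uncertainty, each with two to five possible future values.
DRIVERS = {
    "market_volume_growth": (-0.05, 0.02, 0.08),   # per year
    "our_market_share":     (0.10, 0.15, 0.20),
    "price_level_change":   (-0.10, 0.00, 0.05),   # over the whole planning period
    "unit_cost_change":     (-0.05, 0.00, 0.05),
}

# Step 5: scenarios pick one plausible value per driver -- a few consistent
# combinations, not all 3 * 3 * 3 * 3 = 81 of them.
SCENARIOS = {
    "steady_course": {"market_volume_growth": 0.02, "our_market_share": 0.15,
                      "price_level_change":  0.00, "unit_cost_change": 0.00},
    "price_war":     {"market_volume_growth": -0.05, "our_market_share": 0.10,
                      "price_level_change": -0.10, "unit_cost_change": -0.05},
    "demand_boom":   {"market_volume_growth": 0.08, "our_market_share": 0.20,
                      "price_level_change":  0.05, "unit_cost_change": 0.05},
}

# Step 6: a rough end-of-period result for each combination of strategy and scenario.
def end_of_period_result(strategy, scenario, years=5):
    """Very rough result (sales minus cost minus investment) after `years`."""
    base_sales, base_cost, base_share = 10_000_000, 8_000_000, 0.15
    # Strategy assumptions: expanding capacity costs money up front but removes the
    # volume ceiling; staying lean caps the volume the company can actually serve.
    investment = 1_500_000 if strategy == "expand_capacity" else 0
    volume_cap = float("inf") if strategy == "expand_capacity" else 1.10
    volume = min((1 + scenario["market_volume_growth"]) ** years
                 * scenario["our_market_share"] / base_share, volume_cap)
    sales = base_sales * volume * (1 + scenario["price_level_change"])
    cost = base_cost * volume * (1 + scenario["unit_cost_change"])
    return sales - cost - investment

for strategy in ("expand_capacity", "stay_lean"):
    for name, scenario in SCENARIOS.items():
        print(f"{strategy:15s} | {name:13s} | "
              f"{end_of_period_result(strategy, scenario):>12,.0f}")
```

The printed grid is exactly the table described in step 6: one row per strategy and scenario, with a number attached, so that catastrophic combinations and possible early warning indicators become visible at a glance.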

Dr. Holm Gero Hümmler
Uncertainty Managers Consulting GmbH

Risk Perception and Risk Management in Companies and Management Science – Talk at the 6th World Skeptics Congress in Berlin

The 6th World Skeptics Congress in Berlin in May 2012 had the topic “Promoting Science in an Age of Uncertainty”. While the main focus of the congress was on pseudoscience, creationism and alternative medicine, one of the aspects discussed was the rationality of human risk assessment. The Saturday afternoon session featured Professor Walter Krämer, a mathematician and one of Germany’s most renowned authors on risk perception and statistics (“So lügt man mit Statistik”, “Die Angst der Woche: Warum wir uns vor den falschen Dingen fürchten”), as well as a presentation by Holm Hümmler.

Holm Hümmler’s talk focused on the question of whether the structures of a company and the teachings of business schools make people’s decisions more rational or just irrational in a different way than the decisions of people acting on their own private behalf. The talk touched on the role of financial risk management as a way of dealing with future uncertainties, Nassim Taleb’s popular book The Black Swan and the decisions leading up to the 2009 financial crisis. The complete presentation is available for download here:

6th World Skeptics Congress, May 19, 2012, Session 6, Risk Perception and Risk Management in Companies and Management Science – Observations about Managers Confronted with an Uncertain Future