Commensurate analysis


“We learn from history that we do not learn from history.” – Georg Hegel

I should begin by saying that I do not necessarily agree with quotations at the start of my blog posts.  They are comments that I have found to be thought-provoking – and I hope you do too – rather than definitive.

In 2002 Donald Rumsfeld, the US Secretary of Defense, proposed a three-fold classification for our state of knowledge.  There are, he said, known knowns – things that we know we know; known unknowns – things that we know we don’t know; and unknown unknowns – things that we don’t yet know we don’t know.

For instance, we know that tomorrow the sun will rise: this is a known known.  We know that in two weeks’ time it might rain or it might not: this is a known unknown.  Within a decade, it is possible that some sort of superbug will have mutated, proved resistant to antibiotics and killed a large number of people; but not only do we not know whether such an event will happen, we have little idea what this superbug might be.  This is an unknown unknown.  (In Rumsfeld’s case, he was trying to arouse concern about the unknown evils that might or might not be found in Saddam Hussein’s weapons arsenal in Iraq.)

Interestingly, there is a fourth category that Rumsfeld never mentioned: unknown knowns.  These are things that we know, yet fail to be aware of.  Into this category, we could place a large number of little-appreciated facts about the world, like the Latin name for a daffodil.  Of more psychological interest, we can also include facts about ourselves that we refuse to countenance (for instance, that I will never be a very good tennis player).

There is an argument for putting Hegel’s observation above into this category.  We know from the evidence of history that crimes will be re-committed; that wars will recur; and, somewhat less tragically, that forecasts will be wrong.  And yet we blithely overlook the fact that we know all this.  We fool ourselves that we will not repeat the mistakes of the past, and that we will get things right in the future.

And yet, Hegel was perhaps too pessimistic.  We can learn something from our past errors, as a wily old tennis player learns block returns, spins, how to hit the ball into space, and other techniques for combating a hard-hitting younger opponent.  In the field of forecasting, we can learn how to estimate margins of error, how much effort to put into analysis of existing data, and which type of modelling technique is likely to be most appropriate in a given context.  We can learn – even if we usually don’t.

Commensurate analysis

We can learn the art of commensurate analysis.  Its guiding principle might be something like this: the level of detail and precision we attempt when we analyse something should be commensurate with both the quality of the data to hand and the potential influence of uncertain variables.  The greater the uncertainty, the more high-level the analysis should be; the less reliable the data, the less appropriate a high degree of precision becomes.
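To make the principle a little more concrete, here is a minimal sketch in Python.  Everything in it is assumed for illustration – the base figure, the fitted trend, the one-year margin of error and the square-root growth of that margin with the horizon – but it shows the discipline: the further out the forecast, the coarser the figure we allow ourselves to report.

```python
# A purely illustrative sketch: a toy trend forecast whose margin of error
# widens with the horizon, reported at a precision commensurate with that
# error.  All figures, and the sqrt(horizon) error growth, are assumptions
# made up for this example.
import math

def report(point_forecast: float, margin_of_error: float) -> str:
    """Round a forecast to a granularity comparable with its margin of error."""
    # Use one significant figure of the margin as the rounding step,
    # e.g. a margin of +/- 268 means rounding to the nearest 100.
    step = 10 ** math.floor(math.log10(margin_of_error))
    rounded_point = round(point_forecast / step) * step
    rounded_margin = round(margin_of_error / step) * step
    return f"about {rounded_point:g} (+/- {rounded_margin:g})"

base_level = 1000.0        # this year's observed value (invented)
trend_per_year = 37.4      # fitted annual growth (invented)
one_year_error = 60.0      # one-year-ahead margin of error (invented)

for horizon in (1, 5, 20):
    point = base_level + trend_per_year * horizon
    margin = one_year_error * math.sqrt(horizon)   # uncertainty grows with horizon
    print(f"{horizon:>2} years out: {report(point, margin)}")
```

The one-year figure can honestly bear the nearest ten; the twenty-year figure supports only the nearest few hundred.  The point is not the arithmetic but the discipline: the precision we report tracks what the data and the uncertainty will actually bear.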

In forecasting, we are usually beguiled by computing power and ignore what ought to be common sense.  We conflate data based on real-world observations with computer calculations, and build enormous numerical “datasets” which are often based on insubstantial evidence.  We apply techniques that computers can handle, like linear optimisation, because computers can handle them, rather than because they are the most appropriate to the problem we are trying to solve.  We forget that “future data” is not data.

Insidiously, there is a tendency to think that analysis has not been done properly if it is approximate or data-light.  In the business world, we want value for our money, which usually means more number-crunching rather than less; quantity masquerading as quality.

In part, the answer lies in increased statistical awareness.  A number of books have been published in recent years, including The Black Swan by Nassim Taleb and The Drunkard’s Walk by Leonard Mlodinow, written in the apparent belief that ignorance about statistics and randomness lies at the heart of many social ills.  No doubt such contributions are helpful, though they neglect the psychological elements: our yearning for certainty, for example, and the temptation to interact with a computer screen we can control rather than a human being we cannot.

As much as anything, forecasters need to be honest with themselves and their clients about what it is genuinely possible to forecast.  If they are, then there is potential for learning from history, in defiance of Hegel.  Like the wily tennis veteran, they can learn techniques such as when it makes sense to assume a lack of foresight, when to amalgamate data rather than dissect it, how to incorporate a variety of possible outcomes, and how to judge (and hence give credence to) forecasts on the basis of the context in which they are made.  A course of action may then be proposed that isn’t necessarily optimal, given the impossibility of knowing what will happen, but is resilient to a wide range of things that might happen.

This trade-off, between a course of action that is optimal and one that is resilient, is an art form.  Here, in the development of this art form, the experienced forecaster does genuinely have something to contribute.  But it requires courage and humility to recognise that human judgement and human interaction are sometimes superior to specious numerical output from a powerful computer: to turn the thing off and say to a client “I do not know, but this is what I advise …”  And it requires, on the client’s side, a willingness to value what a forecaster can really offer.
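To make the trade-off concrete, here is a minimal sketch in Python.  The plans, scenarios and payoffs are invented for illustration, not drawn from any real engagement: one plan does best if the central forecast comes true, another is never disastrous whatever happens, and they are not the same plan.

```python
# A deliberately toy sketch of the optimal-versus-resilient trade-off.
# The plans, scenarios and payoffs below are invented for illustration.

# payoffs[plan][scenario]: outcome of each plan under each demand scenario
payoffs = {
    "small plant": {"low demand": 8,  "central forecast": 10, "high demand": 11},
    "large plant": {"low demand": -4, "central forecast": 14, "high demand": 25},
}

CENTRAL = "central forecast"

# "Optimal" plan: the best payoff if the central forecast turns out to be right.
optimal = max(payoffs, key=lambda plan: payoffs[plan][CENTRAL])

# "Resilient" plan: the best worst-case payoff across all scenarios (maximin).
resilient = max(payoffs, key=lambda plan: min(payoffs[plan].values()))

print(f"Optimal for the central forecast: {optimal} "
      f"(pays {payoffs[optimal][CENTRAL]} if the forecast is right, "
      f"{min(payoffs[optimal].values())} at worst)")
print(f"Resilient across scenarios: {resilient} "
      f"(never pays less than {min(payoffs[resilient].values())})")
```

Whether to recommend the large plant or the small one is precisely the judgement call described above; the arithmetic alone cannot make it.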
