RearView

Managing a business on historical data is like trying to drive a car by watching the line in the rear-view mirror.

Myron Tribus

… and, of course, the terrifying collateral of that observation is that historical data is all we have. Nobody has yet found a way to sample the future. Tribus isn’t saying “Don’t do it!” He’s saying “We can do no other … and isn’t it scary!”

We can make predictions and forecasts. We have to, to steer the car. Tribus’s mental image acts as a potent reminder of the subliminal philosophical commitment management makes in running a business. It exposes the fragility of our foresight.

History
The problem Tribus articulates has a distinguished history. In 1703, Jacob Bernoulli wrote to Gottfried Leibniz, perhaps with half an eye on the money to be made from life insurance. Both men had been investigating games of chance with cards and dice. Both had started to formulate some early, tentative probabilistic analyses of such games. No doubt the advantages of having the calculating edge had crossed their respective minds. Bernoulli asked whether their elementary statistical tools could be used, on the basis of historical data, to calculate the probability of (say) a man aged 20 outliving a man aged 60. Leibniz was pessimistic.

Nature has established patterns originating in the return of events but only for the most part. New illnesses flood the human race, so that no matter how many experiments you have done on corpses, you have not thereby imposed a limit on the nature of events so that in the future they could not vary.

Leibniz had realised that our past experience is the aggregate of a system of interacting causes. Nothing in the data alone can guarantee that the factors that act in the future will be only those causes represented in the historic data.

Scottish philosopher David Hume was much troubled by the problem at a fundamental level. When it came to the question of scientific law, how could a belief in the continued uniformity of nature be justified? Hume’s 1740 solution was that our minds have an efficient faculty for judging when the past is likely to be a good predictor of the future, and when not.

Hume’s analysis was reconsidered by economist John Maynard Keynes in his 1921 Treatise on Probability. Keynes emphasised that simply characterising the historic cause system may be inadequate for predicting the future. He later coined the phrase “uncertain knowledge”.

By “uncertain” knowledge … I do not mean merely to distinguish what is known for certain from what is only probable. The game of roulette is not subject, in this sense, to uncertainty … The sense in which I am using the term is that in which the prospect of a European war is uncertain, or the price of copper and the rate of interest twenty years hence, or the obsolescence of a new invention … About these matters there is no scientific basis on which to form any calculable probability whatever. We simply do not know!

What Keynes emphasised was that some committed belief in historical continuity is necessary to make even a probabilistic prediction about the future. Unless the same cause system continues to act, the future is unknown. If the cause system endures, we can forecast, at least a little way into the future, though usually in no more than a probabilistic way.

That all seems fairly pessimistic. However, our instincts suggest that Hume was at least half right. We do make successful forecasts in many cases. Life would be impossible if we couldn’t. We can also identify problems where historical data may not be much use to us. Driving a mountain pass by the rear-view mirror would be folly. We are also keenly aware that there are many situations where our forecasts are tentative. There are sundry futures that we predict with compromised confidence.

Firm foundations
The first steps towards placing this on a formal foundation were taken in 1924 by British logician William Ernest Johnson, a tutor of Keynes. In Volume III of his work Logic he introduced the idea of exchangeability. Data is exchangeable if we come to an identical (probabilistic) conclusion no matter how we reorder the data. In other words, future samples behave like earlier samples. In the late 1920s, Italian philosopher Bruno de Finetti championed the concept of exchangeability and gave it an exact mathematical definition.
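For the mathematically curious, de Finetti’s representation theorem states the consequence exactly for the binary case (a standard result, quoted here as background rather than as part of the argument): an infinite sequence of 0/1 observations is exchangeable precisely when it behaves as if a single unknown rate θ were drawn once from some distribution μ and the observations then generated independently at that rate:

    P(X_1 = x_1, \ldots, X_n = x_n) \;=\; \int_0^1 \theta^{k} (1 - \theta)^{n - k} \, \mathrm{d}\mu(\theta), \qquad k = x_1 + \cdots + x_n .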

Exchangeability is certainly something over which we have to exercise a Humean judgment. We cannot derive it mechanistically from historical data alone. We may know of, or fear, imminent changes to the cause system. We may be ignorant of them though they lie in wait. But it would be weird not to look at that historical data. If the earlier half of the available data alone would have led us to act differently from the latter half, then the data is not exchangeable, even within its own historical context. Unless we can account confidently for the difference, all hopes of forecasting are ruined.
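As a crude sketch of that half-and-half check (my own illustration, not a formal test; the function name and data are hypothetical, and the 2.66 multiplier is the standard individuals-chart constant), we might ask whether limits computed from the earlier half of a series would have predicted the later half:

    import numpy as np

    def natural_limits(x):
        # Individuals-chart limits: mean +/- 2.66 times the average
        # moving range (2.66 = 3 / 1.128, the bias factor for
        # two-point moving ranges).
        x = np.asarray(x, dtype=float)
        centre = x.mean()
        spread = 2.66 * np.abs(np.diff(x)).mean()
        return centre - spread, centre + spread

    # Illustrative data only: forty weekly figures from a stable process.
    rng = np.random.default_rng(1)
    series = rng.normal(100.0, 5.0, size=40)

    earlier, later = series[:20], series[20:]
    lo, hi = natural_limits(earlier)
    outside = int(np.sum((later < lo) | (later > hi)))
    print(f"{outside} of {later.size} later points fall outside the "
          f"limits the earlier half would have predicted")

With stable data, nearly all the later points should fall within those limits; a marked breach rate would cast doubt on exchangeability.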

Physicist and engineer Walter Shewhart joined the Western Electric Company Inspection Engineering Department at the Hawthorne Works in 1918. At that time, industrial quality management was limited to inspecting finished products and removing defective items. That’s not really even a rear-view mirror. It’s listening for the bang when you hit the kerb. Western Electric were constructing telephone systems across the US. Amplifiers were buried underground and caused serious interruption of service when they failed. Shewhart was tasked with improving reliability. He saw the importance of exchangeability in using data to manage a manufacturing operation. He called exchangeability statistical control. But he went further. In 1924, Shewhart suggested a heuristic method of assessing whether historical data, within its own context, had so far exhibited statistical control/exchangeability. He called his heuristic the control chart.
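For concreteness, here is a minimal sketch of one modern descendant of Shewhart’s heuristic, the individuals (XmR) chart. The function name and data are mine, and this is the standard textbook calculation rather than Shewhart’s original 1924 chart:

    import numpy as np

    def xmr_chart(x):
        # Centre line, natural process limits and out-of-limit signals
        # for an individuals (XmR) chart. Dispersion is estimated from
        # the average two-point moving range (d2 = 1.128).
        x = np.asarray(x, dtype=float)
        centre = x.mean()
        sigma_hat = np.abs(np.diff(x)).mean() / 1.128
        lcl, ucl = centre - 3 * sigma_hat, centre + 3 * sigma_hat
        signals = np.flatnonzero((x < lcl) | (x > ucl))
        return centre, lcl, ucl, signals

    # Illustrative data: a stable process with an injected upset.
    rng = np.random.default_rng(2)
    x = rng.normal(50.0, 2.0, size=30)
    x[27] += 20.0        # an assignable cause strikes at observation 27
    centre, lcl, ucl, signals = xmr_chart(x)
    print(f"centre {centre:.1f}, limits [{lcl:.1f}, {ucl:.1f}], "
          f"signals at {signals.tolist()}")

The injected upset at observation 27 lands well outside the natural process limits and shows up in the signal list.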

Shewhart likely chose the term “statistical control” because his motivation was improving the effectiveness of manufacturing and reliability of finished articles. Shewhart certainly saw quality control (sic) as a critical motivation in developing broader statistical practice. One of his most celebrated books is called Statistical Method from the Viewpoint of Quality Control (sic). However, “statistical control” doesn’t sit comfortably with many of the great varieties of data we encounter in business, economics and society.

More recently, statistician Don Wheeler has championed the less obscure term stable and predictable as carrying less baggage than “exchangeable/in statistical control”. Wheeler has also argued that control charts are better called process behaviour charts. That’s the terminology I adopt throughout this blog.

Thinking about driving
The power of the process behaviour chart is that it gives us a visual picture of whether our historical data is, so far, stable and predictable. If not, all attempts to predict and forecast are imperilled. Further, the process behaviour chart gives us early warning of when a previously stable and predictable process is beginning to behave unpredictably. Given all the unknowable futures, it offers us the best early warning signal anyone has yet devised. It tells us when driving might be, within certain limits, safe. It tells us when driving is definitely dangerous and we need urgently to implement defensive measures, perhaps braking to a halt.
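In practical terms, that early-warning role amounts to freezing the limits computed from a stable baseline and judging each new observation against them. A hedged sketch follows; the run-of-eight test is one of the standard supplementary detection rules, and all names and figures here are illustrative:

    import numpy as np

    def frozen_limits(baseline):
        # Limits frozen from a period already judged stable and predictable.
        b = np.asarray(baseline, dtype=float)
        centre = b.mean()
        spread = 2.66 * np.abs(np.diff(b)).mean()
        return centre, centre - spread, centre + spread

    def early_warning(new_points, centre, lcl, ucl, run_length=8):
        # Signal if a point breaches the limits, or if run_length
        # successive points sit on one side of the centre line.
        side_run, last_side = 0, 0
        for i, y in enumerate(new_points):
            if y < lcl or y > ucl:
                return i, "point outside natural process limits"
            side = 1 if y > centre else -1
            side_run = side_run + 1 if side == last_side else 1
            last_side = side
            if side_run >= run_length:
                return i, f"{run_length} successive points on one side of centre"
        return None, "no signal: keep driving, keep watching"

    baseline = [20.1, 19.8, 20.3, 19.9, 20.0, 20.2, 19.7, 20.1, 19.9, 20.4,
                20.0, 19.8, 20.2, 20.1, 19.9, 20.3, 19.6, 20.0, 20.2, 19.9]
    centre, lcl, ucl = frozen_limits(baseline)
    # A small sustained shift: no single point breaches the limits,
    # but the run rule still raises the alarm.
    new = [20.3, 20.5, 20.4, 20.6, 20.2, 20.5, 20.4, 20.6, 20.3]
    print(early_warning(new, centre, lcl, ucl))

With deterministic figures like these, the run rule fires on the eighth successive point above the centre line, before any individual point breaches the limits.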

Psychologist and Nobel laureate Daniel Kahneman has researched decision making under uncertainty for several decades. Kahneman has identified what he labels as two “systems” that typify human thought. System 1 is instinctive, fluent, heuristic and integrated with the experience base. System 1 is overconfident and often leads us astray, especially where there is some history of performance that is remembered, at least, as satisfactory. Kahneman links System 1 with Nassim Taleb’s thinking on Black Swans. System 1 tells us that we can cruise along the undisclosed highway ahead because we can’t recall ever hitting the kerb, except perhaps on a couple of occasions that “don’t count”.

System 2 employs reflective, considered analysis. It can, when properly guided by statistical theory, guard against the hazards of System 1. System 2 is alert to the dangers of driving with no forward view. However, it demands focus and application, and is easily distracted and fatigued. There is only a limited quantity of System 2 processing that we can deploy at any instant. Even when endeavouring to think in System 2, there is a tendency to fall back on some simplified heuristic, really more characteristic of System 1.

Assessing stability and predictability is really a System 2 activity. However, in any business, System 2 thinking is the scarcest of resources. In fact, it is the allocation and management of System 2 thinking that fundamentally determines the prospects for business success. The process behaviour chart offers an efficient heuristic that tells us when it is (relatively) safe to relax into System 1 thinking and press ahead on the experience base.

A signal on the process behaviour chart is the catalyst to switch from System 1 to System 2 thinking. The experience base is no longer reliable. Cruising forwards is perilous. Costs must be incurred in implementing defensive measures. Reflective thought must be employed to investigate and understand what has changed and what its implications are.

The road ahead
Of course, relying slavishly on the chart is the sort of System 1 thinking that leads a business into danger. However, the process behaviour chart is a key tool in helping us with the very difficult task of knowing how to divide our attention. It helps us to know when to worry, and when not to worry (too much).

Predict and forecast we must, to steer the car or the business. Process stability and predictability is essential to do that reliably and economically. The process behaviour chart challenges us to confront stability and predictability. It goes further and sounds the alarm when circumstances demand that we snap into System 2 thinking.

I think the power of Tribus’s observation is that it forces us to confront the precarious nature of prediction. All is not lost. The statistical instrument of the process behaviour chart, and Kahneman’s elucidation of the heuristics and biases that beset rational expectations, provide an open invitation to engage more fully in the art of prediction. The true power of synthesising these insights is yet to be exploited.

Anthony Cutler
