From nylon to a Covid vaccine: The good news about innovation

Thus, an activity will in general have two valuable consequences: the physical outputs themselves and the change in information about other activities.

Kenneth J Arrow
Essays in the Theory of Risk-Bearing

That seems like an obscure theoretical point in academic economics but there are plenty of illustrations. Arrow himself drew attention to the development of nylon. DuPont developed the polymer nylon in the interval between the two World Wars. It was an enormous commercial success for them. That was the obvious valuable consequence of the “development of nylon” activity. Arrow’s point was that, if we look at this from the perspective of global wealth, the contribution of nylon itself was tiny compared with the value to society of knowing that things like nylon are possible and valuable. Once other companies knew that, they were emboldened and incentivised to revisit the physico-chemical fundamentals, learn the chemical engineering technologies and build their own knowledge base. DuPont themselves now had a lead in knowledge and an advantage in know-how teeming with commercial potential. The fast followers enjoyed the advantage that a big chunk of the risk had already been sunk by DuPont. The industrial development and marketing of polymers during the twentieth century played an important role in global wealth creation. You probably have to be a certain age to remember papier mâché washing-up bowls. Or just washing-up bowls.

Arrow’s proposition is a broad and general economic principle not a mathematical theorem. My hunch is that it’s going to be validated again with the global response to the Covid-19 pandemic. On 9 November 2020, the news media covered the announcement by Pfizer and BioNTech that they had, in partnership, developed the first effective vaccine candidate against Covid-19. That is blessed news. My hunch is that the knowledge won in developing that vaccine will benefit society far beyond ridding us of this ghastly pestilence.

Since then there have already been other vaccines announced. The intellectual effort in developing the BioNTech vaccine was driven by two individuals, Uğur Şahin and Özlem Türeci. Şahin and Türeci’s discovery represents the unfashionable globalist, liberal intellectual culture of Europe and the Levant. But their ideas would have gone nowhere without the, equally unfashionable, American systematic productivity, capital and risk appetite of Pfizer. It is a similar story to the development of penicillin. European creativity and American energy. British artist David Hockney loved California because he felt it offered the best of both worlds. Pace Harold Macmillan, the Europeans are ever the Greeks to the American Romans.

The Covid-19 vaccine itself was developed from the technologies that BioNTech had already fostered and exploited in the different field of individualised cancer immunotherapy. I have to confess that my scientific tastes are more for the mechanical and the electrical and I understand nothing of the science here. However, I can see that the benefits of novel cancer treatments have turned out to have created value in a wholly different area of medicine. As Arrow would have predicted. My bet is that the scientific and technological advances spurred by Covid-19, not least the management skills in expediting clinical trials, will turn out to have wider external benefits, individually unpredictable but a moral certainty as an engine of wealth creation.

Faith in economic laws is a useful mindset. Julian Simon used to remark that people in general find no difficulty in accepting that, if there exist conducive economic conditions, cheese will be manufactured. However, the proposition that, if there exist conducive economic conditions, technological innovation will occur, meets more sceptical resistance. Surely technological innovation is different from cheese. But is it?

There are lots of books about innovation around at the moment. They rehearse many fascinating anecdotes but I find there is no useful over-arching theory. Anecdotes are useful nonetheless, to me and to you, but you need to be savvy about uncertainty and causation. That said, I think the following is an essential, and almost true, insight. Certainly for business.

While knowledge is orderly and cumulative, information is random and miscellaneous.

Daniel J Boorstin

Convivial knowledge management

If you want to teach people a new way of thinking, don’t bother trying to teach them. Instead, give them a tool, the use of which will lead to new ways of thinking.

Buckminster Fuller

All that suggests that the most important thing any organisation should be working on is Knowledge Management: capturing as much as possible of the byproduct learning that the organisation produces which is not oriented to immediate goals. Again, software packages holding themselves out as tools abound. I am sure many are worth the licence fee. Try them and see. But do keep an account of costs and benefits.

I would like to suggest two simple technologies, what Ivan Illich would have called convivial tools. One I have talked about a lot on this blog: the Shewhart chart. The other remains, I think, underexplored and underexploited: the wiki. Wikipedia is one of those things that only works in practice, a collaborative encyclopaedia with no editorial authority. It’s for you to judge whether it has been a success.

My view is that the scope for using Wikis in intra-organisational knowledge building has yet to be fully exploited. Wikis can be used for collaborative development of searchable and structured manuals of fact, insight and open questions. A Shewhart chart that could be collaboratively edited would be a very powerful thing. But that was Shewhart’s original intention, a live document continually noted-up with the insights of the workers using it. Perhaps in modern times we would want this to be all cloud based rather than a sheet of paper on a workshop wall. A wiki-Shewhart chart.

Where can we buy the software?

The “Graph of Doom” 9 years on

I first blogged about this soi-disant “Graph of Doom” (“the Graph”) back in 2013. Recent global developments have put everyone in mind of how dependent we are on predictions and forecasts and just how important it is to challenge them with actual data. In the process, we should learn something about the thing we are forecasting.

[Image: the “Graph of Doom”]

I first came across the Graph in a piece by David Brindle in The Guardian on 15 May 2012. As far as I can see this comes from a slide pack delivered at Barnet London Borough dated 29 November 2011.

The Graph was shared widely on social media in the context of alarm as to an impending crisis, not just in Barnet but, by implication, in local government funding and spending and social care across the UK. To be fair to Brindle, and most of the other mainstream commentators, they did make it clear that this was a projection. As he said, “The graph should not be taken too literally: by making no provision for Barnet’s anticipated rise in income through regeneration schemes, for instance, it overstates the bleakness of the outlook.”

When I blogged about this in 2013, I made the following points about the Graph, and about the charted predictions, forecasts and projections in particular.

  1. Use ink on data rather than speculation.
  2. Ditto for chart space.
  3. Chart predictions using a distinctive colour or symbol so as to be less prominent than measured data.
  4. Use historical data to set predictions in context.
  5. Update the chart as soon as predictions become data.
  6. Ensure everybody who got the original chart gets the updated chart.
  7. Leave the prediction on the updated chart.

Nine years on, as far as I can see from my web search, points 5 to 7 have not been addressed, certainly not in the public domain. I am disappointed that none of the commentators has taken the opportunity to return to it. As I set out below, there’s a great story here. I decided it was down to me.

I went to look at Barnet’s website to search for published accounts. I wanted to see if I could find out how this actually evolved. I did not find this easy. The relevant accounts are not easy to find on the website and I am not an accountant. Perhaps a large proportion of Barnet’s residents are. Firstly, I could not find any figures before 2012/13, so I am still unsure whether the 2010/11 picture is forecast, preliminary or actual. There also seemed to be a number of different analysis models within which accounts were given. After a bit of judicious guesswork, matching numbers, I decided that the projected budget referred to the General Fund Revenue Budget (“the GFRB”), which is, it says, the account to which revenue expenditure and income are charged for the council’s services (excluding the Housing Revenue Account). The service budgets must then refer to the expenditures charged against that account. I found finalised accounts for 2012/13 to 2018/19. There were provisional accounts for 2019/20 but, as far as I could see, those did not include the GFRB so didn’t really assist.

I’m happy to be corrected on this by anybody who has a better view on the relevant numbers.

I didn’t have the original data to plot afresh, or the forecasting model. I have had to over-plot a bitmap. Not a perfect situation. I could not address all the data visualisation criticisms I made in my earlier post. That said, here is the Graph with the actual budgets and expenditures.

[Image: the Graph of Doom overlaid with actual budgets and expenditures]

I am guessing that the original Graph adjusted future revenues and expenditures to 2011 prices. I have, therefore, replotted adjusting for CPIH, the government’s preferred inflation measure. This is a measure of consumer price inflation but I found nothing better for indexing local government expenditure. I am not an economist. Here is the adjusted chart.

[Image: the Graph of Doom with actuals, adjusted for CPIH]
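The rebasing arithmetic is mechanical. Here is a minimal sketch; the index values and spending figures below are invented placeholders, not actual CPIH or Barnet data.

```python
# Rebase a nominal spending series to base-year prices using an index
# such as CPIH. All figures below are illustrative, not real data.

def to_real_terms(nominal, index, base_year):
    """Convert {year: nominal amount} to amounts at base_year prices."""
    base = index[base_year]
    return {year: value * base / index[year] for year, value in nominal.items()}

nominal_spend = {2011: 270.0, 2015: 280.0, 2019: 300.0}  # £m, invented
cpih = {2011: 93.0, 2015: 100.0, 2019: 108.5}            # index, invented

real_spend = to_real_terms(nominal_spend, cpih, base_year=2011)
for year, value in sorted(real_spend.items()):
    print(f"{year}: £{value:.1f}m at 2011 prices")
```

A series flat in cash terms falls in real terms whenever the index rises, which is exactly the distortion the rebased chart is meant to remove.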

There’s actually a great story here for somebody. This is not boring! It certainly looks as though forecasts of total funding held up well, a little below predicted. However, expenditure on social care appears to have diminished well below the parlous levels projected in the Graph of Doom. It has gone down rather than up. That must be because:

  1. Barnet have done a wonderful job in performing these services more efficiently.
  2. The effectiveness and performance of the services has deteriorated.
  3. The demographic forecasts were inaccurate.

I am betting against (3) because demographic forecasts over so short a period don’t have many ways of going wrong. I am surprised that, if (1) is the case, then Conservative members of Barnet London Borough aren’t shouting very loudly about it. Conversely, if (2), I’m surprised that Labour members are silent. What I’m looking for is somebody to put the Graph of Doom back on the stage and use it to celebrate success or attack an opponent. I would expect the national party principals to find capital in the data. Data. Perhaps it is more “nuanced” but that still sounds like an interesting story. Of course, I would also like to see some data about the effectiveness of the social services. That’s a huge part of this narrative too. Perhaps I shall look for that myself.

I would have thought that there was a good story here for a data journalist. Our mainstream media still have, thankfully, plenty of left and of right sympathies.

We need improvement stories to inspire, motivate and educate to broader and more diverse improvement efforts. We need warnings of scandal and failed public provision to inspire, motivate and educate to broader and more diverse improvement efforts. We need to show not tell.

I do just note that Barnet’s accounts also have forecasts for each succeeding year. These are so good I haven’t felt it worth blogging about them. Perhaps it all carries the spoor of rule 4 of Nelson’s funnel. But that is another story. Worth a journalist’s time I think.

I’ll be back.

Data versus modelling

Life can only be understood backwards; but it must be lived forwards.

Søren Kierkegaard

Journalist James Forsyth was brave enough to write the following in The Spectator, 4 July 2020 in the context of reform of the UK civil service.

The new emphasis on data must properly distinguish between data and modelling. Data has real analytical value – it enables robust discussion of what has worked and what has not. Modelling is a far less exact science. In this [Covid-19] crisis, as in the 2008 financial crisis, models have been more of a hindrance than a help.

Now, this glosses a number of issues that I have gone on about a lot on this blog. It’s a good opportunity for me to challenge again what I think I have learned from a career in data, modelling and evidence.

Data basics

Pick up your undergraduate statistics text. Turn to page one. You will find this diagram.

[Diagram: population, sampling frame and sample]

The population, and be assured I honestly hate that term but I am stuck with it, is the collection of all things or events, individuals, that I passionately want to know about. All that I am willing to pay money to find out about. Many practical facets of life prevent me from measuring every single individual. Sometimes it’s worth the effort and that’s called a census. Then I know everything, subject to the performance of my measurement process. And if you haven’t characterised that beforehand you will be in trouble. #MSA

In many practical situations, we take a sample. Even then, not every single individual in the population will be available for sampling within my budget. Suppose I want to market soccer merchandise to all the people who support West Bromwich Albion. I have no means to identify who all those people are. I might start with season ticket holders, or individuals who have bought a ticket online from the club in the past year, or paid for multiple West Brom games on subscription TV. I will not even have access to all those. Some may have opted to protect their details from marketing activities under the UK GDPR. What is left, no matter how I choose to define it, is called the sampling frame. That is the collection of individuals that I have access to and can interrogate, in principle. The sampling frame is all those items I can put on a list from one to whatever. I can interrogate any of them. I will probably, just because of cost, take a subset of the frame as my sample. As a matter of pure statistical theory, I can analyse and quantify the uncertainty in my conclusions that arises from the limited extent of my sampling within the frame, at least if I have adopted one of the canonical statistical sampling plans.
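To make the tame statistical part concrete, here is a minimal sketch. The frame, the supporter attribute and all the numbers are simulated, purely for illustration; the interval quantifies only the sample-to-frame uncertainty.

```python
# Simple random sample from a finite frame, with a normal-approximation
# 95% confidence interval for a proportion. Entirely simulated data.
import math
import random

random.seed(1)
frame = [random.random() < 0.8 for _ in range(50_000)]  # True = supporter

sample = random.sample(frame, k=1_000)  # SRS without replacement
p_hat = sum(sample) / len(sample)
se = math.sqrt(p_hat * (1 - p_hat) / len(sample))
low, high = p_hat - 1.96 * se, p_hat + 1.96 * se
print(f"estimate {p_hat:.3f}, 95% CI ({low:.3f}, {high:.3f})")
```

The interval says something about the 50,000 individuals in the frame. It says nothing whatever about supporters outside it.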

However, statistical theory tells me nothing about the uncertainty that arises in extrapolating (yes it is!) from frame to population. Many supporters will not show up in my frame, those who follow from the sports bar for example. Some in the frame may not even be supporters but parents who buy tickets for offspring who have rebelled against family tradition. In this illustration, I have a suspicion that the differences between frame and population are not so great. Nearly all the people in my frame will be supporters and neglecting those outside it may not be so great a matter. The overlap between frame and population is large, even though it may not be perfect. However, in general, extrapolation from frame to population is a matter for my subjective subject matter insight, market and product knowledge. Statistical theory is the trivial bit. Using domain knowledge to go from frame to population is the hard work. Not only is it hard work, it bears the greater part of the risk.

Enumerative and analytic statistics

W Edwards Deming was certainly the most famous statistician of the twentieth century. So long ago now. He made a famous distinction between two types of statistical study.

Enumerative study: A statistical study in which action will be taken on the material in the frame being studied.

Analytic study: A statistical study in which action will be taken on the process or cause-system that produced the frame being studied. The aim being to improve practice in the future.

Suppose that a company manufactures 1000 overcoats for sale on-line. An inspector checks each overcoat of the 1000 to make sure it has all three buttons. All is well. The 1000 overcoats are released for sale. No way to run a business, I know, but an example of an enumerative study. The 1000 overcoats are the frame. The inspector has sampled 100% of them. Action has been taken on the 1000 overcoats, the 1000 overcoats that were, themselves, the sampling frame. Sadly, this is what so many people think statistics is all about. There is no ambiguity here in extrapolating from frame to population as the frame is the population.

Deming’s definition of an analytic study is a bit more obscure with its reference to cause systems. But let’s take a case that is, at once, extreme and routine.

When we are governing or running a commercial enterprise or a charity, we are in the business of predicting the future. The past has happened and we are stuck with it. This is what our world looks like.

[Diagram: the frame is the past; the population is the future]

The frame available for sampling is the historical past. The data that you have is a sample from that past frame. The population you want to know about is the future. There is no area of overlap between past and future, between frame and population. All that stuff in statistics books about enumerative studies, that is most of the contents, will not help you. Issues of extrapolating from sample to frame, the tame statistical matters in the text books, are dwarfed by the audacity of projecting the frame onto an ineffable future.

And, as an aside, just think about what that means when we are drawing conclusions about future human health from past experiments on mice.

What Deming pointed towards, with his definition of analytic study, is that, in many cases, we have enough faith to believe that both the past and future are determined by a common system of factors, drivers, mechanisms, phenomena and causes, physiochemical and economic, likely interacting in a complicated but regular way. This is what Deming meant by the cause system.

Managing and governing are both about pulling levers to effect change. Dwelling on the past will only yield beneficial future change if exploited, mercilessly, to understand the cause system. To characterise what are the levers that will deliver future beneficial outcomes. That was Deming’s big challenge.

The inexact science of modelling

And to predict, we need a model of the cause system. This is unavoidable. Sometimes we are able to use the simplest model of all: that the stream of data we are bothered about is exchangeable or, if you prefer, stable and predictable. As I have stressed so many times before on this blog, to do that we need:

  • Trenchant criticism of the experience base that shows an historical record of exchangeability; and
  • Enough subject matter insight into the cause system to believe that such exchangeability will be maintained, at least into an immediate future where foresight would be valuable.

Here, there is no need quantitatively to map out the cause system in detail. We are simply relying on its presumed persistence into the future. It’s still a model. Of course, the price of extrapolation is eternal vigilance. Philip Tetlock drew similar conclusions in Superforecasting.
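Criticising the experience base for exchangeability is exactly what a Shewhart chart does. Here is a minimal sketch of the XmR (individuals and moving range) flavour, with invented data; points outside the natural process limits are signals that the record is not exchangeable.

```python
# XmR (individuals) chart limits from a data series. The series is invented.

def xmr_limits(data):
    """Natural process limits for an individuals chart."""
    mean = sum(data) / len(data)
    mrs = [abs(b - a) for a, b in zip(data, data[1:])]  # moving ranges
    mr_bar = sum(mrs) / len(mrs)
    # 2.66 is the standard XmR constant (3 / d2, with d2 = 1.128 for n = 2)
    return mean - 2.66 * mr_bar, mean + 2.66 * mr_bar

series = [52, 49, 51, 50, 53, 48, 50, 51, 49, 61]  # last point drifts
lcl, ucl = xmr_limits(series)
signals = [x for x in series if not lcl <= x <= ucl]
print(f"limits ({lcl:.1f}, {ucl:.1f}); signals: {signals}")
```

With the final point signalled, we have no business extrapolating until its cause is understood.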

But often we know that critical influences on the past are prey to change and variation. Climates, councils, governments … populations, tastes, technologies, creeds and resources never stand still. As visible change takes place we need to be able to map its influence onto those outcomes that bother us. We need to be able to do that in advance. Predicting sales of diesel motor vehicles based on historical data will have little prospect of success unless we know that they are being regulated out of existence, in the UK at least. And we have to account for that effect. Quantitatively. This requires more sophisticated modelling. But it remains essential to any form of prediction.

I looked at some of the critical ideas in modelling here, here and here.

Data v models

The purpose of models is not to fit the data but to sharpen the questions.

Samuel Karlin

Nothing is more useless than the endless collection of data without a will to action. Action takes place in the present with the intention of changing the future. To use historical data to inform our actions we need models. Forsyth wants to know what has worked in the past and what has not. That was then, this is now. And it is not even now we are bothered about but the future. Uncritical extrapolation is not robust analysis. We need models.

If we don’t understand these fundamental issues then models will seem more a hindrance than a help.

But … eternal vigilance.

Social distancing and the Theory of Constraints

[Image: an organised queue or line1]

I was listening to the BBC News the other evening. There was discussion of return to work in the construction industry. A site foreman was interviewed and he was clear in his view that work could be resumed, social distancing observed, safety protected and valuable work done.

Workplace considerations are quite different from those in my recent post in which I was speculating how an “invisible hand” might co-ordinate independently acting and relatively isolated agents who were aspiring to socially isolate. The foreman in the interview had the power to define and enforce a business process, repeatable, measurable, improvable and adaptable.

Of course, the restrictions imposed by Covid-19 will be a nuisance. But how much? To understand the real impact they may have on efficiency requires a deeper analysis of the business process. I’m sure that the foreman and his colleagues had done it.

There won’t be anyone reading this blog who hasn’t read Eliyahu Goldratt’s book, The Goal.2 The big “takeaway” of Goldratt’s book is that some of the most critical outcomes of a business process are fundamentally limited by, perhaps, a single constraint in the value chain. The constraint imposes a ceiling on sales, throughput, cash flow and profit. It has secondary effects on quality, price, fixed costs and delivery. In many manufacturing processes it will be easy to identify the constraint. It will be the machine with the big pile of work-in-progress in front of it. In more service-oriented industries, finding the constraint may require some more subtle investigation. The rate at which the constraint works determines how fast material moves through the process towards the customer.

The simple fact is that much management energy expended in “improving efficiency” has nil (positive) effect on effectiveness, efficiency or flexibility (the “3Fs”). Working furiously will not, of itself, promote the throughput of the constraint. Measures such as Overall Equipment Effectiveness (OEE) are useless and costly if applied to business steps that, themselves, are limited by the performance of a constraint that lies elsewhere.
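OEE itself is just the product of three ratios: availability, performance and quality. A sketch of the arithmetic, with invented shift figures:

```python
# Overall Equipment Effectiveness = availability x performance x quality.
# The shift figures below are invented, purely to show the arithmetic.

def oee(planned_min, downtime_min, ideal_rate_per_min, total_units, good_units):
    availability = (planned_min - downtime_min) / planned_min
    performance = total_units / ((planned_min - downtime_min) * ideal_rate_per_min)
    quality = good_units / total_units
    return availability * performance * quality

# 480-minute shift, 60 minutes down, ideal rate 1 unit/min,
# 378 units made of which 360 were good:
print(f"OEE = {oee(480, 60, 1.0, 378, 360):.1%}")
```

A perfectly respectable number, and perfectly useless if the machine it describes is not the constraint.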

That is the point about the construction industry, and much else. The proximity of the manual workers is not necessarily the constraint. The same must be true in many other businesses and processes.

I did a quick internet search on the Theory of Constraints and the current Covid-19 pandemic. I found only this, rather general, post by Domenico Lepore. There really wasn’t anything else on the internet that I could find. Lepore is the author of, probably, the most systematic and practical explanation of how to implement Goldratt’s theories.3 Once the constraint is identified:

  • Prioritise the constraint. Make sure it is never short of staff, inputs or consumables. Eliminate unplanned downtime by rationalising maintenance. Plan maintenance when it will cause least disruption but work on the maintenance process too. Measure OEE if you like. On the constraint.
  • Make the constraint’s performance “sufficiently regular to be predictable”.4 You can now forecast and plan. At last.
  • Improve the throughput of the constraint until it is no longer the constraint. Now there is a new constraint to attack.
  • Don’t forget to keep up the good work on the old constraint.
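The arithmetic behind these steps is blunt: in a serial process, throughput is set by the rate of the slowest step. A toy sketch, with invented step names and rates:

```python
# System throughput of a serial process equals the rate of the constraint,
# so improving a non-constraint step changes nothing. Figures are invented.

def throughput(rates):
    """Units/hour through a serial process = rate of the slowest step."""
    return min(rates.values())

line = {"cutting": 120, "welding": 45, "painting": 90}  # welding constrains

before = throughput(line)
line["painting"] = 180          # "improve" a non-constraint: no effect
after_wrong = throughput(line)
line["welding"] = 60            # elevate the constraint: throughput rises
after_right = throughput(line)
print(before, after_wrong, after_right)  # 45 45 60
```

Doubling the painting rate achieves nothing; elevating welding moves the whole system. That is the prioritise-then-improve logic of the list above in miniature.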

This is, I think, a useful approach to some Covid-19 problems. Where is the constraint? Is it physical proximity? If so, work to manage it. Is it something else? Then you are already stuck with the throughput of the constraint. Serve it in a socially-distanced way.

The court system of England and Wales

Here is a potential example that I was thinking about. Throughput in the court system of England and Wales has, since the onset of Covid-19, collapsed. Certainly in the civil courts, personal injury cases, debt recovery, commercial cases, property disputes, professional negligence claims. There has been more action in criminal and family courts, as far as I can see. Some hearings have taken place by telephone or by video but throughput has been miserable. Most civil courts remain closed other than for matters that need the urgent attention of a judge.

And that is the point of it. The judge, judicial time, is the constraint in the court system. Judgment, or at least the prospect thereof, is the principal way the courts add value. Much of civil procedure is aimed at getting the issues in a proper state for the judge to assess them efficiently and justly. The byproduct of that is that, once the parties have each clarified the issues in dispute, there may then be a window for settlement.

What has horrified the court service is the prospect of the sort of scrum of lawyers and litigants that is common in the inadequate waiting and conference facilities of most courts. That scrum is seen as important. It gives trial counsel an opportunity to review the evidence with their witnesses. It provides an opportunity for negotiation and settlement. Trial counsel will be there face to face with their clients. Offer and counter offer can pass quickly and intuitively between seasoned professionals. Into the mix are added the ushers and clerks who manage the parties securely into the court room. It is a concentrated mass of individuals, beset with frequently inadequate washing facilities.

Court rooms themselves present little problem. Most civil courts in England and Wales are embarrassingly expansive for the few people that generally attend hearings. Very commonly just the judge and two advocates. I cannot think of that many occasions when there will have been any real difficulty in keeping two metres apart.

With the judge as the constraint and the court room not, what remains is the issue of getting people into court. Why is that mass of people routinely in the waiting room? Well, to some extent it serves, in the language of Lean Production, as a “supermarket”,5 a reservoir of inputs that guarantees the judicial constraint does not run dry of work.6 Effective but not necessarily efficient. This is needed because hearing lengths are difficult to predict. Moreover, some matters settle at court, as set out above. Some the afternoon before. For some matters, nobody turns up. The parties have moved on and not felt it important to inform the court.

As to providing the opportunity for taking instructions and negotiation that is, surely, a matter that the parties can be compelled to address, by telephone or video, on the previous afternoon. The courts here can borrow ideas from Single-Minute Exchange of Dies. This, in any event, seems a good idea. The parties would then be attending court ready to go. The waiting facilities would not be needed for their benefit. The court door settlements would have been dealt with.

The only people who need waiting accommodation are the participants in the next hearing. In most cases they can be accommodated, distanced and will have sufficient, even if sparse, washing facilities. These ideas are not foreign to the court system. It has been many years since a litigant or lawyer could just turn up at the court counter without first telephoning for an appointment, even on an urgent matter.

That probably involves some less ambitious listing of hearings. It may well have moved the constraint away from the judge to the queuing of parties into court. However, once the system is established, and recognised as the constraint, it is there to be improved. Worked on constantly. Thought about in the bath. Worried at on a daily basis.

Generate data. Analyse it. Act on it. Work. Use an improvement process. DMAIC is great but other improvement processes are available.

I’m sure all this thinking is going on. I can say no more.

References

  1. Image courtesy of Wikipedia and subject to Creative Commons license – for details see here
  2. Goldratt, E M & Cox, J (1984) The Goal: A Process of Ongoing Improvement, Gower
  3. Lepore, D & Cohen, O (1999) Deming and Goldratt: The Theory of Constraints and the System of Profound Knowledge, North River Press
  4. Kahneman, D (2011) Thinking, Fast and Slow, Allen Lane, p240
  5. “What a good supermarket looks like”, Planet Lean, 4 April 2019, retrieved 24 May 2020
  6. Rother, M & Shook, J (2003) Learning to See: Value-stream Mapping to Create Value and Eliminate Muda, Lean Enterprise Institute, p46


Social distancing and the El Farol Bar problem

Oh, that place. It’s so crowded nobody goes there anymore.

Yogi Berra

If 2020 has given the world a phrase then that phrase is social distancing. However, it put me in mind of a classic analysis in economics/complexity theory, the El Farol Bar problem.

I have long enjoyed running in Hyde Park. With social distancing I am aware that I need to time and route my runs to avoid crowds. The park is, legitimately, popular and a lot of people live within reasonable walking distance. Private gardens are at a premium in this part of West London. The pleasing thing is that people in general seem to have spread out their visits and the park tends not to get too busy, depending on the weather. It is almost as though the populace had some means of co-ordinating their visits. That said, I can assure you that I don’t phone up the several hundred thousand people who must live in the park’s catchment area.

The same applies to supermarket visits. Things seem to have stabilised. This put me in mind of W B Arthur’s 1994 speculative analysis of attendances at his local El Farol bar.1 The bar was popular but generally seemed to be attended by a comfortable number of people, neither unpleasantly overcrowded nor un-atmospherically quiet. This seems odd. Individual attendees had no obvious way of communicating or coordinating. If people, in general, believed that it would be overcrowded then, pace Yogi Berra, nobody would go, thwarting their own expectations. But if there was a general belief it would be empty then everybody would go, again guaranteeing that their own individual forecasts were refuted.

Arthur asked himself how, given this analysis, people seemed to be so good at turning up in the right numbers. Individuals must have some way of predicting the attendance even though that barely seemed possible with the great number of independently acting people.

The model that Arthur came up with was to endow every individual with an ecology of prediction formulas or rules, each applying a simple recipe to the experience base to make a prediction of attendance the following week. Some of Arthur’s examples, along with some others, were:

  • Predict the same as last week’s attendance.
  • Predict the average of the last 4 weeks’ attendances.
  • Predict the same as the attendance 2 weeks ago.
  • Add 5 to last week’s attendance.

Now, every time an individual gets another week’s data he assesses the accuracy of the respective rules. He then adopts the currently most accurate rule to predict next week’s attendance.

Arthur ran a computer simulation. He set the optimal attendance at El Farol as 60. An individual predicting over 60 attendees would stay away. An individual predicting fewer would attend. He found that the time sequence of weekly attendances soon stabilised around 60.
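Here is a minimal re-creation of that kind of simulation. The rule families (lags, moving averages, offsets) and their parameters are my own simplified invention, not Arthur's exact ecology; each agent scores its rules by cumulative absolute error and follows whichever has been most accurate so far.

```python
# A toy El Farol simulation: heterogeneous predictor ecologies, with each
# agent attending only if its current best rule forecasts 60 or fewer.
# Rule families and scoring are simplified from Arthur's 1994 setup.
import random

random.seed(42)
N, CAPACITY = 100, 60

def make_rule():
    """Return one simple predictor over the attendance history."""
    kind = random.choice(["lag", "avg", "offset"])
    if kind == "lag":
        k = random.randint(1, 4)
        return lambda h: h[-k]                    # same as k weeks ago
    if kind == "avg":
        m = random.randint(2, 8)
        return lambda h: sum(h[-m:]) / m          # average of last m weeks
    d = random.randint(-10, 10)
    return lambda h: max(0, min(100, h[-1] + d))  # last week plus an offset

class Agent:
    def __init__(self):
        self.rules = [make_rule() for _ in range(5)]
        self.errors = [0.0] * 5                   # cumulative absolute error
    def predict(self, history):
        best = min(range(5), key=lambda i: self.errors[i])
        return self.rules[best](history)
    def update(self, history, actual):
        for i, rule in enumerate(self.rules):
            self.errors[i] += abs(rule(history) - actual)

agents = [Agent() for _ in range(N)]
history = [random.randint(0, 100) for _ in range(8)]  # seed weeks

for week in range(100):
    attendance = sum(a.predict(history) <= CAPACITY for a in agents)
    for a in agents:
        a.update(history, attendance)             # score rules on this week
    history.append(attendance)

print("mean attendance, last 20 weeks:", sum(history[-20:]) / 20)
```

In Arthur's runs the weekly attendance soon hovered around the capacity of 60, despite no agent communicating with any other.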

Fig. 1: Simulated weekly attendance at El Farol, stabilising around the optimum of 60

There are a few points to pull out of that about human learning in general. What Arthur showed is that individuals, and communities thereof, have the ability to learn in an ill-defined environment in an unstructured way. Arthur was not suggesting that individuals co-ordinate by self-consciously articulating their theories and systematically updating on new data. He was suggesting the sort of unconscious and implicit decision mechanism that may inhabit the windmills of our respective minds. Mathematician and philosopher Alfred North Whitehead believed that much of society’s “knowledge” was tied up in such culturally embedded and unarticulated algorithms.2

It is a profoundly erroneous truism, repeated by all copy-books and by eminent people when they are making speeches, that we should cultivate the habit of thinking of what we are doing. The precise opposite is the case. Civilization advances by extending the number of important operations which we can perform without thinking about them. Operations of thought are like cavalry charges in a battle — they are strictly limited in number, they require fresh horses, and must only be made at decisive moments.

The regularity trap

Psychologists Gary Klein and Daniel Kahneman investigated how firefighters were able to perform so successfully in assessing a fire scene and making rapid, safety critical decisions. Lives of the public and of other firefighters were at stake. Together, Klein and Kahneman set out to describe how the brain could build up reliable memories that would be activated in the future, even in the agony of the moment. They came to the conclusion that there are two fundamental conditions for a human to acquire a predictive skill.3

  • An environment that is sufficiently regular to be predictable.
  • An opportunity to learn these regularities through prolonged practice.

Arthur’s Fig.1, after the initial transient, looks impressively regular, stable and predictable. Some “invisible hand” has moved over the potential attendees and coordinated their actions. So it seems.

Though there is some fluctuation it is of a regular sort, what statisticians call exchangeable variation.
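One way to make “exchangeable variation” concrete: if the order of the observations carries no information, then an order-sensitive statistic such as the lag-1 autocorrelation should look no different on the observed series than on random shufflings of it. The sketch below is illustrative only; the series is simulated noise around 60, a stand-in for the post-transient attendance figures, not Arthur’s actual output.

```python
import random
import statistics

def lag1_autocorr(xs):
    """Lag-1 autocorrelation: a statistic that is sensitive to ordering."""
    m = statistics.fmean(xs)
    num = sum((xs[i] - m) * (xs[i + 1] - m) for i in range(len(xs) - 1))
    den = sum((x - m) ** 2 for x in xs)
    return num / den

rng = random.Random(1)
# Stand-in for post-transient attendance: independent noise around 60.
series = [60 + rng.gauss(0, 5) for _ in range(300)]

obs = lag1_autocorr(series)

# Under exchangeability every reordering of the data is equally likely,
# so the observed value should sit comfortably inside the permutation
# distribution of the same statistic.
perm = []
for _ in range(500):
    shuffled = series[:]
    rng.shuffle(shuffled)
    perm.append(lag1_autocorr(shuffled))

p_value = sum(abs(p) >= abs(obs) for p in perm) / len(perm)
```

A small p_value here would suggest order-dependence, a trend or drift, which is exactly the kind of non-exchangeable signal that should summon Whitehead’s cavalry.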

The power of a regular and predictable process is that it does enable us to keep Whitehead’s cavalry in reserve for what Kahneman called System 2 thinking, the reflective analytical dissection of a problem. It is the regularity that allows System 1 thinking where we can rely on heuristics, habits and inherited prejudices, the experience base.

The fascinating thing about the El Farol problem is that the regularity arises, not from anything consistent, but from data-adaptive selection from the ecology of rules. It is not obvious in advance that such a mechanism can give rise to any stability, even an apparent one. But there is a stability, and an individual can rely upon it to some extent. Certainly as far as a decision to spend a sociable evening is concerned. However, therein lies its trap.

Tastes in venue, rival attractions, new illnesses flooding the human race (pace Gottfried Leibniz), economic crises, … . Sundry matters can upset the regular and predictable system of attendance. And they will not be signalled in advance in the experience base.

Predicting on the basis of a robustly measured, regular and stable experience base will always be a hostage to emerging events. Agility in the face of emerging data-signals is essential. But understanding the vulnerabilities of current data patterns is important too. In risk analysis, understanding which historically stable processes are sensitive to foreseeable crises is essential.

Folk sociology

Folk physics is the name given to the patterns of behaviour that we all exhibit that enable us to catch projectiles, score “double tops” on the dart board, and which enabled Michel Platini to defy the wall with his free kicks. It is not the academic physics of Sir Isaac Newton which we learn in courses on theoretical mechanics and which enables the engineering of our most ambitious monumental structures. However, it works for us in everyday life, lifting boxes and pushing buggies.4

Apes, despite their apparently impressive ability to use tools, it turns out, have no internal dynamic models or physical theories at all. They are unable to predict in novel situations. They have no System 2 thinking. They rely on simple granular rules and heuristics, learned by observation and embedded by successful repetition. It seems more than likely, in many circumstances, as Whitehead advanced, that this is true of humans too.5 Much of our superficially sophisticated behaviour is more habit than calculation, though habit in which is embedded genuine knowledge about our environment and successful strategies of value creation.6 Kahneman’s System 1 thinking.

The lesson of that is to respect what works. But where the experience base looks like the result of a pragmatic adjustment to external circumstances, indulge in trenchant criticism of historical data. And remain agile.

Next time I go out for a run, I’m going to check the weather.

References

  1. Arthur, W B (1994) “Inductive reasoning and bounded rationality”, The American Economic Review, 84 (2), Papers and Proceedings of the Hundred and Sixth Annual Meeting of the American Economic Association, 406-411
  2. Whitehead, A N (1911) An Introduction to Mathematics, Ch.5
  3. Kahneman, D (2011) Thinking, Fast and Slow, Allen Lane, p240
  4. McCloskey, M (1983) “Intuitive physics”, Scientific American, 248(4), 122-130
  5. Povinelli, D J (2000) Folk Physics for Apes: The Chimpanzee’s Theory of How the World Works, Oxford
  6. Hayek, F A (1945) “The use of knowledge in society”, The American Economic Review, 35(4), 519-530