Deconstructing Deming X – Eliminate slogans!

10. Eliminate slogans, exhortations and targets for the workforce.

W Edwards Deming

Neither snow nor rain nor heat nor gloom of night stays these couriers from the swift completion of their appointed rounds.

Inscription on the James Farley Post Office, New York City, New York, USA
William Mitchell Kendall pace Herodotus

Now, that’s what I call a slogan. Is this what Point 10 of Deming’s 14 Points was condemning? There are three heads here, all making quite distinct criticisms of modern management. The important dimension of this criticism is the way in which managers use data in communicating with the wider organisation, in setting imperatives and priorities and in determining what individual workers will consider important when they are free from immediate supervision.

Eliminate slogans!

The US postal inscription at the head of this blog certainly falls within the category of slogans. Apparently the root of the word “slogan” is the Scottish Gaelic sluagh-ghairm meaning a battle cry. It seeks to articulate a solidarity and commitment to purpose that transcends individual doubts or rationalisation. That is what the US postal inscription seeks to do. Beyond the data on customer satisfaction, the demands of the business to protect and promote its reputation, the service levels in place for individual value streams, the tension between current performance and aspiration, the disappointment of missed objectives, it seeks to draw together the whole of the organisation around an ideal.

Slogans are part of the broader oral culture of an organisation. In the words of Lawrence Freedman (Strategy: A History, Oxford, 2013, p564) stories, and I think by extension slogans:

[make] it possible to avoid abstractions, reduce complexity, and make vital points indirectly, stressing the importance of being alert to serendipitous opportunities, discontented staff, or the one small point that might ruin an otherwise brilliant campaign.

But Freedman was quick to point out that the use of stories by consultants and in organisations frequently confused anecdote with data. Stories were commonly used selectively and were often contrived. Freedman sought to extract some residual value from the culture of business stories, in particular drawing on the work of psychologist Jerome Bruner along with Daniel Kahneman’s System 1 and System 2 thinking. The purpose of the narrative of an organisation, including its slogans and shared stories, is not to predict events but to define a context for action when reality is inevitably overtaken by a special cause.

In building such a rich narrative, slogans alone are an inert and lifeless tactic unless woven with the continual, rigorous criticism of historical data. In fact, it is the process behaviour chart that acts as the armature around which the narrative can be wound. Building the narrative will be critical to how individuals respond to the messages of the chart.

Deming himself coined plenty of slogans: “Drive out fear”, “Create joy in work”, … . They are not forbidden. But to be effective they must form a verisimilar commentary on, and motivation for, the hard numbers and ineluctable signals of the process behaviour chart.

Eliminate exhortations!

I had thought I would dismiss this in a single clause. It is, though, a little more complicated. The sports team captain who urges her teammates onwards to take the last-gasp scoring opportunity doesn’t necessarily urge in vain. In that scenario there is no analysis; it is only muscle, nerve, sweat and emotion.

The England team has just suffered a humiliating exit from the Cricket World Cup. The head coach’s response was “We’ll have to look at the data.” Andrew Miller in The Times (London) (10 March 2015) reflected most cricket fans’ view when he observed that “a team of meticulously prepared cricketers suffered a collective loss of nerve and confidence.” Exhortations might not have gone amiss.

It is not, though, a management strategy. If your principal means of managing risk, achieving compelling objectives, creating value and consistently delivering customer excellence, day in, day out, is to yell “One more heave!” then you had better not lose your voice. In the long run, I am on the side of the analysts.

Slogans and exhortations will prove a brittle veneer on a stable system of trouble. It is there that they will inevitably corrode engagement, breed cynicism, foster distrust and mask decline. Only the process behaviour chart can guard against that risk.

Eliminate targets for the workforce!

This one is more complicated. How do I communicate to the rest of the organisation what I need from them? What are the consequences when they don’t deliver? How do the rest of the organisation communicate with me? This really breaks down into two separate topics and they happen to be the two halves of Deming’s Point 11.

I shall return to those in my next two posts in the Deconstructing Deming series.

 

The dark side of discipline

W Edwards Deming was very impressed with Japanese railways. In Out of the Crisis (1986) he wrote this.

The economy of a single plan that will work is obvious. As an example, may I cite a proposed itinerary in Japan:

          1725 h Leave Taku City.
          1923 h Arrive Hakata.
Change trains.
          1924 h Leave Hakata [for Osaka, at 210 km/hr]

Only one minute to change trains? You don’t need a whole minute. You will have 30 seconds left over. No alternate plan was necessary.

My friend Bob King … while in Japan in November 1983 received these instructions to reach by train a company that he was to visit.

          0903 h Board the train. Pay no attention to trains at 0858, 0901.
          0957 h Off.

No further instruction was needed.

Deming seemed to assume that these outcomes were delivered by a capable and, moreover, stable system. That may well have been the case in 1983. However, by 2005 matters had drifted.

[Image: aftermath of the Amagasaki rail crash]

The other night I watched, recorded from the BBC, the documentary Brakeless: Why Trains Crash, about the Amagasaki rail crash of 25 April 2005. I fear that it is no longer available on BBC iPlayer. However, most of the documentaries in the BBC Storyville strand are independently produced and usually have some limited theatrical release or are available elsewhere. I now see that the documentary is available here on Dailymotion.

The documentary painted a picture of a system of “discipline” on the railway where drivers were held directly responsible for outcomes, above all punctuality. This was not a documentary aimed at engineers, but the first thing missing for me was any risk assessment of the way the railway was run. Perhaps it was there, but it is difficult to see what thought process would have led to a failure to mitigate the risks of production pressures.

However, beyond that, for me the documentary raised some important issues of process discipline. We must be very careful when we make anyone working within a process responsible for its outputs. That sounds a strange thing to say, but Paul Jennings at Rolls-Royce always used to remind me: “You can’t work on outcomes.”

The difficulty that the Amagasaki train drivers had was that the railway was inherently subject to sources of variation over which the drivers had no control. In the face of those sources of variation, they were pressured to maintain the discipline of a punctual timetable. The way they did that was to transgress other dimensions of process discipline: in the Amagasaki case, speed limits.

Anybody at work must diligently follow the process given to them. But if that process does not deliver the intended outcome then that is the responsibility of the manager who owns the process, not the worker. When a worker, with the best of intentions, seeks independently to modify the process, they are in a poor position, constrained as they are by their own bounded rationality. They will inevitably be trapped by System 1 thinking.

Of course, it is great when workers can get involved with the manager’s efforts to align the voice of the process with the voice of the customer. However, the experimentation stops when they start operating the process live.

Fundamentally, it is a moral certainty that purblind pursuit of a target will lead to over-adjustment by the worker, what Deming called “tampering”. That in turn leads to increased costs, aggravated risk and vitiated consumer satisfaction.

Richard Dawkins champions intelligent design (for business processes)

Richard Dawkins has recently had a couple of bad customer experiences. In each he was confronted with a system that seemed to him indifferent to his customer feedback. I sympathise with him on one matter but not the other. The two incidents do, in my mind, elucidate some important features of process discipline.

In the first, Dawkins spent a frustrating spell ordering a statement from his bank over the internet. He wanted to tell the bank about his experience and offer some suggestions for improvement, but he couldn’t find any means of channelling and communicating his feedback.

Embedding a business process in software will impose a rigid discipline on its operation. However, process discipline is not the same thing as process petrification. The design assumptions of any process include, or should include, the predicted range and variety of situations that the process is anticipated to encounter. We know that the bounded rationality of the designers will blind them to some of the situations that the process will subsequently confront in real world operation. There is no shame in that but the necessary adjunct is that, while the process is operated diligently as designed, data is accumulated on its performance and, in particular, on the customer’s experience. Once an economically opportune moment arrives (I have glossed over quite a bit there) the data can be reviewed, design assumptions challenged and redesign evaluated. Following redesign the process then embarks on another period of boring operation. The “boring” bit is essential to success. Perhaps I should say “mindful” rather than “boring” though I fear that does not really work with software.

Dawkins’ bank have missed an opportunity to listen to the voice of the customer. That weakens their competitive position. Ignorance cannot promote competitiveness. Any organisation that is not continually improving every process for planning, production and service (pace W Edwards Deming) faces the inevitable fact that its competitors will ultimately make such products and services obsolete. As Dawkins himself would appreciate, survival is not compulsory.

Dawkins’ second complaint was that security guards at a UK airport would not allow him to take a small jar of honey onto his flight because of a prohibition on liquids in the passenger cabin. Dawkins felt that the security guard should have displayed “common sense” and allowed it on board contrary to the black letter of the regulations. Dawkins protests against “rule-happy officials” and “bureaucratically imposed vexation”. Dawkins displays another failure of trust in bureaucracy. He simply would not believe that other people had studied the matter and come to a settled conclusion to protect his safety. It can hardly have been for the airport’s convenience. Dawkins was more persuaded by something he had read on the internet. He fell into the trap of thinking that What you see is all there is. I fear that Dawkins betrays his affinities with the cyclist on the railway crossing.

When we give somebody a process to operate we legitimately expect them to do so diligently and with self discipline. The risk of an operator departing from, adjusting or amending a process on the basis of novel local information is that, within the scope of the resources they have for taking that decision, there is no way of reliably incorporating the totality of assumptions and data on which the process design was predicated. Even were all the data available, when Dawkins talks of “common sense” he is demanding what Daniel Kahneman called System 2 thinking. Whenever we demand System 2 thinking ex tempore we are more likely to get System 1, and it is unlikely to perform effectively. The rationality of an individual operator in that moment is almost certainly more tightly bounded than that of the process designers.

In this particular case, any susceptibility of a security guard to depart from process would be exactly the behaviour that a terrorist might seek to exploit once aware of it.

Further, departures from process will have effects on the organisational system, upstream, downstream and collateral. Those related processes themselves rely on the operator’s predictable compliance. The consequence of ill discipline can be far reaching and unanticipated.

That is not to say that the security process was beyond improvement. In an effective process-oriented organisation, operating the process would be only one part of the security guard’s job. Part of the bargain for agreeing to the boring/mindful diligent operation of the process is that part of work time is spent improving the process. That is something done offline, with colleagues, with the input of other parts of the organisation and with recognition of all the data including the voice of the customer.

Had he exercised the “common sense” Dawkins demanded, the security guard would have risked disciplinary action by his employers for serious misconduct. To some people, threats of sanctions appear at odds with engendering trust in an organisation’s process design and decision making. However, when we tell operators that something is important then fail to sanction others who ignore the process, we undermine the basis of the bond of trust with those that accepted our word and complied. Trust in the bureaucracy and sanctions for non-compliance are complementary elements of fostering process discipline. Both are essential.

The Monty Hall Problem redux

This old chestnut refuses to die and I see that it has turned up again on the BBC website. I have been intending for a while to blog about this so this has given me the excuse. I think that there has been a terrible history of misunderstanding this problem and I want to set down how the confusion comes about. People have mistaken a problem in psychology for a problem in probability.

Here is the classic statement of the problem that appeared in Parade magazine in 1990.

Suppose you’re on a game show, and you’re given the choice of three doors: Behind one door is a car; behind the others, goats. You pick a door, say No. 1, and the host, who knows what’s behind the doors, opens another door, say No. 3, which has a goat. He then says to you, “Do you want to pick door No. 2?” Is it to your advantage to switch your choice?

The rational way of approaching this problem is through Bayes’ theorem. Bayes’ theorem tells us how to update our views as to the probability of events when we have some new information. In this problem I have never seen anyone start from a position other than that, before any doors are opened, no door is more probably hiding the car than the others. I think it is uncontroversial to say that for each door the probability of its hiding the car is 1/3.

Once the host opens door No. 3, we have some more information. We certainly know that the car is not behind door No. 3 but does the host tell us anything else? Bayes’ theorem tells us how to ask the right question. The theorem can be illustrated like this.
[Diagram: Bayes’ theorem – prior, likelihood and posterior]

The probability of observing the new data, if the theory is correct (the green box), is called the likelihood and plays a very important role in statistics.
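For the computationally minded, the update can be sketched in a few lines of Python: the posterior is simply prior times likelihood, renormalised. The function name and door encoding here are mine, purely illustrative, and the likelihoods assume the host always reveals a goat behind an unchosen door, choosing at random when two such doors qualify.

```python
def bayes_update(priors, likelihoods):
    """Posterior probabilities: prior x likelihood, renormalised to sum to 1."""
    unnormalised = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(unnormalised)
    return [u / total for u in unnormalised]

# Three doors, each equally likely to hide the car before any is opened.
priors = [1/3, 1/3, 1/3]

# Likelihood of the host revealing a goat at door 3, given the car is
# behind door 1, 2 or 3 respectively (host deliberately avoids the car).
likelihoods = [1/2, 1, 0]

posterior = bayes_update(priors, likelihoods)
print(posterior)  # roughly [0.33, 0.67, 0.0] -- door 2 is now twice as probable
```

Under this assumption the contestant who switches wins with probability 2/3; but, as discussed below, the likelihoods change with the host’s tactic.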

Without giving the details of the mathematics, Bayes’ theorem leads us to analyse the problem in this way.

[Diagram: Bayes’ theorem applied to the Monty Hall problem]

We can work this out arithmetically but, because all three doors were initially equally probable, the matter comes down to deciding which of the two likelihoods is greater.

[Diagram: comparing the two likelihoods]

So what are the respective probabilities of the host behaving in the way he did? Unfortunately, this is where we run into problems because the answer depends on the tactic that the host was adopting.

And we are not given that in the question.

Consider some of the following possible tactics the host may have adopted.

  1. Open an unopened door hiding a goat; if both unopened doors hide goats, choose between them at random.
  2. If the contestant chooses door 1 (or 2, or 3), always open 3 (or 1, or 2), whether or not it hides a goat.
  3. Open either unopened door at random, but only if the contestant has chosen the door hiding the prize; otherwise don’t open a door (the devious tactic, suggested to me by a former girlfriend as the obviously correct answer).
  4. Choose an unopened door at random. If it hides a goat, open it. Otherwise do not open a door (not the same as Tactic 1).
  5. Open either unopened door at random, whether or not it hides a goat.

There are many more. All these various tactics lead to different likelihoods.

Tactic   Probability that the host revealed a goat at door 3   Rational choice
         given car at door 1     given car at door 2
  1              ½                       1                     Switch
  2              1                       1                     No difference
  3              ½                       0                     Don’t switch
  4              ½                       ½                     No difference
  5              ½                       ½                     No difference
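The first row can be checked by simulation. Here is a minimal Monte Carlo sketch in Python (the function name and zero-based door numbering are mine, purely illustrative), assuming the host plays Tactic 1:

```python
import random

def play_tactic_1(switch, trials=100_000):
    """Simulate the game under Tactic 1: the host always opens an unchosen
    door hiding a goat, choosing at random when both such doors qualify.
    Returns the proportion of games the contestant wins."""
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)
        choice = 0  # by symmetry, the contestant's initial pick does not matter
        goat_doors = [d for d in range(3) if d != choice and d != car]
        opened = random.choice(goat_doors)
        if switch:
            choice = next(d for d in range(3) if d not in (choice, opened))
        wins += (choice == car)
    return wins / trials

print(f"stick:  {play_tactic_1(switch=False):.3f}")  # close to 1/3
print(f"switch: {play_tactic_1(switch=True):.3f}")   # close to 2/3
```

The other tactics can be simulated the same way by changing how `opened` is selected; Tactics 4 and 5, for example, sometimes reveal the car or open no door at all, and conditioning on the games where a goat appears at door 3 reproduces the ½, ½ likelihoods above.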

So if we were given this situation in real life we would have to work out which tactic the host was adopting. The problem is presented as though it is a straightforward maths problem but it critically hinges on a problem in psychology. What can we infer from the host’s choice? What is he up to? I think that this leads to people’s discomfort and difficulty. I am aware that even people who start out assuming Tactic 1 struggle but I suspect that somewhere in the back of their minds they cannot rid themselves of the other possibilities. The seeds of doubt have been sown in the way the problem is set.

A participant in the game show would probably have to make a snap judgment about the meaning of the new data. This is the sort of thinking that Daniel Kahneman calls System 1 thinking. It is intuitive, heuristic and terribly bad at coping with novel situations. Fear of the devious strategy may well prevail.

A more ambitious contestant may try to embark on more reflective analytical System 2 thinking about the likely tactic. That would be quite an achievement under pressure. However, anyone with the inclination may have been able to prepare himself with some pre-show analysis. There may be a record of past shows from which the host’s common tactics can be inferred. The production company’s reputation in similar shows may be known. The host may be displaying signs of discomfort or emotional stress, the “tells” relied on by poker players.

There is a lot of data potentially out there. However, that only leads us to another level of statistical, and psychological, inference about the host’s strategy, an inference that itself relies on its own uncertain likelihoods and prior probabilities. And that then leads to the level of behaviour and cognitive psychology and the uncertainties in the fundamental science of human nature. It seems as though, as philosopher Richard Jeffrey put it, “It’s probabilities all the way down”.

Behind all this, it is always useful advice that, having once taken a decision, it should only be revised if there is some genuinely new data that was surprising given our initial thinking.

Economist G L S Shackle long ago lamented that:

… we habitually and, it seems, unthinkingly assume that the problem facing … a business man, is of the same kind as those set in examinations in mathematics, where the candidate unhesitatingly (and justly) takes it for granted that he has been given enough information to construe a satisfactory solution. Where, in real life, are we justified in assuming that we possess ‘enough’ information?

Music is silver but …

The other day I came across a report on the BBC website that non-expert listeners could pick out winners of piano competitions more reliably when presented with silent performance videos than when exposed to sound alone. In the latter case they performed no better than chance.

The report was based on the work of Chia-Jung Tsay at University College London, in a paper entitled Sight over sound in the judgment of music performance.

The news report immediately leads us to suspect that the expert evaluating a musical performance is not in fact analysing and weighing auditory complexity and aesthetics but instead falling under the subliminal influence of the proxy data of the artist’s demeanour and theatrics.

That is perhaps unsurprising. We want to believe, as does the expert critic, that performance evaluation is a reflective, analytical and holistic enterprise, demanding decades of exposure to subtle shades of interpretation and developing skills of discrimination by engagement with the ascendant generation of experts. This is what Daniel Kahneman calls a System 2 task. However, a wealth of psychological study shows only too well that System 2 is easily fatigued and distracted. When we believe we are thinking in System 2, we are all too often loafing in System 1 and using simplistic learned heuristics as a substitute. It is easy to imagine that the visual proxy data might be such a heuristic, a ready reckoner that provides a plausible result in a wide variety of commonly encountered situations.

These behaviours are difficult to identify, even for the most mindful individual. Kahneman notes:

… all of us live much of our lives guided by the impressions of System 1 – and we do not know the source of these impressions. How do you know that a statement is true? If it is strongly linked by logic or association to other beliefs or preferences you hold, or comes from a source you trust and like, you will feel a sense of cognitive ease. The trouble is that there may be other causes for your feeling of ease … and you have no simple way of tracing your feelings to their source.

Thinking, Fast and Slow, p64

The problem is that what Kahneman describes is exactly what I was doing in finding my biases confirmed by this press report. I have had a superficial look at the statistics in this study and I am now less persuaded than when I read the press item. I shall perhaps blog later about the difficulties I had in interpreting the analysis. Really, this is quite a tentative and suggestive study on a very limited frame. I would certainly like to see more inter-laboratory studies in psychology. The study is open to multiple interpretations and any individual will probably have difficulty making an exhaustive list. There is always a danger of falling into the trap of What You See Is All There Is (WYSIATI).

That notwithstanding, even anecdotally, the story is another reminder of an important lesson of process management: even though what we have been doing has worked in the past, we may not understand what it is that has been working.