Richard Dawkins champions intelligent design (for business processes)

Richard Dawkins has recently had a couple of bad customer experiences. In each he was confronted with a system that seemed to him indifferent to his customer feedback. I sympathise with him on one matter but not the other. The two incidents do, in my mind, elucidate some important features of process discipline.

In the first, Dawkins spent a frustrating spell ordering a statement from his bank over the internet. He wanted to tell the bank about his experience and offer some suggestions for improvement, but he couldn’t find any means of channelling and communicating his feedback.

Embedding a business process in software will impose a rigid discipline on its operation. However, process discipline is not the same thing as process petrification. The design assumptions of any process include, or should include, the predicted range and variety of situations that the process is anticipated to encounter. We know that the bounded rationality of the designers will blind them to some of the situations that the process will subsequently confront in real world operation. There is no shame in that but the necessary adjunct is that, while the process is operated diligently as designed, data is accumulated on its performance and, in particular, on the customer’s experience. Once an economically opportune moment arrives (I have glossed over quite a bit there) the data can be reviewed, design assumptions challenged and redesign evaluated. Following redesign the process then embarks on another period of boring operation. The “boring” bit is essential to success. Perhaps I should say “mindful” rather than “boring” though I fear that does not really work with software.

Dawkins’ bank have missed an opportunity to listen to the voice of the customer. That weakens their competitive position. Ignorance cannot promote competitiveness. Any organisation that is not continually improving every process for planning, production and service (pace W Edwards Deming) faces the inevitable fact that its competitors will ultimately make such products and services obsolete. As Dawkins himself would appreciate, survival is not compulsory.

Dawkins’ second complaint was that security guards at a UK airport would not allow him to take a small jar of honey onto his flight because of a prohibition on liquids in the passenger cabin. Dawkins felt that the security guard should have displayed “common sense” and allowed it on board contrary to the black letter of the regulations. Dawkins protests against “rule-happy officials” and “bureaucratically imposed vexation”. Dawkins displays another failure of trust in bureaucracy. He simply would not believe that other people had studied the matter and come to a settled conclusion to protect his safety. It can hardly have been for the airport’s convenience. Dawkins was more persuaded by something he had read on the internet. He fell into the trap of thinking that What You See Is All There Is. I fear that Dawkins betrays his affinities with the cyclist on the railway crossing.

When we give somebody a process to operate we legitimately expect them to do so diligently and with self discipline. The risk of an operator departing from, adjusting or amending a process on the basis of novel local information is that, within the scope of the resources they have for taking that decision, there is no way of reliably incorporating the totality of assumptions and data on which the process design was predicated. Even were all the data available, when Dawkins talks of “common sense” he is demanding what Daniel Kahneman called System 2 thinking. Whenever we demand System 2 thinking ex tempore we are more likely to get System 1 and it is unlikely to perform effectively. The rationality of an individual operator in that moment is almost certainly more tightly bounded than that of the process designers.

In this particular case, any susceptibility of a security guard to depart from process would be exactly the behaviour that a terrorist might seek to exploit once aware of it.

Further, departures from process will have effects on the organisational system, upstream, downstream and collateral. Those related processes themselves rely on the operator’s predictable compliance. The consequence of ill discipline can be far reaching and unanticipated.

That is not to say that the security process was beyond improvement. In an effective process-oriented organisation, operating the process would be only one part of the security guard’s job. Part of the bargain for agreeing to the boring/mindful diligent operation of the process is that part of work time is spent improving the process. That is something done offline, with colleagues, with the input of other parts of the organisation and with recognition of all the data including the voice of the customer.

Had he exercised the “common sense” Dawkins demanded, the security guard would have risked disciplinary action by his employers for serious misconduct. To some people, threats of sanctions appear at odds with engendering trust in an organisation’s process design and decision making. However, when we tell operators that something is important then fail to sanction others who ignore the process, we undermine the basis of the bond of trust with those that accepted our word and complied. Trust in the bureaucracy and sanctions for non-compliance are complementary elements of fostering process discipline. Both are essential.

Trust in bureaucracy I – the Milgram experiments

I have recently been reading Gina Perry’s book Behind the Shock Machine which analyses, criticises and re-assesses the “obedience” experiments of psychologist Stanley Milgram performed in the early 1960s. For the uninitiated there is a brief description of the experiments on Dr Perry’s website. You can find a video of the experiments here.

The experiments have often been cited as evidence for a constitutional human bias towards compliance in the face of authority. From that interpretation has grown a doctrine that the atrocities of war and of despotism are enabled by the common man’s (sic) unresisting obedience to even a nominal superior, and further that inherent cruelty is eager to express itself under the pretext of an order.

Perry mounts a detailed challenge to the simplicity of that view. In particular, she reveals how Milgram piloted his experiments and fine-tuned them so that they would produce the most signal obedient behaviour. The experiments took place within the context of academic research. The experimenter did everything to hold himself out as the representative of an overwhelmingly persuasive body of scientific knowledge. At every stage the experimenter reassured the subject and urged them to proceed. Given this real pressure applied to the experimental subjects, even a 65% compliance rate was hardly dramatic. Most interestingly, the actual reaction of the subjects to their experience was complex and ambiguous. It was far from the conventional view of the cathartic release of suppressed violence facilitated by a directive from a figure with a superficial authority. Erich Fromm made some similar points about the experiments in his 1973 book The Anatomy of Human Destructiveness.

What interests me about the whole affair is its relevance to an issue which I have raised before on this blog: trust in bureaucracy. Max Weber was one of the first sociologists to describe how modern societies and organisations rely on a bureaucracy, an administrative policy-making group, to maintain the operation of complex dynamic systems. Studies of engineering and science as bureaucratic professions include Diane Vaughan’s The Challenger Launch Decision.

The majority of Milgram’s subjects certainly trusted the bureaucracy represented by the experimenter, even in the face of their own fears that they were doing harm. This is a stark contrast to some failures of such trust that I have blogged about here. By their mistrust, the cyclist on the railway crossing and the parents who rejected the MMR vaccination placed themselves and others in genuine peril. These were people who had, as far as I have been able to discover, no compelling evidence that the engineers who designed the railway crossing or the scientists who had tested the MMR vaccine might act against their best interests.

So we have a paradox. The majority of Milgram’s subjects ignored their own compelling fears and trusted authority. The cyclist and the parents recklessly ignored or actively mistrusted authority without a well developed alternative world view. Whatever our discomfort with Milgram’s demonstrations of obedience we feel no happier with the cyclist’s and parents’ disobedience. Prof Jerry M Burger partially repeated Milgram’s experiments in 2007. He is quoted by Perry as saying:

It’s not as clear cut as it seems from the outside. When you’re in that situation, wondering, should I continue or should I not, there are reasons to do both. What you do have is an expert in the room who knows all about this study and presumably has been through this many times before with many participants, and he’s telling you, there’s nothing wrong. The reasonable, rational thing to do is to listen to the guy who’s the expert when you’re not sure what to do.

Organisations depend on a workforce aligned around trust in that organisation’s policy and decision making machinery. Even in the least hierarchical of organisations, not everybody gets involved in every decision. Whether it’s the decision of a co-worker with an exotic expertise or the policy of a superior in the hierarchy, compliance and process discipline will succeed or fail on the basis of trust.

The “trust” that Milgram’s subjects showed towards the experimenter was manufactured, and Perry discusses how close the experiment came to breaching acceptable ethical standards.

Organisations cannot rely on such manufactured “trust”. Breakdown of trust among employees is a major enterprise risk for most organisations. The trust of customers is essential to reputation. A key question in all decision making is whether the outcome will foster trust or destroy it.

Managing a railway on historical data is like …

I was recently looking on the web for any news on the Galicia rail crash. I didn’t find anything current but came across this old item from The Guardian (London). It mentioned in passing that consortia tendering for a new high speed railway in Brazil were excluded if they had been involved in the operation of a high speed line that had had an accident in the previous five years.

Well, I don’t think that there is necessarily anything wrong with that in itself. But it is important to remember that a rail accident is not necessarily a Signal (sic). Rail accidents worldwide are often a manifestation of what W Edwards Deming called A stable system of trouble. In other words, a system that features only Noise but which cannot deliver the desired performance. An accident free record of five years is a fine thing but there is nothing about a stable system of trouble that says it can’t have long incident free periods.
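
The point about long incident-free stretches can be illustrated with a quick simulation. The accident rate here (0.2 per year) is invented purely for illustration, not taken from any real railway data:

```python
import math
import random

random.seed(1)

RATE = 0.2      # mean accidents per year -- an invented, illustrative figure
YEARS = 10_000  # length of the simulated history

def poisson(lam):
    """Draw from a Poisson distribution (Knuth's algorithm)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

# One accident count per simulated year: pure common-cause variation,
# only Noise in Deming's sense -- a stable system of trouble.
history = [poisson(RATE) for _ in range(YEARS)]

# How often does a rolling five-year window contain no accident at all?
windows = YEARS - 4
clean = sum(1 for i in range(windows) if sum(history[i:i + 5]) == 0)
print(f"accident-free five-year windows: {clean / windows:.0%}")
# Theory gives exp(-0.2 * 5) = exp(-1), roughly 37%
```

So a system that is going, on average, to have an accident every five years will still, over a third of the time, present a clean five-year record to a procurement panel.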

In order to turn that incident free five years into evidence about future likely safety performance we also need hard evidence, statistical and qualitative, about the stability and predictability of the rail operator’s processes. Procurement managers are often far less diligent in looking for, and critically examining, this sort of data. In highly sophisticated industries such as automotive it is routine to demand capability data and evidence of process surveillance from a potential supplier. Without that, past performance is of no value whatever in predicting future results.
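
For readers unfamiliar with capability data, here is a minimal sketch of the indices (Cp and Cpk) that an automotive-style procurement team might demand. The specification limits and measurements are invented for illustration:

```python
import statistics

# Invented, illustrative data: specification limits and a small sample
# of measurements from a supposedly stable process.
LSL, USL = 9.90, 10.10  # lower and upper specification limits (mm)
measurements = [10.01, 9.98, 10.03, 9.99, 10.00,
                10.02, 9.97, 10.01, 10.00, 9.99]

mean = statistics.mean(measurements)
sigma = statistics.stdev(measurements)

# Cp: how much of the specification width the process spread consumes.
cp = (USL - LSL) / (6 * sigma)
# Cpk: as Cp, but penalising a process that has drifted off centre.
cpk = min(USL - mean, mean - LSL) / (3 * sigma)

print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
```

Crucially, these indices only mean anything once a control chart has shown the process to be stable; capability figures quoted for an unstable process predict nothing.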

Rearview

Music is silver but …

The other day I came across a report on the BBC website that non-expert listeners could pick out winners of piano competitions more reliably when presented with silent performance videos than when exposed to sound alone. In the latter case they performed no better than chance.

The report was based on the work of Chia-Jung Tsay at University College London, in a paper entitled Sight over sound in the judgment of music performance.

The news report immediately leads us to suspect that the expert evaluating a musical performance is not in fact analysing and weighing auditory complexity and aesthetics but instead falling under the subliminal influence of the proxy data of the artist’s demeanour and theatrics.

That is perhaps unsurprising. We want to believe, as does the expert critic, that performance evaluation is a reflective, analytical and holistic enterprise, demanding decades of exposure to subtle shades of interpretation and developing skills of discrimination by engagement with the ascendant generation of experts. This is what Daniel Kahneman calls a System 2 task. However, a wealth of psychological study shows only too well that System 2 is easily fatigued and distracted. When we believe we are thinking in System 2, we are all too often loafing in System 1 and using simplistic learned heuristics as a substitute. It is easy to imagine that the visual proxy data might be such a heuristic, a ready reckoner that provides a plausible result in a wide variety of commonly encountered situations.

These behaviours are difficult to identify, even for the most mindful individual. Kahneman notes:

… all of us live much of our lives guided by the impressions of System 1 – and we do not know the source of these impressions. How do you know that a statement is true? If it is strongly linked by logic or association to other beliefs or preferences you hold, or comes from a source you trust and like, you will feel a sense of cognitive ease. The trouble is that there may be other causes for your feeling of ease … and you have no simple way of tracing your feelings to their source.

Thinking, Fast and Slow, p64

The problem is that what Kahneman describes is exactly what I was doing in finding my biases confirmed by this press report. I have had a superficial look at the statistics in this study and I am now less persuaded than when I read the press item. I may blog later about the difficulties I had in interpreting the analysis. Really, this is quite a tentative and suggestive study on a very limited frame. I would certainly like to see more inter-laboratory studies in psychology. The study is open to multiple interpretations and any individual will probably have difficulty making an exhaustive list. There is always a danger of falling into the trap of What You See Is All There Is (WYSIATI).

That notwithstanding, even anecdotally, the story is another reminder of an important lesson of process management: even though what we have been doing has worked in the past, we may not understand what it is that has been working.

Trust in data – II

I just picked up on this, now not so recent, news item about the prosecution of Steven Eaton. Eaton was gaoled for falsifying data in clinical trials. His prosecution was pursuant to the Good Laboratory Practice Regulations 1999. The Regulations apply to chemical safety assessments and come to us, in the UK, from that supra-national body the OECD. Sadly I have managed to find few details other than the press reports. I have had a look at the website of the prosecuting Medicines and Healthcare Products Regulatory Agency but found nothing beyond the press release. I thought about a request under the Freedom of Information Act 2000 but wonder whether an exemption is being claimed pursuant to section 31.

It’s a shame because it would have been an opportunity to compare and contrast with another notable recent case of industrial data fabrication, that concerning BNFL and the Kansai Electric contract. Fortunately, in that case, the HSE made public a detailed report.

In the BNFL case, technicians had fabricated measurements of the diameters of fuel pellets in nuclear fuel rods, it appears principally out of boredom at doing the actual job. The customer spotted it, BNFL didn’t. The matter caused huge reputational damage to BNFL and resulted in the shipment of nuclear fuel rods, necessarily under armed escort, being turned around mid-ocean and returned to the supplier.

For me, the important lesson of the BNFL affair is that businesses must avoid a culture where employees decide what parts of the job are important and interesting to them, what is called intrinsic motivation. Intrinsic motivation is related to a sense of cognitive ease. That sense rests, as Daniel Kahneman has pointed out, on an ecology of unknown and unknowable beliefs and prejudices. No doubt the technicians had encountered nothing but boringly uniform products. They took that, with a sense of cognitive ease, as a signal to stop measuring and to conceal the fact that they had stopped.

However, nobody in the supply chain is entitled to ignore the customer’s wishes. Businesses need to foster the extrinsic motivation of the voice of the customer. That is what defines a job well done. Sometimes it will be irksome and involve a lot of measuring pellets whose dimensions look just the same as the last batch. We simply have to get over it!

The customer wanted the data collected, not simply as a sterile exercise in box-ticking, but as a basis for diligent surveillance of the manufacturing process and as a critical component of managing the risks attendant on real world nuclear industry operations. The customer showed that a proper scrutiny of the data, exactly what they had thought that BNFL would perform as part of the contract, would have exposed its inauthenticity. BNFL were embarrassed, not only by their lack of management control of their own technicians, but by the exposure of their own incapacity to scrutinise data and act on its signal message. Even if all the pellets were of perfect dimension, the customer would be legitimately appalled that so little critical attention was being paid to keeping them so.

Fabricated data will readily be exposed where data is properly scrutinised, as part of a system of objective process management and with the correct statistical tools. That is part of incentivising technicians to do the job diligently. Dishonesty must not be tolerated. However, it is essential that everybody in an organisation understands the voice of the customer and understands the particular way in which they themselves add value. A scheme of goal deployment weaves the threads of the voice of the customer together with those of individual process management tactics. That is what provides an individual’s insight into how their work adds value for the customer. That is what provides the “nudge” towards honesty.
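
As a toy illustration of how scrutiny exposes fabrication: genuinely measured batches show natural variation, while copied-out figures are often implausibly uniform. The historical sigma and batch values below are invented, and a real scheme would use proper control charts rather than this crude threshold:

```python
import statistics

# Invented, illustrative figure: the within-batch standard deviation
# that history says real measurement of this process produces.
EXPECTED_SIGMA = 0.02  # mm

def suspiciously_uniform(batch, expected_sigma=EXPECTED_SIGMA):
    """Flag a batch whose spread is far below what genuine measurement
    of a real process could plausibly produce."""
    return statistics.stdev(batch) < expected_sigma / 4

honest = [10.01, 9.98, 10.03, 9.97, 10.02, 10.00]   # natural variation
copied = [10.00, 10.00, 10.00, 10.00, 10.00, 10.00]  # written-in figures

print(suspiciously_uniform(honest))  # → False: spread looks real
print(suspiciously_uniform(copied))  # → True: implausibly uniform
```

The serious point is simply that honest data carries the signature of the process that generated it; data invented to look tidy does not, and routine statistical surveillance of the kind BNFL’s customer performed will find it out.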