Trust in data – IV – trusting the team

Today (20 November 2013) I was reading an item in The Times (London) with the headline “We fiddle our crime numbers, admit police”. This is a fairly unedifying business.

The blame is once again laid at the door of government targets and performance related pay. I fear that this is akin to blaming police corruption on the largesse of criminals. If only organised crime would stop offering bribes, the police would not succumb to taking them in consideration of repudiating their office as constable, so the argument might run (pace Brian Joiner). Of course, the argument is nonsense. What we expect of police constables is honesty even, perhaps especially, when temptation presents itself. We expect the police to give truthful evidence in court, to deal with the public fairly and to conduct their investigations diligently and rationally. The public expects the police to behave in this way even in the face of manifest temptation to do otherwise. The public expects the same honest approach to reporting their performance. I think Robert Frank put it well in Passions within Reason.

The honest individual … is someone who values trustworthiness for its own sake. That he might receive a material payoff for such behaviour is beyond his concern. And it is precisely because he has this attitude that he can be trusted in situations where his behaviour cannot be monitored. Trustworthiness, provided it is recognizable, creates valuable opportunities that would not otherwise be available.

Matt Ridley put it starkly in his overview of evolutionary psychology, The Origins of Virtue. He wasn’t speaking of policing in particular.

The virtuous are virtuous for no other reason than that it enables them to join forces with others who are virtuous, for mutual benefit.

What worried me most about the article was a remark from Peter Barron, a former detective chief superintendent in the Metropolitan Police. Should any individual challenge the distortion of data:

You are judged to be not a team player.

“Teamwork” can be a smokescreen for the most appalling bullying. In our current corporate cultures, to be branded as “not a team player” can be the most horrible slur, smearing the individual’s contribution to the overall mission. One can see how such an environment can allow a team’s behaviours and objectives to become misaligned from those of the parent organisation. That is a problem that can often be addressed by management with a proper system of goal deployment.

However, the problem is more severe when the team is in fact well aligned to organisational goals that are themselves distorted. The remedies for this lie in the twin processes of governance and whistleblowing. Neither seems to be working very well in UK policing at the moment, but that simply leaves an opportunity for process improvement. Work is underway. The English law of whistleblowing has been amended this year. If you aren’t familiar with it you can find it here.

Governance has to take scrutiny of data seriously. Reported performance needs to be compared with other sources of data. Reporting and recording processes need themselves to be assessed. Where there is no coherent picture, questions need to be asked.
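
As a minimal sketch of what one such comparison might look like, here is a toy cross-check in Python, assuming we hold both a team’s reported figures and an independent measure of the same activity (an audit sample, a survey, a downstream count). The numbers and the 10% flagging threshold are invented for illustration.

```python
# Hypothetical cross-check of reported figures against an independent
# source. The data, and the 10% flagging threshold, are invented.
reported    = [100, 96, 91, 85, 78, 70]   # falls pleasingly, period by period
independent = [102, 99, 98, 97, 99, 98]   # broadly flat

for period, (r, ind) in enumerate(zip(reported, independent), start=1):
    gap = (ind - r) / ind  # shortfall of reported against independent
    if gap > 0.10:         # arbitrary threshold; charting the gap would be better
        print(f"period {period}: reported {r} vs independent {ind} "
              f"({gap:.0%} apart) - ask questions")
```

No single check of this kind is conclusive. Its value lies in prompting the questions.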

On forecasting as the slave of our passions

Last weekend I was reading Dominic Lawson’s Sunday Times (London) review (10 November 2013) of Alan Greenspan’s recent book The Map and the Territory: Risk, Human Nature, and the Future of Forecasting. Lawson expresses his astonishment at what Greenspan says.

I and other economic forecasters didn’t understand that markets are prone to wild and even deranging mood swings that are uncoupled from any underlying rational basis.

I have to share Lawson’s astonishment. After all, Greenspan was the man who criticised the markets’ “irrational exuberance” back in the 1990s.

Lawson usefully reminded me of an important observation by the eighteenth-century philosopher David Hume.

Reason is … only the slave of the passions and can never pretend to any other office.

Perhaps the computer pioneer Marvin Minsky put it more colloquially.

Logic doesn’t apply in the real world.

That is something that we have to be very wary of in the management of an enterprise. Whatever the consensual mission, it is ultimately under threat from narrow decisions by individuals, or self-reinforcing groups, that might be influenced more by emotional reactions to local events than by an appreciation of the organisational system. I think that there are some things leaders can do to minimise the risks.

Firstly, put key measures on a process behaviour chart and run it continuously. This provides a focus for discussion, for testing opinions and for placing decision making in context.
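
To make that first step concrete, here is a minimal sketch in Python of the arithmetic behind an individuals (XmR) process behaviour chart. The data are invented, and 2.66 is the conventional XmR scaling factor for the natural process limits.

```python
# Sketch of natural process limits for an XmR (individuals) chart.
# The data are invented; 2.66 is the standard XmR constant.
values = [52, 48, 55, 50, 47, 53, 49, 51, 46, 54]  # e.g. a weekly key measure

mean = sum(values) / len(values)

# Moving ranges: absolute differences between successive observations
moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
mr_bar = sum(moving_ranges) / len(moving_ranges)

unpl = mean + 2.66 * mr_bar   # upper natural process limit
lnpl = mean - 2.66 * mr_bar   # lower natural process limit

print(f"centre line {mean:.1f}, natural process limits ({lnpl:.1f}, {unpl:.1f})")

# Points outside the limits are signals worth investigating; points
# inside are routine variation and invite no special explanation.
for week, x in enumerate(values, start=1):
    if not lnpl <= x <= unpl:
        print(f"signal in week {week}: {x}")
```

The point of running the chart continuously is that the limits give everyone the same, agreed basis for deciding which results deserve investigation and which are noise.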

Secondly, formalise periodic reviews of process capability accompanied by a reappraisal of, and immersion in, the voice of the customer. Communicate this review widely. Do not allow it to be ignored or minimised in any discussions or decision processes.
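
As a sketch of the arithmetic such a review might quantify, assuming the voice of the customer has already been translated into specification limits, here is the usual capability calculation. Every number below is hypothetical.

```python
import statistics

# Hypothetical capability calculation. The specification limits stand
# in for the voice of the customer; the data are invented.
lsl, usl = 40.0, 60.0                              # specification limits
values = [52, 48, 55, 50, 47, 53, 49, 51, 46, 54]

mean = statistics.fmean(values)
sigma = statistics.stdev(values)  # in practice prefer a within-process
                                  # estimate such as mr_bar / 1.128 from
                                  # the XmR chart above

cp = (usl - lsl) / (6 * sigma)                   # potential capability
cpk = min(usl - mean, mean - lsl) / (3 * sigma)  # actual capability,
                                                 # penalising off-centre processes

print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
```

Capability figures only mean something once the behaviour chart in the first step shows a stable, predictable process. Quoting Cpk for an unpredictable process would itself be a small distortion of data.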

Thirdly, be aware of the risk that decisions might be emotionally founded, with only post hoc rationalisation. Keep an eye on people who chronically avoid engagement with the process behaviour chart and capability study. Be mindful of your own internal thought processes. They are certainly less rational than you think.

I think that with those precautions organisations can harness the positive emotions that generate enthusiasm for a product or process and passion for its continual improvement.

Richard Dawkins champions intelligent design (for business processes)

Richard Dawkins has recently had a couple of bad customer experiences. In each he was confronted with a system that seemed to him indifferent to his customer feedback. I sympathise with him on one matter but not the other. The two incidents do, in my mind, elucidate some important features of process discipline.

In the first, Dawkins spent a frustrating spell ordering a statement from his bank over the internet. He wanted to tell the bank about his experience and offer some suggestions for improvement, but he couldn’t find any means of channelling and communicating his feedback.

Embedding a business process in software will impose a rigid discipline on its operation. However, process discipline is not the same thing as process petrification. The design assumptions of any process include, or should include, the predicted range and variety of situations that the process is anticipated to encounter. We know that the bounded rationality of the designers will blind them to some of the situations that the process will subsequently confront in real-world operation. There is no shame in that, but the necessary adjunct is that, while the process is operated diligently as designed, data is accumulated on its performance and, in particular, on the customer’s experience. Once an economically opportune moment arrives (I have glossed over quite a bit there), the data can be reviewed, design assumptions challenged and redesign evaluated. Following redesign the process then embarks on another period of boring operation. The “boring” bit is essential to success. Perhaps I should say “mindful” rather than “boring”, though I fear that does not really work with software.
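
As a toy sketch of that separation, assuming nothing about any real bank’s systems: the process runs exactly as designed, while every execution and every piece of customer feedback is banked for the later design review. All names below are hypothetical.

```python
import datetime
import json

LOG = "process_log.jsonl"  # hypothetical append-only review log

def record(event: dict) -> None:
    """Append an event to the review log without altering the process."""
    event["at"] = datetime.datetime.now().isoformat()
    with open(LOG, "a") as f:
        f.write(json.dumps(event) + "\n")

def order_statement(account: str, period: str) -> None:
    # ... the rigidly disciplined process, operated as designed ...
    record({"kind": "run", "account": account, "period": period})

def customer_feedback(text: str) -> None:
    # The channel Dawkins could not find: feedback is not acted on
    # ad hoc; it is accumulated as data for the periodic redesign.
    record({"kind": "feedback", "text": text})
```

The design choice is that recording is separated from acting, so the process stays boring while the evidence for the next redesign accumulates.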

Dawkins’ bank have missed an opportunity to listen to the voice of the customer. That weakens their competitive position. Ignorance cannot promote competitiveness. Any organisation that is not continually improving every process for planning, production and service (pace W Edwards Deming) faces the inevitable fact that its competitors will ultimately make such products and services obsolete. As Dawkins himself would appreciate, survival is not compulsory.

Dawkins’ second complaint was that security guards at a UK airport would not allow him to take a small jar of honey onto his flight because of a prohibition on liquids in the passenger cabin. Dawkins felt that the security guard should have displayed “common sense” and allowed it on board contrary to the black letter of the regulations. Dawkins protests against “rule-happy officials” and “bureaucratically imposed vexation”. Dawkins displays another failure of trust in bureaucracy. He simply would not believe that other people had studied the matter and come to a settled conclusion to protect his safety. It can hardly have been for the airport’s convenience. Dawkins was more persuaded by something he had read on the internet. He fell into the trap of thinking that “what you see is all there is”. I fear that Dawkins betrays his affinities with the cyclist on the railway crossing.

When we give somebody a process to operate we legitimately expect them to do so diligently and with self-discipline. The risk of an operator departing from, adjusting or amending a process on the basis of novel local information is that, within the scope of the resources they have for taking that decision, there is no way of reliably incorporating the totality of assumptions and data on which the process design was predicated. Even were all the data available, when Dawkins talks of “common sense” he is demanding what Daniel Kahneman called System 2 thinking. Whenever we demand System 2 thinking ex tempore we are more likely to get System 1, and it is unlikely to perform effectively. The rationality of an individual operator in that moment is almost certainly more tightly bounded than that of the process designers.

In this particular case, any susceptibility of a security guard to depart from process would be exactly the behaviour that a terrorist might seek to exploit once aware of it.

Further, departures from process will have effects on the organisational system, upstream, downstream and collateral. Those related processes themselves rely on the operator’s predictable compliance. The consequences of ill discipline can be far-reaching and unanticipated.

That is not to say that the security process was beyond improvement. In an effective process-oriented organisation, operating the process would be only one part of the security guard’s job. Part of the bargain for agreeing to the boring/mindful diligent operation of the process is that part of work time is spent improving the process. That is something done offline, with colleagues, with the input of other parts of the organisation and with recognition of all the data, including the voice of the customer.

Had he exercised the “common sense” Dawkins demanded, the security guard would have risked disciplinary action by his employers for serious misconduct. To some people, threats of sanctions appear at odds with engendering trust in an organisation’s process design and decision making. However, when we tell operators that something is important and then fail to sanction others who ignore the process, we undermine the bond of trust with those who accepted our word and complied. Trust in the bureaucracy and sanctions for non-compliance are complementary elements of fostering process discipline. Both are essential.

Trust in bureaucracy I – the Milgram experiments

I have recently been reading Gina Perry’s book Behind the Shock Machine which analyses, criticises and re-assesses the “obedience” experiments of psychologist Stanley Milgram performed in the early 1960s. For the uninitiated there is a brief description of the experiments on Dr Perry’s website. You can find a video of the experiments here.

The experiments have often been cited as evidence for a constitutional human bias towards compliance in the face of authority. From that interpretation has grown a doctrine that the atrocities of war and of despotism are enabled by the common man’s (sic) unresisting obedience to even a nominal superior, and further that inherent cruelty is eager to express itself under the pretext of an order.

Perry mounts a detailed challenge to the simplicity of that view. In particular, she reveals how Milgram piloted his experiments and fine-tuned them so that they would elicit the most conspicuous obedient behaviour. The experiments took place within the context of academic research. The experimenter did everything to hold himself out as the representative of an overwhelmingly persuasive body of scientific knowledge. At every stage the experimenter reassured the subjects and urged them to proceed. Given this real pressure applied to the experimental subjects, even a 65% compliance rate was hardly dramatic. Most interestingly, the actual reaction of the subjects to their experience was complex and ambiguous. It was far from the conventional view of the cathartic release of suppressed violence facilitated by a directive from a figure with superficial authority. Erich Fromm made some similar points about the experiments in his 1973 book The Anatomy of Human Destructiveness.

What interests me about the whole affair is its relevance to an issue which I have raised before on this blog: trust in bureaucracy. Max Weber was one of the first sociologists to describe how modern societies and organisations rely on a bureaucracy, an administrative policy-making group, to maintain the operation of complex dynamic systems. Studies of engineering and science as bureaucratic professions include Diane Vaughan’s The Challenger Launch Decision.

The majority of Milgram’s subjects certainly trusted the bureaucracy represented by the experimenter, even in the face of their own fears that they were doing harm. This is a stark contrast to some failures of such trust that I have blogged about here. By their mistrust, the cyclist on the railway crossing and the parents who rejected the MMR vaccination placed themselves and others in genuine peril. These were people who had, as far as I have been able to discover, no compelling evidence that the engineers who designed the railway crossing or the scientists who had tested the MMR vaccine might act against their best interests.

So we have a paradox. The majority of Milgram’s subjects ignored their own compelling fears and trusted authority. The cyclist and the parents recklessly ignored or actively mistrusted authority without a well-developed alternative world view. Whatever our discomfort with Milgram’s demonstrations of obedience, we feel no happier with the cyclist’s and parents’ disobedience. Prof Jerry M Burger partially repeated Milgram’s experiments in 2007. He is quoted by Perry as saying:

It’s not as clear cut as it seems from the outside. When you’re in that situation, wondering, should I continue or should I not, there are reasons to do both. What you do have is an expert in the room who knows all about this study and presumably has been through this many times before with many participants, and he’s telling you, there’s nothing wrong. The reasonable, rational thing to do is to listen to the guy who’s the expert when you’re not sure what to do.

Organisations depend on a workforce aligned around trust in that organisation’s policy and decision making machinery. Even in the least hierarchical of organisations, not everybody gets involved in every decision. Whether it’s the decision of a co-worker with an exotic expertise or the policy of a superior in the hierarchy, compliance and process discipline will succeed or fail on the basis of trust.

The “trust” that Milgram’s subjects showed towards the experimenter was manufactured, and Perry discusses how close the experiment ran to the limits of acceptable ethical standards.

Organisations cannot rely on such manufactured “trust”. Breakdown of trust among employees is a major enterprise risk for most organisations. The trust of customers is essential to reputation. A key question in all decision making is whether the outcome will foster trust or destroy it.

Trust in data – III – being honest about honesty

I found this presentation by Dan Ariely intriguing. I suspect that this is originally a TED talk with some patronising cartoons added. You can just listen.

When I started off in operational excellence, learning about the Deming philosophy, my instructors always used to say “These are honest men’s [sic] tools”. From that point of view Ariely’s presentation is pretty pessimistic. I don’t think I am entirely surprised when I recall Matt Ridley’s summary of evolutionary psychology from his book The Origins of Virtue.

Human beings have some instincts that foster the greater good and others that foster self-interest and anti-social behaviour. We must design a society that encourages the former and discourages the latter.

When wearing a change management hat it’s easy to be sanguine about designing a system or organisation that fosters virtue and the sort of diligent data collection that confronts present reality. However, it is useful to have a toolkit of tactics to build such a system. I think Ariely’s ideas are helpful here.

His idea of “reminders” is something that resonates with maintaining a continual focus on the Voice of the Customer/Voice of the Business. Periodically exploring with data collectors the purpose of their data collection, and the system-wide consequences of fabrication, is something that seems worthwhile in itself. However, the work Ariely refers to suggests that there might be reasons why such a “nudge” would be particularly effective in improving data trustworthiness.

His idea of “confessions” is a little trickier. I might reflect for a while then blog some more.