Risk of dishonesty – the supermarket checkout

This recent news item got me thinking again about the risks of dishonesty faced by organisations. It appears that modern self-service supermarket checkouts provide the opportunity for, and perhaps a “nudge” towards, theft. You may remember my earlier blog about this interesting presentation by Dan Ariely. One of the things Ariely suggests is that the cumulative losses from many small acts of dishonesty are far from negligible in economic terms.

In any organisation, it is a sad and disconcerting fact of human nature that there is a genuine and widespread propensity for dishonesty. Defensive policing is costly and probably ineffective. It is an attempt to “inspect quality into a product”. That means that systems have to be set up to “nudge” employees towards honesty at the design stage.

As the supermarket checkout example shows, individuals’ moral reactions are often sensitive to system design in subtle ways. Dishonesty does not often show up on risk assessments or FMEAs. Uncomfortable as it may feel, experience suggests that it should be an ever-present consideration in analysing risk. Perhaps that visibility might in itself be a positive “nudge” towards honesty.

Yet in suggesting that, I fear that the emotional costs of raising the issue in most organisations might outweigh the benefits. I wonder whether including honesty in the list of assumptions of a risk assessment would influence the people involved in the assessment. But then how to provide the “nudge” to those who weren’t there?

Do I have to be a scientist to assess food safety?

I saw this BBC item on the web before Christmas: Why are we more scared of raw egg than reheated rice? Just after Christmas seemed like a good time to blog about food safety. Actually, the link I followed asked Are some foods more dangerous than others? A question that has a really easy answer.

However, understanding the characteristic risks of various foods and how to prepare them most safely is less simple. Risk theorist John Adams draws a distinction between risks that are obvious and readily identified, and risks that can only be perceived with the help of science. Food risks fall into the latter category. As far as I can see, “folk wisdom” is no reliable guide here, even partially. The BBC article refers to risks from rice, pasta and salad vegetables which are not obvious. At the same time, in the UK at least, the risk from raw eggs is very small.

Ironically, raw eggs are one food that springs readily to British people’s minds when food risk is raised, largely due to the folk memory of a high-profile but ill-thought-out declaration by a government minister in the 1980s. This is an example of what Amos Tversky and Daniel Kahneman called an availability heuristic: if you can think of it, it must be important.

Food safety is an environment where an individual is best advised to follow the advice of scientists. We commonly receive this filtered, even if only for accessibility, through government agencies. That takes us back to the issue of trust in bureaucracy on which I have blogged before.

I wonder whether governments are in the best position to provide such advice. It is food suppliers who suffer from the public’s misallocated fears. The egg fiasco of the 1980s had a catastrophic effect on UK egg sales. All food suppliers have an interest in a market characterised by a perception that the products are safe. The food industry is also likely to be in the best position to know what is best practice, to improve such practice, to know how to communicate it to their customers, to tailor it to their products and to provide the effective behavioural “nudges” that promote safe handling. Consumers are likely to be cynical about governments, “one size fits all” advice and cycles of academic meta-analysis.

I think there are also lessons here for organisations. Some risks are assessed on the basis of scientific analysis. It is important that the prestige of that origin is communicated to all staff who will be involved in working with risk. The danger for any organisation is that an individual employee might make a reassessment based on local data and their own self-serving emotional response. As I have blogged before, some individuals have particular difficulty in aligning themselves with the wider organisation.

Of course, individuals must also be equipped with the means of detecting when the assumptions behind the science have been violated and initiating an agile escalation so that employee, customer and organisation can be protected while a reassessment is conducted. Social media provide new ways of sharing experience. I note from the BBC article that, in the UK at least, there is no real data on the origins of food poisoning outbreaks.

So the short answer to the question at the head of this blog still turns out to be “yes”. There are some things where we simply have to rely on science if we want to look after ourselves, our families and our employees.

But even scientists are limited by their own bounded rationality. Science is a work in progress. Using that science itself as a background against which to look for novel phenomena and neglected residual effects leverages that original risk analysis into a key tool in managing, improving and growing a business.

Trust in data – III – being honest about honesty

I found this presentation by Dan Ariely intriguing. I suspect that this is originally a TED talk with some patronising cartoons added. You can just listen.

When I started off in operational excellence learning about the Deming philosophy, my instructors always used to say, “These are honest men’s [sic] tools.” From that point of view Ariely’s presentation is pretty pessimistic. I don’t think I am entirely surprised when I recall Matt Ridley’s summary of evolutionary psychology from his book The Origins of Virtue.

Human beings have some instincts that foster the greater good and others that foster self-interest and anti-social behaviour. We must design a society that encourages the former and discourages the latter.

When wearing a change management hat it’s easy to be sanguine about designing a system or organisation that fosters virtue and the sort of diligent data collection that confronts present reality. However, it is useful to have a toolkit of tactics to build such a system. I think Ariely’s ideas are helpful here.

His idea of “reminders” resonates with maintaining a continual focus on the Voice of the Customer/Voice of the Business. Periodically exploring with data collectors the purpose of their data collection and the system-wide consequences of fabrication seems worthwhile in itself. However, the work Ariely refers to suggests that there might be reasons why such a “nudge” would be particularly effective in improving data trustworthiness.

His idea of “confessions” is a little trickier. I might reflect for a while then blog some more.

Trust in data – II

I just picked up on this, now not so recent, news item about the prosecution of Steven Eaton. Eaton was gaoled for falsifying data in clinical trials. His prosecution was pursuant to the Good Laboratory Practice Regulations 1999. The Regulations apply to chemical safety assessments and come to us, in the UK, from that supra-national body, the OECD. Sadly, I have managed to find few details other than the press reports. I have had a look at the website of the prosecuting Medicines and Healthcare products Regulatory Agency but found nothing beyond the press release. I thought about a request under the Freedom of Information Act 2000 but wonder whether an exemption is being claimed pursuant to section 31.

It’s a shame because it would have been an opportunity to compare and contrast with another notable recent case of industrial data fabrication, that concerning BNFL and the Kansai Electric contract. Fortunately, in that case, the HSE made public a detailed report.

In the BNFL case, technicians had fabricated measurements of the diameters of fuel pellets in nuclear fuel rods, it appears principally out of boredom at doing the actual job. The customer spotted it, BNFL didn’t. The matter caused huge reputational damage to BNFL and resulted in the shipment of nuclear fuel rods, necessarily under armed escort, being turned around mid-ocean and returned to the supplier.

For me, the important lesson of the BNFL affair is that businesses must avoid a culture where employees decide which parts of the job are important and interesting to them, what is called intrinsic motivation. Intrinsic motivation is related to a sense of cognitive ease. That sense rests, as Daniel Kahneman has pointed out, on an ecology of unknown and unknowable beliefs and prejudices. No doubt the technicians had encountered nothing but boringly uniform products. They took that as a signal to stop measuring and to conceal the fact that they had stopped, and felt a sense of cognitive ease in doing so.

However, nobody in the supply chain is entitled to ignore the customer’s wishes. Businesses need to foster the extrinsic motivation of the voice of the customer. That is what defines a job well done. Sometimes it will be irksome and involve a lot of measuring pellets whose dimensions look just the same as the last batch. We simply have to get over it!

The customer wanted the data collected, not simply as a sterile exercise in box-ticking, but as a basis for diligent surveillance of the manufacturing process and as a critical component of managing the risks attendant on real-world nuclear industry operations. The customer showed that proper scrutiny of the data, exactly what they had thought BNFL would perform as part of the contract, would have exposed its inauthenticity. BNFL were embarrassed, not only by their lack of management control over their own technicians, but by the exposure of their own incapacity to scrutinise data and act on its signal message. Even if all the pellets were of perfect dimension, the customer would be legitimately appalled that so little critical attention was being paid to keeping them so.

Data that is properly scrutinised, as part of a system of objective process management and with the correct statistical tools, will readily expose fabrication. That is part of incentivising technicians to do the job diligently. Dishonesty must not be tolerated. However, it is essential that everybody in an organisation understands the voice of the customer and understands the particular way in which they themselves add value. A scheme of goal deployment weaves the threads of the voice of the customer together with those of individual process-management tactics. That is what provides an individual’s insight into how their work adds value for the customer. That is what provides the “nudge” towards honesty.
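To illustrate how even simple statistical scrutiny can expose fabricated measurements: invented numbers tend to be too regular, showing far less spread than a real process produces. The following is a minimal sketch, assuming a known process standard deviation; the nominal pellet diameter, the function name and the threshold are all illustrative choices of mine, not figures from the BNFL report.

```python
# Illustrative sketch: flag a batch of measurements whose spread is
# implausibly small relative to the known process variation. People
# inventing "plausible" readings usually copy the nominal value with
# only a token wobble, so the sample variance collapses.
# All numbers here are made up for illustration.

import random
import statistics

def variance_ratio_flag(sample, reference_sd, threshold=0.25):
    """Return (flagged, ratio) where ratio compares the sample
    variance with the assumed process variance. A ratio far below 1
    suggests the data is too regular to be genuine measurement."""
    ratio = statistics.variance(sample) / reference_sd ** 2
    return ratio < threshold, ratio

random.seed(42)
reference_sd = 0.01  # assumed process standard deviation, mm

# Genuine measurements: nominal 8.19 mm pellets with real process noise.
genuine = [8.19 + random.gauss(0, reference_sd) for _ in range(50)]

# Fabricated measurements: the nominal value with a token wobble.
fabricated = [8.19 + random.choice([-0.001, 0.0, 0.001]) for _ in range(50)]

flag_g, _ = variance_ratio_flag(genuine, reference_sd)
flag_f, _ = variance_ratio_flag(fabricated, reference_sd)
print("genuine flagged:", flag_g)      # expect False
print("fabricated flagged:", flag_f)   # expect True
```

In practice the same screening role would be played by a control chart on batch ranges, or a test of terminal-digit uniformity; the point is only that fabrication leaves a statistical signature that routine scrutiny can catch.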