Proposition 65

I took a break from posting following my recent family vacation to California. While I was out there I noticed this rather alarming notice at a beach hotel and restaurant in Santa Monica. After a bit of research it turned out that the notice was motivated by California Proposition 65 (1986). Everywhere we went in California we saw similar notices.

I approach this issue not solely as somebody professionally involved in risk but also as an individual concerned for his own health and that of his family. If there is an audience for warnings of harm then it is me.

I am aware of having embarked on a huge topic here but, as I say, I do so as a concerned consumer of risk advice. The notice, and I hesitate to call it a warning, was unambiguous. Apparently, this hotel, teeming with diners and residents enjoying the Pacific coast, did contain chemicals emphatically “known” to cause cancer, birth defects or reproductive harm. Yet, for such dreadful risks, the notice gave alarmingly vague information. I saw that a brochure was available within the hotel but my wife was unwilling to indulge my professional interest. I suspect that most visitors showed even less curiosity.

As far as discharging any legal duty goes, vague notices offer no protection to anybody. In the English case of Vacwell Engineering Co. Ltd v B.D.H. Chemicals Ltd [1969] 3 All ER 1681, Vacwell purchased ampoules of boron tribromide from B.D.H. The ampoules bore the label “Harmful Vapour”. While the ampoules were being washed, one was dropped into a sink where it fractured, allowing the contents to come into contact with water. Boron tribromide reacts violently with water and the resulting explosion killed one employee and extensively damaged a laboratory building. The label had given Vacwell no information as to the character or possible severity of the hazard, nor any specific details that would assist in avoiding the consequences.

Likewise, the Proposition 65 notice gives me no information on the severity of the hazard. There is a big difference between “causing” cancer and posing a risk of cancer. The notice doesn’t tell me whether cancer is an inevitable consequence of exposure or whether I should merely shorten my odds against mortality. There is no quantification of risk on which I can base my own decisions.
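To make the point concrete, here is a minimal sketch, in Python and with entirely hypothetical figures, of why a bare “known to cause cancer” tells me so little: the same relative risk implies very different absolute changes depending on the baseline, and it is the absolute figure on which I would base a personal decision.

```python
# Entirely hypothetical, illustrative figures - not Proposition 65 data.
baseline_lifetime_risk = 0.005   # assumed 0.5% baseline lifetime risk of a given cancer
relative_risk = 1.2              # assumed 20% relative increase from the exposure

# The absolute increase is what matters for a personal decision.
absolute_increase = baseline_lifetime_risk * (relative_risk - 1)
print(f"Extra absolute lifetime risk: {absolute_increase:.2%}")   # prints 0.10%
```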

Nor does the notice give me any guidance on what to do to avoid or mitigate the risk. Will setting foot inside the premises imperil my health? Or are there only certain areas that are hazardous? Are these delineated with further and more specific warnings? Or even ultimately segregated in secure areas? Am I even safe immediately outside the premises? Ten yards away? A mile? I have to step inside to acquire the brochure so I think I should be told.

The notice ultimately fulfils no socially useful purpose whatever. I looked at the State of California’s own website on the matter but found it too opaque to extract any useful information within the time I was willing to spend on it, which I suspect is still more time than most visitors who find their way there are prepared to give.

It is most difficult for members of the public, even those engaged and interested, to satisfy themselves as to the science on these matters. The risks fall within what John Adams at University College London characterises as risks that are known to science but on which normal day-to-day intuition is of little use. The difficulty we all have is that our reflection on the risks is conditioned by the anecdotal hearsay that we pick up along the way. I have looked before at the question of whether anecdote is data.

In 1962, Rachel Carson published the book Silent Spring. The book aggregated anecdotes and suggestive studies, leading Carson to infer that industrial pesticides were harming agriculture, wildlife and human health. Again, proper evaluation of the case she advanced demands more attention to scientific detail than any lay person is willing to spare. However, the fear she articulated lingers and conditions our evaluation of other claims. It seems so plausible that synthetic chemicals, developed for lethal effect rather than evolved in symbiosis with the natural world, would pose a threat to human life and be an explanation for increasing societal morbidity.

However, where data is sparse and uncertain, it is important to look for other sources of information that we can “borrow” to add “strength” to our preliminary assessment (Persi Diaconis’ classic paper Theories of Data Analysis: From Magical Thinking through Classical Statistics has some lucid insights on this). I found that the Cancer Research UK website provided some helpful strength to borrow. Cancer is becoming more prevalent largely because we are living longer. Cancer Research UK helpfully referred me to this academic research published in the British Journal of Cancer.
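For those who like to see the mechanics, here is a minimal sketch, in Python and with wholly invented numbers, of the simplest form of borrowing strength: inverse-variance (precision-weighted) pooling of a sparse local estimate with a better-supported external one. It illustrates the idea only, not any calculation in the papers cited above.

```python
# A minimal illustration of "borrowing strength": inverse-variance pooling
# of a noisy local estimate with a more precise external one.
# All figures are hypothetical.

def borrow_strength(local_est, local_se, external_est, external_se):
    """Precision-weighted combination of two independent estimates."""
    w_local = 1.0 / local_se ** 2
    w_external = 1.0 / external_se ** 2
    pooled = (w_local * local_est + w_external * external_est) / (w_local + w_external)
    pooled_se = (w_local + w_external) ** -0.5
    return pooled, pooled_se

# A sparse, noisy local figure is pulled towards the better-supported one.
print(borrow_strength(local_est=2.0, local_se=1.5, external_est=1.1, external_se=0.2))
```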

Despite the difficulty in disentangling and interpreting data on the specific risks of alleged carcinogens, we can borrow strength from life expectancy data. Life expectancy has manifestly improved in the half century since Carson’s book, belying her fear of a toxic catastrophe flowing from our industrialised society. I think that is why there was so much indifference to the Santa Monica notice.

I should add that, inside the hotel, I spotted five significant trip hazards. I suspect these posed a much more substantial threat to visitors’ wellbeing than the virtual risks of contamination with hotel carcinogens.

Risk of dishonesty – the supermarket checkout

This recent news item got me thinking again about the risks of dishonesty faced by organisations. It appears that modern self-service supermarket checkouts provide the opportunity for, and perhaps a “nudge” towards, theft. You may remember my earlier blog about this interesting presentation by Dan Ariely. One of the things Ariely suggests is that the cumulative losses from many small acts of dishonesty are far from negligible in economic terms.

In any organisation, it is a sad and disconcerting fact of human nature that there is a genuine and widespread propensity for dishonesty. Defensive policing is costly and probably ineffective. It is an attempt to “inspect quality into a product”. That means that systems have to be set up to “nudge” employees towards honesty at the design stage.

As the supermarket checkout example shows, individuals’ moral reactions are often sensitive to system design in subtle ways. Dishonesty does not often show up on risk assessments or FMEAs. Uncomfortable as it may feel, experience tends to suggest that it is something that should be ever present in analysing risk. Perhaps that visibility might in itself be a positive “nudge” towards honesty.

Yet in suggesting that, I fear that the emotional costs of raising the issue in most organisations might outweigh the benefits. I wonder if including honesty in the list of assumptions of a risk assessment would influence the people involved in the assessing. But then how to provide the “nudge” to those who weren’t there?

M5 “fireworks crash” – risk identification and reputation management

UK readers will recall this tragic accident in November 2011, when seven people were killed and 51 injured on a fog-bound motorway.

What marked out the accident from a typical collision in fog was the suggestion that the environmental conditions had been exacerbated by smoke that had drifted onto the motorway from a fireworks display at nearby Taunton Rugby Club.

This suggestion excited a lot of press comment. Geoffrey Counsell, the fireworks professional who had been contracted to organise the event, was subsequently charged with manslaughter. The prosecutor’s allegation was that he had fallen so far below the standard of care he purportedly owed to the motorway traffic that a reasonable person would think a criminal sanction appropriate.

It is very difficult to pick out from the press reports exactly how this whole prosecution unravelled. Firstly, the prosecutors resiled from the manslaughter charge, a most serious matter that in the UK can attract a life sentence. They substituted a charge under section 3(2) of the Health and Safety at Work etc. Act 1974, that Mr Counsell had failed “to conduct his undertaking in such a way as to ensure, so far as is reasonably practicable, that … other persons (not being his employees) who may be affected thereby are not thereby exposed to risks to their health or safety.”

There has been much commentary from judges and others on the meaning of “reasonably practicable” but suffice to say, for the purposes of this blog, that a self-employed person is required to make a substantial effort to protect the public. That said, the section 3 offence carries a maximum sentence of no more than two years’ imprisonment.

The trial on the section 3(2) indictment opened on 18 November 2013. “Serious weaknesses” in the planning of the event were alleged. There were vague press reports about Mr Counsell’s risk assessment but they were insufficient for me to form any exact view. It does seem that he had not considered smoke drifting onto the motorway and interacting with fog to create an especial hazard to drivers.

A more worrying feature of the prosecution was the press suggestion that an expert meteorologist had based his opinion on a biased selection of the witness statements provided to him, describing which way the smoke from the fireworks display had been drifting. I only have the journalistic account of the trial but it looks far from certain that the smoke did in fact drift towards the motorway.

In any event, on 10 December 2013, following the close of the prosecution evidence, the judge directed the jury to acquit Mr Counsell. The prosecutors had brought forward insufficient evidence against Mr Counsell for a jury reasonably to return a conviction, even before any evidence had been called in his defence.

An individual, no matter how expert, is at a serious disadvantage in identifying novel risks. An individual’s bounded rationality will always limit the futures he can conjure and the weight that he gives to them. To be fair to Mr Counsell, he says that he did seek input from the Highways Agency, Taunton Deane Borough Council and Avon and Somerset Police, but that they did not respond. If that is the case, I am sure that those public bodies will now reflect on how they could have assisted Mr Counsell’s risk assessment, the better to protect the motorists and, in fact, Mr Counsell himself. The judge’s finding, that this was an accident that Mr Counsell could not reasonably have foreseen, feels like a just decision.

Against that, hypothetically, had the fireworks been set off by a household-name corporation, it would rightly have felt ashamed at not having anticipated the risk and taken any necessary steps to protect the motorway drivers. There would have been reputational damage. A sufficient risk assessment would have provided the basis for investigating whether the smoke was in fact a cause of the accident and, where appropriate, advancing a robust and persuasive rebuttal of blame.

That is the power of risk assessment. Not only is it a critical foundational element of organisational management, it provides a powerful tool in managing reputation and litigation risk. Unfortunately, unless there is a critical mass of expertise dedicated to risk identification, it is more likely that the assessment will provide a predatory regulator with evidence of slipshod practice. Its absence is, of course, damning.

As a matter of good business and efficient leadership, the Highways Agency, Taunton Deane Borough Council, and Avon and Somerset Police ought to have taken Mr Counsell’s risk assessment seriously if they were aware of it. They would surely have known that they were in a better position than Mr Counsell to assess risks to motorists. Fireworks displays are tightly regulated in the UK yet all such regulation has failed to protect the public in this case. Again, I think that the regulators might look to their own role.

Organisations must be aware of external risks. Where they are not engaged with the external assessment of such risks, they are really in an oppositional situation that must be managed accordingly. Where they are engaged, the external assessments must be integrated into their own risk strategy.

It feels as though Mr Counsell has been unjustly singled out in this tragic matter. There was a rush to blame somebody and I suspect that an availability heuristic was at work. Mr Counsell attracted attention because the alleged causation of the accident seemed so exotic and unusual, the very grounds on which the court ultimately held him blameless.

Do I have to be a scientist to assess food safety?

I saw this BBC item on the web before Christmas: Why are we more scared of raw egg than reheated rice? Just after Christmas seemed like a good time to blog about food safety. Actually, the link I followed asked Are some foods more dangerous than others? A question that has a really easy answer.

However, understanding the characteristic risks of various foods and how most safely to prepare them is less simple. Risk theorist John Adams draws a distinction between readily identified inherent and obvious risks, and risks that can only be perceived with the help of science. Food risks fall into the latter category. As far as I can see, “folk wisdom” is no reliable guide here, even partially. The BBC article refers to risks from rice, pasta and salad vegetables which are not obvious. At the same time, in the UK at least, the risk from raw eggs is very small.

Ironically, raw eggs are one food that springs readily to British people’s minds when food risk is raised, largely due to the folk memory of a high-profile but ill-thought-out declaration by a government minister in the 1980s. This is an example of what Amos Tversky and Daniel Kahneman called an availability heuristic: if you can think of it, it must be important.

Food safety is an environment where an individual is best advised to follow the advice of scientists. We commonly receive this filtered, even if only for accessibility, through government agencies. That takes us back to the issue of trust in bureaucracy on which I have blogged before.

I wonder whether governments are in the best position to provide such advice. It is food suppliers who suffer from the public’s misallocated fears. The egg fiasco of the 1980s had a catastrophic effect on UK egg sales. All food suppliers have an interest in a market characterised by a perception that the products are safe. The food industry is also likely to be in the best position to know what is best practice, to improve such practice, to know how to communicate it to their customers, to tailor it to their products and to provide the effective behavioural “nudges” that promote safe handling. Consumers are likely to be cynical about governments, “one size fits all” advice and cycles of academic meta-analysis.

I think there are also lessons here for organisations. Some risks are assessed on the basis of scientific analysis. It is important that the prestige of that origin is communicated to all staff who will be involved in working with risk. The danger for any organisation is that an individual employee might make a reassessment based on local data and their own self-serving emotional response. As I have blogged before, some individuals have particular difficulty in aligning themselves with the wider organisation.

Of course, individuals must also be equipped with the means of detecting when the assumptions behind the science have been violated and initiating an agile escalation so that employee, customer and organisation can be protected while a reassessment is conducted. Social media provide new ways of sharing experience. I note from the BBC article that, in the UK at least, there is no real data on the origins of food poisoning outbreaks.

So the short answer to the question at the head of this blog still turns out to be “yes”. There are some things where we simply have to rely on science if we want to look after ourselves, our families and our employees.

But even scientists are limited by their own bounded rationality. Science is a work in progress. Using that science itself as a background against which to look for novel phenomena and neglected residual effects leverages that original risk analysis into a key tool in managing, improving and growing a business.

The cyclist on the railway crossing – a total failure of risk perception

This is a shocking video. It shows a cyclist wholly disregarding warnings and safety barriers at a railway crossing in the UK. She evaded death, and the possible derailment of the train, by the thinnest of margins imaginable.

In my mind this raises fundamental questions, not only about risk perception, but also about how we can expect individuals to behave in systems not of their own designing. Such systems, of course, include organisations.

I was always intrigued by John Adams’ anthropological taxonomy of attitudes to risk (taken from his 1995 book Risk).


Adams identifies four attitudes to risk found at large. Each is entirely self-consistent within its own terms. The egalitarian believes that human and natural systems inhabit a precarious equilibrium. Any departure from the sensitive balance will propel the system towards catastrophe. However, the individualist believes the converse, that systems are in general self-correcting. Any disturbance away from repose will be self-limiting and the system will adjust itself back to equilibrium. The hierarchist agrees with the individualist up to a point but only so long as any disturbance remains within scientifically drawn limits. Outside that lies catastrophe. The fatalist believes that outcomes are inherently uncontrollable and indifferent to individual ambition. Worrying about outcomes is not the right criterion for deciding behaviour.

Without an opportunity to interview the cyclist it is difficult to analyse what she was up to. Even then, I think that it would be difficult for her recollection to escape distortion by some post hoc and post-traumatic rationalisation. I think Adams provides some key insights but there is a whole ecology of thoughts that might be interacting here.

Was the cyclist a fatalist, resigned to the belief that, no matter how she behaved on the road, injury, should it come, would be capricious and arbitrary? Time and chance happeneth to them all.

Was she an individualist confident that the crossing had been designed with her safety assured and that no mindfulness on her part was essential to its effectiveness? That would be consistent with the risk compensation that Adams describes: whenever a process is made safer on our behalf, we have a tendency to increase our own risk-taking so that the overall risk remains much as before. Adams cites the example of seatbelts in motor cars leading to more aggressive driving.

Did the cyclist perceive any risk at all? Wagenaar and Groeneweg (International Journal of Man-Machine Studies, 1987, 27, 587) reviewed something like 100 shipping accidents and came to the conclusion that:

Accidents do not occur because people gamble and lose, they occur because people do not believe that the accident that is about to occur is at all possible.

Why did the cyclist not trust that the bells, flashing lights and barriers had been provided for her own safety by people who had thought about this a lot? The key word here is “trust” and I have blogged about that elsewhere. I feel that there is an emerging theme of trust in bureaucracy. Engineers are not used to mistrust, other than from accountants. I fear that we sometimes assume too easily that anti-establishment instincts are constrained by the instinct for self-preservation.

However we analyse it, the cyclist suffered from a near-fatal failure of imagination. Imagination is central to risk management: the richer the spectrum of futures anticipated, the more effectively risk management can be designed into a business system. To the extent that our imagination is limited, we are hostage to our agility in responding to signals in the data. That is what the cyclist discovered when she belatedly spotted the train.

Economist G L S Shackle made this point repeatedly, especially in his last book Imagination and the Nature of Choice (1979). Risk management is about getting better at imagining future scenarios but still being able to spot when an unanticipated scenario has emerged, and being excellent at responding efficiently and timeously. That is the big picture of risk identification and risk awareness.

That then leads to the question of how we manage the risks we can see. A fundamental question for any organisation is what sort of risk takers inhabit its ranks. Risk taking is integral to pursuing an enterprise. Each organisation has its own risk profile. It is critical that individual decision makers are aligned to that. Some will have an instinctive affinity for the corporate philosophy. Others can be aligned through regulation, training and leadership. Some others will not respond to guidance. It is the latter category who must only be placed in positions where the organisation knows that it can benefit from their personal risk appetite.

If you think this an isolated incident and that the cyclist doesn’t work for you, you can see more railway crossing incidents here.