Richard Booth reflects on his role in the ACSNI human factors study group, part of the UK's response to the Chernobyl Disaster. This was one of the first major reports to link an organisation's culture to its safety performance and to bring the importance of safety culture to the attention of the health and safety profession.

By far the most effective committee on which I have ever served was the human factors study group of HSC's [1] Advisory Committee for Safety in Nuclear Installations (ACSNI).  I was a member when the 3rd report was prepared: "Organising for safety" [2].  I have already described the Group's visit to two nuclear power stations.  The Chair was Dr Donald Broadbent CBE FRS.  The members of the Group are listed at the end.

The success of the Group’s 3rd Report was a result of ‘process’ – just as was the case with SOP development at Station B.  This is implicit in all that follows.

The Group was established in consequence of the 28 April 1986 Chernobyl disaster.  Lord Marshall, then Chairman of the CEGB, was startled by the prevalence of rule violations at Chernobyl.  He asked the profoundly naïve question: was it conceivable that violations could even occur in UK nuclear power stations?

The 3rd Report described in detail, and defined, the concept of ‘safety culture’ – and the link with violations.  It was the foundation for UK safety climate attitude surveys.  The recommendations were apposite to safety management in general, not just to nuclear safety.  It was the thinking person’s HSG65.

When I joined, hopefully to provide expertise in safety management, the Group was just completing its 2nd report [Ref: HSC (1991) “ACSNI Study Group on Human Factors Second Report: Human Reliability Assessment: a Critical Overview” HMSO, London (out of print)].  The report compared competing methods for predicting human error rates.  Its conclusions were typically iconoclastic: the calculated probabilities were almost meaningless; but such analyses were nonetheless invaluable in anticipating human errors and reducing their likelihood – as long as no one believed the results!

The nuclear industry’s focus on the number of unintended errors (slips, lapses and mistakes) – required for quantified risk assessment – was to the detriment of a focus on violations.  Of course violations were nothing new.  A major challenge of machinery guarding from the Industrial Revolution onwards was the improper removal of guards when the machinery was running.  Nuclear power stations have much critical equipment and procedures where rule-violations are foreseeable.

The meetings

The success of our work was that all the members were enthusiastic experts who enjoyed coming to the meetings.  Donald told me that his first objective was to make meetings fun, doubtless from his experiences of interminable university committees.

The fun was in observing and participating in incisive debates where Donald and Professor Terry Lee (both eminent psychologists, engaging with workplace safety organisation for the first time) challenged the rest of us to justify key precepts of safety management, sometimes to our discomfiture.  Only two research papers directly relevant to our work were found to be secure (one was Zohar (1980)).  Even the methodology of the 'Hawthorne Experiment' was found to be deeply flawed (despite the findings being 'right').

All the report drafting was carried out in our own time.  The meetings were for intense debate from first principles; this was the process.  We all met our deadlines because the work took top priority.  Drafts were then allocated for review and editing by the member who had shown the most interest in the draft, guided by the trend of discussion.  In one case, text I had written included a safety planning process about which there was some disquiet.  The material was allocated to another for review.  My planning process was quietly omitted.  Further editing work came back to me once the deletion had been made.  Donald's skill was to avoid tedious arguments; he achieved his goals by subtle fait accompli.  Incidentally, my (Hastam's) planning process was later included in BS 8800:1996 and later in BS 18004:2008, where it belonged.


We called witnesses, including Professor Andrew Hale whose ‘Hale-Glendon’ model was included as an appendix in the report.

Some of our interviewees were given a hard time, notably the Nuclear Installations Inspectorate (NII).  Their team asserted with fervour that human factors were crucially important in the industry, and a top priority.  Terry interjected: "There are 250 nuclear inspectors …. how many of them are human factors specialists?"  "[Prolonged embarrassed silence] …. One … but we are thinking about recruiting a second!"

The most impressive witness was Tom Ryan who reported on the findings of the huge US research programme following Chernobyl.  The researchers were asked to rank order the issues that most affected safety in power stations.  The top four were:

  • effective communication, leading to commonly understood goals, and means to achieve the goals, at all levels in the organisation;
  • good organisational learning, where organisations are tuned to identify and respond to incremental change;
  • organisational focus, simply the attention devoted by the organisation to workplace safety and health;
  • external factors, including the financial health of the parent organisation, or simply the economic climate within which the company is working, and the impact of regulatory bodies.

We said:

“A constant theme of the discussion of safety culture is that it is a sub-set of, or at least profoundly influenced by, the overall culture of an organisation. It follows that the safety performance of organisations is greatly influenced by aspects of management that have not traditionally been ‘part of safety’.”

My contributions to the Report

Which of my contributions was I most proud of?

First, though, I recall my least auspicious contribution.  I worked half the night to prepare a long list of semi-structured questions, and prompts, for the interviews with power station staff the next day.  My 'team', Donald and Terry, looked politely at my questions over breakfast and passed them back to me; in the event their 'blagged' questions and supplementaries were far more insightful than mine.  I watched in awe and said nothing, but kept copious notes for then and later.

But near the end of our work I went through the whole report to identify the key precepts that we were advocating.  I turned all these into a structured question-set that was the starting point for HSE's 'Safety Climate Tool'.

Moreover, following discussion of the existing definitions, I was asked to draft a new 'definition' of 'safety culture', widely adopted thereafter, e.g. in HSG65 (ibid).  The key point was that a safety culture could be good or bad, in contrast to earlier definitions.  The second half is the part that matters:

“Organisations with a positive safety culture are characterised by communications founded on mutual trust, by shared perceptions of the importance of safety and by confidence in the efficacy of preventive measures.”

My material was generally well received.  Donald attributed this to my sending in my material in 14-point font, double-spaced, with short paragraphs, in contrast to Terry (the other main author) whose impenetrable prose was tightly packed on the page!  On one occasion we were both asked to write competing conclusions to a key chapter.  It turned out that our drafts were totally different, but complementary.  Both were included.  A feature of the report was the seamless shifts between discussion of traditional principles, or at least those that survived scrutiny, and the underpinning research evidence.


The Group 'worked' because everyone contributed to our debates with perceptive observations.  The HSE secretariat helped us avoid technical and regulatory naiveties, and the witnesses brought us rapidly up to speed as to the state of the art.  The site visits helped the team bond.  Above all, the Chair ensured that a seminal report was delivered on time, and that we enjoyed the work.

At the end of the proceedings the CBI representative (my cousin, Richard Amis CBE) entertained us all to dinner at the Carlton Club.  The committee members had become good friends, and wished to celebrate our work.

If I get a moment, I will write another post which will quote from, and comment on, the Report, partly for historical reasons, but mainly because its wisdoms are just as apposite today.

The members of the Study Group were:

Dr D E Broadbent, CBE, FRS, Department of Experimental Psychology, University of Oxford

Mr R H Amis, CBE, Formerly Chairman of Alfred Booth & Co Plc, representing the CBI

Dr P W Ball, Head of Quality and Safety Assurance, BNFL

Mr G Bellard, representing the TUC

Professor R T Booth, Director of Health and Safety Unit, Aston University

Professor T R Lee, Psychological Laboratory, University of St Andrews

Mr G C Simpson, Head of Ergonomics, British Coal

The Study Group was supported in its activities by:

Dr R Stubbs, HM Principal Nuclear Inspector, ACSNI, Technical Secretary

Mr R Delleman, ACSNI Secretary (to August 1992)

Mr J Carling, ACSNI Secretary (from August 1992)

[1]      Now part of HSE of course.

[2]      HSC (1993) “Advisory Committee for Safety in Nuclear Installations Human Factors Study Group Third Report: Organising for Safety” HMSO London; (out of print); Booth RT & Lee TR (1995) “The Role of Human Factors and Safety Culture in Safety Management” Proc Inst Mech Engrs Vol 209 pp 393-400.  The latter is a ‘de-nuclearized’ version of the HSC report focussing only on the key themes.
