Leadership and decision making in safety-critical environments
Overcoming blind drift
By designing to contain risk, management can create an 'illusion
of control' and an institutional blindness to the possibility of
extreme events, ignoring weak signals, practical drift and the
normalisation of risk.
Achieving safety via organisational
ambidexterity
The ability to achieve efficiency, control and incremental
improvement while also developing the flexibility, autonomy and
experimentation needed to deal with unexpected events.
Ambidexterity can be:
• Structural (e.g. different departments, groups)
• Temporal (e.g. normal operations, high tempo and
emergency)
• Contextual (e.g. leaders and employees who can manage
the tensions/paradoxes)
Managing the unexpected
Complexity limits our ability to explain, predict and
control, resulting in uncertainty and stress.
Changing after extreme events
The "legitimate" recommendations required after extreme
events to codify learning and institutionalise practices may
constrain creative and flexible responses to a future
emerging crisis.
Managing change in aftermath of an extreme event can be
a wicked problem due to the multiplicity of stakeholders, the
socially constructed nature of 'extreme' events, ambiguity
relating to the cause, lack of precedents, the opportunity
cost and unintended consequences of interventions, the
difficulty of learning by trial and error and the inability to
define an achievable endpoint – how safe is safe enough?
The media also play a significant role, shaping public
perception and often transforming 'incidents' into 'real
crises' worthy of a front-page story.
Changes need to be embedded into the culture of the
organisation.
Addressing latent flaws
Active errors occur at the sharp end of the process. Latent
errors occur at the blunt end (e.g. equipment design flaws
or organisational flaws, such as staffing decisions),
creating 'holes' in the 'defensive layers' of the process.
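The 'holes in defensive layers' idea (Reason's Swiss cheese model) can be sketched as a toy simulation. Everything below is an illustrative assumption, not part of the research: the layer names and failure probabilities are hypothetical, chosen only to show that an accident needs holes to line up in every layer at once.

```python
import random

# Hypothetical defensive layers and the probability that each has a
# 'hole' (latent or active flaw) at any given moment. These numbers
# are illustrative assumptions, not figures from the research.
LAYERS = {
    "equipment_design": 0.05,   # latent flaw at the blunt end
    "staffing_policy": 0.10,    # latent organisational flaw
    "operator_check": 0.02,     # active error at the sharp end
}

def accident_occurs(layers, rng):
    """An accident propagates only if every defensive layer fails."""
    return all(rng() < p for p in layers.values())

def estimate_accident_rate(layers, trials=100_000, seed=42):
    """Monte Carlo estimate of how often all holes line up."""
    rng = random.Random(seed)
    hits = sum(accident_occurs(layers, rng.random) for _ in range(trials))
    return hits / trials

# With independent layers the probabilities multiply
# (0.05 * 0.10 * 0.02 = 1e-4), so each intact layer sharply cuts the
# accident rate -- which is why unnoticed latent 'holes' quietly
# erode the margin of safety.
print(estimate_accident_rate(LAYERS))
```

The design point of the sketch is the multiplication: removing a single latent flaw restores an entire factor of protection, whereas letting several accumulate silently can collapse the product toward certainty.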
David Denyer, Professor of Organisational Change
Email: david.denyer@cranfield.ac.uk
A strategic and integrating perspective on risk management
Increases understanding of the role of management practices and organisational culture in
both the causation of extreme events and in enabling or constraining the implementation
of lessons learned to prevent recurrence.
Centre for Customised Executive Development
Cross-disciplinary approach
The research builds bridges between researchers in
different fields of study through an inter-disciplinary,
collaborative and cross-sector approach.
Whole event sequence
Through the application of a range of methods, the
research offers a detailed empirical and analytical
account of extreme events from event initiation to
implementation of lessons learned. Few previous
studies address the whole event sequence.
Preventing the failure of technology and management
in complex systems
Dynamic
Cause and effect are subtle, and the effects over time of
interventions are not obvious (Senge).
Interactive
Unfamiliar, unplanned, or unexpected
sequences of events in the system,
particularly at the level of the working
environment.
Tightly coupled
The parts are highly interdependent,
that is, linked to many parts in a time-
dependent manner: change in one
part rapidly affects the status of other
parts and influences the system's
ability to recover.
"We should expect 'normal
accidents'" (Perrow)
“Accidents do not occur because people
gamble and lose, they occur because people
do not believe that the accident that is about to
occur is at all possible”
Perrow
Informed decision making: avoiding biases,
judgement traps, and bounded awareness
Common biases that contribute to extreme events:
• ‘Overconfidence’ (tendency of decision-makers to
overestimate their abilities to consistently make effective
decisions)
• ‘Confirmation’ (tendency to seek out or put more weight on
information that supports an initial opinion)
• ‘Anchoring’ (being closely ‘wed’ to an initial thought and
reluctant to adjust sufficiently away from it)
• ‘Availability’ (tendency to consider information that is easily
retrievable from memory as ‘more likely, more relevant, or
more important’ for making the judgement)
• ‘Hindsight’ (seeing events that have already occurred as
being more predictable than they were before they took
place)
Judgement traps that contribute to extreme events:
• 'Rush to solve' (a tendency to strive toward quick compromise
and early consensus, often to avoid conflict)
• ‘Groupthink’ (suppression of divergent views and/or
acceptance of dominant team members’ views expressed
early on)
• ‘Solving the wrong problem’ (often by not carefully/precisely
defining the problem)
'Bounded awareness' also contributes to extreme events:
• ‘Inattentional blindness’ (failing to notice an unexpected
stimulus that is in one's field of vision when other attention-
demanding tasks are being performed)
• ‘Change blindness’ (failing to see changes in one’s
environment)
Creating HROs
High reliability organisations (HROs) are
organisations that work in situations that have the
potential for large-scale risk and harm, but which
manage to balance effectiveness, efficiency and
safety. They also minimise errors through
teamwork, awareness of potential risk and
constant improvement.