
5

5 Why Technique

The 5 Whys is a question-asking ‘Six Sigma’ technique in which ‘why?’ is asked repeatedly (typically five times) during evaluation, drilling down into the cause-and-effect diagram for the further detail needed to determine a problem’s root cause.
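
A minimal sketch (the incident, questions and answers below are hypothetical, chosen only for illustration) of how a 5 Whys drill-down can be recorded as a chain of question-and-answer pairs leading from the observed problem to a candidate root cause:

    # Hypothetical 5 Whys chain: each "why" probes the previous answer.
    problem = "Monthly compliance report was submitted late"

    whys = [
        ("Why was the report late?", "The data extract finished after the deadline"),
        ("Why did the extract finish late?", "The overnight batch job had to be re-run"),
        ("Why was the job re-run?", "It failed on malformed input records"),
        ("Why were the records malformed?", "A supplier changed its file format"),
        ("Why was the change not caught?", "No validation step checks incoming file formats"),
    ]

    print(f"Problem: {problem}")
    for number, (question, answer) in enumerate(whys, start=1):
        print(f"{number}. {question}")
        print(f"   -> {answer}")

    # The final answer, not the first symptom, is the candidate root cause to act on.
    print(f"Candidate root cause: {whys[-1][1]}")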


A

Acute risks

Risks that result from a sudden event, e.g. a licensing standard not being met.


B

Black Swans

"Before the discovery of Australia, people in the Old World were convinced, that all swans were white, an unassailable belief as it seemed completely confirmed by empirical evidence. The sighting of the first black swan might have been an interesting surprise for a few ornithologists (and others extremely concerned with the colour of birds), but that is not where the significance of the story lies. It illustrates a severe limitation to our learning from observations or experience and the fragility of our knowledge. One single observation can invalidate a general statement derived from millennia of confirmatory sightings of millions of white swans. All you need is one single (and I am told, quite ugly) black bird."

(Taleb, Nassim N. 2007. The Black Swan: The Impact of the Highly Improbable. New York: Random House, P1 Prologue)

The Black Swan Theory was developed by Nassim Nicholas Taleb to explain:

  • The disproportionate role of high-profile, hard-to-predict, and rare events that are beyond the realm of normal expectations in history, science, finance, and technology
  • The non-computability of the probability of the consequential rare events using scientific methods (owing to the very nature of small probabilities)
  • The psychological biases that make people individually and collectively blind to uncertainty and unaware of the massive role of the rare event in historical affairs

In his 2007 book, Taleb is critical of our ability to predict the future. Much of our focus is on trying to create a narrative to explain the complexities of our world, and we are also over-reliant on projecting historical data in order to forecast and measure risks.


Butterfly Effect

The butterfly effect is a deceptively simple insight extracted from a complex modern field. In 1961, Edward Lorenz created an early computer program to simulate weather. One day he changed one of a dozen numbers representing atmospheric conditions from .506127 to .506.

That tiny alteration utterly transformed his long-term forecast, a point Lorenz amplified in his 1972 paper, ‘Predictability: Does the Flap of a Butterfly's Wings in Brazil set off a Tornado in Texas?’ The term ‘butterfly effect’ stems from Lorenz's suggestion that a massive storm might have its roots in the faraway flapping of a tiny butterfly’s wings.
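
A minimal sketch of that sensitivity (an illustration, not Lorenz's original weather program): the three-variable system Lorenz published in 1963, with the standard parameters sigma = 10, rho = 28, beta = 8/3 and a simple Euler integration step chosen here for brevity, run twice from starting points that differ only by the same kind of rounding:

    # Illustrative only: the three-variable Lorenz (1963) system, stepped with
    # simple Euler integration. The two runs start from points that differ only
    # by rounding one coordinate (0.506127 vs 0.506).

    def lorenz_step(state, dt=0.001, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
        x, y, z = state
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        return (x + dx * dt, y + dy * dt, z + dz * dt)

    a = (0.506127, 1.0, 1.0)  # "full precision" start
    b = (0.506, 1.0, 1.0)     # the same start, rounded

    for step in range(40001):
        if step % 10000 == 0:
            gap = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
            print(f"t = {step / 1000:5.1f}  separation = {gap:.6f}")
        a = lorenz_step(a)
        b = lorenz_step(b)

    # The separation grows from roughly 1e-4 to the size of the whole attractor,
    # i.e. the two "forecasts" end up completely different.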

The butterfly effect has become a metaphor for the existence of seemingly insignificant moments that alter history and shape destinies. Typically unrecognised at first, they create threads of cause and effect that appear obvious in retrospect, changing the course of a human life or rippling through the global economy.


C

Catastrophic risks

A disaster resulting in many deaths at one time.


Chaos Theory

Chaos theory concerns the qualitative study of unstable aperiodic behaviour in deterministic non-linear dynamical systems. (‘Non-linearity’ suggests that you can do the same thing several times over and get completely different results – all human relationships are non-linear.) Underlying chaos theory are the ideas that:

  • systems, no matter how complex, rely on underlying order;
  • very minor changes can cause complex behaviours and events.

The relevance of chaos theory can be summarised by the ‘butterfly effect’, which describes a situation where minute changes in the starting condition can have major and unpredictable consequences.
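
Both ideas can be seen in a minimal sketch (the logistic map is chosen here purely as an illustration): the rule x(n+1) = r · x(n) · (1 − x(n)) is completely deterministic and very simple, yet at r = 4 two starting values differing by one part in a million soon bear no resemblance to each other:

    # Illustration: the logistic map at r = 4 is a deterministic, orderly rule
    # that nevertheless behaves aperiodically and amplifies tiny differences.
    r = 4.0
    x, y = 0.300000, 0.300001  # starting values differing by 1e-6

    for n in range(1, 31):
        x = r * x * (1 - x)
        y = r * y * (1 - y)
        if n % 5 == 0:
            print(f"n = {n:2d}  x = {x:.6f}  y = {y:.6f}  |x - y| = {abs(x - y):.6f}")

    # By n = 30 the two sequences are effectively unrelated, even though each
    # was produced by exactly the same simple rule.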


Chronic risks

Risks that arise from slowly changing situations and circumstances (not a single event); they accumulate over a long time or recur frequently, e.g. climate change.


Control Risks

The process of identifying, analysing and planning for newly arising risks, and keeping track of the identified risks and those on the watch list.


Critical Success Factors

The concept of success factors is usually credited to Daniel (1961), who introduced it in relation to the 'management information crisis' being brought about 'by too rapid organisation change'.


D

Decision Trees (Cause and Effect Diagram)

Also called the Ishikawa diagram or fishbone diagram: a technique based on displaying causal factors in a tree structure so that cause-effect dependencies are clearly identified.
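
A minimal sketch (the effect, categories and causes are hypothetical) of causal factors laid out in such a tree, with the effect at the root and each branch recording a cause-effect dependency:

    # Hypothetical fishbone-style cause-and-effect tree. The effect sits at the
    # root; branches are categories of causes; leaves are the contributing
    # causes, so each cause-effect dependency is explicit.
    effect = "Project milestone missed"

    causes = {
        "People": ["Key engineer unavailable", "Responsibilities unclear"],
        "Process": ["No change-control step", "Estimates never reviewed"],
        "Technology": ["Build environment unstable"],
        "External": ["Supplier delivery delayed"],
    }

    print(f"Effect: {effect}")
    for category, items in causes.items():
        print(f"+- {category}")
        for cause in items:
            print(f"|  +- {cause}")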

 


