Sunday 30 October 2016

Risk Intelligent

Are You Risk Intelligent?

Risk is all around us, and not all risk is bad risk. If we don’t take risks in life, not only do we fail to learn, we also miss opportunities for business success. Preventing people from taking risks can stop them from learning how to make calculated decisions to manage risk if it ever arises.

While businesses strive to develop attributes such as agility and resilience, without applying Risk Intelligence, neither is achievable.
What is Risk Intelligence?

Risk Intelligence is a concept that goes beyond risk management. Risk Intelligence is knowing how to identify the key indicators (alarm bells) that signal risk, and then knowing how to manage what is found effectively enough to prevent failure.

Risk Indicator Clues

What are the ‘Risk Indicator Clues’ we need to consider?
  • Bounded Rationality – Also known as Tick and Flick or Flooding, this idea was first introduced by the researcher Herbert Simon. Bounded rationality means people can only retain so much information, particularly in their short-term memory. When we flood people with data and documentation, we tend to bound their rational thinking, so they begin to tick and flick just to get it out of the way. Much of the time they automatically revert to what feels like the right way, which, most of the time, is not a calculated and rational decision. An example might be a worker provided with a checklist containing a whole range of things to check off, along with procedures, a JSA, behavioural observations and take 5 / step backs, all under unforeseen time pressure.

  • Pace and Flow – Also known as rushing. A person who is engrossed in creating something, or is just working continuously, tends to get into the flow of things. When we get into the flow, we tend to become oblivious to what is going on around us and can even become blind to our surroundings. When things are going well in the workplace and we are in the flow of getting things done, it could very well be that we are in the energy of rushing, where we see less. Pace and flow is something we should be looking for as a risk indicator.

  • Preoccupation with Failure – Also known as Crosschecking. Karl E. Weick came up with the term Preoccupation with Failure, which in other words means we should entertain doubt. When we are rushing, in the flow of things, doing the same thing over and over, day in, day out, and things are going well, this is the time we are most likely to miss something that is about to go wrong. This is when we should stop, entertain doubt, re-look at what we are doing and try to discover what we may have become blind to. Using crosschecks in our day-to-day activities allows us to keep at bay the little issues that may become big issues. Crosschecking respects that humans are fallible; having a separate set of eyes check over what we are doing helps prevent errors. This is a critical step that can help prevent disasters. When many little issues go unnoticed they start piling up and usually become the big failure, much like the Swiss cheese model. The aviation industry uses crosschecks on all routine flights as a way of managing risk.

Is Common Sense common?
  • Sense-Making – No Common Sense. The phrase ‘common sense’ cops a lot of flak. Sense-making is not common, so no, there is no common sense. Hearing the words ‘common sense’ is an indication that we believe everyone should see things the same way and be on the same page, yet we are not. For example, Johnny believes it is common sense to vote for the Greens political party, yet this does not make sense to others; sense-making is therefore not common. It is critical to meet with teams in your workplace and gather their sense-making around risk; it will not be common. Good leaders know that the critical part of managing risk is asking the individuals in their team how they make sense of the risk. Let’s stop asking for, or expecting, common sense, and start asking people how they make sense of the things in front of them.

  • Hubris – Also known as Overconfidence. The word hubris comes from the Ancient Greeks. An indicator of hubris is someone who is blinded by their own overconfidence to the point that they stop listening, observing, or even communicating or working with others (a form of arrogance). We may hear or see it in the workplace; if so, we need to understand that hubris makes people blind to healthy decision making for managing risk.

  • Risk Homeostasis – Also known as Desensitisation. The theory of risk homeostasis was proposed by Professor Gerald Wilde. When we have something new to be aware of, we become more aware of it and its potential; however, after working with it or doing it for a long period of time we become desensitised. Sometimes we tend to think the person with more experience is the best person to deal with risk, when in fact it could very well be the opposite. The more experienced we become at something, and the longer we do it, the more we become bored with it and the less sensitive we are to the risks associated with the task or process. When we hear or notice risk homeostasis in people’s work, we should be aware that they are far less sensitive to risk.

Are you ready to become Risk Intelligent?
  • Assumptions – What we need to do to mitigate assumptions is create communication through open questions. We often make assumptions about all we do; the indicator for this is hearing people ask closed questions. Open questions are the best way to get at people’s assumptions and to help prevent you, the observer, from assuming too. We often think others see the world the same way we do, yet most of the time they have an entirely different view. Socratic questioning is a brilliant way to discover others’ views on risk and helps prevent assumptions being made. Closed questions are another key risk indicator that is critical to managing risk.

  • Automaticity – Also known as Autopilot. Being on autopilot means doing things automatically (non-rationally) without any real thinking. One of the best ways to prevent falling into the trap of automaticity is to work together with others and become collectively mindful about what we do. When we work, we work from the fast, unconscious part of the mind, which processes roughly 10–100 million bits per second. Working with others helps us discuss collectively how to tackle a task, so that when our automaticity takes over it is somewhat collectively understood and managed. It is important to realise that we are not in a rational state of mind when working automatically; 98% of our time is spent in automaticity, which is part of the reason collective discussions are important, so we can rationalise the risks beforehand.

  • Perceptions – Also known as Fallibility. Humans are fallible; we do make mistakes, and it is inevitable. Our perceptions can be tricked visually with illusions and aurally with what we hear; there is a whole range of ways our perception of reality can be puzzled, even in the way we see risk. Discussing how we perceive and make sense of risk will help us tackle and manage risk better; we become more risk intelligent as we learn from one another.

  • Tradeoffs & By-Products – When we consider risk intelligence we need to think about tradeoffs and by-products; every time a decision is made, we should expect there to be a tradeoff or by-product. No decision made to manage risk comes without a by-product or tradeoff. Even when using the hierarchy of controls we need to understand that, to be risk intelligent, no matter what the control is there will be either a tradeoff or a by-product. I think of it like a compass: no matter which way we turn, north still remains on the compass; we have just changed direction. It is the same with risk: it is still there, it has just moved or been traded off for another risk.


  • Absolutes – Also known as being inflexible. Hearing or seeing non-negotiable language or communication can be a sign of vulnerability to risk; being so tightly coupled leaves no room to look for new ways, new learnings or new risks. Language like ‘zero harm’ can unintentionally drive absolutes in an organisation even though that is not the intent. It is critical to be loosely coupled, flexible and agile; if we are open to the unknown unknowns and to new learnings, we will become far more resilient to risk. Being rigid with risk is dangerous.

  • Priming, Framing, Anchoring, Semiotics – Language, symbols and signs play a significant part in how we make sense of risk. The words we use prime, frame and anchor people to certain things. An example is the fire sign: the acronym we are meant to remember when it is time to be calm and not panic is RACE. “Remember to RACE”, yet the business that uses this sign actually wants everyone to keep calm. The significance of this sign is that it subconsciously puts into people’s heads that when a fire happens, you race. The sign works on both the collective and the individual unconscious. Risk intelligence teaches people to look at words and how they prime and frame the way people think, both consciously and unconsciously. Knowing what to hear and look for with this key indicator is very important to how we perceive and manage risk.

This article only gives brief examples of the critical Risk Indicator Clues needed for Risk Intelligence; there are many more critical learnings that go with this content to become truly risk intelligent.
At Human Rysk we have a series of experiential learning programs that teach practical ways to become ‘Risk Intelligent’. Get involved and have your organisation become resilient to risk!


M +61 427 052 998

E dennis@humanrysk.com.au

W www.humanrysk.com.au
