What happens when a financial institution takes a traditional approach to risk, aiming to eliminate or minimise human error through procedure-laden controls and a focus on individuals' actions? Unless the industry is highly stable and involves very little complexity, it is like trying to fit a square peg into a round hole. In industries like banking and finance, where unpredictability and paradox are the only certainties, controls such as second-person verification and individual disciplinary action rarely address the underlying issues.
Why is this?
Unfortunately, traditional approaches to risk management tend to focus on the study of individual failures and the “cause” of errors. When we focus on the individual, we overlook the complexity and interdependency of systemic factors such as the internal/external environment, systems, procedures, and culture involved in the work.
To explore this further, let’s take a look at a hypothetical case study from the banking sector.
Here, a staff member transferring funds from one account to another ended up sending them to the wrong international account. In a typical risk investigation, the focus would fall on the individual's failure to notice that the wrong account number had been auto-populated in the system and to correct it. A typical root cause analysis would simply identify “human error” as the “cause” and recommend retraining or even disciplinary action. Such approaches fail to address the systemic issues (such as poor software usability, a lack of training, unrealistic timeframes, or poor standard operating procedures), allowing them to remain in the organisation and contribute to future errors (even after the individual involved has left).
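To make the contrast concrete, here is a minimal sketch of what a systemic fix might look like (hypothetical Python; the function names and transfer flow are our own illustration, not any bank's actual system): rather than relying on the individual to spot a silently auto-populated account number, the software validates the destination's check digits and refuses to proceed until the operator actively confirms the pre-filled value.

```python
# Hypothetical, illustrative sketch: a control that lives in the software,
# not in exhortations to "be more careful".

def iban_checksum_ok(iban: str) -> bool:
    """Validate an IBAN using the standard ISO 13616 mod-97 check."""
    s = iban.replace(" ", "").upper()
    if len(s) < 5 or not s[:2].isalpha():
        return False
    rearranged = s[4:] + s[:4]  # move country code + check digits to the end
    digits = "".join(str(int(c, 36)) for c in rearranged)  # A=10 ... Z=35
    return int(digits) % 97 == 1

def prepare_transfer(prefilled_iban: str, confirmed_iban: str | None) -> str:
    """Block transfers where the auto-populated account was never confirmed."""
    if not iban_checksum_ok(prefilled_iban):
        raise ValueError("Destination IBAN fails checksum; transfer blocked.")
    if confirmed_iban != prefilled_iban:
        raise ValueError("Auto-populated account not re-confirmed by operator.")
    return f"Transfer queued to {prefilled_iban}"

# The operator must actively re-confirm the pre-filled destination.
print(prepare_transfer("GB82 WEST 1234 5698 7654 32",
                       "GB82 WEST 1234 5698 7654 32"))
```

The point is not this particular check, but that the safeguard targets the usability issue directly and keeps working regardless of who is at the keyboard.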
In addition, focusing on individual “causes” fosters a culture where staff feel a sense of shame or guilt when they make a mistake. This leads to significantly less reporting of accidents and incidents for fear of blame or punishment [1].
When this occurs, the organisation loses the opportunity to learn from valuable mistakes. It also prevents the broader organisation and industry from fully understanding the risks involved in the work, context and system, stunting growth and adaptation and undermining safety, performance and productivity [2].
So, what can we do about this?
The answer: by employing human factors principles in the banking and finance industry (as in other high-reliability industries), we can significantly reduce the likelihood of error, improve operational performance, increase profit, enhance reputation, and ease regulatory approval.
How does this work?
First, by focusing on systemic issues, or adopting a systems thinking approach to investigations [3], we can avoid oversimplifying complex, multi-dimensional and interdependent problems (like the bank transfer example above) and ensure that key contributing factors, like those listed below, are not overlooked (a sketch of how these might be recorded follows the list).
· Systems and organisation: work systems, rules and leadership
· People and performance: communication, processes and competencies
· Equipment and interfaces: technology, procedure and performance [4].
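As a rough illustration (hypothetical Python, not Cortexia's actual tooling), an investigation record built around these three categories can refuse to close until every systemic dimension has been examined, so no single “human error” label can end the inquiry:

```python
# Illustrative sketch: an investigation record keyed to the three categories.

from dataclasses import dataclass, field

CATEGORIES = {
    "systems_and_organisation",  # work systems, rules and leadership
    "people_and_performance",    # communication, processes and competencies
    "equipment_and_interfaces",  # technology, procedure and performance
}

@dataclass
class Investigation:
    incident_id: str
    factors: dict[str, list[str]] = field(
        default_factory=lambda: {c: [] for c in CATEGORIES}
    )

    def add_factor(self, category: str, description: str) -> None:
        if category not in CATEGORIES:
            raise ValueError(f"Unknown category: {category}")
        self.factors[category].append(description)

    def is_complete(self) -> bool:
        """Require at least one factor in every category before sign-off."""
        return all(self.factors[c] for c in CATEGORIES)

# The bank-transfer case from earlier, reframed systemically (hypothetical IDs):
inv = Investigation("TX-2024-001")
inv.add_factor("equipment_and_interfaces",
               "Destination account auto-populated without a confirmation step")
inv.add_factor("systems_and_organisation",
               "Transfer deadlines left no time for a second-person check")
inv.add_factor("people_and_performance",
               "No training on overriding auto-populated fields")
print(inv.is_complete())  # True: each systemic dimension was examined
```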
Next, in contrast to the common tradition of attributing individual blame or fault when something goes wrong, a Just Culture fosters the voicing of safety concerns and makes employees feel safe reporting errors and near misses [5]. This environment is created by building trust: staff know that they won't be punished for honest mistakes (as distinct from intentionally reckless behaviour) [6]. The efficacy of a Just Culture can be seen in the increased generation of risk and incident data (the result of greater reporting), which allows improved risk monitoring and more targeted interventions to address systemic issues [7].
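One hypothetical way to see that efficacy in data (a minimal Python sketch with invented figures): track the monthly volume of near-miss reports, which should climb as trust grows.

```python
# Illustrative only: rising report counts signal growing trust and richer
# data, not a deteriorating operation.

from collections import Counter

# (month, report_type) pairs; volumes are invented for illustration.
reports = [
    ("2024-01", "near miss"),
    ("2024-02", "near miss"), ("2024-02", "incident"),
    ("2024-03", "near miss"), ("2024-03", "near miss"),
    ("2024-04", "near miss"), ("2024-04", "near miss"), ("2024-04", "near miss"),
]

per_month = Counter(month for month, _ in reports)
for month in sorted(per_month):
    print(month, "reports received:", per_month[month])
```

Read this way, a rising reporting rate is evidence that the Just Culture is working, which is exactly the interpretation a blame-oriented culture forecloses.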
Lastly, because risk is inherent, inevitable and part of the very essence of the banking and finance industry, attempts to eliminate it entirely can prove counterproductive and futile. Instead, applying Threat and Error Management (TEM) principles allows us to identify threats (such as decision bias, fatigue, monotony, anxiety and pressure) and the actions or inactions (errors) that exacerbate those threats and increase the chances of an adverse event occurring [8]. Through TEM programs, we can equip employees with the skills to better detect, respond to and manage these threats and errors, and prevent them from contributing to adverse events in the future [9].
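A minimal sketch of the TEM idea (the threat names and countermeasures below are our own illustrative assumptions, not a validated taxonomy): map each recognised threat in the work context to a countermeasure, and surface anything unclassified for review rather than ignoring it.

```python
# Illustrative TEM-style lookup: detect threats, respond with countermeasures.

THREAT_COUNTERMEASURES = {
    "time_pressure": "Defer non-urgent transfers; escalate deadline conflicts",
    "fatigue": "Rotate duties; require a second check late in long shifts",
    "monotony": "Vary task batching; insert deliberate verification pauses",
    "decision_bias": "Work from a structured checklist instead of recall",
}

def manage_threats(context: set[str]) -> list[str]:
    """Return countermeasures for every recognised threat in the work context."""
    unknown = context - THREAT_COUNTERMEASURES.keys()
    if unknown:
        # Unrecognised threats are flagged for review, not silently dropped.
        raise ValueError(f"Unclassified threats, add to taxonomy: {unknown}")
    return [THREAT_COUNTERMEASURES[t] for t in sorted(context)]

print(manage_threats({"fatigue", "time_pressure"}))
```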
Overall, a human factors approach to banking aims to build ultra-resilience by developing adaptive systems that manage complexity, uncertainty and volatility with greater efficacy and efficiency [10]. Furthermore, this approach allows us to design bespoke tools and procedures that identify trends and associations in contributing factors, and to develop targeted interventions for increased safety and enhanced operational and financial performance.
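For instance, a first pass at “trends and associations” could be as simple as counting how often contributing factors co-occur across investigations (a minimal Python sketch over hypothetical records):

```python
# Illustrative trend analysis: which systemic factors keep appearing together?

from collections import Counter
from itertools import combinations

incidents = [  # hypothetical, simplified investigation records
    {"auto_population", "time_pressure", "no_confirmation_step"},
    {"time_pressure", "unclear_procedure"},
    {"auto_population", "no_confirmation_step", "unclear_procedure"},
]

pair_counts = Counter()
for factors in incidents:
    pair_counts.update(combinations(sorted(factors), 2))

# Target interventions at the strongest systemic associations first.
for pair, n in pair_counts.most_common(3):
    print(f"{pair[0]} + {pair[1]}: seen together in {n} incidents")
```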
Here at Cortexia we have built a reputation as a world leader in Human Factors for Banking & Finance. See our video below for more information or visit us at cortexia.com.au/human-factors-for-banking-finance/
References
1. Zabari, M. L., & Southern, N. L. (2018). Effects of Shame and Guilt on Error Reporting Among Obstetric Clinicians. Journal of Obstetric, Gynecologic & Neonatal Nursing, 47(4), 468-478. https://doi.org/10.1016/j.jogn.2018.03.002
2. Weinzimmer, L. G., & Esken, C. A. (2017). Learning From Mistakes: How Mistake Tolerance Positively Affects Organizational Learning and Performance. The Journal of Applied Behavioral Science, 53(3), 322–348. https://doi.org/10.1177/0021886316688658
3. Reason, J. T. (1990). Human Error. New York: Cambridge University Press.
4. Niskanen, T. (2018). A resilience engineering-related approach applying a taxonomy analysis to a survey examining the prevention of risks. Safety Science, 101, 108-120. https://doi.org/10.1016/j.ssci.2017.08.016
5. Shirali, G. A., & Nematpour, L. (2019). Evaluation of resilience engineering using super decisions software. Health Promotion Perspectives, 9(3), 191-197. https://doi.org/10.15171/hpp.2019.27
6. Bahr, N. J. (2018). System Safety Engineering and Risk Assessment: A Practical Approach (2nd ed.). CRC Press.
7. Dekker, S. (2016). Just Culture: Restoring Trust and Accountability in Your Organization (3rd ed.). CRC Press.
8. Ruskin, K. J., Stiegler, M. P., Park, K., Guffey, P., Kurup, V., & Chidester, T. (2013). Threat and error management for anesthesiologists: A predictive risk taxonomy. Current Opinion in Anaesthesiology, 26(6), 707-713. https://doi.org/10.1097/ACO.0000000000000014
9. Gordon, S., Mendenhall, P., & O'Toole, B. B. (2012). Beyond the Checklist: What Else Health Care Can Learn from Aviation Teamwork and Safety. Cornell University Press.
10. Woodward, S. (2019). Moving towards a safety II approach. Journal of Patient Safety and Risk Management, 24(3), 96–99. https://doi.org/10.1177/2516043519855264