Risky Behavior and the Casinos of the Human Mind
By Michael D. Larrañaga, CIH, CSP, PE, Ph.D.
Oklahoma State University
Daniel Kahneman won the 2002 Nobel Memorial Prize in Economics for his work on Prospect Theory, developed with Amos Tversky. Prospect Theory describes how people choose between alternatives that involve varying levels of risk. It tells us that people tend to make real-life choices based on perceived losses or gains relative to a reference point rather than on the final outcome. In short, Prospect Theory maintains that humans hate to lose more than they love to win, and therefore weight avoiding a loss more heavily than securing an equivalent gain. In addition, once we make a decision not to lose, regardless of the risk, we become anchored in that decision, making it difficult to change our minds and blurring the line between rational and irrational decision making. This tendency is particularly important when we bet on risk that involves potential harm to life, health, and property.
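The asymmetry between losses and gains can be expressed as Kahneman and Tversky's value function. The sketch below uses their commonly cited parameter estimates (curvature of 0.88 and a loss-aversion coefficient of 2.25); treat the numbers as illustrative, not as a fitted model.

```python
def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Subjective value of a gain or loss x, measured from a reference point.

    alpha/beta give diminishing sensitivity; lam > 1 encodes loss aversion.
    Parameter values are the commonly cited Kahneman-Tversky estimates.
    """
    if x >= 0:
        return x ** alpha            # gains: concave curve
    return -lam * ((-x) ** beta)     # losses: steeper curve (losses loom larger)

# A $100 loss "hurts" more than twice as much as a $100 gain pleases:
gain = prospect_value(100)
loss = prospect_value(-100)
```

Because the loss branch is multiplied by `lam`, the subjective sting of the loss outweighs the subjective pleasure of the equivalent gain, which is exactly the bet-to-avoid-losing behavior described above.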
We have all seen this type of risky betting on more than one occasion, and all of us participate in risk gambling on a daily basis. Should I walk or drive? Should I take the train or fly? Should I enroll my child in private school or public school? Should I eat that doughnut or exercise? Each of these alternatives presents some level of risk to us personally, and we gamble based on our experiences and perceptions of levels of risk when choosing between various alternatives every day.
With regard to the prevention of accidents and injuries in the workplace, prospecting, as described by Prospect Theory, can critically influence risk-informed decision-making. Humans tend to view accidents as outliers, or uncommon events; yet experience shows that accidents (small and large disasters alike) are the norm, and stable systems are the outliers. Human nature leads people to view the world through naïve, disaster-free lenses. Perhaps this is the human condition: perpetual optimism. Ted Lewis, author of Bak’s Sand Pile, maintains that life is a series of passages from catastrophe to catastrophe with inconsequential periods of calm in between, staking the claim that catastrophe is the norm, not the exception.
Let’s use the recent and tragic Oklahoma tornadoes as an example. There are approximately 1,250 tornadoes per year in the United States, with many of those striking in an area referred to as Tornado Alley ranging from South Dakota down through Texas.
Since 1998, the Federal Emergency Management Agency (FEMA) has promoted the installation of tornado shelters in homes and businesses in areas with a high frequency of tornadoes. Yet few people have built tornado shelters in their homes or purchased new homes with shelters as an option. One reason is that the risk a tornado poses to any single location is low. Of the more than 1,200 tornadoes every year, very few carry high consequences; the vast majority result in low or no consequences. Hence, humans perceive tornadoes as rare. Tornado shelters are expensive, and we “bet” that a tornado won’t hit us. We intuitively know that a tornado could strike our home or business, but few of us take the extra step to build a shelter. Either way, we choose to build a shelter or we do not. This is an extremely simplified case for a very complicated problem, but at the simplest level it helps us start a conversation about risk.
The human mind is like a casino: we are always betting on risk, hedging our bets, and focusing on the most recent events (a recency effect, related to the availability bias, that lets earlier negative occurrences fade from memory), and we often make irrational decisions involving finances, family, property, and the like. You can see examples of this type of betting in every disaster, including accidents of all sizes and consequences.
Once we make a bet based on winning or losing, it is very difficult to change our minds and approach the risky situation objectively. We tend to become anchored to a decision and actively resist alternative choices once the decision is made. The key to avoiding prospecting and anchoring in decision-making is to recognize that these tendencies exist and put systems in place that enable us to engage in objective, risk-informed decision-making.
Recognizing that these tendencies exist in human nature requires education, training, and continual evaluation of our perceptions of risk: a continuous improvement process for ourselves, perhaps. Many accidents occur when one or more individuals accept an increased risk in some part of an operation. Workers may be under stress, either from the organization or from outside pressures, and may violate their own values and prior experience to adjust their perception of the risk involved. This risk acceptance may be deliberate or unconscious. Either way, risk acceptance under pressure may lead workers to respond in unpredictable ways. Almost all of us have rushed through a yellow light, texted while driving, jaywalked, or exceeded the speed limit dangerously; these are actions we know we should not take, yet we anchor ourselves to justifying the extra risk without proper consideration of the potential consequences.
One way of consistently reminding ourselves, employees, and managers about objective, risk-informed decision-making is to utilize leading safety indicators: metrics that help identify potential areas of weakness in advance. Leading indicators remind us that human and organizational errors cause the overwhelming majority of injuries, workplace illnesses, and accidents. Understanding this, we can and should use leading indicator trends and data to make risk-informed decisions. Leading indicators precede an undesirable event and may allow us to predict its arrival, whether it is an accident, incident, near miss, or undesirable safety state. They are associated with proactive activities that identify hazards and assess, eliminate, minimize, and control risk. Leading indicators identify hazards, allow response to changing work processes, measure the effectiveness of control systems, and evolve as organizations change.
High-quality risk management strategies isolate predictive leading indicators and build preventive plans around them, such as:
- Employee turnover rate — a high turnover rate can indicate organizational inefficiency and loss of corporate knowledge, forcing employees and the organization into a continuous learning curve cycle.
- Negative risk assessments and hazard analyses — these identify potential accident and injury pathways that can be intercepted in advance of an undesirable event.
- Unacceptable observations or investigation results — these help target potential accident and injury pathways.
- Lack of training — effective training and education are the key to developing the ability to identify prospecting behavior and anchoring in ourselves and others so that those tendencies can be counteracted in advance of an accident.
- Unsatisfactory employee perception surveys — employees of an organization are the key to its safe performance. Employee perception of an unsafe workplace may identify areas where the safety of the organization can be improved.
By utilizing leading indicators, we can use measurement and safety systems to counteract the human tendencies to prospect and bet on risk. Defining indicators, tracking them, and using feedback loops to counter prospecting and anchoring in human decision-making can be powerful tools for preventing accidents. A simple example is a bank account: none of us likes to overdraw the account, and we are penalized when we do. The account balance is a leading indicator of our financial risk, and we use it to predict our financial state. Much the same can be said for the use of leading indicators in the workplace.
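The track-and-alert loop described above can be sketched in a few lines of code. The indicator names and threshold values here are hypothetical, chosen only to mirror the list of indicators discussed earlier; a real program would set thresholds from its own historical data.

```python
# Hypothetical sketch: compare current leading-indicator readings
# against alert thresholds, in the spirit of the bank-balance example.
# Indicator names and threshold values are illustrative assumptions.

LEADING_INDICATOR_THRESHOLDS = {
    "employee_turnover_rate": 0.15,      # annual fraction of workforce
    "overdue_training_fraction": 0.05,   # fraction of workforce overdue
    "negative_hazard_findings": 3,       # open findings from assessments
}

def flag_indicators(current_values, thresholds=LEADING_INDICATOR_THRESHOLDS):
    """Return the indicators that have reached or crossed their threshold."""
    return [name for name, limit in thresholds.items()
            if current_values.get(name, 0) >= limit]

# Example readings: high turnover and open hazard findings trip the alert.
flags = flag_indicators({
    "employee_turnover_rate": 0.22,
    "overdue_training_fraction": 0.02,
    "negative_hazard_findings": 4,
})
```

The point of the sketch is the feedback loop: the flagged indicators prompt corrective action before an accident occurs, just as a low bank balance prompts a deposit before an overdraft.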
Michael D. Larrañaga is Professor, Simplex Professor, and Department Head of the School of Fire Protection and Safety Engineering Technology (SFPS) at Oklahoma State University (OSU). He serves as Director of the Boots & Coots Center for Fire Safety & Pressure Control and the U.S. Department of Homeland Security (DHS) Science, Technology, Engineering and Mathematics (STEM) Fellows program.
Dr. Larrañaga’s research focuses on the application of complexity theory and network science to real world systems to model risk, emergence and cascade failure. He serves on the Board of Directors for the American Board of Industrial Hygiene and as an appointed member to the Board of Scientific Counselors for the Centers for Disease Control and Prevention’s National Institute for Occupational Safety and Health (NIOSH). He is a Certified Industrial Hygienist, Certified Safety Professional and Professional Engineer. Dr. Larrañaga earned his Bachelor of Science in Fire Protection and Safety Engineering Technology from OSU, his Master of Science in Environmental Science from the University of Houston-Clear Lake, a Master of Arts in Security Studies from the United States Naval Postgraduate School and a doctoral degree in Industrial Engineering from Texas Tech University.