By: Karl Shreeves – PADI Education and Content Development Executive
During a 1988 conflict between Iran and Iraq, a U.S. AEGIS-class missile cruiser, under orders to protect U.S. interests and shipping, was escorting U.S. vessels through the Persian Gulf when two Iranian F-4 fighters took off from a nearby airbase. Instead of following their normal coastal/border patrol routine, they circled aggressively and locked their targeting radar on the U.S. vessels – a hostile act, especially in a recognized combat zone.
The AEGIS commander had only two options – shoot or don’t – and almost no time to ponder them. The wrong choice could potentially lead to hundreds of deaths on the ships under his care. But the commander, a highly experienced officer, responded so quickly, uncannily and correctly that cognitive scientists have discussed the incident ever since. He had his crew go on alert, watch the F-4s and be ready, but otherwise he did nothing. In his judgment, they were not going to attack. And sure enough, after a few minutes the F-4s turned off their radar, went on their way, and that was that.
The question is, how did the commander choose so well under extreme time and consequence pressure? This is worth knowing as tec divers because, when faced with a critical situation underwater, we have to do the same thing – make potentially life-or-death choices with little time to think about them. It turns out that cognitive science has a lot to say about how we make good decisions in these situations, but first a disclaimer: “decision making” is a huge field, so this will be a very narrow, brief look at some main points. If you have a background in this, please don’t beat me up for generalizing and staying out of the weeds. (For those interested, there’s a list of references and sources at the end.)
We Don’t Decide Everything the Same Way
Human decision making is complex because we don’t use the same processes for every decision. How we decide depends on what we know, social influences, biases, our experience and training, and the time we have to decide. We commonly use mental shortcuts (heuristics), which are highly reliable within the range of situations we’ve learned them in (but not always perfect).
Getting narrow, here are some key differences between the decisions we make planning a tec dive and those we make during the dive:
| Planning | During the Dive |
| --- | --- |
| Low time pressure | High time pressure |
| New information can be obtained | Little or no new information |
| Decision consequences delayed | Consequences immediate |
| Situation static | Situation dynamic and rapidly changing |
| Best decision more important than speed | Timely adequate response usually better than a delayed ideal response |
Because of these differences, how we make good decisions differs in each situation, so let’s look at how we make critical decisions well during a dive (we’ll leave before the dive for another blog post). According to Dr. Gary Klein – the leading researcher and expert in naturalistic decision making (his term for it) – here’s how:
We don’t.
Researching combat decisions for the U.S. military in the late 1970s, Klein recognized that good “decisions” in high-pressure situations like warfare, aviation, firefighting and (by our extension) tec diving are not really decisions but diagnoses, after which the appropriate action is (or should be) obvious. As the situation develops, we keep making similar adaptive diagnoses and adjusting our actions.
According to Klein and other researchers, good (successful) decisions (actions) result from good planning, training, and experience. These interact and give us three important capabilities:
- Pattern recognition. Our training and experience allow us to recognize and assess situations based on how things “should be” and what we expect. Klein found that the subconscious plays a major role, and that successful actions were often based on “intuition” – a gut feeling that something wasn’t right, or that doing something “just felt right” – rather than on an intentional, calculated conscious process (though those happen, too).
- Mental simulation. According to Klein and other researchers, we conduct quick mental simulations that interact with recognized patterns and what we know through training and planning. Also often a subconscious process, this allows us to diagnose the situation and adapt what we do as the situation unfolds based on projected outcomes.
- Response availability. Successful outcomes rely on having appropriate responses to the situation. As you’d expect, these come from training, experience and good predive planning, but they interact so that you don’t have to think about every variation of every possible problem. Rather, your skills are “tools”: you pull the right ones out of your “tool box” and adapt them based on your diagnoses and mental simulations, even though you’ve never been in that exact situation before. (The sketch below illustrates how these three capabilities fit together.)
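For readers who like to see ideas in code, here’s a minimal toy sketch of that recognize-simulate-respond cycle in Python. Everything in it (the cues, the patterns, the responses) is invented for illustration; it is not from Klein’s work or any real decision-support system.

```python
# Toy model of Klein's recognition-primed cycle: recognize a pattern,
# mentally simulate the paired response, then act or adapt.
# All cues, patterns and responses are invented for illustration.

from dataclasses import dataclass

@dataclass
class Pattern:
    name: str
    expected_cues: set        # what this situation "should" look like
    response: str             # the "tool" paired with this pattern
    simulation_ok: bool       # does a quick mental run-through work out?

def rpd_cycle(observed_cues: set, experience: list[Pattern]) -> str:
    """Diagnose the situation against experience, then respond."""
    for pattern in experience:
        # 1. Pattern recognition: do the observed cues fit this pattern?
        if pattern.expected_cues <= observed_cues:
            # 2. Mental simulation: project the response's outcome.
            if pattern.simulation_ok:
                # 3. Response availability: act on the matched "tool".
                return pattern.response
            # Simulation fails, so adapt: fall through to the next pattern.
    # Nothing fits: the mismatch itself is the alarm ("it feels wrong").
    return "pattern mismatch: go on alert, prepare trained bail-out"

# Example: a trained gas-sharing pattern matching a developing situation.
experience = [
    Pattern("teammate out of gas", {"out-of-gas signal"},
            "donate long hose, abort dive, exit together", True),
]
print(rpd_cycle({"out-of-gas signal", "silt-out"}, experience))
```

The point of the toy is the ordering: recognition and simulation happen before any deliberate weighing of options, which is why a well-trained response can be both fast and correct.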
Back in 1988
This explains the AEGIS commander’s rapid, appropriate response. Through his experience and training, he recognized (perhaps feeling it more than consciously) that the F-4 behavior, while seemingly hostile, didn’t match the pattern of attacking aircraft. He probably ran some mental simulations, again perhaps subconsciously. What would attacking fighters do? To avoid detection, they would come in low, below radar, and wait to lock targeting radar until the last possible moment.
And these F-4s? They were flying in plain view and locked their radars well outside a reasonable attack range. Beyond this, the commander knew from intelligence briefings that the Iranians were having radar maintenance issues and avoided using their radars unnecessarily. The behavior pattern didn’t match a real attack, but it very much matched harassment. The commander therefore set minimum distances and other precautions in the unlikely case the Iranians felt suicidal, but he did not take the bait and engage them. Although he’d never seen this exact situation before, training, experience and planning (orders and intelligence briefings) allowed him to diagnose and respond successfully, not only avoiding an international incident but likely saving the lives of the outgunned F-4 pilots and their RIOs (radar intercept officers).
As Tec Divers
Applying Klein’s work to tec diving, we set ourselves up for good decisions – diagnoses and responses, really – by planning what we need to know as well as what we need to do. You’ve heard ad nauseam to plan your dives within your training and experience envelope, but now we see why this is so important, and why untrained, inexperienced “internet tec divers” who “know what to do” from reading online are often incidents waiting to happen.
Naturalistic decision making shows us that we really make (or should make) major decisions before the dive, both through training (routine responses) and through planning (response choices specific to the particular dive). We put these together within our experience limits, which we gradually expand as we grow as divers. This means we need to:
- Have the training. This obviously establishes the foundational knowledge and skills you need for your tec diving level. It provides basic response availability so that, for example, if you’re a cave diver you don’t have to figure out (much less discuss) how to share gas with a long hose before every cave dive – or, worse, during the dive when a teammate needs gas!
But besides learning the skills you will apply (or may need to apply) on every dive, you gain contextual base experience in using them. This is one reason why guided practice with an instructor generates more reliable responses than “just knowing” what to do.
- Plan dives well. This applies your training and experience to the dive at hand. Take the time to think through variables and possible emergencies so your team has the “if-then” answers to situations that will (non-emergency) or could (emergency) occur.
Continuing the previous long hose gas sharing example, if you’ll be going through a complex restriction with sidemount cylinders removed, you’d probably discuss how you’d share gas coming back through it if necessary. It’s neither necessary nor possible to foresee every variation of every possible emergency; instead, base planning on what you might need to know and do for everyone to come back safely, even if things go awry. (The toy sketch after this list shows the “if-then” idea in miniature.)
- Stay within your personal training and experience limits. It is within this envelope that your pattern recognition and heuristics (mental shortcuts) are most reliable. The more complex the dive, the more important these are. If you stray beyond these limits, not only do these mental processes become unreliable, they can actually work against you.
A classic example is open water divers in a Florida underwater cave. Being inexperienced, what does these divers’ pattern recognition lead them to expect? As the late Wes Skiles said, they expect a “dark and scary place.” But Florida caves are not dark and scary. They’re air-clear (until kicked up) and, in the cool months, warmer than the basin in front of them. The divers’ undeveloped, untrained pattern recognition detects no danger as they swim deeper and deeper into the cave, all too often with dire consequences.
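To make the “if-then” planning idea concrete for the programmatically inclined, here’s a minimal sketch of a predive plan as a lookup table: responses are agreed on at the surface and merely looked up, not invented, at depth. The situations and responses are invented for illustration, not taken from any actual dive-planning standard.

```python
# Toy "if-then" dive plan: responses decided during planning, so that
# during the dive the team looks them up instead of deliberating.
# All situations and responses here are invented for illustration.

contingencies = {
    "teammate out of gas":   "donate long hose, abort dive, exit together",
    "primary light failure": "switch to backup light, signal team, exit",
    "lost guideline":        "stop, tie off a safety spool, begin search",
}

def respond(situation: str) -> str:
    # An unplanned situation means falling back on training (response
    # availability), not improvising a brand-new procedure at depth.
    return contingencies.get(situation, "fall back on trained responses")

print(respond("primary light failure"))
print(respond("regulator free-flow"))  # not in the plan: rely on training
```

The analogy’s point is the direction of effort: the thinking happens during planning, when time pressure is low, so the dive itself requires only recognition and retrieval.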
Within your experience envelope, though, applying these processes in a conservative direction is highly reliable. Klein cites an example that has become well-known: a fire officer led a small team of firefighters into a “routine” house fire. The team entered the living room, saw flames in the next room and began dousing the fire. After only a few moments, the officer ordered everyone out immediately. Moments after the team exited, the floor they had been standing on collapsed. The main fire was actually on the floor directly below, and they would have died.
Asked how he knew to withdraw, the officer initially said he didn’t know; it just felt wrong. He just knew. With some questioning, however, he recalled that it was quiet, but fires are normally loud. The fire refused to die down when sprayed, but hosed flames normally die rapidly. His subconscious (a.k.a. intuition) noted the pattern mismatch, set off the alarms (anxiety) and saved their lives.
So, listen to your intuition if it tells you to bail, but when choosing to extend or continue the dive, use more rational, deliberate processes.
References and Additional Information
Bisbey, T. (2014) Toward a theory of practical drift in teams. HIM 1990-2015. 1555. stars.library.ucf.edu/honorstheses1990-2015/1555
Busemeyer J., Townsend, J. (1993) Decision field theory: a dynamic-cognitive approach to decision making in an uncertain environment. Psychological Review, Vol. 100, No. 3, 432-459. American Psychological Association.
Dietrich, C. (2010) Decision making: factors that influence decision making, heuristics used, and decision outcomes, inquiriesjournal.com/articles/180/ (series)
Einhorn, H., Hogarth, R. (1981) Behavioral decision theory: processes of judgment and choice. Annual Review of Psychology, 32: 53-88. Annual Reviews Inc.
Estrada, F. (2010) Economics and rationality of organizations: an approach to the work of Herbert A. Simon. MPRA Paper No. 21811. mpra.ub.uni-muenchen.de/21881/
Fellner, A. (2017) The aviation human factor as a necessary element of human resources management in the public transport. Public Transport. No. 3, 2017. http://www.uitp.org
Frederick, S. (2005) Cognitive reflection and decision making. Journal of Economic Perspectives, Vol. 19, No. 4, 25-42
Kahneman, D., Tversky A. (1973) On the psychology of prediction. Psychological Review, Vol. 80, No. 4, 237-251. American Psychological Association Inc.
Klein, G., Calderwood, R., Clinton-Cirocco, A. (1988) Rapid decision making on the fire ground, technical report 796, Battlefield Information Systems Technical Area, Research Institute for the Behavioral and Social Sciences, US Army
Klein, G., Klinger, D. (1991) Naturalistic decision making. Gateway, Vol. XI, No. 3, 16-19. Human Systems IAC, iac.dtic.mil/hsiac
Klein G. (1993) Naturalistic decision making: implications for design. Report to crew systems ergonomics information analysis center SOAR-CSERIAC 93-01, 20081009161.
Klein, G. (2015) A naturalistic decision making perspective on studying intuitive decision making. Journal of Applied Research in Memory and Cognition. http://www.elsevier.com/locate/jarmac
Perrow, C. (1999) Organizing to reduce the vulnerabilities of complexity. Journal of Contingencies and Crisis Management, Vol. 7, No. 3, 150-155. Oxford UK: Blackwell Publishers
Pham, T. Decision traps: ten barriers to brilliant decision-making and how to overcome them, J. Eward (sic) Russo and Paul J.H. Schoemaker, author’s review.
Price, M., Williams, T. (2018) When doing wrong feels so right: normalization of deviance (abstract). Journal of Patient Safety, Vol. 14, No. 1, 1-2. journals.lww.com/journalpatientsafety/Citation/2018/
Samuelson, W., Zeckhauser, R. (1988) Status quo bias in decision making. Journal of Risk and Uncertainty, 1: 7-59, Boston: Kluwer Academic Publishers
Staw, B., Hoang, H. (1995) Sunk costs in the NBA: why draft order affects playing time and survival in professional basketball. Administrative Science Quarterly, 40 (1995) 474-494
Tversky, A. (1981) Evidential impact of base rates. Report to/supported by Engineering Psychology Programs, Office of Naval Research, US Navy
Wilcutt, T., Bell, H. (2014) The cost of silence: normalization of deviance and groupthink. NASA safety message presentation, sma.nasa.gov/safety-messages