On the evening of 31 May 2009, 216 passengers and 12 crew members aboard Air France Flight 447 disappeared into the Atlantic. For almost two years, what happened remained a mystery, until black boxes plucked from nearly two miles deep revealed that it was not poor weather that brought down the plane, but simple human error.
In an age of advanced technology, how could human error override a properly functioning airplane? The answer is that under stress, human beings can lose their ability to think clearly and fail to properly execute what they have learned. It appears that from a cruising altitude of seven miles, a co-pilot failed to apply his training, resulting in the aircraft crashing into the remote and frigid ocean.
In a nutshell, according to official reports, while the senior pilot took a break, co-pilot Pierre-Cédric Bonin took over the controls. Moments later, the plane entered a large cloud produced by a tropical thunderstorm. Moisture froze on the plane’s airspeed sensors, causing the autopilot system to shut down. The two co-pilots took over manual flight control with some of the critical information they were accustomed to seeing missing.
The correct action would have been to hold the plane in a level attitude and consult checklists to sort out the airspeed issue. Instead, it appears that Bonin pulled back on the controls, climbed, and pitched the aircraft into an aerodynamic stall. The plane quickly lost altitude and began plunging toward the sea below.
During the last fatal moments, the stall-warning alarm blared 75 times, but the two less experienced co-pilots were baffled. The captain arrived in the cockpit to hear, “We’ve totally lost control of the plane. We don’t understand… we’ve tried everything.” The final words recorded of the pilot, possibly as he saw the yoke pulled back, were, “No, no, no!”
Under stress, the human brain can lose its ability to engage in complex reasoning. Neuropsychologists call it “brain freeze.” It seems that when the amygdalae are aroused, the frontal cortex partially shuts down, leaving the person capable only of executing instinctive, or well-learned and practiced, behavior.
Bonin was inexperienced, and the evidence shows that in his panic he simply held the controls until the plane crashed. This accident reminds us that even though airline crashes are rare, they are usually caused by human error in the face of a problem or problems that should be solvable. The same is true in tec diving.
Recently, an experienced diving colleague of mine suffered what appears to have been a rare, high-level failure on his rebreather. Some 3000 feet back in a cave, a HUD alarm light suddenly alerted him to a problem. According to his display, one sensor was badly out of range from the other two. Voting logic had isolated that sensor and was flying the unit based on the readings of the remaining sensors. The display said his PO2 was low, so he added oxygen to bring the loop back to an acceptable level. The PO2 remained low, so he added more oxygen into the loop. Still low. He could not figure out why, but he turned the dive and started swimming out.
As he swam toward the exit, his heart rate increased. The rebreather didn’t seem to be behaving predictably, and he felt “strange.” His HUD alarm triggered again – his primary display showed he was out of oxygen.
How could this be? What was going on? He was unable to slow his brain down and sort things out. Thankfully, his training to bail out to open circuit remained – which is what he did at this point. Bailing out, coupled with conservative bailout gas planning, got him out of the cave.
When he later diagnosed the issue, he discovered that he had apparently suffered a rare (but possible with any CCR) simultaneous dual oxygen sensor failure. He had been flying the unit based on the two defective sensors. This is the Achilles heel of any voting logic – the computer will assume the far-more-likely situation of a single failed sensor and control the loop based on the two that agree. This is why the human brain cannot disconnect – if two sensors fail at once, it must interpret the issue and take corrective action.
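The failure mode is easy to see in a simple sketch of three-sensor voting logic (purely illustrative – real CCR firmware is more sophisticated, and the agreement threshold here is an assumed value):

```python
# Minimal sketch of three-sensor voting logic. Illustrative only;
# not any manufacturer's actual firmware. Threshold is an assumption.
AGREEMENT_THRESHOLD = 0.10  # PO2 difference (bar) treated as "agreement"

def vote(readings):
    """Return (consensus PO2, index of the excluded sensor or None).

    If one sensor disagrees with the other two, it is voted out and the
    loop is flown on the average of the agreeing pair.
    """
    a, b, c = readings
    # For each sensor, how closely do the OTHER two agree?
    pair_gap = {0: abs(b - c), 1: abs(a - c), 2: abs(a - b)}
    candidate = min(pair_gap, key=pair_gap.get)  # best-agreeing pair's odd one out
    if pair_gap[candidate] <= AGREEMENT_THRESHOLD:
        kept = [r for i, r in enumerate(readings) if i != candidate]
        consensus = sum(kept) / 2
        if abs(readings[candidate] - consensus) > AGREEMENT_THRESHOLD:
            return consensus, candidate  # single-failure assumption applied
    return sum(readings) / 3, None  # all three agree well enough

# Single failure: sensor 2 reads high; voting correctly isolates it.
print(vote([1.2, 1.21, 1.6]))   # (~1.205, sensor 2 excluded)

# Dual failure: sensors 0 and 1 both read low and agree with each other,
# so the one GOOD sensor (index 2) is voted out - the Achilles heel.
print(vote([0.7, 0.72, 1.3]))   # (~0.71, sensor 2 excluded)
```

The dual-failure case shows why the logic cannot save you: the two bad cells form a plausible majority, and the algorithm, doing exactly what it was designed to do, discards the only truthful reading.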
Beyond bailing out, the recommended response to any sensor-failure issue – whether you think your PO2 is high or low – is a diluent flush. A diluent flush would have revealed that the problem was two bad sensors and allowed my colleague to “simply hold the plane in a level attitude and consult his checklist to sort out the issue.” That is, he could have controlled setpoint with manual control based on the known working sensor.
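Flying the unit manually off one trusted cell amounts to a simple decision rule applied each time you glance at that sensor. A hypothetical sketch (the setpoint and tolerance band are assumed values, not agency doctrine):

```python
# Hypothetical decision rule for manual setpoint control off one
# trusted sensor. Setpoint and band are illustrative assumptions.
SETPOINT = 1.2  # target PO2 in bar
BAND = 0.1      # acceptable deviation before acting

def manual_action(trusted_po2, setpoint=SETPOINT, band=BAND):
    """What the diver does on each glance at the known-good cell."""
    if trusted_po2 < setpoint - band:
        return "add O2"
    if trusted_po2 > setpoint + band:
        return "flush with diluent"
    return "hold"

print(manual_action(0.9))   # add O2
print(manual_action(1.25))  # hold
print(manual_action(1.5))   # flush with diluent
```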
In this case, his training ultimately did save the day – he bailed out. He is extremely fortunate to have survived what must have been an extraordinarily high PO2 during those frightful moments.
When faced with any sort of instrument failure as a tec CCR diver, your best immediate action (beyond bailing out, which is never wrong) is to ensure a breathable, known gas in the loop. You don’t need any displays to know what is in the loop if you perform a diluent flush; then take your time to consult your displays and mental checklists to sort through the problem. In the most extreme situation, remember that your breathing loop is just a container of gas. If you have at least one sensor you have confirmed is reliable with a diluent flush, and you have adequate diluent and oxygen, you can go to manual setpoint control and end the dive on the loop.
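The arithmetic behind the flush is straightforward: after a diluent flush, the loop PO2 should equal the diluent’s oxygen fraction times ambient pressure, so any sensor that disagrees with that number is suspect. A minimal sketch, assuming metric depths and the usual approximation of 1 bar per 10 msw (the tolerance value is an assumption):

```python
def expected_po2_after_flush(fo2, depth_msw):
    """Expected loop PO2 (bar) after a diluent flush at depth.

    Ambient pressure ~ 1 bar at the surface plus 1 bar per 10 msw.
    """
    ata = 1.0 + depth_msw / 10.0
    return fo2 * ata

def sensor_trustworthy(sensor_po2, fo2, depth_msw, tolerance=0.1):
    """Does a cell's reading agree with the known post-flush PO2?"""
    return abs(sensor_po2 - expected_po2_after_flush(fo2, depth_msw)) <= tolerance

# Example: air diluent (21% O2) flushed at 30 msw.
# Expected PO2 = 0.21 * 4.0 ata = 0.84 bar.
print(expected_po2_after_flush(0.21, 30))   # 0.84
# A cell still reading 0.4 bar after the flush is clearly defective:
print(sensor_trustworthy(0.4, 0.21, 30))    # False
```

Any cell that tracks the computed number after a flush has just been validated against a gas of known composition – which is exactly what makes manual setpoint control on that cell a defensible way to end the dive on the loop.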