What We Can Learn From Air France Flight 447

By Jill Heinerth

On the evening of 31 May 2009, 216 passengers and 12 crew members aboard Air France Flight 447 disappeared into the Atlantic. For almost two years, what happened remained a mystery, until black boxes plucked from nearly two miles deep revealed that it was not poor weather that brought down the plane, but simple human error.

In an age of advanced technology, how could human error override a properly functioning airplane? The answer is that under stress, human beings can lose their ability to think clearly and fail to properly execute the things they have learned. It appears that from a cruising altitude of seven miles, a co-pilot failed to apply his training, resulting in the aircraft crashing into the remote and frigid ocean.

In a nutshell, according to official reports, while the senior pilot took a break, co-pilot Pierre-Cédric Bonin took over the controls. Moments later, the plane entered a large cloud produced by a tropical thunderstorm. Moisture froze on the plane’s air-speed sensors, causing the autopilot system to shut down. The two co-pilots took over manual flight control, missing some of the critical information they were accustomed to seeing.

The correct action would have been to hold the plane in a level attitude and consult checklists to sort out the airspeed issue. Instead, it appears that Bonin pulled back on the controls, climbed, and pitched the aircraft into an aerodynamic stall. The plane quickly lost altitude and began plunging toward the sea below.

During the last fatal moments, the stall-warning alarm blared 75 times, but the two less experienced co-pilots were baffled. The captain arrived in the cockpit to hear, “We’ve totally lost control of the plane. We don’t understand… we’ve tried everything.” The final words recorded of the pilot, possibly as he saw the yoke pulled back, were, “No, no, no!”

Under stress, the human brain can lose its ability to engage in complex reasoning. Neuropsychologists call it “brain freeze.” It seems that when the amygdalae are aroused, the frontal cortex partially shuts down, leaving the person capable of executing only instinctive or well-learned and practiced behavior.

Bonin was inexperienced, and in his panic, the evidence shows he simply held the controls until he crashed. This accident reminds us that even though airline crashes are rare, they are usually caused by human error in the face of a problem or problems that should be solvable. The same is true in tec diving.

Recently, an experienced diving colleague of mine suffered what appears to have been a rare, high-level failure on his rebreather. Some 3,000 feet back in a cave, a HUD alarm light suddenly alerted him to a problem. According to his display, one sensor was badly out of range from the other two. Voting logic had isolated that sensor and was flying the unit based on the readings of the remaining two. The display said his PO2 was low, so he added oxygen to bring the loop back to an acceptable level. The PO2 remained low, so he added more oxygen into the loop. Still low. He could not figure out why, but he turned the dive and started swimming out.

As he swam toward the exit, his heart rate increased. The rebreather didn’t seem to be behaving predictably, and he felt “strange.” His HUD alarm triggered again – his primary display showed he was out of oxygen.

How could this be? What was going on? He was unable to slow his brain down and sort things out. Thankfully, his training held: he bailed out to open circuit at this point. Bailing out, coupled with conservative bailout gas planning, got him out of the cave.

Later, diagnosing the issue, he discovered that he appears to have suffered a rare (but possible on any CCR) simultaneous dual oxygen-sensor failure. He had been flying the unit based on the two defective sensors. This is the Achilles heel of any voting logic – the computer will assume the far-more-likely situation of a single failed sensor and control the loop based on the two that agree. This is why the human brain cannot disconnect – it must interpret the issue and take corrective action if two sensors fail at once.
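The blind spot in voting logic can be sketched in a few lines of Python. This is a hypothetical illustration, not any manufacturer’s firmware; the function name and the 0.1-bar agreement tolerance are assumptions for the sake of the example:

```python
def vote_po2(readings, tolerance=0.1):
    """Return the PO2 (bar) the controller will 'fly', given three sensor readings.

    A sensor is kept if it agrees with at least one other sensor to within
    `tolerance` bar; otherwise it is voted out as faulty.
    """
    kept = [
        r for i, r in enumerate(readings)
        if any(abs(r - other) <= tolerance
               for j, other in enumerate(readings) if j != i)
    ]
    # If nothing agrees, fall back to the average (a real unit would alarm).
    if not kept:
        return sum(readings) / len(readings)
    return sum(kept) / len(kept)

# Single failed sensor: voting works as designed.
print(vote_po2([1.20, 1.19, 0.40]))   # the two good sensors win -> 1.195

# Dual failure: two sensors stuck low agree with each other,
# so the one healthy sensor (1.20) is the one voted out.
print(vote_po2([0.40, 0.42, 1.20]))   # controller flies a falsely low PO2 -> 0.41
```

The second call shows the failure mode described above: the computer has no way to know that the two sensors that agree are both wrong, so the diver must catch it.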

Beyond bailing out, the recommended response to any sensor-failure issue – whether you think your PO2 is high or low – is a diluent flush. A diluent flush would have revealed that the problem was two bad sensors and allowed my colleague to “simply hold the plane in a level attitude and consult his checklist to sort out the issue.” That is, he could have controlled setpoint manually based on the known working sensor.

In this case, his training ultimately did save the day – he bailed out. He is extremely fortunate to have survived what must have been an extraordinarily high PO2 during those frightful moments.

When faced with any sort of instrument failure, as a tec CCR diver, your best immediate action (beyond bailing out, which is never wrong) is to ensure a known, breathable gas in the loop. After a diluent flush, you don’t need any displays to know what is in the loop; you can then take your time to consult your displays and mental checklists to sort through the problem. In the most extreme situation, remember that your breathing loop is just a container of gas. If you have at least one sensor you have confirmed is reliable with a diluent flush, and you have adequate diluent and oxygen, you can go to manual setpoint control and end the dive on the loop.
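The reason a diluent flush tells you what is in the loop without trusting any display is simple arithmetic: after a full flush, the loop PO2 should settle at the diluent’s oxygen fraction times ambient pressure. A minimal sketch (the function name and the 1-bar-per-10-metres-of-seawater approximation are my own assumptions for illustration):

```python
def expected_po2_after_diluent_flush(fo2, depth_m):
    """Expected loop PO2 (bar) after a full diluent flush.

    fo2:     oxygen fraction of the diluent (e.g. 0.21 for air)
    depth_m: depth in metres of seawater
    Assumes ambient pressure of 1 bar at the surface plus 1 bar per 10 m.
    """
    ambient_bar = depth_m / 10.0 + 1.0
    return fo2 * ambient_bar

# Air diluent at 30 m: the loop should read about 0.84 bar,
# so any sensor far from that value is suspect.
print(round(expected_po2_after_diluent_flush(0.21, 30), 2))  # 0.84
```

Comparing that known value against each cell, one at a time, is how you confirm which sensors to trust before going to manual setpoint control.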

9 Replies to “What We Can Learn From Air France Flight 447”

  1. Very interesting, but you missed important information. Whenever there is a sensor failure and my electronics performs the logical vote, I always do a diluent flush and then check on my electronic display what PO2 is expected at that depth with that diluent. That way I can verify which sensor is actually wrong – the one that was voted out, or the two that are still in.

  2. Excellent post. This is why I believe that as technology improves and we become more reliant on the ‘system’, the gathering of information about incidents (rather than accidents) is more important than ever before. Until we have a black box in every CCR, the only witness who can attest to what actually happened is the operator, and even they sometimes make mistakes!! Importantly, we need to create an environment where people can talk about equipment failures, their mistakes and their training omissions without fear of ridicule or criticism, allowing lessons to be passed on.

  3. Great article. The stated serious malfunction is, thank God, rare, but can turn into a disaster. After more than a thousand hours on different rebreathers, I found out for myself what I now teach all my students regarding self-rescue: first bail out, sort out the problem while breathing a safe gas, return to the loop if possible, and end the dive. Therefore, I personally don’t dive a rebreather without a BOV. Is it because I’m a wimp?
    No – it’s because I’m not 21 anymore, thinking that the world is mine, and because I have lots to lose and would put people in trouble when I’m not there anymore.

  4. I don’t have that problem as I fly a manual rebreather, unfortunately not recognised by PADI as a “t” type so I have to teach it through TDI….

    What is interesting is that to date ALL PADI-approved rebreathers, both “R” type and “T” type, rely on this flawed voting logic… like diving with a monkey on your back… let’s hope this doesn’t lead to a whole bunch more CCR incidents just as we are trying to promote diving in this area!

    1. Incorrect – the Poseidon MkVI stands out as the only type R rebreather and uses automatic O2 sensor calibration and validation throughout the dive.

      1. No Monkey, not quite. There are other ‘R’ Type rebreathers out there (Hollis Explorer?) although the Mk VI is the only one to have the novel O2 sensor validation technique which I think is great. However, it still doesn’t stop people doing silly things like not putting a scrubber cartridge into the unit and then diving it ending in fatal consequences. The same can be said of ‘tec’ units too though.

        Back to the original article, and the ability to learn from data collected by the unit, there have been a number of fatalities which have been reconstructed from the data recordings and show how the incident developed but not necessarily why the divers made the decisions they did.

        The same goes for the AF447 incident, we can see what happened, and the CVR provides some evidence as to why they chose to do the things they did, but not necessarily the thought processes behind the actions.



  5. How can you write an article based on facts that are still unclear? The investigation into AF447 is not over, and it is highly inappropriate to make deductions based on speculation, especially with no aviation background. Have you ever sat in the left seat of an Airbus to judge those pilots?

