The Guardian reports that a new study of Air France 447, which crashed in a storm in 2009, by France's air accident investigation agency, the Bureau d'Enquêtes et d'Analyses (BEA), "raises concerns about training for pilots flying hi-tech planes when confronted with a high-altitude crisis."

The crash looks like an example of the sort of "normal accident" that Yale sociologist Charles Perrow described: one in which the failure of a highly complex system that usually needs only minimal human tending is made worse by a particular kind of human error. These systems are so complex that operators find it difficult to fully understand how they work, and people trained in their normal operation may be poorly prepared to cope when things go wrong.

Central to this accident, said William Voss, president of the Flight Safety Foundation in Alexandria, Virginia, is the fact that when the automation failed, the pilots were presented with conflicting information that was obviously incorrect, yet they were unable to see through it and understand what the aircraft was actually doing. "Pilots a generation ago would have done that and understood what was going on, but [the AF447 pilots] were so conditioned to rely on the automation that they were unable to do this," he said. "This is a problem not just limited to Air France or Airbus, it's a problem we're seeing around the world because pilots are being conditioned to treat automated processed data as truth, and not compare it with the raw information that lies underneath."

This is the downside of highly complex systems: when they work they may work great, but when they fail they do so really spectacularly, and in ways that are hard for us to predict.