When Technology Fails – Enter Proficiency, Persistence, and Resilience

Added on Jun 18, 2016

Safety depends on two kinds of high reliability. The first is managing the expected, through technology and standardization. The second is managing the unexpected, which is much more about thinking, and thinking together, when the technology and standardization go wrong. This is a story about resilience – that second kind of high reliability.


Have you ever been lost in your car and low on fuel? Staying mentally disciplined is challenge enough at that point. Now imagine the emotions of Jay Prochnow on his mission to deliver a Cessna 188 crop duster to Norfolk Island, east of Australia, in December 1978. Departing from American Samoa, he had 22 hours of fuel for the trans-Pacific trip but relied solely on an automatic direction finding (ADF) needle to guide his flight, and it failed. His planned arrival time came after 14 hours of seeing nothing but water.

So what are your options when you discover technology has failed you?

Jay, a former US Navy pilot, began a square-pattern search but soon concluded he was lost with no other means to locate his destination, and broadcast a “Mayday” distress call to Auckland air traffic control. Auckland contacted an Air New Zealand DC-10 as potentially close to the distressed aircraft. The DC-10 pilot, Gordon Vette, was a licensed navigator, and he recruited his passengers to aid the lost aircraft. Once in communication, with no other means of locating the Cessna, Vette reverted to basic environmental cues to narrow his search.


Vette instructed both aircraft to face the setting sun directly, and upon noting a 4-degree difference in heading, he knew the Cessna was to his south. He then directed Prochnow to measure the sun’s height by placing fingers against the front of the windscreen; a two-finger difference from Vette’s own reading suggested the Cessna was further west, closer to the setting sun, probably 240 miles from his position.

As the DC-10 proceeded toward this estimated position and established VHF (Very High Frequency) radio contact, Vette directed Prochnow to fly a tight orbit and broadcast continuously. Knowing the VHF range to be about 400 miles, he planned a box pattern with turns at the points where contact was lost and regained; those points lie on a circle of radio range centered on the Cessna, so a little geometry, bisecting the chord lines across the circle, would give a more precise location. The intersection was closer, but still did not yield visual contact.

With fuel and daylight entering the “concerned” stage, and knowing they were east of Norfolk, Vette directed the Cessna to head west and report when the sun hit the horizon. From a 22.5-minute differential with Norfolk Island’s report of the same event, he calculated they were 291 miles east. Finally, good fortune appeared: Prochnow sighted an oil rig under tow, whose crew radioed coordinates to the DC-10, and soon a passenger spotted the lost craft. Vette passed a precise heading for Norfolk Island, and now fuel became the major concern. The Cessna had been airborne for 21 hours, with approximately 2 hours remaining to the destination. Prochnow’s conservation techniques allowed for the extra range, and he landed safely near midnight after 23 hours in the air.
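For the curious, both improvised techniques reduce to simple geometry and arithmetic. Below is a minimal Python sketch of each, using illustrative numbers of our own; the account does not record the exact figures Vette worked with.

```python
import math


def circle_center_from_chords(p1, p2, p3, p4):
    """Locate a transmitter from a box-pattern search.

    The points where radio contact is lost and regained lie on a circle
    of the radio's range, centered on the transmitter. The perpendicular
    bisector of any chord passes through the center, so the bisectors of
    two (non-parallel) chords p1-p2 and p3-p4 intersect at the transmitter.
    """
    (x1, y1), (x2, y2), (x3, y3), (x4, y4) = p1, p2, p3, p4
    m1 = ((x1 + x2) / 2, (y1 + y2) / 2)  # midpoint of chord 1
    m2 = ((x3 + x4) / 2, (y3 + y4) / 2)  # midpoint of chord 2
    n1 = (y1 - y2, x2 - x1)              # bisector direction = chord normal
    n2 = (y3 - y4, x4 - x3)
    # Solve m1 + t*n1 = m2 + s*n2 for t (Cramer's rule on a 2x2 system).
    det = n1[0] * (-n2[1]) - (-n2[0]) * n1[1]
    dx, dy = m2[0] - m1[0], m2[1] - m1[1]
    t = (dx * (-n2[1]) - (-n2[0]) * dy) / det
    return (m1[0] + t * n1[0], m1[1] + t * n1[1])


def sunset_offset_nm(delta_minutes, latitude_deg):
    """Distance east implied by seeing the sun set earlier than a reference.

    The Earth turns 360 degrees in 24 hours, i.e. 0.25 degrees of longitude
    per minute, and one degree of longitude spans 60 nautical miles at the
    equator, shrinking by cos(latitude) away from it.
    """
    return delta_minutes * 0.25 * 60.0 * math.cos(math.radians(latitude_deg))


# Two chords of a circle centered at (0, 0): the bisectors recover the center.
cx, cy = circle_center_from_chords((3, 4), (-3, 4), (5, 0), (0, 5))
print(round(cx, 6), round(cy, 6))  # 0.0 0.0

# A 22.5-minute sunset differential near Norfolk Island's latitude (~29 S)
# works out to roughly 295 nautical miles east, in line with the story's 291.
print(round(sunset_offset_nm(22.5, 29.0)))  # 295
```

The sunset arithmetic is the easiest to check by hand: 22.5 minutes of the Earth’s rotation is 5.625 degrees of longitude, and at roughly 29 degrees south each degree spans about 52 nautical miles, which lands within a few miles of Vette’s figure.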

What can we learn from this Precursor Safety Event… and nail-biting saga? Clearly proficiency plays a role: Prochnow’s Navy aviation experience (the box-pattern search, fuel conservation) and Vette’s navigation proficiency both contributed to avoiding a tragedy. Insufficient planning for a risky flight was also a factor, as no backup means of navigation had been arranged. The most significant view from a high reliability perspective, though, is to consider the performance modes of the two main characters. Prochnow was in a skill-based mode for most of the flight, following instruments, with occasional shifts into rule-based mode for flight planning and position reporting. His knowledge-based mode kicked in quickly when he realized he had no alternate navigation system: he immediately requested assistance (the Mayday) and, moreover, anticipated the rescue efforts his dilemma would set in motion.

Enter Vette, who immediately engaged a knowledge-based mode and began scanning his mental database for tactics that could narrow the position search. His creativity in comparing differences in the sun’s position went a step beyond his license requirements, which teach the use of sun techniques to establish one’s own position. Even greater creativity was shown in the VHF geometry exercise, although radio range is a rough estimate at best, and the exercise was time-consuming while remaining fuel was an issue. Perhaps the final sunset-timing tactic came just in time, as it convinced both players that westbound progress was immediately essential if recovery was to be possible.

Questions to consider:

1. Which of our technologies could suddenly fail, thrusting us into that second kind of reliability?

2. Are our clinical teams prepared for when technology fails? Too often we suffer from over-reliance on technology and deskilling, like the many of us who cannot do math without the calculator app on our phones.

3. How can we keep our clinical skills sharp, like the airmanship of Gordon Vette? We at HPI think the answer is in-situ team simulation and debriefing of everyday care activities. Learning is doing with feedback. So if we are doing, we are learning something. Debriefing means feedback, and feedback means we are learning the right things the right way.