On 23 August 2013, a Super Puma helicopter en route to Sumburgh airport in Shetland crashed into the North Sea after suffering a "catastrophic loss of power". The crash took the lives of four of the 18 people on board; the other 14 survived.
In the immediate aftermath of the crash, helicopter operator CHC announced that it would ground all of its Super Puma AS332 L2 aircraft until an investigation established the causes of the crash and whether the aircraft itself was to blame.
Describing his experience at the crash site, RNLI rescue co-ordinator Jim Nicholson said: "There appears to have been a catastrophic loss of power which meant the helicopter suddenly dropped into the sea without any opportunity to make a controlled landing."
Shetland crash caused by human error
Those initial concerns of mechanical problems would turn out to be misplaced. In a report into the crash published in September 2013, the U.K. Air Accidents Investigation Branch (AAIB) found that the cause was pilot error. The report found that the helicopter's nose pitched up, resulting in a sharp drop in airspeed. It reported that the crew "belatedly" realised the danger and that by that point the low altitude meant that "impact with the sea was unavoidable". Flight data showed that the Super Puma itself had behaved as expected.
While the mechanics were found to be sound, the accident in Shetland raised concern that the increased use of automated systems and technology may be weakening rather than strengthening safety. The AAIB investigation into the crash made repeated mention of the automation technology in use during the flight, with the Automatic Voice Alarm Device, autopilot and Automatic Flight Control Panel all identified in the document.
But automation overshadowed pilots' own operational expertise
The report showed that in the moments before the crash, the pilot and co-pilot were made aware of the danger through the automated audio system. Laying out the sequence of events, the report said: "At 2.0 nm the co-pilot advised the commander that the height at 1 nm should be 390 ft. The co-pilot made a call at 100 ft above the MDA (300 ft); the commander acknowledged. There was then an automated audio call of "Check Height", an acknowledgement by the commander, and then a comment by the co-pilot to draw the commander's attention to the airspeed."
"At this time the helicopter's airspeed was 35 kt and reducing. Shortly thereafter, there was a second automated audio call of "CHECK HEIGHT", followed by a "100 FEET" automated call two seconds before impact with the surface of the sea."

It concluded: "At some point the commander saw the sea, but he was unable to arrest the helicopter's descent and it struck the surface shortly thereafter at 1717 hrs."
In a safety review of offshore helicopter transport operations, the Civil Aviation Authority (CAA) accepted that the issue of automation and its impact on flight operations needs to be addressed. The review said: "In common with commercial airline operations, the review found that loss of control associated with the sophistication and automation of modern aircraft and helicopters is an issue requiring attention."
Clear benefits but vulnerabilities remain
While the CAA has yet to produce a full report on the issue of automation and its impact on flight safety, the US FAA conducted a comprehensive study on the topic in 2013. In its report, the FAA made it clear that the development of new automated systems had improved performance in a number of areas.
The group found that "automated systems have made important positive contributions to safety, through reduction in workload and other factors." It added that the systems had contributed to improvements in accuracy and fuel efficiency.
However, the FAA also raised concern over "vulnerabilities" relating to pilot use and interaction with the systems. The report said: "Pilots sometimes over-rely on automated systems – in effect, delegating authority to those systems, which sometimes resulted in deviating from the desired flight path under automated system control."
Figures included in the report showed that roughly a quarter of pilots involved in accidents had been over-confident in the systems and that some were reluctant to intervene. In its review of accidents, the group also found that more than half of the pilots were out of the control loop and not prepared to assume control when necessary.
The FAA suggests that in a number of cases, a pilot's over-confidence in or over-reliance on automation leads them to doubt their own abilities, or in fact erodes their capabilities. It notes that such problems are amplified by the operating policies of some operators, which prioritise automated over manual operation. Identifying a possible consequence, the report said: "Pilots may not be prepared to handle non-routine situations, such as malfunctions or off-nominal conditions."
The risk of information overload
Another potential risk posed by the shift to greater automation is that pilots are presented with too much information. The report said: "The WG was told by several manufacturers that today's technology allows for too much information to be presented to the pilot, which could overload the pilot during routine or critical phases of flight. What is presented, how it is presented, and resulting pilot understanding, especially in regards to flight path management, must be addressed for safe operations."
Based on its own initial findings and the information provided by the FAA, the CAA has issued a series of recommendations which it believes should be adopted by industry. Relating specifically to automation, it called for helicopter manufacturers to review their recommended training material to ensure that pilots are better prepared for operating "modern highly complex aircraft".
Further, it called for pilot training organisations and helicopter operators to adopt manufacturers' operating philosophies and recommended practices in order to minimise the risk of malfunction or misuse.
As the FAA and CAA both explained, the evolution of automated technology within the aviation sector has provided significant benefits, both in terms of safety and operation. But the issues involved in the crash last August, and other similar incidents, suggest that an awkward tension has developed between human pilots and their computerised counterparts.
While they were originally developed merely to assist, automated flight systems are now, in some cases, the dominant force. For the industry and the authorities, the challenge is to ensure that the roles of automated systems and human pilots are clearly defined and carried out without fail, as confusion can lead to grave consequences.