Human dependability: how to deal with human error
The expression 'unmanned spacecraft' is ultimately misleading. Human beings are an indispensable part of space missions. The systems up in orbit may be largely automated, but this 'space segment' is only the tip of a very large iceberg, which includes what is referred to as the 'ground segment': all the infrastructure, computer systems and personnel down on Earth that keep the mission running.
Human dependability is about the contribution of the human in a space system to safety and reliability. Machines can fail; so can the 'people in the loop' of space systems, sometimes with catastrophic consequences. ESA's Dependability and Safety Section therefore has a longstanding interest in the subject of human dependability: how can the incidence of human error be reduced, and its effects minimised?
A standardised approach
This interest has been boosted in recent years as the scope of ESA's space activities has expanded, for example with the possibility of providing safety-critical systems and services on Earth through EGNOS and Galileo. Human errors have the potential to put at risk not only multimillion-euro spacecraft but also human lives.
ESTEC hosted a Human Dependability Workshop - called HUDEP - in September 2009. HUDEP was attended by more than 40 participants from ESA programmes and operations, as well as space and non-space industry. The purpose was to share current best practices within the space sector, and to 'spin in' non-space expertise on this extremely complex subject.
The subject owes its origin to aircraft accident investigations in the 1940s, out of which the very term 'human error' was coined. Today, human dependability is an extremely important consideration within safety-critical technological fields such as nuclear energy, air traffic control and high-speed railway transport.
The workshop conclusions are being used as the basis of an initiative to establish human dependability as a discipline at ESA, in order to provide and coordinate support to Agency projects.
Taking lessons from the air
Humans are involved at every stage of the development, management and operation of space systems. Potential human errors need to be identified early and prevented. Air traffic makes for a good comparison: both the pilot of a plane and air traffic controllers can make mistakes with fatal consequences.
In contrast to machines, which are good at doing endless repetitive tasks, the human intellect can solve unexpected situations or other problems in a creative way - as when a pilot makes a split-second decision during an emergency. However, humans tend to get bored and tired when doing repetitive tasks for too long, and are more prone to fatal mistakes in extreme stress situations.
Research by Prof. Heiner Bubb of Munich Technical University shows that the 'mean time between human failures' decreases dramatically as the complexity and stress of the tasks being done increases - from roughly one mistake every half hour for simple, well-practiced activities, to one mistake every 30 seconds for the hardest, seldom-tried procedures.
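To make the scale of that drop concrete, the two quoted endpoints can be turned into a small illustrative model. Only the endpoints (roughly 30 minutes and 30 seconds between mistakes) come from the text; the log-linear interpolation between them, and the `mean_time_between_failures` function itself, are assumptions for this sketch, not Prof. Bubb's published curve.

```python
import math

# Endpoints quoted in the text; everything between them is interpolated
# on a log scale purely for illustration.
MTBF_SIMPLE_S = 30 * 60  # ~one mistake per half hour, simple well-practiced task
MTBF_HARD_S = 30         # ~one mistake per 30 seconds, hardest seldom-tried task

def mean_time_between_failures(complexity: float) -> float:
    """Estimate mean time between human failures, in seconds.

    complexity: 0.0 = simple, well-practiced activity;
                1.0 = hardest, seldom-tried procedure.
    Interpolates log-linearly between the two quoted endpoints.
    """
    if not 0.0 <= complexity <= 1.0:
        raise ValueError("complexity must be in [0, 1]")
    log_mtbf = ((1 - complexity) * math.log(MTBF_SIMPLE_S)
                + complexity * math.log(MTBF_HARD_S))
    return math.exp(log_mtbf)
```

Even in this crude model, a task of middling difficulty already sits at a few minutes between mistakes - a factor-of-60 spread from easiest to hardest.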
"What we know is that human failure rate is high," Captain Eugen H. Buehle of Lufthansa told the workshop. "And airplane operation is risk management. When it comes to the people flying planes, pilot selection is one key factor and pilot training is the other."
The company's pilots are given a minimum of four simulator sessions per year as well as special training scenarios and coaching. The aim is to keep their ability to handle emergency situations sharply honed.
Poor human-machine interface designs, an unsuitable operator environment, insufficient training, an overly ambitious operational schedule or even time of day - all of these factors make mistakes more likely to occur.
Sylvie Figarol of France's Air Navigation Technical Centre recounted how simulator-based tests of air traffic controllers and pilots fed back into improvements of the Traffic Collision Avoidance System (TCAS), a highly reliable automated system that raises the alarm when aircraft risk coming too near each other.
The widely used TCAS system has saved many lives, but miscommunication or errors around these alarms have occasionally triggered dangerous actions. A 2002 mid-air collision over Überlingen, Germany, took place after an air traffic controller told aircrews to do the opposite of what TCAS had automatically instructed them.
Actual incidents were recreated in test simulations, with participants wired up to observe their physiological behaviour. Extreme stress increased the tendency to act, though not always in a well-judged way. Test results inspired TCAS training improvements and simplified procedures. "Design of such a safety net must include human aspects," said Ms Figarol.
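The core idea of such a safety net - predict whether two aircraft will pass too close, and alarm if so - can be sketched in a few lines. To be clear, this is a toy closest-point-of-approach check, not the real TCAS logic (which reasons about closure rates, altitude bands and coordinated resolution advisories); the `Aircraft` class, the 5-nautical-mile separation threshold and the 60-second look-ahead are all assumptions chosen for the illustration.

```python
import math
from dataclasses import dataclass

@dataclass
class Aircraft:
    x: float   # position east, metres
    y: float   # position north, metres
    vx: float  # velocity east, m/s
    vy: float  # velocity north, m/s

def predicted_conflict(a: Aircraft, b: Aircraft,
                       min_sep_m: float = 9260.0,   # ~5 NM, assumed threshold
                       horizon_s: float = 60.0) -> bool:
    """Toy safety net: True if the pair is predicted to breach the
    separation minimum within the look-ahead horizon, assuming both
    aircraft fly straight at constant speed."""
    rx, ry = b.x - a.x, b.y - a.y        # relative position
    vx, vy = b.vx - a.vx, b.vy - a.vy    # relative velocity
    v2 = vx * vx + vy * vy
    # Time of closest point of approach, clamped into [0, horizon]
    t_cpa = 0.0 if v2 == 0 else max(0.0, min(horizon_s, -(rx * vx + ry * vy) / v2))
    dx, dy = rx + vx * t_cpa, ry + vy * t_cpa
    return math.hypot(dx, dy) < min_sep_m
```

Two aircraft 20 km apart flying head-on at 200 m/s would trigger the alert; the same pair flying in the same direction would not. The hard part, as Ms Figarol's simulations showed, is not this geometry but what stressed humans do once the alarm sounds.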