Human dependability: how to deal with human error

Paris, (PresseBox) - To err is human, goes the saying - one that space mission planners can never forget. Human error has doomed numerous missions over the past half-century. Mistakes are inevitable in any system that involves people, and space is extremely unforgiving of them.

The expression 'unmanned spacecraft' is ultimately misleading: human beings are an indispensable part of every space mission. The systems in orbit may be largely automated, but this 'space segment' is only the tip of a very large iceberg that also includes the 'ground segment' - all the infrastructure, computer systems and personnel down on Earth that keep the mission running.

Human dependability concerns the human contribution to the safety and reliability of a space system. Machines can fail, and so can the 'people in the loop', sometimes with catastrophic consequences. ESA's Dependability and Safety Section therefore has a long-standing interest in human dependability: how can the incidence of human error be reduced, and its effects minimised?

A standardised approach

This interest has grown in recent years as the scope of ESA's space activities has expanded - for instance, the possibility of providing safety-critical systems and services on Earth through EGNOS and Galileo. Human errors can put at risk not only multi-million-euro spacecraft but also human lives.

ESTEC hosted a Human Dependability Workshop - called HUDEP - in September 2009, attended by more than 40 participants from ESA programmes and operations as well as from space and non-space industry. The purpose was to share current best practice within the space sector and to 'spin in' non-space expertise on this extremely complex subject.

The subject has its origins in aircraft accident investigations of the 1940s, out of which the very term 'human error' was coined. Today, human dependability is a central consideration in safety-critical technological fields such as nuclear energy, air traffic control and high-speed rail transport.

The workshop conclusions are being used as the basis for an initiative to establish human dependability as a discipline at ESA, providing and coordinating support to Agency projects.

Taking lessons from the air

Humans are involved at every stage of the development, management and operation of space systems, so potential human errors need to be identified early and prevented. Air traffic makes a good comparison: pilots and air traffic controllers alike can make mistakes with fatal consequences.

In contrast to machines, which excel at endless repetitive tasks, the human intellect can resolve unexpected situations and other problems creatively - as when a pilot makes a split-second decision during an emergency. However, humans become bored and tired when performing repetitive tasks for too long, and are more likely to make fatal mistakes under extreme stress.

Research by Prof. Heiner Bubb of Munich Technical University shows that the 'mean time between human failures' falls dramatically as the complexity and stress of a task increase - from a mistake roughly every half hour for simple, well-practised activities to a mistake every 30 seconds for the hardest, seldom-tried procedures.
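To see what such figures imply in practice, human error occurrence is sometimes modelled as a random (Poisson) process, so that the chance of at least one error during a task of duration t is 1 - exp(-t / MTBF). The sketch below is purely illustrative - the modelling assumption and the example task duration are not from the workshop; only the two rounded MTBF figures (about 30 minutes and about 30 seconds) come from the research described above.

```python
import math

def error_probability(task_seconds, mtbf_seconds):
    """Probability of at least one human error during a task,
    assuming errors arrive as a Poisson process with the given
    mean time between failures (MTBF)."""
    return 1.0 - math.exp(-task_seconds / mtbf_seconds)

# A hypothetical 5-minute task performed under the two regimes:
routine = error_probability(300, 30 * 60)  # simple, well-practised: MTBF ~30 min
stressed = error_probability(300, 30)      # hardest, high-stress: MTBF ~30 s

print(f"routine:  {routine:.0%}")   # about 15%
print(f"stressed: {stressed:.0%}")  # effectively 100%
```

Even with the forgiving routine figure, a non-trivial error probability accumulates over a few minutes - which is why the emphasis falls on prevention, training and error-tolerant design rather than on expecting flawless performance.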

Avoiding error

"What we know is that human failure rate is high," Captain Eugen H. Buehle of Lufthansa told the workshop. "And airplane operation is risk management. When it comes to the people flying planes, pilot selection is one key factor and pilot training is the other."

The company's pilots are given a minimum of four simulator sessions per year as well as special training scenarios and coaching. The aim is to keep their ability to handle emergency situations sharply honed.

Poor human-machine interface design, an unsuitable operator environment, insufficient training, an overly ambitious operational schedule or even the time of day - all these factors make mistakes more likely to occur.

Sylvie Figarol of France's Air Navigation Technical Centre recounted how simulator-based tests of air traffic controllers and pilots fed back into improvements of the Traffic Collision Avoidance System (TCAS), a highly reliable automated system that raises an alarm when aircraft risk coming too close to each other.

The widely used TCAS has saved many lives, but miscommunication or errors around its alarms have occasionally triggered dangerous actions. A 2002 mid-air collision over Überlingen, Germany, occurred after an air traffic controller told aircrews to do the opposite of what TCAS had automatically instructed.

Actual incidents were recreated in test simulations, with participants wired up so their physiological responses could be observed. Extreme stress increased the tendency to act, though not always in a well-judged way. The test results inspired improvements to TCAS training and simplified procedures. "Design of such a safety net must include human aspects," said Ms Figarol.
