Human Factors in Aviation Accidents
As a result of research recognizing the role of human factors in error management, the aviation industry began taking initiatives to restructure aviation organisations. The industry shifted to a more open culture that valued communication while recognizing that human error will always be present but can be reduced further through time and more advanced study (Sexton, Thomas & Helmreich, 2000). Almost all aspects have been examined, and selection and training processes were raised to a different level.
While technical skills were highly valued before, they have been found inadequate for dealing with safety concerns. Selection of aircrew now includes processes that assess candidates’ ability to learn from errors and to become team players. There has also been a new approach to training: not only the pilots but the entire crew is trained. A mounting number of interventions are also being tested for their effectiveness in modifying behavioral patterns that could compromise a system’s safety.
Because crew resource management (CRM) failure is implicated in most aircraft accidents, CRM training programs, which began as part of a National Aeronautics and Space Administration (NASA) program in 1979, came to be widely adopted by aviation organisations. Now in its fifth generation, CRM training has been conducted by major airlines and even the navy. CRM training programs encompass many aspects of aviation safety, such as situation awareness, task management and communication (Thomas, 2004).
Fifth-generation CRM programs can be viewed as a tool for managing errors effectively. CRM combats errors in three ways: the first, obviously, is the avoidance of error; the second is the “suppression” of an error before it takes effect; and the third is the mitigation of the effects of those errors that cannot be avoided (Helmreich, Merritt & Wilhelm, 1999). The focus of fifth-generation CRM is the normalization of errors – whether due to active or latent failures – and the development of strategies to manage these errors (Helmreich, Merritt & Wilhelm, 1999).
Although CRM programs focus on human error and its effects on aviation safety, CRM is not purely behavioral training. After all, a “productive system” is an interaction of human and technological factors. CRM is therefore not to be taken as a stand-alone program. It is meant to be incorporated into technical training, where the crew must become adept in the operation of modern aviation technology while also learning non-technical skills such as effective communication, coordination and teamwork.
Despite the fact that intervention programs like CRM have been introduced and conducted for crews in major and regional airlines for two decades, the percentage of CRM failures in aviation accidents has remained relatively flat (Wiegmann & Shappell, 2001). Although initial results of CRM programs were encouraging, with positive effects seen almost immediately after a program was conducted, it soon became obvious that such results were short-lived (Taneja, 2002). What could have prevented such interventions from performing as expected?
Helmreich and Merritt (2000) offer an explanation, at least for CRM. First, not everybody responds to CRM training; some may even become less accepting of CRM after the training. Although attitudes do not necessarily determine behavior, it is a well-known maxim that those who reject a concept are unlikely to follow the principles it imparts (Helmreich, Merritt & Wilhelm, 1999). Culture – national, professional and organisational – is also a significant factor that determines the level of acceptance of CRM concepts (Helmreich & Merritt, 2000).
When CRM was introduced to other national cultures, it soon became evident that certain CRM concepts could be readily accepted or rejected depending on the national culture. Cultures such as those of China and many Latin American countries, which stress the absolute power and authority of leaders, will necessarily be less receptive to the idea of subordinates questioning the decisions of their leaders than cultures that are less hierarchical (Hofstede, 1980, as cited by Helmreich, Merritt & Wilhelm, 1999).
While CRM programs encourage subordinates to be more assertive in questioning their leaders, junior crew members in these cultures are disinclined to do so for fear of showing disrespect. These same cultures are also collectivist, stressing interdependence and the necessity of working together towards a common goal. In contrast, American and European cultures, which are highly individualistic, give more value to independence and more priority to individual goals.
The value of teamwork and the need for coordination will therefore most likely be more readily accepted in the former than in the latter. There are also High Uncertainty Avoidance cultures, such as Greece, Korea and many Latin American countries, that prefer CRM concepts that specify required behaviors. Cultures low in Uncertainty Avoidance tend to be more flexible about behaviors but have difficulty adhering to standard operating procedures.
Furthermore, low Uncertainty Avoidance cultures, along with non-collectivist cultures, are more questioning about the use of automation, while High Uncertainty Avoidance and collectivist cultures usually accept automation with few, or relatively fewer, questions. Intervention programs such as CRM should therefore not be patterned after a single national culture. As the discussion above shows, autocratic cultures can value teamwork and interdependence even more than non-autocratic cultures do (Helmreich, in press).
Cultures that have difficulty adhering to SOPs may be more innovative when dealing with novel situations not covered by procedures. In contrast, cultures that stress strict adherence to rules may find it difficult to be flexible in new situations. Although CRM programs require behavior modification, certain beliefs ingrained in a culture are quite difficult to modify. If CRM is to attain widespread global use, its programs must be designed to be congruent with national culture while still enhancing safety.
Error management should therefore be embraced as a culture in itself. Focusing on threat and error management as goals, training programs should aim not for a total reversal of norms and beliefs but for the drawing out of positive behaviors without directly confronting national culture. Many professions, including aviation, have strong cultures and develop their own norms and values (Helmreich, in press). Each culture encompasses both positive and negative aspects. Aircrew, for example, have a high level of motivation and a strong sense of professional pride.
The negative component, which appears to be universal, is the inability to admit vulnerability to stressors. The majority of pilots in almost all national cultures agree that their decision-making abilities are not hampered by personal problems and are as good in emergency situations as in normal ones. Furthermore, most of these pilots indicate that they do not make errors even while under stress. This “macho” culture, when left uncorrected, can lead to risk taking, failure to coordinate with other crew members, and error.
Indeed, one focus of the fifth-generation CRM program is to help pilots acknowledge that human error does occur and that they are more vulnerable to it if they continue to deny the existence and effects of stressors. As stated earlier, the organisational culture determines the organisation’s stance towards safety. One reason why CRM does not deliver results when it should is the organisational context in which the program is delivered (Helmreich & Merritt, 2000).
Organisations may have a highly evolved safety culture and possess a positive outlook on safety; such organisations will most likely benefit from CRM and other intervention strategies. Others react to safety threats only once they are looming, while some give perfunctory attention to safety issues without really embracing a safety culture that works towards a highly effective accident prevention program. The current process of investigating errors during accidents also contributes to the apparently limited success of intervention strategies.
Most accident reporting systems are primarily technological and have been designed with little regard for human factors. Such systems are well on their way to being perfected in terms of identifying mechanical failures but remain inadequate when it comes to assessing human error (Wiegmann & Shappell, 2001). An examination of the accident investigation process may help drive home the point. When an accident due to mechanical failure happens, investigators examine objective and quantifiable information, such as that obtained from the flight data recorder.
The data is processed and the probable causes of the accident are recorded so that safety recommendations can be identified. After the investigation, the data obtained is entered into a database that can be periodically accessed to provide feedback to investigators. The information in the database can also be used by funding organisations to determine which research to fund. As a result, intervention strategies are further developed to prevent mechanical failures from happening or to mitigate the consequences once they do (Helmreich & Merritt, 2000).
Either way, the number of accidents due to mechanical failure has been greatly reduced. In contrast, investigations of accidents due to human error produce results that are intangible and difficult to quantify. And because the study of human factors came later than the study of mechanical failures, the investigative techniques used in human error analysis are less refined than those used to assess engineering and design concerns.
When these techniques are used to analyze accident data, the results are rarely very useful and safety issues cannot be readily addressed; any intervention strategy designed from such data is therefore not assured of success. Taneja (2002) echoes this concern, stating that human errors have been implicated in 70-80% of aircraft accidents, which demonstrates the very limited success of current intervention strategies, and proposes a solution: a holistic approach to intervention strategies. To date, most research on human factors in aviation has focused on specific aspects of aircraft accident prevention.
Sarter and Alexander (2000) observed that current research on human error focuses mainly on three key aspects: the development of error classification schemes, the design of error-tolerant systems, and error prevention through improved design and additional training (as cited by Taneja, 2002). Because accident prevention is such a wide field of study, it is quite possible that certain aspects have been overlooked. Error frameworks also abound, as almost every human factors researcher devises a framework of his or her own.
With so many intervention strategies proposed by researchers, and some even practiced in the industry, there have been inadequate attempts to integrate these strategies into a holistic solution. A holistic approach to intervention strategies involves looking at all the possible links to an aircraft accident in order to come up with the intervention that best fits an organisation. Based on all the links, an organisation must assess where its weaknesses lie and determine whether these need intervention.
The intervention strategy used in a given organisation will therefore be customized according to the weak links in that organisation’s system. The two links directly related to aviation accidents are the aircraft and the aircrew. Although accidents due to mechanical failure have been greatly reduced owing to advances in technology, there needs to be constant assessment of the aircraft in general and of the human factors affecting the man-machine interface in particular (Taneja, 2002).
Intervention strategies that can be applied to aircrew fall into two broad categories: selection and training. Since it became known that human factors contribute to aviation accidents, the selection process has been amended so that pilots are chosen not on technical skill alone but, more importantly, on their ability to coordinate with fellow aircrew, recognize their own errors and work with others to deal with those errors. Once chosen, the pilot’s training will have a great impact on his proficiency and, possibly, on his ability to avoid accidents.
Possible interventions on these two main links can have a major influence on the number of accidents an organisation encounters. The process of accident investigation, although a secondary link, should also be examined for possible intervention strategies. To date there is no benchmark for the training and expertise a safety investigator must have; for a standardised level of investigation to be achieved, a minimum standard of investigator exposure needs to be set. The investigative tools the safety investigator uses could also be subject to intervention.
As discussed earlier, investigation techniques for human error are inadequate at present and need further research in order to reach the level of refinement of the evaluative techniques for mechanical failures. The kinds of human error discussed by Wiegmann and Shappell (2001) in developing HFACS are also links that could be used to come up with more effective intervention strategies. Furthermore, autopsy results from aircraft accidents, although unable to prevent future accidents, can be used to design intervention strategies that will make an aircraft accident more survivable.
Upon carefully assessing the links just mentioned and determining the areas that need intervention, it must be recognized that the success of any strategy ultimately depends on the organisational safety culture (Taneja, 2002). Toft (1989) defined an organisation’s safety culture as “the set of norms, beliefs, attitudes and roles, social and technical practices that minimizes the exposure of the managers, employees and the general public to conditions considered dangerous or injurious” (as cited by Taneja, 2002). As discussed briefly earlier, the way an organisation handles errors and error management strategies depends on its safety culture.
A safety-conscious organisation will not treat an accident as just another unavoidable circumstance but will focus its efforts on preventing future accidents. A careful investigation that searches for the possible factors leading to the accident will be performed. Ideally, an organisation with a highly effective safety culture will aim to plug the holes in the Swiss cheese model before another opportunity for catastrophe is created (Reason, 1990, as cited by Taneja, 2002).
Even organisations with a sound safety culture, however, can still be plagued by errors caused by human limitations such as fatigue, severe workloads, inadequate training and poorly maintained equipment, as well as by errors from air traffic control that consequently affect the aircrew and the organisation. Even if an organisation has tried to remove every conceivable active and latent failure, chance errors can still take place (Helmreich & Merritt, 2000). Every organisation is bound to have a loophole; thus, constant monitoring of all aspects of the organisation, whether directly related to safety or not, is essential.
Because errors can arise from a variety of sources, no single intervention strategy, such as CRM, should be taken as a panacea for eliminating error. CRM is only a tool that organisations use to perform error management. All intervention strategies have their limitations: the effects on an organisation depend on the national culture, the strengths and weaknesses of the professional culture, and the organisational safety culture. Furthermore, the way the people at the front line perceive an intervention strategy will affect its outcome.
In summary, the study of human factors in aircraft accidents has fostered an abundance of research in this field. Since the conception of CRM almost three decades ago, numerous error frameworks and intervention strategies have been proposed and used. Unfortunately, this plethora of studies has not produced a significant reduction in human error-related aircraft accidents. Presumably because the field is still relatively young, the research needs more focus and the intervention strategies need more refinement. Intervention strategies have to address differences in national, professional and organisational culture.
It is also important that intervention strategies be customized to a given organisation to ensure a greater chance of success. With the continuous development and improvement of research in this field, there is bound to be a breakthrough in time, and the success that has long been sought will finally be at hand.
References

Helmreich, R. L. (in press). Culture, threat and error: Assessing system safety. In Safety in Aviation: The Management Commitment: Proceedings of a Conference. London: Royal Aeronautical Society. Retrieved October 2, 2007 from http://homepage.psy.utexas.edu/HomePage/Group/HelmreichLAB/Publications/pubfiles/Pub257.pdf

Helmreich, R. L. & Merritt, A. C. (2000). Safety and error management: The role of Crew Resource Management. In B. J. Hayward & A. R. Lowe (Eds.), Aviation Resource Management (pp. 107-119). Aldershot, UK: Ashgate Publishing Ltd.

Helmreich, R. L., Merritt, A. C. & Wilhelm, J. A. (1999). The evolution of Crew Resource Management training in commercial aviation. International Journal of Aviation Psychology, 9(1), 19-32.

Reason, J. (2000). Human error: Models and management. BMJ, 320(7237), 768-770.

Sexton, J., Thomas, E. J. & Helmreich, R. L. (2000). Error, stress and teamwork in medicine and aviation: Cross sectional surveys. BMJ, 320, 745-749.

Taneja, N. (2002). Human factors in aircraft accidents: A holistic approach to intervention strategies. Retrieved October 1, 2007 from www.humanfactors.uiuc.edu/Reports&PapersPDFs/humfac02/tanejahf02.pdf

Thomas, M. J. W. (2004). Error management training: Defining best practice. ATSB Aviation Safety Research Grant Scheme Project 2004/0050. Retrieved October 1, 2007 from www.atsb.gov.au/publications/2004/pdf/error_management_training_best_practice.pdf

Wiegmann, D. A. & Shappell, S. A. (2001, February). A human error analysis of commercial aviation accidents using the Human Factors Analysis and Classification System (HFACS). Office of Aviation Medicine. Retrieved October 2, 2007 from docs/508/docs/cami/0103.pdf

Wiegmann, D. A. & Shappell, S. A. (2003). A human error approach to aviation accident analysis. Aldershot, UK: Ashgate Publishing Ltd.