QUALITATIVE HUMAN FACTORS ANALYSIS ON COLGAN 3407


By Kent B. Lewis

NTSB Identification: DCA09MA027
Scheduled 14 CFR Part 121: Air Carrier operation of COLGAN AIR INC
Accident occurred Thursday, February 12, 2009 in Clarence Center, NY
Aircraft: BOMBARDIER INC DHC-8-402, registration: N200WQ
Injuries: 50 Fatal.

This is preliminary information, subject to change, and may contain errors. Any errors in this report will be corrected when the final report has been completed.

On February 12, 2009, about 2217 eastern standard time (EST), a Colgan Air, Inc., Bombardier Dash 8-Q400, N200WQ, d.b.a. Continental Connection flight 3407, crashed during an instrument approach to runway 23 at the Buffalo-Niagara International Airport (BUF), Buffalo, New York. The crash site was approximately 5 nautical miles northeast of the airport in Clarence Center, New York, and mostly confined to one residential house. The four flight crewmembers and 45 passengers were fatally injured, and the aircraft was destroyed by impact forces and post-crash fire. There was one ground fatality. Night visual meteorological conditions prevailed at the time of the accident. The flight was a Code of Federal Regulations (CFR) Part 121 scheduled passenger flight from Newark Liberty International Airport (EWR), Newark, New Jersey, to BUF.


Human Factors Analysis and Error Classification

Qualitative research gains focus from deciding what is significant and then providing enough evidence to illuminate and make that case (Patton 2002).

Analysis


The purpose of analysis is hazard detection and identification of the multiple causal factors of a mishap. This was a classic stall/spin mishap, with multiple contributing human, material and situational factors. There are two types of “cause factors”: mishap cause factors, and cause factors of damage and/or injury occurring in the course of the mishap. Pilot error was not the sole or primary cause of this mishap. Key factors were present prior to the mishap in the regulatory, organizational and supervisory structure, creating latent conditions conducive to failure of the system. These factors constitute hazards that cause mishaps when left unmitigated. A complex interaction of these conditions should be examined and a causal map developed for this mishap. Without proper mitigations, these latent conditions led to active perceptual and skill-based errors by the mishap crew.
Reasonable and rational human beings conducting normal operations can become victims of the negative confluence of these latent factors, even when acting in a professional and responsible manner. Often the difference is that too many negative influences accumulated in the months before the mishap and converged just prior to it. If this constellation of latent conditions and active failures is not quickly identified and controlled, disaster can ensue.
When considering failures of high reliability organizations, time and causality are seamless, and the complex, interactive nature of the organization must be examined. Through an integrated approach of qualitative and quantitative analysis, investigators strive to balance description and interpretation, provide enough detail to yield rich insight, present a convincing argument and identify underlying deficiencies that might cause other mishaps. This analysis is “not intended to discover a single cause, assign blame or liability, or excuse human error”; this approach is in accordance with International Civil Aviation Organization standards and recommended practices for mishap investigation (ICAO). By identifying as many contributing factors as possible, we hope to improve the performance of high reliability organizations within the system and prevent future mishaps.


Error Classification


Regulatory Factors

-Operational process: Clearly defined objectives: Practical Test Standards for Private Pilot, Commercial, Multiengine, and Airline Transport Pilot.
Do current knowledge and practical test requirements ensure a proper skill set with regard to slow flight, stall and spin training, i.e., upset recovery training?

-Operational Process: Program management and oversight role in development of SMS.
Many failures of high reliability organizations share two common features. First, latent conditions and active failures are always present, and the quality of operational safety is dependent on established organizational safety processes. Second, the essential process of checking and reviewing system defenses broke down. It is easy for regulators and operators to lose sight of operational safety requirements in the face of economic pressures, but it is the duty of the regulator to provide oversight and guidance to those who develop and manage safety systems. In fact, it is the stated mission of the regulator.

In a generative safety management system, all participants would be trained and equipped to ensure safe operations: front line operators especially, all managers and top decision makers. This was not the environment that existed at the time of the mishap, either at the regulator or at the organization. There was no requirement for top management or the Director of Safety to have formal education or experience in safety programs or management systems, even though many formal schools and training courses are available.

“DR. BYRNE: What positions at Colgan have you had?
MR. MORGAN: I came in to Colgan in this position as the
VP of Safety and Regulatory Compliance.
DR. BYRNE: What airman's certificates, ratings or pilot
experience do you have?
MR. MORGAN: None, sir. I'm not a pilot.
DR. BYRNE: Give us a brief overview of your background
and education.
MR. MORGAN: Prior to Colgan Air, I had a 30-year career
with primarily Continental Airlines in operations management. My last position prior to coming to Colgan Air was as Chief Operating Officer of Continental Micronesia based in Guam.
DR. BYRNE: What previous experience or positions have you held regarding safety oversight?
MR. MORGAN: My primary safety oversight was in the recent position as COO of Continental Micronesia because I had all
the Part 119 positions reporting to me in that position. Prior to that, I held a number of staff-level positions which were creating policy and procedures for the company, as well as field leadership positions in the international operation.
DR. BYRNE: Have you had any specific training for this position?
MR. MORGAN: No, sir. I have not” (NTSB Public Hearing 2009).

Just as it is critical for pilots to have the proper knowledge, skills, resources and experience to safely operate aircraft, it is even more critical for managers and leaders to possess these same attributes so that they may effectively direct the organization. With this in mind, the Director of Safety and senior members of the flight safety investigation team should be graduates of an industry-recognized aviation safety course or have other suitable training or qualifications. The Director of Safety should also have direct budget authority for the aviation safety program and be directly responsible to the regulator.

“MR. MORGAN: Oh, I'm sorry. It's the Federal Aviation Regulation Part 119 that designates certain individuals who hold a position of authority and responsibility within the airline directly responsible to the FAA in carrying out their duties. The director of operations, the director of maintenance, the director of quality assurance, these are positions that have a line of authority and responsibility to the FAA” (NTSB Public Hearing 2009).

-Operational Process: Aircraft Certification Standards:
Inadequate barriers and safeguards were developed to protect against, and alert pilots to, a low airspeed condition; coupled with yaw sensitivity, the result was an aircraft prone to stall/spin departure when flown by pilots with inadequate initial skills training and certification and no valid recurrent training. According to Dr. Helmreich, in order for skilled crews to maintain a learned skill they must “be given a booster shot every 3-5 years.”

End of Regulatory Factors.

Organizational Factors

-Resource Management: Poor aircraft design: Automation and electronic flight instrument system (EFIS) design and interface.

Automation is used to control simple, linear tasks that can be easily automated, leaving the human to take care of complex flight management scenarios or compound emergencies. Dr. James Reason relates the “Ironies of Automation” developed by British engineering psychologist Lisanne Bainbridge, in which humans are suddenly abandoned by the automation in the control loop, left to perform rarely practiced critical skills in extremis conditions (Reason 1997, 43). In many cases these scenarios are novel and require the focused attention of an entire crew to resolve, as the undesired system state was not anticipated by engineers and the crew is left to create adaptive strategies. The final irony is that the most highly automated systems, which rarely fail, require the highest levels of operator training.
New technologies can radically change the nature of accidents or change the nature of the system that they protect. In the case of the Q400 flown in this mishap, design improvements to increase power and payload were not balanced with adequate flight control and automation protective systems. There is a relationship between human, situational and technical factors that must be considered during the conceptualization and design process. In this case the designers did not consider the impact of increased deceleration performance in a vulnerable, high workload environment such as the one experienced by the mishap crew.

“DR. DISMUKES: So this is normal communication. This is nothing extraneous right here but, yes, there were a number of concurrent tasks and this is a vulnerable period, no question about it.
MS. HERSMAN: So what we saw is they lost about 50 knots of airspeed --
DR. DISMUKES: Yes.
MS. HERSMAN: -- in 20 seconds.
DR. DISMUKES: Yes.
MS. HERSMAN: And during this period of time, they were preparing the aircraft --
MS. HERSMAN: -- for approach and landing. And this is not something that's unusual. The Safety Board's seen quite a few accidents and a couple during my time at the Board, where the crew has failed to monitor airspeed during this heavy workload --
DR. DISMUKES: Right.
MS. HERSMAN: -- event, and this decrease in airspeed happened pretty rapidly --
DR. DISMUKES: Yes.
MS. HERSMAN: -- over a short period of time. So it's not as if it was deteriorating for five minutes and they didn't catch it.
DR. DISMUKES: That's correct.
MS. HERSMAN: It was deteriorating, you know, in a quick
period of time. Do you think -- you talked about the stick shaker as being kind of an extraordinary event or experience maybe.
DR. DISMUKES: Yes, a dramatic intrusion on their attention.
MS. HERSMAN: Yeah, sure.
DR. DISMUKES: A dramatic event and, you know, I questioned whether or not it would make sense to give the crew an alert for low airspeed prior to the onset of the shaker. I think an alert that your airspeed is deteriorating is kind of like a fire alarm, and stick shaker and pusher are like you're in room that's already on fire, and if you could get an alert to let you know that your speed is deteriorating, then you have the ability to correct that before you get into that dramatic event or that room that's on fire. When I'm jump seating in cockpits, even when they're on autopilot, they get a chime. They get an audible alert when they're 1,000 feet from their target altitude” (NTSB Public Hearing 2009).

The mishap crew was also tasked with managing a mixed mode of automation on a continual basis: the autopilot controlled pitch, roll and yaw, while the crew simultaneously managed power settings, in addition to navigation and communication duties, checklists and aircraft configuration changes. This is problematic for several reasons. Studies conducted by G. A. Miller…
This design increases the monitoring duties of the pilot, which may be acceptable during normal flight operations but becomes critical when flying an aircraft that can accelerate and decelerate rapidly in a low-altitude, night, IMC environment.

The Q400 has also been described by line pilots as being extremely sensitive in the yaw axis, which leads to challenges when manually flying the aircraft.

“HOT-2 I think it's fun flying with— with captains. not so much any— lately but right at first that came from the Saab and they'd see— they'd see the rudder and they'd— aww # and kick it really hard and fling the plane back and forth.
HOT-1 kind of like I did a little while ago.
HOT-2 yeah kind of but uh at first I flew— I flew with some captains that were doing it really bad.
HOT-1 really.
HOT-2 like knock the flight attendants down in the back” (CVR Transcript).

When this characteristic is coupled with the flight regime experienced in high drag configurations and a nose-high upset or stall, maintaining control of the aircraft can be difficult without continual practice. Risk treatment strategies here could include proper initial and recurrent flight training on the unique slow flight and stall characteristics of the Q400. Other options are to add automation barriers and display systems that warn of this impending hazardous condition and/or protect against it. A high reliability system is supposed to include a multiplicity of mutually supporting and overlapping defenses that protect it from single-point human or technical failures (Reason 1997, 7).


The current mixed automation configuration not only fails to protect against this corner of the flight envelope; in this mishap the autopilot actually flew the aircraft into a nose-high regime, with no salient cues, while airspeed rapidly decayed.

“DR. DISMUKES: We don't have as many salient feedback loops and the training to recover from an incipient stall with emphasis on minimizing loss of altitude is very different. It doesn't give you the same feedback. As I said, it doesn't give you the surprise” (NTSB Public Hearing 2009).


A display is the primary means of presenting information to the flight crew's visual, aural and tactile senses. Displays should alert the crew and draw attention, represent the nature of the condition and, when possible, recommend or direct corrective action. The low airspeed cue simply provides information in a passive fashion, hardly indicative of an impending upset or out-of-control flight situation. This design is a tradeoff accepted during the transition to electronic flight instrument systems (EFIS) and the primary flight display (PFD), and this mishap identifies an emerging hazard.
EFIS and the PFD are intended to reduce the risks of visual illusions and spatial disorientation during instrument flight conditions by supplementing normal visual cues, but the tape system currently used is not intuitive. There is very little positional movement of the display due to the nature of the design, unlike a conventional analog airspeed indicator, where experienced pilots can build appropriate mental models of airspeed from the position of the needle alone. The PFD airspeed tape appears static during a conventional instrument scan, and movement is not noticed unless focused attention is placed on the display, which is a poor information retrieval strategy when managing concurrent tasks during periods of high workload.
Compounding this problem, the airspeed tape and altitude tape represent information in radically different ways. If the airspeed tape is rising, airspeed is decreasing and a correction could require pitch down. Conversely, if the altitude tape is rising, the aircraft is descending and a correction would require pitch up. While this may be satisfactory for a pilot placed in an automation-monitoring role, it is not intuitive to a pilot with an embedded conventional instrument scan.
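To make this asymmetry concrete, the following toy model (a sketch in Python with hypothetical names and sign conventions, not any actual avionics implementation) shows how identical visual motion of two tapes maps to opposite pitch corrections:

# Illustrative sketch only: simplified PFD tape semantics.
# On a typical tape, larger values are drawn toward the top and the
# current value stays centered, so when a value DECREASES the existing
# markings scroll UP ("the tape rises").

def tape_motion(value_change):
    return "rises" if value_change < 0 else "falls"

def implied_pitch_correction(tape, value_change):
    # Rising airspeed tape (speed decaying)      -> correction may be pitch DOWN.
    # Rising altitude tape (aircraft descending) -> correction is pitch UP.
    if tape == "airspeed" and value_change < 0:
        return "pitch down"
    if tape == "altitude" and value_change < 0:
        return "pitch up"
    return "no correction implied"

assert tape_motion(-10) == "rises"
print(implied_pitch_correction("airspeed", -10))   # pitch down
print(implied_pitch_correction("altitude", -100))  # pitch up

The same upward tape motion thus demands opposite control responses depending on which tape is moving, which is the intuition gap described above.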

One attempt at identifying desired airspeed conditions on PFDs is the addition of low airspeed cues: electronic information that represents crucial and selected airspeeds in relation to the airspeed tape. The fact that neither pilot reacted to the low airspeed cue suggests that the cue was not salient enough. No aural or visual warning of an impending undesired aircraft state is provided. There was also a similar event with the Q400 at BVT during VMC, in which neither the crew nor a Line Check Airman observing the flight from the cockpit jumpseat noticed the deteriorating airspeed (Witness statement).
The characteristics of a good cue are described as follows: it should attract attention at the critical time (conspicuous), it should carry sufficient information about what task needs to be carried out (content) and when (context), and it should allow the operator to ensure correct performance of the task (check) (Reason 1997, 98). The mishap pilot's scan was developed on conventional, round-dial flight instruments. Salient cues for airspeed are presented and learned differently in that environment, and they imprint onto cognitive memory. The transition to electronic flight instrument systems is challenging, and further research is needed in this area. Tape displays may be adaptable to the operational environment, but the current low airspeed cues, and the lack of alerts, cautions, warnings and safeguards, should be considered hazards, especially in impoverished visual conditions.

“MS. Hersman: A low airspeed alert seems to me to be much more of a potential to prevent catastrophe, and they don't get an alert to let them know that, you know, their airspeed is deteriorating so rapidly. So that's certainly something that I think that the Safety Board would have some concerns about because we do see that human beings lose their focus in these high workload events and failing to monitor your speed for 20 seconds had a, you know, catastrophic results. I mean I think this crew went from complacency to catastrophe in 30 seconds.
DR. DISMUKES: Yes” (NTSB Public Hearing 2009).

If we attempt to recreate the mishap and change the cues, environment or airplane, the outcome most likely would be different. Change the crew, and we still have critical aircraft states and a potential departure from controlled flight. That leads us to the second avenue for controlling error, which is to reduce the consequences of human error through cross-monitoring and crew cooperation. The automation should be viewed as part of this crew, charged with monitoring and cross-checking airspeed. Automation and display designs that prevent errors and can monitor and supplant human performance contribute greatly to limiting errors and their severe consequences. In this event, safeguards, warnings, aircraft design and crew training were not sufficient to prevent the mishap. A systems safety process could identify the hazards specific to this system and engineer controls to adequately manage the risks. Those controls, from least to most effective, are training, warnings, policy and procedure change, physical barriers and elimination of the hazard. In this event a low airspeed warning and protection system could have reduced tasking on the flight crew during a period of high workload (HFIAI).
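As an illustration of the kind of low airspeed warning logic discussed here, the sketch below implements a two-level alert that triggers on either proximity to a reference speed or a high decay rate, cueing the crew before stick shaker onset (the "fire alarm" before the room is on fire). All names and thresholds are hypothetical illustrations, not Q400 or certification values.

# Minimal sketch of decaying-airspeed alerting logic (hypothetical values).

def low_airspeed_alert(ias_kt, vref_kt, decay_kt_per_s,
                       margin_kt=15.0, decay_limit_kt_per_s=1.5):
    if ias_kt <= vref_kt:
        return "WARNING: AIRSPEED LOW"        # immediate action required
    if ias_kt - vref_kt <= margin_kt or decay_kt_per_s >= decay_limit_kt_per_s:
        return "CAUTION: AIRSPEED DECAYING"   # attention-getting chime
    return None                               # no alert

# A profile losing roughly 50 knots over 20-26 seconds (about 2 kt/s)
# would trip the rate-based caution well before the shaker:
print(low_airspeed_alert(ias_kt=150, vref_kt=118, decay_kt_per_s=2.0))

A rate-based trigger of this kind is the software analogue of the altitude-capture chime mentioned in the testimony: it alerts on the trend, not just the exceedance.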

-Organizational climate: Operational process: Safety programs, Oversight and management, Monitoring to ensure a safe work environment.

Ensuring operational safety is an inherent responsibility of command. Aviation safety promotes operational readiness through preservation of human and material assets. The estimated direct and indirect costs of this mishap far exceed any perceived cost savings realized from regulatory and organizational risk management decisions.
The objectives of an aviation safety program are to establish safety policies, detect and eliminate hazards, ensure safety process assurance and provide education and awareness of safety information. These objectives are both command and employee responsibilities, but management is ultimately accountable.
To run a successful safety program, at a minimum you need top executives, managers and staff with education and experience in aviation safety, including safety program management, aerostructures, aerodynamics and human factors, along with a robust reporting program. Also required are trained personnel with experience in aerospace medicine, aerospace physiology, maintenance, flight operations, quality assurance and ground safety, and pilots who are highly qualified in the type/model/series of the organization's aircraft.
The exercise of command should be used to solicit and encourage hazard reporting; that is in keeping with the spirit and intent of an aviation safety program. Top safety managers should also ensure that they are scanning the environment and have decision authority to incorporate industry safety standards and recommended practices. Without this delegated authority, organizational safety programs will most likely be ineffective, as is the case with this organization, where the Director of Safety does not have direct communications with regulatory safety inspectors.

“MR. MORGAN: Oh, I'm sorry. It's the Federal Aviation Regulation Part 119 that designates certain individuals who hold a position of authority and responsibility within the airline directly responsible to the FAA in carrying out their duties. The director of operations, the director of maintenance, the director of quality assurance, these are positions that have a line of authority and responsibility to the FAA. They're a designated
position. These are the people who get the primary communications directly from a POI or from a principal maintenance inspector.
MS. HIGGINS: But by definition, a SAFO is a safety directive?
MR. MORGAN: It is. But it is, in terms of distribution, is sent to those office holders within the airline who then have their internal responsibility to share it” (NTSB Public Hearing).

-Organizational Climate: Values and culture: SMS (policy and culture have ties to fatigue; need more work here).

Key components of a safety management system that meets current ICAO standards are safety policy, safety risk management, safety assurance and safety promotion (ICAO SMM 2009). Colgan did not have a safety management system in the truest sense of the definition.
It is crucial for a high reliability organization to develop a reporting, learning, informed and just culture. More representative of the actual organizational culture were memos that prohibited rest/sleep/naps and a monetarily punitive fatigue policy.
The engine that drives risk management is stakeholder reporting. The primary reporting process (ASAP) and other reporting systems were not effective and collected minimal operational data from crews, due either to a lack of understanding or to the perceived punitive nature of the program. There was no FOQA program. One crucial aspect of a safety program is that hazard reports should be generated any time a hazard is detected. The three purposes of hazard reporting are to report the hazard and the remedial action taken, to report the hazard and recommend risk treatment strategies, or to report the hazard and request that another agency determine appropriate corrective action. While the organization had several operational hazard reporting mechanisms in place, witness testimony reveals little use of any of them.

-Human Resources: Compensation scheme attracts pilots with limited experience and basic skills who can neither afford nor desire a migratory lifestyle.

Human resource considerations are key to an organization's success. Unfortunately, sufficient basic human resources in the form of shelter and sustenance were not provided to these employees, and they were forced to adapt to the hostile environment. The most important resource that any organization has is its people, and no high reliability organization will operate safely if employees are not provided enough resources to at least meet their basic human needs. In order to build a professional corps of pilots, human resources managers need to consider employee needs, job design, recruitment, selection, orientation, training, evaluation, coaching and discipline (Evans and Ward 2004, 361).

“MS. HERSMAN: But given that Colgan closed 11 bases and maybe opened up eight and you might have an operator or a pilot that will have to move several times, I think certainly given the economic times and the housing market now, I think it would be a challenge to expect people to, within 60 days, to relocate, especially if they're getting paid $16,000 a year.
MR. MORGAN: Well, it's certainly conceivable that we could look at the amount of time, but moving pilot bases and aircraft around is not necessarily a function of what is the most convenient for pilots, it is what works for the airline to put airplanes in places where they need to be flying and we try to adjust our bases as best we can, but we first and foremost have an airline to run” (NTSB Public Hearing 2009).

“MS. HIGGINS: I want to then turn to this whole commuter issue because I think, again, you're telling us you're hiring professionals, you're training them, but what they do on their own time is not something you can effect” (NTSB Public Hearing 2009).

“MR. HAUETER: Okay. Given the salary of new hires and the cost of living in the Newark area, how do you expect some of your staff to find adequate accommodations?
MR. MORGAN: I would expect the way other pilots do and other pilots have done for many, many years who have commuted in and out of different bases. People do find a way to share rooms, they share apartments, they have avenues available to them. It's something that's been done for many, many ages in the airline industry and I expect them to follow and do the same thing that others have done” (NTSB Public Hearing 2009).

There was much discussion regarding the mishap pilots' decisions to commute and the nature of this behavior. This behavior can be viewed as a reflection of organizational influence and goal orientation (Simon 1997, 5). The pilots valued their role in the organization, made compromised decisions that supported mission accomplishment, and assumed unreasonable personal and professional risk in the process. The organization is complicit in these decisions and lacking in oversight, as appropriate resources were not provided for the employees and managers did not ensure operational safety by assigning fully qualified and rested crews.

-Operational process: Strategic risk assessment: Schedules: Business plan (multiple bases, rapid growth + low pay = commuting pilots = pilots flying fatigued, both chronic and acute).

Before continuing to supervisory factors, it is important to pause and recognize that all of these latent conditions at the organizational level manifest themselves as preconditions for unsafe acts by degrading the threat and error management skills of the operators. In this particular case, they allowed adverse mental states to develop by creating threats to attention and cognitive task saturation.

End of Organizational Factors.

Supervisory Factors


-Inadequate Supervision: Failed to administer proper training. CRM.

CRM breaks down at night and/or in IMC because of adverse physiological states (e.g., spatial disorientation) and mental states (e.g., fatigue). "When visual cues are limited, aircrew coordination, both internal and external to the cockpit, is even more critical than usual" (Shappell and Wiegmann 2003, 139).

Beyond the maneuvers training (or lack thereof), how much and what type of CRM training did the crew receive at Colgan and before joining? A lack of appropriate training and skill sets can be traced to inadequacies in organizational processes and human resources management.

“DR. DISMUKES: But it's very clear that it is hard to figure out what the two pilots need to be saying to each other, but there is emphasis in the industry on required call outs in a lot of situations. For example, unstabilized approach, that's sometimes honored in the breach, but I really think that having highly practiced, memorized, standard call outs for situations, certainly including a stall recovery, would help a lot. It helps a lot to have the other person there who is not flying the airplane prompting you” (NTSB Public Hearing).

“DR. NESTHUS: We're less able to track multiple sources of information and avoid distractions. We're highly distractible when we're tired. We are often moody. Although this never happens to me, we become impatient with others, we lose our interpersonal skills, CRM becomes a difficult situation, so it becomes difficult communicating clearly, which I'm probably having right now” (NTSB Public Hearing).

-Inadequate supervision: Failed to administer proper training: Slow flight, stall recognition and recovery.

The crew performed the expected recovery, but aircraft performance exceeded the level required of crew training, given the current aircraft state.

“MS. HERSMAN: I mean these are just, you know, kind of the room's on fire at that point. You talked about being highly practiced at something, and I think that aside from Mr. Warner who's a test pilot, who said he had had 1,000 experiences recovering for stall, I think that most pilots don't actually have that opportunity, and so they may not be conditioned to respond appropriately. I think they do get trained on approach to stall but I'm not sure they get trained on recovery from stall in the aircraft that they're flying. Would you agree with that?
DR. DISMUKES: Absolutely” (NTSB Public Hearing).

-Failed to correct a known problem: Failure to monitor aircrew performance: Rejected:

The crew was qualified; in fact, the PF had received extra training to meet proficiency qualification requirements. Similar events had happened recently to other crews operating the Q400 in less demanding environments (BVT).

-Scheduling? CDOs, and appropriate planned and re-planned duty and rest periods that take into account the effects of circadian rhythm disruption and sleep deprivation, set sufficiently in advance to provide the opportunity for flight crew members to plan adequate rest for the duty period envisaged (ALPA 2009, 3). Also absent was a training program that included information on fatigue and sleep education, mitigation and countermeasure strategies.

-Inadequate Supervision: Failed to develop and administer proper training: Stall series.

Are crews taught to effect recovery with the minimum necessary pitch reduction, in order to minimize altitude loss? Are they trained for dirty, power-on, departure-type stalls (overrotation)? It seems that is where the autopilot put them.

-Inadequate Supervision: Administered improper training: Tailplane icing video shown when not required?

When we couple the fact that inadequate stall training was provided, and at the same time training was provided on tailplane icing, we have the possibility of a negative habit transfer. When crews are presented with a novel and unexpected situation, “the nature of memory is to retrieve the most typical explanation for the event” (Dismukes et al 2007, 201). This reaction is termed “representative bias.” Also important to understand is that “surprise, confusion, stress and the need to respond quickly impair deliberate cognitive processing…However these cognitive processes are vulnerable to error when the current situation only partly matches previous experience and when surprise and time pressure are present” (Dismukes 2007, 206-7).
Once an error slips through the first level of defenses, it can undermine other defenses downstream. The initial error of not noticing the low airspeed combined with the inherent cognitive vulnerabilities, vestibular illusions and features of the situation set the stage for misinterpreting the decay of airspeed. This second error then combined with other cognitive vulnerabilities and situation features to set the stage for mishandling the nose-high upset and subsequent stall.

End of Supervisory Factors.

Preconditions for Unsafe Acts


-Environmental: Physical environment: Vision restricted: Icing, darkness, clouds

“21:52:57.2 HOT-2 alrighty and for the rest of that weather uh three miles. it's snowing with some mist” (CVR).

-Environmental: Technological environment: Display/interface characteristics, instrumentation and sensory feedback systems.

“MS. HERSMAN: So for the Captain, you know, who had been GA pilot before, and I can recall from my experience in a Piper Cub, that in a GA aircraft, your feedback is pretty good. I mean when you're on the edge of stall, you can really feel the buffet and in this aircraft, they're on autopilot, and so they aren't really feeling anything until they get into the stall situation.
DR. DISMUKES: Right.
MS. HERSMAN: Given that we've heard a lot about the feedback going to the pilots from a human factors perspective, why does it make sense in an automated environment to take the pilot out of the loop with respect to the feedback that they might be getting?
DR. DISMUKES: That's a very interesting point. Of course, you have to put it into context. There are tradeoffs in how many alerts you put in the system. There are a lot already, but I take your point about every time they are approaching level out, they get a chime. Isn't this more important? And it's absolutely true there would be a little -- you'd want a very distinctive alert but one that is not so dramatic. That's well worth looking at” (NTSB Public Hearing).

-Condition of operator: Adverse mental: Acute and/or chronic fatigue effects.

Fatigue is a physiological state of reduced mental and/or physical performance capability resulting from lack of sleep and/or increased physical activity that can reduce a flightcrew member’s alertness and ability to safely operate an aircraft or perform safety related duties (ALPA 2009, 3). This condition could negatively affect pilot performance, especially in the high workload environment in which the mishap crew was operating, and could lead to aggravation of spatial disorientation and inattention to critical flight parameters.

“MR. FREY: Is it fair to ask that if a pilot were at night, instrument conditions -- in other words, in the clouds where he didn't have a visual horizon and was fatigued, could a sudden surprise application of power in certain situations, could that contribute to a sense of vertigo or impairment of correct understanding of what's going on?
DR. NESTHUS: It's possible, but that may not have anything to do with fatigue, actually. Just being in a dark environment without your outside peripheral visual cues might cause that.
MR. FREY: Would a person be able to recognize that state if they were in it? Is there a way that they could recognize that they were experiencing such a thing?
DR. NESTHUS: Yeah, if they're experiencing vertigo?
MR. FREY: Yeah.
DR. NESTHUS: Well, again, spatial disorientation, there are different types of spatial disorientation and one of them is that you don't recognize that you are disoriented, so you may or may not be able to recognize that even if fatigue were not a factor” (NTSB Public Hearing 2009).

“DR. DISMUKES: A computer can fly, stay on the assigned course a lot better. Where we come into play is in making decisions, judgment about ambiguous, novel situations, making value judgments, that kind of thing. Only human can do that, but we're at the sort of boundary in some of the tasks we perform. We cannot be 100 percent reliable, and that is what's behind this whole concept of threat and error management which is we have to acknowledge that the best pilot, the best Olympic performer in the world, will make a mistake and so we need to give this person tools to quickly recognize the mistake and work out the best thing to do then.
MR. FREY: Could we say the same for crews performing critical tasks then?
DR. DISMUKES: Even more so. If it's a novel situation, if they're under stress or fatigue, then their performance will be degraded even more” (NTSB Public Hearing 2009).

The Air Line Pilots Association's message for years has been simple: three basic elements must guide any change in the regulatory environment. First element, sleep. Pilots must have a minimum rest period that provides for an 8-hour sleep opportunity at the right time of the day and before the next duty period. Second element, duty. Time since awake and time on task (time spent on duty) are driving factors in pilot alertness. The duty-day length must take into account the time the duty day started. Third element, circadian. Basic protections will not always be adequate when multiple time zones are crossed or flying is done on the back side of a pilot's physiological clock (Belenky).
We know that adequate sleep sustains performance, inadequate sleep degrades performance, and inadequate sleep creates risk and loss. What are the consequences of sleep restriction or deprivation on performance of duty? The consequences are errors leading to accidents, catastrophe, inadequate strategizing, poor life decisions and long-term health effects. The following are some factors that affect sleep, circadian rhythms, and alertness: early start times, extended work periods, amount of work time within a shift or duty period, and insufficient time off between work periods. Also contributing to performance degradation are the number of consecutive work periods, insufficient recovery time off between consecutive work periods, night work through the window of circadian low (WOCL), daytime sleep periods, day-to-night or night-to-day transitions (schedule stability), and changing work periods (e.g., starting and ending times, cycles). The remaining factors to consider are on-call or reserve status, schedule predictability (i.e., schedules available in advance) and unplanned work extensions (Rosekind).
Circadian factors alter the basic sleep and duty provisions due to sleep disruption, etc. The list of factors above applies, but they are further magnified due to circadian disruption. The short story to circadian issues is that they alter alertness and performance, alter ability to obtain restorative sleep, alter length of sleep and affect duration of re-synchronization (Hursh).
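To show how such factors can be screened mechanically, the sketch below flags a few of the items listed above: short rest before duty, duty overlapping the window of circadian low, and an extended duty period. It is a crude illustration with hypothetical thresholds, not a validated biomathematical fatigue model.

# Illustrative fatigue-factor screen (hypothetical thresholds).
from datetime import datetime, timedelta

WOCL_START, WOCL_END = 2, 6   # roughly 0200-0600 local: window of circadian low

def overlaps_wocl(start, end):
    # Step through the duty period hour by hour, flagging any WOCL overlap.
    t = start
    while t < end:
        if WOCL_START <= t.hour < WOCL_END:
            return True
        t += timedelta(hours=1)
    return False

def fatigue_flags(rest_before_duty_h, duty_start, duty_end):
    flags = []
    if rest_before_duty_h < 10.0:   # hypothetical proxy for an 8-hour sleep opportunity
        flags.append("insufficient rest before duty")
    if overlaps_wocl(duty_start, duty_end):
        flags.append("duty overlaps window of circadian low")
    if duty_end - duty_start > timedelta(hours=12):
        flags.append("extended duty period")
    return flags

# Example: an early-start duty after a short overnight rest opportunity.
print(fatigue_flags(7.5, datetime(2009, 2, 12, 5, 0), datetime(2009, 2, 12, 16, 30)))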

Fatigue summary: Many of these factors were present for both crewmembers in this mishap. Both the Captain and First Officer experienced significant disruptions to sleep, duty and circadian elements in the days and weeks preceding the mishap. Sleep is the key: the only thing that fixes a loss of sleep is sleep itself. The length of duty, where it occurs, and any circadian disruption affect not only the quality of work but also the ability to obtain recovery sleep before the next duty. These factors are intertwined and must be considered together, not only by front line operators but also by regulators who set policy and by organizational leaders who make human resources decisions. When those decisions revolve around a business plan that involves rapid expansion and multiple pilot base and equipment changes, front line employees bear the weight of them. As noted above, sufficient basic human resources in the form of shelter and sustenance were not provided, and employees were forced to adapt to the hostile environment. This is a failure of organizational process as well as personal decision making.

“21:49:18. HOT-1 I've gotta do this. I've gotta— I'm ready to move on. um [sound similar to yawn] excuse me. it's kind of like me. you know I started this this little gig late in life” (CVR).

“22:07:14.2 HOT-2 [sound similar to yawn] alright I'm gonna call in range. I'll be off one for a second” (CVR).


-Condition of operator: Adverse physiological state: Illness:

“21:50:11. HOT-2 [sound similar to sneeze] excuse me.
21:56:17.4 HOT-2 do you want to go down?
21:56:18.6 HOT-1 huh? ohh. I was thinking about that.
21:56:26.4 HOT-2 might be easier on my ears if we start going down sooner.
22:09:26.0 HOT-1 how's the ears?
22:09:27.3 HOT-2 uh they're stuffy.
22:09:31.6 HOT-1 are they poppin?
22:09:32.7 HOT-2 yeah.
22:09:33.3 HOT-1 okay. that's a good thing.
22:09:35.7 HOT-2 yeah I wanta make em pop. [sound of laughter]” (CVR).

-Condition of operator: Adverse mental state: Threats to attention: Cognitive task saturation, channelized attention (prior to and during upset, more to add here on multitasking).

The night instrument environment increased the importance of monitoring flight instruments to maintain awareness of the airplane's attitude and altitude. The pilot's tasks during the approach, however, included maintaining an instrument scan, monitoring pitch, roll, yaw and power, initiating heading changes, arming the approach, checking environmental conditions, monitoring and responding to flight deck and ATC communications, calling for flaps and gear, and initiating checklists. These tasks required attention both inside and outside the cockpit. The competing tasks depleted cognitive reserves, created rapidly shifting frames of reference, left the pilot vulnerable to common visual and vestibular illusions, and reduced his awareness of the airplane's attitude, altitude and trajectory.

-Condition of operator: Adverse mental state: Temporal distortion exacerbated by aircraft performance characteristics?

-Condition of operator: Adverse physiological state: Somatogravic Illusion.

Somatogravic illusion is probable as a factor during initial perception of the undesired aircraft state and decision formation, because of the conditions that existed and the flight path of the aircraft. This illusion has been studied extensively and is known to cause the characteristic flight path exhibited by the mishap aircraft. FDR data reveal rapid deceleration and that the pilot initially pulled back on the controls, increasing the angle of attack. The CVR is inconclusive, except that the absence of callouts for stall recovery may be a supporting indicator. It is probable that the pilot flying initially perceived not a stall but rather an excessive pitch down, due to the deceleration resulting from power reduction, gear deployment, propeller drag increase, induced drag from increased AOA and parasitic drag from flap deployment. The aircraft decelerated 50 knots in 26 seconds, which would produce a strong deceleration force and a vestibular illusion of tumbling forward.
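As a rough order-of-magnitude check on that illusion (a sketch assuming the quoted deceleration were sustained and no outside visual references were available):

a = \frac{\Delta V}{\Delta t} = \frac{50\ \text{kt} \times 0.514\ \text{(m/s)/kt}}{26\ \text{s}} \approx 0.99\ \text{m/s}^2 \approx 0.10\,g

\theta = \arctan\!\left(\frac{a}{g}\right) \approx \arctan(0.10) \approx 5.8^{\circ}

That is, the gravitoinertial vector would tilt roughly six degrees forward of true vertical, a false nose-down pitch sensation consistent with the pull-up response recorded on the FDR.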

The FAA's Aeronautical Information Manual has a section on Illusions Leading to Spatial Disorientation. The section, in part, stated: “Various complex motions and forces and certain visual scenes encountered in flight can create illusions of motion and position. Spatial disorientation from these illusions can be prevented only by visual reference to reliable, fixed points on the ground or to flight instruments…Somatogravic illusion. A rapid acceleration during takeoff can create the illusion of being in a nose up attitude. The disoriented pilot will push the aircraft into a nose low, or dive attitude. A rapid deceleration by a quick reduction of the throttles can have the opposite effect, with the disoriented pilot pulling the aircraft into a nose up, or stall attitude” (FAR/AIM 2009, 8-1-5.b.2.d).

“DR. DISMUKES: Well, it can take a long time. In this case, I don't see any evidence that he ever understood the situation he was in. I mean he knew something was wrong, but I don't know if he ever finally said, wait a minute. I've got to get the nose down no matter where I am. I've got to get the nose down. He did advance the throttles but -- I haven't seen the FDR data but I don't know if he ever did what he had to do at that point which is get the nose down to recover flying speed, reduce the angle of attack” (NTSB Public Hearing).

This illusion has been cited seven times by the NTSB in the last 10 years, and was recently cited in a factual report where the cause of the mishap was listed as “the pilot's spatial disorientation…”, which “…resulted in a loss of control and subsequent collision with trees” (NTSB NYC08FA039). Somatogravic illusion was also cited as a factor in NTSB accident reports DEN04FA104, CHI08FA066, NYC08LA223, NYC03FA205, LAX03FA254 and NYC01FA214, and most recently by the Transportation Safety Board of Canada as part of its accident investigation of a multi-engine propeller airplane crash (TSB 2007).

“The somatogravic illusion occurs in conditions of poor visibility or in darkness when there is an absence of visual cues. Instrument-rated and experienced pilots are not immune to this illusion, which is a subtle and dangerous form of disorientation. The illusion occurs because the body relies on sensory organs in the inner ear to maintain balance and, in the absence of visual cues, signals from these organs can produce a very powerful disorientation. In the case of an aircraft that is accelerating during a go-around, the sense organs of the inner ear of the pilot send a signal to the pilot’s brain that is interpreted as tilting backwards instead of accelerating forward.
According to text in the Fundamentals of Aerospace Medicine, “A relatively slow aircraft, accelerating from 100 to 130 knots over a 10-second period just after take-off, generates +0.16 Gx on the pilot. Although the resultant gravitoinertial force is only 1.01 G, barely more than the perceptible force of gravity, it is directed 9° aft signifying to the unwary pilot a 9° nose-up pitch attitude” (Davis 2008, 171). If the aircraft nose is simultaneously raised, which is usually the case in a go-around, the pilot has a very strong sensation of climbing. The illusion of false climb tends to lead the pilot to lower the nose and descend. The aircraft then accelerates and the illusion can intensify. If the aircraft is being flown in proximity to the ground, ground contact can occur before the pilot can assimilate information from the aircraft's instruments, overcome the powerful illusion, and take corrective action” (A07C0001).
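The figures in that passage follow directly from the vector sum of gravity and the inertial reaction to acceleration, the same geometry applied to the mishap deceleration above:

a = \frac{(130 - 100)\ \text{kt} \times 0.514\ \text{(m/s)/kt}}{10\ \text{s}} \approx 1.54\ \text{m/s}^2 \approx 0.16\,g

|G| = \sqrt{1^2 + 0.16^2}\,g \approx 1.01\,g, \qquad \theta = \arctan(0.16) \approx 9^{\circ}\ \text{aft}

For acceleration the tilt is aft, a false nose-up sensation; for the mishap aircraft's deceleration the tilt is forward, inviting a nose-up control input.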

-Condition of operator: Adverse physiological state: Spatial Disorientation:

According to the FAA Airplane Flying Handbook (FAA-H-8083-3), "Night flying is very different from day flying and demands more attention of the pilot. The most noticeable difference is the limited availability of outside visual references. Therefore, flight instruments should be used to a greater degree.” Generally, at night it is difficult to see clouds and restrictions to visibility, particularly on dark nights or under overcast. "The vestibular sense (motion sensing by the inner ear) in particular tends to confuse the pilot. Because of inertia, the sensory areas of the inner ear cannot detect slight changes in the attitude of the airplane, nor can they accurately sense attitude changes that occur at a uniform rate over a period of time. On the other hand, false sensations are often generated, leading the pilot to believe the attitude of the airplane has changed when in fact, it has not. These false sensations result in the pilot experiencing spatial disorientation."
Spatial disorientation has been cited by NTSB in mishap reports 200 times since January 1, 1999. In one event, a commercial airliner climbing out over the water rolled over 60 degrees to the right before the pilot flying realized the upset condition. In this incident the NTSB determined the probable cause to be linked to spatial disorientation. Other factors in the incident were the cloud layer and dark night. The NTSB also referenced Federal Aviation Administration Advisory Circular (AC) 60-4A, "Pilot's Spatial Disorientation," where upset recovery tests were conducted with qualified instrument pilots. “The results indicated that it can take as long as 35 seconds to establish full control by instruments after a loss of visual reference of the earth's surface. AC 60-4A further stated that surface references and the natural horizon may become obscured even though visibility may be above visual flight rules minimums and that an inability to perceive the natural horizon or surface references is common during flights over water, at night, in sparsely populated areas, and in low-visibility conditions” (IAD00IA032).
In another event, a commercial airliner slowed below stall speed after the autothrottles disconnected for an undetermined reason (DCA97MA049). The aircraft subsequently lost 3,000 feet of altitude before the pilots could recover. The germane point here is that the amount of time and altitude required to recognize the effects of spatial disorientation, and then to recognize and recover from the upset and subsequent stall, is significant.



-Condition of operator: Adverse physiological state: Physical fatigue.

-Personnel factors: Crew resource management: Failure to coordinate actions (raising flaps).

CRM was not properly trained.

-Personnel factors: Personal readiness: Failure to obtain adequate crew rest.
Previously discussed; a manifestation of organizational human resources error.

End of Preconditions for Unsafe Acts.

Active Failures / Unsafe Acts

Scenarios:
1) Did not recognize the upset?
2) Recognized the upset, but the PF reacted incorrectly: no call outs, did not lower the nose sufficiently; the PM reacted too quickly by raising the flaps. The actions map correctly onto cognitive memory for tailplane stall or stall recovery, but NOT the timing (lack of industry and organizational training on OOCF, CRM).
3) Reacted in accordance with insufficient/incorrect training to a perceived tailplane stall.
4) PF recognition overridden by spatial disorientation, supported by the lack of communication.

-Errors: Skill based errors: Memory failure: Omitted step in procedure (call outs) or performed incorrect procedure (nose high upset recovery).

“DR. DISMUKES: Well, you know, we've got some issues, in particular, with upset attitude recovery. The data we have which is incomplete suggests that surprise and stress are problematic, that even experienced pilots initially flail around. They do not recognize which upset situation they're in and thus do not make the right responses. So theoretically in principle, we would really like to provide training in upset attitudes that includes the element of surprise because that is probably the biggest stumbling block. Now that's not an easy matter. I don't know of anybody who's doing that. I think it could be done but it would take some work” (NTSB Public Hearing).

“DR. DISMUKES: As with many procedural issues, you have to start with the theory in the classroom, but that's not going to do much good. You've got to practice it repetitively until it's automatic and then you've got to check it. You get what you reward and what you check…Stall recoveries, the pilots know they're about to experience a stall. They even know which kind, and they are hair triggered to respond. What we would like for them to do is somehow put them in the scenario that's similar to actual accident scenarios, so that they don't know it's coming and they have to diagnose which situation they're in. I don't know exactly how you'd do that. Somebody yesterday mentioned, I think it was the gentlemen from the FAA, at least have them fly into it on autopilot. That's a step in the right direction, but we need to do a lot more. We haven't done the research” (NTSB Public Hearing).

-Violations: Routine: Violation of sterile cockpit regulation:

-Errors: Skill based errors: Poor technique: Airmanship: Nose high upset recovery; Stall/Spin. Rejected. Not trained properly?

“MR. FREY: Thanks. Was there any type of recovery or upset condition that was more likely to result in a failure than others?
DR. DISMUKES: It seems, looking for patterns, nose high was the most problematic. The two scenarios in which there was good recovery were a wind shear recovery which was getting a lot of emphasis at the time and still is, and then a nose low spiral dive. The others, in some way or the other, involved nose high and recovery was not stellar, and as I had mentioned earlier, recognition of what the condition was, was problematic, even things that would seem obvious, like clicking off autopilot often didn't happen.
MR. FREY: What about night conditions or instrument conditions? Were they looked at?
DR. DISMUKES: Very much so because we're designed, our visual system, all the rest of our brain is designed to extract information from the environment continuously and keep our model of the situation updated. In impoverished conditions, basically when we don't have visual cues, it's a lot harder” (NTSB Public Hearing).


-Errors: Decision errors: Picked an inappropriate but cognitively salient procedure; ?wrong? response to undesired aircraft state, i.e., tailplane stall?

“DR. DISMUKES: And, as I said earlier, one of the risks, if you haven't practiced a maneuver to the point of proficiency and you're surprised, you're under stress, your mind is flailing around, it may set on the first thing it hits on and that may be the wrong thing.
MR. FREY: Could it possibly be that what happened to this crew?
DR. DISMUKES: It's conceivable, certainly conceivable” (NTSB Public Hearing).

-Errors: Perceptual error: Visual misperception: Airspeed monitoring

Proactive mishap prevention plans have well-designed, integrated controls for human error. First, minimize the occurrence of errors by ensuring high levels of staff competence, designing controls to match human task management characteristics, providing proper checklists, procedures and publications, and reducing stressful environmental working conditions. Training programs aimed at increasing cooperation between crew members will reduce the number of errors, but total elimination of error is unrealistic.
Before a crew can react to information, it must first be sensed. There is potential for error here, because sensory systems function only in a narrow range, and that range is further reduced in the impoverished conditions of low visibility and nighttime. Once information is sensed, it must make its way to the brain for processing. This takes valuable time, after which a conclusion is drawn about the nature and meaning of the information received. This interpretation is also prone to error. Experience, expectation, attitude, motivation and fatigue all have definite influences on perception and are additional sources of error.

After conclusions have been formed about the meaning of the message, decision making begins. Many factors lead to erroneous decisions: commercial considerations, fatigue, and physical and physiological states. Action follows decision. This is another stage for error, because if equipment is designed in such a way that it can be operated wrongly, sooner or later it will be. Once action is taken, feedback can also compound error.

The PF needed information, and the airplane did not provide that critical information in a salient or timely manner. The appearance of a small low airspeed cue on the left side of the primary flight display during this dynamic phase of flight was not a sufficient alert, caution or warning of impending danger. Stick shaker and pusher timing is too late; by that point too many barriers have been breached to effectively regain the desired state, given the dynamic stability of the aircraft coupled with the pilots' training and the environmental conditions.


“DR DISMUKES: We also need to develop and train the specific techniques that pilots can use. It's not enough to say you're supposed to monitor A, B and C. We need to say how to do it or when to do and what techniques you can use, and that research, it's part of the research matter but it's also a matter of just implementing what we could do now” (NTSB Public Hearing).

“MR. HARRIS: Understood. Thank you. Is there a direct or indirect relationship between fatigue and situational awareness?
DR. DISMUKES: I believe there is. I don't know that I've seen explicit research on this, but you heard in the testimony yesterday about fatigue, that it does affect our attention. It has very classical effects that we persevere in efforts that may not be productive, our attention narrows and so forth. So I would be inclined to say that the problems I talked about, that we're all vulnerable to at all times are exacerbated considerably by fatigue” (NTSB Public Hearing).


Recommendations:

To FAA:
1. Ensure aircraft systems meet basic human factors design philosophies.
2. Create science-based Flight Time Duty Time rules.
3. Require more experience, screening, and mentoring for new-hire pilots entering the airline industry.
4. Facilitate voluntary safety reporting programs already in place at many airlines.

To Industry:
1. Improve pilot training to reduce Loss of Control mishaps.

Conclusion:


Recent research by the NASA Flight Cognition Laboratory has identified multiple cross-cutting factors that appear in mishaps. These factors are concurrent task management and workload issues, situations requiring rapid response, plan continuation bias, equipment design flaws, and misleading or absent cues contributing to crew error. Also discovered was that inadequate knowledge or experience provided in training, and hidden weaknesses in defenses against error, appeared on a regular basis. Other factors were stress and fatigue, which narrow perception and reduce working memory capacity. Finally, social, cultural and organizational factors inevitably play a role not only in every mishap but in every flight (Dismukes et al 2007, 296-300).
While it is natural for society to want simple causal explanations, that is not representative of the sociotechnical nature of mishaps in high reliability organizations. The air transportation system is inherently safe, and random events of this nature are opportunities to learn, not to place blame. To do so would reduce the motivation of all stakeholders in the system to “make changes that might prevent future accidents” (Dekker 2008; Reason 1990). Flight crews are often implicated directly because of mandated determinations of cause or probable cause, but these investigative efforts will bear full fruit only if we look for causality in the probabilistic confluence of many factors.


References:


Air Line Pilots Association (ALPA) 2009. ALPA adopts landmark fatigue policy. Retrieved November 15, 2009 from http://www.alpa.org/.

Belenky, Gregory. Sleep and Performance Research Center, Washington State University.

Davis, J.R. et al. 2008. Fundamentals of Aerospace Medicine, Fourth Edition. Lippincott Williams & Wilkins. Chapter 6, Spatial Disorientation in Flight: 177.

Dekker, Sidney 2008. Just culture: Balancing safety and accountability. Burlington, VT. Ashgate Publishing Company.

Dismukes, R. Key, Benjamin A. Berman and Loukia D. Loukopoulos 2007. The limits of expertise. Rethinking pilot error and the causes of airline accidents. Burlington, VT. Ashgate Publishing Company.

Dismukes, R. Key, Immanuel Barshi and Loukia D. Loukopoulos 2009. The multitasking myth. Handling complexity in real-world operations. Burlington, VT. Ashgate Publishing Company.

Evans, G. Edward and Patricia L. Ward 2004. Management basics for information professionals. New York. Neal-Schuman Publishers, Inc.

Human Factors in Accident Investigation Manual (HFIAI) 2005. Transportation Safety Institute. Oklahoma City, OK.

Hursh, Steven R. President, Institutes for Behavior Resources; Professor, Johns Hopkins University, School of Medicine.

International Civil Aviation Organization (ICAO). Annex 13: Aircraft Accident and Incident Investigation.

International Civil Aviation Organization (ICAO) 2009. Safety Management Manual (SMM), Doc 9859, Second Edition.

Jeppesen 2009. FAR/AIM. United States.

National Transportation Safety Board (NTSB) 2009. Aviation Accident Database and Synopses. Retrieved November 14, 2009 from http://www.ntsb.gov/ntsb/query.asp#query_start.

National Transportation Safety Board (NTSB) 2009. Loss of control during maneuvering flight in IMC NTSB Identification: NYC08LA223, Saturday, June 21, 2008 in Rockland, ME, PIPER PA-28-140, registration: N8776N. Retrieved November 14, 2009 from http://www.ntsb.gov/ntsb/brief.asp?ev_id=20080710X01005&key=1.

National Transportation Safety Board (NTSB) 2009. Spatial disorientation during the initial climb causing loss of control at night with poor visual conditions. NTSB Identification: CHI08FA066, Wednesday, January 16, 2008 in Cleveland, OH, Hawker Beechcraft Corp. 58, registration: N3217L. Retrieved November 14, 2009 from http://www.ntsb.gov/ntsb/brief.asp?ev_id=20080710X01005&key=1.

National Transportation Safety Board (NTSB) 2009. Spatial disorientation, which resulted in a loss of control and subsequent collision with trees in IMC and dusk conditions. NTSB Identification: NYC08FA039, November 20, 2007 in Maine, NY, MOONEY M20K, registration: N252DD. Retrieved November 14, 2009 from http://www.ntsb.gov/ntsb/brief.asp?ev_id=20080710X01005&key=1.

National Transportation Safety Board (NTSB) 2005. Attempted flight into IMC, airplane collided with trees. NTSB Identification: DEN04FA104, July 11, 2004 in Paris, AR, Cessna 172I, registration: N46174. Retrieved November 14, 2009 from http://www.ntsb.gov/ntsb/brief.asp?ev_id=20080710X01005&key=1.

National Transportation Safety Board (NTSB) 2004. Failure to maintain aircraft control due to spatial disorientation in IMC, aircraft descended into trees. NTSB Identification: NYC03FA205, Saturday, September 27, 2003 in Concord, MA, Cessna 182T, registration: N963LP. Retrieved November 14, 2009 from http://www.ntsb.gov/ntsb/brief.asp?ev_id=20080710X01005&key=1.

National Transportation Safety Board (NTSB) 2005. Pilot's in-flight loss of control due to a somatogravic illusion and/or spatial disorientation during night flight. NTSB Identification: LAX03FA254, August 08, 2003 in Bishop, CA, Cessna 340A, registration: N340DC. Retrieved November 14, 2009 from http://www.ntsb.gov/ntsb/brief.asp?ev_id=20080710X01005&key=1.

National Transportation Safety Board (NTSB) 2003. While taking off at night in IMC, pilot's failure to maintain a proper climb rate, which was a result of spatial disorientation. NTSB Identification: NYC01FA214. Friday, August 24, 2001 in Ithaca, NY, Learjet 25, registration: N153TW. Retrieved November 14, 2009 from http://www.ntsb.gov/ntsb/brief.asp?ev_id=20080710X01005&key=1.

National Transportation Safety Board (NTSB) 2001. Failure to maintain control of the airplane during climb out over water at night, which was a result of spatial disorientation. NTSB Identification: IAD00IA032, Thursday, March 30, 2000 in NEW YORK CITY, NY, Boeing 767-332, registration: N182DN. Retrieved November 15, 2009 from http://www.ntsb.gov/ntsb/brief.asp?ev_id=20001212X20649&key=1.

National Transportation Safety Board (NTSB) 1997. Airplane slowed below stall speed after the autothrottles disconnected, resulting in an in-flight upset. NTSB Identification: DCA97MA049, May 12, 1997 in WEST PALM BEACH, FL, Airbus Industrie A300B4-605R, registration: N90070. Retrieved November 15, 2009 from http://www.ntsb.gov/ntsb/brief.asp?ev_id=20001208X07893&key=1.

Naval Postgraduate School 1995. Aviation safety program: Command policy and reporting. Monterey, CA.

Patton, Michael Quinn 2002. Qualitative research and evaluation methods. Thousand Oaks, CA. Sage Publications.

Reason, James 1990. Human error. New York: Cambridge University Press.

Reason, James 1997. Managing the risks of organizational accidents. Burlington, VT. Ashgate Publishing, Ltd.

Rosekind, M.R. 2005. Managing work schedules: An alertness and safety perspective. In M.H. Kryger, T. Roth, W.C. Dement, editors, Principles and Practice of Sleep Medicine: 682.

Shappell, Scott A. and Douglas A. Wiegmann 2003. A Human Error Approach to Aviation Accident Analysis. Burlington, VT. Ashgate.

Simon, Herbert A. 1997. Administrative behavior. New York, The Free Press.

Transportation Safety Board of Canada 2007. Aviation Investigation Report, Collision with Terrain, Transwest Air, Beech A100 King Air C-GFFN, Sandy Bay, Saskatchewan, 07 January 2007, Report Number A07C0001.