Quote"I can't remember who it was that said it now, but their concern was that a lot of automation might result in a situation where the crew is so overwhelmed with data that they forget to fly the aeroplane."

I believe the term you are referring to is "information overload" ;-) Whilst your question is interesting, one should note that information overload isn't inherently an automation problem; it is a problem of information acquisition, whether that acquisition is automated or not. In fact, if used correctly, automation can prevent information overload and improve pilots' situational awareness through decision support and/or filtering of incoming data.

Orasanu and Martin wrote an excellent article on the subject: "Errors in Aviation Decision Making: A Factor in Accidents and Incidents", HESSD, 1999.
1985, February 19: A B-747SP, flown by a China Airlines Capt., suffered an engine failure while cruising at 41,000 ft. The Capt. left it on autopilot too long. The autopilot tried to maintain that altitude, which was ultimately impossible at that weight with only 3 engines functioning. As the aircraft approached the stall, with airspeed steadily decaying, the Capt. finally disconnected the autopilot. He was not prepared, because he had failed to trim in rudder to compensate for the asymmetrical thrust condition; the autopilot had been maintaining wings level by the use of aileron and spoilers only.

[Autopilots normally do not control the rudder in climb, cruise, or descent. They use only the ailerons, spoilers, elevators, and horizontal stabilizer trim.]

When he hit that disconnect switch, the plane rolled rapidly and entered a dive. Although the plane exceeded the speed of sound, tearing parts off and causing major structural damage, the Capt. was able to make a recovery at a few thousand feet over the Pacific Ocean, after he broke out of the clouds and could see his attitude via outside visual reference. There were, incredibly, only two serious injuries among the 274 passengers and crew.
Airline pilots are required by (European) law to undergo simulator tests (which also involve "manual" piloting) every six to eight weeks. Failing one of these tests will cause you to lose your job.
Seems to suggest a "well, we know that's never going to happen" mindset.
Where did you see that simulators can't emulate a stall?
Quote"Seems to suggest a 'well, we know that's never going to happen' mindset."

Well, maybe. To do live training they'd presumably have to genuinely stall a genuine plane, probably repeatedly. The risk of accidental stalling is clearly a real one, and a stall of a loaded passenger plane that leads to a crash will likely kill several hundred people... but that doesn't necessarily mean that live training is the right thing to do. If stalls in modern, highly automated planes are very rare, if for example there is a reasonable expectation that most pilots would never otherwise encounter a stall in their career (and I have no idea what the stats are, but it must be something for which stats exist), then more live training would not necessarily save more lives in the long run... because if stall training killed a significant number of crews, that would add up! (Leaving aside the fact that if pilots dropped several planes out of the sky in training, it might obviate the need for training entirely... because it would put the public right off those models, I'd think!)
Quote from: graham.d on 09/08/2011 17:47:19"Where did you see that simulators can't emulate a stall?"

I thought it was on that site, but I can't find it now either! I'll try to retrace my steps and dig it up. I was quite surprised when I read it too. I'm pretty sure it's easy enough to simulate the stall itself; I got the impression that what was lacking was the simulators' ability to model the recovery accurately.