
Investigations will soon be underway into the Washington D.C. Metro Red Line crash that killed nine people and injured dozens. Early indications suggest a computer system malfunction, while several accounts have raised questions about whether the train's driver applied the brakes in time. The problem, many experts argue, is that investigations tend to focus attention on discrete instances of human or machine error, when the real failure usually lies in the relationship between the operator and the automated system. Metro officials have already begun a review of the automated system on the track where the accident occurred and have found anomalies. Although such measures are essential, improving an automated system creates a paradox: the safer the system becomes, the more authority it is given and the more heavily humans come to rely on it.

Automated systems are often created to relieve humans of repetitive tasks. The autopilot on a plane or the automated speed-control systems in mass transit, for example, are conveniences that can become crutches. The more reliable the system, the more likely operators are to lose concentration or overtrust it, increasing the likelihood of catastrophe. Machines are also sometimes given authority over humans in critical situations. Some airplanes, for instance, were manufactured with weight-sensitive sensors that activate once the wheels touch down for landing. These sensors override the pilot's judgment, and in one case led to an accident after rain on the runway caused the plane to hydroplane: because the sensors did not register the landing, the pilot could not intervene and engage the plane's thrust reversers, and the airplane overshot the runway.

There is a growing consensus among experts that automated systems should be designed to enhance the performance and accuracy of human operators rather than make them complacent; human "interference" should not be designed out. Several studies show that regular training exercises requiring operators to shut off their automated systems and run everything manually help them retain skills and alertness. Further, understanding how automated systems are designed allows operators not only to detect when a system has failed, but to notice when it is on the brink of failure.
