Interpersonal Divide was among the first to question whether technology overrode common sense in the Boeing MAX 8 crashes. Now consumer advocate Ralph Nader makes the same argument, dubbing the airline disasters a product of “algorithmic arrogance” in autonomous systems.
Last month Interpersonal Divide noted that technology was not only suspect in the failure of two Boeing 737 MAX 8 aircraft; it also played a role in pilot training, which relied on a 56-minute online iPad lesson. Now Boeing software and inadequate training have been cited as causes of the Lion Air and Ethiopian Airlines crashes, which killed some 300 people.
Boeing and the Federal Aviation Administration showed an appalling lack of common sense after the Ethiopian Airlines crash, the second involving the MAX 8. We noted: “One aircraft falling from the skies might have gotten the FAA’s attention; a second craft doing the exact same thing, with pilots struggling to right their jet, should have set off alarms. Nevertheless, the FAA initially didn’t act even after countries around the globe had grounded the MAX 8. This might have occurred because the Administration is ‘data-driven’ and data from the black box of the Ethiopian Airlines crash was not immediately available.”
Boeing MAX 8 aircraft are grounded as the company updates its safety software, again relying on technology to fix what Nader believes is faulty engineering.
This is an example of what Ralph Nader calls “algorithmic arrogance.” In an interview with NPR, he states:
You know, this is a harbinger. You’ll be covering a lot of this in medical technology, autonomous weapons, self-driving cars. It’s the arrogance of the algorithms, not augmenting human intelligence but overriding it and replacing it. The significance of this Boeing disaster is that it can teach us some very important lessons about maintaining human intelligence and not ceding it to autonomous systems that have no moral base and no intuition.
In 2007, Interpersonal Divide author Michael Bugeja contributed a chapter, “Digital Ethics in Autonomous Systems,” to Lee Wilkins and Clifford G. Christians’ The Handbook of Mass Media Ethics.
Here is an excerpt, citing French-Maltese philosopher Jacques Ellul’s dictum that technology overrides human intellect, altering any system in a deterministic manner (“it realizes itself”):
In other words, apply technology to the economy, and the economy henceforth is about technology (think NASDAQ). Apply it to politics, and politics henceforth is about technology (think Kennedy-Nixon debates). Apply it to education, and education henceforth is about technology (think Sesame Street). Apply it to journalism, and journalism henceforth is about technology (think convergence). Moreover, because technology is autonomous and independent of everything, it cannot be blamed for anything.
Ralph Nader is challenging that last assumption, calling for a recall of all Boeing MAX 8 aircraft, just as auto companies recall defective vehicles: “They’ve got to go back to the drawing board, a clean sheet design of a new plane.”
A new edition of Wilkins and Christians’ The Handbook of Mass Media Ethics, with an updated Bugeja chapter, is in progress and will be published by Routledge/Taylor & Francis. Wilkins and Christians rank among the top media ethicists. Wilkins also focuses her research on media coverage of the environment, hazards, and risks. Christians is internationally known for his research in the contemporary study and philosophy of ethics and technology.