Self-Driving Cars, Streaming Video: A Lethal Combination

Rafaela Vasquez, operating an Uber autonomous (self-driving) vehicle as its safety backup, was reportedly watching a streaming episode of “The Voice” when the vehicle struck and killed Elaine Herzberg as she crossed the street with her bicycle.

You can read about the particulars of the case here.

A lengthy report from the Tempe Police Department stated: “The driver in this case could have reacted and brought the vehicle to a stop 42.61 feet prior to the pedestrian.”

Several aspects of this case relate to research in Interpersonal Divide in the Age of the Machine, from the distraction of entertainment videos, especially while driving, to self-driving cars and the false sense of security that convenience affords.

Of special interest in this case is how Big Data analytics helped identify what distracted Vasquez moments before the crash. According to PC Mag,

During their investigation, police sent search warrants to YouTube, Netflix, and Hulu to retrieve driver Rafaela Vasquez’s viewing history at the time of the crash. YouTube and Netflix both said Vasquez was not watching anything on their services at that time. Hulu, however, said she was streaming an episode of The Voice just before the crash occurred.

Since its first publication in 2004, Interpersonal Divide has warned about everyday moments–some that require our full attention–being deemed boring because of the ubiquitous presence of consumer media. Whether checking social media in lecture or texting en route to work, technology lures us with on-demand content and insistent digital engagement. Couple that with self-driving cars that require human safety backups, such as Vasquez was being paid to be, and vehicular homicide becomes an inevitable result.

This just happens to be one of the first of its kind. More will follow as self-driving cars become the norm rather than the exception.

The lesson also involves Uber and its corporate responsibility to ensure that its safety drivers receive adequate training about distracted driving in self-driving cars, which put people’s lives in the programmed hands of a machine rather than in the human ones supposedly on watch.

Interpersonal Divide in the Age of the Machine devotes several chapters to the dangers of unmonitored technology assuming ever greater control of our lives. The first step in identifying those dangers is to understand technology’s nature: it changes everything it touches while itself changing very little. Once we acknowledge that, we can gain greater control over how we use technology–or how it uses us–in everyday activities.

Bugeja Discusses Machine Values at Educator State Convention


Photo caption and credit: Michael Bugeja discusses machine vs. moral codes at a state convention of women educators. Photo by Diane Bugeja.

The average American spends 70 percent of waking hours looking at screens, including television, running the risk of replacing human values with machine ones, Iowa State Professor Michael Bugeja told members of Delta Kappa Gamma at its state convention Friday in Des Moines.

Delta Kappa Gamma promotes professional and personal growth of women educators and excellence in education. Some 140 current and retired teachers attended the two-day event at the West Des Moines Marriott.

Dr. Bugeja, author of Interpersonal Divide in the Age of the Machine (Oxford Univ. Press, 2018), spoke about common machine values that have replaced humane ones like truth, integrity, responsibility, and empathy.

Machine values include:

  • IMPORTANCE OF SELF over others
    “The rise of the selfie in the front-facing camera is a symbol of this value,” Bugeja said, noting that corporations data-mine the self so as to sell more products to consumers and their social media friends.
  • BOREDOM over attentiveness
    “Any free moment, including in cars or in lecture, is boring, triggering the urge to check a smartphone or social media,” he said.
  • OVERSHARING over privacy
    “What we tell others on Facebook, often mere acquaintances and strangers, we used to share rarely, and only with trusted friends and family.”

Convention Co-chair Sheila Anderson invited Dr. Bugeja after hearing him and Iowa State’s Doug Jacobson speak about privacy and big data on Talk Of Iowa on IPR.

You can see more machine values here or hear the IPR interview by clicking here.

Dr. Bugeja teaches media ethics and technology and social change at ISU’s Greenlee School of Journalism and Communication. Interpersonal Divide in the Age of the Machine is available at $19.95 from Amazon or Oxford University Press.


Tesla’s Setback: Autopilot, Lithium Batteries and Fire

Tesla’s Model X with self-driving Autopilot technology and crash-avoidance systems nevertheless crashed into a barrier, igniting the lithium battery pack and killing an Apple engineer.

According to ZDNet, the NTSB’s preliminary findings of the investigation “don’t look good for Elon Musk’s electric-vehicle company,” inasmuch as the vehicle’s crash-avoidance systems failed to kick in “before the horrific crash, which sheared off the front-end of the Model X and killed its 38-year-old driver, Apple engineer Wei ‘Walter’ Huang.”

The Verge, a technology and culture site, reported Tesla is a party “in two other ongoing investigations into non-fatal accidents: one from January 22nd, 2018 involving Autopilot, and one from last summer involving a battery fire.”

The Model X uses lithium-ion batteries that may ignite in crashes, although there are no reliable statistics yet showing the battery pack is more prone to fires than traditional gasoline tanks.

Click here to read Tesla’s statement about the crash. The company noted: “If you are driving a Tesla equipped with Autopilot hardware, you are 3.7 times less likely to be involved in a fatal accident.”

The issue explored in this post concerns other variables associated with self-driving cars that use lithium-ion batteries prone to fire. Factor in rising distracted-driving statistics owing to smartphones and other digital devices in cars. The National Safety Council reports that “more than 3,000 people are killed on U.S. roads every year in distracted driving crashes, the federal government reports. Cell phone use is a common driver distraction.”

Interpersonal Divide in the Age of the Machine covers self-driven cars and the complexities of human life that technology sometimes cannot compute: “Human motivation and decision-making, even in negotiating who drives first at a four-way stop (something self-driving cars cannot achieve), are too complex, random and illogical.”

Self-driving cars have a long way to go in assessing the multitude of variables that may contribute to future tragic crashes.

Podcast: Digital Assistants and Privacy

During this episode of River to River, host Ben Kieffer talks with Iowa State University professors Michael Bugeja and Doug Jacobson about how smart these speakers really are. They discuss how smart speakers pose a privacy risk after recent incidents in which the devices started recording conversations and sending them without being told to do so. Jacobson, who is a professor of engineering, says that in an era of big data, privacy is nearly dead.

Bugeja and Jacobson also talk about new privacy rules being implemented in the European Union.

To listen to the podcast, click the photo above or click here.

Twitter: Fountain of Untruth

Trick question: How many characters does it take to kill truth? There are two answers: 280 Twitter keystrokes or 1.3 billion Twitter users.

“There was a time when some intelligent observers of social media believed that Twitter was a ‘truth machine’ — a system whose capacity for rapidly debunking falsehoods outweighed its propensity for spreading them,” Slate reported in March 2018. “Whatever may have remained of that comforting sentiment can probably now be safely laid to rest.”

Slate cited a study in the journal Science that found false rumors on Twitter typically spread faster and farther than tweets eventually proved true. Bots, Russian or otherwise, weren’t the reason for the falsehoods. People were.

Twitter is a font of untruth and a test of Iowa’s culture of “nice.”

For the rest of the commentary, visit the Des Moines Register or click here.

What Will Be Social Media’s Role in NFL’s New Anthem Rule?

In an age of presidential tweets, society has learned how social media can circumvent old rules and establish new norms that impact policy in government and the corporate world.

Last week the NFL announced a new policy that fines players for taking a knee during the national anthem on the field but allows protests out of camera view in the locker room.

That locker-room concession is legally important when it comes to the First Amendment.

In general, private companies like NFL franchises can restrict First Amendment freedoms and in some states even fire employees who violate policies.

But that’s not as ironclad as some attorneys might believe, especially ones working with the NFL on this new anthem policy. That’s where the locker-room alternative comes into play.

Fact is, some states have laws that protect the political speech rights of employees in private companies. According to Eugene Volokh, Gary T. Schwartz Professor of Law at UCLA, in his paper titled “Private Employees’ Speech and Political Activity: Statutory Protection Against Employer Retaliation” …

“About half of Americans live in jurisdictions that protect some private employee speech or political activity from employer retaliation. Some of these jurisdictions protect employee speech generally. Others protect only employee speech on political topics. Still others protect only particular electoral activities such as endorsing or campaigning for a party, signing an initiative or referendum petition, or giving a political contribution.”

Volokh was quoted on May 24, 2018 in a Washington Post article about the new policy, stating that the locker-room exception helps make it a “pretty solid case for the NFL.”

If that was the legal intent of the new policy, imagine, then, NFL players using Facebook Live or other social media apps to broadcast their protests from inside locker rooms.

There’s a bigger question for mainstream media with locker-room access: Will they be banned from simulcasting locker-room protests?

Interpersonal Divide in the Age of the Machine documents how social media continues to be one step ahead of the best legal minds, chiefly because no one can predict the circumstances and consequences of its 24/7 instantaneous global access.

Perhaps the NFL will work with Facebook to filter out such protests, especially since Facebook is looking at ways to do just that, according to this article in Tech Times.

We will have to wait until the NFL season begins to see how the new policy plays on the field, in the locker-room and potentially, in the courts.

Couple Not Laughing at Eavesdropping Alexa

In March, people using Amazon’s digital assistant, Alexa, reported eerie laughter, a problem the company claimed to have fixed, according to the New York Times.

Alexa was said to mistakenly hear “Alexa, laugh” when other words were spoken.

Amazon reprogrammed the device to say, “Sure, I can laugh,” so as not to mistake the command.

Well, a Portland couple isn’t laughing after the same technical problem–Alexa misinterpreting language–caused the digital assistant to cherry-pick a series of words and phrases that it took out of context:

  • “Alexa,” which triggers the recording mechanism.
  • “Send message,” which disseminates the message.
  • “[Name],” which sounded like a name in the couple’s contact data.
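
The chain of misfires above can be sketched in code. This is purely illustrative, not Amazon’s actual wake-word or speech-recognition algorithm: the similarity threshold, command list, and contact names below are invented for the example. The point is that any matcher loose enough to tolerate noisy speech will sometimes “hear” a wake word, a command, and a contact in ordinary conversation:

```python
from difflib import SequenceMatcher

# Hypothetical vocabulary for the sketch (not Amazon's).
WAKE_WORD = "alexa"
COMMANDS = ["send message"]
CONTACTS = ["alex"]

def sounds_like(heard: str, target: str, threshold: float = 0.7) -> bool:
    """Crude stand-in for acoustic matching: character-level similarity."""
    return SequenceMatcher(None, heard, target).ratio() >= threshold

def interpret(utterance: str, threshold: float = 0.7):
    """Return the (command, recipient) a naive matcher would infer, or None."""
    words = [w.strip(".,?!").lower() for w in utterance.split()]
    # Step 1: any word resembling the wake word "wakes" the device.
    if not any(sounds_like(w, WAKE_WORD, threshold) for w in words):
        return None
    # Step 2: if every word of a known command appears (even fuzzily)
    # somewhere in the conversation, treat the command as spoken.
    for command in COMMANDS:
        if all(any(sounds_like(w, part, threshold) for w in words)
               for part in command.split()):
            # Step 3: any word resembling a contact becomes the recipient.
            recipient = next(
                (c for c in CONTACTS
                 if any(sounds_like(w, c, threshold) for w in words)),
                None)
            return command, recipient
    return None

# Background chatter that never addressed the device still "matches":
print(interpret("Alexis, could you send that message to Alex later?"))
# → ('send message', 'alex')
```

Tighten the threshold and the device misses real commands; loosen it and it eavesdrops by accident–the trade-off at the heart of the incident described above.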

As Bloomberg News reported, the couple was contacted by an acquaintance who warned, “Unplug your Alexa devices right now. You’re being hacked.”

Interpersonal Divide in the Age of the Machine warns against glitches like this that can be catastrophic when artificial intelligence attempts to make sense out of the illogical, positional, multicultural English language.

Worse, conversations and data can be shared without users knowing.

Here’s an excerpt:

In addition to knowing all about each individual from our most popular devices such as iPhones and applications such as Facebook, which surveil and sell simultaneously, the government can compile specific dossiers about our electronic identities which may or may not represent who we truly are. A digital fingerprint differs from a real one. As this book documents, we are more than our cookies say we are. But machines could care less.

All technology comes with privacy risks. That’s why it is vital to understand service terms along with what each device is programmed to do when users invite machines into the long-gone privacy of their homes.