App Mistakenly Matches Congress Members with Mugshots

An ACLU test of Amazon’s “Rekognition” facial identification tool falsely matched 28 members of Congress–disproportionately people of color–with criminal mugshots, documenting algorithmic bias, as covered in Interpersonal Divide in the Age of the Machine.

According to the ACLU, false matches included “six members of the Congressional Black Caucus, among them civil rights legend Rep. John Lewis (D-Ga.).” The organization is calling on Congress to join its efforts to halt law enforcement use of face surveillance.

You can read a full account of the test as covered by National Public Radio.

The ACLU test paired Rekognition software with a database of 25,000 arrest photos and then searched that database against photos of current members of Congress.
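Face-matching searches of this kind generally reduce each photo to a numeric embedding and flag any database entry whose similarity to the probe photo exceeds a confidence threshold; the lower the threshold, the more matches, and the more false positives. Here is a minimal sketch of that idea in Python, using random vectors in place of real face embeddings (the names, numbers, and functions here are invented for illustration and do not reflect Rekognition’s actual API or settings):

```python
import numpy as np

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def search(probe, database, threshold):
    # Flag every database face whose similarity to the probe photo
    # meets the confidence threshold.
    return [name for name, emb in database.items()
            if cosine_similarity(probe, emb) >= threshold]

# Toy 128-dimensional vectors standing in for real face embeddings.
rng = np.random.default_rng(0)
mugshots = {f"arrest_{i}": rng.normal(size=128) for i in range(25000)}
member_photo = rng.normal(size=128)

# A looser threshold can only add matches -- and false positives,
# since none of these random "faces" is actually the probe.
loose = search(member_photo, mugshots, threshold=0.30)
strict = search(member_photo, mugshots, threshold=0.80)
assert set(strict) <= set(loose)
```

The design point is the threshold: every “match” below certainty is a probabilistic guess, and whoever sets the threshold decides how many innocent people get flagged.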

The results were no surprise to those who have read the new edition of Interpersonal Divide, which has covered Amazon and Facebook since 2004, noting how revenue-generating apps like “Rekognition” have changed ethical and social norms at home, school and work.

Here’s an excerpt about algorithmic racism from the second edition:

If you believe that institutional racism exists, that systems and organizations over time believe falsehoods about under-represented groups, then imagine the long-term consequences if such bias is coded in and programmed into machines. For instance, if machines compile data suggesting that a certain race, gender and age of people living in a given location may have a higher inclination for wrongdoing, and that person happens to wander into a wealthier section of the neighborhood, merchants equipped with apps might be prone to mistake innocent shoppers for potential shoplifters, depriving them of service or worse, accusing them of crimes.

Interpersonal Divide documents algorithmic racism across digital platforms and datasets, including decisions associated with social justice, such as determining whether inmates should be granted parole. (Penal boards in half the states use algorithms in parole hearings.)
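To see how such bias gets coded in, consider a deliberately simplified risk score: if one group’s neighborhoods are patrolled twice as heavily, its members accumulate twice the recorded arrests for identical behavior, and any model that treats arrest counts as ground truth assigns them double the “risk.” All names and numbers below are invented for illustration; no real parole algorithm is this simple:

```python
arrests_per_100_encounters = 10   # identical true behavior in both groups
patrol_factor = {"A": 1, "B": 2}  # group B is policed twice as heavily

def recorded_arrests(group):
    # Recorded arrests scale with patrol intensity, not with behavior.
    return arrests_per_100_encounters * patrol_factor[group]

def risk_score(group):
    # A naive model that treats arrest records as ground truth.
    return recorded_arrests(group) / 100

print(risk_score("A"), risk_score("B"))  # 0.1 0.2 -- same behavior, double the "risk"
```

The skew enters through the data, not the arithmetic, which is why auditing the model alone can miss it.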

The $19.95 book is available from Oxford University Press or from online booksellers.

Facebook Finds Russian Election Interference: Do You Care?

What does it matter if your mind is fixed as a progressive liberal or staunch conservative as long as your view of the world is affirmed?  Do you care that your affirmation comes from enemy intelligence in a country whose arsenal is pointed at you as you read this? Probably not.

But you should. (It may be too late.)

“According to MSNBC”–that phrase, indicating the source of information–may upset you if you believe its reporters invent fake news as enemies of the people. “According to Fox News” may do the same to another group of voters.

Well, here’s the news: More than 290,000 Facebook accounts are following pages reportedly assembled by Russian entities with the sole purpose of undermining Americans’ belief in democracy.

You can read about it here, from Fox News. Or here from the New York Times.

The stories are basically the same because both news organizations are reporting fact. Their opinion writers, bloggers and commentators may side on the right or left of the political spectrum. But that doesn’t define basic news reporting.

Do you really want to protect yourself from fake news? If so, here’s a handy guide, “Media Literacy in the Era of the Machine.” You can access it for free, like so many other resources on the Internet.

But you probably aren’t interested. At least, not yet. Until maybe it’s too late.

Here’s a test: Do you believe that former President Obama signed an executive order banning the Pledge of Allegiance in schools?

Well, that was the top fake news story of 2016, doctored to resemble ABC News and garnering 2.1 million shares and comments in two months. According to BuzzFeed, a survey of some 3,000 US adults found they were fooled by fake news headlines about 75 percent of the time, with Facebook likely their main source.

But if you dislike President Obama you might think he’d sign such an unpatriotic order, even if you were told it wasn’t true.

Maybe you believe that First Lady Melania Trump so dislikes her husband, refusing to accompany him on public appearances, that a body double was hired just for show. Some 9 million people seemed to think so, based on a flimsy social media post that suggested as much with no evidence whatsoever, according to Snopes, the fact-checking site.

But if you so loathe President Trump you might think this was a distinct possibility, even if you were told it wasn’t true.

So maybe fake vs. real news doesn’t matter to you … until it does.

You live in a dangerous world. About 4 people in every 100,000 will die because of gun violence. It doesn’t matter whether you are for or against the Second Amendment if your loved one is one of those four. Suddenly you have perspective. Diseases happen, especially cancer. Did you know that the National Cancer Institute says that immunotherapy–treatments that empower a person’s immune system–is increasingly effective? Perhaps you didn’t read news reports about the importance of annual checkups, especially if you have been treated for cancer before. Viruses–the human rather than the computer kind–kill as many as 49,000 people each year. Did you know that 80 percent of child flu deaths occur in those who were not vaccinated?

True, you may encounter these fact-based stories on Facebook. Or maybe you missed them looking at kitten videos. Maybe you were sharing posts by Russian agents and bots.

In the past, when news mattered–and it matters less and less, with newsroom employment down 23% in 10 years–we wouldn’t read the entire New York Times or Wall Street Journal each day. But we knew where to find factual information when situations warranted.

We are losing that knowledge. It’s tragic for some. Perhaps even all of us in America.

For more information about social media and fake news, see Interpersonal Divide in the Age of the Machine, one of the first such works to track Facebook and its impact on society since 2004.


College President’s Astute Response on Being Hacked by Professor

Watch the sentencing of former Adrian College history professor Stephanie June Jass by clicking the photo above to view the YouTube video.

After Jass hacked his email account, President Jeffrey Docking told Inside Higher Ed,

“Everybody — staff, students, the community — have had just about enough of their privacy being invaded, with credit cards stolen or mortgages being looked at, or emails and personal texts hacked.”

The quotation is apt, as most of us have little tolerance anymore for any kind of invasion of privacy. Intentional hacking by a friend or colleague is relatively rare. As such, the Jass case has special implications because of the extent of the infraction on a college campus.

Interpersonal Divide in the Age of the Machine covers email hacking and other cyber crimes being committed by people of all social classes, primarily because of the temptation to view confidential information or otherwise gain access to private data for personal or corporate gain.

According to IHE, Jass read confidential emails about other employees in addition to messages “between him and his wife and adult children about family medical and other issues.”

The hacking occurred in April 2017.

The Daily (MI) Telegraph, which covered Jass’s sentencing, reported that she pleaded guilty to one felony count of unauthorized computer access. County Circuit Judge Margaret M.S. Noe admonished her not “to disappoint this court or violate your terms of probation. That would just be intolerable.”

IHE reported that Jass made a plea deal, agreeing to one year’s probation with payment of restitution. Violating probation could mean five years’ prison time.  

Jass, previously known as a seven-time Jeopardy! champion, reportedly expressed remorse for “all the pain that I have caused my friends, my family, my community, and I am ready to make amends and go back to being a credit to my friends, my family, my congregation and my community.”

Self-Driving Cars, Streaming Video: A Lethal Combination

Rafaela Vasquez, operating an Uber autonomous (self-driving) vehicle as its safety backup, reportedly was watching streaming video of “The Voice” when the vehicle struck and killed Elaine Herzberg as she crossed the street with her bicycle.

You can read about the particulars of the case here.

A lengthy report from the Tempe Police Department stated: “The driver in this case could have reacted and brought the vehicle to a stop 42.61 feet prior to the pedestrian.”

Several aspects of this case relate to research in Interpersonal Divide in the Age of the Machine, from the distraction of entertainment videos, especially when associated with driving, to self-driving cars and the false sense of security that convenience affords.

Of special interest in this case is how Big Data analytics helped identify what distracted Vasquez moments before the crash. According to PC Mag,

During their investigation, police sent search warrants to YouTube, Netflix, and Hulu to retrieve driver Rafaela Vasquez’s viewing history at the time of the crash. YouTube and Netflix both said Vasquez was not watching anything on their services at that time. Hulu, however, said she was streaming an episode of The Voice just before the crash occurred.

Since its first publication in 2004, Interpersonal Divide has warned about everyday moments–some that require our full attention–being deemed boring because of the ubiquitous presence of consumer media. Whether checking social media in lecture or texting en route to work, technology lures us with on-demand content and insistent digital engagement. Couple that with self-driving cars that require human safety backups, such as the one Vasquez was being paid to provide, and vehicular homicide becomes an inevitable result.

This just happens to be one of the first of its kind. More will follow as self-driving cars become the norm rather than the exception.

The lesson also involves Uber and its corporate responsibility to ensure that its drivers get adequate training about distracted driving in self-driving cars, putting people’s lives in the programmed hands of a machine rather than in the human ones supposedly on watch.

Interpersonal Divide in the Age of the Machine devotes several chapters to the dangers of unmonitored technology assuming ever greater control of our lives. The first step in identifying the dangers is to understand technology’s nature, which changes everything it touches while itself changing very little. Once we acknowledge that, we can gain greater control over how we use technology–or how it uses us–in everyday activities.

Bugeja Discusses Machine Values at Educator State Convention


Michael Bugeja discusses machine vs. moral codes at a state convention of women educators. Photo by Diane Bugeja.

The average American spends 70 percent of waking hours looking at screens, including television, and so runs the risk of replacing human values with machine ones, Iowa State Professor Michael Bugeja told members of Delta Kappa Gamma at its state convention Friday in Des Moines.

Delta Kappa Gamma promotes professional and personal growth of women educators and excellence in education. Some 140 current and retired teachers attended the two-day event at the West Des Moines Marriott.

Dr. Bugeja, author of Interpersonal Divide in the Age of the Machine (Oxford Univ. Press, 2018), spoke about common machine values that have replaced humane ones like truth, integrity, responsibility, and empathy.

Machine values include:

  • IMPORTANCE OF SELF over others
    “The rise of the selfie in the front-facing camera is a symbol of this value,” Bugeja said, noting that corporations data-mine the self so as to sell more products to consumers and their social media friends.
  • BOREDOM over attentiveness
    “Any free moment, including in the car or at lecture in class, feels boring, triggering the urge to check a smartphone or social media,” he said.
  • OVERSHARING over privacy
    “What we tell others on Facebook, often mere acquaintances and strangers, we used to share rarely, and only with trusted friends and family.”

Convention Co-chair Sheila Anderson invited Dr. Bugeja after hearing him and Iowa State’s Doug Jacobson speak about privacy and big data on Talk of Iowa on IPR.

You can see more machine values here or hear the IPR interview by clicking here.

Dr. Bugeja teaches media ethics and technology and social change at ISU’s Greenlee School of Journalism and Communication. Interpersonal Divide in the Age of the Machine is available at $19.95 from Amazon or Oxford University Press.


Tesla’s Setback: Autopilot, Lithium Batteries and Fire

Tesla’s Model X with self-driving Autopilot technology and crash-avoidance systems nevertheless crashed into a barrier, igniting the lithium-ion battery pack and killing an Apple engineer.

According to ZDNet, the NTSB’s preliminary findings of the investigation “don’t look good for Elon Musk’s electric-vehicle company” inasmuch as the vehicle’s crash-avoidance systems failed to kick in “before the horrific crash, which sheared off the front-end of the Model X and killed its 38-year-old driver, Apple engineer Wei ‘Walter’ Huang.”

The Verge, a technology and culture site, reported Tesla is a party “in two other ongoing investigations into non-fatal accidents: one from January 22nd, 2018 involving Autopilot, and one from last summer involving a battery fire.”

The Model X uses lithium-ion batteries that may ignite in crashes, although there are no reliable statistics as yet showing that the battery pack is more prone to fire than a traditional gasoline tank.

Click here to read Tesla’s statement about the crash. The company noted: “If you are driving a Tesla equipped with Autopilot hardware, you are 3.7 times less likely to be involved in a fatal accident.”

The issue explored in this post concerns other variables associated with self-driven cars using lithium-ion batteries that may be prone to fire. Factor in rising distracted-driving statistics due to smartphones and other digital devices in cars. The National Safety Council reports that “more than 3,000 people are killed on U.S. roads every year in distracted driving crashes, the federal government reports. Cell phone use is a common driver distraction.”

Interpersonal Divide in the Age of the Machine covers self-driven cars and the complexities of human life that technology sometimes cannot compute: “Human motivation and decision-making, even in negotiating who drives first at a four-way stop (something self-driving cars cannot achieve), are too complex, random and illogical.”

Self-driven cars have a long way to go in assessing the multitude of variables that may contribute to future tragic crashes.

Podcast: Digital Assistants and Privacy

During this episode of River to River, host Ben Kieffer talks with Iowa State University Professors Michael Bugeja and Doug Jacobson about how smart these digital assistants, or smart speakers, really are. They discuss how the speakers pose a privacy risk after recent incidents in which smart speakers started recording conversations and sending them without being told to do so. Jacobson, who is a professor of engineering, says that in an era of big data, privacy is nearly dead.

Bugeja and Jacobson also talk about new privacy rules being implemented in the European Union.

To listen to the podcast, click the photo above or click here.