The new documentary from PBS’s Independent Lens series documents the whistleblowers fighting to make AI more equitable

By Valerie Milano

A police officer raises his pepper spray handgun as he detains a man during a march against the national security law at the anniversary of Hong Kong’s handover to China from Britain in Hong Kong, China July 1, 2020. (REUTERS/Tyrone Siu)

Los Angeles, CA (The Hollywood Times) 03/13/2021 – Hong Kong police using facial recognition software to track down protest leaders; officers in London preemptively stopping a young man on the street because he resembles a suspect; Amazon’s facial recognition software having significantly greater accuracy when used on white men than on women and people of color; these are a few of the vignettes highlighted in PBS’s new Independent Lens documentary Coded Bias, revealing the ethical problems with the cutting-edge surveillance technologies that are shaping our lives.

The documentary centers on the story of whistleblower Joy Buolamwini, a Ph.D. candidate at the MIT Media Lab who discovers racial and gender bias in the algorithms controlling facial recognition technology. Ms. Buolamwini, a black woman, finds that the facial recognition software she is experimenting with becomes conspicuously more accurate in detecting her facial features when she puts on a white mask. This leads Joy to become an outspoken advocate for equity and accountability in the development and application of AI technologies.
The documentary Coded Bias will be screened online for free on March 14. Courtesy of 7th Empire Media.

The documentary illustrates several systemic problems in AI and facial recognition technology:

  1. The most punitive applications of surveillance are tested on marginalized communities first before being applied to society at large.
  2. The algorithms that control the technology reflect the biases of their creators, most of whom are white men, so the historical inequities of race and gender are programmed into them.
  3. The corporations and governments that primarily own the technology use it in ways that are at odds with the public good.

As Buolamwini puts it in the film, “Algorithmic justice—making sure there’s oversight in the age of automation—is one of the largest civil rights concerns we have.”
Coded Bias director Shalini Kantayya. Courtesy of Hot Docs.

“Algorithms are really defining who gets hired, who gets healthcare, and who gets undue police scrutiny,” said Coded Bias director Shalini Kantayya. “And what I learned through Joy was that we are outsourcing our decision-making to these machines, these same systems that have not been vetted for racial bias or for gender bias. And this causes unintended harm.”

“The point of the film is not only to question the bias embedded in the AI technologies, but it is also to question the technologies themselves,” said Ms. Kantayya. “I think we need more people in the room, and we need more humanity to govern these systems. But we also need to look at whether a technological solution is always the best one. Sometimes the best solution is a human one.”


The documentary draws a vivid comparison between China and the United States regarding the use of facial recognition. In China, according to Meredith Broussard, author of Artificial Unintelligence, the surveillance is used to bolster totalitarian control by the Chinese Communist Party over its population. In contrast, the major tech corporations in the United States use it for commercial purposes by exploiting people’s personal information for greater profits.


Although the Chinese version of surveillance seems more ubiquitous and repressive, the model in the United States empowers private companies to use personal data to determine which individuals have access to which services. This results in individuals with certain profiles being blocked from obtaining services based on predictive modeling.

Cathy O’Neil, a former Wall Street investment analyst turned whistleblower who authored the book Weapons of Math Destruction, calls for an “FDA for algorithms” to provide oversight and accountability in the deployment of AI.


However, the picture the documentary paints of AI’s nefarious applications is not all doom and gloom. It concludes on a hopeful note, documenting Ms. Buolamwini’s fight for algorithmic justice as she takes it to the national stage, publishing a widely cited paper exposing the biases in Amazon’s facial recognition software. Joy testifies before Congress on the subject, which leads several cities to ban facial recognition technology and prompts federal legislation to curb the technology’s use; even Amazon places a temporary moratorium on the commercial deployment of its software, which remains in place today.


Coded Bias documents a nascent technological system where ubiquitous data collection, largely hidden from public view, is shaping our daily lives, especially the opportunities that are available to us. It also documents the whistleblowers who are fighting to ensure that the system is fair and ethical; that it does not reproduce the injustices that have plagued society for so long. As Ms. Kantayya put it: “I think I make films because they remind me that everyday people can change the world.” And at the end of the day, this is a film about everyday people fighting for justice.
Coded Bias premieres Monday, March 22, 2021, at 10 pm ET on PBS and the PBS Video App.