Stephen A. Fuqua (saf)

a Bahá'í, software engineer, and nature lover in Austin, Texas, USA

In Pursuit of Data and Algorithmic Equity

Advances in the availability and breadth of data over the past few decades have enabled the rapid and unregulated deployment of statistical algorithms that aim to predict and thereby influence the course of human behavior. Most are designed to promote the corporate bottom line, not the welfare of the people. Those that aim to promote the common good run the danger of straying into authoritarian suppression of freedoms. Regardless of intention, these algorithms often reinforce existing social inequities or present a double-edged sword, with potential for positive use weighed against potential for misuse.

Coded Bias film poster

The films Coded Bias (now in virtual theaters) and The Social Dilemma (Netflix) probe these issues in detail through powerful documentary filmmaking and storytelling. Where The Social Dilemma focuses on the dangers of corporate and extremist manipulation through social media, Coded Bias reveals the biases inherent more broadly in “artificial intelligence” (AI) / machine learning (ML) systems. If you must choose just one, I would watch Coded Bias, both for its incisive reveal of injustices large and small and for its inspiring depiction of those working to bring these injustices to light.

Several books explore these topics; indeed, some of their authors are among those featured in these films. While I have yet to read the first three, they seem well regarded and worth mentioning:

In Race After Technology (2019), Ruha Benjamin pulls the strands of algorithmic injustice together in a broader critique of technology’s impact on race, describing what she calls the New Jim Code: “The employment of new technologies that reflect and reproduce existing inequities but that are promoted and perceived as more objective or progressive than the discriminatory systems of a previous era.” (p. 10)

The New Jim Code thesis is a powerful critique of technology that simultaneously fails to see people of color (facial recognition, motion detection) and pins them in a spotlight of law enforcement surveillance and tracking. By explicit extension it is also a critique of the societies that tolerate, sponsor, and exploit such technologies, even while it acknowledges that many of the problems emerge from negligence rather than intention. From the outset, Benjamin gives us a useful prescription for working our way out of this mess, exhorting us to “move slower and empower people” (p. 16).

After detailing many manifestations of technological inequality and injustice, she urges technologists like me to “optimize for justice and equity” as we “come to terms with the fact that all data are necessarily partial and potentially biased” (p. 126). The book concludes with further explorations on how justice might (and might not) be achieved while re-imagining technology.

Benjamin’s book was also my introduction to the Algorithmic Justice League, an advocacy organization that “combine(s) art and research to illuminate the implications and harms of AI.” The AJL is featured prominently in Coded Bias, and their website provides many resources for exploring this topic in more detail.

These works send a clear message that data and the algorithms that exploit them are causing and will continue to cause harm unless reined in and reconceptualized. This is the techno-bureaucratic side of #BlackLivesMatter, calling for #InclusiveTech and #ResponsibleAI. As with so many other issues of equity and justice, we all need to stand up, take notice, and act.

  1. Think twice before choosing to use AI/ML. Then get an outside review.
  2. Vet your data sets carefully and clearly describe their origin and use for posterity and review.
  3. Ask yourself: what are the potential impacts of the technology I am developing on historically oppressed groups?
  4. Cross-validate your assumptions, data, and results with a broad and representative audience.
  5. Keep listening. Keep learning.
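To make steps 2 and 4 above a little more concrete, here is a minimal sketch of one way to start auditing an evaluation set: tally how well each group is represented, and break model results out by group rather than reporting a single aggregate number. The records, group labels, and predictions below are entirely made up for illustration; real audits involve far more care (and consultation with the communities affected) than a few lines of code.

```python
# A toy audit: check group representation in a dataset and compare a
# model's accuracy across groups. All data here is hypothetical.
from collections import Counter

def group_counts(records, key):
    """Count how many records fall into each group for attribute `key`."""
    return Counter(r[key] for r in records)

def per_group_accuracy(records, key):
    """Accuracy of stored predictions, broken out by group."""
    correct, total = Counter(), Counter()
    for r in records:
        total[r[key]] += 1
        if r["prediction"] == r["label"]:
            correct[r[key]] += 1
    return {g: correct[g] / total[g] for g in total}

# Hypothetical evaluation records: a true label, a model prediction, and
# a demographic attribute used only for auditing, never for prediction.
records = [
    {"group": "A", "label": 1, "prediction": 1},
    {"group": "A", "label": 0, "prediction": 0},
    {"group": "A", "label": 1, "prediction": 1},
    {"group": "B", "label": 1, "prediction": 0},
    {"group": "B", "label": 0, "prediction": 0},
]

counts = group_counts(records, "group")
accuracy = per_group_accuracy(records, "group")
print(counts)    # group B is underrepresented relative to A
print(accuracy)  # and the model performs worse on it
```

Even a crude disaggregation like this can surface the kind of disparity that a single headline accuracy number hides, which is exactly the failure mode Coded Bias documents.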

Slack - choosing skin tone
Something positive: choosing an emoji skin tone

Posted with: Social Discourse, Justice, Inclusion and Anti-Racism, Tech, Data Analytics and Visualization