Abstract

Excerpted From: Cindy Thomas Archer, Do/Can/Should Legal Practitioners Rely on Google Search? Algorithms of Oppression: How Search Engines Reinforce Racism, Safiya Umoja Noble (2018), 211 Pages, 22 Legal Communication & Rhetoric: JALWD 163 (Fall, 2025) (27 Footnotes)

 

With the recent focus on generative AI--using large language models to generate “original” content--it’s easy to forget that we have been relying on artificial intelligence, in its broadest sense, for decades in the form of extractive AI. Our reliance on extractive AI, through search engines like Google and specialized legal research databases, has become so ubiquitous that we no longer think of it as artificial intelligence. More importantly, we rarely question the results. Dr. Safiya Noble’s groundbreaking work, Algorithms of Oppression, gives us a framework for being more critical of those results.

Published in 2018, Algorithms of Oppression is the result of Dr. Noble’s multiyear study of the biased results produced by commercial search engines. The book’s main premise is that while we have come to accept that search engines will return results consistent with stereotypes, we attribute this to only one cause: prejudiced users who input biased data. But this is only half of the story. It is not just the data but the algorithms themselves that facilitate the biased results.

Algorithms of Oppression encourages readers to adopt a more critical lens when using search engines. Dr. Noble’s analysis extends readily to the biases embedded in the large language models that generate content as well. As she cautions, “The near-ubiquitous use of algorithmically driven software, both visible and invisible to everyday people, demands a closer inspection of what values are prioritized in such automated decision-making systems.” We cannot let anticipated efficiencies and profitability distract us from the profound ethical and societal consequences at stake when we rely on these technologies.

[ . . . ]

 

Whether googling or prompting, any lawyer who uses extractive or generative artificial intelligence should read Algorithms of Oppression. We can only fully understand the results of our research when we dig deeper into how those results are produced. And if we cannot fully comprehend the algorithms that produce the results, we can at least become more knowledgeable about the motives and goals of the people who created or own them. Through concrete examples drawn from Dr. Noble’s extensive research and the work of other experts, Algorithms of Oppression opens a window into some, though not all, of the more insidious problems with googling, such as the bias, stereotyping, and discrimination its algorithms facilitate.

 


Professor of Lawyering Skills, University of California, Irvine, School of Law.