The New Jim Code
The New Jim Code is defined by sociologist Ruha Benjamin as “the employment of new technologies that reflect and reproduce existing inequities but that are promoted and perceived as more objective or progressive than the discriminatory systems of a previous era”.[1]:5
Benjamin lists four vital components that constitute the New Jim Code:
- Impartiality
- Personalization
- Merit
- The framework of a forward-looking enterprise that promises social progress
The term draws on Michelle Alexander’s “New Jim Crow”,[1]:8 extending it to highlight how existing racist systems carry over into technology. Understanding race in the context of technology is central to the concept, which challenges the belief that technological innovation is race-agnostic and rejects the notion that race plays no part in algorithms.
Examples
Data sharing
One example Benjamin cites of algorithmic inequity stems from data sharing, which can result in unequal targeting of marginalized groups by those who obtain the shared data. While data sharing may appear to be a purely positive practice, it has posed data privacy issues and allows acts of discrimination based on data generalizations. Stigma also plays a part in data sharing issues, as when African Americans are denied services due to supposed health risks such as sickle cell.[2]
Predictive guilt
The use of algorithms and AI tools by law enforcement is seen in products such as the Snapshot DNA Phenotyping Service by Parabon NanoLabs. The service uses a technique called DNA phenotyping, in which DNA samples collected from crime scenes are used to construct facial renditions of persons of interest. Criticisms have been raised about its validity: Dr. Yaniv Erlich compared the phenotyping to “science fiction”, and other scientists have noted the lack of peer review.[3] These criticisms raise concerns about the effectiveness of the software and the implications its output could have for suspects.[4]
Downfall of artificial intelligence
Another example is the risk posed by artificial intelligence itself, since AI systems can make errors in both face recognition and voice recognition. A voice recognition system that fails to recognize a speaker’s voice may mistake them for someone else, for instance when a spoken password is used or when audio is used to distinguish an innocent person from a suspect. Face recognition is prone to similar errors: poor lighting, people with similar facial structures, or skin tones too dark for the system to register can all cause it to misidentify or fail to recognize a person.
However, many doctors and scientists believe that AI can change the world for the better. AI can nevertheless exhibit biases with respect to race and sex, most often affecting women and African Americans, and many scientists and engineers are now working to improve AI so that it can benefit future generations.
Reactions
In response to the described New Jim Code, Ruha Benjamin posits that there are ways of thinking about technology development that can change the underlying culture that enables it.[1]:183 Benjamin suggests that by viewing technology as a tool and weighing the politics and purpose of that tool, thinkers will question the idea of technology design as a “solution” to the societal and cultural problems that contribute to inequity.[1]:178–183 In doing so, she believes, the creation of technology that promotes and centers equity, rather than valuing innovation and profit, can begin over time.[1]:183
References
- ↑ Benjamin, Ruha (2019). Race After Technology. Polity Press.
- ↑ Heeney, C.; Hawkins, N.; de Vries, J.; Boddington, P.; Kaye, J. (2010-03-29). "Assessing the Privacy Risks of Data Sharing in Genomics". Public Health Genomics. 14 (1): 17–25. doi:10.1159/000294150. ISSN 1662-4246. PMC 2872768. PMID 20339285.
- ↑ Southall, Ashley (2017-10-19). "Using DNA to Sketch What Victims Look Like; Some Call It Science Fiction". The New York Times. ISSN 0362-4331. Retrieved 2023-12-04.
- ↑ Howard, Ayanna (2020). Sex, Race, and Robots: How to Be Human in the Age of AI.