The Boy on the Tricycle: Bias in Generative AI

Nettrice Gaskins
4 min read · May 1, 2024


My first computer-generated artwork, circa 1987

Bias is a natural inclination for or against an idea, object, group, or individual. It is often learned and is highly dependent on variables such as a person’s socioeconomic status, race, ethnicity, education, gender, and religion. Several studies have been published about bias in generative artificial intelligence, including the study “Bias in Generative AI” and my 2022 field report, “Interrogating AI Bias through Digital Art.” The “Bias in Generative AI” study analyzed images generated by three popular AI tools (Midjourney, Stable Diffusion, and DALL·E 2) to investigate potential bias in AI image generators.

Generative AI tools like Midjourney can inadvertently perpetuate and intensify societal biases related to gender, race, and emotional portrayal in the images they produce. Dr. Joy Buolamwini’s “Unmasking AI” tells the story of how she uncovered “the coded gaze”: the evidence of encoded discrimination and exclusion in tech products. I decided to test this out using the very first computer-generated image I created, in 1987. Back then, I used Deluxe Paint (D-Paint) on an Amiga computer. For my experiment, I uploaded the image to Midjourney and used the “Describe” command to generate text prompts that depict the old CG image in new, different ways.

Midjourney “Describe” thumbnails (4) — ROUND 1

One thing the “Describe” command missed was the race of the child in my original image, so I added one word to the prompt: Black. All four of the initial thumbnails, like my original image, include trees and other plants, but in the next round of thumbnails (with “Black” in the prompt) there is much less foliage. The picket fence was also replaced with cement walls, sometimes with markings or graffiti on them.

Midjourney “Describe” thumbnails (4) — ROUND 2

While generative AI has numerous benefits in creative fields, we must remain aware of and vigilant about its potential drawbacks. The “Bias in Generative AI” study revealed systematic gender and racial biases against women and African Americans in all three AI generators. It also uncovered more nuanced biases in how emotions and appearances are portrayed.

No foliage and rundown surroundings

It is important to consider not only who is shown in GenAI images but also in what light they are shown. Most of the images generated with the “Black” prompt show rundown, neglected surroundings, which appear neither in the original 1987 image nor in the initial Midjourney thumbnails. More important, I took note of how I felt as this was happening. Then I imagined how someone less astute would feel. Would they be less inclined to keep using the tools?

Why is his access blocked?

In the original image there is a gap in the white picket fence on the left side. The AI-generated version with the Black child shows a closed fence, and the child seems to be looking for a way inside; his access is blocked. This last image visualizes the problem of inclusion and diversity in digital art. In a world where technology, especially AI, continues to shape how we interact and learn, we must find ways to bring more diverse people onto the development side of the technology, to help inform how large language and training models are created, and to come up with better ways to counter bias in AI.

I like this perspective-based version

As our reliance on AI intensifies, it becomes imperative to ensure fairness and equity in the design and deployment of these technologies. We must prioritize the development of generative AI systems that are not only technologically advanced but are also shaped by an ethical commitment to inclusivity and equity. — Zhou et al. 2024

We must also represent diverse cultures, perspectives, and experiences in digital art collections, whether online, in galleries, or in museums.


Written by Nettrice Gaskins

Nettrice is a digital artist, academic, cultural critic, and advocate of STEAM education.
