AI & Creativity: Addressing Racial Bias in Computer Graphics

Nettrice Gaskins
4 min read · Aug 11, 2021


Racial bias in the CG algorithm (left) from Theodore Kim’s SIGGRAPH 2021 talk

Theodore Kim recently gave an interesting talk at SIGGRAPH 2021 about racial bias in computer graphics (CG). Kim asserts that the field’s basic scientific formulations have insidious biases built into them, biases that can be traced back to early film technology and techniques from the analog era, such as Kodak’s Shirley Cards, and that persist in today’s facial-recognition artificial intelligence (A.I.). These developments, according to Kim, determine the “physical formulations and numerical algorithms used to depict virtual humans.”

Kodak’s “Shirley Card” circa 1970
Jean Renoir’s “The Golden Coach,” 1952

Kim begins his talk by asserting that programmers and other technologists need to consciously think about and deal with the racial implications of CG (and A.I.) in order to adequately address problems that are prevalent and long-standing in the field. Otherwise, we will keep replicating the same problems over and over. Kim says,

We recognize the pull of historical inertia and we make the deliberate decision to take a step in the opposite direction.

One example Kim gives is ‘subsurface light transport,’ which gives computer-generated objects a glow similar to the “skin glow” effect seen in paintings by Johannes Vermeer and in more contemporary media portraying white (European diaspora) people. For these subjects, there is generally nothing wrong with the subsurface scattering effect. The trouble starts when you try to create or simulate the same effect in much darker-skinned people.

Johannes Vermeer. “Girl With a Pearl Earring (detail),” dated c. 1665
Subsurface scattering in white skin (examples)

Instead of the subsurface light or glow we see in the previous examples, it is a ‘specular reflection,’ or shine, that gives Delroy Lindo’s skin its character in “Clockers.” According to Kim, glow is not the dominant physical quality of dark skin. His observation reminds me of the artist Kehinde Wiley, who often talks about having to learn how to paint dark (Black) skin after being trained in school to do otherwise. You can see ‘shine’ in Wiley’s subjects.

Delroy Lindo’s shine in Spike Lee’s film “Clockers”
Kehinde Wiley. “Saint John the Baptist in the Wilderness,” 2013
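The contrast between glow and shine can be sketched as a toy, single-channel shading model: a wrap-lighting term stands in for subsurface scattering (“glow”), and a Blinn-Phong highlight stands in for specular reflection (“shine”). The function, its weights, and its parameter values below are my own illustrative assumptions, not formulas from Kim’s talk or from any production skin shader.

```python
# Toy shading model: "glow" (wrap lighting, a cheap stand-in for
# subsurface scattering) versus "shine" (a Blinn-Phong specular
# highlight). All weights here are illustrative, not measured values.

def shade(albedo, n_dot_l, n_dot_h, subsurface_weight, specular_weight,
          shininess=32.0, wrap=0.5):
    """Return a scalar intensity for one color channel."""
    # Diffuse term: standard Lambert falloff, clamped at zero.
    diffuse = max(n_dot_l, 0.0)
    # "Glow": wrap lighting lets light bleed past the terminator,
    # softening the falloff the way subsurface scattering does.
    glow = max((n_dot_l + wrap) / (1.0 + wrap), 0.0)
    # "Shine": Blinn-Phong specular highlight, tight and bright.
    shine = max(n_dot_h, 0.0) ** shininess
    return albedo * (diffuse + subsurface_weight * glow) + specular_weight * shine

# Same geometry (a grazing light, a strong half-vector alignment),
# two different material emphases:
geometry = dict(n_dot_l=0.3, n_dot_h=0.95)
glow_heavy = shade(albedo=0.9, subsurface_weight=0.6,
                   specular_weight=0.05, **geometry)
shine_heavy = shade(albedo=0.35, subsurface_weight=0.1,
                    specular_weight=0.6, **geometry)
```

Weighting the glow term heavily gives the soft, Vermeer-like falloff discussed above; weighting the shine term heavily gets closer to the crisp highlight Kim points to in the Lindo close-ups.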

Kim’s point is that when you apply the subsurface scattering algorithm to dark skin, you get a less-than-realistic result. Kim also noted the lack of diversity in examples of “skin shaders,” and even in the way hair is rendered in commercial CG software: there is no algorithm for “type 4” (afro or kinky) hair. You can watch/listen to the rest of Kim’s talk below.

Google search results for “skin shaders”
Hair types chart by Andre Walker (via Kim)

One way to address or counter historical racial bias in CG is through the use of deep (machine) learning, a process that allows us to toss out antiquated rules and create new, different kinds of image (skin) effects. In the examples below you can see ‘skin glow’ as well as ‘shine,’ and even a different treatment of type 4 hair in each example. Thus, deep learning could go against the flow of systemic bias by creating a new, futuristic aesthetic for CG (digital media).

Nettrice Gaskins. “Du Bois’s Dream,” 2021. Created using a deep learning algorithm
Nettrice Gaskins. “Dandy Dream,” 2021. Created using a deep learning algorithm

For the site-specific work I did for the Smithsonian, I used deep learning to create portraits of ‘featured futurists’ that include a diverse array of skin and hair types. For my self-portrait, which will also be in the show, I applied an Afrofuturist lens to the process, generating both glow and shine effects in my skin. This work is not intended to be hyperrealistic. Rather, I want to see how far deep learning algorithms can go in removing historical and systemic bias and moving in a different direction to create digital art.


Nettrice Gaskins

Nettrice is a digital artist, academic, cultural critic and advocate of STEAM education.