
Generative AI has become part of everyday life for many people. Whether you use these tools yourself or simply encounter their output, it's estimated that there are currently over 350 generative AI tools on the market. Approximately 88 percent of organizations use at least one of these tools in their daily operations, and about 70 percent of Gen Z report using generative AI in their daily lives. Still, that doesn't mean the tools are without problems. Generative AI is well known for producing inaccurate information as well as problematic images. When it comes to producing and analyzing images of Black people in particular, these tools can lean heavily into harmful stereotypes or refuse to produce anything at all. Fortunately, one innovator is taking an important step toward changing that.
How Blanca Burch is Helping
When trying to use generative AI to produce images of Black women with different hairstyles, Blanca Burch noticed that it was almost impossible to get what she needed. The options lacked diversity, and the tools sometimes struggled to produce anything usable, leaving a shortage of variety in images featuring Black women. That's where her research comes in.
Burch is focusing her research on introducing more diverse data sets into the image collections that generative AI tools draw on. By feeding these tools images of real Black women with differing hairstyles, she hopes to give generative AI what it needs to place Black people on equal footing with their white counterparts. In practice, this means the AI tools most people use to generate everyday images would have reference photos that allow them to correctly render Black hairstyles and textures.
While her work is ongoing, Burch hopes to recruit at least 200 participants for the project. Having more images for generative AI tools to draw from will likely improve the landscape over time.
How Her Data Can Help Generative AI With Inclusivity
Burch was inspired to pursue this kind of study by her own experiences with generative AI, as well as by a recent study reported in Forbes. When using generative AI to produce images of Black women, Burch found that the tools struggled with certain hairstyles. Straightened hair was easy, but when she asked for styles like an afro, braids, twists, or cornrows, the AI tools weren't very helpful.
The independent study from Forbes was even more troubling. According to the magazine, AI tools couldn't identify the same Black women when they changed their hairstyles. These same tools also regularly categorized Black women as less intelligent or less professional when asked to assess images in which they wore braids or other Afrocentric hairstyles. When asked to analyze pictures of white women with differing hairstyles, however, the tools didn't penalize them in the same way.
The results of this study further highlight why Burch's data on Black hair is essential. If provided with more objective data, generative AI tools are less likely to pull from the limited, and often biased, information that's available to them. Since these tools can't think in the truest sense, they're left to draw on whatever data is available before producing what's been asked of them. In the case of Black women, the result can reflect the societal bias that is prevalent on online platforms and forums.
With Burch's data, though, there is hope that this can change and that generative AI will be better prepared to produce and correctly assess images of Black women.

The Current State of Generative AI and Black People
As you may suspect, however, the issues with generative AI and Black people don't stop at hair. Over the years, significant disparities have been found in how AI systems identify Black people. When analyzing footage of criminal activity, AI tools have repeatedly struggled to identify the right person when that person is Black. In fact, one analysis found error rates of 34 percent for dark-skinned women compared to 0.8 percent for light-skinned men.
Unfortunately, misidentification of Black people has resulted in multiple wrongful arrests because suspects were flagged by an AI system rather than identified by a human being. This is troubling considering how many judicial systems are incorporating AI-based identification tools into their processes.
In another study, an AI tool misclassified Oprah Winfrey as male. To make matters worse, the same tool falsely matched 28 members of Congress with criminal mugshots in its database. This doesn't surprise some people who have been studying how AI handles images of Black people, as another study showed that these tools were far more likely to incorrectly match two different Black faces than two different white ones.
Another way poor representation in AI databases can affect Black people is in employment. As noted previously, many organizations use AI tools in their day-to-day tasks, and one activity that has been delegated to these tools is the assessment of resumes. While they can sort through applications based on qualifications, some employers are including criteria that are far less objective.
These criteria can include personality assessments and other tests meant to determine how well an applicant may fit into an organization. Unfortunately, this can lead to AI tools drawing on online materials that tend to malign the professionalism and suitability of Black people for certain roles. Applicants' pictures are often included in their applications, and the Forbes study suggests that wearing certain hairstyles can result in being rejected for a position without ever being assessed by a human being.
Even without an image, AI can use other data to infer whether an applicant is Black or white. When this happens, statistics show that Black applicants may be rejected at a higher rate depending on the position in question.
While there are pros and cons to the use of AI, the scales are still tipped against Black people. Since AI can only draw on the data that's available, societal bias has seeped into technology that many people hope will be objective. These biases continue to have significant implications for Black people. Hopefully, more research projects along the lines of what Blanca Burch has undertaken can help to level the technological playing field.






