With the launch of Bias Breaker, The Brandtech Group takes aim at AI’s role in perpetuating stereotypes
In a marketing ecosystem in which stereotyping still stops people from achieving their full potential, it is difficult to overestimate the role of AI in perpetuating stereotypes at scale.
Research from The Brandtech Group shows that several major foundation models continue to generate stereotypes. One of the most high-profile examples is that 98% to 100% of the images returned depict a man (or a male-appearing person) when you type the prompt ‘a CEO’ into an AI image generator.
Pencil, the Generative AI marketing platform acquired by The Brandtech Group in June last year, has already created over 1 million ads for over 5,000 brands since it launched in 2018, a scale which underlines just how quickly AI can perpetuate harmful stereotypes.
Brands and advertisers cannot simply accept bias as the status quo.
Tyra Jones-Hurst, Managing Partner at Oliver US and Founder of InKround
Yet that same scale also affords brands and agencies the opportunity to deliver solutions just as broadly. With this responsibility and opportunity in mind, The Brandtech Group has launched a raft of initiatives to promote, and set boundaries for, the ethical use of generative AI. These include a free-to-access, open blueprint for creating an ethical Generative AI policy, as well as proprietary technology that tackles bias in foundation models, called ‘Bias Breaker.’
The proprietary Bias Breaker technology is designed to layer inclusivity onto AI prompts. Brandtech has configured several of the most common elements of diversity, which in this iteration of Bias Breaker include age, race, ability, gender identity, and religion. When a user enters a simple prompt such as ‘a CEO’, the tool adds a number of types of inclusivity, which vary each time, creating a more sophisticated prompt to use in any image generation model. The result is a set of images which, instead of reflecting existing bias, adds a positive bias towards a wide spectrum of diversity and intersectionality that current models simply do not provide for.
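Brandtech has not published how Bias Breaker is implemented, but the prompt-augmentation idea it describes can be sketched in a few lines. The Python below is a minimal illustration only, assuming a simple random draw across the five dimensions named in the article; the attribute values and the augment_prompt helper are hypothetical, not Brandtech’s code.

```python
import random

# The five diversity dimensions named in the article. The attribute values
# below are illustrative placeholders, not Brandtech's configuration.
DIVERSITY_DIMENSIONS = {
    "age": ["in their 30s", "in their 50s", "in their 70s"],
    "race": ["Black", "East Asian", "South Asian", "Hispanic", "white"],
    "ability": ["who uses a wheelchair", "who wears a hearing aid"],
    "gender identity": ["a woman", "a man", "a non-binary person"],
    "religion": ["wearing a hijab", "wearing a turban", "wearing a kippah"],
}

def augment_prompt(base_prompt: str, num_dimensions: int = 3) -> str:
    """Layer a varying selection of inclusivity attributes onto a plain prompt."""
    chosen = random.sample(list(DIVERSITY_DIMENSIONS), k=num_dimensions)
    details = [random.choice(DIVERSITY_DIMENSIONS[d]) for d in chosen]
    # The augmented prompt is what would be sent to the image generation model.
    return f"{base_prompt}, {', '.join(details)}"

# Example: 'a CEO' might become 'a CEO, in their 50s, a woman, wearing a hijab';
# the next call draws different dimensions and values.
print(augment_prompt("a CEO"))
```

Because the sampled attributes change on every call, a batch of generations from the same base prompt spreads across age, race, ability, gender identity, and religion rather than defaulting to the model’s statistical majority.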
The CEO example, which has become shorthand on the marketing conference circuit for AI perpetuating bias, is based on solid research. According to The Brandtech Group, two of the models tested produced male-appearing images in 100 out of 100 generations when prompted for an image of a CEO, another was 98% male, and two others returned male CEOs 86% and 85% of the time.
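Those percentages imply a straightforward audit protocol: prompt each model a fixed number of times and tally how often the output is judged male-appearing. Brandtech has not published its exact methodology, so the sketch below is an assumed reconstruction; generate_image and appears_male are hypothetical stand-ins for the model under test and the (human or automated) annotation step, not real APIs.

```python
from typing import Any, Callable

def audit_gender_skew(
    prompt: str,
    generate_image: Callable[[str], Any],
    appears_male: Callable[[Any], bool],
    n: int = 100,
) -> float:
    """Return the share of `n` generations for `prompt` judged male-appearing.

    Both callables are placeholders: one wraps whichever image model is being
    audited, the other wraps the annotation step.
    """
    male_count = sum(appears_male(generate_image(prompt)) for _ in range(n))
    return male_count / n

# A result of 0.98-1.00 for the prompt 'a CEO' would match the figures the
# article attributes to the worst-performing foundation models.
```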
This example highlights how AI can perpetuate inequality and bias. McKinsey’s Women in the Workplace study last year found that 28% of C-suite roles were held by women, while 10.4% of Fortune 500 CEOs were female. Here, AI therefore serves to make an already unequal world look even more homogenous.
Tyra Jones-Hurst, Managing Partner at Oliver US and Founder of InKround, explains: “The answer to this problem of bias in Gen AI is far from set in stone but one thing’s for certain - brands and advertisers cannot simply accept bias as the status quo. One way to address it is to prioritise inclusion through creating automated features where they don’t already exist, and integrate them to build on top of foundation model use. This is what we have done with Bias Breaker.”
Rebecca Sykes, Group Partner and Head of Emerging Technology at The Brandtech Group, adds: “Looking at CEO examples is just one way bias shows up, and the simple use case we have used to demonstrate the challenge and implications of not addressing bias. There are many other examples, ranging from the obvious - nurses and carers are mostly female, there is a stark lack of disability etc - to much more nuanced instances. Over time, we hope to address all of this.”
According to The Brandtech Group, CMOs are increasingly asking questions about the ethical use of Generative AI as they move from pilots to scaled deployment.
David Jones, founder and CEO of Brandtech, added: “When social media exploded onto the scene no-one saw the negative issues coming. With Gen AI, we have the ability to get ahead, instead of simply being reactive after the event.”