Real stories deserve real images

Many organisations are wrestling with how to use AI responsibly, with AI-generated imagery just one of many tightropes to walk. Pascal Barollier, Gavi’s Chief External Engagement, Comms and Advocacy Officer, explains Gavi’s position.

Credit: Pascal Barollier
For all of us working in global health and development, it's obvious AI is a gamechanger. It is also a tightrope. I started my media career as a news photographer in Paris in the late 1980s, at Sipa Press, one of the leading photo agencies. Here are the views of a dinosaur embracing AI... selectively.

Getting AI right means striking a delicate balance: maximising the potential of this incredible new tool, with all the speed and efficiency gains it offers, while preserving the accuracy, the creativity and, ultimately, the credibility we have spent decades building for our organisations.

This week Devex reported on one of the countless arenas in which this balancing act is playing out: AI-generated imagery.

AI engines are now churning out an estimated 34 million images every single day, with the technology edging closer and closer to true photo-realism. With budgets shrinking across the sector, there is a reasonable question being asked by communications teams worldwide: why spend valuable, scarce time and money on getting quality photography and videography from some of the world’s most challenging locations when we can create it for $10 a month?

At Gavi, our position is clear: we will never use AI‑generated imagery to depict real people, real communities or real‑world settings. And that’s not simply a matter of preference; it is a matter of principle.

Real communities, real stories

Gavi’s mission is grounded in partnership with the countries and communities we support. When we document the impact of vaccination or tell the stories of frontline health workers we owe it to our audiences, and to the people featured, to be truthful. That means ensuring that the faces, places and moments we show are real.

Our VaccinesWork platform runs on a network of nearly 100 professional freelancers producing real stories from the countries where we work. These are journalists from the communities we support reporting on the communities we support. We take the same approach with our multimedia: employing local photographers and videographers who understand the cultural, social and historical context they are capturing. Their work does more than illustrate: we believe it honours and humanises its subjects.

AI‑generated depictions of people, especially in a humanitarian or global health context, risk introducing inaccuracy, stereotyping or unintended bias. Even when AI aims to represent a community, it may inadvertently distort or caricature it. I have enormous sympathy for organisations forced into tough decisions by the current financial climate; at Gavi, we have prioritised maintaining the ability to gather quality content from the countries we support.

Our approach to AI content

But this is not intended to preach against the perils of AI. While we draw a firm line on depictions of reality, AI does have an increasingly valuable place in our creative workflow. We are trying to walk the tightrope.

When it comes to abstract illustration, non‑realistic animation and conceptual visual elements, generative AI can lower costs, support creativity and reduce production bottlenecks. But these tools never replace the authenticity and integrity that real writing, photography and film bring to our storytelling.

And prioritising real imagery is not just an ethical stance. It is increasingly in line with what audiences want.

As mentioned above, an estimated 34 million AI images are churned out and posted to social media platforms every day. AI content farms are producing endless videos with barely any human involvement, fact-checking or quality control, rapidly accelerating the spread of mis- and disinformation. We are already in the age of 'AI slop', and audiences are turning against it.

Multiple surveys show growing concern with AI-generated content. Algorithms are beginning to detect and demote synthetic content. Search engines like Kagi already downrank deceptive AI‑generated material and platforms such as YouTube are attempting to curb ‘inauthentic’ AI content.

In this environment, prioritising real content is the strategic choice.

AI can support creativity - it will never replace authenticity

AI will continue to evolve and, with it, the expectations placed on communicators. Our use of AI will almost certainly change with it; our values – accuracy, dignity and trustworthiness – will not. AI can support creativity, but it will never replace authenticity.

As we continue to explore what responsible AI adoption looks like for a global health organisation, as we continue to walk the tightrope, we remain guided by a simple belief: real stories deserve real images.