Understanding the UN’s AI Refugee Avatars

A new kind of storytelling tool is making waves at the intersection of humanitarian work and artificial intelligence. The United Nations University Centre for Policy Research (UNU-CPR) recently introduced AI-powered avatars meant to simulate the lived experiences of people caught in conflict. These digital characters, based on fictional personas, are built to engage users in conversations about war, displacement, and survival. The goal? Raise awareness. Maybe even move people to act.

But there’s a lot to unpack. What exactly are these AI avatars? Who made them? Why are they stirring up debate? And, more importantly, do they really help the people they’re supposed to represent?

What are AI avatars and how do they work?

Let’s start simple. An AI avatar is basically a computer-generated character that uses artificial intelligence to interact with humans in real time. Ideally, these avatars simulate “face-to-face” conversations that feel more human, more emotionally grounded. In the case of the UN project, the avatars talk about fleeing war zones, living in refugee camps, and surviving violence.

These avatars are typically powered by large language models combined with computer-generated visuals. The goal is to make the conversation feel as real as possible. You ask them questions, they answer based on their persona. Some responses are scripted, but most come from an AI model that draws from a large set of data to respond in context.

The result? A conversation that mimics reality... or at least tries to.
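The persona-plus-language-model pattern described above can be sketched in a few lines. This is a hedged illustration only: the function names, message format, and persona text below are assumptions for the sake of example, not the UNU-CPR project's actual code.

```python
# Sketch of persona-grounded prompt assembly for a chat-style language model.
# The persona is injected as a system message, so every reply stays "in character".
# This is an illustrative assumption, not the real UNU-CPR implementation.

AMINA_PERSONA = (
    "You are Amina, a fictional Sudanese woman living in a refugee camp in Chad. "
    "Answer in the first person, grounded in this persona. "
    "If asked about something outside your story, say you do not know."
)

def build_prompt(persona: str,
                 history: list[tuple[str, str]],
                 user_message: str) -> list[dict]:
    """Assemble the message list a chat model would receive:
    one system message (the persona), then alternating user/assistant
    turns from the conversation so far, then the new user message."""
    messages = [{"role": "system", "content": persona}]
    for user_turn, avatar_turn in history:
        messages.append({"role": "user", "content": user_turn})
        messages.append({"role": "assistant", "content": avatar_turn})
    messages.append({"role": "user", "content": user_message})
    return messages
```

In a real deployment, the returned message list would be sent to a hosted language model, and the avatar's visual layer would lip-sync the generated text. The key design point is that the persona lives entirely in the system message: the model improvises the words, while the character's backstory constrains what it says.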

Who created these avatars? Meet the UNU-CPR

The project didn’t come out of the main United Nations headquarters. Instead, it came from the United Nations University Centre for Policy Research (UNU-CPR), a research think tank tied to the UN that operates independently. It’s not part of peacekeeping or refugee logistics; it focuses on exploring big-picture policy ideas.

UNU-CPR brought this avatar concept to life with help from Columbia University and AI experts. Originally, the idea started in an academic setting, as part of a course on AI and conflict prevention. From there, it grew into a real (or at least digital) prototype.

Two avatars were created: Amina, a fictional Sudanese woman living in a refugee camp in Chad, and Abdalla, a persona designed as a member of a paramilitary group involved in Sudan’s conflict. Each one has a narrative based on real-world events, but their words are generated in real time by AI.

Why make AI refugee avatars? What the project is trying to do

The idea behind this experiment is fairly bold. The creators wanted to make it easier for people to connect emotionally with faraway crises. Let’s be honest: most of us don’t spend our mornings reading reports about civil war or famine. The hope was that a conversation with someone (even virtually) who had “lived” through it might do what charts and articles can’t.

The project also aimed to be a tool for donor engagement. Talking to Amina or Abdalla is supposed to give policymakers, journalists, or fundraisers a more visceral understanding of what’s happening. One of the creators even said this was an easier way to make a case to potential donors.

At the same time, the UNU-CPR team has made it clear this is an experimental tool. It’s not replacing real human voices, and it isn’t officially part of any UN relief effort.

Public reactions: a mixed bag

People have opinions about this. Lots of them. Some users said the avatars were powerful. They felt more engaged after talking with Amina, like they’d just had a window into a life they hadn’t considered before. Others said the whole thing felt off.

One of the biggest criticisms? Refugees already speak for themselves. They write, film, and report their own experiences. Why use AI to create a version of their voice when the real thing is already available? Critics worry that this could actually do more harm than good, by replacing lived experiences with artificial ones.

Others raised concerns about accuracy and bias. An AI, no matter how advanced, pulls from datasets that may contain outdated or skewed perspectives. And there’s always the risk that users will mistake fiction for fact.

Then there’s the ethical debate: Is it right to simulate the trauma of war using code? Even if it’s well-intentioned?

Do these AI refugee avatars actually help?

On a purely technical level? Yes. The avatars work. They engage people. They spark conversation. And that’s not nothing.

But whether this is the best use of AI in humanitarian work is still very much up for debate. It might bring attention to overlooked crises. Or it might distract from what matters most: listening to the actual people who are living through these situations.

AI can tell stories. But it can’t live them.

If the goal is to humanize refugees, maybe we shouldn’t be creating artificial humans to do it.

The tech is interesting. The purpose is noble. The impact? It depends on how, and where, it’s used.

What Are Refugees from Sudan and Chad Facing Right Now?

Let’s zoom out of the screen and into real life. In 2025, Sudan is still in chaos. A civil war has torn the country apart since 2023, mainly between the Sudanese Armed Forces and the Rapid Support Forces. It’s brutal. Think bombings, mass displacement, collapsed healthcare, and famine.

The numbers are staggering. Around 12 to 13 million people are displaced within Sudan. Millions more have fled across borders. Many of them end up in Chad, which shares a long and very porous border with Sudan.

Chad, for its part, is overwhelmed. It hosts more than 1.2 million refugees from Sudan alone. Most of them came in just the last year. Towns like Adre and Tine are packed beyond capacity. Refugees live in makeshift tents or open fields. Food and clean water are scarce. Health clinics are stretched thin. There might be one doctor for every 25,000 people.

Children are out of school. Women are especially vulnerable to assault. Diseases like cholera and measles spread quickly in overcrowded camps. The humanitarian system is buckling.

These aren’t just statistics. They’re real people in real danger.

So, where do we go from here?

AI can be a powerful tool. No doubt about it. Used well, it can explain complex issues, tell compelling stories, and maybe even build empathy. But it can’t replace real voices, real suffering, or real action.

The UNU-CPR refugee avatars are trying to do something ambitious. They’re trying to make the global public care about a crisis that often goes ignored. That’s not a bad goal. The risk lies in how we use the tool.

If we use it to open the door to deeper engagement, great. If we use it to check a box and feel informed, not so great.

We can do both: explore AI storytelling while still prioritizing the people it’s meant to support. One doesn’t have to cancel out the other.

But the balance matters.

FAQ

What are the UN’s AI refugee avatars, and who created them?

The AI refugee avatars are digital characters developed by the United Nations University Centre for Policy Research (UNU-CPR) to simulate conversations with fictional individuals affected by conflict. The goal is to raise awareness about humanitarian crises by letting users interact directly with avatars who share personal stories based on real-world data. These avatars, named Amina and Abdalla, represent a Sudanese refugee and a paramilitary fighter, respectively.

Are these avatars representing real people or fictional characters?

The avatars are entirely fictional but are inspired by real events and common experiences of people affected by war in Sudan. They are not based on any specific individual, but rather constructed personas meant to reflect broader realities. The AI responds in character based on the persona’s background and context.

What are the main concerns critics have about these avatars?

Critics are concerned that using AI to simulate refugee experiences may unintentionally dehumanize the people it’s trying to support. Some argue that it risks replacing real refugee voices with artificial ones, while others worry about potential inaccuracies, ethical issues, and reinforcing stereotypes. Supporters argue the tool can build empathy and improve engagement, especially among those who might not otherwise interact with refugee stories.

What real-life challenges are Sudanese refugees in Chad facing right now?

Sudanese refugees, both inside the country and in neighboring Chad, are dealing with severe overcrowding, food and water shortages, lack of medical care, and limited access to education. Many have fled the ongoing civil war in Sudan, only to find humanitarian support in Chad stretched far too thin. Camps are underfunded and overwhelmed, leaving people vulnerable to disease, hunger, and violence.

Is this technology being used to replace real humanitarian efforts?

No. The AI avatars are experimental tools intended to complement—not replace—real-world humanitarian work. They are not used in aid delivery or policy decisions. Instead, they aim to enhance storytelling and donor engagement by offering a more immersive, emotional understanding of conflict and displacement.