Brown, Aisha. “AI-Generated ‘Poverty Porn’ Fake Images Being Used by Aid Agencies.” The Guardian, 20 Oct. 2025, https://www.theguardian.com/global-development/2025/oct/20/ai-generated-poverty-porn-fake-images-being-used-by-aid-agencies.
In this article, Aisha Brown explains how some aid groups and stock photo sites have begun using AI-generated images of people living in poverty or suffering from hunger. These pictures are entirely synthetic but are designed to look real for fundraising and awareness campaigns. Brown notes that while such images are cheaper and easier to produce than commissioned photography, they can spread false ideas about poor communities and strip real people of their dignity. The article also includes examples involving organizations like Plan International and the United Nations, showing how AI can be misused even with good intentions. One expert warns that these images “create a false sense of reality,” while another says, “AI can’t capture consent, emotion, or lived experience.”
This article is a strong source because it uses concrete examples and expert quotes to illustrate the ethical problems of using AI-generated imagery in media and charity work. It prompts readers to consider how technology can change the way we see and represent others. It is also clear and detailed, and it connects directly to current debates about truth and responsibility in digital content.
