
Technology shapes nearly every aspect of modern life—from the algorithms that curate our social media feeds to the artificial intelligence systems that generate images and make decisions affecting millions. This dialogue—the third installment of Ford’s “Learning Globally, Leading Locally: Conversations With Ford Global Fellows” series—brings together two fellows who work with communities across Africa and historically excluded groups in the United States to end the systemic biases and exploitation that have long been overlooked under the guise of technological advancement.
Nettrice Gaskins, assistant director for the Lesley STEAM Learning Lab at Lesley University in Cambridge, Massachusetts, fights bias embedded in both technology and education. Her work takes an equity-oriented approach to Science, Technology, Engineering, Art, and Mathematics (STEAM). In her published writing and doctoral work, and through her daily practice of creating AI-generated art that counters racist imagery flooding digital spaces, Nettrice demonstrates how representation and resistance can coexist in the same creative act.
Mercy Mutemi works to narrow this inequality daily. She is the executive director of the Oversight Lab in Kenya, which supports tech workers and vulnerable communities harmed by systemic power structures in the tech ecosystem. Through strategic litigation, Mercy is challenging the doctrine of platform immunity—the legal shield that has long protected technology companies from accountability—and building an Africa-wide infrastructure to ensure that technological development doesn’t replicate colonial patterns of extraction. In 2023, she was included in both TIME’s “100 Next” and Business Daily’s “Top 40 Under 40” lists.
These two Ford Global Fellows agree: Technology is not neutral, and it must be reconsidered through the lens of equity. In this conversation moderated by Adria Goodson, director of the Ford Global Fellowship, Mercy and Nettrice explore systemic harms within tech innovations, the questions that drive them forward, and the learned wisdom that sustains them in their fights for justice.
Disrupting Inequality Through Strategic Action
Adria Goodson:
“Nettrice, would you please tell us a specific story about how you’ve worked to disrupt inequality within the past six months?”
Nettrice Gaskins:
“The maker movement, an integral part of STEM and STEAM practices, is about tapping into the natural ‘do-it-yourself’ inclinations of creative people and the educational power of creating and making things. Yet many groups are underrepresented in maker culture, STEM, and STEAM. Since my first book was published in 2021, people around the world have shared with me how they were inspired to use it. For example, using my framework, a creative digital art studio named AMPL Labs is currently working with a Black youth-led education organization on a digital literacy STEM program. This aligns with my work of creating dynamic learning environments where more diverse people feel they belong.
Every day for eight years, I have been posting AI-generated art online. Usually, I’m trying to counter some of the messaging and some of the imagery that I’m seeing coming out of the AI space, and it’s also therapeutic for me. I’ve also had strangers approach me to say that they see themselves in the work, which is very important.”
Adria Goodson:
“Mercy, let’s turn to your work.”
Mercy Mutemi:
“In 1996, it was written into the law in the U.S. and E.U. that you couldn’t sue social media platforms for their algorithms, because these companies needed to be protected so that they could continue developing their technology. That cascaded over time into these companies making algorithmic decisions based on what makes the most money for them. This means algorithms pushing hate speech and incitement content, catalyzing civil conflict, and the platforms having absolute power to get away with anything.
But this April, in Kenya, a decision was made that social media platforms do not have immunity when it comes to violations of rights. While platform immunity might be a very well-recognized principle in most of the Western world, in an African context, it does not exist because of the colonial history we have that gave us constitutions that mainstream human rights. This decision stemmed from a case that I filed under my law firm three years ago.
About eight months ago, I started to really think about the sustainability of the work I’m doing. I represent content moderators who have been harmed by tech companies through labor violations, and communities that have been harmed by algorithms. But all that was done in the context of a law firm, which is unsustainable because it’s not just one person being harmed, it’s entire communities. The response that is necessary is not reactive but, rather, systemic. And that was the reasoning behind coming up with the Oversight Lab—to be an African response to what we see as systemic harm.”
Breaking the Patterns of Harmful Technology
Adria Goodson:
“Nettrice or Mercy, do you hear something in each other’s work or stories that makes you say, ‘I want to build on that now?’”
Nettrice Gaskins:
“I have something on the idea of harmful technology. Yesterday, I responded to a post I saw on Instagram about online creators using offensive AI-generated content to flood social media with memes. They are using Google’s V3 AI-generated images of Black women as primates to perpetuate racist tropes. Online creators took a, quote, ‘fairly innocuous’ trend on social media and repurposed it to dehumanize Black women. I was reminded of a field review I wrote that draws on different media and scholarship to understand how racial bias in early photographic and film techniques carried over into machine learning and facial recognition AI, leading to harmful technological designs.”
Adria Goodson:
“It’s a serious situation that then spills into the expectations of what can happen in Africa, right? If we are dehumanizing an entire race, that allows us to continue to dehumanize and transactionalize an entire continent.”
Nettrice Gaskins:
“That’s right!”
Mercy Mutemi:
“I think the work of documenting over time is so important. The work we do to advocate for change has to be data-led so people have an idea of what the problem is. But it becomes difficult to design effective solutions because accurate data is missing. Where others meet us from an unequal position is that they have data to support all the marketing jargon they want to use: ‘connecting people, advancing technology.’ A lot of the time, we don’t meet them with the same data showing where the harm is.
You don’t even have to draw a very long line between dehumanizing people and real-life harm. We saw it in Ethiopia and Rwanda: if you are curious enough to study the patterns of war, conflict, and genocide, dehumanization is the first step. You make it comfortable for everybody to call people the names of animals. The next step is just incitement. Violence becomes so much easier to achieve because people have become desensitized. So what starts off as boardroom conversations about budgets, about prioritizing profit and algorithms, has a very direct impact on what happens in day-to-day lives.”
From Restorative to Systemic Justice
Adria Goodson:
“We’re all jumping into a river that’s hundreds of years old. This fight is not new, nor will it end, even if we were to win.
Mercy, Nettrice, what are the questions at the forefront of your hearts or minds right now? What is it you’re wrestling with now that you don’t yet have an answer to?”
Nettrice Gaskins:
“My question is also a long-term observation of how we break these cycles. One-off workshops and classes are not enough to put people in a space where they feel empowered. I’m quick to remind people that my mother was a computer programmer, a rarity as a Black woman at that time, so of course I swim through technology as well as make art. That’s why I’m comfortable. There are so many people who don’t have that exposure and who fear technology, rightfully so. But when it comes time to develop these tools, how do we make sure that we are either developing our own or that we are at the table when the tools are made, so that they don’t have these biases?”
Mercy Mutemi:
“How do we arrest the harm before it happens? One of the most frustrating aspects of my work is by the time communities come to me or workers come to me, they’ve really been harmed by tech companies or by the tech itself. It always feels quite discouraging that the best we can get for them is restorative justice, and then the pattern continues. How do we ensure the next person isn’t hurt? How do we change behavior before another person has been harmed in the exact same way? Is there a way that we could leverage litigation to go to the root of the behavior and actually change the behavior, so that the one case we take is the end of the harm?”
Finding New Definitions Through the Ford Global Fellowship
Adria Goodson:
“Could you share one experience within the FGF community of practice that has influenced the way you’re doing your work now?”
Mercy Mutemi:
“Before my fellowship, I think I undervalued the connection between my work and what it does to you, to me, in terms of safety and security and risk. Doing this kind of work is risky because of the context that we operate in. It’s very easy to mischaracterize people who do this advocacy work as enemies of technology and enemies of progress. It has happened to us.
Through our FGF interactions, especially through the first convening, I see now that I don’t have to present the problem as ‘tech companies are bad,’ because then what policymakers, government, and the public hear is that we’re an enemy of progress or we’re anti-tech. Instead, we’re trying to achieve several things at the same time: Yes, let’s have a dream of a million people hired through digital companies and tech companies, but at the same time, let’s make sure those million jobs are done in a way that actually protects and preserves the inherent dignity of the people doing them. As a continent, we do want to be part of the next revolution—but at the same time, we don’t want to be exploited when we do that.”
Nettrice Gaskins:
“I really connected with the FGF trip to Soweto a few years ago. We learned about the Soweto Uprising of 1976, when the youth fought back and resisted the government’s plans to impose Afrikaans as the medium of instruction in schools for Black students. Being in that space was the moment I realized how important it is to say, ‘This isn’t working. How can we…?’ It had me looking at the power that young people have to change things, thinking more about my own work and what I am trying to counter, and also just being with the fellows in that space.”
Adria Goodson:
“Do you have any guidance or advice for incoming Ford Global Fellows?”
Nettrice Gaskins:
“I think the only word is: listen. Oftentimes, we have our own voices and points we want to get across, but listen for alliances, connections, opportunities.”
Mercy Mutemi:
“Imagine a space where your questions, your doubts, your insecurities find a home, a safe space for you to be able to share them. For you to feel very seen and be given an opportunity to discuss what is bothering you and what is stopping you from getting to the next stage. And if you can imagine that, then allow yourself to use the Ford Global Fellowship as that space—because that’s what it’s been for me.”
This post is part of a conversation series with Ford Global Fellows. Read more.
