Feminist Tech Geopolitics with Dr Payal Arora
Payal Arora is a Professor of Inclusive AI Cultures at Utrecht University and co-founder of two initiatives: Inclusive AI Lab, focused on debiasing tech, and FemLab, which takes a feminist futures of work approach to tech. She is a leading digital anthropologist who draws on two decades of user experience research in the Global South to help shape inclusive AI-enabled designs and policies. Payal is the author of 100+ journal articles and award-winning books, including “The Next Billion Users” with Harvard University Press. Forbes named her the ‘next billion champion’ and the ‘right kind of person to reform tech.’ She has been listed in the 100 Brilliant Women in AI Ethics 2025 and won the 2025 Women in AI Benelux Award for her work on Diversifying AI. Her new book with MIT Press, “From Pessimism to Promise: Lessons from the Global South on Designing Inclusive Tech,” has been longlisted for the 2024 Porchlight Business Book Awards. 200+ international media outlets have covered her work, including the Financial Times, Fast Company, Wired, BBC, The Economist, and TechCrunch. She has consulted for the public and private sectors, including UNHCR, Spotify, KPMG, Adobe, IDEO, Google, and GE, and sits on several boards, including UN EGOV and LIRNE-Asia. She has given 350+ keynotes and invited talks in 85 countries at events such as ACM FAccT, Copenhagen Tech Festival, re:publica, COP26, the World Economic Forum, and the Swedish Internet Foundation, as well as TEDx talks on the future of the internet and innovation. She is an alumna of fellowships at Harvard University and Columbia University and a Rockefeller Bellagio Resident Fellow, and currently lives in Amsterdam.
Photo credits: Eva Roefs
What’s your story, Dr Payal, and how did your work in this field begin?
I was born and brought up in Bangalore. I was an artist and didn't really like the constraints of the educational system. I went to a traditional and typical convent school, and I followed the typical trajectory of the times, especially because of the choices that were permissible then. I'm bringing this up because it is particularly gendered. I was suspended from school for three months because I had witnessed a girl kissing a boy. It was not just me – there were two others. The stigma meant that I had to go in front of the entire school and apologize for the “sinful act of witnessing” the girl and the boy who kissed. This did not even happen in the school – it was outside school, on a Saturday, at a get-together that had nothing to do with the school. The news travelled to the nuns.
I bring this up because India has such deeply gendered practices, and there's so much taboo around the very human act of kissing – the essential, fundamental need for intimacy, sexuality, and sex, which is so tied to our mental and physical well-being. Fortunately, I came from a nurturing family that helped me go through this with some sort of dignity, though I lost all my friends and was completely socially isolated. All for just witnessing something! This actually cemented my resolve to be more activist-oriented, particularly towards women and girls in the Global South, because we are so burdened by the norms that prevail in society. Oftentimes, there's also a strong sense of victim blaming.
Following all of this, I moved out and lived in the US for many years. Now, I live in the Netherlands. I'm currently a professor of inclusive AI cultures, and I've set up two labs: one, FemLab, on the feminist futures of work, with Usha Raman, a professor at the University of Hyderabad. Since she's retiring soon, I set up the Inclusive AI Lab in partnership with another fabulous woman, the head of AI at Adobe, Laura Herman. What I love about her is that she is a deeply thoughtful person in big tech. I'm trying to show that we can't think about entire institutions in binary terms of good and bad. A real feminist perspective strives to disrupt binaries. If we think in binaries, look at government and tech as bad, and decide we won't work with either, we will be very alone. If we look at things as binaries, I don't know how we can justify any collaboration. It's always about people, and we take a stand on how we want to make change from within, and not just from without.
I primarily work at the intersections of gender and technology, particularly AI-related tools. Right now, for example, I'm working with Google to build a gender AI safety protocol, which basically means that when there's a victim of deepfake porn – oftentimes women and girls – we consider the kinds of systems a big tech company like Google should build to enable them to report it and regain some kind of digital well-being. We are trying to shape that while centring women and girls in the Global South, who experience these realities differently given their contexts and lived experiences. We are looking into work in Mexico, Ghana, and India. We're also trying to see how the Global South, to some degree, has deep solidarities while also being very different, and what it might mean to take this into consideration.
What stands out to you as a contrast between the Global South and Global North when it comes to this work?
Activism in the West looks different from what it is in the rest of the world. It is a real privilege to take to the streets to protest and express yourself directly with little consequence. I see this again and again amongst my students and in the media and politics of Europe and the United States. They almost entirely put themselves on a pedestal of righteousness, as if to say, “Look, I'm out here, and you need to stand up to and speak out against your government.” I would love to see them in Russia or Syria or Afghanistan or Iran, stepping up to speak their mind. Would they be willing to pay that price? I think it is important for the West and those in the West to be more humble and recognize the privilege that comes with living in liberal democracies – something that only 11% of the world's countries have.
Feminism from the Global South perspective manifests in different ways. It's quiet in the sense that it tends not to be very in-your-face. It works within existing paradigms and questions them nevertheless. Let me give you an example. Recently, we got a grant for a project with Usha concerning women gig workers in the digital economy in India, Bangladesh, and Kenya. The COVID-19 pandemic hit shortly after, which was interesting because it was meant to be anthropological work. As you can imagine, it was a real challenge to continue the study during the pandemic. But we witnessed some real game-changing transformations. Even as people were attacking big tech, Facebook Live actually saved many Bangladeshi women who were not allowed to sell their garments on the streets. They took to Facebook Live because it's intuitive and user-friendly, and it helped them connect with their clients. They were also able to receive payments. We need to de-centre the notion of what a good or bad platform is, and understand that when people, particularly those at the margins – who are very often women and girls – are relying on a resource, it is a positive impact that must be taken into account. They make these choices not because of the tropes those of us in privilege have in mind – that they are illiterate, or the many facets of the deficiency model, where we claim that they don't know how toxic and extractive the technologies they're relying on are, and that if they knew better, they wouldn't use them. Instead, they're exercising their agency based on the relative choices available, within the systems they are part of, and from among the legitimate options they have. This is not to say that we should dismiss literacy, upskilling, and awareness work, but to recognize that it is a privilege to talk about technological extractivism and privacy and to expect those facing livelihood threats to question the system. There is no denying that privacy and data are currency, but we cannot demand that those whose livelihoods are fragile be the ones at the forefront of this fight.
I recently taught a course with a gender studies group at Utrecht University. I notice that liberal thinkers have become so fundamentalist that they actually go to extremes without acknowledging the privilege they have to do so. For instance, in one of the polls I ran during my classes, I asked students where they stand when it comes to technology, and a majority of them wanted to opt out of all technologies altogether. I puzzle over what that means – because these technologies can be significant for many communities that lack privilege in many ways. We can't put the onus on the backs of those who have experienced historic marginalization to give up relying on technology on the grounds that they are complicit in the data-extraction regime – it's not that binary. Imagine holding Afghan women to account for using WhatsApp to access education, and thus being complicit in the data-extraction regime, when they legally do not have access to schools! The digital is one of the few places where they can humanize themselves, gain education, and find solidarity. We have to understand that these choices are existential, and the digital is fundamental to their well-being. This is the climate we are in – it is part of the geopolitics of polarization. Unfortunately, women and girls are the ones being thrown under the bus in the process.
So much of what you’ve named here relates to big tech choosing not to centre social justice up front, and instead only sometimes adding it in and stirring the mix. What are your observations on why we care more about moving fast and breaking things than about social justice values?
Social justice is such a loaded term, and it tends to be dictated by American interests and concerns, particularly racial ones. Start by looking at the literature on data justice and social justice. Data justice is connected to social justice, relates to the online space, and is all about notions like consent – which, by the way, I am actually not a fan of, because I think it puts the onus on those who are marginalized. It suggests that there is legitimate choice when there isn't. For example, the guardianship system that a lot of women are subject to affects this: women do wind up “consenting” to their phones being tracked by their husbands, in-laws, and in some cases, sons. It is part of the cultural expectation that they cannot enjoy privacy, yet none of these standards apply to men. When you think of data justice as grounded in consent, you also have to see how much of this consent really comes from legitimate choice. The literature on data justice is heavily Eurocentric.
On the other hand, when you look at social justice, the literature is very racialized and centres on African-American interests. I've dealt with a lot of tech companies, particularly DEI groups and equity product teams, and they have a limited understanding underpinning their approach to social justice. What's interesting to see in the back end is the way in which these big tech companies are organized. The equity product team is separate from the international team, and they don't talk to each other. What's also interesting is that they don't talk to their extended teams in the Global South, because those are outsourcing centres – even though, in these outsourced places, you have very smart people with many degrees. They're often doing night shifts and analysing American consumers. Yet these DEI, equity product, and international teams are the ones coming up with structures and policies for the Global South.
It’d be almost farcical if it weren’t disturbingly true. The way they’ve organized themselves internally reflects a lot about how they see the rest of the world. What's especially interesting to me is that one third of Silicon Valley is made up of foreigners, and a significant number of them are Indians. However, they are Indian men of a particular caste. Caste and gender matter in terms of what they prioritize and for whom. We see this manifest in different ways. For example, take the idea of locked profiles. It actually aligns with patriarchal thinking in the Global South – the thinking underlying questions like “Why are you on the streets at this time? Why didn't you stay at home? Why are you roaming freely? What were you wearing?” If you look at how the safety protocols are designed, you will see this particular culture reflected in them, and in many, many instances, they reflect a victim-blaming mindset. This is part of what we are trying to do – we want to push back and have them recognize that this is where they will lose people who are trying to get justice through their system, because it’s just not accessible. Think about it – if you need content removed, you’d have to reach out to the company, which is not even within your nation or answerable to your government.
Things are changing, for sure, but we’re not going to see the transformation anytime soon. We have to recognize that mere representation is not enough. Look at the creator economy – there is a glass ceiling for women creatives here, because the more visible they become as influencers, the more misogyny they face. And it gets to the point where remaining online so affects their mental well-being that they drop off the platform altogether. There is a double gendered discrimination – one is in getting online, and the other is in staying online.
And yet, there’s reason for optimism. You've talked about inclusive tech in your book, From Pessimism to Promise, with examples from the Global South. Can you share some of your insights?
We tend to see inclusion as if it's some altruistic act. It says something when AI for Good as a movement comes predominantly from Silicon Valley, where we’re again seeing the deep colonial mindset of “We're going to do good in the world. We're going to spread democracy,” and the usual civilizing tropes, underpinned by the notion that they don't have much to learn from the Global South. This mentality has very disturbing consequences. For example, in my book I talk about AI-enabled drones developed by the “AI for Good” lot that can detect poachers entering safari spaces in different parts of Africa.
However, inclusion requires you to have a more holistic perspective.
We need to ask ourselves why these tools were built. Was it because the issue they’re addressing is the biggest challenge? You do not want these animals going extinct from poaching, and it is important to mitigate that. But why does poaching happen? Why is the core assumption that it happens because it can’t be detected? One reason is that these safari spaces are huge and the rangers patrolling them are under-resourced. Something like a rhino horn costs more than, say, cocaine. The groups that prop up poaching operations are part of a multi-billion-dollar industry and are highly militarized. How is a ranger expected to take them on? What incentive do they have to fight a poacher? They are not paid their salaries on time – many haven’t been paid for months – they don’t have healthcare, their shoes are terrible, they have little to no protective gear, they are nearly always unarmed, they don’t even have proper vehicles, and there aren’t even paved roads! There is also complicity, because in many instances, rangers and poachers live in the same community and may even be related. In some cases, rangers may be informants themselves. These might look like boring details, but they are exactly what enables poaching – not an information deficit in which rangers can’t track poachers because they don’t know where they are. The challenges are systemic.
We have to look at the root causes, but we cannot put ourselves on a moral high horse when we do so. I remember using this example in class to ask my students what we should do to end poaching. Many of them recommended shooting to kill – that is, arming the drones to shoot poachers on the spot when they enter. This demonstrates a tendency to dehumanize. But would they come up with such a response to a comparable crime in the Netherlands, for example? So I drew a parallel and asked if we should deploy drones in Amsterdam to shoot a criminal on the spot if they were to attack someone. The immediate response was, “No, because we have rights and there’s a law, and you cannot just shoot people.” When I asked what the difference was between the two situations, though, they didn’t have an answer. The “us and them” component was clear. This is what I speak about in the book – we have double standards in how we see the majority world.
We need to rehumanize ourselves – to the world and to the companies, of course, but also to each other. In many parts of the Global South, we dehumanize people from rural areas. After my book came out, a number of entrepreneurial tech companies reached out to say they were moved by what they had learned about rural areas and their uptake of tech.
If technology were owned, developed, and deployed by communities instead of by capitalists, we might not find ourselves in the midst of so many crises. What comes to mind when you hear this?
It’s interesting you’d ask this. I'm doing a keynote for a cooperative AI group in Turkey, which is all about the idea of the shared economy. There’s a lingering question of why we don’t have more Wikipedias – and many talk about the need for platforms like this because they can be cooperative or worker-owned. All this is great in theory, but in practice, Wikipedia exists because one person, Jimmy Wales, refuses to sell out. I once met him at the Copenhagen Tech Festival and asked him why there weren’t more, and he said that he cares about the integrity of what he does, and that is rare. So it is alive for a very personal reason, rather than an institutional one.
It is true that cooperatives, more often than not, fail. The reason is the classic tragedy of the commons: nobody feels ownership of a particular space, we don't distinguish between representative and extractive hierarchies, and we do not fully understand incentives. This is simply because we're not all perfect human beings. And then you have bad actors who also bring down a space.
Elinor Ostrom, who won the Nobel Prize, came up with an alternative to the tragedy of the commons. She argued that we don't have to go the way of the tragedy of the commons, because if you want good governance, there has to be thoughtful and accountable hierarchy. We don't have to fall into the binary of “capitalism or cooperativism.” We need a model that enables entrepreneurship and encourages the aspirations of individuals, while recognizing that we cannot lose sight of humaneness. I’m extrapolating from Ostrom’s theory here.
We need more out-of-the-box thinking about how we can build better, especially in the age of AI, where we are going to experience a lot of experimentation, unlearning, and relearning. None of it should come at the price of a generation, or of those who are the most marginalized.
What could a feminist perspective bring to both the consumption and the development of tech?
In a book I wrote with Usha and Renee on a feminist take on the digital economy, we talk about this. There’s been a lot of talk about the rise of femtech, as if to suggest how feminist technology is becoming. In reality, though, it is not “feminist” tech as much as it is “female” tech. We’re just another material to be extracted and leveraged. A lot of femtech relates to reproduction, cycle tracking, and menstruation apps. These apps can also be turned against you, as the data are shared with third-party vendors, including governments. So if you decided to have an abortion, it could be an issue because your data could be in the hands of the wrong people. Or if you identify as gay, that kind of data getting out can be harmful if you are in a country where your identity is grounds for legal punishment. We need to look at the political economy of these developments.
I'm not dismissing the emergence of femtech, but what we propose is to replace that notion with feminist work, where tech is everyday labor. If we can understand that there is no such thing as tech without our everyday sweat and tears and effort, we can also understand data as part of our culture and our everyday communicative action. Data is not out there as some abstraction far away from us – it is very much an extension of ourselves, an embodied form of our expressions and aspirations. We have to keep in mind that human beings are building these tools and bringing them to life. Technologies are always in the becoming. We have the ability, through our collective agency, to shape the way algorithms function and the way our data get to shape how the world sees us.
You can innovate with very limited resources. A feminist take is to put at the heart of design the fact that data is our culture – something that can support our well-being. We should ensure that the marginalized majority are at the centre of where these tools go. If they are not serving us, then we should be willing not to go in that direction at all.