Feminist Tech Geopolitics with Leah Kimathi
Leah Kimathi is a social researcher and activist with a deep interest in governing the state in Africa, elections, peace and security (DDR, SSR, TJ), and diversity and inclusion. She has a wealth of experience spanning over 20 years working with both state and non-state actors in various African countries, including Kenya, Uganda, South Sudan, Somalia, Rwanda, Zimbabwe, Malawi, Ethiopia, and Zambia. Her portfolio includes program design and implementation, applied research, policy advisory, organizational capacity development, and curriculum design and training for civilian, police, and military personnel, among other engagements. Currently, she is the National Convener of the Peace Actors' Forum, a platform that brings together over forty civil society actors in peace and conflict in Kenya. She is also a consultant on elections in Africa for the International Foundation for Electoral Systems (IFES) and routinely consults for the International Peace Support Training Centre (IPSTC) as both a curriculum developer and a trainer/facilitator for peace support operations. Ms. Kimathi is a founding member of the Council for Responsible Social Media and a Claude Ake Scholar. She has also published extensively on the state in Africa.
What got you started in the field of tech and social justice?
My journey into the intersection of technology and social justice began almost two decades ago, rooted in my work on governance, peace, and security in Kenya and across the Horn of Africa. This region, unfortunately, is one of the most violent and conflict-prone globally. For years, my focus was on state governance and peacebuilding practices. Over time, particularly in the governance space, I couldn't ignore how elections became significant flashpoints for political violence, especially as many African countries transitioned to multi-party democracies. For the last ten years, I've been supporting various African nations, particularly in Eastern and Southern Africa, as an electoral security specialist.
As I delved deeper into elections, the role of technology became increasingly prominent. Initially, there was a real sense of "tech-for-good." Technology was a powerful resource to enhance voter registration, enable biometric identification, and speed up result transmission, promising transparency and efficiency. However, alongside this positive evolution, we quickly witnessed the emergence of technology's harmful side – its divisive nature. It became chillingly clear how easily tech could be weaponized in the electoral space to undermine the whole democracy project.
This duality of technology, its capacity for both immense good and profound harm, particularly in already vulnerable and fragile societies like ours, is precisely what pivoted me more deeply into the realm of technology. The urgent need to address its negative impacts became undeniable.
On the face of it, the intersection between tech and social justice is a no-brainer. And yet, we find social justice perspectives excluded in the pursuit of tech advancements. What does your experience in the field tell you about why this is always the case?
The fundamental reason for this disconnect lies in the very conception and ownership of technology today. As social justice activists and tech consumers, the critical questions we must ask are: who owns this tech, what values underlie its development, for what purpose, and what power dynamics are at play? There's a clear rift between the ideal of tech-for-good or tech for social justice and what technology has become.
When you look at social media and most emerging technologies, it's evident that they've been developed and are primarily owned by global corporations based in the Global North. These corporations are overwhelmingly profit-driven. Social justice, the public good, or common welfare are rarely their primary concerns. Their focus is on market expansion and maximizing profits, which often means they simply gloss over crucial social justice considerations.
It's even worse for us in the Global South and Africa, more specifically, where digital colonialism is at play. Here, global tech giants dominate Africa's digital infrastructure, data, and discourse, often extracting vast amounts of user data without adequate local oversight. This leads to content governance frameworks that fail to comprehend our unique socio-political nuances. They also control our digital public spheres without adequate consideration of local contexts and redress mechanisms for arising threats. Oftentimes, these companies relegate social justice concerns like privacy, human rights, sovereignty or democratic participation to an afterthought, because they challenge the very business models that prioritize rapid growth and profits.
Emerging technologies and digital tech, for that matter, are already fundamentally changing the world we live in. However, such a major shift should not happen without serious reflection on its impact on humanity. We should learn from history how such selfish projects in the past, like colonialism, have adversely affected humanity at scale. We should not allow this latest revolution to happen without due consideration for its impact on humanity, and these conversations must be nuanced and contextual. Short of this, and if we allow these global behemoths to operate without accountability to the users, then we will pay a heavy price sooner rather than later, and as usual, the cost will be heavier on the Global South populations.
You've set up the Council for Responsible Social Media. Can you talk a little bit about that? What does it do and what are some of your observations from your journey so far in this work?
The Council for Responsible Social Media is about three years old. It is a multi-sector platform dedicated to securing a safe and equitable digital future for Kenya and, increasingly, African society. Our core focus is on addressing digital harms stemming from social media and other emerging technologies across different sectors. We do this by actively holding "Big Tech" accountable, challenging pervasive digital colonialism and marginalization, and advocating for local innovation and localization that truly reflects African social, economic, and cultural values, all informed by feminist tech principles. Another key strategy at the Council is advocating for a more enabling legal environment at both the national and international levels, especially at the AU level, so that our digital public sphere is governed in a way that reflects our realities, hopes, and dreams as a people. All these efforts are underpinned by rigorous research to ensure that policy and advocacy are based on the real needs of the African people.
I am a founding member of the Council, which grew out of my experience working on elections in several African countries and witnessing the increasingly outsized role of tech in elections, especially where tech could be easily manipulated to cause harm through disinformation and polarization, as during the 2017 and 2022 elections in Kenya.
One of the most pressing concerns I consistently encounter in my work, both here in Nairobi and globally, is a dangerous paradox: while we've enthusiastically embraced the myriad benefits of the digital space, we remain astonishingly slow to recognize and act on its harmful underbelly. This includes the rampant spread of disinformation, the rise of political polarization, insidious forms of digital neo-colonialism, and a host of other societal vices. This failure to acknowledge and address the dark side of digital tech is particularly acute among African users.
That's precisely why, as the Council for Responsible Social Media, our mission is unwavering: we demand that these powerful platforms provide the same duty of care to users in Africa as they do in global minority markets of Europe and the United States, without discrimination. We fundamentally reject the notion that platforms are merely benign enablers of communication. On the contrary, we assert that they bear a profound responsibility in safeguarding our digital spaces and actively preventing the weaponization of digital tools against the users.
You have a background in elections, peace, and security. What are your observations on how tech has affected these spaces?
In Kenya, and indeed across the globe, technology presents a double-edged sword within the critical domains of elections, peace, and security. While its capacity for good is undeniable, we are currently witnessing an alarming proliferation of negative impacts, largely due to an environment of unregulated digital spaces and unaccountable platforms.
Our experience in Kenyan elections offers a stark example. Despite heavy investment in technological deployment—we run one of the most expensive elections in the world per capita, from biometric voter registration to digital result transmission—we've observed a deeply concerning trend: the manipulation of this technology. This has, regrettably, further eroded public trust in electoral processes. While digital media has played a vital role in informing citizens and facilitating civic education, it has simultaneously been gravely abused. The unchecked proliferation of disinformation, hate speech, and polarizing content during election cycles has deepened societal divisions, creating a volatile landscape where truth is often obscured by amplified falsehoods.
Beyond elections, the peace and security landscape in Kenya also reveals the dual nature of digital technology. With over 25% of Kenyans, predominantly youth, active on social media, and an internet penetration nearing 50%—significantly higher than many regional counterparts—digital spaces have become powerful arenas. We've seen, particularly since last year, the inspiring mobilization of Kenyan youth, popularly referred to as the Gen Zs, who have effectively leveraged these platforms to organize and push back against bad governance and government excesses. This demonstrates the immense potential of digital tools for civic engagement and accountability.
However, the flip side is equally stark. The government in Kenya, as in many other countries, has increasingly utilized the very same digital space to clamp down on civic freedoms and rights. This manifests through draconian legislation, illegal surveillance, the dissemination of state propaganda, and even the violent targeting of digital activists. The digital realm has also become a battleground for extremist ideologies. There is clear evidence of terrorist groups like Al-Shabaab exploiting online spaces for recruitment and radicalization within Kenya. We also cannot ignore the role platforms played in fueling the conflict in the Tigray region, a case that has even found its way before the courts here in Kenya.
The way I see it, while the digital space and emerging technologies hold immense promise for Kenya as an early adopter in the region, the critical absence of effective guardrails, robust regulation, and genuine accountability has allowed harms to proliferate among users, not just in Kenya but across the broader region. The benefits are clear, but without systemic change, the dangers will only continue to mount.
I'd love for you to imagine how this might pan out, out loud: "If technology was owned, developed, and deployed by people as communities rather than by people as capitalists, we wouldn't be in the heart of the many, many crises we find ourselves in." Perhaps you agree, perhaps you don't - but we'd love to hear what this brings up for you!
I largely agree with that statement. If technology were truly owned, developed, and deployed by communities, rather than being driven by a few individuals, mostly men in the Global North with the capitalist imperative for profit, control, and colonization, we would undoubtedly witness far greater resilience, equity, and justice. What we are experiencing now is technology in the service of capitalism and profit, which is precisely why the world finds itself in its current state of crisis.
I envision a scenario where, if communities had genuine control over the kind of technology produced and utilized, there would be a much greater nuancing and contextual relevance to technology. The priority would naturally shift to public good, community well-being, and genuine community needs over profits.
Moreover, for me, coming from Africa, from societies with long histories of colonialism, exploitation, and marginalization, I believe that if technology were truly owned by us as a community, and as a people, it would be profoundly emancipatory. It would help us begin to address decades, even centuries, of oppression, exploitation, and capitalist dominance over African communities.
However, I must include a crucial caveat here. My background in peacebuilding has taught me that we must be very careful when we use the term "community." We need to acknowledge the inherent power dynamics within communities themselves; we are not homogeneous. Even within communities, layers of oppression and marginalization exist. If we simply leave things to "community" without actively dismantling these internal power dynamics, we risk re-entrenching the very injustices we aim to overcome. So, even as we advocate for community ownership, we must remember that it's about localization: ensuring that technology is owned and defined from the bottom up by as many diverse members of the community as possible, with close attention to intersectionality, so that questions of gender, generation, ability, and ethnicity are taken into consideration and effectively included in the design, development, and deployment of these technologies. This is where the principles of feminist tech become incredibly relevant.
What might a decolonial feminist approach to tech look like? For people who are consuming tech, how do we do better if they aren't developing better?
As consumers, embracing a decolonial feminist approach to tech would empower us to critically interrogate the technologies we use, because technologies are not value- or interest-free. They are products of their creators, embedded with specific worldviews, biases, and power dynamics. As consumers, this approach compels us to recognize and actively engage with the harmful practices and ideologies that are inherent in, or propagated by, current tech. We begin to see digital colonialism for what it is—a new frontier of exploitation—and identify the rise of digital authoritarianism, the deepening digital divides and inequalities, and the troubling opaqueness of algorithms. A decolonial feminist approach dictates that we can no longer afford to be passive users of technology.
But recognizing these harmful ideologies is not enough; we must move to the next level of demanding and effecting change. As consumers, we must insist on accountable technology, deeply rooted in social justice principles and inherently alive to intersectionality and context. This means we push for the design of algorithms that do not perpetuate existing biases, especially those related to gender, race, and class. We advocate for content moderation policies that are culturally nuanced and genuinely protect marginalized voices rather than silencing them or enabling harm. It compels us to challenge the patriarchal and colonial biases embedded in tech development, advocating fiercely for diverse leadership and participatory design processes that authentically include women, indigenous communities, and other underrepresented groups from the Global Majority.
To truly achieve this vision, we must forge alliances across polities and geographies. We must work with like-minded progressive individuals and institutions from both the Global North and South. Together, we can advocate for robust frameworks at both the multinational and national levels that prioritize making technology truly human-centric. In this way, tech then becomes not a tool for control and extraction but an emancipatory force for building inclusive, equitable and truly democratic digital futures that serve all of humanity, not just the privileged few.