By Maitreya Shah
Digital applications and websites have become an integral part of people's lives, with their reach now extending to more personal, intimate facets like sexuality, wellness, and bodily autonomy. A plethora of apps on the market provide services in areas such as safe menstruation, sexual health, sex positivity, and the prevention of abuse and violence, among others. Applications like Ruby (developed to give women a way to track their sexual health), Nurture (to help women track their pregnancies), Maya (to help manage menstruation), and Ferly (for sex positivity) are only a few examples. While the significance of such apps cannot be denied, the privacy of their users is of utmost importance given the amount of data these apps collect. This piece examines the dichotomy of individual and relational rights in their interplay with corporate surveillance and its objective of commercialization.
Privacy International, a digital rights organization, published a report in 2019 highlighting serious concerns about the way menstruation apps shared their users' data with companies like Facebook for commercial gain; or, as the report put it, "turning your periods into money". Apps that are non-compliant with data protection regulations like the European General Data Protection Regulation (GDPR) are not uncommon in the market. Such apps pose serious threats to the privacy and security of their users, exposing them to uses of their data they never consented to in the first place. While the only way of safeguarding one's privacy from such apps is to avoid using them altogether, even many apps that do comply with these regulations still raise concerns. Users are often unaware of the amount of data these companies hold about them, as happened with the Austrian privacy activist Max Schrems, who received a CD containing 1,200 pages of data that Facebook had on him. The categories and types of such data also matter; for instance, Microsoft researchers have studied "inversely private information": data that third parties hold about an individual but that the individual herself cannot access. Moreover, on the role of sex in making women more vulnerable to privacy invasions, Dr. Anita Allen (2000) writes: "If women are more active consumers of certain categories of consumer goods and health services, it is conceivable that private industries would wish to monitor and record their movement through cyberspace more closely than that of men".
The GDPR, one of the most progressive data protection regulations of the present time, is established on the premise of individual rights, or what feminist thinkers would call the 'sovereign subject'. Here, the sovereign subject is construed to mean an individual who follows only her internal individual order, with no accountability towards the larger social community. Under these regulations, the individual has the right to restrict the use of her data and, if needed, to exercise the rights 'to be forgotten' or 'to erasure'. Serious questions remain, however, about the consent frameworks in such regulations, the amount and types of data available to companies, and the tenets on which the idea of individual freedom rests. It is also important to address the primary distinction between generically sensitive personal data and data that can render an already marginalized population more vulnerable. Both the European Union's GDPR and India's proposed data protection bill rely on the free, voluntary consent of data subjects (individuals) for the processing of their personal data. These regulations draw a primary distinction between public, often anonymized, data and the private data of an individual. If a parallel is drawn with feminist theories, this is attributable to the separation of public and private life, a popular point of criticism for several thinkers (Weinberg, 2017). The traditional public-private divide, and the oppression women face behind closed doors in their private lives, is replicated in the virtual world: "those who worry about the perils women face behind closed doors in the real world will find analogous perils facing women in cyberspace" (Allen, 2000). Issues of objectification, sexual abuse, and unwanted sharing of health and other critical personal information thus have to be viewed through a gendered lens, with a more contemporary approach.
An underlying criticism of the consent frameworks in these regulations has been their sheer assumption of equality, through an over-emphasis on the individuality of data subjects. The right to privacy has been framed as an individual's proprietary ownership of her privacy, based on the idea of a contract: either agree to the terms, or be deprived of digital participation (Weinberg, 2017). Although the idea of individual freedom used in privacy regulations comes from liberal democratic theories of the 'sovereign subject', there is ample literature, both feminist and otherwise, critiquing the foundations of such theories. Marx (1973) critiques this individual's relations with private enterprises as a force that intensifies underlying social power dynamics, calling the result a 'circumscribed individual'. Pateman (1988) explains how such contracts presuppose patriarchy, generating political rights in the form of domination and subordination. Such contracts, said to exemplify individual freedom, actually undermine the universal freedom that matters more. As Eva Kittay (1999) puts it, rather than focusing on the properties that make individuals rational and self-interested, we should focus on mutual relations of care and concern. Several of these thinkers have also noted the presupposed privilege of data subjects in opting out of state surveillance or commercialization by companies; consent thus also rests on the class privileges of data subjects (Weinberg, 2017).
This discussion of the dichotomy between individual and relational rights is important in the present context, given the patriarchal structures that shape the digital rights discourse: the presupposition of a data subject capable of, and privileged enough to be, giving free, informed, and voluntary consent to the processing of her personal data. There could be better ways of managing this privacy discourse, for instance by placing greater emphasis on the collective agency of data subjects through the rationales of interdependence and interrelation. In traditional areas of development, detailed research findings document the benefits of the collective agency of groups (Evans and Nambiar, 2013). Successful platforms involving cooperative infrastructure and women's collective agency, based on shared concerns of economic and social wellbeing and aimed at enhanced social accountability, have existed for a long time. Such agency usually takes the form of a group's collective, bottom-up resistance against oppression and social control. So far, collective agency has been limited to a local boundary, a geographical territory, or a particular field in tangible form; if cyberspace is taken as a territory, privacy as a public good, and gender as the binding factor for the group's agency, these ideas of collective action can be replicated to ensure enhanced privacy for women in the digital environment. For critical personal data concerning health and sexuality, collective agency can perhaps help overcome the present challenges that individual freedom and consent frameworks do not adequately address.
As the Privacy International study also found, one of the apps did take consent from women for the use of their data, but without informing them how the data would be used. While this violates the European GDPR, actual implementation and the apps' interfaces with their users remain troubled territory. Seen in a larger context, the ideas of voluntary consent, individual agency, and presupposed equality are also problematic for other vulnerable groups, such as persons with disabilities, queer people, or those belonging to certain ethnicities. Eva Kittay (1999) discusses the inability of several of these groups to compete for the goods of social cooperation, owing to often-inequitable dependencies.
In the present context, while several of these apps offer safe spaces for women to talk about their most intimate insecurities, they also become breeding grounds for profit-making corporations seeking to exploit the information so received. To conclude, it is important to change the narratives around consent frameworks with a greater emphasis on collective agency, while simultaneously making women more aware of their rights. Such a discourse could also help create more equitable solutions for other marginalized groups, like persons with disabilities, who often have to prioritize the use of assistive technologies over the privacy of their data.
Allen, A. Gender and Privacy in Cyberspace (2000). Faculty Scholarship at Penn Law, 789.
Evans, A. and Nambiar, D. Collective Action and Women’s Agency. Women’s Voice, Agency, and Participation Research Series no. 4 (2013). World Bank.
Kittay, E. F. Love’s Labor: Essays on Women, Equality, and Dependency. New York: Routledge (1999).
Pateman, C. The Sexual Contract. Stanford, CA: Stanford University Press (1988).
Weinberg, L. Rethinking Privacy: A Feminist Approach to Privacy Rights after Snowden. Westminster Papers in Communication and Culture (2017). 12(3), pp.5–20.