
The information society isn't built for women - here's why



Technological advances are frequently heralded as democratic equalisers, with digital technologies promising access to knowledge, economic empowerment, and global connectivity. However, this utopian vision obscures the power dynamics embedded within digital frameworks. In practice, the so-called 'information society', with its growing technological emphasis on information, continues to reinforce racial and gender biases, entrenching the interests of the few at the expense of the many (Mansell, 2010).


The Myth of Objectivity in AI and Machine Learning


Artificial intelligence and machine learning systems are often portrayed as neutral tools for progress. However, as Ruha Benjamin (2019) points out, AI does not operate in isolation; it reflects the biases of those who develop it and serves the interests of those who control it. Safiya Umoja Noble (2018) highlights this issue in the context of Google search engine results, showing how algorithmic decision-making evaluates people against a disproportionately white, male, and able-bodied norm. For instance, Noble's study found that the top search results for 'Black girls' consistently returned hypersexualised and exotified images, reinforcing harmful stereotypes. This is not an isolated issue; AI systems shape digital representation, influencing who is visible, how they are portrayed, and whose voices are heard.


Facial Recognition and the Digital Discrimination of Women


Facial recognition technology further reveals how discrimination is embedded into digital tools. Research by Buolamwini and Gebru (2018) demonstrated that commercial facial analysis systems misclassify darker-skinned women at error rates of up to 34.7%, compared with just 0.8% for lighter-skinned men. These biases translate into real-world consequences: in 2020, Robert Williams, a Black man from Detroit, was wrongfully arrested on the basis of a false facial recognition match. As AI technologies become increasingly embedded in surveillance and policing systems, such errors disproportionately harm marginalised communities, particularly women of colour, who face both gendered and racialised discrimination in digital spaces.
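Disparities like the one Buolamwini and Gebru documented are uncovered by disaggregated evaluation: computing error rates separately for each demographic subgroup rather than reporting a single aggregate accuracy figure, which can mask large gaps. The sketch below illustrates that auditing idea with invented example numbers (the group labels and counts are illustrative only, not the Gender Shades results):

```python
from collections import defaultdict

def error_rates_by_group(records):
    """Compute per-group error rates from (group, correct) records.

    A single aggregate accuracy can hide large gaps between groups;
    disaggregating by subgroup makes those gaps visible.
    """
    totals = defaultdict(int)
    errors = defaultdict(int)
    for group, correct in records:
        totals[group] += 1
        if not correct:
            errors[group] += 1
    return {group: errors[group] / totals[group] for group in totals}

# Invented audit data: nearly all errors fall on one subgroup,
# yet the overall error rate alone would look modest.
records = (
    [("darker-skinned women", False)] * 35
    + [("darker-skinned women", True)] * 65
    + [("lighter-skinned men", False)] * 1
    + [("lighter-skinned men", True)] * 99
)

rates = error_rates_by_group(records)
overall = sum(1 for _, correct in records if not correct) / len(records)
print(rates)    # per-group error rates expose the disparity
print(overall)  # the aggregate figure obscures it
```

In this toy data the aggregate error rate is 18%, while the per-group rates are 35% versus 1% - which is why audits of this kind report results subgroup by subgroup.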


Algorithmic Bias in the Workforce and Economic Inequality


Beyond surveillance, AI-driven technologies also reinforce economic inequalities, disproportionately affecting women. Uber's algorithm, for example, has been found to assign lower-paying jobs to female and Black drivers (Merchant, 2020).

Similarly, Amazon scrapped its AI-driven hiring tool after it was found to systematically downgrade applications from women, because the historical hiring data it was trained on favoured men (Dastin, 2018).

This reliance on biased historical data ensures that gender and racial disparities are not just perpetuated but automated, reinforcing exclusion under the guise of efficiency and meritocracy.
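The mechanism is straightforward to sketch: a system that 'learns' from past hiring decisions scores new candidates by how much they resemble past hires. In the toy example below (all data is invented for illustration; this is not Amazon's actual system), a naive scorer trained on skewed historical records assigns otherwise-identical candidates different scores by group:

```python
from collections import Counter

def train_hire_rates(history):
    """Learn, per group, the fraction of past applicants who were hired.

    A scorer built only on historical outcomes inherits whatever
    disparities those outcomes contain - bias in, bias out.
    """
    applied = Counter(group for group, _ in history)
    hired = Counter(group for group, was_hired in history if was_hired)
    return {group: hired[group] / applied[group] for group in applied}

# Invented history: equal applicant pools, skewed past hiring decisions.
history = (
    [("men", True)] * 60 + [("men", False)] * 40
    + [("women", True)] * 20 + [("women", False)] * 80
)

scores = train_hire_rates(history)
print(scores)  # the learned scores simply reproduce the historical skew
```

Nothing in the code discriminates explicitly; the disparity enters entirely through the training data, which is exactly how exclusion becomes automated under the guise of neutral, data-driven decision-making.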


The Gendered Digital Divide in Knowledge Production


The idea that digital platforms enable open and democratic access to knowledge is fundamentally flawed. A striking example is Wikipedia, where 90% of editors are men (Ford & Wajcman, 2017), leading to male-dominated knowledge production. This digital gender gap is further exacerbated by the harassment women face when they attempt to participate in online spaces. Additionally, algorithmic invisibility often silences women's voices, replicating the same structural exclusions present in offline society.


Reimagining a More Inclusive Digital Future


To build a truly equitable information society, digital technologies must be designed with inclusivity, accountability, and fairness at the forefront. Increasing diversity in AI development and training data can help mitigate entrenched biases. Furthermore, acknowledging algorithmic discrimination is essential for implementing effective regulation in hiring, policing, and economic systems.

Recognising the human input required for technology to function, rather than treating it as a neutral force, is crucial in creating an internet that serves all users rather than reinforcing existing inequalities.

Written by Niamh Walters

Niamh Walters is a passionate advocate for intersectional feminism. She is dedicated to empowering marginalised communities by addressing the overlapping systems of oppression that shape their experiences. Her work fosters inclusivity and promotes access to education as a powerful tool for achieving equity.


 
 
 





SAFIGI Outreach Foundation Ltd (Safety First for Girls) is a non-profit organization registered in Zambia, serving as a global network of girls, women, groups, and allies to demand a safer world for girls through Safety Education, Advocacy and Research. Learn more about us.


2014 - 2021 © SAFIGI Outreach Foundation LTD.   Proudly SAFIGIan.
