Breaking the Chain: A Feminist Approach to Bias Prevention in AI
By Cinthya Alaniz Salazar | Journalist & Industry Analyst - Wed, 03/08/2023 - 09:15

Artificial intelligence (AI) has become an increasingly important tool in the digital economy, with applications ranging from automated customer service to medical diagnosis and crime-prediction models. While AI has immense potential to revolutionize how we live and work, it also risks perpetuating negative human biases and inequality within a rapidly evolving digital economy. At the heart of the issue lies the reality that AI is being developed without a feminist analysis of the circular supply chain that contributes to its composition.

“Far from a sinister story of racist programmers scheming on the dark web, automation has the potential to hide, speed and even deepen discrimination, while appearing neutral and even benevolent when compared to racism of a previous era,” argues sociologist Ruha Benjamin in Race After Technology: Abolitionist Tools for the New Jim Code.

The role of negligent processes and systems in perpetuating negative human biases and inequality in AI innovation has been well documented. Presently, women hold only 22% of AI professional positions worldwide, according to a World Economic Forum report. Meanwhile, a study by WIRED revealed that a mere 12% of machine learning researchers are women, a concerning ratio for a field intended to revolutionize society. The underrepresentation of women in these industries is compounded by a lack of transparency and accountability, raising the risk that AI becomes an extension of existing racist, sexist and ableist systems rather than a neutral tool.

A feminist analysis of the circular supply chain is crucial to prevent the entrenchment of biased AI in the new digital economy. Because the supply chain for AI technology involves a complex web of interconnected activities, each step of the chain must be scrutinized for gender imbalances, economic disparities and environmental impacts. One area of concern is the exploitation of people and extraction of materials used to build AI technologies, from the mining of raw materials like cobalt and lithium, which often relies on child labor, to the outsourcing of data tagging to the Global South for pennies on the dollar. Another realm of concern is the environmental impact of the data centers that power AI. These facilities consume vast amounts of energy, often from non-renewable sources, and contribute significantly to carbon emissions. Finally, the development of AI by male-dominated teams can lead to gender biases being entrenched in the algorithms they create.
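The point about biases becoming entrenched in algorithms can be made concrete with a toy sketch. All of the data and thresholds below are hypothetical, invented purely for illustration: a model fitted to historically skewed records reproduces the skew as seemingly neutral predictions.

```python
# Illustrative sketch with invented data: a naive model trained on
# historically biased hiring records carries that bias forward while
# appearing objective.

# Hypothetical historical records as (gender, hired) pairs encoding a
# past disparity: 70% of men hired vs. 20% of women.
records = ([("m", True)] * 70 + [("m", False)] * 30 +
           [("f", True)] * 20 + [("f", False)] * 80)

def hire_rate(group):
    """Fraction of applicants in `group` who were hired historically."""
    outcomes = [hired for g, hired in records if g == group]
    return sum(outcomes) / len(outcomes)

def predict_hire(group, threshold=0.5):
    """A 'majority vote' model: predicts the historical rate forward,
    so the original disparity survives unchanged in its output."""
    return hire_rate(group) >= threshold

print(hire_rate("m"))                        # 0.7
print(hire_rate("f"))                        # 0.2
print(predict_hire("m"), predict_hire("f"))  # True False
```

The model never references gender "unfairly"; it simply optimizes against the historical record, which is exactly how discrimination can hide behind apparent neutrality.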

A feminist analysis of the circular supply chain would not only consider the environmental impact and work toward sustainable solutions, but would also address issues of representation and fair compensation for women and other workers throughout the supply chain. This work is currently being championed by the nonprofit A+ Alliance for inclusive algorithms, which is funding the multidisciplinary exploration of “new models and new ways of conceiving AI that correct for historic inequities and bring social programs and policy fit for the 21st century,” according to the Alliance’s Fair Network. Its global network is driven by an urgency to intersect and correct known biases embedded in data sets, algorithms, models, policies and systems to make AI and related technologies more effective, inclusive and transformational, not merely more “efficient.”

“Our goal is not to stifle AI innovation, but rather to actively consider how we can intersect and uproot visible and invisible biases through the implementation of diversity, equity and inclusive policies, independent oversight, regulation and even other technologies,” said Paola Ricaurte of the Red Feminista de Investigación en Inteligencia Artificial and Tecnológico de Monterrey, speaking at the DMI Decididas 2023 Summit.

In short, companies' race to commercialize AI threatens to cement and amplify human biases in the foundation of AI design, affecting public and private institutions and the greater digital economy. A feminist perspective stands to help companies and regulatory bodies identify immediate points of action to prevent the perpetuation of biases in AI. By taking these steps, it is possible for AI to be developed and deployed in a responsible and equitable manner that promotes social justice and fairness for all. Failure to do so risks entrenching and amplifying existing biases in our society, ultimately threatening the very foundation of our digital economy and the principles of equality and justice upon which it is built.
