The Digital Assault

Photo of woman looking at smartphone and seeing "deep fakes." iStock Getty Images

As we observe Sexual Assault Awareness and Prevention Month this April, a new and insidious threat is emerging from the digital realm: the weaponization of Artificial Intelligence against women and girls. While the technology promises innovation, its current trajectory is marked by the mass non-consensual exploitation of images and the automation of age-old biases. From rampant deepfakes to algorithms that systematically exclude women from economic opportunities, AI is increasingly functioning as a tool of gender-based violence and discrimination.

The statistics regarding AI-generated content are staggering. Recent data reveals that 98% of deepfake videos online are pornographic, and 99% of those deepfakes target women and girls. This is not merely a technical glitch; it is a profound violation of bodily autonomy and agency. Explicit AI-generated deepfake content has grown over 550% year-over-year, yet approximately 1.8 billion women and girls worldwide have no laws protecting them from this form of digital violence.

These deepfakes represent a digital evolution of image-based sexual abuse, stripping victims of their dignity at a scale previously unimaginable. The trauma is real; over two-thirds of women journalists and activists report online violence, with over 40% stating it led to real-world attacks.

Experts argue that this hostility is “woven into the very data AI learns from.” AI models are trained on vast amounts of publicly available content that reflects society’s structural inequalities. Consequently, these systems do not just reflect bias; they often uncritically reproduce and exaggerate it.

The anti-woman nature of current AI is further fueled by a lack of diversity in the rooms where these models are built. Women make up only 22% of AI professionals and less than 14% at senior levels. Because the creators are predominantly male, the technology often mirrors patriarchal structures, portraying women in domestic or subservient roles while associating men with leadership and careers. This is visible in our daily interactions: voice assistants like Alexa and Siri were originally designed with subservient personalities and default feminine voices, reinforcing stereotypes that women are suited only for service roles.

The bias embedded in AI extends far beyond the digital screen, manifesting in life-altering consequences that automate systemic inequality. In the professional world, hiring algorithms have been caught systematically downgrading resumes that contain indicators associated with women, such as participation in female-dominated professional networks or mentions of women’s colleges.

Similarly, in the medical field, diagnostic AI systems trained primarily on male-centric data frequently misdiagnose women, particularly regarding heart disease and thoracic health, because the systems are often unaware of symptoms that present differently in female patients. These economic barriers are further reinforced by biased financial algorithms that deem women less creditworthy, limiting their access to microloans and essential resources even when women are statistically better at repaying debt.

The evidence is no longer just anecdotal; it is structural. As we close out Sexual Assault Awareness and Prevention Month, we must confront a difficult reality: the very tools marketed as the future of progress are often built in ways that damage women’s agency. From the mass, non-consensual exploitation of women’s images in pornographic deepfakes to algorithms that systematically downgrade female resumes, AI is functioning as an automated extension of the patriarchy. In response, a growing movement of activists and researchers is calling for strategic refusal, a collective boycott of AI technologies until they meet fundamental human rights standards.

This movement argues that we should no longer treat AI as a “neutral” or “objective” tool. Instead, we must recognize these systems as reflections of the patriarchal data and the predominantly male creators that shape them. Currently, the gender gap among AI professionals is creating a massive blind spot, as the unique lived experiences of women are ignored during the design process. Because much of the industry neglects context-sensitive development, the result is often “software, made woman, made servant,” visible in voice assistants originally programmed to respond playfully to sexual harassment.

True reform requires that tech companies move beyond performative ethics toward mandatory transparency. We must demand compulsory intersectional gender audits and discrimination testing for all high-risk AI systems before they are released to the public to identify subtle structural biases.
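One widely used discrimination test that such an audit might include is the “four-fifths rule”: compare the selection rates an algorithm produces for different demographic groups, and flag disparate impact when the disadvantaged group’s rate falls below 80% of the advantaged group’s. The sketch below is illustrative only; the outcome data is invented, and a real audit would cover many more groups and metrics.

```python
def selection_rate(outcomes):
    """Fraction of candidates selected (1 = selected, 0 = rejected)."""
    return sum(outcomes) / len(outcomes)

def four_fifths_check(group_a, group_b, threshold=0.8):
    """Compare selection rates between two groups.

    Returns (impact_ratio, passes), where impact_ratio is the lower
    group's selection rate divided by the higher group's, and passes
    is True when that ratio meets the four-fifths threshold.
    """
    rate_a, rate_b = selection_rate(group_a), selection_rate(group_b)
    lower, higher = min(rate_a, rate_b), max(rate_a, rate_b)
    ratio = lower / higher if higher else 1.0
    return ratio, ratio >= threshold

# Invented hiring outcomes: 1 = advanced to interview, 0 = screened out.
men = [1, 1, 1, 0, 1, 1, 0, 1, 1, 1]      # 80% selected
women = [1, 0, 0, 1, 0, 0, 1, 0, 0, 0]    # 30% selected

ratio, passes = four_fifths_check(women, men)
print(f"impact ratio: {ratio:.2f}, passes four-fifths rule: {passes}")
# With these numbers the ratio is 0.38, well below 0.8 - the system
# would be flagged for disparate impact before release.
```

Even a check this simple makes the point of the audit demand concrete: bias that is invisible in a model’s marketing copy becomes measurable the moment selection rates are broken out by group.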

Furthermore, it is essential to dismantle the barriers that keep women out of the industry. Building diverse development teams ensures that the perspectives of those most affected by AI are integral to the design process, making it far more likely that harmful biases are noticed and eliminated.

Finally, corporate accountability must evolve past the implementation of simplistic filters that merely block flagged material without addressing the structural biases embedded deep within the models’ architecture. The status quo allows inequality to be automated at scale, and history shows that when corporate interest determines the research agenda, the rights of the marginalized are treated as mere externalities. It is time to demand a digital future that respects human dignity, autonomy, and equality for all, ensuring that technology serves as a tool for empowerment rather than a mechanism for exclusion.

Until these demands are met, the most powerful tool we have is the power of refusal. We must reject the use of AI. Just as workers in the creative industries have turned to unions and strikes to secure protections against AI, the public must recognize that continuing to feed data into these systems only validates their biased outputs. A future built on biased code is not innovation; it is injustice at scale.