Published Dec 01 2025

From deepfakes to chatbots: How AI is reinforcing sexual entitlement, fuelling abuse

Artificial intelligence (AI) comes in many forms; popular examples include generative AI, chatbots such as ChatGPT, and deepfake technology. The incorporation of new AI technologies across social media platforms, messaging services and search engines has exploded in the past year. 

For example, WhatsApp, Facebook and Instagram messages now all have an embedded Meta AI tool. New AI functions on these platforms include a “summarise” tool and a function that allows users to generate any image they can “imagine” by typing into the prompt box. Google “AI summaries” now appear at the top of many Google searches.

It’s also becoming increasingly difficult to distinguish between real and AI-generated videos on reel-based social media platforms such as Instagram and TikTok, as the technology becomes more widely accessible and its output more realistic. 

This embedding of AI tools into existing platforms has happened discreetly and seemingly overnight, often with little thought by everyday users as to the meaning and implications of these new tools. 

Is AI doing more harm than good? (Yes)

AI technologies are most likely to negatively affect the most vulnerable and marginalised groups. We’re already seeing the misuse of AI technology by young people through high rates of sexualised deepfake abuse (SDA) in schools. 

SDA is a form of image-based sexual abuse (IBSA) and refers to the creation, sharing or threat to create or share sexualised deepfake images of another person without their consent. Sexualised deepfakes are fake images or videos of real people that are generated by AI or by “pasting” a person's face onto existing pornography.

Deepfake technology makes engaging in IBSA behaviours easier because it removes the need for a person's nude image or sexual content to be created or shared with the perpetrator in the first place, as the sexualised aspect of the image is entirely fabricated. 

While the creation of non-consensual sexualised deepfakes by young people to bully, harass or sexually abuse their peers is a growing problem, there’s also the potential for AI use to normalise a sense of (male) sexual entitlement and to obscure the need for communication and consent.

Deepfake technology and chatbots give young people the ability to generate sexualised images or engage romantically with a chatbot on demand. Chatbots such as ChatGPT are also designed to affirm and validate their users by agreeing with their opinions.

The transactional nature of AI use further embeds a culture of sexual entitlement to women and queer femmes’ bodies, as it removes any need for human connection, consent, relationality or reciprocation in intimacy.

This is highly concerning given OpenAI chief executive Sam Altman’s recent announcement that users will be able to access erotica on the AI platform from December. 


What do young people think? 

As part of my PhD research, I spoke with young people aged 16-24 about their feelings regarding the increasingly widespread use of AI, as well as their own experiences using AI, with a focus on sexualised deepfakes and romantic interactions with chatbots. 

The majority of participants were women and gender-diverse people. They spoke about their feelings of concern, frustration and fear surrounding the use of AI for sexual gratification or “intimacy” and what it may mean for their safety online. 

Young women and gender-diverse people spoke about feeling less safe online with the prospect of deepfakes being created, which affected their ability to find joy or pleasure in online spaces. Some even feared their partners cheating on them with AI bots, something some of their friends had already experienced. 

They viewed deepfakes and chatbots as examples of AI’s potential to further the existing abuse, objectification and exploitation of vulnerable groups: generative AI can be used by anyone to create the “perfect” female body, further embedding a cis, white, able-bodied ideal of desirability.


Read more: Young people’s perspectives must be included in sexualised deepfake prevention


In our discussions, we also touched on what kinds of values and behaviours were important to young women and gender-diverse people in their use of digital spaces for intimate practices, such as sexting. 

Trust, communication, consent, connection, community and respect were all mentioned as important aspects of any form of sexual violence prevention and respectful relationship-building, but particularly in resisting forms of AI-based sexual abuse. 

They spoke about how they viewed sending intimate images as a form of trust and bonding in relationships, and how the use of AI in this context takes away from the human experience of connection and care. 

Where to from here?

While it’s important to note that AI itself is not responsible for abuse, its features both make it easier for perpetrators to cause harm and magnify the impacts of those harms. 

As adults, we have much to learn from younger generations about how new technologies can be used ethically, or whether there’s space for them to be used ethically at all. 

AI technologies must not be accepted without applying a critical lens to what they mean for human connection, the environment and the safety of vulnerable groups. 

Though critical thinking, digital literacy and ethics are important, they’re not enough on their own. The creators of emerging AI and social media platforms need to be held accountable and to stop placing profits over people.

Tech giants such as Google, Meta and OpenAI don’t have young people’s interests at heart in rolling out these new tools. It’s therefore important to hold these companies to account and ensure adequate safeguards are put in place to prevent abuse. 

About the Author

  • Ruby Sciberras

    PhD Candidate, Centre of Excellence for the Elimination of Violence Against Women (CEVAW), School of Social Sciences, Monash University

    Ruby’s research explores the socio-cultural drivers of image-based sexual abuse to inform education and prevention approaches that incorporate young people's perspectives and are grounded in feminist, decolonial practice.
