Deepfake Creators Reveal Shocking Motivations: ‘God-like Buzz’

A new study reveals that men creating deepfake nudes often feel a ‘god-like buzz’ of control and may not realize the harm they cause, dismissing their actions as ‘just a joke.’ Researchers draw a stark parallel between this virtual violation and sexual assault, urging increased awareness and action against this abuse.

3 hours ago

Men Making Fake Nudes Feel ‘God-like Buzz,’ Study Finds

For the first time, researchers have explored the minds of men who create and share non-consensual deepfake pornography. A new study interviewed 10 boys and men in Australia who had made or shared these fake explicit images. The findings reveal disturbing motivations and a surprising lack of awareness of the harm caused.

Motivations Range From Control to ‘Just a Joke’

Some of the men interviewed described feeling a ‘god-like buzz’ from the sense of control and dominance they felt over the women depicted. This feeling of power was a significant motivator for their actions. However, not all participants reported this specific sensation.

A common thread among many creators was the belief that their actions were not harmful. They often dismissed the creation and distribution of deepfake nudes as ‘just a joke’ or ‘fooling around.’ This attitude suggests a significant disconnect between the perpetrators’ perception and the reality of the damage inflicted.

‘Moral Disengagement’ or Genuine Ignorance?

Researchers are questioning whether this lack of perceived harm is a form of ‘moral disengagement’ — a psychological process in which people rationalize their behavior to avoid guilt or responsibility for their actions. It raises the question: are these men truly unaware of the pain they cause, or are they actively choosing to ignore it?

The study highlights a massive awareness gap. While the creators see their actions as harmless fun, the victims experience deep humiliation and psychological distress. Some victims have even reported losing their reputations entirely due to the non-consensual distribution of their images.

“On the one hand, you have the perpetrators who don’t think it’s harmful at all, and then the victims who have their lives ruined.”

The Virtual Equivalent of Sexual Assault

The article draws a stark parallel between creating and sharing deepfake nudes and sexual assault. It urges readers to consider how they would react if a friend were to physically assault someone and then laugh about it. The virtual act is presented as the equivalent of ripping clothes off someone in public and mocking them.

This comparison aims to reframe the act, moving it from a perceived harmless prank to a serious violation with real-world consequences. The intent is to shock people into understanding the severity of the abuse, even when it occurs digitally.

Where to Report and Get Support

For those who encounter non-consensual intimate imagery, a global organization called NC Stop Abuse (ncstop.org) offers resources. The organization takes reports of such abuse and provides support to victims, making it a crucial point of contact for those affected by these digital violations.

The article encourages calling out this behavior for what it is: a form of abuse and violation. By understanding the motivations behind deepfake creation and the profound impact on victims, the hope is to foster greater awareness and action against this harmful practice.

What’s Next?

As AI technology continues to advance, the creation of deepfakes will likely become more sophisticated and accessible. Future research will need to explore the evolving psychological aspects of perpetrators and the effectiveness of current support systems for victims. Understanding these elements is key to developing stronger preventative measures and legal frameworks to combat this growing digital threat.


Source: Deepfake nudes: Inside the mind of the men who make them | DW News (YouTube)

Written by

Joshua D. Ovidiu

I enjoy writing.
