Deepfake Porn: it can happen to anyone

Imagine you are just 21 years old and already working as a teacher at a secondary school. You teach English and history – the two subjects you enjoyed the most as a child. The students like you, you enjoy your job and you feel like you’ve finally found your place in this world. But one morning you wake up, open your mobile phone and discover a disturbing message: a stranger tells you that offensive photos and videos of you are circulating on the internet. You are shocked – how can that be? You’ve never undressed in front of a camera. These are so-called ‘deepfakes’.

‘Deepfakes’ are deceptively realistic images and videos created with the help of artificial intelligence (AI). They depict real people in situations or actions that never actually took place. It is particularly problematic that this technology is increasingly being used to (sexually) expose people.

Fake porn images and videos are nothing new, but the rapid development of AI has taken the problem to a whole new level. For years, actors, singers and other public figures – predominantly women – have been ‘undressed’ without their consent using various technologies such as Photoshop. Until recently, however, creating realistic-looking fakes required at least a certain amount of technical know-how.

Today, that has changed: thanks to a wide variety of AI programmes, any layperson who can operate a smartphone or computer is able to create deceptively realistic deepfakes. These digital fakes are almost indistinguishable from real pornography – a development with serious consequences for those affected. Once distributed online, such videos and images are often difficult or impossible to remove.

According to research by the cybersecurity company ‘Home Security Heroes’, a total of 95,820 deepfake videos were circulating in 2023 – a massive increase of 550% compared to 2019. This figure is likely to have risen many times over in 2024. Particularly worrying: 98% of these videos are pornographic in nature, and 99% of the victims are female. ‘Deepfake technology is being used as a weapon against women,’ explains Danielle Citron, Professor of Law at Boston University. Even if the content is fake, the consequences for the victims are real: invasion of privacy, humiliation, shame, professional repercussions, anxiety, depression and, in the worst cases, suicide.

Although the most frequently clicked deepfake videos and images still show celebrities, the problem has long since spread to private individuals. However, unlike with public figures, the manipulated content in such cases often does not come from strangers, but from people close to those affected.

This is what happened to Sophie Parrish. As the 31-year-old florist and mother of two recounted in an interview with Channel 4 News, a former family friend took ordinary photos from her social media profiles without her consent and used them to create nude images of her. These deepfake images eventually found their way into a chat group where fake photos and videos of other women were also offered and distributed. ‘I feel dirty… very dirty […] It’s like women have no value – like we’re just a piece of meat,’ she explains.

Some perpetrators are primarily concerned with satisfying their own sexual fantasies. Others want to humiliate and degrade the women and girls in their personal environment. Anna Lehrmann, a counsellor at the ‘Frauen helfen Frauen’ (women helping women) association, also explains to VOLLBILD that deepfakes are often linked to other forms of violence, such as blackmail and threats.

Creating deepfake images and videos is not in itself illegal. Distributing them, however, may violate general personality rights, the right to one’s own image or copyright, or constitute offences such as libel, slander or defamation. Since 1 January 2024, deepfake content has been systematically recorded by the police: according to the Federal Ministry of the Interior, this happens when reports are filed in the police’s electronic logging system (PAD).

Kurier also provides the following tips:

  • Preserve evidence: even if it is difficult, save the video or image material and take screenshots
  • Block the user (where possible)
  • Report the content to the platform

Translated by Anna Smith

#Deepfakes #Deepfake #KünstlicheIntelligenz #Cybermissbrauch #DeepfakePornos #Rachepornos #AgainstHumanTrafficking #GegenMenschenhandel #EndExploitation #EndTrafficking #HopeForTheFuture #Österreich