Fabulous investigates sinister rise of deepfake porn as cases soar

DURING lockdown, cases of counterfeit pornography – created by superimposing stolen pictures of victims on to obscene images – soared, with women the main victims. Fabulous investigates…

Looking at the stills from a hardcore porn film, Helen Mort went into shock. The sexual images were violent, involving acts of strangulation – but the most shocking part was that they were of her.

Yet Helen had never taken part in the acts she was staring at on screen.

They were “deepfake” porn pictures, created using technology that allows users to superimpose one person’s likeness on to another’s body.

“My heart was racing as I looked at them – it was a very physical reaction,” says Helen, 36, a writer from Sheffield.

“Most of the stills were very realistic, and the ones that were the most disturbing were images where you couldn’t tell that they had been doctored.”

Helen first became aware of the images last November, after an acquaintance stumbled across them in a dark corner of the internet, where they had been since 2017.

Married mum-of-one Helen had no idea she was a victim of the growing trend of deepfake pornography, where pictures of people are used without their consent – often stolen from social media – and grafted on to pornographic images using specialist software.

'Humiliated and abused'

At its simplest, it involves programs such as Photoshop, but at the other end of the scale, complex artificial intelligence software is used to animate still images and superimpose them on to the faces of porn actors.

Helen’s ordeal began one evening when the acquaintance called her to tell her that he had been on an adult website and recognised her face.

“My first reaction was: ‘That’s impossible’. I had never taken even vaguely intimate or revealing images of myself, so I couldn’t understand how it could be true,” Helen says.

After talking to her husband, they logged into the site together, where they saw the deepfake images of her, and also a message posted on an internal forum by someone with a username that was an amalgamation of Helen’s name and that of her ex-partner, from whom she had split three years earlier.

The person posting claimed to be Helen’s boyfriend and said he wanted to see her “humiliated and abused”, and had included images taken from Helen’s Facebook page alongside pornographic images, inviting other users to combine the two sets.

Users had taken up the challenge and posted manipulated images back on the site for others to comment on.

Helen and her husband were sickened by what they saw.

Helen, who is on good terms with her ex, contacted him immediately, though she is certain he had nothing to do with it.

“To this day, I have no idea who posted the images, and that’s what’s really scary,” she says.

She saved the pictures, which had been viewed 800 times, as evidence to report to the police – just days before the user deleted both their profile and the images.

South Yorkshire Police were sympathetic when she contacted them, but told her that nothing could be done because no law had been broken.

While posting revenge porn was made illegal in 2015, there is currently no legislation that specifically covers making and distributing deepfake pornographic images.

“The officer I spoke to said she’d had loads of calls about this kind of stuff during lockdown,” Helen explains.

'Violent images would flash back into my mind'

With no hope of an investigation into the images, the psychological trauma set in several days later, and Helen began to feel very paranoid.

“I felt like I couldn’t leave the house,” she recalls.

“The fact that someone I knew had found the images suggested that other people would have seen them, too.

“It was horrible bringing my two-year-old son home from nursery and suddenly getting a flashback of a violent image.

“I wondered whether he could tell that something was wrong.

“I tried to be careful not to let him see that I was upset, but he came over to me at one point and asked: ‘Is Mummy sad?’”

Then, a few days after Helen first viewed the fake images, she started to have nightmares.

“In the dreams, two men followed me into a subway where they attacked me, and there was no way out,” she says.

“For weeks afterwards, whenever I had a quiet moment, the violent images would flash back into my mind.”

Sadly, Helen is far from alone.

Lockdown has fuelled all types of online sexual abuse.

Last year, the Revenge Porn Helpline dealt with 3,146 cases, an 87% increase on 2019.

Calls from deepfake porn victims to the helpline have also increased, with January and February this year being its busiest months on record – and the majority of victims of both abuses are women.

While deepfake porn is a relatively new phenomenon, it is on the rise, and experts expect the helpline to receive many more calls this year than last as cases increase.

Deepfake videos aren’t always pornographic, and in December 2020, Channel 4 used the technology to create a parody of the Queen’s Speech for its Alternative Christmas Message.

The phenomenon hit the headlines again in March this year when a series of bogus Tom Cruise clips went viral on TikTok.

The most pressing danger

While experts have been warning since 2017 that the technology could be used to create political propaganda, authorities are only just waking up to what many campaigners say is the most pressing danger – the misuse of the technology for creating fake pornography.

Labour MP Jess Phillips was the victim of deepfake porn as part of an anonymous online hate campaign against her in May 2020.

And research by internet security firm Sensity revealed that in 2020, up to 1,000 deepfake videos were uploaded to porn sites each month, mainly featuring famous women.

The non-consensual videos rack up millions of views, and while mainstream porn companies make efforts to remove them, more images then appear in their place.

A Fabulous investigation discovered disturbing evidence of an underground trade in celebrity deepfake porn.

We found an active community of fakers taking requests and fees for bespoke deepfake porn videos.

One “customer” offered “top dollar” for a video of Jameela Jamil, with another asking for Jodie Comer.

The “projects” posted on one site included Emma Watson, Scarlett Johansson and 18-year-old climate change activist Greta Thunberg.

The activity also extends to videos featuring well-known Instagrammers.

London-based fitness trainer and influencer Maiken Brustad, 28, is another victim.

Her iCloud was hacked in 2016 and the hacker stole intimate pictures that she had shared with her fiancé, then leaked the images online, some of which were subsequently turned into deepfake porn.

Maiken discovered the leak that August when a friend emailed her to say she was being discussed on a forum on the Discord messaging platform.

She followed the link her friend sent and found a chat room in which people were swapping and uploading the photos and sharing links to her social media pages.

She says that the deepfake pictures – which were taken from a hardcore porn film – were particularly hard to come to terms with.

“I have never been so shocked,” Maiken says. “I was out shopping at the time, and when I found out, I had to go home, I felt so nauseous.”

'I was scared'

She immediately told her family, fiancé and friends. “It was highly uncomfortable, but I braced myself for it and had open conversations. They were non-judgemental and fully supportive.”

After she reported the incident to the police, they visited her at home, where she broke down.

“The officers cared, but I don’t think they knew what to do,” she explains.

“A month later, I got an email saying that they couldn’t continue with the case as there was nothing they could do.”

Maiken says she has no idea who is behind the harassment, and in October last year she even started getting messages taunting her about the deepfake images, sent from a secure mail server based in Switzerland.

In the same month, she received emails from someone who threatened to send the doctored images to her colleagues at the gym where she works.

When she ignored them, the person went ahead and sent them.

“[My colleagues] were understanding and accepting, and just thought that whoever was behind it was sick, which was a relief,” she says.

“I was scared at first, but I am at the point where I think: ‘What else can this person do to me?’

“I wish it hadn’t happened, but it has, and the reality is that I have two choices – either be sad every day or continue to live my life.”

Early last year, researchers at Sensity discovered a bot on the messaging app Telegram that used AI to remove clothes from images of women and generate their naked body parts.

The bot had created abusive images of more than 680,000 women, around 100,000 of which had been posted publicly to the app, with 70% of the doctored photos coming from social media or private sources.

The publicly shared images appeared on Telegram chat channels containing more than 100,000 members.

Revenge Porn Helpline manager Sophie Mortimer predicts a rapid rise in deepfakes this year.

“This is a trend that I only see getting worse as the technology becomes more widely available,” she tells Fabulous, explaining that there is little difference for victims between doctored images and genuine images that have been shared without consent.

“If it looks like you, and everyone who sees it believes it to be you, then the impact is the same. It takes a terrible toll on mental health, personal relationships, jobs and careers.

'We have to demand change'

“The advice we give is: don’t panic. Although the law on the disclosure of private images doesn’t cover manipulated images at present, we do our best to help and give advice on actions you can take. But we do need legislators to step up and address these issues.”

Thankfully, that might be about to happen. Campaigners are advocating for the offence of image-based sexual abuse to be included in the proposed Online Harms Bill, which is currently at the consultation stage and could become law in 2022.

Previously, Jess Phillips, along with fellow Labour MP Stella Creasy and Conservative Maria Miller, considered trying to include the making and distribution of deepfake porn in the 2018 law that tackled upskirting, but decided they wanted a more comprehensive bill against all image-based abuse.

Clare McGlynn QC, Professor of Law at Durham University, has been arguing for broader legislation that can cover a range of technology-based sexual crimes for years.

The legal and political system, she says, has been too specific and too slow to keep pace with technology.

“If we adopted the idea of image-based sexual abuse, it is more likely to cover more forms of technology as and when it develops, so in effect, you then future-proof the law,” she explains.

“The Online Harms Bill will be a perfect vehicle for this. What we need is a comprehensive law that covers all forms of taking and sharing intimate images, and that includes altered images of victims.”

For Helen, a new law can’t come too soon. When the shock of her own abuse subsided, she decided to do something positive, speaking out about her experience, and in December last year she started a Change.org petition for tighter regulations on taking, making and faking explicit images.

“Although sharing my experience has been difficult, if it makes others who have experienced this feel less ashamed, it will have been worth it,” Helen says.

“This technology is not going away and deepfake porn has the potential to ruin lives – but it’s not a crime.

“We have to demand change.”

  • View and sign Helen’s petition at Change.org/p/the-law-comission-tighten-regulation-on-taking-making-and-faking-explicit-images.