I’m a victim of revenge porn — 48 hours is too long to wait for images to be removed

Tech sites will have to remove non-consensual sexual images within two days (Picture: Getty Images)

Today, the UK government said it is ‘putting tech companies on notice’, demanding that any non-consensual intimate images, otherwise known as ‘revenge porn’, be taken down within 48 hours.

Writing for the Guardian, Keir Starmer said that violence against women and girls is a ‘national emergency’ and that this measure would go some way towards tackling the issue.

If platforms fail to comply they could face fines of up to 10% of their revenue — pretty hefty considering Meta alone made £149 billion in revenue in 2025.

Companies could also be blocked from operating in the UK.

The End Violence Against Women Coalition has dubbed the move ‘welcome and powerful’. Previously, there was no time limit, and victims were often left chasing the likes of Facebook and Instagram, begging to have their images removed.

Prime Minister Keir Starmer has said tech companies that don’t comply will face fines or a ban (Picture: Suzanne Plunkett – WPA Pool/Getty Images)

Now, though, victims still have a question: why should it take two full days for revenge porn to be removed from the internet?

Other countries have much tighter time limits. India recently mandated that some deepfake content must be taken down within three hours.

Meanwhile, according to the European Commission, terrorist content must be removed within just one hour.

‘Any situation you can imagine, there were photos of it’

In 2020, Jodie* was at university when a friend found images of her on X (then Twitter), advertising sex work.

This Is Not Right

On November 25, 2024 Metro launched This Is Not Right, a campaign to address the relentless epidemic of violence against women.

With the help of our partners at Women’s Aid, This Is Not Right aims to shine a light on the sheer scale of this national emergency.

You can find more articles here, and if you want to share your story with us, you can send us an email at vaw@metro.co.uk.


‘I got in touch with the police and said this account was using my real name, real location, identifying information about who I am and where I live,’ Jodie recalls.

But the police told her ‘these things happen’ and that they couldn’t help.

Then, just months later, an anonymous email directed her to a chat forum where she found countless fake sexually explicit images of herself.

‘Any situation you can imagine, there were photos of it,’ Jodie says. ‘They had been taken from my Instagram (which was private) and had been deepfaked. I completely broke down.

‘It was the single worst day I’d experienced, with this feeling of total shock and dread, thinking: “Who else could have seen these? Do people think they’re real? How do I explain this to my parents, my boyfriend, my work?”

Jodie broke down when she first saw the pictures (Picture: Getty Images)

‘It was sheer panic, trying to work out how to get them down.’

To this day, Jodie doesn’t know if all the sexually explicit fake images of her have been erased from the internet.

She did find out the culprit though. From some of the images used, she pieced together that it was her male best friend, who she’d known since the age of 15.

‘He’d been my shoulder to cry on, the “nice guy”, well educated, went to Cambridge — he wasn’t the type of person you’d think was behind it,’ she adds. ‘He was a wolf in sheep’s clothing.’

After Jodie went to the police for a second time, the perpetrator was convicted six months later, not for the sexual images but for grossly offensive language.

He got a six-month suspended prison sentence, so didn’t serve any jail time. He was ordered to pay Jodie £100 in compensation, and had to do community service and court-mandated sexual rehabilitation therapy.

He wasn’t put on the sex offenders register, either, and while the images were taken down by the site, it then re-launched under a new name, and her pictures reappeared.

‘I had justice, but not the justice I felt I deserved — although I got a lot more than many women ever will,’ Jodie sighs.

‘I don’t know if my pictures are out there, we know that on the internet, nothing is ever really gone. That’s what’s scary.’

Someone was sharing sexually explicit deepfakes of Jodie online (Picture: Getty Images)

Since her ordeal, Jodie has campaigned for a law like this new 48-hour limit to be implemented.

Her 73,000-signature-strong campaign, alongside the End Violence Against Women Coalition, Revenge Porn Helpline, survivor-campaign group #NotYourPorn, and world-leading expert Professor Clare McGlynn, has been instrumental in this new legislation.

However, while Jodie says this is a ‘huge step’, she adds: ‘I think tech firms have the capability to do it quicker.

‘I would love to see the time halved to 24 hours, because ultimately those images being online is where the harm lies — knowing they can be copied and saved and redistributed.

‘Victims are less bothered about a platform being given a fine, and more worried about their images coming down.’

Jodie’s was just one of the 6,483 cases reported to the Met Police from 2017 to 2023 in which private sexual photographs and films were disclosed with intent to cause distress.

Georgia Harrison MBE, who was a victim of revenge porn when her ex-partner Stephen Bear shared a private video of them having sex, has welcomed this move, too.

Georgia Harrison now actively campaigns against revenge porn (Picture: In Pictures via Getty Images)

‘It’s a vital step forward in protecting women and girls from intimate image abuse,’ she tells Metro. ‘Having experienced this violation myself, I know how devastating and long-lasting the impact can be and how exhausting it is when the responsibility falls on victims to repeatedly fight for content to be removed.

‘This is a statement to tech companies that they are no longer above the law and we know exactly what they are doing when they turn a blind eye to unconsented intimate images that just so happen to be raking in millions on their platforms.’

The new powers are expected to be implemented by Ofcom by summer, but Jodie says the ‘devil will be in the details’ when it comes to following through on this promise.

‘The key is that platforms need to be held accountable, so it’s not just an empty promise from the government, otherwise it’s pointless,’ she says.


Do you have a story to share?

Get in touch by emailing MetroLifestyleTeam@Metro.co.uk.
