The Government has tabled an amendment to the Crime and Policing Bill that would require online platforms to remove non-consensual intimate images within 48 hours of receiving and verifying a valid report.
The proposal applies to material shared without consent, including so-called “revenge porn” and AI-generated or digitally altered intimate images depicting an identifiable person.
The amendment builds on the Online Safety Act 2023, which already imposes duties of care on regulated services. The core change is the introduction of a specific, time-limited statutory obligation triggered by an individual complaint.
What the amendment changes
Existing criminal offences address the disclosure of private sexual photographs without consent. However, criminal proceedings typically respond after the harm has occurred rather than limit its spread.
The proposed amendment focuses on rapid platform response. Under the proposal, once a platform receives a qualifying report and verifies that intimate content has been shared without consent, it would be legally required to remove that material within 48 hours.
The duty is intended to cover:
- Intimate images shared without the subject’s consent.
- AI-generated or digitally manipulated intimate images depicting a real person.
- Associated abusive content forming part of the same report.
Regulatory and financial consequences for platforms
The amendment provides for penalties of up to 10% of global annual turnover, and in serious or repeated cases, services could face access restrictions in the UK.
Enforcement would fall within Ofcom’s supervisory framework under the Online Safety regime.
Platforms would need:
- Accessible and reliable reporting mechanisms.
- Rapid verification procedures.
- Documented internal escalation systems.
- Clear audit trails demonstrating compliance.
What it means for individuals
At present, victims of image-based abuse can report content to platforms and, in some cases, to the police. There is currently no statutory deadline for content removal. Timelines vary and decisions often depend on internal moderation policies.
If enacted, the amendment would provide a specific legal route to secure removal within two days of a verified report. Instead of relying on discretionary community standards, individuals would therefore be invoking a statutory obligation backed by regulatory sanction.
The change does not prevent abusive material from being created or initially posted. It addresses response rather than prevention. Criminal investigation would still be relevant in serious cases.
Practical questions remain. Platforms will need clarity on what constitutes a valid report, how consent disputes are handled and how authenticity is assessed in cases involving manipulated content. Safeguards will be required to balance swift removal with evidential fairness and freedom of expression considerations.
Why this matters
If adopted, the 48-hour removal duty would represent one of the clearest statutory response timelines in UK online safety law. It places a direct compliance obligation on companies operating in the UK market and provides individuals with a more predictable mechanism for limiting the spread of intimate image abuse.
Author
Gill Laing is a qualified Legal Researcher & Analyst with niche specialisms in Law, Tax, Human Resources, Immigration & Employment Law.
Gill is a Multiple Business Owner and the Managing Director of Prof Services - a Marketing Agency for the Professional Services Sector.