Under a proposed UK law, tech platforms operating in the country would be required to remove intimate images shared without consent within 48 hours. The government said tackling such abuse should carry the same urgency as Child Sexual Abuse Material (CSAM) and terrorist content.
Failure to comply could result in fines of up to 10 percent of a company’s global revenue or the blocking of its services in the UK.
UK Prime Minister Keir Starmer told BBC Breakfast that the measures form part of an “ongoing battle” with platform providers on behalf of victims. He added that the rule would mean a victim of intimate image abuse “doesn’t have to do a sort of whack-a-mole chasing wherever this image is next going up”.
Starmer noted that tech companies are “already under that duty when it comes to terrorist material so it can be done. It’s a known mechanism,” and said the government “needs to pursue this with the same vigour”.
The proposals are being introduced as an amendment to the Crime and Policing Bill, currently under consideration in the House of Lords. Once implemented, victims would only need to flag an image once instead of contacting multiple platforms. Companies would also have to prevent the images from being re-uploaded after removal.
The plans would also see internet service providers given guidance to block access to sites hosting illegal content, targeting rogue platforms currently outside the scope of the Online Safety Act.
Janaya Walker, interim director of the End Violence Against Women Coalition, said the move places the responsibility on tech companies to act. Technology Secretary Liz Kendall said: “The days of tech firms having a free pass are over… no woman should have to chase platform after platform, waiting days for an image to come down”.
Starmer also said enforcement would involve fines and other measures determined by a “combination of oversight bodies in relation to what’s online and then it will be a criminal matter”.