The US now hosts more child sexual abuse material online than any other country
The US hosts more child sexual abuse content online than any other country in the world, new research has found. The US accounted for 30% of the global total of child sexual abuse material (CSAM) URLs at the end of March 2022, according to the Internet Watch Foundation, a UK-based organization that works to spot and take down abusive content.
The US hosted 21% of global CSAM URLs at the end of 2021, according to data from the foundation’s annual report. But that percentage shot up by nine percentage points during the first three months of 2022, the foundation told MIT Technology Review. The IWF found 252,194 URLs containing or advertising CSAM in 2021, a 64% increase from 2020; 89% of them were traced to image hosts, file-storing cyberlockers, and image stores. The figures are drawn from confirmed CSAM content detected and traced back to the physical server by the IWF to determine its geographical location.
That sudden spike in material can be attributed at least partly to the fact that a number of prolific CSAM sites have switched servers from the Netherlands to the US, taking a sizable amount of traffic with them, says Chris Hughes, director of the IWF’s hotline. The Netherlands had hosted more CSAM than any other country since 2016 but has now been overtaken by the US.
But the rapidly growing CSAM problem in the US is also attributable to a number of longer-term factors. The first is the country’s sheer size and the fact that it’s home to the highest number of data centers and secure internet servers in the world, creating fast networks with swift, stable connections that are attractive to CSAM hosting sites.
The second is that internet platforms in the US are protected by Section 230 of the Communications Decency Act, which means they can’t be sued if a user uploads something illegal. While there are exceptions for copyright violations and material related to adult sex work, there is no exception for CSAM.
This gives tech companies little legal incentive to invest time, money, and resources in keeping it off their platforms, says Hany Farid, a professor of computer science at the University of California, Berkeley, and the co-developer of PhotoDNA, a technology that turns images into unique digital signatures, known as hashes, to identify CSAM.
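PhotoDNA’s actual algorithm is proprietary, but the general idea behind such tools, perceptual hashing, can be sketched in a few lines. The simplified "average hash" below is an illustrative stand-in, not PhotoDNA itself: it reduces an image (here, an 8×8 grid of grayscale values) to a 64-bit fingerprint, and visually similar images produce fingerprints that differ in only a few bits.

```python
def average_hash(pixels):
    """pixels: an 8x8 grid of grayscale values (0-255), row-major.
    Returns a 64-bit integer fingerprint: each bit is 1 if the
    corresponding pixel is brighter than the grid's average."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def hamming_distance(h1, h2):
    """Count of differing bits between two fingerprints; a small
    distance suggests the images are visually similar."""
    return bin(h1 ^ h2).count("1")

# Hypothetical example: a uniformly dark image vs. one whose top
# half is bright. Their fingerprints differ in many bits.
dark = [[10] * 8 for _ in range(8)]
half_bright = [[200] * 8 for _ in range(4)] + [[10] * 8 for _ in range(4)]

print(hamming_distance(average_hash(dark), average_hash(half_bright)))
# prints 32: the two images' fingerprints disagree on half their bits
```

In practice, systems built on this idea hash images already confirmed as CSAM and compare the fingerprint of every newly uploaded image against that database, flagging close matches for review; the hashes can be shared between platforms without sharing the abusive images themselves.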
The sheer scale of CSAM compared with the resources dedicated to weeding it out means that bad actors feel they’re able to operate with impunity in the US because the chance of their getting in trouble, even if caught, is “vanishingly small,” he says.
Similarly, while companies in the US are legally required to report CSAM to the National Center for Missing & Exploited Children (NCMEC) once they become aware of it, and face a fine of up to $150,000 if they fail to do so, they’re not required to actively search for it.
Besides “bad press” there isn’t much punishment for platforms that fail to remove CSAM quickly, says Lloyd Richardson, director of technology at the Canadian Centre for Child Protection. “I think you’d be hard pressed to find a country that’s levied a fine against an electronic service provider for slow or non-removal of CSAM,” he says.
The volume of CSAM increased dramatically across the globe during the pandemic as both children and predators spent more time online than ever before. Child protection experts, including the anti-child-trafficking organization Thorn and INHOPE, a global network of 50 CSAM hotlines, predict the problem will only continue to grow.
So what can be done to tackle it? The Netherlands may provide some pointers. The country still has a significant CSAM problem, owing partly to its national infrastructure, its geographic location, and its status as a hub for global internet traffic. However, it’s managed to make some major headway. It’s gone from hosting 41% of global CSAM at the end of 2021 to 13% by the end of March 2022, according to the IWF.
Much of that progress can be traced to the fact that when a new government came to power in the Netherlands in 2017, it made tackling CSAM a priority. In 2020 it published a report that named and shamed internet hosting providers that failed to remove such material within 24 hours of being alerted to its presence.
It appeared to have worked—at least in the short term. The Dutch CSAM hotline EOKM found that providers were more willing to take down material quickly, and to adopt measures such as committing to removing CSAM within 24 hours of its discovery, in the wake of the list’s publication.
However, Arda Gerkens, chief executive of EOKM, believes that rather than eradicating the problem, the Netherlands has merely pushed it elsewhere. “It looks like a successful model, because the Netherlands has cleaned up. But it hasn’t gone—it’s moved. And that worries me,” she says.
The solution, child protection experts argue, will come in the form of legislation. Congress is currently considering a new law, the EARN IT (Eliminating Abusive and Rampant Neglect of Interactive Technologies) Act, which would open services up to lawsuits for hosting CSAM on their networks and could force service providers to scan user data for such content.
Privacy and human rights advocates are fiercely opposed to the act, arguing that it threatens free speech and could usher in a ban on end-to-end encryption and other privacy protections. But the flip side to that argument, says John Shehan of the National Center for Missing & Exploited Children, is that tech companies are currently prioritizing the privacy of those distributing CSAM on their platforms over the safety of those victimized by it.
Even if lawmakers fail to pass the EARN IT Act, forthcoming legislation in the UK and Europe promises to hold tech platforms responsible for illegal content, including CSAM. The UK’s Online Safety Bill and Europe’s Digital Services Act could see tech giants hit with multibillion-dollar fines if they fail to adequately tackle illegal content once the laws come into force.
The new laws will apply to social media networks, search engines, and video platforms that operate in the UK or Europe, meaning that US-based companies such as Facebook, Apple, and Google will have to abide by them to keep operating in those markets. “There’s a whole lot of global movement around this,” says Shehan. “It will have a ripple effect all around the world.”
“I would rather we didn’t have to legislate,” says Farid. “But we’ve been waiting 20 years for them to find a moral compass. And this is the last resort.”