Nick Clegg says Facebook 'not completely on top' of harmful content, prompting renewed calls for online watchdog

Mike Wright
Sir Nick Clegg said Facebook and Instagram's size made them difficult 'to police' - REUTERS

Sir Nick Clegg has said Facebook is failing to eradicate harmful content on its sites because it is too big, in an admission that has provoked calls from charities for the Government to “urgently” set up an online watchdog to protect children.

The former Deputy Prime Minister, who is effectively third in command at the social media giant, said the company was still “not completely on top” of ridding Instagram and Facebook of content such as suicide and self-harm images.

In an interview on BBC Radio 4’s Today programme, Sir Nick said Facebook’s enormous scale - over two billion users - made it “a real challenge” to police what it showed its users. 

Following his comments, the NSPCC and the Molly Rose Foundation, a charity set up in memory of 14-year-old Molly Russell, who took her life after being bombarded with self-harm images on Instagram, said Sir Nick’s remarks highlighted the need for a new regulator with meaningful sanctions to ensure tech giants properly protected children.

Speaking from Davos, Sir Nick said: “There’s a question about how do we get rid of this material, and that is something I completely admit we are not completely on top of, because this is a platform used by a third of the world’s population. 

Sir Nick, pictured with Facebook CEO Mark Zuckerberg and COO Sheryl Sandberg, joined the company last year as its VP of Global Affairs and Communications 

“On New Year’s Eve alone 149 billion messages were communicated on our platform in one day. To police all of that at scale is a real challenge.”

The former Liberal Democrat leader added that the company had no “commercial incentive” to show users “hateful” or “extreme” content. 

He added: “The very people we depend upon for the business, people who pay to place advertisements, the last thing they want is for that material to be placed next to that advertisement.”  

Sir Nick’s comments come as the Government is expected to outline its plans in the coming weeks to impose a statutory duty of care on tech giants forcing them to better protect children from online harm, a measure campaigned for by The Telegraph.

In a White Paper published last year, ministers said they were considering creating an online regulator with powers to levy fines into the billions and even criminally prosecute senior tech executives for gross failures to keep users safe.

Last year, the father of Molly Russell, Ian, accused Instagram of ‘helping to kill’ his daughter after it emerged the app had been showing her suicide and self-harm content. 

Following his comments, Instagram banned “graphic” self-harm images, and figures published by the company in November showed it had removed almost 1.7m images between April and September - close to 10,000 a day.

Responding to Sir Nick’s comments, the Molly Rose Foundation said that, despite pledges to clean up its platforms, Facebook had not “meaningfully changed the algorithms” that showed Molly suicide and self-harm material. 

A spokesman said: “It is Facebook's responsibility to prioritise the safety of young and vulnerable platform users. Yet a year since Molly's story became well known through international media, they continue to promote and circulate material Nick Clegg describes as ‘hateful’ and ‘unpleasant’. 

“It is beyond unacceptable that in these circumstances Facebook have failed to meaningfully change their algorithms.  

Molly Russell was found dead at her home in Harrow, north London, six days before her 15th birthday in 2017 Credit: Russell Family

“Nick Clegg now admits Facebook has lost control of the situation they have created. It shows the extreme urgency needed for our Government to legislate for an independent regulator equipped with meaningful sanctions, that can prioritise safety of the young over the business interests of platform owners and advertisers.”

Tony Stower, NSPCC head of child safety online and innovation, added: “Nick Clegg seems to have inadvertently admitted what we’ve been saying for years, that Facebook has lost control of harmful content on its sites.

“But it’s simply not good enough to say they’re trying when children are still being subjected to this damaging content and groomers continue to use Facebook’s platforms to target young victims.

“Government must urgently publish its Online Harms Bill armed with a tough regulator so big tech can’t keep making excuses for failing in their duty of care without facing hefty fines and sanctions.”