Give Ofcom emergency powers to investigate Facebook encryption plans, says NSPCC chief

However, campaigners warn that Ofcom may not have its powers until 2024, by which time Meta’s encryption plans will likely be in place.

Following the NSPCC’s call, Antigone Davis, Global Head of Safety at Meta said: “We have no tolerance for child exploitation on our platforms. We agree on the need for strong safety measures that work with end-to-end encryption, and we have developed a clear approach for building these into our plans for end-to-end encryption.

“We’re focused on preventing harm from happening in the first place by restricting adults on Facebook and Instagram from messaging children and defaulting under-18s’ accounts to private or ‘friends only’.

“We also offer more controls for people to protect themselves from harm and respond swiftly to user reports and valid requests from the police.

“The overwhelming majority of Brits already rely on encryption to keep them safe from hackers, fraudsters and criminals, and any solutions we develop need to ensure those protections remain intact.  

“We’ll continue to work with outside experts to develop effective solutions for combating such abuse because our work in this area is never done.”

A spokesman for the Government said: “Children will be at the heart of our new online safety laws with tough sanctions on social media platforms that fail to protect young people from harm.

“This groundbreaking legislation will give Ofcom additional powers, with the most significant sanctions reserved for companies that fail to comply.

“We believe it is possible for companies to implement end-to-end encryption in a way which is consistent with public safety, and which does not prevent action being taken against child abuse.”


‘Meta can no longer be judge and jury over its own conduct while children’s safety sits on a cliff edge’

By Sir Peter Wanless, chief executive of NSPCC

It’s been almost two years since the NSPCC led a global coalition of 130 child protection organisations to write to Mark Zuckerberg.

We asked him to pause plans to roll out end-to-end encryption on Facebook and Instagram’s messaging services until they recognise that direct messaging is the frontline of child sexual abuse and prove they have systems in place to disrupt it.

Since we wrote to them, Facebook, now Meta, have been batting away a conveyor belt of safety scandals with obfuscation and denial, with our questions and concerns being met with unsatisfactory answers.

What is clear is the scale of abuse children face on their sites. 

Every year, Instagram alone is used in around a third of reported grooming crimes on social media. Crimes that would go undetected under Meta’s blanket end-to-end encryption plans.

It was encouraging to read in The Telegraph that the company is pausing the rollout until 2023 to consider the child protection implications.

As we’ve always said, Meta should only go ahead with these measures when they can demonstrate they have built technical mitigations that can ensure children will be at no greater risk of abuse.

But read closely and Antigone Davis offered nothing new. 

It was strong on rhetoric but light on detail, making it difficult to conclude anything other than that this is a move to play for time while the tech giant weathers difficult headlines.

Ms Davis cited WhatsApp as an example of action taken against abuse in end-to-end encrypted environments, but this isn’t the silver bullet that Meta likes to suggest.

The figures speak for themselves.

In 2020, the National Crime Agency received around 24,000 child abuse tip-offs from Facebook and Instagram but just 308 from WhatsApp.

WhatsApp data show that less than 15 per cent of accounts they suspend for child abuse lead to actionable reports to police. Meta knows abuse is taking place, but they can’t see it and can’t act on it.

Meta could have announced that they would follow Apple’s lead in developing child safety measures that can work in end-to-end encrypted environments.

However, Will Cathcart, head of WhatsApp, previously labelled Apple’s plans “concerning” and categorically refused to take a similar approach.

By sticking with their own status quo and continuing to promote, at best, sticking plaster solutions, Meta still doesn’t have a clear plan to protect children. It is disingenuous to suggest otherwise.

Mark Zuckerberg could take steps today to restore confidence. In May, Facebook’s board blocked a shareholder proposal to risk assess the impact of end-to-end encryption on child abuse.

They should admit they got that wrong and commit to a full, independent risk assessment. 

Actions speak louder than words.

As whistle-blower Frances Haugen’s revelations show, transparency is key. 

Meta’s latest community standards report, covering the past six months, revealed a record number of child abuse takedowns.

Almost 50 million items of child abuse material were removed from Facebook and Instagram, more than triple the amount in the previous six months.

Meta attributed the dramatic increase to improvements in its “detection capability”, but it is still not clear whether the company is playing catch-up following apparent technical problems last year, or whether the child abuse risk is ballooning.

It’s in this context that end-to-end encryption sits. We know it could wash away children’s safety and have a substantial impact on identifying grooming and child abuse material. 

But because agencies have no power to ask questions, we have no idea how bad the tsunami will be.

Meta often cites how they will welcome regulation to help guide their response to abuse. But we can’t wait another two years before we can even start to demand answers.

That’s why we are urgently calling on the Government to fast-track Ofcom’s investigatory powers in the Online Safety Bill. Let’s give the regulator the powers to start asking necessary questions and the ability to look at the inner workings of Meta without delay.

The ongoing encryption debate and whistle-blower revelations highlight that Meta can no longer be judge and jury over its own conduct while children’s safety sits on a cliff edge.

We cannot be left wondering whether Meta’s announcement sets in motion a substantial reset of their plans or is just another tactic from their PR machine. 

The Government can take the lead by giving Ofcom the power to demand answers.
