EU tells Meta to crack down on Israel-Hamas disinfo

  • By James Gregory & Tom Gerken
  • BBC News

The EU has warned Mark Zuckerberg over the spread of “disinformation” on Meta’s social media platforms after Hamas’ attack on Israel.

It told Meta, which owns Facebook and Instagram, it “has 24 hours” to respond and comply with European law.

Social media firms have seen a surge in misinformation about the conflict, including doctored images and mislabelled videos.

On Tuesday the EU warned X, formerly known as Twitter, about such content.

The bloc’s industry chief, Thierry Breton, told Meta it must prove it has taken “timely, diligent and objective action”.

In a letter, he said the firm had 24 hours to tell him about the “proportionate and effective” measures it had taken to counter the spread of disinformation on its platforms.

A Meta spokesperson told the BBC: “After the terrorist attacks by Hamas on Israel on Saturday, we quickly established a special operations centre staffed with experts, including fluent Hebrew and Arabic speakers, to closely monitor and respond to this rapidly evolving situation.”

“Our teams are working around the clock to keep our platforms safe, take action on content that violates our policies or local law, and coordinate with third-party fact checkers in the region to limit the spread of misinformation. We’ll continue this work as this conflict unfolds.”

The European Commission meanwhile reminded all social media companies that they are legally required to prevent the spread of harmful content related to Palestinian militant group Hamas, which is a proscribed terrorist group in the EU.

“Content circulating online that can be associated to Hamas qualifies as terrorist content, is illegal, and needs to be removed under both the Digital Services Act and Terrorist Content Online Regulation,” a Commission spokesperson said.

Musk warning

On Tuesday, Mr Breton wrote in a letter to Mr Musk that “violent and terrorist content” had not been taken down from X, despite warnings.

Mr Musk said his company had taken action, including by removing newly created Hamas-affiliated accounts.

He asked the EU to list the alleged violations.

Mr Breton did not give details on the disinformation he was referring to in his letter to Mr Musk.

However, he said that instances of “fake and manipulated images and facts” were widely reported on the social media platform.

“I therefore invite you to urgently ensure that your systems are effective, and report on the crisis measures taken to my team,” he wrote in his letter, which he shared on social media.

The interventions come days after Hamas launched an attack on Israel, killing around 1,200 people and taking at least 150 hostages.

In response, Israeli forces have launched waves of missile strikes on Gaza which have killed more than 900 people.

In his response on X, Mr Musk said: “Our policy is that everything is open and transparent, an approach that I know the EU supports.

“Please list the violations you allude to on X, so that the public can see them.”

Mr Breton said that Mr Musk was “well aware of your users’ – and authorities’ – reports on fake content and glorification of violence”, adding that it was up to him to “demonstrate that you walk the talk”.

The EU Digital Services Act (DSA) is designed to protect users of big tech platforms.

It became law last November but firms were given time to make sure their systems complied.

On 25 April, the Commission named the very large online platforms – those with over 45 million EU users – that would be subject to the toughest rules, among them X. The law came into effect four months later in August.

Under the tougher rules, larger firms have to assess potential risks they may cause, report that assessment and put in place measures to deal with the problem.

Failure to comply with the DSA can result in EU fines of as much as 6% of a company’s global turnover, or potentially suspension of the service.

Mr Musk dissolved Twitter’s Trust and Safety Council shortly after acquiring the company in 2022. Formed in 2016, the volunteer council contained about 100 independent groups who advised on issues such as self-harm, child abuse and hate speech.