The EU has officially requested information from TikTok and Meta on the potential dissemination of misinformation about the Israel-Gaza conflict on their platforms. Previously, the companies were given 24 hours to respond to the bloc’s concerns; however, unlike that earlier demand, this latest one carries legal standing.
Each company now has one week to respond. If the EU is not satisfied with their answers, it may launch a formal probe under its new tech regulations. Following Hamas’ attack on Israel, the EU is worried about the potential spread of hate speech as well as violent and terrorist content.
“We’ll publish our first transparency report under the [new law] next week, where we’ll include more information about our ongoing work to keep our European community safe,” a TikTok spokesperson said.
“Our teams are working around the clock to keep our platforms safe, take action on content that violates our policies or local law, and coordinate with third-party fact checkers in the region to limit the spread of misinformation. We’re happy to provide further details of this work, beyond what we have already shared, and will respond to the European Commission,” a Meta spokesperson said.
A week earlier, the EU had raised the same concerns with X (formerly known as Twitter) and has now demanded more. At the time, X said it had removed hundreds of accounts connected to Hamas from the platform.
Misinformation about the Israel-Hamas conflict, including edited photos and incorrectly labelled videos, has been flooding social media platforms. EU commissioner Thierry Breton sent letters to the CEOs of Meta, TikTok, X, and Google earlier in October, giving them a 24-hour window to reply.
However, those letters were not official, legally binding requests under the EU’s recent tech legislation governing what kind of content is permitted online.
The Digital Services Act (DSA) now requires the companies to reply by the specified dates.
In the event of noncompliance with the DSA, a company may face fines equivalent to 6% of its worldwide turnover, or even have its platform suspended.
The Commission has given Meta and TikTok two deadlines in this official DSA stage: the companies must first submit information on “the crisis response” by October 25th, and they must then reply to inquiries about safeguarding election integrity by November 8th.
By the November deadline, TikTok also needs to provide the European Commission with information about how the company is safeguarding minors online.
Mr. Breton stated that, in responding to the earlier request for additional information sent to the social media companies, Meta had to demonstrate that it had taken “timely, diligent and objective action”.
He added that TikTok “has a particular obligation to protect children & teenagers from violent content and terrorist propaganda”.
(Adapted from Wion.com)