EC: First Online Platform Transparency Reports Show DSA Is Working
Big Tech platforms are making a good effort at complying with the EU Digital Services Act (DSA), but much more remains to be done, European Commission (EC) officials said during a Nov. 10 briefing where they presented the first transparency reports required of the 19 services designated as "very large online platforms" (VLOPs) and very large search engines (see 2304250008). These include Facebook, Bing, LinkedIn, Google and YouTube. Separately, the EC requested information from VLOPs Meta and Snap about how they're protecting minors.
The DSA governs providers of intermediary services such as social media, online marketplaces and search engines; online platforms with at least 45 million monthly active users in the EU are designated VLOPs. Among other requirements, VLOPs must analyze the systemic risks they create through the dissemination of illegal content and the harmful effects such content has on fundamental rights (see 2210040001).
As a new digital regulator, the EC first had to identify its priority areas, officials said. One is the Israel-Hamas war, where the EC engaged quickly with platforms on issues such as antisemitism, racism and terrorism. A second area is protection of minors; a third is online marketplaces and consumer protection, critical as the holidays approach. A fourth priority is election integrity. The EC is already working with VLOPs in several countries that have held or are about to hold elections, with a plan to engage with platforms before elections and assess their performance afterward. This is particularly important since European Parliament elections take place next year.
The EC is already seeing an effect from the DSA, officials said. For example, TikTok had no system for protecting minors a couple of months ago and had to start from scratch. It has since built systems for verifying users' ages and limiting profiling. While the situation isn't perfect, "we're seeing a change on the ground," officials said. VLOPs' "statements of reasons" outlining their content moderation actions are available in a searchable database.
One of the DSA's early objectives is to shine more light on how platforms moderate content, officials said. A major issue is whether platforms have enough content moderators working in the various European languages. At this point, the EC is analyzing the submitted information to see how content moderation practices align with platforms' risk assessments.
The DSA gives access to a wealth of previously inaccessible information about how VLOPs work, officials said; this kind of disclosure would have been unimaginable without legal requirements. One problem, however, is that the transparency reports aren't harmonized: Companies can report in whatever format they choose. The EC plans to remedy this with additional rules early next year.
So far, the EC has learned several lessons from the DSA, officials said. For instance, the DSA's mechanism encouraging EU government authorities to send removal orders to platforms hosting illegal content is working; Amazon has received the largest number of such orders. Another finding is that intellectual property is the most prevalent category of illegal content addressed through the act's notice-and-action mechanism. On content moderation, there's huge variation in how platforms interpret and report the accuracy rates of their automated moderation mechanisms, an issue the EC is studying more closely.
All intermediary services must comply with the DSA by next Feb. 17 and will have until June to come fully into alignment, officials noted. Penalties, which could include fines of up to 6% of a company's annual worldwide revenue and/or structural changes, can be imposed only once the entire system is in place next February. Before such fines are handed out, however, the EC might also consider interim measures or compliance agreements, though the hard law of the DSA will be hanging over platforms' heads.
The EC wants information from Meta and Snap about what they've done to comply with their obligations to protect children online, particularly against risks to mental and physical health, it announced. The companies have until Dec. 1 to respond. Meta was separately asked in October for information on the spread of terrorist and violent content and hate speech, as well as the alleged spread of disinformation.