The European Commission now has the power to request information under the Digital Services Act (DSA) to determine whether social media platforms and other services that carry user-generated content have failed to comply with content moderation duties.
The DSA is a cornerstone of the EU’s digital strategy and sets out an unprecedented new standard for the accountability of online platforms regarding disinformation, illegal content, and other societal risks.
User-generated content
Social media platforms and other services that carry user-generated content that are designated as Very Large Online Platforms (VLOPs) are required to comply with provisions introduced by the DSA, including the assessment and mitigation of risks related to the dissemination of illegal content, disinformation, gender-based violence, and any negative effects on the exercise of fundamental rights, the rights of the child, public security and mental well-being.
In particular, the Commission will be looking at VLOPs’ policies and actions regarding notices of illegal content, complaint handling, risk assessment and the measures taken to mitigate identified risks.
Interviews, inspections, fines or bans
VLOPs that receive such requests must provide the requested information to the Commission, which will then assess next steps. The Commission can impose fines for providing incorrect, incomplete or misleading information in response to a request for information. The Commission may also conduct interviews and inspections and, if justified, proceed to a formal investigation.
If it decides that a VLOP has failed to comply or is not addressing the problems it has identified, and risks harming users, the Commission can take more drastic steps, including imposing a heavy fine or requesting that judges temporarily ban the platform from the EU.
Round-the-clock moderators
Social media platforms and other services that carry user-generated content that receive such requests will need to prove that they have taken timely, diligent and objective action. This is likely to involve providing evidence that teams are working around the clock to keep the platforms safe and taking action on content that violates policies or local laws; that the platform coordinates with third-party fact-checkers to limit the spread of misinformation; and that it removes content and/or adds notes to certain posts to give them context.
It is likely that social media platforms and other services that carry user-generated content will need to redistribute resources and refocus teams in order to comply with these new standards. Content moderation and public-interest policies, as well as in-house enforcement teams, will need to be beefed up and made transparent.
Social media platforms and other services that carry user-generated content, and especially those designated as VLOPs, will be keeping a close watch on what the Commission ultimately decides to do in this space, and how it actually wields its power in practice.
Zach Judge-Raza is a Director in the Public and Regulatory Law Group, Fieldfisher.
With thanks to Trainee Solicitor Jonathan Comfort, co-author of this article.