In a move to protect users, especially minors, the European Union has voiced concerns about “harmful” content on popular social media platforms such as Snapchat, TikTok, and YouTube. The EU’s digital watchdog, the European Commission, is demanding that these platforms provide more information on how their recommender algorithms function and the steps they are taking to mitigate risks related to issues like hate speech, illegal drug content, and mental health. This action is part of the EU’s landmark Digital Services Act, which aims to hold tech companies accountable for the content on their platforms.

Platforms Under Scrutiny
The EU’s digital enforcer, the European Commission, has sent a series of questions to Snapchat, TikTok and YouTube aimed at obtaining technical details about the design and operation of their recommender systems. These algorithms decide which content reaches users and, as such, can amplify risks to minors and to users’ mental health.
The Digital Services Act requires platforms to adopt fit-for-purpose mitigation measures to reduce the potential harms of their recommender systems. Of particular concern to the EU is the use of these algorithms to spread hate speech and illegal drug content. The platforms have until November 15 to respond to the request with information such as a full explanation of how their recommender systems work, what controls are in place, what testing is performed, and the processes for taking corrective action.
Openness and accountability
This is all part of broader EU efforts to crack down on big tech and its role in distributing illegal content. The Digital Services Act, a once-in-a-generation piece of legislation, forces platforms to be more transparent about their content moderation and requires them to take proactive steps to shield users, such as children, from harm.
By requesting this information, the European Commission is seeking an in-depth understanding of the recommender systems in order to identify possible risks and assess how the platforms handle them. It is an important step toward ensuring that these very powerful algorithms are not exploited to harm people or to disrupt democratic discourse.
The future of social media moderation
The EU’s crackdown on Snapchat, TikTok and YouTube is just a drop in the ocean of digital regulation. The European Commission is also investigating the content recommender systems of several other well-known platforms, including AliExpress, Facebook and Instagram.
As our digital world grows and changes, policymakers and regulators will have to walk a tightrope: encouraging innovation without weakening the protections that shield individuals from harm. The EU’s pro-consumer approach, hinging on transparency and accountability, offers the rest of the world a template for managing tech companies that have gone barely checked for years.