Google Commits to Transparency in Content Moderation Ahead of New EU Rules
Google has announced plans to improve transparency in content moderation across its services, including its search engine, to comply with new European Union rules. The regulations, which come into effect this week, also apply to social media platforms such as Instagram, Twitter (now rebranded as X), and TikTok. These companies will be required to strengthen their content moderation practices or face fines that could run into billions of euros. Among the Google products affected are YouTube, Google Maps, Play, Search, and Shopping.
The Digital Services Act (DSA) gives the EU greater insight into the 19 platforms and search engines classified as “very large,” meaning those with at least 45 million monthly active users in the EU, and into how their algorithms function. The DSA imposes stricter rules on targeted advertising and requires companies to provide more effective mechanisms for flagging and removing illegal content.
In response to these requirements, Google is expanding its “Ads Transparency Center” to give users in the European Union more information about the targeted ads they see. The company also plans to give researchers greater access to data so they can better understand how Google’s products work in practice. In addition, Google will publish transparency reports detailing how content moderation is handled on additional services such as Maps, Play, Search, and Shopping.
Other tech giants, including TikTok and Meta, have recently announced their own compliance measures, such as giving users greater control over their feeds on Instagram and Facebook. The DSA requires major internet players to assess the risks associated with their services and take appropriate action to mitigate them.