The Regulation of Content Moderation

F. Galli, A. Loreggia, G. Sartor - International Conference on the Legal Challenges of the Fourth Industrial Revolution, 2022 - Springer
Abstract
Online platforms have become a key infrastructure for creating and sharing content, thus representing a paramount context for the individual and collective exercise of fundamental rights (e.g., freedom of expression and association) and the realisation of social values (citizens' information, education, democratic dialogue). At the same time, platforms offer new opportunities for unfair or harmful behaviours, such as the unauthorised distribution of copyrighted content, privacy violations, the distribution of unlawful content (e.g., hate speech, child pornography), and fake news. To prevent, or at least mitigate, the spread of such content, online platforms have been encouraged to resort to content moderation. This activity relies on automated systems that govern content flows so as to ensure lawful and productive user interactions. These systems deploy state-of-the-art AI technologies (e.g., deep learning, NLP) to detect prohibited content and restrict its further dissemination. In this chapter, we address the use of automated systems in content moderation and the related regulatory aspects. Section 2 provides a general overview of content moderation on online platforms, focusing mainly on automated filtering. Section 3 describes existing techniques for automatically filtering content. Section 4 discusses some critical challenges in automated content moderation, namely vulnerability, failures in accuracy, subjectivity, and discrimination. Section 5 outlines some of the steps needed to regulate moderation. Finally, Section 6 reviews existing legislation that addresses content moderation in online environments.
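To make the abstract's reference to automated filtering more concrete, the sketch below shows how an NLP text classifier might be wired into a moderation decision. It is a minimal, illustrative example only: the model name (unitary/toxic-bert), the thresholds, and the allow/review/block policy are assumptions made for demonstration, not the systems the chapter itself analyses.

```python
# Illustrative sketch of an automated text-moderation filter (assumptions noted above).
from transformers import pipeline

# A pretrained toxicity classifier from the Hugging Face hub; any
# text-classification model could be substituted here.
classifier = pipeline("text-classification", model="unitary/toxic-bert")

BLOCK_THRESHOLD = 0.90   # confidence above which content is withheld outright
REVIEW_THRESHOLD = 0.50  # confidence above which content is routed to human review


def moderate(text: str) -> str:
    """Return a moderation decision for a single post: 'allow', 'review', or 'block'."""
    # The pipeline returns the highest-scoring label, e.g. {'label': 'toxic', 'score': 0.97}.
    # For this illustrative model every label denotes a harmful category,
    # so the score alone drives the decision.
    result = classifier(text)[0]
    if result["score"] >= BLOCK_THRESHOLD:
        return "block"
    if result["score"] >= REVIEW_THRESHOLD:
        return "review"
    return "allow"


if __name__ == "__main__":
    for post in ["Have a great day!", "You are worthless and I will hurt you."]:
        print(f"{post!r} -> {moderate(post)}")
```

In any real deployment the thresholds, the label taxonomy, and the escalation path to human reviewers are policy choices rather than purely technical ones, which is precisely where the regulatory questions discussed in the chapter arise.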