TY - GEN
T1 - Evaluating the Different Approaches to Social Media Regulation and Liability
AU - Watney, M. M.
N1 - Publisher Copyright:
Copyright the authors, 2025. All Rights Reserved.
PY - 2025
Y1 - 2025
N2 - With more than 5.17 billion users, social media is one of the most powerful forces in the world today. Consumers and businesses rely on it for connecting, researching, and communicating. Over the years, social media platforms have evolved into complex landscapes plagued by data privacy breaches, content moderation controversies, and mounting concerns about mental health outcomes. But how do governments and social media companies protect public safety against the risks and threats that social media presents, such as misinformation, deepfakes, hate speech, and extremist communication? The discussion explores the different approaches to the regulation and liability of social media platforms. Some governments have shifted away from social media platform self-regulation of content moderation to legal regulation. For example, the European Union Digital Services Act and the United Kingdom Online Safety Act both hold a social media company accountable for illegal and harmful content on its platform, but their approaches differ. Government control treads a fine line between free speech and censorship, and between over-regulation that may stifle innovation and the responsibilities that come with running a platform, with public safety and the future of the internet at stake. In the United States (US), free speech is protected under the First Amendment of the Constitution, which allows citizens to express themselves without government interference. Since social media companies are private companies, they can decide which speech they wish to host and amplify. Section 230 of the Communications Decency Act provides them immunity from liability for user-generated content. In recent years there have been legal disputes over this immunity protection and over content moderation decisions. Allowing a social media platform to self-regulate may be good for innovation, but social media is now a powerful communication space with billions of voices, and some of those voices are illegal or harmful. It may be that some form of government oversight should be in place to protect public safety. The discussion highlights that governments around the world are increasingly alarmed by the potential for social media platforms to be exploited, and this has resulted in an ongoing struggle between the need for free expression and the imperative to maintain public safety.
KW - Free Speech on Social Media
KW - Public Safety and Social Media Regulation
KW - Social Media Liability for Harmful and Illegal Content
KW - Social Media Regulation
UR - https://www.scopus.com/pages/publications/105011594603
M3 - Conference contribution
AN - SCOPUS:105011594603
T3 - Proceedings of the 12th European Conference on Social Media, ECSM 2025
SP - 310
EP - 316
BT - Proceedings of the 12th European Conference on Social Media, ECSM 2025
A2 - Pinto, Susana
PB - Academic Conferences International Limited
T2 - 12th European Conference on Social Media, ECSM 2025
Y2 - 22 May 2025 through 23 May 2025
ER -