Should social media platforms be responsible for regulating content?

Danielmuleady
#im310-sp21 — social media
3 min read · Apr 24, 2021


Long story short, yes. Long story less short, I feel very strongly about this issue but I don’t lean one way or the other. Let me explain.

“I’m like the world’s largest totem pole — ready to fall one way or the other quite harshly.” Source: Countrylife.co.uk

In 1996, the United States Congress enacted Section 230 of the Communications Decency Act to regulate the internet in the early days of the medium. While Section 230 has many parts, it is best known for the following statement:

“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

Sure, 25 years ago this seemed like a good policy to have. When the frontier of the Internet looked like the wild west, there was no way to hold individuals accountable, so it seemed harsh to punish companies and domains for content that they could hardly control.

Punishing companies would essentially go against one of the government’s longstanding views on the role that the Internet plays in society:

“to promote the continued development of the Internet and other interactive computer services and other interactive media”

If the government signed a law into effect that discouraged companies from hosting websites, since they could be punished at any time for something outside of their control, it would undermine its own goal of promoting the growth and use of the internet. At the time, Congress needed to act, and this may have been one of the better solutions available back then.

But society has changed. The Internet is now a part of our daily lives and no longer the foreign frontier we once saw it as. Section 230 is an outdated mess that doesn’t account for the changing society around it. Companies now have the means and control to manage their users (at least to some extent). Social media companies like Facebook, Twitter, and Instagram have been known to ban or block users for spreading misinformation and posting inflammatory messages.

While I get hesitant at the thought of adding government regulations to something that is the culmination of society’s cognitive surplus, I understand that the Internet isn’t just what I see every day. There exist large communities that use the Internet for criminal activity and for promoting dirty agendas. We have already seen platforms like Parler serve as a place for users to polarize one another with misinformation, spread baseless claims, and generate harmful activity (the Jan. 6 attack on the U.S. Capitol).

I don’t think that we should be talking about instituting Patriot Act levels of monitoring and control, but platforms should, at the very least, start to be held responsible for monitoring their users. If they can’t manage that task, then they don’t deserve to have such a large base of users. I know that you can’t design a system that starts out that way; social media platforms grow and change with their user bases. I guess we will soon see which platforms can tackle that monumental task.
