February 10, 2023

Another day, another Big Tech debate – and quite a lot seems to be at stake with Section 230 of the 1996 Communications Decency Act being called into question.

Section 230 states: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

Simply put, Section 230 shields platforms and online service providers from being held accountable for user posts, shares, and searches. It also protects them from liability for the adverse effects of third-party activity or interactions. So, as it stands, Instagram wouldn’t be called into question for the recent post by Jamie Lee Curtis, which features a rather detestable framed picture hanging on her office wall.

Section 230 protection relates to a 1959 Supreme Court case, Smith v. California, which clarified the role of bookstores as distributors and held that they were not liable for the contents of the books on their shelves. Any wrongdoing that may be discovered is the responsibility of authors and publishers.

The bookstore owner does, however, play an important role in the process. As distributors, owners can have a say in what is featured in their stores, they can screen and preview material before it makes its way to the shelves, and they can make recommendations and suggestions to customers and other interested parties.

Much like the ruling in the bookstore case, Section 230 has been said to serve as both “a sword and a shield,” allowing platforms and search engines to moderate and curate content without being held responsible for everything that makes its way online or for what users may do. Users and third-party affiliates are liable for any content they create or share, as well as any criminal activity that may arise from it (if someone commits murder after being inspired by reading or watching How to Get Away With Murder, we don’t expect the writers or actors to also take the blame).

This is not to downplay the fact that some people use social media for nefarious purposes, or that some platforms could do a better job of detecting criminal activity online (if the mafia uses your restaurant for weekly meetings, you might want to reconsider or, better yet, report your customer base).

Content moderation is certainly warranted for companies that care about their broader impact, let alone their brand. Monitoring online occurrences not only assists with providing a better user experience, but also deters unlawful activity. As such, the primary concern is not whether activity should be moderated, but rather who does the moderating.

Firms should be in control of their online offerings and services, and online users should be accountable for their activities (just as fast-food chains shouldn’t be used as scapegoats for obesity rates when it is individuals behind the wheel at the drive-thru making the choices).

If notifications and recommendations lead to unhealthy patterns or places, we need to remember that we are the ones scrolling and clicking. It is up to users in the online realm to break free of the echo chambers that may have been established, as well as explore avenues that may broaden their horizons. And platforms and search engines should aim to enhance and expand positive user experiences via the algorithms they employ.

Just as a bookstore owner may steer you towards a certain genre or recommend a certain author, algorithms are based on user data and will prioritize or promote posts, publications, and promotions accordingly. And this is one of the great advantages of data compilation – by relying on inputs and interactions, algorithms create online opportunities and efficiencies (91 percent of users never feel the need to click over to the second page of Google search results). 

Platforms and search engines provide different incentives for user engagement and different means for interaction. And even those platforms that are often painted in a positive light, such as LinkedIn and Pinterest, come with baggage and attract their share of blame. It is also worth noting that algorithms aren’t infallible (if you are a Bob Dylan fan, you may love or loathe the Traveling Wilburys being featured as part of a curated playlist).

People and profits are powerful forces pushing firms to improve online activity, to keep tabs on where their algorithms lead, and to watch what active users and contributors are up to. Pornhub scrubbed its site of seemingly criminal content after Visa pulled its partnership, showing that reputation and brand affiliation can matter more than the policies in place.

Overall, it is in the best interest of any platform or search engine to flag, block, or report that which does not adhere to its terms of service, especially when illegal activity is discovered. And it is in everyone’s best interest for the tech industry to retain the right and protection to do so.

Kimberlee Josephson

Dr. Kimberlee Josephson is an associate professor of business at Lebanon Valley College and serves as an adjunct research fellow with the Consumer Choice Center. She teaches courses on global sustainability, international marketing, and workplace diversity, and her research and op-eds have appeared in various outlets.

She holds a doctorate in global studies and commerce and a master’s degree in international policy both from La Trobe University, a master’s degree in political science from Temple University, and a bachelor’s degree in business administration with a minor in political science from Bloomsburg University.

Follow her on Twitter @dr_josephson
