
WASHINGTON (TNND) — Lawmakers are renewing a push to repeal the legal provision that shields social media companies from liability over content their users post, a move that could reshape how the platforms operate as they face mounting pressure to protect children online.
Scrutiny of Big Tech platforms has been ramping up for years amid a series of explosive allegations from whistleblowers and lawsuits along with the advancement of artificial intelligence — an industry that social media platforms have heavily invested in.
The Sunset Section 230 Act was introduced in December and is sponsored by a bipartisan coalition of senators from across the ideological spectrum, showcasing how the issue of online child safety and social media reform has cut across the typical partisan lines that have divided Congress.
Illinois Democratic Sen. Dick Durbin, a lead sponsor of the bill, held a press conference at the Capitol on Wednesday with actor Joseph Gordon-Levitt and online safety advocates to push Congress to pass a bill that would sunset Section 230 two years after it becomes law.
Section 230 of the 1996 Communications Decency Act was created to protect early internet platforms from lawsuits over user-generated content, a safeguard widely seen as essential to the internet’s development. As social media companies have become some of the nation’s most powerful and influential corporations, critics have questioned whether that protection should remain.
“To the extent this protection was ever needed, its usefulness has long since passed. Sunsetting section 230 could force Big Tech to the table to negotiate real reform that provides real accountability,” Durbin said.
The way social media works could fundamentally change if the companies are forced to insulate themselves from lawsuits. Content moderation standards, another matter of intense political debate, could be drastically tightened to limit legal liability. Algorithms that power user feeds may also be overhauled in ways that alter the user experience, raising questions about which platforms would continue to succeed.
“It would affect all operations, because there would just be much greater risk on every possible technical surface,” said John Wihbey, an associate professor of media innovation at Northeastern University. “Every single aspect of content treatment and all the mechanisms around content would need to be adjusted for much greater legal risk.”
There has been bipartisan interest in repealing Section 230 for years, though the parties have different rationales for opposing the legal shield. Democrats initially sought to repeal it to hold tech companies accountable for perceived failures to deal with misinformation, while Republicans said it gave social media companies too much cover to censor viewpoints.
Those concerns still persist among lawmakers, but attention has largely shifted to addressing the negative consequences social media has had on children, a problem that countries around the world are trying to tackle through youth bans and other regulation. In the U.S., repeated vows to put guardrails in place have not resulted in new regulations, though a handful of bills regarding sexually abusive material and deepfakes have made it into law.
Tech companies have tried to combat the push for regulation with a series of changes to their platforms and heavy investments in identifying problematic content, limiting accounts for underage users and cracking down on bad actors. But those assurances have not gone far enough for a Congress that has largely accused the companies of prioritizing profits over safety to the detriment of children.
Advocates of repealing Section 230 argue the prospect of lawsuits would force companies to police their platforms more effectively, removing problematic content and changing how it is targeted to users’ feeds.
“Big Tech needs to be incentivized to protect kids online, and if they don’t, they should be held civilly liable,” Durbin said.
Opponents of repealing Section 230 argue it would open the industry to an avalanche of lawsuits, making it impossible for start-ups without the legal resources of the industry’s major players to survive. The prospect of repeal has also raised concerns about platforms taking a heavier hand in content moderation decisions.
There are also questions about how effective sunsetting Section 230 would be at addressing the complaints lawmakers and others have about Big Tech companies. It would force the hand of platforms that critics see as failing to act on those issues, but it would leave much of the discretion over remedies to the industry in the absence of accompanying regulations.
“It’s not really a policy solution, it’s sort of a mechanism for forcing the debate,” Wihbey said. “There’s a big problem, but it could get worse if in a very polarized environment, they put a ticking time bomb into law, and then there’s absolutely no agreement or political will to do anything else.”
Congress has no shortage of proposed reforms for tech companies but little agreement between the two parties on the specifics of what to implement amid dueling perspectives on free speech and how far to go with regulation. Previous legislation with broad support, such as the Kids Online Safety Act that sailed through the Senate in the last Congress, has run into roadblocks that have so far proved insurmountable.
Tech groups have fought efforts to repeal Section 230, arguing it would hamstring the industry’s growth, cost jobs and reduce access to information.
“Section 230 has been a cornerstone of the modern internet — enabling American innovation, supporting millions of jobs, and allowing businesses of all sizes to connect with customers, share ideas, and grow online. It has helped fuel the economic growth and global competitiveness that define the U.S. technology sector,” TechNet president and CEO Linda Moore said in a statement after the bill was introduced.