By Gail Fitzer
Gail Fitzer’s capstone, “Combating the Social Media Disinformation Crisis: Why Reforming Section 230 is Not the Answer but Legislation Mandating Transparency Is,” is an exemplary paper and a great model for a Marxe School Capstone, winning high honors in her department. Beyond that, it is a critically and intellectually rigorous piece of writing, making nuanced points throughout, supported by in-depth research. In this regard, it is a good resource not just for Marxe students but for anyone producing long-form academic writing at Baruch. Ultimately, Gail argues a complex thesis effectively across this piece’s 50 pages of scholarship.
—Zefyr Lisowski, editor
Executive Summary
The rapid spread of disinformation and hate speech on social media has done tremendous and likely irreparable damage to the United States and the world, threatening the very fabric of our societies. The damage to democracy, human life, public and mental health, law and order, civility, and the future of the planet is well documented in numerous studies.

Unlike the European Union, Australia, Germany, and several other countries, the U.S. government has yet to regulate social media platforms, leaving them to make their own decisions about how to handle disinformation and hate speech. Much of the policy debate in the United States has focused on Section 230, with at least 20 bills introduced in the current session of Congress calling for Section 230 to be either repealed entirely or reformed. However, because lies, disinformation, and hate speech are all legal, protected speech under the First Amendment, holding social media platforms liable for the damage caused by disinformation on their platforms is unlikely to hold up in court, and changes to Section 230 could actually cause more harm than good, according to some policy experts.

A better approach to tackling the disinformation crisis starts with mandating transparency over both content moderation decisions and the results produced by social media algorithms (which audiences are being shown certain content, and why), with a new digital bureau of the FTC overseeing the sharing of this data with vetted academic researchers. Independent oversight boards to rule on appeals of platform content moderation decisions, and independent auditing intermediaries to verify platform data, both overseen by the new FTC division, are also important steps. But the first and most critical bill Congress must pass to begin tackling the disinformation crisis and protecting the public interest should mandate platform transparency and data sharing.
Please view and download the full piece below:
Published May 1, 2023