The White House: America’s Answer to Disinformation

The White House, Courtesy of Wikimedia Commons

Many believe that the federal government should regulate big tech companies in order to reduce the spread of misinformation online. It is no surprise that major social media platforms played a role in the January 6th Capitol insurrection. These platforms not only helped lay the groundwork for the attack but also helped publicly expose it.1

Afterward, major social media companies like Twitter and Facebook added new content restrictions and removed former president Donald Trump from their platforms, illustrating how unnervingly easy it can be for a single company to sever the public’s access to specific information. The episode also laid bare the gaps in federal guidelines, policies, and laws governing online speech.2 German Chancellor Angela Merkel agreed, arguing that “the United States should have a law restricting online incitement.”3 Such a law, however, clashes with the American understanding of free speech and the First Amendment.4 To understand the tension between new government regulation and the First Amendment, it is helpful to look closely at the Supreme Court’s First Amendment case law.

A more promising approach, however, may lie with the platforms themselves. Social media companies could tighten their policies and be more selective about what they show their users. YouTube did this in 2019 by changing the algorithm that determines what it recommends to viewers. In addition, the chief executives of Facebook and Twitter have expressed willingness to accept government restrictions. In the country’s current politicized climate, it may be up to the social media giants to start the process of regulating and slowing the spread of misinformation.5 It is therefore worth examining the steps Facebook and Twitter have already taken to combat false information.

Another contingent finds big tech companies’ ability to limit public access to information they deem harmful concerning, since it can stifle thought, dissenting views, and free speech. For example, Andy Ngo, a conservative journalist known for his aggressive reporting on the Antifa movement, was suspended from Twitter in November 2019.6 Newsday columnist Cathy Young writes that “there is certainly nothing about Ngo’s Twitter presence to justify his banning.”7 She adds that the same accusations could be leveled at liberal reporters whose Twitter accounts remain intact. This double standard is worrisome. While some, like Vox commentator Aja Romano, argue that social media platforms need to regulate user content, doing so carries risks of its own. Young highlights that without a clear definition of false information, moderation decisions could become politicized. So while some regulations for major tech companies could be useful, those rules must be created and enforced with caution.8

Personally, I believe that social media platforms should have restrictions banning harmful speech. The insurrection on January 6th clearly showed how quickly false information can spread and become a threat to the country. Higher-level supervision and policies need to be implemented, and additional federal guidelines surrounding disinformation on social media platforms are a key step. To put federal guidelines in place, a bill must be voted on by Congress, passed, and implemented as a law. The law would act similarly to the General Data Protection Regulation, the European Union regulation that protects individuals’ data privacy. New regulation is necessary in order to protect the public good; one example could be the White House advocating for restrictions on disinformation ranging from COVID-19 falsehoods to hate speech. Decisions about what is and is not misinformation should fall to an oversight board, like Facebook’s, made up of a diverse group of people with expertise in moderating online content and collaborating toward common goals based on policy. For example, Facebook’s board includes Julie Owono, whose background is in international law and relations; Jamal Greene, a law professor at Columbia University with expertise in freedom of speech; and Nighat Dad, founder of the Digital Rights Foundation, whose skills fall under digital rights and online safety.9 This is clearly a sensitive and challenging issue in the modern American political climate. While determining what exactly these restrictions would entail will surely be a lengthy and messy debate, it is nonetheless a step that needs to be taken in order to protect American democracy.