With the tragic events in Texas and Buffalo, we are once again rushing into a national conversation about how to combat domestic terrorism. Several major problems come to the fore after events like these: public health problems reflected in our firearms laws; the effective application of red flag laws; mental illness; the loneliness and sense of dispossession among the young white men who predominate in the demographics of shooters, often paired with ideologies and theories premised on white supremacy; the safety of Black, brown, and Indigenous communities, as well as of our children; and, last but not least, the role the Internet and social media play in this complex mix. For the sake of brevity, let’s focus on that last point.
The politics and practice of content moderation have come to the forefront of debate in the United States. Suffice it to say that neither Democrats nor Republicans have hit the mark, as evidenced by debates that lurch to extremes and new legislation that stalls in Congress. But even for a country as riven by partisan division as the U.S., it doesn’t have to be this complicated. A simple, clear law could be put up for discussion among citizens and stand ready for adoption by social media companies, despite their allergy to any form of government regulation.
How about this draft?
- All platforms must comply with existing First Amendment law, including by removing posts that pose a “clear and present danger” to the health and safety of a person or community, whether the threat is to individuals or to physical property.
- All platforms must maintain a clearly labeled, functional link on their service’s home page through which users can report illegal activity.
- All provisions of this law are consistent with Section 230 of the Communications Decency Act of 1996 (CDA).
Three main points illuminate these provisions. The first is the “clear and present danger” exception to the First Amendment, stated in clear, modern terms. That exception is settled law, as are federal and state bans on child pornography and obscene material. Private companies, absent a law like this proposed one, are not bound by the First Amendment, but they are not exempt from existing law. This provision makes that point crystal clear.
Second, creating a duty to let users report such activity gives the community a way to engage with social media companies. The helpless muddle that so many people experience after security and privacy breaches can begin to be addressed with this basic but important complaint mechanism. Moreover, the approach has precedent: the Digital Millennium Copyright Act (DMCA) of 1998 established a similar notice regime through which content owners alert Internet service providers to possible infringement. It would be easy enough for lawmakers to take a page from the DMCA’s book.
Third, nothing in these provisions repeals Section 230 of the CDA. In fact, one could argue that because these rules merely require social media companies to comply with current law, the proposed statute is redundant. The same could be said of the Stop Enabling Sex Traffickers Act (SESTA) and the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA). Congress passed them anyway in 2018, a reminder that it is sometimes important to enshrine in law what should be obvious but is not, or what, like content moderation, is so entangled in culture-war discourse that a clear statement helps lawmakers and law enforcement alike.
Similarly, nothing in this proposed law prevents a platform from establishing its own content moderation policies. The law is the floor of expectations set by our government, below which no company may fall without consequence. Policy, a word derived from the ancient Greek term for the citizenry, is the higher set of expectations a platform establishes for its community of users. Once a user clicks “Accept” on a site’s terms of service, that user effectively becomes a member of the community.
Much remains to be done in the United States to combat domestic terrorism. Let’s start with the low-hanging fruit: a common-sense law requiring content moderation consistent with the laws we already have in place.