Following the conviction of Derek Chauvin for the murder of George Floyd in Minneapolis, four-time NBA champion LeBron James tweeted an image of a Columbus police officer involved in the shooting death of 16-year-old Ma'Khia Bryant with the caption "YOU'RE NEXT #ACCOUNTABILITY."
Many quickly accused the basketball superstar of using Twitter to incite violence. The tweet, which has since been deleted, came just days after Rep. Maxine Waters (D-Calif.) called upon protestors to "stay in the street," and even to become "more confrontational," should Chauvin be acquitted.
This led to a firestorm on social media, one in which there was no shortage of ill-informed commentary.
The First Amendment And Social Media
Many have tried to make this a "First Amendment" issue, arguing that James, Waters and others have a right to speak their minds. However, that would be a misreading of the Constitution. The First Amendment is really about speech directed at the government and whether the government censors it.
"The First Amendment only restricts the government's regulation or punishment of expression, although it doesn't apply to only speech directed at the government; the First Amendment protects expression generally, unless the expression falls into one of the very narrow categories of unprotected speech," explained Chicago attorney Ari Cohn, who specializes in First Amendment issues.
"Private companies, including social media platforms, aren't restricted by the First Amendment," added Cohn. "Quite the opposite, they are protected by it. They have their own First Amendment right to determine what speech they want to allow on their private property, much the same as you or I have a right to kick someone out of our house for saying something that we don't like."
It is also important to recognize from the outset that social media companies are private actors; not being subject to the First Amendment, they can set and enforce whatever policies they deem appropriate.
"A good example of this, of course, is Twitter's January 2021 decision to ban President Donald Trump," said Bob Jarvis, lawyer and professor of law at Nova Southeastern University.
"At the same time, however, social media companies (SMCs) are protected by Section 230 of the Communications Decency Act," added Jarvis. "This 1996 federal law shields websites from liability for the content their users create, post, and comment on. Section 230 assumes that SMCs are merely providing platforms for users to express their thoughts, with the SMCs neither reviewing nor endorsing such content or comments. Thus, while SMCs are not constrained from acting by the First Amendment, Section 230 gives them an incentive not to act."
Old Media Rules Still Apply
Another consideration is that many of the traditional or "old media" rules still very much apply to social media, even if the content is generated differently.
"Social media companies are private entities just like newspapers except that they don't provide their own content," noted James R. Bailey, professor of leadership at the George Washington University School of Business.
"Other people provide that content," Bailey explained. "Newspapers can make the decision to publish something or not. But that's where social media gets hung up: because they're not creating their own content, they're in this unfortunate position of having to police other people's content, and then step in and say whether it's appropriate or not."
Some users have cried "censorship" when a post or photo is taken down, or when more extreme measures are taken, such as being removed from the service entirely. But that still doesn't make it a First Amendment issue; it is a matter of the rules imposed by the companies themselves.
"What do we allow and what don't we allow? We've already seen. Facebook and Twitter came down on the past president, and have thrown a few other people off, and that's essentially controlling content," added Bailey.
Additionally, it could be argued that social media companies, as with any other companies, have an ethical duty to conduct their business responsibly. That could include not allowing the posting of someone's personal information, as an example. It would then be up to the social media companies to respond accordingly.
"A component of that ethical duty is to avoid foreseeable harm to others," suggested Robert Foehl, executive in residence in business law and ethics at Ohio University's Online Master of Business Administration program.
"If an individual or group uses a social media platform to encourage acts of violence against another person or another person's property, then the social media company has an ethical obligation to employ corrective measures to remove such posted content so that resulting harm is minimized," Foehl added. "But this reactive response is not enough. The social media company has an ethical duty to engage in proactive measures to avoid the posting of such content in the first place – thereby avoiding foreseeable harm."
Inciting Violence
While it could be debated at great length whether Rep. Waters or Mr. James actually intended to incite violence, there have been cases where some have most certainly used social platforms to do so. In those cases, the social media companies should take responsibility, as they have already done in the past.
"The recent banning of Donald Trump on Twitter, after the January 6 insurrection, has shown that these platforms are acknowledging the responsibility they have in perpetuating and mitigating potentially harmful misinformation," explained Jui Ramaprasad, professor in the Decision, Operations and Information Technologies department at the University of Maryland's Robert H. Smith School of Business.
"While banning users is one option, platforms are also starting to think – and should think – about how information/misinformation is spread, and which content should be privileged," added Ramaprasad. "When algorithms are behind this process of determining what we see on our feed – no matter what platform we are on – it is not clear that the 'true' information is privileged over the false."
Social media can differ from other forms of mass media, however, in that communication spreads so quickly. A simple tweet can go viral in minutes. Even if deleted by the original poster, it can take on a life of its own – as many celebrities have found out when they've been too quick with the thumbs.
"It is highly unlikely, for instance, that a newspaper article could constitute incitement," said Cohn. "It's difficult to imagine that an article would be likely to cause people to immediately set aside the newspaper and go out to commit unlawful acts. Social media does add an interesting wrinkle, given the real-time communication that goes on. Could a tweet constitute incitement? It's significantly easier to imagine circumstances where it theoretically could than it is in the case of a newspaper article."
It is therefore easy to see why social media platforms have had to resort to silencing some voices, and why the companies have had to address the spread of content that could be seen even to hint at inciting violence.
"Social media platforms removing content that calls for violence is absolutely within their purview, and there is certainly a strong moral argument that they should in fact do so," added Cohn. "You'd be hard-pressed, I think, to find many people who think that platforms should let serious threats of violence (as opposed to rhetorical hyperbole) be posted without moderation."
How social media handles these issues could be the challenge, especially as the technology is so new and continues to evolve.
"Social media is similar to mass media, but it is still a different beast," said Matthew J. Schmidt, PhD, associate professor of national security and political science at the University of New Haven.
"There is still much to figure out about how social media can handle matters when people make posts that upset others, but we muddle through," Schmidt added. "We're not exactly digital natives. It could be the kids coming up now, who are born into the world of social media, who may best figure it out. We built it, but we're still the outsiders. They'll live in it and figure out how such content can be properly moderated."