Parler, which describes itself as an unbiased, free-speech alternative to larger platforms like Facebook and Twitter, was kicked off Apple’s app store and the Google Play store for a time following the Jan. 6 attack on the U.S. Capitol.
Amazon also booted the company from its cloud-computing services, effectively shutting down Parler for a time.
These companies claimed that Parler violated their terms of service by failing to moderate content on the site, some of which was hate-filled or incited violence, and which was viewed as potentially contributing to the Capitol insurrection.
To make amends with Apple, Parler executives contracted with a content-moderation company that employs artificial intelligence as the first line of community policing. Posts the algorithm identifies as incitement, threats of violence or hate speech will not appear for many users.
Despite the resolution, recent reports indicate the controversy damaged the platform's popularity.
What happened with Parler underscores how tricky it is for businesses to balance free speech with responsible speech. User bans are happening with greater frequency as social media platforms and digital services try to avoid promoting "bad" speech.
Therein lies the rub: what is “bad”? What kind of speech should be restricted? Offensive speech? Hate speech? Irresponsible speech? Speech that incites violence?
While not an absolute, it should generally be held as true that the best cure for bad speech is more speech.
Still, communication companies must monitor their content; platforms have an obligation to know what they host. The step beyond awareness is more troublesome, as companies choose what content to carry and what (and whom) to ban.
Great caution is in order. While private companies have the legal right to control public access related to communications, special regard must be given to the concept of free speech and free exchange in the marketplace of ideas.
Moderating content that users post on social media sites is appropriate. But care must be taken not to meddle too deeply in speech regulation.
This is not merely a matter of legality. Businesses are private entities and can decide for themselves what content to allow. But given their importance to and influence on modern public discourse, any limit on speech must be carefully scrutinized and weighed.
Social media platforms may not be required to uphold the First Amendment (that obligation binds the government), but their collective power makes any restriction they impose highly consequential. They should, therefore, err on the side of openness and uphold this country's free speech principles as much as possible, regardless of legal requirement or the lack thereof.
This editorial first appeared in the Pittsburgh Post-Gazette last Wednesday. This commentary should be considered another point of view and not necessarily the opinion or editorial policy of The Dominion Post.