by Seth David Radwell
In the world of social media, Facebook is ubiquitous. From its humble beginnings as a Harvard student’s pet project, it has grown to 2.89 billion users worldwide, making it the largest social network by a margin of roughly 600 million users.
A recent Pew Research study found that 36% of American adults regularly use Facebook to get their news. In comparison, about 23% reported they use YouTube for news; 15% use Twitter.
In “American Schism,” I discuss how the media incentives in both digital and cable “news” are misaligned with the public good of accurate information. While this is a complex problem, there are myriad ways that better incentive systems can be created.
In the case of Facebook, the problem is twofold. First, because Facebook’s algorithm promotes what users interact with through likes, comments and shares, it creates a lopsided sense of the news. More outrageous content tends to get noticed more, which earns more interactions and starts a self-perpetuating promotion cycle. This has nothing to do with newsworthiness and everything to do with using effective stimuli to elicit more clicks, and thus greater advertising revenue for Facebook.
Second, since Facebook isn’t bound by the ethical standards of journalism (standards the mainstream media arguably has a tenuous grasp on as it is), the chance of encountering heavily biased or downright false information is high. A traditional news outlet uses professional judgment to decide which stories get prominent attention and abides by journalistic standards of verification. Facebook’s algorithm, by contrast, is tuned to serve whatever content will garner the most engagement, regardless of its accuracy. According to a Washington Post report, news publishers known for spreading misinformation got six times more “likes, shares, and interactions” on the platform than trustworthy news sites did.
Even more disturbing is how Facebook began serving ads that invited users to “like” certain media outlet pages, which caused a significant increase in traffic to these sites. In 2013, BuzzFeed reported a 69% bump in page views linked from Facebook. While news outlets appreciated additional readership, they also came to understand Facebook was able to control where its users got their information. “Across the landscape, it began to dawn on people who thought about these kinds of things: Damn, Facebook owns us. They had taken over media distribution,” Alexis C. Madrigal wrote in The Atlantic.
With so much influence on what news gets distributed to whom, Facebook has demonstrated its ability to manipulate not only its users’ opinions, but also the fate of democracy.
How do we tame this social media juggernaut?
If we look to history, we can find several examples of companies growing too big for the public good. A monopoly arises when one company becomes the sole supplier of a particular product; it eliminates competition in the marketplace and, as a result, can take advantage of its customers.
Congress passed the Sherman Antitrust Act in 1890 to break up these monopolies, but it was largely toothless until Teddy Roosevelt began using it to his advantage. Roosevelt’s “trust busting” became something for which his administration would be remembered.
However, Roosevelt did not believe all big business to be bad. As long as a company grew by legitimately outmatching its competition, he felt it served the public good. But if a business grew through unfair or unreasonable practices, Roosevelt felt the government should intervene to protect the public.
J.P. Morgan’s Northern Securities Company, which dominated the railroad industry, was ordered to dissolve following a 1904 Supreme Court ruling that the company was violating the Sherman Antitrust Act. Likewise, in 1911, John Rockefeller’s Standard Oil was ordered to break into 34 separate companies following a Supreme Court decision that determined the company’s majority market share had come as a result of economic threats against competitors and secret deals with railroads.
In more recent history, the U.S. Justice Department’s antitrust suit against AT&T, then the sole provider of telephone service in the country through its Bell System, was settled in 1982, with the company agreeing to spin off its local operations into seven independent regional companies. The so-called “Baby Bells” allowed competitors to reenter the market, providing telephone services and related equipment.
Many would argue that Facebook is a textbook example of a monopoly, but the Federal Trade Commission has yet to prove it in court. The FTC filed suit against the company in 2020, but the case as presented failed to show that Facebook controls a dominant share of personal social networking services, and the complaint was dismissed.
The reason the initial filing failed, according to a TechCrunch report, is that the FTC left messaging apps, like the Facebook-owned WhatsApp, out of the equation. Were that data included, Facebook would control more than 90% of the market. The FTC re-filed its lawsuit in August of this year.
The parallels between the monopolies of the early 20th century and Facebook are striking. Instead of competing with alternatives, Facebook has in essence pulled them into its machine, or attempted to, as when it made bids to buy both Twitter and Snapchat. “Facebook has, for many years, continued to engage in a course of anti-competitive conduct with the aim of suppressing, neutralizing, and deterring serious competitive threats to Facebook,” the FTC suit alleges.
Facebook has become powerful enough to shape an election or sway public opinion, and in doing so it has demonstrated that it is too big for the public good. From both a civic and an economic perspective, accurate information is a public good. When there is a factual basis for monopoly behavior that limits society’s access to a public good, targeted regulation is necessary.
Until then, it’s important for users to understand that the information they receive from Facebook shouldn’t be wholly trusted. Before sharing a story, consider the source and verify its accuracy with other reputable sources. If it seems too scandalous to be true, it probably is.