Thursday morning, the staff of The Kansas Reflector got a jolt: Without warning or explanation, every link to their website was erased from their Facebook page. And it wasn’t only them — the same went for every user who had shared its content.
Just like that, one of the primary conduits for the nonprofit newsroom to reach its audience was cut off. “We’ve been flooded with instant messages and emails from readers asking what happened, how they can help and why the platform now falsely claims we’re a cybersecurity risk,” opinion editor Clay Wirestone and editor in chief Sherman Smith explained on its site.
They assured readers their electronic devices weren’t in danger, but wondered: Did Meta Platforms, Facebook’s parent company, take such draconian action because the news site had just shared on its page a commentary by an environmentalist documentary filmmaker critical of the social network’s advertising policy?
By the end of the day, the links had been restored. Meta communications director Andy Stone tried to allay the Reflector staff's concerns, insisting the block had nothing to do with what had been shared. "This was an error that had nothing to do with the Reflector's recent criticism of Meta," Stone wrote on X. "It has since been reversed and we apologize to the Reflector and its readers for the mistake."
Comforting words, perhaps, but Meta hasn't given users and content creators many reasons to trust it. Its Facebook and Instagram social networks still teem with fake accounts, scammers and staggering amounts of misinformation and deliberate disinformation. A sobering recent study by three academic researchers examined more than 13 million Facebook political image posts, the combinations of photos with text laid over the top often called "memes." The study found that a shocking 23% of them contained misinformation. And it wasn't evenly distributed by ideology: Bad info appeared in 5% of left-leaning content, but in 39% of posts from the right.
Facebook isn't the only Meta product with a credibility problem. Instagram announced earlier this year that it would push less political content into people's feeds. Creators who run popular accounts have long suspected that Instagram and other social networks quietly suppress their posts when they get political, a practice known as "shadowbanning." There's plenty of evidence it exists there, as well as on YouTube, TikTok and other platforms. And who knows which views are being tamped down?
So what Facebook says goes. But it still wants to have it both ways: all the revenue it can collect as a publisher, and none of the responsibility for its users' misdeeds. ("We're just a platform.")
This poses a potentially existential threat to organizations that produce legitimate journalism, and convenience sits at its core. Facebook organized the confusing, wide-open internet and put its best features in one place: keeping up with your high school friends, playing games, messaging. It takes great advantage of traditional journalists' work to fill users' timelines with things to look at, but the company sells its own advertising and has little incentive to send people off to follow those links to outside websites. Facebook wants and needs what reporters, editors and photographers produce, yet doesn't always act like a good partner.
If you care about responsible, verifiable news, we suggest that you directly visit the websites of the news organizations you value. Cut out the social media middleman.