The good news about news consumption during the 2020 election is that fewer Americans were exposed to “untrustworthy websites” — misinformation — than during the 2016 election, according to a recently released study by Stanford researchers published in the journal Nature Human Behaviour.
That conclusion may seem surprising, given the incentives for bad actors to disseminate misinformation and the technological advances that allow their lies to metastasize. But the finding also suggests that increased awareness, along with concerted efforts by institutions and individuals to promote and seek out credible information, can make a difference.
“I am optimistic that the majority of the population is increasingly resilient to misinformation on the web,” Jeff Hancock, the founding director of the Stanford Social Media Lab and the report’s lead author, told the New York Times. “We’re getting better and better at distinguishing really problematic, bad, harmful information from what’s reliable or entertainment.”
Still, victory certainly shouldn’t be declared. Not when more than a quarter of Americans (26.2%) were exposed to untrustworthy websites in 2020. That was down from 44.3% in 2016 but is still unacceptably high for a stable society and democracy.
Other positive findings include fewer overall visits to untrustworthy sites and less time spent on them. (The researchers identified those sites using a database of domains known to repeatedly publish misinformation.) Unfortunately, some stubborn demographic patterns remain: Older adults were more likely to access misinformation, and former President Donald Trump’s supporters were more likely to engage with misinformation in both 2016 and 2020.
Overall, citizens with “more ideologically moderate media diets” accessed untrustworthy information less than those with “more ideologically extreme media diets,” the study stated. And data on what the researchers call “referrers,” the platforms that steer users to untrustworthy sites, show that progress was made with Facebook over the four years: In 2016, the social media site accounted for 15.1% of visits to untrustworthy sites; by 2020, that share had fallen to 5.6%.
The study points out that after the misinformation surge in 2016, journalists and online platforms tried to curtail the spread. And although the misinformation dynamic is complex and multifactorial, the positive results suggest that some of the public and congressional pressure put on Facebook and other social media entities to address their roles may have helped. So, despite the pushback from Big Tech companies, Congress should keep up the pressure.
Ultimately, though, it’s up to individuals to take responsibility for the information they seek and for any misinformation they spread.
Nascent technologies such as artificial intelligence will undoubtedly bring new opportunities to further obscure the truth and splinter society. But fortunately, an antidote already exists: The U.S. has globally recognized, credible news organizations that endeavor to deliver, albeit imperfectly, objective news along with clearly labeled analysis and opinion.
Rejecting partisan misinformation and reading, watching or listening to these legitimate news organizations can return our society, however riven with divisions, to one in which our debate is over facts rather than fiction.