No Comment! Why More News Sites Are Dumping Their Comment Sections
Letters to the editor have always been a key part of American newspapers, the main channel for readers to respond to the content they consume and publicly debate major political and social issues. So in the late 1990s and early 2000s, when media outlets began publishing their content on the internet, many editors and reporters were cautiously optimistic that providing a space for online commenting would invite more diverse audience engagement and create a stronger connection between content creators and consumers.
In 2008, NPR introduced its reader commenting system, an option it embedded through a third-party system at the end of most articles on the site. In the announcement, NPR wrote: "We are providing a forum for infinite conversations on NPR.org. Our hopes are high. We hope the conversations will be smart and generous of spirit. We hope the adventure is exciting, fun, helpful and informative."
And an adventure it has been, but not so much a positive one. Eight years and millions of toxic exchanges later, NPR announced the abrupt end of the experiment.
"After much experimentation and discussion, we've concluded that the comment sections on NPR.org stories are not providing a useful experience for the vast majority of our users," wrote Scott Montgomery, former managing editor for digital news, in his 2016 farewell-to-comments address.
Like countless other news outlets, NPR found itself overwhelmed by trolls, anonymous contributors who had too often hijacked comment threads with offensive and inappropriate submissions.
Simply put, trolls are the loudest voices in the room, the ones who write "crazy, nasty things just to get people all riled up," as this latest Above the Noise episode explains in its exploration of trolling psychology.
"I think that public engagement needs to be a key part of a public media organization," said NPR ombudsman Elizabeth Jensen.
Speaking for herself, not NPR, Jensen noted the irony of a public media organization removing one of its key public forums. "I think it’s disappointing that the commenting platform didn’t work the way that it could."
The big difference between letters to the editor and online commenting, of course, is moderation and selectivity. In most online forums, costs generally prohibit adequate oversight of who can comment and what they can say. And more outlets are finding that their comments fall far short of the goal of encouraging debate and civil discourse among a representative selection of users.
NPR found that only a very small and wholly unrepresentative slice of its audience was taking advantage of the comments section, Jensen explained, noting the sharp increase in inappropriate content. In one analysis of site activity, just 0.06 percent of all visitors to NPR.org in a single month submitted any comments. And more than half of all comments submitted came from a tiny group of shockingly prolific contributors who, NPR estimated, tended disproportionately to be middle-aged men.
The unexpected volume of submissions, Jensen said, also sharply increased how much NPR had to pay external monitors to manage the comments section.
"We all like to have this ideal that we can engage with readers and reporters," Jensen said. "But in reality, that just wasn't the way it was working. It didn't seem there was an easy way to fix that."
NPR has since put greater energy into building robust social media forums to pick up the slack, which Jensen said generally seem to attract a more representative population of the NPR audience and encourage civil debate.
NPR's move away from website comments is far from unique. The trend started in 2013 when Popular Science became one of the first major publications to ditch its public comment section, citing scientific studies that found that blog comments can have a profound effect on readers' perceptions of science.
A series of subsequent analyses found that when readers are exposed to uncivil, negative comments at the end of articles, they are less trustful of the main content (dubbed the “nasty effect”).
Since Popular Science’s exit from the commenting business, a slew of other media outlets -- from Reuters to Recode -- have followed suit.
"Those [social media] communities offer vibrant conversation and, importantly, are self-policed by participants to keep on the fringes those who would abuse the privilege of commenting," Reuters executive editor Dan Colarusso wrote in his company's announcement.
Vice News is among the most recent large online publications to join the no-comment club.
“Comments sections are really just a continuation of that age-old tradition of letters to the editor, a cherished part of many publications and a valuable way of creating an open dialogue between magazines and the people to whom they are ultimately accountable,” wrote Jonathan Smith of Vice News in announcing his publication's move in late 2016.
"But without moderators or fancy algorithms," he added, comments sections "are prone to anarchy. Too often they devolve into racist, misogynistic maelstroms where the loudest, most offensive, and stupidest opinions get pushed to the top and the more reasoned responses drowned out in the noise."
Of course, the vast majority of news sites continue to host loosely moderated comment sections (like this one, for instance). And a number of outlets have even attempted to grow their comment sections, instituting various techniques to encourage civil debate and ward off the ever-present army of trolls at the gates.
Until recently, the New York Times heavily moderated online comments, devoting a significant amount of in-house staff resources to ensuring conversations remained civil. The site also didn't allow commenting on articles dealing with particularly controversial issues.
In 2017, the Times site shifted gears and implemented a new system called Moderator, a machine-learning technology developed by Google. Commenting is now available on many more news and opinion articles, but open only for one day after publication. The system rates and prioritizes user comments, assigning them values based on an analysis of more than 16 million previously approved comments going back a decade.
The system builds on work the Times has done in partnership with the Washington Post and Knight Foundation in an initiative called the Coral Project, an open-source platform geared toward helping news sites accept and manage reader submissions without having to manually scour each individual comment and weed out the bad apples.