How 11 People Try To Stop Fake News In World’s Largest Election

Employees work at Boom Live’s office in Mumbai

One of the operations most vital to Facebook at this moment is a world away from its Menlo Park, California, headquarters, and in more ways than one. Instead of the sprawling roof gardens and upscale cafes packed with Silicon Valley’s latest health fads, this cramped Mumbai office has worn carpets and fading walls lined with exposed electrical ducts. This is Boom Live, one of seven tiny fact-checking firms at the heart of Facebook’s efforts to rebuild some of its credibility during India’s elections.

The world’s largest democracy represents a key proving ground for Silicon Valley’s battered disinformation amplifier. Based on early tallies, more than 60 percent of India’s 900 million eligible voters are expected to cast ballots between now and May 19, as the center-left Congress Party tries to seize power from the right-wing Bharatiya Janata Party. As in other elections around the world, paid hacks and party zealots are churning out propaganda on Facebook and the company’s WhatsApp messenger, along with Twitter, YouTube, TikTok, and other ubiquitous communication channels. Together with Facebook’s automated filters, Boom’s 11 fact-checkers and its similarly sized fellow contractors are the front line of the social network’s shield against this sludge.

“In a country largely driven by local and community news, we knew it was critical to have fact-checking partners who could review content across regions and languages,” Ajit Mohan, Facebook’s managing director and vice president in India, wrote in a recent company blog post.

Facebook’s third-party fact-checkers in India analyze news in 10 of its official languages, more than in any other country, according to a spokesperson.

“Fact-checking is part of a broader strategy to fight false news that includes extensive work to remove fake accounts; cut off incentives to the financially motivated actors that spread misinformation; promote news literacy; and give more context about the posts they see,” the company said in a statement.

Facebook has said that fighting misinformation is a top priority, and that it hands such critical responsibilities over to contractors to help it keep a better-informed watch around the world at all hours. Contractors also work for much less than the typical Facebook employee, can appear more objective than the company’s own employees, and can make for easier scapegoats if needed.

A visit to Boom’s offices makes clear that the scale of Facebook’s response in India so far isn’t enough. The small team appears capable and hardworking almost to a fault, but given the scale of the problem, they might as well be sifting grains of sand from a toxic beach.

“What can 11 people do,” Boom Deputy Editor Karen Rebelo says, “when hundreds of millions of first-time smartphone-internet users avidly share every suspect video and fake tidbit that comes their way?” Her team has been fact-checking for Facebook since a regional election last summer, and its workload escalated earlier this year as the current election approached.