Facebook is facing multiple simultaneous controversies in the United States, particularly around disinformation, hate speech and political bias. But those issues are also playing out — sometimes in more sinister ways — around the world, including in the country where Facebook has more users than anywhere else.
In recent weeks, the company has repeatedly come under fire for its actions and policies in India, a country increasingly important to its business as it’s locked out of China and looks for future growth.
Facebook on Wednesday asked India's Supreme Court to exempt it from appearing before a New Delhi government committee investigating the company's alleged role in religious riots in the city earlier this year. The committee said Facebook had declined to appear, arguing that regulating tech platforms falls under the "exclusive authority" of India's national government.
“[The] refusal to appear before this committee is an attempt to conceal crucial facts in relation to Facebook’s role in the February 2020 Delhi communal riots,” the committee’s chairman said during a hearing last week. “This shows that Facebook has something to hide.”
The court ruled on Wednesday that Facebook’s India representatives would not be required to appear before the Delhi committee until October 15.
The company did not respond to a request for comment.
It’s the second time this month that Facebook has come under scrutiny from authorities in India.
The company’s representatives were also questioned about allegations of hate speech and political bias by an Indian parliamentary committee earlier this month, with the head of the committee, opposition politician Shashi Tharoor, tweeting that they had “unanimously agreed to resume the discussion later.”
Facebook said after the hearing that it would “remain committed to be an open and transparent platform.”
The Wall Street Journal reported last month that Facebook allowed a politician from India’s ruling party to remain on its platform even though his anti-Muslim posts flouted its rules against hate speech. The Journal cited current and former employees as saying Facebook’s public policy head in India, Ankhi Das, opposed removing the politician because doing so would hurt its business interests in the country.
Facebook finally banned the politician, Raja Singh, earlier this month. “The process for evaluating potential violators is extensive and it is what led us to our decision to remove his account,” a company spokesperson told CNN Business.
India is one of Facebook’s most important markets, where it has more users than anywhere else in the world. The company has rushed to cash in on India’s digital boom in recent years, with more than 600 million internet users in the country and nearly an equal number yet to come online.
Earlier this year, Facebook poured $5.7 billion — one of its biggest investments ever — into an internet company owned by India’s richest man. And India became the first country to get its new video service, Instagram Reels, in late June, days after the Indian government banned rival app TikTok amid a military dispute with China, home to TikTok’s parent company.
Despite the country's importance to its business prospects, Facebook's tenure in India has been marked by a series of controversies and clashes with authorities. In 2016, the Indian government blocked Free Basics, Facebook's plan to offer free internet access to millions of Indians, on the grounds that it violated the principles of an open internet. More recently, after viral rumors spread on its messaging service WhatsApp led to more than a dozen lynchings in 2018, the government demanded that individual messages be made traceable. Facebook has pushed back, arguing that breaking encryption would compromise users' privacy.
The company is also in discussions with the government over proposed regulations that would impose restrictions on how tech companies can store and process Indian data.
Facebook’s actions — and missteps — in India are illustrative of the issues the company faces outside its home country, particularly in non-Western emerging markets. In an internal memo obtained by BuzzFeed News, a former Facebook data scientist outlined several instances in which the company was slow to clamp down on abuse of its platform by politicians in countries such as Honduras and Azerbaijan.
The memo reportedly mentioned a network of “more than a thousand actors” working to influence local elections in New Delhi in February.
Facebook did not immediately respond to a request for comment from CNN Business, but told BuzzFeed News its teams had taken down more than 100 networks around the world for abusing its platform.
“Working against coordinated inauthentic behavior is our priority, but we’re also addressing the problems of spam and fake engagement,” Facebook spokesperson Liz Bourgeois told BuzzFeed News. “We investigate each issue carefully… before we take action or go out and make claims publicly as a company.”
Facebook has faced accusations of failing to adequately curb hate speech in several of India's neighbors, including Sri Lanka and Myanmar. In Myanmar, the company acknowledged two years ago that it was "too slow" to prevent the spread of "hate and misinformation" that led to widespread violence against the country's Rohingya Muslim minority.
Facebook has previously shown a tendency to take action in the developing world only when faced with questions from international media or global agencies such as the United Nations, said Nikhil Pahwa, founder of Indian tech news website MediaNama and a digital activist who was at the forefront of India’s 2016 fight against Free Basics.
“Historically, platforms tend to act late. They tend to first allow a problem to develop and then respond to that problem instead of acting on it in the first instance,” Pahwa, who testified in front of both the parliamentary and Delhi committees, told CNN Business. “At some point in time they have to be held accountable for not acting, and I think we’re coming to a situation across the globe where that is becoming a cause for concern.”
India's status as Facebook's biggest market by users, combined with the Indian government's increasing willingness to place restrictions on foreign companies that don't comply with its rules, means the company has to walk a tightrope in the country. And with the current issues around hate speech, Facebook's own policies make that balance even more delicate.
“Political leaders are influential speakers and are especially dangerous when they incite violence. Unfortunately they also control companies’ access to markets, and India is a big market for Facebook,” said Chinmayi Arun, a fellow at Yale Law School whose work focuses on internet governance. “The company needs to work out how to stay committed to its policies against incitement to violence, despite the risks to its business that come from antagonizing political leaders,” she added.
Several of the problems that Facebook is grappling with in the United States have existed for much longer in developing countries, said Mishi Choudhary, co-founder and legal director of New York-based tech advocacy group Software Freedom Law Center. Choudhary cited the company’s track record in countries like Brazil, Myanmar and more recently in Cambodia as examples of the company’s “lackadaisical approach” outside the West.
“Problems that have been rampant in the Global South were constantly being ignored by Facebook until the 2016 US elections,” she said. “They love patronizing us and play feel good stories about connecting the world.”
Choudhary and Pahwa both say Facebook’s issues in India are symptomatic of a larger global debate on how accountable tech companies are for content on their platforms — a debate that is also playing out in the United States. Facebook, Twitter and other social networks’ status as intermediaries allows them to facilitate speech without being held liable for what is said.
But the Indian government is now trying to change that. Proposed amendments to the country's technology laws would force social networks and messaging platforms to trace individual messages that the government deems a threat. The proposed changes would also require social networks such as Facebook and Twitter to take down "unlawful" content within 24 hours.
“When you are consciously choosing to not act on repeated violations, which are leading to harm, then it’s a choice you’re making, and somewhere law is going to catch up with you,” said Pahwa. “The gap between responsibility and liability is going to get filled with regulation. And that’s what’s happening right now.”
CNN’s Vedika Sud and Swati Gupta contributed reporting.