Mark Zuckerberg announced a new vision for Facebook two weeks ago, one centered on privacy. It is not an unreasonable approach for a company that has been dogged by scandals over its handling of user data.

But the plan to move Facebook from a public to a more private platform would make it more difficult, perhaps impossible, to stop the spread of horrific videos like those shot by the suspect in the New Zealand terrorist attack that left dozens dead.

The heart of Facebook’s privacy plan is to add end-to-end encryption to its various messaging platforms. The tension between privacy and encryption on one side and surveillance and moderation on the other is one of the great quandaries of the digital age, and Facebook’s unprecedented reach has made it the focal point of the debate.

On Sunday, Facebook said it had teams across the world working around the clock to stop the spread of the gruesome video posted by the suspect in the New Zealand attack on Friday. In the first 24 hours alone, Facebook said it removed 1.5 million copies of the video from its platform.

But on Monday, the video was still spreading freely on WhatsApp, the encrypted messaging service acquired by Facebook in 2014.

“People’s private communications should be secure. End-to-end encryption prevents anyone — including us — from seeing what people share on our services,” Zuckerberg wrote two weeks ago as he outlined his vision for the future of Facebook. Much of his plan for the company is to emulate the privacy provided by WhatsApp.

Zuckerberg said that over the past 15 years, Facebook and Instagram had become “the digital equivalent of a town square,” but that people also wanted to connect privately in “the digital equivalent of the living room.”

The challenge, however, is that many of these living rooms are large, with dozens or even hundreds of members, many of them strangers. Places like WhatsApp’s private groups are becoming the new digital town squares, just smaller and with less oversight.

In India, where WhatsApp has more than 200 million users, viral false WhatsApp messages accusing people of child abduction were blamed for more than a dozen lynchings last year. The company has taken steps to tackle the spread of disinformation on its platform ahead of the election in India later this year.

Much of the disinformation on WhatsApp is thought to spread in chat groups, many of which have hundreds of members — those larger living rooms. A WhatsApp spokesperson pointed CNN to statistics showing that 90% of messages sent on the platform are sent between two people, and that the majority of chat groups have fewer than 10 members.

WhatsApp did not say how many users are members of groups that have dozens, or hundreds, of people.

The spokesperson said the company condemned the “horrific terrorist attack” in New Zealand, and that it has put limits on how widely users can share messages in a bid to crack down on the spread of false information. Since the change, there has been a 25% decrease in the number of forwarded messages, according to the company.

The company says it removes more than two million accounts a month for engaging in bulk or automated behavior.

Several WhatsApp users who spoke to CNN Business on Monday said they had received the video in WhatsApp chat groups. By design, WhatsApp has no way of tracking or preventing the spread of the New Zealand video.

With his new plan, Zuckerberg wants Facebook to emulate the privacy provided by WhatsApp’s encryption features.

“People should expect that we will do everything we can to keep them safe on our services within the limits of what’s possible in an encrypted service,” he wrote.

Antonio García Martínez, a former Facebook employee, criticized Zuckerberg’s pivot to privacy on Twitter, claiming it was a “move to get out from under the content moderation onus, and simply write off dealing with the issue.”

Companies such as Apple already have end-to-end encryption in place, to the dismay of some law enforcement officials. However, their messaging products aren’t typically used for the same kind of mass communication.

Much of the criticism Facebook has taken for its role in the spread of disinformation and hate speech depends on researchers, the media and lawmakers being able to see what is shared, because it is public. While the shift to privacy will be welcomed by many, it will make the scale of Facebook’s problems harder to quantify and make it more difficult to hold the company to account.

Some of those same critics also blast Facebook for not protecting user privacy enough. For many at Facebook, there may be a sense that they’re damned if they do and damned if they don’t.

Update: This story has been updated to add more context from WhatsApp about its efforts to fight misinformation.