Following the horrific terrorist attack at two mosques in Christchurch, New Zealand, Facebook announced that it deleted 1.5 million videos of the shootings in the first 24 hours after the massacre.
In a tweet late Saturday, the tech company said that 1.2 million of those videos were blocked at upload to its platform, which has more than 2.2 billion users worldwide.
“New Zealand Police alerted us to a video on Facebook shortly after the live stream commenced and we quickly removed both the shooter’s Facebook and Instagram accounts and the video,” said Mia Garlick, Facebook’s director of policy for Australia and New Zealand.
She added that Facebook is “removing any praise or support for the crime and the shooter or shooters as soon as we’re aware.”
Although the original footage of the 17-minute live stream was removed within an hour and banned by most social media sites, users have repeatedly re-uploaded it, creating a dilemma for tech companies struggling to keep the graphic content off their servers.
New Zealand’s prime minister, Jacinda Ardern, said she will speak with Facebook about efforts to stop the video’s circulation.
“This is an issue that goes well beyond New Zealand but that doesn’t mean we can’t play an active role in seeing it resolved,” said Ardern. “This is an issue I will look to be discussing directly with Facebook.”