Facebook says footage of the Christchurch shooting was viewed just 200 times during its live broadcast, and 4,000 times in total before it was removed. The company also says it received no user reports about the live stream until 12 minutes after the broadcast had ended.
In the 24 hours following the initial broadcast, individuals attempted to re-upload the video 1.5 million times. The disparity highlights the specific challenge facing the social network, especially when an attack like Christchurch is engineered to go viral: videos of the attack were cross-posted across various social networks, and links to the live stream and a manifesto were posted on 8chan prior to the attack.
Despite Facebook’s claims, there are reasons to treat these numbers with skepticism. The platform’s viewing figures are currently the subject of a lawsuit accusing the social network of inflating its video metrics in an attempt to lure advertisers onto its platform.
The platform’s account of the timing of the first user report has also been disputed by Right Wing Watch researcher Jared Holt, who says he reported the video shortly after stumbling upon it via its 8chan thread. Facebook also states that it ultimately removed the video in response to contact from the New Zealand Police rather than a user report, although it says it did so “within minutes” of being contacted.
Now the challenge for Facebook is preventing the video from being re-uploaded. The company says it has generated a hash of the video to block unedited copies from being posted, and is using audio recognition to identify versions that have been screen-recorded. Of the 1.5 million attempted uploads, 1.2 million were blocked within the first 24 hours, but that means 300,000 copies still slipped through Facebook’s filters.
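Facebook hasn’t detailed how its matching system works. As a rough illustration of why a hash only catches *unedited* copies, here is a toy sketch using a cryptographic hash over the file bytes: any byte-identical re-upload matches the blocklist, while any edit (a trim, a re-encode, a screen recording) produces a different hash and evades it. Real systems use perceptual or audio fingerprints designed to survive such changes; the names below (`BLOCKLIST`, `fingerprint`, `is_blocked`) are invented for this example.

```python
import hashlib

# Hypothetical exact-match blocklist: stores hashes of known banned files.
# A cryptographic hash only matches byte-identical copies, which is why
# edited or screen-recorded versions slip past this kind of filter.
BLOCKLIST = set()

def fingerprint(data: bytes) -> str:
    """Return a SHA-256 hex digest of the raw file bytes."""
    return hashlib.sha256(data).hexdigest()

def block(data: bytes) -> None:
    """Add a file's fingerprint to the blocklist."""
    BLOCKLIST.add(fingerprint(data))

def is_blocked(data: bytes) -> bool:
    """Check an uploaded file against the blocklist."""
    return fingerprint(data) in BLOCKLIST

original = b"original video bytes"
block(original)
print(is_blocked(original))            # byte-identical re-upload: caught
print(is_blocked(original + b"edit"))  # any edit changes the hash: missed
```

The gap between the two checks is exactly the gap the article describes: hashing stops verbatim copies, so audio recognition or perceptual matching is needed for the altered versions.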
Facebook isn’t the only social network working to stop the spread of footage of the attack. YouTube, Twitter, Reddit, and even Steam have all been working to remove the content or tributes to the attacker.