Several weeks ago, a terrorist killed fifty people at the Al Noor Mosque and Linwood Islamic Centre in Christchurch, New Zealand. The horrific attack was streamed through Facebook Live and shared thousands of times through additional Facebook posts. Facebook Chief Operating Officer Sheryl Sandberg recently shared a letter with the New Zealand Herald that outlines the actions Facebook plans to take to police hate groups and hate speech.
Sandberg noted that immediately after the attack, Facebook removed the video, shut down the terrorist’s Facebook and Instagram accounts, and used AI to delete related videos. These efforts were not enough to prevent the footage from being re-uploaded to Facebook multiple times; Sandberg stated that the company had identified at least 900 different versions of the video.
The platform has consequently designated the shootings as terrorist attacks under its policies. Any Facebook user who praises or voices support for the Christchurch attacks will violate Facebook’s Community Standards and be removed from the platform. Facebook is also restricting who can “go Live”: users who have previously violated the Community Standards may lose the ability to post Facebook Live streams. The company is also investing in technology to identify hateful and violent content that slips past its initial defenses.
Facebook plans to work directly with the New Zealand government and other institutions to move forward from the terrorist attacks. Sandberg noted that the company will provide “support to four local well-being and mental health organizations” to aid victims. Facebook has also offered to work with the New Zealand government on discussions about the role social media platforms play in such tragedies.
The social media giant is also removing identified hate groups from their platform. They have already shut down Facebook pages for Australian and New Zealand groups like the Lads Society, the United Patriots Front, the Antipodean Resistance, and National Front New Zealand. Sandberg also stated that they will ban “praise, support and representation of white nationalism and separatism on Facebook and Instagram.”
Facebook already has fairly robust Community Standards. For example, a person could lose their account for using dehumanizing language that compares a group of people to insects. Facebook Vice President of Global Product Management Monika Bickert noted this past year that the company employed 7,500 people to review controversial content. However, its technology was not enough to prevent the terrorist attack from being live-streamed. Some of these measures have been deemed controversial, but Facebook believes it is its responsibility to implement such standards.