Facebook said it will no longer tolerate content that promotes division, so it is banning white nationalist and separatist messages from its platforms, Facebook and Instagram, by next week. This follows a prior ban on white supremacy, which has already been implemented.
The social network will ban praise, support, and representation of these ideologies, including specific phrases such as "I am a proud white nationalist" and "Immigration is tearing this country apart; white separatism is the only answer," as Motherboard first reported.
Facebook reached the decision on Tuesday, with COO Sheryl Sandberg among the executives involved in shaping the policy. The social network will direct users who search for or post such content to Life After Hate, a nonprofit that helps people leave hate groups.
“We decided that the overlap between white nationalism, separatism, and white supremacy is so extensive we really can’t make a meaningful distinction between them,” Brian Fishman, Facebook’s counterterrorism policy director, told Motherboard. “And that’s because the language and the rhetoric that is used and the ideology that it represents overlaps to a degree that it is not a meaningful distinction.”
One tech analyst said: "Given Facebook's difficulties in policing banned content (consider how quickly footage of the New Zealand mosque shootings proliferated across the network), it remains to be seen how effective it will be in moderating white nationalism and separatism. Indeed, it said that implied and coded messaging related to the ideologies won't be banned straight away, partly because it's more difficult to monitor the site for such content. Nor will Facebook prohibit more general material on separatism and nationalism outside of white separatism and nationalism, such as the Basque separatist movement."
Facebook will use machine learning and AI to detect white nationalist and separatist content on its platforms, Facebook and Instagram, along with a system that finds and deletes images already identified as containing hate speech. That is the same approach the company uses to curb terrorism-related messages.
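The image-matching side of such a system can be sketched in a highly simplified form: store a fingerprint of each image already removed, and compare new uploads against that blocklist. This is only an illustrative assumption about how such matching works; production systems use perceptual hashes that survive re-encoding and cropping, and all names below are hypothetical.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Return a SHA-256 digest identifying the exact image bytes."""
    return hashlib.sha256(image_bytes).hexdigest()

def is_banned(image_bytes: bytes, blocklist: set[str]) -> bool:
    """Check an uploaded image against a blocklist of known-bad hashes."""
    return fingerprint(image_bytes) in blocklist

# A previously removed image's hash is stored in the blocklist,
# so re-uploads of the identical bytes are flagged immediately.
blocklist = {fingerprint(b"previously-removed-image-bytes")}
print(is_banned(b"previously-removed-image-bytes", blocklist))  # True
print(is_banned(b"some-new-image-bytes", blocklist))            # False
```

Exact-hash matching like this only catches byte-identical re-uploads, which is why real moderation pipelines pair it with perceptual hashing and machine-learning classifiers.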
Free-speech advocates are likely to challenge Facebook's decision to remove such content, as they have previously accused the company of anti-conservative bias in this regard. Nothing has been said about WhatsApp yet.