Over the weekend, Facebook revamped the way its takedown policy explains the guidelines for user posts. The new guidelines are already in effect, but they don't modify the existing policy; they only clarify its terms. The same community standards still apply, but it should now be easier for users to understand why a takedown request has been approved or rejected. The guidelines on nudity, terrorism, hate speech, violence and criminal acts have been detailed so that users reading them get a clear, concise picture of what they are permitted to post and what they should keep to themselves. Facebook has also introduced a tool that lets its employees add warning banners to content unsuitable for minors, although children's safety advocates argue that the social network should not restrict the tool to Facebook employees, but make it available to the users doing the posting.
According to the new guidelines, some nudity is still permitted as long as it is "decent": photos of breasts, for example, are allowed provided the nipple isn't visible, while photos of buttocks and other sensitive areas are not. Facebook also clarified that these rules apply to digital images as well, unless the nudity serves an explicit purpose such as satire, criticism, art or education. The changes aim to strengthen community standards and raise awareness of what is suitable for a social network and what is not. Content depicting sexual acts in vivid detail remains banned from Facebook, but the guidelines are now clear enough that all posters can understand what the rules cover.
Alongside the standards regulating nudity, the new guidelines also explicitly ban terrorist organizations from the entire social network, and they go on to explain that even expressing support for such groups, individuals or their acts is completely prohibited, a clarification many have welcomed. Bullying remains banned, and the company has clarified that manipulated images intended to degrade individuals are also forbidden. Celebrating or praising criminal acts beyond terrorism is likewise banned, although Facebook notes that discussing the legality of acts considered criminal is still permitted under community standards.
The tool Facebook uses to flag violent content has drawn some debate in security and privacy circles, particularly among children's safety advocates. Although the flags have been in use since January, Facebook still hasn't made the tool available to community members. Advocacy organizations say the network should do so, letting users flag their own content and prevent flagged videos from auto-playing in news feeds. Monika Bickert, Facebook's global head of content policy, told the BBC that the company has no way of offering the tool to users right now, but is considering the possibility for the future. At the moment, the company adds these flags only after a post has been reported, which is hardly the ideal solution.
Although the community standards and takedown guidelines themselves haven't changed, many felt the added clarifications were necessary to avoid confusion about why some takedown requests were not honored. Facebook told the BBC that this confusion was the main reason it updated and expanded its guidelines. You can read the new guidelines on the network's Community Standards page.