Update: Facebook has acknowledged the flaw and fixed the bug. The flaw, spotted by members of a bodybuilding forum, no less, allowed Facebook users to access private photos through the report-abuse tool. Only a handful of images are presented to the user as part of the 'report' feature, which Facebook uses to maintain decency and remove harmful images, posts or content. Users are able to report "inappropriate profile photos" on a user's profile. By checking the box "nudity or pornography," the user is granted an opportunity to help Facebook "take action by selecting additional photos to include with your report."

February 12: What are Facebook's rules for posting nude images? The question is at the forefront again after a French court ruled Friday that a French art teacher can sue the social media service after it suspended his Facebook account. Although Facebook hasn't given a reason, the account suspension came after he posted an image of a classical painting featuring a female nude.

Facebook's rules on nudity have evolved over time. The latest community-standards policy, from March 2015, says Facebook restricts photos of genitals or fully exposed buttocks, as well as some images of breasts if they include the nipple. But Facebook says it allows photos of women actively engaged in breastfeeding or showing breasts with post-mastectomy scarring.

Breastfeeding moms protested when images were pulled. In 2008, more than 11,000 people staged a virtual "nurse-in," replacing their profile photos with nursing ones. It's not clear when Facebook's policy changed internally, but about two years ago the policy wording changed to specifically allow photos of nursing mothers.
Facebook is rolling out technology to make it easier to find and remove intimate pictures and videos posted without the subject's consent, often called "revenge porn." Currently, Facebook users or victims of revenge porn have to report the inappropriate pictures before content moderators will review them. The company has also suggested that users send their own intimate images to Facebook so that the service can identify any unauthorized uploads. Many users, however, balked at the notion of sharing revealing photos or videos with the social-media giant, particularly given its history of privacy failures.

The company's new machine-learning tool is designed to find and flag the pictures automatically, then send them to humans to review. Facebook and other social media sites have struggled to monitor and contain the inappropriate posts that users upload, from violent threats to conspiracy theories to inappropriate photos. Facebook has faced harsh criticism for allowing offensive posts to stay up too long, for not removing posts that don't meet its standards, and sometimes for removing images with artistic or historical value.

Facebook has said it has been working to expand its moderation efforts, and the company hopes its new technology will help catch some inappropriate posts. The technology, which will be used across Facebook and Instagram, was trained using pictures that Facebook had previously confirmed were revenge porn.
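Facebook has not published details of how the tool is built, but the flow it describes, a classifier flags likely matches and people make the final call, follows a familiar pattern. The sketch below illustrates that pattern only; the names (score_image, REVIEW_THRESHOLD, ReviewQueue) and the threshold value are illustrative assumptions, not Facebook's actual system.

```python
# Hypothetical sketch of a "flag automatically, review by humans" pipeline.
# None of these names come from Facebook; they are stand-ins for illustration.

from dataclasses import dataclass, field
from typing import List

REVIEW_THRESHOLD = 0.8  # assumed confidence cutoff for escalating an image


def score_image(image_bytes: bytes) -> float:
    """Stand-in for a trained classifier returning the estimated probability
    that an image is non-consensual intimate imagery. Per the article, a real
    model would be trained on previously confirmed examples."""
    return 0.0  # placeholder: a real model would actually inspect image_bytes


@dataclass
class ReviewQueue:
    pending: List[bytes] = field(default_factory=list)

    def escalate(self, image_bytes: bytes) -> None:
        # Hand the image to a human moderator rather than acting on it
        # automatically: the model only flags, people decide.
        self.pending.append(image_bytes)


def triage(image_bytes: bytes, queue: ReviewQueue) -> None:
    """Flag a newly uploaded image for human review if the model is
    sufficiently confident; otherwise take no automatic action."""
    if score_image(image_bytes) >= REVIEW_THRESHOLD:
        queue.escalate(image_bytes)
```

The key design point the article implies is that the model never removes content on its own; it narrows the set of images human reviewers must look at.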
Facebook is allowing its staff to look at nude photos of its users in an attempt to combat revenge porn. The site has told its users to send in any photos that they fear might be circulated on the site. Those images will then be viewed by Facebook's own staff to verify them, and if they are judged to be legitimate, they will be blocked from being shared on the site.

In a post aimed at clarifying the facts around the new initiative, Facebook makes clear that those images will only be seen by "a specially trained representative from our Community Operations team," though it does confirm that all of the images will be looked at by its own staff. The post then lays out the details of the plan to stop the sharing of non-consensual intimate images, many of which had already been reported in the press.
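The article doesn't say how a verified image is subsequently blocked. A common way to do this without retaining the photo itself is to store only a fingerprint (hash) of the image and compare future uploads against it; whether Facebook's program works exactly this way is an assumption here. The sketch below uses SHA-256, which only matches byte-identical files; production systems typically use perceptual hashes (PhotoDNA-style) so that resized or re-encoded copies still match.

```python
# Hypothetical sketch of hash-based re-upload blocking. This is one plausible
# mechanism, not a description of Facebook's actual implementation.

import hashlib
from typing import Set

blocked_hashes: Set[str] = set()


def fingerprint(image_bytes: bytes) -> str:
    """Derive a fixed-size fingerprint of the image. SHA-256 is used here
    for simplicity; a perceptual hash would tolerate re-encoding."""
    return hashlib.sha256(image_bytes).hexdigest()


def ban_image(image_bytes: bytes) -> None:
    # Called after a trained reviewer confirms the submission is legitimate.
    # Only the hash needs to be retained afterward, not the photo itself.
    blocked_hashes.add(fingerprint(image_bytes))


def is_blocked(upload_bytes: bytes) -> bool:
    """Check a new upload against the set of banned fingerprints."""
    return fingerprint(upload_bytes) in blocked_hashes
```

Under this scheme, the privacy trade-off the article describes is front-loaded: a human sees the image once during verification, after which only the fingerprint is needed to stop it from being shared again.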