Facebook Workers

If you've ever wondered what kind of things Facebook does not want on its site, you're in luck. For the first time in its history, the social network has published 27 pages of detailed guidelines on what can and cannot appear on its website.

So if you do not want to suffer the consequences, do not make credible violent threats or revel in sexual violence; do not promote terrorism or the poaching of threatened species; do not try to buy marijuana, sell firearms, or list prescription drugs for sale; do not publish instructions for self-harm; do not show minors in a sexual context; and do not commit multiple homicides at different times or places.

Facebook had already banned most of these actions on its previous "community standards" page, which broadly outlined the social network's rules. But the document published on Tuesday adds details that can at times be graphic.

The updated guidelines reflect the criteria used by its 7,600 moderators to review questionable posts and decide whether to remove them and, on occasion, whether to alert the authorities.

The criteria themselves do not change, but the details reveal interesting information. Photos of breasts are allowed in some cases, such as in images of breastfeeding or in paintings, but not in others. The document details what it considers sexual exploitation of adults or minors, but leaves room to prohibit other forms of abuse should they arise.

Since Facebook does not allow the presence of serial killers, the new guidelines define the term: anyone who has committed two or more murders in "multiple incidents or locations." People who have committed a single homicide are not excluded. After all, it could have been in self-defense.

The review of the guidelines gives an idea of how difficult the work of a Facebook moderator must be. These people must read and watch questionable material of all kinds and make complicated decisions, such as whether a video glorifies eating disorders or simply seeks to help people, when a joke crosses the line into harassment, when a theoretical reflection becomes a direct threat, and the like.

The moderators operate in 40 different languages. Facebook's goal is to respond to reports of questionable content within 24 hours, although it says it does not impose quotas or time limits on its reviewers.

The social network has made several high-profile mistakes in recent years. For example, human rights groups say that Facebook has responded inadequately to hate speech and incitement to violence against Muslim minorities in Myanmar.

In 2016, Facebook backtracked after removing an iconic 1972 photograph by The Associated Press showing a naked, screaming girl running from a napalm attack in Vietnam. At first, the company insisted that it could not make an exception for a particular photograph of a nude minor, but it soon reversed course, noting that the image had great "global importance."