As a Facebook scam-busting site, we often ask our readers to report scams, spam and other offensive content. Have you ever wondered what happens once you click the report button? We certainly have, and we have often wondered what, if anything, gets done by Facebook in return. In all fairness, policing a site with almost a billion members is no small feat!
Today, Facebook Safety posted a note and an infographic detailing the internal teams, guidelines and workflows involved in the Facebook reporting process. (The image is rather large, so you may need to download it and open it in an image viewer to see it properly.)
Facebook has hundreds of moderators, based in four centers, who evaluate content against its established community standards. Reports are routed to one of four distinct teams:
- Safety Team – Violence and Harmful Behavior
- Hate and Harassment Team – Hate Speech
- Abusive Content Team – Scams, Spam and Sexually Explicit Content
- Access Team – Hacked and Imposter Accounts
The Safety Team will contact law enforcement authorities when credible threats of violence are present.
The infographic also details how Facebook utilizes 'Social Reporting' – a tool that "enables people to report problematic content not only to Facebook, but also directly to their friends to help resolve conflicts. Additionally, people can use the tool to reach out to a trusted friend who may understand the offline context of the situation and be able to assist."
It is important to note that Facebook doesn't go it alone. The company has partnered with over 30 agencies and organizations that help in the prevention and aftermath of reported issues. These include Facebook's Network of Support, Lifeline and its Global Suicide Prevention Community, the Safety Advisory Board, and the NCSA.