The Dark Side Of Facebook - How Moderators Keep Facebook Clean
We talk a lot about Facebook’s algorithm and News Feed policies, but we rarely think about what Facebook’s moderators are doing behind the scenes.
Facebook moderators review content on the platform to filter out inappropriate, graphic, and illegal material.
Without the hard work of the moderators, Facebook would not be the place that so many of us love. This infographic from Who Is Hosting This explains how moderators work to keep the dark side of the internet out of Facebook.

Facebook’s moderators have a big task on their hands. Facebook has approximately 1.35 billion monthly active users, who share around 4.75 billion posts each day. On top of that, about 300 million photos are uploaded and more than 700,000 comments are posted every day. That’s a lot of content to moderate.
The Facebook moderation team consists of 800-1,000 professional moderators located all over the world. Altogether, they speak 24 different languages, and many of them are based in India and the Philippines.
Facebook moderators are looking to remove content in a few different categories: inappropriate sexuality, graphic content, illegal activity, hate and harassment, and access problems. Within the category of inappropriate sexuality, they’ll remove pedophilia, necrophilia, and nudity.
Graphic content includes animal abuse, murder, suicide, beheadings, and other graphic violence. Under illegal activity, they monitor threats, harassment, domestic violence, and drug activity. Moderators also handle access problems such as hacked accounts and imposters.
The moderating process begins with user-generated reports. Facebook users can flag any content they believe is inappropriate and mark it as harassment, hate speech, spam, sexually explicit, violent or harmful behavior, or simply “I don’t like this post.” An automatic algorithm then sorts flagged posts into categories.
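To make that triage step concrete, here is a minimal, purely hypothetical sketch of how a user-selected flag reason might be routed to a review queue. The queue names, mapping, and function are assumptions for illustration only and do not reflect Facebook’s actual systems.

```python
from enum import Enum


class FlagReason(Enum):
    """Reasons a user can select when reporting a post (per the article)."""
    HARASSMENT = "harassment"
    HATE_SPEECH = "hate speech"
    SPAM = "spam"
    SEXUALLY_EXPLICIT = "sexually explicit"
    VIOLENT_OR_HARMFUL = "violent or harmful behavior"
    DISLIKE = "I don't like this post"


# Hypothetical mapping from a user-selected reason to a moderation queue.
REVIEW_QUEUES = {
    FlagReason.HARASSMENT: "hate_and_harassment",
    FlagReason.HATE_SPEECH: "hate_and_harassment",
    FlagReason.SPAM: "spam",
    FlagReason.SEXUALLY_EXPLICIT: "inappropriate_sexuality",
    FlagReason.VIOLENT_OR_HARMFUL: "graphic_content",
    FlagReason.DISLIKE: "low_priority",
}


def sort_flagged_post(post_id: str, reason: FlagReason) -> str:
    """Route a flagged post to the queue a human moderator will review."""
    queue = REVIEW_QUEUES[reason]
    print(f"Post {post_id} routed to queue '{queue}'")
    return queue


# Example: a post reported as hate speech lands in the hate-and-harassment queue.
sort_flagged_post("post_123", FlagReason.HATE_SPEECH)
```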
Within approximately 72 hours, a professional moderator will review these flagged posts. They may choose to allow the content, delete it, or escalate the issue to a moderation team based in California.
When deleting a post, moderators may also issue a warning to the poster or, in the case of serious infractions, disable the user’s account. If a moderator encounters a dangerous situation or threat, that post will be escalated, and the Facebook team may share it with legal authorities. Facebook has also teamed up with a safety advisory board, the National Cyber Security Alliance, and more than twenty suicide prevention agencies to deal with problematic content more effectively.
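The review step described above can be sketched the same way. Again, this is purely illustrative: the decision names and follow-up actions below are assumptions modeled on the article’s description, not Facebook’s real workflow.

```python
from enum import Enum, auto


class Decision(Enum):
    ALLOW = auto()      # content stays up
    DELETE = auto()     # content removed; may warn or disable the poster
    ESCALATE = auto()   # sent to the California-based moderation team


def review_flagged_post(decision: Decision, serious: bool = False,
                        dangerous: bool = False) -> list:
    """Return the follow-up actions for a moderator's decision."""
    actions = []
    if decision is Decision.ALLOW:
        actions.append("restore post to normal visibility")
    elif decision is Decision.DELETE:
        actions.append("remove post")
        # Serious infractions disable the account; otherwise issue a warning.
        actions.append("disable poster's account" if serious
                       else "send warning to poster")
    elif decision is Decision.ESCALATE:
        actions.append("forward to California moderation team")
        if dangerous:
            actions.append("share with legal authorities")
    return actions


# Example: a serious infraction gets the post removed and the account disabled.
print(review_flagged_post(Decision.DELETE, serious=True))
```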
Being a Facebook moderator is a difficult job. Many moderators, particularly those working in India and the Philippines, earn low wages. It’s also a job that comes with significant psychological stress, as moderators constantly view highly disturbing content. Because of this, the average employment duration for Facebook’s moderators is only 3-6 months. For coping with all of that stress and dealing with the content the rest of us never want to see, they certainly deserve our thanks.