Do you post your child’s photos on Facebook? How do you know that she is safe on this app? Learn about Facebook’s safety features against child exploitation.
What you can read in this article:
What is NCMEC?
Safety features Facebook is implementing
What you can do to help protect children from exploitation on Facebook
As the biggest social media platform, Facebook deals with an enormous volume of child sexual abuse material, trafficking, and sexual exploitation. For this reason, the company has taken major steps to protect children and minors on its website and mobile app.
Facebook has partnered with the National Center for Missing & Exploited Children (NCMEC) to better understand the problem and to train its content moderators on what to look for when reviewing content flagged by Facebook's AI.
The National Center for Missing & Exploited Children is a private, non-profit 501(c)(3) corporation whose mission is to help find missing children, reduce child sexual exploitation, and prevent child victimization.
NCMEC works with families, victims, private industry, law enforcement, and the public to assist with preventing child abductions, recovering missing children, and providing services to deter and combat child sexual exploitation.
Safety features Facebook is implementing
Image from Unsplash.
Facebook uses a multi-pronged approach to monitor and moderate its content. Here are some of the safety features that they implement:
Policies for the Facebook community
Tools to give people control
Resources at every point in the service
Partnerships to complement their expertise
Feedback to keep on improving
Facebook has a diverse team of experts working to keep users safe. More than 35,000 people work on security and safety around the world, including experts in law enforcement, counter-terrorism, anti-human trafficking, child protection, online safety, analytics, engineering, and forensic investigation.
The team's previous positions and experience include former FBI agents, human rights experts, security engineers, US Marine Corps officers, Indian Army captains, Australian Federal Police officers, data scientists, and more.
What does their AI do?
Image from Unsplash.
For years, they’ve used technology to find child exploitative content and detect possible inappropriate interactions with children or child grooming. But they’ve expanded their work to detect and remove networks that violate child exploitation policies, similar to efforts against coordinated inauthentic behavior and dangerous organizations.
In addition, they’ve updated their child safety policies to clarify that they will remove Facebook profiles, pages, groups, and Instagram accounts that are dedicated to sharing otherwise innocent images of children with captions, hashtags, or comments containing inappropriate signs of affection or commentary about the children depicted in the image.
They’ve always removed content that explicitly sexualizes children, but content that isn’t explicit and doesn’t depict child nudity is harder to define. Under this new policy, while the images alone may not break their rules, the accompanying text can help them better determine whether the content is sexualizing children and if the associated profile, page, group, or account should be removed.
How to report child exploitation on Facebook
After consultations with child safety experts and organizations, they’ve made it easier to report content for violating their child exploitation policies. To do this, they added the option to choose “involves a child” under the “Nudity & Sexual Activity” category of reporting in more places on Facebook and Instagram.
These reports will be prioritized for review. They also started using Google’s Content Safety API to help them better prioritize content that may contain child exploitation for content reviewers to assess.
What you can do to help protect children from exploitation on Facebook
Image from Unsplash.
According to their data, more than 90% of this content consists of shares or reshares of previously reported content. The majority comes from a handful of countries concentrated in certain regions.
More than 75% of these reports involve people sharing with "non-malicious intent" (per NCMEC: poor humor, outrage, or gawking).
The first step is not to share the content at all, even out of anger. Sharing only gives the people who posted the content what they want in the first place: wider distribution.
The second step is to immediately report any post you think is exploiting children. This allows Facebook and its team to find out about the content right away and take it down at the source. So remember: report instead of sharing. That is the best thing you can do for these children.
How to get your child’s image removed on Facebook
If the photo violates the Facebook Terms and Community Standards, you can report it to them by doing the following steps:
Click on the photo
Beside the name of the person or the page who uploaded the photo, you can see the three dots. Click on that.
In the drop-down menu, click on "Find support or report photo."
If you believe a photo violates your child’s privacy, please review their information on image privacy violations.
You can read more about Facebook’s safety features for minors and how you can protect your child on the app here.
Screenshot from Facebook
Tips on keeping your child safe on Facebook
Facebook's safety features aim to ensure that children and minors are protected on the platform, but we also have to do our part as parents to make sure our children stay safe on Facebook and on other social media apps.
Here are some things we can do:
Start a conversation with your child early, before they are on social media.
Some studies have found that children as young as 6 have access to smartphones or tablets. It's better to start talking to your child about technology and social media before they turn 13, the age at which they are allowed to join.
If you have a teenager on Facebook or Instagram, ask if you can friend or follow them.
Refrain from posting private information and inappropriate photos of you or your child online.
Don’t put yourself in a sticky situation where you or your child can be exploited.
Be mindful of age restrictions.
Those rules exist for a reason: to keep your child safe. Facebook and Instagram require everyone to be at least 13 years old before creating an account (in some countries, this age limit may be higher depending on local laws).
Let your teen know that the same rules apply online as apply offline.
If the rule “Don’t talk to strangers” applies in person, it is also applicable online. Remind them to be cautious when adding or talking to people on Facebook. You can also teach them about proper social media etiquette.
Help them manage their time online.
Try to be a good role model for your kids. If you set time restrictions on when your teen can use social media or be online (for example, no texting after 10:00 PM), be prepared to follow the same rules yourself.
Help them to check and manage their privacy settings.
Before you let your child spend time on Facebook or any social media app, it’s better if you navigate it together to make sure she understands the rules and knows how to protect herself from being exploited.
On Facebook, remind her to set her account to private and teach her how to make the privacy settings work for her safety and advantage.
Tell them to report if they see something they are concerned about.
Encourage your child to be proactive and trust her gut if she feels there is something wrong about a post, or something fishy about a person trying to add her on Facebook. Teach her how to use the report function.
Keep the communication lines open.
You can make social media use a shared experience by capturing family moments on video or in photos and having fun together editing, adding filters, and using augmented reality features like bunny ears!
While it's okay to give her some space to explore Facebook by herself, let your teenager know that she can always come to you for help when she feels something is wrong, online or offline.