Facebook is testing tools to combat child sexual abuse


Facebook has been under pressure to do more to get rid of images of child sexual abuse.

James Martin / CNET

Facebook is testing new tools aimed at blocking searches for photos and videos that contain child sexual abuse and preventing the sharing of such content.

“Using our apps to harm children is abhorrent and unacceptable,” Antigone Davis, who oversees Facebook’s global safety efforts, said in a blog post Tuesday.

The move comes as the social network faces growing pressure to address this problem amid its plans to enable default encryption for messages on Facebook Messenger and Facebook’s Instagram photo service. End-to-end encryption would mean that no one other than the sender and the recipient could see those messages, including Facebook and law enforcement officers. Child safety advocates have raised concerns that Facebook’s encryption plans could make it harder to catch child predators.

The first tool Facebook said it’s testing is a pop-up message that appears if users search for a term associated with child sexual abuse. The notice asks users if they want to continue and includes a link to organizations that work to rehabilitate offenders. The message also states that child sexual abuse is illegal and that viewing these images can lead to imprisonment.


Facebook users who search for terms associated with child sexual abuse content will see this pop-up notice urging them not to view such images and to seek help.

Facebook

Last year, Facebook said, it analyzed the child sexual abuse content reported to the National Center for Missing and Exploited Children. The company found that more than 90% of the content was the same as or similar to previously reported content. Copies of six videos accounted for more than half of the child exploitative content reported in October and November 2020.

“The fact that only a few pieces of content were responsible for many reports suggests that a greater understanding of intent could help us prevent this revictimization,” Davis wrote in the blog post. The company also conducted a separate analysis, which showed that users were sharing these images for reasons other than harming the child, including “outrage or in poor humor.”

The second tool Facebook said it’s testing is an alert that will notify users who try to share these harmful images. The safety alert tells users that if they share this type of content again, their account may be disabled. The company said it’s using the tool to help identify “behavioral signals” of users who might be at greater risk of sharing this harmful content. This will help the company “educate them on why it is harmful and encourage them not to share it” publicly or privately, Davis said.

Facebook also updated its child safety policies and reporting tools. The social media giant said it will pull down Facebook profiles, Pages, groups and Instagram accounts that are dedicated to sharing otherwise innocent images of children with captions, hashtags or comments containing inappropriate signs of affection or commentary about the children depicted. Facebook users who report content will also see an option to tell the social network that the photo or video “involves a child,” allowing the company to prioritize it for review.

During the coronavirus pandemic, online images of child sexual abuse have increased, according to a January report by Business Insider. From July to September, Facebook detected at least 13 million of these harmful images on its main social network and Instagram.
