Meta has responded with concrete actions to the latest allegations of child abuse on its social networks, especially Instagram and Facebook. The company led by Mark Zuckerberg announced tighter restrictions on who can send messages to minors. It is also rolling out new features that give parents more control over the settings of their children’s profiles.
Meta explained this Thursday that, by default, users under 16 years of age (or under 18 in some countries) no longer receive messages on Instagram and Messenger from people they do not follow or are not connected with. Nor can they be added to group chats by strangers.
On Messenger specifically, children and teens will only receive messages from Facebook friends or from people they are connected to through their phone contacts. The new measure “will help teens and their parents feel even more confident that they won’t hear from people they don’t know,” Meta said in a statement.
The European Commission has reprimanded Meta on several occasions for the lack of effective measures to prevent child abuse on its platforms. Last December, the body sent a formal request for information specifically related to the circulation of child pornography on Instagram. It also referred to complaints about how the Instagram and Facebook algorithms facilitate the creation of pedophile networks.
Meta’s management is under scrutiny after an investigation by the Wall Street Journal, conducted in collaboration with Stanford University, warned last year that Instagram even allowed hashtags that made clear reference to pedophilia. These hashtags linked to accounts that advertised the sale of child sexual abuse material.
Another report from the same newspaper found that the same thing was happening on Facebook. It uncovered public groups that openly discussed sex with minors. The investigation also revealed how the algorithm recommended similar groups, some with between 200,000 and 800,000 members, to users.
As a result, Meta said in December that it would expand the list of phrases and emojis related to child abuse to curb this content. The technology company also indicated that it was using machine learning tools to detect connections between different search terms.
Along with the new messaging restrictions, Meta reported that parents will now have to give their approval when a minor wants to switch their Instagram account from private to public. The same will apply when their children want to change the sensitive content control category or their direct message settings.
This will be possible on profiles that have parental supervision activated, a tool Meta released in March 2022. Instead of receiving a simple notification, parents will now have the option to approve or reject these changes.
Social media executives will be held accountable
The new features “empower parents” and give them “the tools they need” to help protect their children, said Larry Magid, CEO of ConnectSafely, a nonprofit dedicated to internet safety. Magid stressed that these measures, at the same time, protect “the privacy of their adolescent children and their ability to communicate with their friends and family.”
Meta’s new features arrive at an opportune moment. Next week, Mark Zuckerberg will testify before the United States Senate about the problem of online child exploitation. The chief executives of X (formerly Twitter), TikTok, Snap and Discord have also been called to the session, scheduled for January 31.
Meta also faces lawsuits from dozens of US states for allegedly fueling a youth mental health crisis. The prosecutors behind the action contend that Zuckerberg’s company has repeatedly misled the public about the substantial dangers of its platforms and that, despite knowing the risks, it has induced compulsive use of its apps among children and adolescents.