Social media giant Facebook has been urged to scrap its plans for the Instagram Youth platform amid concerns over child health and wellbeing.
In an open letter to CEO Mark Zuckerberg, attorneys general from more than 40 US states suggested that a youth Instagram could cause health and privacy issues for children.
The letter states that the use of social media “can be detrimental to the health and well-being of children”, who are “not equipped to navigate the challenges” of being in control of social media accounts.
Social media platforms come with a raft of documented privacy concerns that young children may not understand, as well as historical cases of cyberbullying. These risks, the signatories say, would be just as prevalent in the proposed children’s version of Instagram.
“Facebook has historically failed to protect the welfare of children on its platforms,” the letter states.
“The attorneys general have an interest in protecting our youngest citizens, and Facebook’s plans to create a platform where kids under the age of 13 are encouraged to share content online is contrary to that interest.”
The letter also says that Facebook has a “record of failing to protect the safety and privacy of children on its platform,” despite claims that its products have strict privacy controls.
In response, Facebook said it would “prioritise safety, privacy and working with regulators and experts as it builds out the service”. The company also committed not to show any advertisements on the youth platform.
“We agree that any experience we develop must prioritize their safety and privacy, and we will consult with experts in child development, child safety and mental health, and privacy advocates to inform it,” the social media giant said.
Facebook has been in the spotlight for failing to protect children from online harms since at least 2018, when former health secretary Jeremy Hunt accused Facebook and Google of “turning a blind eye” to their alleged impact on young people.
Reports from 2019 revealed that Facebook’s Messenger Kids app, intended for children aged between six and 12, contained a significant design flaw that allowed children to circumvent restrictions on online interactions and join group chats with strangers that were not previously approved by parents.
Additionally, a recent “mistake” in Instagram’s algorithm promoted diet content to users with eating disorders, with the app’s search function recommending terms including “appetite suppressants” and “fasting” to vulnerable users.
“These alarming failures cast doubt on Facebook’s ability to protect children on their proposed Instagram platform and comply with relevant privacy laws such as the Children’s Online Privacy Protection Act,” the letter said.