Amid a new storm of controversy following The Wall Street Journal’s ‘Facebook Files’ exposé, Facebook has today announced that it will shelve plans for its ‘Instagram for Kids’ project for now, so that it can meet with relevant regulators and advisory groups to ‘get this right’ before proceeding.
Instagram chief Adam Mosseri posted a short video explaining the decision.
We’re pausing “Instagram Kids.” This was a tough decision. I still think building this experience is the right thing to do, but we want to take more time to speak with parents and experts working out how to get this right. pic.twitter.com/gMbPjft0CW
— Adam Mosseri (@mosseri) September 27, 2021
First reported earlier this year, the Instagram for Kids project, as detailed by Mosseri, was aimed not at children as such, but at tweens, aged between 10 and 13, who are increasingly finding workarounds to access Instagram anyway. Facebook’s view was that a special version built for younger users, which could be managed and overseen by parents, would be a better way forward than the current situation – a position the company still stands by in principle.
As per Facebook’s official announcement:
“We firmly believe that it’s better for parents to have the option to give their children access to a version of Instagram that is designed for them – where parents can supervise and control their experience – than relying on an app’s ability to verify the age of kids who are too young to have an ID.”
Various experts and officials had raised concerns about the Instagram for Kids project at the time of the initial reports, with a coalition of child safety groups penning an open letter to Facebook CEO Mark Zuckerberg calling on him to rethink the idea.
The experts highlighted the psychological harms that can be exacerbated by Instagram, and in particular the platform’s “relentless focus on appearance, self-presentation, and branding” which they claimed “presents challenges to adolescents’ privacy and wellbeing.”
These same elements were highlighted once again in the recent Facebook Files reportage, which drew on leaked internal documents showing that the company is increasingly aware of the negative impacts that Instagram usage can have on young users.
As per the report:
“32% of teen girls said that when they felt bad about their bodies, Instagram made them feel worse […] Teens blame Instagram for increases in the rate of anxiety and depression. This reaction was unprompted and consistent across all groups.”
Facebook has since disputed these claims, noting that its research actually showed that many young users felt that Instagram was helpful in many ways.
“In fact, in 11 of 12 areas on the slide referenced by the Journal – including serious areas like loneliness, anxiety, sadness and eating issues – more teenage girls who said they struggled with that issue also said that Instagram made those difficult times better rather than worse.”
The ‘slide’ Facebook is referring to is this chart, one element of the internal report:
This chart, and the title Facebook gave it, shows that the company is well aware of such concerns, but Facebook has since sought to play down these impacts, and the broader relevance of the study, which it says was only a small-scale sample, not necessarily designed to be indicative.
Still, hard questions are being asked, and with Facebook backed into a corner to some degree, it seems best for it to pause its Instagram for Kids plans, at least until it can better gauge the response, and formulate a more inclusive, considered way forward for the project.
Is that a better outcome? Well, it’s hard to say.
Certainly there’s a logic to what Mosseri and Facebook are saying: that it would be better to give parents of tweens more oversight, facilitating Instagram use without kids seeking workarounds – and full access to potentially harmful content – in other ways. In principle, most child protection groups are calling for the current restrictions on young users to be fully upheld (Instagram users must be over 13), and for Instagram to work harder to stop underage access outright. But is it realistic to expect youngsters not to at least try to access the app, or to think that Instagram and Facebook can stop them entirely?
I guess that’s the key question – should Facebook, with all its scale and resources, be able to stop youngsters from accessing its apps through improved verification and processes, which could theoretically reduce such concerns? Or should it concede that access is going to happen, and allow such harms to potentially proliferate, in some form, either way?
It seems like an alternate verification process could be implemented, at least in some capacity – but then again, if Facebook goes down that path, it could mean that all users need to provide identification in order to access Facebook’s apps, which would keep most kids out, but would also introduce a new level of enforcement and tracking.
But that then opens up a whole other can of worms for The Social Network.
Would you want to upload your personal ID to Facebook, and provide more direct tracking? Would you feel comfortable knowing that your real-world and online identities could be so closely linked?
It would almost undoubtedly lead to new types of legal enforcement for online activity, which could be a good thing, but there are also added concerns around citizen surveillance, platform access for those without official ID, government data tracking, and so on.
These are broader questions around social media access that stem from the initial discussion, and when you expand out the debate to its logical conclusion, it does suggest that an Instagram for Kids could be a better solution.
Yet, at the same time, even Facebook’s own internal data underlines the potential harms the platform can cause.
So what then? Shut down Instagram entirely? Well, that’s not going to happen, and given this, kids are going to keep trying to access the app. Instagram says that it will focus on the continued improvement of its tools to protect youngsters, including improved birth date requirements and messaging restrictions, as well as internal control tools to help people manage their on-platform experience.
But really, all of these measures can only go so far – the fact remains that social media, in general, can have significant mental health impacts, and Instagram’s visual focus makes it particularly dangerous in regards to comparison, body image, and other key aspects that can have a major effect on teens.
What’s the way forward? I don’t know, and nor does anyone else, but it’s an important issue that warrants more investigation and debate, and we should, at the least, be open to discussion around the potential of an Instagram for Kids, if we’re going to be realistic about such access.