Last Updated: January 18, 2022
Even big companies can sometimes lack good judgment and make less-than-ideal business decisions. One such example is Facebook’s latest endeavor: developing an Instagram app specifically targeted at pre-teens.
Whether we like it or not, we can’t ignore the fact that no age verification tool is foolproof, which means many children browse the internet and use various applications unmonitored. Facebook’s 2012 purchase, Instagram, is no exception. Prompted by this knowledge, Facebook seems to have adopted an “if you can’t beat them, join them” mindset and decided to develop a more closely monitored version of its star app.
Facebook’s vision for this app keeps the original premise of Instagram, with some differences: it would be available only to children under 13 and would include parental controls.
However, this effort was met with hostility by many individuals and organizations concerned with children’s safety, culminating in a letter addressed to CEO Mark Zuckerberg demanding that development of the app cease.
What Makes Instagram for Pre-teens Dangerous?
On the surface, the idea sounds solid. We can’t keep our children away from the internet, so why not acknowledge their presence and make it a safer place for them? Well, the solution isn’t so cut and dried.
It’s no secret that Instagram can have a detrimental effect on the mental health of some of its users, who are primarily adults. The idyllic lifestyle and filtered appearance many choose to present on the app create unrealistic expectations, leaving many users with disappointment, feelings of unworthiness, and a damaged sense of self-worth.
To think that this would change for a younger user base would be naive. In fact, the negative effects may be even more pronounced than on “regular” Instagram, as a younger audience is more vulnerable to them. As the children’s advocates opposing this app put it, Instagram “exploits young people’s fear of missing out and desire for peer approval.”
Encouraging children to use the app may damage their mental health and promote increased dependency on social media, which is already a problem for the younger, tech-raised generations.
All those arguments aside, perhaps the biggest reason to shelve this idea is children’s safety. Instagram already struggles to keep adults with malicious intent from contacting underage users. With that track record in mind, a photo-sharing app built for children is bound to attract unwanted attention, possibly exposing its users to online predators.
As the children’s advocates’ letter puts it, “Facebook’s long track record of exploiting young people and putting them at risk makes the company particularly unsuitable as the custodian of a photo sharing and social messaging site for children.”
Facebook has yet to announce any plans to abandon this venture.