Meta’s Ongoing Battle with Underage Users: Are They Really Protecting Children?

Lawsuit alleges that Meta 'targeted' children to use its platforms

Oh, boy. It seems like Meta just can’t catch a break when it comes to underage users on its platforms. Despite their claims of doing everything they can to protect children, it’s clear that there’s a disconnect between what they say and what’s actually happening.

Let’s start with the facts. Under the Children’s Online Privacy Protection Act of 1998, companies can’t collect personal data from kids under 13 without verifiable parental consent, which is why Instagram and Facebook both set 13 as their minimum sign-up age. But we all know that lying online about your age is a time-honored American tradition. And trust me, kids are experts at it.

Meta CEO Mark Zuckerberg even admitted during a congressional hearing that there are a “large number of people under the age of 13 who would want to use a service like Instagram.” So why are kids still able to access these platforms? And why is Meta being sued by dozens of state attorneys general for allegedly hooking children under 13?

An unsealed complaint, reported by The New York Times, sheds some light on the issue. It alleges that Meta coveted and actively pursued children under 13 as users, and failed to disable their accounts even when they were discovered. To make matters worse, the company kept harvesting those children’s data without parental consent. It’s a troubling picture, to say the least.

The complaint states that Meta’s knowledge of underage users on Instagram is an “open secret” within the company: documented, analyzed, and protected from public disclosure. That’s a level of disregard that’s hard to square with the company’s public assurances.

Now, Meta claims that they have measures in place to remove underage accounts once they’re identified. But come on, can we really trust them to police themselves effectively? Verifying the age of users online is undoubtedly a challenge, but Meta’s solutions don’t seem to be cutting it. The company’s preferred fix is federal legislation that would require app stores to get parental approval before teens download apps, but is that really enough?

Let’s not forget that research has repeatedly pointed to the harm social media can do to children. Facebook’s own leaked internal research found that Instagram, in particular, made body-image issues worse for a significant share of teenage girls. It’s clear that there’s a serious problem here.

At the end of the day, Meta needs to step up its game when it comes to protecting children on its platforms. We can’t rely on half-hearted solutions or the promises of a company that seems more focused on profits than on the well-being of its users.

What do you think? Is Meta doing enough to protect underage users? And what can we do to ensure our children’s safety in the digital age? Let’s discuss in the comments below!