Facebook's Kid-Safe Version of Messenger Let Kids Join Group Video Chats with Unauthorized Adults


The company won’t say how many kids it put at risk, but Facebook has spent the last week shutting down group chats in its “safe” Messenger Kids app, where unknown adults were free to mingle virtually with youngsters, including in video chats.


The Verge obtained a copy of the alert Facebook sent to affected parents.

While Facebook has yet to make any public announcement regarding this security flaw, when contacted by The Verge, the company did admit that “the alert had been sent to thousands of users in recent days,” pinning the blame on a “technical error” that affected, so it says, only “a small number of group chats.”

An alert sent to “thousands” of users doesn’t square with “a small number of group chats,” especially since each of those chats could have exposed multiple kids. Worse, Facebook has a bad history of slow-rolling the truth when it comes to its many security issues.

In April of last year, Facebook admitted that it had exposed the personal data of 87 million users to data-harvesting firm Cambridge Analytica, up from the 50 million it had admitted to just the month before. Later the firm fessed up and said that “most of its 2 billion users” were likely to have had their profiles scraped by outside firms (read: Cambridge Analytica, among others) without their permission.

Earlier this year, Facebook admitted (there’s that word again) that it had been storing some Instagram passwords unencrypted on the company’s servers. How many, exactly? Again, Facebook slow-rolled the truth. First they said that “thousands” of photo-sharing passwords had been stored in plain text, perfectly readable to anyone with access. A month later they admitted — jeez, they need a C-Level officer in charge of Corporate Culpability — the actual figure was in the millions.


How many millions? Let’s just stick a pin in that for today, with a label that says, “Developing…”

That story came hard on the heels of a similar security disaster. The company’s premier social media service, Facebook itself, had stored “hundreds of millions” of users’ passwords in plain text. If the real number turns out to have been closer to two billion, who would be surprised?

Now comes today’s news that virtually anyone could join a group chat in an app purportedly designed to give parents complete control over who their preteen kids are allowed to chat with. How many kids were, um, exposed? I don’t trust Facebook’s initial admission, and given all that you’ve just read, neither should you.

I probably don’t need to remind you that Facebook was just fined $5 billion by the FTC for privacy violations, a figure which some say (including yours truly and several U.S. senators) is “woefully inadequate.”

But wait, as the ad announcer says, there’s more.

The Parent Coalition for Student Privacy, along with 16 other public health advocacy groups, filed an FTC complaint against Facebook last October, alleging that Messenger Kids violates the Children’s Online Privacy Protection Act (COPPA). They announced:


The FTC complaint says that Facebook’s parental consent mechanism does not meet the requirements of COPPA because it’s not reasonably calculated to ensure that the person providing consent is actually the child’s parent. Any adult user can approve any Messenger Kids account, and testing confirmed that even a fictional “parent” holding a brand-new Facebook account could immediately approve a child’s account without proof of identity. The complaint also asserts that Facebook Messenger Kids’ privacy policy is incomplete and vague. The policy allows Facebook to disclose data to unnamed third parties and the “Facebook Family of Companies” for broad, undefined business purposes. The policy does not specify what companies are in the “Facebook Family.” COPPA requires that privacy policies list the name and contact information of any third parties who have access to children’s data.

To sum up: Messenger Kids is just another data-harvesting scheme, this time directed at preteens, with lax security and little or no accountability.

Or to put it even more succinctly: Messenger Kids is Facebook.

At this point, serious punitive action is required, because even $5 billion is barely a drop in Mark Zuckerberg’s slime bucket. Ideally I’d like to see Facebook split up into its various components, with a privacy data firewall between them. More importantly, I’d place a punitive tax on social media’s targeted ads (with a rebate paid to viewers of said ads) in order to go directly after their business model — a model which encourages exactly the kind of neglect and malfeasance for which Facebook has become so infamous. All of this goes against my long-held libertarian ideas about antitrust, but when theory meets reality and comes up short, then it’s time to update the theory.


Meanwhile, if you have children using Messenger Kids, delete the damn app already.
