The Federal Trade Commission alleges that Meta has “repeatedly violated” privacy rules, and proposes to tighten the agency’s 2020 order against the company, barring it entirely from monetizing data from anyone under 18, among other new restrictions.
The order in question took effect in 2020, but was created in 2019 as part of then-Facebook’s $5 billion settlement after the company violated a previous order. Now the FTC says that Facebook/Meta has violated not only the new order but also the Children’s Online Privacy Protection Act Rule.
“The company’s recklessness has put young users at risk, and Facebook needs to answer for its failures,” said Samuel Levine, director of the FTC’s Bureau of Consumer Protection, in a news release. (Because the order and the alleged failures span both names of the company, both names are used throughout.)
The 2020 order established an independent third-party assessor to evaluate whether Meta was adhering to the privacy rules: things like requiring new products to go through privacy reviews and restricting how facial recognition data and phone numbers are used.
This assessor recently delivered its report to the FTC, and apparently it is not pretty, containing evidence of numerous shortcomings and violations: “the Commission notes that the breadth and significance of these deficiencies pose substantial risks to the public,” the agency wrote.
Specifically, Facebook promised (in 2018 — the timeline is long and confusing) to cut off app developers’ access to users’ data if that user had not used the app in 90 days. But it did not do so, the FTC alleges, and allowed some of that data to be used well into 2020.
The company also “misrepresented that parents could control whom their children communicated with through its Messenger Kids product.” The contact controls put in place by Facebook were inadequate, allowing children to communicate with unapproved contacts via group video calls and chats.
These may not sound like the most egregious failures, but regulations around tech for kids are tight for good reason, and COPPA violations are serious. When one considers that Facebook had not only been put on warning for a decade for sloppy privacy practices, but that it knew the FTC was eyeballing its every move especially with sensitive data like that of under-13 users, one is less inclined to offer grace.
This apparently cavalier approach to compliance with the FTC order has prompted the agency to tighten the screws, with a number of proposed changes to the order — something it may do when “changed conditions of fact or law or public interest” warrant it. Companies may consider themselves warned that FTC orders are very much living documents.
In this case the 2020 order, affecting all of Meta’s businesses (Facebook, Instagram, WhatsApp, and Oculus), would be modified to add the following:
Total prohibition of monetizing data from anyone under 18. This data could only be used to provide services or for security purposes. (And it doesn’t retroactively become legal when the user turns 18, either.)
No launching new or modified products or services without the independent assessor confirming that the new features are in compliance with the privacy restrictions.
If Meta acquires a new company, this privacy rule now applies to that company as well.
Expanded limitations on facial recognition, requiring disclosure and affirmative consent.
Strengthened requirements throughout relating to privacy review, data inventory, access controls, etc.
Today sees the FTC publishing an Order to Show Cause, which details the issues noted in brief above and was not publicly available at the time of writing. Meta has 30 days to respond, after which the agency will “carefully [consider] the facts and arguments made by the parties” and decide whether the expanded order is justified. I have asked when the new order could potentially take effect and will update this post if I hear back.
FTC moves to completely prohibit Meta from monetizing kids by Devin Coldewey originally published on TechCrunch