US regulator criticises Meta over use of children's data
FTC proposes strict new privacy controls on the social media giant
The US Federal Trade Commission has criticised Meta's safeguarding and privacy controls with respect to children using its Messenger Kids app, suggesting strict new controls on the use of data from under-18s.
The regulator accuses Meta of misleading parents about the privacy of children's data and controls over who they can communicate with via Messenger Kids, which is marketed as "a free and safer app for kids to connect, communicate and play with family and friends."
The FTC says that Meta's claims about how much control parents have over who their children can message are misleading. It says children are able to communicate via text and video with individuals outside of their approved group, and that children's data can be accessed by Meta's developers.
These issues put the company in violation of a 2020 privacy order implemented by the regulator after the Cambridge Analytica scandal.
"Facebook has repeatedly violated its privacy promises," said Samuel Levine, director of the FTC's Bureau of Consumer Protection in a press release. "The company's recklessness has put young users at risk, and Facebook needs to answer for its failures."
The regulator is suggesting changes to the 2020 order including:
- Preventing Meta and its subsidiaries from profiting from the data of people under 18.
- Requiring written confirmation from an independent assessor that Meta's privacy programme is compliant before the launch of any new products or services.
- Requiring Meta to obtain user consent for uses of facial recognition technology.
- Requiring any companies acquired by Meta to comply with the order.
- Closing several gaps and deficiencies in the company's privacy programme identified by the regulator.
The FTC says that Facebook (as the company was then known) violated orders imposed in 2012 and 2020, the latter a $5 billion settlement over the misuse of data by Cambridge Analytica. Despite promising in 2018 to stop, the company continued to allow its developers to access the data of dormant users who had not used the app for 90 days, an activity that the regulator alleges continued until mid-2020.
The FTC alleges that as well as contravening the previous agreements, Meta was also in violation of a children's privacy law called COPPA, which imposes rules on websites and services directed at under-13s.
If implemented, the measures would have a significant effect on Meta's business, including moves into virtual reality and the metaverse.
Meta has 30 days from Wednesday to respond, and would have an opportunity to appeal any rulings.
Meta spokesman Andy Stone called the proposals "a political stunt" and vowed to fight them.
"Despite three years of continual engagement with the FTC around our agreement, they provided no opportunity to discuss this new, totally unprecedented theory," Stone said in a statement. "FTC Chair Lina Khan's insistence on using any measure - however baseless - to antagonise American business has reached a new low."