Two days after saying it needs to do better at protecting its youngest customers, Facebook has said in an internal post that it's developing a version of Instagram for users under age 13. Federal privacy regulations don't allow children younger than 13 to use the app as it stands. "We have identified youth work as a priority for Instagram and have added it to our H1 priority list," an executive told employees Thursday, BuzzFeed reports. That fact was left out of the company's public post entitled "Continuing to Make Instagram Safer for the Youngest Members of Our Community," which repeated the minimum age requirement and said Instagram requires age verification when new users sign up. The company confirmed the BuzzFeed report on Friday, per the AP, saying it is "exploring a parent-controlled experience" on Instagram.
Watchdogs complained immediately that the plan is a way to develop users early so the company can profit from them later. "Facebook poses one of the biggest threats when it comes to children's privacy," said an official for Amnesty Tech, part of Amnesty International. "Increasing safeguards for children online is paramount, but the fact remains that Facebook will be harvesting children's data and profiting off their detailed profiles." A University of Maryland researcher said, "From a privacy perspective, you're just legitimizing children's interactions being monetized in the same way that all of the adults using these platforms are." Adam Mosseri, the boss at Instagram, said verifying ages has been difficult, given that 13-year-olds don't have driver's licenses or other official IDs. "We have to do a lot here," he said, "but part of the solution is to create a version of Instagram for young people or kids where parents have transparency or control."