Facebook-owned Instagram unveiled technology Tuesday aimed at preventing underage children from creating accounts and blocking adults from contacting young users they don’t know.
It was the latest move responding to concerns about inappropriate contact between adults and children on the platform, which, like most services, sets a minimum age of 13.
Instagram will begin using artificial intelligence to estimate a user’s age at signup in an effort to detect accounts created by children under 13.
“While many people are honest about their age, we know that young people can lie about their date of birth. We want to do more to stop this from happening, but verifying people’s age online is complex and something many in our industry are grappling with,” a blog post said.
“To address this challenge, we’re developing new artificial intelligence and machine learning technology to help us keep teens safer and apply new age-appropriate features.”
Additionally, the California-based giant said it would introduce a feature that prevents adults from sending messages to users under 18 who don’t follow them, in a bid to curb unwanted contact.
“This feature relies on our work to predict people’s ages using machine learning technology, and the age people give us when they sign up,” Instagram said.
Instagram is also exploring ways to make it harder for adults who exhibit “potentially suspicious behavior” to interact with teens, including preventing such adults from seeing teen accounts in its suggestions.
The image-focused network said it will also alert teens to potentially suspicious behavior by adults, such as the sending of large numbers of private messages.
“We’ll use this tool to alert the recipients… and give them an option to end the conversation, or block, report, or restrict the adult,” Instagram said.
AFP