
Instagram makes new under-16s’ accounts private by default to protect children from abusers

Government legislation in several countries now puts the onus on technology giants to ensure sufficient safeguards to protect children.

Staff Writers
Instagram has come under fire from children's charities for failing to remove harmful content. Photo: Pexels

Instagram has made new under-16s’ accounts private by default so only approved followers can see posts and “like” or comment.

Tests showed that only one in five users opted for a public account when the private setting was the default at sign-up, it said.

Instagram will also send existing account holders a notification “highlighting the benefits” of making their account private.

Despite a backlash from some groups, Instagram says it is also pushing ahead with new apps specifically for under-13s.

Parent company Facebook said, “The reality is that they are already online and, with no foolproof way to stop people from misrepresenting their age, we want to build experiences designed specifically for them, managed by parents and guardians.”

It is also developing artificial-intelligence systems to find and remove under-age accounts.

Government pressure and legislation in several countries now put the onus on the technology giants to ensure sufficient safeguards to prevent children from accessing potentially harmful content.

Instagram has also come under fire from children’s charities for failing to remove harmful content.

It has also been investigated for its use of children’s data in Europe. In response, it has developed a range of child-protection measures.

In March, it announced older users would only be able to message teenagers who already followed them.

But the system relies on the age provided by the user and listed in the account – something younger users may lie about to avoid such restrictions.

Instagram said its latest updates were about “striking the right balance”.

“Historically, we asked young people to choose between a public account or a private account when they signed up for Instagram but our recent research showed that they appreciate a more private experience.”

The company has also developed a tool that automatically detects potentially suspicious adult accounts and stops those accounts from interacting with teens’ accounts on the app.

An adult’s account might be marked as suspicious if it has been blocked or reported by multiple teen accounts, said Instagram’s head of public policy, Karina Newton.

Once flagged, these adults won’t be shown teen accounts in discovery tools. They also won’t be able to follow young people’s accounts, leave comments on their posts or see comments from teens on other people’s accounts.

“We are creating an additional buffer around young people,” said Newton, noting that the company had already restricted adults from sending direct messages to users under 18 who didn’t follow them.

Meanwhile, “a more precautionary approach” will see advertisers able to target children based only on age, gender and location, rather than on interests and web-browsing habits.

And while defending targeted ads in general and the opt-out features it has for some types, Instagram said: “We have heard from youth advocates that young people may not be well equipped to make these decisions.”