Instagram steps up teen controls in response to child safety concerns

Meta will tighten restrictions on teenagers’ accounts across its platforms, adding a safety protection that makes parental approval mandatory for certain features, as the industry comes under fire over social media’s impact on young people.

The $1.3tn social media group said on Tuesday that teenagers’ profiles on its photo-sharing Instagram app will automatically be set to private, and that they will be shown less content deemed “sensitive”, under its new “teen accounts” feature.

For the first time, under-16s wishing to loosen those restrictions and, for example, turn their profile public will need to sign up to a “parental supervision” feature on the app to gain permission.

Parents who enable supervision will be able to see the topics their teen is browsing and who they are messaging, though not the messages themselves. They will also be able to add restrictions, such as blocking their child’s access to the app at night.

“The new teen account protections are designed to address parents’ biggest concerns, including who their teens are talking to online, the content they’re seeing and whether their time is being well spent,” Meta said in a blog post on Tuesday. 

The push comes as social media platforms have faced sharp criticism for doing too little to protect minors using their platforms from harmful or inappropriate content and from child predators and sexual exploitation. Concerns also have been mounting over the perceived negative mental health and addictive impacts of the technology. 

Meta in particular has been attempting to draw teen users to its Instagram app to compete with fast-growing rivals such as ByteDance’s TikTok, while its flagship Facebook platform has been losing traction with a younger audience.

However, in January, chief executive Mark Zuckerberg was compelled to issue a dramatic apology before US Congress to the families of children who had been victims of sexual exploitation and abuse on his platforms. Meanwhile, dozens of US state attorneys-general have filed lawsuits against the company over its role in harms to children.

Meta plans to introduce the feature globally on Instagram over the course of the year and across its other apps including Facebook next year, Tuesday’s post said. 

Both the parent and the teen have to agree to the parental supervision setting. Meta said it was building technology to more proactively root out teens who might be lying about their age, and that it would draw on a variety of signals to assess whether someone was eligible to take on the supervisory role for a child’s account.

Forcing social media platforms to address child safety is one of the few issues in the US that garners bipartisan support, with the Senate in July overwhelmingly passing the Kids Online Safety Act. The legislation places a duty of care on social media platforms to protect children from harmful online content. 

At the same time, several state laws have been passed recently with requirements such as mandating parental consent for minors’ use of social media as well as age verification, including in Florida and Louisiana. Others, in Utah and Mississippi, have been blocked by judges over First Amendment concerns. 

Jurisdictions elsewhere are also seeking to address concerns over children’s internet use. The European Commission in May opened an in-depth probe into Meta under its Digital Services Act, looking at whether the platform has “appropriate and proportionate measures to ensure a high level of privacy, safety and security for minors”.

The UK’s Online Safety Act, passed nearly a year ago, is considered among the strictest pieces of legislation to protect underage internet users from harmful content.
