Australia’s under-16 ban puts social media on notice

Parents the world over know the dilemma. Social media networks are a pivot around which their teenage children’s lives revolve — where they hang out with friends, nurture relationships, share what they are doing and keep up with current events. Yet research has suggested a link between social media apps and anxiety, depression, and sleeping and eating disorders among young people. They can expose users to harmful material on body image and misogynistic content, and serve as forums for bullying. Is the answer to ban their use by vulnerable minds?

Australia’s government thinks so. Prime Minister Anthony Albanese is rushing through a ban on under-16s having accounts with platforms such as Facebook, Instagram, X, TikTok and Snapchat, in what his government calls a world first. Tech companies would be obliged to take reasonable steps to ensure under-age children could not use their services — with penalties of up to A$50mn ($32.5mn) for “systemic” breaches — though children and parents would not be penalised. Albanese says the ban aims to protect mental health, likening it to curbs on underage drinking: worth doing even if some teens circumvent it.

More than a dozen other countries, including Austria, Germany, Italy and South Korea, are moving towards minimum age limits for social media, often 15, with varying degrees of enforcement. Norway is raising an existing threshold from 13 to 15. The UK government is reportedly weighing up backing a private member’s bill on the issue. About 10 US states have passed laws restricting children’s access to social platforms, though some are being challenged in court.

One question is whether Australia’s ban is workable; several countries have already found age verification is tricky. Age assurance trials are under way in Australia to determine how the ban will be enforced, but users would not have to hand over ID data directly to platforms. Meta, owner of Facebook and Instagram, has suggested Apple and Google, which control smartphone app stores, should play a bigger role in verifying ages across multiple apps.

The second question is whether a ban is desirable. For all their harms, social networks also have benefits. Researchers say they can stimulate youthful creativity and learning. For some marginalised communities, they can be an important means of interaction. And while online media is more sophisticated, and its impacts potentially more insidious, the approach to risks from older media forms has been to shield children from unsuitable content, not to ban the medium outright.

The most useful role of age-related bans — or the threat of them — may be to incentivise tech companies to create genuinely safe teen versions of apps, with more stringent moderation and shorn of addictive features. Instagram recently introduced “teen accounts” touting greater protections for 13- to 15-year-olds, though some parents and campaigners suggest these do not go far enough. Australia’s government has said services designed to be safe for kids will be exempted from its ban if they are effective. Its legislation excludes online gaming and messaging services and, after lobbying by teachers and a children’s music group, YouTube.

Making apps safe is not straightforward. Roblox, a gaming platform that explicitly caters to younger players and uses AI to safeguard conversations, last week had to introduce new controls for under-13s after criticism of alleged child abuse on the platform. But the tech companies have vast financial and technological resources to come up with solutions — and the moves in Australia and elsewhere suggest the pressure on them will only grow.
