
Friday, November 29, 2024

TikTok introduces age restrictions amid mental health concerns

In addition to filter restrictions, TikTok plans to introduce new safety resources across 13 European countries.

TikTok, the popular short-form video platform, is set to roll out a series of global restrictions aimed at addressing the negative impact of beauty filters on teenagers’ mental health. The changes, expected to take effect in the coming weeks, will primarily target users under 18, barring them from using filters that significantly alter facial features, such as those that enlarge the eyes, smooth the skin, or plump the lips.

Move Against Distorted Beauty Standards

The restrictions are part of TikTok’s broader effort to promote healthier online habits and counteract the growing pressures young users face to meet unrealistic beauty standards. Filters like “Bold Glamour,” which create hyperrealistic changes, have drawn criticism for contributing to distorted self-image and anxiety, particularly among teenage girls. Many teens report feeling less confident in their unfiltered appearance after prolonged exposure to these effects.


In contrast, filters designed for comedic purposes—such as those adding animal ears or exaggerated facial features—will remain accessible to all age groups.

Findings From the Internet Matters Report

TikTok’s decision follows a report it commissioned from Internet Matters, a UK-based children’s online safety organization. The report, titled Unfiltered: The Role of Authenticity, Belonging and Connection, revealed that beauty filters often normalize unattainable ideals, leaving teens, especially girls, feeling pressured to conform. It also found that many children struggled to discern whether images had been digitally altered.

TikTok acknowledged these findings in its press release, stating that the platform aims to foster a culture of authenticity and self-respect. The company will also expand filter descriptions to inform users of how a filter alters their appearance when applied.

Improving Age Verification Systems

The effectiveness of these new restrictions depends heavily on accurate age verification. TikTok plans to use machine-learning technology to identify accounts belonging to users under 13, the platform’s minimum age requirement. Currently, TikTok removes approximately six million accounts globally each month for failing to meet its age requirements.

Chloe Setter, TikTok’s head of child safety public policy, stated that the new technology would flag suspicious accounts for review by moderators, who could remove them if deemed underage. Users whose accounts are wrongly removed will have the option to appeal.

This initiative aligns with upcoming regulations like the UK’s Online Safety Act, which will require social media platforms to enforce stricter age checks. Ofcom, the UK’s communications regulator, has previously criticized TikTok’s age verification measures as insufficient, highlighting the need for “highly effective” checks to protect younger users.

Broader Industry Trends in Child Safety

TikTok’s announcement comes amid increasing regulatory scrutiny of social media platforms’ handling of child safety. In the UK and EU, platforms are racing to comply with tougher laws expected to take effect in 2025.

Other major platforms have introduced similar measures. Roblox recently restricted violent and explicit content for its youngest users, while Instagram launched “teen accounts” that allow parents to monitor and control their children’s activity.

Experts have pointed out that these changes are likely driven more by regulatory pressures than altruistic motives. Andy Burrows, CEO of the Molly Rose Foundation, emphasized the need for ambitious legislation, arguing that platforms often act only when compelled by law.

Expanding Support for At-Risk Users

In addition to filter restrictions, TikTok plans to introduce new safety resources across 13 European countries. These tools will connect users who report content related to suicide, self-harm, hate, or harassment with local helplines.

TikTok’s European Public Policy Lead, Dr. Nikki Soo, noted that the company is committed to creating a supportive and respectful digital environment where users feel empowered to express themselves authentically.

Road Ahead

As TikTok continues to refine its safety policies, experts remain cautiously optimistic. The NSPCC described the new restrictions as “encouraging,” but stressed that they represent only the beginning of what is necessary to protect young users online.


Richard Collard, associate head of policy for child safety online at the NSPCC, called on other platforms to follow TikTok’s lead in implementing robust age-checking systems. “This is a positive step, but it’s just the tip of the iceberg,” he said.