Thursday, September 19, 2024

Meta rolls out parental controls for teen Instagram accounts

Meta aims to make these accounts more secure, giving both teenagers and their parents more control over their online experience.

Meta Platforms has introduced a major overhaul of Instagram for users under 18, aiming to address growing concerns about the negative impact of social media on young users. The company will automatically convert all such accounts into “Teen Accounts,” which will be private by default. The move responds to mounting studies linking social media use with mental health issues such as anxiety, depression, and learning disabilities, particularly among teenagers.

Under the new structure, users under 16 can modify the default settings only with parental consent. Parents will also gain a suite of tools to monitor their children’s interactions and limit app usage, creating a more controlled online environment for teenagers.

Strict Privacy and Content Restrictions

The new teen accounts will offer a higher level of privacy by default. Teens can be tagged or messaged only by people they already follow or are connected with, and the app’s sensitive-content settings will be set to the most restrictive option.

For under-16 users, altering any default setting requires parental approval, ensuring that younger users are shielded from potentially harmful or inappropriate content unless their guardians explicitly opt them out.

Legal Pressure Mounts on Social Media Companies

Meta’s move comes amid a broader reckoning in the social media industry, with platforms like TikTok and YouTube also facing legal challenges. Several lawsuits, filed by school districts and on behalf of children, accuse these platforms of being addictive and harmful. In October 2023, 33 U.S. states, including New York and California, sued Meta for allegedly misleading the public about the risks associated with Instagram.

In response to this pressure, the U.S. Senate advanced two key bills in July 2024: the Kids Online Safety Act and the Children and Teens’ Online Privacy Protection Act. These bills are designed to hold social media companies accountable for how their platforms affect young users.

As part of the update, Instagram users under 18 will receive prompts to stop using the app after 60 minutes of daily use, and a built-in sleep mode will silence notifications overnight. Meta plans to roll out teen accounts within 60 days in the U.S., UK, Canada, and Australia, followed by the European Union, with global implementation set for January 2025.