TikTok removes over 30 million videos in Pakistan in 2024

TikTok has removed more than 30.7 million videos from its platform in Pakistan during the first half of 2024 for violating guidelines.

The removals, detailed in TikTok’s second-quarter Community Guidelines Enforcement Report, form part of the platform’s larger effort to strengthen content moderation and create a safer, more positive experience for users.

Proactive Content Moderation in Pakistan

TikTok’s proactive approach to content moderation has been evident in its handling of harmful and inappropriate material. According to the report, 99.5% of the videos removed in Pakistan were taken down before users had the chance to report them, and 97% of these videos were deleted within 24 hours of being uploaded. The company credits this efficiency to its advanced automated moderation systems, which detect and remove harmful content in real time.

The 30.7 million videos removed in Pakistan contribute to a global total of 178 million videos deleted during the second quarter of 2024, 144 million of which were identified and removed automatically by TikTok’s detection technology. These figures highlight the platform’s focus on addressing harmful content before it can spread.

Automation Leads the Charge

TikTok’s growing reliance on automation for content moderation is a key part of its global strategy to keep the platform safe. Its automated systems detect and remove content that violates community guidelines, such as violent material, nudity, and misinformation. The platform reports a proactive detection rate of 98.2%, meaning the vast majority of violating content is removed before users report it.

In Pakistan, where TikTok has faced scrutiny for allowing inappropriate content to flourish, the removal of 30.7 million videos demonstrates the platform’s resolve to meet regulatory expectations. Its focus on automation allows it to manage the vast volume of content uploaded every day and remove problematic material quickly.

Global Efforts to Enhance Platform Safety

Globally, TikTok’s content moderation efforts reflect similar trends. In its earlier report covering the first quarter of 2024, TikTok said it removed approximately 166.9 million videos worldwide, more than 129 million of them through automated systems, further showcasing the effectiveness of its AI-powered tools. The report also noted that these removals amounted to about 0.9% of all videos uploaded during the quarter.

In addition to video removals, TikTok disclosed that nearly one billion comments were filtered or removed in the second quarter of 2024 using its comment safety tools. Aimed at reducing harmful interactions, this measure is another step in the company’s broader strategy to maintain a respectful and safe online environment.

Challenges and Future Plans

Despite its success with automation, TikTok’s moderation system still relies on human reviewers, particularly when removals are appealed. This hybrid model helps ensure that false positives, videos wrongly flagged by the automated system, can be reviewed and reinstated when appropriate.

However, the platform’s increasing reliance on AI has also led to significant changes in its workforce. TikTok recently laid off hundreds of employees, including approximately 500 workers in Malaysia, as AI-powered moderation took over many roles previously performed by human moderators. The shift comes amid growing regulatory pressure in countries such as Malaysia, which are demanding greater accountability from social media firms in combating cyber offenses.