TikTok tightens security measures as regulators zero in
Video-sharing social media network TikTok says that in the first quarter of 2021 it removed almost 72 million videos from its platform for violating its community guidelines or terms of service.
This is according to the social media app’s Q1 2021 Community Guidelines Enforcement Report, which provides visibility into the volume and nature of content and accounts removed from the platform during the first three months of 2021.
The ByteDance-owned TikTok app allows users to create 15-second-long videos, sound-tracked by music clips.
It is available in over 150 countries, with over one billion users globally, and has been downloaded over 200 million times in the US alone.
The app's original Chinese version, Douyin, has gained massive popularity, with over 689 million users in the country.
Over the past few months, the app has been implementing additional privacy controls to help bring it in line with regulations designed to protect child privacy online, such as the US Children's Online Privacy Protection Act (COPPA).
For the first time, the company says it published the number of suspected underage accounts it removed, as part of its effort to keep the full TikTok experience reserved for people aged 13 and over.
In the report, TikTok says 82% of removed videos were deleted before they received any views, 91% before any user reports, and 93% within 24 hours of being posted.
It also notes 1.9 million adverts were rejected for violating advertising policies and guidelines. In addition, the platform says 11.9 million accounts were removed for violating its community guidelines or terms of service, of which over seven million were removed for potentially belonging to a person under the age of 13.
The videos were removed for various reasons, including adult nudity and sexual activities; harassment and bullying; hateful behaviour; illegal activities; minor safety; suicide, self-harm and dangerous acts; and violent and graphic content.
“At TikTok, we work to create age-appropriate environments by developing policies and tools that help promote safe and positive experiences on our platform.
“Our Community Guidelines apply to everyone and all content on our platform. Our TikTok team of policy, operations, safety and security experts work together to develop equitable policies that can be consistently enforced,” says Eric Han, head of safety at TikTok US.
“Our policies do take into account a diverse range of feedback we gather from external experts in digital safety and human rights, and we are mindful of the local cultures in the markets we serve. Our ultimate goal is to create guidelines that enable authentic and creative self-expression in a safe and entertaining community environment.”
Another 71.4 million accounts were blocked from being created through automated means, it says.
“We continue to rely on technology to detect and automatically remove violating content in some markets. Of the total videos removed globally, 8.8 million were flagged and removed automatically for violating our community guidelines,” says TikTok.
In the first quarter, TikTok reinstated over 2.8 million videos after they were appealed.
“TikTok offers creators the ability to appeal their video's removal. When we receive an appeal, we will review the video a second time and reinstate it if it had been mistakenly removed. We aim to be consistent and equitable in our moderation practices and will continue our work to reduce false positives through ongoing education and training of our moderation team.”
TikTok’s success has made it a big target, with regulators across the globe homing in on the controversial platform after many users reportedly experienced harassment within the app.
Earlier this year, TikTok agreed to pay $92 million to settle a class-action lawsuit alleging the video-sharing social networking service illegally collected some teenage users’ data, according to court filings.
In the UK and EU, TikTok is facing a lawsuit from the children's commissioner for England, filed on behalf of millions of children over alleged violations of user data privacy.
In October 2020, the social network launched its global bug bounty programme as an extension of its vulnerability management initiative to proactively identify and resolve any vulnerabilities.