TikTok bans promotion of crypto, forex trading products


TikTok has prohibited users from promoting cryptocurrency and forex trading products on its platform, as part of a list of newly banned content.

The video-sharing social media network recently updated its policy to prohibit adverts and promotional videos relating to content on trading platforms, pyramid schemes, lending and management of money assets, loans and credit cards, foreign exchange, and investment services.

Branded content, according to the ByteDance-owned social media company, is content for which a creator or influencer receives something of value from a third party, such as a brand, in exchange for promoting that brand's products or services. It could be a brand endorsement, partnership or other kind of promotion for a product or service.

“Creators and their partners on TikTok are responsible for their branded content complying with all applicable laws and established regulations. These policies are designed to ensure a safe and positive environment for our users. The branded content policies are updated periodically,” says the company on its website.

The ban comes several weeks after the social media platform was flooded with financial investment-related content, which reportedly received over 2.8 billion views, with the hashtag #investing gaining particular momentum.

The social media app also banned the advertising and promotion of weapons, cigarettes and tobacco products, gambling, dating and live video apps, and political content, among others.

TikTok is available in over 150 countries, with more than one billion users globally, and has been downloaded over 200 million times in the US alone.

Its original Chinese version, Douyin, has gained massive popularity, with over 689 million users in China.

Over the past few months, the app has been implementing additional privacy controls to help bring it in line with regulations designed to protect children's privacy online, such as the US Children's Online Privacy Protection Rule.

Last week, the company announced that it had removed almost 72 million videos that violated its community guidelines or terms of service during the first three months of 2021.

According to the social media app’s Q1 2021 Community Guidelines Enforcement Report, 82% of removed videos were deleted before they received any views, 91% before any user reports, and 93% within 24 hours of being posted.

It also notes that 1.9 million adverts were rejected for violating advertising policies and guidelines.
