TikTok cuts hundreds of jobs amid AI shift for content moderation

Additional layoffs expected next month

Image: Content moderators at TikTok face being cut

TikTok has announced major layoffs, mainly targeting content moderation teams in Malaysia and other regions.

The company is reportedly moving towards AI-driven content moderation to enhance efficiency and meet growing regulatory demands.

Sources with knowledge of the matter told Reuters that over 700 jobs were cut in Malaysia, although TikTok later clarified the figure was closer to 500.

The affected employees, most of whom were responsible for manual content review, were notified of their impending redundancies by email.

TikTok currently uses a combination of automated detection systems and human moderators to review content on its platform.

ByteDance, TikTok's parent company, employs over 110,000 people worldwide.

A TikTok spokesperson told Reuters that the recent job cuts are part of "ongoing efforts to further strengthen our global operating model for content moderation."

Additional redundancies are expected next month, with several hundred employees likely to be affected globally.

This year, TikTok plans to invest $2 billion in trust and safety initiatives worldwide, with AI now responsible for removing 80% of content that violates platform guidelines.

The latest round of layoffs at TikTok comes after a series of job cuts earlier this year.

In April, the company reduced its workforce in Ireland by over 250 employees, and in May, reports surfaced that around 1,000 staff from its operations and marketing teams were being laid off.

Regulatory challenges

In recent years, TikTok has faced regulatory challenges in many countries.

In July, the UK's broadcasting and telecoms regulator, Ofcom, fined TikTok £1.875 million for providing inaccurate information about its parental control safety feature.

In the summer of 2023, TikTok was asked to share details about the adoption of its "Family Pairing" feature. Ofcom sought this data to evaluate how effectively TikTok's controls protect teenage users and to help parents make informed decisions about their children's use of the platform.

While TikTok initially complied with the request in September 2023, it later retracted the information, prompting Ofcom to launch an investigation.

The investigation uncovered major deficiencies in TikTok's data governance processes, revealing that the company lacked proper checks to prevent inaccurate data submission and was slow to recognise and correct the errors.

The EU Court of Justice in June upheld the European Commission's decision to designate ByteDance as a gatekeeper under the Digital Markets Act (DMA). The court dismissed all arguments put forward by the company, affirming ByteDance's obligations under the DMA to comply with stricter regulations aimed at promoting fair competition in digital markets.

Earlier this year, the US House of Representatives passed a bill that could lead to a nationwide ban on TikTok.

The legislation, aimed at protecting national security, gave ByteDance a six-month deadline to sell its controlling stake in TikTok. Failure to do so would result in the app being banned across the country.

Proponents of the bill argued that a platform as influential as TikTok, controlled by a company with ties to the Chinese Communist Party, poses a serious national security risk.

In response, TikTok has sought to address these concerns by highlighting its commitment to data security and privacy.