This is the best summary I could come up with:
These frenzies - where TikTok drives disproportionate amounts of engagement to some topics - are evidenced by interviews with former staffers, app users and BBC analysis of wider social media data.
While Olivia was an experienced social video creator, frenzies can also draw in people who seem never to have posted content like this before - and reward them with huge numbers of views.
During the school protests, I decided to see what type of content TikTok’s algorithm might recommend to an undercover account pretending to belong to a 15-year-old boy with typical interests, such as football.
In an interview with the BBC, Mr Markovac said he encourages young people to “rebel against ridiculous rules”, but he said he could not be held responsible for the poor decisions of a minority of viewers.
Several former TikTok employees in the US and UK told the BBC that limiting these frenzies of harmful content was not a priority for the social media company, because it could slow down the app’s meteoric growth.
TikTok told the BBC it has more than 40,000 “safety professionals” using technology to moderate content, with the “vast majority” of videos with harmful misinformation never receiving a single view.
The original article contains 1,966 words, the summary contains 200 words. Saved 90%. I’m a bot and I’m open source!