In January 2019, a Peterson Institute investigation described TikTok as a “Huawei-sized problem” that posed a national security threat to the West. In July 2020, the White House considered banning the app outright; in August, Trump was persuaded to hold off, and a purchase of the company’s US operations by an American firm is currently on the cards. However, new guidelines introduced by the Chinese government on foreign acquisitions are reportedly stalling the deal. Yet one of the most alarming aspects of the TikTok phenomenon has barely made the headlines at all: its problem with child predators.
TikTok has often sparked outrage for its bizarre and shocking trends. Just recently, the app came under fire for promoting videos in which users role-played as Holocaust victims to the Bruno Mars song “Locked Out of Heaven.” More troubling, there has been a consistent stream of evidence that the app facilitates and encourages predatory behavior toward underage users.
A major draw of the app is the frequent dance and duet trends promoted in its “For You” section. Users often imitate dance routines or mime song lyrics, and many current TikTok influencers shot to fame by participating in these trends. However, many of these trends involve sexually themed dances or songs, which are especially problematic when the underage users posting them have public accounts and can be contacted by adults. The app’s video-only interface makes it a primarily visual medium, and most of its users are under 18. British child online safety expert John Carr told The Sun: “There’s no question an app like this is a magnet for paedophiles.”
A British report found that a quarter of the children in its research sample had live-streamed with someone they had never met, and one in twenty had been asked to take their clothes off. In April, a BBC investigation found hundreds of disturbing sexual comments publicly posted on videos uploaded by teenagers and children. Earlier this year, American TikTok influencers Caleb King and Tony Lopez came under fire for inappropriate conduct with underage users. In October 2018, allegations against TikToker Buddy Haynes went viral, and his account was banned several months later. The vast majority of abuse, however, is swept under the proverbial rug.
A BuzzFeed interview with the operator of an undercover account that exposes predatory content lamented the platform’s inconsistent approach to moderation. They claimed that, all too often, videos exposing predatory behavior were removed while videos of “creepy old men” were not.
Since TikTok was fined £4.3 million by the Federal Trade Commission in 2019 for gathering data on children, the app has banned users under thirteen and introduced ID checks in the US. That investigation, however, centered on federal privacy law rather than child abuse, which also victimizes children thirteen and over. Nor has the case set much of a precedent outside the US. Over half of British children currently use the app, including one in three seven-year-olds. Children in the US, UK, and Spain spend as much time watching TikTok as they do YouTube.
Requiring ID checks for younger users also fails to address the problem of predatory adults using the platform. In July 2020, outrage ensued over a now-deleted TikTok in which a father appeared to engage in sexual roleplay with his daughter. In June, YouTube commentator ReadyToGlare brought to light TikToker Brad Bastidas, who appeared to be using his young sibling to film sexually suggestive videos that received hundreds of thousands of views and likes.
In the face of these concerns, TikTok has insisted safety is its ‘number one priority’ but admitted it was ‘becoming an increasingly difficult challenge.’ Is it really unrealistic to expect a multi-billion-dollar corporation to funnel resources toward eliminating pedophiles from its overwhelmingly young platform? If not, it is time to ask why TikTok is choosing not to do so.
Silicon Valley’s track record on online predation ought to make us pessimistic about the future of child safety on TikTok once its US operations are acquired. But if TikTok wishes to thrive as a platform where young people can showcase their talent and wit, or just have some harmless fun, it must work much harder to eliminate the victimization of young users or risk a cultural and legal backlash.
The views expressed in this article are the opinion of the author and do not necessarily reflect those of Lone Conservative staff.