On the small, glowing screens of TikTok Live, children from some of the world’s most impoverished regions sit cross-legged, hands outstretched, silently pleading for help.
These young faces, broadcast from mud-brick homes in Afghanistan to the crowded urban streets of Pakistan and Indonesia, are not asking passers-by for coins or food but requesting virtual gifts, which are converted into small amounts of real money.
What appears to be a simple cry for help is, in reality, an increasingly troubling trend that has spiralled into a form of digital begging, driven by TikTok’s algorithms and weak content moderation.
Despite the platform’s policies against such practices, children and vulnerable individuals are being pushed into these live-streamed appeals, where their poverty is turned into a source of entertainment.
According to an investigation by The Guardian’s Observer, this disturbing trend is growing rapidly, largely unchecked by the platform’s content moderators.
TikTok, which launched its live-streaming feature in 2020, has become a stage for these emotional pleas, yet the company’s enforcement of its own rules on child begging has proven inconsistent. In many cases, flagged accounts are only removed after intervention from media outlets, raising concerns about the platform’s commitment to protecting vulnerable users.
These live-streamed begging sessions, some orchestrated by families and others potentially part of a larger network exploiting children, raise red flags about the ethical implications of digital charity. While some families may genuinely rely on these earnings, there are reports of multiple children appearing in similar settings each day, leading to suspicions of organized exploitation.
The app’s design encourages this exploitation, with digital gifts incentivizing both creators and TikTok itself. Though TikTok claims to take only 30% of gift revenue to cover processing and app fees, the platform often retains nearly 70% of the total value generated by these streams.
TikTok Live is intended to let content creators interact with their audience in real time, but the platform’s safety measures have been criticized as insufficient. While hosts are required to be over 18 and have at least 1,000 followers, children frequently appear in these live streams, often under the guidance of unseen adults or handlers.
In addition to passive begging, some participants have been coerced into performing degrading acts, such as covering themselves in mud or enduring long hours of sleeplessness, just to earn virtual gifts. In some instances, streams go viral, generating significant revenue for both the creators and the platform, making it more difficult to discern where the money truly goes.
While there have been cases of TikTok Live helping creators raise money for legitimate needs, like medical expenses, these instances are far outweighed by the potential for exploitation. Experts warn that many of the children involved may be too young to fully understand what is happening or may be coerced into participating, with third parties controlling access to the earnings.
Despite TikTok’s promises to address these issues, previous investigations by the BBC and Al Jazeera have highlighted similar patterns of exploitative livestreaming in vulnerable settings such as refugee camps and orphanages.
TikTok’s safety measures are often rendered ineffective by anonymity and a lack of accountability, leaving viewers and potential donors unsure of the true beneficiaries of their contributions.