When Indonesian authorities apprehended a teenager in connection with a November bombing on a Jakarta high school campus, they made a chilling discovery: the youth was carrying a life-size toy rifle inscribed with the phrase “welcome to hell” and the names of infamous white supremacist mass murderers. The November 7 attack, which injured 96 people, may be the first instance of white supremacist-inspired violence in Indonesia, but law enforcement agencies fear it is merely a harbinger of more to come.
The scale of the issue is significant: Indonesian police revealed in March that at least 97 young people, some as young as 11, were under surveillance. These youths had reportedly fallen under the influence of online content that glorified mass violence and espoused white supremacist ideologies, with messaging apps such as Telegram identified as primary vectors of its spread. Authorities indicated that at least two of the monitored teenagers were actively planning acts of violence in the immediate aftermath of the Jakarta bombing.
This phenomenon is not confined to Indonesia. Across Southeast Asia, a region characterized by its rich tapestry of ethnicities and faiths comprising hundreds of millions of people, law enforcement and security officials are confronting a surge in teenagers plotting violence inspired by white supremacist figures such as Brenton Tarrant, the perpetrator of the Christchurch mosque attacks. Interviews with security officials from Indonesia, Singapore, Malaysia, Thailand, and the Philippines paint a consistent picture of a growing concern.
Singapore’s internal intelligence agency, the Internal Security Department (ISD), has detained four youths since December 2020 on grounds of subscribing to “violent far-right extremism ideologies” and reportedly planning attacks. The ISD has since explicitly identified far-right extremism as a paramount threat to national security.
A crucial detail: none of the teenagers being monitored in Singapore and Indonesia is ethnically white, underscoring that the appeal of these ideologies transcends racial lines and finds fertile ground among vulnerable youth in diverse societies. In some instances, authorities report, these young people were motivated by a misguided belief that their actions would preserve the existing racial and religious composition of their nations. Others, three Indonesian security officials said, were drawn primarily to the violent spectacle of far-right attacks, even in the absence of shared grievances or specific ideological commitments.
In every case examined by Reuters involving teenagers in Singapore and Indonesia, authorities alleged that the radicalization process occurred through social media platforms and online communities. Pravin Prakash, a researcher specializing in Southeast Asia at the Center for Organized Hate Studies, a Washington-based think tank, observed that many of these young individuals appear to be disillusioned and lonely, seeking solace and identity in a “nihilistic worldview” after being drawn into the orbit of far-right messaging.
The Jakarta suspect, according to Indonesian authorities, had posted online video footage of his campus adorned with Nazi symbols. Accompanying text appeared to draw on AC/DC’s song “Highway to Hell,” with a chilling adaptation: “Don’t need no reason, ain’t nothing I’d rather do. I am on the highway to hell and all my friends are going to be there.” The post illustrates how online subcultures, music, and extremist ideologies can blend into content that draws in young people.
Telegram groups, in particular, have been identified by Indonesian police as providing a sense of belonging for these disaffected youths. However, this sense of community comes at a cost. Police commissioner Mayndra Eka Wardhana, a spokesperson for the counter-terrorism squad, lamented that Telegram often fails to act on content flagged by authorities as extremist. In response to inquiries, a Telegram spokesperson, Remi Vaughn, stated that the platform maintains an “open channel of communication with Indonesian authorities” and commits to removing “any content that breaches Telegram’s terms of service whenever reported.” Vaughn further emphasized Telegram’s support for “the right to peaceful free speech,” while unequivocally stating that “calls to violence are explicitly forbidden.”
Recognizing the transnational nature of this threat, security and police agencies across Southeast Asia are enhancing their collaborative efforts. Officials from Singapore and Indonesia have confirmed that this marks the first instance of regional cooperation specifically targeting this form of radicalization.
KILLER MEMES AND CODED LANGUAGE
A common thread among the Indonesian teens identified by authorities as having been radicalized is their affiliation with the “true crime community,” a popular internet subculture. Within channels associated with this community, users engage in the sharing of memes and other content that glorifies perpetrators of mass violence, including figures like Brenton Tarrant, whose name was found on the Jakarta suspect’s toy rifle. This was evidenced by screenshots shared with Reuters by police and a separate review of four such online groups.
The conversations within these online spaces reveal a disturbing encouragement of violence. Screenshots of these discussions show some users trading bomb-making tutorials and egging each other on towards committing violent acts. The spread of white supremacist content is not limited to explicit declarations; it also manifests through coded language and localized adaptations.
Reuters observed hundreds of videos from Southeast Asian users on TikTok that featured racist caricatures of Chinese people and other minority groups, such as Rohingya Muslims. These visuals were often accompanied by phrases like “TCD” or “TRD,” which Saddiq Basha of Singapore’s S. Rajaratnam School of International Studies (RSIS) has identified as potential coded calls for “Total Chinese Death” or “Total Rohingya Death.” Basha has been tracking such content since 2024. One particularly popular video by an Indonesian user, utilizing the hashtag #TCD, garnered over 542,000 views. The creator did not respond to a request for comment.
The use of coded phrases to advocate for violence is not new. Western white supremacist groups have employed similar linguistic tactics, using phrases like “TND/Totally Nice Day” and “TJD/Totally Joyful Day” to promote the extermination of Black and Jewish people, according to anti-discrimination organizations like the Anti-Defamation League.
Following inquiries from Reuters regarding its content moderation policies, TikTok removed the Indonesian user’s post and similar content identified by the news agency. A company spokesperson affirmed, “There is no place on our platform for those dedicated to spreading beliefs or propaganda that encourage violence or hate.” However, two individuals working on TikTok’s online safety teams, who spoke on condition of anonymity as they were not authorized to speak to the media, expressed unfamiliarity with specific policies for moderating posts featuring localized interpretations of white supremacist slogans and were unaware of such content’s existence.
The TikTok spokesperson further elaborated that the platform actively blocks “certain keywords from appearing as search suggestions to reduce their visibility if we find that they are being used as coded language” and collaborates with Southeast Asian advisors on online safety matters. Munira Mustaffa, director of the Chasseur Group and an advisor to Southeast Asian governments and social media platforms on combating extremism, pointed out a potential blind spot: tech companies have historically focused their moderation efforts on Islamist content in Southeast Asia, sometimes to the detriment of identifying and addressing other forms of extremist posts.
“While the concept of neo-Nazism lies in the assertion that the white race reigns supreme, these ideas are easily adaptable into local context,” Mustaffa explained, adding that teenagers who successfully carry out attacks often believe they will achieve significant status within their online communities.
Among the youths authorities allege were radicalized by algorithmic recommendations is Nick Lee Xing Qiu. Detained by the ISD last year as an 18-year-old, he was suspected of plotting attacks against Singapore’s Malay Muslim minority. The agency stated that algorithms on unspecified platforms had recommended far-right extremist content to him. Reuters was unable to reach Lee, who is being held under a law that allows for detention without trial, nor could a legal representative be identified to address inquiries.
Both Lee and another unnamed teenager, who was separately detained, self-identified as “East Asian supremacists” in their online posts, according to ISD statements. These youths referenced the neo-Nazi “great replacement theory”—the conspiracy theory that white populations are being deliberately supplanted by minority groups—and claimed inspiration to “fight back,” the ISD reported.
YOUTH REHABILITATION AND GLOBAL REACH
Indonesian counter-terrorism official Mayndra expressed deep concern that teenagers radicalized by violent extremist content could become prime targets for recruitment by “terror groups.” Many of the young individuals currently detained or under monitoring in Indonesia and Singapore are either below the age of majority or have not yet committed overt acts of violence.
The Jakarta bombing suspect, for instance, is currently in the care of child protective services while authorities build their case, according to police spokesperson Budi Hermanto. The suspect has not yet been formally charged or entered a plea. Rudianti, a family member of the suspect who uses a single name, told Reuters: “My hope, if it’s possible, is do not punish him, just give him counselling so he can be a better person.”
In an effort to curb the spread of radicalization among youth, Indonesia announced plans this month to restrict social media access for children under the age of 16. Mayndra acknowledged that while this measure could aid in combating youth radicalization, it is not a comprehensive solution.
In Singapore, authorities have enlisted the expertise of the Religious Rehabilitation Group (RRG) to engage with some teenagers detained for plotting far-right attacks. Established in 2003 by Muslim scholars, the non-profit organization was initially created to rehabilitate suspected Islamist militants and now comprises volunteer educators. The RRG provides counseling to young detainees and assists them in preparing for national examinations, according to Ahmad Helmi Bin Mohamad Hasbi, an RRG counselor and a radicalism expert at RSIS.
The RRG has previously worked with Singapore’s first far-right extremist detainee, a 16-year-old apprehended in 2020 for allegedly planning machete attacks on two mosques. He was released from rehabilitation in 2024. However, groups like the RRG face the daunting challenge of countering the rapid global influence of some Southeast Asian extremists.
The far-reaching impact of these ideologies was tragically underscored just one month after the Jakarta bombing. A 15-year-old Russian was accused of fatally stabbing a Tajik migrant child in the Moscow region. The Russian suspect had authored a manifesto, subsequently published on Telegram and authenticated by researchers with the U.S.-based non-profit Global Project on Hate and Extremism. Within this manifesto, the Russian suspect hailed the Indonesian teenager as a hero and argued that if non-white youths could execute such attacks, white supremacists should aspire to achieve even greater atrocities.
© Thomson Reuters 2026.
