A Fijian mother’s discovery that a circulated video showed her 16-year-old daughter naked has become the latest stark example of how rapidly evolving technology is amplifying online sexual harm to children in Fiji and the wider Pacific. The girl did not create the clip; schoolboys produced it with an AI “nudification” app, transforming images she had shared privately into fabricated explicit footage that was then spread in a Telegram group. The Nausori incident comes amid mounting national and international data showing that Fiji is increasingly visible in global child sexual abuse material (CSAM) detection systems.
The Nausori case highlights how adolescent online behaviour and deepfake-style tools can collide to produce devastating harms. According to the account supplied to investigators, boys at the teenager’s school downloaded previously shared social media images and used an AI-powered app to generate a fake naked video, which was then distributed beyond the classroom circle. The victim and her family only became aware when a relative sent the clip to the mother, who reported the matter to authorities.
The incident arrives against a backdrop of rising reporting and referral figures. Fiji recorded 3,638 cases of online child exploitation in 2023, and in early 2025 national authorities estimated that around 15 terabytes of pornographic content were consumed in the country each day — a volume officials described as a “tsunami” of material, some proportion of which involves CSAM. Filipe Batiwale, Fiji’s Online Safety Commissioner, said in 2025 that between roughly 1,800 and 8,000 CSAM-related referrals linked to Fiji were being passed to global clearing houses such as the US National Center for Missing & Exploited Children. Those referral numbers represent platform-generated alerts, not necessarily confirmed offences, but they underscore the scale of material flagged from Fiji-linked accounts or infrastructure.
The growing visibility of Pacific-origin content in global datasets has pushed Fijian leaders to reframe how the problem is discussed. In late 2025, then-deputy prime minister Manoa Kamikamica publicly urged replacing the term “child pornography” with “child sexual abuse material (CSAM),” arguing that the former obscures the criminal and exploitative nature of the content and the need for a child-protection response. International agencies have echoed concerns: at a 2025 regional summit UNICEF warned that levels of violence, abuse and neglect among Pacific children are among the highest in the world, while Save the Children reported rising physical, sexual, emotional and online violence across several Pacific states.
Experts and officials stress that detection alone will not protect children. Many global platforms operate beyond local jurisdiction, and small island states frequently lack the technical, investigative and prosecutorial capacity to convert referrals into outcomes for victims. Dedicated digital-forensics units are rare, reporting mechanisms remain weak in many countries, and police and child protection systems are already stretched. As connectivity expands from urban centres to remote villages, the rapid spread of smartphones and messaging apps has outpaced legal, technical and institutional safeguards.
The Nausori episode is being treated as a warning about new technological threats: deepfake-style transformations of private images can create material that appears authentic and travels quickly online. Authorities and child-protection advocates are calling for a coordinated response that includes stronger reporting pathways, faster platform cooperation, investment in digital-forensics capacity and public education about emerging AI risks. With Fiji’s detection numbers rising and referrals into global clearing houses increasing, officials say urgent international support and clearer regulatory expectations for tech companies are now essential to prevent more children from being harmed.