When AI-generated nude images of singer Ayra Starr spread across X last month, the platform responded with remarkable indifference: no policy violated, nothing to see. Actress Kehinde Bankole endured the same violation earlier this year. Across Nigeria, women's faces and bodies are being digitally stripped, weaponised, and circulated in acts of violence that would trigger national outrage if they occurred offline. Instead, they vanish into a legal void. Our institutions pretend they cannot see what is happening. But the truth is starker: they have chosen not to look.
One victim told the fact-checking site DUBAWA: "This post is causing more insults and humiliation to me, and as it spreads, it ruins my reputation." When she reported it to X (formerly Twitter), the platform claimed there was no violation of its policies, and it faced no consequences. According to Gatefield's "State of Online Harms Report", X accounts for 34 per cent (more than one in three) of documented online harms in Nigeria.
At NarratEQ, we built an index to test whether any system exists in Nigeria to prosecute online violence against women. We assessed the essentials: laws naming the harm, digital evidence protocols, mechanisms that force platforms to cooperate, trained investigators, and victim support systems. Across all 36 states and the FCT, the answers came back the same: No. No. No. No. And no.
Meanwhile, the violence is unmistakable. Globally, 38 per cent of women say they have experienced online abuse, according to the Economist Intelligence Unit. Nigeria exceeds that average: Gatefield data shows that women are the targets in 58 per cent of documented online harms. And yet, policymakers still treat online spaces as trivial, unserious, or somehow separate from "real" violence. That denial enables what happened recently when Nigerian men used Grok, X's AI chatbot, to digitally undress women and share the images for sport. One of those women was Ayra Starr.
The law, where it exists, is failing. Nigeria amended its Cybercrimes Act in 2024 to include harassment and cyberstalking, but the updates do not recognise gender-specific online harm. The Act is instead routinely misused to intimidate journalists and critics. It has teeth, just not for the people who need protection.
The operational gaps are even deeper. Women already face impossible burdens when reporting physical violence: they are expected to produce perfect evidence, perfect witnesses, perfect narratives. Online violence multiplies these burdens to the point of absurdity. Abuse flashes across WhatsApp, Instagram, Facebook, X, and Telegram at once. A deepfake can circulate on encrypted channels before a victim even discovers that it exists. There is no unified case file, no chain of creation, no binding obligation on platforms to cooperate. And when the perpetrators, servers, and platforms are scattered across global jurisdictions, Nigerian law enforcement has nowhere to reach.
That is because the necessary infrastructure simply does not exist. Nigeria has no trained digital forensics teams to authenticate AI-generated content. No investigators capable of mapping coordinated digital harassment campaigns. No prosecutors skilled in building chains of custody for digital evidence. No standardised evidence preservation protocols. And no specialised support systems helping women document and report abuse without retraumatisation.
When the state lacks the machinery to enforce consequences, impunity becomes the default. And that impunity sends a clear message: The digital space is unsafe terrain for Nigerian women, and nothing will be done about it. It warns women in public life (artists, journalists, politicians, activists) that their participation comes with the risk of technological violence that will go unaddressed.
Other countries show this is possible. Mexico's Olimpia Law, named after a revenge porn survivor who campaigned for change, explicitly criminalises online gender-based violence. France's 2024 SREN law does the same. The US TAKE IT DOWN Act criminalises the non-consensual sharing of intimate images, both authentic photos and AI-generated deepfakes. These laws work because they do what Nigeria refuses to do: name the harm, build enforcement mechanisms, and hold platforms accountable.
The Ayra Starr incident was not an anomaly; it was inevitable. Technology will continue evolving, and abusers will continue adapting faster than our laws. The real test is whether Nigeria chooses to evolve its protections with equal urgency.
Because the truth is that nothing happened when online violence targeted her. Nothing happened when it targeted countless unnamed women. But that "nothing" is not a natural state of affairs. It is the byproduct of political choices, legislative absences, and institutional neglect. And like all choices, it can be remade. And it must.
Farida Adamu is an adjunct professor at the America Business College in Paris. She is the Insights and Analytics Lead at Gatefield, the product lead for NarratEQ, and a principal researcher for the State of Online Harms in Nigeria report.