A quick look at what happened
Nigeria’s National Information Technology Development Agency (NITDA) revealed that Google, LinkedIn and TikTok removed more than 28 million accounts tied to scams, impersonation and other abuses in 2024. Platforms also pulled down roughly 58.9 million pieces of content that violated policies, although some removed content (about 420,000 items) was later reinstated after appeals.
The scale of the cleanup in real terms
- 28+ million accounts removed across Google, LinkedIn and TikTok in 2024.
- Google: ~9.68 million accounts deactivated.
- LinkedIn: nearly 16 million accounts removed — a figure NITDA flagged as worrying given LinkedIn’s professional focus.
- Content: 58,909,122 pieces removed; 420,439 of those later reinstated after review or appeal.
- User reports: Nigerians filed 754,629 complaints to platforms in the same period.
What pushed regulators and platforms to act
NITDA framed the removal campaign as part of a broader effort to reduce online fraud and improve crisis response. False information and impersonation pose direct economic and safety risks — from social-engineering scams to misinformation that sparks panic or financial loss. The agency said platform cooperation has improved communication channels and supports Nigeria’s data protection and digital safety frameworks.
The surprise hidden in LinkedIn’s data
LinkedIn’s removal of nearly 16 million accounts alarmed NITDA leadership because LinkedIn is typically a professional network. The high figure suggests scammers are increasingly abusing even professional platforms for impersonation and social-engineering attacks — a reminder that no single app is immune.
What’s really going on beneath the surface
A huge takedown doesn’t automatically mean safer spaces
Removing millions of accounts is a blunt but visible metric. It can reduce scammer capacity quickly, but it’s not a silver bullet. Bad actors reappear on new accounts, use encrypted apps, or move to peer-to-peer marketplaces. Effective long-term reduction in fraud requires better detection, cross-platform data sharing, and partnership with financial institutions to cut the payout rails.
Why the number of reinstated posts actually matters
About 420,000 removed posts were reinstated after review — roughly 0.7% of all takedowns. This underscores two realities: automated moderation systems produce false positives, and governments and platforms must balance rapid takedowns with fair, transparent appeals processes so that legitimate speech is not silenced by mistake.
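The size of that share can be checked directly from the figures NITDA reported. A minimal sketch, using only the removal and reinstatement counts quoted above:

```python
# Figures reported by NITDA for 2024 (quoted in the article above)
removed_items = 58_909_122  # pieces of content removed by the platforms
reinstated = 420_439        # items restored after review or appeal

# Reinstatement rate as a fraction of all removals
reinstatement_rate = reinstated / removed_items
print(f"Reinstatement rate: {reinstatement_rate:.2%}")  # ≈ 0.71%
```

Even a sub-1% error rate translates into hundreds of thousands of wrongly removed posts at this scale, which is why the appeals channel matters.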
How this affects everyday users, companies, and policymakers
- Individuals: Stay vigilant about impersonation and double-check profiles before sharing sensitive info. Use platform safety tools and report suspicious accounts.
- Businesses: Strengthen social engineering awareness training for staff and verify recruitment or vendor outreach through independent channels.
- Policymakers: Prioritise cross-border cooperation, support transparent moderation appeals, and invest in digital literacy campaigns so citizens can better spot scams.
What regulators and platforms should learn from this episode
NITDA pointed to closer ties with platforms and Nigeria’s Data Protection Regulation as key to progress. Still, the episode spotlights governance challenges:
- Speed vs. accuracy: Platforms must act quickly during crises, but automated takedowns require robust appeal channels to correct mistakes.
- Cross-platform intelligence: Fraud networks operate across services — information-sharing frameworks (with privacy safeguards) could improve detection.
- Measurement of success: Beyond takedown counts, regulators should track fraud reduction, user harm prevented, and re-offense rates to measure real impact.
What the next few months could reveal
- Whether platforms publish more granular transparency reports (detailing enforcement categories, detection methods, and reinstatement reasons).
- Policy updates from NITDA or the Nigerian Data Protection Commission around moderation standards and cross-border data cooperation.
- How private-sector partners (banks, telcos) and platforms coordinate to disrupt the financial incentives behind fraud.
