Meta removes 63,000 Nigerian accounts linked to cybercrime
Meta Platforms, the parent company of Facebook, Instagram, and WhatsApp, has removed 63,000 Nigerian accounts associated with the notorious “Yahoo Boys”, as online scammers are known in Nigeria.
The company announced this in its Q1 2024 Adversarial Threat Report, published on Wednesday.
The announcement comes barely a week after the Federal Competition and Consumer Protection Commission fined Meta $220m for multiple violations of data protection regulations.
The accounts, removed over the past few weeks, were reportedly used for financial sextortion scams and to distribute blackmail scripts.
According to Meta, “Financial sextortion is a borderless crime, fuelled in recent years by the increased activity of Yahoo Boys, loosely organised cybercriminals operating largely out of Nigeria, that specialise in different types of scams.
“We have removed around 63,000 accounts in Nigeria attempting to target people with financial sextortion scams, including a coordinated network of around 2,500 accounts.
“We have also removed a set of Facebook accounts, pages, and groups run by Yahoo Boys – banned under our Dangerous Organizations and Individuals policy – that were attempting to organise, recruit, and train new scammers.”
Describing its investigation, Meta said most of the scammers’ attempts were unsuccessful, though some had targeted minors; those cases were reported to the National Center for Missing & Exploited Children (NCMEC).
Meta revealed that it also shared information with other technology companies via the Tech Coalition’s Lantern programme, which was set up to help curb such scams across different platforms.
Meta further said it had removed around 7,200 assets in Nigeria, comprising 1,300 Facebook accounts, 200 pages, and 5,700 groups that were providing scam-related resources.
These assets, Meta noted, were offering scripts and guides for scams and sharing links to collections of photos for creating fake accounts.
To protect users, especially teens, Meta disclosed that it has implemented stricter messaging settings for users under 16 (under 18 in certain countries) and displays safety notices to encourage cautious behaviour online.
The company added, “We also fund and support NCMEC and the International Justice Mission to run Project Boost, a programme that trains law enforcement agencies around the world in processing and acting on NCMEC reports.
“We’ve conducted several training sessions so far, including in Nigeria and Côte d’Ivoire, with our most recent session taking place just last month.”