Meta claimed its estimate had been ‘rough and overly inclusive’ but declined to provide an updated figure. It told Which? that user reports about scam ads had declined 50% in the 15 months to November, and that it had removed 134m pieces of scam ad content so far that year.
Which? Scam Alerts group put 'at risk'
Launched four years ago, the group exists to raise awareness of the latest scams and empower people to spot and avoid fraud attempts. With 42,000 members, we believe it’s the largest UK scam-prevention group on Facebook.
In January 2025, we received the first of multiple warnings on Facebook stating that the group ‘is at risk of being disabled, and has reduced distribution and other restrictions, due to Community Standards violations’.
The Which? social media team manages this community with strict criteria for approving posts, to keep everyone safe. For example, any scams shared for awareness and education purposes must be in screengrab format and can’t include links to the actual scam content.
Platforms deluged by scams
This frustrating experience unfolded in the same year Meta announced it would stop working with independent fact-checkers to monitor content in the US.
Although this doesn’t apply in the UK, it’s not something I’d want to see replicated here. Having investigated fraud for almost a decade, it’s clear from my experience that Meta platforms are still deluged with scams – even with the Online Safety Act now in force.
Does Meta really not know how, or does it simply not care enough?
Source: https://www.which.co.uk/news/article/meta-mayhem-why-does-facebook-keep-censoring-the-which-scam-alerts-group-aAOXK6Z4UpOt