Breaking: Anonymous Creator 'RadialB' Fuels AI 'Decline Porn' Wave, Algorithms Amplify to Millions
The core of this intelligence event is the identification and methodology of 'RadialB,' the anonymous originator of a viral AI video trend. Operating under a pseudonym, this individual in his 20s from northwest England—who has never visited Croydon—uses prompts like 'roadmen wearing puffer jackets, track suits, and balaclavas' to generate hyper-realistic, absurd scenes of taxpayer-funded decay in the south London borough. His videos, including one of a grimy, litter-filled water park, are engineered for virality by exploiting a key vulnerability: their realism. 'If people saw it and they immediately knew it was fake, then they would just scroll. The selling point of generative AI models is that they look real,' RadialB stated. This intentional blurring of reality is the operational catalyst.
Key Data Points & Actors:
- Scale: Dozens of copycat accounts have emerged, collectively amassing millions of views on TikTok and Instagram Reels.
- Creator Motivation: RadialB claims his intent is humor and engagement, not politics, stating the goal is to make content 'more and more funny or absurd.' He acknowledges videos 'blew up' because they were 'very graphic.'
- Platform Response & Evasion: RadialB's primary TikTok account was banned for 'graphic or inappropriate' content, but he has already established a new account posting identical material, demonstrating the ineffectiveness of reactive, account-based moderation.
- Labeling Failure: While some videos carry 'AI-generated' labels per platform policies, the BBC found commenters who were 'genuinely convinced' the scenes were real, indicating labels are insufficient to counter visceral, realistic imagery.
- Monetization Vector: RadialB notes other accounts re-share his content 'for views and clicks - and in an effort to monetise the content on other platforms like Facebook,' revealing a nascent disinformation-for-profit ecosystem.
This development differs from previous disinformation waves in its low technical barrier and high visual fidelity. The 'huge jump' in AI tool quality, as noted by the creator, enables a single individual to mass-produce content that was previously the domain of well-resourced state or political actors.