Growing 'AI Waste' Calls For Responsible AI Frameworks

Sean Carroll

When Brighton-based Vixen Digital published its analysis of AI-driven content failures in March 2025, documenting an 80% ranking collapse from aggressive AI scaling, it served as an early warning signal. Eight months later, the patterns it identified have only intensified, new research reveals that AI's environmental costs are far higher than previously claimed, and we all continue to pay the price.

"The problem hasn't gone away; if anything, we're fielding more inquiries from businesses in this situation now than we were in the spring. Essentially, we've seen a rise in what we call 'AI waste': the growing volume of low-value AI output that weakens business performance, damages trust in the marketing industry, raises sustainability and ethical questions, and contributes to a declining digital ecosystem," said Sean Carroll, Director of Vixen Digital.

The same trend has been observed by others. Mary Kemp, award-winning co-founder of Brighton AI and Simpler with AI, adds:

"We're seeing the same pattern with our clients. Teams without proper AI training can burn through 10 to 20 prompts trying to get one usable result. Train them on structured workflows and that drops to 2 to 3 attempts. Every wasted iteration has an environmental cost most organisations don't even realise exists."

Vixen Digital has now become one of the first agencies in the UK to achieve ISO/IEC 42001 certification for Responsible AI Management alongside ISO/IEC 27001 for Information Security.

But Carroll stresses that you don't need an accreditation to make a difference.

"Anyone can start developing internal frameworks to use AI responsibly. We had them even before the ISO, and they forced us all to ask the right questions before damage is done," says Carroll.

Mary Kemp adds, "You don't need certification to start using AI responsibly. Train your team on effective prompting and reusable workflows, and you immediately cut the waste. A well-crafted prompt works in three tries. A vague one can take 50. That difference is where responsible AI actually starts."

When AI Strategies Backfire: A Cautionary Tale

Throughout 2025, the agency has audited multiple sites that scaled aggressively using AI-powered content strategies. In a documented case published in Search Engine Land in March, the results were stark:

  • 80% of top-performing pages lost their query rankings entirely (devalued, not just dropped)

  • Over 90% content duplication was detected across the site

  • A red-flag combination of high domain authority and remarkably low brand search volume

  • Catastrophic revenue impact as organic traffic collapsed

"This wasn't a manual penalty from Google," Carroll explained. "The algorithms simply recognised what was obvious to us: mass-produced content with minimal differentiation, engagement metrics that appeared inflated, and a site that looked over-optimised for search engines rather than built for users."

The Hidden Costs of AI: New Research Reveals Alarming Impact

A November 2025 study published in Nature's Scientific Reports challenges earlier claims that AI is more environmentally friendly than human work, revealing the true cost of AI-driven workflows. The peer-reviewed research by N.H. Woo is the first correctness-controlled analysis to quantitatively compare the environmental impacts of human and AI programmers.

According to figures previously shared by OpenAI, the average ChatGPT prompt uses around 0.34 watt-hours, a tiny amount on its own. But at an estimated 2.5 billion queries per day, that quickly scales to 850 megawatt-hours daily, enough to charge thousands of electric vehicles. Over a year, this equates to nearly one trillion queries and energy consumption comparable to powering around 29,000 U.S. homes annually.
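
For readers who want to sanity-check those figures, the arithmetic is a simple back-of-envelope calculation. The short Python sketch below reproduces it, assuming OpenAI's reported 0.34 watt-hours per prompt, the 2.5 billion queries-per-day estimate cited above, and roughly 10,700 kWh of annual electricity use for an average U.S. home (that last figure is an assumption introduced here for illustration, not one stated in the article).

```python
# Back-of-envelope check of the energy figures cited above.
# Assumptions: 0.34 Wh per prompt (OpenAI's reported figure),
# 2.5 billion prompts per day, and ~10,700 kWh per year for an
# average U.S. home (illustrative assumption, not from the article).

WH_PER_PROMPT = 0.34            # watt-hours per ChatGPT prompt
PROMPTS_PER_DAY = 2.5e9         # estimated daily queries
US_HOME_KWH_PER_YEAR = 10_700   # assumed average annual U.S. household use

daily_wh = WH_PER_PROMPT * PROMPTS_PER_DAY            # 850,000,000 Wh
daily_mwh = daily_wh / 1e6                            # 850 MWh per day

yearly_queries = PROMPTS_PER_DAY * 365                # ~912.5 billion (~1 trillion)
yearly_kwh = (daily_wh / 1e3) * 365                   # ~310 million kWh (~310 GWh)
homes_equivalent = yearly_kwh / US_HOME_KWH_PER_YEAR  # ~29,000 homes

print(f"Daily energy: {daily_mwh:,.0f} MWh")
print(f"Yearly queries: {yearly_queries:,.0f}")
print(f"Equivalent U.S. homes powered per year: {homes_equivalent:,.0f}")
```

Running the sketch gives roughly 850 MWh per day, just over 910 billion queries per year, and an equivalent of about 29,000 homes, consistent with the figures above.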

With Google now turning every search into an AI-driven query through AI Overviews, the environmental footprint of “routine digital activity” is expanding faster than most people realise.

"The Nature study confirms what we've been seeing in practice," Carroll noted. "The most concerning finding is that failed AI attempts, which happen more often than people realise, waste exponentially more resources than successful ones. When agencies automate content creation without proper quality control, they're not just risking their clients' rankings, they're burning massive amounts of energy producing outputs that will never see the light of day or will be penalised by Google within months."

AI models also continue to exhibit bias and hallucination issues. Recent examples include cases in the early stages of litigation, such as Harper v. Sirius XM Radio, in which an AI-powered hiring system allegedly downgraded applications unfairly by relying on proxies that disproportionately disadvantage Black candidates. We have also seen the outcomes of earlier cases, such as iTutor Group agreeing in 2023 to pay $365,000 to settle its own AI discrimination lawsuit.