Welcome to Eye on AI, with AI reporter Sharon Goldman. In today’s issue: A new effort to bring sustainability back into the AI conversation…Cerebras prices IPO above expected range…Anthropic is now courting small business owners…and a court filing shows Sam Altman holds a stake of more than $2 billion in companies that dealt with OpenAI.
Over the past couple of years, public discussions about AI sustainability have largely been drowned out by headlines about the race for computing power, energy, and geopolitical advantage.
But two experts are trying to bring green AI back into the conversation. Sasha Luccioni built a prominent profile over the past five years as AI & climate lead at open source AI company Hugging Face. Now, she and Boris Gamazaychikov, the former head of AI sustainability at Salesforce, say they plan to help organizations make AI sustainability practical and measurable — through rigorous studies examining AI’s environmental impacts, research-driven guidance on AI strategy and procurement, and tools and frameworks that developers and business leaders can apply in the real world.
Most companies still care about sustainability goals internally, even if the public discourse has shifted toward “AI race” rhetoric and beating China, she said. The pair’s newly launched Sustainable AI Group will help businesses “better understand the choices that they can make,” she explained—where models are running and what kinds of models are used—to help organizations decarbonize and “de-risk their AI use as much as possible.”
AI can be selected with sustainability in mind
The problem, Luccioni said, is that today’s AI, with its energy-hungry data centers and heat-intensive chips and servers that often require massive cooling systems, is exposing organizations to volatile costs, supply constraints, regulatory uncertainty, and growing pressure from both communities and employees. But, she added, the good news is that every layer of the AI stack can be designed and selected with sustainability in mind, whether that means choosing a fine-tuned small model over a frontier LLM, or running workloads in a data center powered by renewable energy rather than gas.
“I hear a lot of employees being like ‘Hey, we’re really worried about the environmental impacts of using AI in our work, so how do we use it more responsibly?’” said Luccioni, who added that pushback and criticism of AI data centers have become a bipartisan issue both on social media and in government.
There is tremendous confusion, for example, about how much water today’s AI data centers actually require. The reality, Luccioni said, is that cooling systems involve tradeoffs. “Either you’re wasting a ton of water or you’re wasting a ton of energy.”
Traditional water-based cooling systems rely on evaporation, meaning significant amounts of water must continually be replenished, she explained. But closed-loop systems that recirculate water come with their own costs: they require additional energy to continuously cool the water as it moves through the system.
Many use cases don’t require massive models
Either way, Luccioni said the data center debate relies on a narrative that everyone will be using massive, general-purpose LLMs or generative AI models that require huge data centers to run.
But many enterprise use cases, she said, don’t actually require massive frontier models. Instead, companies often need smaller, specialized AI systems tailored to specific tasks—such as optimizing factory energy usage or helping employees search internal documents more efficiently. Those kinds of models can sometimes run locally or on-premise, reducing both energy use and data privacy concerns.
Rather than assuming every problem requires a giant LLM, Luccioni said organizations should start by asking what they actually need AI to do and then choose the simplest, most efficient system capable of accomplishing that task sustainably.
“I think they should flip the question and say what are some things we could improve in our company? And maybe there’s some smaller solution,” she said. “Right now, there’s this FOMO, and people are rushing into it, but given the cost and commitment, it makes more sense for me to think about defining KPIs.”
Luccioni said she has also become more convinced that market demand—rather than criticism alone—may be the strongest lever for change in the AI industry. If enough customers begin prioritizing renewable-powered infrastructure or asking tougher questions about carbon intensity and sustainability, she said, providers will eventually respond. Today, however, many companies still do not fully understand how their AI usage connects to broader sustainability commitments, and clearer communication between AI providers, enterprise buyers, and sustainability teams is still missing.
“Currently the AI market doesn’t distinguish between green and not green,” she explained. “Well, what if we get enough people to start factoring that into their procurement choices?”
Luccioni acknowledged that efficiency gains alone may not solve AI’s environmental challenges, as overall demand for compute keeps rising while organizations expand AI usage. Still, she said she remains cautiously optimistic. “I feel we have enough existing interest, [including] Boris’ work with clients at Salesforce,” she said. “I think there’s a lot we can do.”
With that, here’s more AI news.
Sharon Goldman
[email protected]
@sharongoldman
This story was originally featured on Fortune.com
