Google’s environmental report hints at enterprise cost-saving tactics

The flip side of this approach is that newer equipment can often be far more energy efficient, depending of course on the age and specifications of the hardware being replaced. So is keeping old systems longer good or bad for the environment?
Google said it continued to match its global energy consumption with renewable energy production in 2023, even as its data center electricity consumption grew 17%, and its total greenhouse gas emissions increased by 13%.
“We see our growing infrastructure as an opportunity to drive the innovations and investments needed to power a low-carbon economy,” it said.
Google listed a variety of ways it claims to be improving sustainability and power/water efficiency, but provided too little detail to help other enterprises replicate its tactics. Many of the measures read as sales pitches for its own products and services, but a few ideas are worth considering.
Training AI models for less
“We’ve identified tested practices that our research shows can, when used together, reduce the energy required to train an AI model by up to 100 times and reduce associated emissions by up to 1,000 times, which are all used at Google today,” the report noted. “We’ve sped up AI model training through techniques like quantization, boosting large-language model training efficiency by 39%.”
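Google doesn't explain its quantization approach in the report, but the general idea is to store and compute on lower-precision numbers, cutting memory traffic and energy per operation. The sketch below is a generic illustration of symmetric int8 post-training weight quantization using NumPy, not Google's actual method; the function names and the epsilon guard are this article's own choices.

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Map float32 weights to int8 plus a scale factor (symmetric quantization).

    The scale maps the largest absolute weight to 127, the int8 maximum.
    A tiny epsilon avoids division by zero for an all-zero tensor.
    """
    scale = max(np.abs(weights).max() / 127.0, 1e-12)
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float32 weights from the int8 tensor and scale."""
    return q.astype(np.float32) * scale

# Example: a small weight matrix round-trips with error bounded by the scale.
w = np.linspace(-1.0, 1.0, 16, dtype=np.float32).reshape(4, 4)
q, s = quantize_int8(w)
w_hat = dequantize(q, s)
print(q.dtype)                        # int8 storage: a quarter of float32's memory
print(np.abs(w - w_hat).max() <= s)   # reconstruction error stays within one scale step
```

The energy argument follows from the storage and bandwidth numbers: int8 tensors take a quarter of float32's memory, so less data moves between memory and compute units, and integer arithmetic units are cheaper per operation than floating-point ones.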
As for the custom tensor processing unit (TPU) chips it has designed for applications like this, “Our TPU v4 was 2.7 times more energy efficient than our TPU v3 and we’ll soon offer Nvidia’s next-generation Blackwell GPU to Cloud customers, which Nvidia estimates will train large models using 75% less power than older GPUs to complete the same task,” Google said. “Additionally, our new Google Axion Processors are up to 60% more energy efficient than comparable current-generation x86-based instances. These advancements, including AI-powered optimizations like AlphaZero, show how we’re constantly improving hardware efficiency.”