3 things CIOs can do to make gen AI synch with sustainability
Whether or not AI delivers on its promises over the long term, CIOs who are required to account for their full carbon impact now need to include AI in their Scope 3 reporting, and that gets complicated very fast. For example, if you run inference with a model that somebody else trained, you should report your share of the CO2 impact of that training. The provider might be able to tell you the overall cost of training, but nobody yet knows how to divvy up that cost among all the users over the lifetime of the model.
“None of this is clear yet, because Scope 3 reporting is new and so is gen AI,” says Niklas Sundberg, chief digital officer and SVP at Swiss global transport and logistics company Kuehne+Nagel. Sundberg knows as much about Scope 3 reporting as anybody, and covers the subject in his book Sustainable IT Playbook for Technology Leaders.
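To make the allocation problem concrete, here is a minimal sketch of one possible apportionment, assuming (hypothetically) that a provider could disclose its total training emissions and the model's lifetime query volume. Every figure and the method itself are illustrative assumptions; no standard approach exists yet.

```python
# Illustrative only: one possible way to apportion a provider's model-training
# emissions to a single customer's Scope 3 ledger. Every figure here is a
# hypothetical placeholder; no standard allocation method exists yet.

def allocate_training_emissions(total_training_tco2e: float,
                                lifetime_queries: float,
                                your_queries: float) -> float:
    """Apportion training emissions by share of lifetime inference volume."""
    return total_training_tco2e * (your_queries / lifetime_queries)

# Assumed inputs: 500 tCO2e to train the model, 10 billion queries served
# over its lifetime, 2 million of them yours.
share = allocate_training_emissions(
    total_training_tco2e=500,
    lifetime_queries=10e9,
    your_queries=2e6,
)
print(f"Your Scope 3 share of training: {share:.3f} tCO2e")  # 0.100 tCO2e
```

Even this simple split begs the questions Sundberg raises: how long the model stays in service, and how many users it ultimately serves, are unknown at reporting time.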
Despite the ambiguities, IT leaders are charging ahead with AI. Along the way, some have discovered three things they can do to mitigate the impact on their own sustainability initiatives. They share them here.
1. Use a big provider to optimize utilization
“We are already advanced users of AI, and one of the things we recommend is to use AI, especially inference, through providers that have shared on-demand AI inference environments,” says Elwin. This makes sense: the more people using a public cloud service, the higher its utilization rates. Better utilization of the resources behind power-hungry AI workloads can make a real difference to your organization’s overall carbon footprint.
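As a rough illustration of the utilization argument, the sketch below spreads a fixed overhead of always-on infrastructure over few versus many jobs. The model and all of the numbers are assumptions added for illustration, not measurements from any provider.

```python
# Rough model of the utilization argument: a cluster's idle and overhead
# energy is emitted whether or not it is busy, so each job's footprint
# shrinks as more jobs share the same infrastructure. Numbers are assumed.

def per_job_kgco2e(active_kwh_per_job: float,
                   overhead_kwh_per_day: float,
                   jobs_per_day: float,
                   grid_kgco2e_per_kwh: float = 0.4) -> float:
    """Per-job emissions: direct energy plus an amortized slice of overhead."""
    amortized_overhead = overhead_kwh_per_day / jobs_per_day
    return (active_kwh_per_job + amortized_overhead) * grid_kgco2e_per_kwh

# Same hardware and overhead, different utilization.
print(per_job_kgco2e(0.05, 2_400, jobs_per_day=1_000))    # lightly used: ~0.98 kg
print(per_job_kgco2e(0.05, 2_400, jobs_per_day=100_000))  # heavily shared: ~0.03 kg
```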
CIOs can take it a step further by asking providers a list of questions, starting with how they train their models and how inference is run. “If you’re only buying inference services, ask them how they can account for all the upstream impact,” says Tate Cantrell, CTO of Verne, a UK-headquartered company that provides data center solutions for enterprises and hyperscalers. “Inference output takes a split second. But the only reason those weights inside that neural network are the way they are is because of massive amounts of training — potentially one or two months of training at something like 100 to 400 megawatts — to get that infrastructure the way it is. So how much of that should you be charged for?”
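Cantrell’s figures give a sense of scale. The back-of-envelope arithmetic below converts one to two months of training at 100 to 400 megawatts into energy and emissions; the grid-intensity value is an assumed average added purely for illustration.

```python
# Back-of-envelope arithmetic on the figures quoted above: one to two months
# of training at 100-400 MW. The grid intensity (0.4 kgCO2e/kWh, equal to
# 0.4 tCO2e/MWh) is an assumed average, not a figure from the article.

HOURS_PER_MONTH = 730            # average month
GRID_TCO2E_PER_MWH = 0.4         # assumption; varies widely by grid and power contracts

for months, megawatts in [(1, 100), (2, 400)]:
    mwh = months * HOURS_PER_MONTH * megawatts
    tco2e = mwh * GRID_TCO2E_PER_MWH
    print(f"{months} month(s) at {megawatts} MW ≈ {mwh:,.0f} MWh ≈ {tco2e:,.0f} tCO2e")
```

On these assumed inputs the training run alone lands somewhere between roughly 73,000 and 584,000 MWh, which is exactly why the question of how that total should be split across customers matters.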
Cantrell urges CIOs to ask providers about their own reporting. “Are they doing open reporting about the full upstream impact that their services have from a sustainability perspective? How long is the training process, how long is it valid for, and how many customers did that weight impact?”