SUSE plans more support for gen AI workloads

Once IT management figures out whether to run workloads in the cloud or on premises, it can then explore the question of open-source versus proprietary operating systems.

Cost trade-offs

Regarding where generative AI workloads are run, on premises or in the cloud, “there are some cost considerations. The jury is out on the cost tradeoffs,” Iams said. 

For many enterprises, the on-premises vs. cloud debate is more about control than anything else. CIOs and CISOs commonly work out precise settings and customizations tailored to their enterprise's environment, only to find those decisions overwritten by a cloud staffer who changes settings universally for all cloud tenants.

“The universal business model is that the CIO wants throats to choke,” Weinberg said, referring to the control a CIO has over employees and contractors the enterprise has hired directly, as opposed to staff working for the cloud vendor.

As for the software, Iams said that “open source is not always going to be cheaper than closed source. There is this perception that open source is cheap, but someone has to get all of it to work together.”

That is precisely part of SUSE's argument: that it will deliver a suite of all the elements needed to support gen AI deployments, with every element tested to work well together.
