Get your IT infrastructure AI-ready


By Ken Kaplan

Many enterprises are well down their path toward implementing artificial intelligence, whereas others are still unsure how they’ll use it to run their business. Either way, CIOs and IT teams have many choices to make as AI continues to evolve at lightning speed. 

Some feel that the train is leaving the station and they must get on board now, according to Sean Donahue, senior solutions manager at Nutanix.

“AI is not an option,” he said. “It’s not a speculative market. Companies know they have to go AI; they just haven’t figured out what use cases they will tackle first.” Donahue said this is likely because many don’t fully grasp how their organizations could benefit from using AI. He likened it to when Thomas Edison introduced the incandescent light bulb in 1879.

“It was amazing to see at first sight, and the demo struck awe, but people didn’t understand how to use electricity, especially since there was no infrastructure bringing electricity to their home.”

Artificial intelligence adoption is a challenge many CIOs grapple with as they look to the future. Before jumping in, their teams must possess practical knowledge, skills, and resources to implement AI effectively. 


AI challenges and infrastructure needs

To get their IT operations AI-ready, forward-thinking leaders are re-evaluating their entire IT ecosystem to build the right infrastructure for both existing and future AI-powered functions.

“It takes data scientists, AI engineers, and machine learning operational engineers, and then it takes good infrastructure people, along with the developers who build the apps,” said Rajiv Ramaswami, president and CEO of Nutanix. “The set of tools that you need to put together AI applications and get them going to market, that’s not easy either. On top of that, there is a shortage of hardware.”

AI implementation is costly and the training of AI models requires a substantial investment. “To realize the potential, you have to pay attention to what it’s going to take to get it done, how much it’s going to cost, and make sure you’re getting a benefit,” Ramaswami said. “And then you have to go get it done.”

GenAI has rapidly transformed from an experimental technology to an essential business tool, with adoption rates more than doubling in 2024, according to a recent study by AI at Wharton, a research center at the Wharton School of the University of Pennsylvania. Weekly AI usage among business leaders surged from 37% to 72%, and organizations reported a 130% increase in AI spending since 2023, the report found.

Traditional IT infrastructure is not equipped to handle high-intensity AI requirements like training large language models (LLMs) or processing high-volume, real-time data streams. IT professionals indicated that running AI applications on their current IT infrastructure would be a “significant” challenge, according to the Enterprise Cloud Index report, released by Nutanix in early 2024.

Donahue used a practical car metaphor that demonstrates this challenge. “My 1949 car is not up to today’s demands for performance,” he said. “I’m happy driving it, but I know it’s never going to compete on the highway. In fact, I shouldn’t be running it on the highway because it’s so outdated already.” In other words, most existing IT infrastructure maintains the status quo, but it won’t efficiently meet intense demands from AI workloads.

Just as newer car designs have evolved to meet higher standards for safety, fuel efficiency, and performance, enterprise IT infrastructure must evolve to provide the greater computational power, flexibility, and efficiency that AI applications demand, Donahue said.

Enhanced security measures and governance frameworks are critical as enterprises seek to protect intellectual property and customer data within AI models. This is driving CIOs to seek infrastructure that can manage AI strategically and securely, and be agile enough to handle future innovations and challenges.


Managing IT infrastructure that runs AI

According to Donahue, IT teams are exploring three key elements: choosing language models, leveraging AI through cloud services, and building a hybrid multicloud operating model to get the best of on-premises and public cloud services.

“We’re finding that very, very, very few people will build their own language model,” he said. “That’s because building a language model in-house is like building a car in the garage out of spare parts.”

Companies look to cloud-based language models, but must scrutinize security and governance capabilities while controlling cost over time. “If those things don’t scare me away from using it with my corporate IP and data, then I’m going to realize at the end of the month that I’m paying the hyperscalers because my AI inferencing application – that little query box that my employees use to ask questions – uses cloud GPUs, and those aren’t cheap,” he said.

This brings IT teams to a third step: thinking beyond cloud-based models and considering solutions designed intentionally and specifically to handle AI functionality. 

Donahue pointed to Nutanix’s GPT-in-a-Box, a comprehensive, pre-configured solution that combines hardware and software to support deploying AI models on-premises, in the cloud, or at the edge. It is designed to streamline the deployment and operation of GPT models by providing all the necessary components in a single, integrated package, allowing teams to bring generative AI and AI/ML applications into their IT infrastructure while keeping data and applications under IT control.

Donahue explained that GPT-in-a-Box allows existing IT systems to streamline the processes needed to onboard AI capabilities. It reduces the complexity of selecting compatible components, configuring software, and optimizing performance.

By controlling the entire stack, including hardware, software, and AI layers, IT teams can implement robust security measures designed to safeguard AI environments, including data encryption, secure data access controls, and intrusion detection systems. GPT-in-a-Box also allows teams to manage performance by leveraging the optimal resources for efficiently accessing data in the right location.


Managing apps and data across hybrid multicloud systems

According to Donahue, infrastructure must be at the center of the AI adoption strategy, and one cloud model is poised to be particularly successful: hybrid multicloud.

“Hybrid multicloud is where it’s at,” Donahue said. “AI just speaks to hybrid multicloud because your data sets will be everywhere. You will have to use a solution like unified storage to gather and manage them under one roof.”

Hybrid multicloud environments integrate diverse computing resources and data storage types. They facilitate the efficient data management and processing that is pivotal to the performance of AI systems, particularly when handling extensive and varied data sets spread across multiple locations.

“People who are using hybrid multicloud already will probably have an easier time getting started with their AI efforts,” Donahue said.

Prioritizing infrastructure modernization is essential. Embracing AI effectively demands that enterprises reassess and revitalize their underlying IT systems, focusing on the future and achieving the key scalability, capacity, efficiency, and analytical capabilities required to keep up in a fast-changing IT world.

Learn more about the Nutanix Enterprise AI capabilities in this blog post and video coverage of its release in November 2024.

Ken Kaplan is Editor in Chief for The Forecast by Nutanix. Find him on X @kenekaplan.
