AI Adoption Trends and Lessons Learned: An Expert Q&A
Over the past six years, NVIDIA DGX Systems have helped power artificial intelligence projects across enterprises both large and small. I recently spoke with Charlie Boyle, the Vice President and General Manager of DGX Systems at NVIDIA, about the big AI trends he has seen and what’s coming next.
Q: Have you been surprised by the growth in AI adoption and how it’s being used?
Boyle: We have been surprised by how many organizations organically discover AI use cases or applications. They have brilliant folks working for them who say, ‘Hey, how about this?’ The great thing about AI today is that it doesn’t take years to come up with a successful project. It may only take hours, days or a few weeks to come up with something that is not only useful to the company but that can also be rolled into production.
Other surprises are the velocity with which companies have adopted AI, and the breadth of businesses looking at the technology. We have found that the biggest reason they’re embracing it is to improve their customer relationships. At the end of the day, that’s what many brands are striving for: How do I meet my customers where they’re at and give them a great experience?
Q: What’s the current state of the AI market?
Boyle: We are at the tip of the spear in this market. About 18 months ago, companies were starting AI projects and they quickly accelerated. Many organizations initially thought they needed ‘a bit’ of infrastructure to evaluate their early proof of concept results. That quickly changed to: ‘If I want this to be successful, I need a whole lot of infrastructure and some great software to take on these projects, and it will really change my business.’ Right now, AI is a lot more mainstream than it was just a few years back, and innovation is happening across all kinds of industries.
Q: What AI advice would you give IT leaders?
Boyle: One trend we’ve seen is companies creating centers of excellence to support AI across the enterprise. That way, different departments don’t have to procure their own AI infrastructure. For example, one business unit may have a project that must get done quickly, so it needs more resources. That project might be completed in two months, but there may be 20 other groups waiting to start AI projects of their own.
Having a flexible, shared infrastructure is important because the last thing you want is for someone with a really great idea to hear from IT: ‘Sure, I can help you with that in a few months.’ The response should be: ‘Yes, I can help you, and depending on the size of the request, I should be able to help you get started within 24 hours.’
If you’re not ahead of the game and embracing the technology, lines of business are going to acquire it anyway, and you’re stuck stitching together the different bespoke infrastructures they all decided to buy.
Put a roadmap in place that includes external expertise. For example, NVIDIA DGX SuperPOD™ and products from our partners help CIOs and IT departments take a leadership position rather than reacting to initiatives started by lines of business.
Finally, don’t be afraid of AI; it’s not a difficult concept. It physically looks a little different, but it still behaves just like all the servers IT is running today. And there are so many resources available for IT organizations. The work is understanding the new terminology, applying it to infrastructure decisions, and then making sure there’s a plan for future capacity.
Learn more about how companies can get started quickly on their AI journey with NVIDIA DGX Systems, powered by NVIDIA A100 Tensor Core GPUs and AMD EPYC CPUs.