How to Build an AI Infrastructure to Support NLP
When enterprises began deploying AI infrastructure solutions almost six years ago, they were breaking new ground in AI exploration, leading-edge research and “big science” challenges.
Since then, many businesses have focused their AI ambitions on more pragmatic use cases, including revolutionizing customer care, improving factory efficiency, delivering better clinical outcomes, and minimizing risk.
Today, we’re witnessing the rise of natural language processing (NLP), perhaps the biggest enterprise computing challenge of our time and an essential capability for businesses everywhere.
E-commerce giants are employing translation services for chatbots to support billions of users worldwide. Major manufacturers like Lockheed Martin are using NLP to enable predictive maintenance by processing data entered by technicians, exposing the clues in unstructured text that are precursors to equipment downtime.
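To make the idea of mining technician notes for downtime precursors concrete, here is a minimal sketch that trains a simple text classifier on toy maintenance log entries. The dataset, labels, and TF-IDF model are illustrative assumptions, not any manufacturer’s actual pipeline; a production system would use far larger corpora and domain-tuned language models.

```python
# Illustrative sketch only: flagging maintenance notes that tend to precede downtime.
# The tiny dataset and TF-IDF + logistic regression model are assumptions for
# demonstration purposes; real deployments use domain-tuned language models.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical technician log entries (unstructured text) with downtime labels.
notes = [
    "slight vibration noticed on pump bearing during shift",
    "oil level topped off, no issues observed",
    "intermittent grinding noise from gearbox, tightened housing",
    "routine inspection completed, all readings nominal",
    "temperature spike on motor, reset breaker",
    "replaced air filter per schedule",
]
led_to_downtime = [1, 0, 1, 0, 1, 0]  # 1 = equipment failed soon after (assumed label)

# TF-IDF features + logistic regression: a minimal stand-in for an NLP model.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(notes, led_to_downtime)

# Score a new note for downtime risk.
new_note = ["unusual vibration and noise near main bearing"]
risk = model.predict_proba(new_note)[0][1]
print(f"Estimated downtime risk: {risk:.2f}")
```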
Such efforts are happening around the world. In Vietnam, for example, VinBrainAI is building clinical language models that enable radiologists to streamline their workflow and achieve up to 23% more accurate diagnoses using better summarization and analysis of patient encounters.
What these organizations have in common is their desire to implement large-scale AI infrastructure that can train models to deliver incredible language understanding with domain-specific vocabulary. The reality is that large language models, deep learning recommender systems, and computational graphs are examples of data-center-sized problems that require infrastructure on a whole new scale.
To take advantage of this opportunity, more businesses are implementing AI centers of excellence (CoE), based on shared computing infrastructure, that consolidate expertise, best practices and platform capabilities to speed problem-solving.
The right architectural approach to an AI CoE can serve two critical modes of use:
- Shared infrastructure that serves large teams and all the discrete projects that developers may need to run on it
- A platform on which gigantic, monolithic workloads like large language models can be developed and continually iterated upon over time
The infrastructure supporting an AI CoE requires a massive compute footprint, but more importantly, it must be architected with the right network fabric and managed by a software layer that understands the platform's topology, the profile of available resources, and the demands of the workloads presented to it.
The software layer is just as important as the supercomputing hardware. It provides the underlying intelligence and orchestration capability that can enable a streamlined development workflow, rapidly assign workloads to resources, and parallelize the biggest problems across the entire platform to achieve the fastest training run possible.
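As a rough illustration of what “parallelize the biggest problems across the entire platform” looks like in practice, the sketch below uses PyTorch’s DistributedDataParallel for data-parallel training. The model, data, and launch parameters are placeholders; in an AI CoE, the orchestration layer described above would allocate the nodes and launch a script like this across the cluster (for example via `torchrun`).

```python
# Minimal sketch of data-parallel training with PyTorch DistributedDataParallel (DDP).
# The toy model and random data are assumptions for illustration; a real workload would
# be launched across nodes by the cluster's orchestration layer, e.g.:
#   torchrun --nnodes=4 --nproc_per_node=8 train.py
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    # Process group setup; rank and world size are provided by the launcher (torchrun).
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # Placeholder model standing in for a large language model.
    model = torch.nn.Linear(1024, 1024).cuda(local_rank)
    ddp_model = DDP(model, device_ids=[local_rank])
    optimizer = torch.optim.AdamW(ddp_model.parameters(), lr=1e-4)

    for step in range(10):
        # Each rank trains on its own shard of data; DDP averages gradients across ranks.
        inputs = torch.randn(32, 1024, device=local_rank)
        targets = torch.randn(32, 1024, device=local_rank)
        loss = torch.nn.functional.mse_loss(ddp_model(inputs), targets)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```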
While the AI CoE is taking flight in enterprises across industries, many organizations are still working out how to infuse their business with AI and the infrastructure needed to get there. For the latter, new consumption approaches are gaining traction that pair supercomputing infrastructure with the businesses that need it, delivered in a hosted model through colocation data centers.
IT leaders can learn more about these trends and how to develop an AI strategy by attending NVIDIA GTC, a virtual event taking place March 21-24 that features more than 900 sessions on AI, accelerated data centers and high performance computing.
NVIDIA’s Charlie Boyle, vice president and general manager of DGX Systems, will present a session titled “How Leadership-Class AI Infrastructure Will Shape 2023 and Beyond: What IT Leaders Need to Know – S41821”. Register for free today.