Neural Processing Units (NPUs): The dawn of AI PCs

In May, Microsoft unveiled a new category of PCs: Copilot+ PCs, the first of a new generation of AI laptops, desktops, and tablets that enable AI processing to happen at the edge. These devices can handle such intensive workloads thanks to the presence of neural processing units (NPUs) on a system on a chip (SoC). NPUs are a new addition to the usual lineup of central processing units (CPUs) and graphics processing units (GPUs), and they open up enormous possibilities for new AI-powered capabilities and improved power efficiency.

First, let’s talk about NPUs. NPUs enable local devices to handle AI workloads. These chips are built to accelerate the computations behind artificial neural networks, completing trillions of operations per second (TOPS), all in parallel. They process AI workloads with particular efficiency, providing excellent performance while consuming very little power for the amount of work they do. When combined with the CPU and GPU, the NPU can take on AI tasks, freeing the other chips for other types of workloads.
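To make this more concrete, here is a minimal sketch of how an application might route an inference workload to the NPU while keeping the CPU as a fallback. It assumes ONNX Runtime with the QNN execution provider, which targets Qualcomm NPUs; the model file and input name are illustrative placeholders, not taken from this article.

```python
# Minimal sketch: prefer the NPU-backed execution provider, fall back to CPU.
# Assumes an ONNX Runtime build with QNN support is installed; the model file
# and input name are hypothetical placeholders.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession(
    "noise_suppression.onnx",  # hypothetical local model file
    providers=["QNNExecutionProvider", "CPUExecutionProvider"],
)

# Run one inference; the input name and shape depend on the actual model.
audio_frame = np.zeros((1, 480), dtype=np.float32)
outputs = session.run(None, {"input": audio_frame})
print("Providers in use:", session.get_providers())
```

Because the heavy matrix math runs on the NPU, the CPU and GPU stay free for the rest of the application, which is exactly the division of labor described above.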

Typically, the large language models (LLMs) that form the foundation of generative AI live in the cloud, which has the massive scale required to store and run them. But as AI advances, these models are becoming smaller, more focused, and practical to run locally. The result is a local machine that’s far more powerful, performant, and efficient than its predecessors.

So, what are the major benefits of an AI PC, a designation that requires an NPU capable of at least 40 TOPS? There are many:

  • Incredible performance: There’s nothing like a concrete example. One ISV that focuses on endpoint security recently moved its file-scanning service from a CPU-and-GPU setup over to an NPU, and the service now runs seven times faster. Nearly all applications, not just AI, will be lightning fast.
  • Hyper-efficient power consumption: When Teams runs on a GPU, the app consumes nine watts, primarily to perform noise cancellation in the background. When Teams moved to the NPU, it used just nine milliwatts. That’s just sipping power, so much so that an AI PC can outlast a smartphone, providing multi-day battery life. For an enterprise supporting thousands of PCs, these energy savings quickly add up (see the back-of-envelope sketch after this list), helping the organization make significant progress toward its energy-efficiency goals.
  • Powerful new capabilities: With AI running locally, all kinds of new capabilities become possible. Copilot can take meeting notes, summarize them, and create action items, saving hours of time. Working with colleagues on different continents who speak different languages? Turn on real-time translation. As software vendors learn more about the possibilities of NPUs, we’ll certainly see capabilities that were previously unthinkable on PCs.
  • Simple PC management: AI PCs also improve security and data protection for employees wherever they are located. Copilot+ PCs powered by Snapdragon® X Series processors enable easy remote management, flexible cloud usage, and improved energy efficiency.
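To put those power figures in perspective, here is a rough back-of-envelope sketch of annual fleet-wide savings using the nine-watt and nine-milliwatt numbers cited above. The fleet size, daily call hours, and workdays per year are illustrative assumptions, not figures from this article.

```python
# Back-of-envelope estimate of energy saved by moving noise cancellation
# from the GPU (~9 W) to the NPU (~9 mW). Fleet size and usage figures are
# hypothetical assumptions for illustration only.
GPU_WATTS = 9.0
NPU_WATTS = 0.009           # 9 milliwatts
PCS = 5_000                 # hypothetical fleet size
HOURS_PER_DAY = 2           # hypothetical hours of calls per PC per day
WORKDAYS_PER_YEAR = 250     # hypothetical workdays per year

watts_saved_per_pc = GPU_WATTS - NPU_WATTS
kwh_saved_per_year = (
    watts_saved_per_pc * HOURS_PER_DAY * WORKDAYS_PER_YEAR * PCS / 1000
)
print(f"Estimated savings: {kwh_saved_per_year:,.0f} kWh per year")
# -> roughly 22,500 kWh per year for this one workload, under these assumptions
```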

The 12-core Snapdragon X Elite powers high-performance Copilot+ AI PCs, while the ten-core and eight-core Snapdragon X Plus will enable OEMs to hit that sweet spot of commercial price points between $800 and $1,000 for PCs. With NPUs reaching 45 TOPS, both the Snapdragon X Elite and Snapdragon X Plus bring a whole new world of power, battery efficiency, security, and experience to organizations.

Watch this webinar to learn about AI PCs, including Copilot+ PCs, and the groundbreaking performance of Snapdragon X Series processors.

Snapdragon branded products are products of Qualcomm Technologies, Inc. and/or its subsidiaries.
