Your next Windows computer upgrade will most likely include a new type of processor: the NPU. It is the most important requirement for Microsoft's next generation of Windows computers, Copilot+ PCs.

While the new Snapdragon processors and Microsoft’s requirements brought NPUs into the spotlight, they’ve been around longer than you think.

However, these new processors mean you no longer have to rely on cloud servers to run generative AI tasks.

So what are NPUs? How do they work? What are their benefits and drawbacks? I’ll explain everything in this article.

What Is an NPU?

NPUs (short for neural processing units) are dedicated chips that execute complex AI operations. They process multiple data points in parallel and simulate brain neurons. This computing method is the ideal way to run neural networks and tackle functions like natural language processing, image recognition, and generative tasks.

As a result, they are faster than CPUs and GPUs at these and other AI use cases.

Neural networks are machine learning models that teach computers to learn from and process data in a way loosely inspired by the human brain. They mimic its structure by connecting nodes (artificial neurons) in layers.
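To make this concrete, here is a minimal, illustrative sketch (not any vendor's actual implementation) of the core computation in one neural-network layer: each output neuron is a weighted sum of all inputs plus a bias, passed through an activation function. These multiply-accumulate operations are exactly what NPUs run in parallel.

```python
def dense_layer(inputs, weights, biases):
    """Compute one fully connected layer: activation(weights @ inputs + biases)."""
    outputs = []
    for row, bias in zip(weights, biases):
        # Each output neuron is a weighted sum of every input (multiply-accumulate).
        total = sum(w * x for w, x in zip(row, inputs)) + bias
        # ReLU activation: negative sums become zero.
        outputs.append(max(0.0, total))
    return outputs

# Example: a tiny layer with 3 inputs and 2 output neurons.
weights = [[0.5, -1.0, 0.25],
           [1.0,  0.5, -0.5]]
biases = [0.1, -0.2]
print(dense_layer([1.0, 2.0, 3.0], weights, biases))
```

A real network stacks many such layers with millions of weights, which is why parallel hardware makes such a difference.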

As mentioned, NPUs are not as new as the current hype suggests. Long before the Microsoft Build event, they had already appeared in CPUs, GPUs, smartphones, and servers under different names, such as tensor cores, neural engines, and DSPs (digital signal processors).

For example:

  • Intel and AMD built NPUs into their flagship Meteor Lake and Phoenix CPUs.
  • NVIDIA RTX series GPUs include built-in AI cores (tensor cores) that power features like DLSS 3.
  • Qualcomm has used its Hexagon NPU since its Snapdragon 855 smartphone chipset in 2018.
  • Apple introduced its first Neural Engine in the A11 Bionic in 2017.

How NPUs have evolved

NPUs have also evolved considerably, driven primarily by the ever-expanding list of AI use cases, especially generative AI.

Earlier versions focused on less complicated object identification use cases, which relied on simpler neural networks known as CNNs (convolutional neural networks). But AI video and photography increased the stakes. NPUs that could handle more advanced neural network models were developed as a result.
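The building block of those early CNN workloads is the convolution: sliding a small kernel of weights across the input and summing products at each position. A minimal, illustrative 1D version (real image CNNs use 2D kernels over pixel grids) looks like this:

```python
def convolve_1d(signal, kernel):
    """Slide the kernel across the signal, summing products at each step."""
    k = len(kernel)
    return [
        sum(signal[i + j] * kernel[j] for j in range(k))
        for i in range(len(signal) - k + 1)
    ]

# A simple difference kernel highlights where the signal jumps --
# the 1D analogue of the edge detection early NPUs accelerated.
print(convolve_1d([0, 0, 1, 1, 0], [-1, 1]))
```

Because every window position can be computed independently, convolutions map naturally onto the parallel hardware inside an NPU.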

Generative AI, which involves using AI to generate new content, brought even higher power and performance demands. 

Creating new text, image, video, and audio output requires more memory and data transfer, which can strain limited bandwidth and spike energy consumption. Today's NPUs solve these problems for smaller devices because they are integrated into larger systems on chips (SoCs).

What are SoCs, and why are they important?

What most people call CPUs today are systems on chips or systems-on-a-chip. They are integrated circuits that combine different processing units, such as CPUs, GPUs, and NPUs, on a single die. This chip design significantly saves space, improves power efficiency, and boosts performance.

The CPU hands off work to the GPU, TPU, or NPU when users initiate specific tasks. If these processors were separate chips, the CPU would have to transfer data over relatively long distances on the same board, consuming more bandwidth and driving up energy usage. Laptops, tablets, and phones, which lack the space to dissipate that heat, would easily overheat and slow down.

Also read: Ultimate Guide: How to Stop a Laptop from Overheating

That is why most gaming laptops with separate GPUs consume more power and run loud fans to dissipate excessive heat.

Integrating these processors on a single chip resolves these issues and allows smaller form factors to perform faster and consume less energy.

NPUs vs. Traditional Processors

We’ve established that NPUs are better at handling complex AI tasks than other types of processors. But that doesn’t mean they can replace them. CPUs and GPUs still have important computing roles to play. With the emergence of AI, NPUs should be seen as complementary processors.

NPUs ensure AI features will not bog down your CPU and GPU, leaving those processors free to handle their regular, demanding workloads.

On the flip side, NPUs are not versatile and cannot handle most of the processes CPUs and GPUs are designed to execute.

  • NPUs vs. CPUs

CPUs (central processing units) are commonly considered every computer’s brain. They are general-purpose processors that deal with data manipulation and transfer, logical operations, input and output tasks, some graphics processing, and emulation. Heavy tasks like simulation and development rely mostly on CPUs. They also determine whether to send tasks to the GPU or NPU.

While CPUs can do most things, it's better to offload specialized tasks to other processors. In particular, CPUs are not optimized for parallel processing the way NPUs are, so you'll notice significant slowdowns if you force machine learning and AI operations to run on the CPU.

  • NPUs vs. GPUs

GPUs are designed to handle demanding graphical computations. They’re more valuable to gamers, video editors, and 3D modelers. That said, NVIDIA has made the argument that GPUs are great for AI tasks because they have numerous integrated cores and excel at parallel processing.

But their AI capabilities mostly target graphics-related tasks and features, such as DLSS and HDR video streaming. These tasks also mostly run on AI-optimized cores known as tensor cores.

Dedicated NPUs do a better job with broader use cases, such as system-wide information processing and generative applications.

  • Windows and NPUs

Microsoft’s Copilot+ PC is mostly responsible for the current spotlight on NPUs. The company is boldly betting on AI to revolutionize Windows and transform how people interact with their computers.

Its new Copilot+ PCs offer exciting new integrated and generative AI capabilities powered by dedicated NPUs. That way, GPUs and CPUs are not burdened by these new classes of features. 

What is a Copilot+ PC?

Copilot+ PC is Microsoft’s new class of Windows computers. Devices with the Copilot+ brand ship with AI features that will not run on other computers. They include: 

  • Cocreator in Paint, which helps users refine their sketches,
  • Restyle Image for advanced image editing,
  • Live Captions for real-time audio transcription and translation, 
  • Recall, which helps people retrieve forgotten information in seconds, and 
  • Windows Studio Effects for enhancing the video calling experience.

Do you need a Copilot+ PC? Our rundown on these next-generation computers will tell you all you need to know.

We’ve seen teasers in the latest Windows 11 build, such as the preview version of the Copilot app and Image Creator in Paint.

However, Microsoft requires NPUs capable of 40 trillion operations per second (40 TOPS) to run the main Copilot+ AI features. Currently, only the new Snapdragon X processors meet that bar. These chipsets are behind Microsoft's claim that Copilot+ PCs are the "fastest and most intelligent" Windows devices ever.

The Snapdragon X processors won't be the only chips capable of running Copilot+ PC features. AMD and Intel have announced new SoCs capable of more than 40 TOPS, such as the next AMD Ryzen AI 300 series (Strix Point) and Intel's next-generation Lunar Lake processors.

We analyze Microsoft's declaration in detail and provide more insight in our comparison between Copilot+ PCs and traditional Windows devices.

The new PCs have also had their share of PR disasters due to the controversies surrounding Recall, which could discourage some users from making the switch.

To be fair to Microsoft, the newly released Copilot+ PCs offer incredible performance and battery life. They also perform better on battery power than other Windows computers.

Apart from the new AI features, Microsoft also introduced a new emulation layer, known as Prism, that improves x64 and x86 app performance on ARM devices.

The role of NPUs in Copilot+ PCs

NPUs primarily allow AI processes to run locally without the need to use cloud servers. It’s one of the main reasons Microsoft insists on processors that can execute at least 40 trillion operations per second.
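A quick back-of-envelope calculation (with illustrative numbers, not figures from Microsoft) shows why throughput on that scale matters for local AI: it determines how long a single inference pass takes if the chip sustains its rated speed.

```python
def inference_time_ms(ops_per_inference, tops):
    """Time in milliseconds for one inference at `tops` trillion operations/sec,
    assuming the NPU sustains its peak rated throughput."""
    return ops_per_inference / (tops * 1e12) * 1000

# A hypothetical model needing 10 billion operations per inference
# on an NPU meeting Microsoft's 40 TOPS requirement:
print(inference_time_ms(10e9, 40))  # 0.25 milliseconds
```

Real workloads rarely hit peak throughput, since memory bandwidth and data movement get in the way, but the estimate illustrates why Microsoft set a TOPS floor for on-device AI.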

You still need an internet connection and a Microsoft account to use generative AI features like Cocreator and Restyle Image. Microsoft’s servers will review your prompts before allowing the NPU to work on them.

For example, whenever you press the Copilot button, launch Recall, activate Live Captions, or turn on Cocreator, Windows will automatically route those processes to the NPU. You can also see the NPU’s real-time graph in the Task Manager tick up to show that it’s busy.

NPUs also handle other integrated AI functions, such as graphics upscaling (automatically improving video and image quality, especially in games).

Additionally, they free up computing resources by taking over AI-related tasks that CPUs and GPUs previously had to handle, such as graphics upscaling and object recognition. That way, you'll notice significant performance improvements.

Computers with Intel’s Core Ultra CPUs and AMD’s Phoenix processors already have integrated NPUs. However, those NPUs do not currently meet Microsoft’s 40 TOPS threshold to run Copilot+ PC tasks. Microsoft also requires dedicated NPU chips, so NVIDIA’s tensor cores don’t qualify.

Other Types of Devices Using NPUs

As mentioned, NPUs are not exclusive to computers. They’ve also been integral to other smart devices for years.
Mobile chip makers such as Qualcomm, Samsung, Apple, and Huawei have used NPUs for years to power features like voice assistants, facial recognition, predictive typing, translation, and task automation.

NPUs are also part of IoT devices, such as Smart TVs that automatically upscale image quality and gadgets like speakers with integrated AI assistants.

Are NPUs the Only Type of AI Processors?

No. NPUs are just one of many types of processors built for handling complex AI tasks. Players like NVIDIA, Google, Amazon, Qualcomm, IBM, AMD, and Apple also have different types of processors dedicated to AI.
Most of this hardware targets enterprise-scale processes and is built for things like robotics, driverless vehicles, security and surveillance, and cloud computing.

Outlook for AI Chips

According to Statista, the AI chips market was worth $53.7 billion in 2023 and is primed to surpass $70 billion in 2024 and $90 billion in 2025.

We’re currently seeing mass AI adoption across different sectors, from healthcare to law and business. 

As for PCs, Microsoft's new direction is wrapped up in artificial intelligence, and its NPU requirement means AI chips will become standard in new Windows computers going forward.


NPUs: Looking Forward

NPUs are making ambitious machine learning and AI goals increasingly achievable. Users can now run generative AI tasks on their devices without subscribing to cloud solutions, and the requirement for Copilot+ PCs and future AI capabilities means you should look out for NPU-powered computers on your next purchase.
However, there are no guarantees whether these features will stay bundled with Windows or whether Microsoft will introduce a Windows-as-a-service subscription model.

Still, there's a lot to anticipate, as Intel and AMD's drive to ship Copilot+ PCs means users will get even more powerful and efficient computers. Speculation about Windows 12 is also not fading away, as some experts believe Copilot+ PC is a primer for the new operating system.

Let us know what you think about NPUs and whether the technology has won you over.


FAQ

What are the main differences between NPUs, CPUs, and GPUs?
NPUs are mainly designed to process AI tasks without consuming much power. CPUs are multipurpose processors that can run just about any task, but they are reserved for important system functions and other complex computational processes that GPUs cannot run. GPUs are mainly focused on graphics tasks, which can be quite demanding. They also perform parallel processing like NPUs, but with more focus on video and image operations.
What are the main benefits of using NPUs in computing?
NPUs make AI tasks run faster, allow users to perform these tasks on-device, and reduce the burden on GPUs and CPUs.
How will NPUs change the computer market for the average user?
Almost every user will now add NPUs to their list of must-have PC components, as they are required for Copilot+ PCs. However, Microsoft doesn’t have plans to stop support for non-Copilot+ PC users, as they will continue receiving feature, quality, and security updates.
What makes Copilot+ PCs unique?
Copilot+ PCs are Microsoft’s new priority. The current set of devices is unique because Microsoft has improved support for Arm chips and introduced new AI integrations that can shape a new future for Windows.