
When you're setting out to get a new gaming PC or laptop, you've probably noticed there are quite a few models out there without an Nvidia or AMD graphics chip. These devices usually come with an integrated GPU rather than a discrete, or dedicated, graphics chip.
PCs with and without discrete GPUs have their uses, but it's important to know the difference if you're on the lookout for a new PC, especially with Intel's new Panther Lake platform right around the corner. For most people, all you need to know is that an integrated GPU is best for lightweight, everyday computing like web browsing and word processing, while a discrete GPU is better for gaming and heavier content creation like video editing.
What Is A GPU Anyways?
Most mainstream computers have two types of processors: a CPU, or central processing unit, and a GPU, or graphics processing unit. Traditionally, a GPU's purpose is to take visual data from the CPU and render it for display on your monitor. This is why dedicated graphics cards have been essential for PC gaming for so long – a game generates so much visual data that, without a dedicated GPU to process it, most games would turn into a slideshow.
In the last few years, though, the use case for graphics chips has expanded drastically. It turns out that the thousands of cores in a GPU make it extremely good at parallel processing, which in turn makes it good for complicated mathematics, data science, and AI. That's also why graphics cards have only gotten more expensive over the last decade, thanks to the rise of cryptocurrency and AI.
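To give a rough sense of what "parallel processing" means here – this sketch isn't from the article, and the specific kernel is purely illustrative – here's what a minimal CUDA program looks like. Instead of one CPU core stepping through a million additions one at a time, the GPU hands each addition to its own thread, so thousands of cores work on the problem at once.

// Minimal, illustrative CUDA sketch: add two big arrays in parallel.
#include <cuda_runtime.h>
#include <stdio.h>

// Each GPU thread adds exactly one pair of elements.
__global__ void vectorAdd(const float *a, const float *b, float *out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        out[i] = a[i] + b[i];
    }
}

int main() {
    const int n = 1 << 20;                  // one million elements
    size_t bytes = n * sizeof(float);

    // Allocate and fill host (CPU) memory.
    float *h_a = (float *)malloc(bytes);
    float *h_b = (float *)malloc(bytes);
    float *h_out = (float *)malloc(bytes);
    for (int i = 0; i < n; i++) { h_a[i] = 1.0f; h_b[i] = 2.0f; }

    // Allocate device (GPU) memory and copy the inputs over.
    float *d_a, *d_b, *d_out;
    cudaMalloc(&d_a, bytes);
    cudaMalloc(&d_b, bytes);
    cudaMalloc(&d_out, bytes);
    cudaMemcpy(d_a, h_a, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(d_b, h_b, bytes, cudaMemcpyHostToDevice);

    // Launch enough thread blocks to cover all n elements in parallel.
    int threadsPerBlock = 256;
    int blocks = (n + threadsPerBlock - 1) / threadsPerBlock;
    vectorAdd<<<blocks, threadsPerBlock>>>(d_a, d_b, d_out, n);

    // Copy the result back and spot-check it.
    cudaMemcpy(h_out, d_out, bytes, cudaMemcpyDeviceToHost);
    printf("out[0] = %f\n", h_out[0]);      // expected: 3.0

    cudaFree(d_a); cudaFree(d_b); cudaFree(d_out);
    free(h_a); free(h_b); free(h_out);
    return 0;
}

That same pattern – lots of small, independent calculations happening simultaneously – is what game rendering, crypto mining, and AI workloads all boil down to, which is why they all lean so heavily on the GPU.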
Discrete GPU vs Integrated GPU
Pretty much every modern computer has a GPU in one form or another, but it will either be integrated into the processor or be a discrete GPU, meaning it sits on its own chip and board.
There are some desktop processors that don't have integrated graphics, though. While every modern AMD processor has an integrated graphics chip, some Intel processors leave it out, usually to lower the cost. You can usually tell the difference from the name of the chip – if there's an 'F' at the end of the product name, it doesn't have an integrated GPU.
Getting an integrated GPU in your desktop processor is underrated, too. While you'll typically want to rely on a discrete GPU for gaming and content creation, an integrated GPU is excellent for troubleshooting if your graphics card is having problems. Just don't make the mistake of plugging your HDMI or DisplayPort cable into the motherboard instead of the graphics card.
The biggest difference between integrated and discrete graphics is that integrated graphics have to share resources with the CPU. They draw from the same power budget, which can be a problem if you're putting pressure on both the GPU and CPU – as most PC games do. Integrated graphics also rely on the same pool of system memory as your processor, which means things can slow down dramatically in heavy workloads, and not just because DDR5 is significantly slower than GDDR6 or GDDR7.
This also means that, yes, the PS5, Xbox Series X, Nintendo Switch 2, and pretty much all handheld gaming PCs use integrated graphics, rather than dedicated GPUs.
Discrete graphics, then, are graphics processors with their own dedicated resources. In desktop gaming PCs, these typically take the form of graphics cards: GPUs soldered onto PCBs (printed circuit boards) along with everything they need – dedicated video memory, power management, and usually a meaty heatsink. These cards slot into a PCIe slot on your motherboard, giving you access to much more powerful graphics processing.
Laptops are a little different. Rather than a full graphics card, mobile discrete GPUs are usually soldered onto the motherboard in their own little section, paired with their own memory and power delivery. There are a few laptops out there with socketed GPUs, but they're extremely rare. As for integrated graphics in a laptop, they work identically to integrated graphics in any other processor – built into the main CPU and sharing its resources.
However, there is a way to add dedicated graphics to a laptop that doesn't already have them. Thanks to the high speeds of Thunderbolt 5 and USB4, you can use an external GPU to boost your laptop's graphics performance. Not every external GPU is the same, though. When external GPUs first started blowing up almost a decade ago, they were typically external docks that you'd slot a graphics card into and then hook up to your system via Thunderbolt or a proprietary connector, depending on the laptop manufacturer.
These days, though, more external GPUs come with built-in mobile graphics chips. These are usually smaller and more portable, but they don't have the same kind of power as a desktop graphics card.
Jackie Thomas is the Hardware and Buying Guides Editor at IGN and the PC components queen. You can follow her @Jackiecobra