Why Do Integrated Graphics SUCK?





Visit http://squarespace.com/techquickie and use offer code TECHQUICKIE to save 10% off your first order.

Why aren’t integrated graphics as good as a discrete GPU?

Techquickie Merch Store: https://www.lttstore.com

Follow: http://twitter.com/linustech

Leave a reply with your requests for future episodes, or tweet them here: https://twitter.com/jmart604

License for image used: https://creativecommons.org/licenses/by/2.0/legalcode


20 thoughts on “Why Do Integrated Graphics SUCK?”

  • Think about how much better a GPU you could cram into that limited silicon if you didn't have to do things like add a second encrypted CPU running its own OS and label it a "feature" (that no one asked for).

  • Integrated GPUs suck? Vega 11 makes that statement obsolete. By the way, I still plan on buying a dedicated graphics card when I've got the budget.

  • As technology advances, GPU designs that are 3-4 generations old can be shrunk down to an area close to that of a modern IGP/APU. A Raspberry Pi takes up the space of a cell phone yet has the processing power of an entire large desktop computer from less than a decade ago. Shrink down that GTX 650 and turn it into an IGP.

  • You forgot to mention that CPUs and GPUs use vastly different architectures.
    While CPUs have to be able to handle many different types of operations,
    dedicated GPUs can exploit their narrower purpose and run a single type of operation over and over, across many elements in parallel (the sketch just below illustrates the difference).
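
(A minimal sketch of the point above, assuming a CUDA-capable machine; none of this is from the video, and the names and numbers are arbitrary. The kernel applies one trivial operation across a million values in parallel, which is the kind of work GPU silicon is specialized for, while the plain loop does the equivalent work one element at a time, the way a single general-purpose CPU core would.)

```cuda
// Hypothetical illustration: the same "scale every value" job done the GPU way
// (thousands of threads, one tiny identical operation each) and the CPU way
// (one core walking the array serially).
#include <cstdio>
#include <cuda_runtime.h>

__global__ void scale_gpu(float *data, float factor, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= factor;           // every thread performs the same simple operation
}

void scale_cpu(float *data, float factor, int n) {
    for (int i = 0; i < n; ++i) data[i] *= factor;  // one core, one element at a time
}

int main() {
    const int n = 1 << 20;                  // ~1 million values
    float *host = new float[n];
    for (int i = 0; i < n; ++i) host[i] = 1.0f;

    float *dev = nullptr;
    cudaMalloc(&dev, n * sizeof(float));
    cudaMemcpy(dev, host, n * sizeof(float), cudaMemcpyHostToDevice);

    scale_gpu<<<(n + 255) / 256, 256>>>(dev, 2.0f, n);   // data-parallel: what GPUs are built for
    cudaMemcpy(host, dev, n * sizeof(float), cudaMemcpyDeviceToHost);

    scale_cpu(host, 0.5f, n);               // serial: what a general-purpose CPU core does instead

    printf("first element after both passes: %f\n", host[0]);  // 1.0
    cudaFree(dev);
    delete[] host;
    return 0;
}
```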

  • A very easy solution would be: make a "Threadripper"-sized APU and put "A LOT" of cores on it at a lower frequency. Then ship the APU with a reference liquid or vapor cooler, and make a small-ish "block" or "bracket" cooler/fan unit that can easily be fitted into a slot on the front or back of a build. Now, to finish it off, add a "tiered" memory management system, where you can have additional blocks of a more expensive memory type to which the GPU has first priority, and which is only made available as "normal" system memory when the GPU isn't using it (a rough sketch of this idea follows after this comment). This would have the benefit of giving your system a lot of extra memory speed for workloads that don't depend on the GPU.

    ……

    This, however, would probably make the APU a lot more expensive, perhaps even expensive enough that it would cost more than a mid-tier CPU and GPU … BUT, as a "console-killer" or a laptop solution it could work… It just takes a very different design philosophy than what is currently being employed.

    … And to be honest, with the way SoC and APU tech is moving currently, do you really think anyone will be using discrete graphics in 10-20 years? Sure, you will probably still have the "enthusiast" market, but think about it for a second. (This is meant as an example and should not be taken too literally, just to encourage a different mindset.) Let's take a mobile device like the Xiaomi Pocophone F1… It's a $300 phone with a Snapdragon 845, 6GB of RAM, and 128GB of "SSD" storage (yes I know… I know… just run with it). The compute power of that device is a lot higher than most realize. Compare it to something like an i3 or i5 with an entry-level GPU and it can actually sort of hold its own (depending on the task), especially if you go back a few generations.
    As we move closer and closer to these SoC and APU solutions being able to handle HD and 4K, the need for high-powered GPU solutions will diminish, and your kids or grandkids will look at your "old computer" as something of a dinosaur. Graphical fidelity can only go so far before the average human eye can't tell the difference, and that day might be closer than most people realize.
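
(Purely to illustrate the tiered-memory idea in the comment above: a hypothetical host-side sketch, written in the same C++ that CUDA code uses, in which the GPU gets first claim on a fast memory pool and only the unclaimed remainder is exposed as ordinary system RAM. Every name, size, and policy here is invented for illustration; no current APU actually works this way.)

```cuda
// Hypothetical sketch of a GPU-first "tiered memory" pool; all sizes, names and
// policies are made up for illustration and do not reflect any real hardware.
#include <cstdio>
#include <cstddef>

struct TieredPool {
    std::size_t fast_total;      // capacity of the expensive, fast memory tier
    std::size_t fast_used_gpu;   // portion currently claimed by the GPU

    // The GPU asks first and is granted as much as the pool can still provide.
    std::size_t reserve_for_gpu(std::size_t bytes) {
        std::size_t grant = bytes;
        if (fast_used_gpu + grant > fast_total) grant = fast_total - fast_used_gpu;
        fast_used_gpu += grant;
        return grant;                         // bytes actually granted to the GPU
    }

    // Whatever the GPU is not using is exposed as "normal" system memory.
    std::size_t available_as_system_ram() const {
        return fast_total - fast_used_gpu;
    }
};

int main() {
    TieredPool pool{4ull << 30, 0};                          // pretend 4 GiB of fast memory
    std::size_t granted = pool.reserve_for_gpu(3ull << 30);  // a game asks for 3 GiB of "VRAM"
    std::printf("GPU granted %zu MiB, %zu MiB left over for the CPU\n",
                granted >> 20, pool.available_as_system_ram() >> 20);
    return 0;
}
```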
