Weird question about iGPUs: Ryzen 7000 and others

Elf_Boy

Ok, so I get myself a nice X670 system, a 7950X, a decent amount of RAM, etc., and of course a dedicated GPU like my 7900 XTX.

So is the iGPU dead silicon that does nothing, or is it utilized for its compute ability? Seems like a waste of silicon if it isn't.
 
That's a good question. I wonder if it's used for 2D work. At the very least it's nice to have for troubleshooting GPU issues.
 
You can use it as an encoder or for compute, but the software you're running has to specifically target it and you have to have the drivers for it installed. It doesn't get used transparently or automatically unless you plug a monitor into it.

Other than that - yeah, idle silicon that's nice to have for troubleshooting occasionally.
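
To make "the software has to specifically use it" concrete, here's a minimal sketch, assuming pyopencl is installed and the iGPU is exposed through an OpenCL runtime (AMD's driver or Mesa). The "Raphael" device-name match is just a placeholder for whatever your system actually reports:

```python
# Hedged sketch: nothing routes work to the iGPU automatically -- the app has
# to enumerate devices, pick the iGPU itself, and build/run a kernel on it.
import numpy as np
import pyopencl as cl

igpu = None
for platform in cl.get_platforms():
    for dev in platform.get_devices():
        if dev.type & cl.device_type.GPU:
            print(platform.name, "->", dev.name)
            # "Raphael" is what the Ryzen 7000 iGPU typically reports under
            # AMD's drivers; adjust the string for your own system.
            if "Raphael" in dev.name or "graphics" in dev.name.lower():
                igpu = dev

if igpu is not None:
    ctx = cl.Context(devices=[igpu])        # context only on the iGPU
    queue = cl.CommandQueue(ctx)
    a = np.arange(16, dtype=np.float32)
    buf = cl.Buffer(ctx, cl.mem_flags.READ_WRITE | cl.mem_flags.COPY_HOST_PTR,
                    hostbuf=a)
    # Trivial kernel: double every element, just to prove the iGPU ran it.
    prog = cl.Program(ctx, "__kernel void dbl(__global float *x)"
                           "{ x[get_global_id(0)] *= 2.0f; }").build()
    prog.dbl(queue, a.shape, None, buf)
    cl.enqueue_copy(queue, a, buf)
    print("ran on", igpu.name, ":", a[:4])
```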
 
Alright. I suppose if/when iGPUs are much more common, some app will take advantage.

If I had coding skills beyond "hello world", I'd write a proof of concept.
 
Alright. I suppose if/when iGPUs are much more common, some app will take advantage.
iGPUs are as common today as they're ever going to be. Almost every Intel CPU has one. It was only AMD CPUs that were a holdout; AMD restricted iGPUs to its APUs for a good while, and until relatively recently those were a really small portion of the market.
 
It just seems like so many things could benefit: Photoshop, file compression, productivity apps, even games for enhanced physics. Oh well, the world is what it is, not what I want it to be. It'd be a lot quieter and friendlier world were I in charge.
 
I COMPLETELY forgot that Ryzen 7000 series processors have an iGPU. I still think they should have just left that sh1t for the APUs. Like with Intel iGPUs, I would just go into the UEFI (or the BIOS on older systems) and disable that shiznit. But I do agree that it would be nice if you could actually use the iGPU for something alongside the discrete GPU. Too bad there's not enough of a market for Vulkan/D3D12 explicit multi-GPU (not to mention the development time and financial cost). I was hoping we would see more shiznit like this: https://www.anandtech.com/show/9740/directx-12-geforce-plus-radeon-mgpu-preview (and also a bit here: https://www.anandtech.com/show/10067/ashes-of-the-singularity-revisited-beta/4)
Cuz then we would also see iGPUs getting thrown into the mix together with discrete GPUs, with all of them working together.
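
For what it's worth, the same idea is easy to sketch on the compute side. This is only a hedged analogue of D3D12/Vulkan explicit multi-adapter, again assuming pyopencl and that both the discrete GPU and the iGPU show up as OpenCL devices: the application splits the work itself, one context and queue per GPU, instead of hoping the driver combines them.

```python
# Hedged sketch of explicit multi-GPU in compute terms: split one workload
# across every GPU the system exposes, including the iGPU if it's present.
import numpy as np
import pyopencl as cl

gpus = [d for p in cl.get_platforms() for d in p.get_devices()
        if d.type & cl.device_type.GPU]

data = np.arange(1_000_000, dtype=np.float32)
chunks = np.array_split(data, max(len(gpus), 1))   # one slice per GPU
src = "__kernel void dbl(__global float *x){ x[get_global_id(0)] *= 2.0f; }"

for dev, chunk in zip(gpus, chunks):
    ctx = cl.Context(devices=[dev])          # separate context per adapter
    queue = cl.CommandQueue(ctx)
    buf = cl.Buffer(ctx, cl.mem_flags.READ_WRITE | cl.mem_flags.COPY_HOST_PTR,
                    hostbuf=chunk)
    cl.Program(ctx, src).build().dbl(queue, chunk.shape, None, buf)
    cl.enqueue_copy(queue, chunk, buf)       # copy that slice's results back
    print(f"{dev.name}: processed {chunk.size} elements")
```

A real renderer would overlap the adapters' work and use D3D12/Vulkan rather than OpenCL, but the point is the same: the split is the application's job.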
 
It's not really wasted space because it's in the I/O die, not in the CCDs, so it isn't taking any space away from the cores. It takes very little area, the I/O die had room for it, and they can at least fill that space with something that might be useful in a more meaningful way in the future. I have a theory that we'll start seeing AI cores in the I/O dies as well on desktop parts in the years to come. They've proven they can stuff things like an iGPU in there, so why not.
 
I have a theory that we'll start seeing AI cores in the I/O dies as well on desktop parts in the years to come.
Already there. It's been in Apple's silicon for a good while now (5+ years) and is in all the Mx processors.

 