Intel DG1 GPU Teardown, Failed Benchmarks, and Why It Won't Work on Most Systems

There's a claim in the article that the boards lack the EEPROM chips that would hold the firmware, meaning it's stored in the system BIOS instead. It's a small board, but an SPI (presumably QSPI) EEPROM is actually pretty tiny; a 32Mb surface-mount one is about 5mm on a side. Edit: that is to say, this isn't really a *reason* as such. SPI EEPROMs are dirt cheap and tiny.
 
The article states that the card has about the same performance as a GT1030. That seems like a big letdown. The GT1030 at least has the option of GDDR5 and will work with just about any PC with the correct PCIe slot type.
 
Honestly if this got to the point where it even touches a 1030 that’s a big step up for Intel. I would think that the DG1 is just a big version of what their integrated chips are running.
 
They had one thing to do...
This needs GDDR5 or HBM2 memory and at least a 256-bit bus.
I get that they are targeting OEM PCs and this is a 30 watt part, but that's not what we were talking about 2 years ago.
Maybe next year Nvidia and AMD will see some low end competition.
Feels like rooting for any Cleveland sports team. "Maybe next year"
 
Honestly if this got to the point where it even touches a 1030 that’s a big step up for Intel. I would think that the DG1 is just a big version of what their integrated chips are running.
This one is actually the same spec as the Iris XE graphics in the 11th gen i7 - it has 96 execution units and is sitting on dual-channel LPDDR4-2133, making it slower than a laptop with Iris XE and, say, LPDDR4-4267, which is a common config right now.
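
Quick back-of-the-envelope on why the memory speed hurts so much (a rough Python sketch; the 128-bit total width for a dual-channel LPDDR4 setup is my assumption, not something stated in the article):

# Peak theoretical bandwidth = (bus width in bytes) x (transfer rate in MT/s)
def peak_bandwidth_gbs(bus_width_bits, transfer_rate_mts):
    return bus_width_bits / 8 * transfer_rate_mts / 1000  # GB/s

print(peak_bandwidth_gbs(128, 2133))  # ~34 GB/s with LPDDR4-2133
print(peak_bandwidth_gbs(128, 4267))  # ~68 GB/s with LPDDR4-4267 - roughly double

Half the peak bandwidth would go a long way toward explaining why this benchmarks behind a laptop running the same 96 EU GPU on faster memory.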

As it’s a development board I guess the slower speed isn’t that unusual, but I think early reports were saying the “retail” cards that would ship in regular systems to end users would only have 80 EUs in them, the equivalent of the Iris XE in the 11th gen i5.

Clock speed of the onboard memory will be a big factor in performance, but I think these weren’t intended for gaming performance so much as to bring the extra features of Iris XE to an add-in card (more outputs) for systems that only have the older-gen HD graphics (which is everything Intel-based right now).

I expect it will be the cheapest optional add-in GPU on Dell OptiPlex and Precision systems and the like.
 
They had one thing to do...
This needs GDDR5 or HBM2 memory and at least a 256-bit bus.
I get that they are targeting OEM PCs and this is a 30 watt part, but that's not what we were talking about 2 years ago.
Maybe next year Nvidia and AMD will see some low end competition.
Feels like rooting for any Cleveland sports team. "Maybe next year"
Sure, then it would need at least twice the number of shader cores.

HBM? Really? Among other things, it would increase the cost dramatically.

As the first desktop entry since the i740 (was there an i800-series desktop card?), it doesn't look that bad as an OEM entry-level solution. I do have doubts about it not being much better than the integrated graphics. But hey, baby steps...
 
This one is actually the same spec as the Iris XE graphics in the 11th gen i7 - it has 96 execution units and is sitting on dual-channel LPDDR4-2133, making it slower than a laptop with Iris XE and, say, LPDDR4-4267, which is a common config right now.

As it’s a development board I guess the slower speed isn’t that unusual, but I think early reports were saying the “retail” cards that would ship in regular systems to end users would only have 80 EUs in them, the equivalent of the Iris XE in the 11th gen i5.

Clock speed of the onboard memory will be a big factor in performance, but I think these weren’t intended for gaming performance so much as to bring the extra features of Iris XE to an add-in card (more outputs) for systems that only have the older-gen HD graphics (which is everything Intel-based right now).

I expect it will be the cheapest optional add-in GPU on Dell OptiPlex and Precision systems and the like.
If it lets Dell ship a small desktop with a CPU that lacks an iGPU while giving it the ability to run dual DP screens at 1440p 60Hz so eyes don’t get strained under the LED lights, then I and all my clerical departments will be happy. And if it sips power and has good low-power states, I could see it being a win for my hydro bills.
 
As the first desktop entry since the i740 (was there an i800-series desktop card?), it doesn't look that bad as an OEM entry-level solution.
Technically there was an i752 successor (if only briefly). The 800 series were chipsets (aside from the i860 CPU from the 80s), but the integrated-graphics versions theoretically used the i740's successor architecture, which had already been developed.

I wouldn't mind a fully functional card for use as a basic video adapter for some of my crunching systems without integrated options.
 
griff30, you’re thinking of “DG2”, which is built on the “high power” architecture (“XE-HPG”) that Intel claims to have. In this case they’ve launched a low-end, OEM-only part first, based on the “low power” architecture (“XE-LP”) from the integrated graphics, instead of the so-called high-end one.

I will believe it exists when it is actually available. I say “so-called” because the last set of rumors I saw regarding its performance put it squarely in the “mainstream / esports” performance segment.

I don’t remember if DG1 was even mentioned early on in talks about Xe discrete GPUs.
 
griff30, you’re thinking of “DG2”, which is built on the “high power” architecture (“XE-HPG”) that Intel claims to have. In this case they’ve launched a low-end, OEM-only part first, based on the “low power” architecture (“XE-LP”) from the integrated graphics, instead of the so-called high-end one.

I will believe it exists when it is actually available. I say “so-called” because the last set of rumors I saw regarding its performance put it squarely in the “mainstream / esports” performance segment.

I don’t remember if DG1 was even mentioned early on in talks about Xe discrete GPUs.
Ah, that clears it up. Thanks.
 