Intel Iris Xe DG1 Graphics Card Review [Gamers Nexus]

Any summary for those of us who don't want to watch a half-hour video in order to absorb 90 seconds worth of content?

The Xe DG1 is basically the IGP ripped from Intel's mobile CPUs and paired with its own RAM on a discrete card. Performance-wise, it nips at the heels of the GT 1030 GDDR5, though with some frame-time consistency issues. It'll be a good competitor to both Nvidia and AMD on the low end if Intel can work out the driver bugs.

I see this card going two ways in the market. First, and more preferable, it gets released and forces Nvidia and AMD to drop their prices and/or fix stock issues because it competes well with their lower-end cards. Or second, scumbag miners find some new shitcoin to mine on the Xe DG1 and the status quo doesn't change: the card is scalped and bought up by all of the miners, and we're forced to stick with GT 210s from 10 years ago.
 
It's kinda weird to be talking about a brand-new GPU that can get beaten senseless by a 7-year-old 750 Ti.

They have to start somewhere.

And if you aren't actually playing games, then game performance doesn't matter. You would just need something that supports a newer HDMI or DisplayPort standard, along with maybe hardware decoding of common new video codecs. The problem is that many of the budget cards from Nvidia are getting so old that in some cases they can't even do those two simple things.
 
Same issue with AMD. No one has released truly new budget cards in years. The only releases with new technology have been midrange cards or better, and it doesn't look like that is going to change anytime soon, unfortunately for many of us.

I know Kyle said a while back that he expected AMD and Nvidia to intentionally abandon the low-end market in favor of the integrated graphics on CPUs. I have no doubt he's correct, considering the overall profits on low-end cards are probably non-existent and there's little chance of keeping prices somewhat low with the large increase in the cost of GPU dies. It also needs to be considered that with the advent of iGPUs, the volume of sales for low-end cards has plummeted.
 
Maybe now that we've actually seen one of these for real, sites will stop showing that ridiculous render from last year. You know, this one.
[image: Intel Ponte Vecchio render]
 

Well, this is a DG1... which is an OEM-only GPU that is basically just their IGP spun onto its own silicon.

These were not designed to be a real for-sale GPU. They purposely made them not work on standard PCIe for a reason. From a business point of view, these exist to help Intel sell silicon to OEMs with non-functioning iGPUs without costing them a bunch of money. So imagine Intel taking a ton of semi-junk 10nm chips that have, say, 6 out of 8 working cores with GPUs that are not working. (We know Intel took a bath on 10nm with tons of chips that are half working... yields were terrible, and Intel probably still has bins of thousands of such low-end-spec 10nm chips with non-functioning iGPUs.) The DG1 lets Intel sell OEMs those 10nm CPUs... along with a DG1. Expect some funky OEM part numbers on low-frequency, low-core-count 10nm parts paired with DG1.

So Intel gets a bit of experience... and gets to sell off a bunch of chips that no one would ever buy retail (low clock, low core count, no iGPU), at least not for a price Intel would want to sell them at. However, DG1 gives Intel a way to recoup at least some of that through OEM sales.

Bring on DG2... which may or may not look something like that render. lol
 
IIRC this is essentially just their integrated gpu slapped on a dedicated card so devs could get familiar with the platform. We saw demos of this running Destiny 2 in Jan of 2020. This is old news.

DG2 is when we should start seeing decent gaming performance
 
So a slightly better iGPU.
For those of you thinking this is going to be the line of cards* that solves the shortage of GPUs and provides much-needed competition to AMD and Nvidia - I don't recommend holding your breath.🤷‍♀️

*By "line of cards" I mean any discrete GPU Intel releases in the next 10 years.
 

DG1 is NOT the GPU they have been working on for a few years.

This performs like an iGPU because it is one. lol This is an OEM part designed to salvage the 40% of 10nm CPUs Intel turned out over almost 2 years that have non-functioning GPUs. lol

DG1 is just their iGPU on a non-CPU bit of sand.
 
For anything non-gaming and under $1000 CAD, I'd take this in a heartbeat. I have lots of secretarial work that gets done and needs light video and photo touch-up work.
 

Just wait until they release the OverDrive chip and it only gets beaten senseless by 5-year-old cards instead of 7-year-old cards.
 
They need to drop the DG2; this is not doing anyone any good unless you want prebuilt PCs.
 
Again, you guys get that this is NOT what Intel has been working on, right?

This is just their iGPU without the CPU cores. It's an OEM-only card that doesn't even slot into a standard PCIe slot. It's made to sell broken CPUs to OEMs; they are selling these bundled with broken 10nm salvage CPUs for less than working CPUs. (It's a 10nm salvage program.) They let the GPU group put it together for a little extra experience before they launch actual GPUs designed to sell at retail.

The only real stupid thing Intel has done here is call it DG1... giving people the wrong idea. It also didn't help that Intel claimed it was to help developers with their platform a year back when they had nothing to show and people were rightfully worried about their jobs. (I mean, the CEO and most of the heads are gone since.) If anyone there had any sense, they would have named this something without the one at the end... and not named their actual GPU DG2. I mean, man, are they stupid sometimes. lol
 
I watched the video; the 1% lows were abysmal. I know this is not a high-end part, but it should at least have stable frame times.
 
I'd buy a cheap one just for QuickSync and H.264/H.265 duties. Could probably move my Unraid server over to AMD finally.
Might be good for an AMD home server where you want to run Blue Iris for cameras. Of course, Intel would have to allow it first.
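For anyone curious what that transcode workload looks like in practice, here's a minimal sketch of driving QuickSync through ffmpeg from Python. It assumes an ffmpeg build with QSV support; the filenames and quality value are just placeholders, not anything from the video.

```python
import subprocess

def qsv_transcode_cmd(src, dst, quality=25):
    """Build an ffmpeg command that decodes H.264 and encodes HEVC,
    both on the Intel QuickSync hardware blocks."""
    return [
        "ffmpeg",
        "-hwaccel", "qsv",       # hardware-accelerated decode path
        "-c:v", "h264_qsv",      # QSV H.264 decoder
        "-i", src,
        "-c:v", "hevc_qsv",      # QSV HEVC encoder
        "-global_quality", str(quality),
        dst,
    ]

# Uncomment to actually run it (needs a QSV-enabled ffmpeg on PATH):
# subprocess.run(qsv_transcode_cmd("in.mp4", "out.mkv"), check=True)
print(" ".join(qsv_transcode_cmd("in.mp4", "out.mkv")))
```

Whether Blue Iris or a containerized Plex/Jellyfin setup would actually see the DG1's media engine on an AMD board is exactly the "Intel would have to allow it" question.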
 
tldw: it's crap, barely better than integrated. Fingers crossed for DG2...

50-100% more performance over the 11th-gen UHD 750 and 100-200% more over the UHD 630 isn't what I'd consider "barely better". Sure, the 1% lows sucked, but that's pretty much the same for those two IGPs in my experience.

Even as a hacked-together solution for OEMs, it's good for what it is.
 
Wow, that's so lame that they won't work on the majority of motherboards out there. I wonder what their reasoning for doing that was.
 
With the current situation of graphics cards, it'd make sense to lock them to work only for their intended purpose: OEMs, in OEM-branded machines. It'd prevent, or at least deter, theft out of the factories and slimy buyers doing backroom deals to get cards to scalp or sell on to miners.
 
That is a pretty low bar Intel set for themselves, even for an entry-level card. Let's be honest, these days even basic entry-level cards cost quite the premium.
 
I was hoping for an advanced explanation.
Think I pretty much explained it.

For 2-3 years Intel has been trying really hard to make 10nm work, but it hasn't. When CPUs are fabbed, there will always be microscopic defects in the wafer. These tend to be random and scattered over the wafer. With 10nm, Intel has had a ton of issues with defect rates being a lot higher than they should be, and due to the very tight gate widths they tried to pull off, those small defects cause major problems. Not the type you can solve by just downclocking a bit. So they have had to fuse off cores, and in a lot of cases the defects have hit the GPU components. So Intel has bins full of thousands of CPUs that actually work for the most part, except the iGPUs are either buggy or simply too damaged to function. So they fuse off the GPU. Now Intel can sell those as CPUs without a GPU; however, that means OEMs have to spend money on a GPU, which for most inexpensive OEM machines they are not willing to do. Intel also has chips that have defective GPUs and fused cores. So if they want to sell an OEM, say, a 4-core 10nm part without a GPU, they have to either pay them to take them (really, that is a thing) or find some way to give them value.

One solution at Intel has been to fabricate a standalone GPU (DG1) that is nothing more than a working Coffee Lake iGPU. They have thrown it together to be extremely inexpensive and locked it to those OEM-intended machines. This is a production fix... a way for Intel to move probably tens of thousands of Coffee Lake and Comet Lake chips they have that are far too broken to ever make it to retail, but can be salvaged this way in OEM machines. DG1 isn't designed to make Intel a ton of money... they are just trying to recoup something from all those 10nm fails. It's common practice for OEMs to get special SKUs on CPUs that are not quite retail-worthy... it's why you see the odd OEM machine with a chip with an odd model number that is clocked 200MHz slower than a retail part, etc. Mostly they end up in emerging markets, but sometimes you see them in low-end OEM machines in the west as well.
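The binning economics described above can be sketched with a toy Monte Carlo yield model. To be clear, every number here (defect rate, iGPU area share, how many dead cores are salvageable) is a made-up illustration, not Intel's actual data:

```python
import math
import random

def bin_dies(n_dies, defects_per_die, igpu_area_frac, max_fusable_cores=2, seed=42):
    """Toy die-binning simulation: each die gets a Poisson-distributed
    defect count; each defect lands in the iGPU region with probability
    igpu_area_frac, otherwise in the CPU-core region."""
    rng = random.Random(seed)
    bins = {"fully_working": 0, "igpu_fused": 0, "cores_fused": 0, "scrap": 0}
    threshold = math.exp(-defects_per_die)
    for _ in range(n_dies):
        # Knuth's inversion method for a Poisson sample (stdlib-only)
        n_defects, p = 0, 1.0
        while True:
            p *= rng.random()
            if p <= threshold:
                break
            n_defects += 1
        igpu_hits = sum(rng.random() < igpu_area_frac for _ in range(n_defects))
        core_hits = n_defects - igpu_hits
        if n_defects == 0:
            bins["fully_working"] += 1
        elif core_hits > max_fusable_cores:
            bins["scrap"] += 1        # too many dead cores to salvage
        elif igpu_hits > 0:
            bins["igpu_fused"] += 1   # dead iGPU: the "bundle it with a DG1" pile
        else:
            bins["cores_fused"] += 1  # fewer cores, iGPU still fine
    return bins

print(bin_dies(10_000, defects_per_die=0.7, igpu_area_frac=0.4))
```

With these assumed numbers, roughly half the dies come out clean and a large chunk land in the "fuse the iGPU, pair it with a DG1" bin, which is exactly the salvage story the post is describing.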

https://arstechnica.com/gadgets/202...crete-graphics-cards-to-oems-and-integrators/

"The Iris Xe discrete add-in card will be paired with 9th gen (Coffee Lake-S) and 10th gen (Comet Lake-S) Intel® Core™ desktop processors and Intel(R) B460, H410, B365, and H310C chipset-based motherboards and sold as part of pre-built systems. These motherboards require a special BIOS that supports Intel Iris Xe, so the cards won’t be compatible with other systems."

PS... I do wonder if these are even custom silicon. I suspect these are actually Coffee Lake CPUs with only the iGPU working. I would love for someone who has one of these to rip it apart and put the silicon under a microscope. I bet they are CPUs. lol
 
Coffee Lake didn't have Xe, though, and that first-gen 10nm laptop CPU that was only used in one cheap China-only laptop didn't either, as far as I know, so this would have to be something else.
 
That is a good point.

Ya, it probably is custom silicon. Makes me wonder, though, what process they used. Did they fab these at 10nm as well? Or are they perhaps 14nm+^9 or something. :)
 
I'm pretty sure I read somewhere that it's 10nm SuperFin, which is 10+ in their older nomenclature.
 
So they got to salvage parts... give their mostly brand-new GPU team some experience with dGPU drivers, firmware, etc... and probably gave their fab folks an opportunity to try to get 10nm+ working on a part that they can run in small quantities and not bet the company on.

Win win win. Makes good sense. Other than the stupid marketing missteps that have customers thinking DG1 is a product of their skunkworks high-end GPU lab, it's actually pretty damn smart.
 
to be fair...
did you even watch the vid? he explains all that in the first segment.
Really? Oh lol, fair enough. I did watch it; I must have tuned it out for the first few min.

Nah, I just went back and rewatched... no, he really doesn't do that. :) He actually calls it a "prelude to what is coming up in the future." His own stupid words. lol As well as saying he tried to get it to work in other boards.
 
If it's that much better than a UHD 750, it's probably a great card to use in a media server for transcoding duties; I hear QuickSync is really good.
 