Intel Arc Alchemist Xe-HPG Graphics Card with 512 EUs Outperforms NVIDIA GeForce RTX 3070 Ti (Leak)

sleepeeg3

TechPowerUp said:
Intel's Arc Alchemist discrete lineup of graphics cards is scheduled for launch this quarter. We are getting some performance benchmarks of the DG2-512EU silicon, representing the top-end Xe-HPG configuration. Thanks to a discovery by the famous hardware leaker TUM_APISAK, we have a measurement from the SiSoftware database that shows Intel's Arc Alchemist GPU with 4096 cores and, according to the benchmark report, just 12.8 GB of GDDR6 VRAM. That figure is simply an error in the report, as this GPU SKU should be coupled with 16 GB of GDDR6 VRAM. The card was reportedly running at a 2.1 GHz frequency; however, we don't know whether this represents base or boost speed.

When it comes to actual performance, the DG2-512EU GPU managed to score 9017.52 Mpix/s, while NVIDIA's GeForce RTX 3070 Ti managed 8369.51 Mpix/s in the same test group. Comparing these two cards in floating-point operations, Intel has an advantage in the half-float, double-float, and quad-float tests, while NVIDIA holds the single-float crown. This represents a roughly 7% advantage for Intel's GPU, meaning that Arc Alchemist has the potential to stand up against NVIDIA's offerings.
Source
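
Quick back-of-the-envelope check on that ~7% claim, using only the Mpix/s numbers from the quote (nothing else assumed):

```c
#include <stdio.h>

/* Quoted SiSoftware results (Mpix/s) from the article above. */
int main(void)
{
    double arc_dg2_512 = 9017.52;   /* Intel DG2-512EU            */
    double rtx3070ti   = 8369.51;   /* NVIDIA GeForce RTX 3070 Ti */

    /* Relative advantage of the Arc part over the 3070 Ti. */
    printf("Arc advantage: %.1f%%\n", (arc_dg2_512 / rtx3070ti - 1.0) * 100.0);
    /* Prints roughly 7.7%, in line with the article's ~7% figure. */
    return 0;
}
```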

Could Intel actually have something semi-competitive on their hands, after decades of trying?

Will believe it when I see it!

Intel's Arc Alchemist [benchmark screenshot]

NVIDIA GeForce RTX 3070 Ti [benchmark screenshot]
 
As long as it mines as well as or better than a 3070 Ti, so we can keep it, too, out of the hands of the whiny and entitled.
Intel is likely to divert 80% of their GPUs to OEMs and the remaining 20% to retail channels. Based on their server cards, these things are going to be mining beasts.
 
Intel already has graphics drivers.

They do, but last I checked they were pretty basic.

OK for a basic laptop that really doesn't do anything but web browsing and Office, but not exactly with detailed 3D rendering settings like Nvidia and AMD offer.
 
This is exactly where everyone has been saying performance is going to be, but yeah the drivers are going to be a big question.

See: Xe iGPU vs AMD iGPU vs low-end NV dGPU benchmarks. NV vs AMD is pretty consistent but Xe is all over the place in relation to the others. Idk how much better that got over the course of 2021 but I'm going to be watching Arc's performance consistency real close.
 
It depends on the platform. Windows users are probably in for a transition period before things stabilize, with newish titles getting steadily better while Intel’s performance will simply lag for older ones because they have no way of making up the hundreds of thousands of man-hours AMD and Nvidia have poured into app-specific optimizations and profiles for the last twenty years. Intel’s Windows drivers have generally been decent given the limits of the hardware outside of OpenGL. If Intel sticks with it, I think they could be successful here.

I predict Linux users will be very happy - Intel’s led the field in open driver quality for years. It would be surprising if Xe’s changes under the hood meaningfully impacted that reputation. There are still questions - will there be good OpenCL support in a reasonable timeframe? Will bringing Intel graphics hardware up on RISC-V or Power motherboards be hindered by driver team priorities? - but if I could get a video card in the same ballpark as a Vega 64 with a lower TDP and hardware raytracing by some time in 2023 that doesn’t kick my wallet in the balls, I could be a very happy man.
 
They do, but last I checked they were pretty basic.

OK for a basic laptop that really doesn't do anything but web browsing and Office, but not exactly with detailed 3D rendering settings like Nvidia and AMD offer.
My kid has a Surface Pro 8 and it’s pretty impressive what it can play with only ~20 watts (i5).

I think in “traditional” gaming they would be OK. It’s the lack of DLSS, ray tracing, etc. where they would get slaughtered.

It’d be exciting to see another dedicated gpu vendor regardless.
 
My kid has a Surface Pro 8 and it’s pretty impressive what it can play with only ~20 watts (i5).

I think in “traditional” gaming they would be OK. It’s the lack of DLSS, ray tracing, etc. where they would get slaughtered.

It’d be exciting to see another dedicated gpu vendor regardless.
Intel has their own version of DLSS apparently coming... and I am pretty sure these cards still have planned RT support. Clearly that will require a new driver... but yes, when they release these they will need a day-one driver. Intel isn't completely new to the game of GPU drivers though, so I would not expect a total shit show. I suspect the drivers will be solid, but things like RT may be less reliable... and who knows if their DLSS alternative is ready to go the same day they start selling them.
 
Can't help but miss [H] proper right now. When these launch, a good [H] review of image quality, realistic gaming settings, etc. would be great.
 
Intel has their own version of DLSS apparently coming... and I am pretty sure these cards still have planned RT support. Clearly that will require a new driver... but yes, when they release these they will need a day-one driver. Intel isn't completely new to the game of GPU drivers though, so I would not expect a total shit show. I suspect the drivers will be solid, but things like RT may be less reliable... and who knows if their DLSS alternative is ready to go the same day they start selling them.
Ray tracing will be fine on Intel; the DX12 specification on that is pretty buttoned up and there isn’t a lot of wiggle room for developers there. They are all pretty much using the same tool set to implement it.

It’s going to be older titles that give them a hard time; developers have done some pretty janky things over the years with lighting and texture “optimizations” to squeeze frames out of their games. Sure, Intel is going to replicate those as best they can for the older popular titles, and by popular I mean the ones with the greatest recent YouTube and Twitch ratings. But the further you move from those, the weirder stuff is going to get, until of course you go so far out that you land in the weeds of the indie titles. Unlikely they paid AMD and Nvidia to optimize anything there, so those will be just fine.
 
Intel graphics drivers have been fine for basic desktop use, but they have always been really buggy and bad for gaming. In a way that was fine, because the hardware didn't really have the power for most games, but I don't think it should be dismissed as a possible issue.

Hopefully the drivers are stable when they launch, but this certainly isn't anything I'm interested in being an early tester for. I'm also generally in favor of increased competition, but there are some concerning possible scenarios that involve Intel and Nvidia being the only ones in the market; I can't think of many companies I would want less to be in charge of that market than those two.
 
I wonder how long it'll be before someone finds a security hole in it where the only option is to either disable it or run the latest drivers/firmware that gimp performance by 10% ;)
 
XeSS hehehehe
Silly name aside, if it works the way Intel suggests... and they don't drag their heels getting the DP4a (universal) version out the door, they could completely put DLSS out to pasture in a way FSR couldn't quite do. It promises to basically be DLSS for all (well, for RDNA2 and RTX cards anyway). FSR may still help people stretch those RX 580s and 1070s out.

For those that don't know, Intel is claiming to allow pretty universal support by using a standard INT8 math instruction (DP4a)... with XeSS also being able to use the XMX matrix units on their new chips for more performance. If it works out, it should land somewhere between DLSS and FSR: hopefully DLSS quality with the ease and universal implementation of FSR.
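
For anyone wondering what DP4a actually does: it's a four-element dot product of packed 8-bit integers accumulated into a 32-bit result, which is the kind of INT8 math these upscaling networks lean on. A minimal scalar sketch with made-up values (real GPUs do this in a single instruction):

```c
#include <stdint.h>
#include <stdio.h>

/* Scalar emulation of a DP4a-style operation: unpack four signed
 * 8-bit lanes from each 32-bit operand, multiply lane-wise, and
 * accumulate into a 32-bit integer. */
static int32_t dp4a(uint32_t a_packed, uint32_t b_packed, int32_t acc)
{
    for (int i = 0; i < 4; i++) {
        int8_t a = (int8_t)((a_packed >> (8 * i)) & 0xFF);
        int8_t b = (int8_t)((b_packed >> (8 * i)) & 0xFF);
        acc += (int32_t)a * (int32_t)b;
    }
    return acc;
}

int main(void)
{
    /* Illustrative values only, e.g. a tiny slice of an INT8 convolution. */
    uint32_t weights = 0x01FF0203;  /* lanes (low to high): 3, 2, -1, 1 */
    uint32_t pixels  = 0x04030201;  /* lanes (low to high): 1, 2, 3, 4  */
    printf("dp4a result: %d\n", dp4a(weights, pixels, 0));  /* prints 8 */
    return 0;
}
```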
 
Exciting times. Intel has the money, experience and has been trying to break into this market for years. Here’s to hoping something great comes out of this!
 
Jim at AdoredTV came back online within the last few weeks and produced a video documenting his outlook on Intel...and it's optimistic.
 
Intel is likely to divert 80% of their GPUs to OEMs and the remaining 20% to retail channels. Based on their server cards, these things are going to be mining beasts.
I mean, I was kind of kidding but on the other hand I'm also not.

Let us join Intel, and with arms locked march right over the soft, overly ripened heads of the chubby cheeked gamers desperate for a GPU upgrade. To watch someone's face collapse in grief again and again? It just does something to the human heart!
 
I would suspect Intel's cards are going to be the choice for compute sleds in virtualized environments through big OEMs like Dell, etc. Intel will likely be able to better nail down software support and integration for that type of environment, and honestly, that's where the big money is going to be made.
 
I wish them the best, but always temper things like this with an adage I learned from a wizened elder early in my career:

"Our ideas are MUCH better than our competitors products!"

Yep. Always are. You will be judged when you ship.
 
Pretty confident in Intel's upcoming stuff from what I've heard on partner calls. Generally it is more focused on the compute side for datacenter/edge, but when I've talked/asked about the gaming side with them, everyone seems generally enthused about it, and nothing seems as fluffy as when they were discussing pre-release Xeons in the past. So I am actually pretty excited.
 
Yep. Always are. You will be judged when you ship.

I won't buy Intel unless I have to, but I fully expect them to bring down an iron fist on their partners to ensure they only treat customers with their velvet gloves.

Anyone who can wait for the Q2-Q3 desktop GPU market adjustment on prices should. We've waited this long...
 
I won't buy Intel unless I have to, but I fully expect them to bring down an iron fist on their partners to ensure they only treat customers with their velvet gloves.

Anyone who can wait for the Q2-Q3 desktop GPU market adjustment on prices should. We've waited this long...
For sure.
I expect their offerings to tilt more towards compute, and not just because it's Intel, but because making graphics drivers that work well in all games is... well, "have fun". Those games already shipped, they exist, and the driver needs to just drop in and work, or it fails. That's a tough nut to crack, but it could work going forward - if one doesn't really want to run older games. Maybe.

Compute is easier. Clients will re-write if there's a cost-benefit, in general. It's current code, not a DX9/DX10/DX11 game.

But that's why I say - will be judged when shipped. Might be great if you don't care about the dark areas, and that's cool. I just worry the dark areas may be large. But hey, no dog in this hunt, popcorn time!
 
Do people not know that The FPS Review has the SAME GUY doing the GPU reviews the SAME WAY he did them here?
Ya, we know where Brent is. Just seems like he had better access before... but perhaps that is in my head.
I just hope Brent doesn't have to line up his online shopping bot to review Intel's offerings.
 
Ya, we know where Brent is. Just seems like he had better access before... but perhaps that is in my head.
I just hope Brent doesn't have to line up his online shopping bot to review Intel's offerings.
Failing, or deliberately choosing not, to get a sample into the hands of someone like Brent would be a really telling bit of data, IMO.
 
Do people not know that The FPS Review has the SAME GUY doing the GPU reviews the SAME WAY he did them here?
I didn't know. I forgot The FPS Review even existed. I haven't really watched/read reviews since [H] went belly up... I don't upgrade my computer as often as I used to.
 
Pretty confident in Intel's upcoming stuff from what I've heard on partner calls. Generally it is more focused on the compute side for datacenter/edge, but when I've talked/asked about the gaming side with them, everyone seems generally enthused about it, and nothing seems as fluffy as when they were discussing pre-release Xeons in the past. So I am actually pretty excited.

Any word on decoding capabilities?
 
I'd love to have a plug-in Intel GPU for Quick Sync capabilities on AMD systems. It's the biggest reason I keep my home server stuff on Intel instead of the AMD multicore monsters.
 
I'd love to have a plug-in Intel GPU for Quick Sync capabilities on AMD systems. It's the biggest reason I keep my home server stuff on Intel instead of the AMD multicore monsters.

True, having a bottom barrel Intel dedicated GPU for video decode would be nice too.

I run a number of LibreElec/Kodi boxes, and I have found the death of the basic entry level GPU frustrating, because it means that whenever they dream up a new codec, if you want to hardware decode it, you have to replace the CPU and motherboard.

A cheap version that is about as capable as onboard Intel graphics, but kept current with hardware codec decode, would be great, IMHO.
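
For what it's worth, if you want to see which hardware acceleration types a given FFmpeg build on one of those boxes was compiled with (whether the silicon actually decodes a particular codec is a separate question), libavutil can enumerate them; a tiny sketch, link with -lavutil:

```c
#include <stdio.h>
#include <libavutil/hwcontext.h>

/* List every hardware device type (vaapi, qsv, etc.) this FFmpeg
 * build knows about, i.e. the acceleration paths Kodi could try. */
int main(void)
{
    enum AVHWDeviceType type = AV_HWDEVICE_TYPE_NONE;
    while ((type = av_hwdevice_iterate_types(type)) != AV_HWDEVICE_TYPE_NONE)
        printf("hw device type: %s\n", av_hwdevice_get_type_name(type));
    return 0;
}
```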
 
True, having a bottom barrel Intel dedicated GPU for video decode would be nice too.

I run a number of LibreElec/Kodi boxes, and I have found the death of the basic entry level GPU frustrating, because it means that whenever they dream up a new codec, if you want to hardware decode it, you have to replace the CPU and motherboard.

A cheap version that is about as capable as onboard Intel graphics, but kept current with hardware codec decode, would be great, IMHO.
Would be cool if things were a bit more like the olden days, and you could buy a card that just had whatever the latest Intel CPU socket was on it, and let you slap in an Intel CPU just for the iGPU capabilities as mentioned.
 
Any word on decoding capabilities?
Ah, they weren't really going into any specifics for the consumer side (they have been surprisingly tight-lipped the past two years), but the fact that they just seemed generally excited and weren't making dumb claims seemed a good sign to me.
 
What I really want to know is whether Intel will include their AI upscaling algorithms in their open-source Linux drivers. Intel's open-source GPU driver stack is pretty impressive, so how much of it remains open for the gaming parts intrigues me.
 