VideoCardz: Intel to unveil Xe-HPG gaming architecture with hardware ray-tracing

Snowdog

[H]F Junkie
Joined
Apr 22, 2006
Messages
11,262
I usually don't post rumors here, but Videocardz is calling this confirmed, and they are usually sure when they say that:

https://videocardz.com/newz/intel-to-unveil-xe-hpg-gaming-architecture-with-hardware-ray-tracing
Intel has confirmed it is developing another Xe sub-architecture.

Intel Xe-HPG for enthusiast gamers coming in 2021
Intel has been actively developing another GPU architecture since 2018. This is another Xe micro-architecture called Xe-HPG, optimized for gaming. The Xe-HPG roadmap spans from mid-range to enthusiast segments. It was built upon the three Xe pillars: Xe-LP (Graphics Efficiency), Xe-HP (Scalability) and Xe-HPC (Compute Efficiency).

Intel has not confirmed the specifications of the GPU, but it was said that it features the GDDR6 memory subsystem to improve the performance per dollar ratio. On the other hand, the Xe-HP series will feature HBM memory.

NVIDIA introduced hardware raytracing 2 years ago with GeForce RTX 20 series. AMD is expected to launch its RDNA2-based graphics cards leveraging hardware raytracing later this year. Intel has also confirmed their Xe-HPG series will support hardware-accelerated raytracing.
 
I will gladly welcome a third player into the ring. NVidia does a good job advancing its performance metrics, AMD does a good job playing the middle ground, and I would like to see what Intel can bring to the table here. They have a lot of GPUs in the field, and even if this just means their onboard solutions get a huge kick in the pants, that is really a win for everybody.
 
There's reportedly an issue with their fabric that balloons power consumption for four tile configs that won't be fixed for a while. But if a single tile GPU pumping out 10.6 TFlops of FP32 has sane thermals and solid drivers, I'd seriously consider snagging one as an early adopter.
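For context on where that 10.6 TFlops figure comes from, here is a back-of-envelope sketch. The specific numbers (512 EUs per tile, 8 FP32 lanes per EU, roughly a 1.3 GHz clock) are rumored figures, not anything Intel has confirmed:

```python
# Back-of-envelope peak FP32 throughput for a rumored single Xe-HP tile.
# Assumed figures (rumors, not confirmed by Intel): 512 EUs per tile,
# 8 FP32 lanes per EU, ~1.3 GHz clock. An FMA counts as 2 ops per lane.
def fp32_tflops(eus, lanes_per_eu, clock_ghz, ops_per_fma=2):
    """Peak TFLOPS = EUs * lanes per EU * 2 ops (FMA) * clock (GHz) / 1000."""
    return eus * lanes_per_eu * ops_per_fma * clock_ghz / 1000.0

print(round(fp32_tflops(512, 8, 1.3), 2))  # ~10.65, in line with the quoted 10.6 TFlops
```

Peak numbers like this say nothing about sustained throughput, which is exactly where the fabric power issue would bite a four-tile config.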
 
Tiled stuff isn't for gaming anyway. It's a data center compute part.
 
I'm hoping there's a consumer part derivative that maintains the compute performance. Gaming's not my first order of interest for it.

There won't be. Compute parts are splitting from gaming parts for everyone. AMD is going CDNA/RDNA, NVidia has A100 Data center, and will have completely separate Ampere gaming GPUs.

A big differentiator is Ray Tracing. All the gaming parts have Ray Tracing. All the data center parts mentioned here, don't.

Ray Tracing is table stakes on Gaming GPUs beyond entry level, going forward.
 
Cool. I'd be interested in Intel actually delivering a decent IGP experience, but having another player in the GPU market can't be a bad thing.
 
Today is NDA day for the Intel CPU/GPU info, this is what Videocardz was referring to as confirmed.

HPG GPU will be using an external fab for certain:
[Intel slide]
 
In Windows it’s been iffy, especially for OpenGL. I understand they’ve been making strides in the last few years, though. The Linux drivers, on the other hand, are fantastic.
Yeah, Intel has been pretty good about Linux support, so I welcome the extra competition. I have been locked into AMD because of their FOSS drivers, and I would really like to have more discrete options.
 
Would have to assume that if Intel is releasing a full-on gaming GPU line, they will also focus on improved drivers. I never really had any problems with Intel video drivers in the past, though I've certainly never tried any kind of gaming on them.
 
In my own experience, hell yes they are bad, especially for gaming.
I agree, but I am not sure where the drivers end and the hardware begins. Yes, Intel drivers aren't great for any sort of gaming, but they were never really designed for it. Intel's drivers are built for compatibility and stability, and at that they are pretty good. But to date they have put zero effort into gaming optimization or features beyond what the various specs call for.
So while I 100% agree their drivers are shit for gaming, I'm pretty sure it's because games were the absolute last thing on their validation checklist.
 
If you're playing a game that is within the scope of performance of the product, they're usually pretty good. Haven't had problems with them in years.

OpenGL is their bête noire on Windows, but outside of it being slower than Direct3D and a little unrefined, I haven't run into any rude surprises in a while. It's clear somebody within the company's been working on making the most of the hardware in modern games and applications. The worst I can usually lay at their feet is that it takes a little while for a driver update to show up to optimize for a new title. They've gone from being an embarrassment to an acceptable baseline. On Linux they're the best-maintained Mesa driver, full stop. I am non-trivially interested in seeing how a consumer market Xe part holds up.
 
Intel drivers seem pretty reliable to me. I think the hardware, thus far, has just been weak and under-powered, not any problem with the drivers.

And you have to think, most of the Windows PCs out there are running on Intel IGP. While not great for gaming, they have been battle tested.
 
The main problem I see is on GoG forums, people having trouble with old Games (10+ years old) that were only ever designed with NVidia/AMD in mind.
 
The main problem I see is on GoG forums, people having trouble with old Games (10+ years old) that were only ever designed with NVidia in mind.
FTFY. 10+ years ago, NVidia was using broken (but sometimes technically compliant) OpenGL in their drivers, and patching the games to make it work. Games worked on NVidia hardware, and were buggy but playable on AMD. Nobody tested for Intel, so while their OpenGL implementation was actually pretty good (though not super fast), whether it worked was another matter.

Modern NVidia drivers are more compliant, but more importantly (IMO) games have moved away from OpenGL and DX9. DX10+ and Vulkan are stricter and move more of the code into the game's rendering engine, so if it works on one platform it's more likely to work on the others.
 
Yeah, OpenGL was an interesting beast to code for; there was no in-between, it either ran like hot garbage or like greased lightning. A lot of pretty janky implementations out there. DX9 wasn't much better, though how much better depends on the version; DX9 launched in 2002, after all, so lots of "interesting" code was written for that too. I'm glad teachers have done a better job globally at instilling better coding practices in students.
 
It will be nV for high, AMD mid, Intel budget
I have hopes that the high end Xe DG1 gamer version matches nVidia.
I don't think Raja will let us down. They even had Kyle in there for a while, and he doesn't back vaporware.
The low end Intel GPU will meet the needs of 90% of gamers in the gtx1050 area and offer Plex users more encoding streams than we have bandwidth.
If it's under $150 it's a day one buy for me.
As for those who won't think of buying it, you can thank the competition for lower prices on the GPU you do buy.
 