Intel Arc Alchemist Xe-HPG Graphics Card with 512 EUs Outperforms NVIDIA GeForce RTX 3070 Ti (Leak)

Would be cool if things were a bit more like the olden days, and you could buy a card that just had whatever the latest Intel CPU socket was on it, and it let you slap in an Intel CPU just for the iGPU capabilities as mentioned.
That sounds inefficient. You know what would be a better idea... a company like AMD, Intel, or Nvidia (fat chance on NV's part) just making an actual 2D-first card with encode/decode and extremely basic 3D (instead of rehashing 5-year-old gaming cards as lowbies). I mean, both AMD and Nvidia do turn out new stock on some of those low-end 4-5 year old cards. It's just insane that you can't even find those in stock. As an example, at my local parts shop the cheapest two GPUs right now are both low-end AMD/Nvidia workstation cards, which both cost more than they ought to. The next step up in stock are $350 (Canadian, ~$280 USD) 1050 Tis, I shit you not, insanity. Pretty sure that is about double what those things' MSRPs were new.
 
What I really want to know is whether Intel will include their AI upscaling algorithms in their open-source Linux drivers. Intel's open-source GPU driver stack is pretty impressive, so how much of it remains open for the gaming parts intrigues me.
Given how good their Open source stuff is, I would bet on it.

I am not sure I would bet on Intel making all their fancy new features open source. I mean, the card will work for sure... Intel has already added everything needed, I believe, for kernel and Mesa support. XeSS, though? I don't know. I hope Intel will support everything and keep everything open, but it seems like a stretch... I somehow doubt we get a fancy Intel control panel at launch. It would be a good test to see if Intel does pull something off... it would be something if they dropped an official open-source control panel for Linux.
https://www.phoronix.com/scan.php?page=news_item&px=Intel-Graphics-Panel-2021

Looks like they are getting stuff ready for multiple SKUs now as well.
https://www.phoronix.com/scan.php?page=news_item&px=Intel-DG2-G12-Linux
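
For anyone who wants to check this on their own box once cards show up, a sysfs peek is enough to see which kernel driver claimed the GPU. This is just a rough sketch, assuming a Linux system with the standard /sys/class/drm layout; DG2 parts binding to i915 is the expectation from the Phoronix coverage, not anything Intel has committed to here.

```python
#!/usr/bin/env python3
"""Sketch: list GPUs visible to the Linux DRM subsystem and the kernel
driver that bound each one (e.g. i915, amdgpu, nouveau)."""
import re
from pathlib import Path

VENDORS = {"0x8086": "Intel", "0x10de": "NVIDIA", "0x1002": "AMD"}

for card in sorted(Path("/sys/class/drm").glob("card*")):
    if not re.fullmatch(r"card\d+", card.name):
        continue  # skip connector entries like card0-HDMI-A-1
    dev = card / "device"
    vendor_id = (dev / "vendor").read_text().strip()
    # the driver symlink's basename is the bound kernel module
    driver = (dev / "driver").resolve().name if (dev / "driver").exists() else "none"
    print(f"{card.name}: vendor={VENDORS.get(vendor_id, vendor_id)} driver={driver}")
```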
 
Given how good their Open source stuff is, I would bet on it.
I would like to think so, and I am sure they will. Just wondering: if they do, will AMD add it to their driver stack, as they currently don't have one?
 
That sounds inefficient. You know what would be a better idea... a company like AMD, Intel, or Nvidia (fat chance on NV's part) just making an actual 2D-first card with encode/decode and extremely basic 3D (instead of rehashing 5-year-old gaming cards as lowbies). I mean, both AMD and Nvidia do turn out new stock on some of those low-end 4-5 year old cards. It's just insane that you can't even find those in stock. As an example, at my local parts shop the cheapest two GPUs right now are both low-end AMD/Nvidia workstation cards, which both cost more than they ought to. The next step up in stock are $350 (Canadian, ~$280 USD) 1050 Tis, I shit you not, insanity. Pretty sure that is about double what those things' MSRPs were new.
That's basically how Intel pitched the Xe Max dGPUs on mobile and desktop. I remember very little mention of gaming, and instead a lot of emphasis on QSV encode/decode as well as compute acceleration for light content creation. Granted, that was more an artifact of DG1 being limited and Intel needing a better iGPU alternative for Comet Lake and Rocket Lake, but I think it shows there's value in cards with limited 3D performance but good media blocks and OK compute performance (see also the Apple M1 variants there).

It's really a fucking waste that Navi 24 has no encoder and such poor decode. And it's crappy that NV has been limiting their -108 dies... like what's the fuccin point of cramming an MX450 into a thin-and-light if it has no NVENC and cut-down NVDEC, in addition to being barely better at gaming than an AMD iGPU? GK208 was the only time that NV put a full-featured media block in a bottom-tier GPU, and the result is that the crappy ole GT 710 I have kicking around is more useful to me than a new GT 1030, because NV cut the dang media block in Pascal and they'll probably do the same thing if they make a GA108. /rant
 
That's basically how Intel pitched the Xe Max dGPUs on mobile and desktop. I remember very little mention of gaming, and instead a lot of emphasis on QSV encode/decode as well as compute acceleration for light content creation. Granted, that was more an artifact of DG1 being limited and Intel needing a better iGPU alternative for Comet Lake and Rocket Lake, but I think it shows there's value in cards with limited 3D performance but good media blocks and OK compute performance (see also the Apple M1 variants there).

It's really a fucking waste that Navi 24 has no encoder and such poor decode. And it's crappy that NV has been limiting their -108 dies... like what's the fuccin point of cramming an MX450 into a thin-and-light if it has no NVENC and cut-down NVDEC, in addition to being barely better at gaming than an AMD iGPU? GK208 was the only time that NV put a full-featured media block in a bottom-tier GPU, and the result is that the crappy ole GT 710 I have kicking around is more useful to me than a new GT 1030, because NV cut the dang media block in Pascal and they'll probably do the same thing if they make a GA108. /rant
If you can find them, P1000s, or better yet P620s, make great little encoders. I use them in my security camera media servers and they are little beasts. The 2GB variants are useless for gaming and mining, and you can find them for a little north of $200, half-height and powered through the PCIe slot.
 
If you can find them, P1000s and P620s make great little encoders. I use them in my security camera media servers and they are little beasts. The 2GB variants are useless for gaming and mining, and you can find them for a little north of $200, half-height and powered through the PCIe slot.
Yeah! I've actually been looking at those for the dedicated game recording box I'm putting together. There have been a decent number of P600s and P620s showing up for ~$100 locally, and that's really tempting vs. something like a GTX 950/960.
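
For the recording/camera use case, this is roughly the kind of ffmpeg job those little Quadros chew through all day: decode on NVDEC, re-encode on NVENC, write out in chunks. Just a sketch; the camera URL, bitrate, and output path are placeholders, and it assumes an ffmpeg build with nvenc/nvdec enabled.

```python
#!/usr/bin/env python3
"""Sketch: record a camera stream through a Pascal Quadro's media block
(NVDEC for decode, NVENC for encode), segmented into 5-minute files."""
import subprocess
from pathlib import Path

CAMERA_URL = "rtsp://192.168.1.50/stream1"   # hypothetical camera feed
Path("recordings").mkdir(exist_ok=True)

cmd = [
    "ffmpeg",
    "-hwaccel", "cuda",                  # decode on the GPU's NVDEC block
    "-rtsp_transport", "tcp",
    "-i", CAMERA_URL,
    "-c:v", "h264_nvenc",                # encode on the NVENC block
    "-b:v", "4M",
    "-an",                               # camera feed has no audio here
    "-f", "segment", "-segment_time", "300",
    "-strftime", "1",
    "recordings/cam1-%Y%m%d-%H%M%S.mp4",
]
subprocess.run(cmd, check=True)
```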
 
That sounds inefficient. You know what would be a better idea... a company like AMD, Intel, or Nvidia (fat chance on NV's part) just making an actual 2D-first card with encode/decode and extremely basic 3D (instead of rehashing 5-year-old gaming cards as lowbies). I mean, both AMD and Nvidia do turn out new stock on some of those low-end 4-5 year old cards. It's just insane that you can't even find those in stock. As an example, at my local parts shop the cheapest two GPUs right now are both low-end AMD/Nvidia workstation cards, which both cost more than they ought to. The next step up in stock are $350 (Canadian, ~$280 USD) 1050 Tis, I shit you not, insanity. Pretty sure that is about double what those things' MSRPs were new.
There are no 2D cards anymore -- everything is done with 3D APIs or the CPU.
 
There are no 2D cards anymore -- everything is done with 3D APIs or the CPU.
That shows how little I know. I thought stuff like GPU H.264 encode/decode would be called 2D stuff and wouldn't be using Vulkan, DX, or other 3D APIs.

That sounds inefficient. You know what would be a better idea... a company like AMD, Intel, or Nvidia (fat chance on NV's part) just making an actual 2D-first card with encode/decode and extremely basic 3D...
That sounds like what Intel iGPUs already offer, and they are extremely popular. The market that wants more than a recent iGPU but wouldn't buy a monster card, if that were the only other option, could be thin.
 
That shows how little I know. I thought stuff like GPU H.264 encode/decode would be called 2D stuff and wouldn't be using Vulkan, DX, or other 3D APIs.
Nope, that's closer to cryptographic operations than 2D, which involves fairly simple math by comparison. The only reason we had 2D accelerators in the past is because you needed to do hundreds or thousands of those ops per second, and CPUs were definitely not up to the task at the time.
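
You can actually see the split on any machine: the fixed-function video engines are reached through their own acceleration APIs (VA-API, QSV, NVDEC/CUDA, etc.), not through the graphics pipeline. A minimal sketch, assuming ffmpeg is on PATH:

```python
#!/usr/bin/env python3
"""Sketch: ask the local ffmpeg build which hardware acceleration
interfaces it can reach for video decode/encode offload."""
import subprocess

out = subprocess.run(
    ["ffmpeg", "-hide_banner", "-hwaccels"],
    capture_output=True, text=True, check=True,
).stdout

# first line is the "Hardware acceleration methods:" header
methods = [line.strip() for line in out.splitlines()[1:] if line.strip()]
print("hwaccel interfaces this ffmpeg can use:", ", ".join(methods))
```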
 
When is Arc Alchemist shipping? I hope there won't be some crazy scalping to blow up the price.

The rumor mill says that it should be a paper launch for laptops in mid- or late Q2, with real-world availability in Q3. No one's really talking about Intel board partners, but the expectation is that the regular manufacturers that produce both AMD and Nvidia cards will work with Intel, possibly along with other hardware companies not normally associated with graphics.

Also, no one seems to know in what order things will be announced or launched.
 
Just hoping they don't totally suck. We need a 3rd player in the GPU game.
Given the mid and low end that both Nvidia and AMD have provided this time around, Intel would have to work very hard indeed for their GPUs to do worse. But as stated by others above, I expect that when they "launch" in Q2, some small quantity will make it to retail channels, with the overwhelming bulk of them going to the big OEMs for laptop and desktop integration. There it just needs to provide better value or performance than the GTX 1650, 3050 Ti, 3060, and 3070, as that is what is dominating OEM sales currently, and those get a refresh in Q3 when all of them announce their back-to-school stuff.
 
Provided that Iris Xe works great at low graphics settings, all they need is to create GPUs capable of higher resolutions, real-time ray tracing, AI upscaling, and support for DX 12.1.
 
I read that Intel has removed ASTC texture compression support from the Arc series.

I have no clear understanding of how much it is used these days, but I wonder if that impacts support for legacy titles?
 
I read that Intel has removed ASTC texture compression support from the Arc series.

I have no clear understanding of how much it is used these days, but I wonder if that impacts support for legacy titles?
ASTC was a bigger deal on mobile (not laptop) platforms. It's a known quantity outside that space, and OpenGL (and OpenGL ES) support it via an extension, but it never made it into the core specification. My guess is that if it's encountered in the wild, Intel's driver will convert ASTC to a format natively supported by the hardware and that Intel didn't figure it'd be worth dedicating support to in hardware. There's a solid (and very detailed) writeup here.
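
If you want to see what your own driver advertises, a quick check of the GL extension strings tells you whether ASTC is exposed at all (whether that's native hardware or a transcode fallback is the driver's business). A minimal sketch, assuming glxinfo (mesa-utils) is installed; the extension names are the standard Khronos ones:

```python
#!/usr/bin/env python3
"""Sketch: check whether the installed GL driver advertises ASTC
texture compression support."""
import subprocess

exts = subprocess.run(
    ["glxinfo"], capture_output=True, text=True, check=True
).stdout

for name in ("GL_KHR_texture_compression_astc_ldr",
             "GL_KHR_texture_compression_astc_hdr"):
    print(f"{name}: {'advertised' if name in exts else 'not advertised'}")
```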
 
I don't know if Intel is going to be making cards or just chips, but if they are making cards, they should run a promotion allowing you to buy one of these direct at MSRP if you also buy their top CPU.
 
I read that Intel has removed ASTC texture compression support from the Arc series.

I have no clear understanding of how much it is used these days, but I wonder if that impacts support for legacy titles?
AMD helped develop it, and they dropped hardware support for it back in 2017. Nvidia still supports it but may not for much longer. It's mostly used in ARM environments now, and ARM is the current patent holder, but there's some dispute there with AMD that is currently unresolved. Either way, the patent expires in 2023. Currently the tech is basically only used in mobile environments, and the only current PC environment that uses it (that I can find) is a Nintendo Switch emulator.

There are software decoders that can handle it just fine and have been doing so since 2018.
 
Poor Nvidia. Now they are going to see what it felt like for AMD when Intel pressures OEMs to use Intel GPUs over Nvidia (well, and AMD). Good ole Intel.
 
Up until CES, Intel was going to release the cards in Q1.

Now it's what, Q2 or Q3?
I have a feeling that by the time it's released, "matching a 3070" won't be a good thing.
 
Up until CES, Intel was going to release the cards in Q1.

Now it's what, Q2 or Q3?
I have a feeling that by the time it's released, "matching a 3070" won't be a good thing.
A 3070 is pretty good performance, even more so if they load it with more VRAM; couple that with, hopefully, the ability to undercut on price. I think these things will sell well if they hit that level of performance.
 
A 3070 is pretty good performance, even more so if they load it with more VRAM; couple that with, hopefully, the ability to undercut on price. I think these things will sell well if they hit that level of performance.
And if it comes out two years from now, will you be saying the same thing? It's late and getting later.
I'm not against Intel. More graphics cards are desperately needed. I just don't think they are going to deliver anytime soon. Hope I'm wrong and I end up with one in my rig.
 
Up until CES, Intel was going to release the cards in Q1.

Now it's what, Q2 or Q3?
I have a feeling that by the time it's released, "matching a 3070" won't be a good thing.
I don't know; both of Intel's competitors just released low-end cards that can't beat five-year-old low-end cards. I'm not hopeful that AMD or Nvidia is in any rush to one-up the current gen.
 
Up until CES, Intel was going to release the cards in Q1.

Now it's what, Q2 or Q3?
I have a feeling that by the time it's released, "matching a 3070" won't be a good thing.
Q2 or even Q3 is going to be fine; edging out a 3070 at that stage is going to be good enough. It's not like the 3070 is a bad card, and I highly doubt that anything AMD or Nvidia have lined up for their next-gen parts is going to be leaps and bounds better in availability, price, or performance. Between inflation, tariffs, and shipping woes, those aren't coming any time soon anyways.
 
I have a feeling that by the time it's released, "matching a 3070" won't be a good thing.
That's optimistic. At a reasonable price and in good volume, matching even a 2070 Super could be a really good thing in the current environment; that would change everything, let alone a 3070.
 
Poor Nvidia. Now they are going to see what it felt like for AMD when Intel pressures OEMs to use Intel GPUs over Nvidia (well, and AMD). Good ole Intel.
Yes, poor NVIDIA... :whistle:

Crying_Over_Money.jpg
 
Nvidia will be crying if Intel can take market share in the GPU virtualization market. Given Intel's position with OEMs like Dell and platforms like the MX7000, I think Intel is in a good position to take that entire market away from Nvidia. AMD isn't even really a player in it. This is where the future and the big money are anyways, and TBH likely the only reason Intel is interested in making GPUs.
 
Up until CES, Intel was going to release the cards in Q1.

Now it's what, Q2 or Q3?
I have a feeling that by the time it's released, "matching a 3070" won't be a good thing.

Was it confirmed by Intel on their blog? What about matching a 3090?
 
AMD isn't even really a player in it. This is where the future and the big money are anyways, and TBH likely the only reason Intel is interested in making GPUs.

Believe it or not, Intel is actually only interested in making gaming GPUs. Their previous discrete graphics part was designed to do streaming video for, IIRC, Amazon. Their larger goal is to make I+I solutions for consumer and business markets.
 
I've seen people say that these won't be "good enough" to entice people away from Nvidia... but the really high-end cards are a small fraction of the overall market. I haven't looked at the Steam metrics in a while, but I'm sure nothing has changed.

I think your average PC gamer started shaking their head in disgust years ago when high-end cards doubled in price, and then doubled again. Every single PC gamer I know still plays at 1080p, and they turn off 95% of the graphics options that "look good" because they are distracting.
 
Believe it or not, Intel is actually only interested in making gaming GPUs. Their previous discrete graphics part was designed to do streaming video for, IIRC, Amazon. Their larger goal is to make I+I solutions for consumer and business markets.
I don't believe that for a second. Sure, that's a market they're entering, because for vGPU you still need what is essentially a consumer-level product and support. However, that's not where the money is going to be made for them. The money will be made with massive MX7000-style farms with GPU expansion shelves, and Intel will have a massive leg up over Nvidia in this realm since Intel is the one building everything else in these servers.
 
The money will be made with massive MX7000-style farms with GPU expansion shelves, and Intel will have a massive leg up over Nvidia in this realm since Intel is the one building everything else in these servers.

They see them as separate markets, and will be making ASIC-style custom and semi-custom compute cards separately.

This is Intel's chance to come off like the good guys, and they'll put a lot of effort into serving gamers.
 
The bigger question now is, IF Intel can get a solid GPU out there, AND they can work it into a solid APU, will they go after Microsoft for the next console contract?
That's a big question. There's so much time before the next full refresh that anything is possible, given the margins consoles demand, APU-wise, versus the rest of the market... As long as you can sell everything you make outside of console APUs, I am not sure you would chase that type of contract much.

Is there a reason to go after Microsoft more than Sony or Nintendo? They're a lower-volume seller than the other two, so supplying them would handcuff Intel less in the higher-margin sectors?

Hindsight is 20/20, and a long-term partnership could still be very good for them; they were grateful for it in the past and could be again on the next one. But just imagine how much more AMD would have made without the console contract... (well, maybe not; maybe the contract winner ends up with the TSMC capacity anyway... I don't know nearly enough to say).
 
I just don't see Intel entering that specific market. It doesn't make a lot of sense. There is far more money to be made on the enterprise side of things, which is where they already are anyways.
 
The bigger question now is, IF Intel can get a solid GPU out there, AND they can work it into a solid APU, will they go after Microsoft for the next console contract?
I don't think Microsoft will ever contract with Intel again after the original Xbox fiasco.
 
I don't think Microsoft will ever contract with Intel again after the original Xbox fiasco.

I think they would consider it if they could get Intel volume (imagine being a console that is easy to buy at the next launch; it's just ridiculous to think about), but I really do not think Intel would do it. P.S. I am speaking completely without any knowledge or fact-based intuition.

I am not sure what the issue with the original Xbox was. Wasn't it a nice surprise success? Did they have any issues with that P3?
 
The bigger question now is, IF Intel can get a solid GPU out there, AND they can work it into a solid APU, will they go after Microsoft for the next console contract?

Now that's an interesting thought. AMD is crushing it with blades based on the Xbox APUs.

There is far more money to be made on the enterprise side of things, which is where they already are anyways.

That's the thing; only the PlayStation is a true gaming device. The Xbox Series silicon is also an enterprise chip.
 
I don't think Microsoft will ever contract with Intel again after the original Xbox fiasco.
Not so much of a fiasco as many like to believe. Microsoft burned AMD far harder for the initial Xbox launch: AMD developed all the prototypes, and they only found out they had lost the contract when the console was announced; the announcement demonstration was actually running on the AMD hardware.
https://www.gamespot.com/articles/2...ackley apologized on Twitter,, "I beg mercy."

I mean, they started designing the second Xbox in 2002, a time when Microsoft and Intel were at their peak hubris, and Nvidia wasn't far behind. There was no way their combined corporate egos would allow them to collaborate on a project and hit the required console price points; that simply wasn't going to happen. Not to mention the ongoing legal trouble between Microsoft and Nvidia over chip prices, which wouldn't be resolved until 2003, at which point the 360 had already been in development for over a year.

Here is a really good read on the development of the Xbox 360; it was copied from a much older forum that is long since gone:
https://www.psu.com/forums/threads/...w-ibm-out-foxed-intel-with-the-xbox-36.16930/

But yeah, basically Microsoft and IBM worked out a deal the others couldn't hope to match. IBM got to double-dip on their costs, as much of the development of the tech needed for the PS3 could be shared with the Xbox 360, which in turn made things cheaper for both Microsoft and Sony. And Microsoft's shareholders were not willing to take another bath on the hardware as they had on the original console: given they lost an average of $168 per console and sold over 24 million of them, the hardware cost them north of $4B, and the shareholders didn't like that one bit.
 