Intel 2022 Keynote

That projection presentation screen they are behind is epic. I got this in the mail today even though I haven't been in the program since 2017.
 

Attachments: IMG_20220928_015157442~2.jpg, Screenshot_2022-10-03-10-50-48.png


Holy shit, this is an official slide! I thought some 3rd party made it to show where the AMD X3D would fall, but this is actual Intel bullshit graph shenanigans to hide the Raptor Lake losses, aaahahahaha!
 
View attachment 515802

Holy shit, this is an official slide! I thought some 3rd party made it to show where the AMD X3D would fall, but this is actual Intel bullshit graph shenanigans to hide the Raptor Lake losses, aaahahahaha!
You have a very strange definition of hide. Nothing is hidden there; it just takes more than a glance to see it. And losses? In 3 games it loses to the 5800X3D, 2 look like a tie or margin of error, and in 3 other games they seem to beat the 5800X3D. Only a die-hard hater would classify this as hiding. Obscuring? Yeah, sure. But hiding is some next-level reach there.

This graph could just as easily have been used to obscure the fact that Intel wins half the time as it could to obscure that it loses half the time. It seems to me that Intel just doesn't want to draw too much attention to their competitor in a graph, whoop de doo. 🤷‍♂️
 
You have a very strange definition of hide. Nothing is hidden there; it just takes more than a glance to see it. And losses? In 3 games it loses to the 5800X3D, 2 look like a tie or margin of error, and in 3 other games they seem to beat the 5800X3D. Only a die-hard hater would classify this as hiding. Obscuring? Yeah, sure. But hiding is some next-level reach there.

This graph could just as easily have been used to obscure the fact that Intel wins half the time as it could to obscure that it loses half the time. It seems to me that Intel just doesn't want to draw too much attention to their competitor in a graph, whoop de doo. 🤷‍♂️
Obscuring, yes. I will give them credit for including it, I guess. They probably figured they would get clobbered if they didn't. Yes, they only lose 2... but they tie in two others, and those wins, I mean, look at the scale, they are barely wins. Then of course there is the fact that this is their flagship vs. a mid-range part. I know the 3D is an outlier until AMD releases a 3D part for the 7000s.

If anything, this should be showing consumers that if you're just a gamer... a flagship is a terrible waste of your money. Especially with high-end GPUs apparently now regularly priced north of $1k.

Then the other kicker is, if you're a normal home user of higher-end CPU things... how many of them are really CPU things? I mean, if you have a Blender hobby, again it's the GPU that is going to matter to you more than having high-end CPU rendering.

Intel is still going to sell these; people are still going to overbuy. Anyway, yeah, obscuring is the better word, being kind, but I guess we have to give Intel some recognition for including it.
 
You have a very strange definition of hide. Nothing is hidden there; it just takes more than a glance to see it. And losses? In 3 games it loses to the 5800X3D, 2 look like a tie or margin of error, and in 3 other games they seem to beat the 5800X3D. Only a die-hard hater would classify this as hiding. Obscuring? Yeah, sure. But hiding is some next-level reach there.

This graph could just as easily have been used to obscure the fact that Intel wins half the time as it could to obscure that it loses half the time. It seems to me that Intel just doesn't want to draw too much attention to their competitor in a graph, whoop de doo. 🤷‍♂️
This is just like Nvidia trying to make their product look as good as possible. They're selling something, so it should come as no surprise.
 
This is just like Nvidia trying to make their product look as good as possible. They're selling something, so it should come as no surprise.

All of these companies have cheated to win benchmarks and show themselves in the best possible light. It's been going on as long as these companies have been competing with each other.
 
You have a very strange definition of hide. Nothing is hidden there; it just takes more than a glance to see it. And losses? In 3 games it loses to the 5800X3D, 2 look like a tie or margin of error, and in 3 other games they seem to beat the 5800X3D. Only a die-hard hater would classify this as hiding. Obscuring? Yeah, sure. But hiding is some next-level reach there.

This graph could just as easily have been used to obscure the fact that Intel wins half the time as it could to obscure that it loses half the time. It seems to me that Intel just doesn't want to draw too much attention to their competitor in a graph, whoop de doo. 🤷‍♂️
Please, arguing semantics is Intel-level here. Mixing bars that are favorable to them with little lines that are much less so is bullshit, whatever you want to call it.
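
Purely as an illustration of the charting trick being complained about (made-up numbers, not Intel's actual data), here is roughly how you would build that style of slide with matplotlib: your own results as big bars, the competitor reduced to thin tick marks floating over them.

import matplotlib.pyplot as plt
import numpy as np

# Hypothetical relative scores, invented only to show the chart style.
games = ["Game A", "Game B", "Game C", "Game D"]
ours = [1.12, 1.05, 1.18, 0.97]
competitor = [1.02, 1.15, 1.10, 1.08]

x = np.arange(len(games))
fig, ax = plt.subplots()
ax.bar(x, ours, width=0.6, label="Our CPU (bars)")
# The competitor only appears as short horizontal ticks over each bar,
# technically present but visually de-emphasized.
ax.hlines(competitor, x - 0.3, x + 0.3, colors="black", linewidth=2,
          label="Competitor (tick marks)")
ax.set_xticks(x)
ax.set_xticklabels(games)
ax.set_ylabel("Relative performance")
ax.legend()
plt.show()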
 
It’s a weird graph. It took me a triple take to figure what the hell is going on.

I'm back and forth to whether or not I think they should have even mentioned the X3D on that chart.
 
If anything, this should be showing consumers that if you're just a gamer... a flagship is a terrible waste of your money.
Always has been, since forever.
This is just like Nvidia trying to make their product look as good as possible. They're selling something, so it should come as no surprise.
Nvidia doesn't have to try to make their product look good; the product speaks for itself. Whether you like them as a company or not, they consistently put out top-tier products and push the envelope in graphics over and over. Full stop.

We could argue the merits of their pricing ad nauseam, but we would be wasting our breath. They are in a unique situation, and they are the ones calling the shots, trying to steer the whole 3000/4000 series storm. All we can do is sit, watch, and hope GPU pricing comes back to a sane level somewhere down the road.
 
Intel is still going to sell these; people are still going to overbuy.
We see them a lot on the Internet, people buying 24 GB of VRAM or a 16-core CPU to game, but outside of the very rich, I would imagine people tend to look at reviews before going for expensive items.

According to the Steam survey:
https://store.steampowered.com/hwsurvey/cpus/
94% of people are on 8 cores or fewer. I could imagine half of the people above that being people who game on PC but also do multithreaded workloads from time to time and would not have bought that many cores otherwise.

And those people buying many-core CPUs to game were often buying the higher cache and bin, not the cores, the 5800X3D being a nice choice for them.

It has been so long that buying the flagship CPU has been a terrible return on money for a gamer (I do not remember a time when that was not the case, whether because of a very interesting model to overclock or because the flagship was not adding much on the gaming side), that the GPU matters much more, etc... that it must be common wisdom by now.

Some people vastly overblew the issue of getting just 6 cores (the 12400 showing that very well years later), but I do not remember anyone trying to sell the idea that buying at the very high end of CPUs for gaming was not heavily diminishing returns. That is for very rich people.
 
You have a very strange definition of hide. Nothing is hidden there; it just takes more than a glance to see it. And losses? In 3 games it loses to the 5800X3D, 2 look like a tie or margin of error, and in 3 other games they seem to beat the 5800X3D. Only a die-hard hater would classify this as hiding. Obscuring? Yeah, sure. But hiding is some next-level reach there.

This graph could just as easily have been used to obscure the fact that Intel wins half the time as it could to obscure that it loses half the time. It seems to me that Intel just doesn't want to draw too much attention to their competitor in a graph, whoop de doo. 🤷‍♂️

It's not a strange definition of hide; it IS the definition of hide. To hide is to conceal something, and when you conceal a result as a weird candlestick element on a bar graph, that is trying to hide it.

It's just business as usual with Intel PR. They've always been petty, childish, and dishonest in their PR slides going back decades. They wouldn't have to rig up stupid graphs to try and rationalize their products' existence if they didn't make terrible products in the first place /COUGH 11th gen, NetBurst /COUGH.

The pettiness, childishness, and dishonesty in Intel's PR slides always seem to coincide with how well they're doing in the market. If they're releasing wastes of silicon like NetBurst or the 11th gen, they lay it on thick. When they're somewhat even, you get nonsense charts like the one above. When they're far ahead, they generally ignore AMD entirely.

Remember when AMD was going to a chiplet approach with Epyc, and Intel slandered them for "using glued-together cores"? Intel had done the same thing as far back as the Pentium D, Core 2 Quad, and numerous mobile chips.
 
Always has been, since forever.

Nvidia doesn't have to try to make their product look good; the product speaks for itself. Whether you like them as a company or not, they consistently put out top-tier products and push the envelope in graphics over and over. Full stop.

We could argue the merits of their pricing ad nauseam, but we would be wasting our breath. They are in a unique situation, and they are the ones calling the shots, trying to steer the whole 3000/4000 series storm. All we can do is sit, watch, and hope GPU pricing comes back to a sane level somewhere down the road.
Haha, you should have read a bunch of the 40-series reveal thread; LOTS of people were claiming that Nvidia was outright misrepresenting performance numbers.
 
It's not a strange definition of hide; it IS the definition of hide. To hide is to conceal something, and when you conceal a result as a weird candlestick element on a bar graph, that is trying to hide it.

It's just business as usual with Intel PR. They've always been petty, childish, and dishonest in their PR slides going back decades. They wouldn't have to rig up stupid graphs to try and rationalize their products' existence if they didn't make terrible products in the first place /COUGH 11th gen, NetBurst /COUGH.

The pettiness, childishness, and dishonesty in Intel's PR slides always seem to coincide with how well they're doing in the market. If they're releasing wastes of silicon like NetBurst or the 11th gen, they lay it on thick. When they're somewhat even, you get nonsense charts like the one above. When they're far ahead, they generally ignore AMD entirely.

Remember when AMD was going to a chiplet approach with Epyc, and Intel slandered them for "using glued-together cores"? Intel had done the same thing as far back as the Pentium D, Core 2 Quad, and numerous mobile chips.
Again, they aren't concealing anything. It's right there on the graph. Anyone who isn't just glancing at it for half a second can easily read it. Anyone stating otherwise is being disingenuous at best.

Haha, you should have read a bunch of the 40-series reveal thread; LOTS of people were claiming that Nvidia was outright misrepresenting performance numbers.
It's always so strange to see how heated people get when their preferred choice of product is getting overshadowed by a competitor's, and just how far they'll go to defend some misguided sense of honor of a corporation that doesn't give a rat's ass about end users (none of them do).
 
Again, they aren't concealing anything. It's right there on the graph. Anyone who isn't just glancing at it for half a second can easily read it. Anyone stating otherwise is being disingenuous at best.

It's always so strange to see how heated people get when their preferred choice of product is getting overshadowed by a competitor's, and just how far they'll go to defend some misguided sense of honor of a corporation that doesn't give a rat's ass about end users (none of them do).
Nvidia is touting their DLSS 3 as if it is the only way to play. This draws attention away from the actual performance of the hardware, and the talking points and marketing are effectively trying to steer everything this way. Prices are stupid, and all the shill sites are attempting to justify them.

I firmly believe that, one of these days, Nvidia will be a dead-end platform. The future is really in integrated graphics on the CPU. But everyone is milking the shit out of as many market segments as they can. I know very few people who can dump a fuckton of money into a discrete graphics card. My buddy and I are always donating systems and graphics cards to our friends and acquaintances because they can't afford to purchase a gaming rig. I'm lucky; I never re-married, and I don't have kids. So, I can do this from time to time (buy expensive graphics cards). I am pretty done with Nvidia though; they have performance and whatnot, but I don't see the value in their products, and I have been burned more than once on their shoddy hardware. ATI/AMD hasn't always been great, but I don't recall the last time I ever felt like they fucked me.

I think all corporations are evil. Some just have a longer track record of devious behavior. Intel and Nvidia are the two biggest when it comes to misleading marketing and anti-competitive practices. AMD is getting there, gradually. If they had a technological lead so great that no one could touch them within a generation, they would be gouging the living shit out of their consumers. In some ways, the platform pricing for AM5 is exactly that. The acceptance of the new technologies coming out of Team Red has been rather dismal.

I, personally, don't get too excited about technology these days. Nothing is really knocking my socks off so to speak.

I am trying really hard to sit this product cycle out, but I have this birthday money that wants to burn a hole in my pocket... So, kind of waiting to see what Intel 13th Gen actually does. I suspect it's just a refinement of 12th gen, but my adoption costs should be lower with an affordable DDR4 MB. I have been loading all my systems up with AMD's hardware for a couple years now. Kinda itching to give the evil Intel corp another go.
 
I went to the link that Intel listed on the slide that Meeho posted looking for actual numbers, but none were found. I did find this listed...

intel 14th & 15th gen.png


Intel apparently used 14th and 15th gen CPUs in this comparison. No wonder Intel got better results than the Ryzen CPUs!!

I also thought it was interesting that Intel was using an "Intel Internal Validation board," so their results can't be independently validated or disputed. Also of interest was Intel not using identical DDR5 memory in their 13900 and 12900 tests. Of course, the 12900 received the slower memory.

I enjoyed Gamers Nexus' take on this Intel keynote. Back to you, Steve...



Thanks Steve!
 
The future is really in integrated graphics on the CPU.
Why would Nvidia not be one of the best-positioned companies in the world for when that happens?
Tegra is one of the best-selling gaming SoCs of all time (by far the best-selling outside cellphones?).

They have SoCs in the self-driving car industry, game consoles, AI, the Jetson family; their attempt to acquire ARM seems to show that, like Apple, AMD, and Intel, they are well aware of that possibility and have a large presence in that world.

A lot of Nvidia's advancements make a lot of sense in the places where SoCs make a lot of sense (and a standalone wireless VR headset could benefit a lot from DLSS 3 and whatnot, same for a Switch or other handheld device).
 
View attachment 515802

Holy shit, this is an official slide! I thought some 3rd party made it to show where the AMD X3D would fall, but this is actual Intel bullshit graph shenanigans to hide the Raptor Lake losses, aaahahahaha!

It's always difficult to find benchmarks of World of Warcraft on brand-new hardware. I was VERY surprised to see it on the Intel slide. Not only that, but it straight-up shows their new 13-series CPU getting stomped by the 5800X3D. And only 6% faster than regular Zen 3? I doubt that Intel would exaggerate performance numbers for their competitor's product, so the numbers are likely real. Kudos to Intel for actually showing situations where their stuff is not the fastest.
 
Why would Nvidia not be one of the best-positioned companies in the world for when that happens?
Tegra is one of the best-selling gaming SoCs of all time (by far the best-selling outside cellphones?).

They have SoCs in the self-driving car industry, game consoles, AI, the Jetson family; their attempt to acquire ARM seems to show that, like Apple, AMD, and Intel, they are well aware of that possibility and have a large presence in that world.

A lot of Nvidia's advancements make a lot of sense in the places where SoCs make a lot of sense (and a standalone wireless VR headset could benefit a lot from DLSS 3 and whatnot, same for a Switch or other handheld device).
Nvidia is not involved in driving the PC market forward with CPUs. You will never see their graphics in an Intel or AMD chip. Last I checked, Intel and AMD are really the only two companies for PCs these days.

Maybe they would be well positioned if they actually made CPUs, but they don't. They failed to acquire ARM; they only license the technology.

I'm not saying that Nvidia is going to disappear.

The way I look at it is that Nvidia's closed ecosystem of DLSS stuff is a cheat. Your games have to be pre-rendered by Nvidia supercomputers and supported. It's meant to lock you into their products. DLSS is shit because it's not real. DLSS 3.0 is being touted as 4000-series only while it can be run on 2000 and 3000 series boards as well. It's predatory business, and one of these days that shit is gonna bite them in the ass.

AMD might be an evil corporation, but at least their technologies are open to everyone. Their technologies don't require games to be pre-rendered on a supercomputer to gain a bonus in frames per second, and the technology is really close to DLSS in terms of quality... And it runs on everything.

Karma's a bitch, and it's coming for Nvidia one of these days.
 
Your games have to be pre-rendered by Nvidia supercomputers and supported. It's meant to lock you into their products.
It became generic with DLSS 2.0, from my limited understanding; no need to train on a specific game anymore.

Maybe they would be well positioned if they actually made CPUs, but they don't. They failed to acquire ARM; they only license the technology.
ARM CPUs could become good enough that just using them will be good enough.

It is really hard to predict, especially with what the console world makes possible in terms of breaking paradigms versus the PC world. An ARM SoC à la Apple M1 with an Nvidia GPU, with some specialized hardware paths for frequent game-engine functions instead of media (IK, collision detection, etc.), could be hard to beat.

As for it being shit because it is not real: not being real is exactly the point and exactly the strength of it, and it is something already used massively in audio and in non-real-time video. Intelligently playing with prediction and with what our brains fill in for what is missing should be a promising avenue.

Running DLSS 3.0 on a 2xxx/3xxx card would have come with a different hit (the artifact level needed to avoid adding a giant amount of latency could have given a bad name to the tech).
 
Nvidia is not involved in driving the PC market forward with CPUs. You will never see their graphics in an Intel or AMD chip. Last I checked, Intel and AMD are really the only two companies for PCs these days.

Maybe they would be well positioned if they actually made CPUs, but they don't. They failed to acquire ARM; they only license the technology.

I'm not saying that Nvidia is going to disappear.

The way I look at it is that Nvidia's closed ecosystem of DLSS stuff is a cheat. Your games have to be pre-rendered by Nvidia supercomputers and supported. It's meant to lock you into their products. DLSS is shit because it's not real. DLSS 3.0 is being touted as 4000-series only while it can be run on 2000 and 3000 series boards as well. It's predatory business, and one of these days that shit is gonna bite them in the ass.

AMD might be an evil corporation, but at least their technologies are open to everyone. Their technologies don't require games to be pre-rendered on a supercomputer to gain a bonus in frames per second, and the technology is really close to DLSS in terms of quality... And it runs on everything.

Karma's a bitch, and it's coming for Nvidia one of these days.
AMD is only open because they don't have the manpower or market share to close it off. DLSS is a cheat in the same way that anti-aliasing is a cheat, or the SLI profiles of old were a cheat.
Nvidia is certainly closed to a degree; they have a lot of proprietary stuff, but to date there aren't really any open alternatives that do what their stuff does. They are pushing things forward, almost uncomfortably so.
And I think you misunderstand how DLSS works: nothing is pre-rendered. They use an AI to generate algorithms specific to the games and then include those algorithms in their driver updates, the same way they used to include SLI profiles and how they currently include game optimizations in their driver updates. But those algorithms are run locally on the hardware.
AMD doesn't have access to the AI backend that Nvidia does to attempt this; instead, AMD's method uses a combination of standard upscaling methods that they have averaged out as a best-of-a-bad-situation sort of deal. I do like AMD's approach due to its relative simplicity, but it lacks features and depth compared to Nvidia's.

The big difference in looking at the approach of the two companies is that AMD offers products and Nvidia offers ecosystems, two very different methodologies for dealing with things.
I think it would be cool if Nvidia made more consumer CPUs. Their existing ARM lineup is beastly for what it is, and having more options there would be entertaining, especially for the emulation community. The Jetson Nanos are super fun little compute cards.
I am looking at ordering up one of the new Orins because it may finally have the power I want for some of the later-gen arcade games.
 
It became generic with DLSS 2.0, from my limited understanding; no need to train on a specific game anymore.


ARM CPUs could become good enough that just using them will be good enough.

It is really hard to predict, especially with what the console world makes possible in terms of breaking paradigms versus the PC world. An ARM SoC à la Apple M1 with an Nvidia GPU, with some specialized hardware paths for frequent game-engine functions instead of media (IK, collision detection, etc.), could be hard to beat.

As for it being shit because it is not real: not being real is exactly the point and exactly the strength of it, and it is something already used massively in audio and in non-real-time video.

Running DLSS 3.0 on a 2xxx/3xxx card would have come with a different hit (the artifact level needed to avoid adding a giant amount of latency could have given a bad name to the tech).
The big thing the 2xxx/3xxx cards lack that DLSS 3.0 requires is the hardware optical flow accelerator. The 2000 and 3000 series cards do calculate optical flow, but they do it at a software level, so they don't necessarily have the performance needed to insert frames in the same way and are more prone to artifact glitches due to corners being cut to maintain the needed calculation speeds.
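
For anyone curious what "insert frames from optical flow" means in practice, here is a rough CPU-only sketch using OpenCV's generic Farneback flow: estimate dense motion between two frames, then backward-warp the first frame halfway along that motion to fake an in-between frame. This is only a conceptual illustration of the general technique, not Nvidia's OFA hardware path or what DLSS 3 actually does internally, and the crude half-flow warp is exactly where occlusion artifacts creep in.

import cv2
import numpy as np

def interpolate_midframe(frame_a, frame_b):
    gray_a = cv2.cvtColor(frame_a, cv2.COLOR_BGR2GRAY)
    gray_b = cv2.cvtColor(frame_b, cv2.COLOR_BGR2GRAY)

    # Dense optical flow: flow[y, x] is the (dx, dy) motion of the pixel
    # at (x, y) going from frame_a to frame_b.
    flow = cv2.calcOpticalFlowFarneback(gray_a, gray_b, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)

    h, w = gray_a.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))

    # Backward-warp frame_a halfway along the flow. Sampling at
    # (x - 0.5*dx, y - 0.5*dy) is a crude approximation of the in-between
    # frame; it ignores occlusions, which is where visible artifacts come from.
    map_x = (grid_x - 0.5 * flow[..., 0]).astype(np.float32)
    map_y = (grid_y - 0.5 * flow[..., 1]).astype(np.float32)
    return cv2.remap(frame_a, map_x, map_y, interpolation=cv2.INTER_LINEAR)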
 
It's always difficult to find benchmarks of World of Warcraft on brand-new hardware. I was VERY surprised to see it on the Intel slide. Not only that, but it straight-up shows their new 13-series CPU getting stomped by the 5800X3D. And only 6% faster than regular Zen 3? I doubt that Intel would exaggerate performance numbers for their competitor's product, so the numbers are likely real. Kudos to Intel for actually showing situations where their stuff is not the fastest.
Well, probably because there aren't many people who care to see how a CPU performs in a nearly 20-year-old game.
 
Somewhat related to my previous post, I see Nvidia has snuck in that they are releasing a new Orin chip in 2023 that operates in the 5-10 W range, to completely supplant the original Jetson Nano based on the Tegra stuff.
What is interesting here is that the Jetson Nano is famous for basically being the guts of the Nintendo Switch (fewer cores at a lower clock), and we know the old Tegra stuff is no longer produced, so I wonder if this could possibly be the guts of the "Switch Pro" that has been rumored for 2023.

https://coolinglass.com/2022/09/nvidia-introduced-jetson-orin-nano/
https://www.nvidia.com/en-us/autonomous-machines/embedded-systems/jetson-orin/
 
we know the old Tegra stuff is no longer produced
Wasn't that a rumor, speculation of a refresh that never occurred? There have been millions of Switch sales between the rumored stop time and now, and the Switch is entering the Black Friday/Christmas season with the same Tegra.
 
Wasn't that a rumor, speculation of a refresh that never occurred? There have been millions of Switch sales between the rumored stop time and now, and the Switch is entering the Black Friday/Christmas season with the same Tegra.
Who knows, it's all rumors, and Nintendo is famous for its tight lips; they have proven time and time again that they will burn your contract and fight a court battle with you over leaking their stuff before their announcements.
The last rumor I heard was that Nintendo bought up a crapload of the updated chips before Nvidia killed off the lineup so they had a stockpile, but other rumors say that the lineup was still too high in demand for Nvidia to kill because they didn't have any alternatives at the Nano price point, which was sub-$200. I mean, the Jetson Nano was a $130 SBC; its "replacement" was the Jetson Xavier, which was $400 and required a dedicated development board for another $600 or so. I know we've kept our Jetson Nanos in our classroom labs because, despite the fact they are now old as dirt, there isn't anything in their price range to replace them with.
 
Somewhat related to my previous post, I see Nvidia has snuck in that they are releasing a new Orin chip in 2023 that operates in the 5-10 W range, to completely supplant the original Jetson Nano based on the Tegra stuff.
What is interesting here is that the Jetson Nano is famous for basically being the guts of the Nintendo Switch (fewer cores at a lower clock), and we know the old Tegra stuff is no longer produced, so I wonder if this could possibly be the guts of the "Switch Pro" that has been rumored for 2023.
Why all this Nvidia and ARM stuff in a thread about Intel's x86 CPUs? Tegra is not one but several versions of SoC, as is Snapdragon (phones, Amazon Fire Stick, Oculus VR, etc.), which is among those Nvidia would primarily compete against in the ARM space (Qualcomm, Samsung, Apple, MediaTek, etc.).

Nvidia will probably come out with a good SoC at a later stage, primarily built for gaming; I am almost certain of that (perhaps even with their Vulkan-based "Rosetta"). But Nvidia is a sidenote in Intel's keynote besides their new GPUs. They have no x86 SoC and are already basically being squeezed out of the low-end and entry-level market now that Intel has an "APU" that can do some light gaming (Iris Xe), and if you look at Intel's past, they would probably squeeze Dell, HP, Lenovo, and other businesses to bundle Intel's Arc GPUs with their CPUs in laptops and prebuilds instead of Nvidia's. Intel got sued and lost in the past against AMD, but they might do the same towards Nvidia now that they have their own GPUs, and perhaps they have learned how not to lose if they get sued again.

So, concerning Nvidia in this Intel keynote, the only thing I gather from it is that Intel is pitting their GPU up against Nvidia in the slides when it comes to price. The gauntlet has been thrown against Nvidia. The only relevance a potential Nvidia SoC has in this is that Intel now has an incentive to squeeze Nvidia out of the high-volume, low-end to mid-range market when it comes to prebuilds and laptops, so it might be a chicken-and-egg situation. Does Nvidia build their own SoC and release it to the laptop and prebuild market before or after they get screwed and squeezed out by Intel... What do you think Intel will do to gain market share for their own Arc GPUs?
 
Why all this Nvidia and ARM stuff in a thread about Intel's x86 CPUs? Tegra is not one but several versions of SoC, as is Snapdragon (phones, Amazon Fire Stick, Oculus VR, etc.), which is among those Nvidia would primarily compete against in the ARM space (Qualcomm, Samsung, Apple, MediaTek, etc.).

Nvidia will probably come out with a good SoC at a later stage, primarily built for gaming; I am almost certain of that (perhaps even with their Vulkan-based "Rosetta"). But Nvidia is a sidenote in Intel's keynote besides their new GPUs. They have no x86 SoC and are already basically being squeezed out of the low-end and entry-level market now that Intel has an "APU" that can do some light gaming (Iris Xe), and if you look at Intel's past, they would probably squeeze Dell, HP, Lenovo, and other businesses to bundle Intel's Arc GPUs with their CPUs in laptops and prebuilds instead of Nvidia's. Intel got sued and lost in the past against AMD, but they might do the same towards Nvidia now that they have their own GPUs, and perhaps they have learned how not to lose if they get sued again.

So, concerning Nvidia in this Intel keynote, the only thing I gather from it is that Intel is pitting their GPU up against Nvidia in the slides when it comes to price. The gauntlet has been thrown against Nvidia. The only relevance a potential Nvidia SoC has in this is that Intel now has an incentive to squeeze Nvidia out of the high-volume, low-end to mid-range market when it comes to prebuilds and laptops, so it might be a chicken-and-egg situation. Does Nvidia build their own SoC and release it to the laptop and prebuild market before or after they get screwed and squeezed out by Intel... What do you think Intel will do to gain market share for their own Arc GPUs?
Almost guaranteed that Intel sells the OEMs their Arc GPUs for dirt cheap; not at a discount that would get them in trouble, but selling them at a loss and making it clear they are selling them at a loss to gain market share.
Because if Intel says "we'll sell it to you at half off, even though it's a loss to us, because we want market share," they can get sued, but if Intel just says "we are selling these at a loss because we want market share," then it's perfectly fine...
But I really expect them to bundle them with OEM gaming laptops at a price that the existing Nvidia 3050 Ti can't compete against, to give them something that could game in the $600-800 range, which is pretty empty of viable options at the moment.
 
Well, probably because there aren't many people who care to see how a CPU performs in a nearly 20-year-old game.

World of Warcraft has had 8 expansions, with the 9th coming in a couple of months. Every expansion has brought significant improvements and changes to the game engine. When the game was released you could play it on a 3dfx Voodoo3 or an Nvidia TNT2. Now it's a DirectX 12 game that supports ray tracing. To imply that it's the same as it was when it was released just shows that you have no idea about the game. It is an extremely CPU-dependent game, so yes, CPU performance absolutely matters. It was a noticeable upgrade in WoW when I swapped my 3900X for a 5900X. Being that it's still one of the most popular MMO games out today, I'd say there are plenty of people who care.
 
World of Warcraft has had 8 expansions, with the 9th coming in a couple of months. Every expansion has brought significant improvements and changes to the game engine. When the game was released you could play it on a 3dfx Voodoo3 or an Nvidia TNT2. Now it's a DirectX 12 game that supports ray tracing. To imply that it's the same as it was when it was released just shows that you have no idea about the game. It is an extremely CPU-dependent game, so yes, CPU performance absolutely matters. It was a noticeable upgrade in WoW when I swapped my 3900X for a 5900X. Being that it's still one of the most popular MMO games out today, I'd say there are plenty of people who care.
You can polish a turd six ways to Sunday, but that game never looked good anyhow. Plenty of people play Overwatch, but reviewers don't waste time on it for similar reasons.
 
You can polish a turd six ways to Sunday, but that game never looked good anyhow. Plenty of people play Overwatch, but reviewers don't waste time on it for similar reasons.

Do you actually have anything to contribute to this thread beyond being a shit disturber? Just because you don't care for a particular game doesn't mean it's not important enough to benchmark. Get over yourself.
 
Do you actually have anything to contribute to this thread beyond being a shit disturber? Just because you don't care for a particular game doesn't mean it's not important enough to benchmark. Get over yourself.
I'm just trying to explain to you why no reviewers are putting up WoW benchmarks.
 
I'm just trying to explain to you why no reviewers are putting up WoW benchmarks.

Your "explanation" was a thinly-veiled condescending attempt at trolling. Intel seemed to think WoW was important enough to benchmark, despite not even showing their own products in a particularly positive light. And WoW benchmarks are not actually that rare, they just usually tend to coincide with the release of a new expansion, not necessarily with the release of new hardware, as in this case. I expect to see quite a few benchmarks when Dragonflight is released in November.
 
Your "explanation" was a thinly-veiled condescending attempt at trolling. Intel seemed to think WoW was important enough to benchmark, despite not even showing their own products in a particularly positive light. And WoW benchmarks are not actually that rare, they just usually tend to coincide with the release of a new expansion, not necessarily with the release of new hardware, as in this case. I expect to see quite a few benchmarks when Dragonflight is released in November.
No, it was not trolling; quit being so sensitive. I played WoW at release and for a year or so afterward; I don't hate the game or anything like that. I'm also looking forward to playing OW2 some. I'm just giving my opinion that neither is really a worthy benchmark at this point; I'd much rather see newer titles, with perhaps one old-school title like CS:GO thrown in, even though it's not really very relevant anymore.
 