Vega Rumors

grtitan

Just saw this on Reddit and wanted to share it.

Please note, these are rumors, so take them as such:

""
[Rumor] Full Vega Lineup And Release Date Revealed! (self.Amd)

submitted 3 hours ago by RadeonRebel ([email protected] | 16GB C14@3200 | GTX 780 > Vega)

Hello, I am here to bring you some important information about Vega. Yes I know many people here have been waiting patiently for some substantial information about Vega from AMD. Well I am happy to say your wait has finally come to an end. Today I will relieve you of your suffering and reveal the full Vega lineup and release dates!



The RX Vega lineup will come in 3 flavors to cater for different price points. The stated aim is to contend directly against the GTX 10 series.



THE LINEUP:



RX Vega "Core" - RRP: $399.99 (1070 perf or better)

RX Vega "Eclipse" - RRP: $499.99 (1080 perf or better)

RX Vega "Nova" - RRP: $599.99 (1080TI perf or better)



Please note: The RX Vega subnames were still being finalized by the boardroom at the time, so they may have changed by now. I can confirm the alternatives were "Dwarf", "Super", and "Hyper", based on the Hertzsprung–Russell diagram, which the engineers were in favor of but the marketing department not so much.



And now for the grand finale...



DATES:



The Vega teaser is May 16.

The full Vega reveal is May 30 - June 3.

The official release of Vega is June 5.



P.S. Do not ask me for my source. If I told you, my friend could be fired. Let's just say it is rather "Sunny" where he works... ;)

P.P.S. Let the downvotes begin. I expect that I will be flamed for revealing this info despite it being from a legit source. If you don't believe me, feel free to come back to this post in 3 weeks.



So are you ready to join the #RadeonRebellion? Which Vega card are you most interested in?

""

Link

Again, this is just a rumor, so take it with a big boatload of salt.
 
So, I read it. And I immediately thought that anyone who has followed the Radeon RX Vega Discussion thread could write this up. Honestly, most of this info could be guessed by anyone on this forum. The Vega teaser is the AMD Analyst Day, the full Vega reveal is Computex, the 3 different cards... I mean, hello, no shit AMD would want to slot their cards into each price point Nvidia is running. Estimating performance that broadly gives us nothing.

There are only two things in that whole post worth checking for truthfulness: the Vega release date of June 5, and the RRPs. Some people believe AMD will compete with Nvidia up to the 1080 Ti, some don't.
 
Hopefully Nova doesn't require super-nova levels of energy in order to function against a 1080ti.
 
If you believe that, I've got a Vega engineering sample I'll sell you for $1,200 in Bitcoin...
 
Yeah, if it was just legit Vega info there would be some slides (about how innovative it is), an AMD t-shirt (about how cool it is), some bogus clocks (about how fast it is) and some likely/probably/hopefully links like the above, all in less than a quarter of a forum page. It's actually a good thing that those threads grow, even if they are off topic sometimes. It keeps the spotlight on AMD. The downside is that if AMD fails to deliver (and in some cases they will, because of over-hyping) we'll see their stock take a hit again.
 
Since the Vega goalpost keeps changing, why not GV104 performance? :D
 
Since the Vega goalpost keeps changing, why not GV104 performance? :D

Man, you still mad that you paid the Nvidia Founders Edition tax? Jeez, you run really slow RAM, and only 16GB at that. Oh wait, this is about a rumor, not some stupid comment that added nothing of value to the thread.
 
If Vega produces a 1080Ti performance card @ $600, I would pre-order.
However, I am skeptical but hopeful.

Here is the problem with what people want and what AMD might be delivering. Everyone wants their card at cheap prices, they're disgusted with Nvidia's markup on its high-end cards, and then they damn AMD for "not competing".
Vega is an HBM2 card, something that no other consumer card will deliver this year, and more than likely that will carry a price premium too. So I'm wondering how you would marry those things and still come away with the price people want; it just makes little sense.

If anything was learned from "last time", the card might not be cheap and more than likely will be a bit more expensive for the performance it delivers. This is the only realistic outlook for RX Vega at this point in time...
 
If Vega produces a 1080Ti performance card @ $600, I would pre-order.
However, I am skeptical but hopeful.

I wouldn't pre-order the card based off what AMD "says" its performance is, even if it actually came out exceeding the 1080 Ti. I would wait for a few pro/anti-AMD sites/YouTubers to get their hands on them and review them before making the decision.

Since the Vega goalpost keeps changing, why not GV104 performance?

I am sure the next leak will suggest that it will exceed Volta... In all honesty I hope AMD competes again at the high end of the market, but people easily forget that AMD has had cards that exceeded Nvidia's Titan (well, maybe not their $3000 version…). But they were also charging $1000+ for their cards in the past.

Man, you still mad that you paid the Nvidia Founders Edition tax? Jeez, you run really slow RAM, and only 16GB at that. Oh wait, this is about a rumor, not some stupid comment that added nothing of value to the thread.

What tax? Bought my 1080 Ti FE at MSRP and that never changed. Now if you're talking about the "past" initial 1xxx FE tax… Nope, the $100 wasn't missed. However, you should be able to acknowledge that the goalpost has moved from hoping to exceed the 1070, to a presentation that showed it on par with/exceeding the 1080 (in certain games), to now let's hope it offers Ti/Titan Xp performance.

This is why people get upset on either side: they blindly ride the hype train, think that their company can do nothing wrong, and treat anyone who says anything different as a heretic. Both sides have had some incredible marketing, slide shows, and presentations, only for the real product to be horrible.

Either way, buy whatever provides the best cost-to-performance for what you are trying to do and stop hating on other people's decisions.
 
I was just linked to a video about Volta, and in it the YouTuber stated Vega will force Volta's hand, lol. So yeah, it's already begun; the hype train has started.

AMD loyalists are AMD's own worst enemy guys.
 
I was just linked to a video about Volta, and in it the YouTuber stated Vega will force Volta's hand, lol. So yeah, it's already begun; the hype train has started.

AMD loyalists are AMD's own worst enemy guys.

Luckily, it's not the majority that expect that. I think most of us have tempered our expectations. The sane ones, that is.
 
Nothing new here. What the OP posted makes sense given what we already know. The cards will not meet people's desire for a cheaper Nvidia alternative, given the HBM. However, since they will only have 16,000 cards globally in the first 3 months, they will sell. So now it is down to performance, power draw, temps, OC room. Oh, and drivers. For the same performance, temps, etc., I would go with an Nvidia card.
 
I like the idea of HBM2 memory; I am curious if it will make a real-world difference this time.
 
It did last time.

https://m.hardocp.com/article/2017/01/30/amd_video_card_driver_performance_review_fine_wine/14

That's why at 1440p and 4K in 2017 games a 2015 Fury X card with only 4GB of HBM is still able to be competitive with cards with 6, 8 and 12(11)GB of GDDR5 RAM.

Despite the haters.

What does that link prove, exactly? I don't see it compared to any other card of similar performance with non-HBM memory. What I see is the Fury X being compared to a much weaker RX 480, and a couple of games even had to be tested at 1080p because the RX 480 lacks the power to push the same settings at 1440p.

Also, you are seeing just average numbers there, not minimums and not frametimes, which is where a lack of VRAM is most noticeable. You don't know if the card is performing well if it isn't compared directly to a card of similar performance but with more VRAM.

Yes, the Fury X is a powerful card even today, but to say there isn't a performance issue or penalty from VRAM on the Fury X is just a big lie, even if it's only a "few titles" that require lowering texture and AA settings.
 
What does that link prove, exactly? I don't see it compared to any other card of similar performance with non-HBM memory. What I see is the Fury X being compared to a much weaker RX 480, and a couple of games even had to be tested at 1080p because the RX 480 lacks the power to push the same settings at 1440p.

Also, you are seeing just average numbers there, not minimums and not frametimes, which is where a lack of VRAM is most noticeable. You don't know if the card is performing well if it isn't compared directly to a card of similar performance but with more VRAM.

Yes, the Fury X is a powerful card even today, but to say there isn't a performance issue or penalty from VRAM on the Fury X is just a big lie, even if it's only a "few titles" that require lowering texture and AA settings.

The point was that zerogg was implying that 4GB of HBM sucked last time and didn't make a real-world difference.
The linked article shows that [H]ardOCP ate a bit of crow and said apparently 4GB of HBM wasn't the limitation they originally supposed it would be, because every single game they played in a retest two years later didn't have any issues at 1440p. If you'll recall, this site was one of the sites that openly stated 4GB of HBM wouldn't be enough in the near future, and that people should consider avoiding the card. Well, two years later it is doing just fine.

So I'm telling lies eh?

You want minimums? Here are quite a few minimums from many benches in Feb. Not one of them is terrible. Click through this and take a look.
http://www.techspot.com/review/1329-buying-gpu-radeon-fury-geforce-980/

I own a couple of Fury X cards in CrossFire. People endlessly berating the cards with no direct experience really gets quite old. The HBM RAM situation on the Furys has not been the issue it's made out to be over and over again on the forums. HBM not only held up quite well, it delivers an absolutely excellent experience when coupled with a FreeSync monitor. The only time I have to drop a setting or two is when I'm playing at 7680x1440 on my 3 (1440p) displays.

And more directly, the Fury X doesn't look bad against cards with 8GB and 12GB of RAM regardless. When you add CrossFire or SLI to the equation, it actually ends up on top of the performance pile. So I guess HBM RAM isn't really the problem you think it is after all, is it?
http://forums.guru3d.com/showthread.php?t=400781

[1440p benchmark chart]

[4K benchmark chart]


But then it never really was the problem you suggest it was.
To be perfectly clear, it never really did have serious struggles at high resolutions compared to the alternative GDDR5-based cards with triple the memory... Here are some benches from a 2015 TweakTown review with three 4K monitors.
Think a pair of HBM based 4GB Fury X in crossfire would struggle at 11,520x2160 compared to a pair of Titans with 12GB RAM?
You'd be mostly wrong
http://www.tweaktown.com/tweakipedi...re-triple-4k-eyefinity-11-520x2160/index.html

Here's the bottom line: by the time the 4GB HBM memory quantity becomes an issue, the actual processing power is a bigger issue and the FPS would be too low to matter, and the competitors with more RAM aren't doing much better on that front.


[TweakTown R9 Fury CrossFire triple-4K Eyefinity benchmark charts]




Back to a single card. Take a look at the benchmarks for Prey, probably the newest triple-A title out right now. How does the >2-year-old, 4GB HBM-based Fury X hold up compared to the newest competition at 4K resolution? Minimums vs averages?
It's holding up quite well I'd say...

[Prey 4K benchmark chart]



I'm glad AMD is sticking with HBM for Vega. It wasn't a failure last time, and it's unlikely to be a failure this time.
 
AMD has engineers sit and manage Fury's memory so it doesn't run out. It has nothing to do with HBM.

And even Hynix now says HBM is...well...a bigass failure unless you want ECC.
You mean with their "terrible" driver support you Nvidia fan bois are always warning about? B.S. I guess we'll see if that holds water in another couple of years, when the engineers' attention moves to Vega and Fury cards are fading out in the rear-view mirror. But somehow I doubt it, since even 7970 AMD owners have remained quite satisfied after all these years. The benchmarks I linked cover from card launch to now. They show a pretty darn established consistency.

If it's all driver magic, then it's pretty darn consistent. So you pick: either "AMD card drivers suck" or they don't. After owning Nvidia for 15 years straight and owning AMD for only the last two years, I'm telling you they don't.

Or it's "HBM memory is a poor decision and is a severe limitation for a performance card" or it's not. And those benches show it's not - not at the beginning of the card life, and not now >2 years later. How do all those other 4 GB GDDR5 cards fare on those more recent games at 4k? Right, much worse than the HBM card! Where's the nvidia engineering team then?


Taking a step back...
Why are you even in nearly every AMD thread, Shintai? You (often baselessly) bash AMD products with nearly every post.

What's the last AMD graphics card you've owned?

Why are you so publicly biased against products you obviously haven't used in a few years?
 
You mean with their "terrible" driver support you Nvidia fan bois are always warning about? B.S. I guess we'll see if that holds water in another couple of years, when the engineers' attention moves to Vega and Fury cards are fading out in the rear-view mirror. But somehow I doubt it, since even 7970 AMD owners have remained quite satisfied after all these years. The benchmarks I linked cover from card launch to now. They show a pretty darn established consistency.

If it's all driver magic, then it's pretty darn consistent. So you pick: either "AMD card drivers suck" or they don't. After owning Nvidia for 15 years straight and owning AMD for only the last two years, I'm telling you they don't.

Or it's "HBM memory is a poor decision and is a severe limitation for a performance card" or it's not. And those benches show it's not - not at the beginning of the card life, and not now >2 years later. How do all those other 4 GB GDDR5 cards fare on those more recent games at 4k? Right, much worse than the HBM card! Where's the nvidia engineering team then?


Taking a step back...
Why are you even in nearly every AMD thread, Shintai? You (often baselessly) bash AMD products with nearly every post.

What's the last AMD graphics card you've owned?

Why are you so publicly biased against products you obviously haven't used in a few years?

Basically what you're saying is that the Fury X with 4GB of HBM doesn't suffer from its small memory pool in games that use less than 4GB of memory?

[attached chart]


Do you have any other fascinating revelations for us? Perhaps you've discovered that GCN doesn't suffer from a geometry bottleneck in games that don't place a high load on the front end?

[attached benchmark charts, including 1440p ultra results]

In case it wasn't eminently clear, the 8.6 Tflop Fury X is roughly on par with a 980 Ti despite the fact that it's running at 4K, a resolution at which it's least likely to be held back by its weak front end. It should be faster. It isn't.
 
The point was that zerogg was implying that 4GB of HBM sucked last time and didn't make a real-world difference.
The linked article shows that [H]ardOCP ate a bit of crow and said apparently 4GB of HBM wasn't the limitation they originally supposed it would be, because every single game they played in a retest two years later didn't have any issues at 1440p. If you'll recall, this site was one of the sites that openly stated 4GB of HBM wouldn't be enough in the near future, and that people should consider avoiding the card. Well, two years later it is doing just fine.

So I'm telling lies eh?

You want minimums? Here are quite a few minimums from many benches in Feb. Not one of them is terrible. Click through this and take a look.
http://www.techspot.com/review/1329-buying-gpu-radeon-fury-geforce-980/

I own a couple of Fury X cards in CrossFire. People endlessly berating the cards with no direct experience really gets quite old. The HBM RAM situation on the Furys has not been the issue it's made out to be over and over again on the forums. HBM not only held up quite well, it delivers an absolutely excellent experience when coupled with a FreeSync monitor. The only time I have to drop a setting or two is when I'm playing at 7680x1440 on my 3 (1440p) displays.

And more directly, the Fury X doesn't look bad against cards with 8GB and 12GB of RAM regardless. When you add CrossFire or SLI to the equation, it actually ends up on top of the performance pile. So I guess HBM RAM isn't really the problem you think it is after all, is it?
http://forums.guru3d.com/showthread.php?t=400781

[1440p benchmark chart]

[4K benchmark chart]


But then it never really was the problem you suggest it was.
To be perfectly clear, it never really did have serious struggles at high resolutions compared to the alternative GDDR5-based cards with triple the memory... Here are some benches from a 2015 TweakTown review with three 4K monitors.
Think a pair of HBM based 4GB Fury X in crossfire would struggle at 11,520x2160 compared to a pair of Titans with 12GB RAM?
You'd be mostly wrong
http://www.tweaktown.com/tweakipedi...re-triple-4k-eyefinity-11-520x2160/index.html

Here's the bottom line: by the time the 4GB HBM memory quantity becomes an issue, the actual processing power is a bigger issue and the FPS would be too low to matter, and the competitors with more RAM aren't doing much better on that front.


[TweakTown R9 Fury CrossFire triple-4K Eyefinity benchmark charts]




Back to a single card. Take a look at the benchmarks for Prey, probably the newest triple-A title out right now. How does the >2-year-old, 4GB HBM-based Fury X hold up compared to the newest competition at 4K resolution? Minimums vs averages?
It's holding up quite well I'd say...

[Prey 4K benchmark chart]


I'm glad AMD is sticking with HBM for Vega. It wasn't a failure last time, and it's unlikely to be a failure this time.


The problem is that buffer allocations can be split over multiple cards, so 4GB of VRAM per card effectively goes further when you have multiple cards. What doesn't span the multiple cards are the assets; those have to be duplicated, so if a game doesn't go much over 4GB with assets you don't run into problems with 4GB of VRAM.

Most game engines from the past 2, 3 or even 4 years have supported streaming assets from main memory, so without knowing the total VRAM usage at a given time there is no way to assess whether the GPU is truly VRAM-limited, and if it is, to what degree. It's not a hard bottleneck for current games. Some titles like Witcher 3 and GTA V definitely use more than 4GB, and we can see the Fury X take a hit in those games when using higher settings. No amount of extra shader power is going to help there.

And Prey shows exactly what I stated: the streamed assets are not so large that they put the Fury X into a memory-capacity bottleneck.
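To put rough numbers on that accounting, here's a quick sketch of the reasoning above (the figures are made up purely for illustration, and whether buffers really get split across cards depends on the engine and the multi-GPU mode):

```python
# Toy model of the accounting described above: assets are mirrored on every
# card, while (per this argument) buffer allocations are spread across cards.
# All numbers are hypothetical and only illustrate the arithmetic.
def per_card_vram_gb(assets_gb, buffers_gb, num_cards):
    return assets_gb + buffers_gb / num_cards

# Example: 3.2 GB of assets plus 1.5 GB of buffers against a 4 GB card
for cards in (1, 2):
    used = per_card_vram_gb(3.2, 1.5, cards)
    print(f"{cards} card(s): {used:.2f} GB per card ->",
          "fits in 4 GB" if used <= 4.0 else "over 4 GB")
```

With those made-up numbers a single 4GB card is over budget while the CrossFire pair squeaks under, which is the gist of the argument.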
 
The problem is that buffer allocations can be split over multiple cards, so 4GB of VRAM per card effectively goes further when you have multiple cards. What doesn't span the multiple cards are the assets; those have to be duplicated, so if a game doesn't go much over 4GB with assets you don't run into problems with 4GB of VRAM.

Most game engines from the past 2, 3 or even 4 years have supported streaming assets from main memory, so without knowing the total VRAM usage at a given time there is no way to assess whether the GPU is truly VRAM-limited, and if it is, to what degree. It's not a hard bottleneck for current games. Some titles like Witcher 3 and GTA V definitely use more than 4GB, and we can see the Fury X take a hit in those games when using higher settings. No amount of extra shader power is going to help there.
Witcher 3 barely uses any VRAM; GTA V does, though.
 
Maybe you're thinking of another game, Shadow of Mordor?
Yeah, that's RAM hungry with the 4K texture pack. I remember it being the game that brought the 970's asymmetric memory configuration to light.
 
You mean with their "terrible" driver support you Nvidia fan bois are always warning about? B.S. I guess we'll see if that holds water in another couple of years, when the engineers' attention moves to Vega and Fury cards are fading out in the rear-view mirror. But somehow I doubt it, since even 7970 AMD owners have remained quite satisfied after all these years. The benchmarks I linked cover from card launch to now. They show a pretty darn established consistency.

If it's all driver magic, then it's pretty darn consistent. So you pick: either "AMD card drivers suck" or they don't. After owning Nvidia for 15 years straight and owning AMD for only the last two years, I'm telling you they don't.

Or it's "HBM memory is a poor decision and is a severe limitation for a performance card" or it's not. And those benches show it's not - not at the beginning of the card life, and not now >2 years later. How do all those other 4 GB GDDR5 cards fare on those more recent games at 4k? Right, much worse than the HBM card! Where's the nvidia engineering team then?


Taking a step back...
Why are you even in nearly every AMD thread, Shintai? You (often baselessly) bash AMD products with nearly every post.

What's the last AMD graphics card you've owned?

Why are you so publicly biased against products you obviously haven't used in a few years?

Did I hit the nail on the head, since it bothers you that much?

Having to babysit memory on Fiji on a per-game level is hilarious, because we all know what happens when a game isn't one of the ones in focus. And then Hynix, on top of that, dropping HBM on the floor for the consumer space. Not to mention all the slide BS about what HBM can do that's supposed to be unique.

Let's just be honest here: AMD bet on the wrong memory for consumer products and made bad decisions for HPC products as well.

This is taken from AnandTech.
[chart from AnandTech]


Let's recap:
No 8-hi HBM1 stacks: Fiji locked to 4GB.
No 8-hi HBM2 stacks: Vega 10 locked to 8GB.
No 2GHz HBM2 stacks: Vega 10 getting lower bandwidth than Fiji (quick math on that below).
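To put numbers on that last point, a rough sketch, assuming the usual 1024-bit bus per HBM stack, Fiji's 1.0 Gbps HBM1 pins, and a Vega 10 that falls short of the 2.0 Gbps HBM2 spec at, say, 1.6 Gbps (the exact Vega clock is an assumption here):

```python
# Peak bandwidth = stacks x bus width per stack x data rate per pin / 8 bits per byte
def hbm_bandwidth_gbs(stacks, gbps_per_pin, bus_bits_per_stack=1024):
    return stacks * bus_bits_per_stack * gbps_per_pin / 8  # GB/s

print(hbm_bandwidth_gbs(4, 1.0))  # Fiji, 4 x HBM1 @ 1.0 Gbps      -> 512.0 GB/s
print(hbm_bandwidth_gbs(2, 1.6))  # Vega 10, 2 x HBM2 @ 1.6 Gbps   -> 409.6 GB/s
print(hbm_bandwidth_gbs(2, 2.0))  # Vega 10 at the full 2.0 Gbps   -> 512.0 GB/s
```

So if the stacks really don't hit 2.0 Gbps, two of them can't match Fiji's 512 GB/s.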

[Nier: Automata 4K benchmark chart]

[Nier: Automata VRAM usage chart]
 
Epic from Archaea.

The 4GB HBM held up better than anyone expected.

People are still confusing memory "usage" with memory "needed".

You can post all the VRAM usage charts that you want, but most of them do not reflect real-world performance hits in average or minimum FPS.
 
Yes, at times 4GB of HBM does well, but I saw issues with Very High textures in Rise of the Tomb Raider: very noticeable hitching and slowdowns. Going to Nightmare shadows and something else in Doom, same thing. For a high-end card one should not have to worry about the memory or be restrained by it. I also don't buy that average FPS = good gameplay. I have seen supposedly good frame rates with sucky gameplay due to stutter, pauses, etc. We really need better data than just FPS; unfortunately that data is much harder to obtain, so I end up using those canned benchmarks at times as well. It is good that we have professionals doing real game testing here.
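For anyone who wants those better numbers themselves, here's a minimal sketch, assuming you've already captured per-frame times in milliseconds to a plain text file with a tool like PresentMon or OCAT (the file name is made up), that turns a frametime log into average FPS, a 1% low, and a 99th-percentile frametime:

```python
import statistics

def frame_stats(path="frametimes_ms.txt"):
    # One frametime in milliseconds per line, e.g. exported from PresentMon/OCAT.
    with open(path) as f:
        ft = sorted(float(line) for line in f if line.strip())
    avg_fps = 1000.0 / statistics.mean(ft)
    # "1% low": average FPS over the slowest 1% of frames (the hitches).
    worst = ft[int(len(ft) * 0.99):]
    low_1pct_fps = 1000.0 / statistics.mean(worst)
    # 99th-percentile frametime: 99% of frames rendered faster than this.
    p99_ms = ft[min(int(len(ft) * 0.99), len(ft) - 1)]
    return avg_fps, low_1pct_fps, p99_ms

avg, low, p99 = frame_stats()
print(f"avg {avg:.1f} fps | 1% low {low:.1f} fps | 99th percentile {p99:.1f} ms")
```

A big gap between the average and the 1% low is exactly the stutter that a bare FPS average hides.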

I think I will see how HBM2 pans out for Vega before calling it a failure; calling it one with no real clue about performance is just laughable at best.
 
I think I will see how HBM2 pans out for Vega before calling it a failure; calling it one with no real clue about performance is just laughable at best.

From a 1070-1080 perspective you can say both memory capacity and bandwidth are enough. But cost-wise it's a failure. Then you can argue that it's something AMD makes you pay for, because it offers nothing more than a 256-bit GDDR5X/GDDR6 interface would.

If the goalpost moves to the fabled 1080 Ti performance, then the failure is fourfold: density, capacity, bandwidth and cost. But I think we can disregard that idea.

For HPC, if performance is 1070-1080 you can get away with it, show the world you get free ECC on top, and we can ignore the cost as such.
 
Actually, it is the contrarians to said insane ones who repeat that utter BS and then claim the issue resides with said insane ones, when in fact, had they not spent any time at all repeating the utter BS, it wouldn't spread and rational expectations might flourish.

Why do we have to read through this BS? I came here looking for info on Vega (well, rumors), and we have to wade through these ridiculous statements from those trying to defend an outdated, underperforming AMD effort?

Vega looks like it may actually do well. Can we not just be positive about it until real details (and, as noko points out, professional reviews with frame-time analysis) become available?
 
Why do we have to read through this BS? I came here looking for info on Vega (well, rumors), and we have to wade through these ridiculous statements from those trying to defend an outdated, underperforming AMD effort?

Vega looks like it may actually do well. Can we not just be positive about it until real details (and, as noko points out, professional reviews with frame-time analysis) become available?
Gotta love a wolf in sheep's clothing... WTH do you think I was talking about? Watching this angst and these bitter retorts based on absolutely nothing is tiring.

Vega will be what it will be. It will either beat Nvidia or it won't. Somehow most of these AMD threads turn into Nvidia discussions. So far the amount of info has really not told us much other than the possibility that it will be around the 1080 (non-Ti). Not sure how that is a bad thing; not great, but not damning either.

The amount of chest pounding and screaming is unnecessary. Get whatever fits your needs. If AMD doesn't have a card that attains a particular performance level and Nvidia does, then it isn't rocket science: BUY NVIDIA. Otherwise you just have to wait and see if AMD will provide an alternative.
 
If this is true, they will only be in stock once the price goes up about 200%... Dealers will make a fortune and AMD will lose out on sales from the start.
 
I see "1080Ti or better" and immediately lose faith that this is legit...
 
Did I hit the nail on the head, since it bothers you that much?

Having to babysit memory on Fiji on a per-game level is hilarious, because we all know what happens when a game isn't one of the ones in focus. And then Hynix, on top of that, dropping HBM on the floor for the consumer space. Not to mention all the slide BS about what HBM can do that's supposed to be unique.

Let's just be honest here: AMD bet on the wrong memory for consumer products and made bad decisions for HPC products as well.

This is taken from AnandTech.
[chart from AnandTech]


Let's recap:
No 8-hi HBM1 stacks: Fiji locked to 4GB.
No 8-hi HBM2 stacks: Vega 10 locked to 8GB.
No 2GHz HBM2 stacks: Vega 10 getting lower bandwidth than Fiji.

[Nier: Automata 4K benchmark chart]

[Nier: Automata VRAM usage chart]


Just want to say that some Nvidia and AMD users can't even get Nier: Automata to run at all. My RX 480 won't complete the tutorial without crashing. The only way to get it to run on an RX 480 is to uninstall the Win 10 Creators Update and revert the drivers to last year's drivers that came out before ReLive, and even then it still crashes.

Nier:Automata is fucked for AMD users.

Nier:Automata is fucked for Nvidia users.

Square Enix refuses to pay the developer, Platinum Games. That's probably why we have gotten zero support since launch. Don't believe me? Look at the number of support-thread stickies on the official Steam forum. Don't see any, huh? That AMD one was started by the community and isn't a Square Enix official thread. The buy-DLC thread is the only official thread from Square Enix.

The only way SOME Nvidia and AMD users can get it to run, sometimes, is to use a mod.

We have gotten a $14 DLC for this POS broken ass game and ZERO performance patches since release. Zero, zilch, nada, nil.... Fuck Square Enix.

Now, I have not read what you were posting this in regard to. I have no idea what you are using Nier: Automata to show. I just wanted to vent about this POS game that I spent $50 on and haven't been able to play since Mar 17, 2017, like many other Nvidia and AMD users!

Oh and I just want to say F Square Enix one more time!
 
Hahahaha.

The one game Shintai produced benchmarks for shows the HBM-based Fury X in a pretty bad light. I'd never heard of the game, but thanks for the clarity on the state of its code quality.

Irony...

There are a lot of games that make the HBM suffer badly once you crank textures and some visual quality settings up; Dying Light and Rise of the Tomb Raider come to mind, also Doom.
 