NVIDIA Maxwell GPU - GeForce GTX 980 Video Card Review @ [H]

Both [H]ardOCP and AnandTech have BOINC teams. I (we) would really appreciate you throwing in some results from one or two of the projects for a more real-world compute comparison than the benchies Anand uses. PrimeGrid and Milkyway, for example ;)

Otherwise great article with great results :)
 
Two GTX 980s will be my next upgrade for sure, but I need to do more than upgrade my cards. Looking at 3 displays with native DP and maybe an X99 or Z97 whole-system rebuild... probably a nice Xmas gift to myself in a couple of months. :)

Edit: well, never mind that plan. Just grabbed a pair of Zotac 970s. I wanted to keep 3x DVI ports for display connections, and for the price these seem like a decent step up in performance while also stepping down in power use at the same time. Win!
 
We don't use leaked drivers; we use official drivers from NVIDIA and AMD directly. Prior to the review we asked AMD which driver was best to use right now, as we do for every launch review.

Not sure why I am not seeing the original reply I made to this.

Anyway, the 14.x drivers, whenever they are actually released, might just show some gains for AMD.

From what I have seen, and others have been reporting, it is a very nice driver set.
 

We used the latest drivers from AMD; nothing newer has been released yet from AMD. We asked AMD if anything new would be released soon; the answer was no. We used the recommended driver, straight from the words of AMD themselves, for this review. Whatever these leaked drivers are, you cannot be sure where they ultimately came from. We go right to the source for drivers.

http://support.amd.com/en-us/download/desktop?os=Windows+7+-+64

http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx
 
I just ordered 2. I feel a little giddy. It's still true that the EVGA SuperClocked versions aren't worth it, right? I ordered two vanilla EVGA GTX 980s. Whoa, I need some air lol
 
Waiting for the 970 review/test myself. Going by the prices and a few other benchmarks around the web, I'm glad I skipped last generation. 1440p may be in my budget now.....
 
So apparently there is a yet-to-be-announced feature on these cards which they were experimenting with at a press event a few days ago. Norm Chan from over at Tested picked up on it during a demo and they made him sign an NDA. Wonder what it is...
 
I was wondering if there was an error with the Apples to Apples comparison for Watch Dogs.

You say in the review that Watch Dogs does not let you use Ultra textures on the 780 Ti due to its 3GB of VRAM.

One of the inherent problems with Watch Dogs is the high requirement to enable "Ultra" textures. This game needs 4GB of VRAM to run at 1440p smoothly with "Ultra" textures enabled. The GeForce GTX 780 Ti has never been able to cut it, not due to performance but due to its limitation of having 3GB of VRAM. This problem is solved with the GeForce GTX 980. With its 4GB of VRAM it can run "Ultra" textures in this game just like the 4GB Radeon R9 290X.

Yet the Apples-to-Apples section states it was using Ultra textures for all, and the FPS difference between the regular review tests and Ultra textures was only 1 fps less for avg/max on the 780 Ti.

In this apples-to-apples graph we are comparing each video card at the game's highest settings. We are testing at 1440p with Temporal SMAA, "Ultra" textures, "Ultra" settings and HBAO+ High mode.
 
It's always had the performance to run Ultra. The Ultra textures problem is about stuttering/choppy gameplay, not performance. The actual performance difference between Ultra and High has always been close.
 
It makes me absolutely SICK that Shadow of Mordor requires 6GB of VRAM for Ultra at 1080p. I feel physically sick. WTF did I buy two of these cards for when they were obsolete at launch? They gimped this card intentionally.

The worst part is I sort of saw it coming, but everyone kept saying 1080p only requires 3GB of VRAM max. This is the biggest, most expensive mistake I have made in PC building.

THIS CARD IS USELESS. I hope EVGA lets me step up. What a scam! And all of these so-called review sites, with all their experience and analysis and testing, are all part of the scam. Not a single word of caution in any review about possible VRAM limitations at 1080p. Everyone just gave a big thumbs up. Absolutely no analysis whatsoever. And all this nonsense talk about overclocking records and power efficiency is irrelevant when you can't even run Ultra textures in a game that launches a week later!

I am furious! I think the marketing of this card is deceptive. How can they call this product the 980 when it does not even handle console ports? And they even skipped the 880 branding, as if this was not one but two generations beyond the 780 Ti, when real-world performance is not only practically identical, but BOTH cards are obsolete. Egregious!!!!
 
Lol, and that's at 1080p. So imagine what it's like to run higher resolutions - are the 8GB R9 290X cards already obsolete as they won't have the memory to run that title at 2560x1600? Or has it occurred to you that Shadow of Mordor might be a special case? :D
 

Damn dude, chill. No one can account for a badly optimized game. I'm not in the loop on this game, but is it a console port? (Wouldn't surprise me.)

Watch Dogs was another shitty port. Last I heard that game was running like a champ on GTX 780s, and much better on 2GB cards than it did when it launched. These games get optimized for consoles that share GPU and CPU memory. They don't take the time to change things, and although a shitty AMD GPU in an Xbox One or PS4 can run it, our GPUs, which are about 8x more powerful, cannot.

Eventually, when we wise up and avoid these shitty console ports, these developers will get the hint that they need to optimize their games better or GTFO of PC gaming.

All this tells me about this game is to avoid it.

The world isn't out to get you, and your cards are not obsolete.
 
This is just the start of a trend. Even if it is an outlier now, this is where console ports are going. This is the whole reason I did not buy the 780 Ti: VRAM. I figured NVIDIA knew 4GB was enough so they launched at 4, but I should have known the real next gen would require double the VRAM. I am so mad I could just throw the whole computer off the goddamn balcony. Except, that is what THEY want, isn't it? That just plays right into their hands. NVIDIA is the Heisenberg of consumer electronics, and they made me their bitch.

One week of maximum e-peen, and then Shadow of Mordor comes and drowns my whole package in freezing-cold ice water. I saw it coming, but not one week later! An $1,100 investment and it is deficient. You can't even just spend money and get good performance. You have to be so diligent when you build. No wonder nobody puts up with this hobby. How can they market this card as a 980 when they knew 6GB games were in the pipe? They did it intentionally to ensure there was a reason to step up to the next card.
 

I'll take them off your hands free of charge, seeing as how they're causing you so much distress and whatnot.
 

Relax, man, wait a week to see if it's even worth getting worked up about. Don't let this announcement take the wind out of your sails. Enjoy your new cards. It's not like this is Crysis reborn; nothing we've seen requires 6GB of VRAM, especially not for 1080p. Remember COD: Ghosts' 6GB RAM minimum requirement... yeah, it's going to be just like that. Screams next gen, but on paper only.
I'd be surprised if there is a noticeable difference between High and Ultra textures. Let's wait a few days before going postal :D.

Now, if The Witcher 3 claims that in the next few months, then I'd say yeah, now you can start worrying about it...
 
Are there any reference cards out this generation? Last generation, the cards with the metal shrouds were supposedly reference. Specifically, what of this EVGA GTX 980 (model: 04G-P4-2980-KR)? Any reason to believe EVGA screwed up and went cheap on this card as well?
 
What? Have you not looked at a single 980 review? Pretty much every 980 out there is a reference 980. The only non-reference 980 even out there at the moment is the Gigabyte G1.
 

I'M DYING TO KNOW WHAT IT IS!

Maybe it's just boring VR stuff...
 
Do you guys think I am CPU-bound in BF4 at 1080p? Look at my signature. I get like 60-70 fps even in SLI.
 
After one week with two 980s, I decided to return one card tomorrow. Maybe it's just me being extremely sensitive, but my conclusion is that SLI is still the stuttering mess it was years ago when I had two 580s. I tried a lot of games with different options, vsync on/off, the "smooth vsync" option for SLI, but in single-GPU mode most games just felt smoother. I think I'll stick to just one card and wait for GM200.
 
Shoulda kept both and got a G-Sync monitor.

This ^^^


See above; I also vote for the above post. From what I've read and seen, G-Sync negates all stutter in ALL configurations. Silky smooth with zero stutter, lag, or frame-pacing issues. It's literally the future of gaming, and "once you go G-Sync, everything else... just stinks." LOL
 

I really want one... but TN? I have this nice NEC AH-IPS 1440p monitor. Now I need to find a local store to check out the ASUS ROG and see if I can get along with it. I hope ASUS will make that exact model as a 4K version some time. Two 980s and a G-Sync 4K should be a nice combination. I looked at the Acer 4K/G-Sync, but I'm not sure if it's good; I'd rather wait for other models: http://www.tweaktown.com/articles/6...4k-monitor-4k-g-sync-is-g-lorious/index4.html
 

I understand. TN panels have their pluses, though: faster response times, etc. So do IPS: better viewing angles, darker blacks, etc.

I'm waiting until FreeSync is available before deciding which to jump on. Monitors matching GPU fps is the future, plain and simple. I just want to see what G-Sync's competition has in store before investing. With any luck, in a few months once that info is on the table, I'll save some money and have an easier time finding one in stock.
 
I have a question about DSR. I have a 40" 1080p monitor right now with my sig rig. I have been experimenting with DSR at 1440p and 4K res. I plan on upgrading to a true 4K monitor when I can snag a 40" 60Hz model. My question is: can one 980 run most games (Watch Dogs, BF4, FC3, etc.) at 4K 60fps? I ask because I get pretty good frames via DSR, but I assume the card is doing double, or at least extra, work downsampling. I think the workload would be decreased if it is just rendering native 4K. I will go SLI if needed, but if one will do, that's great. Thoughts?
 
We have GTX 980 reviews showing how those games play at 4K. And of course you will not get 60 fps running those games at that res, except maybe on medium.
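On the DSR workload question above: DSR 4.00x on a 1080p display renders the same pixel count as a native 4K monitor, so the shading work is roughly the same either way; the downsample filter only adds a small extra cost on top. A quick back-of-the-envelope sketch (plain illustrative arithmetic, not benchmark data):

```python
# Rough pixel-count comparison: DSR 4.00x on a 1080p display vs. a
# native 4K panel. Actual GPU load also depends on the downsample
# filter, memory bandwidth, etc., but pixel count dominates shading cost.

def pixels(width, height):
    return width * height

native_1080p = pixels(1920, 1080)   # 2,073,600 pixels on the physical display
dsr_4k = pixels(3840, 2160)         # DSR 4.00x internally renders this many
native_4k = pixels(3840, 2160)      # a true 4K monitor renders the same count

print(dsr_4k // native_1080p)       # 4  -> DSR 4x shades 4x the pixels of native 1080p
print(dsr_4k == native_4k)          # True -> shading load is ~the same as native 4K
```

So if DSR at 4K feels playable on one 980, native 4K should behave similarly, minus the small downsampling overhead.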
 