Nvidia IQ vs. ATI IQ...which is better?

rsnellma
Gawd
Joined: Jul 17, 2002
Messages: 639
I have been doing some poking around and ran across an article that gave me the answer to this question, but I don't recall where it was. So, now I post the question to you.

Which is better at this point in time?

Nvidia's Image Quality or ATI's Image Quality

Thanks in advance.

Bob2001
 
The 8800 series must have the best IQ.
Ask your question again in 2 months and we might have a new answer.
 
Currently the winner for the best image quality is Nvidia's G80 series. However, if you are looking at mid to mid-high range cards, ATI has the crown.
 
Generally speaking, when cards of similar specs go head to head, image quality usually goes to ATI. But Nvidia has the newest card on the market in the 8800GTX; we'll see what happens when the R600 is out and they go head to head.
 
So, you're saying an X1950 Pro bests a 7600GT, but an 8800 bests them all?

I am looking for a card that will do 1600x1200 with decent fps in games like the Battlefield series.

Bob2001
 
Funny you should ask that... Nvidia fans swore the 7900 series had the same IQ a few months ago, but now that there's article after article, people favor the X1900 series...


If you're shooting for 1600x1200:

You should get an 8800GTS if you have the money.

An X1900XT or X1900XTX can be had for under $300.

The 7950GT if you want Nvidia, but this card is also slower than the X1900XT
(a 7900GTO might be better, if you can find one).
 
rsnellma said:
So, you're saying an X1950 Pro bests a 7600GT, but an 8800 bests them all?

I am looking for a card that will do 1600x1200 with decent fps in games like the Battlefield series.

Bob2001

Ugh, the 8800 should do the trick if money is no object. If you want to spend half as much, the X1950XT will provide more FPS and higher IQ in BF2 than its competition.

I have an X1950XTX and it screams through all games @ 1680x1050, and I don't even have a C2D. The X1950XT would perform similarly to the XTX.

You can pick one up for ~$250.

Oh, and make sure you don't get the 7900GTO. Wondering why it's so much card for so little money? Long story short, the memory modules on them are bad and fail over time. Mine failed, as they have for countless other people.
 
I feel like adding my input:

The difference in image quality between AF in the NVIDIA 7-series and the ATI X1K series is not huge. It is noticeable, yes, in some games. In very dynamic games like UT04 and Battlefield 2142, it's less noticeable than in more static games like World of Warcraft and Oblivion.

I do look for it when I'm testing, and I can usually find it, but I find that when I'm not looking for it, I tend to not notice.

What is more noticeable to me, though, is the AA quality. NVIDIA, in my opinion, has stronger antialiasing and produces better image quality than ATI. (Example)

Of course, the 8800 series changes that. Presumably, R600 will attempt to change it again.
 
Mark_Warner said:
I feel like adding my input:

The difference in image quality between AF in the NVIDIA 7-series and the ATI X1K series is not huge. It is noticeable, yes, in some games. In very dynamic games like UT04 and Battlefield 2142, it's less noticeable than in more static games like World of Warcraft and Oblivion.

I do look for it when I'm testing, and I can usually find it, but I find that when I'm not looking for it, I tend to not notice.

What is more noticeable to me, though, is the AA quality. NVIDIA, in my opinion, has stronger antialiasing and produces better image quality than ATI. (Example)

Of course, the 8800 series changes that. Presumably, R600 will attempt to change it again.

True, and I did notice that my 7900GTO at 4xSSAA produced better IQ than my X1950XTX at 4xADAA. But at the same settings, the X1900-series cards are faster, and the better AF is quite noticeable.
 
Marvelous said:
Funny you should ask that... Nvidia fans swore the 7900 series had the same IQ a few months ago, but now that there's article after article, people favor the X1900 series...
If people really thought that G7x and R5xx had the same IQ at default levels, these individuals were quite blind. Even after upping the G7x's ante to High Quality, the gap can still be pretty immense when you flip on ATi's HQ AF.

As far as whether R600 can up the ante over G80, I'm not so sure there's much room for them to be able to do that. DX10 itself is probably primarily responsible for NVIDIA's giant leap in filtering quality (nearly perfect), and I'm sure ATi will follow suit, but where do you go when 16x non-angle dependent anisotropy is about as perfect as anyone would ever want? It's possible that ATi could go with a completely optimization-free method as an option, or they could offer us filtering at greater than 16 samples, but would that noticeably impact image quality?
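(Aside, not from any review: for anyone wondering what "16x non-angle dependent anisotropy" amounts to at the API level, here's a rough C sketch using the standard EXT_texture_filter_anisotropic OpenGL extension. The enum and function names are the extension's own; the helper name set_max_anisotropy is just mine, and how faithfully the hardware honors the request, angle dependence, optimizations and all, is up to the driver.)

/* Request the highest anisotropy level the driver exposes (16.0 on the
 * parts discussed here) for one texture. Assumes the extension is present,
 * which it is on every card mentioned in this thread. */
#include <GL/gl.h>
#include <GL/glext.h>

void set_max_anisotropy(GLuint texture)
{
    GLfloat max_aniso = 1.0f;

    /* Ask the driver for its anisotropy cap. */
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &max_aniso);

    glBindTexture(GL_TEXTURE_2D, texture);

    /* Trilinear base filter plus the anisotropy cap; any value above 1.0
     * enables anisotropic filtering for this texture. */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, max_aniso);
}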

I think the big question is whether ATi will be able to counter NVIDIA's coverage sampling anti-aliasing. We've seen rumours hinting at dedicated FSAA logic on R600, giving us 4x MSAA for free, but how will R600 scale further down the line? Will 6x again be our cap, or will we see 12x or 16x, and what kind of hit should we expect with these modes? Without a method similar to the one NVIDIA has employed with CSAA, these higher modes are going to come with severe performance penalties, just as the penalties for the pure multi-sampling modes on G80 can be pretty severe. It's possible, of course, that R600's dedicated FSAA logic will alleviate a big chunk of the performance penalty of all multi-sampling modes and may actually give us super-sampling at reasonable performance levels.
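(Rough numbers of my own, not from NVIDIA's documentation, on why coverage sampling is cheaper than brute-force multisampling: 16x CSAA keeps 16 coverage points but only 4 full color/Z samples per pixel, so its storage looks like 4x MSAA rather than 16x. The one-bit-per-coverage-point figure below is a guess, and framebuffer compression is ignored.)

#include <stdio.h>

/* Per-pixel framebuffer cost, assuming 32-bit color and 32-bit
 * depth/stencil for every fully stored sample, plus a small coverage mask. */
static unsigned bytes_per_pixel(unsigned stored_samples, unsigned coverage_samples)
{
    const unsigned color = 4, depth_stencil = 4;      /* bytes per stored sample */
    const unsigned coverage_bits = coverage_samples;  /* ~1 bit per coverage point (guess) */
    return stored_samples * (color + depth_stencil) + (coverage_bits + 7) / 8;
}

int main(void)
{
    printf("4x MSAA : %u bytes/pixel\n", bytes_per_pixel(4, 4));    /* 33  */
    printf("16x MSAA: %u bytes/pixel\n", bytes_per_pixel(16, 16));  /* 130 */
    printf("16x CSAA: %u bytes/pixel\n", bytes_per_pixel(4, 16));   /* 34  */
    return 0;
}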

In any case, I think the pure image quality gap is going to be pretty non-existent, and this is going to give us a much more level playing field with respect to benchmarking.
 
Endurancevm said:
True, and I did notice that my 7900GTO at 4xSSAA produced better IQ than my X1950XTX at 4xADAA. But at the same settings, the X1900-series cards are faster, and the better AF is quite noticeable.

I completely agree... AF makes a world of difference in how good your textures look, and that's where the Nvidia 7900 series and below surely falls short... Now that we have the 8800 series, that question has been answered...

As for AA, there's really not much of a difference; you can't tell them apart unless you're looking for it...
 
ATI's offerings have been significantly better than Nvidia's across all cards except for the 8800 GTX and GTS. The 8800 is actually better than ATI's cards, but they're both so good that you'd be hard pressed to notice the difference in a blind test.
 
Summary: Anisotropic Filter Quality

Nvidia's 8000 series: top-notch default image quality. VERY expensive.

ATI x1000 series: good default quality, you can make it very good for a small performance hit if you muck with the settings (turn on angle-independent AF).

Nvidia 7000 series: okay default image quality, you can make it as good as the DEFAULT image quality of the ATI x1000 series by changing settings to "high quality."

Summary: Anti-Aliasing Quality

Nvidia's 8000 series: top-notch default image quality. The new Coverage Sampling AA is an astounding combination of 4x MS plus 16x fragment AA. These parts feature AA with FP16 and FP32 HDR. Again, very expensive.

ATI x1000 series: good quality, but tends to drop details in 4x and 6x multisample modes. Transparency AA offers less detail than Nvidia's offerings. 6x MS mode is a good balance of quality and speed, but some people complain it leaves details too soft. These parts feature FP16 HDR with AA, but this is only feasible on higher-end cards (see the rough memory math after this list).

Nvidia 7000 series: better quality than ATI's x1000 in 4xMS and 8xMS+SS modes, especially with Transparency AA. Unfortunately, 8xMS+SS mode is a performance hog. Also, this series does not do FP16 HDR with AA.
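A quick, rough bit of math of my own (not from the summary above) on why FP16 HDR plus multisampling is "only feasible on higher-end cards": at 1600x1200 with 4x MSAA, the render targets alone chew up a large slice of a 256 MB frame buffer. It assumes FP16 RGBA color, 32-bit depth/stencil, and no framebuffer compression.

#include <stdio.h>

int main(void)
{
    const double pixels = 1600.0 * 1200.0;   /* the resolution asked about above */
    const int samples = 4;                   /* 4x MSAA */
    const double mb = 1024.0 * 1024.0;

    double color   = pixels * samples * 8;   /* FP16 RGBA, 8 bytes per sample   */
    double depth   = pixels * samples * 4;   /* 32-bit depth/stencil per sample */
    double resolve = pixels * 8;             /* resolved FP16 back buffer       */

    printf("MSAA color : %.1f MB\n", color / mb);    /* ~58.6 MB  */
    printf("MSAA depth : %.1f MB\n", depth / mb);    /* ~29.3 MB  */
    printf("resolve    : %.1f MB\n", resolve / mb);  /* ~14.6 MB  */
    printf("total      : %.1f MB\n", (color + depth + resolve) / mb);  /* ~102.5 MB */
    return 0;
}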
 