Armenius
Extremely [H]
- Joined
- Jan 28, 2014
- Messages
- 42,384
Rumored AIB models with higher voltage options should be about 15-20 percent faster: the architecture was designed to run at 3GHz.
AMD always touts their clock speeds.
There was an older installer package issue where the driver wouldn't install components for hardware that wasn't detected at install time. If your equipment wasn't all plugged in when you installed the drivers, pieces would be left out, and you'd have to do a full uninstall and reinstall of the driver package on site once everything was hooked up.
It made batching deployments a serious PITA. AMD has since corrected it, but the bug went on for a full year or so before they got around to fixing it.
> Well AMD has to leave some room for the Powercolors and the rest of their AIB, Spooge Red Devil edition cards.

The problem is the price potential. The question is which AIBs, if any, are worth the increased prices: meeting or exceeding FE/reference designs, QA/build quality, and overclockability/VRM/features. In the last generation with NV it was pretty much only limited EVGA and Asus models that were even potentially worth it, and for AMD it was Sapphire, PowerColor Red Devil, and Asus.

When I picked up my Asus ROG Strix 3090, I only did so because it was before the big shortage/tariff/pandemic/mining combo, so I paid around $1650-ish, which for a $1500 card seemed worth it for the top-of-the-line AIB (and I've had good experiences with ROG Strix/Matrix cards in the past; it has worked flawlessly for me ever since, and its additional features like dual BIOS, good cooling, and the ability to OC smoothly have been worthwhile, I think). However, it wasn't long before the MSRP was $2200-2400 (depending on the standard black/gunmetal or white color!) with scalpers even higher, so there was no way in hell it was worth that over getting a $1500 FE.

On the AMD side, they seemed to have even less control over insane prices among AIBs. Putting aside the fact that the 6800 XT and 6900 XT did NOT have an air-cooled Asus ROG Strix version, I saw a lot of highly priced AIBs even during the era when AMD cards were not as desirable for mining. Add to that the horrid, easily botted AMD.com direct sales platform as the only other option, and lots of people who would have paid for those $650 6800 XTs never got the chance. I don't want to see the same thing happen again.
> I'm a bit concerned by this slide.

For what it's worth, this is what the footnote (1) says:
View attachment 524218
Doom Eternal Ray Tracing at 4K max settings.
I just tried that on my 4090, and I don't know where they got 135 FPS from, but the framerate on my 4090 is much MUCH higher than that, even without DLSS. I'd really like to know what their benchmark was for that number.
> For what it's worth, this is what the footnote (1) says:
>
> - Testing done by AMD performance labs November 2022 on RX 7900 XTX, on 22.40.00.24 driver, AMD Ryzen 9 7900X processor, 32GB DDR5-6000MT, AM5 motherboard, Win11 Pro with AMD Smart Access Memory enabled. Tested at 4K in the following games: Call of Duty: Modern Warfare, God of War, Red Dead Redemption 2, Assassin’s Creed Valhalla, Resident Evil Village, Doom Eternal. Performance may vary. RX-842

Yep.
> Unless I'm missing something here, I just don't see the 7900 XTX as the GPU savior people are starting to hype it up to be.

What savior? It sounds like a great GPU for sub-4K gaming, and it's a lot cheaper than the 4090. That'll be good enough for a lot of people.
> What savior? It sounds like a great GPU for sub-4K gaming, and it's a lot cheaper than the 4090. That'll be good enough for a lot of people.

There are lots of graphs making the rounds showing AMD being within 5-10% of the 4090 performance wise. My own testing of generalized scenarios using the 4090 shows that this is most likely false, even in purely rasterized games.
> There are lots of graphs making the rounds showing AMD being within 5-10% of the 4090 performance wise. My own testing of generalized scenarios using the 4090 shows that this is most likely false, even in purely rasterized games.

I'm prepared to believe you about the performance, but I'm not spending $1600 on a GPU, and I game at 1440p, so the 7900s seem to make a lot more sense to me. As always, YMMV.
> There are lots of graphs making the rounds showing AMD being within 5-10% of the 4090 performance wise. My own testing of generalized scenarios using the 4090 shows that this is most likely false, even in purely rasterized games.

FYI, it says "4080" in your signature.
> FYI, it says "4080" in your signature.

FML, that is literally the SECOND TIME I've put 4080 in a signature. That's what I get for having a 3080 for 2+ years.
> It sounds like a great GPU for sub-4K gaming

It looks perfectly fine for 4K gaming? What sort of FPS are people looking at for 4K nowadays?
> It looks perfectly fine for 4k gaming? What sort of FPS are people looking at for 4k nowadays?

I'm sure it will be, but I'm running 2x 1440p monitors, so 4K isn't relevant for me.
> I'm prepared to believe you about the performance, but I'm not spending $1600 on a GPU, and I game at 1440, so the 7900s seem to make a lot more sense to me. As always, YMMV.

IMHO, there are pros and cons to each. There's even an opportunity (gulp) for AMD to be perceived as "better" even with much lower fps at 4K... we'll see. To be honest, the 4K high-end gamer (purists, not low-res upscalers) is a small group today, due to current limitations. Some of those limitations are lifted with these new AMD cards, but we need the monitors, etc. to support it. So this could be a tale of December results and a "redo" once more advanced displays become available.
> IMHO, there are pros and cons to each. There's even an opportunity (gulp) for AMD to be perceived as "better" even with much less fps at 4K... we'll see. Just to be honest, the 4K (purists, not low res upscalers) high gamer is a small group today, due to current limitations. Some of those limitations are lifted with these new AMD cards, but we need the monitors, etc. to support it. So, this could be a tale of December results and a "redo" once more advanced displays become more available.

I've seen some speculation that the 7900 clocks will go as high as 3GHz, and that via that method, the card will be great at 4K/8K. We all know how speculation on the Internet goes, of course.
Personally, I think people will likely be happy with either at 1440p, and many will opt to save the money, space and power this time around IMHO. But we'll know more after we see them in action.
> I've seen some speculation that the 7900 clocks will go as high as 3GHz, and that via that method, the card will be great at 4K/8K. We all know how speculation on the Internet goes, of course.

Let's not kid ourselves. 4K maybe, but not 8K. I mean, believing that is not even weed territory, that's straight-up crack-pipe religion.
> Good to know that it's fixed. You don't happen to know about bitstreaming output, do you? Any issues with Atmos/DTS:X or delays with audio streams? As dumb as it sounds, the GPU audio component is a deal breaker for me.

To the best of my knowledge it's all working as intended. But looking at it, I see why you're still concerned. All I can say is that with my equipment it works, but the more I read, the more I'm convinced I may be the exception and not the rule.
Not if you assume they are using a 60Hz 4K display as a limit and they are just showing that both are maxing out that display…
> Let’s not kid ourselves. 4K maybe, but not 8k. I mean, believing that is not even weed territory, that’s straight up crack pipe religion.

Oh, I know, I'm just saying what I heard.
> Not if you assume they are using the 60Hz 4K as a limit and they are just showing both are maxing out that display…

But then the 3090 Ti and 6950 XT are both lower, so that also doesn't explain anything.
> ...While a lot of things are different now, I hope AMD has both better control on AIB pricing for partners and the reference/FE availability off their own website and any 3rd parties. Especially given the delay of more than a month before these cards show up for sale (I consider this inadvisable, especially since NV's top-end cards are already available, lower-end ones will debut, and NV will adjust prices and ramp up its insane marketing machine), AMD will again be underwhelming when they really could have done much better on the product alone...

I've already said this in this thread, but it's pages back: FrgMstr said that nVidia dumped all the 4090s they could upfront. There is very limited allocation for any more in North America for the next two quarters. If you don't have a 4090 by now, you're going to have a tough time getting one until spring 2023. If AMD can deliver this year and in Q1 2023, and performance exceeds the 4080 16 GB, they will be beating what nVidia has available at the time, most likely at a lower price (for rasterization at least).
Unless I'm missing something here, I just don't see the 7900 XTX as the GPU savior people are starting to hype it up to be.
> AMD fans and Nvidia fans are both missing the point, here.
>
> AMD was essentially required to put out a card this time of year and so was Nvidia. AMD challenged Nvidia to a game of slam-your-dick-in-a-car-door chicken, and Nvidia did three rails of coke, slammed their dick in the car door, and then hit the power lock. Then AMD said, OMG, you actually slammed your dick in a car door? Fucking outstanding.
>
> Nvidia's like, why aren't you slamming your dick in the door? And AMD says that's retarded, why would anyone slam their dick in a car door?
>
> AMD put out two cards that beat their current line-up, of which there is quite a lot of still, and that's it. They have a full stack and it runs all the way up to, but not over $1K. And it's all profit at the top for them. This makes Nvidia look desperate and greedy for most people.
>
> They're still fighting off the "value" brand burden, on CPU and GPU fronts. I think that changes when AMD doesn't have to compete with existing AMD and Nvidia new-old stock.

nVidia is like Dodge when they released the Viper truck.
> I love these reviews. I love how they love panther bodies.

Because you wanna look like a COP!
> Because you wanna look like a COP!

Naw, panther is just crazy comfortable, especially with the air bags out back.
Confirmed. 7900 XTX is a 4080 competitor.
> This is basically what I assumed this entire time. The VERY interesting part is just how much overhead the XTX will have in AIB partners' hands. Or just to get to the point: just how much it will close any and all gaps. Moving from 2.3 to 3 GHz, I assume no more than MAYBE a 15% uplift (more likely 10-12%). But that is a massive swing against any and all 4080s that will inevitably come down the pike, especially at this price point.
>
> Let's assume a 3GHz Sapphire Toxic costs $1200; I don't think there is any 4080 that nVidia can launch that will look favorable without a massive price drop and performance bump.

There is ZERO chance of an out-of-the-box 3GHz game clock 7900 XTX. Not only would a 30% game-clock increase be a thing of fantasies, it has already been said that 3GHz looks like the highest the chip can even properly handle, even for max boost.
> This is basically what I assumed this entire time. The VERY interesting part is just how much overhead the XTX will have in AIB partners' hands. Or just to get to the point: just how much it will close any and all gaps. Moving from 2.3 to 3 GHz, I assume no more than MAYBE a 15% uplift (more likely 10-12%). But that is a massive swing against any and all 4080s that will inevitably come down the pike, especially at this price point.
>
> Let's assume a 3GHz Sapphire Toxic costs $1200; I don't think there is any 4080 that nVidia can launch that will look favorable without a massive price drop and performance bump.

Wonder how far they'll throw the power limit out the window with the AIB stuff.
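The clock-to-fps arithmetic in the posts above can be sanity-checked with a quick back-of-envelope script. The 40-50% scaling factors below are my own assumption for how much of a clock bump typically survives into real-world fps (memory bandwidth and CPU limits eat the rest), chosen to bracket the 10-15% uplift estimated above, not a sourced figure:

```python
# Hypothetical back-of-envelope: translating the rumored 2.3 -> 3.0 GHz
# AIB clock bump into an fps uplift, assuming sub-linear scaling.
stock_clock = 2.3   # GHz, reference game clock
aib_clock = 3.0     # GHz, rumored AIB clock

clock_gain = aib_clock / stock_clock - 1.0   # ~30% more clock

# Assumed fractions of the clock gain that show up as fps (guesses).
for scaling in (0.4, 0.5):
    fps_gain = clock_gain * scaling
    print(f"scaling {scaling:.0%}: ~{fps_gain:.1%} fps uplift")
```

With those assumptions, a ~30% clock bump lands in roughly the 12-15% fps range, which is consistent with the estimate quoted above.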
> There is ZERO chance of an out of the box 3ghz game clock 7900xtx. Not only would a 30% game clock increase be a thing of fantasies,

Fair.
> it has already been said that 3ghz looks like the highest the chip can even properly handle even for max boost.

Source?
> [Radeon RX 7900 XTX] is designed to go against 4080 and we don't have benchmark numbers on 4080. That's the primary reason why you didn't see any NVIDIA compares. […] $999 card is not a 4090 competitor, which costs 60% more; this is a 4080 competitor.
>
> — Frank Azor to PCWorld
> ...Moving from 2.3 to 3 GHz...

That's a 1.3x increase. Depending on how hard the chips are pushed to get them to just 2.3GHz, I would expect that to increase power consumption by at least 1.3³ ≈ 2.2x. Not all of the 355W TBP is core power, so it wouldn't end up at 355W × 2.2, but it would still be ridiculously hot.
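The cube-law estimate above can be made concrete. This is only a sketch under the common assumption that dynamic power scales with f·V² and that voltage roughly tracks frequency near the top of the V/f curve, so core power goes roughly as f³; the 60% core-power share of TBP is an illustrative guess, not a measured split:

```python
# Rough power estimate for pushing a 2.3 GHz part to 3.0 GHz.
# Assumption: dynamic power ~ f * V^2, with V rising roughly with f at
# the top of the V/f curve, so core power scales roughly with f^3.
stock_clock = 2.3    # GHz
target_clock = 3.0   # GHz
tbp = 355.0          # W, total board power of the reference card

ratio = target_clock / stock_clock   # ~1.30x clock
power_scale = ratio ** 3             # ~2.2x core power

# Only part of the 355 W TBP is core power; memory, VRM losses, fans,
# etc. don't scale with core clock. The 60% split is illustrative only.
core_share = 0.6
new_tbp = tbp * (1 - core_share) + tbp * core_share * power_scale
print(f"clock ratio {ratio:.2f}x -> core power ~{power_scale:.1f}x, "
      f"estimated board power ~{new_tbp:.0f} W")
```

Even with only 60% of the board power scaling cubically, the estimate lands north of 600 W, which is the "ridiculously hot" territory the post describes.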
I see you have mentioned this before as well. Sure, you can source ALL the other components for a computer for $600, but it will fall victim to the same thing the 4090 did, and that's CPU bottlenecks. You aren't building a system that can push either card (if we are to believe the 1.7x) for $600.
Edit: I get it. You guys are FROTHING at the mouth for an AMD card and just hoping and praying it's as fast as, or faster than, Nvidia. You want it so bad! Unfortunately, I don't see it happening, and AMD intentionally not showing its hand to reviewers isn't a "smooth move, because who cares about FPS anyway"; it's a silly move that stinks of low performance numbers.
AMD fans and Nvidia fans are both missing the point, here.
AMD was essentially required to put out a card this time of year and so was Nvidia. AMD challenged Nvidia to a game of slam-your-dick-in-a-car-door chicken, and Nvidia did three rails of coke, slammed their dick in the car door, and then hit the power lock. Then AMD said, OMG, you actually slammed your dick in a car door? Fucking outstanding.
Nvidia's like, why aren't you slamming your dick in the door? And AMD says that's retarded, why would anyone slam their dick in a car door?
AMD put out two cards that beat their current line-up, of which there is quite a lot of still, and that's it. They have a full stack and it runs all the way up to, but not over $1K. And it's all profit at the top for them. This makes Nvidia look desperate and greedy for most people.
They're still fighting off the "value" brand burden, on CPU and GPU fronts. I think that changes when AMD doesn't have to compete with existing AMD and Nvidia new-old stock.
> AMD was essentially required to put out a card this time of year and so was Nvidia. AMD challenged Nvidia to a game of slam-your-dick-in-a-car-door chicken, and Nvidia did three rails of coke, slammed their dick in the car door, and then hit the power lock. Then AMD said, OMG, you actually slammed your dick in a car door? Fucking outstanding.
>
> Nvidia's like, why aren't you slamming your dick in the door? And AMD says that's retarded, why would anyone slam their dick in a car door?
>
> AMD put out two cards that beat their current line-up, of which there is quite a lot of still, and that's it. They have a full stack and it runs all the way up to, but not over $1K. And it's all profit at the top for them. This makes Nvidia look desperate and greedy for most people.

This might just be the best thing I've read so far here, lol!!!
This is the most absurd pile of nonsense that I have read in this thread so far.