Graphics-wise, was Oblivion released too soon?

Sunin

[H]ard|DCer of the Month - August 2008
Joined
Dec 27, 2005
Messages
3,421
Several articles point out that you need the highest-end cards in SLI/Crossfire to average above 60 fps, which most consider the optimal playing rate. This is, of course, with all settings maxed. So my questions boil down to two:

Was the game released prematurely, at a point where the hardware necessary to take full advantage of it is not attainable for the masses? Or

Is the game a driving force to push high-end prices down so that the hardware does become available to the masses?

I'm curious, because I'd love to play with all settings maxed but I can't. I left the settings where the game defaulted them and am playing at 16x12 with a 7800GS (the best AGP has to offer atm), and my FPS varies between 20 and 120. The game is playable, but I wonder what I'm missing!

So what effect will this game and others like it have on hardware?

Two other side questions:

1. What is the maximum throughput of 2x16 SLI?
2. How long before we reach the maximum? Or, better worded: how fast can we go with our current architecture?

Sorry, this ran longer than I had anticipated, but I'm a tech geek like most here and wonder where things will be going in the next year or two!
 
Pretty much every game "grows" into its technology. Look at HL2. When that was released, the x800 was the "top end" card.
 
I don't think any game with replay value should be released with the latest graphics cards being able to run it at max.

I like something that I can play and enjoy now, then come back to later and bump the graphics up even more.

It's actually nice that all the latest games (GRAW, TR: Legend, Oblivion) push my 7900GTX to its knees at high settings; I don't want graphics card manufacturers to get lazy.
 
Look at it this way: would you rather be able to play the game now, or wait another 6 months? Even if the game had been held back, newer cards would replace the current generation and the cycle would continue.

The same thing happened with Far Cry, Half-Life 2, Doom 3, etc.

Sunin said:
1. What is the maximum throughput of 2x16 SLI?
2. How long before we reach the maximum? Or, better worded: how fast can we go with our current architecture?
1. There is a performance increase with SLI/Crossfire, but I don't have the time to find the sources. I do know that 100% increases have been reported with Crossfire (with the .exe renamed).

2. I'd say as soon as Vista/DirectX 10-compatible cards are released.
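
On the raw throughput question, the bus numbers are easy to ballpark. Here's a back-of-the-envelope sketch, assuming PCIe 1.x (2.5 GT/s per lane with 8b/10b encoding) and that both slots actually run at x16 electrically; some boards drop to x8/x8:

```python
# Back-of-the-envelope PCIe 1.x bandwidth for a dual-x16 SLI board.
# Assumptions: 2.5 GT/s per lane, 8b/10b encoding (80% efficiency),
# both slots electrically x16. Figures are per direction.

GT_PER_LANE = 2.5e9            # transfers/sec per lane (PCIe 1.x)
ENCODING_EFFICIENCY = 8 / 10   # 8b/10b: 8 data bits per 10 line bits
LANES, SLOTS = 16, 2

bytes_per_lane = GT_PER_LANE * ENCODING_EFFICIENCY / 8   # 250 MB/s per lane
per_slot = bytes_per_lane * LANES                         # ~4 GB/s per slot
aggregate = per_slot * SLOTS                              # ~8 GB/s combined

print(f"Per x16 slot:   {per_slot / 1e9:.1f} GB/s each direction")
print(f"Both x16 slots: {aggregate / 1e9:.1f} GB/s each direction")
```

That's bus bandwidth, not rendering speed, of course; actual SLI/Crossfire scaling depends on the game and drivers, which is why reported gains range from near zero up to the ~100% mentioned above.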

lloose said:
Pretty much every game "grows" into its technology. Look at HL2. When that was released, the x800 was the "top end" card.
More like the 9800 Pro.
 
Yeah, I was going to say...ATI was handing out free HL2 coupons with the 9x00 series, so those could handle it. Hell, I played HL2 @ 800x600 on my Radeon 9200. :)
 
It was released too soon, but I say that more because it's in serious need of a patch than anything else. I agree with others that the game will grow into the technology down the line. Even now, without everything maxed out, the game looks better than anything else in most regards. This game was under development for a long time, so they just got to the point where they really needed to release it. Now that it's here, I just hope patches will eventually squash most of the bugs, and hopefully some of the performance issues as well.
 
I am glad they released it and it is pissing all over your high-end cards.

If you can't run it at the highest settings, then get better hardware or turn down the extras.

I hate games that limit their "potential" so that some guy with a 5-year-old video card can play it.
 
MrGuvernment said:
I am glad they released it and it is pissing all over your high-end cards.

If you can't run it at the highest settings, then get better hardware or turn down the extras.

I hate games that limit their "potential" so that some guy with a 5-year-old video card can play it.

Ok, testosterone boy... The point was that even X1900XTXs in Crossfire, which run > $1,000, still don't guarantee FPS over 60. In fact, the game is begging for the next-gen cards to be released.

I agree I would not want the graphics to be diminished because some guy with an FX5500 wants to run it with all the options turned on full. That was not my point, and god I hope that never happens.

I'm just wondering if it would have been wiser to wait another 3 months or so, but I think I like one poster's view: in 6 months or a year we can all play this game on what will by then be mid-market cards and achieve the coveted 60 fps or more for a reasonable price, and it will be like playing a completely different game with everything jacked to max! Plus, by then they will have worked some bugs out and maybe smoothed/improved the graphics engine a bit!

Sunin
 
If it were multiplayer and nothing could get it past 60 fps, then I'd think there was an issue. It's single player, and regardless of what people would like, as long as it stays above about 20 fps, a large number of people will play through it without bitching too much.

I'm playing on much less than the latest gen of gear (6800GT, dual 2.4GHz Xeons, and 1.5GB RAM), and I'm at 1024x768. Am I getting the maximum graphics experience? No. Was the siege of Kvatch a slideshow at a couple of points? Yeah. Do I still find the game fun? Definitely. If I had an LCD monitor with worse scaling, I'd probably not feel that way, but I shopped with the fact in mind that it was going to have to scale the image for some things I use it for, so I chose one that does it better than average.
 
Sunin said:
Plus, by then they will have worked some bugs out and maybe smoothed/improved the graphics engine a bit!

Sunin

This is what I'm really wondering about. Are they gonna release a patch to optimize the engine further, if that is even a possibility?

I just got my 2405 (coming from a 17" CRT), and playing at 1920x1200 is sexy... but also very painful. Thank god widescreen scaling is ok. I would ideally use 1280x768 (since it's barely more than 1024x768), but then my roommates just come in and tell me to bump it back up to 19x12 again. Haha.
 
I certainly hope they do some graphics engine work. I don't want anything degraded, just made a bit smoother. One can hope!
 
That's a little silly. No one said you have to run a game maxed all the time. Lower the resolution and you still play the same game.
 
LCDs with a native resolution of 16x12 look like shit if you lower it, so I must run at that resolution. I do not have everything maxed out. I get playable fps, but I do wonder what I am missing!
 
I dunno... I still don't see it when I lower my resolution on my LCD when the graphics are all 3D. Only 2D images look bad when I lower my resolution.

But even so, you don't have to have everything turned on to enjoy the game.
 
I run it on my 1900XT with everything maxed, with the LOD mods, dual-core optimizations on, etc. The game runs at a steady 30 FPS outdoors in heavily wooded/grassy areas. It would probably run a little quicker, but I like V-Sync on (can't stand tearing at all). When I'm indoors it fluctuates between 45 and 60. For a game like this, these rates are entirely reasonable. My system (in sig) is nice, but it's not exactly top of the line (considering that Crossfire, SLI, and much higher-end procs are available). This situation of games growing with new hardware is not new at all. I can think of a lot of very old games that didn't run well on the then-current generation of hardware when they came out.

Sierra SCI games didn't run too well on my XT with an EGA card (which was pretty standard at the time).

Ultima VII didn't run too well on my 386.

Ultima VIII, System Shock and BioForge didn't run well on my 486 DX2 66 with VLB card. (in fact these games didn't run very well on the early Pentiums either)

Etc. etc. etc.
 
UltimaParadox said:
Bingo. This is an RPG, not an FPS. More than playable with AA and AF on my 6800 Ultra.

Yes, but as I mentioned in the original post, you don't have all the graphics turned up, and at what resolution are you talking? 16x12 on a 7800GS still drops to just under 20 fps at times, and that to me is unacceptable. It's playable, yes, but the 1900XTX owner above hardly ever drops to 30 fps, which is more acceptable, and two of those in Crossfire would allow full graphics, I would imagine. Even then, it's begging for the next-gen cards to run fully silky with full graphics.

Sunin
 
It all really depends on what you consider a playable framerate. I play the majority of games at around 30 fps. I crank up all the settings and run at 1920x1200 when the game supports it.

Oblivion runs great, never really dropping below 20 fps. Indoor areas are always 60 fps, towns around 30-60 depending on which one, and outdoor areas 20-40 fps depending on how much grass is around.

In FPS shooters, it really depends on the game as to what the playable framerate is. But since I generally only play single-player games, even 25 fps is playable.

As for Oblivion being released too soon: no, not at all. It's just badly optimised. GFX cards these days are far more powerful than people think; games are just optimised for them badly. But that's what happens when you have to make a game compatible with hundreds of different cards.
 
Iratus said:
I don't think any game with replay value should be released with the latest graphics cards being able to run it at max.

I like something that I can play and enjoy now, then come back to later and bump the graphics up even more.

It's actually nice that all the latest games (GRAW, TR: Legend, Oblivion) push my 7900GTX to its knees at high settings; I don't want graphics card manufacturers to get lazy.

Indeed.

I originally played Half-Life 2 on a GeForce FX 5200(!), but the Source engine was nice enough to give me decent framerates with some still pretty nice-looking graphics.

Later I played it on my modded 6800LE (12 pipes, 5 vertex shaders unlocked) at 1600x1200 with most settings maxed out and it was just lovely.
:D
 
I was hoping they would push the game back to Q4 of this year, but they dropped the ball. They released the game without any thorough beta testing. Not surprising, seeing how they were plagued with the same problems in Morrowind. I don't think the next-gen gfx cards will make a huge difference in fps in this game. I'm anxious to see what a 3GHz Conroe will do for it, though.
 
Sunin said:
Yes, but as I mentioned in the original post, you don't have all the graphics turned up, and at what resolution are you talking? 16x12 on a 7800GS still drops to just under 20 fps at times, and that to me is unacceptable. It's playable, yes, but the 1900XTX owner above hardly ever drops to 30 fps, which is more acceptable, and two of those in Crossfire would allow full graphics, I would imagine. Even then, it's begging for the next-gen cards to run fully silky with full graphics.

Sunin
It's an advanced game, whoop-de-doo. You're complaining because a game is too pretty? That's ridiculous.
 
Yeah... in terms of graphics, games have come a long way. Remember 3DMark2001 SE? The first time we saw the Nature demo, we wondered when we'd see that rendered in real time.

Exhibit A. Oblivion outdoors. Kiss your fps goodbye :)
 
PCMusicGuy said:
I was hoping they would push the game back to Q4 of this year, but they dropped the ball. They released the game without any thorough beta testing. Not surprising, seeing how they were plagued with the same problems in Morrowind. I don't think the next-gen gfx cards will make a huge difference in fps in this game. I'm anxious to see what a 3GHz Conroe will do for it, though.

I honestly don't think that will do much for the game, since it is a dual-core chip (unless I'm mistaken, but the way I read the technology, it's two 1.5GHz processors running in tandem). Oblivion, from what I've read, is not all that adept at taking advantage of dual processors. You'd be better off with a 4.5GHz P4. Although I must say I am excited about the quad and octal cores coming next year on the 45nm process. When we see speeds in the 10GHz-plus range I'll be :)
 
All I know is that Nvidia released a driver for FEAR that gave me a 30% FPS boost... so when I see a game that chugs on my 7800GTX (Oblivion), I just wait for the patches/drivers...
 
Most of the low framerates that I see are because of hard drive latency/transfers, not rendering.

The principal problem from the PC point of view is that it really is optimised for the Xbox 360 and its capabilities/limitations.

I'm hoping that the first patch makes it a little more PC-friendly (for instance, I really wish it would use all of my RAM, to cut down on the HDD thrashing and pop-in).

It won't, though.
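
For what it's worth, one band-aid that gets passed around is raising the cache limits in Oblivion.ini so the engine keeps more cells and preloaded data in RAM. The key names and stock values below are from memory, so treat them as assumptions and back the file up before touching it:

```ini
[General]
; Stock values (from memory, verify against your own Oblivion.ini);
; tweak guides suggest raising these on machines with 2GB of RAM
uInterior Cell Buffer=3      ; interior cells kept cached in RAM
uExterior Cell Buffer=36     ; exterior cells kept cached in RAM
iPreloadSizeLimit=26214400   ; preload cap in bytes (~25MB stock)
```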
 
Sunin said:
LCDs with a native resolution of 16x12 look like shit if you lower it, so I must run at that resolution. I do not have everything maxed out. I get playable fps, but I do wonder what I am missing!

Well, if you had an ATI card, you could always do 800x600 with 4xAA and keep the HDR. 800x600 divides evenly into 1600x1200, so even LCDs that interpolate *terribly* will still look as good as native at that res.

You just need buckets of FSAA to make it work, is all.

And, FWIW, 800x600 with 4xAA in Oblivion performs WAY better than 1600x1200 does (a touch more than twice the FPS) and really doesn't look that much worse. Certainly not enough worse to justify the enormous performance difference.
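
The arithmetic behind that is simple: 800x600 is exactly half of 1600x1200 in each dimension, so the panel can map every rendered pixel to a clean 2x2 block with no fractional blending. A little sketch of the check (the helper name is mine, just to illustrate the idea):

```python
# Sketch: does a render resolution scale integrally into a panel's
# native resolution? An integer ratio means each rendered pixel maps
# to an exact NxN block of physical pixels, so no fractional blur.

def scales_cleanly(render, native):
    rw, rh = render
    nw, nh = native
    # Both axes must divide evenly, and by the same factor.
    return nw % rw == 0 and nh % rh == 0 and nw // rw == nh // rh

print(scales_cleanly((800, 600), (1600, 1200)))   # True: clean 2x2 blocks
print(scales_cleanly((1024, 768), (1600, 1200)))  # False: fractional scaling
```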
 
I think that it was released too soon. I would have much rather had the advanced lighting and shadow effects that were in the first videos of the game than what we ended up with.

I am constantly playing Oblivion and trying to find an explanation for why it runs so slowly. Everywhere you go in the game, it seems unoptimized or badly written, even though it looks so pretty.

For example, some houses, for no reason at all, bring the fps down to around 15. I am outside the house and it's 40 fps; I go inside and it's 15... oh wait, if I take two more steps into the building it's suddenly 60 fps... like, wtf??

Examples of this are in Anvil: there is a quest involving a lady and her rats downstairs; go into her house and try to explain the ridiculous drop in fps. There is also a fighters guild there that does the same thing, yet the fighters guild in Chorrol, I believe, has more people in it and a training area in use, and it is smooth as silk.

Or how about Oblivion gates affecting framerates when you are just in their general vicinity? I mean, not even looking at them; just because one is within a 100-yard radius, it affects your framerate.

Also look at combat in the game: why is it that I can look at 10 people walking around a city and my framerate stays steady, but when I am fighting one bandit in a very undetailed cave, the frames go in the crapper? All he is doing is raising his arm and performing a badly animated movement.

I love the game, but a lot of the time I can't explain how huge the fluctuations are or why they are even happening. I think Bethesda is great at making interactive worlds, BUT I think they are amateur programmers/developers/whatever you call the people who handle the engine.

P.S. The first time I got to look outside at the faraway-land blurry-texture bullshit, I knew I wouldn't be playing it only for the technical marvel it is.
 
As long as it runs well at 1024x768 on my ATI X700 at medium-low settings, I'll be fine. For me, the graphics of a game only matter for the first 10 minutes or so of gameplay; after that it's all down to the gameplay. Unless the graphics are RuneScape-bad, of course.
 
Hey, I just installed the patch and walked around outside. Wow, it seems like they increased the draw distance and removed the soup (or maybe that's the LOD mod). Self-shadowing is working fine now, too, and with the graphics maxed (all sliders to the right) I ran around outside and barely dipped below the 50s. It's nice to see the performance increase, as I'm replaying it with an overhaul mod now.

Resolution at 1280x960, HDR, 8xAF
 