Will Vista Shorten The Life Of GPUs?

Slo Gun

If the desktop and everything else is going to run the card in 3D mode, won't that reduce the lifespan of the video card, since it will always be running as though a game were running, producing more heat and running at a higher voltage?
 
phoderpants said:
Sounds like good logic to me.

Yeah, I like the reasoning. I guess if that is the case, next-gen video cards will have to use better cooling methods then?

Para
 
"Will Vista Shorten The Life Of GPU's?"

In general, no. Depending on the number of windows open and the actions being performed, a 3D desktop is generally a light load. A video card isn't a light switch (on/off); the heat produced varies with the load. You'll probably see more of a load from certain WinFX-enabled windows.

Will Vista shorten the life of insanely overclocked cards running above maximum specified voltage? Probably. :p
 
I'm sure there will be some utilities, built in or third party, that lower the clock/voltage or whatever whenever you switch between games and the 3D desktop.
 
You can't switch voltages on a video card without some modding, unless you install a switch of some sort (I've seen it done before, but it's omegamodding). With the dual-core GPUs coming out, though, they should be able to dedicate one core to just the desktop and then switch to both cores when you need them for gaming.
 
board2death986 said:
You can't switch voltages on a video card without some modding, unless you install a switch of some sort (I've seen it done before, but it's omegamodding). With the dual-core GPUs coming out, though, they should be able to dedicate one core to just the desktop and then switch to both cores when you need them for gaming.

Not true. The latest ATI Tool allows for voltage adjustments right in Windows on the X1800 ;). However, I don't think it will be a big issue. No matter how pretty the desktop is, (I assume) it will be nowhere near the load of a 3D game. Now, think about what 2D mode really is on a video card. It's not like it shuts off the ability for 3D processing; it just lowers the voltages and clock frequencies to keep the card running cool. Something tells me that cards running in "2D mode" will still have plenty of juice to run any of the features of the Windows desktop.

I could be wrong, but I don't think I am. Time will tell I guess... but I will say one thing for certain. It's nothing I am going to lose any sleep over :).
 
Viper87227 said:
Not true. The latest ATI Tool allows for voltage adjustments right in Windows on the X1800 ;). However, I don't think it will be a big issue. No matter how pretty the desktop is, (I assume) it will be nowhere near the load of a 3D game. Now, think about what 2D mode really is on a video card. It's not like it shuts off the ability for 3D processing; it just lowers the voltages and clock frequencies to keep the card running cool. Something tells me that cards running in "2D mode" will still have plenty of juice to run any of the features of the Windows desktop.

I could be wrong, but I don't think I am. Time will tell I guess... but I will say one thing for certain. It's nothing I am going to lose any sleep over :).

Sounds like a fair assumption.
Hope you're right.
 
Slo Gun said:
Sounds like a fair assumption.
Hope you're right.

Me too... as I am one of the guys overclocking and overvolting cards to their maximum :D
 
I am sure that the Vista drivers that ATI and nVidia ship will clock the card low for the Vista GUI, sort of like how nVidia cards currently have what we call a "2D" clock rate and a "3D" clock rate.

Hell, I think ATI cards even run at a lower VOLTAGE when they are at their "2D" clock rate. I know they did with the 9800 and X800 series anyway.

Vista's newfangled hardware-driven GUI might be sexy, but it CERTAINLY isn't going to stress a current or next-gen GPU enough to need fully clocked horsepower.
 
What does it matter? More than half of us buy a new video card every 3-6 months anyhow.
 
Panda Man said:
New cooling is probably on the way. Has anybody heard the stories about future-gen GPUs using a socket (a rather old idea, actually)? Read about the possibilities in Maximum PC, anyway.

EDIT - info - http://www.anandtech.com/video/showdoc.aspx?i=2570

I doubt that will happen, mainly due to space constraints. Look at your motherboard now; it's jam-packed. Now find a way to make room for a second socket (as well as room for proper cooling), all the various crap you'd normally find on a GPU PCB, and sockets for RAM (because onboard RAM would be stupid if the GPU is upgradable). It's a nice idea, but due to space constraints, I doubt you're ever going to see it happen. There's no real need for it anyway. What you may be more likely to see is a daughterboard with a GPU socket and memory sockets, so that someone can upgrade their GPU and memory without getting a new PCB, but it will still be an add-on PCB, not something integrated into the motherboard. Even that, if it happens, is a ways off.
 
seanmcd said:
What does it matter? More than half of us buy a new video card every 3-6 months anyhow.

:rolleyes:

You think the majority of Vista users will be buying a new video card every 3-6 months? Computer tech enthusiasts represent a VERY small percentage of the user base for Windows. Think outside yourself and this forum for a minute.
 
Viper87227 said:
I doubt that will happen, mainly due to space constraints. Look at your motherboard now; it's jam-packed. Now find a way to make room for a second socket (as well as room for proper cooling), all the various crap you'd normally find on a GPU PCB, and sockets for RAM (because onboard RAM would be stupid if the GPU is upgradable). It's a nice idea, but due to space constraints, I doubt you're ever going to see it happen. There's no real need for it anyway. What you may be more likely to see is a daughterboard with a GPU socket and memory sockets, so that someone can upgrade their GPU and memory without getting a new PCB, but it will still be an add-on PCB, not something integrated into the motherboard. Even that, if it happens, is a ways off.

If you could integrate all the southbridge features into a single northbridge, then the room is there.
 
Extended ATX and full-tower cases. The room is there; the question is whether YOU have the room for it.
 
This should indeed shorten the life of video cards. No more 2D/3D clocks. The card will always have to run accelerated to the max, voltage always at the max, etc. Video card warranties will be a big thing come Vista. ;)
 
Viper87227 said:
Sadly though, he speaks the truth. Maybe not quite half... but too damn many.

MOST people do NOT get a new card every few months, but you missed the point. The stupid part is him acting like it would be no big deal if cards wore out faster. For most people that would actually be an issue.
 
Lord_Exodia said:
This should indeed shorten the life of video cards. No more 2D/3D clocks. The card will always have to run accelerated to the max, voltage always at the max, etc. Video card warranties will be a big thing come Vista. ;)

Not true, not true at all. Want a test on a current system? Run any DX application in a window, even a window as large as your desktop resolution, and notice with the monitoring/tweaking program of your choice that ATI cards don't step up to 3D voltage/clock mode... it's only in full screen.

"Ok, but Vista's desktop is full screen!", you say. True, but it's also pretty low-overhead 3D, less stressful on the system than ATI's little Control Center 3D settings preview window (the one with the car).

To applications and utilities, it'll still be "the desktop", which means clock speeds and voltages can be throttled back just like they are now, until a "real" 3D application requests exclusive control of the screen.

I'm not saying you can't go hog wild with Vista's desktop, customize the heck out of it, and actually put a serious load on the GPU... but the GPU won't throttle up automatically based on that load alone.
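
In other words, the behavior being described boils down to something like the sketch below. This is purely illustrative Python pseudocode; the function name and the profile numbers are made up and are not any real driver API:

    # Hypothetical sketch of the 2D/3D throttling logic described above.
    # Profile values and the function name are invented for illustration only.
    def pick_clock_profile(exclusive_fullscreen_3d: bool) -> dict:
        if exclusive_fullscreen_3d:
            # A "real" 3D app has taken exclusive control of the screen.
            return {"core_mhz": 600, "mem_mhz": 1400, "voltage": 1.40}  # "3D" profile
        # Desktop, windowed 3D, and (by this logic) Vista's composited desktop.
        return {"core_mhz": 400, "mem_mhz": 800, "voltage": 1.10}       # "2D" profile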
 
Just because Vista has the swanky Aero Glass desktop doesn't mean you have to use it. I use the classic style in XP, and will use the equivalent in Vista. I just don't see the point; I want to focus on the UI of an application, not the window surrounding it.
 
Croak said:
Not true, not true at all. Want a test on a current system? Run any DX application in a window, even a window as large as your desktop resolution, and notice with the monitoring/tweaking program of your choice that ATI cards don't step up to 3D voltage/clock mode... it's only in full screen.

"Ok, but Vista's desktop is full screen!", you say. True, but it's also pretty low-overhead 3D, less stressful on the system than ATI's little Control Center 3D settings preview window (the one with the car).

To applications and utilities, it'll still be "the desktop", which means clock speeds and voltages can be throttled back just like they are now, until a "real" 3D application requests exclusive control of the screen.

I'm not saying you can't go hog wild with Vista's desktop, customize the heck out of it, and actually put a serious load on the GPU... but the GPU won't throttle up automatically based on that load alone.

True, there really is no need for the Vista desktop to up the clocks to full 3D mode to run. Even though Windows currently runs in 2D, it still uses the graphics card to render. Just as an example, this is why, without video drivers installed, the desktop at higher resolutions runs like sh*t.
 
Croak said:
Not true, not true at all. Want a test on a current system? Run any DX application in a window, even a window as large as your desktop resolution, and notice with the monitoring/tweaking program of your choice that ATI cards don't step up to 3D voltage/clock mode... it's only in full screen.

"Ok, but Vista's desktop is full screen!", you say. True, but it's also pretty low-overhead 3D, less stressful on the system than ATI's little Control Center 3D settings preview window (the one with the car).

To applications and utilities, it'll still be "the desktop", which means clock speeds and voltages can be throttled back just like they are now, until a "real" 3D application requests exclusive control of the screen.

I'm not saying you can't go hog wild with Vista's desktop, customize the heck out of it, and actually put a serious load on the GPU... but the GPU won't throttle up automatically based on that load alone.


What I have read about Aero is that it will be such a resource hog that you will need a fully compatible DX 9.0 or DX10 card to even run it. It will make your video card work. What I have also understood is that, unlike running a dxdiag Direct3D test (whether full screen or not), Aero will make your video card work in full 3D mode 24/7. So if what I read in the past about Aero is true, your video card will treat it in full 3D as if it were rendering a full-screen game. I guess there could be a workaround for this, but again, from my understanding of what was said about Aero, it will be a fully hardware-accelerated 3D scheme that keeps your video hardware fully accelerated 24/7. If that is the case, given the way the current crop of video cards works, they will run in 3D mode all the time, forcing them to run at their highest voltages and clock rates around the clock and shortening their life. That's why I said yes before. Unless MS has changed this, I still stand behind what I said before.
 
trek554 said:
MOST people do NOT get a new card every few months, but you missed the point. The stupid part is him acting like it would be no big deal if cards wore out faster. For most people that would actually be an issue.

Yeah, I had my 6800GT for 16 months before I upgraded, and I'll probably have my 7900GTs for at least 6 months, probably longer.
 
toga2 said:
:rolleyes:

You think the majority of Vista users will be buying a new video card every 3-6 months? Computer tech enthusiasts represent a VERY small percentage of the user base for Windows. Think outside yourself and this forum for a minute.


I consider myself an enthusiast and I do a major upgrade every THREE years (at this point).
 
Well, let's see. Wear out your video card? The only means of wearing out a video card are corrosion and electromigration. Everything else is pretty much catastrophic failure or accidents.

Has anyone actually worn out a video card? A chip? Granted, with smaller trace sizes and more heat, lifespans have been shrinking, but they're still well beyond the useful life of the card, or apparently any other component, since you don't see companies flipping their wigs over servers that were supposed to last 5 years dying after 3.

The best I can do is one old Athlon 1.2GHz chip that got used for maybe 6 years through the magic of scrounging the spare-parts bin. It stopped working after about 6 years of use with no apparent damage to it or the system I was trying to use it in.

Even if it cut the life of the card from 5 years to 4 years, a huge drop in lifespan, it wouldn't generate much ill will.

What Vista will likely do is cause some issues with cooling and noise. Low-effort 3D all the time will mean more cooling and thus more noise. Some effort will have to be put in by the graphics card companies to manage that more thoroughly and intelligently than they do now to keep things ergonomic.

Requiring a DX9 part as the minimum spec will cause more issues than the increased GPU usage will.


Case in point: OS X. It has had a 3D-accelerated desktop, and I haven't heard of Mac users burning through their 9700 equivalents yet.
 
greyt_Autumn said:
I consider myself an enthusiast and I do a major upgrade every THREE years (at this point).
I don't think an upgrade every three years makes you much of an enthusiast. lol. :rolleyes:
 
trek554 said:
I don't think an upgrade every three years makes you much of an enthusiast. lol. :rolleyes:

Maybe he's a poor enthusiast? Anyway, I'm going to use my current stuff till it breaks or newfangled hardware forces me to upgrade.
 
seanmcd said:
What does it matter? More than half of us buy a new video card every 3-6 months anyhow.



I've had my X800XT PE for nearly 18 months. My cards have basically all lasted a year or more.
 
Well, heat breaks down electronics, and that's the logic behind saying the new Windows (or any 3D desktop environment) will shorten the life of a video card.

While I think there is truth in the statement, I don't think it will matter much. Most people upgrade their systems every 5 years or sooner. Let's say video cards last 6 years, and with the new 3D desktop environments your card lasts 5% less. That's about 110 days less, or about 5.7 years, which is still a lot longer than your typical 5-year upgrade cycle.
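
Just to spell out that back-of-the-envelope arithmetic in a quick Python check (the 6-year lifespan and the 5% figure above are assumptions, not measurements):

    # Sanity check of the assumed numbers above.
    baseline_years = 6.0          # assumed typical card lifespan
    reduction = 0.05              # assumed 5% shorter life from a 3D desktop
    lost_days = baseline_years * 365 * reduction         # about 110 days
    remaining_years = baseline_years * (1 - reduction)   # about 5.7 years
    print(round(lost_days), round(remaining_years, 2))   # 110 5.7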

We all know that video cards generally last even longer than 6 years, and that a 5% loss due to usage is probably on the high side anyway.

I would say you have nothing to worry about.
 