phoderpants said:Sounds like good logic to me.
board2death986 said:can't switch voltages on a video card without some modding, unless you install a switch of some sort (seen it done before, but it's omegamodding). With the dual-core GPUs coming out though, they should be able to dedicate one core to doing just the desktop and then switch to dual-core when you need it in gaming.
Viper87227 said:Not true. The latest ATI Tool allows for voltage adjustments right in Windows on the X1800. However, I don't think it will be a big issue. No matter how pretty the desktop is, (I assume) it will be nowhere near the load of a 3D game. Now, think about what 2D mode really is on a video card. It's not like it shuts off the ability for 3D processing, it just lowers the voltages and clock frequencies to keep the card running cool. Something tells me that cards running in "2D mode" will still have plenty of juice to run any of the features of the Windows desktop.
I could be wrong, but I don't think I am. Time will tell I guess... but I will say one thing for certain. It's nothing I am going to lose any sleep over .
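Viper's point, that "2D mode" is just a lower clock/voltage profile rather than disabled 3D capability, can be sketched as a toy model. All names and numbers below are made up for illustration; they are not real X1800 figures or driver code:

```python
# Toy model of the 2D/3D power-state idea: "2D mode" is the same hardware
# at lower clocks and voltage, not a loss of 3D capability.
# All numbers are invented for illustration only.

POWER_STATES = {
    "2d": {"core_mhz": 500, "mem_mhz": 600, "core_v": 1.08},  # hypothetical
    "3d": {"core_mhz": 625, "mem_mhz": 750, "core_v": 1.40},  # hypothetical
}

def frames_per_second(state_name, work_per_frame_mops):
    """Crude throughput estimate: pretend frame rate scales linearly
    with core clock (one operation per cycle)."""
    state = POWER_STATES[state_name]
    ops_per_second = state["core_mhz"] * 1_000_000
    return ops_per_second / (work_per_frame_mops * 1_000_000)

# A composited desktop is light 3D work; a game is heavy 3D work.
desktop_fps_2d = frames_per_second("2d", work_per_frame_mops=2)
game_fps_2d = frames_per_second("2d", work_per_frame_mops=20)
game_fps_3d = frames_per_second("3d", work_per_frame_mops=20)

# Even at 2D clocks the desktop renders far faster than it needs to,
# while a heavy game clearly benefits from stepping up to 3D clocks.
print(desktop_fps_2d, game_fps_2d, game_fps_3d)
```

Under these made-up numbers, the 2D profile still pushes the light desktop workload at hundreds of frames per second, which is the "plenty of juice" argument in miniature.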
Slo Gun said:Sounds like a fair assumption.
Hope you're right.
Panda Man said:New cooling is probably on the way. Anybody heard the stories about future-gen GPUs using a socket (a rather old idea, actually)? Read about the possibilities in Maximum PC, anyway.
EDIT - info - http://www.anandtech.com/video/showdoc.aspx?i=2570
seanmcd said:what does it matter? more than half of us buy a new video card every 3-6 months anyhow..
Viper87227 said:I doubt that will happen... mainly due to space constraints. Look at your MB now, it's jam-packed. Now, find a way to make room for a 2nd socket (as well as room for proper cooling), all the various crap you're normally going to find on a GPU PCB, and sockets for RAM (because onboard RAM would be stupid if the GPU is upgradable). It's a nice idea, but due to space constraints, I doubt you're ever going to see it happen. There's no real need for it anyways. What you may be more likely to see is a daughterboard with a GPU socket and memory sockets, so that someone can upgrade their GPU and memory without getting a new PCB, but it will still be an add-on PCB, not something integrated into the motherboard. Even that, if it happened, is a ways off.
trek554 said:what a stupid thing to say.
Viper87227 said:Sadly though, he speaks the truth. Maybe not quite half... but too damn many.
Lord_Exodia said:This should indeed shorten the life of video cards. No more 2D/3D clocks. The card will always have to run accelerated to the max, voltage always at the max, etc. Video card warranty is a big thing come Vista.
Croak said:Not true, not true at all. Want a test on a current system? Run any DX application in a window, even a window as large as your desktop resolution, and notice with the monitoring/tweaking program of your choice that ATI cards don't step up to 3D voltage/clock mode... it's only in full screen.
"Ok, but Vista's desktop is full screen!", you say. True, but it's also pretty low-overhead 3D, less stressful on the system than ATI's little spinning-car 3D-settings preview window in Control Center.
To applications and utilities, it'll still be "the desktop", which means clock speeds and voltages can be throttled back just like they are now, until a "real" 3D application requests exclusive control of the screen.
Not saying you can't go hog wild with Vista's desktop and customize the heck out of it, and actually put a serious load on the GPU... but the GPU won't throttle up automatically based on that load alone.
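Croak's claim boils down to a simple trigger condition: the driver steps up to 3D clocks when an application requests fullscreen-exclusive control, not merely because the windowed or composited workload is heavy. A toy sketch of that decision (all names and fields here are invented for illustration, not actual driver logic):

```python
# Toy sketch of the throttle trigger Croak describes: only a
# fullscreen-exclusive 3D app steps the card up to 3D clocks;
# windowed 3D and the composited desktop stay at 2D clocks,
# no matter how high the GPU load gets. Names are hypothetical.

from dataclasses import dataclass

@dataclass
class App:
    uses_3d: bool
    fullscreen_exclusive: bool
    gpu_load_pct: int  # note: load alone never triggers the step-up

def select_clock_profile(app: App) -> str:
    """Return '3d' only for fullscreen-exclusive 3D apps, else '2d'."""
    if app.uses_3d and app.fullscreen_exclusive:
        return "3d"
    return "2d"

# A windowed DX app, even one as large as the desktop, stays at 2D clocks...
windowed_game = App(uses_3d=True, fullscreen_exclusive=False, gpu_load_pct=90)
# ...as does a heavily customized composited desktop...
aero_desktop = App(uses_3d=True, fullscreen_exclusive=False, gpu_load_pct=60)
# ...while a real fullscreen game takes exclusive control and steps up.
fullscreen_game = App(uses_3d=True, fullscreen_exclusive=True, gpu_load_pct=95)

print(select_clock_profile(windowed_game),
      select_clock_profile(aero_desktop),
      select_clock_profile(fullscreen_game))
```

This matches the in-thread test: monitor clocks while running a windowed DX app and the card never leaves its 2D profile.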
trek554 said:MOST people do NOT get a new card every few months, but you missed the point. The stupid part is him acting like it would be no big deal if cards wore out faster. For most people that would actually be an issue.
toga2 said:You think the majority of Vista users will be buying a new video card every 3-6 months? Computer tech enthusiasts represent a VERY small percentage of the user base for Windows. Think outside yourself and this forum for a minute.
greyt_Autumn said:I consider myself an enthusiast and I do a major upgrade every THREE years (at this point).
trek554 said:I don't think an upgrade every three years would make you too much of an enthusiast. lol.