
05-03-2003, 07:40 PM
Explanation about why FXs suck so much:
nVidia, in their infinite wisdom, decided to control the fan in a different way. The fan speed depends on how much stress the video card is under, judged by how many programs are running. It seems like a good idea at first... if you never let the machine idle.
Here is what makes their idea sooooo bad. When your computer has been running no programs at all for a certain length of time, nVidia made it so that the fan actually shuts off. Sounds bad already, eh? When a program is executed, the fan spins back up and it's all fine and dandy. So... let's say you go idle for 10+ minutes and your Matrix Reloaded screensaver pops up. You're not running any programs since you're idle, but that screensaver demands heavy 3D work from your video card, so the card dutifully renders it.
Now all that work requires a serious amount of cooling; after all, it IS the mighty GEFORCE FX 5800 ULTRA! *scary music plays* But since the driver doesn't see any programs running, it never starts the fan (gg, fan controlled by drivers). So after about 4 or 5 seconds into the screensaver, your card is hitting a good 95-100 degrees Celsius, give or take a few. A NORMAL operating temperature is around 30-40 degrees Celsius; I run at 35 and it's no big deal. If your CPU ever ran at 95-100 degrees Celsius for more than, say, 10 seconds, it would melt right into your motherboard. Same thing is going to happen with the "godly" GeForce FX. $400 down the drain! WOOHOO!
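If you want it spelled out, here's a rough sketch of the two approaches in Python. All the names and numbers are made up by me just to illustrate the point (nobody outside nVidia has the real driver code); the difference is whether the fan answers to "how many programs are running" or to the actual core temperature.

```python
# Hypothetical sketch -- names and thresholds are mine, not nVidia's.

CRITICAL_TEMP_C = 95   # roughly where the card ends up in the screensaver scenario
SAFE_TEMP_C = 40       # a sane operating temperature

def activity_based_fan(active_programs: int) -> int:
    """Fan duty cycle keyed off whether the driver thinks an app is running.
    A 3D screensaver firing while the machine is 'idle' gets zero airflow."""
    return 100 if active_programs > 0 else 0

def temperature_based_fan(core_temp_c: float) -> int:
    """Fan duty cycle keyed off the actual core temperature, which is what matters."""
    if core_temp_c >= CRITICAL_TEMP_C:
        return 100
    if core_temp_c <= SAFE_TEMP_C:
        return 20  # quiet at idle, but never fully off
    # scale linearly between the safe and critical thresholds
    span = CRITICAL_TEMP_C - SAFE_TEMP_C
    return int(20 + 80 * (core_temp_c - SAFE_TEMP_C) / span)

# The screensaver scenario: zero "programs" running, core cooking at 95 C.
print(activity_based_fan(active_programs=0))      # 0   -> card cooks
print(temperature_based_fan(core_temp_c=95.0))    # 100 -> card survives
```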
And that's why the FX blows.