SootAndGrime
I have two enthusiast-level ATI Radeon cards in CrossFire, and I don't see the need for more than two graphics cards in a system. I can run just about any game maxed out at playable frame rates, even Crysis.
There are many inherent problems with running four graphics cards, heat and staggering power consumption being the most prominent. A four-way HD 6970 or GTX 580 setup can draw 1,400W+ of total system power at full load. The only PSU I know of that has been tested powering four GPUs at full bore is Silverstone's 1,500W unit. A typical wall socket also can't handle that kind of load, and circuit breakers can trip or fail, potentially requiring an electrician to fix.
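A back-of-the-envelope sketch of why the wall socket becomes the bottleneck. The per-card and rest-of-system wattages below are assumptions based on typical HD 6970 / GTX 580 TDPs, not measured figures:

```python
# Rough power-budget sketch for a four-GPU system.
# Wattages are assumed typical values, not measurements.
GPU_TDP_W = 250          # assumed draw per card at full load
NUM_GPUS = 4
REST_OF_SYSTEM_W = 400   # assumed CPU, drives, fans, PSU losses

total_w = GPU_TDP_W * NUM_GPUS + REST_OF_SYSTEM_W
print(f"Estimated full-load draw: {total_w} W")        # prints 1400 W

# A standard North American 15 A / 120 V circuit tops out at:
circuit_limit_w = 15 * 120
print(f"15 A @ 120 V circuit limit: {circuit_limit_w} W")  # prints 1800 W

amps_drawn = total_w / 120
print(f"Current drawn at 120 V: {amps_drawn:.1f} A")   # prints 11.7 A
```

That 11.7 A is for the PC alone; add monitors and anything else on the same circuit and you're pushing a household breaker's continuous-load limit, which is why four-way rigs can trip breakers.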
Don't forget the chip-melting heat four graphics cards produce. Since they sit right next to each other, they dump heat onto one another, which can push some of the cards past 90 °C.
Who actually needs four graphics cards in SLI/CrossFire? It's much better to run two dual-GPU cards, like two HD 6990s or two GTX 590s, instead of four individual cards, sacrificing only a small amount of performance because the dual-GPU cards are downclocked. You'll also save PCIe slots.
It's hard to overclock the dual-GPU cards to match the clock speeds of their single-GPU counterparts, since they're downclocked for heat and power reasons in the first place.
Off topic, but why is it that no single graphics card is allowed to exceed 400W?
Two or three cards are more than enough for 99% of applications. Performance scaling beyond two GPUs also isn't very good: a third card might add 40% more performance, and a fourth only about 25%.
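To put those scaling numbers together: assuming each extra card adds a diminishing fraction of a single card's performance, the cumulative speedup flattens out fast. The 0.4 and 0.25 gains are the figures above; the 0.9 for the second card is an assumed typical two-way scaling figure, not from the post:

```python
# Diminishing-returns sketch of multi-GPU scaling.
# gains[i] = fraction of one card's performance added by card i+1.
# 0.9 for the second card is an assumption; 0.4 and 0.25 are the
# rough third/fourth-card gains quoted above.
gains = [1.0, 0.9, 0.4, 0.25]

total = 0.0
for n, gain in enumerate(gains, start=1):
    total += gain
    print(f"{n} GPU(s): {total:.2f}x a single card")
# prints 1.00x, 1.90x, 2.30x, 2.55x
```

So under these assumptions a fourth card takes you from 2.30x to only 2.55x a single card, while adding a full card's worth of heat, power, and cost.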
Running Crysis at 2560x1600 with 32x MSAA, graphics mods installed, and multiple monitors might need four GPUs, but I can't think of anything else.
Does anyone here have four graphics cards in their system? If so, can you give me some input?