Even just a year ago, having a hot-running graphics card such as AMD's R9 290X was par for the course. Admittedly, there have been hotter and cooler examples of 'the must-have' GPU over the years, but in general, if a card is good value and performs well, I'm usually sold.
This is especially true with me as I usually rip the stock cooler off a new graphics card straight away and fit a waterblock, so heat has never really bothered me. The exceptions were excessively inefficient models such as Nvidia's GTX 480, which weren't that fast and could heat your average Olympic swimming pool. Equally, AMD's dual-GPU offerings have often generated too much heat and been overkill for my needs.
However, something changed with Nvidia's GTX 750 Ti, the first of its Maxwell GPUs for the desktop. Here was a graphics card so power efficient that it actually became popular with Bitcoin miners, even though up to that point Nvidia had been lagging behind AMD, which had sold shedloads of cards to digital currency miners.
Likewise, for gamers and anyone who appreciates an efficient bit of hardware, the GTX 980 and GTX 970 proved equally easy on the electricity bill. The GTX 980, for example, drew 299W under load (that's the combined system load), while the GTX 780 came in at 373W and AMD's R9 290X at 409W. I nearly couldn't believe my eyes when I saw Matt's review.
Nvidia's GTX 980 drew over 100W less than the R9 290X in our tests - AMD will need to close this deficit with its new GPUs.
That's a huge improvement, and with my little office getting unbearably hot in the summer if I even think about gaming, the ability to consume 100W less and have a more powerful graphics card at the same time is a godsend. In fact, part of my reason for not owning a 4K screen at the moment, as I discussed in my recent blog, AMD and Nvidia need to step up to the 4K challenge, is that even if I could get my hands on the GPU horsepower to deal with all those pixels, the heat generated wouldn't be tolerable in the summer.
So, things are actually looking good in terms of GPU power efficiency, but I inevitably began to compare the GPU market with the CPU one, specifically the fact that AMD's CPUs run so much hotter and less efficiently than Intel's. Once you overclock them, the difference becomes enormous, with AMD's FX CPUs drawing vast amounts of power. The battle between Nvidia and AMD is a different story, though. Here, the two are out of sync in their GPU launches, so while Nvidia already has its high-end next-gen graphics cards out in the wild, all eyes are on AMD to come up with a competitive product, which we expect to land relatively soon.
The R9 290X is a toasty customer, so much so that water-cooling it actually eliminates thermal throttling and boosts performance over reference cooler-equipped models, even at default frequencies.
In fact, you only have to look in our forum to see hardware spec-filled signatures sporting a roughly equal number of Radeon and GeForce cards - both companies have offered up excellent products in the last 24 months. Nvidia's recent 900-series launch continues a cycle of GPU launches that began with the GTX 750 Ti and will likely end with AMD's mid-range offerings next year, or possibly with the eagerly awaited GTX 960.
While AMD might seem to be on the back foot at the moment, a lot of that can be put down to the fact it's out of sync and Nvidia was first to market with a new product. I'm genuinely excited to see what it comes up with, as my ageing GTX 660 Ti has seen better days. However, I do think that to really win the best GPU crown, AMD has to rein in its power consumption, even if the supposed R9 390X is a lot faster than the GTX 980. Absolute performance is all very well, but bucking the trend of better power efficiency would be unwise. Hopefully we won't have to wait too long to find out if AMD has managed to do it.