For many of us, the "Space Race" is something that happened in the history books. It seems so silly now - two countries locked in a nearly twenty-year struggle, each trying to prove it had better engineers. Billions of dollars, tonnes of rocket fuel, and even a few monkeys... all sacrificed so that the US could claim superiority over Russia when the race finally died down. Poor monkeys.
Though many great things inadvertently came out of the Space Race (like plastics), the actual act itself held little benefit. In fact, the only
direct benefit that came out of it is the existence of satellites - useful, yes, but that was figured out in the early sixties... so what did we do for the other 15 years? Aside from an ego massage and a couple of examples of pure, unadulterated power, we spent a whole lot of time and money staring at the sun.
It's funny how, when faced with technology, we still do the exact same thing.
Did you guys catch that little news bit about
ATI's new graphics card? There's something that sounds impressive about a 1GHz core clock, isn't there? Those wearing green underwear shouldn't worry though, because I'm sure NVIDIA will have something even bigger and better up its sleeve. Of course, by then we'll need to move on to 2kW power supplies. All I'll need is a monkey, and I'll have my very own Space Race.
"We spent a whole lot of time and money staring at the sun."
Much like the actual Space Race, the race between NVIDIA and ATI (or AMD and Intel, you pick the battle) is not without its casualties. The first on the list is your electric bill. Oh, that 8800 GTX seemed so innocent when you put it in and snapped those two power leads on it... but the 185W maximum TDP means that thing is sucking down some major juice when you're playing games.
I'm going to leave the specific maths out of this article, but let's just say that the 400 quid you dropped on the counter is not the only meaningful amount you'll be paying thanks to that card. Nowadays, it's as if a graphics card needs enough power to run the country of Uzbekistan while generating enough heat to keep it all warm and toasty. If it doesn't, well, you must be behind the times.
The most awful part about it is the spin provided by the manufacturers. These days, everything is being lauded in terms of Performance Per Watt (PPW). That seems power-conscious, right? And it is, to an extent. In order to have a better PPW, a device must actually be more power efficient. For example, the G80 core really is quite a bit more efficient than G71, and once it gets its process shrink down to 80nm, it will be even more so.
"Nowadays, it's as if a graphics card must need enough power to run the country of Uzbekistan"
With that in mind, it seems like PPW is the perfect fit for the energy-conscious enthusiast. Let's make an energy-efficient shopping list:
- an 80+ efficiency power supply (one that is at least 80% efficient at all load states), such as the Seasonic S12 series or the upcoming Ultra XConnect 600W
- a Core 2 Quad (it's based on the Conroe core, which is one of the most efficient chips PPW-wise)
- 8800 GTX (hailed by NVIDIA as its most power-efficient graphics card yet)
- nForce 680i SLI motherboard
- 24" LCD widescreen monitor
All in all, using PPW makes the newest system seem (at least on paper) the most efficient setup. A computer like this is liable to be one of the fastest desktops commercially buildable, and according to all manufacturing specs it should be quite a bit more efficient than that old dinosaur you used to be running. So why is it that our power supply companies are making up to 2000W models now?!
What nobody has bothered to mention is that PPW leaves the number of watts the product actually draws out of the calculation entirely. It's a great game of statistical manipulation: rather than focusing on the total draw of the hardware, we focus on the better efficiency we're getting from each measurable unit of that draw. This one itty-bitty detail hides a simple fact - we're continuing to suck down more and more power, and it's costing us an ever-growing sum.
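To see how the trick works, consider a quick hypothetical comparison (the cards and numbers below are invented purely for illustration): two cards can post exactly the same performance-per-watt figure while one of them pulls nearly twice as much power from the wall.

```python
# Hypothetical cards and figures, invented purely to illustrate the point.
cards = {
    "LastGenCard": {"fps": 60,  "watts": 100},   # modest performance, modest draw
    "NewHotCard":  {"fps": 111, "watts": 185},   # ~85% faster, but 85W more draw
}

for name, card in cards.items():
    ppw = card["fps"] / card["watts"]            # performance per watt
    print(f"{name}: {card['fps']}fps at {card['watts']}W -> {ppw:.2f}fps per watt")

# Both cards come out at roughly 0.60fps per watt, so the "efficiency" figure
# is identical - yet the newer one still pulls an extra 85W from the socket.
```

Same headline efficiency, very different draw at the wall - and it's only the first number that makes it onto the marketing slides.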
That "energy efficient" setup I mentioned before actually looks a bit like this:
- CPU: 130W
- GPU: 185W
- Motherboard: 35W (estimated; chipset figures are rarely disclosed)
- PSU loss (20% of the draw at the wall): 87.5W
- Total: 437.5W
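For the curious, the back-of-the-envelope sums behind that total go something like this - a rough sketch that assumes an 80% efficient PSU, and where the hours of use and electricity price are my own illustrative guesses rather than quoted figures:

```python
# Component figures from the list above (watts at full load)
cpu, gpu, motherboard = 130, 185, 35
load = cpu + gpu + motherboard             # 350W delivered by the PSU

psu_efficiency = 0.80                      # an 80%-efficient unit
wall_draw = load / psu_efficiency          # 437.5W pulled from the mains
psu_loss = wall_draw - load                # 87.5W turned straight into heat

# Rough running cost - the hours and price here are assumptions, not data:
hours_per_day = 4                          # assumed full-load gaming time
price_per_kwh = 0.10                       # assumed price in pounds per kWh

yearly_kwh = wall_draw / 1000 * hours_per_day * 365
print(f"Wall draw: {wall_draw:.1f}W ({psu_loss:.1f}W of it lost in the PSU)")
print(f"Roughly {yearly_kwh:.0f}kWh a year, or about £{yearly_kwh * price_per_kwh:.0f}")
```

Even with those conservative guesses, the number to remember is the headline one: roughly 437W from the wall under load.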
That's what you need just to turn the computer
on, and that doesn't include hard drives, USB devices, BR/HD-DVD drive, or monitor. And lots of that energy isn't even being used for processing - there's all the energy being lost to heat (an 8800 GTX can easily exceed 90C at full load), then the energy being used to get the heat out of the system (fans aren't really free, either).
"It all comes back to that idea of real-world benefit."
The worst of it, however, is the real result. This system won't perform real-world tasks (even gaming) all that much better than a system at the top of the midrange, which can be built for less than half the cost and with under half the power consumption.
If you size your monitor down from 24" to 22", you can even get away with a resolution drop that makes an 8800 GTX unreasonably overpowered - all for the loss of 2" of diagonal viewing space. And the truth is, you'll probably get a better monitor, as larger panels tend to have more drawbacks in refresh, contrast, and brightness.
It all comes back to that idea of real-world benefit. If
Quake 4 is already running at 140fps, does 15-20% more make a real difference?
Your monitor is capped at 60Hz, no matter what. With that in mind, my personal answer is "no."
Now, I bet there are a few of you saying, "But I'm futureproofing!" The thing is, the price of the previous top-end hardware sinks like a stone the minute the next goliath card or chip comes out. It's the same argument I have with SLI - it's actually cheaper to buy one card that performs admirably now, and upgrade it when the next comparable major revision comes to light. And your electric bill will thank you.
"A Merom-based desktop could smoke many Athlon FX 60+ setups for well under half the wattage."
I don't mean to bash all new technology - I think it's important that these companies are making these strides in research and development. I just don't think that this is a good blueprint for a sustainable consumer product. Sustainable, to me, is the laptop, mini-ITX and console market - what can we squeeze out of the little bit of power and heat tolerance afforded in these tight spaces?
Battery life has made chips like the Merom and Turion all about
real efficiency, and a Merom-based desktop could smoke many Athlon FX 60+ setups for well under half the power draw. And the truth is, it's more computing power than most of us will need for another year or two. So why not just upgrade then, and save the cash instead of overbuying now?
We as gamers need to start being aware of the hidden costs of ridiculous excess, and start demanding products that use the term "efficiency" intelligently. Rather than a card that offers a 100% performance increase for a 10% power increase, how about one that offers a 50% increase for a 30% power
decrease? That, my friends, is sustainable growth.
I want to see companies try to cram more into less or, at the very worst, into the same. Right now, particularly in graphics (but we're back to it with CPUs, too), the research is all in "bigger, faster, better" or in "just enough to get by." There's a lot of ground between integrated graphics and an 8800 GTX, and it seems that companies are missing it. But that will only change when we as consumers start asking for it.
"The truth is, it's more computing power than most of us will need for another year or two."
It's time to put an end to this upward spiral, guys. More power, more power, more power can only get us so far, and we're already at the point of diminishing returns. Until we make another huge technological leap forward (and I'm not talking die sizes, unified shaders, etc... I mean something big), I think it's high time we start focusing on making the real Earth a better place, rather than trying to get a better view of the sun (which will be about equal in temperature to the chips in my computer).
And if we
still insist on this Space Race, there had better be a monkey included with my next GPU purchase. I always wanted a monkey.