The DX10 version, which we ran using an Nvidia GeForce 8800 GTX, 2GB of memory and an Intel Core 2 Duo E6600 running at 3.0GHz (9x333) under Windows Vista Ultimate, showed off the best smoke and fire effects we've ever seen.
Chucking a grenade through a building window brought up not just a huge, jaw-droppingly lovely firebomb that ripped out through the windows, but also a billowing cloud of black smoke so detailed that members of the assembled crowd actually coughed at it. It was so amazing to watch that we spent two or three minutes just shooting missiles at a cliff face in order to watch the clouds billow out.
In the comparison screenshots we noticed that the explosions actually looked quite similar in both versions, and that the enhancements came from the better shadows and motion blur in DX10, which made that version seem so much more dramatic and involving. The actual explosion and smoke effects were, as far as we could tell, identical to the DX9 version.
Lost Planet won extra kudos for its snow effects on the ground, too. While we thought it looked pretty good to start with, it looked even better in DX10, where the snowfall broke where we walked and waves of the powdery stuff rose up around us, issuing out like God's love spray. Again, it looked good in the DX9 version, but the shadows and motion blur in the DX10 demo put it ahead by a nose, despite the slight performance hit. Close inspection of the DX9 shadows showed them to be slightly blocky around the edges when compared to the high-quality DX10 shadows.
Medium and High shadows under DirectX 10, click to enlarge.
The snow effects on the ground were also strangely absent around some monsters in the DX9 levels, though in DX10 the snow was realistically kicked up in front of all the monsters as they walked and moved about the piste.
The major difference between the two versions, at any rate, was the high-quality shadows available in DX10, which really made the game look better on that platform. Rounding off the edges and smoothing out areas of shade has a massive effect on a game that's otherwise bright white. Unfortunately, the high-quality shadows of the DX10 version also caused a massive performance hit, which conveniently brings us to...
The performance of Lost Planet under DX9 was very solid on the GeForce 8800 GTX. With everything on full (medium shadows is the maximum under DX9 - Ed.) we ran the demo's included benchmark and got an average of 30 fps at 1920x1200 4xAA 16xAF in the Snow section of the benchmark and 45 fps in the Cave test.
We also tried GeForce 8800 GTX SLI, which wasn't natively supported in the DX9 version of the game - we had to force AFR1 in Nvidia's driver control panel in order to get decent scaling. We managed to hit 57 fps in the Snow test and 56 fps in the Cave.
Lost Planet DX10 Demo, click to enlarge.
In the DX10 version, things were a little worse off. Running the same benchmark with the same settings on a single GeForce 8800 GTX gave 25 fps in the Snow test and 39 fps in the Cave. That's pretty good performance considering the more realistic motion blur effects (which are really hard to catch on camera).
We also decided to try the high-quality shadows, since they looked appreciably better than the sometimes blocky medium shadows. This really killed performance: the average in the Snow test dropped to just 15 fps, while the Cave didn't fare much better at 19 fps. Ouch.
Thanks to Nvidia's recently-released DirectX 10 SLI driver, we could turn SLI on in the DirectX 10 demo, so we measured performance again. We were actually really surprised at how well it scaled, with 29 fps in the Snow test and 37 fps in the Cave - that's not far off 100 percent scaling and a good sign for SLI performance in future DirectX 10 content. Once again, we can't stress enough how good it looked, though with everything pimped out to full the framerate still fell at certain points during the demo. It's playable at 1920x1200 4xAA 16xAF... but only just!
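As a quick sanity check on that scaling claim, the jump from the single-card high-quality-shadow figures (15 and 19 fps) to the SLI figures (29 and 37 fps) works out as follows. This is just a sketch of the arithmetic using the numbers quoted above; the `sli_scaling` helper is ours, not part of any benchmark tool.

```python
def sli_scaling(single_fps: float, sli_fps: float) -> float:
    """Return SLI scaling as a percentage, where 100% means a perfect 2x speed-up."""
    return (sli_fps / single_fps - 1.0) * 100.0

# DX10, high-quality shadows, 1920x1200 4xAA 16xAF
print(f"Snow: {sli_scaling(15, 29):.0f}% scaling")  # prints "Snow: 93% scaling"
print(f"Cave: {sli_scaling(19, 37):.0f}% scaling")  # prints "Cave: 95% scaling"
```

Both tests land in the low-to-mid 90s, which is indeed not far off the ideal 100 percent.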
Editor's note: Because this game demo was shipped by Nvidia, we have not tested the performance of AMD's ATI Radeon HD 2900 XT against the GeForce 8800 GTS 640MB. At the ATI Radeon HD 2900 XT launch, we were given a DirectX 10 demo of Call of Juarez and, for the same reasons, we have not published comparative numbers from that demo either.
Rest assured, we will be revisiting DirectX 10 performance across a range of games as and when they arrive in retail.
Lost Planet DX10 Demo, click to enlarge.
There's no doubt about it: Capcom has come up aces with the PC version of Lost Planet. This may only be a demo we were playing with, but if the code plays this well in its un-optimised and unfinished form, we can't wait to see what the final game will look like. True, the game isn't flawless at this point; the AI proved itself a foolish beast, and there were a few times when clipping issues ruined otherwise gorgeous scenes of gory Akrid-slaughtering gaming.
For the vast majority of the time, though, Lost Planet: Extreme Condition proves itself to be an awesome game in the making, even if it is a bit of a system hog in its DX10 flavour. It's a little early to say which version is better, but from what we've seen it's actually the DX9 version that just about manages to come out ahead in the end, if only because it's compatible with XP and the graphics really aren't that much worse off.
One thing's for sure - we simply cannot wait for the full game.
Both versions of the Lost Planet demo are available here.