Interview: AMD on Game Development and DX11

January 6, 2010 | 12:08

Tags: #11 #5850 #5870 #batman #console #consolification #cypress #developer #directx #dirt2 #dx #gpu #huddy #interview #manager #physics #relations #richard #saboteur

Companies: #amd #games

bit-tech: Do you think we'll see more of this kind of behaviour in the future?

RH: I hope we'll see less - I hope that both Rocksteady and Eidos feel they've been dragged into a marketing war that isn't in their interest or the consumers' interest by allowing that in, and that they will decide not to allow it in future. I certainly hope other developers and publishers will make the same decision.

If I've done anything to embarrass Rocksteady or Eidos in any of my discussions about this, then I hope the only effect is that people say, "you know what, we should be embarrassed about it." I'm not trying to force the issue, but I hope Nvidia will realise it's a poor tactic and a poor-quality approach, and not one that would be taken by any tech leader.

bit-tech: Recently Nvidia disabled PhysX in the drivers if you're using an ATI card as the primary graphics adapter.

RH: They don't want to QA it. The PC is an open platform, though - you're meant to be able to take any two parts and put them together. Intel don't say "we're not prepared to QA our CPUs with Nvidia's or AMD's graphics parts" - they obviously spend time QAing them, because you want to build a system that works.

bit-tech: Given Nvidia licensed its own MSAA technology for Unreal Engine 3, why don't you just do the same thing? Put your code in as well, so that when the game detects your vendor ID it uses that code instead.

RH: We're currently working with Eidos and we want that to be in there in a future update. That's not a commitment to it, but we are working with Eidos to make it happen, because I believe it's in every consumer's interest.
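For context, vendor-specific code paths like the one being discussed typically hinge on reading the GPU's PCI vendor ID at start-up. Below is a minimal illustrative sketch of how a Direct3D title on Windows might do that via DXGI - the function names are ours, and nothing here is taken from Batman: Arkham Asylum or any actual Eidos code.

#include <windows.h>
#include <dxgi.h>
#pragma comment(lib, "dxgi.lib")

enum class GpuVendor { Nvidia, Amd, Other };

// Query the primary display adapter's PCI vendor ID through DXGI.
GpuVendor DetectPrimaryGpuVendor()
{
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return GpuVendor::Other;

    GpuVendor vendor = GpuVendor::Other;
    IDXGIAdapter* adapter = nullptr;
    if (factory->EnumAdapters(0, &adapter) == S_OK)  // adapter 0 is the primary
    {
        DXGI_ADAPTER_DESC desc = {};
        adapter->GetDesc(&desc);
        if (desc.VendorId == 0x10DE)      vendor = GpuVendor::Nvidia;  // Nvidia's PCI vendor ID
        else if (desc.VendorId == 0x1002) vendor = GpuVendor::Amd;     // AMD/ATI's PCI vendor ID
        adapter->Release();
    }
    factory->Release();
    return vendor;
}

A game built this way can then select its MSAA path with a simple branch on the returned value, which is the pattern Huddy is describing.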

The Saboteur lacked ATI support due to last-minute changes, but was quickly patched

bit-tech: Recently, Saboteur launched without any support for ATI graphics hardware - although it was patched later on. How did that happen?

RH: It was a mess of timing. The developer put in a change at the very last moment which unfortunately relied on a particular behaviour of the driver. Two changes in our driver and [this change] in the game took place at the same time and we didn't catch that - we should have done, and that's a straightforward failing on our part. We worked as closely as we could with the game developer to make the patch available as quickly as possible, but we messed up on that occasion. If you want to catch me on these I'll put my hands up to every one of them. I wouldn't want to discourage you from catching me on this, because I want you to go and catch Nvidia as well. So if you've got a list of titles where you think we failed, bring it out, because a PC should just work.

bit-tech: That brings us to the consolification of games. More and more developers are leaving PC gaming or pushing it into second place as issues of piracy, revenues, QA time and so on continue to mount. How do you encourage developers to commit to a PC release, let alone DirectX 11?

RH: It's a good question, and one console vendors will have to ask themselves too once cloud computing becomes practical - do you really need a console to do the job for you, or can you rely on a nice, fast broadband connection instead? All of us are looking at ways to improve the value associated with the particular platform we work on. As the company that supplies the graphics parts for the Xbox 360, I have to say I don't cry too much if someone plays on a 360 instead, or even on a Wii [where ATI graphics are also present], because there's something like 90 million units between those two, all featuring AMD chips - so I'm not exactly against console gaming.

bit-tech: Do you see the next generation of consoles being cloud computing clients, then?

RH: No, definitely not, because I think it's nice to have a decent chunk of horsepower available to you locally - that can make a big difference, particularly in a twitchy game where a hundredth of a second counts. I don't think we'll go completely to cloud computing in one generation, but we'll have more access to it in the next generation. The OnLive stuff was an interesting route to go down.

bit-tech: Will PC games move towards a cloud model faster, especially given there's more chance to try things on a continually upgradable computer?

RH: It's a matter of choosing the experience you like best. Maybe you'll end up playing Crysis on a mobile phone, but it's not quite the real experience. If you have a large 30in monitor, or Eyefinity with three monitors so that even your peripheral vision is engaged, then that's a better, richer visual experience.

bit-tech: Will we see more Eyefinity titles?

RH: At two hundred bucks per monitor to fill in my peripheral vision, yes absolutely!

bit-tech: I watched PC Pro try Burnout on it the other day, and because most PC games are designed with the HUD around the edges, you have to physically move your head around to see it on the peripheral screens. Will there be driver updates to move these elements to the centre screen, or are you working with game developers to patch/optimise for Eyefinity?

Burnout Paradise works over three monitors via ATI's Eyefinity, but has HUD issues

RH: We won't, in this generation, be able to put in a fix that takes something like Burnout and moves the HUD into the centre. Most developers who've looked at this on something such as Matrox's TripleHead2Go have realised it's important not to just widen things out and spread elements in the same kind of ratios - it's important to focus on the central display. Burnout is probably in the minority by pushing its HUD that far out.

bit-tech: A lot of FPS games have health in one corner and ammunition left in the other?

RH: Well, if all they do is stretch it out, that's disappointing and it's the wrong way to go. Obviously you should bring those elements onto the centre screen and just use the other screens for peripheral vision, which is what Dirt 2 does.

Fixing that in the driver would be quite hard, because games use their own shaders to put things in specific places. We could do it, but I don't think it's worth the effort; we're concentrating on engaging future games and on a select number of existing titles that we hope to get patched. There's an API coming for it which will be publicly documented, and if Nvidia want to go down the same kind of route they can use the same API and exactly the same kind of functionality, although they can't call it Eyefinity. To us it's not about shutting people out - it's about innovation: doing it first and doing it well.
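As an aside, the arithmetic behind keeping the HUD on the centre display of a spanned surface is straightforward. Here's a minimal sketch of our own - not AMD's forthcoming API, which wasn't public at the time - assuming equal-width monitors in a single row:

struct Viewport { int x, y, width, height; };

// Compute the rectangle covering the middle monitor of a horizontally
// spanned surface, so HUD elements drawn inside it stay in front of the player.
// Assumes an odd number of equal-width monitors in one row.
Viewport CentreScreenViewport(int spannedWidth, int spannedHeight, int monitors)
{
    const int screenWidth = spannedWidth / monitors;
    return { screenWidth * (monitors / 2), 0, screenWidth, spannedHeight };
}

For a typical three-by-1080p Eyefinity setup, CentreScreenViewport(5760, 1080, 3) gives {1920, 0, 1920, 1080} - exactly the middle monitor, which is the Dirt 2 approach Huddy describes.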