The great ray tracing and rasterisation debate
Things moved swiftly onto ray tracing – a topic that has been discussed pretty extensively over the past couple of months. Many developers have come out and said that they don't see ray tracing as a viable alternative to rasterisation.
Even Michael Chien, Intel's Director of Research, admitted during the Intel Developer Forum in Shanghai that there wasn't going to be a binary switch from the current rasterisation techniques to ray tracing.
Shortly after that, Nvidia's CEO and President, Jen-Hsun Huang, said at the company's Financial Analyst Day that Nvidia loves ray tracing. It was a fitting time to ask David whether he sees ray tracing slowly replacing rasterisation in the future, and whether he believes rasterisation will still be around in ten years' time.
"I think rasterisation will definitely be around in ten years time. So will scanline rendering and so will ray tracing for certain kinds of visual effects. But if you look at Pixar's movies as an example – they all use ray tracing, but the only movie that they extensively use ray tracing in is
Cars.
Cars is one of the few movies that makes extensive use of ray tracing, according to Kirk.
"People use ray tracing for real effects as well though. Things like shiny chains and for ambient occlusion (global illumination), which is an offline rendering process that is many thousands of times too slow for real-time," said Kirk. "Using ray tracing to calculate the light going from every surface to every other surface is a process that takes hundreds of hours."
During the Analyst Day, Jen-Hsun showed a rendering of an Audi R8 that used a hybrid rasterisation and ray tracing renderer. Jen-Hsun said that it ran at 15 frames per second, which isn't all that far away from being real-time, so I asked David when we're likely to see ray tracing appear in 3D graphics engines at genuinely real-time frame rates.
"15 frames per second was with our professional cards I think. That would have been with 16 GPUs and at least that many multi-core CPUs – that's what that is. Just vaguely extrapolating that into our progress, it'll be some number of years before you'll see that in real-time," explained Kirk. "If you take a 2x generational increase in performance, you're looking at least four or five years for the GPU part to have enough power to render that scene in real-time.
"You can cheat to achieve some effects through rasterisation and it won't cost as much [from a performance perspective] – for what is a very small benefit in image quality," he continued. I asked if this was a dilemma that developers faced going forwards if they're looking to adopt ray tracing, as they're essentially going to have to go back five years in terms of image quality in order to implement it.
"Why would they throw away everything that works in order to adopt something new that has been built from scratch. I think that the APIs will evolve to include more ray traced components. There's plenty more for us to do [in graphics] and we're only scratching the surface at the moment."
Note: this interview was completed two days before Intel's Tom Forsyth revealed that Larrabee's main focus for graphics was on rasterisation.