DirectX 11 to Reveal a New Era of Graphics

Editor’s Note 4/49/08 12:36 PM: OK, I’ll admit it. I got HAD by this article. It was based on an April Fools’ joke and made its way to PCMech well after April Fools. I’ll leave it up here since people have already commented on it. I’m not sure if Nathan (the author) knew it was a farce, but one thing is for sure: I need to pay much better attention when I’m publishing guest posts for PCMech. Sheesh…


It has been announced that DirectX 11 will include a completely new type of graphics rendering called ray tracing. Wait a minute. It’s not new. In fact, it’s been around since the ’80s. So why did it take so long to be implemented for public use? How does it work? What advantages does it have over current-gen graphics? These questions are about to be answered.


Ray tracing was first introduced in 1986, and is basically defined as tracing the paths of light rays as they interact with objects. This is essentially what our eyes do, so it creates quite a vivid and realistic picture. Unfortunately it wasn’t practical for everyday graphics because it took so much raw power to compute. It was used sparingly in the ’90s, but only for demonstrations; now, in the 21st century, multi-core technology is finally making ray tracing practical.
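The core idea fits in a few lines. Below is a minimal sketch in plain Python (my own illustration, not any vendor’s API) of the very first step a ray tracer performs: firing a ray into the scene and testing whether it hits an object, here a single sphere.

```python
import math

def intersect_sphere(origin, direction, center, radius):
    """Return the distance t to the nearest sphere hit along the ray, or None."""
    # Solve |origin + t*direction - center|^2 = radius^2, a quadratic in t.
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c  # direction is assumed unit length, so a == 1
    if disc < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

# Cast one primary ray from the eye straight down the z-axis at a unit
# sphere centred three units away; it hits the near surface at t = 2.
t = intersect_sphere((0, 0, 0), (0, 0, 1), (0, 0, 3), 1.0)
print(t)  # 2.0
```

A real renderer repeats this test for millions of rays per frame and then spawns secondary rays for reflections and shadows, which is exactly where the computational cost explodes.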

So what happened? Well, the movie industry took advantage of it right off the bat. Many special effects were ray-traced to give a more realistic look. The movie Beowulf was entirely ray-traced. It wasn’t perfect, but it was damn close, and a heck of a lot better than what people have now. To give you an example of how much power ray tracing takes, though, someone posted a YouTube video of real-time ray tracing of a convertible, and it takes the combined effort of THREE PS3 consoles. You can check it out here, it’s pretty cool. Remember, each PS3 has 8 processors (6 active), so we are looking at roughly 20 processors for one non-moving object.

Hmm. This is starting to explain some things, like why Nvidia did not support DX10.1 on its 9-series cards, and why those cards brought no new technology other than a smaller chip process. Nvidia realized that the old way of doing graphics is dying. What’s the point of DirectX 10.1 anyway? Rasterisation, the technique both Nvidia and ATI use, has reached its peak. They have both perfected the art of essentially faking graphics. Now it’s time for the real stuff. It’s an open field, and apparently Intel is planning on joining the competition. Intel has recently been experimenting with combining a processor and a graphics card, with successful results. This could spell bad news for both ATI and Nvidia, but knowing the way Intel prices things, I’m sure there will still be close competition.

An interesting thing about ray tracing is that it is highly scalable. With rasterisation, you notice less and less with each improvement. For example, the new 8-core Skulltrail beast from Intel earns gamers only a few extra FPS under rasterisation. With ray tracing, however, an 8-core system will be nearly 8 times faster than a single core. So what will this do? Well, there will probably be a new multi-core processor every couple of months, possibly reaching over 100 cores before 2010. If each has ray-tracing graphics built in, you can see the benefit over buying a separate graphics card.
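The scaling claim rests on the fact that every ray, and therefore every pixel, can be traced independently of all the others, so the work divides cleanly across cores with no synchronisation. A toy sketch of that partitioning in plain Python (the function name is my own, not from any real renderer):

```python
def partition_scanlines(height, workers):
    """Split image rows into near-equal contiguous chunks, one per worker.

    Because each ray is independent of every other ray, each worker can
    trace its rows without talking to the others, which is why ray
    tracing scales roughly linearly with core count.
    """
    base, extra = divmod(height, workers)
    chunks, start = [], 0
    for _ in range(workers):
        size = base + (1 if extra > 0 else 0)  # spread the remainder
        extra = max(0, extra - 1)
        chunks.append(range(start, start + size))
        start += size
    return chunks

# A 1080-row frame split across 8 cores: 8 chunks of 135 rows each.
chunks = partition_scanlines(1080, 8)
print([len(c) for c in chunks])  # [135, 135, 135, 135, 135, 135, 135, 135]
```

Rasterisation pipelines, by contrast, have serial stages and shared state, which is why extra cores buy them much less.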

Benefits of Ray Tracing

By now you probably want to see what ray tracing can do compared to rasterizing, so take a look at the image on the right. As you can plainly see, the ray-traced image has more realistic reflections and shadows. Nvidia has worked its butt off on its 3D shader processors, but it could never get anything close to this. It’s very encouraging to see the difference, but remember, we are a ways off from getting objects of that clarity running interactively on our computers. DirectX 11 is only going to support a few limited things, so that the transition to ray tracing is gradual rather than all at once. I won’t be surprised if Ray-Tracing Processing Units (RPUs) are implemented on the Nvidia 10-series cards. At first, maybe only characters are ray-traced. Then, as new hardware is introduced, textures and objects within a certain draw distance are ray-traced, until eventually everything as far as the eye can see is ray-traced and rasterisation becomes a thing of the past.
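The shadow difference comes down to how each method answers the question “can this point see the light?”. Rasterizers approximate the answer with tricks like shadow maps; a ray tracer answers it exactly by firing a shadow ray toward the light and checking for blockers. A minimal sketch in plain Python (hypothetical names, spheres only):

```python
import math

def hit_sphere(origin, direction, center, radius):
    """Distance to the nearest intersection with a sphere, or None."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c  # assumes a unit-length direction
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def in_shadow(point, light, blockers):
    """True if any blocker sphere sits between the point and the light."""
    to_light = tuple(l - p for l, p in zip(light, point))
    dist = math.sqrt(sum(v * v for v in to_light))
    direction = tuple(v / dist for v in to_light)
    for center, radius in blockers:
        t = hit_sphere(point, direction, center, radius)
        # Only a hit closer than the light itself casts a shadow.
        if t is not None and t < dist:
            return True
    return False

# A unit sphere at (0, 0, 5) sits between the shaded point and the light.
print(in_shadow((0, 0, 0), (0, 0, 10), [((0, 0, 5), 1.0)]))  # True
```

Because the shadow ray is tested against actual scene geometry, the result is pixel-exact, with none of the aliasing or bias artifacts shadow maps produce.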

And This Matters Because…

Is this a good thing? Maybe. Everything would be a lot more predictable, and you would be able to confidently tell which brand of graphics card is better just by looking at the data sheet, unlike today, where the only real way to tell which of two cards is better is by rigorously testing them in 3D programs, measuring their temperatures, calculating wattage, and so on. So there will be one of two consequences. Either we will finally end the numbers game by being able to tell what is what without any background info, or, more likely, the industry will simply enter the next stage of confusing the public in return for profit.


  1. Every single statement in this article is false.

  2. Hey, this could make Vista actually worth something, if they make it DX11-compatible in SP2 or something

  3. Seriously, the number of inaccuracies in this article is incredible. And the graphics comparison? Just.. wow. At least you credited Intel, which should (but probably won’t) suggest to most readers that it’s totally biased.

  4. I don’t get it. I’ve been watching high-resolution HD video on my PC for several years now. Are they just making an HD chip for man-made graphics people? I am sure it is more marketing than results, and they are targeting billion-dollar game-making companies. Talk about Avivo encoding next time. Much more to do with reality…

  5. Sorry, but you might need to study a bit more about the actual differences between the two rendering methods and how those two ‘sample’ images were actually produced. Fields to study: fixed function versus pixel shaders (go way back to DX6 for that first image, lol), hype and marketing, reflection mapping, shadowing, conceptual art. Ray tracing is the future, but misrepresenting it and the technology that precedes it does neither a service and is inaccurate reporting.

  6. I don’t usually read this site…are PCMech articles always this BS-filled? Ouch. Nice use of a 10-year-old-looking rasterized image to drive misinformation.

  7. Brian Srivastava says:

    IIRC this was an april fools post gone awry.

    Not that the principle of adding ray tracing to DX is unreasonable. OpenRT aims to be ray tracing in the style of OpenGL, and presumably there will be a DirectRT in future.

    However, the basic mathematics sort of falls apart. Ray tracing, while demonstrably practical for older engines on extremely fancy setups with 16 CPU cores, is probably 2 or 3 hardware generations from being implementable at that level on the desktop, and another generation or 2 away from being able to do that on (what will then be) modern engines. My research is on RT on GPU hardware, which faces essentially the same curve: theoretically possible now, but don’t expect anything anyone would want to actually use any time soon.

    It is possible, however unlikely, that one of AMD, Intel or Nvidia will release hardware custom-built for RT in the same way current GPUs were for rasterization, but even there you’re looking at years for it to filter down much (AFAIK the University of, I think it’s, Saarland has something along these lines already, but I have no idea how good it is or the costs). That would create a whole slew of questions with compatibility etc. as well, which means we likely won’t see that for some time (though likely that will be the future).

    Granted, MS may decide that DX10.x is as far as they need to go with raster graphics and DX11 will include ray tracing at some point in the distant future, but don’t expect it to be anything other than an exercise for academics and researchers for a few more years. I would also say, given the relatively bland reception DX10 has received, the MS marketing types will be happy to shove something out the door, no matter how minor an improvement, call it DX11, and hope for some better press.

  8. “It has been announced that DirectX 11 will include a completely new type of graphics rendering called ray-tracing.”

    yeah, on April 1, lol !

    Besides that, nearly every argument in this article is crap.

  9. Oscar Forth says:

    lol … Most un-informed article … EVER!

    For those that know anything about ray tracing … the problem isn’t ray tracing, it’s the scene changing. Updating the scene optimisation trees is a nightmare.

    Also … am I the only one who really thinks that if you can render Quake 4 at 100 FPS on an Intel chip, then ONE PS3 should be able to render that vehicle?

  10. Grayson Peddie says:

    Geez… DirectX 11? I kind of get the feeling that I might have to hold off on building a computer until the day DX11 comes out.

    Why can’t we just give DirectX 11 a rest until maybe the second half of next year or two? DirectX 10.1 has already been out since the days of ATI Radeon HD 3xxx series, but maybe not quite…

  11. David Risley says:

    This is, unfortunately, what happens when you blindly trust a guest author. Sorry guys. The author of this post (Nathan) passed this off as real to me and I took it at face value without bothering to fact check it. Completely my fault.

    No, this is not usual for PCMech. I just got completely HAD here.

  12. Brian Srivastava says:

    Don’t feel too bad, I nearly put a reference to it in my thesis, since the story actually broke the day before April Fools’ Day my time.

    Besides, it’s not that the premise is unrealistic; it’s coming. It’s probably not coming in DX11, but it’s coming.

    And yes, Oscar Forth, you’re the only one who thinks that. The Intel chip in question (and it wasn’t originally done on one chip, but I believe 4 machines using a PVM or equivalent, with 4 cores each) would be a 16-core behemoth that dwarfs the performance of any modern buyable Intel CPU, while at the same time being dramatically easier to program than a PS3. One of my pet projects (I make no pretext of being lead on this) is helping set up a small PS3 cluster for some astrophysics types; writing software for it is a bit different, clearly manageable (e.g. DMC4 and GTA IV), but both the 360 and PS3 trap themselves with a pitiful amount of main memory, and the funky architecture of the memory, SPEs and PPE probably hampers success a bit.

  13. What will happen to the people who spent thousands thinking DirectX 10 was the end?

  14. Not everything in this article is false. I have recently installed the beta for the new Windows OS, and with it comes DX11. So this may be based on an April Fools’ Day joke, but how does anyone know the joke was not just “expanding” on the truth? It may not ALL be false.

  15. hackerkid says:

    Haaaa, DirectX 11 is coming in July 2009 …
    And the ATI graphics products group of Advanced Micro Devices may be the first with its DirectX 11 graphics chip. According to rumours, the code-named ATI RV870 graphics processing unit will be launched in late July, whereas Nvidia’s code-named GT300 is projected to be released in October.
