Saturday, February 27, 2010

Old Timers vs Real-Timers

Some time ago I heard from a game-industry insider what the next big thing is... this luminary of knowledge said it's Global Illumination in Real-Time... aha!! Why didn't anybody think of that? 8)

I guess that's going to be the next fad... probably driven by CryTek, which now has some YouTube videos about this famous Real-Time GI... I mean, "exaggerated color bleeding".
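To make the jab concrete: the "cheap" version of GI usually boils down to a single indirect diffuse bounce with no visibility testing and no further bounces. A toy sketch of that idea (my own illustration, not CryTek's actual algorithm; the clamped form factor is just one common hack to avoid the distance singularity):

```python
import math

def bleed(receiver_albedo, emitter_albedo, emitter_radiance,
          distance, patch_area):
    """One diffuse bounce from an emitter patch onto a receiver point.

    Crude point-to-patch form factor (facing surfaces, inverse-square
    falloff, no occlusion) -- roughly what cheap "real-time GI" amounts
    to: one indirect bounce, nothing more.
    """
    # clamped form factor to avoid the singularity as distance -> 0
    ff = patch_area / (math.pi * distance * distance + patch_area)
    return [ra * ea * er * ff
            for ra, ea, er in zip(receiver_albedo, emitter_albedo,
                                  emitter_radiance)]

# white floor next to a red wall: the bounced light is tinted red
tint = bleed([0.8, 0.8, 0.8], [0.9, 0.1, 0.1], [1.0, 1.0, 1.0],
             distance=1.0, patch_area=1.0)
```

Run it and the red channel of the bounced light dominates: that's the "exaggerated color bleeding" in a nutshell. Proper GI would also need visibility, multiple bounces, and glossy transport.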

Aside from the fact that cheap color bleeding is far from proper global illumination, the video doesn't even try to show any decent anti-aliasing.. because of course, that company is in the business of doing more stuff in real-time rather than getting the basics right.
If you ask Pixar engineers, they'll tell you that they don't even dare dish out cheap anti-aliasing to the artists. Not today, not 20 years ago.
But in games we don't care, and we have to swallow the BS MSAA that NVidia, AMD and everybody else give us.
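For readers who haven't looked at this closely: the difference between one sample per pixel and proper supersampling is easy to see even in one dimension. A minimal sketch (brute-force stratified supersampling of an edge's pixel coverage, which is the quality bar MSAA only approximates, and only at triangle edges):

```python
def edge_coverage(px, edge_x, spp):
    """Estimate how much of pixel [px, px+1) lies in the region
    x < edge_x, by averaging spp stratified samples across the pixel."""
    hits = sum(1 for i in range(spp)
               if px + (i + 0.5) / spp < edge_x)
    return hits / spp

# edge at x = 3.25: pixel 3 is exactly 25% covered
one_spp = edge_coverage(3, 3.25, 1)    # single sample: all-or-nothing
many_spp = edge_coverage(3, 3.25, 16)  # 16 samples: close to 0.25
```

With one sample the pixel snaps to fully-on or fully-off (that's the jaggy); with 16 it converges to the true 25% coverage. Production renderers go further still: stochastic sample placement, good reconstruction filters, and enough samples that artists never see the raw edge.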

You can bet that a bunch of smaller developers will be rushing to match CryTek with their "Real-Time exaggerated color-bleeding". This is just how things are..
My suggestion is to instead follow what Id Software does.. the MegaTexture stuff.. that's what matters more: dealing with large quantities of data, and streaming that data.
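The heart of that kind of system is keeping a fixed budget of texture tiles resident and streaming the rest on demand. A minimal sketch of such a resident-tile pool with least-recently-used eviction (a hypothetical illustration of the general technique, not id Software's actual code; `load_tile` stands in for whatever reads a tile off disk):

```python
from collections import OrderedDict

class TileCache:
    """Fixed-capacity pool of resident texture tiles with LRU eviction,
    in the spirit of virtual texturing / MegaTexture streaming."""

    def __init__(self, capacity, load_tile):
        self.capacity = capacity
        self.load_tile = load_tile      # e.g. reads a tile from disk
        self.resident = OrderedDict()   # tile id -> tile data

    def fetch(self, tile_id):
        if tile_id in self.resident:
            self.resident.move_to_end(tile_id)  # mark recently used
            return self.resident[tile_id]
        if len(self.resident) >= self.capacity:
            self.resident.popitem(last=False)   # evict the LRU tile
        data = self.load_tile(tile_id)
        self.resident[tile_id] = data
        return data

cache = TileCache(capacity=2, load_tile=lambda tid: f"pixels:{tid}")
cache.fetch((0, 0)); cache.fetch((0, 1))
cache.fetch((0, 0))   # touch (0,0) so (0,1) becomes least recently used
cache.fetch((1, 0))   # over budget: (0,1) gets evicted
```

The real thing adds prioritized async I/O, mip-level fallback while a tile is in flight, and GPU-side indirection tables, but the budget-plus-eviction core is the same, and that's the engineering problem worth copying.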

The average player can be fooled into thinking that he/she (most likely "he") is seeing something awesome because "shit blows up".. and that player can't quite spell out what's wrong with the quality.. but then another CG movie comes out, and for some reason it doesn't look quite like the games on the monitor..

Programmers also need to be able to judge things for themselves. Graphics programmers should forget about real-time for a minute and go investigate what sets production rendering apart.. acknowledge the gap by analyzing image quality rather than overcompensating by cranking up the HDR bloom.