Friday, March 20, 2009

Softball

It looks like I won't be leaving my job just yet 8)
I made my case and, after some consideration, I got an offer that I thought was a good compromise.
In any case, I most likely would have waited for the project to end..

Now I'm getting ready to depart for San Francisco in a couple of days, where I'll be attending yet another GDC 8)
This time around I got a somewhat decent company laptop (a small Vaio) with a proper VPN connection that lets me access the company's email, servers and Perforce depot (where all the code and data is stored).
So I can access all the code I may need for meetings, and I even have Visual Studio installed.. except the laptop couldn't possibly run the current run-time, which won't even start without Direct3D 10. And to run (somewhat) smoothly it requires 64-bit Vista, a 4-core CPU, a powerful GPU and 8 GB of RAM 8)
That's my development machine at work, though more recently I've been moving to a Core i7, 12 GB of RAM and even SSDs.. two of them in RAID 0!

..it's just that the crazy amount of data, textures and animation goes beyond the usual notion of real-time.
I remember when we hit the 32-bit limit.. it's really easy to do, because on 32-bit Vista, 32-bit memory addressing really means just ~2 GB per application (and the OS itself can only see around 3 GB of physical memory anyway).
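
Just to put a number on it, here's a toy probe (nothing from our codebase; the 16 MB block size is arbitrary) that keeps allocating until the address space runs out.. built as a 32-bit executable it typically gives up a bit short of 2 GB:

```cpp
// Toy probe: keep allocating 16 MB blocks until malloc() fails, then report
// roughly how much address space the process managed to grab. Built as a
// 32-bit executable on 32-bit Vista this tends to stop around ~2 GB.
#include <cstdio>
#include <cstdlib>
#include <vector>

int main()
{
    const size_t blockSize = 16 * 1024 * 1024;  // 16 MB per allocation
    std::vector<void *> blocks;

    while (void *p = malloc(blockSize))
        blocks.push_back(p);

    printf("Allocated ~%u MB before running out of address space\n",
           (unsigned)(blocks.size() * 16));

    for (size_t i = 0; i < blocks.size(); ++i)  // be polite and free everything
        free(blocks[i]);

    return 0;
}
```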

The problem with real-time is that one needs to keep enough data around to render a whole sequence. Streaming and compression help a lot, but one still needs some beefy hardware.
An off-line renderer may take one or two days for a single frame of animation.. and that one frame may require several gigabytes' worth of uncompressed data.
In most cases the textures and geometry persist across a few frames, so in theory one could keep them around in RAM for the next frame.. but then one does indeed need lots of RAM.
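
In other words, a frame-persistent cache. A little made-up sketch of the idea (the names and the eviction policy are invented for illustration, it's not our actual streaming code): an asset stays resident as long as recent frames keep requesting it, and gets dropped after a few idle frames.

```cpp
// Illustrative frame-persistent cache (names and policy are made up):
// assets stay in RAM while frames keep requesting them, and get evicted
// once they haven't been touched for a few frames.
#include <map>
#include <string>
#include <vector>

struct Asset
{
    std::vector<unsigned char> data;        // decompressed texture/geometry bytes
    int                        lastUsedFrame;
};

class AssetCache
{
public:
    AssetCache() : mCurFrame(0) {}

    void BeginFrame() { ++mCurFrame; }

    // Return the asset, loading it only if it isn't already resident
    Asset &Acquire(const std::string &name)
    {
        Asset &a = mAssets[name];
        if (a.data.empty())
            LoadFromDisk(name, a.data);     // expensive path: read & decompress
        a.lastUsedFrame = mCurFrame;
        return a;
    }

    // Drop anything that hasn't been requested for 'keepFrames' frames
    void EvictStale(int keepFrames = 4)
    {
        std::map<std::string, Asset>::iterator it = mAssets.begin();
        while (it != mAssets.end())
        {
            if (mCurFrame - it->second.lastUsedFrame > keepFrames)
                mAssets.erase(it++);        // pre-C++11 safe map erase idiom
            else
                ++it;
        }
    }

private:
    static void LoadFromDisk(const std::string &name, std::vector<unsigned char> &out)
    {
        // Placeholder: a real loader would stream and decompress from disk
        (void)name;
        out.resize(1);
    }

    std::map<std::string, Asset> mAssets;
    int                          mCurFrame;
};
```

Per frame one would call BeginFrame(), Acquire() whatever that frame needs, and EvictStale() at the end.. the RAM cost is basically the working set of the last few frames.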

ummmumm

Speaking of off-line rendering.. RibRender is now somewhat able to run very basic shaders.. though much is still missing.
Lighting parameters aren't there yet, but the shading "virtual machine" runs. I've improvised some sort of assembly for it.. it's a good feeling. I've always been too busy with graphics proper, but at the end of the day virtual machines, self-modifying code, etc. are important for graphics too 8)
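
Just to show the kind of thing I mean (a generic toy, not RibRender's real instruction set), the heart of such a VM is little more than a dispatch loop over register-based instructions:

```cpp
// Generic shader-VM sketch (not RibRender's actual instruction set): a tiny
// register machine that runs a list of instructions over float registers.
#include <cstdio>
#include <vector>

enum OpCode { OP_MOV, OP_ADD, OP_MUL, OP_END };

struct Instr
{
    OpCode op;
    int    dst, src1, src2;   // register indices
};

static void RunShader(const std::vector<Instr> &prog, float *regs)
{
    for (size_t pc = 0; pc < prog.size(); ++pc)
    {
        const Instr &ins = prog[pc];
        switch (ins.op)
        {
        case OP_MOV: regs[ins.dst] = regs[ins.src1];                  break;
        case OP_ADD: regs[ins.dst] = regs[ins.src1] + regs[ins.src2]; break;
        case OP_MUL: regs[ins.dst] = regs[ins.src1] * regs[ins.src2]; break;
        case OP_END: return;
        }
    }
}

int main()
{
    // r0 = r1 * r2 + r3  .. the sort of thing a trivial shader boils down to
    Instr code[] = { {OP_MUL, 0, 1, 2}, {OP_ADD, 0, 0, 3}, {OP_END, 0, 0, 0} };
    std::vector<Instr> prog(code, code + 3);

    float regs[8] = { 0, 0.5f, 2.0f, 0.25f };
    RunShader(prog, regs);
    printf("r0 = %g\n", regs[0]);   // expects 1.25
    return 0;
}
```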

14 comments:

  1. Davide.. see you in San Francisco :)

    Marco

  2. Bravo, bravo.. now I'll send you my super-personal details! (I've got a rented phone this time 8)

  3. Wouldn't it be possible to take the output of a video card (what was once called the framebuffer), shove it into a video encoder and serve the whole thing over the Internet to a client you carry around for demos?

    Ciaps.

  4. Too bad I won't be @ GDC this year; would have been nice to meet up with you. I just recently upgraded to an i7/12GB/Vista64 system (no raid though!) at home. Since most of the stuff that had been eating up my time is now done, it's time to roll up my sleeves and get busy. :D Have fun @ GDC.

  5. Ooohh 12 GB!!! Why not 16 already?

  6. Freddy,
    (answering in English as it's an interesting topic 8)
    It's possible, and in fact we do that. Every day the run-time runs automatically with an option to take a screenshot at every frame. Then a batch file makes a movie out of those screenshots (roughly along the lines of the little frame-dump sketch at the bottom of this post).
    It's interesting because if one thinks in terms of pre-rendered movies, then it's only fair game to play back a movie 8)
    It's useful for showing progress, without requiring people to go to the nearest holodeck 8)

    Ragin,
    Sigh.. while I'll be stuck for two weeks with an Intel Celeron M and 1 GB of RAM! (or perhaps I should bring my MacBook as well.. ummm)

    Paul,
    I think the motherboard only has 3 memory slots.. and the max size for those memory sticks is 4 GB ..anyway, it's good to keep things in perspective (^^;)

  7. Hmmm, some LGA 1366 socket motherboards can support more than 12 GB; this one for example can do 24 GB

    http://www.newegg.com/Product/Product.aspx?Item=N82E16813131365

    Looks like it has 6 memory slots, so 6 x 4GB sticks?

    I think memory is the one factor that makes the biggest difference in performance nowadays. So the more the better.

  8. Have you seen the presentation for Onlive?
    Because it is exactly what I was thinking about... :-/

  9. Haven't seen that.. personally I think we still aren't quite there with broadband and latency.. but I guess it could work for some games (less action-heavy ones).

    Interactive TV !

  10. http://gdc.gamespot.com/video/6206692/gdc-2009-onlive-press-conference?hd=1

    They claim 1 ms of latency for the encoders, as opposed to the 500-1000 ms that live video encoders usually have.

    So the lag seems to stay in an acceptable range.

  11. ummm.. so, whatever they are using to encode these highly compressed frames, it can do a thousand of them per second? ..that doesn't seem like the sort of technology one develops just for a single application.

    Where is the catch ? 8)

  12. It's cloud computing for videogames, both for catalogue titles and for new games.

    It has the potential to reduce piracy, increase revenue and save you money (you don't need to buy an SLI-enabled ninja PC).

  13. I understand all that.. but I don't think the Internet is currently fast and reliable enough to play high-res games at high frame rates and with good latency.

    Games like Virtua Fighter, for example, need to run at 60 Hz, and probably don't even allow triple buffering because of the extra frame of latency it adds, which is ~17 ms at 60 Hz (I remember that was the case when I was developing Tetris for Arika 8)

    If I play a game it's usually an action game.. and I want it to be responsive.

    The multi-core, multi-processor, multi-system stuff is very much something I've been looking into..
    But I think that for the immediate future it will be for less real-time stuff (using N machines to bake the lighting for a game level under a certain daylight setting), rather than for a server-and-thin-client solution.

  14. The server + thin-client model would be a lot more useful for demonstrating stuff at meetings, as I meant in my earlier post.

    In that case latency is a non-issue, and you would have more flexibility than with a movie generated from screenshots the night before.

    Too bad anyway that the OnLive public beta is US-only.. AAIIIH! :)

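PS: since the frame-capture idea came up in the comments, here's roughly what a per-frame dump boils down to.. purely an illustration (this isn't our actual capture code, and the file naming is made up): every frame gets written out as a numbered TGA, and a script later stitches the files into a movie.

```cpp
// Stripped-down illustration of a per-frame screenshot dump (not the real
// capture path): write the frame's BGR pixels out as numbered .tga files,
// which a batch script can later stitch into a movie.
#include <cstdio>
#include <vector>

static void SaveFrameTGA(int frameIdx, const unsigned char *bgr, int w, int h)
{
    char name[64];
    sprintf(name, "frame_%05i.tga", frameIdx);

    FILE *fp = fopen(name, "wb");
    if (!fp)
        return;

    // Minimal 18-byte TGA header: uncompressed true-color, 24 bits per pixel
    unsigned char head[18] = {0};
    head[2]  = 2;                       // image type: uncompressed RGB
    head[12] = (unsigned char)(w & 0xff);
    head[13] = (unsigned char)(w >> 8);
    head[14] = (unsigned char)(h & 0xff);
    head[15] = (unsigned char)(h >> 8);
    head[16] = 24;                      // bits per pixel
    head[17] = 0x20;                    // top-left origin

    fwrite(head, 1, 18, fp);
    fwrite(bgr, 1, (size_t)w * h * 3, fp);
    fclose(fp);
}

int main()
{
    // Fake a few 64x64 frames of solid color just to exercise the writer
    const int W = 64, H = 64;
    std::vector<unsigned char> pix(W * H * 3);

    for (int frame = 0; frame < 3; ++frame)
    {
        for (size_t i = 0; i < pix.size(); i += 3)
            pix[i + 2] = (unsigned char)(frame * 80);   // ramp the red channel (BGR order)

        SaveFrameTGA(frame, &pix[0], W, H);
    }
    return 0;
}
```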