Monday, April 19, 2010

Deferred Rendering

Since my last update, I've done quite a bit of work on PyStream to finish off my dissertation.

The example rendering system is now a deferred renderer, and includes a pass for screen-space ambient occlusion (SSAO). This type of rendering system is the "hot" thing for games right now; the technique is being used in Crysis 2 and Starcraft 2, among others.  The statue of Beethoven nearest to the camera is entirely unlit, except for ambient lighting.  Despite this, the shape of the statue is still apparent, due both to SSAO and to the use of HDR cube map ambient lighting.



The shaders generated by the compiler are pretty much what I'd write by hand.  The overall rendering system easily achieves 60 fps on a mid-range video card, despite the fact that it is written entirely in Python.

Tuesday, February 23, 2010

Highly Dynamic Cows

High-dynamic range rendering support has been added to the example renderer I am building using PyStream.

This means that cube maps are now supported, as is off-screen rendering.  Ambient lighting is now done with low-resolution cube maps.  Loops are now supported as well.  The dips in the ground are the result of parallax occlusion mapping.
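HDR rendering means shading math runs in an unbounded linear range, with a final tone mapping step squeezing it back into a displayable [0, 1]. As a hedged sketch (the post doesn't say which operator the renderer uses; Reinhard's is a common choice, and the function names here are my own):

```python
def reinhard(c):
    # Map an HDR channel value in [0, inf) into [0, 1).
    return c / (1.0 + c)

def linear_to_srgb(c):
    # Approximate gamma correction for display.
    return c ** (1.0 / 2.2)

def tonemap_channel(hdr):
    # Tone map first, then convert to the display's gamma space.
    return linear_to_srgb(reinhard(hdr))
```

Applied per channel, this keeps very bright cube map samples from simply clipping to white.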

Wednesday, February 3, 2010

A Data-driven Discourse

I am currently TAing CS428 at UIUC. This class is the second in a sequence of software engineering classes. In this class, students form groups of 6-8 and work on projects for an entire semester. Many students choose to work on games, which is serendipitous, as I am interested in game architecture.

In an email I sent out to some of the students, I suggested they look into data-driven design. Unfortunately, there isn't a lot of concrete information on data-driven design. Most of the documentation is along the lines of "put as much of the game into data as you can, it's a good idea," without many specifics. There was much confusion. The truth of the matter is, I did not fully understand data-driven design until I used it in a real game. I think, in general, data-driven design is one of those topics that people do not fully understand until they've done it. There are lots of experienced people evangelizing a data-driven approach, and a lot of inexperienced people simply not understanding why the evangelists are so fervent. This is my attempt to bridge that gap.
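To make the idea concrete, here is a tiny illustration (my own, not from the course materials) of the core move in data-driven design: behavior is described as data and interpreted by generic code, so adding a new game entity is a data change rather than a code change. The names and numbers are invented:

```python
# Designers edit this table (in a real game, it would live in a file).
ENEMY_DEFS = {
    "ghost":  {"speed": 2.5, "health": 10, "drops": ["coin"]},
    "wraith": {"speed": 4.0, "health": 5,  "drops": ["coin", "powerup"]},
}

class Enemy:
    def __init__(self, kind):
        # All per-type behavior comes from the data table, not subclasses.
        definition = ENEMY_DEFS[kind]
        self.kind = kind
        self.speed = definition["speed"]
        self.health = definition["health"]
        self.drops = list(definition["drops"])
```

Adding a third enemy type now requires touching only `ENEMY_DEFS`; the engine code does not change, and designers can iterate without a programmer in the loop.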

Polymorphic Cows

The PyStream compiler is getting more robust, and the cases that break it are getting hammered out.


A big milestone is that polymorphic types now work. In the screenshot, there are three different "material" types that control how light interacts with a given surface. There's a chalky DiffuseMaterial that implements wrap lighting. There's a shiny PhongMaterial which implements straight-up Blinn-Phong lighting. There's also a non-photorealistic ToonMaterial that emulates the banded lighting commonly found in cartoons. The code for these materials is at the end of this post.
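The post's original material code isn't reproduced here, but as a hedged sketch of what such a polymorphic hierarchy might look like (class names match the post; the method name and lighting math are my assumptions):

```python
import math

class Material:
    def lit(self, n_dot_l):
        # n_dot_l: cosine of the angle between surface normal and light.
        raise NotImplementedError

class DiffuseMaterial(Material):
    # Wrap lighting: lets light "wrap" past the terminator for a soft, chalky look.
    def __init__(self, wrap=0.5):
        self.wrap = wrap
    def lit(self, n_dot_l):
        return max((n_dot_l + self.wrap) / (1.0 + self.wrap), 0.0)

class PhongMaterial(Material):
    # Standard Lambert diffuse term; the Blinn-Phong specular term
    # (using the half vector) is omitted for brevity.
    def lit(self, n_dot_l):
        return max(n_dot_l, 0.0)

class ToonMaterial(Material):
    # Banded lighting: quantize intensity into discrete steps.
    def __init__(self, bands=3):
        self.bands = bands
    def lit(self, n_dot_l):
        clamped = max(n_dot_l, 0.0)
        return math.floor(clamped * self.bands) / self.bands
```

The point the post makes is that a compiler for a dynamic language must resolve which `lit` is called at each shaded surface, something that is trivial in Python but nontrivial to compile to a shader.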

Friday, November 27, 2009

Scamper Ghost Released

A friend and I have finally released our game.




As they say, it's "a fast-paced avoidance retro pixel art game with ghosts, coin collecting, power-ups, and slow motion."

Sunday, November 22, 2009

Unified Codebases Rock

In a previous post, I alluded to the advantages of being able to run the same code on the CPU and the GPU. One obvious example is the interaction between color processing and frame buffer clearing. Shaders may process the color of a fragment before it is written to the frame buffer for a multitude of reasons, including color correction, exposure control, tone mapping, and gamma correction. This causes problems when the CPU needs to predict the final value for a color that will be output by a shader. For example, when fog is used to attenuate the colors of objects as they fade into the distance, it is necessary to match the clear color with the fog color. If the same color processing is not applied to both, the result is the following:
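The fix amounts to running the clear (fog) color through the same processing chain the shader applies before output. A minimal sketch, assuming a Reinhard-plus-gamma chain for illustration (the renderer's actual chain isn't shown here):

```python
def process_color(rgb):
    # The same per-channel transform the fragment shader applies
    # (tone mapping followed by gamma correction, in this sketch).
    return tuple((c / (1.0 + c)) ** (1.0 / 2.2) for c in rgb)

fog_color = (0.5, 0.6, 0.7)
# Clearing with the processed color makes the background match
# fully fogged geometry exactly at the horizon.
clear_color = process_color(fog_color)
```

With a unified codebase, `process_color` can literally be the same Python function the shader compiler consumes, so the two can never drift apart.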



On the other hand, if the correct color processing is applied to the clear color, the result is much more satisfying:

PyStream Texture Support

PyStream now supports textures. There are a lot of simple issues, like texture mapping, that have been lost in the shuffle until now and that I am going back to fix.



def shadeVertex(self, context, pos, normal, texCoord):
    # Transform the position and normal into camera space.
    trans     = self.worldToCamera*self.objectToWorld
    newpos    = trans*pos
    newnormal = trans*vec4(normal, 0.0)

    # Project into clip space for the rasterizer.
    context.position = self.projection*newpos

    return newpos.xyz, newnormal.xyz, texCoord

def shadeFragment(self, context, pos, normal, texCoord):
    surface = self.material.surface(pos, normal.normalize())

    # Modulate the diffuse color by the sampled texture.
    surface.diffuseColor *= self.sampler.texture(texCoord).xyz

    # Accumulate lighting
    self.ambient.accumulate(surface, self.worldToCamera)
    self.light.accumulate(surface, self.worldToCamera)

    mainColor = surface.litColor()

    # Tone map and convert to sRGB before output.
    mainColor = rgb2srgb(tonemap(mainColor))

    mainColor = vec4(mainColor, 1.0)
    context.colors = (mainColor,)