Post-Processing
Image post-processing is the task of improving or altering the scene image after it has been rendered.
The namespace DigitalRune.Graphics.PostProcessing contains a variety of post-process filters (base class PostProcessor).
A post-processor in DigitalRune Graphics is simply a class that performs an operation on one (or more) textures and writes the result to a render target:
// Define source texture (= input) and render target (= output).
context.SourceTexture = sourceTexture;
context.RenderTarget = renderTarget;
context.Viewport = new Viewport(0, 0, renderTarget.Width, renderTarget.Height);

// Apply post-processing.
postProcessor.Process(context);
The RenderContext defines the input texture, the output render target, and the viewport within that render target. The method Process reads the specified source texture, performs the post-process operation, and writes the result to the specified viewport of the render target (or the back buffer if the render target is null).
Certain post-processors require additional parameters. For example, the DepthOfFieldFilter expects that the depth buffer of the current scene is set in the render context (property GBuffer0). The ObjectMotionBlur requires a texture containing the motion vectors of the scene stored in RenderContext.Data[RenderContextKeys.VelocityBuffer]. Make sure to read the documentation of the individual post-processors.
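For illustration, here is a minimal sketch of running a DepthOfFieldFilter. The variable names (depthBuffer, sceneTexture, _depthOfFieldFilter) are placeholders for the buffers and filter instance of your own renderer:

// Provide the scene depth buffer, which the DepthOfFieldFilter reads from the render context.
context.GBuffer0 = depthBuffer;
context.SourceTexture = sceneTexture;
context.RenderTarget = renderTarget;
context.Viewport = new Viewport(0, 0, renderTarget.Width, renderTarget.Height);
_depthOfFieldFilter.Process(context);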
Post-process operations can be nested: A post-processor can internally use other post-processors.
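As an illustrative sketch (not taken from the library documentation), a nested post-processor could look roughly like the following. The class name is hypothetical, and the base-class constructor and the OnProcess signature should be checked against the PostProcessor class documentation:

public class MyNestedFilter : PostProcessor   // Hypothetical example class.
{
  private readonly CopyFilter _copyFilter;

  public MyNestedFilter(IGraphicsService graphicsService)
    : base(graphicsService)
  {
    // The nested post-processor is created and owned by this post-processor.
    _copyFilter = new CopyFilter(graphicsService);
  }

  protected override void OnProcess(RenderContext context)
  {
    // A real filter would perform its own passes here and can delegate
    // individual steps to the nested post-processor.
    _copyFilter.Process(context);
  }
}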
Modern games usually perform a series of post-process operations on the scene image. Post-process operations can be linked together using a PostProcessorChain. The PostProcessorChain is a collection of post-processors that are executed in sequence. The output of the previous post-processor is the input of the next post-processor in the list.
To change the image format (e.g. size or surface format) within the post-processor chain, post-processors have a DefaultTargetFormat property. For example, this is used by the HdrFilter to convert a HdrBlendable input texture to an LDR Color (R8G8B8A8) texture.
The PostProcessorChain is derived from PostProcessor and can be treated the same way as any other post-processor.
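A typical usage sketch, assuming the filter constructors take the IGraphicsService (here graphicsService) as in the other examples; the chosen filters and texture names are only placeholders for illustration:

var postProcessorChain = new PostProcessorChain(graphicsService);
postProcessorChain.Add(new HdrFilter(graphicsService));   // Tone mapping: HDR input -> LDR output.
postProcessorChain.Add(new FxaaFilter(graphicsService));  // Anti-aliasing on the LDR image.

// The chain is executed like a single post-processor:
context.SourceTexture = hdrSceneTexture;
context.RenderTarget = finalRenderTarget;
context.Viewport = new Viewport(0, 0, finalRenderTarget.Width, finalRenderTarget.Height);
postProcessorChain.Process(context);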
Post-processors use the rasterizer state CullNone and the depth-stencil state None (depth buffer disabled). It is not necessary to set the rasterizer state or the depth-stencil state before calling a post-processor.
Post-processors in a PostProcessorChain use the blend state Opaque. When a single post-processor is executed, the blend state must be set explicitly. Most post-processors require the blend state Opaque, but some can be used with different blend states. For example, the CopyFilter can be used to blend a source texture onto the current render target:
// Set the input texture, the output render target and the viewport.
context.SourceTexture = sourceTexture;
context.RenderTarget = renderTarget;
context.Viewport = new Viewport(0, 0, renderTarget.Width, renderTarget.Height);

// Set a custom blend state before executing the post-processor.
device.BlendState = GraphicsHelper.BlendStateMultiply;
_copyFilter.Process(context);
Screen-Space Ambient Occlusion (SSAO) is a real-time approximation technique for ambient occlusion, which is implemented as a post-process filter (see SsaoFilter).
SSAO can be applied to the final scene image. However, in a deferred lighting pipeline it is usually applied directly to the diffuse light buffer (before the material pass). See the DeferredLightingSample for more information.
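A rough sketch of this usage, assuming the SsaoFilter reads the scene depth from GBuffer0 like other depth-based filters; the texture names are placeholders, and the exact output behavior should be checked in the SsaoFilter documentation:

context.GBuffer0 = depthBuffer;                 // Scene depth required by the SSAO computation.
context.SourceTexture = diffuseLightBuffer;     // Input: the accumulated diffuse light.
context.RenderTarget = occludedLightBuffer;     // Output: light buffer darkened by ambient occlusion.
context.Viewport = new Viewport(0, 0, occludedLightBuffer.Width, occludedLightBuffer.Height);
_ssaoFilter.Process(context);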