Painting Style Renderer

04/07/2011

I've always loved photorealism in graphics, but I get the sense that people are too unwilling to experiment with it. Often a "realistic" renderer means the art team stops bothering with anything interesting in the art direction. Perhaps that is unfair - but at the very least it gives them an excuse to fall back on the artistic clichés we all love: gritty realism, bright fantasy and greeble-infested sci-fi.

Because of this, non-photorealism has always been nagging at the back of my mind, so I decided to try rolling a renderer that did something a bit different. This is what I've come up with so far:

[screenshots of the painterly renderer]

The main inspiration comes from a couple of papers and presentations. The best paper I found is Barbara J. Meier's "Painterly Rendering for Animation" from 1996, which presents an offline rendering algorithm for a painterly style. There is also a later paper by a German student, Daniel Sperl, who suggests a possible real-time implementation in OpenGL using similar techniques. Worth mentioning, but not something I used, is another presentation which showed a very cheap effect, though I wasn't happy with the resulting look.

The basic idea is this: generate thousands of particles at different positions on a mesh and render these as "brush strokes" in screen space. This gives a painting-style effect without much of the randomness or frame-to-frame flickering you get in other post effects, which can be off-putting.
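To give a rough idea of the generation step, the sketch below scatters particles over a triangle list, with the count per triangle proportional to its area so the stroke density stays even across the mesh. This is a simplified illustration rather than my actual code - the names and structures are made up for the example:

    #include <cmath>
    #include <cstdlib>
    #include <vector>

    struct Vec3 { float x, y, z; };

    static Vec3 sub(Vec3 a, Vec3 b) { return {a.x-b.x, a.y-b.y, a.z-b.z}; }
    static Vec3 cross(Vec3 a, Vec3 b) {
        return {a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x};
    }
    static float len(Vec3 a) { return std::sqrt(a.x*a.x + a.y*a.y + a.z*a.z); }
    static float rnd() { return rand() / (float)RAND_MAX; }

    // Uniform sample inside triangle (a, b, c) via barycentric coordinates,
    // reflecting samples that land outside back into the triangle.
    Vec3 samplePointOnTriangle(Vec3 a, Vec3 b, Vec3 c) {
        float u = rnd(), v = rnd();
        if (u + v > 1.0f) { u = 1.0f - u; v = 1.0f - v; }
        return { a.x + u*(b.x-a.x) + v*(c.x-a.x),
                 a.y + u*(b.y-a.y) + v*(c.y-a.y),
                 a.z + u*(b.z-a.z) + v*(c.z-a.z) };
    }

    // Scatter particles over a triangle list (3 vertices per triangle),
    // count per triangle proportional to area for even stroke coverage.
    std::vector<Vec3> scatterParticles(const std::vector<Vec3>& tris,
                                       float particlesPerUnitArea) {
        std::vector<Vec3> particles;
        for (size_t i = 0; i + 2 < tris.size(); i += 3) {
            float area = 0.5f * len(cross(sub(tris[i+1], tris[i]),
                                          sub(tris[i+2], tris[i])));
            int count = (int)std::ceil(area * particlesPerUnitArea);
            for (int j = 0; j < count; j++)
                particles.push_back(samplePointOnTriangle(tris[i], tris[i+1], tris[i+2]));
        }
        return particles;
    }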

To find the color of each brush stroke, you first render the scene as usual and then use this reference image for a color lookup.
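In shader terms this lookup can be done per particle with a vertex texture fetch, something like the sketch below (the uniform name is illustrative, and the *Lod variant is needed because the read happens in the vertex stage):

    // Vertex shader sketch: project the particle's anchor point into screen
    // space and fetch its stroke colour from the reference render.
    const char* strokeVertexShader = R"(
        #version 120
        uniform sampler2D referenceImage; // the normally rendered scene
        varying vec4 strokeColor;

        void main() {
            vec4 clipPos = gl_ModelViewProjectionMatrix * gl_Vertex;
            vec2 screenUV = (clipPos.xy / clipPos.w) * 0.5 + 0.5;
            strokeColor = texture2DLod(referenceImage, screenUV, 0.0);
            gl_Position = clipPos;
        }
    )";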

The offline technique presented by Meier runs on the CPU, and there were certain things I could not emulate in the graphics pipeline, but none were that important. For several of these I came up with solutions which I think work better anyhow.

The implementation proposed by Daniel Sperl was also interesting, but it lacked some features from the offline version, such as particle orientation, which I added back in. I also switched the billboarded particles over to a far cheaper rendering technique and moved the data into a VBO, which sped up draw times significantly.
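The VBO part is just the standard static upload pattern - everything a particle needs packed into one buffer created at load time and reused every frame. A sketch, assuming GLEW for extension loading and an illustrative vertex layout:

    #include <GL/glew.h>
    #include <vector>

    // One entry per particle; uploaded once, drawn every frame.
    struct ParticleVertex {
        float position[3]; // anchor point on the mesh surface
        float normal[3];   // surface normal, used for shrinking/culling
        float size;        // base stroke size
    };

    GLuint uploadParticles(const std::vector<ParticleVertex>& particles) {
        GLuint vbo = 0;
        glGenBuffers(1, &vbo);
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glBufferData(GL_ARRAY_BUFFER,
                     particles.size() * sizeof(ParticleVertex),
                     particles.data(), GL_STATIC_DRAW);
        glBindBuffer(GL_ARRAY_BUFFER, 0);
        return vbo;
    }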

After much testing it became apparent that the bottleneck in the system is not actually the number of particles - it is the fill time of the particles, mainly because they are alpha blended onto the scene. I could easily render several million particles when they only covered a couple of pixels each, but when their size grew and they began to overlap each other, that is when the slowdown really began. The problem was that without fairly large, overlapping particles it is almost impossible to avoid gaps in between the strokes where the background shows through.

My first attempt at a solution was to do what real painters do: I rendered a base layer with far fewer, much larger particles to fill the gaps. Unfortunately there were two issues with this. First, it was expensive - the fill time was just as high as rendering lots of small particles. Second, it completely messed up the silhouettes of objects, because the brush strokes were so large.

On my second attempt I drew the reference image (the normally rendered scene) below the particles. This filled in any gaps, but the silhouette was still bad: in some places it had per-pixel sharpness, and in others it was blurred where a brush stroke crossed it.

In the end I think I came up with quite a novel solution, which is to render the reference image at a quarter of the screen size. This avoids a sharp silhouette and fills in the gaps much more seamlessly. The added advantage is that the first pass, drawing the scene as usual, becomes quite a bit faster. In fact, this pattern of exploiting the effect's intrinsic loss of detail seems to be the key to getting the most out of the renderer. As well as downscaling the reference image, it also makes sense to drop normal maps, reduce texture resolutions, and cut any other detail which won't be seen in the final image. Hopefully this makes up somewhat for the memory-sucking horror of generating hundreds of thousands of particles per mesh.
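The setup for this is simply an FBO at reduced resolution with linear filtering, so stretching it back up under the strokes produces the soft, gap-filling backdrop. A sketch, reading "quarter of the screen size" as a quarter of the pixels (half width, half height):

    // Create the quarter-size reference target. The depth attachment is
    // omitted for brevity; in practice the reference pass also writes
    // depth to a texture, which is reused later for the depth discard.
    GLuint makeReferenceTarget(int screenW, int screenH, GLuint* colorTex) {
        int w = screenW / 2, h = screenH / 2; // a quarter of the pixels

        glGenTextures(1, colorTex);
        glBindTexture(GL_TEXTURE_2D, *colorTex);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, nullptr);
        // linear filtering so the upscaled backdrop is softly blurred,
        // hiding gaps between strokes without a sharp silhouette
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

        GLuint fbo = 0;
        glGenFramebuffers(1, &fbo);
        glBindFramebuffer(GL_FRAMEBUFFER, fbo);
        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                               GL_TEXTURE_2D, *colorTex, 0);
        glBindFramebuffer(GL_FRAMEBUFFER, 0);
        return fbo;
    }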

For rendering the particles it made sense to have a LOD system. In the end I generated a set of element index buffers for each object, each skipping every second or third (and so on) particle, and picked between them based on distance.
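Building these is straightforward - one element buffer per LOD level, where level n keeps every (n+1)-th particle. Roughly:

    // Build one element buffer per LOD level; level n keeps every
    // (n+1)-th particle, so far-away objects draw far fewer strokes.
    std::vector<GLuint> buildLodIndexBuffers(int particleCount, int lodLevels) {
        std::vector<GLuint> buffers(lodLevels);
        for (int lod = 0; lod < lodLevels; lod++) {
            std::vector<unsigned int> indices;
            for (int i = 0; i < particleCount; i += lod + 1)
                indices.push_back((unsigned int)i);
            glGenBuffers(1, &buffers[lod]);
            glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, buffers[lod]);
            glBufferData(GL_ELEMENT_ARRAY_BUFFER,
                         indices.size() * sizeof(unsigned int),
                         indices.data(), GL_STATIC_DRAW);
        }
        glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);
        return buffers;
    }

At draw time you just bind the buffer matching the object's distance and issue a single glDrawElements call.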

There were some other tricks to reduce the fill time too. The surface normal is stored with each particle and its angle to the screen calculated in the vertex shader. This allowed me to shrink particles at a sharp angle to the screen and cull ones which were backfacing. The depth is also rendered to texture when rendering the reference image, and this can be used as a kind of poor man's depth sorting: you can discard particle fragments which fall behind the reference scene.
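In shader terms the two tricks boil down to something like the fragments below (illustrative, not my exact shaders - strokeSize, referenceDepth, invScreenSize and depthBias are assumed names, and the strokes are assumed to be drawn as point sprites):

    // Vertex stage: shrink strokes at glancing angles and collapse
    // backfacing ones to zero size so they are clipped away.
    const char* facingSnippet = R"(
        vec3 viewNormal = normalize(gl_NormalMatrix * gl_Normal);
        float facing = viewNormal.z; // ~1 head-on, ~0 edge-on, <0 backfacing
        gl_PointSize = strokeSize * max(facing, 0.0);
    )";

    // Fragment stage: poor man's depth sort against the reference depth,
    // with a small bias since particles sit right on the surface.
    const char* depthSnippet = R"(
        float refDepth = texture2D(referenceDepth,
                                   gl_FragCoord.xy * invScreenSize).r;
        if (gl_FragCoord.z > refDepth + depthBias) discard;
    )";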

Currently the orientation is based on either the tangent or the binormal vector, depending on the size of each UV triangle, but it would be perfectly possible to give the artist full control, since strokes align along either the UV x-axis or the UV y-axis. Stroke orientation and size are things I really want to experiment with in future, as the small strokes currently give everything a very Monet-like feel.
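For reference, aligning a stroke with the tangent looks roughly like this in the vertex stage, where tangent is an assumed per-particle attribute pointing along the UV x-axis:

    // Project the tangent into view space; its screen direction gives
    // the rotation angle for the stroke billboard.
    const char* orientSnippet = R"(
        vec3 viewTangent = normalize(gl_NormalMatrix * tangent);
        float strokeAngle = atan(viewTangent.y, viewTangent.x);
    )";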

The other thing that needs work is the colors. I feel there needs to be some post effect, perhaps increasing contrast and saturation. The shadows also seem to be off. This could be down to color: I feel the shadows need to be more blue and the light more orange. These are all things which can come down to tweaking, though.

Anyway, I very much welcome ideas, suggestions and feedback. In honesty, I've been staring at that piano so long I hardly know what looks good any more.