Last week, I read Alexandre Pestana's post on Volumetric Lights. This got me thinking about adding this effect to 3DWorld. In fact, I had already added this a while back, but I never posted any of the screenshots. It took a bit of effort to get it working again, and now I can write a blog post about it.
I've already talked about rendering "God rays", a type of volumetric lighting, in a previous blog post. Here is a screenshot from a few months ago, where the sun rays are visible through the leaves of a palm tree.
"God Rays" showing through palm trees using a screen space postprocessing pass. |
This is a nice, fast screen-space technique, but it only works when the light (sun in this case) is visible. And it only works for uniform fog density. I needed a better system without these limitations, even if it meant spending more frame time drawing the fog. I decided to add this feature to my smoke effects shader code path, but have the fog density come from a procedural function rather than a 3D smoke texture constructed from smoke diffusion on the CPU.
I used a standard volumetric ray marching technique to step from the scene hit point to the viewer in eye space using uniform steps. This integration process calculates the lighting and fog effects for every pixel in the fragment shader on the GPU. The operations performed at each step are as follows:
- Sample the indirect area lighting term from a 3D texture to get the ambient color at this point.
- Check the shadow map 2D texture at this point to see if there is direct lighting from the sun.
- Evaluate a 3D noise function to determine the smoke/fog density at this location.
- Combine all of the terms to compute the change in color at this step using the equations:
sample_color = (indirect_light + direct_light*shadow_term) * fog_color;
color = prev_step_color*(1 - fog_density) + sample_color*fog_density;
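Putting these steps together, here is a minimal GLSL-style sketch of the per-pixel ray march loop. The uniform names and the world_to_indir_tc(), world_to_light_space(), and get_fog_density() helpers are hypothetical placeholders for illustration, not 3DWorld's actual shader code:

```glsl
// Minimal sketch of the per-pixel ray march described above (not 3DWorld's real shader code).
uniform sampler3D       indir_light_tex; // precomputed indirect (ambient) lighting volume
uniform sampler2DShadow shadow_map;      // sun shadow map with hardware depth comparison
uniform vec3 fog_color, sun_color;
uniform int  num_steps;

vec3  world_to_indir_tc(vec3 pos);    // assumed helpers, defined elsewhere
vec3  world_to_light_space(vec3 pos); // returns shadow map UV + comparison depth
float get_fog_density(vec3 pos);      // constant, or procedural noise (see below)

vec3 apply_volumetric_fog(vec3 scene_color, vec3 hit_pos, vec3 eye_pos) {
    vec3 step_vec = (eye_pos - hit_pos)/float(num_steps); // march from the hit point toward the viewer
    vec3 pos      = hit_pos;
    vec3 color    = scene_color;

    for (int i = 0; i < num_steps; ++i) {
        vec3  indir   = texture(indir_light_tex, world_to_indir_tc(pos)).rgb; // step 1: indirect lighting
        float shadow  = texture(shadow_map, world_to_light_space(pos));       // step 2: sun shadow term
        float density = get_fog_density(pos);                                 // step 3: fog/smoke density
        vec3  sample_color = (indir + sun_color*shadow) * fog_color;          // step 4: combine the terms
        color = mix(color, sample_color, density); // color*(1-density) + sample_color*density
        pos  += step_vec;
    }
    return color;
}
```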
Dense volumetric fog with shadows in the Crytek Sponza Atrium scene.
Sponza Atrium with fog, half of it in light and the other half in shadow.
Here I'm using a 128x128x256 indirect lighting texture, a single 4096x4096 shadow map, and a screen resolution of 1920x1080 (1080p). I'm taking 3 samples per lighting texel along each ray, which can work out to more than 200 samples per screen pixel for long rays, but the average is much smaller. I get around 20 FPS with all effects enabled. The framerate increases to 40-50 FPS at 1280x720 (720p), which is more reasonable for a realtime game. Most of the time is spent in the first two steps, the indirect lighting and shadow map lookups. If I reduce the number of steps/samples, the framerate improves significantly, but the quality degrades and rendering artifacts start to appear. If I reduce the indirect lighting or shadow map texture resolution, the framerate also improves, at the cost of more noise and less lighting precision. Blocky lighting doesn't look very good.
Here is another screenshot that shows how important the indirect lighting term is. The fog near the fire on the right glows with an orange hue. If I remove the indirect term, the fog will be black, which looks very wrong. The fog on the left side is brighter due to indirect lighting from the sun in the open courtyard. You can also see the shafts of light in the back center of the image where the sun shines through the space above the curtains.
Volumetric fog with direct sunlight + shadows, and indirect lighting from the fire on the right.
The previous images all used a constant fog density. 3DWorld can also use procedural 3D Perlin noise to generate time-varying, nonuniform fog density. This looks more like smoke that slowly drifts across the scene. I used 3 noise octaves, which seemed to be a good balance between runtime and quality. The additional three 3D texture lookups add significant overhead to the fog computation time and reduce the framerate by about 20%, but it does look much nicer. Here are some screenshots.
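For reference, a 3-octave noise density function along these lines might look like the sketch below. It assumes a tileable 3D noise texture and a wind offset uniform advanced each frame on the CPU; the names here are hypothetical, not taken from 3DWorld's shaders:

```glsl
// Hypothetical 3-octave procedural density for the fog/smoke (placeholder names).
uniform sampler3D noise_tex;    // tileable 3D noise texture
uniform vec3  wind_offset;      // advanced over time on the CPU so the smoke drifts
uniform float base_freq;        // spatial frequency of the first octave
uniform float base_density;     // overall fog/smoke density scale

float get_fog_density(vec3 pos) {
    float density = 0.0;
    float freq    = base_freq;
    float weight  = 0.5;
    for (int octave = 0; octave < 3; ++octave) { // 3 octaves = the 3 extra 3D texture lookups
        density += weight * texture(noise_tex, freq*pos + wind_offset).r;
        freq    *= 2.0; // double the frequency each octave
        weight  *= 0.5; // halve the amplitude each octave
    }
    return base_density * density;
}
```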
GPU-generated volumetric smoke that moves/flows dynamically and is lit by both indirect and direct light with shadows.
Dynamic volumetric smoke, lit by a shaft of sunlight, viewed from above.
Volumetric smoke with shadows and light shafts.
Finally, here is a video showing how the smoke moves over time. The camera is motionless; only the smoke is moving.
3DWorld also has a smoke diffusion model that can be used in gameplay mode. Some of the weapons produce smoke, and others create fire, which also produces smoke. The smoke is diffused through a 3D volume with collision model blockers on the CPU. Then the volume data is sent to the GPU for use in rendering. I may make a blog post on this topic later.
I would like to use this effect for some game content, but it's not clear if the performance/quality trade-off curve has any useful points: it's either too slow, or the quality isn't good enough. Still, it's a nice demo effect, and fine for small screen resolutions. I'll have to look into some possible optimizations. I've read that randomly jittering the sample points reduces the banding artifacts but trades them for noise. Maybe those results would be more acceptable? In some cases a screen-space blur can remove most of the noise, but it's not clear if the 3DWorld framework can do this blur correctly in its current form. Maybe I'll experiment with this later.
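As a rough sketch of that jittering idea (not something 3DWorld currently does), the ray start point could be offset by a per-pixel random fraction of one step; the hash function and names below are placeholders:

```glsl
// A cheap per-pixel hash; a dither or blue noise texture would also work.
float rand01(vec2 seed) {
    return fract(sin(dot(seed, vec2(12.9898, 78.233))) * 43758.5453);
}

// Offset the ray start by a random fraction of one step, per pixel,
// so that neighboring pixels sample at different depths along their rays.
vec3 jittered_start(vec3 hit_pos, vec3 step_vec) {
    return hit_pos + rand01(gl_FragCoord.xy) * step_vec;
}
```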
It is a nice effect. Maybe you could just use low-res sampling to get the performance up?
Yes, there's certainly a quality vs. performance trade-off here. I should probably add a config option for it rather than having the values hard-coded in the source and shader.