WebGL GPU Landscaping and Erosion

Nov. 10, 2011

A while ago I finished playing From Dust, which I enjoyed a lot. What impressed me about that game was how the landscape is reshaped by erosion. One drawback of Lithosphere, a tool I wrote earlier, is that it can't do any form of hydraulic erosion. So I decided to write a test in WebGL to see if a few simple algorithms could be used to shape a landscape according to hydraulic erosion.


Demo

You can try the live demo. The code is on GitHub.

Video

Screenshots

At the start it looks like this.

Starting the rain.

After a while of raining and erosion.

Mesh

I am using vertex shader texture lookups to displace a flat plane along the Y axis, which represents the terrain and the water.
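A minimal sketch of this kind of displacement, assuming a height texture and a combined transform matrix (the names here are placeholders, not the demo's actual shader interface):

uniform sampler2D heightmap;   // terrain (or water) heights, assumed name
attribute vec2 position;       // x/z position of the grid vertex in [0, 1]
uniform mat4 matrix;           // combined model/view/projection matrix

void main(){
    // vertex texture fetch: the red channel holds the height at this cell
    float height = texture2D(heightmap, position).r;
    gl_Position = matrix * vec4(position.x, height, position.y, 1.0);
}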

The mesh is regularly tessellated into 512x512 cells. For the terrain I use a hexagonal mesh because it avoids tessellation artifacts better than a square tessellated mesh; the technique is described in this short paper.

As a side note, in order to get the mesh overlay in WebGL I use barycentric coordinates. Each vertex of a triangle mesh gets an attribute (barycentric) set to (1, 0, 0) for the first vertex, (0, 1, 0) for the second and (0, 0, 1) for the third. Pass this to the fragment shader and then you can test it like this:

// bc is the interpolated barycentric coordinate from the vertex shader
if(all(greaterThan(bc, vec3(0.02)))){
    // fragment lies in the triangle interior
    gl_FragColor = vec4(color, 1.0);
}
else{
    // fragment lies close to a triangle edge: draw the overlay
    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
}

Terrain

The initial terrain is generated by layering 10 octaves of simplex noise. The technique is identical to this description (except for the noise function).

float h = 0.0;
for(int i=0; i<10; i++){
    // each octave doubles the noise frequency
    float factor = pow(2.0, float(i));
    // the amplitude falls off a bit more slowly than 1/frequency (exponent 0.88)
    h += snoise(vec3(uv*factor, delta*float(i+1)))/(pow(factor, 0.88)*10.0);
}

The snoise function is a good implementation of simplex noise for WebGL by Ian McEwan.

Ambient Occlusion

As discussed in my WebGL SSAO and Minecraft-like rendering articles, ambient occlusion is an important aspect of good lighting.

When using a heightfield the occlusion can be evaluated on the heightfield instead of in screen space. This is desirable since it avoids some of the issues of screen space ambient occlusion.

The algorithm I use for this inspects neighbors of each cell in the heightfield. Alpha is the angle between the cell's normal and the direction to the neighbor (vec).

As a sample pattern I use points distributed by a spiral method.

for(int i=1; i<33; i++){
    float s = float(i)/32.0;
    // the angle grows with the square root of s while the radius grows to 1.0,
    // spreading the 32 samples evenly along a spiral
    float a = sqrt(s*512.0);
    float b = sqrt(s);
    float x = sin(a)*b;
    float y = cos(a)*b;
    // (x, y) is the offset at which the heightfield is sampled
}

which produces an evenly distributed spiral pattern of sample points.
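To show how the sample pattern and the angle test fit together, here is a rough sketch of an occlusion loop; the heightfield texture, the sampling radius and the final normalization are assumptions rather than the demo's actual shader:

uniform sampler2D heightfield;   // terrain heights in the red channel (assumed)
const float radius = 0.05;       // assumed sampling radius in texture space

float occlusion(vec2 uv, float height, vec3 normal){
    float occ = 0.0;
    for(int i=1; i<33; i++){
        float s = float(i)/32.0;
        float a = sqrt(s*512.0);
        float b = sqrt(s);
        vec2 offset = vec2(sin(a), cos(a))*b*radius;
        float neighbor = texture2D(heightfield, uv + offset).r;
        // dir is the direction to the neighbor ("vec" above); its dot product
        // with the normal is cos(alpha)
        vec3 dir = normalize(vec3(offset.x, neighbor - height, offset.y));
        // neighbors above the tangent plane occlude this cell
        occ += max(dot(normal, dir), 0.0);
    }
    return 1.0 - occ/32.0;
}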

Lighting

In order to light the scene I use spherical harmonics and a fixed set of coefficients. The algorithm and coefficients are from the OpenGL Shading Language book (the Orange Book); see the code examples in Chapter 13.
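For reference, the irradiance evaluation from that chapter looks roughly like the sketch below. The C1 to C5 constants are the standard ones from Ramamoorthi and Hanrahan's irradiance formula, which the book uses; the nine L coefficients are left as uniforms here because the fixed set used in the demo is not reproduced:

const float C1 = 0.429043;
const float C2 = 0.511664;
const float C3 = 0.743125;
const float C4 = 0.886227;
const float C5 = 0.247708;
// the nine spherical harmonic coefficients of the environment
uniform vec3 L00, L1m1, L10, L11, L2m2, L2m1, L20, L21, L22;

vec3 shIrradiance(vec3 n){
    return C1*L22*(n.x*n.x - n.y*n.y)
         + C3*L20*n.z*n.z
         + C4*L00
         - C5*L20
         + 2.0*C1*(L2m2*n.x*n.y + L21*n.x*n.z + L2m1*n.y*n.z)
         + 2.0*C2*(L11*n.x + L1m1*n.y + L10*n.z);
}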

Shadows

Shadowmapping is used to produce the shadows. Because I'm dealing with a heightfield, I do not evaluate the shadows in screen-space but on the heightfield. This makes blurring them easier.

The scene is rendered from the light's point of view and the depth of each fragment is stored. Then the heightfield is processed and each cell's position is compared with the depth seen from the light.

This gives very hard and pixelated shadows. To soften this somewhat, a simple averaging 3x3 kernel is used: the depth comparison is done for each of the nine neighboring texels in the light's depth map and the result is divided by 9.
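A rough sketch of that averaging step, assuming a 512x512 depth map and ignoring the depth bias a real implementation would need; the names are placeholders:

uniform sampler2D lightDepth;   // depth as seen from the light

float shadowFactor(vec2 shadowUV, float fragDepth){
    float lit = 0.0;
    for(int x=-1; x<=1; x++){
        for(int y=-1; y<=1; y++){
            vec2 offset = vec2(float(x), float(y))/512.0;
            float d = texture2D(lightDepth, shadowUV + offset).r;
            // count the neighbor as lit if the fragment is not behind it
            lit += fragDepth <= d ? 1.0 : 0.0;
        }
    }
    return lit/9.0;
}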

As a third step, the resulting shadowmap is blurred with a fast 5x5 gaussian filter. Descriptions of the algorithm can be found here and there and elsewhere.
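Such a blur is typically run as two separable 1x5 passes with binomial weights 1-4-6-4-1. A sketch of one pass is below; the uniform names and the 512 texel size are assumptions:

uniform sampler2D shadowmap;
uniform vec2 direction;   // e.g. vec2(1.0/512.0, 0.0) horizontally, vec2(0.0, 1.0/512.0) vertically
varying vec2 uv;

void main(){
    // weighted sum of five taps along the blur direction
    float result =
        texture2D(shadowmap, uv - 2.0*direction).r * 1.0 +
        texture2D(shadowmap, uv - 1.0*direction).r * 4.0 +
        texture2D(shadowmap, uv                ).r * 6.0 +
        texture2D(shadowmap, uv + 1.0*direction).r * 4.0 +
        texture2D(shadowmap, uv + 2.0*direction).r * 1.0;
    gl_FragColor = vec4(vec3(result/16.0), 1.0);
}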

Water Heights

In order to evaluate the water there are two steps. First the water height is diffused by a volume-preserving blur. Then the vertical momentum is evaluated using a verlet method.

Diffusion

In order to diffuse water heights I exchange heights between a cell and two neighbors at a time, clamped by the actual availability of water.

The difference is divided by two because it is assumed that the neighboring cell will add half the amount that got subtracted. The limits are also divided by two to allow for the fact that half the available water could get transferred to the other of the two neighbors in the same evaluation.
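A rough sketch of one such diffusion pass over the left/right neighbor pair; the texture layout, the channel assignment and the cell size are assumptions and not the demo's actual code:

uniform sampler2D terrain;   // .r = ground height, .g = water height (assumed layout)
varying vec2 uv;
const float cell = 1.0/512.0;

float exchange(vec4 here, vec4 there){
    // flow is proportional to the difference in total surface height
    float diff = (there.r + there.g) - (here.r + here.g);
    // never move more than half of what either cell holds, because the other
    // neighbor of the pair can move the other half in the same pass
    return clamp(diff*0.5, -here.g*0.5, there.g*0.5);
}

void main(){
    vec4 here  = texture2D(terrain, uv);
    vec4 left  = texture2D(terrain, uv - vec2(cell, 0.0));
    vec4 right = texture2D(terrain, uv + vec2(cell, 0.0));
    float water = here.g + exchange(here, left) + exchange(here, right);
    gl_FragColor = vec4(here.r, water, here.ba);
}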

Momentum

The idea behind this algorithm comes from this page. It regards the last state of the water and derives the vertical velocity from the difference between the current height and the last height of its neighbors. I apply a similar mass-conserving scheme as for the diffusion.
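For illustration, the textbook version of that verlet-style update looks roughly like this; it is the plain scheme from the linked page rather than the mass-conserving variant used in the demo, and the texture names, channel layout and damping factor are assumptions:

uniform sampler2D current;    // current heights, water in the green channel (assumed)
uniform sampler2D previous;   // heights from the previous step
varying vec2 uv;
const float cell = 1.0/512.0;

void main(){
    vec4 here   = texture2D(current, uv);
    float left  = texture2D(current, uv - vec2(cell, 0.0)).g;
    float right = texture2D(current, uv + vec2(cell, 0.0)).g;
    float up    = texture2D(current, uv - vec2(0.0, cell)).g;
    float down  = texture2D(current, uv + vec2(0.0, cell)).g;
    float last  = texture2D(previous, uv).g;
    // the vertical velocity is implicit in the difference between the current
    // neighborhood and the last height; 0.99 is an assumed damping factor
    float next = ((left + right + up + down)*0.5 - last) * 0.99;
    gl_FragColor = vec4(here.r, next, here.ba);
}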

Water Velocities

When water heights change, I can sum up the displaced volume of water and the direction of displacement. Multiplying the direction by the displaced volume and dividing by the total volume of water in a cell gives a velocity vector field for the water.
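A minimal sketch of how such a vector field could be built from the per-neighbor exchanges, reusing the hypothetical exchange() from the diffusion sketch above; the sign convention and the normalization are assumptions:

uniform sampler2D terrain;   // .r = ground height, .g = water height (assumed layout)
varying vec2 uv;
const float cell = 1.0/512.0;

float exchange(vec4 here, vec4 there){
    float diff = (there.r + there.g) - (here.r + here.g);
    return clamp(diff*0.5, -here.g*0.5, there.g*0.5);
}

void main(){
    vec4 here  = texture2D(terrain, uv);
    vec4 left  = texture2D(terrain, uv - vec2(cell, 0.0));
    vec4 right = texture2D(terrain, uv + vec2(cell, 0.0));
    vec4 up    = texture2D(terrain, uv - vec2(0.0, cell));
    vec4 down  = texture2D(terrain, uv + vec2(0.0, cell));
    // weight each direction by the volume moving through it; water arriving
    // from the left (or leaving to the right) moves in +x, and so on
    vec2 flow = vec2(exchange(here, left) - exchange(here, right),
                     exchange(here, up)   - exchange(here, down));
    // divide by the total water in the cell to get a velocity
    gl_FragColor = vec4(flow / max(here.g, 0.00001), 0.0, 1.0);
}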

Erosion

Once the water velocities are known, they can be used to modify the heightmap. First rock is converted to soil depending on the flow speed of the water over it, and then the soil is transported in the direction of the water flow.
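A hedged sketch of those two steps; the channel layout, the dissolve rate and the semi-Lagrangian style transport are assumptions standing in for the demo's actual erosion pass:

uniform sampler2D terrain;    // .r = rock, .g = water, .b = soil (assumed layout)
uniform sampler2D velocity;   // water flow vectors from the previous step
varying vec2 uv;
const float cell = 1.0/512.0;

void main(){
    vec4 here = texture2D(terrain, uv);
    vec2 flow = texture2D(velocity, uv).xy;
    // 1) convert rock to soil proportionally to the local flow speed
    float dissolved = min(here.r, length(flow)*0.01);   // 0.01 is an assumed rate
    // 2) transport soil with the flow: take the soil that was upstream
    float soil = texture2D(terrain, uv - flow*cell).b;
    gl_FragColor = vec4(here.r - dissolved, here.g, soil + dissolved, 1.0);
}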

Visualizing Water Flow

Water, as we know it from the real world, makes ripples as it flows. My algorithm for water is not good at this kind of effect, and it would be difficult anyway because the ripples would be smaller than a cell in the heightfield.

As a solution I divide the water up into 128x128 regions (each being 4x4 cells big, since the water heightmap is 512x512 pixels). Each region gets a texture position (stored in a position map 128x128 pixels in size). This position map is continuously updated from the water velocities.

The water positions are then used to index detail normals for the water surface and bi-linearly blend them with the neighboring region.
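A minimal sketch of such a position-map update, assuming the velocity field can simply be sampled at the region centers; the names and the time step are placeholders:

uniform sampler2D positions;   // 128x128 map of texture positions, one per region
uniform sampler2D velocity;    // water velocity field
varying vec2 uv;

void main(){
    vec2 pos = texture2D(positions, uv).xy;
    vec2 vel = texture2D(velocity, uv).xy;
    // drag each region's texture position along with the water flow;
    // 0.016 stands in for the frame time step
    gl_FragColor = vec4(pos - vel*0.016, 0.0, 1.0);
}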

Conclusion

Even a very simplified model of hydraulic erosion gives very pleasing results. I'm confident that with better algorithms for water flow and a more varied erosion model, many more interesting effects could be achieved.

Further Work