Minecraft Like Rendering Experiments in OpenGL 4

Dec. 09, 2010
comments

With all the excitement about Minecraft, I wondered what nice rendering effects could be achieved in such a restricted environment. I decided to pursue that avenue of research, and a variety of techniques are presented in the following article.

I have also written about tessellation shading in OpenGL 4, which is where I learned a lot about the modern way of doing things. All of the following explanations target OpenGL 4 because I think it is a bit simpler; however, most of it could be done with older versions as well.

A note on comments: I migrated to Disqus for the comment system. Sorry that the old comments are gone, but I think it is better than my homebrewed comment system. Let me know if you have issues with it.

A Russian translation of this article is provided by Grigory of gameinstitute.ru.

The Result

The aim was a procedurally generated level of a floating rock with caves. Below you will find a video and a couple of screenshots.

(many of you asked: the music is "Intimate Moment" by Luke Richards from the YouTube AudioSwap)

Overview of the floating rock

A cave entrance

Inside a cave with some lava

Closeup of a stone wall with some gold

How to read

The following explanations are outlines of the techniques. If you want to understand them in detail you should read the source; I frequently reference it when explaining an effect.

Setup

The application uses GLSL shaders, Vertex Buffer Objects and Array Textures. Rendering of the level is done with a single glDrawArrays call while the shaders, textures and buffers are bound.
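
For illustration, the draw itself might look like this from C (a sketch; the actual application issues the equivalent calls through pyglet, and program, material, vertices and vertex_count are placeholder names):

/* shaders, textures and buffers bound, then one call renders the level;
   GL_QUADS requires the compatibility profile */
glUseProgram(program);
glBindTexture(GL_TEXTURE_2D_ARRAY, material);
glBindBuffer(GL_ARRAY_BUFFER, vertices);
glDrawArrays(GL_QUADS, 0, vertex_count);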

Pyglet and gletools development versions are used as dependencies. Some functionality is written in C for performance reasons; everything else is written in Python.

You can download the source and experiment with it. The C extension is compiled with the make.sh shell script; if you are not on Linux, you might want to port it to your compiler.

Volume Data

The level data is stored in a one dimensional array of bytes. It represents a volume of the same width, height and depth. Indexing this volume is done with the formula index = x + y*size + z*size*size. The following C macros facilitate access:

#define get(x, y, z)\
data[\
    clamp(x, size)+\
    clamp(y, size)*size+\
    clamp(z, size)*size*size\
]

#define put(x, y, z, value)\
data[\
    (x)+\
    (y)*size+\
    (z)*size*size\
] = value

Other data like occlusion and gathered light will be stored in the same fashion, but with different data types.

Tessellating Quads

In order to build a representation of the volume data as OpenGL quads, it is necessary to tessellate it. For each cell in the volume, test if it is empty and if it has filled neighbors. Generate a quad for each filled neighbor, offset by 0.5 towards that neighbor. In the illustration below, each cell is a circle, filled if it contains something.
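
In C this might look roughly like the following (a sketch assuming the get() macro from above; emit_quad and the face tags LEFT, RIGHT and so on are hypothetical helpers that append the four vertices of one face to the vertex buffer):

void tessellate(int size, byte* data){
    int x, y, z;
    for(x=0; x<size; x++)
    for(y=0; y<size; y++)
    for(z=0; z<size; z++){
        if(get(x, y, z)) continue; /* only empty cells spawn faces */
        /* one quad per filled neighbor, offset by 0.5 towards it */
        if(get(x-1, y, z)) emit_quad(x-0.5, y, z, LEFT);
        if(get(x+1, y, z)) emit_quad(x+0.5, y, z, RIGHT);
        if(get(x, y-1, z)) emit_quad(x, y-0.5, z, BOTTOM);
        if(get(x, y+1, z)) emit_quad(x, y+0.5, z, TOP);
        if(get(x, y, z-1)) emit_quad(x, y, z-0.5, BACK);
        if(get(x, y, z+1)) emit_quad(x, y, z+0.5, FRONT);
    }
}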

Filling the volume with some random data as well as setting glClearColor

glClearColor(0.8, 0.8, 0.8, 1.0);

and using a simple shader that assigns a dark grey

vertex:
    in vec4 position;
    uniform mat4 mvp;

    void main(){
        gl_Position = mvp * position;
    }
fragment:
    out vec3 fragment;
    void main(){
        fragment = vec3(0.2);
    }

it looks like this

Pretty boring, so what we need is some lighting. In order to get lighting, we need some normals.

Face Normals

The face normals point towards the centers of the empty cells, away from the filled cells; that is exactly opposite to the direction we offset the faces in.

A buffer is filled with the normal values and passed as a vertex attribute to the shader. To see if we did the normals right, a geometry shader (see shaders/normals.shader) is quite handy; it receives the normal values and draws a line for each face.

Spherical Harmonics Environment Lighting

The OpenGL Shading Language has a nice writeup in chapter 12 about how to use spherical harmonics for lighting. Their website hosts the source for the demos (see glsldemo-src/shaders/OrangeBook/CH-12-shLight).

I wrote a function that allows me to lookup the environment light (see shaders/spherical_harmonics.shader) like this:

fragment = sh_light(data.normal, beach) * 0.5;
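
For reference, the function follows the standard nine-coefficient irradiance formula from that chapter. Sketched here in C (an assumption about the layout: the per-environment constants such as beach are supplied per channel in L, ordered L00, L1m1, L10, L11, L2m2, L2m1, L20, L21, L22):

void sh_light(const float n[3], const float L[9][3], float out[3]){
    const float c1 = 0.429043, c2 = 0.511664, c3 = 0.743125,
                c4 = 0.886227, c5 = 0.247708;
    float x = n[0], y = n[1], z = n[2];
    int i;
    for(i=0; i<3; i++){ /* red, green, blue */
        out[i] = c1*L[8][i]*(x*x - y*y) + c3*L[6][i]*z*z
               + c4*L[0][i] - c5*L[6][i]
               + 2.0*c1*(L[4][i]*x*y + L[7][i]*x*z + L[5][i]*y*z)
               + 2.0*c2*(L[3][i]*x + L[1][i]*y + L[2][i]*z);
    }
}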

Applying this to our random cube we get:

Simplex Noise

It is time to fill the cube with more interesting data. In order to do this, we are going to use Ken Perlin's Simplex Noise. I have translated the code for 3d noise to C (see ext/simplex.c).

Setting the volume to 1 wherever the noise value exceeds a threshold, we get:
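
A sketch of such a fill in C (assuming the noise function from ext/simplex.c and the put() macro; the 0.1 coordinate scale and the threshold are illustrative values):

void fill_noise(int size, byte* data){
    int x, y, z;
    for(x=0; x<size; x++)
    for(y=0; y<size; y++)
    for(z=0; z<size; z++){
        float value = noise(x*0.1, y*0.1, z*0.1);
        put(x, y, z, value > 0.0 ? 1 : 0);
    }
}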

Ambient Occlusion

The lighting from the spherical harmonics is OK. However, there is very little to suggest feature depth. One great way to add that is ambient occlusion. The idea is to sample each cell and see how many rays escape from it without colliding.

We could store a single occlusion value per cell, but we can do better. Each cell has 6 faces, and for each face only half the rays can possibly contribute due to Lambert's cosine law. The solution is to store 6 occlusion values per cell, one for each face.

typedef struct{
        float left, right, top, bottom, front, back;
} Cell;

The value for each face is the sum of the cosines of the angles between the face normal and the ray normal for each ray that escaped, divided by the sum of all cosines for that face.

However, since the face normals are always axis aligned you can just use the absolute value of the respective ray normal.

In order to quickly traverse the volume data, we store each ray as integer offsets, and perform the volume/cell intersection just once for each ray. We also store the depth at which the ray went into the new cell for later use. The ray also stores its contribution to each face, so we do not have to compute it every time we need it.

typedef struct{
        float depth;
        int x, y, z;
} Offset;

typedef struct{
        float left, right, top, bottom, front, back;
        Offset points[point_count];
} Ray;

The Sample stores all rays with which we are going to test each cell. It also stores the sum of all ray contributions per face.

typedef struct{
        float left, right, top, bottom, front, back;
        Ray rays[ray_count];
} Sample;

We need to distribute our test rays uniformly over the unit sphere. This blog entry has a nice writeup of the Golden Spiral method of Sphere Picking.
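
A sketch of that method in C (ray_count directions land evenly spread over the unit sphere):

#include <math.h>

void golden_spiral(int ray_count, float dirs[][3]){
    /* successive points are rotated by the golden angle and stepped
       evenly in height, which spreads them uniformly over the sphere */
    float golden_angle = M_PI * (3.0 - sqrt(5.0));
    int i;
    for(i=0; i<ray_count; i++){
        float y = 1.0 - 2.0*(i+0.5)/ray_count;
        float r = sqrt(1.0 - y*y);
        float phi = golden_angle * i;
        dirs[i][0] = cos(phi) * r;
        dirs[i][1] = y;
        dirs[i][2] = sin(phi) * r;
    }
}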

Generating the rays according to that algorithm and filling our volume with the ray positions, it looks like this:

If we now perform an occlusion step for each volume cell to see how much of the environment is occluded, and factor this into our tessellation by filling another buffer with per-face occlusion values, we get a fairly nice result.
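
For concreteness, the per-face test for one cell might look like this in C (a sketch assuming the structs above and the get() macro; shown for the top face, the other five faces work the same way, and the actual implementation in the source may differ):

float top_visibility(Sample* sample, byte* data, int size,
                     int x, int y, int z){
    float escaped = 0.0;
    int i, j;
    for(i=0; i<ray_count; i++){
        Ray* ray = &sample->rays[i];
        int blocked = 0;
        for(j=0; j<point_count; j++){
            Offset* p = &ray->points[j];
            if(get(x+p->x, y+p->y, z+p->z)){
                blocked = 1;
                break;
            }
        }
        if(!blocked){
            escaped += ray->top; /* precomputed cosine weight */
        }
    }
    /* fraction of cosine-weighted rays that escape;
       the occlusion is one minus this */
    return escaped / sample->top;
}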

Gamma Correct Rendering

Color and light values that you set in your program should be in linear space; that is how energy/color calculations make sense. However, monitors have gamma (γ) curves, due to the fact that they have less contrast in low luminosity colors.

Unfortunately most values that you read out of textures and such are already gamma corrected. So you have to degamma values you read into your application, like color = pow(color, vec3(gamma)), and just before you output them on screen do color = pow(color, vec3(1.0/gamma)).

vec3 gamma(vec3 color){
    // a gamma of 2.0 as a cheap approximation
    return pow(color, vec3(1.0/2.0));
}
void main(){
    fragment = gamma(color);
}

Gamma values are usually either 1.8 or 2.2 depending on the monitor, but since there is no reliable way to get that information, you might want to make gamma correction a user setting.

GPU Gems 3 Chapter 24 "The Importance of Being Linear" does a great job of explaining gamma. Ignore gamma at your own peril, it does make quite a difference.

More Interesting Noise

If we scale the terrain up a bit to a size of 128 cells, it does not look like a floating rock.

In order to get more interesting noise multiple octaves of simplex noise can be added together. I do it like this:

float simplex_noise(int octaves, float x, float y, float z){
    float value = 0.0;
    int i;
    for(i=0; i<octaves; i++){
        /* each octave samples the noise at double the previous frequency */
        value += noise(
            x*pow(2, i),
            y*pow(2, i),
            z*pow(2, i)
        );
    }
    return value;
}

Since I want to get a roughly spherical floating rock of sorts, I multiply the noise with a falloff based on the distance from the center. I also want the rock to be flatter on top than on the bottom, hence a second multiplication factor is a gradient in the y direction.

Combining these together and stretching y for noise while compressing x and z a bit, we get something like a floating rock.

Excavating caves with another instance of noise, offset a little, also makes it more interesting.

The complete rock code looks like this:

void floating_rock(int size, byte* data){
    float caves, center_falloff, plateau_falloff, density;
    foreach_xyz(1, size-1)
        if(yf <= 0.8){
            plateau_falloff = 1.0;
        }
        else if(0.8 < yf && yf < 0.9){
            plateau_falloff = 1.0-(yf-0.8)*10.0;
        }
        else{
            plateau_falloff = 0.0;
        }

        center_falloff = 0.1/(
            pow((xf-0.5)*1.5, 2) +
            pow((yf-1.0)*0.8, 2) +
            pow((zf-0.5)*1.5, 2)
        );
        caves = pow(simplex_noise(1, xf*5, yf*5, zf*5), 3);
        density = (
            simplex_noise(5, xf, yf*0.5, zf) *
            center_falloff *
            plateau_falloff
        );
        density *= pow(
            noise((xf+1)*3.0, (yf+1)*3.0, (zf+1)*3.0)+0.4, 1.8
        );
        if(caves<0.5){
            density = 0;
        }
        put(x, y, z, density>3.1 ? ROCK : 0);
    endfor
}

Interior Ambient

If we peek into one of the caves of the floating rock, it is pretty dark, and you cannot see any detail in the dark areas.

To solve this we pick another set of spherical harmonics constants with a different hue. Then we multiply that color by a small value and use the occlusion factor to mix between the outside (bright) and inside (dark) ambient.

vec3 outside = sh_light(surface_normal, beach);
vec3 inside = sh_light(surface_normal, groove)*0.04;
vec3 ambient = mix(outside, inside, data.occlusion);
fragment = gamma(ambient);

Problem solved.

Texturing

To get a bit more detail into the scene, textures would be nice. To get textures, each face needs texture coordinates. These are also put into a buffer when tessellating the volume.

Since we need more than one material, each texture coordinate is a 3 component vector. The first two are the texture coordinates, and the third is which texture to use.

OpenGL Array Textures are a good way to communicate multiple images in one go to the graphics card. They work like 3d textures, but the z component does not blend and is an integer (0 for the first texture, 1 for the second, 2 for the third and so on).
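
Creating one is straightforward; sketched in C (the application does the equivalent through pyglet/gletools, and width, height, layer_count and pixels are placeholders):

GLuint texture;
int i;
glGenTextures(1, &texture);
glBindTexture(GL_TEXTURE_2D_ARRAY, texture);
/* allocate all layers at once, then upload each image into its layer */
glTexImage3D(GL_TEXTURE_2D_ARRAY, 0, GL_RGBA8,
             width, height, layer_count, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);
for(i=0; i<layer_count; i++){
    glTexSubImage3D(GL_TEXTURE_2D_ARRAY, 0,
                    0, 0, i, /* the zoffset selects the layer */
                    width, height, 1,
                    GL_RGBA, GL_UNSIGNED_BYTE, pixels[i]);
}
glTexParameteri(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_MAG_FILTER, GL_LINEAR);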

I have created a couple of textures with Lithosphere which I'm putting into the Array Texture.

As a convention I'm going to use the following values in the volume for air, rock, gems, dirt, grass and lava:

const int AIR = 0;
const int ROCK = 1;
const int GEMS = 2;
const int DIRT = 3;
const int GRASS = 4;
const int LAVA = 5;

Tessellation is going to put the texcoords into its own texcoord buffer that is then used by the shader as a varying. In the shader we will have to sample the material color by the texcoords and multiply it with the ambient light.

vertex:
    in float occlusion;
    in vec3 normal, texcoord;
    in vec4 position;
    uniform mat4 mvp;

    out Data{
        vec3 texcoord, normal;
        float occlusion;
    } data;

    void main(void){
        data.occlusion = occlusion;
        data.texcoord = texcoord;
        data.normal = normal;
        gl_Position = mvp * position;
    }

fragment:
    import: spherical_harmonics
    import: util

    uniform sampler2DArray material;

    in Data{
        vec3 texcoord, normal;
        float occlusion;
    } data;

    out vec3 fragment;

    void main(){
        vec3 material_color = texture(
            material, data.texcoord
        ).rgb;

        vec3 outside = sh_light(data.normal, beach);
        vec3 inside = sh_light(data.normal, groove)*0.004;
        vec3 ambient = mix(outside, inside, data.occlusion);

        vec3 color = material_color*ambient;
        fragment = gamma(color);
    }

That is looking OK, so I'm introducing a few simple rules to vary the material.

  • If rock has air on top it is dirt (see function cake_dirt in ext/ext.c; sketched below this list)
  • The top face of dirt is grass (see function tessellate in ext/ext.c)
  • If there is rock and noise above a threshold of 3.6, add gems (see function add_gems in ext/ext.c)
  • Add some lava at the center according to a noise threshold of 3.2 if the distance from center is smaller than 12 cubes (see function add_lava in ext/ext.c)
  • If some cube has no neighbors at all, set it to air (see function delete_solitary in ext/ext.c)
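
A sketch of the first rule in C (in the spirit of cake_dirt; the actual implementation in ext/ext.c may differ):

void cake_dirt(int size, byte* data){
    int x, y, z;
    for(x=0; x<size; x++)
    for(y=0; y<size-1; y++)
    for(z=0; z<size; z++){
        /* rock with air directly above it becomes dirt */
        if(get(x, y, z) == ROCK && get(x, y+1, z) == AIR){
            put(x, y, z, DIRT);
        }
    }
}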

Observer Light

Since caves can be quite dark, we want to add another light source. I call it a "torch"; it is as if the player is carrying some light source around with him.

We will need a couple of things to do this right:

  • Distance (depth) of a fragment in eye space
  • Face normal in eye space
  • Eye normal for each fragment in eye space

Once we have those, we can compute the cosine law term between the face normal and the direction to the observer and divide it by the square of the distance (light falloff).

Depth

In order to get depth we need to pass the modelview (not the modelview projection) matrix to our shader, transform the position by it, and write the length of that vector into our data in the vertex shader.

in vec4 position;
uniform mat4 modelview;
uniform mat4 mvp;
out Data{
    float depth;
} data;

void main(){
    data.depth = length((modelview * position).xyz);
    gl_Position = mvp * position; // still required, as before
}

Face Normal in Eye Space

To get the face normal into eye space we need the normalmatrix. The normalmatrix is a 3x3 matrix derived from the modelview matrix. Since I apply no scale to my modelview, I can just take the first 3 rows/columns out of my modelview matrix.

mat3 normalmatrix = mat3(modelview);

However you should pass this in as a uniform.

You can get the eye face normal by:

vec3 eye_face_normal = normalmatrix * data.normal;

Eye Normal

In order to get the eye normal vector from gl_FragCoord (which is in screen space) you need to

  1. transform that coordinate to device space (divide by screen size, subtract 0.5 and multiply by 2)
  2. multiply by the inverse projection matrix

The formulas for how to construct both the projection and inverse projection matrix can be found in the Redbook (OpenGL programming guide) Appendix F.

I have encapsulated the method in the utility function "get_eye_normal":

vec3 get_eye_normal(vec2 viewport, mat4 inverse_projection){
    vec4 device_normal = vec4(
        ((gl_FragCoord.xy/viewport)-0.5)*2.0, 0.0, 1.0
    );
    return normalize(
        (inverse_projection * device_normal).xyz
    );
}

As an observer light color we pick some warm yellow (as if from a bright torch). The code below supplements our ambient light.

vec3 eye_face_normal = normalmatrix * data.normal;
vec3 eye_normal = get_eye_normal(
    viewport, inverse_projection
);
vec3 torch_color = vec3(1.0, 0.83, 0.42);
float intensity = 2.0/pow(data.depth, 2);
float lambert_term = abs(
    min(0, dot(eye_face_normal, eye_normal))
);
vec3 torch = lambert_term * intensity * torch_color;

vec3 color = material_color*ambient + material_color*torch;

And it looks like this

Normalmapping

A flat texture fill is a bit boring, and one solution to this is normalmapping. The idea is to get the normal for each fragment from a texture. In order to do this, we need those textures first. I used Lithosphere to create the normalmaps alongside the texture maps and they look like this.

Because normalmaps always use green as their up direction, I cannot simply slap them onto each face and expect it to work. I need to project the normal from the normalmap into face normal space.

To do this we need two more vectors besides the face normal: one per face, perpendicular to the face normal and pointing in the x direction of the texcoords, and another pointing in the y direction. I call these vectors s and t.
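
Since all faces are axis aligned, s and t can be picked from a fixed table per face. Illustrative values (the actual choice just has to match how the texcoords are generated):

/* normal, s and t per face; s follows the texcoord x direction,
   t the texcoord y direction */
static const float frames[6][3][3] = {
    /*   normal          s             t      */
    {{-1, 0, 0},  { 0, 0, 1},  { 0, 1, 0}}, /* left   */
    {{ 1, 0, 0},  { 0, 0,-1},  { 0, 1, 0}}, /* right  */
    {{ 0, 1, 0},  { 1, 0, 0},  { 0, 0, 1}}, /* top    */
    {{ 0,-1, 0},  { 1, 0, 0},  { 0, 0,-1}}, /* bottom */
    {{ 0, 0, 1},  { 1, 0, 0},  { 0, 1, 0}}, /* front  */
    {{ 0, 0,-1},  {-1, 0, 0},  { 0, 1, 0}}, /* back   */
};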

These face normal perpendicular vectors are stored as vertex attributes at 3 floats each (like the face normal). In order to rotate a given vector from the normalmap to face normal space, we can then compute n_face = mat3(s, normal, t) * n_map.

We receive these vectors in the vertex shader as attributes, build the matrix from them, and store it for the fragment shader to use.

in vec3 normal, s, t;
out Data{
    mat3 matfn;
} data;

void main(){
    data.matfn = transpose(mat3(s, normal, t));
}

In the fragment shader we use that matrix to transform the map normal into the world space and eye space fragment normals.

vec3 map_normal = normalize(
    texture(normalmap, data.texcoord).rgb
);
vec3 frag_normal = normalize(
    map_normal * data.matfn
);
vec3 eye_frag_normal = normalize(
    normalmatrix * frag_normal
);

Now we can simply replace every usage of data.normal and eye_face_normal with frag_normal and eye_frag_normal, and we have nice normalmapping.

Specular mapping

I have put some gold onto stones, but it looks more like some yellow fungus.

In order to solve this we need specular mapping. Another set of textures is created that gives the specularity.

This factor is used in lighting to modify the Lambert term.

float frag_specular = texture(specularmap, data.texcoord).r; 
vec3 torch_color = vec3(1.0, 0.83, 0.42);
float intensity = 2.0/pow(data.depth, 2);
float lambert = abs(min(0, dot(eye_frag_normal, eye_normal)));
float specular = pow(lambert, 1+frag_specular*8);
vec3 torch = specular * intensity * torch_color; 
float highlight = pow(specular, 4) * intensity * frag_specular;

vec3 color = (
    material_color*ambient +
    material_color*torch +
    highlight*torch_color
);

This gives us a nice specular color shading and a light colored highlight on the gold while not modifying rock etc.

Light Gathering

I also put some lava at the center of the rock. The problem is, lava is supposed to emit light, but this one of course does not.

The solution is similar to ambient occlusion. The idea is to accumulate an influx of light per cell for all rays that hit a lava block.

Each cell is tested with the sampling method used for ambient occlusion and the light is collected (see the function gather in ext/ext.c).
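
In C, such a gathering step might look like this (a sketch assuming the ambient occlusion structs and the get() macro; the falloff is illustrative, the actual code is in ext/ext.c):

float gather_cell(Sample* sample, byte* data, int size,
                  int x, int y, int z){
    float light = 0.0;
    int i, j;
    for(i=0; i<ray_count; i++){
        Ray* ray = &sample->rays[i];
        for(j=0; j<point_count; j++){
            Offset* p = &ray->points[j];
            byte cell = get(x+p->x, y+p->y, z+p->z);
            if(cell == LAVA){
                /* attenuate by the depth at which the ray got there */
                light += 1.0/(1.0 + p->depth*p->depth);
                break;
            }
            if(cell != AIR) break; /* ray is blocked by a solid cell */
        }
    }
    return light/ray_count;
}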

Then during tessellation a buffer is filled with this light information, which is used during rendering. The vertex shader passes the light to the fragment shader.

data.light = light;

And the fragment shader multiplies the material color with the light color.

vec3 color = (
    material_color * ambient +
    material_color * torch +
    highlight * torch_color +
    material_color * data.light
);

Which gives this result

Atmosphere

The color computed so far traverses the atmosphere in order to reach the observer. The atmosphere blocks some light, but it also introduces its own color. This effect is missing below:

A simple way to compute the fog effect is presented in this article. You take f = e^(-(d*D)^2), where e is Euler's constant, d is the depth and D is your fog density. Then you use f to mix between the computed color and the fog color.

vec3 fog(vec3 color, vec3 fcolor, float depth, float density){
    const float e = 2.71828182845904523536028747135266249;
    float f = pow(e, -pow(depth*density, 2));
    return mix(fcolor, color, f);
}       

// and in the fragment shader do
vec3 at_observer = fog(color, vec3(0.8), data.depth, 0.005);
fragment = gamma(at_observer);

And this is the result

Parting words

Hey, you made it down here! Phew, this has been quite a long article. It is surprising how many details go into making some simple graphics.

I hope you enjoyed this tour of common realtime rendering effects and that the explanations were useful. You can download the source and toy with it if you want.