Tuesday, June 19, 2012

Let there be light

I remember, many years ago when first learning graphics concepts, working through 'Programming Role Playing Games with DirectX'. The CD included a model of a castle with a moat, and I thought it looked terrible. The textures were gaudy, the resolution was low, and it was quite jarring and unpleasant to look at. In a later chapter, an introduction to lights and shading, the same model was reused. The difference in results was unbelievable. With the correct shading, depth perception was easier and the scene, despite its initial poor quality, took on a much more natural and realistic feel.

Here is a comparison of the same mesh, with the same texture, rendered from the same point of view with and without shading.


The picture above, and the video below, both use a simple diffuse lighting calculation to shade the side of the earth facing away from the sun. This can be implemented as per-vertex lighting or per-pixel lighting. Each method has pros and cons, which I shall briefly discuss here. If I get anything wrong or leave something out, please comment.

(Instead of using the diffuse value to shade the back side of the earth, I use it as a ratio to blend two texture samples: color = night * (1 - diffuse) + day * diffuse. Since the night texture is already colored to show shade and darkness, the result is the same, but I also get lights from civilization on the dark side.)

Per Vertex Lighting
  • Diffuse value is calculated in the vertex shader.
  • Faster, since the diffuse value is only calculated once per vertex rather than once per pixel; the three per-vertex values are then interpolated across the face of each polygon.
  • Since the diffuse value is only exact at the vertices and interpolated everywhere in between, the value is not always correct and some shade popping occurs. Check out the video to see this.
Vertex Shader

attribute vec4 position;
attribute vec4 normal;
attribute vec2 uv0;

varying vec2 _uv0;
varying float _diffuse;

uniform mat4 modelViewProjectionMatrix;
uniform mat3 normalMatrix; //inverse transpose of the model matrix, supplied by the application
uniform vec4 normalizedSunPosition;

void main()
{
    //Texture coordinates are needed in fragment shader
    _uv0 = uv0;

    //Calculate diffuse value. The normal is transformed by the normal matrix,
    //not the full MVP matrix - the projection would distort it.
    vec3 nor = normalize(normalMatrix * normal.xyz);
    _diffuse = max(dot(nor, normalizedSunPosition.xyz), 0.0);

    //Translate vertex
    gl_Position = modelViewProjectionMatrix * position;
}

Fragment Shader

varying mediump vec2 _uv0; //mediump - lowp does not have enough precision for texture coordinates
varying lowp float _diffuse;

uniform sampler2D dayTexture;
uniform sampler2D nightTexture;

void main()
{
    //Blend between the night and day textures, using the diffuse value as the ratio
    gl_FragColor = mix(texture2D(nightTexture, _uv0), texture2D(dayTexture, _uv0), _diffuse);
}


Per Pixel Lighting
  • Diffuse value is calculated in the fragment shader.
  • Potentially slower, as there are generally a lot more pixels rendered than vertices, and therefore a lot more diffuse calculations.
  • More realistic, smooth results.
Vertex Shader

attribute vec4 position;
attribute vec4 normal;
attribute vec2 uv0;

varying vec2 _uv0;
varying vec3 _normal;

uniform mat4 modelViewProjectionMatrix;
uniform mat3 normalMatrix; //inverse transpose of the model matrix, supplied by the application

void main()
{
    _uv0 = uv0;
    //The transformed normal is interpolated across the polygon and used per pixel
    _normal = normalize(normalMatrix * normal.xyz);
    gl_Position = modelViewProjectionMatrix * position;
}

Fragment Shader

varying mediump vec2 _uv0; //mediump - lowp does not have enough precision for texture coordinates
varying mediump vec3 _normal;

uniform sampler2D dayTexture;
uniform sampler2D nightTexture;

uniform mediump vec4 normalizedSunPosition;

void main()
{
    //Renormalize - interpolating across the polygon shortens the normal
    mediump float _diffuse = max(dot(normalize(_normal), normalizedSunPosition.xyz), 0.0);

    gl_FragColor = mix(texture2D(nightTexture, _uv0), texture2D(dayTexture, _uv0), _diffuse);
}

Sunday, June 17, 2012

Let's bring some atmosphere to the party.

Another minor update. Instead of investing a more significant amount of time in adding atmospheric scattering, I decided to simply add a cloud layer to the planet. The cloud texture came as part of the texture set I am using to render the earth.


Initially I bound the cloud textures (the image data and the transparency map) to texture units 1 and 2 (the earth texture is in unit 0) and attempted to render the clouds directly onto the planet. Since the application that this software will be used in will only view the planet from high orbit, flattening the clouds directly onto the earth texture wouldn't be an issue. The results were unsatisfying though. If I looked online I could probably find some correct GLSL for doing the alpha blending in the shader, rather than by setting the OpenGL state machine with glEnable(GL_BLEND); glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA); (premultiplied alpha: result = source + destination * (1 - source alpha)), however I have never tried this before so I took the quicker approach.

I've created a second sphere for the atmosphere. It is a fraction larger than the earth sphere and the cloud texture is blended onto this. This approach is more costly to render, as I am rendering two spheres instead of one and the alpha blending needs to sample the color buffer, however the spheres are relatively low resolution: 1681 vertices and 9600 indices, rendered as a triangle list to make 3200 polygons per sphere (9600 indices / 3 per triangle = 3200 triangles). The sphere is dynamically built at run-time, allowing this resolution to be changed.
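
Draw order matters with this approach: the opaque earth is rendered first, then blending is enabled and the slightly larger atmosphere sphere is drawn over it. Here is a minimal sketch of how that could look on the host side; the Sphere class is the one shown in the post below, but the shader/texture binding helpers are hypothetical placeholders, not actual engine code.

void RenderPlanet(Sphere& earth, Sphere& atmosphere)
{
    //Opaque pass - the earth sphere, no blending
    glDisable(GL_BLEND);
    BindEarthShader(); //hypothetical - bind the program plus day/night textures
    earth.Bind();
    earth.Render();

    //Transparent pass - the slightly larger cloud sphere, blended over the earth.
    //GL_ONE / GL_ONE_MINUS_SRC_ALPHA assumes the cloud texture uses premultiplied alpha.
    glEnable(GL_BLEND);
    glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
    BindCloudShader(); //hypothetical - bind the program plus cloud color/alpha textures
    atmosphere.Bind();
    atmosphere.Render();
    glDisable(GL_BLEND);
}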

This method also allows the clouds to move easily and independently of the planet, as real clouds do. I don't want to suggest that this wouldn't be possible if I flattened the clouds onto the earth sphere, it probably would be by doing some texture coordinate scrolling, but it would result in a more complex shader. Slower to run? Perhaps, but definitely more difficult to understand.
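
In practice, 'independently' just means each sphere keeps its own rotation and builds its own model matrix every frame. A small sketch, assuming a hypothetical matrix helper and made-up rotation speeds rather than the engine's actual values:

//Hypothetical matrix type and helper - stand-ins for the engine's own math code
static Matrix4 earthModelMatrix;
static Matrix4 cloudModelMatrix;

static GLfloat earthAngle = 0.0f;
static GLfloat cloudAngle = 0.0f;

void UpdatePlanet(GLfloat deltaSeconds)
{
    //The clouds rotate slightly faster than the planet, so they drift across the surface
    earthAngle += 2.0f * deltaSeconds; //degrees per second - assumed values
    cloudAngle += 2.6f * deltaSeconds;

    earthModelMatrix = Matrix4::RotationY(earthAngle); //hypothetical rotation helper
    cloudModelMatrix = Matrix4::RotationY(cloudAngle);
}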

No code worth showing in this example. The shaders used for the atmosphere layer are pretty much identical to those shown previously.

Next task is to add time of day by creating a light source (the sun) and using texture blending to render the dark side of the earth. I am confident that a standard lighting algorithm will work for this, but instead of using the diffuse lighting value to darken a color (shade that pixel), the diffuse value will be used as a ratio to sample the day and night earth textures.

*Edit*
I've made a short video showing the mobile version of the engine running on a mobile platform, or at least the simulator for that platform.


Monday, June 11, 2012

A nice segue from Decade Engine to Mobile Development.

A friend is in the process of writing a nice iPad application. I shall not go into any detail regarding the app as it is his idea and not mine to share. The app needs to render the earth, allow the user to rotate the planet, zoom in and out to country level, and also allow the user to touch anywhere on the globe; if a country has been pressed, that country is highlighted and this information is made available to the app layer.
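
Mapping a touch to a point on the globe is worth a quick aside. One way it could be done (an assumption of mine, not how whirlyglobe or the app actually does it) is to unproject the touch into a world-space ray, intersect the ray with the sphere, and convert the hit point to latitude/longitude for the country lookup:

//Minimal vector type for the sketch - the engine's own Vector3 would be used in practice
struct Vec3 { GLfloat x, y, z; };

static Vec3 Sub(const Vec3& a, const Vec3& b) { Vec3 r = { a.x - b.x, a.y - b.y, a.z - b.z }; return r; }
static GLfloat Dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

//Intersect a ray (origin plus normalized direction) with the globe.
//On a hit, fills latitude/longitude in radians and returns true.
bool PickGlobe(const Vec3& origin, const Vec3& direction, const Vec3& center, GLfloat radius, GLfloat& latitude, GLfloat& longitude)
{
    Vec3 oc = Sub(origin, center);
    GLfloat b = Dot(oc, direction);
    GLfloat c = Dot(oc, oc) - (radius * radius);
    GLfloat discriminant = (b * b) - c;
    if (discriminant < 0.0f)
        return false; //the ray misses the sphere

    GLfloat t = -b - sqrtf(discriminant); //nearest of the two intersections
    if (t < 0.0f)
        return false; //the sphere is behind the ray origin

    //Hit point relative to the sphere's center
    Vec3 hit = { (origin.x + (t * direction.x)) - center.x,
                 (origin.y + (t * direction.y)) - center.y,
                 (origin.z + (t * direction.z)) - center.z };

    latitude = asinf(hit.y / radius);
    longitude = atan2f(hit.z, hit.x);
    return true;
}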

To date he has been using an open framework called whirlyglobe. This is a pretty impressive framework and I would recommend that you check it out, but after testing it on an iPad 2 and 'The New iPad' within the app, it seemed a little slow. Vector files are used to highlight a country, raising it above the others when selected. All of this is in very high detail and looks excellent, but this detail does come at a cost. The response on the iPad is sluggish and would probably be even more so with an app sitting above it.

When looking into how we could improve the performance, I suggested that I could use the concepts I developed when programming the original Decade Engine, along with the new features I have been learning while converting the original engine to OpenGL 3/OpenGL ES 2.0.

Here is the first rendering from Decade Mobile. Please note that this video was recorded on my Mac Mini, but the same code (with minor changes which I shall document in a later post) has been built and runs on an iPad and iPhone.


The textures used in this video were purchased from here. Since zooming is only required to country level, and not to the centimeter or meter level as was possible in the original Decade Engine, I thought it overkill to use the procedural sphere technique, so I instead just use a normal sphere. Some WebGL code for generating the vertices of a sphere can be found here.
_______________________________________________________________________________
Sphere Generation (Vertex and Index Buffer) Code

void Sphere::Create(const Vector3 center, GLfloat radius, GLuint precision)
{
    std::vector<VERTEX_POSITION_UV0> vertices;
   
    GLuint latitudeBands = precision;
    GLuint longitudeBands = precision;
   
    //Build rings of vertices from pole to pole; the seam vertex is duplicated so texture coordinates can wrap
    for (GLuint latNumber = 0; latNumber <= latitudeBands; latNumber++)
    {
        GLfloat theta = latNumber * M_PI / latitudeBands;
        GLfloat sinTheta = sinf(theta);
        GLfloat cosTheta = cosf(theta);
       
        for (GLuint longNumber = 0; longNumber <= longitudeBands; longNumber++)
        {
            GLfloat phi = longNumber * 2 * M_PI / longitudeBands;
            GLfloat sinPhi = sinf(phi);
            GLfloat cosPhi = cosf(phi);
           
            GLfloat x = cosPhi * sinTheta;
            GLfloat y = cosTheta;
            GLfloat z = sinPhi * sinTheta;
            GLfloat u = 1.0f - ((GLfloat)longNumber / (GLfloat)longitudeBands);
            GLfloat v = (GLfloat)latNumber / (GLfloat)latitudeBands;
           
            VERTEX_POSITION_UV0 vertex;
            vertex.Position = Point4(center.x + (radius * x), center.y + (radius * y), center.z + (radius * z), 1.0f); //position on the sphere, offset by its center
            vertex.U0 = u;
            vertex.V0 = 1.0f - v;
            vertices.push_back(vertex);
        }
    }
   
    std::vector<GLuint> indices;
    //Each quad between two adjacent latitude rings is split into two triangles
    for (GLuint latNumber = 0; latNumber < latitudeBands; latNumber++)
    {
        for (GLuint longNumber = 0; longNumber < longitudeBands; longNumber++)
        {
            GLuint first = (latNumber * (longitudeBands + 1)) + longNumber;
            GLuint second = first + longitudeBands + 1;
           
            indices.push_back(first);
            indices.push_back(second);
            indices.push_back(first + 1);
           
            indices.push_back(second);
            indices.push_back(second + 1);
            indices.push_back(first + 1);
        }
    }
    

    vertexBuffer.Create((float*)&vertices[0], VERTEX_POSITION_UV0::GetFloatsInFormat(), vertices.size(), VERTEX_POSITION_UV0::GetFormat());
   
    indexBuffer.Create(&indices[0], indices.size());
}

void Sphere::Bind()
{
    vertexBuffer.Bind();
    indexBuffer.Bind();
}

void Sphere::Render()
{
    vertexBuffer.Render(&indexBuffer, GL_TRIANGLES);
}
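
For completeness, this is roughly how the class could be driven; a precision of 40 gives the figures mentioned in the atmosphere post above ((40 + 1) x (40 + 1) = 1681 vertices, and 40 x 40 quads x 2 triangles x 3 indices = 9600 indices). The surrounding setup is assumed rather than lifted from the engine.

Sphere earth;
earth.Create(Vector3(0.0f, 0.0f, 0.0f), 1.0f, 40);

//Each frame, after binding the shader and its textures:
earth.Bind();
earth.Render();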
_________________________________________________________________________________
Vertex Shader

uniform mat4 mvp;

in vec4 position;
in vec2 uv0;

out vec2 textureCoord0;

void main (void)
{
    textureCoord0 = uv0;
    gl_Position = mvp * position;
}
_________________________________________________________________________________
Fragment Shader

in vec2 textureCoord0;
uniform sampler2D texture0;

out vec4 fragColor;

void main(void)
{
    fragColor = texture(texture0, textureCoord0);
}