Friday 18 November 2011

Water spreading, and other videos

Example of a small planet:
Editing Terrain Example:



And finally, here it is: spreading water! Milcho did a great job and you can see the results in the video below. Sorry for the poor quality and editing - I was in a hurry and originally made the video for debugging purposes, but I thought I'd share it with you. Once a better version is ready I'll re-edit this post and write some more about the water too.


Friday 28 October 2011

Is it Oblivion?

For those who (maybe) are waiting for some news, here are two screenshots from the latest build of our engine. Yes, you're seeing it right - we've got water and tall grass implemented, and the whole terrain has received some visual updates. There's still a lot of work to do, but I just wanted to let you know we're working, and we'll soon be posting some interesting technical posts about our latest achievements :)

Update: new download link posted (see on the left), and a new video!




Monday 10 October 2011

Triplanar Texturing and Normal Mapping

Normal mapping was something we'd talked about for a while, but there were a number of other texturing tasks to get through first.

For our texturing we had no choice but to use triplanar texture mapping - since we generate an actual planet, the terrain can be oriented in any direction. Combined with the fact that the terrain is diggable, this meant the texture had to adapt to any angle. Triplanar mapping was the perfect solution.

Doing normal mapping on top of triplanar mapping may seem hard at first, but it's only a little harder than triplanar texture mapping itself.

To obtain the final fragment color with triplanar mapping, you sample the same texture as though it were projected along each of the three axis-aligned planes (see the diagram on the right).

Once you have a sample from each of these planar projections, you combine the three samples according to the fragment's normal vector. The normal essentially tells you how closely the surface aligns with each projection plane. So for a mostly horizontal surface, the normal is nearly vertical, and you sample mostly from the horizontal projection.
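
For illustration, here's a minimal sketch of that blend, assuming a terrain texture sampler and the fragment's world position and normal (all names here are hypothetical, not our exact shader):

// Minimal triplanar blend sketch (hypothetical names; texture
// coordinates would typically also be scaled by a tiling factor).
uniform sampler2D terrainTex;

vec4 triplanarSample(vec3 worldPos, vec3 fNormal)
{
    // Weights from the fragment normal, normalized to sum to 1.
    vec3 blend = abs(normalize(fNormal));
    blend /= (blend.x + blend.y + blend.z);

    vec4 sampleXY = texture2D(terrainTex, worldPos.xy); // projected along Z
    vec4 sampleXZ = texture2D(terrainTex, worldPos.xz); // projected along Y
    vec4 sampleYZ = texture2D(terrainTex, worldPos.yz); // projected along X

    return blend.z * sampleXY + blend.y * sampleXZ + blend.x * sampleYZ;
}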

The same principle can be used to compute the fragment normal from a normal map. Instead of sampling the color texture, you sample the normal map; the RGB value gives you the normal vector as seen in that plane. You then combine these normals using the same weights used to blend the texture samples.

In effect you obtain three normal vectors, one per plane, each expressed in a coordinate system aligned with the texture on that side.

In the picture on the right, red, green and blue are the axes of each projection of the texture, while dark purple is a sample normal vector. As you can imagine, the closer the fragment's normal is to a plane's normal, the more it samples from that plane. One difference from plain texture mapping is that when the fragment's normal is close to a plane's normal but faces the opposite direction, you have to flip the normal map's result.

This is what the code for obtaining the normal of one texture from its three normal projections looks like in our terrain shader:

// Sample the normal map along each of the three planar projections.
vec4 bump1 = texture2DArray(normalArray, vec3(coordXY.xy, index));
vec4 bump2 = texture2DArray(normalArray, vec3(coordXZ.xy, index));
vec4 bump3 = texture2DArray(normalArray, vec3(coordYZ.xy, index));

// Swizzle each sample's RGB channels onto the axes of its projection
// plane, turning the map's tangent-space normal into a world-space one.
// (If your normal maps store components in [0,1], they would first need
// the usual n * 2.0 - 1.0 unpacking.)
vec3 bumpNormal1 = bump1.r * vec3(1, 0, 0) + bump1.g * vec3(0, 1, 0) + bump1.b * vec3(0, 0, 1);
vec3 bumpNormal2 = bump2.r * vec3(0, 0, 1) + bump2.g * vec3(1, 0, 0) + bump2.b * vec3(0, 1, 0);
vec3 bumpNormal3 = bump3.r * vec3(0, 1, 0) + bump3.g * vec3(0, 0, 1) + bump3.b * vec3(1, 0, 0);

// Blend the three normals with the same weights used for the color samples.
return vec3(weightXY * bumpNormal1 + weightXZ * bumpNormal2 + weightYZ * bumpNormal3);

Here, weightXY, weightXZ and weightYZ are determined from the normal calculated at that fragment:

weightXY = fNormal.z;
weightXZ = fNormal.y;
weightYZ = fNormal.x;

(In practice you'd take the absolute values and normalize so the three weights sum to 1, and flip the sampled normal when the fragment faces the negative side of a plane, as mentioned above.)

I realize it sounds a bit counter-intuitive that we need a normal before we can calculate the per-fragment normals, but this base normal can simply be obtained by other means, such as per-vertex normal calculation. (We obtain it from density differences between voxels.)
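
As a rough sketch of the density-difference approach (the density() lookup and the sign convention here are assumptions for illustration, not our exact code):

// Approximate the surface normal at voxel position p as the negated
// gradient of the density field, via central differences.
// density() is a hypothetical stand-in for the voxel density lookup.
vec3 densityNormal(vec3 p)
{
    float d = 1.0; // sampling step, one voxel
    vec3 grad = vec3(
        density(p + vec3(d, 0.0, 0.0)) - density(p - vec3(d, 0.0, 0.0)),
        density(p + vec3(0.0, d, 0.0)) - density(p - vec3(0.0, d, 0.0)),
        density(p + vec3(0.0, 0.0, d)) - density(p - vec3(0.0, 0.0, d)));
    return normalize(-grad); // assumes density increases into the terrain
}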

Finally, to get good results you need a genuinely good normal texture. We only had time to create one (neither of us is a graphics designer), so here's a video of the rock's triplanar normal map, with a short day length on our planet:

Friday 16 September 2011

In the desert...

Hello there whoever is reading our blog!

(by the way if you have any questions, don't be shy and post comments here)

I'm writing a short report in between some more informational posts to let you know we're working hard on terrain LOD, volumetric clouds and sky-shader tweaks. Once we're happy with the results, we're planning to move on to objects and fluids.

Below you can see some shots from a day spent in our desert biome, which has also received some work so it looks more like a desert. Hope you enjoy it - even if it's just a beginning, with very basic shaders. Stay tuned!
Sunrise
Almost midday
Sunset

Sunday 11 September 2011

Day / night on the planet, part 2

The next task, after introducing the sun, was dealing with the sky.

So far our sky has been left a dull solid color. It would be trivial to make the sky-sphere shader change the color of the sky, but the color needed to correspond to the sun's position in the sky as seen by the player.

There are several good solutions posted on the web, all attempting to simulate atmospheric scattering. However, most of those methods involve heavy calculations that are not well suited to rendering a sky in real time. Figuring we needed something simpler, I decided to tackle the problem on my own.

The solution involves simply drawing a sphere around the player and rendering the sky on that. Since the sky can be thought of as an incredibly large, far-away sphere, this approach suited us. And since we have no plans to allow the player to leave the planet, there's no need to simulate the atmosphere as seen from outside it.

Consider the following diagram:
This image (which is not to scale) shows the rough idea of how we draw the sky - it is drawn on the tiny sphere (the light blue circle in the diagram) that surrounds the player everywhere he goes.
The three main things that determine the color of the sky are the direction to the sun, the 'up' direction (used for horizon calculations), and the direction to the fragment being drawn.

Given these three vectors in the fragment shader, we can compute the three dot products between them: dotSunUp, dotSunFrag and dotFragUp. Each dot product yields a value in the [-1, 1] range, and together they give us enough information to draw the sky.
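
In the shader that might look like this (direction names are hypothetical, and all three vectors are assumed normalized):

float dotSunUp   = dot(sunDir,  upDir);   // sun height relative to the horizon
float dotSunFrag = dot(sunDir,  fragDir); // how close the fragment is to the sun
float dotFragUp  = dot(fragDir, upDir);   // how close the fragment is to 'up'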

The main check is the position of the sun in the sky. There are three cases: the sun is high in the sky, near the horizon, or below the horizon - corresponding to daytime, sunrise/sunset and nighttime.
Here's a brief explanation of each case (a shader sketch follows the list).

1. if dotSunUp > TransitionConstant
In this case the sun is high up in the sky, so the entire sky can be drawn blue. It's also possible to lighten the near-horizon fragments for a more realistic look. This lightening uses the dotFragUp value, which measures how close a fragment is to the up direction - and therefore how far it is from the horizon.

2. if dotSunUp > NightTransitionConstant
In this case we are in a sunrise or sunset phase. The sky color is blended between a day and a night color depending on the dotSunUp value. Here we also start to add a glow surrounding the sun, calculated from the dotSunFrag value, which measures how close a fragment is to the direction of the sun.

The near-horizon fragments' color can also be varied here to increase realism.

3. All other cases
In this case it's nighttime. The sky color is simply set to the night color, again with possible variation for the near-horizon fragments.
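
Put together, a hedged sketch of this branching (the colors and constants are placeholders, not our exact values):

// Hypothetical sketch of the three cases.
vec3 skyColor;
if (dotSunUp > TransitionConstant)
{
    // 1. Daytime: blue sky, lightened toward the horizon.
    skyColor = mix(horizonColor, dayColor, clamp(dotFragUp, 0.0, 1.0));
}
else if (dotSunUp > NightTransitionConstant)
{
    // 2. Sunrise/sunset: blend between the night and day colors.
    float t = (dotSunUp - NightTransitionConstant)
            / (TransitionConstant - NightTransitionConstant);
    skyColor = mix(nightColor, dayColor, t);
    // Add a glow around the sun's direction.
    skyColor += glowColor * max(dotSunFrag, 0.0);
}
else
{
    // 3. Nighttime.
    skyColor = nightColor;
}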

The two constants mentioned above can be tuned to adjust the transition between night and day, as well as to change how long the glow persists after the sun has gone below the horizon. (Note that a dotSunUp value below 0 means the sun is below the horizon, and NightTransitionConstant can be set to negative values - in the current setting it's -0.2.)

After all these checks are done, one additional check can be performed using dotSunFrag to see whether the fragment being drawn is really close to the sun's position - in which case the fragment is part of the sun's disk. A similar method is used for the sun's glow.
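
A sketch of that final check (the thresholds are placeholders):

// Fragments almost exactly aligned with the sun form the sun's disk;
// a looser threshold gives the surrounding glow.
if (dotSunFrag > SunDiskConstant)          // e.g. 0.9995 for a small disk
{
    skyColor = sunColor;
}
else if (dotSunFrag > SunGlowConstant)     // e.g. 0.99, a softer falloff
{
    float t = (dotSunFrag - SunGlowConstant)
            / (SunDiskConstant - SunGlowConstant);
    skyColor = mix(skyColor, sunColor, t);
}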

Finally, here's a short video of sunrise and sunset on the planet. There's still some tweaking to be done on the colors, as well as the near-horizon color changes mentioned above. Considering the fragment shader is based entirely on dot products and linear interpolation (both very fast on a GPU), the results are very satisfactory for a real-time implementation.




Monday 29 August 2011

Day / night on the planet, part 1

Problem: vertices with normals pointing in the sun's direction
are lit even when the sun is on the other side of the planet.
If you've looked at our Screenshots section you've probably noticed something strange about those screenshots - most of them are really dark, with some small areas randomly lit. There is a reason for this behavior, and I'm going to reveal it now: our world is actually a true planet! Before, we just had regular flat terrain with the sun set somewhere in the sky above, and it lit the terrain correctly. Once we switched to a planet, we ended up with one part of it lit properly and the rest mostly dark. Because of the nature of a planet, we simply couldn't have all the terrain lit anymore. What we needed was a day/night cycle.

Now, a few words about the planet - we decided to go that way because we thought it would add a nice realistic touch to how players experience the world; for example, they can walk all the way around and arrive back at the place they departed from. There are a lot of other cool things about having a planet, but it also introduces new problems. One of them is how diffuse sunlight works.

The first problem was easily solved: our sun was static and didn't actually move. It turned out all the code for moving it had already been written by Milcho, but the planet's Update() function was never called. Once I fixed that, we had a day/night cycle with the sun circling the planet (of course that's not how it happens in the real world, but moving the planet around the sun would be much harder and not really worth the effort, so we use the usual simplistic model). One major problem remained, though: because some vertex normals pointed downwards, towards the planet's core, they were lit by the sun while it was on the other side of the planet! This is the reason for the strangely lit holes on our screenshots page.

After a quick brainstorm we found a solution that had a chance of success: we needed a "horizon" of some kind. Because we no longer had flat terrain, we couldn't just hardcode the horizon to lie, for example, along the XZ plane - for each vertex on the planet's surface the horizon is different. This is what we came up with:


Problem: a vertex is lit even at night because its normal
points towards the sun
Solution: adding a "fake" horizon that determines the
area where the sun is "active" for a given vertex

Our horizon is a plane that passes through the vertex position and is perpendicular to the vertex's up vector. We compute the dot product of the up vector (U) and the vertex-to-sun vector (VS). If the result is negative, the sun is below our horizon (night); if it's positive or zero, it's above the horizon (day).


In the GLSL vertex shader, it looks something like this:



    vec3 upVector      = normalize(vertexPosition - corePosition);
    vec3 vertexToSun   = normalize(sunPosition - vertexPosition);
    float sunInHorizon = dot(upVector, vertexToSun);

    float sunIntensity = 1.0;
    float sunFade      = 0.2;  // sunlight fades when the dot product is <= 0.2

    if (sunInHorizon < 0.0)
    {
        // Sun below the horizon: no diffuse light.
        sunIntensity = 0.0;
    }
    else if (sunInHorizon <= sunFade)
    {
        // Sun just above the horizon: fade the sunlight.
        sunIntensity = sunInHorizon / sunFade;
    }

    diffuse = sunIntensity * diffuseLight;

This way we solved the problem of vertices whose normals point towards the sun being lit while the sun is on the wrong side of the planet, and we achieved a nice day/night transition over the whole planet. A timelapse of the effect can be seen in this video:


It's probably still far from perfect, but it works OK for now. In the next part of the day/night series I'd like to optimize and correct some of the things presented today, and also add day/night effects to our skydome, which currently stays white regardless of the time of day.

Wednesday 24 August 2011

Bottlenecks and smooth noise functions

Over the past few days, we've been trying to address the bottlenecks that currently slow down the engine.

The first bottleneck was purely a graphics one: how many polygons can be drawn on screen. The standard solution is frustum culling. In our case this was a fairly easy task, since all our visual data is separated into 'visual blocks'. Each one is simply a cube, 12m on a side, holding all the visual data for that region. Frustum culling then simply means checking whether any corner of a block is visible; if none of them are, drawing it is skipped.
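
The core of that test, sketched in the same GLSL-style notation as the shader snippets elsewhere in this blog (the engine actually does this on the CPU; this is the standard conservative plane-by-plane form, and all names are hypothetical):

// A visual block can be culled when all eight of its corners lie
// behind at least one frustum plane.
// planes[i].xyz is the plane normal, planes[i].w the distance term.
bool blockOutsideFrustum(vec4 planes[6], vec3 corners[8])
{
    for (int p = 0; p < 6; p++)
    {
        bool allBehind = true;
        for (int c = 0; c < 8; c++)
        {
            if (dot(planes[p].xyz, corners[c]) + planes[p].w >= 0.0)
            {
                allBehind = false; // this corner is in front of the plane
                break;
            }
        }
        if (allBehind)
            return true; // every corner is behind this plane -> cull
    }
    return false;
}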

Another graphics improvement was reducing the sheer number of draw calls. As you can imagine, with one visual block being a 12x12x12m cube, we need quite a few of them to get a good view distance. One way to decrease the number of draw calls is to add only non-empty blocks to a list when rebuilding, and draw only that list. The downside is that to maintain the list accurately, a rather large number of blocks have to be rechecked whenever the player moves. This is an acceptable trade-off, for the most part at least.

An even worse bottleneck, though, is the sheer amount of data to be covered. In fact, 21*21*21 = 9261 visual blocks give us a little over 120m of view distance on each side of the player - and cover a huge volume of about 16 million cubic meters, which means we need roughly 16 million data samples to construct that volume!

This creates a generation bottleneck: the terrain generation function must be fast enough to produce that much data in a reasonable time. Of course, once generated, the data is stored to disk and can be read back from there.
When generating data, we were able to obtain decent-looking terrain at approximately 16 datablock generations per second - a rate of about 27,000 data points per second. When loading datablocks, we managed speeds between 100 and 200 datablocks per second, an average of about 259,000 data points per second.

This leads to the major problem here: the speed at which the density function operates. For our project we currently use libnoise, with something like seven independent one-octave functions and two two-octave functions. Four of these (including one of the two-octave functions) are combined at different frequencies and amplitudes to generate the terrain, while the other three are used for picking biomes and determining oceans, seas and mountain areas.
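
To give a flavor of what such a combination looks like (a hypothetical sketch in the same GLSL-style notation as the other snippets, assuming a 3-D noise function noise3() returning values in [-1, 1] - not our actual generator, and all constants are placeholders):

// Hypothetical density function combining noise modules at different
// frequencies and amplitudes.
float density(vec3 p)
{
    // Solid below the nominal surface, empty above it.
    float base = planetRadius - length(p - corePosition);

    float continents = noise3(p * 0.001);                        // oceans vs. land
    float hills      = noise3(p * 0.01);                         // medium features
    float detail     = noise3(p * 0.05) + 0.5 * noise3(p * 0.1); // two-octave detail

    return base + 40.0 * continents + 10.0 * hills + 2.0 * detail;
}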

For those familiar with how Perlin noise works, this means a huge number of lookups and trilinear interpolations - which makes it slow, slow, slow. Unfortunately, the slowness is largely due to the sheer number of points generated, not just the speed of the noise itself. A quick switch to an implementation of simplex noise (also authored by Ken Perlin) showed no significant improvement in speed (the measured speed depends not only on generation but on a number of other factors). Simplex noise may be faster in higher dimensions, but we only use 3D noise anyway, and while straightforward benchmarks may show it to be faster in lower dimensions too, it remains comparable to Perlin noise in a real-world application. Speed isn't the only thing simplex noise improves over Perlin noise, but for now it won't help our bottleneck.

The only way to relieve the generation bottleneck is to redesign the density function to use fewer noise modules and combine them in more ingenious ways.

The good thing is that, while this is a real bottleneck, it only applies when terrain is first generated. Most of the time players move around already-generated terrain, which, as mentioned, is much faster to load. There's also the possibility of pre-generating terrain while nothing else is pending - but that has the drawback of storing excessive data, so it too must be balanced.

In other words, without a more major overhaul, the speed of the density function is not likely to improve significantly just by switching noise generators.