kyrie.pe

About waves and shaders




Also available in Chinese thanks to Indienova!


Hi, I’m Giacomo, one of the developers of Rhythm Doctor, a one-button rhythm game inspired by Rhythm Heaven. You can play the demo at rhythmdr.com (and you should!). What I mostly do on the project is programming the full version of the game, which we are aiming to release early next year.

In this article, we’re going to show you how we used shaders to make fluid waveform movement along a line (sprites and animations made by our artist, Winston Lee!):






The inspiration for this was the Arctic Monkeys’ music video for “Do I Wanna Know?”, which features the same kind of movement. As you can see, there’s something deeper happening than just moving a static sprite around:







Background info


Until now, the main mechanic in the game has been to press space exactly when the seventh beat sounds. Each beat is represented by a pulse in the beat line:






This kind of movement was done by simply scaling a ‘pulse’ sprite vertically on each beat.

This new mechanic, on the other hand, is a fluid one-two pattern: the cue happens on the ‘one’, the wave accelerates from left to right, and you have to press the spacebar on the ‘two’. The interval between the ‘one’ and the ‘two’ is shouted out by the nurse, Rhythm Heaven style (but we won’t go deeper into the game design in this article):






Hafiz (the designer/composer/programmer of the game) had modeled the wave he wanted as a mathematical function in Matlab. The wave was composed of two functions: a triangle wave and a truncated sine wave.

A triangle wave is a wave that looks like a zig-zag curve:






And a truncated sine wave is a sine wave that is zero over most of its domain, except for a small part; it looks like a single mountain on a flat landscape. As you can see in the following formula, the curve is truncated by a condition: if x is between 0 and width (which is actually the wavelength), it follows the sine curve; otherwise, it’s just zero.






Now, the magic of getting the wave we need: multiplying both functions gives us a new rdWave(x) function, which is exactly what we wanted!







Now, maybe you've noticed that there's a new 'tElapsed' parameter in the 'truncSine' function. This is a variable we use to displace the function horizontally, so the wave can move along the x-axis. Right now it's tied to the time elapsed on screen, but you could set it to whatever you want.
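The horizontal-displacement trick is worth seeing in isolation. Here's a tiny Python sketch (illustrative only, not the game's code; `shift` and `square` are made-up names for this example): replacing f(x) with f(x - t) slides the whole graph to the right by t.

```python
# Illustrative sketch: g(x) = f(x - t) slides the graph of f right by t.
def shift(f, t):
    return lambda x: f(x - t)

def square(x):
    return x * x

moved = shift(square, 3.0)
# square has its minimum at x = 0; moved has it at x = 3.
```

In the shader, tElapsed plays the role of t, so the hump of the truncated sine travels along the x-axis as time passes.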

After understanding how the curve works, I had to find a way to implement it in the actual game.

This is what we needed:
1. Make a continuous wave that could move smoothly along the beat line.
2. The wave line must be 1 pixel thick, so it looks similar to the classic wave.
3. It has to play nicely with the effects we had already implemented, such as glow, outlines, and flashes, which are implemented as fragment shaders.

The first thing I thought of was using shaders, because I'm quite comfortable writing image effects with them. The other reason is that we can control exactly how each pixel looks, which is important in pixel-art games.

Plus, because the wave would be drawn on a big quad acting as a canvas, we could add our other effects (glow, outline, etc.) in the same shader and have it behave like just another sprite in the game, making it much easier to manage.

So, shaders...





Fragment shaders 101


Before continuing, I think we should have a short talk about shaders. Shaders are usually small programs that run on the GPU (graphics processor). All modern games use shaders to look as good as they do, because they let us apply lighting, blur, and lots of the cool effects you see in most games nowadays.

The benefit of doing these effects on the GPU instead of the CPU is that the GPU is better suited to running many processes in parallel. And because games have a lot of triangles and pixels that can be processed simultaneously, that work goes to the GPU.

There are a few types of shaders, each with a different purpose, but to draw our wave we will use a fragment shader (also called a pixel shader). A fragment shader is a program that determines the color of every pixel it's assigned to draw. How so? By running on every one of those pixels and returning a color for each of them.

There are many languages for programming shaders, but most of them are very much alike, so we will use GLSL, the one used by OpenGL and WebGL. Here’s an example of a shader:


void mainImage(out vec4 fragColor, in vec2 fragCoord)
{
    fragColor = vec4(1.0, 0.0, 0.0, 1.0);
}



Objects processed on the GPU are made of triangles, so to make a quad on which to render our wave, we will need two of them. So what would happen if we ran this code on a quad? This:






Yes, just a red rectangle. So what does our code do? Starting with the first line: the function mainImage is basically the code’s entry point, the place where execution starts. Its arguments are out vec4 fragColor and in vec2 fragCoord. The first one, fragColor, is the variable you set to the pixel color you want shown, expressed in RGBA. Because it is marked as out, it works as a return variable and changes the pixel color. And fragCoord gives you the coordinates of the pixel you are currently processing. Moving on, if we look at the line between the braces:


fragColor = vec4(1.0, 0.0, 0.0, 1.0);


We are assigning a red color to the pixel. And because we haven't done much else, all pixels are red!

Until now, shaders haven’t been that useful; not if all we’re going to do is paint a big, one-color rectangle. But if we use fragCoord, we can decide which color to paint each individual pixel. By the way, these examples were made on Shadertoy, a website where you can code fragment shaders on quads, which is exactly what we need right now! (Mouseover them to view their source code!)

Now, take a look at this shader code and its output:


void mainImage(out vec4 fragColor, in vec2 fragCoord)
{
    float y = fragCoord.y;

    if(y > 100.0)
        fragColor = vec4(0.0, 0.0, 0.0, 1.0);
    else
        fragColor = vec4(1.0, 1.0, 1.0, 1.0);
}





What this shader does is render the pixels where y > 100 in black, and those where y < 100 in white (taking the origin as the bottom-left corner of the quad). Knowing the position of each pixel is enough for us to make our shader. Now let's do our wave!
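To make the per-pixel idea concrete, here is the same test mimicked on the CPU, as a hypothetical Python sketch (`shade`, WIDTH, and HEIGHT are made up for illustration; the real thing runs in parallel on the GPU):

```python
WIDTH, HEIGHT = 320, 200  # size of our pretend quad, in pixels

def shade(x, y):
    # Same test as the shader: black above the line, white below it
    if y > 100:
        return (0, 0, 0)        # black, as RGB
    return (255, 255, 255)      # white

# The "GPU" loop: one shade() call per pixel (sequential here, parallel there)
image = [[shade(x, y) for x in range(WIDTH)] for y in range(HEIGHT)]
```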



Making the wave shader



Because our wave is a mathematical function, we now need to know how to draw a function in the shader. If you look closely at our previous example, there's a conditional, y > 100, which actually represents an inequality; the right-hand side is our function. So, for example, if that statement were y > sin(x) + 100, the quad would display this:






Which is actually a sinusoidal wave, albeit a really small one. I added 100 pixels to the sin(x) wave so it would sit a little higher and be more visible. Using this logic, if we write our triangle(x), truncSine(x) and rdWave(x), we will be able to draw them on the screen. We declare the three mathematical functions as three functions in the shader code:


float triangle(float x)
{
    // Triangle wave
    return abs(mod(x * 0.2, 2.0) - 1.0) - 0.5;
}

float truncSine(float x)
{
    // Half sine wave
    const float height = 40.0;
    const float sineWidth = 40.0;
    const float pi = 3.1415;
    
    if(x < 0.0 || x > sineWidth) 
        return 0.0;
    else
    	return sin(x * pi/sineWidth) * height;
}

float rdWave(float x, float t)
{
    return truncSine(x - t) * triangle(x);
}
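If you want to sanity-check the math outside a shader, the three functions above port directly to Python (same constants; Python's % operator behaves like GLSL's mod() for these inputs, and math.pi stands in for the shader's 3.1415):

```python
import math

def triangle(x):
    # Triangle wave: zig-zags between -0.5 and 0.5
    return abs((x * 0.2) % 2.0 - 1.0) - 0.5

def truncSine(x):
    # Half sine hump on [0, 40], zero everywhere else
    height = 40.0
    sineWidth = 40.0
    if x < 0.0 or x > sineWidth:
        return 0.0
    return math.sin(x * math.pi / sineWidth) * height

def rdWave(x, t):
    return truncSine(x - t) * triangle(x)
```

For example, truncSine peaks at x = 20 (the middle of the hump) with value 40, and is exactly zero outside [0, 40], which is what keeps the wave flat everywhere except around the moving hump.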



Basically, the code is a very close representation of the mathematical functions defined at the beginning of this article. The only issue here is how to get the 't' argument of the rdWave function, which is the elapsed time, so we can move the truncated sine wave back and forth. Luckily, Shadertoy gives us some global variables that help us do this. The variable we need is called iGlobalTime, and it gives us the time elapsed since the shader started running. Now we can implement the mainImage function:



void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    const float yOffset = 100.0;
    
    float x = floor(fragCoord.x);
    float y = floor(fragCoord.y) - yOffset;

    if(y < rdWave(x, iGlobalTime * 40.0))  
        fragColor = vec4(1.0, 0.0, 0.0, 1.0);
    else
        fragColor = vec4(0.0, 0.0, 0.0, 1.0);
}



Besides the new functions, we have made some changes to the previous shader:
1. Added a yOffset constant. This modifies the y variable, letting us quickly move the wave vertically.
2. Added floor() to fragCoord.x and fragCoord.y. Because Shadertoy gives us pixel coordinates with an extra 0.5 added (122.5, 123.5, 124.5, etc.), I’m using floor to turn them into plain integers.
3. iGlobalTime is multiplied by 40, just an aesthetic modification so the wave moves faster and looks less boring.

Now, if you rewind the shader using the controls, you can see our Rhythm Doctor wave moving!






We are almost there. Now we want the wave to be…



1-pixel thin


Until now, we have been working with an inequality to display our wave. Now, how do we display it as a 1-pixel-thin curve?

We used this logic:
1. If a pixel is under the function curve (the red pixels), it can be part of the curve.
2. To know if it’s part of the curve, or more precisely, an outermost pixel of the inequality, we check whether any of its neighbouring pixels (up, down, left, and right) is NOT part of the curve. That way we know it’s on the boundary between red and black and must be rendered.

So, to sample the neighbours, we add or subtract 1 from the x and y values. Finally, we render it.

And, yeah, we changed the color to green, to match the Rhythm Doctor wave =).



void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    const float yOffset = 100.0;
    
    float x = floor(fragCoord.x);
    float y = floor(fragCoord.y) - yOffset;
    float t = iGlobalTime * 40.0;
    
    bool center = rdWave(x      , t) > y;
    bool left   = rdWave(x - 1.0, t) > y;       // left neighbour under the curve?
    bool right  = rdWave(x + 1.0, t) > y;       // right neighbour
    bool up     = rdWave(x      , t) > y + 1.0; // the pixel above
    bool down   = rdWave(x      , t) > y - 1.0; // the pixel below

    if(center && !(right && left && up && down))
        fragColor = vec4(0.0, 1.0, 0.0, 1.0);
    else
        fragColor = vec4(0.0, 0.0, 0.0, 1.0);
}
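You can convince yourself that the neighbour test really yields a thin line with a quick CPU check. This Python sketch (illustrative only; a plain sine stands in for rdWave) counts how many pixels light up in each column:

```python
import math

def under(x, y):
    # Is pixel (x, y) under the curve? (plain sine as a stand-in for rdWave)
    return math.sin(x * 0.1) * 30.0 > y

def on_edge(x, y):
    center = under(x, y)
    neighbours = [under(x - 1, y), under(x + 1, y),
                  under(x, y + 1), under(x, y - 1)]
    # Lit only if inside the curve but touching at least one outside pixel
    return center and not all(neighbours)

# Pixels lit per column: the line stays a few pixels thin at most,
# instead of filling the whole area under the curve
thickness = [sum(on_edge(x, y) for y in range(-40, 41)) for x in range(200)]
```

Steep parts of the curve can still light a short vertical run of pixels, since only the four direct neighbours are checked, but the solid fill is gone.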










Conclusion


We finally got what we wanted! After this, we integrated the shader into our game, which is built in Unity. We added the other effects, such as glow and outlines, into the shader too. And to make things easier and more flexible, we tied the shader variables (wavelength, triangle frequency, etc.) to the Unity Inspector, which gives us sliders and curve editors so we can set the wave to whatever we want. You can see a video of this in action:






Thanks for reading this far! Hope you'll enjoy Rhythm Doctor!









