Texture Mapping, Perlin Noise and Bump Mapping

Welcome to the fourth part of my raytracer development blog. For this part, I started by making some quality of life improvements, followed by the main tasks of this homework: texture mapping, Perlin noise and bump mapping. I can easily say that this homework has been the most time-consuming one yet. It required planning, experimentation and a lot of debugging. On the other hand, it has certainly been one of the most visually satisfying. Once again, I will be detailing my approach, progress and implementation in the following sections. Here is the table of contents for this blog post:
  1. Compilation of Object Files
  2. Progress Bar During Render
  3. Texture Base Class and Perlin Noise
  4. Image Texture Mapping
  5. Bump Mapping for Image Textures
  6. Bump Mapping for Perlin Textures
  7. Bugs, Fails, Mistakes

Compilation of Object Files

To compile the raytracer, I was using a simple Makefile that found every source file in the project and compiled all of them from scratch each time. This worked fine when there weren't many source files or third party libraries; however, I have been adding more and more source files and libraries to expand the program, and by the time the compilation took around 35 seconds I lost my patience and decided to do something about it.
I learned that the compiler can actually generate the dependency information itself. While writing the Makefile to do that, this answer on Stack Overflow was a huge help. In short, I ended up writing a Makefile that creates the object files along with dependency files and stores them in a build directory, so that subsequent makes can reuse these object files and link them directly without having to compile them again. This saves a huge amount of time, because waiting ~35 seconds just to find out that you have forgotten a semicolon is brutal.
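The gist of what I ended up with looks roughly like the sketch below. The directory layout and variable names here are placeholders rather than my exact Makefile; the important parts are the -MMD/-MP flags that make the compiler emit the .d dependency files, and the -include of those files at the bottom.

```make
# Minimal sketch: compile into build/ and let the compiler emit .d dependency files.
SRCS := $(wildcard src/*.cpp)
OBJS := $(patsubst src/%.cpp,build/%.o,$(SRCS))
DEPS := $(OBJS:.o=.d)

CXXFLAGS := -O2 -std=c++17 -MMD -MP   # -MMD/-MP generate and guard the .d files

raytracer: $(OBJS)
	$(CXX) $(CXXFLAGS) $^ -o $@

build/%.o: src/%.cpp | build
	$(CXX) $(CXXFLAGS) -c $< -o $@

build:
	mkdir -p build

-include $(DEPS)   # pull in the generated dependency files if they exist
```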

Progress Bar During Render

This is another quality of life improvement (actually the first one, since the previous one was more of a must than a QOL improvement). Instead of staring at emptiness, I added a simple progress bar that gives me a rough idea of how the rendering is going and when it will end. Once again, this answer on Stack Overflow helped greatly.
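The whole trick is overwriting the same console line with a carriage return. Here is a minimal sketch of the idea, not my exact implementation; the per-row granularity and the bar width are arbitrary choices.

```cpp
#include <cstdio>

// Print a simple progress bar by rewriting the same console line with '\r'.
// Called after each finished row; flushing is needed since there is no newline.
void printProgress(int rowsDone, int totalRows) {
    const int barWidth = 50;
    float ratio = static_cast<float>(rowsDone) / totalRows;
    int filled = static_cast<int>(ratio * barWidth);

    std::printf("\r[");
    for (int i = 0; i < barWidth; ++i)
        std::putchar(i < filled ? '=' : ' ');
    std::printf("] %3d%%", static_cast<int>(ratio * 100));
    std::fflush(stdout);
}
```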

Texture Base Class and Perlin Noise

We have two different texture types that need to be handled: image textures and Perlin textures. I wanted to keep the textures extensible, which is why I went with an abstract base class for textures. Any derived texture must implement the value function which, given the texture UV coordinates and the intersection point, returns the color value. This way, adding a new texture type only requires inheriting from the base class and defining the value function.
There is also another pure virtual function for calculating the bump normal; I will get to that in the bump mapping sections.
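Roughly, the interface looks like the sketch below. The types and parameter lists (especially for the bump normal function) are simplified illustrations rather than a verbatim copy of my code.

```cpp
// Simplified sketch of the abstract texture interface.
struct Vec3 { float x, y, z; };

class Texture {
public:
    virtual ~Texture() = default;

    // Color value given the (u,v) coordinates and the intersection point p
    // (image textures use u,v; Perlin textures use the world-space point).
    virtual Vec3 value(float u, float v, const Vec3& p) const = 0;

    // Perturbed shading normal for bump mapping, given the original normal
    // and the surface partial derivatives dp/du and dp/dv.
    virtual Vec3 bumpNormal(float u, float v, const Vec3& p,
                            const Vec3& n, const Vec3& dpdu,
                            const Vec3& dpdv) const = 0;
};
```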

Perlin noise is a procedurally generated solid texture. What does that even mean?
First of all, Perlin noise is just noise, randomness that we can control. What would you do to generate a random grayscale image? If the values are picked completely at random, all you get is a noisy texture that doesn't have much use. If you instead use a sine-like function (which has a known pattern and a period), the texture looks way too regular. Perlin came up with a procedure to generate noise that is random, but in a more controlled manner, resulting in very cool looking textures. And lastly, a solid texture means that the texture is defined by a 3D function evaluated directly at points of the object, so no 2D mapping is needed.

Perlin noise is calculated in world coordinates; this important detail will come up later as well. The idea is to think of world space as a lattice (small cubes with 1-unit edges, extending infinitely), randomly associate each lattice vertex with a gradient vector, evaluate the input point against each nearby gradient using a weight function, and interpolate the results. For a more detailed explanation of the math and of the specific functions used, the Realistic Ray Tracing book is of great help.
Depending on how you map the raw noise value to the [0,1] range, Perlin noise gives different patterns. A patchy pattern resembling a golf ball can be obtained with (noise + 1) / 2, and a veiny look can be obtained by taking the absolute value of the noise.
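To make the lattice/gradient/weight story concrete, here is a condensed sketch of the classic formulation together with the two conversions. My own class is organized differently; this reuses the bare Vec3 struct from the texture sketch above, uses a fixed gradient table, and is meant only to show the structure of the evaluation.

```cpp
#include <algorithm>
#include <array>
#include <cmath>
#include <numeric>
#include <random>

class PerlinNoise {
public:
    PerlinNoise() {
        std::iota(perm.begin(), perm.end(), 0);
        std::mt19937 rng(42);                        // fixed seed for repeatability
        std::shuffle(perm.begin(), perm.end(), rng);
    }

    // Raw gradient noise value, roughly in [-1, 1].
    float noise(const Vec3& p) const {
        int xi = fastFloor(p.x), yi = fastFloor(p.y), zi = fastFloor(p.z);
        float xf = p.x - xi, yf = p.y - yi, zf = p.z - zi;
        float result = 0.0f;
        // Accumulate the weighted contribution of the 8 lattice corners.
        for (int dz = 0; dz <= 1; ++dz)
            for (int dy = 0; dy <= 1; ++dy)
                for (int dx = 0; dx <= 1; ++dx) {
                    Vec3 g = gradient(xi + dx, yi + dy, zi + dz);
                    Vec3 v{xf - dx, yf - dy, zf - dz};           // corner-to-point vector
                    float w = weight(v.x) * weight(v.y) * weight(v.z);
                    result += w * (g.x * v.x + g.y * v.y + g.z * v.z);
                }
        return result;
    }

    // The two conversions mentioned above.
    float patchy(const Vec3& p) const { return (noise(p) + 1.0f) * 0.5f; }  // golf-ball look
    float veiny (const Vec3& p) const { return std::fabs(noise(p)); }       // veiny look

private:
    std::array<int, 256> perm;

    static int fastFloor(float t) { return t >= 0.0f ? (int)t : (int)t - 1; }

    // Weight w(t) = 1 - (6|t|^5 - 15|t|^4 + 10|t|^3): falls smoothly from 1 to 0 on [0,1].
    static float weight(float t) {
        t = std::fabs(t);
        return 1.0f - t * t * t * (t * (t * 6.0f - 15.0f) + 10.0f);
    }

    // Hash the lattice corner into one of 16 fixed gradient directions.
    Vec3 gradient(int x, int y, int z) const {
        static const Vec3 dirs[16] = {
            {1,1,0},{-1,1,0},{1,-1,0},{-1,-1,0},{1,0,1},{-1,0,1},{1,0,-1},{-1,0,-1},
            {0,1,1},{0,-1,1},{0,1,-1},{0,-1,-1},{1,1,0},{-1,1,0},{0,-1,1},{0,-1,-1}};
        int h = perm[(perm[(perm[x & 255] + y) & 255] + z) & 255];
        return dirs[h & 15];
    }
};
```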

perlin_types_nobump.png rendered in 37,090s

Image Texture Mapping

Using image textures is a very cheap and effective way to increase the visual complexity of objects. First, we need to be able to read the images. Up to now, I have only been dealing with PNG files; however, textures may come in other formats like JPEG, so I looked at different options for reading these files. I found stb_image.h, a header-only public domain library with a very simple API for reading and writing images, and modified the parser to read textures through it. For image textures I also apply image instancing: if the same image file is referenced multiple times, I read it only once, and my imageTexture objects store a pointer to the shared image object.
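In code, the instancing boils down to a small cache keyed by the file path. A sketch with a simplified Image wrapper; my actual parser and texture classes are organized differently.

```cpp
#define STB_IMAGE_IMPLEMENTATION
#include "stb_image.h"

#include <map>
#include <memory>
#include <stdexcept>
#include <string>

// Simple image wrapper around stb_image.
struct Image {
    int nx = 0, ny = 0, channels = 0;
    unsigned char* data = nullptr;

    explicit Image(const std::string& path) {
        data = stbi_load(path.c_str(), &nx, &ny, &channels, 0);
        if (!data) throw std::runtime_error("cannot load texture: " + path);
    }
    ~Image() { stbi_image_free(data); }
};

// Image instancing: the same file is decoded once and shared between textures.
std::shared_ptr<Image> getImage(const std::string& path) {
    static std::map<std::string, std::shared_ptr<Image>> cache;
    auto it = cache.find(path);
    if (it != cache.end()) return it->second;
    auto img = std::make_shared<Image>(path);
    cache[path] = img;
    return img;
}
```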

Now I am ready to map the image textures onto the objects. I will start with the sphere case.
We need to parametrize the sphere with spherical coordinates and map theta and phi to the [0,1] range to obtain UV coordinates; Wikipedia has a pretty decent explanation of this. In the hit routine, the UV coordinates are calculated and stored in the hit record. The texture lookup itself is performed in the pixel color routine, mainly because for Perlin textures the lookup must happen in world coordinates; to keep the lookup uniform across texture types, only the UV calculation is done in local coordinates.
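In code, the parametrization looks roughly like the sketch below, reusing the Vec3 struct from the earlier sketches. The exact angle conventions depend on which axis is treated as "up"; this version assumes y-up and may need adjusting to match a particular scene convention.

```cpp
#include <cmath>

// Sketch: UV coordinates on a sphere from the local-space hit point.
// pLocal is the hit point before any transformation is applied.
void sphereUV(const Vec3& pLocal, const Vec3& center, float radius,
              float& u, float& v) {
    const float PI = 3.14159265358979f;
    float theta = std::acos((pLocal.y - center.y) / radius);            // [0, pi]
    float phi   = std::atan2(pLocal.z - center.z, pLocal.x - center.x); // (-pi, pi]
    u = (-phi + PI) / (2.0f * PI);   // map phi to [0, 1]
    v = theta / PI;                  // map theta to [0, 1]
}
```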

Moving on to image texture mapping for meshes, we realize that there is no parametrization to compute texture UVs from; these values are specified per vertex in the scene files.
For PLY files I was using happly, which unfortunately did not have an API method to easily read texture UVs, so I added that to the header myself and created a pull request. The vertex UVs are interpolated using the beta and gamma already calculated in the hit routine of a mesh triangle, and the result is stored in the hit record.
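The interpolation itself is just a barycentric blend of the vertex UVs with the beta and gamma from the intersection test; the names below are illustrative, not my actual field names.

```cpp
// Sketch: interpolate per-vertex texture coordinates with the barycentric
// coordinates (beta, gamma) that the triangle hit routine already computed.
void triangleUV(float beta, float gamma,
                float ua, float va,    // UV of vertex a
                float ub, float vb,    // UV of vertex b
                float uc, float vc,    // UV of vertex c
                float& u, float& v) {
    float alpha = 1.0f - beta - gamma;
    u = alpha * ua + beta * ub + gamma * uc;
    v = alpha * va + beta * vb + gamma * vc;
}
```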

sphere_texture_replace_nearest.png rendered in 0,218s

When only the nearest pixel is fetched, some aliasing is visible.

sphere_texture_replace_bilinear.png rendered in 0,210s

Bilinear interpolation can be used to remove some of the aliasing.
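A sketch of the bilinear fetch. Here fetch(img, i, j) is an assumed helper that returns a single texel as a Vec3 (with wrapping/clamping handled inside it), and the Vec3 arithmetic operators are assumed as well; the Image struct is the one from the stb_image sketch.

```cpp
#include <cmath>

// Sketch: bilinear interpolation of an image texture at UV coordinates (u, v).
Vec3 bilinearLookup(const Image& img, float u, float v) {
    float x = u * img.nx;            // continuous pixel coordinates
    float y = v * img.ny;
    int   i = (int)std::floor(x);
    int   j = (int)std::floor(y);
    float dx = x - i;
    float dy = y - j;

    // Blend the four surrounding texels by their coverage.
    return fetch(img, i,     j    ) * (1 - dx) * (1 - dy)
         + fetch(img, i + 1, j    ) * dx       * (1 - dy)
         + fetch(img, i,     j + 1) * (1 - dx) * dy
         + fetch(img, i + 1, j + 1) * dx       * dy;
}
```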

sphere_texture_blend_bilinear.png rendered in 0,233s

The value from the texture can also be combined with the material; here, the two are averaged.

ellipsoids_texture.png rendered in 0,225s

This image pretty much sums up texture mapping with image textures and Perlin textures. If you inspect the blended (reddish) sphere, you can see that it is rotated and the texture rotates with the object. Calculating the UV coordinates in local coordinates is what gives us this behavior.

skybox.png rendered in 31,91s

This image demonstrates a cool looking effect that you can achieve with texture mapping. Here, the skybox is actually a texture mapped onto a sphere and the camera is inside the sphere. That is why backface culling must be disabled for this image; I added a tag to the XML for that purpose. Also, in comparison with the reference image, my output seems a little brighter.

killeroo_nobump_walls.png rendered in 5,646s

In this one, the grid on the walls comes from a texture.


bump_mapping_basic_nobump.png rendered in 6,834s


bump_mapping_transformed_nobump.png rendered in 7,40s

galactica_static.png rendered in 12m6,558s

This image uses a background texture instead of a background color. If a primary ray misses everything, it takes its color from the background texture.
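Conceptually the miss case just turns the pixel's raster position into UV coordinates for the background image. A tiny sketch; the half-pixel offset and the reuse of the bilinear lookup from earlier are my illustration choices here.

```cpp
// Sketch: when a primary ray hits nothing, sample the background texture
// using the pixel's normalized raster coordinates as UV.
Vec3 backgroundColor(const Image& bg, int px, int py, int width, int height) {
    float u = (px + 0.5f) / width;
    float v = (py + 0.5f) / height;
    return bilinearLookup(bg, u, v);
}
```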


galactica_dynamic.png rendered in 12m24,707s

VeachAjar.png rendered in 1m42,827s

Finally, we have this rather complex scene. It was created by Benedikt Bitterli and adapted to our scene format, with small changes, by our professor Mr. Akyuz. In order to render this scene, I added an option to specify the transformations as a composite.

Bump Mapping for Image Textures

Bump mapping is a very cool looking effect where you can obtain bumpy-looking surfaces without changing the underlying vertex positions. The shading normal is perturbed according to the derivative of the given image texture at the intersection point, which makes the surface look more alive.
It is important to note that this perturbation of the shading normal is also done in world coordinates (at least that is what I did, and it produced results similar to the given reference images). In the hit routines of the primitives, dp/du and dp/dv are calculated; these are vectors in local coordinates.
For spheres, they are obtained by differentiating the spherical parametrization of the intersection point p with respect to u and v. For meshes, they can be computed from the edge vectors and texture coordinates of the triangle. The derivative of the image texture itself is just the difference between neighboring pixels in a given direction, taken in grayscale, which can be obtained by averaging the RGB channels.
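Put together, the perturbation follows Blinn's classic formulation. A sketch, with cross/normalize and the Vec3 operators assumed; depending on the UV orientation, the resulting normal may need its sign flipped so that it agrees with the original shading normal.

```cpp
// Sketch of Blinn-style bump mapping. du and dv are the image-space height
// derivatives at (u, v) (grayscale pixel differences), already scaled by
// bumpmapMultiplier. dpdu / dpdv are the surface partials in world space.
Vec3 bumpedNormal(const Vec3& n, const Vec3& dpdu, const Vec3& dpdv,
                  float du, float dv) {
    Vec3 nUnit = normalize(n);
    // Tangents of the displaced surface q(u,v) = p(u,v) + h(u,v) * n,
    // ignoring the small h * dn/du and h * dn/dv terms.
    Vec3 qu = dpdu + du * nUnit;
    Vec3 qv = dpdv + dv * nUnit;
    // The new normal is the cross product of the perturbed tangents.
    return normalize(cross(qu, qv));
}
```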

bump_mapping_basic.png rendered in 7,898s

For a few inputs, I had to tweak the bumpmapMultiplier variable a bit to get a result more similar to the references. In this case, I changed it from 30 to 3.

bump_mapping_transformed.png rendered in 7,531s

Again, bumpmapMultiplier 30 -> 3

killeroo_bump_walls.png rendered in 6,72s
sphere_bump_nobump.png rendered in 5,349s

This one is my personal favorite; it clearly shows the visual improvement that bump mapping can achieve.

Bump Mapping for Perlin Textures

For Perlin textures, finding the derivative works differently. The Perlin texture is continuous and evaluated in world coordinates, so we take a small epsilon (1e-3 works fine) and estimate the gradient of the texture with finite differences.
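A sketch of that finite-difference version. Here noise01(p) stands in for the Perlin texture value mapped to [0,1], dot/normalize and the Vec3 operators are assumed helpers, and the sign convention may need flipping to match a particular reference.

```cpp
// Sketch: bump mapping for a Perlin texture evaluated at the world-space
// point p. The gradient of the noise is estimated with forward differences.
Vec3 perlinBumpedNormal(const Vec3& p, const Vec3& n, float bumpFactor) {
    const float eps = 1e-3f;
    float h = noise01(p);
    Vec3 g;
    g.x = (noise01({p.x + eps, p.y, p.z}) - h) / eps;
    g.y = (noise01({p.x, p.y + eps, p.z}) - h) / eps;
    g.z = (noise01({p.x, p.y, p.z + eps}) - h) / eps;

    // Keep only the component of the gradient parallel to the surface and
    // tilt the normal against it.
    Vec3 nUnit     = normalize(n);
    Vec3 gParallel = g - dot(g, nUnit) * nUnit;
    return normalize(nUnit - bumpFactor * gParallel);
}
```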

perlin_types_bump.png rendered in 41,124s

In this image, we can see how the noise scaling factor can have a huge impact on the look of the objects. The closer spheres use a higher noise frequency.

spheres_perlin.png rendered in 5,673s

The left one is patchy while the right one is veiny.

Bugs, Fails, Mistakes


This one actually looks pretty cool (even cooler than the correct one); it is the result of calculating the bump mapping normal in the object's local coordinates. I believe it would be possible to get a very similar result just by increasing bumpmapMultiplier.


The texture coordinates given for the door contain negative values. My image.get operation returned false whenever the indices were outside the limits [0, image.width (or height)], so no color was returned for those parts, causing this black line. I corrected this by adding image.nx (or ny) to the negative indices, effectively wrapping them around.
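The fix is essentially an index wrap that also handles negative values. A small sketch, generalized with a modulo so it still works if the coordinate is more than one period outside the image; the function name is illustrative.

```cpp
// Sketch: wrap a texel index into [0, size) so that negative texture
// coordinates repeat the image instead of falling outside it.
int wrapIndex(int i, int size) {
    int r = i % size;
    if (r < 0) r += size;   // C++ '%' keeps the sign of the dividend
    return r;
}
```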


While learning OpenGL for the Ceng477 course, I had read somewhere that the most common problem you come across with image textures is that they end up flipped; needless to say, I was ready for this one. It happens because I consider the texture's (0,0) to be at the bottom left instead of the top left, so I need to flip the y coordinate by subtracting it from image.ny (the height). I made this flip the default behavior, and I also made it possible to add a <FlipVertical>(bool)</FlipVertical> tag to the texture so that the user can control it as well.


This is a rather funny one that contains two bugs. First, the bump amount is too strong because I was calculating the bump normal in local coordinates. Second, although I was mapping the texture correctly, I forgot to apply the vertical flip when calculating the derivative from the image for the bump normal.


I included this one because it looks cool as well. Both bugs explained for the previous image are present here too, although it is quite hard to deduce that just by looking at it.

Thank you for reading, see you in the next post!
