**Rendering Algorithm Final Project
Shruti Agarwal**

Theme: “Something seems a little off”

Motivation
================================================================================

Motivation1 Motivation2

In the above two images, the reflection in the mirror shows an object other than the one seen in front of it.

Previous Work
================================================================================

Before the final project I had an implementation of path tracing with MIS, but without volumetric rendering, together with the following BSDFs: microfacet, dielectric, mirror, and Blinn-Phong. I also had implementations of depth of field and of point, area, and distant sphere lights. The following extensions were made to Nori.

Textures
================================================================================

I have added support for procedural and image-based textures in Nori. There are four texture classes, namely image, fBm, turbulence, and Perlin textures; the code for noise generation was taken from pbrt-v3. All texture classes derive from a base interface in Texture.h, which is an ETexture class type and a NoriObject; a minimal sketch of this interface is given below. Textures are loaded from .xml files into the scene with the "texture" tag, whose type attribute selects one of the five texture classes (the four above plus a constant texture). Since textures are always associated with a BSDF, they appear in the .xml file as child nodes of a BSDF. The diffuse and Blinn-Phong BSDF classes were extended to accept a texture object as a child node. If no texture object is specified, a constant texture is used by default. Each BSDF class also has a fixed color value, to which the texture provides variation, as shown in the example in the next subsection.
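As a rough illustration of that interface (the exact method names and signatures here are my paraphrase, not the project's actual code; Color3f, Point2f, NoriObject, and EClassType are the usual Nori types):

```cpp
// Minimal sketch of a texture base interface in a Nori-style hierarchy.
// The method name eval() and the ConstantTexture fallback are assumptions.
class Texture : public NoriObject {
public:
    // Evaluate the texture color at a surface point given its UV coordinates.
    virtual Color3f eval(const Point2f &uv) const = 0;

    // Report this object's type to the Nori object factory.
    EClassType getClassType() const { return ETexture; }
};

// Default used when a BSDF has no texture child node.
class ConstantTexture : public Texture {
public:
    ConstantTexture(const Color3f &value = Color3f(1.f)) : m_value(value) {}
    Color3f eval(const Point2f &) const { return m_value; }
private:
    Color3f m_value;
};
```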

Texture from .xml

    <mesh type="obj">
        <string name="filename" value="meshes/walls.obj"/>
        <bsdf type="texture_diffuse">
            <texture type="fbmTexture">
                <integer name="octaves" value="16"/>
                <float name="roughness" value="0.5"/>
                <point name="scale" value="8 4 1"/>
                <string name="id" value="albedo"/>
            </texture>
            <color name="albedo" value="1 1 1"/>
        </bsdf>
    </mesh>

In the following examples, the different textures are used to change the color of the diffuse material, with octaves = 8, roughness = 0.5, and scale 1.
Image Fbm Turbulence Perlin

Texture with base color

In the following examples, the same textures are used, but now a different base color is already assigned to the surface.
Image Fbm Turbulence Perlin

Scale change

In the following examples, the fBm texture is rendered with different scale values.
scale(x, y)=4 scale(x, y)=8 scalex=4

Blinn Phong: exponent texture

A texture can also be used to vary the exponent of the Blinn-Phong BSDF. In both cases the base exponent is 10000, which gives sharp specularity; the exponent is modulated by an fBm texture with octaves = 8, roughness = 0.9, and scale (8, 8, 1). As the images show, this effect cannot be obtained without a texture, using a constant exponent of either 10000 or 100. A sketch of the exponent modulation is given after the images.
with texture without texture exp=10000 without texture exp=100
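A minimal sketch of such an exponent modulation (the member names m_exponentTexture and m_exponent, and the luminance-based scaling, are my assumptions rather than the project's actual code):

```cpp
// Sketch: scale the Blinn-Phong exponent by a texture lookup.
// Assumes a Nori-style Color3f with getLuminance() and a Texture::eval(uv).
float BlinnPhong::effectiveExponent(const Point2f &uv) const {
    float scale = 1.f;
    if (m_exponentTexture)                                   // optional texture child node
        scale = m_exponentTexture->eval(uv).getLuminance();  // roughly in [0, 1]
    return m_exponent * scale;                               // base exponent, e.g. 10000
}
```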
Materials
================================================================================

The previous implementations of the microfacet and Blinn BSDFs scale the specular coefficient by 1 - m_kd in order to conserve energy. As a result, it was not possible to obtain a specular surface for, say, a pure red color.

Multiple BSDFs

Therefore, I implemented two new classes, diffuseMicrofacet and diffuseBlinn, which add a rough glossy or glossy effect to a diffuse surface; a sketch of the idea is given after the images below.
blinn microfacet diffuse and blinn diffuse and microfacet
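As a rough sketch of how such a combined BSDF can evaluate both lobes (the class and member names, and the 50/50 lobe selection implied by the pdf, are my assumptions, not the actual implementation):

```cpp
// Sketch: a diffuse lobe plus a glossy lobe, without scaling the specular
// term by (1 - m_kd). Assumes nested BSDF objects m_diffuse and m_glossy
// following Nori's BSDF interface.
Color3f DiffuseMicrofacet::eval(const BSDFQueryRecord &bRec) const {
    // Sum of the two lobes; the diffuse albedo can come from a texture.
    return m_diffuse->eval(bRec) + m_glossy->eval(bRec);
}

float DiffuseMicrofacet::pdf(const BSDFQueryRecord &bRec) const {
    // Mixture pdf matching a sample() that picks each lobe with probability 0.5.
    return 0.5f * (m_diffuse->pdf(bRec) + m_glossy->pdf(bRec));
}
```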

Multiple BSDFs: with texture

The diffuse color can be obtained from a texture of any of the above types.
blinnFbm Texture microfacet Image Texture

Anisotropic Microfacet

Anisotropic microfacet BRDF using the Beckmann distribution: I used the following paper for the implementation: "Measuring and Modeling Anisotropic Reflection", Gregory J. Ward, Computer Graphics (Proceedings of SIGGRAPH 92). However, I think I am not sampling correctly; the images below show the problem with my implementation, where the specular highlights are correct in only one hemisphere. The reference sampling equations are given after the images.
ax=ay=0.1 ax/ay=0.1 ay/ax=0.4
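For reference, the anisotropic Beckmann distribution and the half-vector sampling used, for example, in pbrt-v3 (transcribed here by hand, so treat the exact form as my own summary rather than the project's code) are

$$ D(\omega_h) = \frac{1}{\pi \alpha_x \alpha_y \cos^4\theta_h}
   \exp\!\left(-\tan^2\theta_h\left(\frac{\cos^2\phi_h}{\alpha_x^2}
   + \frac{\sin^2\phi_h}{\alpha_y^2}\right)\right), $$

and, with uniform samples $\xi_1, \xi_2 \in [0,1)$,

$$ \phi_h = \arctan\!\left(\frac{\alpha_y}{\alpha_x}\tan(2\pi\xi_2)\right), \qquad
   \tan^2\theta_h = \frac{-\log \xi_1}{\cos^2\phi_h/\alpha_x^2 + \sin^2\phi_h/\alpha_y^2}, $$

where $\phi_h$ must be placed in the same quadrant as $2\pi\xi_2$. A missing quadrant correction is one possible cause of highlights appearing correct in only one hemisphere.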

Merl BRDF Dataset

I added a measured BSDF class which loads data from the MERL BRDF database: "A Data-Driven Reflectance Model", Wojciech Matusik, Hanspeter Pfister, Matt Brand, and Leonard McMillan, ACM Transactions on Graphics 22. The two materials shown below are gold paint and silver. The code for reading the binary files was taken from http://people.csail.mit.edu/wojciech/BRDFDatabase/; a sketch of the file layout it assumes is given after the images.
Gold Paint Silver
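The MIT reader code expects a simple layout: three 32-bit integer dimensions followed by a dense array of doubles. A minimal sketch of such a loader (my own paraphrase; the project uses the original reader from the website above):

```cpp
// Sketch: load a MERL .binary file. Layout per the MIT reference reader:
// three 32-bit ints (theta_half, theta_diff, phi_diff resolutions, typically
// 90 x 90 x 180), followed by dims[0]*dims[1]*dims[2]*3 doubles (RGB planes).
#include <cstdio>
#include <cstdint>
#include <vector>

bool loadMerl(const char *path, std::vector<double> &data) {
    FILE *f = std::fopen(path, "rb");
    if (!f) return false;
    int32_t dims[3];
    if (std::fread(dims, sizeof(int32_t), 3, f) != 3) { std::fclose(f); return false; }
    size_t n = size_t(dims[0]) * dims[1] * dims[2] * 3;
    data.resize(n);
    bool ok = std::fread(data.data(), sizeof(double), n, f) == n;
    std::fclose(f);
    return ok;
}
```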
Lights
================================================================================

Spotlights

The spotlight is a point emitter with properties such as its position, direction, total cone angle, falloff angle, and color. The various settings for the spotlight are shown below. In the first three images, the total angle subtended is 45 degrees and the falloff angle is 30 degrees; in the last image the falloff angle is reduced to 20 degrees. All of them use a white light. Starting from the first image, the spotlight is moved closer to the object in the second image, rotated in the third, and its falloff is reduced in the fourth. A sketch of the falloff term is given after the images.
spotlight closer to object rotated fall off=20
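A minimal sketch of a pbrt-style spotlight falloff factor between the falloff angle and the total cone angle (whether this project uses exactly this interpolation is an assumption):

```cpp
// Sketch: spotlight falloff. cosTotal and cosFalloff are the cosines of the
// total cone half-angle and of the angle where the falloff begins
// (cosFalloff >= cosTotal). cosTheta is the cosine between the spot
// direction and the direction toward the shaded point.
float spotFalloff(float cosTheta, float cosTotal, float cosFalloff) {
    if (cosTheta < cosTotal)   return 0.f;  // outside the cone
    if (cosTheta > cosFalloff) return 1.f;  // inside the inner (full-intensity) cone
    float d = (cosTheta - cosTotal) / (cosFalloff - cosTotal);
    return (d * d) * (d * d);               // smooth quartic transition
}
```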

Environment Maps

The environment map is implemented with importance sampling. Rays that do not hit any object are sent to the environment map for a radiance query. The environment map supports scale and rotation but not translation, since the map is at infinity. An example environment map is shown below, followed by a sketch of the direction-to-UV mapping.
envMap rotated=45 rotate=90
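A minimal sketch of mapping a world-space direction to lat-long UV coordinates with a rotation about the up axis (the axis convention and rotation direction are assumptions; only rotation and scale are meaningful for a map at infinity):

```cpp
// Sketch: direction -> lat-long UV with a rotation about the up (y) axis.
// Assumes the direction (dx, dy, dz) is normalized.
#include <cmath>

void dirToUV(float dx, float dy, float dz, float rotationDeg, float &u, float &v) {
    const float PI = 3.14159265358979f;
    float phi = std::atan2(dz, dx) + rotationDeg * PI / 180.f;     // azimuth + rotation
    phi -= 2.f * PI * std::floor(phi / (2.f * PI));                // wrap into [0, 2*pi)
    float theta = std::acos(std::fmin(1.f, std::fmax(-1.f, dy)));  // polar angle from +y
    u = phi / (2.f * PI);
    v = theta / PI;
}
```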
Medium
================================================================================

Henyey Greenstein Phase Function

The phase function is used by a medium to sample a new direction at a scattering event. I implemented sampling of the Henyey-Greenstein phase function for this purpose. Shown below are snapshots of points sampled using the Henyey-Greenstein phase function for g=0, g=-0.7, and g=0.7, followed by a snapshot of the hypothesis test and a sketch of the sampling routine.
g=0 g=-0.7 g=0.7 hypothesis test
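A minimal sketch of sampling cos(theta) from the Henyey-Greenstein phase function by inverting its CDF (sign conventions differ between references; here theta is measured from the forward, i.e. incident propagation, direction, so g > 0 gives forward scattering):

```cpp
// Sketch: Henyey-Greenstein sampling of the scattering angle.
// xi is a uniform random number in [0, 1).
#include <cmath>

float sampleHGCosTheta(float g, float xi) {
    if (std::fabs(g) < 1e-3f)
        return 1.f - 2.f * xi;  // isotropic limit
    float s = (1.f - g * g) / (1.f - g + 2.f * g * xi);
    return (1.f + g * g - s * s) / (2.f * g);
}
// The azimuth is sampled uniformly in [0, 2*pi), and the resulting direction
// is expressed in a frame aligned with the incident direction.
```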

Medium from .xml

I have implemented a homogeneous medium in homogenous.cpp, which derives from the base interface Medium. A homogeneous medium can be included in the scene through the .xml file with the following interface. All mediums are children of the Scene class. A medium has a name, a scattering coefficient sigma_s, an absorption coefficient sigma_a, and a phase function:

    <medium type="homogeneous">
        <string name="name" value="fog"/>
        <color name="sigma_s" value="0.5, 0.5, 0.5"/>
        <color name="sigma_a" value="0, 0, 0"/>
        <phase type="HenyeyGreenstein">
            <float name="g" value="0"/>
        </phase>
    </medium>

The name can then be used by any mesh, camera, or light to select this medium as its inside or outside medium, for example,

    <string name="out_media" value="fog"/>

or,

    <string name="in_media" value="fog"/>

The Scene class, during its activate() function, assigns a medium to each object by the name that the object specifies. This can create inconsistent configurations, where the outside medium of the camera is one medium but the outside medium of a mesh is another; the design assumes that the user specifies a consistent configuration.

Integrator
================================================================================

The integrator that performs volumetric path tracing is implemented in path_vol.cpp. It uses next event estimation at the points sampled along the ray inside the medium. If the medium is inside an object, the shadow rays are connected directly through the mesh when its BSDF is null. A sketch of the distance sampling and transmittance used for these connections is given below.
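A minimal sketch of homogeneous free-flight sampling and transmittance, reduced to a single channel for clarity (the actual implementation presumably works with RGB coefficients and may pick a channel to sample; the names here are assumptions):

```cpp
// Sketch: distance sampling and transmittance in a homogeneous medium,
// with sigma_t = sigma_a + sigma_s, single channel for simplicity.
#include <cmath>

struct HomogeneousMedium1D {
    float sigma_a, sigma_s;
    float sigmaT() const { return sigma_a + sigma_s; }

    // Free-flight distance with pdf sigma_t * exp(-sigma_t * t), xi in [0, 1).
    float sampleDistance(float xi) const {
        return -std::log(1.f - xi) / sigmaT();
    }

    // Beer-Lambert transmittance over a segment of length t, used when
    // connecting shadow rays for next event estimation.
    float transmittance(float t) const {
        return std::exp(-sigmaT() * t);
    }
};
```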

Basic Test

path mis sigma_s=0 sigma_a=0 sigma_s=0 sigma_a=0.15 sigma_s=0.15 sigma_a=0

Effect of g

The scene contains only the medium and lights (an area light and a spotlight), with the scattering coefficient fixed at 0.25 for all color channels. Three different values of g are shown: g=0, g=0.9, and g=-0.9.
g=0 g=0.9 g=-0.9

Inside an object

In the following two examples, the medium is restricted to a sphere.
sigma_s=1 sigma_a=0 sigma_s=0 sigma_a=0 0.85 0.85
Final Scene
==================

final image

The reflections here are not generated by a rendering trick, but by careful placement of the objects and the mirror in the scene. A snapshot of the scene is given below, with the camera marked in red. The mirror is actually an ellipse, which becomes a circle after projection. This hides the cues of perspective, so it looks as if we are viewing the mirror head-on, while in reality it is viewed at an angle. The trick is revealed clearly when a circular mirror is used instead of an ellipse.
top camera
Conclusion
==================

All the files are pushed to git. The test files shown in this report are in the results/project/testFiles folder, and the final scene's Blender file, xml file, meshes, textures, and environment map are kept in the results/project/finalScene folder. The scene folder itself is not pushed because of its large size.