The models were made in Autodesk Maya; some of them were downloaded from http://3d.csie.ntu.edu.tw/~dynamic/database/index.html and https://www.turbosquid.com/
The "new": fresh flowers, tea cup (modern ceramics), new table. The "old": withered flowers and greenish water, ancient vase, antique wall paper.
src/micro_dielectric.cpp
Reference: https://www.cs.cornell.edu/~srm/publications/EGSR07-btdf.pdf
The implementation of this rough dielectric is a modified version of the dielectric from our previous assignment. The Fresnel coefficient is computed the same way as for the smooth dielectric, but the outgoing direction is computed using a half-vector. I use the Beckmann distribution to sample the half-vector; then, based on the Fresnel coefficient, we decide whether the outgoing light is the reflection or the refraction of the incoming light.
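The sampling logic is summarised below as a minimal, self-contained C++ sketch. This is not the code in src/micro_dielectric.cpp (which uses Nori's BSDF interface and warping helpers); the helper names, the extra random number u3 used for the reflect/refract decision, and the refraction formula (Walter et al. 2007) are assumptions made for illustration.

    #include <algorithm>
    #include <cmath>
    #include <utility>

    struct Vec3 { float x, y, z; };

    static float dot(const Vec3 &a, const Vec3 &b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
    static Vec3 scale(const Vec3 &v, float s)      { return { v.x*s, v.y*s, v.z*s }; }
    static Vec3 sub(const Vec3 &a, const Vec3 &b)  { return { a.x-b.x, a.y-b.y, a.z-b.z }; }

    // Sample a Beckmann-distributed half-vector in the local frame (z = shading normal).
    static Vec3 sampleBeckmann(float u1, float u2, float alpha) {
        float phi = 2.f * 3.14159265f * u1;
        float tanTheta2 = -alpha * alpha * std::log(1.f - u2);
        float cosTheta = 1.f / std::sqrt(1.f + tanTheta2);
        float sinTheta = std::sqrt(std::max(0.f, 1.f - cosTheta * cosTheta));
        return { sinTheta * std::cos(phi), sinTheta * std::sin(phi), cosTheta };
    }

    // Unpolarized Fresnel reflectance of a dielectric (same formula as the smooth case).
    static float fresnelDielectric(float cosThetaI, float extIOR, float intIOR) {
        float etaI = extIOR, etaT = intIOR;
        if (cosThetaI < 0.f) { std::swap(etaI, etaT); cosThetaI = -cosThetaI; }
        float eta = etaI / etaT;
        float sinThetaT2 = eta * eta * (1.f - cosThetaI * cosThetaI);
        if (sinThetaT2 > 1.f) return 1.f;                  // total internal reflection
        float cosThetaT = std::sqrt(1.f - sinThetaT2);
        float Rs = (etaI*cosThetaI - etaT*cosThetaT) / (etaI*cosThetaI + etaT*cosThetaT);
        float Rp = (etaT*cosThetaI - etaI*cosThetaT) / (etaT*cosThetaI + etaI*cosThetaT);
        return 0.5f * (Rs*Rs + Rp*Rp);
    }

    // wi is the incident direction (unit length, pointing away from the surface in the
    // local frame); pick the reflected or refracted direction about the half-vector.
    Vec3 sampleRoughDielectric(const Vec3 &wi, float alpha, float extIOR, float intIOR,
                               float u1, float u2, float u3) {
        Vec3 wh = sampleBeckmann(u1, u2, alpha);
        float c = dot(wi, wh);
        float F = fresnelDielectric(c, extIOR, intIOR);
        if (u3 < F)
            return sub(scale(wh, 2.f * c), wi);            // specular reflection about wh
        // Refraction through wh (Walter et al. 2007, Eq. 40).
        float eta  = (c > 0.f) ? extIOR / intIOR : intIOR / extIOR;
        float sign = (c > 0.f) ? 1.f : -1.f;
        float k = 1.f - eta * eta * (1.f - c * c);         // >= 0 here, since F < 1
        return sub(scale(wh, eta * c - sign * std::sqrt(k)), scale(wi, eta));
    }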
The following comparison shows that if the roughness value (alpha) of the rough dielectric is zero (top rectangle in the right image), the result should be identical to the normal dielectric (top and bottom rectangles in the left image). The bottom rectangle in the right image has a roughness value of 0.5.
Path tracing with multiple importance sampling. On the left, the two planes are both smooth dielectric. On the right, the top one is a microfacet dielectric with alpha (roughness) = 0, and the bottom one is a microfacet dielectric with alpha (roughness) = 0.5.
src/paper.cpp
This feature is for the semi-transparent look of the flower petals and the curtain. Since time was limited, I was not able to implement subsurface scattering, so I made this modified version of the diffuse BSDF to achieve the transparent look.
I set up a transparency value in the XML scene file. When we sample the ray, if the random number is smaller than the transparency, the outgoing direction is the same as the incoming direction. In other words, the light passes through the mesh. Otherwise, we do cosine-weighted sampling, the same as in the normal diffuse BSDF. One difference from the normal diffuse BSDF is that we also handle the negative normal (i.e., the ray hits the back of the mesh, or comes from inside it); in that case, we reverse the normal and proceed as before.
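A minimal sketch of this sampling logic is shown below, written in the local shading frame (z = normal) with illustrative names; the actual src/paper.cpp uses Nori's BSDF interface, so this is only meant to convey the idea.

    #include <algorithm>
    #include <cmath>

    struct Vec3 { float x, y, z; };

    // Cosine-weighted hemisphere sample about +z (same routine as the normal diffuse BSDF).
    static Vec3 squareToCosineHemisphere(float u1, float u2) {
        float r = std::sqrt(u1), phi = 2.f * 3.14159265f * u2;
        return { r * std::cos(phi), r * std::sin(phi), std::sqrt(std::max(0.f, 1.f - u1)) };
    }

    // wi is the incident direction in the local frame (z = shading normal);
    // 'transparency' is the value read from the XML scene description.
    Vec3 samplePaper(const Vec3 &wi, float transparency, float u0, float u1, float u2) {
        if (u0 < transparency) {
            // Transmission: the ray keeps its world-space direction,
            // which in the local wi/wo convention means wo = -wi.
            return { -wi.x, -wi.y, -wi.z };
        }
        // Otherwise behave like a diffuse surface.
        Vec3 wo = squareToCosineHemisphere(u1, u2);
        if (wi.z < 0.f) {
            // Hit from the back (or inside) of the mesh: reverse the normal,
            // i.e. flip the cosine-weighted lobe to the incoming side.
            wo.z = -wo.z;
        }
        return wo;
    }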
The following comparison demonstrates the result of this paper BSDF when the transparency value is set to 0, 0.5 and 1 (left to right).
include/nori/texture.h
src/image_texture.cpp
src/diffuse_texture.cpp
src/blinn_texture.cpp
src/paper.cpp
src/fresnal_texture.cpp
src/micro_dielectric.cpp
src/specular_texture.cpp
The image loader stb_image.h is already included in the base code, so I can use it directly. I can import image files in .exr, .png and various other formats, as long as stb_image.h supports them.
The texture is a simple UV texture. Given a UV coordinate from the intersection of the ray and a mesh, I compute the corresponding pixel in the texture image simply by multiplying the width and height of the image by the UV coordinate (values from 0 to 1). This method does not handle low-resolution images well, since it produces aliasing.
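The lookup itself is essentially the following nearest-neighbour mapping; the Image struct and field names here are illustrative, not the actual texture class.

    #include <algorithm>
    #include <cstdint>

    struct Image {
        int width, height;
        const std::uint8_t *rgb;   // width * height * 3 bytes, row-major
    };

    // Scale the UV coordinate (each component in [0,1]) by the image size and read
    // that texel directly.  There is no filtering, hence the aliasing mentioned above.
    void lookup(const Image &img, float u, float v, float rgbOut[3]) {
        int x = std::min(img.width  - 1, std::max(0, int(u * img.width)));
        int y = std::min(img.height - 1, std::max(0, int(v * img.height)));
        const std::uint8_t *p = img.rgb + 3 * (y * img.width + x);
        for (int c = 0; c < 3; ++c)
            rgbOut[c] = p[c] / 255.f;
    }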
I was able to use a texture as the albedo in src/diffuse_texture.cpp, src/blinn_texture.cpp, src/paper.cpp and src/fresnal_texture.cpp. For src/micro_dielectric.cpp, the texture can replace the roughness value, which makes the roughness vary across the surface. src/specular_texture.cpp is very similar to the Blinn BSDF, but instead of using m_kd and a random number to decide whether a sample is diffuse or specular, we use a black-and-white texture to indicate which pixels are diffuse and which are specular.
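The per-texel decision in src/specular_texture.cpp amounts to the sketch below; the MaskTexture struct and the 0.5 threshold are assumptions for illustration, not the actual code.

    #include <cstdint>

    struct MaskTexture {
        int width, height;
        const std::uint8_t *gray;                 // one byte per texel, row-major
        // Same kind of UV lookup as sketched above, but returning a single grey value.
        float maskAt(float u, float v) const {
            int x = int(u * width)  % width;
            int y = int(v * height) % height;
            return gray[y * width + x] / 255.f;
        }
    };

    // White regions of the mask behave like the specular (Blinn) lobe,
    // black regions like the diffuse lobe.
    bool treatAsSpecular(const MaskTexture &mask, float u, float v) {
        return mask.maskAt(u, v) > 0.5f;
    }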
src/environmental_light
The environmental lighting lets us get the radiance from a texture. My implementation is just a simple emitter without importance sampling. To get the corresponding pixel on the texture map, I use the incoming ray direction to compute the two spherical angles of the direction (i.e., the longitude and latitude), then use these two angles as a UV coordinate to find the pixel on the texture map.
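The direction-to-UV mapping can be sketched as follows (a lat-long / equirectangular parameterisation); the function name and exact angle conventions are assumptions for illustration.

    #include <algorithm>
    #include <cmath>

    struct Vec3 { float x, y, z; };

    // Convert a normalised ray direction into its two spherical angles (longitude and
    // latitude) and use them as a UV coordinate into the environment texture.
    void directionToUV(const Vec3 &d, float &u, float &v) {
        float phi   = std::atan2(d.y, d.x);                            // longitude in (-pi, pi]
        float theta = std::acos(std::min(1.f, std::max(-1.f, d.z)));   // latitude  in [0, pi]
        u = (phi + 3.14159265f) / (2.f * 3.14159265f);
        v = theta / 3.14159265f;
    }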
include/nori/medium.h
src/homogeneous
src/path_volumetric
src/path_volumetric2
I implemented a medium class for participating media. So far I only support homogeneous volumes and isotropic phase functions. The medium can be attached to the scene or to a mesh. I have two volumetric path tracing (VPT) integrators, one for each case.
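The homogeneous medium boils down to the three operations sketched below (transmittance, free-flight distance sampling, and isotropic phase sampling); this is a self-contained illustration with assumed names, not the actual class in include/nori/medium.h.

    #include <algorithm>
    #include <cmath>

    struct Vec3 { float x, y, z; };

    // Homogeneous medium with an isotropic phase function.
    // sigma_a = absorption coefficient, sigma_s = scattering coefficient.
    struct HomogeneousMedium {
        float sigma_a, sigma_s;
        float sigmaT() const { return sigma_a + sigma_s; }

        // Beer-Lambert transmittance over a distance t inside the medium.
        float transmittance(float t) const { return std::exp(-sigmaT() * t); }

        // Sample a free-flight distance proportionally to the transmittance
        // (assumes sigmaT() > 0).
        float sampleDistance(float u) const { return -std::log(1.f - u) / sigmaT(); }

        // Isotropic phase function: a new direction uniformly over the sphere.
        Vec3 samplePhase(float u1, float u2) const {
            float z   = 1.f - 2.f * u1;
            float r   = std::sqrt(std::max(0.f, 1.f - z * z));
            float phi = 2.f * 3.14159265f * u2;
            return { r * std::cos(phi), r * std::sin(phi), z };
        }
    };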
If the medium is attached to the scene, I use src/path_volumetric as the integrator.
Below is a comparison of a scene without any medium and a scene with a medium of zero absorption and scattering. In theory, these two results should be identical.
If the medium is attached to a mesh, I use src/path_volumetric2 as the integrator.
In this case, VPT is only triggered when the ray is inside the medium; otherwise, we use BSDF sampling to generate the new ray.
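That decision can be sketched as below, reusing the HomogeneousMedium sketch from the previous block; the helper names and the ScatterDecision struct are assumptions, not the actual integrator code in src/path_volumetric2.

    // Decide, for one bounce, whether to scatter inside the medium or at the surface.
    struct ScatterDecision {
        bool  inMedium;   // true -> scatter in the medium at distance t
        float t;          // free-flight distance (or tHit if the surface is reached)
        float weight;     // sigma_s / sigma_t for a medium event, 1 otherwise
    };

    // medium == nullptr means the current ray segment is outside any mesh-attached
    // medium; in that case we fall back to ordinary BSDF sampling at the surface.
    ScatterDecision decideScatter(const HomogeneousMedium *medium, float tHit, float u) {
        if (medium) {
            float t = medium->sampleDistance(u);
            if (t < tHit)
                return { true, t, medium->sigma_s / medium->sigmaT() };
        }
        return { false, tHit, 1.f };
    }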
Below is a comparison of a medium with high absorption and very little scattering, and a medium with the same absorption but more scattering.
Below is a comparison of a scene without any medium and a scene with a medium of zero absorption and scattering. In theory, these two results should be identical.