Motivation
Theme: Something Old and Something New
Part 1: System Overview
What I had before
Before the final project, my renderer already supported motion blur, camera blur, naive path tracing, and multiple importance sampling path tracing. The BRDF classes I already had were mirror, dielectric, and isotropic microfacet Blinn and Phong.
What I added to the system during the final project
1. Texture (12 pt)
Added a Texture class as a subclass of nori::NoriObject.
Textures support image textures, solid textures and procedural textures.
Solid textures support pure colors and a checkerboard pattern.
Procedural textures include Perlin noise, fBm and turbulence.
A texture value can replace a parameter of a BSDF.
2. Volumetric (12 pt)
Added a Medium class as a subclass of nori::NoriObject, together with a medium interface structure assigned to each mesh indicating its inside and outside media. A medium can also be assigned to the camera.
The Medium class supports homogeneous media. It also supports heterogeneous media generated by a procedural texture (Perlin noise, for example), using Woodcock tracking for heterogeneous free-path sampling.
Added volume_path_mats to do naive path tracing in scenes with media.
Added volume_path_mis to do multiple importance sampling in scenes with media.
Added a PhaseFunction class as a subclass of nori::NoriObject, including the Henyey-Greenstein phase function.
3. Anisotropic Material (6 pt)
Added two BRDF subclasses for anisotropic materials:
Anisotropic microfacet,
Anisotropic Phong BRDF.
Implemented both of their distribution functions and sampling methods together with their shadowing and masking functions.
Modified the shading frame using the UV tangent at the hit point so that objects have continuous UVs.
4. Lighting (2 pt)
Implemented environment mapping as an environment light.
Part 2: Volumetric Rendering
related files:
include/nori/media.h
include/nori/phasefunction.h
src/volume.cpp
src/vol_path.cpp
src/vol_path_mis.cpp
src/heter_media2.cpp
src/homo_media.cpp
src/henyey.cpp
My media are defined by mesh boundaries: each mesh stores its outgoing and ingoing medium, and when a ray hits a surface, the integrator computes the next medium. Cameras and emitters also keep a pointer to a medium, so I can render a scene with fog filling the whole scene.
The scene keeps a medium map and loads the media onto the boundaries before the rendering process.
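As a rough illustration of this per-mesh interface, here is a minimal sketch; the names MediumInterface, inside and outside are assumptions for illustration, not the exact types used in the project.

// Hypothetical sketch of a per-mesh medium interface; the names
// Medium, MediumInterface, inside and outside are assumptions.
struct Medium;  // opaque forward declaration

struct MediumInterface {
    const Medium *inside  = nullptr;  // medium on the back side of the surface
    const Medium *outside = nullptr;  // medium on the front (camera-facing) side

    // Pick the medium the ray enters after crossing the boundary,
    // based on the sign of cos(theta) between ray direction and normal.
    const Medium *transmitted(float cosTheta) const {
        return cosTheta < 0.f ? inside : outside;
    }
};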
Phase Function
The phase function I use for scattering is Henyey-Greenstein. Below is the validation of the distribution and probability: the first image shows a uniform scattering distribution, the second image shows a forward-scattering distribution, and the last image compares the sample histogram with the integrated density.
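For reference, here is a minimal sketch of Henyey-Greenstein evaluation and inverse-CDF sampling. The sign convention (cosTheta measured against the propagation direction, so g > 0 means forward scattering) is an assumption and may differ from the project's code.

#include <cmath>

// Sketch of Henyey-Greenstein evaluation and sampling (convention assumed as above).
static const float INV_FOURPI = 0.25f / float(M_PI);

// p(cosTheta) = (1 - g^2) / (4*pi * (1 + g^2 - 2 g cosTheta)^{3/2})
float hgPdf(float cosTheta, float g) {
    float denom = 1.f + g * g - 2.f * g * cosTheta;
    return INV_FOURPI * (1.f - g * g) / (denom * std::sqrt(denom));
}

// Inverse-CDF sampling of cosTheta from a uniform random number xi in [0,1).
float hgSampleCosTheta(float xi, float g) {
    if (std::abs(g) < 1e-3f)
        return 1.f - 2.f * xi;              // nearly isotropic: uniform on the sphere
    float sqrTerm = (1.f - g * g) / (1.f - g + 2.f * g * xi);
    return (1.f + g * g - sqrTerm * sqrTerm) / (2.f * g);
}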
Homogeneous implementation
For volumetric rendering I use path tracing with free-path sampling. First I trace a ray to find the nearest surface, then I sample a distance proportional to the transmittance term and check whether I am still inside the medium or have reached the surface. If I hit a surface, I do BSDF sampling; if I am in the medium, I do phase function sampling.
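A minimal sketch of this free-path sampling step for a homogeneous medium is shown below, assuming scalar sigmaT and sigmaS coefficients; the interface is illustrative, not the project's actual one.

#include <cmath>

// Sketch of homogeneous free-path sampling; names and return convention are assumptions.
struct FreePathSample {
    bool  hitMedium;   // true: scatter inside the medium, false: reached the surface
    float t;           // sampled distance along the ray
    float weight;      // throughput factor (transmittance / pdf)
};

// Sample a distance proportional to the transmittance exp(-sigmaT * t),
// truncated at the nearest surface distance tSurface.
FreePathSample sampleFreePath(float sigmaT, float sigmaS, float tSurface, float xi) {
    FreePathSample s;
    float t = -std::log(1.f - xi) / sigmaT;   // pdf(t) = sigmaT * exp(-sigmaT * t)
    if (t < tSurface) {
        // Medium interaction: transmittance / pdf cancels, leaving the single-scattering albedo.
        s.hitMedium = true;
        s.t = t;
        s.weight = sigmaS / sigmaT;
    } else {
        // Surface interaction: the probability of flying past the surface is exp(-sigmaT * tSurface),
        // which also cancels against the transmittance.
        s.hitMedium = false;
        s.t = tSurface;
        s.weight = 1.f;
    }
    return s;
}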
Here are the homogeneous volume_path_mats versions: the top-left image has no medium and the top-right has a medium with sigma_t equal to zero, which matches the left one exactly. The bottom-left image shows a purely absorbing medium and the bottom-right an absorbing as well as scattering medium.
Volume_path_trace_naive
To reduce noise I also implemented a multiple importance sampling version of volumetric path tracing. At each evaluation point I cast a shadow ray as well as a BSDF / phase function sample and weight them together. The emitter-strategy pdf I use is the probability of hitting the light source multiplied by the probability of sampling a free path larger than tmax. Here are the volumetric MIS versions.
Volume_path_trace_mis
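The pdf combination described above can be sketched as follows for a homogeneous medium; the balance heuristic is shown as one common weighting choice, and the function names are assumptions.

#include <cmath>

// Sketch of the combined pdf used for MIS in the volumetric integrator, following the
// report: the emitter-sampling pdf of a direction is multiplied by the probability that
// free-path sampling flies past tMax. Names are assumptions, not the project's interface.
float emitterStrategyPdf(float pdfEmitterDir, float sigmaT, float tMax) {
    // P(free path > tMax) = exp(-sigmaT * tMax) for a homogeneous medium.
    return pdfEmitterDir * std::exp(-sigmaT * tMax);
}

// Balance heuristic used to weight the two strategies
// (emitter sampling vs. BSDF / phase-function sampling).
float balanceHeuristic(float pdfA, float pdfB) {
    return pdfA / (pdfA + pdfB);
}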
Heterogeneous implementation
For heterogeneous media, the free-path sampling step uses Woodcock tracking, which fills the volume with additional fictitious particles to make the combined medium homogeneous. I implemented free-path sampling as a method of the medium class, so the heterogeneous medium comes with a Woodcock sampling method.
cbox_path_mats_ref
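The Woodcock tracking loop described above can be sketched as follows, assuming a callable sigmaT(t) along the ray and a constant majorant sigmaBar; this is an illustration rather than the project's exact method.

#include <cmath>
#include <functional>
#include <random>

// Minimal sketch of Woodcock (delta) tracking for a heterogeneous medium.
// sigmaT(t) returns the extinction coefficient at distance t along the ray,
// sigmaBar is a majorant with sigmaT(t) <= sigmaBar everywhere; both are illustrative names.
float woodcockFreePath(const std::function<float(float)> &sigmaT,
                       float sigmaBar, float tMax, std::mt19937 &rng) {
    std::uniform_real_distribution<float> uni(0.f, 1.f);
    float t = 0.f;
    while (true) {
        // Tentative step through the homogenized (real + fictitious) medium.
        t -= std::log(1.f - uni(rng)) / sigmaBar;
        if (t >= tMax)
            return tMax;                    // left the medium without a real collision
        // Accept a real collision with probability sigmaT / sigmaBar,
        // otherwise it was a collision with a fictitious particle: keep walking.
        if (uni(rng) < sigmaT(t) / sigmaBar)
            return t;
    }
}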
Part 3: Texture and Noise
related files:
include/nori/texture.h
src/procedure_fbm.cpp
src/procedure_checker.cpp
src/procedure_perlin.cpp
src/marble.cpp
src/texture.cpp
src/solid.cpp
Textures are implemented as Nori objects. Each scene keeps a texture map, and during the rendering preprocess the system loads texture pointers into the meshes and emitters.
Each texture has a getColor method taking UV values. If the texture is marked as a 3D texture, the method calls getNoise to return a spectrum value for a given point position.
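A minimal sketch of this interface is shown below; the method names follow the report, but the signatures and the ETexture class-type entry are assumptions.

#include <nori/object.h>
#include <nori/color.h>

// Sketch of the texture interface described above (signatures are illustrative).
class Texture : public nori::NoriObject {
public:
    // 2D lookup from surface UV coordinates.
    virtual nori::Color3f getColor(const nori::Point2f &uv) const = 0;

    // 3D lookup used when the texture is flagged as a 3D (solid/procedural) texture.
    virtual nori::Color3f getNoise(const nori::Point3f &p) const {
        return nori::Color3f(0.f);
    }

    // Assumes an ETexture value was added to NoriObject's EClassType enum.
    virtual EClassType getClassType() const override { return ETexture; }
};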
The ImageTexture loads images through the external library "stb_image". I convert the unsigned chars to floats ranging from 0 to 1 and store them in a bitmap.
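Here is a sketch of that loading step using the stb_image API, converting the 8-bit channels to floats in [0, 1]; the helper name and the flat pixel layout are illustrative rather than the project's bitmap class.

#define STB_IMAGE_IMPLEMENTATION
#include "stb_image.h"
#include <stdexcept>
#include <string>
#include <vector>

// Load an RGB image and convert its 8-bit channels to floats in [0, 1].
std::vector<float> loadImageAsFloat(const std::string &filename, int &w, int &h) {
    int channels = 0;
    unsigned char *data = stbi_load(filename.c_str(), &w, &h, &channels, 3);
    if (!data)
        throw std::runtime_error("Could not load texture: " + filename);
    std::vector<float> pixels(static_cast<size_t>(w) * h * 3);
    for (size_t i = 0; i < pixels.size(); ++i)
        pixels[i] = data[i] / 255.f;       // map [0, 255] to [0, 1]
    stbi_image_free(data);
    return pixels;
}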
The parameters and colors of the Perlin noise, fBm, turbulence and marble textures, as well as of the solid textures, can be defined by the user.
ImageTexture
SolidTexture
ProcedureTexture
Part 4: Anisotropic Microfacet and Phong BRDF
I modified the shading frame at hit points: I compute dpdu and dpdv in the triangle intersection and then build an orthonormal basis from the surface normal and these two vectors. If there are no UVs, the shading frame equals the geometric frame.
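A sketch of this frame construction using Gram-Schmidt orthogonalization of dpdu against the normal; the helper itself is illustrative, and the fallback mirrors the geometric-frame case mentioned above.

#include <nori/frame.h>
#include <nori/vector.h>

// Build the shading frame from the normal and the tangent dpdu at the hit point.
nori::Frame makeShadingFrame(const nori::Vector3f &n, const nori::Vector3f &dpdu,
                             bool hasUV) {
    if (!hasUV)
        return nori::Frame(n);                       // fall back to the geometric frame
    // Project dpdu into the plane orthogonal to n and normalize it (Gram-Schmidt).
    nori::Vector3f t = (dpdu - n * n.dot(dpdu)).normalized();
    nori::Vector3f b = n.cross(t);                   // bitangent completes the basis
    return nori::Frame(t, b, n);
}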
Anisotropic Microfacet
related files:
src/anisotropic.cpp
src/anisotropic_phong.cpp
src/bvh.cpp
I implemented the anisotropic microfacet distribution function and the shadowing and masking function, as well as sampling the half vector proportionally to the distribution and evaluating the pdf. Here are the warp tests.
Here are the anisotropic microfacet images: the first one has alpha_x = 0.1 and alpha_y = 1, the second one has alpha_x = 1 and alpha_y = 0.1.
When I set alpha_x = alpha_y, the results turn out to be the same as those of the isotropic microfacet.
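As an illustration of an anisotropic normal distribution with separate alpha_x and alpha_y roughnesses, here is a sketch of the anisotropic Beckmann form; the report does not state which distribution family the project uses, so treat this as an example rather than the project's formula.

#include <cmath>
#include <nori/vector.h>

// Anisotropic Beckmann normal distribution D(h), with h the half vector
// in local shading coordinates (z = shading normal). Illustrative only.
float anisotropicBeckmannD(const nori::Vector3f &h, float alphaX, float alphaY) {
    float cosTheta = h.z();
    if (cosTheta <= 0.f)
        return 0.f;
    float cos2 = cosTheta * cosTheta;
    // tan^2(theta) cos^2(phi) = x^2 / z^2,  tan^2(theta) sin^2(phi) = y^2 / z^2
    float exponent = -(h.x() * h.x() / (alphaX * alphaX) +
                       h.y() * h.y() / (alphaY * alphaY)) / cos2;
    return std::exp(exponent) / (M_PI * alphaX * alphaY * cos2 * cos2);
}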
Anisotropic Phong BRDF
I also implemented the anisotropic Phong BRDF from the paper "An Anisotropic Phong BRDF Model" by Michael Ashikhmin and Peter Shirley. This one has a more metal-like appearance.
Here are the anisotropic Phong images. The change of the lighting across the surface and of the shadow is visible.
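The specular lobe of the Ashikhmin-Shirley model can be sketched as below, evaluated in local shading coordinates; the parameters nu, nv and Rs and the Schlick Fresnel term follow the paper, while the function interface itself is an assumption.

#include <algorithm>
#include <cmath>
#include <nori/vector.h>

// Sketch of the Ashikhmin-Shirley anisotropic specular lobe in local shading
// coordinates (z = normal). nu/nv are the two Phong exponents, Rs the Schlick
// reflectance at normal incidence. Illustrative, not the project's exact code.
float ashikhminShirleySpecular(const nori::Vector3f &wi, const nori::Vector3f &wo,
                               float nu, float nv, float Rs) {
    nori::Vector3f h = (wi + wo).normalized();
    float nh = h.z(), nwi = wi.z(), nwo = wo.z();
    if (nwi <= 0.f || nwo <= 0.f)
        return 0.f;
    float hk = h.dot(wo);                          // h . k (same for either direction)
    // Azimuth-dependent exponent: (nu*hx^2 + nv*hy^2) / (1 - hz^2)
    float exponent = (nu * h.x() * h.x() + nv * h.y() * h.y()) /
                     std::max(1.f - nh * nh, 1e-6f);
    float fresnel  = Rs + (1.f - Rs) * std::pow(1.f - hk, 5.f);   // Schlick approximation
    float norm     = std::sqrt((nu + 1.f) * (nv + 1.f)) / (8.f * M_PI);
    return norm * std::pow(nh, exponent) / (hk * std::max(nwi, nwo)) * fresnel;
}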
Part 5: Environment Emitter
I implemented this emitter similarly to a distant sphere light. The emitter holds a texture image; when a ray escapes the scene without hitting anything, it hits the environment emitter. The lookup simply transforms the query direction into longitude and latitude values, reads the corresponding pixel value, and treats that value as the radiance.
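A sketch of the direction-to-latitude/longitude mapping used for the lookup; the axis convention (z as the polar axis) is an assumption and may differ from the project's.

#include <algorithm>
#include <cmath>
#include <nori/vector.h>

// Map a world-space direction to lat-long UV coordinates in [0, 1]^2.
nori::Point2f dirToLatLongUV(const nori::Vector3f &d) {
    nori::Vector3f w = d.normalized();
    float phi   = std::atan2(w.y(), w.x());                          // longitude in [-pi, pi]
    float theta = std::acos(std::max(-1.f, std::min(1.f, w.z())));   // polar angle in [0, pi]
    float u = (phi + M_PI) / (2.f * M_PI);                           // wrap longitude to [0, 1]
    float v = theta / M_PI;                                          // polar angle to [0, 1]
    return nori::Point2f(u, v);
}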
Part 6: Final Image