Computer Graphics

Texture & Shadows

Prof. Dr. David Bommes
Computer Graphics Group

Last Time: Rasterization Details

Transformations & Projections

images/rendering-pipeline-1.svg

Transformations of Triangles

  • A triangle is an affine combination of three points \[ \vec{X}\of{\alpha, \beta, \gamma} \;=\; \alpha\vec{A} + \beta\vec{B} + \gamma\vec{C}\]

    with \(\alpha+\beta+\gamma=1\).

  • Affine transformations preserve affine combinations

    • Triangles are transformed to triangles
  • Planar projections map triangles to triangles

    • Similar derivation as for lines
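This invariance is easy to check numerically: transforming the triangle's vertices and then recombining gives the same point as transforming the combined point directly (because the weights sum to 1). A minimal sketch, assuming NumPy; the function names are illustrative, not from the lecture code:

```python
import numpy as np

def affine_combination(A, B, C, alpha, beta, gamma):
    """Point X = alpha*A + beta*B + gamma*C with alpha + beta + gamma = 1."""
    return alpha * A + beta * B + gamma * C

def affine_transform(M, t, p):
    """Apply x' = M x + t."""
    return M @ p + t

A, B, C = np.array([0., 0.]), np.array([1., 0.]), np.array([0., 1.])
M = np.array([[2., 1.], [0., 3.]])   # arbitrary linear part
t = np.array([5., -2.])              # translation
a, b, g = 0.2, 0.3, 0.5              # barycentric weights, sum to 1

X  = affine_combination(A, B, C, a, b, g)
X1 = affine_transform(M, t, X)                        # transform the point
X2 = affine_combination(affine_transform(M, t, A),    # transform vertices,
                        affine_transform(M, t, B),    # then recombine
                        affine_transform(M, t, C), a, b, g)
```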

Lighting

images/rendering-pipeline-2.svg

How to transform normal vectors?

  • A point \(\vec{x} = \transpose{(x,y,z,1)}\) lies on its tangent plane, specified by \(\vec{n} = \transpose{(n_x, n_y, n_z, d)}\): \[ n_x x + n_y y + n_z z + d = 0 \quad\Leftrightarrow\quad \transpose{\vec{n}} \vec{x} = 0 \]

  • The same equation should be satisfied after an affine transformation \(\mat{M}\) maps \(\vec{x}\) to \(\vec{x}'\) and \(\vec{n}\) to \(\vec{n}'\) \[ 0 \;=\; \transpose{\vec{n}'} \vec{x}' \;=\; \transpose{\vec{n}'} \mat{M} \vec{x} \;=\; \transpose{\left(\transpose{\mat{M}} \vec{n}' \right)} \vec{x} \]

  • Comparing the two equations yields \(\vec{n}' = \mat{M}^{\mathsf{-T}} \vec{n}\)

    • In practice use upper-left \(3 \times 3\) block of \(\mat{M}\)
    • Don’t forget to re-normalize \(\vec{n}'\)
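A small numerical check of the inverse-transpose rule, using the upper-left \(3 \times 3\) block as recommended above. A NumPy sketch with illustrative names; non-uniform scaling shows why transforming the normal with \(\mat{M}\) itself would be wrong:

```python
import numpy as np

def transform_normal(M, n):
    """Map a normal with the inverse transpose of the upper-left 3x3
    block of the 4x4 matrix M, then re-normalize."""
    A = M[:3, :3]
    n_prime = np.linalg.inv(A).T @ n
    return n_prime / np.linalg.norm(n_prime)

M = np.diag([2.0, 1.0, 1.0, 1.0])                 # scale x by 2
n = np.array([1.0, 1.0, 0.0]) / np.sqrt(2.0)      # normal of plane x + y = 0
t = np.array([1.0, -1.0, 0.0])                    # a tangent of that plane

n_correct = transform_normal(M, n)
t_transformed = M[:3, :3] @ t
# n_correct stays orthogonal to the transformed tangent;
# the naive (M @ n) would not.
```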

Rasterization

images/rendering-pipeline-4.svg

Bresenham Algorithm

Δx  = x1-x0;
Δy  = y1-y0;
d   = 2*Δy - Δx;
ΔE  = 2*Δy;
ΔNE = 2*(Δy - Δx);

set_pixel(x0, y0);

for (x = x0, y = y0; x < x1;)
{
	if (d <= 0)    { d += ΔE;  ++x;     }
	else           { d += ΔNE; ++x; ++y; }
	set_pixel(x, y);
}

Good: Only integer arithmetic!
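The pseudocode above can be transcribed into a runnable sketch (restricted, like the slide, to the first octant: 0 ≤ slope ≤ 1, x0 < x1; the function name is illustrative):

```python
def bresenham(x0, y0, x1, y1):
    """Bresenham/midpoint line rasterization, integer arithmetic only."""
    dx, dy = x1 - x0, y1 - y0
    d = 2 * dy - dx              # decision variable
    dE, dNE = 2 * dy, 2 * (dy - dx)
    pixels = [(x0, y0)]
    x, y = x0, y0
    while x < x1:
        if d <= 0:
            d += dE;  x += 1          # step East
        else:
            d += dNE; x += 1; y += 1  # step North-East
        pixels.append((x, y))
    return pixels
```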

Visibility

images/rendering-pipeline-6.svg

Z-Buffer

  • Store current min. z-value for each pixel
    • After model transformation, view transformation, projection transformation, and viewport transformation
  • Additional buffer for depth values
    • Framebuffer stores RGB color values
    • Depth buffer (z-buffer) stores depth values
    • Storage: additional 16 to 32 bits per pixel
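The per-pixel depth test can be sketched as follows (illustrative Python, not actual GPU code; buffers are plain nested lists here):

```python
def zbuffer_write(framebuffer, depthbuffer, x, y, z, color):
    """Keep a fragment only if it is closer than the stored depth."""
    if z < depthbuffer[y][x]:          # depth test
        depthbuffer[y][x] = z          # update z-buffer
        framebuffer[y][x] = color      # update color

fb = [[None]]        # 1x1 "screen"
db = [[1.0]]         # depth initialized to far plane
zbuffer_write(fb, db, 0, 0, 0.5, "red")    # passes: closer than 1.0
zbuffer_write(fb, db, 0, 0, 0.8, "blue")   # fails: behind stored 0.5
```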
images/z-buffer.jpg
Wikipedia

Z-Buffer

images/z-buffer.svg
Image from Wikipedia

Today: Texture & Shadows

Materials & Texture

  • So far: color/material varies per model or per vertex

  • Textures add visual detail without raising geometric complexity: Paste 2D images onto 3D geometry

images/matterhorn-1.jpg Geometry images/matterhorn-2.jpg Texture images/matterhorn-3.jpg Textured Mesh

Images from http://www.endoxon.ch

Materials & Texture

  • So far: color/material varies per model or per vertex

  • Textures add visual detail without raising geometric complexity: Paste 2D images onto 3D geometry

images/hippo-1.png Geometry images/hippo-2.png +Lighting images/hippo-3.png +Texture

Images from http://www.3drender.com/jbirn/productions.html

Materials & Texture

  • Textures allow us to change many surface properties:
    • reflectance (diffuse + specular colors/coefficients)
    • normal vector (normal mapping, bump mapping)
    • geometry (displacement mapping)
    • opacity (alpha mapping)
    • reflection/illumination (environment mapping)

Texturing One Triangle

Texturing One Triangle

  • Assign 2D texture coordinate \(\vec{u}=(u,v)\) to each vertex
  • Interpolate texture coordinate by barycentric coordinates \[ \vec{u}\of{\vec{X}} = \alpha \vec{u}\of{\vec{A}} + \beta \vec{u}\of{\vec{B}} + \gamma \vec{u}\of{\vec{C}} \]
  • Fetch color value from texture: \(\vec{c}\of{\vec{X}} = \mathrm{texture}\of{\vec{u}\of{\vec{X}}}\)
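The interpolation step can be sketched as follows (illustrative names; each `uv` is a per-vertex texture-coordinate pair):

```python
def interpolate_uv(uvA, uvB, uvC, alpha, beta, gamma):
    """Barycentric interpolation of per-vertex texture coordinates."""
    u = alpha * uvA[0] + beta * uvB[0] + gamma * uvC[0]
    v = alpha * uvA[1] + beta * uvB[1] + gamma * uvC[1]
    return (u, v)
```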

images/texturing-1.png

Texturing One Triangle

images/texture-interpolation-1.png
Orthogonal view
images/texture-interpolation-2.png
Screen-space interpolation
perspectively incorrect

 

Images from Akenine-Möller, “Real-Time Rendering”

Perspective Interpolation

  • Linear interpolation in world coordinates yields nonlinear interpolation in screen coordinates!
  • Choose screen-space (noperspective) or perspective (smooth) interpolation for vertex shader outputs (smooth is default)
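Perspective-correct (`smooth`) interpolation can be sketched along a single screen-space segment: interpolate \(u/w\) and \(1/w\) linearly, then divide. This is the standard derivation; the code below is an illustration, not the GPU's actual implementation:

```python
def perspective_correct(u0, w0, u1, w1, t):
    """Attribute u at screen-space parameter t between two vertices with
    clip-space w-components w0, w1: interpolate u/w and 1/w linearly,
    then divide -- this undoes the nonlinearity of the projection."""
    u_over_w = (1 - t) * (u0 / w0) + t * (u1 / w1)
    one_over_w = (1 - t) * (1 / w0) + t * (1 / w1)
    return u_over_w / one_over_w

def screen_space(u0, u1, t):
    """Naive ('noperspective') interpolation, for comparison."""
    return (1 - t) * u0 + t * u1
```

With equal w the two agree; with unequal w the perspective-correct value at the screen midpoint is biased toward the nearer vertex, as expected.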
images/texture-interpolation-4.png

Texturing One Triangle

  • Assign 2D texture coordinate \(\vec{u}=(u,v)\) to each vertex
  • Interpolate texture coordinate by barycentric coordinates in 3D object space
  • Fetch color value from texture

images/texturing-2.png

Texturing One Triangle

images/texture-interpolation-1.png
Orthogonal view
images/texture-interpolation-2.png
Screen-space interpolation
perspectively incorrect
images/texture-interpolation-3.png
Object-space interpolation
perspectively correct

Images from Akenine-Möller, “Real-Time Rendering”

Texturing a Triangle Mesh

Texturing a Triangle Mesh

  • How to find texture coordinates for each vertex?
  • Find parameterization: Mapping between 2D texture space and 3D object space
  • See lecture 3D Geometry Processing (Spring)
images/parameterization-1.png

Simple Parameterizations

images/parameterization-2.png

Images from Akenine-Möller, “Real-Time Rendering”

Sphere Parameterization

\[ \vector{ \phi \\ \theta } \mapsto \vector{ \cos\phi \, \cos\theta \\ \sin\phi \, \cos\theta \\ \sin\theta } \]

images/sphere-param.png
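The parameterization formula above, as code (a direct transcription; the function name is illustrative):

```python
import math

def sphere_param(phi, theta):
    """Map (phi, theta) to a point on the unit sphere."""
    return (math.cos(phi) * math.cos(theta),
            math.sin(phi) * math.cos(theta),
            math.sin(theta))
```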

Cartography

  • Many ways to parameterize a sphere!
  • Some parameterizations have special properties:
    • preserve angles (conformal, 2nd image)
    • preserve areas (equi-areal, 4th image)
images/cartography.png
Floater, Hormann: Surface Parameterization: A Tutorial and Survey, 2005

Low-Distortion Parameterization

images/markus-tex-1.png images/markus-tex-2.png
Orthogonal projection
images/mario-tex-1.png images/mario-tex-2.png
Low-distortion parameterization

Low-Distortion Parameterization

images/teresa-tex-1.png images/teresa-tex-2.png

Low-Distortion Parameterization

Texture Filtering

Texture Interpolation

  • How to get color value from real-valued texture coordinates \((u,v)\), such as \((u,v) = (6.4, 3.7)\)?
images/texture-access.png
  • Round to nearest integer coordinate

    color = tex[6,4];
  • Bilinear interpolation of neighboring texture pixels

    color = (1-s)*(1-t)*tex[6,3] + (1-s)*t*tex[6,4]
          +     s*(1-t)*tex[7,3] +     s*t*tex[7,4];
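A runnable version of the bilinear lookup, with the weights \(s = u - \lfloor u \rfloor\), \(t = v - \lfloor v \rfloor\) (so \(u = 6.4\), \(v = 3.7\) gives \(i = 6\), \(j = 3\), \(s = 0.4\), \(t = 0.7\)); illustrative code assuming the texture is a 2D list:

```python
def bilinear(tex, u, v):
    """Bilinearly interpolate tex (indexed tex[i][j]) at real-valued
    coordinates (u, v); assumes 0 <= u < width-1 and 0 <= v < height-1."""
    i, j = int(u), int(v)            # floor for non-negative coordinates
    s, t = u - i, v - j              # fractional parts = blend weights
    return ((1 - s) * (1 - t) * tex[i][j]     + (1 - s) * t * tex[i][j + 1]
            +     s * (1 - t) * tex[i + 1][j] +       s * t * tex[i + 1][j + 1])
```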

Magnification Filter

images/texture-magnification-1.png

images/texture-magnification-2.pngnearest images/texture-magnification-3.pnglinear

Magnification Filter

images/texture-magnification-4.pngnearest images/texture-magnification-5.pnglinear

Images from Akenine-Möller, “Real-Time Rendering”

Minification Filter

images/texture-minification-linear.png
Bilinear filtering: Aliasing for minification

Minification Filter

Minification Filter

  • Point sampling is the wrong model
    • Texture minification leads to aliasing
  • Integrate over image pixel’s area in texture space
    • Approximated by an ellipse
    • Elliptically weighted averaging (EWA filtering)

images/texture-filtering-1.png images/texture-filtering-2.png images/texture-filtering-3.png

Minification Filter

images/texture-minification-linear.png
Bilinear filtering: Aliasing for minification

images/texture-minification-ewa.png
EWA filtering

Minification Filter

EWA filtering

Bilinear filtering

Minification Filter

  • Point sampling is the wrong model
    • Texture minification leads to aliasing
  • Integrate over image pixel’s area in texture space
    • Approximated by an ellipse
    • Elliptically weighted averaging (EWA filtering)
    • Computationally too expensive
  • Approximate EWA filtering by
    • mip-mapping
    • anisotropic texture filtering (not discussed)

Mip-Mapping

  • Store texture at multiple levels-of-detail
    • MIP comes from the Latin “multum in parvo”: a multitude in a small space.
    • Precompute down-scaled versions of texture image
  • Use lower-resolution versions when far from camera
    • OpenGL picks the most suitable mip-map level for each per-pixel texture lookup, based on the pixel’s footprint in texture space (screen-space derivatives of the texture coordinates)
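Level selection can be sketched from the pixel's footprint in texture space: level 0 covers roughly one texel per pixel, and each coarser level doubles the footprint, so the level is roughly \(\log_2\) of the footprint. A simplified illustration (real hardware also blends between adjacent levels):

```python
import math

def mip_level(footprint_texels, num_levels):
    """Choose a mip level from the pixel footprint (in texels):
    level ~= log2(footprint), clamped to the available levels."""
    level = max(0.0, math.log2(max(footprint_texels, 1.0)))
    return min(int(round(level)), num_levels - 1)
```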
images/mip-mapping-1.png
MIP map
images/mip-mapping-2.png
Image from Akenine-Möller, “Real-Time Rendering”

Try it yourself

Try it yourself

Special Texture Maps

Light Maps

images/light_maps.png

Alpha Maps

  • Discard transparent texture pixels (alpha=0) in fragment shader. Frequently used for real-time rendering of plants.

images/alpha-map-1.png images/alpha-map-2.png Images courtesy of Oliver Deussen

Alpha Maps

  • Discard transparent texture pixels (alpha=0) in fragment shader. Frequently used for real-time rendering of plants.

images/alpha-map-3.png images/alpha-map-4.png

Image from Akenine-Möller, “Real-Time Rendering”

Alpha Maps

Environment Maps

  • Approximate reflections of environment at surface.
  • Environment is assumed to be far away from object.

images/envmap-1.png images/envmap-2.png images/envmap-3.png

Spherical Environment Maps

images/envmap-0.jpg images/envmap-1.jpg images/envmap-2.jpg images/envmap-3.jpg images/envmap-4.jpg

Images from Gene Miller

Cube Environment Maps

  • Cube maps are the preferred representation.

images/cube-map-1.png
images/cube-map-2.png images/cube-map-3.png

Bump Maps

  • Add surface detail without increasing geometric complexity
    • Perturb surface normal before lighting
  • Assume bumps are small compared to geometry
    • Bump pattern is taken from a texture
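Under the small-bump assumption, the perturbed normal can be computed from the height-map gradient in the local tangent frame. A NumPy sketch; the sign convention and function names are illustrative assumptions, not the lecture's exact formulation:

```python
import numpy as np

def bump_normal(n, t, b, dh_du, dh_dv):
    """Perturb normal n using tangent t, bitangent b and the height-map
    gradient (dh_du, dh_dv); valid for small bumps."""
    n_prime = n - dh_du * t - dh_dv * b
    return n_prime / np.linalg.norm(n_prime)

n = np.array([0.0, 0.0, 1.0])   # flat surface normal
t = np.array([1.0, 0.0, 0.0])   # tangent
b = np.array([0.0, 1.0, 0.0])   # bitangent
```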

images/bump-map-1.png normal rendering images/bump-map-2.png bump map images/bump-map-3.png bump-mapped result

Bump Maps

images/triplegangers-1.png Coarse mesh images/triplegangers-2.png Diffuse map + specular map + bump map + …map

Image from TripleGangers

Nice Earth Textures

images/earth-day.jpg

2D Texture

images/earth-day.jpg
images/earth001.png

+ Diffuse and Specular Lighting

images/earth-day.jpg
images/earth002.png

+ Night Texture

images/earth-day.jpg images/earth-night.jpg

images/earth003.png

+ Specularity Map

images/earth-day.jpg images/earth-night.jpg images/earth-gloss.jpg

images/earth004.png

+ Normal Map

images/earth-day.jpg images/earth-night.jpg images/earth-gloss.jpg images/earth-normal.png

images/earth005.png

+ Clouds

images/earth-day.jpg images/earth-night.jpg images/earth-gloss.jpg images/earth-normal.png images/earth-clouds.jpg

images/earth006.png

Multi-Texturing Earth Viewer

Acknowledgments

Many thanks to Hartmut Schirmacher for providing aligned textures and initial WebGL code!

images/Hartmut.png
Prof. Dr.-Ing. Hartmut Schirmacher,
Beuth Hochschule für Technik Berlin

Literature

  • Akenine-Möller, Haines, Hoffman: Real-Time Rendering, Taylor & Francis, 2008.
    • Chapter 6
images/akenine.png
  • Shreiner, Sellers, Kessenich, Licea-Kane: OpenGL Programming Guide, 8th edition, 2013.
    • Chapter 6
images/shreiner.png

Shadow Maps

Shadows

  • Shadows are important for 3D depth perception
images/shadow-perception.png

Shadows

  • Shadows are important for 3D depth perception

images/spheres-2.png images/spheres-3.png

Shadows

  • Shadows are important for 3D depth perception
images/molecule-shadow.png

Shadows

  • Shadows are important for 3D depth perception
images/protein.png
images/protein-shadow.png

Shadow Art

images/shadow-art1.jpg images/shadow-art2.jpg

images/shadow-art3.png

Shadow Art Research

Shadow Computation

Shadow computation is very similar to visibility determination

  • Visibility
    • Which objects can be seen from the viewpoint?
  • Shadows
    • Which objects can be seen from the light source?

Lighting & Shadows

  • If a light source is occluded, simply skip its diffuse and specular contribution in the Phong lighting computation

\[ \begin{eqnarray*} I(\vec{p},\vec{n},\vec{v}) &=& I_{\text{ambient}} \\[2mm] &+& \sum_{i} \text{shadow}(\vec{p},\vec{l}_i) \cdot \left( I_{\text{diffuse}}(\vec{p},\vec{n},\vec{l}_i) + I_{\text{specular}}(\vec{p},\vec{n},\vec{v},\vec{l}_i) \right) \\ \\ \text{with} && \text{shadow}(\vec{p},\vec{l}_i) \;=\; \left\{\begin{array}{ll} 1 & \vec{l}_i\text{ is visible from } \vec{p},\\ 0 & \vec{l}_i\text{ is blocked from } \vec{p} \end{array} \right. \end{eqnarray*} \]
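The formula reduces to a simple accumulation loop. An illustrative sketch with scalar intensities (one diffuse, specular, and shadow value per light):

```python
def shade(ambient, diffuse, specular, shadow):
    """Ambient term plus, for every light i, the diffuse and specular
    contributions gated by the binary shadow term (1 = visible, 0 = blocked)."""
    I = ambient
    for d, s, sh in zip(diffuse, specular, shadow):
        I += sh * (d + s)
    return I
```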

Light Maps & Shadows

  • Store precomputed lighting in textures
    • Precomputation can take shadows into account
    • Works for static scenes only
      images/light_maps.png

Shadows in OpenGL

  • Shadows are global effect
    • Occluders block light from receiver
    • Occluders can move/deform dynamically
  • How to achieve global effects with local rendering pipeline?
    • We have to use multiple render passes!
  • Two main approaches
    • Shadow volumes (object space)
    • Shadow maps (image space)

Shadow Computation

  • Visibility
    • Which objects can be seen from the viewpoint?
  • Shadows
    • Which objects can be seen from the light source?

\(\Rightarrow\) Apply standard visibility techniques for shadows:
z-buffer \(\rightarrow\) shadow map

Shadow Maps

  1. Render scene as seen from light source
    • Store z-buffer (holds distance to light)
    • Light’s z-buffer is called shadow map

images/shadow-map-1.png
Scene rendered from eye point
images/shadow-map-2.png
z-buffer from light source

Shadow Maps

  1. Render scene as seen from light source
    • Store z-buffer (holds distance to light)
    • Light’s z-buffer is called shadow map
  2. Render scene from eye point
    • Should a given image-plane pixel \((x,y)\) be lit?
    • Map it back into world coordinates: \((x', y', z')\)
    • Project it into shadow map: \((x'', y'')\)
    • If distance point-to-light > depth stored in map \(\Rightarrow\) Point is in shadow!
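Step 2 can be sketched as follows. `project_to_light` stands in for the light's view, projection, and viewport transforms and is a hypothetical helper; the small bias is a common addition to avoid self-shadowing ("shadow acne"):

```python
def in_shadow(shadow_map, project_to_light, p, bias=1e-3):
    """True if world-space point p is farther from the light than the
    occluder depth stored in the shadow map."""
    x, y, depth = project_to_light(p)       # shadow-map texel + light depth
    return depth > shadow_map[y][x] + bias

shadow_map = [[0.5]]                  # 1x1 map: occluder at depth 0.5
project = lambda p: (0, 0, p[2])      # toy projection: light depth = z
```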

Shadow Maps

images/shadow-map-3.png

Image from Akenine-Möller, “Real-Time Rendering”

Shadow Maps

images/temple-noshadow.png Scene without shadows images/temple-z-buffer.png z-buffer of light’s view images/temple-with-shadows.png Scene with shadows

images/temple-from-light.png Seen from light’s view images/temple-pixel-in-shadow.png Pixels in shadow

Images from Wikipedia

OpenGL Implementation

  1. Render scene without light contribution
  2. For each light source
    1. Render scene from light source
      • Store z-buffer in shadow map
    2. Render scene with light contribution (accumulate)
      • Shadow map look-up for each pixel
      • Pixels in shadow are discarded
      • Other pixels are lit and rendered

Let’s Try!

Shadow Map Aliasing

images/chess-alias1.png images/chess-alias2.png images/chess-alias3.png

Perspective Shadow Maps

images/perspective-shadowmaps.png

Projection Aliasing

images/perspective-shadowmaps-2.png

Perspective Aliasing

images/perspective-shadowmaps-3.png

Perspective Shadow Maps

  • Avoid perspective aliasing!
  • Acquire shadow map in post-perspective space
    • Normalized Device Coordinates
    • Parallel viewing rays
  • Optimal configurations
    • Directional light parallel to image plane
    • Point light in the camera plane

Perspective Shadow Maps

images/perspective-shadowmaps-4.png images/perspective-shadowmaps-5.png

Perspective Shadow Maps

images/perspective-shadowmaps-6.png images/perspective-shadowmaps-7.png

Perspective Shadow Maps

images/chess-alias1.png images/chess-alias2.png images/chess-alias3.png

images/chess-persp1.png images/chess-persp2.png images/chess-persp3.png

Soft Shadows?

images/buddha.jpg
Gael Guennebaud

Soft Shadows

  • Don’t filter the shadow map’s depth values!
    • Averaging depths corresponds to a shadow-map test against a filtered version of the geometry
  • Filter boolean shadow map results
    • Sample in the current pixel’s vicinity
    • How many shadow map tests yield light/shadow?
    • Percentage closer filtering
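Percentage-closer filtering in sketch form: average the boolean test results over a neighborhood, never the stored depths themselves. Illustrative code (real implementations sample in light space with hardware support):

```python
def pcf(shadow_map, x, y, depth, radius=1, bias=1e-3):
    """Fraction of shadow-map tests in a (2r+1)^2 neighborhood that report
    'lit'; 1.0 = fully lit, 0.0 = fully shadowed -- gives soft edges."""
    h, w = len(shadow_map), len(shadow_map[0])
    lit = total = 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            i = min(max(y + dy, 0), h - 1)   # clamp to map borders
            j = min(max(x + dx, 0), w - 1)
            lit += depth <= shadow_map[i][j] + bias   # boolean test result
            total += 1
    return lit / total
```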

Shadow Maps Summary

  • Works for everything that can be rasterized
    • Self-shadowing, alpha-textured objects
  • Soft shadows by percentage closer filtering
  • Fixed resolution shadow map
    • Aliasing issues
    • Mitigated by perspective shadow maps
  • Omni-directional lights?
    • Need “shadow cube maps”

Literature

  • Hughes et al.: Computer Graphics: Principles and Practice, 3rd Edition, Addison-Wesley, 2014.
    • Chapters 5.6, 15
images/cgpp.png
  • Shreiner et al: OpenGL Programming Guide, 8th edition, Addison-Wesley, 2013.
    • Chapter 7
images/shreiner.png
  • Nvidia GPU Gems
    • Chapter Perspective Shadow Maps: Care and Feeding
images/kozlov.png

Literature

  • McGuire et al.: Fast, Practical, and Robust Shadow Volumes, Nvidia Tech Report, 2002.
  • Everitt et al.: Hardware Shadow Mapping, Nvidia Tech Report, 2005.
  • Stamminger & Drettakis: Perspective Shadow Maps, Siggraph 2002.