Computer Graphics Concepts: CMYK, RGB, Buffering, and More

CMYK and RGB Color Models

CMYK is a subtractive color model: colors are created by subtracting from white; K stands for key (black)

RGB is an additive color model: 'primary' colors of light are added to black

Buffering and Image Processing

Double buffering uses two buffers in video memory (a front and a back buffer) and swaps between them, solving flickering and partially drawn frames

Sobel convolution: apply a kernel to every pixel in the image to estimate the gradient


Gradient of a point: the directional change in the intensity or color of the image
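The kernel-per-pixel idea can be sketched in pure Python; the 3x3 horizontal Sobel kernel below is the standard one, and the clamped-edge handling is one common choice (an assumption, not from the notes):

```python
# Minimal sketch of a Sobel convolution on a grayscale image,
# represented as a list of lists of intensity values.

SOBEL_X = [[-1, 0, 1],
           [-2, 0, 2],
           [-1, 0, 1]]

def convolve_pixel(img, x, y, kernel):
    """Apply a 3x3 kernel centered at (x, y); edge pixels are clamped."""
    h, w = len(img), len(img[0])
    acc = 0
    for ky in range(3):
        for kx in range(3):
            px = min(max(x + kx - 1, 0), w - 1)
            py = min(max(y + ky - 1, 0), h - 1)
            acc += img[py][px] * kernel[ky][kx]
    return acc

def sobel_x(img):
    """Horizontal gradient: the kernel is applied to every pixel."""
    return [[convolve_pixel(img, x, y, SOBEL_X)
             for x in range(len(img[0]))]
            for y in range(len(img))]

# A vertical edge (dark left half, bright right half) gives a strong
# horizontal gradient at the boundary and zero in the flat regions.
image = [[0, 0, 255, 255]] * 4
grad = sobel_x(image)
```

Flat regions yield zero; the response peaks where intensity changes, which is exactly the "directional change in intensity" the gradient measures.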

Lines and Curves

Parametric line P(t) = P0 + t(P1 - P0)

Bézier curves (linear and quadratic):
B(t) = (1-t)P0 + tP1
B(t) = (1-t)[(1-t)P0 + tP1] + t[(1-t)P1 + tP2]
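The two formulas above can be evaluated by repeated linear interpolation (de Casteljau's construction); this 2D sketch assumes points are plain tuples:

```python
# Linear and quadratic Bézier evaluation via repeated lerp.

def lerp(p, q, t):
    """Linear interpolation between points p and q at parameter t."""
    return tuple((1 - t) * a + t * b for a, b in zip(p, q))

def bezier_linear(p0, p1, t):
    # B(t) = (1-t)P0 + tP1
    return lerp(p0, p1, t)

def bezier_quadratic(p0, p1, p2, t):
    # B(t) = (1-t)[(1-t)P0 + tP1] + t[(1-t)P1 + tP2]
    return lerp(lerp(p0, p1, t), lerp(p1, p2, t), t)

# At t = 0.5 the quadratic curve sits at the midpoint of the
# two intermediate interpolations.
mid = bezier_quadratic((0, 0), (1, 2), (2, 0), 0.5)
```

Note the curve interpolates the endpoints (t = 0 gives P0, t = 1 gives the last point) but only approaches the inner control point.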

Image Analysis

An image histogram is a measure of the distribution of color (or intensity) within an image
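For an 8-bit grayscale image the histogram is just a count of pixels per intensity level, as in this minimal sketch:

```python
# Intensity histogram for an 8-bit grayscale image: one bin per level.

def histogram(pixels):
    counts = [0] * 256          # one bin for each of the 256 levels
    for v in pixels:
        counts[v] += 1
    return counts

h = histogram([0, 0, 128, 255])
```

The bins always sum to the total pixel count, and the shape of the histogram reveals whether the image is dark, bright, or low-contrast.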


Coordinate Systems and Transformations

In a local coordinate system, the coordinates of a set of points are defined relative to an origin unique to the object

In the global coordinate system, the coordinates are defined relative to the world origin

Pivot point: the origin of the object's local coordinate system, the point about which it rotates and scales

Transformations: Translation, Rotation, and Scaling
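The three basic transformations can be sketched as 3x3 homogeneous matrices for the 2D case (the function names here are illustrative):

```python
# Translation, rotation, and scaling as 3x3 homogeneous 2D matrices.
import math

def translate(tx, ty):
    return [[1, 0, tx], [0, 1, ty], [0, 0, 1]]

def scale(sx, sy):
    return [[sx, 0, 0], [0, sy, 0], [0, 0, 1]]

def rotate(theta):
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def apply(m, p):
    """Transform a 2D point by a 3x3 matrix using homogeneous coords."""
    x, y = p
    v = (x, y, 1)                       # promote to homogeneous (x, y, 1)
    return tuple(sum(m[r][i] * v[i] for i in range(3)) for r in range(2))

p = apply(translate(3, 4), (1, 1))      # -> (4, 5)
```

Using homogeneous coordinates is what lets translation (which is not linear in 2D) be expressed as a matrix and composed with rotation and scaling by plain matrix multiplication.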

Perspective and Projection

Perspective projection: nearer objects appear larger; there are vanishing points, and parallel lines meet at infinity

Orthographic projection: objects appear the same size regardless of distance; there are no vanishing points, and parallel lines never meet

Reflection and Ray Tracing

Diffuse reflection: light hits the surface and is scattered, due to surface irregularity or sub-surface scattering, emerging in many directions

Specular reflection: mirror-like reflection of light

Ray-traced shadows: add another loop that, at each intersection point, tests for occluding objects between the point and each light

foreach pixel p:
    create ray r from eye e through p
    foreach scene object o:
        calculate intersection of r with o
    foreach light l:
        p.color += light scattered at the nearest intersection
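The "calculate intersection" step can be made concrete with the standard ray-sphere test (solving the quadratic for t); the function name and scene setup here are illustrative, not from any particular library:

```python
# Ray-sphere intersection: the innermost step of the tracing loop.
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def hit_sphere(origin, direction, center, radius):
    """Return the nearest positive t where the ray hits the sphere, or None."""
    oc = sub(origin, center)
    a = dot(direction, direction)
    b = 2 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4 * a * c            # discriminant of the quadratic in t
    if disc < 0:
        return None                     # ray misses the sphere
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t > 0 else None

# A ray straight down -z hits a unit sphere centered at (0, 0, -3).
t = hit_sphere((0, 0, 0), (0, 0, -1), (0, 0, -3), 1.0)
```

The same test, fired from the intersection point toward each light, is what the extra shadow loop uses to decide whether an occluder blocks the light.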

Distributed ray tracing: add an element of randomness to the rays to trade aliasing for noise

Pixel Depth and Algorithms

8-bit pixel: 2^8 = 256 possible values

DDA uses floating-point arithmetic, whereas Bresenham's algorithm uses only integer (fixed-point) arithmetic.
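Bresenham's integer-only approach can be sketched as follows; this version covers only the first octant (0 <= slope <= 1), a simplifying assumption:

```python
# Bresenham line rasterization: no floating point anywhere,
# only an integer error term updated per step.

def bresenham(x0, y0, x1, y1):
    """Integer-only line rasterization (assumes 0 <= slope <= 1)."""
    points = []
    dx, dy = x1 - x0, y1 - y0
    err = 2 * dy - dx                  # scaled error term (stays integral)
    y = y0
    for x in range(x0, x1 + 1):
        points.append((x, y))
        if err > 0:                    # accumulated error crossed the midpoint
            y += 1
            err -= 2 * dx
        err += 2 * dy
    return points

line = bresenham(0, 0, 5, 2)
```

DDA would instead add the fractional slope dy/dx to a float accumulator each step and round; Bresenham avoids that by scaling the error term so every update is an integer add or subtract.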

Curves and Surfaces

NURBS curves offer the ability to control smoothness

Each control point's weight varies according to a polynomial basis function of degree d; each control point corresponds to one basis function, which spans a range of (degree + 1) knot spans

Depth and Rendering

Z buffer is used to determine which objects are in front of others when painting. The depth value of each pixel is stored in a separate buffer and will be used to discard pixels that are behind already painted objects as they are rasterized.
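The per-pixel depth test described above can be sketched in a few lines; the buffer sizes and function name here are illustrative:

```python
# Z-buffer depth test: a fragment is kept only if it is nearer
# than what the depth buffer already holds for that pixel.

W, H = 4, 4
FAR = float("inf")
zbuffer = [[FAR] * W for _ in range(H)]      # initialized to "infinitely far"
framebuffer = [[None] * W for _ in range(H)]

def write_fragment(x, y, depth, color):
    """Discard the fragment if something nearer was already painted."""
    if depth < zbuffer[y][x]:
        zbuffer[y][x] = depth
        framebuffer[y][x] = color

write_fragment(1, 1, 5.0, "red")    # painted
write_fragment(1, 1, 9.0, "blue")   # behind the red fragment: discarded
write_fragment(1, 1, 2.0, "green")  # nearer: overwrites
```

Because the test is per pixel, primitives can be rasterized in any order and the visible surface still wins, which is exactly why the z-buffer replaced painter's-algorithm sorting.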

Point × View matrix = the point's coordinates in camera space


The normal N = (nx, ny, nz) is obtained by the cross product of two vectors constructed from the vertices of the polygon:

N = ( (V2 – V1) x (V3 – V1) ) / | (V2 – V1) x (V3 – V1) |
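The formula above translates directly to code; this sketch assumes vertices are plain (x, y, z) tuples:

```python
# Polygon normal: cross product of two edge vectors, then normalized.
import math

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normal(v1, v2, v3):
    """N = ((V2 - V1) x (V3 - V1)) / |(V2 - V1) x (V3 - V1)|"""
    n = cross(sub(v2, v1), sub(v3, v1))
    length = math.sqrt(sum(c * c for c in n))
    return tuple(c / length for c in n)

# A triangle lying in the xy-plane has normal (0, 0, 1).
n = normal((0, 0, 0), (1, 0, 0), (0, 1, 0))
```

Note the vertex order matters: swapping V2 and V3 flips the normal, which is how winding order encodes which side of the polygon faces out.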

View matrix: the camera is defined by EYE, CENTER, and UP

Field of view (fov): the angle of aperture of the camera; it is necessary to create the frustum

It is necessary to divide X, Y, Z by W to go from homogeneous to normalized device coordinates, and thus convert them into coordinates of our framebuffer (0..Width)
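The divide-by-W and viewport mapping can be sketched as follows; the framebuffer dimensions are illustrative:

```python
# Homogeneous divide plus viewport mapping to framebuffer pixels.

WIDTH, HEIGHT = 800, 600

def to_screen(clip):
    """Clip-space (x, y, z, w) -> framebuffer pixel coordinates."""
    x, y, z, w = clip
    ndc_x, ndc_y = x / w, y / w           # perspective divide -> NDC in [-1, 1]
    sx = (ndc_x + 1) / 2 * WIDTH          # map [-1, 1] to 0..Width
    sy = (ndc_y + 1) / 2 * HEIGHT         # map [-1, 1] to 0..Height
    return sx, sy

# A point on the view axis (x = 0, y = 0) lands mid-screen.
center = to_screen((0.0, 0.0, 5.0, 10.0))
```

The divide is where perspective foreshortening actually happens: W grows with distance from the camera, so distant points shrink toward the center of the screen.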

Texturing and Mapping

Mipmaps are used to resolve aliasing and help hide visible LOD transitions

Shadow mapping is a process by which shadows are added to 3D computer graphics. Shadows are created by testing whether a pixel is visible from the light source, by comparing the pixel against a z-buffer (depth image) of the light source's view, stored in the form of a texture.

First pass: render from the light's point of view; the results of this pass don't reach the color buffer (only the z-buffer is filled). Second pass: the scene is rendered as usual from the camera's point of view.

Normal mapping is a technique used for faking the lighting of bumps and dents; normals are stored as regular RGB images where the R, G, B components correspond to the X, Y, Z coordinates, respectively, of the surface normal.

Texture coordinates (UVs) indicate, for each point of a mesh, which point of the texture corresponds to it; they are 2D values between 0 and 1

Shading and Lighting

Form factor: the physical relationship between two surfaces; the fraction of energy leaving one surface that reaches the other

Gouraud shading: calculate the color at each vertex, then interpolate the color across each triangle

Flat shading computes a lighting value for one vertex per polygon and uses the resulting color for the entire polygon

Phong shading calculates the normal per pixel, so each pixel has its own normal and Mach banding is avoided. Whereas Gouraud interpolates colors across the surface of the polygon (based on the color at each vertex), Phong shading linearly interpolates the normal vector across the surface (based on the normal at each vertex) and computes the light per fragment, not per vertex. Phong shading is therefore more computationally expensive than Gouraud shading, since the reflection model must be evaluated at each pixel instead of at each vertex.

Phong shading vectors: R is L reflected about N; V goes from the point to the camera; L goes from the point to the light. Given the coordinates of the point being shaded and of the light source, subtract them and normalize to obtain L.
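The reflected vector R follows the standard identity R = 2(N·L)N - L, sketched here under the assumption that N and L are unit vectors:

```python
# Reflection vector for the Phong model: R = 2(N.L)N - L.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def reflect(l, n):
    """Reflect light direction L about unit normal N."""
    d = dot(n, l)
    return tuple(2 * d * nc - lc for nc, lc in zip(n, l))

# Light arriving symmetrically over a flat surface (normal straight up)
# reflects to the mirrored direction on the other side.
r = reflect((1, 1, 0), (0, 1, 0))
```

The specular term then compares R against V (typically as max(R·V, 0) raised to a shininess exponent), which is why the highlight peaks where the viewer lines up with the mirror direction.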

Vector Operations

Dot product: v · w = v1w1 + v2w2 + … + vnwn; we use it to measure angles


Cross product: gives a vector perpendicular to both operands
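Both operations can be sketched together: the dot product measures the angle via cos(theta) = (v·w)/(|v||w|), and the cross product yields a vector perpendicular to its operands:

```python
# Dot product for angles, cross product for perpendicular vectors.
import math

def dot(v, w):
    return sum(a * b for a, b in zip(v, w))

def cross(v, w):
    return (v[1] * w[2] - v[2] * w[1],
            v[2] * w[0] - v[0] * w[2],
            v[0] * w[1] - v[1] * w[0])

def angle(v, w):
    """Angle between v and w via cos(theta) = (v . w) / (|v||w|)."""
    norm = math.sqrt(dot(v, v)) * math.sqrt(dot(w, w))
    return math.acos(dot(v, w) / norm)

a, b = (1, 0, 0), (0, 1, 0)
theta = angle(a, b)       # pi/2: the axes are perpendicular
p = cross(a, b)           # perpendicular to both a and b
```

A quick sanity check on the cross product result is that its dot product with either operand is zero.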