
Normalizing Vectors and Light in Computer Graphics

  1. Nov 10, 2011 #1
    I am trying to understand how lighting works in computer graphics, especially in OpenGL 3+. I was given a paper that explains how to calculate lighting, and it says to "normalise the vectors before taking the dot product". How do I normalize vectors, and why? I also have to compute the normals to a vertex. What is a normal to a vertex, and how can I visualise it?
     
  2. Nov 10, 2011 #2

    DrGreg

    Science Advisor
    Gold Member

    "Normalising a vector" just means rescaling it (i.e. multiply by a scalar) so that the vector's length becomes 1.

    I don't understand what "normal to a vertex" could mean. Was that really the language that was used?
     
  3. Nov 10, 2011 #3

    chiro

    Science Advisor

    As stated, normalizing vectors means making their length 1. You just divide each component by the length of the vector, and that will give you a unit vector.
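
    For example, a minimal sketch in plain C (the Vec3 type and function name are just for illustration):

    Code (C):
    #include <math.h>

    typedef struct { float x, y, z; } Vec3;

    /* Rescale v so that its length becomes 1 (undefined for the zero vector). */
    Vec3 vec3_normalize(Vec3 v)
    {
        float len = sqrtf(v.x * v.x + v.y * v.y + v.z * v.z);
        Vec3 out = { v.x / len, v.y / len, v.z / len };
        return out;
    }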

    In rendering applications, if your texture is 32-bit RGBA, each component of the vector gets 8 bits: the three components of the vector, each in [-1, 1], are packed into the R, G and B channels (usually by remapping each component's [-1, 1] range onto the channel's [0, 255] range).
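
    A decode sketch under that assumption (the remap convention shown is the common one, but check your asset pipeline):

    Code (C):
    #include <stdint.h>

    typedef struct { float x, y, z; } Vec3;

    /* Decode one RGBA8 normal-map texel into a vector with components in
       [-1, 1], assuming the common c -> 2*(c/255) - 1 remap convention. */
    Vec3 decode_normal(uint8_t r, uint8_t g, uint8_t b)
    {
        Vec3 n = { 2.0f * (r / 255.0f) - 1.0f,
                   2.0f * (g / 255.0f) - 1.0f,
                   2.0f * (b / 255.0f) - 1.0f };
        return n;
    }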

    Typically, when you use the dot-product texture operator, you have a texture known as a normal map. Taking the dot product of the normal map with the (normalized) light direction vector gives you a light map for your object; modulating your colour texture by that light map gives a texture that shows the effect of the light on the object.

    So basically: the light direction vector goes in alongside the textures. You take the dot product between it and the normal map, which gives you a light map. You modulate the ordinary colour texture map by that light map, which gives a modulated map. You then usually modulate that with another light map, which gives the final texture for the object that you use for rendering.

    You can do more than this but this is the general idea.
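
    Per pixel, the core of it is a single clamped dot product; here is a hedged C sketch of just the math (not any particular API):

    Code (C):
    typedef struct { float x, y, z; } Vec3;

    static float vec3_dot(Vec3 a, Vec3 b)
    {
        return a.x * b.x + a.y * b.y + a.z * b.z;
    }

    /* Diffuse (Lambertian) term: the dot product of the unit surface normal
       and the unit direction towards the light, clamped to [0, 1], scales
       the texel colour.  Both vectors must be normalized for the dot product
       to be the cosine of the angle between them. */
    Vec3 shade_texel(Vec3 texel, Vec3 n, Vec3 to_light)
    {
        float d = vec3_dot(n, to_light);
        if (d < 0.0f) d = 0.0f;   /* surface faces away from the light */
        Vec3 out = { texel.x * d, texel.y * d, texel.z * d };
        return out;
    }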

    The catch is that this method assumes a directional light source: the light is something like the sun, not something like a lamp or a flashlight. You add more texture operations to get those kinds of effects.
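
    The practical difference is just where the light vector comes from; a sketch (assuming light and surface positions are expressed in the same space):

    Code (C):
    #include <math.h>

    typedef struct { float x, y, z; } Vec3;

    /* Directional light (sun-like): one fixed unit direction for the whole
       scene, so the dot product input never changes per point.
       Point light (lamp-like): the direction towards the light depends on
       the surface point, so it must be recomputed and renormalized: */
    Vec3 point_light_dir(Vec3 light_pos, Vec3 surface_pos)
    {
        Vec3 d = { light_pos.x - surface_pos.x,
                   light_pos.y - surface_pos.y,
                   light_pos.z - surface_pos.z };
        float len = sqrtf(d.x * d.x + d.y * d.y + d.z * d.z);
        Vec3 out = { d.x / len, d.y / len, d.z / len };
        return out;
    }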

    The normal to a vertex (or vertex normal) is the normal to your geometry's surface at that point.

    As an example, if you have a parametric surface (e.g. a Bezier surface), the vertex normal is just the surface normal at that point.

    Vertex normals are used for lighting as well, and under the old rendering pipeline, this is how lighting was actually done. It is done in the same sort of way that the normal map does it, but it's done in the vertex pipeline, whereas the normal map computations are done in the fragment (or texture) pipeline.

    If you want to visualize a vertex normal, just draw a short line segment from the vertex to a point a small distance away along the direction of the normal.
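
    Something like this (the scale is whatever looks good on screen):

    Code (C):
    typedef struct { float x, y, z; } Vec3;

    /* Endpoint of a short debug line showing a (unit) vertex normal:
       draw a line from `vertex` to the returned point. */
    Vec3 normal_line_end(Vec3 vertex, Vec3 n, float scale)
    {
        Vec3 end = { vertex.x + scale * n.x,
                     vertex.y + scale * n.y,
                     vertex.z + scale * n.z };
        return end;
    }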

    In terms of getting the vertex normal: for surfaces like Bezier surfaces or NURBS, you need to use calculus. Essentially you take the two tangent vectors of the surface (the partial derivatives with respect to the two parameters, often called the tangent and the binormal/bitangent), take their cross product, and normalize the result.
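
    For instance, given the two partial-derivative tangents already evaluated at a parameter point (a sketch; du and dv are assumed non-parallel):

    Code (C):
    #include <math.h>

    typedef struct { float x, y, z; } Vec3;

    /* Surface normal from the two tangents dP/du and dP/dv at a parameter
       point: cross product, then normalize. */
    Vec3 surface_normal(Vec3 du, Vec3 dv)
    {
        Vec3 n = { du.y * dv.z - du.z * dv.y,
                   du.z * dv.x - du.x * dv.z,
                   du.x * dv.y - du.y * dv.x };
        float len = sqrtf(n.x * n.x + n.y * n.y + n.z * n.z);
        Vec3 out = { n.x / len, n.y / len, n.z / len };
        return out;
    }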

    For flat, polygonal surfaces it is a little trickier. You have to take the adjoining faces into account, since these objects are not smooth in the way that parametric surfaces are. You usually average the normal vectors of the faces that share the vertex, as in the sketch below.
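
    A sketch of that averaging (assuming the per-face unit normals are already computed; the plain sum points the same way as the average, so summing and then renormalizing is enough):

    Code (C):
    #include <math.h>

    typedef struct { float x, y, z; } Vec3;

    /* Vertex normal for a polygonal mesh: sum the unit normals of the
       faces sharing the vertex, then renormalize the sum. */
    Vec3 vertex_normal(const Vec3 *face_normals, int n_faces)
    {
        Vec3 sum = { 0.0f, 0.0f, 0.0f };
        for (int i = 0; i < n_faces; i++) {
            sum.x += face_normals[i].x;
            sum.y += face_normals[i].y;
            sum.z += face_normals[i].z;
        }
        float len = sqrtf(sum.x * sum.x + sum.y * sum.y + sum.z * sum.z);
        Vec3 out = { sum.x / len, sum.y / len, sum.z / len };
        return out;
    }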
     