Final Exam, CSCI 360, Fall 2020


  1. The fragment shader has access to the variable gl_FrontFacing
    1. What causes this variable to have the value of true?
      1. [3 points] Answer this at a conceptual level.
        This value is true if the camera can "see" the front face of the polygon.
      2. [3 points] Answer this at a mathematical level.
        This value is true if the angle between the surface normal and the vector from the surface to the camera is less than 90°. This is accomplished by
        • Using the cross product of the edge vectors determined by the vertexes of the triangle to find the normal to the triangle.
        • Normalizing this quantity.
        • Using the dot product between this normal and the vector toward the camera to find the angle, or really the cosine of the angle. If this is positive, the surface is front facing.
        We discussed this in some detail; I would expect an explanation close to this.
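      The steps above can be sketched in plain JavaScript (no GL context needed; the function and variable names are mine, chosen for illustration):

```javascript
// Sketch of the front-facing test: normal from a cross product,
// then a dot product with the vector from the surface to the camera.
function sub(u, v) { return [u[0] - v[0], u[1] - v[1], u[2] - v[2]]; }
function cross(u, v) {
  return [u[1] * v[2] - u[2] * v[1],
          u[2] * v[0] - u[0] * v[2],
          u[0] * v[1] - u[1] * v[0]];
}
function dot(u, v) { return u[0] * v[0] + u[1] * v[1] + u[2] * v[2]; }
function normalize(u) {
  const len = Math.hypot(u[0], u[1], u[2]);
  return [u[0] / len, u[1] / len, u[2] / len];
}

// Counter-clockwise triangle in the z = -1 plane, camera at the origin.
const a = [0, 0, -1], b = [1, 0, -1], c = [0, 1, -1];
const n = normalize(cross(sub(b, a), sub(c, a))); // surface normal
const toCamera = normalize(sub([0, 0, 0], a));    // surface-to-camera vector
const frontFacing = dot(n, toCamera) > 0;         // cos(angle) > 0 => front facing
console.log(frontFacing); // true
```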
    2. [2 points] How is the value of gl_FrontFacing impacted by the order of the vertexes of a triangle?
      Since gl_FrontFacing is determined by the surface normal, and the surface normal is determined by the vectors formed by the vertexes of the triangle, the order of specification must be
      1. Known
      2. Consistent between triangles.
      In general, the order of the vertexes determines the direction of the surface normal: a × b = -(b × a).
      Ok, so perhaps this is overboard, but I really would expect the words "surface normal" in the answer.
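      A quick plain-JavaScript check of that identity (the names here are mine): swapping the order of the two edge vectors flips the cross product, and therefore the normal.

```javascript
// a x b = -(b x a): vertex order flips the surface normal.
function cross(u, v) {
  return [u[1] * v[2] - u[2] * v[1],
          u[2] * v[0] - u[0] * v[2],
          u[0] * v[1] - u[1] * v[0]];
}
const a = [1, 0, 0], b = [0, 1, 0];
console.log(cross(a, b)); // [0, 0, 1]
console.log(cross(b, a)); // [0, 0, -1] -- the opposite direction
```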
    3. [2 points] Name at least two WebGL calls that impact the value of gl_FrontFacing
      • gl.frontFace(mode)
      • gl.cullFace(mode)
      Ok, so this was not a great question. If you gave me frontFace, you got two points.
  2. Communicating with shaders
    1. [4 points] Name and describe two methods a front-end program can employ to communicate values to WebGL shaders.
      • Vertex related data is sent via a vertex buffer object. These are called attributes in the shader.
      • Object, or higher, related data is sent via a uniform variable. These are called uniforms in the shader.
    2. [2 points] Explain the differences between these methods.
      Vertex data is generally sent over once per object. If the vertex data changes this may be updated, but static data is only stored once. This data is generally "larger", i.e., many points per object, and is stored on the GPU.

      Uniform data tends to be values such as transformation matrices, object level parameters, or arguments to shaders. This data is by definition "small" and tends to be updated each time an object is drawn.

    3. [4 points] Explain when each of these methods would be used. Provide examples.
      VBO data would include the vertex positions, possibly vertex normals, and other attributes such as per-vertex colors. They are communicated via a call to gl.bufferData (after the VBO has been established). An example might be
      gl.bufferData(gl.ARRAY_BUFFER, flatten(bcs), gl.STATIC_DRAW);
      

      Uniform data includes items such as transformation matrices. They are communicated via a call to gl.uniformMatrix* or gl.uniform*. An example might be

      gl.uniformMatrix4fv(this.projLoc, false, flatten(proj));
      gl.uniform4fv(this.colorLoc, c);
      
      I expect there to be some crossover among the various parts. That is ok; just explain the topic to me.
  3. Linear Interpolation
    1. [3 points] Describe the linear interpolation that occurs in the WebGL pipeline.
      Variables that are declared varying and set in the vertex shader are interpolated across the primitive to produce per-fragment values in the fragment shader.
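      As a sketch, a WebGL 1 (GLSL ES 1.00) shader pair might declare such a variable like this (the names aPosition, aColor, and vColor are mine, chosen for illustration):

```glsl
// Vertex shader: set the varying once per vertex.
attribute vec4 aPosition;
attribute vec4 aColor;
varying vec4 vColor;
void main() {
  vColor = aColor;          // written per vertex ...
  gl_Position = aPosition;
}

// Fragment shader: receive the interpolated value per fragment.
precision mediump float;
varying vec4 vColor;        // ... interpolated across the triangle
void main() {
  gl_FragColor = vColor;
}
```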
    2. [4 points] Describe how this LERP can be used to detect edges of triangles.
      By using barycentric coordinates: each vertex is assigned a triple (a, b, c) in which exactly one value is 1 and the others are 0. When these are interpolated, any fragment with a value "close enough" to zero is considered to be on an edge of the triangle.
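      The idea can be sketched in plain JavaScript (the names and the threshold value are mine; in practice the GPU does the interpolation and the test lives in the fragment shader):

```javascript
// Barycentric edge detection: each vertex gets (1,0,0), (0,1,0), (0,0,1).
function lerpBary(b0, b1, b2, w) {
  // w = [w0, w1, w2] are the interpolation weights for one fragment.
  return [0, 1, 2].map(i => w[0] * b0[i] + w[1] * b1[i] + w[2] * b2[i]);
}
function onEdge(bary, threshold) {
  return Math.min(...bary) < threshold; // a component near zero => near an edge
}

const b0 = [1, 0, 0], b1 = [0, 1, 0], b2 = [0, 0, 1];
const center = lerpBary(b0, b1, b2, [1 / 3, 1 / 3, 1 / 3]); // interior fragment
const edgeMid = lerpBary(b0, b1, b2, [0.5, 0.5, 0]);        // midpoint of an edge
console.log(onEdge(center, 0.02));  // false
console.log(onEdge(edgeMid, 0.02)); // true
```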
    3. [3 points] Describe how this LERP can be used to make objects created out of polygons appear to be constructed out of smooth curves.
      The lighting model uses a normal at each fragment to determine the color of the fragment. As the interpolated normal changes from one vertex to the next, the lighting model makes the faceted surface appear to be a smooth curve.
  4. Triangles
    1. [2 points] Name two properties of triangles that makes them ideal for display in a graphics pipeline.
      • They are always simple; you don't need even-odd or nonzero winding fill rules.
      • The points are all co-planar.
    2. [3 points] Select one of these properties and explain why it is important to possess that property in a graphics pipeline.
      Co-planar points mean that there is a single surface normal, so a single front-facing test suffices. If the points were not coplanar, part of the surface could be visible while another part is not. Requiring coplanarity simplifies hidden-surface detection.
  5. [15 points] Rendering
    Individual answers will vary. This answer should be summative and relatively complete.
    1. Describe the process that transforms triangles specified in modeling coordinates to pixels in the frame buffer. Discuss this in detail in terms of the WebGL pipeline, however make sure your discussion includes
      • Various coordinate systems.
      • Processes such as clipping, projection, lighting, ...
      • Pipeline stages and shaders
      • Hardware such as the CPU and GPU.

Submission Instructions