
Student Projects: Advanced Computer Graphics

Project: Depth of Field Effect


In optics, particularly film and photography, the depth of field (DOF) is the distance in front of and beyond the subject that appears to be in focus.

Scenery outside the DOF projects blurred spots onto the image plane, known as circles of confusion. Decreasing the aperture size reduces the size of these blur circles for points outside the focused plane, so that the blurring becomes imperceptible and all points appear to lie within the DOF. The disadvantage of a small aperture is that the exposure has to be lengthened in order for enough light to fall on the film. A camera in a 3D graphics library such as OpenGL operates with an infinitesimal aperture, since the library does not have to care about exposure times. To achieve a DOF effect, a lens system with a focal plane therefore needs to be mimicked.

As part of this project, two lens implementations were made: one in software and one in hardware. The software implementation was written to gain a better understanding of how to mimic a lens system; it clearly shows that a real-time effect is not feasible in software. To gain better performance, the GPU has to be used in the form of hardware shaders. The first pass produces a texture containing a sharp (focused) version of the scene, with the depth value of each pixel stored in the alpha channel. The second pass applies a Gaussian filter to this original texture, obtaining a blurred (out-of-focus) version. The final pass mixes the original and blurred textures using a sigmoidal function of the distance to the focal plane.
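The final mixing pass described above can be sketched as follows. This is an illustrative Python sketch, not the project's GLSL code; the exact sigmoid shape and the `sharpness` parameter are assumptions.

```python
import math

def blur_weight(depth, focal_depth, sharpness=4.0):
    """Sigmoidal blend factor: 0 at the focal plane (fully sharp),
    approaching 1 far from it (fully blurred).
    `sharpness` (hypothetical parameter) controls the transition width."""
    d = abs(depth - focal_depth)
    # Logistic curve shifted so that the weight is exactly 0 at d = 0
    return 2.0 / (1.0 + math.exp(-sharpness * d)) - 1.0

def mix_pixel(sharp_rgb, blurred_rgb, depth, focal_depth):
    """Final-pass mix of the sharp and Gaussian-blurred textures."""
    w = blur_weight(depth, focal_depth)
    return tuple((1.0 - w) * s + w * b for s, b in zip(sharp_rgb, blurred_rgb))
```

Pixels at the focal depth take the sharp texture unchanged; the blend ramps smoothly toward the blurred texture with distance from the focal plane.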

Project member(s): Johannes Rajala & Willem Frishert

 

Project: Skeletal animation


As part of a project for the Introduction to Computer Graphics course, an application was developed that demonstrates various OpenGL and shader effects as well as skeletal animation.

The following features were implemented for the project:

  • A 3D castle was modeled in 3D Studio Max, saved as a 3DS file, and loaded by the application using a 3DS loader implemented in C.
  • A lens flare effect appears when looking at the sun.
  • The environment, consisting of ground and sky, was implemented as a box wrapped with cloud and ground textures. The textures were generated using Terragen.
  • Fire was implemented using a particle system. Each particle has its own velocity, size, lifetime, color, etc., which vary over time. In addition, particle memory is pre-allocated and reused to achieve faster frame rates, since allocating and de-allocating the memory involved in creating and destroying hundreds of particles every frame is time-consuming.
  • The following GLSL shaders were implemented: toon shading; Phong shading; a day-night effect for planet Earth; and refraction to simulate a 'look-through-glass' effect.
  • Skeletal animation was implemented using Cal3D, a skeleton-based 3D character animation library written in C++ in a platform- and graphics-API-independent way.
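The particle memory management mentioned above can be illustrated with a simple pre-allocated free-list pool. This is a hedged Python sketch of the idea, not the project's actual C++ code; the fields and names are assumptions.

```python
class ParticlePool:
    """Pre-allocated pool: dead particles are recycled instead of
    allocating and freeing hundreds of objects every frame."""

    def __init__(self, capacity):
        # All particle storage is allocated once, up front
        self.particles = [{"alive": False, "life": 0.0} for _ in range(capacity)]
        self.free = list(range(capacity))  # indices of currently dead particles

    def spawn(self, life):
        """Reuse a dead slot; returns its index, or None if the pool is full."""
        if not self.free:
            return None
        i = self.free.pop()
        self.particles[i].update(alive=True, life=life)
        return i

    def update(self, dt):
        """Age live particles; expired ones go back on the free list."""
        for i, p in enumerate(self.particles):
            if p["alive"]:
                p["life"] -= dt
                if p["life"] <= 0.0:
                    p["alive"] = False
                    self.free.append(i)  # recycled, no deallocation
```

Because no memory is allocated or freed after construction, the per-frame cost is just the update loop, which is the point of the optimization described above.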

Project member(s): Ruman Zakaria Salam

Project: Real-Time High Dynamic Range Image-Based Rendering

High Dynamic Range Imaging (HDRI) techniques, unlike normal digital imaging techniques, are able to capture a far greater dynamic range of exposures, i.e. a large range of values between light areas (intense sunlight) and dark areas (deep shadows). Using these techniques, the wide range of intensity levels found in real scenes can be accurately represented. One such technique is HDR Image-Based Lighting (IBL), which employs an HDR image as the light source of a typical computer graphics scene. Originally, Image-Based Lighting is a global illumination method, but it has also started to show up in game technology, since most game consoles and computers have vastly improved in terms of available rendering resources.

The project for this course consists of four steps: HDR image acquisition, image transformation, Image-Based Lighting, and tone mapping. The main focus lies on achieving a real-time HDR IBL application. The image is stored in Greg Ward's HDR format (also known as RGBE). This format was chosen because the graphics hardware available for this project is limited in applying bilinear filtering. Using RGBE, the hardware shaders have to decode and encode the textures at every step of the pipeline. The pipeline consists of HDR image analysis, applying effects (blooming and vignetting), modeling materials (perfect reflection, perfect refraction, diffuse, and porcelain) using the Fresnel effect, and applying Reinhard's photographic local tone-mapping operator.
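The RGBE encoding mentioned above stores three 8-bit mantissas that share one 8-bit exponent. A simplified sketch of the decode/encode step that the shaders have to perform (Python for illustration; function names are ours, and edge cases of Ward's full format are omitted):

```python
import math

def rgbe_encode(r, g, b):
    """Pack linear RGB into (R, G, B, E) with a shared exponent."""
    m = max(r, g, b)
    if m < 1e-32:
        return (0, 0, 0, 0)
    frac, exp = math.frexp(m)          # m = frac * 2**exp, frac in [0.5, 1)
    scale = frac * 256.0 / m           # map the largest channel near 255
    return (int(r * scale), int(g * scale), int(b * scale), exp + 128)

def rgbe_decode(rm, gm, bm, e):
    """Unpack (R, G, B, E) back to linear RGB."""
    if e == 0:
        return (0.0, 0.0, 0.0)
    f = math.ldexp(1.0, e - 128 - 8)   # 2**(e - 128) / 256
    return (rm * f, gm * f, bm * f)
```

The shared exponent is what gives RGBE its dynamic range in 32 bits per pixel, and the decode multiply is cheap enough to run in a fragment shader at every pipeline step, as described above.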

Project member(s): João Pedro Jorge & Willem Frishert

Project: Realistic water simulation using GLSL


As part of the Real-Time Rendering course project, a real-time realistic water simulation was developed using OpenGL and GLSL. Some characteristics of the simulation are:


  • A Fresnel term was incorporated so that more refraction and less reflection occur when looking down at the water surface from a steep angle, and more reflection and less refraction occur when viewing the surface from a grazing angle.
  • To simulate reflection and refraction, projective texturing was used. The scene was rendered several times at a lower resolution with a clipping plane, and the screen was captured at each stage of the rendering process. In the final stage, projective texturing in GLSL was used to project the captured images from the previous rendering cycles onto the scene.
  • The project was further improved by using frame buffer objects (FBOs) to avoid the slow OpenGL copy-texture functions.
  • 3D noise textures were used to simulate the wave motion of the water surface.
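A Fresnel term like the one in the first bullet is commonly computed with Schlick's approximation. This Python sketch assumes an air-to-water interface (the project itself used GLSL, and may have used a different formulation):

```python
def fresnel_schlick(cos_theta, n1=1.0, n2=1.33):
    """Schlick's approximation of Fresnel reflectance.
    cos_theta is the cosine of the angle between the view direction
    and the surface normal; n2 = 1.33 is water's refractive index."""
    r0 = ((n1 - n2) / (n1 + n2)) ** 2          # reflectance at normal incidence
    return r0 + (1.0 - r0) * (1.0 - cos_theta) ** 5
```

Looking straight down (cos_theta near 1) gives reflectance of only about 2%, so refraction dominates; at grazing angles (cos_theta near 0) reflectance approaches 1, matching the behavior described in the bullet.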

Project member(s): Ruman Zakaria Salam

Project: Photon Mapping


As part of a project for the Advanced Global Illumination course, photon mapping was used to render realistic images.

Qt from Trolltech was used to build the front end, where users can change various settings for the photon mapping. Users can set the camera position as well as the size, position, and material properties of the objects inside the Cornell box. Settings related to photon mapping, such as the number of photons to emit and the resolution of the image to produce, can also be adjusted through the interface.

Some characteristics include:

  • An efficient random number generator was used.
  • Refractive caustics are supported.
  • Color bleeding is reproduced.
  • An efficient, easy-to-use front end (GUI), built using Qt and OpenGL, is used to set the various properties of the scene and the photon-mapping variables.
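The core photon-map gather step can be sketched as a density estimation over the k nearest photons. This is an illustrative brute-force Python version (names and the diffuse-BRDF simplification are our assumptions; a real implementation, as in Jensen's method, locates photons with a kd-tree):

```python
import math

def dist2(a, b):
    """Squared Euclidean distance between two points."""
    return sum((ai - bi) ** 2 for ai, bi in zip(a, b))

def radiance_estimate(photons, x, k=50):
    """Estimate reflected flux per unit area at point x from the
    k nearest photons. Each photon is a (position, power) pair;
    the surface BRDF is assumed to be folded in by the caller."""
    nearest = sorted(photons, key=lambda p: dist2(p[0], x))[:k]
    r2 = dist2(nearest[-1][0], x)          # squared radius of the gather disc
    total_power = sum(power for _, power in nearest)
    return total_power / (math.pi * r2)    # photon power / disc area
```

The estimate divides the gathered photon power by the area of the disc containing the k photons, which is how effects like the caustics and color bleeding listed above emerge from the stored photon distribution.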

Project member(s): Ruman Zakaria Salam

Project: Hwakins Wormhole


A game was developed for the mobile platform using OpenGL ES, the OpenGL variant specially designed for smartphones and PDAs.

The player controls a continuously moving glowing ball by drawing a bar with a stylus pen. The bar acts as a barrier in the path of the moving ball, and the ball rebounds after striking it. The player must keep the ball away from the wormholes, which move around randomly; otherwise the ball gets sucked into a wormhole by the force of gravity and the player loses a life. In addition, the player has to make the ball hit specific items moving randomly in space to earn points and extra strength. Ultimately, the goal is to stay alive as long as possible and clear as many stages as possible to maximize the score.
The glowing ball, the wormholes, and the explosions are all rendered using particle systems. The ball is attracted toward the wormholes by the force of gravity.
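The gravitational pull of the wormholes on the ball can be sketched as an inverse-square attraction integrated with Euler steps. The force model, `strength` parameter, and function names are illustrative assumptions, not the game's actual code:

```python
import math

def gravity_step(ball_pos, ball_vel, holes, dt, strength=1.0):
    """One Euler integration step of the ball being pulled toward
    each wormhole with an inverse-square attractive force."""
    ax = ay = 0.0
    for hx, hy in holes:
        dx, dy = hx - ball_pos[0], hy - ball_pos[1]
        d = math.hypot(dx, dy)
        if d > 1e-6:                      # avoid the singularity at the hole
            a = strength / (d * d)        # inverse-square magnitude
            ax += a * dx / d              # accumulate the unit-direction pull
            ay += a * dy / d
    vx = ball_vel[0] + ax * dt
    vy = ball_vel[1] + ay * dt
    return (ball_pos[0] + vx * dt, ball_pos[1] + vy * dt), (vx, vy)
```

Each frame the ball's velocity drifts toward the nearest wormholes, which is what the player's drawn bars have to counteract.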

Project member(s): Ruman Zakaria Salam


Page responsible: webbredaktoren@liu.se
Last updated: Mon May 13 15:02:24 CEST 2013