EE586L/Projects 2010

From WikiBiron

'''Project Title:''' ''Interactive Object Tracking''

'''Authors:''' [mailto:davidrm@usc.edu David Martin], [mailto:mingchuh@usc.edu Ming-Chun Huang], [mailto:yujenh@usc.edu Yu-Jen Huang]
  
 
'''Abstract:''' ''In this project, a camera-based object tracking system will be demonstrated along with virtual reality effects. Our goal is to replace a real-world target with a pre-stored image. To make the replacement appear realistic, several image enhancements, such as color blending, image scaling, shadow insertion, and additive noise, are applied to the pre-stored image before superimposing it on the live video feed. In contrast with simple static image replacement, the object is dynamically adjusted to mimic realistic shadows and rotations, and then displayed on the LCD screen. The project aims to provide an entertainment and education platform for kids and the elderly in an interactive and animated fashion, such as storytelling and rehabilitation video games. ''
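The superimposition step described above can be sketched in a few lines of NumPy. This is an illustrative sketch only, not the project's actual DSP implementation: the function name, parameters, and toy images are all assumptions, and only alpha blending and additive noise are shown (shadow insertion and rotation are omitted).

```python
import numpy as np

def superimpose(frame, overlay, top, left, alpha=0.8, noise_sigma=2.0, seed=0):
    """Blend a pre-stored overlay image into a video frame at (top, left).

    Alpha blending mixes the overlay with the background so it does not
    look pasted on; a little Gaussian noise mimics the sensor noise of
    the live feed so the insert matches its surroundings.
    """
    out = frame.astype(np.float32).copy()
    h, w = overlay.shape[:2]
    region = out[top:top + h, left:left + w]
    blended = alpha * overlay.astype(np.float32) + (1.0 - alpha) * region
    rng = np.random.default_rng(seed)
    blended += rng.normal(0.0, noise_sigma, blended.shape)  # additive noise
    out[top:top + h, left:left + w] = blended
    return np.clip(out, 0, 255).astype(np.uint8)

# Toy example: a dark 64x64 frame with a bright 16x16 overlay.
frame = np.zeros((64, 64, 3), dtype=np.uint8)
overlay = np.full((16, 16, 3), 200, dtype=np.uint8)
result = superimpose(frame, overlay, top=10, left=20)
```

In a real pipeline the blend would run per frame on the tracked target's location, with `alpha` and `noise_sigma` tuned to the camera.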

'''Poster:'''

'''Video:'''

----

== Sixth Sensors ==

'''Authors:''' [mailto:chongliu@usc.edu Chong Liu], [mailto:daruxu@usc.edu Daru Su], [mailto:yuanhans@usc.edu Yuanhang Su], [mailto:narvanen@usc.edu Yashodhar Narvaneni]

'''Abstract:''' ''The project is to design and develop an interactive drawing and presentation tool that offers the freedom of free-hand movement and gestures. The system uses a CCD camera to capture the user's hand movements and gestures. Finger-tracking and gesture-recognition algorithms are implemented on the DSP board, and the output is fed to a projector. The fingers are tracked based on colored caps worn on the fingers, and gestures are detected based on the direction and speed of motion. The tool serves both as a drawing tool and as an interactive presentation tool. It can be further extended into a user-interface tool for lectures, teaching, televisions, computers, or many other devices with which a user can interact. ''
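The tracking-and-gesture idea above can be sketched with plain NumPy: threshold pixels in a color range to find the cap, take the centroid, and classify the centroid's motion between frames by direction and speed. This is a minimal sketch under assumed color bounds and toy frames, not the team's actual algorithm; all names and thresholds here are illustrative.

```python
import numpy as np

def track_cap(frame, lower, upper):
    """Locate a colored finger cap by color thresholding.

    frame: HxWx3 uint8 RGB image; lower/upper: per-channel color bounds.
    Returns the (row, col) centroid of matching pixels, or None if absent.
    """
    mask = np.all((frame >= lower) & (frame <= upper), axis=-1)
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None
    return ys.mean(), xs.mean()

def classify_gesture(p0, p1, min_speed=5.0):
    """Classify centroid motion between two frames as a swipe direction."""
    dy, dx = p1[0] - p0[0], p1[1] - p0[1]
    if np.hypot(dy, dx) < min_speed:   # too slow: no gesture
        return "still"
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"

# Toy frames: a 5x5 red cap that moves 12 pixels to the right.
def make_frame(col):
    f = np.zeros((48, 48, 3), dtype=np.uint8)
    f[20:25, col:col + 5] = (220, 30, 30)  # bright red cap
    return f

lower, upper = np.array([180, 0, 0]), np.array([255, 80, 80])
c0 = track_cap(make_frame(10), lower, upper)
c1 = track_cap(make_frame(22), lower, upper)
gesture = classify_gesture(c0, c1)  # → "right"
```

On real video the color bounds would typically be set in a hue-based color space for lighting robustness, and the speed threshold tuned to the frame rate.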
  
 
'''Poster:'''

'''Video:'''

Revision as of 17:31, 22 April 2010
