EE586L/Projects 2010

== Eye Catchers ==

'''Authors:''' [mailto:deodhar@usc.edu Ashvin Deodhar], [mailto:temcguir@usc.edu Trevor McGuire], [mailto:tokayer@usc.edu Jason Tokayer]

'''Abstract:''' ''Automatic video gaze tracking finds applications in accessibility for disabled users and in behavioral analysis. Recognizing and tracking the pupil of the eye poses many challenges. We adapt a version of the Starburst algorithm, originally developed to track the eye under infrared illumination, to track the pupil in the visible spectrum. The adapted algorithm runs on a TI DSP and tracks the pupil with a camera mounted in front of a monitor. We demonstrate the system by selecting menu options on the monitor using eye gaze.''
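
The ray-casting step at the heart of Starburst is easy to sketch. The following is a minimal illustration, not the team's implementation: it assumes a grayscale NumPy frame and a rough starting point near the pupil, and it fits a circle by least squares where Starburst proper fits an ellipse (typically with RANSAC). All names and thresholds are illustrative.

<pre>
# Minimal Starburst-style sketch (illustrative, not the team's code).
import numpy as np

def starburst_candidates(gray, start, n_rays=18, max_len=80, grad_thresh=20.0):
    """Cast rays from `start` and return the first strong intensity edge
    along each ray as a candidate pupil-boundary point."""
    h, w = gray.shape
    cx, cy = start
    points = []
    for theta in np.linspace(0.0, 2.0 * np.pi, n_rays, endpoint=False):
        dx, dy = np.cos(theta), np.sin(theta)
        prev = float(gray[int(cy), int(cx)])
        for r in range(1, max_len):
            x, y = int(cx + r * dx), int(cy + r * dy)
            if not (0 <= x < w and 0 <= y < h):
                break
            cur = float(gray[y, x])
            # Dark pupil to brighter iris: look for a large positive jump.
            if cur - prev > grad_thresh:
                points.append((x, y))
                break
            prev = cur
    return points

def fit_circle(points):
    """Least-squares circle fit (a simple stand-in for the ellipse fit
    used by Starburst proper)."""
    pts = np.asarray(points, dtype=float)
    A = np.column_stack([2 * pts[:, 0], 2 * pts[:, 1], np.ones(len(pts))])
    b = (pts ** 2).sum(axis=1)
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    radius = np.sqrt(c + cx ** 2 + cy ** 2)
    return (cx, cy), radius
</pre>

In Starburst proper, the fitted center seeds a second round of rays, and the process iterates until the center converges.
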
'''Video:'''

== Ewind ==

'''Authors:''' [mailto:shengguo@usc.edu Sheng Guo], [mailto:yinjunpa@usc.edu Yinjun Pan], [mailto:weiweiwa@usc.edu Weiwei Wang]

'''Abstract:''' ''Our project is emotion detection from speech. We use pitch, energy, and mel filter-bank (MFB) outputs as features to estimate the speaker's emotion, weighting each feature according to its importance. The EMA database supplies both training and test data. In the demo we test the system on utterances from this database that were held out of training, as well as on speech clips from YouTube.''
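
As a rough illustration of the weighted-feature idea: the feature extractors, weights, and nearest-template decision rule below are assumptions for the sketch, not the team's classifier.

<pre>
# Weighted feature combination for emotion classification (illustrative).
import numpy as np

def frame_energy(frames):
    """Short-time energy per frame (frames: n_frames x frame_len)."""
    return (frames ** 2).mean(axis=1)

def pitch_autocorr(frame, fs, fmin=80, fmax=400):
    """Crude pitch estimate from the autocorrelation peak."""
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lo, hi = int(fs / fmax), int(fs / fmin)
    lag = lo + int(np.argmax(ac[lo:hi]))
    return fs / lag

def classify(features, templates, weights):
    """Pick the emotion whose template minimizes the weighted distance.

    features  : 1-D feature vector for the utterance
    templates : dict emotion -> mean feature vector from training data
    weights   : per-feature importance weights
    """
    dist = {e: float(np.sum(weights * (features - t) ** 2))
            for e, t in templates.items()}
    return min(dist, key=dist.get)
</pre>
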
'''Poster:'''

== Interactive Object Tracking ==

'''Authors:''' David Martin, Ming-Chun Huang, Yu-Jen Huang

'''Abstract:''' ''In this project, a camera-based object tracking system will be demonstrated along with virtual reality effects. Our goal is to replace a real-world target with a pre-stored image. To make the replacement appear realistic, several image enhancements, such as color blending, image scaling, shadow insertion, and additive noise, are applied to the pre-stored image before superimposing it on the live video feed. In contrast with simple static image replacement, the object is dynamically adjusted to mimic realistic shadows and rotations, and then displayed on the LCD screen. The project aims to provide an entertainment and education platform for children and the elderly in an interactive and animated fashion, such as storytelling and rehabilitation video games.''
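
A minimal sketch of the replace-and-blend step, assuming the tracker has already produced a bounding box for the target in the current frame; the blending weight and noise level are illustrative, not the team's tuned values.

<pre>
# Superimpose a pre-stored image over the tracked region (illustrative).
import cv2
import numpy as np

def superimpose(frame, stored, box, alpha=0.85, noise_sigma=3.0):
    """Replace the tracked region of `frame` with `stored`, scaled to fit,
    blended with the background, and roughened with additive noise.

    frame  : HxWx3 uint8 live frame
    stored : pre-stored replacement image (any size, 3 channels)
    box    : (x, y, w, h) tracked target region
    """
    x, y, w, h = box
    patch = cv2.resize(stored, (w, h)).astype(np.float32)
    bg = frame[y:y + h, x:x + w].astype(np.float32)
    # Color blending: mix the stored image with the underlying pixels so
    # local lighting and shadows bleed through.
    mixed = alpha * patch + (1.0 - alpha) * bg
    # Additive noise so the clean stored image matches camera sensor noise.
    mixed += np.random.normal(0.0, noise_sigma, mixed.shape)
    frame[y:y + h, x:x + w] = np.clip(mixed, 0, 255).astype(np.uint8)
    return frame
</pre>
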
'''Poster:'''

'''Video:'''

== Mosaic ==

'''Authors:''' [mailto:pengkuic@usc.edu Pengkui Chu], [mailto:shiyuxu@usc.edu Shiyu Xu], [mailto:yingzhen@usc.edu Ying Zheng]

'''Abstract:''' ''Virtual drums based on a digital camera.''
'''Video:'''

== Project Natal - The Beginning ==

'''Authors:''' [mailto:gorsi@usc.edu Talha Gorsi], [mailto:arjungup@usc.edu Arjun Gupta], [mailto:vkhatri@usc.edu Vikash Khatri]

'''Abstract:''' ''Project Natal combines two classic arcade games, Snake and a vertical-scrolling plane. Both are controlled by human gestures, which are identified by converting the camera's video sequence into background-subtracted binary frames and analyzing the motion in those frames. The Snake game is an example of discrete human action recognition: the snake's movement is controlled by moving a hand to the right, left, or up, and the snake can eat eggs, grow in size, and hit the maze. The vertical-scrolling plane game is inspired by the game "1942"; the plane follows the position of the player's hands left-right and up-down through continuous tracking of hand position. Obstacles appear in random order, and the goal is to keep the plane from hitting any of them. The user selects a game interactively on the home page by pointing a hand to the right or left; this home page appears at the start and at the end of each game.''
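
A minimal sketch of the described pipeline, background subtraction to a binary frame followed by a direction decision from the motion of the foreground centroid; the thresholds and the three-way decision rule are illustrative assumptions, not the team's tuned implementation.

<pre>
# Background subtraction + centroid-motion gesture decision (illustrative).
import numpy as np

def binarize(frame_gray, background_gray, thresh=30):
    """Background-subtracted binary frame: 1 where the pixel differs
    enough from the stored background."""
    diff = np.abs(frame_gray.astype(int) - background_gray.astype(int))
    return (diff > thresh).astype(np.uint8)

def centroid(binary):
    """Centroid of the foreground pixels, or None if nothing moved."""
    ys, xs = np.nonzero(binary)
    if len(xs) == 0:
        return None
    return xs.mean(), ys.mean()

def gesture(prev_c, cur_c, min_move=10.0):
    """Discrete action ('left', 'right', or 'up') from centroid motion,
    as used to steer the snake."""
    if prev_c is None or cur_c is None:
        return None
    dx, dy = cur_c[0] - prev_c[0], cur_c[1] - prev_c[1]
    if max(abs(dx), abs(dy)) < min_move:
        return None
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "up" if dy < 0 else None  # image y grows downward
</pre>
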
  
 
'''Poster:'''

'''Video:'''

== Sixth Sensors ==

'''Authors:''' Chong Liu, Daru Su, Yuanhang Su, Yashodhar Narvaneni

'''Abstract:''' ''This project designs and develops an interactive drawing and presentation tool that gives the user the freedom of free-hand movement and gestures. The system uses a CCD camera to capture the user's hand movements and gestures. Finger tracking and gesture recognition algorithms run on the DSP board, and the output is fed to a projector. Fingers are tracked by means of colored caps worn on the fingertips, and gestures are detected from the direction and speed of motion. The tool serves as a drawing tool and an interactive presentation tool, and could be extended into a general user interface for lectures, teaching, television, computers, and other interactive devices.''
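
A minimal sketch of color-cap tracking: threshold the frame in HSV around the cap color and report the blob centroid plus its direction and speed, the raw signal a gesture classifier would consume. The HSV bounds (here, for a red cap) are illustrative assumptions, not the team's calibrated values.

<pre>
# Color-cap fingertip tracking via HSV thresholding (illustrative).
import cv2
import numpy as np

def track_cap(frame_bgr, hsv_lo=(0, 120, 80), hsv_hi=(10, 255, 255)):
    """Return the (x, y) centroid of the colored finger cap, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv,
                       np.array(hsv_lo, dtype=np.uint8),
                       np.array(hsv_hi, dtype=np.uint8))
    ys, xs = np.nonzero(mask)
    if len(xs) < 50:          # too few pixels: cap not visible
        return None
    return float(xs.mean()), float(ys.mean())

def velocity(prev_pt, cur_pt, dt):
    """Direction (radians) and speed (pixels/s) of the fingertip
    between frames."""
    if prev_pt is None or cur_pt is None:
        return None
    dx, dy = cur_pt[0] - prev_pt[0], cur_pt[1] - prev_pt[1]
    speed = (dx ** 2 + dy ** 2) ** 0.5 / dt
    return np.arctan2(dy, dx), speed
</pre>
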
'''Poster:'''

'''Video:'''

== SmartVoice ==

'''Authors:''' [mailto:siy@usc.edu Yu Si], [mailto:zhiyangw@usc.edu Zhiyang Wang], [mailto:yuyuxu@usc.edu Yuyu Xu]

'''Abstract:''' ''Automatic voice recognition. MFCC features and vector quantization (VQ) are used to extract each voice's characteristics and create a personal ID. Two modes are provided: in training mode, a password is enrolled; in testing mode, actual identification and access control are performed.''
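
A minimal sketch of the VQ identification step: build one codebook per enrolled speaker from training MFCC frames, then identify a test utterance by the codebook with the lowest average quantization distortion. MFCC extraction is assumed to be done elsewhere; the function names and codebook size are illustrative.

<pre>
# VQ-codebook speaker identification over MFCC frames (illustrative).
import numpy as np
from scipy.cluster.vq import kmeans, vq

def train_codebook(mfcc_frames, codebook_size=32):
    """Build a VQ codebook for one speaker.

    mfcc_frames : n_frames x n_coeffs array of that speaker's training MFCCs
    """
    codebook, _ = kmeans(mfcc_frames.astype(float), codebook_size)
    return codebook

def identify(mfcc_frames, codebooks):
    """Pick the enrolled speaker whose codebook fits the test frames best.

    codebooks : dict speaker_name -> codebook array
    """
    scores = {}
    for name, cb in codebooks.items():
        _, dist = vq(mfcc_frames.astype(float), cb)
        scores[name] = float(dist.mean())   # average quantization distortion
    return min(scores, key=scores.get)
</pre>
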
'''Poster:'''