EE586L/Projects 2012


Digit(al) Calculator

Authors: Anil Sunil, Chetan Bhadrashette, Sarthak Sahu

Abstract: Automatic recognition of sign language is an important research problem in communication. Real-time image processing can provide a much better experience than a touch-based system. Our project implements a basic calculator driven by gesture recognition: numbers and operators such as addition, subtraction, multiplication, and division are entered with hand gestures, without pressing any buttons or typing anything.
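
The abstract does not say how a recognized gesture sequence is turned into a result; a minimal sketch, assuming the recognizer emits a token stream of digit strings and the operators '+', '-', '*', '/' (the token format and the left-to-right evaluation rule are assumptions, not the authors' design):

```python
def evaluate(tokens):
    """Evaluate tokens such as ['7', '+', '3', '*', '2'] left to right (no operator precedence)."""
    ops = {'+': lambda a, b: a + b,
           '-': lambda a, b: a - b,
           '*': lambda a, b: a * b,
           '/': lambda a, b: a / b}
    result = float(tokens[0])
    # pair each operator with the operand that follows it
    for op, operand in zip(tokens[1::2], tokens[2::2]):
        result = ops[op](result, float(operand))
    return result

if __name__ == "__main__":
    # e.g. the user signs "7", then "+", then "3"
    print(evaluate(['7', '+', '3']))  # -> 10.0
```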

Video: [TBD]


EDGR

Authors: Aditya Tannu, Michael Minkler, Joshua Ramos

Abstract: EDGR (Embedded Depth Gesture Recognition): an 8-piece puzzle solved using hand gestures.

Video: [TBD]


ExDetect

Authors: Yixin Shi, Qinwen Xu, Zhanpeng Yi

Abstract: The human visual system understands different facial emotions very easily; however, building a real-time automated facial expression recognition system with high accuracy and low delay still takes effort. Here, a real-time facial expression recognition prototype will be developed. The system detects a single face in a real-time video sequence and then attempts to recognize a set of emotional expressions: joy, surprise, disgust, anger, and neutral. It should respond to changes in emotion without perceptible delay. First, skin color is used to locate the face region in the video stream; then the LBP operator is applied to small blocks of the extracted face, and the per-block histograms are concatenated into a single feature vector. Template matching is used as the classifier, and the output is one of the five predefined emotions.
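
A minimal sketch of the LBP-feature and template-matching step described above. The block grid, the plain 256-bin LBP histogram, and the chi-square distance are assumptions for illustration, not the authors' choices:

```python
import numpy as np

def lbp_image(gray):
    """Basic 3x3 LBP: compare each pixel to its 8 neighbours and pack the bits into a code."""
    g = gray.astype(np.int32)
    center = g[1:-1, 1:-1]
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
              (1, 1), (1, 0), (1, -1), (0, -1)]   # neighbours in clockwise order
    codes = np.zeros_like(center)
    for bit, (dy, dx) in enumerate(shifts):
        nb = g[1 + dy:g.shape[0] - 1 + dy, 1 + dx:g.shape[1] - 1 + dx]
        codes |= (nb >= center).astype(np.int32) << bit
    return codes

def lbp_feature(face_gray, grid=(8, 8)):
    """Divide the face into blocks, histogram the LBP codes per block, and concatenate."""
    codes = lbp_image(face_gray)
    h, w = codes.shape
    bh, bw = h // grid[0], w // grid[1]
    feats = []
    for by in range(grid[0]):
        for bx in range(grid[1]):
            block = codes[by * bh:(by + 1) * bh, bx * bw:(bx + 1) * bw]
            hist, _ = np.histogram(block, bins=256, range=(0, 256), density=True)
            feats.append(hist)
    return np.concatenate(feats)

def classify(feature, templates):
    """Template matching: return the emotion whose stored template is closest (chi-square)."""
    def chi2(p, q):
        return np.sum((p - q) ** 2 / (p + q + 1e-10))
    return min(templates, key=lambda label: chi2(feature, templates[label]))
```

Here `templates` would be a dict mapping each of the five emotion labels to a feature vector computed the same way from a reference face.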

Video: [TBD]


FaceDetc

Authors: Li Cheng, Jinghan Xu, Xin Wei

Abstract: Real-time face detection.

Video: [TBD]


Magic Face

Authors: Jinkai Wang, Ya Cao, Yao Lin

Abstract: Our project implements facial expression recognition in real time. Using the DSK6416T board and a camera, the face, eyes, and mouth are tracked, and bounding boxes are displayed on the board's video output. Smile, surprise, and neutral expressions can be recognized.
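
The abstract does not say which detectors run on the DSK6416T; as an illustration of the bounding-box step only, here is a desktop sketch using OpenCV Haar cascades on a webcam (a stand-in, not the authors' implementation; the smile/surprise/neutral classification is omitted):

```python
import cv2

# Sketch only: OpenCV's bundled cascades and the parameters below are assumptions.
face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5):
        # face bounding box
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        roi = gray[y:y + h, x:x + w]
        for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(roi, scaleFactor=1.1, minNeighbors=10):
            # eye bounding boxes, offset back into full-frame coordinates
            cv2.rectangle(frame, (x + ex, y + ey), (x + ex + ew, y + ey + eh), (255, 0, 0), 2)
    cv2.imshow("Magic Face sketch", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cap.release()
cv2.destroyAllWindows()
```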

Video: [TBD]


Paper Piano

Authors: Hang Dong, Dana Morgenstern, Yu Rong

Abstract: The goal of our project is to build a simple virtual piano, using a piece of paper as the keyboard and a camera to track finger movements in order to select the notes to play. Tracking the fingers and detecting which keys are pressed is done with video processing techniques. The notes corresponding to the pressed keys are output through a loudspeaker. We may also display the result of the edge detection of the paper piano keys and fingertips on the board's LCD.
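
As an illustration of the key-selection and sound-output steps, a minimal sketch that maps a detected fingertip x coordinate to one of the paper keys and synthesizes the corresponding tone. The number of keys, the note frequencies, the sample rate, and the linear key layout are assumptions, not the authors' design; actual playback through the loudspeaker is left out:

```python
import numpy as np

SAMPLE_RATE = 8000          # assumed sample rate for the sketch
NUM_KEYS = 8                # assumed: one octave of white keys, C4..C5
KEY_FREQS = [261.63, 293.66, 329.63, 349.23, 392.00, 440.00, 493.88, 523.25]  # Hz

def key_from_fingertip(x, keyboard_left, keyboard_right):
    """Map a fingertip x coordinate (pixels) to a key index on the paper keyboard."""
    width = keyboard_right - keyboard_left
    idx = int((x - keyboard_left) / width * NUM_KEYS)
    return min(max(idx, 0), NUM_KEYS - 1)

def note_samples(key_index, duration=0.3):
    """Synthesize a short sine tone for the pressed key."""
    t = np.arange(int(SAMPLE_RATE * duration)) / SAMPLE_RATE
    return 0.5 * np.sin(2 * np.pi * KEY_FREQS[key_index] * t)

# e.g. a fingertip detected at x = 212 on a paper keyboard spanning x in [80, 560]
samples = note_samples(key_from_fingertip(212, 80, 560))
```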

Video: [TBD]


Smart Group

Authors: Li Li, Hao Xu, Chiho Choi

Abstract: Gaze Tracking. TBD

Video: [TBD]


Visual Object

Authors: Shira Epstein, Will Chung

Abstract: Virtual Object inserts a virtual object into the video feed captured by the camera. First, an object of known shape and size is placed at a fixed location in the real scene. Using the POSIT algorithm, our code detects the object's feature points and estimates the relative camera pose in 3D space. Finally, the virtual object is drawn into the output video accordingly.
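
A sketch of the pose-estimation and drawing step, with OpenCV's cv2.solvePnP standing in for POSIT (POSIT assumes a scaled-orthographic camera, solvePnP solves the full perspective problem). The marker geometry, camera intrinsics, and detected image points below are placeholder values, not the authors' data:

```python
import numpy as np
import cv2

# 3D corners of an assumed 10 cm square marker, in the object's own frame (metres)
object_points = np.array([[0.0, 0.0, 0.0],
                          [0.1, 0.0, 0.0],
                          [0.1, 0.1, 0.0],
                          [0.0, 0.1, 0.0]], dtype=np.float32)

# Corresponding corners detected in the video frame (pixels) -- placeholder values
image_points = np.array([[320.0, 240.0],
                         [400.0, 238.0],
                         [402.0, 318.0],
                         [318.0, 322.0]], dtype=np.float32)

# Assumed pinhole intrinsics: focal length 600 px, principal point (320, 240)
camera_matrix = np.array([[600.0, 0.0, 320.0],
                          [0.0, 600.0, 240.0],
                          [0.0, 0.0, 1.0]], dtype=np.float32)

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, camera_matrix, None)
if ok:
    # rvec/tvec give the object's pose relative to the camera; project a virtual
    # point offset 10 cm from the marker plane into the output frame.
    virtual_point = np.array([[0.05, 0.05, -0.1]], dtype=np.float32)
    projected, _ = cv2.projectPoints(virtual_point, rvec, tvec, camera_matrix, None)
    print("Draw virtual object at pixel:", projected.ravel())
```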

Video: [TBD]