EE586L/Projects 2011

Alter Ego

Authors: Nirmal Patel, Vipresh Gangwal

Abstract: Alter Ego, Latin for "the other I," here refers to one's own reflection. The project acts like a mirror: one puts on trioscopic goggles and looks at a 3D image of oneself on the LCD screen. This colorful 3D representation is achieved by generating anaglyphs.
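
A minimal sketch of the anaglyph idea, not the team's actual code: a red-cyan anaglyph can be built by taking the red channel from the left-eye view and the green and blue channels from the right-eye view. The frames below are synthetic stand-ins.

# Hypothetical sketch of red-cyan anaglyph generation, not the team's code.
# Assumes two pre-aligned stereo views of the same scene as numpy arrays.
import numpy as np

def make_anaglyph(left_rgb, right_rgb):
    # Red channel from the left-eye view, green and blue from the right-eye view,
    # which is what red-cyan glasses separate back into two images.
    anaglyph = np.empty_like(left_rgb)
    anaglyph[..., 0] = left_rgb[..., 0]
    anaglyph[..., 1:] = right_rgb[..., 1:]
    return anaglyph

# Synthetic stand-in frames; a real system would use two camera views, or one
# view shifted horizontally to fake the second eye.
h, w = 240, 320
left = np.random.randint(0, 256, (h, w, 3), dtype=np.uint8)
right = np.roll(left, 8, axis=1)
print(make_anaglyph(left, right).shape)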

Video: YouTube Video

Blu-Rabbit

Authors: Evan Lee, Jose Herrera, Ross MacKinnon

Abstract: In an attempt to create a more ergonomic and natural interface between humans and computers, this project strives to utilize common communication techniques to replace the hardware used today. The outdated and unnatural mouse is not the optimal way to create a comfortable experience for the user, and thus this project utilizes intuitive gestures to achieve the same end. By developing and tweaking algorithms for finger tracking and gesture recognition, the members of the Blu-Rabbit team show how interaction between humans and computers may mimic interactions seen between humans.
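
As a hedged illustration of one common finger-tracking starting point (the Blu-Rabbit algorithms themselves are not shown), the sketch below thresholds skin-like pixels and takes the topmost one as a fingertip estimate; the thresholds are assumptions.

# Hypothetical fingertip-detection sketch, not the Blu-Rabbit implementation.
# Assumes an RGB frame in which an upright pointing hand is the main skin-colored region.
import numpy as np

def skin_mask(frame_rgb):
    # Very rough skin segmentation in RGB space; the thresholds are assumptions.
    r = frame_rgb[..., 0].astype(np.int16)
    g = frame_rgb[..., 1].astype(np.int16)
    b = frame_rgb[..., 2].astype(np.int16)
    return (r > 95) & (g > 40) & (b > 20) & (r > g) & (r > b) & (np.abs(r - g) > 15)

def fingertip(frame_rgb):
    # With an upright pointing hand, the topmost skin pixel approximates the fingertip.
    rows, cols = np.nonzero(skin_mask(frame_rgb))
    if rows.size == 0:
        return None
    i = np.argmin(rows)
    return int(rows[i]), int(cols[i])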

Video: YouTube Video

Cam Hockey

Authors: Anoop Panicker, Mihir Thakkar, Sanjay Jain, Vinit Adkar

Abstract: Cam Hockey is a video camera-based interactive version of the popular game Air Hockey. Each player controls an on-screen paddle with hand movements (using a color marker) and tries to hit the ball toward the opponent's side of the screen. If a player fails to return the ball, the other player scores a goal; the objective is to score as many goals as you can while outscoring your opponent.
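
A rough sketch of how a color marker could drive the on-screen player, assuming a bright green marker and a simple per-pixel color distance; this is an illustration, not the Cam Hockey implementation.

# Hypothetical color-marker tracking sketch, not the Cam Hockey code.
# Assumes a bright green marker; the centroid of matching pixels drives the paddle.
import numpy as np

def marker_centroid(frame_rgb, target=(0, 255, 0), tol=60):
    # Mark pixels whose R, G and B values are all within `tol` of the target color.
    diff = np.abs(frame_rgb.astype(np.int16) - np.array(target, dtype=np.int16))
    ys, xs = np.nonzero((diff < tol).all(axis=-1))
    if ys.size == 0:
        return None
    return ys.mean(), xs.mean()

def paddle_row(centroid, frame_height, screen_height):
    # Map the marker's vertical position in the camera frame to the on-screen paddle.
    return int(centroid[0] / frame_height * screen_height)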

Video: YouTube Video

CamMouse

Authors: Yen-Cheng Chen, Chan-Yi Ou, Yu-Ting Wu, Jin-Wei Lee

Abstract: Our goal is to develop a virtual cursor interface that controls the cursor just as a normal computer mouse does. The idea is inspired by daily life: although touch panels are improving and becoming more widespread, a mouse is still the easiest and least tiring way to control a cursor, but it requires a flat surface. By using only a video camera, this system removes the need to carry a mouse and can be used on any kind of surface.
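
A minimal sketch of one way to map a camera-tracked point to cursor coordinates, assuming some upstream tracker already reports the marker or fingertip position each frame; the scaling and smoothing constants are placeholders, not the CamMouse parameters.

# Hypothetical cursor-mapping sketch, not the CamMouse implementation.
# Assumes an upstream tracker that reports the marker/finger position every frame.
class CursorMapper:
    def __init__(self, cam_size, screen_size, alpha=0.4):
        self.cam_w, self.cam_h = cam_size
        self.scr_w, self.scr_h = screen_size
        self.alpha = alpha                  # smoothing factor: higher = more responsive
        self.x, self.y = screen_size[0] // 2, screen_size[1] // 2

    def update(self, cam_x, cam_y):
        # Scale camera coordinates to screen coordinates, then low-pass filter
        # the result so small tracking errors do not make the cursor jitter.
        tx = cam_x / self.cam_w * self.scr_w
        ty = cam_y / self.cam_h * self.scr_h
        self.x = (1 - self.alpha) * self.x + self.alpha * tx
        self.y = (1 - self.alpha) * self.y + self.alpha * ty
        return int(self.x), int(self.y)

mapper = CursorMapper(cam_size=(320, 240), screen_size=(1024, 768))
print(mapper.update(160, 120))              # tracked point at the camera's center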

Video: YouTube Video

Dancing Machine

Authors: Yue Pan, Chenchen Wang

Abstract: Dancing Machine modeling: 1) The behavior of a player wearing color markers is judged with a color detection algorithm; if the player moves the requested colored marker into the target area at the correct moment, one point is scored. 2) The speed of the game (the speed of the flying arrows) follows the chosen real-time music, using a beat detection algorithm. 3) The interface is divided into three parts: an arrow panel, a score panel, and a real-time camera video panel. 4) The game difficulty can be changed with the switches on the platform.
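
As an illustration of the beat-detection step mentioned above (not the team's code), a simple energy-based detector flags a beat whenever a block of audio is markedly louder than the recent average; the block size and threshold are assumptions.

# Hypothetical energy-based beat detector, not the Dancing Machine code.
# Assumes mono audio samples arriving in fixed-size blocks.
import numpy as np
from collections import deque

class BeatDetector:
    def __init__(self, history_blocks=43, threshold=1.5):
        # ~43 blocks of 1024 samples at 44.1 kHz is roughly one second of history.
        self.history = deque(maxlen=history_blocks)
        self.threshold = threshold

    def is_beat(self, block):
        # A beat is declared when this block's energy exceeds
        # threshold times the average energy of the recent history.
        energy = float(np.dot(block, block))
        beat = len(self.history) == self.history.maxlen and \
               energy > self.threshold * (sum(self.history) / len(self.history))
        self.history.append(energy)
        return beat

# Synthetic check: quiet noise with a loud burst every 50 blocks.
rng = np.random.default_rng(0)
det = BeatDetector()
for i in range(200):
    block = rng.normal(0, 0.01, 1024)
    if i % 50 == 0:
        block += rng.normal(0, 0.5, 1024)
    if det.is_beat(block):
        print("beat at block", i)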

Poster:

Video:

Fatigue Driving Warning System

Authors: Young Chun Ahn, Lu Liu, Yinxiao Zhang

Abstract: A system that monitors the user with a camera and sends out a warning when it detects that the user is dozing off.
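
The abstract does not describe the detection method, so the sketch below only illustrates the warning logic one might wrap around it: assuming some vision step reports per frame whether the eyes appear closed, a warning fires after a sustained closure. The frame rate and closure time are assumptions.

# Hypothetical warning logic only; the eye-state detector itself is assumed
# to exist upstream and is not part of this sketch.
class DrowsinessMonitor:
    def __init__(self, fps=30, closed_seconds=1.5):
        self.limit = int(fps * closed_seconds)      # frames of continuous eye closure
        self.closed_frames = 0

    def update(self, eyes_closed):
        # Count consecutive closed-eye frames; reset whenever the eyes open.
        self.closed_frames = self.closed_frames + 1 if eyes_closed else 0
        return self.closed_frames >= self.limit     # True = issue a warning

monitor = DrowsinessMonitor()
for frame_idx in range(120):
    if monitor.update(eyes_closed=(frame_idx >= 60)):   # simulate dozing off halfway
        print("warning at frame", frame_idx)
        break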

Video: YouTube Video

Go Five

Authors: Fei Yu, Weijie Zhao, Zhe Wang

Abstract: Motion-based control in games is becoming increasingly popular. Cameras and real-time video processing take the place of traditional input devices, offering a better game experience with more freedom and fun. In this project, finger gestures are used as the control input. Skin color extraction and pattern matching are used to detect the positions of the fingertips at low complexity, allowing real-time interactivity. In addition, a modified low-pass filter reduces the influence of noise and keeps the detection robust. More than five control commands, each defined by a distinct finger gesture, are designed to be user-friendly. Overall, by bringing this approach to a real-time interactive game, players can enjoy the game simply by moving their fingers in front of a camera.
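
A hedged sketch of two of the ingredients named above, skin-color extraction and low-pass filtering of the detected fingertip position; the chroma thresholds and filter constant are assumptions rather than the Go Five parameters, and the pattern-matching step is omitted.

# Hypothetical sketch: skin-color extraction in the Cb/Cr plane plus a first-order
# low-pass filter on the detected position; not the Go Five implementation.
import numpy as np

def skin_mask_ycbcr(frame_rgb):
    # Convert to YCbCr chroma and threshold a typical skin region (assumed bounds).
    rgb = frame_rgb.astype(np.float32)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return (cb > 77) & (cb < 127) & (cr > 133) & (cr < 173)

class SmoothedPoint:
    # First-order low-pass filter that suppresses jitter in the raw detection.
    def __init__(self, alpha=0.3):
        self.alpha = alpha
        self.value = None

    def update(self, point):
        p = np.asarray(point, dtype=np.float32)
        self.value = p if self.value is None else \
            (1 - self.alpha) * self.value + self.alpha * p
        return self.value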

Video: YouTube Video

Got Ya

Authors: Yu-Chu Yang, Chun-Kang Chen, Yung-Chun Chen

Abstract: We are building a surveillance system that tracks and labels the people in front of the camera. Once people step inside the monitored area, the system shows their outlines and assigns each person a bounding box with a color distinct from the others. The goal of the system is to accurately count the number of people in the monitored area.
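
One way to count people once a foreground mask is available is connected-component labeling with an area filter; the sketch below uses scipy for brevity and is an illustration, not the Got Ya implementation.

# Hypothetical people-counting sketch built on an existing foreground mask;
# scipy is used here for brevity, and this is not the Got Ya implementation.
import numpy as np
from scipy import ndimage

def count_people(foreground_mask, min_area=500):
    # Label connected foreground blobs, drop small ones, and return the count
    # plus one bounding box (row_slice, col_slice) per remaining blob.
    labels, _ = ndimage.label(foreground_mask)
    boxes = []
    for idx, box in enumerate(ndimage.find_objects(labels), start=1):
        if (labels[box] == idx).sum() >= min_area:
            boxes.append(box)
    return len(boxes), boxes

# Toy mask with two person-sized blobs.
mask = np.zeros((240, 320), dtype=bool)
mask[40:200, 30:80] = True
mask[60:220, 200:260] = True
print(count_people(mask)[0])    # -> 2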

Video: YouTube

hEEEro

Authors: Xiang Fu, Lidou Wang, Yingbu Kou

Abstract: Our project builds a continuously updated background (BG) model from the frame sequence using the Gaussian Mixture Model (GMM) algorithm, recognizes the foreground (FG) region in real time, and then composites it over a pre-stored background image or video. The main applications include the movie industry, weather forecasting, photographs with superstars, surveillance, and background modeling.
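
The full per-pixel GMM update is more involved; as a simplified stand-in (not the hEEEro code), the sketch below keeps a single running Gaussian per pixel, flags pixels far from the model as foreground, and composites them over a pre-stored background.

# Simplified single-Gaussian background model as a stand-in for the full GMM;
# not the hEEEro implementation.
import numpy as np

class RunningGaussianBG:
    def __init__(self, first_frame, lr=0.02, k=2.5):
        self.mean = first_frame.astype(np.float32)
        self.var = np.full_like(self.mean, 15.0 ** 2)
        self.lr, self.k = lr, k

    def apply(self, frame):
        # Flag pixels far from the model as foreground, then update the model
        # only where the pixel still matches the background, so foreground
        # objects are not absorbed too quickly.
        f = frame.astype(np.float32)
        diff2 = (f - self.mean) ** 2
        fg = (diff2 > (self.k ** 2) * self.var).any(axis=-1)
        bg = ~fg[..., None]
        self.mean = np.where(bg, (1 - self.lr) * self.mean + self.lr * f, self.mean)
        self.var = np.where(bg, (1 - self.lr) * self.var + self.lr * diff2, self.var)
        return fg

def composite(frame, fg_mask, new_background):
    # Paste the foreground pixels over a pre-stored background image or video frame.
    return np.where(fg_mask[..., None], frame, new_background)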

Poster:

Video:

Match & OCR

Authors: Chun-Ting Huang, Kuan-Ming Lin, Yen-Feng Lee

Abstract: Our project includes a mini game and a Chinese-numeral OCR system. The mini game uses marker tracking to check whether the marker matches a target position on the screen; when it does, the score increases. For the OCR system, we build a writing interface for input, and the system recognizes the input as a Chinese numeral or reports it as unknown.
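
A minimal sketch of one way a small-vocabulary recognizer like this could work, matching the written glyph against stored templates and reporting "unknown" below a similarity threshold; the templates and threshold are hypothetical, and this is not the team's OCR system.

# Hypothetical template-matching recognizer, not the team's OCR system.
# Assumes binary glyph images resized to a common size and a dictionary of
# stored templates for the Chinese numerals (placeholders here).
import numpy as np

def similarity(a, b):
    # Normalized correlation between two same-sized glyph images.
    a = a.astype(np.float32).ravel() - a.mean()
    b = b.astype(np.float32).ravel() - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom > 0 else 0.0

def recognize(glyph, templates, threshold=0.6):
    # Return the best-matching label, or "unknown" if nothing matches well enough.
    best_label, best_score = "unknown", threshold
    for label, tmpl in templates.items():
        score = similarity(glyph, tmpl)
        if score > best_score:
            best_label, best_score = label, score
    return best_label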

Video: YouTube Video

Midas Hands

Authors: Geun Lee, Suk Jin Lee, Shafil Ahmed, Christian Idyllel

Abstract: Depth data from the Kinect is used for hand gesture recognition to manipulate on-screen images through geometric modifications. These modifications include scaling and translation.
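
A hedged sketch of how depth data could yield a hand state for these manipulations, assuming the hand is the object closest to the sensor: the blob centroid can drive translation and the change in blob area between frames can drive scaling. The depth band is an assumption, not a Midas Hands parameter.

# Hypothetical depth-based hand-state sketch, not the Midas Hands code.
# Assumes a Kinect-style depth frame in millimetres where the hand is the
# closest object to the sensor and zero means "no reading".
import numpy as np

def hand_mask(depth_mm, band=100):
    # Keep pixels within `band` mm of the closest valid depth reading.
    valid = depth_mm > 0
    if not valid.any():
        return valid
    return valid & (depth_mm < depth_mm[valid].min() + band)

def hand_state(depth_mm):
    # The centroid of the hand blob can drive translation of the on-screen image;
    # the change in blob area between frames can drive scaling.
    ys, xs = np.nonzero(hand_mask(depth_mm))
    if ys.size == 0:
        return None
    return float(ys.mean()), float(xs.mean()), int(ys.size)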

Video: YouTube Video

Pitchin' Wavelets

Authors: Balamurali Govindaraju, Yule Wu

Abstract: Pitch Shifting by Wavelets.

Poster:

Video:

Project Conductor

Authors: Nitin Balajee Ravi, Prasanna Dhamodaran, Sung Kwang Cho

Abstract: The objective of the project is to change the tempo of music in real time using gestures. We also have single or multiple instruments playing music whose tempo is varied.

Video: YouTube Video

Super Advertisement

Authors: Jian Chen, Boyang Li, Chen Ling

Abstract: A method for automatically inserting an image into a real-time video sequence is implemented. The algorithm is based on Harris corner detection and image mapping.
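
As an illustration of the Harris step (the image-mapping/insertion step is omitted), the sketch below computes the Harris corner response from windowed gradient products; the window size and k value are conventional defaults, not necessarily the project's.

# Hypothetical Harris corner response sketch; the image-mapping (insertion)
# step is omitted, and this is not the project's implementation.
import numpy as np
from scipy import ndimage

def harris_response(gray, k=0.04, sigma=1.5):
    # R = det(M) - k * trace(M)^2, where M is the windowed structure tensor.
    gray = gray.astype(np.float32)
    iy, ix = np.gradient(gray)                       # image gradients
    sxx = ndimage.gaussian_filter(ix * ix, sigma)    # windowed gradient products
    syy = ndimage.gaussian_filter(iy * iy, sigma)
    sxy = ndimage.gaussian_filter(ix * iy, sigma)
    det = sxx * syy - sxy * sxy
    trace = sxx + syy
    return det - k * trace * trace

def strongest_corners(gray, count=50):
    # Pick the `count` largest responses (no non-maximum suppression here).
    r = harris_response(gray)
    idx = np.argsort(r, axis=None)[-count:]
    return np.column_stack(np.unravel_index(idx, r.shape))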

Poster:

Video:

Tiro

Authors: Rashmi Deshmukh, Hardik Shah

Abstract: Our project was inspired by progress in the field of HCI (Human-Computer Interaction) and its multidisciplinary nature, in particular the development of human gesture recognition and how well computers now understand body motion. We implement a small game that uses a Monte Carlo tracking technique to track a marker on the wrist of a human arm. The game involves throwing a block into one of the slots shown on the display: the player first selects a slot, then must throw at the same speed into the same slot three times to win, and is given ten chances to do so. The main challenge of our project was to increase the accuracy of tracking the marker and the repeatability of the result after a throw, with the user standing at any distance from the camera. Since the game involves fast motion tracking of the arm, we paid special attention to speed, keeping in mind the limitations of the DSP kit.
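
A minimal Monte Carlo (particle filter) tracking sketch in the spirit described above, not the Tiro implementation: particles are candidate marker positions, weighted by color similarity at each particle and resampled every frame. The marker color, particle count, and noise levels are assumptions.

# Hypothetical Monte Carlo (particle filter) tracker for a colored wrist marker;
# not the Tiro implementation.
import numpy as np

class ParticleTracker:
    def __init__(self, frame_shape, n=300, motion_std=15.0, seed=0):
        self.h, self.w = frame_shape[:2]
        self.motion_std = motion_std
        self.rng = np.random.default_rng(seed)
        self.particles = np.column_stack([self.rng.uniform(0, self.h, n),
                                          self.rng.uniform(0, self.w, n)])

    def step(self, frame_rgb, marker_color=(255, 0, 0), color_std=40.0):
        # 1) Motion model: a random walk wide enough to follow fast arm movement.
        self.particles += self.rng.normal(0, self.motion_std, self.particles.shape)
        self.particles[:, 0] = np.clip(self.particles[:, 0], 0, self.h - 1)
        self.particles[:, 1] = np.clip(self.particles[:, 1], 0, self.w - 1)
        # 2) Measurement model: weight each particle by how closely the pixel
        #    under it matches the marker color.
        rows = self.particles[:, 0].astype(int)
        cols = self.particles[:, 1].astype(int)
        diff = frame_rgb[rows, cols].astype(np.float32) - np.asarray(marker_color, np.float32)
        weights = np.exp(-(diff ** 2).sum(axis=1) / (2 * color_std ** 2)) + 1e-12
        weights /= weights.sum()
        # 3) The position estimate is the weighted mean; resampling concentrates
        #    particles around likely marker locations for the next frame.
        estimate = (self.particles * weights[:, None]).sum(axis=0)
        idx = self.rng.choice(len(weights), size=len(weights), p=weights)
        self.particles = self.particles[idx]
        return estimate                              # (row, col) of the tracked marker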

Video: YouTube Video