An HCI interface based on hand gestures

Chuan-Kai Yang  Yu-Chun Chen

Computer Graphics & Multimedia Lab., NTUST


 

Abstract

Human–computer interaction (HCI), which concerns how people interact with computers, has long been an important and popular research field. Though not yet completely realistic, the fancy HCI applications shown in science fiction movies such as Minority Report and Iron Man impressively demonstrate the potential of HCI technologies that will very soon become available. Compared with traditional keyboard/mouse interfaces, the exclusive use of the hands offers a more intuitive and natural way to communicate. Furthermore, the increasingly popular concept of ubiquitous computing calls for convenient and portable input devices, making hand gesture input even more attractive. For example, a smartphone equipped with hand gesture recognition could be a good input substitute for its intrinsically small touch screen or keypad. Rather than data gloves, which capture hand gestures through relatively expensive electronic devices, we are more interested in recognizing the gestures of a bare hand. In this regard, there exist works that can track a 2D articulated hand model. In this paper, we further improve the computational efficiency of such tracking and propose novel interfaces to be coupled with the hand tracking system for better user-friendliness.
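To give a flavor of bare-hand tracking, the sketch below shows a minimal, illustrative pipeline: skin-color segmentation, extraction of the largest contour, and a rough comparison against a stored hand-contour template (cf. Figure 2). This is not the authors' implementation; the paper's system tracks a 2D articulated hand model with a sweep tracker (Figure 3). The use of OpenCV, the YCrCb skin-color bounds, and the file name hand_template.png are all assumptions made for illustration only.

```python
import cv2
import numpy as np

# Hypothetical template: a binary image of an open hand (assumed file name).
template_img = cv2.imread("hand_template.png", cv2.IMREAD_GRAYSCALE)
_, template_bin = cv2.threshold(template_img, 127, 255, cv2.THRESH_BINARY)
template_contour = max(
    cv2.findContours(template_bin, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[0],
    key=cv2.contourArea,
)

# Illustrative skin-color bounds in YCrCb space (not tuned for any particular setup).
LOWER_SKIN = np.array([0, 135, 85], dtype=np.uint8)
UPPER_SKIN = np.array([255, 180, 135], dtype=np.uint8)

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    ycrcb = cv2.cvtColor(frame, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, LOWER_SKIN, UPPER_SKIN)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        hand = max(contours, key=cv2.contourArea)        # assume the largest skin blob is the hand
        score = cv2.matchShapes(hand, template_contour,  # lower score = closer to the template
                                cv2.CONTOURS_MATCH_I1, 0.0)
        cv2.drawContours(frame, [hand], -1, (0, 255, 0), 2)
        cv2.putText(frame, f"match: {score:.3f}", (10, 30),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
    cv2.imshow("hand", frame)
    if cv2.waitKey(1) & 0xFF == 27:                      # press Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```

A contour-only comparison like this cannot distinguish finger articulations reliably, which is why an articulated 2D hand model and a more efficient search, as proposed in the paper, are needed in practice.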


Download
[PPT] [Video]

Figures

Figure 1: The angular search, cited from

Figure 2: The hand contour template

Figure 3: Sweep tracker algorithm and its improved version

Figure 4: A color change to indicate the occurrence of a depth change event