A computer-vision-based gesture system was created that allows a gloved or non-gloved hand to control an electronic medical record (EMR) system: selecting patients from a database, selecting and retrieving radiologic images or laboratory data, and adjusting image-viewing parameters (such as degree of zoom and level of attenuation). Hands are tracked in a two-dimensional space in front of a digital video camera. Images are arranged in virtual panels that form a vertically standing image cylinder (a “Gibson cylinder”). The user’s viewpoint is that of the center of the cylinder looking out toward the panels. Moving the hand in one of four directions triggers rotation or vertical movement of the cylinder, shifting images into and out of the user’s view. Hand gesture tracking is accomplished using a modified and enhanced version of the CAMSHIFT algorithm [2], which is based on color and motion cues. A state machine switches between user modes such as “zoom” and “rotate”. Twenty hospital employees tested three interface tasks: hand calibration for tracking, image navigation, and inducing “sleep” mode.
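The four-direction navigation scheme can be sketched as a small model: the dominant axis of a hand displacement, when it exceeds a dead-zone threshold, either rotates the cylinder (horizontal sweep, wrapping around) or shifts it vertically (vertical sweep, clamped at the ends). This is a minimal illustration; the class and parameter names, panel counts, and the 40-pixel threshold are assumptions, not the paper's values.

```python
class ImageCylinder:
    """Toy model of the vertical image cylinder: n_cols panels around the
    circumference, n_rows stacked vertically (dimensions are hypothetical)."""

    def __init__(self, n_rows=3, n_cols=8):
        self.n_rows, self.n_cols = n_rows, n_cols
        self.row, self.col = 0, 0  # panel currently facing the user

    def move(self, dx, dy, threshold=40):
        """Interpret a 2-D hand displacement in pixels: the dominant axis,
        if it exceeds the dead-zone threshold, moves the cylinder."""
        if max(abs(dx), abs(dy)) < threshold:
            return  # small jitter: ignore
        if abs(dx) >= abs(dy):
            # horizontal sweep -> rotate the cylinder (wraps around)
            self.col = (self.col + (1 if dx > 0 else -1)) % self.n_cols
        else:
            # vertical sweep -> raise/lower the view (clamped at the ends)
            self.row = min(self.n_rows - 1,
                           max(0, self.row + (1 if dy > 0 else -1)))
```

Because rotation wraps while vertical motion clamps, every panel on the ring stays reachable with single-direction sweeps.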
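At the core of CAMSHIFT is a mean-shift step: slide a search window toward the centroid of the probability mass it covers (for hand tracking, a skin-color back-projection), repeating until the window stops moving; CAMSHIFT then adapts the window size each frame. The sketch below implements only the mean-shift core on a precomputed probability image, under stated assumptions about the window format; it omits color back-projection, the size adaptation, and the paper's motion-cue enhancements.

```python
import numpy as np

def mean_shift(prob, window, max_iter=20, eps=1.0):
    """Slide a (x, y, w, h) window (x, y = top-left) toward the centroid of
    the probability mass it covers, iterating until the shift is below eps."""
    x, y, w, h = window
    for _ in range(max_iter):
        roi = prob[y:y + h, x:x + w]
        m00 = roi.sum()
        if m00 == 0:
            break  # no mass under the window; nothing to track
        ys, xs = np.mgrid[0:roi.shape[0], 0:roi.shape[1]]
        cx = (roi * xs).sum() / m00  # centroid within the window
        cy = (roi * ys).sum() / m00
        # Re-center the window on the centroid, clamped to the image.
        nx = int(round(x + cx - w / 2))
        ny = int(round(y + cy - h / 2))
        nx = max(0, min(nx, prob.shape[1] - w))
        ny = max(0, min(ny, prob.shape[0] - h))
        if abs(nx - x) < eps and abs(ny - y) < eps:
            x, y = nx, ny
            break
        x, y = nx, ny
    return (x, y, w, h)
```

In practice one would use an existing implementation such as OpenCV's `cv2.CamShift` on a hue-histogram back-projection rather than reimplementing this loop.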
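The mode-switching state machine can be sketched as a transition table keyed on (current mode, gesture event). The mode and event names below beyond “zoom”, “rotate”, and “sleep” are illustrative assumptions; the paper does not specify the triggering gestures used here.

```python
class GestureModeMachine:
    """Minimal mode state machine: named gesture events switch among
    interaction modes; unrecognized (mode, event) pairs are ignored."""

    TRANSITIONS = {
        ("browse", "open_palm_hold"): "zoom",
        ("zoom", "open_palm_hold"): "browse",
        ("browse", "circular_motion"): "rotate",
        ("rotate", "circular_motion"): "browse",
        ("browse", "hand_drop"): "sleep",  # "sleep" suspends tracking
        ("sleep", "wave"): "browse",       # a wake gesture resumes browsing
    }

    def __init__(self):
        self.mode = "browse"

    def on_event(self, event):
        # Stay in the current mode if the pair has no defined transition.
        self.mode = self.TRANSITIONS.get((self.mode, event), self.mode)
        return self.mode
```

An explicit table like this makes it easy to verify that no gesture can, for example, leave “sleep” mode other than the designated wake gesture.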