IEEE Trans Neural Syst Rehabil Eng. Author manuscript; available in PMC 2010 October 1.
Published in final edited form as:
PMCID: PMC2843820
NIHMSID: NIHMS170323

Incorporating haptic effects into three-dimensional virtual environments to train the hemiparetic upper extremity

Abstract

Current neuroscience has identified several constructs to increase the effectiveness of upper extremity rehabilitation. One is the use of progressive, skill acquisition-oriented training. Another approach emphasizes the use of bilateral activities. Building on these principles, this paper describes the design and feasibility testing of a robotic / virtual environment system designed to train the arm of persons who have had strokes. The system provides a variety of assistance modes, scalable workspaces and hand-robot interfaces allowing persons with strokes to train multiple joints in three dimensions. The simulations utilize assistance algorithms that adjust task difficulty both online and offline in relation to subject performance. Several distinctive haptic effects have been incorporated into the simulations. An adaptive master-slave relationship between the unimpaired and impaired arm encourages active movement of the subject's hemiparetic arm during a bimanual task. Adaptive anti-gravity support and damping stabilize the arm during virtual reaching and placement tasks. An adaptive virtual spring provides assistance to complete the movement if the subject is unable to complete the task in time. Finally, haptically rendered virtual objects help to shape the movement trajectory during a virtual placement task. A proof of concept study demonstrated this system to be safe, feasible and worthy of further study.

Keywords: Cerebrovascular Accident, Neuroplasticity, Robotics, Upper Extremity, Virtual Reality

INTRODUCTION

A significant proportion of the persons surviving cerebrovascular accidents (CVA) suffer from residual sensory and motor impairments that affect their upper extremity function and can severely limit the functional independence of a person after a CVA [1], [2]. Newer studies show that robotically-facilitated repetitive movement training might be an effective stimulus for normalizing upper extremity motor control in persons with moderate to severe impairments who have difficulty performing unassisted movements [3], [4].

Hogan and colleagues designed a suite of robots starting with the MIT-MANUS, a 2 DOF robot that trained the shoulder and elbow in a horizontal plane [4]. Subsequent additions to their suite include a 1 DOF robot that can train shoulder movements in vertical or diagonal planes [5] and a third that trains the wrist in three DOF [6]. Participants interact with the end effector of these robots at the hand, and their arms are supported by external structures as needed. Participants' trajectories may be shaped using a haptic channel that limits undesirable movement deviating from a predetermined desired trajectory [7]. The PARIS system was designed to work in larger three-dimensional workspaces and to train, or to study, the effects that adjusting task parameters and distorting tasks or feedback have on motor learning and control [8]. The NeReBot and the MariBot are two wire-based robot systems designed to provide passive range-of-motion treatment to the shoulder and arm of patients with minimal active movement [9].

Exoskeleton robots provide an alternative to end-effector robots through their ability to control individual joint torques and velocities. The ARMin system allows a patient to interact with virtual environments using a principle described as minimal intervention [10]: the robot provides assistance only when the subject moves outside predetermined trajectories or ranges of joint torque. The PneuWREX is a four-DOF, pneumatically actuated exoskeleton with a grip sensor that allows subjects to train the hand and arm as a functional unit in a series of complex virtual environments [11]. The RUPERT system [12] is a portable, wearable exoskeleton robot that facilitates movement of the arm and shoulder and can support interactions with real-world objects.

Van der Linde et al. describe the Haptic Master, an admittance-controlled haptic robot that senses forces applied by the subject and moves the subject's arm in response to the applied forces. It is well suited to virtual environment interfaces and neurorehabilitation [13], [14]. Several robotic rehabilitation systems have been designed around the Haptic Master. Harwin et al. designed GENTLE/S, a system in which participants perform upper extremity movements using the Haptic Master in a series of virtual environments that follow a continuum of visual complexity [15]. The robot augments the participant's movement with a haptic spring-and-damper system that maintains a trajectory and velocity determined by the participant's therapist; this spring-and-damper control is modeled using the "bead" concept [16]. The Arm Coordination Training 3D Device uses the Haptic Master to study the kinematics of three-dimensional reaching activities performed in virtual space by persons with upper extremity hemiparesis [17]. The ADLER system, which uses the Haptic Master to facilitate real-world two- and three-dimensional reaching and pointing activities, can also be combined with a ring gimbal produced by Moog [14], adding three passive degrees of freedom that facilitate performance and allow measurement of the kinematics of more complex real-world activities of daily living [18]. In a pilot study, our group used the Haptic Master with a gimbal to train the hand and arm as a functional unit in virtual environments [19].

The systems described above utilize a variety of models and technologies to facilitate and augment upper extremity movement for persons with hemiparesis. One aspect common to the majority of these systems is that trajectories, velocities, and assistance levels are predetermined and maintained throughout the movement. The approach described in this paper differs in that it utilizes the Haptic Master's ability to measure forces, velocity, and position in real time, allowing online algorithms to adjust haptic effects such as assistance against gravity, assistance in the direction of the target, and damping. These adjustments can be applied during the movement, enabling the subject to accomplish the motor task with minimal external support. In addition, the level of assistance can be varied from trial to trial depending on the subject's performance throughout a session, maximizing the participant's output while maintaining a reasonable success rate. Finally, the Haptic Master, as a newer-generation, admittance-controlled robot, combines the ability to render minimal friction with the capacity to create very rigid constraints that can be used to present haptic objects in virtual environments for the indirect shaping of arm movement trajectories.

Another line of inquiry has suggested that symmetrical movements of the upper extremities may activate similar neural networks in both hemispheres and may facilitate inherent interlimb coordination resulting in improved functional therapeutic outcomes for persons with hemiplegia [20]. It is postulated that during symmetrical bilateral movements, both hemispheres are activated and abnormal interhemispheric cortical inhibition is balanced [21]. Several authors cite improvements in hemiparetic upper extremity motor performance as a result of bilateral upper extremity training [22], [23], [24].

This paper describes the design process and rehabilitation applications of three simulations for the Haptic Master. Key features of these rich virtual environments include haptic effects, custom visual presentations, 3D scalable workspaces, direct motion analysis, and adaptive algorithms that modify task difficulty based on the user's success rate. One simulation facilitates bilateral, symmetrical movement of the two upper extremities; the other two are for unilateral training.

METHODS

The simulations utilize the Haptic Master [14], a 3-degree-of-freedom, admittance-controlled (force-controlled) robot. Three more degrees of freedom (yaw, pitch, and roll) can be added to the arm by using a gimbal, with force feedback available only for pronation/supination (roll). A three-dimensional force sensor measures the external force exerted by the user on the robot; the velocity and position of the robot's endpoint are measured as well. These variables are sampled at 1000 Hz and are used to generate reactive motion based on the properties of the virtual haptic environment in the vicinity of the robot's current endpoint location. This allows the robotic arm to act as an interface between the participants and the virtual environments, enabling multiplanar movements against gravity in a 3D workspace. The Haptic Master Application Programming Interface (API) allows one to program the robot to produce haptic effects, such as spring, damper, and constant force, and to create haptic objects such as blocks, cylinders, and spheres, as well as walls, floors, ramps, and complex surfaces. These effects can be used to provide a haptic interface with realistic haptic sensation that closely simulates the forces found in upper extremity tasks [13].
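The admittance-control principle described above, in which the measured external force is mapped to commanded motion at each 1 ms tick, can be illustrated with a minimal one-axis Python sketch. The virtual mass and damping values below are hypothetical placeholders, not the Haptic Master's actual parameters.

```python
DT = 0.001  # 1000 Hz control rate, as described in the paper

def admittance_step(f_ext, v, m=2.0, b=5.0):
    """One explicit Euler step of m*dv/dt + b*v = f_ext.

    Maps the measured external force to a new commanded velocity, so the
    endpoint accelerates when pushed and coasts to rest when released.
    """
    a = (f_ext - b * v) / m
    return v + a * DT

# A constant 10 N push converges toward the steady-state velocity f/b = 2 m/s.
v = 0.0
for _ in range(10_000):  # 10 s of simulated pushing
    v = admittance_step(10.0, v)
```

Because the controller tracks an admittance model rather than commanding forces directly, very low apparent friction and very stiff virtual surfaces can be rendered by changing only the model parameters.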

An important goal for our utilization of the Haptic Master was to take advantage of its multiplanar, 3-dimensional workspace. To accomplish this goal and to accommodate patients with a variety of impairments, it was necessary to design several modalities to interface the paretic upper extremity of our subjects with the Haptic Master robotic arm. Subjects with the highest levels of finger function simply grasp a stationary 1.5 inch diameter sphere connected to the Haptic Master. For subjects with less strength and endurance, we designed a glove that secures the sphere at the end of the Haptic Master to the palm of the subject (Figure 1a). For subjects with wrist and finger dystonia, a ball and socket joint is attached to the palmar surface of a resting hand splint (Figure 1b). Both the glove and the ball and socket joint add additional degrees of freedom that are not measured by the Haptic Master. Moog produces a ring gimbal that can support the weight of the upper extremity and adds an additional 3 degrees of freedom; this gimbal was not utilized by the subjects described in this paper [14].

Fig. 1
a. Glove attached to the end effector of the Haptic Master. b. For more involved subjects, a commercial resting hand splint was attached to the end-effector of the Haptic Master via ball and socket joint.

Simulation 1: Reach-Touch

In the Reach-Touch simulation, the participant moves a virtual sphere in a 3-dimensional space in order to touch a series of ten haptically rendered targets (Fig. 2a). After reaching a target, the subject must bring the cursor back to the starting position, defined by a haptically rendered torus at the bottom of the screen. As soon as the cursor sphere is placed within the torus, the next target to be touched begins to flash. The goal of the task is to improve the speed and accuracy of a wide variety of shoulder and elbow movements within the context of aiming/reaching movements performed in a functional workspace. The working space of the Haptic Master can be raised or lowered by changing the relationship of the robot's initial physical position to the center of the virtual workspace, allowing the system to easily accommodate a wide variety of subject heights and active ranges of motion.

Fig. 2
a. Visual presentation of the Reach-Touch simulation. b. Velocity / Assistive Spring Force changes during one trial of Reach-Touch. Four seconds after velocity toward the target approaches zero, the assistive force is initiated. The endpoint velocity ...

The visual interfaces used in all three simulations presented in this study were programmed in C++ using the OpenGL library. We used stereoscopic glasses to enhance depth perception and present movement targets to the subjects in a 3-dimensional stereo working space. CrystalEyes stereoscopic glasses [25] were used to present the 3-dimensional visual environments. This process employs two graphics buffers, one for the left eye and one for the right eye. The glasses block one eye at a time at the same frequency as the monitor's refresh rate. This synchronization allows the right eye to see the right graphics buffer and the left eye to see the left graphics buffer, which produces a 3-dimensional stereo effect.

One of the goals of the Reach-Touch simulation is to increase the participant's active shoulder range of motion. Three haptic effects have been developed for this simulation to accommodate patients with varying degrees of impairment. One assistance mode provides an adjustable haptic spring that draws the subject toward the target. The assistive force (spring stiffness) starts at zero and gradually increases in 5 N/m increments every 10 milliseconds when neither the hand velocity nor the active force applied by the subject to the robot exceeds its predefined threshold within 5 seconds after movement onset (Fig. 2b). Current values of active force and hand velocity are compared online with the threshold values; if either is above its threshold, the spring stiffness decreases in 5 N/m increments every 10 milliseconds. The spring stiffness ranges from 0 to 10000 N/m. The velocity threshold is predefined according to the mean movement velocity recorded from a group of neurologically healthy subjects and varies among the ten target spheres.
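The stiffness-adaptation rule above can be sketched as a single per-tick update. The velocity and force thresholds below are illustrative placeholders, not the recorded norms from the healthy-subject group; the step size, stiffness ceiling, and 5 s grace period come from the description above.

```python
K_STEP = 5.0       # N/m change per 10 ms tick, from the paper
K_MAX = 10_000.0   # stiffness ceiling (N/m), from the paper
ONSET_DELAY = 5.0  # seconds after movement onset before assistance grows

def update_stiffness(k, t, velocity, force, v_thresh=0.1, f_thresh=2.0):
    """Return the spring stiffness after one 10 ms control tick.

    t is seconds since movement onset; velocity and force are the current
    hand velocity and active force toward the target.
    """
    if velocity > v_thresh or force > f_thresh:
        k -= K_STEP            # subject is active: withdraw assistance
    elif t >= ONSET_DELAY:
        k += K_STEP            # stalled past the grace period: assist more
    return min(max(k, 0.0), K_MAX)

# A subject who stalls from t = 5 s onward accumulates assistance:
k = 0.0
for tick in range(100):        # 1 s of stalled movement
    k = update_stiffness(k, 5.0 + tick * 0.01, velocity=0.0, force=0.0)
```

The symmetric increment and decrement keep the assistance continuous, so the subject never feels a sudden jump in spring force.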

A second haptic effect, an invisible virtual ramp, was designed to allow subjects with force-generation impairments to perform three-dimensional reaching movements against gravity. The ramp runs through the starting position and the target, and friction between the ramp and cursor is negligible. Support from the ramp, delivered through the Haptic Master, offsets a percentage of the gravitational force the participant must overcome, determined by the angle the ramp forms with the ground. As a result, the force necessary to move the upper extremity against gravity toward the target is reduced. The ramp also decreases arm instability, making the movement less tiring and frustrating for more impaired subjects. A third haptic effect, a range restriction, limits the participant's ability to deviate from an ideal trajectory toward each target, thus shaping the trajectories. This effect was not utilized in this experiment but will be incorporated into future experiments. All of the haptic effects can be modified to provide less assistance as participants improve.
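The ramp's gravity reduction follows directly from inclined-plane statics: on a frictionless ramp, only the component of weight along the ramp must be overcome. A short sketch, with a hypothetical 2.5 kg effective load (not a value from the study):

```python
import math

G = 9.81  # m/s^2

def force_along_ramp(mass, angle_deg):
    """Force (N) needed to push a mass up a frictionless ramp at angle_deg,
    compared with mass * G for a straight vertical lift."""
    return mass * G * math.sin(math.radians(angle_deg))

# With a hypothetical 2.5 kg effective arm-plus-cursor load, a 30-degree
# ramp halves the gravity term relative to a vertical lift:
needed = force_along_ramp(2.5, 30.0)   # about 12.3 N versus 24.5 N
```

Shallower ramp angles therefore give weaker subjects proportionally more support, which is how a therapist can grade the difficulty of the reach.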

Simulation 2: Cup Placing

The goal of the cup placing simulation is to improve active range of motion and reaching accuracy. The screen displays a three-dimensional room with haptically rendered shelves and table. The participant uses their virtual hand (hemiparetic side) to lift virtual cups and place them onto one of three spots on three shelves (Fig. 3a). Hand movement and viewpoint movement within the virtual environment are synchronized to maintain a clear view of the virtual hand throughout the activity in order to maintain focus on the task and increase the sense of involvement in the activity. A small target that is a different color than the virtual hand denotes the area of the hand used to grasp the cup handle and a rectangular target indicates the correct placement of the cup on the shelf. The size of the targets can be modified as the subject improves.

Fig. 3
a. Visual presentation of the Cup Placing simulation. b. Percentage change in kinematic measures following arm training using the Haptic Master during the Cup Placing Simulation c. Depiction of a single subject training with and without haptic effects ...

Dimensions of the cup placing task are calibrated to the subject's active range of motion and can be modified to be consistent with individual therapeutic goals. Calibration measures a subject's maximum reach 1) up and to the left, 2) up and to the right, 3) down and to the left, and 4) down and to the right. The width of the shelves is 80% of the shortest excursion to the left or right elicited by these movements. The lowest and highest shelves are set at 80% of the lowest and highest excursions, and the distance of the shelves from the subject is 80% of the shortest horizontal excursion accomplished during the test. The calibration protocol itself can be used as an outcome measure.
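The 80% scaling rule above can be sketched as a small calibration function. The coordinate convention and the example reach values are invented for illustration; only the four-reach protocol and the 80% factor come from the text.

```python
def calibrate_workspace(reaches, scale=0.8):
    """Derive shelf placement from four maximal-reach points.

    reaches: four (x, y, z) points in metres, measured up-left, up-right,
    down-left, and down-right, with x lateral (signed), y vertical, and
    z forward from the subject.
    """
    width   = scale * min(abs(x) for x, _, _ in reaches)  # shortest lateral excursion
    highest = scale * max(y for _, y, _ in reaches)       # top shelf height
    lowest  = scale * min(y for _, y, _ in reaches)       # bottom shelf height
    depth   = scale * min(z for _, _, z in reaches)       # shortest forward excursion
    return {"width": width, "highest": highest, "lowest": lowest, "depth": depth}

# Example with invented reach data (metres):
ws = calibrate_workspace([(-0.40, 0.50, 0.45), (0.50, 0.55, 0.50),
                          (-0.35, -0.30, 0.40), (0.45, -0.25, 0.42)])
```

Scaling to 80% of maximal reach keeps every cup position attainable while still requiring near-maximal excursions, and rerunning the calibration over time yields the range-of-motion outcome measure mentioned above.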

Haptic feedback is employed in this simulation. After the subject acclimates to the virtual environment, collisions with the table, shelves, and other cups provide normal feedback and feed-forward processes, thus assisting in shaping the subjects' arm trajectories. The "weight" of the haptic cups can be adjusted, which allows weighted strengthening activities for less impaired subjects as well as anti-gravity-assisted movement for weaker subjects. An optional damping effect can be applied by the Haptic Master, which stabilizes the subject's movement trajectory in 3 dimensions. The augmented force feedback provided by the damping effect reduces the need for the impaired user to grade forces as they must when the Haptic Master is moving freely. Again, these effects may be modified depending upon the subjects' performance. The goal of the haptic effects described in this simulation is to allow the subject to train reaching movements with minimal external support or guidance by manipulating spatial task parameters and force requirements and by utilizing haptically rendered obstacles (see Figures 3a and 3b).
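The adjustable cup "weight" and the stabilizing damping combine into a single rendered force. A minimal vertical-axis sketch follows; the mass, damping, and assist values are illustrative, not the clinical settings used in the study.

```python
G = 9.81  # m/s^2

def haptic_cup_force(cup_mass, damping, velocity, assist_fraction=0.0):
    """Vertical force (N) rendered at the hand while holding a virtual cup.

    assist_fraction in [0, 1] offsets that share of the cup's weight
    (anti-gravity support); damping opposes the current vertical velocity.
    """
    weight = cup_mass * G * (1.0 - assist_fraction)
    return -weight - damping * velocity

# Full anti-gravity support leaves only the stabilizing damping term:
f = haptic_cup_force(0.5, damping=10.0, velocity=0.2, assist_fraction=1.0)
```

Raising cup_mass turns the task into weighted strengthening; raising assist_fraction and damping supports and steadies a weaker subject's reach.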

Performance Measurement: Reach Touch and Cup Placing

The Reach-Touch and Cup Placing simulations can be used for performance testing. During testing, haptic assistance is disabled. In this study, subjects were given 10 seconds to reach each target or place each cup from the starting position. If the subject is not able to complete the reaching movement within the predefined time, the target disappears and the subject must go back to the starting position to initiate the next movement. Position, velocity and force data collected by the Haptic Master can be utilized to analyze changes in movement patterns as subjects participate in testing and training activities making separate kinematic measures superfluous.

The following measurements track progress during the Reach-Touch and Cup Placing simulations. Duration is measured as the time that elapses while touching all ten spheres or placing all nine cups, including the time necessary to return to the starting point between movements. The measurement of duration for these activities therefore offers insight into the efficiency and accuracy of the subjects' arm movements.

Smoothness of the trajectories is evaluated by integrating the squared third derivative of the trajectory length, calculated as:

$$\mathrm{NIJ} = \frac{T^5}{2L^2}\int_0^T J^2\,dt, \qquad (2)$$

where $T$ is the movement duration, $L$ is the length of the trajectory, $J = d^3L/dt^3$, and NIJ is the normalized integrated jerk.

This numerically describes the ability to produce smooth, coordinated, gross reaching movements without object manipulation, as opposed to disjointed collections of sub-movements [26], [27], [28]. Rohrer et al. [29] cite this ability as an indicator of neurological recovery in persons with strokes.
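Equation (2) can be estimated numerically from sampled data. The sketch below uses a clean synthetic minimum-jerk profile, whose analytic NIJ is exactly 360; real recordings would require filtering before triple differentiation.

```python
def normalized_jerk(s, dt):
    """Normalized integrated jerk (equation (2)) from samples of the
    trajectory length L(t), taken every dt seconds."""
    # Third finite difference approximates J = d^3 L / dt^3.
    jerk = [(s[i + 3] - 3 * s[i + 2] + 3 * s[i + 1] - s[i]) / dt ** 3
            for i in range(len(s) - 3)]
    integral = sum(j * j for j in jerk) * dt   # approximates the integral of J^2
    T = (len(s) - 1) * dt                      # movement duration
    L = s[-1] - s[0]                           # total trajectory length
    return T ** 5 / (2 * L ** 2) * integral

# Synthetic minimum-jerk trajectory length over T = 1 s, total length 0.3 m:
dt, T, L = 0.001, 1.0, 0.3
s = []
for i in range(1001):
    tau = i * dt / T
    s.append(L * (10 * tau**3 - 15 * tau**4 + 6 * tau**5))
nij = normalized_jerk(s, dt)   # close to the analytic value of 360
```

Because of the T^5/L^2 normalization, NIJ is dimensionless and can be compared across movements of different durations and extents; larger values indicate more fragmented, sub-movement-laden trajectories.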

Simulation 3: Bilateral Catching of Falling Objects

The goals of the simulation "Catching Falling Objects" are to enhance active and passive range of motion, as well as movement speed, of the paretic upper extremity by utilizing a coupled movement guided by the less impaired arm. Virtual hands are presented in a mono-view workspace. The movement is initiated by placing each virtual hand on a small circle. The object of the activity is to catch the falling target as quickly and as high on the screen as possible, although the subject can catch it at any point before it reaches the horizon. Subjects move in three dimensions, but only horizontal- and vertical-plane movement affects cursor movement. This eliminates depth-perception issues because the subjects are not required to orient the cursors in the transverse plane.

This simulation utilizes active bimanual, symmetrical movement of the subject's upper extremities by employing a master-slave relationship between the unimpaired and impaired arms, similar to the systems described by Fasoli [26] and Houtsma [30]. Our system differs in its ability to adaptively maximize active participation of the impaired arm. A Flock of Birds motion sensor [31] attached to the less impaired hand guides the robot, which is attached to the participant's impaired arm (Fig. 4a). An initial symmetrical relationship (relative to the patient's midline) between the motion sensor and the Haptic Master is established prior to the start of the game and maintained throughout the game using a virtual spring mechanism. At the highest levels of the virtual spring's stiffness, the Haptic Master guides the subject's arm in a perfect 1:1 mirrored movement. As the trajectory of the subject's hemiparetic arm deviates from the mirror image of the trajectory of the sensor (held in the uninvolved hand), the assistive virtual spring is stretched, exerting a force on the subject's impaired arm that draws it back toward the mirror image of the uninvolved arm's trajectory (Fig. 4b). If the subject successfully hits the falling object a predetermined number of times in a row (three in this study), the spring stiffness diminishes, and the subject must exert a greater force with the hemiparetic arm in order to maintain the symmetrical arm trajectory required for continued success. If the subject cannot touch the falling object by exerting the necessary force, the target is not successfully caught. After a predetermined number of consecutive misses (three in this study), the virtual spring stiffens again to assist the subject. In this way, the adaptive algorithm maximizes the active force generated by the impaired arm.
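The success-driven stiffness schedule above can be sketched as a small state machine. The stiffness values, step size, and bounds are hypothetical; the three-catch/three-miss streak rule comes from the study description, and positions are one-dimensional for brevity.

```python
class AdaptiveMirrorSpring:
    """Virtual spring tying the impaired hand to the mirror image of the
    unimpaired hand, with success-driven stiffness adaptation."""

    def __init__(self, k=300.0, k_step=50.0, k_min=0.0, k_max=600.0, streak=3):
        self.k, self.k_step, self.k_min, self.k_max = k, k_step, k_min, k_max
        self.streak = streak            # consecutive catches/misses that trigger a change
        self.hits = self.misses = 0

    def assist_force(self, p_mirror, p_impaired):
        """Spring force pulling the impaired hand toward the mirror position."""
        return self.k * (p_mirror - p_impaired)

    def record(self, caught):
        """Update stiffness after each falling-object trial."""
        if caught:
            self.hits += 1
            self.misses = 0
            if self.hits >= self.streak:       # sustained success: demand more effort
                self.k = max(self.k_min, self.k - self.k_step)
                self.hits = 0
        else:
            self.misses += 1
            self.hits = 0
            if self.misses >= self.streak:     # sustained failure: assist more
                self.k = min(self.k_max, self.k + self.k_step)
                self.misses = 0

spring = AdaptiveMirrorSpring()
for _ in range(3):
    spring.record(caught=True)    # three catches in a row lower the stiffness
```

Resetting the opposing counter on each outcome means only unbroken streaks change the stiffness, which keeps the assistance level stable under mixed performance.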

Fig. 4
a. Trajectories produced during five repetitions of Catching Falling Objects simulation demonstrated by a representative subject. The unimpaired arm position (left) was measured by an electromagnetic sensor. The impaired arm position (right) was measured ...

Performance Measurement

Integrated Active Force

The simulation Catching Falling Objects utilizes a bilateral symmetrical movement to increase the active movement abilities of the hemiparetic arm. We quantified the active movement of the hemiparetic arm by measuring the forces exerted by the subject on the Haptic Master and computing:

$$\mathrm{IFA} = \sum_t F_A\,\Delta t = \sum_t \bigl(F - k\,(P_{FB} - P_{HM})\bigr)\,\Delta t, \qquad (3)$$

where IFA is the integrated active force, $F_A$ is the force applied to the robot by the subject in the direction of the target, $F$ is the total force applied to the robot in the direction of the target, $k$ is the stiffness of the virtual spring that determines the force with which the robot assists the affected arm's movement, $P_{FB}$ is the position of the mirror image of the unaffected hand (see Simulation 3 above), and $P_{HM}$ is the position of the affected hand.
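Equation (3) amounts to subtracting the assistive spring contribution from the total measured force at each control tick and integrating the remainder. A one-dimensional sketch, with invented sample values:

```python
def integrated_active_force(samples, k, dt):
    """Equation (3) in one dimension: subtract the assistive spring term
    k * (P_FB - P_HM) from the total measured force F at each tick, then
    integrate the remaining active force over the trial."""
    return sum((f - k * (p_fb - p_hm)) * dt for f, p_fb, p_hm in samples)

# Invented trial: a constant 10 N total force while the spring contributes
# 5 N of assistance leaves 5 N of active force over a 1 s trial.
samples = [(10.0, 0.5, 0.4)] * 100                        # (F, P_FB, P_HM) at 100 Hz
ifa = integrated_active_force(samples, k=50.0, dt=0.01)   # about 5.0 N*s
```

Because the spring term is subtracted out, IFA isolates the subject's own contribution even though testing is performed with the assistive algorithm active.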

Feasibility Study: Subjects and Training Protocol

To test the feasibility of this system, we tested four subjects with chronic hemiparesis secondary to stroke. Subjects were selected based on the ability to actively extend the wrist of the hemiparetic limb at least 20° and extend the metacarpophalangeal (MCP) joints at least 10°, which fulfills or exceeds the motor requirements for the lower-functioning group of the EXCITE trial [32]. Subjects ranged from level 5 to level 7 on the Chedoke-McMaster Stroke Arm Impairment Inventory, a seven-point ordinal scale on which one corresponds to no active or reflexive movement and seven corresponds to rapid, isolated, against-gravity movement. The group ranged from 3 to 6 on the Chedoke-McMaster Stroke Hand Impairment Inventory, which is scored similarly [33]. Two subjects demonstrated no upper extremity spasticity, and the other two demonstrated mild to moderate spasticity as measured by a physical therapist using the Modified Ashworth Scale [34]. All patients were ambulatory without assistive devices, and each had intact light touch on the dorsum of the impaired hand. None of the subjects demonstrated behaviors consistent with hemi-sensory inattention or neglect as observed by an experienced physical therapist, but these constructs were not tested formally. All of the subjects reported normal or corrected-to-normal visual acuity and no visual field cuts on their intake history. Table 1 shows clinical and demographic data for the subjects. Subjects 1 and 2 trained three times per week for three weeks, and Subjects 3 and 4 trained four times per week for two weeks. Subjects were seated perpendicular to the Haptic Master with the robot in its neutral position and the interface knob 5 inches from the midpoint of the clavicle. Combinations of shoulder flexion, elbow extension, and horizontal adduction and abduction were trained.
They performed 100 repetitions of the Reach-Touch simulation, 99 repetitions of the Cup Placing simulation, and 50 repetitions of the Catching Falling Objects simulation. This training took about 90 to 105 minutes at the beginning of the training period, but as the subjects improved they were able to complete the same number of repetitions in 75 minutes. No adverse events or reactions occurred, and there were no complaints consistent with cybersickness, such as dizziness, nausea, or disorientation [35], despite the fact that one of the activities (Reach-Touch) used partially immersive graphics.

Table 1
Feasibility Study Subjects

PRELIMINARY RESULTS

During the testing activities, kinematic and force data were collected under unassisted conditions for the unimanual simulations. For the bilateral simulation, testing was performed in the presence of the assistive algorithm, and the amount of active force applied by the subject in the direction of the target was used as the test variable. These data, as well as pre- and post-test results for the Wolf Motor Function Test of Upper Extremity Function, were analyzed to compare robotically collected kinematic and performance data with behavioral tests of upper extremity function. Our small sample size and lack of a control group did not allow for testing to establish the efficacy of the system.

Figure 2c shows the percent change in the duration and the smoothness of the trajectory in the Reach-Touch simulation. All four subjects showed improvements in movement duration (31%, 35%, 44%, and 35%), while three of the four subjects demonstrated improvement in trajectory smoothness, by 66%, 50%, and 63%.

Figure 3b shows the percent changes in the same measures for the Cup Placing simulation for each of the four subjects. All subjects showed a decrease in the time needed to complete the task; the percent change in duration was 57%, 49%, 36%, and 26%. The improvement in the smoothness of the trajectories in all four subjects (91%, 84%, 32%, and 72%) suggests more neurologically integrated movements [28]. Figure 3(c, d) shows the hand trajectories generated by a representative subject in the Cup Placing activity pre- and post-training. In this simulation, the shelf and the table are haptically rendered as solid objects, so the moving cup cannot cross their surfaces. Figure 3c depicts a side view of a trajectory generated on Day 1 of training without haptic assistance, and another trajectory generated with additional damping and partial antigravity support. At the beginning of training the subject needed the haptic effects to stabilize the movement and to provide enough arm support to reach the virtual shelf. Because the shelf is haptically rendered, it helps the subject produce a trajectory that accommodates the spatial aspects of the placing movement (see the thick line near the shelf). However, Figure 3d shows that after two weeks of training this subject demonstrated a more normalized trajectory and could accommodate the placing movement to the shelf even without haptic assistance.

During the bilateral Catching Falling Objects activity, robotic assistance decreases as a participant's performance improves. Fig. 4b depicts the relationship between the assistive force and the difference in position between the motion sensor and the Haptic Master endpoint. The Haptic Master's assistive spring exerts greater force on the hemiparetic arm as its trajectory deviates further from the mirror image of the unimpaired arm's trajectory. The daily changes in active force for subject S2 are shown in Fig. 4c. The amount of force this subject was able to produce with her hemiparetic arm increased from a mean of 7.9 N during the first two days of training to 15.4 N during the last day of training. As training progressed, all four subjects increased the active force exerted by their hemiparetic arm by between 17% and 83%. Figure 4d shows the total active force exerted by the hemiparetic arm for each of the four subjects during this bilateral activity before and after eight training sessions.

Real-world upper extremity function was measured using the 15 timed items from the Wolf Motor Function Test (WMFT), an outcome measure utilized in the EXCITE trial, one of the largest trials of upper extremity rehabilitation in the stroke literature [32]. This group of tests consists of timed simple movements and standardized functional activities. Each activity has a 120-second time limit, and subjects unable to complete an item were given a score of 120. The WMFT also contains two strength measurements, which we did not collect. Results from the WMFT are summarized in Table 3. Subjects' pretest scores ranged from 54.4 seconds (s) to 179.6 s (mean (SD) = 111.8 (60.9)). Post-test scores ranged from 50.5 s to 187 s (mean (SD) = 96.2 (63.2)). Pre- to post-test percentage change ranged from −4% to 40% (mean (SD) = 13.7 (19)). Two subjects improved their aggregate time to complete all 15 timed items by more than 10%.

Table 3
Wolf Motor Function Item Times

DISCUSSION

The three simulations described in this paper were developed to provide multiplanar unimanual and bimanual activities for stroke rehabilitation and to utilize the haptic effects and adaptive algorithms afforded by the robot to intensify motor skills practice. To date, the iterations of this system and simulations have evolved into unique training modalities that can safely accommodate and challenge subjects with a range of impairments. All of the pilot subjects were able to perform training that combined single arm reaching and bimanual activities, even if they had difficulty with these types of activity in real world environments. The combination of assistance modes, scalable workspaces and hand-robot interfaces, allowed our subjects to train multiple joints in three dimensions without extensive support of their upper extremity.

One of the dilemmas in robot-assisted rehabilitation is to identify an optimal combination of two approaches to facilitating motor skill recovery. The first approach uses the robot as a "teacher" whose objective is to teach the patient, for a given motor goal, an optimal hand trajectory and/or pattern of inter-joint coordination. The second approach uses the robot as an "enabler" that provides the minimal assistance needed for the patient to accomplish the motor task. In the first case, the robot usually guides the subject along the desired endpoint trajectory or restricts the subject's movement away from this trajectory. Two methods of implementing this first strategy are the "bead" approach, which is based on minimum-jerk trajectory control [15], [16], and the haptically rendered channel described by Krebs et al. [7]. Both of these approaches limit deviation from predetermined trajectories and utilize extensive external support of the arm. The optimal amount of this type of guidance for maximizing recovery is not known. An important challenge when using this approach is to avoid making the patient's experience primarily passive, which would decrease the therapeutic effect [36]. One solution would be to reduce the stiffness of the controller to allow larger deviations from the desired trajectory. The admittance-based controller of our robot can generate high forces to create stiff virtual surfaces that either guide or restrict hand motion in 3D space. This feature can be used to restrict the subject's movement to the vicinity of a target trajectory, for example by creating a virtual "tube" centered on the planned trajectory (not analyzed in this study). Another possible strategy would be to amplify the subject's deviations from the desired trajectory in order to augment participants' error detection abilities or to exploit after-effects [37].

An additional approach would be to use haptic virtual environments to shape hand trajectories more indirectly. The ability of the admittance-controlled robot to generate precise haptic effects allows the creation of high-fidelity virtual objects, for example, the haptic shelves utilized in the cup-placing simulation and the ramp utilized in the reach-touch simulation. This offers a physical method of shaping 3D trajectories of the arm, important for transfer to real-world transport and reaching activities. A haptically rendered environment requires the participant to form an internal model of the virtual environment and to adapt their own self-generated motor programs to fit this internalized model [38]. The ability to generate, implement and fine-tune motor programs within physical task constraints is a skill set critical to independent function.

Another important aspect of human-robot interaction during rehabilitative training is the goal of minimizing robotic assistance, providing it only "as needed" [39]. In our approach the subject moves with limited external support and generates trajectories independently, with the objective of avoiding the human "slacking" that typically accompanies extensive external assistance [40]. Variable-stiffness springs can be utilized to maintain an acceptable rate of progress toward movement objectives, in combination with other haptic effects, such as antigravity assistance, stabilizing damping or haptically rendered obstacles (illustrated in figure 3c), that can be employed at the discretion of a therapist to train optimal trajectories. Limiting external assistance to the smallest degree required for task completion during practice, and adaptively increasing the difficulty of the practiced tasks, are consistent with theories of motor learning applied to stroke rehabilitation [41].
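An assist-as-needed variable-stiffness spring can be summarized by a simple update rule: stiffness grows only when progress toward the target stalls, and decays otherwise so the subject cannot slack against the robot. The Python sketch below is illustrative only; the function name, thresholds and gains are our assumptions, not values from our controller:

```python
def update_assist_stiffness(k, progress_rate, min_rate=0.05,
                            k_step=5.0, decay=0.9, k_max=200.0):
    """Adapt the stiffness (N/m) of an assistive virtual spring 'as needed'.

    k: current spring stiffness
    progress_rate: subject's rate of progress toward the target (m/s)
    If progress falls below min_rate, stiffness grows by k_step up to
    k_max; otherwise it decays, discouraging reliance on the robot.
    """
    if progress_rate < min_rate:
        return min(k + k_step, k_max)
    return decay * k
```

Called at a fixed interval during a trial, this rule supplies just enough force to keep the movement advancing while continually withdrawing support when the subject performs independently.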

These two overall approaches, "robot teaching a trajectory" and "robot enabling the movement through minimal assistance," are not mutually exclusive. Further studies are needed to determine how the two can be combined during training to optimize transfer of the acquired skills to activities of daily living. The flexibility afforded by the Haptic Master provides an opportunity to compare these approaches.

No claims of causation can be made at this stage of our inquiry, but each of our pilot subjects improved on kinematic measures during their robotic training activities. One of our subjects made substantial improvement on the WMFT, and the other three demonstrated smaller or no improvement. Subject S1, who demonstrated the highest level of spasticity in this study, experienced a weather-related increase in spasticity during the final three days of the study, which may partially explain his decrement in WMFT score (5% decrease). Subject S2 made substantial improvements on the WMFT (40%). Subjects S3 and S4 made smaller changes (13% and 7%, respectively). These two subjects were less impaired than S2, with scores approaching normal function on many items, indicating a possible ceiling effect on potential improvements as measured by the WMFT. This finding suggests that future studies should consider additional clinical tests that may be more sensitive to changes in upper arm movements. There were no significant relationships between kinematic and clinical testing data; however, ongoing studies with an expanded and more sensitive clinical testing battery and a larger subject pool with a wider range of impairments may elucidate this relationship.

Two of the three simulations discussed in this paper feature adaptive algorithms that modify activities based on success rates. This allows for training paradigms that continually and interactively move the motor outcome closer to the targeted skill. In our previous study of virtual reality-based hand rehabilitation after stroke, these adaptive algorithms were designed to maintain a success rate of approximately 80% [42], [43], [44]. Other authors have utilized assistance algorithms to adjust robotic assistance in order to maintain predetermined performance levels [38], [45]. These algorithms were designed to balance two goals: maximizing the subject's effort to facilitate recovery, and minimizing the subject's frustration with the activity. A similar approach was utilized in this study, although optimal success rates for stroke rehabilitation activities have not been definitively established.
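A success-rate-driven difficulty adaptation of this kind can be expressed as a small per-block update: after each block of trials, the difficulty rises when the subject exceeds the target rate and falls when performance drops below it. The following is a minimal sketch under assumed names and step sizes, not the algorithm used in our simulations:

```python
def adapt_difficulty(difficulty, successes, trials, target=0.8,
                     step=0.05, lo=0.1, hi=1.0):
    """Adjust task difficulty toward a target success rate (~80%).

    difficulty: normalized task difficulty in [lo, hi]
    successes, trials: outcomes of the most recent block of trials
    target: desired success rate the adaptation converges toward
    """
    rate = successes / trials
    if rate > target:
        return min(hi, difficulty + step)  # too easy: make it harder
    if rate < target:
        return max(lo, difficulty - step)  # too hard: ease off
    return difficulty  # on target: leave difficulty unchanged
```

In practice "difficulty" might map to target size, required movement distance, or allotted time; the update keeps the subject working near, but not past, the edge of their ability.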

Evidence strongly suggests that learning a new motor skill is essential for inducing functional plasticity [46]; therefore, the dynamic and adaptive formation of new motor skills, a critical variable in promoting motor change and neural plasticity, can be accomplished with these systems. A major limitation of this system is that it does not incorporate hand activities; the simulations in this paper were designed to train only the arm. We are currently designing simulations that provide simultaneous arm and hand training in virtual environments [19]. This is being accomplished by using the Haptic Master, its 3 DOF gimbal, an instrumented glove and a force-reflecting hand exoskeleton that will allow for robotically facilitated combinations of shoulder, elbow, wrist and individual finger movement.

Table 2
Robotically Collected Performance Data

REFERENCES

1. Duncan PW, Goldstein LB, Matchar D, Divine GW, Feussner J. Measurement of motor recovery after stroke: Outcome assessment and sample size requirements. Stroke. 1992;23(8):1084–1089. [PubMed]
2. Stewart KC, Cauraugh JH, Summers JJ. Bilateral movement training and stroke rehabilitation: A systematic review and meta-analysis. J Neurol Sci. 2005;244:89–95. [PubMed]
3. Patton JL, Mussa-Ivaldi FA. Robot-assisted adaptive training: custom force fields for teaching movement patterns. IEEE Trans Biomed Eng. 2004;51(4):636–646. [PubMed]
4. Krebs HI, Mernoff S, Fasoli SE, Hughes R, Stein J, Hogan N. A comparison of functional and impairment-based robotic training in severe to moderate chronic stroke: a pilot study. NeuroRehabilitation, 2008;23(1):81–87. [PubMed]
5. Krebs HI, Ferraro M, Buerger SP, Newbery MJ, Makiyama A, Sandmann M, Lynch D, Volpe BT, Hogan N. Rehabilitation robotics: pilot trial of a spatial extension for MIT-Manus. J NeuroEng Rehabil. 2004;1(1):5. [PMC free article] [PubMed]
6. Krebs HI, Volpe BT, Williams D, Celestino J, Charles SK, Lynch D, Hogan N. Robot-aided neurorehabilitation: a robot for wrist rehabilitation. IEEE Trans. Neural Syst. Rehab. Eng. 2007;15(3):327–335. [PMC free article] [PubMed]
7. Krebs HI, Palazzolo JJ, Dipietro L, Ferraro M, Krol J, Rannekleiv K, Volpe BT, Hogan N. Autonomous Robots. Vol. 15. Kluwer Academic Publishers; The Netherlands: 2003. Rehabilitation Robotics: Performance-Based Progressive Robot-Assisted Therapy; pp. 7–20.
8. Patton JL, Dawe G, Scharver C, Mussa-Ivaldi FA, Kenyon R. Robotics and Virtual Reality: The Development of a Life-Sized 3-D System for the Rehabilitation of Motor Function. Proc. 26th EMBC Annu. Int. Conf. Eng. Med Biol. Soc. 2004 [PubMed]
9. Rosati G, Gallina P, Masiero S. Design, implementation and clinical tests of a wire-based robot for neurorehabilitation. IEEE Trans. Neural Syst. Rehab. Eng., 2007;15(4):560–569. [PubMed]
10. Nef T, Mihelj M, Riener R. ARMin: a robot for patient-cooperative arm therapy. Med. Bio. Eng. Comput. 2007;45:887–900. [PubMed]
11. Sanchez RJ, Jr., Wolbrecht E, Smith R, Liu J, Rao S, Cramer S, Rahman T, Bobrow JE, Reinkensmeyer DJ. A Pneumatic Robot for Re-Training Arm Movement after Stroke: Rationale and Mechanical Design; Proc. of the 2005 IEEE 9th Int. Conf. on Rehab. Robotics (ICORR); 2005.
12. Sugar T, He J, Koeneman EJ, Herman R, Huang H, Schultz RS, Herring DE, Wanberg J, Balasubramanian S, Swenson P, Ward JA. Design and control of RUPERT: a device for robotic upper extremity repetitive therapy. IEEE Trans. Neural Syst. Rehab. Eng. 2007;15(3):336–346. [PubMed]
13. Van der Linde RQ, Lammertse P, Frederiksen E, Ruiter B. The HapticMaster, a new high-performance haptic interface. Proc. Eurohaptics. 2002:1–5.
14. Moog FCS Corporation Haptic Master. 2006. http://www.fcs-cs.com.
15. Harwin W, Loureiro R, Amirabdollahian F, Taylor M, et al. Marincek C, et al. Assistive Technology: Added Value to the Quality of Life. IOS Press; Amsterdam: 2001. The GENTLE/S project: A new method of delivering neurorehabilitation; pp. 36–41.
16. Amirabdollahian F, Loureiro R, Harwin W. Minimum Jerk Trajectory Control for Rehabilitation and Haptic Applications; Proc. 2002 IEEE Int. Conf. on Robotics & Automation; May, 2002. pp. 3380–3385.
17. Ellis MD, Sukal T, DeMott T, Dewald JP. Augmenting clinical evaluation of hemiparetic arm movement with a laboratory-based quantitative measurement of kinematics as a function of limb loading. Neurorehabil. Neural Repair. 2008;22(4):321–9. [PMC free article] [PubMed]
18. Wisneski KJ, Johnson MJ. Quantifying kinematics of purposeful movements to real, imagined, or absent functional objects: Implications for modelling trajectories for robot-assisted ADL tasks. Journal. of NeuroEngineering and Rehabil. 2007;4:7. [PMC free article] [PubMed]
19. Adamovich S, Fluet GG, Merians AS, Mathai A, Qiu Q. Proc. 28th EMBC Annu. Int. Conf. Eng. Medicine Biol. Soc. Vancouver, Canada: Aug, 2008. Recovery of Hand Function in Virtual Reality: Training Hemiparetic Hand and Arm Together or Separately; pp. 3475–3478. [PubMed]
20. Stinear JW, Byblow WD. Rhythmic bilateral movement training modulates corticomotor excitability and enhances upper limb motoricity post stroke: a pilot study. J. of Clin. Neurophysiol. 2004;21(6):124–131. [PubMed]
21. Cauraugh JH, Summers J. Neural plasticity and bilateral movements: a rehabilitation approach for chronic stroke. Progress in Neurobiol. 2005;75:309–320. [PubMed]
22. Lum PS, Burgar CG, Shor PC, Majmundar M, Van der Loos M. MIME robotic device for upper limb neurorehabilitation in subacute stroke subjects: A follow up study. J. of Rehab. Res. and Dev. 2006;43(5):631–642. [PubMed]
23. Whitall J, Waller S, Silver K, Macko R. Repetitive bilateral arm training with rhythmic auditory cueing improves motor function in chronic hemiparetic stroke. Stroke, 2000;31(10):2390–5. [PubMed]
24. McCombe-Waller S, Whitall J. Fine motor control in adults with and without chronic hemiparesis: baseline comparison to nondisabled adults and effects of bilateral arm training. Arch. Phys. Med. Rehabil. 2004;85(7):1076–83. [PubMed]
25. RealD/StereoGraphics CrystalEyes shutter eyewear. 2006. http://www.reald.com.
26. Fasoli SE, Krebs HI, Stein J, Frontera WR, Hogan N. Effects of robotic therapy on motor impairment and recovery in chronic stroke. Arch. of Phys. Medicine & Rehabil. 2002;84(2):477–82. [PubMed]
27. Tresilian JR, Stelmach GE, Adler CH. Stability of reach to grasp patterns in Parkinson's disease. Brain. 1997;120(11):2093–2111. [PubMed]
28. Adamovich SV, Berkinblit MB, Hening W, Sage J, Poizner H. The interaction of visual and proprioceptive inputs in pointing to actual and remembered targets in Parkinson's disease. Neuroscience. 2001;104(4):1027–41. [PubMed]
29. Rohrer B, Fasoli S, Krebs HI, Hughes R, Volpe B, Frontera WR, Stein J, Hogan N. Movement smoothness changes during stroke recovery. J. of Neurosci. 2002;22(18):8297–8304. [PubMed]
30. Houtsma J, Van Houten F. Virtual Reality and a Haptic Master-Slave Set-Up in Post-Stroke Upper-Limb Rehabilitation. Proc. of the Inst. of Mechanical Engineers, Part H, J. of Engineering in Medicine. 2006;220:715–718. [PubMed]
31. Ascension Technology Corporation Flock of Birds. 2006. http://www.ascension-tech.com.
32. Wolf S, Thompson P, Morris D, Rose D, Winstein C, Taub E, Giuliani C, Pearson S. The EXCITE Trial: Attributes of the Wolf Motor Function Test in Patients with Subacute Stroke. Neurorehabil. & Neural Repair. 2005;19(3):194–205. [PubMed]
33. Gowland C, DeBruin H, Basmajian J, Plews N, Nurcea I. Agonist and antagonist activity during voluntary upper-limb movement in patients with stroke. Phys. Ther. 1992;72(9):624–33. [PubMed]
34. Bohannon RW, Smith MB. Interrater reliability of a modified Ashworth scale of muscle spasticity. Phys. Ther. 1987;67(2):206–7. [PubMed]
35. Holden MK. Virtual environments for motor rehabilitation. Cyberpsychol. & Behav. 2005;8:187–211. [PubMed]
36. Hornby TG, Campbell DD, Kahn JH, Demott T, Moore JL, Roth HR. Enhanced gait-related improvements after therapist- versus robotic-assisted locomotor training in subjects with chronic stroke: a randomized controlled study. Stroke, 2008;39:1786–92. [PubMed]
37. Patton JL, Stoykov ME, Kovic M, Mussa-Ivaldi FA. Evaluation of Robotic training forces that either enhance or reduce error in chronic hemiparetic stroke survivors. Exp. Brain Res., 2006;168(3):368–383. [PubMed]
38. Reinkensmeyer DJ, Emken JL, Cramer SC. Robotics, motor learning, and neurologic recovery. Annu. Rev. Biomed. Eng. 2004;6:497–525. [PubMed]
39. Wolbrecht E, Chan V, Reinkensmeyer DJ, Bobrow JE. Optimizing compliant, model-based robotic assistance to promote neurorehabilitation. IEEE Trans. on Neural Systems and Rehabil. Eng. 2008;16(3):286–297. [PubMed]
40. Todorov E. Optimality principles in sensorimotor control. Nature Neurosci. 2004:907–915. [PMC free article] [PubMed]
41. Majask M. Application of motor learning concepts to the stroke population. Topics in Stroke Rehabil. 1996;3(3):27–59.
42. Adamovich S, Merians A, Boian R, Tremaine M, Burdea G, Recce M, Poizner H. A virtual reality (VR)-based exercise system for hand rehabilitation after stroke. Presence, 2005;14:161–74.
43. Merians A, Jack D, Boian R, Tremaine M, Burdea G, Adamovich S, Recce M, Poizner H. Virtual reality-augmented rehabilitation for patients following stroke. Phys. Ther. 2002;82(9):898–915. [PubMed]
44. Merians A, Poizner H, Boian R, Burdea G, Adamovich S. Sensorimotor training in a virtual reality environment: does it improve functional recovery poststroke? Neurorehabil. and Neural Repair. 2006;20(2):252–67. [PubMed]
45. Hogan N, Krebs HI, Rohrer B, Palazzolo JJ, Dipietro L, Fasoli SE, Stein J, Hughes R, Frontera WR, Lynch D, Volpe BT. Motions or muscles? Some behavioral factors underlying robotic assistance of motor recovery. J. of Rehab. Res. & Dev. 2006;43(5):605–618. [PubMed]
46. Plautz E, Milliken G, Nudo R. Effects of repetitive motor training on movement representations in adult squirrel monkeys: role of use versus learning. Neurobiol. of Learning and Memory. 2000;74(1):27–55. [PubMed]