Stud Health Technol Inform. Author manuscript; available in PMC Jun 26, 2013.
PMCID: PMC3693446
NIHMSID: NIHMS479373
Simplifying Touch Data from Tri-axial Sensors Using a New Data Visualization Tool
Lawrence H. SALUD, MS, Calvin KWAN, BS (corresponding author), and Carla M. PUGH, MD, PhD
University of Wisconsin-Madison, Department of Surgery; Northwestern University, Department of Surgery
Calvin KWAN: kwan@surgery.wisc.edu; Carla M. PUGH: pugh@surgery.wisc.edu
Abstract
Quantification and evaluation of palpation is a growing field of research in medicine and engineering. A newly developed tri-axial touch sensor has been designed to capture a multi-dimensional profile of touch-loaded forces. We have developed a data visualization tool as a first step in simplifying interpretation of touch for assessing hands-on clinical performance.
Keywords: Sensors, Clinical Breast Exam, Haptics, Data Visualization
Quantifying and evaluating palpation is a growing field of research in medicine and engineering [1–4]. In our previous work, we explored novel touch sensor technologies that can electronically capture the forces produced by palpation during a Clinical Breast Examination (CBE) of a task-training simulator [5]. By embedding sensors in the CBE simulator, we can electronically capture where clinicians touch and how much force is applied during an examination. This capability enables clinical performance evaluators to interpret and assess how learners make clinical diagnoses using palpation. By applying a variety of computational techniques to the sensor data, we are able to mine for correlations between palpation and clinical diagnosis in different contexts [6–11]. While integrating electronic sensors with simulators is becoming increasingly common in medicine and engineering, the complexity and accuracy of sensor data interpretation remain challenging from a signal processing perspective.
A novel touch sensor, called a “tri-axial touch sensor,” has recently been developed [Figure 1] [12]. As a discrete touch-sensing device, this sensor was designed to decompose a touch load into three independent force vectors. In our previous work, we evaluated clinical palpation performance using only a one-dimensional force vector: normal force. To accommodate the addition of two shear-force channels, we have developed a visualization tool as a first step in simplifying the interpretation of touch for clinical performance assessment. Our goal for this project was to compare our signal processing method with that used by other researchers.
Figure 1
Tri-Axial Sensor (Left) and its application in the CBE to capture clinical palpations (Right).
2.1. The Tri-Axial Touch Sensor
Fabricated with Micro-Electromechanical Systems (MEMS) technology, the tri-axial touch sensor is designed to output a three-dimensional force profile comprising normal and shear forces. The sensor achieves its discrete, tri-axial design by orienting three silicon cantilevers perpendicular to one another in a novel and robust Printed Circuit Board (PCB) assembly. When the top of the sensor is loaded by touch, each cantilever responds to one direction of load. Because the cantilevers are mutually perpendicular, a three-dimensional force profile of touch can be acquired. In addition, the PCB is encapsulated in a polydimethylsiloxane (PDMS) mold that maintains load sensitivity and PCB visibility. Cantilever X is oriented to respond to shear loading in the x direction, and cantilever Y to shear loading in the y direction; together they respond when a user applies lateral forces to the top of the sensor. Cantilever Z is oriented to respond to normal loading perpendicular to the x and y directions; it responds in the z direction when one presses downward on the sensor [Figure 1].
In this novel approach to sensing touch, the sensor is designed such that the magnitude of the directional load on each cantilever is reported as a proportional output voltage: the greater the load applied to a cantilever, the higher the voltage amplitude. Furthermore, the sensor and hardware interface output amplitudes with different polarities depending on the direction of load; hence, quantified touch can be reported with both magnitude and polarity. Each cantilever is dedicated to a channel such that three separate outputs can be connected to a data acquisition (DAQ) system, enabling individual capture of the forces in the x, y, and z directions, Fx, Fy, and Fz [Figure 2a].
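As a minimal sketch of this magnitude-and-polarity convention, the snippet below (in Python, rather than the LabVIEW environment used in this work) interprets a single three-channel reading. The sign convention (positive x = Forward, positive y = Left, positive z = Up) and the numerical values are assumptions for illustration only.

```python
# Illustrative sketch: interprets one raw (Fx, Fy, Fz) voltage sample under the
# magnitude-and-polarity convention described above. The sign convention
# (positive x = Forward, positive y = Left, positive z = Up) is an assumption,
# not taken from the sensor specification.

DIRECTION_LABELS = {
    "x": ("Forward", "Back"),   # (positive polarity, negative polarity)
    "y": ("Left", "Right"),
    "z": ("Up", "Down"),
}

def interpret_sample(fx: float, fy: float, fz: float) -> dict:
    """Return a direction label and magnitude (|voltage|) for each channel."""
    sample = {"x": fx, "y": fy, "z": fz}
    result = {}
    for axis, volts in sample.items():
        positive, negative = DIRECTION_LABELS[axis]
        result[axis] = {
            "direction": positive if volts >= 0 else negative,
            "magnitude": abs(volts),
        }
    return result

if __name__ == "__main__":
    # Example: a forward-left shear combined with a downward press.
    print(interpret_sample(fx=0.42, fy=0.15, fz=-0.60))
```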
Figure 2
Conceptual illustration of the sensor’s axes and channel states (A) and their visual representation on the cube (B).
2.2. Data Acquisition
A PC laptop serves as the DAQ system and is connected to the sensor through a computer hardware interface comprising an NI USB-6210 data acquisition device from National Instruments and a custom Wheatstone bridge circuit with amplification.
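The acquisition front end in this work was built in LabVIEW; as a rough Python analogue under assumed settings, the nidaqmx package can stream the three amplified bridge outputs from a USB-6210. The device name ("Dev1"), the channel mapping ai0–ai2 to Fx, Fy, Fz, and the sampling parameters below are assumptions for illustration.

```python
# Rough Python analogue of the acquisition front end (the original used LabVIEW).
# Requires the nidaqmx package and NI-DAQmx drivers. The device name "Dev1" and
# the channel mapping ai0..ai2 -> Fx, Fy, Fz are assumptions.
import nidaqmx
from nidaqmx.constants import AcquisitionType

SAMPLE_RATE_HZ = 1000      # matches the 1 kHz software sampling described later
SAMPLES_PER_READ = 100

with nidaqmx.Task() as task:
    # Three amplified Wheatstone-bridge outputs, one per cantilever.
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0:2")
    task.timing.cfg_samp_clk_timing(
        rate=SAMPLE_RATE_HZ,
        sample_mode=AcquisitionType.CONTINUOUS,
    )
    task.start()
    # Each read returns three lists: [Fx samples, Fy samples, Fz samples].
    fx, fy, fz = task.read(number_of_samples_per_channel=SAMPLES_PER_READ)
    print(f"read {len(fx)} samples per channel")
```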
2.3. Data Visualization
We classified the sensor’s three force channels, Fx, Fy, and Fz, as three pairs of directions: Forward-Back, Left-Right, and Up-Down [Figure 2a]. Taking one direction from each pair yields eight possible combinations. To represent each combination in virtual space, we subdivided a cube into eight equal, enumerated sections. Each section visually represents one of the eight possible states [Figure 2b]. For example, Forward-Left-Up is state ‘1’.
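A minimal sketch of this octant numbering is shown below in Python. The text fixes only two assignments (Forward-Left-Up is state 1 and, later, Back-Left-Down is state 7); the numbering of the remaining six states in this table is an assumed enumeration.

```python
# Sketch of mapping the polarities of (Fx, Fy, Fz) to one of the cube's eight
# sections. Only states 1 (Forward-Left-Up) and 7 (Back-Left-Down) are given in
# the text; the other six assignments below are assumptions.

STATE_TABLE = {
    # (x direction, y direction, z direction): state number
    ("Forward", "Left",  "Up"):   1,
    ("Forward", "Right", "Up"):   2,
    ("Back",    "Left",  "Up"):   3,
    ("Back",    "Right", "Up"):   4,
    ("Forward", "Left",  "Down"): 5,
    ("Forward", "Right", "Down"): 6,
    ("Back",    "Left",  "Down"): 7,
    ("Back",    "Right", "Down"): 8,
}

def classify_state(fx: float, fy: float, fz: float) -> int:
    """Pick one direction from each pair based on channel polarity."""
    x_dir = "Forward" if fx >= 0 else "Back"
    y_dir = "Left" if fy >= 0 else "Right"
    z_dir = "Up" if fz >= 0 else "Down"
    return STATE_TABLE[(x_dir, y_dir, z_dir)]

if __name__ == "__main__":
    print(classify_state(0.3, 0.2, 0.5))    # Forward-Left-Up -> 1
    print(classify_state(-0.3, 0.2, -0.5))  # Back-Left-Down  -> 7
```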
We constructed this cube in software using the National Instruments LabVIEW graphical development environment, which allows the cube to be manipulated in three-dimensional space. A laptop displays our cube interface, an eight-section wireframe cube, with each section given a unique color so that it is distinguishable from adjacent sections. When the sensor is touched, the software provides spatial cues that visually communicate where it was touched. As shown, a texture map of state 7 is applied to visually communicate that the sensor was loaded in the “Back-Left-Down” direction [Figure 3]. To accomplish this in real time, the software repeatedly samples all three forces Fx, Fy, and Fz simultaneously, subtracts baseline offsets, and applies a linear transformation to the forces to minimize cross-channel crosstalk. The calibration coefficients, T, published by Zhao et al. were used to perform this linear transformation [12].
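The per-sample processing chain can be sketched as follows in Python. The numerical values of the baseline offsets and the 3x3 calibration matrix T are placeholders only; the actual coefficients are those published in reference [12].

```python
# Sketch of the per-sample processing chain described above: subtract the
# baseline offsets, then apply the calibration matrix T to reduce cross-channel
# crosstalk. BASELINE and T below are placeholder values; the real coefficients
# come from reference [12].
import numpy as np

BASELINE = np.array([0.12, 0.08, 0.10])   # per-channel offsets (V), placeholder
T = np.array([                            # 3x3 calibration matrix, placeholder
    [ 1.00, -0.05, -0.02],
    [-0.04,  1.00, -0.03],
    [-0.01, -0.02,  1.00],
])

def process_sample(raw: np.ndarray) -> np.ndarray:
    """Return calibrated (Fx, Fy, Fz) for one raw 3-channel voltage sample."""
    return T @ (raw - BASELINE)

if __name__ == "__main__":
    raw_sample = np.array([0.45, 0.20, -0.35])
    print(process_sample(raw_sample))
```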
Figure 3
Depiction of a force on the sensor and the visualization of the Cube Algorithm. A left-backward-down directional force is highlighted as state 7 of the cube.
Despite subtracting baseline offsets from the forces Fx, Fy, and Fz, the sensor exhibited some nominal activity, due in part to the active components of the custom circuit. To account for this activity, three pairs of minimum and maximum thresholds were defined to create a non-visible bounding box that encapsulates a “not touched” state. Based on preliminary touch-loading experiments with the sensor, we defined this bounding box in millivolts. Conceptually located at the center of the cube, this box, U, is defined as Ux-min = 50 mV, Ux-max = 310 mV on the x-axis; Uy-min = −50 mV, Uy-max = 180 mV on the y-axis; and Uz-min = −50 mV, Uz-max = 180 mV on the z-axis. The software samples all three forces Fx, Fy, and Fz simultaneously at 1 kHz to determine whether all three forces are outside of this bounding box. If they are, the software considers the sensor touched, determines the sensor state, and texture maps the associated section of the cube. When that section is no longer considered touched, the texture map is removed and the section’s wireframe is colored grey. At the end of a DAQ session, one can determine which sections were never touched by observing which sections of the wireframe maintained their original color.
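The bounding-box test itself is simple, as the sketch below illustrates using the millivolt thresholds quoted above. Whether the thresholds are applied before or after the calibration step is not stated; applying them to the baseline-corrected channel voltages, as done here, is an assumption.

```python
# Sketch of the "not touched" bounding-box test using the millivolt thresholds
# quoted in the text. Applying them to baseline-corrected channel voltages is an
# assumption about where in the pipeline the test occurs.

# (min, max) thresholds in millivolts, per axis, from the text.
BOUNDS_MV = {
    "x": (50.0, 310.0),
    "y": (-50.0, 180.0),
    "z": (-50.0, 180.0),
}

def is_touched(fx_mv: float, fy_mv: float, fz_mv: float) -> bool:
    """The sensor counts as touched only when all three forces fall outside
    their 'not touched' bounds."""
    sample = {"x": fx_mv, "y": fy_mv, "z": fz_mv}
    outside = []
    for axis, value in sample.items():
        lo, hi = BOUNDS_MV[axis]
        outside.append(value < lo or value > hi)
    return all(outside)

if __name__ == "__main__":
    print(is_touched(100.0, 0.0, 50.0))      # inside the box -> False
    print(is_touched(400.0, 250.0, -120.0))  # all outside    -> True
```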
2.4. Data Acquisition - Proof of Concept Analysis
After several practice trials, and with the audible aid of a metronome, a participant repeatedly rubbed the top of the tri-axial touch sensor with a fingertip in a shearing forward-and-back motion at approximately 1.5 Hz for about 10 seconds. We captured this motion with the DAQ system.
2.5. Data Analysis – Force Profile Samples
This motion was stored in the DAQ system as a force profile. Using MATLAB, we ran a Gaussian smoothing algorithm across the force profile to reduce noise [13]. We then chose four random samples, subtracted the baseline offsets from the forces Fx, Fy, and Fz, applied the linear transformation, and defined each sample as a set of three amplitudes with vector components Vx, Vy, and Vz. The four samples were treated as four vectors, and the Euclidean norm of each was recorded as its vector magnitude in a table. Furthermore, we classified a direction of touch as a transition between two states. Given the four vectors, directional magnitudes for all pairs of state transitions were calculated as Euclidean distances, with the point of origin at the starting state of each transition.
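A sketch of this offline analysis is given below, with SciPy's gaussian_filter1d standing in for the MATLAB smoothing routine of [13]. The smoothing width, the synthetic signal, and the chosen sample indices are assumptions for illustration.

```python
# Sketch of the offline analysis: Gaussian smoothing of each channel, per-sample
# vector magnitudes, and directional magnitudes between consecutive samples
# (Euclidean distance from the starting sample of each transition). The sigma,
# synthetic signal, and sample indices below are illustrative assumptions.
import numpy as np
from scipy.ndimage import gaussian_filter1d

def smooth_profile(profile: np.ndarray, sigma: float = 5.0) -> np.ndarray:
    """Gaussian-smooth an (n_samples, 3) force profile, channel by channel."""
    return np.column_stack(
        [gaussian_filter1d(profile[:, i], sigma=sigma) for i in range(3)]
    )

def vector_magnitudes(samples: np.ndarray) -> np.ndarray:
    """Euclidean norm of each (Vx, Vy, Vz) sample."""
    return np.linalg.norm(samples, axis=1)

def directional_magnitudes(samples: np.ndarray) -> np.ndarray:
    """Euclidean distance between consecutive samples (state transitions)."""
    return np.linalg.norm(np.diff(samples, axis=0), axis=1)

if __name__ == "__main__":
    t = np.linspace(0, 10, 10_000)        # ~10 s at 1 kHz
    # Synthetic forward-and-back shear at ~1.5 Hz, mostly on the x-axis.
    profile = np.column_stack([
        np.sin(2 * np.pi * 1.5 * t),
        0.1 * np.random.randn(t.size),
        -0.3 + 0.05 * np.random.randn(t.size),
    ])
    smoothed = smooth_profile(profile)
    picks = smoothed[[1200, 1533, 1866, 2200]]   # four samples A, B, C, D
    print("vector magnitudes:", vector_magnitudes(picks))
    print("directional magnitudes:", directional_magnitudes(picks))
```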
All common pairs of state transitions were grouped together. Once grouped, we determined their frequency of occurrence and average directional magnitude. We report these results in a table.
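The grouping step mirrors the layout of Table 2 and can be sketched as follows; the example transition values are invented for illustration.

```python
# Sketch of the grouping step: common state transitions (e.g. 1 -> 7) are
# pooled, then their frequency and mean directional magnitude are reported,
# mirroring the layout of Table 2. The example data are invented.
from collections import defaultdict
from statistics import mean

def summarize_transitions(transitions):
    """transitions: iterable of ((from_state, to_state), directional_magnitude)."""
    groups = defaultdict(list)
    for pair, magnitude in transitions:
        groups[pair].append(magnitude)
    return {
        pair: {"frequency": len(mags), "avg_directional_magnitude": mean(mags)}
        for pair, mags in groups.items()
    }

if __name__ == "__main__":
    observed = [((1, 7), 0.82), ((7, 1), 0.79), ((1, 7), 0.88)]  # invented values
    for pair, stats in summarize_transitions(observed).items():
        print(pair, stats)
```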
Signals Processing Comparison - In Wang’s study [3], normal force sensors are attached to the prostate models of a Digital Rectal Exam (DRE) simulator to record contact force. These forces are differentiated by filtering the signal, discretizing it into states, and determining the types of movements from sequences of sensor states. This algorithm provides separate filters for different patterns of movement and uses specified time delays to distinguish continuous motions (vibration) from discrete motions (tapping).
In contrast, our process provides directional information and detects changes in movement via peak detection. However, this approach is restricted to the area over the 3D sensor and applies one generalized filter across all movements. Nevertheless, both methods use a state-based approach, relate patterns of state sequences to a palpation profile, and can be applied to analyze performance and technique.
As shown in Figure 4(a), four samples (A, B, C, and D) of the three Gaussian-smoothed force profiles on the x, y, and z axes electronically capture the participant’s forward-and-back motion. Figure 4(b) shows visual representations of the states and state transitions of the four samples. The sequence of samples collectively matches the participant’s forward-and-back motion.
Figure 4
Waveforms produced from the proof-of-concept trial (left) and the corresponding visualization of the areas touched on the sensor (right).
Table 1 shows states, vector points, magnitudes, directions, and directional magnitudes for samples A, B, C and D. Table 2 groups common directions into frequency and average directional magnitude.
Table 1
Data table for force collected on the sensor.
Table 2
Table for frequency of directional movement and its magnitude.
We have introduced a tool for quantifying and evaluating clinical palpation by constructing a real-time visual representation of multi-dimensional touch-loading activity. We accomplished this by representing touch as sectional changes of a cube in virtual three-dimensional space. By connecting this visualization to the tri-axial touch sensor, we can convert multi-dimensional force components into summative forces that map to discrete visual representations of touch. Moreover, by providing this ability in real time, we have introduced a visual-haptic pattern recognition framework that could enable an intuitive method for assessing clinical performance with multiple, hierarchical levels of pattern-generating information.
Tagging systems that allow users to annotate digital resources are increasingly used to mine vast amounts of data [14–16]. We have demonstrated a tagging system that automatically classifies touch-loading activity into states of summative forces that can be decomposed into more detailed force properties. In Table 2, states function as “tags,” or higher-level metadata, providing entry points into more detailed information such as frequency and average directional magnitude. Tags may provide evaluators with an improved ability to assess expert and novice learners of hands-on clinical procedures. Klatzky and Lederman have shown that sets of reproducible and subconscious maneuvers are used to manually explore objects [17]. These maneuvers, called exploratory procedures (EPs), have stereotyped movement patterns with invariant and highly typical characteristics when exploring the properties of an object. In clinical examinations, EPs may be identifiable by using our framework to develop these tags. The variables listed in Tables 1 and 2 may be linked to these EPs, bringing us closer to quantifiable correlations between hand movements and diagnostic accuracy.
Visual-haptic tools in healthcare simulation have been useful in the development of clinical training in medicine [1, 18]. Moreover, research has shown that combining visualization and touch offers a clear advantage in the construction of meaning [19]. From discovering learner trends in the use of exploratory procedures, to establishing performance standards, to understanding the frequency of fine-grained touch-loading activity in the shear directions, the development path for this tool holds great promise as a visualization and data mining framework for objective clinical performance assessment. We have taken a first step by addressing, in part, the complexity and accuracy of tri-axial sensor data interpretation, through the initial development of a real-time visual representation of palpation for clinical performance evaluators.
Acknowledgments
We would like to thank Shenshen Zhao and Professor Chang Liu of the MedX Lab at Northwestern University for pioneering work on this sensor technology.
References
1. Ables DC, et al. Quantifying the sense of touch and reducing training time for the clinical breast examination using electronic palpation imaging. American Journal of Clinical Oncology-Cancer Clinical Trials. 2007;30(4):457–457.
2. Ables DC, et al. The science behind electronic palpation: quantifying the sense of touch used in the clinical breast exam. Breast Cancer Research and Treatment. 2006;100:S128–S129.
3. Wang NH, et al. Quantifying Palpation Techniques in Relation to Performance in a Clinical Prostate Exam. IEEE Transactions on Information Technology in Biomedicine. 2010;14(4):1088–1097. [PubMed]
4. Van Zoest G, van den Berg H, Holtkamp FC. Three-dimensionality of contact forces during clinical manual examination and treatment: A new measuring system. Clinical Biomechanics. 2002;17(9–10):719–722. [PubMed]
5. Kwan C, et al. Moving past normal force: capturing and classifying shear motion using 3D sensors. Stud Health Technol Inform. 2012;173:245–9. [PMC free article] [PubMed]
6. Pugh CM, Rosen J. Qualitative and quantitative analysis of pressure sensor data acquired by the E-Pelvis simulator during simulated pelvic examinations. Stud Health Technol Inform. 2002;85:376–9. [PubMed]
7. Pugh CM, et al. Use of a mechanical simulator to assess pelvic examination skills. JAMA. 2001;286(9):1021–3. [PubMed]
8. Wang N, et al. Using a prostate exam simulator to decipher palpation techniques that facilitate the detection of abnormalities near clinical limits. Simul Healthc. 2010;5(3):152–60. [PubMed]
9. Wang N, et al. Quantifying palpation techniques in relation to performance in a clinical prostate exam. IEEE Trans Inf Technol Biomed. 2010;14(4):1088–97. [PubMed]
10. Baumgart LA, Gerling GJ, Bass EJ. Characterizing the range of simulated prostate abnormalities palpable by digital rectal examination. Cancer Epidemiol. 2010;34(1):79–84. [PMC free article] [PubMed]
11. Gerling GJ, et al. The Design and Evaluation of a Computerized and Physical Simulator for Training Clinical Prostate Exams. IEEE Trans Syst Man Cybern A Syst Hum. 2009;39(2):388–403.
12. Zhao SS, Li YH, Liu C. A tri-axial touch sensor with direct silicon to PC-board packaging. Sensors and Actuators A: Physical. 2011;170(1–2):90–99.
13. O’Haver T. Fast smoothing function. 2008 Available from: http://www.mathworks.com/matlabcentral/fileexchange/19998-fast-smoothing-function.
14. Macias E, et al. Architecture and protocol of a semantic system designed for video tagging with sensor data in mobile devices. Sensors (Basel) 2012;12(2):2062–87. [PMC free article] [PubMed]
15. Bourne PE, et al. Will widgets and semantic tagging change computational biology? PLoS Comput Biol. 2010;6(2):e1000673. [PMC free article] [PubMed]
16. Good BM, Tennis JT, Wilkinson MD. Social tagging in the life sciences: characterizing a new metadata resource for bioinformatics. BMC Bioinformatics. 2009;10:313. [PMC free article] [PubMed]
17. Lederman SJ, Klatzky RL. Hand Movements - a Window into Haptic Object Recognition. Cognitive Psychology. 1987;19(3):342–368. [PubMed]
18. Abate AF, et al. A Pervasive Visual-Haptic Framework for Virtual Delivery Training. IEEE Transactions on Information Technology in Biomedicine. 2010;14(2):326–334. [PubMed]
19. Reiner M. Visualization: Theory and Practice in Science Education. Springer Netherlands; 2008. Seeing Through Touch: The Role of Haptic Information in Visualization.