
Multisensory Interface for 5D Stem Cell Image Volumes
Michael Koerner
Eric Wait
Mark Winter
Erzsebet Kokovay
Andrew R Cohen
Abstract
Biological imaging of live cells and tissue using 3D microscopy is able to capture time-lapse image sequences showing multiple molecular markers labeling different biological structures simultaneously. In order to analyze this complex multi-dimensional image sequence data, there is a need for automated quantitative algorithms, and for methods to visualize and interact with both the data and the analytical results. Traditional human input devices such as the keyboard and mouse are no longer adequate for complex tasks such as manipulating and navigating 3+ dimensional volumes. In this paper, we present a new interaction system for interfacing with big data sets using the human visual system together with touch, force, and audio feedback. This system includes real-time dynamic 3D visualization, haptic interaction via an exoskeletal glove, and tonal auditory components that seamlessly create an immersive environment for efficient qualitative analysis.
I. Introduction
Manipulating and interacting with large and complex data is becoming an increasingly common task in many fields. This is especially true in the field of biological image analysis. Live cell and tissue biological microscopy uses high throughput optical imaging to capture time-lapse sequences showing live cells in their intact tissue microenvironment. One important application area in live cell biological microscopy involves time-lapse imaging of stem cells in 3D tissue slices.
Stem cell biologists are routinely able to capture time lapse microscopy data showing live stem cells in their vascular niche. This vascular niche is known to be fundamentally important to the development of stem cells [1], although the specific mechanisms and the dynamic properties are poorly understood. Imaging stem cells live in their vascular niche allows for quantitative analysis of the dynamic properties of the cells in intact tissue, providing insight into fundamentally important questions in regenerative medicine and cancer therapeutics. These image sequences consist of three spatial dimensions, time, and a 5th dimension containing distinct biological imaging channels, such as stem cells and blood vessels.
Previously, we developed computational tools for the analysis of 2D time-lapse image sequences showing the proliferation of a clone, or family tree, of cells from a single progenitor through to the ultimately differentiated progeny [2]. Recently, we have extended these computational tools to incorporate 3D visualization and analysis components [3]. This stem cell image analysis and visualization software allows these five-dimensional data sets to be effectively visualized. Still, it remains uniquely challenging to interact with this complex 5D data. Traditional image manipulation methods constrain the user to a keyboard and mouse, making spatiotemporal navigation difficult and imprecise. Biologists are unable to qualitatively analyze and explore their data sets due to the limitations of this standard interface. We have developed a new system using a force-feedback exoskeleton, haptic vibrating motors, and a tonal audio interface to augment user interaction with the 5D data, extending a purely visual interface to include touch and sound as well.
Various haptic feedback interfaces exist for the exploration of 3D data. They can be grouped into grounded-type [4], [5] and body-type [6], [7], [8] devices. The HIRO III Five-Fingered Haptic Interface Robot [4] emulates the anatomy of the human hand by holding and augmenting force at each fingertip. The system provides extremely accurate fingertip feedback; however, it significantly limits the fingers' range of motion due to its bulky attachment mechanism. The Geomagic Touch [5] is a commercial grounded-type haptic device that can accurately deliver low-force, single-point haptic feedback to a stylus or fingertip. This refined system suffers from the same range-of-motion and workspace limitations.
Body-type (exoskeletal) interfaces, which attach only to the hand, have much larger workspaces due to their ungrounded nature. The ExoPhalanx system [6], designed for passive force-feedback robotic teleoperation, is able to deliver resistive force opposing finger flexion. Its tendon-based dual clutch brake system can independently prevent flexion of the distal interphalangeal (DIP)/proximal interphalangeal (PIP) or metacarpophalangeal (MCP) joints. Fontana et al. [7] developed an active exoskeleton for delivering accurate force to the fingertips of the index finger and thumb. This system can provide up to 5 Newtons of continuous force in any direction, and all actuation mechanisms are located on the hand. CyberGlove Systems' CyberGrasp [8] is the only commercially available exoskeletal system that can provide individual force feedback to the flexion of each finger.
In this paper, we discuss a novel multi-sensory interface integrated with 3D stem cell visualization and analysis tools that effectively replaces the keyboard and mouse, allowing the user to naturally explore and interact with the data. The system consists of a haptic feedback glove coupled with a tonal audio interface. Using this system, the wearer is able to manipulate the volume and to physically feel the size, shape, and texture of its distinct biological structures. Audio feedback allows the user to hear the visually obscured interactions between the different biological structures. The system creates a fully immersive environment for exploratory analysis of the data.
II. Interaction System
The interaction system consists of three components. The first component is the 3D interactive visualization tool with dynamic transfer functions for controlling image appearance and mixing properties on a per-channel basis [3]. The second component incorporates haptic interaction, including force and textural feedback as well as manipulation of the user interface through the haptic device. Finally, the third component incorporates tonal audio feedback specific to each of the imaging channels. The integrated control system can be seen in Figure 1.
Figure 1.
Block diagram of the multi-sensory interaction system illustrating the basic control loop architecture.
A. Visualization
Processing of the 5D image sequences begins by converting the native microscope data (Zeiss LSM files) into intensity-valued TIFF images. In addition to the images, the metadata is extracted, including the spatiotemporal resolution, which is used to account for the image anisotropy and provides the scaling for tracking and distance-based calculations. The TIFF images and metadata are then combined to create a voxel intensity-valued volume containing the distinct biological structures.
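As a concrete illustration of how this scaling enters the distance-based calculations, the following minimal Python sketch applies an anisotropy correction before computing a physical distance between two voxel locations. The spacing values are hypothetical placeholders; in practice they come from the LSM metadata.

```python
import numpy as np

# Hypothetical voxel spacing from the microscope metadata (micrometers per voxel).
# Confocal stacks are typically coarser along z than in x/y.
VOXEL_SIZE_UM = np.array([0.3, 0.3, 1.0])  # (x, y, z)

def physical_distance(p_voxels, q_voxels):
    """Euclidean distance in micrometers between two points given in voxel indices."""
    delta = (np.asarray(q_voxels) - np.asarray(p_voxels)) * VOXEL_SIZE_UM
    return float(np.linalg.norm(delta))

# Two centroids 10 voxels apart along z are much farther apart physically
# than two centroids 10 voxels apart along x.
print(physical_distance((0, 0, 0), (10, 0, 0)))  # 3.0 um
print(physical_distance((0, 0, 0), (0, 0, 10)))  # 10.0 um
```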
The structures within the volume, including stem cells, blood vessels, etc., are segmented using a series of background removal and de-noising algorithms, and convex hulls are drawn around each segmented boundary. The resulting volumetric representation with segmentation overlay can be seen in Figure 2(a).
Figure 2.
(a) Stem cell visualization and analysis software with fingertip and palm pointers rendered within the volume. (b) Visualization transfer functions for dynamic image adjustment. (c) Haptic and audio transfer functions for dynamic adjustment of various interface properties.
The transfer function interface seen in Figure 2(b) is used to adjust the voxel intensities of individual biological structures to enhance visualization. From top to bottom, the sliders control the minimum visible intensity, the original-vs-mapped intensity curve, and the maximum visible intensity. This simple transfer function interface allows the user to efficiently alter the appearance of the data in real time.
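A rough sketch of such a per-channel intensity mapping is shown below. The function and parameter names (vmin, vmax, gamma) are illustrative stand-ins for the three sliders, not the actual implementation in the visualization tool, which applies its transfer functions during rendering [3].

```python
import numpy as np

def apply_transfer_function(channel, vmin=0.1, vmax=0.9, gamma=1.0):
    """Map normalized voxel intensities of one channel for display.

    vmin / vmax clip the visible intensity range and gamma bends the
    original-vs-mapped intensity curve, roughly mirroring the three
    per-channel sliders described above.
    """
    x = np.clip((channel - vmin) / max(vmax - vmin, 1e-6), 0.0, 1.0)
    return x ** gamma

# Example on a random normalized channel volume.
volume = np.random.rand(64, 64, 16)
displayed = apply_transfer_function(volume, vmin=0.2, vmax=0.8, gamma=0.5)
```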
B. Haptic Interaction
Haptic interaction with the data is achieved using the Exo-Skin haptic glove that we have developed. The Exo-Skin glove was created with a combination of 3D printed parts, micro gear-motors, force-sensing resistors, vibro-tactile stimulatory motors, and a soft under-glove base. The Exo-Skin is controlled by an Arduino-based system communicating via a serial link with the visualization software. Finally, a "Leap Motion" sensor provides sub-millimeter-accurate location information to place the user's hand within the computational 3D space (https://www.leapmotion.com/). This approach provides force as well as textural feedback to the user based on fingertip locations and movements within the rendered volume. Currently, only our Windows software integrates the glove; however, it would not be difficult to incorporate it into software running on other platforms.
C. Exo-Skin Design and Kinematics
We have created an active haptic exoskeleton device based on the anatomical kinematic structure of the hand. The realized device can be seen in Figure 3. The design consists of four parts: the exoskeletal glove outer layer, the sensor glove under-layer, the tendon actuating cuff, and the control array.
Figure 3.
The Exo-Skin haptic interface overlaid with virtual pointers showing a 5-finger and pinch-grip around two stem cells.
In order to provide accurate haptic feedback, the device must be able to augment the motion of and exert forces on the user's fingertip. This is accomplished by two actuated cables that mimic the flexor and extensor tendons of each finger. Artificial tendons run through pathways on the 3D printed segments and attach to actuators mounted on the wrist cuff. A comparison of the artificial and anatomical tendon pathways is visualized in Figure 4. Two actuators per finger, one flexor and one extensor, work in unison, receiving force feedback data from two force sensors mounted in each fingertip and actively adapting to the natural motion of the hand. The glove can then augment this natural motion to simulate haptic forces generated by the data interaction. All actuators and necessary sensors are mounted on the hand, allowing the glove to be a lightweight wearable system. When operating at 6 V, each actuator has a free-run speed of 320 RPM with a current consumption of 80 mA, and each can produce 30 oz-in (2.2 kg-cm) of torque with a stall current of 1.6 A. The system is powered by a single 6 V wall-mounted supply.
Figure 4.
The Exo-Skin tendon pathways overlaid on anatomical drawings of the hand. This illustrates the direct bio-mimicry of the joint actuation and force transmittance at the fingertip.
The exoskeleton outer layer consists of fourteen 3D printed segments that rest on the proximal, middle, and distal phalanges of each finger as well as the proximal and distal phalanges of the thumb, as seen in Figure 4. The independent segments are securely fastened to the under-glove and guide the artificial flexor and extensor tendons to the desired anchor points on the fingertip. The structure is entirely jointless, allowing for direct augmentation of PIP, DIP, and MCP joint flexion and extension. This mitigates the mechanical complexity and sizing constraints associated with typical exoskeletal technologies.
Each fingertip houses an array of components that allow the device to deliver textural feedback as well as sense the user's "fingertip intention." The location of these components can be seen in Figure 5. "Fingertip intention" is the motion desired by the user and sensed by the device. For example, if the user attempts to flex a finger, the glove immediately detects the change in pressure inside the fingertip and reacts accordingly to the stimulus. Using this same principle, the glove is able to deliver force feedback. If a virtual finger contacts a structure within the computational volume, the glove can augment the natural motion of the finger, restricting its flexion or extension in the direction of the structure. Using the haptic transfer functions seen in Figure 2(c), specific structures can be adjusted empirically to feel "rigid" while others can be made to feel "spongy" or "muddy". This is achieved by varying the ratio of the zero-force reaction to the augmented motor reaction. When a finger flexes with no haptic resistance, the ratio is 1:1: the motors react at full speed to accommodate the finger's movement. If a finger were to contact a structure with 50% rigidity, the ratio would be 1:0.5, and the motors would move at half the zero-force rate, giving the structure a dense, spongy feel. A structure with 100% rigidity would have a ratio of 1:0; upon contact, the motors would lock movement, not allowing the fingertip to enter the structure.
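A minimal sketch of this rigidity-ratio idea is shown below. The function and variable names are hypothetical, and the actual control loop runs on the Arduino-based controller rather than in Python; the sketch only illustrates the ratio arithmetic described above.

```python
def motor_reaction_speed(finger_speed, rigidity):
    """Scale the motor's accommodation speed by the structure's rigidity.

    rigidity = 0.0 -> 1:1 ratio, the motor tracks the finger freely.
    rigidity = 0.5 -> 1:0.5 ratio, a dense "spongy" feel.
    rigidity = 1.0 -> 1:0 ratio, the motor locks and the fingertip cannot
                      penetrate the structure.
    """
    rigidity = min(max(rigidity, 0.0), 1.0)
    return finger_speed * (1.0 - rigidity)

print(motor_reaction_speed(320.0, 0.0))  # 320.0 (full zero-force speed)
print(motor_reaction_speed(320.0, 0.5))  # 160.0 (half the zero-force rate)
print(motor_reaction_speed(320.0, 1.0))  # 0.0   (locked)
```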
Figure 5.
3D rendering of an Exo-Skin fingertip illustrating the location of various electronic components.
A vibrotactile stimulator is also located at each fingertip for textural simulation. The small coin motor, either a DC or linear resonant actuator, is securely fastened to the fingertip housing, allowing the haptic stimulus to propagate throughout the structure. This creates a larger surface area over which the vibrations can be perceived by the sensory nerves of the hand. Accurate textural simulation can be achieved by controlling the vibration frequency and amplitude according to the pressure and velocity of the fingertip relative to the virtual surface [9]. This allows the biological structures within the volume to have distinct textural responses: a stem cell could feel "rough" or "sandy", whereas a blood vessel could be perceived as "smooth". All textures are assigned and adjusted using the haptic transfer functions seen in Figure 2(c).
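The sketch below illustrates one possible pressure- and velocity-dependent mapping of this kind. The constants and the functional form are illustrative assumptions, not the perceptual model from [9] or the exact mapping used in the glove.

```python
def vibrotactile_drive(pressure, velocity, roughness):
    """Map fingertip pressure, sliding velocity, and an assigned per-channel
    roughness (0..1) to a vibration frequency (Hz) and amplitude (0..1)
    for the coin motor. All constants here are illustrative."""
    # Faster sliding over a rougher surface -> higher perceived frequency.
    frequency = 40.0 + 200.0 * roughness * min(velocity, 1.0)
    # Pressing harder -> stronger vibration, capped at the motor's limit.
    amplitude = min(pressure * (0.3 + 0.7 * roughness), 1.0)
    return frequency, amplitude

print(vibrotactile_drive(pressure=0.5, velocity=0.8, roughness=0.9))  # "sandy" cell
print(vibrotactile_drive(pressure=0.5, velocity=0.8, roughness=0.1))  # "smooth" vessel
```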
Vibrotactile response is not limited to texture. The motors can also be set to respond to properties that are not visible, such as voxel intensity or hull convexity values. This vibration response allows the user to establish a non-traditional (non-visual) interface with the data, which is essential for qualitative analysis: while visualization conveys a great deal, a vast quantity of additional information about the data sets can be relayed via alternate sensory interfaces.
D. “Leap Motion” Hand Tracking Interface
The “Leap Motion” is used to optically track the position of the user’s five fingertips and palm above the workspace. The positional data in physical space is translated into computational volume space where the corresponding pointers can be rendered with the biological structures. The pointers can then be translated according to the user’s hand movements. Voxel intensity values at each pointer location are interrogated and used for haptic force feedback as well as vibro-tactile output calculation. The positional data is also used for gesture recognition.
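A simplified sketch of this physical-to-volume mapping and voxel interrogation is given below. The workspace bounds and volume are hypothetical placeholders, and the Leap Motion SDK itself is not used here.

```python
import numpy as np

def hand_to_volume(position_mm, workspace_min_mm, workspace_max_mm, volume_shape):
    """Map a tracked fingertip position (millimeters above the sensor) into
    voxel indices of the rendered volume."""
    t = (np.asarray(position_mm) - workspace_min_mm) / (workspace_max_mm - workspace_min_mm)
    t = np.clip(t, 0.0, 1.0)
    return tuple((t * (np.asarray(volume_shape) - 1)).astype(int))

volume = np.random.rand(256, 256, 32)          # stand-in for one biological channel
workspace_min = np.array([-120.0, 50.0, -120.0])  # hypothetical tracking bounds (mm)
workspace_max = np.array([120.0, 350.0, 120.0])

idx = hand_to_volume([10.0, 200.0, -30.0], workspace_min, workspace_max, volume.shape)
intensity = volume[idx]  # value fed to the force and vibrotactile calculations
```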
The "Leap Motion" in conjunction with our software is able to recognize multiple gestures that can be used to explore the 5D images. The 3D volume can be rotated about the x, y axes or zoomed along the z-axis by executing a "grasping" gesture with two or more fingers and dragging in any desired direction. The magnitude and direction of the dragging gesture determine how the volume is translated in real time. Temporal navigation is achieved by rotating one finger in a circular motion, as sketched below. The 5D volume steps forward in time once a clockwise rotation threshold has been reached, and steps backwards if a similar threshold is crossed in the counter-clockwise direction. A structure within the volume can be selected and view-centered by executing a "tap" gesture while hovering over it in screen space. By using these gestures, the difficulties associated with spatiotemporal navigation using the keyboard and mouse are eliminated, allowing for much more efficient exploration and analysis of the 5D data.
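The temporal-navigation gesture can be approximated by accumulating the signed angle swept by the fingertip around a center point and stepping time whenever a rotation threshold is crossed. The sketch below is an illustrative approximation only; the class and threshold names are hypothetical, and the sign convention for clockwise motion depends on the tracking coordinate frame.

```python
import math

class TemporalDial:
    """Accumulate the signed angle swept by one fingertip circling a center
    point and emit a time step whenever a rotation threshold is crossed."""

    def __init__(self, threshold_rad=2.0 * math.pi):
        self.threshold = threshold_rad
        self.accumulated = 0.0

    def update(self, prev_xy, curr_xy, center_xy):
        a0 = math.atan2(prev_xy[1] - center_xy[1], prev_xy[0] - center_xy[0])
        a1 = math.atan2(curr_xy[1] - center_xy[1], curr_xy[0] - center_xy[0])
        delta = math.atan2(math.sin(a1 - a0), math.cos(a1 - a0))  # wrap to (-pi, pi]
        self.accumulated += delta
        if self.accumulated >= self.threshold:
            self.accumulated = 0.0
            return +1   # threshold crossed in one direction -> step forward in time
        if self.accumulated <= -self.threshold:
            self.accumulated = 0.0
            return -1   # threshold crossed in the other direction -> step backward
        return 0
```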
E. Tonal Audio Feedback
The final component of the augmented interaction system incorporates tonal audio feedback to indicate a desired property of a structure at the user's hand position, as tracked by the Leap Motion sensor.
We assign independent audio oscillators to each biological structure channel, where the oscillators' frequency, amplitude, and waveform are dynamically controlled. All parameters of the audio response are set by the transfer functions seen in Figure 2(c). Similar to textural feedback, audio responses are not limited to one data type; they can be set to convey information that cannot be displayed visually or texturally. For example, the oscillators can relay interactions between two or more structures' intensity values over a designated region.
Each biological structure is assigned a distinct tone or di-tone (two notes) designated by the user in the audio transfer function panel seen in Figure 2(c). The intensity values of these structures are interrogated at the pointer controlled by the motion of the user's hand. A moving-average smoothing algorithm is applied to each channel's intensity values along the designated path. The smoothed intensity is then normalized and assigned to the amplitude of its respective oscillator. This technique is useful for analysis because the user can hear the tonal interaction between various structures over a given area. When two or more channels occupy the same area, their colors blend and the visualization becomes imprecise, whereas with the tonal interface three distinct tones with varying amplitudes (volumes) can easily be perceived.
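A minimal sketch of this smoothing and amplitude assignment is shown below. The channel names and window length are hypothetical, and the real oscillators are configured through the transfer function panel rather than driven from Python.

```python
import numpy as np

def oscillator_amplitudes(channel_samples, window=5):
    """Map intensity samples along the hand's path to oscillator amplitudes.

    channel_samples: dict of channel name -> 1D array of voxel intensities
    sampled at the moving pointer. Each channel is smoothed with a moving
    average and normalized to [0, 1], giving the volume (amplitude) of the
    tone or di-tone assigned to that channel.
    """
    kernel = np.ones(window) / window
    amplitudes = {}
    for name, samples in channel_samples.items():
        smoothed = np.convolve(samples, kernel, mode="same")
        peak = smoothed.max()
        amplitudes[name] = smoothed / peak if peak > 0 else smoothed
    return amplitudes

samples = {
    "stem_cells":    np.random.rand(100),
    "blood_vessels": np.random.rand(100),
}
amps = oscillator_amplitudes(samples)  # drives each channel's oscillator volume
```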
III. Conclusion
We have developed a new multi-sensory interface for interacting with big data, specifically 5D stem cell images. The interface consists of a dynamic visualization tool for adjusting image appearance, a haptic feedback exoskeletal glove, and a tonal audio interface. Together, these tools provide an immersive system for efficiently exploring such data sets. Visualization can only go so far in the qualitative analysis of 5D stem cell volumes; by engaging the non-traditional senses of touch and hearing, we are able to relay data to the user on an entirely new sensory level.
Future work involves making the Exo-Skin device entirely “soft”, eliminating most if not all 3D printed parts. This would make it more universal and comfortable for extended use. All microcontrollers will be mounted on the arm and wireless communication protocols will be established, making the glove a standalone system.
Acknowledgments
Portions of the research reported in this publication were supported by Drexel University, and by the National Institute On Aging of the National Institutes of Health under Award Number R01AG041861. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.
Contributor Information
Michael Koerner, School of Biomedical Engineering, Science and Health Systems.
Eric Wait, Dept. of Electrical and Computer Engineering, Drexel University, Philadelphia PA, USA.
Mark Winter, Dept. of Electrical and Computer Engineering, Drexel University, Philadelphia PA, USA.
Chris Bjornsson, Neural Stem Cell Institute, Rensselaer, NY, USA.
Erzsebet Kokovay, UT Health Science Center, 78229 San Antonio, TX, USA.
Yue Wang, Neural Stem Cell Institute, Rensselaer, NY, USA.
Susan K Goderie, Neural Stem Cell Institute, Rensselaer, NY, USA.
Sally Temple, Neural Stem Cell Institute, Rensselaer, NY, USA.
Andrew R Cohen, Email: acohen@coe.drexel.edu, Dept. of Electrical and Computer Engineering, Drexel University, Philadelphia PA, USA.
References
- 1. Shen Q, Wang Y, Kokovay E, Lin G, Chuang SM, Goderie SK, et al. Adult SVZ Stem Cells Lie in a Vascular Niche: A Quantitative Analysis of Niche Cell-Cell Interactions. Cell Stem Cell. 2008;3:289–300. doi: 10.1016/j.stem.2008.07.026.
- 2. Winter M, Wait E, Roysam B, Goderie SK, Ali RA, Kokovay E, et al. Vertebrate neural stem cell segmentation, tracking and lineaging with validation and editing. Nat Protoc. 2011;6:1942–52. doi: 10.1038/nprot.2011.422.
- 3. Wait E, Winter M, Bjornsson C, Kokovay E, Wang Y, Goderie S, et al. Visualization and Correction of Automated Segmentation, Tracking and Lineaging from 5-D Stem Cell Image Sequences. BMC Bioinformatics. 2014. doi: 10.1186/1471-2105-15-328.
- 4. Endo T, Kawasaki H, Mouri T, Doi Y, Yoshida T, Ishigure Y, et al. Five-fingered haptic interface robot: HIRO III. World Haptics 2009 (Third Joint EuroHaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems); 2009. pp. 458–463.
- 5. Geomagic. Geomagic Touch. Accessed 2013 Mar 15. Available: http://geomagic.com/en/products/phantom-omni/overview/
- 6. Kobayashi F, Ikai G, Fukui W, Nakamoto H, Kojima F. Multipoint haptic device for robot hand teleoperation. 2012 International Symposium on Micro-NanoMechatronics and Human Science (MHS); 2012. pp. 304–309.
- 7. Fontana M, Dettori A, Salsedo F, Bergamasco M. Mechanical design of a novel Hand Exoskeleton for accurate force displaying. 2009 IEEE International Conference on Robotics and Automation (ICRA); 2009. pp. 1704–1709.
- 8. CyberGlove Systems. CyberGrasp. Accessed 2010 Mar 15. Available: http://www.cyberglovesystems.com/?q=products/cybergrasp/overview.
- 9. Klatzky RL, Pawluk D, Peer A. Haptic Perception of Material Properties and Implications for Applications. Proceedings of the IEEE. 2013;101:2081–2092.




