US8276091B2 - Haptic response system and method of use - Google Patents

Haptic response system and method of use

Info

Publication number
US8276091B2
Authority
US
United States
Prior art keywords
virtual
passageway
virtual object
force
haptic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related, expires
Application number
US10/941,088
Other versions
US20050093847A1 (en)
Inventor
Robert Altkorn
Xiao Chen
Scott Milkovich
John Owens
Brian Rider
Eugene Rider
Daniel Stool
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ram Consulting
Original Assignee
Ram Consulting
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ram Consulting
Priority to US10/941,088 (US8276091B2)
Assigned to RAM CONSULTING. Assignment of assignors interest (see document for details). Assignors: Altkorn, Robert; Chen, Xiao; Milkovich, Scott; Owens, John; Rider, Brian; Rider, Eugene; Stool, Daniel
Publication of US20050093847A1
Priority to US13/540,210 (US20120278711A1)
Application granted
Publication of US8276091B2
Expired - Fee Related
Adjusted expiration


Abstract

An apparatus and method for assessing a hazard associated with an object are disclosed. The apparatus includes a haptic input/output device coupled to a computer with haptic modeling software and a display device. A virtual object and a virtual passageway are displayed on the display device. The virtual passageway includes a haptic layer along a surface thereof. Force applied by a user to the haptic input/output device causes a cursor on the display device to move the virtual object into the virtual passageway. An interaction of the virtual object with the haptic layer generates a virtual contact force which may be determined by the user sensing a corresponding tactile feedback force generated by the haptic input/output device and/or by the computer processor. The magnitude of the virtual contact force may be used to assess a hazard associated with the virtual object.

Description

CROSS REFERENCE TO RELATED APPLICATIONS
This application claims priority to U.S. Provisional Application No. 60/502,983 filed on Sep. 16, 2003, which is incorporated herein by reference in its entirety.
FIELD OF THE INVENTION
The invention relates to hazard assessment simulators, and more particularly to a haptic response system and method of use which enables a user to assess a hazard, such as a choking, aspiration, or blockage hazard, in humans caused by an inanimate object.
DESCRIPTION OF RELATED ART
Haptic, or force feedback, technology includes hardware and associated software that allows a user to physically feel objects existing in a virtual (e.g., computational) environment. Haptic hardware integrates force sensors and motors or actuators and is often shaped to simulate specific tools, such as surgical devices or sculpting tools. In haptic technology, haptic hardware replaces conventional tactile computer input devices such as a mouse, trackball, or keyboard. The force sensors measure a magnitude and direction of forces applied by a user and input these measurements to a computer. Software installed on the computer converts the inputted measurements into movement of one or more virtual objects that are displayed on a display device, calculates one or more interactions between objects, and outputs the interactions as computer signals. The motors or actuators in each input/output device resist forces applied by a user, or apply forces to the user, pursuant to the signals received from the computer.
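The sense-compute-actuate cycle described above can be sketched as a simple servo loop. The sketch below assumes a penalty-based (spring) contact model against a single virtual wall; the function names, stiffness value, and loop rate are illustrative only, not the API of any real haptic device.

```python
# Minimal sketch of a haptic servo loop: read the device position, compute a
# feedback force from the virtual scene, and command the actuators.
# All names and constants here are hypothetical illustrations, not the
# interface of any real haptic SDK.

def feedback_force(position, wall_y=0.0, stiffness=800.0):
    """Penalty-based force: push back proportionally to penetration
    depth when the probe crosses a virtual wall at y = wall_y."""
    penetration = wall_y - position[1]
    if penetration <= 0.0:
        return (0.0, 0.0, 0.0)                   # no contact, no force
    return (0.0, stiffness * penetration, 0.0)   # Hooke's-law push-back

def servo_step(position):
    """One cycle of the loop; a real device runs this at roughly 1 kHz."""
    return feedback_force(position)
```

The returned force would be sent to the device's motors; a position above the wall yields zero force, while a position 10 mm below it yields an 8 N upward push with the assumed stiffness.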
Various haptic hardware devices have been developed. Illustratively, known haptic hardware devices include a MagLev Wrist developed by Carnegie Mellon University, an Eye Surgery Simulator developed by Georgia Tech University, a Laparoscopic Impulse Engine developed by Immersion Corporation, and a Cybergrasp Force Feedback Glove developed by Virtual Technologies, Inc.
Haptic technologies have been applied to various disciplines, including the training of surgeons in minimally invasive surgery and other medical procedures. Specific medical procedures for which haptic technologies have been developed include, for example, bronchoscopy, urinary tract endoscopy, epidural injections, cardiovascular surgery, and gynecology. These technologies are specifically designed to mimic the interaction between a surgical instrument and a part of the human body. Currently, however, such haptic systems may not accurately model the forces experienced during actual surgery or performance of a medical procedure, the foremost reason being inaccurate modeling techniques. For example, these known haptic models do not account for variations in size, shape, and elasticity across different population groups. Thus, the modeling is generally a “gross” calculation of a particular body part and its interactions with a surgical tool, without taking into account variables that may exist between persons.
Additionally, the known haptic surgical simulators do not provide body parts that are dimensionally sized and imbued with specific material properties unique to persons within a particular age group. Consequently, such simulators cannot generate anatomically correct models of parts of the human body that are statistically representative of a particular sector of the population.
Moreover, surgical haptic response simulators are generally modeled to show an interaction strictly between a surgical tool and a body part. Such interaction is limited to human manipulation of a surgical instrument (e.g., cutting and moving), ranging from incisions in the skin to removal of body parts such as a spleen, cataracts, etc. These systems do not model objects with which there is no human interaction, such as objects that were accidentally swallowed. Additionally, these simulators are primarily concerned with modeling the treatment and repair of body parts, not with determining how inanimate objects interact with the human body in a way that creates an injury hazard, such as causing a blockage within a passageway located inside the body.
Other haptic applications include virtual assembly path planning and virtual maintenance path planning. Virtual assembly path planning haptic technologies permit users to manipulate or simulate tools and components within a virtual environment to verify that an assembly process may be successfully completed. Similarly, virtual maintenance path planning technologies permit users to manipulate tools and components within a virtual environment to confirm that a broken component may be removed and replaced by a working component. Consequently, the haptic training systems used in virtual assembly path planning and virtual maintenance path planning simulate mechanical systems that exist outside the human body. As such, they are not concerned with, nor configured to show interactions with a part of the human body.
SUMMARY OF THE INVENTION
In one embodiment, the invention provides a virtual haptic response system and method of use that enable a user to assess a choking, ingestion, blocking, insertion, aspiration, or any other physical hazard in humans caused by an inanimate object. As an example, the virtual haptic response system and method of use enable assessment of a hazard associated with an insertion of a manufactured, or yet to be manufactured, object into a human passageway. Illustratively, the object may be a toy or other article intended for use by children, as well as another consumer product intended for use by teenagers and adults. The hazards may be assessed using an anatomically correct, virtual model of a passageway, such as, but not limited to, a nasal pharynx, an oral cavity, an oral pharynx, a trachea, a hypo-pharynx, or an esophagus, and accurate, realistic tactile force feedback generated by a haptic input/output device. Additionally, the virtual model of the passageway may be dimensionally sized and imbued with specific material properties unique to persons within a particular age group. Consequently, the dimensions and material properties modeled by the virtual model of the passageway may statistically represent a particular sector of the population.
Thus, an embodiment of the invention is directed to a virtual computer model, tangibly embodied in computer executable instructions, which simulates on a display device a virtual object modeled after a particular real object, a virtual passageway modeled after a particular real human passageway, and an interaction between them. An interaction occurs when the virtual object and the virtual passageway are positioned proximate to or in contact with each other. Intensities of a force or forces generated by the interaction may be calculated and analyzed to determine whether the virtual object poses a hazard to the virtual passageway. Once calculated, the values of the generated force or forces may be processed so that one or more areas of the virtual object and/or the virtual passageway visibly deform and/or turn a non-anatomical color in response thereto.
In one embodiment, one or more forces generated by the interaction are output as computer signals to an input/output device manipulated by a user. In response, one or more actuators within the input/output device generate one or more feedback forces that simulate an intensity level of one or more real forces that would be exerted if an interaction occurred between the real object and the real passageway. The force feedback enables the user to determine whether the virtual object is capable of traversing the virtual passageway, and if not, where in the virtual passageway the virtual object is likely to lodge. The intensity of one or more calculated forces may be displayed on the display device by color variations and/or alphanumeric data.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1A is a cut-away, profile view of a virtual human head showing placement of a virtual object within a virtual airway, according to one embodiment of the invention;
FIG. 1B is a perspective view of an apparatus useable with an embodiment of the invention;
FIG. 2 is a screenshot of magnetic resonance images (MRI) used in embodiments of the invention to create the virtual human body part shown in FIG. 1A;
FIG. 3A is a screenshot illustrating a three-dimensional, frontal view of a human skull constructed using data and measurements obtained from the magnetic resonance images of FIG. 2, according to one embodiment of the invention;
FIG. 3B is a cut-away profile view of a human head illustrating construction of reference layers, according to one embodiment of the invention;
FIG. 3C is a screenshot illustrating a three-dimensional, frontal view of a child's head with left and right side skin layers, according to one embodiment of the invention;
FIG. 4 is a screenshot illustrating four representative views of a model of a hypopharynx that may define an airspace used in the haptic modeling system, according to one embodiment of the invention;
FIG. 5 is a screenshot of an exemplary interface used in an embodiment of the invention to select a type of virtual object;
FIG. 6A is a three-dimensional virtual view of human internal organs illustrating an interaction with a virtual object positioned therein;
FIG. 6B is a screen shot of an exemplary interface used in an embodiment of the invention to adjust one or more spring-constant and/or mass values in one or more spring-mass models;
FIG. 7A is a cross-sectional, side view of a diagram used to illustrate a virtual object and a virtual passageway, according to one embodiment of the invention;
FIG. 7B is an end view of a diagram used to illustrate a virtual object and a virtual passageway, according to one embodiment of the invention;
FIG. 7C is a cross-sectional, side view of a diagram used to illustrate an interaction between a virtual object and a virtual passageway, according to one embodiment of the invention;
FIG. 7D is an end view of a diagram used to illustrate an interaction between a virtual object and a virtual passageway, according to one embodiment of the invention;
FIG. 8 is a flowchart of an exemplary method according to one embodiment of the invention;
FIG. 9 is a flowchart of another exemplary method according to one embodiment of the invention;
FIG. 10 is a flowchart of yet another exemplary method according to one embodiment of the invention; and
FIG. 11 is a flowchart of yet another exemplary method according to one embodiment of the invention.
DETAILED DESCRIPTION OF EMBODIMENTS OF THE INVENTION
In one embodiment, the invention provides a virtual haptic response system and method that enables a user to visually and tactilely assess a choking, ingestion, blocking, insertion, aspiration, or other hazard associated with an insertion of a manufactured, or yet to be manufactured, object into a human passageway. Illustratively, the object may be a toy or other article intended for use by children, as well as other consumer or food products intended for use by any age group. The object may be modeled by a virtual object that includes the dimensions and material properties of the real object.
Hazards associated with the object may be assessed by interacting the virtual object with an anatomically correct, virtual passageway that models a real human passageway, such as, but not limited to, a nasal pharynx, an oral cavity, an oral pharynx, a trachea, a hypo-pharynx, and an esophagus. The virtual passageway may be dimensionally sized and imbued with specific material properties unique to persons within a particular age group. Additionally, the dimensions and material properties modeled by the virtual passageway may be statistically obtained to represent a particular sector of the population.
In one embodiment, a haptic input/output device is connected to a display device through a computer. The display device displays a two-dimensional or three-dimensional view of a virtual object and a virtual passageway, both of which may model the exact or substantially exact dimensions and material characteristics of a real object and a real passageway, respectively. The display device may also indicate a magnitude of a force caused by an interaction of the virtual object with the virtual passageway. Additionally, the haptic input/output device may generate a tactile force that enables a user to feel the interaction of the virtual object with the virtual passageway in order to assist in assessing a degree of hazard associated with the virtual object. Optionally, assessment of the hazard may be performed by the computer itself using computational techniques. Simulating a design of an object being considered for manufacture and testing it for hazards in the manner described herein enables the designer and/or manufacturer to modify the object's dimensions and/or material properties early in the design cycle, which reduces costs and saves time.
System of the Invention
FIG. 1A is a cut-away, profile view of a virtual human head 100 showing placement of a virtual object 105 within a virtual passageway 110. The head 100 includes a haptic layer 103, which may be a virtual, complex, computer-generated surface that forms an interface between the virtual passageway 110 and corresponding portions of the head 100. The haptic layer 103 may be formed using, for example, Boolean subtraction to remove a volume having the size and shape of a normal or expanded passageway from the head 100. The haptic layer 103 may be used to calculate a magnitude of a contact force (or contact forces) exerted between the virtual object 105 and the virtual passageway 110.
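The Boolean-subtraction step can be illustrated on a coarse voxel grid: subtracting the passageway volume from the head volume leaves the tissue, and the tissue voxels that border the removed volume approximate the haptic layer. This is a toy sketch with invented geometry; the patent's models use polygonal surfaces rather than voxels.

```python
# Toy illustration of forming a haptic layer by Boolean subtraction on a
# voxel grid: subtract the passageway volume from the head volume, then
# keep the remaining voxels that border the removed volume as the "layer".
# Purely illustrative; real models use polygonal/finite-element meshes.

def boolean_subtract(head, passageway):
    """Remove the passageway volume from the head volume."""
    return head - passageway

def surface_layer(solid, removed):
    """Voxels of `solid` that touch a removed voxel along one axis."""
    neighbors = [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
                 (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    return {v for v in solid
            if any((v[0] + dx, v[1] + dy, v[2] + dz) in removed
                   for dx, dy, dz in neighbors)}

head = {(x, y, z) for x in range(4) for y in range(4) for z in range(4)}
passageway = {(1, y, 1) for y in range(4)}   # a straight vertical channel
tissue = boolean_subtract(head, passageway)
layer = surface_layer(tissue, passageway)
```

Only the voxels lining the carved-out channel end up in `layer`, which is the surface against which contact forces would be computed.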
Additionally, as shown in FIG. 1A, the haptic layer 103 may be positioned to correspond to an inner surface of a passageway. For example, the haptic layer 103 may be positioned on an inner surface of a nasal passageway 102. Similarly, another haptic layer 103 may be positioned on an inner surface of an oral passageway 101. Although the haptic layer 103 may not be visible to a user, the user may deduce its position by a tactile force generated by a haptic input/output device whenever the virtual object interacts with the haptic layer 103. The haptic layer may be toggled on and off; when toggled off, no force feedback is provided.
The virtual object 105 may be, for example, a computer-generated model of a real object. Accordingly, the virtual object 105 may have the exact dimensions and material properties (modulus of elasticity, Poisson's ratio, density, texture, friction, etc.) of a natural object, a manufactured object, or a yet-to-be-manufactured object. The dimensions and material properties of the real object may be obtained from reference sources or experimentally measured data. These properties may be linear or non-linear, isotropic or anisotropic, homogeneous or inhomogeneous. Once obtained, the dimensions and material properties may be imported or otherwise input into a computer program that creates the virtual object 105 and the virtual passageway 110, respectively. In one embodiment, an optional handle 107 connected to the object 105 is provided so that a user can more clearly see an interaction of the object 105 with the virtual passageway 110. Additionally, the handle 107 may be used to manipulate the virtual object 105 through a portion of the virtual passageway 110, or to position the virtual object 105 at any particular location within the virtual passageway 110 for hazard assessment. In an implementation, the virtual object 105 may be created using the FreeForm® Concept™ software produced by SensAble Technologies, Inc. of Woburn, Mass., or other graphical software programs.
The virtual passageway 110 may be, for example, a computer-generated model of a nasal pharynx, an oral cavity, an oral pharynx, a trachea, a hypopharynx, an esophagus, or other anatomical entity, such as an ear canal, a lumen, an intestine, the lungs, or another passageway. In an implementation, the virtual passageway 110 will accurately represent, anatomically, a human passageway. This accurate representation will include the interaction of tissue, bone, and muscle groups associated with the passageway. The dimensions of such tissue, bone, and muscle groups may be determined using MRI modeling, CT modeling, statistical modeling, or other empirical data discussed below. The material properties of such tissue, bone, and muscle groups may be determined by direct measurement, from compilations such as H. Yamada, Strength of Biological Materials, Williams and Wilkins, Baltimore, Md., 1970, herein incorporated by reference in its entirety, or from statistical modeling of data from a single source or multiple sources. In one embodiment, the virtual passageway will comprise the haptic layer in order to provide feedback and modeling according to an aspect of the invention. The model of FIG. 1A, as well as any of the remaining models of the invention, may further include one or more reference layers 109A and 109B.
In an implementation, the reference layers are computer-generated artistic or mathematical renderings of certain anatomical features that may have a fixed shape. In an embodiment, the reference layers may include haptic properties, such that a user will feel resistance (e.g., a feedback force) when passing the virtual object through one or more of the reference layers. In an embodiment, one or more of the reference layers may be toggled off to permit placement of the virtual object at any particular location of the virtual passageway, and then toggled back on to provide a cumulative resistance that, combined with the resistance provided by the haptic layer, realistically simulates the force(s) exerted by and on the virtual object. Alternatively, once the virtual object is positioned, only the reference layer(s) may be toggled on to permit determination of the resistance(s) provided by the tissues which surround the virtual passageway. The one or more reference layers 109A and 109B may be simultaneously displayed with a visible or invisible haptic layer 103 to provide frames of reference to a user and to enable the user to better understand and visualize relevant anatomy. Additionally, the reference layers may be toggled on and off separately, or simultaneously.
One or more of the reference layers 109A and 109B may be created using data imported from MRI scans and CT scans, together or separately, with other inputted data that is either experimentally measured or obtained from reference sources. A combination of MRI and CT scans is preferable because MRI scans offer excellent soft-tissue discrimination, and CT scans offer excellent bone discrimination. In an implementation, the reference layers 109A and 109B, and the haptic layer 103, may be created using multiple software applications and then imported into the FreeForm® or other development environment.
In one embodiment, high resolution CT and MRI scans, in DICOM or other format, are imported into a software program that allows each scan to be viewed individually, and which recreates an approximate volumetric representation of a head or other body part using a polygonal or other finite element mesh that may serve as the basis for a virtual spring-mass damper model or other mathematical method of modeling the material properties of tissues. One such software program is the Mimics software program, manufactured by Materialise, Inc. of Ann Arbor, Mich. However, other software programs may be used with the invention.
Once the data from the CT and MRI scans is imported, reference layers that correspond to specific anatomical entities, such as the skull layer 109A and the mandible layer 109B, may be isolated by “filtering” one or more CT or MRI images. Filtering may include highlighting only those areas of the image that correspond to a specific range of gray shades. After filtering, the selected anatomical entities are exported by the software program in .stl (stereolithography) format, and imported into sculpting or general geometrical modeling software such as FreeForm® Concept™ Software, where the quality of the images may be improved and extraneous tissue identical in density to the desired anatomical entity may be removed, according to a user's preference.
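The gray-shade filtering described above amounts to intensity windowing: keep only the pixels whose value falls in a chosen range. A minimal sketch, using made-up intensity values in which bone appears bright:

```python
# Sketch of "filtering" a scan by gray shade: keep only pixels whose
# intensity falls in a chosen window (bone shows bright on CT).
# The window bounds and pixel values below are invented for illustration.

def filter_gray_range(image, lo, hi):
    """Return a binary mask selecting pixels with lo <= value <= hi."""
    return [[1 if lo <= px <= hi else 0 for px in row] for row in image]

# Hypothetical 3x3 CT slice: soft tissue is dim, bone is bright.
ct_slice = [
    [10, 40, 220],
    [35, 250, 245],
    [12, 38, 230],
]
bone_mask = filter_gray_range(ct_slice, 200, 255)
```

The resulting mask highlights only the bright (bone-density) pixels; a real pipeline would then trace the mask into a surface for export as .stl.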
In an implementation, one or more pre-assembled virtual objects 105, virtual passageways 110, reference layers 109A and 109B, and haptic layers 103 may be stored in one or more databases for later retrieval by a user. This enables a user to select from among a range of choices. For example, embodiments of pre-assembled virtual objects 105 may include square, round, rectangular, polygonal, and other shaped objects of various sizes, textures, and rigidity. Additionally, embodiments of virtual passageways 110 may include anatomically correct, non-anatomically correct, and statistically characterized passageways.
In one embodiment, the virtual passageway 110 is not only anatomically correct, but also includes anatomical measurements and/or material properties that have been statistically correlated to correspond to a particular age group. For example, the virtual passageway 110 shown in FIG. 1A may represent a passageway having the dimensions and material properties most likely to be found in the 75th percentile of children ages 3 years to 6 years. Naturally, the invention is not limited to this percentile or age group, but may include any other percentile or age group.
One technique for creating a statistically characterized virtual passageway 110 may include obtaining detailed external and internal anatomical measurements for different age groups and different demographic groups of children, teenagers, or adults. Illustratively, external anatomical measurements such as height and various facial dimensions for children in different age and demographic groups may be obtained from existing reference sources. In some cases, the dimensions of internal passageways may also be obtained from existing reference sources. However, in some cases, existing studies of human passageways may not provide sufficient data to provide a statistical basis for embodiments of the present invention. Accordingly, in one implementation, internal passageway dimensions from CT and MRI scans may be obtained and compared with measurements of external anatomical features in the same CT and MRI scans to find an external anatomical feature that correlates with an internal anatomical feature. The best-correlated pair of external and internal features may then be used to statistically calculate the passageway's size percentile within a particular population group.
Illustrative measurements obtained from MRI or CT scans include, but are not limited to, head length (mm), head width (mm), and tragion-to-menton distance (mm). In an embodiment of the invention that assesses hazards associated with an object placed in the passageway of a child, these measurements are preferable because they have been tabulated for children of different ages in L. W. Schneider, R. J. Lehman, M. A. Pflug, and C. L. Owings, “Size and Shape of the Head and Neck from Birth to Four Years,” University of Michigan Transportation Research Institute Publication UMTRI-86-2, January 1986, which is herein incorporated by reference in its entirety. Consequently, these measurements may serve as independent variables in correlation assessment.
Other internal measurements may serve as dependent variables in the correlation analysis. Such other internal measurements include, but are not limited to: bassioccipial to incisal edge length (mm), bassioccipial to posterior hard palate length (mm), bassioccipial to superior epiglottis length (mm), bassioccipial to superior hyoid length (mm), molar to molar length (mm), epiglottis width (mm), epiglottis length (mm), vocal ligament width (mm), and vocal ligament length (mm).
Once the appropriate internal measurements are obtained, correlation analysis may be performed to test the significance level of the correlation between each pair of independent and dependent variables. The independent variable that is most significantly correlated with all dependent variables is selected as the indicator variable (anatomical marker). Probability distributions are then fitted for each indicator variable, within each age group, according to statistics from an existing case study, for example, but not limited to, the University of Michigan study mentioned above. A best-fitting distribution is then chosen based on the Chi-Square, Kolmogorov-Smirnov, and/or Anderson-Darling tests, or other appropriate statistical methods.
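The selection of an indicator variable can be sketched with Pearson correlation over invented measurements. All variable names and data values below are hypothetical; a full analysis would also test statistical significance and fit candidate distributions as described above.

```python
# Sketch of choosing an indicator variable: compute Pearson r between each
# candidate external (independent) measurement and an internal (dependent)
# one, then pick the most strongly correlated. Data values are invented.
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical external measurements (mm) for five subjects.
external = {
    "head_length":       [165, 170, 175, 180, 185],
    "tragion_to_menton": [90, 96, 92, 101, 99],
}
# Hypothetical internal (dependent) measurement (mm) for the same subjects.
epiglottis_width = [14.0, 15.1, 15.9, 17.2, 18.0]

# Indicator variable = external feature with the strongest correlation.
best = max(external,
           key=lambda k: abs(pearson_r(external[k], epiglottis_width)))
```

With these invented numbers, head length tracks the internal measurement almost linearly and would be chosen as the anatomical marker.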
Once the best-fitting distribution is determined, a relative position (e.g., the nth percentile in a specific age group) of the indicator variable may signify the location of each subject in terms of measurements of all dependent variables. Additionally, a single MRI or CT scan may be used in multiple age categories (e.g., the same scan may represent a 25th percentile in the age group of 3 years to 4 years and a 50th percentile in the age group of 2 years to 3 years). Statistical characterization of the dimensions and/or material properties of a passageway enables the storing of two or more statistically characterized virtual passageways in a database for later retrieval by a user.
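Reading a subject's relative position off a fitted distribution might look like the following sketch, assuming a normal fit with invented parameters (the method permits whichever distribution the goodness-of-fit tests select):

```python
# Sketch of locating a subject's percentile from a fitted distribution.
# Assumes a normal fit with invented parameters; the method allows any
# best-fitting distribution chosen by goodness-of-fit tests.
from statistics import NormalDist

# Hypothetical fit: head length for one age group, mean 178 mm, s.d. 8 mm.
head_length_3_to_4 = NormalDist(mu=178.0, sigma=8.0)

def percentile(dist, measurement):
    """Percentile rank (0-100) of a measurement under the fitted model."""
    return 100.0 * dist.cdf(measurement)
```

A measurement at the fitted mean sits at the 50th percentile; the same raw scan could map to different percentiles under the fits for different age groups, as the text notes.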
FIG. 1B shows an apparatus implementing the invention. The apparatus may include a display device 115, a computer 120, databases 125, 130, and 135, a haptic input/output device 140, a keyboard 145, a mouse 150, and (optionally) a scanner 165. The display device 115 may be a flat-panel plasma display, a cathode-ray-tube display, or other display device. The display device 115 displays a two-dimensional or three-dimensional image of the virtual object 105 and the virtual body part 100 that includes a virtual passageway 110. An example of a type of image that may be displayed is the two-dimensional image depicted in FIG. 1A. Illustratively, the haptic input/output device 140 may be, but is not limited to, a PHANToM® arm produced by SensAble Technologies, Inc. of Woburn, Mass. Alternatively, the haptic input/output device 140 may be custom manufactured.
The computer 120 includes a processor (not shown) connected to a memory (not shown) by a central bus (not shown). The central bus also connects the processor to a network 195, such as, but not limited to, the Internet, a local-area network, or a wide-area network. The central bus may also connect the processor to one or more peripheral devices such as the haptic input/output device 140, keyboard 145, mouse 150, wireless antenna 155, disc drive 160, and (optionally) a scanner 165.
A wired or wireless communications channel 170 conveys signals between the processor and the haptic input/output device 140 so that a user-initiated movement of the haptic input/output device causes a corresponding movement of a cursor on the display device. The cursor may be used to cause a virtual object 105 to interact with the virtual passageway 110. Additionally, the communications channel 170 also conveys signals that generate a tactile feedback force in the haptic input/output device 140 so that a user can feel the interaction of the virtual object with the virtual passageway 110 (or its included haptic layer). This may allow the user, for example, to determine when an object may be irrevocably lodged within the virtual passageway. In an implementation of the invention, the signals passed between the computer processor and the haptic input/output device occur in real time, or substantially in real time, so that movement of the virtual object 105 and/or deformations in the virtual passageway appear smooth and continuous.
Signals exchanged between the processor and the keyboard 145 and/or mouse 150 are transmitted over wireless communications channel 175. Similarly, signals exchanged between the processor and the scanner 165 or other peripheral device are routed over communications channel 180.
One or more of the databases 125, 130, and 135 may be stored within the computer's memory, or stored at a remote location that is accessible over the network 195. Data may be written to and read from the databases 125, 130, and 135 over a wired or wireless communications channel 185. Data may be input to one or more of the databases 125, 130, and 135 using the keyboard 145, the disc drive 160, the scanner 165, or another peripheral device (such as, but not limited to, an MRI, CT, or other medical device). These databases may include custom software applications that mathematically detect collisions between the virtual object and the human anatomy, and solve the dynamic and static equations of motion that ultimately determine the interaction forces, stresses, and strains transferred haptically between the user, the virtual product, and the virtual passageway. These software applications may use a number of methods for creating the mathematical simulations, including modified forms of general finite element methods.
In one embodiment, database 125 may store one or more assembled virtual objects 105. Database 130 may store MRI, CT, and other data used to assemble the virtual objects, virtual passageways, reference layers, haptic layers, and anatomical features associated with the virtual passageway. Database 135 may store a set of one or more anatomically accurate virtual passageways 110 that may include associated tissues, related reference layers, and related haptic layers. As discussed, each virtual passageway 110 may have anatomical characteristics that correspond to children or adults in different age groups. Additionally, each virtual passageway 110 may correspond to a different size percentile within one or more of the age groups.
FIG. 2 is a screenshot illustrating three MRI scans 201, 205, and 210 that may be used by embodiments of the invention to obtain internal dimensions, external dimensions, and placement of various anatomical features, such as a nasal pharynx, an oral cavity, an oral pharynx, a trachea, a hypopharynx, and an esophagus. For example, the MRI scans 201, 205, and 210 represent top, back, and side views of a human head. Although a head is illustratively shown, other body parts may be scanned and used in accordance with the invention. As previously discussed, the data obtained from the MRI scans may be combined with data provided by CT scans and/or other radiological scans in order to capture the bone or skeletal features of the subject.
FIGS. 3A, 3B, and 3C are screen shots illustrating a method of forming a virtual three-dimensional model of a human anatomical feature, e.g., the head 100, that includes one or more reference layers 305, 310, 315, 320, 325, and 330. For purposes of illustration, it will be assumed that the head 100 shown in FIGS. 3A-3C corresponds to the head shown in the MRI scans 201, 205, and 210 of FIG. 2. To form the three-dimensional view of the head 100 shown in FIG. 3A and/or a two-dimensional view of the head 100 shown in FIG. 3B, data obtained from the MRI scans may be imported into a software program that provides an approximate volumetric representation of the head 100 using a polygonal mesh that may serve as the basis for a virtual spring-mass damper model or other model incorporating accurate material properties.
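By way of illustration only, the manner in which a closed polygonal mesh carries volumetric information about scanned anatomy may be sketched as follows. The helper `mesh_volume` and its data are hypothetical examples, not part of the disclosed software; the sketch uses the standard signed-tetrahedron (divergence theorem) method for a consistently wound triangle mesh.

```python
def mesh_volume(vertices, triangles):
    # Sum signed volumes of tetrahedra formed by the origin and each
    # triangle; for a closed, consistently wound mesh this yields the
    # enclosed volume (divergence theorem).
    volume = 0.0
    for i, j, k in triangles:
        (ax, ay, az) = vertices[i]
        (bx, by, bz) = vertices[j]
        (cx, cy, cz) = vertices[k]
        # Signed volume of tetrahedron (origin, A, B, C): det/6.
        volume += (ax * (by * cz - bz * cy)
                   - ay * (bx * cz - bz * cx)
                   + az * (bx * cy - by * cx)) / 6.0
    return abs(volume)

# A unit right tetrahedron encloses a volume of 1/6.
verts = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
faces = [(0, 2, 1), (0, 1, 3), (0, 3, 2), (1, 2, 3)]
vol = mesh_volume(verts, faces)
```

A real volumetric representation would also attach material properties to the mesh elements, as described above for the spring-mass damper model.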
After being filtered, the images may be imported into a sculpting or other geometric software program and edited to create the right side skull reference layer 305, left side skull reference layer 310, right side mandible reference layer 315, and left side mandible reference layer 320. These layers represent the bone structure of the head 100, and may be toggled on or off in order to insert the virtual object into a particular location within the virtual passageway and/or to enable the user to more clearly see the interaction of the virtual object with the virtual passageway. Additionally, one or more areas of each reference layer may be imbued with one or more material properties of real bone (e.g., density, friction, stiffness, etc.) so that a force exerted on the haptic layer of a passageway also interacts with one or more forces exerted by the underlying bone structure (e.g., reference layers 305, 310, 315, and 320). In other embodiments, other reference layers may be imbued with the material properties of their corresponding real counterparts.
Referring to FIG. 3B, a side view of the head 100 of FIG. 3A is shown. A profile, cut-away view such as this one is preferable because it allows a user of the invention to see the significant portions of the virtual object and the virtual passageway. Additionally, this view shows a hyoid reference layer 316, constructed in a similar manner to the reference layers previously described.
FIG. 3C is a three-dimensional view of the head 100 of FIGS. 3A and 3B showing the implementation of skin reference layers 325 and 330. In the view shown in FIG. 3A, these skin layers are toggled off to show the underlying bone reference layers. Like the bone reference layers, the skin reference layers 325 and 330 may include one or more material properties of real skin for a particular age group. Data used to form the skin layers 325 and 330 may be obtained from the MRI and CT scans 201, 205, and 210 of FIG. 2, obtained via experimental testing of real skin, and/or obtained from existing reference sources.
FIG. 4 is a screenshot showing four representative views 401, 405, 410, and 415 of a model of a hypopharynx, which defines the air space used in the haptic modeling system. In particular, view 401 illustrates a color-coded top-down view of the model of the hypopharynx. View 405 is a three-dimensional perspective view, which may also be color-coded. View 410 is a two-dimensional front view of the model of the hypopharynx. And view 415 is a two-dimensional side view of the model of the hypopharynx.
In an implementation, the views 401, 405, 410, and 415 may be assembled in the same or similar manner as the reference layers shown in FIGS. 3A, 3B, and 3C. For example, the data used to create the model of the hypopharynx shown in FIG. 4 may be obtained from MRI and CT scans of a real hypopharynx, from experimental testing of a real hypopharynx, or from pre-existing reference sources. This data may be imported into a computer software program that recreates an approximate volumetric model of the real hypopharynx using a polygonal mesh. This volumetric model may be saved as a .stl file and exported to a sculpting program for editing and fine-tuning. Additionally, the virtual model of the tissues surrounding the hypopharynx may be imbued with one or more material properties of the real hypopharynx.
FIG. 5 is a screenshot illustrating one embodiment of an interface 500 provided by an implementation of the invention. As shown, a drop down menu 505 may be selected by moving a cursor over the toolbar 501 and selecting the "Tool" heading. Thereafter, a particular shape of the object may be selected from the drop down menu 510. In this manner, the user may select from a plurality of objects, each having different material qualities. In a similar manner, the interface 500 may further include drop down menus that enable a user to select one of a plurality of pre-assembled virtual passageways or other pre-assembled anatomical entities, such as a lung, an ear canal, an intestine, a lumen, or other anatomical entity.
The data indexed by the drop down menus 505 and 510 may be stored in, and retrieved from, one or more of the databases 125, 130, and 135 that were previously shown in FIG. 1B. Using data stored in the databases 125, 130, and 135 in combination with the drop down menus 505 and 510, a user may select and/or modify the shape, size, and material properties of a virtual object 105, virtual passageway 110, and/or one or more reference layers. Additionally, drop down menus 505 and 510 may be used to toggle the haptic layer and/or reference layer(s) on and off.
FIG. 6A is a three-dimensional virtual view of a human internal organ 600 and passageways 605, 610, and 615. As shown, one or more reference layers in the lower section of passageway 615 are toggled off to enable viewing of a virtual object 105 that is placed within the passageway 615. In this illustrative embodiment, the virtual organ 600 represents a uterus, virtual passageway 610 represents an ileum, and virtual passageway 605 represents a tuba uterina. However, other organs and passageways may also be modeled.
The virtual organ 600 is shown covered with a non-anatomical reference layer 625 formed of point-masses 630 interconnected by spring-like connectors 635. In this manner, the virtual organ 600, or another anatomical entity, may be represented by a mass-spring system, a finite element method, or a deformable model of the type described in "Evaluation and Visualization of Stress and Strain on Soft Biological Tissues in Contact," by Sofiane Sarni, et al., Proceedings of International Conference on Shape Modeling and Applications, Los Alamitos, Calif.: IEEE Computer Society Press, 2004, Virtual Reality Lab, Swiss Federal Institute of Technology, which is herein incorporated by reference in its entirety.
Similarly, the virtual object 105 may also be represented by a spring-mass damper system having spring-like connectors. By using this technology, a computer processor may calculate the collision between the virtual object 105 and the haptic layer, as well as the magnitude of a force (or forces) caused by the interaction of the virtual object 105 with the virtual passageway. However, the invention may be implemented using methods, techniques, algorithms, and formulas different than those disclosed in the above-mentioned reference.
In an implementation, the spring-like connectors 635 each have spring and damper constants that simulate a particular material property of a biological tissue that forms a real organ or passageway. Illustratively, the material properties that may be simulated include, but are not limited to, friction, texture, stiffness, vibration of the spring-mass, Young's modulus of elasticity, density, inertia, and other properties. For virtual organs or virtual passageways, the spring and damper constant values of these various material properties may be obtained, for example, as described in Sarni et al.
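The force contributed by a single spring-like connector may be sketched, in one dimension, as a standard spring-damper law. This is a minimal illustration of the kind of computation described above; the function name and constant values are illustrative, not taken from the disclosed implementation.

```python
def connector_force(x_a, x_b, v_a, v_b, rest_length, k, c):
    # Force on point-mass A from a 1-D spring-like connector to
    # point-mass B. k (spring constant) and c (damper constant) stand
    # in for a simulated tissue material property.
    stretch = (x_b - x_a) - rest_length
    spring = k * stretch            # Hooke's-law restoring term
    damper = c * (v_b - v_a)        # damping resists relative motion
    return spring + damper

# A connector stretched 1 unit past rest, with no relative motion,
# pulls mass A toward B with force k * stretch.
f = connector_force(x_a=0.0, x_b=2.0, v_a=0.0, v_b=0.0,
                    rest_length=1.0, k=10.0, c=0.5)
```

A full model sums such forces over every connector attached to each point-mass 630 and integrates the resulting equations of motion.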
Alternatively, one or more of the material property values (for a virtual object 105, the virtual passageways 605, 610, and 615, and the virtual organ 600) may be arbitrary values. Depending on the embodiment, the virtual object 105 may be rigid or deformable. Similarly, the virtual passageways 605, 610, and 615, and the virtual organ 600, may each be rigid or deformable.
FIG. 6B is a screen shot of an exemplary interface 601 used in an embodiment of the invention to adjust one or more spring-constant values in a spring-mass model of a virtual object, a virtual passageway (and associated tissues), or other anatomical feature. As shown, the interface 601 may include a display window 640 in which a virtual model 645 is displayed. The interface 601 may include one or more menu areas 650, 655, 660, and 665. The virtual model 645 may be represented in either two-dimensional or three-dimensional form. Additionally, the virtual model 645 may represent a virtual object, a virtual passageway, or another virtual anatomical feature.
The data indexed by the menus 650, 655, 660, and 665 may be stored in, and retrieved from, one or more of the databases 125, 130, and 135 that were previously shown in FIG. 1B. Using data stored in the databases 125, 130, and 135 in combination with the menus 650, 655, 660, and 665, a user may select and/or modify the shape, size, and material properties of a virtual object 105, virtual passageway, and/or one or more reference layers. Additionally, menus 650, 655, 660, and 665 may be used to toggle the haptic layer and/or reference layer(s) on and off. The interface 601 may further include a menu that permits a user to adjust the consistency, viscoelasticity, friction, etc., of the haptic layer that lines a complex inner surface of the passageway 615.
In an implementation of the invention, menu 650 presents a list of variable rules that may be used to govern the behavior of all or parts of the virtual model 645. These rules may be used, for example, to implement elasticity, viscosity, and other values used to model a virtual object and/or a virtual passageway. The values of such variables may be obtained from known reference sources.
It should be further understood that many different ranges may be provided for the invention, which may be accomplished in one embodiment using one or more shifting scales or other interactive mechanisms shown on the display device. Additionally, a menu may be provided that permits a user to adjust the number of point masses 630 and the number of spring-like connectors 635 by manually inputting the desired number for each, or by selecting a number for each from a menu displayed on the display device.
The menu 665 may be manipulated using keyboard or mouse commands to adjust the types of algorithms used to calculate the deformations and/or contact force(s) generated during an interaction of the virtual object 105 with the virtual passageway 615 (and, optionally, any surrounding virtual organs, virtual passageways, or other virtual anatomical entities that surround the virtual passageway 615). The menu 665 may also be used to select the type of algorithms used to calculate the numeric values of the Elasticity, Friction, Friction Force, Shiftability, Viscosity, and spring-constant properties. Illustratively, one or more fuzzy logic sets may be used to perform these, and other, calculations.
After conducting a hazard evaluation, a different model (that may include a different set of Reference and Haptic layers which correspond to a different set of MRI and CT scans) may be selected, using the menus 650, 655, 660, and 665, to re-evaluate one or more hazards that the virtual object 105 poses for a different population group. This provides flexibility to the system, and further allows comparison between different models.
FIGS. 7A-7D are schematic diagrams that illustrate, by analogy, how a virtual object "VO" interacts with a virtual passageway, generally denoted as "VP". In this illustration, the virtual object may represent any virtual object, such as, for example, the virtual object 105 shown in FIG. 1A or FIG. 6A. Additionally, the virtual passageways illustrated in FIGS. 7A-7D may represent any virtual passageway such as that shown, for example, in FIG. 1A or FIG. 6A.
In FIG. 7A, the virtual object VO is positioned outside the virtual passageway VP1. In FIG. 7B, a virtual force F(t) is imparted to the virtual object VO to move it into the deformable passageway VP1. The virtual force F(t) results from a user-initiated movement of the haptic input/output device. This user-initiated movement may be tracked by a collision detection routine that continuously looks for virtual collisions between the product and the virtual model of a human passageway. As shown in FIG. 7C, movement of the virtual object VO along the virtual passageway VP causes a collision and resulting deformations 712 to occur in the sidewalls of the virtual passageway VP1. FIGS. 7C and 7D further illustrate that the virtual object VO is likely to lodge at (and/or obstruct or partially obstruct) the juncture between the virtual passageway VP1 and the virtual passageway VP2. This is because the diameter of the virtual passageway VP2 is smaller than the diameter of the virtual object VO, and VP2 does not possess the same elasticity as virtual passageway VP1. Additionally, the virtual passageway VP2 has a higher rigidity than the virtual object VO, such that the virtual object VO cannot pass therethrough.
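The geometric intuition behind lodging at the VP1/VP2 juncture can be sketched as a per-segment test: an object lodges where it is wider than the segment's tissue can elastically stretch. This is an illustrative simplification of the full deformable model, with hypothetical names and values.

```python
def lodges_at(object_diameter, segment_diameter, max_expansion):
    # The object lodges in a segment if its diameter exceeds the
    # segment's resting diameter times the fractional elastic
    # expansion its tissue allows (0.0 for a rigid segment).
    # All values here are illustrative, not anatomical data.
    return object_diameter > segment_diameter * (1.0 + max_expansion)

# VP1 is wide and compliant; VP2 is narrow and stiff. The object
# clears VP1 but lodges at the VP1/VP2 junction, as in FIGS. 7C-7D.
passes_vp1 = not lodges_at(9.0, segment_diameter=10.0, max_expansion=0.30)
stuck_vp2 = lodges_at(9.0, segment_diameter=6.0, max_expansion=0.05)
```

The disclosed system replaces this single comparison with full contact-force and deformation calculations, but the pass/lodge outcome it reports is of the same kind.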
In an embodiment, a haptic layer (not shown) is positioned to correspond to the complex inner surfaces of passageway VP1, and optionally VP2, such that the virtual object VO cannot pass beyond the haptic layer. This constrains the virtual object VO within the interior of at least the passageway VP1. Additionally, contact between the virtual object VO and the haptic layer generates a tactile feedback force in the haptic input/output device that is felt by the user. Depending on the material properties modeled by both the virtual object VO and the virtual passageway VP1, the tactile feedback force may complement or resist user-initiated movements of the haptic input/output device. Additionally, the haptic input/output device may temporarily freeze in position when the virtual object VO reaches a point where it is likely to lodge within the virtual passageway VP1 or the virtual passageway VP2.
From FIGS. 7A-7D, it may be seen that the virtual model provides at least one or more of the following data:
    • the penetration depth of the virtual object VO;
    • whether the virtual object will lodge within the virtual passageway VP1;
    • the vector force history F(t) input required to generate the interaction; and
    • the stresses and strains introduced to all the components at any particular time during the interaction.
FIGS. 8-11 show several embodiments of methods implementing the invention. It will be appreciated that the methods disclosed may include more or fewer steps, and that the steps shown and described may be performed in any convenient order. FIGS. 8-11 may equally represent high-level block diagrams of components of the invention implementing the steps thereof. The steps of FIGS. 8-11 may be implemented on computer program code in combination with the appropriate hardware. This computer program code may be stored on storage media such as a diskette, hard disk, CD-ROM, DVD-ROM, or tape, as well as a memory storage device or collection of memory storage devices such as read-only memory (ROM) or random access memory (RAM). Additionally, the computer program code may be transferred to a workstation over the Internet or some other type of network.
FIG. 8 is a flowchart of an exemplary method according to one embodiment of the invention. In this method, using the cursor to manipulate the virtual object, the user may place the virtual object at any location within the virtual passageway at step 801. The virtual object may then interact with a virtual passageway and any related tissues, muscles, and the like. This interaction generates computer signals that cause the haptic input/output device to deliver a tactile feedback force, which may be felt by the user to represent the feel of the virtual object moving along the virtual passageway. At step 805, the magnitude of a contact force exerted between the virtual object and the virtual passageway may be calculated by the computer processor. Optionally, this step may represent a tactile feedback force sensed by the user.
Based on the magnitude of the contact force and (optionally) the degree of occlusion, a hazard associated with the virtual object is assessed at step 810 by the user and/or, optionally, the computer processor. For example, the computer processor may feed the determined value of the contact force to a comparator for comparison with a previously determined threshold value, at or above which the virtual object is likely to lodge within the passageway, and below which the virtual object is not likely to lodge within the passageway. This may then be translated into a scale of hazard. Similarly, the computer may determine the degree of occlusion associated with a lodged object and calculate a degree of hazard associated with such occlusion and/or display the degree of occlusion, enabling the user to assess the hazard. This may be based on a percentage of occlusion of the virtual passageway by the virtual object. In one example, 100% occlusion is considered a high hazard, and 10% occlusion is considered a low hazard.
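The comparator logic of step 810 might be sketched as follows. The threshold comparison and the high/low occlusion endpoints come from the description above; the intermediate band boundaries and hazard labels are illustrative assumptions.

```python
def assess_hazard(contact_force, lodge_threshold, occlusion_pct):
    # Below the experimentally determined threshold force the object
    # is unlikely to lodge; at or above it, the degree of occlusion
    # grades the hazard. The 90%/50% boundaries are illustrative.
    if contact_force < lodge_threshold:
        return "low"
    if occlusion_pct >= 90:
        return "high"       # near-total blockage of the passageway
    if occlusion_pct >= 50:
        return "moderate"
    return "elevated"       # lodged, but the passageway is mostly open
```

In the full system the contact force itself is produced by the physical simulation rather than supplied directly.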
FIG. 9 is a flowchart of another exemplary method according to one embodiment of the invention. At step 901, a virtual object and a virtual passageway are displayed on a display device. In one implementation, the virtual passageway is selected from a set of previously compiled, statistically characterized anatomical passageways. At step 905, a haptic input/output device coupled to a cursor displayed on the display device is used to move the virtual object within, or to a certain location within, the virtual passageway. At step 910, computer signals representative of the contact force between the virtual object and the haptic layer lining the interior surfaces of the virtual passageway are output to generate a tactile force. The tactile force simulates the feel of the virtual object moving along the virtual passageway to enable assessment of a hazard associated with the virtual object.
FIG. 10 is a flowchart of yet another exemplary method according to one embodiment of the invention. In this method, MRI, CT, and/or other anatomical or radiological data is obtained at step 1001. This data is used to create one or more statistically characterized virtual passageways at step 1005, which are used to model an interaction with a real object (step 1010). The virtual object and the virtual passageway are both displayed on a display device at step 1015. At step 1020, the virtual object is manipulated in response to input received from a haptic input/output device. In response to contact between the virtual object and the virtual passageway, signals are output to the haptic input/output device that cause the device to simulate a contact force exerted between the virtual object and the virtual passageway to enable assessment of a hazard associated with the virtual object at step 1025. This simulated force may then be used by the user to assess a hazard of lodgment of the object at a certain location of the passageway.
FIG. 11 is a flowchart of yet another exemplary method according to one embodiment of the invention. In this method, data is imported from magnetic resonance imaging and computed tomography scans at step 1101. A combination of MRI and CT scans may be used in combination with other statistical data or studies. At step 1105, an anatomically correct feature is created using the data. A virtual passageway, such as a trachea, for example, is formed within the feature at step 1110. The virtual passageway includes one or more internal measurements that correspond to a real passageway, and may include one of a rigid haptic layer or a deformable haptic layer formed along an inner surface of the virtual passageway (step 1115). At step 1120, a virtual object modeled after a real object is formed. The virtual object may be rigid or deformable. At step 1125, the virtual object is placed within the virtual passageway, either via computer simulation or via signals generated from a user-initiated force applied to the haptic input/output device. The magnitude of the force exerted between the virtual object and the virtual passageway (or haptic layer) is determined at step 1130, and a hazard associated with the virtual object is assessed at step 1135, either by the user feeling the tactile force feedback and/or, optionally, by the computer processor.
Method of Using the Invention
In use, the computer processor transforms raw data obtained from magnetic resonance imaging (MRI) scans and computed tomography (CT) scans, as well as other data input by a user, alone or in combination, into two-dimensional or three-dimensional, anatomically correct models of various parts of the human body. The computer processor also transforms additional input data into a two-dimensional or three-dimensional model of a manufactured, or yet to be manufactured, object to be assessed for hazard, which includes the exact dimensions and material properties of a real object.
These geometric and material property virtual models may be transferred into a mathematical representation of the solid models using various forms of general finite element formulations. This mathematical model has the ability to numerically determine stress and strain in the model as a result of input loads. The connectivity of this virtual model to the user feedback device enables it to act as a haptic model. This virtual haptic model receives vector forces provided by the user through the haptic device. Those vector forces may be input as loads to the above-mentioned numerical model. The model also includes a collision detection routine which tracks the position of the product and compares it with the position of the human anatomy part of the virtual model. When collisions are detected, the model calculates the resulting stresses and strains based on the user input force vectors.
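One pass of the collision-tracking loop described above may be sketched as follows. The callables `detect_penetration` and `solve_reaction` are hypothetical placeholders for the collision detection routine and the finite-element stress/strain solve; the toy geometry in the usage line is illustrative only.

```python
def haptic_step(object_pos, user_force, detect_penetration, solve_reaction):
    # detect_penetration(pos) returns how far the product has entered
    # the anatomy model (0 if there is no contact).
    depth = detect_penetration(object_pos)
    if depth > 0.0:
        # Collision: feed the user's input force as a load to the
        # numerical model and return the reaction for the haptic device.
        return solve_reaction(depth, user_force)
    return 0.0  # free motion: no feedback force

# Toy geometry: a wall at x = 0.5 with a linear-stiffness reaction.
feedback = haptic_step(0.8, user_force=2.0,
                       detect_penetration=lambda x: max(0.0, x - 0.5),
                       solve_reaction=lambda d, f: 100.0 * d + f)
```

A real implementation runs this loop continuously so that the feedback force tracks the user's motion in real-time.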
The computer processor may calculate one or more forces caused by the interaction of the virtual object with the virtual passageway. Illustratively, calculations may be performed using methods including, but not limited to, general finite element methods, simplifications of finite element methods, implicit surface techniques, feature-based modeling and deformation techniques, geometric algorithms, subdivision surface techniques, mesh processing techniques, point-based modeling techniques, and interactive modeling techniques.
Once the magnitudes of the force (or forces) are calculated, the processor may output signals that cause the magnitude(s) of the one or more forces to be indicated on the display device. The processor may also output signals that cause the haptic input/output device to generate a tactile feedback force so the user can feel the interaction of the virtual object with the virtual passageway. The processor may further output signals that cause one or more areas of the virtual passageway to deform and/or to change color in proportion to a magnitude of a force exerted between the haptic layer and the virtual object.
In an implementation, the processor may analyze the intensities of a force or forces generated by the interaction to determine whether the virtual object poses a hazard to the virtual passageway. For example, through experimentation, a threshold magnitude of a contact force may be established for a particular object/passageway combination, at and above which the object will obstruct or partially obstruct the passageway, and below which the object will not obstruct or partially obstruct the passageway.
In use, the computer processor may cause a comparator to compare the magnitude of a virtual contact force (e.g., an estimation of a force caused by the interaction of the virtual object with the virtual passageway) with the magnitude of the predetermined threshold force. If the magnitude of the virtual contact force is equal to or greater than the magnitude of the predetermined threshold force, the computer processor may output signals that cause a warning to be displayed on the display device. Additionally, the computer may provide signals that cause the haptic input/output device to temporarily lock in a fixed position, and/or signals that cause an area of the virtual passageway where the virtual object will likely lodge to highlight or change color. Similarly, if the comparator determines that the magnitude of the virtual contact force is less than the magnitude of the predetermined threshold force, the computer processor may output signals that cause an indication of the magnitude of the virtual contact force to be displayed on the display device.
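The comparator's output signals described above can be summarized in a short sketch. The dictionary keys naming the output signals are illustrative assumptions, not identifiers from the disclosed system.

```python
def comparator_response(contact_force, threshold):
    # At or above the threshold: warn, optionally lock the haptic
    # device, and highlight the likely lodgment site. Below it:
    # simply display the contact-force magnitude.
    if contact_force >= threshold:
        return {"warning": True, "lock_device": True, "highlight_site": True}
    return {"warning": False, "display_force": contact_force}
```

This mirrors the two branches of the comparison: a lodgment warning path and a force-display path.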
Rigid Virtual Object,
Rigid Virtual Passageway/Haptic Layer
In this embodiment, all virtual features, including the virtual object, the virtual passageway, and the haptic layer (not shown), are treated as perfectly rigid, non-deformable objects. On the display device, the only visible change is a motion of the virtual object in response to signals input to the computer from the haptic input/output device. In this embodiment, collisions between the virtual object and the haptic layer felt through the haptic input/output device are always rigid.
This embodiment enables a determination of whether the virtual object may be inserted within and moved along the virtual passageway, where the virtual object might stop or lodge if it is not capable of traversing the virtual passageway, and (optionally) the degree of occlusion or blockage caused by the object. However, this embodiment does not provide information on deformations of the virtual object or the virtual passageway, and may not provide information on a magnitude of a force required to lodge or dislodge the virtual object.
The dimensions of the haptic layer used in this embodiment may correspond to the normal interior dimensions of a nondeformed passageway, or to a set of intermediate dimensions. In the latter case, the virtual object may be seen to pass through one or more reference layers, and felt to stop as it contacts the haptic layer.
Rigid Virtual Object,
Deformable Virtual Passageway
In this embodiment, the virtual object is created to be much stiffer than any tissues comprising an actual passageway. Consequently, the actual material properties of the virtual object may be ignored as it is considered to be rigid and non-deformable. By contrast, the virtual passageway is made to deform per one or more material properties of its component tissues when contacted by the virtual object.
In this embodiment, a false color may be provided to one or more tissues of the virtual passageway to enable clearer viewing of the motion of these tissues as the virtual object is inserted into and travels along the virtual passageway. In this embodiment, the virtual model provides a realistic approximation for many objects known to be choking, aspiration, blocking, or ingestion hazards. By providing haptic force feedback, this embodiment enables determination of where a virtual object is likely to lodge within the virtual passageway. It also enables approximation of the forces that are likely to be associated with choking, aspiration, ingestion, or related injuries.
Deformable Virtual Small Object,
Rigid Virtual Passageway
In this embodiment, the virtual object is constructed to have a consistency that is less stiff than that of the tissues comprising an actual passageway. Thus, the actual material properties of the virtual passageway are ignored, and the virtual passageway is assumed to be perfectly rigid, while the virtual object is given the particular material properties corresponding to an actual object that is being evaluated for hazard. It will be appreciated that this embodiment provides a realistic approximation of very soft objects, such as those made of soft foam.
Deformable Virtual Object,
Deformable Virtual Passageway
In this embodiment, both the virtual object and the virtual passageway have realistic material properties. Thus, both the virtual object and the virtual passageway may be seen to deform as the virtual object is inserted into and/or moved along the virtual passageway. Additionally, the haptic force feedback generated by the haptic input/output device corresponds to the forces exerted between the virtual object and the virtual passageway. This embodiment is particularly useful in enabling determination of how deformations of the virtual object and/or the virtual passageway may interact to restrict a flow of air or fluid.
In an implementation, a two or three-dimensional image of the virtual object and the virtual passageway is continuously updated in real-time so that motions of the virtual object and any deformations appear to be smooth and continuous. Additionally, the force feedback provided through the haptic input/output device is updated in real-time so that all motion is felt to be smooth and continuous.
The foregoing description of one or more embodiments has been presented for purposes of illustration and description, and is not intended to be exhaustive or to limit the invention to the precise form or methods disclosed. Rather, it is intended that the scope of the invention not be limited by the specification, but be defined by the claims set forth below.

Claims (38)

1. A system for simulating hazard assessment in humans for display, the system comprising:
at least one database comprising a plurality of pre-assembled virtual objects, a plurality of virtual passageways, and a plurality of rigid haptic layers, each of the plurality of pre-assembled virtual objects being modeled from a physical object, each of the plurality of virtual passageways having dimensional information modeled from a biological anatomical region, and each of the rigid haptic layers for detecting a force being applied to a corresponding virtual passageway;
a user interface to:
generate the plurality of pre-assembled virtual objects and the plurality of virtual passageways for display;
receive a first input selecting a particular virtual object from the plurality of pre-assembled virtual objects;
receive a second input selecting a particular virtual passageway from the plurality of virtual passageways;
generate the particular virtual object for display;
generate the particular virtual passageway for display, the virtual passageway comprising a corresponding rigid haptic layer retrieved from the at least one database;
generate a first reference layer that corresponds to a different anatomical entity for display, the first reference layer for detecting a first resistance;
generate a second reference layer that corresponds to a second different anatomical entity, the second reference layer for detecting a second resistance; and
receive a third input controlling movement of the particular virtual object through the particular passageway; and
a processor to:
calculate the magnitude of a force generated by interaction between the virtual object and the corresponding rigid haptic layer, the first resistance detected at the first reference layer, and the second resistance detected at the second reference layer; and
generate hazard data for display based on the calculated magnitude of the force, and display the interaction of the virtual object with the virtual passageway in one of a two-dimensional or three-dimensional view.
23. A method for simulating hazard assessment in humans for display, the method comprising:
storing in at least one database a plurality of pre-assembled virtual objects, a plurality of virtual passageways, and a plurality of rigid haptic layers, each of the plurality of pre-assembled virtual objects being modeled from a physical object, each of the plurality of virtual passageways having dimensional information modeled from a biological anatomical region, and each of the rigid haptic layers for detecting a force being applied to a corresponding virtual passageway;
generating, at a user interface, the plurality of pre-assembled virtual objects and the plurality of virtual passageways for display;
receiving, at a user interface, a first input selecting a particular virtual object from the plurality of pre-assembled virtual objects;
receiving, at a user interface, a second input selecting a particular virtual passageway from the plurality of virtual passageways;
generating, at a user interface, the particular virtual object for display;
generating, at a user interface, the particular virtual passageway comprising a corresponding rigid haptic layer for display;
generating, at a user interface, a first reference layer that corresponds to a first different anatomical entity for display, the first reference layer for detecting a first resistance;
generating, at a user interface, a second reference layer that corresponds to a second different anatomical entity, the second reference layer for detecting a second resistance;
receiving, at a user interface, a third input controlling movement of the particular virtual object through the particular passageway;
calculating, at a processor, the magnitude of the force generated by interaction between the virtual object and the corresponding rigid haptic layer, the first resistance detected at the first reference layer, and the second resistance detected at the second reference layer; and
generating, at the processor, hazard data for display based on the calculated force.
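Claim 23 recites the same pipeline as a sequence of steps: move the object through the passageway, compute the contact force, and generate hazard data from the calculated force. A minimal sketch of that loop is below; the per-step penetration depths, the stiffness, and the hazard threshold are illustrative assumptions only, since the claim leaves the hazard criterion unspecified.

```python
def simulate_insertion(depths, stiffness=800.0, hazard_threshold=5.0):
    """Step a virtual object through a passageway one sample at a time.

    `depths` holds the penetration depth (m) into the rigid haptic layer
    at each step. Each step's contact force uses an assumed penalty model,
    and steps whose force exceeds an assumed threshold are flagged as
    hazard data.
    """
    forces = [stiffness * max(d, 0.0) for d in depths]
    hazards = [f > hazard_threshold for f in forces]  # per-step hazard flags
    return forces, hazards


# Example trajectory: free movement, light contact, then deep contact.
forces, hazards = simulate_insertion([0.0, 0.004, 0.010])
```

In this reading, "generating hazard data based on the calculated force" reduces to thresholding (or otherwise classifying) the force profile produced as the object traverses the passageway.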
US10/941,0882003-09-162004-09-15Haptic response system and method of useExpired - Fee RelatedUS8276091B2 (en)

Priority Applications (2)

Application NumberPriority DateFiling DateTitle
US10/941,088US8276091B2 (en)2003-09-162004-09-15Haptic response system and method of use
US13/540,210US20120278711A1 (en)2003-09-162012-07-02Haptic response system and method of use

Applications Claiming Priority (2)

Application NumberPriority DateFiling DateTitle
US50298303P2003-09-162003-09-16
US10/941,088US8276091B2 (en)2003-09-162004-09-15Haptic response system and method of use

Related Child Applications (1)

Application NumberTitlePriority DateFiling Date
US13/540,210ContinuationUS20120278711A1 (en)2003-09-162012-07-02Haptic response system and method of use

Publications (2)

Publication NumberPublication Date
US20050093847A1 US20050093847A1 (en)2005-05-05
US8276091B2true US8276091B2 (en)2012-09-25

Family

ID=34193385

Family Applications (2)

Application NumberTitlePriority DateFiling Date
US10/941,088Expired - Fee RelatedUS8276091B2 (en)2003-09-162004-09-15Haptic response system and method of use
US13/540,210AbandonedUS20120278711A1 (en)2003-09-162012-07-02Haptic response system and method of use

Family Applications After (1)

Application NumberTitlePriority DateFiling Date
US13/540,210AbandonedUS20120278711A1 (en)2003-09-162012-07-02Haptic response system and method of use

Country Status (2)

CountryLink
US (2)US8276091B2 (en)
EP (1)EP1517225A3 (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US20120166996A1 (en)*2010-12-232012-06-28Glockner Group LlcAnesthesia recordation device
US20120182291A1 (en)*2011-01-182012-07-19Rishi RawatComputer based system and method for medical symptoms analysis, visualization and social network
US20150140535A1 (en)*2012-05-252015-05-21Surgical Theater LLCHybrid image/scene renderer with hands free control
US20160162023A1 (en)*2014-12-052016-06-09International Business Machines CorporationVisually enhanced tactile feedback
US10241495B2 (en)2013-12-252019-03-26Industrial Technology Research InstituteApparatus and method for providing feedback force and machine tool system
LU101235B1 (en)2019-05-282020-12-01Ferrero Trading Lux S AApparatus and method for determining the patency of a conduit at least partially occluded by an object

Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
JP2005332157A (en)*2004-05-192005-12-02Alps Electric Co LtdHaptic force application type input device
US20080297475A1 (en)*2005-08-022008-12-04Woolf Tod MInput Device Having Multifunctional Keys
CN101310249B (en)*2005-11-142012-06-27伊梅森公司 System and method for editing a model of a physical system for simulation
KR101384434B1 (en)*2006-04-062014-04-10임머숀 코퍼레이션Systems and methods for enhanced haptic effects, and recording medium
US9430042B2 (en)*2006-12-272016-08-30Immersion CorporationVirtual detents through vibrotactile feedback
US7925068B2 (en)*2007-02-012011-04-12General Electric CompanyMethod and apparatus for forming a guide image for an ultrasound image scanner
US8167813B2 (en)*2007-05-172012-05-01Immersion Medical, Inc.Systems and methods for locating a blood vessel
US8156809B2 (en)*2008-03-272012-04-17Immersion CorporationSystems and methods for resonance detection
US20130100042A1 (en)*2011-10-212013-04-25Robert H. KincaidTouch screen implemented control panel
US9547366B2 (en)2013-03-142017-01-17Immersion CorporationSystems and methods for haptic and gesture-driven paper simulation
WO2015116056A1 (en)*2014-01-292015-08-06Hewlett-Packard Development Company, L.P.Force feedback
US10123846B2 (en)*2014-11-132018-11-13Intuitive Surgical Operations, Inc.User-interface control using master controller
CN118370610A (en)2014-11-132024-07-23直观外科手术操作公司Interaction between user interface and master controller
CN106650237B (en)*2016-11-162019-07-02南京信息工程大学 A virtual flexible body surgery simulation system supporting haptic feedback
US20190231430A1 (en)*2018-01-312019-08-01Varian Medical Systems International AgFeedback system and method for treatment planning
US11416065B1 (en)*2019-11-082022-08-16Meta Platforms Technologies, LlcSynthesizing haptic and sonic feedback for textured materials in interactive virtual environments
CN111047937A (en)*2019-12-142020-04-21上海工程技术大学 A surgical training system based on magnetorheological fluid
CN112989449B (en)*2021-03-262023-08-15温州大学 A tactile force feedback simulation interaction method and device for motion stiffness optimization
CN113191541A (en)*2021-04-262021-07-30万航宇宙智能工程有限公司Method for carrying out efficient cargo transportation activities based on utilization of low-altitude airspace

Citations (33)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US3704529A (en)*1970-07-131972-12-05Forrest J CioppaTraining and instruction device for performing cricothyroidotomy
US4850876A (en)*1985-11-261989-07-25Raionnoe Energeticheskoe Upravlenie "Irkutskenergo"Training device for practicing emergency life saving techniques
US5174283A (en)*1989-11-081992-12-29Parker Jeffrey DBlind orolaryngeal and oroesophageal guiding and aiming device
US5471399A (en)*1991-08-281995-11-28Hitachi, Ltd.Network management system and network status display method
WO1996028800A1 (en)1995-03-101996-09-19High Techsplanations, Inc.Computer based medical procedure simulation system
US5588914A (en)*1994-06-281996-12-31The Walt Disney CompanyMethod and system for guiding a user in a virtual reality presentation
US5625128A (en)*1992-09-111997-04-29The Regents Of The University Of MichiganNon-human animal model of a human airway
US5922018A (en)*1992-12-211999-07-13Artann CorporationMethod for using a transrectal probe to mechanically image the prostate gland
US6049622A (en)*1996-12-052000-04-11Mayo Foundation For Medical Education And ResearchGraphic navigational guides for accurate image orientation and navigation
US6069632A (en)*1997-07-032000-05-30International Business Machines CorporationPassageway properties: customizable protocols for entry and exit of places
US6125375A (en)*1991-12-062000-09-26Lucent Technologies, Inc.Apparatus for visualizing program slices
US6191796B1 (en)*1998-01-212001-02-20Sensable Technologies, Inc.Method and apparatus for generating and interfacing with rigid and deformable surfaces in a haptic virtual reality environment
US6192329B1 (en)*1998-08-122001-02-20Risk Analysis & ManagementMethod and apparatus for assessing risks of injury
US6225999B1 (en)*1996-12-312001-05-01Cisco Technology, Inc.Customizable user interface for network navigation and management
US6327618B1 (en)*1998-12-032001-12-04Cisco Technology, Inc.Recognizing and processing conflicts in network management policies
US6345112B1 (en)*1997-08-192002-02-05The United States Of America As Represented By The Department Of Health And Human ServicesMethod for segmenting medical images and detecting surface anomalies in anatomical structures
WO2002070980A1 (en)2001-03-062002-09-12The Johns Hopkins University School Of MedicineSimulation system for image-guided medical procedures
US20020143276A1 (en)*2000-06-282002-10-03Ernst Maurice M.Working model of the intra oral cavity
US20030016850A1 (en)*2001-07-172003-01-23Leon KaufmanSystems and graphical user interface for analyzing body images
US20030103077A1 (en)*2001-12-032003-06-05Lucent Technologies Inc.Method and apparatus for managing and representing elements in a network
US20030179249A1 (en)*2002-02-122003-09-25Frank SauerUser interface for three-dimensional data sets
US20030197734A1 (en)*2002-04-192003-10-23Binkert Christoph A.Graphic user interface for a stent-graft planning process
US6705319B1 (en)*2000-05-262004-03-16Purdue Research FoundationMiniature acoustical guidance and monitoring system for tube or catheter placement
US6714901B1 (en)*1997-11-192004-03-30Inria Institut National De Recherche En Informatique Et En AutomatiqueElectronic device for processing image-data, for simulating the behaviour of a deformable object
US20040223636A1 (en)*1999-11-192004-11-11Edic Peter MichaelFeature quantification from multidimensional image data
US20040257532A1 (en)*2003-06-132004-12-23Carole MoquinPassageway with virtual reality environment
US7061467B2 (en)*1993-07-162006-06-13Immersion CorporationForce feedback device with microprocessor receiving low level commands
US7084868B2 (en)*2000-04-262006-08-01University Of Louisville Research Foundation, Inc.System and method for 3-D digital reconstruction of an oral cavity from a sequence of 2-D images
US7134093B2 (en)*2001-04-182006-11-07International Business Machines CorporationGraphical user interface for direct control of display of data
US20070038080A1 (en)*1998-12-082007-02-15Intuitive Surgical Inc.Devices and methods for presenting and regulating auxiliary information on an image display of a telesurgical system to assist an operator in performing a surgical procedure
US7269348B1 (en)*2002-11-182007-09-11At&T Corp.Router having dual propagation paths for packets
US7409647B2 (en)*2000-09-192008-08-05Technion Research & Development Foundation Ltd.Control of interactions within virtual environments
US20090192975A1 (en)*2002-07-172009-07-30Equine Biomechanics And Exercise Physiology, Inc.Echocardiographic Measurements As Predictors Of Racing Success

Family Cites Families (70)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US3802096A (en)*1971-08-091974-04-09H MaternComposite model for medical study
US5041108A (en)*1981-12-111991-08-20Pillco Limited PartnershipMethod for laser treatment of body lumens
US4736306A (en)*1985-04-291988-04-05The United States Of America As Represented By The United States Department Of EnergySystem for conversion between the boundary representation model and a constructive solid geometry model of an object
US5315512A (en)*1989-09-011994-05-24Montefiore Medical CenterApparatus and method for generating image representations of a body utilizing an ultrasonic imaging subsystem and a three-dimensional digitizer subsystem
US5061187A (en)*1990-04-121991-10-29Ravinder JerathUltrasound training apparatus
US5078736A (en)*1990-05-041992-01-07Interventional Thermodynamics, Inc.Method and apparatus for maintaining patency in the body passages
JPH0716488B2 (en)*1991-11-271995-03-01アロカ株式会社 Ultrasonic 3D image display
DE69332042T2 (en)*1992-12-182003-01-02Koninklijke Philips Electronics N.V., Eindhoven Delayed positioning of relatively elastically deformed spatial images by matching surfaces
WO1994024631A1 (en)*1993-04-201994-10-27General Electric CompanyComputer graphic and live video system for enhancing visualisation of body structures during surgery
US5563988A (en)*1994-08-011996-10-08Massachusetts Institute Of TechnologyMethod and system for facilitating wireless, full-body, real-time user interaction with a digitally represented visual environment
NO943696D0 (en)*1994-10-041994-10-04Vingmed Sound As Method of ultrasound imaging
US5782762A (en)*1994-10-271998-07-21Wake Forest UniversityMethod and system for producing interactive, three-dimensional renderings of selected body organs having hollow lumens to enable simulated movement through the lumen
US6694163B1 (en)*1994-10-272004-02-17Wake Forest University Health SciencesMethod and system for producing interactive, three-dimensional renderings of selected body organs having hollow lumens to enable simulated movement through the lumen
US5561749A (en)*1994-12-021996-10-01General Electric CompanyModeling of surfaces employing polygon strips
US6702736B2 (en)*1995-07-242004-03-09David T. ChenAnatomical visualization system
US5776050A (en)*1995-07-241998-07-07Medical Media SystemsAnatomical visualization system
US6256529B1 (en)*1995-07-262001-07-03Burdette Medical Systems, Inc.Virtual reality 3D visualization for surgical procedures
EP0864145A4 (en)*1995-11-301998-12-16Virtual Technologies IncTactile feedback man-machine interface device
US5956484A (en)*1995-12-131999-09-21Immersion CorporationMethod and apparatus for providing force feedback over a computer network
US6028593A (en)*1995-12-012000-02-22Immersion CorporationMethod and apparatus for providing simulated physical interactions within computer generated environments
US8508469B1 (en)*1995-12-012013-08-13Immersion CorporationNetworked applications including haptic feedback
SG64340A1 (en)*1996-02-271999-04-27Inst Of Systems Science NationCurved surgical instruments and methods of mapping a curved path for stereotactic surgery
USRE40176E1 (en)*1996-05-152008-03-25Northwestern UniversityApparatus and method for planning a stereotactic surgical procedure using coordinated fluoroscopy
US6167296A (en)*1996-06-282000-12-26The Board Of Trustees Of The Leland Stanford Junior UniversityMethod for volumetric image navigation
US6929481B1 (en)*1996-09-042005-08-16Immersion Medical, Inc.Interface device and method for interfacing instruments to medical procedure simulation systems
US6331116B1 (en)*1996-09-162001-12-18The Research Foundation Of State University Of New YorkSystem and method for performing a three-dimensional virtual segmentation and examination
US6343936B1 (en)*1996-09-162002-02-05The Research Foundation Of State University Of New YorkSystem and method for performing a three-dimensional virtual examination, navigation and visualization
US7194117B2 (en)*1999-06-292007-03-20The Research Foundation Of State University Of New YorkSystem and method for performing a three-dimensional virtual examination of objects, such as internal organs
JPH10111958A (en)*1996-10-041998-04-28Olympus Optical Co LtdSimulation system using computer graphics and model representing method of simulation system
CN2280205Y (en)*1996-10-251998-04-29北京有色金属研究总院Internal heating type pipeline heater
CN2301218Y (en)*1997-01-301998-12-23乔建民Intracorporal membrane lung
US6346940B1 (en)*1997-02-272002-02-12Kabushiki Kaisha ToshibaVirtualized endoscope system
JP3391466B2 (en)*1997-06-132003-03-31アースロケア コーポレイション Electrosurgical catheter system and catheter for recanalization of occluded body lumen
EP1079730B1 (en)*1997-11-242007-01-03Computerized Medical Systems, Inc.Real time brachytherapy spatial registration and visualization system
US6470302B1 (en)*1998-01-282002-10-22Immersion Medical, Inc.Interface device and method for interfacing instruments to vascular access simulation systems
US6810281B2 (en)*2000-12-212004-10-26Endovia Medical, Inc.Medical mapping system
US6440138B1 (en)*1998-04-062002-08-27Kyphon Inc.Structures and methods for creating cavities in interior body regions
US5946370A (en)*1998-04-151999-08-31International Business Machines CorporationSystem and method for accessing the three-dimensional geometry of large objects using X-ray based method subject to limitations on radiation doses
US6950689B1 (en)*1998-08-032005-09-27Boston Scientific Scimed, Inc.Dynamically alterable three-dimensional graphical model of a body region
CA2352671A1 (en)*1998-11-252000-06-08Wake Forest UniversityVirtual endoscopy with improved image segmentation and lesion detection
WO2000041134A1 (en)*1999-01-042000-07-13Koninklijke Philips Electronics N.V.Method, system and apparatus for processing an image representing a tubular structure and for constructing a path through said structure
JP3212287B2 (en)*1999-03-012001-09-25富士通株式会社 Object cross-section display device and method, and program recording medium
US6466815B1 (en)*1999-03-302002-10-15Olympus Optical Co., Ltd.Navigation apparatus and surgical operation image acquisition/display apparatus using the same
DE19916978C1 (en)*1999-04-152001-04-26Bock Orthopaed Ind Body area measurement method
US6301495B1 (en)*1999-04-272001-10-09International Business Machines CorporationSystem and method for intra-operative, image-based, interactive verification of a pre-operative surgical plan
DE19922279A1 (en)*1999-05-112000-11-16Friedrich Schiller Uni Jena Bu Procedure for generating patient-specific implants
WO2000078259A1 (en)*1999-06-222000-12-28Research Development FoundationEnhanced wound coverage to enhance wound healing
JP3760793B2 (en)*2000-05-222006-03-29株式会社豊田中央研究所 Human body model creation method, program, and recording medium
US7353151B2 (en)*2000-05-222008-04-01Kabushiki Kaisha Toyota Chuo KenkyushoMethod and system for analyzing behavior of whole human body by simulation using whole human body
WO2001093745A2 (en)*2000-06-062001-12-13The Research Foundation Of State University Of New YorkComputer aided visualization, fusion and treatment planning
US8909325B2 (en)*2000-08-212014-12-09Biosensors International Group, Ltd.Radioactive emission detector equipped with a position tracking system and utilization thereof with medical systems and in medical procedures
US8565860B2 (en)*2000-08-212013-10-22Biosensors International Group, Ltd.Radioactive emission detector equipped with a position tracking system
US8489176B1 (en)*2000-08-212013-07-16Spectrum Dynamics LlcRadioactive emission detector equipped with a position tracking system and utilization thereof with medical systems and in medical procedures
AU728749B3 (en)*2000-11-142001-01-181St Share Pty LtdDiagnostic imaging simulator
JP4395689B2 (en)*2001-02-092010-01-13コニカミノルタホールディングス株式会社 Image data processing method and modeling apparatus
JP4170096B2 (en)*2001-03-292008-10-22コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Image processing apparatus for evaluating the suitability of a 3D mesh model mapped on a 3D surface of an object
US6780368B2 (en)*2001-04-102004-08-24Nanotek Instruments, Inc.Layer manufacturing of a multi-material or multi-color 3-D object using electrostatic imaging and lamination
US7327862B2 (en)*2001-04-302008-02-05Chase Medical, L.P.System and method for facilitating cardiac intervention
US7079674B2 (en)*2001-05-172006-07-18Siemens Corporate Research, Inc.Variational approach for the segmentation of the left ventricle in MR cardiac images
DE10148341A1 (en)*2001-09-292003-04-24Friedhelm Brassel Process for the production of a model system for vascular malformations
TW200304608A (en)*2002-03-062003-10-01Z Kat IncSystem and method for using a haptic device in combination with a computer-assisted surgery system
US7046835B2 (en)*2002-03-072006-05-16Ge Medical Systems Global Technology Company LlcMethod and system for processing vascular radiographic images which have been reconstructed by three-dimensional modelling
US6792071B2 (en)*2002-03-272004-09-14Agfa-GevaertMethod of performing geometric measurements on digital radiological images
US6746401B2 (en)*2002-05-062004-06-08Scimed Life Systems, Inc.Tissue ablation visualization
US20030220556A1 (en)*2002-05-202003-11-27Vespro Ltd.Method, system and device for tissue characterization
JP4138371B2 (en)*2002-06-062008-08-27富士フイルム株式会社 Anatomical feature position detecting device, recording medium, subject structure measuring device, and recording medium
CN2565463Y (en)*2002-08-132003-08-13陈阳瑞Ultrasonic probe
US7182602B2 (en)*2002-09-102007-02-27The University Of Vermont And State Agricultural CollegeWhole-body mathematical model for simulating intracranial pressure dynamics
US7283652B2 (en)*2002-11-272007-10-16General Electric CompanyMethod and system for measuring disease relevant tissue changes
US8963914B2 (en)*2011-01-182015-02-24Rishi RawatComputer based system and method for medical symptoms analysis, visualization and social network

Patent Citations (39)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US3704529A (en)*1970-07-131972-12-05Forrest J CioppaTraining and instruction device for performing cricothyroidotomy
US4850876A (en)*1985-11-261989-07-25Raionnoe Energeticheskoe Upravlenie "Irkutskenergo"Training device for practicing emergency life saving techniques
US5174283A (en)*1989-11-081992-12-29Parker Jeffrey DBlind orolaryngeal and oroesophageal guiding and aiming device
US5339805A (en)*1989-11-081994-08-23Parker Jeffrey DBlind orolaryngeal and oroesophageal guiding and aiming device
US5471399A (en)*1991-08-281995-11-28Hitachi, Ltd.Network management system and network status display method
US6125375A (en)*1991-12-062000-09-26Lucent Technologies, Inc.Apparatus for visualizing program slices
US5625128A (en)*1992-09-111997-04-29The Regents Of The University Of MichiganNon-human animal model of a human airway
US5922018A (en)*1992-12-211999-07-13Artann CorporationMethod for using a transrectal probe to mechanically image the prostate gland
US7061467B2 (en)*1993-07-162006-06-13Immersion CorporationForce feedback device with microprocessor receiving low level commands
US5588914A (en)*1994-06-281996-12-31The Walt Disney CompanyMethod and system for guiding a user in a virtual reality presentation
WO1996028800A1 (en)1995-03-101996-09-19High Techsplanations, Inc.Computer based medical procedure simulation system
US6049622A (en)*1996-12-052000-04-11Mayo Foundation For Medical Education And ResearchGraphic navigational guides for accurate image orientation and navigation
US6225999B1 (en)*1996-12-312001-05-01Cisco Technology, Inc.Customizable user interface for network navigation and management
US6069632A (en)*1997-07-032000-05-30International Business Machines CorporationPassageway properties: customizable protocols for entry and exit of places
US6345112B1 (en)*1997-08-192002-02-05The United States Of America As Represented By The Department Of Health And Human ServicesMethod for segmenting medical images and detecting surface anomalies in anatomical structures
US6556696B1 (en)*1997-08-192003-04-29The United States Of America As Represented By The Department Of Health And Human ServicesMethod for segmenting medical images and detecting surface anomalies in anatomical structures
US6714901B1 (en)*1997-11-192004-03-30Inria Institut National De Recherche En Informatique Et En AutomatiqueElectronic device for processing image-data, for simulating the behaviour of a deformable object
US6191796B1 (en)*1998-01-212001-02-20Sensable Technologies, Inc.Method and apparatus for generating and interfacing with rigid and deformable surfaces in a haptic virtual reality environment
US6192329B1 (en)*1998-08-122001-02-20Risk Analysis & ManagementMethod and apparatus for assessing risks of injury
US6327618B1 (en)*1998-12-032001-12-04Cisco Technology, Inc.Recognizing and processing conflicts in network management policies
US20070038080A1 (en)*1998-12-082007-02-15Intuitive Surgical Inc.Devices and methods for presenting and regulating auxiliary information on an image display of a telesurgical system to assist an operator in performing a surgical procedure
US7333648B2 (en)*1999-11-192008-02-19General Electric CompanyFeature quantification from multidimensional image data
US20040223636A1 (en)*1999-11-192004-11-11Edic Peter MichaelFeature quantification from multidimensional image data
US7084868B2 (en)*2000-04-262006-08-01University Of Louisville Research Foundation, Inc.System and method for 3-D digital reconstruction of an oral cavity from a sequence of 2-D images
US6705319B1 (en)*2000-05-262004-03-16Purdue Research FoundationMiniature acoustical guidance and monitoring system for tube or catheter placement
US20020143276A1 (en)*2000-06-282002-10-03Ernst Maurice M.Working model of the intra oral cavity
US7409647B2 (en)*2000-09-192008-08-05Technion Research & Development Foundation Ltd.Control of interactions within virtual environments
WO2002070980A1 (en)2001-03-062002-09-12The Johns Hopkins University School Of MedicineSimulation system for image-guided medical procedures
US7134093B2 (en)*2001-04-182006-11-07International Business Machines CorporationGraphical user interface for direct control of display of data
US20070019849A1 (en)*2001-07-172007-01-25Acculmage Diagnostics CorpSystems and graphical user interface for analyzing body images
US20030016850A1 (en)*2001-07-172003-01-23Leon KaufmanSystems and graphical user interface for analyzing body images
US7130457B2 (en)*2001-07-172006-10-31Accuimage Diagnostics Corp.Systems and graphical user interface for analyzing body images
US20030103077A1 (en)*2001-12-032003-06-05Lucent Technologies Inc.Method and apparatus for managing and representing elements in a network
US20030179249A1 (en)*2002-02-122003-09-25Frank SauerUser interface for three-dimensional data sets
US20030197734A1 (en)*2002-04-192003-10-23Binkert Christoph A.Graphic user interface for a stent-graft planning process
US20090192975A1 (en)*2002-07-172009-07-30Equine Biomechanics And Exercise Physiology, Inc.Echocardiographic Measurements As Predictors Of Racing Success
US7269348B1 (en)*2002-11-182007-09-11At&T Corp.Router having dual propagation paths for packets
US6834966B1 (en)*2003-06-132004-12-28Carole MoquinPassageway with virtual reality environment
US20040257532A1 (en)*2003-06-132004-12-23Carole MoquinPassageway with virtual reality environment

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
Ask Search, http://www.ask.com/web?q=virtual+passageway+haptic&qsrc=0&o=0&l=dir&oo=0.*
Ask Search, http://www.ask.com/web?qsrc=1&o=0&l=dir&q=anatomical+model+haptic&oo=0.*
Nürnberger, Andreas, et al. "Determination of Elastodynamic Model Parameters Using a Recurrent Neuro-Fuzzy System." (May 18, 1999). Institute of Knowledge Processing and Language Engineering, University of Magdeburg, Germany. All pages; <http://fuzzy.cs.uni-magdeburg.de/publications/NueRadKru99.pdf>.
Sofiane Sarni, et al., "Evaluation and Visualization of Stress and Strain on Soft Biological Tissues in Contact" (2004). Virtual Reality Lab (VRlab), Swiss Federal Institute of Technology (EPFL), Lausanne, Switzerland. pp. 1-8; <http://vrlab.epfl.ch/~amaciel/docs/sarni-marciel-boulic-thalmam-smi04.pdf>.

Cited By (10)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US20120166996A1 (en)*2010-12-232012-06-28Glockner Group LlcAnesthesia recordation device
US20120182291A1 (en)*2011-01-182012-07-19Rishi RawatComputer based system and method for medical symptoms analysis, visualization and social network
US8963914B2 (en)*2011-01-182015-02-24Rishi RawatComputer based system and method for medical symptoms analysis, visualization and social network
US20150140535A1 (en)*2012-05-252015-05-21Surgical Theater LLCHybrid image/scene renderer with hands free control
US10056012B2 (en)*2012-05-252018-08-21Surgical Theatre LLCHybrid image/scene renderer with hands free control
US10241495B2 (en)2013-12-252019-03-26Industrial Technology Research InstituteApparatus and method for providing feedback force and machine tool system
US20160162023A1 (en)*2014-12-052016-06-09International Business Machines CorporationVisually enhanced tactile feedback
US9971406B2 (en)*2014-12-052018-05-15International Business Machines CorporationVisually enhanced tactile feedback
US10055020B2 (en)2014-12-052018-08-21International Business Machines CorporationVisually enhanced tactile feedback
LU101235B1 (en)2019-05-282020-12-01Ferrero Trading Lux S AApparatus and method for determining the patency of a conduit at least partially occluded by an object

Also Published As

Publication numberPublication date
EP1517225A2 (en)2005-03-23
US20050093847A1 (en)2005-05-05
EP1517225A3 (en)2007-01-03
US20120278711A1 (en)2012-11-01
EP1517225A8 (en)2005-06-29

Similar Documents

PublicationPublication DateTitle
US20120278711A1 (en)Haptic response system and method of use
SarvazyanMechanical imaging:: A new technology for medical diagnostics
Morris et al.Visuohaptic simulation of bone surgery for training and evaluation
Sutherland et al.An augmented reality haptic training simulator for spinal needle procedures
US9940714B2 (en)Image analyzing device, image analyzing method, and computer program product
US20220137593A1 (en)Method for fabricating a physical simulation device, simulation device and simulation system
KR101522690B1 (en)3d visuo-haptic display system and method based on perception for skin diagnosis
Ni et al.A virtual reality simulator for ultrasound-guided biopsy training
Echegaray et al.A brain surgery simulator
De et al.Physically realistic virtual surgery using the point-associated finite field (PAFF) approach
Choi et al.An efficient and scalable deformable model for virtual reality-based medical applications
CN118917976A (en)First-aid effect analysis method and system based on dummy
CN119867928B (en)Virtual reality-based minimally invasive surgery simulation method and system
Müller et al.The virtual reality arthroscopy training simulator
JP3996628B2 (en) A computer simulation model for measuring damage to the human central nervous system.
KR100551201B1 (en) Dental training and evaluation system using haptic interface based on volume model
JPH1049045A (en)Formation of human body model and device therefor, human body model
Chen et al.Dynamic touch‐enabled virtual palpation
Pflesser et al.Volume based planning and rehearsal of surgical interventions
Vaughan et al.Haptic feedback from human tissues of various stiffness and homogeneity
Choi et al.A heuristic force model for haptic simulation of nasogastric tube insertion using fuzzy logic
ElHelwOverview of Surgical Simulation
Skiadopoulos et al.Simulating the mammographic appearance of circumscribed lesions
ErikssonHaptic Milling Simulation in Six Degrees-of-Freedom: With Application to Surgery in Stiff Tissue
JP6746751B2 (en) Image analysis device, image analysis method, and program

Legal Events

DateCodeTitleDescription
ASAssignment

Owner name:RAM CONSULTING, ILLINOIS

Free format text:ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ALTKORN, ROBERT;CHEN, XIAO;MILKOVICH, SCOTT;AND OTHERS;REEL/FRAME:016136/0938

Effective date:20040924

REMIMaintenance fee reminder mailed
LAPSLapse for failure to pay maintenance fees
STCHInformation on status: patent discontinuation

Free format text:PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FPLapsed due to failure to pay maintenance fee

Effective date:20160925

