US11304757B2 - Orthopedic fixation control and visualization - Google Patents

Orthopedic fixation control and visualization

Info

Publication number
US11304757B2
US11304757B2 (application US16/367,526; US201916367526A)
Authority
US
United States
Prior art keywords
image
fixator
graphical
anatomical structure
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US16/367,526
Other versions
US20200305977A1 (en)
Inventor
Bernd Gutmann
Martin Raabe
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Synthes GmbH
Original Assignee
Synthes GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Synthes GmbH
Priority to US16/367,526
Assigned to INNOMEDIC GMBH — Assignment of assignors interest; Assignors: GUTMANN, BERND; RAABE, MARTIN
Assigned to SYNTHES GMBH — Assignment of assignors interest; Assignor: INNOMEDIC GMBH
Priority to PCT/EP2020/058356 (WO2020193629A1)
Priority to CA3135073A1
Priority to BR112021019062A2
Priority to AU2020245028B2
Priority to JP2021557502A (JP7500601B2)
Priority to CN202080039246.3A (CN113841183B)
Priority to EP20715011.1A (EP3948801A1)
Publication of US20200305977A1
Publication of US11304757B2
Application granted
Legal status: Active (Current)
Anticipated expiration

Abstract

A guided frame matching technique may be employed in which a graphical projection of a fixator is used to guide and assist a user in identifying fixator elements within an image. Upon being overlaid on an image, the graphical projection of the fixator may be manipulated to align with, and overlay, fixator elements displayed within the image. Additionally, a three-dimensional overview of an imaging scene of the fixator may be generated. The three-dimensional overview may include a three-dimensional graphical model that displays representations of images of the fixator attached to anatomical structure segments as well as representations of imaging sources, reference locations, fixator elements, and anatomical structure segments. When the images are not truly orthogonal, a modified second image representation may be displayed to represent a second image that is truly orthogonal to a first image.

Description

BACKGROUND
Techniques used to treat fractures and/or deformities of anatomical structures, such as bones, can include the use of external fixators, such as hexapods and other fixation frames, which are surgically mounted to anatomical structure segments on opposed sides of a fracture site. A pair of radiographic images is taken of the fixator and anatomical structure segments at the fracture site. Data from the images is then manipulated to construct a three-dimensional representation of the fixator and the anatomical structure segments that can be used in developing a treatment plan, which may for example comprise realigning the anatomical structure segments through adjustments to the fixator.
Existing techniques for controlling fixator manipulation may, however, involve a number of limitations that may introduce inefficiency, complication, and unreliability. For example, some conventional techniques may rely on a surgeon or other user to indicate locations of certain fixator elements, such as struts, within images that are displayed in a graphical user interface of a computer. However, it may often be difficult for the user to identify and mark positions of the struts and other fixator elements within the images. In particular, depending upon the location and orientation from which an image is captured, struts and other fixator elements may not be easily identified, such as because they may wholly or partially overlap one another or may otherwise be obscured within the images. This may make it cumbersome for the user to identify the fixator elements, thereby increasing the time required to identify the elements, increasing the probability of errors, and reducing the reliability of the calculations. As another example, some conventional techniques may be limited with respect to the ability to visualize a complete three-dimensional fixator imaging scene for the user. Additionally, when images are non-orthogonal, the user may be unable to efficiently view representations of various corrections, such as corrections to the anatomical structure angulations, calculated by the software corresponding to true orthogonal images. Thus, insufficient feedback may be provided to the user, thereby resulting in additional errors. This may reduce the reliability of the treatment plan, possibly resulting in improper alignment of anatomical structure segments during the healing process, compromised union between the anatomical structure segments, additional rounds of radiographic imaging to facilitate alignment corrections, or even additional surgical procedures.
SUMMARY
Techniques for orthopedic fixation control and visualization, for example for correction of a deformity of an anatomical structure, such as a bone, are described herein. In particular, in some examples, a fixation apparatus may be attached to first and second anatomical structure segments. Images, such as x-rays, of the fixation apparatus and the attached anatomical structure segments may then be captured from different orientations with respect to the fixation apparatus.
In some examples, various manipulations to the fixation apparatus for correction of the anatomical structure deformity may be determined based on positions and orientations of the anatomical structure segments in three-dimensional space. Also, in some examples, the positions and orientations of the anatomical structure segments in three-dimensional space may be determined based on the images. In particular, in some cases, the positions and orientations of the anatomical structure segments in three-dimensional space may be determined by having a surgeon or other user indicate locations of various fixator elements and anatomical structures within the images. However, as described above, it may often be difficult for the user to identify and mark positions of certain fixator elements, such as struts, within the images. In particular, depending upon the location and orientation from which an image is captured, struts and other fixator elements may not be easily identified, such as because they may wholly or partially overlap one another or may otherwise be obscured within the images. For example, in some cases, it may often be more difficult to identify struts within a lateral image than to identify the struts within an anterior image, such as because the struts may often overlap one another when viewed within the lateral image. This may make it cumbersome for the user to identify the fixator elements, thereby increasing the time required to identify the elements, increasing the probability of errors, and reducing the reliability of the calculations.
In some examples, to alleviate the above and other problems, a guided frame matching technique may be employed in which a graphical projection of the fixator is used to guide and assist the user in identifying fixator elements within an image. In particular, locations of fixator elements, such as rings and/or struts of the fixator, may be identified by a user within a first image, for example an anterior image or other image in which the fixator elements are easily identifiable. A graphical projection of the fixator, such as including graphical representations of the rings and/or struts, may then be generated based at least in part on the identified locations of the fixator elements in the first image. The graphical projection of the fixator may then be overlaid upon a second image, such as a lateral image or other image in which the fixator elements are not easily identifiable. The graphical projection of the fixator may then be used within the second image as a guide, such as to assist the user in identifying locations of the fixator elements within the second image. In some examples, the graphical projection of the fixator may be rotated relative to the locations of the fixator elements in the first image. In particular, the graphical projection of the fixator may be rotated based at least in part on an angle of image planes of the first and the second images with respect to one another. For example, if the first image is an anterior image, and the second image is a lateral image at an angle of ninety degrees to the anterior image, then the graphical projection of the fixator, upon being overlaid onto the second image, may be rotated ninety degrees relative to the identified locations of the fixator elements in the first image. The graphical projection of the fixator may also be modified within the second image by a user, for example by moving, resizing and/or further rotating the graphical projection of the fixator, such as to better align with the locations of the fixator elements in the second image. Upon being overlaid on the second image, the graphical projection of the fixator may serve as a guide to assist the user in identifying fixator elements within the second image, such as by manipulating the graphical projection of the fixator to align with, and overlay, fixator elements displayed within the second image.
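By way of illustration only, the following Python sketch shows one plausible way to seed the overlaid graphical projection: fixator element locations identified in the first image are rotated by the angle between the two image planes to produce starting positions in the second image, which the user may then move, resize, or further rotate. The function name, coordinate conventions, and the assumed common depth are illustrative assumptions, not the patented implementation.

```python
import numpy as np

def initial_overlay_points(first_image_points_px, plane_angle_deg, assumed_depth=0.0):
    """Rotate fixator element locations identified in the first image to seed
    the graphical projection overlaid on the second image.

    first_image_points_px: (N, 2) hinge/strut endpoint locations (pixels)
    plane_angle_deg: angle between the first and second image planes (e.g. 90)
    assumed_depth: depth coordinate assumed for every point (unknown from one view)
    """
    pts = np.asarray(first_image_points_px, dtype=float)
    # Lift the 2D image points to 3D using a single assumed depth for all points.
    pts_3d = np.column_stack([pts[:, 0], pts[:, 1], np.full(len(pts), assumed_depth)])

    theta = np.radians(plane_angle_deg)
    # Rotation about the image's vertical (y) axis by the angle between image planes.
    rot_y = np.array([
        [np.cos(theta), 0.0, np.sin(theta)],
        [0.0,           1.0, 0.0],
        [-np.sin(theta), 0.0, np.cos(theta)],
    ])
    rotated = pts_3d @ rot_y.T

    # Orthographic drop back to 2D gives the starting overlay, which the user can
    # then move, resize, or rotate to match the fixator in the second image.
    return rotated[:, :2]

# Example: a lateral view at 90 degrees to an anteroposterior view.
overlay = initial_overlay_points([[120, 340], [180, 520]], plane_angle_deg=90)
```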
Another technique for improving accuracy and reliability of input values and resulting calculations is disclosed herein that provides a three-dimensional overview of an imaging scene of the fixator. The three-dimensional overview may be used to provide feedback and visual confirmation to help ensure that the calculated positions and orientations of anatomical structures are reliable and correct. In particular, the three-dimensional overview may include a three-dimensional graphical model that is displayed in a graphical user interface of a computing system. The three-dimensional graphical model may include a first image representation that represents a first image and may include a second image representation that represents a second image. The first image representation and the second image representation may be displayed at an angle that is the same as the angle between the image planes of the first and second images. The three-dimensional model may also include graphical representations of virtual locations corresponding to respective first and second imaging sources of the first and the second image. A first virtual line may connect a first virtual location corresponding to the first imaging source to a reference location on the first image representation. A second virtual line may connect a second virtual location corresponding to the second imaging source to the same reference location on the second image representation. In some examples, the reference location may be a reference point of the first or the second anatomical structure segment, such as an endpoint of the anatomical structure segment. The three-dimensional graphical model may display a first graphical representation associated with a shortest distance (e.g., intersection) between the first virtual line and the second virtual line, and this first graphical representation may represent a physical location of the reference location in three-dimensional space. The three-dimensional graphical model may also display graphical representations of fixator elements (rings, struts, etc.) at virtual locations that represent physical locations of the fixator elements in the three-dimensional space. The three-dimensional graphical model may also display graphical representations of the anatomical structure segments at virtual locations that represent physical locations of the anatomical structure segments in the three-dimensional space. The three-dimensional graphical model may be zoomed, panned, and rotated in any combination of one or more directions by a user.
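As an illustrative sketch of the geometry described above (not the patented implementation), the reference location in three-dimensional space can be approximated from the two virtual lines by finding the closest points between them and taking the midpoint of the shortest connecting segment; the returned gap length gives a rough consistency check on the user's inputs. Function and parameter names here are assumptions.

```python
import numpy as np

def triangulate_reference_point(source1, ref_on_image1, source2, ref_on_image2):
    """Approximate the 3D reference location as the midpoint of the shortest
    segment between two virtual lines (imaging source -> reference point on image).

    All arguments are 3D points (array-like of length 3).
    """
    p1, p2 = np.asarray(source1, float), np.asarray(source2, float)
    d1 = np.asarray(ref_on_image1, float) - p1   # direction of first virtual line
    d2 = np.asarray(ref_on_image2, float) - p2   # direction of second virtual line

    # Solve for the line parameters (t1, t2) minimizing the distance between
    # p1 + t1*d1 and p2 + t2*d2 (standard closest-points-of-two-lines solution).
    r = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ r, d2 @ r
    denom = a * c - b * b
    if np.isclose(denom, 0.0):
        # Lines are (nearly) parallel; fix t1 and take the closest point on line 2.
        t1 = 0.0
        t2 = e / c
    else:
        t1 = (b * e - c * d) / denom
        t2 = (a * e - b * d) / denom

    closest1 = p1 + t1 * d1
    closest2 = p2 + t2 * d2
    # Midpoint approximates the physical reference location; the gap indicates
    # how consistently the two views were annotated.
    return (closest1 + closest2) / 2.0, np.linalg.norm(closest1 - closest2)
```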
In some examples, when the first and the second image planes are non-orthogonal, the three-dimensional graphical model may display a modified second image representation that is orthogonal to the first image representation. The modified second image representation may represent a modified second image having an image plane that is truly orthogonal to the image plane of the first image. In some examples, the software may calculate the angle of the anatomical structure segments that would be displayed in the second image if the second image were truly orthogonal to the first image. The software may then display, in the modified second image representation, a modified second image in which the anatomical structure segments are displayed with the calculated angle with respect to one another. In this way, the software may demonstrate to the user how measured anatomical structure deformity values from a second image that is not truly orthogonal to the first image may be adjusted to corrected anatomical structure deformity values for truly orthogonal images, and the software may provide guidance for validating the corrected values for use in the deformity calculations.
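One plausible way to obtain such a corrected angulation, shown purely as an assumption-laden sketch rather than the patent's algorithm, is to project the three-dimensional axes of the proximal and distal segments onto a plane that is truly orthogonal to the first image plane and measure the angle between the projections.

```python
import numpy as np

def orthogonal_view_angulation(proximal_axis_3d, distal_axis_3d, first_plane_normal, up_axis):
    """Angle (degrees) between two anatomical segment axes as they would appear
    in a second image whose plane is truly orthogonal to the first image plane.

    The modified second image plane is taken to contain the first plane's normal
    and the chosen 'up' axis; segment axes are projected onto it before measuring.
    """
    n1 = np.asarray(first_plane_normal, float)
    up = np.asarray(up_axis, float)
    # Normal of the truly orthogonal (modified second) image plane.
    n2 = np.cross(n1, up)
    n2 /= np.linalg.norm(n2)

    def project(v):
        v = np.asarray(v, float)
        v_proj = v - (v @ n2) * n2          # remove the out-of-plane component
        return v_proj / np.linalg.norm(v_proj)

    a, b = project(proximal_axis_3d), project(distal_axis_3d)
    return np.degrees(np.arccos(np.clip(a @ b, -1.0, 1.0)))
```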
BRIEF DESCRIPTION OF THE DRAWINGS
The patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawing(s) will be provided by the Office upon request and payment of the necessary fee.
The foregoing summary, as well as the following detailed description of the preferred embodiments of the application, will be better understood when read in conjunction with the appended drawings. For the purposes of illustrating the methods and/or techniques of orthopedic fixation with imagery analysis, there are shown in the drawings preferred embodiments. It should be understood, however, that the instant application is not limited to the precise arrangements and/or instrumentalities illustrated in the drawings, in which:
FIG. 1 is a perspective view of a fixation assembly positioned for imaging in accordance with an embodiment;
FIG. 2 is a perspective view of an example imaging process of the fixation assembly illustrated in FIG. 1;
FIGS. 3A and 3B are flow diagrams illustrating an example process for controlling manipulation of a fixation apparatus to correct an anatomical structure deformity;
FIG. 4 is a screen shot of an example interface for selecting a Perspective Frame Matching (PFM) technique;
FIG. 5 is a screen shot of an example configuration information entry interface for the PFM technique;
FIG. 6 is a screen shot of an example first image information entry interface for the PFM technique;
FIG. 7 is a screen shot of an example close-up assist interface for the PFM technique;
FIGS. 8A-8H are screen shots of an example second image information entry interface for the PFM technique;
FIG. 9 is a screen shot of an example deformity parameter interface for the PFM technique;
FIG. 10 is a screen shot of an example mounting parameter interface for the PFM technique;
FIG. 11 is a screen shot of a first example treatment plan interface for the PFM technique;
FIG. 12 is a screen shot of a second example treatment plan interface for the PFM technique;
FIG. 13 is a screen shot of a third example treatment plan interface for the PFM technique;
FIG. 14 is a flow diagram illustrating an example process for providing a graphical projection of a fixator using a guided frame matching technique;
FIG. 15A is a diagram illustrating example images of first and second anatomical structure segments and a fixator attached thereto;
FIG. 15B is a diagram illustrating example first and second images of a fixator in which locations of fixator elements are indicated in the first image but not the second image;
FIG. 16 is a diagram illustrating an example graphical projection of a fixator overlaid upon an image;
FIG. 17A is a diagram illustrating an example graphical projection of a fixator that is manipulated by a user;
FIG. 17B is a diagram illustrating example first and second images of a fixator in which locations of fixator elements are indicated in both the first image and the second image;
FIG. 18 is a flow diagram illustrating an example process for generating a three-dimensional overview of an imaging scene of a fixator;
FIG. 19 is a diagram illustrating an example three-dimensional graphical model of the imaging scene of the fixator;
FIG. 20 is a diagram illustrating an example three-dimensional graphical model including graphical representations of fixator rings;
FIG. 21 is a diagram illustrating an example three-dimensional graphical model including graphical representations of fixator struts and anatomical structure segments;
FIG. 22 is a diagram illustrating an example three-dimensional graphical model from an anteroposterior perspective;
FIG. 23 is a diagram illustrating an example three-dimensional graphical model from a lateral perspective;
FIG. 24 is a diagram illustrating an example rotated three-dimensional graphical model;
FIG. 25 is a diagram illustrating an example modified second image representation; and
FIG. 26 is a block diagram of an example computing device for use in accordance with the present disclosure.
DETAILED DESCRIPTION
For convenience, the same or equivalent elements in the various embodiments illustrated in the drawings have been identified with the same reference numerals. Certain terminology is used in the following description for convenience only and is not limiting. The words “right”, “left”, “top” and “bottom” designate directions in the drawings to which reference is made. The words “inward”, “inwardly”, “outward”, and “outwardly” refer to directions toward and away from, respectively, the geometric center of the device and designated parts thereof. The terminology intended to be non-limiting includes the above-listed words, derivatives thereof and words of similar import.
Referring initially to FIG. 1, bodily tissues, for instance first and second anatomical structure segments 102, 104, can be aligned and/or oriented to promote union or other healing between the bodily tissues. Anatomical structures may include, for example, anatomical tissue and artificial anatomical implants. Anatomical tissue may include, for example, bone or other tissue in the body. The alignment and/or orientation of the bodily tissues can be achieved by connecting the bodily tissues to an adjustable fixation apparatus, such as orthopedic fixator 100. The orthopedic fixator can comprise an external fixation apparatus that includes a plurality of discrete fixator members that remain external to the patient's body, but that are attached to respective discrete bodily tissues, for example with minimally invasive attachment members. A fixation apparatus may include, for example, a distraction osteogenesis ring system, a hexapod, or a Taylor spatial frame. By adjusting the spatial positioning of the fixator members with respect to each other, the respective bodily tissues attached thereto can be reoriented and/or otherwise brought into alignment with each other, for example to promote union between the bodily tissues during the healing process. The use of external orthopedic fixators in combination with the imagery analysis and positioning techniques described herein can be advantageous in applications where direct measurement and manipulation of the bodily tissues is not possible, where limited or minimally invasive access to the bodily tissues is desired, or the like. Some examples of orthopedic fixators and their use for correcting deformities of anatomical structure segments, as well as techniques for performing imagery analysis on the fixators and anatomical structure segments, are described in U.S. Pat. No. 9,642,649, entitled "ORTHOPEDIC FIXATION WITH IMAGERY ANALYSIS," issued on May 9, 2017, the entirety of which is hereby incorporated by reference.
The fixator members can be connected to each other via adjustment members, the adjustment members configured to facilitate the spatial repositioning of the fixator members with respect to each other. For example, in the illustrated embodiment, the orthopedic fixator 100 comprises a pair of fixator members in the form of an upper fixator ring 106 and a lower fixator ring 108. The fixator rings 106, 108 can be constructed the same or differently. For instance, the fixator rings 106, 108 can have diameters that are the same or different. Similarly, the fixator rings 106, 108 can be constructed with varying cross sectional diameters, thicknesses, etc. It should be appreciated that the fixator members of the orthopedic fixator 100 are not limited to the illustrated upper and lower fixator rings 106, 108, and that the orthopedic fixator 100 can be alternatively constructed. For example, additional fixator rings can be provided and interconnected with the fixator ring 106 and/or 108. It should further be appreciated that the geometries of the fixator members are not limited to rings, and that at least one, such as all, of the fixator members can be alternatively constructed using any other suitable geometry.
The first and second anatomical structure segments 102, 104 can be rigidly attached to the upper and lower fixator rings 106, 108, respectively, with attachment members that can be mounted to the fixator rings 106, 108. For example, in the illustrated embodiment, attachment members are provided in the form of attachment rods 110 and attachment wires 112.
The rods 110 and the wires 112 extend between proximal ends attached to mounting members 114 that are mounted to the fixator rings 106, 108, and opposed distal ends that are inserted into or otherwise secured to the anatomical structure segments 102, 104. The mounting members 114 can be removably mounted to the fixator rings 106, 108 at predefined points along the peripheries of the fixator rings 106, 108, for example by disposing them into threaded apertures defined by the fixator rings. With respect to each fixator ring 106, 108, the mounting members 114 can be mounted to the upper surface of the ring, the lower surface of the ring, or any combination thereof. It should be appreciated that the attachment members are not limited to the configuration of the illustrated embodiment. For example, any number of attachment members, such as the illustrated rods 110 and wires 112 and any others, can be used to secure the anatomical structure segments to respective fixator members as desired. It should further be appreciated that one or more of the attachment members, for instance the rods 110 and/or wires 112, can be alternatively configured to mount directly to the fixator rings 106, 108, without utilizing mounting members 114.
The upper and lower fixator rings 106, 108 can be connected to each other by at least one, such as a plurality of, adjustment members. At least one, such as all, of the adjustment members can be configured to allow the spatial positioning of the fixator rings with respect to each other to be adjusted. For example, in the illustrated embodiment, the upper and lower fixator rings 106, 108 are connected to each other with a plurality of adjustment members provided in the form of adjustable length struts 116. It should be appreciated that the construction of the orthopedic fixator 100 is not limited to the six struts 116 of the illustrated embodiment, and that more or fewer struts can be used as desired.
Each of the adjustable length struts 116 can comprise opposed upper and lower strut arms 118, 120. The upper and lower strut arms 118, 120 have proximal ends disposed in a coupling member, or sleeve 122, and opposed distal ends that are coupled to universal joints 124 mounted to the upper and lower fixator rings 106, 108, respectively. The universal joints of the illustrated embodiment are disposed in pairs spaced evenly around the peripheries of the upper and lower fixator rings 106, 108, but can be alternatively placed in any other locations on the fixator rings as desired.
The proximal ends of the upper and lower strut arms 118, 120 of each strut 116 can have threads defined thereon that are configured to be received by complementary threads defined in the sleeve 122, such that when the proximal ends of the upper and lower strut arms 118, 120 of a strut 116 are received in a respective sleeve 122, rotation of the sleeve 122 causes the upper and lower strut arms 118, 120 to translate within the sleeve 122, thus causing the strut 116 to be elongated or shortened, depending on the direction of rotation. Thus, the length of each strut 116 can be independently adjusted with respect to the remaining struts. It should be appreciated that the adjustment members are not limited to the length adjustable struts 116 of the illustrated embodiment, and that the adjustment members can be alternatively constructed as desired, for example using one or more alternative geometries, alternative length adjustment mechanisms, and the like.
The adjustable length struts 116 and the universal joints 124 by which they are mounted to the upper and lower fixator rings 106, 108 allow the orthopedic fixator 100 to function much like a Stewart platform, and more specifically like a distraction osteogenesis ring system, a hexapod, or a Taylor spatial frame. That is, by making length adjustments to the struts 116, the spatial positioning of the upper and lower fixator rings 106, 108, and thus the anatomical structure segments 102, 104, can be altered. For example, in the illustrated embodiment the first anatomical structure segment 102 is attached to the upper fixator ring 106 and the second anatomical structure segment 104 is attached to the lower fixator ring 108. It should be appreciated that attachment of the first and second anatomical structure segments 102, 104 to the upper and lower fixator rings 106, 108 is not limited to the illustrated embodiment (e.g., where the central longitudinal axes L1, L2 of the first and second anatomical structure segments 102, 104 are substantially perpendicular to the respective planes of the upper and lower fixator rings 106, 108), and that a surgeon has complete flexibility in aligning the first and second anatomical structure segments 102, 104 within the upper and lower fixator rings 106, 108 when configuring the orthopedic fixator 100.
By varying the length of one or more of the struts 116, the upper and lower fixator rings 106, 108, and thus the anatomical structure segments 102 and 104, can be repositioned with respect to each other such that their respective longitudinal axes L1, L2 are substantially aligned with each other, and such that their respective fractured ends 103, 105 abut each other, so as to promote union during the healing process. It should be appreciated that adjustment of the struts 116 is not limited to the length adjustments as described herein, and that the struts 116 can be differently adjusted as desired. It should further be appreciated that adjusting the positions of the fixator members is not limited to adjusting the lengths of the length adjustable struts 116, and that the positioning of the fixator members with respect to each other can be alternatively adjusted, for example in accordance with the type and/or number of adjustment members connected to the fixation apparatus.
Repositioning of the fixator members of an orthopedic fixation apparatus, such as orthopedic fixator 100, can be used to correct displacements of angulation, translation, rotation, or any combination thereof, within bodily tissues. A fixation apparatus, such as orthopedic fixator 100, utilized with the techniques described herein, can correct a plurality of such displacement defects individually or simultaneously. However, it should be appreciated that the fixation apparatus is not limited to the illustrated orthopedic fixator 100, and that the fixation apparatus can be alternatively constructed as desired. For example, the fixation apparatus can include additional fixation members, can include fixation members having alternative geometries, can include more or fewer adjustment members, can include alternatively constructed adjustment members, or any combination thereof.
Referring now to FIG. 2, an example imaging of a fixation apparatus will now be described in detail. The images can be captured using the same or different imaging techniques. For example, the images can be acquired using x-ray imaging, computed tomography, magnetic resonance imaging, ultrasound, infrared imaging, photography, fluoroscopy, visual spectrum imaging, or any combination thereof.
The images can be captured from any position and/or orientation with respect to each other and with respect to the fixator 100 and the anatomical structure segments 102, 104. In other words, there is no requirement that the captured images be orthogonal with respect to each other or aligned with anatomical axes of the patient, thereby providing a surgeon with near complete flexibility in positioning the imagers 130. Preferably, the images 126, 128 are captured from different directions, or orientations, such that the images do not overlap. For example, in the illustrated embodiment, the image planes of the pair of images 126, 128 are not perpendicular with respect to each other. In other words, the angle α between the image planes of the images 126, 128 is not equal to 90 degrees, such that the images 126, 128 are non-orthogonal with respect to each other. Preferably, at least two images are taken, although capturing additional images may increase the accuracy of the method.
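For illustration, the angle α between the two image planes can be computed from the plane normals (for example, the viewing directions of the two imagers); the sketch below is an assumption about how such a check might be written and is not taken from the patent.

```python
import numpy as np

def image_plane_angle_deg(normal_1, normal_2):
    """Angle (degrees) between two image planes, computed from their plane normals
    (e.g. the viewing directions of the two imagers)."""
    n1 = np.asarray(normal_1, float); n1 /= np.linalg.norm(n1)
    n2 = np.asarray(normal_2, float); n2 /= np.linalg.norm(n2)
    return np.degrees(np.arccos(np.clip(abs(n1 @ n2), 0.0, 1.0)))

# Example: two views 75 degrees apart are accepted; orthogonality is not required.
alpha = image_plane_angle_deg([0, 1, 0], [np.sin(np.radians(75)), np.cos(np.radians(75)), 0])
```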
The images 126, 128 can be captured using one or more imaging sources, or imagers, for instance the x-ray imagers 130 and/or corresponding image capturing devices 127, 129. The images 126, 128 can be x-ray images captured by a single repositionable x-ray imager 130, or can be captured by separately positioned imagers 130. Preferably, the positions of the image capturing devices 127, 129 and/or the imagers 130 with respect to the space origin 135 of the three-dimensional space, described in more detail below, are known. The imagers 130 can be manually positioned and/or oriented under the control of a surgeon, automatically positioned, for instance by a software assisted imager, or any combination thereof. The fixator 100 may also have a respective fixator origin 145.
Referring now to FIGS. 3A and 3B, an example process for controlling manipulation of a fixation apparatus including rings and struts to correct an anatomical structure deformity of first and second anatomical structure segments will now be described in detail. In particular, at operation 310, first and second anatomical structure segments are attached to a fixation apparatus, for example as shown in FIG. 1 and described in detail above. At operation 312, first and second images of the fixation apparatus and the attached first and second anatomical structure segments are captured, for example as shown in FIG. 2 and described in detail above.
The remaining operations of the process of FIGS. 3A and 3B (e.g., operations 314-342) will now be described in association with a treatment technique referred to hereinafter as Perspective Frame Matching, in which images, such as post-operative x-rays, may be used along with a frame to generate deformity and mounting parameters for a strut adjustment plan. For example, referring now to FIG. 4, an example treatment planning technique selection interface 400-A is shown. In the example of FIG. 4, the user has selected option 401 in order to use the Perspective Frame Matching (PFM) technique, which will now be described in detail with reference to FIGS. 5-13.
Referring back to FIG. 3A, at operation 314, configuration information associated with a fixation apparatus is received, for example using one or more graphical user interfaces of a computing system. In some examples, the configuration information may include one or more geometric characteristics (e.g., size, length, diameter, etc.) of one or more elements of the fixation apparatus, such as struts, hinges, rings, and others. In some examples, the configuration information may include information such as ring types (e.g., full ring, foot plate, etc.), indications of mounting points (e.g., ring holes) used for strut mounting, and other information. In some examples, the configuration information may also include information about marker elements, for example that are mounted to components of the fixation apparatus, such as struts, hinges, and rings. Referring now to FIG. 5, an example configuration information entry interface 500 is shown. As shown, interface 500 includes ring type indicators 501 and 502, which, in this example, are drop down menus that may be used to select ring types for the proximal and distal rings, respectively. Indicators 501 and 502 are set to the "Full" option to indicate that the proximal and distal rings are full rings. Interface 500 also includes diameter indicators 503 and 504, which, in this example, are drop down menus that may be used to select diameters or lengths for the proximal and distal rings, respectively.
The interface 500 also includes controls for entry of strut information. In particular, interface 500 includes six drop down menus 512 that may each be used to indicate a size of a respective strut. Global strut size indicator 511 may also be used to globally select a size for all six struts. Length selectors 513 may each be used to select a length of a respective strut. Length indicators 514 may each be used to provide a visual representation of the lengths of the respective struts. It is noted that the length indicators 514 do not necessarily depict the actual exact length of each strut, but rather represent the comparative lengths of the struts with respect to one another.
Save and Update button 516 may be selected to save and update the configuration information values shown in interface 500. In some examples, selection of button 516 may cause interface 500 to display and/or update a graphical representation 520 of the fixation apparatus generated based, at least in part, on the entered configuration information. The graphical representation 520 may be displayed using one or more graphical user interfaces of a computing system. As shown, graphical representation 520 includes six struts that may be color-coded in multiple colors for easy identification. For example, in some cases, each of the struts (or at least two of the struts) may be shown in different colors with respect to one another. The struts in graphical representation 520 may have sizes, lengths, mounting points, and other features corresponding to entered configuration information. Graphical representation 520 also depicts the fixator rings, which may have diameters/lengths, ring types, and other features corresponding to entered configuration information. Graphical representation 520 may, for example, improve efficiency and reliability by providing the user with a visual confirmation of information entered into interface 500, for example to allow fast and easy identification of errors or other problems.
At operation316, images of the fixation apparatus and the first and second anatomical structure segments attached thereto are displayed, for example using one or more graphical user interfaces of a computing system. The displayed images may include images that were captured at operation312, such as using x-ray imaging, computer tomography, magnetic resonance imaging, ultrasound, infrared imaging, photography, fluoroscopy, visual spectrum imaging, or any combination thereof. Techniques for acquiring images of the fixation apparatus and the first and second anatomical structure segments are described in detail above and are not repeated here. As set forth above, the acquired and displayed images need not necessarily be orthogonal to one another. Referring now toFIG. 6, an example first imageinformation entry interface600 is shown. As shown,interface600 includes images601-A and601-B, which show the fixation apparatus and first and second anatomical structure segments from different angles. In the example ofFIG. 6, image601-A corresponds to an anteroposterior (AP) View, while image601-B corresponds to a lateral (LAT) view. In some examples, the displayed images601-A-B may be loaded and saved in computer memory, for example in a library, database or other local collection of stored images. The displayed images601-A-B may then be selected and retrieved, acquired, and/or received from memory for display.
At operation 318, first image information is received, for example using one or more graphical user interfaces of a computing system. The first image information may include indications of one or more locations, within the images, of at least part of one or more elements of the fixation apparatus. For example, the first image information may include one or more indications of locations of struts, hinges, rings, and other fixator elements. In some examples, the first image information may also include information about locations, within the images, of marker elements, for example that are mounted to components of the fixation apparatus, such as struts, hinges, and rings. In some cases, the first image information may include points representing locations of hinges and/or lines or vectors representing locations of struts. In some examples, the first image information may be entered into a computing system by selecting or indicating one or more locations within the displayed images, for example using a mouse, keyboard, touchscreen or other user input devices. In particular, using one or more input devices, a user may select points or other locations in the images, draw lines and circles, and generate other graphical indications within the images. For example, in some cases, a user may generate a point or small circle at a particular location in an image to indicate a location (e.g., center point) of a hinge within the image. As another example, in some cases, a user may generate a line and/or vector within an image to indicate a location and/or length of a strut within the image.
For example, as shown in FIG. 6, interface 600 includes six AP View strut indicator buttons 611-A corresponding to each of the six struts of the fixation apparatus shown in AP View image 601-A. Each button 611-A includes text indicating a respective strut number (i.e., Strut 1, Strut 2, Strut 3, Strut 4, Strut 5, Strut 6). Buttons 611-A may be selected by a user to indicate a strut for which first image information (e.g., hinge locations, strut locations, etc.) will be provided by the user in AP View image 601-A. For example, in some cases, to provide first image information for Strut 1 in AP View image 601-A, a user may first select the top strut indicator button 611-A (labeled with the text "Strut 1") in order to indicate to the software that the user is about to provide first image information for Strut 1 within AP View image 601-A. In some cases, the strut indicator button 611-A for Strut 1 may be pre-selected automatically for the user. Upon selection (or automatic pre-selection) of the strut indicator button 611-A for Strut 1, the user may proceed to draw (or otherwise indicate) a representation of Strut 1 within AP View image 601-A. For example, in some cases, the user may use a mouse or other input device to select a location 621 (e.g., a center point) of a proximal hinge for Strut 1 within image 601-A. In some examples, the user may then use a mouse or other input device to select a location 622 (e.g., a center point) of the distal hinge of Strut 1 within image 601-A. In some examples, the user may indicate the location and/or length of Strut 1 by selecting the locations of the proximal and distal hinges and/or as the endpoints of a line or vector that represents the location and/or length of Strut 1. For example, as shown in FIG. 6, the software may generate points or circles at the locations 621 and 622 of the proximal and distal hinges selected by the user within image 601-A. Additionally, the software may generate a line 623 representing the location and/or length of Strut 1 that connects the points or circles at the locations 621 and 622 of the proximal and distal hinges selected by the user within image 601-A. Any other appropriate input techniques may also be employed by the user to indicate a location and/or length of Strut 1 within image 601-A, such as generating line 623 by dragging and dropping a mouse, using a finger and/or pen on a touch screen, a keyboard, and others. In some examples, the above described process may be repeated to draw points representing proximal and distal hinges and lines representing the locations and/or lengths of each of the six struts in the AP View image 601-A. Furthermore, the above described process may also be repeated using LAT View strut indicator buttons 611-B to draw points representing proximal and distal hinges and lines representing the locations and/or lengths of each of the six struts in the LAT View image 601-B.
In some examples, the first image information generated within images 601-A and 601-B may include color-coded graphical representations of the struts, for example to enable the graphical representations to be more clearly associated with their respective struts. For example, in FIG. 6, the graphical representations (e.g., points, circles, and/or lines) of Strut 1 in images 601-A and 601-B may be colored in red. This may match a strut icon (which may also be colored red) displayed in the strut indicator buttons 611-A and 611-B for Strut 1 (displayed to the right of the text "Strut 1" in buttons 611-A and 611-B). As another example, in FIG. 6, the graphical representations (e.g., points, circles, and/or lines) of Strut 3 in images 601-A and 601-B may be colored in yellow. This may match a strut icon (which may also be colored yellow) displayed in the strut indicator buttons 611-A and 611-B for Strut 3 (displayed to the right of the text "Strut 3" in buttons 611-A and 611-B).
FIG. 6 includes an AP View close-up assist checkbox 616-A and a LAT View close-up assist checkbox 616-B, for example provided using one or more graphical interfaces of a computing system. Selection of checkboxes 616-A and 616-B may allow close-up views of areas of images 601-A and 601-B surrounding the proximal and distal hinges of the struts that are currently being drawn by the user. This may enable more accurate indications of the locations (e.g., center points) of the hinges. Referring now to FIG. 7, close-up assist interface 700 depicts another AP View image 701 with the close-up assist being selected to provide a proximal hinge close-up assist view 702 and a distal hinge close-up assist view 703. As shown, proximal hinge close-up assist view 702 provides an enlarged view of an area of AP View image 701 associated with the proximal hinge, while distal hinge close-up assist view 703 provides an enlarged view of an area of AP View image 701 associated with the distal hinge. The user may manipulate (e.g., drag and drop) the location of the point/circle 721 in proximal hinge close-up assist view 702 in order to more accurately depict the center point of the proximal hinge. The user may also manipulate (e.g., drag and drop) the location of the point/circle 722 in distal hinge close-up assist view 703 in order to more accurately depict the center point of the distal hinge. As should be appreciated, corresponding close-up assist views similar to views 702 and 703 may also be provided for a respective LAT View image, for example using one or more graphical interfaces of a computing system.
Referring back to FIG. 6, to the right of buttons 611-A are six proximal hinge selector buttons 612-A. Additionally, to the right of buttons 612-A are six distal hinge selector buttons 613-A. Furthermore, to the right of buttons 613-A are six strut line selector buttons 614-A. In some examples, buttons 612-A and/or 613-A may be selected to use the locations (e.g., center points) of the proximal and/or distal hinges indicated in AP View image 601-A in calculating positions and orientations of the first and the second anatomical structure segments and rings of the fixation apparatus in three-dimensional space (see operation 322). Additionally, in some examples, buttons 614-A may be selected to use the lines or vectors representing the locations and/or lengths of struts indicated in AP View image 601-A in calculating positions and orientations of the first and the second anatomical structure segments in three-dimensional space. Similarly, buttons 612-B, 613-B, and 614-B may be used to select the use of locations (e.g., center points) of the proximal and/or distal hinges or strut lines or vectors indicated in LAT View image 601-B in calculating positions and orientations of the first and the second anatomical structure segments in three-dimensional space.
Referring again to FIG. 3A, at operation 320, second image information is received, for example using one or more graphical user interfaces of a computing system. The second image information may include indications of one or more locations, within the images, of at least part of the first and the second anatomical structure segments. In some examples, the second image information may include indications of center lines of the first and the second anatomical structure segments and/or one or more reference points (e.g., end points) of the first and the second anatomical structure segments. In some examples, the second image information may also include indications of locations of marker elements, for example implanted or otherwise associated with the first and the second anatomical structure segments. In some examples, the second image information may be entered into a computing system by selecting or indicating one or more locations within the displayed images, for example using a mouse, keyboard, touchscreen or other user input devices. In particular, using one or more input devices, a user may select points or other locations in the images, draw lines and circles, and generate other graphical indications within the images. For example, in some cases, a user may generate points or small circles at particular locations in an image to indicate one or more reference points (e.g., end points) of the first and the second anatomical structure segments within the images. As another example, in some cases, a user may generate a line within an image to indicate a center line of the first and the second anatomical structure segments within the images.
Referring now to FIG. 8A, an example second image information entry interface 800 is shown. As shown, interface 800 includes AP View image 601-A and LAT View image 601-B. Additionally, interface 800 includes buttons 801-808, which may be used to assist in indication of anatomical structure center lines and reference points as will be described below. In particular, buttons 801 and 805 may be selected to indicate a proximal anatomical structure reference point in the AP View and LAT View, respectively. Buttons 802 and 806 may be selected to indicate a distal anatomical structure reference point in the AP View and LAT View, respectively. Buttons 803 and 807 may be selected to indicate a proximal anatomical structure center line in the AP View and LAT View, respectively. Buttons 804 and 808 may be selected to indicate a distal anatomical structure center line in the AP View and LAT View, respectively. For example, as shown in FIG. 8A, a user may select button 807 and then use one or more input devices to draw the center line 831 for the proximal anatomical structure within LAT View image 601-B. In some examples, the center line 831 may be colored red. Additionally, two guidelines 832 are generated and displayed by the software on both sides of the red center line. In some examples, the guidelines 832 may be colored green. These guidelines 832 may be displayed while the user is drawing the center line 831 in order to assist the user in locating the center of the anatomical structure segment. The guidelines 832 may be generated at equal distances from each side of the center line 831 and may assist the user by, for example, potentially allowing the user to match (or nearly match) the guidelines 832 to the sides of the anatomical structure segment. As shown in FIG. 8B, the user may select button 808 and then use one or more input devices to draw the center line 841 for the distal anatomical structure within LAT View image 601-B. As shown in FIG. 8C, the user may select button 803 and then use one or more input devices to draw the center line 851 for the proximal anatomical structure within AP View image 601-A. As shown in FIG. 8D, the user may select button 804 and then use one or more input devices to draw the center line 861 for the distal anatomical structure within AP View image 601-A. As shown in FIGS. 8B-8D, guidelines 832 may also be displayed for assistance in drawing center lines 841, 851 and 861.
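As a purely illustrative sketch (the offset distance and drawing behavior are assumptions, not the patent's implementation), guidelines parallel to a user-drawn center line can be generated by offsetting its endpoints along the perpendicular direction by equal amounts on each side:

```python
import numpy as np

def guideline_segments(center_start_px, center_end_px, half_width_px):
    """Return the two guideline segments drawn parallel to a user-drawn center
    line, offset by the same distance on each side (pixel coordinates)."""
    p0 = np.asarray(center_start_px, float)
    p1 = np.asarray(center_end_px, float)
    direction = p1 - p0
    normal = np.array([-direction[1], direction[0]])   # perpendicular in 2D
    normal /= np.linalg.norm(normal)
    offset = half_width_px * normal
    return (p0 + offset, p1 + offset), (p0 - offset, p1 - offset)
```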
As shown in FIG. 8E, the user may select button 805 and then use one or more input devices to indicate a reference point (e.g., end point) for the proximal anatomical structure within LAT View image 601-B. As shown, a user has indicated a reference point 811 in LAT View image 601-B at an end point of the proximal anatomical structure segment. Additionally, upon indication of reference point 811, the software may generate and display a corresponding dashed reference line 812 in AP View image 601-A. The reference line 812 is a line drawn across AP View image 601-A that passes through the location of the LAT View proximal reference point 811 within AP View image 601-A. The reference line 812 may, therefore, assist the user in determining the location of the corresponding AP View proximal reference point, which may often be at the intersection of the reference line 812 and the AP View proximal center line 851 within the AP View image 601-A. As shown in FIG. 8F, the user may select button 801 and then use one or more input devices to indicate a reference point (e.g., end point) for the proximal anatomical structure within AP View image 601-A. In this example, the AP View proximal anatomical structure reference point 814 is indicated at the intersection of reference line 812 and the AP View proximal center line 851 within the AP View image 601-A. The software may then generate and display a corresponding dashed reference line 813 in the LAT View image 601-B. The reference line 813 is a line drawn across LAT View image 601-B that passes through the location of the AP View proximal reference point 814 within LAT View image 601-B. The reference line 813 may assist the user by helping the user to confirm that the AP View reference point 814 was placed correctly by showing how well it lines up relative to the LAT View reference point 811.
As shown in FIG. 8G, the user may select button 806 and then use one or more input devices to indicate a reference point (e.g., end point) for the distal anatomical structure within LAT View image 601-B. As shown, a user has indicated a reference point 815 in LAT View image 601-B at an end point of the distal anatomical structure segment. Additionally, upon indication of reference point 815, the software may generate and display a corresponding dashed reference line 816 in AP View image 601-A. The reference line 816 is a line drawn across AP View image 601-A that passes through the location of the LAT View distal reference point 815 within AP View image 601-A. The reference line 816 may, therefore, assist the user in determining the location of the corresponding AP View distal reference point, which may often be at the intersection of the reference line 816 and the AP View distal center line within the AP View image 601-A. As shown in FIG. 8H, the user may select button 802 and then use one or more input devices to indicate a reference point (e.g., end point) for the distal anatomical structure within AP View image 601-A. In this example, the AP View distal anatomical structure reference point 817 is indicated at the intersection of reference line 816 and the AP View distal center line within the AP View image 601-A. The software may then generate and display a corresponding dashed reference line 818 in the LAT View image 601-B. The reference line 818 is a line drawn across LAT View image 601-B that passes through the location of the AP View distal reference point 817 within LAT View image 601-B. The reference line 818 may assist the user by helping the user to confirm that the AP View reference point 817 was placed correctly by showing how well it lines up relative to the LAT View reference point 815.
Referring again toFIG. 3A, at operation322, positions and orientations of the first and second anatomical structure segments and rings of the fixation apparatus are determined in three-dimensional space. For example, in some cases, imaging scene parameters pertaining tofixator100, theanatomical structure segments102,104, imager(s)130, andimage capturing devices127,129 are obtained. The imaging scene parameters can be used in constructing a three-dimensional representation of the positioning of theanatomical structure segments102,104 in thefixator100, as described in more detail below. One or more of the imaging scene parameters may be known. Imaging scene parameters that are not known can be obtained, for example by mathematically comparing the locations of fixator element representations in the two-dimensional space of thex-ray images126,128 to the three-dimensional locations of those elements on the geometry of thefixator100. In a preferred embodiment, imaging scene parameters can be calculated using a pin hole or perspective camera models. For example, the imaging scene parameters can be determined numerically using matrix algebra, as described in more detail below.
The imaging scene parameters can include, but are not limited to, image pixel scale factors, image pixel aspect ratio, the image sensor skew factor, the image size, the focal length, the position and orientation of the imaging source, the position of the principal point (defined as the point in the plane of a respective image 126, 128 that is closest to the respective imager 130), positions and orientations of elements of the fixator 100, the position and orientation of a respective image receiver, and the position and orientation of the imaging source's lens.
In a preferred embodiment, at least some, such as all, of the imaging scene parameters can be obtained by comparing the locations of representations of particular components, or fixator elements, of the fixator 100 within the two-dimensional spaces of the images 126, 128, with the corresponding locations of those same fixator elements in actual, three-dimensional space. The fixator elements comprise components of the orthopedic fixator 100, and preferably are components that are easy to identify in the images 126, 128. Points, lines, conics, or the like, or any combination thereof can be used to describe the respective geometries of the fixator elements. For example, the representations of fixator elements used in the comparison could include center lines of one or more of the adjustable length struts 116, center points of the universal joints 124, center points of the mounting members 114, and the like.
The fixator elements can further include marker elements that are distinct from the above-described components of the fixator 100. The marker elements can be used in the comparison, as a supplement to or in lieu of using components of the fixator 100. The marker elements can be mounted to specific locations of components of the fixator 100 prior to imaging, can be imbedded within components of the fixator 100, or any combination thereof. The marker elements can be configured for enhanced viewability in the images 126, 128 when compared to the viewability of the other components of the fixator 100. For example, the marker elements may be constructed of a different material, such as a radio-opaque material, or may be constructed with geometries that readily distinguish them from other components of the fixator 100 in the images 126, 128. In an example embodiment, the marker elements can have designated geometries that correspond to their respective locations on the fixator 100.
Fixator elements can be identified for use in the comparison. For example, locations within the images 126, 128 of fixator elements may be indicated using the first image information received at operation 318 and described in detail above. In some examples, the locations of the fixator elements in the two-dimensional space of the images 126, 128 may be determined with respect to local origins 125 defined in the imaging planes of the images 126, 128. The local origins 125 serve as "zero points" for determining the locations of the fixator elements in the images 126, 128. The locations of the fixator elements can be defined by their respective x and y coordinates with respect to a respective local origin 125. The location of the local origin 125 within the respective image can be arbitrary so long as it is in the plane of the image. Typically, the origin is located at the center of the image or at a corner of the image, such as the lower left hand corner. It should be appreciated that the locations of the local origins are not limited to the illustrated local origins 125, and that the local origins 125 can be alternatively defined at any other locations.
In some examples, a respective transformation matrix P may then be computed for each of the images 126, 128. The transformation matrices can be utilized to map location coordinates of one or more respective fixator elements in actual three-dimensional space to corresponding location coordinates of the fixator element(s) in the two-dimensional space of the respective image 126, 128. It should be appreciated that the same fixator element(s) need not be used in the comparisons of both images 126, 128. For example, a fixator element used in constructing the transformation matrix associated with image 126 can be the same as or different from the fixator element used in constructing the transformation matrix associated with image 128. It should further be appreciated that increasing the number of fixator elements used in computing the transformation matrices can increase the accuracy of the method. The following equation represents this operation:
$$\begin{bmatrix} x \\ y \\ 1 \end{bmatrix} = P \cdot \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix} \tag{1}$$
The symbols x and y represent location coordinates, with respect to thelocal origin125, of a fixator element point in the two-dimensional space ofimages126,128. The symbols X, Y and Z represent corresponding location coordinates, with respect to aspace origin135, of the fixator element point in actual three-dimensional space. In the illustrated embodiment, the point corresponding to the center of the plane defined by the upper surface of theupper fixator ring106 has been designated as thespace origin135. The illustrated matrix P can be at least four elements wide and three elements tall. In a preferred embodiment, the elements of the matrix P can be computed by solving the following matrix equation:
A·p=B  (2)
The vector p can contain eleven elements representing values of the matrix P. The following equations present arrangements of the elements in the vector p and the matrix P:
$$p = \begin{bmatrix} p_1 & p_2 & p_3 & p_4 & p_5 & p_6 & p_7 & p_8 & p_9 & p_{10} & p_{11} \end{bmatrix}^{T} \tag{3}$$

$$P = \begin{bmatrix} p_1 & p_2 & p_3 & p_4 \\ p_5 & p_6 & p_7 & p_8 \\ p_9 & p_{10} & p_{11} & p_{12} \end{bmatrix} \tag{4}$$
In the preferred embodiment, the twelfth element p12 of the matrix P can be set to a numerical value of one. The matrices A and B can be assembled using the two-dimensional and three-dimensional information of the fixator elements. For every point representing a respective fixator element, two rows of matrices A and B can be constructed. The following equation presents the values of the two rows added to the matrices A and B for every point of a fixator element (e.g., a center point of a respective universal joint 124):
$$\begin{bmatrix} X & Y & Z & 1 & 0 & 0 & 0 & 0 & -x \cdot X & -x \cdot Y & -x \cdot Z \\ 0 & 0 & 0 & 0 & X & Y & Z & 1 & -y \cdot X & -y \cdot Y & -y \cdot Z \end{bmatrix} \cdot p = \begin{bmatrix} x \\ y \end{bmatrix} \tag{5}$$
The symbols X, Y and Z represent location coordinate values of a fixator element point in actual three-dimensional space relative to thespace origin135, and the symbols x and y represent location coordinate values of the corresponding fixator element point in the two-dimensional space of therespective image126,128 relative tolocal origin125.
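By way of illustration, the two rows that a single point correspondence contributes to the matrices A and B in equation (5) can be assembled as in the following minimal sketch (Python/NumPy, hypothetical variable names; illustrative only and not the claimed implementation):

```python
import numpy as np

def point_rows(X, Y, Z, x, y):
    """Two rows contributed to A and B by one fixator element point,
    per equation (5): (X, Y, Z) are three-dimensional coordinates relative
    to the space origin 135, (x, y) are image coordinates relative to the
    local origin 125."""
    A_rows = np.array([
        [X, Y, Z, 1, 0, 0, 0, 0, -x * X, -x * Y, -x * Z],
        [0, 0, 0, 0, X, Y, Z, 1, -y * X, -y * Y, -y * Z],
    ], dtype=float)
    B_rows = np.array([x, y], dtype=float)
    return A_rows, B_rows
```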
For every line representing a respective fixator element, two rows of matrices A and B can be constructed. The following equation presents the values of the two rows added to the matrices A and B for every line of a fixator element (e.g., a center line of a respective adjustable length strut116):
$$\begin{bmatrix} X \cdot a & Y \cdot a & Z \cdot a & a & X \cdot b & Y \cdot b & Z \cdot b & b & X \cdot c & Y \cdot c & Z \cdot c \\ dX \cdot a & dY \cdot a & dZ \cdot a & 0 & dX \cdot b & dY \cdot b & dZ \cdot b & 0 & dX \cdot c & dY \cdot c & dZ \cdot c \end{bmatrix} \cdot p = \begin{bmatrix} -c \\ 0 \end{bmatrix} \tag{6}$$
The symbols X, Y and Z represent location coordinate values of a point belonging to a line of a fixator element in actual three-dimensional space relative to thespace origin135. The symbols dX, dY and dZ represent gradient values of the line in actual three-dimensional space. The symbols a, b and c represent constants defining a line in the two-dimensional space of arespective image126,128. For example, a, b, and c can be computed using two points belonging to a line on arespective image126,128. In a preferred embodiment, the value of b is assumed to be 1, unless the line is a vertical line, in which case the value of b is zero. A correlation of constants a, b and c with the respective image coordinates x and y is presented in the following equation:
a·x+b·y+c=0  (7)
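By way of illustration, the constants a, b and c of equation (7) can be computed from two picked image points as in the following sketch (a hypothetical helper; b is set to 1 for non-vertical lines and to 0 for vertical lines, as described above):

```python
def line_constants(x1, y1, x2, y2, tol=1e-9):
    """Constants (a, b, c) of the image line a*x + b*y + c = 0 through two
    picked points, with b = 1 except for vertical lines, where b = 0."""
    if abs(x2 - x1) < tol:            # vertical line: x = x1
        return 1.0, 0.0, -float(x1)
    a = -(y2 - y1) / (x2 - x1)        # negative slope
    b = 1.0
    c = -(a * x1 + b * y1)
    return a, b, c
```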
The equation (2) can be over-constrained by using six or more fixator elements, for example the adjustable length struts 116. It should be appreciated that it is not necessary for all of the fixator elements to be visible in a single one of the images 126, 128 in order to obtain the matrix P. It should further be appreciated that if one or more of the above-described imaging scene parameters are known, the known parameters can be used to reduce the minimum number of the fixator elements required to constrain equation (2). For instance, such information could be obtained from the DICOM image headers produced by modern imaging systems. Preferably, a singular value decomposition or least squares method can be used to solve equation (2) for values of the vector p.
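For illustration only, the assembly and least-squares solution of equation (2) from point correspondences might look like the following sketch (Python/NumPy, hypothetical names; a real implementation could also add the line-based rows of equation (6) and exploit any known imaging scene parameters):

```python
import numpy as np

def solve_projection(point_correspondences):
    """Assemble A and B from (X, Y, Z, x, y) point correspondences per
    equation (5) and solve equation (2) in the least-squares sense for the
    eleven unknowns of p, then reshape into the 3x4 matrix P with p12 = 1."""
    A_rows, B_vals = [], []
    for X, Y, Z, x, y in point_correspondences:
        A_rows.append([X, Y, Z, 1, 0, 0, 0, 0, -x * X, -x * Y, -x * Z])
        A_rows.append([0, 0, 0, 0, X, Y, Z, 1, -y * X, -y * Y, -y * Z])
        B_vals.extend([x, y])
    A = np.asarray(A_rows, dtype=float)
    B = np.asarray(B_vals, dtype=float)
    p, *_ = np.linalg.lstsq(A, B, rcond=None)   # least squares (via SVD)
    return np.append(p, 1.0).reshape(3, 4)      # p12 set to 1
```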
In some examples, the transformation matrices may then be decomposed into imaging scene parameters. The following equation can be used to relate the matrix P to matrices E and I:
P=I·E  (8)
It should be appreciated that additional terms can be introduced when decomposing the matrix P. For example, the method presented by Tsai, described in "A Versatile Camera Calibration Technique for High-Accuracy 3D Machine Vision Metrology Using Off-the-Shelf TV Cameras and Lenses", IEEE Journal of Robotics & Automation, RA-3, No. 4, 323-344, August 1987, which is incorporated herein by reference in its entirety, can be used to correct images 126, 128 for radial distortion.
Matrices E and I contain imaging scene parameters. The following equation represents a composition of the matrix I:
$$I = \begin{bmatrix} s_x & 0 & -t_x \\ 0 & s_y & -t_y \\ 0 & 0 & 1/f \end{bmatrix} \tag{9}$$
The symbols sx and sy represent values of image coordinate scale factors (e.g., pixel scale factors). The symbol f, representing the focal length, corresponds to the value of the shortest distance between a respective imaging source 130 and the plane of a corresponding image 126, 128. The symbols tx and ty represent the coordinates of the principal point relative to the local origin 125 of the respective image 126, 128. The following equation represents the composition of the matrix E:
$$E = \begin{bmatrix} r_1 & r_2 & r_3 & -(r_1 \cdot o_x + r_2 \cdot o_y + r_3 \cdot o_z) \\ r_4 & r_5 & r_6 & -(r_4 \cdot o_x + r_5 \cdot o_y + r_6 \cdot o_z) \\ r_7 & r_8 & r_9 & -(r_7 \cdot o_x + r_8 \cdot o_y + r_9 \cdot o_z) \end{bmatrix} \tag{10}$$
The symbols ox, oy and oz represent values of the position of the fixator 100 in actual three-dimensional space. The symbols r1 to r9 describe the orientation of the fixator 100. These values can be assembled into a three-dimensional rotational matrix R represented by the following equation:
$$R = \begin{bmatrix} r_1 & r_2 & r_3 \\ r_4 & r_5 & r_6 \\ r_7 & r_8 & r_9 \end{bmatrix} \tag{11}$$
The methods of Trucco and Verri, as described in "Introductory Techniques for 3-D Computer Vision", Prentice Hall, 1998, or the method of Hartley, as described in "Euclidean Reconstruction from Uncalibrated Views", Applications of Invariance in Computer Vision, pages 237-256, Springer Verlag, Berlin Heidelberg, 1994, which are incorporated herein in their entireties, can be used to obtain values of the matrices E and/or I. Utilizing the resulting values of matrices E and I, a complete three-dimensional imaging scene of the fixator 100 and the anatomical structure segments 102, 104 can be reconstructed.
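As a purely illustrative sketch of this kind of decomposition (one common approach, not necessarily the cited methods), an RQ factorization of the left 3×3 block of P can be used, followed by recovery of the position of the fixator/space origin:

```python
import numpy as np
from scipy.linalg import rq

def decompose_projection(P):
    """Split a 3x4 projection matrix into an intrinsic-style matrix K and a
    pose (R, o): RQ-decompose the left 3x3 block, fix the signs so the
    scale factors are positive, then recover the origin position."""
    M = P[:, :3]
    K, R = rq(M)                        # M = K @ R, K upper-triangular
    S = np.diag(np.sign(np.diag(K)))    # sign correction, S @ S = identity
    K, R = K @ S, S @ R
    t = np.linalg.solve(K, P[:, 3])     # K @ t equals the last column of P
    o = -R.T @ t                        # position of the origin in 3-D space
    return K, R, o
```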
For example, FIG. 2 illustrates an example three-dimensional imaging scene reconstructed from the x-ray images 126, 128. In the illustrated embodiment, x-rays are emitted from x-ray imagers 130. It should be appreciated that the x-ray imagers 130 can be the same or different imagers, as described above. The x-rays emitted from the imagers 130 are received by corresponding imaging devices, thus capturing the images 126, 128. Preferably, the positioning of the imagers 130 with respect to the local origins 125 is known.
In some examples, theimages126,128 and the imaging scene parameters may then be used to obtain the positions and/or orientations of theanatomical structure segments102,104 in three-dimensional space. The position and/or orientation data obtained can be used to develop a treatment plan for a patient, for example to change the orientation and/or position of the fractured first and secondanatomical structure segments102,104 in order to promote union between theanatomical structure segments102,104, as described in more detail below. It should be appreciated that the methods and techniques described herein are not limited to applications of repositioning broken anatomical structures, and that orthopedic fixation with imagery analysis can be used in any other type of fixation procedure as desired, for example lengthening of anatomical structures, correction of anatomical defects, and the like.
In some examples, anatomical structure elements comprising representations of particular portions (e.g., anatomical features) of theanatomical structure segments102,104, may then be identified and their locations within theimages126,128 determined. For example, locations, within theimages126,128 of the first and the second anatomical structure segments may be indicated using the second image information received atoperation320 and described in detail above. In some examples, the locations of the anatomical structure elements may be determined with respect to the respectivelocal origins125 ofimages126,128.
The anatomical structure elements can be used in the construction of the three-dimensional representation of the position and/or orientation of theanatomical structure segments102,104. Preferably, the anatomical structure elements are easy to identify in theimages126,128. Points, lines, conics, or the like, or any combination thereof can be used to describe the respective geometries of the anatomical structure elements. For example, in the illustrated embodiment, points134 and136 representing the fractured ends103,105 of theanatomical structure segments102,104, respectively, are identified as anatomical structure elements in theimages126,128.
The anatomical structure elements can further include marker elements that are implanted into theanatomical structure segments102,104 prior to imaging. The marker elements can be used as a supplement to or in lieu of the above-described anatomical structure elements identified in theimages126,128. The marker elements can be configured for enhanced viewability in theimages126,128 when compared to the viewability of anatomical features of theanatomical structure segments102,104. For example, the marker elements may be constructed of a radio-opaque material, or may be constructed with readily distinguishable geometries.
A three-dimensional representation200 of theanatomical structure segments102,104 can be reconstructed. The three-dimensional representation can be constructed with or without a corresponding representation of thefixator100. In the illustrated embodiment, pairs of ray-lines, such asray lines138,140 and142,144 can be constructed for the anatomical structure element points134,136, respectively. Each ray line connects an anatomical structure element in one of theimages126,128 with arespective imager130. Each pair of ray lines can be analyzed for a common intersection point, such aspoints146,148. The common intersection points146,148 represent the respective positions of the anatomical structure element points134,136, in the three-dimensional representation of theanatomical structure segments102,104. Of course more than a pair of ray lines, such as a plurality, can be constructed, for example if more than two images were captured. If the ray lines of a particular set do not intersect, a point closest to all the ray lines in the set can be used as the common intersection point.
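By way of illustration, the common intersection point (or, for non-intersecting ray lines, the point closest to both lines) can be computed as in the following sketch, where each ray line is described by an imager position and a direction toward the identified anatomical structure element (hypothetical names, Python/NumPy):

```python
import numpy as np

def common_intersection(p1, d1, p2, d2):
    """Point closest to two ray lines, each given by an imager position p
    and a direction d toward the identified anatomical structure element.
    If the rays intersect exactly, this is their intersection; otherwise it
    is the midpoint of the shortest segment joining them."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    # Minimize |(p1 + s*d1) - (p2 + t*d2)| over the ray parameters s and t.
    A = np.array([[d1 @ d1, -(d1 @ d2)],
                  [d1 @ d2, -(d2 @ d2)]])
    b = np.array([(p2 - p1) @ d1, (p2 - p1) @ d2])
    s, t = np.linalg.solve(A, b)
    return ((p1 + s * d1) + (p2 + t * d2)) / 2.0
```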
The positions and/or orientations of theanatomical structure segments102,104 can be quantified or measured using common intersection points, for instance points146,148. For example, lines representing center lines of theanatomical structure segments102,104 can be constructed and can be compared to the anatomical axes of the patient. Additionally, the distance between the fractured ends103,105 of theanatomical structure segments102,104 can be quantified. Using these or similar techniques, the positions and/or orientations of theanatomical structure segments102,104 can be determined. It is further noted that, in some examples, in addition to the positions and orientations of the first and second anatomical structure segments, the positions and orientation of rings (and/or other elements of the fixation apparatus) in three-dimensional space may also be determined, for example using any of the techniques described. For example, in some cases, locations of the rings within theimages126,128 may be determined based on the first image information and/or other provided information. In some examples, these locations may then be used to determine the positions and orientations of the rings in three-dimensional space. Additionally, in some examples, configuration information for the fixation apparatus, such as ring diameters and strut length and mounting information, may also be used to determine positions and orientations of the rings in three-dimensional space.
Referring now to FIG. 3B, at operation 324, one or more deformity parameters are calculated. The deformity parameters may include parameters relating to the deformity associated with the first and second anatomical structure segments. For example, in some cases, the deformity parameters may include an amount of translation (e.g., lateral, medial, anterior, and/or posterior), a degree of coronal angulation (e.g., valgus and/or varus), a degree of sagittal angulation, an amount by which anatomical structure length is too short and/or too long, a degree of clinical rotational deformity (e.g., internal and/or external), and others. In some examples, the deformity parameters may be calculated as part of the process of determining the positions and orientations of the first and second anatomical structure segments described above at operation 322, for example using the techniques described above with reference to operation 322.
At operation 326, the deformity parameters calculated at operation 324 are displayed, for example using one or more graphical user interfaces of a computing system. Referring now to FIG. 9, a deformity parameter interface 900 is shown. As shown, interface 900 includes various fields 901-906 for displaying calculated values of various example deformity parameters, including AP View translation and coronal angulation, LAT View translation and sagittal angulation, an amount by which anatomical structure length is too short or too long, and a degree of clinical rotational deformity. In the example of FIG. 9, fields 901-905 each have a respective PFM badge 915 (including the text "PFM") that is displayed to the left of each field 901-905. Each PFM badge 915 indicates that the value shown in the respective field 901-905 has been calculated by the software. Interface 900 allows the deformity parameter values that are displayed in each field 901-906 to be edited by a user, for example by typing a number in the fields 901-906 and/or by using number increment controls 916 displayed to the right of each field 901-906. When a user edits a value that was calculated by the software, the PFM badge 915 adjacent to the respective field may be removed to indicate that the value for the field has been edited by the user. In some examples, after editing the values in one or more fields, the user may select Refresh Perspective Frame Matching Data button 920 to return each of the fields to the value that was calculated by the software. Also, in some examples, after editing the values in one or more fields, the user may select Save and Update button 921 to cause the deformity parameters to be recalculated based on the edited values provided by the user, for example by repeating all or any portion of the calculations performed at operation 322.
At operation 328, a graphical representation of the position and orientation of the first and the second anatomical structure segments is generated and displayed. The graphical representation of the position and orientation of the first and the second anatomical structure segments may be displayed using one or more graphical user interfaces of a computing system. For example, as shown in FIG. 9, interface 900 includes a graphical representation 950 of the position and orientation of the first and the second anatomical structure segments. Graphical representation 950 includes a representation 931 of the proximal anatomical structure segment and a representation 932 of the distal anatomical structure segment. In some examples, the graphical representation 950 may be generated based, at least in part, on the positions and orientations of the first and second anatomical structure segments determined at operation 322. In some examples, when the user edits one or more deformity parameters and selects Save and Update button 921, the graphical representation 950 may also be adjusted to reflect the saved edits to the deformity parameters. Graphical representation 950 may, for example, improve efficiency and reliability by providing the user with a visual confirmation of information entered into interface 900, for example to allow fast and easy identification of errors or other problems.
At operation 330, one or more mounting parameters are calculated. The mounting parameters may include parameters relating to mounting of a reference ring of the fixator onto a respective anatomical structure segment. For example, in some cases, the mounting parameters may include an amount of offset (e.g., lateral, medial, anterior, and/or posterior), such as for a center of the reference ring with respect to a reference point, a degree of tilt (e.g., proximal and/or distal), an amount of axial offset, a master tab rotation, and others. In some examples, the mounting parameters may be calculated as part of the process of determining the positions and orientations of the first and second anatomical structure segments described above at operation 322, for example using the techniques described above with reference to operation 322. It is noted that, for the process of FIG. 3, the reference ring is not necessarily required to be orthogonal with respect to the respective anatomical structure segment on which it is mounted. Thus, in some examples, the reference ring may be non-orthogonal with respect to the respective anatomical structure segment on which it is mounted.
At operation 332, the mounting parameters calculated at operation 330 are displayed, for example using one or more graphical user interfaces of a computing system. Referring now to FIG. 10, a mounting parameter interface 1000 is shown. As shown, interface 1000 includes various fields 1001-1006 for displaying calculated values of various example mounting parameters, including AP View offset and tilt, LAT View offset and tilt, axial offset, and master tab rotation. In the example of FIG. 10, fields 1001-1006 each have a respective PFM badge 1015 that is displayed to the left of each field 1001-1006. Each PFM badge 1015 indicates that the value shown in the respective field 1001-1006 has been calculated by the software. Interface 1000 allows the mounting parameter values that are displayed in each field 1001-1006 to be edited by a user, for example by typing a number in the fields 1001-1006 and/or by using number increment controls 1016 displayed to the right of each field 1001-1006. When a user edits a value that was calculated by the software, the PFM badge 1015 adjacent to the respective field may be removed to indicate that the value for the field has been edited by the user. In some examples, after editing the values in one or more fields, the user may select Refresh Perspective Frame Matching Data button 1020 to return each of the fields to the value that was calculated by the software. Also, in some examples, after editing the values in one or more fields, the user may select Save and Update button 1021 to cause the mounting parameters to be recalculated based on the edited values provided by the user, for example by repeating all or any portion of the calculations performed at operation 322.
Atoperation334, a graphical representation of the position and orientation of the reference ring and the respective anatomical structure segment to which it is mounted is generated and displayed. The graphical representation of the position and orientation of the reference ring and the respective anatomical structure segment may be displayed using one or more graphical user interfaces of a computing system. For example, as shown inFIG. 10,interface1000 includes agraphical representation1050 of the position and orientation of the reference ring and the respective anatomical structure segment.Graphical representation1050 includes arepresentation1031 of the proximal anatomical structure segment, arepresentation1033 of the proximal (reference) ring, and arepresentation1032 of the distal anatomical structure segment. In some examples, thegraphical representation1050 may be generated based, at least in part, on the positions and orientations of the reference ring and the respective anatomical structure segment determined at operation322. The graphical representation of the reference ring and the respective anatomical structure segment may, therefore, reflect and/or indicate the positions and orientations of reference ring and the respective anatomical structure segment determined at operation322. In some examples, when the user edits one or more mounting parameters and selects Save andUpdate button1021, thegraphical representation1050 may also be adjusted to reflect the saved edits to the mounting parameters.Graphical representation1050 may, for example, improve efficiency and reliability by providing the user with a visual confirmation of information entered intointerface1000, for example to allow fast and easy identification of errors or other problems.
At operation 336, one or more treatment plan options are received, for example using one or more graphical user interfaces of a computing system. A treatment plan is a plan for manipulating the fixation apparatus, for example in order to correct the deformity of the first and the second anatomical structure segments. The treatment plan may include, for example, a plan for making gradual adjustments to the positions and orientations of the fixator rings with respect to each other, for example by changing the lengths of the struts of the fixation apparatus. Referring now to FIG. 11, an example treatment plan interface 1100A is shown. The interface 1100A includes controls for selecting, by a user, various treatment plan options. In particular, controls 1101 and/or 1102 allow selection of a treatment plan start date, control 1103 allows selection of an option to perform axial movement first (e.g., in an initial part of the treatment, such as prior to rotational movement), control 1104 allows selection of an option to indicate a final distance between reference points, control 1105 allows selection of an option to calculate the treatment plan based on a specified duration (e.g., a number of days) for axial movement, control 1106 allows selection of an option to calculate the treatment plan based on a rate of distraction at the reference point (e.g., in millimeters (mm)/day) for axial movement, control 1108 allows selection of an option to calculate the treatment plan based on a specified duration (e.g., a number of days) for deformity correction, control 1109 allows selection of an option to calculate the treatment plan based on a rate of distraction at the reference point (e.g., in millimeters (mm)/day) for deformity correction, and control 1107 allows selection of an option to perform two adjustments per day. In some examples, when control 1107 is not selected, a default option of one adjustment per day may be used. In some examples, after selecting desired treatment plan options, the user may select Update Adjustment Plan button 1110 to trigger generation of the treatment plan. Additionally, after initial generation of the treatment plan, the user may also be permitted to adjust the treatment plan options and have the treatment plan re-generated with the adjusted options by re-selecting Update Adjustment Plan button 1110.
At operation 338, manipulations to the fixation apparatus for correction of the anatomical structure deformity (i.e., a treatment plan) are determined. The manipulations to the fixation apparatus may include adjustments to the struts of the fixation apparatus, such as adjustments to the sizes and/or lengths of the struts. In some examples, operation 338 may be performed based, at least in part, on the treatment plan options received at operation 336. For example, operation 338 may be performed based, at least in part, on a specified start date, on instructions to perform axial movement first (e.g., in an initial part of the treatment, such as prior to rotational movement), a specified final distance between reference points, instructions to perform additional lengthening by a specified amount, instructions to generate an axial gap to ensure anatomical structure clearance, a specified duration (e.g., a number of days) of treatment, a specified rate of distraction, and/or instructions to perform a specified quantity (e.g., one, two, etc.) of adjustments per day.
In some examples, the treatment plan may also be determined based, at least in part, on a determination of desired changes to the positions and/or orientations of theanatomical structure segments102,104, for instance how theanatomical structure segments102,104 can be repositioned with respect to each other in order to promote union between theanatomical structure segments102,104. For example, in some cases, it may be desirable to change the angulation of the secondanatomical structure segment104 such that the axes L1 and L2 are brought into alignment, and to change the position of the second anatomical structure segment such that the fractured ends103,105 of theanatomical structure segments102,104 abut each other. Once the desired changes to the positions and/or orientations of theanatomical structure segments102,104 have been determined, a treatment plan for effecting the position and/or orientation changes can be determined. In a preferred embodiment, the desired changes to the positions and/or orientations of theanatomical structure segments102,104 can be effected gradually, in a series of smaller changes. The positions and/or orientations of theanatomical structure segments102,104 can be changed by changing the positions and/or orientations of the upper and lower fixator rings106,108 with respect to each other, for instance by lengthening or shortening one or more of the lengthadjustable struts116.
The required changes to the geometry of the fixator100 (i.e., the position and/or orientation of the fixator100) that can enable the desired changes to the positions and/or orientations of theanatomical structure segments102,104 can be computed using the matrix algebra described above. For example, the required repositioning and/or reorientation of the secondanatomical structure segment104 with respect to the firstanatomical structure segment102 can be translated to changes in the position and/or orientation of thelower fixator ring108 with respect to theupper fixator ring106.
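As a simplified, illustrative sketch of this kind of computation (ignoring the hinge geometry and strut end offsets of a real fixator), a desired repositioning of the lower ring relative to the upper ring can be converted into new strut lengths by moving the lower-ring hinge points and measuring the resulting hinge-to-hinge distances:

```python
import numpy as np

def new_strut_lengths(upper_hinges, lower_hinges, R_step, t_step):
    """Strut lengths after moving the lower ring by rotation R_step and
    translation t_step (expressed in the upper-ring frame). upper_hinges
    and lower_hinges are 6x3 arrays of hinge center positions."""
    moved_lower = (R_step @ np.asarray(lower_hinges, dtype=float).T).T + t_step
    return np.linalg.norm(moved_lower - np.asarray(upper_hinges, dtype=float),
                          axis=1)
```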
At operation 340, indications of the determined manipulations to the fixation apparatus are provided to one or more users. For example, in some cases, indications of the determined manipulations to the fixation apparatus may be provided using one or more graphical user interfaces of a computing system, using a printed hard copy, using audio feedback, and/or using other techniques. In particular, referring now to FIG. 12, it is seen that indications of the determined manipulations to the fixation apparatus may be provided within interface 1100B. Specifically, selection of Strut Adjustment Plan tab 1122 may cause treatment plan interface 1100B to provide a chart 1130, including day-by-day manipulation information for each strut within the fixation apparatus. In this example, chart 1130 shows a length for each strut on each day of treatment. In some examples, one or more alerts may be generated for one or more manipulations to the fixation apparatus that result in strut movement of more than a threshold amount. For example, in some cases, strut movements exceeding a particular threshold amount (e.g., 3 mm per day), which may be referred to as rapid strut movements, may be indicated by displaying a red triangle icon next to the indication of the strut movement in chart 1130. As also shown in FIG. 12, a PDF version of the chart 1130 may be generated by selecting View Draft PDF button 1131. The generated PDF may, in some examples, be printed to create a hard copy version of chart 1130.
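For illustration only, a day-by-day strut chart with rapid-movement alerts could be generated along the following lines; the linear schedule and the 3 mm/day threshold are assumptions made for this sketch rather than the planning algorithm of the described software:

```python
import numpy as np

def strut_schedule(initial_lengths, final_lengths, days, alert_mm_per_day=3.0):
    """Day-by-day strut lengths, interpolated linearly from initial to final
    values, plus (day, strut number, change) alerts for any daily change
    that exceeds the rapid-movement threshold."""
    initial = np.asarray(initial_lengths, dtype=float)
    final = np.asarray(final_lengths, dtype=float)
    chart = [initial + (final - initial) * day / days for day in range(days + 1)]
    alerts = []
    for day in range(1, days + 1):
        deltas = np.abs(chart[day] - chart[day - 1])
        for strut, delta in enumerate(deltas):
            if delta > alert_mm_per_day:
                alerts.append((day, strut + 1, float(delta)))
    return chart, alerts
```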
In the example ofFIG. 12,chart1130 includes blocks1132-A and1132-B indicating ranges of dates on which changes of strut sizes, referred to as strut swaps, may be performed. In particular, block1132-A indicates that a strut swap may be performed forStrut4 onDay0 throughDay2, while block1132-B indicates that a strut swap may be performed forStrut4 on Day3 through Day14 (and subsequent days). In some examples, blocks1132-A and1132-B may be color-coded to match a color assigned to a respective strut. For example, blocks1132-A and1132-B may be colored green to match a green color that may be assigned toStrut4. Referring now toFIG. 13, Strut SwapsCalendar tab1123 of treatment plan interface1100-C may be selected to generate acalendar1140 indicating ranges of dates on which strut swaps may be performed.
In some examples, the struts of the fixation apparatus attached to the patient may be color-coded, for example using color-coded caps, markers, or other color-coded materials included within and/or attached to the struts. In some examples, the physical color-coding of the struts in the fixation apparatus attached to the patient may match the color-coding of struts used in the software. For example, the physical color-coding of the struts in the fixation apparatus may match the color-coding of struts that may be used to color-code the blocks 1132-A and 1132-B of chart 1130, graphical representation 520, and other color-coded representations of the struts displayed by the software. In some examples, this may make it easier for physicians and/or patients to confirm that, when they physically adjust a strut on the fixation apparatus, they are adjusting the correct strut by the correct amount.
At operation 342, one or more graphical representations of the position and orientation of the first and second anatomical structure segments and the rings of the fixation apparatus are generated and displayed. The graphical representation of the position and orientation of the first and the second anatomical structure segments and the rings of the fixation apparatus may be displayed using one or more graphical user interfaces of a computing system. For example, referring back to FIG. 11, selection of Treatment Simulation tab 1121 may cause interface 1100 to display a graphical representation 1150 of the position and orientation of the first and the second anatomical structure segments and the rings of the fixation apparatus. Graphical representation 1150 includes a representation 1031 of the proximal anatomical structure segment, a representation 1033 of the proximal (reference) ring, a representation 1032 of the distal anatomical structure segment, and a representation 1034 of the distal ring. In some examples, the one or more graphical representations of the position and orientation of the first and second anatomical structure segments and the rings of the fixation apparatus may include day-by-day graphical representations of the position and orientation of the first and second anatomical structure segments and the rings of the fixation apparatus throughout treatment for the anatomical structure deformity. For example, as shown in FIG. 11, a user may select a particular day of treatment for which to generate and display a graphical representation 1150 using controls 1151, 1152, 1153, and/or 1154. For example, control 1151 may be selected to allow incrementing of the selected day, control 1154 may be selected to allow decrementing of the selected day, and slider 1152 may be slid along bar 1153 to increment and/or decrement the selected day. It is also noted that slider 1152 displays an indication of the currently selected day, which, in the example of FIG. 11, is treatment day zero. Thus, in FIG. 11, graphical representation 1150 shows the position and orientation of the first and second anatomical structure segments and the rings of the fixation apparatus at treatment day zero. Using controls 1151-1154 to select a different day of treatment may cause graphical representation 1150 to be adjusted to show the position and orientation of the first and second anatomical structure segments and the rings of the fixation apparatus on the selected different day. As should be appreciated, allowing the surgeon and/or patient to see graphical representations of the position and orientation of the first and second anatomical structure segments and the rings of the fixation apparatus throughout treatment may be beneficial by, for example, providing an additional visual tool to improve accuracy and assist in planning of treatment. Additionally, graphical representation 1150 (as well as graphical representations described herein) may, for example, improve efficiency and reliability by providing the user with a visual confirmation of information entered into interface 1100, for example to allow fast and easy identification of errors or other problems. It is further noted that the view of graphical representation 1150 (as well as other graphical representations described herein) may be rotated (for example by a complete 360 degrees), zoomed in and out, moved in any direction, and otherwise manipulated, for example using controls 1181-1184 adjacent to the upper right side of the graphical representation 1150.
This may allow views of the first and second anatomical structure segments and/or the rings of the fixation apparatus from various orientations that may not be available, or may be difficult to obtain, using x-rays and other imaging techniques, thereby also improving reliability and accuracy and providing additional visual confirmation of calculated values. In particular, view of thegraphical representation1150 may be rotated usingcontrol1181, zoomed in usingcontrol1182, zoomed out usingcontrol1183, and panned usingcontrol1184. Also, in some examples, other controls, such as a mouse and touchscreen, may also be employed to rotate, zoom, pan, and otherwise manipulategraphical representation1150. Additionally, in some examples,control1185 may be used to select an anteroposterior (AP) view,control1186 may be used to select a lateral view, andcontrol1187 may be used to select a proximal view.
At operation 344, the treatment plan may be implemented; that is, the geometry of the fixation apparatus may be changed, for example based on the manipulations determined at operation 338, in order to change the positions and orientations of the anatomical structure segments.
Guided Frame Matching with Graphical Fixator Projection
As described above, a frame matching process may be employed to determine positions and orientations of anatomical structure segments in three-dimensional space, such as for generating a treatment plan for correction of an anatomical deformity. As also described above, in some examples, as part of the frame matching process, a surgeon or other user may identify locations of fixator elements (e.g., rings, struts, etc.) within displayed images (e.g., x-rays) that show the fixator attached to the anatomical structure segments. Some examples of this process are described above with reference to operation 318 of FIG. 3A and FIG. 6. For example, as shown in FIG. 6 and described above, a user may identify locations of struts within AP View image 601-A and LAT View image 601-B as part of the frame matching process. However, it may often be difficult for the user to identify and mark positions of certain fixator elements, such as struts, within the images. In particular, depending upon the location and orientation from which an image is captured, struts and other fixator elements may not be easily identified, such as because they may wholly or partially overlap one another or may otherwise be obscured within the images. For example, in some cases, it may often be more difficult to identify struts within a lateral image than to identify the struts within an anterior image, such as because the struts may often overlap one another when viewed within the lateral image. For example, as shown in FIG. 6, several of the struts are positioned within LAT View image 601-B such that they are displayed very close together. Specifically, Strut 1 and Strut 2 are displayed very close together on the left side of the image 601-B, Strut 3 and Strut 6 are displayed very close together in the middle of the image 601-B, and Strut 4 and Strut 5 are displayed very close together on the right side of the image 601-B. This may make it cumbersome for the user to identify, for example, Strut 1 as opposed to Strut 2, Strut 3 as opposed to Strut 6, or Strut 4 as opposed to Strut 5.
In some examples, to alleviate the above and other problems, a guided frame matching technique may be employed. Some examples of the guided frame matching technique will now be described with reference toFIGS. 14-17. Specifically, referring now toFIG. 14, an example process for providing a graphical projection of a fixator using a guided frame matching technique will now be described in detail. As described above, the fixator may include fixator elements such as rings and struts and may be for correcting a deformity of first and second anatomical structure segments. The process ofFIG. 14 is initiated atoperation1410, at which first and second images of the first and the second anatomical structure segments and the fixator attached thereto are displayed. The first and the second images may have respective image planes. As shown inFIG. 2 and described above, there is an angle α between the image planes of theimages126,128.
A first example of the display of the first and the second images atoperation1410 is shown inFIG. 6, which includes AP View image601-A and LAT View image601-B as described above. An additional example of the display of the first and the second images atoperation1410 is shown inFIG. 15A, which will now be described in detail. In particular,FIG. 15A displays an AP View image1501-A and a LAT View image1501-B, which are images of afixator1510 includingproximal fixator ring1511,distal fixator ring1512 and fixator struts1513. The images1501-A and1501-B show thefixator1510 attached to a firstanatomical structure segment1521 and a secondanatomical structure segment1522. The first and second images of the first and the second anatomical structure segments and the fixator attached thereto may be displayed atoperation1410 using one or more graphical user interfaces of a computing system. For example, images1501-A and1501-B ofFIGS. 15A-17B may be displayed using one or more graphical user interfaces of a computing system. It is noted that any, or all, of the contents shown in each ofFIGS. 15A-17B may be displayed using one or more graphical user interfaces of a computing system.
It is noted that, in the examples of FIGS. 15A-17B, the images 1501-A and 1501-B are simulated images, as opposed to actual x-rays (as in FIG. 6) or other images captured from an imager or imaging source. It is noted that the simulated images of FIGS. 15-17 are provided merely for ease of illustration of the concepts described herein. In practice, the images 1501-A and 1501-B may be non-simulated images, such as x-rays, which are captured using an imager, imaging source, x-ray imager, camera or other image capture device, and that show an actual fixator that is physically attached to an actual anatomical structure segment (such as shown in FIG. 6). Thus, even though images 1501-A and 1501-B are displayed as simulations, the concepts described herein should be understood to also be applicable to non-simulated images (i.e., images that were captured using an imager, imaging source, x-ray imager, camera or other image capture device) similar to the images 601-A and 601-B of FIG. 6.
At operation 1412, first indications are received of first locations, within the first image, of a plurality of elements of the fixator, such as struts of the fixator, for example using the one or more graphical user interfaces of the computing system. For example, as described above with respect to FIG. 6, the user may indicate locations of struts within the AP View image 601-A, such as by clicking on endpoints of the struts (e.g., hinges) using an attached mouse or other input device. As described above, the strut indicator button 611-A for Strut 1 may be pre-selected automatically for the user. Upon selection (or automatic pre-selection) of the strut indicator button 611-A for Strut 1, the user may proceed to draw (or otherwise indicate) a representation of Strut 1 within AP View image 601-A. For example, in some cases, the user may use a mouse or other input device to select a location 621 (e.g., a center point) of a proximal hinge for Strut 1 within image 601-A. In some examples, the user may then use a mouse or other input device to select a location 622 (e.g., a center point) of the distal hinge of Strut 1 within image 601-A. In some examples, the user may indicate the location and/or length of Strut 1 by selecting the locations of the proximal and distal hinges and/or as the endpoints of a line or vector that represents the location and/or length of Strut 1. For example, as shown in FIG. 6, the software may generate points or circles at the locations 621 and 622 of the proximal and distal hinges selected by the user within image 601-A. Additionally, the software may generate a line 623 representing the location and/or length of Strut 1 that connects the points or circles at the locations 621 and 622 of the proximal and distal hinges selected by the user within image 601-A. Any other appropriate input techniques may also be employed by the user to indicate a location and/or length of Strut 1 within image 601-A, such as generating line 623 by dragging and dropping a mouse, using a finger and/or pen on a touch screen, keyboard, and others. In some examples, the above described process may be repeated to draw points representing proximal and distal hinges and lines representing the locations and/or lengths of each of the six struts in the AP View image 601-A. A similar technique may also be employed to indicate the locations of each of the six fixator struts 1513 in AP View image 1501-A of FIG. 15A.
In some examples, after the user indicates locations of thestruts1513 within the AP View image1501-A, the software may use the indicated strut locations to determine locations of the fixator rings1511 and1512 within the AP View image1501-A. The software may then generate ringgraphical representations1531 and1532, corresponding to the fixator rings1511 and1512, respectively, and display the ringgraphical representations1531 and1532 at the determined locations of the fixator rings1511 and1512 within the AP View image1501-A. Referring now toFIG. 15B, it is seen that ringgraphical representations1531 and1532 are generated by the software and displayed within AP View image1501-A at the corresponding locations of the respective fixator rings1511 and1512. It is noted that the fixator ringgraphical representations1531 and1532 are shown inFIG. 15B with a different shade/color than the actual fixator rings1511 and1512 to indicate that the fixator ringgraphical representations1531 and1532 are generated by the software and are not included in the actual underlying AP View image1501-A. Specifically, the fixator ringgraphical representations1531 and1532 are shown in blue color/shades, while the fixator rings1511 and1512 are shown in black color/shades. The presence of the fixator ringgraphical representations1531 and1532 in the AP View image1501-A ofFIG. 15B (but not in the LAT View image1501-B ofFIG. 15B) indicates that fixator element location information has been received for the AP View image1501-A ofFIG. 15B but has not yet been received for the LAT View image1501-B ofFIG. 15B.
Atoperation1414, a graphical projection of the fixator is overlaid, for example using the one or more graphical user interfaces of the computing system, on the second image. The graphical projection of the fixator may include, for example, graphical representations of the rings and/or the struts of the fixator. For example, referring now toFIG. 16, it is seen that agraphical projection1600 of the fixator is displayed that includes agraphical representation1611 of the proximal ring and agraphical representation1612 of the distal ring. As shown, thegraphical projection1600, includinggraphical representations1611 and1612 is overlaid on the second image, which in this example is the LAT View image1501-B. Although not shown inFIG. 16, in addition or as an alternative tographical representations1611 and1612 of the fixator rings, thegraphical projection1600 may also include graphical representations of other fixator elements, such as fixator struts. Moreover, in some examples, the graphical representations of the fixator struts included in a graphical projection may be color coded such that different struts are shown in different colors, for example to match different colors of strut indicator buttons611-A and611-B.
Thegraphical projection1600 of the fixator may be rotated relative to the first locations of the plurality of fixator elements identified in the first image. Specifically, thegraphical projection1600 of the fixator may be rotated relative to the first locations based at least in part on an angle (such as at the exact angle or at an approximation of the angle) of image planes of the first and the second images with respect to one another. As shown inFIG. 2 and described above, there is an angle α between the image planes of theimages126,128. Thus, in the example ofFIG. 16, the AP View image1501-A may have a respective AP View image plane, and the LAT View image1501-B may have a respective LAT View image plane at an angle of ninety degrees with respect to the AP View image plane. Accordingly, in the example ofFIG. 16, thegraphical projection1600 of the fixator is rotated ninety degrees relative to the first locations of the plurality of fixator elements identified in the first image. For example, both the proximal ringgraphical representation1611 and the distal ringgraphical representation1612 ofFIG. 16 are rotated ninety degrees relative to the proximal fixator ring1511 (and/or the respective ring representation1531) and distal fixator ring1512 (and/or the respective ring representation1532) in the AP View image1501-A.
Thegraphical projection1600 of the fixator may be rotated based at least in part on the angle between image planes of the images because that rotation may correspond to the expected position of the fixator in the second image. For example, if an image plane of the LAT View image1501-B is at an angle of ninety degrees to an image plane of the AP View image1501-A, then it may be expected that the locations of the fixator rings in the LAT View image1501-B will be rotated ninety degrees relative to the locations of the fixator rings in the AP View image1501-A. In this way, the overlaying of thegraphical projection1600 on the second image may assist the user in identifying locations of the plurality of fixator elements in the second image. In some examples, a user may provide a numerical value, such as a quantity of degrees (e.g., ninety degrees), that expressly indicates to the software the value of the angle between image planes of the images. In other examples, the value of the angle may be inferred by the software based on descriptions of the images (e.g., anteroposterior, anterior, posterior, lateral, medial, etc.) or using other techniques. In the examples ofFIGS. 15-17, image1501-A is an AP View image and image1501-B is a lateral image. It is noted, however, that the guided frame matching techniques described herein may be used between any different combinations of images taken from any directions and orientations and having image planes at any angle with respect to one another.
Additionally, it is noted that the software may also manipulate other features of the graphical projection1600 (e.g. size, location, orientations, etc.) such as to correct for other differences (e.g., location, orientation, zoom level, etc.) between the first and the second images. For example, in some cases, if the second image was captured from a closer location to the fixator and/or is more zoomed-in than the first image, then the software may correct for this by enlarging the size of thegraphical projection1600 relative to the size of the fixator elements in the first image. By contrast, in some cases, if the second image was captured from a further location from the fixator and/or is more zoomed-out than the first image, then the software may correct for this by reducing the size of thegraphical projection1600 relative to the size of the fixator elements in the first image.
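By way of illustration, an initial graphical projection for the second image could be produced roughly as in the following sketch, which rotates modeled fixator points by the angle between the image planes and applies an optional scale correction; the axis conventions and the simple parallel projection are assumptions made for this example:

```python
import numpy as np

def project_fixator_points(points_3d, angle_deg=90.0, scale=1.0):
    """Rotate modeled fixator element points about the vertical (superior-
    inferior) axis by the angle between the image planes, apply an optional
    scale correction for zoom differences, and keep the in-plane (x, y)
    coordinates for the overlay on the second image."""
    a = np.radians(angle_deg)
    R_y = np.array([[np.cos(a), 0.0, np.sin(a)],
                    [0.0,       1.0, 0.0],
                    [-np.sin(a), 0.0, np.cos(a)]])
    rotated = (R_y @ np.asarray(points_3d, dtype=float).T).T
    return scale * rotated[:, :2]
```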
Thus, in some examples, thegraphical projection1600 of the fixator may be generated based, at least in part, on the first locations of the plurality of fixator elements in the first image indicated atoperation1412. Additionally or alternatively, in some examples, thegraphical projection1600 of the fixator may be generated based, at least in part, on configuration information for the fixator that is provided to the software by the user, such as ring types (e.g., full ring, foot plate, etc.), ring sizes, strut lengths, indications of mounting points (e.g., ring holes), and other information. Various types of configuration information and techniques for providing such information to the software are described in detail above, such as with respect toFIG. 5 and operation314 ofFIG. 3A, and are not repeated here.
At operation1416, the software may allow a user to manipulate (e.g., resize, rotate, move, etc.) the graphical projection and/or the second image. For example, the user may manipulate the graphical projection to make it more precisely align with the positions of the fixator elements in the second image. For example, the software may provide controls that allow resizing (making the graphical projection larger or smaller) or rotating of the graphical projection relative to its initial placement by the software when being overlaid upon the second image atoperation1414. For example, in some cases, it may be necessary to resize and/or rotate the graphical projection to correct for slight differences in the actual angle between the first and the second images relative to the expected angle (e.g., if the images are actually at an angle of ninety-two degrees rather than ninety degrees, etc.), to correct for differences in distance, position or orientation of the first and the second images relative to the objects included in the images, or for other reasons. In some examples, the software may provide various controls, such as buttons, that allow selections of operations such as move, resize and rotate, and the software may be configured to receive input from input devices, such as a mouse or keyboard, to accomplish those manipulations, for example via drag-and-drop, button clicks, keystrokes, etc.
In some examples, in addition or as an alternative to allowing a user to manipulate the graphical projection, the software may allow the user to manipulate the second image (e.g., LAT View image1501-B) upon which the graphical projection is overlaid. For example, in some cases, the software may allow the user to resize, rotate and/or move the second image and/or elements shown within the second image, such as to assist in aligning the fixator elements shown in the second image with corresponding elements of the graphical projection. Referring now toFIG. 17A, it is seen that the user has manipulated the second image, which is LAT View image1501-B, by moving the LAT View image1501-B down and to the right from its prior screen/interface location shown inFIG. 16. By moving the LAT View image1501-B in this manner (without moving the graphical projection1600), this allows the fixator elements in the LAT View image1501-B to be moved down and to the right such that they align with corresponding elements ofgraphical projection1600. For example, as shown inFIG. 17A, thegraphical representations1611 and1612 of the fixator rings substantially align with the respective fixator rings1511 and1512. Thus, only small portions of the fixator rings1511 and1512 are visible inFIG. 17 because they have been almost entirely overlaid by the respectivegraphical representations1611 and1612 of the fixator rings. In particular, inFIG. 17A, proximal ringgraphical representation1611 substantially aligns with (and almost entirely overlays)proximal fixator ring1511, and distal ringgraphical representation1612 substantially aligns with (and almost entirely overlays)distal fixator ring1512.
At operation 1418, second indications are received of second locations, within the second image, of the plurality of elements of the fixator, for example using the one or more graphical user interfaces of the computing system. For example, in some cases, once the graphical projection 1600 of the fixator has been satisfactorily aligned with the position of the fixator in the second image (e.g., in the LAT View image 1501-B), the user may employ the graphical projection 1600 as a guide to assist in identifying the locations of the plurality of fixator elements in the second image (e.g., in the LAT View image 1501-B). For example, although not shown in FIGS. 16 and 17A, the graphical projection 1600 of the fixator may include graphical representations of the fixator struts. In these examples, when the user aligns the graphical projection of the fixator with the position of the fixator in the second image, the graphical representations of the fixator struts in the graphical projection will align with (and overlay) the respective locations of the fixator struts within the second image. As also noted above, graphical representations of the fixator struts in the graphical projection may be color coded such that different struts are shown in different colors, for example to match different colors of strut indicator buttons 611-A and 611-B and different colors of struts in the first image. The color coding of the graphical representations of the struts in the graphical projection may therefore assist the user in distinguishing between respective struts that are positioned closely together in the second image. For example, as shown in FIG. 17A, a first strut 1513-A and a second strut 1513-B are positioned closely together on the right side of the LAT View image 1501-B. If graphical representations of these struts are included in the graphical projection 1600 and are aligned with (and overlaying) the struts 1513-A and 1513-B, then this may assist the user in distinguishing between the struts 1513-A and 1513-B in the LAT View image, particularly if the graphical representations of the struts in the graphical projection 1600 are color coded. For example, if a graphical representation of the first strut 1513-A is shown in red, then the user may immediately appreciate that the red colored strut representation is aligned with and overlaying the first strut 1513-A. Additionally, if a graphical representation of the second strut 1513-B is shown in orange, then the user may immediately appreciate that the orange colored strut representation is aligned with and overlaying the second strut 1513-B. In this manner, the user may employ graphical representations of the struts (and/or other fixator elements) in the graphical projection as a guide to identify respective ones of the plurality of fixator elements in the second image. It is noted that, in addition to the struts, graphical representations 1611 and 1612 of the rings may also be color coded.
Accordingly, by using thegraphical projection1600 of the fixator as a guide, the user may identify locations of the plurality of the fixator elements in the second image. The user may then indicate the locations of these fixator elements to the software, such as by using the same or a similar process as was used to identify the plurality of fixator elements in the first image atoperation1412. For example, the techniques described above in operation1412 (e.g., identifying proximal and distal hinges or endpoints of each strut) for the first image (AP View images601-A and1501-A) may be repeated for the second image (LAT View images601-B and1501-B), such as by using LAT View strut indicator buttons611-B to draw points representing proximal and distal hinges and lines representing the locations and/or lengths of each of the six struts in the LAT View images601-B and1501-B.
In some examples, after the user has indicated locations of the struts 1513 and/or other fixator elements within the LAT View image 1501-B at operation 1418 (e.g., using the graphical projection 1600 of the fixator as a guide), the software may use the indicated fixator element locations to determine locations of the fixator rings 1511 and 1512 within the LAT View image 1501-B. The software may then generate ring graphical representations 1731 and 1732, corresponding to the fixator rings 1511 and 1512, respectively, and display the ring graphical representations 1731 and 1732 at the determined locations of the fixator rings 1511 and 1512 within the LAT View image 1501-B. Referring now to FIG. 17B, it is seen that ring graphical representations 1731 and 1732 are generated by the software and displayed within LAT View image 1501-B at the corresponding locations of the respective fixator rings 1511 and 1512.
At operation 1420, the first locations of the plurality of fixator elements in the first image (indicated at operation 1412) and the second locations of the plurality of fixator elements in the second image (indicated at operation 1418 using the graphical projection as a guide) are used to determine positions and orientations of the first and second anatomical structure segments in three-dimensional space. For example, as described in detail above with respect to operation 322 of FIG. 3A, imaging scene parameters may be used to determine positions and orientations of the first and second anatomical structure segments in three-dimensional space. As also described above, the imaging scene parameters may be obtained by comparing the locations of representations of particular components, or fixator elements, of the fixator within the two-dimensional spaces of the first and the second images, with the corresponding locations of those same fixator elements in actual, three-dimensional space. As also described above, such as with respect to operation 338 of FIG. 3B, manipulations to the fixation apparatus for correction of the anatomical structure deformity (i.e., a treatment plan) may be determined using the positions and orientations of the first and second anatomical structure segments in three-dimensional space. Specifically, the treatment plan may be determined based, at least in part, on a determination of desired changes to the positions and/or orientations of the anatomical structure segments, for instance how the anatomical structure segments can be repositioned with respect to each other in order to promote union between the anatomical structure segments.
Three-Dimensional Overview of Imaging Scene of the Fixator
Another technique for improving accuracy and reliability of input values and resulting calculations is disclosed herein that provides a three-dimensional overview of an imaging scene of the fixator. The three-dimensional overview may be used to provide feedback and visual confirmation to help ensure that the calculated positions and orientations of anatomical structures are reliable and correct. Various techniques for generating the three-dimensional overview will now be described in detail with reference to FIGS. 18-25. In particular, referring now to FIG. 18, an example process for generating a three-dimensional overview of an imaging scene of a fixator including rings and struts to correct a deformity of first and second anatomical structure segments will now be described in detail. The process of FIG. 18 is initiated at operation 1810, at which first and second images, such as x-rays, of the fixator and the first and the second anatomical structure segments attached thereto are received, for example by a computing system. As described above, the images can be captured using one or more imaging sources, such as x-ray imagers and/or image capturing devices. As also described above, the first and the second images have respective first and second image planes at an angle with respect to one another. The images may be received by a computing system, such as by scanning or otherwise electronically loading or communicating image data for the images to the computing system.
As set forth above, upon receiving the first and second images, frame matching techniques may be employed in association with the first and second images, for example as described above with reference to operations 314-334 of FIGS. 3A-3B, such as to determine positions and orientations of the anatomical structure segments, the fixator, the imaging sources, and other elements of the fixator imaging scene in three-dimensional space. These techniques are described in detail above and are not repeated here. In some examples, upon completion of the frame matching process, for example including operations 314-334 (and optionally the guided frame matching techniques of FIG. 14), a three-dimensional overview may be displayed to the user, such as will now be described in detail. Specifically, at operation 1812 of FIG. 18, a three-dimensional graphical model is displayed, for example using one or more graphical user interfaces of the computing system. The three-dimensional graphical model may include a first image representation that represents the first image and a second image representation that represents the second image. For example, referring now to FIG. 19, it is seen that model 1900, which is a three-dimensional graphical model, includes an AP View image representation 1901-A that represents an AP View image and a LAT View image representation 1901-B that represents a LAT View image. In the example of FIG. 19, the image representations 1901-A and 1901-B are graphical simulations of the first and second images, which may be x-ray images. In some examples, however, the image representations 1901-A and 1901-B may include actual x-ray images or other actual images that are captured from the imaging sources. It is noted that any, or all, of the contents shown in each of FIGS. 19-25 may be displayed using one or more graphical user interfaces of a computing system.
In the model 1900, the image representations 1901-A and 1901-B may be displayed with respect to one another at the same angle as the angle between their respective image planes. For example, the image planes of the AP View image and the LAT View image may have an angle of approximately (but in some cases not exactly) ninety degrees with respect to one another. Thus, in the model 1900, the AP View image representation 1901-A and the LAT View image representation 1901-B are positioned with respect to one another at the same angle as their respective image planes (approximately ninety degrees). In some examples, the software may calculate and display the actual angle between the image planes of the first and the second images as a numerical value.
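(As an illustrative aside, and not language from the patent: the angle between two image planes can be computed from the normals of the planes, for example the viewing directions recovered during frame matching. The sketch below assumes such normals are available; the function name is hypothetical.)

    import numpy as np

    def angle_between_image_planes(normal_a, normal_b):
        # Returns the angle, in degrees, between two image planes given their
        # (not necessarily normalized) normal vectors.
        n_a = np.asarray(normal_a, dtype=float)
        n_b = np.asarray(normal_b, dtype=float)
        cos_angle = np.dot(n_a, n_b) / (np.linalg.norm(n_a) * np.linalg.norm(n_b))
        return np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))

    # Example: planes that are close to, but not exactly, ninety degrees apart.
    print(angle_between_image_planes([0.0, 0.0, 1.0], [1.0, 0.0, 0.08]))  # ~85.4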
In some examples, the model 1900 may display graphical representations of respective locations of imaging sources of the first and the second images. In the example of FIG. 19, AP imaging source representation 1911-A represents a location of an imaging source (e.g., x-ray imager, image capture device, etc.) of the AP View image, while LAT imaging source representation 1911-B represents a location of an imaging source (e.g., x-ray imager, image capture device, etc.) of the LAT View image. The imaging source representations 1911-A and 1911-B may indicate respective virtual locations corresponding to the first and the second imaging sources. For example, upon performance of the frame matching process, such as described above with reference to operations 314-334 of FIGS. 3A-3B, the software may be capable of calculating positions and orientations of an entire three-dimensional scene including positions and orientations of the anatomical structure segments, the fixator, and the imaging sources with respect to one another. Thus, based on the frame matching process, the software may determine respective virtual locations corresponding to the first and the second imaging sources. These virtual locations may reflect the distance, positions, and/or orientations of the imaging sources relative to the anatomical structure segments and/or the fixator.
In the example of FIG. 19, a reference location 1913 is shown in both the AP View image representation 1901-A and the LAT View image representation 1901-B. In this particular example, the reference location 1913 is a reference point, such as an endpoint, on a proximal anatomical structure segment 1915.
In some examples, a first virtual line may connect a first virtual location corresponding to the first imaging source to the reference location in the first image representation. In the example of FIG. 19, a beam 1912-A is shown as a graphical representation of this first virtual line that is displayed in the model 1900. As shown, the beam 1912-A connects a first virtual location corresponding to the first imaging source (e.g., the location indicated by AP imaging source representation 1911-A) to the reference location in the first image representation (e.g., the reference location 1913 in the AP View image representation 1901-A).
Also, in some examples, a second virtual line may connect a second virtual location corresponding to the second imaging source to the reference location in the second image representation. In the example of FIG. 19, a beam 1912-B is shown as a graphical representation of this second virtual line that is displayed in the model 1900. As shown, the beam 1912-B connects a second virtual location corresponding to the second imaging source (e.g., the location indicated by LAT imaging source representation 1911-B) to the reference location in the second image representation (e.g., the reference location 1913 in the LAT View image representation 1901-B). Although beams 1912-A and 1912-B are displayed in model 1900, there is no requirement that the first and second virtual lines must be displayed in the model, and some models may not display the beams 1912-A and 1912-B (or may display only portions thereof).
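(The following is a minimal illustrative sketch, not part of the patent text, of how such a virtual line can be represented for later geometric computations: as an origin at the virtual imaging-source location and a unit direction toward the reference location on the image representation. The function name and coordinate values are hypothetical.)

    import numpy as np

    def virtual_line(source_location, reference_location):
        # Represent the virtual line as (origin, unit direction) running from the
        # virtual imaging-source location through the reference location shown on
        # the corresponding image representation.
        origin = np.asarray(source_location, dtype=float)
        direction = np.asarray(reference_location, dtype=float) - origin
        return origin, direction / np.linalg.norm(direction)

    # Example: a source at (0, 0, 1000) and a reference location on the image plane.
    p1, d1 = virtual_line([0.0, 0.0, 1000.0], [12.5, -30.0, 0.0])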
At operation 1814, a first graphical representation associated with a shortest distance (e.g., intersection) between the first virtual line and the second virtual line is displayed in the three-dimensional graphical model. In the example of FIG. 19, beams 1912-A and 1912-B intersect, and a point 1914 is shown at an intersection of the beams 1912-A and 1912-B, which corresponds to the shortest distance between beams 1912-A and 1912-B. Thus, in this example, the point 1914 is a first graphical representation of the intersection between the first virtual line (represented by beam 1912-A) and the second virtual line (represented by beam 1912-B). In some examples, as opposed to a point, the first graphical representation may simply be an intersection between two graphical lines or beams or may be any other type of graphical representation. The first graphical representation (e.g., point 1914) represents a physical location of the reference location in three-dimensional space. Thus, in the example of FIG. 19, the point 1914 represents a physical location of the endpoint of proximal anatomical structure segment 1915 (which is the reference location 1913) in three-dimensional space.
In the example described above, the beams 1912-A (representing the first virtual line) and 1912-B (representing the second virtual line) intersect one another at the point 1914. In some examples, however, the first and second virtual lines described above may not actually intersect one another. In some examples, in these scenarios, as opposed to being displayed at the intersection point between the lines (since no intersection point exists), the first graphical representation may instead be displayed at a point that bisects (i.e., divides into two equal halves) the shortest distance between the first virtual line and the second virtual line. Specifically, a vector may connect respective points on the first virtual line and the second virtual line at the shortest distance, and the first graphical representation may be displayed at the midpoint of the vector.
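(The geometry behind this placement can be illustrated with a short sketch, given here as an assumption-laden example rather than the patent's actual implementation: it finds the closest points on two lines, each given as an origin and unit direction, and returns the midpoint of the segment joining them, which coincides with the intersection when the lines meet.)

    import numpy as np

    def reference_point_from_two_lines(p1, d1, p2, d2):
        # p1, d1: origin and unit direction of the first virtual line.
        # p2, d2: origin and unit direction of the second virtual line.
        # Returns (midpoint, gap), where midpoint bisects the shortest-distance
        # vector between the lines and gap is that shortest distance.
        p1, d1, p2, d2 = (np.asarray(v, dtype=float) for v in (p1, d1, p2, d2))
        w0 = p1 - p2
        a, b, c = np.dot(d1, d1), np.dot(d1, d2), np.dot(d2, d2)
        d, e = np.dot(d1, w0), np.dot(d2, w0)
        denom = a * c - b * b
        if np.isclose(denom, 0.0):          # lines are (nearly) parallel
            t1, t2 = 0.0, e / c
        else:
            t1 = (b * e - c * d) / denom
            t2 = (a * e - b * d) / denom
        closest_1 = p1 + t1 * d1
        closest_2 = p2 + t2 * d2
        midpoint = 0.5 * (closest_1 + closest_2)
        gap = float(np.linalg.norm(closest_1 - closest_2))
        return midpoint, gap

When the two lines truly intersect, the gap is zero and the midpoint is the intersection itself; when they are skew, the midpoint bisects the shortest-distance vector, matching the placement described above.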
Operation 1814 is described above with respect to reference location 1913 on proximal anatomical structure segment 1915. It is noted, however, that operation 1814 may be repeated for any number of other reference locations. For example, FIG. 19 also shows a reference location 1923 in both the AP View image representation 1901-A and the LAT View image representation 1901-B. In this particular example, the reference location 1923 is a reference point, such as an endpoint, on a distal anatomical structure segment 1925. As shown, a beam 1922-A connects a first virtual location corresponding to the first imaging source (e.g., the location indicated by AP imaging source representation 1911-A) to the reference location 1923 in the AP View image representation 1901-A. Additionally, a beam 1922-B connects a second virtual location corresponding to the second imaging source (e.g., the location indicated by LAT imaging source representation 1911-B) to the reference location 1923 in the LAT View image representation 1901-B. Furthermore, a point 1924 is shown at an intersection of the beams 1922-A and 1922-B. Thus, in this example, the point 1924 is a first graphical representation of the intersection between the beam 1922-A and the beam 1922-B. The first graphical representation (e.g., point 1924) represents a physical position of the reference location in three-dimensional space. Thus, in this example, the point 1924 represents a physical location of the endpoint of distal anatomical structure segment 1925 (which is the reference location 1923) in three-dimensional space.
At operation 1816, graphical representations of elements (e.g., rings, struts, etc.) of the fixator may be displayed, in the three-dimensional graphical model, at virtual locations that represent physical locations of the elements of the fixator in the three-dimensional space. For example, referring now to FIG. 20, it is seen that a user may activate a control of the software, such as hexapod visible control 2001, which causes the model 1900 to display graphical representations of the fixator rings. The user may also deactivate the hexapod visible control 2001, which causes the model 1900 to cease to display graphical representations of the fixator rings. As shown in FIG. 20, activation of the hexapod visible control 2001 causes a proximal ring graphical representation 2011 and a distal ring graphical representation 2012 to be displayed in the model 1900. Additionally, referring now to FIG. 21, it is seen that a user may activate a control of the software, such as struts visible control 2101, which causes the model 1900 to display graphical representations 2110 of the fixator struts. The user may also deactivate the struts visible control 2101, which causes the model 1900 to cease to display graphical representations 2110 of the fixator struts. As shown, the graphical representations 2110 of the fixator struts may be color coded, such that different struts are shown in different colors, for example to match different colors of strut indicator buttons 611-A and 611-B of FIG. 6. In some examples, the physical locations of the fixator elements in three-dimensional space may be determined by the software by performing the frame matching techniques described above (e.g., operations 314-334 of FIGS. 3A-3B). The software may then determine virtual locations within the model 1900 that correspond to these physical locations, and the software may display the graphical representations of the fixator elements at the determined virtual locations.
At operation 1818, graphical representations of the first and the second anatomical structure segments may be displayed, in the three-dimensional graphical model, at virtual locations that represent physical locations of the first and the second anatomical structure segments in the three-dimensional space. For example, referring now to FIG. 21, it is seen that a user may activate a control of the software, such as proximal stick visible control 2102, which causes the model 1900 to display a graphical representation 2112 of the proximal anatomical structure segment 1915. The user may also deactivate the proximal stick visible control 2102, which causes the model 1900 to cease to display the graphical representation 2112 of the proximal anatomical structure segment 1915. As also shown in FIG. 21, a user may activate a control of the software, such as distal stick visible control 2103, which causes the model 1900 to display a graphical representation 2113 of the distal anatomical structure segment 1925. The user may also deactivate the distal stick visible control 2103, which causes the model 1900 to cease to display the graphical representation 2113 of the distal anatomical structure segment 1925. In some examples, the physical locations of the anatomical structure segments in three-dimensional space may be determined by the software by performing the frame matching techniques described above (e.g., operations 314-334 of FIGS. 3A-3B). The software may then determine virtual locations within the model 1900 that correspond to these physical locations, and the software may display the graphical representations of the anatomical structure segments at the determined virtual locations.
Thus, as described above, the model 1900 may provide a three-dimensional graphical model that displays the first and the second images (or representations thereof) in combination with graphical representations of imaging sources, graphical representations of reference locations (e.g., endpoints of the first and the second anatomical structure segments), graphical representations of virtual lines connecting virtual locations corresponding to the imaging sources to the reference locations and intersections thereof, graphical representations of the fixator elements (e.g., rings, struts, etc.), and graphical representations of the anatomical structure segments, thereby indicating spatial relationships between these objects. Moreover, each of the graphical representations may be displayed at virtual locations that represent respective physical locations in three-dimensional space. As should be appreciated, the above described three-dimensional graphical model may therefore provide visual feedback that allows the user to locate points and other locations in three-dimensional space and review and confirm the correctness of the values calculated during the frame matching process. For example, the user may confirm that reference locations 1913 and 1923, such as endpoints of the anatomical structure segments, are at correct locations in relation to other objects, such as the fixator rings, fixator struts, and other locations on the anatomical structure segments. For example, in some cases, if the points 1914 and/or 1924 were positioned at an incorrect location relative to the graphical representations 2011 and 2012 of the fixator rings and/or the graphical representations 2110 of the fixator struts, then this would be an indication to the user that one or more calculations have not been performed correctly, for example due to user error in identifying input values during the frame matching process. For example, if the points 1914 and/or 1924 are positioned at nonsensical locations (e.g., locations that collide with the fixator rings or struts, locations outward from the struts, locations above an upper ring or below a lower ring, etc.), then this may be a clear indication of an error in the frame matching calculations. Upon determining such an error, the user may choose to review and resubmit any or all input values provided during the frame matching process and then reperform the calculations.
In some examples, the model 1900 may be zoomable, resizable and rotatable, and otherwise manipulatable by the user. For example, the user may zoom in to enlarge portions of the model 1900 and/or zoom out to increase the field of view. As another example, the user may pan and/or rotate the model, such as in any combination of directions (e.g., up, down, left, right, pitch, yaw, etc.). In some examples, the model 1900 may be manipulated to be shown from various different perspectives, such as respective perspectives that correspond to the first and the second images. For example, referring now to FIG. 22, it is seen that a user may select a Show from AP button 2200 to cause the model 1900 to be shown from an AP view perspective corresponding to the AP View image. Additionally, referring now to FIG. 23, it is seen that a user may select a Show from LAT button 2300 to cause the model 1900 to be shown from a LAT view perspective corresponding to the LAT View image. Additionally, referring now to FIG. 24, it is seen that the model 1900 may be rotated vertically relative to the perspectives shown in FIGS. 19-23.
As described above, the first and second images corresponding to image representations 1901-A and 1901-B have image planes at an angle relative to one another. For example, the AP View and LAT View images corresponding to image representations 1901-A and 1901-B may have image planes that are approximately orthogonal to one another. In some cases, however, these image planes may not be exactly orthogonal to one another. In some examples, when the image planes are not exactly orthogonal, the software may display a modified second image representation that is truly orthogonal to the first image representation. The modified second image representation may represent a modified second image having an image plane that is truly orthogonal to the image plane of the first image. For example, referring now to FIG. 25, a modified LAT View image representation 2500 is shown in the model 1900. The modified LAT View image representation 2500 is a modification of LAT View image representation 1901-B. The modified LAT View image representation 2500 is truly orthogonal to the AP View image representation 1901-A within the model 1900. The modified LAT View image representation 2500 may represent a modified LAT View image having an image plane that is truly orthogonal to the image plane of the AP View image. In some examples, a user may activate Ideal Plane Visible control 2510 in order to cause the modified LAT View image representation 2500 to be shown in the model 1900. The user may also deactivate the Ideal Plane Visible control 2510 in order to cause the modified LAT View image representation 2500 to cease to be shown in the model 1900.
In some examples, the software may calculate the angle of the anatomical structure segments that would be displayed in the second image if the second image were truly orthogonal to the first image. The software may then display, in the modified second image representation, a modified second image in which the anatomical structure segments are displayed with the calculated angle with respect to one another. For example, the software may calculate the angle of the anatomical structure segments 1915 and 1925 that would be displayed in the LAT View image if the LAT View image were truly orthogonal to the AP View image. The software may then display, in the modified LAT View image representation 2500, a modified LAT View image in which the anatomical structure segments 1915 and 1925 are displayed with the calculated angle with respect to one another. In some examples, the angles of the anatomical structure segments in the modified LAT View image representation 2500 may be determined based on a knowledge of the positions and orientations of the anatomical structure segments in physical three-dimensional space, such as may be determined by performing the frame matching techniques described above (e.g., operations 314-334 of FIGS. 3A-3B).
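(To make the geometry concrete, the following hypothetical sketch, which is not drawn from the patent, projects the three-dimensional direction vectors of the two segments onto an ideal image plane, defined by the viewing direction of a truly orthogonal second view, and measures the angle between the projections. All names are assumptions.)

    import numpy as np

    def apparent_segment_angle(proximal_dir, distal_dir, ideal_view_dir):
        # Angle, in degrees, between the two anatomical structure segments as
        # they would appear when projected onto an image plane orthogonal to
        # ideal_view_dir (e.g., a LAT view truly orthogonal to the AP view).
        v = np.asarray(ideal_view_dir, dtype=float)
        v = v / np.linalg.norm(v)

        def project(u):
            u = np.asarray(u, dtype=float)
            u_in_plane = u - np.dot(u, v) * v   # remove the out-of-plane component
            return u_in_plane / np.linalg.norm(u_in_plane)

        p, d = project(proximal_dir), project(distal_dir)
        return np.degrees(np.arccos(np.clip(np.dot(p, d), -1.0, 1.0)))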
As should be appreciated, the ability to generate and display a modified second image representation (e.g., modified LAT View image representation 2500) may provide a number of benefits. For example, when a surgeon measures anatomical structure deformities on two-dimensional images, the assumption is that the images are orthogonal. Any deviation from the orthogonality of these images may lead to an inaccurate measurement, for example as angles in an anterior view (varus/valgus) and a lateral view (apex anterior/posterior) begin to mix. This deviation may result in a residual deformity after the correction. Using the frame matching techniques described above, the software has knowledge of the positions and orientations of the anatomical structure segments in physical three-dimensional space. Thus, the software may calculate the angle of the anatomical structure segments that would be displayed in the second image if the second image were truly orthogonal to the first image. In this way, the software may demonstrate to the user how measured anatomical structure deformity values from a second image that is not truly orthogonal to the first image may be adjusted to corrected anatomical structure deformity values that correspond to a modified second image that would be truly orthogonal to the first image, and provide guidance to validate the corrected values for use in the calculations described above. In some examples, the user may be provided with an option to override the corrected values (such as to use the measured values) if the user is not confident with the corrected values.
It is noted that, while the model 1900 described above shows image representations for both the first and second images, a three-dimensional graphical model may also be provided that shows only a single image representation for a single image. Such a single-image model may be useful to provide feedback to the user at an earlier stage of the input process, such as when frame matching has only been performed on a single image, for example when a user has indicated strut locations on the first image but not yet on the second image. Additionally, while the model 1900 described above shows points 1914 and 1924 that represent reference locations of anatomical structures, a three-dimensional graphical model may also be provided that need not necessarily include anatomical structure information. This type of model may also be useful to provide feedback to the user at an earlier stage of the input process, such as when the user has indicated locations of fixator elements in the first and/or the second image but has not yet indicated locations of anatomical structures in the first and/or the second image.
For example, for scenarios in which a user has indicated strut locations on a first image (but not yet for the second image), a single-image model could depict an image representation of the first image, an imaging source representation for the first image, and graphical representations of the frame (e.g., the rings and struts). Additionally, if the user subsequently performs deformity planning and indicates the locations of the anatomical structure segments on the first image, then the single-image model could be updated to show planning elements (e.g., anatomical structure reference points, anatomical structure center lines) on the first image representation and to include beams that connect the imaging source representation for the first image to corresponding points on the first image representation, in this way showing the user where they pass the fixator frame. Moreover, for scenarios in which identification of fixator element (e.g., strut) locations has been performed for both images but deformity planning (e.g., indication of anatomical structure locations) has not yet been performed, a model may be generated that shows image representations of both images, imaging source representations for both images, and graphical representations of the frame (e.g., the rings and struts), thereby allowing the user to review the relative orientation between the images (i.e., angulation and rotation). Any or all of the above described models may be rotated, zoomed, and panned by the user similar to the model 1900.
Changing of Distraction Rate
In some examples, during a course of a computer assisted ring fixator treatment, a surgeon may desire a distraction rate change for a patient. This could be due to premature consolidation (correction is too slow), poor regenerate formation (correction is too fast), a strut swap at the clinic needing to be rescheduled, too much pain, or any other reason. In some examples, in order to allow the distraction rate to be changed in an efficient and reliable manner, an option may be provided within the software for a surgeon to select a "change distraction rate" button or other control that will allow the distraction rate to be changed to a new distraction rate starting at a given day of the treatment plan and using the new distraction rate for the remainder of the treatment plan. In some examples, this may allow the distraction rate to be changed during the course of treatment (e.g., during an intermediate day within the treatment plan) using only one screen, one field, and one click to produce a clinically relevant change to the patient treatment and potentially reduce the patient time in the frame. In some examples, when the change distraction rate control is selected, the plan may be reopened by the software in a planning state, such as on a treatment plan tab. In some examples, all other tabs (other than the treatment plan tab) may be inactive. The user may then be able to edit either a "Number of Days" field (or other field that represents the treatment plan duration) or a "Distraction at Reference Point" field. The user may then select an "Update Adjustment Plan" control, and the software may generate a new treatment plan starting with the parameters from the day at which the change distraction rate control was selected. The user may then deliver the new plan to the patient.
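(As a purely illustrative sketch of this kind of recalculation, and not the actual behavior or API of the software described above: given the distraction already achieved at the reference point, the total planned distraction, and a new rate, the remainder of the plan can be regenerated day by day from the change day onward. All names and the daily-increment scheme are assumptions.)

    import math

    def update_adjustment_plan(total_distraction_mm, achieved_distraction_mm,
                               change_day, new_rate_mm_per_day):
        # Regenerate the remaining schedule, starting from the parameters in
        # effect on the day the distraction-rate change was requested.
        remaining = total_distraction_mm - achieved_distraction_mm
        if remaining <= 0 or new_rate_mm_per_day <= 0:
            return []
        remaining_days = math.ceil(remaining / new_rate_mm_per_day)
        plan, achieved = [], achieved_distraction_mm
        for i in range(1, remaining_days + 1):
            achieved = min(achieved + new_rate_mm_per_day, total_distraction_mm)
            plan.append((change_day + i, round(achieved, 2)))
        return plan

    # Example: 30 mm total, 12 mm done by day 12, slow the rate to 0.5 mm/day.
    # The remaining 18 mm now takes 36 more days instead of 18.
    print(update_adjustment_plan(30.0, 12.0, 12, 0.5))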
Example Computing Device
Referring to FIG. 26, a suitable computing device such as example computing device 78 can be configured to perform any or all of the techniques set forth above. It will be understood that the computing device 78 can include any appropriate device, examples of which include a desktop computing device, a server computing device, or a portable computing device, such as a laptop, tablet, or smart phone.
In an example configuration, the computing device 78 includes a processing portion 80, a memory portion 82, an input/output portion 84, and a user interface (UI) portion 86. It is emphasized that the block diagram depiction of the computing device 78 is exemplary and not intended to imply a specific implementation and/or configuration. The processing portion 80, memory portion 82, input/output portion 84, and user interface portion 86 can be coupled together to allow communications therebetween. As should be appreciated, any of the above components may be distributed across one or more separate devices and/or locations.
In various embodiments, the input/output portion 84 includes a receiver of the computing device 78, a transmitter of the computing device 78, or a combination thereof. The input/output portion 84 is capable of receiving and/or providing information pertaining to communication with a network such as, for example, the Internet. As should be appreciated, transmit and receive functionality may also be provided by one or more devices external to the computing device 78.
The processing portion 80 may include one or more processors. Depending upon the exact configuration and type of processor, the memory portion 82 can be volatile (such as some types of RAM), non-volatile (such as ROM, flash memory, etc.), or a combination thereof. The computing device 78 can include additional storage (e.g., removable storage and/or non-removable storage) including, but not limited to, tape, flash memory, smart cards, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, universal serial bus (USB) compatible memory, or any other medium which can be used to store information and which can be accessed by the computing device 78.
The computing device 78 also can contain the user interface portion 86 allowing a user to communicate with the computing device 78. The user interface 86 can include inputs that provide the ability to control the computing device 78, via, for example, buttons, soft keys, a mouse, voice actuated controls, a touch screen, movement of the computing device 78, visual cues (e.g., moving a hand in front of a camera on the computing device 78), or the like. The user interface portion 86 can provide outputs, including visual information (e.g., via a display), audio information (e.g., via speaker), mechanically (e.g., via a vibrating mechanism), or a combination thereof. In various configurations, the user interface portion 86 can include a display, one or more graphical user interfaces, a touch screen, a keyboard, a mouse, an accelerometer, a motion detector, a speaker, a microphone, a camera, a tilt sensor, or any combination thereof. Thus, a computing system including, for example, one or more computing devices 78 can include a processor, a display coupled to the processor, and a memory in communication with the processor, one or more graphical user interfaces, and various other components. The memory can have stored therein instructions that, upon execution by the processor, cause the computer system to perform operations, such as the operations described above. As used herein, the term computing system can refer to a system that includes one or more computing devices 78. For instance, the computing system can include one or more server computing devices that communicate with one or more client computing devices.
While example embodiments of devices for executing the disclosed techniques are described herein, the underlying concepts can be applied to any computing device, processor, or system capable of communicating and presenting information as described herein. The various techniques described herein can be implemented in connection with hardware or software or, where appropriate, with a combination of both. Thus, the methods and apparatuses described herein, or certain aspects or portions thereof, can take the form of program code (i.e., instructions) embodied in tangible non-transitory storage media, such as floppy diskettes, CD-ROMs, hard drives, or any other machine-readable storage medium (computer-readable storage medium), wherein, when the program code is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for performing the techniques described herein. In the case of program code execution on programmable computers, the computing device will generally include a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device, for instance a display. The display can be configured to display visual information. The program(s) can be implemented in assembly or machine language, if desired. The language can be a compiled or interpreted language, and combined with hardware implementations.
It should be appreciated that the orthopedic fixation with imagery analysis techniques described herein provide not only for the use of non-orthogonal images, but also allow the use of overlapping images, images captured using different imaging techniques, images captured in different settings, and the like, thereby presenting a surgeon with greater flexibility when compared with existing fixation and imagery techniques.
The techniques described herein also can be practiced via communications embodied in the form of program code that is transmitted over some transmission medium, such as over electrical wiring or cabling, through fiber optics, or via any other form of transmission. When implemented on a general-purpose processor, the program code combines with the processor to provide a unique apparatus that operates to invoke the functionality described herein. Additionally, any storage techniques used in connection with the techniques described herein can invariably be a combination of hardware and software.
While the techniques described herein can be implemented and have been described in connection with the various embodiments of the various figures, it is to be understood that other similar embodiments can be used or modifications and additions can be made to the described embodiments without deviating therefrom. For example, it should be appreciated that the steps disclosed above can be performed in the order set forth above, or in any other order as desired. Further, one skilled in the art will recognize that the techniques described in the present application may apply to any environment, whether wired or wireless, and may be applied to any number of such devices connected via a communications network and interacting across the network. Therefore, the techniques described herein should not be limited to any single embodiment, but rather should be construed in breadth and scope in accordance with the appended claims.

Claims (17)

What is claimed:
1. A computer-implemented method for providing a graphical projection of a fixator including rings and struts to correct a deformity of first and second anatomical structure segments comprising:
displaying, using one or more graphical user interfaces of a computing system, first and second images of the first and the second anatomical structure segments and the fixator attached thereto, the first and the second images having image planes at an angle with respect to one another;
receiving, using the one or more graphical user interfaces, first indications of first locations, within the first image, of a plurality of elements of the fixator;
overlaying, using the one or more graphical user interfaces, the graphical projection of the fixator on the second image, wherein the graphical projection of the fixator is rotated relative to the first locations based at least in part on the angle; and
receiving, using the one or more graphical user interfaces, second indications of second locations, within the second image, of the plurality of elements of the fixator, wherein the graphical projection of the fixator is employable to assist in providing of the second indications.
2. The computer-implemented method of claim 1, wherein the graphical projection of the fixator is movable, resizable and rotatable by a user.
3. The computer-implemented method of claim 1, wherein the deformity is corrected using a treatment plan in which the distraction rate is changed during an intermediate day within the treatment plan.
4. The computer-implemented method of claim 1, wherein the plurality of elements of the fixator include the struts.
5. The computer-implemented method of claim 1, wherein the graphical projection of the fixator includes graphical representations of the struts.
6. The computer-implemented method of claim 1, wherein the graphical projection of the fixator includes graphical representations of the rings.
7. A computer-implemented method for generating a three-dimensional overview of an imaging scene of a fixator including rings and struts to correct a deformity of first and second anatomical structure segments comprising:
receiving, by a computing system, first and second images of the fixator and the first and the second anatomical structure segments attached thereto, the first image captured from a first imaging source and the second image captured from a second imaging source, the first and the second images having respective first and second image planes at an angle with respect to one another;
displaying, using one or more graphical user interfaces of the computing system, a three-dimensional graphical model that comprises a first image representation that represents the first image and a second image representation that represents the second image, wherein the one or more graphical user interfaces of the computing system display a first virtual line that connects a first virtual location corresponding to the first imaging source to a reference location in the first image representation, and wherein the one or more graphical user interfaces of the computing system display a second virtual line that connects a second virtual location corresponding to the second imaging source to the reference location in the second image representation; and
displaying, in the three-dimensional graphical model, a first graphical representation associated with a shortest distance between the first virtual line and the second virtual line, wherein the first graphical representation represents a physical location of the reference location in three-dimensional space.
8. The computer-implemented method of claim 7, wherein the shortest distance between the first virtual line and the second virtual line is an intersection between the first virtual line and the second virtual line, and wherein the first graphical representation is displayed at the intersection.
9. The computer-implemented method of claim 7, wherein the first virtual line and the second virtual line do not intersect, and wherein the first graphical representation is displayed at a point that bisects the shortest distance.
10. The computer-implemented method of claim 7, wherein the reference location is a reference point of the first or the second anatomical structure segment.
11. The computer-implemented method of claim 7, further comprising displaying, in the three-dimensional graphical model, graphical representations of the rings and the struts of the fixator at virtual locations that represent physical locations of the rings and the struts of the fixator in the three-dimensional space.
12. The computer-implemented method of claim 7, further comprising displaying, in the three-dimensional graphical model, graphical representations of the first and the second anatomical structure segments at virtual locations that represent physical locations of the first and the second anatomical structure segments in the three-dimensional space.
13. The computer-implemented method of claim 7, wherein the three-dimensional graphical model displays the first image representation and the second image representation at the angle with respect to one another.
14. The computer-implemented method of claim 7, further comprising, when the first and the second image planes are non-orthogonal, displaying a modified second image representation that is orthogonal to the first image representation.
15. The computer-implemented method of claim 7, wherein the three-dimensional graphical model is zoomable and rotatable by a user.
16. The computer-implemented method of claim 7, further comprising displaying, in the three-dimensional graphical model, graphical representations of the first virtual location corresponding to the first imaging source and the second virtual location corresponding to the second imaging source.
17. One or more non-transitory computer-readable storage media having stored thereon instructions that, upon execution by one or more computing devices, cause the one or more computing devices to perform operations for generating a three-dimensional overview of an imaging scene of a fixator including rings and struts to correct a deformity of first and second anatomical structure segments comprising:
receiving, by a computing system, first and second images of the fixator and the first and the second anatomical structure segments attached thereto, the first image captured from a first imaging source and the second image captured from a second imaging source, the first and the second images having respective first and second image planes at an angle with respect to one another;
displaying, using one or more graphical user interfaces of the computing system, a three-dimensional graphical model that comprises a first image representation that represents the first image and a second image representation that represents the second image, wherein the one or more graphical user interfaces of the computing system display a first virtual line that connects a first virtual location corresponding to the first imaging source to a reference location in the first image representation, and wherein the one or more graphical user interfaces of the computing system display a second virtual line that connects a second virtual location corresponding to the second imaging source to the reference location in the second image representation; and
displaying, in the three-dimensional graphical model, a first graphical representation associated with a shortest distance between the first virtual line and the second virtual line, wherein the first graphical representation represents a physical location of the reference location in three-dimensional space.

Priority Applications (8)

Application Number | Priority Date | Filing Date | Title
US16/367,526 (US11304757B2) | 2019-03-28 | 2019-03-28 | Orthopedic fixation control and visualization
EP20715011.1A (EP3948801A1) | 2019-03-28 | 2020-03-25 | Orthopedic fixation control and visualization
PCT/EP2020/058356 (WO2020193629A1) | 2019-03-28 | 2020-03-25 | Orthopedic fixation control and visualization
CA3135073A (CA3135073A1) | 2019-03-28 | 2020-03-25 | Orthopedic fixation control and visualization
BR112021019062A (BR112021019062A2) | 2019-03-28 | 2020-03-25 | Orthopedic fixation control and visualization
AU2020245028A (AU2020245028B2) | 2019-03-28 | 2020-03-25 | Orthopedic fixation control and visualization
JP2021557502A (JP7500601B2) | 2019-03-28 | 2020-03-25 | Orthopedic Fixation Control and Visualization
CN202080039246.3A (CN113841183B) | 2019-03-28 | 2020-03-25 | Orthopedic Fixation Control and Visualization

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
US16/367,526 (US11304757B2) | 2019-03-28 | 2019-03-28 | Orthopedic fixation control and visualization

Publications (2)

Publication Number | Publication Date
US20200305977A1 (en) | 2020-10-01
US11304757B2 (en) | 2022-04-19

Family

ID=70050099

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
US16/367,526 (US11304757B2, Active) | Orthopedic fixation control and visualization | 2019-03-28 | 2019-03-28

Country Status (8)

Country | Link
US (1) | US11304757B2 (en)
EP (1) | EP3948801A1 (en)
JP (1) | JP7500601B2 (en)
CN (1) | CN113841183B (en)
AU (1) | AU2020245028B2 (en)
BR (1) | BR112021019062A2 (en)
CA (1) | CA3135073A1 (en)
WO (1) | WO2020193629A1 (en)

US20090275944A1 (en)2008-05-022009-11-05Quantum Medical Concepts LlcExternal Fixation and foot-supporting Device.
US20090326560A1 (en)2008-06-272009-12-31Lampropoulos Fred PCatheter with radiopaque marker
US20090326532A1 (en)2008-06-302009-12-31Depuy Products, Inc.External fixator
US7645279B1 (en)2003-07-252010-01-12Haupt Bruce FBone fixation method
US7657079B2 (en)2002-06-282010-02-02Intel CorporationSingle constraint at a time (SCAAT) tracking of a virtual reality (VR) display
US20100030219A1 (en)2007-07-012010-02-04L.R.S. Ortho Ltd.Orthopedic navigation system and method
US20100039421A1 (en)2007-04-042010-02-18Sony CorporationDriving method for organic electroluminescence light emitting section
US7677078B2 (en)2006-02-022010-03-16Siemens Medical Solutions Usa, Inc.Line-based calibration of ultrasound transducer integrated with a pose sensor
US20100087819A1 (en)2008-10-072010-04-08Extraortho, Inc.Forward Kinematic Solution for a Hexapod Manipulator and Method of Use
US20100104150A1 (en)2008-10-242010-04-29Biospace MedMeasurement of geometric quantities intrinsic to an anatomical system
US20100172567A1 (en)2007-04-172010-07-08Prokoski Francine JSystem and method for using three dimensional infrared imaging to provide detailed anatomical structure maps
US20100179548A1 (en)2009-01-132010-07-15Marin Luis EExternal fixator assembly
US7758582B2 (en)2002-06-142010-07-20Smith & Nephew, Inc.Device and methods for placing external fixation elements
US20100191239A1 (en)2007-06-012010-07-29Umc Utrecht Holding B.V.System for Correcting Bones
US20100191500A1 (en)2007-04-042010-07-29Andrew Joseph Lawrence HarrisonAnalysis of parallel manipulators
WO2010104567A1 (en)2009-03-102010-09-16Stryker Trauma SaExternal fixation system
US20100280516A1 (en)2009-04-302010-11-04Jeffrey TaylorAccessory Device for an Orthopedic Fixator
US7828801B2 (en)2004-09-032010-11-09A.M. Surgical, Inc.External fixation device for fractures
US20100305568A1 (en)2008-02-052010-12-02Texas Scottish Rite Hospital For ChildrenExternal fixator ring
US20100312243A1 (en)2008-02-082010-12-09Texas Scottish Rite Hospital For ChildrenExternal fixator ring
US20110004199A1 (en)2008-02-182011-01-06Texas Scottish Rite Hospital For ChildrenTool and method for external fixation strut adjustment
US20110029093A1 (en)2001-05-252011-02-03Ray BojarskiPatient-adapted and improved articular implants, designs and related guide tools
US7887537B2 (en)2002-02-042011-02-15Smith & Nephew, Inc.External fixation system
WO2011026475A1 (en)2009-09-052011-03-10Surgitaix AgDevice for fixating bone segments
WO2011060266A1 (en)2009-11-132011-05-19Amei Technologies, Inc.Fixation device and multiple-axis joint for a fixation device
US20110118738A1 (en)2009-11-132011-05-19Amei Technologies, Inc.Adjustable orthopedic fixation system
US20110131418A1 (en)2009-12-022011-06-02Giga-Byte Technology Co.,Ltd.Method of password management and authentication suitable for trusted platform module
US7955334B2 (en)2008-04-182011-06-07Stryker Trauma SaExternal fixation system
US20110208187A1 (en)2010-02-242011-08-25Wright Medical Technology, Inc.Orthopedic external fixation device
US8062293B2 (en)2008-02-012011-11-22Stryker Trauma SaStrut joint for an external fixator
WO2011146703A1 (en)2010-05-192011-11-24Synthes Usa, LlcOrthopedic fixation with imagery analysis
US20110313419A1 (en)2010-06-222011-12-22Extraortho, Inc.Hexapod External Fixation System with Collapsing Connectors
WO2012021307A2 (en)2010-08-122012-02-16Heartflow, Inc.Method and system for patient-specific modeling of blood flow
US20120041439A1 (en)2010-08-112012-02-16Stryker Trauma SaExternal fixator system
US20120078251A1 (en)2010-09-232012-03-29Mgv Enterprises, Inc.External Fixator Linkage
US8147491B2 (en)2007-06-272012-04-03Vilex In Tennessee, Inc.Multi-angle clamp
US20120232554A1 (en)2011-03-092012-09-13Quantum Medical Concepts LlcAlignment Plate for Lower-extremity Ring Fixation, Method of Use, and System
US20120259343A1 (en)2011-04-082012-10-11Allen Medical Systems, Inc.Low profile distractor apparatuses
US20120330312A1 (en)2011-06-232012-12-27Stryker Trauma GmbhMethods and systems for adjusting an external fixation frame
US20130041288A1 (en)2011-08-082013-02-14John Charles TaylorApparatus and Method of Monitoring Healing and/or Assessing Mechanical Stiffness of a Bone Fracture Site or the Like
US20130060146A1 (en)2010-04-282013-03-07Ryerson UniversitySystem and methods for intraoperative guidance feedback
US20130138017A1 (en)2010-03-242013-05-30Jonathon JundtUltrasound guided automated wireless distraction osteogenesis
US8469958B2 (en)2005-02-152013-06-25Morphographics, LcFixing block and method for stabilizing bone
US20130201212A1 (en)2012-02-032013-08-08Orthohub, Inc.External Fixator Deformity Correction Systems and Methods
US20130211521A1 (en)2009-08-272013-08-15Cotera, Inc.Method and Apparatus for Altering Biomechanics of the Articular Joints
US20130289575A1 (en)2012-04-262013-10-31Stryker Trauma GmbhMeasurement device for external fixation frame
US8574232B1 (en)2012-11-132013-11-05Texas Scottish Hospital for ChildrenExternal fixation connection rod for rapid and gradual adjustment
US20130296857A1 (en)2012-05-042013-11-07Gary D. BarnettRatcheting strut
US8777946B2 (en)2009-10-052014-07-15Aalto University FoundationAnatomically customized and mobilizing external support, method for manufacture
EP2767252A1 (en)2013-02-192014-08-20Stryker Trauma GmbHSoftware for planning deformity correction
US20140236152A1 (en)2011-08-232014-08-21Aesculap AgElectrosurgical device and methods of manufacture and use
US20140276821A1 (en)2013-03-132014-09-18Nicole MurrayExternal Bone Fixation Device
US20140276817A1 (en)2013-03-132014-09-18Nicole MurrayExternal Bone Fixation Device
US20140303670A1 (en)2011-11-162014-10-09Neuromechanical Innovations, LlcMethod and Device for Spinal Analysis
US8858555B2 (en)2009-10-052014-10-14Stryker Trauma SaDynamic external fixator and methods for use
WO2014186453A2 (en)2013-05-142014-11-20Smith & Nephew, Inc.Apparatus and method for administering a medical device prescription
US8906021B1 (en)2012-08-202014-12-09Stryker Trauma SaTelescopic strut for an external fixator
US20140379038A1 (en)2011-09-092014-12-25University Of The West Of England, BristolSystem for anatomical reduction of bone fractures
US8945128B2 (en)2010-08-112015-02-03Stryker Trauma SaExternal fixator system
US20150088135A1 (en)2013-09-262015-03-26Stryker Trauma GmbhBone position tracking system
US9011438B2 (en)2008-04-182015-04-21Stryker Trauma SaRadiolucent orthopedic fixation plate
US20150112339A1 (en)2009-10-052015-04-23Aalto University FoundationAnatomically personalized and mobilizing external support and method for controlling a path of an external auxiliary frame
US9101398B2 (en)2012-08-232015-08-11Stryker Trauma SaBone transport external fixation frame
US20150257788A1 (en)2012-09-062015-09-17Solana Surgical LLCExternal fixator
US20150272624A1 (en)2014-04-012015-10-01Stryker European Holdings I, LlcExternal fixator with y strut
US20150305776A1 (en)2014-04-232015-10-29Texas Scottish Rite Hospital For ChildrenDynamization module for external fixation strut
US20160022314A1 (en)2013-03-132016-01-28DePuy Synthes Products, Inc.External bone fixation device
US20160113681A1 (en)2014-10-242016-04-28Stryker European Holdings I, LlcMethods and systems for adjusting an external fixation frame
US20160125603A1 (en)2013-06-112016-05-05Atsushi TanjiBone cutting support system, information processing apparatus, image processing method, and image processing program
US20160183979A1 (en)2014-08-272016-06-30Vito Del DeoMethod and device for positioning and stabilization of bony structures during maxillofacial surgery
CN105852985A (en)2016-04-182016-08-17上海昕健医疗技术有限公司Method for manufacturing personalized orthopedic positioning guide plates
US20170224520A1 (en)2014-11-042017-08-10Osteoid Saglik Teknolojileri A.S.Methods for integrating sensors and effectors in custom three-dimensional orthosis
US20170303966A1 (en)2016-04-202017-10-26Stryker European Holdings I, LlcRing hole planning for external fixation frames
US20170348054A1 (en)2016-06-022017-12-07Stryker European Holdings I, LlcSoftware for use with deformity correction
US20170354439A1 (en)2016-06-142017-12-14Stryker European Holdings I, LlcGear mechanisms for fixation frame struts
US20180055569A1 (en)*2016-08-252018-03-01DePuy Synthes Products, Inc.Orthopedic fixation control and manipulation
WO2019040829A1 (en)2017-08-242019-02-28Amdt Holdings, Inc.Methods and systems for determining adjustment prescriptions of external fixation devices
WO2020023686A1 (en)2018-07-242020-01-30Amdt Holdings, Inc.Methods and systems of registering a radiographic image and a 3d model of an external fixation device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
ATE548987T1 (en)*2006-11-282012-03-15Koninkl Philips Electronics Nv DEVICE FOR DETERMINING A POSITION OF A FIRST OBJECT WITHIN A SECOND OBJECT
JP5269500B2 (en)*2008-07-042013-08-21株式会社東芝 Image processing device
KR20100024457A (en)*2010-01-222010-03-05한국과학기술원Surgery planning simulation for closed reduction and internal fixation
EP3224804B1 (en)*2014-11-272021-02-24Koninklijke Philips N.V.Apparatus for determining positions of an interventional instrument in a projection image

Patent Citations (272)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US2055024A (en)1934-08-071936-09-22Jr Joseph E BittnerFracture reducing splint
US2391537A (en)1943-09-271945-12-25Anderson RogerAmbulatory rotating reduction and fixation splint
US3977397A (en)1974-11-271976-08-31Kalnberz Viktor KonstantinovicSurgical compression-distraction instrument
US4081686A (en)1977-02-151978-03-28E. I. Du Pont De Nemours And CompanyX-ray film cassette and method of making same
US4450834A (en)1979-10-181984-05-29Ace Orthopedic Manufacturing, Inc.External fixation device
US4630203A (en)1983-12-271986-12-16Thomas SzirtesContour radiography: a system for determining 3-dimensional contours of an object from its 2-dimensional images
US4489111A (en)1984-01-241984-12-18Woodrum Dorothy BBeaded trimmed satin christmas ornament
US4889111A (en)1984-02-081989-12-26Ben Dov MeirBone growth stimulator
US4784125A (en)1985-01-241988-11-15Jaquet Orthopedie, S. A.Arcuate element and external fixation device containing same for osteosynthesis and osteoplasty
US5095919A (en)1985-01-241992-03-17Jaquet Orthopedie S.A.Arcuate element and external fixation device
FR2576774A1 (en)1985-02-071986-08-08Issoire Aviat SaDevice for three-dimensional positioning of any two pieces, in particular two pieces of bone, and allowing modification of the said positioning
US4890631A (en)1985-02-221990-01-02Societe De Realisations Electro-Mecaniques SoremExternal fixation device intended for orthopedic use
US4620533A (en)1985-09-161986-11-04Pfizer Hospital Products Group Inc.External bone fixation apparatus
US4615338A (en)1985-09-181986-10-07Kurgansky Nauchno-Issledovatelsky Institut Experimentalnoi I Klinicheskoi Ortopedii I TravmatologiiAutomatic compression-distraction apparatus
US4768524A (en)1986-02-281988-09-06Hardy Jean MarieDevice for immobilizing a bone structure, especially intended for orthopedic use
US5087258A (en)1987-06-191992-02-11Thomas SchewiorRing splint to set, affix and regulate the tension position of bone segments
US4875165A (en)1987-11-271989-10-17University Of ChicagoMethod for determination of 3-D structure in biplane angiography
US4964320A (en)1988-12-221990-10-23Engineering & Precision Machining, Inc.Method of forming a beaded transfixion wire
US4930961A (en)1988-12-231990-06-05Weis Charles WQuick lock and release fastener
US5180380A (en)1989-03-081993-01-19Autogenesis CorporationAutomatic compression-distraction-torsion method and apparatus
US4973331A (en)1989-03-081990-11-27Autogenesis CorporationAutomatic compression-distraction-torsion method and apparatus
US5179525A (en)1990-05-011993-01-12University Of FloridaMethod and apparatus for controlling geometrically simple parallel mechanisms with distinctive connections
US5156605A (en)1990-07-061992-10-20Autogenesis CorporationAutomatic internal compression-distraction-method and apparatus
US5062844A (en)1990-09-071991-11-05Smith & Nephew Richards Inc.Method and apparatus for the fixation of bone fractures, limb lengthening and the correction of deformities
US5209750A (en)1990-10-121993-05-11Compagnie General De Materiel OrthopediqueExternal holding and reducing brace for bone fractures
US5074866A (en)1990-10-161991-12-24Smith & Nephew Richards Inc.Translation/rotation device for external bone fixation system
US5108393A (en)1991-04-081992-04-28The United States Of America As Represented By The Secretary Of The NavyNon-invasive body-clamp
US5275598A (en)1991-10-091994-01-04Cook Richard LQuasi-isotropic apparatus and method of fabricating the apparatus
US5443464A (en)1993-02-161995-08-22Memphis Orthopaedic Design, Inc.External fixator apparatus
US5540686A (en)1993-02-181996-07-30Endocare AgApparatus for lengthening bones
US5358504A (en)1993-05-051994-10-25Smith & Nephew Richards, Inc.Fixation brace with focal hinge
US5451225A (en)1993-06-101995-09-19Texas Scottish Rite Hospital For Crippled ChildrenFastener for external fixation device wires and pins
US5766173A (en)1993-06-101998-06-16Texas Scottish Rite Hospital For ChildrenDistractor mechanism for external fixation device
US5968043A (en)1993-06-101999-10-19Texas Scottish Rite Hospital For ChildrenPlastic double nut mechanism enabling rigid orthopedic distraction
US5630814A (en)1993-06-101997-05-20Texas Scottish Rite Hospital For Crippled ChildrenFastener for external fixation device wires and pins
US5681309A (en)1993-06-101997-10-28Texas Scottish Rite Hospital For Crippled ChildrenDistractor mechanism for external fixation device
US5437668A (en)1994-02-181995-08-01Board Of Trustees Of The University Of Ark.Apparatus and method for clinical use of load measurement in distraction osteogenesis
US5458599A (en)1994-04-211995-10-17Adobbati; Ricardo N.System for the use in the fixation of a fractured bone
US5653707A (en)1994-11-011997-08-05Smith & Nephew Richards, Inc.External skeletal fixation system with improved bar-to-bar connector
US5961515A (en)1995-03-011999-10-05Smith & Nephew, Inc.External skeletal fixation system
US5702389A (en)1995-03-011997-12-30Smith & Nephew Richards, Inc.Orthopaedic fixation device
US5971984A (en)1995-03-011999-10-26Smith & Nephew, Inc.Method of using an orthopaedic fixation device
US5728095A (en)1995-03-011998-03-17Smith & Nephew, Inc.Method of using an orthopaedic fixation device
US5601551A (en)1995-03-011997-02-11Smith & Nephew Richards, Inc.Geared external fixator
US5871018A (en)1995-12-261999-02-16Delp; Scott L.Computer-assisted surgical method
US20010018617A1 (en)1996-04-022001-08-30Franz CopfProsthetic part
US5746741A (en)1996-05-061998-05-05Tufts UniversityExternal fixator system
US5951556A (en)1996-05-151999-09-14Orthofix S.R.L.Compact external fixation device
US6047080A (en)1996-06-192000-04-04Arch Development CorporationMethod and apparatus for three-dimensional reconstruction of coronary vessels from angiographic images
US6501848B1 (en)1996-06-192002-12-31University Technology CorporationMethod and apparatus for three-dimensional reconstruction of coronary vessels from angiographic images and analytical techniques applied thereto
US6320928B1 (en)1996-09-042001-11-20Ge Medical Systems, S.A.Method of reconstruction of a three-dimensional image of an object
WO1998012975A2 (en)1996-09-261998-04-02Aleksandar TosicArticulated external orthopedic fixation system and method of use
JP2001523985A (en)1996-10-072001-11-27スミス アンド ネフュー インコーポレーテッド How to adjust orthopedic or other fixators
US5976142A (en)1996-10-161999-11-02Chin; MartinApparatus and method for distraction osteogenesis of small alveolar bone
FR2756025A1 (en)1996-11-151998-05-22Const Mecaniques Des Vosges CARDAN STRUCTURE FOR THE ARTICULATED CONNECTION OF A HEXAPODE
US5776132A (en)1996-12-261998-07-07Blyakher; ArkadyExternal fixation assembly
US5885282A (en)1997-05-091999-03-23The Regents Of The University Of CaliforniaApparatus for treatment of fracture and malunion of the distal radius
US5919192A (en)1997-06-101999-07-06Cottec Orthopaedic Technologies Development Ltd.Compression-distraction apparatus for treatment of a bone fracture
US6017341A (en)1997-06-202000-01-25Novo Nordisk A/SApparatus for fixation of the bones in a healing bone fracture
US6363169B1 (en)1997-07-232002-03-26Sanyo Electric Co., Ltd.Apparatus and method of three-dimensional modeling
US6434278B1 (en)1997-09-232002-08-13Enroute, Inc.Generating three-dimensional models of objects defined by two-dimensional image data
USRE40914E1 (en)1997-10-202009-09-08Smith & Nephew, Inc.Orthopaedic fixation plate
US5891143A (en)1997-10-201999-04-06Smith & Nephew, Inc.Orthopaedic fixation plate
US5967777A (en)1997-11-241999-10-19Klein; MichaelSurgical template assembly and method for drilling and installing dental implants
US5963612A (en)1997-12-311999-10-05Siemens Corporation Research, Inc.Apparatus for C-arm calibration for 3D reconstruction in an imaging system utilizing planar transformation
US6021579A (en)1998-04-012000-02-08Joseph M. SchimmelsSpatial parallel compliant mechanism
WO1999059100A1 (en)1998-05-141999-11-18Cognitens, Ltd.Euclidean reconstruction of 3d scene from 2d images following a non-rigid transformation
US6510241B1 (en)1998-06-112003-01-21Ge Medical Systems SaProcess for reconstructing a three-dimensional image of an object
US6912293B1 (en)1998-06-262005-06-28Carl P. KorobkinPhotogrammetry engine for model construction
US6030386A (en)1998-08-102000-02-29Smith & Nephew, Inc.Six axis external fixator strut
US6206566B1 (en)1998-11-022001-03-27Siemens AktiengesellschaftX-ray apparatus for producing a 3D image from a set of 2D projections
RU2159091C2 (en)1999-01-102000-11-20Борозда Иван ВикторовичDevice for reposition and fixation of fractures of hipbones
US6129727A (en)1999-03-022000-10-10Smith & NephewOrthopaedic spatial frame apparatus
WO2001015611A1 (en)1999-08-302001-03-08Smith & Nephew, Inc.Six axis external fixator strut
EP1100048A1 (en)1999-11-122001-05-16Société N3DI S.A.R.L.Automatic building process of a digital model using stereoscopic image couples
US6293947B1 (en)2000-01-282001-09-25Daniel BuchbinderDistraction osteogenesis device and method
US20020010465A1 (en)2000-01-312002-01-24Ja Kyo KooFrame fixator and operation system thereof
US6701174B1 (en)2000-04-072004-03-02Carnegie Mellon UniversityComputer-aided bone distraction
US20040068187A1 (en)2000-04-072004-04-08Krause Norman M.Computer-aided orthopedic surgery
WO2001078015A2 (en)2000-04-072001-10-18Carnegie Mellon UniversityComputer-aided bone distraction
US7837621B2 (en)2000-04-072010-11-23Carnegie Mellon UniversityComputer-aided bone distraction
JP2003530177A (en)2000-04-072003-10-14カーネギー メロン ユニヴァーシティ Computer aided bone lengthening
US20040039259A1 (en)2000-04-072004-02-26Norman KrauseComputer-aided bone distraction
US7226449B2 (en)2000-05-092007-06-05Orthofix S.R.L.Ring fixator
US6537275B2 (en)2000-05-092003-03-25Orthodix S.R.L.Securing component for a ring fixator used in orthopaedic surgery
US20040082849A1 (en)2000-08-012004-04-29Achim SchweikardMethod for navigating in the interior of the body using three-dimensionally visualised structures
US20040111024A1 (en)2001-02-072004-06-10Guoyan ZhengMethod for establishing a three-dimensional representation of a bone from image data
US20040133199A1 (en)2001-03-052004-07-08Michele CoatiExternal fixation device for reducing bone fractures
CN1494397A (en)2001-03-052004-05-05���ŷƿ�˹���ʹ�˾ External fixation devices for fracture recovery
US20110029093A1 (en)2001-05-252011-02-03Ray BojarskiPatient-adapted and improved articular implants, designs and related guide tools
WO2003030759A2 (en)2001-10-092003-04-17Synthes (U.S.A.)Adjustable fixator
JP2003144454A (en)2001-11-162003-05-20Yoshio KogaJoint operation support information computing method, joint operation support information computing program, and joint operation support information computing system
US20050256389A1 (en)2001-11-162005-11-17Yoshio KogaCalculation method, calculation program and calculation system for information supporting arthroplasty
US20030106230A1 (en)2001-12-102003-06-12Hennessey C. WilliamParallel kinematic micromanipulator
US7887537B2 (en)2002-02-042011-02-15Smith & Nephew, Inc.External fixation system
US20040073211A1 (en)2002-04-052004-04-15Ed AustinOrthopaedic fixation method and device with delivery and presentation features
US20050215997A1 (en)2002-04-052005-09-29Ed AustinOrthopaedic fixation method and device with delivery and presentation features
US20030191466A1 (en)2002-04-052003-10-09Ed AustinOrthopaedic fixation method and device
US7758582B2 (en)2002-06-142010-07-20Smith & Nephew, Inc.Device and methods for placing external fixation elements
US7657079B2 (en)2002-06-282010-02-02Intel CorporationSingle constraint at a time (SCAAT) tracking of a virtual reality (VR) display
US20040167518A1 (en)2002-07-122004-08-26Estrada Hector MarkRadiolucent frame element for external bone fixators
US7280687B2 (en)2002-09-022007-10-09Fanuc LtdDevice for detecting position/orientation of object
US7388972B2 (en)2002-09-262008-06-17Meridian Technique LimitedOrthopaedic surgery planning
US7113623B2 (en)2002-10-082006-09-26The Regents Of The University Of ColoradoMethods and systems for display and analysis of moving arterial tree structures
US20040073212A1 (en)2002-10-152004-04-15Kim Jung JaeExtracorporeal fixing device for a bone fracture
US8419732B2 (en)2002-11-142013-04-16Sixfix, Inc.Method for using a fixator device
US20110103676A1 (en)2002-11-142011-05-05Extraortho, Inc.Method for using a fixator device
JP2006507056A (en)2002-11-142006-03-02ビジョンメッド エルエルシー How to use a fixator device
US20040097922A1 (en)2002-11-142004-05-20Visionmed, L.L.C.Method for a using fixator device
US7490085B2 (en)2002-12-182009-02-10Ge Medical Systems Global Technology Company, LlcComputer-assisted data processing system and method incorporating automated learning
US20040208279A1 (en)2002-12-312004-10-21Yongshun XiaoApparatus and methods for multiple view angle stereoscopic radiography
JP2004254899A (en)2003-02-262004-09-16Hitachi Ltd Operation support system and operation support method
US7645279B1 (en)2003-07-252010-01-12Haupt Bruce FBone fixation method
US7187792B2 (en)2003-08-292007-03-06Accuray, Inc.Apparatus and method for determining measure of similarity between images
US20080012850A1 (en)2003-12-302008-01-17The Trustees Of The Stevens Institute Of TechnologyThree-Dimensional Imaging System Using Optical Pulses, Non-Linear Optical Mixers And Holographic Calibration
US20050149018A1 (en)2003-12-312005-07-07Paul CooperExternal bone/joint fixation device
US7828801B2 (en)2004-09-032010-11-09A.M. Surgical, Inc.External fixation device for fractures
US20090226055A1 (en)2004-12-102009-09-10Harry DankowiczSystems and methods for multi-dimensional characterization and classification of spinal shape
JP2006218298A (en)2005-02-092006-08-24Stryker Trauma Sa External fixing device that increases the distance between the clamping elements
EP1690506A1 (en)2005-02-092006-08-16Stryker Trauma SAExternal fixation device; in particular for increasing a distance between clamping elements
US8469958B2 (en)2005-02-152013-06-25Morphographics, LcFixing block and method for stabilizing bone
US20060276786A1 (en)2005-05-252006-12-07Brinker Mark RApparatus for accurately positioning fractured bone fragments toward facilitating use of an external ring fixator system
US20070055234A1 (en)2005-06-102007-03-08Mcgrath William MExternal fixation system with provisional brace
US7306601B2 (en)2005-06-102007-12-11Quantum Medical Concepts, Inc.External fixation system with provisional brace
US20070043354A1 (en)2005-08-032007-02-22Koo Terry KBone reposition device, method and system
US20070043429A1 (en)2005-08-182007-02-22Admedes Schuessler GmbhX-ray visibility and corrosion resistance of niti stents using markers made of sandwich material
WO2007024904A2 (en)2005-08-252007-03-01Syntes (U.S.A.)External fixation system and method of use
JP2009505736A (en)2005-08-252009-02-12ジンテス ゲゼルシャフト ミット ベシュレンクテル ハフツング External fixation system and method of use thereof
US20070049930A1 (en)2005-08-252007-03-01Jim HearnExternal fixation system and method of use
US8029505B2 (en)2005-08-252011-10-04Synthes Usa, LlcExternal fixation system and method of use
CN101296664A (en)2005-08-252008-10-29新特斯有限责任公司External fixation system and method of use
US20070161984A1 (en)2005-12-082007-07-12Ebi, L.P.Foot plate fixation
US20070161983A1 (en)2005-12-082007-07-12Ebi, L.P.External fixation system
US20090177198A1 (en)2005-12-292009-07-09Matsukidis TheodorosCompression-distraction device
KR200443058Y1 (en)2005-12-292009-01-09페드럴 스테이트 인스티튜션 (러시안 일리자로브 사이언티픽센터 (레스토러티브 트라우마톨로지 앤드 오르토패딕스) 오브 페드럴 에이젼시 온 하이 테크놀로지 메디컬 케어) Compression-stretching device
US7677078B2 (en)2006-02-022010-03-16Siemens Medical Solutions Usa, Inc.Line-based calibration of ultrasound transducer integrated with a pose sensor
US20070238069A1 (en)2006-04-102007-10-11Scott LovaldOsteosynthesis plate, method of customizing same, and method for installing same
US20080051779A1 (en)2006-08-022008-02-28The Nemours FoundationForce-controlled autodistraction
US8282652B2 (en)2006-08-022012-10-09The Nemours FoundationForce-controlled autodistraction
US20080114267A1 (en)2006-11-142008-05-15General Electric CompanySystems and methods for implant distance measurement
US8157800B2 (en)2007-03-212012-04-17Vvedensky Pyotr SComputer-aided system for limb lengthening
US20080234554A1 (en)2007-03-212008-09-25Vvedensky Pyotr SComputer-Aided System for Limb Lengthening
US20100039421A1 (en)2007-04-042010-02-18Sony CorporationDriving method for organic electroluminescence light emitting section
US8296094B2 (en)2007-04-042012-10-23Smith & Nephew, Inc.Analysis of parallel manipulators
US20100191500A1 (en)2007-04-042010-07-29Andrew Joseph Lawrence HarrisonAnalysis of parallel manipulators
US20100172567A1 (en)2007-04-172010-07-08Prokoski Francine JSystem and method for using three dimensional infrared imaging to provide detailed anatomical structure maps
US20080269741A1 (en)2007-04-282008-10-30John Peter KaridisOrthopedic fixation device with zero backlash and adjustable compliance, and process for adjusting same
US8202273B2 (en)2007-04-282012-06-19John Peter KaridisOrthopedic fixation device with zero backlash and adjustable compliance, and process for adjusting same
RU2352283C2 (en)2007-05-042009-04-20Леонид Николаевич СоломинSolomin-utekhin-vilensky apparatus for perosseous osteosynthesis
US20100191239A1 (en)2007-06-012010-07-29Umc Utrecht Holding B.V.System for Correcting Bones
US8147491B2 (en)2007-06-272012-04-03Vilex In Tennessee, Inc.Multi-angle clamp
US20100030219A1 (en)2007-07-012010-02-04L.R.S. Ortho Ltd.Orthopedic navigation system and method
US20090036892A1 (en)2007-07-302009-02-05John Peter KaridisAdjustable length strut apparatus for orthopaedic applications
US20090036890A1 (en)2007-07-312009-02-05John Peter KaridisFixator apparatus with radiotransparent apertures for orthopaedic applications
US20090105621A1 (en)2007-10-182009-04-23Boyd Lawrence MProtective and Cosmetic Covering for External Fixators
US20090143788A1 (en)2007-12-042009-06-04National Cheng Kung UniversityNavigation method and system for drilling operation in spinal surgery
US20090161945A1 (en)2007-12-212009-06-25Canon Kabushiki KaishaGeometric parameter measurement of an imaging device
US20090198234A1 (en)2008-02-012009-08-06Stryker Trauma SaTelescopic strut for an external fixator
US8057474B2 (en)2008-02-012011-11-15Stryker Trauma SaTelescopic strut for an external fixator
US8062293B2 (en)2008-02-012011-11-22Stryker Trauma SaStrut joint for an external fixator
US20100305568A1 (en)2008-02-052010-12-02Texas Scottish Rite Hospital For ChildrenExternal fixator ring
US8439914B2 (en)2008-02-082013-05-14Texas Scottish Rite Hospital For ChildrenExternal fixation strut
US9155559B2 (en)2008-02-082015-10-13Texas Scottish Rite Hospital For ChildrenExternal fixator strut
US20100312243A1 (en)2008-02-082010-12-09Texas Scottish Rite Hospital For ChildrenExternal fixator ring
US8444644B2 (en)2008-02-122013-05-21Texas Scottish Rite Hospital For ChildrenFast adjust external fixation connection rod
US20150313641A1 (en)2008-02-122015-11-05Texas Scottish Rite Hospital For ChildrenFast adjust external fixation connection rod
WO2009102904A1 (en)2008-02-122009-08-20Texas Scottish Rite Hospital For ChildrenFast adjust external fixation connection rod
US9078700B2 (en)2008-02-122015-07-14Texas Scottish Rite Hospital For ChildrenFast adjust external fixation connection rod
JP2011512883A (en)2008-02-182011-04-28テキサス スコティッシュ ライト ホスピタル フォー チルドレン Tool and method for external fixed support adjustment
US20110004199A1 (en)2008-02-182011-01-06Texas Scottish Rite Hospital For ChildrenTool and method for external fixation strut adjustment
US7955334B2 (en)2008-04-182011-06-07Stryker Trauma SaExternal fixation system
US8951252B2 (en)2008-04-182015-02-10Stryker Trauma SaExternal fixation system
US9011438B2 (en)2008-04-182015-04-21Stryker Trauma SaRadiolucent orthopedic fixation plate
US20090275944A1 (en)2008-05-022009-11-05Quantum Medical Concepts LlcExternal Fixation and foot-supporting Device.
US20090326560A1 (en)2008-06-272009-12-31Lampropoulos Fred PCatheter with radiopaque marker
US20090326532A1 (en)2008-06-302009-12-31Depuy Products, Inc.External fixator
WO2010002587A1 (en)2008-06-302010-01-07Depuy Products, Inc.External fixator
US20100087819A1 (en)2008-10-072010-04-08Extraortho, Inc.Forward Kinematic Solution for a Hexapod Manipulator and Method of Use
US20100104150A1 (en)2008-10-242010-04-29Biospace MedMeasurement of geometric quantities intrinsic to an anatomical system
US20100179548A1 (en)2009-01-132010-07-15Marin Luis EExternal fixator assembly
US9044271B2 (en)2009-03-102015-06-02Stryker Trauma SaExternal fixation system
US8333766B2 (en)2009-03-102012-12-18Stryker Trauma SaExternal fixation system
US20100234844A1 (en)2009-03-102010-09-16Stryker Trauma SaExternal fixation system
WO2010104567A1 (en)2009-03-102010-09-16Stryker Trauma SaExternal fixation system
US20100280516A1 (en)2009-04-302010-11-04Jeffrey TaylorAccessory Device for an Orthopedic Fixator
US8323282B2 (en)2009-04-302012-12-04Jeffrey TaylorAccessory device for an orthopedic fixator
US20130211521A1 (en)2009-08-272013-08-15Cotera, Inc.Method and Apparatus for Altering Biomechanics of the Articular Joints
WO2011026475A1 (en)2009-09-052011-03-10Surgitaix AgDevice for fixating bone segments
US8777946B2 (en)2009-10-052014-07-15Aalto University FoundationAnatomically customized and mobilizing external support, method for manufacture
US20150112339A1 (en)2009-10-052015-04-23Aalto University FoundationAnatomically personalized and mobilizing external support and method for controlling a path of an external auxiliary frame
US8858555B2 (en)2009-10-052014-10-14Stryker Trauma SaDynamic external fixator and methods for use
US20140257286A1 (en)2009-10-052014-09-11Aalto University FoundationAnatomically customized and mobilizing external support, method for manufacture
WO2011060264A1 (en)2009-11-132011-05-19Amei Technologies, Inc.Adjustable orthopedic fixation system
US20130131675A1 (en)2009-11-132013-05-23Amei Technologies, Inc.Fixation device and multiple-axis joint for a fixation device
WO2011060266A1 (en)2009-11-132011-05-19Amei Technologies, Inc.Fixation device and multiple-axis joint for a fixation device
US8377060B2 (en)2009-11-132013-02-19Amei Technologies, Inc.Fixation device and multiple-axis joint for a fixation device
US20110118737A1 (en)2009-11-132011-05-19Amei Technologies, Inc.Fixation device and multiple-axis joint for a fixation device
US20110118738A1 (en)2009-11-132011-05-19Amei Technologies, Inc.Adjustable orthopedic fixation system
US8425512B2 (en)2009-11-132013-04-23Amei Technologies, Inc.Fixation device and multiple-axis joint for a fixation device
US8430878B2 (en)2009-11-132013-04-30Amei Technologies, Inc.Adjustable orthopedic fixation system
US20130245625A1 (en)2009-11-132013-09-19Amei Technologies, Inc.Adjustable orthopedic fixation system
US20110131418A1 (en)2009-12-022011-06-02Giga-Byte Technology Co.,Ltd.Method of password management and authentication suitable for trusted platform module
US8257353B2 (en)2010-02-242012-09-04Wright Medical Technology, Inc.Orthopedic external fixation device
US20110208187A1 (en)2010-02-242011-08-25Wright Medical Technology, Inc.Orthopedic external fixation device
US8454604B2 (en)2010-02-242013-06-04Wright Medical Technology, Inc.Orthopedic external fixation device
US9066756B2 (en)2010-02-242015-06-30Wright Medical Technology, Inc.Orthopedic external fixation device
US20150265313A1 (en)2010-02-242015-09-24Wright Medical Technology, Inc.Orthopedic external fixation device
US20120303028A1 (en)2010-02-242012-11-29Wright Medical Technology, Inc.Orthopedic external fixation device
US20130138017A1 (en)2010-03-242013-05-30Jonathon JundtUltrasound guided automated wireless distraction osteogenesis
US20130060146A1 (en)2010-04-282013-03-07Ryerson UniversitySystem and methods for intraoperative guidance feedback
WO2011146703A1 (en)2010-05-192011-11-24Synthes Usa, LlcOrthopedic fixation with imagery analysis
US20110313418A1 (en)2010-05-192011-12-22Arkadijus NikonovasOrthopedic fixation with imagery analysis
US9642649B2 (en)2010-05-192017-05-09DePuy Synthes Products, Inc.Orthopedic fixation with imagery analysis
US20170181800A1 (en)2010-05-192017-06-29DePuy Synthes Products, Inc.Orthopedic fixation with imagery analysis
US20110313419A1 (en)2010-06-222011-12-22Extraortho, Inc.Hexapod External Fixation System with Collapsing Connectors
US8834467B2 (en)2010-08-112014-09-16Stryker Trauma SaExternal fixator system
US9220533B2 (en)2010-08-112015-12-29Stryker Trauma SaExternal fixator system
US20120041439A1 (en)2010-08-112012-02-16Stryker Trauma SaExternal fixator system
US20150238227A1 (en)2010-08-112015-08-27Stryker European Holdings I, LlcExternal fixator system
US8945128B2 (en)2010-08-112015-02-03Stryker Trauma SaExternal fixator system
CN103270513A (en)2010-08-122013-08-28哈特弗罗公司 Methods and systems for patient-specific blood flow modeling
WO2012021307A2 (en)2010-08-122012-02-16Heartflow, Inc.Method and system for patient-specific modeling of blood flow
US20120078251A1 (en)2010-09-232012-03-29Mgv Enterprises, Inc.External Fixator Linkage
US20120232554A1 (en)2011-03-092012-09-13Quantum Medical Concepts LlcAlignment Plate for Lower-extremity Ring Fixation, Method of Use, and System
US20120259343A1 (en)2011-04-082012-10-11Allen Medical Systems, Inc.Low profile distractor apparatuses
US20120330312A1 (en)2011-06-232012-12-27Stryker Trauma GmbhMethods and systems for adjusting an external fixation frame
US20140278325A1 (en)2011-06-232014-09-18Stryker Trauma GmbhMethods and systems for adjusting an external fixation frame
US20130041288A1 (en)2011-08-082013-02-14John Charles TaylorApparatus and Method of Monitoring Healing and/or Assessing Mechanical Stiffness of a Bone Fracture Site or the Like
US20140236152A1 (en)2011-08-232014-08-21Aesculap AgElectrosurgical device and methods of manufacture and use
US20140379038A1 (en)2011-09-092014-12-25University Of The West Of England, BristolSystem for anatomical reduction of bone fractures
US20140303670A1 (en)2011-11-162014-10-09Neuromechanical Innovations, LlcMethod and Device for Spinal Analysis
US20130201212A1 (en)2012-02-032013-08-08Orthohub, Inc.External Fixator Deformity Correction Systems and Methods
US8952986B2 (en)2012-02-032015-02-10Orthohub, Inc.External fixator deformity correction systems and methods
US8654150B2 (en)2012-02-032014-02-18Orthohub, Inc.External fixator deformity correction systems and methods
US9017339B2 (en)2012-04-262015-04-28Stryker Trauma GmbhMeasurement device for external fixation frame
US20130289575A1 (en)2012-04-262013-10-31Stryker Trauma GmbhMeasurement device for external fixation frame
US20130296857A1 (en)2012-05-042013-11-07Gary D. BarnettRatcheting strut
US8906021B1 (en)2012-08-202014-12-09Stryker Trauma SaTelescopic strut for an external fixator
US20150080892A1 (en)2012-08-202015-03-19Stryker European Holdings I, LlcTelescopic strut for an external fixator
US9101398B2 (en)2012-08-232015-08-11Stryker Trauma SaBone transport external fixation frame
US20150305777A1 (en)2012-08-232015-10-29Stryker European Holdings I, LlcBone transport external fixation frame
US20150257788A1 (en)2012-09-062015-09-17Solana Surgical LLCExternal fixator
US20140135764A1 (en)2012-11-132014-05-15Texas Scottish Rite Hospital For ChildrenExternal fixation connection rod for rapid and gradual adjustment
US8574232B1 (en)2012-11-132013-11-05Texas Scottish Hospital for ChildrenExternal fixation connection rod for rapid and gradual adjustment
EP2767252A1 (en)2013-02-192014-08-20Stryker Trauma GmbHSoftware for planning deformity correction
US20140236153A1 (en)2013-02-192014-08-21Stryker Trauma GmbhSoftware for use with deformity correction
US20160045225A1 (en)2013-02-192016-02-18Stryker European Holdings I, LlcSoftware for use with deformity correction
US9204937B2 (en)2013-02-192015-12-08Stryker Trauma GmbhSoftware for use with deformity correction
US20160022314A1 (en)2013-03-132016-01-28DePuy Synthes Products, Inc.External bone fixation device
US20150223842A1 (en)2013-03-132015-08-13DePuy Synthes Products, Inc.External Bone Fixation Device
US20140276821A1 (en)2013-03-132014-09-18Nicole MurrayExternal Bone Fixation Device
US9039706B2 (en)2013-03-132015-05-26DePuy Synthes Products, Inc.External bone fixation device
US8864763B2 (en)2013-03-132014-10-21DePuy Synthes Products, LLCExternal bone fixation device
US20140276817A1 (en)2013-03-132014-09-18Nicole MurrayExternal Bone Fixation Device
US20160092651A1 (en)2013-05-142016-03-31Smith & Nephew, Inc.Apparatus and method for administering a medical device prescription
WO2014186453A2 (en)2013-05-142014-11-20Smith & Nephew, Inc.Apparatus and method for administering a medical device prescription
US20160125603A1 (en)2013-06-112016-05-05Atsushi TanjiBone cutting support system, information processing apparatus, image processing method, and image processing program
US20150088135A1 (en)2013-09-262015-03-26Stryker Trauma GmbhBone position tracking system
US20150272624A1 (en)2014-04-012015-10-01Stryker European Holdings I, LlcExternal fixator with y strut
US20150305776A1 (en)2014-04-232015-10-29Texas Scottish Rite Hospital For ChildrenDynamization module for external fixation strut
US20160183979A1 (en)2014-08-272016-06-30Vito Del DeoMethod and device for positioning and stabilization of bony structures during maxillofacial surgery
US20160113681A1 (en)2014-10-242016-04-28Stryker European Holdings I, LlcMethods and systems for adjusting an external fixation frame
US20170224520A1 (en)2014-11-042017-08-10Osteoid Saglik Teknolojileri A.S.Methods for integrating sensors and effectors in custom three-dimensional orthosis
CN105852985A (en)2016-04-182016-08-17上海昕健医疗技术有限公司Method for manufacturing personalized orthopedic positioning guide plates
US20170303966A1 (en)2016-04-202017-10-26Stryker European Holdings I, LlcRing hole planning for external fixation frames
US9895167B2 (en)2016-04-202018-02-20Stryker European Holdings I, LlcRing hole planning for external fixation frames
US20170348054A1 (en)2016-06-022017-12-07Stryker European Holdings I, LlcSoftware for use with deformity correction
US20170348057A1 (en)2016-06-022017-12-07Stryker European Holdings I, LlcSoftware for use with deformity correction
US20170354439A1 (en)2016-06-142017-12-14Stryker European Holdings I, LlcGear mechanisms for fixation frame struts
US20180055569A1 (en)*2016-08-252018-03-01DePuy Synthes Products, Inc.Orthopedic fixation control and manipulation
WO2019040829A1 (en)2017-08-242019-02-28Amdt Holdings, Inc.Methods and systems for determining adjustment prescriptions of external fixation devices
WO2020023686A1 (en)2018-07-242020-01-30Amdt Holdings, Inc.Methods and systems of registering a radiographic image and a 3d model of an external fixation device

Non-Patent Citations (40)

* Cited by examiner, † Cited by third party
Title
Changjiang Yang et al: "Planar conic based camera calibration", Proceedings / 15th International Conference on Pattern Recognition Barcelona, Spain, Sep. 3-7, [Proceedings of the International Conference on Pattern Recognition. (ICPR)], IEEE Computer Society, Los Alamitos, Calif. [U.A.], vol. 1, Sep. 3, 2000 (Sep. 3, 2000) pp. 555-558.
Charlton, An Investigation into the Effect of Lateral Hillslope Inputs on Floodplain Hydraulic Model Predictions, Diss. University of Bristol, Sep. 1995, 289 pages.
Circle Hough Transform, Wikipedia, https://en.wikipedia.org/wiki/Circle_Hough_Transform, web-archive capture from Jul. 31, 2017, accessed on Nov. 24, 2020 from web.archive.org/web/20170731074826/https://en.wikipedia.org/wiki/Circle_Hough_Transform, 4 pages.
Circle Hough Transform, Wikipedia, https://en.wikipedia.org/wiki/Circle_Hough_Transform, web-archive capture from Jul. 31, 2017, accessed on Jun. 16, 2021 from web.archive.org/web/20170731074826/https://en.wikipedia.org/wiki/Circle_Hough_Transform, 4 pages.
Decision to Grant (Translation) dated Mar. 2016 in Russian patent application 2012147835, 6 pages.
Durali M, Shameli E. Full order neural velocity and acceleration observer for a general 6-6 Stewart platform. In Networking, Sensing and Control, 2004 IEEE International Conference on Mar. 21, 2004 (vol. 1, pp. 333-338).
Garreau et al., "A Knowledge-Based Approach for 3-D Reconstruction and Labeling of Vascular Networks from Biplane Angiographic Projections", IEEE Transactions On Medical Imaging, Jun. 1991, vol. 10, No. 2, 122-131.
Hartley, Euclidian Reconstruction from Uncalibrated Views, Applications of Invariance in Computer Vision, pp. 237-256, Springer Verlag, Berlin Heidelberg, 1994.
Iterative Closest Point, Wikipedia, https://en.wikipedia.org/wiki/iterative_closest_point, web-archive capture from Jan. 17, 2019, accessed on Mar. 4, 2021 from web.archive.org/web/20190117001205/https://en.wikipedia.org/wiki/iterative_closest_point, 3 pages.
Iterative Closest Point, Wikipedia, https://en.wikipedia.org/wiki/iterative_closest_point, web-archive capture from Jan. 17, 2019, accessed on Nov. 24, 2020 from web.archive.org/web/20190117001205/https://en.wikipedia.org/wiki/iterative_closest_point, 3 pages.
Iterative Closest Point, Wikipedia, https://en.wikipedia.org/wiki/iterative_closest_point, web-archive capture from Oct. 28, 2010, accessed on Oct. 26, 2020 from web.archive.org/web/20101028140305/https://en.wikipedia.org/wiki/iterative_closest_point, 3 pages.
Iterative Closest Point, Wikipedia, https://en.wikipedia.org/wiki/iterative_closest_point, web-archive capture from Sep. 13, 2006, accessed on Oct. 26, 2020 from web.archive.org/web/20060913000000/http://en.wikipedia.org/wiki/iterative_closest_point, 1 page.
Kelly, "How to calculate 3D coordinates with two cameras, a calibration object, a java program, and a lot of MS Excel macros", Jun. 10, 2002, 9 pages.
Maiocchi et al., "Operative Principles Of Ilizarov", Chapter 2, 1991, 26 pages.
Maxframe Software User's Manual, dated Feb. 8, 2017, 138 pages.
Maxframe Software User's Manual, dated Feb. 8, 2017, Section 7.4.2.6 "Display Frame Configuration Model" and Section 7.4.3 "View Options", pp. 72-73.
Maxframe Software User's Manual, dated Feb. 8, 2017, Section 9.1 Edit Strut Mounting Points, pp. 88-89.
Maxframe User's Manual, dated Feb. 8, 2017, Section 7.4 "Perform the Frame Matching" and Section 7.5 "Save and Proceed", pp. 63-73.
Nikonovas, Arkadijus. Taylor Spatial Frame: Kinematics, Mechanical Properties and Automation. Diss. University of Bristol, May 2005, 230 pages.
Orthofix, TL-HEX Software User's Guide: Software version 1.4, Nov. 2015, 60 pages.
Ortho-SUV Frame—Art of Deformity Correction, Ortho-SUV Ltd, captured by https://web.archive.org from http://www.miito.org/download/ortho-suv-frame-eng.pdf on Jun. 13, 2010; 11 pages.
Paley et al., "Deformity Correction By The Ilizarov Technique", Operative Orthopaedics, 1993, 883-948.
Paley, "The principles of deformity correction by the Ilizarov technique: Technical aspects", Techniques in Orthopaedics, 1989, vol. 4, Issue 1,15-29.
Parikh PJ, Lam SS. A hybrid strategy to solve the forward kinematics problem in parallel manipulators. IEEE Transactions on Robotics Feb. 2005; 21(1): 18-25.
Point set registration, Wikipedia, https://en.wikipedia.org/wiki/Point_set_registration, web-archive capture from Jul. 21, 2017, accessed on Nov. 24, 2020 from web.archive.org/web/20170721011854/https://en.wikipedia.org/wiki/Point_set_registration, 12 pages.
Point set registration, Wikipedia, https://en.wikipedia.org/wiki/Point_set_registration, web-archive capture from Jul. 21, 2017, accessed on Mar. 11, 2021 from web.archive.org/web/20170721011854/https://en.wikipedia.org/wiki/Point_set_registration, 12 pages.
Ren L, Feng Z, Mills JK. A self-tuning iterative calculation approach for the forward kinematics of a Stewart-Gough platform. In Mechatronics and Automation, Proceedings of the 2006 IEEE International Conference on Jun. 25, 2006, 2018-2023.
Russakoff et al., "Intensity-Based 2D-3D Spine Image Registration Incorporating a Single Fiducial Marker", Academic Radiology, Jan. 2005, vol. 12, No. 1, 37-50.
Simard et al., "The Ilizarov Procedure: Limb Lengthening and Its Implications", Physical Therapy, Jan. 1992, vol. 72, No. 1, 25-35.
Solomin et al., Deformity Correction and Fracture Treatment by software-based Ortho-SUV Frame User Manual Draft, Sep. 2011, 90 pages.
Solomin et al., Deformity Correction and Fracture Treatment by software-based Ortho-SUV Frame, User Manual, For SUV-Software vp 1.0 and vr 1.0, Vreden Russian Research Institute of Traumatology and Orthopedics, (Ortho-SUV) Ltd., Saint Petersburg, 2013, 144 pages.
Solomin et al., Deformity Correction and Fracture Treatment by software-based Ortho-SUV Frame, User Manual, For SUV-Software vp 2.1, Vreden Russian Research Institute of Traumatology and Orthopedics, (Ortho-SUV) Ltd., Saint Petersburg, 2016, 158 pages.
Solomin, The Basic Principles of External Fixation Using The Ilizarov Device, 2005, 371 pages.
Stoughton et al., "A Modified Stewart Platform Manipulator with Improved Dexterity", IEEE Transactions On Robotics And Automation, Apr. 1993, vol. 9, No. 2, 166-173.
Stryker, Hoffman LRF, Gradual Correction, Operative technique, 2016, 36 pages.
Stryker, Hoffmann LRF Hexapod, Operative technique, Jul. 2016, 44 pages.
T.A. Larionova et al., "X-ray absorptiometry in the analysis of bone mineral density of a patient with an orthopaedic trauma", Genius of Orthopaedy No. 3, pp. 98-102 (w/English abstract) 2009.
Trucco, Introductory Techniques of 3-D Computer Vision, Prentice Hall, 178-194, 1998.
Tsai, A Versatile Camera Calibration Technique for High-Accuracy 3D Machine Vision Metrology Using off-the-shelf TV Cameras and Lenses, IEEE Journal of Robotics & Automation, RA-3, No. 4, 323-344, Aug. 1987.
Viceconti et al., "A software simulation of tibial fracture reduction with external fixator", Computer Methods and Programs in Biomedicine, 1993, 40, 89-94.

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US20240325086A1 (en)* | 2016-06-02 | 2024-10-03 | Stryker European Operations Holdings Llc | Software for use with deformity correction
US12295665B2 (en)* | 2016-06-02 | 2025-05-13 | Stryker European Operations Holdings Llc | Software for use with deformity correction
US12220250B2 (en) | 2016-06-19 | 2025-02-11 | Synthes GmbH | User interface for strut device

Also Published As

Publication number | Publication date
CA3135073A1 (en) | 2020-10-01
CN113841183B (en) | 2024-11-26
WO2020193629A1 (en) | 2020-10-01
EP3948801A1 (en) | 2022-02-09
AU2020245028B2 (en) | 2024-11-07
CN113841183A (en) | 2021-12-24
BR112021019062A2 (en) | 2022-02-15
US20200305977A1 (en) | 2020-10-01
JP2022526540A (en) | 2022-05-25
AU2020245028A1 (en) | 2021-10-14
JP7500601B2 (en) | 2024-06-17

Similar Documents

Publication | Publication Date | Title
US11918292B2 (en) | Orthopedic fixation control and manipulation
US11957419B2 (en) | External fixator deformity correction systems and methods
CN108601629A (en) | The 3D visualizations of radioactive exposure are reduced during surgical operation
AU2020245028B2 (en) | Orthopedic fixation control and visualization
US11648035B2 (en) | Orthopedic fixation strut swapping
US11893737B2 (en) | Hinge detection for orthopedic fixation
CN119344869B (en) | An intramedullary nail locking method, device and program product based on an optical tracking system

Legal Events

Date | Code | Title | Description
FEPP | Fee payment procedure

Free format text:ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

AS | Assignment

Owner name:INNOMEDIC GMBH, GERMANY

Free format text:ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GUTMANN, BERND;RAABE, MARTIN;REEL/FRAME:049799/0481

Effective date:20190411

Owner name:SYNTHES GMBH, SWITZERLAND

Free format text:ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:INNOMEDIC GMBH;REEL/FRAME:049799/0592

Effective date:20190718

FEPP | Fee payment procedure

Free format text:PETITION RELATED TO MAINTENANCE FEES GRANTED (ORIGINAL EVENT CODE: PTGR); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STPP | Information on status: patent application and granting procedure in general

Free format text:NON FINAL ACTION MAILED

STPP | Information on status: patent application and granting procedure in general

Free format text:RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP | Information on status: patent application and granting procedure in general

Free format text:NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP | Information on status: patent application and granting procedure in general

Free format text:PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED

STPP | Information on status: patent application and granting procedure in general

Free format text:PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STPP | Information on status: patent application and granting procedure in general

Free format text:DOCKETED NEW CASE - READY FOR EXAMINATION

STPP | Information on status: patent application and granting procedure in general

Free format text:NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP | Information on status: patent application and granting procedure in general

Free format text:AWAITING TC RESP., ISSUE FEE NOT PAID

STPP | Information on status: patent application and granting procedure in general

Free format text:NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP | Information on status: patent application and granting procedure in general

Free format text:PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED

STPP | Information on status: patent application and granting procedure in general

Free format text:PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF | Information on status: patent grant

Free format text:PATENTED CASE

MAFP | Maintenance fee payment

Free format text:PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment:4

