CN118660668A - Mobile X-ray Positioning System - Google Patents

Mobile X-ray Positioning System

Info

Publication number
CN118660668A
CN118660668A
Authority
CN
China
Prior art keywords
imaging device
image
coordinates
processor
interest
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202280089774.9A
Other languages
Chinese (zh)
Inventor
蔡鹏飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Warsaw Orthopedic Inc
Original Assignee
Warsaw Orthopedic Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Warsaw Orthopedic Inc
Publication of CN118660668A
Legal status: Pending

Abstract


A robotic surgical imaging system (100, 200, 300) includes a first imaging device (202) and a second imaging device (204, 302). The first imaging device (202) can be used to capture a first image of a target environment, wherein the first image includes an object (502, 602, 702) in the target environment. Subsequently, a coordinate of interest (504) associated with at least a portion of the object can be selected in the first image. Real-world coordinates (506, 606) corresponding to the coordinate of interest and the portion of the object can then be generated, and the second imaging device (204) can be placed at a certain position (508, 608) based on the real-world coordinates. After verifying that the position of the second imaging device (204) corresponds to the coordinate of interest (e.g., making any required adjustments to the position), the second imaging device (204) can be used to capture a second image (510, 610, 710) of the portion of the object.

Description

Mobile X-ray positioning system
Background
The present disclosure relates generally to surgical systems, and more particularly to imaging devices for surgical systems.
The surgical robot may assist a surgeon or other medical provider in performing a surgical procedure, or may autonomously complete one or more surgical procedures. Imaging may be used by medical providers for diagnostic, operational, and/or therapeutic purposes. Providing controllably connected articulating members allows the surgical robot to reach areas of the patient's anatomy during various medical procedures (e.g., using imaging).
Disclosure of Invention
Example aspects of the present disclosure include:
A robotic surgical imaging system, the robotic surgical imaging system comprising: a first imaging device; a second imaging device; a processor coupled to the first imaging device and the second imaging device; and a memory coupled to the processor and readable by the processor and storing instructions in the memory that, when executed by the processor, cause the processor to: capturing a first image of a target environment using a first imaging device, wherein the first image includes objects included in the target environment; selecting coordinates of interest included in the first image, the coordinates of interest including at least a portion of the object; generating a set of real world coordinates corresponding to the portion of the object based at least in part on a mapping between the set of pixel coordinates associated with the first image and the set of real world coordinates; positioning the second imaging device to the first location based at least in part on the set of real world coordinates; and displaying a second image captured using the second imaging device, the second image comprising at least the portion of the object.
Any of the aspects herein, wherein the instructions for selecting the coordinates of interest included in the first image cause the processor to: a target line is selected that passes through at least the portion of the object, wherein the target line includes coordinates of interest.
Any of the aspects herein, wherein the instructions further cause the processor to: capturing a third image of the target environment using the first imaging device after the second imaging device has been positioned to the first position; and verifying that the second imaging device is at the coordinates of interest based at least in part on the third image, the coordinates of interest, the set of real world coordinates, the set of pixel coordinates, or a combination thereof.
Any of the aspects herein, wherein the instructions further cause the processor to: the second imaging device is positioned to a second location based at least in part on determining from the verification that the second imaging device is not located at the coordinates of interest.
Any of the aspects herein, wherein the instructions further cause the processor to: the instructional information is displayed to assist the operator in positioning the second imaging device to the second position.
Any of the aspects herein, wherein the instructions for generating the set of real world coordinates corresponding to the portion of the object cause the processor to: a distance is calculated to move the second imaging device to the first position to capture a second image comprising at least the portion of the object, wherein the distance is calculated based at least in part on the coordinates of interest and an initial position of the second imaging device.
Any of the aspects herein, wherein the instructions further cause the processor to: guide information is displayed to assist the operator in locating the second imaging device, wherein the guide information includes the calculated distance.
Any of the aspects herein, wherein the guidance information is displayed to the operator based at least in part on the set of pixel coordinates associated with the first image.
Any of the aspects herein, wherein the first imaging device comprises a camera and the second imaging device comprises an X-ray machine.
A system, the system comprising: a processor; a memory coupled with the processor and readable by the processor and storing instructions in the memory that, when executed by the processor, cause the processor to: capturing a first image of a target environment using a first imaging device, wherein the first image includes objects included in the target environment; selecting coordinates of interest included in the first image, the coordinates of interest including at least a portion of the object; generating a set of real world coordinates corresponding to the portion of the object based at least in part on a mapping between the set of pixel coordinates associated with the first image and the set of real world coordinates; positioning the second imaging device to the first location based at least in part on the set of real world coordinates; and displaying a second image captured using the second imaging device, the second image comprising at least the portion of the object.
Any of the aspects herein, wherein the instructions for selecting the coordinates of interest included in the first image cause the processor to: a target line is selected that passes through at least the portion of the object, wherein the target line includes coordinates of interest.
Any of the aspects herein, wherein the instructions further cause the processor to: capturing a third image of the target environment using the first imaging device after the second imaging device has been positioned to the first position; and verifying that the second imaging device is at the coordinates of interest based at least in part on the third image, the coordinates of interest, the set of real world coordinates, the set of pixel coordinates, or a combination thereof.
Any of the aspects herein, wherein the instructions further cause the processor to: the second imaging device is positioned to a second location based at least in part on determining from the verification that the second imaging device is not located at the coordinates of interest.
Any of the aspects herein, wherein the instructions further cause the processor to: the instructional information is displayed to assist the operator in positioning the second imaging device to the second position.
Any of the aspects herein, wherein the instructions for generating the set of real world coordinates corresponding to the portion of the object cause the processor to: a distance is calculated to move the second imaging device to the first position to capture a second image comprising at least the portion of the object, wherein the distance is calculated based at least in part on the coordinates of interest and an initial position of the second imaging device.
Any of the aspects herein, wherein the instructions further cause the processor to: guide information is displayed to assist the operator in locating the second imaging device, wherein the guide information includes the calculated distance.
Any of the aspects herein, wherein the guidance information is displayed to the operator based at least in part on the set of pixel coordinates associated with the first image.
Any of the aspects herein, wherein the first imaging device comprises a camera and the second imaging device comprises an X-ray machine.
A method, the method comprising: capturing a first image of a target environment using a first imaging device, wherein the first image includes objects included in the target environment; generating a set of real world coordinates corresponding to a coordinate of interest included in the first image based at least in part on a mapping between the set of pixel coordinates associated with the first image and the set of real world coordinates, wherein the coordinate of interest indicates at least a portion of the object; and displaying a second image captured using a second imaging device, the second image comprising at least the portion of the object, wherein the second imaging device is positioned to the first position based at least in part on the set of real world coordinates.
Any of the aspects herein, wherein the coordinates of interest comprise a target line passing through at least the portion of the object.
Any aspect may be combined with any one or more other aspects.
Any one or more of the features disclosed herein.
Any one or more of the features are generally disclosed herein.
Any one or more of the features generally disclosed herein are combined with any one or more other features generally disclosed herein.
Any of the aspects/features/embodiments are combined with any one or more other aspects/features/embodiments.
Any one or more of the aspects or features disclosed herein are used.
It should be understood that any feature described herein may be claimed in combination with any other feature as described herein, whether or not the feature is from the same described embodiment.
The details of one or more aspects of the disclosure are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the technology described in this disclosure will be apparent from the description and drawings, and from the claims.
The phrases "at least one," "one or more," and/or "are open-ended expressions that have both connectivity and separability in operation. For example, each of the expressions "at least one of A, B and C", "at least one of A, B or C", "one or more of A, B and C", "one or more of A, B or C", and "A, B and/or C" means only a, only B, only C, A and B together, a and C together, B and C together, or A, B and C together. When each of A, B and C in the above expressions refers to an element such as X, Y and Z or a class of elements such as X1-Xn, Y1-Ym, and Z1-Zo, the phrase is intended to refer to a single element selected from X, Y and Z, a combination of elements selected from the same class (e.g., X1 and X2), and a combination of elements selected from two or more classes (e.g., Y1 and Zo).
The term "an" entity refers to one or more of that entity. Thus, the terms "a", "one or more", and "at least one" may be used interchangeably herein. It should also be noted that the terms "comprising," "including," and "having" may be used interchangeably.
The foregoing is a simplified summary of the disclosure to provide an understanding of some aspects of the disclosure. This summary is neither an extensive nor an exhaustive overview of the disclosure and its various aspects, embodiments, and configurations. It is intended neither to identify key or critical elements of the disclosure nor to delineate the scope of the disclosure, but to present selected concepts of the disclosure in a simplified form as an introduction to the more detailed description presented below. As will be appreciated, other aspects, embodiments, and configurations of the present disclosure are possible, utilizing, alone or in combination, one or more of the features set forth above or described in detail below.
Many additional features and advantages of the present disclosure will become apparent to those of skill in the art upon consideration of the description of the embodiments presented below.
Drawings
The accompanying drawings are incorporated in and form a part of this specification to illustrate several examples of the present disclosure. Together with the description, these drawings serve to explain the principles of the disclosure. The drawings only show preferred and alternative examples of how the disclosure may be made and used, and these examples should not be construed as limiting the disclosure to only the examples shown and described. Additional features and advantages will be made apparent from the following more detailed description of various aspects, embodiments and configurations of the present disclosure, as illustrated by the accompanying drawings referenced below.
FIG. 1 is a block diagram of a system according to at least one embodiment of the present disclosure;
FIG. 2 is a diagram of an imaging system according to at least one embodiment of the present disclosure;
FIG. 3 is an additional imaging system diagram in accordance with at least one embodiment of the present disclosure;
FIG. 4 is a set of coordinate maps in accordance with at least one embodiment of the present disclosure;
FIG. 5 is a flow chart according to at least one embodiment of the present disclosure;
FIG. 6 is an additional flow diagram in accordance with at least one embodiment of the present disclosure; and
Fig. 7 is an additional flow diagram in accordance with at least one embodiment of the present disclosure.
Detailed Description
It should be understood that the various aspects disclosed herein may be combined in different combinations than specifically presented in the specification and drawings. It should also be appreciated that certain acts or events of any of the processes or methods described herein can be performed in a different order, and/or can be added, combined, or omitted entirely, depending on the example or implementation (e.g., not all of the described acts or events may be required to practice the disclosed techniques in accordance with different embodiments of the disclosure). Moreover, although certain aspects of the disclosure are described as being performed by a single module or unit for clarity, it should be understood that the techniques of this disclosure may be performed by a combination of units or modules associated with, for example, a computing device and/or a medical device.
In one or more examples, the described methods, processes, and techniques may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a computer-readable medium and executed by a hardware-based processing unit. Alternatively or additionally, the functions may be implemented using a machine learning model, a neural network, an artificial neural network, or a combination thereof (alone or in combination with instructions). The computer-readable medium may include a non-transitory computer-readable medium corresponding to a tangible medium, such as a data storage medium (e.g., RAM, ROM, EEPROM, flash memory, or any other medium that may be used to store desired program code in the form of instructions or data structures and that may be accessed by a computer).
The instructions may be executed by one or more processors, such as one or more digital signal processors (DSPs), general purpose microprocessors (e.g., an Intel Core i3, i5, i7, or i9 processor; an Intel Celeron processor; an Intel Xeon processor; an Intel Pentium processor; an AMD Ryzen processor; an AMD Athlon processor; an AMD Phenom processor; an Apple A10 or A10X Fusion processor; an Apple A11, A12X, A12Z, or A13 Bionic processor; or any other general purpose microprocessor), graphics processing units (e.g., an Nvidia GeForce RTX series processor, an Nvidia GeForce RTX 3000 series processor, an AMD Radeon RX 5000 series processor, an AMD Radeon RX 6000 series processor, or any other graphics processing unit), application-specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), or other equivalent integrated or discrete logic circuitry. Thus, the term "processor" as used herein may refer to any of the foregoing structures or any other physical structure suitable for implementation of the described techniques. In addition, these techniques may be fully implemented in one or more circuits or logic elements.
Before any embodiments of the disclosure are explained in detail, it is to be understood that the disclosure is not limited in its application to the details of construction and the arrangement of components set forth in the following description or illustrated in the drawings. The disclosure is capable of other embodiments and of being practiced or of being carried out in various ways. Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. The use of "including," "comprising," or "having" and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. Further, the present disclosure may use examples to illustrate one or more aspects thereof. The use or listing of one or more examples (which may be indicated by "for example," "by way of example," "such as," or similar language) is not intended to limit the scope of the disclosure, unless expressly stated otherwise.
The terms proximal and distal are used in this disclosure in their conventional medical sense, proximal being closer to the operator or user of the system and further from the surgical field of interest in or on the patient's body, and distal being closer to the surgical field of interest in or on the patient's body and further from the operator or user of the system.
In some surgical procedures (e.g., orthopedic surgery), an X-ray system may be used to ensure that the position of the X-ray machine is correct (e.g., before and/or during the surgical procedure). For example, an X-ray system may include an X-ray machine and a display screen for displaying X-ray images captured from the X-ray machine, wherein the X-ray images are used to ensure that the position of the X-ray machine is correct. However, these X-ray systems may rely on trial and error methods to determine if the X-ray system is positioned correctly. For example, an operator of one of these X-ray systems may place the X-ray machine at an approximate location required to capture a particular portion of the patient (e.g., on which a surgical procedure is being performed). Subsequently, the operator may capture an X-ray image of the patient from the X-ray machine at the approximate location, and may determine whether the X-ray image accurately captures a particular portion of the patient. If the operator determines that the X-ray image does not accurately capture a particular portion of the patient, the operator may readjust or move the X-ray machine and repeatedly capture the X-ray image of the patient until the X-ray machine is accurately positioned (e.g., for capturing an X-ray image of a particular region of the patient, for a surgical procedure, etc.).
In some examples, these X-ray systems may have a limited field of view (FOV). That is, the X-ray machine may only be able to capture a narrow region of interest (e.g., to limit any possible X-ray radiation to the patient and/or operator). When checking whether the X-ray machine is accurately positioned to ensure that the subsequent X-ray scan contains all of the portion of interest of the patient, the limited FOV may result in the operator having to reposition the X-ray machine multiple times and capture multiple X-ray images of the patient. For example, operators of these X-ray systems may attempt to accurately position the X-ray machine and system multiple times while ensuring that the scan range is suitable for capturing X-ray images of a portion of interest of a patient for performing a surgical procedure. However, in the case of capturing multiple X-ray images to ensure that the position of the X-ray machine is correct, a safety risk is introduced that may expose the patient and/or operator to unnecessary amounts of radiation.
As described herein, a positioning system is provided that uses a camera (e.g., located at the top of an operating room or at different locations in an operating room) to identify the current position of both an X-ray machine and a patient, and then calculates and provides the operator (e.g., surgeon) with the precise adjustments needed to adjust the position of the X-ray machine in the operating room in order to obtain an accurate FOV encompassing the entire region of interest of the patient for imaging. The positioning system may assist an operator in effectively positioning the X-ray machine based on manual input from the operator indicative of a region of interest of the patient. For example, using a camera on top of an operating room, markers on top of an X-ray machine may be used to check the position of the X-ray machine in the operating room. Additionally, the operator may use the image captured from the camera to input the target position onto the image corresponding to the FOV or position of interest on the patient. The positioning system may then use the target position to calculate a precise movement distance for moving the X-ray machine to the target position and feed back the calculated distance to the operator, and the operator may move the X-ray machine to the correct position based on the feedback. The positioning system may help avoid an operator having to position and reposition the X-ray machine multiple times and may reduce the uncertainty of whether the X-ray scan is correct, resulting in fewer X-ray images being taken and less exposure to the relevant radiation.
Embodiments of the present disclosure provide technical solutions to one or more of the following problems: (1) determining an accurate location for placement of an X-ray system or machine, (2) exposing a patient and/or operator of the X-ray system to an unnecessary amount of radiation, and (3) extending the procedure time of surgery. The positioning system described herein enables an operator of an X-ray system employing the positioning system to more accurately place the X-ray system in the exact location required to capture the correct region of interest of the patient without having to take multiple X-rays of the patient. With such a quick and accurate positioning of the X-ray machine, fewer X-rays may be taken of the patient or the amount of radiation to which the patient and/or operator are exposed may be otherwise limited. Additionally, the amount of time required to perform the relevant surgical procedure using the X-ray system may be reduced due to the use of the described positioning system.
Turning first to fig. 1, a block diagram of a system 100 in accordance with at least one embodiment of the present disclosure is shown. The system 100 may be used to position an imaging device (e.g., an X-ray system or machine) to capture a region of interest of a patient based on images captured from additional imaging devices (e.g., cameras). Thus, the distance that the imaging device moves can be calculated based on selecting regions of interest of the patient on images captured from additional imaging devices and calculating how far the imaging device needs to be moved to accurately capture those regions of interest. In some examples, the system 100 may be used to control, pose, and/or otherwise manipulate a surgical mounting system, a surgical arm, a surgical tool, and/or an imaging device attached thereto, and/or to perform one or more other aspects of one or more of the methods disclosed herein. The system 100 includes a computing device 102, one or more imaging devices 112, a robot 114, a navigation system 118, a database 130, and/or a cloud or other network 134. Systems according to other embodiments of the present disclosure may include more or fewer components than system 100. For example, the system 100 may not include the imaging device 112, the robot 114, the navigation system 118, one or more components of the computing device 102, the database 130, and/or the cloud 134.
The computing device 102 includes a processor 104, a memory 106, a communication interface 108, and a user interface 110. Computing devices according to other embodiments of the present disclosure may include more or fewer components than computing device 102.
The processor 104 of the computing device 102 may be any processor described herein or any similar processor. The processor 104 may be configured to execute instructions stored in the memory 106 that may cause the processor 104 to perform one or more computing steps with or based on data received from the imaging device 112, the robot 114, the navigation system 118, the database 130, and/or the cloud 134.
Memory 106 may be or include RAM, DRAM, SDRAM, other solid state memory, any memory described herein, or any other tangible, non-transitory memory for storing computer-readable data and/or instructions. Memory 106 may store information or data for performing any step of methods 500, 600, and/or 700, or any other method described herein. The memory 106 may store, for example, instructions and/or machine learning models that support one or more functions of the robot 114. For example, the memory 106 may store content (e.g., instructions and/or machine learning models) that, when executed by the processor 104, enables the image processing 120, segmentation 122, transformation 124, and/or registration 128. In some embodiments, such content, if provided as instructions, may be organized into one or more applications, modules, packages, layers, or engines. Alternatively or additionally, the memory 106 may store other types of content or data (e.g., machine learning models, artificial neural networks, deep neural networks, etc.) that may be processed by the processor 104 to carry out the various methods and features described herein. Thus, while various contents of the memory 106 may be described as instructions, it should be understood that the functionality described herein may be implemented through the use of instructions, algorithms, and/or machine learning models. The data, algorithms, and/or instructions may cause the processor 104 to manipulate data stored in the memory 106 and/or received from or via the imaging device 112, the robot 114, the database 130, and/or the cloud 134.
Computing device 102 may also include a communication interface 108. The communication interface 108 may be used to receive image data or other information from external sources (such as the imaging device 112, the robot 114, the navigation system 118, the database 130, the cloud 134, and/or any other system or component that is not part of the system 100) and/or to transmit instructions, images, or other information to external systems or devices (e.g., another computing device 102, the imaging device 112, the robot 114, the navigation system 118, the database 130, the cloud 134, and/or any other system or component that is not part of the system 100). Communication interface 108 may include one or more wired interfaces (e.g., a USB port, an Ethernet port, a FireWire port) and/or one or more wireless transceivers or interfaces (configured to transmit and/or receive information, e.g., via one or more wireless communication protocols such as 802.11a/b/g/n, Bluetooth, NFC, ZigBee, etc.). In some implementations, the communication interface 108 may be used to enable the computing device 102 to communicate with one or more other processors 104 or computing devices 102, whether to reduce the time required to complete computationally intensive tasks or for any other reason.
The computing device 102 may also include one or more user interfaces 110. The user interface 110 may be or include a keyboard, mouse, trackball, monitor, television, screen, touch screen, and/or any other device for receiving information from a user and/or for providing information to a user. The user interface 110 may be used, for example, to receive user selections or other user inputs regarding any of the steps of any of the methods described herein. Nonetheless, any desired input for any step of any method described herein may be automatically generated by the system 100 (e.g., by the processor 104 or another component of the system 100) or received by the system 100 from a source external to the system 100. In some embodiments, the user interface 110 may be used to allow a surgeon or other user to modify instructions to be executed by the processor 104 and/or to modify or adjust settings of other information displayed on or corresponding to the user interface 110 in accordance with one or more embodiments of the present disclosure.
Although the user interface 110 is shown as part of the computing device 102, in some embodiments, the computing device 102 may utilize the user interface 110 housed separately from one or more remaining components of the computing device 102. In some embodiments, the user interface 110 may be located proximate to one or more other components of the computing device 102, while in other embodiments, the user interface 110 may be located remotely from one or more other components of the computing device 102.
The imaging device 112 is operable to image anatomical features (e.g., bones, veins, tissue, etc.) and/or other aspects of the patient anatomy to produce image data (e.g., image data depicting or corresponding to bones, veins, tissue, etc.). "image data" as used herein refers to data generated or captured by the imaging device 112, including data in machine-readable form, graphical/visual form, and in any other form. In different examples, the image data may include data corresponding to anatomical features of the patient or a portion thereof. The image data may be or include preoperative images, intra-operative images, post-operative images, or images taken independently of any surgical procedure. In some implementations, the first imaging device 112 may be used to obtain first image data (e.g., a first image) at a first time, and the second imaging device 112 may be used to obtain second image data (e.g., a second image) at a second time that is subsequent to the first time. The imaging device 112 may be capable of capturing 2D images or 3D images to generate image data. The imaging device 112 may be or include, for example, an ultrasound scanner (which may include, for example, physically separate transducers and receivers, or a single ultrasound transceiver), an O-arm, a C-arm, a G-arm, or any other device that utilizes X-ray based imaging (e.g., fluoroscope, CT scanner, or other X-ray machine), a Magnetic Resonance Imaging (MRI) scanner, an Optical Coherence Tomography (OCT) scanner, an endoscope, a microscope, an optical camera, a thermal imaging camera (e.g., an infrared camera), a radar system (which may include, for example, a transmitter, a receiver, a processor, and one or more antennas), or any other imaging device 112 suitable for obtaining images of anatomical features of a patient. The imaging device 112 may be contained entirely within a single housing, or may include a transmitter/emitter and receiver/detector located in separate housings or otherwise physically separated.
In some embodiments, the imaging device 112 may include more than one imaging device 112. For example, a first imaging device may provide first image data and/or a first image, and a second imaging device may provide second image data and/or a second image. In yet other implementations, the same imaging device may be used to provide both the first image data and the second image data and/or any other image data described herein. The imaging device 112 is operable to generate an image data stream. For example, the imaging device 112 may be configured to operate with the shutter open, or with the shutter continuously alternating between open and closed, in order to capture successive images. For purposes of this disclosure, image data may be considered continuous and/or provided as a stream of image data if the image data represents two or more frames per second, unless otherwise indicated.
The robot 114 may be any surgical robot or surgical robotic system. The robot 114 may be or include, for example, a Mazor X™ Stealth robotic guidance system. The robot 114 may be configured to position the imaging device 112 at one or more precise locations and orientations and/or return the imaging device 112 to the same location and orientation at a later point in time. The robot 114 may additionally or alternatively be configured to manipulate the surgical tool (whether based on guidance from the navigation system 118 or not) to complete or assist in surgical tasks. In some embodiments, the robot 114 may be configured to hold and/or manipulate anatomical elements during or in conjunction with a surgical procedure. The robot 114 may include one or more robotic arms 116. In some embodiments, robotic arm 116 may include a first robotic arm and a second robotic arm, but robot 114 may include more than two robotic arms. In some embodiments, one or more of the robotic arms 116 may be used to hold and/or manipulate the imaging device 112. In embodiments where the imaging device 112 includes two or more physically separate components (e.g., a transmitter and a receiver), one robotic arm 116 may hold one such component and another robotic arm 116 may hold another such component. Each robotic arm 116 may be capable of being positioned independently of the other robotic arms. The robotic arm 116 may be controlled in a single shared coordinate space or in separate coordinate spaces.
The robot 114 along with the robot arm 116 may have, for example, one, two, three, four, five, six, seven, or more degrees of freedom. Further, the robotic arm 116 may be positioned in or capable of being positioned in any pose, plane, and/or focus. The pose includes a position and an orientation. Thus, the imaging device 112, surgical tool, or other object held by the robot 114 (or more specifically, by the robotic arm 116) may be able to be precisely positioned at one or more desired and specific positions and orientations.
The robotic arm 116 may include one or more sensors that enable the processor 104 (or the processor of the robot 114) to determine the precise pose of the robotic arm (and any objects or elements held by or fixed to the robotic arm) in space.
In some embodiments, the reference markers (i.e., navigation markers) may be placed on the robot 114 (including, for example, on the robotic arm 116), the imaging device 112, or any other object in the surgical space. The reference marks may be tracked by the navigation system 118 and the results of the tracking may be used by the robot 114 and/or by an operator of the system 100 or any component thereof. In some embodiments, the navigation system 118 may be used to track other components of the system (e.g., the imaging device 112), and the system may operate without the use of the robot 114 (e.g., the surgeon manually manipulates the imaging device 112 and/or one or more surgical tools, e.g., based on information and/or instructions generated by the navigation system 118).
During operation, the navigation system 118 may provide navigation to the surgeon and/or surgical robot. The navigation system 118 may be any known or future developed navigation system, including, for example, the Medtronic StealthStation™ S8 surgical navigation system or any of its successors. The navigation system 118 may include one or more cameras or other sensors for tracking one or more reference markers, navigation trackers, or other objects within the operating room or other room in which part or all of the system 100 is located. The one or more cameras may be optical cameras, infrared cameras, or other cameras. In some embodiments, navigation system 118 may include one or more electromagnetic sensors. In various embodiments, the navigation system 118 may be used to track the position and orientation (e.g., pose) of the imaging device 112, the robot 114, and/or the robotic arm 116, and/or one or more surgical tools (or more specifically, to track the pose of a navigation tracker attached directly or indirectly in a fixed relationship to one or more of the foregoing). The navigation system 118 can include a display for displaying one or more images from an external source (e.g., the computing device 102, the imaging device 112, or other sources) or for displaying images and/or video streams from one or more cameras or other sensors of the navigation system 118. In some implementations, the system 100 may operate without the use of the navigation system 118. The navigation system 118 may be configured to provide guidance to a surgeon or other user of the system 100 or component thereof, to the robot 114 or any other element of the system 100 regarding, for example, the pose of one or more anatomical elements, whether the tool is in an appropriate trajectory, and/or how to move the tool into an appropriate trajectory according to a preoperative or other surgical plan to perform a surgical task.
Database 130 may store information relating one coordinate system to another coordinate system (e.g., relating one or more robotic coordinate systems to a patient coordinate system and/or a navigation coordinate system). Database 130 may additionally or alternatively store, for example, one or more surgical plans (including, for example, pose information about targets and/or image information about anatomy of a patient at and/or near a surgical site for use by robot 114, navigation system 118, and/or a user of computing device 102 or system 100); in combination with one or more useful images of the surgery done by or with the assistance of one or more other components of the system 100; and/or any other useful information. Database 130 may be configured to provide any such information to computing device 102 or any other device of system 100 or any other device external to system 100, whether directly or via cloud 134. In some embodiments, database 130 may be or include a portion of a hospital image storage system, such as a Picture Archiving and Communication System (PACS), a Health Information System (HIS), and/or another system for collecting, storing, managing, and/or transmitting electronic medical records including image data.
Cloud 134 may be or represent the internet or any other wide area network. The computing device 102 may connect to the cloud 134 via the communication interface 108 using a wired connection, a wireless connection, or both. In some implementations, the computing device 102 can communicate with the database 130 and/or an external device (e.g., a computing device) via the cloud 134.
The system 100 or similar system may be used, for example, to carry out one or more aspects of any of the methods 500, 600, and/or 700 described herein. The system 100 or similar system may also be used for other purposes.
Fig. 2 illustrates an imaging system diagram 200 in accordance with at least one embodiment of the present disclosure. As shown, the imaging system diagram 200 may include a first imaging device 202 and a second imaging device 204. In some examples, the first imaging device 202 may be a camera or a camera system and the second imaging device 204 may be an X-ray machine or an X-ray system. One or both of the imaging devices 202, 204 may be similar or identical to the imaging device 112 depicted and described in connection with fig. 1. As described herein, the image captured by the first imaging device 202 may be used to determine the location in which to place the second imaging device 204 in a room (e.g., an operating room or other type of room for a different medical procedure). In some examples, the first imaging device 202 may be used to capture an image of its surroundings. For example, the first imaging device may capture a first image of the target environment 206 to identify a location of a different object within the target environment 206, such as the patient 208 (e.g., the object), an operator of the imaging system, various equipment in the target environment (e.g., including the second imaging device 204), and so forth. Although shown as being located on the ceiling, the first imaging device 202 may be placed at other locations in the room, so long as the first imaging device 202 is capable of capturing images of the target environment 206 and at least the patient 208.
After a first image of the target environment 206 including at least the patient 208 has been captured using the first imaging device 202, as described herein, coordinates of interest may be selected from the first image, where the coordinates of interest include a particular region of interest of the patient 208 that is required for imaging (e.g., performing a subsequent surgical procedure and/or for assisting an ongoing surgical procedure). In some examples, an operator of the imaging system may manually input or select coordinates of interest from the first image. Additionally or alternatively, the imaging system may autonomously identify and select coordinates of interest (e.g., based on previously received instructions or data). As will be discussed in greater detail with reference to fig. 3, when selecting the coordinates of interest, the operator and/or the imaging system may select a target line that passes through a particular region of interest of the patient 208.
Subsequently, after the coordinates of interest have been selected on the first image, a processor (e.g., the processor 104 and/or the computing device 102 as described with reference to fig. 1) may calculate the location of the coordinates of interest (e.g., the target line) and the location of the second imaging device 204 within the room. Additionally, the processor may calculate the distance and determine the direction of the coordinates of interest relative to the second imaging device 204. For example, the processor may generate a set of real world coordinates corresponding to a particular region of interest of the patient 208 based on the coordinates of interest, and may calculate the distance required to move the second imaging device 204 from its initial position to those real world coordinates. In some examples, the processor may generate the set of real world coordinates based on a mapping between a set of pixel coordinates associated with the first image captured from the first imaging device 202 and the real world coordinates. The mapping between the set of pixel coordinates and the real world coordinates will be described in more detail with reference to fig. 4.
After calculating the distance and determining the direction of the coordinates of interest (e.g., the set of real world coordinates), the imaging system may use the calculated distance and the determined direction to position the second imaging device 204 to the first location. In some examples, the imaging system may output the calculated distance (e.g., the guidance information) to a user interface (e.g., user interface 110 as described with reference to fig. 1) for an operator of the imaging system to move the second imaging device 204 in accordance with the output. Additionally or alternatively, the imaging system may automatically move the second imaging system 204 to the first position based on the coordinates of interest and the calculated distance.
After the second imaging device 204 has been positioned to the first location, the imaging system may verify that the first location corresponds to the coordinates of interest using one or more additional images captured by the first imaging device 202. That is, the imaging system may check whether the second imaging device 204 is accurately positioned based on the coordinates of interest and the current location (e.g., the first location) of the second imaging device 204. If the first location does not correspond to coordinates of interest, the processor of the imaging system may again calculate the distance (and determine the direction) that the second imaging device 204 needs to move. Thus, the imaging system may position the second imaging device 204 according to the newly calculated distance (e.g., autonomously or by the operator based on the newly calculated distance output to the operator). Additionally or alternatively, after the location of the second imaging device 204 has been verified as corresponding to the coordinates of interest (e.g., after one or more given number of attempts to position the second imaging device 204), the second imaging device 204 may be used to capture and display a second image (e.g., an X-ray image) of those areas of interest of the patient 208. Subsequently, any associated surgical and/or other medical procedures may occur with the second imaging device 204 now properly positioned.
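To make the verify-and-reposition loop described above concrete, the following Python sketch outlines one possible implementation. It is illustrative only: the callables, the tolerance, and the attempt limit are assumptions and are not part of this disclosure.

```python
import numpy as np

TOLERANCE_MM = 5.0   # assumed acceptance threshold for positioning error
MAX_ATTEMPTS = 5     # assumed limit on repositioning attempts

def position_second_imaging_device(get_device_world_xyz, move_device_by, target_world_xyz):
    """Iteratively move the X-ray machine until it reaches the target real-world
    coordinates derived from the selected coordinates of interest.

    get_device_world_xyz: callable returning the machine's current (x, y, z),
        e.g., from a fresh overhead-camera image and a marker on the machine.
    move_device_by: callable that moves the machine (or instructs the operator
        to move it) by a given (dx, dy, dz) offset.
    """
    for _ in range(MAX_ATTEMPTS):
        current = np.asarray(get_device_world_xyz(), dtype=float)
        offset = np.asarray(target_world_xyz, dtype=float) - current
        if np.linalg.norm(offset) <= TOLERANCE_MM:
            return True   # position verified against the coordinates of interest
        move_device_by(offset)  # guidance information or autonomous adjustment
    return False
```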
Fig. 3 illustrates an imaging system diagram 300 in accordance with at least one embodiment of the present disclosure. As shown, the imaging system diagram 300 may include an imaging device 302, which may be an example (e.g., an X-ray machine) of the second imaging device 204 as described with reference to fig. 2. In some examples, the imaging device 302 (e.g., of an X-ray system) may have a limited FOV, and thus it is important to ensure that the imaging device 302 is properly positioned so that the region of interest of the patient 304 (or, more generally, the "object") that needs to be imaged is captured accurately within the limited FOV. As described with reference to fig. 2, although not shown in the example of fig. 3, an additional imaging device (e.g., the first imaging device 202 of fig. 2, such as a camera) may be used to help place the imaging device 302 at the correct location to capture one or more regions of interest of the patient 304.
In some examples, the additional imaging device may capture a first image (e.g., a digital image or video output to a user interface associated with the imaging system, such as the user interface 110 described with reference to fig. 1) of the target environment (e.g., an operating room), where the first image includes at least the imaging device 302 and the patient 304. Subsequently, coordinates of interest corresponding to the region of interest of the patient 304 may be selected on the first image. For example, the operator may draw a target line 306 on the first image that marks a region of interest of the patient 304 (e.g., the target location at which the imaging device 302 is to be placed to capture a subsequent image of the region of interest, such as an X-ray image). After the target line 306 and/or coordinates of interest have been entered or selected, a computing device and/or processor associated with the imaging system (e.g., the computing device 102 and/or processor 104 as described with reference to fig. 1) may calculate or determine a location 308 (e.g., a set of real world coordinates) of the target line 306 and a location 310 of the imaging device 302. As will be described in greater detail with reference to fig. 4, the computing device and/or processor associated with the imaging system may calculate or determine the location 308 and the location 310 based on a mapping between pixel coordinates associated with the first image and real world coordinates of the target line 306 and the imaging device 302, respectively.
Once the position 308 and the position 310 have been calculated/determined, a distance 312 between the position 308 of the target line 306 and the position 310 of the imaging device 302 (e.g., the distance of the target line 306 relative to the current or initial position of the imaging device 302) may be calculated. Additionally, the direction in which the imaging device 302 needs to move to reach the target line 306 may be determined and/or calculated based on the locations 308 and 310. The distance 312 (and the determined direction) may then be used to position the imaging device 302 at a first location corresponding to the target line 306. In some examples, the computing device and/or the processor may output the distance 312 (and the determined direction) to the operator, such that the operator moves the imaging device 302 to the first position based on the output. Additionally or alternatively, the computing device and/or the processor may autonomously move the imaging device 302 to the first position based on the distance 312. In some examples, distance 312 (and the direction in which imaging device 302 is moved) may be referred to as guidance information as described herein.
As previously described with reference to fig. 2, the imaging system may use additional images (e.g., second image, third image, etc.) captured by additional imaging devices (e.g., cameras or camera systems) to verify whether the imaging device 302 is accurately positioned at the target line 306 after being moved to the first position. If the first position does not correspond to the target line 306, another distance for adjusting the position of the imaging device 302 may be calculated and the imaging device 302 may be moved according to this other distance (e.g., by an operator or autonomously). These steps may be repeated until the imaging device 302 is accurately positioned with respect to the target line 306. After the position of the imaging device 302 has been verified as accurate relative to the target line 306, the imaging device 302 may be used to capture images (e.g., X-ray images) of the region of interest of the patient 304 (e.g., for imaging and diagnostic procedures, for surgical procedures, etc.).
Fig. 4 illustrates a set of coordinate maps 400 in accordance with at least one embodiment of the present disclosure. As described herein, an imaging system (e.g., a robotic surgical imaging system) is provided that includes at least a first imaging device (e.g., a camera or camera system) and a second imaging device (e.g., an X-ray machine or X-ray system), wherein images captured from the first imaging device and inputs on those captured images are used to accurately position the second imaging device so that the second imaging device can then capture additional images of a region of interest of an object (e.g., a patient in an operating room) in a target environment.
The set of coordinate maps 400 may be used to calculate real world coordinates from different locations of images captured from the first imaging device. For example, the real world coordinates of the target location of the second imaging device to be placed may be determined from input on the captured image (e.g., coordinates of interest, target line 306 as described with reference to fig. 3, etc.) and the real world coordinates of the second imaging device (e.g., using a marker on top of the second imaging device). Subsequently, a distance between the real world coordinates of the target location and the real world coordinates of the second imaging device may be calculated to move the second imaging device according to the distance (e.g., autonomously or manually). Thus, the set of coordinate maps 400 provided in the example of fig. 4 may be used to map a set of pixel coordinates from an image captured by a first imaging device and corresponding to a target location and a current or initial location of a second imaging device to a respective plurality of sets of real world coordinates, and vice versa (e.g., from a plurality of sets of real world coordinates to a plurality of sets of pixel coordinates, e.g., to display a distance moved by the second imaging device on a user interface).
The set of coordinate maps 400 may include a first rotation map 402, a second rotation map 404, and a third rotation map 406. The first rotation map 402 may represent rotation about a first axis (e.g., the x-axis) and may indicate how a second axis (e.g., the y-axis) and a third axis (e.g., the z-axis) are affected by rotation about the first axis (e.g., by an angle φ). This rotation about the first axis can be given by the following equation (1):
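In a standard formulation, which is assumed here, this rotation matrix is:

$$R_x(\phi)=\begin{bmatrix}1 & 0 & 0\\ 0 & \cos\phi & -\sin\phi\\ 0 & \sin\phi & \cos\phi\end{bmatrix}\qquad(1)$$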
The second rotation map 404 may represent rotation about a second axis (e.g., the y-axis) and may indicate how the first axis (e.g., the x-axis) and the third axis (e.g., the z-axis) are affected by rotation about the second axis (e.g., by an angle θ). This rotation about the second axis can be given by the following equation (2):
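Under the same assumed convention, the rotation about the y-axis is:

$$R_y(\theta)=\begin{bmatrix}\cos\theta & 0 & \sin\theta\\ 0 & 1 & 0\\ -\sin\theta & 0 & \cos\theta\end{bmatrix}\qquad(2)$$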
The third rotation map 406 may represent rotation about a third axis (e.g., the z-axis) and may indicate how the first axis (e.g., the x-axis) and the second axis (e.g., the y-axis) are affected by rotation about the third axis (e.g., by an angle ψ). This rotation about the third axis can be given by the following equation (3):
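Under the same assumed convention, the rotation about the z-axis is:

$$R_z(\psi)=\begin{bmatrix}\cos\psi & -\sin\psi & 0\\ \sin\psi & \cos\psi & 0\\ 0 & 0 & 1\end{bmatrix}\qquad(3)$$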
After using one or more of equations (1), (2), and (3) for rotation about the respective axes, an overall motion matrix can be formed, which is given by equation (4) below:
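A common form of such a motion matrix, assumed here, combines a composite rotation R3×3 with a translation T3×1 in homogeneous coordinates:

$$O=\begin{bmatrix}R_{3\times 3} & T_{3\times 1}\\ 0_{1\times 3} & 1\end{bmatrix},\qquad R_{3\times 3}=R_z(\psi)\,R_y(\theta)\,R_x(\phi)\qquad(4)$$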
Using the motion matrix and associated algorithms, the computing devices and/or processors of the imaging systems described herein may calculate the old position (e.g., as given by X1, Y1, and Z1) and/or the target position (e.g., as given by X2, Y2, and Z2, the coordinates of interest, etc.) of the second imaging device. In equation (4), 'O' may represent the "implementation location" given a set of coordinates; the 'R3x3' matrix may represent the real-time coordinates (e.g., the XYZ coordinates determined from the rotation maps and corresponding equations); and the 'T3x1' matrix may represent the movement to the target location (e.g., the movement from the real-time coordinates to the target location).
The set of coordinate maps 400 may also include a first coordinate map 408 and a second coordinate map 410 that may be used to map image pixel coordinates to/from real world coordinates (e.g., using a camera and/or image coordinates). For example, using the first coordinate map 408, the following image pixel coordinate mappings may be determined, as given below in equations (5), (6), and (7):
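The standard image-plane-to-pixel relationships, assumed here with pixel sizes dx and dy and principal point (u0, v0), are:

$$u=\frac{x}{d_x}+u_0\qquad(5)$$

$$v=\frac{y}{d_y}+v_0\qquad(6)$$

$$\begin{bmatrix}u\\ v\\ 1\end{bmatrix}=\begin{bmatrix}1/d_x & 0 & u_0\\ 0 & 1/d_y & v_0\\ 0 & 0 & 1\end{bmatrix}\begin{bmatrix}x\\ y\\ 1\end{bmatrix}\qquad(7)$$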
Subsequently, using the second coordinate map 410, a camera coordinate map may be determined, as given below in equation (8):
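In the standard pinhole-camera formulation assumed here, the camera coordinates (Xc, Yc, Zc) of a point are related to its real world coordinates (Xw, Yw, Zw) by a rotation R and a translation T:

$$\begin{bmatrix}X_c\\ Y_c\\ Z_c\\ 1\end{bmatrix}=\begin{bmatrix}R & T\\ 0^{\mathsf T} & 1\end{bmatrix}\begin{bmatrix}X_w\\ Y_w\\ Z_w\\ 1\end{bmatrix}\qquad(8)$$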
Thus, based on equations (1) through (8) and the various coordinate maps shown in the example of fig. 4, pixel locations of a particular region of an image captured by a first imaging device (e.g., a target location input on the captured image, a current or initial location of a second imaging device, etc.) may be mapped to real world coordinates, where differences (e.g., distances) between locations may be precisely determined. Additionally, the pixel coordinates/positions may be output to an operator of the imaging system to indicate how far the operator needs to adjust the imaging system (e.g., how far to move the second imaging device). In some examples, the transformation between real world coordinates and pixel coordinates may be given by the following equation (9):
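In the same assumed pinhole model, chaining equations (5) through (8) with a focal length f gives the overall transformation from real world coordinates to pixel coordinates:

$$Z_c\begin{bmatrix}u\\ v\\ 1\end{bmatrix}=\begin{bmatrix}1/d_x & 0 & u_0\\ 0 & 1/d_y & v_0\\ 0 & 0 & 1\end{bmatrix}\begin{bmatrix}f & 0 & 0 & 0\\ 0 & f & 0 & 0\\ 0 & 0 & 1 & 0\end{bmatrix}\begin{bmatrix}R & T\\ 0^{\mathsf T} & 1\end{bmatrix}\begin{bmatrix}X_w\\ Y_w\\ Z_w\\ 1\end{bmatrix}\qquad(9)$$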
Using equation (9), the computing device and/or processor of the imaging system may convert between real world coordinates, camera coordinates, image coordinates, and pixel coordinates. As shown by the transformation in equation (9) above, the imaging system described herein may therefore give the operator clearer guidance on how to move the second imaging device (e.g., an X-ray system or an X-ray machine).
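A minimal sketch, assuming the first imaging device has been calibrated (known intrinsic parameters fx, fy, u0, v0, a known pose R and T, and a known working depth such as the height of the operating table below a ceiling-mounted camera), of how a pixel selected in the first image might be mapped to real world coordinates and turned into a move distance for the second imaging device:

```python
import numpy as np

def intrinsic_matrix(fx, fy, u0, v0):
    """Pixel-scale and principal-point terms of equations (5)-(8), with fx = f/dx and fy = f/dy."""
    return np.array([[fx, 0.0, u0],
                     [0.0, fy, v0],
                     [0.0, 0.0, 1.0]])

def pixel_to_world(u, v, depth, K, R, T):
    """Invert equation (9): back-project pixel (u, v) at a known camera-frame depth to world coordinates."""
    cam = depth * (np.linalg.inv(K) @ np.array([u, v, 1.0]))  # camera coordinates (Xc, Yc, Zc)
    return R.T @ (cam - T)                                    # undo the extrinsic rotation/translation

def move_vector(target_world, device_world):
    """Direction and length of the move required of the second imaging device."""
    delta = np.asarray(target_world, dtype=float) - np.asarray(device_world, dtype=float)
    return delta, float(np.linalg.norm(delta))
```

In this sketch the returned distance could either drive autonomous positioning or be shown to the operator as guidance, consistent with the autonomous and manual options described above.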
Fig. 5 depicts a method 500 that may be used, for example, to identify the current position of an imaging device relative to one or more regions of interest of an object and to calculate the distance the imaging device must move to more accurately capture those regions of interest.
The method 500 (and/or one or more steps thereof) may be carried out or otherwise performed, for example, by at least one processor. The at least one processor may be the same as or similar to the processor 104 of the computing device 102 described above. The at least one processor may be part of a robot, such as the robot 114, or part of a navigation system, such as the navigation system 118. The method 500 may also be performed using a processor other than any of the processors described herein. The at least one processor may perform the method 500 by executing elements stored in a memory, such as the memory 106. The elements stored in the memory and executed by the processor may cause the processor to perform one or more steps or functions of the method 500. One or more portions of the method 500 may be performed by the processor executing any of the contents of the memory (e.g., image processing 120, segmentation 122, transformation 124, and/or registration 128).
The method 500 includes capturing a first image of a target environment using a first imaging device, wherein the first image includes objects included in the target environment (step 502). For example, the target environment may comprise an operating room, wherein the first image comprises at least an image of the patient. Additionally, the first imaging device may comprise a camera or a camera system. In some examples, the first imaging device may be located above the target environment (e.g., on the ceiling of an operating room) or may be located elsewhere in the target environment such that the first image still includes the object.
The method 500 further includes selecting a coordinate of interest included in the first image, the coordinate of interest including at least a portion of the object (step 504). For example, as described with reference to fig. 3, a target line may be selected that passes through the portion of the object, where the coordinates of interest include the target line.
The method 500 further includes generating a set of real world coordinates corresponding to the portion of the object based on a mapping between the set of pixel coordinates associated with the first image and the set of real world coordinates (step 506). For example, a set of real world coordinates may be generated as described with reference to fig. 4.
The method 500 further includes positioning a second imaging device to the first location based on the set of real world coordinates (step 508). That is, the second imaging device may be placed at a location corresponding to the generated real world coordinates, which in turn should correspond to the coordinates of interest.
The method 500 further includes displaying a second image captured using a second imaging device, wherein the second image includes at least the portion of the object (step 510). For example, if the second imaging device is located at the coordinates of interest, the second imaging device may then be used to capture an image (e.g., an X-ray image) of the portion of the object (e.g., the region of interest of the patient).
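For illustration only, the flow of steps 502 through 510 might be expressed as a short pipeline; the injected callables (capture_first, select_roi, map_pixel_to_world, position_device, capture_second, show) are hypothetical placeholders rather than an actual API of the system:

```python
def run_method_500(capture_first, select_roi, map_pixel_to_world,
                   position_device, capture_second, show):
    """Steps 502-510 expressed over injected callables (all hypothetical placeholders)."""
    first_image = capture_first()             # step 502: overview image of the target environment
    u, v = select_roi(first_image)            # step 504: coordinates of interest in the first image
    target_world = map_pixel_to_world(u, v)   # step 506: pixel coordinates -> real world coordinates
    position_device(target_world)             # step 508: move the second imaging device into place
    show(capture_second())                    # step 510: capture and display the second image
```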
The present disclosure encompasses embodiments of the method 500 comprising more or fewer steps than those described above and/or one or more steps different from the steps described above.
Fig. 6 depicts a method 600 that may be used, for example, to verify the position of a second imaging device as described herein with respect to a given set of coordinates of interest.
The method 600 (and/or one or more steps thereof) may be carried out or otherwise performed, for example, by at least one processor. The at least one processor may be the same as or similar to the processor 104 of the computing device 102 described above. The at least one processor may be part of a robot, such as the robot 114, or part of a navigation system, such as the navigation system 118. The method 600 may also be performed using a processor other than any of the processors described herein. The at least one processor may perform the method 600 by executing elements stored in a memory, such as the memory 106. The elements stored in the memory and executed by the processor may cause the processor to perform one or more steps or functions of the method 600. One or more portions of the method 600 may be performed by the processor executing any of the contents of the memory (e.g., image processing 120, segmentation 122, transformation 124, and/or registration 128).
The method 600 includes capturing a first image of a target environment using a first imaging device, wherein the first image includes objects included in the target environment (step 602). The method 600 further includes selecting a coordinate of interest included in the first image, the coordinate of interest including at least a portion of the object (step 604). The method 600 further includes generating a set of real world coordinates corresponding to the portion of the object based on a mapping between the set of pixel coordinates associated with the first image and the set of real world coordinates (step 606). The method 600 further includes positioning a second imaging device to a first location based on the set of real world coordinates (step 608).
The method 600 further includes capturing a third image of the target environment using the first imaging device after the second imaging device has been positioned to the first location (step 610). The method 600 also includes verifying that the second imaging device is at the coordinates of interest based on the third image, the coordinates of interest, the set of real world coordinates, the set of pixel coordinates, or a combination thereof (step 612). For example, the first imaging device may be used to capture additional images of the moved second imaging device to verify that the second imaging device has been accurately moved to the coordinates of interest. If the second imaging device is accurately positioned (e.g., its location has been verified to be correct), the method 600 may continue to step 614. Alternatively, if the verification determines that the second imaging device is not located at the coordinates of interest, the second imaging device may be repositioned to a second location (e.g., autonomously, or manually by the operator based on displayed guidance information).
The method 600 further includes displaying a second image captured using a second imaging device, wherein the second image includes at least the portion of the object (step 614). For example, after the location of the second imaging device has been verified as accurate, the second imaging device may be used to capture and display a second image.
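The verification-and-reposition behaviour of steps 610 and 612 might be sketched as follows; the tolerance, the retry count, and the marker-based localization callable are assumptions introduced for illustration:

```python
import numpy as np

def verify_and_reposition(locate_device_world, move_device, target_world,
                          tolerance_mm=5.0, max_attempts=3):
    """Step 610: re-image the scene; step 612: verify the second imaging device's position.

    locate_device_world() should return the device's current real world position, e.g., by
    detecting a marker on top of the device in a fresh image from the first imaging device.
    move_device(offset) moves the device (or instructs the operator to move it) by the given
    world-space offset. Returns True once the device is within tolerance of the target.
    """
    target = np.asarray(target_world, dtype=float)
    for _ in range(max_attempts):
        current = np.asarray(locate_device_world(), dtype=float)
        error = target - current
        if np.linalg.norm(error) <= tolerance_mm:
            return True        # position verified; proceed to capture the second image (step 614)
        move_device(error)     # reposition toward the coordinates of interest
    return False               # not verified after max_attempts; further adjustment needed
```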
The present disclosure encompasses embodiments of method 600 comprising more or fewer steps than those described above and/or one or more steps different from the steps described above.
Fig. 7 depicts a method 700 that may be used, for example, to guide an operator of the imaging system described herein when the operator manually moves the second imaging device to a target location.
The method 700 (and/or one or more steps thereof) may be carried out or otherwise performed, for example, by at least one processor. The at least one processor may be the same as or similar to the processor 104 of the computing device 102 described above. The at least one processor may be part of a robot, such as the robot 114, or part of a navigation system, such as the navigation system 118. The method 700 may also be performed using a processor other than any of the processors described herein. The at least one processor may perform the method 700 by executing elements stored in a memory, such as the memory 106. The elements stored in the memory and executed by the processor may cause the processor to perform one or more steps or functions of the method 700. One or more portions of the method 700 may be performed by the processor executing any of the contents of the memory (e.g., image processing 120, segmentation 122, transformation 124, and/or registration 128).
The method 700 includes capturing a first image of a target environment using a first imaging device, wherein the first image includes objects included in the target environment (step 702). The method 700 further includes selecting a coordinate of interest included in the first image, the coordinate of interest including at least a portion of the object (step 704). The method 700 further includes generating a set of real world coordinates corresponding to the portion of the object based on a mapping between the set of pixel coordinates associated with the first image and the set of real world coordinates (step 706).
The method 700 further includes calculating a distance to move the second imaging device to the first position to capture a second image including at least the portion of the object, wherein the distance is calculated based on the coordinates of interest, the initial position of the second imaging device, the real world coordinates, or a combination thereof (step 708). In some examples, guidance information may be displayed to assist the operator in locating the second imaging device, where the guidance information includes the calculated distance. Additionally, guidance information may be displayed to the operator based on the set of pixel coordinates associated with the first image (e.g., converting the calculated distance back and forth between pixel coordinates, real world coordinates, camera coordinates, and image coordinates as described with reference to fig. 4).
The method 700 further includes positioning the second imaging device to the first location based on the set of real world coordinates (e.g., and the calculated distance) (step 710). The method 700 further includes displaying a second image captured using the second imaging device, the second image including at least the portion of the object (step 712).
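As a sketch of how the guidance information of step 708 might be prepared for display (the overlay format and the millimetre units are assumptions), the computed world-space move can be projected back into the first image's pixel coordinates using the forward form of equation (9):

```python
import numpy as np

def world_to_pixel(point_world, K, R, T):
    """Forward mapping of equation (9): real world coordinates to pixel coordinates."""
    cam = R @ np.asarray(point_world, dtype=float) + T  # camera coordinates
    uv1 = K @ (cam / cam[2])                            # perspective divide by the camera-frame depth
    return uv1[:2]

def guidance_overlay(device_world, target_world, K, R, T):
    """Step 708: express the required move both as a distance and as an on-screen arrow.

    Returns the world-space distance in the calibration units (e.g., millimetres) together
    with start and end pixel positions that a user interface could draw over the first image.
    """
    delta = np.asarray(target_world, dtype=float) - np.asarray(device_world, dtype=float)
    start_px = world_to_pixel(device_world, K, R, T)
    end_px = world_to_pixel(target_world, K, R, T)
    return float(np.linalg.norm(delta)), start_px, end_px
```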
The present disclosure encompasses embodiments of method 700 that include more or fewer steps than those described above and/or one or more steps that differ from the steps described above.
As described above, the present disclosure encompasses methods having fewer than all of the steps identified in fig. 5, 6, and 7 (and corresponding descriptions of methods 500, 600, and 700), as well as methods including additional steps beyond those identified in fig. 5, 6, and 7 (and corresponding descriptions of methods 500, 600, and 700). The present disclosure also encompasses methods comprising one or more steps from one method described herein and one or more steps from another method described herein. Any of the correlations described herein may be or include registration or any other correlation.
The foregoing is not intended to limit the disclosure to the form or forms disclosed herein. In the foregoing detailed description, for example, various features of the disclosure are grouped together in one or more aspects, embodiments, and/or configurations for the purpose of streamlining the disclosure. Features of the aspects, embodiments, and/or configurations of the disclosure may be combined in alternative aspects, embodiments, and/or configurations other than those discussed above. This method of disclosure is not to be interpreted as reflecting an intention that the claims require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed aspect, embodiment, and/or configuration. Thus, the following claims are hereby incorporated into this detailed description, with each claim standing on its own as a separate preferred embodiment of the disclosure.
Furthermore, while the foregoing has included descriptions of one or more aspects, embodiments and/or configurations, and certain variations and modifications, other variations, combinations, and modifications are within the scope of this disclosure, e.g., as may be within the skill and knowledge of those in the art, after understanding the present disclosure. It is intended to obtain rights which include alternative aspects, embodiments, and/or configurations to the extent permitted, including alternate, interchangeable and/or equivalent structures, functions, ranges or steps to those claimed, whether or not such alternate, interchangeable and/or equivalent structures, functions, ranges or steps are disclosed herein, and without intending to publicly dedicate any patentable subject matter.

Claims (20)


Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
PCT/CN2022/073939 (WO2023141800A1) | 2022-01-26 | 2022-01-26 | Mobile x-ray positioning system

Publications (1)

Publication Number | Publication Date
CN118660668A (en) | 2024-09-17

Family

ID=87470130

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN202280089774.9A (CN118660668A, Pending) | Mobile X-ray Positioning System | 2022-01-26 | 2022-01-26

Country Status (3)

Country | Publication
EP (1) | EP4468963A1 (en)
CN (1) | CN118660668A (en)
WO (1) | WO2023141800A1 (en)


Also Published As

Publication Number | Publication Date
EP4468963A1 (en) | 2024-12-04
WO2023141800A1 (en) | 2023-08-03


Legal Events

Code | Title
PB01 | Publication
SE01 | Entry into force of request for substantive examination
