US20170202624A1 - Device and method for assisting laparoscopic surgery utilizing a touch screen - Google Patents

Device and method for assisting laparoscopic surgery utilizing a touch screen

Info

Publication number
US20170202624A1
US20170202624A1 (application US 15/317,121)
Authority
US
United States
Prior art keywords
surgical
tool
movement
movements
rule
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/317,121
Inventor
Gal ATAROT
Tal Nir
Motti FRIMER
Tami Harel
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Transenterix Europe Sarl
Original Assignee
MST Medical Surgery Technologies Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by MST Medical Surgery Technologies Ltd
Priority to US 15/317,121
Assigned to M.S.T. MEDICAL SURGERY TECHNOLOGIES LTD: assignment of assignors interest (see document for details); assignors: ATAROT, Gal; FRIMER, Motti; HAREL, Tami; NIR, Tal
Publication of US20170202624A1
Assigned to TRANSENTERIX EUROPE, S.A.R.L.: assignment of assignors interest (see document for details); assignor: M.S.T. MEDICAL SURGERY TECHNOLOGIES LTD.
Status: Abandoned

Abstract

A surgical controlling system, comprising: at least one surgical tool configured to be inserted into a surgical environment of a human body; at least one location estimating means configured for real-time localization of the 3D spatial position of said at least one surgical tool at any given time t; at least one movement detection means communicable with a movement's database and with said location estimating means; a controller having a processing means communicable with a controller's database, said controller configured to provide instructions for moving said at least one surgical tool; said controller's database is in communication with said movement detection means; input receiving means configured to receive input and to convert said input to at least one location within said surgical environment; and at least one display configured to provide, in real time, an image of at least a portion of said surgical environment; wherein said controller is configured to direct said surgical tool to said location via said instructions provided by said controller; further wherein said location is updated in real time on said display as said at least one surgical tool is moved.
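The abstract describes a closed loop: an input is converted to a target location, the controller issues movement instructions toward it, and the display is refreshed as the tool moves. Below is a minimal Python sketch of that loop; the callables (get_tool_position, send_move_command, update_display), the tolerance, and all names are illustrative assumptions, not elements defined by the patent.

```python
import numpy as np

def direct_tool_to_location(target_xyz, get_tool_position, send_move_command,
                            update_display, tolerance_mm=2.0):
    """Drive a tracked surgical tool toward a target 3D location and refresh
    the display with the tool's current position on every iteration.

    get_tool_position, send_move_command and update_display are hypothetical
    callables standing in for the patent's location estimating means,
    maneuvering subsystem and display, respectively."""
    position = np.asarray(get_tool_position(), dtype=float)
    target = np.asarray(target_xyz, dtype=float)
    while np.linalg.norm(target - position) > tolerance_mm:
        step = target - position                                  # instruction toward the target
        send_move_command(step)                                   # controller-issued movement instruction
        position = np.asarray(get_tool_position(), dtype=float)   # real-time localization
        update_display(position, target)                          # location updated on the display
    return position
```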

Description

Claims (34)

167. A surgical controlling system, comprising:
a. at least one surgical tool configured to be inserted into a surgical environment of a human body for assisting a surgical procedure;
b. at least one location estimating means configured for real-time localization of the 3D spatial position of said at least one surgical tool at any given time t;
c. at least one movement detection means communicable with a movement's database and with said location estimating means; said movement's database is configured to store said 3D spatial position of said at least one surgical tool at time tf and at time t0;
where tf > t0; said movement detection means is configured to detect movement of said at least one surgical tool if the 3D spatial position of said at least one surgical tool at time tf is different than said 3D spatial position of said at least one surgical tool at time t0; and,
d. a controller having a processing means communicable with a controller's database, said controller configured to control the spatial position of said at least one surgical tool;
said controller's database is in communication with said movement detection means;
said controller configured to provide instructions for moving said at least one surgical tool; and
e. input receiving means configured to receive input and to convert said input to at least one location within said surgical environment of said human body; and
f. at least one display configured to real time provide an image of at least a portion of said surgical environment;
wherein said controller is configured to direct said surgical tool to said location via said instructions provided by said controller;
further wherein said location is real time updated on said display as said at least one surgical tool is moved.
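Element (c) of claim 167 defines movement detection as a comparison of the tool's stored 3D positions at times t0 and tf. A minimal sketch of such a movement detection means, assuming a small noise tolerance that the claim itself does not specify, might look like this:

```python
import numpy as np

class MovementDetector:
    """Sketch of the claimed movement detection means: the movement's database
    stores a tool's 3D position at an earlier time t0 and a later time tf, and
    movement is reported when the two differ (here, by more than a tolerance
    added to absorb tracking noise; the tolerance is an assumption)."""

    def __init__(self, tolerance=1e-3):
        self.tolerance = tolerance
        self.positions = {}          # the "movement's database": tool_id -> [(t, xyz), ...]

    def record(self, tool_id, t, xyz):
        self.positions.setdefault(tool_id, []).append((t, np.asarray(xyz, dtype=float)))

    def has_moved(self, tool_id):
        samples = self.positions.get(tool_id, [])
        if len(samples) < 2:
            return False
        (_, p0), (_, pf) = samples[0], samples[-1]   # positions at t0 and tf, with tf > t0
        return float(np.linalg.norm(pf - p0)) > self.tolerance
```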
168. The surgical controlling system of claim 167, wherein said input receiving means is selected from a group consisting of: (a) a touchscreen in wired or wireless communication with said controller, configured to display an image of at least a portion of said surgical environment of said human body and to receive input of said at least one location within said surgical environment of said human body; (b) at least one first camera, in wired or wireless communication with said controller, configured to detect movement of a user's eye; (c) at least one second camera, in wired or wireless communication with said controller, configured to detect movement of at least one body portion of the user; said body portion is selected from a group consisting of hand, elbow, leg, foot, finger and any combination thereof; (d) at least one voice receiving means, in wired or wireless communication with said controller, configured to detect sound; and any combination thereof.
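Claim 168 enumerates alternative input receiving means (touchscreen, eye-tracking camera, body-tracking camera, voice). One way to picture their common behavior, converting whatever raw input arrives into a single target location, is the dispatch sketch below; the modality names and converter functions are hypothetical, not taken from the patent.

```python
def location_from_input(modality, payload, converters):
    """Convert an input of any supported modality into one (x, y, z) target
    location inside the surgical environment.  `converters` maps a modality
    name to a conversion function supplied by the caller."""
    try:
        convert = converters[modality]
    except KeyError:
        raise ValueError(f"unsupported input modality: {modality!r}")
    return convert(payload)   # (x, y, z) location handed to the controller

# Example wiring (hypothetical converter functions):
# converters = {
#     "touchscreen": lambda uv: pixel_to_3d(*uv),      # tap on the displayed image
#     "eye":         lambda gaze: gaze_to_3d(gaze),    # first camera, eye movement
#     "body":        lambda pose: pose_to_3d(pose),    # second camera, body portion
#     "voice":       lambda text: named_site_to_3d(text),
# }
```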
179. The surgical controlling system of claim 178, wherein at least one of the following holds true:
a. said route rule comprises a communicable database storing a predefined route in which said at least one surgical tool is configured to move within said surgical environment; said predefined route comprises n 3D spatial positions of said at least one surgical tool; n is an integer greater than or equal to 2; said allowed movements are movements in which said at least one surgical tool is located substantially in at least one of said n 3D spatial positions of said predefined route, and said restricted movements are movements in which said location of said at least one surgical tool is substantially different from said n 3D spatial positions of said predefined route;
b. said environmental rule comprises a communicable database; said communicable database is configured to receive at least one real-time image of said surgical environment, to perform real-time image processing of the same and to determine the 3D spatial position of hazards or obstacles in said surgical environment; said environmental rule is configured to determine said allowed and restricted movements according to said hazards or obstacles in said surgical environment, such that said restricted movements are movements in which said at least one surgical tool is located substantially in at least one of said 3D spatial positions, and said allowed movements are movements in which the location of said at least one surgical tool is substantially different from said 3D spatial positions;
c. said operator input rule comprises a communicable database; said communicable database is configured to receive an input from the operator of said system regarding said allowed and restricted movements of said at least one surgical tool;
d. said proximity rule is configured to define a predetermined distance between at least two surgical tools; said allowed movements are movements which are within the range or out of the range of said predetermined distance, and said restricted movements are movements which are out of the range or within the range of said predetermined distance;
e. said proximity rule is configured to define a predetermined angle between at least three surgical tools; said allowed movements are movements which are within the range or out of the range of said predetermined angle, and said restricted movements are movements which are out of the range or within the range of said predetermined angle;
f. said collision prevention rule is configured to define a predetermined distance between said at least one surgical tool and an anatomical element within said surgical environment; said allowed movements are movements which are in a range that is larger than said predetermined distance, and said restricted movements are movements which are in a range that is smaller than said predetermined distance; said anatomical element is selected from a group consisting of tissue, organ, another surgical tool and any combination thereof;
g. said right tool rule is configured to determine said allowed movement of said endoscope according to the movement of the surgical tool positioned to the right of said endoscope; further wherein said left tool rule is configured to determine said allowed movement of said endoscope according to the movement of the surgical tool positioned to the left of said endoscope;
h. said tagged tool rule comprises means configured to tag at least one surgical tool within said surgical environment and to determine said allowed movement of said endoscope so as to constantly track the movement of said tagged surgical tool;
i. said field of view rule comprises a communicable database comprising n 3D spatial positions; n is an integer greater than or equal to 2; the combination of all of said n 3D spatial positions provides a predetermined field of view; said field of view rule is configured to determine said allowed movement of said endoscope within said n 3D spatial positions so as to maintain a constant field of view, such that said allowed movements are movements in which said endoscope is located substantially in at least one of said n 3D spatial positions, and said restricted movements are movements in which the location of said endoscope is substantially different from said n 3D spatial positions;
j. said preferred volume zone rule comprises a communicable database comprising n 3D spatial positions; n is an integer greater than or equal to 2; said n 3D spatial positions provide said preferred volume zone; said preferred volume zone rule is configured to determine said allowed movement of said endoscope within said n 3D spatial positions and restricted movement of said endoscope outside said n 3D spatial positions, such that said allowed movements are movements in which said endoscope is located substantially in at least one of said n 3D spatial positions, and said restricted movements are movements in which the location of said endoscope is substantially different from said n 3D spatial positions;
k. said preferred tool rule comprises a communicable database, said database stores a preferred tool; said preferred tool rule is configured to determine said allowed movement of said endoscope to constantly track the movement of said preferred tool;
l. said no fly zone rule comprises a communicable database comprising n 3D spatial positions; n is an integer greater than or equal to 2; said n 3D spatial positions define a predetermined volume within said surgical environment; said no fly zone rule is configured to determine said restricted movement if said movement is within said no fly zone and allowed movement if said movement is outside said no fly zone, such that said restricted movements are movements in which said at least one surgical tool is located substantially in at least one of said n 3D spatial positions, and said allowed movements are movements in which the location of said at least one surgical tool is substantially different from said n 3D spatial positions;
m. said most used tool rule comprises a communicable database counting the amount of movement of each said surgical tool; said most used tool rule is configured to constantly position said endoscope to track the movement of the most moved surgical tool;
n. wherein said system further comprises a maneuvering subsystem communicable with said controller, said maneuvering subsystem is configured to spatially reposition said at least one surgical tool during a surgery according to said predetermined set of rules; further wherein said system is configured to alert the physician of said restricted movement of said at least one surgical tool; said alert is selected from a group consisting of audio signaling, voice signaling, light signaling, flashing signaling and any combination thereof;
o. said history-based rule comprises a communicable database storing each 3D spatial position of each said surgical tool, such that each movement of each surgical tool is stored; said history-based rule is configured to determine said allowed and restricted movements according to historical movements of said at least one surgical tool, such that said allowed movements are movements in which said at least one surgical tool is located substantially in at least one of said 3D spatial positions, and said restricted movements are movements in which the location of said at least one surgical tool is substantially different from said 3D spatial positions;
p. said tool-dependent allowed and restricted movements rule comprises a communicable database; said communicable database is configured to store predetermined characteristics of at least one said surgical tool; said tool-dependent allowed and restricted movements rule is configured to determine said allowed and restricted movements according to said predetermined characteristics of said surgical tool, such that allowed movements are movements of said endoscope which track said surgical tool having said predetermined characteristics;
q. said movement detection rule comprises a communicable database comprising the real-time 3D spatial positions of each said surgical tool; said movement detection rule is configured to detect movement of said at least one surgical tool when a change in said 3D spatial positions is received, such that said allowed movements are movements in which said endoscope is re-directed to focus on said moving surgical tool; and any combination thereof.
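Most of the rules in claim 179 reduce to geometric predicates over the tracked 3D positions. As an illustration only, the sketch below implements two of them, the proximity rule of item (d) and the no fly zone rule of item (l); the tolerance radius and parameter names are assumptions rather than claim language.

```python
import numpy as np

def proximity_rule_allows(tool_a_xyz, tool_b_xyz, predetermined_distance,
                          allow_within=True):
    """Proximity rule, item (d): classify a configuration by whether the two
    tools are within the predetermined distance.  The claim permits either
    orientation ("within the range or out of the range"), so allow_within
    selects which side counts as allowed."""
    d = float(np.linalg.norm(np.asarray(tool_a_xyz, float) - np.asarray(tool_b_xyz, float)))
    within = d <= predetermined_distance
    return within if allow_within else not within

def no_fly_zone_rule_allows(tool_xyz, zone_positions, zone_radius):
    """No fly zone rule, item (l): the zone is given as n 3D positions; a
    movement is restricted when the tool lies substantially at any of them,
    here meaning within zone_radius (an assumed tolerance)."""
    tool = np.asarray(tool_xyz, dtype=float)
    zone = np.asarray(zone_positions, dtype=float).reshape(-1, 3)
    inside = bool(np.any(np.linalg.norm(zone - tool, axis=1) <= zone_radius))
    return not inside   # allowed only when outside the predefined volume
```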
183. A method of using a surgical controlling system, comprising steps of:
a. providing a surgical controlling system comprising:
i. at least one surgical tool configured to be inserted into a surgical environment of a human body for assisting a surgical procedure;
ii. at least one location estimating means configured for real-time localization of the 3D spatial position of said at least one surgical tool at any given time t;
iii. at least one movement detection means communicable with a movement's database and with said location estimating means; said movement's database is configured to store said 3D spatial position of said at least one surgical tool at time tf and at time t0; where tf > t0; said movement detection means is configured to detect movement of said at least one surgical tool if the 3D spatial position of said at least one surgical tool at time tf is different than said 3D spatial position of said at least one surgical tool at time t0; and,
iv. a controller having a processing means communicable with a controller's database, said controller configured to control the spatial position of said at least one surgical tool; said controller's database is in communication with said movement detection means;
v. input receiving means configured to receive input and to convert said input to at least one location within said surgical environment of said human body; and
vi. at least one display configured to real time provide an image of at least a portion of said surgical environment;
b. inserting at least one said surgical tool into said surgical environment;
c. displaying said at least a portion of said surgical environment;
d. receiving input and converting said input to at least one location within said surgical environment of said human body;
e. estimating the 3D spatial position of at least one said surgical tool; and
f. directing and moving said surgical tool to said location via instructions provided by said controller;
wherein said location is real time updated on said display as said at least one surgical tool is moved.
184. The method of claim 183, additionally comprising steps of selecting said input receiving means from a group consisting of: (a) at least one touchscreen in wired or wireless communication with said controller, configured to display an image of at least a portion of said surgical environment of said human body and to receive input of said at least one location within said surgical environment of said human body; (b) at least one first camera, in wired or wireless communication with said controller, configured to detect movement of a user's eye; (c) at least one second camera, in wired or wireless communication with said controller, configured to detect movement of at least one body portion of the user; said body portion is selected from a group consisting of hand, elbow, leg, foot, finger and any combination thereof; (d) at least one voice receiving means, in wired or wireless communication with said controller, configured to detect sound; and any combination thereof.
185. The method of claim 184, additionally comprising steps of receiving said input in a manner selected from a group consisting of: (a) providing said at least one touchscreen, displaying said image of at least a portion of said surgical environment via said touchscreen, and determining said location within said surgical environment of said human body from pressure on a portion of said touchscreen; (b) providing said at least one first camera, detecting said movement of said eye, and determining said location within said surgical environment of said human body from proportional movement of said eye; (c) providing said at least one second camera, detecting said movement of said body portion of the user, and determining said location within said surgical environment of said human body from proportional movement of said body portion; (d) providing at least one voice receiving means, in wired or wireless communication with said controller, and determining said location within said surgical environment of said human body from said sound; and any combination thereof.
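Claim 185, items (b) and (c), determine the location from "proportional movement" of the eye or of a body portion. A minimal sketch of one way to read that conversion, with an assumed scaling gain, is:

```python
def location_from_proportional_movement(current_location, movement_vector, gain=1.0):
    """Apply a measured displacement of the user's eye or body portion,
    scaled by an assumed gain, to the current location to obtain the new
    target location in the surgical environment."""
    return tuple(c + gain * m for c, m in zip(current_location, movement_vector))

# e.g. a 2 mm eye movement to the right with gain 5 shifts the target 10 mm:
# location_from_proportional_movement((10.0, 20.0, 30.0), (2.0, 0.0, 0.0), gain=5.0)
# -> (20.0, 20.0, 30.0)
```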
186. A surgical controlling system, comprising:
a. at least one surgical tool configured to be inserted into a surgical environment of a human body for assisting a surgical procedure;
b. at least one location estimating means configured for real-time localization of the 3D spatial position of said at least one surgical tool at any given time t;
c. at least one movement detection means communicable with a movement's database and with said location estimating means; said movement's database is configured to store said 3D spatial position of said at least one surgical tool at time tf and at time t0; where tf > t0; said movement detection means is configured to detect movement of said at least one surgical tool if the 3D spatial position of said at least one surgical tool at time tf is different than said 3D spatial position of said at least one surgical tool at time t0; and,
d. a controller having a processing means communicable with a controller's database, said controller configured to control the spatial position of said at least one surgical tool;
said controller's database is in communication with said movement detection means;
said controller configured to provide instructions for moving said at least one surgical tool; and
e. input receiving means configured to receive input and to convert said input to at least one location within said surgical environment of said human body; and
f. at least one display configured to real time provide an image of at least a portion of said surgical environment;
wherein said controller is configured to direct said surgical tool to said location via said instructions provided by said controller;
further wherein said location is real time updated on said display as said at least one surgical tool is moved;
further wherein said input receiving means is a touchscreen in wired or wireless communication with said controller, configured to display an image of at least a portion of said surgical environment of said human body and to receive input of at least one location within said surgical environment of said human body.
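Claim 186 requires only that a touch on the displayed image be converted to a location within the surgical environment; it does not prescribe how. One plausible sketch, assuming a calibrated pinhole camera model for the endoscope, a per-pixel depth estimate, and a camera-to-world transform (all assumptions beyond the claim), is:

```python
import numpy as np

def tap_to_surgical_location(u, v, depth_mm, fx, fy, cx, cy, cam_to_world):
    """Convert a touchscreen tap at pixel (u, v) on the endoscope image into a
    3D location in the surgical environment.  fx, fy, cx, cy are assumed
    pinhole intrinsics; cam_to_world is an assumed 4x4 homogeneous transform."""
    x = (u - cx) / fx * depth_mm            # back-project the pixel to camera coordinates
    y = (v - cy) / fy * depth_mm
    p_cam = np.array([x, y, depth_mm, 1.0])
    p_world = np.asarray(cam_to_world, dtype=float) @ p_cam
    return tuple(p_world[:3])               # 3D target handed to the controller
```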
199. The surgical controlling system of claim 198, wherein at least one of the following holds true:
a. said route rule comprises a communicable database storing a predefined route in which said at least one surgical tool is configured to move within said surgical environment; said predefined route comprises n 3D spatial positions of said at least one surgical tool; n is an integer greater than or equal to 2; said allowed movements are movements in which said at least one surgical tool is located substantially in at least one of said n 3D spatial positions of said predefined route, and said restricted movements are movements in which said location of said at least one surgical tool is substantially different from said n 3D spatial positions of said predefined route;
b. said environmental rule comprises a communicable database; said communicable database is configured to receive at least one real-time image of said surgical environment, to perform real-time image processing of the same and to determine the 3D spatial position of hazards or obstacles in said surgical environment; said environmental rule is configured to determine said allowed and restricted movements according to said hazards or obstacles in said surgical environment, such that said restricted movements are movements in which said at least one surgical tool is located substantially in at least one of said 3D spatial positions, and said allowed movements are movements in which the location of said at least one surgical tool is substantially different from said 3D spatial positions;
c. said operator input rule comprises a communicable database; said communicable database is configured to receive an input from the operator of said system regarding said allowed and restricted movements of said at least one surgical tool;
d. said proximity rule is configured to define a predetermined distance between at least two surgical tools; said allowed movements are movements which are within the range or out of the range of said predetermined distance, and said restricted movements are movements which are out of the range or within the range of said predetermined distance;
e. said proximity rule is configured to define a predetermined angle between at least three surgical tools; said allowed movements are movements which are within the range or out of the range of said predetermined angle, and said restricted movements are movements which are out of the range or within the range of said predetermined angle;
f. said collision prevention rule is configured to define a predetermined distance between said at least one surgical tool and an anatomical element within said surgical environment; said allowed movements are movements which are in a range that is larger than said predetermined distance, and said restricted movements are movements which are in a range that is smaller than said predetermined distance; said anatomical element is selected from a group consisting of tissue, organ, another surgical tool and any combination thereof;
g. said right tool rule is configured to determine said allowed movement of said endoscope according to the movement of the surgical tool positioned to the right of said endoscope; further wherein said left tool rule is configured to determine said allowed movement of said endoscope according to the movement of the surgical tool positioned to the left of said endoscope;
h. said tagged tool rule comprises means configured to tag at least one surgical tool within said surgical environment and to determine said allowed movement of said endoscope so as to constantly track the movement of said tagged surgical tool;
i. said field of view rule comprises a communicable database comprising n 3D spatial positions; n is an integer greater than or equal to 2; the combination of all of said n 3D spatial positions provides a predetermined field of view; said field of view rule is configured to determine said allowed movement of said endoscope within said n 3D spatial positions so as to maintain a constant field of view, such that said allowed movements are movements in which said endoscope is located substantially in at least one of said n 3D spatial positions, and said restricted movements are movements in which the location of said endoscope is substantially different from said n 3D spatial positions;
j. said preferred volume zone rule comprises a communicable database comprising n 3D spatial positions; n is an integer greater than or equal to 2; said n 3D spatial positions provide said preferred volume zone; said preferred volume zone rule is configured to determine said allowed movement of said endoscope within said n 3D spatial positions and restricted movement of said endoscope outside said n 3D spatial positions, such that said allowed movements are movements in which said endoscope is located substantially in at least one of said n 3D spatial positions, and said restricted movements are movements in which the location of said endoscope is substantially different from said n 3D spatial positions;
k. said preferred tool rule comprises a communicable database, said database stores a preferred tool; said preferred tool rule is configured to determine said allowed movement of said endoscope to constantly track the movement of said preferred tool;
l. said no fly zone rule comprises a communicable database comprising n 3D spatial positions; n is an integer greater than or equal to 2; said n 3D spatial positions define a predetermined volume within said surgical environment; said no fly zone rule is configured to determine said restricted movement if said movement is within said no fly zone and allowed movement if said movement is outside said no fly zone, such that said restricted movements are movements in which said at least one surgical tool is located substantially in at least one of said n 3D spatial positions, and said allowed movements are movements in which the location of said at least one surgical tool is substantially different from said n 3D spatial positions;
m. said most used tool rule comprises a communicable database counting the amount of movement of each said surgical tool; said most used tool rule is configured to constantly position said endoscope to track the movement of the most moved surgical tool;
n. wherein said system further comprises a maneuvering subsystem communicable with said controller, said maneuvering subsystem is configured to spatially reposition said at least one surgical tool during a surgery according to said predetermined set of rules; further wherein said system is configured to alert the physician of said restricted movement of said at least one surgical tool; said alert is selected from a group consisting of audio signaling, voice signaling, light signaling, flashing signaling and any combination thereof;
o. said history-based rule comprises a communicable database storing each 3D spatial position of each said surgical tool, such that each movement of each surgical tool is stored; said history-based rule is configured to determine said allowed and restricted movements according to historical movements of said at least one surgical tool, such that said allowed movements are movements in which said at least one surgical tool is located substantially in at least one of said 3D spatial positions, and said restricted movements are movements in which the location of said at least one surgical tool is substantially different from said 3D spatial positions;
p. said tool-dependent allowed and restricted movements rule comprises a communicable database; said communicable database is configured to store predetermined characteristics of at least one said surgical tool; said tool-dependent allowed and restricted movements rule is configured to determine said allowed and restricted movements according to said predetermined characteristics of said surgical tool, such that allowed movements are movements of said endoscope which track said surgical tool having said predetermined characteristics;
q. said movement detection rule comprises a communicable database comprising the real-time 3D spatial positions of each said surgical tool; said movement detection rule is configured to detect movement of said at least one surgical tool when a change in said 3D spatial positions is received, such that said allowed movements are movements in which said endoscope is re-directed to focus on said moving surgical tool; and any combination thereof.
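Claim 199 repeats the rule set of claim 179 for the touchscreen-based system. To complement the earlier sketch, the example below illustrates two further rules, the collision prevention rule of item (f) and the tagged tool rule of item (h); the smoothing fraction and all names are illustrative assumptions.

```python
import numpy as np

def collision_prevention_rule_allows(tool_xyz, anatomy_xyz, predetermined_distance):
    """Collision prevention rule, item (f): a movement is allowed only while
    the tool stays farther from the anatomical element than the predetermined
    distance."""
    d = float(np.linalg.norm(np.asarray(tool_xyz, float) - np.asarray(anatomy_xyz, float)))
    return d > predetermined_distance

def tagged_tool_rule_target(tagged_tool_xyz, endoscope_xyz, step_fraction=0.2):
    """Tagged tool rule, item (h): the allowed endoscope movement is the one
    that keeps tracking the tagged tool; here the endoscope target is nudged a
    fraction of the way toward the tagged tool's current position
    (step_fraction is an assumed smoothing parameter, not claim language)."""
    tagged = np.asarray(tagged_tool_xyz, dtype=float)
    scope = np.asarray(endoscope_xyz, dtype=float)
    return tuple(scope + step_fraction * (tagged - scope))
```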
US 15/317,121 | Priority date: 2014-06-08 | Filing date: 2015-06-08 | Device and method for assisting laparoscopic surgery utilizing a touch screen | Abandoned | US20170202624A1 (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
US 15/317,121 (US20170202624A1) | 2014-06-08 | 2015-06-08 | Device and method for assisting laparoscopic surgery utilizing a touch screen

Applications Claiming Priority (3)

Application Number | Priority Date | Filing Date | Title
US201462009240P | 2014-06-08 | 2014-06-08
US 15/317,121 (US20170202624A1) | 2014-06-08 | 2015-06-08 | Device and method for assisting laparoscopic surgery utilizing a touch screen
PCT/IL2015/050579 (WO2015189839A1) | 2014-06-08 | 2015-06-08 | Device and method for assisting laparoscopic surgery utilizing a touch screen

Related Parent Applications (1)

Application Number | Relation | Publication | Priority Date | Filing Date | Title
PCT/IL2015/050579 | A-371-Of-International | WO2015189839A1 (en) | 2014-06-08 | 2015-06-08 | Device and method for assisting laparoscopic surgery utilizing a touch screen

Related Child Applications (1)

Application Number | Relation | Publication | Priority Date | Filing Date | Title
US 17/962,440 | Continuation | US12364546B2 (en) | 2014-06-08 | 2022-10-07 | Device and method for assisting laparoscopic surgery utilizing a touch screen

Publications (1)

Publication Number | Publication Date
US20170202624A1 (en) | 2017-07-20

Family

ID=54832991

Family Applications (2)

Application Number | Status | Publication | Priority Date | Filing Date | Title
US 15/317,121 | Abandoned | US20170202624A1 (en) | 2014-06-08 | 2015-06-08 | Device and method for assisting laparoscopic surgery utilizing a touch screen
US 17/962,440 | Active | US12364546B2 (en) | 2014-06-08 | 2022-10-07 | Device and method for assisting laparoscopic surgery utilizing a touch screen

Family Applications After (1)

Application Number | Status | Publication | Priority Date | Filing Date | Title
US 17/962,440 | Active | US12364546B2 (en) | 2014-06-08 | 2022-10-07 | Device and method for assisting laparoscopic surgery utilizing a touch screen

Country Status (2)

Country | Link
US (2) | US20170202624A1 (en)
WO (1) | WO2015189839A1 (en)



Also Published As

Publication number | Publication date
WO2015189839A1 (en) | 2015-12-17
US12364546B2 (en) | 2025-07-22
US20230040952A1 (en) | 2023-02-09

Similar Documents

Publication | Title
US12364546B2 | Device and method for assisting laparoscopic surgery utilizing a touch screen
US11957301B2 | Device and method for assisting laparoscopic surgery—rule based approach
US11185315B2 | Device and method for assisting laparoscopic surgery—rule based approach
US9757206B2 | Device and method for assisting laparoscopic surgery—rule based approach
EP2754383B1 | Device and method for assisting laparoscopic surgery - rule based approach
EP2744389B1 | Device for assisting laparoscopic surgery - rule based approach
US10052157B2 | Device and method for assisting laparoscopic surgery—rule based approach
WO2014049598A1 | Directing and maneuvering articulating a laparoscopic surgery tool

Legal Events

Date | Code | Title | Description

AS | Assignment
Owner name: M.S.T. MEDICAL SURGERY TECHNOLOGIES LTD, ISRAEL
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ATAROT, GAL;NIR, TAL;FRIMER, MOTTI;AND OTHERS;REEL/FRAME:040762/0075
Effective date: 20161219

AS | Assignment
Owner name: TRANSENTERIX EUROPE, S.A.R.L., SWITZERLAND
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:M.S.T. MEDICAL SURGERY TECHNOLOGIES LTD.;REEL/FRAME:047947/0438
Effective date: 20181031

STPP | Information on status: patent application and granting procedure in general
Free format text: NON FINAL ACTION MAILED

STPP | Information on status: patent application and granting procedure in general
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP | Information on status: patent application and granting procedure in general
Free format text: NON FINAL ACTION MAILED

STPP | Information on status: patent application and granting procedure in general
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP | Information on status: patent application and granting procedure in general
Free format text: FINAL REJECTION MAILED

STPP | Information on status: patent application and granting procedure in general
Free format text: NON FINAL ACTION MAILED

STPP | Information on status: patent application and granting procedure in general
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP | Information on status: patent application and granting procedure in general
Free format text: FINAL REJECTION MAILED

STPP | Information on status: patent application and granting procedure in general
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP | Information on status: patent application and granting procedure in general
Free format text: NON FINAL ACTION MAILED

STPP | Information on status: patent application and granting procedure in general
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP | Information on status: patent application and granting procedure in general
Free format text: FINAL REJECTION MAILED

STCB | Information on status: application discontinuation
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

