RELATED APPLICATIONS
This application is a Continuation of U.S. patent application Ser. No. 16/605,743, filed on Oct. 16, 2019, which is the U.S. National Stage Entry of International Patent Application No. PCT/US2018/054395, filed on Oct. 4, 2018, which claims the benefit of priority to U.S. Provisional Patent Application No. 62/568,267, filed on Oct. 4, 2017, the disclosures of each of which are hereby incorporated by reference in their entirety.
BACKGROUND
Surgical procedures, such as minimally-invasive procedures, may require a surgeon to insert surgical tools inside the body of the patient to a particular depth to reach the target area inside the patient's body. For example, minimally invasive spinal surgical procedures have been used for stabilization of vertebral bones and spinal joints and for relieving pressure applied to the spinal nerves. Such procedures may utilize relatively small incisions and insertion of tubular retractors and cannulas while minimizing damage to muscles and other surrounding anatomical features. Minimally invasive surgical approaches can be faster and safer, and can require less recovery time, than conventional open surgeries. There is a continuing need for improvement to the safety and speed of surgical procedures, such as minimally-invasive surgical procedures.
SUMMARY
Various embodiments include systems and methods for performing spine surgery, including minimally invasive lateral access spine surgery. Embodiments include a retractor apparatus that may be used for robot-assisted minimally invasive lateral access spine surgery.
Embodiments include a retractor apparatus for a surgical robotic system that includes a frame defining a central open region, a connecting member that connects the frame to a robotic arm, a plurality of coupling mechanisms for attaching a set of retractor blades within the central open region of the frame such that the blades define a working channel interior of the blades, and a plurality of actuators extending between the frame and each of the coupling mechanisms and configured to move the blades with respect to the frame to vary a dimension of the working channel.
Further embodiments include a surgical robotic system that includes a robotic arm and a retractor apparatus attached to the robotic arm, where the retractor apparatus includes a frame defining a central open region, a connecting member that connects the frame to the robotic arm, a plurality of coupling mechanisms for attaching a set of retractor blades within the central open region of the frame such that the blades define a working channel interior of the blades, and a plurality of actuators extending between the frame and each of the coupling mechanisms and configured to move the blades with respect to the frame to vary a dimension of the working channel.
Further embodiments include a method for performing a robot-assisted surgical procedure that includes controlling a robotic arm having a frame of a retractor apparatus attached thereto to position the frame over a pre-set trajectory into the body of a patient, attaching a plurality of retractor blades to the frame such that the blades define a working channel into the body of the patient, and moving at least one retractor blade relative to the frame to vary a dimension of the working channel.
BRIEF DESCRIPTION OF THE DRAWINGS
Other features and advantages of the present invention will be apparent from the following detailed description of the invention, taken in conjunction with the accompanying drawings of which:
FIG. 1 illustrates a robotic-assisted surgical system according to an embodiment.
FIGS. 2A-2E illustrate an embodiment retractor apparatus for performing lateral-access spine surgery.
FIGS. 3A-3E schematically illustrate a robotic-assisted lateral access spine procedure performed on a patient.
FIGS. 4A-4B illustrate a further embodiment retractor apparatus.
FIG. 5 schematically illustrates a computing device which may be used for performing various embodiments.
DETAILED DESCRIPTION
The various embodiments will be described in detail with reference to the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts. References made to particular examples and implementations are for illustrative purposes, and are not intended to limit the scope of the invention or the claims.
Various embodiments relate to apparatuses and methods for performing spine surgery, including minimally invasive lateral access spine surgery. Embodiments include a retractor apparatus that may be used for robot-assisted minimally invasive lateral access spine surgery.
One common surgical procedure performed on the spine is an interbody fusion, which includes fusing two vertebrae together. To perform this procedure, the intervertebral space between the two vertebrae must be accessed to partially or completely remove the intervertebral disc and to insert an implant, such as a spacer or cage, that maintains the normal alignment of the spine while allowing the two vertebrae to fuse. Conventionally, the surgical space has been accessed from the posterior or anterior of the patient. However, this may require removing bony portions of the vertebral column to access the disc space. In addition, such approaches may risk damage to major vascular structures and other sensitive organs. More recently, a lateral approach has been utilized, in which the surgeon may access certain parts of the spine (e.g., the lumbar region of the spine) from the side of the patient. This may be less invasive for the patient, may result in less trauma, and can reduce operating time and recovery periods.
In various embodiments, a lateral-access spine procedure, such as lateral transpsoas interbody fusion, may be performed using a computer-assisted image guided surgery system. In embodiments, the system may be a surgical robotic system that may include at least one robotic arm that is configured to assist a surgeon in performing a surgical procedure. FIG. 1 illustrates a system 100 for performing computer-assisted image-guided surgery that includes an imaging device 103, a motion tracking system 105 and a robotic arm 101. The robotic arm 101 may be fixed to a support structure at one end and may have an end effector 102 located at the other end of the robotic arm 101. The robotic arm 101 may comprise a multi-joint arm that includes a plurality of linkages connected by joints having actuator(s) and optional encoder(s) to enable the linkages to rotate, bend and/or translate relative to one another in response to control signals from a robot control system. The motions of the robotic arm 101 may enable the end effector 102 to be moved to various positions and/or orientations, such as various positions and/or orientations with respect to a patient (not illustrated) that may be located on a patient support 60 (e.g., surgical table). In various embodiments described in further detail below, the end effector 102 of the robotic arm 101 may include a retractor apparatus that may be used to provide a working channel to a target site within the patient.
The imaging device 103 may be used to obtain diagnostic images of a patient (not shown in FIG. 1), which may be a human or animal patient. In embodiments, the imaging device 103 may be an x-ray computed tomography (CT) imaging device. The patient may be positioned within a central bore 107 of the imaging device 103 and an x-ray source and detector may be rotated around the bore 107 to obtain x-ray image data (e.g., raw x-ray projection data) of the patient. The collected image data may be processed using a suitable processor (e.g., computer) to perform a three-dimensional reconstruction of the object. In other embodiments, the imaging device 103 may comprise one or more of an x-ray fluoroscopic imaging device, a magnetic resonance (MR) imaging device, a positron emission tomography (PET) imaging device, a single-photon emission computed tomography (SPECT) imaging device, or an ultrasound imaging device. In embodiments, image data may be obtained pre-operatively (i.e., prior to performing a surgical procedure), intra-operatively (i.e., during a surgical procedure) or post-operatively (i.e., following a surgical procedure) by positioning the patient within the bore 107 of the imaging device 103. In the system 100 of FIG. 1, this may be accomplished by moving the imaging device 103 over the patient to perform a scan while the patient may remain stationary.
Examples of x-ray CT imaging devices that may be used according to various embodiments are described in, for example, U.S. Pat. No. 8,118,488, U.S. Patent Application Publication No. 2014/0139215, U.S. Patent Application Publication No. 2014/0003572, U.S. Patent Application Publication No. 2014/0265182 and U.S. Patent Application Publication No. 2014/0275953, the entire contents of all of which are incorporated herein by reference. In the embodiment shown in FIG. 1, the patient support 60 (e.g., surgical table) upon which the patient may be located is secured to the imaging device 103, such as via a column 50 which is mounted to a base 20 of the imaging device 103. A portion of the imaging device 103 (e.g., an O-shaped imaging gantry 40) which includes at least one imaging component may translate along the length of the base 20 on rails 23 to perform an imaging scan of the patient, and may translate away from the patient to an out-of-the-way position for performing a surgical procedure on the patient. It will be understood that other imaging devices may be utilized, including other mobile or fixed x-ray CT devices or a C-arm x-ray fluoroscopy device.
Further, although the imaging device 103 shown in FIG. 1 is located close to the patient within the surgical theater, the imaging device 103 may be located remote from the surgical theater, such as in another room or building (e.g., in a hospital radiology department).
The motion tracking system 105 shown in FIG. 1 includes a plurality of marker devices 119, 202 and an optical sensor device 111. Various systems and technologies exist for tracking the position (including location and/or orientation) of objects as they move within a three-dimensional space. Such systems may include a plurality of active or passive markers fixed to the object(s) to be tracked and a sensing device that detects radiation emitted by or reflected from the markers. A 3D model of the space may be constructed in software based on the signals detected by the sensing device.
The motion tracking system 105 in the embodiment of FIG. 1 includes a plurality of marker devices 119, 202 and a stereoscopic optical sensor device 111 that includes two or more cameras 207 (e.g., IR cameras). The optical sensor device 111 may include one or more radiation sources (e.g., diode ring(s)) that direct radiation (e.g., IR radiation) into the surgical field, where the radiation may be reflected by the marker devices 119, 202 and received by the cameras. The marker devices 119, 202 may each include three or more (e.g., four) reflecting spheres, which the motion tracking system 105 may use to construct a coordinate system for each of the marker devices 119, 202. A computer 113 may be coupled to the sensor device 111 and may determine the transformations between each of the marker devices 119, 202 and the cameras using, for example, triangulation techniques. A 3D model of the surgical space in a common coordinate system may be generated and continually updated using motion tracking software implemented by the computer 113. In embodiments, the computer 113 may also receive image data from the imaging device 103 and may register the image data to the same common coordinate system as the motion tracking system 105 using image registration techniques as are known in the art. In embodiments, at least one reference marker device may be attached to the patient. The reference marker device may be rigidly attached to a landmark in the anatomical region of interest (e.g., clamped or otherwise attached to a bony portion of the patient's anatomy) to enable the anatomical region of interest to be continually tracked by the motion tracking system 105. Additional marker devices 119 may be attached to surgical tools or instruments 104 to enable the tools/instruments 104 to be tracked within the common coordinate system.
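As a non-limiting sketch of the triangulation techniques referred to above, the depth of a reflecting sphere seen by both cameras of an idealized, rectified stereo pair can be recovered from its pixel disparity. The focal length, baseline and disparity figures used below are illustrative assumptions and are not parameters of any actual tracking system:

```python
def triangulate_depth(focal_px, baseline_m, disparity_px):
    """Depth of a marker from an idealized rectified stereo pair.

    Z = f * B / d, where f is the focal length in pixels, B is the
    camera baseline in meters and d is the disparity in pixels.
    Real tracking systems use full calibration models and refine
    the estimate over many frames; this is the textbook case only.
    """
    if disparity_px <= 0:
        raise ValueError("marker must be visible to both cameras")
    return focal_px * baseline_m / disparity_px

# e.g., a 0.2 m baseline, 2000 px focal length and 250 px disparity
# place the marker about 1.6 m from the cameras
```

A full tracker would triangulate each sphere of a marker device this way and then fit a rigid transform to the known sphere geometry to obtain the marker's coordinate system.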
Another marker device 202 may be rigidly attached to the robotic arm 101, such as on the end effector 102 of the robotic arm 101, to enable the position of the robotic arm 101 and end effector 102 to be tracked using the motion tracking system 105. The computer 113 may also include software configured to perform a transform between the joint coordinates of the robotic arm 101 and the common coordinate system of the motion tracking system 105, which may enable the position and orientation of the end effector 102 of the robotic arm 101 to be controlled with respect to the patient.
In addition to the passive marker devices described above, the motion tracking system 105 may alternately utilize active marker devices that may include radiation emitters (e.g., LEDs) that may emit radiation that is detected by an optical sensor device 111. Each active marker device or set of active marker devices attached to a particular object may emit radiation in a pre-determined strobe pattern (e.g., with modulated pulse width, pulse rate, time slot and/or amplitude) and/or wavelength which may enable different objects to be uniquely identified and tracked by the motion tracking system 105. One or more active marker devices may be fixed relative to the patient, such as secured to the patient's skin via an adhesive membrane or mask. Additional active marker devices may be fixed to surgical tools 104 and/or to the end effector 102 of the robotic arm 101 to allow these objects to be tracked relative to the patient.
In further embodiments, the marker devices may be passive marker devices that include moiré patterns that may enable their position and orientation to be tracked in three-dimensional space by a single camera using Moiré Phase Tracking (MPT) technology. Other tracking technologies, such as computer vision systems and/or magnetic-based tracking systems, may also be utilized.
As shown in FIG. 1, the optical sensor device 111 may include a plurality of cameras 207 mounted to an arm 209 extending above the patient surgical area. The arm 209 may be mounted to or above the imaging device 103. The arm 209 may enable the sensor device 111 to pivot with respect to the arm 209 and/or the imaging device 103 (e.g., via one or more ball joints 213). The arm 209 may enable a user to adjust the position and/or orientation of the sensor device 111 to provide the cameras 207 with a clear view into the surgical field while avoiding obstructions. The arm 209 may enable the position and/or orientation of the sensor device 111 to be adjusted and then locked in place during an imaging scan or surgical procedure.
The system 100 may also include at least one display device 219 as illustrated in FIG. 1. The display device 219 may display image data of the patient's anatomy obtained by the imaging device 103. In the case of CT image data, for example, the display device 219 may display a three-dimensional volume rendering of a portion of the patient's anatomy and/or may display two-dimensional slices (e.g., axial, sagittal and/or coronal slices) through the 3D CT reconstruction dataset. The display device 219 may facilitate planning for a surgical procedure, such as by enabling a surgeon to define one or more target positions in the patient's body and/or a path or trajectory into the patient's body for inserting surgical tool(s) to reach a target position while minimizing damage to other tissue or organs of the patient. The position and/or orientation of one or more objects tracked by the motion tracking system 105 may be shown on the display 219, and may be shown overlaying the image data (e.g., using augmented reality technology). This may enable the surgeon to precisely navigate the tracked tools/implants within the patient's body in real-time. The use of tracked surgical instruments or tools in combination with pre-operative or intra-operative images of the patient's anatomy in order to guide a surgical procedure may be referred to as "image-guided surgery."
In embodiments, the display device 219 may be a handheld computing device, such as a tablet device. One or more handheld display devices 219 may be mounted to the imaging device 103, as shown in FIG. 1. In other embodiments, a handheld display device 219 may be mounted to the patient support 60 or column 50, to the arm 209 that supports the optical sensing device 111 for the motion tracking system 105, to any of the wall, ceiling or floor in the operating room, or to a separate cart. Alternately or in addition, the at least one display device 219 may be a monitor display that may be located on a mobile cart or mounted to another structure (e.g., a wall) within the surgical theater. In further embodiments, a display device 219 may be a head-mounted display that may be worn by a surgeon or other clinician.
As shown in FIG. 1, the robotic arm 101 may be fixed to the imaging device 103, such as on a support element 215 (e.g., a curved rail) that may extend concentrically over the outer surface of the O-shaped gantry 40 of the imaging device 103. In embodiments, an arm 209 to which the optical sensing device 111 is mounted may be mounted to the same or a similar support element 215 (e.g., curved rail) as the robotic arm 101. The position of the robotic arm 101 and/or the arm 209 may be adjustable along the length of the support element 215. In other embodiments, the robotic arm 101 may be secured to any other portion of the imaging device 103, such as directly mounted to the gantry 40. Alternatively, the robotic arm 101 may be mounted to the patient support 60 or column 50, to any of the wall, ceiling or floor in the operating room, or to a separate cart. FIG. 1 illustrates the robotic arm 101 mounted to a support element 215 (curved rail) that is directly attached to the imaging device 103. Alternately, the robotic arm 101 may be mounted to a mobile shuttle 216 that may be moved adjacent to the imaging device 103 such that a support member 218 (e.g., a curved rail) for mounting the robotic arm 101 extends at least partially over the gantry 40 of the imaging device 103. Various exemplary systems for mounting a robotic arm 101 in a computer-assisted image guided surgery system are described in U.S. Provisional Patent Application No. 62/491,645, filed Apr. 28, 2017, the entire contents of which are incorporated by reference herein. Although a single robotic arm 101 is shown in FIG. 1, it will be understood that two or more robotic arms 101 may be utilized. Each robotic arm 101 may include an end effector 102 that may comprise or may be configured to hold an invasive surgical tool or implant.
The at least one robotic arm 101 may aid in the performance of a surgical procedure, such as a minimally-invasive spinal surgical procedure or various other types of orthopedic, neurological, cardiothoracic and general surgical procedures. In embodiments, the motion tracking system 105 may track the position of the robotic arm 101 (e.g., via marker device 202 on end effector 102 as shown in FIG. 1) within the patient coordinate system. A control loop may continuously read the tracking data and the current parameters (e.g., joint parameters) of the robotic arm 101 and may send instructions to a robotic controller to cause the robotic arm 101 to move to a desired position and orientation within the patient coordinate system.
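Such a control loop can be reduced, purely as an illustrative sketch, to a proportional position update; the gain, tolerance, and translation-only state below are assumptions made for brevity here and are not taken from the disclosure (a real controller would also handle orientation, joint limits, and safety interlocks):

```python
def control_step(target_pose, tracked_pose, gain=0.5, tol=1e-3):
    """One iteration of a simplified closed-loop positioning step.

    target_pose and tracked_pose are (x, y, z) positions of the end
    effector in the patient coordinate system, as reported by the
    motion tracking system. Returns the commanded displacement, or
    None once the end effector is within tolerance of the target.
    """
    error = [t - c for t, c in zip(target_pose, tracked_pose)]
    distance = sum(e * e for e in error) ** 0.5
    if distance < tol:
        return None  # aligned with the pre-set trajectory; hold
    # proportional command toward the target (orientation omitted)
    return tuple(gain * e for e in error)
```

Each loop iteration would re-read the tracker so that patient motion automatically shifts the target in the robot's frame.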
In embodiments, a surgeon may use an image-guided surgery system as a planning tool for a surgical procedure, such as by setting trajectories within the patient for inserting surgical tools, as well as by selecting one or more target locations for a surgical intervention within the patient's body. The trajectories and/or target locations set by the surgeon may be saved (e.g., in a memory of a computer device, such as computer device 113 shown in FIG. 1) for later use during surgery. In embodiments, the surgeon may be able to select stored trajectories and/or target locations using an image guided surgery system, and the robotic arm 101 may be controlled to perform a particular movement based on the selected trajectory and/or target location. For example, the robotic arm 101 may be moved to position the end effector 102 of the robotic arm 101 into alignment with the pre-defined trajectory and/or over the pre-determined target location.
In addition to a robotic arm 101 as described above, an end effector 102 of the present embodiments may be attached to a moveable arm or boom, which may be motor-driven or manually moved. The arm may be moved to position the end effector 102 at a desired location with respect to the patient and the arm may be configured to hold its pose during a surgical intervention.
FIGS. 2A-2E schematically illustrate a retractor apparatus 200 for performing lateral-access spine surgery according to an embodiment. FIG. 2A is an overhead view of the retractor apparatus 200 and FIGS. 2B-2E are side views illustrating the retractor apparatus 200 with a plurality of retractor blades 227 mounted therein. The retractor apparatus 200 may be attached to the end of a robotic arm 101 (i.e., the retractor apparatus 200 may function as the end effector 102 of the robotic arm 101), such that the robotic arm 101 may move the retractor apparatus 200 to a desired position and/or orientation with respect to a patient. The retractor apparatus 200 includes a frame 221, which may be made from a rigid structural material. The frame 221 may optionally be made of a radiolucent material. The frame 221 may surround a central open region 225, as shown in FIG. 2A. The frame 221 in the embodiment of FIGS. 2A-2E has a rectangular shape. It will be understood that the frame 221 may have a circular or other shape. A connecting member 223 connects the frame 221 to the end of a robotic arm 101 (not illustrated in FIGS. 2A-2E). In embodiments, the robotic arm 101 may have an attachment mechanism that enables different end effectors 102, such as the retractor apparatus 200, to be attached to and removed from the end of the robotic arm 101. The retractor apparatus 200 may be fastened to the robotic arm 101 using mechanical fasteners (e.g., bolts) and/or via a quick-connect/disconnect mechanism. Alternately, the retractor apparatus 200 may be permanently mounted to the robotic arm 101.
The retractor apparatus 200 may be a sterile or sterilizable component that may not need to be draped during surgery. In some embodiments, the retractor apparatus 200 may be attached to a robotic arm 101 over a surgical drape that covers the arm 101. All or a portion of the retractor apparatus 200 may be a single-use disposable component. Alternately, all or a portion of the retractor apparatus 200 may be a multi-use component that may be re-sterilized (e.g., autoclavable). A marker device 202 (e.g., an array of reflective spheres) may be attached to the retractor apparatus 200 and/or to the robotic arm 101 to enable the retractor apparatus 200 to be tracked by a motion tracking system 105, such as shown in FIG. 1.
The frame 221 may further include a coupling mechanism for mechanically coupling the frame 221 to a plurality of retractor blades 227 (see FIGS. 2B-2E). The retractor blades 227 may extend downwards from the central open region 225 of the frame 221. In this embodiment, the coupling mechanism comprises a plurality of guides 229 through which the retractor blades 227 may be inserted. In other embodiments, the coupling mechanism may be an attachment mechanism that engages with the side or top surface of the blades 227 to couple the blades 227 to the frame 221.
As shown in the side view of FIG. 2B, the retractor blades 227 may slide through the guides 229 to couple the blades 227 to the frame 221. The blades 227 may include clips 231 or another attachment mechanism to attach the blades 227 to the respective guides 229 when the blades 227 are fully inserted.
The retractor apparatus 200 may also include a plurality of actuators 233 for moving the retractor blades 227 radially inwards and outwards with respect to the frame 221. Each actuator 233 may include, for example, a screw, a rack-and-pinion system, or a similar apparatus that extends from the frame 221 into the central open region 225. The actuators 233 may be manually operated using a control knob, handle or other feature that enables a user to extend or retract the blades 227. As shown in FIG. 2A, the frame 221 may include a plurality of sockets 235 into which a torque device (e.g., a key, an Allen wrench, screwdriver, etc.) may be inserted such that bi-directional rotation of the torque device causes the actuator 233 to extend and retract with respect to the frame 221. In alternative embodiments, the actuators 233 may be motor-driven. For example, each actuator 233 may have an associated motor located on or within the frame 221. The motor may drive the extension and retraction of the actuator 233 and blade 227 in response to control signals received from a system controller and/or a user input device.
FIG. 2C is a side view illustrating the retractor apparatus 200 in a first configuration in which the actuators 233 are fully extended from the frame 221 into the central open region 225. The retractor blades 227 may be positioned adjacent to one another and may define a working channel 237 (see FIG. 2A) radially inward from the plurality of blades 227. FIG. 2D illustrates the retractor apparatus 200 in a second configuration in which the blades 227 are retracted by the actuators 233. This is schematically illustrated in the overhead view of FIG. 2A, which shows the blades 227 retracted along the direction of arrows 239. In the first configuration, the retractor blades 227 may define a working channel 237 having a generally circular cross-section and an initial width dimension (i.e., diameter) Di. The blades 227 may be retracted to a second configuration to increase the width dimension (i.e., diameter) of the working channel 237, as shown in FIG. 2A. In embodiments, each blade 227 of the retractor apparatus 200 may be moved (i.e., extended or retracted) via its associated actuator 233 independently of any movement of the other blade(s) 227.
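The effect of blade retraction on the working channel reduces to simple geometry. Assuming, for illustration only, that all blades retract symmetrically by the same radial amount, the channel diameter grows by twice that travel; the dimensions below are hypothetical:

```python
def working_channel_diameter(initial_diameter_mm, radial_travel_mm):
    """Working-channel diameter after symmetric radial retraction.

    Each blade moving radially outward by radial_travel_mm enlarges
    the roughly circular channel by twice that amount. Independent
    per-blade motion (which the apparatus permits) would instead
    produce a non-circular cross-section, not modeled here.
    """
    return initial_diameter_mm + 2.0 * radial_travel_mm

# e.g., a 22 mm initial channel with 4 mm of retraction per blade
# yields a 30 mm channel
```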
In embodiments, the blades 227 may also pivot with respect to the frame 221 of the retractor apparatus 200. This is illustrated by FIG. 2E, which shows a pair of blades 227 (depicted in phantom) that have been pivoted out with respect to the frame 221 in the direction of arrow 239. Pivoting a retractor blade 227 as illustrated may enable the width dimension of the working channel 237 to be increased proximate to the area of surgical intervention (e.g., the spine) while minimizing the size of the opening through the skin surface and peripheral tissue. In one exemplary embodiment, the pivot motion of the blades 227 may be controlled using the same input feature (e.g., control knob, handle, torque device) that is used to extend and retract the blades 227. For example, in the embodiment of FIGS. 2A-2E, turning a torque device in the socket 235 in one direction may cause the actuator 233 to retract the corresponding blade 227 towards the frame 221. After the blade 227 is fully retracted, continuing to turn the torque device in the same direction may cause the blade 227 to pivot outwards as shown in FIG. 2E. Alternately, a set of separate manual controllers (e.g., control knobs, levers, handles, torque devices, etc.) may be utilized to control the pivoting motion of the blades 227. In such a case, the pivoting motion of the blades 227 may be performed independently of the extension and retraction of the blades 227. In addition, each blade 227 may be pivoted independently of any pivoting of the other blade(s) 227. In further embodiments, the pivoting motion of the blades 227 may be motor-driven, as described above.
Each retractor blade 227 may be made from a radiolucent material, such as carbon fiber. The retractor blades 227 may include electrically conductive material that forms one or more continuous electrical pathways through the blade 227. The continuous electrical pathways may be used for performing intraoperative neurophysiological monitoring (IONM), as described further below. The retractor blade(s) 227 and/or the coupling mechanism (e.g., guide(s) 229) may optionally include a port 241 or other electrical connector to enable a probe device to electrically connect to the blade 227 (e.g., for neurophysiological monitoring).
In embodiments, the retractor blades 227 may include one or more channels 243 extending through the blade 227 (shown in phantom in FIG. 2B). A channel 243 in the blade 227 may be utilized for illumination of the surgical area (e.g., by inserting an LED or other light source into the channel 243), for visualization of the surgical area (e.g., by inserting an endoscope into the channel 243) or for any other purpose (e.g., for removal of tissue/fluid via suction or other means).
A retractor apparatus 200 as described above may utilize retractor blades 227 having varying lengths. The length of the blades 227 used for a particular surgical procedure may be chosen based on the depth of the surgical site from the patient's skin surface. This depth may be determined using an image guided surgery system as described above. For example, a surgeon may use a tracked instrument to set a target trajectory and/or target location within the patient's anatomy. Based on the pre-set trajectory and/or location, the image guided surgery system may determine the appropriate size of the retractor blades 227 for insertion into the retractor apparatus 200 out of an available set of sizes for the retractor blades 227. The image guided surgery system may provide an indication to the surgeon (e.g., via a display device 219 as shown in FIG. 1) of an appropriate blade 227 size to use for the surgical procedure. In some embodiments, the image guided surgery system may control the robotic arm 101 to move the frame 221 of the retractor apparatus 200 to a pre-determined distance from the skin surface of the patient such that the tip ends of the selected retractor blades 227 are located at the proper anatomical depth within the patient.
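The blade-size determination described above may be sketched as selecting the shortest available blade that reaches the planned depth; the selection rule and the example lengths below are assumptions made for illustration, not values from the disclosure:

```python
def select_blade_length(target_depth_mm, available_lengths_mm):
    """Choose a retractor blade length for a planned trajectory.

    target_depth_mm is the skin-to-target depth derived from the
    pre-set trajectory; available_lengths_mm is the set of blade
    sizes on hand. Returns the shortest blade at least as long as
    the target depth, so excess length beyond the site is minimal.
    """
    candidates = [length for length in sorted(available_lengths_mm)
                  if length >= target_depth_mm]
    if not candidates:
        raise ValueError("no available blade reaches the target depth")
    return candidates[0]

# e.g., for an 85 mm deep target and 60/80/100/120 mm blades,
# the 100 mm blade would be indicated
```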
The frame 221 of the retractor apparatus may include one or more rails 245 that may extend around the periphery of the central open region 225. The one or more rails 245 may enable tools/instruments to be clipped or clamped on to the retractor apparatus 200. For example, an illumination source or camera system (e.g., endoscope) may be attached to a desired position on a rail 245, and may optionally extend at least partially into the working channel defined by the retractor blades 227. Other tools that may be attached to a rail 245 include, for example, a suction device for removing fluids from the surgical site and/or a shim element that may be inserted into the disc space (e.g., to restore disc height and/or anchor the retractor apparatus 200 to the surgical site).
The retractor apparatus 200 of FIGS. 2A-2E includes four retractor blades 227. However, it will be understood that a retractor apparatus 200 according to various embodiments may have three blades, two blades or greater than four blades (e.g., five blades, six blades, etc.).
As discussed above, a retractor apparatus 200 may be configured to provide intraoperative neurophysiological monitoring (IONM). Use of IONM techniques may enable the surgeon to determine the proximity of tools to nerves and avoid damage or irritation to the nerves during surgery. A variety of IONM methods are known, including electromyography (EMG), such as spontaneous EMG (S-EMG) and stimulus-triggered EMG (T-EMG), as well as somatosensory evoked potentials (SSEPs) and motor evoked potentials (MEPs).
In one embodiment, IONM may be performed by electrically stimulating muscle tissue and neural structures surrounding the surgical area and measuring the evoked EMG response using sensor(s) located on or within the patient's body. A retractor apparatus 200 as described above may include at least one electrode 247 located on a retractor blade 227, as schematically shown in FIG. 2D. The electrode 247 may be configured to electrically stimulate the surrounding tissue when the blade 227 is inserted into the patient. In embodiments, each of the blades 227 of the retractor apparatus 200 may include at least one electrode 247 for stimulating the surrounding tissue. The electrode 247 in FIG. 2D is shown located at the tip end of the retractor blade 227, although it will be understood that the electrode 247 may be located at another position on the blade 227. In addition, a blade 227 may have multiple electrodes 247 located at different positions on the blade 227 for stimulating different portions of the surrounding tissue.
For performing neurophysiological monitoring, each electrode 247 may be electrically connected to a power source 246 (e.g., one or more batteries) and circuitry 249 for generating stimulation signals that may be transmitted to the electrode(s) 247 via a conductive lead 251. The conductive lead 251 may be, for example, a wire located on or within the blade 227 or a conductive trace formed on a surface of the blade 227 via printing, spray coating, etc. In embodiments, the retractor apparatus 200 may include a conductive path 252 to conduct power from the power source 246 to the blade 227. One or more sensors 253 (e.g., surface or needle electrodes) may be positioned at pre-determined locations on the patient's body corresponding to particular muscle(s) and/or neural features to measure the evoked EMG response. A processing device 255 (e.g., computer), operably coupled to the sensor(s) 253, may include a nerve detection component 256 configured to process the sensor data according to defined algorithms to determine the proximity (including distance and/or direction) of a neural structure (e.g., a nerve) to a blade 227 or a portion thereof. The nerve detection component 256 may be implemented in electronic hardware, in computer software, or in combinations of both.
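The proximity determination described above can be sketched in code. The following is a minimal illustration of one common stimulus-triggered EMG approach (lower stimulation threshold implies a closer nerve), not the patent's actual algorithm; the function names, the 100 µV response criterion, and the current thresholds are all assumptions for illustration.

```python
# Hypothetical sketch of threshold-based nerve-proximity estimation
# for T-EMG monitoring. All constants are illustrative assumptions.

EMG_RESPONSE_UV = 100.0  # minimum evoked EMG amplitude (microvolts) counted as a response

def threshold_current(evoked_amplitudes_uv):
    """Return the lowest stimulation current (mA) that evoked an EMG
    response, given {current_mA: peak_amplitude_uV} measurements."""
    responding = [i for i, amp in evoked_amplitudes_uv.items() if amp >= EMG_RESPONSE_UV]
    return min(responding) if responding else None

def proximity_category(i_threshold_ma):
    """Map a threshold current to a coarse proximity category:
    a lower threshold implies the electrode is closer to a nerve."""
    if i_threshold_ma is None:
        return "no nerve detected"
    if i_threshold_ma < 5.0:
        return "nerve adjacent - stop retraction"
    if i_threshold_ma < 10.0:
        return "nerve near - proceed with caution"
    return "nerve distant"

# Example: responses measured while stepping the stimulation current
measurements = {2.0: 20.0, 4.0: 60.0, 6.0: 150.0, 8.0: 400.0}
print(proximity_category(threshold_current(measurements)))  # nerve near - proceed with caution
```

In a real system the categories would feed the user feedback and motorized-stop behaviors described below, rather than being printed.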
The nerve detection component 256 may be coupled to a user feedback device to provide audio and/or visual feedback to the surgeon. For example, the nerve detection component 256 may be coupled to a display device 219 (see FIG. 1) configured to provide feedback in the form of a visual indication on the display device 219 that a particular retractor blade 227 is proximate to a nerve and/or may be impinging on a nerve. In response to the nerve detection component 256 determining that a retractor blade 227 is too close to a nerve, the nerve detection component 256 may cause the display 219 to provide instructions to the surgeon to stop further movement of the blade 227 and/or to move the blade 227 away from the nerve. In some embodiments, a graphical depiction of one or more nerves detected using an IONM method may be overlaid with image data on the display screen of the image guided surgery system. In embodiments in which the movement of the blades 227 is motor-driven as described above, the nerve detection component 256 may be configured to send instructions to a motorized drive system of a blade to cause the motorized drive system to automatically stop movement (e.g., retraction or pivoting) of the blade 227. The nerve detection component 256 may also cause the motorized drive system to move the retractor blade 227 away from a detected nerve.
In embodiments, the nerve detection component 256 may be configured to activate the electrodes 247 on the blades 227 of the retractor apparatus 200 to stimulate the surrounding tissue. The nerve detection component 256 may be operatively coupled to the circuitry 249 and to the electrodes 247 on the retractor blades 227 via a wired or wireless connection. The nerve detection component 256 may be configured to control the characteristics of the stimulation signals, such as the stimulation current, the duration of the signals, and/or the frequency of the signals. In embodiments, stimulation signals may be generated in response to a user input from a user input device. In some embodiments, a plurality of stimulation signals may be generated in a pre-determined sequence or cycle (e.g., each electrode 247 of a plurality of electrodes on the retractor blades 227 may be energized sequentially).
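The pre-determined stimulation cycle can be sketched as follows. This is an illustrative sketch only: the electrode identifiers, default signal parameters, and round-robin scheduling policy are assumptions, not the specific implementation of the circuitry 249.

```python
# Illustrative round-robin stimulation schedule over blade electrodes.
# Parameter defaults and electrode IDs are assumed for this sketch.
from dataclasses import dataclass
from itertools import cycle, islice

@dataclass
class StimulationPulse:
    electrode_id: int    # which electrode to energize
    current_ma: float    # stimulation current (mA)
    duration_ms: float   # pulse duration (ms)
    frequency_hz: float  # pulse repetition frequency (Hz)

def stimulation_cycle(electrode_ids, current_ma=5.0, duration_ms=0.2, frequency_hz=4.0):
    """Yield pulses that energize each electrode in turn, repeating
    indefinitely (a pre-determined sequence or cycle)."""
    for eid in cycle(electrode_ids):
        yield StimulationPulse(eid, current_ma, duration_ms, frequency_hz)

# Example: one full pass over four blade electrodes, then wrap around
pulses = list(islice(stimulation_cycle([1, 2, 3, 4]), 5))
print([p.electrode_id for p in pulses])  # [1, 2, 3, 4, 1]
```

Each yielded pulse would be handed to the signal-generation circuitry, and the recording of evoked responses synchronized to the pulse timing, as the surrounding text describes.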
In the embodiment shown in FIG. 2D, the power source 246 and circuitry 249 for generating stimulation signals are located on the retractor apparatus 200, and may be mounted to or within the frame 221 of the apparatus 200. The circuitry 249 may include wireless transceiver circuitry configured to provide a wireless communication link 254 between the retractor apparatus 200 and an external entity, such as the processing device 255 shown in FIG. 2D. Signals, including data and/or command signals, may be transmitted wirelessly between the retractor apparatus 200 and the processing device 255 using a suitable wireless communication protocol or standard (e.g., an IEEE 802.15.x (BLUETOOTH®) connection or IEEE 802.11 (WiFi) connection). In embodiments, command signals to energize particular electrodes 247 may be received wirelessly from a remote processing device 255. Alternately or in addition, the retractor apparatus 200 may include a user interface component (e.g., one or more buttons) that may be used to trigger the stimulation signals. When the user actuates the user interface component, a signal may be sent to the processing device 255 to enable the device 255 to synchronize the recording of the evoked EMG response(s) with the triggering of the stimulation signal(s).
Although the embodiment of FIG. 2D shows the power source 246 and circuitry 249 for generating stimulation signals located on the retractor apparatus 200, it will be understood that one or both of these components may be omitted from the retractor apparatus 200. For example, the retractor apparatus 200 and/or the individual retractor blades 227 may be connected to a separate neurophysiological monitoring device by a wire connection. In some embodiments, a neurophysiological monitoring device may include a handheld probe that may be selectively coupled to the retractor apparatus 200 or to individual retractor blades 227 to provide nerve stimulation signals. For example, a handheld probe may be inserted into the ports 241 as shown in FIG. 1.
In some embodiments, a connection 257 between the retractor apparatus 200 and the robotic arm 101 may be used for data/control signals and/or to provide power to the retractor apparatus 200, as is schematically illustrated in FIG. 2D. The connection 257 between the robotic arm 101 and the retractor apparatus 200 may need to pass through a sterile barrier (e.g., a surgical drape) covering the arm 101 and may utilize a non-contact transmission mechanism, such as inductive or capacitive coupling and/or optical or other electromagnetic transmission methods.
FIGS. 3A-3E illustrate a method of performing a surgical procedure using a retractor apparatus 200 such as described above. The surgical procedure may be a robot-assisted spinal procedure, such as a minimally-invasive lateral transpsoas interbody fusion. In FIG. 3A, a tracked instrument 304 may be used to define and set a target trajectory or target location within the body of the patient 300. In embodiments, the instrument 304 may be a handheld instrument that may be gripped and easily manipulated by a user (e.g., a surgeon). The instrument 304 may be a handheld pointer or stylus device that may be manipulated by the surgeon to point to or touch various locations on the skin surface of the patient 300. Alternately, the instrument 304 may be an invasive surgical instrument (e.g., dilator, cannula, needle, scalpel, etc.) that may be inserted into the body of the patient. The instrument 304 may further include at least one marker device 319 to enable the instrument 304 to be tracked using a motion tracking system 105, as described above. In this embodiment, the at least one marker device 319 includes an array of reflective spheres that are rigidly fixed to the instrument 304, although other types of active or passive markers may be utilized. The marker device 319 may be in a known, fixed geometric relationship with the instrument 304 such that by tracking the marker device 319 the motion tracking system 105 may determine the position and/or orientation of the instrument 304. The motion tracking system 105 may also track the current position and orientation of the patient 300 via a reference marker device 315 which may be rigidly attached to the patient 300 (e.g., clamped or otherwise attached to a bony portion of the patient's anatomy). The motion tracking system 105 may thereby continuously track the position and/or orientation of the instrument 304 relative to the patient 300 (i.e., within a common, patient-centric coordinate system).
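Expressing the instrument pose in a patient-centric coordinate system is a standard rigid-transform composition: the tracking camera observes both the instrument marker (319) and the patient reference marker (315), and the instrument pose relative to the patient is the inverse of the patient pose composed with the instrument pose. The following sketch shows the math with generic 4x4 homogeneous matrices; the specific poses and translations are made-up values for illustration.

```python
# Sketch: instrument pose in the patient frame from two camera-frame poses.
# T_patient_instrument = inverse(T_cam_patient) * T_cam_instrument

def mat_mul(a, b):
    """Multiply two 4x4 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def invert_rigid(t):
    """Invert a rigid-body transform (rotation R, translation p):
    the inverse is (R^T, -R^T p)."""
    r_t = [[t[j][i] for j in range(3)] for i in range(3)]
    p = [t[i][3] for i in range(3)]
    p_inv = [-sum(r_t[i][k] * p[k] for k in range(3)) for i in range(3)]
    return [r_t[0] + [p_inv[0]],
            r_t[1] + [p_inv[1]],
            r_t[2] + [p_inv[2]],
            [0.0, 0.0, 0.0, 1.0]]

# Camera-frame poses (identity rotations, translations in mm, illustrative)
T_cam_patient = [[1, 0, 0, 100.0], [0, 1, 0, 50.0], [0, 0, 1, 0.0], [0, 0, 0, 1]]
T_cam_instrument = [[1, 0, 0, 130.0], [0, 1, 0, 50.0], [0, 0, 1, 20.0], [0, 0, 0, 1]]

# Instrument pose in the patient-centric frame
T_patient_instrument = mat_mul(invert_rigid(T_cam_patient), T_cam_instrument)
print([row[3] for row in T_patient_instrument[:3]])  # [30.0, 0.0, 20.0]
```

Because the result is relative to the patient reference marker, it remains valid even as the patient moves, which is why the text describes tracking within a common, patient-centric coordinate system.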
Patient images 318, which may have been previously obtained by an imaging device 103 (see FIG. 1), may be registered to the common patient-centric coordinate system using an image registration technique, as described above. One or more patient images 318 may be shown on a display screen of a display device 219, as shown in FIG. 3A. The patient images 318 on the display device 219 may be augmented by one or more graphical elements indicating the current position/orientation of the instrument 304 within the patient-centric coordinate system. For example, as shown in FIG. 3A, a dashed line 320 superimposed over the patient image 318 may indicate the trajectory defined by an imaginary ray projected forward from the tip of the instrument 304. As the instrument 304 is moved relative to the patient 300, the location of the graphical element(s) 320 on the display screen may be updated to reflect the current pose of the instrument 304 relative to the patient.
In embodiments, the user (e.g., surgeon) may manipulate the instrument 304 while viewing the augmented patient images on the display device 219 to identify a desired trajectory through the patient 300 to a surgical area. For example, for a lateral transpsoas interbody fusion, the surgeon may utilize the instrument 304 to identify a path through the patient's anatomy to the surgical site (e.g., an intervertebral disc requiring a surgical intervention). The path may be selected to minimize disturbance to other anatomic features, such as neural structures (e.g., the lumbar nerve plexus) located around or within the psoas muscle. The user may set a particular trajectory using a user-input command (e.g., a button push, a voice command, etc.). The selected trajectory within the common coordinate system may be saved in a memory (e.g., in computer 113).
After a trajectory is set, the surgeon may make an incision 331 in the patient's skin surface and insert an invasive surgical instrument through the incision 331 and into the patient's body. The invasive surgical instrument may be, for example, a K-wire, a needle, an awl or the like that may be advanced along the pre-determined trajectory to the surgical site of interest. In some embodiments, the invasive surgical instrument may be a tracked instrument that is pre-calibrated and registered within the image guided surgery system. This may enable the motion tracking system 105 to track the advancement of the instrument within the patient 300. The display device 219 may graphically illustrate the position of the instrument as it is advanced along the pre-set trajectory.
In some embodiments, a robotic arm 101 such as shown in FIG. 1 may be used to guide the insertion of an invasive surgical instrument along the pre-determined trajectory. For example, the robotic arm 101 may be controlled to move the end effector 102 of the arm 101 into alignment with the pre-defined trajectory and/or over the pre-determined target location. The end effector 102 may include a guide mechanism (e.g., a hollow tube) aligned with the pre-set trajectory and through which the surgical instrument may be inserted to guide the instrument along the trajectory. Alternately, the invasive surgical instrument may be inserted by the surgeon using a free-hand technique (i.e., without robotic assistance). The insertion may be performed with or without the use of image guidance/surgical navigation.
The surgeon may also perform intraoperative neurophysiological monitoring (IONM), such as by inserting a handheld neuro-monitoring probe device into the incision site of the patient to electrically stimulate the surrounding tissue and detecting the evoked EMG response to detect the presence of nerve(s). Alternately or in addition, the invasive surgical instrument (e.g., K-wire, needle, etc.) that is inserted into the patient's body may be equipped with IONM functionality (e.g., it may include one or more electrodes configured to stimulate the surrounding tissue). This may enable the surgeon to repeatedly monitor for nerves as the instrument is advanced to the target site (e.g., an intervertebral disc).
In embodiments, after the surgeon has advanced an initial surgical instrument along the trajectory to the surgical site, one or more additional instruments may be inserted to dilate the tissue between the incision and the surgical site. For example, a series of dilating cannulas may be inserted over the initial surgical instrument (e.g., a K-wire). FIG. 3B illustrates an outermost cannula 333 of a series of sequential dilating cannulas within the surgical opening 331. It will be understood that each of the dilating instruments (e.g., cannulas 333) may optionally be tracked by the motion tracking system 105. Also, each additional instrument inserted into the patient 300 may optionally include IONM functionality to detect nerve proximity during or after its insertion into the patient 300.
As shown in FIG. 3C, a robotic arm 101 having a retractor apparatus 200 attached thereto may be controlled to move the retractor apparatus 200 over the surgical site. In embodiments where a robotic arm 101 is used to guide the insertion of a K-wire or other instrument down to the surgical site, the end effector 102 used for guiding may be removed from the robotic arm 101 and a retractor apparatus 200 such as shown in FIGS. 2A-2E may be attached to the end of the arm 101 (e.g., using a quick connect/disconnect attachment mechanism). The retractor apparatus 200 may be pre-calibrated and registered within the image guided surgery system and may include a marker device 322 to enable the position of the retractor apparatus 200 to be tracked and optionally shown on the display device 219. The retractor apparatus 200 may be moved by the robotic arm 101 into a position such that a retractor axis, a, extending through the central open region 225 of the apparatus 200 may be aligned (i.e., collinear) with the pre-set trajectory into the patient 300. The retractor axis a may extend down the center of the working channel 237 of the retractor apparatus 200 when the retractor blades 227 are attached. In embodiments, a controller for the robotic arm 101 may move the retractor apparatus 200 autonomously to align the retractor apparatus with the pre-set patient trajectory. Alternately, the robotic arm 101 may be manually moved using a hand guiding mode to align the retractor apparatus over the pre-set trajectory.
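Collinearity of the retractor axis with the pre-set trajectory can be verified numerically by comparing the two direction vectors. The sketch below shows one way a controller might gate on angular alignment; the 0.5 degree tolerance and the example vectors are assumptions for illustration, not a specified tolerance of the system.

```python
# Illustrative alignment check between the retractor axis direction
# and the pre-set trajectory direction. Tolerance value is assumed.
import math

def angle_deg(u, v):
    """Angle in degrees between two 3-vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (nu * nv)))))

def aligned(axis_dir, traj_dir, tol_deg=0.5):
    """True if the retractor axis direction is within tol_deg of the trajectory."""
    return angle_deg(axis_dir, traj_dir) <= tol_deg

print(aligned((0.0, 0.0, 1.0), (0.0, 0.005, 1.0)))  # True (about 0.29 degrees)
print(aligned((0.0, 0.0, 1.0), (0.0, 0.2, 1.0)))    # False (about 11.3 degrees)
```

A controller in autonomous mode would drive this residual toward zero before reporting the apparatus as aligned; in hand-guiding mode the same check could feed the display feedback.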
The retractor blades 227 may be attached to the frame 221 of the retractor apparatus 200 (see FIGS. 2A-2E). In one embodiment, the diameter of the working channel 237 of the retractor apparatus in an initial configuration (i.e., Di in FIG. 2A) may be approximately equal to the outer diameter of the tissue dilator 333. The retractor blades 227 may be inserted through the respective guides 229 of the retractor assembly 200 and advanced along the outer surface of the dilator 333 into the patient 300. In embodiments, the outer surface of the dilator 333 may include slots or similar features configured to guide the retractor blades 227 along the dilator 333 and into the surgical opening. After the retractor blades 227 are inserted into the patient 300, the dilator 333 may be withdrawn from the patient 300 through the central open region 225 of the retractor apparatus 200 to expose the working channel 237 that may extend to the surgical site.
In an alternative embodiment, the retractor blades 227 may first be inserted into the patient 300 (e.g., over the outer surface of the dilator 333) and may then be attached to the frame 221 of the retractor assembly 200 via a coupling mechanism. The coupling mechanism may attach the distal ends of the actuators 233 to the retractor blades 227. The coupling mechanism may be a latch (e.g., a mechanical or electromagnetic-based latch), a mechanical fastener, a clamp, a clip and/or mating features on the actuator 233 and the blade 227 that enable the blade 227 to be secured to the actuator 233. In one example, the mating features may include a protrusion on the outer surface of the blade 227 that slides into a corresponding slot in the actuator 233 (e.g., to provide a dovetail or bayonet-type connection). Alternately, a protrusion on the actuator 233 may slide into a slot on the blade 227. In embodiments, the retractor blades 227 may be secured to the frame 221 of the retractor apparatus 200 by rotating the blades in a first direction with respect to the frame 221. The blades 227 may be detached from the frame 221 by rotating the blades 227 in the opposite direction.
In other alternative embodiments, the surgeon may create a pathway through the patient's anatomy to the surgical site with or without the use of an image guided surgery system. For example, the surgeon may optionally utilize image guidance/surgical navigation to pre-plan an initial path to the surgical site, and may then use a manual (i.e., non-navigated) approach for deep tissue dissection and/or cannulation. One or more invasive surgical instruments inserted into the patient (e.g., a K-wire, a needle, a cannula, etc.) may be tracked by the motion tracking system 105 (either directly, by attaching a marker 319 to the invasive instrument, or indirectly, by touching or aligning a tracked handheld probe 304 to the invasive instrument) to determine the actual trajectory of the instrument(s) (e.g., cannula 333) within the patient in the common coordinate system. The retractor apparatus 200 may then be moved by the robotic arm 101 to align the retractor axis, a, with the instrument trajectory, as described above.
In various embodiments, the retractor blades 227 may be used for performing IONM of the patient 300 as discussed above at any time before, during and/or after the blades 227 are attached to the frame 221 of the retractor apparatus 200.
After the retractor blades 227 are attached to the frame 221, the blades 227 may be retracted to increase the size of the working channel 237, as shown in FIG. 3D. In embodiments, feedback data (e.g., encoder data) from the retractor apparatus 200 may be provided to the image guided surgery system to enable patient images shown on the display device 219 to be augmented by a graphical indication of real-time positions of the blades 227 and/or the size of the working channel 237 within the patient 300.
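Converting that encoder feedback into displayable blade positions and a channel-size estimate can be sketched as below. The counts-per-millimeter scale, the initial diameter, and the assumption that blades retract radially from a common center are all illustrative; the patent does not specify these values.

```python
# Illustrative conversion of actuator encoder counts into blade travel
# and an estimated working-channel diameter. All constants are assumed.

COUNTS_PER_MM = 200.0        # encoder resolution of a blade actuator (assumed)
INITIAL_DIAMETER_MM = 22.0   # Di: channel diameter before retraction (assumed)

def blade_displacement_mm(encoder_counts):
    """Radially-outward travel of one blade from its encoder reading."""
    return encoder_counts / COUNTS_PER_MM

def working_channel_diameter_mm(encoder_counts_per_blade):
    """Estimate the channel diameter assuming blades retract radially:
    the narrowest dimension is set by the two least-retracted blades."""
    disps = sorted(blade_displacement_mm(c) for c in encoder_counts_per_blade)
    return INITIAL_DIAMETER_MM + disps[0] + disps[1]

# Example: encoder readings from four blade actuators
print(working_channel_diameter_mm([1000, 1200, 800, 900]))  # 30.5
```

The per-blade displacements, rather than just the aggregate diameter, are what the image guided surgery system would overlay on the patient images as real-time blade positions.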
During a surgical procedure, the robotic arm 101 may maintain the position of the retractor apparatus 200 relative to the patient 300. In embodiments, the robotic arm 101 may be configured to compensate for any patient movement to maintain the working channel 237 aligned along the pre-set trajectory. The surgeon may perform a surgical procedure, such as an interbody fusion, through the working channel 237 defined by the retractor apparatus 200. In particular, disc material or other pathologic tissue may be removed and an implant (e.g., a spacer or cage) may be inserted through the working channel 237 and placed in the intervertebral space. IONM may be utilized as desired to minimize damage or irritation to surrounding neural structures.
After the insertion of an implant, the retractor apparatus 200 may be removed from the patient 300 and the incision may be closed. The patient 300 may optionally be scanned using an imaging device 103 such as shown in FIG. 1 to confirm the placement of the implant. The procedure may also include the insertion of stabilization elements (e.g., a rod and screw system) to stabilize the spine and allow the adjacent vertebrae to properly fuse in the case of a fusion procedure. In some embodiments, the placement of screws (e.g., pedicle screws) may be performed using the robotic arm 101 and/or image guided surgery system without requiring the patient 300 to be repositioned or moved. In particular, patient images 318 on the display device 219 may be used by the surgeon to set one or more trajectories 323 for screw placement (e.g., via a posterior or anterior approach with the patient 300 lying on his/her side). The robotic arm 101 may be moved into position to align the end effector 102 over the pre-set trajectory. The retractor apparatus 200 may be removed and replaced on the robotic arm 101 by an end effector 102 that includes a guide mechanism (e.g., hollow tube 324) through which surgical instruments may be inserted along the pre-set trajectory. Various instruments, such as one or more cannulas, a drill, a screw and a screw driver, may be inserted through the end effector 102 to place the screw in the patient.
FIGS. 4A-4B illustrate an alternative embodiment of a retractor apparatus 400. The retractor apparatus 400 may be similar to the retractor apparatus 200 shown in FIGS. 2A-2E. The retractor apparatus 400 includes a frame 421 having a central open region 425 as shown in the overhead view of FIG. 4A. The frame 421 may be coupled to a rigid support arm, such as robotic arm 101 shown in FIG. 1. In the embodiment of FIGS. 4A-4B, the frame 421 has a generally circular shape. The retractor apparatus 400 includes a plurality of actuators 433 extending into the central open region 425. The actuators 433 may each be independently extended and retracted within the central open region 425 using control features (e.g., sockets 435).
The retractor apparatus 400 includes a coupling mechanism 430 for mechanically coupling the actuators 433 to a plurality of retractor blades 427. In this embodiment, the coupling mechanism 430 comprises a projection 431 extending from the side of the retractor blade 427 that is received within a slot 432 in the actuator 433 to attach the retractor blade 427 to the actuator 433.
The retractor apparatus 400 may also include a plurality of markers 434 (e.g., reflective spheres) attached to the apparatus, such as on the rigid frame 421 of the apparatus 400. A plurality of markers 434 (reflective spheres) are visible in the side view of the retractor apparatus 400 of FIG. 4B. The markers 434 may enable the retractor apparatus 400 to be tracked by a motion tracking system 105 as described above.
FIG. 5 is a system block diagram of a computing device 1300 useful for performing and implementing the various embodiments described above. While the computing device 1300 is illustrated as a laptop computer, a computing device providing the functional capabilities of the computing device 1300 may be implemented as a workstation computer, an embedded computer, a desktop computer, a server computer or a handheld computer (e.g., a tablet, a smartphone, etc.). A typical computing device 1300 may include a processor 1301 coupled to an electronic display 1304, a speaker 1306 and a memory 1302, which may be a volatile memory as well as a nonvolatile memory (e.g., a disk drive). When implemented as a laptop computer or desktop computer, the computing device 1300 may also include a floppy disc drive or a compact disc (CD) or DVD disc drive coupled to the processor 1301. The computing device 1300 may include an antenna 1310, a multimedia receiver 1312, a transceiver 1318 and/or communications circuitry coupled to the processor 1301 for sending and receiving electromagnetic radiation, connecting to a wireless data link, and receiving data. Additionally, the computing device 1300 may include network access ports 1324 coupled to the processor 1301 for establishing data connections with a network (e.g., a LAN coupled to a service provider network, etc.). A laptop computer or desktop computer 1300 typically also includes a keyboard 1314 and a mouse pad 1316 for receiving user inputs.
The foregoing method descriptions are provided merely as illustrative examples and are not intended to require or imply that the steps of the various embodiments must be performed in the order presented. As will be appreciated by one of skill in the art, the steps in the foregoing embodiments may be performed in any order. Words such as "thereafter," "then," "next," etc. are not necessarily intended to limit the order of the steps; these words may be used to guide the reader through the description of the methods. Further, any reference to claim elements in the singular, for example, using the articles "a," "an" or "the" is not to be construed as limiting the element to the singular.
The various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The hardware used to implement the various illustrative logics, logical blocks, modules, and circuits described in connection with the aspects disclosed herein may be implemented or performed with a general purpose processor, a graphics processing unit (GPU), a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general-purpose processor may be a microprocessor, but, in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. Alternatively, some steps or methods may be performed by circuitry that is specific to a given function.
In one or more exemplary aspects, the functions described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a non-transitory computer-readable medium. The steps of a method or algorithm disclosed herein may be embodied in a processor-executable software module, which may reside on a non-transitory computer-readable medium. Non-transitory computer-readable media includes computer storage media that facilitates transfer of a computer program from one place to another. A storage medium may be any available medium that may be accessed by a computer. By way of example, and not limitation, such non-transitory computer-readable storage media may comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that may be used to carry or store desired program code in the form of instructions or data structures and that may be accessed by a computer. Disk and disc, as used herein, includes compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of non-transitory computer-readable storage media. Additionally, the operations of a method or algorithm may reside as one or any combination or set of codes and/or instructions on a machine-readable medium and/or computer-readable medium, which may be incorporated into a computer program product.
The preceding description of the disclosed aspects is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects without departing from the scope of the invention. Thus, the present invention is not intended to be limited to the aspects shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.