This application claims the benefit of U.S. provisional patent application 61/560,677, filed Nov. 16, 2011, entitled “MEDICAL SENSING CONTROL SYSTEM AND METHOD,” the entirety of which is incorporated by reference herein.
TECHNICAL FIELD
Embodiments of the present disclosure relate generally to the field of medical devices and, more particularly, to a medical workflow system and associated methods of use.
BACKGROUND
Innovations in diagnosing and verifying the level of success of treatment of disease have progressed from solely external imaging processes to include internal diagnostic processes. In addition to traditional external imaging techniques such as X-ray, MRI, CT scans, fluoroscopy, and angiography, small sensors may now be placed directly in the body. For example, diagnostic equipment and processes have been developed for diagnosing vasculature blockages and other vasculature disease by means of ultra-miniature sensors placed upon the distal end of a flexible elongate member such as a catheter or a guide wire used for catheterization procedures. For example, known medical sensing techniques include intravascular ultrasound (IVUS), forward looking IVUS (FL-IVUS), fractional flow reserve (FFR) determination, coronary flow reserve (CFR) determination, optical coherence tomography (OCT), trans-esophageal echocardiography, and image-guided therapy. Traditionally, many of these procedures are carried out by a multitude of physicians and clinicians, each of whom performs an assigned task. For example, a physician may stand next to a patient in the sterile field and guide the insertion and pull back of an imaging catheter. A clinician near the physician may control the procedure workflow with a controller, for example by starting and stopping the capture of images. Further, after images have been captured, a second clinician in an adjacent control room working at a desktop computer may select the images of interest and make measurements on them. Typically, the physician in the catheter lab must instruct the clinician in the control room on how to make such measurements. This may lengthen the time of the procedure, increase the cost of the procedure, and may lead to measurement errors due to miscommunication or clinician inexperience.
Further, when making measurements on medical sensing images, a clinician may typically have to select a measurement mode prior to making any measurements, reducing the efficiency of the medical sensing workflow.
Accordingly, while the existing devices and methods for conducting medical sensing workflows have been generally adequate for their intended purposes, they have not been entirely satisfactory in all respects.
SUMMARY
In one exemplary aspect, the present disclosure is directed to a method of conducting a medical workflow with a touch-sensitive bedside controller. The method includes initiating a medical workflow using a graphical user interface on the bedside controller, positioning an imaging tool within a patient's body based on images captured by the imaging tool and displayed on the bedside controller, controlling the commencement and termination of a recordation of images captured by the imaging tool using the graphical user interface on the bedside controller, navigating through the recorded images to identify an image of interest using the graphical user interface on the bedside controller, and performing measurements on the image of interest using the graphical user interface on the bedside controller.
In some instances, the performing measurements may include touching and releasing portions of the image of interest as it is displayed on the bedside controller to make one of an area measurement and a diameter measurement.
In another exemplary aspect, the present disclosure is directed to a bedside controller. The bedside controller includes a housing, a touch-sensitive display disposed within a surface of the housing and configured to display images and receive user input on the surface, and a processor disposed within the housing. The bedside controller also includes a communication module disposed within the housing, communicatively coupled to the processor, and configured to transmit and receive medical data, and a non-transitory computer-readable storage module disposed within the housing, communicatively coupled to the processor, and including a plurality of instructions stored therein and executable by the processor. The plurality of instructions includes instructions for rendering a graphical user interface (GUI) on the touch-sensitive display, instructions for displaying images of a patient as they are being captured by an imaging tool disposed within the patient's body, and instructions for initiating and terminating a recordation of the images based on user input to the GUI. The plurality of instructions also includes instructions for displaying the recorded images within the GUI so that a user may identify an image of interest and instructions for making a measurement on the image of interest based on a user measurement input to the GUI.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a schematic drawing depicting a medical sensing system including a bedside controller according to one embodiment of the present disclosure.
FIG. 2 is a schematic drawing depicting a medical sensing system including a wireless bedside controller according to another embodiment of the present disclosure.
FIG. 3A is a diagrammatic perspective view of a bedside controller.
FIG. 3B is a diagrammatic rear perspective view of the bedside controller of FIG. 3A.
FIG. 3C is a diagrammatic perspective view of the bedside controller of FIG. 3A mounted to a bed rail.
FIG. 4 is a functional block diagram of the bedside controller of FIGS. 3A and 3B according to aspects of the present disclosure.
FIG. 5 is a diagrammatic perspective view of a multi-modality mobile processing system with the bedside controller of FIGS. 3A and 3B attached thereto.
FIG. 6 is a diagrammatic perspective view of the bedside controller of FIGS. 3A and 3B releasably mounted on an IV pole.
FIG. 7 is a high-level flowchart illustrating a method of conducting a medical sensing workflow with a bedside controller according to various aspects of the present disclosure.
FIG. 8 is a high-level flowchart of a method that describes a measurement workflow conducted on a bedside controller according to various aspects of the present disclosure.
FIGS. 9-11 are partial screen images illustrating various aspects of the method of FIG. 8.
DETAILED DESCRIPTION
For the purposes of promoting an understanding of the principles of the present disclosure, reference will now be made to the embodiments illustrated in the drawings, and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the disclosure is intended. Any alterations and further modifications in the described devices, instruments, methods, and any further application of the principles of the disclosure as described herein are contemplated as would normally occur to one skilled in the art to which the disclosure relates. In particular, it is fully contemplated that the features, components, and/or steps described with respect to one embodiment may be combined with the features, components, and/or steps described with respect to other embodiments of the present disclosure.
FIG. 1 is a schematic drawing depicting a medical sensing system 100 including a bedside controller 102 according to one embodiment of the present disclosure. In general, the medical sensing system 100 provides for coherent integration and consolidation of multiple forms of acquisition and processing elements designed to be sensitive to a variety of methods used to acquire and interpret human biological physiology and morphological information. More specifically, in system 100, the bedside controller 102 is a touch-enabled, integrated computing device for the acquisition, control, interpretation, measurement, and display of multi-modality medical sensing data. In the illustrated embodiment, the bedside controller 102 is a tablet-style touch-sensitive computer that provides user controls and diagnostic images on a single surface. In the medical sensing system 100, the bedside controller 102 is operable to present workflow control options and patient image data via graphical user interfaces (GUIs) corresponding to a plurality of medical sensing modalities. The bedside controller 102 will be described in greater detail in association with FIGS. 3A, 3B, and 4.
In the illustrated embodiment, the medical sensing system 100 is deployed in a catheter lab 104. The catheter lab 104 may be used to perform on a patient 106 any number of medical sensing procedures alone or in combination such as, by way of example and not limitation, angiography, intravascular ultrasound (IVUS), virtual histology (VH), forward looking IVUS (FL-IVUS), intravascular photoacoustic (IVPA) imaging, fractional flow reserve (FFR) determination, coronary flow reserve (CFR) determination, optical coherence tomography (OCT), computed tomography, intracardiac echocardiography (ICE), forward-looking ICE (FLICE), intravascular palpography, transesophageal ultrasound, or any other medical sensing modalities known in the art. In addition to controlling medical sensing systems, the bedside controller may be used to cooperate with and control medical treatment systems such as, for example but without limitation, those used for stent placement, coil embolism, ablation therapy, kidney stone treatments, basket placement in a cystoscopy, tumor removal, and chemical therapies. The catheter lab 104 further includes a sterile field 105 that encompasses the portions of the catheter lab surrounding the patient 106 on a procedure table 109 and a clinician 107, who may perform any number of medical sensing procedures or treatments. As shown in FIG. 1, the bedside controller 102 may be positioned within the sterile field 105 and may be utilized by the clinician 107 to control a workflow of a medical sensing procedure or treatment being performed on the patient 106. For example, the clinician 107 may initiate the procedure workflow, watch real-time IVUS images captured during the procedure, and make measurements on the IVUS images all using the bedside controller 102 inside of the sterile field 105. In alternative embodiments, the bedside controller 102 may be utilized outside of the sterile field 105, for instance, in other locations within the catheter lab 104 or in a control room adjacent to the catheter lab.
A method of utilizing the bedside controller 102 to control a medical sensing workflow or treatment workflow will be discussed in greater detail in association with FIGS. 7 and 8.
In the embodiment illustrated in FIG. 1, the medical sensing system 100 additionally includes a number of interconnected medical sensing-related tools in the catheter lab 104 to facilitate a multi-modality workflow procedure, such as an IVUS catheter 108, an IVUS patient isolation module (PIM) 112, an OCT catheter 110, an OCT PIM 114, an electrocardiogram (ECG) device 116, an angiogram system 117, a boom display 122, and a multi-modality processing system 124. The bedside controller 102, PIMs 112 and 114, ECG device 116, angiography system 117, and boom display 122 are communicatively coupled to the processing system 124. In one embodiment, the processing system 124 is a computer workstation with the hardware and software to acquire, process, and display multi-modality medical sensing data, but in other embodiments, the processing system may be any other type of computing system operable to process medical sensing data. For example, during an IVUS workflow, the processing system 124 is operable to accept raw IVUS data from the IVUS PIM 112, transform it into IVUS images, and make the images available to the bedside controller 102, so that they may be displayed to the clinician 107 for analysis. In the embodiments in which the processing system 124 is a computer workstation, the system includes at least a processor such as a microcontroller or a dedicated central processing unit (CPU), a non-transitory computer-readable storage medium such as a hard drive, random access memory (RAM), and/or compact disk read only memory (CD-ROM), a video controller such as a graphics processing unit (GPU), and a network communication device such as an Ethernet controller. Further, the multi-modality processing system 124 is communicatively coupled to a data network 125. In the illustrated embodiment, the data network 125 is a TCP/IP-based local area network (LAN); however, in other embodiments, it may utilize a different protocol such as Synchronous Optical Networking (SONET), or may be a wide area network (WAN).
The processing system 124 may connect to various resources via the network 125, such as a Digital Imaging and Communications in Medicine (DICOM) system, a Picture Archiving and Communication System (PACS), and a Hospital Information System. U.S. Patent Application No. 61/473,570, entitled “MULTI-MODALITY MEDICAL SENSING SYSTEM AND METHOD” and filed on Apr. 8, 2011, discloses a multi-modality processing system that processes medical sensing data and is hereby incorporated by reference in its entirety.
In the medical sensing system 100, the IVUS PIM 112 and OCT PIM 114 are operable to respectively receive medical sensing data collected from the patient 106 by the IVUS catheter 108 and OCT catheter 110 and are operable to transmit the received data to the processing system 124. In one embodiment, the IVUS PIM 112 and OCT PIM 114 transmit the medical sensing data over a Peripheral Component Interconnect Express (PCIe) data bus connection, but, in other embodiments, they may transmit data over a USB connection, a Thunderbolt connection, a FireWire connection, or some other high-speed data bus connection. Additionally, the ECG device 116 is operable to transmit electrocardiogram signals or other hemodynamic data from the patient 106 to the processing system 124. To aid the clinician in data capture, the bedside controller 102 is operable to display the ECG data alongside medical sensing data. Further, in some embodiments, the processing system 124 may be operable to synchronize data collection with the catheters 108 and 110 using ECG signals from the ECG device 116. Further, the angiogram system 117 is operable to collect x-ray, computed tomography (CT), or magnetic resonance imaging (MRI) images of the patient 106 and transmit them to the processing system 124. After the x-ray, CT, or MRI data has been processed into human-readable images by the processing system 124, the clinician 107 may navigate the GUI on the bedside controller 102 to retrieve the images from the processing system 124 and display them on the controller. In some embodiments, the processing system 124 may co-register image data from the angiogram system 117 (e.g., x-ray data, MRI data, CT data, etc.) with sensing data from the IVUS and OCT catheters 108 and 110. As one aspect of this, the co-registration may be performed to generate three-dimensional images with the sensing data. Such co-registered 3-D image data may be viewable on the bedside controller 102.
In one embodiment, a clinician may rotate, zoom, and otherwise manipulate such 3-D images on the bedside controller 102 using simultaneous touch inputs (i.e., multitouch) and gestures.
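Such multitouch manipulation can be illustrated with a brief sketch. The Python fragment below derives zoom and rotation deltas from two simultaneous touch points, as in a pinch gesture; the function and parameter names are hypothetical, and the disclosure does not specify how gesture geometry maps to 3-D view transforms.

```python
import math

def pinch_transform(p1_start, p2_start, p1_end, p2_end):
    """Derive zoom and rotation deltas from two simultaneous touch points.

    Each argument is an (x, y) tuple. All names here are illustrative;
    this is a sketch, not the disclosed implementation.
    """
    def dist(a, b):
        return math.hypot(b[0] - a[0], b[1] - a[1])

    def angle(a, b):
        return math.atan2(b[1] - a[1], b[0] - a[0])

    # The ratio of finger separations gives a zoom factor; the change in
    # the angle between the fingers gives a rotation about the view axis.
    zoom = dist(p1_end, p2_end) / dist(p1_start, p2_start)
    rotation = angle(p1_end, p2_end) - angle(p1_start, p2_start)
    return zoom, rotation
```

Spreading two fingers to twice their original separation, for example, would double the zoom factor while leaving the rotation unchanged.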
Additionally, in the illustrated embodiment of FIG. 1, the medical sensing tools in system 100 are communicatively coupled to the processing system 124 via a wired connection such as a standard copper link or a fiber optic link. Specifically, the bedside controller 102 may be communicatively and/or electrically coupled to the processing system 124 via a Universal Serial Bus (USB) connection, a Power-over-Ethernet connection, a Thunderbolt connection, a FireWire connection, or some other high-speed data bus connection.
However, in an alternative embodiment, such as that shown in FIG. 2, the medical sensing tools may communicate wirelessly. In that regard, FIG. 2 is a schematic drawing depicting a medical sensing system 200 including a wireless bedside controller 202 according to another embodiment of the present disclosure. The medical sensing system 200 is similar to the system 100 of FIG. 1, except that the medical sensing tools, including the wireless bedside controller 202, a wireless IVUS PIM 204, and a wireless OCT PIM 206, communicate with a wireless network 208 via wireless networking protocols. For example, the bedside controller 202 may send and receive workflow control parameters, medical sensing images, and measurement data to and from a remote processing system via IEEE 802.11 Wi-Fi standards, Ultra Wide-Band (UWB) standards, wireless FireWire, wireless USB, Bluetooth, or another high-speed wireless networking standard. Such wireless capability allows the clinician 107 to more freely position the bedside controller 202 inside or outside of the sterile field 105 for better workflow management.
With reference now to FIGS. 3A, 3B, 3C, and 4, FIG. 3A is a diagrammatic perspective view of a bedside controller 300, FIG. 3B is a diagrammatic rear perspective view of the bedside controller, FIG. 3C is a diagrammatic perspective view of the bedside controller mounted to a bed rail, and FIG. 4 is a functional block diagram of the bedside controller 300 according to aspects of the present disclosure. The bedside controller 300 is similar to the bedside controllers 102 and 202 in the medical sensing systems 100 and 200, and is operable to, among other things, initiate a medical sensing or treatment procedure workflow, display real-time images captured during the procedure, and accept measurement input on the images from a clinician. The bedside controller 300 generally improves system control available to a clinician working at a patient table. For instance, giving a clinician both workflow control and measurement capability within the sterile field reduces errors and improves workflow efficiency.
As shown in FIG. 3A, the bedside controller 300 includes an integrally formed housing 302 that is easy to grasp and move around a catheter lab or other medical setting. In one embodiment, the integrally formed housing 302 may be seamlessly molded from materials such as thermoplastic or thermosetting plastic or a moldable metal. In other embodiments, the integrally formed housing 302 may comprise a plurality of housing portions fixedly bonded in a substantially permanent manner to form an integral housing. The housing 302 is resistant to fluids and, in one embodiment, may have a rating of IPX4 against fluid ingress as defined by International Electrotechnical Commission (IEC) standard 60529. In other embodiments in which the housing 302 may be used in different environments, the housing may have a different fluid ingress rating. In the illustrated embodiment, the housing 302 is about 10.5 inches in width, about 8.25 inches in height, and has a thickness of about 2.75 inches. In alternative embodiments, the housing may have a different width, height, or thickness that is similarly conducive to portability.
As shown in FIG. 3B, the housing 302 further includes a self-contained mounting structure 303 disposed on the housing. In the illustrated embodiment, the mounting structure is disposed near an outer edge of the housing. The mounting structure 303 allows the bedside controller 300 to be releasably mounted in a variety of places in and out of a catheter lab in a self-contained manner. That is, the bedside controller 300 may be directly secured to another object without the use of a separate external mount. In the illustrated embodiment, the mounting structure 303 includes a mounting channel 304 and a retaining clamp 305 that pivots over the mounting channel to secure a mounting platform therewithin. The mounting channel 304 is defined by a longer front wall 350, a top wall 352, and a shorter back wall 354, and the retaining clamp includes a slot 356 that extends through the clamp in a manner generally parallel to the mounting channel. The front wall 350 and the back wall 354 are generally perpendicular to a touch-sensitive display 307 in the housing 302, and the top wall 352 is generally parallel to the display 307. In the illustrated embodiment, the retaining clamp is spring-loaded and releasably exerts pressure on objects situated in the mounting channel. In alternative embodiments, the retaining clamp may be configured differently and exert force via mechanisms other than springs.
As shown in FIG. 3C, in operation, the bedside controller 300 may be releasably secured to a mounting platform, for example a bed rail 306, by pivoting the mounting clamp 305 to an open position, positioning the controller such that the rail extends through the length of the channel 304, and releasing the clamp such that it secures the rail within the channel. When the rail 306 is positioned in the mounting channel 304 and the clamp 305 is holding it therein, three surfaces of the rail are respectively engaged by the front wall 350, the top wall 352, and the back wall 354, and a fourth surface of the rail extends through the slot 356 in the clamp 305. In this manner, the mounting structure 303 may maintain the bedside controller 300 in a position generally parallel to the procedure table associated with the bed rail 306, as shown in FIG. 3C. Described differently, the mounting structure 303 is a cantilevered mounting structure in that it secures one end of the controller to an object while the majority of the controller extends away from the object in an unsupported manner. Such a cantilevered position allows for a display of the controller to be both readable and at a comfortable input angle for an operator. Further, the self-contained mounting structure 303 allows the bedside controller 300 to be quickly released from the bed rail 306 and reattached to an IV pole, a cart on which a processing system is deployed, or another location in or out of the sterile field to allow for convenient workflow control and image analysis. In alternative embodiments, the mounting structure 303 of the bedside controller may vary from the design illustrated in FIGS. 3A and 3B and include additional and/or different components to allow for self-contained mounting.
Embedded into the front of the housing 302 is the touch-sensitive display 307 that comprises both a touch panel 308 and a flat panel display 309. The touch panel 308 overlays the flat panel display 309 and accepts user input via human touch, stylus touch, or some other analogous input method. In other words, the touch-sensitive display 307 displays images and accepts user input on the same surface. In the current embodiment, the touch panel 308 is a resistive-type panel, but in alternative embodiments it may be a capacitive-type panel, a projective-type panel, or some other suitable type of touch-enabled input panel. Further, the touch panel 308 is operable to accept multiple inputs simultaneously (multitouch), for instance, to enable rotation of a three-dimensional rendering of a vessel along multiple axes. Additionally, the touch panel 308 is capable of receiving input when a sterile drape 301 is covering the bedside controller 300 and also when a user is gloved. The touch panel 308 is controlled by a touch controller 310 disposed within the housing 302. Further, when a clinician makes contact with the touch panel 308, the touch panel is operable to provide haptic feedback via a haptics controller 312 and haptics drivers 314. This haptic technology is operable to simulate a plurality of sensations on the touch panel 308 by varying the intensity and frequency of vibrations generated when a user contacts the touch panel. In some embodiments, the housing 302 may include a sheath configured to store a stylus therein. Thus, a clinician may remove the stylus from the sheath in the housing to make measurements on the bedside controller and store it when the measurements have been completed.
Beneath the touch panel 308 is the flat panel display 309 that presents a graphical user interface (GUI) 316 to a user. In the illustrated embodiment, the flat panel display 309 is an LCD display, but in alternative embodiments it may be a different type of display, such as an LED display or an AMOLED display. In the illustrated embodiment, the flat panel display 309 is illuminated by an LED backlight power inverter 318. As mentioned above, the GUI 316 not only allows a clinician to control a medical sensing workflow but also to make measurements on images captured from a patient in the sterile field. A method of interacting with the GUI 316 to make vessel measurements will be discussed in greater detail in association with FIGS. 8-11.
The bedside controller 300 includes a single-board processing platform 320 within the housing 302 that is operable to render the GUI 316 and process user input. In the illustrated embodiment, the processing platform has a pico form factor and includes integrated processing components such as a processor 321, system memory 322, a graphics processing unit (GPU), a communications module 323, and an I/O bus controller. In some embodiments, the processor 321 may be a low-power processor such as an Intel Atom® processor or an ARM-based processor, the communications module 323 may be a 10/100/1 Gb Ethernet module, and the I/O bus controller may be a Universal Serial Bus (USB) controller. The bedside controller 300 further includes a storage module 324 that is a non-transitory computer-readable storage medium operable to store an operating system (i.e., software to render and control the GUI), image manipulation software, medical sensing data and images received from a processing system, and other medical sensing-related software. The processor 321 is configured to execute software and instructions stored on the storage module 324. In the illustrated embodiment, the storage module 324 is a solid state drive (SSD) communicatively coupled to the processing platform 320 via a SATA connection, but, in alternative embodiments, it may be any other type of non-volatile or temporary storage module. The bedside controller 300 further includes a wireless communications module 326 communicatively coupled to the processing platform 320. In some embodiments, the wireless communications module is an IEEE 802.11 Wi-Fi module, but in other embodiments it may be an Ultra Wide-Band (UWB) wireless module, a wireless FireWire module, a wireless USB module, a Bluetooth module, or another high-speed wireless networking module.
In the illustrated embodiment, the bedside controller 300 is powered via both a wired 12 VDC power-over-Ethernet (PoE) connection 328 and a battery 330 disposed within the housing 302. In one embodiment, the battery 330 may be sealed within the integrally formed housing 302 and may be recharged through electrical contacts disposed on the exterior of the housing and electrically coupled to the battery. As shown in the embodiment of FIG. 3B, the front wall 350 may include one or more electrical contacts 358 through which the battery 330 may be charged when the controller is mounted to objects with a compatible charging structure. In other embodiments, the housing 302 may include a battery compartment with a removable cover to permit battery replacement. Such a battery compartment cover may be resistant to fluid ingress (e.g., with an IPX4 rating). The bedside controller 300 may be coupled to a processing system in the catheter lab via the PoE connection 328, over which it receives medical sensing images that have been captured from the patient and rendered on the processing system. In operation, when the bedside controller is coupled to the PoE connection 328, it receives power and communications over the same physical wire. When the bedside controller 300 is disconnected from the PoE connection 328, it runs on battery power and receives data wirelessly via the wireless communications module 326. When used wirelessly in a catheter lab, the bedside controller may directly communicate with a processing system (i.e., in an ad-hoc wireless mode), or, alternatively, it may communicate with a wireless network that serves a plurality of wireless devices. In alternative embodiments, the bedside controller 300 may receive power and data through different wired connections, or receive data communications through a wired data connection and power from the battery 330, or receive data communications through the wireless module 326 and power from a wired electrical connection.
In some embodiments, the bedside controller 300 may be used in a semi-wireless configuration, in which the battery 330 provides backup power to the controller when the controller is temporarily disconnected from a wired power source. For example, if, at the beginning of a procedure, the bedside controller 300 is connected to a PoE connection (or other type of wired connection) and during the procedure the controller must be disconnected from the PoE connection to allow for a cabling adjustment, the battery 330 may keep the controller alive until a PoE connection can be re-established. In this manner, a full power-off and reboot of the controller 300 is avoided during a procedure. As shown in FIG. 4, a DC-DC power converter 332 converts the input voltage to a voltage usable by the processing platform 320.
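The semi-wireless behavior described above amounts to a simple power-source selection policy. The sketch below is a hypothetical illustration only; the disclosure does not describe the controller's actual power-management logic, and the function and value names are assumptions.

```python
def select_power_source(poe_connected, battery_charge_pct):
    """Choose a power source so the controller stays alive across a
    temporary PoE disconnect (hypothetical policy; the disclosure only
    states that the battery bridges wired-power interruptions)."""
    if poe_connected:
        # Wired PoE carries both power and data on the same cable.
        return "poe"
    if battery_charge_pct > 0:
        # Bridge on battery until the PoE connection is re-established,
        # avoiding a full power-off and reboot mid-procedure.
        return "battery"
    # With no wired power and a depleted battery, the controller powers off.
    return "off"
```

A periodic supervisor task could call this function and switch the DC-DC converter input accordingly.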
It is understood that although the bedside controller 300 in the illustrated embodiments of FIGS. 3A-3C and 4 includes specific components described herein, the bedside controller may include any number of additional components, for example a charge regulator interposed between the electrical contacts and the battery, and may be configured in any number of alternative arrangements in alternative embodiments.
With reference now to FIGS. 5 and 6, illustrated are examples of locations in which the bedside controller 300 may be mounted. FIG. 5 is a diagrammatic perspective view of a multi-modality mobile processing system 500. The processing system 500 is disposed on a cart 502 that enables the processing system to be easily moved between different locations such as different catheter labs. As shown in FIG. 5, the bedside controller 300 is mounted to the cart 502 so that it may be transported to catheter labs with the processing system. The bedside controller 300 is releasably secured to the cart via the self-contained mounting structure 303 that is built into the housing 302. Further, in some embodiments, the cart 502 may include a dock for the bedside controller 300 such that when the controller is docked on the cart its battery is recharged through the electrical contacts 358 disposed on the housing 302. As shown in FIG. 6, the bedside controller 300 may also releasably attach to an IV pole 600 via the self-contained mounting structure 303. When so attached, the bedside controller 300 may be rolled next to a patient in the sterile field and thus within reach of a clinician who may operate the controller with a single hand.
FIG. 7 is a high-level flowchart illustrating a method 700 of conducting a medical sensing workflow with the bedside controller 300 of FIGS. 3A-4 according to various aspects of the present disclosure. The method 700 will be described in the context of an IVUS procedure but may equally apply to any number of medical sensing or treatment procedures, such as an OCT procedure, a FL-IVUS procedure, an ICE procedure, etc. The method 700 begins at block 702 where a medical sensing workflow is initiated with the bedside controller 300. Using an IVUS procedure as an example, a clinician in the sterile field and adjacent a patient may select the “IVUS” option out of a plurality of modes (e.g., OCT, Chromaflow, FL-IVUS, etc.) on the bedside controller's GUI to begin the IVUS workflow. Next, in block 704, after an IVUS imaging catheter has been inserted into the patient, the clinician may select a ‘Live Images’ option on the bedside controller's GUI to receive live images from the catheter. Using the real-time images, the clinician may guide the catheter within the patient to a desired position. In typical embodiments, a processing system may collect raw IVUS data from the catheter and process the data to render IVUS images. The bedside controller retrieves the IVUS images from the processing system and displays them to a user in real-time. Then, in block 706, after the IVUS catheter has been appropriately positioned in the patient using the live images, the clinician selects a ‘Record’ option on the bedside controller's GUI and begins the catheter pull back. The processing system responds to the record command and begins rendering and storing IVUS images. The method 700 proceeds to block 708 where, after the IVUS catheter pull back has been completed, the clinician terminates the recording of IVUS images via the bedside controller's GUI.
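Blocks 702-708 can be viewed as an ordered sequence of workflow states advanced by the clinician's GUI selections. The sketch below is purely illustrative; the state names and helper function are hypothetical and not taken from the disclosure.

```python
# Hypothetical workflow state sequence mirroring blocks 702-708 of method 700.
WORKFLOW_STEPS = [
    "select_modality",  # block 702: e.g., choose IVUS from the mode list
    "live_images",      # block 704: guide the catheter using real-time images
    "record",           # block 706: begin recording during catheter pull back
    "stop_recording",   # block 708: terminate the recording via the GUI
]

def next_step(current):
    """Return the workflow state that follows `current`, or None at the end."""
    i = WORKFLOW_STEPS.index(current)
    return WORKFLOW_STEPS[i + 1] if i + 1 < len(WORKFLOW_STEPS) else None
```

A GUI controller could use such a table to enable only the options valid for the current stage of the procedure.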
Then, in block 710, the clinician at the bedside recalls the captured IVUS images on the bedside controller and finds the IVUS images associated with the area of interest. Specifically, the bedside controller may present a condensed view of all captured images, and the clinician may navigate through them using gestures on the bedside controller's touch panel to find the target area. Finally, in block 720, the clinician performs measurements on the IVUS images directly on the bedside controller. The user of the bedside controller creates measurements by interacting with an image through a series of presses, moves, and releases using a finger or stylus on the controller's touch-sensitive display. These actions are interpreted by the bedside controller's internal processor and converted to measurements on the display. For precise measurements, the clinician may annotate the images using a stylus or another tool compatible with the bedside controller's touch panel. After the appropriate measurements have been completed, the clinician may save the images to the processing system by selecting the appropriate options in the bedside controller GUI. A method of performing measurements on the bedside controller will be described below.
FIG. 8 is a high-level flowchart of a method 800 that describes a measurement workflow on the bedside controller 300 of FIGS. 3A-4. In one embodiment, the method 800 may be carried out during block 720 of the method 700 in FIG. 7 as part of a medical sensing workflow on intravascular images. Further, in the illustrated embodiment, the method 800 of making measurements on the bedside controller 300 is implemented in measurement software stored in the storage module 324 in the bedside controller. In general, when measuring images, such as intravascular images, a clinician has the option of making different types of measurements, such as diameter measurements and area measurements. Typically, when making area measurements, a clinician may either denote the edges of an object by drawing a series of discrete points that are connected in subsequent processing or by drawing a continuous line around the object to be measured. In this regard, the method 800 of performing measurements on images is “smart” in that it does not require a user to select a particular measurement mode prior to interacting with an image on the bedside controller. For instance, when a user performs a series of measurement inputs on the bedside controller, the GUI software interprets the nature (e.g., shape) of the user's measurement inputs, automatically enters either diameter mode, area-point mode, or area-draw mode, and outputs the desired measurement on the controller's display.
In more detail, the method 800 begins at block 802, where an image to be measured is displayed on the bedside controller and a user inputs a measurement start point on the image with an input device. For example, the user may use a finger or stylus to indicate a point on a vessel border from which a measurement will commence. Note that prior to selecting the measurement start point, the measurement software did not require the user to select a measurement mode. Next, in block 804, the user, without removing the input device from the image after indicating the start point, drags the input device across the image a distance to trace a line. Then, in block 806, the user withdraws the input device from the image at a measurement end point. The method 800 proceeds to decision block 808, where the measurement software determines whether the distance between the start point and the end point is less than a threshold value. In one embodiment, the threshold value is equivalent to 10 pixels, but, in alternative embodiments, the threshold value may be smaller or larger or measured in different units. Further, in some embodiments, the threshold value is adjustable either manually by a user or automatically based on detected error rates. If the distance is less than the threshold value, the method proceeds to block 810, where the measurement software enters area-point mode and draws a point on the image corresponding to the end point (i.e., where the user lifted the input device from the touch-enabled display). This sequence is illustrated in FIG. 9. Specifically, when a user presses (900) an input device on an image and immediately lifts (902) the input device, the input will be interpreted as a point entry and a point 904 will be drawn on the image.
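The distance test of decision block 808 may be sketched in software as follows. This is only a minimal illustration; the function name and the 10-pixel default are assumptions made for the sketch, not part of the disclosed implementation:

```python
import math

# Illustrative default matching the 10-pixel threshold described above;
# per the disclosure, this value may be adjustable manually or automatically.
TAP_THRESHOLD_PIXELS = 10.0

def is_point_entry(start, end, threshold=TAP_THRESHOLD_PIXELS):
    """Return True if the gesture from start to end should be treated as a
    point entry (area-point mode) rather than a traced line.

    start, end: (x, y) pixel coordinates where the input device touched
    and left the touch-enabled display.
    """
    dx = end[0] - start[0]
    dy = end[1] - start[1]
    # Euclidean distance between press and release, compared to the threshold.
    return math.hypot(dx, dy) < threshold
```

For example, a press at (0, 0) released at (3, 4) moves only 5 pixels and would be interpreted as a point entry, whereas a 10-pixel drag would not.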
The method 800 then proceeds to decision block 812, where it is decided whether additional points are needed to make a measurement on the image. If additional points are needed, the method proceeds to block 814, where a user touches and releases the displayed image at a different location. Note that in this branch of method 800, the measurement software is in area-point mode, so all entries will be interpreted as points and, when an input is detected, a point will be drawn on the image in block 810 regardless of the distance between a start point and end point of the input. If no additional points are needed to make a measurement in decision block 812, the method 800 proceeds to block 816, where a user selects a ‘Done’ button in the bedside controller GUI to exit area-point mode. In block 818, the measurement software creates an area measurement using the entered points. For example, in an embodiment directed toward vessel measurement, the measurement software connects the entered points to create a bounding circle at the vessel's outer edge. In one embodiment, the measurement software uses the entered points as seed points to assist edge detection algorithms.
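One simple way to turn the entered points of block 818 into an area value is the shoelace formula over the points connected in order. This is a minimal sketch under that assumption; as described above, a real system may instead fit a bounding contour or use the points as seeds for edge detection:

```python
def polygon_area(points):
    """Area enclosed by connecting the entered points in order (shoelace
    formula). Units are square pixels; an actual system would convert to
    physical units using the image calibration.

    points: list of (x, y) tuples entered in area-point mode.
    """
    n = len(points)
    if n < 3:
        return 0.0  # Fewer than three points enclose no area.
    area2 = 0.0
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]  # Wrap around to close the boundary.
        area2 += x1 * y2 - x2 * y1
    return abs(area2) / 2.0
```

For instance, four points at the corners of a unit square yield an area of 1.0 square pixel.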
With reference back to decision block 808, if the distance between the start point and the end point is greater than or equal to the threshold, the method 800 proceeds to decision block 820, where the measurement software determines whether the drawn line is “relatively straight.” That is, it determines whether the user desires to measure a diameter with a line or an area with an enclosed shape. As shown in FIG. 10, to make such a determination, the measurement software compares intervening points on the traced line between a start point 1000 and an end point 1002 against a boundary threshold 1004. If all intervening points are within the boundary threshold 1004, the measurement software determines that the user desires to make a diameter measurement and transforms the traced line into a straight line 1006 extending from the start point to the end point. The diameter measurement is thus based on the length of the straight line 1006. In alternative embodiments, however, the measurement software may employ different methods for determining whether the user desires to make a diameter measurement or an area measurement, such as detecting whether intervening points between the start and end points increase in distance from the start point before decreasing in distance from the start point, or detecting whether the traced line extending through the start point, at least one intervening point, and the end point is arcuate past a threshold degree. At decision block 820, if the user's traced line is relatively straight, the method proceeds to block 822, where the measurement software enters diameter mode and outputs a measurement of the straight line 1006 created between the start and end points. If, however, the traced line is not relatively straight, the method 800 proceeds to block 818, where the measurement software enters area-draw mode. As shown in FIG. 11, the traced line 1100 between start point 1102 and end point 1104 extends outside of a boundary threshold (not shown) and is thus not relatively straight, prompting the measurement software to enter area-draw mode. Once this determination is made, the software connects the start and end points to create an unbroken bounding line 1006 from which an area may be calculated. After an area measurement has been made in block 818 (either in area-point mode or area-draw mode), the method proceeds to decision block 824, where it is determined whether another measurement needs to be made. If so, the method proceeds back to block 802, where a user selects another start point on the image without first selecting a measurement mode. If all measurements have been completed, the method 800 ends.
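The “relatively straight” determination of decision block 820 may be sketched as a test of each intervening point's perpendicular distance from the chord joining the start and end points. The function names and the boundary value below are illustrative assumptions, not the disclosed implementation:

```python
import math

def max_deviation(points):
    """Largest perpendicular distance of any intervening point from the
    straight chord joining the first and last points of the traced line."""
    (x1, y1), (x2, y2) = points[0], points[-1]
    length = math.hypot(x2 - x1, y2 - y1)
    if length == 0:
        return 0.0  # Degenerate trace: start and end coincide.
    worst = 0.0
    for (px, py) in points[1:-1]:
        # Point-to-line distance via the cross-product formula.
        d = abs((x2 - x1) * (y1 - py) - (x1 - px) * (y2 - y1)) / length
        worst = max(worst, d)
    return worst

def classify_trace(points, boundary=8.0):
    """Return 'diameter' if every intervening point lies within the boundary
    threshold of the chord (diameter mode), else 'area-draw'."""
    return 'diameter' if max_deviation(points) <= boundary else 'area-draw'
```

A trace that bows out by only a pixel or two would be straightened into a diameter line, while a wide arc around a vessel border would trigger area-draw mode.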
It is understood that the methods 700 and 800 illustrated in the flowcharts of FIGS. 7 and 8 may, in alternative embodiments, be performed in a different order and may include different and/or additional blocks. For example, workflows for some medical sensing procedures may allow for additional measurement modes, such as volumetric measurements. According to the described aspects of the present disclosure, a user may initiate any such additional measurement modes without first selecting a measurement mode, thus simplifying the workflow. Further, the steps in methods 700 and 800 described above may be completed over the course of more than one patient visit to a catheter lab.
Although illustrative embodiments have been shown and described, a wide range of modification, change, and substitution is contemplated in the foregoing disclosure and, in some instances, some features of the present disclosure may be employed without a corresponding use of the other features. For example, in some embodiments, the touch-enabled integrated bedside controllers 102 and 300 may be used to control and measure non-cardiovascular diagnostic data such as data from cranial or peripheral arteries, as well as data from non-vascular body portions. Further, the controllers 102 and 300 may be used to control an MRI workflow and measure MRI image data, or may be utilized in computer assisted surgery (CAS) applications. Further, the modules described above in association with the bedside controller 300 may be implemented in hardware, software, or a combination of both. And the bedside controller may be designed to enable user control in many different network settings such as ad-hoc networks, local area networks, client-server networks, wide area networks, and internets, and the controller may have a number of form factors such as a tablet, a smartphone, a laptop, or any other similar device. It is understood that such variations may be made in the foregoing without departing from the scope of the present disclosure. Accordingly, it is appropriate that the appended claims be construed broadly and in a manner consistent with the scope of the present disclosure.