US10545582B2 - Dynamic customizable human-computer interaction behavior - Google Patents

Dynamic customizable human-computer interaction behavior

Info

Publication number
US10545582B2
Authority
US
United States
Prior art keywords
display
images
rate
behavior model
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US15/347,099
Other versions
US20170060270A1 (en)
Inventor
Evan K. Fram
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Merative US LP
Original Assignee
Merge Healthcare Solutions Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Merge Healthcare Solutions Inc
Priority to US15/347,099
Assigned to D.R. SYSTEMS, INC. Assignment of assignors interest (see document for details). Assignors: FRAM, EVAN K.
Publication of US20170060270A1
Assigned to MERGE HEALTHCARE SOLUTIONS INC. Nunc pro tunc assignment (see document for details). Assignors: D.R. SYSTEMS, INC.
Assigned to MERGE HEALTHCARE SOLUTIONS INC. Affidavit concerning change in patent ownership. Assignors: D.R. SYSTEMS, INC.
Application granted
Publication of US10545582B2
Assigned to INTERNATIONAL BUSINESS MACHINES CORPORATION. Assignment of assignors interest (see document for details). Assignors: MERGE HEALTHCARE SOLUTIONS INC.
Assigned to MERATIVE US L.P. Assignment of assignors interest (see document for details). Assignors: INTERNATIONAL BUSINESS MACHINES CORPORATION
Legal status: Active
Anticipated expiration

Abstract

Systems and methods are disclosed for customizing the behavior of a computing system based on details of a user's interactions with the system, such as the direction, intensity, or magnitude of a particular input from a user input device.

Description

CROSS-REFERENCE TO RELATED APPLICATIONS
This application is a continuation of U.S. patent application Ser. No. 14/318,437, filed Jun. 27, 2014, which is a continuation of U.S. patent application Ser. No. 13/331,651, filed Dec. 20, 2011, now U.S. Pat. No. 8,797,350, which application claims the benefit of priority under 35 U.S.C. § 119(e) of U.S. Provisional Application No. 61/425,156, filed Dec. 20, 2010. All of these applications are hereby incorporated by reference herein in their entireties and for all purposes.
BACKGROUND
There are many situations in which users employ computing systems to view information where it is important that the users accurately and efficiently interpret that information. Current computing systems are capable of presenting information at rates that exceed a user's perceptual ability to accurately interpret the information presented, resulting in errors. This problem will increase in severity as computer processing power and bandwidth continue to increase.
SUMMARY
For purposes of this summary, certain aspects, advantages, and novel features are described herein. It is to be understood that not necessarily all such advantages may be achieved in accordance with any particular embodiment. Thus, for example, those skilled in the art will recognize that the invention may be embodied or carried out in a manner that achieves one advantage or group of advantages as taught herein without necessarily achieving other advantages as may be taught or suggested herein.
Disclosed herein are systems and methods for operating a computing device so that the speed of information presentation is matched to the user's preferences and/or to various factors that may impact the user's ability to accurately and efficiently interpret it, including properties of the visual information, properties of the display device, environmental factors, characteristics of the user, such as expertise and fatigue, and any other factors that may be relevant to the user's review of the information.
In one embodiment, a computing system comprises one or more hardware processors configured to execute software instructions stored in modules and a tangible computer readable medium storing modules configured for execution by the one or more hardware processors. In one embodiment, the modules include a display module configured to display medical images on a display device of the computing system and a control module configured to: access a data structure storing conditions associated with respective behavior models; identify one or more of the conditions that are matched by one or more characteristics associated with exam viewing, the computing system, an environment surrounding the computing system, the display device, bandwidth available to the computing system, the medical images, a patient, a medical history of the patient, an input device, and/or a user of the computing system, wherein at least one of the identified conditions is associated with a display behavior including two or more rates of display of medical images associated with respective levels of interaction with an input device; receive data from an input device indicating a level of interaction with the input device; determine a rate of display of medical images based on an association in the at least one of the identified conditions between the level of interaction and one of the two or more rates of display of medical images; and display the medical images at the determined rate.
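The control-module flow described above (match conditions to a behavior model, then map the level of interaction with the input device to a display rate) can be summarized in a short sketch. The sketch below is illustrative only and is not the claimed implementation; all names, thresholds, and rates are assumptions introduced for the example.

```python
# Illustrative sketch, not the patented implementation: pick a behavior from
# matched conditions, then map the level of interaction to a display rate.
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple

@dataclass
class Behavior:
    rate_by_level: Dict[float, float]  # interaction-level threshold -> images/second

    def rate_for(self, level: float) -> float:
        rate = 0.0
        for threshold, images_per_second in sorted(self.rate_by_level.items()):
            if level >= threshold:
                rate = images_per_second
        return rate

# Each condition pairs a predicate over viewing characteristics with a behavior.
CONDITIONS: List[Tuple[Callable[[dict], bool], Behavior]] = [
    (lambda ctx: True, Behavior({5: 4.0, 15: 10.0, 30: 20.0})),   # default behavior
    (lambda ctx: ctx.get("modality") == "CTA",
     Behavior({5: 2.0, 15: 5.0, 30: 10.0})),                      # slower for CTA exams
]

def determine_display_rate(ctx: dict, interaction_level: float) -> float:
    """Use the behavior of the last matching condition to map input to a rate."""
    behavior = None
    for predicate, candidate in CONDITIONS:
        if predicate(ctx):
            behavior = candidate
    return behavior.rate_for(interaction_level) if behavior else 0.0

# A CTA exam with the input device held at a level of 20 -> 5 images/second.
print(determine_display_rate({"modality": "CTA"}, 20))
```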
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a block diagram illustrating one embodiment of a computing system that may be used to implement certain systems and methods described herein.
FIG. 2 illustrates an input device in which the user tilts a knob and the degree of the tilt is provided as input to a computing device.
FIG. 3 is a graphical representation of information within an interaction behavior model, graphing the relationship between image navigation speed in images/second vs. the angle of tilt of an input device, such as the one shown in FIG. 2.
FIG. 4 shows examples of information within one or more interaction behavior models, mapping user input from an input device to the speed of image navigation on a computing device.
FIG. 5a illustrates information within an interaction behavior model.
FIG. 5b illustrates information within an interaction behavior model.
FIG. 5c illustrates an interaction behavior model that maps input to an output that can be interpreted by a computing device.
FIG. 6 is a block diagram depicting one embodiment of a system using the interaction behavior model.
FIG. 7 illustrates a computing device with a display screen and a sensor that detects the tilt of the device and depicts how the device can be tilted by the user to provide input using an embodiment of the interaction behavior model.
FIG. 8 shows different modes of user input: tilting, rotation, and translation.
FIG. 9 illustrates a mouse that includes two buttons and a wheel.
FIG. 10 illustrates a handheld computing device such as a smartphone, PDA or tablet computer that includes a touch screen.
FIG. 11 illustrates a user interacting with a tablet computer or display with a touch screen.
FIG. 12 illustrates one embodiment of an interaction behavior model that may be used to implement certain systems and methods described herein.
FIG. 13 displays six image frames as might be displayed on a display device, where each image frame displays an image from a different series from within one or more medical imaging exams.
FIG. 14 shows aspects of an embodiment of an interaction behavior model used with digital pathology.
FIG. 15a shows aspects of embodiments of an interaction behavior model in which a region of an image has been marked by CAD.
FIG. 15b shows aspects of embodiments of an interaction behavior model in which a region of an imaging volume has been marked by CAD.
FIG. 16 shows aspects of an embodiment of an interaction behavior model in which a region of a mammo tomosynthesis exam has been marked by CAD.
These and other features will now be described with reference to the drawings summarized above. The drawings and the associated descriptions are provided to illustrate certain embodiments of the invention and not to limit the scope of the invention. Throughout the drawings, reference numbers may be re-used to indicate correspondence between referenced elements.
DETAILED DESCRIPTION
Embodiments of the disclosure will now be described with reference to the accompanying figures, wherein like numerals refer to like elements throughout. The terminology used in the description presented herein is not intended to be interpreted in any limited or restrictive manner, simply because it is being utilized in conjunction with a detailed description of certain specific embodiments of the disclosure. Furthermore, embodiments of the disclosure may include several novel features, no single one of which is solely responsible for its desirable attributes or which is essential to practicing the embodiments of the disclosure herein described.
As used herein, the terms “viewer” and “user” are used interchangeably to describe an individual (or group of individuals) that interfaces with a computing device. Users may include, for example, doctors, radiologists, hospital staff, or other individuals involved in acquisition, analysis, storage, management, or other tasks related to medical images. Any discussion herein of user preferences should be construed to also, or alternatively, include user group preferences, site preferences, system preferences, and/or default software preferences.
Depending on the embodiment, the methods described with reference to the flowcharts, as well as any other methods discussed herein, may include fewer or additional blocks and/or the blocks may be performed in a different order than is illustrated. Software code configured for execution on a computing device in order to perform the methods may be provided on a computer readable medium, such as a compact disc, digital video disc, flash drive, hard drive, memory device or any other tangible medium. Such software code may be stored, partially or fully, on a memory of a computing device (e.g., RAM, ROM, etc.), such as the computing system 150 (see discussion of FIG. 1, below), and/or other computing devices illustrated in the figures, in order to perform the respective methods. For ease of explanation, the methods will be described herein as performed by the computing system 150, but the methods are not limited to performance by the computing system 150 and should be interpreted to include performance by any one or more of the computing devices noted herein and/or any other suitable computing device.
Images
In the fields of radiology, cardiology, and pathology, for example, physicians often view a large amount of imaging information and it is critical that they accurately interpret the imaging information to make an accurate diagnosis. In addition, there are many other fields where accurate and efficient interpretation of imaging information is important, such as baggage screening, satellite imaging, seismic imaging used in oil and gas exploration, and surveillance video.
Medical imaging exams can be acquired by a number of different medical imaging techniques, including computed tomography (CT), magnetic resonance imaging (MRI), ultrasound, nuclear medicine, positron emission computed tomography (PET), digital angiography, mammography, computed radiography, digital radiography, fluoroscopy, and others such as images generated in medical pathology and endoscopy. A variety of computing systems are used to manage medical images, including storage, distribution, analysis, processing and display. These computing systems include Picture Archive and Communication Systems (PACS), Digital Pathology Systems, Cardiovascular Information Systems, Computer Aided Diagnosis Systems (CAD), 3D Processing systems, Electronic Medical Record (EMR) systems, standalone software for display of medical images, web based Personal Health Record (PHR) systems and other systems that manage medical imaging exams, such as online physician portals.
As described below, physicians and others utilize computing devices, herein referred to as information display computing devices, to view information. Information display computing devices can come in many forms and can be single computing devices or combinations of computing devices, including dedicated PACS workstations, Electronic Medical Record Systems, general purpose computing systems, computer tablets, and/or handheld devices such as smartphones.
Medical imaging exams often include a large number of images. For example, a computed tomography (CT) exam may include hundreds or thousands of images. Because it is usually impractical to view all of an exam's images simultaneously, the display of an information display computing device typically displays a fraction of the total number of images at once and allows the user to interact with the information display computing device to display other images or other portions of an image being displayed.
In addition, a medical image may include too many pixels to be displayed at the full or desired resolution on an information display computing device. For example, a single digital pathology image could include a trillion pixels, vastly exceeding the display resolution of a 1 megapixel monitor. The mismatch between the pixel size of the image and the pixel size of the display device requires the user to interact with the computing device to display various portions of the image.
Medical imaging exams are often divided into series, where a series comprises a group of one or more images acquired or displayed using a certain technique. Images within a series may differ in anatomic position or time of acquisition, for example. For example, in a computed tomography exam, one series might include 250 contiguous 0.6 mm thick images of the brain obtained without administration of intravenous contrast material. Another series might include 250 contiguous 0.6 mm thick images of the brain obtained during administration of intravenous contrast material. A brain MRI might include multiple series acquired with different technical parameters, possibly including images acquired before administration of intravenous contrast material and images acquired after administration of intravenous contrast material. For example, a brain MRI might include the following series: sagittal T1, axial T1, axial FLAIR, axial T2, axial diffusion, coronal gradient echo and post-contrast axial, sagittal and coronal T1 weighted images.
Separate image series may be displayed in separate display frames on a display device, e.g., as illustrated in FIG. 13 where six series from a brain MRI are displayed as they might appear on an information display computing device. Depending on the input device and user preference, a number of methods can be used to allow the user to change the image within the series that is displayed in an image frame. The images within a series are typically numbered, for example 1 to 100 in a series with 100 images.
While much of the discussion herein refers to display of medical images, the systems and methods disclosed herein are not limited to such images. In fact, the systems and methods discussed herein may be applied to any type of information that is presented visually and controlled by any input device. For example, the systems and methods discussed herein may be used with images related to baggage screening, satellite imaging, seismic imaging used in oil and gas exploration, surveillance video, and/or any other type of images. Additionally, while visual information may be in the form of images, visual information may be in other forms. For example, visual information may be in graphical form, such as EKG and EEG information. Visual information may also be in the form of text, such as medical laboratory results, medical exam reports or documents utilized in legal proceedings. Visual information may also be presented in other forms or as a mixture of various forms, for example a multimedia web page comprising text, graphics, images, and/or animations. The systems and methods described here may be applied to any visual information presented on a computing device. Thus, any reference herein to medical images should be construed to cover other embodiments involving other image types.
Example Features of Certain Embodiments
In some embodiments, users may interact with computing systems via a wide range of input devices to control the presentation of information and/or to perform other functions. For any manipulation of a particular input device by a user, there are many potential actions that a computing device might take. For example, an input device might be utilized by a user to control the display of visual information on a computing device. In some embodiments, the systems and methods discussed herein modify the behavior of the computing device as a function of user input, accounting for factors such as properties of the visual information displayed, properties of the display device, environmental factors, user preference, and/or characteristics of the user such as expertise and fatigue.
In addition, users may interact with computing systems via a wide range of input devices to control the presentation of information or to perform other functions using computing devices. For any manipulation of a particular input device by a user, there are many potential actions that a computing device might take. For example, a user moving a joystick-type device 5 degrees to the right of center could result in many different actions. For example, it might result in serial presentation of images at a rate of 5 images/second, 10 images/second, or 20 images/second.
There is a need for better methods of matching a user's manipulation of an input device to the resulting action, methods that account for a variety of factors, including user preferences, properties of the device, and/or various factors related to the activity being controlled by the input device. For example, in the case where the input device is used to control a physical action, it may be useful for the mapping of user input to the activity controlled by the computing device to be adjustable based on user preference, environmental factors, and/or characteristics of the user such as expertise and fatigue.
As used herein, the term “interaction behavior model” describes a model, algorithm, and/or other logic that may be used to customize display of information on a computing device, customize the display of information as a function of user input, and/or customize activity controlled by a computing device as a function of user input. An interaction behavior model may be used to control the speed of presentation of information presented by a computing device independent of the input device. In other embodiments, an interaction behavior model could be used to control how input devices result in actions performed by computing devices. In other embodiments, an interaction behavior model could be used to control how input from input devices is interpreted by computing devices. Interaction behavior models may access any number of inputs in order to determine how the computing device displays data and/or interacts with a user.
Various adjustments may be made by a computing system applying an interaction behavior model in response to user navigation input (e.g., via an input device). For example, adjustments may include:
    • Which image or portion of an image is displayed.
    • Image display characteristics such as brightness/contrast, window/level, magnification, panning.
    • Display parameters such as image view angle for computer generated 3D volumetric images and other renderings.
    • Spatial position and direction in 3D volumetric endoluminal fly through imaging as used in virtual colonography.
    • Parameters used in various types of image rendering, e.g., location and/or angle of a reconstruction plane in multiplanar reconstruction (MPR) and maximum intensity projection (MIP) reconstruction.
    • Spatial position and/or view angle in computer generated virtual environments.
In addition, an interaction behavior model could be used to control how user input via an input device is mapped into physical actions controlled by a computing device such as a machine or vehicle. For example, in various embodiments an interaction behavior model could be used to control how user input via a foot pedal or other input device controls the speed or steering of a car, boat, aircraft, spacecraft, submarine, robot or drone. Interaction behavior models may also be used in videogames and computer simulations, for example.
Example Factors
As noted above, various factors may be used by the computing device to customize operations performed by the computing device using an interaction behavior model, such as how the computing device responds to input received from one or more input devices from a particular user in a particular environment. For example, when a certain condition is met, e.g., one or more factors match a rule associated with a condition (see FIG. 5 and beyond), the effect of particular user inputs (e.g., movements of an input device) may be customized based on behaviors and/or modifiers associated with the condition. Thus, conditions may include any one or more thresholds of a factor (e.g., time of day is before 8 am or user has more than 10 years experience reading a particular exam type) and/or an indication of whether or not a factor is present (e.g., the image has not been previously viewed). Use of conditions that are based on various factors is discussed in further detail below.
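As a rough sketch of this kind of condition matching, the snippet below evaluates the example thresholds just mentioned (before 8 am, more than 10 years of experience, image not previously viewed) against a set of factor values. The factor names are placeholders, not terms from the patent.

```python
# Sketch of condition matching over factors; field names are assumptions.
from datetime import time

def matched_conditions(factors: dict) -> dict:
    """Return which example conditions are met for the given factor values."""
    return {
        "before_8am": factors.get("time_of_day", time(12, 0)) < time(8, 0),
        "experienced_reader": factors.get("years_experience", 0) > 10,
        "unviewed_image": not factors.get("previously_viewed", False),
    }

print(matched_conditions({"time_of_day": time(7, 30),
                          "years_experience": 12,
                          "previously_viewed": False}))
# -> {'before_8am': True, 'experienced_reader': True, 'unviewed_image': True}
```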
Below are several factors that may be used in customizing interaction behavior models and/or selected parameters of interaction behavior models. Any one or more of these factors, and/or any other factors, may be used in customizing actions of a computing device in accordance with the systems and methods described herein.
    • Factors related to the user
      • Level of expertise, e.g., lower level of expertise may require longer viewing times of medical imaging exams
      • Level of fatigue, e.g., number of hours worked that day, time of day, measures of user fatigue.
      • User preference.
    • Factors related to exam viewing
      • Complete read vs. viewing with a specific goal, e.g., a radiologist performing a diagnostic read on a brain MRI may prefer to view it at a slower rate than a neurosurgeon that has read the neuroradiologist's report and is primarily interested in the size and location of an intracranial hematoma for surgical decision making.
      • Viewing for complete read vs. for comparison purposes, e.g., a radiologist doing a primary interpretation of a chest CT may prefer to view it at a slower rate than a prior comparison exam that has already been interpreted.
      • Whether or not the current image has been viewed by the user. For example, one embodiment of an interaction behavior model might cap the image display rate at 5 images/second for unviewed images, but allow an image display rate of 20 images/second for images that have been viewed by the user.
    • Environmental factors
      • Room noise that could distract the user.
      • Ambient light that could make viewing of images more difficult.
    • Factors related to the display device
      • Spatial resolution.
      • Contrast resolution.
      • Background luminance and luminance range.
      • Brightness.
    • Characteristics of the computing device and network bandwidth
      • Bandwidth, e.g., a user may prefer a constant lower rate of image display than an irregular rate of display of sequential images that could be caused by a low bandwidth connection used to retrieve the images.
      • Computational speed.
      • Memory available for image preloading.
    • Factors related to the images being viewed
      • Image size in pixels (larger number of pixels may contain more information)
      • Signal to Noise Ratio (SNR)
      • The type of imaging exam being viewed, e.g., MRI, CT, nuclear medicine, etc.
      • Fraction of the image that is not homogeneous.
    • Factors related to the patient or other information related to the images
      • Clinical indication for the exam, such as evaluation of metastatic disease which may indicate that the probability of abnormalities is higher than a routine screening exam.
      • Prior exams. For example, in one embodiment, the interaction behavior model may be used to automatically slow the display rate of images in anatomic regions where abnormalities were identified on prior exams.
      • Computer Aided Diagnosis (CAD). For example, in one embodiment, the interaction behavior model may utilize CAD information or other forms of image analysis to slow display in regions flagged as suspicious. For example, CAD data indicating possible lesions in the liver may cause the interaction behavior model to slow down viewing of images including the liver, even if the user is providing the same input for proceeding through the images. For example, CAD data indicating possible lesions in virtual colonography may cause the interaction behavior model to slow down the rate of change of spatial position and direction in 3D volumetric endoluminal fly through imaging as used in virtual colonography.
    • Factors related to the way images are presented to the user
      • Image display size, e.g., larger size may require greater search time due to limited perceptual field of view.
      • Image display parameters, e.g., a greater fraction of an abdominal CT image may be relevant when displaying images with soft tissue windows compared to bone windows.
      • Use of image coupling where more than one image is changing on the screen, requiring the user to divide his attention between images
    • Input device characteristics
      • Precision with which the user can manipulate the display device, e.g., it can be difficult for users to precisely manipulate a joystick to a particular position but positioning it to within a range may be relatively easy
The mapping of physical actions to computer behavior could be customized. For example, one user might prefer a forward tilt to increment image number while another might prefer the opposite. One user might want a forward tilt to navigate superiorly within the body regardless of how the patient was scanned.
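As a hypothetical sketch of this per-user customization, the helper below chooses whether a forward tilt increments or decrements the image number, either from a simple preference flag or from the scan direction so that a forward tilt always navigates superiorly; the parameter names and defaults are assumptions made for illustration.

```python
# Hypothetical sketch of per-user direction mapping; not from the patent text.
def image_step(tilt_forward: bool, prefer_forward_increments: bool = True,
               navigate_anatomically: bool = False,
               scanned_superior_to_inferior: bool = True) -> int:
    """Return +1 or -1: the change applied to the displayed image number."""
    if navigate_anatomically:
        # Assume image 1 is the most superior slice when the scan runs
        # superior-to-inferior, so moving superiorly means decrementing.
        forward_step = -1 if scanned_superior_to_inferior else 1
    else:
        forward_step = 1 if prefer_forward_increments else -1
    return forward_step if tilt_forward else -forward_step

print(image_step(tilt_forward=True, prefer_forward_increments=False))  # -1
print(image_step(tilt_forward=False, navigate_anatomically=True))      # 1
```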
Example Computing System
FIG. 1 is a system diagram which shows the various components of a system 100 for displaying information utilizing certain systems and methods described herein. As shown, the system 100 may include an information display computing device 150 (also referred to herein as a “computing device 150”) and may include other systems, including those shown in FIG. 1.
The information display computing device 150 may take various forms. In one embodiment, the information display computing device 150 may be a computer workstation having information display modules 151. In other embodiments, modules 151 may reside on another computing device, such as a web server, and the user directly interacts with a second computing device that is connected to the web server via a computer network. The modules 151 will be described in detail below.
In one embodiment, the information display computing device 150 comprises a server, a desktop computer, a workstation, a laptop computer, a mobile computer, a smartphone, a tablet computer, a cell phone, a personal digital assistant, a gaming system, a kiosk, an audio player, or any other device that utilizes a graphical user interface, including office equipment, automobiles, airplane cockpits, household appliances, automated teller machines, self-service checkouts at stores, information and other kiosks, ticketing kiosks, vending machines, industrial equipment, and/or a television, for example.
The information display computing device 150 may run an off-the-shelf operating system 154 such as Windows, Linux, MacOS, Android, or iOS. The information display computing device 150 may also run a more specialized operating system which may be designed for the specific tasks performed by the computing device 150.
The information display computing device 150 may include one or more computing processors 152. The computer processors 152 may include central processing units (CPUs), and may further include dedicated processors such as graphics processor chips, or other specialized processors. The processors generally are used to execute computer instructions based on the information display software modules 151 to cause the computing device to perform operations as specified by the modules 151. The modules 151 may include, by way of example, components, such as software components, object-oriented software components, class components and task components, processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables. For example, modules may include software code written in a programming language, such as, for example, Java, JavaScript, ActionScript, Visual Basic, HTML, C, C++, or C#. While “modules” are generally discussed herein with reference to software, any modules may alternatively be represented in hardware or firmware. Generally, the modules described herein refer to logical modules that may be combined with other modules or divided into sub-modules despite their physical organization or storage.
The information display computing device 150 may also include memory 153. The memory 153 may include volatile data storage such as RAM or SDRAM. The memory 153 may also include more permanent forms of storage such as a hard disk drive, a flash disk, flash memory, a solid state drive, or some other type of non-volatile storage.
The information display computing device 150 may also include or be interfaced to one or more display devices 155 that provide information to the users. Display devices 155 may include a video display, such as one or more high-resolution computer monitors, or a display device integrated into or attached to a laptop computer, handheld computer, smartphone, computer tablet device, or medical scanner. In other embodiments, the display device 155 may include an LCD, OLED, or other thin screen display surface, a monitor, television, projector, a display integrated into wearable glasses, or any other device that visually depicts user interfaces and data to viewers.
The information display computing device 150 may also include or be interfaced to one or more input devices 156 which receive input from users, such as a keyboard, trackball, mouse, 3D mouse, drawing tablet, joystick, game controller, touch screen (e.g., capacitive or resistive touch screen), touchpad, accelerometer, video camera and/or microphone.
The information display computing device 150 may also include one or more interfaces 157 which allow information exchange between the information display computing device 150 and other computers and input/output devices using systems such as Ethernet, Wi-Fi, Bluetooth, as well as other wired and wireless data communications techniques.
The modules of the information display computing device 150 may be connected using a standard based bus system. In different embodiments, the standard based bus system could be Peripheral Component Interconnect (“PCI”), PCI Express, Accelerated Graphics Port (“AGP”), Micro Channel, Small Computer System Interface (“SCSI”), Industrial Standard Architecture (“ISA”) and Extended ISA (“EISA”) architectures, for example. In addition, the functionality provided for in the components and modules of the information display computing device 150 may be combined into fewer components and modules or further separated into additional components and modules.
The information display computing device 150 may communicate and/or interface with other systems and/or devices. In one or more embodiments, the computing device 150 may be connected to a computer network 190. The computer network 190 may take various forms. It may include a wired network or a wireless network, or it may be some combination of both. The computer network 190 may be a single computer network, or it may be a combination or collection of different networks and network protocols. For example, the computer network 190 may include one or more local area networks (LAN), wide area networks (WAN), personal area networks (PAN), cellular or data networks, and/or the Internet.
Various devices and subsystems may be connected to the network 190. For example, one or more medical scanners may be connected, such as MRI scanners 120. The MRI scanner 120 may be used to acquire MRI images from patients, and may share the acquired images with other devices on the network 190. The network 190 may also include one or more CT scanners 122. The CT scanners 122 may also be used to acquire images and, like the MRI scanner 120, may then store those images and/or share those images with other devices via the network 190. Any other scanner or device capable of inputting or generating information that can be displayed as images or text could be included, including ultrasound, angiography, nuclear medicine, radiography, endoscopy, pathology, dermatology, etc.
Also connected to the network 190 may be a Picture Archiving and Communications System (PACS) 136 and PACS workstation 138.
Also connected to the network 190 may be an interaction behavior model data structure 160 used to store interaction behavior models. In various embodiments, the interaction behavior model data structure 160 may reside within the PACS System 136, reside within a server accessible on a LAN that is local to the information display computing device 150, and/or reside within a server that is located remote to the information display computing device 150 and accessible via the Internet. In other embodiments, the interaction behavior model data structure 160 may reside locally, within the information display computing device 150. Interaction behavior model information may be stored in any computer readable format such as a database, flat file, table, or XML file, and may be stored on any computer readable medium, such as volatile or non-volatile memory, compact disc, digital video disc, flash drive, or any other tangible medium.
The PACS System 136 may be used for the storage, retrieval, distribution and presentation of images (such as those created and/or generated by the MRI scanner 120 and CT scanner 122). The medical images may be stored in an independent format, an open source format, or some other proprietary format. One format for image storage in the PACS system is the Digital Imaging and Communications in Medicine (DICOM) format. The stored images may be transmitted digitally via the PACS system, often reducing or eliminating the need for manually creating, filing, or transporting film.
The network 190 may also be connected to a Radiology Information System (RIS) 140. The radiology information system 140 may be a computerized data storage system that is used by radiology departments to store, manipulate and distribute patient radiological information.
Also attached to the network 190 may be an Electronic Medical Record (EMR) system 142. The EMR system 142 may be configured to store and make accessible to a plurality of medical practitioners computerized medical records. Also attached to the network 190 may be a laboratory information system 144. The laboratory information system 144 may be a software system which stores information created or generated by clinical laboratories. Also attached to the network 190 may be a digital pathology system 146 used to digitally manage and store information related to medical pathology.
Also attached to the network 190 may be a computer aided diagnosis system (CAD) 148 used to analyze images. In one embodiment, the CAD 148 functionality may reside in a computing device separate from the information display computing device 150 while in another embodiment the CAD 148 functionality may reside within the information display computing device 150.
Also attached to the network 190 may be a 3D Processing System 149 used to perform computations on imaging information to create new views of the information, e.g., 3D volumetric display, Multiplanar Reconstruction (MPR) and Maximum Intensity Projection (MIP) reconstruction. In one embodiment, the 3D processing functionality may reside in a computing device separate from the information display computing device 150 while in another embodiment the 3D processing functionality may reside within the information display computing device 150.
In other embodiments, other computing devices that store, provide, acquire, and/or otherwise manipulate medical data may also be coupled to the network 190 and may be in communication with one or more of the devices illustrated in FIG. 1, such as with the information display computing device 150.
As will be discussed in detail below, the information display computing device 150 may be configured to interface with various networked computing devices in order to provide efficient and useful review of medical examination data that is stored among the various systems present in the network. In other embodiments, the information display computing device 150 may be used to display non-medical information.
Depending on the embodiment, the other devices illustrated in FIG. 1 may include some or all of the same components discussed above with reference to the information display computing device 150.
Input Devices
There are embodiments of the interaction behavior model in which the user controls presentation of visual information by interacting with an input device interfaced to an information display computing device. While embodiments will be described using input devices in which the user physically interacts with the input device, the systems and methods described herein can be applied to any input method including:
    • Devices that rely on direct physical interaction, including devices or portions of devices that the user rotates, tilts, presses, lifts up or down, squeezes, translates and/or touches, either with a body part or device such as a stylus.
    • Input methods that rely on measurement of a user's muscular or neural electrical activity.
    • Input methods that sense a user's position, configuration or motion, including body parts such as extremities, eyes, and face, for example using a video camera.
    • Input methods that rely on detection of sound, including recognition of voice commands.
FIG. 2 illustrates an input device 200 that utilizes input related to tilt. In this embodiment, the device illustrated has a base 221 and a knob 222 that is manipulated by the user. Knob 222 could instead be attached to something other than a base, such as a large input device, panel or dashboard. In this example, the user may tilt the knob to the left or right, illustrated by arrows 225 and 226, but in other embodiments the input device could accept user input in other ways, for example with translation, rotation, etc., as discussed below.
FIG. 3 is a graph illustrating image display rate versus tilt angle of knob 222 with respect to base 221 (FIG. 2). In this embodiment, the information display computing device is configured to display a series of images at a rate that is controlled by the tilt that the user applies to the input device 200.
FIG. 2 and FIG. 3 illustrate changes in the display rate of images as the user tilts the knob 222 to the right. Although not illustrated, the same or different behavior might occur when the user tilts the knob 222 to the left. For example, tilting the knob 222 to the right of midline 230 might increment the number of the image displayed while tilting the knob to the left of midline might decrement it, where the degree of tilt affects the rate of display of images. Thus, in this embodiment the degree of tilt off of midline determines the rate of display of images, while the direction of tilt determines a direction of movement within a series of images, such as whether the currently displayed image number should be increased (e.g., to move to a next image in an image series) or decreased (e.g., to move to a prior image in the image series). In other embodiments, other input device controls can be used in similar manners to determine the display rate of images and the direction of movement between images.
As illustrated in FIG. 2, the speed of image presentation would be determined by the degree of tilt from the midline 230. A tilt between positions 230 and 231 would result in no change in the image displayed. A tilt between positions 231 and 232 would result in serial presentation of images at a rate of 4 images/second, between 232 and 233 a rate of 10 images/second, and beyond 233 a rate of 20 images/second.
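A minimal sketch of this piecewise mapping is shown below. The text does not give degree values for positions 231, 232, and 233, so the threshold angles are assumptions; only the rates (no change, 4, 10, and 20 images/second) come from the description above.

```python
def display_rate(tilt_degrees: float) -> float:
    """Images/second as a step function of tilt from midline 230 (angles assumed)."""
    if tilt_degrees < 5:      # between positions 230 and 231: no image change
        return 0.0
    if tilt_degrees < 15:     # between positions 231 and 232
        return 4.0
    if tilt_degrees < 30:     # between positions 232 and 233
        return 10.0
    return 20.0               # beyond position 233

print([display_rate(a) for a in (2, 10, 20, 40)])  # [0.0, 4.0, 10.0, 20.0]
```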
The graph of FIG. 3 includes two different mappings between tilt angle and the action of the information display computing device. In particular, line 241 shows the mapping between tilt angle, shown on the horizontal axis, and image display rate, shown on the vertical axis. Line 242 shows another mapping in which the display rate is slower and the transition between display rates occurs at tilt angles that are different than those utilized within the mapping shown in line 241. In various embodiments, different mappings within an interaction behavior model might be chosen manually, for example as a user preference, or automatically, for example based on image size, image content, user fatigue, or other factors as discussed herein.
FIG. 4 illustrates four graphs that each illustrate exemplary mappings of image display rate as a function of tilt angle for a device such as the one shown in FIG. 2. In each of the mappings, illustrated by graphs 350, 360, 370, and 380, there are two lines illustrating different interaction behaviors that may be present in an interaction behavior model, but any number of behaviors may be present.
FIG. 5a shows another example of an embodiment of an interaction behavior model. In this case the information is represented as textual information rather than graphically as in FIG. 3 and FIG. 4. In this embodiment the interaction behavior model includes three components: behavior settings 390, modifier settings 392, and conditions 394. In this embodiment, the interaction behavior model defines two different behaviors B1 and B2 in the behavior settings 390. In this example, the behaviors B1 and B2 define rates of displaying images in response to a tilt angle of an input device. In other embodiments, behaviors may define any other display characteristic and/or feature in response to other defined inputs from one or more input devices. In this example, the modifier settings 392 include four modifiers M1-M4 that may be applied to one of the behaviors B1 or B2 in response to meeting certain of the conditions C1-C6 in the conditions settings 394. Thus, in one embodiment the conditions are rules that are applied to determine which of the behaviors should be applied and when the selected behavior should be modified by one of the modifiers. For example, condition C1 indicates that B1 is the default behavior; thus, the display speed defined by B1 is used as a default when this interaction behavior model is selected. However, condition C2 indicates that if the exam from which images are displayed is a CTA exam, behavior B2 is to be used to define the display speed of images. Thus, the display speed may be changed based on the type of images that are displayed. Condition C3 indicates that a modifier (modifier M1) of the currently selected behavior (e.g., B1 by default or B2 if the exam is a CTA) to decrease the display rate by 20% is applied in response to display of an image that is greater than 1 megapixel. Accordingly, the behaviors may be modified in response to characteristics of the image being displayed. The remaining conditions in FIG. 5a illustrate other example criteria by which different behaviors may be selected and modifiers to the selected behaviors applied. In other embodiments, fewer, more or different components may be used.
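One way the three components of FIG. 5a could be represented in data is sketched below. Only a few concrete values appear in the description (e.g., M1 decreases the rate by 20% for images larger than 1 megapixel, M2 decreases it by 50%), so the remaining rates, thresholds, and conditions are placeholders rather than the patent's actual settings.

```python
# Hedged sketch of behavior settings 390, modifier settings 392, and conditions
# 394; values not given in the text are placeholders.
BEHAVIORS = {
    "B1": {5: 4, 15: 10, 30: 20},   # tilt-angle threshold -> images/second (assumed)
    "B2": {5: 2, 15: 5, 30: 10},
}
MODIFIERS = {
    "M1": lambda rate: rate * 0.8,  # decrease display rate by 20%
    "M2": lambda rate: rate * 0.5,  # decrease display rate by 50%
}
CONDITIONS = [
    # (predicate over the viewing context, action to take when it matches)
    (lambda ctx: True,                           ("select", "B1")),  # C1: default behavior
    (lambda ctx: ctx.get("exam_type") == "CTA",  ("select", "B2")),  # C2: CTA exams
    (lambda ctx: ctx.get("megapixels", 0) > 1,   ("modify", "M1")),  # C3: large images
    (lambda ctx: ctx.get("history") == "cancer", ("modify", "M2")),  # C6: cancer history
]
```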
FIG. 5b illustrates example conditions that may be used in an interaction behavior model that limits the drive speed of a user operated vehicle. As noted above, interaction behavior models may be used in other contexts beyond image display. FIG. 5b illustrates one such example, but countless other uses of such interaction behavior models are within the scope of this disclosure.
In the example of FIG. 5b, conditions C1-C4 each cap the maximum speed of the vehicle based on different criteria. In particular, condition C1 caps the speed at 10 mph if the GPS determines that the vehicle is at a particular location (e.g., a strip mine in the example of FIG. 5b), condition C2 caps the speed at 8 mph if the driver has less than 2 years experience or has worked more than 8 hours, condition C3 caps the speed at 5 mph if sensors determine that it is dark outside, and condition C4 caps the speed at 5 mph if the mine conditions are designated as dangerous, which may be determined based on a data structure that is updated to include such information in response to user and/or automated sensor input. Thus, the conditions of FIG. 5b are based on vehicle speed, user conditions and experience, environmental conditions, and current conditions defined by a third party. In other embodiments, any other criteria may be used to set maximum vehicle speed (or any other characteristic).
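A short sketch of this kind of rule set follows, assuming (as one reasonable reading of FIG. 5b) that when several conditions match, the most restrictive cap wins. The field names are assumptions introduced for the example.

```python
# Sketch of FIG. 5b-style speed limits: each matching condition contributes a
# cap and the lowest cap is applied.
def max_speed_mph(ctx: dict) -> float:
    caps = [float("inf")]
    if ctx.get("location") == "strip_mine":                                  # C1
        caps.append(10)
    if ctx.get("driver_years", 99) < 2 or ctx.get("hours_worked", 0) > 8:    # C2
        caps.append(8)
    if ctx.get("dark_outside", False):                                       # C3
        caps.append(5)
    if ctx.get("mine_conditions") == "dangerous":                            # C4
        caps.append(5)
    return min(caps)

print(max_speed_mph({"location": "strip_mine", "driver_years": 1,
                     "hours_worked": 9, "dark_outside": False}))  # 8
```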
FIG. 5c illustrates sample data 650 and graph 652 that correlate input from an input device into various outputs that could be interpreted by a computing device. In this example, three associations between input and outputs are indicated as behavior 1, behavior 2, and behavior 3, illustrated in tabular form in data 650 and in graph 652. As described in other embodiments, the choice of which of the output behaviors is used could be based on one or more factors as described herein. In addition, these behaviors could be modified based on one or more factors as described herein.
FIG. 6 is a flowchart illustrating one embodiment of a method of implementing an interaction behavior model. In one embodiment, the method of FIG. 6 is performed by the information display computing device 150 of FIG. 1. For ease of explanation, the method of FIG. 6 is described herein with reference to the information display computing device 150, with the understanding that in other embodiments the method may be performed by any other suitable computing device. Depending on the embodiment, the flowchart of FIG. 6 may include additional or fewer blocks and/or the blocks may be performed in a different order than is illustrated.
At Initialization block 705, information relevant to the use of the interaction behavior model is accessed and/or retrieved, such as information regarding the user and his preferred interaction behavior model. For example, a particular user might prefer the interaction behavior model shown in FIG. 5a, while a different user might prefer a different interaction behavior model or a similar one with different default settings.
One or more interaction behavior models may be stored in the interaction behavior model data structure 160 shown in FIG. 1 and/or stored, for example, within the information display computing device 150. User preferences could also be stored in the interaction behavior model data structure 160 or in another data structure, for example within the PACS System 136 of FIG. 1.
In other embodiments, there are no user specific preferences, but default interaction behavior models that could be determined by the manufacturer of the information display computing device or, for example, set by a site to apply to all users or to groups of users. For example, behavior B1 in behavior settings 390 (FIG. 5a) might be applied as the default behavior to radiologists in training at a site while behavior B2 might be applied by default to staff radiologists.
In block 710, with a behavior selected (e.g., based on conditions, such as conditions 394 of FIG. 5a), the computing system determines if any modifiers should be applied to the selected behavior and/or if a different behavior should be selected. In the example of FIG. 5a, this is done via the conditions C2-C6, which indicate application of modifiers and/or changes to the current behavior in response to the indicated conditions. Using the conditions of FIG. 5a, for example, if the patient has cancer or is suspected of having cancer, condition C6 (section 394 in FIG. 5a) would result in the application of modifier M2, which would slow the image display rate by 50%. Behaviors may be selected and/or customized based on any one or more of the factors listed above and/or any other factors, and may be applied using a condition/rule system such as in FIG. 5a, or in other manners.
While some factors such as clinical information may not change during the course of viewing the imaging exam, other factors may change dynamically, such as:
    • The size of the image being displayed (relevant to condition C3 in the example illustrated in FIG. 5a).
    • Whether a nearby image has been marked as positive by a Computer Aided Detection (CAD) system (relevant to condition C4 in the example illustrated in FIG. 5a).
    • Image display parameters (relevant to condition C5 in the example illustrated in FIG. 5a).
Thus, in some embodiments the behavior and/or modifiers to behaviors are adjusted in real time as a user views medical images. For example, if a user initially views images that are less than 1 megapixel, condition C3 is not triggered, but if the user moves to an image in an image series or otherwise displays an image that is 1.1 megapixels, condition C3 may be immediately triggered such that the image is displayed with modifier M1 applied. Information regarding any other factors may be accessed and/or retrieved in block 710, such as the factors listed above and/or any other factors.
In block 715 user input is received via any available input devices and/or user interactions. For example, in the embodiment of FIG. 2, user input would include the tilt angle the user has applied to knob 222 and may include other input, for example from a keyboard, mouse, or other buttons on the input device with the tilt knob illustrated in FIG. 2. As noted above, the input device of FIG. 2 is provided only as an example of an input device; any other input device, even an input device that doesn't require direct user contact, such as a motion detection camera or microphone, may be used in place of the tilt knob.
In block 730, the information display computing device 150 applies the selected interaction behavior model, including any modifications, in order to determine the appropriate action. For example, in the embodiment of FIG. 5a this may be accomplished by applying the conditions sequentially to determine the behavior and any modifiers that would be mapped to the received user input. For example, consider an MRI of the brain in a patient with a clinical history of lung cancer when the user is applying a tilt angle of 11 degrees to the knob of the input device illustrated in FIG. 2. Stepping through the conditions sequentially in the example illustrated in FIG. 5a:
    • Condition C1 is always applied initially so behavior B1 is initially selected.
    • Condition C2 is false as the exam is an MRI not a CTA, so behavior B1 is still selected.
    • Condition C3 is false as the MRI images are <1 megapixel/image.
    • Condition C4 is false as CAD was not utilized.
    • Condition C5 is false as the modality is not CT.
    • Condition C6 is true as the history is cancer, so modifier M2 is applied to the currently selected behavior B1.
Having stepped through the conditions in this example, the information display computing device 150 would determine that behavior B1 with modifier M2 is to be applied to the received user input. Applying a tilt angle of 11 degrees to behavior B1 maps to a display rate of 10 images/sec. Applying modifier M2 (activated by condition C6) decreases the display rate by 50%, resulting in a display rate of 5 images/second.
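The same walk-through can be expressed as a brief sketch. The only values taken from the text are that an 11-degree tilt maps to 10 images/second under behavior B1 and that modifier M2 halves the rate; the surrounding thresholds are assumed for illustration.

```python
# Sketch of the worked example above; thresholds other than the stated ones are assumed.
def b1_rate(tilt_degrees: float) -> float:
    if tilt_degrees < 5:
        return 0.0
    if tilt_degrees < 15:
        return 10.0
    return 20.0

def apply_model(ctx: dict, tilt_degrees: float) -> float:
    rate = b1_rate(tilt_degrees)        # C1 selects B1; C2-C5 are false for this exam
    if ctx.get("history") == "cancer":  # C6 is true
        rate *= 0.5                     # M2: decrease display rate by 50%
    return rate

print(apply_model({"modality": "MRI", "history": "cancer"}, 11))  # 5.0
```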
In block 740 the determined action is performed. In this example, the information display computing device 150 also considers the direction of the tilt of the knob, left vs. right, to determine whether the image to be displayed within the series is to be incremented or decremented. Assuming the knob is tilted to the right, the image number within the series is incremented and the resulting image is displayed for 200 milliseconds (the determined rate of 5 images/second) before a new image is displayed. The logic then loops back to block 710 to repeat the process.
In other embodiments the user has the option of overriding the interaction behavior model, for example by pressing a key that returns to the default behavior.
Image Navigation Using Other Input Devices
With the device shown in FIG. 2, the input device senses the degree of tilt of a knob manipulated by the user. Tilt input could also be sensed in other ways, for example by devices that utilize accelerometers, including handheld controllers such as the Wiimote made by Nintendo, and computing devices with internal sensors such as the iPhone and iPad made by Apple. However, other technology, including cameras and other sensors, may be used to sense the physical positions of objects or body parts. The embodiments described herein may be used with any devices and/or methods that are capable of sensing the desired user input, such as tilt in the embodiment of FIG. 2.
FIG. 7 illustrates a device 310 that includes a display screen and a sensor that detects the device orientation. By tilting the device, the user can provide tilt information to the device in a way that is analogous to tilting the knob of the input device illustrated in FIG. 2.
In view 320 of FIG. 7, device 310 is shown in a particular position, theta degrees from horizontal. In one embodiment, the user may indicate to the computing device that the current tilt of device 310 is to be assigned as a neutral position, for example by pressing a button on the device or touching a control on the touch screen. When in this neutral position, the computing device would not change images, which is analogous to a midline position of the knob in FIG. 2. Tilting the device to angles different than theta, such as is shown in views 322 and 324, would be similar to tilting the knob of the device shown in FIG. 2. For example, tilting the device 310 into positions 322 or 324 may be equivalent to tilting the knob of the device shown in FIG. 2 to the left or the right, respectively, causing an increase or decrease in the number of the image displayed within the series, where the degree of tilt from the neutral position may determine the rate of incremental image change, as described previously.
In another embodiment, the actual tilt may be mapped to different images in the series. For example, in a series with 90 images, positioning the device horizontally, such as shown in view 324, may cause display of image 1, and positioning the device vertically might cause display of image 90, with intermediate angles causing display of intermediate images in a linear fashion.
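A minimal sketch of this linear angle-to-image mapping, assuming the device reports its tilt in degrees from horizontal, is:

```python
# Sketch of the linear mapping described above: 0 degrees -> image 1,
# 90 degrees -> image 90, intermediate angles mapped linearly.
def image_for_tilt(tilt_degrees: float, num_images: int = 90) -> int:
    fraction = max(0.0, min(1.0, tilt_degrees / 90.0))
    return 1 + round(fraction * (num_images - 1))

print(image_for_tilt(0), image_for_tilt(45), image_for_tilt(90))  # 1 45 90
```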
In other embodiments the interaction behavior model may be used with left and right tilting of thedevice310. In other embodiments, other input devices that are manipulated in other ways, for example as illustrated inFIG. 8, may be used in conjunction with interaction behavior models.
FIG. 8 illustrates another example input device, in particular, aknob401. Inview410 ofFIG. 8,knob401 of an input device may be tilted left-right and/or forward-backward, as illustrated by the curved arrows above theknob401. The systems and methods described herein may be applied to one or both of these two axes of tilt.
Inview420, a different input mode is shown, where theknob401 may be twisted clockwise and counterclockwise, where the twisting motion may be independent from the motions shown inview410. The systems and methods described herein may be applied to rotation, where angle or rotation is treated in a way that is similar to angle of tilt, as previously described.
In view 430, the knob 401 may be translated by the user forward-backward and/or left-right. The systems and methods described herein may be applied to these directions as well.
In view 440, the knob 401 may be translated by the user superiorly/inferiorly, perpendicular to the direction of translation shown in view 430. The systems and methods described herein may be applied to this input mode as well.
In the devices shown in FIG. 8, as the user translates, tilts, and/or rotates the knob of the input device from a neutral position to change the angle of tilt, angle of rotation, and/or degree of translation, the direction, timing, and/or other factors of image viewing (or other computer functionality) may be adjusted according to an interaction behavior model. Depending on the embodiment, the input device may include one or a combination of the movement capabilities discussed with reference to knob 401.
FIG. 9 illustrates movements of a mouse 501 that can be used to control a user interface for viewing images. In one embodiment, the user holds down a button 502 on the mouse 501 and moves the mouse 501 forward (motion indicated by arrow 505) or backward (motion indicated by arrow 506) on a surface to cause images to change. Typically, a fixed increment in physical mouse position results in a fixed change in the number of the image displayed. For example, forward motion of 2 mm might result in an increment in image number and backward motion of 2 mm might result in a decrement in image number. Movement of a mouse wheel 503 may provide similar inputs that are usable to provide image navigation commands.
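As a sketch, the fixed mapping from mouse travel to image number described above might look like the following; the 2 mm step comes from the example, and the names are illustrative.

```python
MM_PER_IMAGE = 2.0  # from the example above: 2 mm of mouse travel per image step

def image_step_from_mouse(delta_mm):
    """Convert forward (+) / backward (-) mouse travel in millimeters into an
    image-number change; the mapping is fixed, independent of mouse speed."""
    return round(delta_mm / MM_PER_IMAGE)
```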
Another system that may make use of interaction behavior models is a touch screen that is separate from, or integrated with, a display device. For example, FIG. 10 illustrates a handheld computing device 521 and FIG. 11 illustrates a larger tablet computer 601 with an integrated display and touchpad. With reference to FIG. 10, a stylus 522 may be moved in the directions illustrated by arrows 525 and 526 in order to navigate between images. Similarly, a finger could be utilized instead of a stylus, for example in the case of a smartphone or the tablet computer 601.
As illustrated in FIGS. 9, 10 and 11, a user can move the mouse 501, roll the mouse wheel 503, move a trackball, or move a stylus or finger on the touchpad of device 521 or device 601 in order to provide user input that can be mapped by an interaction behavior model, for example to navigate through images. In such embodiments, an incremental change in the degree of rotation of a trackball, or incremental movement of the mouse position or of a finger or stylus on a touch screen, causes the information display computing device to change the image displayed (and/or other display characteristics, such as the speed of changing between images).
FIG. 12 includes four graphs 450, 460, 470, 480 that illustrate inputs and outputs of an interaction behavior model over time. In particular, FIG. 12 illustrates mouse position in graph 450, mouse speed in graph 460, image number in graph 470, and image change rate in graph 480, each with reference to a common time period and user interaction with the mouse. Graph 450 shows an example of mouse position over time, for example along the direction of movement indicated by arrow 505 of FIG. 9. In this example, the user moves the mouse forward first at a slow rate (segment 451), then at a medium rate (segment 452), and finally at a fast rate (segment 453). The rate of movement for these three time periods is shown in graph 460 as mouse speeds 461, 462, and 463.
The line segments 471, 472, and 473 in graph 470 indicate the number of the image that would be displayed from a series of images as a function of time as a result of the mouse movement graphed in graph 450, where a fixed incremental movement of mouse position results in a fixed incremental change in image number. The rate of change of image number corresponding to image number graph 470 is shown in graph 480, where image change rate 481 corresponds to segment 471, image change rate 482 corresponds to segment 472, and image change rate 483 corresponds to segment 473.
In one embodiment, an interaction behavior model may be applied to cap the maximum rate of image change, illustrated in graph 480 by the "m" on the vertical axis. For example, a behavior and/or a modifier may be defined that sets a maximum image change rate of m. Therefore, the image change rate 483 would be reduced to the image change rate 484 (which is equivalent to m). This would result in the change in image number over time illustrated in graph 470, where the third segment of the graph would be segment 474 rather than segment 473.
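A sketch of the cap, assuming the raw rate is derived from mouse speed as in graphs 460 and 480; the value of m and the names are placeholders.

```python
MAX_IMAGE_RATE = 10.0  # "m": hypothetical cap on image change rate, images/second

def capped_image_rate(mouse_speed_mm_per_s, mm_per_image=2.0, max_rate=MAX_IMAGE_RATE):
    """Derive an image-change rate from mouse speed, then apply the behavior
    model's cap so the fast segment (rate 483) is limited to rate 484 = m."""
    raw_rate = mouse_speed_mm_per_s / mm_per_image  # images/second implied by the input
    return min(raw_rate, max_rate)
```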
While the embodiment of FIG. 12 was described using incremental movement of a mouse, in other embodiments other input devices could be used, including input on a tablet by finger or stylus, rotation of a trackball, rotation of a mouse wheel or knob, and other input devices that sense a change in position or orientation.
In one embodiment, an input device, such as a joystick, may be used to provide commands. A joystick, or other device wherein movement returns to an original home position when the user releases the device, may be used in a similar manner as discussed above with interaction behavior models. Such devices may detect one or more of tilt, translation, and/or rotation of the joystick.
Example Digital Pathology
FIG. 14 illustrates computer monitor 680 displaying a pathology image, for example from a Pathology Information System. Such a pathology image may be much larger than can be displayed at full resolution on a computer monitor. For example, a digital pathology image as might be acquired using a digital slide scanner, represented by image 690, might be 100,000×100,000 pixels in size. However, the display resolution of a computer monitor, represented by monitor 680, might be on the order of 1,000×1,000 pixels.
If pathology image 690 is to be displayed at full resolution on monitor 680, only a fraction of image 690 may be displayed at any one time. Rectangle 692 represents that portion of image 690 displayed on monitor 680, a viewport into the image. The portion being viewed may be changed interactively by the user, as represented by the black arrows along the sides of viewport 692.
In one embodiment, the user utilizes a knob 685 (or any other input device), shown with four black arrows indicating that the user can translate the knob. When the computing device senses translation of input device 685, that input is used to translate the position of viewport 692 within image 690, allowing the user to interactively display different portions of image 690.
Just as it is possible for users to display images at rates that exceed their perceptual ability to accurately interpret them, it would be possible for a user to translate the monitor viewport 692 at rates that exceed his ability to accurately interpret the information being displayed on monitor 680. Therefore, the systems and methods described herein may be used to map the input from the user input device 685 into the speed of translation of viewport 692.
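One way such a mapping might be sketched, with an assumed pixel-per-second ceiling on viewport translation; all names and values are illustrative.

```python
import math

def pan_viewport(viewport_xy, input_dx, input_dy, dt, max_speed_px_per_s=800.0):
    """Translate the viewport by the requested amount, but scale the motion down
    if it would exceed a maximum pan speed intended to keep the content interpretable."""
    requested_speed = math.hypot(input_dx, input_dy) / dt   # pixels per second (dt > 0 assumed)
    scale = 1.0 if requested_speed <= max_speed_px_per_s else max_speed_px_per_s / requested_speed
    x, y = viewport_xy
    return (x + input_dx * scale, y + input_dy * scale)
```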
In other embodiments, other input device modes, such as the up-down translation shown in view 440 of FIG. 8, may be used to change the magnification of the image displayed on monitor 680. This allows pan and zoom of an image to be controlled by a single input device, with up-down translation controlling zoom and left-right/forward-backward translation controlling pan.
There are many fields where image size exceeds the display resolution of the computing device, and in other embodiments other types of images may be used, such as satellite imagery, telescope imagery, seismic imagery, and mammography.
Example 3D and Image Rendering
In medical imaging and other areas, imaging data may be processed to produce 3D or other rendered images. In medical imaging, imaging information may be processed to create 2D images in planes other than the plane in which the images were acquired. Volumetrically acquired imaging information, for example from CT, MRI, and ultrasound, may be processed to create 2D or 3D images, including 3D volume renderings, surface renderings, multiplanar reformatted (MPR) images, and maximum intensity projection (MIP) images. In some cases, 3D volume rendered images may be used to visualize the internal surfaces of structures, such as endoluminal imaging of the GI tract with CT colonography as well as endoluminal imaging of airways, vessels, and other structures.
Embodiments described herein may be used with these and/or other types of images. For example, the speed with which a user travels through a 3D volume rendered structure, such as the inside of the colon with CT colonography, may be controlled with the illustrative systems and methods described herein. For example, the speed with which a user traverses images of a colon may be modified as a function of the user's expertise, the roughness of the internal surface of the colon, and/or the presence of regions marked as suspicious or abnormal by a computer aided diagnosis (CAD) system, as discussed below.
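A sketch of how such modifiers might be combined multiplicatively; the specific factors and thresholds below are assumptions chosen only for illustration.

```python
def traversal_speed(base_mm_per_s, expertise_factor, surface_roughness, near_cad_mark):
    """Illustrative combination of modifiers on endoluminal fly-through speed."""
    speed = base_mm_per_s * expertise_factor        # e.g. 1.2 for an expert, 0.8 for a trainee
    speed *= max(0.3, 1.0 - surface_roughness)      # rougher inner surface -> slower traversal
    if near_cad_mark:
        speed *= 0.5                                # slow down near CAD-flagged regions
    return speed
```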
Example Computer Aided Diagnosis (CAD)
Imaging information may be analyzed by computers to detect regions of interest. In medical imaging this is known as Computer Aided Diagnosis (CAD). Examples include the detection of cancers in mammograms, detection of lung nodules in chest CT, detection of polyps in CT colonography, and detection of abnormal cells in pap smears.
Generally, CAD systems are not sufficiently accurate to make a final diagnosis, but rather detect suspicious areas for further scrutiny by the expert human reader. Marks of some sort may be superimposed on the images to indicate to the human reader the regions marked as suspicious by the CAD system. These marks could be placed at the suspicious locations in the originally acquired images, for example in mammography, or rendered images, for example 3D or MPR images in the case of CT colonography.
In one embodiment, an interaction behavior model may be used with CAD to modify the display of various types of visual information. For example, in the case of a user viewing a series of 2D images such as a chest CT, the rate of image display could be slowed in the regions marked by CAD, whether or not marks are displayed on the images to indicate the regions marked by CAD. For example, in a series of images, the maximum image display rate (and/or the required magnification level, contrast level, etc.) could be cut by 50% for images within 10 mm of an image that has one or more regions within it marked by CAD. Depending on the embodiment, such changes in display rate might be expressed in modifiers that are responsive to the indicated conditions in an interaction behavior model. In another example, the maximum image display rate could be modified as a result of an interaction behavior model for images within a certain distance of a region marked by CAD, regardless of the image's 3D spatial orientation.
FIG. 15a illustrates a series of images 820-829, each representing a plane of imaging through a region of anatomy. The thickness of the anatomy depicted in the images could vary, from contiguous slices (no gap between the slices), to slices that overlap, to slices having a gap between them. For the purpose of illustration, image 824 is shown with an associated CAD marker 830. By way of example, images might be acquired at 1 mm intervals and an interaction behavior model might be selected (and/or modified by one or more modifiers) to:
    • Cap the display rate at 2 images/second for images within 2 mm of a slice with a CAD mark.
    • Display images with a CAD mark for a minimum of 1 second.
With this example behavior and assuming 1 mm slice spacing, display of images 822, 823, 825, and 826 would be capped at a display rate of 2 images/second, and image 824 would be displayed for a minimum of 1 second when the user displayed those images.
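The two example rules might be expressed as a small function like the following sketch, where the slice indices, spacing, and function name are assumed inputs.

```python
def display_limits(index, cad_marked_indices, slice_spacing_mm=1.0):
    """Return (max_rate_images_per_s, min_display_s) for a slice, following the
    example rules above: cap at 2 images/s within 2 mm of a CAD-marked slice,
    and show a CAD-marked slice for at least 1 second. None means no rate cap."""
    if index in cad_marked_indices:
        return None, 1.0
    near_mark = any(abs(index - m) * slice_spacing_mm <= 2.0 for m in cad_marked_indices)
    return (2.0, 0.0) if near_mark else (None, 0.0)
```

With image 824 marked and 1 mm spacing, images 822, 823, 825, and 826 fall within the 2 mm band and are rate-capped, matching the example above.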
An embodiment of an interaction behavior model may be used in mammography tomosynthesis, as illustrated in FIG. 16. FIG. 16 illustrates a series of images 850 of different positions of a breast, and a larger view of a subset 855 of those images. Marker 856 is displayed by the computing device and indicates a location marked as suspicious by CAD within image 857. As discussed above, an embodiment of the interaction behavior model may be configured to alter the display of images in proximity to images marked by CAD, in this example image 857. By way of example, images in the exam that are within a 10 mm distance of an image marked by CAD, image 857 in this example, might be displayed for a minimum of 2 seconds. In other embodiments, the minimum display time or maximum image display rate might be applied to these images only when they are displayed for the first time to the user. In other embodiments, other characteristics of image display might be changed for images in proximity to images with CAD markers, e.g., magnification, color mapping, brightness, and/or contrast. Other embodiments can also be applied to other types of imaging where CAD may be utilized, e.g., breast MRI, chest CT, and/or CT colonography.
In the case of reformatted images, for example thick-slice MIP applied to chest CT, an embodiment may be used to control the rate of movement of the plane of reconstruction through the imaging volume. For example, the rate of movement of the plane could be capped at 5 mm/second when the reconstruction plane is within 20 mm of a region in the imaging volume marked by CAD.
FIG. 15b shows aspects of embodiments of an interaction behavior model in which a region of an imaging volume has been marked by CAD. For example, a CAD marker 831 is shown in a spatial position within the superior aspect of the imaging volume 840. A slab of the imaging volume being reconstructed by thick-slice MIP or another technique is shown as shaded volume 841. For the example above, the speed at which the user could move the location of reconstruction slab 841 would be capped at 5 mm/second when the reconstruction slab was within 20 mm of the region marked by CAD, in this example region 831.
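A sketch of that cap, assuming slab and marker positions are measured along the same axis in millimeters; the function name is illustrative.

```python
def slab_speed_cap(slab_position_mm, cad_positions_mm, requested_mm_per_s):
    """Cap the reconstruction-plane speed at 5 mm/s when the slab is within
    20 mm of any CAD-marked position (values taken from the example above)."""
    near_cad = any(abs(slab_position_mm - p) <= 20.0 for p in cad_positions_mm)
    return min(requested_mm_per_s, 5.0) if near_cad else requested_mm_per_s
```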
In the case of 3D volume rendering, for example endoluminal imaging such as CT colonography, an embodiment may be used to control visual display and 3D rendering. For example, CAD may be used to mark suspicious regions on the inner surface of the colon, and it is important that the user be aware of these regions. In one embodiment, the rate at which the user traverses the colon via endoluminal 3D volume rendering may be automatically controlled by the presence of CAD markers or locations marked as suspicious by CAD, whether or not markers are displayed, for example slowing the rate of "movement" when the region being viewed is in proximity to a CAD mark. It is possible for an area of interest marked by CAD to be hidden from view because of the complexity of the inner surface of the colon, for example requiring the user to look sideways or backwards. In one example, the user is not permitted to move further than 2 cm beyond the CAD mark until the marked region has been displayed on the computing device. This may require the user to change his view within the colon so that he "looks" sideways or backwards.
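The 2 cm gating rule could be sketched as follows, with positions assumed to be measured along the centerline path in millimeters and the names chosen only for illustration.

```python
def allowed_camera_position(requested_mm, cad_mark_mm, cad_region_viewed):
    """Hold fly-through progress at 2 cm past an unviewed CAD mark; once the
    marked region has been displayed, the requested position is allowed."""
    limit = cad_mark_mm + 20.0            # 2 cm beyond the mark, in millimeters
    if not cad_region_viewed and requested_mm > limit:
        return limit
    return requested_mm
```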
In another embodiment, interaction behavior related to image translation might be controlled by the presence of CAD markers. In the example embodiments described above with reference to FIG. 14, an image 690 could have a CAD marker 693. The rate of translation of viewport 692 could be automatically slowed when it is in proximity to CAD marker 693. In other embodiments, a reader might not be permitted to terminate viewing of image 690 until all CAD markers have been viewed.
Example Control of Machines
In other embodiments, the interaction behavior model may be used to control how user input via an input device is mapped into physical actions controlled by a computing device, such as control of a machine or vehicle, rather than display of information. For example, in various embodiments the interaction behavior model could be used to control how user input via a foot pedal or other input device controls a car, boat, aircraft, spacecraft, submarine, robot, or drone.
For example, FIG. 5b is an embodiment of an interaction behavior model that could be used to control a strip mining truck's speed as a function of user input and several conditions. In this example, the maximum speed of the truck is modified based on a number of factors, including the physical location of the truck, the experience of the driver, the potential for driver fatigue based on the number of hours worked that day, and environmental conditions.
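A sketch of such a modifier stack for the truck example; every factor, threshold, and scaling value below is an assumption used only for illustration.

```python
def truck_speed_limit(base_limit_kph, in_restricted_zone, driver_experience_years,
                      hours_worked_today, poor_visibility):
    """Illustrative maximum-speed calculation combining location, driver experience,
    fatigue risk, and environmental conditions."""
    limit = base_limit_kph
    if in_restricted_zone:
        limit *= 0.6
    if driver_experience_years < 1:
        limit *= 0.8
    if hours_worked_today > 10:      # fatigue risk after a long shift
        limit *= 0.7
    if poor_visibility:
        limit *= 0.5
    return limit
```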
Summary
All of the methods and processes described above may be embodied in, and partially or fully automated via, software code modules executed by one or more general purpose computers. For example, the methods described herein may be performed by an information display computing device and/or any other suitable computing device. The methods may be executed on the computing devices in response to execution of software instructions or other executable code read from a tangible computer readable medium. A tangible computer readable medium is a data storage device that can store data that is readable by a computer system. Examples of computer readable mediums include read-only memory, random-access memory, other volatile or non-volatile memory devices, CD-ROMs, magnetic tape, flash drives, and optical data storage devices.
While the methods described herein are typically implemented as software modules, they may alternatively be embodied in specialized computer hardware or firmware. For example, certain methods, portions of methods, and/or combinations of methods may be implemented in hardware modules comprising programmable units, such as programmable gate arrays (e.g., FPGAs) or application specific integrated circuits (e.g., ASICs). The results of the disclosed methods may be stored in any tangible computer readable medium.

Claims (21)

What is claimed is:
1. A computer-implemented method comprising:
by one or more processors executing program instructions:
accessing a data structure storing:
a first behavior model that indicates an association between a first rate of display of images and a first level of user interaction, and an association between a second rate of display of images and a second level of user interaction, wherein the first rate of display is different from the second rate of display, and
a second behavior model that indicates an association between a third rate of display of images and the first level of user interaction, and an association between a fourth rate of display of images and the second level of user interaction, wherein the third rate of display is different from the first and fourth rates of display; and
in response to an update in an image selected for display on a display device and/or portions of an image selected for display on the display device:
determining whether a computer aided detection indicator was superimposed automatically by a computer on an image in response to a computer-aided diagnosis of the image to mark an area within the image within a predetermined number of images before or after the image or portions of the image selected for display;
in response to determining that a computer aided detection indicator was superimposed by a computer on an image within the predetermined number of images before or after the image or portions of the image, selecting the first behavior model;
in response to determining that a computer aided detection indicator was not superimposed by a computer on an image within the predetermined number of images before or after the image or portions of the image, selecting the second behavior model;
receiving an indication of a level of user interaction with an input device of the computing system;
modifying the selected behavior model by requiring a particular magnification level of the images;
determining a rate of display of images based on the selected behavior model and the indication of the level of user interaction; and
causing display of the images at the determined rate of display.
2. The computer-implemented method of claim 1, wherein the level of user interaction is indicated by at least one of: a rotation, a tilt, a press, a lift, a squeeze, a touch, a motion, a detection by a camera, a measurement of muscular or neural activity, or a sound.
3. A system comprising:
a computer readable storage medium having program instructions embodied therewith; and
one or more processors configured to execute the program instructions to cause the one or more processors to:
access a data structure storing at least:
a first behavior model indicating an association between a first rate of display of images and a first level of interaction indicated by an input device, the first behavior model also indicating an association between a second rate of display of images and a second level of interaction indicated by the input device, wherein the first rate of display is different from the second rate of display;
a second behavior model indicating an association between a third rate of display of images and the first level of interaction indicated by the input device, the second behavior model also indicating an association between a fourth rate of display of images and the second level of interaction indicated by the input device, wherein the third rate of display is different from the first and fourth rates of display;
select a behavior model from at least one of the first or second behavior models based on whether or not an imaging region relative to a current image includes an image superimposed automatically by a computer in response to a computer-aided diagnosis with a computer aided detection indicator to mark an area within the image;
modify the selected behavior model by requiring a particular contrast level of the images; and
in response to receiving an indication of a level of interaction indicated by the input device, determine a rate of display of the images based on an association between the level of interaction and the rate of display of images in the selected behavior model.
4. The system of claim 3, wherein selecting the behavior model based on whether or not an imaging region relative to the current image includes an image superimposed by a computer with a computer aided detection indicator comprises: selecting the first behavior model in response to an image superimposed by a computer with a computer aided detection indicator being within the imaging region; and selecting the second behavior model in response to an image superimposed by a computer with a computer aided detection indicator not being within the imaging region.
5. The system of claim 4, wherein the first rate of display is slower than the third rate of display.
6. The system of claim 4, wherein the first behavior model indicates a relatively lower maximum rate of display for a given level of interaction as compared to the second behavior model.
7. The system of claim 3, wherein the imaging region is defined as a particular number of images from the current image in a series of images.
8. The system of claim 3, wherein the level of interaction is indicated by at least one of: a rotation, a tilt, a press, a lift, a squeeze, a touch, a motion, a detection by a camera, a measurement of muscular or neural activity, or a sound.
9. The system of claim 3, wherein the images comprise a series of images.
10. The system of claim 3, wherein the images comprise two-dimensional renderings of an imaging volume, and wherein rates of display comprise rates of movement of rendered two-dimensional planes through the imaging volume.
11. The system of claim 3, wherein the rate of display of images is determined in real time as the images are displayed.
12. A computer-implemented method comprising:
by one or more processors executing program instructions:
accessing a data structure storing a first behavior model and a second behavior model, wherein:
the first behavior model indicates an association between a first rate of display of images and a first level of interaction,
the first behavior model further indicates an association between a second rate of display of images and a second level of interaction,
the first rate of display is different from the second rate of display,
the second behavior model indicates an association between a third rate of display of images and the first level of interaction,
the second behavior model further indicates an association between a fourth rate of display of images and the second level of interaction, and
the third rate of display is different from the first and fourth rates of display; and
in response to an input from an input device indicating a level of interaction, determining a rate of display of images by at least:
selecting the first behavior model in response to detecting a computer aided detection indicator superimposed automatically by a computer on an image in response to a computer-aided diagnosis of the image to mark an area within the image within a distance of a current image;
selecting the second behavior model in response to not detecting a computer aided detection indicator superimposed by a computer on an image to mark an area within the image within the distance of the current image; and
modifying the selected behavior model by at least one of requiring a particular magnification level of the images or requiring a particular contrast level of the images.
13. The computer-implemented method of claim 12, wherein the level of interaction is indicated by at least one of: a rotation, a tilt, a press, a lift, a squeeze, a touch, a motion, a detection by a camera, a measurement of muscular or neural activity, or a sound.
14. The computer-implemented method of claim 12, wherein the rate of display is determined based on the selected behavior model, wherein the selected behavior model indicates an association between the level of interaction and the rate of display.
15. The computer-implemented method of claim 14, wherein the first rate of display is slower than the third rate of display.
16. The computer-implemented method of claim 14, wherein the determined rate of display and/or the selected behavior model is modified based on a preference of a user viewing the image, a training or expertise of the user viewing the images, and/or a fatigue of the user viewing the images.
17. The computer-implemented method of claim 12, wherein the images comprise a series of images or two-dimensional renderings of an imaging volume.
18. The computer-implemented method of claim 12 further comprising: by one or more processors executing program instructions: displaying the images at the determined rate of display.
19. A computer program product comprising a non-transitory computer readable storage medium having program instructions embodied therewith, the program instructions executable by one or more processors to cause the one or more processors to:
access a data structure storing a first behavior model and a second behavior model, wherein:
the first behavior model indicates an association between a first rate of display of images and a first level of interaction,
the first behavior model further indicates an association between a second rate of display of images and a second level of interaction,
the first rate of display is different from the second rate of display,
the second behavior model indicates an association between a third rate of display of images and the first level of interaction,
the second behavior model further indicates an association between a fourth rate of display of images and the second level of interaction, and
the third rate of display is different from the first and fourth rates of display; and
in response to an input from an input device indicating a level of interaction, determining a rate of display of images by at least:
selecting the first behavior model in response to detecting a computer aided detection indicator superimposed automatically by a computer within an imaging region of a current image in response to a computer-aided diagnosis of the image to mark an area within the imaging region;
selecting the second behavior model in response to not detecting a computer aided detection indicator superimposed by the computer within the imaging region of the current image; and
modifying the selected behavior model by at least one of requiring a particular magnification level of the images or requiring a particular contrast level of the images.
20. The computer program product of claim 19, wherein the program instructions are executable by one or more processors to further cause the one or more processors to: display images at the determined rate of display, wherein images comprise views of a three-dimensional volume rendering, wherein the rate of display is determined by selecting the first behavior model even when the computer aided detection indicator would not be displayed in the views of the three-dimensional volume rendering.
21. The computer program product of claim 20, wherein the program instructions are executable by one or more processors to further cause the one or more processors to: in response to detecting the computer aided detection indicator, require a view including a portion of the three-dimensional volume rendering marked by the computer aided detection indicator to be displayed.
Citations (51)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US5374942A (en)1993-02-051994-12-20Gilligan; Federico G.Mouse and method for concurrent cursor position and scrolling control
US5596699A (en)1994-02-021997-01-21Driskell; Stanley W.Linear-viewing/radial-selection graphic for menu display
US5701424A (en)1992-07-061997-12-23Microsoft CorporationPalladian menus and methods relating thereto
US5943039A (en)1991-02-011999-08-24U.S. Philips CorporationApparatus for the interactive handling of objects
US20020075333A1 (en)2000-12-152002-06-20International Business Machines CorporationProximity selection of selectable items in a graphical user interface
US6549219B2 (en)1999-04-092003-04-15International Business Machines CorporationPie menu graphical user interface
US20030217892A1 (en)2002-02-012003-11-27Michael PerskyCustomized self-checkout system
US20040263475A1 (en)2003-06-272004-12-30Microsoft CorporationMenus whose geometry is bounded by two radii and an arc
US20050204312A1 (en)2004-03-122005-09-15Sebastien RoselProcess for selecting and handling objects in a computer-aided design system
US20050245803A1 (en)2002-03-142005-11-03Glenn Jr William VSystem and method for analyzing and displaying computed tomography data
US20050251755A1 (en)2004-05-062005-11-10PixarToolbar slot method and apparatus
US20060187204A1 (en)2005-02-232006-08-24Samsung Electronics Co., Ltd.Apparatus and method for controlling menu navigation in a terminal
US20070136690A1 (en)2005-12-122007-06-14Microsoft CorporationWedge menu
US20070234224A1 (en)2000-11-092007-10-04Leavitt Joseph MMethod for developing and implementing efficient workflow oriented user interfaces and controls
US20070250793A1 (en)2001-05-182007-10-25Miura Britt SMultiple menus for use with a graphical user interface
US20070274585A1 (en)2006-05-252007-11-29Zhang Daoxian HDigital mammography system with improved workflow
US20080022228A1 (en)2006-07-242008-01-24Samsung Electronics Co., Ltd.User interface device and method of implementing the same
US20080024599A1 (en)2004-11-292008-01-31Katsumi HirakawaImage Display Apparatus
US7327348B2 (en)1996-11-262008-02-05Immersion CorporationHaptic feedback effects for control knobs and other interface devices
US20080046931A1 (en)2006-07-312008-02-21Kevin CorbettApparatus, system and method for secondary navigation options
US7389591B2 (en)2005-05-172008-06-24Gesturetek, Inc.Orientation-sensitive signal output
US20080178090A1 (en)2006-08-282008-07-24Ajay MahajanUniversal Medical Imager
US20080222439A1 (en)2007-03-092008-09-11Yung-Sen LinNotebook battery replacement time-saving method and battery detector thereof
US20080222569A1 (en)2007-03-082008-09-11International Business Machines CorporationMethod, Apparatus and Program Storage Device For Providing Customizable, Immediate and Radiating Menus For Accessing Applications and Actions
US20080235583A1 (en)2007-03-232008-09-25Nokia CorporatioinMethod and System for File Fast-Forwarding and Rewind
US20090187860A1 (en)2008-01-232009-07-23David FleckRadial control menu, graphical user interface, method of controlling variables using a radial control menu, and computer readable medium for performing the method
US20090235201A1 (en)2008-03-112009-09-17Aaron BaalbergenMethods for controlling display of on-screen menus
US20090327964A1 (en)2008-06-282009-12-31Mouilleseaux Jean-Pierre MMoving radial menus
US20100073563A1 (en)2008-09-122010-03-25Christopher PainterMethod and apparatus for controlling an electrical device
US20100100849A1 (en)2008-10-222010-04-22Dr Systems, Inc.User interface systems and methods
US20100214211A1 (en)2009-02-242010-08-26Research In Motion LimitedHandheld electronic device having gesture-based control and a method of using same
US20100235794A1 (en)2009-03-162010-09-16Bas OrdingAccelerated Scrolling for a Multifunction Device
US20100306650A1 (en)2009-05-262010-12-02Pantech Co., Ltd.User interface apparatus and method for user interface in touch device
US20110041077A1 (en)2006-06-052011-02-17Bruce ReinerMethod and apparatus for adapting computer-based systems to end-user profiles
US20110109650A1 (en)2009-10-072011-05-12Hologic, Inc.Displaying Computer-Aided Detection Information With Associated Breast Tomosynthesis Image Information
US20110289161A1 (en)2010-05-212011-11-24Rankin Jr Claiborne RApparatuses, Methods and Systems For An Intelligent Inbox Coordinating HUB
US20120033866A1 (en)*2009-04-162012-02-09Fujifilm CorporationDiagnosis assisting apparatus, diagnosis assisting method, and recording medium having a diagnosis assisting program stored therein
US20120154431A1 (en)2010-12-202012-06-21Dr Systems, Inc.Dynamic customizable human-computer interaction behavior
US20120192108A1 (en)2011-01-262012-07-26Google Inc.Gesture-based menu controls
US8245156B2 (en)2008-06-282012-08-14Apple Inc.Radial menu selection
US8423306B2 (en)2008-05-222013-04-16Microsoft CorporationBattery detection and user experience
US20130212535A1 (en)2012-02-132013-08-15Samsung Electronics Co., Ltd.Tablet having user interface
US8549442B2 (en)2005-12-122013-10-01Sony Computer Entertainment Inc.Voice and video control of interactive electronically simulated environment
US8751948B2 (en)2008-05-132014-06-10Cyandia, Inc.Methods, apparatus and systems for providing and monitoring secure information via multiple authorized channels and generating alerts relating to same
US20140325443A1 (en)2013-04-242014-10-30Samsung Electronics Co., Ltd.Method and apparatus for operating menu in electronic device including touch screen
US20140359456A1 (en)2013-06-042014-12-04Sony CorporationConfiguring user interface (ui) based on context
US20140362056A1 (en)2013-06-092014-12-11Apple Inc.Device, method, and graphical user interface for moving user interface objects
US20150106731A1 (en)2013-10-102015-04-16Facebook, Inc.Positioning of components in a user interface
US20150220218A1 (en)2013-07-102015-08-06Lg Electronics Inc.Mobile terminal and controlling method thereof
US20170038917A1 (en)2008-10-222017-02-09D.R. Systems, Inc.User interface systems and methods
US20170038926A1 (en)2008-10-222017-02-09D.R. Systems, Inc.Pressure sensitive manipulation of medical image data

Patent Citations (57)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US5943039A (en)1991-02-011999-08-24U.S. Philips CorporationApparatus for the interactive handling of objects
US5701424A (en)1992-07-061997-12-23Microsoft CorporationPalladian menus and methods relating thereto
US5374942A (en)1993-02-051994-12-20Gilligan; Federico G.Mouse and method for concurrent cursor position and scrolling control
US5596699A (en)1994-02-021997-01-21Driskell; Stanley W.Linear-viewing/radial-selection graphic for menu display
US7327348B2 (en)1996-11-262008-02-05Immersion CorporationHaptic feedback effects for control knobs and other interface devices
US6549219B2 (en)1999-04-092003-04-15International Business Machines CorporationPie menu graphical user interface
US20070234224A1 (en)2000-11-092007-10-04Leavitt Joseph MMethod for developing and implementing efficient workflow oriented user interfaces and controls
US20020075333A1 (en)2000-12-152002-06-20International Business Machines CorporationProximity selection of selectable items in a graphical user interface
US20070250793A1 (en)2001-05-182007-10-25Miura Britt SMultiple menus for use with a graphical user interface
US20030217892A1 (en)2002-02-012003-11-27Michael PerskyCustomized self-checkout system
US20050245803A1 (en)2002-03-142005-11-03Glenn Jr William VSystem and method for analyzing and displaying computed tomography data
US20040263475A1 (en)2003-06-272004-12-30Microsoft CorporationMenus whose geometry is bounded by two radii and an arc
US20050204312A1 (en)2004-03-122005-09-15Sebastien RoselProcess for selecting and handling objects in a computer-aided design system
US20050251755A1 (en)2004-05-062005-11-10PixarToolbar slot method and apparatus
US20080024599A1 (en)2004-11-292008-01-31Katsumi HirakawaImage Display Apparatus
US20060187204A1 (en)2005-02-232006-08-24Samsung Electronics Co., Ltd.Apparatus and method for controlling menu navigation in a terminal
US7389591B2 (en)2005-05-172008-06-24Gesturetek, Inc.Orientation-sensitive signal output
US20070136690A1 (en)2005-12-122007-06-14Microsoft CorporationWedge menu
US8549442B2 (en)2005-12-122013-10-01Sony Computer Entertainment Inc.Voice and video control of interactive electronically simulated environment
US20070274585A1 (en)2006-05-252007-11-29Zhang Daoxian HDigital mammography system with improved workflow
US20110041077A1 (en)2006-06-052011-02-17Bruce ReinerMethod and apparatus for adapting computer-based systems to end-user profiles
US20080022228A1 (en)2006-07-242008-01-24Samsung Electronics Co., Ltd.User interface device and method of implementing the same
US20080046931A1 (en)2006-07-312008-02-21Kevin CorbettApparatus, system and method for secondary navigation options
US20080178090A1 (en)2006-08-282008-07-24Ajay MahajanUniversal Medical Imager
US20080222569A1 (en)2007-03-082008-09-11International Business Machines CorporationMethod, Apparatus and Program Storage Device For Providing Customizable, Immediate and Radiating Menus For Accessing Applications and Actions
US20080222439A1 (en)2007-03-092008-09-11Yung-Sen LinNotebook battery replacement time-saving method and battery detector thereof
US20080235583A1 (en)2007-03-232008-09-25Nokia CorporatioinMethod and System for File Fast-Forwarding and Rewind
US20090187860A1 (en)2008-01-232009-07-23David FleckRadial control menu, graphical user interface, method of controlling variables using a radial control menu, and computer readable medium for performing the method
US20090235201A1 (en)2008-03-112009-09-17Aaron BaalbergenMethods for controlling display of on-screen menus
US8751948B2 (en)2008-05-132014-06-10Cyandia, Inc.Methods, apparatus and systems for providing and monitoring secure information via multiple authorized channels and generating alerts relating to same
US8423306B2 (en)2008-05-222013-04-16Microsoft CorporationBattery detection and user experience
US8245156B2 (en)2008-06-282012-08-14Apple Inc.Radial menu selection
US8826181B2 (en)2008-06-282014-09-02Apple Inc.Moving radial menus
US20090327964A1 (en)2008-06-282009-12-31Mouilleseaux Jean-Pierre MMoving radial menus
US20100073563A1 (en)2008-09-122010-03-25Christopher PainterMethod and apparatus for controlling an electrical device
US20100100849A1 (en)2008-10-222010-04-22Dr Systems, Inc.User interface systems and methods
US20170038917A1 (en)2008-10-222017-02-09D.R. Systems, Inc.User interface systems and methods
US9081479B1 (en)2008-10-222015-07-14D.R. Systems, Inc.User interface systems and methods
US20170038926A1 (en)2008-10-222017-02-09D.R. Systems, Inc.Pressure sensitive manipulation of medical image data
US10162483B1 (en)2008-10-222018-12-25D.R. Systems, Inc.User interface systems and methods
US10345996B2 (en)2008-10-222019-07-09Merge Healthcare Solutions Inc.User interface systems and methods
US20100214211A1 (en)2009-02-242010-08-26Research In Motion LimitedHandheld electronic device having gesture-based control and a method of using same
US20100235794A1 (en)2009-03-162010-09-16Bas OrdingAccelerated Scrolling for a Multifunction Device
US20120033866A1 (en)*2009-04-162012-02-09Fujifilm CorporationDiagnosis assisting apparatus, diagnosis assisting method, and recording medium having a diagnosis assisting program stored therein
US20100306650A1 (en)2009-05-262010-12-02Pantech Co., Ltd.User interface apparatus and method for user interface in touch device
US20110109650A1 (en)2009-10-072011-05-12Hologic, Inc.Displaying Computer-Aided Detection Information With Associated Breast Tomosynthesis Image Information
US20110289161A1 (en)2010-05-212011-11-24Rankin Jr Claiborne RApparatuses, Methods and Systems For An Intelligent Inbox Coordinating HUB
US8797350B2 (en)2010-12-202014-08-05Dr Systems, Inc.Dynamic customizable human-computer interaction behavior
US20120154431A1 (en)2010-12-202012-06-21Dr Systems, Inc.Dynamic customizable human-computer interaction behavior
US9524080B1 (en)2010-12-202016-12-20D.R. Systems, Inc.Dynamic customizable human-computer interaction behavior
US20120192108A1 (en)2011-01-262012-07-26Google Inc.Gesture-based menu controls
US20130212535A1 (en)2012-02-132013-08-15Samsung Electronics Co., Ltd.Tablet having user interface
US20140325443A1 (en)2013-04-242014-10-30Samsung Electronics Co., Ltd.Method and apparatus for operating menu in electronic device including touch screen
US20140359456A1 (en)2013-06-042014-12-04Sony CorporationConfiguring user interface (ui) based on context
US20140362056A1 (en)2013-06-092014-12-11Apple Inc.Device, method, and graphical user interface for moving user interface objects
US20150220218A1 (en)2013-07-102015-08-06Lg Electronics Inc.Mobile terminal and controlling method thereof
US20150106731A1 (en)2013-10-102015-04-16Facebook, Inc.Positioning of components in a user interface

Non-Patent Citations (118)

* Cited by examiner, † Cited by third party
Title
AGFA HealthCare, color brochure "IMPAX 6: Digital Image and Information Management," © 2012 Agfa HealthCare N.V. Downloaded from http://www.agfahealthcare.com/global/en/he/library/libraryopen?ID=32882925. Accessed on Feb. 9, 2015.
AGFA HealthCare, IMPAX 6.5 Datasheet (US)2012. © 2012 Agfa HealthCare N.V. Downloaded from http://www.agfahealthcare.com/global/en/he/library/libraryopen?ID=37459801. Accessed on Feb. 9, 2015.
AMD Technologies, Inc., Catella PACS 5.0 Viewer User Manual (112 pgs), © 2010, AMD Technologies, Inc. (Doc. 340-3-503 Rev. 01). Downloaded from http://www.amdtechnologies.com/lit/cat5viewer.pdf. Accessed on Feb. 9, 2015.
Applicant-Initiated Interview Summery from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/792,016 dated Jan. 19, 2018 (3 pages).
Applicant-Initiated Interview Summery from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/264,404 dated Mar. 1, 2019 (3 pages).
ASPYRA's Imaging Solutions, 3 page color print out. Accessed at http://www.aspyra.com/imaging-solutions. Accessed on Feb. 9, 2015.
AVREO, interWorks-RIS/PACS package, 2 page color brochure, © 2014, Avreo, Inc. (Document MR-5032 Rev. 4). Downloaded from http://www.avreo.com/ProductBrochures/MR-5032Rev.%204interWORKS%20RISPACSPackage.pdf. Accessed on Feb. 9, 2015.
AVREO, interWorks—RIS/PACS package, 2 page color brochure, © 2014, Avreo, Inc. (Document MR-5032 Rev. 4). Downloaded from http://www.avreo.com/ProductBrochures/MR-5032Rev.%204interWORKS%20RISPACSPackage.pdf. Accessed on Feb. 9, 2015.
BRIT Systems, BRIT PACS View Viewer, 2 page color brochure, (BPB-BPV-0001). Downloaded from http://www.brit.com/pdfs/britpacsview.pdf. Accessed on Feb. 9, 2015.
BRIT Systems, Roentgen Works-100% Browers-based VNA (Vendor Neutral Archive/PACS), © 2010 BRIT Systems, 1 page color sheet. Accessed at http://www.roentgenworks.com/PACS. Accessed on Feb. 9, 2015.
BRIT Systems, Roentgen Works—100% Browers-based VNA (Vendor Neutral Archive/PACS), © 2010 BRIT Systems, 1 page color sheet. Accessed at http://www.roentgenworks.com/PACS. Accessed on Feb. 9, 2015.
BRIT Systems, Vision Multi-modality Viewer-with 3D, 2 page color brochure, (BPB-BVV-0001 REVC). Downloaded from http://www.brit.com/pdfs/BPB-BVV-0001REVC_BRIT_Vision_Viewer.pdf. Accessed on Feb. 9, 2015.
BRIT Systems, Vision Multi-modality Viewer—with 3D, 2 page color brochure, (BPB-BVV-0001 REVC). Downloaded from http://www.brit.com/pdfs/BPB-BVV-0001REVC_BRIT_Vision_Viewer.pdf. Accessed on Feb. 9, 2015.
Bronevetsky, Greg, Circle Menus, "What is a Circle Menu?," downloaded fromhttp://www.cs.cornell.edu/boom/2001sp/bronevetsky/What%20is%20a%20Circle%20Menu.htm on May 24, 2010 (3 pages).
Callahan, Jack et al., "An Empirical Comparison of Pie vs. Linear Menus," Computer Science Department, University of Maryland, Sep. 1988 (6 pages).
CANDELiS, ImageGrid™: Image Management Appliance, 6 page color brochure. (AD-012 Rev. F Nov. 2012), © 2012 Candelis, Inc. Downloaded from http://www.candelis.com/images/pdf/Candelis_ImageGrid_Appliance_20111121.pdf. Accessed on Feb. 9, 2015.
Carestream, Cardiology PACS, 8 page color brochure. (CAT 866 6075 Jun. 2012). © Carestream Health, Inc., 2012. Downloaded from http://www.carestream.com/cardioPACS_brochure_M1-877.pdf. Accessed on Feb. 9, 2015.
Carestream, Vue PACS, 8 page color brochure. (CAT 300 1035 May 2014). © Carestream Health, Inc., 2014. Downloaded from http://www.carestream.com/csPACS_brochure_M1-876.pdf. Accessed on Feb. 9, 2015.
Cerner, Radiology-Streamline image management, 2 page color brochure, (fl03_332_10_v3). Downloaded from http://www.cerner.com/uploadedFiles/Clinical_Imaging.pdf. Accessed on Feb. 9, 2015.
Cerner, Radiology—Streamline image management, 2 page color brochure, (fl03_332_10_v3). Downloaded from http://www.cerner.com/uploadedFiles/Clinical_Imaging.pdf. Accessed on Feb. 9, 2015.
CoActiv, EXAM-PACS, 2 page color brochure, © 2014 CoActiv, LLC. Downloaded from http://coactiv.com/wp-content/uploads/2013/08/EXAM-PACS-BROCHURE-final-web.pdf. Accessed on Feb. 9, 2015.
Corrected Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/097,219 dated Mar. 28, 2018 (2 pages).
Corrected Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/097,219 dated Mar. 28, 2019 (2 pages).
DR Systems, Dominator™ Guide for Reading Physicians, Release 8.2, 546 pages, (TCP-000260-A), © 1997-2009, DR Systems, Inc. Downloaded from https://resources.dominator.com/assets/004/6999.pdf. Document accessed Feb. 9, 2015.
DR Systems, DR Scheduler User Guide, Release 8.2, 410 pages, (TCP-000115-A), © 1997-2009, DR Systems, Inc. Downloaded from https://resources.dominator.com/assets/003/6850.pdf. Document accessed Feb. 9, 2015.
Examiner-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/318,437 dated Sep. 23, 2016 (1 page).
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/577,949 dated Apr. 13, 2012 (19 pages).
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/792,016 dated Apr. 20, 2018 (18 pages).
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/347,099 dated Jul. 12, 2018 (22 pages).
FUJIFILM Medical Systems, SYNAPASE® Product Data, Synapse Release Version 3.2.1, Server Modules and Interfaces, 4 page color brochure, (XBUSSY085) Aug. 2008. Downloaded from http://www.fujifilmusa.com/shared/bin/server-interface.pdf. Accessed on Feb. 9, 2015.
FUJIFILM Medical Systems, SYNAPSE® Product Data, Synapse Release Version 3.2.1, Foundation Technologies, 4 page color brochure, (XBUSSY084) Aug. 2008. Downloaded from http://www.fujifilmusa.com/shared/bin/foundation.pdf. Accessed on Feb. 9, 2015.
FUJIFILM Medical Systems, SYNAPSE® Product Data, Synapse Release Version 3.2.1, Workstation Software, 4 page color brochure, (XBUSSY082) Aug. 2008. Downloaded from http://www.fujifilmusa.com/shared/bin/workstation.pdf. Accessed on Feb. 9, 2015.
GE Healthcare, Centricity PACS, in 8 page printout. Accessed at http://www3.gehealthcare.com/en/products/categories/healthcare_it/medical_imaging_informatics_-_ris-pacs-cvis/centricity_pacs. Accessed on Feb. 9, 2015.
Handylife.com-Features of Handy Patients Enterprise, in 4 page printout. Accessed from http://www.handylife.com/en/software/features.html. Accessed on Feb. 18, 2015.
Handylife.com—Features of Handy Patients Enterprise, in 4 page printout. Accessed from http://www.handylife.com/en/software/features.html. Accessed on Feb. 18, 2015.
Handylife.com-Overview of Handy Patients Enterprise, in 2 page printout. Accessed from http://www.handylife.com/en/software/overview.html. Accessed on Feb. 18, 2015.
Handylife.com—Overview of Handy Patients Enterprise, in 2 page printout. Accessed from http://www.handylife.com/en/software/overview.html. Accessed on Feb. 18, 2015.
Handylife.com—Screenshots of Handy Patients Enterprise, in 2 page printout. Accessed from http://www.handylife.com/en/software/screenshots.html. Accessed on Feb. 18, 2015.
Hopkins, Don, "Dynamic Pie Menus," Don Hopkins' Web Site, submitted Sep. 18, 2005, downloaded from www.donhopkins.com/drupal/node/68, (3 pages).
Hopkins, Don, "Pie Menus for OLPC Sugar User Interface, in Python with GTK, Cairo and Pangomodules," Don Hopkins' Web Site, downloaded from www.donhopkins.com/drupal/node/128 on May 24, 2010 (13 pages).
Hopkins, Don, "Theta Menus Proposal and Pie Menu Designs-May 1986," Don Hopkins' Web Site, submitted Sep. 26, 2005, downloaded from www.donhopkins.com/drupal/node/82, (14 pages).
Hopkins, Don, "Theta Menus Proposal and Pie Menu Designs—May 1986," Don Hopkins' Web Site, submitted Sep. 26, 2005, downloaded from www.donhopkins.com/drupal/node/82, (14 pages).
Hopkins, Don, The Design and Implementation of Pie Menus—Dr. Dobb's Journal, Dec. 1991, Don Hopkins' Web Site, submitted Sep. 27, 2005, downloaded from www.donhopkins.com/drupal/node/98, (8 pages).
iCRco, I See The Future, in 12 pages, color brochure, (BR080809AUS), © 2009 iCRco. ClarityPACS. Downloaded from http://www.claritypacs.com/pdfs/ISeeFuture_26_Web.pdf. Accessed on Feb. 9, 2015.
Imageanalysis, dynamika, 2 page color brochure. Downloaded from http://www.imageanalysis.org.uk/what-we-do. Accessed on Feb. 9, 2015.
Imageanalysis, MRI Software, in 5 page printout. Accessed at http://www.imageanalysis.org.uk/mri-software. Accessed on Feb. 9, 2015.
IMSI, Integrated Modular Systems, Inc., Hosted / Cloud PACS in one page printout. Accessed at http://www.imsimed.com/#!products-services/ctnu. Accessed on Feb. 9, 2015.
Infinitt, PACS, RIS, Mammo PACS, Cardiology Suite and 3D/Advanced Visualization | Infinittna, 2 page printout. Accessed at http://www.infinittna.com/products/radiology/radiology-pacs. Accessed on Feb. 9, 2015.
Intelerad, IntelePACS, 2 page color brochure, © 2014 Intelerad Medical Systems Incorporated. Downloaded from http://www.intelerad.com/wp-content/uploads/sites/2/2014/08/IntelePACS-brochure.pdf. Accessed on Feb. 9, 2015.
Intelerad, InteleViewer, 2 page color brochure, © 2014 Intelerad Medical Systems Incorporated. Downloaded from http://www.intelerad.com/wp-content/uploads/sites/2/2014/09/InteleViewer-brochure.pdf. Accessed on Feb. 9, 2015.
Intuitive Imaging Informatics, ImageQube, 1 page in color. Downloaded from http://www.intuitiveimaging.com/2013/pdf/ImageQube%20one-sheet.pdf. Accessed on Feb. 9, 2015.
Kuhl, Helen: Comparison Chart/PACS, Customers Are Happy, But Looking for More, (color) Imaging Technology News, itnonline.com, May 2012, pp. 24-27. Downloaded from http://www.merge.com/MergeHealthcare/media/company/In%20The%20News/merge-pacs-comparison.pdf. Accessed on Feb. 9, 2015.
Kurtenbach, G., et al., "User Learning and Performance with Marking Menus," Department of Computer Science, University of Toronto, Ontario, Canada, as downloaded from billbuxton.com/MMUserLearn.html on May 24, 2010 (11 pages).
Kurtenbach, Gordon, "Notes on the History of Radial menus, Pie menus and Marking menus," Alias, Toronto, Ontario, Canada, Apr. 2004 (2 pages).
LUMEDX CardioPACS 5.0 Web Viewer, Cardiopacs Module, 2 page color brochure, (506-10011 Rev A). Downloaded from http://cdn.medicexchange.com/images/whitepaper/cardiopacs_web_viewer.pdf?1295436926. Accessed on Feb. 9, 2015.
LUMEDX Cardiovascular Information System, CardioPACS, one page in color printout. Accessed at http://www.lumedx.com/pacs.aspx. Accessed on Feb. 9, 2015.
McKesson Enterprise Medical Imaging and PACS | McKesson, 1 page (color) printout. Accessed at http://www.mckesson.com/providers/health-systems/diagnostic-imaging/enterprise-medical-imaging. Accessed on Feb. 9, 2015.
Medweb Radiology Workflow Solutions, Radiology Workflow Solutions, Complete Workflow & Flexible Turnkey Solutions, Web RIS/PACS with Advanced Viewer, 3 page color brochure, © 2006-2014 Medweb. Downloaded from http://www.medweb.com/docs/rispacs_brochure_2014.pdf. Accessed on Feb. 9, 2015.
Merge Radiology Solutions, Merge PACS, A real-time picture archiving communication system, (PAX-21990 rev 2.0), 2 page color brochure. Downloaded from http://www.merge.com/MergeHealthcare/media/documents/brochures/Merge_PACS_web.pdf. Accessed on Feb. 9, 2015.
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/792,016 dated Nov. 2, 2017 (19 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/097,219 dated Mar. 8, 2018 (23 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/097,219 dated Sep. 21, 2018 (10 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/264,404 dated Dec. 28, 2018 (12 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/347,099 dated Jan. 26, 2018 (23 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 13/331,651 dated Mar. 31, 2014 (9 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 13/651,328 dated Mar. 13, 2015 (7 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/318,437 dated Sep. 23, 2016 (9 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/792,016 dated Aug. 7, 2018 (8 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/792,016 dated May 30, 2018 (8 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/097,219 dated Aug. 16, 2018 (2 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/097,219 dated Jul. 11, 2018 (5 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/097,219 dated Mar. 1, 2019 (5 pages).
NOVARAD Enterprise Imaging Solutions, NOVAPACS, 2 page (color) printout. Accessed at http://ww1.novarad.net/novapacs. Accessed on Feb. 9, 2015.
PACSPLUS, PACSPLUS Server, 1 page (color) printout. Accessed at http://www.pacsplus.com/01_products/products_01.html. Accessed on Feb. 9, 2015.
PACSPLUS, PACSPLUS Workstation, 3 page (color) printout. Accessed at http://www.pacsplus.com/01_products/products_01.html. Accessed on Feb. 9, 2015.
PHILIPS IntelliSpace PACS, in 2 color page printout. Accessed at https://www.healthcare.philips.com/main/products/healthcare_informatics/products/enterprise_imaging_informatics/isite_pacs. Accessed on Feb. 9, 2015.
PHILIPS, IntelliSpace: Multi-modality tumor tracking application versus manual PACS methods, A time study for Response Evaluation Criteria in Solid Tumors (RECIST). 2012, Koninklijke Philips Electronics N.V., in four pages.
Pie Menus, as downloaded from http://c2.com/cgi/wiki?PieMenus on May 24, 2010, (5 pages).
RamSoft, RIS PACS Teleradiology, PowerServer PACS, Lite PACS, XU PACS Compare RamSoft PACS Products, 2 color page printout. Accessed at http://www.ramsoft.com/products/powerserver-pacs-overview. Accessed on Feb. 9, 2015.
Rollo, Carl C., "A Brief Description of Pie Menus for Windows," as downloaded from http://web.archive.org/web/20060505030416/www.sm.luth.se/˜david/carl/www/piedscrp.html on May 24, 2010 (2 pages).
Sage Intergy PACS | Product Summary. Enhancing Your Workflow by Delivering Web-based Diagnostic Images When and Where You Need Them, in 2 color pages (IRV-SS-INTPACS-PSS-031309). © 2009 Sage Software Healthcare, Inc. Downloaded from http://www.greenwayhealth.com/solutions/intergy/. Accessed on Feb. 9, 2015.
ScImage, Cardiology PACS, in 8 color page printout. Accessed at http://www.scimage.com/solutions/clinical-solutions/cardiology. Accessed on Feb. 9, 2015.
Sectra RIS PACS, in 2 color page printout. Accessed at https://www.sectra.com/medical/diagnostic_imaging/solutions/ris-pacs/. Accessed on Feb. 9, 2015.
Siemens syngo.plaza, Features and Benefits, in 2 color page printout. Accessed at http://www.healthcare.siemens.com/medical-imaging-it/imaging-it-radiology-image-management-pacs/syngoplaza/features. Accessed on Feb. 9, 2015.
Simms | RIS and PACS Medical Imaging Software, in 2 color page printout. http://www.mysimms.com/ris-pacs.php. Accessed on Feb. 9, 2015.
Stryker, Imaging—OfficePACS Power Digital Imaging, in one color page printout. Accessed from http://www.stryker.com/emea/Solutions/Imaging/OfficePACSPowerDigitalImaging/index.htm. Accessed on Feb. 9, 2015.
Stryker, OfficePACS Power—Digital Imaging, 8 page color brochure, (MPP-022 Rev 4 BC/MP 300 Jan. 2007). © 2007 Stryker. Downloaded from http://www.stryker.com/emea/Solutions/Imaging/OfficePACSPowerDigitalImaging/ssLINk/emea/1557/022268. Accessed on Feb. 9, 2015.
U.S. Appl. No. 14/792,016, User Interface Systems and Methods, filed Jul. 6, 2015.
U.S. Appl. No. 15/097,219, User Interface Systems and Methods, filed Apr. 12, 2016.
U.S. Appl. No. 15/264,404, Pressure Sensitive Manipulation of Medical Image Data, filed Sep. 13, 2016.
U.S. Final Office Action, re U.S. Appl. No. 13/331,651, dated Jan. 24, 2014.
U.S. Interview Summary, re U.S. Appl. No. 13/331,651, dated Mar. 21, 2014.
U.S. Interview Summary, re U.S. Appl. No. 13/331,651, dated Oct. 16, 2013.
U.S. Interview Summary, re U.S. Appl. No. 13/651,328, dated Jan. 23, 2015.
U.S. Interview Summary, re U.S. Appl. No. 14/318,437, dated Jun. 8, 2016.
U.S. Notice of Allowance, re U.S. Appl. No. 13/331,651, dated Mar. 31, 2014.
U.S. Notice of Allowance, re U.S. Appl. No. 13/651,328, dated Mar. 13, 2015.
U.S. Notice of Allowance, re U.S. Appl. No. 14/318,437, dated Sep. 23, 2016.
U.S. Office Action, Final, re U.S. Appl. No. 12/577,949, dated Apr. 13, 2012.
U.S. Office Action, Final, re U.S. Appl. No. 12/577,949, dated Dec. 16, 2011.
U.S. Office Action, Interview Summary, re U.S. Appl. No. 12/577,949, dated Feb. 13, 2012.
U.S. Office Action, Notice of Abandonment, re U.S. Appl. No. 12/577,949, dated Nov. 15, 2012.
U.S. Office Action, re U.S. Appl. No. 13/331,651, dated Sep. 18, 2013.
U.S. Office Action, re U.S. Appl. No. 13/651,328, dated Sep. 3, 2014.
U.S. Office Action, re U.S. Appl. No. 14/318,437, dated Mar. 16, 2016.
UltraRAD—ultra VISION, 1 page (color). Downloaded from http://www.ultraradcorp.com/pdf/UltraVISION.pdf. Accessed on Feb. 9, 2015.
VioStream for VitreaView, 2 color pages printout. Accessed at http://www.vitalimages.com/solutions/universal-viewing/viostream-for-vitreaview. Accessed on Feb. 9, 2015.
Visage Imaging Visage 7, 3 color page printout. Accessed at http://www.visageimaging.com/visage-7. Accessed on Feb. 9, 2015.
VIZTEK Radiology PACS Software Viztek Opal-RAD, 4 color page printout. Accessed at http://viztek.net/products/opal-rad. Accessed on Feb. 9, 2015.
Voyager Imaging—Voyager PACS Radiologist Workstation, 2 page color brochure. Downloaded from http://www.intellirad.com.au/assets/Uploads/Voyager-PacsWorkstations.pdf?. Accessed on Feb. 9, 2015.
Voyager Imaging—Voyager PACS, 3 page color brochure. Downloaded from http://www.intellirad.com.au/index.php/assets/Uploads/Voyager-Pacs3.pdf. Accessed on Feb. 9, 2015.

Also Published As

Publication number, Publication date
US20170060270A1 (en), 2017-03-02
US20120154431A1 (en), 2012-06-21
US9524080B1 (en), 2016-12-20
US8797350B2 (en), 2014-08-05

Similar Documents

Publication, Publication Date, Title
US10545582B2 (en), Dynamic customizable human-computer interaction behavior
US10229753B2 (en), Systems and user interfaces for dynamic interaction with two- and three-dimensional medical image data using hand gestures
US10929508B2 (en), Database systems and interactive user interfaces for dynamic interaction with, and indications of, digital medical image data
US10269449B2 (en), Automated report generation
US10129553B2 (en), Dynamic digital image compression based on digital image characteristics
US11900266B2 (en), Database systems and interactive user interfaces for dynamic conversational interactions
US10127662B1 (en), Systems and user interfaces for automated generation of matching 2D series of medical images and efficient annotation of matching 2D medical images
US10134126B2 (en), Intelligent dynamic preloading and processing
Andriole et al., Optimizing analysis, visualization, and navigation of large image data sets: one 5000-section CT scan can ruin your whole day
US20080118237A1 (en), Auto-Zoom Mark-Up Display System and Method
US11169693B2 (en), Image navigation
EP3657512B1 (en), Integrated medical image visualization and exploration
US10120451B1 (en), Systems and user interfaces for dynamic interaction with two- and three-dimensional medical image data using spatial positioning of mobile devices
WO2014024132A1 (en), Audio activated and/or audio activation of a mode and/or a tool of an executing software application

Legal Events

Date, Code, Title, Description

AS: Assignment
Owner name: D.R. SYSTEMS, INC., CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:FRAM, EVAN K.;REEL/FRAME:040272/0413
Effective date: 20111222

STPP: Information on status: patent application and granting procedure in general
Free format text: FINAL REJECTION MAILED

AS: Assignment
Owner name: MERGE HEALTHCARE SOLUTIONS INC., WISCONSIN
Free format text: NUNC PRO TUNC ASSIGNMENT;ASSIGNOR:D.R. SYSTEMS, INC.;REEL/FRAME:050043/0921
Effective date: 20190515

STPP: Information on status: patent application and granting procedure in general
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

AS: Assignment
Owner name: MERGE HEALTHCARE SOLUTIONS INC., WISCONSIN
Free format text: AFFIDAVIT CONCERNING CHANGE IN PATENT OWNERSHIP;ASSIGNOR:D.R. SYSTEMS, INC.;REEL/FRAME:049967/0426
Effective date: 20190515

STPP: Information on status: patent application and granting procedure in general
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP: Information on status: patent application and granting procedure in general
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF: Information on status: patent grant
Free format text: PATENTED CASE

AS: Assignment
Owner name: INTERNATIONAL BUSINESS MACHINES CORPORATION, NEW YORK
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MERGE HEALTHCARE SOLUTIONS INC.;REEL/FRAME:055617/0985
Effective date: 20210315

AS: Assignment
Owner name: MERATIVE US L.P., MICHIGAN
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:INTERNATIONAL BUSINESS MACHINES CORPORATION;REEL/FRAME:061496/0752
Effective date: 20220630

MAFP: Maintenance fee payment
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
Year of fee payment: 4

