CROSS-REFERENCE TO RELATED APPLICATIONS
This application is a divisional of U.S. patent application Ser. No. 12/685,593, filed Jan. 11, 2010, now allowed, which application is a divisional of Ser. No. 11/336,649, filed Jan. 19, 2006, issued as U.S. Pat. No. 7,657,101, which application claims the benefit under 35 U.S.C. §119(e) of U.S. Provisional Patent Application No. 60/645,538, filed Jan. 19, 2005, where these applications are incorporated herein by reference in their entireties.
BACKGROUND OF THE INVENTION
1. Field of the Invention
This disclosure relates to devices and methods, which can be used alone or in combination, to identify and monitor changes of a suspect area on a patient, for example dermatological changes.
2. Description of the Related Art
There are many reasons why a medical professional, patient, or both would want to monitor changes on an exterior or internal surface of a patient. For a suspect area, especially one that may indicate some form of skin cancer, it is important to detect and treat the area in its early stages. One type of skin cancer is known as melanoma, which is a malignant cancer of the pigment cells (melanocytes). Other forms of skin cancer also exist and are known as basal and squamous cell cancers, which are tumors of unpigmented cells (keratinocytes) of the skin.
Melanocytes occur at various depths within the epidermal (upper) and dermal (lower) layers of skin. Melanocytes are normally distributed in the layers of the skin and produce pigment in response to being subjected to ultraviolet light (e.g., sunlight). Aggregated melanocytes are termed naevus cells and can be indicative of a melanoma. Because a melanoma may appear as a mole, medical professionals typically attempt to ascertain whether the suspicious area is changing over time. Identifying changes early typically results in a rapid diagnosis, which in turn often leads to rapid and highly effective treatments that can greatly increase the patient's survival rate and, in most cases, lead to complete recovery.
Historically, the standard of care for screening or monitoring melanoma is a visual inspection or visual comparison of photographs by a medical professional. These visual inspections or comparisons are subjective and do not enable the medical professional to detect small or subtle changes in a suspect area. Changes in the perimeter, depth, shape, and even color of a melanoma can be subtle for a time and then progress rapidly. Moreover, changes in the color or perimeter, for example, of a melanoma are not easily discernable to the human eye so these changes may go unnoticed for a long period of time.
In U.S. Pat. No. 6,427,022, issued to Craine et al., a skin lesion is monitored by obtaining a series of digital baseline images over time and comparing these images. The method of comparison taught in Craine et al. is that the baseline image is compared visually by the viewer with a subsequently obtained image by alternately displaying the respective images, in a blinking fashion. The blinking action is created by quickly alternating the images with respect to one another on a display monitor to enable the viewer to detect changes in the skin lesion.
Even though the standards of care discussed above may involve images, each standard of care suffers from the subjectivity and uncertainty associated with the medical professional trying to ascertain changes in a suspicious area by visual comparison. The visual comparison methods are subjective and less accurate for a number of reasons. For instance, the medical professional may be inexperienced, may have been distracted during the examination, or may have selected the wrong location on the patient's body during a follow-up examination.
Therefore, a more effective, less subjective, and low-cost approach for at least monitoring changes in a suspect area is desirable.
SUMMARY OF THE INVENTION
It should be understood that one aspect of the present invention is the comparison of a plurality (e.g., at least two) of images taken at different times utilizing a computer-based algorithm that can overlay two images and either transform the images to fit over each other or perform a best-fit analysis, thereby denoting or calling out any color, perimeter, or depth changes. In one embodiment the analysis can include transforming the images to match color, contrast, angle, focus (sharpness), and brightness, and subsequently comparing the multiple images with each other. Changes between the images may be called out in a variety of ways, including text reports, highlighting or color-coding the image itself, etc.
In another aspect a device or apparatus is contemplated that comprises a digital image capture device and a distance-measuring device. In certain embodiments the distance-measuring device measures the distance between the suspect area and the image capture device and provides a read-out or a tone to signify the optimal distance. Further embodiments include the use of a reference, such as a strip of adhesive affixed to the surface or attached to the device, that provides contrast, color, sharpness, and/or depth references. Such an embodiment could include an adhesive strip having a color palette (e.g., one or more colors), gray scale, distance references (hash marks), or depth references.
In certain embodiments of the invention, distance measurements can be made by a sonic device, a laser, or any other means of measuring distance.
In one particular embodiment an enclosed tube or housing is affixed to the image device and positioned over the suspect area. In one embodiment of such a device the housing or enclosure is essentially light free and may contain its own light source internally to provide consistent lighting of the suspect area. In a specific embodiment the enclosure is a tube of a fixed length having LEDs or fiber optics positioned inside. One end of such an enclosure may be fitted to the image capturing device and the other fitted over the suspect area.
In one aspect, an apparatus to acquire an image of a suspect area on a patient comprises an imaging device; and a separation tool having a first end connected with a first section, at least a portion of the first end formed to contact the patient, the first section having an attachment portion to receive the imaging device, the first section formed to maintain the imaging device at a substantially fixed distance from the suspect area.
In another aspect, an imaging assembly to image a suspect area on a patient comprises an imaging device and at least one sensor to indicate an orientation and/or distance of the imaging device relative to a first location.
In another aspect, an imaging assembly to image a suspect area on a patient comprises a housing; an imaging device located within the housing; and at least one sensor to indicate an orientation of the housing relative to a first location.
In yet another aspect, a method of acquiring an image of a suspect area includes identifying the suspect area; positioning a patient with the suspect area in approximately a first position; identifying a reference item located on the patient; determining a position of the suspect area in relationship to the reference item; aligning an imaging device to acquire the image of the suspect area; and acquiring the image after aligning the image device.
In yet another aspect, a method of comparing at least two images, each image capturing a suspect area, includes identifying a reference item in the at least two images; measuring an attribute of the reference item in a first image; transforming a second image based on the measured attribute of the reference item in the first image, wherein a reference item in the second image is transformed to correspond with an orientation and size of the reference item in the first image; measuring an attribute of the suspect areas in both images; and comparing the respective measured attributes of the respective suspect areas. Such reference items can be points either away from the suspect area or within the suspect area. Further, the measured attribute can be color, distance between at least two points, total perimeter, distance between multiple points, etc.
In one aspect the method of comparing two images comprises receiving at least two images digitally into a computer system, performing a fitting analysis on the at least two images to obtain an overlay and providing an output noting any differences between the at least two images.
In yet another aspect, a method of acquiring an image of a suspect area includes identifying the suspect area; positioning a patient with the suspect area in approximately a first position; aligning an imaging device to acquire an image of the suspect area; and acquiring the image after aligning the image device.
In still yet another aspect, a method of comparing at least two images of a suspect area on a patient includes providing at least two digital images of the suspect area; digitally overlaying the at least two images; performing a best-fit transformation of one image to encourage the one image to approximately correspond to at least one detected attribute of the other acquired image; and comparing the at least two images to determine whether a difference exists between an aspect of the one image and the same aspect of the other image.
In an even further aspect the present invention can be used in the context of full-body imaging, wherein one or more digital or other image capture devices are placed around the patient and the full body is imaged either piece by piece or in its entirety. These images can then be compared by transformation or best-fit analysis and analyzed for any changes by a computer algorithm.
BRIEF DESCRIPTION OF THE DRAWINGS
In the drawings, identical reference numbers identify similar elements or acts. The sizes and relative positions of elements in the drawings are not necessarily drawn to scale. For example, the shapes of various elements and angles are not drawn to scale, and some of these elements are arbitrarily enlarged and positioned to improve drawing legibility. Further, the particular shapes of the elements as drawn are not intended to convey any information regarding the actual shape of the particular elements and have been solely selected for ease of recognition in the drawings.
FIG. 1A is a front, left isometric view of an imaging device according to one illustrated embodiment positioned with respect to a portion of skin.
FIG. 1B is a side view of a portion of a guide of the imaging device of FIG. 1A having measurement markers and a palette according to one illustrated embodiment.
FIG. 2A is a partially exploded, front, left, isometric view of an imaging device according to another illustrated embodiment.
FIG. 2B is a front, left isometric view of an intermediate bracket according to one illustrated embodiment.
FIG. 3 is a front, left, isometric view of an imaging device according to another illustrated embodiment.
FIG. 4 is an elevational view of a hand having several reference points for locating a suspect area according to one illustrated embodiment.
FIG. 5A is a flowchart of a method of identifying a suspect area according to one illustrated embodiment.
FIG. 5B is a continuation of the flowchart of FIG. 5A.
FIG. 6 is a top plan view of an image after the image has been pre-processed according to one illustrated embodiment.
FIG. 7 is a flowchart of a method of acquiring a subsequent image of a suspect area according to one illustrated embodiment.
FIG. 8A is a left, front isometric view of a first image and a second image, each having an initial and respectively different orientation and size according to one illustrated embodiment.
FIG. 8B is a top plan view of the first image and the second image of FIG. 8A transformed to have approximately the same respective orientation and size.
FIG. 9 is a flowchart of a method of comparing at least two images of a suspect area according to one illustrated embodiment.
FIGS. 10A-10C are images of suspect areas illustrating the various stages of the color balancing method of FIG. 11 according to one illustrated embodiment.
FIG. 11 is a flowchart of a method of color balancing an image to detect a potential suspect area according to one illustrated embodiment.
DETAILED DESCRIPTION
In the following description, certain specific details are set forth in order to provide a thorough understanding of various embodiments of the disclosed subject matter. However, one skilled in the art will understand that the embodiments may be practiced without these details. In other instances, well-known structures associated with imaging systems, computing systems and processors, and various techniques for manipulating and evaluating digital image data have not been shown or described in detail to avoid unnecessarily obscuring descriptions of the embodiments.
Unless the context requires otherwise, throughout the specification and claims which follow, the word “comprise” and variations thereof, such as, “comprises” and “comprising” are to be construed in an open, inclusive sense, that is as “including, but not limited to.”
Unless the context requires otherwise, throughout the specification and claims that follow, the term “patient” refers primarily to warm blooded mammals and is not limited to human beings, but could include animals such as dogs, cats, horses, cows, pigs, higher and lower primates, etc.
The headings provided herein are for convenience only and do not interpret the scope or meaning of the claimed invention.
The embodiments disclosed herein are generally directed to acquiring images of a suspect area located on a patient, comparing the acquired images to one another, and evaluating the compared images to determine whether some amount of change from one image to a subsequent image warrants a more detailed examination by a medical care professional. The embodiments disclose a number of different devices and methods for achieving such results.
The surface of interest can be either internal or external to the patient. In one instance, the surface of interest is the patient's exposed skin that is monitored for the detection or growth of skin cancer. In another instance, the surface of interest can be the patient's mucous membranes, interior body surfaces related to reproductive and/or digestive systems of the patient, ocular surfaces, or any other accessible surface on a patient. For purposes of this description, the surface of interest will be exemplified as an area on the patient's skin, referred to as a suspect area. However, this exemplification is not meant to limit or otherwise narrow the scope of the description, the claims, or any specific embodiment depicted herein.
The suspect area referred to herein can be the site of a suspected melanoma or mole (e.g., melanin containing areas to be monitored), but can also be any other suspect area on a patient that needs to be monitored. Thus, it is within the scope of this disclosure that the suspect area can be located in a variety of places on a patient, for example the patient's mucous membranes, surfaces of interior body cavities related to reproductive and/or digestive systems, ocular surfaces, or any other interior or exterior surface on a patient where monitoring is desired. In the exemplary embodiment used for discussion purposes, the suspect area can be a dermal feature, such as a type of skin cancer, a skin lesion, a skin rash, a burn or scar, an infected or inflamed area, a wound, or some other skin anomaly that may or may not be capable of growth, reduction or other change. For example, one embodiment may monitor a healing rate (i.e., recession) of a burn or scar when certain medications, lotions, or creams are applied to the skin. A further embodiment envisions utilizing such technology to monitor the effectiveness of a drug or nutriceutical, such as those that heal the skin. Another embodiment may monitor a patient's scalp for hair loss and/or growth. In addition, the embodiments disclosed herein may be used in a number of settings, such as a home setting, a clinical setting, a laboratory or research setting, a regulatory compliance setting, or any combination of the above.
Devices and Systems to Acquire an Image of a Suspect Area
FIGS. 1A-3 show three different embodiments of a device to acquire an image of a suspect area. Each of the devices differs in its degree of complexity, accuracy, and cost. It is contemplated that many, if not all, of the features or aspects of one device can be incorporated into the other devices.
FIG. 1A shows a first imaging device 10 for imaging a suspect area 12 on skin 14 according to the illustrated embodiment. The first imaging device 10 includes a housing 16 and a lens 18 to receive and direct light to imaging components (not shown) located within the housing 16. By locating the imaging components in the housing 16, damage and/or exposure of the imaging components may be prevented. The housing 16 can have a handle 20 to permit the housing 16 to be lifted, moved, positioned, or otherwise manipulated. Additionally or alternatively, the handle 20 and/or other portions of the housing 16 can be configured with support locations so that the imaging device 10 can be secured to a tripod, for example.
The imaging components may take the form of a camera or an optical scanner operable to capture images of the suspect area 12. In one embodiment, the camera may advantageously take the form of a digital image capture device such as a CCD or CMOS type camera. A CCD camera may consist of one-dimensional or two-dimensional arrays of charge coupled devices ("CCD") and suitable optics, such as optical lenses, for focusing an image on the CCD array. CCD arrays can capture whole images at a time, or can be electronically controlled to successively sample (e.g., pixel-by-pixel, row-by-row, or column-by-column) the information on a region of the skin 14 (i.e., electronically scan). Alternatively, the imaging components can take the form of a CMOS imager capable of capturing one-dimensional or two-dimensional arrays similar to that of a CCD reader.
Employing a digital image capture device advantageously provides the image in a form suitable for use with a data processing system such as a computing system. Alternatively, the camera may take the form of a non-digital image capture device, such as a film camera. Such embodiments may employ image scanners or other devices to digitize the images captured on film. The imaging device 10 may advantageously take the form of a still image capture device. Alternatively, the imaging device 10 may take the form of a motion picture capture device such as a movie camera or video camera. Such embodiments may include a frame grabber or other device to capture single images.
The imaging device 10 may rely on ambient light, or may include one or more light sources, such as light emitting diodes ("LEDs") or incandescent lights, which may be manually or automatically controlled.
A guide 22 is attachable to the housing 16 of the imaging device 10. The guide 22 is configured so that the housing 16 can be placed at a desired distance away from the skin 14 along a Z-axis, perpendicular to an X-Y plane, when an image is acquired. In the illustrated embodiment, the guide 22 includes an extension member 24 having a first end 26 that is coupled to the housing 16 of the imaging device 10. A second end 28 is coupled to a contact member 30. The contact member 30 can include a number of features to enhance the control and/or optimization of the imaging device 10. For example, as illustrated in FIG. 1B, the contact member 30 of the guide 22 can include measurement markings 30a, similar to those of a ruler, and/or contrast markings 30b, which can represent a color or grayscale palette.
In one embodiment, the extension member 24 includes adjustable, complementary sliding members with gradations 25 to allow the housing 16 to be placed at a desired distance from the suspect area 12. In another embodiment, the extension member 24 is formed to be non-adjustable; thus the housing 16 is set at a fixed length from the contact member 30. The contact member 30 may be shaped (e.g., arc-shaped) to provide an unobstructed line of sight between the imaging device 10 and the suspect area 12. As will be discussed in more detail below, it may be desirable that the contact member 30 be shaped such that at least a portion of the contact member 30 can be captured in the acquired image. A skin contact region 31 of the contact member 30 can be padded to provide a more comfortable interaction with the patient. One skilled in the art will understand and appreciate that the first end 26 can be coupled to the housing 16 by any number of mechanical methods, for example fasteners, clips, VELCRO®, adhesive bonding, tie-down straps, or some other structure that substantially keeps the housing 16 attached to the extension member 24.
In the illustrated embodiment, the imaging device 10 can include at least one sensor 32 for determining an orientation of the device 10. One advantage of determining the orientation of the device 10 is to provide for repetitive and consistent images in an X-Y plane, especially if the images are acquired at different times. For example, if a second image is being acquired of the suspect area 12, but the imaging device 10 is tilted or rotated at too much of an angle, the second image may be too distorted or misaligned to digitally process or compare to a previously acquired image.
A variety of sensors 32 can be used to indicate the orientation of the imaging device 10. In the illustrated embodiments, the sensor 32 is a fluid level encompassed in the housing 16 and visible by a user of the imaging device 10. The sensor 32 may also be integrated with the guide 22. The fluid level sensor 32 generally indicates whether the imaging device 10 is tilted relative to the ground. Additionally or alternatively, a gyroscope, which is sometimes referred to as a tilt sensor, can be used to determine an acceleration of the imaging device 10 about at least one axis.
In addition to or instead of sensing the orientation of the imaging device 10, other sensors 34 can be used to determine the proximity of the imaging device 10 in relation to the skin 14 of the patient. In one embodiment, a pressure sensor 34 is located in the contact member 30 to sense the pressure exerted on the contact member 30 as it is positioned against the skin 14 of the patient. By sensing the pressure with which the imaging device 10 is pressed against the skin 14, the imaging device 10 can be repetitively and accurately repositioned relative to the suspect area 12 from one image to the next. Additionally or alternatively, a proximity sensor 35 can be used to detect when the contact member 30 is at a desired distance from the patient, including when the contact member 30 barely makes contact with the patient.
Each of the sensors 32, 34, and/or 35 described above, as well as equivalent sensors, can be electronically coupled with the imaging device 10 to provide an indication that the imaging device 10 is at the desired orientation or distance. For example, the imaging device 10 can have an indicator 36 that sends a visual and/or audio signal to indicate when the imaging device 10 is at the desired orientation or distance, and/or when an amount of pressure is present between the contact member 30 and the patient. A signal from the indicator 36 would indicate that the image could be acquired at that moment in time. Likewise, the sensors 32, 34, and/or 35 can also be electronically coupled with a processor (not shown) to computationally update the orientation and/or proximity to the patient of the imaging device 10. In one embodiment, the processed information can be displayed on a screen (not shown) located on the housing 16.
FIG. 2A shows a second imaging device 100 that includes a camera 102 and a member 104 for receiving and coupling the camera 102 to an extension member 106. Similar to the extension member discussed above, the extension member 106 includes a contact member 108. In addition, the extension member 106 further includes detents 110 sized and configured to complementarily receive the member 104; a sensor 112 to indicate the orientation of the extension member 106 about a Roll axis 114, a Pitch axis 116, and/or a Yaw axis 118; and a color palette 120 that can be used to provide color and/or contrast balancing once the image is acquired and archived. Color and/or contrast balancing are described in more detail below.
The camera 102 can be a digital camera as described in the previous embodiment or a film camera that includes at least a lens 122, a camera body 124, and an image trigger 126. The camera can capture images on photographic film (not shown), which can be standard photographic film that is purchased in a store and is configured to be chemically processed in a photo lab after it has been exposed to light. Alternatively, the photographic film may be specialized film, such as film that is sensitive to the non-visible portions of the electromagnetic spectrum, such as infrared or ultraviolet sensitive films.
In the illustrated embodiment, the member 104 includes a compartment 128 that is sized to receive the camera 102 and a pair of flanges 130 formed to couple to the extension member 106. A front portion of the compartment 128 does not obstruct the lens 122 of the camera 102 when the camera 102 is seated in the compartment 128. The camera 102 can be secured to the member 104 by virtue of the compartment 128 being sized to provide a tight or snug fit for the camera body 124. Alternatively, hook and loop fastener pads, commonly available under the trademark VELCRO®, can be provided to keep the camera 102 relatively secure in the compartment 128. One skilled in the art will appreciate and understand that securing the camera 102 in the compartment 128 can be accomplished in a variety of known ways.
The flanges 130 are further formed to complementarily engage the detents 110 provided on the extension member 106. In the illustrated embodiment, the flanges 130 include rounded, depressible buttons 132. Sliding a first end 134 of the extension member 106 in between the flanges 130 and permitting the buttons 132 to click into the detents 110, so that the contact member 108 is at a desired distance from the camera 102, accomplishes the assembly of the member 104 with the extension member 106.
FIG. 2B shows a different embodiment of a member 104 without a compartment. Instead, a bonding strip 136 is provided on a base 138 of the member 104. Each side 140, extending from the base 138, can be resiliently biased to form a snug fit with the camera 102. The bonding strip 136 can be a pad of hook and loop fastener, a tacky substance, or other equivalent object or substance.
FIG. 3 shows an automated imaging device 200 according to another embodiment. Many of the aspects of the imaging device 200 are similar to the aspects described in the previous embodiments, for example a housing 202, an imager 204, a handle 206, and a sensor 208. One difference between the imaging device 200 and the previously described devices 10, 100 is that the present embodiment does not employ an extension member. In lieu of the extension member, a second sensor or range finder 210 is used to indicate the distance between the imager 204 and a suspect area 212.
In one embodiment, the range finder 210 is a laser triangulation sensor that provides non-contact linear displacement measurements of the suspect area 212 on the skin 214. A laser beam (e.g., from a semiconductor laser) is reflected off the skin 214. A returning beam is received and focused onto a CCD sensing array (not shown) of the imager 204. The CCD array detects the peak value of the light and determines the distance of the skin 214 based on the position of the beam spot. The range finder 210 produces an analog voltage that is proportional to the distance of the skin 214 from the range finder 210.
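By way of a non-limiting illustration, the voltage-to-distance conversion implied above can be sketched in a few lines of Python. The calibration constants below (voltage span and distance span) are hypothetical; a real sensor's data sheet would supply them:

```python
# Minimal sketch: converting the range finder's analog voltage to distance.
# The linear calibration constants below are hypothetical placeholders.

def voltage_to_distance_mm(voltage: float,
                           v_min: float = 0.0, v_max: float = 5.0,
                           d_min_mm: float = 50.0, d_max_mm: float = 300.0) -> float:
    """Map the range finder's analog voltage onto a linear distance."""
    if not v_min <= voltage <= v_max:
        raise ValueError("voltage outside the sensor's calibrated range")
    fraction = (voltage - v_min) / (v_max - v_min)
    return d_min_mm + fraction * (d_max_mm - d_min_mm)

print(voltage_to_distance_mm(2.5))  # midpoint of the voltage span -> 175.0 mm
```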
As an alternative to the above embodiment, the range finder 210 can be a laser interferometer, an ultrasonic sensor, or an equivalent sensor to measure the linear distance from the suspect area 212 to the range finder 210. Laser interferometers use the length of a wave of light as the unit for measuring position and consist of three basic components: a laser that supplies a monochromatic light beam, optics that direct the beam and generate an interference pattern, and electronics that detect and count the light and dark interference fringes and output the distance information. Ultrasonic sensors offer another means to make non-contact distance measurements. An ultrasonic sensor works by measuring the time it takes a sound wave to propagate from the range finder 210 to an object and back to the range finder 210. In the illustrated embodiment, the skin 214 would reflect the ultrasonic waves generated by a transmitter, and a receiver would then detect the returning waves. The elapsed time from initial transmission to reception of the returning waves is used to determine the distance to the skin 214.
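The time-of-flight computation for such an ultrasonic sensor is straightforward; the following minimal Python sketch assumes sound travels at roughly 343 m/s in air and halves the round-trip time to obtain the one-way distance:

```python
# Minimal sketch of the ultrasonic time-of-flight calculation described above.
# Speed of sound ~343 m/s in air at 20 °C; the echo travels out and back,
# so the one-way distance is half the round-trip path.

SPEED_OF_SOUND_M_S = 343.0

def echo_time_to_distance_mm(round_trip_seconds: float) -> float:
    """Distance to the skin from the elapsed transmit-to-receive time."""
    return (SPEED_OF_SOUND_M_S * round_trip_seconds / 2.0) * 1000.0

# A 1.2 ms round trip corresponds to roughly 206 mm.
print(round(echo_time_to_distance_mm(0.0012)))  # -> 206
```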
The monochromatic light used to illuminate at least the suspect area 212 during imaging can have a wavelength outside of the visible portion of the light spectrum. For example, the monochromatic light can be in a frequency range of ultraviolet light, infrared light, or some other non-visible range along the light spectrum.
In yet another embodiment, the imaging device 200 is a camera. Again, the imaging device 200 may advantageously take the form of a still image capture device or a motion picture capture device such as a movie camera or video camera. In the present embodiment, the alignment of the imaging device 200 is accomplished manually without the aid of an extension member or sensor. The acquired images are compared in a best-fit analysis. The best-fit analysis includes digitizing the images and matching key points or parameters of a first image onto similar key points or parameters of a second image. For example, the perimeter or border of the first image can be matched to the perimeter or border of the second image. It is appreciated that in one embodiment the first image and the second image are of the same suspect area, and the best-fit analysis is employed to detect changes, if any, of the suspect area over time without using other reference points and/or markers to align and/or orient the imaging device 200 relative to the suspect area. The analysis software can transform, rotate, and otherwise manipulate at least one of the images until enough similarities are found between the two compared images to verify that both images are of the same suspect area or are possibly not of the same suspect area. The best-fit analysis is described in more detail below in the discussion on image comparison.
Devices and Systems to Acquire Images of Larger Surfaces
In one embodiment, a system is capable of contemporaneously acquiring images over a variety of locations on a patient. The system can capture images over a larger surface area or can take multiple images over a larger area where the multiple images can be digitally overlaid and matched to form a large image.
Methods of Identifying and Re-Locating a Suspect Area
FIG. 4 shows a patient's hand 300 according to one embodiment. By way of example, a suspect area 302 (e.g., a grouping of melanoma cells) appears on a backside surface 304 of the patient's hand 300. Two spots are identifiable on the backside surface 304: a first spot 306 (e.g., a scar) is located on the ring finger 308, and a second spot 310 (e.g., a freckle) is located on the wrist 312 of the patient's hand 300. The spots 306, 310, respectively, can be freckles, birthmarks, borders of a limb, or some other equivalent feature or landmark that is not susceptible to substantial changes in shape, size, and/or location with respect to its present location on the patient. Further, the spot 306 or 310 may be naturally occurring, such as a freckle, or the spot 306 or 310 may be a portion of a scar, a tattoo, or some other feature that is not susceptible to substantial changes in shape, size, and/or location with respect to its present location on the patient.
One advantage of locating at least one of the spots 306 or 310 on the patient is to use one of the spots 306 or 310 as a reference object 311. The reference object 311, which is equivalent to spot 310 in the illustrated embodiment, provides a starting point from which other key measurements can be taken, as explained in the method below. However, it is appreciated that a reference object 311 is not always necessary when the suspect area can be easily relocated. For example, a group of melanoma cells that is easily and routinely detectable on a patient could be imaged and re-imaged, especially when the images are compared using a best-fit analysis.
The selection of the spots 306, 310 is generally left to the discretion of the medical professional, and it is contemplated that the medical professional will select the spot 310 that is the most stable or least susceptible to change over time. Optionally, the imaging software may also select the spots 306, 310. Although the first spot 306 may have a stable configuration, such as the scar, the location of the spot 306 on the ring finger 308 makes it less attractive as a reference object 311 because the ring finger 308 is easily moveable in relation to the hand 300, which can add error in repetitive measurements taken with respect to the spot 306 on the ring finger 308. In contrast, the second spot 310, shown on the wrist 312, has a more fixed relationship with respect to the suspect area 302 and thus may be a better reference object 311 from which to measure and document the location of the suspect area 302.
It is also advantageous if the reference object 311 or at least a reference marker 318 is located proximate to the suspect area 302 so that the reference object 311 or reference marker 318 can be captured in an image of the suspect area 302. In accordance with the embodiments herein and described in more detail below, it is desirable to manipulate an image by matching or overlaying either the reference objects 311 or reference markers 318 that appear in different images taken at different times. In the illustrated embodiment, the reference marker 318 has a defined size and shape and can be placed quite near the suspect area 302. The advantage, however, of locating the reference object 311 remains unchanged because the placement of the reference marker 318 on the patient is made relative to the location of the reference object 311 on the patient.
Methods of Acquiring a First Image of a Suspect Area
FIG. 5 is a flowchart illustrating a method 400 to identify and take an image of a suspect area 302 on a patient. For clarity and ease of explanation, the method 400 is described in reference to FIG. 4. One aspect of locating the suspect area 302 is to accurately identify, map, and document the reference object 311, the marker 318, if needed, and the suspect area 302 for image comparison purposes, as described in greater detail below.
FIG. 5 shows that the method 400 commences at 402, where the suspect area is identified on the patient. A medical professional, the patient, the computing system, or some other entity may identify the suspect area. Identifying the suspect area most often will be done visually, but it is understood that other approaches may be used, such as the sense of touch.
At 404, a reference object is identified on the patient. At 406, the medical professional determines whether the reference object will be within a first field-of-view or first frame 314 (FIG. 4) of an imaging device. Recall, it is desirable to have the reference object 311 within the first frame 314 because multiple images of the suspect area 302 will be compared to one another. In one embodiment, the first frame 314 is sized to provide an amount of resolution of the suspect area 302 that will be adequate for detailed image processing and evaluation.
If the reference object 311 is advantageously within the first frame 314 of the imaging device, then at 408, a position of the suspect area 302 is determined relative to the reference object 311. In one embodiment, a Cartesian coordinate system (X, Y), having perpendicular axes, is used to determine the position of the suspect area 302 relative to the reference object 311. In another embodiment, a polar coordinate system (r, θ) is used. By way of the exemplary embodiment illustrated in FIG. 4, which employs the Cartesian coordinate system, the reference object 311 is assigned coordinates (0, 0) and the suspect area 302, as measured from the reference object 311, is determined to have coordinates of (a, b) at a point on the suspect area 302 that represents an approximate center point of the suspect area 302. If a smaller image frame 316 (FIG. 4) is necessary, for example to get an image with higher resolution, and the reference object 311 is located outside of the smaller image frame 316, then at 410, the reference marker 318 (FIG. 4) is placed in proximity to the suspect area 302. At 412, a position of the reference marker 318 is determined relative to the reference object 311; for example, the position of the reference marker 318 is determined to have coordinates (c, d). Next, the position of the suspect area 302 is determined relative to the reference marker 318 and, by way of example, has coordinates (e, f).
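For illustration only, this coordinate bookkeeping amounts to chaining offsets; the following Python sketch, with hypothetical millimeter values, computes the suspect area's position relative to the reference object from the marker offsets (c, d) and (e, f):

```python
# A small sketch of the coordinate chaining described above: the suspect
# area is located by adding the offset from the reference object (0, 0) to
# the marker (c, d) and the offset from the marker to the suspect area
# (e, f). All numeric values here are illustrative.

from typing import Tuple

def absolute_position(marker_offset: Tuple[float, float],
                      suspect_offset: Tuple[float, float]) -> Tuple[float, float]:
    """Suspect-area coordinates relative to the reference object."""
    c, d = marker_offset      # marker measured from the reference object
    e, f = suspect_offset     # suspect area measured from the marker
    return (c + e, d + f)

# Marker at (40 mm, 25 mm); suspect area 10 mm right, 5 mm up from the marker.
print(absolute_position((40.0, 25.0), (10.0, 5.0)))  # -> (50.0, 30.0)
```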
In one embodiment, the reference marker 318 is a pen mark on the patient. In another embodiment, the reference marker 318 is a small patch or sticker backed with an adhesive. The shape of the patch is customized so that the reference marker 318 can be placed in a desired orientation during successive examinations of the patient. In the illustrated embodiment of FIG. 4, the reference marker 318 includes a pointed region that points towards the finger tips and parallel sides that substantially align with the sides of the patient's arm. One skilled in the art will understand and appreciate that the reference marker 318 can have a variety of shapes, sizes, colors, contrast features, and textures, and can even have features, like a center dot, to identify an exact starting point for measurements. Additionally or alternatively, human-readable and/or machine-readable indicia can be encoded on the reference marker 318.
It should be understood that in certain aspects of the invention no reference object is utilized per se, and the computer algorithm performs a best fit on two images using only the suspect area or multiple points within the picture frame to make a transformation or best-fit analysis. While user-applied reference markers or the use of anatomical features as reference points are useful and may lead to higher quality results in certain scenarios, they are by no means required and thus should be considered optional embodiments. In addition, when utilizing such analysis, in certain embodiments a computer algorithm can take points on the perimeter of the lesion or suspect area of a first image, perform multiple measurements between any two or more points, and compare such measurements with a second image. The first and second images may be resized, skewed, color balanced, and/or brightness-adjusted to assist in attempting to fit one image to the other. When measurements between points in the first image and measurements between points in the second image are substantially the same, the images can be considered compared, and any deviations outside the error for such comparisons can be noted as a possible change for the user or medical professional to review.
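A minimal Python sketch of this pairwise-measurement idea follows; it assumes the perimeter points have already been extracted from each image and listed in corresponding order, which a real implementation would have to establish during the fit:

```python
# Minimal sketch, assuming corresponding perimeter points: all pairwise
# point-to-point distances are computed for each lesion outline and matching
# pairs are compared, flagging deviations beyond a fractional tolerance.

import itertools
import math

def pairwise_distances(points):
    """All distances between pairs of perimeter points, in a stable order."""
    return [math.dist(p, q) for p, q in itertools.combinations(points, 2)]

def flag_changes(points_a, points_b, tolerance=0.05):
    """Indices of point pairs whose separation changed by more than
    `tolerance` (as a fraction of the first image's distance)."""
    da, db = pairwise_distances(points_a), pairwise_distances(points_b)
    return [i for i, (a, b) in enumerate(zip(da, db))
            if abs(b - a) / a > tolerance]

outline_1 = [(0, 0), (10, 0), (10, 8), (0, 8)]
outline_2 = [(0, 0), (10, 0), (10, 10), (0, 10)]   # lesion grew along one axis
print(flag_changes(outline_1, outline_2))          # -> [1, 2, 3, 4]: pairs changed > 5%
```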
The coordinates of the reference object 311, the reference marker 318, if needed, and the suspect area 302 can be recorded and/or documented on paper or via an electronic medium, for example by entering the data into a computer. In addition, descriptions of these features can also be recorded and/or documented. It is understood that the recordation and/or documentation can be accomplished in a number of known ways, which may be through manual, automatic, paper, or paperless means. At 414, an imaging device 10, 100, 200 is positioned to take an image of the suspect area 302. Because the images will be digitally processed, it is desirable to position the imaging device 10, 100, 200 in a repeatable manner with respect to the suspect area 302. Depending on the type of method used to compare images and the quality of the images, it may be desirable that the distance of the imaging device 10, 100, 200 from, and the angle of the imaging device 10, 100, 200 with respect to, the suspect area 302 be kept substantially constant from one image to the next. However, it is appreciated and understood that the distance and angle of the imaging device 10, 100, 200 with respect to the suspect area 302 can vary by a significant amount from one image to the next, and the analysis/comparison software can use a best-fit analysis to substantially match and align respective images. In addition, it may also be desirable to maintain constant lighting, at least within the frame of the image, in order to more easily detect color changes and/or shape changes of the suspect area 302 when the images are electronically processed and compared.
In another embodiment, the imaging device 10, 100, 200 can include a stereo camera to add depth perception to the resulting image. Stereo cameras that are placed at a constant distance from each other could provide two images, one from each camera, of the suspect area 302. When the images are compared against each other, the depth and/or texture of the suspect area 302 can be determined.
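As a hedged illustration of how depth could be recovered from such a stereo pair, the following Python sketch applies the standard relation Z = f·B/d between depth, focal length, baseline, and pixel disparity; the focal length and baseline values are hypothetical:

```python
# Hedged sketch of the stereo-depth idea: with two cameras a fixed baseline
# apart, the depth of a matched point follows from its left/right pixel
# disparity via Z = f * B / d. Constants below are hypothetical.

def disparity_to_depth_mm(disparity_px: float,
                          focal_length_px: float = 1200.0,
                          baseline_mm: float = 60.0) -> float:
    """Depth of a matched point from its left/right pixel disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_mm / disparity_px

# A raised region yields a larger disparity because it sits closer to the cameras.
print(disparity_to_depth_mm(480.0))  # -> 150.0 mm
```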
Optionally, at 416, a parameter on the imaging device 10, 100, 200 may be adjusted to enhance a quality of the image. For example, light filters or color filters can be coupled with the imaging device 10, 100, 200 as a way to control the light within the frame of the image. Additionally or alternatively, the imaging device 10, 100, 200 can be focused to obtain a desired resolution, thus increasing or decreasing the frame size of the image to be acquired.
At 418, a first image is acquired that captures the suspect area 302 alone or the suspect area with one of either the reference object 311 or the reference marker 318. At 420, the first image is electronically archived. During the archiving process, the first image can be given an identifier such as a file name, label, number, date stamp, or some other association that makes it easy to re-locate the first image in a database. The electronic format of the first image can be any number of common graphics formats such as *.jpg, *.tif, *.bmp, *.gif, or another equivalent format that is readable by a standard computer system.
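As a small illustrative sketch (not a prescribed implementation), an archiving identifier of the kind described above could be assembled as follows; the patient and area labels are hypothetical:

```python
# Illustrative only: build a date-stamped, searchable file name for an
# archived image. The naming scheme and directory are assumptions.

from datetime import date
from pathlib import Path

def archive_name(patient_id: str, area_label: str, ext: str = "jpg") -> Path:
    """Compose an identifier that is easy to re-locate in a database later."""
    stamp = date.today().isoformat()              # e.g. 2006-01-19
    return Path("archive") / f"{patient_id}_{area_label}_{stamp}.{ext}"

print(archive_name("P0042", "left-wrist-01"))     # archive/P0042_left-wrist-01_<date>.jpg
```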
Optionally, at 422, the first image may be preprocessed. Pre-processing the first image may include, but is not limited to, identifying the reference object 311 and/or reference marker 318 in the image; detecting, mapping, and computing the border of the object 311 and/or marker 318; detecting, mapping, and computing the border of the suspect area 302; and/or overlaying a reference grid 608 onto the image as shown in FIG. 6, according to one illustrated, exemplary embodiment.
FIG. 6 shows an image 600 having an image frame 602. Captured within the image frame 602 is an image 604 (i.e., an image of the suspect area 302) and a reference image 606 (i.e., an image of either one of the reference object 311 or the reference marker 318) located nearby or within image 604. The reference grid 608 overlies the image 604 and the reference image 606. One advantage of including the reference grid 608 is that the grid 608 can be printed with the image 600. This allows the medical professional to more easily visually examine the image 604 to identify obvious changes. Another advantage of the reference grid 608 is that it allows the medical professional to more accurately identify, describe, and even communicate respective changes of the suspect area 302 by referring to various quadrants or blocks of the reference grid, which can be colored or coded to indicate regions where substantial change has occurred.
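One possible rendering of such a reference grid overlay, sketched here with the Pillow imaging library (an assumption; no particular library is prescribed above), draws evenly spaced lines over a copy of the image:

```python
# A minimal sketch of the reference-grid overlay using Pillow. The grid
# spacing and color are arbitrary illustrative choices.

from PIL import Image, ImageDraw

def overlay_reference_grid(image: Image.Image, spacing: int = 50,
                           color=(0, 255, 0)) -> Image.Image:
    """Draw evenly spaced horizontal and vertical grid lines over a copy."""
    gridded = image.copy()
    draw = ImageDraw.Draw(gridded)
    w, h = gridded.size
    for x in range(0, w, spacing):
        draw.line([(x, 0), (x, h)], fill=color, width=1)
    for y in range(0, h, spacing):
        draw.line([(0, y), (w, y)], fill=color, width=1)
    return gridded

gridded = overlay_reference_grid(Image.new("RGB", (400, 300), "white"))
gridded.save("image_with_grid.png")
```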
Methods of Acquiring a Subsequent Image of a Suspect Area
FIG. 7 shows a method 700 of acquiring a subsequent image of the suspect area 302 according to one embodiment. Method 700 differs from the previous method 400 in that method 700 includes relocating the suspect area 302 and realigning the imaging device 10, 100, 200. At 702, the suspect area 302 is relocated on the patient. As explained above, one of the purposes of the embodiments described herein is to track the changes of a suspect area 302. Because some patients may have many suspicious areas that are crowded together in one location, or suspicious areas that rapidly change, it is important to relocate the exact area that is to be re-evaluated.
The suspect area 302 can be relocated by visually inspecting the patient, reviewing the patient's records, reviewing the position of a documented reference object and then measuring to obtain the position of the suspect area 302, automated relocation by the computing system, or some combination thereof. At 704, a reference marker 318 can be repositioned on the patient proximate to the suspect area 302, if necessary.
The patient is positioned at 706, and an imaging device 10, 100, 200 is reoriented and/or realigned relative to the suspect area 302. Recall that the distance and angle of the imaging device 10, 100, 200 used to acquire a subsequent image should approximately match the distance and angle of the imaging device 10, 100, 200 for a previous image. The orientation of the imaging device 10, 100, 200 does not have to match exactly, because a transformation algorithm can be used to account for some amount of deviation in the angle, the distance, and even the lighting. At 708, a parameter (e.g., functional features such as zoom, contrast, etc.) on the imaging device 10, 100, 200 may be adjusted to enhance a quality of the image, if necessary.
At 710, a subsequent image is acquired that captures both the suspect area 302 and one of either the reference object 311 or the reference marker 318. At 712, the subsequent image is electronically archived according to the archiving process described above. Optionally, at 714, the subsequent image may be pre-processed as described above and illustrated in FIG. 6.
Image Transformation
FIG. 8A shows two images 600a, 600b undergoing a transformation according to one embodiment. A first image 600a includes a first frame 602a enclosing a first reference image 606a and a first image 604a. The orientation of the first frame 602a results from the angle and position of the imaging device 10, 100, 200 when the image was acquired. A second image 600b includes a second frame 602b enclosing a second reference image 606b and a second image 604b, wherein both the first image 604a and the second image 604b are images of the suspect area 302. It should be understood that the second reference image 606b may be skewed, of a different size, or otherwise misaligned with respect to the first reference image 606a. Thus, the transformation algorithm is used to align, size, deskew, or otherwise manipulate the second reference image 606b to match the first reference image 606a as closely as possible. Moreover, any changes made to the second reference image 606b during the transformation process are made to the entire image 600b and everything enclosed within the image 600b. For example, if the reference image 606b is scaled down by ten percent, then the second frame 602b, the reference grid (not shown for clarity), and the second image 604b are also scaled down by ten percent. Further, it should be noted that reference images 606a and 606b need not be separate from images 604a and 604b, but can be points within or on images 604a and 604b that are considered by the computer algorithm during the fit analysis.
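The transformation step can be illustrated with a short Python/NumPy sketch that estimates a similarity transform (uniform scale, rotation, and translation) from two reference points visible in both images and applies it to every point of the second image, echoing the ten percent scaling example above; the coordinates are hypothetical:

```python
# A hedged sketch of the transformation step: from two reference points that
# appear in both images, estimate the similarity transform mapping the second
# image's points onto the first's, then apply it uniformly to every point --
# frame, grid, and lesion alike.

import numpy as np

def similarity_from_two_points(src, dst):
    """Return (scale, rotation_matrix, translation) mapping src -> dst."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    v_src, v_dst = src[1] - src[0], dst[1] - dst[0]
    scale = np.linalg.norm(v_dst) / np.linalg.norm(v_src)
    angle = np.arctan2(v_dst[1], v_dst[0]) - np.arctan2(v_src[1], v_src[0])
    R = np.array([[np.cos(angle), -np.sin(angle)],
                  [np.sin(angle),  np.cos(angle)]])
    t = dst[0] - scale * R @ src[0]
    return scale, R, t

def apply_transform(points, scale, R, t):
    return (scale * (np.asarray(points, float) @ R.T)) + t

# Reference points as seen in the second image vs. the first image.
s, R, t = similarity_from_two_points([(0, 0), (0, 100)], [(5, 5), (5, 95)])
print(round(s, 3))                          # 0.9: scaled down by ten percent
print(apply_transform([(0, 50)], s, R, t))  # the midpoint lands at (5, 50)
```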
FIG. 8B shows the same two images from FIG. 8A about to be overlaid after the second image 600b has been transformed. Once the images 600a, 600b are overlaid with respect to one another, a comparison algorithm can be employed to detect, map, and document differences, if any, between the first image 604a and the second image 604b.
Methods of Comparing Images
FIG. 9 shows a method 800 to compare subsequent images 600a, 600b taken of a suspect area 302 according to one illustrated embodiment. This comparison can take place in a variety of settings, for example in the facility where the patient is treated or in a remote facility. Comparison done remotely simply means that images of the suspect area 302 are forwarded to another location, where the comparison could be performed by a computer algorithm or by a third-party technician that specializes in performing the image comparisons. The images can be transferred to the remote facility through any available means, for example over a computer network (private or the Internet), through a file transfer protocol (FTP) system, or by courier or regular mail with the images stored on a computer-readable medium such as a compact disk, magnetic storage device, or other equivalent digital storage media.
The method 800 can commence with the first image 600a being compared to a second, subsequent image 600b, for example. In the present embodiment, the images have been electronically archived, but they may not have been pre-processed. In addition, the present embodiment is not limited to the comparison of only two images. It is appreciated and understood that multiple images can be simultaneously compared against a baseline image and/or relative to each other. For example, each image taken over the preceding six months could be simultaneously compared to a first image 600a taken the previous year. In one embodiment, an animation software program can be used to animate the changes in the suspect area 302 over time. For purposes of clarity and brevity, however, the comparison of only two images will be described below.
At 802, the electronically archived images are accessed from a database of images. At 804, the images are pre-processed, if desired. Optionally, at 806, a user may select a baseline image 600a. The baseline image could be a first acquired image, an intermediately acquired image, or an image taken during the patient's previous office visit. For purposes of detecting changes, it is not necessary, but may be helpful, to select a baseline image.
At 808, a mapping algorithm can be used to determine key features, such as a border or perimeter of the reference images 606a, 606b. Alternatively, a mapping algorithm may look for key points on the reference images 606a, 606b with respect to the reference grid 608 (FIG. 6), or may use reference images 606a and 606b as points on images 604a and 604b during the fit analysis. In all of the embodiments described herein, reference images 606a and 606b should be understood to be an image, such as a landmark feature on the surface, or simply a reference point, pixel, or collection of pixels, either separate from images 604a and 604b or on or within 604a or 604b. Thus, the term image should be construed to mean a point that can be captured in a digital form and used as a reference point.
At 810, a transformation algorithm is used to transform the second image 600b into a comparative posture with the first image 600a by using the reference images 606a, 606b. In one embodiment, the second image 600b is scaled up or down in size, rotated, skewed, or otherwise manipulated so that the reference image 606b is approximately the same size, has the same orientation, and is in the same position within the frame 602b as the first reference image 606a of frame 602a. In an alternate embodiment, both images 600a, 600b can be scaled up or down in size, rotated, skewed, or otherwise manipulated so that the respective reference images 606a, 606b are approximately the same size, have the same orientation, and are in the same position with respect to the reference grid 608 (FIG. 6). It should be understood that in certain aspects the reference images 606a and 606b are in fact contained within images 604a and 604b. Such reference images may correspond to points on the perimeter of images 604a and 604b that appear unchanged once the images are sized and overlaid. As one of ordinary skill in the art can readily appreciate, the more reference points taken into account during the fit analysis, the higher the quality of the comparison. Accordingly, in certain embodiments at least two reference points are considered; in other embodiments, at least 3, 4, 5, 6, 7, 8, 9, 10, 15, 20, or more are utilized by the algorithm.
At 812, after the reference images 606a, 606b have been sufficiently matched during the transformation process, a comparison algorithm is used to evaluate and compare the respective images 604a, 604b. Key points or parameters are identified in both the first image 604a and the second image 604b. For example, the overall area, the perimeter or border length, and the percentage change in size in a given quadrant are just some of the parameters that can be evaluated in each respective image 604a, 604b.
In addition, a color, contrast, and/or depth of each of the respective images 604a, 604b can be determined. Balancing the color, brightness, and/or contrast of the respective images is described below. The features that are to be evaluated can be selected by a user or can be selected automatically.
At 814, the images 604a, 604b are compared with respect to one another to identify differences between the evaluated features. For example, the areas or perimeter lengths of the respective images 604a, 604b can be compared. The differences may be subtle, like slight changes in color, or they may be substantial, like a greatly enlarged area of the second image 604b.
At 816, any identified differences are further compared to determine if a threshold is exceeded. For example, the threshold could be a one, two, three, four, five, six, seven, eight, nine, ten, fifteen, twenty, or twenty-five percent increase in the area of the second image 604b compared to the area of the first image 604a. At 818, if there are no detected differences, or if the detected differences do not exceed the threshold, then a notification is provided that no noteworthy changes of the image 604b were detected. At 820, if the detected differences do exceed the threshold, then a notification is provided that noteworthy changes of the image 604b were detected. The threshold can be a user-defined setting, a preprogrammed setting, or an automatically adjustable range depending on the image quality and resolution, for example.
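A minimal sketch of this threshold test, with the threshold and area values chosen arbitrarily for illustration, might look like the following:

```python
# Minimal sketch of the threshold test described above: the percent change
# in a measured attribute (here, area) is compared against a configurable
# threshold, producing a notification either way. Values are illustrative.

def compare_attribute(first_value: float, second_value: float,
                      threshold_percent: float = 10.0) -> str:
    """Flag the change only when it exceeds the configured threshold."""
    change_percent = abs(second_value - first_value) / first_value * 100.0
    if change_percent > threshold_percent:
        return f"Noteworthy change detected: {change_percent:.1f}% difference"
    return f"No noteworthy change: {change_percent:.1f}% difference"

print(compare_attribute(first_value=120.0, second_value=138.0))  # 15% -> flagged
```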
At 822, data is provided detailing the specific changes, for example, shape, color, texture, a shift in position, etc. The data and/or the results obtained from the comparison can be made available to the medical professional in a short amount of time to enable the medical professional to make a more objective, informed diagnosis and to quickly formulate a treatment plan. Additionally or alternatively, a post-processing algorithm can be used to overlay the respective images 600a, 600b on a screen. Color-coding, animation techniques, and other graphic processing techniques can be used to identify areas or regions of greatest change.
In certain embodiments, images may be captured and compared using only a standard imaging device, which includes digital cameras, movie cameras, film cameras, etc., as long as the image to be compared is at some point converted to a digital format, thus allowing computational analysis thereon. In one embodiment an image from a standard consumer-model digital camera is compared against a subsequent image. In such embodiments, the computer algorithm used will perform a best fit or transformation of the images by modifying size, angle, and brightness to obtain the best possible fit prior to analysis for changes. Accordingly, in such embodiments users already having an archive of older images can compare these images. While the error rate for such comparison is slightly higher, the flexibility of being able to review older images far exceeds the risk of a few false positive outcomes that can be easily discounted by the user upon further review. It should also be clearly understood that images taken with no focal length limiter or brightness, color, or contrast control can be compared with images having such controls.
Color and/or Contrast Balancing to Detect a Suspect Area
FIGS. 10A, 10B, and 10C illustrate the color balancing of an image 900. In particular, FIG. 10A shows a digital image 900a of a suspect area 902 and a background region 904 prior to color balancing. FIG. 10B shows the image 900b after a filter has been applied to filter out the background skin region 904 based on the color, brightness, and/or contrast of the suspect area 902 compared to the background skin region 904. FIG. 10C shows the image 900c after it has been preprocessed, which may include, but is not limited to, the application of additional filters to remove other features in the image and/or the application of a reference grid 908 over the image 900c.
Due to slight differences in lighting and environment, the colors in an image will likely not be constant from one image to the next, even though steps are taken to provide constant lighting. Moreover, skin does not provide an adequate background for evaluating color changes of a suspect area because the skin can change color; for example, the skin may be darker in the summer than in the winter.
One advantage of color balancing is to provide an additional parameter that can be compared from one image to the next, for instance the respective darkness or lightness of the respective images. A second advantage of color balancing is that it permits a comparison between the suspect area and the surrounding skin as a means to more accurately detect suspect areas 302 over a larger skin surface.
FIG. 11 shows a method 1000 of detecting suspect areas over a surface of skin 14 by means of image filtration (i.e., color or contrast balancing). At 1002, at least several images over a large area of skin are acquired. The images may have overlapping sections to ensure that the entire skin area was imaged. In addition, reference markers 318 can be placed at various locations on the imaged skin area so that any potential suspect areas 302 that may be discovered can be relocated at a later time.
At 1004, and according to one embodiment, each image is color balanced with respect to a reference color. In one embodiment, the reference color appears in the acquired image. Such a reference could include distance measurements, contrast standards, color standards, etc. The reference color can be a color palette or a single color placed within the frame of the image when the image is acquired (FIG. 2A). The color palette can have a variety of shades or colors thereon. Because the reference colors are electronically isolatable, the color and/or shading of the image can be digitally adjusted until a feature in the image approximately matches a certain reference color.
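By way of illustration, a simple channel-gain form of this balancing can be sketched with NumPy: the imaged palette patch is compared to its known true color, and each channel is scaled accordingly. The patch colors below are hypothetical:

```python
# A hedged sketch of color balancing against an in-frame palette patch: the
# image region covering a known patch is compared to that patch's true color,
# and each RGB channel is scaled so the two match. A real palette would
# supply several patches; values here are illustrative.

import numpy as np

def balance_to_reference(image: np.ndarray, patch_region: np.ndarray,
                         true_patch_color) -> np.ndarray:
    """Scale RGB channels so the imaged palette patch matches its known color."""
    measured = patch_region.reshape(-1, 3).mean(axis=0)          # per-channel mean
    gain = np.asarray(true_patch_color, float) / measured        # channel gains
    return np.clip(image.astype(float) * gain, 0, 255).astype(np.uint8)

image = np.full((4, 4, 3), (210, 180, 150), dtype=np.uint8)      # warm-lit skin
patch = np.full((2, 2, 3), (240, 230, 200), dtype=np.uint8)      # imaged gray patch
balanced = balance_to_reference(image, patch, (225, 225, 225))   # patch is known gray
print(balanced[0, 0])                                            # color-corrected pixel
```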
At 1006, after the color of the image has been adjusted, a spot detection algorithm is used to process each of the respective images to detect any suspect areas 302 in one or more images. In one embodiment, a filter is applied to the image to make darker objects, such as a mole, stand out relative to the skin. The type of filter used will depend on the amount of color contrast between the skin and the suspect area. By way of example, images of a light-skinned person with dark patches on their skin may not require filtering, whereas images of a dark-skinned person with moderately dark patches on their skin may require a series of filters to achieve enough contrast between the skin and the suspect area.
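A hedged sketch of such a spot-detection filter follows; it flags pixels darker than the median skin brightness by a fixed margin, which is an illustrative simplification of the filtering described above:

```python
# Minimal sketch of the spot-detection filter: after balancing, pixels
# sufficiently darker than the median skin brightness are flagged as
# potential suspect areas. The contrast margin is an assumption and, as the
# text notes, would vary with skin tone.

import numpy as np

def detect_dark_spots(gray: np.ndarray, margin: float = 40.0) -> np.ndarray:
    """Boolean mask of pixels darker than the median skin level by `margin`."""
    skin_level = np.median(gray)
    return gray < (skin_level - margin)

gray = np.full((8, 8), 180.0)       # uniform skin brightness
gray[3:5, 3:5] = 90.0               # a dark mole-like patch
mask = detect_dark_spots(gray)
print(int(mask.sum()), "suspect pixels flagged")   # -> 4 suspect pixels flagged
```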
At 1008, notification of any potential suspect areas is provided to the medical professional. At this point, the medical professional could perform a refined evaluation of any potential suspect area by taking higher resolution images of the suspect area and comparing these images over time, as described in detail above. Additionally or alternatively, the medical professional can perform or recommend that a biopsy be taken of the suspect area.
Computing Systems
The computing system for performing the image comparisons may include a number of local computers for receiving downloaded images and at least one mainframe computer for performing the image comparisons. Alternatively, the image comparisons could be performed on the local computers.
The local computer typically includes a processor, memory, multiplex ("Mux") card, video and Ethernet cards, power supply, and an image acquisition card. A number of local computers can be networked together to service a number of patient treatment facilities. The local computer can communicate with other local computers and/or the mainframe computer over a communications link such as a local area network ("LAN") and/or a wide area network ("WAN"). The communications link can be wired and/or wireless. The communications link can employ Internet or World Wide Web communications protocols, and can take the form of a proprietary extranet. In such instances a user could obtain images and upload these to a web-based server that could perform all the analysis and send back to the user only the analysis, or only the analysis that yielded possible changes. In other embodiments, all algorithms could reside on the computer where the images are uploaded, or on a server directly connected thereto. In certain embodiments, patient confidentiality is maintained.
In certain specific embodiments a user could upload all patient information into a database and also have image analysis linked thereto. Accordingly, either on a remote server or housed in the user's facility could be a computer that contains a database with a unique patient identifier; this identifier can be used to add new images to a patient folder, and the image analysis could either be performed immediately while the user waits or be performed in the background. Subsequent to this analysis, a notification could be sent via email, secured web access, or the like that indicates the analysis has been completed and either no action is necessary or further review/action may be required, thus indicating a change was noted between the images.
The various embodiments described above can be combined to provide further embodiments. All of the above U.S. patents, patent applications and publications referred to in this specification are incorporated herein by reference. Aspects can be modified, if necessary, to employ devices, features, and concepts of the various patents, applications and publications to provide yet further embodiments.
These and other changes can be made in light of the above detailed description. In general, the terms used in the claims should not be construed to limit the claims to the specific embodiments disclosed in the specification and the claims, but should be construed to cover all imaging devices, types of image formats, measuring techniques, and image transformation and comparison algorithms. Accordingly, the claims are not limited by the disclosure.