US8348429B2 - Optical coherence tomography device, method, and system - Google Patents

Optical coherence tomography device, method, and system

Info

Publication number
US8348429B2
Authority
US
United States
Prior art keywords
coherence tomography
user
optical coherence
instrument
optical
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US12/111,894
Other versions
US20090244485A1 (en)
Inventor
Alexander C. Walsh
Paul G. Updike
Srinivas R. Sadda
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Doheny Eye Institute of USC
Original Assignee
Doheny Eye Institute of USC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Family has litigation
First worldwide family litigation filed (https://patents.darts-ip.com/?family=40651707&utm_source=google_patent&utm_medium=platform_link&utm_campaign=public_patent_search&patent=US8348429(B2)). “Global patent litigation dataset” by Darts-ip is licensed under a Creative Commons Attribution 4.0 International License.
Application filed by Doheny Eye Institute of USC
Priority to US12/111,894
Assigned to DOHENY EYE INSTITUTE. Assignment of assignors interest (see document for details). Assignors: SADDA, SRINIVAS R.; UPDIKE, PAUL G.; WALSH, ALEXANDER C.
Priority to EP16157648.3A
Priority to PCT/US2009/037448
Priority to EP24165712.1A
Priority to CN202110028197.XA
Priority to CN200980119252.3A
Priority to EP09724806.6A
Priority to CN202510241127.0A
Priority to CN201710950092.3A
Priority to JP2011501911A
Priority to PCT/US2009/037449
Priority to TW98109845A
Publication of US20090244485A1
Priority to US13/717,508
Application granted
Publication of US8348429B2
Priority to JP2014016709A
Priority to US14/521,392
Priority to US15/249,151
Priority to US15/349,970
Priority to US16/184,772
Priority to US17/249,026
Priority to US17/475,153
Priority to US17/811,658
Priority to US18/493,155
Priority to US19/017,198
Legal status: Active (current)
Adjusted expiration

Abstract

In accordance with one aspect of the present invention, an optical coherence tomography instrument is provided that comprises an eyepiece for receiving at least one eye of a user; a light source that outputs light that is directed through the eyepiece into the user's eye; an interferometer configured to produce optical interference using light reflected from the user's eye; an optical detector disposed so as to detect said optical interference; and electronics coupled to the detector. The electronics can be configured to perform a risk assessment analysis based on optical coherence tomography measurements obtained using the interferometer. An output device can be electrically coupled to the electronics and may be configured to output the risk assessment to the user. The optical coherence tomography instrument can be self-administered, and the eyepiece can be a monocular system or a binocular system.

Description

CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims the benefit of U.S. Provisional Application No. 61/040,084, titled, “OPTICAL COHERENCE TOMOGRAPHY DEVICE, METHOD, AND SYSTEM,” and filed Mar. 27, 2008, which is hereby incorporated by reference in its entirety, including without limitation, for example, the optical coherence tomography devices, methods, and systems disclosed therein.
BACKGROUND
1. Field
Embodiments of the invention relate to the field of optical coherence tomography and, in particular, to devices, systems, and methods that utilize optical coherence tomography data to perform precision measurements on eye tissue for the detection of eye diseases.
2. Description of the Related Art
Many industrial, medical, and other applications exist for optical coherence tomography (OCT), which generally refers to an interferometric, non-invasive optical tomographic imaging technique offering millimeter penetration (approximately 2-3 mm in tissue) with micrometer-scale axial and lateral resolution. For example, in medical applications, doctors generally desire a non-invasive, in vivo imaging technique for obtaining sub-surface, cross-sectional and/or three-dimensional images of translucent and/or opaque materials at a resolution equivalent to low-power microscopes. Accordingly, it is projected that in the coming years 20 million OCT scans will be performed per year on patients, most of them in the field of ophthalmology. In current optical coherence tomography systems, doctors or other medical professionals administer the OCT scans in their medical offices or facilities.
SUMMARY
Various embodiments of the present invention relate to the utilization of optical coherence tomography, which generally refers to an interferometric, non-invasive optical tomographic imaging technique, that can be used to detect and analyze, for example, eye tissue, and/or disease features, including but not limited to cystoid retinal degeneration, outer retinal edema, subretinal fluid, subretinal tissue, macular holes, drusen, or the like. For example, and in accordance with one aspect of the present invention, an optical coherence tomography instrument comprises an eyepiece for receiving at least one eye of a user; a light source that outputs light that is directed through the eyepiece into the user's eye; an interferometer configured to produce optical interference using light reflected from the user's eye; an optical detector disposed so as to detect said optical interference; electronics coupled to the detector and configured to perform a risk assessment analysis based on optical coherence tomography measurements obtained using said interferometer; and an output device electrically coupled to the electronics, and the output device may be configured to output the risk assessment to the user through the output device. Generally, the optical coherence tomography instruments, devices, systems, and methods disclosed herein can be self-administered, and the eyepiece can be a monocular system or a binocular system.
In accordance with another aspect of the present invention, an optical coherence tomography instrument comprises first and second oculars for receiving a pair of eyes of a user; a light source that outputs light that is directed through the first and second oculars to the user's eyes; an interferometer configured to produce optical interference using light reflected from the user's eye; an optical detector disposed so as to detect said optical interference; and electronics coupled to the detector and configured to provide an output to a user based on optical coherence tomography measurements obtained using said interferometer.
In other aspects of the present invention, an optical coherence tomography instrument comprises an eyepiece for receiving at least one eye of a user; a light source that outputs a beam that is directed through the eyepiece to the user's eye; an interferometer configured to produce optical interference using light reflected from the user's eye; an optical detector disposed so as to detect said optical interference; electronics coupled to the detector and configured to automatically perform a diagnosis based on optical coherence tomography measurements obtained using said interferometer; and an output device electrically coupled to the electronics, said output device configured to output the diagnosis to the user through the output device.
In accordance with another aspect of the present invention, an optical coherence tomography instrument for providing self-administered disease screening is provided. Said instrument comprises an eyepiece for receiving at least one eye of a user; a light source that outputs a beam that is directed through the eyepiece to the user's eye; an interferometer configured to produce optical interference using light reflected from the user's eye; an optical detector disposed so as to detect said optical interference; a processor in communication with the detector and configured to identify one or more diseases of which the user may manifest indications, based on optical coherence tomography measurements obtained using said interferometer; and an output device electrically coupled to the processor, said output device configured to alert the user.
In other aspects of the present invention, an optical coherence tomography instrument comprises an eyepiece for receiving at least one eye of a user; at least one target display visible through said eyepiece; a light source that outputs light that is directed through the eyepiece to the user's eye; an interferometer configured to produce optical interference using light reflected from the user's eye; an optical detector disposed so as to detect said optical interference; and electronics coupled to the target display and said detector and configured to provide an output to a user based on optical coherence tomography measurements obtained using said interferometer, wherein said electronics is further configured to produce features on said target display of varying size and receive user responses to test a user's visual acuity.
In accordance with another aspect of the present invention, an optical coherence tomography instrument comprises an eyepiece for receiving at least one eye of a user; a light source that outputs light that is directed through the eyepiece to the user's eye; an interferometer configured to produce optical interference using light reflected from the user's eye; an optical detector disposed so as to detect said optical interference; electronics coupled to the detector and configured to provide an output to a user based on optical coherence tomography measurements; and a card reader configured to receive a card from said user, said card reader in communication with said electronics to transmit signals thereto to authorize said electronics to provide the output to said user.
In other aspects of the present invention, an optical coherence tomography instrument comprises an eyepiece for receiving at least one eye of a user; a light source that outputs light that is directed through the eyepiece to the user's eye; an interferometer configured to produce optical interference using light reflected from the user's eye; an optical detector disposed so as to detect said optical interference; electronics coupled to the detector and configured to provide an output to a user based on optical coherence tomography measurements obtained using said interferometer; memory that includes a list of healthcare providers; and an output device electrically coupled to the electronics to provide an output to said user, wherein said electronics is configured to access said memory to provide at least a portion of said list in said output to said user.
In accordance with another aspect of the present invention, an optical coherence tomography instrument comprises an eyepiece for receiving at least one eye of a user; a light source that outputs light that is directed through the eyepiece to the user's eye; an interferometer configured to produce optical interference using light reflected from the user's eye; an optical detector disposed so as to detect said optical interference; electronics coupled to the detector and configured to provide an output to a user based on optical coherence tomography measurements obtained using said interferometer; and an output device electrically coupled to the electronics and configured to provide an output to said user, wherein said electronics is configured to provide a recommended deadline for consulting a healthcare provider based on the patient's risk level of having certain types of diseases as determined using said optical coherence tomography measurements.
In other aspects of the present invention, an optical coherence tomography instrument comprises an eyepiece for receiving an eye of a user; a light source that outputs a light beam that is directed through the eyepiece to the user's eye; an interferometer configured to produce optical interference using light reflected from the user's eye; an optical detector disposed so as to detect said optical interference; an array of display elements visible to the user through the eyepiece, said array of display elements configured to display a display target at different locations across the array; and electronics in communication with said array of display elements and said detector, said electronics configured to use said position of said display target on said array of display elements to associate optical coherence tomography measurements with spatial locations.
In accordance with another aspect of the present invention, an optical coherence tomography instrument comprises an eyepiece for receiving a subject's eye; a light source that outputs light that is directed through the eyepiece to the subject's eye; an interferometer configured to produce optical interference using light reflected from the subject's eye; an optical detector disposed so as to detect said optical interference; memory including statistical information correlating optical coherence tomography measurements with risk of at least one disease; and electronics coupled to the detector and configured to access said memory to compare data obtained based on optical coherence tomography measurements of said subject's eye using said interferometer with said statistical information to provide an assessment of said subject's risk of having said at least one disease.
In other aspects of the present invention, an optical coherence tomography instrument comprises an eyepiece for receiving an eye of a user; a light source that outputs light that is directed through the eyepiece to the user's eye; an adjustable optical element configured to alter the focus of the light directed into the user's eye; an interferometer configured to produce optical interference using light reflected from the user's eye; an optical detector disposed so as to detect said optical interference; electronics coupled to the detector and configured to provide an output to a user based on optical coherence tomography measurements obtained using said interferometer; and an output device electrically coupled to the electronics and configured to provide an output to said user, wherein said electronics is configured to adjust said optical element to focus said light using a signal from said detector.
For purposes of this summary, certain aspects, advantages, and novel features of the invention are described herein. It is to be understood that not necessarily all such aspects, advantages, and features may be employed and/or achieved in accordance with any particular embodiment of the invention. Thus, for example, those skilled in the art will recognize that the invention may be embodied or carried out in a manner that achieves one advantage or group of advantages as taught herein without necessarily achieving other advantages as may be taught or suggested herein.
BRIEF DESCRIPTION OF THE DRAWINGS
The foregoing and other features, aspects and advantages of the present invention are described in detail below with reference to the drawings of various embodiments, which are intended to illustrate and not to limit the invention. The drawings comprise the following figures in which:
FIG. 1 is a schematic diagram of one embodiment of the optical coherence tomography system described herein.
FIG. 2 is a schematic diagram of one embodiment of an interferometer arranged to perform measurements of an eye.
FIG. 3A is a schematic diagram of one embodiment of an OCT system comprising a main body configured to conveniently interface with a person's eyes, the main body being in communication with various systems as described herein.
FIG. 3B is a perspective view schematically illustrating an embodiment of the main body shown in FIG. 3A.
FIG. 4 is a schematic diagram of one embodiment of a spectrometer used to analyze data from an interferometer used for OCT.
FIG. 5 is a schematic diagram of the main body of an OCT system comprising a single display for presenting a display target to a patient.
FIGS. 6A-6C are schematic diagrams illustrating the use of optical coherence tomography to scan retinal tissue to generate A-scans and B-scans.
FIGS. 7A-7F are schematic diagrams illustrating embodiments for adjusting and/or calibrating interpupillary distance.
FIG. 8 is a block diagram schematically illustrating one embodiment of the computer system of the optical coherence tomography system described herein.
FIG. 9 illustrates a process flow diagram of one embodiment of performing precision measurements on retinal tissue for the detection of pathognomonic disease features.
FIGS. 10A-10D illustrate possible embodiments of disposing the main body of an optical coherence tomography device with respect to a user.
FIGS. 11A-11B illustrate possible embodiments of output reports generated by the optical coherence tomography device.
FIG. 12 is a block diagram schematically illustrating another embodiment of the computer system for an optical coherence tomography system described herein.
FIG. 13 is a block diagram schematically illustrating components in one embodiment of the computer system for an optical coherence tomography system described herein.
FIG. 14A is a diagram schematically illustrating one embodiment for determining a risk assessment.
FIG. 14B is a schematic illustration of a plot of risk of retinal disease versus retinal thickness for determining a risk assessment in another embodiment.
FIG. 15 is an illustration of RPE detection and RPE polynomial fit curvature, and the difference therebetween.
FIG. 16 is an illustration of retinal tissue segmented into inner and outer retinal tissue regions.
DETAILED DESCRIPTION
Embodiments of the invention will now be described with reference to the accompanying figures, wherein like numerals refer to like elements throughout. The terminology used in the description presented herein is not intended to be interpreted in any limited or restrictive manner, simply because it is being utilized in conjunction with a detailed description of certain specific embodiments of the invention. Furthermore, embodiments of the invention may comprise several novel features, no single one of which is solely responsible for its desirable attributes or which is essential to practicing the inventions herein described. The embodiments described herein make OCT screening more accessible to users thereby allowing for earlier detection and/or treatment of various diseases, ailments, or conditions, for example, maculopathy, glaucoma, or the like.
The terms “optical coherence tomography” and “OCT” generally refer to an interferometric technique for imaging samples, in some cases, with micrometer lateral resolution. This non-invasive optical tomographic imaging technique is used in ophthalmology to provide cross-sectional images of the eye, and more particularly the posterior of the eye, though it can also be used to image other samples or tissues in areas of the user's body.
Generally, OCT employs an interferometer. Light from a light source (for example, a broadband light source) is split (for example, by a beamsplitter) and travels along a sample arm (generally comprising the sample) and a reference arm (generally comprising a mirror). A portion of the light from the sample arm is reflected by the sample. Light is also reflected from a mirror in the reference arm. Light from the sample arm and the reference arm is recombined, for example by the beamsplitter. When the distance travelled by light in the sample arm is within a coherence length of the distance travelled by light in the reference arm, optical interference occurs, which affects the intensity of the recombined light. The intensity of the combined reflected light varies depending on the sample properties. Thus, variations in the intensity of the measured reflectance are indications of the physical features of the sample being tested.
In time-domain OCT, the length of the reference arm can be varied (for example, by moving one or more reference mirrors). The reflectance observed as the reference arm distance changes indicates sample properties at different depths of the sample. (In some embodiments, the length of the sample arm is varied instead of or in addition to the variation of the reference arm length.) In frequency-domain OCT, the distance of the reference arm can be fixed, and the reflectance can then be measured at different frequencies. For example, the frequency of light emitted from a light source can be scanned across a range of frequencies, or a dispersive element, such as a grating, and a detector array may be used to separate and detect different wavelengths. Fourier analysis can convert the frequency-dependent reflectance properties to distance-dependent reflectance properties, thereby indicating sample properties at different sample depths. In certain embodiments, OCT can provide additional information or data compared to nonmydriatic color fundus imaging.
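The Fourier step just described can be made concrete with a minimal numpy sketch. This is an illustration only, not the patent's implementation; the wavenumber band, reflector depths, and reflectivities below are assumed values chosen so the transform recovers them cleanly.

```python
# Frequency-domain OCT sketch: a spectral interferogram with cosine fringes is
# Fourier transformed into a depth-dependent reflectance profile (an A-scan).
import numpy as np

n_samples = 2048
k = np.linspace(7.5e6, 8.5e6, n_samples)          # angular wavenumber (rad/m), ~800 nm band
reflectors = [(150e-6, 0.8), (320e-6, 0.4)]       # assumed (depth in m, reflectivity)

# Interferogram: DC term plus one cosine fringe per reflector; fringe frequency
# scales with reflector depth.
spectrum = np.ones_like(k)
for depth, reflectivity in reflectors:
    spectrum += 2.0 * np.sqrt(reflectivity) * np.cos(2.0 * k * depth)

# Window to reduce spectral leakage, then transform wavenumber -> depth.
a_scan = np.abs(np.fft.ifft((spectrum - spectrum.mean()) * np.hanning(n_samples)))
dz = np.pi / (k[-1] - k[0])                       # depth per FFT bin (m)

# Local maxima above a threshold correspond to the recovered reflector depths.
mag = a_scan[: n_samples // 2]
is_peak = (mag[1:-1] > mag[:-2]) & (mag[1:-1] > mag[2:]) & (mag[1:-1] > 0.25 * mag.max())
print("recovered depths (um):", (np.flatnonzero(is_peak) + 1) * dz * 1e6)
```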
The term “A-scan” describes the light reflectivity associated with different sample depths. The term “B-scan” as used herein refers to the use of cross-sectional views of tissues formed by assembly of a plurality of A-scans. In the case of ophthalmology, light reflected by eye tissues is converted into electrical signals and can be used to provide data regarding the structure of tissue in the eye and to display a cross-sectional view of the eye. In the case of ophthalmology, A-scans and B-scans can be used, for example, for differentiating normal and abnormal eye tissue or for measuring thicknesses of tissue layers in the eyes.
In ophthalmic instances, an A-scan can generally include data from the cornea to the retina, and a B-scan can include cross-sectional data from a medial border to a lateral border of the eye and from the cornea to the retina. Three-dimensional C-scans can be formed by combining a plurality of B-scans.
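One use of a B-scan mentioned above is measuring the thickness of a tissue layer. The sketch below is a hedged illustration on synthetic data rather than the patent's segmentation algorithm; the depth sampling interval, intensity threshold, and simulated bright band are assumptions.

```python
# Illustrative only: estimate a layer thickness for each A-scan in a B-scan by
# locating the first and last depth samples whose reflectivity exceeds a threshold.
import numpy as np

dz_um = 3.5                                   # assumed depth per sample (micrometers)
rng = np.random.default_rng(0)
b_scan = rng.random((512, 256)) * 0.1         # axes: depth x lateral, background noise
b_scan[200:260, :] += 1.0                     # a bright band ~60 samples thick ("layer")

threshold = 0.5
thickness_um = np.zeros(b_scan.shape[1])
for col in range(b_scan.shape[1]):
    bright = np.flatnonzero(b_scan[:, col] > threshold)
    if bright.size:
        thickness_um[col] = (bright[-1] - bright[0] + 1) * dz_um

print("mean layer thickness (um): %.1f" % thickness_um.mean())   # ~210 um for this synthetic band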
As used herein the terms “user” or “patient” may be used interchangeably, and the foregoing terms comprise without limitation human beings, whether or not under the care of a physician, and other mammals.
The terms “eye scan,” “scanning the eye,” or “scan the eyes,” as used herein, are broad interchangeable terms that generally refer to the measurement of any part or substantially all of the eye, including but not limited to the cornea, the retina, the eye lens, the iris, optic nerve, or any other tissue or nerve related to the eye.
The terms “risk assessment” and “diagnosis” may be used interchangeably in the specification although the terms have different meanings. The term “risk assessment” generally refers to a probability, number, score, grade, estimate, etc. of the likelihood of the existence of one or more illnesses, diseases, ailments, or the like. The term “diagnosis” generally refers to a determination, by examination and/or tests, of the nature and circumstances of an illness, ailment, or diseased condition.
The disclosure herein provides various methods, systems, and devices for generating and utilizing optical coherence tomography image data to perform precision measurements on retinal tissue for the detection of disease features, and generating a risk assessment and/or diagnosis based on data obtained by optical coherence tomography imaging techniques. These methods, systems and devices may employ, in some embodiments, a statistical analysis of the detected disease features obtained by optical coherence tomography imaging techniques. Such methods, systems, and devices can be used to screen for diseases.
With reference to FIG. 1, there is illustrated a block diagram depicting one embodiment of the optical coherence tomography system. In one embodiment, computer system 104 is electrically coupled to an output device 102, a communications medium 108, and a user card reader system 112. The communications medium 108 can enable the computer system 104 to communicate with other remote systems 110. The computer system 104 may be electrically coupled to main body 106, which the user 114 positions near or onto the user's eyes. In the illustrated example, the main body 106 is a binocular system (for example, has two oculars or optical paths for the eyes providing one view for one eye and another view for the other eye, or the like) configured to scan two eyes without repositioning the oculars with respect to the head of the patient, thereby reducing the time to scan a patient. In some embodiments, the eyes are scanned simultaneously using a scanner (for example, a galvanometer), which provides interlaces of measurements from both eyes. Other embodiments are possible as well; for example, the binocular system or a two-ocular system having two respective optical paths to the two eyes can be configured to scan the eyes in series, meaning one eye first and then the second eye. In some embodiments, serial scanning of the eyes comprises scanning a first portion of the first eye, a first portion of the second eye, a second portion of the first eye, and so on. Alternatively, the main body 106 can comprise a monocular system or a one-ocular system or optical path to the eye for performing eye scans.
Referring to FIG. 1, the user 114 can engage handle 118 and position (for example, up, down, or sideways) the main body 106, which is at least partially supported by and connected to a zero gravity arm 116; accordingly the system 100 has no chin rest. In some embodiments, this configuration can introduce positioning error due to movement of the mandible. When the main body 106 is in such a position, the distance between the outermost lens (the lens closest to the user) and the user's eye can range between 10 mm and 30 mm, or 5 mm and 25 mm, or 5 mm and 10 mm. The close proximity of the lens system to the user's eyes increases compactness of the system, reduces position variability when the patient places his eyes (for example, orbital rims) against the main body, and increases the viewing angle of the OCT apparatus when imaging through an undilated pupil. Accordingly, the main body 106 can also comprise eyecups 120 (for example, disposable eyecups) that are configured to contact the user's eye socket to substantially block out ambient light and/or to at least partially support the main body 106 on the eye socket of the user 114. The eyecups 120 have central openings (for example, apertures) to allow passage of light from the light source in the instrument to the eyes. The eyecups 120 can be constructed of paper, cardboard, plastic, silicon, metal, latex, or a combination thereof. The eyecups 120 can be tubular, conical, or cup-shaped flexible or semi-rigid structures with openings on either end. Other materials, shapes, and designs are possible. In some embodiments, the eyecups 120 are constructed of latex that conforms around eyepiece portions of the main body 106. The eyecups 120 are detachable from the main body 106 after the eye scan has been completed, and new eyecups 120 can be attached for a new user to ensure hygiene and/or to protect against the spread of disease. The eyecups 120 can be clear, translucent, or opaque, although opaque eyecups offer the advantage of blocking ambient light for measurement in lit environments.
The main body 106 may comprise one or more eyepieces, an interferometer, one or more target displays, a detector, and/or an alignment system. The optical coherence tomography system may comprise a time domain optical coherence tomography system and/or a spectral domain optical coherence tomography system. Accordingly, in some embodiments, the main body 106 comprises a spectrometer (for example, a grating) and a detector array. The main body may, in some embodiments, comprise signal processing components (for example, electronics) for performing, for example, Fourier transforms. Other types of optical coherence tomography systems may be employed.
FIG. 2 shows a diagram of an example optical coherence tomography system. Light 150 is output from a light source 155. The light source 155 may comprise a broadband light source, such as a superluminescent diode or a white light source. (Alternatively, light emitted from the light source 155 may vary in frequency as a function of time.) The light 150 may comprise collimated light. In one embodiment, light 150 from the light source 155 is collimated with a collimating lens. The light is split at beamsplitter 160. Beamsplitters, as described herein, may comprise without limitation a polarization-based beamsplitter, a temporally based beamsplitter, and/or a 50/50 beamsplitter or other devices and configurations. A portion of the light travels along a sample arm, directed towards a sample, such as an eye 165 of a user 114. Another portion of the light 150 travels along a reference arm, directed towards a reference mirror 170. The light reflected by the sample and the reference mirror 170 is combined at the beamsplitter 160 and sensed either by a one-dimensional photodetector or a two-dimensional detector array such as a charge-coupled device (CCD) or complementary metal-oxide-semiconductor (CMOS) sensor. A two-dimensional array may be included in a full field OCT instrument, which may gather information more quickly than a version that uses a one-dimensional photodetector array instead. In time-domain OCT, the length of the reference arm (which may be determined in part by the position of the reference mirror 170) may vary in time.
Whether interference between the light reflected by the sample and the light reflected by the reference mirror/s occurs will depend on the length of the reference arm (as compared to the length of the sample arm) and the frequency of the light emitted by the light source. High-contrast interference occurs between light travelling similar optical distances (for example, differences less than a coherence length). The coherence length is determined by the bandwidth of the light source. Broadband light sources correspond to smaller coherence lengths.
In time-domain OCT, when the relative length of the reference and sample arms varies over time, the intensity of the output light may be analyzed as a function of time. The light signal detected results from light rays scattered from the sample that interfere constructively with light reflected by the reference mirror/s. Increased interference occurs when the lengths of the sample and reference arms are approximately similar (for example, within about one coherence length in some cases). The light from the reference arm, therefore, will interfere with light reflected from a narrow range of depths within the sample. As the reference (or sample) arm is translated, this narrow range of depths can be moved through the thickness of the sample while the intensity of reflected light is monitored to obtain information about the sample. Samples that scatter light will scatter light back that interferes with the reference arm and thereby produce an interference signal. Using a light source having a short coherence length can provide high resolution (for example, 0.1-10 microns), as the shorter coherence length yields a smaller range of depths that is probed at a single instant in time.
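A small simulation can make this coherence gating concrete. The following is a hedged sketch with assumed numbers (source coherence length, wavelength, reflector depth), not a model of the instrument's actual signal chain.

```python
# Time-domain sketch: fringes appear only where the reference path matches the
# sample reflector to within roughly one coherence length; demodulating the
# detector signal as the mirror scans traces out the depth profile (A-scan).
import numpy as np

coherence_len = 10e-6                         # assumed source coherence length (m)
wavelength = 840e-9
reflector_depth = 200e-6                      # single reflector in the "sample" (m)
z_ref = np.linspace(0.0, 400e-6, 20000)       # reference-mirror scan positions (m)

mismatch = z_ref - reflector_depth
envelope = np.exp(-(mismatch / coherence_len) ** 2)          # coherence gate
fringes = np.cos(4.0 * np.pi * mismatch / wavelength)        # interference fringes
detector = 1.0 + envelope * fringes                          # DC + interference term

# Crude envelope detection: rectify the AC part and smooth over several fringes.
ac = detector - detector.mean()
demodulated = np.convolve(np.abs(ac), np.ones(200) / 200.0, mode="same")
print("estimated reflector depth (um): %.1f" % (z_ref[np.argmax(demodulated)] * 1e6))
```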
In various embodiments of frequency-domain optical coherence tomography, the reference and sample arms are fixed. Light from a broadband light source comprising a plurality of wavelengths is reflected from the sample and interfered with light reflected by the reference mirror/s. The optical spectrum of the reflected signal can be obtained. For example, the light may be input to a spectrometer or a spectrograph comprising, for example, a grating and a detector array, that detects the intensity of light at different frequencies.
Fourier analysis, performed, for example, by a processor, may convert data corresponding to a plurality of frequencies into data corresponding to a plurality of positions within the sample. Thus, data from a plurality of sample depths can be collected simultaneously without the need for scanning of the reference (or sample) arm. Additional details related to frequency domain optical coherence tomography are described in Vakhtin et al. (Vakhtin A B, Kane D J, Wood W R and Peterson K A, "Common-path interferometer for frequency-domain optical coherence tomography," Applied Optics 42(34), 6953-6958 (2003)).
Other methods of performing optical coherence tomography are possible. For example, in some embodiments of frequency domain optical coherence tomography, the frequency of light emitted from a light source varies in time. Thus, differences in light intensity as a function of time relate to different light frequencies. When a spectrally time-varying light source is used, a detector may detect light intensity as a function of time to obtain the optical spectrum of the interference signal. The Fourier transform of the optical spectrum may be employed as described above. A wide variety of other techniques are also possible.
FIG. 3A shows one configuration of main body 106 comprising an optical coherence tomography system and an alignment system. Other optical coherence tomography systems and/or alignment systems may be included in place of or in addition to the systems shown in FIG. 3A. As shown, the main body 106 can include two eyepieces 203, each eyepiece configured to receive an eye from a user 114. In other embodiments, the main body 106 includes only one eyepiece 203.
FIG. 3A shows one representative embodiment of an optical coherence tomography system. Light from a light source 240 may propagate along a path that is modulated, for example, vertically and/or horizontally by one or more beam deflectors 280. A galvanometer may be used for this purpose. The galvanometer 280 can control the horizontal and/or vertical location of a light beam from the light source 240, thereby allowing a plurality of A-scans (and thus one or more B-scans and/or a C-scan) to be formed.
The light from the light source 240 is split at beamsplitter 245. In some embodiments, beamsplitter 245 is replaced by a high frequency switch that uses, for example, a galvanometer, that directs about 100% of the light towards mirror 250a for about ½ of a cycle and then directs about 100% of the light towards mirror 250b for the remainder of the cycle. The light source 240 may include a broadband light source, such as a superluminescent light-emitting diode. Light split at the beamsplitter 245 is then split again at beamsplitter 285a or 285b to form a reference arm and a sample arm. A first portion of the light split at beamsplitter 285a or 285b is reflected by reference mirrors 273a or 273b, reference mirrors 270a or 270b, and reference mirrors 265a or 265b. A second portion of the light split at beamsplitter 285a or 285b is reflected by mirror 250a or 250b, by mirror 255a or 255b, and by mirror 260a or 260b. Mirrors 255a or 255b and mirrors 250a and 250b are connected to a Z-offset adjustment stage 290b. By moving the position of the adjustment stage 290a or 290b, the length of the sample arm is adjusted. Thus, the adjustment stage 290a or 290b can adjust the difference between the optical length from the light source 240 to a portion of the sample and the optical length from the light source 240 to the reference mirror 270a or 270b and/or reference mirror 273a or 273b. This difference can be made small, for example, less than a coherence length, thereby allowing optical interference to occur. In some embodiments, the positions of one or more reference mirrors (for example, reference mirror 270a or 270b and reference mirror 273a or 273b) are movable in addition to or instead of the adjustment stage being movable. Thus, the length of the reference arm and/or of the sample arm may be adjustable. The position of the adjustment stages 290a and/or 290b may be based on the signals from the device, as described in more detail below.
The light reflected by mirror 260a or 260b is combined with light from display 215a or 215b at beamsplitter 230a or 230b. The displays 215a and 215b may comprise one or more light sources, such as in an emissive display like an array of matrix LEDs. Other types of displays can be used. The display can display targets of varying shapes and configurations, including a bar and/or one or more dots. A portion of the optical path from the light source 240 to the eye may be coaxial with a portion of the path from the displays 215a and 215b to the eye. These portions may extend through the eyepiece. Accordingly, a light beam from the light source 240 is coaxial with a light beam from the displays 215a and 215b such that the eyes can be positioned and aligned with respect to the eyepieces using the displays.
As described in greater detail below, for example, the user 114 may use images from the displays in order to adjust interpupillary distance. In various embodiments, for example, proper alignment of two images presented by the displays may indicate that the interpupillary distance is appropriately adjusted. Thus, one or more adjustment controls 235 may be used to adjust the distance between the display targets 215a and 215b and/or between the eyepieces 203. The adjustment controls 235 may be provided on the sides of the main body 106 or elsewhere. In certain embodiments, the adjustment control 204 may comprise a handle on the main body 106, as shown in FIG. 3B. In this embodiment, rotation of the adjustment control 204 may increase or decrease the interpupillary distance.
The combined light (that is reflected by mirror 260a or 260b and that comes from display 215a or 215b) is focused by adjustable powered optics (for example, lens) 210, possibly in conjunction with optical element 205. The adjustable optics 210 may comprise a zoom lens or lens system that may have, for example, a focal length and/or power that is adjustable. The adjustable optics 210 may be part of an auto-focus system or may be manually adjusted. The adjustable optics 210 may provide optical correction for those in need of such correction (for example, a user whose glasses are removed during testing). The position of the powered optics 210 may be based on the signals obtained from the device, as described in more detail below. The focused light then travels through eyepiece windows or lens 205, positioned at a proximal end of the eyepiece 203, towards the eye of a user 114. In the case where a lens 205 is included, this lens 205 may contribute to focusing of the light into the eye.
This light focused into the eye may be scattered by tissue or features therein. A portion of this scattered light may be directed back into the eyepiece. Lens 205 may thus receive light 207 reflected from the user's eye, which travels through the powered optics 210 and reflects off of the beamsplitter 230a or 230b towards beamsplitter 220a or 220b, which reflects the light towards mirrors 295a or 295b. At 295a or 295b, light reflected by the sample interferes with light in the reference arm (the path between beamsplitter 285a or 285b and beamsplitter 295a or 295b that includes mirrors 273a or 273b and 270a or 270b). (Accordingly, the sample arm includes the optical path between beamsplitter 285a or 285b and beamsplitter 295a or 295b that includes mirrors 250a or 250b and 255a or 255b and the sample or eye.) The light is then reflected by mirror 225a or 225b towards switch 275. In some embodiments, the switch 275 comprises a switchable deflector that switches optical paths to the first or second eye to collect data from the respective eye to be sent to the data acquisition device 202. The switch may comprise a low-frequency switch, such that all data to be collected from one eye is obtained before the data is collected from the other eye. Alternatively, the switch may comprise a high-frequency switch, which may interlace data collected from each eye.
The instrument may be configured differently. For example, a common reference path may be used for each eye. In some embodiments, the reference arm includes one or more movable mirrors to adjust the optical path length difference between the reference and sample arms. Components may be added, removed, or repositioned in other embodiments. Other techniques may be used.
Although not shown, polarizers and polarizing beamsplitters, for example, may be used to control the propagation of light through the optical path in the optical system. Other variations are possible. Other designs may be used.
In some embodiments, an A-scan may be formed in the time domain. In these instances, the Z-offset adjustment stage and corresponding mirror 250a or 250b and mirror 255a or 255b may change positions in time. Alternatively, reference mirrors 270a and 270b and reference mirrors 273a and 273b or other mirrors in the reference or sample arms may be translated. The combined light associated with various mirror positions may be analyzed to determine characteristics of an eye as a function of depth. In other embodiments, an A-scan may be formed in the spectral domain. In these instances, the frequencies of the combined light may be analyzed to determine characteristics of an eye as a function of depth. Additionally, one or more galvanometers 280 can control the horizontal and/or vertical location of the A-scan. Thus, a plurality of A-scans can be obtained to form a B-scan and/or a C-scan.
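The assembly of A-scans into B-scans and a C-scan can be sketched as follows. The acquisition function, array shapes, and scan order are placeholders for illustration, not the device's actual interface.

```python
# Raster acquisition sketch: step the galvanometer over a grid of lateral
# positions, acquire one A-scan per position, and stack the results so that
# each slice of the volume is a B-scan. acquire_a_scan() is a stand-in.
import numpy as np

N_DEPTH, N_X, N_Y = 512, 128, 64              # assumed samples per A-scan and grid size

def acquire_a_scan(ix: int, iy: int) -> np.ndarray:
    """Placeholder returning one depth profile for a galvanometer position."""
    return np.random.rand(N_DEPTH)

c_scan = np.zeros((N_Y, N_X, N_DEPTH))
for iy in range(N_Y):                         # slow scan direction (e.g., vertical)
    for ix in range(N_X):                     # fast scan direction (e.g., horizontal)
        c_scan[iy, ix] = acquire_a_scan(ix, iy)

b_scan = c_scan[N_Y // 2]                     # one cross-sectional slice is a B-scan
print("B-scan shape (lateral x depth):", b_scan.shape)
```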
Light output from the structure 275 can be input into a data acquisition device 202, which may comprise, for example, a spectrometer or a light meter. A grating may be in the main body 106. The data acquisition device 202 is coupled to a computer system 104, which may present output based on scans to the user 114. The output device may include a monitor screen, on which output results are displayed. The output device may include a printer, which prints output results. The output device may be configured to store data on a portable medium, such as a compact disc or USB drive, or a custom portable data storage device.
In some embodiments, the computer system 104 analyzes data received by the data acquisition device 202 in order to determine whether one or more of the adjustment stages 290a and/or 290b and/or the powered optics 210 should be adjusted. In one instance, an A-scan is analyzed to determine a position (for example, a coarse position) of the retina such that data on the retina may be obtained by the instrument. In some embodiments, each A-scan comprises a plurality of light intensity values, each associated with a different depth into the sample. The A-scan may be obtained, in some embodiments, by translating the Z adjustment stage 290a or 290b. Likewise, the A-scan comprises values of reflected signal obtained for different locations of the Z adjustment stage. The retina reflects more light than other parts of the eye, and thus, it is possible to determine a position of the adjustment stage 290a or 290b that effectively images the retina by assessing which depths provide an increase in reflected intensity. In some embodiments, the Z adjustment stage may be translated and the intensity values may be monitored. An extended peak in intensity for a number of Z adjustment stage positions may correspond to the retina. A variety of different approaches and values may be monitored to determine the location of the retina. For example, multiple A-scans may be obtained at different depths, and the integrated intensity of each scan may be obtained and compared to determine which depth provided a peak integrated intensity. In certain embodiments, intensity values within an A-scan can be compared to other values within the A-scan and/or to a threshold. The intensity value corresponding to the preferred location may be greater than a preset or relative threshold and/or may be different from the rest of the intensity values (for example, by more than a specified number of standard deviations). A wide variety of approaches may be employed.
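One of the approaches mentioned above, comparing integrated A-scan intensity across Z-stage positions and flagging an outlier in standard-deviation terms, is sketched below on synthetic data. The stage range, step size, threshold, and the acquisition call are placeholders, not the instrument's real API.

```python
# Hedged sketch of the coarse retina search: step the Z-offset stage, integrate
# each A-scan, and accept the stage position whose integrated reflectance stands
# out from the rest by more than two standard deviations.
import numpy as np

def acquire_a_scan_at(stage_mm: float) -> np.ndarray:
    """Placeholder: one A-scan with the Z stage at this position (synthetic data)."""
    signal = np.random.rand(512) * 0.05
    if 11.0 <= stage_mm <= 12.0:              # pretend the retina lies in this range
        signal += 0.8 * np.exp(-np.linspace(-3.0, 3.0, 512) ** 2)
    return signal

stage_positions = np.arange(8.0, 16.0, 0.25)  # mm, assumed search range
integrated = np.array([acquire_a_scan_at(z).sum() for z in stage_positions])

score = (integrated - integrated.mean()) / integrated.std()
best = stage_positions[np.argmax(score)]
if score.max() > 2.0:
    print("retina located near Z stage position %.2f mm" % best)
else:
    print("no clear retinal signal; continue searching")
```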
After the positions of the adjustment stages 290a and 290b have been determined, subsequent image analysis may be performed to account for vibration or movement of the user's head, eyes, or retinas relative to the light source 240. A feedback system, such as a closed loop feedback system, may be employed in an effort to provide a more stabilized signal in the presence of such motion. The optical coherence tomography signal may be monitored and feedback provided to, for example, one or more translation stages to compensate for such vibration or movement. In some embodiments, subsequent image analysis may be based on the initial image and/or may detect changes in image characteristics. For example, the image analysis may determine that the brightest pixel within an A-scan has moved 3 pixels from a previous scan. The adjustment stage 290a or 290b may thus be moved based on this analysis. Other approaches may be used.
In some instances, optical coherence tomography signals are used to adjust the powered optics 210 to provide for increased or improved focus, for example, when a patient needs refractive correction. Many users/patients, for example, may wear glasses and may be tested while not wearing any glasses. The powered optics 210 may be adjusted based on the reflected signal to determine what added correction enhances signal quality or is otherwise an improvement. Accordingly, in some embodiments, a plurality of A-scans is analyzed in order to determine a position for the powered optics 210. In some embodiments, this determination occurs after the position of the adjustment stage 290a or 290b has been determined. One or more A-scans, one or more B-scans, or a C-scan may be obtained for each of a plurality of positions of the powered optics 210. These scans may be analyzed to assess, for example, image quality. The position of the powered optics 210 may be chosen based on these image quality measures.
The image quality measure may include a noise measure. The noise measure may be estimated based on the distribution of different intensity levels of reflected light within the scans. For example, lower signals may be associated with noise. Conversely, the highest signals may be associated with a saturated signal. A noise measure may be compared to a saturation measure, as in signal-to-noise ratios or variants thereof. The lowest reflectivity measured (referred to as a low measure or low value) may also be considered. In some embodiments, the positions of the adjustment stages 290a and/or 290b and/or the powered optics 210 are determined based upon a signal-to-noise measure, a signal strength measure, a noise measure, a saturation measure, and a low measure. Different combinations of these parameters may also be used. Values obtained by integrating parameters over a number of positions or scans, etc., may also be used. Other parameters as well as other image quality assessments may also be used.
In one embodiment, a noise value is estimated to be a reflected light value for which approximately 75% of the measured reflected light is below and approximately 25% of the measured reflected light is above. The saturation value is estimated to be a reflected light value for which approximately 99% of the measured reflected light is below and approximately 1% of the measured reflected light is above. A middle value is defined as the mean value of the noise value and the saturation value. An intensity ratio is defined as the difference between the saturation value and the low value, divided by the low value, multiplied by 100. A tissue signal ratio is defined as the number of reflected light values between the middle value and the saturation value divided by the number of reflected light values between the noise value and the saturation value. A quality value is defined as the intensity ratio multiplied by the tissue signal ratio. Additional details are described, for example, in Stein D M, Ishikawa H, Hariprasad R, Wollstein G, Noecker R J, Fujimoto J G, Schuman J S, "A new quality assessment parameter for optical coherence tomography," Br J Ophthalmol. 2006;90:186-190. A variety of other approaches may be used to obtain a figure of merit to use to measure performance and adjust the instrument accordingly.
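The figure of merit defined in the preceding paragraph can be written out directly. The following is a sketch of that calculation on synthetic data; whether the instrument applies it per A-scan or per B-scan, and in what units, is not specified here and is assumed for illustration.

```python
# Quality value as defined above (after Stein et al.): noise and saturation
# values from the 75th and 99th percentiles of the reflectance distribution,
# an intensity ratio, a tissue signal ratio, and their product.
import numpy as np

def scan_quality(reflectance: np.ndarray) -> float:
    low = reflectance.min()                        # lowest measured reflectivity (assumed nonzero)
    noise = np.percentile(reflectance, 75)         # ~75% of samples fall below this value
    saturation = np.percentile(reflectance, 99)    # ~99% of samples fall below this value
    middle = 0.5 * (noise + saturation)

    intensity_ratio = (saturation - low) / low * 100.0
    tissue = np.count_nonzero((reflectance >= middle) & (reflectance <= saturation))
    total = np.count_nonzero((reflectance >= noise) & (reflectance <= saturation))
    tissue_signal_ratio = tissue / total if total else 0.0
    return intensity_ratio * tissue_signal_ratio

# Synthetic example: mostly dim background with some brighter "tissue" samples.
rng = np.random.default_rng(1)
scan = np.concatenate([rng.random(9000) * 0.2 + 0.01, rng.random(1000) * 0.8 + 0.2])
print("quality value: %.1f" % scan_quality(scan))
```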
In the case of adjusting the adjustable powered optics 210, in some embodiments, a plurality of positions is tested. For example, the powered optics may be continuously moved in defined increments towards the eyes for each scan or set of scans. Alternatively, the plurality of positions may depend on previously determined image quality measures. For example, if a first movement of the powered optics 210 towards the eye improved an image quality measure but a subsequent second movement towards the eye decreased an image quality measure, the third movement may be away from the eye. Accordingly, optical power settings may be obtained that improve and/or maintain an improved signal. This optical power setting may correspond to optical correction and increase the focus of the light beam in the eye, for example, on the retina, in some embodiments.
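One way to realize the incremental search just described is a simple hill climb over lens positions, scoring each position with an image quality measure such as the quality value sketched earlier. The function below is a hedged sketch; the motor, acquisition, and scoring calls are placeholders supplied by the caller, not the device's actual API.

```python
# Step the adjustable optics, score a scan at each setting, keep moving while
# the score improves, and reverse with a smaller step when it worsens.

def focus_search(acquire_scan, move_optics, score, start=0.0, step=0.25, max_steps=20):
    position = start
    best_score = score(acquire_scan(position))
    direction = 1.0
    for _ in range(max_steps):
        candidate = position + direction * step
        move_optics(candidate)
        candidate_score = score(acquire_scan(candidate))
        if candidate_score > best_score:      # improvement: accept and keep going
            position, best_score = candidate, candidate_score
        else:                                 # got worse: move back, reverse, refine step
            move_optics(position)
            direction = -direction
            step *= 0.5
    return position, best_score

# Toy demonstration with a synthetic quality curve peaking at +1.0 (arbitrary units).
best_position, _ = focus_search(
    acquire_scan=lambda p: p,
    move_optics=lambda p: None,
    score=lambda p: -(p - 1.0) ** 2,
)
print("best focus setting: %.2f" % best_position)
```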
As described above, various embodiments employ an arrangement wherein a pair of oculars is employed. Accordingly, such adjustments may be applied to each of the eyes, as a user may have eyes of different size and the retinas may be located at different depths, and thus a pair of Z adjustment stages may be used in some embodiments. Similarly, a user may have different prescription optical corrections for the different eyes. A variety of arrangements may be employed to accommodate such needs. For example, measurements and/or adjustments may be performed and completed on one eye and subsequently performed and completed on the other eye. Alternatively, the measurements and/or adjustments may be performed simultaneously or interlaced. A wide variety of other variations are possible.
FIG. 4 shows a diagram of a spectrometer 400 that can be used as a data acquisition device 202 for a frequency domain OCT system. Light 405 input into the spectrometer 400 is collected by collecting lens 410. The collected light then projects through a slit 415, after which it is collimated by the collimating lens 420. The collimated light is separated into various spectral components by a grating 425. The grating 425 may have optical power to focus the spectral distribution onto an image plane. Notably, other separation components, such as a prism, may be used to separate the light. The separated light is then directed onto a detector array by focusing lens 430, such that spectral components of each frequency from various light rays are measured.
A wide variety of OCT designs are possible. For example, frequency can be varied with time. The reference and sample arms can overlap. In some embodiments, a reference arm is distinct from a sample arm, while in other embodiments, the reference arm and sample arm are shared. See, for example, Vakhtin A B, Kane D J, Wood W R and Peterson K A. “Common-path interferometer for frequency-domain optical coherence tomography,” Applied Optics. 42(34), 6953-6958 (2003). The OCT arrangements should not be limited to those described herein. Other variations are possible.
In some embodiments, as shown in FIG. 5, the main body 106 includes only a single display target 215. Light from the display target 215 is split at an x-prism 505. Notably, other optical devices that split the source light into a plurality of light rays may be used. This split light is reflected at mirror 510a or 510b and directed towards the user 114.
The user may be directed to fixate on a display target 215 while one or more galvanometers 280 move light from the light source 240 to image an area of tissue. In some embodiments, the display targets 215 are moved within the user's field of vision while an area of tissue is imaged. For example, in FIG. 6A, a display target 215 may be moved horizontally (for example, in the medial-lateral direction), such that a patient is directed to look from left to right or from right to left. Meanwhile, a vertical scanner (for example, a galvanometer) allows the vertical location (for example, in the superior-inferior direction) of the sample scanning to change in time. FIG. 6A shows an eye, which is directed to move in the horizontal direction 605. Due to the vertical scanner, the scanned trajectory 610 covers a large portion of the eye 600. Scanning in the vertical and horizontal directions can produce a C-scan. In some embodiments, continuous and/or regularly patterned A-scans are combined to form a full scan, for example, a B-scan or C-scan. In other embodiments, discrete and/or random A-scans are combined to form the full scan. Systems configured such that users 114 are directed to move their eyes throughout a scan may include fewer scanners than comparable systems configured such that users 114 keep their eyes fixated at a stationary target. For example, instead of a system comprising both a vertical and a horizontal scanner, the user 114 may move his eyes in the horizontal direction, thereby eliminating the need for a horizontal scanner.
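The scan geometry described above, where the eye follows a horizontally sweeping fixation target while a single vertical scanner deflects the beam, can be visualized with a short sketch. The sweep range, scanner frequency, and sample count are assumptions for illustration only.

```python
# With the eye tracking a horizontal target sweep and the galvanometer scanning
# vertically, the sampled beam positions cover a 2-D patch of retina even though
# the instrument has no horizontal scanner.
import numpy as np

n_samples = 5000
t = np.linspace(0.0, 1.0, n_samples)                   # normalized time over one sweep
horizontal_deg = -10.0 + 20.0 * t                      # eye rotation following the target
vertical_deg = 10.0 * np.sin(2.0 * np.pi * 40.0 * t)   # fast vertical scanner deflection

spots = np.column_stack([horizontal_deg, vertical_deg])  # beam position on the retina (deg)
print("covered region: %.0f deg (horizontal) x %.0f deg (vertical)"
      % (np.ptp(spots[:, 0]), np.ptp(spots[:, 1])))
```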
FIG. 6B shows an example of an A-scan. The A-scan comprises the signal strength (indicated by the brightness) as a function of depth for one horizontal and vertical position. Thus, an A-scan comprises a plurality of intensity values corresponding to different anterior-posterior positions. A plurality of A-scans form a B-scan. FIG. 6C shows a B-scan, in which the largest portion of the bright signal corresponds to retinal tissue and the elevated region under the retina corresponds to diseased tissue within the eye.
With reference to FIG. 7A, there is illustrated an enlarged view depicting an embodiment of the main body 106 that is configured with a handle 118 for adjusting the eyepieces to conform to the user's interpupillary distance. In the illustrative embodiment, the main body 106 comprises a left eyepiece 712 and a right eyepiece 714, wherein each is connected to the other by interpupillary distance adjustment device 718. The interpupillary distance adjustment device 718 is coupled to the handle 118, wherein the handle 118 is configured to allow the user to engage the handle 118 to adjust the distance between the left and right eyepieces 712, 714 to match or substantially conform to the interpupillary distance between the eyes of the user.
Referring to FIG. 7A, the user can rotate, turn, or twist the handle 118 to adjust the distance between the left and right eyepieces 712, 714 so as to match or substantially conform to the interpupillary distance between the eyes of the user. Alternatively, the handle 118 can be configured to move side to side to allow the user to adjust the distance between the left and right eyepieces 712, 714. Additionally, the handle 118 can be configured to move forward and backward to allow the user to adjust the distance between the left and right eyepieces 712, 714. In the alternative, the handle 118 can be configured to move up and down to allow the user to adjust the distance between the left and right eyepieces 712, 714. In another embodiment, the distance between the left and right eyepieces 712, 714 can be adjusted and/or controlled by a motor activated by the user. Alternatively, the motor can be configured to be controlled by computer system 104 to semi-automatically position the left and right eyepieces 712, 714 to match the interpupillary distance between the eyes of the user. In these instances, eye tracking devices may be included with a system described herein. In other embodiments, a combination of the foregoing is utilized to adjust the distance between the left and right eyepieces 712, 714 to match or substantially conform to the user's interpupillary distance.
A user 114 may adjust interpupillary distance based on the user's viewing of one or more fixation targets on one or more displays 215. For example, the displays 215 and the fixation targets may be configured such that the user views two aligned images, which may form a single, complete image when the interpupillary distance is appropriate for the user 114. The user 114 may adjust (for example, rotate) an adjustment control 204 to change the interpupillary distance based on the fixation target images, as shown in FIG. 7A. FIGS. 7B-7F illustrate one embodiment of fixation targets as seen by the viewer under a plurality of conditions; however, other fixation targets are possible, including but not limited to a box configuration. FIG. 7B shows a U-shaped fixation target 715a on the display 215a for the left eye. FIG. 7C shows an upside-down U-shaped fixation target 715b on the display 215b for the right eye.
When the interpupillary distance is appropriately adjusted, the bottom and top images 715a and 715b are aligned, as shown in FIG. 7D, to form a complete H-shaped fixation target 715. When the interpupillary distance is too narrow, the fixation target 715a on the display 215a for the left eye appears shifted to the right and the fixation target on the display 215b for the right eye appears shifted to the left, and the user sees the image shown in FIG. 7E. Conversely, when the interpupillary distance is too wide, the fixation target 715a on the display 215a for the left eye appears shifted to the left and the fixation target on the display 215b for the right eye appears shifted to the right, and the user sees the image shown in FIG. 7F. Thus, the interpupillary distance may be adjusted based on these images.
In particular, in FIG. 7D, the alignment image 715 is in the shape of an “H.” Thus, when the interpupillary distance is properly adjusted, the fixation targets on the left and right displays overlap to form an “H.” Other alignment images 715 may be provided.
With reference to FIG. 8, there is illustrated an embodiment of the computer system 104. In the illustrated embodiment, the computer system 104 can comprise a scan control and analysis module 824 configured to control the scanning operations performed by the main body 106. The computer system 104 can also comprise a fixation marker control system 822 configured to display a fixation marker visible by the user from the main body 106. In certain embodiments, the fixation marker is displayed as an “X,” a dot, a box, or the like. The fixation marker can be configured to move horizontally, vertically, diagonally, circularly, or a combination thereof. The fixation marker can be repositioned quickly to relocate the beam location on the retina as the eye repositions itself. The computer system 104 can also comprise a focus adjust module 820 for automatically adjusting the focusing lenses in the main body 106 as further discussed herein. The computer system 104 can also comprise a Z positioning module 818 for automatically adjusting the Z offset as herein discussed.
Referring to FIG. 8, the computer system 104 comprises in the illustrative embodiment a disease risk assessment/diagnosis module 808 for storing and accessing information, data, and algorithms for determining and assessing the risk or likelihood of disease, and/or generating a diagnosis based on the data and/or measurements obtained from scanning the eyes of the user. In one embodiment, the scan control and analysis module 824 is configured to compare the data received from the main body 106 to the data stored in the disease risk assessment/diagnosis module 808 in order to generate a risk assessment and/or diagnosis of disease in the eyes of the user as further illustrated. The computer system 104 can also comprise an image/scans database configured to store images and/or scans generated by the main body 106 for a plurality of users, and to store a unique identifier associated with each image and/or scan. In certain embodiments, the scan control and analysis module 824 uses historical images and/or scans of a specific user to compare with current images and/or scans of the same user to detect changes in the eyes of the user. In certain embodiments, the scan control and analysis module 824 uses the detected changes to help generate a risk assessment and/or diagnosis of disease in the eyes of the user.
In the illustrative embodiment shown in FIG. 8, the computer system 104 can comprise a user/patient database 802 for storing and accessing patient information, for example, user name, date of birth, mailing address, residence address, office address, unique identifier, age, affiliated doctor, telephone number, email address, social security number, ethnicity, gender, dietary history and related information, lifestyle and/or exercise history information, use of corrective lenses, family health history, medical and/or ophthalmic history, prior procedures, or other similar user information. The computer system 104 can also comprise a physician referral database for storing and accessing physician information, for example, physician name, physician training and/or expertise/specialty, physician office address, physician telephone number and/or email address, physician scheduling availability, physician rating or quality, physician office hours, or other physician information.
In reference to FIG. 8, the computer system 104 can also comprise a user interface module 805 (which can comprise without limitation commonly available input/output (I/O) devices and interfaces as described herein) configured to communicate, instruct, and/or interact with the user through audible verbal commands, a voice recognition interface, a key pad, toggles, a joystick handle, switches, buttons, a visual display, a touch screen display, etc., or a combination thereof. In certain embodiments, the user interface module 805 is configured to instruct and/or guide the user in utilizing and/or positioning the main body 106 of the optical coherence tomography system 100. The computer system 104 can also comprise a reporting/output module 806 configured to generate, output, display, and/or print a report (for example, FIGS. 10A and 10B) comprising the risk assessment and/or diagnosis generated by the disease risk assessment/diagnosis module 808. In other embodiments, the report comprises at least one recommended physician to contact regarding the risk assessment.
Referring to FIG. 8, the computer system 104 can also comprise an authentication module 816 for interfacing with the user card reader system 112, wherein a user can insert a user identification card into the user card reader system 112. In certain embodiments, the authentication module 816 is configured to authenticate the user by reading the data from the identification card and comparing and/or storing the information with the data stored in the user/patient database 802. In certain embodiments, the authentication module 816 is configured to read or obtain the user's insurance information from the user's identification card through the user card reader system 112. The authentication module 816 can be configured to compare the user's insurance information with the data stored in the insurance acceptance database 828 to determine whether the user's insurance is accepted or whether the user's insurance company will pay for scanning the user's eyes. In other embodiments, the authentication module communicates with the billing module 810 to send a message and/or invoice to the user's insurance company and/or device manufacturer to request payment for performing a scan of the patient's eyes. The card can activate one or more functions of the machine allowing the user, for example, to have a test performed or receive output from the machine. In other embodiments, the billing module 810 is configured to communicate with the user interface module 805 to request payment from the user to pay for all or some (for example, a co-pay) of the cost for performing the scan. In certain embodiments, the billing module 810 is configured to communicate with the user card reader system 112 to obtain card information from the user's credit card, debit card, or gift card, or to draw down credit stored on the user's identification card. Alternatively, the billing module 810 is configured to receive payment from the user by communicating with and/or controlling an interface device for receiving paper money, coins, tokens, or the like. Alternatively, the billing module 810 is configured to receive payment from the user by communicating with the user's mobile device through Bluetooth® or other communications protocols/channels in order to obtain credit card information, billing address, or to charge the user's mobile network service account (for example, the cellular carrier network).
With reference to FIG. 8, the user card may be used by insurers to track which users have used the system. In one embodiment, the system can print (on the face of the card) or store (in a chip or magnetic stripe) the scan results, risk assessment, and/or report directly onto or into the card that the patient inserts into the system (wherein the card is returned to the user). The system can be configured to store multiple scan results, risk assessments, and/or reports, and/or clear prior scan results, risk assessments, and/or reports before storing new information on the magnetic stripe. In certain embodiments, the calculation of the risk assessment is performed by the system (for example, the scan control and analysis module 824). In certain embodiments, the calculated risk assessment is transmitted to a centralized server system (for example, remote systems 110) in another location that provides the results via a web page to physicians, users, patients, or the like. The centralized server system (for example, remote system 110) allows the user, patients, or doctors to enter their card code to see the results, which are saved in the centralized database.
In the example embodiment of FIG. 8, the computer system 104 can comprise a network interface 812 and a firewall 814 for communicating with other remote systems 110 through a communications medium 108. Other remote systems 110 can comprise without limitation a system for checking the status/accuracy of the optical coherence tomography system 100; a system for updating the disease risk assessment/diagnosis database 808, the insurance acceptance database 828, the physician referral database 804, and/or the scan control and analysis module 824. In certain embodiments, the computer system 104 can be configured to communicate with a remote system 110 to conduct a primary and/or secondary risk assessment based on the data from scanning the user's eyes with the main body 106.
Referring to FIG. 8, the remote system 110 can be configured to remotely perform (on an immediate, delayed, and/or batch basis) a risk assessment and/or diagnosis and transmit through a network or communications medium the risk assessment, diagnosis, and/or report to the computer system 104 for output to the user using output device 102. In certain embodiments, the output device 102 is configured to display the risk assessment, diagnosis, and/or report as a webpage that can be printed, emailed, transmitted, and/or saved by the computer system 104. The remote system 110 can also be configured to transmit through a network or communications medium the risk assessment, diagnosis, and/or report to the user's (or doctor's) cellular phone, computer, email account, fax, or the like.
With reference to FIG. 9, there is shown an illustrated method of using the optical coherence tomography system 100 to self-administer an OCT scan of the user's eyes and obtain a risk assessment or diagnosis of various diseases and ailments. The process begins at block 901 wherein the user approaches the optical coherence tomography system 100 and activates the system, for example, by pushing a button or typing in an activation code or anonymous identification number. In other embodiments, the user interface module 805 instructs users at block 901 to first insert an identification card or anonymous coded screening card in the user card reader system 112 to activate the system. The system can also be activated at block 901 when users insert their user identification card in the user card reader system 112. Other means of activating the system are possible as well, including without limitation a motion sensor, a weight sensor, a radio frequency identification (RFID) device, or other actuator to detect the presence of the user. Alternatively, the optical tomography system 100 can be activated when the billing module 810 detects that the user has inserted paper money, coins, tokens, or the like into an interface device configured to receive such payment. Alternatively, the billing module 810 can also be configured to activate the optical tomography system 100 when the billing module 810 communicates with a user's mobile device in order to obtain the user's credit card information, billing address, or the like, or to charge the user's mobile network service account (for example, the cellular carrier network).
In referring to FIG. 9 at block 902, the user interface module 805 is configured to direct the user to attach disposable eyecups onto the main body 106, and then position the main body 106 with the disposable eyecups near the eyes of the user and/or support the disposable eyecups against the user's eye socket. The user interface module 805 instructs the user to engage the handle 118 to adjust the distance between the left and right eyepieces 712, 714 to match or substantially conform to the interpupillary distance of the user as described with respect to FIGS. 7A-7F. After the main body 106 and the interpupillary distance have been appropriately calibrated and/or adjusted by the user, the user inputs into or indicates to the user interface module 805 to begin the scan. The scan control and analysis module 824 substantially restricts movement or locks the position of the zero gravity arm and/or the distance between the left and right eyepieces 712, 714 to begin the scan.
Referring to FIG. 9, the Z positioning module 818 automatically adjusts the z-offset in the main body 106 at block 906 such that the OCT measurement will be obtained, for example, from tissue in the retina. The Z positioning module 818 may identify and/or estimate a position of part of the sample (for example, part of an eye of a user 114) and adjust the location of one or more optical components based on the position. One of ordinary skill in the art will appreciate the multitude of ways to perform such an adjustment. For example, the Z positioning module 818 may comprise a motor, such as a piezoelectric motor, to translate the reference mirror/s longitudinally such that the optical path length from the beam splitter to the retina is about equal to (within a coherence length of) the optical path length in the reference arm. This movement may enable light from the reference arm to interfere with light reflected by a desired portion of the sample (for example, the retina). At block 908, the illustrative method performs a focus adjustment using the focus adjustment module 820. Those of ordinary skill in the art will also appreciate the different techniques for performing such auto-focus calibration. Block 910 illustrates an optional test performed by the computer system 104 to determine the visual function and acuity of the user's eye. Such visual function and acuity tests will be appreciated by those skilled in the art. In one embodiment, the visual acuity test works with or is combined with the fixation marker control system 822, and can test both eyes simultaneously or one eye at a time. For example, the fixation marker will initially appear small and then gradually increase in size until the user indicates through the user interface module 805 that the fixation marker is visible. Based on the size at which the user can clearly see the fixation marker, the fixation marker control system 822 can estimate or determine or assess the visual acuity of the user's eyes (for example, 20/20, 20/40, or the like).
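A minimal sketch of the path-length matching idea is given below. It uses a hypothetical helper function and assumed millimeter units; it is not the Z positioning module 818 itself, and the factor of one half simply reflects that translating the reference mirror by a distance changes the round-trip reference path by twice that distance.

```python
# Illustrative sketch only (hypothetical function, assumed units of millimeters).
def mirror_translation_mm(sample_path_mm, reference_path_mm, coherence_length_mm=0.01):
    """Return how far to translate the reference mirror so that the reference-arm
    path matches the beam-splitter-to-retina path to within one coherence length."""
    mismatch = sample_path_mm - reference_path_mm
    if abs(mismatch) <= coherence_length_mm:
        return 0.0                       # arms already matched closely enough
    # Moving the mirror by d changes the reference optical path by 2*d
    # (the light travels to the mirror and back), hence the division by two.
    return mismatch / 2.0

print(mirror_translation_mm(sample_path_mm=142.37, reference_path_mm=141.95))  # 0.21
```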
With reference to FIG. 9 at block 912, the user interface module 805 instructs the user to follow the movement of the fixation marker that is visible to the user from the main body 106. In one embodiment, the fixation marker control 822 is configured to display a fixation marker that moves horizontally. In some embodiments, the horizontal movement of the fixation marker allows the scan control and analysis module 824 to scan the eye vertically as the eye moves horizontally, thus possibly obtaining a two-dimensional, volume, or raster scan of the eye tissue at issue. Alternatively, the scan control and analysis module 824 and/or the fixation marker control may cause the fixation marker or the beam to jump or move around to obtain measurements at different lateral locations on the eye.
During the scanning of the eye, the scan control and analysis module 824 could be configured to detect at block 913 whether there has been a shift in the position of the main body 106 relative to the user. In one embodiment, the scan control and analysis module 824 can detect (in real-time, substantially real-time, or with a delay) whether a shift has occurred based on the values the module 824 expects to receive during the scanning process. For example, as the scan control and analysis module 824 scans the retina, the module 824 expects to detect a change in signal as the scanning process approaches the optic nerve (for example, based on the location of the fixation target and/or the state of the scanner(s)). Alternatively, the expected values or the expected change in values can also be determined or generated using a nomogram. If the system does not detect an expected signal change consistent with a detection of the optic nerve and/or receives no signal change, then the module 824 can be configured to interpret such data as indicating that the user is not tracking properly. Other features, for example, the fovea, or the like, can be used to determine whether the expected signal is observed. If improper tracking occurs enough (based on, for example, a threshold), the system 100 may request that the user fixate again (using the fixation marker control 822) for another scan. If the foregoing shift detection process does not occur in real-time or substantially real-time, then the system can be configured to complete the scan, perform data analysis, and during the analysis the system can be configured to detect whether a shift occurred during the scan. If a substantial shift is detected, then the user may be instructed (through visual, audible, or verbal instructions using the user interface module 805) to sit forward again so another scan can be performed. If the system detects a shift 2 or 3 or more times, the system can be configured to refer the user to a general eye doctor.
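One way to picture this check is sketched below; the window width, drop fraction, and retry threshold are illustrative assumptions, not values taken from the disclosure.

```python
import numpy as np

# Illustrative sketch only: compare the summed A-scan signal against the location
# where the fixation-target position and scanner state predict the optic nerve.
def tracking_ok(summed_a_scans, expected_dip_index, window=20, drop_fraction=0.3):
    """Return True if a signal drop consistent with the optic nerve appears
    within +/- `window` A-scans of the predicted location."""
    signal = np.asarray(summed_a_scans, dtype=float)
    baseline = np.median(signal)
    lo = max(0, expected_dip_index - window)
    hi = min(len(signal), expected_dip_index + window)
    return bool(signal[lo:hi].min() < (1.0 - drop_fraction) * baseline)

def needs_rescan(mismatch_count, threshold=3):
    """Ask the user to fixate again once enough tracking failures accumulate."""
    return mismatch_count >= threshold

profile = np.ones(400)
profile[200:210] = 0.4                               # synthetic dip at the expected spot
print(tracking_ok(profile, expected_dip_index=205))  # True  (expected feature found)
print(tracking_ok(profile, expected_dip_index=50))   # False (no feature where predicted)
```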
At the end of a scan, the scan control and analysis module 824 can be configured to produce a confidence value that indicates how likely the nomograms will be to apply to this patient. For example, if the patient had borderline fixation, the confidence value might be lower than for a patient whose fixation appeared to be good.
In the real-time embodiment, the system can be configured to perform rapid cross-correlations between adjacent A-scans or B-scans to make sure the eye is moving somewhat. In some embodiments, the foregoing can be advantageous for ANSI laser safety standards so as to avoid having users stare at the same location with laser energy bombarding the user's retina. Accordingly, in some embodiments, the system is configured with a laser time-out feature if the system detects no eye movement (for example, cross-correlations above a certain threshold). In some embodiments, to expedite this process and provide real-time analysis in frequency domain OCT, signal data may be analyzed prior to performing an FFT. Other technologies can be used to determine that the user has some eye movement.
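A rough sketch of such a cross-correlation check is shown below; the similarity threshold and the length of the permitted static run are assumptions for illustration only.

```python
import numpy as np

# Illustrative sketch only: normalized correlation between adjacent A-scans; a
# long run of nearly identical A-scans suggests a stationary eye and triggers
# a laser time-out.
def normalized_correlation(a, b):
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float(np.dot(a, b) / len(a))

def laser_timeout(a_scans, similarity_threshold=0.98, max_static_scans=200):
    """Return True if more than `max_static_scans` consecutive A-scan pairs
    are nearly identical, i.e. no eye movement is detected."""
    static_run = 0
    for prev, curr in zip(a_scans, a_scans[1:]):
        if normalized_correlation(prev, curr) > similarity_threshold:
            static_run += 1
            if static_run > max_static_scans:
                return True
        else:
            static_run = 0
    return False

scans = [np.random.rand(512) for _ in range(300)]   # uncorrelated synthetic scans
print(laser_timeout(scans))                         # False
```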
If no fixation problem has been detected, the scan control and analysis module 824 completes the scan of the user's eyes, stores the image and/or scan data in the images/scans database 826, and analyzes the A-scan data at block 915 to generate/determine a risk assessment and/or diagnosis at block 916 by accessing the data and/or algorithms stored in the disease risk assessment/diagnosis database 808. In some embodiments, groups of A-scans, partial or full B-scans, or partial or full C-scan data can be analyzed.
As used herein, the term “nomogram” generally refers to predictive tools, algorithms, and/or data sets. Nomograms in general can provide predictions for a user based on the comparison of characteristics of the user with the nomogram. The nomograms are derived, generated, calculated, or computed from a number of users/patients, for example, hundreds, thousands, or millions, who exhibited the same condition (normal or diseased). In some embodiments described herein, nomograms compare the risk of having a disease based on physical characteristics. Accordingly, in some cases, nomograms can provide individualized predictions that are relative to risk groupings of patient populations who share similar disease characteristics. In some embodiments, nomograms can be used to provide the risk estimation or risk assessment on a 0-100% scale. Alternatively, nomograms used herein can provide an expected value, for example, at a certain position in the eye there is an expected eye thickness value of 100 microns.
Generally, nomograms have been developed and validated in large patient populations and are highly generalizable, and therefore, nomograms can provide the objective, evidence-based, individualized risk estimation or assessment. Accordingly, nomograms can be used as described herein to empower patients and allow them to better understand their disease. Further, nomograms as used herein can assist physicians with clinical decision-making and to provide consistent, standardized and reliable predictions.
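For illustration, a nomogram can be thought of as a lookup from a measured quantity to a risk estimate; the thickness grid and risk values below are invented numbers, not data from any patient population.

```python
import numpy as np

# Illustrative sketch only: a toy nomogram mapping retinal thickness (microns)
# to a risk on a 0-100% scale, with linear interpolation between grid points.
THICKNESS_UM = np.array([150, 200, 250, 300, 350, 400])   # assumed grid
RISK_PERCENT = np.array([ 60,  20,   5,  10,  45,  85])   # assumed risks

def nomogram_risk(measured_thickness_um):
    """Interpolate a measured thickness against the nomogram curve."""
    return float(np.interp(measured_thickness_um, THICKNESS_UM, RISK_PERCENT))

print(nomogram_risk(320))   # 24.0 with the invented table above
```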
In the illustrative method shown in FIG. 9 at block 917, an eye health assessment or eye health grade report, as illustrated in FIGS. 10A and 10B, is generated for the user by accessing the disease risk assessment/diagnosis database 808. At block 918, the physician referral database 804 is accessed to generate a recommendation of when the user should visit a physician (for example, within one to two weeks). The physician referral database 804 is also accessed to generate or compile a listing of physicians suitable for treating the patient. The physician referral list can be randomly generated, or selected based on referral fee payments paid by physicians or insurance companies, or based on the location of the physician relative to the user's present location or office/home address, or based on the type of detected disease, or based on the severity of the detected disease, or based on the location or proximity of the system relative to the location of the physician, or based on a combination thereof. At block 919, the report is displayed to the user by using the reporting/output module 806 and output device 102. In certain embodiments, the report data is stored in the user/patient database 802 for future analysis or comparative analysis with future scans.
In some embodiments, the main body 106 is not supported by the user 114. For example, the main body 106 may be supported by a free-standing structure, as shown in FIG. 10A. The user 114 may look into the eyepiece(s). The user 114 may be seated on a seating apparatus, which may include a height-adjusting mechanism. The main body 106 may be supported by a height-adjustable support.
In some embodiments, such as those shown in FIGS. 10B-10C, a strap 1005 is connected to the main body 106. The strap may function to fully or partly support the main body 106, as shown in FIG. 10B. The strap 1005 may be excluded in some embodiments. The main body 106 may be hand held by the user. In some embodiments, the main body 106 may be supported on eyewear frames. In some embodiments, all of the optics are contained within the main body 106 that is directly or indirectly supported by the user 114. For example, the main body 106 in FIG. 10B may include an optical coherence tomography system, an alignment system, and a data acquisition device. The data acquisition device may wirelessly transmit data to a network or computer system or may use a cable to transfer control signals. The embodiment of FIG. 10C is similar to that of FIG. 1 and is supported by a separate support structure (for example, a zero gravity arm). In some embodiments, a strap, belt, or other fastener assists in the alignment of the main body 106 with one or both eyes of the user 114.
In some embodiments, as shown in FIG. 10D, the user wears an object 1010 connected to the eyepiece. The wearable object 1010 may include a head-mounted object, a hat, or an object to be positioned on a user's head. As described above, in some embodiments, the main body 106 is supported on an eyewear frame worn by the user like glasses. The wearable object 1010 may fully or partly support the main body 106 and/or may assist in aligning the main body 106 with one or both eyes of the user 114.
Referring to FIGS. 11A and 11B, there are illustrated two example embodiments of the eye health grades and the eye health assessment reports. With reference to FIG. 11A, the eye health grades report can comprise without limitation a numeric and/or letter grade for each eye of the user for various eye health categories, including but not limited to macular health, optic nerve health, eye clarity, or the like. The eye health grades report can also comprise at least one recommendation to see or consult a physician within a certain period of time, and can provide at least one possible physician to contact. Data for generating the recommendation information and the list of referral physicians are stored in the physician referral database 804. In reference to FIG. 11B, the eye health assessment report can comprise a graphical representation for each eye of the user for various eye health categories. The report can be presented to the user on an electronic display, printed on paper, printed onto a card that the user inserted into the machine, electronically stored on the user's identification card, emailed to the user, or a combination thereof.
With reference to FIG. 12, there is illustrated another embodiment of the computer system 104 connected to the remote system 110 and billing/insurance reporting and payment systems 1201. The billing module 810 can be configured to communicate with the billing/insurance reporting and payment systems 1201 through the communications medium 108 in order to request or process an insurance claim for conducting a scan of the user's eyes. Based on communications with the billing/insurance reporting and payment system 1201, the billing module 810 can also be configured to determine the amount payable or covered by the user's insurance company and/or calculate or determine the co-pay amount to be charged to the consumer. In certain embodiments, the user can interact with the user interface module 805 to schedule an appointment with one of the recommended physicians and/or schedule a reminder to be sent to the user to consult with a physician. The computer system 104 or a remote system 110 can be configured to send the user the reminder via email, text message, regular mail, automated telephone message, or the like.
Computing System
In some embodiments, the systems, computer clients and/or servers described above take the form of a computing system 1300 shown in FIG. 13, which is a block diagram of one embodiment of a computing system (which can be a fixed system or mobile device) that is in communication with one or more computing systems 1317 and/or one or more data sources 1319 via one or more networks 1310. The computing system 1300 may be used to implement one or more of the systems and methods described herein. In addition, in one embodiment, the computing system 1300 may be configured to process image files. While FIG. 13 illustrates one embodiment of a computing system 1300, it is recognized that the functionality provided for in the components and modules of computing system 1300 may be combined into fewer components and modules or further separated into additional components and modules.
Client/Server Module
In one embodiment, the system 1300 comprises an image processing and analysis module 1306 that carries out the functions, methods, and/or processes described herein. The image processing and analysis module 1306 may be executed on the computing system 1300 by a central processing unit 1304 discussed further below.
Computing System Components
In one embodiment, the processes, systems, and methods illustrated above may be embodied in part or in whole in software that is running on a computing device. The functionality provided for in the components and modules of the computing device may comprise one or more components and/or modules. For example, the computing device may comprise multiple central processing units (CPUs) and a mass storage device, such as may be implemented in an array of servers.
In general, the word “module,” as used herein, refers to logic embodied in hardware or firmware, or to a collection of software instructions, possibly having entry and exit points, written in a programming language, such as, for example, Java, C or C++, or the like. A software module may be compiled and linked into an executable program, installed in a dynamic link library, or may be written in an interpreted programming language such as, for example, BASIC, Perl, Lua, or Python. It will be appreciated that software modules may be callable from other modules or from themselves, and/or may be invoked in response to detected events or interrupts. Software instructions may be embedded in firmware, such as an EPROM. It will be further appreciated that hardware modules may be comprised of connected logic units, such as gates and flip-flops, and/or may be comprised of programmable units, such as programmable gate arrays or processors. The modules described herein are preferably implemented as software modules, but may be represented in hardware or firmware. Generally, the modules described herein refer to logical modules that may be combined with other modules or divided into sub-modules despite their physical organization or storage.
In one embodiment, thecomputing system1300 also comprises a mainframe computer suitable for controlling and/or communicating with large databases, performing high volume transaction processing, and generating reports from large databases. Thecomputing system1300 also comprises a central processing unit (“CPU”)1304, which may comprise a conventional microprocessor. Thecomputing system1300 further comprises amemory1305, such as random access memory (“RAM”) for temporary storage of information and/or a read only memory (“ROM”) for permanent storage of information, and amass storage device1301, such as a hard drive, diskette, or optical media storage device. Typically, the modules of thecomputing system1300 are connected to the computer using a standards based bus system. In different embodiments, the standards based bus system could be Peripheral Component Interconnect (PCI), Microchannel, SCSI, Industrial Standard Architecture (ISA) and Extended ISA (EISA) architectures, for example.
Theexample computing system1300 comprises one or more commonly available input/output (I/O) devices andinterfaces1303, such as a keyboard, mouse, touchpad, and printer. In one embodiment, the I/O devices andinterfaces1303 comprise one or more display devices, such as a monitor, that allows the visual presentation of data to a user. More particularly, a display device provides for the presentation of GUIs, application software data, and multimedia presentations, for example. In the embodiment ofFIG. 13, the I/O devices andinterfaces1303 also provide a communications interface to various external devices. Thecomputing system1300 may also comprise one ormore multimedia devices1302, such as speakers, video cards, graphics accelerators, and microphones, for example.
Computing System Device/Operating System
The computing system 1300 may run on a variety of computing devices, such as, for example, a server, a Windows server, a Structured Query Language server, a Unix server, a personal computer, a mainframe computer, a laptop computer, a cell phone, a personal digital assistant, a kiosk, an audio player, and so forth. The computing system 1300 is generally controlled and coordinated by operating system software, such as z/OS, Windows 95, Windows 98, Windows NT, Windows 2000, Windows XP, Windows Vista, Linux, BSD, SunOS, Solaris, or other compatible operating systems. In Macintosh systems, the operating system may be any available operating system, such as MAC OS X. In other embodiments, the computing system 1300 may be controlled by a proprietary operating system. Conventional operating systems control and schedule computer processes for execution, perform memory management, provide file system, networking, and I/O services, and provide a user interface, such as a graphical user interface (“GUI”), among other things.
Network
In the embodiment of FIG. 13, the computing system 1300 is coupled to a network 1310, such as a modem system using POTS/PSTN (plain old telephone service/public switched telephone network), ISDN, FDDI, LAN, WAN, or the Internet, for example, via a wired, wireless, or combination of wired and wireless, communication link 1315. The network 1310 communicates (for example, constantly, intermittently, periodically) with various computing devices and/or other electronic devices via wired or wireless communication links. In the example embodiment of FIG. 13, the network 1310 is communicating with one or more computing systems 1317 and/or one or more data sources 1319.
Access to the image processing andanalysis module1306 of thecomputer system1300 byremote computing systems1317 and/or bydata sources1319 may be through a web-enabled user access point such as the computing systems'1317 or data source's1319 personal computer, cellular phone, laptop, or other device capable of connecting to thenetwork1310. Such a device may have a browser module implemented as a module that uses text, graphics, audio, video, and other media to present data and to allow interaction with data via thenetwork1310.
The browser module or other output module may be implemented as a combination of an all points addressable display such as a cathode-ray tube (CRT), a liquid crystal display (LCD), a plasma display, or other types and/or combinations of displays. In addition, the browser module or other output module may be implemented to communicate withinput devices1303 and may also comprise software with the appropriate interfaces which allow a user to access data through the use of stylized screen elements such as, for example, menus, windows, dialog boxes, toolbars, and controls (for example, radio buttons, check boxes, sliding scales, and so forth). Furthermore, the browser module or other output module may communicate with a set of input and output devices to receive signals from the user.
The input device(s) may comprise a keyboard, roller ball, pen and stylus, mouse, trackball, voice recognition system, or pre-designated switches or buttons. The output device(s) may comprise a speaker, a display screen, a printer, or a voice synthesizer. In addition, a touch screen may act as a hybrid input/output device. In another embodiment, a user may interact with the system more directly, such as through a system terminal connected to the score generator without communications over the Internet, a WAN, or LAN, or similar network.
In some embodiments, thesystem1300 may comprise a physical or logical connection established between a remote microprocessor and a mainframe host computer for the express purpose of uploading, downloading, or viewing interactive data and databases on-line in real time. The remote microprocessor may be operated by an entity operating thecomputer system1300, including the client server systems or the main server system, and/or may be operated by one or more of thedata sources1319 and/or one or more of the computing systems. In some embodiments, terminal emulation software may be used on the microprocessor for participating in the micro-mainframe link.
In some embodiments,computing systems1317 that are internal to an entity operating thecomputer system1300 may access the image processing andanalysis module1306 internally as an application or process run by theCPU1304.
User Access Point
In one embodiment, a user access point comprises a personal computer, a laptop computer, a cellular phone, a GPS system, a Blackberry® device, a portable computing device, a server, a computer workstation, a local area network of individual computers, an interactive kiosk, a personal digital assistant, an interactive wireless communications device, a handheld computer, an embedded computing device, or the like.
Other Systems
In addition to the systems that are illustrated in FIG. 13, the network 1310 may communicate with other data sources or other computing devices. The computing system 1300 may also comprise one or more internal and/or external data sources. In some embodiments, one or more of the data repositories and the data sources may be implemented using a relational database, such as DB2, Sybase, Oracle, CodeBase and Microsoft® SQL Server, as well as other types of databases such as, for example, a flat file database, an entity-relationship database, an object-oriented database, and/or a record-based database.
With reference to FIG. 14A, there is illustrated an example method for determining or generating a risk assessment of a disease, such as an eye disease, thereby allowing the generation of a health grade and recommended time to see a physician. The example shown in FIG. 14A is for retinal disease; however, the process and method illustrated can be used for other diseases or eye diseases. In this example, the scan control and analysis module 824 is configured to determine the thickness of the retina based on the A-scan data derived from the main body 106. This data may include but is not limited to A-scan data from different A-scans. The scan control and analysis module 824 can also be configured to access data and algorithms in the disease risk assessment/diagnosis database 808 to calculate the risk assessment of retinal disease based on the measured thickness of the retina as illustrated by the function curve in FIG. 14A. The reporting/output module 806 can be configured to normalize the calculated risk assessment value into an eye health letter or numerical grade or score. The reporting/output module 806 can also be configured to access data and algorithms in the physician referral database 804 to calculate a recommended time to see a physician based on the calculated risk assessment value.
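The overall flow, from risk value to normalized grade to referral window, can be sketched as follows; the risk curve, grade cut-offs, and time frames are illustrative assumptions and do not reproduce the stored algorithms or the function curve of FIG. 14A.

```python
# Illustrative sketch only: map thickness to a toy risk value, normalize it to a
# health score/letter grade, and choose a recommended time frame to see a physician.
def retinal_risk(thickness_um, normal_um=250.0, scale_um=40.0):
    """Toy risk curve: risk grows as thickness departs from an assumed normal value."""
    deviation = abs(thickness_um - normal_um) / scale_um
    return deviation ** 2 / (1.0 + deviation ** 2)      # 0 (normal) .. ~1 (far from normal)

def health_grade(risk):
    score = round(100 * (1.0 - risk))
    letter = "A" if score >= 90 else "B" if score >= 80 else "C" if score >= 70 else "D" if score >= 60 else "F"
    return score, letter

def referral_window(risk):
    if risk >= 0.75:
        return "within 1-2 weeks"
    if risk >= 0.40:
        return "within 1-2 months"
    return "routine annual examination"

risk = retinal_risk(330)                           # 0.8 for this toy curve
print(health_grade(risk), referral_window(risk))   # (20, 'F') within 1-2 weeks
```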
With reference to FIG. 14B, there is illustrated another example method or process for determining or generating a risk assessment of disease by comparing the scan data to the disease risk assessment/diagnosis database 808 comprising, for example, minimum and maximum thickness data and algorithms, which can be based on or be in the form of nomograms. In certain embodiments, the system is configured to generate scan data for portions of the eye scanned to determine the thickness of the retina at any one point, and compare such data to histograms and/or nomograms (for example, nomograms that show the expected thickness at said location or the likelihood of disease for a given thickness) to derive a risk assessment. The system can also be configured to generate an average thickness for the entire retina that is scanned, and compare such data to histograms and/or nomograms to derive a risk assessment.
The term “histogram” as used herein generally refers to an algorithm, curve, or data or other representation of a frequency distribution for a particular variable, for example, retinal thickness. In some cases, the variable is divided into ranges, interval classes, and/or points on a graph (along the X-axis) for which the frequency of occurrence is represented by a rectangular column or location of points; the height of the column and/or point along the Y-axis is proportional to or otherwise indicative of the frequency of observations within the range or interval. “Histograms,” as referred to herein, can comprise measured data obtained, for example, from scanning the eyes of a user, or can comprise data obtained from a population of people. Histograms of the former case can be analyzed to determine the mean, minimum, or maximum values, and analyze changes in slope or detect shapes or curvatures of the histogram curve. Histograms of the latter case can be used to determine the frequency of observation of a measured value in a surveyed sample.
In the instance where an average thickness value is derived from the scan data, there are some conditions/diseases that may be indicated by thickening of the retina in a localized area. Accordingly, such a condition may not significantly affect the average thickness value (for example, if a substantial portion of the retina is of normal thickness). Therefore, the maximum thickness value may be needed to detect this abnormal thickening in the retina. In some embodiments, this maximum thickness value may be due to a segmentation error. Accordingly, a more stable way of determining the maximum value may also be to use the value corresponding to 95% (or any value between 75% and 99%) maximal thickness. The foregoing can also be applied to minimum retinal thickness or any other value, measurement, and/or detectable condition in the eye. For example, with minimum retinal thickness, if the user has a macular hole, there will only be a small area of zero thickness, and possibly not enough to significantly reduce the average thickness, but definitely an abnormality that may be detected.
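A short sketch of this percentile-based alternative to the raw maximum (and minimum) is given below; the percentile choices and synthetic data are assumptions.

```python
import numpy as np

# Illustrative sketch only: a high/low percentile of the per-point thickness map
# is less sensitive to a single segmentation error than the absolute max/min.
def robust_extremes(thickness_map_um, upper_pct=95, lower_pct=5):
    values = np.asarray(thickness_map_um, dtype=float).ravel()
    return np.percentile(values, lower_pct), np.percentile(values, upper_pct)

rng = np.random.default_rng(0)
thickness_map = 250 + 10 * rng.standard_normal((128, 128))  # synthetic map (microns)
low, high = robust_extremes(thickness_map)
print(round(float(low), 1), round(float(high), 1))          # roughly 233.5 and 266.5
```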
In other embodiments, the system may be configured to create histograms of measured thickness and/or measured intensity values and/or slopes or derivatives of intensity values and/or variables to identify abnormalities. For example, changes or substantial changes in slope (calculated as the derivative of adjacent intensity values) may indicate hyporeflective or hyperreflective structures that may not affect mean or average intensity values, but may be indicative of disease or conditions. For example, the system can determine if the distribution of retinal thicknesses across the measured portion of the retina matches that of the normal population. Deviation from such a “normal” histogram would result in lower health grades/higher risk assessments.
In various embodiments, the methods or processes described herein can be used to determine or generate a risk assessment of maculopathy based, for example, on abnormal thickening of the retina or fovea, the presence of hyperreflective (bright or high intensity) or hyporeflective (dark or low intensity) structures in the outer half of the retina, the presence of hyporeflective (dark) structures in the inner half of the retina, the presence of irregularities in the contour of the retinal pigment epithelium that depart from the normal curvature of the eye, or the presence of hypertransmission of light through the retinal pigment epithelium when compared to a database of normal values stored in the disease risk assessment/diagnosis database 808.
As described above, there are several ways to detect or generate a risk assessment for several diseases or conditions. In certain embodiments, scan data is compared to data found in normal people to identify similarities or differences from a nomogram and/or histogram. In other embodiments, scan data is compared to data found in people with diseases to identify similarities or differences from nomograms and/or histograms. The pathognomonic disease features could be indicated by similarity to nomograms, for example, images, histograms, or other data, etc. from diseased patients.
In one embodiment, “normal” data (for example, histograms) are created for retinal thickness in each region of the retina (optic nerve, fovea, temporal retina) and compare to measured, detected, scanned, or encountered values to these “normal” data (for example, histograms) to determine relative risks of retinal disease or other diseases. The same can be performed for nerve fiber layer (NFL) thickness to detect glaucoma. In other embodiments, the detection or generation of a risk assessment for glaucoma is performed or generated by analyzing collinear A-scan data to see if curvilinear thinning indicates the presence of glaucoma because glaucoma tends to thin the NFL in curvilinear bundles. The NFL radiates out from the optic nerve in a curvilinear fashion like iron filings around a magnet. Measuring and analyzing a sequence of A-scan data that follow such a curvilinear path may be useful to identify such thinning that is characteristic of glaucoma. The analysis could be centered on and/or around the optic nerve or centered on and/or around the fovea or elsewhere. In another embodiment, the detection and/or generation of a risk assessment for glaucoma is performed or generated by analyzing the inner surface of the optic nerve to determine the optic disc cup volume.
The system can also be configured to detect and/or generate a risk assessment for optical clarity, wherein the system integrates A-scan data in the Z direction and compares some or all of the A-scan data to a nomogram value or values, or, for example, a histogram. In general, darker A-scans will probably indicate the presence of media opacities, for example, cataracts, that decrease optical clarity (and therefore increase the subject's risk of having an optical clarity problem, for example, cataracts).
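A simple way to express this integration is sketched below; the expected integral and the dim-scan threshold are placeholders rather than nomogram values from the disclosure.

```python
import numpy as np

# Illustrative sketch only: integrate each A-scan over depth (Z) and report the
# fraction of A-scans that are much dimmer than expected, as a crude proxy for
# media opacity (for example, cataract).
def clarity_risk(a_scans, expected_integral, dim_threshold=0.5):
    integrals = np.asarray(a_scans, dtype=float).sum(axis=1)   # sum over depth
    return float(np.mean(integrals < dim_threshold * expected_integral))

scans = np.random.rand(500, 1024)                     # synthetic detector data
print(clarity_risk(scans, expected_integral=512.0))   # ~0.0 for this bright synthetic data
```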
The system can also be configured to detect or generate risk assessments for retinal pigment epithelium (RPE) features that depart from the normal curvature of the eye (drusen, retinal pigment epithelial detachments). Such RPE features can be detected by fitting the detected RPE layer to a polynomial curve that mimics the expected curvature for the eye, and using a computer algorithm to analyze, compare, or examine the difference between these curves. For example, with respect to FIG. 15, the system can be configured to subtract the polynomial curve that mimics the expected curvature of the RPE layer 1502 from the detected RPE layer curve 1504, and analyze and/or compare the resulting difference/value 1506 with the values (for example, in a histogram or nomogram) from normal and/or diseased eyes to generate a diagnosis or risk assessment. The foregoing method and process is similar to a measure of tortuosity in that a bumpy RPE detection will generally have more deviations from a polynomial curve than smooth RPE detections, which are common in young, healthy people.
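The curve-fit-and-subtract step can be sketched as follows; the polynomial degree, the summary statistic, and the synthetic druse-like bump are assumptions for illustration.

```python
import numpy as np

# Illustrative sketch only: fit the detected RPE depth profile with a low-order
# polynomial standing in for the eye's expected curvature, subtract the fit, and
# summarize the residual "bumpiness" of the detection.
def rpe_deviation(rpe_depth_px, degree=2):
    x = np.arange(len(rpe_depth_px), dtype=float)
    y = np.asarray(rpe_depth_px, dtype=float)
    coeffs = np.polyfit(x, y, degree)        # smooth curve mimicking expected curvature
    residual = y - np.polyval(coeffs, x)     # detected RPE minus expected curve
    return float(np.mean(np.abs(residual))), residual

x = np.linspace(-1.0, 1.0, 512)
rpe = 300 + 40 * x ** 2                      # smooth synthetic RPE profile
rpe[250:270] -= 12                           # localized druse-like elevation
score, _ = rpe_deviation(rpe)
print(round(score, 2))                       # larger than for a perfectly smooth profile
```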
Such RPE detection can also be used to detect increased transmission through the RPE which is essentially synonymous with RPE degeneration or atrophy. In certain embodiments, the system is configured to analyze the tissue layer beyond or beneath the RPE layer. Using imaging segmentation techniques, the RPE layer can be segmented. In certain embodiments, the system is configured to add up all of the intensity values beneath the RPE detection. When atrophy is present, there are generally many high values beneath the RPE line, which makes the integral value high and would increase the patient's risk of having a serious macular condition, such as geographic atrophy.
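One possible form of this sub-RPE integration is sketched below; the flat segmentation line and synthetic data are assumptions, and a real segmentation would supply one RPE depth per A-scan.

```python
import numpy as np

# Illustrative sketch only: sum the intensity values beneath the segmented RPE
# line for each A-scan; unusually bright sub-RPE columns suggest hypertransmission
# consistent with RPE atrophy.
def sub_rpe_integrals(b_scan, rpe_rows):
    """b_scan: 2-D array (lateral x depth); rpe_rows: RPE depth index per A-scan."""
    totals = []
    for a_scan, rpe_row in zip(b_scan, rpe_rows):
        totals.append(float(a_scan[int(rpe_row):].sum()))   # everything deeper than the RPE
    return np.array(totals)

b_scan = np.random.rand(256, 512)            # synthetic B-scan
rpe_rows = np.full(256, 300)                 # assumed flat RPE detection
print(sub_rpe_integrals(b_scan, rpe_rows).mean())
```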
With reference to FIG. 16, the system can also be used to detect or generate risk factors for abnormal intensities within the retina. In certain embodiments, the system is configured to divide the retina into an inner 1602 and outer 1604 half based on the midpoint between the internal limiting membrane (ILM) detection 1606 and the RPE detection lines 1608. In some instances, a blur filter (for example, a Gaussian blur, radial blur, or the like) is applied to the retinal tissue to remove speckle noise and/or other noise. For each of the inner and outer retinal regions, a first derivative of the intensity values (with respect to position, for example, d/dx, d/dy, or the like) can be calculated to determine the slope of the curve to differentiate the areas where there are large changes from dark to bright or vice versa across lateral dimensions of the tissue. For example, intensities or derivatives within the retina can be compared to, for example, normal histograms, wherein inner retinal hypointensity can be an indicator of cystoid macular edema; or wherein outer retinal hypointensity can be an indicator of cystoid macular edema, subretinal fluid, or diffuse macular edema; or wherein outer retinal hyperintensity can be an indication of diabetes (which may be the cause of diabetic retinopathy, or damage to the retina due to, for example, complications of diabetes mellitus), or age-related macular degeneration.
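The blur, split, and differentiate sequence described above can be sketched as follows; the Gaussian sigma, the flat ILM/RPE rows, and the use of SciPy are assumptions made to keep the example short.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Illustrative sketch only: blur to suppress speckle, split the retina into inner
# and outer halves at the ILM/RPE midpoint, then take lateral first derivatives
# of intensity to highlight abrupt dark/bright transitions.
def inner_outer_gradients(b_scan, ilm_row, rpe_row, sigma=2.0):
    """ILM and RPE are treated as flat rows here; a real segmentation would
    supply one depth index per A-scan."""
    smoothed = gaussian_filter(np.asarray(b_scan, dtype=float), sigma=sigma)
    mid = (ilm_row + rpe_row) // 2
    inner = smoothed[:, ilm_row:mid]         # inner half of the retina
    outer = smoothed[:, mid:rpe_row]         # outer half of the retina
    # d/dx across the lateral dimension; large magnitudes mark hypo- or
    # hyper-reflective structure boundaries.
    return np.gradient(inner, axis=0), np.gradient(outer, axis=0)

b_scan = np.random.rand(256, 512)            # synthetic B-scan (lateral x depth)
inner_dx, outer_dx = inner_outer_gradients(b_scan, ilm_row=100, rpe_row=300)
print(inner_dx.shape, outer_dx.shape)        # (256, 100) (256, 100)
```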
Data from normal patients can be used to compile histograms of intensity and/or slope (derivative) data to indicate expected values for normal people. Data from people with various diseases can also be placed into histograms of intensity and/or derivative (slope) values to indicate expected values for those people with diseases. In certain embodiments, a relative risk will then be developed for each entry on the histogram such that this risk can be applied to unknown cases. For example, in some instances, people with 10% of their outer retinal intensity values equal to 0 have an 85% chance of having a retinal problem. Accordingly, such users may receive a health grade of 15. In another example, people with any inner retinal points less than 10 have a 100% chance of disease, and therefore such users may receive a health grade of 5.
Alternatively, as discussed herein, the foregoing method or process can also be used to determine or generate a risk assessment of glaucoma based on patterns of thinning of the macular and/or peripapillary nerve fiber layer or enlarged cupping of the optic nerve head as compared to a database of normal and abnormal values stored in the disease risk assessment/diagnosis database 808. Similarly, to detect or develop a risk assessment for uveitis, a histogram of expected intensity values above the inner retinal surface (in the vitreous), for example, can be used. The presence of large, bright specks (for example, high intensity areas) in the vitreous cavity would indicate possible uveitis and would likely indicate a need for referral. The foregoing method and process can also be used to determine or generate a risk of eye disease based on the intensity levels of the image signal as compared to a database of normal and abnormal values stored in the disease risk assessment/diagnosis database 808.
In other embodiments, the foregoing method and process can also be used to determine or generate a risk assessment of uveitis based on hyperreflective features in the vitreous cavity as compared to normal and abnormal hyperreflective features stored in the disease risk assessment/diagnosis database 808. The foregoing method and process can also be used to determine or generate a risk assessment of anterior eye disease based on detection of pathognomonic disease features, such as cystoid retinal degeneration, outer retinal edema, subretinal fluid, subretinal tissue, macular holes, drusen, retinal pigment epithelial detachments, and/or retinal pigment epithelial atrophy, wherein the detected features are compared with such pathognomonic disease features stored in the disease risk assessment/diagnosis database 808. In certain embodiments, the system is configured to perform template matching wherein the system detects, compares, and/or matches characteristics from A-scans generated from scanning a user, also known as unknown A-scans, with a database of patterns known to be associated with disease features, such as subretinal fluid, or the like.
With reference to FIGS. 1, 8 and 9, the optical coherence tomography system 100 is configured to allow the user to self-administer an OCT scan of the user's eyes without dilation of the eyes, and obtain a risk assessment or diagnosis of various diseases and ailments without engaging or involving a doctor and/or technician to align the user's eyes with the system, administer the OCT scan, and/or interpret the data from the scan to generate or determine a risk assessment or diagnosis. In one embodiment, the optical coherence tomography system 100 can perform a screening in less than two minutes, between 2-3 minutes, or 2-5 minutes. In certain embodiments, the use of the binocular system allows the user to self-align the optical coherence tomography system 100. The optical coherence tomography system 100 with a binocular system is faster since it scans both eyes without repositioning, and the binocular arrangement allows the optical coherence tomography system 100 to scan a person's bad eye because the person's bad eye will follow the person's good eye as the latter tracks the fixation marker. Accordingly, the optical coherence tomography system 100 reduces the expense of conducting an OCT scan, thereby making OCT scanning more accessible to more people and/or users, and saving millions of people from losing their eyesight due to eye diseases or ailments that are preventable through earlier detection. In one embodiment, the optical coherence tomography system 100 is configured to have a small footprint and/or to be portable, such that the optical coherence tomography system 100 can be installed or placed in drug stores, retail malls or stores, medical imaging facilities, grocery stores, libraries, and/or mobile vehicles, buses, or vans, or a general practitioner's or other doctor's office, such that the optical coherence tomography system 100 can be used by people who do not have access to a doctor.
All of the methods and processes described above may be embodied in, and fully automated via, software code modules executed by one or more general purpose computers or processors. The code modules may be stored in any type of computer-readable medium or other computer storage device. Some or all of the methods may alternatively be embodied in specialized computer hardware.
While the invention has been discussed in terms of certain embodiments, it should be appreciated that the invention is not so limited. The embodiments are explained herein by way of example, and there are numerous modifications, variations and other embodiments that may be employed that would still be within the scope of the present invention.
For purposes of this disclosure, certain aspects, advantages, and novel features of the invention are described herein. It is to be understood that not necessarily all such advantages may be achieved in accordance with any particular embodiment of the invention. Thus, for example, those skilled in the art will recognize that the invention may be embodied or carried out in a manner that achieves one advantage or group of advantages as taught herein without necessarily achieving other advantages as may be taught or suggested herein.

Claims (21)

1. An optical coherence tomography instrument comprising: first and second oculars for directing light to and receiving light reflected from a pair of eyes of a subject; a light source that outputs light that is directed through the first and second oculars to the subject's eye; an interferometer configured to produce optical interference using light reflected from the subject's eye; and an optical detector disposed so as to detect said optical interference; wherein said instrument is configured to obtain optical coherence tomography scans of eye tissue, and said instrument comprises at least one of: (i) auto-focus lenses configured to focus the instrument based on said detected optical interference; or (ii) an interpupillary distance adjustment configured to facilitate the adjustment of interpupillary distance by the subject while the subject looks through said first and second oculars and a Z positioning module configured to automatically and independently adjust the Z offset of each ocular, thereby adjusting a depth at which each scan is obtained.
US 12/111,894 | Priority: 2008-03-27 | Filed: 2008-04-29 | Optical coherence tomography device, method, and system | Active, expires 2030-02-27 | US8348429B2 (en)

Priority Applications (23)

Application Number | Priority Date | Filing Date | Title
US12/111,894US8348429B2 (en)2008-03-272008-04-29Optical coherence tomography device, method, and system
PCT/US2009/037449WO2009120544A1 (en)2008-03-272009-03-17Optical coherence tomography device, method, and system
JP2011501911AJP5469658B2 (en)2008-03-272009-03-17 Optical coherence tomography apparatus and method
PCT/US2009/037448WO2009120543A1 (en)2008-03-272009-03-17Optical coherence tomography device, method, and system
EP24165712.1AEP4386775A3 (en)2008-03-272009-03-17Optical coherence tomography device, method, and system
CN202110028197.XACN112869696B (en)2008-03-272009-03-17 Optical coherence tomography device, method and system
EP16157648.3AEP3053513B1 (en)2008-03-272009-03-17Optical coherence tomography instrument
CN200980119252.3ACN102046067B (en)2008-03-272009-03-17 Optical coherence tomography analysis equipment, method and system
EP09724806.6AEP2271249B2 (en)2008-03-272009-03-17Optical coherence tomography device, method, and system
CN202510241127.0ACN120052807A (en)2008-03-272009-03-17Optical coherence tomography apparatus, method and system
CN201710950092.3ACN107692960B (en)2008-03-272009-03-17Optical coherence tomography analysis apparatus, method and system
TW98109845ATW200940026A (en)2008-03-272009-03-26Optical coherence tomography device, method, and system
US13/717,508US9149182B2 (en)2008-03-272012-12-17Optical coherence tomography device, method, and system
JP2014016709AJP6040177B2 (en)2008-03-272014-01-31 Optical coherence tomography device
US14/521,392US20150138503A1 (en)2008-03-272014-10-22Optical coherence tomography device, method, and system
US15/249,151US11291364B2 (en)2008-03-272016-08-26Optical coherence tomography device, method, and system
US15/349,970US10165941B2 (en)2008-03-272016-11-11Optical coherence tomography-based ophthalmic testing methods, devices and systems
US16/184,772US10945597B2 (en)2008-03-272018-11-08Optical coherence tomography-based ophthalmic testing methods, devices and systems
US17/249,026US11839430B2 (en)2008-03-272021-02-17Optical coherence tomography-based ophthalmic testing methods, devices and systems
US17/475,153US11510567B2 (en)2008-03-272021-09-14Optical coherence tomography-based ophthalmic testing methods, devices and systems
US17/811,658US12193742B2 (en)2008-03-272022-07-11Optical coherence tomography-based ophthalmic testing methods, devices and systems
US18/493,155US12414688B2 (en)2008-03-272023-10-24Optical coherence tomography-based ophthalmic testing methods, devices and systems
US19/017,198US20250143569A1 (en)2008-03-272025-01-10Optical coherence tomography-based ophthalmic testing methods, devices and systems

Applications Claiming Priority (2)

Application Number | Publication | Priority Date | Filing Date | Title
US4008408P | 2008-03-27 | 2008-03-27
US12/111,894 | US8348429B2 (en) | 2008-03-27 | 2008-04-29 | Optical coherence tomography device, method, and system

Related Child Applications (1)

Application Number | Relation | Publication | Priority Date | Filing Date | Title
US13/717,508 | Continuation | US9149182B2 (en) | 2008-03-27 | 2012-12-17 | Optical coherence tomography device, method, and system

Publications (2)

Publication Number | Publication Date
US20090244485A1 (en) | 2009-10-01
US8348429B2 (en) | 2013-01-08

Family

ID=40651707

Family Applications (4)

Application Number | Status | Publication | Priority Date | Filing Date | Title
US12/111,894 | Active, expires 2030-02-27 | US8348429B2 (en) | 2008-03-27 | 2008-04-29 | Optical coherence tomography device, method, and system
US13/717,508 | Active | US9149182B2 (en) | 2008-03-27 | 2012-12-17 | Optical coherence tomography device, method, and system
US14/521,392 | Abandoned | US20150138503A1 (en) | 2008-03-27 | 2014-10-22 | Optical coherence tomography device, method, and system
US15/249,151 | Active | US11291364B2 (en) | 2008-03-27 | 2016-08-26 | Optical coherence tomography device, method, and system

Family Applications After (3)

Application Number | Status | Publication | Priority Date | Filing Date | Title
US13/717,508 | Active | US9149182B2 (en) | 2008-03-27 | 2012-12-17 | Optical coherence tomography device, method, and system
US14/521,392 | Abandoned | US20150138503A1 (en) | 2008-03-27 | 2014-10-22 | Optical coherence tomography device, method, and system
US15/249,151 | Active | US11291364B2 (en) | 2008-03-27 | 2016-08-26 | Optical coherence tomography device, method, and system

Country Status (5)

Country | Link
US (4) | US8348429B2 (en)
EP (3) | EP3053513B1 (en)
JP (2) | JP5469658B2 (en)
CN (4) | CN120052807A (en)
WO (1) | WO2009120544A1 (en)

Cited By (31)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US20130173750A1 (en)*2011-12-302013-07-04Matthew CarnevaleRemote exam viewing system
US20130176402A1 (en)*2012-01-092013-07-11Kla-Tencor CorporationStereo Extended Depth of Focus
US20130265544A1 (en)*2010-10-152013-10-10Universidad De MurciaInstrument for rapid measurement of the optical properties of the eye in the entire field of vision
US8820931B2 (en)2008-07-182014-09-02Doheny Eye InstituteOptical coherence tomography-based ophthalmic testing methods, devices and systems
US20140313477A1 (en)*2013-03-152014-10-23Amo Wavefront Sciences, Llc.Angular multiplexed optical coherence tomography systems and methods
US20150160726A1 (en)*2013-03-182015-06-11Mirametrix Inc.System and Method for On-Axis Eye Gaze Tracking
US9149182B2 (en)2008-03-272015-10-06Doheny Eye InstituteOptical coherence tomography device, method, and system
US9226856B2 (en)2013-03-142016-01-05Envision Diagnostics, Inc.Inflatable medical interfaces and other medical devices, systems, and methods
US20170000342A1 (en)2015-03-162017-01-05Magic Leap, Inc.Methods and systems for detecting health conditions by imaging portions of the eye, including the fundus
US20170245755A1 (en)*2014-09-192017-08-31Carl Zeiss Meditec AgSystem for optical coherence tomography, comprising a zoomable kepler system
US10149610B2 (en)2014-04-252018-12-11Carl Zeiss Meditec, Inc.Methods and systems for automatic detection and classification of ocular inflammation
US10241576B2 (en)2017-05-082019-03-26International Business Machines CorporationAuthenticating users and improving virtual reality experiences via ocular scans and pupillometry
US10459231B2 (en)2016-04-082019-10-29Magic Leap, Inc.Augmented reality systems and methods with variable focus lens elements
US10595722B1 (en)*2018-10-032020-03-24Notal Vision Ltd.Automatic optical path adjustment in home OCT
US10610096B2 (en)2016-12-212020-04-07Acucela Inc.Miniaturized mobile, low cost optical coherence tomography system for home based ophthalmic applications
US10653314B2 (en)2017-11-072020-05-19Notal Vision Ltd.Methods and systems for alignment of ophthalmic imaging devices
US10653311B1 (en)2019-06-122020-05-19Notal Vision Ltd.Home OCT with automatic focus adjustment
US10772497B2 (en)2014-09-122020-09-15Envision Diagnostics, Inc.Medical interfaces and other medical devices, systems, and methods for performing eye exams
US10849498B2 (en)2015-08-122020-12-01Carl Zeiss Meditec, Inc.Alignment improvements for ophthalmic diagnostic systems
US10962855B2 (en)2017-02-232021-03-30Magic Leap, Inc.Display system with variable power reflector
US11039741B2 (en)2015-09-172021-06-22Envision Diagnostics, Inc.Medical interfaces and other medical devices, systems, and methods for performing eye exams
US11058299B2 (en)2017-11-072021-07-13Notal Vision Ltd.Retinal imaging device and related methods
US11357401B2 (en)2018-06-202022-06-14Acucela Inc.Miniaturized mobile, low cost optical coherence tomography system for home based ophthalmic applications
US11393094B2 (en)2020-09-112022-07-19Acucela Inc.Artificial intelligence for evaluation of optical coherence tomography images
US11497396B2 (en)2021-03-242022-11-15Acucela Inc.Axial length measurement monitor
US11510567B2 (en)2008-03-272022-11-29Doheny Eye InstituteOptical coherence tomography-based ophthalmic testing methods, devices and systems
US11684254B2 (en)2020-08-042023-06-27Acucela Inc.Scan pattern and signal processing for optical coherence tomography
US11717153B2 (en)2016-04-302023-08-08Envision Diagnostics, Inc.Medical devices, systems, and methods for performing eye exams and eye tracking
US11730363B2 (en)2019-12-262023-08-22Acucela Inc.Optical coherence tomography patient alignment system for home based ophthalmic applications
US11911105B2 (en)2020-09-302024-02-27Acucela Inc.Myopia prediction, diagnosis, planning, and monitoring device
US11974807B2 (en)2020-08-142024-05-07Acucela Inc.System and method for optical coherence tomography a-scan decurving

Families Citing this family (154)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US8223143B2 (en)2006-10-272012-07-17Carl Zeiss Meditec, Inc.User interface for efficiently displaying relevant OCT imaging data
US10874299B2 (en)2007-02-162020-12-2920/20 Vision Center, LlcSystem and method for enabling customers to obtain refraction specifications and purchase eyeglasses or contact lenses
US20080312552A1 (en)*2007-06-182008-12-18Qienyuan ZhouMethod to detect change in tissue measurements
US8696695B2 (en)2009-04-282014-04-15Avinger, Inc.Guidewire positioning catheter
US9788790B2 (en)*2009-05-282017-10-17Avinger, Inc.Optical coherence tomography for biological imaging
US9125562B2 (en)2009-07-012015-09-08Avinger, Inc.Catheter-based off-axis optical coherence tomography imaging system
US8062316B2 (en)2008-04-232011-11-22Avinger, Inc.Catheter system and method for boring through blocked vascular passages
WO2010060622A2 (en)*2008-11-262010-06-03Carl Zeiss Surgical GmbhImaging system
JP2010259492A (en)*2009-04-302010-11-18Topcon Corp Fundus observation device
WO2011003006A2 (en)2009-07-012011-01-06Avinger, Inc.Atherectomy catheter with laterally-displaceable tip
US20110190657A1 (en)*2009-08-102011-08-04Carl Zeiss Meditec, Inc.Glaucoma combinatorial analysis
US8322853B2 (en)*2009-10-022012-12-04Optos PlcDiagnostic method and apparatus for predicting potential preserved visual acuity
WO2011072068A2 (en)2009-12-082011-06-16Avinger, Inc.Devices and methods for predicting and preventing restenosis
US11382653B2 (en)2010-07-012022-07-12Avinger, Inc.Atherectomy catheter
US10548478B2 (en)2010-07-012020-02-04Avinger, Inc.Balloon atherectomy catheters with imaging
WO2014039096A1 (en)2012-09-062014-03-13Avinger, Inc.Re-entry stylet for catheter
US9345510B2 (en)2010-07-012016-05-24Avinger, Inc.Atherectomy catheters with longitudinally displaceable drive shafts
US8684526B2 (en)*2010-07-022014-04-01Amo Wavefront Sciences, LlcCompact binocular adaptive optics phoropter
JP5820154B2 (en)*2010-07-052015-11-24キヤノン株式会社 Ophthalmic apparatus, ophthalmic system, and storage medium
JP6180073B2 (en)*2010-08-312017-08-16キヤノン株式会社 Image processing apparatus, control method therefor, and program
KR101223283B1 (en)*2011-03-082013-01-21경북대학교 산학협력단Diagnostic display and operating intergrated optical tomography otoscope
US8678592B2 (en)*2011-03-092014-03-25The Johns Hopkins UniversityMethod and apparatus for detecting fixation of at least one eye of a subject on a target
EP2499964B1 (en)*2011-03-182015-04-15SensoMotoric Instruments Gesellschaft für innovative Sensorik mbHOptical measuring device and system
EP2691038B1 (en)2011-03-282016-07-20Avinger, Inc.Occlusion-crossing devices, imaging, and atherectomy devices
US9949754B2 (en)2011-03-282018-04-24Avinger, Inc.Occlusion-crossing devices
JP5639523B2 (en)*2011-03-312014-12-10キヤノン株式会社 Optical coherence tomography apparatus, control method of optical coherence tomography apparatus, program, and ophthalmic system
US8632180B2 (en)2011-04-252014-01-21Carl Zeiss Meditec, Inc.Automated detection of uveitis using optical coherence tomography
US9055892B2 (en)*2011-04-272015-06-16Carl Zeiss Meditec, Inc.Systems and methods for improved ophthalmic imaging
US9357911B2 (en)*2011-05-092016-06-07Carl Zeiss Meditec, Inc.Integration and fusion of data from diagnostic measurements for glaucoma detection and progression analysis
US20130030260A1 (en)*2011-07-282013-01-31Sean HaleSystem and method for biometric health risk assessment
JP6025311B2 (en)*2011-08-012016-11-16キヤノン株式会社 Ophthalmic diagnosis support apparatus and method
AT511935B1 (en)*2011-09-122015-09-15Ima Integrated Microsystems Austria Gmbh METHOD AND DEVICE FOR SPATIAL MEASUREMENT OF TISSUE STRUCTURES
EP3653151A1 (en)2011-10-172020-05-20Avinger, Inc.Atherectomy catheters and non-contact actuation mechanism for catheters
US9345406B2 (en)2011-11-112016-05-24Avinger, Inc.Occlusion-crossing devices, atherectomy devices, and imaging
EP3597100B1 (en)2011-12-052024-10-23Leica Microsystems NC, Inc.Optical imaging systems having input beam shape control and path length control
FR2984718B1 (en)*2011-12-222014-09-12Essilor Int DEVICE AND METHOD FOR BINOCULAR MULTIPLEXING
US8944597B2 (en)*2012-01-192015-02-03Carl Zeiss Meditec, Inc.Standardized display of optical coherence tomography imaging data
JP6226510B2 (en)*2012-01-272017-11-08キヤノン株式会社 Image processing system, processing method, and program
JP2013153880A (en)*2012-01-272013-08-15Canon IncImage processing system, processing method, and program
US8777412B2 (en)2012-04-052014-07-15Bioptigen, Inc.Surgical microscopes using optical coherence tomography and related methods
US20150043003A1 (en)*2012-04-182015-02-12Lg Electronics Inc.Optical coherence tomography and control method for the same
WO2013172970A1 (en)2012-05-142013-11-21Avinger, Inc.Atherectomy catheters with imaging
EP2849660B1 (en)2012-05-142021-08-25Avinger, Inc.Atherectomy catheter drive assemblies
US9557156B2 (en)2012-05-142017-01-31Avinger, Inc.Optical coherence tomography with graded index fiber for biological imaging
US9117121B2 (en)*2012-05-212015-08-25The Chinese University Of Hong KongDetection of disease-related retinal nerve fiber layer thinning
CN102681516B (en)*2012-05-222014-07-23中国科学院上海应用物理研究所Interface system and method for implementing communication between monochromator and spectroscopic microscope
JP2014045868A (en)2012-08-302014-03-17Canon IncInteractive controller
US9498247B2 (en)2014-02-062016-11-22Avinger, Inc.Atherectomy catheters and occlusion crossing devices
US11284916B2 (en)2012-09-062022-03-29Avinger, Inc.Atherectomy catheters and occlusion crossing devices
US9230062B2 (en)*2012-11-062016-01-0520/20 Vision Center, LlcSystems and methods for enabling customers to obtain vision and eye health examinations
US9265458B2 (en)2012-12-042016-02-23Sync-Think, Inc.Application of smooth pursuit cognitive testing paradigms to clinical drug development
EP2929327B1 (en)2012-12-052019-08-14Perimeter Medical Imaging, Inc.System and method for wide field oct imaging
JP6147001B2 (en)*2012-12-282017-06-14キヤノン株式会社 Image processing apparatus and image processing method
JP2014161439A (en)*2013-02-222014-09-08Sony CorpEyeground information acquisition device and method, and program
US9380976B2 (en)2013-03-112016-07-05Sync-Think, Inc.Optical neuroinformatics
GB2525817B (en)*2013-03-122020-02-26Opternative IncComputerized refraction and astigmatism determination
US9420945B2 (en)2013-03-142016-08-23Carl Zeiss Meditec, Inc.User interface for acquisition, display and analysis of ophthalmic diagnostic data
CN105228514B (en)2013-03-152019-01-22阿维格公司 Optical Pressure Sensor Assembly
WO2014143064A1 (en)2013-03-152014-09-18Avinger, Inc.Chronic total occlusion crossing devices with imaging
WO2014151573A1 (en)*2013-03-152014-09-25Steven VerdoonerMethod for detecting amyloid beta plaques and drusen
WO2014152058A1 (en)*2013-03-152014-09-25Steven VerdoonerMethod for detecting a disease by analysis of retinal vasculature
US11096717B2 (en)2013-03-152021-08-24Avinger, Inc.Tissue collection device for catheter
US9955865B2 (en)*2013-04-112018-05-01Novartis AgMethod and system to detect ophthalmic tissue structure and pathologies
WO2014197553A2 (en)2013-06-042014-12-11Bioptigen, Inc.Hybrid telescope for optical beam delivery and related systems and methods
JP6175945B2 (en)*2013-07-052017-08-09ソニー株式会社 Gaze detection apparatus and gaze detection method
EP3019096B1 (en)2013-07-082023-07-05Avinger, Inc.System for identification of elastic lamina to guide interventional therapy
CN105592829B (en)2013-07-292018-11-16拜尔普泰戈恩公司Surgical optical coherence tomography (OCT) and its related system and method for surgical operation
JP2015033472A (en)2013-08-082015-02-19株式会社トプコンOphthalmologic image-capturing apparatus
JP6141140B2 (en)2013-08-082017-06-07株式会社トプコン Ophthalmic imaging equipment
JP2015035111A (en)*2013-08-082015-02-19株式会社トプコンPatient management system and patient management server
WO2015023547A1 (en)*2013-08-102015-02-19Hogan Joshua Noel JoshHead-mounted optical coherence tomography
EP3039474A1 (en)2013-08-282016-07-06Bioptigen, Inc.Heads up displays for optical coherence tomography integrated surgical microscopes
US9526412B2 (en)*2014-01-212016-12-27Kabushiki Kaisha TopconGeographic atrophy identification and measurement
US10660519B2 (en)2014-01-302020-05-26Duke UniversitySystems and methods for eye tracking for motion corrected ophthalmic optical coherence tomography
MX2016010141A (en)2014-02-062017-04-06Avinger IncAtherectomy catheters and occlusion crossing devices.
US9237847B2 (en)2014-02-112016-01-19Welch Allyn, Inc.Ophthalmoscope device
US9211064B2 (en)2014-02-112015-12-15Welch Allyn, Inc.Fundus imaging system
EP3113667A4 (en)*2014-03-042017-11-08University of Southern CaliforniaExtended duration optical coherence tomography (oct) system
JP2015201003A (en)2014-04-072015-11-12株式会社トプコンOphthalmologic information system and ophthalmologic information processing server
WO2015171566A1 (en)*2014-05-062015-11-12Oregon Health & Science UniversityAqueous cell differentiation in anterior uveitis using optical coherence tomography
DE102014007909A1 (en)2014-05-272015-12-03Carl Zeiss Meditec Ag Surgical microscope
JP2017522066A (en)*2014-06-102017-08-10カール ツァイス メディテック インコーポレイテッドCarl Zeiss Meditec Inc. Imaging system and method with improved frequency domain interferometry
US10143370B2 (en)*2014-06-192018-12-04Novartis AgOphthalmic imaging system with automatic retinal feature detection
US10357277B2 (en)2014-07-082019-07-23Avinger, Inc.High speed chronic total occlusion crossing devices
US9724239B2 (en)*2014-07-142017-08-08Novartis AgMovable wide-angle ophthalmic surgical system
WO2016127140A1 (en)2015-02-052016-08-11Duke UniversityCompact telescope configurations for light scanning systems and methods of using the same
US10238279B2 (en)2015-02-062019-03-26Duke UniversityStereoscopic display systems and methods for displaying surgical data and information in a surgical microscope
US10799115B2 (en)2015-02-272020-10-13Welch Allyn, Inc.Through focus retinal image capturing
US11045088B2 (en)*2015-02-272021-06-29Welch Allyn, Inc.Through focus retinal image capturing
US20160278983A1 (en)*2015-03-232016-09-29Novartis AgSystems, apparatuses, and methods for the optimization of laser photocoagulation
CN104856640B (en)*2015-05-212016-05-25中国科学院光电研究院Laser induced plasma spectroscopy analytical equipment and the method measured for cornea
CN104856642A (en)*2015-05-292015-08-26厦门大学Webpage based remote control slit-lamp examination system
JP6490501B2 (en)*2015-06-092019-03-27株式会社トプコン Ophthalmic examination system
JP6499937B2 (en)*2015-06-302019-04-10株式会社トプコン Ophthalmic microscope system
US20170000392A1 (en)*2015-07-012017-01-05Rememdia LCMicro-Camera Based Health Monitor
US9913583B2 (en)2015-07-012018-03-13Rememdia LCHealth monitoring system using outwardly manifested micro-physiological markers
US10568520B2 (en)2015-07-132020-02-25Avinger, Inc.Micro-molded anamorphic reflector lens for image guided therapeutic/diagnostic catheters
US10136804B2 (en)2015-07-242018-11-27Welch Allyn, Inc.Automatic fundus image capture system
US20170112373A1 (en)*2015-10-232017-04-27Gobiquity, Inc.Visual acuity testing method and product
KR101861672B1 (en)*2015-10-292018-05-29주식회사 고영테크놀러지Full-field swept-source optical coherence tomography system and three-dimensional image compensation method for same
US10772495B2 (en)2015-11-022020-09-15Welch Allyn, Inc.Retinal image capturing
US9928602B2 (en)2015-12-112018-03-27Novartis AgFast and automated segmentation of layered image with heuristic graph search
WO2017120217A1 (en)2016-01-072017-07-13Welch Allyn, Inc.Infrared fundus imaging system
JP6927986B2 (en)2016-01-252021-09-01アビンガー・インコーポレイテッドAvinger, Inc. OCT imaging catheter with delay compensation
US20180116503A1 (en)*2016-02-052018-05-03Joshua Noel HoganHead-mounted Optical coherence tomography
EP3435892B1 (en)2016-04-012024-04-03Avinger, Inc.Atherectomy catheter with serrated cutter
AU2017257403B2 (en)*2016-04-292022-03-31OncoRes Medical Pty LtdAn optical coherence tomography system
US10694939B2 (en)2016-04-292020-06-30Duke UniversityWhole eye optical coherence tomography(OCT) imaging systems and related methods
US10366792B2 (en)*2016-05-022019-07-30Bio-Tree Systems, Inc.System and method for detecting retina disease
US11344327B2 (en)2016-06-032022-05-31Avinger, Inc.Catheter device with detachable distal end
WO2018006041A1 (en)2016-06-302018-01-04Avinger, Inc.Atherectomy catheter with shapeable distal tip
US10602926B2 (en)2016-09-292020-03-31Welch Allyn, Inc.Through focus retinal image capturing
US9968251B2 (en)2016-09-302018-05-15Carl Zeiss Meditec, Inc.Combined structure-function guided progression analysis
US10916012B2 (en)2016-10-052021-02-09Canon Kabushiki KaishaImage processing apparatus and image processing method
AU2016265973A1 (en)*2016-11-282018-06-14Big Picture Medical Pty LtdSystem and method for identifying a medical condition
US11197607B2 (en)*2016-12-142021-12-14C. Light Technologies, Inc.Binocular retinal imaging device, system, and method for tracking fixational eye motion
US20190374160A1 (en)*2017-01-052019-12-12The Trustees Of Princeton UniversityHierarchical health decision support system and method
CN106645156B (en)*2017-01-062023-03-17塔里木大学A rotary conveying clamping device for bergamot pear detects
US10690919B1 (en)2017-02-172020-06-23Facebook Technologies, LlcSuperluminous LED array for waveguide display
JP2018139846A (en)*2017-02-282018-09-13富士フイルム株式会社Endoscope system and operation method thereof
US10849547B2 (en)2017-05-042020-12-01Junebrain, Inc.Brain monitoring system
WO2019014767A1 (en)2017-07-182019-01-24Perimeter Medical Imaging, Inc.Sample container for stabilizing and aligning excised biological tissue samples for ex vivo analysis
EP3703550A4 (en)*2017-10-312021-07-21Octhealth, LLC DEVICE AND METHOD FOR THE SELF-ADMINISTRATION OF THE OPTICAL SCAN OF AN OPTICAL SYSTEM OF THE EYE OF A PERSON
DE102018101917A1 (en)*2018-01-292019-08-01Carl Zeiss Ag Method and device for eye examination by means of OCT
RU2672391C1 (en)*2018-02-082018-11-14Федеральное государственное автономное учреждение "Межотраслевой научно-технический комплекс "Микрохирургия глаза" имени академика С.Н. Федорова" Министерства здравоохранения Российской ФедерацииMethod for diagnosis of drusen of the optic nerve disc using the method of optical coherence tomography - angiography
EP3530175A1 (en)*2018-02-262019-08-28Nokia Technologies OyApparatus for optical coherence tomography
CN111819417B (en)*2018-03-012022-03-22爱尔康公司Common-path waveguide for stabilizing optical coherence tomography imaging
US10168537B1 (en)*2018-03-162019-01-01Facebook Technologies, LlcSingle chip superluminous light emitting diode array for waveguide displays
US12167867B2 (en)2018-04-192024-12-17Avinger, Inc.Occlusion-crossing devices
US11096574B2 (en)2018-05-242021-08-24Welch Allyn, Inc.Retinal image capturing
CN108968922A (en)*2018-08-172018-12-11苏州长脉科技有限责任公司A kind of hand-held compact oedema detection device and its data processing method based on near-infrared absorption
JP2020044027A (en)2018-09-182020-03-26株式会社トプコン Ophthalmic apparatus, control method thereof, program, and recording medium
CN109620131B (en)*2018-12-142021-08-03佛山科学技术学院 Common optical path microlens array multi-beam optical coherent elasticity measurement system and method
JP6699956B1 (en)2019-01-162020-05-27株式会社トプコン Ophthalmic device
JP2019107485A (en)*2019-02-262019-07-04株式会社トプコンOphthalmic examination equipment
RU2718322C1 (en)*2019-08-272020-04-01федеральное государственное автономное учреждение "Национальный медицинский исследовательский центр "Межотраслевой научно-технический комплекс "Микрохирургия глаза" имени академика С.Н. Федорова" Министерства здравоохранения Российской ФедерацииMethod for differential diagnosis of optic nerve head drusen and congestive optic discs by optical coherent tomography-angiography
CN114746033B (en)2019-10-182025-01-10阿维格公司 Blocking crossing device
CN111103975B (en)2019-11-302022-09-23华为技术有限公司Display method, electronic equipment and system
CN111110183A (en)*2019-12-172020-05-08温州医科大学Binocular optical coherence automatic focusing imaging device and working method
JP7384656B2 (en)*2019-12-202023-11-21株式会社トプコン Ophthalmology information processing device, ophthalmology device, ophthalmology information processing method, and program
EP3889889A1 (en)*2020-03-302021-10-06Optos PLCOcular image data processing
WO2021236237A2 (en)2020-04-012021-11-25Sarcos Corp.System and methods for early detection of non-biological mobile aerial target
FI129077B (en)2020-06-292021-06-30Optomed OyjContact arrangement for eye examining instrument, eye examining instrument and method of contacting between eye and eye examining instrument
DK3939492T3 (en)*2020-07-172022-10-31Optos Plc BINOCULAR OPTICAL COHERENCE TOMOGRAPHY IMAGING SYSTEM
RU2750907C1 (en)*2020-11-232021-07-06Федеральное Государственное Бюджетное Учреждение "Национальный Медицинский Исследовательский Центр Оториноларингологии Федерального Медико-Биологического Агентства" (Фгбу Нмицо Фмба России)Method for differential diagnosis of drusen in age-related macular dystrophy
US11819274B2 (en)*2020-12-092023-11-21Alcon Inc.Non-contact wide angle retina viewing system
CN112967534B (en)*2021-02-072022-11-11广州大学Michelson interferometer virtual simulation system and Michelson interference experiment method
WO2022225946A1 (en)*2021-04-192022-10-27Welch Allyn, Inc.Visually linear and discrete dimming for ophthalmoscopes and otoscopes and other medical examination or diagnostic instruments
RU2762992C1 (en)*2021-05-212021-12-24федеральное государственное автономное учреждение "Национальный медицинский исследовательский центр "Межотраслевой научно-технический комплекс "Микрохирургия глаза" имени академика С.Н. Федорова" Министерства здравоохранения Российской ФедерацииMethod for diagnosing optic nerve atrophy after suffering optic neuritis in young patients using oct angiography
KR102566442B1 (en)*2021-06-042023-08-14고려대학교 산학협력단Apparatus and method for choroidal stroma analysis using optical coherence tomography
JP2023066499A (en)*2021-10-292023-05-16株式会社トプコン Ophthalmic information processing device, ophthalmic device, ophthalmic information processing method, and program
JP2023102032A (en)*2022-01-112023-07-24株式会社トプコンOphthalmologic apparatus
GB2624650A (en)*2022-11-232024-05-29Siloton LtdApparatus for optical coherence tomography
CN116636808B (en)*2023-06-282023-10-31交通运输部公路科学研究所 A method and device for analyzing visual health of smart cockpit driver

Citations (91)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US4154114A (en)1977-12-021979-05-15Sonometrics Systems, Inc.Biometric measuring device
US4237901A (en)1978-08-301980-12-09Picker CorporationLow and constant pressure transducer probe for ultrasonic diagnostic system
US4479931A (en)1982-09-231984-10-30The United States Of America As Represented By The United States Department Of EnergyMethod for non-invasive detection of ocular melanoma
US4764006A (en)1985-09-131988-08-16Canon Kabushiki KaishaOphthalmic measuring apparatus
US4848340A (en)1988-02-101989-07-18Intelligent Surgical LasersEyetracker and method of use
US4930512A (en)1988-06-161990-06-05Sonomed, Inc.Hand held spring-loaded ultrasonic probe
US5005966A (en)1988-06-021991-04-09Handler Albert WExophthalmometer light, and methods of constructing and utilizing same
US5056522A (en)1987-11-021991-10-15Canon Kabushiki KaishaSupersonic ophthalmic measuring apparatus
US5141302A (en)1990-05-311992-08-25Kabushiki Kaisha TopconIntraocular length measuring instrument
US5369454A (en)1993-11-011994-11-29Cornell Research Foundation, Inc.System and method for producing and maintaining parallel and vertical fixation of visual axis
US5442412A (en)1994-04-251995-08-15Autonomous Technologies Corp.Patient responsive eye fixation target method and system
US5491524A (en)1994-10-051996-02-13Carl Zeiss, Inc.Optical coherence tomography corneal mapping apparatus
US5493109A (en)*1994-08-181996-02-20Carl Zeiss, Inc.Optical coherence tomography assisted ophthalmologic surgical microscope
EP0697611A2 (en)1994-08-181996-02-21Carl ZeissOptical coherence tomography assisted surgical apparatus
US5543866A (en)*1994-01-071996-08-06Jozef F. Van de VeldeScanning laser ophthalmoscope for binocular imaging and functional testing
US5557350A (en)1994-04-151996-09-17Nidek Co. Ltd.Ophthalmometric apparatus with alignment device including filter means
US5776068A (en)1997-06-191998-07-07Cornell Research FoundationUltrasonic scanning of the eye using a stationary transducer
US5914772A (en)1997-08-291999-06-22Eyelogic Inc.Method and device for testing eyes
JPH11225958A (en)1998-02-181999-08-24Koonan:KkChin rest device for ophthalmologic instrument and ophthalmogic instrument
WO1999057507A1 (en)1998-05-011999-11-11Board Of Regents, The University Of Texas SystemMethod and apparatus for subsurface imaging
US6086205A (en)1997-10-212000-07-11Medibell Medical Vision Technologies Ltd.Apparatus and method for simultaneous bilateral retinal digital angiography
US6112114A (en)1991-12-162000-08-29Laser Diagnostic Technologies, Inc.Eye examination apparatus employing polarized light probe
US6293674B1 (en)2000-07-112001-09-25Carl Zeiss, Inc.Method and apparatus for diagnosing and monitoring eye disease
US20010025226A1 (en)2000-02-182001-09-27Lavery Kevin T.Medical screening apparatus and method
US20020021411A1 (en)2000-05-302002-02-21Wilson Ralph C.Systems and methods for performing an eye examination
US20020080329A1 (en)2000-12-272002-06-27Tatsuya KasaharaCorneal endothelium analysis system
US20020099305A1 (en)2000-12-282002-07-25Matsushita Electic Works, Ltd.Non-invasive brain function examination
US6439720B1 (en)2000-01-272002-08-27Aoptics, Inc.Method and apparatus for measuring optical aberrations of the human eye
US6450643B1 (en)2000-05-302002-09-17Ralph C. WilsonSystems and methods for performing an eye examination
US20030065636A1 (en)2001-10-012003-04-03L'orealUse of artificial intelligence in providing beauty advice
US6592223B1 (en)1999-10-072003-07-15Panaseca, Inc.System and method for optimal viewing of computer monitors to minimize eyestrain
US6609794B2 (en)2001-06-052003-08-26Adaptive Optics Associates, Inc.Method of treating the human eye with a wavefront sensor-based ophthalmic instrument
US6619799B1 (en)1999-07-022003-09-16E-Vision, LlcOptical lens system with electro-active lens having alterably different focal lengths
US6656131B2 (en)2000-10-062003-12-02Notal Vision Inc.Method and system for detecting eye disease
US20030232015A1 (en)2002-02-282003-12-18Reay BrownDevice and method for monitoring aqueous flow within the eye
US20040019032A1 (en)2001-07-202004-01-29Janice NorthTreatment of macular edema
US6687389B2 (en)1997-10-242004-02-03British Telecommunications Public Limited CompanyImaging apparatus
US6692436B1 (en)2000-04-142004-02-17Computerized Screening, Inc.Health care information system
US6705726B2 (en)2002-02-202004-03-16Nidek Co., Ltd.Instrument for eye examination and method
US20040141152A1 (en)2002-09-272004-07-22Marino Joseph A.Apparatus and method for conducting vision screening
US20040196432A1 (en)2000-06-132004-10-07Wei SuDigital eye camera
US6820979B1 (en)1999-04-232004-11-23Neuroptics, Inc.Pupilometer with pupil irregularity detection, pupil tracking, and pupil response detection capability, glaucoma screening capability, intracranial pressure detection capability, and ocular aberration measurement capability
US20040254154A1 (en)2003-05-072004-12-16Control Delivery Systems, Inc.Prediction of changes to visual acuity from assessment of macular edema
US20040260183A1 (en)2003-02-052004-12-23Lambert James L.Non-invasive in vivo measurement of macular carotenoids
US20050001980A1 (en)2003-07-042005-01-06Spector Robert T.Method of and apparatus for diagnosing and treating amblyopic conditions in the human visual system
US20050041200A1 (en)2001-11-072005-02-24Darren RichGonioscopy assembly
US20050105044A1 (en)2003-11-142005-05-19Laurence WardenLensometers and wavefront sensors and methods of measuring aberration
US20050140981A1 (en)2002-04-182005-06-30Rudolf WaeltiMeasurement of optical properties
US7008116B2 (en)2003-10-312006-03-07Japan Aviation Electronics Industry,LimitedOptical connector adapter having an engaging portion which is rendered movable by the use of a slit
US20060077348A1 (en)2004-10-052006-04-13University Of Pittsburgh - Of The CommonwealthMethod and apparatus for screening for retinopathy
US20060077347A1 (en)2004-09-222006-04-13Eastman Kodak CompanyFundus camera having scanned illumination and pupil tracking
US20060092376A1 (en)2004-10-292006-05-04Seung-Ho BaekFundus imaging system
US20060109423A1 (en)2004-09-152006-05-25Jianhua WangTear dynamics measured with optical coherence tomography
US20060119858A1 (en)2004-12-022006-06-08Knighton Robert WEnhanced optical coherence tomography for anatomical mapping
US20060135859A1 (en)2004-10-222006-06-22Iliff Edwin CMatrix interface for medical diagnostic and treatment advice system and method
US20060158655A1 (en)2005-01-202006-07-20Everett Matthew JApparatus and method for combined optical-coherence-tomographic and confocal detection
CA2595324A1 (en)2005-01-212006-07-27Massachusetts Institute Of TechnologyMethods and apparatus for optical coherence tomography scanning
US20060195076A1 (en)2005-01-102006-08-31Blumenkranz Mark SMethod and apparatus for patterned plasma-mediated laser trephination of the lens capsule and three dimensional phaco-segmentation
US20060257451A1 (en)2005-04-082006-11-16Varner Signe ESustained release implants and methods for subretinal delivery of bioactive agents to treat or prevent retinal disease
US20070024868A1 (en)1998-09-112007-02-01University Hospitals Of ClevelandInterferometers for optical coherence domain reflectometry...
US20070030450A1 (en)2005-08-032007-02-08Eastman Kodak CompanyAutomated fundus imaging system
US20070055222A1 (en)1999-10-212007-03-08Kristian HohlaIris recognition and tracking for optical treatment
US20070073113A1 (en)2004-11-232007-03-29Squilla John RProviding medical services at a kiosk
US20070081165A1 (en)2005-04-292007-04-12Onur KilicHigh-sensitivity fiber-compatible optical acoustic sensor
US20070081166A1 (en)2005-09-292007-04-12Bioptigen, Inc.Portable Optical Coherence Tomography (OCT) Devices and Related Systems
EP1775545A2 (en)2005-10-122007-04-18Kabushiki Kaisha TOPCONOptical image measuring device, optical image measuring program, fundus observation device, and fundus observation program
US7219996B2 (en)2003-09-252007-05-22Nidek Co., Ltd.Fundus camera
DE102005058220A1 (en)2005-12-062007-06-14Carl Zeiss Meditec Ag Interferometric sample measurement
US7233312B2 (en)2000-07-312007-06-19Panaseca, Inc.System and method for optimal viewing of computer monitors to minimize eyestrain
US7237898B1 (en)1999-10-212007-07-03Bausch & Lomb IncorporatedCustomized corneal profiling
US20070153233A1 (en)2005-12-312007-07-05Campin John ADetermining optimal positioning of ophthalmic devices by use of image processing and autofocusing techniques
US20070177104A1 (en)2004-01-222007-08-02Francois LacombeEye examination device by means of tomography with a sighting device
US20070195269A1 (en)2006-01-192007-08-23Jay WeiMethod of eye examination by optical coherence tomography
US20070216909A1 (en)2006-03-162007-09-20Everett Matthew JMethods for mapping tissue with optical coherence tomography data
US20070282313A1 (en)2006-06-012007-12-06University Of Southern CaliforniaMethod and apparatus to guide laser corneal surgery with optical measurement
EP1864608A1 (en)2006-06-092007-12-12Kabushiki Kaisha TOPCONA fundus observation device, an ophthalmologic image processing unit, an ophthalmologic image processing method
US20070287932A1 (en)2006-05-012007-12-13University Of Southern CaliforniaMapping and diagnosis of macular edema by optical coherence tomography
US20070291228A1 (en)2006-05-012007-12-20University Of Southern CaliforniaGaussian fitting on mean curvature maps of parameterization of corneal ectatic diseases
US20070291277A1 (en)2006-06-202007-12-20Everett Matthew JSpectral domain optical coherence tomography system
US20080007694A1 (en)2006-05-252008-01-10Jianping WeiMeasurement of lenses and lens molds using optical coherence tomography
US7350921B2 (en)2003-06-232008-04-01Phillip V. RidingsiQueVision: animated / vision testing system
US20080106696A1 (en)2006-11-012008-05-08Bioptigen, Inc.Optical coherence imaging systems having a mechanism for shifting focus and scanning modality and related adapters
US7384146B2 (en)2005-06-282008-06-10Carestream Health, Inc.Health care kiosk having automated diagnostic eye examination and a fulfillment remedy based thereon
CA2678506A1 (en)2007-02-232008-08-28Mimo AgOphthalmologic apparatus for imaging an eye by optical coherence tomography
US7445335B2 (en)2006-01-202008-11-04Clarity Medical Systems, Inc.Sequential wavefront sensor
US20090141240A1 (en)2007-11-052009-06-04Oti Ophthalmic Technologies Inc.Method for Performing Micro-Perimetry and Visual Acuity Testing
US7549752B2 (en)2007-08-172009-06-23Peyman Gholam AMethod of detecting glaucoma
US7614747B2 (en)2004-07-282009-11-10Solohealth, Inc.Automated vision screening apparatus and method
US7618372B2 (en)2004-07-022009-11-17Dela Houssaye Arthur JosephLaser guided eye measuring device and method for using
US8002410B2 (en)2006-01-202011-08-23Clarity Medical Systems, Inc.User-proposed entry field(s) for customized data analysis/presentation
US8079711B2 (en)2008-04-242011-12-20Carl Zeiss Meditec, Inc.Method for finding the lateral position of the fovea in an SDOCT image volume

Family Cites Families (128)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US3538754A (en)1969-01-151970-11-10American Optical CorpMethod for measuring intraocular pressure
US4150443A (en)1978-03-131979-04-24Robert E. SmithAnti-fogging sports goggle
JPS5729204Y2 (en)1978-08-291982-06-25
JPS629847Y2 (en)*1980-07-231987-03-07
JPS5729204A (en)1980-07-281982-02-17Iseki Agricult MachDevice for changing climbing speed of rice transplanter body
JPS6040177B2 (en)1980-11-181985-09-10松下電器産業株式会社 Manufacturing method of multilayer capacitor
US4393366A (en)*1981-02-171983-07-12Eye-D Development Ii Ltd.Rotating beam ocular identification apparatus and method
JPS57153635U (en)1981-03-241982-09-27
US4740072A (en)1986-01-291988-04-26Titmus Optical, Inc.Vision testing apparatus
JPH0310748Y2 (en)1987-03-201991-03-18
DE3890892D2 (en)1987-07-181989-09-07Rodenstock Optik GSpectacle lens with astigmatic effect
USH574H (en)*1987-08-281989-02-07The United States Of America As Represented By The Secretary Of The Air ForceSubject operated pupilometer
US5140997A (en)1989-10-041992-08-25Glassman Jacob AOphthalmologic surgical drape with breathing means
US5214455A (en)1991-04-011993-05-25General Electric CompanyObjective eye alignment measurement method and system
US5129109A (en)1991-08-051992-07-14Runckel John LSwim goggles with inflatable air gasket seal
JPH05220113A (en)*1992-02-171993-08-31Topcon Corp Eye refractometer
FR2690329A1 (en)1992-04-221993-10-29Boucobza FabienDevice for external examination of human eye - has eye mask placed over eye and containing infrared video camera and infrared source
US5467104A (en)1992-10-221995-11-14Board Of Regents Of The University Of WashingtonVirtual retinal display
US5345946A (en)1993-04-231994-09-13Johnson & Johnson Medical, Inc.Multi-element surgical drape with sealable surgical run-off pouches
US5644642A (en)1995-04-031997-07-01Carl Zeiss, Inc.Gaze tracking using optical coherence tomography
US5596379A (en)*1995-10-241997-01-21Kawesch; Gary M.Portable visual acuity testing system and method
US5923399A (en)*1996-11-221999-07-13Jozef F. Van de VeldeScanning laser ophthalmoscope optimized for retinal microphotocoagulation
US5838424A (en)1997-02-261998-11-17Welch Allyn, Inc.View port for eye test apparatus
GB9722949D0 (en)1997-10-301998-01-07Bid Instr LtdOcular testing and related projection apparatus and method
US6019103A (en)1998-02-262000-02-01Carroll; LynnetteDisposable sanitary eye protector
AU750778B2 (en)*1998-06-172002-07-25Lions Eye Institute LimitedZ axis tracker
US6285489B1 (en)1999-08-052001-09-04Microvision Inc.Frequency tunable resonant scanner with auxiliary arms
US7374287B2 (en)*1999-11-012008-05-20Jozef F. Van de VeldeRelaxed confocal catadioptric scanning laser ophthalmoscope
US6345621B1 (en)2000-02-182002-02-12Becton, Dickinson And CompanyDual refractive drape for use in eye surgery
US6460997B1 (en)2000-05-082002-10-08Alcon Universal Ltd.Apparatus and method for objective measurements of optical systems using wavefront analysis
US6628041B2 (en)2000-05-162003-09-30Calient Networks, Inc.Micro-electro-mechanical-system (MEMS) mirror device having large angle out of plane motion using shaped combed finger actuators and method for fabricating the same
US7203425B1 (en)2000-07-212007-04-10Texas Instruments IncorporatedOptical wireless link
CA2420530A1 (en)*2000-08-232002-02-28Philadelphia Ophthalmologic Imaging Systems, Inc.Systems and methods for tele-ophthalmology
JP2002238858A (en)*2001-02-202002-08-27Matsushita Electric Works LtdFundus oculi examination method, server and fundus oculi examination enterprise system
US7232220B2 (en)2001-03-012007-06-19Richard FranzSystem for vision examination utilizing telemedicine
JP2002306415A (en)*2001-04-132002-10-22Japan Science & Technology Corp Ophthalmic diagnosis support system
WO2002088684A1 (en)2001-04-302002-11-07The General Hospital CorporationMethod and apparatus for improving image clarity and sensitivity in optical coherence tomography using dynamic feedback to control focal properties and coherence gating
US20050010091A1 (en)*2003-06-102005-01-13Woods Joe W.Non-invasive measurement of blood glucose using retinal imaging
US6634237B2 (en)2001-11-302003-10-21Bausch & Lomb IncorporatedCollection reservoir for use with flow meter control system
EP1487323B1 (en)*2002-03-282015-04-22Heidelberg Engineering GmbHMethod for examining the ocular fundus
CA2390072C (en)2002-06-282018-02-27Adrian Gh PodoleanuOptical mapping apparatus with adjustable depth resolution and multiple functionality
US6637877B1 (en)2002-08-092003-10-28Gentex CorporationEyewear for ballistic and light protection
JP2004201998A (en)*2002-12-262004-07-22Olympus CorpElectronic optometer
JP2004261316A (en)*2003-02-282004-09-24Matsushita Electric Ind Co Ltd Eye diagnostic system
DE10313028A1 (en)2003-03-242004-10-21Technovision Gmbh Method and device for eye alignment
EP2311991A1 (en)2003-04-152011-04-20Intercell AGS. pneumoniae antigens
WO2004098396A2 (en)*2003-05-012004-11-18The Cleveland Clinic FoundationMethod and apparatus for measuring a retinal sublayer characteristic
CA2470027A1 (en)2003-06-052004-12-05Kingston General HospitalManagement systems and methods
CN100387183C (en)2003-09-092008-05-14长春市光电仪器有限公司Digital simultaneous vision machine
JP4187160B2 (en)2003-09-102008-11-26フジノン株式会社 Tomographic imaging system
GB0322551D0 (en)2003-09-262003-10-29Keeler LtdHead-mounted instruments
GB2407378B (en)*2003-10-242006-09-06Lein Applied Diagnostics LtdOcular property measuring apparatus and method therefor
US7731360B2 (en)*2003-11-072010-06-08Neuro KineticsPortable video oculography system
US7114554B2 (en)2003-12-012006-10-03Honeywell International Inc.Controller interface with multiple day programming
US7277223B2 (en)*2004-07-262007-10-02Meade Instruments CorporationApparatus and methods for focusing and collimating telescopes
US8182091B2 (en)*2004-07-282012-05-22Solohealth, Inc.Automated vision screening apparatus and method
CA2580726A1 (en)*2004-09-162006-03-30Imaging Therapeutics, Inc.System and method of predicting future fractures
KR20060054625A (en)*2004-11-152006-05-23엘지전자 주식회사 Portable terminal capable of measuring eyesight and visual acuity measuring method using the same
US20060203195A1 (en)2005-03-102006-09-14Squire Bret CIntegrated ocular examination device
US7390091B2 (en)2005-04-292008-06-24Sperian Protection Optical, Inc.Vision testing apparatus
JP2006350256A (en)2005-06-202006-12-28Canon Inc Scanning image display device
US20090153796A1 (en)2005-09-022009-06-18Arthur RabnerMulti-functional optometric-ophthalmic system for testing diagnosing, or treating, vision or eyes of a subject, and methodologies thereof
WO2007065670A2 (en)2005-12-062007-06-14Carl Zeiss Meditec AgInterferometric sample measurement
JP4823693B2 (en)*2006-01-112011-11-24株式会社トプコン Optical image measuring device
JP4869757B2 (en)*2006-03-242012-02-08株式会社トプコン Fundus observation device
JP4864516B2 (en)2006-04-072012-02-01株式会社トプコン Ophthalmic equipment
WO2007127291A2 (en)*2006-04-242007-11-08Physical Sciences, Inc.Stabilized retinal imaging with adaptive optics
EP2012653B1 (en)*2006-05-012012-12-12Physical Sciences, Inc.Hybrid spectral domain optical coherence tomography line scanning laser ophthalmoscope
CA2653309C (en)2006-05-262013-11-19The Cleveland Clinic FoundationMethod for measuring biomechanical properties in an eye
AU2007254999B2 (en)2006-05-312013-02-21Aeon Imaging, LLCLaser scanning digital camera with simplified optics and potential for multiply scattered light imaging
JP4822332B2 (en)*2006-06-222011-11-24株式会社トプコン Ophthalmic equipment
CN200980154Y (en)2006-07-212007-11-21林炳宏 Headphone structure of the eye mask
EP2068992B1 (en)2006-08-032016-10-05Breathe Technologies, Inc.Devices for minimally invasive respiratory support
US7668619B2 (en)*2006-10-252010-02-23Walgreen Co.Personalized gift card templates
WO2008052793A1 (en)*2006-11-022008-05-08Heidelberg Engineering GmbhMethod and apparatus for retinal diagnosis
US8468244B2 (en)2007-01-052013-06-18Digital Doors, Inc.Digital information infrastructure and method for security designated data and with granular data stores
WO2009039303A1 (en)2007-09-192009-03-26State University Of New York At Stony BrookOptical coherence tomography systems and methods
US20090143685A1 (en)*2007-11-132009-06-04The Regents Of The University Of MichiganMethod and Apparatus for Detecting Diseases Associated with the Eye
JP2011508250A (en)2007-12-122011-03-10スリーエム イノベイティブ プロパティズ カンパニー Optical film laminate
US8079710B2 (en)2008-01-102011-12-20Notal Vision Ltd.Dual position ophthalmic apparatus
US7952778B2 (en)2008-01-152011-05-31Jds Uniphase CorporationBiaxial MEMS mirror with hidden hinge
US8983580B2 (en)*2008-01-182015-03-17The Board Of Trustees Of The University Of IllinoisLow-coherence interferometry and optical coherence tomography for image-guided surgical treatment of solid tumors
DE102008000225B3 (en)2008-02-012009-03-26Linos Photonics Gmbh & Co. Kg fundus
US8348429B2 (en)2008-03-272013-01-08Doheny Eye InstituteOptical coherence tomography device, method, and system
WO2009120543A1 (en)2008-03-272009-10-01Doheny Eye InstituteOptical coherence tomography device, method, and system
EP2306888A1 (en)2008-04-142011-04-13Optovue, Inc.Method of eye registration for optical coherence tomography
US8783866B2 (en)2008-04-242014-07-22Bioptigen, Inc.Optical coherence tomography (OCT) imaging systems having adaptable lens systems and related methods and computer program products
EP3884844A1 (en)2008-07-182021-09-29Doheny Eye InstituteOptical coherence tomography-based ophthalmic testing methods, devices and systems
WO2010117386A1 (en)2009-04-102010-10-14Doheny Eye InstituteOphthalmic testing methods, devices and systems
GB0907557D0 (en)2009-05-012009-06-10Optos PlcImprovements in or relating to scanning ophthalmoscopes
CN201491234U (en)2009-08-222010-05-26倪国庆Massager earphone structure
US20110047682A1 (en)2009-08-252011-03-03Arman HedayatProtective Eyewear Device With Lateral Eye Access and Quick Release Mechanism for Interchanging Lenses
US8510883B2 (en)2009-10-302013-08-20Arcscan, Inc.Method of positioning a patient for medical procedures
JP5701625B2 (en)2010-03-312015-04-15株式会社ニデック Ophthalmic laser treatment device
US9050027B2 (en)2010-07-302015-06-09Adventus Technologies, Inc.Intraoperative imaging system and apparatus
US8712505B2 (en)2010-11-112014-04-29University Of Pittsburgh-Of The Commonwealth System Of Higher EducationAutomated macular pathology diagnosis in three-dimensional (3D) spectral domain optical coherence tomography (SD-OCT) images
JP5952564B2 (en)2011-01-202016-07-13キヤノン株式会社 Image processing apparatus and image processing method
US20120222185A1 (en)2011-03-042012-09-06Laura Margaret EriksonShower face shield
US20120257166A1 (en)2011-04-072012-10-11Raytheon CompanyPortable self-retinal imaging device
US9055892B2 (en)2011-04-272015-06-16Carl Zeiss Meditec, Inc.Systems and methods for improved ophthalmic imaging
US20130010259A1 (en)2011-07-052013-01-10Escalon Digital Vision, Inc.Region based vision tracking system for imaging of the eye for use in optical coherence tomography
JP6057567B2 (en)2011-07-142017-01-11キヤノン株式会社 Imaging control apparatus, ophthalmic imaging apparatus, imaging control method, and program
CA2750287C (en)2011-08-292012-07-03Microsoft CorporationGaze detection in a see-through, near-eye, mixed reality display
KR101245330B1 (en)2011-12-202013-03-25경희대학교 산학협력단Pc-based visual filed self-diagnosis system and gaze fixing method
JP5306493B2 (en)2012-01-252013-10-02キヤノン株式会社 Ophthalmic apparatus, control method for ophthalmic apparatus, and program
JP5936371B2 (en)2012-01-262016-06-22キヤノン株式会社 Ophthalmic apparatus, method for controlling ophthalmic apparatus, and program
US9125724B2 (en)2012-03-092015-09-08John BerdahiIntraocular pressure modification
US20130265537A1 (en)2012-04-042013-10-10Honeywell International, Inc.Safety eyewear with lens frame eyewire having rectangular cross-section for improved lens retention
US9004687B2 (en)2012-05-182015-04-14Sync-Think, Inc.Eye tracking headset and system for neuropsychological testing including the detection of brain damage
US9320427B2 (en)2012-07-092016-04-26Arcscan, Inc.Combination optical and ultrasonic imaging of an eye
US8951046B2 (en)2012-08-102015-02-10Sync-Think, Inc.Desktop-based opto-cognitive device and system for cognitive assessment
KR20150083902A (en)2012-11-072015-07-20클레러티 메디칼 시스템즈 인코포레이티드Apparatus and method for operating a real time large diopter range sequential wavefront sensor
WO2014084231A1 (en)2012-11-302014-06-05株式会社トプコンFundus photographing device
JP5924259B2 (en)2012-12-282016-05-25株式会社ニデック Target presentation device
US9420945B2 (en)2013-03-142016-08-23Carl Zeiss Meditec, Inc.User interface for acquisition, display and analysis of ophthalmic diagnostic data
US10772497B2 (en)2014-09-122020-09-15Envision Diagnostics, Inc.Medical interfaces and other medical devices, systems, and methods for performing eye exams
US9226856B2 (en)2013-03-142016-01-05Envision Diagnostics, Inc.Inflatable medical interfaces and other medical devices, systems, and methods
DK2822448T3 (en)2013-05-292017-02-27Wavelight Gmbh APPARATUS FOR AN EYE OPTICAL COHENSE TOMOGRAPHY AND PROCEDURE FOR OPTICAL COHENSE TOMOGRAPHY OF AN EYE
US9784558B2 (en)2014-01-202017-10-10Apple Inc.Sensing of mirror position using fringing fields
CN107708524A (en)2015-01-262018-02-16威盛纳斯医疗系统公司Disposable separation sleeve for eye imaging devices and associated method
NZ773836A (en)2015-03-162022-07-01Magic Leap IncMethods and systems for diagnosing and treating health ailments
US20160307063A1 (en)2015-04-162016-10-20Synaptive Medical (Barbados) Inc.Dicom de-identification system and method
EP3349642B1 (en)2015-09-172020-10-21Envision Diagnostics, Inc.Medical interfaces and other medical devices, systems, and methods for performing eye exams
US9875541B2 (en)2016-01-142018-01-23Canon Kabushiki KaishaEnhanced algorithm for the detection of eye motion from fundus images
JP6660750B2 (en)2016-02-012020-03-11株式会社トプコン Ophthalmic examination system
US10453431B2 (en)2016-04-282019-10-22Ostendo Technologies, Inc.Integrated near-far light field display systems
EP3448234A4 (en)2016-04-302019-05-01Envision Diagnostics, Inc. MEDICAL DEVICES, SYSTEMS AND METHODS FOR OPERATING OCULAR EXAMINATIONS AND OCULOMETRY
WO2017190071A1 (en)2016-04-302017-11-02Envision Diagnostics, Inc.Medical devices, systems, and methods for performing eye exams using displays comprising mems scanning mirrors

Patent Citations (101)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US4154114A (en)1977-12-021979-05-15Sonometrics Systems, Inc.Biometric measuring device
US4237901A (en)1978-08-301980-12-09Picker CorporationLow and constant pressure transducer probe for ultrasonic diagnostic system
US4479931A (en)1982-09-231984-10-30The United States Of America As Represented By The United States Department Of EnergyMethod for non-invasive detection of ocular melanoma
US4764006A (en)1985-09-131988-08-16Canon Kabushiki KaishaOphthalmic measuring apparatus
US5056522A (en)1987-11-021991-10-15Canon Kabushiki KaishaSupersonic ophthalmic measuring apparatus
US4848340A (en)1988-02-101989-07-18Intelligent Surgical LasersEyetracker and method of use
US5005966A (en)1988-06-021991-04-09Handler Albert WExophthalmometer light, and methods of constructing and utilizing same
US4930512A (en)1988-06-161990-06-05Sonomed, Inc.Hand held spring-loaded ultrasonic probe
US5141302A (en)1990-05-311992-08-25Kabushiki Kaisha TopconIntraocular length measuring instrument
US6112114A (en)1991-12-162000-08-29Laser Diagnostic Technologies, Inc.Eye examination apparatus employing polarized light probe
US5369454A (en)1993-11-011994-11-29Cornell Research Foundation, Inc.System and method for producing and maintaining parallel and vertical fixation of visual axis
US5543866A (en)*1994-01-071996-08-06Jozef F. Van de VeldeScanning laser ophthalmoscope for binocular imaging and functional testing
US5557350A (en)1994-04-151996-09-17Nidek Co. Ltd.Ophthalmometric apparatus with alignment device including filter means
US5442412A (en)1994-04-251995-08-15Autonomous Technologies Corp.Patient responsive eye fixation target method and system
US5493109A (en)*1994-08-181996-02-20Carl Zeiss, Inc.Optical coherence tomography assisted ophthalmologic surgical microscope
EP0697611A2 (en)1994-08-181996-02-21Carl ZeissOptical coherence tomography assisted surgical apparatus
US5491524A (en)1994-10-051996-02-13Carl Zeiss, Inc.Optical coherence tomography corneal mapping apparatus
US5776068A (en)1997-06-191998-07-07Cornell Research FoundationUltrasonic scanning of the eye using a stationary transducer
US5914772A (en)1997-08-291999-06-22Eyelogic Inc.Method and device for testing eyes
US6086205A (en)1997-10-212000-07-11Medibell Medical Vision Technologies Ltd.Apparatus and method for simultaneous bilateral retinal digital angiography
US6687389B2 (en)1997-10-242004-02-03British Telecommunications Public Limited CompanyImaging apparatus
JPH11225958A (en)1998-02-181999-08-24Koonan:KkChin rest device for ophthalmologic instrument and ophthalmogic instrument
WO1999057507A1 (en)1998-05-011999-11-11Board Of Regents, The University Of Texas SystemMethod and apparatus for subsurface imaging
US20070024868A1 (en)1998-09-112007-02-01University Hospitals Of ClevelandInterferometers for optical coherence domain reflectometry...
US6820979B1 (en)1999-04-232004-11-23Neuroptics, Inc.Pupilometer with pupil irregularity detection, pupil tracking, and pupil response detection capability, glaucoma screening capability, intracranial pressure detection capability, and ocular aberration measurement capability
US6619799B1 (en)1999-07-022003-09-16E-Vision, LlcOptical lens system with electro-active lens having alterably different focal lengths
US6592223B1 (en)1999-10-072003-07-15Panaseca, Inc.System and method for optimal viewing of computer monitors to minimize eyestrain
US7237898B1 (en)1999-10-212007-07-03Bausch & Lomb IncorporatedCustomized corneal profiling
US20070055222A1 (en)1999-10-212007-03-08Kristian HohlaIris recognition and tracking for optical treatment
US6439720B1 (en)2000-01-272002-08-27Aoptics, Inc.Method and apparatus for measuring optical aberrations of the human eye
US20010025226A1 (en)2000-02-182001-09-27Lavery Kevin T.Medical screening apparatus and method
US6692436B1 (en)2000-04-142004-02-17Computerized Screening, Inc.Health care information system
US6450643B1 (en)2000-05-302002-09-17Ralph C. WilsonSystems and methods for performing an eye examination
US20020021411A1 (en)2000-05-302002-02-21Wilson Ralph C.Systems and methods for performing an eye examination
US20040196432A1 (en)2000-06-132004-10-07Wei SuDigital eye camera
US6293674B1 (en)2000-07-112001-09-25Carl Zeiss, Inc.Method and apparatus for diagnosing and monitoring eye disease
US7233312B2 (en)2000-07-312007-06-19Panaseca, Inc.System and method for optimal viewing of computer monitors to minimize eyestrain
US6656131B2 (en)2000-10-062003-12-02Notal Vision Inc.Method and system for detecting eye disease
US20020080329A1 (en)2000-12-272002-06-27Tatsuya KasaharaCorneal endothelium analysis system
US20020099305A1 (en)2000-12-282002-07-25Matsushita Electic Works, Ltd.Non-invasive brain function examination
US6609794B2 (en)2001-06-052003-08-26Adaptive Optics Associates, Inc.Method of treating the human eye with a wavefront sensor-based ophthalmic instrument
US20040019032A1 (en)2001-07-202004-01-29Janice NorthTreatment of macular edema
US20030065636A1 (en)2001-10-012003-04-03L'orealUse of artificial intelligence in providing beauty advice
US20050041200A1 (en)2001-11-072005-02-24Darren RichGonioscopy assembly
US6705726B2 (en)2002-02-202004-03-16Nidek Co., Ltd.Instrument for eye examination and method
US20030232015A1 (en)2002-02-282003-12-18Reay BrownDevice and method for monitoring aqueous flow within the eye
US6939298B2 (en)2002-02-282005-09-06Gmp Vision Solutions, IncDevice and method for monitoring aqueous flow within the eye
US20050140981A1 (en)2002-04-182005-06-30Rudolf WaeltiMeasurement of optical properties
US20040141152A1 (en)2002-09-272004-07-22Marino Joseph A.Apparatus and method for conducting vision screening
US20040260183A1 (en)2003-02-052004-12-23Lambert James L.Non-invasive in vivo measurement of macular carotenoids
US20040254154A1 (en)2003-05-072004-12-16Control Delivery Systems, Inc.Prediction of changes to visual acuity from assessment of macular edema
US7350921B2 (en)2003-06-232008-04-01Phillip V. RidingsiQueVision: animated / vision testing system
US20050001980A1 (en)2003-07-042005-01-06Spector Robert T.Method of and apparatus for diagnosing and treating amblyopic conditions in the human visual system
US7219996B2 (en)2003-09-252007-05-22Nidek Co., Ltd.Fundus camera
US7008116B2 (en)2003-10-312006-03-07Japan Aviation Electronics Industry,LimitedOptical connector adapter having an engaging portion which is rendered movable by the use of a slit
US20050105044A1 (en)2003-11-142005-05-19Laurence WardenLensometers and wavefront sensors and methods of measuring aberration
US20070177104A1 (en)2004-01-222007-08-02Francois LacombeEye examination device by means of tomography with a sighting device
US7618372B2 (en)2004-07-022009-11-17Dela Houssaye Arthur JosephLaser guided eye measuring device and method for using
US7614747B2 (en)2004-07-282009-11-10Solohealth, Inc.Automated vision screening apparatus and method
US20060109423A1 (en)2004-09-152006-05-25Jianhua WangTear dynamics measured with optical coherence tomography
US20060077347A1 (en)2004-09-222006-04-13Eastman Kodak CompanyFundus camera having scanned illumination and pupil tracking
US20060077348A1 (en)2004-10-052006-04-13University Of Pittsburgh - Of The CommonwealthMethod and apparatus for screening for retinopathy
US20060135859A1 (en)2004-10-222006-06-22Iliff Edwin CMatrix interface for medical diagnostic and treatment advice system and method
US20060092376A1 (en)2004-10-292006-05-04Seung-Ho BaekFundus imaging system
US20070073113A1 (en)2004-11-232007-03-29Squilla John RProviding medical services at a kiosk
US20060119858A1 (en)2004-12-022006-06-08Knighton Robert WEnhanced optical coherence tomography for anatomical mapping
US20060195076A1 (en)2005-01-102006-08-31Blumenkranz Mark SMethod and apparatus for patterned plasma-mediated laser trephination of the lens capsule and three dimensional phaco-segmentation
US20060158655A1 (en)2005-01-202006-07-20Everett Matthew JApparatus and method for combined optical-coherence-tomographic and confocal detection
EP1858402A1 (en)2005-01-212007-11-28Massachusetts Institute Of TechnologyMethods and apparatus for optical coherence tomography scanning
US20060187462A1 (en)2005-01-212006-08-24Vivek SrinivasanMethods and apparatus for optical coherence tomography scanning
CA2595324A1 (en)2005-01-212006-07-27Massachusetts Institute Of TechnologyMethods and apparatus for optical coherence tomography scanning
US20060257451A1 (en)2005-04-082006-11-16Varner Signe ESustained release implants and methods for subretinal delivery of bioactive agents to treat or prevent retinal disease
US20070081165A1 (en)2005-04-292007-04-12Onur KilicHigh-sensitivity fiber-compatible optical acoustic sensor
US7384146B2 (en)2005-06-282008-06-10Carestream Health, Inc.Health care kiosk having automated diagnostic eye examination and a fulfillment remedy based thereon
US7458685B2 (en)2005-08-032008-12-02Carestream Health, Inc.Automated fundus imaging system
US20070030450A1 (en)2005-08-032007-02-08Eastman Kodak CompanyAutomated fundus imaging system
US20070273831A1 (en)2005-08-032007-11-29Rongguang LiangAutomated fundus imaging system
US20070081166A1 (en)2005-09-292007-04-12Bioptigen, Inc.Portable Optical Coherence Tomography (OCT) Devices and Related Systems
EP1775545A2 (en)2005-10-122007-04-18Kabushiki Kaisha TOPCONOptical image measuring device, optical image measuring program, fundus observation device, and fundus observation program
DE102005058220A1 (en)2005-12-062007-06-14Carl Zeiss Meditec Ag Interferometric sample measurement
US20070153233A1 (en)2005-12-312007-07-05Campin John ADetermining optimal positioning of ophthalmic devices by use of image processing and autofocusing techniques
US7744221B2 (en)2006-01-192010-06-29Optovue, Inc.Method of eye examination by optical coherence tomography
US20070195269A1 (en)2006-01-192007-08-23Jay WeiMethod of eye examination by optical coherence tomography
US8002410B2 (en)2006-01-202011-08-23Clarity Medical Systems, Inc.User-proposed entry field(s) for customized data analysis/presentation
US7815310B2 (en)2006-01-202010-10-19Clarity Medical Systems, Inc.Adaptive sequential wavefront sensor and its applications
US8100530B2 (en)2006-01-202012-01-24Clarity Medical Systems, Inc.Optimizing vision correction procedures
US7445335B2 (en)2006-01-202008-11-04Clarity Medical Systems, Inc.Sequential wavefront sensor
US20070216909A1 (en)2006-03-162007-09-20Everett Matthew JMethods for mapping tissue with optical coherence tomography data
US20070291228A1 (en)2006-05-012007-12-20University Of Southern CaliforniaGaussian fitting on mean curvature maps of parameterization of corneal ectatic diseases
US20070287932A1 (en)2006-05-012007-12-13University Of Southern CaliforniaMapping and diagnosis of macular edema by optical coherence tomography
US20080007694A1 (en)2006-05-252008-01-10Jianping WeiMeasurement of lenses and lens molds using optical coherence tomography
US20070282313A1 (en)2006-06-012007-12-06University Of Southern CaliforniaMethod and apparatus to guide laser corneal surgery with optical measurement
EP1864608A1 (en)2006-06-092007-12-12Kabushiki Kaisha TOPCONA fundus observation device, an ophthalmologic image processing unit, an ophthalmologic image processing method
US20070291277A1 (en)2006-06-202007-12-20Everett Matthew JSpectral domain optical coherence tomography system
US20080106696A1 (en)2006-11-012008-05-08Bioptigen, Inc.Optical coherence imaging systems having a mechanism for shifting focus and scanning modality and related adapters
CA2678506A1 (en)2007-02-232008-08-28Mimo AgOphthalmologic apparatus for imaging an eye by optical coherence tomography
EP2124713A1 (en)2007-02-232009-12-02Mimo AGOphthalmologic apparatus for imaging an eye by optical coherence tomography
US20100110377A1 (en)2007-02-232010-05-06Peter MalocaOphthalmologic apparatus for imaging an eye by optical coherence tomography
US7549752B2 (en)2007-08-172009-06-23Peyman Gholam AMethod of detecting glaucoma
US20090141240A1 (en)2007-11-052009-06-04Oti Ophthalmic Technologies Inc.Method for Performing Micro-Perimetry and Visual Acuity Testing
US8079711B2 (en)2008-04-242011-12-20Carl Zeiss Meditec, Inc.Method for finding the lateral position of the fovea in an SDOCT image volume

Non-Patent Citations (46)

* Cited by examiner, † Cited by third party
Title
"STRATUS OCT(TM) Software version 4.0 Real Answers in Real Time." [Online] Jan. 2006, XP002530105 Retrieved from the Internet: URL: http://www.meditec.zeiss.com/88256DE3007B916B/0/C26634D0CFF04511882571B1005DECFD/$file/stratusoct-en.pdf>[retrieved on May 28, 2009] the whole document.
"STRATUS OCT™ Software version 4.0 Real Answers in Real Time." [Online] Jan. 2006, XP002530105 Retrieved from the Internet: URL: http://www.meditec.zeiss.com/88256DE3007B916B/0/C26634D0CFF04511882571B1005DECFD/$file/stratusoct—en.pdf>[retrieved on May 28, 2009] the whole document.
Bachmann, et al., Heterodyne Fourier domain optical coherence tomography for full range probing with high axial resolution; Optics Express; vol. 14; Issue No. 4; pp. 1487-1496, 2006.
Bigelow, et al., Compact multimodal adaptive-optics spectral-domain optical coherence tomography instrument for retinal imaging; J. Opt. Soc. Am. A; vol. 24; Issue No. 5; pp. 1327-1336, 2007.
Bu, et al., Full-range parallel Fourier-domain optical coherence tomography using sinusoidal phase-modulating interferometry; J. Opt. A: Pure Appl. Opt.; No. 9; pp. 422-426, 2007.
Burgansky-Eliash, et al., Optical Coherence Tomography Machine Learning Classifiers for Glaucoma Detection: A Preliminary Study, Investigative Ophthalmology & Visual Science; vol. 46; No. 11; pp. 4147-4152, 2005.
Chang, et al.; New developments in optical coherence tomography for glaucoma, Curr Opin Ophthalmol; No. 19; pp. 127-135; 2008.
Drexler, et al., State-of-the-art retinal optical coherence tomography; Progress in Retinal and Eye Research; vol. 27; Issue 1; pp. 45-88; 2008.
Fernandez, Delineating Fluid-Filled Region Boundaries in Optical Coherence Tomography Images of the Retina; IEEE Transactions on Medical Imaging; vol. 24; Issue No. 8; pp. 929-945, 2005.
Ghosn, et al., Nondestructive Quantification of Analyte Diffusion in Cornea and Sclera Using Optical Coherence Tomography; Investigative Ophthalmology & Visual Science; vol. 48, No. 6, pp. 2726-2733, 2007.
Guo et al., "En face optical coherence tomography" a new method to analyse structural changes of the optic nerve head in rat glaucoma, retrieved from the Internet: ,URL: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1772813/pdf/bjo08901210.pdf>, British Journal of Ophthalmology, 2005, vol. 89, Issue 9, pp. 1210-1216.
Huang, et al., Development and Comparison of Automated Classifiers for Glaucoma Diagnosis Using Stratus Optical Coherence Tomography; Investigative Ophthalmology & Visual Science; vol. 46; Issue No. 11; pp. 4121-4129, 2005.
International Preliminary Report on Patentability dated Jan. 18, 2011 for PCT Application No. PCT/US2009/051073 filed on Jul. 17, 2009.
International Preliminary Report on Patentability dated Oct. 11, 2011 for PCT Application No. PCT/US2009/059133 filed on Sep. 30, 2009.
International Preliminary Report on Patentability dated Jan. 18, 2011 for PCT Application No. PCT/US2009/051077 filed on Jul. 17, 2009.
International Preliminary Report on Patentability dated Sep. 28, 2010 for PCT Application No. PCT/US2009/037448 filed on Mar. 17, 2009.
International Preliminary Report on Patentability dated Sep. 28, 2010 for PCT Application No. PCT/US2009/037449 filed on Mar. 17, 2009.
International Search Report and Written Opinion Received in PCT/US2009/051077 Dated Oct. 13, 2009.
International Search Report and Written Opinion Received in PCT/US2009/037448 Dated Dec. 6, 2009.
International Search Report and Written Opinion Received in PCT/US2009/037449 Dated Aug. 27, 2009.
International Search Report and Written Opinion Received in PCT/US2009/051073 Dated Jan. 13, 2010.
International Search Report and Written Opinion Received in PCT/US2009/059133 Dated Jan. 21, 2010.
Keystone View; Computer Controlled Vision Screeners. http://www.keystoneview.com?p=cv&id=39, 2 pages, 2003.
Koizumi et al., "Three-Dimensional Evaluation of Vitreomacular Traction and Epiretinal Membrane Using Spectral-Domain Optical Coherence Tomography", American Journal of Ophthalmology, Ophthalmic Publ, Chicago, IL, US, vol. 145, No. 3, Jan. 11, 2008, pp. 509-517.e1.
Koozekanani, et al., Retinal Thickness Measurements from Optical Coherence Tomography Using a Markov Boundary Model. IEEE Transactions on Medical Imaging; vol. 20; No. 9; pp. 900-916, 2001.
Lavanya, et al., Screening for Narrow Angles in the Singapore Population: Evaluation of New Noncontact Screening Methods; vol. 115; Issue No. 10, pp. 1720-1727e2, 2008.
Manassakorn, et al., Comparison of Retinal Nerve Fiber Layer Thickness and Optic Disk Algorithms with Optical Coherence Tomography to Detect Glaucoma; Am J Ophthalmol; vol. 141; pp. 105-115; 2006.
Parikh, M.D., Diagnostic Capability of Optical Coherence Tomography (Stratus OCT 3) in Early Glaucoma; American Academy of Ophthalmology; pp. 2238-2243, 2007.
Prevent Blindness America. SureSight Vision Screener. Prevent Blindness Tri-State. http://wwww.preventblindness.org/tristate/suresight.html, 2 pages, 2006.
Sadda, Srinivas R., et al., Automated Detection of Clinically Significant Macular Edema by Grid Scanning Optical Coherence Tomography. American Academy of Ophthalmology, vol. 113, No. 7, pp. 1187 e.1-1187 e.12, 2006.
Sarunic et al., "New Imaging Device Can Detect Glaucoma Risk", Duke Medicine News and Communications, 2008.
Stein, et al., A new quality assessment parameter for optical coherence tomography; Br J Ophthalmol; Issue No. 90; pp. 186-190; 2005.
Stereo Optical Co., Inc. The Optec® 5500/5500 P-Industry Standard for Visual Screening and Vision Testing Devices. http://www.stereooptical.com/html/optec-5500.html, 3 pages, 2007.
Stereo Optical Co., Inc. The Optec® Functional Vision Analyzer™—Contrast Sensitivity Tests with Two Glare Levels Under Four Testing Conditions. http://www.stereooptical.com/html/functional—vision—analyzer.html, 3 pages, 2007.
Supplementary European Search Report for Application No. 090798839.8, dated Jan. 30, 2012.
Topcon Optical Coherence Tomography 3D OCT-1000 Brochure, 7 pages, 2008.
Topcon Optical Coherence Tomography 3D OCT-1000 Mark II Brochure, 12 pages.
U.S. Appl. No. 13/054,481, including its prosecution history, and the references cited and the Office Actions therein, Not Yet Published, Walsh, et al.
United States Patent and Trademark Office, Office Action of Aug. 15, 2012, in U.S. Appl. No. 13/054,481.
Vakhtin, et al, Common-path interferometer for frequency-domain optical coherence tomography; Applied Optics; vol. 42, Issue No. 34; pp. 6953-6958, 2003.
Vakhtin, et al., Demonstration of complex-conjugate-resolved harmonic Fourier-domain optical coherence tomography imaging of biological samples; Applied Optics; vol. 46; Issue No. 18; pp. 3870-3877, 2007.
Xu, et al., Anterior Chamber Depth and Chamber Angle and Their Associations with Ocular and General Parameters: The Beijing Eye Study. American Journal of Ophthalmology, vol. 145, pp. 929-936e1, 2008.
Yasuno, et al., One-shot-phase-shifting Fourier domain optical coherence tomography by reference wavefront tilting; Optics Express; vol. 12; Issue No. 25; pp. 6184-6191, 2004.
Zhang, et al., Full range polarization-sensitive Fourier domain optical coherence tomography; Optics Express; vol. 12; Issue No. 24; pp. 6033-6039, 2004.
Zhou et al., "Biometric measurement of the mouse eye using optical coherence tomography with focal plane advancement", Vision Research, 2008, vol. 48, pp. 1137-1143.

Cited By (114)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US11510567B2 (en)2008-03-272022-11-29Doheny Eye InstituteOptical coherence tomography-based ophthalmic testing methods, devices and systems
US10945597B2 (en)2008-03-272021-03-16Doheny Eye InstituteOptical coherence tomography-based ophthalmic testing methods, devices and systems
US11291364B2 (en)2008-03-272022-04-05Doheny Eye InstituteOptical coherence tomography device, method, and system
US12193742B2 (en)2008-03-272025-01-14Doheny Eye InstituteOptical coherence tomography-based ophthalmic testing methods, devices and systems
US12414688B2 (en)2008-03-272025-09-16Doheny Eye InstituteOptical coherence tomography-based ophthalmic testing methods, devices and systems
US9149182B2 (en)2008-03-272015-10-06Doheny Eye InstituteOptical coherence tomography device, method, and system
US11839430B2 (en)2008-03-272023-12-12Doheny Eye InstituteOptical coherence tomography-based ophthalmic testing methods, devices and systems
US10165941B2 (en)2008-03-272019-01-01Doheny Eye InstituteOptical coherence tomography-based ophthalmic testing methods, devices and systems
US9492079B2 (en)2008-07-182016-11-15Doheny Eye InstituteOptical coherence tomography-based ophthalmic testing methods, devices and systems
US8820931B2 (en)2008-07-182014-09-02Doheny Eye InstituteOptical coherence tomography-based ophthalmic testing methods, devices and systems
US9167965B2 (en)*2010-10-152015-10-27Universidad De MurciaInstrument for rapid measurement of the optical properties of the eye in the entire field of vision
US20130265544A1 (en)*2010-10-152013-10-10Universidad De MurciaInstrument for rapid measurement of the optical properties of the eye in the entire field of vision
US20130173750A1 (en)*2011-12-302013-07-04Matthew CarnevaleRemote exam viewing system
US8639779B2 (en)*2011-12-302014-01-28Matthew CarnevaleRemote exam viewing system
US9961326B2 (en)*2012-01-092018-05-01Kla-Tencor CorporationStereo extended depth of focus
US20130176402A1 (en)*2012-01-092013-07-11Kla-Tencor CorporationStereo Extended Depth of Focus
US9226856B2 (en)2013-03-142016-01-05Envision Diagnostics, Inc.Inflatable medical interfaces and other medical devices, systems, and methods
US11559198B2 (en)2013-03-142023-01-24Envision Diagnostics, Inc.Medical interfaces and other medical devices, systems, and methods for performing eye exams
US10631725B2 (en)2013-03-142020-04-28Envision Diagnostics, Inc.Inflatable medical interfaces and other medical devices, systems, and methods
US20140313477A1 (en)*2013-03-152014-10-23Amo Wavefront Sciences, Llc.Angular multiplexed optical coherence tomography systems and methods
US9486137B2 (en)2013-03-152016-11-08Amo Wavefront Sciences, LlcAngular multiplexed optical coherence tomography systems and methods
US10058244B2 (en)2013-03-152018-08-28Amo Wavefront Sciences, LlcAngular multiplexed optical coherence tomography systems and methods
US10702146B2 (en)2013-03-152020-07-07Amo Development, LlcAngular multiplexed optical coherence tomography systems and methods
US9198573B2 (en)*2013-03-152015-12-01Amo Wavefront Sciences, LlcAngular multiplexed optical coherence tomography systems and methods
US20150160726A1 (en)*2013-03-182015-06-11Mirametrix Inc.System and Method for On-Axis Eye Gaze Tracking
US9733703B2 (en)*2013-03-182017-08-15Mirametrix Inc.System and method for on-axis eye gaze tracking
US10149610B2 (en)2014-04-252018-12-11Carl Zeiss Meditec, Inc.Methods and systems for automatic detection and classification of ocular inflammation
US10772497B2 (en)2014-09-122020-09-15Envision Diagnostics, Inc.Medical interfaces and other medical devices, systems, and methods for performing eye exams
US20170245755A1 (en)*2014-09-192017-08-31Carl Zeiss Meditec AgSystem for optical coherence tomography, comprising a zoomable kepler system
US10092179B2 (en)*2014-09-192018-10-09Carl Zeiss Meditec AgSystem for optical coherence tomography, comprising a zoomable kepler system
US10459229B2 (en)2015-03-162019-10-29Magic Leap, Inc.Methods and systems for performing two-photon microscopy
US11256096B2 (en)2015-03-162022-02-22Magic Leap, Inc.Methods and systems for diagnosing and treating presbyopia
US10371947B2 (en)2015-03-162019-08-06Magic Leap, Inc.Methods and systems for modifying eye convergence for diagnosing and treating conditions including strabismus and/or amblyopia
US10379353B2 (en)2015-03-162019-08-13Magic Leap, Inc.Augmented and virtual reality display systems and methods for diagnosing health conditions based on visual fields
US10379351B2 (en)2015-03-162019-08-13Magic Leap, Inc.Methods and systems for diagnosing and treating eyes using light therapy
US10379350B2 (en)2015-03-162019-08-13Magic Leap, Inc.Methods and systems for diagnosing eyes using ultrasound
US10379354B2 (en)2015-03-162019-08-13Magic Leap, Inc.Methods and systems for diagnosing contrast sensitivity
US10386639B2 (en)2015-03-162019-08-20Magic Leap, Inc.Methods and systems for diagnosing eye conditions such as red reflex using light reflected from the eyes
US11747627B2 (en)2015-03-162023-09-05Magic Leap, Inc.Augmented and virtual reality display systems and methods for diagnosing health conditions based on visual fields
US10386640B2 (en)2015-03-162019-08-20Magic Leap, Inc.Methods and systems for determining intraocular pressure
US10386641B2 (en)2015-03-162019-08-20Magic Leap, Inc.Methods and systems for providing augmented reality content for treatment of macular degeneration
US10437062B2 (en)2015-03-162019-10-08Magic Leap, Inc.Augmented and virtual reality display platforms and methods for delivering health treatments to a user
US10444504B2 (en)2015-03-162019-10-15Magic Leap, Inc.Methods and systems for performing optical coherence tomography
US10451877B2 (en)2015-03-162019-10-22Magic Leap, Inc.Methods and systems for diagnosing and treating presbyopia
US20170000342A1 (en)2015-03-162017-01-05Magic Leap, Inc.Methods and systems for detecting health conditions by imaging portions of the eye, including the fundus
US10345591B2 (en)2015-03-162019-07-09Magic Leap, Inc.Methods and systems for performing retinoscopy
US10466477B2 (en)2015-03-162019-11-05Magic Leap, Inc.Methods and systems for providing wavefront corrections for treating conditions including myopia, hyperopia, and/or astigmatism
US10473934B2 (en)2015-03-162019-11-12Magic Leap, Inc.Methods and systems for performing slit lamp examination
US10527850B2 (en)2015-03-162020-01-07Magic Leap, Inc.Augmented and virtual reality display systems and methods for determining optical prescriptions by imaging retina
US10539795B2 (en)2015-03-162020-01-21Magic Leap, Inc.Methods and systems for diagnosing and treating eyes using laser therapy
US10539794B2 (en)2015-03-162020-01-21Magic Leap, Inc.Methods and systems for detecting health conditions by imaging portions of the eye, including the fundus
US10545341B2 (en)2015-03-162020-01-28Magic Leap, Inc.Methods and systems for diagnosing eye conditions, including macular degeneration
US10564423B2 (en)2015-03-162020-02-18Magic Leap, Inc.Augmented and virtual reality display systems and methods for delivery of medication to eyes
US10371948B2 (en)2015-03-162019-08-06Magic Leap, Inc.Methods and systems for diagnosing color blindness
US11474359B2 (en)2015-03-162022-10-18Magic Leap, Inc.Augmented and virtual reality display systems and methods for diagnosing health conditions based on visual fields
US10371949B2 (en)2015-03-162019-08-06Magic Leap, Inc.Methods and systems for performing confocal microscopy
US10371946B2 (en)2015-03-162019-08-06Magic Leap, Inc.Methods and systems for diagnosing binocular vision conditions
US10345592B2 (en)2015-03-162019-07-09Magic Leap, Inc.Augmented and virtual reality display systems and methods for diagnosing a user using electrical potentials
US10365488B2 (en)2015-03-162019-07-30Magic Leap, Inc.Methods and systems for diagnosing eyes using aberrometer
US10775628B2 (en)2015-03-162020-09-15Magic Leap, Inc.Methods and systems for diagnosing and treating presbyopia
US10359631B2 (en)2015-03-162019-07-23Magic Leap, Inc.Augmented reality display systems and methods for re-rendering the world
US10788675B2 (en)2015-03-162020-09-29Magic Leap, Inc.Methods and systems for diagnosing and treating eyes using light therapy
US10345590B2 (en)2015-03-162019-07-09Magic Leap, Inc.Augmented and virtual reality display systems and methods for determining optical prescriptions
US10345593B2 (en)2015-03-162019-07-09Magic Leap, Inc.Methods and systems for providing augmented reality content for treating color blindness
US10371945B2 (en)2015-03-162019-08-06Magic Leap, Inc.Methods and systems for diagnosing and treating higher order refractive aberrations of an eye
US11156835B2 (en)2015-03-162021-10-26Magic Leap, Inc.Methods and systems for diagnosing and treating health ailments
US10969588B2 (en)2015-03-162021-04-06Magic Leap, Inc.Methods and systems for diagnosing contrast sensitivity
US10983351B2 (en)2015-03-162021-04-20Magic Leap, Inc.Augmented and virtual reality display systems and methods for diagnosing health conditions based on visual fields
US12345892B2 (en)2015-03-162025-07-01Magic Leap, Inc.Augmented and virtual reality display systems and methods for diagnosing health conditions based on visual fields
US11672421B2 (en)2015-08-122023-06-13Carl Zeiss Meditec, Inc.Alignment improvements for ophthalmic diagnostic systems
US10849498B2 (en)2015-08-122020-12-01Carl Zeiss Meditec, Inc.Alignment improvements for ophthalmic diagnostic systems
US12201365B2 (en)2015-08-122025-01-21Carl Zeiss Meditec, Inc.Alignment improvements for ophthalmic diagnostic systems
US11039741B2 (en)2015-09-172021-06-22Envision Diagnostics, Inc.Medical interfaces and other medical devices, systems, and methods for performing eye exams
US11106041B2 (en)2016-04-082021-08-31Magic Leap, Inc.Augmented reality systems and methods with variable focus lens elements
US11614626B2 (en)2016-04-082023-03-28Magic Leap, Inc.Augmented reality systems and methods with variable focus lens elements
US10459231B2 (en)2016-04-082019-10-29Magic Leap, Inc.Augmented reality systems and methods with variable focus lens elements
US11717153B2 (en)2016-04-302023-08-08Envision Diagnostics, Inc.Medical devices, systems, and methods for performing eye exams and eye tracking
US12279820B2 (en)2016-04-302025-04-22Envision Diagnostics, Inc.Medical devices, systems, and methods for performing eye exams and eye tracking
US11890053B2 (en)2016-12-212024-02-06Acucela Inc.Miniaturized mobile, low cost optical coherence tomography system for home based ophthalmic applications
US10610096B2 (en)2016-12-212020-04-07Acucela Inc.Miniaturized mobile, low cost optical coherence tomography system for home based ophthalmic applications
US12396639B2 (en)2016-12-212025-08-26Acucela Inc.Miniaturized mobile, low cost optical coherence tomography system for home based ophthalmic applications
US10952607B2 (en)2016-12-212021-03-23Acucela Inc.Miniaturized mobile, low cost optical coherence tomography system for home based ophthalmic applications
US11627874B2 (en)2016-12-212023-04-18Acucela Inc.Miniaturized mobile, low cost optical coherence tomography system for home based ophthalmic applications
US11300844B2 (en)2017-02-232022-04-12Magic Leap, Inc.Display system with variable power reflector
US11774823B2 (en)2017-02-232023-10-03Magic Leap, Inc.Display system with variable power reflector
US10962855B2 (en)2017-02-232021-03-30Magic Leap, Inc.Display system with variable power reflector
US11042622B2 (en)2017-05-082021-06-22International Business Machines CorporationAuthenticating users and improving virtual reality experiences via ocular scans and pupillometry
US10241576B2 (en)2017-05-082019-03-26International Business Machines CorporationAuthenticating users and improving virtual reality experiences via ocular scans and pupillometry
US10386923B2 (en)*2017-05-082019-08-20International Business Machines CorporationAuthenticating users and improving virtual reality experiences via ocular scans and pupillometry
US10653314B2 (en)2017-11-072020-05-19Notal Vision Ltd.Methods and systems for alignment of ophthalmic imaging devices
US11058299B2 (en)2017-11-072021-07-13Notal Vision Ltd.Retinal imaging device and related methods
US11389061B2 (en)2017-11-072022-07-19Notal Vision, Ltd.Methods and systems for alignment of ophthalmic imaging devices
US11723536B2 (en)2017-11-072023-08-15Notal Vision, Ltd.Methods and systems for alignment of ophthalmic imaging devices
US12137977B2 (en)2017-11-072024-11-12Notal Vision, Ltd.Retinal imaging device and related methods
US12290317B2 (en)2018-06-202025-05-06Acucela Inc.Miniaturized mobile, low cost optical coherence tomography system for home based ophthalmic applications
US11576572B2 (en)2018-06-202023-02-14Acucela Inc.Miniaturized mobile, low cost optical coherence tomography system for home based ophthalmic applications
US11896308B2 (en)2018-06-202024-02-13Acucela Inc.Miniaturized mobile, low cost optical coherence tomography system for home based ophthalmic applications
US11357401B2 (en)2018-06-202022-06-14Acucela Inc.Miniaturized mobile, low cost optical coherence tomography system for home based ophthalmic applications
US11464408B2 (en)2018-10-032022-10-11Notal Vision Ltd.Automatic optical path adjustment in home OCT
US11986241B2 (en)2018-10-032024-05-21Notal Vision, Ltd.Automatic optical path adjustment in home OCT
US10595722B1 (en)*2018-10-032020-03-24Notal Vision Ltd.Automatic optical path adjustment in home OCT
US11564564B2 (en)2019-06-122023-01-31Notal Vision, Ltd.Home OCT with automatic focus adjustment
US10653311B1 (en)2019-06-122020-05-19Notal Vision Ltd.Home OCT with automatic focus adjustment
US12201360B2 (en)2019-06-122025-01-21Notal Vision, Ltd.Home OCT with automatic focus adjustment
US11730363B2 (en)2019-12-262023-08-22Acucela Inc.Optical coherence tomography patient alignment system for home based ophthalmic applications
US11684254B2 (en)2020-08-042023-06-27Acucela Inc.Scan pattern and signal processing for optical coherence tomography
US12232810B2 (en)2020-08-042025-02-25Acucela Inc.Scan pattern and signal processing for optical coherence tomography
US11974807B2 (en)2020-08-142024-05-07Acucela Inc.System and method for optical coherence tomography a-scan decurving
US11393094B2 (en)2020-09-112022-07-19Acucela Inc.Artificial intelligence for evaluation of optical coherence tomography images
US11798164B2 (en)2020-09-112023-10-24Acucela Inc.Artificial intelligence for evaluation of optical coherence tomography images
US11620749B2 (en)2020-09-112023-04-04Acucela Inc.Artificial intelligence for evaluation of optical coherence tomography images
US11911105B2 (en)2020-09-302024-02-27Acucela Inc.Myopia prediction, diagnosis, planning, and monitoring device
US11497396B2 (en)2021-03-242022-11-15Acucela Inc.Axial length measurement monitor
US11779206B2 (en)2021-03-242023-10-10Acucela Inc.Axial length measurement monitor

Also Published As

Publication numberPublication date
EP2271249B1 (en)2016-03-16
EP3053513A1 (en)2016-08-10
US20130201449A1 (en)2013-08-08
CN102046067B (en)2017-10-31
CN107692960A (en)2018-02-16
EP4386775A2 (en)2024-06-19
JP2011515194A (en)2011-05-19
WO2009120544A1 (en)2009-10-01
CN102046067A (en)2011-05-04
CN120052807A (en)2025-05-30
CN107692960B (en)2021-01-29
US9149182B2 (en)2015-10-06
US20090244485A1 (en)2009-10-01
JP2014094313A (en)2014-05-22
EP4386775A3 (en)2024-09-25
EP2271249B2 (en)2019-05-22
EP3053513B1 (en)2024-03-27
JP5469658B2 (en)2014-04-16
CN112869696A (en)2021-06-01
CN112869696B (en)2025-03-18
EP2271249A1 (en)2011-01-12
US20170049318A1 (en)2017-02-23
JP6040177B2 (en)2016-12-07
US11291364B2 (en)2022-04-05
US20150138503A1 (en)2015-05-21

Similar Documents

PublicationPublication DateTitle
US11291364B2 (en)Optical coherence tomography device, method, and system
US11510567B2 (en)Optical coherence tomography-based ophthalmic testing methods, devices and systems
US10945597B2 (en)Optical coherence tomography-based ophthalmic testing methods, devices and systems
WO2009120543A1 (en)Optical coherence tomography device, method, and system
WO2010117386A1 (en)Ophthalmic testing methods, devices and systems

Legal Events

DateCodeTitleDescription
ASAssignment

Owner name:DOHENY EYE INSTITUTE, CALIFORNIA

Free format text:ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WALSH, ALEXANDER C.;UPDIKE, PAUL G.;SADDA, SRINIVAS R.;REEL/FRAME:021126/0911

Effective date:20080612

STCFInformation on status: patent grant

Free format text:PATENTED CASE

FPAYFee payment

Year of fee payment:4

MAFPMaintenance fee payment

Free format text:PAYMENT OF MAINTENANCE FEE, 8TH YR, SMALL ENTITY (ORIGINAL EVENT CODE: M2552); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

Year of fee payment:8

MAFPMaintenance fee payment

Free format text:PAYMENT OF MAINTENANCE FEE, 12TH YR, SMALL ENTITY (ORIGINAL EVENT CODE: M2553); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

Year of fee payment:12

