US9867533B2 - Systems and methods for determining an angle of repose of an asymmetric lens - Google Patents

Systems and methods for determining an angle of repose of an asymmetric lens

Info

Publication number
US9867533B2
US9867533B2 (application US 15/070,254)
Authority
US
United States
Prior art keywords
locations
image
captured image
lens
contact lens
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US15/070,254
Other versions
US20160287067A1 (en)
Inventor
Chi Shing Fan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
CooperVision International Ltd
Original Assignee
CooperVision International Holding Co LP
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by CooperVision International Holding Co LP
Priority to US 15/070,254
Publication of US20160287067A1
Assigned to COOPERVISION HONG KONG LIMITED (Assignor: FAN, Chi Shing)
Assigned to COOPERVISION INTERNATIONAL HOLDING COMPANY, LP (Assignor: COOPERVISION HONG KONG LIMITED)
Application granted
Publication of US9867533B2
Assigned to COOPERVISION INTERNATIONAL LIMITED (Assignor: COOPERVISION INTERNATIONAL HOLDING COMPANY, LP)
Status: Active
Adjusted expiration

Abstract

Technology to determine the angle of repose of a lens, such as the angle of repose of a contact lens on an eye, includes storing an image including the lens captured using a camera and executing graphical user interface logic. The graphical user interface logic includes a frame or frames including the captured image, and graphical constructs prompting input identifying a set of locations on the captured image that are usable to define a location of the lens in the image, and a location usable to define an angle of repose of the lens in the image. The input data accepted via the graphical user interface is processed to determine the angle of repose of the lens. The angle of repose of the lens is used to produce a specification for a lens.

Description

RELATED APPLICATIONS
This application claims the benefit of U.S. Provisional Patent Application No. 62/142,006, filed on 2 Apr. 2015.
BACKGROUND
The present technology relates to tools for measuring and analyzing the orientation of a toric lens, or other lens not radially symmetric, disposed on a patient's eye or other element on which the lens is disposed, and for modifying specifications for the lens based on the measurement.
A problem arises when fitting a contact lens for a patient when the lens is not radially symmetric. In particular, a lens that is not radially symmetric, such as a toric lens, is designed to be oriented at a specific angle relative to the eye. However, when placed on the eye, the lens may rotate to an angle of repose on the eye that varies from eye to eye. If the angle of repose is sufficiently different than the specified angle for the lens, then the lens will be ineffective or uncomfortable for the wearer.
One of the steps in fitting asymmetric lenses is to determine whether the lens rotates on the eye and the amount of rotation. Eye care professionals (ECPs) have used an expensive special instrument known as a slit lamp to determine the angle of repose of the lens on a patient's eye. Toric lenses, for example, typically have an index mark on the lens outside the optic region that can be viewed using the slit lamp, as an indication of the angle of repose. The ECP can estimate the angle of repose based on the location of the index mark, and make adjustments to the specification for the lens. A slit lamp, however, is expensive and requires specialized training. It is desirable, therefore, to provide an easy-to-use technology to measure and analyze the orientation of lenses on eyes.
SUMMARY
Technology described herein includes systems and methods for computing the angle of repose for a lens, and for modifying a specification of a lens, such as an asymmetric lens, using the angle.
A method for operating a device or devices is described to determine the angle of repose of a lens, such as the angle of repose of a contact lens on an eye. The device or devices include a display, a data processor and a user input device such as a touchscreen overlying the display, such as commonly found on so-called “smart” phones. The method of operating includes storing an image including the lens captured using a camera and executing graphical user interface logic using the display. Graphical user interface logic includes a frame or frames including the captured image, and graphical constructs prompting input identifying a set of locations on the captured image that are usable to define a location of the lens in the image, and a location usable to define an angle of repose of the lens in the image. The graphical user interface logic accepts input data responsive to the graphical constructs that is interpreted as identifying locations on the captured image.
The device or devices can include a camera with an orientation sensor. The method can include displaying an orientation graphic on the display using the orientation sensor. The orientation graphic can include indicators of an orientation of a camera lens, or other indicators usable in composition of an image to be captured by the camera for use in the process. For example, the orientation graphic can include a graphical construct that indicates orientation of the camera lens around a line of sight axis normal to the camera lens, around a horizontal axis and around a vertical axis.
The method can include processing the input data accepted via the graphical user interface to determine an angle of repose of the lens. Also, the method can include using the angle of repose of the lens to produce a specification for a lens, such as a contact lens prescription, based on the determined angle of repose.
Technology described herein also includes a device configured to execute processes as described above. Also, technology described herein includes a computer program product comprising a non-transitory computer readable medium storing instructions executable by a data processor that comprises logic to execute the processes described above.
Other aspects and advantages of the technology described herein can be seen on review of the drawings, the detailed description and the claims, which follow.
BRIEF DESCRIPTION OF THE DRAWINGS
The included drawings are for illustrative purposes and serve only to provide examples of possible structures and process operations for one or more implementations of this disclosure. These drawings in no way limit any changes in form and detail that may be made by one skilled in the art without departing from the spirit and scope of this disclosure. A more complete understanding of the subject matter may be derived by referring to the detailed description and claims when considered in conjunction with the following figures, wherein like reference numbers refer to similar elements throughout the figures.
FIG. 1 illustrates one implementation of a device configured for determining an angle of repose of an asymmetric lens (such as a toric contact lens), and for modifying a specification for the lens.
FIG. 2A is a block diagram for one implementation of a device like that of FIG. 1.
FIG. 2B displays alignment calculation example concepts for a device like that of FIG. 1.
FIG. 3A is an example frame of a graphical user interface for a toric lens prescription device.
FIG. 3B is an example frame of a graphical user interface for the top control level of a toric lens prescription device.
FIG. 3C is an example frame of a graphical user interface for the record information for a toric lens prescription device.
FIG. 3D is an example frame of a graphical user interface for the spectacle Rx input for a toric lens prescription device.
FIG. 3E shows an example frame showing an angle of repose display and GUI ‘save’ controls for a toric lens prescription device.
FIG. 4A shows an example frame of a graphical user interface including an orientation graphic.
FIG. 4B shows an example frame of a graphical user interface including an orientation graphic when the camera is positioned at an oblique angle relative to the patient's eye.
FIG. 4C shows an example portion of a frame of a graphical user interface including an orientation graphic as the camera is positioned at another oblique angle relative to the patient's eye.
FIG. 4D shows an example portion of a frame of a graphical user interface including an orientation graphic when the camera is positioned slightly misaligned relative to the patient's eye.
FIG. 5A shows an example portion of a frame of a graphical user interface including a view of a captured image.
FIG. 5B shows an example portion of a frame of a graphical user interface including a view of a captured image, including a zoom feature.
FIG. 6 shows an example frame of a graphical user interface including graphical constructs for prompting input of data and accepting input via a touch screen.
FIG. 7 shows an example frame of a graphical user interface including graphical constructs for prompting input of data and accepting input via a touch screen, with a zoom feature for the captured image.
FIG. 8 shows an example frame of a graphical user interface including graphical constructs for prompting input of data and accepting input via a touch screen, including a construct for prompting input of a location on the captured image to indicate a location of an index mark on the lens.
FIG. 9 shows an example frame of a graphical user interface including controls for left and right eyes, for axis compensation for a toric lens prescription device.
FIG. 10A shows an example frame of a graphical user interface including an example converted Rx for a toric lens prescription.
FIG. 10B shows an example frame of a graphical user interface including a graphical element for a completed ‘save’ function for a toric lens prescription.
FIG. 11 shows a lookup table example for converting components of a spectacle prescription to a toric lens prescription.
FIG. 12 is a simplified flow diagram illustrating graphical user interface logic for the top level functions for modifying a lens prescription for an eye measurement device.
FIG. 13A is a simplified flow diagram illustrating graphical user interface logic for processing a new record for a new patient for an eye measurement device.
FIG. 13B is a simplified flow diagram illustrating graphical user interface logic for processing a prescription for an eye measurement device.
FIG. 13C is a simplified flow diagram illustrating graphical user interface logic for marking locations for an eye measurement device.
FIG. 14 is a simplified flow diagram illustrating graphical user interface logic showing the basic steps for listing and searching records, and for editing records for an eye measurement device.
FIG. 15 is a simplified flow chart for editing a record, and for angle calculation and lens specification processes.
FIG. 16 is a simplified flowchart showing a method of operating a device or devices for measuring an angle of repose and updating a lens specification.
DETAILED DESCRIPTION
The following detailed description is made with reference to the figures. Sample implementations are described to illustrate the technology disclosed, not to limit its scope, which is defined by the claims. Those of ordinary skill in the art will recognize a variety of equivalent variations on the description that follows.
The generation of the angle of repose for a lens on an eye, and methods for modifying a specification of a contact lens prescription using the angle, will be discussed primarily with reference to FIGS. 1-16.
FIG. 1 illustrates mobile phone 100 having a camera 142, a processor 145, a user input device 165 such as a touch screen, a display 152, an orientation sensor 144, and a data store 125. The mobile phone 100 is configured with instructions on a non-transitory computer readable medium executable by the processor 145 to provide a graphical user interface, including logic supporting the determination of the angle of repose of a lens, which in some embodiments is combined with logic for modification of a specification of a lens using the determined angle. A mobile phone 100 is one example of a device that can be used with the present technology. Of course, one or more devices that include, or have access to, the necessary basic components can be utilized. For example, the camera, orientation sensor and the processor which executes portions of the graphical user interface can be implemented on more than one chassis, with data transfer capability by which, for example, the processor can execute logic to store an image captured by the camera.
FIG. 2A is a simplified block diagram of components of a mobile phone 200, representative of a device which can be implemented on a single chassis, including tools described herein for measuring the angle of repose for a lens on an eye, which in some embodiments is combined with logic for modification of a specification of a lens using the determined angle.
The mobile phone 200 in this example includes a camera 214, a display 206 with a touch screen 211 overlying the display, and a codec 204, including analog-to-digital and digital-to-analog converters, coupled to the processor 203, which can execute instructions stored in memory on the device or accessible using the device. A view finder image is displayed on the display with the touch screen. Read-only memory 207 stores instructions, parameters and other data for execution by the processor 203. In addition, a read/write memory 208 in the mobile phone stores instructions, parameters, patient records, spectacle and contact lens prescriptions and other data for use by the processor 203. Also, the memory 208 in this example stores GUI logic, orientation processing logic, angle determination logic, specification update logic, and so on, as described in more detail herein. There may be multiple types of read/write memory on the phone 200, such as nonvolatile read/write memory 208 (flash memory or EEPROM, for example) and volatile read/write memory 209 (DRAM or SRAM, for example). Other embodiments include removable memory modules in which instructions, parameters and other data for use by the processor 203 are stored. The mobile phone 200 includes a microphone/speaker 205, and a radio including an RF receiver and transmitter 202 and an antenna 201, which supports audio and data communications.
An input/output controller 210 is coupled to user input devices including a touch screen 211 overlying the display 206, and to other user input devices 212, such as a numerical keypad, a volume control and a function keypad. The controller 210 in this example is also connected to, and provides control functions for, an accessory port (or ports) 213 and an orientation sensor 215. In one embodiment, the orientation sensor is a three-axis accelerometer such as is commonly used on mobile phones.
The accessory port or ports 213 are used for other types of input/output devices, such as connections to processing devices such as PDAs or personal computers, and alternative communication channels such as an infrared port, a Bluetooth network interface, a Universal Serial Bus (USB) port, a portable storage device port, and other things. The controller 210 is coupled to the processor 203.
User input concerning patient records, including spectacle Rx and lens model information, is received via the input devices 212, the radio and/or other accessories. User interaction is enhanced, and the user is prompted to interact, using the touch screen 211 and display 206, and optionally other accessories.
In other implementations, mobile phone 200 may not have the same elements or components as those listed above and/or may have other/different elements or components instead of, or in addition to, those listed above, such as a web browser user interface, custom plugin engine, etc. The different elements or components can be combined into single software modules, and multiple software modules can run on the same hardware. Also, the present technology can be implemented on other types of devices.
The orientation sensor can include a three-axis accelerometer that can be used to indicate the orientation of the lens of the camera on the device. FIG. 2B illustrates one example of the location of a camera lens 232 on a side of a mobile phone 100 opposite the display. This location is more typically in the upper left corner of the mobile phone chassis, but is placed in the center in this drawing. The lens has a line of sight axis generally along the nominal surface normal of the lens into the field of view of the camera, which is labeled the "z" axis in this diagram. Also, the orientation of the lens can be specified by a horizontal axis, labeled the "x" axis in this diagram, and a vertical axis, labeled the "y" axis in this diagram. The three axes intersect at right angles, as indicated by the graph 233 at the center of the lens, for example. The rotation of the camera about the line of sight axis is commonly referred to as roll. The rotation of the camera about the horizontal axis is commonly referred to as pitch, and the rotation of the camera around the vertical axis is commonly referred to as yaw. For aligning the camera 214 of the disclosed device, the processor 203 utilizes inputs from the orientation sensor 215, generating an alignment graphic that includes indicators of orientation of the camera around the line of sight axis (roll), around the horizontal axis (pitch), and around the vertical axis (yaw). In one implementation, processor 203 uses the Core Animation three-dimensional transform API, calling CATransform3DMakeRotation and CATransform3DRotate to generate a graphical construct that reflects the orientation of the camera as it is positioned for taking an image of the lens to be analyzed. Alternative methods for orienting a camera could be used.
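For illustration, the roll and pitch indicators described above can be derived from the gravity vector reported by a three-axis accelerometer. The sketch below is in Python rather than the Core Animation API named above; the function name and the axis convention (device lying flat, z out of the screen) are assumptions, and yaw about the gravity direction is not observable from the accelerometer alone (it requires a gyroscope or magnetometer).

```python
import math

def pitch_roll_from_gravity(ax, ay, az):
    """Estimate pitch and roll, in degrees, from a 3-axis accelerometer
    reading of the gravity vector. Convention (an assumption): with the
    device flat and the z axis pointing up, both angles are zero."""
    pitch = math.degrees(math.atan2(ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll
```

A display loop would feed these values into the alignment graphic as the camera moves, updating the movable feature in real time.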
FIG. 3A illustrates a frame produced by graphical user interface (GUI) logic on the display 206. The frame includes a field 286 including a number of graphical constructs. The field 286 in this example includes a task bar 272 having system icons 282, used to select and invoke system operations. Also, the graphical constructs in field 286 can include a plurality of application icons 262 that, when selected by user input, can invoke corresponding applications on the device. In this example, the graphical constructs in field 286 include an eye icon 252 which, when selected, invokes logic to execute a graphical user interface as described herein.
Touching eye icon 252 can cause a program including graphical user interface logic stored in mobile phone 250 to be accessed, and the graphical user interface logic to be executed. The graphical user interface logic can include interactive logic that displays frames on the display, accepts input prompted by the graphical constructs on the frames, processes the input, and then proceeds to a next frame or a next process.
In one example, the graphical user interface logic includes a frame having the screen image 352 shown in FIG. 3B. Screen image 352 includes application options such as record listing and search icon 361, new case icon 362, contact us icons 372, 382, terms of service icons 371, 383, and a construct 381 that includes a graphical home icon. An input generated by a touch on the touch screen over the new case icon 362 causes the graphical user interface logic to display the record information frame shown in FIG. 3C. The record information frame in this example includes a task bar including a construct 315 for a record information task, a construct 316 for a spectacle prescription task, a construct 317 for a lens model task, a construct 318 for an axis compensation task and a construct 319 for a suggested prescription task. As illustrated in FIG. 3C, selection of the Record Information task causes display of constructs that include text entry fields with labels that prompt entry of a patient ID in the record/patient ID field 326, and a date in the field 336. As shown in FIG. 3D, selection of the Spectacle Rx task results in display of constructs that prompt user selection of various elements of a patient's spectacle prescription, including sphere and cylinder 366, axis, and vertex distance 376.
Selection of the lens model construct 317 activates a list of lens brands and models, with a selector for choosing a model. Selection of the axis compensation construct 318 activates the angle of repose calculation processes, and displays the angle rotation results. Selection of the suggested Rx construct 319 activates the prescription conversion process for converting the spectacle prescription plus the angle of repose into a contact lens prescription, described below.
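The axis portion of the conversion step can be sketched in Python. A common clinical convention for toric lens fitting is the LARS rule ("left add, right subtract"): rotation of the index mark toward the examiner's left is added to the spectacle cylinder axis, and rotation to the right is subtracted. The patent does not spell out the conversion formula, so this is an illustrative assumption, and the function name is hypothetical.

```python
def compensate_axis(spectacle_axis_deg, rotation_deg_left_positive):
    """Apply the LARS rule (an assumed convention): add leftward
    rotation of the index mark to the spectacle cylinder axis,
    subtract rightward rotation (pass a negative value). Results
    are kept in the 1-180 degree range used for cylinder axes."""
    axis = (spectacle_axis_deg + rotation_deg_left_positive) % 180
    return axis if axis != 0 else 180
```

For example, a spectacle axis of 90 degrees with 10 degrees of leftward lens rotation yields a compensated axis of 100 degrees.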
The angle of rotation calculation processes in the graphical user interface logic include one or more frames to assist with centering and aligning the camera to compose an image of a lens disposed on the eye of the patient, and with capturing an image of the lens.
To capture an image, the user can hold the camera upright at the level of the patient's eye, and utilize the graphical user interface to assist in capturing an image of the eye. The graphical user interface logic can include a frame as shown in FIG. 3E, including graphical constructs for choosing right eye 645 or left eye 655. Other constructs on the frame shown in FIG. 3E can include buttons 640, 660 to activate an image capture mode for the left and right eyes, and buttons (e.g. 635) used to activate an angle calculation mode (shown in FIG. 7), to add to or subtract from a displayed value, and to cancel the procedure. Del icon 925 can be used to delete the measured angle. Input generated by selecting one of the eyes in the frame shown in FIG. 3E can then cause the graphical user interface to enter an image capture mode, in which a frame or sequence of frames like that shown in FIG. 4A is displayed.
The frame in FIG. 4A includes view finder field 459, in which an image from the camera view finder is displayed in real time on the touch screen 211. Also, it includes an image capture button (graphical construct 480) which, when selected, captures an image and stores the captured image for processing. In the example shown, the frame includes an orientation graphic comprising, in this example, a movable ellipsoid feature 460, which can be red for example, and a set 465 including one or more fixed marks which indicate a preferred orientation for the camera. Also, or in the alternative, the orientation graphic includes a feature usable as a reference for sizing the lens that is the subject of the measurements within the area of the captured image. For example, the orientation graphic can include a perceptible ellipsoid within which the user can position the lens, and which can be used as a reference for magnification to adjust the size of the lens in the image to be captured.
The graphical user interface logic can generate the orientation graphic using the orientation sensor to indicate orientation of the camera around one or more of the line of sight axis, the horizontal axis and the vertical axis. In this example, the movable feature 460 has index marks located nominally at 12, 3, 6 and 9 o'clock. Also, in the example, the set 465 of one or more fixed marks includes alignment indexes which comprise pairs of lines positioned at 12, 3, 6 and 9 o'clock. When the camera is oriented as desired, the movable feature is positioned on the display so that the index marks align with the alignment indexes on the set 465 of one or more fixed marks. The frame illustrated in FIG. 4A can also include fields (e.g. 475) for displaying current angles of rotation for one or more of the roll, pitch and yaw.
In an example embodiment, the orientation graphic is generated in real time as the camera is moved. The shape of the movable feature 460 shifts as the position of the camera changes, as illustrated in FIGS. 4B, 4C and 4D. In FIG. 4B, the movable feature 485 is presented as an ellipsoid having a long axis that is close to horizontal and reaches across the fixed circle, and a short axis that is offset from the vertical by a slight negative angle and substantially shorter than the diameter of the fixed circle. The length of the short axis indicates that the camera is rotated about the horizontal axis. The rotation of the long and short axes indicates a small offset around the line of sight axis. The length of the long axis indicates that the camera is close to preferred alignment on the vertical axis. In FIG. 4C, the movable feature 485 is presented as an ellipsoid having a long axis that is slightly offset from the horizontal and reaches across the fixed circle, and a short axis that is offset from the vertical by an angle of close to 10 degrees and substantially shorter than the diameter of the fixed circle. The length of the short axis indicates that the camera is rotated about the horizontal axis. The rotation of the long and short axes indicates an offset around the line of sight axis. The length of the long axis indicates that the camera is close to preferred alignment on the vertical axis. In FIG. 4D, the movable feature 485 is presented as an ellipsoid (now close to circular) having a long axis that is slightly offset from the horizontal and reaches across the fixed circle, and a short axis that is slightly offset from the vertical and reaches across the fixed circle. As the length of the short axis and the long axis of the ellipsoid reach the diameter of the fixed circle, the orientation graphic indicates that the camera is aligned about the horizontal axis and vertical axis as desired.
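The ellipse behavior described above can be modeled with simple foreshortening: a circle viewed by a tilted camera appears as an ellipse whose extent perpendicular to each tilt axis shrinks roughly with the cosine of the tilt. A minimal sketch under an orthographic (parallel-projection) approximation; the function name is an assumption, and an actual implementation would render via a 3D transform:

```python
import math

def apparent_ellipse_axes(radius, pitch_deg, yaw_deg):
    """Approximate semi-axes of the on-screen alignment ellipse for a
    circle of the given radius: pitch (tilt about the horizontal axis)
    shrinks the vertical extent, yaw (tilt about the vertical axis)
    shrinks the horizontal extent. Orthographic approximation."""
    horizontal = radius * math.cos(math.radians(yaw_deg))
    vertical = radius * math.cos(math.radians(pitch_deg))
    return horizontal, vertical
```

When both tilts go to zero, both semi-axes reach the full radius and the ellipse becomes the circle of FIG. 4D.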
As the positions of the indexes on the movable feature align with the alignment index marks on the fixed circle, the orientation graphic indicates that the camera is aligned around the line of sight axis. In this manner, the graphical user interface logic can present feedback about the alignment of the camera to the user during the image capture mode.
The graphical user interface logic can continuously update and display these visual alignment indicators overlying the view finder field 459 on the display 206. In some examples, the orientation graphics can be positioned differently on the frame, or elsewhere viewable by the user, as suits a particular implementation. The processor 203 can calculate and display one or more of the pitch value (e.g. in field 475), roll value and yaw value, while the user adjusts the orientation by moving the camera until the one or more of the pitch, roll and yaw values display appropriate values.
Input generated by selection of the construct 480 on the frame shown in FIG. 4A causes capture and storage of the image. The captured image is stored in a manner that incorporates alignment information within the image, such as by having pixels with coordinates on the captured image. The alignment process supported by the graphical user interface logic can assist in producing an image that does not include keystone distortion, which can interfere with the accuracy of the computations executed. Keystone distortion occurs, for example, when an image is projected onto a surface that is not perfectly normal to the lens, as with a projector whose line of projection is not normal to the screen on which it projects. This distortion can make accurate measurement of a rotation angle difficult or impossible. With the alignment tools presented herein, the processes can be based on an assumption that the captured image does not include significant keystone distortion, and that the coordinate axes used to identify locations in the image can be considered to correspond to a reference plane on the lens usable to determine its angle of repose.
In some implementations, the graphical user interface logic may only accept input via the graphical construct for capturing an image at times when the alignment meets some minimum criteria.
After storing a captured image, the graphical user interface logic can cause display of a frame like that shown in FIGS. 5A and 5B. The frame in FIG. 5A shows the captured image in field 552 and button constructs 550, 555 which, when selected, cause the view finder image to be used for angle computations, or to be discarded. The frame includes a zoom feature, which can be activated by a gesture using the touch screen, for example, so that a zoomed image 558 can be produced in the field 552. This feature permits the user to check the quality (focus) of the captured view finder image to determine whether the lens edges (547) and the alignment index mark (548) on the lens can be seen. This feature can also be used for sizing the lens within the area of the captured image. The alignment index mark 548 on a contact lens is typically a groove on the lens surface which can be seen by an ECP. A zoom feature on the device enables more accurate viewing of the mark.
After the graphical user interface logic stores an accepted eye image, the process continues to axis compensation. In this stage, the graphical user interface uses the display to present a frame or frames including the captured image, along with graphical constructs prompting input identifying a set of locations on the captured image usable to define a location of a lens in the image, and a location usable to define an angle of repose of the lens in the image, and accepts input data responsive to the graphical constructs that identifies locations on the captured image. Also, the graphical user interface logic can process the input data to determine the angle of repose of the lens in the image. Example frames usable for this purpose are shown in FIGS. 6-8.
The frame shown in FIG. 6 includes a field 720 in which the captured image is displayed, and a field 721 including a set of graphical constructs. The graphical constructs in the field 721 prompt input identifying a set of locations, and include a construct (e.g. 732) which, when selected, identifies a particular member of the set of locations, and causes the graphical user interface to interpret coordinates of a selected point in the displayed image as that particular member of the set of locations. Other constructs 751, 753 and 754 are included, and are used to invoke actions by the graphical user interface logic.
In this example, the constructs in field 721 include a set of constructs (e.g. 732, 742, 752) corresponding to locations usable to define a perimeter of the contact lens in the captured image, where the constructs in the set, when selected, identify respective members of the set of locations, and cause the graphical user interface to interpret coordinates of a point (e.g. point 715) selected by a user touch on the touch screen in the captured image as the respective member of the set of locations.
In this example, the constructs (e.g. 732, 742 and 752) are used to identify data that is interpreted as the locations of three points on the edge of a toric contact lens in the image, as selected by the user. The three selected points are interpreted by the graphical user interface logic as identifying the location of the circumference of the contact lens, usable in a process to define an angle of repose of the contact lens in the image. The graphical user interface logic can display values representing the coordinates of the locations of the three dots in fields P1 (construct 732), P2 (construct 742) and P3 (construct 752).
Also, the constructs in the field 721 include another construct 755 which corresponds to an index point location in the set of locations usable to define the location of an index point on the lens in the captured image. An index point can be the mark that is typically used on toric contact lenses, or any other feature of the lens that can indicate an angular position of the lens relative to a reference line, such as a vertical line. When selected, the construct 755 identifies the index point location, and causes the graphical user interface to interpret coordinates of a selected point in the captured image as the index point location in the set of locations.
The graphical user interface logic can produce a graphical element 725, as seen in FIG. 7, that overlays the circumference of the contact lens image, based on the three data points entered by the user. This location is usable to define an angle of repose of the contact lens in the image. The coordinates of the center of the circle can be displayed, if desired, as indicated by the field labeled with the construct "C" 760 in FIG. 7.
In one implementation, a widely understood method of defining a circle uses the perpendicular bisectors of two chords drawn between the 3 dots to calculate the center and radius of the circle. The circle represents the location of the lens in the image of the eye.
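The perpendicular-bisector construction can be sketched as follows. This illustrative Python uses the closed-form circumcenter formula, which is algebraically equivalent to intersecting the perpendicular bisectors of two chords drawn between the three dots; the function and variable names are assumptions for illustration, not taken from the patent.

```python
import math

def circle_from_three_points(p1, p2, p3):
    """Return (center_x, center_y, radius) of the circle through three
    non-collinear points, e.g. three user-selected dots on a lens edge."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Twice the signed area of the triangle; zero means collinear points.
    d = 2 * (x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2))
    if abs(d) < 1e-12:
        raise ValueError("points are collinear; no unique circle")
    ux = ((x1**2 + y1**2) * (y2 - y3) + (x2**2 + y2**2) * (y3 - y1)
          + (x3**2 + y3**2) * (y1 - y2)) / d
    uy = ((x1**2 + y1**2) * (x3 - x2) + (x2**2 + y2**2) * (x1 - x3)
          + (x3**2 + y3**2) * (x2 - x1)) / d
    r = math.hypot(x1 - ux, y1 - uy)
    return ux, uy, r

print(circle_from_three_points((0, 1), (1, 0), (-1, 0)))  # (0.0, 0.0, 1.0)
```

The returned center and radius would drive the overlay element 725 and the displayed center coordinates 760.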
By processing the input interpreted as identifying the user-selected index mark, the graphical user interface logic calculates the angle, measured at the center of the lens, between the vertical axis (815) in the image and the line to the index mark (825), and interprets that angle as the angle of repose of the lens on the eye. The angle of repose of the lens on the eye can be displayed in the field 835 as illustrated in FIG. 8.
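Given the computed circle center and the user-selected index mark, the deviation from the vertical axis can be computed as below. This is an illustrative sketch: the coordinate assumptions (x right, y down, as is common for image pixels) and the sign convention (positive when the mark lies to the viewer's right of vertical) are choices made for this example, and a production implementation would match the LARS sign convention used in the text.

```python
import math

def angle_of_repose(center, index_mark):
    """Angle in degrees between the downward vertical axis through the lens
    center and the line from the center to the index mark.

    Assumes image coordinates with x increasing rightward and y increasing
    downward, so a mark directly below the center yields 0 degrees.
    """
    cx, cy = center
    mx, my = index_mark
    # atan2(horizontal offset, downward vertical offset) measures deviation
    # from the vertical axis rather than from the horizontal.
    return math.degrees(math.atan2(mx - cx, my - cy))

print(angle_of_repose((100.0, 100.0), (100.0, 150.0)))  # 0.0
```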
After computation of the angle of repose, the graphical user interface logic can display a frame like that of FIG. 3E, updated as shown in FIG. 9 to show the angle of lens repose in the user interface.
From the screen shown in FIG. 9, the user can select a variety of processes as explained above. A suggested prescription process is invoked by selection of the construct 319. Upon selection of the construct 319, the graphical user interface logic can display a frame like that shown in FIG. 10A, which includes constructs 1025 for displaying updated specifications for a lens, including for example the sphere, cylinder and axis parameters of a toric contact lens prescription. Also, the frame shown in FIG. 10A includes construct 1015 which, when selected, invokes a process to save the lens specifications in the patient record. As shown in FIG. 10B, after a successful save operation, the graphical user interface logic can display a window 1035 indicating that the process is done.
In some implementations, the processor 203 can initiate transmission of the value of the angle of lens rotation for the patient to another processor, such as a smart phone, a tablet, a laptop computer or a desktop computer. Prescription entry and mapping can be completed by any of these devices.
In one example, the graphical user interface logic can include a lookup table and supporting logic to compute the updated specifications as follows. The logic receives user input concerning patient records, including spectacle Rx and lens model information, and an angle of lens rotation 945 for the patient. The angle of repose can be deleted using the Del icon 925 if the patient's measurements are to be updated. A contact lens prescription for a patient with astigmatism can include three components: degree of shortsightedness or longsightedness (e.g. -5.25), degree of astigmatism (e.g. -1.50) and location of the astigmatism (e.g. 10). The sphere measurement (y axis of the table; degree of shortsightedness or longsightedness, e.g. -5.25) and the degree of astigmatism (x axis of the table; e.g. -1.50) can be used as indexes to entries in a lookup table 1100 stored in memory accessible by the processor, to determine two components of the suggested contact lens Rx (-5.00 -1.25) 1155. In an alternative example, the graphical user interface logic can utilize formulae or combinations of formulae and tables to compute the updated specifications for the updated prescription.
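A minimal sketch of such a table lookup in Python is below. The table contents and function names are hypothetical; the single entry shown merely reproduces the example pair from the text, and a real table 1100 would hold the manufacturer's full fitting grid.

```python
# Hypothetical excerpt of lookup table 1100: keys are (spectacle sphere,
# spectacle cylinder), values are the suggested contact lens (sphere,
# cylinder). Only the example pair from the text is included here.
RX_TABLE = {
    (-5.25, -1.50): (-5.00, -1.25),
}

def suggest_sphere_cylinder(sphere, cylinder, table=RX_TABLE):
    """Return the suggested contact lens sphere/cylinder for a spectacle Rx,
    falling back to the nearest tabulated entry when no exact match exists."""
    key = (sphere, cylinder)
    if key in table:
        return table[key]
    # Nearest-entry fallback: minimize combined sphere/cylinder distance.
    nearest = min(table, key=lambda k: abs(k[0] - sphere) + abs(k[1] - cylinder))
    return table[nearest]

print(suggest_sphere_cylinder(-5.25, -1.50))  # (-5.0, -1.25)
```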
Then the logic can calculate the axis of the suggested Rx using the convention that a positive value means the lens rotates to the left (clockwise from the user's viewpoint), and a negative value means that the lens rotates to the right (counter-clockwise), as specified by a rule called "Left Add, Right Subtract" (known as the LARS rule) or "Clockwise Add, Anticlockwise Subtract" (known as the CAAS rule). To calculate the axis of the suggested Rx:
Axis of suggested toric lens Rx = axis of the spectacle Rx + angle measured
For example, if the axis of the spectacle Rx = 10°, and the angle of repose = -5°, then the axis of the suggested Rx = 10° + (-5°) = 5°. The range of axis measurements is from 1 to 180 degrees, like a protractor; that is, 0 = 180, and only 180° are in use for prescriptions. Any axis value greater than 180° is represented by the value on the "opposite" side; for example, an axis measurement of 181 = 1, 270 = 90, 320 = 140, and so on. Therefore, if the axis of a spectacle Rx is 180 degrees and the angle of rotation measured is 20, the suggested Rx axis is 20, because {180 = 0 (spectacle Rx)} + {20 (angle measured)} = 20.
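The LARS/CAAS addition together with the folding into the 1-180° protractor range can be expressed compactly; this is an illustrative sketch (the function name is not from the patent):

```python
def suggested_axis(spectacle_axis, measured_rotation):
    """Apply the LARS/CAAS rule: add the measured rotation (positive = lens
    rotated left, i.e. clockwise from the user's viewpoint) to the spectacle
    axis, then fold the result into the 1-180 degree range, where 0 and 180
    denote the same meridian."""
    axis = (spectacle_axis + measured_rotation) % 180
    return axis if axis != 0 else 180

print(suggested_axis(10, -5))   # 5
print(suggested_axis(180, 20))  # 20
```

The modulo fold also handles the "opposite side" cases in the text, e.g. a raw sum of 181 maps to 1 and 270 maps to 90.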
FIG. 12 shows graphical user interface logic for a method responsive to user inputs via graphical constructs described above. In step 1210, the application starts. Step 1220 covers login to the application. Upon successful login, in step 1230 the processor 203 delivers a home page graphical construct 381; an example home page is shown in FIG. 3B. Step 1240 is the option for record listing and search, which is described in more detail in FIG. 14. At step 1250, the processor 203 provides a graphical element for entering a new record. The new record overview is shown in FIG. 13A. In one example, the element can be labeled New Case 362. In step 1260 the processor delivers icons for contact us 372, 382. Upon selection, content for contacting the business is listed. In step 1270, legal statements are displayed when the icon for Terms of Service 371, 383 is selected. In step 1280, the application returns to display the home page graphical construct 381.
FIG. 13A shows details of the graphical user interface logic for processing a new record, referenced in step 1250. At step 1310, the processor displays a GUI frame for starting a new record, and at step 1324, prompts are provided for filling in record details for a new case as shown in FIG. 3C. If the fields are not already clear (step 1322) then the processor 203 clears any unwanted data in step 1332. At step 1326, a save icon 915 graphical element, displayed in the GUI by processor 203, can be selected at step 1328 to save all the data. At step 1334, the processor 203 generates a series of frames that guide the angle calculation process, as described above and shown in FIG. 4A-FIG. 4D. At step 1344, photo shooting refers to selecting graphical construct 480 to capture the image of the eye. In step 1354 the processor displays a frame like that of FIG. 5A, including button constructs which, when selected, cause the image to be used (555) for angle computations or to be discarded (550). In step 1364, the processor 203 delivers the frames for circle marking and angle marking. At step 1362 the processor provides graphical prompts that allow a user to clear (button 753 in FIG. 6) or use (button construct 754 in FIG. 6) location data, or to go back (button 751 in FIG. 6) to capture replacement location inputs, after selection of dots for identifying lens location. Processor 203 also provides graphical prompts to clear 766 or use 768 the location data and resultant element. At step 1366 the option is to go back 764 to capture a replacement eye image at step 1344 after a resultant circle graphical element has been displayed. Step 1374 continues the process when the location inputs have been accepted for use via button constructs 754, 768. In step 1376, points are accepted as set, and at step 1378 the points are updated and can be stored in the read/write memory 208, and the prescription is updated. If the points are cleared, as in step 1386, the process returns to step 1324.
FIG. 13B, step 1331, starts the prescription updating logic block. At step 1341, the processor 203 updates the prescription based on the determined angle of repose, and a table lookup to determine the nearest Rx value is performed in step 1351.
FIG. 13C shows the control logic for circle marking and angle marking. In step 1337, processor 203 executes a graphical user interface in which the GUI displays a frame or frames including the captured eye image, and graphical constructs prompt input identifying a set of locations on the captured image usable to define a location of a lens in the image, and a location usable to define an index point location on the lens, as shown in FIG. 6-FIG. 7. At step 1347, for calculating the vertical radius to toric marking radius, the processor 203 accepts input data responsive to the graphical constructs that identify locations on the captured image. At step 1357, for calculating the angle of repose, the processor 203 interprets the input data as locations on edges of a lens in the captured image, and a location of a position of an index point on the lens in the captured image, and processes the input identifying the set of locations to determine a center of the lens in the captured image, and to determine the angle of repose of the contact lens on the eye.
FIG. 14 shows control logic for processing a record listing and search, starting with step 1410. This option can be selected using the record listing and search icon 361 shown in FIG. 3B. At step 1420, the processor 203 displays a listing of records from the patient records stored in the read/write memory 208. At step 1430 a search selection is accepted by processor 203, and at step 1440 a search result listing is displayed. At step 1446, a record can be selected for editing in step 1456. Alternatively, a search result record can be selected for deletion in step 1450, and after confirmation of the intention to delete in step 1460, the processor 203 will delete the record in step 1470 from the patient records stored in the read/write memory 208.
An existing patient record can be edited to add new location data, or to replace existing location data, for updating a prescription based on the determined angle of repose. FIG. 15 shows control logic for processing a record edit, starting with step 1411. At step 1423, prompts are provided for filling in record details in the case being edited. If the fields are not clear (step 1421) then the processor 203 clears any unwanted data in step 1431. In step 1425, a save prompt can be selected (step 1427) to save all the data. At step 1433, the processor 203 generates a series of frames that guide the angle calculation process, as described above and shown in FIG. 4A-FIG. 4D. In step 1443, described above and in FIG. 13C, the processor 203 delivers the frames for circle marking and angle marking. Processor 203 provides graphical prompts that allow a user to go back via step 1445, returning to step 1423, described above. Processor 203 also provides graphical prompts for clear 766 or use 768 of the location data and resultant element. Step 1453 continues the process when the location inputs have been accepted for use via button constructs 754, 768. In step 1455, points are accepted as set, and at step 1457 the processor 203 updates the points and stores updated data for the record in the read/write memory 208, and the prescription is updated. If the points are cleared, as in step 1465, the processor 203 reverts the data to the original points and the process returns to step 1423.
FIG. 16 shows an overview of a method for computing the angle of repose for a lens on an eye, and for modifying a specification of a contact lens using the angle. Other implementations may perform the actions in different orders and/or with different, fewer or additional actions than those illustrated in FIG. 16. Multiple actions can be combined in some implementations. In step 1510, the method includes displaying an image of a camera view finder and an orientation graphic, including indicators of orientation of the camera using the orientation sensor. These indicators can indicate rotation of the camera lens relative to a line of sight axis normal to the lens, around a horizontal axis and around a vertical axis. In step 1520, the method includes enabling capture and storage of an image while displaying the graphic. In step 1530, the method includes executing GUI logic using the display, the GUI logic including a frame or frames including the captured image and graphical constructs prompting input identifying a set of locations on the captured image, the set including a number of locations usable to define a location of a lens in the image, and a location usable to define an index point location on the lens. Step 1535 includes accepting input data, responsive to the graphical constructs, that identifies locations on the captured image.
Step 1540 includes interpreting the input data as locations on edges of a lens in the captured image, and a location of a position of an index point on the lens in the captured image, and step 1550 includes processing the input identifying the set of locations to determine the angle of repose of the lens on the eye. Finally, step 1560 of the method includes updating or producing a lens specification (e.g. a toric contact lens prescription) based on the determined angle of repose.
As with all flowcharts herein, it will be appreciated that many of the steps can be combined, performed in parallel or performed in a different sequence without affecting the functions achieved. In some cases, as the reader will appreciate, a re-arrangement of steps will achieve the same results only if certain other changes are made as well. In other cases, as the reader will appreciate, a re-arrangement of steps will achieve the same results only if certain conditions are satisfied. Furthermore, it will be appreciated that the flow charts herein show steps that are pertinent to an understanding of the invention, and it will be understood that numerous additional steps for accomplishing other functions can be performed before, after and between those shown.
While the present invention is disclosed by reference to the preferred embodiments and examples detailed above, it is to be understood that these examples are intended in an illustrative rather than in a limiting sense. It is contemplated that modifications and combinations will readily occur to those skilled in the art, which modifications and combinations will be within the spirit of the invention and the scope of the following claims.

Claims (24)

What is claimed is:
1. A method for operating a device or devices to determine an angle of repose of a contact lens on an eye, the device or devices including a display, a camera with a camera lens, an orientation sensor, a data processor, and a user input device, comprising:
displaying a view finder image from the camera on the display prior to capturing the image;
displaying an orientation graphic on the display using the orientation sensor while displaying the view finder image, the orientation graphic to indicate orientation of the camera lens around a line of sight axis normal to the camera lens, around a horizontal axis and around a vertical axis;
capturing and storing an image using the camera;
executing a graphical user interface using the display, the graphical user interface including a frame or frames including the captured image, and graphical constructs prompting input identifying a set of locations on the captured image usable to define a location of the contact lens in the image, and a location usable to define an angle of repose of the contact lens in the image; and
accepting input data responsive to the graphical constructs that identifies the set of locations to define the location of the contact lens in the captured image and the location to define the angle of repose of the contact lens in the captured image.
2. The method of claim 1, including processing the input data to determine the angle of repose of the contact lens in the captured image.
3. The method of claim 1, wherein the graphical constructs prompting input identifying a set of locations include a construct which when selected identifies a particular member of the set of locations, and causes the graphical user interface to interpret coordinates of a selected point in the displayed image as the particular member of the set of locations.
4. The method of claim 1, wherein the graphical constructs prompting input identifying a set of locations include:
a set of constructs corresponding to locations usable to define a perimeter of the contact lens in the captured image, where the constructs in the set when selected identify respective members of the set of locations, and cause the graphical user interface to interpret coordinates of a selected point in the captured image as the respective member of the set of locations; and
another construct corresponding to an index point location in the set of locations usable to define the location of an index point on the contact lens in the captured image which when selected identifies the index point location, and causes the graphical user interface to interpret coordinates of a selected point in the captured image as the index point location in the set of locations.
5. The method of claim 1, including:
interpreting the input data as locations on edges of the contact lens in the captured image, and a location of a position of an index point on the contact lens in the captured image; and
processing the input identifying the set of locations to determine a center of the contact lens in the captured image, and to determine the angle of repose of the contact lens on an eye.
6. The method of claim 1, including:
processing the input data to determine the angle of repose of the contact lens on an eye; and
producing a lens specification based on the determined angle of repose.
7. The method of claim 1, wherein the user input device includes a touch screen overlying the display.
8. The method of claim 1, wherein the orientation graphic overlies the view finder image on the display, and includes a feature useable as reference for sizing the contact lens within the area of the captured image.
9. A device for determining an angle of repose of a contact lens on an eye, comprising:
a display; a camera with a camera lens, a data processor; an orientation sensor and a user input device, and including instructions executable by the data processor to:
display a view finder image from the camera on the display prior to capturing an image;
display an orientation graphic on the display using the orientation sensor while displaying the view finder image, the orientation graphic including graphical constructs that indicate orientation of the camera lens around a line of sight axis normal to the camera lens, around a horizontal axis and around a vertical axis;
store an image captured using the camera;
execute a graphical user interface using the display, the graphical user interface including a frame or frames including the captured image, and graphical constructs prompting input identifying a set of locations on the captured image usable to define a location of the contact lens in the image, and a location usable to define an angle of repose of the contact lens in the image; and
accept input data responsive to the graphical constructs that identifies the set of locations to define the location of the lens in the captured image and the location to define the angle of repose of the contact lens in the captured image.
10. The device of claim 9, including instructions executable to process the input data to determine the angle of repose.
11. The device of claim 9, wherein the graphical constructs prompting input identifying a set of locations include a construct which when selected identifies a particular member of the set of locations, and causes the graphical user interface to interpret coordinates of a selected point in the displayed image as the particular member of the set of locations.
12. The device of claim 9, wherein the graphical constructs prompting input identifying a set of locations include:
a set of constructs corresponding to locations usable to define a perimeter of the contact lens in the captured image, where the constructs in the set when selected identify respective members of the set of locations, and cause the graphical user interface to interpret coordinates of a selected point in the captured image as the respective member of the set of locations; and
another construct corresponding to an index point location in the set of locations usable to define the location of an index point on the contact lens in the captured image which when selected identifies the index point location, and causes the graphical user interface to interpret coordinates of a selected point in the captured image as the index point location in the set of locations.
13. The device of claim 9, including instructions executable to:
interpret the input data as locations on edges of the contact lens in the captured image, and a location of a position of an index point on the lens in the captured image; and
process the input identifying the set of locations to determine a center of the contact lens in the captured image, and to determine the angle of repose of the lens.
14. The device of claim 9, including instructions executable to:
process the input data to determine the angle of repose of the contact lens on an eye; and
produce a lens specification based on the determined angle of repose.
15. The device of claim 9, wherein the user input device includes a touch screen overlying the display.
16. The device of claim 9, wherein the orientation graphic overlies the view finder image on the display, and includes a feature useable as reference for sizing the contact lens within the area of the captured image.
17. A computer program product for determining an angle of repose of a contact lens on an eye using a device having a display; a camera with a camera lens, a data processor, an orientation sensor and a user input device, the product comprising a non-transitory computer readable medium storing instructions executable by the data processor, the instructions comprising:
logic to display a view finder image from a camera on the display prior to capturing the image;
logic to display an orientation graphic on the display while displaying the view finder image, using the orientation sensor, the orientation graphic including graphical constructs that indicate orientation of the camera lens around a line of sight axis normal to the camera lens, around a horizontal axis and around a vertical axis;
logic to store an image captured using the camera;
logic to execute a graphical user interface using the display, the graphical user interface including a frame or frames including the captured image, and graphical constructs prompting input identifying a set of locations on the captured image usable to define a location of the contact lens in the image, and a location usable to define an angle of repose of the contact lens in the image; and
logic to accept input data responsive to the graphical constructs that identifies the set of locations to define the location of the contact lens in the captured image and the location to define the angle of repose of the contact lens in the captured image.
18. The product of claim 17, wherein the instructions include logic to process the input data to determine the angle of repose of the contact lens in the captured image.
19. The product of claim 17, wherein the graphical constructs prompting input identifying a set of locations include a construct which when selected identifies a particular member of the set of locations, and causes the graphical user interface to interpret coordinates of a selected point in the displayed image as the particular member of the set of locations.
20. The product of claim 17, wherein the graphical constructs prompting input identifying a set of locations include:
a set of constructs corresponding to locations usable to define a perimeter of the contact lens in the captured image, where the constructs in the set when selected identify respective members of the set of locations, and cause the graphical user interface to interpret coordinates of a selected point in the captured image as the respective member of the set of locations; and
another construct corresponding to an index point location in the set of locations usable to define the location of an index point on the contact lens in the captured image which when selected identifies the index point location, and causes the graphical user interface to interpret coordinates of a selected point in the captured image as the index point location in the set of locations.
21. The product of claim 17, wherein the instructions include logic to:
interpret the input data as locations on edges of the contact lens in the captured image, and a location of a position of an index point on the contact lens in the captured image; and
process the input identifying the set of locations to determine a center of the contact lens in the captured image, and to determine the angle of repose of the contact lens.
22. The product of claim 17, wherein the instructions include logic to:
process the input data to determine the angle of repose of the contact lens; and
produce a lens specification based on a determined angle of repose.
23. The product of claim 17, wherein the user input device includes a touch screen overlying the display.
24. The product of claim 17, wherein the orientation graphic overlies the view finder image on the display, and includes a feature useable as reference for sizing the contact lens within the area of the captured image.
US15/070,254 | 2015-04-02 | 2016-03-15 | Systems and methods for determining an angle of repose of an asymmetric lens | Active, expires 2036-05-24 | US9867533B2 (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
US15/070,254 (US9867533B2) | 2015-04-02 | 2016-03-15 | Systems and methods for determining an angle of repose of an asymmetric lens

Applications Claiming Priority (2)

Application Number | Priority Date | Filing Date | Title
US201562142006P | 2015-04-02 | 2015-04-02
US15/070,254 (US9867533B2) | 2015-04-02 | 2016-03-15 | Systems and methods for determining an angle of repose of an asymmetric lens

Publications (2)

Publication Number | Publication Date
US20160287067A1 (en) | 2016-10-06
US9867533B2 (en) | 2018-01-16

Family

ID=55699665

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
US15/070,254 (US9867533B2, Active, expires 2036-05-24) | Systems and methods for determining an angle of repose of an asymmetric lens | 2015-04-02 | 2016-03-15

Country Status (13)

Country | Link
US (1) | US9867533B2 (en)
EP (1) | EP3262459A1 (en)
JP (1) | JP6564054B2 (en)
KR (1) | KR101891457B1 (en)
CN (1) | CN107430289B (en)
AU (1) | AU2016240027B2 (en)
CA (1) | CA2981412C (en)
GB (1) | GB2553939B (en)
MX (1) | MX370882B (en)
MY (1) | MY180079A (en)
SG (1) | SG11201707770VA (en)
TW (1) | TWI632529B (en)
WO (1) | WO2016156810A1 (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
DK180859B1 (en)* | 2017-06-04 | 2022-05-23 | Apple Inc | User interface camera effects
US11509755B2 (en)* | 2017-09-29 | 2022-11-22 | Johnson & Johnson Vision Care, Inc. | Method and means for evaluating toric contact lens rotational stability
RU2695567C1 (en)* | 2018-08-30 | 2019-07-24 | федеральное государственное автономное учреждение "Национальный медицинский исследовательский центр "Межотраслевой научно-технический комплекс "Микрохирургия глаза" имени академика С.Н. Федорова" Министерства здравоохранения Российской Федерации | Method for determining angle of rotation of toric intraocular lens
EP4356818A4 (en)* | 2021-06-17 | 2025-06-11 | Kowa Company, Ltd. | Medical program and medical examination system
US20240062676A1 (en)* | 2022-08-19 | 2024-02-22 | Johnson & Johnson Vision Care, Inc. | Digital contact lens insertion and removal aid
US20240373121A1 (en) | 2023-05-05 | 2024-11-07 | Apple Inc. | User interfaces for controlling media capture settings
US20250106501A1 (en)* | 2023-09-22 | 2025-03-27 | Walmart Apollo, Llc | System and method for user interface guidance system for electronic devices

Citations (10)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US5686981A (en) | 1994-02-28 | 1997-11-11 | Menicon Co., Ltd | Ophthalmologic device for accurately positioning a contact lens to an eye
US5926252A (en)* | 1998-08-25 | 1999-07-20 | Reyburn; Thomas P. | Ophthalmic instrument that measures rotation of a toric contact lens
US5963299A (en) | 1997-07-22 | 1999-10-05 | Reyburn; Thomas P. | Method and apparatus for measuring toric contact lens rotation
CA2717328A1 (en) | 2010-10-14 | 2012-04-14 | Anton Sabeta | Method and system for determining the orientation of an ophthalmic lens
TW201241781A (en) | 2011-04-07 | 2012-10-16 | Claridy Solutions Inc | Interactive service methods and systems for virtual glasses wearing
WO2013104353A1 (en) | 2012-01-10 | 2013-07-18 | Hans-Joachim Ollendorf | Mobile video centring system for determining centring data for spectacle lenses
TW201433998A (en) | 2013-02-23 | 2014-09-01 | Univ Southern Taiwan Sci & Tec | Cloud body-sensory virtual-reality eyeglasses prescription system
WO2014193798A1 (en) | 2013-05-30 | 2014-12-04 | Johnson & Johnson Vision Care, Inc. | Apparatus for programming an energizable ophthalmic lens with a programmable media insert
US8967488B2 (en) | 2013-05-17 | 2015-03-03 | Johnson & Johnson Vision Care, Inc. | Ophthalmic lens with communication system
US20150061990A1 (en) | 2013-09-04 | 2015-03-05 | Johnson & Johnson Vision Care, Inc. | Ophthalmic lens system capable of interfacing with an external device

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
JPH11295669A (en)* | 1998-04-08 | 1999-10-29 | Menicon Co Ltd | Trial lens
GB0713461D0 (en)* | 2007-07-11 | 2007-08-22 | Ct Meter Ltd | Device and methods for obtaining measurements for spectacles fitting
JP3164152U (en)* | 2010-07-21 | 2010-11-18 | 株式会社マグネテックジャパン | Pair shave bar magnet
FR2980591B1 (en)* | 2011-09-28 | 2014-05-16 | Essilor Int | Method for measuring morpho-geometric parameters of an individual wearing glasses
JP2015048378A (en)* | 2013-08-30 | 2015-03-16 | 三菱レイヨン株式会社 | Acrylonitrile-based polymer particle and production method


Non-Patent Citations (10)

* Cited by examiner, † Cited by third party
Title
Bethke (Ed), "Smartphones Take on Astigmatism", Review of Ophalmology, Jun. 5, 2014, 4 pages.
Office Action from family member application TW105110659, dated Jan. 10, 2017, with English translation, 26 pages.
PCT Int'l Preliminary Report on Patentability (IPRP) in related Application No. PCT/GB2016/050824 dated Jun. 26, 2017, 10 pages.
PCT Response to Written Opinion in related Application No. PCT/GB2016/050824 dated May 8, 2017, 13 pages.
PCT Search Report in related Application No. PCT/GB2016/050824 dated Jun. 24, 2016, 14 pages.
PCT Written Opinion Report in related Application No. PCT/GB2016/050824 dated Mar. 9, 2017, 11 pages.
TW 105110659-First Office Action dated Jan. 10, 2017, 26 pages.
TW 105110659-Office Action dated Aug. 9, 2017, 5 pages.

Also Published As

Publication number | Publication date
WO2016156810A1 (en)2016-10-06
US20160287067A1 (en)2016-10-06
TW201640448A (en)2016-11-16
EP3262459A1 (en)2018-01-03
CA2981412C (en)2019-03-19
SG11201707770VA (en)2017-10-30
MX370882B (en)2020-01-08
KR101891457B1 (en)2018-09-28
MY180079A (en)2020-11-20
GB2553939A (en)2018-03-21
JP2018510384A (en)2018-04-12
AU2016240027B2 (en)2018-09-20
CA2981412A1 (en)2016-10-06
CN107430289A (en)2017-12-01
JP6564054B2 (en)2019-08-21
MX2017012035A (en)2018-07-06
AU2016240027A1 (en)2017-10-19
GB2553939B (en)2019-08-28
GB201715759D0 (en)2017-11-15
KR20170135850A (en)2017-12-08
CN107430289B (en)2020-09-11
TWI632529B (en)2018-08-11

Similar Documents

Publication | Publication Date | Title
US9867533B2 (en)Systems and methods for determining an angle of repose of an asymmetric lens
US10585288B2 (en)Computer display device mounted on eyeglasses
US10969949B2 (en)Information display device, information display method and information display program
US11789528B1 (en)On-the-fly calibration for improved on-device eye tracking
JP6514418B2 (en) Imaging system, imaging method, and program
US10033943B1 (en)Activity surface detection, display and enhancement
EP3270099B2 (en)Measurement device for eyeglasses-wearing parameter, measurement program for eyeglasses-wearing parameter, and position designation method
WO2018191784A1 (en)Eyeglasses ordering system and digital interface therefor
WO2020238249A1 (en)Display method, apparatus and device for interactive interface
CN103970499A (en)Method and device for displaying electronic content and terminal equipment
CN110455265A (en)RTK setting-out system, method and device
CN106097428A (en)The mask method of threedimensional model metrical information and device
KR20240032107A (en) Vision testing systems and methods and their uses
US20170083157A1 (en)Projection device
US20240159621A1 (en)Calibration method of a portable electronic device
HK1241986A1 (en)Systems and methods for determining an angle of repose of an asymmetric lens
HK1241986B (en)Systems and methods for determining an angle of repose of an asymmetric lens
CN114782962B (en) Calligraphy scoring method, device, storage medium and electronic device
KR102777427B1 (en)Shooting guidance method and program for creating 3d spin image
CN114971115A (en)Oil and gas technical element yield division rate determining method and device and storage medium
CN120144018A (en) Viewing angle control method, device, electronic device and storage medium
EP2677401A1 (en)Image data generation using a handheld electronic device
JP2016110597A (en)Information processing system, information processing device, coordinate conversion method and program

Legal Events

Date | Code | Title | Description
ASAssignment

Owner name:COOPERVISION HONG KONG LIMITED, HONG KONG

Free format text:ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:FAN, CHI SHING;REEL/FRAME:042046/0704

Effective date:20170407

ASAssignment

Owner name:COOPERVISION INTERNATIONAL HOLDING COMPANY, LP, BA

Free format text:ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:COOPERVISION HONG KONG LIMITED;REEL/FRAME:042787/0130

Effective date:20170615

STCFInformation on status: patent grant

Free format text:PATENTED CASE

CCCertificate of correction
ASAssignment

Owner name:COOPERVISION INTERNATIONAL LIMITED, UNITED KINGDOM

Free format text:ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:COOPERVISION INTERNATIONAL HOLDING COMPANY, LP;REEL/FRAME:054370/0631

Effective date:20201102

MAFPMaintenance fee payment

Free format text:PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment:4

MAFPMaintenance fee payment

Free format text:PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment:8

