US5933132A - Method and apparatus for calibrating geometrically an optical computer input system - Google Patents

Method and apparatus for calibrating geometrically an optical computer input system

Info

Publication number
US5933132A
US5933132A
Authority
US
United States
Prior art keywords
distortion
coordinate values
image
coordinate
determined
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
US08/648,659
Inventor
Roger N. Marshall
Lane T. Hauck
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Infocus Corp
Straight Signals LLC
Original Assignee
Proxima Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US07/611,416, external priority to patent US5181015A
Application filed by Proxima Corp
Priority to US08/648,659
Application granted
Publication of US5933132A
Assigned to INFOCUS CORPORATION. Merger (see document for details). Assignors: PROXIMA CORPORATION, A DELAWARE CORPORATION
Assigned to INFOCUS CORPORATION. Assignment of assignors interest (see document for details). Assignors: HAUCK, LANE T.
Assigned to STRAIGHT SIGNALS LLC. Assignment of assignors interest (see document for details). Assignors: INFOCUS CORPORATION
Assigned to PROXIMA CORPORATION. Assignment of assignors interest (see document for details). Assignors: BUSCH, JEFFREY W., HAUCK, LANE T., MARSHALL, ROGER N., SHAPIRO, LEONID, STEVENS, ERIC S.
Anticipated expiration


Abstract

A method and apparatus make geometric corrections in an optical computer input system.

Description

CROSS-REFERENCES TO RELATED APPLICATIONS
This application is a continuation patent application of U.S. patent application Ser. No. 08/342,905, filed Nov. 21, 1994, now abandoned, entitled "METHOD AND APPARATUS FOR CALIBRATING GEOMETRICALLY AN OPTICAL COMPUTER INPUT SYSTEM," which is a continuation patent application of U.S. patent application Ser. No. 08/115,522, filed Aug. 31, 1993, now abandoned, entitled "METHOD AND APPARATUS FOR CALIBRATING GEOMETRICALLY AN OPTICAL COMPUTER INPUT SYSTEM," which is a continuation patent application of U.S. patent application Ser. No. 07/656,803, filed Feb. 14, 1991, now abandoned, entitled "METHOD AND APPARATUS FOR CALIBRATING GEOMETRICALLY AN OPTICAL COMPUTER INPUT SYSTEM," which is a continuation-in-part patent application of U.S. patent application Ser. No. 07/433,029, filed Nov. 7, 1989, now abandoned, entitled "COMPUTER INPUT SYSTEMS AND METHOD OF USING SAME", and of U.S. patent application Ser. No. 07/611,416, filed Nov. 9, 1990, now U.S. Pat. No. 5,181,015, entitled "METHOD AND APPARATUS FOR CALIBRATING AN OPTICAL COMPUTER INPUT SYSTEM", each one of said applications being incorporated herein by reference as if fully set forth herein.
TECHNICAL FIELD
The present invention relates in general to a method and apparatus for calibrating geometrically an optical computer input system. It more particularly relates to a system for calibrating geometrically for distortion in connection with the optical computer input system shown and described in the said parent patent application.
BACKGROUND
A new optical computer input system is shown and described in said parent patent application. Such a system enables the user to shine a high intensity light onto a screen bearing a computer generated image to provide auxiliary information for the computer. Such an input system includes an optical sensing device, such as a charge coupled device camera focused onto the screen. Thus the system can detect high intensity light images and discriminate them from the computer generated images, to input information interactively into the computer, in a convenient manner, even in very low ambient light conditions.
While such a computer input system and method of using it has proven to be highly satisfactory, it would be desirable to provide for a geometric compensation or correction. There are various reasons why such a geometric correction is required. First, the screen onto which is projected the computer generated image may not be a perfect rectangle as presented to the sensing device or camera. In this regard, the screen may be tilted either forward or backward, or from side to side, or any combination thereof. Thus, the sensing device or camera will not track properly relative to the image visualized from the screen.
An even more significant problem is that of "keystoning," which is caused by an overhead projector utilized in projecting the computer generated image onto the screen. In this regard, the commonly known keystoning problem produces an image which has a longer top edge as compared to its bottom edge. Such a keystoning problem is well known with overhead projectors, and thus the sensing device or camera will produce a distortion in the image coordinates sensed and supplied to the computer.
A third problem is caused by the improper alignment of a projection panel on the stage of the overhead projector. In this regard, if the panel is not accurately aligned in a parallel manner on all sides relative to the projector's stage, the resulting image projected onto the screen will also be askew.
A fourth problem relates to the projector itself not being properly aligned relative to the screen. Such is commonly the case where the neck portion of the overhead projector may be bent slightly due to excessive use or wear. This causes a result similar to the improper registration of the panel on the stage of the projector.
A still further problem of geometric alignment is caused by the camera or sensing device being tilted at an angle, upwardly or downwardly, relative to the plane of the screen. The result is that a distortion may occur.
As a result of the described distortions due to the geometry of the screen, projector and panel, as well as the camera itself, the camera is unable to accurately plot the various coordinates visualized from the image projected onto the screen. Consequently, tracking cannot be perfectly accomplished. Thus, when a light is projected onto the screen, the camera may not accurately know the precise coordinates of the spot of light projected onto the screen. As a result, the computer may not accurately respond to the position of the light and incorrect data can be entered. Thus, erroneous results might occur.
DISCLOSURE OF INVENTION
Therefore, the principal object of the present invention is to provide a new and improved geometric correction arrangement for an optical imaging system.
Briefly, the above referenced object is realized by providing a new and improved geometric correction system.
There is provided in accordance with the present invention, a geometric system which includes an arrangement for generating geometrically compensated relative coordinates for a projected image and for storing such coordinates.
Therefore, the system of the present invention produces a normalization of the image of the screen to provide for the necessary correction. Thus, the resulting coordinates stored by the system are continuously used to adjust or convert the coordinates sensed by the camera. Thus, suitable corrections for the distortion are produced.
BRIEF DESCRIPTION OF DRAWINGS
The above mentioned and other objects and features of this invention, and the manner of attaining them will become apparent, and the invention itself will be best understood by reference to the following description of the embodiment of the invention in conjunction with the accompanying drawings, wherein:
FIG. 1 is a block diagram of the imaging system;
FIGS. 2 through 11 are diagrammatic views of various images or portions of images helpful in understanding the operation of the present invention; and
FIGS. 12 through 17 are flow charts of computer software for the system of FIG. 1 to illustrate the operation of the geometric correction arrangement.
BEST MODE FOR CARRYING OUT THE INVENTION
Referring now to the drawings and more particularly FIG. 1 thereof, there is illustrated a computer input system 10 which modifies computer generated images appearing on a screen 21, and which is constructed in accordance with the present invention. The computer input system 10 generally includes an image projection/detection system or arrangement 11 whose input path (cable 17A) is coupled to the output of a video port 17 of a computer 16. The arrangement 11 comprises a liquid crystal panel 13 and a charge coupled device image sensor 14. The computer 16 is a conventional personal computer, such as a model PS/2 personal computer manufactured by International Business Machines. The computer 16 includes a video monitor 19A and keyboard 19B. The panel 13 is driven by the computer 16 for generating live images which are projected by an overhead projector 22 onto the screen 21.
The computer input system 10 also includes a signal processing unit 25 coupled between the output path (cable 14A) of the image/detection arrangement 11 and the input serial port 18 of the computer 16 via cable 25A. The computer input system 10 further includes a light wand or light generating pointing device 24, or a laser light generating device.
The projection/detection arrangement 11 detects the presence of an auxiliary light image or spot projected onto the viewing surface 21 by the handheld, battery-operated light generating device 24, and generates an analog electrical signal which is coupled to the signal processing unit 25 via cable 14A. The signal processing unit 25 responds to the analog signal, and converts the signal into digital pixel coordinate reference signals which identify the relative position of the auxiliary light image on screen 21, and which are transferred into the computer 16 via the output cable 25A. Cable 25A is connected to the serial input port 18 of the computer 16. Computer 16 responds to the pixel coordinate signals and can alter its application program, which causes the computer generated image being projected onto the screen 21 to be modified. For example, the computer generated projected image on the viewing area 21 can be modified in accordance with the information contained in the coordinate reference signals.
The firmware stored in the signal processor 25 provides the necessary geometric correction for the image in accordance with the present invention. As shown in FIG. 2, the correction commences by projecting a bright rectangular light onto the screen to determine what correction is necessary, and then recording the necessary information for converting the coordinates to adjusted relative coordinates. As shown in FIG. 2, a true rectangular image is indicated at 80. It should be noted that each one of the four sides of the image can be distorted in a generally triangular manner. On each side of the rectangular image 80 there are two possible triangular areas of distortion. The arrangement of the present invention determines which one of the two triangular areas of distortion is present for each side of the rectangular image. Once that determination is made, a formula for the relative correction is generated and stored in the signal processor. Thus, there are eight possible triangular areas of distortion, indicated at 81 through 88.
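The side-by-side corner comparisons that select among the eight triangular areas can be sketched as follows. This is an illustrative reconstruction, not the patented firmware: the function name and corner ordering are our assumptions, pixel Y is assumed to increase downward, and the mapping of comparison outcomes to area numbers 81 through 88 follows the discussion of FIGS. 2 through 11 later in this description.

```python
def classify_distortion(tl, tr, bl, br):
    """Classify each edge of the detected image as undistorted or as one
    of the eight triangular distortion areas 81-88 of FIG. 2.

    Corners are (x, y) tuples: top-left, top-right, bottom-left,
    bottom-right.  Area numbering (an assumption based on the text):
    left -> 81/82, right -> 83/84, top -> 85/86, bottom -> 87/88.
    """
    edges = {}
    # Left edge: perfectly vertical if the two left X coordinates match.
    if tl[0] == bl[0]:
        edges['left'] = 'vertical'
    else:
        edges['left'] = 'area 82' if tl[0] > bl[0] else 'area 81'
    # Right edge.
    if tr[0] == br[0]:
        edges['right'] = 'vertical'
    else:
        edges['right'] = 'area 84' if tr[0] > br[0] else 'area 83'
    # Top edge: perfectly horizontal if the two top Y coordinates match.
    if tl[1] == tr[1]:
        edges['top'] = 'horizontal'
    else:
        edges['top'] = 'area 86' if tl[1] < tr[1] else 'area 85'
    # Bottom edge.
    if bl[1] == br[1]:
        edges['bottom'] = 'horizontal'
    else:
        edges['bottom'] = 'area 88' if bl[1] > br[1] else 'area 87'
    return edges
```

A perfect rectangle yields `vertical`/`horizontal` for every edge, so no correction formula needs to be applied for that side.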
In FIG. 3, there is shown an example of a grossly distorted rectangular image 90. The first portion of the process or technique for performing the correction is to determine the actual corners of the projected image. The coordinates for the individual corners are shown in FIG. 3. The technique for determining the corners is similar to that shown and described in the parent applications.
Once the corner coordinates are generated, a defined central coordinate X, Y is located at the intersection of the diagonals of the corners, as indicated in FIG. 3. This defined center establishes, for the purposes of the invention, the four quadrants of the generally rectangular image.
In this regard, if the coordinate identified on the screen is, for example, in the upper left quadrant, then one of the two top triangles must be identified, together with one of the two left triangles. In so doing, the relative coordinates for the identified spot of light on the screen can be determined accordingly.
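The intersection-of-diagonals step described above can be sketched in Python. This is an illustrative reconstruction under assumed ordinary pixel coordinates; the function name and corner argument order are ours, and the diagonals are assumed non-vertical and non-parallel, as they are for any sensibly projected image.

```python
def diagonal_intersection(tl, tr, bl, br):
    """Defined center of a quadrilateral: the intersection of the
    TL-BR and TR-BL diagonals.  Each corner is an (x, y) tuple.

    Solving the two point-slope line equations simultaneously
    (the form used in equations (3) and (4) of the description)
    yields the center coordinates.
    """
    # Slopes of the two diagonals (assumed finite here).
    m_lr = (tl[1] - br[1]) / (tl[0] - br[0])  # TL -> BR diagonal
    m_rl = (tr[1] - bl[1]) / (tr[0] - bl[0])  # TR -> BL diagonal
    # Eliminate y between  y - y_br = m_lr (x - x_br)
    # and                  y - y_bl = m_rl (x - x_bl).
    x_c = (bl[1] - br[1] + m_lr * br[0] - m_rl * bl[0]) / (m_lr - m_rl)
    # Substitute back into the first line equation.
    y_c = m_lr * (x_c - br[0]) + br[1]
    return x_c, y_c
```

For an undistorted rectangle the result is simply the rectangle's center; for a keystoned quadrilateral it shifts accordingly, which is what makes the quadrant test meaningful.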
Referring now to FIGS. 12 through 17, the software for the geometric correction arrangement will now be considered in greater detail. Once a rectangular image is projected onto the screen, such as the screen 21 of FIG. 1, the software in the signal processor 25 determines the coordinates of the four corners of the image appearing on the screen, as indicated at box 121 of FIG. 12. In this regard, the image sensor 14 (FIG. 1) determines the interface between the bright image and the remaining portion of the screen, in a similar manner as described in the parent patent applications.
As indicated in box 122 in FIG. 12, and as shown in FIG. 3, the geometric center of the screen is determined and defined at the intersection of diagonal lines extending through the corners. Thereafter, as indicated in box 123, the particular coordinates of a spot of light projected onto the screen by the handheld light device 24 are determined. In this regard, the position X of the spot of light is determined to be either left of center or not. If it is determined to be left of center, then the software will analyze whether the left edge is in perfect vertical alignment or whether it is distorted according to one or the other of the triangular areas of distortion 81 or 82, as indicated more clearly in FIGS. 4 and 5. As indicated in box 124, a determination is made as to whether or not the left edge is a perfectly aligned vertical edge. In this regard, the X coordinate of the top left corner is compared with the X coordinate of the bottom left corner. If they are equal, then the edge is determined to be perfectly oriented in a vertical disposition, and thus correction is not required. If such occurs, box 131 of FIG. 13 is entered.
Assuming that the left margin of the image is required to be corrected, the decision box 125 is entered as shown in FIG. 12 to determine whether the X coordinate of the top left corner is greater than that of the bottom left corner. This determination is made to learn whether there is the triangular area of distortion 81 (FIG. 4), or the triangular area of distortion shown at 82 of FIG. 5. The triangular area 81 is in the shape of a right triangle having its base aligned with the bottom edge of the rectangular image. The triangular area 82 is inverted from the area 81, and has its base coextensive with the top edge of the rectangular image area.
At the decision box 125, if the X coordinate at the top left corner is greater than the X coordinate at the bottom left corner, then it is determined that the triangular image area 82 is present. In such a situation, the box 126 is entered and the left edge X coordinate is calculated in the box 126. Thereafter, at the boxes 132, 133 the absolute value of the X coordinate of the spot is calculated. This value of the X coordinate is then sent to the computer 16, instead of the value of the X coordinate of the spot of light as detected by the sensor 14. In this regard, the formulas shown in the boxes 132 and 133 are used to scale the X coordinate of the spot of light on the screen 21 to correct geometrically for any distortions. It should be understood that this X coordinate correction is calculated on the fly for each spot of light detected by the image sensor 14.
Thus, it is to be understood that the initial storing of the X and Y coordinates of the four corners and the defined center is the only information that needs to be saved during the initialization process. Thereafter, the absolute values of the X and Y coordinates of the spot of light are calculated and supplied to the host computer 16 on the fly.
In order to determine the other left edge triangular area of distortion, indicated at 81 in FIGS. 2 and 4, the decision box 125 of FIG. 12 will determine that the top left X coordinate is not greater than the bottom left X coordinate, so that the box 126 is entered to calculate the left edge X coordinate. After so calculating, the boxes 132 and 133 are entered to determine a different X ABSOLUTE value based upon the triangular area 81 of distortion.
Referring again to box 123 where the X coordinate position of the light spot is determined to be left of center or not, if it is not left of center then the decision box 141 of FIG. 14 is entered. A determination will then be made as to whether the right edge of the rectangular image area is perfectly vertical, or whether one of the two triangular areas 83 and 84 at the right edge of the image area occurs. This operation is similar to the operation of distinguishing the left triangular areas 81 and 82.
At the decision box 141, there is a determination made as to whether the top right X coordinate is equal to the bottom right X coordinate. If they are equal, then box 145 is entered directly to determine if the right edge X coordinate is equal to the X coordinate of the top right corner.
However, if they are not equal, then the box 142 is entered to determine whether or not the top right X coordinate is greater than the bottom right X coordinate. If it is, then the right edge X coordinate is calculated by the formula shown in box 143. Thereafter, as indicated by boxes 151 and 152, the absolute value of the X coordinate is then calculated, knowing the right edge X coordinate, the X coordinate of the light spot and the X coordinate of the defined center. This absolute value of X is then supplied to the host computer 16.
Referring again to the decision box 142, if the top right X coordinate is not greater than the bottom right X coordinate, then the triangular area 83 (FIG. 2) is identified and the box 144 is entered to calculate the X coordinate of the right edge. Thereafter, calculations are made at boxes 151 and 152 to determine the absolute value of the X coordinate.
Once the absolute value of the X coordinate is determined from either one of the two triangular areas 83 or 84, the absolute value of the Y coordinate is then calculated.
In order to calculate the relative value of the Y coordinate of the light spot, the quadrant of the light spot is first determined at box 153. In this regard, at box 153 a determination is made as to whether or not the Y position of the spot is above the defined center. If it is, then the decision box 154 is entered to determine whether the top edge is horizontal, or whether one of the two triangular areas 85 or 86 as shown in FIG. 2 is present.
If a determination is made at box 154 that the top left Y coordinate is not equal to the top right Y coordinate, then at the decision box 155, a determination is made as to whether or not the top left Y coordinate is less than the top right Y coordinate. This decision will identify which one of the two top triangular areas 85 or 86 is present. If the decision is positive, then the triangular area 86 is identified and the calculation shown in box 156 is made to determine the top edge Y coordinate. Thereafter, as shown in boxes 163 and 164 of FIG. 16A, the absolute value of Y is then determined for supplying to the host computer 16. This absolute value of Y is a scaled value of the Y coordinate of the spot.
If the top triangular area 85 is present, a similar calculation is made at box 161 for the top edge Y coordinate of the triangular area 85. Thereafter, the absolute value of Y is then calculated at boxes 163 and 164.
Once the absolute value of the Y coordinate is calculated, a box 176 in FIG. 17 is entered to prepare and transmit both the absolute value of the X coordinate and the absolute value of the Y coordinate to the host computer, once both have been calculated. It should be noted that once these absolute values have been transmitted, the routine loops back to the initial box 123 to repeat the cycle of operation for the next light spot detected.
Referring again to box 153 of FIG. 15, if it is determined that the Y position of the spot is not above the center, then the decision box 165 of FIG. 16 is entered. A determination is then made as to whether or not the bottom left Y coordinate is equal to the bottom right Y coordinate. The purpose of this determination is to decide whether the bottom edge of the rectangular viewing area is horizontal, or distorted at a triangular area as indicated at either 87 or 88 in FIG. 2. If it is determined that the bottom edge is a true horizontal line, then the boxes 173 through 175 are entered to calculate the absolute value of Y as previously explained. On the other hand, if they are not equal, then the decision made in box 165 determines whether or not the bottom left Y coordinate is greater than the bottom right Y coordinate. If it is, then the box 171 is entered to perform a calculation to determine the bottom edge value of the Y coordinate. This calculation is based on the triangular area 88, since the bottom left Y coordinate is greater than the bottom right Y coordinate. If the reverse is true, then the calculation is made at box 172 based on the triangular area 87 to determine the bottom edge Y coordinate. Thereafter, the absolute value of the Y coordinate is calculated as previously described.
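The per-spot loop of FIGS. 12 through 17 can be approximated in a single Python sketch. This is our simplification, not the patented firmware: instead of branching quadrant by quadrant, it interpolates each boundary at the spot's position, which handles the perfectly aligned case and all eight triangular distortion areas uniformly. Ordinary pixel coordinates and the corner/argument names are assumptions.

```python
def correct_spot(spot, tl, tr, bl, br):
    """Convert a raw detected light-spot coordinate into corrected,
    normalized coordinates relative to the distorted image boundaries.

    tl, tr, bl, br are the (x, y) corner coordinates stored during the
    calibration pass; per the description, these corners (plus the
    defined center) are the only values that must be saved, after which
    each spot is corrected on the fly.
    """
    x, y = spot

    def edge_x(top, bottom):
        # X of a (possibly slanted) left/right boundary at the spot's height.
        if top[1] == bottom[1]:
            return top[0]
        t = (y - top[1]) / (bottom[1] - top[1])
        return top[0] + t * (bottom[0] - top[0])

    def edge_y(left, right):
        # Y of a (possibly slanted) top/bottom boundary at the spot's X.
        if left[0] == right[0]:
            return left[1]
        t = (x - left[0]) / (right[0] - left[0])
        return left[1] + t * (right[1] - left[1])

    x_left, x_right = edge_x(tl, bl), edge_x(tr, br)
    y_top, y_bottom = edge_y(tl, tr), edge_y(bl, br)
    # Scale the raw coordinate into [0, 1] between the corrected edges;
    # values outside [0, 1] mean the spot fell outside the screen image.
    x_abs = (x - x_left) / (x_right - x_left)
    y_abs = (y - y_top) / (y_bottom - y_top)
    return x_abs, y_abs
```

For a keystoned image with corners (2, 0), (8, 0), (0, 10), (10, 10), a spot at (5, 5) normalizes to the exact middle, which matches the intuition that the center of the distorted image maps to the center of the undistorted one.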
The following is a series of equations used to perform the various calculations, as illustrated in the flow charts, to scale the X and Y coordinates of the light spot.
1) FIRST DETERMINE THE X, Y POSITIONS OF EACH OF THE 4 CORNERS
2) NEXT TO FIND CENTER OF SCREEN, FIND INTERSECTION OF DIAGONAL LINES:
REFERRING TO FIG. 3, CALCULATE: ##EQU1##
POINT SLOPE FORM OF EQUATION OF LINE:
y - Y_BR = m_LR (x - X_BR)    (3)
y - Y_BL = m_RL (x - X_BL)    (4)
y - Y_BR = m_LR x - m_LR X_BR    (5)
y - Y_BL = m_RL x - m_RL X_BL    (6)
y - m_LR x = Y_BR - m_LR X_BR    (7)
y - m_RL x = Y_BL - m_RL X_BL    (8)
##EQU2## SUBSTITUTING INTO FIRST EQUATION:
y_C = m_LR X_C - m_LR X_BR + Y_BR    (12)
Y_C = m_LR X_C - m_LR X_BR + Y_BR    (13)
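The elided equations (9) through (11), marked ##EQU2## above, presumably subtract the two linear forms to solve for X_C before substituting back. Under that assumption (ours, since the equation images are not reproduced here), the algebra is:

```latex
% Subtracting equation (8) from equation (7) eliminates y:
(m_{RL} - m_{LR})\,x = Y_{BR} - Y_{BL} - m_{LR}X_{BR} + m_{RL}X_{BL}
% so the X coordinate of the defined center is
X_C = \frac{Y_{BL} - Y_{BR} + m_{LR}X_{BR} - m_{RL}X_{BL}}{m_{LR} - m_{RL}}
```

Substituting this X_C into equation (3) then gives the Y_C of equations (12) and (13).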
3) DETERMINE IN WHICH QUADRANT, THE SPOT APPEARS BY APPLYING VERTICAL & HORIZONTAL CENTER LINES THROUGH THE CALCULATED CENTER.
4) COMPUTE THE ADJUSTED EDGE FOR BOTH X & Y IN THE FOUND QUADRANT. ##EQU3##
SIDE OPP (ADJUSTMENT) = (DISTANCE)(TAN α), WHERE DISTANCE EQUALS SIDE ADJ.    (15)
REFERRING TO FIG. 4, FOR A LEFT EDGE OF ##EQU4##
X_RELATIVE = X_RAW - X_HOME (NOTE: IF X_REL IS NEGATIVE, THEN POSITION IS OUTSIDE OF SCREEN.)    (17) ##EQU5##
REFERRING TO FIG. 5, FOR A LEFT EDGE OF ##EQU6##
X_REL = X_RAW - X_HOME (NOTE: IF X_REL IS NEGATIVE, THEN POSITION IS OUTSIDE OF SCREEN.)    (20) ##EQU7##
REFERRING TO FIG. 6, FOR THE RIGHT EDGE OF ##EQU8##
X_REL = X_RAW - X_C (NOTE: IF X_RAW > X_RIGHT, THEN OUTSIDE OF SCREEN.)    (23) ##EQU9##
REFERRING TO FIG. 7, FOR A RIGHT EDGE OF ##EQU10##
X_REL = X_RAW - X_C (NOTE: IF X_RAW > X_RIGHT, THEN OUTSIDE OF SCREEN.)    (26) ##EQU11##
REFERRING TO FIG. 8, FOR A TOP EDGE OF ##EQU12##
Y_REL = Y_RAW - Y_HOME (NOTE: IF Y_RAW < Y_HOME, THEN OUTSIDE OF SCREEN.)    (29) ##EQU13##
REFERRING TO FIG. 9, FOR A TOP EDGE OF ##EQU14##
Y_REL = Y_RAW - Y_HOME (NOTE: IF Y_RAW < Y_HOME, THEN OUTSIDE OF SCREEN.)    (32) ##EQU15##
REFERRING TO FIG. 10, FOR A BOTTOM EDGE OF ##EQU16##
Y_REL = Y_RAW - Y_C (NOTE: IF Y_RAW > Y_BOTTOM, THEN OUTSIDE OF SCREEN.)    (35) ##EQU17##
REFERRING TO FIG. 11, FOR A BOTTOM EDGE OF ##EQU18##
Y_REL = Y_RAW - Y_C (NOTE: IF Y_RAW > Y_BOTTOM, THEN OUTSIDE OF SCREEN.)    (38) ##EQU19##
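Equation (15) relates the edge adjustment to the tilt angle α of a slanted boundary via similar triangles: the opposite side (the horizontal adjustment) equals the adjacent side (the distance along the edge's long direction) times tan α. One consistent reading for a slanted left edge, with image Y assumed to increase downward (the figures themselves are elided, so this is our reconstruction, not the patent's exact formula):

```latex
% Tilt of the left edge between its two corners:
\tan\alpha = \frac{X_{BL} - X_{TL}}{Y_{BL} - Y_{TL}}
% Adjusted left-edge X at the height of the detected spot:
X_{LEFT}(Y_{SPOT}) = X_{TL} + \left(Y_{SPOT} - Y_{TL}\right)\tan\alpha
```

The interpolated X_LEFT would then play the role of X_HOME in equation (17), so that X_REL = X_RAW - X_LEFT is negative exactly when the spot falls outside the screen image.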
While particular embodiments of the present invention have been disclosed, it is to be understood that various different modifications are possible and are contemplated within the true spirit and scope of the appended claims. There is no intention, therefore, of limitations to the exact abstract or disclosure herein presented.

Claims (18)

What is claimed is:
1. A coordinate correction apparatus comprising:
image detection means for perceiving visually a displayed generally rectangularly shaped projected image having keystone distortion;
hand held auxiliary light means for successively illuminating the corners of the distorted image with spots of high intensity auxiliary control light for facilitating defining the geometric form of the keystone distortion in the projected image;
said image detection means responsive to said spots of high intensity light only during a calibration mode for generating a signal indicative of the location of the high intensity spots;
signal processing means responsive to said signal during said calibration mode for determining the geometric center of the geometric form of the keystone distortion defined by the intersection coordinate values of a pair of imaginary lines interconnecting the detected spots of high intensity light;
distortion orientation calculating means responsive to the determined geometric center for determining quadrant scaling factors to facilitate determining corrected coordinate values for detected spots of high intensity auxiliary control light during a normal mode of operation;
said signal processing means storing the scaling factors for use during said normal mode of operation to enable said signal processing means to convert subsequently a determined auxiliary light sensing coordinate value relative to another computer generated projected image having substantially the same keystone distortion as said first mentioned projected image;
said signal processing means using said stored scaling factors for converting the determined auxiliary light sensory coordinate values to absolute coordinate values, said absolute coordinate values being indicative of corresponding coordinate value information defining a fixed location within said another computer generated image.
2. An apparatus according to claim 1, further comprising:
means for determining the corner x,y coordinate values for each respective corner of the displayed image; and
means for determining the x,y coordinate values for a defined center within the displayed image area, wherein said image area is bounded by the four respective edges of the displayed image.
3. An apparatus according to claim 2, wherein said means for determining the x,y coordinate value for said defined center includes point slope determining means.
4. An apparatus according to claim 3, wherein said point slope determining means includes algorithm means.
5. An apparatus according to claim 4, wherein said algorithm means includes means for solving an equation: ##EQU20## wherein Y_TL is indicative of a determined Y top left coordinate value;
wherein Y_BR is indicative of a determined Y bottom right coordinate value;
wherein X_TL is indicative of a determined X top left coordinate value; and
wherein X_BR is indicative of a determined X bottom right coordinate value.
6. An apparatus according to claim 5, wherein said algorithm means includes means for solving a pair of equations: ##EQU21##
7. An apparatus according to claim 1 further comprising: means for determining the x,y coordinate locations of each respective corner of the displayed image; and
means for determining a defined center coordinate location for the displayed image.
8. An apparatus according to claim 1, wherein said distortion orientation calculating means includes:
means for calculating an adjusted edge for both X and Y stored determined correction coordinate location values in the quadrant of the determined geometric center; and
means for calculating an absolute x value and an absolute y value.
9. An apparatus according to claim 8, wherein said absolute y value is: ##EQU22##
10. An apparatus according to claim 9, wherein said Y_relative is:
Y_relative = Y_SPOT - Y_C.
11. An apparatus according to claim 10, wherein said Y_SPOT is indicative of the y coordinate value of the projected x,y coordinate image of light.
12. An apparatus according to claim 8, wherein said absolute x value is ##EQU23##
13. An apparatus according to claim 12, wherein said X_relative is:
X_relative = X_SPOT - X_left edge.
14. An apparatus according to claim 13, wherein said X_SPOT is indicative of the x coordinate value of the projected x,y coordinate image of light.
15. An apparatus according to claim 14, wherein said X left edge is XTL.
16. An apparatus according to claim 13, wherein said X left edge is XTL.
17. An apparatus according to claim 1, further comprising:
said image detection means responsive to the projected image in another calibration mode for perceiving visually the distorted image and for generating a video signal indicative of the projected image;
said signal processing means further being responsive to said image detection means for determining image detection coordinate values for the distorted image, said distorted image having four determined corner coordinate values and associated right, left, top and bottom boundaries;
left boundary distortion determination means for calculating whether or not the left boundary is aligned vertically relative to its associated determined corner coordinate values;
right boundary distortion determination means for calculating whether or not the right boundary is aligned vertically relative to its associated determined coordinate values;
top boundary distortion determination means for calculating whether or not the top boundary is aligned horizontally relative to its associated determined corner coordinate values;
bottom boundary distortion determination means for calculating whether or not the bottom boundary is aligned horizontally relative to its associated determined corner coordinate values;
left distortion orientation means responsive to said left boundary distortion determination means for determining whether determined left boundary distortion is rectangular distortion or triangular distortion;
left triangular distortion correction means for calculating a set of corrected image detection coordinate values for the left edge of the distorted image;
left rectangular distortion correction means for calculating another set of corrected image detection coordinate values for the left edge of the distorted image;
absolute correction means responsive to said left triangular distortion correction means and said left rectangular distortion correction means for converting one set of the left edge corrected image detection coordinate values to absolute computer coordinate values;
right distortion orientation means responsive to said right boundary distortion determination means for determining whether determined right boundary distortion is rectangular distortion or triangular distortion;
right triangular distortion correction means for calculating a set of corrected image detection coordinate values for the right edge of the distorted image;
right rectangular distortion correction means for calculating another set of corrected image detection coordinate values for the right edge of the distorted image;
absolute correction means responsive to said right triangular distortion correction means and said right rectangular distortion correction means for converting one set of the right edge corrected image detection coordinate values to absolute computer coordinate values;
top distortion orientation means responsive to said top boundary distortion determination means for determining whether determined top boundary distortion is rectangular distortion or triangular distortion;
top triangular distortion correction means for calculating a set of corrected image detection coordinate values for the top edge of the distorted image;
top rectangular distortion correction means for calculating another set of corrected image detection coordinate values for the top edge of the distorted image;
absolute correction means responsive to said top triangular distortion correction means and said top rectangular distortion correction means for converting one set of the top edge corrected image detection coordinate values to absolute computer coordinate values;
bottom distortion orientation means responsive to said bottom boundary distortion determination means for determining whether determined bottom boundary distortion is rectangular distortion or triangular distortion;
bottom triangular distortion correction means for calculating a set of corrected image detection coordinate values for the bottom edge of the distorted image;
bottom rectangular distortion correction means for calculating another set of corrected image detection coordinate values for the bottom edge of the distorted image;
absolute correction means responsive to said bottom triangular distortion correction means and said bottom rectangular distortion correction means for converting one set of the bottom edge corrected image detection coordinate values to absolute computer coordinate values; and
whereby corrected coordinate values for the right, left, top and bottom boundaries are calculated to substantially eliminate errors in input coordinate information resulting from keystone image distortion in the projected image.
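The boundary tests recited in this claim reduce to comparing coordinates of adjacent detected corners, and the triangular correction reduces to interpolating along a slanted edge. The following is a minimal Python sketch of that idea; the function names, the pixel tolerance, and the interpolation formula are illustrative assumptions, not details disclosed in the claim:

```python
# Hypothetical sketch of the per-boundary keystone test described above.
# A left or right boundary should be vertical (equal x at both corners);
# a top or bottom boundary should be horizontal (equal y at both corners).

def boundary_distortion(corner_a, corner_b, axis=0, tol=0.5):
    """Return the signed misalignment of a boundary in pixels.

    axis=0 compares x-coordinates (left/right boundaries),
    axis=1 compares y-coordinates (top/bottom boundaries).
    Returns 0.0 when the boundary is aligned within `tol` pixels.
    """
    delta = corner_a[axis] - corner_b[axis]
    return 0.0 if abs(delta) <= tol else delta

def corrected_edge_coordinate(t, corner_a, corner_b, axis=0):
    """Linearly interpolate the detected edge position at fraction t
    in [0, 1] between two corners -- a minimal model of correcting a
    slanted (triangularly distorted) boundary."""
    return corner_a[axis] + t * (corner_b[axis] - corner_a[axis])
```

A nonzero return from `boundary_distortion` corresponds to the claim's finding that a boundary is not aligned with its corners, after which the sign and the opposite boundary's result would distinguish rectangular from triangular distortion.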
18. An apparatus for substantially eliminating errors in optical input information resulting from image distortion in a projected image, comprising:
means for determining image detection coordinate values for any detected substantially flat generally rectangularly shaped projected image having possible edge distortion, said distortion being a distortion caused by a height differential between the position of the projected image on a viewing surface relative to the position of image projection means producing the projected image;
said determined image detection coordinate values including four determined corner coordinate values;
boundary distortion determination means for calculating whether or not any boundary between a pair of adjacent determined corner coordinate values is aligned rectilinearly relative to its associated determined corner coordinate values;
distortion orientation means responsive to said boundary distortion determination means for determining the orientation of the distortion relative to an imaginary rectilinear line forming part of an imaginary rectilinearly shaped image;
distortion correction means for calculating a set of corrected image detection coordinate values to eliminate errors in coordinate information resulting from edge distortion in the projected image; and
absolute correction means responsive to said distortion correction means for converting said set of corrected image detection coordinate values to absolute coordinate values, said absolute coordinate values being indicative of corresponding coordinate value information generated by said image projection means.
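The final step of claim 18, converting corrected detection coordinates to absolute computer coordinate values, can be illustrated with a simple trapezoid (keystone) model: find the point's vertical fraction between the top and bottom edges, interpolate the left and right edge positions at that height, and scale the resulting fractions to the computer's resolution. The function name, the 640x480 resolution, and this particular interpolation scheme are assumptions for the sketch, not details disclosed in the patent:

```python
# Hypothetical sketch of 'absolute correction': mapping a detected point
# inside a keystone-distorted quadrilateral to absolute screen coordinates.

def detection_to_absolute(point, tl, tr, br, bl, screen_w=640, screen_h=480):
    """Map a detected point inside the quadrilateral (tl, tr, br, bl)
    to absolute computer coordinates, assuming roughly horizontal top
    and bottom edges (the height-differential keystone case)."""
    x, y = point
    top_y = (tl[1] + tr[1]) / 2.0
    bot_y = (bl[1] + br[1]) / 2.0
    v = (y - top_y) / (bot_y - top_y)        # vertical fraction, 0..1
    left_x = tl[0] + v * (bl[0] - tl[0])     # left edge x at this height
    right_x = tr[0] + v * (br[0] - tr[0])    # right edge x at this height
    u = (x - left_x) / (right_x - left_x)    # horizontal fraction, 0..1
    return (u * (screen_w - 1), v * (screen_h - 1))
```

For a trapezoid whose top edge is narrower than its bottom edge, this maps the trapezoid's corners to the screen's corners and interior points proportionally, which is the effect the claim's correction means is directed at.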
US08/648,659 | 1989-11-07 | 1996-05-15 | Method and apparatus for calibrating geometrically an optical computer input system | Expired - Fee Related | US5933132A (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
US08/648,659 | US5933132A (en) | 1989-11-07 | 1996-05-15 | Method and apparatus for calibrating geometrically an optical computer input system

Applications Claiming Priority (6)

Application Number | Priority Date | Filing Date | Title
US43302989A | 1989-11-07 | 1989-11-07
US07/611,416 | US5181015A (en) | 1989-11-07 | 1990-11-09 | Method and apparatus for calibrating an optical computer input system
US65680391A | 1991-02-14 | 1991-02-14
US11552293A | 1993-08-31 | 1993-08-31
US34290594A | 1994-11-21 | 1994-11-21
US08/648,659 | US5933132A (en) | 1989-11-07 | 1996-05-15 | Method and apparatus for calibrating geometrically an optical computer input system

Related Parent Applications (2)

Application Number | Title | Priority Date | Filing Date
US07/611,416 | Continuation-In-Part | US5181015A (en) | 1989-11-07 | 1990-11-09 | Method and apparatus for calibrating an optical computer input system
US34290594A | Continuation | 1989-11-07 | 1994-11-21

Publications (1)

Publication Number | Publication Date
US5933132A (en) | 1999-08-03

Family

ID=27537422

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
US08/648,659 | Expired - Fee Related | US5933132A (en) | 1989-11-07 | 1996-05-15 | Method and apparatus for calibrating geometrically an optical computer input system

Country Status (1)

Country | Link
US (1) | US5933132A (en)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US4857998A (en)* | 1987-02-26 | 1989-08-15 | Matsushita Electric Industrial Co., Ltd. | Automatic primary color convergence alignment system for projection television
US5070465A (en)* | 1987-02-25 | 1991-12-03 | Sony Corporation | Video image transforming method and apparatus
US5091773A (en)* | 1989-10-03 | 1992-02-25 | Thomson-Csf | Process and device for image display with automatic defect correction by feedback


Cited By (50)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US6747636B2 (en)* | 1991-10-21 | 2004-06-08 | Smart Technologies, Inc. | Projection display and system with pressure sensing at screen, and computer assisted alignment implemented by applying pressure at displayed calibration marks
US7289113B2 (en) | 1991-10-21 | 2007-10-30 | Smart Technologies Inc. | Projection display system with pressure sensing at screen, and computer assisted alignment implemented by applying pressure at displayed calibration marks
US20080042999A1 (en)* | 1991-10-21 | 2008-02-21 | Martin David A | Projection display system with pressure sensing at a screen, a calibration system corrects for non-orthogonal projection errors
US7626577B2 (en) | 1991-10-21 | 2009-12-01 | Smart Technologies Ulc | Projection display system with pressure sensing at a screen, a calibration system corrects for non-orthogonal projection errors
US6753907B1 (en)* | 1999-12-23 | 2004-06-22 | Justsystem Corporation | Method and apparatus for automatic keystone correction
US20010010522A1 (en)* | 2000-01-27 | 2001-08-02 | Mitsubishi Denki Kabushiki Kaisha | Three-dimensional graphic processing device for drawing polygon having vertex data defined by relative value and method therefor
US6788299B2 (en)* | 2000-01-27 | 2004-09-07 | Renesas Technology Corp. | Three-dimensional graphic processing device for drawing polygon having vertex data defined by relative value and method therefor
EP1363238A3 (en)* | 2000-03-17 | 2008-07-30 | Sun Microsystems, Inc. | A graphics system having a super-sampled sample buffer with hot spot correction, edge blending, edge matching, distortion correction and chromatic distortion compensation
EP1363239A3 (en)* | 2000-03-17 | 2008-07-30 | Sun Microsystems, Inc. | A graphics system having a super-sampled sample buffer with hot spot correction, edge blending, edge matching, distortion correction and chromatic distortion compensation
US6771272B2 (en) | 2000-03-17 | 2004-08-03 | Sun Microsystems, Inc. | Graphics system having a super-sampled sample buffer with hot spot correction
WO2001071665A3 (en)* | 2000-03-17 | 2003-01-16 | Sun Microsystems Inc | A graphics system having a super-sampled sample buffer with hot spot correction, edge blending, edge matching, distortion correction, and chromatic distortion compensation
US7695143B2 (en) | 2000-03-18 | 2010-04-13 | Seiko Epson Corporation | Image processing system, projector, computer-readable medium, and image processing method
EP1137293A3 (en)* | 2000-03-21 | 2005-01-05 | Olympus Corporation | Stereoscopic image projection device
US20010024231A1 (en)* | 2000-03-21 | 2001-09-27 | Olympus Optical Co., Ltd. | Stereoscopic image projection device, and correction amount computing device thereof
US7084857B2 (en) | 2000-05-29 | 2006-08-01 | Vkb Inc. | Virtual data entry device and method for input of alphanumeric and other data
US7305368B2 (en) | 2000-05-29 | 2007-12-04 | Vkb Inc. | Virtual data entry device and method for input of alphanumeric and other data
US20060101349A1 (en)* | 2000-05-29 | 2006-05-11 | Klony Lieberman | Virtual data entry device and method for input of alphanumeric and other data
US6966775B1 (en) | 2000-06-09 | 2005-11-22 | Beamhit, Llc | Firearm laser training system and method facilitating firearm training with various targets and visual feedback of simulated projectile impact locations
US6616452B2 (en) | 2000-06-09 | 2003-09-09 | Beamhit, Llc | Firearm laser training system and method facilitating firearm training with various targets and visual feedback of simulated projectile impact locations
US6704000B2 (en)* | 2000-11-15 | 2004-03-09 | Blue Iris Technologies | Method for remote computer operation via a wireless optical device
US6690354B2 (en)* | 2000-11-19 | 2004-02-10 | Canesta, Inc. | Method for enhancing performance in a system utilizing an array of sensors that sense at least two-dimensions
US6600478B2 (en)* | 2001-01-04 | 2003-07-29 | International Business Machines Corporation | Hand held light actuated point and click device
US6947610B2 (en)* | 2001-03-02 | 2005-09-20 | Ulead Systems Inc. | Method of correcting an image with perspective distortion and producing an artificial image with perspective distortion
US20030002751A1 (en)* | 2001-03-02 | 2003-01-02 | Hung-Ming Sun | Method of correcting an image with perspective distortion and producing an artificial image with perspective distortion
US7329127B2 (en) | 2001-06-08 | 2008-02-12 | L-3 Communications Corporation | Firearm laser training system and method facilitating firearm training for extended range targets with feedback of firearm control
US20020197584A1 (en)* | 2001-06-08 | 2002-12-26 | Tansel Kendir | Firearm laser training system and method facilitating firearm training for extended range targets with feedback of firearm control
US20070206159A1 (en)* | 2002-04-08 | 2007-09-06 | Nec Viewtechnology, Ltd. | Method for correcting for distortion of projected image, program for correcting image distortion, and projection-type image display device
US7755706B2 (en)* | 2002-04-08 | 2010-07-13 | Nec Display Solutions, Ltd. | Method for correcting for distortion of projected image, program for correcting image distortion, and projection-type image display device
US20030210381A1 (en)* | 2002-05-10 | 2003-11-13 | Nec Viewtechnology, Ltd. | Method of correcting for distortion of projected image, distortion correcting program used in same method, and projection-type image display device
US7427983B1 (en) | 2002-06-02 | 2008-09-23 | Steelcase Development Corporation | Visual communication system
US8179382B2 (en) | 2003-05-30 | 2012-05-15 | Steelcase Development Corporation | Visual communication system
US20080297595A1 (en)* | 2003-05-30 | 2008-12-04 | Hildebrandt Peter W | Visual communication system
US20050041216A1 (en)* | 2003-07-02 | 2005-02-24 | Seiko Epson Corporation | Image processing system, projector, program, information storage medium, and image processing method
US20080291402A1 (en)* | 2003-07-02 | 2008-11-27 | Seiko Epson Corporation | Image processing system, projector, computer-readable medium, and image processing method
EP1494486A3 (en)* | 2003-07-02 | 2006-03-22 | Seiko Epson Corporation | Image processing system, projector, information storage medium, and image processing method
US7419268B2 (en) | 2003-07-02 | 2008-09-02 | Seiko Epson Corporation | Image processing system, projector, and image processing method
CN100422913C (en)* | 2004-06-28 | 2008-10-01 | 微光科技股份有限公司 | Array type photoreceptor index system and method thereof
US8243015B2 (en) | 2005-02-24 | 2012-08-14 | Vkb Inc. | Virtual data entry device
US20060187199A1 (en)* | 2005-02-24 | 2006-08-24 | Vkb Inc. | System and method for projection
US20060187198A1 (en)* | 2005-02-24 | 2006-08-24 | Vkb Inc. | Input device
US8933883B2 (en)* | 2006-02-15 | 2015-01-13 | Pixart Imaging, Inc. | Light-pointing device and light-tracking receiver having a function selection key and system using the same
US20070188447A1 (en)* | 2006-02-15 | 2007-08-16 | Pixart Imaging, Inc. | Light-pointing device and light-tracking receiver having a function selection key and system using the same
US20080018591A1 (en)* | 2006-07-20 | 2008-01-24 | Arkady Pittel | User Interfacing
US20110191690A1 (en)* | 2010-02-03 | 2011-08-04 | Microsoft Corporation | Combined Surface User Interface
US9110495B2 (en)* | 2010-02-03 | 2015-08-18 | Microsoft Technology Licensing, Llc | Combined surface user interface
US10452203B2 (en) | 2010-02-03 | 2019-10-22 | Microsoft Technology Licensing, Llc | Combined surface user interface
US20150062089A1 (en)* | 2013-05-09 | 2015-03-05 | Stephen Howard | System and method for motion detection and interpretation
US9465488B2 (en)* | 2013-05-09 | 2016-10-11 | Stephen Howard | System and method for motion detection and interpretation
US10891003B2 (en) | 2013-05-09 | 2021-01-12 | Omni Consumer Products, Llc | System, method, and apparatus for an interactive container
US12120471B2 (en) | 2014-12-30 | 2024-10-15 | Omni Consumer Products, Llc | System and method for interactive projection

Similar Documents

Publication | Title
US5933132A (en) | Method and apparatus for calibrating geometrically an optical computer input system
US6520647B2 (en) | Automatic keystone correction for projectors with arbitrary orientation
EP1519575B1 (en) | Image processing system, projector, information storage medium, and image processing method
US5835241A (en) | Method for determining the profile of a bound document with structured light
US7342572B2 (en) | System and method for transforming an ordinary computer monitor into a touch screen
US7055958B2 (en) | Image projection method and device
US7226173B2 (en) | Projector with a plurality of cameras
US7014323B2 (en) | Image processing system, projector, program, information storage medium and image processing method
US5581637A (en) | System for registering component image tiles in a camera-based scanner device transcribing scene images
US7019713B2 (en) | Methods and measurement engine for aligning multi-projector display systems
US6292171B1 (en) | Method and apparatus for calibrating a computer-generated projected image
US8251524B2 (en) | Projection display apparatus and display method
US7303285B2 (en) | Projector and method of projecting projection image
US5528290A (en) | Device for transcribing images on a board using a camera based board scanner
US4999703A (en) | Automatic image correction method and apparatus for projectors utilizing cathode ray tubes
CN101656857B (en) | Projection display device and display method
US20070091334A1 (en) | Method of calculating correction data for correcting display characteristic, program for calculating correction data for correcting display characteristic and apparatus for calculating correction data for correcting display characteristic
WO1992015084A1 (en) | Method and apparatus for calibrating geometrically an optical computer input system
US20170308242A1 (en) | Projection alignment
WO2025145859A1 (en) | Projection method and apparatus for automatically adjusting display scale, and device and storage medium
JP4363152B2 (en) | Captured image projection device, image processing method and program for captured image projection device
JPH06281421A (en) | Image processing method
CN115514944A (en) | Intelligent household projection angle correction system
CN119273774A (en) | Scanner calibration method, device, electronic device and computer readable storage medium
CN118196206A (en) | Active binocular camera

Legal Events

Date | Code | Title | Description

REMI | Maintenance fee reminder mailed

FPAY | Fee payment
Year of fee payment: 4

SULP | Surcharge for late payment

AS | Assignment
Owner name: INFOCUS CORPORATION, OREGON
Free format text: MERGER;ASSIGNOR:PROXIMA CORPORATION, A DELAWARE CORPORATION;REEL/FRAME:014484/0549
Effective date: 20030226

AS | Assignment
Owner name: INFOCUS CORPORATION, OREGON
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HAUCK, LANE T.;REEL/FRAME:017097/0177
Effective date: 20051003

AS | Assignment
Owner name: STRAIGHT SIGNALS LLC, NEVADA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:INFOCUS CORPORATION;REEL/FRAME:017759/0418
Effective date: 20050629

AS | Assignment
Owner name: PROXIMA CORPORATION, CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MARSHALL, ROGER N.;HAUCK, LANE T.;SHAPIRO, LEONID;AND OTHERS;REEL/FRAME:018576/0673
Effective date: 19910410

FPAY | Fee payment
Year of fee payment: 8

FEPP | Fee payment procedure
Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

REMI | Maintenance fee reminder mailed

LAPS | Lapse for failure to pay maintenance fees

STCH | Information on status: patent discontinuation
Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP | Lapsed due to failure to pay maintenance fee
Effective date: 20110803

