TECHNICAL FIELD

This description relates to optical tracking techniques.
BACKGROUND

Tracking and/or pointing applications allow users to interact with computers and other devices in a fast, easy, and intuitive manner. An example of a tracking application is the well-known computer mouse, which allows users, for example, to control movement of a cursor or other icon within the context of a monitor or other display. Other tracking applications include touchpads that track a movement of a finger or other pointing device across a pressure-sensitive surface.
Optical tracking systems generally rely on some type of emission, reflection, and/or detection of light that is translated, for example, into movement of a cursor or other icon within the context of a monitor or other display.
SUMMARY

Examples of optical tracking systems are described in which optical components (e.g., image sensors) detect light within a substantially planar region adjacent to a user device. Tracking logic may receive signals output by the optical components and determine coordinates associated with a surface-independent movement of a pointing object through the substantially planar region. For example, the pointing object may be moved through an open space adjacent to the device, without contact of the pointing object on a physical surface. The tracking logic may then provide for translation of the coordinates into an action on a display, such as, for example, a movement of a cursor or other icon on the display.
For example, a row of pixels of a 1-dimensional image sensor (or a designated row of pixels among a plurality of rows of pixels, e.g., in a 2-dimensional image sensor) may be used to detect the movement of the pointing object. Since 1-dimensional image sensors may have a limited field of view, corresponding, for example, to such a single row of pixels within the image sensor(s), pixels from such an image sensor may be effectively limited to detecting light within the substantially planar region and within a vicinity of the device. Then, the movement of the pointing object within the substantially planar region may be characterized using pixel values corresponding to light reflected from the pointing object within the substantially planar region, as the pointing object is moved through the substantially planar region.
In one example, two image sensors are used that are each disposed at least partially within the substantially planar region, so that the substantially planar region includes at least a part of each of the image sensors and at least a part of the pointing object. In this example, both image sensors detect the part of the pointing object within the substantially planar region, and triangulation calculations may be performed to determine x, y coordinates associated with the movement of the pointing object. In another example, only one image sensor is used, and x, y coordinates associated with the movement of the pointing object may be determined based on an apparent size of the part of the pointing object in the substantially planar region, relative to reference size information (e.g., a known diameter) of the part of the pointing object.
Further, additional optical sensing may be provided by virtue of a secondary substantially planar region in parallel with the substantially planar region (e.g., by using one or more additional image sensors to detect light from the secondary substantially planar region). Then, by tracking movement in the secondary substantially planar region (e.g., using the same techniques as just described), additional information may be obtained for controlling an action on a display. For example, a tilt of a finger that intersects both the substantially planar region and the secondary substantially planar region may be detected and translated into a desired action with respect to the display, such as, for example, an up-or-down scrolling through a text screen.
This Summary is provided to introduce selected concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a block diagram of a system for performing optical tracking.
FIG. 2 is a diagram of an example implementation of the optical tracking system of FIG. 1.
FIG. 3 is a flowchart illustrating a process of the system(s) of FIGS. 1 and/or 2.
FIG. 4A is a block diagram of an alternate implementation of the optical tracking system of FIG. 1.
FIG. 4B is a side view of the optical tracking system of FIG. 4A.
FIG. 5 is a block diagram of a partial example implementation of the optical tracking system of FIGS. 4A and 4B.
FIG. 6 is a flowchart illustrating a process of the systems of FIGS. 4A, 4B, and 5.
FIGS. 7A, 7B, 7C, and 7D illustrate example implementations of the systems of one or more of FIGS. 1-6.
DETAILED DESCRIPTION

FIG. 1 is a block diagram of a system 100 for performing optical tracking. In the example of FIG. 1, a user device 102 is illustrated that includes an optical tracking system 104. The optical tracking system 104 is operable to detect light from a substantially planar region 106. For example, the optical tracking system 104 may detect light reflected from a pointing object 108 (illustrated as a finger in the example of FIG. 1), so as to detect movement of the pointing object 108 through the substantially planar region 106. Then, the optical tracking system 104 may determine coordinates describing the movement of the pointing object 108 within the two dimensions (i.e., in an x and/or y direction) of the substantially planar region 106, and provide for translation of the coordinates into movement of a cursor 110 or other icon on a display 112.
In the example of FIG. 1, the user device 102 may represent virtually any type of device that may be operated by a user (i.e., the user providing and moving the pointing object 108). For example, the user device 102 may include one or more of a keyboard, a mouse, a wireless communications device, a personal digital assistant, a desktop computer, a tablet personal computer, a cell phone, a gaming device, and/or a laptop computer. Further, although the display 112 is illustrated separately in the example of FIG. 1, it should be understood that the user device 102 also may include, or may be associated with, a monitor or other display.
The optical tracking system 104 is operable to detect light from the substantially planar region 106 by, for example, effectively limiting a viewing field in which light is detected. For example, the optical tracking system 104 may provide only a limited number or distribution of light-sensitive pixels. As another example, the optical tracking system 104 may provide a larger number or distribution of light-sensitive pixels, and then discard information from all but specified ones of the pixels that correspond to the substantially planar region 106.
Accordingly, the substantially planar region 106 may be understood to be included in a defined viewing field of the optical tracking system 104 (e.g., defined by appropriate provision, selection and/or activation of corresponding pixels). That is, as long as the pointing object 108 is moved within the viewing field of the optical tracking system 104 and within a certain distance of the user device 102, then light reflected from the pointing object 108 may be detected and analyzed with respect to the substantially planar region 106, for purposes of control of the cursor 110. In this regard, the distance within which light reflected from the pointing object 108 is detected for purposes of control of the cursor 110 may be determined or designated by various techniques (as discussed below, for example, with respect to FIG. 2). Generally, however, it should be understood that the user may effectively determine this distance in practice, simply by noticing a distance at which an accuracy of control of the cursor 110 begins to suffer, and then staying comfortably within this distance during operation of the optical tracking system.
Based on the above description, it should be understood that designation of the substantially planar region 106 as such is not intended to imply the mathematical definition of a plane as having infinite extent and no thickness. Rather, the substantially planar region 106 represents a generally flat or level shape or surface within a space adjacent to the user device 102, that, as just described, may be specified by appropriate provision, selection, and/or activation of pixels of the optical tracking system 104. Therefore, the substantially planar region 106 does not necessarily represent, and is not limited to, a literal two-dimensional surface or space, but, rather, provides an effective two-dimensional space for purposes of control of the cursor 110.
The more the substantially planar region 106 is (or can be) limited in thickness (e.g., by appropriate sensor/pixel selection), the less opportunity may exist for errors or inaccuracies in determining the movement of the pointing object 108. For example, when the pointing object 108 includes a finger, as in the example of FIG. 1, an increased thickness of the substantially planar region 106 may result in inaccuracies resulting from surface inconsistencies in the finger through the substantially planar region 106, as detected by the optical tracking system 104.
Although the pointing object 108 is illustrated in the example of FIG. 1 as a finger, it should be understood that virtually any type of pointing object may be used that is operable to provide a sufficient level of reflection of light for detection by the optical tracking system 104. For example, a stylus or pen may be used, where the stylus or pen may have a defined shape (e.g., round or square). In some implementations, reflecting material may be added to, or incorporated into, the pointing object 108, to increase an ease of detection by the optical tracking system 104. In other implementations, a light source (e.g., a light-emitting diode (LED)) may be included on the pointing object 108, in order to increase an amount of light detected by the optical tracking system 104.
The cursor 110 is used to represent an example of a traditional type of cursor or other icon that may be controlled on the display 112 to obtain a desired action and/or result. For example, virtually any cursor control action of the cursor 110 that may be obtained with a conventional mouse or touch-sensitive tracking surface may generally be provided on the display 112 by the optical tracking system 104, using one or more of the techniques described below with respect to FIGS. 2-6. For example, movement of the cursor 110 to a desired portion of the display 112 may be performed, or selection of a particular file, document, or action that is designated on the display 112 may be performed. As a further example, a drawing function may be performed, in which movement of the cursor 110 provides a line drawing or similar effect on the display 112. Also, specialized actions may be provided, including, for example, photo-editing functionality, web-browsing functionality, or gaming functionality.
The display 112 may be virtually any display that may be used with the user device 102. For example, the display 112 may be integrated with the user device 102 (such as with a laptop computer, personal digital assistant, or mobile telephone), or may be separate from the user device 102 and in (wired or wireless) communication therewith (such as a monitor associated with a desktop computer, or with a television).
Further in FIG. 1, an optional surface 114 is shown in order to illustrate a capability of the optical tracking system 104 to detect surface-independent movements of the pointing object 108. For example, in a case where the user device includes a keyboard, the surface 114 may represent a desk on which the keyboard rests. A user may control the cursor 110 simply by moving his or her finger (pointing object 108) within the substantially planar region 106. If the substantially planar region 106 is over the surface 114 (e.g., desk), then the user may trace his or her finger along the desk and within the substantially planar region 106; however, it should be understood that operation of the optical tracking system 104 is not dependent on such contact between the finger and the desk to perform accurate optical tracking.
For example, if the keyboard (user device 102) rests at the edge of a desk or other surface, then there may be no surface under the substantially planar region 106, and the pointing object 108 may be moved in free and open space. As long as at least a part of the pointing object 108 moves within the substantially planar region 106, then the desired action on the display 112 may be obtained.
Continuing the example of a keyboard, it may be the case that the user device 102 is a keyboard intended for use with television and/or media center systems (e.g., media centers that allow users to access computer files by way of a television). Such a keyboard may thus be primarily intended for use in a living room or other non-traditional space for operating a keyboard and/or controlling a display, where a desktop may not be practical or available. In these cases, the substantially planar region 106 may be provided adjacent to the keyboard (e.g., vertically from a top surface of the keyboard), so that movements of the pointing object 108 within a free space included in the substantially planar region 106 may be tracked without reference to, dependence on, or touching of, a physical surface such as the surface 114.
Similarly, in other examples, the user device 102 may include a wireless communications device and/or a gaming device. Such devices, and similar devices, may be frequently used while being held in a hand of a user. In these cases, movement of the pointing object 108 may occur within the substantially planar region 106 in an open space adjacent to an edge surface of the user device 102, so that cursor control actions or other actions may be obtained on a display of the user device 102. Such implementations may allow, for example, a relatively larger display on the mobile device, since less space for user controls may be required.
In these and other implementations, the optical tracking system 104 may include optical components 116 that are operable to sense movements, including such surface-independent movements, and output pixel values corresponding thereto. Then, tracking logic 118 may be operable to receive the pixel values, and determine coordinates of the pointing object 108 within the substantially planar region 106 therefrom. Thus, the tracking logic 118 may provide for translation of the coordinates into an action on the display 112, such as, for example, cursor control actions for controlling the cursor 110.
For example, the optical components 116 may include one or more sensors, such as the sensors 120 and 122. For example, the sensors 120 and 122 may operate by capturing light on grids of pixels on their respective surfaces, which may be formed by photosensitive diodes that also may be referred to as photosites, and that record an intensity or brightness of the detected light by accumulating a charge. The sensors 120 and 122 may include, for example, complementary metal-oxide-semiconductor (CMOS) sensors, or may include any other image sensor that is operable to detect light from the substantially planar region 106 and output a signal corresponding to an intensity or other characteristic of the light, such as, for example, a charge-coupled device (CCD) sensor. In some implementations, the sensors 120 and 122 may include CMOS image sensors having a linear response characteristic(s), so that a response of the sensors 120 and 122 varies linearly with an intensity of the detected light.
In the example of FIG. 1, the sensors 120 and 122 are each disposed at least partially within the substantially planar region 106, and, more specifically, are disposed substantially along an axis 124 that is included within the substantially planar region 106. For example, the axis 124 may be defined along a first row of pixels within the sensor 120 and a second row of pixels within the sensor 122, so that these rows of pixels are included within the substantially planar region 106. By using only these rows of pixels, light detected by the sensors 120 and 122 may substantially correspond only to light within the substantially planar region 106.
In so doing, several advantages may be obtained in the example implementation of FIG. 1. For example, placement of the sensors 120 and 122 beside one another allows for a compact and discrete construction of the optical tracking system 104. Also, restricting the field of view of the sensors 120 and 122 reduces an area of the pointing object 108 that is detected by the sensors 120 and 122, which implies fewer opportunities for errors resulting from, for example, any surface irregularities on the pointing object 108. Further, since less information is collected by the sensors 120 and 122 than if a wider field of view were employed, calculations to be performed by the tracking logic 118 may be reduced and/or simplified, and a reliability of results may be increased. Additionally, such construction and use of the sensors 120 and 122 allows for the use of 1-dimensional (1-D) sensors, which may be inexpensive compared to larger pixel arrays.
In FIG. 1, although the sensors 120 and 122 are illustrated and described as being included in the substantially planar region 106, and although movement of the pointing object 108 is illustrated and described as occurring within the substantially planar region 106, it should be understood that there is no requirement or limitation that movement of the pointing object 108 should or must be able to occur (and be detected) within an entirety of the substantially planar region 106. For example, as illustrated and discussed below with respect to FIG. 2, various other optical components may be included in the optical components 116, such as lenses, light sources, or filters, and such optical components may be placed in between the sensors 120 and 122 and the pointing object 108. Additionally, as described below with respect to FIG. 2, a "dead zone" may exist immediately outside of the optical components 116, i.e., a limited region in which movement of the pointing object 108 may not be (sufficiently) accurately tracked.
In an implementation of the example of FIG. 1, a triangulation calculation is performed using the sensors 120 and 122 and the pointing object 108. Specifically, for example, and as described in more detail with respect to FIG. 2, each sensor 120 and 122 may output pixel values from a row of pixels along the axis 124 to the tracking logic 118, the pixel values corresponding to light reflected from the pointing object 108. Then, the tracking logic 118 may determine a centroid or center of the pointing object 108 within the substantially planar region 106, simply by, for example, taking a center-most pixel(s) from each of the two rows of pixels that register reflected images of the pointing object 108 along the axis 124. Accordingly, the tracking logic 118 may perform a triangulation calculation using the two centroids, together with other pre-determined information about the optical components 116 (such as, for example, a known spacing between the sensors 120 and 122, and/or a known spacing between each of the sensors 120 and 122 and corresponding lenses used to focus the light reflected from the pointing object 108 onto the sensors 120 and 122).
Thus, the tracking logic 118 may determine, from the triangulation calculation, coordinates of the pointing object 108 within the substantially planar region 106. For example, the tracking logic 118 may determine either relative or absolute coordinates of the pointing object. For example, determining relative coordinates may refer to determining a current coordinate of the pointing object 108 within the substantially planar region 106, relative to an immediately-past coordinate, and without reference to any other frame of reference in or around the substantially planar region 106. Such relative tracking is typically performed, for example, in many conventional mouse tracking devices, where movement of the mouse on a surface is not required to be within any particular defined field, but rather may occur on any suitable surface (with the user being responsible for orienting a corresponding cursor movement in a desired fashion relative to a display). Absolute coordinates, on the other hand, may refer to coordinates defined with respect to a fixed frame of reference. For example, if light from the substantially planar region 106 is detected immediately in front of the display 112, then the perimeter of the display 112 may be used to define coordinates determined by the tracking logic 118. As a result, in such examples, movement of the pointing object 108 in a particular region of the substantially planar region 106 and over a region of the display 112 will result in corresponding movement of the cursor 110 (or other action) within the corresponding display region.
Although the tracking logic 118, and the optical tracking system 104 as a whole, is illustrated in the example of FIG. 1 as being implemented as a single block or module within the user device 102, it should be understood that some or all of the tracking logic 118 may be implemented outside of the user device 102, and may be implemented in/by multiple instances and types of devices, peripherals, hardware, software, and/or firmware.
For example, the tracking logic 118 may include a processor (e.g., a micro-programmed control unit (MCU)) that is operable to control the sensors 120 and 122, by, for example, providing power and timing information to the sensors 120 and 122. In other words, for example, such a processor may be used as part of the (synchronized) selection and activation of desired rows of pixels of the sensors 120 and 122 that results in effective tracking of the pointing object 108 through the substantially planar region 106, by, for example, limiting obtained pixel values from the sensors 120 and 122 to pixel values from rows of pixels on each of the sensors 120 and 122 that lie substantially along the axis 124.
Additional computing resources (e.g., software or firmware) may be used to receive pixel values from, for example, the processor just mentioned, and perform calculations and other analysis thereof. For example, software may be used that has access to pre-defined information about the optical components 116 (e.g., a spacing between the sensors 120 and 122), so that such software may use such information to perform the triangulation calculations referenced above and described in more detail below with respect to, for example, FIG. 2.
By way of example, then, elements of the tracking logic 118 may be implemented in a single component (which may be internal or external to the user device 102), or in multiple components in communication with one another (any one, or all, of which may be internal or external to the user device 102). For example, a processor within the user device 102 (e.g., a keyboard) may be in communication with a separate computing device (e.g., a desktop computer) by way of a serial port or other wired connection, or by way of a wireless connection, in order to transmit pixel values and/or full or partial results of calculations based on the pixel values.
Additionally, the tracking logic 118 may be directly or indirectly involved in providing results of the calculations (e.g., calculated coordinates of the pointing object 108) for actual translation into an action on the display 112. For example, in one implementation, the tracking logic 118 may be wholly responsible for translating relative coordinates of the pointing object 108 within the substantially planar region 106 into absolute coordinates associated with the frame of reference of the display 112. However, such translation of relative coordinates of a tracking system (e.g., a conventional mouse) into absolute coordinates of a display may already be performed by existing systems. Therefore, it may be advantageous or efficient for the optical tracking system 104 to take advantage of existing software or firmware associated with the display 112, the user device 102, and/or a separate computing device (such as a desktop computer, not shown in FIG. 1). For example, the tracking logic 118 may output coordinates according to a format that matches an output of a conventional mouse, so that software or firmware receiving the coordinates may not require modification to operate with the optical tracking system 104.
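As one illustrative sketch of this idea (written in Python and not taken from the original description; the class name, scaling factor, and units are assumptions), successive coordinates may be converted into relative deltas of the kind reported by a conventional mouse:

    # Minimal sketch (assumed): report successive (x, y) coordinates as
    # relative deltas, in the style of a conventional mouse report.
    class RelativeReporter:
        def __init__(self, counts_per_mm=40):
            self.counts_per_mm = counts_per_mm  # assumed scaling factor
            self.last = None

        def report(self, x_mm, y_mm):
            """Return (dx, dy) in mouse-style counts since the last sample."""
            if self.last is None:
                self.last = (x_mm, y_mm)
                return (0, 0)
            dx = int(round((x_mm - self.last[0]) * self.counts_per_mm))
            dy = int(round((y_mm - self.last[1]) * self.counts_per_mm))
            self.last = (x_mm, y_mm)
            return (dx, dy)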
In addition to the various actions described above that may be provided with respect to the cursor 110 on the display 112, it should be understood that other, secondary actions may be provided. For example, a movement of the pointing object 108 in a direction perpendicular to the substantially planar region 106 may cause the pointing object 108 either to begin intersecting the substantially planar region 106, or to cease intersecting the substantially planar region 106. Such movements may be detected by a corresponding presence or absence of reflected light detected by the sensors 120 and 122 (e.g., a new determination of coordinates of the pointing object 108 within the substantially planar region 106), and the secondary actions may be performed based thereon. For example, such movements may result in a secondary action such as a "clicking" or selection of a file, document, or hypertext link on the display 112 to which the cursor 110 is pointing. As another example of secondary actions that may be provided, movements within the substantially planar region 106 may be interpreted as gestures associated with particular functionality of the display 112. For example, a rapid movement (or succession of movements) to the left within the substantially planar region 106 may be interpreted as a command to go "back" to a previous page within a browser, while a rapid movement to the right within the substantially planar region 106 may be interpreted as a command to go forward to a next page.
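A minimal sketch of such gesture interpretation, assuming a fixed sampling interval and an illustrative speed threshold (neither of which is specified in the description above), might classify a rapid horizontal movement as follows:

    # Minimal sketch (assumed): classify a rapid horizontal movement within
    # the substantially planar region as a "back" or "forward" gesture.
    def classify_swipe(x_history, dt, speed_threshold=300.0):
        """x_history: recent x coordinates (mm), oldest first; dt: sample period (s)."""
        if len(x_history) < 2 or dt <= 0:
            return None
        speed = (x_history[-1] - x_history[0]) / ((len(x_history) - 1) * dt)
        if speed <= -speed_threshold:
            return "back"     # rapid movement to the left
        if speed >= speed_threshold:
            return "forward"  # rapid movement to the right
        return None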
FIG. 2 is a diagram of an example implementation of the optical tracking system 104 of FIG. 1. FIG. 2 provides a more detailed view of a particular example of the sensors 120 and 122, disposed along the axis 124 as described and illustrated above with reference to FIG. 1. Additionally, FIG. 2 illustrates the substantially planar region 106, as well as the pointing object 108. FIG. 2 also illustrates that the substantially planar region 106 includes, in the example of FIG. 2, a dead zone "L0" in which tracking of the pointing object 108 is limited or non-existent (e.g., due to non-overlap of fields of view of the sensors 120 and 122 within the dead zone L0).
Also, as should be understood from the above discussion with respect to FIG. 1, the illustrated outline of the substantially planar region 106 in FIG. 2 is not intended to illustrate an absolute cut-off point or boundary, since, as explained, an effectiveness of the optical components 116 may diminish gradually over a distance therefrom. Thus, a design of the optical components 116 may be implemented with the intent that the substantially planar region 106 allows sufficient area for controlling the cursor 110 on the display 112; however, it should be understood that if a user moves beyond this area, then control of the cursor 110 may diminish or cease. Nonetheless, in some implementations, physical perimeter(s) may be separately associated with the substantially planar region 106 and provided for a user. For example, the surface 114 may include a drawing surface that is attached or attachable to the user device 102, on which a drawing perimeter is defined that is pre-calibrated to be safely within the substantially planar region 106. In this way, a user may be assured of remaining within the substantially planar region 106 by staying within the identified perimeter, and, moreover, the optical tracking system 104 may be calibrated to use the drawing perimeter as a frame of reference for absolute tracking of the pointing object 108 with respect to the display 112.
FIG. 2 also illustrates examples of other components that may be included within the optical components 116. For example, light source(s) 202 include, in the example of FIG. 2, a plurality of light-emitting diodes (LEDs), which emit light into the substantially planar region 106. The light is reflected off of the pointing object 108 and received at the sensor 120 and the sensor 122 through a first lens 204 and a second lens 206, respectively, as shown. Although three light sources 202 are illustrated, it should be understood that more or fewer may be used. For example, no light sources 202 may be used in a case where ambient light is used to detect the pointing object 108, or when the pointing object 108 itself includes a light emitting source.
As illustrated in the example of FIG. 2, then, the light sources 202 project light from the optical components 116. This light is reflected from the pointing object 108, and a portion of the reflected light that is within the substantially planar region 106 is detected by the sensors 120 and 122. This light may be detected by a row of pixels at each of the sensors 120 and 122. The two rows of pixels may each be analyzed by the tracking logic 118 to determine a centroid thereof, e.g., a centroid A′ is determined from a row of pixels from the sensor 120, and a centroid A is determined from a row of pixels from the sensor 122.
In the case where only a row of pixels is designated for use in each sensor 120 and 122, calculation of the centroids A and A′ may simply involve determining a center-most pixel(s) in each designated row(s). Such a determination may be made quickly, easily, and reliably, even during rapid movements of the pointing object 108. In other cases, it may be possible to use multiple rows of pixels of each of the sensors 120 and 122, and then discard all pixel values outside of designated row(s) of each of the sensors 120 and 122 on the axis 124. In still other cases, a plurality of rows of pixels may be read out of each of the sensors 120 and 122, and then the centroids A and A′ may be calculated from each plurality, using known techniques (e.g., dividing a total shape of each plurality into known shapes, and then calculating the centroids A and A′ from a summation of the areas of the known shapes).
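The following is a minimal sketch, in Python and not taken from the original description, of the simplest of these approaches, in which the centroid of the reflected image in a single designated row is taken as the center of the run of pixels whose values exceed a threshold; the threshold value is an illustrative assumption:

    # Minimal sketch (assumed): centroid of the reflected image in one
    # designated row of pixel values, taken as the center of the run of
    # pixels exceeding a threshold.
    def row_centroid(pixel_row, threshold=128):
        indices = [i for i, value in enumerate(pixel_row) if value > threshold]
        if not indices:
            return None  # no reflection detected in this row
        start, end = indices[0], indices[-1]
        return (start + end) / 2.0  # center-most pixel position (may be sub-pixel)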
In the example of FIG. 2, the lenses 204 and 206 are illustrated as being placed along a "y" axis with a separation "a" between points "O" and "O′," where the latter points are aligned with the centers of the lenses 204 and 206, respectively. The sensor 120 and the sensor 122 are placed a distance "b" behind the lens 204 and the lens 206, respectively. A center of the sensor 120 is placed a distance "d" above the point O′, while the sensor 122 is placed a distance "d" below the point O.
A filter 208 is placed between the lens 204 and the sensor 120, and a filter 210 is placed between the lens 206 and the sensor 122. The filters 208 and 210 may be used, for example, to filter out light that is not associated with the LEDs 202, so that a sensitivity of the sensors 120 and 122 may effectively be increased. Additionally, or alternatively, light from the LEDs 202 may be modulated or otherwise controlled, in conjunction with control of a timing of image-taking by the sensors 120 and 122, so as to synchronize projection of light and detection of reflected signal(s) from the pointing object 108 in an efficient and effective way.
With the information related to the centroids A and A′, as well as the known quantities a, b, O, and O′, the tracking logic 118 may determine x, y coordinates for the pointing object 108, using, for example, various triangulation techniques. For example, an equivalence of angles θ1 and θ2 may be used to define two equations in the two unknowns x, y, in terms of the known quantities "a," "b," and the detected pixel lengths "OA" and "O′A′" (i.e., a quantity of pixels between start and end points O, O′, A, and A′). Then, these equations may be solved for x, y to obtain Eqs. (1)-(2):
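The equations themselves are not reproduced here; however, under the geometry just described, with the pixel lengths OA and O′A′ expressed in the same units as "a" and "b" (e.g., by multiplying pixel counts by the pixel pitch), the standard stereo-triangulation relations take the following form, offered as a plausible reconstruction rather than as the exact original expressions:

    x = (a · b) / (OA + O′A′)    (cf. Eq. (1))
    y = (a · OA) / (OA + O′A′)    (cf. Eq. (2))

In these relations, the sum (OA + O′A′) plays the role of a stereo disparity, which decreases as the pointing object 108 moves farther from the lenses 204 and 206.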
In order to obtain a desired range of coverage for the substantially planar region 106, values of x, y may be inserted into Eqs. (1) and (2) to obtain required/workable ranges or values for a, b, OA, and/or O′A′. For example, the values of pixel lengths OA and O′A′ may be obtained for a desired x, y range and for known values of a and b, using Eqs. (3) and (4):
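Again, the original expressions are not reproduced here, but inverting the assumed relations above gives one plausible form for the required pixel lengths:

    OA = (b · y) / x    (cf. Eq. (3))
    O′A′ = (b · (a − y)) / x    (cf. Eq. (4))

For example, for a given sensor length and pixel pitch, these expressions may be evaluated over the desired range of x, y values to verify that the resulting pixel lengths remain within the available rows of pixels of the sensors 120 and 122.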
As the pointing object 108 moves within the substantially planar region 106, the pixel length end points A and A′ will shift on the sensors 122 and 120, respectively. The optical components 116 may therefore improve resolution and/or coverage area by arranging for the shifting range of A and A′ to equal a length of the sensors 120 and 122 (thereby maximizing usage of an area(s) of the sensors 120 and 122). For example, as shown, the sensors 120 and 122 may be arranged off-center from the lenses 204 and 206, with the offset d. In this way, for example, full coverage of the substantially planar region 106 may be obtained, and most or all overlapping (and therefore wasting) of pixels of the sensors 120 and 122 may be eliminated. In other implementations, however, the points O and O′ may be defined at a center of the sensors 122 and 120, respectively, or at another desired location.
A resolution of the image sensors 120 and 122 that may be used in the optical components 116 may be, for example, 1024, 2048, or 4096 pixels. Of course, any appropriate resolution that is able to provide a needed or desired resolution for controlling the cursor 110 on the display 112 may be used. The lenses 204 and 206 may have, for example, focal lengths of 3.3 mm, and viewing angles of ninety-two degrees, or any other focal length or viewing angle that is operable to provide accurate tracking of the pointing object 108.
In some implementations, the filters 208 and 210 may be provided as a film on the sensors 120 and 122, respectively. In other implementations, the filters 208 and 210 may be provided as discrete components that are separate from the sensors 120 and 122. In operation, the filters 208 and 210 prevent light that is reflected from the pointing object 108 but that does not match a wavelength of the source light(s) 202 from reaching the sensors 120 and 122.
Further, as shown in FIG. 2, a material 212 may be included between the LEDs 202 and the substantially planar region 106. The material 212 may include, for example, ground glass, and may serve, for example, to smooth out any non-uniformities that may be present in light from the light sources 202. In this way, shadows, unwanted reflections (e.g., from ancillary objects in the vicinity of the substantially planar region 106), and other undesirable artifacts may be minimized, so that the desired reflections from the pointing object 108 may be detected reliably.
Although components of FIG. 2 are illustrated to provide a particular example of the optical components 116, it should be understood that many other implementations may be used. For example, as indicated by arrows 214, the sensors 120 and 122 may be rotated along the axis 124 and in the plane of the substantially planar region 106. Such rotations may serve either to reduce the dead zone L0, or to increase a range at which reflected light from the pointing object 108 in the substantially planar region 106 is detected.
For example, the sensors 120 and 122 may be angled inward toward one another along the axis 124, so as to cause viewing areas of the sensors 120 and 122 to overlap closer to the y axis of FIG. 2, i.e., in an area within the example dead zone L0 of FIG. 2. In this way, movements of the pointing object 108 through the substantially planar region 106 may be tracked more closely to the user device 102. Such implementations may be useful, for example, when the user device is compact in size, such as a mobile phone or personal digital assistant.
In other implementations, however, it may be desired to increase an area of the substantially planar region 106, so that movements of the pointing object 108 may be tracked further from the user device 102 than in the illustrated example of FIG. 2. In this case, the sensors 120 and 122 may be angled more outward and/or away from one another along the axis 124. It should be understood that such implementations may serve to increase an area of the substantially planar region 106, with an accompanying increase in the dead zone L0. Such implementations may be useful, for example, where a greater range of detection is desired. In these and other implementations, modifications to the triangulation techniques described above (and/or below, with respect to FIG. 5) may be implemented to reflect the change(s) in configuration of the optical components 116 (e.g., the angling of the sensors 120 and 122 indicated by the arrows 214), as would be apparent.
FIG. 3 is a flowchart 300 illustrating a process of the system(s) of FIGS. 1 and/or 2. In the example of FIG. 3, light from a light source is projected from an optical tracking system into an adjacent area (302). For example, as described, light from the LEDs 202 may be projected so as to illuminate at least the substantially planar region 106. Of course, other light sources may be used, including laser light sources. Also, as already mentioned with respect to FIG. 2, ambient light may be used, in which case no projected light may be required. Additionally, an amount or quantity of light may be selected for a given application; e.g., although three LEDs 202 are shown in FIG. 2, an appropriate number of one or more LEDs may be selected, as necessary or desired.
Further, in projecting the light, beam-forming components may be used within the optical components 116 that enhance an ability of the sensors 120 and 122 to detect light reflected from the pointing object 108. For example, a light-forming technique may be used in which the source of light is located at a focal distance "f" of a cylindrical lens. In this example, the light source and the cylindrical lens produce light in a slice or fan region of produced light. Such a fan-shaped beam may be used to illuminate the pointing object 108, and provide an effective way to minimize interference (e.g., scattering that may occur from an ancillary surface and/or from a tilting of the pointing object 108). Such a fan beam also may provide an effective way to extend a detectable area in which the sensors 120 and 122 may accurately detect movement of the pointing object 108, and may increase a sensitivity of the optical tracking system 104 to lateral movements of the pointing object 108.
First pixel values are received from a first sensor, e.g., the sensor 120 (304), and second pixel values are received from a second sensor, e.g., the sensor 122 (306). For example, the sensor 120 and the sensor 122 may receive focused, filtered light reflected from the pointing object 108, and may each output corresponding pixel values. As described above and illustrated in FIGS. 1 and 2, the sensors may be disposed at least partially in a common plane, and included in the substantially planar region 106. Accordingly, the optical tracking system 104 may be made in a compact and modular form.
In receiving the pixel values, an output mode of the sensors 120 and 122 may be selected by the tracking logic 118 that appropriately outputs the desired pixel information, e.g., as a comparison voltage that provides information as to where the image(s) is and how many pixels are contained therein. The pixels may be read out according to certain pre-defined standards, e.g., pixel values below a certain threshold amount may not be kept, and activated pixels having a length of less than some predetermined amount (e.g., less than ten pixels) may be disregarded as noise.
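A minimal sketch of such read-out rules (in Python, with illustrative threshold and run-length values that are assumptions rather than values from the description) is:

    # Minimal sketch (assumed): zero pixel values below a threshold and keep
    # only activated runs at least min_run pixels long, treating shorter
    # runs as noise.
    def filter_row(pixel_row, threshold=128, min_run=10):
        filtered = [v if v >= threshold else 0 for v in pixel_row]
        runs, start = [], None
        for i, v in enumerate(filtered + [0]):  # trailing sentinel closes the final run
            if v and start is None:
                start = i
            elif not v and start is not None:
                if i - start >= min_run:
                    runs.append((start, i - 1))  # keep sufficiently long runs
                start = None
        return runs  # list of (start_pixel, end_pixel) for valid images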
Pixels may be read out according to a start signal and timing signal produced by the tracking logic 118, within a defined exposure time (i.e., within a defined number of clock cycles). In some implementations, prior to the obtaining/reading of pixel values from the sensors 120 and 122, a baseline reading of pixel values may be determined by, for example, reading out a certain number of pixels during a time when no light source is being projected.
Centroids are determined from the pixel values (308). For example, during and/or after the reading/receiving of the pixel values, all pixels in a row (e.g., 2048 pixels) may be read out, and their positions recorded by the tracking logic 118, so that start and end points of the pixel values corresponding to light reflected from the pointing object 108 within the substantially planar region 106 may be determined.
Using these start and end points, the tracking logic 118 may determine centroids A and A′, e.g., center-most pixel(s) from each of the two rows of pixels that register reflected images of the pointing object 108 along the axis 124. As described above with respect to FIGS. 1 and 2, determination of each centroid may yield a single pixel at each of the centroids A and A′, and, in other implementations, sub-pixel resolution may be obtained in determining the centroids A and A′.
Triangulation may then be performed based on the determined centroids, in order to determine coordinates of a pointing object (e.g., the pointing object 108) during movement thereof through the substantially planar region 106 (310). For example, in the example of FIG. 2, the tracking logic 118 may use the distance "a" between centers of the lenses 204 and 206 and the distance "b" between the sensors 120/122 and lenses 204/206 to calculate from Eqs. (1) and (2) the x, y coordinates of the pointing object 108 during movement thereof through the substantially planar region 106. Thus, absolute and/or relative position/movement information of a pointing object (e.g., the pointing object 108) may be determined. For example, an absolute position within the substantially planar region 106 may be determined (e.g., determined absolutely with reference to some pre-defined perimeter coordinates/frame of reference, such as a boundary of the display 112), and/or a relative motion of the pointing object 108 may be determined.
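A minimal sketch of this triangulation step, in Python, using the assumed forms of Eqs. (1) and (2) given earlier (the pixel-pitch conversion and the function signature are illustrative assumptions):

    # Minimal sketch (assumed): triangulate x, y from the two centroid
    # offsets OA and O'A', measured in pixels from the points O and O'.
    def triangulate(oa_pixels, oa_prime_pixels, a_mm, b_mm, pixel_pitch_mm):
        oa = oa_pixels * pixel_pitch_mm
        oa_prime = oa_prime_pixels * pixel_pitch_mm
        disparity = oa + oa_prime
        if disparity <= 0:
            return None  # object outside the usable field of view
        x = a_mm * b_mm / disparity
        y = a_mm * oa / disparity
        return (x, y)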
Finally, the determined coordinates may be provided for translation into a desired action(s) on a display (312). For example, as described above with respect to FIG. 1, the tracking logic 118 may translate movement of the pointing object 108 into movement of the cursor 110 of the display 112. As another example, the tracking logic 118 may provide the coordinates to an external system or computing resource for translation of the coordinates into the action on the display.
FIG. 4A is a block diagram of an alternate implementation of the optical tracking system of FIG. 1, and FIG. 4B is a side view of FIG. 4A taken along cut-away line "A." In the example of FIG. 4A, an optical tracking system 104a is illustrated that includes optical components 116a and tracking logic 118a. More specifically, the optical components 116a and the tracking logic 118a are operable to detect light from two substantially planar regions 106a and 106b. By determining x, y coordinate information of a pointing object 108a (illustrated as a stylus in FIGS. 4A and 4B) within each of the substantially planar regions 106a and 106b, additional information about the movement of the pointing object 108a may be determined beyond the two x, y coordinate determinations. For example, a relationship between x, y coordinates in the substantially planar region 106a and x, y coordinates in the substantially planar region 106b may be determined, and an action on the display 112 may be provided by the tracking logic 118a, based on the relationship.
For example, as may be seen in FIG. 4B, the pointing object 108a may be maintained by a user at a tilt with respect to the substantially planar region 106a, e.g., may form an angle with respect to both of the substantially planar regions 106a and 106b. Then, an existence, degree, or direction of the tilt may be detected; for example, a tilt in a first direction may be used to indicate a "scrolling-up" action through a document, while a tilt in a second direction may be used to indicate a "scrolling-down" action. Tilt information also may be used to achieve various other effects, such as, for example, a "back" or "forward" command within a web browser.
In the example of FIGS. 4A and 4B, two sensors 402 and 404 are illustrated as being operable to detect light from the substantially planar regions 106a and 106b, respectively. As described in more detail below with respect to FIGS. 5 and 6, the tracking logic 118a may determine the x, y coordinates of the pointing object 108a within the substantially planar region 106a based on apparent size information of the pointing object 108a detected by the sensor 402 (e.g., a number and/or distribution of pixels read from the sensor 402), relative to reference size information (e.g., relative to a known diameter of the pointing object 108a). Similarly, the sensor 404 may be used to determine the x, y coordinates of the pointing object 108a within the substantially planar region 106b based on apparent size information of the pointing object 108a detected by the sensor 404, relative to reference size information.
Once the two sets of x, y coordinates are known, a relationship between a first part of the pointing object 108a that is within the substantially planar region 106a and a second part of the pointing object 108a that is within the substantially planar region 106b may be obtained. For example, where a distance D between the two sensors 402 and 404 is known, the two sets of x, y coordinates may be used to determine an angle θ3 formed by the pointing object 108a with the substantially planar region 106b. For example, the distance D may be considered to form a leg of a right triangle having the pointing object 108a as its hypotenuse, and having a portion of the substantially planar region(s) 106a and/or 106b as the third leg. Then, other information about such a triangle, including the angle θ3, may be determined using well-known geometrical relationships.
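A minimal sketch of this determination, in Python (the function and parameter names are illustrative; the separation D and the two coordinate pairs are assumed to be expressed in the same units):

    import math

    # Minimal sketch (assumed): estimate the tilt angle theta_3 from the two
    # (x, y) coordinate pairs and the known separation D between the regions.
    def tilt_angle(xy_lower, xy_upper, d_separation):
        lateral = math.hypot(xy_upper[0] - xy_lower[0], xy_upper[1] - xy_lower[1])
        return math.degrees(math.atan2(d_separation, lateral))  # 90 degrees = perpendicular (no tilt)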
FIG. 5 is a block diagram of an example implementation of the optical tracking system 104a of FIGS. 4A and 4B, showing an example of the optical components 116a, and taken along a cut-away line B. Thus, in the example of FIG. 5, only the sensor 402 is illustrated, although it should be understood that the sensor 404 may be implemented in a similar way.
In FIG. 5, the pointing object 108a is illustrated as having a diameter 502. For example, in the case of FIGS. 4A and 4B, the pointing object 108a may include a substantially cylindrical stylus having a known diameter "d" 502. The sensor 402 may read out pixel values corresponding to light reflected from the pointing object 108a, and the tracking logic 118a may then determine apparent size information associated with the pointing object 108a from these pixel values.
For example, as illustrated in FIG. 5, the sensor 402 may read out start and end points of the pixel values, A′ and B′, respectively, corresponding to points A and B at ends of the diameter 502. In this regard, it should be understood from the description of FIGS. 1 and 2 above that the pixels read from the sensor 402 may be restricted to a designated and/or limited number of rows (e.g., a single row). In this way, light primarily from the substantially planar region 106a may be received at the sensor 402, so that calculations may be simplified, and reliability may be increased, as described above with respect to FIGS. 1 and 2.
Then, the endpoints A′ and B′ may be considered to provide apparent size information associated with the pointing object 108a, since, as should be understood from FIG. 5, motion of the pointing object 108a within the substantially planar region 106a will correspond to changes in the start and end points A′ and B′. For example, as the pointing object 108a moves closer to the sensor 402 along an x axis, the distance A′B′ will increase, and, conversely, as the pointing object 108a moves farther from the sensor 402, the distance A′B′ will decrease.
This apparent size information may thus be compared with reference size information, such as the known diameter 502, in order to determine a location of the pointing object 108a within the substantially planar region 106a. For example, and similarly to the discussion above related to the triangulation calculations associated with FIG. 2, equivalent angles θ4 and θ5 may be used to determine x, y coordinates, based on known information including the distance "b" between the sensor 402 and a lens 504.
For example, such calculations may include use of Eqs. (5) and (6):
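The equations are not reproduced above; one plausible, assumed form, obtained from similar triangles for a pointing object of known diameter d (502) imaged through the lens 504 at the lens-to-sensor distance b, is (with A′B′ denoting the imaged pixel length of the diameter, and O′C′ denoting the offset of its centroid from the sensor reference point, a label introduced here only for illustration):

    x = (b · d) / A′B′    (cf. Eq. (5))
    y = (x · O′C′) / b    (cf. Eq. (6))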
In other implementations, a size or diameter of the pointing object 108a may not be known. In this case, however, absolute tracking may be performed by a calibration procedure for the optical tracking system 104a (e.g., allowing the optical tracking system 104a to determine pixel lengths corresponding to a given pointing object at a plurality of locations within the substantially planar region 106a, and then using the determined size information from the calibration procedure as the known size information). Also, relative tracking may be performed, by comparing the apparent size information to reference size information determined with respect to the pointing object 108a. For example, by selecting a detected size of the pointing object 108a at a given time "t," the tracking logic 118a may determine whether the pointing object 108a is moving closer or farther away from the sensor 402, by judging current, apparent size information against the determined reference size information.
Also, although the pointing object 108a is illustrated in FIGS. 4A and 4B as a stylus, it should be understood that virtually any pointing object may be used. For example, the pointing object 108a may have a square or other sharply-delineated outline, which may allow the sensor 402 (and 404) to easily detect the start and end points A′ and B′. In other implementations, as in FIG. 1, a finger, pen, or any other convenient pointing object may be used.
FIG. 6 is a flowchart 600 illustrating a process of the systems of FIGS. 4A, 4B, and 5. In the example of FIG. 6, parallel processes are illustrated that correspond to operations of the sensors 402 and 404. For example, first pixel values may be received by the tracking logic 118a from the sensor 402, which may be disposed beneath the sensor 404 (602a), as shown in FIG. 4A. Second pixel values also may be received from the sensor 404, which may be disposed above the sensor 402 (602b). As should be apparent from FIG. 4A and FIG. 4B, the first and second sets of pixel values correspond to first and second parts, respectively, of the pointing object 108a that intersect both of the substantially planar regions 106a and 106b, also respectively.
Then, apparent size information may be determined for the first part of the pointing object 108a (604a) and for the second part of the pointing object 108a (604b), using the first and second pixel values, respectively. For example, as described above with respect to FIG. 5 for the example of the single sensor 402, a number of activated pixels between start and end points B′ and A′ may correspond to apparent size information of a diameter of the pointing object 108a (i.e., for first and second diameters corresponding to the first and second parts of the pointing object 108a, respectively), since this number of pixels will change as the pointing object 108a moves within the substantially planar regions 106a and 106b.
Once the apparent size information is determined, then first x, y coordinates of the first part of the pointing object 108a in the substantially planar region 106a may be obtained, e.g., using Eqs. (5) and (6), above (606a). Similarly, second x, y coordinates of the second part of the pointing object 108a in the substantially planar region 106b may be obtained, e.g., using Eqs. (5) and (6), above (606b).
Then, the first x, y coordinates of the first part of the pointing object 108a within the substantially planar region 106a may be provided by the tracking logic 118a for use in providing an action on a display (e.g., the display 112) (608). In other words, once obtained, the first x, y coordinates detected with respect to the substantially planar region 106a may be used in much or exactly the same way as the x, y coordinates described above with respect to FIGS. 1-3 to obtain a desired action on the display 112. That is, the first x, y coordinates of the first part of the pointing object 108a may be used to provide cursor control actions, or any of the other actions described above with respect to FIGS. 1-3. In this regard, it should be understood that the sensor 402 and the substantially planar region 106a may provide such action(s) independently of the sensor 404 and the substantially planar region 106b.
Additionally, a relationship may be determined between the first x, y coordinates and the second x, y coordinates (610). For example, as described above with respect to FIGS. 4A and 4B, an angle of tilt that may exist between the substantially planar region 106b and the pointing object 108a may be determined, and used to provide an action on a display (e.g., the display 112) (612).
For example, in one implementation, the user device 102 may be a keyboard, and the substantially planar regions 106a and 106b may be provided to a side of the keyboard. Then, a user may move the pointing object 108a oriented perpendicularly to the surface 114 (e.g., a desk) on which the keyboard may rest, i.e., in a vertical direction, so as to move the cursor 110 on the display 112 while, for example, browsing a web page. In this case, light detected by the sensor 402 within the substantially planar region 106a may be used to control the cursor 110 in moving around the display 112 (e.g., within a web browser). Then, if the user tilts the pointing object 108a toward him or herself, this may be detected by the sensor 404, and interpreted by the tracking logic 118a as a command to scroll downward in the web page (or upward if the pointing object 108a is tilted away from the user). As another example, a tilt of the pointing object 108a to the left may be interpreted by the tracking logic as a command to go backward in the browser to a previous web page, while a tilt to the right may be interpreted as a command to go forward.
The tracking logic 118a also may be operable to implement variations on such commands by calculating other information about the relationship between the first x, y coordinates of the first part of the pointing object 108a in the substantially planar region 106a, and the second x, y coordinates of the second part of the pointing object 108a in the substantially planar region 106b. For example, the tracking logic 118a may determine a degree or extent of tilting of the pointing object 108a to supplement the actions described above. For example, in a case where a downward (i.e., toward the user) tilt causes a downward scrolling in a web page, a degree of the tilt (i.e., the angle θ3) may be measured, and a speed of the scrolling operation may be increased as the pointing object 108a is tilted more (i.e., as θ3 becomes more acute).
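One illustrative sketch of such a mapping (in Python; the dead-band angle and maximum speed are assumptions, not values from the description) is:

    # Minimal sketch (assumed): map the measured tilt angle theta_3 (degrees)
    # to a scroll speed that increases as theta_3 becomes more acute.
    def scroll_speed(theta3_degrees, dead_band_degrees=80.0, max_lines_per_second=20.0):
        if theta3_degrees >= dead_band_degrees:
            return 0.0  # nearly perpendicular: no scrolling
        tilt_fraction = (dead_band_degrees - theta3_degrees) / dead_band_degrees
        return max_lines_per_second * tilt_fraction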
Although FIGS. 4A, 4B, and 5 are illustrated as using the sensors 402 and 404, it should be understood that other configurations may be used. For example, in some implementations, the optical components 116a may detect light from the substantially planar regions 106a and 106b using the techniques described above with respect to FIGS. 1-3. That is, the operations of the sensors 120 and 122 described above with respect to FIGS. 2 and 3 may be implemented to detect light from the substantially planar region 106a, and such operations may be duplicated by a second pair of sensors disposed above the sensors 120 and 122, so as to detect light from the substantially planar region 106b above, and substantially in parallel with, the substantially planar region 106a. Then, the techniques of FIGS. 1-3 may be used to determine x, y coordinates of the pointing object 108a in each of the substantially planar regions 106a and 106b, so that a relationship therebetween may be determined by the tracking logic 118a. In still other implementations, the sensors 120 and 122 of FIGS. 1-3 may be used to determine first x, y coordinates of the first part of the pointing object 108a in the substantially planar region 106a, while the sensor 404 is used to determine x, y coordinates of the second part of the pointing object 108a in the substantially planar region 106b.
In yet another implementation, the sensors 402 and 404 may be considered to represent two pixel arrays (e.g., rows) of a single two-dimensional sensor. Then, the first pixel values and second pixel values may be read out (e.g., 602a and 602b) from the first and second pixel arrays (e.g., rows).
FIGS. 7A, 7B, 7C, and 7D illustrate example implementations of systems of one or more of FIGS. 1-6. In FIG. 7A, a keyboard 702 is illustrated as an example of the user device 102 of FIG. 1. A substantially planar region 704 may be associated with the keyboard 702, as illustrated in FIG. 7A, and as should be understood from the above descriptions of FIGS. 1-6. Accordingly, control of the cursor 110 on the display 112 may be provided, and, moreover, it should be understood that a user may easily access the substantially planar region 704 during a typing operation or other use of the keyboard 702, with minimal hand movement being required.
Also, as should be understood from the discussion of FIG. 1, the substantially planar region 704 may be adjacent to other portions, and in other orientations, than that illustrated in FIG. 7A. For example, the substantially planar region 704 may be adjacent to a top, front surface of the keyboard 702, in a vertical direction and above the keyboard 702. As also described with respect to FIG. 1, tracking of the pointing object 108 within the substantially planar region 704 may be performed without dependence on any physical surface on which the keyboard 702 may rest, so that surface-independent movement of the pointing object 108 through a free or open space adjacent the keyboard 702 may be tracked for control of the cursor 110.
Finally in FIG. 7A, light from a substantially planar region 706 may be detected by an optical tracking system integrated with the display 112 itself. For example, a module(s) including the optical tracking system 104 or 104a may be disposed at a top, bottom, or side of the display 112, so as to project the substantially planar region 706 in front of a screen of the display 112. In this way, for example, the display 112 may effectively be turned into a touch-screen, so that a user may have the experience or feel of touching (or almost touching) a desired portion of the display 112, in order, for example, to direct the cursor 110 or perform a drawing function across an area of the display 112.
In the example of FIG. 7B, a personal digital assistant (PDA) 708 is illustrated, and may be used to provide optical tracking, where, for example, light from a substantially planar region 710 is detected at a bottom or side of the PDA 708, and the resulting tracking may be performed with respect to an integrated display 712 of the PDA 708, and/or with respect to an external display. In this way, a user may more easily work with the PDA 708 (or any other wireless communications device), despite a relatively small size of the device.
In the example of FIG. 7C, a mouse 714 is illustrated as detecting light from a substantially planar region 716. For example, the mouse 714 may be used to provide conventional cursor-tracking functionality, while light from the substantially planar region 716 is detected at a side of the mouse 714, in order to provide supplemental functionality, such as, for example, a drawing or scrolling function.
In the example of FIG. 7D, a keyboard 718 is illustrated as detecting light from a substantially planar region 720, and, in particular, detects light reflected at a point 722 corresponding to a pointing object (not shown in FIG. 7D; e.g., the pointing object 108). As shown, light from the substantially planar region 720 is detected from pointing object movement above the keyboard 718 and within a vertically-defined region over the keyboard 718. In this way, for example, a user holding the keyboard 718 may control the cursor 110 without reference to any physical surface on which the keyboard 718 may rest. Such an implementation may be used, for example, by a user operating the display 112 as a television display, e.g., in a non-traditional setting for the keyboard 718, such as a living room of the user.
Although FIGS. 7A-7D illustrate specific examples of the user device 102, it should be understood that many other examples exist. For example, the user device 102 of FIG. 1 also may generally represent other compact, portable computing devices, such as a cell phone, a tablet personal computer, and/or a portable gaming system. In the latter example, light from associated substantially planar region(s) may be used to allow various game functionalities to be implemented.
In still other example implementations, the optical tracking system 104 may be implemented as a discrete module that may easily be inserted into, or integrated with, another component or device. For example, the optical tracking system 104 (or 104a) may be implemented in the context of a Personal Computer Memory Card International Association (PCMCIA) card that may be inserted into a corresponding, standard slot of, for example, a laptop computer. In another implementation, such a module may be plugged into the keyboard 702 or other device using a Universal Serial Bus (USB) port or other connection technology.
Of course, any of the example implementations and techniques described above with respect to FIGS. 1-6 may be used in the examples of FIGS. 7A-7D, and in the other examples just mentioned. For example, in any one of the examples of FIGS. 7A-7D, dual substantially planar regions may be used along the lines of FIGS. 4A and 4B, in order to provide the tilt detection functions described with respect thereto. Also, other features described with respect to FIGS. 1-6 may be provided. For example, LEDs or other light sources may be included, as may be the various filters and/or beam-forming optics described above.
As described herein, optical tracking allows for various advantageous features, including, for example, direct finger cursor control, gesture detection capability, stylus inputs, a touch screen, and various other uses and applications. Described systems and methods provide good spatial resolution and accuracy, and responsive tracking speeds.
While certain features of the described implementations have been illustrated as described herein, many modifications, substitutions, changes and equivalents will now occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit of the embodiments of the invention.