CROSS REFERENCES TO RELATED APPLICATIONS This application is a Continuation-in-Part of the co-pending, commonly-owned U.S. Patent Application, Attorney Docket No. 020824-004610US, Ser. No. 10/803,806, filed Mar. 17, 2004, by James Marggraff et al., entitled “Scanning Apparatus,” and hereby incorporated by reference in its entirety.
This application is a Continuation-in-Part of the co-pending, commonly-owned U.S. Patent Application, Attorney Docket No. 020824-009500US, Ser. No. 10/861,243, filed Jun. 3, 2004, by James Marggraff et al., entitled “User Created Interactive Interface,” and hereby incorporated by reference in its entirety.
BACKGROUND OF THE INVENTION 1. Field of the Invention
Embodiments in accordance with the present invention generally pertain to information storage mediums and to the retrieval and use of stored information.
2. Related Art
Devices such as optical readers or optical pens conventionally emit light that reflects off a surface to a detector or imager. As the device is moved relative to the surface (or vice versa), successive images are rapidly captured. By analyzing the images, movement of the optical device relative to the surface can be tracked.
One type of optical pen is used with a sheet of paper on which very small dots are printed. The dots are printed on the page in a pattern with a nominal spacing of about 0.3 millimeters (0.01 inches). The pattern of dots within any region on the page is unique to that region. The optical pen essentially takes a snapshot of the surface, perhaps 100 times a second or more. By interpreting the dot positions captured in each snapshot, the optical pen can precisely determine its position relative to the page.
Applications that utilize information about the position of an optical pen relative to a surface have been or are being devised. An optical pen with Bluetooth or other wireless capability can be linked to other devices and used for sending electronic mail (e-mail) or faxes.
An optical pen may be shipped or sold with a set of pre-loaded software applications. Users will typically be motivated to update the software on their optical pens as new or improved applications become available. However, optical pens may not be equipped to conveniently download information, because of their relatively small size and unusual form factor. Thus, adding new software to an optical pen may be somewhat problematic.
SUMMARY OF THE INVENTION Accordingly, an optical pen that can be conveniently updated with new or improved software, and/or a method of conveniently updating the software on an optical pen, would be valuable. Embodiments in accordance with the present invention provide this and other advantages.
Embodiments of the present invention pertain to methods for storing, retrieving and using information, and devices thereof. In one embodiment, a pattern of markings on a surface is decoded to recover information encoded by the pattern. A software application associated with the information is identified. The information can then be used with the software application.
In one embodiment, a device such as a handheld pen-shaped computer system (e.g., an optical pen) is used to scan the data from a surface (e.g., a piece of paper). The device contains memory, a processor, a writing instrument and an optical sensor that can read an image on the surface. Data scanned from the surface can be stored in memory and used by one or more applications resident on the device.
For example, parameterization data for an application, or even an application itself, can be encoded as a pattern of markings on a surface such as a piece of paper. The markings can be read (e.g., scanned) by the device (e.g., handheld pen-shaped computer system or optical pen). More precisely, an image of the pattern is captured by the device. The captured image of the markings can then be processed (decoded) to recover the encoded information, which can then be stored in memory on the device. The decoded information can be used, for example, to add an application to the device or to supplement an existing application.
In one embodiment, a surface (e.g., a piece of paper) can be supplied on which certain image themes are printed. Encoded information, as described above, can also be printed on the paper. Using the device (e.g., handheld pen-shaped computer system or optical pen) to scan, decode and store the encoded information, an application program resident on the device can become more customized to the theme of the paper. Alternatively, the user experience provided by interfacing with the application may become in some way relevant to the theme of the paper.
There are many other possible uses for information encoded and decoded in this manner. For example, a message encoded in a pattern of markings can be audibly rendered by scanning and decoding the markings. Alternatively, the information in a pattern of markings can index other, previously stored information (e.g., phonemes) that are used to synthesize an audible message. In the latter instance, new words can be added to the vocabulary of a device without having to download the words themselves, reducing the amount of information to be downloaded.
In general, embodiments in accordance with the present invention provide a convenient and user-friendly mechanism for adding information to devices such as handheld pen-shaped computer systems or optical pens, thus expanding the functionalities of the devices beyond those provided when the devices were shipped or sold. These and other objects and advantages of the present invention will be recognized by one skilled in the art after having read the following detailed description, which is illustrated in the various drawing figures.
BRIEF DESCRIPTION OF THE DRAWINGS The accompanying drawings, which are incorporated in and form a part of this specification, illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention:
FIG. 1 is a block diagram of a device upon which embodiments of the present invention can be implemented.
FIG. 2 is a block diagram of another device upon which embodiments of the present invention can be implemented.
FIG. 3 shows an exemplary sheet of paper provided with a pattern of marks according to one embodiment of the present invention.
FIG. 4 shows an enlargement of a pattern of marks on an exemplary sheet of paper according to one embodiment of the present invention.
FIG. 5 shows a region on a surface of, for example, a sheet of paper that can be used to store encoded information according to one embodiment of the present invention.
FIG. 6 shows a region on a surface of, for example, a sheet of paper that can be used to store encoded information according to another embodiment of the present invention.
FIG. 7 is a flowchart of a computer-implemented method for retrieving encoded information according to one embodiment of the present invention.
FIG. 8 shows an exemplary user interface for a software application in accordance with one embodiment of the present invention.
FIG. 9 shows an exemplary user interface for another software application in accordance with one embodiment of the present invention.
DETAILED DESCRIPTION OF THE INVENTION In the following detailed description of the present invention, numerous specific details are set forth in order to provide a thorough understanding of the present invention. However, it will be recognized by one skilled in the art that the present invention may be practiced without these specific details or with equivalents thereof. In other instances, well-known methods, procedures, components, and circuits have not been described in detail so as not to unnecessarily obscure aspects of the present invention.
Some portions of the detailed descriptions that follow are presented in terms of procedures, steps, logic blocks, processing, and other symbolic representations of operations on data bits that can be performed on computer memory. These descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. A procedure, computer-executed step, logic block, process, etc., is here, and generally, conceived to be a self-consistent sequence of steps or instructions leading to a desired result. The steps are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated in a computer system. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.
It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussions, it is appreciated that throughout the present invention, discussions utilizing terms such as “encoding” or “using” or “identifying” or “accessing” or “rendering” or “reading” or “decoding” or “combining” or “sensing” or “executing” or “supplying” or the like, refer to the actions and processes of a computer system (e.g., flowchart 700 of FIG. 7), or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
FIG. 1 is a block diagram of a device 100 upon which embodiments of the present invention can be implemented. In general, device 100 may be referred to as a pen-shaped computer system or an optical device, or more specifically as an optical reader, optical pen or digital pen.
In the embodiment of FIG. 1, device 100 includes a processor 32 inside a housing 62. In one embodiment, housing 62 has the form of a pen or other writing utensil. Processor 32 is operable for processing information and instructions used to implement the functions of device 100, which are described below.
In one embodiment, the device 100 includes an audio output device 36, a display device 40, or both an audio device and display device coupled to the processor 32. In other embodiments, the audio output device and/or the display device are physically separated from device 100, but in communication with device 100 through either a wired or wireless connection. For wireless communication, device 100 can include a transceiver or transmitter (not shown in FIG. 1). The audio output device 36 may include a speaker or an audio jack (e.g., for an earphone or headphone). The display device 40 may be a liquid crystal display (LCD) or some other suitable type of display.
In the embodiment of FIG. 1, device 100 includes input buttons 38 coupled to the processor 32 for activating and controlling the device 100. For example, the input buttons 38 allow a user to input information and commands to device 100 or to turn device 100 on or off. Device 100 also includes a power source 34 such as a battery.
Device 100 also includes a light source or optical emitter 44 and a light sensor or optical detector 42 coupled to the processor 32. The optical emitter 44 may be a light emitting diode (LED), for example, and the optical detector 42 may be a charge coupled device (CCD) or complementary metal-oxide semiconductor (CMOS) imager array, for example. The optical emitter 44 illuminates surface 70 or a portion thereof. Light reflected from the surface 70 is received at and recorded by optical detector 42.
In one embodiment, a pattern of markings is printed on surface 70. The surface 70 may be any suitable surface on which a pattern of markings can be printed, such as a sheet of paper or another type of surface. The end of device 100 that holds optical emitter 44 and optical detector 42 is placed against or near surface 70. As device 100 is moved relative to the surface 70, the pattern of markings is read and recorded by optical emitter 44 and optical detector 42. As discussed in more detail further below, in one embodiment, the markings on surface 70 are used to determine the position of device 100 relative to surface 70 (see FIGS. 3 and 4). In another embodiment, the markings on surface 70 are used to encode information (see FIGS. 5 and 6). The captured images of surface 70 can be analyzed (processed) by device 100 to decode the markings and recover the encoded information.
Device 100 of FIG. 1 also includes a memory unit 48 coupled to the processor 32. In one embodiment, memory unit 48 is a removable memory unit embodied as a memory cartridge or a memory card. In another embodiment, memory unit 48 includes random access (volatile) memory (RAM) and read-only (non-volatile) memory (ROM) for storing information and instructions for processor 32.
In the embodiment of FIG. 1, device 100 includes a writing element 52 situated at the same end of device 100 as the optical detector 42 and the optical emitter 44. Writing element 52 can be, for example, a pen, pencil, marker or the like, and may or may not be retractable. In certain applications, writing element 52 is not needed. In other applications, a user can use writing element 52 to make marks on surface 70, including characters such as letters, numbers, symbols and the like. These user-produced marks can be scanned (imaged) and interpreted by device 100 according to their position on the surface 70. The position of the user-produced marks can be determined using a pattern of marks that are printed on surface 70; refer to the discussion of FIGS. 3 and 4, below. In one embodiment, the user-produced markings can be interpreted by device 100 using optical character recognition (OCR) techniques that recognize handwritten characters.
Surface 70 may be a sheet of paper, although surfaces consisting of materials other than paper may be used. Surface 70 may be a flat panel display screen (e.g., an LCD) or electronic paper (e.g., reconfigurable paper that utilizes electronic ink). Also, surface 70 may or may not be flat. For example, surface 70 may be embodied as the surface of a globe. Furthermore, surface 70 may be smaller or larger than a conventional (e.g., 8.5×11 inch) page of paper. In general, surface 70 can be any type of surface upon which markings (e.g., letters, numbers, symbols, etc.) can be printed or otherwise deposited. Alternatively, surface 70 can be a type of surface wherein a characteristic of the surface changes in response to action on the surface by device 100.
FIG. 2 is a block diagram of another device 200 upon which embodiments of the present invention can be implemented. Device 200 includes processor 32, power source 34, audio output device 36, input buttons 38, memory unit 48, optical detector 42, optical emitter 44 and writing element 52, previously described herein. However, in the embodiment of FIG. 2, optical detector 42, optical emitter 44 and writing element 52 are embodied as optical device 201 in housing 62, and processor 32, power source 34, audio output device 36, input buttons 38 and memory unit 48 are embodied as platform 202 in housing 74. In the present embodiment, optical device 201 is coupled to platform 202 by a cable 102; however, a wireless connection can be used instead. The elements illustrated by FIG. 2 can be distributed between optical device 201 and platform 202 in combinations other than those described above.
FIG. 3 shows a sheet of paper 15 provided with a pattern of marks according to one embodiment of the present invention. In the embodiment of FIG. 3, sheet of paper 15 is provided with a coding pattern in the form of optically readable position code 17 that consists of a pattern of marks 18. The marks 18 in FIG. 3 are greatly enlarged for the sake of clarity. In actuality, the marks 18 may not be easily discernible by the human visual system, and may appear as grayscale on sheet of paper 15. In one embodiment, the marks 18 are embodied as dots; however, the present invention is not so limited.
FIG. 4 shows an enlarged portion 19 of the position code 17 of FIG. 3. An optical device such as device 100 or 200 (FIGS. 1 and 2) is positioned to record an image of a region of the position code 17. In one embodiment, the optical device fits the marks 18 to a reference system in the form of a raster with raster lines 21 that intersect at raster points 22. Each of the marks 18 is associated with a raster point 22. For example, mark 23 is associated with raster point 24. For the marks in an image/raster, the displacement of a mark from the raster point associated with the mark is determined. Using these displacements, the pattern in the image/raster is compared to patterns in the reference system. Each pattern in the reference system is associated with a particular location on the surface 70. Thus, by matching the pattern in the image/raster with a pattern in the reference system, the position of the pattern on the surface 70, and hence the position of the optical device relative to the surface 70, can be determined. Additional information is provided by the following patents and patent applications, herein incorporated by reference in their entirety for all purposes: U.S. Pat. No. 6,502,756; U.S. patent application Ser. No. 10/179,966, filed on Jun. 26, 2002; WO 01/95559; WO 01/71473; WO 01/75723; WO 01/26032; WO 01/75780; WO 01/01670; WO 01/75773; WO 01/71475; WO 01/73983; and WO 01/16691. See also Patent Application No. 60/456,053, filed on Mar. 18, 2003, and patent application Ser. No. 10/803,803, filed on Mar. 17, 2004, both of which are incorporated by reference in their entirety for all purposes.
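By way of illustration, the following is a minimal Python sketch of the raster-fitting and displacement-reading steps described above. The grid spacing, the four-direction displacement alphabet, and the use of one symbol per mark are assumptions for illustration, not details taken from the referenced patents and applications.

```python
# Minimal sketch: snap imaged marks to a raster and classify each mark's
# displacement from its raster point. Grid spacing and the four-direction
# alphabet are illustrative assumptions.

GRID = 0.3  # nominal raster spacing in millimeters (per the description above)

def nearest_raster_point(x, y, grid=GRID):
    """Snap an imaged mark to the nearest raster intersection."""
    return round(x / grid) * grid, round(y / grid) * grid

def displacement_symbol(x, y, grid=GRID):
    """Classify a mark's offset from its raster point as one of four
    directions, one common way such patterns carry information per mark."""
    rx, ry = nearest_raster_point(x, y, grid)
    dx, dy = x - rx, y - ry
    if abs(dx) >= abs(dy):
        return "E" if dx > 0 else "W"
    return "N" if dy > 0 else "S"

def region_signature(marks):
    """Convert a list of (x, y) mark coordinates into a displacement string
    that can be matched against the reference system's known patterns."""
    return "".join(displacement_symbol(x, y) for x, y in marks)
```

Matching such a signature against the stored signatures of the reference system would then yield the position of the imaged region on the surface.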
With reference back to FIG. 1, four positions or regions on surface 70 are indicated by the letters A, B, C and D (these characters are not printed on surface 70, but are used herein to indicate positions on surface 70). There may be many such regions on the surface 70. Associated with each region on surface 70 is a unique pattern of marks. The regions on surface 70 may overlap because even if some marks are shared between overlapping regions, the pattern of marks in a region is still unique to that region.
In the example of FIG. 1, using device 100 (specifically, using writing element 52), a user may create a character consisting, for example, of a circled letter “M” at position A on surface 70 (generally, the user may create the character at any position on surface 70). The user may create such a character in response to a prompt (e.g., an audible prompt) from device 100. When the user creates the character, device 100 records the pattern of markings that are uniquely present at the position where the character is created. The device 100 associates that pattern of markings with the character just created. When device 100 is subsequently positioned over the circled “M,” device 100 recognizes the pattern of marks associated therewith and recognizes the position as being associated with a circled “M.” In effect, device 100 recognizes the character using the pattern of markings at the position where the character is located, rather than by recognizing the character itself.
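A minimal sketch, assuming a hashable decoded-pattern key, of the association just described: the device stores the pattern found where the user drew the character, and later recognizes the character by pattern lookup rather than by recognizing the drawn shape itself. The registry structure and key values are invented for illustration.

```python
# Minimal sketch: associate the unique pattern at a position with the
# character (and optionally a command) the user created there.

class CharacterRegistry:
    def __init__(self):
        self._by_pattern = {}  # decoded pattern key -> (character, command)

    def register(self, pattern_key, character, command=None):
        """Called when the user creates a character at a new position."""
        self._by_pattern[pattern_key] = (character, command)

    def recognize(self, pattern_key):
        """Called when the pen is later placed over a written character."""
        return self._by_pattern.get(pattern_key)

registry = CharacterRegistry()
registry.register("pattern-at-A", "circled-M", command="open-menu")
assert registry.recognize("pattern-at-A") == ("circled-M", "open-menu")
```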
In one embodiment, the character is associated with a particular command. In the example just described, a user can create (write) a character that identifies a particular command, and can invoke that command repeatedly by simply positioning device 100 over the written character. In other words, the user does not have to write the character for a command each time the command is to be invoked; instead, the user can write the character for a command one time and invoke the command repeatedly using the same written character.
FIG. 5 shows a region 50 on a surface 70 (on a sheet of paper 15, for example) that can be used to store encoded information according to one embodiment of the present invention. Although the example of FIG. 5 shows a sheet of paper, embodiments in accordance with the present invention can be implemented on other types and shapes of surfaces made of various types of materials, as mentioned above.
Region 50 includes a pattern of marks such as dots. In the embodiment of FIG. 5, the pattern of marks is used to store encoded information. More specifically, information is binary encoded (e.g., as bit values of zero or one), and the binary values are translated into a particular pattern of marks that are printed on surface 70. In one embodiment, up to 50 bytes per inch can be stored in region 50. In one embodiment, the information encoded in region 50 is also encrypted.
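As a rough sketch of how binary values could be translated into such a pattern, the following encodes each bit as a dot whose size carries its value; the radii and spacing are invented for illustration and are not the actual encoding that achieves 50 bytes per inch.

```python
# Minimal sketch: translate bytes into a one-row pattern of marks in which
# dot size encodes the bit value (large = 1, small = 0). All geometry here
# is an illustrative assumption.

def bytes_to_bits(data: bytes):
    for byte in data:
        for i in range(7, -1, -1):
            yield (byte >> i) & 1

def encode_region(data: bytes, spacing=0.3):
    """Return (x_position, dot_radius) pairs describing the printed marks."""
    marks = []
    for n, bit in enumerate(bytes_to_bits(data)):
        radius = 0.10 if bit else 0.05
        marks.append((n * spacing, radius))
    return marks
```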
Surface 70 may contain other information in addition to region 50. For example, surface 70 may contain a pattern of markings such as that described above in conjunction with FIGS. 3 and 4. That is, some portions of surface 70 can be used to determine the position of an optical pen relative to surface 70, while other portions of surface 70 can be used to store encoded information. Surface 70 can include yet other information. For example, surface 70 may contain text-based or image-based information. As a specific example, surface 70 may be a page in a magazine that contains articles and pictures as well as the patterns of markings referred to above. The theme of the information included on surface 70 may be related to the type of information encoded in region 50.
Region 50 of FIG. 5 is identified or labeled in some manner so that a user can conveniently locate it on surface 70. For example, a visible border or margin can be printed around region 50 to delineate the region on the surface 70. The pattern of marks in region 50 is read by passing the optical emitter 44 and optical detector 42 of an optical pen (e.g., device 100 or 200 of FIGS. 1 and 2) over region 50. The presence of a border helps a user keep the optical pen within region 50 during scanning, so that the pattern of marks in region 50 can be accurately read without stray marks being inadvertently picked up from outside region 50.
In the example of FIG. 5, the orientation of region 50 is horizontal; however, other orientations are permitted. Also, in the example of FIG. 5, region 50 is substantially rectangular in shape; however, other shapes are possible, including shapes that are curved or non-linear.
In one embodiment, the pattern of marks in region 50 is scanned by passing an optical pen (e.g., device 100 or 200 of FIGS. 1 and 2) in one direction over region 50. However, depending on how the information is encoded in region 50, the present invention is not so limited. For example, a user may move the optical pen in different directions within region 50. Also, the optical pen may be moved at a constant speed or at varying speeds as the pen is passed over region 50.
In the example of FIG. 5, a visual cue such as arrow 56 is used to indicate to a user the direction in which the pattern of marks in region 50 is to be scanned. Visual cues indicating where to begin and where to end the scanning can also be used. Written instructions to assist the user can also be provided.
In one embodiment, region 50 is demarcated by a first tag or region 53 and a second tag or region 54. In an example in which region 50 is read from left to right, region 53 indicates the start of region 50 or the start of the encoded information, and region 54 indicates the end of region 50 or the end of the encoded information. In such an example, an optical pen (e.g., device 100 or 200 of FIGS. 1 and 2) can be placed against or nearly against region 53 to start the process of scanning region 50. Upon reaching region 54 after traversing the length of region 50, the process of scanning region 50 is ended.
In one embodiment, a unique pattern of marks is associated with each of the regions 53 and 54 of FIG. 5, as described previously in conjunction with FIGS. 3 and 4. In such an embodiment, device 100 or 200 (FIGS. 1 and 2) is programmed to recognize those unique patterns as being associated with a beginning-of-encoded-information tag and an end-of-encoded-information tag, allowing those tags to be used universally.
There are in effect two dimensions associated with the pattern of marks in region 50. In the example of FIG. 5, region 50 is scanned in the x-direction (e.g., left to right). In this example, in one embodiment, linear position within region 50 is encoded in the x-dimension, while information is encoded in the y-dimension. That is, as region 50 is traversed in the x-direction with an optical pen (e.g., device 100 or 200 of FIGS. 1 and 2), data pairs (x, y) are read and stored by the optical pen. The x-value in each (x, y) pair provides a position along the x-axis within region 50, and the y-value in each (x, y) pair provides a data value (e.g., a binary value) corresponding to the encoded information.
The x-values can be determined by capturing an image of the pattern of marks and interpreting the positions of the marks, as described above in conjunction with FIGS. 3 and 4. Information can be stored in the y-dimension using a variety of techniques. The y-values can be determined by capturing an image of the pattern of marks and interpreting the marks. For example, the size of the marks can be varied, with a mark of one size indicating a binary value of zero (0) and a mark of another size indicating a binary value of one (1). Alternatively, the distance between marks can be used to indicate binary values. For example, if marks are expected to be spaced at uniform distances in the y-direction, the absence of a mark can indicate a binary value of 0 while the presence of a mark can indicate a binary value of 1. Alternatively, the displacement of a mark relative to a raster point (refer to FIG. 4 above) can be used to indicate one binary value versus another. The optical pen (e.g., device 100 or 200 of FIGS. 1 and 2) is programmed to interpret whatever encoding scheme is used. Also, the optical pen can check for errors that may occur in the scanning or interpretation of the markings in a region 50.
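A minimal decoding sketch for this two-dimensional scheme, assuming each scanned sample pairs an x-position with the bit value read in the y-dimension; sorting by x makes the result independent of the speed and direction of the swipe, as the description permits. The sample format and eight-bit packing are assumptions for illustration.

```python
# Minimal sketch: rebuild the encoded bytes from (x_position, bit) samples
# captured while the pen traverses region 50, in any order and at any speed.

def decode_samples(samples):
    """samples: iterable of (x_position, bit) pairs; duplicate reads of the
    same position simply overwrite one another."""
    by_position = {}
    for x, bit in samples:
        by_position[x] = bit
    bits = [by_position[x] for x in sorted(by_position)]
    out = bytearray()
    for i in range(0, len(bits) - 7, 8):  # pack complete 8-bit groups
        byte = 0
        for b in bits[i:i + 8]:
            byte = (byte << 1) | b
        out.append(byte)
    return bytes(out)
```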
Depending on the encoding scheme used, knowledge of position within a region 50 may not be necessary.
In the example of FIG. 5, a single region 50 is illustrated. However, there may be more than one such region on a surface 70. If there is more than one such region, the regions may be of different sizes and shapes and may be oriented differently relative to one another. Also, the regions may be scanned in different directions relative to one another. For example, consider two such regions that are rectangular in shape, with their longer sides oriented horizontally on surface 70 as illustrated in FIG. 5, and with one region just below the other on surface 70. The first of those regions can be read from left to right while the second is read from right to left, facilitating the scanning of those regions by reducing the amount of movement of an optical pen (e.g., device 100 or 200 of FIGS. 1 and 2) between the regions.
The information encoded by multiple regions 50 can be divided among those regions. For example, part of the encoded information may be included in a first region 50, while the remainder of the encoded information is included in a second region 50. The first and second regions may or may not be on the same surface 70 (e.g., on the same piece of paper). During decoding and processing of the scanned and encoded information, an optical pen (e.g., device 100 or 200 of FIGS. 1 and 2) can then integrate the parts.
The information encoded by the pattern of marks in region 50 of FIG. 5 can be used in a variety of different ways. For example, that information can be used to update an application resident on an optical pen (e.g., device 100 or 200 of FIGS. 1 and 2), or to add a new application to the optical pen. Also, a new application can be added to the optical pen by populating (parameterizing) an application template previously installed on the optical pen with the information encoded in a region 50. In these examples, the information encoded in region 50 includes information identifying an application with which it is associated. Information identifying an application can alternatively be included in region 53 or 54. Additional information is provided in conjunction with FIGS. 8 and 9, below.
The information encoded within region 50 of FIG. 5 can also be used for an application referred to herein as “sound swipe.” For example, by passing the optical pen (e.g., device 100 or 200 of FIGS. 1 and 2) through region 50, an audible message encoded by the pattern of marks in region 50 is rendered. An application such as sound swipe can be used in advertising or contest promotions, for example. Also, the audible message encoded by the pattern of marks in region 50 can be used to provide information or feedback to a user as the optical pen is being passed over region 50. For example, as the pattern of marks is being scanned, a beep or similar type of sound (in general, any type of sound) can be audibly rendered. A sound generally recognized as pleasant can be used to indicate that the scanning is being performed correctly, while a sound generally recognized as unpleasant can be used to indicate that the scanning is not proceeding correctly. The speed at which the sound is played can also be used to provide feedback to a user. In general, as a region 50 is being scanned, information relevant to the actions of a user can be encoded in the region 50 and audibly rendered to the user as the user is scanning the pattern of markings.
An audible message rendered from the information in a region 50 can also be used to provide direction or feedback to a user during the act of scanning, as part of a software application that the user is executing. In other words, the information encoded in a region 50 can be used to provide audible feedback to a user to indicate how well the pattern of markings is being scanned, or to indicate how well the user is performing in, for example, a gaming application. For example, a user may be using a software application for a maze game, in which the optical pen (e.g., device 100 or 200 of FIGS. 1 and 2) is moved by the user through a maze that is illustrated on a surface 70. The “walls” of the maze may be regions (such as a region 50) in which the encoded information results in a sound being rendered when the optical pen makes contact with a wall. In general, the information encoded in a region 50 can be used to affect the actions of a user as the pattern of markings is being scanned (in the act of scanning).
An audible message can also be used to maintain the user's interest during scanning. As mentioned above, there may be multiple regions 50. An audible message, the content of which may be unrelated to the type of information encoded in the regions 50, may be rendered to encourage a user to scan all of the regions. For example, a riddle may be verbalized as the user scans the regions 50, with the solution to the riddle not being provided until all of the regions 50 are scanned.
In each of the examples above, the message that is audibly rendered may be based on information encoded in a region 50, on information stored on the optical pen (e.g., device 100 or 200 of FIGS. 1 and 2), or on a combination of the two.
FIG. 6 shows a region 58 on a surface 70 (on a sheet of paper 15, for example) according to another embodiment of the present invention. In this embodiment, an application such as sound swipe is used to provide audible directions to a user, to assist the user in scanning region 50. For example, the user can position an optical pen against or nearly against region 58, which includes information encoded as a pattern of marks in a manner similar to that of region 50. In response to the scanning and interpreting of the information encoded in region 58, an audible message is rendered, instructing the user on how to scan region 50. Alternatively, scanning region 58 can invoke a command that causes the optical pen (e.g., device 100 or 200 of FIGS. 1 and 2) to play a recorded message that instructs the user on how to scan region 50.
FIG. 7 is a flowchart 700 of a computer-implemented method for retrieving encoded information according to one embodiment of the present invention. Although specific steps are disclosed in flowchart 700, such steps are exemplary. That is, embodiments of the present invention are well suited to performing various other steps or variations of the steps recited in flowchart 700. It is appreciated that the steps in flowchart 700 may be performed in an order different than presented, and that not all of the steps in flowchart 700 may be performed. In one embodiment, with reference also to FIGS. 1 and 2, flowchart 700 is implemented by a device such as device 100 or 200 as computer-readable program instructions stored in a memory unit (e.g., memory unit 48) and executed by a processor (e.g., processor 32).
In one embodiment, in step 72, a pattern of markings on a surface is decoded to recover information encoded by the pattern. In one embodiment, the markings are sensed using an optical sensor (e.g., optical detector 42 of FIGS. 1 and 2). Linear position within the pattern is encoded in a first dimension of the pattern, and the information is encoded in a second dimension of the pattern. The information may include data and/or instructions that may be executed by the processor of the optical pen.
In step 74, a software application associated with the information is identified. For example, the information can include the identity of the software application with which the information is associated.
In step 76, the information or some portion thereof can then be used with the software application. For example, the software application can be executed using the decoded information. This is described further in conjunction with FIGS. 8 and 9.
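A minimal sketch of steps 74 and 76, assuming for illustration that the decoded information begins with a one-byte application identifier followed by a payload; the registry and handler signature are invented, not part of the described embodiments.

```python
# Minimal sketch: identify the application named in the decoded information
# (step 74) and hand the remaining payload to that application (step 76).

APPLICATIONS = {}  # application id -> handler(payload)

def register_app(app_id, handler):
    APPLICATIONS[app_id] = handler

def process_decoded_information(decoded: bytes):
    app_id, payload = decoded[0], decoded[1:]
    handler = APPLICATIONS.get(app_id)
    if handler is None:
        raise KeyError(f"no application registered for id {app_id}")
    return handler(payload)

register_app(7, lambda payload: print("memory match update:", payload))
process_decoded_information(bytes([7]) + b"oboe")
```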
FIG. 8 shows an exemplary user interface 800 used with a software application installed on an optical pen (e.g., devices 100 and 200 of FIGS. 1 and 2) in accordance with one embodiment of the present invention. FIG. 9 shows an exemplary user interface 900 used with another software application installed on an optical pen (e.g., devices 100 and 200 of FIGS. 1 and 2) in accordance with one embodiment of the present invention. The user interfaces 800 and 900 may be printed on a sheet of paper, although the present invention is not so limited.
In actuality, the user interfaces 800 and 900 may not include the text-based information (e.g., “oboe,” “oboe sound,” etc.) that is shown in FIGS. 8 and 9. To illustrate this by way of an example, an application that can be implemented using an optical pen such as device 100 or 200 of FIGS. 1 and 2 is referred to herein as “memory match.” In a memory match application, a user is presented with an interface such as the interfaces 800 and 900; however, the interfaces 800 and 900 do not include the text-based information shown. The user attempts to match a spoken word (e.g., oboe) with a sound associated with that spoken word (e.g., an oboe sound). The user touches the optical pen to one of the positions in the interfaces 800 and 900, and an audible message is generated in response. For example, with reference to FIG. 8, if a user touches the optical pen to the position in interface 800 associated with the word “oboe,” the word “oboe” is audibly rendered in response, and if a user touches the optical pen to the position in interface 800 associated with the sound of an oboe (e.g., “oboe sound”), the sound of an oboe is audibly rendered in response. The user then touches another position within the interface 800 and another message is audibly rendered; either a word is pronounced, or a sound associated with a word is played. The user proceeds in this manner in an attempt to match the position associated with the word “oboe” with the position associated with the sound of an oboe. The memory match application works in a similar fashion with the interface 900, in which a user attempts to match state names and capitals.
In one embodiment, information encoded in a region 50 (FIG. 5) can be used to update an application already installed on device 100 or 200 of FIGS. 1 and 2. For example, a memory match application in which the names and sounds of musical instruments are to be located and matched (as in FIG. 8) may be installed on device 100 or 200; information encoded in a region 50 can be used to add new instrument names and sounds to such an application.
Consider an example in which the word “oboe” (specifically, the spoken word “oboe”) is to be added to an application such as a memory match application. The spoken word “oboe” can be encoded in a region 50 and added to device 100 or 200 (FIGS. 1 and 2) as previously described herein. Alternatively, the word “oboe” can be audibly rendered by device 100 or 200 using phonetics-to-speech (PTS) synthesis. In the latter instance, a library or database of phonemes is installed on devices 100 and 200. Each of the phonemes can be uniquely identified in the database using a respective index (e.g., a unique binary value is associated with each phoneme). Thus, the information in a region 50 need only include the indices of the phonemes that are to be used to synthesize a new word to be added to the device 100 or 200. That is, for example, the information encoded in a region 50 need only identify the indices for the phonemes associated with the word “oboe.” This can greatly reduce the amount of information that needs to be encoded in a region 50, relative to the amount that would need to be encoded if the spoken word were itself encoded.
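The saving can be sketched as follows, with an invented phoneme inventory and index values; the point is that a region 50 need only carry a few small indices rather than encoded audio.

```python
# Minimal sketch: a region encodes only indices into a phoneme library
# already stored on the device; the PTS engine synthesizes the word from
# the resolved phonemes. Inventory and index values here are illustrative.

PHONEME_LIBRARY = {0: "OW", 1: "B", 2: "OW"}  # index -> phoneme symbol

def phonemes_for(indices):
    """Resolve decoded indices to phonemes for the PTS engine."""
    return [PHONEME_LIBRARY[i] for i in indices]

# Adding the spoken word "oboe" costs only these few index values:
print(phonemes_for([0, 1, 2]))  # ['OW', 'B', 'OW']
```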
The game “Hangman” provides another example of how information in a region 50 (FIG. 5) can be used to supplement information already installed on device 100 or 200 of FIGS. 1 and 2. Information in a region 50 can supplement an application by making the application customized to the surface with which the user is interfacing (e.g., the surface on which the user is writing). For example, a certain list of words that can be used in Hangman may be installed on the device 100 or 200 when the device is shipped or sold. New words to be added to the Hangman application can be encoded in a region 50, and that information can be used to add new words to the list of words used by Hangman by scanning the region 50 with the device 100 or 200. For instance, a preprinted theme can be illustrated on a surface 70 (FIG. 5) that contains a region 50, with the words encoded in the region 50 associated with that theme.
As another example, a template for an application can be installed on device 100 or 200 of FIGS. 1 and 2, and the information encoded in a region 50 (FIG. 5) can be used to populate that template with information that produces a new application. For example, with reference again to FIGS. 8 and 9, a template for memory match applications can be installed on device 100 or 200 of FIGS. 1 and 2. The template would define the type of user interface, the types of interactions that occur between the user and the user interface, and the like. In essence, the template would define the structure of a blank user interface which is to be populated with information for a memory match application. Information encoded in a region 50 can be used to populate the template with, for example, the names and sounds of musical instruments to produce a first software application. Information encoded in another region 50 can be used to populate the template with, for example, the names of states and their capitals to produce a second software application. Templates for other types of applications can be similarly defined and populated.
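A minimal sketch of the template idea, assuming a template holding a title and a list of matching pairs; populating the same blank structure with different decoded data yields the two different memory match applications described above.

```python
# Minimal sketch: one installed template, populated twice with different
# decoded data, yields two distinct memory match applications.

from dataclasses import dataclass, field

@dataclass
class MemoryMatchTemplate:
    title: str = "untitled"
    pairs: list = field(default_factory=list)  # (item, match) tuples

    def populate(self, title, pairs):
        """Fill the blank template with data decoded from a region 50."""
        self.title, self.pairs = title, list(pairs)
        return self

instruments = MemoryMatchTemplate().populate(
    "Instruments", [("oboe", "oboe sound"), ("cello", "cello sound")])
capitals = MemoryMatchTemplate().populate(
    "State Capitals", [("Ohio", "Columbus"), ("Texas", "Austin")])
```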
In another example, an application (or an application template) may define certain areas of a surface 70 (FIGS. 1 and 2) as being regions in which handwritten user input is received. Information encoded in a region 50 can be used to install questions onto device 100 or 200 that can be audibly rendered to the user. The information encoded in a region 50 can also include the correct answers to those questions. In response to hearing a question, the user writes an answer into a designated region of surface 70, using the writing element 52 of device 100 or 200 to write the answer. Upon scanning the handwritten answer, device 100 or 200 can use character recognition techniques to interpret the handwritten answer and compare it to the correct answer.
As another example, a book can be printed on paper that has printed thereon the pattern of markings described above in conjunction with FIGS. 3 and 4. Each position in the book is associated with a unique pattern of markings (e.g., a unique pattern of dots). Thus, each word in the book is also associated with a unique pattern of markings. A mapping of the words to their respective patterns is stored in a database on device 100 or 200 (FIGS. 1 and 2). The device 100 or 200 can be used to scan and read the pattern of markings at a particular location in the book, and the mapping can be used to identify the word at that location. Different activities can occur once the word at a location is so identified. For example, the word can be verbalized using device 100 or 200 to identify how the word is pronounced, a definition of the word can be verbalized, or the word can be translated into a different language.
As an alternative to the above, a standardized pattern of markings is established for each word in a database (e.g., a lexicon). In other words, each word in the lexicon is associated with a unique pattern of markings such as those described in conjunction with FIG. 5. Thus, a particular pattern of markings uniquely identifies a particular word. The unique pattern of markings for a word may be an encoded version of the word (e.g., an ASCII version of the word) or it may be an index that points to the word inside the lexicon. When a book is printed, both the text of the book and the pattern associated with each word of text are printed; that is, a word and the unique pattern associated with that word are both printed at the same location on the page, so that the word and its associated pattern of markings are physically linked. The device 100 or 200 can be used to scan and read the pattern of markings at a particular location in the book, a word at that location can be identified from the pattern of markings, and applications such as those described above can then be utilized (e.g., the word can be defined, etc.). In general, a unique pattern of markings printed, for example, on a surface 70 can be used to identify an item of content that is located at the same position on surface 70 as the pattern of markings.
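A minimal sketch of the index-style alternative, with invented index values, words, and definitions: the pattern read at a word's location resolves through the device's lexicon to the word, after which any of the described actions can be applied.

```python
# Minimal sketch: a pattern that decodes to an index identifies a word in
# the device's lexicon; the word can then be pronounced, defined, or
# translated. All entries here are illustrative.

LEXICON = {1042: "oboe", 1043: "cello"}  # pattern index -> word
DEFINITIONS = {"oboe": "a double-reed woodwind instrument"}

def word_at(pattern_index):
    return LEXICON.get(pattern_index)

def define(pattern_index):
    word = word_at(pattern_index)
    return DEFINITIONS.get(word, f"no definition stored for {word!r}")

print(define(1042))  # a double-reed woodwind instrument
```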
The discussion above presents just a few examples of how information encoded in a region 50 (FIG. 5) can be used with an optical pen such as devices 100 and 200 of FIGS. 1 and 2. In general, embodiments in accordance with the present invention provide a convenient and user-friendly mechanism for adding information to optical devices such as optical pens, in order to expand the functionalities of the devices beyond those provided when the devices were shipped or sold, as well as to enhance the experience of using the devices.
Embodiments of the present invention are thus described. While the present invention has been described in particular embodiments, it should be appreciated that the present invention should not be construed as limited by such embodiments, but rather construed according to the below claims.