CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims benefit of U.S. Provisional Patent Application Ser. No. 61/755,881, filed Jan. 23, 2013, entitled “Method and System For Discriminating Pen and Touch Interactions”, U.S. Provisional Patent Application Ser. No. 61/791,577, filed Mar. 15, 2013, entitled “Method and System for Discriminating Stylus and Touch Interactions” (Atty Dkt No. LOGI/0005L), U.S. Provisional Patent Application Ser. No. 61/738,797, filed Dec. 18, 2012 entitled “Electronically Augmented Pen Tip For A Touch Pad Digitizer” (Atty Dkt No. LOGI/0003L), U.S. Provisional Patent Application Ser. No. 61/762,222, filed Feb. 7, 2013, entitled “Electronically Augmented Pen Tip For A Touch Pad Digitizer” (Atty Dkt No. LOGI/0003L02) and U.S. Provisional Patent Application Ser. No. 61/790,310, filed Mar. 15, 2013, entitled “Active Stylus For Touch Sensing Applications” (Atty Dkt No. LOGI/0003L03), which are all herein incorporated by reference.
BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to a method and a system that are able to discriminate between the interactions of an electronic stylus pen, a user's finger(s) or a user's appendage and a touch-screen-containing device.
2. Description of the Related Art
Touch-screen tablet computers allow a user the ability to interact directly with content displayed on the touch-screen of the tablet computer. These interactions can be conducted through various means, but are typically conducted through touch, by way of the user's fingers directly interacting with the screen, or through the use of a stylus pen or other type of input control device that contacts the screen based on movements made by the user. Typically, touch-screens distinguish touch inputs from stylus pen inputs by using various sensing technologies or input modes that the user has to select based on the operations the user wants to conduct on the touch-screen of the tablet computer. Other typical solutions require stylus pen inputs to originate from a stylus pen that is physically tethered to the tablet computer.
Collecting touch information from these types of interface mechanisms also introduces a number of challenges. Moreover, the process of reliably collecting touch information becomes increasingly more complicated where the computing device allows a user to input information using both a touch input mechanism and a stylus pen input mechanism. In the course of interfacing with the touch sensitive surface of the computing device with a stylus pen device, the user may inadvertently rest his or her palm on the touch sensitive surface. The computing device may then incorrectly interpret this inadvertent contact as a legitimate input activity. A similar challenge may confront a user who is intentionally using a touch input mechanism to control or input data to the computing device. In some cases, the user may attempt to apply a focused touch to the surface of the computing device, yet the user may accidentally brush or bump his or her hand against other parts of the display surface, causing accidental input events. These problems may understandably frustrate the user if they become a frequent occurrence, or even if uncommon, if they cause significant disruption in the task that the user is performing.
Moreover, due to limitations in the computing power of the computing device, a desire to increase the speed of the computing device by reducing the computational power required to collect and transfer the touch interaction data, and/or the often limited nature of the data received from the touch sensing components of a third party's computing device on which a hardware and software application (e.g., “app”) maker's software is running, there is a need for a method that can distinguish between the different user inputs by use of a simplified data set that is created by the computing device from the interaction of the user's fingers, appendage and/or stylus pen. In some cases, the simplified data set includes the coordinates of a touch point and the time when the touch point was sensed by the touch sensing components. The simplified data set is generally a small fraction of the amount of data that is commonly available from the touch sensitive hardware in conventional touch-sensitive-display computing devices today.
Despite the progress made with respect to operating touch screen tablet computers, there is a need in the art for improved methods and systems related to distinguishing different inputs provided to tablet computers in spite of the problems discussed above.
SUMMARY OF THE INVENTION
Embodiments relate generally to control devices, such as human interface devices, configured for use with a touch screen tablet computer. More specifically, the present invention relates to methods and systems for discriminating between the interactions of a handheld device, the touch of one or more of the user's finger(s) and the interaction of the user's appendages with a touch-screen tablet computer. The methods described herein may include discriminating between the interaction of the handheld device, such as an electronic stylus pen, the user's finger(s) and a user's appendage so that the collected information can be used to control some aspect of the hardware or software running on the touch-screen tablet computer. The methods disclosed herein may also be used to separate the interaction of the user's appendage from the interactions of the handheld device and/or user's finger(s) with the touch-screen tablet computer. In one example, the information received from the appendage of the user is distinguished from the information received from the interaction of a stylus pen or the user's finger with the touch-screen tablet computer, and is purposely not used to control the hardware and/or software running on the touch-screen tablet computer.
Embodiments provide a method of operating a host device, comprising receiving, at the host device, information related to a touch-down event, receiving, at the host device, information related to a touch event from a controlling engine, correlating the information related to the touch-down event with the information related to the touch event, and determining that the touch-down event is associated with a handheld device.
Embodiments further provide a method of characterizing user input data received by a host device, comprising receiving, at the host device, information related to a first touch event from a touch sensing unit coupled to the host device, wherein the information from the first touch event comprises a first touch data point, comparing the first touch event information with a first rule and a second rule, wherein the first rule and the second rule each form a vote as to the type of user input that created the first touch event, and attributing the first touch data point to a type of user input by analyzing the votes received from the first and second rules. However, in some embodiments, more than two rules may be used to determine the type of user input.
Embodiments may further provide a method of characterizing user input data received by a host device, comprising receiving, at the host device, information related to a touch-down event from a handheld device, wherein the information related to the touch-down event comprises information relating to a first time when the touch-down event occurred, receiving, at the host device, information related to a first touch event and a second touch event from a touch sensing unit coupled to the host device, wherein the information provided for the first touch event comprises a first touch data point and information relating to a second time, and the information provided for the second touch event comprises a second touch data point and information relating to a third time, analyzing the information received by the host device, comprising comparing a predetermined threshold time and the information relating to the first time and the second time, and then assigning a first user input type vote to the first touch data point based on the comparison, and comparing a first position of the first touch data point on a user interface of the host device and a second position of the second touch data point on the user interface of the host device, and then assigning a second user input type vote to the first touch data point based on the comparison of the first position relative to the second position, and attributing a type of user input to the first touch data point using the first user input type vote and second user input type vote.
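For illustration only, the following Python sketch shows one way the two-rule voting described above could be expressed, with a first rule that votes based on the time between the touch-down event and the touch event and a second rule that votes based on the relative positions of two touch data points. The threshold values, vote labels and function names are assumptions made for this example and are not part of any particular embodiment.

    import math

    # Hypothetical illustration of two-rule voting; thresholds and labels are assumed.
    THRESHOLD_TIME_MS = 30.0      # assumed predetermined threshold time
    PALM_RADIUS_PX = 120.0        # assumed distance within which a point is voted "appendage"

    def time_rule_vote(touch_down_time_ms, first_touch_time_ms):
        """First rule: vote based on how close the touch event is to the touch-down event."""
        if abs(first_touch_time_ms - touch_down_time_ms) <= THRESHOLD_TIME_MS:
            return "stylus"
        return "finger"

    def position_rule_vote(first_point, second_point):
        """Second rule: vote based on the first point's position relative to a second point."""
        dx = first_point[0] - second_point[0]
        dy = first_point[1] - second_point[1]
        if math.hypot(dx, dy) <= PALM_RADIUS_PX:
            return "appendage"   # close to another contact, likely part of a palm cluster
        return "stylus"

    def attribute_user_input(votes):
        """Attribute the touch data point to the user input type with the most votes."""
        return max(set(votes), key=votes.count)

    votes = [time_rule_vote(100.0, 112.0), position_rule_vote((420, 300), (800, 650))]
    print(attribute_user_input(votes))   # e.g. "stylus"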
Embodiments further provide a method of characterizing user input data received by a host device, comprising receiving, at the host device, information related to a touch-down event from a handheld device, wherein the information comprises information relating to a first time when the touch-down event occurred, receiving, at the host device, information related to a first touch event from a touch sensing unit coupled to the host device, wherein the information comprises information relating to a second time when the touch event occurred on a touch sensitive unit of the host device, correlating the information related to the touch-down event with the information related to the first touch event, wherein correlating the information comprises comparing the first time, the second time and a predetermined threshold, and determining that the touch-down event is associated with the handheld device when the difference in time between the first and second time is less than the predetermined threshold.
Embodiments further provide a method of characterizing user input data received by a host device, comprising receiving, at the host device, information related to a touch-down event from a handheld device, receiving, at the host device, information related to a plurality of touch events from a touch sensing unit coupled to the host device, defining a portion of the plurality of touch events as being part of a first cluster of touch events, correlating the information related to the touch-down event with the information related to the first cluster of touch events, determining that the first cluster of touch events is associated with a user's appendage, and determining that at least one touch event of the plurality of touch events is associated with a handheld device, wherein the at least one touch event is not within the first cluster.
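Purely as an illustrative sketch, the cluster-based determination described above might be approximated as follows, where touch events are grouped by proximity, a sufficiently large cluster is attributed to the user's appendage (e.g., a palm), and an isolated touch event outside the cluster remains a candidate for the handheld device. The clustering radius, minimum cluster size and names are assumptions for this example.

    import math

    CLUSTER_RADIUS_PX = 100.0   # assumed grouping distance
    MIN_PALM_POINTS = 3         # assumed minimum cluster size to call it an appendage

    def cluster_touch_points(points, radius=CLUSTER_RADIUS_PX):
        """Group touch points whose distance to an existing cluster member is within radius."""
        clusters = []
        for p in points:
            for cluster in clusters:
                if any(math.dist(p, q) <= radius for q in cluster):
                    cluster.append(p)
                    break
            else:
                clusters.append([p])
        return clusters

    def label_clusters(points):
        labels = {}
        for cluster in cluster_touch_points(points):
            label = "appendage" if len(cluster) >= MIN_PALM_POINTS else "stylus_or_finger"
            for p in cluster:
                labels[p] = label
        return labels

    touches = [(610, 540), (640, 575), (655, 520), (200, 150)]
    print(label_clusters(touches))   # the isolated point remains a stylus/finger candidate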
Embodiments further provide a computer readable medium configured to store instructions executable by a processor of a host device to characterize user input data received by the host device, the instructions when executed by the processor causing the processor to receive information related to a first touch event from a touch sensing unit coupled to the host device, wherein the information from the first touch event comprises a first touch data point, compare the first touch event information with a first rule and a second rule, wherein the first rule and the second rule each form a vote as to the type of user input that created the first touch event; and attribute the first touch data point to a type of user input by analyzing the votes received from the first and second rules.
Embodiments further provide a method of operating a host device, comprising receiving, at the host device, information related to a touch-down event, receiving, at the host device, information related to a plurality of touch events from a controller, determining one or more clusters of touch events from the plurality of touch events, correlating the information related to the touch-down event with the information related to the one or more clusters of touch events, determining that one of the one or more clusters of touch events is associated with a palm, and determining that the touch-down event is associated with a handheld device.
In another embodiment, the handheld device includes at least one of an accelerometer, a magnetometer, a gyroscope, or the like for detecting the orientation of the handheld device and detecting a triggering event, which both can be used to help control some aspect of the hardware or software running on the touch-screen tablet computer.
BRIEF DESCRIPTION OF THE DRAWINGS
So that the manner in which the above recited features of the present invention can be understood in detail, a more particular description of the invention, briefly summarized above, may be had by reference to embodiments, some of which are illustrated in the appended drawings. It is to be noted, however, that the appended drawings illustrate only typical embodiments of this invention and are therefore not to be considered limiting of its scope, for the invention may admit to other equally effective embodiments.
FIG. 1 illustrates an exemplary touch-screen tablet computer and a capacitive stylus pen according to an embodiment of the invention.
FIG. 2 is a simplified block diagram of the components of a host device and stylus pen according to an embodiment of the invention.
FIG. 3A is a simplified block diagram of a user input discrimination processing architecture used to distinguishing between the different types of user inputs received by the touch-screen tablet computer according to an embodiment of the invention.
FIG. 3B is a flowchart illustrating a method of discriminating touch interactions from stylus pen interactions on a touch-screen according to an embodiment of the invention.
FIG. 3C is a simplified signal diagram illustrating aspects of the process of discriminating stylus pen interactions from touch interactions on a touch-screen, according to an embodiment of the invention.
FIG. 3D is a simplified signal diagram illustrating aspects of the process of discriminating stylus pen interactions from touch interactions on a touch-screen, according to an embodiment of the invention.
FIG. 4 is a simplified flowchart illustrating a method of discriminating touch interactions from stylus pen interactions on a touch-screen according to an embodiment of the invention.
FIG. 5A illustrates a plurality of related touch points on a touch-screen tablet computer that have been analyzed by a controlling engine according to an embodiment of the invention.
FIG. 5B illustrates a plurality of related touch points on a touch-screen tablet computer that have been analyzed by a controlling engine according to an embodiment of the invention.
FIG. 5C is a simplified flowchart illustrating a method of discriminating interactions caused by the palm of a user on a touch-screen according to an embodiment of the invention.
FIG. 6A is a simplified flowchart illustrating a method of discriminating touch interactions from stylus pen interactions on a touch-screen according to an embodiment of the invention.
FIG. 6B is a table listing examples of voting results contained in the decision matrix data generated during the method of discriminating between various touch interactions illustrated in FIG. 6A, according to one or more of the embodiments described herein.
FIG. 7 is a simplified signal diagram illustrating aspects of the process of discriminating stylus pen interactions from touch interactions on a touch-screen, according to an embodiment of the invention.
FIG. 8 is a simplified signal diagram illustrating aspects of the process of discriminating stylus pen interactions from touch interactions on a touch-screen, where stylus pen and touch interactions overlap, according to an embodiment of the invention.
FIG. 9A is an isometric cross-sectional view of a portion of a mutual capacitance sensing type host device that is interacting with an active stylus pen, according to an embodiment of the invention.
FIG. 9B is a schematic signal diagram illustrating aspects of the process of detecting a touch-sensing device output signal and synchronizing an active stylus pen thereto, according to an embodiment of the invention.
FIG. 9C illustrates the components of an active stylus pen 206 capable of interacting with a host device 100 that is configured for mutual capacitance sensing, according to an embodiment of the invention.
FIG. 10 illustrates simplified signature pulse diagrams that may be generated by two pens, according to an embodiment of the invention.
To facilitate understanding, identical reference numerals have been used, where possible, to designate identical elements that are common to the figures. It is contemplated that elements disclosed in one embodiment may be beneficially utilized on other embodiments without specific recitation. The drawings referred to here should not be understood as being drawn to scale unless specifically noted. Also, the drawings are often simplified and details or components omitted for clarity of presentation and explanation. The drawings and discussion serve to explain principles discussed below, where like designations denote like elements.
DETAILED DESCRIPTION
Embodiments of the present invention generally provide a system and methods of distinguishing between the different types of user inputs provided from the interaction of a user's finger, a user's appendage and/or a handheld device with a touch sensitive device. In some configurations the handheld device is an electronic stylus pen, or also referred to herein as simply a “stylus pen,” that a user uses to provide input to control some aspect of the touch sensitive device. Computing devices that provide software applications that allow a user to input information via a touch input mechanism and a stylus pen input mechanism are often complex due to the need to distinguish between the interaction of a user's finger, user's appendage and stylus pen with the touch sensitive device to properly control some aspect of the hardware or software applications running on the computing device. It is common for the software applications running on the computing device to assign different tasks or cause different computing device controlling events to happen based on the input received from either a stylus pen, a finger or an appendage. It is often desirable to single out the unwanted interactions with the touch sensitive device, such as interactions created by an appendage of a user (e.g., palm, shirt cuff, or other similar element), so that they can be purposely excluded from the input provided to and/or analyzed by one or more software applications running on the computing device. Errors in the proper selection of an inputting element will create errors in the output generated by the software running on the host device, which will understandably frustrate the user even if they are an uncommon occurrence. Moreover, improper selection errors can also cause significant disruption to the task that the user is performing on the computing device.
Embodiments of the invention described herein may also include a system and methods that employ a controlling engine running on a touch sensitive computing device, generally referred to herein as a host device, to discern between the user input received from a stylus pen, fingers or user's appendage. The data generated from the controlling engine's analysis of the user input data received from the various components that are coupled to or in communication with the touch sensitive computing device can then be used to control some aspects of the hardware or software running on the touch sensitive computing device. The controlling engine generally includes software instructions that include one or more input discrimination techniques that are used to analyze the various types of user input data received from one or more components in the touch sensitive device to determine the likely source of the user input. The one or more input discrimination techniques may include time based synchronization techniques, geometric shape discrimination techniques and inference based discrimination techniques that can be used separately or in combination to discern between different types of inputs received by the touch sensitive computing device. Touch sensitive computing devices may include a touch-screen tablet computer, which may use a resistive, capacitive, acoustic or other similar sensing technique to sense the input received from a user.
In some embodiments, a system and method are used to distinguish between different types of user inputs using a simplified data set that is created by the touch sensitive computing device from the interaction of a user's finger, user's appendage and/or a handheld device. In some cases, the simplified data only includes the coordinates of the touch point and the time that the interaction occurred with the touch sensing components, which is generally a small fraction of the amount of the data that is typically collected by conventional handheld or touch sensitive computing devices.
In FIG. 1, a system is depicted that includes a touch sensitive computing device, or host device 102, that includes a user interface 104. Host devices 102 include a user interface 104 capable of user interaction through a touch-screen sensing component. The host device 102 may be, for example, a general computing device, phone, media player, e-reader, kiosk, notebook, netbook, tablet type of computer, or any other device having one or more touch-sensitive inputs. In some devices, the user interface 104 can include components that are used to display applications being executed by the host device 102. In the example shown in FIG. 1, the host device 102 is an electronic device such as an iPad® device from Apple Inc. Exemplary embodiments of computing devices include, without limitation, the iPhone®, iPad® and iPod Touch® devices from Apple Inc., the Galaxy Note® 10.1 from Samsung, the Surface™ from Microsoft, other mobile devices, tablet computers, desktop computers, kiosks, and the like.
FIG. 1 also depicts a user input device, or a handheld device, in the form of a stylus pen 106 that is capable of touch interactions with the user interface 104 of the host device 102. While stylus pen 106 is a typical embodiment of the control device described herein, embodiments of the control device are not limited to a stylus pen 106, and may include control devices in other forms including stamps, and other devices that can be used to conduct touch interactions with the user interface 104, such as other fixed or detachable devices. One skilled in the art will appreciate that the touch interactions between the stylus pen 106 and the user interface 104 do not require the physical interaction of a portion of the stylus pen 106 and the surface of the user interface 104, and may also include interactions where the stylus pen 106 is moved over the surface of the user interface 104 without touching the surface (e.g., active stylus pen discussed below).
FIG. 2 schematically illustrates a system diagram showing a simplified view of the control elements of a host device 102, and a simplified system diagram of the control elements of a stylus pen 106. The host device 102 typically has at least some minimum computational capability, touch sensing capability and/or visual display capability. The host device 102 includes processing units 201 that may include, but are not limited to, one or more processing units 210, a memory unit 211, a touch sensing unit 212, a display unit 213 and a communications unit 214. The touch sensing unit 212 may utilize resistive, capacitive (e.g., absolute sensing or mutual capacitance sensing), acoustic or other similar sensing and signal processing components, which are known in the art, to sense the input received from a user at the user interface 104. The touch sensing unit 212 may be disposed within and/or coupled to the user interface 104 in the host device 102. The display unit 213 may include various components that are able to display and/or visually render information provided to it by the one or more processing units 210 and memory 211. The display unit 213 may include any type of visual interface that includes light emitting diode (LED), organic LED (OLED), liquid crystal display (LCD), plasma, electroluminescence (EL), or other similar conventional display technology. The communications unit 214 will generally include one or more components that are configured to transmit and receive information via a communication link 205 between the host device 102, the stylus pen 106 and other possible peripheral devices via a desirable communication method. A desirable communication method may include a wired or wireless communication method, such as a Bluetooth low energy (BTLE) communication method, Bluetooth classic, WiFi, WiFi direct, near-field communication (NFC) or other similar communication method. The memory unit 211 generally contains computer readable media that can be accessed by the host device 102 and may include both volatile and nonvolatile media for storage of information, such as computer-readable or computer-executable instructions, data, programs and/or other data. Memory 211 may include computer or machine readable media or storage devices such as DVDs, CDs, floppy disks, tape drives, hard drives, optical drives, solid state memory devices, RAM, ROM, flash memory or any other device which can be used to store the desired information.
To allow the host device 102 to discriminate between the various inputs received from the user, the device should have a sufficient computational capability and system memory to enable basic computational operations. As illustrated by FIG. 2, the computational capability can be provided by one or more processing unit(s) 210 that are in communication with system memory 211. The processing unit(s) 210 may include conventional central processing units (CPUs), which include graphical processing units (GPUs) and other useful elements to control the various display, touch, communication and other units in the host device 102. The processing unit(s) 210 may also include or be in communication with a host clock 215, which may be a simple IC or similar component that aids in the analysis and synchronization of data transferred between components in the host device and/or data transferred between the host device 102 and other connected wired and wireless network components (e.g., stylus pen 106).
In some embodiments, the stylus pen 106 may have one or more active regions that are able to collect additional information about the user's interaction with the host device 102. In one example, the one or more active regions may include an active tip of the stylus pen 106 that is positioned so that the user will cause this region of the stylus pen 106 to interact with the host device 102. The active tip of the stylus pen 106 may contain sensors that are able to measure some aspect of the interaction of the active tip and the host device 102. As schematically depicted in FIG. 2, the stylus pen 106 may include a pen tip 106a, a pressure sensing unit 106b, a processor 106c, a communications unit 106d, a memory unit 106e, a power source 106f and a pen clock 106g. In some embodiments, the stylus pen 106 may further comprise one or more additional sensors (not shown in FIG. 2), such as one or both of a gyroscope and an accelerometer.
Referring back to FIG. 2, the pen tip 106a is configured to make contact with the user interface 104 of the host device 102. The pressure exerted at the pen tip 106a is dependent on the user's interaction with the stylus pen 106.
The pressure sensing unit 106b is capable of detecting the amount of pressure applied to the pen tip 106a of the stylus pen 106 by the user. Pressure data corresponding to the amount of pressure exerted by the user on the user interface 104 of the host device 102 is measured by the pressure sensing unit 106b. The pressure data can include data from a binary switch, or other device that is able to discern between 8, 16, 32, 64, or any other desirable number of pressure levels so that the generated pressure data is useful for the control of the host device 102. In embodiments of the invention, different pressure levels can be used for different host devices 102, such that a stylus pen interaction will only be registered by the host device 102 when a threshold pressure level is detected. In some embodiments, the pressure data sensed by the pressure sensing unit 106b may also include an analog measurement of the pressure applied, and thus the generated pressure data supplied to the host device 102 may vary continuously across a desired range.
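As a hedged illustration of the pressure-level handling described above, the following sketch quantizes a raw pressure reading into a fixed number of levels and registers a stylus pen interaction only when a threshold level is reached; the raw range, number of levels and threshold are assumptions, not values required by any embodiment.

    PRESSURE_LEVELS = 64          # e.g. 8, 16, 32 or 64 levels as described above
    REGISTRATION_THRESHOLD = 2    # assumed minimum level before a touch-down is reported

    def quantize_pressure(raw_value, raw_max=1023):
        """Map a raw sensor reading (e.g. from an ADC) onto the configured pressure levels."""
        raw_value = max(0, min(raw_value, raw_max))
        return round(raw_value * (PRESSURE_LEVELS - 1) / raw_max)

    def should_register(raw_value):
        """Report a stylus pen interaction only above the threshold pressure level."""
        return quantize_pressure(raw_value) >= REGISTRATION_THRESHOLD

    print(quantize_pressure(512), should_register(512))   # mid-range press registers
    print(quantize_pressure(10), should_register(10))     # light brush is ignored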
The processor 106c can be configured to control the operation of the stylus pen 106. The stylus pen 106 may be comprised of one or more processors to control various aspects of the operation of the stylus pen 106. The processor 106c may also include or be in communication with a stylus pen clock 106g, which may be a simple IC or similar component that aids in the analysis and synchronization of data transferred between components in the stylus pen 106 and/or data transferred between the stylus pen 106 and other wired and wireless network components (e.g., host device 102). In one embodiment, the stylus pen clock 106g is set at a speed that is at least as fast as the speed at which a clock (e.g., host clock 215) in the host device 102 is running, to facilitate the timing of the delivery of communication signals from the communications unit 214. In general, it is desirable for the accuracy of the stylus pen clock 106g to be at least as accurate as the host clock 215 to assure that the time stamps applied to the touch data information generated by the stylus pen 106 and host device 102 do not appreciably drift relative to one another. Clocks that have appreciably different accuracies (e.g., frequency error rates) from one another will affect the accuracy and usefulness of the time stamp information that is transferred between the stylus pen 106 and host device 102. As discussed herein, the time stamp information provided by both the stylus pen 106 and the host device 102 can be used together to help differentiate the type of user input based on its timing relative to other touch events. In one example, the stylus pen clock 106g has a frequency error of less than about 50 parts per million (ppm), such as an accuracy of at least 30 to 50 ppm.
The communications unit 106d is capable of transmitting the pressure data from the stylus pen 106 to the communications unit 214 of the host device 102 when stylus pen interactions are made against the user interface 104 of the host device 102. In some embodiments of the invention, the communications unit 106d transmits the interaction data via a desirable wireless communication method, such as a Bluetooth low energy (BTLE) communication method. Other embodiments include other appropriate communications device components for transmitting interaction data between the stylus pen 106 and the host device 102. Interaction data supplied by the stylus pen 106 can comprise the pressure data, timing data, and/or orientation data generated from gyroscopes and/or accelerometers or the like in the stylus pen 106. In some embodiments, the communications unit 106d may only transmit the pressure data once a threshold pressure level has been detected by the pressure sensing unit 106b. In other embodiments, the communications unit 106d may transmit the pressure data from the stylus pen 106 once any pressure is detected, regardless of the pressure level detected by the pressure sensing unit 106b.
The memory unit 106e is capable of storing data related to the stylus pen 106 and data related to the host device 102, such as device settings and host clock 215 and stylus pen clock 106g information. For example, the memory unit 106e may store data related to the linking association between the stylus pen 106 and the host device 102.
The power source 106f is capable of providing power to the stylus pen 106. The power source 106f may be a built-in battery inside the stylus pen 106. The power source 106f can be electrically coupled to one or more of the components within the stylus pen 106 in order to supply electrical power to the stylus pen 106.
As noted above, some embodiments of the stylus pen 106 may include one or both of a gyroscope and an accelerometer, or the like. A gyroscope is a device configured to measure the orientation of the stylus pen 106 and operates based on the principles of the conservation of angular momentum. In certain embodiments, one or more gyroscopes are micro-electromechanical (MEMS) devices configured to detect a certain rotation of the stylus pen 106. To illustrate, the stylus pen 106 can be configured to send orientation data from a gyroscope contained within the stylus pen 106. This orientation data can be used in conjunction with the timing and pressure data communicated from the stylus pen 106 to the host device 102. In certain embodiments, the accelerometers are electromechanical devices (e.g., micro-electromechanical systems (MEMS) devices) configured to measure acceleration forces (e.g., static and dynamic forces). One or more accelerometers can be used to detect three-dimensional (3D) positioning. For example, 3D tracking can utilize a three-axis accelerometer or two two-axis accelerometers. According to some embodiments, the stylus pen 106 may utilize a 3-axis accelerometer to detect the movement of the stylus pen 106 in relation to the user interface 104 of the host device 102.
Input Discrimination Technique Examples
FIG. 3A illustrates a simplified block diagram of a user input discrimination architecture 300 that comprises computer executable instructions and supporting hardware and software elements that are used to distinguish between the different types of user inputs received by the host device 102. In some embodiments, the system and methods described herein may provide a user input discrimination architecture 300 that includes a controlling engine 340 that receives various user input information and uses the received user input information to distinguish between the different types of user touch inputs received by the user interface 104 of the host device 102. In some configurations, the controlling engine 340 comprises computer executable instructions that are stored in the memory 211 of the host device 102, and are run in the background of the host device 102 by use of the processing units 201 of the host device 102.
The user inputs received by the controlling engine 340 may include a user touch related input 331, a stylus pen input 335 and/or a host input 333. While not intending to be limiting as to the scope of the invention described herein, the host input 333, which is delivered from the host signal processing unit 332 of the processing unit 210 to the controlling engine 340, may include the user touch related input 331 received from the user's physical touch input 330 received by the user interface 104, the stylus pen input 335 and other useful information relating to the control of the host device 102 collected by the processing unit 210. However, in some configurations of the host device 102, the host signal processing unit 332 is not a separate component within the host device 102, and may be formed within, and controlled by, the components used to provide the user's physical touch input 330 or even the controlling engine 340.
After receiving and analyzing the received user information, the controlling engine 340 may then deliver output data 350 that is used in the control of various software and hardware running on the host device 102. In one example, the output data 350 may be used by one or more third party applications and/or components in the host device 102 to perform some useful display or data output functions. In another example, the output data 350 may be used by the host device 102 to control some aspect of a software program running on the host device 102, to generate an image on a display in the host device 102 and/or process some data that is stored in the host device 102.
In general, the user touch related input 331 includes the user's physical touch input 330, which may include the interaction of a finger, an appendage and the physical interaction of the stylus pen 106 with the touch sensitive portion of the user interface 104. Typically, the touch related input 331 is processed by the host signal processing unit 332, such as capacitive sensing signal processing, before it is delivered to and then used by the controlling engine 340.
The user's input delivered to the host device 102, as illustrated in FIG. 2, may also include configurational inputs 339 that are delivered from the user and/or stylus pen 106 to the controlling engine 340. The configurational inputs 339 may include information about the user or stylus pen 106 that will help the user input discrimination architecture 300 distinguish between the different types of user touch input 331 information (e.g., information relating to touch input from a stylus pen, finger or appendage) received by the host device 102. The configurational inputs 339 may include whether the user is right-handed, left-handed, information about the host device 102, Bluetooth pairing information or other useful information about the user, stylus pen or controlling engine configuration.
The stylus pen input 335 generally includes user input information received by components in the stylus pen 106 that can be transferred via wired or wireless communication methods to the host device 102. The stylus pen input 335 may comprise the pressure data, timing data, and/or orientation data generated by the pressure sensing unit 106b or other sensors found in the stylus pen 106 (e.g., gyroscopes, accelerometers, etc.), such as the touch signal generating device 106h, which is discussed further below. In one embodiment, the stylus pen input 335 may be transmitted via a wireless communication link to the communications unit 214 of the host device 102 using a desirable wired or wireless communication technique, such as a Bluetooth low energy (BTLE) communication protocol, and then is delivered to the controlling engine 340. Typically, the stylus pen input 335 is processed by the host signal processing unit 332 in the host device 102 using wired or wireless communication protocols (e.g., BTLE protocols) before it is delivered to the controlling engine 340 via the host signal processing unit 332.
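The following sketch illustrates, under stated assumptions, one possible shape for a stylus pen report carrying the pressure, timing and optional orientation data described above; the field names, units and types are hypothetical and do not describe the actual BTLE payload.

    from dataclasses import dataclass
    from typing import Optional, Tuple

    # Hypothetical container for a single stylus pen report; all fields are assumptions.
    @dataclass
    class StylusPenReport:
        pressure_level: int                              # quantized output of the pressure sensing unit 106b
        pen_timestamp_ms: float                          # time stamp from the stylus pen clock 106g
        host_timestamp_ms: float                         # last known host clock 215 value echoed back, if any
        tilt_deg: Optional[float] = None                 # optional orientation data from a gyroscope
        accel_g: Optional[Tuple[float, float, float]] = None   # optional accelerometer sample (x, y, z)

    report = StylusPenReport(pressure_level=12, pen_timestamp_ms=10432.5,
                             host_timestamp_ms=99871.0, tilt_deg=38.0)
    print(report)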
The host input 333 generally includes various sets of synchronous and/or asynchronous data that are received by the host device 102 from the stylus pen 106 and/or created by the user's physical touch input 330 received from the user. The host input 333, which is provided to the controlling engine 340, may include user touch input 331 generated by the touch sensing unit 212 and the stylus pen input 335 data provided by the stylus pen 106 to the communications unit 214 and host signal processing unit 332. In one example, the touch related input 331 data is delivered to the controlling engine 340 separately (i.e., input 333A) from the stylus pen input 335 data (e.g., input 333B). The separate host inputs 333A and 333B may not be transferred on separate physical elements to the controlling engine 340, but are shown herein separately to schematically illustrate the different types of data being delivered between the host device 102 and the controlling engine 340. In some embodiments, the communications unit 214 processes the transmitted stylus pen input 335 received from the stylus pen 106 via the communication link 205 before it is delivered to the controlling engine 340.
As briefly discussed above, the controlling engine 340 generally includes one or more executable programs or program related tasks that are used to create the output data 350, which is used by the controlling engine 340, software running on the host device 102 and/or one or more hardware components of the host device 102 to perform some useful function. The controlling engine 340 may comprise one or more input discrimination techniques 345 that are used separately or in combination to generate useful and reliable output data 350. The one or more input discrimination techniques 345 take in the various different types of inputs (e.g., inputs 331, 333A, 333B, 335, 339) received by the host device 102 and try to distinguish the different types of user inputs from one another, so that errors in the proper selection of an inputting element, such as a finger, stylus pen and/or appendage, are eliminated or less likely to occur. The one or more input discrimination techniques 345 are thus used to distinguish the different types of user inputs from one another and provide a desired “input label” or “element label” for each type of user input so that they can be correctly used by the one or more third party applications and/or components used in the host device 102. In one embodiment, the one or more input discrimination techniques 345 include a time based discrimination technique 341, a geometric shape discrimination technique 342 and/or an inference based discrimination technique 343 that are used separately or in combination to generate useful and reliable output data 350 that can be used by the software and/or hardware running on the host device 102. In some configurations, the one or more input discrimination techniques 345 include a plurality of time based discrimination techniques, geometric shape discrimination techniques and/or inference based discrimination techniques.
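As a minimal sketch of this architecture, and assuming hypothetical class and method names, the controlling engine 340 could be modeled as a component that runs several pluggable input discrimination techniques and combines their votes into a single element label; the thresholds below are placeholders for illustration only.

    # Illustrative only: pluggable discrimination techniques combined by majority vote.
    class DiscriminationTechnique:
        def vote(self, touch_event, context):
            """Return a label vote such as 'stylus', 'finger' or 'appendage'."""
            raise NotImplementedError

    class TimeBasedTechnique(DiscriminationTechnique):
        def vote(self, touch_event, context):
            # Vote "stylus" when the touch event is close in time to the touch-down report.
            delta = abs(touch_event["t_ms"] - context.get("touch_down_t_ms", float("inf")))
            return "stylus" if delta <= 30.0 else "finger"

    class GeometricShapeTechnique(DiscriminationTechnique):
        def vote(self, touch_event, context):
            # Vote "appendage" for large contact areas (placeholder threshold).
            return "appendage" if touch_event.get("contact_area", 0) > 400 else "stylus"

    class ControllingEngine:
        def __init__(self, techniques):
            self.techniques = techniques

        def label(self, touch_event, context):
            votes = [t.vote(touch_event, context) for t in self.techniques]
            return max(set(votes), key=votes.count)

    engine = ControllingEngine([TimeBasedTechnique(), GeometricShapeTechnique()])
    print(engine.label({"t_ms": 1012.0, "contact_area": 35}, {"touch_down_t_ms": 1000.0}))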
FIG. 3B is a flowchart illustrating a method 390 of discriminating between finger and appendage touch interactions and the physical stylus pen interactions with the host device 102 using one or more input discrimination techniques 345. The method 390 optionally starts with the delivery, storage in memory and/or recall of configurational inputs 339 by the controlling engine 340, as shown as step 391. As noted above, the configurational inputs 339 may include information about the user and/or stylus pen 106 that is useful for the discrimination of a finger or appendage touch interaction from the physical stylus pen interaction.
Next, at step 392, a stylus pen input 335, which is created when the stylus pen 106 is brought into contact with the user interface 104, is transferred via a wired or wireless communication technique to the host device 102 and controlling engine 340. The receipt of the stylus pen input 335 is also referred to herein as a “touch-down event.” A “touch-down event” may be created from a single interaction or each time a user reengages the stylus pen 106 with the user interface 104 during a writing, drawing or other similar stylus pen 106 user input interaction with the user interface 104. In some embodiments, the controlling engine 340 will ignore the received user touch related input 331 data until it has received touch-down event information. In some embodiments, touch-down events do not require the physical contact of a portion of the handheld device and the surface of the user interface 104, but may also include sensed interactions where the stylus pen is moved over the surface of the user interface 104 without touching the surface, for example, by use of an active pen tip, which is discussed below.
Next, at step 393, once the stylus pen input 335 is received, a timing window of a desired length is created in time around the receipt of the stylus pen input 335 (e.g., touch-down event), so that all of the user touch related inputs 331 can be collected for analysis by the controlling engine 340 to determine which of the touch inputs were received from the stylus pen, finger(s) or user's appendage. In one example, the timing window includes a time period of about 30 ms on either side of a received touch-down event. The timing window will include all user data received by and stored in memory 211 of the host device 102 in a first time period prior to the receipt of a stylus pen input 335 and a second time period after the receipt of a stylus pen input 335. The length of the timing window (e.g., first time period plus the second time period) may be adjusted so that any external noise received by the host device 102 does not adversely affect the discrimination process performed by the controlling engine 340, while also assuring that all of the user touch related input 331 data that is associated with the stylus pen input 335 is captured. The length of the timing window will depend on the sampling frequency of the touch sensitive portion of the host device 102, the communication speed between the stylus pen 106 and host device 102 and the processing speed of the controlling engine 340. In one example, the stylus pen's generated data (e.g., pressure data generated by the pressure sensing unit 106b) is sampled at about a 1 millisecond (ms) rate and the communication between the stylus pen 106 and host device 102 occurs at about a 30 ms rate. Once a touch-down event has occurred and at least one of the touch data points received by the user interface 104 has been associated by the one or more input discrimination techniques as being a physical stylus pen touch point, the controlling engine 340 may continue to track and provide user input discrimination results via the generation and delivery of the output data 350.
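A minimal sketch of the timing window described in this step is shown below; the 30 ms half-width follows the example above, while the data layout and function names are assumptions.

    # Collect every buffered touch event whose time stamp falls within the timing window.
    WINDOW_BEFORE_MS = 30.0
    WINDOW_AFTER_MS = 30.0

    def events_in_window(buffered_events, touch_down_t_ms,
                         before_ms=WINDOW_BEFORE_MS, after_ms=WINDOW_AFTER_MS):
        """Return the touch events whose time stamps fall inside the timing window."""
        start, end = touch_down_t_ms - before_ms, touch_down_t_ms + after_ms
        return [e for e in buffered_events if start <= e["t_ms"] <= end]

    buffered = [{"id": 1, "t_ms": 980.0}, {"id": 2, "t_ms": 1005.0}, {"id": 3, "t_ms": 1100.0}]
    print(events_in_window(buffered, touch_down_t_ms=1000.0))   # ids 1 and 2 are candidates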
In general, it is desirable for the accuracy of the stylus pen clock 106g to be at least as accurate as the host clock 215 to assure that the time stamps applied to the touch data information generated by the stylus pen 106 and host device 102 do not appreciably drift relative to one another over time. Clock speeds in the stylus pen 106 and host device 102 that appreciably vary from one another will affect the relative accuracy of the time stamp information that is compared by the controlling engine to determine whether a user input can be attributed to a stylus pen, finger or user appendage. As discussed herein, the time stamp information may be used in some embodiments described herein to help differentiate the type of user input based on its timing relative to other touch events. In one example, the stylus pen clock 106g has a frequency error of less than about 50 parts per million (ppm), such as an accuracy of at least 30 to 50 ppm. Therefore, the use of a stylus pen clock 106g that has an accuracy that is at least as good as the host clock 215 can help reduce the error in the detection and analysis of the user input. While the data transfer rate between the stylus pen 106 and the host device 102 is much slower than the touch data collection rate used by the components in the stylus pen 106 and host device 102, this will not affect the ability of the controlling engine 340 to determine the type of user input, since the use of accurate time stamp information in the data transferred between devices will prevent the slow data transfer rate from affecting the usefulness of the created touch data analyzed by the controlling engine.
In one embodiment of step 393, the controlling engine 340 creates a timing window of a desired length around the receipt of a first stylus pen input 335 (e.g., touch-down event) based on a first report received at a first time via the communication link 205 created between the stylus pen 106 and the host device 102. The controlling engine 340 then determines which touch data events fall within this first timing window and notes that these touch data events are likely to be from a stylus pen 106. However, the number of touch data events that fall within a timing window can be larger than the number of actual touch data event(s) that are associated with the stylus pen 106. Therefore, to confirm or refute which touch data events are associated with the stylus pen 106, when the last report of this sequence sent by the pen is received by the controlling engine 340, a second timing window is created, and the controlling engine compares the touch data events found in this second timing window with the touch data events found in the first timing window to determine which touch data events also stopped (touch take-off (e.g., pen removed from interface)) in this window. Thus, touch data events that do not fit within these requirements are likely not related to the stylus pen, and touch data event(s) that are in both windows are more likely to have originated from the stylus pen 106. In one example, the first report is generated when the stylus pen lands on the user interface 104, a few reports are then generated as long as the pen is pressed on the host device, and the last report is generated when the stylus pen 106 is removed from the user interface 104, and thus the controlling engine 340 is used to determine which of the touch events was associated with the stylus pen.
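Assuming hypothetical names and data layout, the two-window check described above might be sketched as follows: a touch track is retained as a stylus pen candidate only if it both starts near the touch-down report and ends near the take-off report.

    # Illustrative sketch of the touch-down / take-off window intersection.
    def stylus_candidates(touch_tracks, pen_down_t_ms, pen_up_t_ms, half_window_ms=30.0):
        candidates = []
        for track in touch_tracks:
            started_near_down = abs(track["start_ms"] - pen_down_t_ms) <= half_window_ms
            ended_near_up = abs(track["end_ms"] - pen_up_t_ms) <= half_window_ms
            if started_near_down and ended_near_up:
                candidates.append(track["id"])
        return candidates

    tracks = [{"id": "A", "start_ms": 1002, "end_ms": 2505},   # matches both windows
              {"id": "B", "start_ms": 1010, "end_ms": 1900}]   # lifted too early
    print(stylus_candidates(tracks, pen_down_t_ms=1000, pen_up_t_ms=2500))   # ['A']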
At step 394, the controlling engine 340 utilizes one or more of the input discrimination techniques 345 to discriminate between the touch interactions supplied by the stylus pen, a finger or user's appendage. One or more of the input discrimination techniques, such as time based discrimination techniques, geometric shape discrimination techniques or inference based discrimination techniques, which are discussed further below, perform an analysis of the touch-down event information and touch event information received in steps 392-393 to help distinguish between the source of the different touch event interactions received by the user interface 104. The controlling engine 340 may also utilize the configurational input 339 data received at step 391 during this step to help classify and further analyze the other received data. The analyses performed by the different input discrimination techniques 345, such as the analysis steps 394A-394C, utilize various different rules that are found within the software instructions that form at least part of the controlling engine 340. A discussion of some of the types of rules for each of the different types of input discrimination techniques 345 can be found below.
Next, at step 395, after performing the various analyses of the received and collected data, each of the one or more input discrimination techniques 345 is used to create and apply a “user input type” label, also referred to herein as an “element label,” to each of the touch data points for each touch event. The process of providing a “user input type” label generally includes the process of attributing each of the touch data points to a particular user's touch input, such as the physical input from the stylus pen, finger or appendage to the user interface 104. To reconcile any differences in the element labels given to each of the touch data points by the different input discrimination techniques, the element labels may be further analyzed by the controlling engine 340. In one embodiment of step 394 or 395, an inference based discrimination technique 343 (FIG. 3A) may be used to reconcile the differences between the element labels created by each of the input discrimination techniques used in step 394, which is further described below.
At step 396, each of the element labels for each of the touch data points is either further analyzed by the controlling engine 340 to reconcile differences between the element labels created by each of the input discrimination techniques, or each of the different element labels is transferred within the output data 350, so that they can be used by the software and/or hardware running on the host device 102. The output data 350 may include the positional information (e.g., touch points) and timing information for only the relevant interacting components, such as a stylus pen 106 and a finger, and not the interaction of a user's appendage, by use of one or more input discrimination techniques 345.
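As an illustration of how the output data 350 might exclude appendage-related touch points, the following sketch filters a list of labeled touch data points so that only stylus pen and finger points are passed on; the data layout and labels are assumptions.

    # Keep only stylus and finger points in the data handed to applications.
    def build_output_data(labeled_points):
        """labeled_points: iterable of (x, y, t_ms, element_label) tuples."""
        return [(x, y, t_ms, label) for (x, y, t_ms, label) in labeled_points
                if label in ("stylus", "finger")]

    labeled = [(120, 88, 1001.0, "stylus"), (420, 300, 1002.0, "appendage"),
               (640, 512, 1010.0, "finger")]
    print(build_output_data(labeled))   # the appendage point is excluded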
After step 396 has been performed, steps 392-396 can then be repeated continually while the stylus pen 106 is interacting with the user interface 104, or each time a touch-down event occurs, to provide user input discrimination results via the generation and delivery of the output data 350.
FIGS. 3C-3D illustrate an example of the various user input information that may be received by the controlling engine 340 and the output data 350 results that may be generated by the controlling engine 340 using the steps provided in method 390, according to an embodiment of the invention described herein. FIG. 3C illustrates an example of data 370 that is received by the controlling engine 340, due to the interaction of a stylus pen 106, finger or user's appendage (e.g., palm) with the host device 102 as a function of time. FIG. 3D graphically illustrates at least a portion of the output data 350 generated by the controlling engine 340 (e.g., data 380), due to the interaction of a stylus pen 106, finger or user's appendage (e.g., palm) with the host device 102 as a function of time.
Referring to the example of FIG. 3C, at time T0 the touch sensing component of the host device 102 (FIG. 2) receives interaction data 371 created by the interaction of an appendage (e.g., palm) with the host device 102. Next, at time T1 the touch sensing component of the host device 102 also receives interaction data 372 created by the interaction of a stylus pen with the host device 102 (e.g., touch event). Next, at time T2 the host device 102 also receives stylus pen input 335 data, or interaction data 373 (e.g., touch-down event). At time T3 the touch sensing component of the host device 102 also receives interaction data 374 created by the interaction of a finger with the host device 102, and then at time T4 the interaction of a finger with the host device 102 ends, thus causing the interaction data 374 to end. In this example, once the stylus pen input 335 is received by the controlling engine 340 at time T2, a timing window having a desired length is created so that the stored user input received between a time before T0 and time T2 and the user input received between time T2 and a time after T4 can be characterized and useful output data 350 can be created. In some embodiments, the interaction data 371-374 received by the controlling engine 340 at any instant in time includes the coordinates of a touch data point and its timing information. One will note that the interaction data 371-374 includes the input data received over a period of time for each specific interacting element, and thus may contain many different touch data points that are in different coordinate positions on the user interface at different times. While FIG. 3C illustrates an example of various different types of interacting elements (e.g., stylus pen, finger, appendage) and a specific example of the timing of the interaction of these interacting elements with the host device 102, this example is not intended to be limiting, and is only added herein as a way to describe one or more aspects of the invention described herein.
FIG. 3D illustrates at least a portion of the output data 350 created by the controlling engine 340 using the one or more input discrimination techniques 345, based on the received interaction data 371, 372, 373, and 374 illustrated in FIG. 3C. At time TA, which is at a time between time T0 and time T1, the controlling engine 340 has received a small amount of the received interaction data 371 created by the user. At time TA, at least one of the one or more input discrimination techniques 345 used in step 394 by the controlling engine 340 is used to create and apply a user input type label to the interaction data 371, based on the input data received by the controlling engine 340 by time TA. In general, as noted above, the input data may include the user touch related input 331, stylus pen input 335, host input 333 and configurational inputs 339. In some configurations, where the controlling engine 340 does not have enough data to decide what type of user input is being applied to the host device 102, it may be desirable to make an initial guess (e.g., finger, stylus pen and/or appendage) and then later correct the user input label as more data is acquired about the received user input. Therefore, in one example, the user input type label for the interaction data 371 is defined to be an “appendage” versus a “finger” or “stylus pen.” Therefore, the output data 350 created at time TA includes the current positional information, current timing information and “appendage” element label for the interaction data 371. In some embodiments, any interaction data that is not given a stylus pen or finger type of element label is excluded from the output data 350 provided from the controlling engine 340, and thus no output data 350 is transferred for the interaction data 371 at time TA, as illustrated in FIG. 3D as a dashed line.
Next, at time TB, which is at a time between time T1 and time T2, the controlling engine 340 has received data regarding a user input that is creating the interaction data 371 and a new user input that is creating the interaction data 372. At time TB, the one or more input discrimination techniques 345 of the controlling engine 340 are used to create and apply a user input type label to the interaction data 372 (e.g., step 394), based on the input data received by the controlling engine 340 by time TB. In this example, the user input type label for the interaction data 372 is initially defined as a “finger” based on the one or more input discrimination techniques 345. Typically, the controlling engine 340 is continually collecting the interaction data 371 and 372 information and thus can continually reevaluate the user input type label for the received interaction data 371 and 372.
Next, at time TC, which is at a time between time T2 and time T3, the controlling engine 340 has received data regarding the user inputs that are creating the interaction data 371 and 372, and a new user input that is creating the interaction data 373. As noted above, the interaction data 373 comprises stylus pen input 335 data created by one or more sensors found in the stylus pen 106. In this example, the interaction data 373 is generated due to a user initiated pen tip 106a touch event that actually occurred at time T1. However, the delivery of the interaction data 373 to the controlling engine 340 has been delayed from the interaction data 372 received by the stylus pen's interaction with the touch sensing unit of the host device 102 by a signal delay time 375 (FIG. 3C). The signal delay time may be created by communication processing timing delays, differences in the clocks of the stylus pen 106 and host device 102 and/or communication/timing errors created within the stylus pen 106 or the host device 102. The data delivered in the transferred interaction data 373 may be generated by the pressure sensing unit 106b and then transferred to the communications unit 214 of the host device 102 through the communications unit 106d of the stylus pen 106. At time TC, the one or more input discrimination techniques 345 of the controlling engine 340 are used to create and apply a user input type label to the interaction data 373, based on the input data received by the controlling engine 340 by time TC. In this example, the interaction data 372 and 373 are associated with each other and are given a “stylus pen” user input type label due to the information received and processed by the one or more input discrimination techniques 345, which is an adjustment from the initial element label given to the interaction data 372. The output data 350 provided to the hardware or other software running on the host device 102 at time TC will thus contain the stylus pen 106's positional and timing data associated with the interaction data 372 and the stylus pen's pressure data, stylus pen related timing data, and/or stylus pen orientation data associated with the interaction data 373, while the “appendage” related data found in the interaction data 371 is still being excluded. It should be noted that the controlling engine 340 will still collect the interaction data 371, 372 and 373 information, and thus can continually reevaluate the user input type labels as needed.
In general, the signal delay time 375 can be created by mechanical and electrical delays that arise during the collection and transmission of the information between the stylus pen 106 and the controlling engine 340 running in the host device 102, and also by the controlling engine itself, which may not be synchronized with the wired or wireless communication arrival (e.g., BTLE information). Delays may also be generated due to higher priority tasks being completed by the processing unit 210 and/or controlling engine 340, which may delay the analysis of the received touch data. In some examples, the mechanical delays may include delays created by inertia and/or friction in the pen tip 106a and/or the pressure sensing components in a pressure sensing unit 106b of the stylus pen 106. In some examples, the electrical delays may result from the propagation delays created by one or more electrical components in the host device or stylus pen (e.g., low-pass filters (LPFs) and ADCs) and processing delays created by the need to transmit and/or convert the data for transmission via a wireless transfer technique (e.g., BTLE) or for use by one or more processing components in the stylus pen 106 or host device 102. In some embodiments, to prevent the signal delay time 375 (FIG. 3C) from causing mischaracterization of the user input data, it is desirable to encode any transferred data with a timestamp that is at least based on the clock of the device transmitting the desired information. In one example, the sampling period of the sensing components in the user interface 104 may be about 16 milliseconds (ms), while the sampling period of the components in the stylus pen is less than about 16 ms. In one example, the sampling period of the data sampling components in the stylus pen is less than about 10 ms, such as between about 1 ms and about 10 ms. The provided timestamp information is thus used to help better correlate the multiple sets of data that are received by the controlling engine 340, so that a reliable characterization of the user inputs can be made.
However, since the host clock 215 and the stylus pen clock 106g, which are used to generate and/or facilitate the transfer of data between the stylus pen 106 and host device 102, are generally not synchronized, and in some cases may be running at different speeds, errors in the characterization and processing of the received user input data are not reliably eliminated by the use of a single timestamp. In one example, these errors may include errors in the proper selection of a user's input and can cause jitter in the display, which will ultimately annoy the user or cause significant disruption in the tasks that the user is performing on the computing device. It has been found that providing the timing data from both the stylus pen clock 106g and the host clock 215 in the data transferred between the devices in either direction helps significantly reduce mischaracterization of the user input. In general, the controlling engine 340 and/or the user input sensing program(s) being executed in the stylus pen 106 use both sets of timestamp information received in the transferred data to continually update the processes running in each device to account for any drift or difference in timing between the stylus pen clock 106g and host clock 215. As noted above, the difference in timing between the stylus pen clock 106g and host clock 215 will generally affect the analysis of the user input received by the controlling engine 340. Therefore, in one embodiment, all communications provided between the host device 102 and the stylus pen 106 will include the latest time information received from the stylus pen clock 106g and the time information received from the host clock 215, so that the controlling engine 340 can continually correct for errors found between the stylus pen clock 106g and the host clock 215.
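As a minimal illustration of how dual timestamps might be used, the following Python sketch estimates the offset between a hypothetical pen clock and host clock from messages that each carry both timestamps, and then maps pen event times onto the host timeline. The message fields, class name and smoothing constant are assumptions made for this example; they are not taken from the specification.

```python
# Hypothetical sketch: correlate pen events with host touch events using the
# pen and host timestamps carried in every message (field names are assumed).

class ClockAligner:
    """Tracks the pen-clock-to-host-clock offset from dual-timestamped messages."""

    def __init__(self, smoothing=0.1):
        self.offset_ms = None      # host_time - pen_time, in milliseconds
        self.smoothing = smoothing # low-pass factor to absorb jitter and drift

    def update(self, pen_time_ms, host_time_ms):
        """Call for every message that carries both clock values."""
        sample = host_time_ms - pen_time_ms
        if self.offset_ms is None:
            self.offset_ms = sample
        else:
            # Exponential smoothing tracks slow drift between the two clocks.
            self.offset_ms += self.smoothing * (sample - self.offset_ms)

    def pen_to_host(self, pen_time_ms):
        """Translate a pen-side event time onto the host timeline."""
        if self.offset_ms is None:
            raise ValueError("no timestamp pairs received yet")
        return pen_time_ms + self.offset_ms


# Usage: pair a delayed pen touch-down report with the host touch event.
aligner = ClockAligner()
aligner.update(pen_time_ms=1000.0, host_time_ms=5016.0)   # from one packet
aligner.update(pen_time_ms=1016.0, host_time_ms=5033.0)
host_estimate = aligner.pen_to_host(1032.0)               # ~5048 ms on the host clock
print(round(host_estimate, 1))
```

The smoothing term is one simple way to absorb the drift between the two clocks discussed above; a real implementation could use any other filtering or regression scheme.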
Next, at time TD, which is a time between time T3 and time T4, the controlling engine 340 has received data regarding the user inputs that are creating the interaction data 371, 372 and 373, and the new user input that is creating the interaction data 374. At time TD, the one or more input discrimination techniques 345 of the controlling engine 340 are used to create and apply a user input type label to the interaction data 374, based on the input data received by the controlling engine 340 by time TD. In this example, the user input type label for the interaction data 374 is initially defined as a "finger" based on the one or more input discrimination techniques 345. It should be noted that the controlling engine 340 will still collect the interaction data 371, 372, 373 and 374 information, and thus can continually reevaluate the user input type labels as needed.
Next, at time TE, which is a time between time T4 and time T5, the controlling engine 340 has received interaction data 371, 372 and 373, while the interaction data 374 has not been received after time T4 was reached. At time TE, the one or more input discrimination techniques 345 continually reevaluate the user input type labels for the interaction data 371-373, based on the input data received by the controlling engine 340; however, the tracking and characterization of the user input relating to the interaction data 374 will generally be halted due to the removal of this user input. In some embodiments, all of the user's inputs will be continually tracked and characterized while they are interacting with the host device 102, and will be dropped from the output data 350 when their interaction with the host device 102 ends. In some cases, the controlling engine 340 may use the one or more input discrimination techniques 345 to track and provide user labels for user interactions that are suspended for less than a specified period of time, such as when a stylus pen 106 is lifted from the touch sensitive surface of the host device 102 for only a short time while writing, drawing or inputting different pieces of information on the host device 102.
Time Based Input Discrimination Technique Examples
Embodiments of the invention described herein may provide a system and method that analyzes the timing of the received user input data to determine the source of the user input delivered in the user's physical touch input 330 (e.g., physical stylus pen, finger(s) and user appendage touch input) to the controlling engine 340. During operation the controlling engine 340 analyzes the timing of the user input data to determine the different types of received user's physical touch input 330, which is often referred to herein as the time based user input discrimination technique. In one example, the time based user input discrimination techniques can be used to determine whether the received user input was created by a stylus pen, finger(s) or the user's appendage by comparing the relative timing of the different user's physical touch input 330 events and stylus pen input 335. The time based discrimination techniques used by the controlling engine 340 will generally compare the various received user input data as a function of time to help the controlling engine 340 discriminate between the interaction of a stylus pen, fingers or an appendage. The time based user input discrimination techniques discussed herein may be used alone or in combination with one or more of the other types of user input discrimination techniques discussed herein.
FIG. 4 is a simplified flowchart illustrating a time based user input discrimination technique for discriminating touch interactions from physical stylus pen interactions on a touch-screen according to an embodiment of the invention. The method 400 can be performed by processing logic that may comprise hardware (circuitry, dedicated logic, etc.), software (such as is run on a general purpose computing system or a dedicated machine), firmware (embedded software), or any combination thereof that is contained in the host device 102 and/or the stylus pen 106. In one embodiment, the method 400 is performed by the controlling engine 340 that is running in the background of the host device 102 and/or by the data collection and transmission processes running on components in the stylus pen 106.
The method may include step 402, in which the controlling engine 340 receives user input (e.g., user touch input 331 or stylus pen input 335) information related to a touch-down event on the host device 102. According to embodiments of the present invention, information related to a touch-down event is received from a handheld device. The handheld device may be an electronic stylus pen, such as a stylus pen 106, comprising a pressure sensor (e.g., pressure sensing unit 106b) and a touch signal generating device 106h that are configured to deliver stylus pen input 335 information to the host device 102. In some embodiments of the present invention, the electronic pen may also comprise at least one of an accelerometer or a gyroscope.
The information related to the touch-down event that is transferred to the controlling engine 340 may comprise timing information, pressure data, and other data (e.g., accelerometer and/or gyroscope data) that is sent from the stylus pen 106 via the communication link 335 to the host device 102. For example, the information related to the touch-down event may include a first timestamp for the touch-down event, and may include information related to when the touch-down event occurred, how much pressure was applied in the touch-down event, and the length of time of the touch-down event. In another example, the information related to the touch-down event may include a pen clock timestamp derived from a pen clock signal received from the pen clock 106g and a host clock timestamp derived from a host clock signal received from the host clock 215 for the touch-down event, and may include information related to when the touch-down event occurred, how much pressure was applied in the touch-down event, and the length of time of the touch-down event.
Next, at box 404, the method includes the controlling engine 340 receiving information related to a touch event sensed by the host device 102. The touch event may be from the stylus pen 106 physically interacting with the user interface 104 of the host device 102, or from a touch interaction through direct contact with the user interface 104 by a finger and/or appendage of the user. According to embodiments of the present invention, the information related to the touch event may comprise a second timestamp for the touch event, which may include timing information related to when the touch event occurred. In one embodiment, the second timestamp comprises a pen clock timestamp derived from a pen clock signal received from the pen clock 106g and a host clock timestamp derived from a host clock signal received from the host clock 215 for the touch event.
According to embodiments of the present invention, the information related to the touch event and the information related to the touch-down event may be received by the device simultaneously or at different times.
Next, at box 406, the method also includes correlating the information related to the touch-down event with the information related to the touch event. In one example, the controlling engine 340 correlates the information related to the touch-down event with the information related to the touch event. According to embodiments of the present invention, first time information for the touch-down event is correlated with second time information for the touch event.
Next, at box 408, the method also includes determining whether the time delay between the first timestamp for the touch-down event and the second timestamp for the touch event is less than an empirically predetermined threshold. According to embodiments of the present invention, the controlling engine 340 determines whether the time delay is within the predetermined threshold. For example, if the time delay between the touch-down event and the touch event is greater than the predetermined threshold, this may indicate that the touch event is separate from the touch-down event, as illustrated in step 410. In that event, the controlling engine 340 would distinguish that touch event as not being associated with the stylus pen 106. The controlling engine 340 may instead distinguish the touch event as being associated with the user's finger.
Alternatively, if the time delay between the touch-down event and the touch event is equal to or less than the predetermined threshold, this may indicate that the touch event is associated with the touch-down event, as illustrated in box 412. In that event, the host device 102 would register that touch event as being associated with the stylus pen 106 and not with the user's finger or the user's appendage.
Once the determination has been made, the controlling engine 340 continues to monitor incoming touch-down event(s) and touch events, correlates the received data, and makes a determination as to whether each touch event is associated with the stylus pen 106, touch interactions by the user's fingers, or an appendage of the user.
It should be appreciated that the specific steps illustrated in FIG. 4 provide a particular method 400 according to an embodiment of the present invention. Other sequences of steps may also be performed according to alternative embodiments. For example, alternative embodiments of the present invention may perform the steps outlined above in a different order. Moreover, the individual steps illustrated in FIG. 4 may include multiple sub-steps that may be performed in various sequences as appropriate to the individual step. Furthermore, additional steps may be added or removed depending on the particular application. One of ordinary skill in the art would recognize many variations, modifications, and alternatives.
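The following Python sketch is one hedged interpretation of the time based technique of FIG. 4: it compares the (clock-corrected) pen touch-down timestamp with a host touch event timestamp and labels the touch event "stylus pen" only when the delay falls within a threshold. The threshold value and function name are illustrative assumptions, not values taken from the specification.

```python
# Illustrative sketch of the FIG. 4 flow (steps 402-412); names and the
# threshold value are assumptions made for this example.

PREDETERMINED_THRESHOLD_MS = 30.0  # would be chosen empirically in a real system

def label_touch_event(pen_touchdown_host_ms, touch_event_host_ms,
                      threshold_ms=PREDETERMINED_THRESHOLD_MS):
    """Return 'stylus pen' if the touch event is close enough in time to the
    pen touch-down event reported by the stylus, otherwise 'finger'."""
    delay = abs(touch_event_host_ms - pen_touchdown_host_ms)
    if delay <= threshold_ms:
        return "stylus pen"   # box 412: touch event associated with the pen
    return "finger"           # step 410: touch event separate from the pen

# Usage: a pen touch-down mapped to ~5048 ms on the host clock, and two
# host touch events, one nearly simultaneous and one much later.
print(label_touch_event(5048.0, 5055.0))  # -> stylus pen
print(label_touch_event(5048.0, 5200.0))  # -> finger
```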
Geometric Based Input Discrimination Technique Examples
Embodiments of the invention described herein may also provide a system and method that uses one or more geometric based user input discrimination techniques to distinguish between the different types of user's physical touch input 330 information received from a stylus pen, finger or user's appendage by a touch sensitive device. The one or more input discrimination techniques may include a geometric shape discrimination technique that uses information relating to the relative position of multiple touch points supplied by the user to help discriminate between the interaction of a stylus pen, fingers or an appendage. The geometric based user input discrimination techniques discussed herein may be used separately or in combination with one or more of the other user input discrimination techniques to distinguish between the various different types of user inputs received by the computing device. In some embodiments, the use of one or more of the geometric based input discrimination techniques with one or more of the time based input discrimination techniques will help improve the accuracy of the user input discrimination beyond what either technique can achieve on its own.
In some embodiments, as noted above, it is desirable for the host input 333 (FIG. 3A), which is provided to the controlling engine 340 via portions of the host device 102, to only include a simplified data set containing just the coordinates (e.g., X and Y-direction coordinates) of each of the touch data points 556 and the time that the interaction occurred with the user interface 104. Typically, this simplified data set is a small fraction of the amount of data that is commonly collected by conventional touch sensitive handheld devices or touch sensitive display type computing devices. The creation and use of the simplified data to discriminate between the interaction of a stylus pen, fingers or an appendage can reduce the required computing power of the host device and/or increase the speed of the computing device by reducing the computational power required to collect and transfer the touch interaction data. Alternatively, in some configurations, the controlling engine 340 does not have access to the actual user interaction data collected from the user interface 104 and is only fed a simplified data set from the host device 102. In this case, the controlling engine 340 must discriminate between the interaction of a stylus pen, fingers or an appendage based on the limited nature of the data supplied to it by the host device 102.
FIG. 5A schematically illustrates a plurality of touch data points 556 that have been detected by the user interface 104 of the host device 102 and delivered to the controlling engine 340 for the discrimination of the various different types of user inputs. In general, the geometric shape discrimination technique 342 used by the controlling engine 340 includes a method of sorting and grouping the received touch data points 556 by their geometric location based on geometric rules coded into the controlling engine 340. Each of the touch data points 556 will be separately analyzed by the controlling engine 340 to determine whether it can be associated with the stylus pen, a finger or part of the user's appendage. The geometric shape discrimination technique 342 uses geometric rules to provide element labels to one or more clusters of touch points, since it is often likely that these clusters of touch points are related to a specific type of user input, such as a user's palm or finger. For example, if the controlling engine 340 knows that the user is right handed, based on information received from a configurational input 339 or by prior analysis of received touch data points 556, the controlling engine 340 can apply a rule that specifies that a stylus pen related touch point will be above and to the left of a group of touch points that are associated with a palm of the user.
To determine the type of user input a touch data point 556 is likely to be associated with, the controlling engine will generally use the current touch point data received from the user interface 104 and stored older touch data points that had been previously received by the controlling engine 340 (also referred to herein as "aging" touch data points). In one example, older touch data points, which had each been previously analyzed and characterized by the controlling engine 340, are used to help determine the type of user input that is associated with the currently received touch data point 556. Use of the older touch data points and their relationship to the new touch data points can improve the speed and accuracy with which the current touch data points can be associated with a type of user input. In some configurations, the older touch data points are retained, analyzed and/or used by the controlling engine 340 for only a short period of time before they are deemed not useful and are excluded from use.
In one embodiment, the controlling engine 340 is configured to group the touch data points 556 into at least one of a pen region 571, an appendage region 561 or a finger region 581. Therefore, based on the position of each touch data point 556 in relation to other touch data points 556, the geometric shape discrimination technique 342 can determine that one or more touch points, at any instant in time, is likely to be associated with the stylus pen, a finger or part of the user's appendage. The various regions defined by the controlling engine 340, such as the pen region 571, appendage region 561 or finger region 581, can be formed around clusters of touch data points that have been associated with the same type of user input. For example, the appendage region 561 illustrated in FIG. 5A contains a cluster of the touch data points 556 that have been associated with the user's appendage.
The controlling engine 340 may also create and use a geometric boundary region 551 to help prioritize the analysis of the received touch data contained therein as being likely to contain useful user input data. In one example, the geometric boundary region 551 may include a region that contains a touch point that is associated with a stylus pen and one or more touch points that are associated with an appendage, since it is likely that a pen touch point will be near touch points that are associated with a palm of the user. In one embodiment, the geometric boundary includes all of the touch points supplied to the user interface 104 that have been received at an instant in time. The controlling engine 340 may use the position of the touch points within the geometric boundary to help decide what type of user input has been received. In one example, touch data points that are near the edges of the geometric boundary may be more likely to be from a stylus pen.
Referring to FIG. 5A, in one example, the controlling engine 340 has applied the various geometric based rules and determined that a group of touch points is associated with an appendage region 561, a touch point is associated with a stylus pen (e.g., within the defined pen region 571) and a touch point is associated with a finger (e.g., within the defined finger region 581). In this example, the controlling engine 340 compares each of the currently received touch data points 556 with older touch data points 557 to determine the likely type of user input that has been received by the controlling engine 340. In this example, the controlling engine 340 thus may determine that the user's appendage has shifted down and to the left based on the comparison of the touch data points 556 that are found in the appendage region 561 with the older touch data points 557. This created geometric analysis data can then be used by other components in the host device 102, or, as is likely in this case, be excluded from the output data 350 set.
In an effort to provide a more accurate determination of the type of user input, it is often desirable to use the change in position of two or more older touch data points 557 to predict the likely position of the next touch point (e.g., touch data point 556). Use of this predictive technique to help characterize the type of user input can reduce the errors created by incorrectly assigning an element label to a touch data point. In one embodiment, as illustrated in FIG. 5B, the geometric shape discrimination technique 342 creates and uses a predicted data point region 559 to help determine the element label for a newly received touch data point 556. In this example, a first touch data point 5571 and a second touch data point 5572 are used to form a predicted direction 558 for the next touch data point and the predicted data point region 559. The touch data point 556 in this example happens to fall within the predicted data point region 559, and thus has a higher likelihood of being a continuation of this specific input received from the user, such as a touch data point input received from a stylus pen 106. The controlling engine 340 therefore takes into account the higher likelihood that a touch data point is of a certain type when it is assigning an element label to that touch data point. The controlling engine may adjust the size and shape of the predicted data point region 559 based on the speed of the user input, which is determined from the movement of the older touch data points 557.
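A minimal sketch of one way the predicted data point region of FIG. 5B could be realized is shown below: two older points define a predicted direction and an expected next position, and a new point is treated as a likely continuation of the same input if it falls within a radius around that prediction. The radius values, the speed scaling and the function names are illustrative assumptions.

```python
# Hypothetical sketch of a predicted data point region built from two older
# touch data points (557-1 and 557-2 in FIG. 5B); parameters are assumptions.
import math

def predict_next_point(p1, p2):
    """Extrapolate the next position from two older points (oldest first)."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    return (p2[0] + dx, p2[1] + dy)

def in_predicted_region(p1, p2, candidate, base_radius=8.0, speed_factor=0.5):
    """True if 'candidate' falls inside the predicted data point region.
    The region grows with the speed of the input (distance between p1 and p2)."""
    predicted = predict_next_point(p1, p2)
    speed = math.hypot(p2[0] - p1[0], p2[1] - p1[1])
    radius = base_radius + speed_factor * speed
    return math.hypot(candidate[0] - predicted[0],
                      candidate[1] - predicted[1]) <= radius

# Usage: a pen stroke moving right; the new point lands near the prediction.
older_1, older_2 = (100.0, 200.0), (110.0, 202.0)
print(in_predicted_region(older_1, older_2, (121.0, 203.0)))  # -> True
print(in_predicted_region(older_1, older_2, (180.0, 260.0)))  # -> False
```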
In some configurations, the one or more geometric shape discrimination techniques 342 compare the movement of a touch data point, or cluster of touch data points, with the movement of a touch data point that is associated with a stylus pen, to determine whether this cluster of points may be associated with an appendage (e.g., palm) following the stylus pen. In one example, a touch point or cluster of touch data points that moves parallel to the direction of movement of a touch data point that is associated with a stylus pen is labeled as being a palm.
FIG. 5C is a simplified flowchart illustrating a method of discriminating interactions caused by an appendage, such as a palm of a user, on a touch-screen according to an embodiment of the invention. The method 520 is performed by processing logic that may comprise hardware (circuitry, dedicated logic, etc.), software (such as is run on a general purpose computing system or a dedicated machine), firmware (embedded software), or any combination thereof that is contained in the host device 102 and/or the stylus pen 106. In one embodiment, the method 520 is performed by the controlling engine 340 that is running in the background of the host device 102 and/or by the data collection and transmission processes running on components in the stylus pen 106.
At step 522, the method includes receiving information related to a touch-down event on the host device 102. The information related to the touch-down event that is transferred to the controlling engine 340 may comprise timing information, pressure data, and other data that is sent from the stylus pen 106 via the communication link 335 to the host device 102. For example, the information related to the touch-down event may comprise a first timestamp for the touch-down event, and may include information related to when the touch-down event occurred, how much pressure was applied in the touch-down event, and the length of time of the touch-down event. In another example, the information related to the touch-down event may comprise a pen clock timestamp derived from a pen clock signal received from the pen clock 106g and a host clock timestamp derived from a host clock signal received from the host clock 215 for the touch-down event, and may include information related to when the touch-down event occurred, how much pressure was applied in the touch-down event, and the length of time of the touch-down event. According to embodiments of the present invention, the information related to the touch-down event is received from a handheld device, such as a stylus pen 106.
Next, at box 524, the method also includes receiving information related to a plurality of touch events on the host device 102. The plurality of touch events may be from an appendage of the user, such as a palm of the user resting on the surface of the user interface 104 of the host device 102. The plurality of touch events may also be from the stylus pen 106 interacting with the user interface 104 of the host device 102, or from a touch interaction through direct contact with the user interface 104 by a finger or appendage of the user. According to embodiments of the present invention, the information related to each touch event may comprise a second timestamp for the touch event, which may include timing information related to when the touch event occurred. In one embodiment, the second timestamp comprises a pen clock timestamp derived from a pen clock signal received from the pen clock 106g and a host clock timestamp derived from a host clock signal received from the host clock 215 for the touch event.
Next, at box 526, the method also includes determining or defining one or more clusters of touch events from the plurality of touch events provided to the controlling engine 340 from the stylus pen 106 and host device 102. According to embodiments of the present invention, there may be touch events occurring from the finger of the user, from the stylus pen 106 interacting with the user interface 104 of the host device 102, or from an appendage of the user, such as a palm resting on the user interface 104 of the host device 102.
Next, at box 528, the method also includes determining the movements of each one of the clusters of touch events. According to embodiments of the present invention, the controlling engine 340 in the host device 102 can determine the movements of each one of the clusters of touch events based on the interactions between each one of the clusters of touch events and the user interface 104 of the host device 102. For example, the controller in the host device 102 can determine the movement of each one of the clusters of touch events across the user interface 104 of the host device 102, or can determine that one or more of the clusters of touch events are stationary. According to embodiments of the present invention, determining one or more clusters of touch events from the plurality of touch events may comprise detecting the location of each touch event in the plurality of touch events and associating each touch event with one or more clusters of touch events. In some embodiments, associating each touch event with one or more clusters of touch events is based on the relative distances between each touch event in the plurality of touch events. For example, touch events may be considered to be associated with the same cluster of touch events when they are within a predetermined distance of each other.
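One simple, hypothetical way to form the clusters described at boxes 526 and 528 is single-linkage grouping by distance: two touch events join the same cluster when they are within a predetermined distance of each other. The sketch below is an assumption-labeled illustration in Python; the distance value and function names are not taken from the specification.

```python
# Hypothetical sketch: group touch events into clusters when any two events
# are within a predetermined distance of each other (single-linkage grouping).
import math

def cluster_touch_events(points, max_distance=40.0):
    """points: list of (x, y) touch locations. Returns a list of clusters,
    where each cluster is a list of points."""
    clusters = []
    for point in points:
        # Find every existing cluster that has a member close to this point.
        near = [c for c in clusters
                if any(math.hypot(point[0] - q[0], point[1] - q[1]) <= max_distance
                       for q in c)]
        if not near:
            clusters.append([point])          # start a new cluster
        else:
            merged = [point]                  # merge all nearby clusters
            for c in near:
                merged.extend(c)
                clusters.remove(c)
            clusters.append(merged)
    return clusters

# Usage: a tight group of palm contacts plus one isolated pen/finger contact.
events = [(300, 400), (320, 410), (310, 430), (100, 120)]
print([len(c) for c in cluster_touch_events(events)])  # -> [3, 1]
```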
Next, at box 530, the method may also include correlating the information related to the touch-down event with the information related to the plurality of touch events. According to embodiments of the present invention, the touch-down event and the plurality of touch events may be correlated based on information received by the controlling engine 340 relating to the touch-down event and the plurality of touch events. The information includes, but is not limited to, timing information received from the stylus pen 106 or movement of the plurality of touch events detected by the controlling engine 340 of the host device 102.
Next, at box 532, the method may also include determining whether the movement is below a threshold distance. In this step, the controlling engine 340 may determine whether the movement of each cluster of touch events is less than a predetermined threshold distance. For example, the predetermined threshold may be set to a numerical value specifying a particular distance of movement. If the controlling engine 340 determines that one of the clusters of touch events moves less than the predetermined threshold, the cluster of touch events may be determined to be associated with an appendage of a user, such as a palm, as illustrated in box 534, since a palm resting on the user interface 104 of the host device 102 is more likely to have little or no movement. On the other hand, if the controlling engine 340 determines that one of the clusters of touch events moves more than the predetermined threshold, the cluster of touch events may be determined to be from a stylus pen or a finger, and thus not associated with the appendage of a user, as illustrated in box 536. In such cases, timing information may be used to determine whether the cluster of touch events is from a stylus pen 106 or from a finger of the user interacting with the user interface 104 of the host device 102. In one example, the one or more input discrimination techniques first try to determine whether an interaction is from a stylus pen and then try to decide whether the interaction is from a non-stylus pen source, or vice versa.
Once the determination has been made, the controlling engine 340 continues to monitor incoming touch-down events and touch events, correlates the received data, makes a determination as to whether each touch event is associated with the stylus pen 106, touch interactions by the user's fingers, or a touch interaction caused by the user's appendage resting on the user interface 104 of the host device 102, and then generates the output data 350 that is provided to software and hardware components running on the host device 102.
It should be appreciated that the specific steps illustrated in FIG. 5C provide a particular method 520 according to an embodiment of the present invention. Other sequences of steps may also be performed according to alternative embodiments. For example, alternative embodiments of the present invention may perform the steps outlined above in a different order. Moreover, the individual steps illustrated in FIG. 5C may include multiple sub-steps that may be performed in various sequences as appropriate to the individual step. Furthermore, additional steps may be added or removed depending on the particular application. One of ordinary skill in the art would recognize many variations, modifications, and alternatives.
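Continuing the same hedged illustration, the sketch below applies the box 532-536 decision: a cluster whose centroid movement between samples stays below a threshold distance is treated as an appendage (palm), while a faster-moving cluster is left to the timing-based check to decide between pen and finger. The threshold value and names are assumptions made for this example.

```python
# Hypothetical sketch of the box 532-536 decision: classify a cluster of touch
# events by how far its centroid has moved; values and names are assumptions.
import math

THRESHOLD_DISTANCE = 5.0  # movement below this (in touch-panel units) => palm

def centroid(points):
    return (sum(p[0] for p in points) / len(points),
            sum(p[1] for p in points) / len(points))

def classify_cluster(previous_points, current_points,
                     threshold=THRESHOLD_DISTANCE):
    """Return 'appendage' for a nearly stationary cluster, otherwise defer to
    the time based technique to separate 'stylus pen' from 'finger'."""
    (px, py), (cx, cy) = centroid(previous_points), centroid(current_points)
    movement = math.hypot(cx - px, cy - py)
    if movement < threshold:
        return "appendage"            # box 534: resting palm
    return "stylus pen or finger"     # box 536: resolve with timing data

# Usage: a palm cluster that barely moves versus a single fast-moving contact.
print(classify_cluster([(300, 400), (320, 410)], [(301, 401), (321, 411)]))
print(classify_cluster([(100, 120)], [(130, 150)]))
```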
Therefore, in some embodiments, the various geometric based rules applied by the controlling engine 340 may include: increasing the likelihood that a touch data point 556 that is positioned within a threshold distance of a previous touch data point that was characterized as a stylus pen is also a stylus pen touch data point; that a touch data point 556 that is positioned outside a defined appendage region 561 is likely a stylus pen or a finger input; that a touch data point 556 that is positioned in the same general direction in which a "stylus pen" associated touch data point had moved is also likely a stylus pen touch data point; and that a touch data point that has not moved over one or more touch sampling intervals is likely an appendage touch point. These geometric rule examples, and the types of user input used with these rule examples, are not intended to limit the scope of the invention described herein.
Inference Based Input Discrimination Technique Examples
Embodiments of the invention described herein may also include a system and method that utilizes one or more inference based user input discrimination techniques to determine whether it is likely that a user input was received from a stylus pen, finger or appendage by a touch sensitive device. The inference based discrimination techniques used by the controlling engine will generally compare the received user input data with predefined and/or relevance-weighted rules to help discriminate between the interaction of a stylus pen, fingers or an appendage. The inference based user input discrimination techniques discussed herein may be used by themselves or in combination with one or more of the other types of user input discrimination techniques discussed herein to discern between the various different types of inputs received by the computing device.
FIG. 6A illustrates a simplified flowchart of a method 600 of discriminating between different types of user inputs using an inference based discrimination technique. The inference based discrimination technique 343 (FIG. 3A) of the controlling engine 340 takes in the various different types of inputs 601 received by the host device 102 at an instant in time, such as the inputs 331, 333, 335 and 339 discussed above, and tries to determine what type of user input has been received by comparing the outcomes of various analyses performed on the received user input data. The various analyses may be performed by use of two or more decision modules, such as decision modules 6021-602N, where N is a whole number greater than or equal to two, that each provide an input 6041-604N to a decision matrix 605. The decision matrix 605 then analyzes and compares the received inputs 6041-604N for each touch data point and other event data 603 to determine the most likely element label for each touch data point. The inputs 6041-604N may comprise a "vote" that includes an alphanumeric, numerical or other distinct label that distinguishes one type of user input from another. In one embodiment, the decision matrix 605 compares the received inputs 6041-604N for each touch data point by tabulating or summing the different votes for the user input type created by each of the decision modules 6021-602N. The decision matrix 605 of the controlling engine 340 then generates the output data 350 that is then used by the components and software found in the host device 102. In some embodiments, the decision matrix 605 uses threshold values, which are stored in memory, to assign the user input a desired element label. In one example, a touch data point is not assigned a stylus pen input element label unless it receives a certain minimum number of votes.
In some embodiments, the decision matrix 605 determines an element label for a touch data point by first applying a weighting factor to each of the received inputs 6041-604N, and then comparing the adjusted, or weighted, inputs to determine the element label for a given touch data point. The weighting factor may mean that a vote provided by a given decision module (e.g., decision module 6021) carries more weight than a vote provided by another decision module (e.g., decision module 6022), based on its ability to correctly characterize the type of user input. In one embodiment, the event type 603 includes other known information relating to the received touch point data, such as whether the data is being delivered from the user interface 104 or the stylus pen 106.
In one embodiment, each decision module 6021, 6022, . . . 602N includes a time based user input discrimination technique, a geometric user input discrimination technique or other applicable user input discriminating rules that are available to the controlling engine 340. In one embodiment, each of the decision modules 6021-602N comprises one or more coded software instructions that apply one or more defined rules used to characterize the type of user input from which a touch data point was derived. In one example, a decision module 6021 characterizes a touch data point based on its relationship to a cluster of other touch data points as discussed in relation to FIG. 5A, a decision module 6022 characterizes the touch data point based on its predicted position as discussed in relation to FIG. 5B, a decision module 6023 characterizes the touch data point based on its movement being within a threshold value as discussed in relation to FIG. 5C, and a decision module 6024 characterizes a touch data point based on knowledge of an attribute of the user (e.g., right-handed). Then, each decision module 6021-6024 delivers its input 6041-6044, also referred to herein as its "vote" as to what type of user input the touch data point was created from, to the decision matrix 605. In one example, the inputs 6041-6044 each indicate whether the decision module believes that the touch data point is associated with a stylus pen, finger or user's appendage. The decision matrix 605 of the controlling engine 340 then compares the inputs 6041-6044 and generates the output data 350 for that touch data point, which may include its position, timestamp information and whether the touch data point is associated with a stylus pen, finger or user's appendage.
FIG. 6B illustrates a table that contains examples of voting results contained in the generated decision matrix data based on the received inputs 6041-604N, where in this example N is equal to 10. Therefore, the inputs 6041-60410 (not shown) have been created by use of inputs received from the decision modules 6021-60210 (not shown), and have been tabulated by the decision matrix 605 to form the illustrated results. In this example, a first touch data point has received eight votes that it is related to a stylus pen, one vote that it is related to a finger and one vote that it is related to a user's appendage, while a second touch data point has received two votes that it is related to a stylus pen, seven votes that it is related to a finger and one vote that it is related to a user's appendage. Therefore, based on the tabulated data, the controlling engine 340 would attribute the first touch data point to a stylus pen and the second touch data point to a finger, and these attributions would then be delivered in the output data 350.
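A hedged sketch of the decision matrix idea follows: each decision module contributes a vote, the votes are optionally weighted, and a label is assigned only if the winning tally clears a minimum threshold (otherwise the point remains "unknown", as discussed below). The weights, threshold and module names are illustrative assumptions, not values from the specification.

```python
# Hypothetical sketch of the decision matrix 605: tally (optionally weighted)
# votes from decision modules and assign an element label; values are assumed.
from collections import defaultdict

def decision_matrix(votes, weights=None, min_score=3.0):
    """votes: list of (module_name, label) pairs, with label in
    {'stylus pen', 'finger', 'appendage'}. Returns the winning label, or
    'unknown' if no label reaches min_score."""
    weights = weights or {}
    tally = defaultdict(float)
    for module_name, label in votes:
        tally[label] += weights.get(module_name, 1.0)  # default weight of 1
    if not tally:
        return "unknown"
    label, score = max(tally.items(), key=lambda kv: kv[1])
    return label if score >= min_score else "unknown"

# Usage: the cluster and prediction modules are trusted more than the rest.
module_weights = {"cluster_rule": 2.0, "prediction_rule": 2.0}
first_point_votes = [("cluster_rule", "stylus pen"),
                     ("prediction_rule", "stylus pen"),
                     ("movement_rule", "finger"),
                     ("handedness_rule", "stylus pen")]
print(decision_matrix(first_point_votes, module_weights))              # -> stylus pen
print(decision_matrix([("movement_rule", "finger")], module_weights))  # -> unknown
```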
As noted above, after receiving and analyzing the received user information, the controlling engine 340 may then deliver output data 350 to one or more software and hardware components running on the host device 102. In one example, the output data 350 may be used by one or more third party applications and/or components in the host device 102 to perform some useful display or data output function. In another example, the output data 350 may be used by the host device 102 to generate an image or line on a display in the host device 102, due to the determination that the received touch data is related to a stylus pen 106.
In some cases, after receiving all of the inputs, the decision matrix 605 is unable to determine what type of input a certain touch data point is; such a point is referred to herein as an "unknown" touch data point. To resolve this issue, the software running on the host device may take a few different paths to decide what to do with these unknown touch data points. First, the software may decide not to use the "unknown" touch data points in any of the tasks that it is currently performing. For example, in the case of a drawing program, the controlling software may decide not to render the touch point on the screen of the user interface 104. In this case, the controlling software has decided that each data point must have an element label to be used. In a second approach, the software running on the host device may decide to use the "unknown" touch data points in the tasks that it is currently performing. For example, in the case of a drawing program, the controlling software may decide to render the touch point on the screen of the user interface 104, and at some later time undo and/or remove the rendered data when it becomes clear that the input data was not received from a desired component, such as a stylus pen or finger. In this case, the controlling software may decide to give each input an initial element label and then correct the label when it has more data.
FIG. 7 is a simplified signal diagram 700 illustrating aspects of a method for discriminating stylus pen interactions from touch interactions on a touch-screen by an inference technique, according to an embodiment of the invention. The diagram includes a host device signal 710, a signal representing touch events detected by a controlling engine 720, a signal representing touch-down events by a stylus pen 106 on the user interface of the host device 730, and a signal representing a touch interaction (e.g., from a finger of a user) on the user interface of the host device 740.
The host device signal 710 includes an active period 711 that indicates a period where the host device is active and able to receive inputs. The signal representing touch events detected by a controlling engine 720 includes a first period indicating a touch event 721, a first period indicating no touch event 722, a second period indicating a touch event 723, a second period indicating no touch event 724, a third period indicating a touch event 725, a third period indicating no touch event 726, a fourth period indicating a touch event 727, and a fourth period indicating no touch event 728.
The signal representing touch-down events by a stylus pen 106 on the user interface of the host device 730 includes a first period indicating a touch-down event 731, a first period indicating no touch-down interaction 732, a second period indicating a touch-down event 733, and a second period indicating no touch-down interaction 734.
The signal representing a touch interaction on the user interface of the host device 740 includes a first period indicating a touch interaction 741, a first period indicating no touch interaction 742, a second period indicating a touch interaction 743, and a second period indicating no touch interaction 744.
In embodiments of the invention, by correlating the times of the signals from the stylus pen 106 with the times that touch events were detected (e.g., comparing signals 720 and 730 in FIG. 7), the touch-down events attributable to the stylus pen 106 can be parsed out from amongst all the signal periods where touch events were detected. For example, touch events 723 and 727 can be distinguished from touch events 721 and 725 as being touch events conducted using the stylus pen 106 against the user interface 104 of the host device 102, rather than touch events conducted by the user making contact with their finger on the user interface 104 of the host device 102.
FIG. 8 is a simplified signal diagram 800 illustrating aspects of a method for discriminating stylus pen interactions from touch interactions on a touch-screen, where stylus pen and touch interactions overlap, according to an embodiment of the invention. For example, a user may conduct stylus pen and touch interactions simultaneously with the user interface 104 of the host device 102. As there may be periods where the signals overlap, the system can parse out the signals from the stylus pen 106 from the touch interactions conducted using the user's fingers. The diagram includes a host device signal 810, a signal representing touch events detected by a controlling engine 820, a signal representing touch-down events by a stylus pen 106 on the user interface of the host device 830, and a signal representing a touch interaction (e.g., from a finger of a user) on the user interface of the host device 840.
The host device signal 810 includes an active period 811 that indicates a period where the host device 102 is active and able to receive inputs. The signal representing touch events detected by a controlling engine 820 includes a first period indicating a touch event 821, a first period indicating no touch event 822, a second period indicating a touch event 823, a second period indicating no touch event 824, a third period indicating a touch event 825, and a third period indicating no touch event 826.
The signal representing touch-down events by a stylus pen 106 on the user interface of the host device 830 includes a first period indicating a touch-down event 831, a first period indicating no touch-down interaction 832, a second period indicating a touch-down event 833, and a second period indicating no touch-down interaction 834.
The signal representing touch interactions on the user interface of the host device 840 includes a first period indicating a touch interaction 841, a first period indicating no touch interaction 842, a second period indicating a touch interaction 843, and a second period indicating no touch interaction 844.
As shown in FIG. 8, the signal representing a touch interaction on the user interface of the host device 840 and the signal representing touch-down events by a capacitive stylus pen on the user interface of the host device 830 partially overlap during an overlap period 850. The overlap period 850 coincides with the first period indicating a touch-down event 831 and the second period indicating a touch interaction 843.
As depicted in FIGS. 7 and 8, the signal strength from the touch-down events by the stylus pen 106 and the signal strength from the touch interactions on the user interface of the host device 840 are roughly equal. However, in other embodiments, the relative signal strengths of the touch-down events by the stylus pen 106 and the touch interactions on the user interface of the host device 840 may vary. In these other embodiments, the relative change in signal strengths may be an additional factor in further discriminating between stylus pen and touch interactions.
Active Stylus Pen and Active Pen Control Techniques
Referring back to FIGS. 2 and 3, in one embodiment, the stylus pen 106 may include a touch signal generating device (TSGD) 106h that is used to cause the stylus pen 106 to be selectively sensed by the capacitive sensing elements found within the touch sensing unit 212 of the user interface 104 of the host device 102. In this configuration, the touch signal generating device 106h includes one or more components that are able to selectively form a virtual capacitance between a portion of the pen tip 106a and the capacitive sensing elements found in the user interface 104 when a TSGD switch, such as a mechanical sensor/switch 221, is activated by the user. In one example, the TSGD switch is part of the pen tip 106a or pressure sensing unit 106b. The formed virtual capacitance between the pen tip 106a and the host device 102 creates a touch event that is sensed by the user interface 104 with or without the physical act of touching the pen tip 106a to the user interface.
FIG. 9A is an electrical schematic that illustrates the operation of an active stylus pen 106 with the host device 102 that is configured for mutual capacitance sensing, according to an embodiment of the invention. The active stylus pen 106 is configured with an active stylus control element 907, which may receive signals from the host device 102 as well as generate signals to be transmitted to the host device 102. As shown, the active stylus pen 106 may be held by a user's fingers 915 and is coupled to the user interface 104 through the pen tip 106a. The active stylus pen 106 may be physically coupled to the user interface 104, or the active stylus pen 106 may be located in proximity to the user interface 104 such that signals generated within the active stylus control element 907 and transmitted to the pen tip 106a are able to change the sensed capacitance at the sensing assembly 117 within the host device 102 to a desired level at a desired time.
The host device 102, of which a portion is depicted in FIG. 9A, generally includes a user interface 104, a driver assembly 113 and a sensing assembly 117. The host device 102 may include, for example, drive regions and sense regions, such as drive electrodes 114 and sense electrodes 116. Further, the drive electrodes 114a-114c (x-direction) may be formed in columns while the sense electrodes 116a-116b (y-direction) may be formed in rows. Touch sensing areas, or touch pixels, may be formed at the overlapping regions of the drive electrodes and sense electrodes.
During operation, the column driver 113 may transmit a capacitive sensing waveform on one or more drive electrodes 114 at a time, thereby creating a mutual capacitance CM between the row of sense electrodes 116 and the driven drive electrode(s) 114 (i.e., column(s)) at each touch pixel. The active stylus pen 106, when coupled to the user interface 104, may be configured to detect the transmitted capacitive sensing waveform. When the active stylus pen 106 is coupled to the user interface 104, some of the charge coupled between the drive electrodes 114 and sense electrodes 116 corresponding to one or more touch pixels may instead be coupled onto the active stylus pen 106, thus forming a pen capacitance CP corresponding to each of the coupled touch pixels. More charge may generally be coupled from a particular touch pixel to the active stylus pen 106 where the active stylus pen 106 is a shorter distance from that touch pixel; therefore, detecting that more charge has been coupled away from a particular touch pixel may indicate a shorter distance to the active stylus pen 106. This reduction in charge coupling across the touch pixels can result in a net decrease in the measured mutual capacitance CM between the drive electrode 114 and the sense electrode 116, and a reduction in the capacitive sensing waveform being coupled across the touch pixel. This reduction in the charge-coupled sensing waveform can be detected and measured by analyzing the change in the sensed capacitance CS in the sensing assembly 117 to determine the positions of multiple objects when they touch the user interface 104.
In some embodiments, the active stylus pen 106 may send a controlling signal to the user interface 104 by injecting a charge at the appropriate time into the pen tip 106a, which alters the mutual capacitance CM and thus the value of the sensed capacitance CS detected by the sensing assembly 117. Therefore, by controlling the amount of charge, or the voltage formed between the pen tip 106a and a sensing electrode 116, to a desired level, the pen tip 106a of the active stylus pen 106 can be detected by the capacitive sensing element in the touch-screen containing device as a touch event.
Further, in some embodiments the active stylus pen 106 may detect a signal produced at one or more drive electrodes 114 of the touch-screen containing device by the column driver 113. Based on the detected signal, the active stylus pen 106 may alter the sensed capacitance CS to a desired level at a desired time, so as to cause the touch-screen containing device to correctly determine the location of the input provided by the active stylus pen 106. Advantageously, since the size of the pen tip 106a is generally too small to be sensed by the user interface 104, the active stylus pen 106 may therefore be used to selectively provide a touch sensing input to the user interface 104. Therefore, by timing when a user input is provided by the active stylus pen 106 to the user interface 104, the software running on the touch-screen containing device can analyze and use the provided input to control some aspect of a software program running on the touch-screen containing device and/or display some aspect of the input received on the display portion of the touch-screen device. In some embodiments, the active stylus pen 106 is adapted to deliver input from the active stylus pen 106 to any type of touch-screen containing device, despite differences in the particular configurations and sensing methods performed by the touch-screen containing devices.
FIG. 9B generally illustrates a driven touch-sensing detected signal 951 provided by the touch sensing components in the host device 102 and a controlling signal 915 that is generated and provided to the pen tip 106a by the active stylus controlling element 907, according to an embodiment described herein. To provide desirable user input, the active stylus control element 907 may generally operate in a synchronization mode 913 or in a transmit mode 914.
For example, assume that the active stylus pen 106 is coupled to a particular touch-screen containing host device 102. The location of the pen tip 106a on the touch screen may be directly at a drive pixel that contains a portion of the drive electrode 114 (i.e., a column) and the sense electrode 116 (i.e., a row), but may also be located on the touch screen between drive pixels. The detected signal 951 represents the voltage measured by the pen tip 106a over time. The detected signal 951 reflects a signal that is generated by the column driver 113 and then sequentially applied to each column as the user interface 104 is sequentially scanned. The active stylus controlling element 907 may operate by default in synchronization mode 913, essentially listening for signal activity in this mode, and then may transition to transmit mode 914 based on signal activity received and processed by the processor 106c.
During time period 902, the detected signal 901 has a signal magnitude 901a, which indicates that the column driver 113 signal is being applied to a column that is a distance away from the pen tip 106a, such as a neighboring column, and thus has not yet reached the column nearest to the pen tip 106a. The active stylus control element 907 may remain in synchronization mode 913 for a period of time or until the signal magnitude changes. During the next time periods 903 and 904, the detected signal 901 has an amplitude of 901b, indicating that the column driver 113 is currently applying a portion of the detected signal 901 to a column (e.g., drive electrode 114) that is closer to the pen tip 106a than the column that delivered the signal during the time period 902.
Generally, synchronization of the active stylus control element 907 with the touch-screen containing host device 102 is important to ensuring that accurate input is detected by the host device 102. For example, suppose the active stylus control element 907 transmits a signal to the pen tip 106a when the column driver 113 is driving a column at which the pen tip 106a is not located. The signal transmitted to the pen tip 106a will change the sensed capacitance most strongly at the sensing assembly 117 closest to the location of the pen tip 106a, but may also affect nearby sensing assemblies 117 to a lesser degree. Because the host device 102 may measure the values of sensed capacitance across all rows simultaneously, but the columns are driven in a particular sequence, the host device 102 will detect the changes in sensed capacitance but may misinterpret the location of the input. The effect of the misinterpretation may be erratic or erroneous input into the host device 102, which may cause the input position on the screen to jump around and/or lead to other undesirable effects in programs being executed on the host device 102, and may further significantly degrade the user's experience.
At the next time period 904, the frequency of the detected signal 901 received from the host device 102 may change, and may be a higher or lower frequency than that of the portion of the detected signal 901 in time period 903. The change in frequency may be caused by the particular scanning process of the host device 102. In this example, the frequency of the detected signal 901 increases at time period 904 while the amplitude of the detected signal 901 remains at the signal magnitude 901b, indicating that the detected signal 901 is still being applied to the same or a similarly positioned column relative to the pen tip 106a. In one or more configurations, the active stylus control element 907 may adapt to such a change in frequency and adjust the output signal delivered from the pen tip 106a. To accomplish this, the active stylus control element 907 may stop transmitting and transition from transmit mode 914 to synchronization mode 913. When the active stylus control element 907 regains synchronization with the detected signal 901, the active stylus control element 907 may then return to transmit mode 914 and resume transmitting an output signal 912 to the pen tip 106a.
At the subsequent time period 905, the magnitude of the detected signal 901 decreases from 901b to 901c, indicating that the column driver 113 is applying the detected signal 901 to a column (i.e., the column driver 113 is transmitting on the next column) that is a further distance away from the column(s) that delivered the signal during the time periods 903 and 904. The indication that the nearest column is no longer delivering the detected signal 901 from the column driver 113 then causes the active stylus control element 907 to transition into synchronization mode 913, irrespective of the frequency or phase of the detected signal 901 that is detected by the active stylus pen 106. Although the signal 901 is depicted as having the same frequency and phase during time period 905 as during time period 904, the example is meant to demonstrate that the signal magnitude falling below a particular threshold may trigger a transition into synchronization mode 913, regardless of signal frequency or phase. Further, the examples disclosed herein are not meant to limit the claimed subject matter to only those embodiments interacting with host devices 102 that generate such signal patterns, frequencies, phases, or changes in frequencies and/or phases.
In one or more embodiments, the maximum signal magnitude value that corresponds to column driver 113 driving the nearest column (i.e., magnitude 901b) may be learned during one scan cycle. The maximum signal magnitude value may then be used to determine a threshold value that can effectively distinguish the maximum magnitude value from the remainder of detected signal magnitude values (i.e., distinguish magnitude 901b from magnitudes 901a and 901c). In subsequent scan cycles, the threshold value may be compared with the detected signal magnitude to indicate whether column driver 113 is currently driving the nearest column to pen tip 106a.
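The threshold-learning step described above can be summarized in a short sketch. The sketch is illustrative only: the midpoint rule, the per-scan magnitude list and the function names are assumptions made for the example and are not part of the disclosed embodiment.

# Illustrative sketch of learning a magnitude threshold over one scan cycle.
# The midpoint rule below is an assumption, not the disclosed method.
def learn_threshold(magnitudes_one_scan):
    """Derive a threshold separating the nearest-column magnitude (e.g., 901b)
    from the weaker magnitudes (e.g., 901a, 901c) seen in a single scan cycle."""
    peak = max(magnitudes_one_scan)                  # strongest coupling: nearest column
    rest = [m for m in magnitudes_one_scan if m != peak]
    floor = max(rest) if rest else 0.0               # strongest of the remaining columns
    return (peak + floor) / 2.0                      # midpoint separates the two groups

def nearest_column_driven(current_magnitude, threshold):
    """In later scan cycles, report whether the column driver is currently
    driving the column nearest the pen tip."""
    return current_magnitude >= threshold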
In one embodiment, when the sensing component (e.g., communications unit 906d, the processor 906c and the memory 906e) of the active stylus pen 106 determines that the nearest column(s) are delivering the column driver 113 signal, the sensing component may analyze the detected signal 901 and generate an output signal based on the detected signal 901. The active stylus control element 907 may remain in synchronization mode 913 for a time period 918, until analysis of the detected signal 901 is complete and the active stylus control element 907 has synchronized to the detected signal 901. The active stylus control element 907 may then transition into transmit mode 914 and begin transmitting an output signal, such as the output signal found in transmit modes 914 of the controlling signal 915, to the pen tip 106a. Transmission may continue until synchronization with the detected signal 901 is lost (e.g., if the frequency or phase of detected signal 901 changes).
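One way to visualize the mode switching just described is a small state-machine sketch that polls the detected signal each cycle. The class name, the frequency tolerance and the polling model are assumptions made for illustration and do not describe the actual firmware of the active stylus control element 907.

# Hypothetical sketch of the synchronize/transmit mode switching.
SYNC, TRANSMIT = "synchronization", "transmit"

class StylusModeMachine:
    def __init__(self, freq_tolerance_hz=1000.0):
        self.mode = SYNC
        self.locked_freq = None
        self.freq_tolerance_hz = freq_tolerance_hz

    def step(self, magnitude, frequency, threshold):
        if self.mode == SYNC:
            if magnitude >= threshold:            # nearest column is being driven
                self.locked_freq = frequency      # lock to the detected frequency
                self.mode = TRANSMIT
        else:  # TRANSMIT
            lost_column = magnitude < threshold
            freq_changed = abs(frequency - self.locked_freq) > self.freq_tolerance_hz
            if lost_column or freq_changed:       # synchronization lost -> stop transmitting
                self.mode = SYNC
        return self.mode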
Though the active stylus control element 907 may be capable of on-the-fly adaptation to a frequency change in a detected signal 901, this adaptive capability may have a significant computational expense. This expense may have the secondary effects of increasing the power consumption of active stylus pen 106, as the active stylus control element 907 more frequently processes the detected signal 901 and attempts to synchronize, and of decreasing the percentage of time during scan cycles that the active stylus control element 907 is able to transmit to host device 102. For example, active stylus control element 907 is depicted as being in synchronization mode 913 for a longer period 918 than the period 919, during which active stylus control element 907 is in transmit mode 914. Such a decreased percentage may result in a less responsive input to the host device 102, which may ultimately cause computing errors in host device 102.
In another embodiment, however, the active stylus pen 106 may accommodate longer transmit mode periods 919 by storing host device identification information that relates to one or more host devices 102. The information may include data relating to physical characteristics or capacitive sensing techniques of each of the different types of host devices, and the information may be stored in memory 106e. The host device identification information may further include frequency, timing and phase information of detected signal 901, the number of rows and/or columns in the user interface 104 and other useful information. The host device identification information may be pre-programmed and/or stored in memory based on vendor specifications, or may be learned (through use of the active stylus pen 106 with particular host devices 102) and then stored in memory by the sensing component of active stylus pen 106. If the active stylus pen 106 already contains host device identification information corresponding to the particular host device 102, the active stylus pen 106 may advantageously bypass synchronization mode 913 when column driver 113 is driving detected signal 901 on the nearest column. In other words, active stylus pen 106 may transmit an output signal to the pen tip 106a during the entirety of the time period. Further, frequency and phase changes to detected signal 901 may not disrupt the transmission by the active stylus pen 106 if the target frequency and phase values are also included in the host device identification information.
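The host device identification information could, for example, be organized as a small profile record keyed by a device identifier. The field names, example values and lookup function below are hypothetical and only illustrate the kind of data the memory 106e might hold.

# Illustrative sketch of stored host device identification information.
from dataclasses import dataclass

@dataclass
class HostProfile:
    drive_frequency_hz: float     # frequency of the host's capacitive drive signal
    scan_period_s: float          # duration of one full column scan cycle
    num_columns: int
    num_rows: int
    column_pitch_mm: float        # physical spacing between drive electrodes

# Pre-programmed or learned profiles, keyed by an identifier for the host device.
KNOWN_HOSTS = {
    "vendor_a_tablet": HostProfile(250_000.0, 0.016, 40, 30, 5.0),
}

def lookup_profile(device_id):
    """Return a stored profile if the pen already knows this host, letting it
    shorten or skip the synchronization step; otherwise fall back to learning."""
    return KNOWN_HOSTS.get(device_id)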
In one embodiment, the stylus pen 106 is able to use the knowledge of the physical characteristics of the host device 102 to determine one of the coordinates of a touch event created by a stylus pen 106's interaction with the user interface 104. In one example, since the stylus pen is able to sense the transmitted signals provided by the driven columns in the host device 102, and is able to determine that it is nearer to one column versus another, then, using knowledge of the physical layout of the columns (i.e., driven electrodes) in the host device 102, the stylus pen 106 can ascertain its x-direction coordinate. By monitoring the full touch sensing scan cycle, or monitoring characteristics of the touch sensing scanning process performed by the host device 102, the stylus pen 106 can determine which column number is being driven at a certain time, either by knowledge of the scanning technique used by the host device and/or by analysis of the touch sensing scanning process. For example, it is common for touch sensing devices to drive all of the columns at the end of a touch sensing cycle to reduce any charge built up in different areas of the user interface. The stylus pen 106 is then able to detect and use this information to know when the first column in a new touch sensing scan is about to start. The stylus pen can then count the number of sensing signals of different amplitude created by the column driver 113 that are sent before the column nearest the pen tip 106a is reached. The stylus pen 106 can then determine which column number it is nearest to in the user interface, and thus its relative x-coordinate position. The x-coordinate position can then be transmitted to the host device via the communication link 205, so that this information can be used by and/or compared with the touch sensing coordinate information received from the host device to help more easily determine which touch data points are related to the stylus pen 106. Knowledge of at least one of the coordinates of a stylus pen 106 interaction with the user interface 104 can help reduce the misidentification error rate and help with palm and finger detection using the techniques described above.
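A rough sketch of the column-counting approach described above follows. The per-column magnitude list, the threshold and the column-pitch parameter are assumptions used only to make the example self-contained; they are not part of the disclosed embodiment.

# Illustrative sketch: estimate the x-coordinate by counting driven columns
# from the start of a scan until the nearest-column magnitude is seen.
def estimate_x_coordinate(magnitudes_in_scan, threshold, column_pitch_mm):
    """magnitudes_in_scan: per-column detected magnitudes for one scan cycle,
    beginning immediately after the start-of-scan event (e.g., the
    all-columns drive at the end of the previous cycle)."""
    for column_index, magnitude in enumerate(magnitudes_in_scan):
        if magnitude >= threshold:                # nearest column reached
            return column_index * column_pitch_mm
    return None                                   # nearest column not seen this cycle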
FIG. 9C illustrates the components of an active stylus pen 106 capable of interacting with a host device 102 that is configured for mutual capacitance sensing, according to an embodiment of the invention. The active stylus pen 106 may couple to the host device 102 through pen tip 106a, as discussed above. The active stylus pen 106 is further configured with an active stylus control element 910, which comprises a low-noise amplifier (LNA) 931, a phase discriminator 932, a peak detector 933, a timing state machine (TSM) 934, a waveform generator (WG) 935, a power amplifier (PA) 936, and a clock source 937. The LNA 931 generally provides linear signal amplification, and in one or more configurations, LNA 931 may operate across the 10 kilohertz (kHz) to 1 megahertz (MHz) frequency range and may have an input impedance greater than 1 megaohm (MΩ). The phase discriminator 932 is generally a zero-crossing detector, which generates a pulse having a width of one cycle of clock source 937 upon detecting a transition of potential at pen tip 106a. The peak detector 933 is generally comprised of rectifier, integrator, and high pass filter components. The TSM 934 is comprised of a state machine that controls mode selection, a phase and frequency estimator, a calibration state machine, and a timing sequencer, through use of the processor 106c, clock 106g and memory unit 106e. Output generated by TSM 934 provides control to the WG 935, which may generate an appropriate sequence of square pulses having a particular frequency, amplitude and duty cycle that are specified by TSM 934. The PA 936 drives the pen tip 106a so that a desired signal can be detected by the host device 102, and is capable of tri-state operation based on control signals received from TSM 934 and WG 935. In one example, the tri-state operation, which may be controlled by the TSM 934, may include the delivery of a high voltage signal (VH) (e.g., positive voltage signal) and a low voltage signal (VL) (e.g., negative voltage signal) to provide a desired signal from the pen tip 106a that can be sensed (e.g., VH or VL) at desired times by any type of host device 102 using any type of sensing technique. The PA 936 may also deliver no signal at all to pen tip 106a, such as during idle periods or while the PA 936 is in a high-impedance mode (e.g., when active stylus control element 910 is synchronizing to a detected signal 901). The clock source 937 may be a crystal oscillator or a comparably precise source, and is typically the same clock as clock 106g discussed above. The clock source 937 is generally required to be as precise as the clock source that drives the user interface 104. The host device 102 generally includes a user interface 104, a driver assembly 113 and a sensing assembly 117. Touch sensing areas, or touch pixels, may be formed at the overlapping regions of the one or more drive electrodes 114 and one or more sense electrodes 116. As shown, pen tip 106a is located within an electric field E of the mutual capacitance created by the drive electrode 114 and sense electrode 116. In this configuration, the pen tip 106a is coupled to the user interface 104, and the signals generated within the active stylus control element 910 and transmitted to the pen tip 106a may alter the electric field E, which in turn may change the sensed capacitance at sensing assembly 117 to a desired level at a desired time.
According to an embodiment of the invention, active stylus control element 910 may generally operate in a synchronization mode and/or in a transmit mode. The active stylus control element 910 may operate by default in synchronization mode, essentially listening for signal activity of the touch sensing component in the host device 102 in this mode, then may transition to transmit mode based on received signal activity. To operate in synchronization mode, the TSM 934 may transmit an output to the enable (ENB) input of PA 936, which causes the PA 936 to operate in a high impedance mode and deliver the signal to the pen tip 106a at a desired time to coincide with the capacitive sensing signal delivered by the host device 102. The high impedance at PA 936 relative to LNA 931 causes most of the detected signal at pen tip 106a to be transmitted to the LNA 931. The TSM 934 also may transmit an output to the WG 935 to disable the WG 935, which may be advantageously used to conserve power in the active stylus pen 106. In some configurations, the pen tip 106a, when coupled to a host device 102, may detect a signal from the host device 102, by monitoring the signal received by the LNA 931 as PA 936 is operating in high impedance mode. After being amplified at LNA 931, the detected signal is provided to both the phase discriminator 932 and the peak detector 933. The respective outputs from the phase discriminator 932 and peak detector 933 are then transmitted to TSM 934, which uses the estimated phase and frequency to control the output of the WG 935.
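Under a simple sampled-signal model, the phase and frequency estimation performed while listening in synchronization mode might look like the following sketch. The rising-zero-crossing method and the sampling assumptions are illustrative only and do not describe the actual circuitry of the phase discriminator 932 or TSM 934.

# Hypothetical sketch: estimate frequency from rising zero-crossing spacing
# and phase from the time of the first crossing.
def estimate_frequency_and_phase(samples, sample_rate_hz):
    crossings = [
        i for i in range(1, len(samples))
        if samples[i - 1] < 0.0 <= samples[i]      # rising zero crossing
    ]
    if len(crossings) < 2:
        return None, None                          # not enough signal to estimate
    spans = [b - a for a, b in zip(crossings, crossings[1:])]
    period_s = (sum(spans) / len(spans)) / sample_rate_hz
    frequency_hz = 1.0 / period_s
    phase_s = crossings[0] / sample_rate_hz        # offset of the first crossing
    return frequency_hz, phase_s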
Upon determining the estimated phase and frequency of the signal received from the host device 102, the TSM 934 may cause the active stylus control element 910 to operate in transmit mode by enabling the PA 936 and causing the WG 935 to begin generating an output signal according to the phase, amplitude and frequency information provided by the TSM 934. The output signal generated by the WG 935 may next be amplified by the PA 936. In one or more embodiments, LNA 931 may have a relatively large input impedance compared to the pen tip 106a, so that the amplified signal will be transmitted to the pen tip 106a, in order to affect the sensed capacitance due to the capacitive coupling of the pen tip 106a to the touch sensing components in the user interface 104.
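As an illustration of transmit-mode waveform generation, the sketch below produces a square-wave sample stream aligned to an estimated frequency and phase. The sample-based model and the amplitude value are assumptions for the example only and do not describe the actual WG 935 hardware.

# Illustrative sketch: square-wave output aligned to a detected signal whose
# first rising zero crossing occurred at phase_s.
import math

def generate_square_wave(frequency_hz, phase_s, duration_s,
                         sample_rate_hz=1_000_000, amplitude=1.0):
    n = int(duration_s * sample_rate_hz)
    samples = []
    for i in range(n):
        t = i / sample_rate_hz
        level = math.sin(2.0 * math.pi * frequency_hz * (t - phase_s))
        samples.append(amplitude if level >= 0.0 else -amplitude)
    return samples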
In one embodiment, the touch signal generating device 106h includes signal control electronics 106i, a conductive coating 222 formed on a surface of the stylus pen 106, which the user is in contact with when they are holding the stylus pen 106, and the mechanical sensor/switch 221 (e.g., simple mechanical switch). In one embodiment, the signal control electronics 106i generally includes a signal generating device and other supporting components that are able to inject a current through the pen tip 106a to the capacitive sensing elements in the user interface 104 at an interval that is synchronized with the capacitive sensing signals delivered between the capacitive sensing elements in the user interface 104. The signal control electronics 106i is also adapted to detect the capacitive sensing signal(s) delivered between the transmitter and receiver electrodes in the touch sensing unit 212 at any instant in time, and includes a phase shifting device (not shown) that is able to synchronize the timing of the injection of current through the pen tip 106a with the delivery of the capacitive sensing signal(s) delivered between the transmitter and receiver electrodes. The mechanical sensor/switch 221, when activated, electrically couples the conductive coating 222, signal control electronics 106i and other useful electrical components in the stylus pen 106 to the pen tip 106a to create a virtual capacitance signal that is delivered between the pen tip 106a and the capacitive sensing elements in the user interface 104. The virtual capacitance created by the activation of the mechanical sensor/switch 221 can at least be intermittently formed between the pen tip 106a and a portion of the user interface 104, so that a desirable touch signal is received by the user interface 104 with or without the physical act of touching the pen tip 106a to the user interface.
In one embodiment, the initial activation of the mechanical sensor/switch 221 causes a specific set of sensing signal pulses, or signature pulses, that will allow the one or more input discrimination techniques 345 used by the controlling engine 340 to more easily characterize the touch data input created by the activation of the touch signal generating device 106h as an input from the stylus pen 106. One will note that the capacitive sensing elements in the user interface 104 of the host device 102 are sampled at a set frequency (e.g., sampled every 16 ms). Therefore, the set of sensing signal pulses created by portions of the stylus pen 106 (e.g., touch signal generating device 106h, processor 106c and memory 106e) when the touch signal generating device 106h is activated may require two or more sensing signal pulses that each have a distinguishing preset length and/or a fixed time between them that is equal to or greater than the sampling period of the device, so that the signature of the activation of the touch signal generating device 106h can be more easily determined by the user input discriminating techniques performed by the controlling engine 340 that is running on the host device 102. Thus, the touch signal generating device 106h is useful, since it allows the user to initiate the interaction of the stylus pen 106 with the user interface 104, rather than wait for the sensed contact of the pen tip 106a and the user interface 104 to be characterized by the controlling engine 340 as an input received from a stylus pen.
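A simple way to picture the timing constraint described above is to schedule the signature pulses so that each pulse and each off-period spans at least one host sampling period. The 16 ms value echoes the example above, while the function name, parameters and error handling are assumptions for illustration only.

# Hypothetical sketch: schedule a signature pulse train that the host's
# sampling process can resolve.
HOST_SAMPLING_PERIOD_S = 0.016

def build_signature_schedule(pulse_count, pulse_s, gap_s,
                             sampling_period_s=HOST_SAMPLING_PERIOD_S):
    """Return a list of (start_time_s, duration_s) tuples for the pulses."""
    if pulse_s < sampling_period_s or gap_s < sampling_period_s:
        raise ValueError("pulse and gap must each cover a full sampling period")
    schedule, t = [], 0.0
    for _ in range(pulse_count):
        schedule.append((t, pulse_s))
        t += pulse_s + gap_s
    return schedule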
Use of the touch signal generating device 106h can also allow two or more pens 106 to be used with a host device 102, since each stylus pen can provide a different initial signature pulse configuration that allows the controlling engine 340 to determine which of the pens 106 is being used at any instant in time. FIG. 10 illustrates two sets of signature pulses 1001 and 1002 that each may be delivered from two different pens 106, so that the controlling engine 340 can more easily associate the user input created by each stylus pen 106 with that particular stylus pen. The signature pulses 1001 and 1002 may be generated at the start of the interaction of the stylus pen with the user interface 104 to let the controlling engine know that the subsequent touch interactions associated with that initiating touch event will be made by a particular stylus pen. As illustrated in FIG. 10, the signature pulse 1001 may comprise two pulses 1005 and 1006 that each have a desired duration 1021 and 1023, respectively, and an off-period 1007 that has a duration 1022. Also, the signature pulse 1002 may comprise two pulses 1010 and 1011 that each have a desired duration 1041 and 1043, respectively, and an off-period 1012 that has a duration 1042. Therefore, due to at least one difference between signature pulses 1001 and 1002, such as the number of pulses (e.g., 2, 4 or 8 pulses), pulse shape (e.g., square-wave shape, sinusoidal wave shape), pulse duration or off-period between pulses, the controlling engine 340 will be able to more easily determine that a particular input is received from one stylus pen versus another. Moreover, a signature pulse 1001 or 1002 can also be used to determine that an interaction sensed by the user interface 104 is related to a stylus pen and not a finger or user's appendage.
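On the host side, distinguishing two pens from their signature pulses could reduce to matching the observed pulse and off-period durations against stored signatures. The signature values, tolerance and function name below are hypothetical and only illustrate the matching idea, not the controlling engine 340 itself.

# Illustrative sketch: match observed signature durations to a known pen.
PEN_SIGNATURES = {
    "pen_1": [0.020, 0.018, 0.020],   # pulse, off-period, pulse (seconds)
    "pen_2": [0.032, 0.018, 0.032],
}

def identify_pen(observed_durations, tolerance_s=0.004):
    """Return the pen whose stored signature matches the observed durations,
    or None if nothing matches (e.g., the contact was a finger)."""
    for pen_id, signature in PEN_SIGNATURES.items():
        if len(signature) == len(observed_durations) and all(
            abs(o - s) <= tolerance_s for o, s in zip(observed_durations, signature)
        ):
            return pen_id
    return None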
While the techniques disclosed herein primarily discuss a process of determining the type of user input to create output data that is used within a host device on which the controlling engine is running, this configuration is not intended to be limiting as to the scope of the invention described herein, since the output data can also be delivered to or shared with other peripheral devices without deviating from the basic scope of the invention described herein.
The present invention can be implemented in the form of control logic in software or hardware or a combination of both. The control logic may be stored in an information storage medium as a plurality of instructions adapted to direct an information-processing device to perform a set of steps disclosed in embodiments of the present invention. Based on the disclosure and teaching provided herein, a person of ordinary skill in the art will appreciate other ways and/or methods to implement the present invention.
In embodiments, any of the entities described herein may be embodied by a computer that performs any or all of the functions and steps disclosed.
It should be noted that any recitation of “a”, “an” or “the” is intended to mean “one or more” unless specifically indicated to the contrary.
It is also understood that the examples and embodiments described herein are for illustrative purposes only and that various modifications or changes in light thereof will be suggested to persons skilled in the art and are to be included within the spirit and purview of this application and scope of the appended claims. Therefore, the above description should not be understood as limiting the scope of the invention as defined by the claims.