This application is a continuation of U.S. application Ser. No. 13/441,489, filed Apr. 6, 2012, the entire contents of which are hereby incorporated herein by reference.
TECHNICAL FIELD
The disclosure relates to graphical keyboards provided by computing devices.
BACKGROUND
A user may interact with applications that are executing on a computing device (e.g., a mobile phone, tablet computer, smart phone, desktop computer, or the like). In some examples, a computing device may include a touch-sensitive display that may enable a user to interact with the computing device. For instance, an application executing on a computing device may cause a touch-sensitive display to display a graphical keyboard that may enable a user to register key presses by touching certain areas of the graphical keyboard.
Individual users of graphical keyboards may have varying typing styles. The graphical keyboard may be smaller in size than a physical keyboard to which a user may be accustomed. In some cases, an individual user may make generally consistent errors when typing on a graphical keyboard. For example, the actual key on the graphical keyboard that is touched by the user may be different from an intended target key.
SUMMARY
In one aspect, a method includes outputting, at an input-sensitive display of a computing device, a first graphical keyboard arrangement including a first representation of a key that is associated with a target region of the input-sensitive display, and receiving a plurality of user inputs at the input-sensitive display, each user input from the plurality of user inputs being associated with a respective touch region of the input-sensitive display. The method also includes, responsive to determining that each input from the plurality of user inputs is associated with the first representation of the key, determining whether one or more of the associated touch regions is not substantially aligned with the target region associated with the first representation of the key. The method also includes identifying a quantity of the touch regions that are not substantially aligned with the target region, and, subsequent to determining that the quantity exceeds a threshold quantity of touch regions that are not substantially aligned with the target region, outputting, at the input-sensitive display, a second graphical keyboard arrangement that includes a second representation of the key, wherein at least one attribute of the second representation of the key is graphically modified relative to the first representation of the key so as to substantially align one or more of the touch regions with a target region associated with the second representation of the key.
In another aspect, a system includes at least one processor, a keyboard application operable by the at least one processor to generate graphical keyboard arrangements, and an input-sensitive display that outputs a first graphical keyboard arrangement including a first representation of a key that is associated with a target region of the input-sensitive display. The input-sensitive display is configured to receive a plurality of user inputs each associated with a respective touch region of the input-sensitive display, wherein the plurality of user inputs are received during use by a user of an application executing on the system other than the keyboard application, and the keyboard application, responsive to determining that each input from the plurality of user inputs is associated with the first representation of the key, determines whether one or more of the associated touch regions is not substantially aligned with the target region associated with the first representation of the key. The input-sensitive display outputs a second graphical keyboard arrangement that includes a second representation of the key, wherein at least one attribute of the second representation of the key is graphically modified relative to the first representation of the key so as to substantially align one or more of the touch regions with a target region associated with the second representation of the key.
In another aspect, a computer-readable storage medium comprising instructions that, if executed by one or more processors of a computing system, cause the computing system to perform operations comprising outputting a first graphical keyboard arrangement for display, the first graphical keyboard arrangement including a first representation of a key that is associated with a target region of an input-sensitive display, receiving data indicative of a plurality of user inputs, each user input from the plurality of user inputs being associated with a respective touch region of the input-sensitive display, and responsive to determining that each input from the plurality of user inputs is associated with the first representation of the key, determining whether one or more of the associated touch regions is not substantially aligned with the target region associated with the first representation of the key. The operations also include, subsequent to determining that one or more of the touch regions is not substantially aligned with the target region associated with the first representation of the key, outputting for display a second graphical keyboard arrangement that includes a second representation of the key, wherein at least one attribute of the second representation of the key is graphically modified relative to the first representation of the key so as to substantially align one or more of the touch regions with a target region associated with the second representation of the key.
The details of one or more examples are set forth in the accompanying drawings and the description below. Other features, objects, and advantages will be apparent from the description and drawings, and from the claims.
BRIEF DESCRIPTION OF DRAWINGS
FIG. 1 is a conceptual diagram illustrating an example of a computing device that is configured to execute a keyboard application.
FIG. 2 is a block diagram illustrating further details of one example of the computing device shown in FIG. 1.
FIG. 3 is a conceptual diagram illustrating an example of a keyboard application.
FIGS. 4A-4D are block diagrams illustrating example portions of a graphical keyboard.
FIG. 5 is a conceptual diagram illustrating an example distribution of user inputs associated with a representation of a key on a graphical keyboard.
FIG. 6 is a flow diagram illustrating an example operation of a computing device that is configured to execute a keyboard application.
DETAILED DESCRIPTION
In general, the disclosure is directed to customizing attributes of a graphical keyboard on a computing device. For example, a computing device can execute or otherwise implement a keyboard application that automatically customizes an arrangement of a graphical keyboard based on data collected from a user's use of the graphical keyboard, and presents the customized graphical keyboard to the user. The graphical keyboard may, for example, be presented on an input-sensitive display of the computing device.
In an example aspect, the keyboard application may present a training program to the user for customizing the graphical keyboard. For example, the training program may be presented to the user upon initial use of a default graphical keyboard of the computing device. According to one example aspect, the training program may present a sample training text to the user by a display, and the user may be instructed to type the training text using the default graphical keyboard. The keyboard application may record instances of mistyping by the user that occur when the user types the training text. The keyboard application may be configured to disregard other types of mistakes, such as misspelled words or displacement of fingers on the keys, which do not result from misdirected key presses.
In another example aspect, in addition to or instead of the training program mode, the keyboard application may operate in a continuous keyboard learning mode. In the continuous keyboard learning mode, the keyboard application executes in the background while the user is using the computing device, and gathers data based on the user's typing associated with other applications of the computing device.
In some aspects, for each key represented on the graphical keyboard, the keyboard application may log the exact locations that the user touches on the input-sensitive surface, and over time develop a distribution of the locations of touches within an area associated with each key. In some examples, the keyboard application automatically customizes the graphical keyboard based on the data collected from the user's use of the graphical keyboard, and presents the customized graphical keyboard to the user. The keyboard application may compare the touch regions to a target region, and modify the graphical keyboard (or propose modifications to the user) when, for example, a position associated with a maximum quantity of touches is located outside of an inner touch region for the key, and the quantity of touches exceeds a threshold value. The keyboard application may, for example, modify the shape, size, and relative position of keys on the graphical keyboard.
Customizing the graphical keyboard to the typing habits of an individual user in this manner may help to improve the quality of the user's experience when using the graphical keyboard, such as by reducing an amount of errors and corrections made by the user. The keyboard application may associate the customized graphical keyboard layout with a user login, allowing multiple users of the computing device to each have a different customized graphical keyboard.
FIG. 1 is a conceptual diagram illustrating an example of a computing device that is configured to execute a keyboard application. As illustrated in FIG. 1, computing device 2 can include input device 4 and output device 6. Computing device 2 may be configured to execute keyboard application 8, which may cause output device 6 to display graphical keyboard 10. Keyboard application 8 may be operable by at least one processor of a computing system including computing device 2 to generate graphical keyboard arrangements. Examples of computing device 2 can include, but are not limited to, portable or mobile devices such as cellular phones, tablet computers, personal digital assistants (PDAs), portable gaming devices, portable media players, and e-book readers, as well as non-portable devices such as desktop computers.
Input device 4, in some examples, is configured to receive input from a user through tactile, audio, or video feedback. Examples of input device 4 can include an input-sensitive display, such as a touch-sensitive and/or a presence-sensitive screen, mouse, keyboard, voice responsive system, or any other type of device for detecting a command from a user. In some examples, input device 4 can include a touch-sensitive display, mouse, keyboard, microphone, or video camera.
Output device 6, in certain examples, may be configured to provide output to a user using tactile, audio, or video stimuli. Output device 6, in one example, includes an input-sensitive display (e.g., touch-sensitive display or presence-sensitive display), a sound card, a video graphics adapter card, or any other type of device for converting a signal into an appropriate form understandable to humans or machines. Additional examples of output device 6 can include a speaker, a cathode ray tube (CRT) monitor, a liquid crystal display (LCD), an organic light emitting diode (OLED), or any other type of device that can generate intelligible output to a user. Output device 6 may present the content of computing device 2 to a user. For example, output device 6 may present a web browser, or other output that may need to be presented to a user. In some examples, output device 6 may be a touch screen that can allow a user to provide one or more user inputs to interact with computing device 2.
Keyboard application 8, executing on computing device 2, may provide one or more signals to cause output device 6 (e.g., a touch-sensitive display) to display graphical keyboard 10. In some examples, a user may provide a user input to cause computing device 2 to select one or more character keys of graphical keyboard 10 by touching the area of output device 6 that displays the character key of graphical keyboard 10. For instance, a user may perform a tap gesture at a displayed character key of graphical keyboard 10, such as character key 9. The tap gesture can include touching a displayed character key and releasing the character key.
In some examples, as when output device 6 includes a presence-sensitive display, touching output device 6 may be accomplished by bringing an input device such as a finger, a stylus, a pen, and the like, within proximity of output device 6 that is sufficiently close to enable output device 6 to detect the presence of the input device. As such, touching a displayed character key of graphical keyboard 10 may, in some examples, not include actual physical contact between an input device and graphical keyboard 10. Similarly, in certain examples, as when output device 6 includes a presence-sensitive display, releasing a displayed character key of graphical keyboard 10 may be accomplished by removing the input device from the detectable range of output device 6.
In an example aspect, keyboard application 8 may present a training program to the user for customizing graphical keyboard 10. For example, the training program may be presented to the user upon initial use of a default graphical keyboard of the computing device 2. According to one example aspect, the training program may present a sample training text to the user by a display (e.g., presented by output device 6), and the user may be prompted to type a predefined series of characters using the default graphical keyboard arrangement. The sample training text presented by the training program of keyboard application 8 can include a variety of characters, and may be selected such that each character appears multiple times and such that the order of the characters varies.
Keyboard application 8 may record instances of mistyping by the user that occur when the user types the training text. For example, keyboard application 8 may register key presses on the touch-sensitive display, and determine when a touch region associated with the key presses is not substantially aligned with a target region of the touch-sensitive display associated with a target key. Keyboard application 8 may determine what target region for a key the user input is associated with based on comparison of an order of the user inputs with an order of the predefined series of characters presented by the training program. Keyboard application 8 may determine whether certain mistakes are due to the user missing an intended target key and pressing a nearby area of graphical keyboard 10 instead. Keyboard application 8 may be configured to disregard other types of mistakes, such as misspelled words, that do not result from misdirected key presses.
In some aspects, for each key represented on graphical keyboard 10, keyboard application 8 may record the exact locations that the user touches on the touch-sensitive surface, and over time develop a distribution of the locations of touches within an area associated with each key. In one example aspect, when a maximum point of the location distribution is located outside of a particular inner region associated with a key, and the value of the maximum point is greater than a threshold, keyboard application 8 may propose to move a boundary of the key to better align with a location of the maximum point, i.e., where the user actually presses the key. For example, keyboard application 8 may move the key boundary for a given key when the maximum point of the location distribution exceeds the threshold and is located at least a certain distance away from a current center area associated with the key. Keyboard application 8 may, for example, modify attributes of graphical keyboard 10, such as the shape, size, and relative position of keys on graphical keyboard 10.
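As a rough illustration of the adjustment logic described above, the following sketch (in Python, using hypothetical names such as TouchDistribution and propose_key_shift that are not part of the disclosure) accumulates touch counts per position for a single key and proposes shifting the key boundary toward the most-touched position when that position lies outside an inner region and its count exceeds a threshold.

```python
from collections import Counter

class TouchDistribution:
    """Accumulates touch counts per (x, y) position for one key (hypothetical helper)."""

    def __init__(self, key_center, inner_radius, count_threshold):
        self.counts = Counter()          # (x, y) -> cumulative number of touches
        self.key_center = key_center     # current center of the key's target region
        self.inner_radius = inner_radius # radius of the inner region around the center
        self.count_threshold = count_threshold

    def record_touch(self, x, y):
        self.counts[(x, y)] += 1

    def propose_key_shift(self):
        """Return a (dx, dy) offset for the key boundary, or None if no change is warranted."""
        if not self.counts:
            return None
        (mx, my), peak = self.counts.most_common(1)[0]   # maximum point of the distribution
        cx, cy = self.key_center
        dist = ((mx - cx) ** 2 + (my - cy) ** 2) ** 0.5
        # Only propose a change when the peak is both frequent enough and far
        # enough from the current center area of the key.
        if peak > self.count_threshold and dist > self.inner_radius:
            return (mx - cx, my - cy)
        return None

# Example: most touches land to the right of the key center.
dist = TouchDistribution(key_center=(50, 20), inner_radius=4, count_threshold=10)
for _ in range(15):
    dist.record_touch(57, 21)
print(dist.propose_key_shift())   # -> (7, 1), suggesting the key move toward the touches
```

The threshold and inner-radius values here are arbitrary placeholders; in the disclosure they would correspond to configurable settings such as thresholds 34 and the inner target region of a key.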
In another example aspect, in addition to or instead of the training program mode, keyboard application 8 may operate in a continuous keyboard learning mode. In the continuous keyboard learning mode, keyboard application 8 executes in the background while the user is using computing device 2, and gathers data based on the user's typing associated with other applications (not shown) of computing device 2. Keyboard application 8 may gather key touch location distribution data, as described above. As another example, keyboard application 8 may learn where the user makes typing mistakes, such as based on instances in which the user goes back and corrects their typing.
Keyboard application 8 may, for example, occasionally present a display to the user that shows a new proposed graphical keyboard layout, allowing the user to elect to use the new graphical keyboard layout or stay with the current layout. Keyboard application 8 can output a previous graphical keyboard arrangement and the new graphical keyboard arrangement at the same time, so that the user can see the proposed changes and select the desired arrangement. Keyboard application 8 can solicit a user selection of one of the first graphical keyboard arrangement and the second graphical keyboard arrangement for future use. The new graphical keyboard arrangement may have a different overall size and shape than the first graphical keyboard arrangement, where the overall size and shape of the second graphical keyboard arrangement is selected by keyboard application 8 so as to substantially align one or more of the touch regions with a target region associated with a second representation of a key. Keyboard application 8 may also provide the user the option to accept, reject, or modify the key layout changes on a key-by-key basis, such as by soliciting a user selection of one or more modifications of attributes of one or more representations of keys of the first graphical keyboard arrangement for future use. For example, the user can be given options to modify the proposed layout by moving a key or by resizing the keyboard. In some examples, keyboard application 8 may automatically change the layout of graphical keyboard 10 without requesting user approval. In this manner, techniques of this disclosure may enable the computing device to provide the user with a customized graphical keyboard that is tailored to the particular typing style of the user.
Keyboard application 8 may associate the customized graphical keyboard layout with a user login, allowing multiple users of the computing device to each use a different customized graphical keyboard. The user may be able to turn off the continuous keyboard learning mode, such as by selecting an option indicated on the display to cease execution of the keyboard customization application as a background task. Keyboard application 8 may be, for example, a downloadable or pre-installed application executing on computing device 2. In another example, keyboard application 8 may be part of a hardware unit of computing device 2.
FIG. 2 is a block diagram illustrating further details of one example of the computing device shown in FIG. 1. FIG. 2 illustrates only one particular example of computing device 2, and many other example embodiments of computing device 2 may be used in other instances. As shown in the specific example of FIG. 2, computing device 2 includes input device 4, output device 6, one or more applications 19, one or more processors 20, one or more storage devices 26, and network interface 24. Computing device 2 also includes operating system 16, which may include modules that are executable by computing device 2. Computing device 2, in one example, further includes keyboard application 8 that is also executable by computing device 2. Keyboard application 8 includes gesture determination module 12, training module 14, and keyboard customization module 18. Each of components 4, 6, 8, 12, 14, 18, 20, 24, and 26 may be interconnected (physically, communicatively, and/or operatively) for inter-component communications. In some examples, communication channels 22 may include a system bus, network connection, interprocess communication data structure, or any other channel for communicating data. As one example in FIG. 2, components 4, 6, 20, 24 and 26 may be coupled by one or more communication channels 22.
Computing device 2 can include additional components that, for clarity, are not shown in FIG. 2. For example, computing device 2 can include a battery to provide power to the components of computing device 2. Similarly, the components of computing device 2 shown in FIG. 2 may not be necessary in every example of computing device 2. For instance, computing device 2 may not, in all examples, include network interface 24.
Although shown as separate components in FIG. 2, in some examples, one or more of keyboard application 8, gesture determination module 12, training module 14, and keyboard customization module 18 may be part of the same module. In some examples, one or more of keyboard application 8, gesture determination module 12, training module 14, and keyboard customization module 18, and one or more processors 20 may be formed in a common hardware unit. In certain examples, one or more of keyboard application 8, gesture determination module 12, training module 14, and keyboard customization module 18 may be software and/or firmware units that are executed on or operable by one or more processors 20.
One or more processors 20 may include, in certain examples, any one or more of a microprocessor, a controller, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or equivalent discrete or integrated logic circuitry. One or more processors 20 may be configured to implement functionality and/or process instructions for execution within computing device 2. For example, one or more processors 20 may be capable of processing instructions stored in one or more storage devices 26.
One or more storage devices 26, in one example, are configured to store information within computing device 2 during operation. Storage device 26, in some examples, is described as a computer-readable storage medium. In some examples, storage device 26 is a temporary memory, meaning that a primary purpose of storage device 26 is not long-term storage. Storage device 26, in some examples, is described as a volatile memory, meaning that storage device 26 does not maintain stored contents when the computer is turned off. Examples of volatile memories include random access memories (RAM), dynamic random access memories (DRAM), static random access memories (SRAM), and other forms of volatile memories known in the art. In some examples, storage device 26 is used to store program instructions for execution by one or more processors 20. Storage device 26, in one example, is used by software or applications running on computing device 2 (e.g., keyboard application 8) to temporarily store information during program execution.
One or more storage devices 26, in some examples, also include one or more computer-readable storage media. One or more storage devices 26 may be configured to store larger amounts of information than volatile memory. One or more storage devices 26 may further be configured for long-term storage of information. In some examples, one or more storage devices 26 include non-volatile storage elements. Examples of such non-volatile storage elements include magnetic hard discs, optical discs, floppy discs, flash memories, or forms of electrically programmable memories (EPROM) or electrically erasable and programmable (EEPROM) memories.
As shown in FIG. 2, storage devices 26 include user settings 28, training programs 30, key regions 32, thresholds 34, and touch data 36. User settings 28, training programs 30, key regions 32, thresholds 34, and touch data 36 may each be configured as a database, flat file, table, tree, or other data structure stored within storage devices 26 of computing device 2. In some examples, user settings 28, training programs 30, key regions 32, thresholds 34, and touch data 36 may be configured as separate data repositories while, in other examples, they may be a part of a single data repository.
In the example of FIG. 2, computing device 2 includes network interface 24. Computing device 2, in one example, uses network interface 24 to communicate with external devices via one or more networks, such as one or more wireless networks. Network interface 24 may be a network interface card, such as an Ethernet card, an optical transceiver, a radio frequency transceiver, or any other type of device that can send and receive information. Other examples of such network interfaces may include Bluetooth, 3G and WiFi radios in mobile computing devices as well as USB. In some examples, computing device 2 uses network interface 24 to wirelessly communicate with an external device (not shown) such as a server, mobile phone, or other networked computing device.
Computing device 2 may include operating system 16. Operating system 16, in some examples, controls the operation of components of computing device 2. For example, operating system 16, in one example, facilitates the interaction of keyboard application 8 with processors 20, network interface 24, storage device 26, input device 4, and output device 6.
Computing device 2 includes keyboard application 8, executable by computing device 2, such as by one or more processors 20. As shown in FIG. 2, keyboard application 8 may include gesture determination module 12, training module 14, and keyboard customization module 18. Applications 19, which include keyboard application 8, may each include program instructions and/or data that are executable by computing device 2. For example, gesture determination module 12, training module 14, and keyboard customization module 18 may include instructions that cause keyboard application 8 executing on computing device 2 to perform one or more of the operations and actions described in the present disclosure. Gesture determination module 12 may receive one or more inputs, such as from input device 4 or output device 6 (e.g., a touch-sensitive display), and may determine that the one or more inputs comprise a gesture. Examples of gestures can include, but are not limited to, tap gestures, sliding gestures, circular gestures, and the like.
As one example, keyboard application 8, executing on one or more processors 20, may cause a touch-sensitive display of computing device 2, such as output device 6, to display a graphical keyboard. Gesture determination module 12 may receive an input from output device 6 indicating that a displayed character of the graphical keyboard has been touched, such as by a finger, stylus, pen, or the like. Gesture determination module 12 may determine that a tap gesture has been performed when the selected character has been released. As another example, after receiving an input from output device 6 indicating that a displayed character has been touched, gesture determination module 12 may determine that a sliding gesture has been performed when the selected character is released by sliding the input device off the selected character while maintaining contact with output device 6.
In some example aspects, computing device 2 may operate according to a training program mode. In the training program mode, training module 14 may execute a training program of training programs 30 to provide a block of training text to output device 6 for display to the user. Training module 14 may, for example, run a training program 30 upon initial use of the graphical keyboard by a user of computing device 2. Training programs 30 can include a variety of sample training texts. The training texts may provide a series of characters, including letters, numbers, and other symbols that correspond to characters on graphical keyboard 10. The training texts may be selected to ensure that each character occurs at least once, or multiple times. The training texts may be selected to include common words, common character combinations, and/or a variety of different character combinations, for example.
Gesture determination module 12 registers key presses on the touch-sensitive display by the user, and may determine a touch region associated with each key press. Gesture determination module 12 may determine a touch region associated with a portion of a touch-sensitive display (e.g., output device 6) that is in contact with an input unit, such as a finger, stylus, or other input unit. In some examples, output device 6 may indicate a radius of a contact area between the input unit and output device 6. For instance, the contact area may be an area of the touch-sensitive display where a detected capacitance of the touch-sensitive display changes responsive to a surface area of the input unit (e.g., a finger). In such examples, gesture determination module 12 may determine the touch region of the portion of output device 6 that is in contact with the input unit using the radius indicated by output device 6. In certain examples, output device 6 may indicate a number of pixels or other units of known area of output device 6 that are in contact with the input unit. Gesture determination module 12 may determine a center of the portion of output device 6 that is in contact with the input unit, such as by extrapolating based on the number of units of known area.
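One way to picture the touch-region estimate described above is the following minimal sketch, which assumes the display reports the set of contact pixels; the helper names (TouchRegion, touch_region_from_pixels) are hypothetical and are not part of the disclosure. It takes the centroid of the contact pixels as the touch-region center and derives a radius from the contact area, assuming a roughly circular contact patch.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class TouchRegion:
    """Hypothetical representation of a touch region: a center point and a radius."""
    center: Tuple[float, float]
    radius: float

def touch_region_from_pixels(contact_pixels: List[Tuple[int, int]]) -> TouchRegion:
    """Estimate a touch region from the display pixels in contact with the input unit."""
    n = len(contact_pixels)
    cx = sum(x for x, _ in contact_pixels) / n          # centroid x
    cy = sum(y for _, y in contact_pixels) / n          # centroid y
    radius = (n / 3.14159) ** 0.5                       # pi * r^2 = number of contact pixels
    return TouchRegion(center=(cx, cy), radius=radius)

# Example: a small square patch of contact pixels around (10, 20).
pixels = [(x, y) for x in range(8, 13) for y in range(18, 23)]
print(touch_region_from_pixels(pixels))
```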
In some examples, gesture determination module 12 may indicate that multiple gestures are being performed at once. For instance, a user may provide user inputs that include touching and releasing multiple displayed characters at the same time. In such examples, gesture determination module 12 may track the multiple gestures individually, and keyboard customization module 18 may make a determination for each individual gesture.
Keyboard customization module 18 determines that each user input is associated with a key, i.e., that the user input is intended for contact with a representation of the key on the graphical keyboard. As described in further detail below, keyboard customization module 18 may compare the touch regions with expected target regions associated with the representation of the key on the display, to determine whether a given touch region and the associated target region are substantially aligned.
In the example of a training program mode, keyboard customization module 18 obtains the expected target regions from training module 14 and key regions 32. Key regions 32 may store data specifying boundaries of the target regions on the graphical keyboard. Keyboard customization module 18 can determine which key the user was supposed to press, and obtains a key boundary of this key from key regions 32. For example, an expected order of key characters of the training text may be known by keyboard customization module 18 from training module 14, and keyboard customization module 18 can compare this with an order and identity of user inputs registered by gesture determination module 12. Keyboard customization module 18 is configured to identify extraneous typing mistakes that are unrelated to misdirected key presses, such as spelling errors or displaced fingers. For example, keyboard customization module 18 may note that a misdirected key press may be located very close to an intended key, while a key press associated with a spelling error may be located farther from an intended key. As another example, keyboard customization module 18 may recognize and correct for errors due to displacement of the user's fingers on the graphical keyboard, such as may occur when the user's fingers have drifted slightly from the original positioning without the user's knowledge. Keyboard customization module 18 may ignore the extraneous typing mistakes or finger displacement when comparing touch regions with expected target regions of the graphical keyboard.
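A minimal sketch of the training-mode comparison described above follows, assuming hypothetical distance thresholds for distinguishing a misdirected key press from an extraneous mistake; the function name and parameters are illustrative only. It pairs each touch, in order, with the expected training character, and classifies it as on-target, misdirected, or extraneous (to be disregarded).

```python
def classify_training_touches(touches, expected_chars, key_centers,
                              near_threshold=15.0, far_threshold=40.0):
    """Pair each touch with the expected training character, in order, and classify it.

    touches        -- list of (x, y) touch centers, in the order they were typed
    expected_chars -- training text characters, in the order they should be typed
    key_centers    -- dict mapping a character to the center (x, y) of its key
    A touch close to the intended key counts as on-target; a touch that misses but
    lands nearby counts as misdirected; anything farther is treated as an
    extraneous mistake (e.g., a spelling error) and is ignored for customization.
    """
    results = []
    for (tx, ty), ch in zip(touches, expected_chars):
        kx, ky = key_centers[ch]
        dist = ((tx - kx) ** 2 + (ty - ky) ** 2) ** 0.5
        if dist <= near_threshold:
            results.append((ch, "on_target"))
        elif dist <= far_threshold:
            results.append((ch, "misdirected"))
        else:
            results.append((ch, "extraneous"))
    return results

key_centers = {"h": (300, 120), "j": (340, 120)}
print(classify_training_touches([(305, 122), (355, 125)], "hj", key_centers))
```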
In some example aspects, keyboard customization module 18 of computing device 2 may operate in a continuous learning mode for determining whether modifications to a graphical keyboard might be proposed for a user. Computing device 2 may operate according to the continuous learning mode alternatively or additionally to a training program mode such as that described above. In the continuous learning mode, keyboard customization module 18 may execute as a background task while the user makes use of computing device 2. For example, a user may make use of one or more other application(s) 19 executing on operating system 16 of computing device 2, where the other applications 19 are applications other than keyboard application 8. Keyboard customization module 18 may gather data based on the user's inputs to computing device 2 using graphical keyboard 10, and may, for example, store the gathered data at touch data 36 of storage device(s) 26. As one example, while a user types with graphical keyboard 10 in an email application of applications 19, keyboard customization module 18 may operate in the continuous learning mode to record data associated with user inputs during use of the email application and calculate touch regions associated with the user inputs.
Keyboard customization module 18 may identify a target region associated with each touch region. In other words, for each user input, keyboard customization module 18 may identify which key was intended to be pressed by the user input, and then identifies a corresponding target region for the key intended to be pressed. For example, a target region may coincide with a boundary of the representation of the intended key. Keyboard customization module 18 may identify which key was intended to be pressed by the user input, and thus the corresponding target regions, based on user typing self-corrections, such as when a user deletes and retypes some text, selection of an auto-correct suggestion, and/or other user typing corrections, for example. That is, keyboard customization module 18 may identify what key character(s) the user has selected to replace what was originally typed by the user.
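As one hedged illustration of inferring intended keys from a self-correction, the sketch below compares the originally typed text with the corrected text position by position; it assumes (hypothetically) that the correction did not insert or delete characters, which a fuller implementation would have to handle.

```python
def infer_intended_keys(typed, corrected):
    """Infer which key was intended for each typed character, based on a user
    correction in which `typed` was replaced by `corrected`.

    Returns (typed_char, intended_char) pairs for positions where the correction
    changed a character; a simple position-by-position comparison is used here.
    """
    pairs = []
    for typed_char, intended_char in zip(typed, corrected):
        if typed_char != intended_char:
            pairs.append((typed_char, intended_char))
    return pairs

# Example: the user typed "hwllo", then corrected it to "hello"; the touch that
# produced "w" is associated with the target region of the "e" key.
print(infer_intended_keys("hwllo", "hello"))   # -> [('w', 'e')]
```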
Keyboard customization module 18 may determine whether a touch region corresponding to a given user input is substantially aligned with an associated target region of the graphical keyboard, and keyboard customization module 18 identifies instances in which the touch regions do not substantially align with the corresponding expected key region 32. For example, keyboard customization module 18 may compare a center of a touch region to a center of a corresponding target region. Keyboard customization module 18 may determine that the touch region substantially aligns with the corresponding target region when, for example, the centers of the touch region and the target region are within a certain configured distance from one another. In some examples, even though some or all of a touch region lies within a boundary of the representation of the key, if a center of the touch region is too close to an edge of the key boundary, it may be determined not to be substantially aligned. In some example aspects, keyboard customization module 18 may build a cumulative record of user touches in touch data 36 for each user, and may determine whether a distribution of the cumulative user touches is substantially aligned with the target region. In other examples, keyboard customization module 18 may determine that the touch region does not substantially align with the corresponding target region when greater than a threshold quantity of surface area of the touch region is positioned outside of the target region.
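The following sketch combines two of the alignment criteria mentioned above, center-to-center distance and the fraction of the touch area falling outside the target region, for a circular touch region and a rectangular target region. The thresholds and the sampling-based area estimate are illustrative assumptions, not the disclosed implementation.

```python
import math

def substantially_aligned(touch_center, touch_radius, target_rect,
                          max_center_distance=10.0, max_area_outside=0.25):
    """Check whether a circular touch region is substantially aligned with a
    rectangular target region (left, top, right, bottom)."""
    left, top, right, bottom = target_rect
    target_center = ((left + right) / 2.0, (top + bottom) / 2.0)
    if math.hypot(touch_center[0] - target_center[0],
                  touch_center[1] - target_center[1]) > max_center_distance:
        return False

    # Estimate the fraction of the touch circle lying outside the target rectangle.
    samples = outside = 0
    steps = 20
    for i in range(steps):
        for j in range(steps):
            x = touch_center[0] - touch_radius + 2 * touch_radius * i / (steps - 1)
            y = touch_center[1] - touch_radius + 2 * touch_radius * j / (steps - 1)
            if (x - touch_center[0]) ** 2 + (y - touch_center[1]) ** 2 <= touch_radius ** 2:
                samples += 1
                if not (left <= x <= right and top <= y <= bottom):
                    outside += 1
    return samples == 0 or outside / samples <= max_area_outside

print(substantially_aligned((52, 22), 5, (40, 10, 60, 30)))   # True: close to center
print(substantially_aligned((70, 22), 5, (40, 10, 60, 30)))   # False: center too far away
```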
Based on user touch data 36 and/or thresholds 34, keyboard customization module 18 may determine that one or more parameters of the graphical keyboard could be modified to improve the user experience. Keyboard customization module 18 may, for example, propose to modify one or more attributes such as a shape, size, and relative position of keys on the graphical keyboard. As another example, keyboard customization module 18 may propose to modify an attribute such as an overall layout of the graphical keyboard, including modifying the size of the entire graphical keyboard. For example, keyboard customization module 18 may increase an overall size of the graphical keyboard, such as by increasing a size of one or more keys of the graphical keyboard.
When keyboard customization module 18 determines that one or more parameters of the graphical keyboard should be modified, keyboard customization module 18 may suggest proposed modifications to the graphical keyboard to the user, such as by showing proposed modifications on a display of output device 6. Keyboard customization module 18 may, for example, occasionally present a display to the user that shows a new proposed graphical keyboard layout. Keyboard customization module 18 may simultaneously display both a current layout of the graphical keyboard and a proposed modified layout of the graphical keyboard on a single display.
In some examples, keyboard customization module 18 may give the user an option to elect to use the new graphical keyboard layout or stay with the current layout. Keyboard customization module 18 may also provide the user an option to accept the key layout changes on a key-by-key basis. In some examples, keyboard customization module 18 may automatically change the layout of the graphical keyboard without requesting user approval. The user may be able to turn the continuous keyboard learning mode on or off, e.g., via a user menu presented by computing device 2. The user may also be able to defer participation in a keyboard customization training program.
When multiple different users make use of computing device 2 and have, for example, different user names and associated profiles on computing device 2, training module 14 may run one or more training programs 30 for each user. Keyboard customization module 18 may create different customized graphical keyboards for each user. Keyboard customization module 18 may store data associated with the different customized graphical keyboards to user settings 28. For example, keyboard customization module 18 may store data to user settings 28 indicating the mapping between users and respective customized graphical keyboard layouts. User settings 28 can include a variety of user settings for each user, in addition to settings related to the customized graphical keyboards.
In one example, keyboard customization module 18 may suggest enlarging an overall size of the graphical keyboard 10 based on comparisons of touch regions and target regions. For example, if touch data 36 collected based on the user's typing indicates that the user often touches locations beyond a boundary of the current graphical keyboard, then keyboard customization module 18 may propose to enlarge the overall size of the graphical keyboard 10.
FIG. 3 is a conceptual diagram illustrating an example of a keyboard application. For purposes of illustration, the example keyboard application is described below in the context of computing device 2 of FIG. 1 and FIG. 2. Keyboard application 8, executing on one or more processors 20, may provide one or more signals to cause a touch-sensitive display, such as output device 6, to display graphical keyboard 40. As illustrated in FIG. 3, a user may perform a gesture, such as a tap gesture, at a location of the touch-sensitive display (e.g., output device 6) that displays one or more of the characters of the graphical keyboard. A tap gesture may be defined as touching the touch-sensitive display at one or more of the displayed characters with an input unit (a finger, in the illustrated example) and releasing the character by removing the input unit from the touch-sensitive display. In certain examples, a user may perform a sliding gesture (not illustrated), such as by releasing the character by removing the input unit from the selected character while maintaining contact between the input unit and the touch-sensitive display.
In the illustrated example of FIG. 3, a tap gesture begins with gesture 42, where a user begins to initiate touching graphical keyboard 40 at the displayed character “k”. At gesture 44, the user has made contact with the displayed letter “k” of graphical keyboard 40. Gesture determination module 12 may determine that a gesture, such as the illustrated tap gesture, has begun when output device 6 provides one or more signals indicating that an input device has made contact with the touch-sensitive display. In certain examples, as when output device 6 includes a presence-sensitive display, gesture determination module 12 may determine that a gesture has begun when output device 6 provides one or more signals indicating that an input device has come into a detectable range of the presence-sensitive device.
At gesture 46, a user has released the displayed character “k” by removing his or her finger from the touch-sensitive display. Gesture determination module 12 may determine that a tap gesture has been performed because the input unit (a finger in the illustrated example) was removed from the selected character by removing the input unit from the touch-sensitive display.
Upon determining that a tap gesture has been performed, gesture determination module 12 may determine the touch region of the portion of output device 6 that is in contact with the input unit, such as by using position data and a radius of a touch region indicated by output device 6. Gesture determination module 12 may store user touch data, such as the position of a tap gesture on the graphical keyboard, a key character associated with the position, and a radius of the touch region associated with the tap gesture, to touch data 36.
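The record stored to touch data 36 could take a shape like the following sketch; the schema and names (TouchRecord, log_tap) are hypothetical, chosen only to mirror the fields named above (position, key character, and touch-region radius).

```python
from dataclasses import dataclass
from typing import List

@dataclass
class TouchRecord:
    """One logged tap gesture (hypothetical schema for entries in touch data 36)."""
    x: float            # position of the tap on the graphical keyboard
    y: float
    key_char: str       # key character associated with the position
    radius: float       # radius of the touch region reported by the display

touch_log: List[TouchRecord] = []

def log_tap(x, y, key_char, radius):
    """Append one tap gesture to the per-user touch log."""
    touch_log.append(TouchRecord(x=x, y=y, key_char=key_char, radius=radius))

log_tap(312.5, 118.0, "k", 6.2)
print(touch_log[0])
```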
FIGS. 4A-4D are block diagrams illustrating example portions of a graphical keyboard. FIG. 4A includes three target regions 50A-50C (“target regions 50”). In the example of FIG. 4A, target region 50A is associated with a representation of the “H” key, target region 50B is associated with a representation of the “J” key, and target region 50C is associated with a representation of the “N” key. FIG. 4A also includes a touch region 52 associated with the representation of the key “H.” In this example, target regions 50 are co-extensive with outer boundaries of a representation of the respective keys on the graphical keyboard. In other examples, target regions 50 may not be co-extensive with the outer boundaries of the representations of the keys. For example, a target region 50 may consist of a different region associated with the respective key, such as by delineating an inner region of a representation of a key.
In some aspects, touch region 52 may be determined by gesture determination module 12 based on a single user input, or may be determined by keyboard customization module 18 based on multiple user inputs, such as based on a set of user inputs like those represented by the distribution of graph 60 of FIG. 5. In an example in which touch region 52 corresponds to a single user input, keyboard customization module 18 may determine that the touch region 52 is associated with target region 50A, in the sense that keyboard customization module 18 determines that a user is expected to have typed “H” when the user input associated with touch region 52 occurred. Keyboard customization module 18 may determine the association based on data from a training program 30, or other method.
In the example of FIG. 4A, keyboard customization module 18 may determine that touch region 52 is substantially aligned with target region 50A. For example, keyboard customization module 18 may determine that a center of touch region 52 is within a threshold distance of a center of target region 50A. In the example of FIG. 4B, keyboard customization module 18 may likewise determine that touch region 54 is associated with target region 50A. Gesture determination module 12 may compute a radius 55 of touch region 54, and may use the radius 55 to determine the touch region 54.
Keyboard customization module 18 may determine that the touch region substantially aligns with the corresponding target region when, for example, the centers of the touch region and the target region are within a certain configured distance from one another. In some examples, even though some or all of a touch region lies within a boundary of the representation of the key, if a center of the touch region is too close to an edge of the key boundary, keyboard customization module 18 may determine that the touch region is not substantially aligned with the target region. FIG. 4C illustrates one such example. In the example of FIG. 4C, keyboard customization module 18 may determine that touch region 56 is not substantially aligned with target region 50A because a center 57 of touch region 56 is greater than a threshold distance away from a center 59 of the target region 50A. FIG. 4D may be another example in which a touch region 58 is found not to be substantially aligned with a target region 50A associated with the representation of the key corresponding to the character “H.”
FIG. 5 is a conceptual diagram illustrating an example distribution of user inputs associated with a representation of a key on a graphical keyboard. FIG. 5 illustrates a portion 61 of a graphical keyboard that includes representations of keys 63A-63C. Key 63A is a representation of the “H” key, key 63B is associated with a representation of the “J” key, and key 63C is associated with a representation of the “N” key. In the example of FIG. 5, three-dimensional graph 60 includes an x-axis 62 that represents a position along a first dimension in the plane of the graphical keyboard 10, and a y-axis 64 that represents a position along a second dimension in the plane of the graphical keyboard 10. Three-dimensional graph 60 further includes a z-axis 66 that represents a cumulative quantity N of touches that have occurred at a given position of a key on the graphical keyboard. Keyboard customization module 18 may maintain distribution data 36, which may include data such as that represented by three-dimensional graph 60 for each key represented on the graphical keyboard for each user (e.g., the key associated with the letter “H”).
The example of FIG. 5 includes a key boundary 68 that indicates a position on the graphical keyboard associated with a particular key, for example, the key associated with the letter “H.” FIG. 5 also includes a target region 72 associated with the representation of the key “H.” In the example of FIG. 5, target region 72 is not coextensive with the key boundary 68, but represents an inner region having less surface area than a region encompassed by key boundary 68.
When a user touch is registered by gesture determination module 12 as being associated with a particular (x, y) position in the plane of the graphical keyboard 10, keyboard customization module 18 may record the instance of the touch, such as by incrementing a counter associated with that (x, y) position. A cumulative quantity of touches up to a given point of time at each (x, y) position within the key boundary for the key “H” is illustrated by three-dimensional graph 60. Keyboard customization module 18 may not record those touches that are determined to be actual typing errors (e.g., spelling mistakes), but instead may only record those touches that are determined to be attempts to type at the representation of the key for “H.”
Computing device 2 may be configured with a threshold 70 that specifies a threshold quantity of touches. If the (x, y) position within key boundary 68 that is most touched by the user when the user is attempting to touch the key for “H” is too close to an edge of key boundary 68, then keyboard customization module 18 may determine to modify the location, shape, or other parameter or characteristic of the key to better suit the typing habits of the user.
Keyboard customization module 18 may initiate modification of one or more parameters of the graphical keyboard upon determining that a touch region associated with the key “H” is not substantially aligned with the target region 72 associated with the key “H.” For example, keyboard customization module 18 may initiate modification of the graphical keyboard when a most touched position associated with the key “H” is located outside of a configured inner region 72 of the key. The most touched position is represented by a maximum 74 of the distribution represented in three-dimensional graph 60. In some aspects, keyboard customization module 18 may be configured not to modify the graphical keyboard when only a few touches have occurred outside the inner region 72, but may instead modify the graphical keyboard only when the quantity of touches at a position outside the target region 72 exceeds a threshold quantity of touches 70. Sensitivity of keyboard customization module 18 may be configured by adjusting settings for threshold 70 and/or target region 72.
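A compact sketch of the FIG. 5 decision follows, assuming the counter data behind graph 60 is kept as a dictionary and the inner target region 72 is a rectangle; the function name and the specific data layout are hypothetical.

```python
def should_modify_key(touch_counts, inner_region, count_threshold):
    """Decide whether to propose modifying a key, per the FIG. 5 discussion.

    touch_counts    -- dict mapping (x, y) positions within the key boundary to
                       cumulative touch counts (the data behind graph 60)
    inner_region    -- (left, top, right, bottom) of the inner target region (72)
    count_threshold -- threshold quantity of touches (70)
    """
    if not touch_counts:
        return False
    (mx, my), peak = max(touch_counts.items(), key=lambda item: item[1])
    left, top, right, bottom = inner_region
    outside_inner = not (left <= mx <= right and top <= my <= bottom)
    return outside_inner and peak > count_threshold

counts = {(48, 20): 3, (58, 21): 14}            # most touches near the key's right edge
print(should_modify_key(counts, (45, 15, 55, 25), count_threshold=10))   # -> True
```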
Alternatively or additionally, as described above, keyboard customization module 18 may initiate modification of the graphical keyboard when a center of a touch region is located greater than a configured distance from a center of the target region 72, where the touch region may be determined based on cumulative user inputs represented by graph 60. Keyboard customization module 18 may use other techniques for determining whether a touch region is substantially aligned with the corresponding target region. In response to determining that a touch region is not substantially aligned with the corresponding target region, keyboard customization module 18 may propose to modify the location, shape, or other parameter or characteristic of the key to better suit the typing habits of the user. In this manner, keyboard customization module 18 may help to improve the quality of the user's experience when using the graphical keyboard, such as by reducing an amount of errors and corrections made by the user.
FIG. 6 is a flow diagram illustrating an example process of a computing device or computing system (e.g., computing device 2 of FIGS. 1-2) that is configured to execute a keyboard application. In the example of FIG. 6, a computing device having an input-sensitive display operates by outputting a first graphical keyboard arrangement including a first representation of a key that is associated with a target region of the input-sensitive display (80). The operation includes receiving a plurality of user inputs at the input-sensitive display, each user input from the plurality of user inputs being associated with a respective touch region of the input-sensitive display (82). Responsive to determining that each input from the plurality of user inputs is associated with the first representation of the key, the operation includes determining whether one or more of the associated touch regions is not substantially aligned with the target region associated with the first representation of the key (84). The operation includes identifying a quantity of the touch regions that are not substantially aligned with the target region (86), and, subsequent to determining that the quantity exceeds a threshold quantity of touch regions that are not substantially aligned with the target region, outputting, at the input-sensitive display, a second graphical keyboard arrangement that includes a second representation of the key, wherein at least one attribute of the second representation of the key is graphically modified relative to the first representation of the key so as to substantially align one or more of the touch regions with a target region associated with the second representation of the key (88).
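To see how the operations of FIG. 6 fit together for a single key, the following sketch strings the steps into one routine under simplifying assumptions (circular touch regions summarized by their centers, alignment judged by center distance, and a shift toward the average misaligned touch); the names and thresholds are illustrative, not the claimed implementation.

```python
import math

def run_customization_pass(touch_centers, target_center, align_distance,
                           misaligned_threshold):
    """Follow the FIG. 6 flow for a single key (80-88), in simplified form.

    touch_centers        -- centers of the touch regions received for the key (82)
    target_center        -- center of the key's target region in the first arrangement (80)
    align_distance       -- max center distance counted as substantially aligned (84)
    misaligned_threshold -- threshold quantity of misaligned touch regions (86)
    Returns a target center for a second representation of the key, or None (88).
    """
    misaligned = [c for c in touch_centers
                  if math.hypot(c[0] - target_center[0],
                                c[1] - target_center[1]) > align_distance]
    if len(misaligned) <= misaligned_threshold:
        return None
    # Move the key's target region toward the average misaligned touch position.
    avg_x = sum(x for x, _ in misaligned) / len(misaligned)
    avg_y = sum(y for _, y in misaligned) / len(misaligned)
    return (avg_x, avg_y)

touches = [(57, 21), (58, 22), (56, 20), (50, 20)]
print(run_customization_pass(touches, target_center=(50, 20),
                             align_distance=4.0, misaligned_threshold=2))
```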
In one example, the process can further include receiving the plurality of user inputs in response to a training program presented by the computing device that prompts a user to type a predefined series of characters using the first graphical keyboard arrangement, and determining that each user input from the plurality of user inputs is associated with the first representation of the key based on a comparison of an order of the user inputs with an order of the predefined series of characters presented by the training program. Alternatively or additionally, the process can include executing a keyboard customization application as a background task of the computing device, and receiving the plurality of user inputs during use by a user of an application executing on the computing device other than the keyboard customization application. In some examples, the process can include determining, for example, that each user input from the plurality of user inputs is associated with the first representation of the key at least in part based on typing corrections received from the user. The process can include outputting, at the input-sensitive display of the computing device, an indication of an option to cease execution of the keyboard customization application as a background task.
Alternatively or additionally, in some examples the process can include outputting, at the input-sensitive display of the computing device, the first graphical keyboard arrangement and the second graphical keyboard arrangement at the same time, and soliciting a user selection of one of the first graphical keyboard arrangement and the second graphical keyboard arrangement for future use, and/or soliciting a user selection of one or more modifications of attributes of one or more representations of keys of the first graphical keyboard arrangement for future use. The process can include associating the second graphical keyboard arrangement with a profile of a user of the computing device. The process can also include graphically modifying one or more attributes of the second representation of the key relative to the first representation of the key, such as a shape, a size, a position, or other attribute.
In some examples, the process can include determining that one or more of the associated touch regions is not substantially aligned with the target region associated with the first representation of the key based at least on determining that a center of the touch region is positioned greater than a threshold distance away from a center of the target region. In some examples, the process can include determining that one or more of the associated touch regions is not substantially aligned with the target region associated with the first representation of the key based at least on determining that greater than a threshold quantity of surface area of the touch region is positioned outside of the target region.
Techniques described herein may be implemented, at least in part, in hardware, software, firmware, or any combination thereof. For example, various aspects of the described embodiments may be implemented within one or more processors, including one or more microprocessors, digital signal processors (DSPs), application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), or any other equivalent integrated or discrete logic circuitry, as well as any combinations of such components. The term “processor” or “processing circuitry” may generally refer to any of the foregoing logic circuitry, alone or in combination with other logic circuitry, or any other equivalent circuitry. A control unit including hardware may also perform one or more of the techniques of this disclosure.
Such hardware, software, and firmware may be implemented within the same device or within separate devices to support the various techniques described herein. In addition, any of the described units, modules or components may be implemented together or separately as discrete but interoperable logic devices. Depiction of different features as modules or units is intended to highlight different functional aspects and does not necessarily imply that such modules or units are realized by separate hardware, firmware, or software components. Rather, functionality associated with one or more modules or units may be performed by separate hardware, firmware, or software components, or integrated within common or separate hardware, firmware, or software components.
Techniques described herein may also be embodied or encoded in an article of manufacture including a computer-readable storage medium encoded with instructions. Instructions embedded or encoded in an article of manufacture including an encoded computer-readable storage medium, may cause one or more programmable processors, or other processors, of a computing system to implement one or more of the techniques described herein, such as when instructions included or encoded in the computer-readable storage medium are executed by the one or more processors. Computer readable storage media may include random access memory (RAM), read only memory (ROM), programmable read only memory (PROM), erasable programmable read only memory (EPROM), electronically erasable programmable read only memory (EEPROM), flash memory, a hard disk, a compact disc ROM (CD-ROM), a floppy disk, a cassette, magnetic media, optical media, or other computer readable media. In some examples, an article of manufacture may comprise one or more computer-readable storage media.
In some examples, computer-readable storage media may comprise non-transitory media. The term “non-transitory” may indicate that the storage medium is tangible and is not embodied in a carrier wave or a propagated signal. In certain examples, a non-transitory storage medium may store data that can, over time, change (e.g., in RAM or cache).
Various examples have been described. These and other examples are within the scope of the following claims.