RELATED APPLICATIONS
This application is a divisional of U.S. patent application Ser. No. 11/459,606, filed Jul. 24, 2006, entitled “Keyboards for Portable Electronic Devices,” which claims priority to U.S. Provisional Patent Application No. 60/756,890, filed Jan. 5, 2006, entitled “Keyboards for Portable Electronic Devices,” which applications are incorporated by reference herein in their entirety.
This application is related to U.S. patent application Ser. No. 11/459,615, filed Jul. 24, 2006, entitled “Touch Screen Keyboards for Portable Electronic Devices,” which application is incorporated by reference herein in its entirety.
This application is related to U.S. patent application Ser. No. 11/228,700, filed Sep. 16, 2005, entitled “Operation of a Computer with Touch Screen Interface,” which application is incorporated by reference herein in its entirety.
This application is related to U.S. patent application Ser. No. ______, filed ______, entitled “User Interface Including Word Recommendations” (Attorney Docket No. P4150USD2/63266-5271US), which application is incorporated by reference herein in its entirety.
This application is related to U.S. patent application Ser. No. ______, filed ______, entitled “Keyboard with Multi-Symbol Icons” (Attorney Docket No. P4150USD3/63266-5272US), which application is incorporated by reference herein in its entirety.
This application is related to U.S. patent application Ser. No. ______, filed ______, entitled “Keyboard with Multi-Symbol Icons” (Attorney Docket No. P4150USD4/63266-5273US), which application is incorporated by reference herein in its entirety.
TECHNICAL FIELD
The disclosed embodiments relate to user interfaces, and in particular, to user interfaces that include a touch screen keyboard.
BACKGROUND
As portable devices become more compact, and the amount of information to be processed and stored increases, it has become a significant challenge to design a user interface that allows users to easily interact with the device. This is unfortunate since the user interface is the gateway through which users receive not only content but also responses to user actions or behaviors, including user attempts to access a device's features or tools. Some portable electronic devices (e.g., mobile phones) have resorted to adding more pushbuttons, increasing the density of pushbuttons, overloading the functions of pushbuttons, or using complex menu systems to allow a user to access, store and manipulate data. These conventional user interfaces often result in complicated key sequences and menu hierarchies that must be memorized by the user. In addition, as the number of pushbuttons has increased, the proximity of neighboring buttons often makes it difficult for users to activate a desired pushbutton.
Many conventional user interfaces, such as those that include physical pushbuttons, are also inflexible. This is unfortunate since it may prevent a user interface from being configured and/or adapted by either an application running on the portable device or by users. When coupled with the time consuming requirement to memorize multiple key sequences and menu hierarchies, and the difficulty in activating a desired pushbutton, such inflexibility is frustrating to most users.
Accordingly, there is a need for more transparent and intuitive user interfaces for portable electronic devices that are easy to use, configure, and/or adapt.
SUMMARY OF EMBODIMENTS
The above deficiencies and other problems associated with user interfaces for portable devices are reduced or eliminated by the disclosed touch screen keyboards and their methods of use.
In some embodiments, a method includes displaying a plurality of icons on a touch-sensitive display. A respective icon in at least a subset of the plurality of icons corresponds to two or more symbols. A contact by a user with the touch-sensitive display that corresponds to the respective icon is detected. A respective symbol in the two or more symbols to which the contact further corresponds is determined. The displayed respective icon is modified to indicate that the contact corresponds to the respective symbol.
The respective symbol may be selected when the user breaks contact with the respective icon. The respective symbol may be capitalized when contact is maintained for a time interval exceeding a pre-determined value.
Modifying may include changing a shape of the respective icon. Changing the shape may include an asymmetric distortion of the shape. An initial shape of the respective icon may include an arc.
Detecting may include detecting rolling of a finger over a region that corresponds to the respective symbol. The contact may include a gesture that is selected from the group consisting of one or more taps, a swipe and a rolling of a finger.
The two or more symbols for the respective icon may be determined in accordance with a lexicography model. The lexicography model may correspond to a user usage history. The user usage history may occur prior to the establishment of the contact. The lexicography model may correspond to a frequency of usage of symbols in a language.
In some embodiments, the respective symbol is displayed in a region within the shape of the respective icon and outside of a region corresponding to the contact.
In some embodiments, a visual indicator corresponding to the respective symbol is provided. The visual indicator may include visual illumination proximate to the respective icon. The visual illumination may include a band around at least a portion of the respective icon. The visual indicator may be in accordance with a user usage history that occurs prior to the detecting of the contact.
In some embodiments, a method includes displaying a plurality of icons on a touch-sensitive display. Two or more subsets of the plurality of icons are arranged in corresponding rows on the touch-sensitive display. A space greater than a pre-determined value is included between adjacent rows. A contact by a user with the touch-sensitive display that corresponds to a respective icon is detected. A symbol corresponding to the respective icon is displayed in the space between a respective row corresponding to the respective icon and a neighboring row while the contact is maintained.
The symbol may be the respective icon. The symbol may be magnified relative to the respective icon. The neighboring row may be above the respective row.
In another embodiment, a plurality of icons are displayed on a touch-sensitive display. A contact by a user with the touch-sensitive display that corresponds to the respective icon is determined. A symbol corresponding to the respective icon is displayed superimposed over one or more additional icons in the plurality of icons while the contact is maintained.
In another embodiment, a plurality of icons are displayed on a touch-sensitive display. Two or more subsets of the plurality of icons are arranged in corresponding rows. A contact by a user with the touch-sensitive display that corresponds to the respective icon is determined. The displayed plurality of icons are modified to include a space greater than a pre-determined value between a row corresponding to the respective icon and an adjacent row on the touch-sensitive display while the contact is maintained. A symbol corresponding to the respective icon is displayed in the space while the contact is maintained.
In some embodiments, a method includes displaying a plurality of icons on a touch-sensitive display. A respective icon in the plurality of icons corresponds to at least one symbol. One or more recommended words are displayed. The one or more recommended words are in accordance with a user history. The one or more recommended words are displayed prior to detecting any contacts by a user corresponding to symbol selection in a current application session. A contact by the user with the touch-sensitive display is detected. The contact includes a gesture. A respective recommended word corresponding to the gesture is selected.
The gesture may include a swipe motion. The swipe motion may include a horizontal component with displacement from left to right or from right to left along the touch-sensitive display. The swipe motion may include a vertical component with displacement downward or upward along the touch-sensitive display.
The gesture may include one or more taps. A respective tap may include making contact with the touch-sensitive display for a time interval less than a pre-determined value.
The gesture may include a rolling motion of the contact. The rolling motion may be from left to right or from right to left along the touch-sensitive display.
In some embodiments, a method includes displaying a plurality of icons on a touch-sensitive display. A respective icon in at least a subset of the plurality of icons corresponds to two or more symbols. A contact by a user with the touch-sensitive display that corresponds to a selection of the respective icon, wherein the contact includes a respective gesture, is detected. A respective symbol in the two or more symbols for the respective icon to which the contact further corresponds is determined. The respective symbol is a first symbol in the two or more symbols if the respective gesture includes a continuous contact and the respective symbol is a second symbol in the two or more symbols if the respective gesture includes a discontinuous contact.
The continuous contact may include a swipe motion. The swipe motion may include a horizontal component with displacement from left to right or from right to left along the touch-sensitive display. The swipe motion may include a vertical component with displacement downward or with displacement upward along the touch-sensitive display. The continuous contact may include a rolling motion of the contact. The rolling motion may be from left to right or from right to left along the touch-sensitive display.
The discontinuous contact may include one or more taps. A respective tap may include contact with the touch-sensitive display for a time interval less than a first pre-determined value. Two or more consecutive taps may correspond to the second symbol if a time interval between two or more corresponding contacts is less than a second pre-determined value.
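To make the continuous/discontinuous distinction concrete, the following is a minimal Python sketch of how a detected contact might be classified. It is illustrative only: the threshold constants, class names, and the mapping of gesture classes to symbols are assumptions, not values taken from this disclosure, and which symbol each class selects may differ between embodiments.

```python
# Sketch of gesture classification; thresholds and names are assumed.
from dataclasses import dataclass

TAP_MAX_DURATION = 0.2    # first pre-determined value, in seconds (assumed)
DOUBLE_TAP_MAX_GAP = 0.3  # second pre-determined value, in seconds (assumed)
SWIPE_MIN_DISTANCE = 20.0 # minimum displacement, in pixels (assumed)

@dataclass
class Contact:
    duration: float        # seconds the contact was maintained
    displacement: float    # total movement of the contact point, in pixels
    gap_since_last: float  # seconds since the previous contact ended

def classify(contact: Contact, last_was_tap: bool) -> str:
    """Map one contact onto a gesture class (continuous vs. discontinuous)."""
    if contact.displacement >= SWIPE_MIN_DISTANCE:
        return "swipe"           # continuous contact
    if contact.duration < TAP_MAX_DURATION:
        if last_was_tap and contact.gap_since_last < DOUBLE_TAP_MAX_GAP:
            return "double_tap"  # two consecutive taps within the time limit
        return "tap"             # discontinuous contact
    return "hold"
```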
The first symbol may be included in a first subset of symbols and the second symbol may be included in a second subset of symbols. The first subset of symbols may have a probability of occurrence that is greater than a first pre-determined value and the second subset of symbols may have a probability of occurrence that is less than the first pre-determined value.
The probability of occurrence may be in accordance with a user history. The probability of occurrence may be in accordance with a lexicography model. The lexicography model may include a frequency of usage of symbols in a language.
In some embodiments, the second symbol for the respective icon has a probability of occurrence immediately following the first symbol for the respective icon that is less than a second pre-determined value. In some embodiments, the first symbol for the respective icon has a probability of occurrence immediately following the second symbol for the respective icon that is less than a second pre-determined value.
In some embodiments, the displayed respective icon is modified to indicate that the contact corresponds to a respective symbol. In some embodiments, a visual indicator corresponding to a respective symbol is provided. The visual indicator may include visual illumination proximate to the respective icon. The visual illumination may include a band around at least a portion of the respective icon.
In some embodiments, a method includes displaying a plurality of icons on a touch-sensitive display. A respective icon in at least a subset of the plurality of icons corresponds to two or more symbols. A first symbol in the two or more symbols belongs to a first subset of symbols and a second symbol in the two or more symbols belongs to a second subset of symbols. The first symbol has a probability of occurrence that is greater than a first pre-determined value and the second symbol has a probability of occurrence that is less than the first pre-determined value. A contact by a user with the touch-sensitive display that corresponds to a selection of the respective icon is detected. The contact includes a respective gesture. A respective symbol in the two or more symbols for the respective icon to which the contact further corresponds is determined.
The probability of occurrence may be in accordance with a user history. The probability of occurrence may be in accordance with a lexicography model. The lexicography model may include a frequency of usage of symbols in a language. The second symbol may have a probability of occurrence immediately following the first symbol that is less than a second pre-determined value.
The first symbol may be selected using one or more tap gestures and the second symbol may be selected using a swipe gesture. A respective tap may include making contact with the touch-sensitive display for a time interval less than a second pre-determined value. Two or more consecutive taps may correspond to the second symbol if a time interval between two or more corresponding contacts is less than a third pre-determined value.
In some embodiments, the displayed respective icon is modified to indicate that the contact corresponds to the respective symbol. In some embodiments, a visual indicator corresponding to the respective symbol is provided. The visual indicator may include visual illumination proximate to the respective icon. The visual illumination may include a band around at least a portion of the respective icon.
In some embodiments, the first subset of symbols includes e, t, a, o, i, n, s, r and h. In some embodiments, the first subset of symbols includes q, e, u, i, o, a, d, g, j, l, z, c, b, n and m. In some embodiments, the first subset of symbols includes q, c, e, h, i, l, n, o, r, t, u, w and y.
In some embodiments, the second subset of symbols includes w, y and j. In some embodiments, the second subset of symbols includes w, y, p, g and j. In some embodiments, the second subset of symbols includes w, r, t, y, p, s, f, h, k, x and v. In some embodiments, the second subset of symbols includes j, v, x and z. In some embodiments, the second subset of symbols includes b, d, f, g, j, k, m, p, q, s, v, x and z.
The aforementioned methods may be performed by a portable electronic device having a touch-sensitive display with a graphical user interface (GUI), one or more processors, memory and one or more modules, programs or sets of instructions stored in the memory for performing these methods. In some embodiments, the portable electronic device provides a plurality of functions, including wireless communication.
Instructions for performing the aforementioned methods may be included in a computer program product configured for execution by one or more processors.
BRIEF DESCRIPTION OF THE DRAWINGS
For a better understanding of the aforementioned embodiments of the invention as well as additional embodiments thereof, reference should be made to the Description of Embodiments below, in conjunction with the following drawings in which like reference numerals refer to corresponding parts throughout the figures.
FIG. 1 is a block diagram illustrating an embodiment of an architecture for a portable electronic device.
FIG. 2 is a schematic diagram illustrating an embodiment of a user interface for a portable electronic device.
FIG. 3A is a schematic diagram illustrating an embodiment of a user interface for a portable electronic device.
FIG. 3B is a schematic diagram illustrating an embodiment of a user interface for a portable electronic device.
FIG. 3C is a schematic diagram illustrating an embodiment of a user interface for a portable electronic device.
FIG. 4 is a flow diagram of an embodiment of a symbol entry process.
FIG. 5 is a block diagram illustrating an embodiment of a character set data structure.
FIG. 6A is a schematic diagram illustrating an embodiment of a user interface for a portable electronic device.
FIG. 6B is a schematic diagram illustrating an embodiment of a user interface for a portable electronic device.
FIG. 6C is a schematic diagram illustrating an embodiment of a user interface for a portable electronic device.
FIG. 6D is a schematic diagram illustrating an embodiment of a user interface for a portable electronic device.
FIG. 7 is a flow diagram of an embodiment of a symbol entry process.
FIG. 8 is a schematic diagram illustrating an embodiment of a user interface for a portable electronic device.
FIG. 9 is a flow diagram of an embodiment of a symbol entry process.
FIG. 10A is a block diagram illustrating an embodiment of a user word history data structure.
FIG. 10B is a block diagram illustrating an embodiment of a language data structure system.
FIG. 11A is a flow diagram of an embodiment of a symbol entry process.
FIG. 11B is a flow diagram of an embodiment of a symbol entry process.
FIG. 11C is a flow diagram of an embodiment of a symbol entry process.
FIG. 12A is a schematic diagram illustrating an embodiment of a user interface for a portable electronic device.
FIG. 12B is a schematic diagram illustrating an embodiment of a user interface for a portable electronic device.
FIG. 12C is a schematic diagram illustrating an embodiment of a user interface for a portable electronic device.
FIG. 12D is a schematic diagram illustrating an embodiment of a user interface for a portable electronic device.
FIG. 12E is a schematic diagram illustrating an embodiment of a user interface for a portable electronic device.
FIG. 12F is a schematic diagram illustrating an embodiment of a user interface for a portable electronic device.
FIG. 12G is a schematic diagram illustrating an embodiment of a user interface for a portable electronic device.
FIG. 13 is a flow diagram of an embodiment of a symbol entry process.
FIG. 14 is a flow diagram of an embodiment of a symbol entry process.
FIG. 15 is a schematic diagram illustrating an embodiment of a user interface for a portable electronic device.
FIG. 16 is a flow diagram of an embodiment of a symbol entry process.
FIG. 17 is a schematic diagram illustrating an embodiment of a user interface for a portable electronic device.
FIG. 18 is a flow diagram of an embodiment of a symbol entry process.
FIG. 19 is a schematic diagram illustrating an embodiment of a user interface for a portable electronic device.
DESCRIPTION OF EMBODIMENTS
Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. However, it will be apparent to one of ordinary skill in the art that the present invention may be practiced without these specific details. In other instances, well-known methods, procedures, components, and circuits have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
Embodiments of user interfaces and associated processes for using a device are described. In some embodiments, the device may be a portable communications device. The user interface may include a click wheel and/or touch screen. A click wheel is a physical user-interface device that may provide navigation commands based on an angular displacement of the wheel or a point of contact with the wheel by a user of the device. A click wheel may also be used to provide a user command corresponding to selection of one or more items, for example, when the user of the device presses down on at least a portion of the wheel. For simplicity, in the discussion that follows, a portable communications device (e.g., a cellular telephone that may also contain other functions, such as SMS, PDA and/or music player functions) that includes a touch screen is used as an exemplary embodiment. It should be understood, however, that the user interfaces and associated processes may be applied to other devices, such as personal computers and laptops, that may include one or more other physical user-interface devices, such as a click wheel, a keyboard, a mouse and/or a joystick.
The device may support a variety of applications, such as a telephone, text messaging, word processing, email and a music player. The music player may be compatible with one or more file formats, such as MP3 and/or AAC. In an exemplary embodiment, the device includes an iPod music player (trademark of Apple Computer, Inc.).
The various applications that may be executed on the device may use at least one common physical user-interface device, such as the touch screen. In embodiments that include a click wheel, one or more functions of the click wheel as well as corresponding information displayed on the device may be adjusted and/or varied from one application to the next and/or within a respective application. In this way, a common physical architecture (such as the click wheel) of the device may support the variety of applications with user interfaces that are intuitive and transparent.
The user interfaces may include one or more keyboard embodiments. The keyboard embodiments may include standard (qwerty) and/or non-standard configurations of symbols on the displayed icons of the keyboard. The keyboard embodiments may include a reduced number of icons (or soft keys) relative to the number of keys in existing physical keyboards, such as that for a typewriter. This may make it easier for users to select one or more icons in the keyboard, and thus, one or more corresponding symbols. The keyboard embodiments may be adaptive. For example, displayed icons may be modified in accordance with user actions, such as selecting one or more icons and/or one or more corresponding symbols. One or more applications on the portable device may utilize common and/or different keyboard embodiments. Thus, the keyboard embodiment used may be tailored to at least some of the applications. In some embodiments, one or more keyboard embodiments may be tailored to a respective user, for example, based on a word usage history (lexicography, slang, individual usage) of that user. Some of the keyboard embodiments may be adjusted to reduce a probability of a user error when selecting one or more icons, and thus one or more symbols, when using the keyboard embodiments.
Attention is now directed towards embodiments of the device. FIG. 1 is a block diagram illustrating an architecture for a portable electronic device 100, according to some embodiments of the invention. The device 100 may include a memory 102 (which may include one or more computer readable storage mediums), a memory controller 122, one or more processing units (CPU's) 120, a peripherals interface 118, RF circuitry 108, audio circuitry 110, a speaker 111, a microphone 113, an input/output (I/O) subsystem 106, a display system 112 (which may include a touch screen), a click wheel 114, other input or control devices 116, and an external port 124. These components may communicate over the one or more communication buses or signal lines 103. The device 100 may be any portable electronic device, including but not limited to a handheld computer, a tablet computer, a mobile phone, a media player, a personal digital assistant (PDA), or the like, including a combination of two or more of these items. In other embodiments, the device 100 may not be portable, such as a personal computer.
It should be appreciated that the device 100 is only one example of a portable electronic device 100, and that the device 100 may have more or fewer components than shown, may combine two or more components, or may have a different configuration or arrangement of the components. The various components shown in FIG. 1 may be implemented in hardware, software or a combination of both hardware and software, including one or more signal processing and/or application specific integrated circuits.
The memory 102 may include high speed random access memory and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid state memory devices. In some embodiments, the memory 102 may further include storage remotely located from the one or more processors 120, for instance network attached storage accessed via the RF circuitry 108 or the external port 124 and a communications network (not shown) such as the Internet, intranet(s), Local Area Networks (LANs), Wireless Local Area Networks (WLANs), Storage Area Networks (SANs) and the like, or any suitable combination thereof. Access to the memory 102 by other components of the device 100, such as the CPU 120 and the peripherals interface 118, may be controlled by the memory controller 122.
The peripherals interface 118 couples the input and output peripherals of the device to the CPU 120 and the memory 102. The one or more processors 120 run or execute various software programs and/or sets of instructions stored in the memory 102 to perform various functions for the device 100 and to process data.
In some embodiments, the peripherals interface 118, the CPU 120, and the memory controller 122 may be implemented on a single chip, such as a chip 104. In some other embodiments, they may be implemented on separate chips.
The RF (radio frequency) circuitry 108 receives and sends electromagnetic waves. The RF circuitry 108 converts electrical signals to/from electromagnetic waves and communicates with communications networks and other communications devices via the electromagnetic waves. The RF circuitry 108 may include well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a subscriber identity module (SIM) card, memory, and so forth. The RF circuitry 108 may communicate with networks, such as the Internet, also referred to as the World Wide Web (WWW), an intranet and/or a wireless network, such as a cellular telephone network, a wireless local area network (LAN) and/or a metropolitan area network (MAN), and other devices by wireless communication. The wireless communication may use any of a plurality of communications standards, protocols and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n), voice over Internet Protocol (VoIP), Wi-MAX, a protocol for email, instant messaging and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
The audio circuitry 110, the speaker 111, and the microphone 113 provide an audio interface between a user and the device 100. The audio circuitry 110 receives audio data from the peripherals interface 118, converts the audio data to an electrical signal, and transmits the electrical signal to the speaker 111. The speaker 111 converts the electrical signal to human-audible sound waves. The audio circuitry 110 also receives electrical signals converted by the microphone 113 from sound waves. The audio circuitry 110 converts the electrical signal to audio data and transmits the audio data to the peripherals interface 118 for processing. Audio data may be retrieved from and/or transmitted to the memory 102 and/or the RF circuitry 108 by the peripherals interface 118. In some embodiments, the audio circuitry 110 also includes a headset jack (not shown). The headset jack provides an interface between the audio circuitry 110 and removable audio input/output peripherals, such as output-only headphones or a headset with both output (headphone for one or both ears) and input (microphone).
The I/O subsystem 106 provides the interface between input/output peripherals on the device 100, such as the display system 112, the click wheel 114 and other input/control devices 116, and the peripherals interface 118. The I/O subsystem 106 may include a display controller 156, a click wheel controller 158 and one or more input controllers 160 for other input or control devices. The one or more input controllers 160 receive/send electrical signals from/to the other input or control devices 116. The other input/control devices 116 may include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slider switches, sticks, and so forth.
The display system 112 provides an output interface and/or an input interface between the device and a user. The display controller 156 receives and/or sends electrical signals from/to the display system 112. The display system 112 displays visual output to the user. The visual output may include text, icons, graphics, video, and any combination thereof. In some embodiments, some or all of the visual output may correspond to user-interface objects, further details of which are described below.
In some embodiments, such as those that include a touch screen, the display system 112 also accepts input from the user based on haptic and/or tactile contact. In embodiments with a touch screen, the display system 112 forms a touch-sensitive surface that accepts user input. In these embodiments, the display system 112 and the display controller 156 (along with any associated modules and/or sets of instructions in the memory 102) detect contact (and any movement or breaking of the contact) on the display system 112 and convert the detected contact into interaction with user-interface objects, such as one or more soft keys, that are displayed on a touch screen. In an exemplary embodiment, a point of contact between a touch screen in the display system 112 and the user corresponds to one or more digits of the user.
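As one illustration of converting a detected contact into interaction with a displayed soft key, the following Python sketch hit-tests a contact point against icon bounds. The rectangular icon model and all names are assumptions for illustration only, not the disclosed implementation.

```python
# Minimal hit-testing sketch: map a contact point to the soft key it touches.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Icon:
    symbols: str  # one or more symbols carried by this soft key
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

def hit_test(icons: List[Icon], px: float, py: float) -> Optional[Icon]:
    """Return the displayed icon under the contact point, if any."""
    for icon in icons:
        if icon.contains(px, py):
            return icon
    return None
```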
In embodiments with a touch screen, the touch screen in the display system 112 may use LCD (liquid crystal display) technology or LPD (light emitting polymer display) technology, although other display technologies may be used in other embodiments. A touch screen in the display system 112 and the display controller 156 may detect contact and any movement or breaking thereof using any of a plurality of touch sensitivity technologies, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with a touch screen in the display system 112. A touch-sensitive display in some embodiments of the display system 112 may be analogous to the multi-touch sensitive tablets described in the following: U.S. Pat. No. 6,323,846 (Westerman et al.), U.S. Pat. No. 6,570,557 (Westerman et al.), and/or U.S. Pat. No. 6,677,932 (Westerman), and/or U.S. Patent Publication 2002/0015024A1, each of which is hereby incorporated by reference. However, a touch screen in the display system 112 displays visual output from the portable device 100, whereas touch sensitive tablets do not provide visual output. The touch screen in the display system 112 may have a resolution in excess of 100 dpi. In an exemplary embodiment, the touch screen in the display system may have a resolution of approximately 168 dpi. The user may make contact with the touch screen in the display system 112 using any suitable object or appendage, such as a stylus, finger, and so forth.
In some embodiments, in addition to the touch screen, the device 100 may include a touchpad (not shown) for activating or deactivating particular functions. In some embodiments, the touchpad is a touch-sensitive area of the device that, unlike the touch screen, does not display visual output. The touchpad may be a touch-sensitive surface that is separate from the touch screen in the display system 112 or an extension of the touch-sensitive surface formed by the touch screen.
The device 100 may include a click wheel 114. A user may navigate among one or more graphical objects (henceforth referred to as icons) displayed in the display system 112 by rotating the click wheel 114 or by moving (e.g., by angular displacement) a point of contact with the click wheel 114. The click wheel 114 may also be used to select one or more of the displayed icons. For example, the user may press down on at least a portion of the click wheel 114 or an associated physical button. User commands and navigation commands provided by the user via the click wheel 114 may be processed by the click wheel controller 158 as well as one or more of the modules and/or sets of instructions in the memory 102.
The device 100 also includes a power system 162 for powering the various components. The power system 162 may include a power management system, one or more power sources (e.g., battery, alternating current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a light-emitting diode (LED)) and any other components associated with the generation, management and distribution of power in portable devices.
In some embodiments, the software components stored in the memory 102 may include an operating system 126, a communication module (or set of instructions) 128, a contact/motion module (or set of instructions) 130, a graphics module (or set of instructions) 132, one or more applications (or set of instructions) 136, a timer module (or set of instructions) 144, a word prediction module (or set of instructions) 146, an address book 148, a user word history 150, one or more character sets 152, and one or more lexicography models 154. The graphics module 132 may include an icon effects module (or set of instructions) 134. The applications module 136 may include a telephone module (or set of instructions) 138, a text messaging module (or set of instructions) 140 and/or a music player module (or set of instructions) 142.
The operating system 126 (e.g., Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between various hardware and software components.
The communication module 128 facilitates communication with other devices over one or more external ports 124 and also includes various software components for handling data received by the RF circuitry 108 and/or the external port 124. The external port 124 (e.g., Universal Serial Bus (USB), FIREWIRE, etc.) is adapted for coupling directly to other devices or indirectly over a network (e.g., the Internet, wireless LAN, etc.).
The contact/motion module 130 may detect contact with the click wheel 114 and/or a touch screen in the display system 112 (in conjunction with the display controller 156). The contact/motion module 130 includes various software components for performing various operations related to detection of contact, such as determining if contact has occurred, determining if there is movement of the contact and tracking the movement across the click wheel 114 and/or a touch screen in the display system 112, and determining if the contact has been broken (i.e., if the contact has ceased). Determining movement of the point of contact may include determining speed (magnitude), velocity (magnitude and direction), and/or an acceleration (including magnitude and/or direction) of the point of contact. In some embodiments, the contact/motion module 130 and the display controller 156 also detect contact on a touchpad.
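The motion quantities named above follow directly from successive samples of the contact point. A minimal Python sketch, assuming a simple list of timestamped samples (the sampling representation is an assumption, not part of the disclosure):

```python
# Compute speed (magnitude), velocity (magnitude and direction) and
# acceleration of a tracked contact point from its two or three most
# recent samples. Requires at least two samples with distinct timestamps.
import math

def motion_metrics(samples):
    """samples: list of (t, x, y) tuples for one tracked contact."""
    (t0, x0, y0), (t1, x1, y1) = samples[-2], samples[-1]
    dt = t1 - t0
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
    speed = math.hypot(vx, vy)   # magnitude only
    velocity = (vx, vy)          # magnitude and direction
    accel = None
    if len(samples) >= 3:
        tp, xp, yp = samples[-3]
        vpx, vpy = (x0 - xp) / (t0 - tp), (y0 - yp) / (t0 - tp)
        accel = ((vx - vpx) / dt, (vy - vpy) / dt)
    return speed, velocity, accel
```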
The graphics module 132 includes various known software components for rendering and displaying graphics on the display system 112. Note that the term “graphics” includes any object that can be displayed to a user, including without limitation text, web pages, icons (such as user-interface objects including soft keys), digital images, videos, animations and the like.
In some embodiments, the graphics module 132 includes the icon effects module 134. The icon effects module 134 may modify a displayed position of one or more icons on the display system 112 (in conjunction with the display controller 156) based on user actions (such as detecting a contact corresponding to at least one icon). In some embodiments, the modification of the displayed icon(s) may be based on an animation sequence.
In addition to the telephone module 138, the text messaging module 140 and/or the music player module 142, the one or more applications 136 may include any applications installed on the device 100, including without limitation, a browser, the address book 148, contact list, email, instant messaging, word processing, keyboard emulation, widgets, JAVA-enabled applications, encryption, digital rights management, voice recognition, voice replication, location determination capability (such as that provided by the global positioning system (GPS)), etc.
In conjunction with the RF circuitry 108, the audio circuitry 110, the speaker 111, the microphone 113, the display system 112, the display controller 156, the click wheel 114 and/or the click wheel controller 158, the telephone module 138 may be used to enter a sequence of characters corresponding to a telephone number, access one or more telephone numbers in the address book 148, modify a telephone number that has been entered, dial a respective telephone number, conduct a conversation and disconnect or hang up when the conversation is completed.
In conjunction with the display system 112, the display controller 156, the click wheel 114 and/or the click wheel controller 158, the text messaging module 140 may be used to enter a sequence of characters corresponding to a text message, to modify previously entered characters, to transmit a respective text message (for example, using a Short Message Service or SMS protocol), to receive text messages and to view received text messages. In some embodiments, transmitted and/or received text messages may include graphics, photos, audio files, video files and/or other attachments as are supported in a Multimedia Message Service (MMS) and/or an Enhanced Messaging Service (EMS). Embodiments of user interfaces and associated processes corresponding to symbol entry, such as with the text messaging module 140, and more generally, to text entry and communication are described further below with reference to FIGS. 2-4, 6-9 and 11-20.
In conjunction with the display system 112, the display system controller 156, the click wheel 114, the click wheel controller 158, the audio circuitry 110, the speaker 111 and/or the microphone 113, the music player module 142 allows the user to play back recorded music stored in one or more files, such as MP3 or AAC files. In some embodiments, the device 100 may include the functionality of an MP3 player, such as an iPod (trademark of Apple Computer, Inc.). The device 100 may, therefore, include a 36-pin connector that is compatible with the iPod.
The timer module 144 may provide a time reference and/or time stamps for user commands received by the device 100, for example, using the click wheel 114 and the click wheel controller 158.
The word prediction module 146 may be used in conjunction with one or more of the applications 136, such as the text messaging module 140. The word prediction module 146 may suggest one or more words or symbols (such as punctuation marks, pronunciation marks or spaces) in accordance with a context. The context may be based on one or more of the lexicography models 154 (for example, grammatical and/or syntax rules associated with one or more languages) and/or a user word history 150. The context may include one or more previously entered words, characters, and/or symbols. The context may depend on which of the applications 136 is being used. For example, there may be different contexts for an email application as opposed to a word processing application. A user interface and associated process that include recommended words from the word prediction module 146 are discussed further below with reference to FIGS. 8 and 9.
The user word history 150 may include static content (such as that associated with a dictionary) and/or dynamic content (such as that associated with characters, symbols and/or words that are routinely and/or recently used by the user). The user word history 150 may include a static dictionary built up by scanning a user's address book, emails, and other documents. The user word history 150 may include weighted scores or probabilities for predicted words based on a set of characters, symbols and/or words that are provided by the user to the device 100, for example, using the display system 112, the click wheel 114 and the click wheel controller 158. The user word history 150 may also include use statistics (e.g., time of use and/or frequency of use) of one or more characters, symbols and/or words that are provided by the user. The user word history 150 is discussed further below with reference to FIGS. 10A and 10B.
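A minimal Python sketch of such a history, combining static dictionary content with dynamic use statistics, follows. The specific scoring rule (frequency plus a recency term plus a small static-dictionary bonus) is an illustrative assumption; the disclosure only requires weighted scores or probabilities.

```python
# Sketch of a user word history with static and dynamic content.
import time
from collections import defaultdict

class UserWordHistory:
    def __init__(self, static_words):
        # e.g., words scanned from a user's address book, emails, documents
        self.static_words = set(static_words)
        self.count = defaultdict(int)  # frequency of use
        self.last_used = {}            # time of use

    def record(self, word: str) -> None:
        self.count[word] += 1
        self.last_used[word] = time.time()

    def score(self, word: str, now: float) -> float:
        recency = 1.0 / (1.0 + now - self.last_used.get(word, 0.0))
        bonus = 0.1 if word in self.static_words else 0.0
        return self.count[word] + recency + bonus

    def recommend(self, prefix: str = "", n: int = 3):
        now = time.time()
        pool = set(self.count) | self.static_words
        matches = [w for w in pool if w.startswith(prefix)]
        return sorted(matches, key=lambda w: self.score(w, now), reverse=True)[:n]
```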
The character sets 152 may include one or more sets of characters corresponding to numbers, letters and/or symbols. The letters and/or symbols may correspond to one or more languages. The character sets 152 may be used by one or more of the applications 136, such as the text messaging module 140. A data structure associated with the one or more character sets (which may be used in one or more of the keyboard embodiments) is discussed further below with reference to FIG. 5.
In some embodiments, the device 100 may include one or more optional optical sensors (not shown), such as CMOS or CCD image sensors, for use in imaging applications.
In some embodiments, the device 100 is a device where operation of a predefined set of functions on the device is performed exclusively through a touch screen in the display system 112 and/or a touchpad. By using a touch screen and/or a touchpad as the primary input/control device for operation of the device 100, the number of physical input/control devices (such as push buttons, dials, and the like) on the device 100 may be reduced. In one embodiment, the device 100 includes a touch screen, a touchpad, a push button for powering the device on/off and locking the device, a volume adjustment rocker button and a slider switch for toggling ringer profiles. The push button may be used to turn the power on/off on the device by depressing the button and holding the button in the depressed state for a predefined time interval, or may be used to lock the device by depressing the button and releasing the button before the predefined time interval has elapsed. In an alternative embodiment, the device 100 also may accept verbal input for activation or deactivation of some functions through the microphone 113.
The predefined set of functions that are performed exclusively through a touch screen and/or a touchpad include navigation between user interfaces. In some embodiments, the touchpad, when touched by the user, navigates the device 100 to a main, home, or root menu from any user interface that may be displayed on the device 100. In such embodiments, the touchpad may be referred to as a “menu button.” In some other embodiments, the menu button may be a physical push button or other physical input/control device instead of a touchpad.
In some embodiments, the device 100 is a device where operation of a predefined set of functions on the device is performed exclusively or primarily through the click wheel 114. By using the click wheel 114 as the primary input/control device for operation of the device 100, the number of other physical input/control devices (such as push buttons, dials, and the like) on the device 100 may be reduced.
Attention is now directed towards embodiments of user interfaces and associated processes that may be implemented on the device 100. FIG. 2 is a schematic diagram illustrating an embodiment of a user interface for a portable electronic device 200. The device 200 includes a touch screen 208. The touch screen may display one or more trays. A tray is a region within a graphical user interface. One tray may include a user entry interface, such as a keyboard 210 that includes a plurality of icons. The icons may include one or more symbols. In this embodiment, as well as others described below, a user may select one or more of the icons, and thus, one or more of the corresponding symbols, by making contact or touching the keyboard 210, for example, with one or more fingers 212 (not drawn to scale in the figure). The contact may correspond to the one or more icons. In some embodiments, selection of one or more icons occurs when the user breaks contact with the one or more icons. In some embodiments, the contact may include a gesture, such as one or more taps, one or more swipes (from left to right, right to left, upward and/or downward) and/or a rolling of a finger (from right to left, left to right, upward and/or downward) that has made contact with the device 200. In some embodiments, inadvertent contact with an icon may not select a corresponding symbol. For example, a swipe gesture over an icon may not select a corresponding symbol when the gesture corresponding to selection is a tap.
The device 200 may include a display tray 214. The display tray 214 may display one or more of the characters and/or symbols that are selected by the user. The device 200 may also include one or more physical buttons, such as the clear, hold and menu buttons shown in FIG. 2. As described previously, the menu button may be used to navigate within a hierarchy of applications that may be executed on the device 200. Alternatively, in some embodiments, the clear, hold, and/or menu buttons are implemented as soft keys in a GUI in the touch screen 208.
FIGS. 3A-3C are schematic diagrams illustrating an embodiment of a user interface for a portable electronic device 300. The user interface includes a keyboard 310 that includes a plurality of icons. The icons include three symbols each. In other embodiments, the icons include two symbols each. In other embodiments, different icons on the same keyboard may include one, two, or three symbols each (e.g., some icons may contain one symbol while other icons contain two or three symbols). The symbols on the icons are in a non-standard configuration, i.e., non-qwerty. In addition, the total number of icons in the keyboard 310 is less than the number of physical keys in a standard keyboard.
The symbols in the icons in the keyboard 310 may be determined using a lexicography model, such as a language. The lexicography model may include a frequency of use of symbols in a language. For example, characters or symbols that are unlikely to occur immediately proximate to one another or immediately after one another in a set of symbols that the user may enter may be grouped on a respective icon 312 (FIG. 3B). A language may include slang as well as individual usage (for example, words that are commonly used by the user). The lexicography model may correspond to a user usage or word history that occurs prior to the user making contact with the device 300, i.e., a past usage.
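One way to realize such a grouping is sketched below in Python: symbols are placed on the same icon only when the lexicography model says each rarely follows the other. The greedy strategy, the threshold, and the `follow_prob` interface are assumptions for illustration; the disclosure does not specify a particular grouping algorithm.

```python
# Greedy grouping of symbols onto icons so that symbols sharing an icon
# rarely occur immediately after one another (per the lexicography model).
def group_symbols(symbols, follow_prob, max_per_icon=3, threshold=0.01):
    """follow_prob(a, b): modeled probability that b immediately follows a."""
    icons, pool = [], list(symbols)
    while pool:
        group = [pool.pop(0)]
        for cand in pool[:]:
            compatible = all(follow_prob(g, cand) < threshold and
                             follow_prob(cand, g) < threshold for g in group)
            if compatible and len(group) < max_per_icon:
                group.append(cand)
                pool.remove(cand)
        icons.append(group)
    return icons
```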
As shown in FIG. 3B, when a user makes contact 314 with the touch screen 208 in the device 300 corresponding to the respective icon 312 and a respective symbol (in this case the letter ‘a’), the shape of the respective icon 312 is modified. This provides information to the user as to the icon and symbol to which the contact 314 currently corresponds. This may be useful since the contact 314 may obscure at least a portion of the respective icon 312, making it difficult for the user to see the respective symbol on which he or she is currently positioned.
In an exemplary embodiment, the icons in the keyboard 310 may at least in part include an arc. In response to the contact 314, the shape of the respective icon 312 may be asymmetrically distorted, and the respective symbol that the contact 314 currently corresponds to may be displayed within the shape of the respective icon 312 and outside of the contact 314.
In some embodiments, the user may select the respective symbol by making the contact 314 with the respective icon 312 and rolling a finger over a region within the respective icon 312 that corresponds to the respective symbol. If the user determines, based on the modified shape of the respective icon 312 and/or the displayed symbol within the modified shape, that the wrong symbol is currently contacted, the user may roll their finger to a different position within the respective icon 312 that corresponds to the correct symbol. Once the contact 314 has been positioned over or proximate to the correct symbol, the user may select this symbol by breaking the contact 314 with the respective icon 312. The selected symbol (such as the letter ‘a’) may then be displayed in the display tray 214. In some embodiments, if the contact 314 is maintained by the user for a time interval that is more than a first pre-determined value, such as 0.5, 1 or 2 s, before the contact 314 is broken, the respective symbol may be capitalized.
If an error has been made, the user may clear the entire display tray 214 using a clear icon or may delete a most recently selected symbol using a delete icon. Once a set of symbols (such as a message) has been entered, the user may accept the set of symbols (which may store and/or send the set of symbols, depending on the application executing on the device 300) using an accept icon.
As shown in FIG. 3C, in some embodiments an additional visual indicator corresponding to the respective icon 312 may be provided on the display 208. The visual indicator may be proximate to the respective icon 312. The visual indicator may include a band 318 around at least a portion of the respective icon 312.
As is also shown in FIG. 3C, in some embodiments a shape of the respective icon 312 may not be modified in response to the contact 314. Instead, an icon 316 corresponding to the respective symbol may be displayed proximate to the respective icon 312.
The modifying of the shape of the respective icon 312 and/or the displaying of the visual indicator, such as the band 318 and/or the icon 316, may be included in at least some of the embodiments discussed further below.
While the device 300 has been illustrated with certain components and a particular arrangement of these components, it should be understood that there may be fewer or more components, two or more components may be combined, and positions of one or more components may be changed. For example, the keyboard 310 may include fewer or additional icons. In some embodiments, a different character set and/or different groups of symbols may be used on the icons in the keyboard 310.
FIG. 4 is a flow diagram of an embodiment of a symbol entry process 400. While the symbol entry process 400 described below includes a number of operations that appear to occur in a specific order, it should be apparent that the process 400 can include more or fewer operations, which can be executed serially or in parallel (e.g., using parallel processors or a multi-threading environment), an order of two or more operations may be changed and/or two or more operations may be combined into a single operation.
A plurality of icons may be displayed on a touch-sensitive display (410). A respective icon may correspond to two or more symbols. Contact by a user with the display that corresponds to the respective icon may be detected (412). The displayed respective icon may be modified to indicate that the contact corresponds to a respective symbol in the two or more symbols (414). The respective symbol may be optionally displayed in a region within the shape of the respective icon and outside of a region corresponding to the contact (416). A visual indicator corresponding to the respective symbol may be optionally provided (418). The respective symbol may be optionally capitalized when contact is maintained for a time interval exceeding a pre-determined value (420). The respective symbol may be selected when the user breaks contact with the respective icon (422).
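Expressed as an event handler, process 400 might look like the Python sketch below. The keyboard and event methods (`hit_test`, `distort`, and so on) are hypothetical names introduced only to show the control flow; the 1-second capitalization threshold is likewise an assumption, as the text only requires "a pre-determined value".

```python
# Skeleton of symbol entry process 400 as a touch-event handler.
# Parenthesized numbers refer to the operations listed above.
CAPITALIZE_AFTER = 1.0  # seconds (assumed pre-determined value)

def on_touch(keyboard, event):
    icon = keyboard.hit_test(event.x, event.y)          # (412) detect contact
    if icon is None:
        return None
    symbol = icon.symbol_at(event.x, event.y)           # determine respective symbol
    keyboard.distort(icon, toward=symbol)               # (414) modify displayed icon
    keyboard.show_symbol_outside_contact(icon, symbol)  # (416) optional display
    keyboard.show_indicator(icon, symbol)               # (418) optional visual indicator
    if event.type == "up":                              # (422) select on break of contact
        if event.duration > CAPITALIZE_AFTER:           # (420) capitalize on long hold
            symbol = symbol.upper()
        return symbol
    return None  # contact maintained; nothing selected yet
```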
Attention is now directed towards embodiments of a character set data structure that may be used in implementing the user interface in the device 300 (FIG. 3) and/or user interfaces described further below. FIG. 5 is a block diagram illustrating an embodiment of a character set data structure 500. The character sets 152 may include multiple sets 512 of characters and/or symbols. A respective set, such as the set 512-1, may include one or more symbols 514 and one or more probabilities 516. The probabilities may include frequencies of use, as well as conditional probabilities (such as the probability of a given symbol occurring given one or more symbols that have already occurred). In some embodiments the character set data structure 500 may include fewer or more components. Two or more components may be combined and an order of two or more components may be changed.
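A minimal Python sketch of such a structure follows; the field names and the example probability values are illustrative assumptions, not data from the disclosure.

```python
# Sketch of character set data structure 500: symbols with unconditional
# frequencies and conditional (symbol-given-previous-symbol) probabilities.
from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class CharacterSet:
    frequency: Dict[str, float] = field(default_factory=dict)  # P(symbol)
    conditional: Dict[str, Dict[str, float]] = field(default_factory=dict)  # P(next | prev)

    def prob(self, symbol: str, prev: Optional[str] = None) -> float:
        if prev is not None and prev in self.conditional:
            return self.conditional[prev].get(symbol, 0.0)
        return self.frequency.get(symbol, 0.0)

# Example (values assumed): 'h' is far more likely after 't' than 'q' is.
english = CharacterSet(
    frequency={"e": 0.127, "t": 0.091, "q": 0.001},
    conditional={"t": {"h": 0.35, "q": 0.0001}},
)
```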
Attention is now directed towards additional embodiments of user interfaces and associated processes that may be implemented on the device 100 (FIG. 1). FIGS. 6A-6D are schematic diagrams illustrating an embodiment of a user interface for a portable electronic device 600. The device 600 includes a keyboard 610 that has a plurality of icons arranged in rows. A given row includes a subset of the plurality of icons. Adjacent rows are separated by a space greater than a second pre-determined value, such as a height of one of the icons.
As shown in FIG. 6B, when the user makes a contact 612 with the display 208 corresponding to a respective icon in the keyboard 610, an icon 614 may be displayed in the space between two adjacent rows. The icon may correspond to a respective symbol that corresponds to the respective icon that the user has contacted 612. For example, if the user contacts or is proximate to an icon for the character ‘u’ in the keyboard 610, the icon 614 may correspond to the character ‘u’. In this way, the user may receive feedback that the respective icon (and thus, the respective symbol) is currently contacted. This may be useful because the contact 612 may obscure the respective icon, and thus, the respective symbol, that has been selected in the rows of icons.
In some embodiments, the icon 614 may be displayed above a respective row in which the contact 612 has occurred. In some embodiments, the icon 614 may be magnified, i.e., larger than the respective icon.
The icon 614 may be displayed while the contact 612 is maintained. When the user breaks the contact 612 with the respective icon, the respective symbol may be selected. In some embodiments, the respective symbol may be displayed in the display tray 214.
As shown in FIG. 6C, in some embodiments a keyboard 616 may be displayed with rows of icons. Initially, the rows of icons may not include a significant space between adjacent rows, e.g., the space may be less than the second pre-determined value. When the user makes the contact 612 with the display 208, however, the displayed keyboard 616 may be modified to include a space greater than the second pre-determined value, and the icon 614 may be displayed. This modified configuration or layout of the keyboard 616 may be maintained while the contact 612 is maintained by the user.
As shown in FIG. 6D, in some embodiments a keyboard 618 may include rows of icons. When the contact 612 is made, an icon 620 may be displayed superimposed over one or more additional icons in the keyboard 618.
While the device 600 has been illustrated with certain components and a particular arrangement of these components, it should be understood that there may be fewer or more components, two or more components may be combined, and positions of one or more components may be changed. For example, the keyboards 610, 616 and/or 618 may include fewer or additional icons. In some embodiments, a different character set and/or different groups of symbols may be used on the icons in the keyboards 610, 616 and/or 618.
FIG. 7 is a flow diagram of an embodiment of a symbol entry process 700. While the symbol entry process 700 described below includes a number of operations that appear to occur in a specific order, it should be apparent that the process 700 can include more or fewer operations, which can be executed serially or in parallel (e.g., using parallel processors or a multi-threading environment), an order of two or more operations may be changed and/or two or more operations may be combined into a single operation.
A plurality of icons may be displayed on a touch-sensitive display (710). Two or more subsets of the plurality of icons may be arranged in rows. A contact by a user with the display that corresponds to a respective icon may be detected (712). A symbol corresponding to the respective icon may be optionally displayed between a row corresponding to the respective icon and a neighboring row (714). A symbol corresponding to the respective icon may be optionally displayed superimposed over one or more additional icons in the plurality of icons (716).
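The geometry of the inter-row popup in process 700 can be sketched in a few lines of Python. Everything here (the magnification factor, the centering rule, and the keyboard method names) is an assumption introduced for illustration; the disclosure only requires that the symbol appear in the space between adjacent rows, magnified in some embodiments.

```python
# Sketch of the inter-row popup of process 700: while contact is held,
# a magnified copy of the touched icon's symbol is drawn in the gap
# between the icon's row and the row above it.
def popup_position(icon, row_top, row_gap, magnify=1.5):
    """Center a magnified symbol in the space above the icon's row."""
    w, h = icon.w * magnify, icon.h * magnify
    x = icon.x + icon.w / 2 - w / 2    # centered over the touched icon
    y = row_top - row_gap / 2 - h / 2  # midway into the gap to the row above
    return x, y, w, h

def on_contact(keyboard, icon):
    # FIG. 6C case: a layout without gaps opens one while contact is held.
    if keyboard.row_gap < keyboard.min_gap:
        keyboard.open_gap_above(icon.row)
    keyboard.draw_popup(icon.symbol, *popup_position(
        icon, keyboard.row_top(icon.row), keyboard.row_gap))
```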
FIG. 8 is a schematic diagram illustrating an embodiment of a user interface for a portable electronic device 800. The device 800 may include a tray 812 that includes one or more recommended words 810. The one or more recommended words 810 may be determined using a user word history. This is discussed further below with reference to FIGS. 10A and 10B.
In some embodiments, the one or more recommended words 810 are displayed prior to detecting any contacts corresponding to text input (symbol selection) by the user in a current application session. For example, the one or more recommended words 810 may be displayed when the user initially opens an application, such as email, on the device 800. The one or more recommended words 810, therefore, may be determined based on a user word or usage history that may be application specific. After the device 800 receives contacts corresponding to text input, the one or more recommended words 810 may change dynamically in response to contacts corresponding to text input by the user during the application session.
The user may select one or more of the recommended words 810 by making contact with the display 208. In some embodiments, one or more of the recommended words 810, such as a phrase (“How are you?”), may be selected with a single contact. The contact may include a gesture, such as one or more taps, one or more swipes, and/or a rolling motion of a finger that makes the contact. The one or more taps may have a duration that is less than a third pre-determined value, such as 0.1, 0.5 or 1 s.
While the device 800 has been illustrated with certain components and a particular arrangement of these components, it should be understood that there may be fewer or more components, two or more components may be combined, and positions of one or more components may be changed. For example, the keyboard 210 may include fewer or additional icons. In some embodiments, a different character set and/or different groups of symbols may be used on the icons in the keyboard 210.
FIG. 9 is a flow diagram of an embodiment of a symbol entry process 900. While the symbol entry process 900 described below includes a number of operations that appear to occur in a specific order, it should be apparent that the process 900 can include more or fewer operations, which can be executed serially or in parallel (e.g., using parallel processors or a multi-threading environment), that an order of two or more operations may be changed, and/or that two or more operations may be combined into a single operation.
A plurality of icons may be displayed on a touch-sensitive display (910). A respective icon may correspond to at least one symbol. One or more recommended words may be displayed (912). The one or more recommended words may be determined in accordance with a user word history, prior to detecting any contacts corresponding to text input (symbol selection) by the user in a current application session. A contact by the user with the display may be detected (914). The contact may include a gesture. A respective recommended word that corresponds to the gesture may be selected (916).
Attention is now directed towards embodiments of data structure systems that may be implemented in the device 100 (FIG. 1). FIG. 10A is a block diagram illustrating an embodiment of a user word history data structure 1000. The user word history 150 may include a deleted word stack 1010 and multiple words 1016. The words 1016 may include one or more characters and/or one or more symbols. The deleted word stack 1010 includes one or more words 1014 in a sequential order in which the one or more words 1014 were deleted by the user in an application, such as the text messaging module 140 (FIG. 1).
A respective word in the words 1016, such as word 1016-M, may include multiple records. A respective record may include a time-weighted score 1018, use statistics 1020 (such as a time of use and/or a frequency of use), a context 1022 and one or more applications 1024. The time-weighted score 1018 may indicate a probability that the word 1016-M is a next predicted word based on the context 1022 (one or more characters, symbols and/or words that have previously been provided by the user) and/or the application 1024. The time-weighted score 1018 may therefore be different for email than for the text messaging module 140 (FIG. 1). The time-weighted score 1018 may be computed to favorably weight (e.g., give a higher probability to) words that have been used recently. For example, the time-weighted score 1018 may give favorable weighting to words 1016 that were used within the last 24 hours or week. Words 1016 used on longer time scales (e.g., more than a day or a week ago) may have their corresponding time-weighted scores 1018 reduced by a pre-determined ratio (such as 0.9) for each additional time interval (e.g., each day or week) since the words 1016 were last used.
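By way of illustration only, the following Python sketch shows one way the time-weighted decay described above might be computed; the function signature is hypothetical, while the decay ratio (0.9) and the per-day interval follow the examples in the text.

    def time_weighted_score(base_score: float, days_since_last_use: float,
                            decay_ratio: float = 0.9, interval_days: float = 1.0) -> float:
        """Reduce a word's score by decay_ratio for each full interval since its last use."""
        elapsed_intervals = int(days_since_last_use // interval_days)
        return base_score * (decay_ratio ** elapsed_intervals)

    # Example: a word last used three days ago retains 0.9**3, or about 73%,
    # of its base score; a word used today retains its full score.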
The user history data structure 1000 may include static information (for example, corresponding to a dictionary and/or grammatical and syntax rules for one or more languages) as well as dynamic information (based on recent usage statistics and/or patterns). Thus, the user history data structure 1000 may be dynamically updated continuously, after pre-determined time intervals, or when a new word or syntax is employed by the user. The user history data structure 1000 may include a static dictionary built up by scanning a user's address book, emails, and other documents. In some embodiments, the user history data structure 1000 may include fewer or more components. Two or more components may be combined and an order of two or more components may be changed.
FIG. 10B is a block diagram illustrating an embodiment of a language data structure system 1050. The language data structure system 1050 may be used to provide recommended words in the device 800 (FIG. 8). A sequence of symbols 1062 (including one or more characters, symbols and/or words) may be provided by the user. A set of symbols 1062 corresponding to a context 1022-1 may be processed by a context map 1060. In some embodiments, the context 1022-1 may be a null set, i.e., one or more recommended words are provided before the user provides any symbols 1062 (e.g., when an application is first opened). In other embodiments, the context 1022-1 may include one or more previously entered or provided words as well as one or more symbols, such as the first one, two or three letters in a current word that the user is providing. The context map 1060 may include a select and hashing module 1064 and a hash map 1066. The hash map 1066 may select one or more appropriate entries in an application-specific dictionary 1068. The entries in the application-specific dictionary 1068 may include contexts 1070, predicted words 1072, and time-weighted scores 1074. The application-specific dictionary 1068 may utilize the records in the user history data structure 1000. As a consequence, the application-specific dictionary 1068 may be dynamically updated continuously, after pre-determined time intervals, or when a new word or syntax is employed by the user.
The language data structure system 1050 may be used to provide one or more recommended words based on the context 1022-1. The context map 1060 may find the top-5 or top-10 best context 1070 matches. The corresponding predicted words 1072 may be recommended to the user in accordance with the time-weighted scores 1074. In some embodiments, only a subset of the predicted words 1072 corresponding to the best context 1070 matches may be presented to the user (e.g., just the top-1, top-2, or top-3 predicted words).
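By way of illustration only, the following Python sketch shows one way a hash-map lookup over an application-specific dictionary, followed by score-ordered recommendation, might work; the dictionary contents and names are hypothetical stand-ins for the structures described above.

    # Application-specific dictionary: context -> list of (predicted word, time-weighted score).
    CONTEXT_DICT: dict[str, list[tuple[str, float]]] = {
        "how are": [("you?", 0.92), ("they", 0.31)],
        "see you": [("soon", 0.88), ("later", 0.85), ("there", 0.40)],
    }

    def recommend(context: str, top_n: int = 3) -> list[str]:
        """Return up to top_n predicted words for a context, best score first."""
        candidates = CONTEXT_DICT.get(context.lower().strip(), [])
        ranked = sorted(candidates, key=lambda pair: pair[1], reverse=True)
        return [word for word, _score in ranked[:top_n]]

    # Example: recommend("see you") -> ['soon', 'later', 'there']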
In some embodiments, the language data structure system 1050 may provide one or more recommended words in accordance with a state machine (corresponding to a Markov sequence or process) that corresponds to a language. For example, the application-specific dictionary 1068 may be based on a stochastic model of the relationships among letters, characters, symbols and/or words in a language.
A path memory (such as up to three characters in a word that is currently being entered and/or two or three previously entered words) of the probabilistic model represents a tradeoff between accuracy and the processing and power capabilities (for example, battery life) of the portable electronic device 100 (FIG. 1). In some embodiments, such a probabilistic model may be based on a lexicography and usage that is user-specific and/or, as discussed previously, even application-specific. For example, a user's emails, address book and/or other documents may be analyzed to determine an appropriate probabilistic model for that user based on the syntax and/or lexicography (including names and slang) that are employed by the user. The probabilistic model may be updated continuously, after pre-determined time intervals, or when a new word or syntax is employed by the user.
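By way of illustration only, the following Python sketch shows a fixed-order Markov (n-gram) model with the short path memory described above (here, the two previously entered words); the class name and training data are hypothetical.

    from collections import Counter, defaultdict

    class BigramContextModel:
        """Predict a next word from the previous two words (the 'path memory')."""
        def __init__(self) -> None:
            self.counts: dict[tuple[str, str], Counter] = defaultdict(Counter)

        def train(self, words: list[str]) -> None:
            # Count each observed (word[i-2], word[i-1]) -> word[i] transition.
            for a, b, nxt in zip(words, words[1:], words[2:]):
                self.counts[(a, b)][nxt] += 1

        def predict(self, prev2: str, prev1: str, top_n: int = 3) -> list[str]:
            return [w for w, _ in self.counts[(prev2, prev1)].most_common(top_n)]

    # model = BigramContextModel()
    # model.train("how are you how are they".split())
    # model.predict("how", "are")  # -> ['you', 'they']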
In some embodiments, the probabilistic model may be based on one or more mistakes made by the user when using the click wheel 114 (FIG. 1) and/or a touch-sensitive display in the display system 112 (FIG. 1). For example, if the user accidentally selects the wrong icon when typing a respective word, the probabilistic model may be updated to account for such errors in the future. In an exemplary embodiment, a mistake may be determined based on a user activation of an icon corresponding to the delete function. This adaptability of the portable electronic device 100 (FIG. 1) may allow correction of user interface errors (such as those caused by parallax and/or left-right symmetry) associated with which finger(s) the user is using and how the user is holding the portable electronic device 100 (FIG. 1) while using it. This functionality is discussed further below with reference to FIG. 14.
In some embodiments, the language data structure system 1050 may include fewer or more components. Two or more components may be combined and an order of two or more components may be changed.
Attention is now directed towards additional embodiments of user interfaces and associated processes that may be implemented on the device 100 (FIG. 1). FIG. 11A is a flow diagram of an embodiment of a symbol entry process 1100. While the symbol entry process 1100 described below includes a number of operations that appear to occur in a specific order, it should be apparent that the process 1100 can include more or fewer operations, which can be executed serially or in parallel (e.g., using parallel processors or a multi-threading environment), that an order of two or more operations may be changed, and/or that two or more operations may be combined into a single operation.
A plurality of icons may be displayed on a touch-sensitive display (1110). A respective icon may correspond to two or more symbols. A contact by a user with the display that corresponds to selection of the respective icon may be detected (1112). The symbol in the two or more symbols to which the contact further corresponds may be determined (1114).
FIG. 11B is a flow diagram of an embodiment of a symbol entry process 1130. While the symbol entry process 1130 described below includes a number of operations that appear to occur in a specific order, it should be apparent that the process 1130 can include more or fewer operations, which can be executed serially or in parallel (e.g., using parallel processors or a multi-threading environment), that an order of two or more operations may be changed, and/or that two or more operations may be combined into a single operation.
A plurality of icons may be displayed on a touch-sensitive display (1132). A respective icon may correspond to two or more symbols. A first symbol may belong to a first subset of symbols and a second symbol may belong to a second subset of symbols. The first symbol may have a probability of occurrence greater than that of the second symbol. A contact by a user with the display that corresponds to selection of the respective icon may be detected (1134). The symbol in the two or more symbols to which the contact further corresponds may be determined (1136).
FIG. 11C is a flow diagram of an embodiment of a symbol entry process 1150. While the symbol entry process 1150 described below includes a number of operations that appear to occur in a specific order, it should be apparent that the process 1150 can include more or fewer operations, which can be executed serially or in parallel (e.g., using parallel processors or a multi-threading environment), that an order of two or more operations may be changed, and/or that two or more operations may be combined into a single operation.
A plurality of icons may be displayed on a touch-sensitive display (1152). A respective icon may correspond to two or more symbols. A first symbol may belong to a first subset of symbols and a second symbol may belong to a second subset of symbols. The second symbol may have a probability of occurrence immediately following the first symbol that is less than a pre-determined value. A contact by a user with the display that corresponds to selection of the respective icon may be detected (1154). The symbol in the two or more symbols to which the contact further corresponds may be determined (1156).
FIGS. 12A-12G are schematic diagrams illustrating embodiments of a user interface for a portable electronic device 1200. These embodiments may utilize the symbol entry processes 1100 (FIG. 11A), 1130 (FIG. 11B) and/or 1150 (FIG. 11C) described previously. As shown in FIG. 12A, the device 1200 may include a keyboard 1210 with a plurality of icons. A respective icon may include two or more symbols. A first symbol for a respective icon may be selected by the user using a first gesture. A second symbol for the respective icon may be selected by the user using a second gesture. The first gesture may include a continuous contact with the display 208 and the second gesture may include a discontinuous contact with the display 208.
The continuous contact may include a swipe and/or a rolling motion of the contact. The discontinuous contact may include one or more consecutive taps. A respective tap may include contact with the display 208 for a time interval that is less than a fourth pre-determined value, such as 0.1, 0.5 or 1 s. In some embodiments, two or more consecutive taps may correspond to a second symbol if a time interval between the two or more consecutive taps is less than a fifth pre-determined value, such as 0.1, 0.5 or 1 s.
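By way of illustration only, the following Python sketch shows one way the tap/swipe/consecutive-tap distinction described above might be classified; the threshold values use examples from the text (e.g., 0.5 s), and the function signature is hypothetical.

    TAP_MAX_DURATION = 0.5     # s; a contact shorter than this is a tap
    MULTI_TAP_MAX_GAP = 0.5    # s; taps closer together than this form one gesture
    SWIPE_MIN_DISTANCE = 10.0  # points; movement beyond this is a swipe

    def classify_contact(duration_s: float, distance_pts: float,
                         gap_since_last_tap_s: float | None) -> str:
        if distance_pts >= SWIPE_MIN_DISTANCE:
            return "swipe"
        if duration_s < TAP_MAX_DURATION:
            if gap_since_last_tap_s is not None and gap_since_last_tap_s < MULTI_TAP_MAX_GAP:
                return "consecutive tap"  # part of a multi-tap gesture
            return "tap"
        return "hold"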
In some embodiments, the first symbol is in a first subset of the symbols in the character set displayed in the keyboard 1210 and the second symbol is in a second subset of the symbols in the character set displayed in the keyboard 1210. The first subset may have a probability of occurrence that is greater than a sixth pre-determined value and the second subset may have a probability of occurrence that is less than the sixth pre-determined value. Thus, the first subset may include symbols that are more likely to occur, for example, in a language (using a lexicography model) and/or based on a user history. The gesture used to select the first symbol may, therefore, be easier or quicker for the user to make. For example, the first gesture may be a tap gesture and the second gesture may be a swipe gesture. This is illustrated in FIG. 12A. The gestures needed to select corresponding symbols for a respective icon may be indicated on the icon. For example, a dot on the icon may correspond to a tap and a horizontal line on the icon may correspond to a dash (swipe). This ‘tap-dash’ embodiment is an example of a two-gesture keyboard. Additional examples are discussed below.
In some embodiments, the first symbol may have a probability of occurrence immediately after the second symbol that is less than a seventh pre-determined value. In some embodiments, the second symbol may have a probability of occurrence immediately after the first symbol that is less than the seventh pre-determined value. This arrangement or grouping of the symbols displayed on the icons may reduce errors when using the keyboard 1210 because the user will be less likely to make the first gesture for the first symbol corresponding to a respective icon and then make the second gesture for the second symbol corresponding to the respective icon (or vice versa). Gestures for different symbols on the respective icon may, therefore, be separated by a time interval that is large enough to reduce a likelihood of inadvertently selecting a respective symbol using consecutive gestures for symbols corresponding to the respective icon.
FIGS. 12B-12G illustrate additional multi-gesture keyboards. For the keyboards 1212, 1214, 1216, 1218, 1220 and 1222, a first symbol for a respective icon may be selected with a first gesture (for example, a single tap) and a second symbol for the respective icon may be selected using a second gesture (for example, two consecutive taps). The keyboard 1222 in FIG. 12G includes some icons that correspond to more than two symbols. These symbols may be selected by making additional gestures, such as three consecutive taps. In some embodiments, a second or third symbol for the respective icon may be selected by the user by first contacting a meta key, such as a shift key, and then contacting and/or breaking contact with the respective icon.
While the device 1200 has been illustrated with certain components and a particular arrangement of these components, it should be understood that there may be fewer or more components, two or more components may be combined, and positions of one or more components may be changed. For example, the keyboards 1210, 1212, 1214, 1216, 1218, 1220 and/or 1222 may include fewer or additional icons. In some embodiments, a different character set and/or different groups of symbols may be used on the icons in the keyboards 1210, 1212, 1214, 1216, 1218, 1220 and/or 1222.
In some embodiments, the user selects symbols by breaking a contact with one or more icons on the display 208. In other embodiments, however, the user may select one or more symbols without breaking contact with the display 208. For example, the user may pause or maintain contact over the respective icon for a time interval longer than an eighth pre-determined value (such as 0.1, 0.5 or 1 s) before moving on to the next icon and corresponding symbol. In the process, the user may maintain contact with the display. In other embodiments, selection of the respective icon and corresponding symbol may occur by increasing a contact pressure with the display 208 while maintaining the contact with the display.
A flow chart for a symbol entry process 1300 corresponding to embodiments where contact is not broken is shown in FIG. 13. While the symbol entry process 1300 described below includes a number of operations that appear to occur in a specific order, it should be apparent that the process 1300 can include more or fewer operations, which can be executed serially or in parallel (e.g., using parallel processors or a multi-threading environment), that an order of two or more operations may be changed, and/or that two or more operations may be combined into a single operation.
A plurality of icons may be displayed on a touch-sensitive display (1310). A respective icon may correspond to at least one symbol. A contact by a user with the display may be detected (1312). Positions of the contact corresponding to a sequence of icons may be determined (1314). The at least one symbol may be selected when a respective position of the contact corresponds to the respective icon for a time interval exceeding a pre-determined value (1316).
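By way of illustration only, the following Python sketch shows one way the dwell-based selection of process 1300 might work: while the finger stays down, an icon is committed once the contact rests over it longer than a threshold. The sample format and names are hypothetical; the threshold uses one of the example values from the text.

    DWELL_THRESHOLD = 0.5  # s

    def select_by_dwell(samples: list[tuple[float, str]]) -> list[str]:
        """samples: (timestamp, icon under the contact), while contact is maintained."""
        selected: list[str] = []
        current_icon, entered_at, committed = None, 0.0, False
        for t, icon in samples:
            if icon != current_icon:
                current_icon, entered_at, committed = icon, t, False
            elif not committed and t - entered_at >= DWELL_THRESHOLD:
                selected.append(icon)  # dwelled long enough: select this symbol
                committed = True
        return selected

    # Example: select_by_dwell([(0.0, 'h'), (0.6, 'h'), (0.7, 'i'), (1.3, 'i')]) -> ['h', 'i']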
As discussed previously, the user may make errors when using a touch screen in the display system 112 (FIG. 1). The device 100 (FIG. 1) may, therefore, adapt an offset between an estimated contact and an actual contact in accordance with such errors. Feedback may be provided by the user activating an icon corresponding to a delete key. The offset may be applied to one or more icons. In some embodiments, there may be more than one offset, and a respective offset may be applied to a respective subset that includes one or more icons in a plurality of the icons in a keyboard or other user interface. The adaptation may occur continuously, after a pre-determined time interval and/or if an excessive number of user errors occur (e.g., as evidenced by a frequency of use of the delete icon). The adaptation may occur during a normal mode of operation of the device 100 (FIG. 1), rather than requiring the user to implement a separate keyboard training/adaptation mode.
A flow chart for a symbol entry process 1400 corresponding to such embodiments is shown in FIG. 14. While the symbol entry process 1400 described below includes a number of operations that appear to occur in a specific order, it should be apparent that the process 1400 can include more or fewer operations, which can be executed serially or in parallel (e.g., using parallel processors or a multi-threading environment), that an order of two or more operations may be changed, and/or that two or more operations may be combined into a single operation.
A plurality of icons may be displayed on a touch-sensitive display (1410). A respective icon may correspond to at least one symbol. A contact by a user with the display may be detected (1412). An estimated contact that corresponds to the respective icon and the at least one symbol may be determined in accordance with the actual contact and a pre-determined offset (1414). One or more corrections for one or more errors in one or more selected symbols may be received (1416). The offset for at least the respective icon may be modified in accordance with the one or more received corrections (1418).
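By way of illustration only, the following Python sketch shows one way the offset adaptation of process 1400 might be realized: the estimated contact is the actual contact plus a per-icon offset, and each received correction nudges that offset. The learning-rate value and all names are hypothetical.

    LEARNING_RATE = 0.2  # fraction of each observed error folded into the offset

    class OffsetAdapter:
        def __init__(self) -> None:
            self.offsets: dict[str, tuple[float, float]] = {}  # per-icon (dx, dy)

        def estimate(self, icon: str, actual: tuple[float, float]) -> tuple[float, float]:
            dx, dy = self.offsets.get(icon, (0.0, 0.0))
            return (actual[0] + dx, actual[1] + dy)

        def correct(self, icon: str, error: tuple[float, float]) -> None:
            """error: vector from the estimated contact to the intended icon center,
            e.g., inferred after the user activates the delete icon."""
            dx, dy = self.offsets.get(icon, (0.0, 0.0))
            self.offsets[icon] = (dx + LEARNING_RATE * error[0],
                                  dy + LEARNING_RATE * error[1])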
FIG. 15 is a schematic diagram illustrating an embodiment of a user interface for a portable electronic device 1500. The device 1500 includes a keyboard 1510 with a plurality of icons. Different spacings (“guard bands”) are used between the icons. The guard bands between icons visually encourage a user to touch the center of an adjacent icon, although user contact in the guard band region may also activate the icon nearest to the contact. In some embodiments, icons near the center of the display 208 may have a smaller guard band between adjacent icons than icons near an edge of the display. This may reduce errors when using the display 208 if it is easier for a user to select or contact a respective icon near the center of the display 208. In these embodiments, the guard band near the edge of the display 208 is larger than that near the center of the display 208. Conversely, in some embodiments (opposite to what is shown in FIG. 15), icons near the center of the display 208 may have a larger guard band between adjacent icons than icons near an edge of the display. This may reduce errors when using the display 208 if it is easier for a user to select or contact a respective icon near the edge of the display 208. In these embodiments, the guard band near the edge of the display 208 is smaller than that near the center of the display 208. In some embodiments, icons near the center of the display 208 may be larger than icons near the edge of the display 208. In some embodiments, icons at the edge of the display are about half the size of the other icons because it is easier to identify contacts corresponding to edge icons.
In some embodiments, either the size of the icons or the size of the guard bands between icons could incrementally vary between the edge of the display and the center of the display (e.g., from small icons at the edge to large icons in the center or from small guard bands at the edge to large guard bands in the center).
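By way of illustration only, the following Python sketch shows one way a size (icon width or guard band) might be varied incrementally with distance from the display's center, as described above; the endpoint sizes are hypothetical.

    def interpolated_size(distance_from_center: float, max_distance: float,
                          size_at_center: float, size_at_edge: float) -> float:
        """Linearly vary a size from the center of the display to its edge."""
        t = min(max(distance_from_center / max_distance, 0.0), 1.0)
        return size_at_center + t * (size_at_edge - size_at_center)

    # Example (large icons in the center, smaller at the edge):
    # interpolated_size(0, 160, 44, 22)   -> 44.0
    # interpolated_size(160, 160, 44, 22) -> 22.0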
A flow chart for a symbol entry process 1600 corresponding to such embodiments is shown in FIG. 16. While the symbol entry process 1600 described below includes a number of operations that appear to occur in a specific order, it should be apparent that the process 1600 can include more or fewer operations, which can be executed serially or in parallel (e.g., using parallel processors or a multi-threading environment), that an order of two or more operations may be changed, and/or that two or more operations may be combined into a single operation.
A plurality of icons may be displayed on a touch-sensitive display (1610). The plurality of icons may be arranged in rows in a first dimension of the display. A first guard band in the first dimension between adjacent icons in a first subset of the icons may be greater than a pre-determined value and a second guard band in the first dimension between adjacent icons in a second subset of the icons may be less than the pre-determined value. A contact by the user with the display that corresponds to selection of the respective icon may be detected (1612). A symbol corresponding to the respective icon may be displayed (1614).
FIG. 17 is a schematic diagram illustrating an embodiment of a user interface for a portable electronic device 1700. The device 1700 includes a keyboard 1710 that has a plurality of icons. A respective icon corresponds to two or more symbols. Some symbols may be selected by contacting two or more icons simultaneously. A respective symbol that is selected may be displayed in the display tray 214. For example, the letter ‘e’ may be selected by contacting and breaking contact with the first icon in the first row. The letter ‘l’ may be selected by contacting and breaking contact with the first and the second icons in the first row. The icons include visual information indicating the combinations of contacts with icons (also referred to as chords) that correspond to given symbols. The keyboard 1710 is sometimes referred to as a hop-scotch keyboard.
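By way of illustration only, the following Python sketch shows one way the chorded selection described above might be resolved: a set of simultaneously contacted icons maps to one symbol. The icon identifiers are hypothetical; the two chord entries follow the ‘e’ and ‘l’ examples in the text.

    CHORDS: dict[frozenset[str], str] = {
        frozenset({"icon_1_1"}): "e",              # first icon in the first row alone
        frozenset({"icon_1_1", "icon_1_2"}): "l",  # first and second icons together
    }

    def chord_symbol(contacted_icons: set[str]) -> str | None:
        """Return the symbol for the set of simultaneously contacted icons, if any."""
        return CHORDS.get(frozenset(contacted_icons))

    # Example: chord_symbol({"icon_1_1", "icon_1_2"}) -> 'l'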
A flow chart for a symbol entry process 1800 corresponding to such embodiments is shown in FIG. 18. While the symbol entry process 1800 described below includes a number of operations that appear to occur in a specific order, it should be apparent that the process 1800 can include more or fewer operations, which can be executed serially or in parallel (e.g., using parallel processors or a multi-threading environment), that an order of two or more operations may be changed, and/or that two or more operations may be combined into a single operation.
A plurality of icons may be displayed on a touch-sensitive display (1810). A first icon and a second icon each correspond to two or more symbols. A contact by a user with the display that corresponds to the first icon and the second icon is detected (1812). A respective symbol in the two or more symbols to which the contact corresponds may be determined (1814). A visual indicator corresponding to the respective symbol is displayed (1816).
FIG. 19 is a schematic diagram illustrating an embodiment of a user interface for a portable electronic device 1900. A keyboard 1910 does not include fixed icons. Instead, symbols are displayed. A nearest group of symbols, such as three letters in a region 1912, is selected in accordance with a user contact with the display 208. In other embodiments, the region 1912 may include two or more symbols or characters. A correct set of symbols may be determined using a lexicography model or system, such as that shown in FIG. 10A, in accordance with a sequence of groups of symbols that correspond to a sequence of contacts by the user. As more contacts occur, a tree of possible words or sets of symbols corresponding to the groups of symbols that have been selected may be pruned until a correct or highest-likelihood word or set of symbols is determined.
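By way of illustration only, the following Python sketch shows one way candidate words might be pruned against a sequence of contacted symbol groups, as described above; the word list is a hypothetical stand-in for the lexicography model, which would also rank the surviving candidates by likelihood.

    WORDS = ["hi", "to", "go", "ho"]  # hypothetical stand-in for a dictionary

    def resolve(groups: list[set[str]]) -> list[str]:
        """Keep words whose i-th letter lies in the i-th contacted group."""
        return [w for w in WORDS
                if len(w) == len(groups)
                and all(ch in grp for ch, grp in zip(w, groups))]

    # Example: resolve([{"h", "j", "k"}, {"i", "o"}]) -> ['hi', 'ho']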
In other embodiments not shown, a respective user may play a game that is used to determine a smallest acceptable key size for a user interface, such as a keyboard. The smallest key size may be in accordance with a user's manual dexterity, age, health, finger size and vision. Errors made in using the icons in a keyboard during the game may help determine a minimum icon size for the respective user.
In some embodiments, icons in the embodiments of the user interfaces, such as the keyboards described above, may have an effective contact area or strike area that is larger than the displayed icon size. In other embodiments, the effective contact area or strike area may be larger than the displayed icon size in at least one dimension of the display 208 surface.
The foregoing descriptions of specific embodiments of the present invention are presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Rather, it should be appreciated that many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, to thereby enable others skilled in the art to best utilize the invention and various embodiments with various modifications as are suited to the particular use contemplated.