The present application is a divisional of the invention patent application filed on 11/8/2006 under application number 200680033990.2, entitled "Virtual Input Device Arrangement on a Touch Screen User Interface".
This application claims priority from U.S. patent application No. 11/228758, filed on 16/9/2005, which is a continuation-in-part of, and claims priority to, prior application No. 10/903964, the disclosure of which is incorporated herein by reference in its entirety. The present application also relates to the following co-pending applications: U.S. patent application No. 10/840862, filed on 6/5/2004; U.S. patent application No. 11/048264, filed on 30/7/2004; U.S. patent application No. 11/038590, filed on 30/7/2004; U.S. patent application No. 11/228737, filed on 16/9/2005; and U.S. patent application No. 11/228700, filed on 16/9/2005, all of which are hereby incorporated by reference in their entirety for all purposes.
Detailed Description
Various examples and aspects are discussed below with reference to the figures. It is to be understood that the detailed description given herein with respect to these drawings is for explanatory purposes only and is not limiting.
Fig. 1-1 is a block diagram of an exemplary computer system 50 according to one embodiment of the invention. The computer system 50 may correspond to a personal computer system, such as a desktop computer, a laptop computer, a tablet computer, or a handheld computer. The computer system may also correspond to computing devices such as cellular telephones, PDAs, dedicated media players, and consumer electronic devices.
The exemplary computer system 50 shown in fig. 1-1 includes a processor 56 configured to execute instructions and to perform operations associated with the computer system 50. For example, using instructions retrieved from memory, the processor 56 may control the receipt and manipulation of input and output data between components of the computing system 50. The processor 56 may be implemented on a single chip, multiple chips, or multiple electrical components. For example, various architectures can be used for the processor 56, including dedicated or embedded processors, single-purpose processors, controllers, ASICs, and the like.
In most cases, the processor 56 operates with an operating system to execute computer code and produce and use data. Operating systems are generally well known and will not be described in further detail. As an example, the operating system may correspond to Mac OS X, OS/2, DOS, Unix, Linux, Palm OS, and the like. The operating system may also be a dedicated operating system, such as may be used for limited-use appliance-type computing devices. The operating system, other computer code, and data may reside within a memory block 58 that is operatively coupled to the processor 56. The memory block 58 generally provides a place to store computer code and data used by the computer system 50. By way of example, the memory block 58 may include Read Only Memory (ROM), Random Access Memory (RAM), and/or a hard drive or the like. Information may also reside on a removable storage medium and be loaded and installed onto computer system 50 as needed. Removable storage media include, for example, CD-ROMs, PC cards, memory cards, floppy disks, magnetic tape, and network components.
The computer system 50 also includes a display device 68 operatively coupled with the processor 56. The display device 68 may be a Liquid Crystal Display (LCD) (e.g., active matrix, passive matrix, etc.). Alternatively, the display device 68 may be a monitor such as a monochrome display, a Color Graphics Adapter (CGA) display, an Enhanced Graphics Adapter (EGA) display, a Variable Graphics Array (VGA) display, a super VGA display, or a Cathode Ray Tube (CRT) display. The display device may also correspond to a plasma display or a display implemented with electronic ink.
The display device 68 is generally configured to display a Graphical User Interface (GUI) 69 that provides an easy-to-use interface between a user of the computer system and an operating system or application running thereon. Generally speaking, the GUI 69 represents programs, files, and operational options with graphical images. The graphical images may include windows, columns, dialog boxes, menus, icons, buttons, cursors, scroll bars, and the like. The images may be arranged in a predetermined layout or may be dynamically generated to suit the particular action taken by the user. In operation, to initiate functions and tasks associated therewith, a user may select and activate various graphical images. As examples, the user may select a button to open, close, minimize, or maximize a window, or select an icon to launch a particular program. The GUI 69 may additionally or alternatively display information such as non-interactive text and graphics for the user on the display device 68.
The computer system 50 also includes an input device 70 operatively coupled to the processor 56. The input device 70 is configured to transfer data from the outside world into the computer system 50. The input device 70 may be used, for example, to perform tracking and to make selections with respect to the GUI 69 on the display 68. The input device 70 may also be used to issue commands in the computer system 50. The input device 70 may include a touch sensing arrangement configured to receive input from a user's touch and to send this information to the processor 56.
As an example, the touch sensing device may correspond to a touch pad or a touch screen. In many cases, touch sensing devices recognize touches as well as the position and magnitude of touches on a touch sensitive surface. The touch sensing device reports these touches to the processor 56, and the processor 56 interprets these touches in accordance with its programming. For example, the processor 56 may initiate a task based on a particular touch. A dedicated processor may be used to process touches locally and reduce the need for the main processor of the computer system. The touch sensing device may be based on the following sensing technologies, including but not limited to: capacitive sensing, resistive sensing, surface acoustic wave sensing, pressure sensing, and/or optical sensing, among others. Further, the touch sensing device may be based on single point sensing or multipoint sensing. Single point sensing can only distinguish a single touch, while multi-point sensing can distinguish multiple touches that occur simultaneously.
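The distinction between single point and multipoint sensing described above can be illustrated by scanning a sensor image for contiguous regions of activated cells, each region corresponding to one simultaneous touch. The following Python sketch is purely illustrative and not part of the disclosed embodiments; the grid representation, threshold, and function name are all hypothetical.

```python
def count_touches(grid, threshold=0.5):
    """Group adjacent sensor cells above threshold into distinct touches.
    A multipoint sensor can report several such regions; a single-point
    sensor can only distinguish one."""
    rows, cols = len(grid), len(grid[0])
    seen = set()
    touches = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] >= threshold and (r, c) not in seen:
                touches += 1
                stack = [(r, c)]  # flood-fill one contiguous region
                while stack:
                    y, x = stack.pop()
                    if not (0 <= y < rows and 0 <= x < cols):
                        continue
                    if (y, x) in seen or grid[y][x] < threshold:
                        continue
                    seen.add((y, x))
                    stack.extend([(y + 1, x), (y - 1, x),
                                  (y, x + 1), (y, x - 1)])
    return touches
```

Two well-separated active regions would thus be reported as two simultaneous touches, which a single point sensing scheme could not distinguish.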
The input device 70 may be a touch screen located above or in front of the display 68. The touch screen 70 may be integral with the display device 68, or it may be a separate component. The touch screen 70 has several advantages over other input technologies, such as a touch pad or mouse. For example, because the touch screen 70 is located in front of the display 68, the user can directly manipulate the GUI 69: a user may simply place their finger on the object to be controlled. A touch pad offers no such one-to-one relationship; the touch pad is typically placed in a different plane, away from the display. For example, the display generally lies in a vertical plane, while the touch pad generally lies in a horizontal plane. In addition to being a touch screen, the input device 70 may also be a multi-point input device. A multi-point input device has an advantage over a conventional single-point device in that it can distinguish more than one object (finger); a single-point device simply cannot distinguish multiple objects. By way of example, a multi-touch screen that may be used herein is shown and described in more detail in co-pending and commonly assigned U.S. patent application No. 10/840862, which is incorporated herein by reference.
The computer system 50 also includes the capability to couple with one or more I/O devices 80. By way of example, the I/O devices 80 may correspond to a keyboard, a printer, a scanner, a camera, and/or speakers, among others. The I/O devices 80 may be integrated with the computer system 50 or they may be separate components (e.g., peripheral devices). In some cases, the I/O device 80 may be connected with the computer system 50 through a wired connection (e.g., cable/port). In other cases, the I/O device 80 may be connected to the computer system through a wireless connection. As examples, the data link may correspond to PS/2, USB, IR, RF, or Bluetooth, among others.
Specific processes within a touch screen based computer are now described, where the processes enable execution of applications and provide displays on the touch screen of the computer. The display process includes providing a composite display having application-based display characteristics and characteristics related to the virtual input device. The virtual input device display includes at least an input portion to receive appropriate touch input to the touch screen relative to the displayed input device for user interaction with the virtual input device. User interaction with the virtual input device involves activating portions of the virtual input device to provide user input that affects application processing. The virtual input device (i.e., the process on the computer used to implement the virtual input device) handles these user interactions and, based on that handling, provides corresponding user input to the application.
Virtual input device display is generally closely tied to virtual input device processing of user interaction with the virtual input device. For example, if the virtual input device is a virtual keyboard, the virtual input device display may comprise a graphical representation of the keys of a typical QWERTY keyboard, while the virtual input device processing for user interaction with the virtual keyboard comprises determining which virtual keys were activated by the user and providing corresponding input (e.g., letters and/or numbers) to the application.
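The determination of which virtual key was activated can be illustrated as a simple hit test of the reported touch coordinates against the key geometry of the virtual keyboard. The Python sketch below is illustrative only; the key layout, coordinate convention, and names are hypothetical and not drawn from the disclosure.

```python
# Hypothetical key layout: each virtual key is (label, x, y, width, height).
KEYS = [("Q", 0, 0, 40, 40), ("W", 40, 0, 40, 40), ("E", 80, 0, 40, 40)]

def key_for_touch(x, y, keys=KEYS):
    """Return the label of the virtual key containing the touch point,
    or None if the touch falls outside every key."""
    for label, kx, ky, w, h in keys:
        if kx <= x < kx + w and ky <= y < ky + h:
            return label
    return None
```

The returned label (or a mapped character) would then be provided as input to the application, per the processing described above.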
Reference is now made to fig. 1, 2, 3 and 3-1. FIG. 1 broadly illustrates a process for implementing a composite display (i.e., a composite of an application display and a virtual input device display) on a touch screen. FIG. 2 illustrates an example of an application display on a touch screen without displaying a virtual input device on the touch screen. FIG. 3 schematically illustrates an exemplary composite display, components of which include an application display and a virtual input device display.
Referring initially to FIG. 1, a flowchart illustrates process steps performed on a computer, such as the touch screen based computer shown in FIG. 1-1. First, processing steps of the application 102 executed on the computer are schematically shown. The application may be, for example, an email client program, a word processing program, or other application program. Application 102 executes in cooperation with operating system program 104 executing on the computer. In particular, operating system 104 provides executing application 102 with access to computer resources. One resource that the operating system 104 provides access to is a touch screen.
Application 102 provides operating system 104 with an indication of the nature of the application display. Broadly speaking, the indication of the characteristic of the application display includes data at least partially usable by the operating system to cause the application display to be generated on the touchscreen.
The application display characteristics provided from the application 102 are generally related to the processing results of the application. At least some of the application display characteristics may be known and/or controlled by the operating system without an indication provided by the application. These types of characteristics are typically more general display-related characteristics, such as the "window size" of the window of the application display and the background color of the window of the application display.
Given the characteristics of the application display, the display processing 106 of the operating system program 104 determines the characteristics of the resulting display image to be displayed on the touch screen based at least in part on the indication of the application display characteristics.
In addition, operating system program 104 includes virtual keyboard processing 108. More generally, process 108 may be a process for any virtual input device that is displayed on a touch screen and receives user input from the touch screen. Initial characteristics processing 110 of virtual keyboard processing 108 responds to a keyboard initiation event and determines initial display characteristics of the virtual keyboard. The in-progress characteristics processing 112 of the virtual keyboard processing 108 generally determines the in-progress display characteristics of the virtual keyboard based on activation of virtual keys of the virtual keyboard but may also be based on other conditions. While the discussion herein is related to the display characteristics of a virtual keyboard, it should be understood that virtual keyboard operational characteristics, such as the mapping of keys to application inputs, are often interleaved with the display characteristics. The determined display characteristics of the virtual keyboard are provided to the display process 106.
The display process 106 determines characteristics of the composite display including displaying the virtual input device based on the indicated characteristics of the virtual input device in view of the indication of characteristics of the application display. More particularly, the virtual input device portion of the composite display is intelligent with respect to the nature of the application display. This is particularly useful because the same touch screen is used for both the virtual input device display output and the application display output. Displaying the virtual input device in a particular manner for a particular application (i.e., for a particular application display characteristic) may improve the usability of the touch screen to interact with the application using the virtual input device.
As described above, fig. 2 illustrates an application display without displaying a virtual input device.
According to the example shown in FIG. 3, the resulting composite display is such that the application display (e.g., the application display shown in FIG. 2) remains substantially unchanged except that the virtual input display is overlaid over a portion, but not all, of the application display. According to another example shown in fig. 3-1, the resulting composite display is such that the application display (e.g., the application display shown in fig. 2) remains substantially unchanged except that the application display "slides up" and the virtual input device is displayed in a portion of the touch screen vacated by "sliding up" the application display.
The display process 106 accounts for the nature of the indications of the application display to determine the location of the virtual input device display in the composite display on the touch screen. For example, the display processing 106 may determine characteristics of the composite display such that important portions of the application display, such as input fields associated with the application display (and virtual input device), are not covered by the virtual keyboard display.
That is, the input field of the application display is generally determined to be important because it may represent a portion of the application with which the user interacts through the virtual input device. However, other portions of the application display may also be determined to be important. For example, a portion of an application display directly affected by input via a virtual input device may be determined to be important. In some instances, there may not even be an input field for the application display.
What is determined to be important may depend on the particular application and/or application display, or may generally depend on the characteristics of the application. In some cases, the portion of the application display other than the input field may be relatively important, so as to ensure that it is not covered by the virtual input device display in the composite display. The relative importance may be context dependent. For example, the relative importance may depend on the particular mode in which the application operates.
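The placement behavior described above — positioning the virtual input device so that an important portion such as an input field is not covered — might be sketched as follows. This Python fragment is an illustration under simplifying assumptions (a single important field, a full-width keyboard, vertical coordinates growing downward); it is not the disclosed implementation, and all names are hypothetical.

```python
def place_keyboard(screen_h, kbd_h, field_top, field_bottom):
    """Choose the top coordinate for the keyboard overlay: prefer the
    bottom of the screen, but move the keyboard above the input field
    if bottom placement would cover it."""
    bottom_top = screen_h - kbd_h
    if field_bottom <= bottom_top:       # field lies fully above bottom slot
        return bottom_top
    return max(0, field_top - kbd_h)     # otherwise place just above field
```

For example, a field near the top of the screen leaves the keyboard at the bottom, while a field near the bottom pushes the keyboard upward so the field remains visible.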
According to some examples in which the application display does not remain substantially unchanged (unlike the application displays shown in figs. 3 and 3-1), the display process 106 determines characteristics of the composite display such that the application display is modified in the composite display to accommodate the virtual input device display, with substantially all information on the application display remaining visible within the composite display. In some examples, the display processing 106 determines characteristics of the composite display such that spatial features of the application display are adjusted to provide space for the virtual input device on the composite display while minimizing or eliminating the amount of information on the application display that would otherwise be hidden by the virtual input device display on the composite display.
In some examples, at least a portion of the application display is compressed on the composite display to accommodate the virtual input device display. FIG. 4 illustrates one example of substantially equally compressing all portions of an application display in one direction on a composite display. FIG. 5 illustrates another example in which less than all portions of an application display are compressed on a composite display. In other examples, portions of the application display are enlarged on the composite display, where, for example, the portions of the application display are important for the virtual input device.
In some examples, which portion or portions of the application display are compressed on the composite display is based on characteristics of the application display. For example, some portions of the application display determined to be more important may not be compressed, while other portions of the application display determined to be less important may be compressed. In some examples, the amount of compression of a particular portion of the application display is based on the relative importance of that portion of the application display. In a composite display, different portions of the display may thus be compressed (or expanded) by different amounts, including not at all.
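The idea of compressing regions in proportion to their relative importance can be sketched as below. This Python fragment is illustrative only: the importance scale (1.0 meaning "never compressed"), the proportional-slack scheme, and the names are assumptions, not the disclosed method.

```python
def compress_regions(heights, importance, space_needed):
    """Shrink region heights to free `space_needed` units of screen space,
    taking proportionally more from less-important regions.
    Importance 1.0 marks a region that must not be compressed."""
    slack = [h * (1.0 - imp) for h, imp in zip(heights, importance)]
    total_slack = sum(slack)
    if space_needed > total_slack:
        raise ValueError("not enough compressible space")
    return [h - s * space_needed / total_slack
            for h, s in zip(heights, slack)]
```

Here a fully important region keeps its height unchanged, while less important regions absorb all of the compression needed to make room for the virtual input device.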
In other examples, the characteristics of the virtual input device on the composite display may be configurable by the user, whether as preset conditions and/or dynamically. As an example of dynamic configuration, a user may change the position of the virtual input device display in the composite display by touching a portion of the virtual input device display and "dragging" it to a desired portion of the composite display.
In some examples, the application display portion itself does not change in the composite display as the user changes the characteristics of the virtual input device display in the composite display. Thus, for example, if the user causes the position of the virtual input device display to change in the composite display, different portions of the application display are overlaid due to the virtual input device display movement. In other examples, when the user causes the characteristics of the virtual input device display to change, the display process 106 makes a new determination of the characteristics of the application display in the composite display. For example, the display process 106 may make a new determination of which portions of the application display to compress in the composite display based at least in part on the new location of the virtual input device display in the composite display.
The virtual input device initiation event (FIG. 1) will now be discussed in more detail. In particular, there are various examples of events that may include a virtual input device initiation event that causes the virtual input device to be initially displayed as part of the composite display. For example, the virtual input device may be displayed as part of a composite display in response to a particular user action that directly corresponds to a virtual input device initiation event. According to one example, the application has an input field as part of the application display, and a user gesture to the input field may result in triggering a virtual input device launch event. The user gesture may be, for example, a tap or double tap of a portion of the touch screen corresponding to the display of the input field. Generally, the operating system processes 104 include processes to recognize such user gestures for the input field and cause triggering of a virtual input device initiation event.
As another example of an event that may result in triggering a virtual input device start event, there may be an "on screen" button displayed as part of the application display, the user's activation of which is interpreted by the operating system process 104 and results in triggering a virtual input device start event. As another example, the on-screen button may be more generally associated with the operating system and, for example, displayed on a "desktop" portion of the touch screen associated with the operating system as opposed to a particular portion being displayed as an application. In either case, activation of the on-screen button results in the triggering of a virtual input device initiation event and, as a result, the initial input device processing 110 is performed.
As another example, a keyboard initiation event may be triggered by a user placing their finger in a "typing" position on a touch screen (e.g., a multi-touch screen). Detecting the user action may trigger a virtual keyboard initiation event based on which the initial keyboard processing 110 is performed and the virtual input device is displayed as part of the composite display. In this case, for example, the operating system process 104 interacting with the touch screen hardware and/or low-level processes is made aware of user input to the touch screen. This knowledge may be in the form of, for example, the coordinates of the point touched on the touch screen. When the combination of points touched on the touch screen is determined to correspond to a "typing" position where the user places their finger on the touch screen, a virtual keyboard initiation event is triggered. The process for determining that the combination of points corresponds to the user placing their finger in the "typing" position on the touch screen such that the virtual input device initiation event is triggered may be assigned to the operating system process 104 or may be, for example, a process that occurs in conjunction or cooperation with the operating system process 104.
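The "typing position" detection described above can be illustrated with a simple heuristic over the reported touch coordinates: enough simultaneous touch points, roughly aligned in a row, suggests fingers resting in a home-row position. The following Python sketch is a hypothetical illustration; the finger count and alignment tolerance are arbitrary assumptions, not values from the disclosure.

```python
def looks_like_typing_position(points, min_fingers=8, max_y_spread=30):
    """Return True when the set of simultaneous touch points (x, y)
    plausibly corresponds to fingers resting in a 'typing' position:
    at least `min_fingers` touches whose vertical spread stays within
    `max_y_spread` units (i.e., roughly one row)."""
    if len(points) < min_fingers:
        return False
    ys = [y for _, y in points]
    return max(ys) - min(ys) <= max_y_spread
```

When this predicate holds for the combination of touched points, the process would trigger the virtual keyboard initiation event, leading to the initial keyboard processing 110.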
The virtual input device deactivation event will now be discussed in more detail. As shown in FIG. 1, the triggering of the virtual input device deactivation event causes the virtual input device to cease to be displayed as part of the composite display on the touch screen. The virtual input device deactivation event may be triggered, for example, by an action taken by a user directly and specifically with respect to the virtual input device. This may include, for example, activating a particular "deactivate" key on the virtual input device display. More generally, however, interaction with the application, not necessarily via a key of the virtual input device, may also result in triggering a deactivation event.
One example of such interaction includes interaction with a display of an executing application in a manner that makes providing input through a virtual input device inappropriate. Another example includes interaction with the application to close the application (displayed by the application or by the virtual keyboard, as appropriate). Another example includes a gesture (such as "rubbing" a hand across a keyboard) or activating a virtual return key in conjunction with a finger "sliding" away from the virtual return (return) key, which results in activating a "return" and then causing the virtual keyboard to disappear.
As another example, the triggering of a deactivation event may be less tied to a particular interaction with the virtual input device in particular, or with the touch screen in general, and may instead be caused, for example, by the passage of a particular amount of time since a key on the virtual input device was last activated. That is, not using the virtual input device for a particular amount of time implies that the virtual keyboard is no longer being used. In another example, the application itself may trigger a deactivation event, such as when the state of the application is such that the display of the virtual input device is deemed unnecessary and/or inappropriate.
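The timeout-based deactivation described above amounts to a simple idle check. The Python sketch below is illustrative only; the timeout value and names are hypothetical, and a real system would feed it clock readings from its own event loop.

```python
def should_deactivate(last_key_time, now, idle_timeout=30.0):
    """Return True when the virtual input device has been idle longer
    than `idle_timeout` seconds since the last key activation, which is
    taken to imply the keyboard is no longer being used."""
    return now - last_key_time > idle_timeout
```

A periodic check of this predicate would trigger the deactivation event and remove the virtual input device from the composite display.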
Various modes of operation of the virtual input device are now discussed. In one example, input (typically, but not limited to, text) associated with the activated key may be provided directly to and operated on by the application to which the application display corresponds. The indication of the input may even be displayed directly in an input field associated with the application.
In other examples, one of which is shown in FIG. 6, the indication of the input may appear in a portion 604 of the display that is associated with the virtual input device 602, but not directly associated with the application display. The input may then be communicated to the application (either directly acted upon by the application or communicated to an input field 608 associated with the application display) automatically or upon command by the user. According to one example, automatic transfer occurs when "n" characters are entered via virtual input device 602, where "n" may be a user-configurable setting. According to another example, the automatic transfer occurs every "m" seconds or other unit of time, where "m" may be a user-configurable setting.
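The buffering behavior just described — accumulating input locally and transferring it to the application after "n" characters or "m" seconds — can be sketched as follows. This Python class is an illustration, not the disclosed implementation; the clock is injected as a callable purely so the logic is testable, and all names are hypothetical.

```python
class InputBuffer:
    """Buffer keystrokes from the virtual input device; flush them to
    the application after `n` characters or `m` seconds, whichever
    comes first (both user-configurable settings)."""

    def __init__(self, n=5, m=2.0, now=lambda: 0.0):
        self.n, self.m, self.now = n, m, now
        self.chars = []
        self.last_flush = now()

    def press(self, ch):
        """Record one character; return flushed text when a threshold
        is reached, otherwise None."""
        self.chars.append(ch)
        if len(self.chars) >= self.n or self.now() - self.last_flush >= self.m:
            return self.flush()
        return None

    def flush(self):
        """Hand the buffered text to the application and reset."""
        text, self.chars = "".join(self.chars), []
        self.last_flush = self.now()
        return text
```

With n = 3, for instance, the third keystroke triggers an automatic transfer of the accumulated text to the application's input field.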
In some examples, the virtual input device display 602 includes a visual indicator 606 of the association between the virtual input device 602 and an input field 608 of the application display. Referring to the exemplary display 600 in FIG. 6, the virtual input device display 602 contains a visual indicator arrow 606 pointing from the virtual input device display 602 to the corresponding input field 608 of the application display. The visual indicator 606 is not limited to a pointer; as another example, the visual indicator 606 may be a highlighting of the input field 608 of the application display.
In some examples, the virtual input device is displayed in a window smaller than the virtual input device itself (and the size of the window may be user-configurable). In this case, the user may activate portions of the virtual input device display to scroll to (and thereby access) different portions of the virtual input device. Figs. 7A, 7B, and 7C illustrate virtual input device displays in various scrolled states. Scrolling may even be in more than two dimensions (e.g., across a virtual cube, or a virtual shape of more than three dimensions) to access undisplayed portions of the virtual input device.
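The two-dimensional case of such scrolling reduces to moving a viewport over the rows of a larger virtual device, clamping the scroll offset to valid bounds. The following Python sketch is illustrative only; the row-based representation and names are hypothetical.

```python
def visible_rows(all_rows, viewport_rows, scroll_offset):
    """Return the slice of virtual-device rows visible in a window
    smaller than the device itself, clamping `scroll_offset` so the
    viewport never runs past either end."""
    max_offset = max(0, len(all_rows) - viewport_rows)
    offset = min(max(0, scroll_offset), max_offset)
    return all_rows[offset:offset + viewport_rows]
```

Activating a scroll control would adjust the offset, exposing a different portion of the virtual input device in the same small window.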
The many features and advantages of the invention are apparent from the written description and, thus, it is intended by the appended claims to cover all such features and advantages of the invention. Further, since numerous modifications and changes will readily occur to those skilled in the art, the invention should not be limited to the exact construction and operation as illustrated and described. Accordingly, all suitable modifications and equivalents may be resorted to, falling within the scope of the invention.