HK1177269A - Virtual input device placement on a touch screen user interface - Google Patents

Virtual input device placement on a touch screen user interface

Info

Publication number
HK1177269A
Authority
HK
Hong Kong
Prior art keywords
display
input device
virtual input
application
virtual
Prior art date
Application number
HK13104054.0A
Other languages
Chinese (zh)
Inventor
I. Chaudhri
G. Christie
B. Ording
Original Assignee
Apple Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Apple Inc.
Publication of HK1177269A


Description

Virtual input device arrangement on a touch screen user interface
The present application is a divisional of invention patent application No. 200680033990.2, filed on 11/8/2006, entitled "Virtual input device placement on a touch screen user interface".
(Cross-reference to related applications)
This application claims priority from U.S. patent application No. 11/228758, filed on 16/9/2005, which is a continuation-in-part of prior application No. 10/903964 and claims priority thereto; the disclosures of both are incorporated herein by reference in their entirety. The present application also relates to the following co-pending applications: U.S. patent application No. 10/840862, filed on 6/5/2004; U.S. patent application No. 11/048264, filed on 30/7/2004; U.S. patent application No. 11/038590, filed on 30/7/2004; U.S. patent application No. 11/228737, filed on 16/9/2005; and U.S. patent application No. 11/228700, filed on 16/9/2005, all of which are hereby incorporated by reference in their entirety for all purposes.
Technical Field
The present application relates to touch screen user interfaces, and more particularly to the placement of a virtual input device, such as a virtual keyboard or other virtual input device, on a touch screen user interface.
Background
A touch screen is a type of display screen that has a touch sensitive transparent panel that overlays the screen or otherwise recognizes touch input on the screen. Typically, a touch screen display is housed in the same housing as computer circuitry that contains processing circuitry that operates under program control. When using a touch screen to provide input to an application executing on a computer, a user makes a selection on the display screen by pointing directly (typically with a stylus or finger) at a Graphical User Interface (GUI) object displayed on the screen.
The set of GUI objects displayed on the touch screen may be considered a virtual input device. In some examples, the virtual input device is a virtual keyboard. Similar to conventional external keyboards that are not so closely associated with a display screen, virtual keyboards contain a plurality of keys ("virtual keys"). Activation of a particular virtual key (or combination of virtual keys) generates a signal (or signals) that is provided as input to an application executing on the computer.
External keyboards and other external input devices, by their nature (i.e., being external), do not cover the display output of an application. A virtual input device, on the other hand, may overlay the display output of executing applications, since it is displayed on the same display screen that displays the output of those applications.
What is desired is a method of intelligently displaying a virtual input device on a touch screen to enhance the usability of the virtual input device and a touch screen based computer.
Disclosure of Invention
A display is generated on a touch screen of a computer. The display includes an application display associated with an application executing on the computer and a virtual input device display for a user to provide input to the application executing on the computer through the touch screen. An initial characteristic of the virtual input device display is determined in response to a virtual input device initiation event. Based on the characteristics of the application display and the characteristics of the virtual input device display, initial characteristics of a composite display image comprising the application display and the virtual input device display are determined. The composite image is displayed on the touch screen.
This summary is not intended to be all-inclusive. Other aspects will become apparent from the following detailed description when taken in conjunction with the drawings and from the appended claims.
Drawings
FIG. 1-1 is a block diagram of a touch screen based computer system.
FIG. 1 illustrates processing within a computer that results in a display on a touch screen, according to one aspect.
FIG. 2 illustrates an exemplary touch screen display output that does not include a virtual input device display.
FIGS. 3 and 3-1 illustrate exemplary touch screen display outputs including both an application display and a virtual input device display, both of which substantially maintain the application output unchanged from the display of FIG. 2.
Fig. 4 and 5 illustrate an exemplary touch screen display in which the spatial characteristics of the application display are modified to accommodate the virtual input device display.
FIG. 6 illustrates an exemplary touch screen display in which an indication of an input appears at a portion of the display associated with a virtual input device.
Fig. 7A, 7B, and 7C illustrate the virtual input device display in different states that have been scrolled.
Detailed Description
Various examples and aspects are discussed below with reference to the figures. It is to be understood that the detailed description given herein with respect to these drawings is for explanatory purposes only and is not limiting.
Fig. 1-1 is a block diagram of an exemplary computer system 50 according to one embodiment of the invention. The computer system 50 may correspond to a personal computer system, such as a desktop computer, a laptop computer, a tablet computer, or a handheld computer. The computer system may also correspond to computing devices such as cellular telephones, PDAs, dedicated media players, and consumer electronic devices.
The exemplary computer system 50 shown in fig. 1-1 includes a processor 56 configured to execute instructions and to perform operations associated with the computer system 50. For example, using instructions retrieved, for example, from memory, the processor 56 may control the receipt and manipulation of input and output data between components of the computing system 50. The processor 56 may be implemented on a single chip, multiple chips, or multiple electrical components. For example, various architectures can be used for the processor 56, including dedicated or embedded processors, single purpose processors, controllers, ASICs, and the like.
In most cases, the processor 56 operates with an operating system to execute computer code and produce and use data. Operating systems are generally well known and will not be described in further detail. As an example, the operating system may correspond to Mac OS X, OS/2, DOS, Unix, Linux, Palm OS, and the like. The operating system may also be a dedicated operating system, such as may be used for limited-use appliance-type computing devices. The operating system, other computer code, and data may reside within a memory block 58 that is operatively coupled to the processor 56. The memory block 58 generally provides a place to store computer code and data used by the computer system 50. By way of example, the memory block 58 may include Read Only Memory (ROM), Random Access Memory (RAM), and/or a hard drive or the like. Information may also reside on a removable storage medium and be loaded and installed onto computer system 50 as needed. Removable storage media include, for example, CD-ROM, PC-CARD, memory cards, floppy disks, magnetic tape, and network components.
The computer system 50 also includes a display device 68 operatively coupled with the processor 56. The display device 68 may be a Liquid Crystal Display (LCD) (e.g., active matrix, passive matrix, etc.). Alternatively, the display device 68 may be a monitor such as a monochrome display, a Color Graphics Adapter (CGA) display, an Enhanced Graphics Adapter (EGA) display, a Variable Graphics Array (VGA) display, a super VGA display, and a Cathode Ray Tube (CRT). The display device may also correspond to a plasma display or a display implemented with electronic ink.
The display device 68 is generally configured to display a Graphical User Interface (GUI) 69 that provides an easy-to-use interface between a user of the computer system and the operating system or application running thereon. Generally speaking, the GUI 69 represents programs, files, and operational options with graphical images. The graphical images may include windows, columns, dialog boxes, menus, icons, buttons, cursors, scroll bars, and the like. The images may be arranged in a predetermined layout or may be dynamically generated to suit the particular action taken by the user. In operation, to initiate functions and tasks associated therewith, a user may select and activate various graphical images. As examples, the user may select a button to open, close, minimize, or maximize a window, or select an icon to launch a particular program. The GUI 69 may additionally or alternatively display information, such as non-interactive text and graphics, for the user on the display device 68.
The computer system 50 also includes an input device 70 operatively coupled to the processor 56. The input device 70 is configured to transfer data from the outside world into the computer system 50. The input device 70 may be used, for example, to perform tracking and to make selections with respect to the GUI 69 on the display 68. The input device 70 may also be used to issue commands in the computer system 50. The input device 70 may include a touch sensing arrangement configured to receive input from a user's touch and to send this information to the processor 56.
As an example, the touch sensing device may correspond to a touch pad or a touch screen. In many cases, touch sensing devices recognize touches, as well as the position and magnitude of touches on a touch sensitive surface. The touch sensing device reports these touches to the processor 56, and the processor 56 interprets these touches in accordance with its programming. For example, the processor 56 may initiate a task based on a particular touch. A dedicated processor may be used to process touches locally and reduce the need for the main processor of the computer system. The touch sensing device may be based on sensing technologies including, but not limited to: capacitive sensing, resistive sensing, surface acoustic wave sensing, pressure sensing, and/or optical sensing, among others. Further, the touch sensing device may be based on single point sensing or multipoint sensing. Single point sensing can only distinguish a single touch, while multipoint sensing can distinguish multiple touches that occur simultaneously.
The input device 70 may be a touch screen located above or in front of the display 68. The touch screen 70 may be integral with the display device 68 or it may be a separate component. The touch screen 70 has several advantages over other input technologies such as a touch pad, mouse, and the like. For example, the touch screen 70 is located in front of the display 68 so the user can directly manipulate the GUI 69. For example, a user may simply place their finger on the object to be controlled. In a touch pad, there is no one-to-one relationship such as this. With a touch pad, the touch pad is typically placed in a different plane and away from the display. For example, the display is generally located in a vertical plane, while the touch pad is generally located in a horizontal plane. In addition to being a touch screen, input device 70 may also be a multi-point input device. A multi-point input device has the advantage over a conventional single-point device that it can distinguish more than one object (finger). A single point device simply cannot distinguish between multiple objects. By way of example, a multi-touch screen that may be used herein is shown and described in more detail in co-pending and commonly assigned U.S. patent application No.10/840862, which is incorporated herein by reference.
The computer system 50 also includes the capability to couple with one or more I/O devices 80. By way of example, the I/O devices 80 may correspond to a keyboard, a printer, a scanner, a camera, and/or speakers, among others. The I/O devices 80 may be integrated with the computer system 50 or they may be separate components (e.g., peripheral devices). In some cases, the I/O device 80 may be connected with the computer system 50 through a wired connection (e.g., cable/port). In other cases, the I/O device 80 may be connected to the computer system through a wireless connection. As examples, the data link may correspond to PS/2, USB, IR, RF, or Bluetooth, among others.
Specific processes within a touch screen based computer are now described, where the processes enable execution of applications and provide displays on the touch screen of the computer. The display process includes providing a composite display having application-based display characteristics and characteristics related to the virtual input device. The virtual input device display includes at least an input portion to receive appropriate touch input to the touch screen relative to the displayed input device for user interaction with the virtual input device. User interaction with the virtual input device involves activating portions of the virtual input device to provide user input that affects application processing. The virtual input device (i.e., the process on the computer used to implement the virtual input device) handles user interactions and provides corresponding user input to the application based on that process.
Virtual input device display is generally highly relevant to virtual input device handling of user interaction with the virtual input device. For example, if the virtual input device is a virtual keyboard, the virtual input device displays a graphical representation of the keys that may comprise a typical QWERTY keyboard, while the virtual input device processing for user interaction with the virtual keyboard includes determining which virtual keys were activated by the user and providing corresponding input (e.g., letters and/or numbers) to the application.
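As a minimal sketch of the key-activation processing just described, a touch point can be hit-tested against the on-screen rectangles of the virtual keys, and the matching key's character delivered as input to the application. The class names, key geometry, and coordinates below are illustrative assumptions, not the patent's implementation.

```python
# Hypothetical hit-testing sketch: map a touch point on the touch screen to
# the virtual key whose rectangle contains it, then emit that key's input.
from dataclasses import dataclass

@dataclass(frozen=True)
class Key:
    char: str
    x: int   # left edge of the key's rectangle, in pixels
    y: int   # top edge
    w: int   # width
    h: int   # height

class VirtualKeyboard:
    def __init__(self, keys):
        self.keys = keys

    def key_at(self, tx, ty):
        """Return the key whose rectangle contains the touch point, or None."""
        for k in self.keys:
            if k.x <= tx < k.x + k.w and k.y <= ty < k.y + k.h:
                return k
        return None

    def handle_touch(self, tx, ty):
        """Translate a touch into the input provided to the application."""
        key = self.key_at(tx, ty)
        return key.char if key else None

# One row of a QWERTY-like layout: 40x40-pixel keys starting at the origin.
row = [Key(c, i * 40, 0, 40, 40) for i, c in enumerate("qwert")]
kb = VirtualKeyboard(row)
```

A touch at (45, 10) falls inside the second key's rectangle and yields "w"; a touch outside every key yields no input.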
Reference is now made to fig. 1, 2, 3 and 3-1. FIG. 1 broadly illustrates a process for implementing a composite display (i.e., a composite of an application display and a virtual input device display) on a touch screen. FIG. 2 illustrates an example of an application display on a touch screen without displaying a virtual input device on the touch screen. FIG. 3 schematically illustrates an exemplary composite display, components of which include an application display and a virtual input device display.
Referring initially to FIG. 1, a flowchart illustrates process steps performed on a computer, such as the touch screen based computer shown in FIG. 1-1. First, processing steps of the application 102 executed on the computer are schematically shown. The application may be, for example, an email client program, a word processing program, or other application program. Application 102 executes in cooperation with operating system program 104 executing on the computer. In particular, operating system 104 provides executing application 102 with access to computer resources. One resource that the operating system 104 provides access to is a touch screen.
Application 102 provides operating system 104 with an indication of the nature of the application display. Broadly speaking, the indication of the characteristic of the application display includes data at least partially usable by the operating system to cause the application display to be generated on the touchscreen.
The application display characteristics provided from the application 102 are generally related to the processing results of the application. At least some of the application display characteristics may be known and/or controlled by the operating system without an indication provided by the application. These types of characteristics are typically more general display-related characteristics, such as the "window size" of the window of the application display and the background color of the window of the application display.
Given the characteristics of the application display, the display processing 106 of the operating system program 104 determines the characteristics of the resulting display image to be displayed on the touch screen based at least in part on the indication of the application display characteristics.
In addition, operating system program 104 includes virtual keyboard processing 108. More generally, process 108 may be a process for any virtual input device that is displayed on a touch screen and receives user input from the touch screen. Initial characteristics processing 110 of virtual keyboard processing 108 responds to a keyboard initiation event and determines initial display characteristics of the virtual keyboard. The in-progress characteristics processing 112 of the virtual keyboard processing 108 generally determines the in-progress display characteristics of the virtual keyboard based on activation of virtual keys of the virtual keyboard but may also be based on other conditions. While the discussion herein is related to the display characteristics of a virtual keyboard, it should be understood that virtual keyboard operational characteristics, such as the mapping of keys to application inputs, are often interleaved with the display characteristics. The determined display characteristics of the virtual keyboard are provided to the display process 106.
The display process 106 determines characteristics of the composite display including displaying the virtual input device based on the indicated characteristics of the virtual input device in view of the indication of characteristics of the application display. More particularly, the virtual input device portion of the composite display is intelligent with respect to the nature of the application display. This is particularly useful because the same touch screen is used for both the virtual input device display output and the application display output. Displaying the virtual input device in a particular manner for a particular application (i.e., for a particular application display characteristic) may improve the usability of the touch screen to interact with the application using the virtual input device.
As described above, fig. 2 illustrates an application display without displaying a virtual input device.
According to the example shown in FIG. 3, the resulting composite display is such that the application display (e.g., the application display shown in FIG. 2) remains substantially unchanged except that the virtual input display is overlaid over a portion, but not all, of the application display. According to another example shown in fig. 3-1, the resulting composite display is such that the application display (e.g., the application display shown in fig. 2) remains substantially unchanged except that the application display "slides up" and the virtual input device is displayed in a portion of the touch screen vacated by "sliding up" the application display.
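The two composite-display strategies above (overlay per FIG. 3, "slide up" per FIG. 3-1) can be sketched as simple layout computations. The coordinate convention (vertical bands described as top edge and height) and the function names are illustrative assumptions.

```python
# Sketch of the two composite-display strategies: overlaying the virtual
# input device on an unchanged application display, vs. sliding the
# application display up to vacate space for it.

def overlay_layout(screen_h, app_h, kb_h):
    """Application display unchanged; keyboard overlaps its bottom region."""
    app = (0, app_h)               # (top edge, height) of application display
    kb = (screen_h - kb_h, kb_h)   # keyboard anchored to the screen bottom
    return app, kb

def slide_up_layout(screen_h, app_h, kb_h):
    """Application display shifted up; keyboard occupies the vacated space."""
    app = (-kb_h, app_h)           # top edge moves up by the keyboard height
    kb = (screen_h - kb_h, kb_h)
    return app, kb
```

In the overlay case the application display keeps its position and the keyboard covers its lower band; in the slide-up case the application display's top edge moves off-screen by the keyboard height instead.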
The display process 106 takes the indicated characteristics of the application display into account to determine the location of the virtual input device display in the composite display on the touch screen. For example, the display processing 106 may determine characteristics of the composite display such that important portions of the application display, such as input fields associated with the application display (and virtual input device), are not covered by the virtual keyboard display.
That is, the input field of the application display is generally determined to be important because it may represent a portion of the application with which the user interacts through the virtual input device. However, other portions of the application display may also be determined to be important. For example, a portion of an application display directly affected by input via a virtual input device may be determined to be important. In some instances, there may not even be an input field for the application display.
What is determined to be important may depend on the particular application and/or application display, or may generally depend on the characteristics of the application. In some cases, the portion of the application display other than the input field may be relatively important, so as to ensure that it is not covered by the virtual input device display in the composite display. The relative importance may be context dependent. For example, the relative importance may depend on the particular mode in which the application operates.
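One way to realize the placement rule discussed above is to try candidate keyboard positions and reject any that would cover a region marked important (such as the active input field). The candidate list, the vertical-band model, and the fallback are illustrative assumptions, not the patent's method.

```python
# Sketch of importance-aware placement: position a full-width keyboard band
# so that it does not cover any "important" vertical band of the
# application display (e.g., an input field).

def spans_overlap(top_a, h_a, top_b, h_b):
    """True if the vertical bands [top, top + h) intersect."""
    return top_a < top_b + h_b and top_b < top_a + h_a

def place_keyboard(screen_h, kb_h, important_bands):
    """Return the keyboard's top edge, preferring a position that leaves
    every important band uncovered; fall back to the bottom of the screen."""
    candidates = [screen_h - kb_h, 0]    # bottom of screen first, then top
    for top in candidates:
        if not any(spans_overlap(top, kb_h, b_top, b_h)
                   for b_top, b_h in important_bands):
            return top
    return candidates[0]
```

For example, on an 800-pixel-tall screen a 200-pixel keyboard normally lands at the bottom (top edge 600), but moves to the top when an input field occupies the bottom band.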
According to some examples of application displays that do not remain substantially unchanged (such as the application displays shown in fig. 3 and 3-1), the display process 106 determines characteristics of the composite display such that the application display is modified in the composite display to accommodate the virtual input device display with substantially all information on the application display remaining visible within the composite display. In some examples, the display processing 106 determines characteristics of the composite display such that spatial features of the application display are adjusted to provide space for the virtual input device on the composite display while minimizing or eliminating the amount of information on the application display that would otherwise be hidden by the virtual input device display on the composite display.
In some examples, at least a portion of the application display is compressed on the composite display to accommodate the virtual input device display. FIG. 4 illustrates one example of substantially equally compressing all portions of an application display in one direction on a composite display. FIG. 5 illustrates another example in which less than all portions of an application display are compressed on a composite display. In other examples, portions of the application display are enlarged on the composite display, where, for example, the portions of the application display are important for the virtual input device.
In some examples, which portion or portions of the application display are compressed on the composite display is based on characteristics of the application display. For example, some portions of the application display determined to be more important may not be compressed, while other portions of the application display determined to be less important may be compressed. In some examples, the amount of compression of a particular portion of the application display is based on the relative importance of that portion of the application display. In a composite display, different portions of the display may be compressed (or expanded) by different amounts (including spatial feature invariance).
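The importance-weighted compression just described might be sketched as follows: each region of the application display contributes slack in proportion to how unimportant it is, and the space needed for the keyboard is reclaimed from that slack. The importance weighting scheme is an illustrative assumption.

```python
# Sketch of importance-weighted compression: free `needed` pixels of height
# for the virtual input device by compressing less-important regions of the
# application display more, leaving fully important regions unchanged.

def compress_regions(regions, needed):
    """regions: list of (height, importance in [0, 1]).
    Returns the new heights; a region with importance 1.0 is never shrunk."""
    slack = [h * (1.0 - imp) for h, imp in regions]
    total_slack = sum(slack)
    if total_slack <= 0:
        return [h for h, _ in regions]   # nothing may be compressed
    scale = min(1.0, needed / total_slack)
    return [h - s * scale for (h, _), s in zip(regions, slack)]
```

With two 400-pixel regions, one fully important and one half important, reclaiming 100 pixels shrinks only the less important region.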
In other examples, the characteristics of the virtual input device display on the composite display may be configurable by the user, whether as preset conditions or dynamically. As an example of dynamic configuration, a user may change the position of the virtual input device display in the composite display by touching a portion of the virtual keyboard display and "dragging" the virtual input device display to a desired portion of the composite display.
In some examples, the application display portion itself does not change in the composite display as the user changes the characteristics of the virtual input device display in the composite display. Thus, for example, if the user causes the position of the virtual input device display to change in the composite display, different portions of the application display are overlaid due to the virtual input device display movement. In other examples, when the user causes the characteristics of the virtual input device display to change, the display process 106 makes a new determination of the characteristics of the application display in the composite display. For example, the display process 106 may make a new determination of which portions of the application display to compress in the composite display based at least in part on the new location of the virtual input device display in the composite display.
The virtual input device initiation event (FIG. 1) will now be discussed in more detail. In particular, there are various examples of events that may include a virtual input device initiation event that causes the virtual input device to be initially displayed as part of the composite display. For example, the virtual input device may be displayed as part of a composite display in response to a particular user action that directly corresponds to a virtual input device initiation event. According to one example, the application has an input field as part of the application display, and a user gesture to the input field may result in triggering a virtual input device launch event. The user gesture may be, for example, a tap or double tap of a portion of the touch screen corresponding to the display of the input field. Generally, the operating system processes 104 include processes to recognize such user gestures for the input field and cause triggering of a virtual input device initiation event.
As another example of an event that may result in triggering a virtual input device start event, there may be an "on screen" button displayed as part of the application display, the user's activation of which is interpreted by the operating system process 104 and results in triggering a virtual input device start event. As another example, the on-screen button may be more generally associated with the operating system and, for example, displayed on a "desktop" portion of the touch screen associated with the operating system as opposed to a particular portion being displayed as an application. In either case, activation of the on-screen button results in the triggering of a virtual input device initiation event and, as a result, the initial input device processing 110 is performed.
As another example, a keyboard initiation event may be triggered by a user placing their finger in a "typing" position on a touch screen (e.g., a multi-touch screen). Detecting the user action may trigger a virtual keyboard initiation event based on which the initial keyboard processing 110 is performed and the virtual input device is displayed as part of the composite display. In this case, for example, the operating system process 104 interacting with the touch screen hardware and/or low-level processes is made aware of user input to the touch screen. This knowledge may be in the form of, for example, the coordinates of the point touched on the touch screen. When the combination of points touched on the touch screen is determined to correspond to a "typing" position where the user places their finger on the touch screen, a virtual keyboard initiation event is triggered. The process for determining that the combination of points corresponds to the user placing their finger in the "typing" position on the touch screen such that the virtual input device initiation event is triggered may be assigned to the operating system process 104 or may be, for example, a process that occurs in conjunction or cooperation with the operating system process 104.
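The "typing position" heuristic above can be sketched as a predicate over the touch coordinates reported by a multi-touch screen: several simultaneous touch points lying roughly along one horizontal row suggest fingers resting in a typing position. The finger count and spread thresholds are illustrative assumptions.

```python
# Sketch of the "typing position" initiation heuristic: trigger a virtual
# keyboard initiation event when enough simultaneous touch points line up
# roughly along a horizontal row, as when fingers rest on a home row.

def is_typing_position(points, min_fingers=4, max_y_spread=40):
    """points: list of (x, y) touch coordinates from the touch screen.
    True when the combination of points resembles a typing position."""
    if len(points) < min_fingers:
        return False
    ys = [y for _, y in points]
    return max(ys) - min(ys) <= max_y_spread
```

Four touches whose vertical positions differ by only a few pixels qualify; two touches, or touches scattered vertically, do not.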
The virtual input device deactivation event will now be discussed in more detail. As shown in FIG. 1, the triggering of a virtual input device deactivation event causes the virtual input device to cease to be displayed as part of the composite display on the touch screen. The virtual input device deactivation event may be triggered, for example, by an action taken by a user directly with respect to the virtual input device. This may include, for example, activating a particular "deactivate" key on the virtual input device display to cause the virtual input device to cease being displayed as part of the composite display. More generally, interaction with an application (not necessarily by activating a key on the virtual input device) may also result in triggering a deactivation event.
One example of such interaction includes interaction with a display of an executing application in a manner that makes providing input through a virtual input device inappropriate. Another example includes interaction with the application to close the application (displayed by the application or by the virtual keyboard, as appropriate). Another example includes a gesture (such as "rubbing" a hand across a keyboard) or activating a virtual return key in conjunction with a finger "sliding" away from the virtual return (return) key, which results in activating a "return" and then causing the virtual keyboard to disappear.
As another example, triggering a deactivation event may be less closely tied to a particular interaction with the virtual input device, or with the touch screen in general, and may instead be caused, for example, by the passage of a particular amount of time since a key on the virtual input device was last activated. That is, the virtual input device going unused for a particular amount of time may indicate that the virtual keyboard is no longer needed. In another example, the application itself may trigger a deactivation event, such as when the state of the application is such that the display of the virtual input device is deemed unnecessary and/or inappropriate.
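The idle-timeout deactivation just described might be sketched as a small state holder that records the time of the last key activation and reports when the timeout has elapsed. The class shape and the injected clock are illustrative assumptions.

```python
# Sketch of idle-timeout deactivation: trigger a deactivation event once a
# given amount of time has passed since a virtual key was last activated.
# The clock is injected (e.g., time.monotonic) so the logic is testable.

class IdleDeactivator:
    def __init__(self, timeout, now):
        self.timeout = timeout          # seconds of inactivity allowed
        self.now = now                  # zero-argument clock function
        self.last_key_time = now()

    def key_activated(self):
        """Record that a virtual key was just activated."""
        self.last_key_time = self.now()

    def should_deactivate(self):
        """True once the idle timeout has elapsed with no key activations."""
        return self.now() - self.last_key_time >= self.timeout
```

Any key activation resets the idle clock, so the keyboard disappears only after a continuous stretch of non-use.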
Various modes of operation of the virtual input device are now discussed. In one example, input (typically, but not limited to, text) associated with the activated key may be provided directly to and operated on by the application to which the application display corresponds. The indication of the input may even be displayed directly in an input field associated with the application.
In other examples, one of which is shown in FIG. 6, the indication of the input may appear in a portion 604 of the display that is associated with the virtual input device 602, but not directly associated with the application display. The input may then be communicated to the application (either directly acted upon by the application or communicated to an input field 608 associated with the application display) automatically or upon command by the user. According to one example, automatic transfer occurs when "n" characters are entered via virtual input device 602, where "n" may be a user-configurable setting. According to another example, the automatic transfer occurs every "m" seconds or other unit of time, where "m" may be a user-configurable setting.
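The buffered mode above, where input accumulates in a staging area and transfers automatically after "n" characters or "m" seconds, can be sketched as a small buffer with two flush triggers. The class and attribute names, and the use of a list to stand in for the application's input field, are illustrative assumptions.

```python
# Sketch of the buffered-input mode: characters typed on the virtual input
# device accumulate in a staging area and are automatically transferred to
# the application after n characters or after m seconds, whichever first.

class InputBuffer:
    def __init__(self, n, m, now):
        self.n, self.m = n, m           # user-configurable thresholds
        self.now = now                  # injected clock function
        self.chars = []                 # staging area shown near the keyboard
        self.last_flush = now()
        self.delivered = []             # stands in for the application's input

    def type_char(self, c):
        self.chars.append(c)
        if len(self.chars) >= self.n or self.now() - self.last_flush >= self.m:
            self.flush()

    def flush(self):
        """Transfer the staged characters to the application."""
        self.delivered.append("".join(self.chars))
        self.chars = []
        self.last_flush = self.now()
```

With n = 3, the third character triggers a transfer; a later character typed after the m-second window also transfers immediately.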
In some examples, the virtual input device display 602 includes a visual indicator 606 associating the virtual input device 602 with an input field 608 of the application display. Referring to the exemplary display 600 in FIG. 6, the virtual input device display 602 contains a visual indicator arrow 606, the visual indicator arrow 606 pointing from the virtual input device display 602 to the corresponding input field 608 of the application display. The visual indicator 606 is not limited to a pointer. As another example, the visual indicator 606 may be a highlighting of the input field 608 of the application display.
In some examples, the display associated with the virtual input device is presented in a window smaller than the virtual input device itself (and the size of the window may be user-configurable). In this case, the user may activate portions of the virtual input device display to scroll to (and thereby access) different portions of the virtual input device. FIGS. 7A, 7B, and 7C illustrate virtual input device displays in various scrolled states. Scrolling may even be in more than two dimensions (e.g., over a virtual cube or other virtual shape of three or more dimensions) to access undisplayed portions of the virtual input device.
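The windowed scrolling described above can be sketched as a small viewport sliding over a larger key layout. This is an assumed one-dimensional illustration (scrolling by rows); the class and parameter names are not from the patent, and the multi-dimensional variants would generalize the offset to a vector.

```python
class ScrollableKeyboardView:
    """Illustrative sketch: only a slice of the virtual input device's
    key rows fits in the (user-configurable) window; scroll gestures
    shift which slice is visible, clamped to the layout bounds."""

    def __init__(self, rows, window_rows=2):
        self.rows = rows                  # full virtual keyboard layout
        self.window_rows = window_rows    # window size, in rows
        self.offset = 0                   # index of first visible row

    def scroll(self, delta):
        """Shift the visible slice by delta rows, clamped to the layout."""
        max_offset = max(0, len(self.rows) - self.window_rows)
        self.offset = min(max(self.offset + delta, 0), max_offset)

    def visible(self):
        """Rows currently shown in the window."""
        return self.rows[self.offset:self.offset + self.window_rows]
```

Clamping in `scroll` means over-scrolling simply stops at the first or last row, which matches the intuition that the window can only ever show some contiguous portion of the device.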
The many features and advantages of the invention are apparent from the written description and, thus, it is intended by the appended claims to cover all such features and advantages of the invention. Further, since numerous modifications and changes will readily occur to those skilled in the art, the invention should not be limited to the exact construction and operation as illustrated and described. Accordingly, all suitable modifications and equivalents may be resorted to, falling within the scope of the invention.

Claims (36)

1. A computer-implemented method of generating a display on a touchscreen of a computer, the display including an application display associated with an application executing on the computer and a virtual input device display for a user to provide input to the application executing on the computer through the touchscreen, the method comprising:
determining initial characteristics of the virtual input device display in response to the virtual input device initiation event;
determining an initial characteristic of a composite display image comprising a portion of the application display and the virtual input device display based on the characteristic of the application display and the characteristic of the virtual input device display, wherein the initial characteristic of the composite display image comprises a modification to the application display such that at least a portion of the application display for the virtual input device is enlarged on the composite display; and
causing the composite display to be displayed on the touch screen.
2. The method of claim 1, further comprising:
displaying an application display on the touch screen without the virtual input device display prior to the virtual input device initiation event.
3. The method of claim 1, wherein the virtual input device initiation event is caused by a gesture detected at the touch screen.
4. The method of claim 3, wherein the detected gesture includes a plurality of touch points detected at locations of the touch screen having a predetermined characteristic.
5. The method of claim 4, wherein the predetermined characteristic represents a typing position in which fingers rest on an input device.
6. The method of claim 3, wherein the detected gesture includes a gesture for an input field of an application display on the touch screen.
7. The method of claim 3, wherein the detected gesture comprises a gesture for a particular user interface item displayed on the touchscreen.
8. The method of claim 7, wherein the particular user interface item is associated with the application display.
9. The method of claim 8, wherein the user interface item associated with the application display is an input field associated with the application display.
10. The method of claim 9, wherein the gesture includes at least one tap detected on a portion of the touch screen associated with the input field.
11. The method of claim 7, wherein the particular user interface item is associated with a desktop portion of a touch screen associated with an operating system of the computer.
12. The method of claim 1, further comprising:
upon detecting a virtual input device deactivation event, causing display of the composite image including the virtual input device display to cease, wherein the virtual input device deactivation event is triggered by a particular gesture for the composite display that is inconsistent with input via the virtual input device.
13. The method of claim 1, wherein the composite display includes a visual indicator that visually associates the virtual input device display with an input field of the application display.
14. The method of claim 13, wherein the visual indicator is an arrow from a portion of the virtual input device display to an input field of the application display.
15. The method of claim 14, wherein the portion of the virtual input device display is an input display of the virtual input device.
16. The method of claim 13, wherein the visual indicator is a differentiated display of an input field of the application display.
17. The method of claim 1, wherein the virtual input device display comprises a portion of a display configured to receive input.
18. The method of claim 17, further comprising:
communicating input from the portion of the display configured to receive input to an input field of the application display.
19. An apparatus for generating a display on a touchscreen of a computer, the display containing an application display associated with an application executing on the computer and a virtual input device display for a user to provide input to the application executing on the computer through the touchscreen, the apparatus comprising:
means for determining an initial characteristic of the virtual input device display in response to a virtual input device initiation event;
means for determining an initial characteristic of a composite display image comprising a portion of the application display and the virtual input device display based on the characteristic of the application display and the characteristic of the virtual input device display, wherein the initial characteristic of the composite display image comprises a modification to the application display such that at least a portion of the application display for the virtual input device is enlarged on the composite display; and
means for causing a composite display to be displayed on the touch screen.
20. The apparatus of claim 19, further comprising:
means for displaying an application display on the touch screen without the virtual input device display prior to the virtual input device initiation event.
21. The device of claim 19, wherein the virtual input device initiation event is caused by a gesture detected at the touch screen.
22. The device of claim 21, wherein the detected gesture includes a plurality of touch points detected at locations of the touch screen having a predetermined characteristic.
23. The device of claim 22, wherein the locations having the predetermined characteristic comprise locations having a characteristic predetermined as a characteristic of fingers on an input device.
24. The device of claim 21, wherein the detected gesture includes a gesture for an input field of an application display on a touch screen.
25. The device of claim 21, wherein the detected gesture comprises a gesture for a particular user interface item displayed on the touchscreen.
26. The device of claim 25, wherein the particular user interface item is associated with the application display.
27. The device of claim 26, wherein the user interface item associated with the application display is an input field associated with the application display.
28. The device of claim 27, wherein the gesture includes at least one tap detected on a portion of the touch screen associated with the input field.
29. The device of claim 25, wherein the particular user interface item is associated with a desktop portion of the touch screen associated with an operating system of the computer.
30. The apparatus of claim 19, further comprising:
means for ceasing display of the composite image including the virtual input device display upon detection of a virtual input device deactivation event, wherein the virtual input device deactivation event is triggered by a particular gesture for the composite display that is inconsistent with input via the virtual input device.
31. The device of claim 19, wherein the composite display includes a visual indicator that visually associates the virtual input device display with an input field of the application display.
32. The device of claim 31, wherein the visual indicator is an arrow from a portion of the virtual input device display to an input field of the application display.
33. The device of claim 32, wherein the portion of the virtual input device display is an input display of the virtual input device.
34. The device of claim 31, wherein the visual indicator is a differentiated display of an input field of the application display.
35. The device of claim 19, wherein the virtual input device display comprises a portion of a display configured to receive input.
36. The apparatus of claim 35, further comprising:
means for communicating input from a portion of the display configured to receive input to an input field of the application display.
HK13104054.0A (priority date 2005-09-16, filed 2013-04-02) — Virtual input device placement on a touch screen user interface — HK1177269A (en)

Applications Claiming Priority (1)

Application Number: US11/228,758 — Priority Date: 2005-09-16

Publications (1)

Publication Number: HK1177269A (en) — Publication Date: 2013-08-16


