HK1193665A - Multi-application environment - Google Patents

Multi-application environment

Info

Publication number
HK1193665A
Authority
HK
Hong Kong
Prior art keywords
application
interface
gesture
immersive
interfaces
Prior art date
Application number
HK14107055.1A
Other languages
Chinese (zh)
Other versions
HK1193665B (en)
Inventor
Robert J. Jarrett
Jesse Clay Satterfield
Nils A. Sundelin
Bret P. Anderson
Tsz Yan Wong
Chaitanya Dev Sareen
Patrice L. Miner
Jensen Harris
David A. Matthews
Jennifer Nan
Matthew I. Worley
Original Assignee
Microsoft Technology Licensing, Llc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Microsoft Technology Licensing, LLC
Publication of HK1193665A
Publication of HK1193665B


Description

Multi-application environment
Background
Conventional operating systems allow a user to view multiple computing applications through windows. Each of these windows typically includes a frame with controls for interacting with the computing application, as well as controls for selecting which window frame is primary, or for moving, sizing, or otherwise managing the layout of the windows. However, these window frames occupy portions of the display that might otherwise be devoted to the content of the application. Moreover, managing the primacy or layout of these windows through these controls can be time consuming and cumbersome for the user.
Disclosure of Invention
This document describes techniques and apparatuses that enable a multi-application environment. The multi-application environment described herein can present multiple applications without dedicating a significant portion of the display to window frames for the applications and/or without requiring management of window frames, such as their size, location, or primacy on the display. In some embodiments, the techniques and apparatuses enable a multi-application environment with a combination of an immersive interface, a window-based interface, and a desktop treated as an immersive interface. Additionally, in some embodiments, the techniques and apparatuses enable management of applications in a multi-application environment, such as sizing and moving interfaces within the environment. Further, some embodiments enable management of previously interacted-with applications that are not currently rendered. Moreover, some embodiments of the described techniques enable management of applications and their interfaces, whether currently displayed or not, through edge gestures or user interface management menus.
This summary is provided to introduce simplified concepts that are further described below in the detailed description. This summary is not intended to identify essential features of the claimed subject matter, nor is it intended for use in determining the scope of the claimed subject matter. Technologies and/or apparatus that enable a multi-application environment are also referred to herein, separately or collectively, as "technologies" when the context permits, although the technologies may include or alternatively represent other aspects described herein.
Drawings
Embodiments of enabling a multi-application environment are described with reference to the following figures. The same numbers are used throughout the drawings to reference like features and components:
FIG. 1 illustrates an exemplary system in which techniques enabling a multi-application environment may be implemented.
FIG. 2 illustrates an exemplary method for enabling an edge gesture that is approximately perpendicular to the edge where the gesture begins.
FIG. 3 illustrates an exemplary tablet computing device with a touch-sensitive display that presents an immersive interface.
FIG. 4 illustrates the exemplary immersive interface of FIG. 3 along with exemplary edges.
FIG. 5 illustrates the exemplary immersive interface of FIGS. 3 and 4, along with a line showing an angle of variation from vertical and a line from the starting point of the gesture to a later point of the gesture.
FIG. 6 illustrates the edges of the immersive interface shown in FIG. 4, along with two regions in the right edge.
FIG. 7 illustrates an application selection interface presented by the system interface module in response to an edge gesture made on the immersive interface and web page of FIG. 3.
FIG. 8 illustrates an exemplary method for enabling edge gestures, including determining an interface to present based on some factor of the gesture.
FIG. 9 illustrates an exemplary method of enabling expansion of a user interface presented in response to an edge gesture, presentation of another user interface, or cessation of presentation.
FIG. 10 illustrates a laptop computer with a touch-sensitive display having a window-based email interface and two immersive interfaces.
FIG. 11 illustrates the interfaces of FIG. 10 along with two gestures, each having a start point, a later point, and one or more successive points.
FIG. 12 illustrates the window-based email interface of FIGS. 10 and 11, along with an email processing interface presented in response to an edge gesture.
FIG. 13 illustrates the interfaces of FIG. 12 along with an additional email options interface presented in response to a gesture determined to have a successive point at a preset distance from the edge.
FIG. 14 illustrates a method of switching back to a previously interacted-with application by using a queue.
FIG. 15 illustrates an exemplary interaction sequence for a user interacting with various applications.
FIG. 16 illustrates the immersive interface of FIG. 3 along with thumbnail images of user interfaces of previous applications.
FIG. 17 illustrates a method for switching back to a previously interacted-with application, which may or may not use a queue.
FIG. 18 illustrates the immersive interface of FIGS. 3 and 16, two progressive presentations, and two gesture portions.
FIG. 19 illustrates a method of enabling a multi-application environment, including changing the size of multiple immersive interfaces in response to a single selection.
FIG. 20 illustrates the desktop computing device of FIG. 1 with a touch-sensitive display shown displaying a multi-application environment with two immersive interfaces divided by an interface divider region.
FIG. 21 illustrates the multi-application environment of FIG. 20 with the sizes of the two immersive interfaces changed and the interface divider region moved.
FIG. 22 illustrates a method for displaying, in response to as few as one selection, an immersive interface of an application fully occupying a region and at the size of that region.
FIG. 23 illustrates a current immersive interface that fully occupies a multi-application environment having three regions.
FIG. 24 illustrates the multi-application environment of FIG. 23 with a reduced-size immersive interface and a second immersive interface in place of the current immersive interface of FIG. 23.
FIG. 25 illustrates a method for managing a multi-application environment through a user interface.
FIG. 26 illustrates an exemplary multi-application environment having primary and non-primary regions.
FIG. 27 illustrates the multi-application environment of FIG. 26 including a user interface management menu.
FIG. 28 illustrates a method of enabling a desktop to be displayed as an immersive interface within a multi-application environment.
FIG. 29 illustrates an exemplary multi-application environment with a desktop immersive interface displaying a window-based interface and a taskbar, and an immersive interface displaying content.
FIG. 30 illustrates a method of enabling content presentation and/or managing a multi-application environment.
FIG. 31 illustrates an exemplary device in which techniques enabling a multi-application environment may be implemented.
Detailed Description
Overview
This document describes techniques and apparatuses that enable a multi-application environment. The multi-application environment described herein can present multiple applications without dedicating a significant portion of the display to window frames for the applications and/or without requiring management of window frames, such as their size, location, or primacy on the display. In some embodiments, these techniques and apparatuses enable a multi-application environment with a combination of immersive interfaces, window-based interfaces, and desktops treated as immersive interfaces. Additionally, in some embodiments, the techniques and apparatuses enable management of applications that are or are not currently presented in a multi-application environment, such as sizing and moving interfaces within the environment, and selection of previously interacted-with applications that are not currently presented. In some embodiments, this and other forms of management are enabled through edge gestures made on the multi-application environment or through user interface management menus.
These are merely a few examples of the many ways in which the techniques may enable a multi-application environment, and others are described below.
Exemplary System
FIG. 1 illustrates an exemplary system 100 in which techniques enabling a multi-application environment may be embodied. The system 100 includes a computing device 102, six examples of which are shown: laptop 104, tablet computing device 106, smart phone 108, set-top box 110, desktop computer 112, and gaming device 114. However, other computing devices and systems, such as servers and netbooks, may also be used.
The computing device 102 includes a computer processor 116 and a computer-readable storage medium 118 (medium 118). The media 118 includes an operating system 120, a window-based mode module 122, a multi-application environment module 124, a system interface module 126, a gesture handler 128, an application manager 130 that includes or has access to an application queue 132, a manager 134, and one or more applications 136, each having one or more application user interfaces 138.
Computing device 102 also includes or has access to one or more displays 140 and input mechanisms 142. Four example displays are illustrated in FIG. 1. Input mechanism 142 may include gesture sensitive sensors and devices, such as touch-based sensors and motion tracking sensors (e.g., camera-based), as well as a mouse (either standalone or integrated with a keyboard), track pad, and microphone with accompanying voice recognition software, to name a few. The input mechanism 142 may be separate or integrated with the display 140; examples of integration include gesture-sensitive displays with integrated touch-sensitive or motion-sensitive sensors.
The window-based mode module 122 presents the application user interface 138 through a window having a frame. These frames may provide controls through which to interact with the application, and/or controls that enable a user to move and resize the window.
The multi-application environment module 124 provides an environment through which a user may view and interact with one or more of the applications 136 through the application user interfaces 138. The multi-application environment module 124 can present one or more application user interfaces 138 in conjunction with the window-based mode module 122. The multi-application environment module 124 may also or alternatively present one or more application user interfaces 138 as immersive interfaces.
In some embodiments, the immersive interface presents the content of the application and enables interaction with the application with little or no window frame, without requiring the user to manage the layout or primacy of the window frame relative to other windows (e.g., which window is active or in front), without requiring the user to manually size or position the application user interface 138, and/or without presenting visual controls (e.g., permanent controls on the window frame or in a window that obscures the content).
The multi-application environment enabled by the multi-application environment module 124 may (but need not) be hosted and/or surfaced without using a window-based desktop environment. Thus, in some cases, multi-application environment module 124 presents the multi-application environment as an immersive environment that lacks windows (even windows without substantial frames) and/or precludes the use of desktop-like displays (e.g., a taskbar). Also, in some embodiments, this multi-application environment is similar to an operating system in that it cannot be closed or uninstalled. In some cases, although not required, such a multi-application environment enables all or nearly all of the pixels of the display to be used by applications within the multi-application environment.
Examples of multi-application environments are provided below, some of which include only immersive interfaces, and some of which include window-based interfaces or desktops that are treated as immersive interfaces, although they are not exhaustive or intended to limit the techniques described herein.
The system interface module 126 provides one or more interfaces through which interaction with the operating system 120 is enabled, such as an application launch interface, a start menu, or a system tools or options menu, to name a few.
The operating system 120, modules 122, 124, and 126, and gesture handler 128, application manager 130, and manager 134 may be separate from each other, or may be combined or integrated in any suitable form.
Exemplary method
Exemplary methods 200, 800, and 900 handle edge gestures, exemplary methods 1400 and 1700 handle switching back to a previously interacted-with application, exemplary methods 1900 and 2200 handle managing an immersive interface in a multi-application environment, exemplary method 2500 handles managing a multi-application environment through a user interface, method 2800 handles a desktop treated as an immersive interface within a multi-application environment, and method 3000 handles content presentation and/or management of a multi-application environment.
The methods may be used separately or in combination with each other, in whole or in part. For example, the techniques may use edge gestures to enable selection, sizing, and switching of interfaces that are currently in a multi-application environment. The techniques may also use the application queue to select a previously interacted-with application, with or without edge gestures. Moreover, the techniques may automatically size multiple interfaces upon selection of a previously interacted-with application through an application queue, or in response to a selection to move or change a currently displayed interface.
Edge gestures
FIG. 2 depicts a method 200 for enabling an edge gesture based on the edge gesture being approximately perpendicular to the edge at which the gesture begins. In some portions of the following discussion, reference may be made to system 100 of FIG. 1 and to other methods and example embodiments described elsewhere herein; these references are made for example only.
Block 202 receives a gesture. This gesture may be received at various portions of the display, such as on a window-based interface, an immersive interface, or no interface. Moreover, this gesture may be made and received in various ways, such as a pointer tracking motion received through a touchpad, mouse, or roller ball, or physical motion made with an arm, finger, or stylus received through a motion-sensitive or touch-sensitive mechanism, and so forth.
As an example, consider FIG. 3, which illustrates tablet computing device 106. Tablet computing device 106 includes a touch-sensitive display 302, which is shown displaying an immersive interface 304 that includes a web page 306. As part of the ongoing example, at block 202, the gesture handler 128 receives a gesture 308, as shown in FIG. 3.
Block 204 determines whether the starting point of the gesture is at an edge. As noted above, the edge in question may be an edge of a user interface (whether immersive or window-based) and/or an edge of a display. In some cases, of course, the edge of the user interface is also the edge of the display. The size of the edge may vary depending on various factors related to the display or interface. A small display or interface may have a smaller edge, in absolute terms or in pixels, than a large display or interface. A highly sensitive input mechanism also permits a smaller edge. An exemplary edge is rectangular, varying between 1 and 20 pixels in one dimension and extending along the bounds of the interface or display in the other dimension, although other sizes and shapes, including convex and concave edges, may alternatively be used.
Continuing the ongoing example, consider FIG. 4, which illustrates the immersive interface 304 and gesture 308 of FIG. 3, as well as left edge 402, top edge 404, right edge 406, and bottom edge 408. The web page 306 is not shown, for visual clarity. In this example, the dimensions of the interface and display are of a medium size, between those of a smartphone and many laptop and desktop computer displays. Edges 402, 404, 406, and 408 have a small dimension of 20 pixels, and the area of each edge is shown bounded by a dashed line at 20 pixels from the display or interface boundaries 410, 412, 414, and 416, respectively. Although shown as overlapping at the corners, the edges may alternatively be mitered at the corners, or one edge may be given priority over another (e.g., if the starting point is received at an overlapping corner, edge 404 is given priority over edge 406).
The gesture handler 128 determines that the gesture 308 has a start point 418 and that this start point 418 is within the left edge 402. In this case, the gesture handler 128 determines the starting point by receiving data indicating the [X, Y] coordinates, in pixels, at which the gesture 308 begins, and comparing the first of these coordinates to the pixels contained within each edge 402, 404, 406, and 408. The gesture handler 128 can often determine the starting point, and whether it is at an edge, more quickly than the sampling rate, thereby causing little performance degradation compared to simply passing the gesture directly to the exposed interface on which the gesture was made.
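The edge hit-test that block 204 performs can be sketched as follows. This is an illustrative sketch only, not the patented implementation; the function name, the 20-pixel thickness, and the left/top tie-breaking at overlapping corners are assumptions drawn from the example above.

```python
# Hypothetical hit-test for block 204: compare a gesture's first sampled
# [X, Y] coordinate against rectangular edge regions of a display or
# interface. The 20-pixel thickness follows the example of FIG. 4.

EDGE_THICKNESS = 20  # pixels; the text allows 1-20 depending on display size

def edge_at_point(x, y, width, height, thickness=EDGE_THICKNESS):
    """Return the name of the edge containing (x, y), or None if the
    point is not within any edge region."""
    # Edges overlap at corners; checking left and top first gives them
    # priority, one way of resolving the overlap mentioned in the text.
    if x < thickness:
        return "left"
    if y < thickness:
        return "top"
    if x >= width - thickness:
        return "right"
    if y >= height - thickness:
        return "bottom"
    return None
```

A point for which this returns None would follow the No path of block 204 and simply be passed through to the surfaced interface.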
Returning to method 200: in general, if block 204 determines that the starting point of the gesture is not at an edge, the method 200 proceeds along the No path to block 206. Block 206 communicates the gesture to a surfaced user interface, such as an underlying interface on which the gesture is received. Altering the ongoing example, assume that gesture 308 is determined not to have a starting point within an edge. In this case, gesture handler 128 passes the buffered data for gesture 308 to immersive user interface 304. After passing the gesture, the method 200 ends.
If block 204 determines that the starting point of the gesture is in an edge, the method 200 proceeds along the Yes path to block 208. Block 208 responds to the affirmative determination of block 204 by determining whether a line from the starting point to a later point of the gesture is approximately perpendicular to the edge.
In some embodiments, block 208 determines the later point to use. The gesture handler 128 may determine the later point of the gesture, for example, based on the later point being received at a preset distance from the edge or from the starting point, such as past the edge boundary 410 of the edge 402 or twenty pixels from the starting point 418 (all of FIG. 4). In some other embodiments, the gesture handler 128 determines the later point based on it being received a preset time after receipt of the starting point, such as an amount of time slightly greater than that generally used by the computing device 102 to determine whether a gesture is a tap-and-hold or hover gesture.
For the ongoing example, the gesture handler 128 uses a point of the gesture 308 received outside the edge 402 as the later point, so long as that point is received within a preset time. If no point is received outside the edge within this preset time, the gesture handler 128 proceeds to block 206 and communicates the gesture 308 to the immersive interface 304.
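A minimal sketch of this later-point selection follows, assuming gesture samples arrive as (x, y, time) tuples; the 20-pixel distance and 200 ms timeout are illustrative values, not taken from the patent.

```python
import math

def pick_later_point(samples, start, min_distance=20, timeout_ms=200):
    """Return the first sample at least min_distance pixels from the
    starting point, or None if no such sample arrives within the preset
    time (in which case the gesture is passed through, as in block 206).
    Each sample and the start are (x, y, t_ms) tuples."""
    sx, sy, st = start
    for x, y, t in samples:
        if t - st > timeout_ms:
            return None  # preset time elapsed without leaving the edge
        if math.hypot(x - sx, y - sy) >= min_distance:
            return (x, y, t)
    return None
```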
Using the starting point, block 208 determines whether a line from the starting point to a later point of the gesture is approximately perpendicular to the edge. Various varying angles, such as five, ten, twenty, or thirty degrees, may be used by block 208 in making this determination.
As an example, consider a variance angle of thirty degrees from vertical. FIG. 5 illustrates this example, showing the immersive interface 304, the gesture 308, the left edge 402, the left edge boundary 410, and the start point 418 of FIGS. 3 and 4, along with a thirty-degree variance line 502 from vertical line 504. Because the line 506 from the starting point 418 to the later point 508 is about twenty degrees from vertical, and thus within the exemplary thirty-degree variance line 502, the gesture handler 128 determines that the line 506 is approximately vertical.
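The approximately-perpendicular test of block 208 can be sketched as an angle check against the edge's inward normal; a line within the variance angle (thirty degrees in the example above) counts as approximately perpendicular. This is an illustrative sketch only, assuming screen coordinates with y growing downward.

```python
import math

def is_approximately_perpendicular(start, later, edge, variance_deg=30.0):
    """True if the line from start to later is within variance_deg of
    the edge's inward normal, per the thirty-degree example of FIG. 5."""
    dx = later[0] - start[0]
    dy = later[1] - start[1]
    length = math.hypot(dx, dy)
    if length == 0:
        return False  # no movement; nothing to measure
    # Inward-pointing unit normal for each edge (y grows downward).
    nx, ny = {"left": (1, 0), "right": (-1, 0),
              "top": (0, 1), "bottom": (0, -1)}[edge]
    # Angle between the gesture line and the normal.
    cosine = (dx * nx + dy * ny) / length
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cosine))))
    return angle <= variance_deg
```

For instance, a start point (5, 100) in the left edge and a later point (40, 110) give a line about sixteen degrees off the left edge's normal, which passes the test; a mostly edge-parallel motion fails it.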
In general, if block 208 determines that the line is not approximately perpendicular to the edge, method 200 proceeds along the No path to block 206. As noted in part above, block 208 may also determine that a later point or some other aspect of the gesture disqualifies the gesture. Examples include when a later point is within the edge, such as due to a hover, tap, press-and-hold, or up-and-down gesture (e.g., scrolling content in a user interface), or when the gesture is set to be a single-input gesture and a second input is received (e.g., a first finger starts at the edge but a second finger then lands anywhere).
If block 208 determines from a later point outside the edge that the line is approximately vertical, method 200 proceeds along the YES path to block 210.
Block 210 responds to the affirmative determination at block 208 by passing the gesture to an entity other than the exposed user interface. This entity is not the user interface on which the gesture was received, assuming the gesture was received entirely on that user interface. Block 210 may also determine to which entity to pass the gesture, such as based on the edge, or region of the edge, in which the gesture's starting point was received. For example, consider FIG. 6, which illustrates immersive interface 304 and edges 402, 404, 406, and 408 of FIG. 4, but adds a top region 602 and a bottom region 604 to right edge 406. A starting point in the top region 602 may result in a different entity (or even the same entity providing a different user interface in response) than a starting point received in the bottom region 604. Likewise, a starting point in the top edge 404 may result in a different entity or interface than one in the left edge 402 or the bottom edge 408.
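One way to sketch block 210's dispatch is a lookup keyed on the edge and, for edges with regions, on the region. The entity names below are purely hypothetical placeholders; the patent does not fix a particular mapping.

```python
def entity_for_start_point(edge, y, height):
    """Choose which entity receives the gesture, based on the edge in
    which the gesture started and, for the right edge, whether it began
    in its top or bottom region."""
    if edge == "right":
        # Top and bottom regions of the same edge may map to different
        # entities (or the same entity with a different interface).
        return "system_interface_module" if y < height / 2 else "application_manager"
    return {"left": "system_interface_module",
            "top": "current_application",
            "bottom": "current_application"}[edge]
```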
In some cases, this entity is an application associated with the user interface, rather than an application not associated with it, such as a system entity or a different application. In such a case, passing the gesture to the entity may effectively cause the application to present a second user interface enabling interaction with the application. In the movie example above, the entity may be the media player playing the movie, rather than the immersive interface displaying the movie. The media player may then present a second user interface enabling selection of subtitles or a director's commentary, rather than the selections enabled by the interface displaying the movie, such as "pause", "play", and "stop". This capability is permitted in FIG. 1, where one of the applications 136 may include or be capable of presenting more than one application user interface 138. Thus, block 210 may pass the gesture to the system interface module 126, to the one of applications 136 currently presenting the user interface, or to another of applications 136, to name just three possibilities.
To conclude the ongoing example, at block 210 the gesture handler 128 passes the gesture 308 to the system interface module 126. The system interface module 126 receives the buffered portion of the gesture 308 and continues to receive the remainder of the gesture 308 as it is made by the user. FIG. 7 illustrates one possible response upon receiving gesture 308: an application selection interface 702 presented by the system interface module 126 over the immersive interface 304 and web page 306 of FIG. 3. The application selection interface 702 enables selection of various other applications and their respective interfaces at selectable application tiles 704, 706, 708, and 710.
The example application selection interface 702 is an immersive user interface presented through use of the multi-application environment module 124, although this is not required. The presented interface may alternatively be window-based and may be presented using the window-based mode module 122. Both modules are illustrated in FIG. 1.
Block 210 may also or alternatively determine to pass the gesture to a different entity and/or interface based on other factors related to the received gesture. Exemplary factors are described in more detail in method 800 below.
It should be noted that method 200 and the other methods described below can be performed in real time, such as while a gesture is being made and received. This permits, among other things, a user interface presented in response to a gesture to be presented before the gesture ends. Also, the user interface may be presented progressively as the gesture is received. This allows for a user experience of dragging the user interface from the edge as the gesture is performed, as if the user interface were "sticky" to the gesture (e.g., to a mouse pointer or the finger of the person making the gesture).
FIG. 8 depicts a method 800 for enabling edge gestures, including determining an interface to present based on certain factors of the gesture. In some portions of the following discussion, reference may be made to the system 100 of FIG. 1; such references are made for example only. The method 800 may act in whole or in part, separately from or in conjunction with other methods described herein.
Block 802 determines that a gesture made on the user interface has a starting point at an edge of the user interface and a later point that is not within the edge. Block 802 may operate similarly to, or use, some aspects of method 200, such as determining the later point on which block 802's determination is based. Block 802 may also act differently.
In one case, for example, block 802 determines that the gesture is a single-finger swipe gesture starting at an edge of the exposed immersive user interface and having a later point not at the edge, without basing the determination on the angle of the gesture. Based on this determination, block 802 proceeds to block 804 instead of passing the gesture to the exposed immersive user interface.
Block 804 determines which interface to present based on one or more factors of the gesture. Block 804 may do so depending on the final or intermediate length of the gesture, whether the gesture has a single point or multiple points (e.g., a single-finger or multi-finger gesture), or the speed of the gesture. Thus, block 804 may, for example, present a start menu in response to a multi-finger gesture, an application selection interface in response to a relatively short single-finger gesture, or a system control interface permitting selection of turning off the computing device 102 in response to a relatively long single-finger gesture. To do so, the gesture handler 128 may determine the length of the gesture or the number of inputs (e.g., fingers). In response, block 806 presents the determined user interface.
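The factor-based choice of block 804 might be sketched as below; the 150-pixel threshold separating "relatively short" from "relatively long" gestures, and the interface names, are assumptions for illustration only.

```python
def interface_for_gesture(num_inputs, length_px, short_threshold=150):
    """Map gesture factors to a user interface, following the examples
    in the text: a multi-finger gesture yields a start menu, a short
    single-finger gesture an application selection interface, and a
    long single-finger gesture a system control interface."""
    if num_inputs > 1:
        return "start_menu"
    if length_px < short_threshold:
        return "application_selection_interface"
    return "system_control_interface"
```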
By way of example, assume that gesture handler 128 determines to present a user interface that enables interaction with operating system 120 based on a factor of the gesture. In response, the system interface module 126 presents this user interface. Presentation of the user interface may occur in a manner similar to that described in other methods, such as through progressive display of the application selection user interface 702 of FIG. 7.
Following method 200 and/or method 800, in whole or in part, the techniques may proceed to perform method 900 of FIG. 9. The method 900 enables expanding a user interface presented in response to an edge gesture, presenting another interface, or ceasing to present the presented user interface.
Block 902 receives a successive point of the gesture after rendering at least some portion of the second user interface. As partially noted above, the methods 200 and/or 800 can present or cause presentation of a second user interface, such as a second user interface for the same application associated with the current user interface, for a different application, or for a system user interface.
By way of example, consider FIG. 10, which illustrates a laptop 104 with a touch-sensitive display 1002, the touch-sensitive display 1002 displaying a window-based email interface 1004 and two immersive interfaces 1006 and 1008. The window-based email interface 1004 is associated with an application that manages email, which may be remote from the laptop 104 or local to it. FIG. 10 also illustrates two gestures, 1010 and 1012. Gesture 1010 proceeds in a straight line, while gesture 1012 reverses direction (shown with two arrows indicating the two directions).
FIG. 11 illustrates a gesture 1010 and a gesture 1012, the gesture 1010 having a start point 1102, a later point 1104, and a successive point 1106; the gesture 1012 having the same start point 1102, a later point 1108, and first and second successive points 1110 and 1112. FIG. 11 also shows a bottom edge 1114, a later-point area 1116, and an interface attachment area 1118.
Block 904 determines, from the successive point, whether the gesture includes a reversal, an extension, or neither. Block 904 may determine a reversal by determining that a successive point is at the edge or is closer to the edge than a prior point of the gesture. Block 904 may determine that the gesture extends based on a successive point being a preset distance from the edge or from the later point. If neither is determined to be true, the method 900 may repeat blocks 902 and 904 to receive and analyze additional successive points until the gesture ends. If block 904 determines that there is a reversal, the method 900 proceeds along the "reversal" path to block 906. If block 904 determines that the gesture is extended, the method 900 proceeds along the "extension" path to block 908.
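Block 904's classification of each successive point can be sketched as follows, assuming a helper that measures a point's distance from the originating edge; the function and threshold names are illustrative, not from the patent.

```python
def classify_successive_point(succ, prev, dist_to_edge, extend_at):
    """Classify one successive point per block 904: 'reversal' if it is
    at the edge or closer to it than the prior point, 'extension' if it
    reaches the preset distance, else None (keep sampling).
    dist_to_edge maps an (x, y) point to its distance from the edge."""
    d_succ = dist_to_edge(succ)
    d_prev = dist_to_edge(prev)
    if d_succ == 0 or d_succ < d_prev:
        return "reversal"   # proceed to block 906
    if d_succ >= extend_at:
        return "extension"  # proceed to block 908
    return None             # repeat blocks 902 and 904
```

For a bottom edge like edge 1114, `dist_to_edge` could simply be `lambda p: display_height - p[1]`.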
In the context of this example, assume that the gesture handler 128 receives the first successive point 1110 of the gesture 1012. The gesture handler 128 then determines that the first successive point 1110 is not at the edge 1114, is not closer to the edge 1114 than a prior point of the gesture (e.g., not closer than the later point 1108), and is not a preset distance from the edge or later point, as it is not within the interface attachment area 1118. In this case, the method 900 returns to block 902.
At the second iteration of block 902, assume that the gesture handler 128 receives a second successive point 1112. In such a case, the gesture handler 128 determines that the second successive point 1112 is closer to the edge 1114 than the first successive point 1110 and thus that the gesture 1012 includes a reversal. The gesture handler 128 then proceeds to block 906 to cease presentation of the second user interface previously presented in response to the gesture. By way of example, consider FIG. 12, which illustrates an email handling interface 1202. In this example case of block 906, the gesture handler 128 causes the email application to cease presenting interface 1202 in response to the reversal of gesture 1012 (the removal is not shown).
Block 908, by contrast, presents or causes presentation of a third user interface, or expands the second user interface. Continuing the ongoing example, consider FIG. 13, which illustrates an additional email options interface 1302 presented in response to the gesture 1010, the gesture 1010 being determined to have a successive point 1106 a preset distance from the edge 1114, in this case within the interface attachment area 1118 of FIG. 11. This area and preset distance may be set according to the size of the user interface previously presented in response to the gesture. Thus, a user wishing to add additional controls may simply extend the gesture past the user interface presented in response to an earlier portion of the gesture.
Method 900 may repeat to add additional user interfaces or to further expand a presented user interface. Returning to the example interface 702 of FIG. 7, the gesture handler 128 may continue to add interfaces or add controls to interface 702 as the gesture 308 extends across it, such as by presenting an additional set of selectable application tiles. If the gesture 308 extends past the additional tiles, the gesture handler 128 may cause the system interface module 126 to present another interface adjacent to the tiles, enabling the user to select controls, such as to pause, hibernate, switch modes (immersive to window-based, and vice versa), or turn off the computing device 102.
While the user interfaces presented in response to edge gestures, as exemplified above, are opaque, they may also be partially transparent. This can be useful where occluding content is undesirable. In the movie example described above, the presented user interface may be partially transparent, permitting the movie to be only partially occluded during use of the user interface. Likewise, in the examples of FIGS. 12 and 13, interfaces 1202 and 1302 may be partially transparent, enabling a user to see the text of an email while also selecting a control on one of the interfaces.
As noted above, the exemplary methods 200, 800, and 900 handle edge gestures; they are described before methods 1400 and 1700, which handle switching back to an application with which a user has previously interacted. Any one or more of the methods may be used separately or in combination, in whole or in part, with the others.
Switching back to a previously interacted-with application
FIG. 14 depicts a method 1400 for switching back, through the use of a queue, to an application with which a user has previously interacted. In some portions of the following discussion, reference may be made to system 100 of FIG. 1 and to other methods and example embodiments described elsewhere herein, reference to which is made for example only.
Block 1402 maintains a queue of multiple interacted-with applications, the queue arranged from the most recently (most-recent) to the least recently (least-recent) interacted with, excluding the current application. For example, consider FIG. 15, which illustrates an interaction order 1502 in which a user interacts with various applications. First, the user interacts with the web search application 1504 through its interface. Second, the user interacts with the web-enabled media application 1506 through a web browser. Third, the user interacts with the local (non-web) photo application 1508 through its interface. Fourth, the user interacts with the social networking application 1510 through a web browser. Fifth, the user returns to interacting with the web-enabled media application 1506. Sixth, the user interacts with the web-enabled news application 1512, again through a web browser.
For the first interaction, no queue is maintained because no other application has been interacted with before it. For the second through sixth interactions in interaction order 1502, consider queues 1514, 1516, 1518, 1520, and 1522, which correspond respectively to each interaction in interaction order 1502 after the first. Queues 1514 through 1522 are exemplary iterations of the application queue 132 maintained by the application manager 130, both of which are shown in FIG. 1.
As shown in FIG. 15, the application manager 130 keeps the application queue 132 up to date according to the user's interactions. Queue 1522 includes, for example, media application 1506 as the most recently interacted-with application, followed by social networking application 1510 and photo application 1508, and ending with web search application 1504. Because the user interacts with the media application 1506 twice (in the second and fifth interactions), the application manager 130 removes it from the application queue 132 at the fifth interaction and reorders the other applications to reflect the latest order of interaction, excluding the application currently being interacted with.
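The queue bookkeeping described above can be sketched as follows. This is a minimal illustration using string identifiers for applications; the class and method names are assumptions, not from the source:

```python
# Sketch of application queue 132 maintenance per block 1402: most recently
# interacted-with application first, excluding the current application.
class ApplicationQueue:
    def __init__(self):
        self.current = None  # application now being interacted with
        self.queue = []      # most-recent to least-recent, excluding current

    def interact(self, app):
        """Record an interaction with `app`, updating queue and current."""
        if app == self.current:
            return
        if app in self.queue:          # e.g., the media app at the fifth interaction
            self.queue.remove(app)
        if self.current is not None:   # old current becomes most recent in queue
            self.queue.insert(0, self.current)
        self.current = app
```

Replaying interaction order 1502 (web search, media, photo, social, media, news) under this sketch yields a queue of media, social, photo, web search, matching queue 1522.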
Block 1404 receives a gesture or gesture portion. This gesture or gesture portion may include one or more of the various gestures or portions described elsewhere herein, such as a pointer-tracking motion received through a touchpad, mouse, or roller ball, or a physical motion made with an arm, finger, or stylus and received through a motion-sensitive or touch-sensitive mechanism. In some embodiments, multiple gesture portions are received, each a component of a single gesture and each causing presentation of an application in the queue. Each of these portions may have, but is not required to have, a start point at an edge of the display, a later point not at the edge, and a successive point back at the edge. In such a case, a gesture with multiple portions appears somewhat like a multi-loop spiral, multiple circles, or a back-and-forth (e.g., a zigzag), where each loop, circle, or back-and-forth starts at, leaves, and returns to the edge of the user interface or display. Alternatively, block 1404 may receive multiple separate gestures or gesture portions.
Continuing the ongoing example, consider again FIG. 3, which illustrates tablet computing device 106 with touch-sensitive display 302, shown displaying immersive interface 304 including web page 306. Assume for this example that immersive interface 304 is associated with news application 1512 and that web page 306 is content from news application 1512.
As part of this example, at block 1404 the gesture handler 128 receives the gesture 308 (as shown in FIG. 3) and passes it to the application manager 130. Assume for the ongoing example that gesture 308 is determined to be associated with switching back to a previously interacted-with application rather than with some other function or application.
In response to receiving the gesture or gesture portion, block 1406 proceeds through the queue to another of the multiple interacted-with applications. Thus, on receiving the gesture or gesture portion, the application manager 130 may proceed to the first application in the application queue 132, and thus to the most recently interacted-with application. In some embodiments, on receiving two gestures or portions, the application manager 130 may proceed to the second-most-recently interacted-with application in the application queue 132; the method 1400 may do so by repeating blocks 1404, 1406, and/or 1408, and so on, as described below.
Continuing the ongoing example, assume: gesture 308 is received after the sixth interaction, the application currently interacted with at that time is news application 1512, and application queue 132 is up-to-date and represented by queue 1522 of FIG. 15. In such a case, upon receiving the gesture or gesture portion, the application manager 130 proceeds to the media application 1506.
Block 1408 presents a user interface associated with the other application. In some embodiments, this user interface is the same user interface through which the application was previously interacted with. In other embodiments, the user interface is presented as a thumbnail or as a transparent overlay on the currently presented user interface. The application manager 130 presents this user interface either alone or in combination with the associated application, such as by causing the associated application to present the user interface with which the user last interacted.
For this example, the application manager 130 progressively presents a thumbnail image of the application's user interface as the gesture 308 is received, and then expands the thumbnail to occupy the real estate available on the display when the gesture ends. Application manager 130 thereby replaces web page 306 within immersive interface 304, or replaces immersive interface 304 with another interface, which may be immersive or window-based.
This is illustrated in FIG. 16, with thumbnail image 1602 of the media application 1506's user interface presented over web page 306 of the news application 1512 in the immersive interface 304. On completion of the gesture 308, the thumbnail image 1602 expands into media player 1604, replacing the web page 306 in the immersive interface 304. This is merely one example way of presenting a user interface of a selected application; other ways of responding, progressive or otherwise, are described elsewhere herein.
In some embodiments, block 1408 shrinks the current user interface into a second thumbnail image and passes that second thumbnail image to the area of the display from which the first-mentioned thumbnail image was progressively presented. Thus, block 1408 expands the thumbnail image 1602 into the media player 1604 while simultaneously shrinking the web page 306 into a thumbnail image and passing that thumbnail to the edge from which the thumbnail image 1602 was presented.
During presentation of the user interface at block 1408, another gesture or gesture portion may be received, returning method 1400 to block 1404. In some cases, the other gesture or gesture portion is received within a certain period while the user interface is being presented at block 1408. On returning to block 1404, block 1406 may then proceed to yet another, subsequent application of the multiple interacted-with applications, and block 1408 then presents the user interface associated with that subsequent application.
Thus, by repeating blocks 1404, 1406, and 1408, the user interfaces associated with previously interacted-with applications may be presented in succession. In some cases, a user interface associated with a previously interacted-with application may be presented in response to each gesture received. In the context of this example, when another gesture is received while the user interface of the media application 1506 is being presented, the user interface associated with the social networking application 1510 (the second-most-recently interacted-with application in queue 1522) is presented. Yet another gesture or gesture portion received during presentation of the user interface associated with the social networking application 1510 results in presentation of the user interface associated with the photo application 1508 (the third-most-recently interacted-with application in queue 1522), and so on.
Following such a switch from presenting the current application to presenting another, selected previous application, block 1410 updates the queue in response to an interaction with the user interface associated with the other application, or in response to a period of time elapsing during presentation of that user interface. In some cases a user may select a previous application and then quickly select yet another after it, in effect scanning through the applications in the queue. In such a case, block 1410 may forgo updating the queue, because a quick viewing is not considered an interaction.
Example interactions by which the application manager 130 updates the application queue 132 include an explicit selection made within the newly presented interface, such as using a control displayed in the user interface of the media player 1604 of FIG. 16 to control playback or to edit information about currently playing media. In other cases, an interaction is determined based on a period of time elapsing. For example, assume that a web page of a news application is presented upon selection rather than as the current application. After a certain period, such as one, two, or three seconds, the application manager 130 deems the dwell an interaction, based on the likelihood that the user is reading a news article on the web page. Similarly, presentation of a user interface of a media application at block 1408 may also be considered an interaction when the media application is playing media and remains on the display without another application in the application queue 132 being selected.
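The interaction tests described for block 1410 might be sketched as below; the threshold value and parameter names are assumptions for illustration:

```python
# Sketch of block 1410's decision whether a presentation counts as an
# interaction (warranting a queue update) or is merely a quick viewing.
DWELL_THRESHOLD = 2.0  # seconds; the text suggests one, two, or three seconds

def is_interaction(explicit_selection, is_playing_media, dwell_seconds):
    """Return True if the presented interface should be treated as interacted with."""
    if explicit_selection:   # e.g., a playback control used in media player 1604
        return True
    if is_playing_media:     # media plays and remains on the display
        return True
    return dwell_seconds >= DWELL_THRESHOLD  # a dwell implies, e.g., reading
```

A quick scan through the queue (short dwell, no selection, no playback) thus leaves the queue unchanged.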
As partially noted above, the application queue 132 may be circular. If the user reaches the least recently interacted-with application in the application queue 132, selection does not stop but instead loops. For example, on selecting to switch back from the social networking application 1510 to a previous application, and thus using queue 1518, switching back once results in selection of the photo application 1508, twice in selection of the media application 1506, and three times in selection of the web search application 1504. A fourth switch back loops around, again resulting in presentation of the photo application 1508.
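The looping behavior can be sketched with modular indexing. Here the queue contents mirror queue 1518 from the example; the function and names are assumptions:

```python
# Sketch of circular traversal of the application queue: switching back past
# the least recently used application wraps around to the front of the queue.
def switch_back(queue, times):
    """Return the application selected after `times` switch-back selections."""
    return queue[(times - 1) % len(queue)]

queue_1518 = ["photo", "media", "web_search"]
```

One switch back selects the photo application, three select the web search application, and a fourth wraps around to the photo application again.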
Method 1400 describes various ways in which the techniques may enable selection of previously interacted-with applications and determine which application from the queue to present. Method 1700 may operate in conjunction with method 1400 and other methods described herein, though use of a queue is not required. Method 1400 is therefore not intended to limit the techniques as described in exemplary method 1700.
FIG. 17 depicts a method 1700 for switching back to an application with which a user has previously interacted, with or without the use of a queue. In some portions of the following discussion, reference may be made to system 100 of FIG. 1, methods 200, 800, 900, and 1400, and the exemplary embodiments described above, reference to which is made for example only.
Block 1702 enables selection of a previously interacted-with application through a gesture made over a current user interface associated with a current application. Block 1702 may do so in the various ways described above, such as with an edge gesture or a portion of an edge gesture, to name just one example.
Block 1704, in response to receiving the gesture and without additional selection, presents a previous user interface associated with the previously interacted-with application.
For example, assume that a portion of the gesture is received in association with a selection of a previous application, such as an edge gesture that starts at an edge of the current user interface and proceeds approximately perpendicularly away from that edge. In response, block 1704 presents the user interface of the previously interacted-with application, a thumbnail image of that interface, or some indicator that the selection has been successfully made, along with an indicator of the selected application or interface.
Example thumbnail images or indicators include any of the selectable application tiles 704, 706, 708, and 710 of FIG. 7, some of which include thumbnail images of the interface, while others indicate the selected application. Another example is the thumbnail image 1602 of fig. 16.
Block 1704 presents the user interface of the selected, previously interacted-with application, as shown in FIG. 16 at media player 1604. In doing so, block 1704 may enable interaction with the media application 1506 through the immersive interface 304 without further selection. Thus, the user can interact without making additional selections after selecting a previous application with as little as one gesture. The user need not select, for example, to exit an application-selection mode, or to make the presented interface "live," primary, or at the top of a stack. In short, the techniques enable a previous application to be selected and further interacted with using a single input.
In this example of FIG. 16, immediately after the media player 1604 is presented and replaces the web page 306, the next input to the immersive interface 304 is passed immediately to the media application 1506. Accordingly, taps, hot keys, or other inputs are passed directly to the media application 1506, enabling an immediate response to those inputs by the media application 1506.
In some embodiments, the gesture made on the current user interface includes multiple portions, each portion indicating a selection of a previous application. In such a case, block 1704 presents the previous user interface in response to the first portion, and then, in response to block 1702 receiving the second portion of the gesture, presents a more previous user interface associated with the application with which it was more previously interacted, and so on.
FIG. 18 illustrates one manner in which block 1704 may respond to multiple gestures or to multiple portions of a single gesture, presenting the immersive interface 304 of FIG. 16 (shown twice for visual clarity). FIG. 18 illustrates two progressive presentations 1802 and 1804, respectively, and a gesture 1806 having two gesture portions 1806-1 and 1806-2. The first progressive presentation 1802 illustrates a drag of the thumbnail image 1602 from the left edge of the immersive interface 304, and thus a selection of the previously interacted-with media application 1506. It should be noted that the thumbnail image 1602 is "sticky" to the gesture portion 1806-1. It should also be noted that, unlike gesture 308 of FIGS. 3 and 16, gesture 1806 returns to the left edge. Thus, rather than the gesture ending and the media player 1604 replacing the web page 306, the gesture portion 1806-1 of the gesture 1806 returns to the edge at which it started. In this case, the thumbnail image 1602 is progressively displayed with the gesture portion 1806-1 but then disappears when the gesture portion 1806-1 returns to the edge.
The gesture 1806 continues with the second portion 1806-2. In response, block 1704 presents a second progressive presentation 1804, illustrating a second drag from the left edge of the immersive interface 304. Here, a social network thumbnail image 1808 of the next-most-previous application, social networking application 1510, is progressively presented. As part of the second portion 1806-2, the gesture 1806 returns to the left edge. In response, block 1704 progressively removes the thumbnail image 1808 as the gesture portion 1806-2 returns to the edge. This is merely one example of the manner in which the techniques enable a user to select and view a previous application, or even all previously interacted-with applications, with only a single gesture. At any point in this example, gesture 1806 may end with, or otherwise indicate, a selection to present a full user interface of the selected application, at which point block 1704 presents that user interface (e.g., media player 1604 of FIG. 16 or a full user interface of the social networking application).
As noted above, the exemplary methods 200, 800, and 900 handle edge gestures and are described before methods 1400 and 1700, which handle switching back to a previously interacted-with application; these in turn are described before methods 1900 and 2200. Any one or more of the methods may be used, in whole or in part, separately or in conjunction with the others.
Managing immersive interfaces
Fig. 19 depicts a method 1900 of enabling a multi-application environment that includes changing a size of a plurality of immersive interfaces in response to a single selection. In some portions of the following discussion, reference may be made to system 100 of FIG. 1, and other methods and example embodiments described elsewhere herein, the reference to which is made for example only.
Block 1902 enables selecting to change a first size of a first immersive interface of a first application displayed in a multi-application environment in which a second immersive interface of a second application is displayed in a second size.
Block 1902 may enable such selection in various manners as described above, such as with gestures, whether made through a gesture-sensitive display or track pad or mouse, or with hardware buttons or hot keys, to name a few.
As an example, consider the case where block 1902 enables selection, through a gesture-sensitive display, with a select-and-move gesture made over an immersive interface spacer between immersive interfaces of a multi-application environment. This example is illustrated in FIG. 20, which illustrates a desktop computing device 112 having a touch-sensitive display 2002, shown displaying a multi-application environment 2004. Multi-application environment 2004 includes a larger immersive interface 2006 and a smaller immersive interface 2008 separated by immersive interface spacer 2010. The larger immersive interface 2006 is associated with a word processing application and presents document content 2012. The smaller immersive interface 2008 is associated with a drawing application and presents drawing content 2014. As part of the ongoing example, at block 1902 the manager 134 receives a gesture 2016, shown in FIG. 20 with an arrow but omitting the input actor (e.g., a finger or stylus).
Block 1904 changes the first size of the first immersive interface and the second size of the second immersive interface in response to the selection to change the first size of the first immersive interface. Thus, block 1904 can change the size of the plurality of immersive interfaces in response to as few as one selection. Also, block 1904 may do so simultaneously and without obscuring either interface.
Consider, as an example, the ongoing example of FIG. 20. In response to the select-and-move gesture 2016 over the immersive interface spacer 2010, the manager 134 shrinks one interface while enlarging the other, here enlarging the smaller immersive interface 2008 while shrinking the larger immersive interface 2006. The result of this change is illustrated in FIG. 21 as changed smaller immersive interface 2102 and changed larger immersive interface 2104. The previous position of immersive interface spacer 2010 is shown at previous position 2106. It should also be noted that select-and-move gesture 2016 starts at the previous position 2106 of immersive interface spacer 2010 and ends at its final position 2108.
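The simultaneous resize of block 1904 amounts to moving the shared spacer: one interface shrinks by exactly the amount the other grows. A minimal sketch, with assumed function and variable names:

```python
# Sketch of block 1904: moving the immersive interface spacer by `delta`
# pixels resizes both interfaces at once, leaving no unused real estate.
def move_spacer(left_width, right_width, delta):
    """Positive delta moves the spacer right, growing the left interface."""
    new_left = left_width + delta
    new_right = right_width - delta
    # The two interfaces still fill the environment exactly.
    assert new_left + new_right == left_width + right_width
    return new_left, new_right
```

In the FIG. 20 example, moving the spacer toward the larger word processing interface grows the drawing interface by the same number of pixels.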
It should be noted that in this example, both before and after the change in the sizes of the immersive interfaces, multi-application environment 2004 is fully occupied by the immersive interfaces, with no unused real estate and no visible layout-management controls obscuring the interfaces.
This particular example illustrates one way in which the techniques permit a user to select the size of an immersive interface, here enlarging the drawing presented by the drawing application.
The techniques also permit a user to "snap" an immersive interface so that it automatically fills a predetermined region of multi-application environment 2004. Gestures and other selections that are quick and easy for the user may thereby be used. Moreover, the regions may have a predetermined size across multiple devices, permitting application developers to prepare for those region sizes. This is particularly useful for smaller region sizes, as smaller sizes are often more challenging to present in a user-friendly manner. Consider again FIG. 20, for example, in which a predetermined small region width 2018 is illustrated, here having a width of 320 pixels. In this example, three regions are shown, two of which are dependent in that they are included in one full region. These regions have the following widths: width 2018 and remainder width 2020 for the two dependent regions, and full width 2022 for the full region, all of multi-application environment 2004. It should be noted that remainder width 2020 may vary across displays, as may full width 2022.
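The predetermined snap widths can be sketched as follows. The 320-pixel value comes from the example above; the function itself is an illustrative assumption:

```python
# Sketch of the snap-region widths of FIG. 20: a fixed small width 2018,
# a display-dependent remainder width 2020, and a full width 2022.
SMALL_REGION_WIDTH = 320  # pixels; predetermined across devices

def region_widths(environment_width):
    """Return (small, remainder, full) widths for the three regions."""
    small = SMALL_REGION_WIDTH
    remainder = environment_width - small  # varies with the display
    full = environment_width
    return small, remainder, full
```

Because only the remainder and full widths vary per display, a developer can lay out content for the fixed 320-pixel region once and rely on it everywhere.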
Block 1902 can also enable selection through a drag-and-drop gesture of one of the immersive interfaces from one region to another. In such a case, block 1904 may swap the interfaces between regions, or automatically move the spacer (e.g., immersive interface spacer 2010 of FIG. 20) such that the resulting sizes are swapped. In doing so, the manager 134 automatically shrinks the larger immersive interface 2006 to fully occupy the region previously occupied by the smaller immersive interface 2008, and vice versa.
In some cases, selection to change an interface's size is enabled with an edge gesture. Consider, for example, an edge gesture that starts at an edge of the larger immersive interface 2006 and has a later point not at that edge. The manager 134, alone or in conjunction with the gesture handler 128 and/or the application manager 130, shrinks the larger immersive interface 2006 into a reduced-size image. Selection to resize interface 2006 may then be made by dropping the reduced-size image onto the smaller immersive interface 2008. In response, the manager 134 resizes both interfaces.
Method 1900 describes various ways of enabling a multi-application environment, including changing the sizes of multiple immersive interfaces in response to a single selection. Method 2200 may operate in conjunction with method 1900 and the other methods described herein, though their use is not required. Method 1900 is therefore not intended to limit the techniques as described in exemplary method 2200.
Fig. 22 depicts a method 2200 for displaying an immersive interface for an application in a region, including in response to as few as one selection and at a size that fully occupies the region. In some portions of the following discussion, reference may be made to the system 100, methods 200, 800, 900, 1400, 1700, and 1900 of FIG. 1, and the above-described exemplary embodiments, the references to which are made for example only.
Block 2202 enables selection of an immersive interface of an application for display in one of multiple regions of a multi-application environment currently displaying one or more current immersive interfaces of one or more current applications. Block 2202 can do so in the various ways described above, such as with an edge gesture or a portion of an edge gesture, to name just one example. Further, the selected application may be a previously interacted-with application determined in various ways, such as by application manager 130 using application queue 132, both of which are shown in FIG. 1.
At block 2202, the multi-application environment may present one, two, or even three current immersive interfaces. Thus, block 2202 permits an application to be selected for placement in a region that is currently occupied, or a region that, while present, lies within a larger immersive interface, such as when one immersive interface fully occupies the multi-application environment.
As an example, consider FIG. 23, which illustrates a current immersive interface 2302 occupying a multi-application environment 2304. It should be noted that there are three dependent regions, 2306, 2308, and 2310. These regions may or may not be visually indicated. When an application has been selected and hovers over or is moved over one of the regions, that region may be designated with partially transparent immersive interface spacers 2312 and 2314. The three dependent regions 2306, 2308, and 2310 are included within a full-size region 2316 occupying substantially all of multi-application environment 2304.
As an example, assume that the manager 134 receives selection of a previously interacted-with application in accordance with method 1700, following the example illustrated in FIG. 18. In this scenario, assume that the thumbnail image 1808 of the social networking application 1510 is selected and hovers over region 2306 (not shown, but similar to FIG. 18). In response, the manager 134 indicates that region 2306 is or will be selected, and indicates the size of region 2306, by displaying partially transparent immersive interface spacer 2312.
Returning to method 2200, block 2204, in response to selection to display the immersive interface in the region, displays the immersive interface at a size that fully occupies that region. It should be noted that with as little as one selection of an application, a user can select and present an immersive interface at a size that fully occupies the selected region.
Continuing this example, consider FIG. 24, which illustrates multi-application environment 2304, now with reduced-size immersive interface 2402 in place of current immersive interface 2302 of FIG. 23, and with a second immersive interface 2404 displaying social networking web page 2406 of social networking application 1510 of FIG. 15. Second immersive interface 2404 fully occupies region 2306, with no user selection made other than selection of that region.
It should be noted that both the content in reduced-size immersive interface 2402 and the arrangement of social networking web page 2406 have changed. Because the techniques provide these region sizes in advance as predetermined region widths, applications and/or their developers can prepare for them, permitting size changes to be made more quickly or content to be arranged better. Here the predetermined region width provided is that of region 2306, though a full-width region 2408 may also be provided.
Following block 2204, method 2200 may repeat blocks 2202 and 2204, enabling selection of additional immersive interfaces. For example, the manager 134 can enable selection of a third immersive interface for presentation in region 2310 or 2308 of FIG. 23. In response to such a selection, the manager 134 further reduces the size of reduced-size immersive interface 2402 or replaces it.
It should be noted that any of these methods may be combined, in whole or in part. Thus, for example, one portion of a gesture can select an immersive interface while another portion of the same gesture selects a location and/or size for that immersive interface. In response to this single gesture, the techniques may resize multiple interfaces currently presented in the multi-application environment.
User interface for managing a multi-application environment
FIG. 25 depicts a method 2500 for managing a multi-application environment through a user interface. In some portions of the following discussion, reference may be made to system 100 of FIG. 1, and other methods and example embodiments described elsewhere herein, the reference to which is made for example only.
Block 2502 enables selection of a user interface for managing a multi-application environment. The selection of the user interface may be made in a variety of ways, including those described above, such as with a gesture or a portion of a gesture, a hardware button or hot key, or a voice command, to name a few. The user interface may be fully displayed, partially displayed, or not displayed at all prior to selection. For example, consider a scenario in which a multi-application environment is displayed while a user interface is not displayed. An example of such a scenario is illustrated in fig. 26, which shows display 2600 populated with multi-application environment 2602. Multi-application environment 2602 includes a primary region 2604 and a non-primary region 2606, both of which present various content from applications 136. It should be noted that non-primary region 2606 includes two non-primary segments 2608 and 2610, each of which may be used to present content in parallel with each other as well as with the content of primary region 2604. In this example, content from three applications is presented in parallel: content 2612 from a social networking website, content 2614 from a news website, and content 2616 from a local document viewing application.
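The environment just described can be sketched as a small data model: one primary region plus a non-primary region split into segments, each presenting one application's content in parallel. The field names below are assumptions for illustration only.

```python
# Assumed model of multi-application environment 2602: a primary region plus
# a non-primary region split into segments, each holding one application's content.
environment = {
    "primary": {"app": "social networking website", "content": "2612"},
    "non_primary_segments": [
        {"app": "news website", "content": "2614"},
        {"app": "document viewer", "content": "2616"},
    ],
}

def all_presented(env):
    """Every piece of content currently presented in parallel."""
    return [env["primary"]["content"]] + [s["content"] for s in env["non_primary_segments"]]
```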
Here, manager 134 of FIG. 1 enables selection of the user interface with a non-visual selector, such as a hotkey or a gesture (e.g., an edge gesture made to the right edge of multi-application environment 2602). However, in certain other cases, the manager 134 enables selection through displayed selectable controls.
Block 2504 displays a user interface in response to this selection. The user interface may be an immersive user interface or a semi-transparent overlay to enable further selection. Through this user interface, block 2506 enables selection of applications for concurrent presentation in a multi-application environment.
This user interface enables a user to manage the multi-application environment, however it may enable this in various ways. The user interface may, for example, enable a user to present the application to the user interface and/or environment, remove the application, or set the application as a default, such as by selecting a tab associated with or representative of the application. The user interface may enable a user to select preferences for applications to be presented in the user interface, generally change environments, or switch to a non-multi-application environment. Moreover, the user interface may present applications for selection according to various criteria, such as those most recently used or most frequently used by the user of the computing device 102, and whether the applications are currently executing. Still further, the user interface may present a common set of system commands for the application, such as a user interface that enables search commands, shares content, or changes settings.
Continuing the ongoing example, assume that the manager 134 receives a selection to present a user interface through a gesture made on the touch screen of the display 2600. In response, the manager 134 presents a user interface management menu 2700, illustrated in fig. 27, through which selection of an application is enabled. The user interface management menu 2700 of this example presents icons and/or names for nine applications. These nine applications include various websites, services, and local computing applications, which are named "Social Net", "news.com", "PDFs", "Telecon", "music.com", "movies.com", "poker.com", "Art Space", and "Maps by GPS" at 2702, 2704, 2706, 2708, 2710, 2712, 2714, 2716, and 2718, respectively. As noted, other applications (such as those related to system commands) may also be presented by the manager 134. As an example, consider system command 2719, which is displayed as the tenth application of user interface management menu 2700, entitled "Search Share Settings".
In this particular example, two applications 2702 and 2704 are "pinned". The pin icons shown at 2720 and 2722 indicate that applications 2702 and 2704 will be maintained in one or both of multi-application environment 2602 and user interface management menu 2700 (here, in both). The Social Net and news.com applications will thus execute and present content within a portion of environment 2602 without further selection by the user (shown in segments 2608 and 2610, respectively). It should be noted that selection of a retained ("pinned") application may be enabled by the manager 134 in various ways, such as presenting a pin selection icon 2724 for selecting any of the applications 2702-2718 through the user interface management menu 2700. As noted, any of the selectable applications may or may not be currently executing; in this case applications 2702, 2704, 2706, and 2708 are executing, and applications 2710-2718 are not currently executing.
The user interface management menu 2700 may also present applications according to other criteria, such as the user's history. For example, applications 2706, 2708, 2710, and 2712 are presented because they are the four applications (other than 2702 and 2704) most recently used by the user of computing device 102. Applications 2714, 2716, and 2718 are presented because they are the most frequently used applications other than applications 2702-2712. In other cases, an application may be presented according to new content associated with it (e.g., a new email, message, or RSS feed), or upon receiving another alert for the application.
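The ordering criteria just described can be sketched as a small merge: pinned applications first, then the most recently used, then the most frequently used, with duplicates skipped and the list capped at the menu's slot count. The slot count and the example names are illustrative assumptions taken from the figures, not an implementation from the described techniques.

```python
def build_menu(pinned, by_recency, by_frequency, slots=9):
    """Fill the menu: pinned apps first, then recent apps, then frequent apps,
    skipping duplicates, up to `slots` entries (an assumed policy)."""
    menu = list(pinned)
    for app in by_recency + by_frequency:
        if len(menu) >= slots:
            break
        if app not in menu:
            menu.append(app)
    return menu

# Nine applications, mirroring user interface management menu 2700.
menu = build_menu(
    pinned=["Social Net", "news.com"],
    by_recency=["PDFs", "Telecon", "music.com", "movies.com"],
    by_frequency=["poker.com", "Art Space", "Maps by GPS"],
)
```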
Also, the user interface management menu 2700 enables a user to manage the multi-application environment 2602 in addition to managing applications, such as through a window selection icon 2726. With this selection, the manager 134 allows the user to opt out of the multi-application environment 2602 and proceed with the currently presented content by using the window-based environment.
Returning to method 2500, block 2508 is responsive to selecting one or more applications such that content of the selected applications is presented in parallel with presentation of content of different applications in the multi-application environment.
The method 2500 may receive a plurality of selections. In response to the multiple selections, block 2508 causes the content of each selected application to be presented in the multi-application environment. In such a case, the manager 134 may present the content from each selected application sequentially or simultaneously. In one sequential case, consider the above example, but assume that the application 2706 was selected in a previous iteration of block 2506, and in response to this selection, the manager 134 presents the content 2616 of the application 2706 (as shown in fig. 26 and 27). Subsequent iterations of blocks 2506 and 2508 are performed after this selection of application 2706. The sequential presentation of the application is enabled by the manager 134 through at least additional iterations of blocks 2506 and 2508, shown as potentially repeated with a dashed line from block 2508 to block 2506.
Desktop as an immersive interface in a multi-application environment
FIG. 28 depicts a method 2800 that enables display of a window-based desktop as an immersive interface within a multi-application environment. In some portions of the following discussion, reference may be made to system 100 of FIG. 1, and other methods and example embodiments described elsewhere herein, the reference to which is made for example only.
Block 2802 displays the desktop as an immersive interface within the multi-application environment. The multi-application environment is configured to support access to multiple applications, like those described elsewhere herein. Thus, a user can see content through an interface associated with an application (or applications if the applications include an interface on the desktop) and interact with the application through the interface, all through the multi-application environment.
Block 2804 enables interaction with the desktop immersive interface. Such interaction may be concurrent with interaction enabled for other interfaces. Moreover, such interaction may be through a multi-application environment, and also include window-based and desktop-based controls, such as window frame controls and task bars, respectively.
The windows-based mode module 122 and multi-application environment module 124, acting separately or in conjunction with the operating system 120 of FIG. 1, may treat, for example, a desktop immersive interface as one of the applications 136, and windows and task bars, etc., as various examples of the application user interface 138.
By way of example, consider fig. 29, which illustrates a tablet computing device 106 having a desktop immersive interface 2902 that displays a window-based interface 2904 and a taskbar 2906, and an immersive interface 2908 that displays content 2910, all within a multi-application environment 2912. Desktop immersive interface 2902 may also include representations (e.g., icons) of applications executable within desktop immersive interface 2902, as well as representations of folders used to support the hierarchical file structure of computing device 102, to name a few.
Moreover, functionality provided within desktop immersive interface 2902, such as that common to window-based interfaces, may perform differently from functionality of other interfaces within multi-application environment 2912.
Block 2806, in response to an interaction, provides the interaction to an application within the desktop immersive interface. In some cases, this includes passing a gesture or gesture portion to an application having an interface within the desktop immersive interface, such as to operating system 120 for interacting with taskbar 2906 or to an application associated with one of window-based interfaces 2904. In other cases, this may include passing keystrokes to the primary (e.g., frontmost) window of window-based interfaces 2904. In doing so, the techniques permit users to engage with applications and interfaces common to desktop, window-based environments while also permitting interaction with immersive interfaces and other operations of the multi-application environment.
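Providing an interaction to the right recipient can be sketched as a hit test inside the desktop immersive interface: taskbar hits go to the operating system, other hits go to the frontmost window under the point, and anything else to the desktop itself. The rectangles, names, and front-to-back ordering below are assumptions for illustration.

```python
def contains(rect, point):
    """rect is (x, y, width, height); point is (x, y)."""
    x, y, w, h = rect
    px, py = point
    return x <= px < x + w and y <= py < y + h

def route_input(point, taskbar_rect, windows):
    """Dispatch a gesture or click received by the desktop immersive interface.
    `windows` is assumed to be ordered front to back."""
    if contains(taskbar_rect, point):
        return "operating-system"   # e.g., a hit on taskbar 2906
    for win in windows:
        if contains(win["rect"], point):
            return win["app"]       # e.g., one of window-based interfaces 2904
    return "desktop"

# A touch at (100, 100) lands inside the assumed email window.
target = route_input(
    point=(100, 100),
    taskbar_rect=(0, 700, 1366, 68),
    windows=[{"app": "email", "rect": (50, 50, 400, 300)}],
)
```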
The desktop immersive interface may be managed in a manner similar or identical to that described elsewhere herein. For example, edge gestures may be used to select, move, or size a desktop immersive interface, such as desktop immersive interface 2902. Moreover, the desktop immersive interface may appear as a single application that is part of an application queue and thus may be selected or removed from the multi-application environment, as noted above for the other interfaces.
Multi-application environment
The techniques and apparatus described above enable many different embodiments of multi-application environments, including environments with one interface but permitting additional interfaces, multiple interfaces that are all immersive, multiple interfaces that are a mix of immersive and window-based interfaces, and desktops that are treated as immersive interfaces, among others. In some embodiments, these multi-application environments enable selection of various menus or additional interfaces for the system and applications for providing additional controls. In still other embodiments, these multi-application environments enable gestures through which applications and interfaces are managed.
FIG. 30 depicts a method 3000 of enabling content presentation and/or management of a multi-application environment. In some portions of the following discussion, reference may be made to system 100 of FIG. 1, and other methods and example embodiments described elsewhere herein, the reference to which is made for example only.
Block 3002 presents, within a multi-application environment, a plurality of interfaces associated with a plurality of applications, at least one interface of the plurality of interfaces being an immersive interface.
As noted elsewhere herein, the multi-application environment may present various combinations of different interfaces. For example, consider a multi-application environment having at least one immersive interface, as shown on each of fig. 3, 7, 10, 12, 16, 18, 20, 23, 24, 26, 27, and 29. The example multi-application environments are described in order.
Fig. 3 illustrates a multi-application environment with a single immersive interface 304. Fig. 7 illustrates a multi-application environment having the single immersive interface of fig. 3 along with an application selection interface 702. Fig. 10 illustrates a multi-application environment with a window-based email interface 1004 and two immersive interfaces 1006 and 1008. FIG. 12 illustrates the multi-application environment of FIG. 10 along with an interface enabling additional controls, an email processing interface 1202. FIG. 16 illustrates a multi-application environment that switches from the web page 306 to the media player 1604 in response to a gesture selection. Fig. 18 illustrates a multi-application environment with thumbnail images 1602, 1808 of the immersive interface 304 and two other interfaces (one immersive and the other window-based). Fig. 20 illustrates a multi-application environment with two immersive interfaces, a larger immersive interface 2006 and a smaller immersive interface 2008 separated by an immersive interface spacer 2010. Fig. 23 illustrates a multi-application environment with a current immersive interface 2302 and regions 2306 and 2310, which additional interfaces can be selected to fully occupy. Fig. 24 illustrates the multi-application environment of fig. 23 with a second immersive interface 2404, wherein interface 2404 displays a social networking web page 2406 for social networking application 1510 of fig. 15. Fig. 26 illustrates a multi-application environment in which content is presented through three immersive interfaces, one in primary region 2604 and two in non-primary region 2606. Fig. 27 illustrates the multi-application environment of fig. 26 along with a user interface management menu 2700. Fig. 29 illustrates a multi-application environment 2912 with a desktop immersive interface 2902 displaying a window-based interface 2904 and a taskbar 2906, and an immersive interface 2908 displaying content 2910.
Block 3004 presents content through at least one of a plurality of interfaces. Content is shown as displayed in many of the figures described above, such as media being played, social networking web pages, news website articles, and word processing documents. The displayed content is received from various sources, such as application 136, which may have generated the content or received the content from a remote source (e.g., in the case of a web browser application, from a remote provider).
It should be noted that many of the mentioned graphical displays are presented simultaneously. The multi-application environment may present moving visual media, such as a movie, on one interface while simultaneously presenting a web page with a media slideshow on another interface, both in real time. Moreover, the multi-application environment enables interaction with multiple interfaces without necessarily requiring an initial selection. Thus, the user can select one interface or another without first having to make an interface primary or bring it to the top of a stack, as may be required in a window-based environment.
Block 3006 enables selection of at least one interface to change size or position in the multi-application environment or to remove multiple interfaces from the multi-application environment. Block 3006 can act in various ways described elsewhere herein, such as enabling selection to move an interface from one region of a multi-application environment to another region through a drag-and-drop gesture.
Block 3008, in response to receiving the selection, changes a size of the selected interface, changes a position of the selected interface, or removes the selected interface from the multi-application environment. Exemplary changes to the size and position of interfaces in a multi-application environment are shown in figs. 23 and 24, which illustrate multi-application environment 2304 first having current immersive interface 2302 and then having reduced-size immersive interface 2402 along with second immersive interface 2404 displaying social networking web page 2406.
Block 3010 changes a size or a position of another interface in the multi-application environment in response to the same selection. This is also shown in figs. 23 and 24. Although not shown, the multi-application environment module 124 responds to the removal of an interface by resizing and/or repositioning other interface(s) in the multi-application environment. Assume a situation in which the multi-application environment 2304 includes the two interfaces 2402 and 2404 of FIG. 24. In response to selecting to remove second immersive interface 2404, multi-application environment module 124 returns to multi-application environment 2304 as shown in fig. 23, which includes only current immersive interface 2302. It should be noted that current immersive interface 2302 is larger than reduced-size immersive interface 2402 and occupies the region in which second immersive interface 2404 was displayed.
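The response to removal can be sketched as the inverse of placement: the removed interface's width is redistributed among the remaining interfaces in proportion to their current widths, so a sole survivor regrows to full size. The proportional policy and names are assumptions for illustration.

```python
def remove_interface(layout, name, total_width):
    """Remove `name` from a layout of (name, width) pairs and let the
    remaining interfaces absorb its width proportionally (an assumed policy)."""
    remaining = [(n, w) for n, w in layout if n != name]
    old_total = sum(w for _, w in remaining)
    return [(n, total_width * w // old_total) for n, w in remaining]

# Removing the second interface returns the survivor to full width,
# as when environment 2304 returns to showing only interface 2302.
layout = remove_interface([("current", 1046), ("social", 320)], "social", 1366)
```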
The foregoing discussion describes some methods in which the techniques manage immersive interfaces in a multi-application environment, certain other methods that enable switching back to applications that have previously interacted with, still other methods that describe the manner in which the techniques enable and/or use edge gestures, further methods that describe the manner in which the techniques enable and/or use a desktop as an immersive interface, and methods that enable content presentation and/or management of a multi-application environment. The methodologies are shown as a set of blocks that specify operations performed but are not necessarily limited to the orders shown for performing the operations by the respective blocks. Also, these methods may be used in whole or in part in combination.
Some aspects of these methods may be implemented in hardware (e.g., fixed logic circuitry), firmware, a system on a chip (SoC), software, manual processing, or any combination thereof. A software implementation represents program code, such as software, applications, routines, programs, objects, components, data structures, procedures, modules, functions, and so forth, that performs specified tasks when executed by a computer processor. The program code can be stored in one or more computer-readable memory devices, both local and/or remote to a computer processor. The method may also be practiced in a distributed computing environment by multiple computing devices.
Exemplary device
Fig. 31 illustrates various components of an exemplary device 3100, which can be implemented as any type of client, server, and/or computing device described with reference to figs. 1-30 above to implement techniques that enable and use edge gestures, switch back to applications that have previously been interacted with, and/or manage immersive interfaces in a multi-application environment. In embodiments, device 3100 may be implemented as one or a combination of wired and/or wireless devices, in the form of: television client devices (e.g., television set-top boxes, Digital Video Recorders (DVRs), etc.), consumer devices, computer devices, server devices, portable computer devices, user devices, communication devices, video processing and/or rendering devices, appliance devices, gaming devices, electronic devices, and/or another type of device. Device 3100 can also be associated with a user (e.g., a person) and/or an entity that operates the device, such that a device describes logical devices that include users, software, firmware, and/or a combination of devices.
Device 3100 includes a communication device 3102 that enables wired and/or wireless communication of device data 3104 (e.g., received data, data that is being received, data scheduled for broadcast, data packets of the data, etc.). The device data 3104 or other device content can include configuration settings of the device, media content stored on the device, and/or information associated with a user of the device. Media content stored on device 3100 can include any type of audio, video, and/or image data. Device 3100 includes one or more data inputs 3106 via which any type of data, media content, and/or inputs can be received, such as user-selectable inputs, messages, music, television media content, recorded video content, and any other type of audio, video, and/or image data received from any content and/or data source.
Device 3100 also includes communication interfaces 3108, which can be implemented as any one or more of a serial and/or parallel interface, a wireless interface, any type of network interface, a modem, and as any other type of communication interface. Communication interfaces 3108 provide a connection and/or communication links between device 3100 and a communication network by which other electronic, computing, and communication devices can communicate data with device 3100.
The device 3100 includes one or more processors 3110 (e.g., any of microprocessors, controllers, and the like) which process various computer-executable instructions to control the operation of the device 3100 and to enable those techniques to enable a multi-application environment. Alternatively or in addition, device 3100 can be implemented with any one or combination of hardware, firmware, or fixed logic circuitry that is implemented in connection with processing and control circuits which are generally identified at 3112. Although not shown, device 3100 can include a system bus or data transfer system that couples the various components within the device. A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures.
Device 3100 also includes computer-readable storage media 3114 such as one or more memory devices that enable permanent and/or non-transitory data storage (i.e., as opposed to mere signal transmission), examples of which include Random Access Memory (RAM), non-volatile memory (e.g., any one or more of a read-only memory (ROM), flash memory, EPROM, EEPROM, etc.), and disk storage. The disk storage device may be embodied as any type of magnetic or optical storage device, such as a hard disk drive, a recordable and/or rewriteable Compact Disc (CD), any type of a Digital Versatile Disc (DVD), and so forth. Device 3100 can also include mass storage media devices 3116.
Computer-readable storage media 3114 provides data storage mechanisms to store device data 3104, as well as various device applications 3118 and any other types of information and/or data related to operational aspects of device 3100. For example, an operating system 3120 can be maintained as a computer application with the computer-readable storage media 3114 and executed on processors 3110. The device applications 3118 may include a device manager, such as any form of a control application, software application, signal processing and control module, code that is native to a particular device, a hardware abstraction layer for a particular device, and so forth.
The device applications 3118 also include any system components or modules to implement the techniques, such as the device applications 3118 including a multi-application environment module 124, a system interface module 126, a gesture handler 128, an application manager 130, a manager 134, and an application(s) 136.
Conclusion
Although embodiments of techniques and apparatus enabling multi-application environments have been described in language specific to features and/or methods, it is to be understood that the subject of the appended claims is not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as exemplary implementations that enable a multi-application environment.

Claims (10)

1. A computer-implemented method, comprising:
presenting, within a multi-application environment, a plurality of interfaces associated with a plurality of applications, at least one interface of the plurality of interfaces being an immersive interface; and
content is presented through at least one of the plurality of interfaces.
2. A computer-implemented method as described in claim 1, further comprising:
presenting second content concurrently with presenting the first-mentioned content, the second content being presented through another interface of the plurality of interfaces.
3. A computer-implemented method as described in claim 2, wherein the rendering of the first-mentioned content is rendering a first moving visual media in real-time, and the rendering of the second content is rendering a second moving visual media in real-time.
4. A computer-implemented method as described in claim 1, wherein the immersive interface represents a desktop having at least one window-based interface or task bar.
5. A computer-implemented method as described in claim 4, further comprising enabling interaction with an application associated with the window-based interface through the multi-application environment.
6. A computer-implemented method as described in claim 4, wherein the desktop further includes a taskbar, and further comprising enabling interaction with the taskbar through the multi-application environment.
7. A computer-implemented method as described in claim 1, wherein the content is received by the multi-application environment from one of the plurality of applications.
8. A computer-implemented method as described in claim 1, wherein another interface of the plurality of interfaces is a window-based interface.
9. A computer-implemented method as described in claim 1, wherein the multi-application environment does not include a visual control.
10. A computer-implemented method as described in claim 1, further comprising:
enabling selection to change a size or a position in the multi-application environment or to remove at least one of the plurality of interfaces from the multi-application environment; and
in response to receiving the selection, changing a size of the selected interface, changing a position of the selected interface, or removing the selected interface from the multi-application environment;
presenting content through two or more of the plurality of interfaces and simultaneously;
enabling selection to change a size or position of one of the plurality of interfaces; and
concurrently changing a size or a position of the one of the plurality of interfaces and another one of the plurality of interfaces in response to receiving the selection.

Applications Claiming Priority (1)

US 13/118,339, priority date 2011-05-27

Publications (2)

HK1193665A, published 2014-09-26
HK1193665B, published 2018-03-23

