CROSS-REFERENCE TO RELATED APPLICATIONS
This application is a continuation of U.S. patent application Ser. No. 12/840,709, filed on Jul. 21, 2010, the disclosure of which is expressly incorporated herein by reference in its entirety.
BACKGROUND
This specification generally relates to systems and techniques for a user interface tab bar control affordance for a mobile computing device.
In some implementations, it is advantageous to protect an underlying application running on a mobile computing device from receiving accidental user input. For example, user input can be received from a keyboard, pointing device, or on-screen contact with a touchscreen included with the mobile computing device. The user may place the mobile computing device in a location that may inadvertently provide erroneous input to the device (e.g., the user places the device in a pocket, backpack, or handbag). Locking the mobile computing device can prevent the occurrence of the accidental input as the computing device can ignore any user input it receives while in a locked state.
SUMMARY
According to one innovative aspect of the subject matter described in this specification, a mobile computing device with a touchscreen can lock the touchscreen to prevent accidental input to underlying applications running on the mobile computing device. A user interface (UI) touchscreen can include multiple tab bar controls that allow the user to interact with the mobile computing device in the locked state. The user can activate each of the tab bar controls on the touchscreen using a pulling gesture to slide the tab bar control across the touchscreen. The pulling gesture moves a graphic on one side of the tab bar control towards the opposite side of the touchscreen. When the graphic moves a sufficient distance, the mobile computing device performs an operation designated by the tab bar control, dependent upon the state of the mobile computing device.
For example, the mobile device can be a mobile phone. A user can use a tab bar control to unlock their mobile phone from a locked state. The user can use a tab bar control to control a ringer of the mobile phone (e.g., turn off (silence) the ringer or turn on (un-silence) the ringer). The user can also use a tab bar control to answer an incoming call on the mobile device when it is in a locked state. The user can also use a tab bar control to decline an incoming call on the mobile device when it is in a locked state.
The activation of a tab bar control can occur using a simple gesture performed by the user (e.g., a one finger pulling gesture of the tab bar across the touchscreen). However, the activation of the tab bar control to unlock the mobile computing device or answer or decline an incoming call can be a difficult, if not impossible, action to perform accidentally by inadvertent contact with the touchscreen (e.g., while the device is in the user's pocket).
In general, another innovative aspect of the subject matter described in this specification may be embodied in methods that include the actions of displaying a first tab graphic in a first tab bar control, the first tab bar control being displayed at a first default position on a first edge of a touchscreen display of a mobile device, detecting a user selection in a first region of the touchscreen display, the first region being associated with the first tab graphic, detecting user motion corresponding to the user selection, animating the first tab bar control to extend from the first edge of the touchscreen display in response to detecting user motion, determining a measure of user motion, comparing the measure of user motion to a threshold measure, and performing one or more functions on the mobile device in response to the measure of user motion exceeding the threshold measure. Other implementations of this aspect include corresponding systems, apparatus, and computer programs, configured to perform the actions of the methods, encoded on computer storage devices.
These and other implementations may each optionally include one or more of the following features. For instance, the one or more functions may include entering or exiting an unlocked mode, answering or declining a received call, and changing a mode of a ringer; the actions may further include continuing animation of the first tab bar control to extend from the first edge to a second edge of the touchscreen display in response to the measure of user motion exceeding the threshold measure; the actions may further include detecting cessation of the user selection, determining, upon detecting cessation of the user selection, that the measure of user motion is less than the threshold measure, and animating the first tab bar control to retract back toward the first edge of the touchscreen display to the first default position; the one or more functions may further include removing the first tab bar control from the touchscreen display, and displaying a plurality of icons on the touchscreen display, each icon corresponding to a respective application executable by the mobile device; the actions may further include, concurrent with displaying the first tab graphic, displaying a second tab graphic in a second tab bar control, the second tab bar control being displayed at a second default position on a second edge of the touchscreen display, the second edge being opposite from the first edge, and removing the second tab graphic from the touchscreen display upon detecting the user selection in the first region; the actions may further include displaying a first target in response to detecting the user selection in the first region, wherein the first tab bar control is animated to extend towards the first target without displaying a defined path between the first tab bar control and the first target; the operations may further include displaying a first target in response to detecting the user selection in the first region, and displaying a defined path between the first tab bar control and the first target, wherein the first tab bar control is animated to extend towards the first target along the defined path; the operations may further include displaying a first target on the touchscreen display in response to detecting the user selection in the first region, the first target corresponding to the threshold measure, and determining that the measure of user motion is equal to or greater than the threshold measure when the first tab bar control contacts the first target on the touchscreen display; the actions may further include displaying scrolling text within the first tab bar control as the first tab bar control extends from the first edge of the touchscreen display, the scrolling text indicating the one or more functions; the actions may further include highlighting the first tab graphic upon detecting the user selection, the highlighting including changing a color of the first tab graphic from a first color to a second color, the second color being brighter than the first color; the actions may further include detecting an orientation change of the mobile device, and rotating the first tab graphic in the first tab bar control in response to detecting the orientation change; and the measure includes a distance and the threshold measure includes a threshold distance.
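The claimed sequence of actions (detect a selection in the tab graphic's region, measure the motion of the drag, compare that measure to a threshold, and perform a function when the threshold is exceeded) can be sketched as follows. This is a minimal illustration only; the class and names below are hypothetical and do not appear in the specification:

```python
# Hypothetical sketch of the claimed gesture flow: a drag on a tab bar
# control performs its function only when the measured motion (here, a
# horizontal distance in pixels) meets or exceeds a threshold distance.

class TabBarControl:
    def __init__(self, function, threshold_distance):
        self.function = function                  # e.g., unlock the device
        self.threshold_distance = threshold_distance
        self.start_x = None

    def on_touch_down(self, x):
        # User selection detected in the region of the tab graphic.
        self.start_x = x

    def on_touch_move(self, x):
        # Measure of user motion: distance dragged from the start point.
        distance = abs(x - self.start_x)
        if distance >= self.threshold_distance:
            return self.function()                # threshold met: perform the function
        return None                               # below threshold: keep extending the tab


unlock_tab = TabBarControl(lambda: "unlocked", threshold_distance=200)
unlock_tab.on_touch_down(10)
result_short = unlock_tab.on_touch_move(100)      # 90 px dragged: below threshold
result_long = unlock_tab.on_touch_move(240)       # 230 px dragged: threshold exceeded
```

In a real touchscreen framework the `on_touch_down`/`on_touch_move` callbacks would be driven by the platform's touch-event dispatch; here they are invoked directly for illustration.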
The details of one or more implementations of the subject matter described in this specification are set forth in the accompanying drawings and the description below. Other potential features, aspects, and advantages of the subject matter will become apparent from the description, the drawings, and the claims.
BRIEF DESCRIPTION OF THE DRAWINGS
FIGS. 1A-1D are illustrations demonstrating the unlocking of a mobile computing device using an unlock tab bar control displayed on a touchscreen display device.
FIGS. 2A-2F are illustrations demonstrating controlling a ringer of a mobile computing device using tab bar controls displayed on a touchscreen display device.
FIGS. 3A-3C are illustrations demonstrating accepting and declining an incoming phone call on a mobile computing device.
FIG. 4 is an illustration showing graphical user interface elements on a mobile computing device when oriented in a landscape mode.
FIG. 5 is a flowchart of an exemplary process for using a tab bar control.
Like reference numbers represent corresponding parts throughout.
DETAILED DESCRIPTION
FIGS. 1A-1D are illustrations demonstrating the unlocking of a mobile computing device 100 using an unlock tab bar control 102 displayed on a touchscreen display device 104.
In these illustrations, the mobile computing device 100 is depicted as a handheld mobile telephone (e.g., a smartphone or an application telephone) that includes the touchscreen display device 104 for presenting content to a user of the mobile computing device 100. The mobile computing device 100 includes various input devices (e.g., the touchscreen display device 104, a keyboard (not shown)) for receiving user input that influences the operation of the mobile computing device 100. In further implementations, the mobile computing device 100 may be a laptop computer, a tablet computer, a personal digital assistant, an embedded system (e.g., a car navigation system), a desktop computer, or a computerized workstation.
The mobile computing device 100 may include various visual, auditory, and tactile user-output mechanisms. An example visual output mechanism is the touchscreen display device 104, which can visually display video, graphics, images, and text that combine to provide a visible user interface. An example tactile user-output mechanism is a small electric motor connected to an unbalanced weight to provide a vibrating alert (e.g., to silently vibrate the mobile computing device 100 to alert a user of an incoming telephone call or to confirm user contact with the touchscreen display device 104). The mobile computing device 100 may include one or more speakers 106 that convert an electrical signal into sound, for example, music, an audible alert, or the voice of an individual in a telephone call. The mobile computing device 100 may include mechanical or touch-sensitive buttons 116a-d.
The mobile computing device 100 can determine a position of physical contact with the touchscreen display device 104 (e.g., a position of contact by a finger 101 or a stylus). Using the touchscreen display device 104, various "virtual" input mechanisms may be produced, where a user interacts with a graphical user interface element depicted on the touchscreen display device 104 by contacting the graphical user interface element. As shown in FIG. 1A, the graphical user interface elements can be an unlock tab graphic 108 included in a first region 118 of the unlock tab bar control 102 and a sound off tab graphic 130 included in a second region 132 of a sound off tab bar control 126. The unlock tab bar control 102 is located along a first edge 110 of the touchscreen display device 104. The sound off tab bar control 126 is located along a second edge 128 of the touchscreen display device 104.
In some implementations, the unlock tab graphic 108 may be a pictorial representation of the function performed by interacting with the graphical user interface element, in this example, the unlock tab bar control 102. As shown in FIG. 1A, the pictorial representation for the unlock tab graphic 108 is an unlocked lock, signifying the unlocking of the mobile computing device 100 when interacting with the unlock tab bar control 102.
Referring to the example shown in FIGS. 1A-1C, a user wants to unlock the mobile computing device 100 from a locked state. The user selects the unlock tab bar control 102 by placing a finger 101 in contact with the unlock tab graphic 108. Upon contact with the unlock tab graphic 108, the sound off tab bar control 126 disappears or retracts towards the second edge 128 and is no longer displayed on the touchscreen display device 104. The touchscreen display device 104 then displays a first contact point 120, as shown in FIG. 1B. The first contact point 120 generally indicates a threshold measure of movement to be achieved to induce unlocking of the mobile device 100. That is, unlocking of the mobile device can occur when a measure of motion of the user's finger across the touchscreen display device 104 is equal to or greater than the threshold measure. The measure can include a distance and the threshold measure can include a threshold distance.
In some implementations, the mobile computing device 100 provides tactile feedback when the user's finger initially makes contact with the unlock tab graphic 108. In some implementations, while the user's finger remains in contact with the unlock tab graphic 108, the unlock tab graphic 108 and the unlock tab bar control 102 are highlighted (e.g., displayed brighter than before contact). While the user's finger maintains contact with the unlock tab graphic 108, the user, using a pushing or pulling motion, can move their finger across the touchscreen display device 104. This movement further animates the unlock tab bar control 102 to appear as though it is being dragged across the touchscreen display device 104 from a first default position 112 towards the first contact point 120, as shown in FIGS. 1B and 1C. As the user moves their finger across the touchscreen display device 104, the touchscreen display device 104 animates the display of the unlock tab bar control 102 to extend from the first edge 110. As the touchscreen display device 104 animates the display of the unlock tab bar control 102, the term "Unlock" can be displayed and/or scrolled within the unlock tab bar control 102 as the unlock tab bar control 102 is extended, as shown in FIGS. 1B-1C.
The animation of the unlock tab bar control 102 continues across the touchscreen display device 104 as long as the user maintains contact with the unlock tab graphic 108 while dragging the unlock tab bar control 102 across the touchscreen display device 104. When a first arrow 122 included on the unlock tab bar control 102 makes contact with the first contact point 120, the unlock tab bar control 102 changes color. The animation of the unlock tab bar control 102 continues across the touchscreen display device 104 towards the second edge 128 and is then no longer displayed on the touchscreen display device 104. The mobile computing device 100 unlocks and, in its unlocked state, the mobile computing device 100 displays a graphical user interface 124 on the touchscreen display device 104 as shown in FIG. 1D. For example, the graphical user interface 124 is a collection of one or more graphical interface elements. The graphical user interface 124 may be static (e.g., the display appears to remain the same over a period of time), or may be dynamic (e.g., the graphical user interface includes graphical interface elements that animate without user input).
In some implementations, a background wallpaper of the touchscreen display device 104 can be visible behind the unlock tab bar control 102 even as the user drags the unlock tab bar control 102 across the touchscreen display device 104 (e.g., the unlock tab bar control 102 is semitransparent). When the first arrow 122 makes contact with the first contact point 120, the unlock tab bar control 102 can change color and become opaque (e.g., the background wallpaper of the touchscreen display device 104 is no longer visible behind the unlock tab bar control 102).
In some situations, a user may decide while dragging the unlock tab bar control 102 towards the first contact point 120 that they no longer want to unlock the mobile computing device 100. The user then removes their finger from the unlock tab graphic 108. This disconnects or breaks the physical contact between the user and the touchscreen display device 104. The unlock tab bar control 102 is no longer dragged across the touchscreen display device 104 and retracts back to the first default position 112. The user may also move their finger backwards towards the first edge 110 of the touchscreen display device 104, retracting the unlock tab bar control 102 back to the first default position 112. The touchscreen display device 104 displays the sound off tab bar control 126 along with the unlock tab bar control 102 as shown in FIG. 1A. The mobile computing device 100 remains in a locked state.
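The cancel behavior described above (lifting the finger before the arrow reaches the contact point causes the tab to retract and leaves the device locked) can be sketched as a release handler. The function name and string states below are illustrative, not from the specification:

```python
# Hypothetical sketch of release handling: if the user lifts their finger
# before the measured motion reaches the threshold, the tab bar control
# retracts to its default position and the device remains locked.

def on_touch_up(drag_distance, threshold_distance):
    if drag_distance >= threshold_distance:
        return "unlock"              # arrow reached the first contact point
    return "retract_to_default"      # animate back toward the first edge


state_released_early = on_touch_up(drag_distance=120, threshold_distance=200)
state_released_late = on_touch_up(drag_distance=220, threshold_distance=200)
```

The same check also covers the case where the user drags backwards toward the first edge before releasing, since only the final measured distance matters at release time.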
FIGS. 2A-2F are illustrations demonstrating controlling a ringer of a mobile computing device 100 using tab bar controls displayed on a touchscreen display device 104. Specifically, FIGS. 2A-2D are illustrations demonstrating the silencing of the ringer of the mobile computing device 100 using the sound off tab bar control 126 displayed on the touchscreen display device 104.
In some implementations, the sound off tab graphic 130 may be a pictorial representation of the current state of the speaker or ringer of the mobile computing device 100 and the function performed by interacting with the graphical user interface element, in this example, the sound off tab bar control 126. As shown in FIG. 2A, the pictorial representation of a speaker emitting sound for the sound off tab graphic 130 signifies the ringer of the mobile computing device 100 is currently enabled. The user interacting with the sound off tab bar control 126 can control the ringer (e.g., disable the ringer and silence the mobile computing device 100).
Referring to FIGS. 2A-2D, a user wants to silence the ringer of the mobile computing device 100 while the mobile computing device 100 is in a locked state. For example, it is a Sunday morning and the user is at a church service where any audible tone from the mobile computing device 100 would disturb the other parishioners and disrupt the church service. The user selects the sound off tab bar control 126 by placing a finger in contact with the sound off tab graphic 130 included in the second region 132 of the sound off tab bar control 126. Upon contact with the sound off tab graphic 130, the unlock tab bar control 102 disappears or retracts towards the first edge 110 and is no longer displayed on the touchscreen display device 104. The touchscreen display device 104 then displays a second contact point 202, as shown in FIG. 2B. The second contact point 202 generally indicates a threshold measure of movement to be achieved to induce silencing of the ringer of the mobile device 100. That is, silencing of the ringer can occur when a measure of motion of the user's finger across the touchscreen display device 104 is equal to or greater than the threshold measure. In some implementations, the mobile computing device 100 provides tactile feedback when the user's finger initially makes contact with the sound off tab graphic 130. In some implementations, while the user's finger remains in contact with the sound off tab graphic 130, the sound off tab graphic 130 and the sound off tab bar control 126 are highlighted (e.g., displayed brighter than before contact).
While the user's finger maintains contact with the sound off tab graphic 130, the user, using a pulling motion, can further drag the sound off tab bar control 126 across the touchscreen display device 104. The user can drag the sound off tab bar control 126 from a second default position 134 towards the second contact point 202, as shown in FIGS. 2B and 2C. As the user drags the sound off tab bar control 126 across the touchscreen display device 104, the touchscreen display device 104 animates the display of the sound off tab bar control 126 to extend from the second edge 128. As the touchscreen display device 104 animates the display of the sound off tab bar control 126, the words "Sound Off" can be displayed within the sound off tab bar control 126 as the sound off tab bar control 126 is extended, as shown in FIG. 2C.
The animation of the sound off tab bar control 126 continues across the touchscreen display device 104 as long as the user maintains contact with the sound off tab graphic 130 while dragging the sound off tab bar control 126 across the touchscreen display device 104. Referring to FIG. 2C, when a second arrow 204 included on the sound off tab bar control 126 makes contact with the second contact point 202, the touchscreen display device 104 displays a sound off indicator 212 showing that the sound for the mobile computing device 100 is turned off. The sound off indicator 212 can include a sound off graphic 214 providing a graphic indicator representing the disabling (turning off) of the ringer and the silencing of the mobile computing device 100.
Additionally, when the second arrow 204 makes contact with the second contact point 202, the animation of the sound off tab bar control 126 continues across the touchscreen display device 104 towards the first edge 110 of the touchscreen display device 104. The ringer of the mobile computing device 100 is disabled (e.g., the mobile computing device 100 is silenced) and the touchscreen display device 104 displays the graphical user interface elements shown in FIG. 2D.
Referring to FIG. 2D, the touchscreen display device 104 displays the unlock tab bar control 102 and a sound on tab bar control 208, where the sound on tab bar control 208 includes a sound on tab graphic 206 in a third region 210 of the sound on tab bar control 208. In some implementations, the sound on tab graphic 206 may be a pictorial representation of the current state of the mobile computing device 100 and the function performed by interacting with the sound on tab bar control 208. For example, the pictorial representation for the sound on tab graphic 206 is a speaker emitting sound with a line drawn diagonally across it, signifying the ringer of the mobile computing device 100 is currently disabled and the mobile computing device 100 is silent. Interacting with the sound on tab bar control 208 can enable the ringer of the mobile computing device 100 (e.g., the mobile computing device 100 is no longer silent but can emit audible sounds).
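The paired sound off and sound on controls behave as a two-state toggle: the tab shown at the second default position always represents the action that would change the current ringer state, and activating it flips that state and swaps in the opposite tab. A hypothetical sketch of this toggle, with illustrative names:

```python
# Hypothetical sketch of the ringer toggle: the displayed tab reflects the
# action available in the current state; dragging it past its contact
# point flips the ringer state and swaps the displayed tab.

def displayed_tab(ringer_enabled):
    # With the ringer enabled, the "sound off" tab is shown, and vice versa.
    return "sound_off_tab" if ringer_enabled else "sound_on_tab"


def activate_tab(ringer_enabled):
    # Completing the drag gesture flips the ringer state.
    return not ringer_enabled


ringer = True
tab_before = displayed_tab(ringer)    # sound off tab shown while ringer is on
ringer = activate_tab(ringer)         # user silences the ringer
tab_after = displayed_tab(ringer)     # sound on tab now shown instead
```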
In the implementation shown in FIGS. 2A-2F, the sound on tab bar control 208 and the sound off tab bar control 126 are located on the touchscreen display device 104 at the second default position 134. The touchscreen display device 104 can display the sound on tab graphic 206 included in the sound on tab bar control 208 in a different color than the sound off tab graphic 130 in the sound off tab bar control 126. This can help to additionally distinguish the different states of the mobile computing device 100 represented by the sound off tab graphic 130 and the sound on tab graphic 206.
Referring to FIGS. 2A-2C, in some situations, a user may decide while dragging the sound off tab bar control 126 towards the second contact point 202 that they no longer want to silence the ringer of the mobile computing device 100. The user then removes their finger from the sound off tab graphic 130. This disconnects or breaks the physical contact between the user and the touchscreen display device 104. The sound off tab bar control 126 is no longer dragged across the touchscreen display device 104 and retracts back to the second default position 134. The touchscreen display device 104 displays the sound off tab bar control 126 and the unlock tab bar control 102 as shown in FIG. 2A. The ringer of the mobile computing device 100 remains enabled.
In some implementations, a background wallpaper of the touchscreen display device 104 can be visible behind the sound off tab bar control 126 even as the user drags the sound off tab bar control 126 across the touchscreen display device 104 (e.g., the sound off tab bar control 126 is semitransparent). When the second arrow 204 makes contact with the second contact point 202, the sound off tab bar control 126 can change color and become opaque (e.g., the background wallpaper of the touchscreen display device 104 is no longer visible behind the sound off tab bar control 126).
Referring to FIGS. 2D-2F, a user wants to enable the ringer of the mobile computing device 100 while the mobile computing device 100 is in a locked state. For example, the church service is over and the user has left the church and is in their car traveling to meet friends for breakfast. The user selects the sound on tab bar control 208 by placing a finger in contact with the sound on tab graphic 206 included in the third region 210 of the sound on tab bar control 208. Upon contact with the sound on tab graphic 206, the unlock tab bar control 102 retracts towards the first edge 110 and is no longer displayed on the touchscreen display device 104. The touchscreen display device 104 then displays a third contact point 216, as shown in FIG. 2E. In some implementations, the mobile computing device 100 provides tactile feedback when the user's finger initially makes contact with the sound on tab graphic 206. In some implementations, while the user's finger remains in contact with the sound on tab graphic 206, the sound on tab graphic 206 and the sound on tab bar control 208 are highlighted (e.g., displayed brighter than before contact).
In a similar manner as the sound off tab bar control 126, while the user's finger maintains contact with the sound on tab graphic 206 in the sound on tab bar control 208, the user, using a pulling motion, can further drag the sound on tab bar control 208 across the touchscreen display device 104. The touchscreen display device 104 animates the display of the sound on tab bar control 208 to extend from the second edge 128 as the user maintains contact with the sound on tab graphic 206. As the touchscreen display device 104 animates the display of the sound on tab bar control 208, the words "Sound On" can be displayed within the sound on tab bar control 208 as the sound on tab bar control 208 is extended, as shown in FIG. 2E. As shown in FIG. 2F, when a third arrow 218 included on the sound on tab bar control 208 makes contact with the third contact point 216, the touchscreen display device 104 displays a sound on indicator 220 showing that the sound for the mobile computing device 100 is turned on. Additionally, the sound on indicator 220 can include a sound on graphic 222 providing a graphic indicator representing the enabling (turning on) of the ringer.
Additionally, when the third arrow 218 makes contact with the third contact point 216, the animation of the sound on tab bar control 208 continues across the touchscreen display device 104 towards the first edge 110 of the touchscreen display device 104. The ringer of the mobile computing device 100 is enabled (e.g., the sound is turned on for the mobile computing device 100) and the touchscreen display device 104 displays the graphical user interface elements as shown in FIG. 2A.
Referring to FIGS. 2D-2F, in some situations, a user may decide while dragging the sound on tab bar control 208 towards the third contact point 216 that they no longer want to enable the ringer of the mobile computing device 100 (i.e., turn the sound on for the mobile computing device 100). The user then removes their finger from the sound on tab graphic 206, disconnecting or breaking the physical contact between the user and the touchscreen display device 104. The sound on tab bar control 208 retracts back to the second default position 134. The touchscreen display device 104 displays the sound on tab bar control 208 and the unlock tab bar control 102 as shown in FIG. 2D. The ringer of the mobile computing device 100 remains disabled, silencing the mobile computing device 100.
In some implementations, similar to the sound off tab bar control 126, the sound on tab bar control 208 is semitransparent, allowing a background wallpaper of the touchscreen display device 104 to be visible behind the sound on tab bar control 208 even as the user drags the sound on tab bar control 208 across the touchscreen display device 104. When the third arrow 218 makes contact with the third contact point 216, the sound on tab bar control 208 can change color and become opaque (e.g., the background wallpaper of the touchscreen display device 104 is no longer visible behind the sound on tab bar control 208).
FIGS. 3A-3C are illustrations demonstrating accepting and declining an incoming phone call on the mobile computing device 100. Referring to FIG. 3A, the touchscreen display device 104 displays graphical user interface elements such as an answer tab graphic 302 included in a first phone region 304 of an answer tab bar control 306 and a decline tab graphic 308 included in a second phone region 310 of a decline tab bar control 312. The answer tab bar control 306 is located along the first edge 110 of the touchscreen display device 104. The decline tab bar control 312 is located along the second edge 128 of the touchscreen display device 104.
In some implementations, the answer tab graphic 302 and the decline tab graphic 308 may be pictorial representations of the functions performed by interacting with the answer tab bar control 306 and the decline tab bar control 312 graphical user interface elements, respectively. As shown in FIG. 3A, the pictorial representation for the answer tab graphic 302 is a telephone in a "picked up" or answered state, signifying the acceptance or answering of the incoming phone call on the mobile computing device 100. The pictorial representation for the decline tab graphic 308 is a telephone in a "hung up" or unanswered state, signifying declining or not answering the incoming phone call on the mobile computing device 100.
Referring to FIG. 3A, when the mobile computing device 100 receives an incoming phone call, the touchscreen display device 104 can display a name for the incoming caller (e.g., caller name 314), a picture of the incoming caller (e.g., caller picture 316), and a phone number for the incoming call (e.g., caller phone number 318).
Referring to the example shown in FIGS. 3A and 3B, a user receives an incoming phone call on the mobile computing device 100 while the mobile computing device 100 is in a locked state. For example, the user is in a meeting with their boss and prefers not to be interrupted by a coworker. On receiving the incoming call and seeing that it is from a coworker (recognizing the name, picture, and phone number associated with the incoming phone call), the user decides to decline the incoming call (not answer the call on the mobile computing device 100). In order to decline the incoming call, the user selects the decline tab bar control 312 by placing a finger in contact with the decline tab graphic 308. Upon contact with the decline tab graphic 308, the answer tab bar control 306 disappears or retracts towards the first edge 110 and is no longer displayed on the touchscreen display device 104. The touchscreen display device 104 then displays a second phone contact point 320, as shown in FIG. 3B. The second phone contact point 320 generally indicates a threshold measure of movement to be achieved to induce declining of the call. That is, declining the call can occur when a measure of motion of the user's finger across the touchscreen display device 104 is equal to or greater than the threshold measure. Additionally, the touchscreen display device 104 can display decline instructions 322. The decline instructions 322 can indicate to the user the action the user needs to take to decline the incoming phone call.
In some implementations, the mobile computing device 100 provides tactile feedback when the user's finger initially makes contact with the decline tab graphic 308. In some implementations, while the user's finger remains in contact with the decline tab graphic 308, the decline tab graphic 308 and the decline tab bar control 312 are highlighted (e.g., displayed brighter than before contact). While the user's finger maintains contact with the decline tab graphic 308, the user, using a pulling motion, can further drag the decline tab bar control 312 across the touchscreen display device 104 towards the second phone contact point 320, as shown in FIG. 3B. As the user drags the decline tab bar control 312 across the touchscreen display device 104, the touchscreen display device 104 animates the display of the decline tab bar control 312 to extend from the second edge 128. As the touchscreen display device 104 animates the display of the decline tab bar control 312, the term "Decline" can be displayed within the decline tab bar control 312 as the decline tab bar control 312 is extended. When a second phone arrow 325 included on the decline tab bar control 312 makes contact with the second phone contact point 320, the decline tab bar control 312 changes color. The animation of the decline tab bar control 312 continues across the touchscreen display device 104 towards the first edge 110 and is then no longer displayed on the touchscreen display device 104. The incoming phone call is then declined or not answered by the user (e.g., the mobile computing device 100 no longer indicates there is an incoming call and the caller is connected to the user's voicemail). The touchscreen display device 104 then displays the graphical user interface elements shown in FIG. 2A.
Referring to FIGS. 3A-3B, in some situations, a user may decide, while dragging the decline tab bar control 312 towards the second phone contact point 320, that they no longer want to decline or not answer the incoming call. The user then removes their finger from the decline tab graphic 308. This disconnects or breaks the physical contact between the user and the touchscreen display device 104. The decline tab bar control 312 is no longer dragged across the touchscreen display device 104 and retracts back to a second phone default position 324. The touchscreen display device 104 displays the decline tab bar control 312 including the decline tab graphic 308 and the answer tab bar control 306 including the answer tab graphic 302 as shown in FIG. 3A. The mobile computing device 100 continues to receive the incoming call.
Referring to FIGS. 3A and 3C, the user, in order to answer the incoming call, selects the answer tab bar control 306 by placing a finger in contact with the answer tab graphic 302. Upon contact with the answer tab graphic 302, the decline tab bar control 312 disappears or retracts towards the second edge 128 and is no longer displayed on the touchscreen display device 104. The touchscreen display device 104 then displays first phone contact point 326, as shown in FIG. 3C. The first phone contact point 326 generally indicates a threshold measure of movement to be achieved to induce answering of the call. That is, answering the call can occur when a measure of motion of the user's finger across the touchscreen display device 104 is equal to or greater than the threshold measure. Additionally, the touchscreen display device 104 can display answer instructions 328. The answer instructions 328 can indicate to the user the action the user needs to take in order to answer the incoming phone call.
In some implementations, the mobile computing device 100 provides tactile feedback when the user's finger initially makes contact with the answer tab graphic 302. In some implementations, while the user's finger remains in contact with the answer tab graphic 302, the answer tab graphic 302 and the answer tab bar control 306 are highlighted (e.g., displayed brighter than before contact). While the user's finger maintains contact with the answer tab graphic 302, the user, using a pulling motion, can further drag the answer tab bar control 306 across the touchscreen display device 104 towards the first phone contact point 326, as shown in FIG. 3C. As the user drags the answer tab bar control 306 across the touchscreen display device 104, the touchscreen display device 104 animates the display of the answer tab bar control 306 to extend from the first edge 110. As the touchscreen display device 104 animates the display of the answer tab bar control 306, the term “Answer” can be displayed within the answer tab bar control 306 as the answer tab bar control 306 is extended. When a first phone arrow 330 included on the answer tab bar control 306 makes contact with the first phone contact point 326, the answer tab bar control 306 changes color. The animation of the answer tab bar control 306 continues across the touchscreen display device 104 towards the second edge 128 and is then no longer displayed on the touchscreen display device 104. The user answers the incoming phone call. The touchscreen display device 104 then displays graphical user interface elements (not shown) related to the handling of the receipt of the incoming call (e.g., a graphical user interface element to end the call).
In some situations, a user may decide while dragging the answer tab bar control 306 towards the first phone contact point 326 that they no longer want to answer the incoming call. The user then removes their finger from the answer tab graphic 302. This disconnects or breaks the physical contact between the user and the touchscreen display device 104. The answer tab bar control 306 is no longer dragged across the touchscreen display device 104 and retracts back to a first phone default position 332 as shown in FIG. 3A. The touchscreen display device 104 displays the answer tab bar control 306 and the decline tab bar control 312 as shown in FIG. 3A. The mobile computing device 100 continues to receive the incoming call.
FIG. 4 is an illustration showing graphical user interface elements on the mobile computing device 100 when oriented in a landscape mode. The examples shown in FIGS. 1A-1D, 2A-2F and 3A-3C are illustrated on the mobile computing device 100 when oriented in a portrait mode. The example tab bar controls and their functionality described in FIGS. 1A-1D, 2A-2F and 3A-3C can be implemented on the mobile computing device 100 when oriented in a landscape mode. FIG. 4 shows an example of graphical user interface elements that can include an alternative unlock tab graphic 408 included in the first region 118 of the unlock tab bar control 102 and an alternative sound off tab graphic 430 included in the second region 132 of the sound off tab bar control 126. The alternative unlock tab graphic 408 and the alternative sound off tab graphic 430 are oriented to appear correctly to the user when the mobile computing device 100 is held in the landscape mode.
The unlock tab bar control 102 is located along the first edge 110 of the touchscreen display device 104. The sound off tab bar control 126 is located along the second edge 128 of the touchscreen display device 104. As shown in FIG. 1A and FIG. 4, the locations of the unlock tab bar control 102 and the sound off tab bar control 126 remain the same on the touchscreen display device 104. The tab graphics are adjusted accordingly for landscape and portrait modes. The user interaction with the tab bar controls is the same in both the landscape mode and the portrait mode. However, the user motion when dragging the tab bar control in the landscape mode is up and down, as opposed to the side-to-side motion used when the mobile computing device 100 is in the portrait mode. The functionality of the graphical user interface elements on the mobile computing device 100 remains the same in both the landscape and portrait modes.
FIG. 5 is a flowchart of an exemplary process 500 for using a tab bar control. The process 500 can be used with the tab bar controls described in FIGS. 1A-1D, FIGS. 2A-2F, FIGS. 3A-3C and FIG. 4.
The process 500 begins by displaying first and second tab graphics (502). For example, referring to FIG. 1A, the mobile computing device 100 displays the unlock tab graphic 108 and the sound off tab graphic 130 on the touchscreen display device 104. The process 500 continues by detecting user selection of the first tab graphic (504). For example, the user places their finger on the unlock tab graphic 108, resulting in contact being made between the user and the touchscreen display device 104 and the selection of the unlock tab graphic 108 by the user. The second tab graphic is removed (506). For example, the touchscreen display device 104 no longer displays the sound off tab graphic 130 and the sound off tab bar control 126. A first target is displayed (508). For example, the touchscreen display device 104 displays the first contact point 120. User motion corresponding to the user selection is detected (510). The tab bar control is animated (512). For example, referring to FIG. 1C, the mobile computing device 100 detects the pulling motion by the user of the unlock tab graphic 108, resulting in the animation of the unlock tab bar control 102 across the touchscreen display device 104 towards the first contact point 120. The measure of user motion is determined (514). In step 518, if the measure of user motion is less than the threshold measure and, in step 516, it is determined that user contact with the tab graphic has not ceased, the process 500 continues to step 514 and the measure of user motion is again determined. The measure can include a distance and the threshold measure can include a threshold distance. In step 518, if the measure of user motion is less than the threshold measure and, in step 516, it is determined that user contact with the tab graphic has ceased, the tab bar control is retracted (522). In step 518, if the measure of user motion is equal to or greater than the threshold measure, the mobile device function is performed (520). For example, referring to FIGS. 1C and 1D, the mobile computing device 100 determines contact has been made between first arrow 122 and the first contact point 120. The mobile computing device 100 performs the function associated with the first tab graphic. For example, referring to FIG. 1D, the mobile computing device 100 unlocks and displays the graphical user interface 124.
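The loop formed by steps 514, 516 and 518 of the process 500 can be sketched as follows. This is an illustrative sketch only; the function name and the sampling model (a sequence of distance/contact pairs) are assumptions, not part of the specification:

```python
def run_tab_bar_control(samples, threshold):
    """Sketch of exemplary process 500: repeatedly determine the
    measure of user motion while contact persists.  `samples`
    yields (distance_dragged, contact_still_held) pairs.  Returns
    "perform" when the threshold is met (step 520) or "retract"
    when contact ceases before the threshold is met (step 522)."""
    for distance, contact_held in samples:
        if distance >= threshold:   # step 518: threshold met
            return "perform"        # step 520: perform device function
        if not contact_held:        # step 516: contact has ceased
            return "retract"        # step 522: retract tab bar control
        # otherwise loop back to step 514 and measure again
    return "retract"                # no more samples; treat as ceased
```

The ordering mirrors the flowchart: the threshold test of step 518 is applied first, and only a sub-threshold measure falls through to the contact check of step 516.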
The use of the tab bar controls described with reference to FIGS. 1A-1D, FIGS. 2A-2F, FIGS. 3A-3C and FIG. 4 requires continual physical contact between the user and the touchscreen display device 104 in order to perform the task represented by the tab bar control (e.g., unlocking the mobile computing device 100, answering an incoming call, etc.). In addition, the mobile computing device 100 may not perform the task until the user drags the tab bar control across the touchscreen display device 104 and contacts the contact point on the touchscreen display device 104 related to the tab bar control. The need for continual physical contact between the user and the touchscreen display device 104 and, in addition, the requirement to have the tab bar control contact a contact point prior to performing a task makes the likelihood of an inadvertent trigger of the task on the mobile computing device 100 small. For example, it would be difficult if not impossible for the mobile computing device 100 to be unlocked from a locked state while in the user's pocket.
In some implementations, when the user is holding the mobile computing device 100 in a portrait mode, the user maintaining their finger on a tab graphic accomplishes the continual physical contact needed between the user and the touchscreen display device 104. The user, while maintaining this contact, uses a pulling, side-to-side motion to drag the tab bar control across the touchscreen display device 104. The touchscreen display device 104 can have a certain tolerance for up and down motion by the user while maintaining contact on the tab graphic. However, if the user exceeds the tolerance, the touchscreen display device 104 will interpret the movement as disconnecting or breaking the contact between the user and the touchscreen display device 104 and the tab bar control will retract towards its associated edge. The task associated with the tab bar control will not be performed (e.g., the mobile computing device 100 remains locked).
In some implementations, when the user is holding the mobile computing device 100 in a landscape mode, the user keeps their finger on a tab graphic, accomplishing the continual physical contact needed between the user and the touchscreen display device 104. The user, while maintaining this contact, uses a pulling motion to drag the tab bar control up and down the touchscreen display device 104. The touchscreen display device 104 can have a certain tolerance for side-to-side motion by the user while maintaining contact on the tab graphic. However, if the user exceeds the tolerance, the touchscreen display device 104 will interpret the movement as disconnecting or breaking the contact between the user and the touchscreen display device 104 and the tab bar control will retract towards its associated edge. The task associated with the tab bar control will not be performed (e.g., the mobile computing device 100 remains locked).
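The off-axis tolerance described in the two paragraphs above can be sketched as follows. The function name, the orientation flag, and the single-value tolerance model are illustrative assumptions, not part of the specification:

```python
def contact_maintained(dx, dy, orientation, tolerance):
    """Return True while the drag stays within the tolerated
    off-axis deviation.  In portrait mode the drag axis is
    horizontal, so vertical motion (dy) is checked against the
    tolerance; in landscape mode the drag axis is vertical, so
    horizontal motion (dx) is checked.  Exceeding the tolerance is
    interpreted as breaking contact, causing the tab bar control
    to retract towards its associated edge."""
    off_axis = abs(dy) if orientation == "portrait" else abs(dx)
    return off_axis <= tolerance
```

A drag of any length along the control's own axis never breaks contact under this model; only the perpendicular component is compared against the tolerance.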
In some implementations, the user maintains their finger on a tab graphic, accomplishing the continual physical contact needed between the user and the touchscreen display device 104. The user, while maintaining this contact, may use a pulling motion to drag the tab bar control forward across the touchscreen display device 104 to contact a contact point. The task associated with the tab bar control can be performed once the tab bar control contacts the contact point. The user, while maintaining contact with the tab graphic, may also drag the tab bar control backwards towards the originating edge of the touchscreen display device 104 for the tab bar control. This backwards movement can result in the tab bar control returning to its default position. The tab bar control can return to its default position because of the backwards movement of the tab bar control by the user and because of some retracting of the tab bar control once it reaches a certain point on the touchscreen display device 104. The task associated with the tab bar control will not be performed. For example, referring to FIGS. 2A and 2B, the user may drag the sound off tab bar control 126 backwards towards second edge 128, resulting in the sound off tab bar control 126 retracting back to its second default position 134. The sound will then remain on for the mobile computing device 100. In some implementations, any backwards movement of the tab bar control will result in the tab bar control retracting to its default position. The task associated with the tab bar control will not be performed.
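The forward/backward drag behavior described above can be sketched as follows. The function name, the return values, and the fixed retract-point model are illustrative assumptions; the specification leaves the "certain point" at which retraction begins unspecified:

```python
def resolve_drag(positions, contact_point, retract_point):
    """Track a dragged tab bar control through a sequence of
    positions along its drag axis.  The control follows the
    finger; if it reaches the contact point the associated task
    is performed, but if it is dragged backwards past the retract
    point it snaps back to its default position and the task is
    not performed."""
    for position in positions:
        if position >= contact_point:
            return "task_performed"
        if position <= retract_point:
            return "default_position"
    return "tracking"  # finger still down, control mid-drag
```

Under the stricter variant in the final sentences of the paragraph above, the retract condition would instead compare each position against the previous one, so that any backwards movement at all triggers the retraction.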
A number of implementations have been described. Nevertheless, it will be understood that various modifications may be made without departing from the spirit and scope of the disclosure. For example, various forms of the flows shown above may be used, with steps re-ordered, added, or removed. Accordingly, other implementations are within the scope of the following claims.
Implementations and all of the functional operations described in this specification may be provided in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them. Implementations may be provided as one or more computer program products, i.e., one or more modules of computer program instructions encoded on a computer readable medium for execution by, or to control the operation of, data processing apparatus. The computer readable medium may be a machine-readable storage device, a machine-readable storage substrate, a memory device, a composition of matter effecting a machine-readable propagated signal, or a combination of one or more of them. The term “data processing apparatus” encompasses all apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, or multiple processors or computers. The apparatus may include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, or a combination of one or more of them. A propagated signal is an artificially generated signal, e.g., a machine-generated electrical, optical, or electromagnetic signal that is generated to encode information for transmission to suitable receiver apparatus.
A computer program (also known as a program, software, software application, script, or code) may be written in any form of programming language, including compiled or interpreted languages, and it may be deployed in any form, including as a stand alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program does not necessarily correspond to a file in a file system. A program may be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub programs, or portions of code). A computer program may be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
The processes and logic flows described in this specification may be performed by one or more programmable processors executing one or more computer programs to perform functions by operating on input data and generating output. The processes and logic flows may also be performed by, and apparatus may also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit).
Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read only memory or a random access memory or both.
The essential elements of a computer are a processor for performing instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto optical disks, or optical disks. However, a computer need not have such devices. Moreover, a computer may be embedded in another device, e.g., a tablet computer, a mobile telephone, a personal digital assistant (PDA), a mobile audio player, a Global Positioning System (GPS) receiver, to name just a few. Computer readable media suitable for storing computer program instructions and data include all forms of non volatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto optical disks; and CD ROM and DVD-ROM disks. The processor and the memory may be supplemented by, or incorporated in, special purpose logic circuitry.
To provide for interaction with a user, implementations may be provided on a computer having a display device, e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor, for displaying information to the user and a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user may provide input to the computer. Other kinds of devices may be used to provide for interaction with a user as well; for example, feedback provided to the user may be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user may be received in any form, including acoustic, speech, or tactile input.
Implementations may be provided in a computing system that includes a back end component, e.g., as a data server, or that includes a middleware component, e.g., an application server, or that includes a front end component, e.g., a client computer having a graphical user interface or a Web browser through which a user may interact with an implementation, or any combination of one or more such back end, middleware, or front end components. The components of the system may be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include a local area network (“LAN”) and a wide area network (“WAN”), e.g., the Internet.
The computing system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
While this specification contains many specifics, these should not be construed as limitations on the scope of the disclosure or of what may be claimed, but rather as descriptions of features specific to particular implementations. Certain features that are described in this specification in the context of separate implementations may also be implemented in combination in a single implementation. Conversely, various features that are described in the context of a single implementation may also be implemented in multiple implementations separately or in any suitable sub-combination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination may in some cases be excised from the combination, and the claimed combination may be directed to a sub-combination or variation of a sub-combination.
Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the implementations described above should not be understood as requiring such separation in all implementations, and it should be understood that the described program components and systems may generally be integrated together in a single software product or packaged into multiple software products.
In each instance where an HTML file is mentioned, other file types or formats may be substituted. For instance, an HTML file may be replaced by an XML, JSON, plain text, or other types of files. Moreover, where a table or hash table is mentioned, other data structures (such as spreadsheets, relational databases, or structured files) may be used.
Thus, particular implementations have been described. Other implementations are within the scope of the following claims. For example, the actions recited in the claims may be performed in a different order and still achieve desirable results.