CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims the priority of U.S. Provisional Application Ser. No. 61/255,847, filed on Oct. 28, 2009, the contents of which are incorporated herein by reference.
BACKGROUND
This specification relates to the display of collections of interactive elements that trigger actions directed to a particular contact, message, media file (e.g., image, music or video file), or other item.
Touchscreens are graphical displays that can act as both an input and an output. For example, visual elements displayed on a touchscreen can serve double duty, acting both as interactive elements that receive user input and as visual outputs that convey information to a user. As a result, data processing devices that use touchscreens can be made relatively small. Indeed, touchscreens are so effective that many modern data processing devices supplement them with only a small number of other, generally mechanical, input mechanisms. Touchscreens are thus favored in data processing devices where size and portability are important design considerations, such as smartphones and personal digital assistants (PDAs).
SUMMARY
This specification describes technologies relating to the display, on touchscreen displays, of collections of interactive elements that trigger the performance of data processing and other actions. The interactive elements in such collections trigger data processing or other actions that are directed to a particular contact, message, media file (e.g., image, music or video file), or other item. Because such a collection is grouped and displayed together, a user can conveniently and intuitively navigate through a wide range of actions directed to a particular item, even when the touchscreen display on which the collection of interactive elements is displayed is relatively small.
A first aspect of these technologies is a method performed by a system comprising one or more data processing devices and a touchscreen display. The method includes: displaying several identifiers, each identifier comprising one or more graphical or textual elements that identify an item and each identifier associated with a respective interactive element; receiving user interaction with a first of the interactive elements that is associated with a first of the identifiers; in response to the user interaction, displaying a collection of action widgets on the touchscreen display, the action widgets comprising iconic graphical indicia that each represent an action triggered by user interaction therewith, the iconic graphical indicia displayed adjacent one another in a strip-shaped area that is wider than it is high, the strip-shaped area being displaced vertically on the touchscreen display from the first identifier so that the first identifier is visible on the touchscreen notwithstanding the display of the collection of action widgets; receiving user interaction with a first of the action widgets in the collection displayed on the touchscreen display; and performing the action represented by the first of the action widgets on the item identified by the first identifier.
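By way of illustration only, the following Kotlin sketch models this flow under assumed, hypothetical names (Item, ActionWidget, ActionStrip, Screen) that are not part of this specification: tapping the interactive element associated with an identifier anchors a strip of action widgets to that identifier's row, and tapping an action widget applies the represented action to the identified item.

    // Hypothetical sketch of the first-aspect flow; all names are illustrative.
    data class Item(val id: Int, val label: String)

    data class ActionWidget(val icon: String, val perform: (Item) -> Unit)

    // The strip is anchored to the row of the selected identifier, which stays visible.
    data class ActionStrip(val anchorRow: Int, val widgets: List<ActionWidget>)

    class Screen(private val items: List<Item>) {
        var strip: ActionStrip? = null
            private set

        // User taps the interactive element associated with the identifier in row `row`.
        fun onInteractiveElementTapped(row: Int, widgets: List<ActionWidget>) {
            strip = ActionStrip(anchorRow = row, widgets = widgets)
        }

        // User taps one of the displayed action widgets; the action is performed on the
        // item identified by the identifier to which the strip is anchored.
        fun onActionWidgetTapped(index: Int) {
            val s = strip ?: return
            s.widgets[index].perform(items[s.anchorRow])
        }
    }

    fun main() {
        val screen = Screen(listOf(Item(1, "Contact A"), Item(2, "Contact B")))
        screen.onInteractiveElementTapped(row = 1, widgets = listOf(
            ActionWidget("phone") { println("calling ${it.label}") },
            ActionWidget("email") { println("emailing ${it.label}") }))
        screen.onActionWidgetTapped(0)   // prints "calling Contact B"
    }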
This first aspect and the second and third aspects can include one or more of the following features. Displaying the collection of action widgets on the touchscreen display can include apparently displacing one or more of the identifiers away from the first identifier to accommodate the strip-shaped area between the displaced identifiers and the first identifier. The method can include displaying a disambiguation interactive element on the touchscreen display on a side of the strip-shaped area opposite the first identifier and receiving user interaction with the disambiguation interactive element, the user interaction disambiguating the action represented by the first of the action widgets. Performing the action represented by the first of the action widgets can include performing the action in accordance with the disambiguation provided by the user interaction with the disambiguation interactive element. Displaying the collection of action widgets can include displaying a pointed indicium that is directed toward an area in which the first identifier is found. A border can surround the collection of action widgets. The border can demarcate the collection of action widgets from other portions of the touchscreen display. The pointed indicium can extend outwardly from a relatively straighter portion of the border toward the area in which the first identifier is found. Each collection of information can be displayed in a strip-shaped area that is wider than it is high. Each strip-shaped area can occupy a majority of the width of the touchscreen display. The identifiers can be aligned horizontally in the strip-shaped areas. The method can also include receiving user interaction dragging across the strip-shaped area and, in response to the user interaction, displaying a second collection of action widgets on the touchscreen display. The second collection of action widgets can include at least one action widget that is not found in the first collection of action widgets and exclude at least one action widget that is found in the first collection of action widgets. The first identifier can identify a first message. The action widgets in the collection can include a reply widget that, in response to user interaction, triggers a display of a presentation for authoring a reply to the first message and a repost widget that, in response to user interaction, triggers reposting of the first message to a social network. The first identifier can identify a first contact. The action widgets in the collection can include a telephone contact widget that, in response to user interaction, triggers a telephone call to the first contact and a message widget that, in response to user interaction, triggers a display of a presentation for authoring a message addressed to the first contact. The first identifier can identify a first media file.
A second aspect of these technologies is a device that includes a computer storage medium encoded with a computer program. The program includes instructions that, when executed by a system comprising one or more data processing devices and a touchscreen display, cause the one or more data processing devices to perform operations. The operations include displaying an interactive element in a presentation on the touchscreen display, receiving user interaction with the interactive element, and displaying, in response to the user interaction, a collection of action widgets apparently overlaid on the presentation. The action widgets comprise iconic graphical indicia that each represent an action triggered by user interaction therewith. The iconic graphical indicia are displayed adjacent one another in an area that is wider than it is high and that is associated with a visible indicium that indicates to what the actions triggered by user interaction with the widgets in the collection are directed. The area is displaced on the touchscreen display from the interactive element so that the interactive element is visible in the presentation notwithstanding the display of the collection of widgets.
This second aspect and the first and third aspects can include one or more of the following features. The operations can also include receiving user interaction with a first of the action widgets that is in the collection displayed on the touchscreen display and performing the action represented by the first of the action widgets in accordance with the visible indicium. The operations can include displaying a disambiguation interactive element on the touchscreen display and receiving user interaction with the disambiguation interactive element, the user interaction disambiguating the action represented by the first of the action widgets. Performing the action represented by the first of the action widgets can include performing the action in accordance with the disambiguation provided by the user interaction with the disambiguation interactive element. The visible indicium can indicate that the actions triggered by user interaction with the action widgets in the collection are directed to a message. The action widgets in the collection can include a reply widget that, in response to user interaction, triggers a display of a presentation for authoring a reply to the message and a repost widget that, in response to user interaction, triggers reposting of the message to a social network. The visible indicium can indicate that the actions triggered by user interaction with the action widgets in the collection are directed to a hyperlink that refers, in a reference, to an electronic document or to a portion of an electronic document. The action widgets in the collection can include an open widget that, in response to user interaction, triggers opening of the referenced electronic document or the referenced portion of the electronic document and a share widget that, in response to user interaction, triggers transmission of a message or display of a presentation for authoring a message that includes the reference. The area in which the iconic graphical indicia are displayed can be demarcated from other portions of the presentation by a border that surrounds the collection of widgets. The visible indicium can include a pointed indicium that extends outwardly from a relatively straighter portion of the border. The interactive element can be encompassed by the border.
A third aspect of these technologies is a handheld data processing system that includes a touchscreen display and a collection of one or more data processing devices that perform operations in accordance with one or more collections of machine-readable instructions. The operations include instructing the touchscreen display to display, in response to user interaction with a first interactive element displayed on the touchscreen display in association with an identifier of a contact, a first collection of action widgets comprising iconic graphical indicia that each represent an action directed to the identified contact, and to display, in response to user interaction with a second interactive element displayed on the touchscreen display in association with an identifier of a message, a second collection of action widgets comprising iconic graphical indicia that each represent an action directed to the identified message. The respective first or second interactive element remains visible on the touchscreen display notwithstanding the display of the respective first or second collection of action widgets.
This third aspect and the first and second aspects can include one or more of the following features. The operations can include instructing the touchscreen display to display, in response to user interaction with a third interactive element displayed on the touchscreen display in association with an identifier of a media file, a third collection of action widgets comprising iconic graphical indicia that each represent an action directed to the identified media file. Each of the first interactive element and the second interactive element can be displayed on the touchscreen display in conjunction with a collection of other interactive elements. Each of the other interactive elements can be associated with an identifier of another contact or another message. The identifiers in a presentation can be displayed in respective strip-shaped areas that include information characterizing contacts, media files, or messages. The identifiers can be aligned horizontally in the strip-shaped areas. Each of the collections of action widgets can be associated with a pointed indicium that is directed to indicate the respective contact or message to which the actions are directed. The operations can include instructing the touchscreen display to display a border surrounding the first and the second action widget collections, the border demarcating the first and the second action widget collections from other portions of the touchscreen display and the pointed indicium extending outwardly from a relatively straighter portion of the borders toward the area in which the identifier of the respective contact or message is found. The operations can include instructing the touchscreen display to display the iconic graphical indicia of the first and the second action widget collections adjacent one another in a strip-shaped area that is wider than it is high. The strip-shaped area can be displaced vertically on the touchscreen display from the respective of the first and the second interactive elements. The operations can also include receiving user interaction dragging across the strip-shaped area that includes the iconic graphical indicia and in response to the dragging user interaction, instructing the touchscreen display to display a second collection of action widgets in the strip-shaped area, the second collection of action widgets including at least one action widget that is not found in the first or the second action widget collection and excluding at least one action widget that is found in the first or the second action widget collection.
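The third aspect, in effect, keys the displayed collection of action widgets to the category of the identified item; FIG. 16, described below, depicts a collection of information identifying which interactive elements are displayed for different categories of interactive elements. By way of illustration only, the following Kotlin sketch models such a registry; the enum values and widget names, including the media-file entries, are assumptions rather than the collections recited above.

    // Illustrative registry keyed by item category; names are hypothetical.
    enum class ItemCategory { CONTACT, MESSAGE, MEDIA_FILE }

    object ActionWidgetRegistry {
        private val collections: Map<ItemCategory, List<String>> = mapOf(
            ItemCategory.CONTACT to listOf("view", "edit", "call", "email", "social"),
            ItemCategory.MESSAGE to listOf("favorite", "reply", "repost", "delete", "map"),
            ItemCategory.MEDIA_FILE to listOf("open", "share"))

        // Returns the iconic action widgets to display when the interactive element
        // associated with an item of the given category receives user interaction.
        fun widgetsFor(category: ItemCategory): List<String> = collections.getValue(category)
    }

    fun main() {
        println(ActionWidgetRegistry.widgetsFor(ItemCategory.MESSAGE))
    }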
The details of one or more implementations described in this specification are set forth in the accompanying drawings and the description below. Other features, aspects, and advantages of the subject matter will become apparent from the description, the drawings, and the claims.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a schematic representation of a system of electronic devices that can exchange information for the performance of data processing and other activities.
FIGS. 2-14, 19, and 20 are schematic representations of the display of presentations on a portion of a touchscreen of an electronic device.
FIG. 15 is a schematic representation of a collection of electronic components that can be housed in the electronic device that displays the presentations of FIGS. 2-14.
FIG. 16 is a schematic representation of a collection of information identifying interactive elements that are to be displayed in response to user interaction with different categories of interactive elements.
FIGS. 17 and 18 are schematic representations of implementations of collections of activities in an asymmetric social network.
Like reference numbers and designations in the various drawings indicate like elements.
DETAILED DESCRIPTION
FIG. 1 is a schematic representation of a system 100 of electronic devices that can exchange information for the performance of data processing and other activities. System 100 includes a device 105 that includes a touchscreen 115 with which a user can interact. Device 105 can be, e.g., a computer, a tablet computer, a telephone, a music player, a PDA, a gaming device, or the like. In some implementations, device 105 can be a mobile, portable device, as shown.
In addition to touchscreen 115, device 105 includes a housing 110 and a collection of off-screen input elements 120. Housing 110 supports touchscreen 115 and off-screen input elements 120. Housing 110 also houses a collection of electronic components, as described further below.
Touchscreen 115 is a graphical display that can act as both an input and an output. For example, touchscreen 115 can sense the position and movement of a user's finger or other elements. The sensed information can be translated into commands that trigger the performance of data processing and other activities by the electronic components housed in housing 110, by other electronic devices in system 100, or by both. Touchscreen 115 can be, e.g., a liquid crystal display (LCD) device, a light emitting diode (LED) device, an organic LED (OLED) device, an E-INK device, or a flexible touchscreen device. Input elements 120 are input devices that are "off" touchscreen 115. Input elements 120 are not part of touchscreen 115 and can receive input from a user that is distinct from the input received by touchscreen 115. Input elements 120 can include one or more keys, pads, trackballs, or other components that receive mechanical, audio, or other input from a user.
Among the electronic components housed in housing 110 are one or more wireless or wired data communication components such as transmitters, receivers, and controllers of those components. Device 105 can thus exchange information with other electronic devices in system 100, e.g., in response to user interaction with touchscreen 115.
In the illustrated implementation of system 100, device 105 includes two wireless data communication components, namely, a cellular phone transceiver and a WiFi transceiver. The WiFi transceiver is able to exchange messages 125 with a WiFi access point 130 and messages 135 with a peer electronic device 140 that also includes a WiFi transceiver. Peer electronic device 140 is associated with another individual user. The cellular phone transceiver is able to exchange messages 145 with a phone base station 155.
Phone base station 155 and WiFi access point 130 are connected for data communication with one or more data communication networks 160 via data links 162, 164 and can exchange information with one or more servers 165, 170, 175, 180.
In some implementations, peer electronic device 140 may also be able to exchange messages with WiFi access point 130 (or another WiFi access point) for data communication with data communication networks 160, device 105, and one or more of servers 165, 170, 175, 180. One or more additional devices 182, which are associated with one or more other individual users, may also be able to exchange messages 185 with phone base station 155 (or another base station) for data communication with data communication networks 160, device 105, and access to one or more of servers 165, 170, 175, 180. One or more personal computing devices 190, which are associated with one or more other individual users, may also be connected, via a data link 195, for data communication with data communication networks 160, device 105, and one or more of servers 165, 170, 175, 180.
System 100 supports both direct and server-mediated interaction by the users with whom devices 105, 140, 182, 190 are associated. Such interaction includes the exchange of messages, photos, or other media either directly with one another or indirectly, i.e., mediated by one or more of servers 165, 170, 175, 180.
The illustrated implementation of system 100 includes four different examples of servers that can mediate such interaction, namely, an electronic mail server 165, a social network server 170, a text message server 175, and a media server 180. Each of servers 165, 170, 175, 180 includes one or more data processing devices that are programmed to perform data processing activities in accordance with one or more sets of machine-readable instructions. For example, electronic mail server 165 is programmed to allow a user to access electronic mail from an electronic mail client. Social network server 170 is programmed to allow users to access a social network where messages, photos, and/or other media are exchanged.
The social network provided by social network server 170 can be a symmetric social network or an asymmetric social network. In a symmetric social network, related members necessarily share the same relationship with one another. Examples of such symmetric social networks include FACEBOOK, LINKEDIN, and MYSPACE, where two or more members establish bi-directionally equivalent "friend" or other relationships, generally using an invitation/response protocol that effectively requires the consent of both members to the relationship. Such bi-directionally equivalent relationships provide the same social interaction possibilities to the related members.
In an asymmetric social network, a first member's relationship to a second member is not necessarily the same as the second member's relationship to the first member. Since the character of the social interaction between members in a member network can be defined in accordance with the nature of the relationship between those members, a first member in an asymmetric social network may interact with a second member in ways that differ from the social interaction provided for the second member to interact with the first member. An example of such an asymmetric social network is TWITTER, where a first member may be a follower of a second member without the second member necessarily being a follower of the first. Indeed, in many asymmetric social networks, a second member need not even know a first member's identity even though the first member has a relationship to the second member.
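By way of illustration only, the following Kotlin sketch captures the structural difference: an asymmetric social network stores directed relationships, so one member can follow another without the relationship being reciprocated. The member names are hypothetical.

    // Directed "follows" relationships, as in an asymmetric social network.
    class AsymmetricNetwork {
        private val follows = mutableMapOf<String, MutableSet<String>>()

        fun follow(follower: String, followed: String) {
            follows.getOrPut(follower) { mutableSetOf() }.add(followed)
        }

        fun isFollowing(a: String, b: String): Boolean = follows[a]?.contains(b) == true
    }

    fun main() {
        val net = AsymmetricNetwork()
        net.follow("alice", "bob")
        println(net.isFollowing("alice", "bob"))   // true
        println(net.isFollowing("bob", "alice"))   // false: bob need not follow alice back
    }

A symmetric social network would instead store a single mutual relationship per pair of members, so the two queries above could never differ.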
Text message server 175 is programmed to allow a user to exchange chat or other text messages with other users. Media server 180 is programmed to allow a user to access a collection of one or more media files (e.g., image, music or video files) posted to media server 180 by other individuals. In some implementations, media server 180 may restrict a user to accessing media files posted by other individuals who have somehow approved the user's access.
FIG. 2 is a schematic representation of the display of a presentation 200 on a portion of touchscreen 115 of device 105. Presentation 200 includes a collection of identifiers 205, 210, 215, 220, 225, each of which identifies a contact. A contact is one or more individuals or another entity. A contact can be associated with an electronic device that can exchange information with device 105, such as one or more of devices 140, 182, 190 in system 100 (FIG. 1). In the illustrated implementation, each identifier 205, 210, 215, 220, 225 is the name of a respective contact and hence textual. However, other identifiers, such as graphical, iconic, or numeric identifiers, can also be used.
In some implementations, presentation 200 can be part of a display of a collection of other information on touchscreen 115 of device 105. For example, touchscreen 115 can display presentation 200 along with interactive icons that trigger the performance of data processing applications by device 105. In some implementations, the contacts identified by such a presentation 200 can be limited to "favorite" contacts, as discussed further below.
Identifiers 205, 210, 215, 220, 225 are each associated with a respective interactive widget 230, 235, 240, 245, 250 by positioning or arrangement on presentation 200. Each interactive widget 230, 235, 240, 245, 250 is an interactive element that, in response to user interaction, triggers the display of a collection of additional interactive elements. The additional interactive elements trigger the performance of additional data processing or other actions that are directed to the contact identified by the associated identifier 205, 210, 215, 220, 225, as described further below.
In the illustrated implementation, each identifier 205, 210, 215, 220, 225 is associated with a respective interactive widget 230, 235, 240, 245, 250 by virtue of common positioning within an area 255 that is dedicated to the display of information characterizing a single contact. Interactive widgets 230, 235, 240, 245, 250 are positioned laterally adjacent to the respective identifiers 205, 210, 215, 220, 225 (i.e., to the right in the illustrated implementation). In the illustrated implementation, areas 255 are demarcated from one another by borders 260. In other implementations, areas 255 can be demarcated using color, empty expanses, or other visual features. In other implementations, interactive widgets 230, 235, 240, 245, 250 can be positioned adjacent to areas 255.
In the illustrated implementation, each area 255 also includes a graphical indicium 265 that characterizes the contact. Each graphical indicium 265 is a photograph, an icon, or another graphical representation of the contact identified by an associated identifier 205, 210, 215, 220, 225. Graphical indicia 265 can be stored in one or more memory devices of device 105, e.g., in conjunction with other contact information.
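By way of illustration only, the following Kotlin sketch models how each area 255 might group an identifier, its graphical indicium, and its interactive widget so that user interaction with the widget can be resolved to the contact in the same row; the field names are hypothetical.

    // One strip-shaped area 255: identifier, graphical indicium 265, and interactive widget.
    data class ContactRow(
        val identifier: String,   // e.g., the contact's name (identifier 205, 210, ...)
        val indicium: String,     // photo or icon resource for graphical indicium 265
        val widgetId: Int)        // the interactive widget (230, 235, ...) shown in the row

    // Resolve a tapped widget to the contact row whose area it shares.
    fun rowForWidget(rows: List<ContactRow>, widgetId: Int): ContactRow? =
        rows.firstOrNull { it.widgetId == widgetId }

    fun main() {
        val rows = listOf(ContactRow("Ada", "ada.png", 230), ContactRow("Ben", "ben.png", 235))
        println(rowForWidget(rows, 235)?.identifier)   // Ben
    }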
In some implementations, each area 255 can include additional information characterizing a contact, such as some or all of the contact's "contact information." Such contact information can include, e.g., the contact's title, image, phone number, electronic mail or other address, employer, moniker in a social network, or the like. Such additional information can also be stored in one or more memory devices of device 105.
In the illustrated implementation of device 105 (i.e., a portable, handheld device), each area 255 occupies a majority of the width W of touchscreen 115. Further, areas 255 are aligned with one another and arranged one above the other to span a majority of the height H of touchscreen 115. Identifiers 205, 210, 215, 220, 225, graphical indicia 265, and widgets 230, 235, 240, 245, 250 in different areas 255 are aligned with one another. Such an arrangement lists information characterizing the contacts identified by identifiers 205, 210, 215, 220, 225 in a convenient format that is familiar to many individuals. Other layouts are possible, e.g., in other contexts. By way of example, if device 105 includes a relatively larger touchscreen 115 than in the illustrated implementation, then areas 255 can be arranged differently and/or span relatively smaller portions of touchscreen 115.
In some implementations, the display of additional identifiers and associated interactive widgets, and the concomitant removal of one or more of identifiers 205, 210, 215, 220, 225 and widgets 230, 235, 240, 245, 250, can be triggered by user interaction with one or more of input elements 120 and/or presentation 200. For example, in some implementations, presentation 200 can trigger scrolling navigation through a collection of contacts and contact information in response to touchscreen 115 identifying upward or downward movement of a finger or other element across presentation 200. As another example, in some implementations, presentation 200 can include additional interactive widgets that trigger, in response to user interaction, scrolling navigation through a collection of contacts and contact information.
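By way of illustration only, the following Kotlin sketch models the scrolling behavior described above: a vertical drag shifts the window of visible contact rows. The window arithmetic is an assumption, not a requirement of the specification.

    // Vertical drag across the presentation scrolls which rows are visible.
    class ContactList(private val all: List<String>, private val visibleCount: Int) {
        private var first = 0

        fun onVerticalDrag(rowsMoved: Int) {   // positive values reveal later rows
            first = (first + rowsMoved).coerceIn(0, maxOf(0, all.size - visibleCount))
        }

        fun visibleRows(): List<String> = all.subList(first, minOf(first + visibleCount, all.size))
    }

    fun main() {
        val list = ContactList(listOf("A", "B", "C", "D", "E", "F"), visibleCount = 3)
        list.onVerticalDrag(2)
        println(list.visibleRows())   // [C, D, E]
    }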
Presentation 200 can be displayed in accordance with one or more sets of machine-readable instructions that are performed by one or more data processing devices housed in housing 110 of device 105, as described further below. The instructions can cause device 105 to display presentation 200 at various points in a set of data processing activities. For example, the instructions can cause device 105 to display presentation 200 in response to user interaction with a widget that indicates that a user wishes to make a selection from a collection of contacts.
FIG. 3 is a schematic representation of the display of a presentation 300 on a portion of touchscreen 115 of device 105. Presentation 300 is displayed on touchscreen 115 in response to user interaction with interactive widget 235, which is associated with contact identifier 210. The user interaction with interactive widget 235 that triggers the display of presentation 300 can be, e.g., a single or a double click or tap.
In addition to the displayed features shared with presentation 200, presentation 300 also includes an action widget collection 305. Action widget collection 305 is a collection of interactive elements that, in response to user interaction, trigger data processing or other actions directed to the contact identified by the identifier which is associated with the interactive widget that triggers the display of the action widget collection. The user interaction that triggers the actions can be, e.g., a single or a double click or tap on a particular action widget in collection 305 or the "dragging and dropping" of the contact identified by the identifier which is associated with the interactive widget that triggers the display of action widget collection 305 onto a particular action widget in collection 305.
In the illustrated implementation, action widget collection 305 includes a contact display widget 310, a contact edit widget 315, a telephone contact widget 320, an e-mail contact widget 325, and a contact social network interaction widget 330. In the illustrated implementation, widgets 310, 315, 320, 325, 330 are iconic graphical indicia that represent the actions triggered by user interaction therewith, as described further below.
Contact display widget 310 is an interactive element that, in response to user interaction, triggers the display of additional information characterizing the contact identified by the identifier which is associated with the interactive widget that triggers the display of action widget collection 305. The additional information can include one or more of, e.g., the contact's title, image, phone number, electronic mail or other address, employer, moniker in a social network, or the like. In the illustrated implementation, contact display widget 310 is an iconic graphical indicium that resembles a portion of the person of an individual and represents that the display of additional information related to the contact's person is triggered by user interaction.
Contact edit widget 315 is an interactive element that, in response to user interaction, triggers the display of interactive elements for editing information characterizing the contact identified by the identifier which is associated with the interactive widget that triggers the display of action widget collection 305. Such editing can include changing existing contact information stored in device 105 and adding new contact information to the contact information stored in a data storage device of device 105. In some implementations, the interactive elements can respond to user interaction to add or change an identifier of the contact (including the respective of identifiers 205, 210, 215, 220, 225), the contact's title, the contact's phone number, the contact's electronic mail or other address, the contact's employer, the contact's moniker in a social network, or the like. In some implementations, the interactive elements can respond to user interaction to add or change an image, an icon, or another graphical representation of the contact. In the illustrated implementation, contact edit widget 315 is an iconic graphical indicium that resembles a writing instrument and represents that editing of information characterizing the contact is triggered by user interaction.
Telephone contact widget 320 is an interactive element that, in response to user interaction, triggers a telephone call to the contact identified by the identifier which is associated with the interactive widget that triggers the display of action widget collection 305. The telephone call can be, e.g., a plain old telephone service (POTS) call, a cellular phone call, a voice over Internet protocol (VoIP) call, or another call. The telephone call can be placed to a telephone number that is stored in association with the respective of identifiers 205, 210, 215, 220, 225 in a data storage device of device 105. In the illustrated implementation, telephone contact widget 320 is an iconic graphical indicium that resembles a telephone handset and represents that the placing of a telephone call is triggered by user interaction.
E-mail contact widget 325 is an interactive element that, in response to user interaction, triggers the transmission of an electronic mail message or the display of a presentation for authoring an electronic mail message to the contact identified by the identifier which is associated with the interactive widget that triggers the display of action widget collection 305. The electronic mail message can be transmitted to an electronic mail address that is stored in association with the respective of identifiers 205, 210, 215, 220, 225 in a data storage device of device 105. In the illustrated implementation, e-mail contact widget 325 is an iconic graphical indicium that resembles a letter envelope and represents that the transmission of an electronic mail message or the display of a presentation for authoring an electronic mail message is triggered by user interaction.
Contact social network interaction widget 330 is an interactive element that, in response to user interaction, triggers interaction, mediated by a social network, with the contact identified by the identifier which is associated with the interactive widget that triggers the display of action widget collection 305. The social network can be a symmetric or an asymmetric social network. The interaction can include, e.g., opening the profile page of the contact in the social network or transmitting a message to the contact using the capabilities provided by the social network. The social-network-mediated interaction can rely upon information characterizing the contact within the social network that is stored in association with the respective of identifiers 205, 210, 215, 220, 225 in a data storage device of device 105. In the illustrated implementation, contact social network interaction widget 330 is an iconic graphical indicium that resembles a net and represents that interaction mediated by a social network is triggered by user interaction.
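By way of illustration only, the following Kotlin sketch dispatches the contact-directed actions of collection 305. The Contact fields and the handler bodies are placeholders; they stand in for, rather than implement, the device's telephony, mail, and social network capabilities.

    data class Contact(val name: String, val phone: String?, val email: String?)

    enum class ContactAction { VIEW, EDIT, CALL, EMAIL, SOCIAL }   // widgets 310, 315, 320, 325, 330

    fun perform(action: ContactAction, contact: Contact) = when (action) {
        ContactAction.VIEW -> println("show details for ${contact.name}")
        ContactAction.EDIT -> println("open editor for ${contact.name}")
        ContactAction.CALL -> println("dial ${contact.phone ?: "<no number stored>"}")
        ContactAction.EMAIL -> println("compose mail to ${contact.email ?: "<no address stored>"}")
        ContactAction.SOCIAL -> println("open ${contact.name}'s social network profile")
    }

    fun main() = perform(ContactAction.CALL, Contact("Ada", phone = "555-0100", email = null))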
In the illustrated implementation, the action widgets in collection 305 are grouped together in an area 335 that appears to be overlaid upon other portions of presentation 200 that are not visible in presentation 300. In particular, area 335 appears to obscure at least a portion of the area 255 that includes information characterizing a contact that differs from the contact that is associated with the interactive widget 235 that triggers the display of action widget collection 305. As a result, at least a portion of identifier 215 of this different contact, and the associated interactive widget 240 and graphical indicia 265, are not visible in presentation 300 and appear to be obscured by the overlaid area 335.
The contact identifier 210 that is associated with the interactive widget 235 that triggers the display of action widget collection 305 is not obscured by action widget collection 305. In other words, contact identifier 210 and action widget collection 305 are both visible in presentation 300. In the illustrated implementation, all of the information characterizing the contact identified by contact identifier 210 remains visible notwithstanding the presentation of action widget collection 305 in presentation 300. Indeed, in the illustrated implementation, the area 255 that includes information characterizing the contact identified by contact identifier 210 remains visible in its entirety except for a relatively small incursion by a pointed indicium, as described further below.
In the illustrated implementation, area 335 is demarcated from other portions of presentation 300 by a border 340. In other implementations, area 335 can be demarcated from other portions of presentation 300 by color or shade, by empty expanses, or by other visual features that convey that widgets 310, 315, 320, 325, 330 commonly belong to collection 305.
In the illustrated implementation, border 340 of area 335 includes a pointed indicium 345 that is directed toward the area 255 that is associated with the interactive widget 235 that triggers the display of action widget collection 305. The directionality of pointed indicium 345 thus indicates that the actions triggered by user interaction with widgets 310, 315, 320, 325, 330 are directed to the contact that is associated with that same interactive widget. In the illustrated implementation, the upward-pointing directionality of indicium 345 toward the area 255 that includes identifier 210 allows a user to recognize that interaction with widgets 310, 315, 320, 325, 330 triggers actions directed to, respectively, viewing or editing the contact information of the contact identified by identifier 210, placing a telephone call to or e-mailing the contact identified by identifier 210, or interacting with the contact identified by identifier 210 via a social network. In the illustrated implementation, pointed indicium 345 extends outwardly from a relatively straighter portion of border 340 and extends across the border 260 that demarcates area 255.
In the illustrated implementation, widgets 310, 315, 320, 325, 330 in collection 305 are arranged adjacent one another to span an area 335 that is wider than it is tall. In the illustrated implementation, area 335 spans a majority of the width W of touchscreen 115. In this, the relative sizes of the height and width dimensions of area 335 follow the relative sizes of the height and width dimensions of areas 255. In particular, areas 255 are generally strip-shaped elements that span a majority of the width W of touchscreen 115. Area 335 is also a generally strip-shaped element that spans a majority of the width W of touchscreen 115. In the illustrated implementation, the height of the strip of area 335 (i.e., in the direction of height H of touchscreen 115) is smaller than the height of the strips of areas 255, although this is not necessarily the case. Indeed, in some implementations, the height of the strip of area 335 can be the same as or larger than the height of the strips of areas 255. Other layouts of area 335 are possible, e.g., in other contexts. By way of example, if device 105 includes a relatively larger touchscreen 115 than in the illustrated implementation, then area 335 can be arranged differently and/or span a relatively smaller portion of touchscreen 115.
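By way of illustration only, the following Kotlin sketch computes where the overlaid strip could be placed: directly below the triggering area 255 so that the row stays visible, with the pointed indicium aimed at the tapped interactive widget. The coordinate conventions are assumptions for illustration.

    data class Rect(val x: Int, val y: Int, val width: Int, val height: Int)

    data class Overlay(val strip: Rect, val pointerX: Int)

    // Place the strip under the tapped row and aim the pointer at the tapped widget.
    fun layoutOverlay(row: Rect, widgetCenterX: Int, screenWidth: Int, stripHeight: Int): Overlay {
        val strip = Rect(x = 0, y = row.y + row.height, width = screenWidth, height = stripHeight)
        val pointerX = widgetCenterX.coerceIn(strip.x, strip.x + strip.width)
        return Overlay(strip, pointerX)
    }

    fun main() {
        val row = Rect(x = 0, y = 120, width = 320, height = 60)   // area 255 of the tapped row
        println(layoutOverlay(row, widgetCenterX = 290, screenWidth = 320, stripHeight = 48))
    }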
In the illustrated implementation, widgets 310, 315, 320, 325, 330 in collection 305 are demarcated from one another by empty expanses. In other implementations, widgets 310, 315, 320, 325, 330 can be demarcated from one another by color or shade, by borders, or by other visual features that convey that widgets 310, 315, 320, 325, 330 differ from one another.
FIG. 4 is a schematic representation of the display of a presentation 400 on a portion of touchscreen 115 of device 105. Presentation 400 is displayed on touchscreen 115 in response to user interaction with one or more interactive elements. For example, in some implementations, presentation 400 can be displayed on touchscreen 115 in response to a user dragging a finger or other element across area 335 in presentation 300 (FIG. 3).
In addition to the displayed features shared with presentations 200, 300, presentation 400 also includes an action widget collection 405. The interactive elements in action widget collection 405 differ from the interactive elements in action widget collection 305. In particular, action widget collection 405 includes at least one interactive element that is not found in action widget collection 305 and excludes at least one interactive element that is found in action widget collection 305. For example, in the illustrated implementation, action widget collection 405 includes a trio of widgets 410, 415, 420 that are not found in action widget collection 305 and excludes contact display widget 310, contact edit widget 315, and telephone contact widget 320.
In transitioning between action widget collection 305 and action widget collection 405, widgets can appear to scroll into and out of area 335 in the direction that a finger or other element is dragged. For example, in the illustrated implementation, widgets 310, 315, 320 may have shifted to the left and been deleted from area 335 as widgets 410, 415, 420 shifted into area 335 from the right in response to a user dragging a finger or other element to the left across area 335 in presentation 300.
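By way of illustration only, the following Kotlin sketch models this drag-to-page behavior: a horizontal drag across the strip replaces the visible collection of widgets with the next one. The page contents, including the membership of collection 405, are assumptions for illustration.

    // Horizontal drag pages between collections of action widgets shown in the strip.
    class WidgetPager(private val pages: List<List<String>>) {
        private var page = 0

        fun onHorizontalDrag(toLeft: Boolean) {
            page = if (toLeft) minOf(page + 1, pages.size - 1) else maxOf(page - 1, 0)
        }

        fun visibleWidgets(): List<String> = pages[page]
    }

    fun main() {
        val pager = WidgetPager(listOf(
            listOf("view", "edit", "call", "email", "social"),           // collection 305
            listOf("email", "social", "text", "favorite", "delete")))    // collection 405 (assumed)
        pager.onHorizontalDrag(toLeft = true)
        println(pager.visibleWidgets())
    }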
Widgets 410, 415, 420 are interactive elements that, in response to user interaction, trigger data processing or other actions directed to the contact identified by the identifier which is associated with the interactive widget that triggers the display of action widget collection 305. For example, in the illustrated implementation, widget 410 is an interactive element that, in response to user interaction, triggers the display of a presentation for authoring a chat or other text message to the contact identified by the identifier which is associated with the interactive widget that triggers the display of action widget collection 305. The text message can be transmitted to an address that is stored in association with the respective of identifiers 205, 210, 215, 220, 225 in a data storage device of device 105. In the illustrated implementation, widget 410 is an iconic graphical indicium that resembles a bubble callout and represents that the display of a presentation for authoring a chat or other text message is triggered by user interaction.
As another example, in the illustrated implementation, widget 415 is an interactive element that, in response to user interaction, changes the contact identified by the identifier which is associated with the interactive widget that triggers the display of action widget collection 305 into a "favorite" contact. Favorite contacts are contacts who have been identified by a user of device 105 as contacts that will be treated differently from the other contacts stored in a data storage device of device 105. Favorite contacts are thus generally a proper subset of the stored contacts. Favorite contacts can be treated differently from other contacts in a variety of different ways. For example, in some implementations, incoming messages from favorite contacts are given priority over incoming messages from other, non-favorite contacts. For example, all postings to a social network by favorite contacts may be displayed by default, whereas postings by non-favorite contacts may be displayed only occasionally or only in response to an explicit request by the individual that they be displayed. As another example, in some implementations, favorite contacts are eligible to become selected followers of an individual in an asymmetric social network, whereas non-favorite contacts may not. As yet another example, in some implementations, favorite contacts may have unrestricted access to media files or other content posted to a media file sharing network or a member network by the individual who has designated the contact as a favorite. As yet another example, in some implementations, favorite contacts may have unrestricted access to information identifying an individual's current location. Information identifying a contact as a favorite contact can be stored in association with the contact information on device 105. In the illustrated implementation, widget 415 is an iconic graphical indicium that resembles a star with a plus sign and represents that the addition of the contact identified by the identifier to a collection of favorite contacts is triggered by user interaction.
As yet another example, in the illustrated implementation, widget 420 is an interactive element that, in response to user interaction, triggers the deletion of the contact identified by the identifier which is associated with the interactive widget that triggers the display of action widget collection 305. The deletion of a contact can include deleting the information characterizing the contact from a data storage device in device 105. In the illustrated implementation, widget 420 is an iconic graphical indicium that resembles the letter "X" and represents that the deletion of a contact is triggered by user interaction.
In the illustrated implementation, the action widgets in collection 405 are grouped together in the same area 335 that included collection 305 in presentation 300 (FIG. 3). Area 335 remains demarcated from other portions of presentation 400 by border 340, which includes pointed indicium 345 directed toward the area 255 that is associated with the interactive widget that triggered the display of action widget collection 305. Contact identifier 210 is not obscured by action widget collection 405; rather, contact identifier 210 and action widget collection 405 are both visible in presentation 400.
FIG. 5 is a schematic representation of the display of a presentation 500 on a portion of touchscreen 115 of device 105. Presentation 500 includes a collection of message records 505, 510, 515, 520 that each include information characterizing a message that has been received by device 105. The messages can be, e.g., electronic mail messages, chat or other text messages, messages posted over a member network, or the like. In the illustrated implementation, message records 505, 510, 515, 520 include information characterizing received messages. In other implementations, message records 505, 510, 515, 520 include information characterizing sent messages or a combination of sent and received messages.
Each message record 505, 510, 515, 520 is associated with a respective interactive widget 530, 535, 540, 545 by positioning or arrangement on presentation 500. Each interactive widget 530, 535, 540, 545 is an interactive element that, in response to user interaction, triggers the display of a collection of additional interactive elements. The additional interactive elements trigger the performance of additional data processing or other actions that are directed to the message characterized in the associated record, as described further below.
In the illustrated implementation, each message record 505, 510, 515, 520 is associated with a respective interactive widget 530, 535, 540, 545 by virtue of positioning adjacent to an area 555 that is dedicated to the display of information characterizing a single message. In particular, interactive widgets 530, 535, 540, 545 are positioned laterally adjacent to a counterparty identifier in the respective of message records 505, 510, 515, 520 (i.e., to the right in the illustrated implementation). In the illustrated implementation, areas 555 are demarcated from one another, and from the remainder of presentation 500, by borders 560. In other implementations, areas 555 can be demarcated using color, empty expanses, or other visual features. In other implementations, interactive widgets 530, 535, 540, 545 can be positioned within areas 555.
In the illustrated implementation, each message record 505, 510, 515, 520 includes a counterparty identifier 565, message text 570, and message transaction information 575. Counterparty identifiers 565 are the names or other information that identifies a counterparty to the message characterized by the respective of message records 505, 510, 515, 520. In the illustrated implementation, counterparty identifiers 565 are textual, but other identifiers such as graphical, iconic, or numeric identifiers can also be used.
Message text 570 is at least a portion of the textual content of the messages characterized by the respective of message records 505, 510, 515, 520. The textual content can include the body of the message or the subject line of the message. In the illustrated implementation, message records 505, 510, 515, 520 include information characterizing messages received over an asymmetric social network that limits the size of postings. As a result, message text 570 often includes the complete textual content of such postings.
Message transaction information 575 is textual or other indicia that characterize one or more transactional properties of the messages characterized by the respective of message records 505, 510, 515, 520. For example, message transaction information 575 can characterize the time when the message was sent, the location from where the message was sent, and the transaction history of the message. The transaction history can include, e.g., whether the message has been forwarded or is a reply to a previous message.
In the illustrated implementation, each message record 505, 510, 515, 520 also includes a graphical indicium 580 that characterizes the counterparty on the message characterized by the respective of message records 505, 510, 515, 520. Each graphical indicium 580 is a photograph, an icon, or another graphical representation of the counterparty on the characterized message. In some implementations, graphical indicia 580 are likenesses of or identical to the graphical indicia 265 that characterize contacts and that are displayed in presentations 200, 300, 400 (FIGS. 2, 3, 4), as shown. Graphical indicia 580 can be stored in one or more memory devices of device 105 in conjunction with contact information.
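By way of illustration only, the following Kotlin sketch gathers the fields a message record 505, 510, 515, 520 carries on screen; the field names and types are hypothetical.

    data class MessageRecord(
        val counterparty: String,        // counterparty identifier 565
        val text: String,                // message text 570 (possibly the full posting)
        val sentAt: String,              // part of message transaction information 575
        val sentFrom: String?,           // optional origin location, also transaction information
        val isReplyOrForward: Boolean,   // transaction history
        val indicium: String)            // graphical indicium 580, e.g., the counterparty's photo

    fun main() {
        val record = MessageRecord("Ada", "Lunch at noon?", "09:14", "Berlin", false, "ada.png")
        println("${record.counterparty}: ${record.text} (${record.sentAt})")
    }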
In other implementations, each message record 505, 510, 515, 520 can include additional information characterizing a message, such as indicia indicating whether a message has been read, indicia indicating whether the message has been labeled with a priority, an urgent, or another designator, and the like.
When device 105 is a portable, handheld device, each area 555 can occupy a majority of the width of touchscreen 115. Further, areas 555 are aligned with one another and arranged one above the other to span a majority of the height of touchscreen 115. In particular, counterparty identifiers 565, message text 570, message transaction information 575, graphical indicia 580, and widgets 530, 535, 540, 545 in different areas 555 are aligned with one another. Such an arrangement lists information characterizing the messages in a convenient format that is familiar to many individuals. Other layouts are possible, e.g., in other contexts. By way of example, if device 105 includes a relatively larger touchscreen 115, then areas 555 can be arranged differently and/or span relatively smaller portions of touchscreen 115.
In some implementations, the display of additional message records and the concomitant removal of one or more of message records 505, 510, 515, 520 can be triggered by user interaction with one or more of input elements 120 and/or presentation 500. For example, in some implementations, presentation 500 can trigger scrolling navigation through a collection of message information in response to touchscreen 115 identifying upward or downward movement of a finger or other element across presentation 500. As another example, in some implementations, presentation 500 can include additional interactive widgets that trigger, in response to user interaction, scrolling navigation through a collection of message information.
Presentation 500 can be displayed in accordance with one or more sets of machine-readable instructions that are performed by one or more data processing devices housed in housing 110 of device 105, as described further below. The instructions can cause device 105 to display presentation 500 at various points in a set of data processing activities. For example, the instructions can cause device 105 to display presentation 500 in response to user interaction with a widget that indicates that a user wishes to make a selection from a collection of electronic mail, chat or other text, or social network messages.
FIG. 6 is a schematic representation of the display of a presentation 600 on a portion of touchscreen 115 of device 105. Presentation 600 is displayed on touchscreen 115 in response to user interaction with interactive widget 530, which is associated with message record 505. The user interaction with interactive widget 530 that triggers the display of presentation 600 can be, e.g., a single or a double click or tap.
In addition to the displayed features shared with presentation 500, presentation 600 also includes an action widget collection 605. Action widget collection 605 is a collection of interactive elements that, in response to user interaction, trigger data processing or other actions directed to the message which is characterized in the message record associated with the interactive widget that triggers the display of the action widget collection. The user interaction that triggers the actions can be, e.g., a single or a double click or tap on a particular action widget in collection 605 or the "dragging and dropping" of the message record that is associated with the interactive widget that triggers the display of action widget collection 605 onto a particular action widget in collection 605.
In the illustrated implementation, action widget collection 605 includes a mark-as-favorite widget 610, a reply widget 615, a repost widget 620, a delete widget 625, and a locate-on-map widget 630. In the illustrated implementation, widgets 610, 615, 620, 625, 630 are iconic graphical indicia that represent the actions triggered by user interaction therewith, as described further below.
Mark-as-favorite widget 610 is an interactive element that, in response to user interaction, changes the message that is characterized in the message record associated with the interactive widget that triggers the display of action widget collection 605 into a "favorite" message. Favorite messages are messages that have been identified by a user of device 105 as messages that will be treated differently from the other messages stored in a data storage device of device 105. Favorite messages are thus generally a proper subset of the stored messages. Favorite messages can be treated differently from other messages in a variety of different ways. For example, in some implementations, favorite messages can be added to a user's profile page or other collection in a social network. For example, favorite messages can be posted or reposted to an asymmetric social network, as in activities 1710, 1725 (FIG. 17). As another example, in some implementations, favorite messages may be exempted from certain automated processes, such as automatic deletion of messages from a data storage device in device 105 or automatic removal of a message record from a presentation on touchscreen 115 as new, unread messages are received by device 105. Information identifying a message as a favorite message can be stored in association with the message information on device 105. In the illustrated implementation, mark-as-favorite widget 610 is an iconic graphical indicium that resembles a star and represents that the message that is characterized in the message record associated with the interactive widget that triggers the display of action widget collection 605 is to be marked as a favorite message in response to user interaction.
Reply widget 615 is an interactive element that, in response to user interaction, triggers the display of a presentation for authoring a reply message to the counterparty identified by the counterparty identifier 565 in the message record associated with the interactive widget that triggers the display of action widget collection 605. The reply message can be directed to the electronic address from which the message characterized in the message record originated. In the illustrated implementation, reply widget 615 is an iconic graphical indicium that resembles an arrow changing direction and represents that the display of a presentation for authoring a reply message is triggered by user interaction.
Repost widget 620 is an interactive element that, in response to user interaction, triggers the "reposting" of the message that is characterized in the message record associated with the interactive widget that triggers the display of action widget collection 605, either to the social network from which the message originated or to another social network. In the context of an asymmetric social network, reposting a message can include transmitting the message to followers of the user who interacts with device 105, as described further below. In the illustrated implementation, repost widget 620 is an iconic graphical indicium that resembles a pair of arrows, each changing direction to arrive at the other's tail, and represents that the reposting of the message is triggered by user interaction.
Delete widget 625 is an interactive element that, in response to user interaction, triggers the deletion of the message that is characterized in the message record associated with the interactive widget that triggers the display of action widget collection 605. The deletion of a message can include deleting the information characterizing the message from a data storage device in device 105. In the illustrated implementation, delete widget 625 is an iconic graphical indicium that resembles a trash can and represents that the deletion of a message is triggered by user interaction.
Locate-on-map widget 630 is an interactive element that, in response to user interaction, triggers the display of a map that includes an indicium identifying the location from where the message that is characterized in the message record associated with the interactive widget that triggers the display of action widget collection 605 was sent. In some implementations, presentation 600 can be removed from touchscreen 115 and replaced with such a map in response to user interaction with locate-on-map widget 630. In the illustrated implementation, locate-on-map widget 630 is a tear-drop-shaped iconic graphical indicium and represents that the display of such a map is triggered by user interaction.
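By way of illustration only, the following Kotlin sketch dispatches the message-directed actions of collection 605; the handler bodies are placeholders rather than real storage, network, or mapping calls.

    enum class MessageAction { MARK_FAVORITE, REPLY, REPOST, DELETE, LOCATE_ON_MAP }   // widgets 610-630

    fun perform(action: MessageAction, messageId: Long) = when (action) {
        MessageAction.MARK_FAVORITE -> println("flag message $messageId as a favorite")
        MessageAction.REPLY -> println("open a reply addressed to the sender of message $messageId")
        MessageAction.REPOST -> println("repost message $messageId, e.g., to the user's followers")
        MessageAction.DELETE -> println("delete message $messageId from local storage")
        MessageAction.LOCATE_ON_MAP -> println("show a map marking where message $messageId was sent")
    }

    fun main() = perform(MessageAction.REPOST, messageId = 42L)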
In the illustrated implementation, the action widgets in collection 605 are grouped together in an area 635 that appears to be overlaid upon other portions of presentation 500 that are not visible in presentation 600. In particular, area 635 appears to obscure at least a portion of the area 555 that includes information characterizing a different message. As a result, at least a portion of counterparty identifier 565, message text 570, message transaction information 575, and graphical indicia 580 of that message are not visible in presentation 600 and appear to be obscured by the overlaid area 635.
The counterparty identifier 565 that is in record 505, which itself is associated with the interactive widget 530 that triggers the display of action widget collection 605, is not obscured by action widget collection 605. In other words, this counterparty identifier 565 and action widget collection 605 are both visible in presentation 600. In the illustrated implementation, all of the message-characterizing information in record 505 remains visible notwithstanding the presentation of action widget collection 605 in presentation 600. Indeed, in the illustrated implementation, area 555 of message record 505 remains visible in its entirety except for a relatively small incursion by a pointed indicium, as described further below.
In the illustrated implementation,area635 is demarcated from other portions ofpresentation600 by anouter border640. In other implementations,area635 can be demarcated from other portions ofpresentation600 by color or shade, by empty expanses, or by other visual features that convey thatwidgets610,615,620,625,630 commonly belong tocollection605.
In the illustrated implementation, outer border 640 of area 635 includes a pointed indicium 645 that is directed toward the area 555 that is associated with the interactive widget 530 that triggers the display of action widget collection 605. The directionality of pointed indicium 645 thus indicates that the actions triggered by user interaction with widgets 610, 615, 620, 625, 630 are directed to the message that is characterized in the message record associated with that same interactive widget. In the illustrated implementation, pointed indicium 645 extends outwardly from a relatively straighter portion of border 640 and extends across border 560 that demarcates area 555.
In the illustrated implementation, widgets 610, 615, 620, 625, 630 in collection 605 are arranged adjacent one another to span an area 635 that is wider than it is tall. In the illustrated implementation, area 635 spans a majority of the width of touchscreen 115. In this, the relative sizes of the height and width dimensions of area 635 follow the relative sizes of the height and width dimensions of areas 555. In particular, areas 555 are generally strip-shaped elements that span a majority of the width W of touchscreen 115. Area 635 is also a generally strip-shaped element that spans a majority of the width W of touchscreen 115. In the illustrated implementation, the height of the strip of area 635 is smaller than the height of the strips of areas 555, although this is not necessarily the case. Indeed, in some implementations, the height of the strip of area 635 can be the same as or larger than the height of the strips of areas 555. Other layouts of area 635 are possible, e.g., in other contexts. By way of example, if device 105 includes a relatively larger touchscreen 115 than in the illustrated implementation, then area 635 can be arranged differently and/or span a relatively smaller portion of touchscreen 115.
In the illustrated implementation, widgets 610, 615, 620, 625, 630 in collection 605 are demarcated from one another by borders 650. In other implementations, widgets 610, 615, 620, 625, 630 can be demarcated from one another by color or shade, by empty expanses, or by other visual features that convey that widgets 610, 615, 620, 625, 630 differ from one another.
In some implementations, a different action widget collection that includes at least one interactive element that is not found in action widget collection 605 and excludes at least one interactive element that is found in action widget collection 605 can be presented on touchscreen 115 in response to user interaction with one or more interactive elements. For example, in some implementations, a different action widget collection can be displayed on touchscreen 115 in response to a user dragging a finger or other element across area 635 in presentation 600. In transitioning between action widget collection 605 and such a different action widget collection, widgets can appear to scroll into and out of area 635 in the direction that the finger or other element is dragged.
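By way of an illustrative, non-limiting sketch, the transition between alternative action widget collections in response to a horizontal drag can be modeled as shown below. The class name, method names, and widget labels are assumptions introduced only for this sketch and are not part of the illustrated implementation.

```python
# Hypothetical sketch: cycling between alternative action widget collections
# in response to a horizontal drag across the strip-shaped area.

class ActionWidgetStrip:
    def __init__(self, collections):
        # 'collections' is an ordered list of widget collections; any two
        # adjacent collections differ by at least one widget.
        self.collections = collections
        self.index = 0

    def visible_widgets(self):
        return self.collections[self.index]

    def on_drag(self, dx):
        # Dragging left reveals the next collection; dragging right reveals
        # the previous one. Widgets appear to scroll in the drag direction.
        if dx < 0 and self.index < len(self.collections) - 1:
            self.index += 1
        elif dx > 0 and self.index > 0:
            self.index -= 1
        return self.visible_widgets()

strip = ActionWidgetStrip([
    ["reply", "repost", "delete", "locate_on_map"],   # e.g., collection 605
    ["forward", "mark_favorite", "delete"],           # an alternative collection
])
print(strip.on_drag(dx=-40))  # ['forward', 'mark_favorite', 'delete']
```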
FIG. 7 is a schematic representation of the display of a presentation 700 on a portion of touchscreen 115 of device 105. Presentation 700 includes a collection of media records 705, 710, 715, 720 that each include information characterizing a media file, such as an image, music or video file. In the illustrated implementation, media records 705, 710, 715, 720 each include information characterizing an image. The characterized images can be, e.g., photographs, drawings, icons, or other graphical elements. The characterized media files can be stored on device 105 or available for download from a server that is accessible over the Internet. For example, the characterized media files can be available from social network server 170 or media server 180.
Each media record 705, 710, 715, 720 is associated with a respective interactive widget 725, 730, 735, 740 by positioning or arrangement on presentation 700. Each interactive widget 725, 730, 735, 740 is an interactive element that, in response to user interaction, triggers the display of a collection of additional interactive elements. The additional interactive elements trigger the performance of additional data processing or other actions that are directed to the media files characterized in the associated media record, as described further below.
In the illustrated implementation, each media record 705, 710, 715, 720 is associated with a respective interactive widget 725, 730, 735, 740 by virtue of common positioning within an area 755 that is dedicated to the display of information characterizing a single media file. Interactive widgets 725, 730, 735, 740 are positioned laterally adjacent to the respective media file identifiers in media records 705, 710, 715, 720 (i.e., to the right in the illustrated implementation). In the illustrated implementation, areas 755 are demarcated from one another, and from the remainder of presentation 700, by borders 760. In other implementations, areas 755 can be demarcated using color, empty expanses, or other visual features. In other implementations, interactive widgets 725, 730, 735, 740 can be positioned adjacent areas 755.
Each media record 705, 710, 715, 720 includes a media file identifier 770 that identifies the media file characterized in the respective media record 705, 710, 715, 720. In the illustrated implementation, media file identifiers 770 are likenesses of the characterized images. The likenesses can be thumbnail-sized reproductions of the characterized images or other graphical elements that resemble the characterized images. In other implementations, media file identifiers 770 can be a name of the media file or other textual or numeric identifier. In some implementations, each media record 705, 710, 715, 720 can include multiple media file identifiers such as, e.g., both a likeness and a textual or numeric identifier. In some implementations, each media record 705, 710, 715, 720 can also include additional information characterizing the media files, such as the names of individuals or other tags or captions associated with the media files. In some implementations, each media record 705, 710, 715, 720 can also include additional information characterizing transactional properties of the media file, such as when the media file was created or saved or from whence the media file originated.
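Purely for purposes of illustration, a media record of the kind described above can be sketched as a simple data structure such as the following; the field names and values are assumptions and are not part of the illustrated implementation.

```python
# Minimal sketch of a media record, assuming an in-memory representation.

from dataclasses import dataclass
from typing import Optional

@dataclass
class MediaRecord:
    identifier: str                       # likeness, file name, or other identifier
    thumbnail_path: Optional[str] = None  # thumbnail-sized likeness of the image, if any
    caption: Optional[str] = None         # tags or captions associated with the media file
    created: Optional[str] = None         # transactional property: when the file was created
    origin: Optional[str] = None          # transactional property: where the file originated

record = MediaRecord(identifier="IMG_0042.jpg",
                     thumbnail_path="/thumbs/img_0042.png",
                     caption="Beach sunset",
                     created="2009-08-14",
                     origin="media server")
```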
When device 105 is a portable, handheld device, each area 755 can occupy a majority of the width of touchscreen 115. Further, areas 755 are aligned with one another and arranged one above the other to span a majority of the height of touchscreen 115. In particular, media file identifiers 770 in different areas 755 are aligned with one another. Such an arrangement lists information characterizing the media files in a convenient format that is familiar to many individuals. Other layouts are possible, e.g., in other contexts. By way of example, if device 105 includes a relatively larger touchscreen 115, then areas 755 can be arranged differently and/or span relatively smaller portions of touchscreen 115.
In some implementations, the display of additional media records and the concomitant removal of one or more of media records 705, 710, 715, 720 can be triggered by user interaction with one or more of input elements 120 and/or presentation 700. For example, in some implementations, presentation 700 can trigger scrolling navigation through a collection of media files in response to touchscreen 115 identifying upward or downward movement of a finger or other element across presentation 700. As another example, in some implementations, presentation 700 can include additional interactive widgets that trigger, in response to user interaction, scrolling navigation through a collection of media files.
Presentation 700 can be displayed in accordance with one or more sets of machine-readable instructions that are performed by one or more data processing devices housed in housing 110 of device 105, as described further below. The instructions can cause device 105 to display presentation 700 at various points in a set of data processing activities. For example, the instructions can cause device 105 to display presentation 700 in response to user interaction with a widget that indicates that a user wishes to make a selection from a collection of media files.
FIG. 8 is a schematic representation of the display of a presentation 800 on a portion of touchscreen 115 of device 105. Presentation 800 is displayed on touchscreen 115 in response to user interaction with interactive widget 740 that is associated with media record 720. The user interaction with interactive widget 740 that triggers the display of presentation 800 can be, e.g., a single or a double click or tap.
In addition to the displayed features shared with presentation 700, presentation 800 also includes an action widget collection 805. Action widget collection 805 is a collection of interactive elements that, in response to user interaction, trigger data processing or other actions directed to the media file which is characterized in a media record associated with the interactive widget that triggers the display of the action widget collection. The user interaction that triggers the actions can be, e.g., a single or a double click or tap on a particular action widget in collection 805 or the “dragging and dropping” of the media record that is associated with the interactive widget that triggers the display of action widget collection 805 onto a particular action widget in collection 805.
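The two styles of user interaction that can trigger an action, tapping an action widget or dragging and dropping a record onto it, can be sketched as follows. This is an illustrative assumption about one possible dispatch structure, not a description of the implementation in the figures; the class and record format are invented for the sketch.

```python
# Illustrative dispatch of the two interaction styles described above.

class ActionWidget:
    def __init__(self, name, action):
        self.name = name      # e.g., "view", "delete"
        self.action = action  # callable that receives the targeted media record

    def on_tap(self, active_record):
        # A single or double tap applies the action to the record whose
        # interactive widget triggered the display of the collection.
        return self.action(active_record)

    def on_drop(self, dropped_record):
        # "Dragging and dropping" a record onto the widget applies the same
        # action to the dropped record instead.
        return self.action(dropped_record)

view_widget = ActionWidget("view", lambda record: f"displaying {record['identifier']}")
print(view_widget.on_tap({"identifier": "IMG_0042.jpg"}))
```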
In the illustrated implementation, action widget collection 805 includes a view widget 810, a caption edit widget 815, a delete widget 820, and an information widget 825.
View widget 810 is an interactive element that, in response to user interaction, triggers the display of the media file that is characterized in the media record associated with the interactive widget that triggers the display of action widget collection 805. In some implementations, presentation 800 can be removed from touchscreen 115 and replaced with the media file in response to user interaction with view widget 810. In the illustrated implementation, view widget 810 is a graphical indicium that resembles a pair of binoculars, and represents that the display of such a media file is triggered by user interaction.
Caption edit widget 815 is an interactive element that, in response to user interaction, triggers the display of interactive elements for editing the caption of the media file that is characterized in the media record associated with the interactive widget that triggers the display of action widget collection 805. Such editing can change a caption that is stored in device 105 or a caption stored at a server that is accessible over the Internet, such as social network server 170 or photo server 180. In the illustrated implementation, caption edit widget 815 is an iconic graphical indicium that resembles a writing instrument and represents that editing of a media file caption is triggered by user interaction.
Delete widget 820 is an interactive element that, in response to user interaction, triggers the deletion of the media file that is characterized in the media record associated with the interactive widget that triggers the display of action widget collection 805. The deletion of a media file can include deleting the media file and information characterizing the media file from a data storage device in device 105 or from a server that is accessible over the Internet, such as social network server 170 or photo server 180. In the illustrated implementation, delete widget 820 is an iconic graphical indicium that resembles the letter “X” and represents that the deletion of a media file is triggered by user interaction.
Information widget 825 is an interactive element that, in response to user interaction, triggers the display of additional information characterizing the media file that is characterized in the media record associated with the interactive widget that triggers the display of action widget collection 805. The additional information can include, e.g., a name of the media file or other textual or numeric identifier of the media file, the names of individuals or other tags or captions associated with the media file, information characterizing transactional properties of the media file (such as when the media file was created or saved or from whence the media file originated), or the like. The additional information can be drawn from a data storage device in device 105 or from a server that is accessible over the Internet, such as social network server 170 or photo server 180. In the illustrated implementation, information widget 825 is an iconic graphical indicium that resembles the letter “i” and represents that the display of information characterizing a media file is triggered by user interaction.
In the illustrated implementation, the action widgets in collection 805 are grouped together in an area 835 that appears to be overlaid upon other portions of presentation 700 that are not visible in presentation 800. In particular, area 835 appears to obscure at least a portion of the area 755 that includes information characterizing a different media file. As a result, at least a portion of media file identifier 770 in record 715 is not visible in presentation 800 and appears to be obscured by the overlaid area 835.
The media file identifier 770 that is in record 720, which itself is associated with the interactive widget 740 that triggers the display of action widget collection 805, is not obscured by action widget collection 805. In other words, this media file identifier 770 and action widget collection 805 are both visible in presentation 800. In the illustrated implementation, all of the media-file-characterizing information in record 720 remains visible notwithstanding the presentation of action widget collection 805 in presentation 800. Indeed, in the illustrated implementation, area 755 of media record 720 remains visible in its entirety except for a relatively small incursion by a pointed indicium, as described further below.
In the illustrated implementation, area 835 is demarcated from other portions of presentation 800 by an outer border 840. In other implementations, area 835 can be demarcated from other portions of presentation 800 by color or shade, by empty expanses, or by other visual features that convey that widgets 810, 815, 820, 825 commonly belong to collection 805.
In the illustrated implementation, outer border 840 of area 835 includes a pointed indicium 845 that is directed toward the area 755 that is associated with the interactive widget 740 that triggers the display of action widget collection 805. The directionality of pointed indicium 845 thus indicates that the actions triggered by user interaction with widgets 810, 815, 820, 825 are directed to the media file that is characterized in media record 720. In the illustrated implementation, pointed indicium 845 extends outwardly from a relatively straighter portion of border 840 and extends across border 760 that demarcates area 755.
In the illustrated implementation, widgets 810, 815, 820, 825 in collection 805 are arranged adjacent one another to span an area 835 that is wider than it is tall. In the illustrated implementation, area 835 spans a majority of the width of touchscreen 115. In this, the relative sizes of the height and width dimensions of area 835 follow the relative sizes of the height and width dimensions of areas 755. In particular, areas 755 are generally strip-shaped elements that span a majority of the width W of touchscreen 115. Area 835 is also a generally strip-shaped element that spans a majority of the width W of touchscreen 115. In the illustrated implementation, the height of the strip of area 835 is smaller than the height of the strips of areas 755, although this is not necessarily the case. Indeed, in some implementations, the height of the strip of area 835 can be the same as or larger than the height of the strips of areas 755. Other layouts of area 835 are possible, e.g., in other contexts. By way of example, if device 105 includes a relatively larger touchscreen 115 than in the illustrated implementation, then area 835 can be arranged differently and/or span a relatively smaller portion of touchscreen 115.
In the illustrated implementation, widgets 810, 815, 820, 825 in collection 805 are demarcated from one another by borders 850. In other implementations, widgets 810, 815, 820, 825 can be demarcated from one another by color or shade, by empty expanses, or by other visual features that convey that widgets 810, 815, 820, 825 differ from one another.
In some implementations, a different action widget collection that includes at least one interactive element that is not found in action widget collection 805 and excludes at least one interactive element that is found in action widget collection 805 can be presented on touchscreen 115 in response to user interaction with one or more interactive elements. For example, in some implementations, a different action widget collection can be displayed on touchscreen 115 in response to a user dragging a finger or other element across area 835 in presentation 800. In transitioning between action widget collection 805 and such a different action widget collection, widgets can appear to scroll into and out of area 835 in the direction that the finger or other element is dragged.
FIG. 9 is a schematic representation of the display of a presentation 900 on a portion of touchscreen 115 of device 105. Presentation 900 includes a collection of media records 902, 904, 906, 908, 912, 914, 916, 918 that each include information characterizing a media file. The characterized media files can be, e.g., photographs, drawings, icons, or other graphical elements. The characterized media files can be stored on device 105 or available for download from a server that is accessible over the Internet. For example, the characterized media files can be available from social network server 170 or media file server 180.
Each media record 902, 904, 906, 908, 912, 914, 916, 918 is associated with a respective interactive widget 922, 924, 926, 928, 932, 934, 936, 938 by positioning or arrangement on presentation 900. Each interactive widget 922, 924, 926, 928, 932, 934, 936, 938 is an interactive element that, in response to user interaction, triggers the display of a collection of additional interactive elements. The additional interactive elements trigger the performance of additional data processing or other actions that are directed to the media files characterized in the associated media record, as described further below.
In the illustrated implementation, each media record 902, 904, 906, 908, 912, 914, 916, 918 is associated with a respective interactive widget 922, 924, 926, 928, 932, 934, 936, 938 by virtue of common positioning within an area 955 that is dedicated to the display of information characterizing a single media file. Interactive widgets 922, 924, 926, 928, 932, 934, 936, 938 are positioned laterally adjacent to the respective media file identifiers in media records 902, 904, 906, 908, 912, 914, 916, 918 (i.e., to the right in the illustrated implementation). In the illustrated implementation, areas 955 are demarcated from one another, and from the remainder of presentation 900, by borders 960. In other implementations, areas 955 can be demarcated using color, empty expanses, or other visual features. In other implementations, interactive widgets 922, 924, 926, 928, 932, 934, 936, 938 can be positioned adjacent areas 955.
Each media record 902, 904, 906, 908, 912, 914, 916, 918 includes a media file identifier 970 that identifies the media file characterized in the respective media record 902, 904, 906, 908, 912, 914, 916, 918. In the illustrated implementation, media file identifiers 970 are likenesses of the characterized images. The likenesses can be thumbnail-sized reproductions of the characterized images or other graphical elements that resemble the characterized images. In other implementations, media file identifiers 970 can be a name of the media file or other textual or numeric identifier. In some implementations, each media record 902, 904, 906, 908, 912, 914, 916, 918 can include multiple media file identifiers such as, e.g., both a likeness and a textual or numeric identifier. In some implementations, each media record 902, 904, 906, 908, 912, 914, 916, 918 can also include additional information characterizing the media files, such as the names of individuals or other tags or captions associated with the media files. In some implementations, each media record 902, 904, 906, 908, 912, 914, 916, 918 can also include additional information characterizing transactional properties of the media file, such as when the media file was created or saved or from whence the media file originated.
When device 105 is a portable, handheld device, each area 955 can occupy approximately one half of the width of touchscreen 115. Such dimensioning is particularly convenient for images, which—absent editing—are generally dimensioned to have size ratios that facilitate such a presentation. Areas 955 are aligned with one another and arranged one above the other to span a majority of the height of touchscreen 115. In particular, media file identifiers 970 in different areas 955 are aligned with one another. Such an arrangement lists information characterizing the media files in a convenient format. Other layouts are possible, e.g., in other contexts. By way of example, if device 105 includes a relatively larger touchscreen 115, then areas 955 can be arranged differently and/or span relatively smaller portions of touchscreen 115.
In some implementations, the display of additional media records and the concomitant removal of one or more of media records 902, 904, 906, 908, 912, 914, 916, 918 can be triggered by user interaction with one or more of input elements 120 and/or presentation 900. For example, in some implementations, presentation 900 can trigger scrolling navigation through a collection of media files in response to touchscreen 115 identifying upward or downward movement of a finger or other element across presentation 900. As another example, in some implementations, presentation 900 can include additional interactive widgets that trigger, in response to user interaction, scrolling navigation through a collection of media files.
Presentation 900 can be displayed in accordance with one or more sets of machine-readable instructions that are performed by one or more data processing devices housed in housing 110 of device 105, as described further below. The instructions can cause device 105 to display presentation 900 at various points in a set of data processing activities. For example, the instructions can cause device 105 to display presentation 900 in response to user interaction with a widget that indicates that a user wishes to make a selection from a collection of media files.
FIG. 10 is a schematic representation of the display of a presentation 1000 on a portion of touchscreen 115 of device 105. Presentation 1000 is displayed on touchscreen 115 in response to user interaction with interactive widget 934 that is associated with media record 914. The user interaction with interactive widget 934 that triggers the display of presentation 1000 can be, e.g., a single or a double click or tap.
In addition to the displayed features shared with presentation 900, presentation 1000 also includes an action widget collection 1005. Action widget collection 1005 is a collection of interactive elements that, in response to user interaction, trigger data processing or other actions directed to the media file which is characterized in a media record associated with the interactive widget that triggers the display of the action widget collection. The user interaction that triggers the actions can be, e.g., a single or a double click or tap on a particular action widget in collection 1005 or the “dragging and dropping” of the media record that is associated with the interactive widget that triggers the display of action widget collection 1005 onto a particular action widget in collection 1005.
In the illustrated implementation, action widget collection 1005 includes view widget 810, caption edit widget 815, delete widget 820, and information widget 825, as described above (FIG. 8).
In the illustrated implementation, the action widgets in collection 1005 are grouped together in an area 1035 that appears to be overlaid upon other portions of presentation 900 that are not visible in presentation 1000. In particular, area 1035 appears to obscure at least a portion of two different areas 955 that each include information characterizing a different media file. As a result, at least a portion of media file identifiers 970 in records 906, 916 are not visible in presentation 1000 and appear to be obscured by the overlaid area 1035.
The media file identifier 970 that is in record 914, which itself is associated with the interactive widget 934 that triggers the display of action widget collection 1005, is not obscured by action widget collection 1005. In other words, this media file identifier 970 and action widget collection 1005 are both visible in presentation 1000. In the illustrated implementation, all of the media-file-characterizing information in record 914 remains visible notwithstanding the presentation of action widget collection 1005 in presentation 1000. Indeed, in the illustrated implementation, area 955 of media record 914 remains visible in its entirety except for a relatively small incursion by a pointed indicium, as described further below.
In the illustrated implementation, area 1035 is demarcated from other portions of presentation 1000 by an outer border 1040. In other implementations, area 1035 can be demarcated from other portions of presentation 1000 by color or shade, by empty expanses, or by other visual features that convey that widgets 810, 815, 820, 825 commonly belong to collection 1005.
In the illustrated implementation, outer border 1040 of area 1035 includes a pointed indicium 1045 that is directed toward the area 955 that is associated with the interactive widget 934 that triggers the display of action widget collection 1005. The directionality of pointed indicium 1045 thus indicates that the actions triggered by user interaction with widgets 810, 815, 820, 825 are directed to the media file that is characterized in media record 914. In the illustrated implementation, pointed indicium 1045 extends outwardly from a relatively straighter portion of border 1040 and extends across border 960 that demarcates the area 955 that is associated with interactive widget 934.
In the illustrated implementation, widgets 810, 815, 820, 825 in collection 1005 are arranged adjacent one another to span an area 1035 that is wider than it is tall. In the illustrated implementation, area 1035 is a generally strip-shaped element that spans a majority of the width W of touchscreen 115. In the illustrated implementation, the height of the strip of area 1035 is smaller than the height of the strips of areas 955, although this is not necessarily the case. Indeed, in some implementations, the height of the strip of area 1035 can be the same as or larger than the height of the strips of areas 955. Other layouts of area 1035 are possible, e.g., in other contexts. By way of example, if device 105 includes a relatively larger touchscreen 115 than in the illustrated implementation, then area 1035 can be arranged differently and/or span a relatively smaller portion of touchscreen 115.
In the illustrated implementation, widgets 810, 815, 820, 825 in collection 1005 are demarcated from one another by borders 1050. In other implementations, widgets 810, 815, 820, 825 can be demarcated from one another by color or shade, by empty expanses, or by other visual features that convey that widgets 810, 815, 820, 825 differ from one another.
In some implementations, a different action widget collection that includes at least one interactive element that is not found in action widget collection 1005 and excludes at least one interactive element that is found in action widget collection 1005 can be presented on touchscreen 115 in response to user interaction with one or more interactive elements. For example, in some implementations, a different action widget collection can be displayed on touchscreen 115 in response to a user dragging a finger or other element across area 1035 in presentation 1000. In transitioning between action widget collection 1005 and such a different action widget collection, widgets can appear to scroll into and out of area 1035 in the direction that the finger or other element is dragged.
FIG. 11 is a schematic representation of the display of a presentation 1100 of an electronic document 1102 on a portion of touchscreen 115 of device 105. An electronic document is a collection of machine-readable data. Electronic documents are generally individual files that are formatted in accordance with a defined format (e.g., HTML, MS Word, or the like). Electronic documents can be electronically stored and disseminated. In some cases, electronic documents include media content such as images, audio content, and video content, as well as text and links to other electronic documents. Electronic documents need not be individual files. Instead, an electronic document can be stored in a portion of a file that holds other documents or in multiple coordinated files.
Electronic document 1102 can be stored on device 105 or accessible over the Internet. For example, presentation 1100 can be formed by a web browser that has downloaded electronic document 1102 from a server that is accessible over the Internet.
Electronic document 1102 includes a document title 1105, a body of text 1110, and images 1115, 1120, 1125. Document title 1105 is a textual or other heading that identifies electronic document 1102. In some implementations, document title 1105 is a hyperlink that self-referentially refers to electronic document 1102 and acts as an interactive element that, in response to user interaction, triggers the display of a collection of additional interactive elements. The additional interactive elements trigger the performance of additional data processing or other actions that are directed to electronic document 1102, as described further below.
Body of text 1110 includes interactive elements 1130, 1135. Interactive elements 1130, 1135 are hyperlinks that refer to other electronic documents or to portions of other electronic documents. Interactive elements 1130, 1135 are generally formed from text that is integrated into text body 1110. In some implementations, interactive elements 1130, 1135 trigger the display of a collection of additional interactive elements in response to user interaction. The additional interactive elements trigger the performance of additional data processing or other actions that are directed to the electronic document (or portion thereof) that is referred to by interactive elements 1130, 1135, as described further below.
In some implementations, one or more of images 1115, 1120, 1125 are also interactive elements that, in response to user interaction, trigger the display of a collection of additional interactive elements. The additional interactive elements trigger the performance of additional data processing or other actions that are directed to the respective image 1115, 1120, 1125, as described further below.
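One way to organize which action widgets accompany each category of interactive element in an electronic document is a simple lookup, sketched below. The category names and widget lists are assumptions chosen to mirror the examples in FIGS. 12-14 and are not a definitive implementation.

```python
# Minimal sketch, assuming a lookup from the category of interactive element
# in the electronic document to the action widgets its activation displays.

WIDGETS_BY_ELEMENT_CATEGORY = {
    "document_title": ["open", "save", "share"],   # self-referential title hyperlink
    "text_hyperlink": ["open", "save", "share"],   # link integrated into the body text
    "inline_image":   ["view", "save", "share"],   # interactive image in the document
}

def widgets_for(element_category):
    # Return the widgets to display for the given element category, if any.
    return WIDGETS_BY_ELEMENT_CATEGORY.get(element_category, [])

print(widgets_for("inline_image"))  # ['view', 'save', 'share']
```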
FIG. 12 is a schematic representation of the display of a presentation 1200 on a portion of touchscreen 115 of device 105. Presentation 1200 is displayed on touchscreen 115 in response to user interaction with interactive element 1130 that is formed from text that is integrated into text body 1110 of electronic document 1102. The user interaction with interactive element 1130 that triggers the display of presentation 1200 can be, e.g., a single or a double click or tap.
In addition to the displayed features shared with presentation 1100, presentation 1200 also includes an action widget collection 1205. Action widget collection 1205 is a collection of interactive elements that, in response to user interaction, trigger data processing or other actions directed to the reference to the electronic document (or to a portion thereof) in the interactive element that triggers the display of the action widget collection. The user interaction that triggers the actions can be, e.g., a single or a double click or tap on a particular action widget in collection 1205 or the “dragging and dropping” of the reference to the electronic document that triggers the display of action widget collection 1205 onto a particular action widget in collection 1205.
In the illustrated implementation, action widget collection 1205 includes an open widget 1210, a save widget 1215, and a share widget 1220. Open widget 1210 is an interactive element that, in response to user interaction, triggers the opening of the electronic document (or portion thereof) that is referenced in the interactive element that triggers the display of action widget collection 1205. In the illustrated implementation, open widget 1210 is an iconic graphical indicium that resembles an opened can and represents that opening of an electronic document is triggered by user interaction.
Save widget 1215 is an interactive element that, in response to user interaction, triggers saving of the reference to the electronic document (or portion thereof) in the interactive element that triggers the display of action widget collection 1205. The reference can be saved, e.g., in a data storage device in device 105. In the illustrated implementation, save widget 1215 is an iconic graphical indicium that resembles a data storage disk and represents that storing of a reference to the electronic document is triggered by user interaction.
Share widget 1220 is an interactive element that, in response to user interaction, triggers the transmission of a message or the display of a presentation for authoring a message that includes the reference to the electronic document (or portion thereof) in the interactive element that triggers the display of action widget collection 1205. The message can be an electronic mail message, a chat or other text message, a post to a member network, or the like. The message can be transmitted to an address that is stored in a data storage device of device 105. In the illustrated implementation, share widget 1220 is an iconic graphical indicium that resembles a letter envelope and represents that the transmission of a message or the display of a presentation for authoring a message is triggered by user interaction.
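The share action described above can be sketched, under the assumption of a simple message structure, as composing a message that embeds the reference before handing it to a messaging client. The function name, field names, and addresses below are illustrative assumptions only.

```python
# Hedged sketch of the share action: compose a message carrying the reference.

def share_reference(reference_url, recipient_address, transport="email"):
    message = {
        "to": recipient_address,   # address drawn from a stored contact
        "body": "Thought you might like this: " + reference_url,
        "transport": transport,    # email, chat/text message, or network post
    }
    # A real implementation would hand this structure to a messaging client.
    return message

draft = share_reference("http://example.com/article", "friend@example.com")
print(draft["body"])
```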
In the illustrated implementation, the action widgets in collection 1205 are grouped together in an area 1235 that appears to be overlaid upon other portions of presentation 1100 that are not visible in presentation 1200. In particular, area 1235 appears to obscure at least a portion of body of text 1110 and image 1125. However, interactive element 1130 is not obscured by action widget collection 1205. Instead, interactive element 1130 is visible in presentation 1200.
In the illustrated implementation, area 1235 is demarcated from other portions of presentation 1200 by an outer border 1240. In other implementations, area 1235 can be demarcated from other portions of presentation 1200 by color or shade, by empty expanses, or by other visual features that convey that widgets 1210, 1215, 1220 commonly belong to collection 1205.
In the illustrated implementation, outer border 1240 of area 1235 includes a pointed indicium 1245 that is directed toward the interactive element 1130 that triggers the display of action widget collection 1205. The directionality of pointed indicium 1245 thus indicates that the actions triggered by user interaction with widgets 1210, 1215, 1220 are directed to the electronic document (or portion thereof) that is referenced by interactive element 1130. In the illustrated implementation, pointed indicium 1245 extends outwardly from a relatively straighter portion of border 1240.
In the illustrated implementation, widgets 1210, 1215, 1220 in collection 1205 are arranged adjacent one another to span an area 1235 that is wider than it is tall. In the illustrated implementation, area 1235 is a generally strip-shaped element that spans a majority of the width W of touchscreen 115. Other layouts of area 1235 are possible, e.g., in other contexts. By way of example, if device 105 includes a relatively larger touchscreen 115 than in the illustrated implementation, then area 1235 can be arranged differently and/or span a relatively smaller portion of touchscreen 115.
In the illustrated implementation, widgets 1210, 1215, 1220 in collection 1205 are demarcated from one another by borders 1250. In other implementations, widgets 1210, 1215, 1220 can be demarcated from one another by color or shade, by empty expanses, or by other visual features that convey that widgets 1210, 1215, 1220 differ from one another.
In some implementations, a different action widget collection that includes at least one interactive element that is not found in action widget collection 1205 and excludes at least one interactive element that is found in action widget collection 1205 can be presented on touchscreen 115 in response to user interaction with one or more interactive elements. For example, in some implementations, a different action widget collection can be displayed on touchscreen 115 in response to a user dragging a finger or other element across area 1235 in presentation 1200. In transitioning between action widget collection 1205 and such a different action widget collection, widgets can appear to scroll into and out of area 1235 in the direction that the finger or other element is dragged.
FIG. 13 is a schematic representation of the display of a presentation 1300 on a portion of touchscreen 115 of device 105. Presentation 1300 is displayed on touchscreen 115 in response to user interaction with document title 1105, which is an interactive element. The user interaction with document title 1105 that triggers the display of presentation 1300 can be, e.g., a single or a double click or tap.
In addition to the displayed features shared with presentation 1100, presentation 1300 also includes an action widget collection 1305. Action widget collection 1305 is a collection of interactive elements that, in response to user interaction, trigger data processing or other actions directed to electronic document 1102 referred to by document title 1105. The user interaction that triggers the actions can be, e.g., a single or a double click or tap on a particular action widget in collection 1305 or the “dragging and dropping” of document title 1105 onto a particular action widget in collection 1305.
In the illustrated implementation, action widget collection 1305 includes open widget 1210, save widget 1215, and share widget 1220. Widgets 1210, 1215, 1220 trigger the reopening of electronic document 1102, the saving of a reference to electronic document 1102, or the transmission of a message or the display of a presentation for authoring a message that includes a reference to electronic document 1102.
In the illustrated implementation, the action widgets in collection 1305 are grouped together in an area 1235 that appears to obscure at least a portion of body of text 1110, image 1115, and interactive element 1130. However, document title 1105 is not obscured by action widget collection 1305 but is instead visible in presentation 1300.
In the illustrated implementation, area 1235 is demarcated from other portions of presentation 1300 by an outer border 1240 that conveys that widgets 1210, 1215, 1220 commonly belong to collection 1305. Outer border 1240 of area 1235 includes a pointed indicium 1245 that is directed toward document title 1105. The directionality of pointed indicium 1245 thus indicates that the actions triggered by user interaction with widgets 1210, 1215, 1220 are directed to the electronic document (or portion thereof) that is referenced by document title 1105. Other features of action widget collection 1305 share the characteristics of correspondingly numbered features in action widget collection 1205 (FIG. 12).
FIG. 14 is a schematic representation of the display of a presentation 1400 on a portion of touchscreen 115 of device 105. Presentation 1400 is displayed on touchscreen 115 in response to user interaction with image 1120 of electronic document 1102. The user interaction with image 1120 that triggers the display of presentation 1400 can be, e.g., a single or a double click or tap.
In addition to the displayed features shared with presentation 1100, presentation 1400 also includes an action widget collection 1405. Action widget collection 1405 is a collection of interactive elements that, in response to user interaction, trigger data processing or other actions directed to image 1120. The user interaction that triggers the actions can be, e.g., a single or a double click or tap on a particular action widget in collection 1405 or the “dragging and dropping” of image 1120 onto a particular action widget in collection 1405.
In the illustrated implementation, action widget collection 1405 includes a view widget 1410, a save widget 1415, and a share widget 1420. View widget 1410 is an interactive element that, in response to user interaction, triggers the display of image 1120. In some implementations, presentation 1400 can be removed from touchscreen 115 and replaced with image 1120 in response to user interaction with view widget 1410. In the illustrated implementation, view widget 1410 is a graphical indicium that resembles a pair of binoculars, and represents that the display of an image is triggered by user interaction.
Save widget 1415 is an interactive element that, in response to user interaction, triggers saving of image 1120. The image can be saved, e.g., in a data storage device in device 105. In the illustrated implementation, save widget 1415 is an iconic graphical indicium that resembles a data storage disk and represents that storage of an image is triggered by user interaction.
Share widget 1420 is an interactive element that, in response to user interaction, triggers the transmission of a message or the display of a presentation for authoring a message that includes the image, or a reference to the image, that triggers the display of action widget collection 1405. The message can be an electronic mail message, a chat or other text message, a post to a member network, or the like. The message can be transmitted to an address that is stored in a data storage device of device 105. In the illustrated implementation, share widget 1420 is an iconic graphical indicium that resembles a letter envelope and represents that the transmission of a message or the display of a presentation for authoring a message is triggered by user interaction.
In the illustrated implementation, the action widgets in collection 1405 are grouped together in an area 1435 that appears to be overlaid upon other portions of presentation 1100 that are not visible in presentation 1400. In particular, area 1435 appears to obscure at least a portion of body of text 1110 and interactive element 1135. However, image 1120 is not obscured by action widget collection 1405. Instead, image 1120 is visible in presentation 1400.
In the illustrated implementation, area 1435 is demarcated from other portions of presentation 1400 by an outer border 1440. In other implementations, area 1435 can be demarcated from other portions of presentation 1400 by color or shade, by empty expanses, or by other visual features that convey that widgets 1410, 1415, 1420 commonly belong to collection 1405.
In the illustrated implementation, outer border 1440 of area 1435 includes a pointed indicium 1445 that is directed toward the image 1120 that triggers the display of action widget collection 1405. The directionality of pointed indicium 1445 thus indicates that the actions triggered by user interaction with widgets 1410, 1415, 1420 are directed to image 1120. In the illustrated implementation, pointed indicium 1445 extends outwardly from a relatively straighter portion of border 1440.
In the illustrated implementation, widgets 1410, 1415, 1420 in collection 1405 are arranged adjacent one another to span an area 1435 that is wider than it is tall. In the illustrated implementation, area 1435 is a generally strip-shaped element that spans a majority of the width W of touchscreen 115. Other layouts of area 1435 are possible, e.g., in other contexts. By way of example, if device 105 includes a relatively larger touchscreen 115 than in the illustrated implementation, then area 1435 can be arranged differently and/or span a relatively smaller portion of touchscreen 115.
In the illustrated implementation, widgets 1410, 1415, 1420 in collection 1405 are demarcated from one another by empty expanses. In other implementations, widgets 1410, 1415, 1420 can be demarcated from one another by color or shade, by borders, or by other visual features that convey that widgets 1410, 1415, 1420 differ from one another.
In some implementations, a different action widget collection that includes at least one interactive element that is not found in action widget collection 1405 and excludes at least one interactive element that is found in action widget collection 1405 can be presented on touchscreen 115 in response to user interaction with one or more interactive elements. For example, in some implementations, a different action widget collection can be displayed on touchscreen 115 in response to a user dragging a finger or other element across area 1435 in presentation 1400. In transitioning between action widget collection 1405 and such a different action widget collection, widgets can appear to scroll into and out of area 1435 in the direction that the finger or other element is dragged.
FIG. 15 is a schematic representation of a collection 1500 of electronic components. Collection 1500 can be housed in housing 110 of device 105 and includes both hardware and software components, as well as one or more data storage devices and one or more data processing devices that perform operations for displaying presentations on touchscreen 115 of device 105. For example, collection 1500 can display one or more of presentations 200, 300, 400, 500, 600, 700, 800, 900, 1000, 1100, 1200, 1300, 1400, 1900, 2000 (FIGS. 2-14, 19, 20) on touchscreen 115 of device 105.
Collection 1500 includes a display interface 1505, a phone interface 1510, an interface 1515 with a wireless transceiver, a collection of data stores 1525, 1530, and a data processing system 1535. Display interface 1505 is a component that interfaces between data processing system 1535 and touchscreen 115. Display interface 1505 can include hardware and/or software that provide a data communication path and define a data communication protocol for the transfer of display and user interaction information between data processing system 1535 and touchscreen 115. Display interface 1505 can include one or more of a graphic processing unit, a video display controller, a video display processor, or other display interface.
Phone interface 1510 is a component that interfaces between data processing system 1535 and a cellular or other phone. Phone interface 1510 can include hardware and/or software that provide a data communication path and define a data communication protocol for the transfer of information between data processing system 1535 and the phone.
Wireless interface 1515 is a component that interfaces between data processing system 1535 and a wireless transceiver. Wireless interface 1515 can include hardware and/or software that provide a data communication path and define a data communication protocol for the transfer of information between data processing system 1535 and the wireless transceiver.
Data stores 1525, 1530 are collections of machine-readable information stored at one or more data storage devices. Data store 1525 stores a collection of contact information, a message log, media files, or combinations thereof. The information stored in data store 1525 can be used to generate one or more of presentations 200, 300, 400, 500, 600, 700, 800, 900, 1000, 1100, 1200, 1300, 1400, 1900, 2000 (FIGS. 2-14, 19, 20). In this regard, among the contact information that can be stored at data store 1525 is information characterizing a contact's home and work telephone numbers, information characterizing a contact's home page or other contributions to a social networking site or a photosharing site, information characterizing one or more electronic mail, instant message, chat, or other messaging addresses of a contact, as well as other information such as postal address information, a photograph, and the like. In some implementations, data store 1525 can also include grouping information characterizing groups of contacts. Such a group of individuals can be specified by grouping information in data store 1525.
Among the message information that can be stored in a message log is information characterizing past electronic mail messages, chat or other text messages, social network postings, telephone calls, and/or other messages. Data store 1525 can include, e.g., information characterizing the counterparty in such messages, information characterizing the timing of the messages, information characterizing the content of the messages, information characterizing other transactional characteristics of the messages, and the like. In some implementations, data store 1525 only stores information describing a proper subset of all messages received by or sent from device 105. For example, in some implementations, data store 1525 only stores a group of the most recent messages except messages that have been marked as favorites, e.g., as described above.
Among the media file information that can be stored are the media files themselves, likenesses of the media files, references (such as a URI) to media files that are stored outside of device 105, transactional characteristics of the media files, and the like. In some implementations, data store 1525 can also include user preference information that specifies user preferences for the display of presentations such as presentations 200, 300, 400, 500, 600, 700, 800, 900, 1000, 1100, 1200, 1300, 1400, 1900, 2000 (FIGS. 2-14, 19, 20). For example, data store 1525 can include information identifying the media files, contacts, or messages that have been marked as favorites.
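For illustration only, the kinds of information described above for data store 1525 might be organized as in the following sketch; the schema, field names, and values are assumptions and not the storage format of the illustrated implementation.

```python
# Illustrative sketch of the categories of information a data store such as
# data store 1525 might hold: contacts, a message log, media files, preferences.

data_store_1525 = {
    "contacts": [
        {"name": "Apple", "home_phone": "555-0100", "work_phone": "555-0101",
         "email": "apple@example.com", "social_profile": "http://example.com/apple"},
    ],
    "message_log": [
        {"counterparty": "Orange", "timestamp": "2009-10-01T12:00:00",
         "content": "Lunch today?", "favorite": True},
    ],
    "media_files": [
        {"likeness": "/thumbs/img_0042.png", "uri": "http://photos.example.com/42",
         "created": "2009-08-14"},
    ],
    "preferences": {"favorites_first": True},
}
```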
Data store 1530 stores one or more sets of machine-readable instructions for displaying and interpreting user interaction with presentations such as presentations 200, 300, 400, 500, 600, 700, 800, 900, 1000, 1100, 1200, 1300, 1400, 1900, 2000 (FIGS. 2-14, 19, 20). Data store 1530 can include information identifying the interactive elements that are to be displayed in response to user interaction with different categories of interactive elements. For example, data store 1530 can include information identifying the widgets that are to be displayed in response to user interaction with interactive elements that are associated with contact identifiers (e.g., widgets 230, 235, 240, 245, 250), user interaction with interactive elements that are associated with message records (e.g., widgets 530, 535, 540, 545), user interaction with interactive elements that are associated with media records (e.g., widgets 725, 730, 735, 740, 922, 924, 926, 928, 932, 934, 936, 938), interactive elements that self-referentially refer to an electronic document in which the interactive elements are found (e.g., document title 1105), interactive elements in one electronic document that refer to another electronic document or to another portion of an electronic document (e.g., interactive elements 1130, 1135), and interactive media files (e.g., images 1115, 1120, 1125). In some implementations, such information can be organized as shown in FIG. 16 below.
In some implementations, data store 1530 can also include, e.g., iconic graphical indicia used for forming the interactive elements that are to be displayed in response to user interaction with different categories of interactive elements, instructions for forming contact, message, media file, or other records using information drawn from data store 1525, and instructions for interpreting user interaction with presentations 200, 300, 400, 500, 600, 700, 800, 900, 1000, 1100, 1200, 1300, 1400, 1900, 2000 (FIGS. 2-14, 19, 20) and implementing actions responsive to such user interaction, as described above.
Data processing system 1535 is a system of one or more digital data processing devices that perform operations in accordance with the logic of one or more sets of machine-readable instructions. Data processing system 1535 can implement one or more modules for performing operations described herein. Among the modules that can be implemented by data processing system 1535 are a user interface module 1540, a variety of different server interface modules 1545, and a data aggregation module 1550.
User interface module 1540 is a set of data processing activities that displays presentations such as presentations 200, 300, 400, 500, 600, 700, 800, 900, 1000, 1100, 1200, 1300, 1400, 1900, 2000 (FIGS. 2-14, 19, 20) on touchscreen 115, interprets user interaction with such presentations, and performs data processing and other actions triggered by such user interaction. The operations performed by user interface module 1540 can be performed in accordance with instructions in data store 1530.
Server interface modules 1545 are sets of data processing activities that interface with servers that are external to device 105, such as servers 165, 170, 175, 180 (FIG. 1). In general, each server interface module 1545 is dedicated to obtaining information suitable for display in a presentation from a different server. Server interface modules 1545 can be, e.g., electronic mail or message clients, as well as dedicated clients tailored to the characteristics of a specific social or photosharing network.
The server interface modules 1545 can obtain information for display by issuing service requests to a server and extracting the information from the responses to those requests. The requests and responses are communicated from device 105 to the relevant server over one or both of interfaces 1510, 1515. The information extracted from the responses to the service requests can include, e.g., incoming electronic mail and text messages, a name or other identifier of a counterparty, an excerpt or other content from a posting on a photosharing or social network site, a likeness of an image, a counterparty's location, transactional information regarding a message or a media file, and the like.
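A server interface module's request-and-extract behavior can be sketched as follows. The endpoint URL, response shape, and field names are assumptions made only for this sketch rather than the interface of any particular server.

```python
# Minimal sketch: issue a service request and extract display-ready fields.

import json
import urllib.request

def fetch_recent_posts(endpoint="http://example.com/api/posts"):
    # Issue the service request and parse the response body as JSON.
    with urllib.request.urlopen(endpoint) as response:
        payload = json.load(response)
    # Keep only what a presentation needs: counterparty, excerpt, timing.
    return [
        {"counterparty": item.get("author"),
         "excerpt": (item.get("text") or "")[:80],
         "timestamp": item.get("posted_at")}
        for item in payload.get("posts", [])
    ]
```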
Data aggregation module 1550 is a set of data processing activities that aggregates information drawn from data store 1525 and server interfaces 1545 for display of that information in presentations such as presentations 200, 300, 400, 500, 600, 700, 800, 900, 1000, 1100, 1200, 1300, 1400, 1900, 2000 (FIGS. 2-14, 19, 20). In some implementations, data aggregation module 1550 compares the names or other identifiers of counterparties on a message with names or other identifiers in the contact information in data store 1525 to, e.g., locate a graphical indicium such as graphical indicia 580 that characterizes the counterparty on the message for use in forming message records.
In general, data aggregation module 1550 includes rules for filtering messages or other items that are characterized in a presentation such as presentations 200, 300, 400, 500, 600, 700, 800, 900, 1000, 1100, 1200, 1300, 1400, 1900, 2000 (FIGS. 2-14, 19, 20).
The items that are characterized in a presentation can be limited in several different ways, including whether the items have been marked as favorites, whether the items involved a particular counterparty, and/or whether the items are found in a particular memory location, such as a particular file, directory, or location on a network. Data aggregation module 1550 can thus filter items to implement these and other limitations.
In some implementations, data aggregation module 1550 can also include extraction rules for extracting appropriate information for presentation from, e.g., electronic mail and other messages stored in data store 1525 and the responses to service requests received by server interfaces 1545. For example, data aggregation module 1550 can extract the subject line of electronic mail messages or a title of a posting on a photosharing or social network for display in a presentation such as presentations 200, 300, 400, 500, 600, 700, 800, 900, 1000, 1100, 1200, 1300, 1400, 1900, 2000 (FIGS. 2-14, 19, 20).
FIG. 16 is a schematic representation of a collection1600 of information identifying the interactive elements that are to be displayed in response to user interaction with different categories of interactive elements. Collection1600 can be stored indata store1530 of device105 (FIG. 15). In the illustrated implementation, collection1600 is implemented in a data table1605. Data table1605 organizes the interactive elements that are to be displayed in response to user interaction with different categories of interactive elements intorows1610,1612,1614,1615,1620,1625,1630,1635,1640,1645,1650,1655 andcolumns1660,1662,1664,1666,1668,1670,1672,1674,1676,1678,1680,1682. Eachrow1610,1612,1614,1615,1620,1625,1630,1635,1640,1645,1650,1655 is associated with a different category of interactive element that are to trigger the display of additional interactive elements. Eachcolumn1660,1662,1664,1666,1668,1670,1672,1674,1676,1678,1680,1682 includes data specifying whether a particular additional interactive element is to be displayed in response to user interaction with the category of interactive element associated with respective ofrows1610,1612,1614,1615,1620,1625,1630,1635,1640,1645,1650,1655.
For example, in the illustrated implementation, the data in columns 1660, 1662, 1664, 1666, 1668, 1670, 1672, 1674, 1676, 1678, 1680, 1682 specify that user interaction with an interactive element that is associated with a contact identifier (e.g., any of widgets 230, 235, 240, 245, 250) is to trigger the display of a view interactive element, a delete interactive element, an edit interactive element, a text interactive element, a phone interactive element, and an email interactive element.
As another example, in the illustrated implementation, the data in columns 1660, 1662, 1664, 1666, 1668, 1670, 1672, 1674, 1676, 1678, 1680, 1682 specify that user interaction with an interactive element that is associated with a media record (e.g., any of widgets 725, 730, 735, 740, 922, 924, 926, 928, 932, 934, 936, 938) is to trigger the display of a save interactive element, a favorite interactive element, a view interactive element, a delete interactive element, an edit interactive element, a post-to-social-network interactive element, and an information interactive element.
The interactive elements specified in columns 1660, 1662, 1664, 1666, 1668, 1670, 1672, 1674, 1676, 1678, 1680, 1682 need not be displayed in a single action widget collection but rather can be displayed in multiple action widget collections that are accessible, e.g., in response to a user dragging a finger or other element across areas 335, 635, 835, 1035, 1235, 1335 in presentations 300, 400, 600, 800, 1000, 1200, 1300, 1400.
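A data table along the lines of data table 1605 could, as a hypothetical and non-limiting sketch, be represented as a lookup from a category of interactive element to the additional interactive elements to be displayed. The category and widget names below are assumed for illustration; only the contact and media examples mirror the description above.

```python
# Hypothetical sketch corresponding to data table 1605: each key is a category
# of interactive element (a "row"), and each inner entry (a "column") records
# whether a particular additional interactive element is to be displayed.
ACTION_WIDGETS_BY_CATEGORY = {
    "contact": {"view": True, "delete": True, "edit": True,
                "text": True, "phone": True, "email": True},
    "media":   {"save": True, "favorite": True, "view": True, "delete": True,
                "edit": True, "post_to_social_network": True, "info": True},
}


def widgets_for(category):
    """Return the names of the action widgets to display for a category."""
    row = ACTION_WIDGETS_BY_CATEGORY.get(category, {})
    return [name for name, shown in row.items() if shown]


print(widgets_for("contact"))  # the widgets displayed for a contact identifier
```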
FIG. 17 is a schematic representation of an implementation of a collection of activities 1700 in an asymmetric social network. Activities 1700 occur in the context of a single level asymmetric social network in which a first member can become a follower of a second member without the second member necessarily becoming a follower of the first member. In the illustrated implementation, a first user “Apple” authors a post 1705 using a data processing device (e.g., any of devices 105, 140, 182, 190 (FIG. 1)). The data processing device can also receive input from the first user that triggers “posting” of post 1705. Post 1705 is accordingly transmitted at 1710 to social network server 1755 (e.g., server 170 (FIG. 1)), which receives the transmission, identifies the transmission as a posting by the first user, and identifies members who are related to the first member as followers in the network. Social network server 1755 then relays content from post 1705 to those followers at 1715. These followers can receive and review the transmitted content at one or more data processing devices (e.g., devices 105, 140, 182, 190 (FIG. 1)).
One of the followers, namely, second user “Orange,” may choose to reply to the content from post 1705 and author a reply post 1720 using a data processing device (e.g., devices 105, 140, 182, 190 (FIG. 1)). The data processing device can also receive input from the second user that triggers posting of reply post 1720. Reply post 1720 thus reposts at least some of the content from post 1705 to the asymmetric social network. Reply post 1720 is accordingly transmitted at 1725 to asymmetric social network server 1755, which receives the transmission, identifies the transmission as a reply posting by the second user, and identifies members who are related to the second member as followers in the network. Social network server 1755 also identifies the author of the post that is being replied to, namely, first user “Apple.” Social network server 1755 then relays content from reply post 1720 both to the followers of second user “Orange” at 1730 and to the author of post 1705 at 1735. The followers of second user “Orange” can receive and review the transmitted content from reply post 1720 at one or more data processing devices (e.g., devices 105, 140, 182, 190 (FIG. 1)). The author of post 1705 (i.e., first user “Apple”) can receive and review the transmitted content from reply post 1720 at one or more data processing devices (e.g., devices 105, 140, 182, 190 (FIG. 1)).
As a consequence of the asymmetry in the relationships between members, there is a directionality to the flow of posts in the illustrated asymmetric social network. In particular, posts tend to flow preferentially in the direction indicated by arrow 1740, i.e., from an author to that author's followers. In the illustrated example, there is an exception to this directionality, namely, the transmission of content from reply post 1720 to the author of post 1705 at 1735. Nevertheless, the preferred directionality is in the direction indicated by arrow 1740.
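As a purely illustrative, hypothetical sketch of the relaying described for the single level asymmetric network, the following Python function relays a post from an author to that author's followers and, in the case of a reply, additionally to the author of the replied-to post. The follower lists and names are assumed example data.

```python
def relay_post(post, author, followers_of, reply_to_author=None):
    """Hypothetical sketch of server-side relaying in a single level
    asymmetric social network: content flows from an author to the author's
    followers; a reply is also relayed to the author of the replied-to post."""
    recipients = set(followers_of.get(author, []))
    if reply_to_author is not None:
        recipients.add(reply_to_author)
    return {recipient: post for recipient in recipients}


# Usage sketch: "Orange" follows "Apple", so Apple's post reaches Orange;
# Orange's reply flows to Orange's followers and back to Apple.
followers = {"Apple": ["Orange"], "Orange": ["Plum"]}
print(relay_post("Hello!", "Apple", followers))
print(relay_post("Re: Hello!", "Orange", followers, reply_to_author="Apple"))
```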
FIG. 18 is a schematic representation of an implementation of a collection of activities 1800 in an asymmetric social network. Activities 1800 occur in the context of a multiple level asymmetric social network in which a first member can become either a “public follower” or a “selected follower” of a second member without the second member necessarily becoming a follower of the first member. A public follower is a member of the asymmetric social network who receives a proper subset of the posts (i.e., the public posts) authored by the followed member. A selected follower is a member of the asymmetric social network who generally receives all of the posts (i.e., both public and private posts) authored by the followed member. In some implementations, a selected follower relationship between two members is established by an invitation/response protocol that effectively requires the consent of both members to the selected follower relationship.
In the illustrated implementation, first user “Apple” authors a post 1805 using a data processing device (e.g., devices 105, 140, 182, 190 (FIG. 1)). In the course of authoring post 1805, first user “Apple” indicates whether post 1805 is a public or a private post, e.g., by interacting with an interactive element such as a widget that designates the post as a public or private post. Post 1805 includes information characterizing the indication.
In response to input from the first user that triggers the posting of post 1805, post 1805 is accordingly transmitted at 1815 to social network server 1755, which receives the transmission, identifies the transmission as a posting by the first user, and determines whether post 1805 is to be posted publicly or privately. In response to determining that post 1805 is to be posted publicly, server 1755 identifies both public and selected followers of first user “Apple” and relays content from post 1805 to those followers at 1820 and at 1825. Server 1755 also relays content from a post 1805 that is to be posted publicly to the public profile of first user “Apple” at 1830. A profile is a representation of an individual or a group of individuals on a member network. A profile generally includes details such as a name, a hometown, interests, pictures, and other information characterizing an individual or a group of individuals. A profile is public if other network members (or even the general public) do not require the consent of the represented individual or group in order to access the profile.
In response to determining that post 1805 is to be posted privately, server 1755 identifies selected followers of first user “Apple” and relays content from post 1805 to those followers at 1820. Private posts 1805 are not relayed to public followers of first user “Apple” or to the public profile of first user “Apple.” In either case, the followers to whom post 1805 is relayed can receive and review the transmitted content at one or more data processing devices (e.g., devices 105, 140, 182, 190 (FIG. 1)).
Activities 1800 can also be used in posting a reply post (not shown). In particular, the author of a reply post can indicate whether the reply post is to be publicly or privately posted. In some implementations, a reply to a private post may be forbidden, or information identifying the author of the replied-to post may be deleted from the reply.
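As a hypothetical, non-limiting sketch of the relay decision described for the multiple level asymmetric network, the following Python function routes a post to selected followers always, and to public followers and the author's public profile only when the post is designated public. The follower lists and names are assumed example data.

```python
def relay_multilevel(post, author, is_public, public_followers, selected_followers):
    """Hypothetical sketch of relaying in a multiple level asymmetric social
    network: public posts go to public and selected followers and to the
    author's public profile; private posts go only to selected followers."""
    recipients = set(selected_followers.get(author, []))
    if is_public:
        recipients |= set(public_followers.get(author, []))
    return {
        "post": post,
        "followers": sorted(recipients),
        "relay_to_public_profile": is_public,
    }


# Usage sketch with assumed follower lists:
public_f = {"Apple": ["Orange"]}
selected_f = {"Apple": ["Plum"]}
print(relay_multilevel("public update", "Apple", True, public_f, selected_f))
print(relay_multilevel("private note", "Apple", False, public_f, selected_f))
```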
FIG. 19 is a schematic representation of the display of a presentation 1900 on a portion of touchscreen 115 of device 105. Presentation 1900 is displayed on touchscreen 115 in response to user interaction with interactive widget 235 that is associated with contact identifier 210. The user interaction with interactive widget 235 that triggers the display of presentation 1900 can be, e.g., a single or a double click or tap.
As shown, presentation 1900 shares features with presentation 300, including action widget collection 305. However, the action widgets in collection 305 are grouped together in an area 1905 that appears to have displaced areas 255 which are below the contact that is associated with the interactive widget 235 that triggers the display of action widget collection 305. In particular, areas 255 that include identifiers 215, 220, 225 appear to have been shifted downward to accommodate area 1905. As a result, area 1905 does not appear overlaid upon and does not appear to obscure at least a portion of area 255 that includes information characterizing a contact that differs from the contact that is associated with the interactive widget 235 that triggers the display of action widget collection 305.
As a result of this apparent displacement of some of the areas 255, at least a portion of one or more areas 255 may no longer be visible on touchscreen 115 of device 105. In particular, in some implementations, touchscreen 115 may not be large enough to continue to display all areas 255 without resizing after shifting to accommodate area 1905. Such implementations are schematically illustrated in FIG. 19 by the area 255 which includes identifier 225 and interactive widget 250. In particular, this area is shown cut off, with a portion of this area outside the area of touchscreen 115 that displays presentation 1900.
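A hypothetical, non-limiting sketch of the layout computation behind such an apparent displacement is given below: a strip-shaped action area is inserted directly below the selected identifier area, the following areas are shifted downward, and any area pushed past the bottom edge of the touchscreen ends up cut off. The function name, data shapes, and pixel values are assumptions introduced for illustration only.

```python
def layout_with_action_strip(area_heights, selected_index, strip_height, screen_height):
    """Hypothetical sketch of the apparent displacement shown in FIG. 19."""
    y, layout = 0, []
    for index, height in enumerate(area_heights):
        layout.append({"kind": "identifier", "index": index, "y": y, "height": height})
        y += height
        if index == selected_index:
            # Insert the strip-shaped action area just below the selected
            # identifier; subsequent identifiers are shifted downward.
            layout.append({"kind": "action_strip", "y": y, "height": strip_height})
            y += strip_height
    for entry in layout:
        bottom = entry["y"] + entry["height"]
        entry["fully_visible"] = bottom <= screen_height
        entry["partly_visible"] = entry["y"] < screen_height
    return layout


# Usage sketch: four identifier strips of 80 px plus a 60 px action strip on
# a 320 px tall display; the last identifier strip is shown cut off.
for entry in layout_with_action_strip([80, 80, 80, 80], 0, 60, 320):
    print(entry)
```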
In other implementations, one or more areas 255 can be shifted upward to accommodate area 1905 so that the contact identifier that is associated with the interactive widget that triggers the display of action widget collection 305 is not obscured by action widget collection 305.
In the illustrated implementation, area 1905 is demarcated from other portions of presentation 1900 by a border 1910. In other implementations, area 1905 can be demarcated from other portions of presentation 1900 by color or shade, by empty expanses, or by other visual features that convey that widgets 310, 315, 320, 325, 330 commonly belong to collection 305. In the illustrated implementation, border 1910 of area 1905 includes a pointed indicium 345 that extends outwardly from a relatively straighter portion of border 1910 and extends across border 260 that demarcates area 255.
In the illustrated implementation, area 1905 is wider than it is tall. In the illustrated implementation, area 1905 spans a majority of the width of touchscreen 115. In this, the relative sizes of the height and width dimensions of area 1905 follow the relative sizes of the height and width dimensions of areas 255. In particular, areas 255 are generally strip-shaped elements that span a majority of the width W of touchscreen 115. Area 1905 is also a generally strip-shaped element that spans a majority of the width W of touchscreen 115. In the illustrated implementation, the height of the strip of area 1905 (i.e., in the direction of height H of touchscreen 115) is smaller than the height of the strips of areas 255, although this is not necessarily the case. Indeed, in some implementations, the height of the strip of area 1905 can be the same as or larger than the height of the strips of areas 255. Other layouts of area 1905 are possible, e.g., in other contexts. By way of example, if device 105 includes a relatively larger touchscreen 115 than in the illustrated implementation, then area 1905 can be arranged differently and/or span a relatively smaller portion of touchscreen 115.
Such an apparent displacement of identifiers and associated interactive elements can be used in other contexts. For example, rather than apparently overlaying area 635 on area 555 that includes information characterizing a different message as shown in presentation 600 (FIG. 6), one or more areas 555 can appear to have been shifted upward or downward to accommodate an area that includes action widget collection 605. As another example, rather than apparently overlaying area 835 on area 755 that includes information characterizing a different media file as shown in presentation 800 (FIG. 8), one or more areas 755 can appear to have been shifted upward or downward to accommodate an area that includes action widget collection 805. As another example, rather than apparently overlaying area 1035 on areas 955 that include information characterizing different media files as shown in presentation 1000 (FIG. 10), two or more areas 955 can appear to have been shifted upward or downward to accommodate an area that includes action widget collection 1005.
FIG. 20 is a schematic representation of the display of a presentation 2000 on a portion of touchscreen 115 of device 105. Presentation 2000 is displayed on touchscreen 115 in response to user interaction with e-mail contact widget 325 in action widget collection 305 that is itself displayed in response to user interaction with interactive widget 235. The user interaction with e-mail contact widget 325 that triggers the display of presentation 2000 can be, e.g., a single or a double click or tap.
In addition to the displayed features shared with presentation 300, presentation 2000 also includes an action disambiguation section 2005. Disambiguation section 2005 is a display area in presentation 2000 that includes interactive elements for resolving ambiguity as to the particular action that is to be triggered by user interaction with an interactive widget in action widget collection 305.
In the illustrated implementation, disambiguation section 2005 includes a pair of disambiguation widgets 2010, 2015 and a disambiguation save widget 2020. Disambiguation widgets 2010, 2015 are interactive elements that, in response to user interaction, resolve ambiguity as to the action that is to be performed on the identified contact. In the illustrated instance, disambiguation widgets 2010, 2015 disambiguate the action triggered by e-mail contact widget 325, namely, which electronic mail address of the contact is to be addressed in response to user interaction with e-mail contact widget 325. In other instances, disambiguation widgets 2010, 2015 can disambiguate other actions. For example, in some instances, the action triggered by telephone contact widget 320 (e.g., which telephone number of the contact is called), the action triggered by contact social network interaction widget 330 (e.g., which social network of the contact mediates the interaction), the action triggered by widget 410 (e.g., which chat or text message functionality or address is used), the action triggered by a save widget 1215, 1415 (e.g., where the image or document is to be saved), the action triggered by a share widget 1220, 1420 (e.g., how the image, a reference to the image, or a reference to the electronic document is to be shared), or another action can be disambiguated by disambiguation widgets 2010, 2015. Disambiguation widgets 2010, 2015 can thus be presented in one or more of areas 335, 635, 835, 1035, 1235, 1435, 1905.
In some implementations, the action which is disambiguated by disambiguation widgets 2010, 2015 is indicated by an indicium 2022 associated with a particular action widget in collection 305. In the illustrated implementation, indicium 2022 is a border 2022 that surrounds e-mail contact widget 325. In other implementations, indicium 2022 can be shading, coloring, or another visual feature that distinguishes e-mail contact widget 325 from the other widgets in action widget collection 305.
In the illustrated implementation, disambiguation widgets 2010, 2015 are each a textual presentation of a different electronic mail address of the contact. User interaction with one of disambiguation widgets 2010, 2015 triggers the transmission of an electronic mail message to that respective address or the display of a presentation for authoring an electronic mail message addressed to that respective address. The user interaction that triggers such a transmission or presentation can be, e.g., a single or a double click or tap on a respective one of disambiguation widgets 2010, 2015.
Disambiguation save widget 2020 is an interactive element that, in response to user interaction, saves the disambiguation provided by disambiguation widgets 2010, 2015. The saved disambiguation can be stored with other user preferences (e.g., in data store 1525) and used to disambiguate subsequent actions without additional user disambiguation. For example, the resolution of electronic mail address ambiguity by user interaction with disambiguation widgets 2010, 2015 can be saved, and subsequent electronic mail communications to the contact identified by identifier 210 can be addressed to the selected electronic mail address by default. In the illustrated implementation, disambiguation save widget 2020 resembles a check box that is associated with text 2025 that sets forth the consequences of user interaction with disambiguation save widget 2020.
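A hypothetical, non-limiting sketch of such disambiguation and of saving the user's choice as a default is given below; the function name, data shapes, and example addresses are assumptions introduced for illustration only.

```python
def resolve_email_address(contact, saved_defaults, chosen=None, save=False):
    """Hypothetical sketch of action disambiguation: when a contact has more
    than one e-mail address, use a previously saved default or the address
    chosen via a disambiguation widget; optionally save the choice as the
    default for subsequent messages (cf. the disambiguation save widget)."""
    addresses = contact.get("emails", [])
    if len(addresses) == 1:
        return addresses[0]  # no ambiguity to resolve
    default = saved_defaults.get(contact["name"])
    if chosen is None:
        return default  # None signals that the UI must ask the user
    if save:
        saved_defaults[contact["name"]] = chosen  # persist with user preferences
    return chosen


# Usage sketch with assumed data:
defaults = {}
apple = {"name": "Apple", "emails": ["apple@work.example", "apple@home.example"]}
print(resolve_email_address(apple, defaults))                                   # None: ask the user
print(resolve_email_address(apple, defaults, "apple@home.example", save=True))  # chosen and saved
print(resolve_email_address(apple, defaults))                                   # saved default used
```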
In the illustrated implementation, disambiguation section 2005 is displayed within area 335 that includes action widget collection 305 and that appears to be overlaid upon other portions of presentations 200, 300 that are not visible in presentation 2000. In particular, area 335 appears to obscure at least a portion of a pair of areas 255 that include information characterizing contacts that differ from the contact that is associated with the interactive widget 235 that triggers the display of action widget collection 305. In other implementations, identifiers and their associated interactive elements can be apparently displaced by area 335 (FIG. 19).
In the illustrated implementation, disambiguation section 2005 is displayed within border 340 that demarcates area 335 from the remainder of presentation 2000. In other implementations, area 335 can be demarcated from other portions of presentation 2000 by color or shade, by empty expanses, or by other visual features that convey that action widget collection 305 is associated with disambiguation section 2005. In the illustrated implementation, disambiguation section 2005 is positioned on the opposite side of action widget collection 305 from contact identifier 210 that is associated with the interactive widget 235 that triggers the display of action widget collection 305.
Embodiments of the subject matter and the operations described in this specification can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them. Embodiments of the subject matter described in this specification can be implemented as one or more computer programs, i.e., one or more modules of computer program instructions, encoded on computer storage medium for execution by, or to control the operation of, data processing apparatus.
Alternatively or in addition, the program instructions can be encoded on an artificially-generated propagated signal, e.g., a machine-generated electrical, optical, or electromagnetic signal, that is generated to encode information for transmission to suitable receiver apparatus for execution by a data processing apparatus. A computer storage medium can be, or be included in, a computer-readable storage device, a computer-readable storage substrate, a random or serial access memory array or device, or a combination of one or more of them. Moreover, while a computer storage medium is not a propagated signal, a computer storage medium can be a source or destination of computer program instructions encoded in an artificially-generated propagated signal. The computer storage medium can also be, or be included in, one or more separate physical components or media (e.g., multiple CDs, disks, or other storage devices).
The operations described in this specification can be implemented as operations performed by a data processing apparatus on data stored on one or more computer-readable storage devices or received from other sources.
The term “data processing apparatus” encompasses all kinds of apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, a system on a chip, or multiple ones, or combinations, of the foregoing. The apparatus can include special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit). The apparatus can also include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, a cross-platform runtime environment, a virtual machine, or a combination of one or more of them. The apparatus and execution environment can realize various different computing model infrastructures, such as web services, distributed computing and grid computing infrastructures.
A computer program (also known as a program, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, declarative or procedural languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, object, or other unit suitable for use in a computing environment. A computer program may, but need not, correspond to a file in a file system. A program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code). A computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
The processes and logic flows described in this specification can be performed by one or more programmable processors executing one or more computer programs to perform actions by operating on input data and generating output. The processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit).
Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read-only memory or a random access memory or both. The essential elements of a computer are a processor for performing actions in accordance with instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical disks, or optical disks. However, a computer need not have such devices. Moreover, a computer can be embedded in another device, e.g., a mobile telephone, a personal digital assistant (PDA), a mobile audio or video player, a tablet computer, a game console, a Global Positioning System (GPS) receiver, or a portable storage device (e.g., a universal serial bus (USB) flash drive), to name just a few. Devices suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
To provide for interaction with a user, embodiments of the subject matter described in this specification can be implemented on a computer having a display device, e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor, for displaying information to the user and a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input. In addition, a computer can interact with a user by sending documents to and receiving documents from a device that is used by the user; for example, by sending web pages to a web browser on a user's client device in response to requests received from the web browser.
While this specification contains many specific implementation details, these should not be construed as limitations on the scope of any inventions or of what may be claimed, but rather as descriptions of features specific to particular embodiments of particular inventions. Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.
Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.
Thus, particular implementations have been described. Other implementations are within the scope of the following claims. In some cases, the actions recited in the claims can be performed in a different order and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In certain implementations, multitasking and parallel processing may be advantageous.