FIELD OF THE INVENTION
The present technology relates to the field of multimedia content. More particularly, the present technology provides techniques for selecting content items and generating multimedia content.
BACKGROUND
Computing devices are popular and are often used to browse web sites, access online content, interact with social networks and/or social media, and perform a wide variety of tasks. Computing devices may allow users to create and upload content items to a social networking or social media service, where other users can comment on, like, and/or further share the content items.
When selecting content items for uploading, computing devices typically link users to a camera coupled to or integrated into the computing device, a local file system, or a remote file system connected to the computing device through a network connection. Facilitating links to cameras and local file systems allows a user the flexibility of uploading content items that were recently captured or are otherwise located on the user's computing device. Facilitating links to a remote file system allows the user to upload content items stored on networked devices, including cloud storage accounts and other servers accessible via networks. Systems that make it easier to upload and organize content items would be helpful.
SUMMARY
Various embodiments of the present disclosure can include systems, methods, and non-transitory computer readable media configured to highlight, by a computing system, a reference content item of a plurality of content items associated with a story in response to a selection of the reference content item, the plurality of content items having a first order. The reference content item may be reranked relative to the plurality of content items in response to user input to create a second order of the plurality of content items. The story may be published using the second order of the plurality of content items.
In some embodiments, the first order corresponds to an order the content items were uploaded for the story. Moreover, a content reordering screen may be generated in response to the selection of the reference content item in a story creation screen. The content reordering screen may display at least a portion of the plurality of content items in a vertical format optimized for viewing in a viewport of a mobile device.
In an embodiment, each of the plurality of content items comprises one or more of: digital images, digital audio, digital video, map data, hashtags, and social tags. The plurality of content items may be scrolled in a content reordering screen. The scrolling may occur at scrolling speeds based at least in part on a distance between a position of a cursor or touchpoint and an initial position of the reference content item.
In an embodiment, reranking the reference content item comprises: identifying an initial rank of the reference content item; identifying an insertion location between a first content item and a second content item into which the user input instructs insertion of the reference content item; and updating the rank of the reference content item based at least in part on the identified insertion location. A rank of at least a portion of the plurality of content items other than the reference content item may be updated.
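The reranking steps above can be sketched in ordinary code. The following is an illustrative sketch only, in which ranks are represented as list positions; the function name and list-based representation are assumptions for illustration and are not part of the claimed embodiments.

```python
def rerank(items, reference, insertion_index):
    """Rerank `reference` within `items`, where an item's rank is its list position.

    Step 1: identify the initial rank of the reference content item.
    Step 2: remove it and insert it at the identified insertion location.
    Step 3: the ranks of the other items are updated implicitly by their new positions.
    """
    initial_rank = items.index(reference)
    second_order = items[:initial_rank] + items[initial_rank + 1:]
    second_order.insert(insertion_index, reference)
    return second_order

# Moving item "c" to the front of a four-item story:
second_order = rerank(["a", "b", "c", "d"], "c", 0)
# second_order is now ["c", "a", "b", "d"]
```

Note that moving one item shifts the ranks of every item between its initial rank and the insertion location, which is why the disclosure also contemplates updating ranks of content items other than the reference content item.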
In an embodiment, highlighting the reference content item comprises shading the reference content item in a content reordering screen.
In an embodiment, the selection comprises a long press. The selection may comprise a double-click or a right-click. Further, publishing the story may comprise at least one of sharing the story, publishing the story in a feed, storing the story locally, or storing the story on a server.
In an embodiment, selection of the reference content item is associated with a first screen and reranking of the reference content item is associated with a second screen. The user input may be applied to a touchscreen display. The computer-implemented method may be implemented on a mobile device. Further, the computer-implemented method is implemented on an application associated with a social networking service or a social media service.
Many other features and embodiments of the invention will be apparent from the accompanying drawings and from the following detailed description.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 illustrates an example environment including a story publication system, according to an embodiment of the present disclosure.
FIG. 2 illustrates an example environment including a story publication user interface module, according to an embodiment of the present disclosure.
FIG. 3 illustrates an example environment including a user input processing module, according to an embodiment of the present disclosure.
FIG. 4 illustrates an example environment including a display mode rendering module, according to an embodiment of the present disclosure.
FIG. 5 illustrates an example environment including a story publication management module, according to an embodiment of the present disclosure.
FIG. 6 illustrates an example environment including a story content order modification module, according to an embodiment of the present disclosure.
FIG. 7 illustrates an example method for reordering content items in a story with a user interface, according to an embodiment of the present disclosure.
FIG. 8 illustrates an example method for reordering content items in a story with a user interface, according to an embodiment of the present disclosure.
FIG. 9 illustrates an example method for reordering content items in a story, according to an embodiment of the present disclosure.
FIG. 10 illustrates an example method for reordering content items in a story, according to an embodiment of the present disclosure.
FIG. 11 illustrates an example screen of a user interface of a system that facilitates reordering of content items, according to an embodiment of the present disclosure.
FIG. 12 illustrates an example screen of a user interface of a system that facilitates reordering of content items, according to an embodiment of the present disclosure.
FIG. 13 illustrates an example screen of a user interface of a system that facilitates reordering of content items, according to an embodiment of the present disclosure.
FIG. 14 illustrates an example screen of a user interface of a system that facilitates reordering of content items, according to an embodiment of the present disclosure.
FIG. 15 illustrates an example screen of a user interface of a system that facilitates reordering of content items, according to an embodiment of the present disclosure.
FIG. 16 illustrates an example screen of a user interface of a system that facilitates reordering of content items, according to an embodiment of the present disclosure.
FIG. 17 illustrates an example screen of a user interface of a system that facilitates reordering of content items, according to an embodiment of the present disclosure.
FIG. 18 illustrates an example screen of a user interface of a system that facilitates reordering of content items, according to an embodiment of the present disclosure.
FIG. 19 illustrates a network diagram of an example system that can be utilized in various scenarios and/or environments, according to an embodiment of the present disclosure.
FIG. 20 illustrates an example of a computer system that can be utilized in various scenarios and/or environments, according to an embodiment of the present disclosure.
The figures depict various embodiments of the disclosed technology for purposes of illustration only, wherein the figures use like reference numerals to identify like elements. One skilled in the art will readily recognize from the following discussion that alternative embodiments of the structures and methods illustrated in the figures can be employed without departing from the principles of the disclosed technology described herein.
DETAILED DESCRIPTION
Systems and Methods for Processing Orders of Content Items
Conventional techniques can store and display large numbers of photos. According to these techniques, photos may be uploaded and stored locally or remotely. These techniques are helpful to present photos to the persons who uploaded them and even to others who have been provided with shared access to the photos.
A large number of photos invites organization and reordering of the photos. Such reorganization and reordering can be performed according to user-selected categories and topics for more user-friendly and efficient access to the photos. Unfortunately, conventional techniques can be labor intensive (e.g., due to small screen sizes of mobile phones) and lack a streamlined capability to manage photos in this manner. As a result, related photos are often not optimally organized and ordered. When such photos are shared with others, their impact can be muted. Moreover, conventional techniques typically allow a user to select, upload, and publish photos to an online account, such as an account associated with a social networking service or a social media service. Unfortunately, once photos are chosen for uploading, the user is unable to reorder those photos until after the photos are actually published. Even systems that allow reordering of photos before publication may nonetheless limit the user's ability to effectively tell a story with the photos, particularly if the user is using a mobile phone or a tablet computing device.
FIG. 1 illustrates an example environment 100 including a story publication system 102, according to an embodiment of the present disclosure. The story publication system 102 includes a content datastore interface module 104, a story publication user interface module 106, a story publication management module 108, and a network interface module 110. One or more of the content datastore interface module 104, the story publication user interface module 106, the story publication management module 108, and the network interface module 110 may be coupled to one another or to components external to the story publication system 102 and not explicitly shown therein. It is noted that the components shown in this figure and all figures herein are exemplary only, and other implementations may include additional, fewer, or different components. Some components may not be shown so as not to obscure relevant details.
The story publication system 102 may create stories using content items captured on or otherwise accessible to a computing device. The stories may be shared with others, published in a feed associated with the user, stored locally on the computing device, or stored remotely on a server coupled to the computing device through a network connection. In some embodiments, the stories are published to the user or the user's account associated with a social networking service or a social media service. The content items in the stories may include any type of digital content, including but not limited to: digital images, digital audio, digital video, map data, hashtags, and social tags (e.g., user tags, facial recognition tags, location tags, activity tags, etc.). The content items in the stories may also include metadata associated with digital content, including metadata added by the user as well as metadata added by other users (e.g., social networking friends, etc.). In an embodiment, the computing device comprises a device having a touchscreen display; the touchscreen display may include a touchscreen user interface. The computing device may also comprise any computing device having an input device, such as a mouse or a trackpad.
The content datastore interface module 104 may be coupled to a content item datastore. The content datastore interface module 104 may also be coupled to a camera or a scanner of the computing device. The content datastore interface module 104 may instruct the camera to initiate content item capture. The content datastore interface module 104 may also be coupled to a file system of the computing device. For example, the content datastore interface module 104 may interface with memory and/or storage (e.g., Flash memory, internal memory cards, external memory cards, etc.) of the computing device to access files stored thereon. Moreover, the content datastore interface module 104 may be coupled to a server that stores content items. For example, the content datastore interface module 104 may interface with cloud-based storage or other networked storage associated with the user. As another example, the content datastore interface module 104 may interface with any networked server that the user may access through the Internet.
The story publication user interface module 106 may provide a user interface that allows a user to create a story and to associate content items with the story. The user interface may also allow the user to reorder the content items before the story is published.
In some embodiments, the story publication user interface module 106 configures an application to display a story creation screen and a content reordering screen. The story creation screen may allow the user to identify a story and create annotations (e.g., title, captions, tags, etc.) for the story. The story creation screen may further allow the user to select content items for the story. The story creation screen may display the selected content items in a manner similar to how the file system of the computing device displays the selected content items. In the story creation screen, content items may have a first order, which, in some embodiments, corresponds to the order the content items were chosen for the story. When in the story creation screen, the story publication user interface module 106 may receive a modified selection gesture (such as a long press gesture, a right-click, or a double-click) that selects a reference content item for reordering.
In response to the modified selection gesture, the story publication user interface module 106 may display the content reordering screen. The content reordering screen may show the reference content item visually emphasized (e.g., silhouetted, shaded, darkened, etc.) and other content items listed or otherwise arranged in a manner optimized for the viewport of the computing device. For example, in the content reordering screen, the reference content item may be expanded and may have an outline around it, while other content items in the content reordering screen may be shrunk and listed in a vertical arrangement. Further, in the content reordering screen, the content items may be adapted in size to cover most of the viewport of the computing device. In an embodiment, the content items in the content reordering screen may be of a different size than the content items in the story creation screen. More specifically, the content items in the content reordering screen may be rendered smaller than the content items in the story creation screen. The content items in the content reordering screen may or may not keep their aspect ratios and other properties. For example, in an embodiment, content items depicting panoramas may keep their aspect ratios and may be sized with a width similar to other non-panorama content items.
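One way to size content items, including panoramas, to a common list width while keeping their aspect ratios is to scale each item by width. A minimal sketch follows; the fixed width of 337 pixels is a hypothetical value chosen for illustration, not a value taken from the disclosure.

```python
ITEM_WIDTH = 337  # px; hypothetical shared item width in the content reordering screen

def sized_for_list(natural_width, natural_height):
    """Scale a content item to the shared list width, keeping its aspect ratio."""
    scale = ITEM_WIDTH / natural_width
    return ITEM_WIDTH, round(natural_height * scale)

# A 2:1 panorama is rendered at the shared width with its aspect ratio preserved:
width, height = sized_for_list(674, 200)
# width is 337, height is 100 (still 2:1)
```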
The content reordering screen may receive reordering gestures (e.g., horizontal slide gestures and/or vertical slide gestures) that allow the reference content item to be moved relative to the other content items. The other content items may move relative to the reference content item in order to accommodate a new position of the reference content item. The reordering gestures may cause scrolling of the content reordering screen. The story publication user interface module 106 may support multiple scroll speeds, as discussed further herein. The story publication user interface module 106 may return to a modified story creation screen that shows the content items reordered. The story publication user interface module 106 may also provide user interface elements that allow the story to be published. FIG. 2 shows the story publication user interface module 106 in greater detail.
The story publication management module 108 may manage backend processes associated with story creation and selection of content items for stories. The story publication management module 108 may also support the story publication user interface module 106 and may manage reordering processes used by the story publication user interface module 106. In some embodiments, the story publication management module 108 receives instructions to create a story and receives annotations for the story from the story publication user interface module 106. The story publication management module 108 may further receive instructions to select content items for the story. The story publication management module 108 may also receive instructions to reorder the content items based on instructions from the story publication user interface module 106. The story publication management module 108 may publish a story based on an instruction from the story publication user interface module 106. In an embodiment, the story publication management module 108 implements Application Programming Interfaces (APIs) or function calls that publish a story to a social networking system or a social media system. FIG. 5 shows the story publication management module 108 in greater detail.
The network interface module 110 may couple the story publication system 102 to a computer network. In some embodiments, the network interface module 110 allows the content datastore interface module 104 to access remote content datastores. Further, the network interface module 110 may allow the story publication management module 108 to transfer stories and/or content items to other devices. The network interface module 110 may allow the story publication management module 108 to publish a story.
FIG. 2 illustrates an example environment 200 including the story publication user interface module 106, according to an embodiment of the present disclosure. The story publication user interface module 106 includes a user input processing module 202, a display mode rendering module 204, and a display configuration module 206. The user input processing module 202, the display mode rendering module 204, and the display configuration module 206 may be coupled to one another and/or to components external to the story publication user interface module 106 and not explicitly shown therein. The user input processing module 202, the display mode rendering module 204, and the display configuration module 206 may be associated with a computing device having a display and a user input device, as discussed herein. In various embodiments, the display is a touchscreen display, such as a touchscreen display incorporated into a mobile phone or a tablet computing device. Such a display may support the user input device by receiving gestures. In some embodiments, a user input device comprises a mouse or a trackpad.
The user input processing module 202 may receive user input relating to stories and/or content items. The user input processing module 202 may be coupled to the display of the computing device. In embodiments in which the display comprises a touchscreen display, the user input processing module 202 may receive one or more gestures, including: gestures related to a location of a user's finger on the touchscreen display, gestures related to whether the user has tapped a specific area of the touchscreen display, gestures related to whether the user has held down (e.g., long pressed) a specific area of the touchscreen display, and gestures related to whether the user has provided horizontal or vertical movements (or movements having horizontal components or vertical components) on the touchscreen display. In embodiments in which the user input device comprises a mouse or a trackpad, the user input processing module 202 may receive information from the input device, including: the location of a cursor associated with the input device, whether the input device has selected an area of a display, whether the input device has right-clicked and/or double-clicked an area of the display, and whether the user has attempted to move items horizontally or vertically across the display.
As used herein, “vertical” and “horizontal” may refer to absolute, relative, or approximate directions. In some embodiments, “vertical” movements and gestures by a user may include substantially vertical or non-horizontal action by the user. In some embodiments, “vertical” may refer to a direction along a longitudinal or latitudinal axis of a display of a computing device receiving user input. Likewise, as used herein, in some embodiments, “horizontal” movements and gestures by the user may include substantially horizontal or non-vertical action by the user. In some embodiments, “horizontal” may refer to a direction along a longitudinal or latitudinal axis of a display of a computing device receiving user input. In some embodiments, a movement or gesture that is not precisely vertical or horizontal may be deconstructed to determine its vertical component and horizontal component.
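The deconstruction of a movement into its vertical and horizontal components, as described above, can be sketched as follows. The function names are illustrative assumptions; the dominant-direction tie-breaking rule (horizontal wins on a tie) is likewise an assumption for illustration.

```python
def decompose(start, end):
    """Split a drag from `start` to `end` ((x, y) pairs) into its components."""
    dx = end[0] - start[0]  # horizontal component
    dy = end[1] - start[1]  # vertical component
    return dx, dy

def dominant_direction(start, end):
    """Classify a movement as substantially horizontal or substantially vertical."""
    dx, dy = decompose(start, end)
    return "horizontal" if abs(dx) >= abs(dy) else "vertical"

# A mostly-rightward drag is treated as horizontal:
direction = dominant_direction((0, 0), (10, 3))
# direction is "horizontal"
```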
In some embodiments, the user input processing module 202 may process user input related to the creation of stories, annotations related to new stories, content items selected for stories, and publication of stories. The user input processing module 202 may also process user input related to reordering content items within a story, such as specific gestures and specific actions taken by a mouse or trackpad, relating to those content items. The user input processing module 202 may provide user input to the story publication management module 108. FIG. 3 shows the user input processing module 202 in greater detail.
The display mode rendering module 204 may instruct the display configuration module 206 to show one of several display views. In some embodiments, the display mode rendering module 204 instructs the display configuration module 206 to show a story creation screen that allows a user to create a story, select content for the story, and ultimately publish the story. The display mode rendering module 204 may also instruct the display configuration module 206 to show a content reordering screen in which a user can reorder content items based on user input received by the user input processing module 202. The display mode rendering module 204 may base a determination to activate a particular display mode on information from the story publication management module 108. FIG. 4 shows the display mode rendering module 204 in further detail.
The display configuration module 206 may configure the display of the computing device to show information relevant to creating stories, selecting content items for stories, and publishing stories. The display configuration module 206 may be coupled to the display mode rendering module 204. The display configuration module 206 may configure the display to allow a user to: create a story; enter annotations for the story; select content items for the story; and reorder content items selected for the story. The display configuration module 206 may also display a story creation screen and/or a content reordering screen to facilitate story creation based on instructions from the display mode rendering module 204. The views and/or the orders of content items may be provided by the story publication management module 108.
FIG. 3 illustrates an example environment 300 including the user input processing module 202, according to an embodiment of the present disclosure. The user input processing module 202 includes a position recognition module 302, a selection recognition module 304, a modified selection recognition module 306, a horizontal motion recognition module 308, and a vertical motion recognition module 310. One or more of the position recognition module 302, the selection recognition module 304, the modified selection recognition module 306, the horizontal motion recognition module 308, and the vertical motion recognition module 310 may be coupled to one another and/or to components external to the user input processing module 202 and not explicitly shown therein.
The position recognition module 302 may recognize the position of a relevant area on a display. In a touchscreen embodiment, the position recognition module 302 recognizes an area of a touchscreen display the user is touching. In various embodiments, the position recognition module 302 recognizes the position of the cursor on the display associated with an input device.
The selection recognition module 304 may be configured to recognize a selection of a content item or of an area of the display. In a touchscreen embodiment, the selection recognition module 304 recognizes a tap gesture corresponding to an area on the display. For example, the selection recognition module 304 may recognize a user's tapping of a content item. In various embodiments, the selection recognition module 304 recognizes a mouse or trackpad click of a content item or other area of the display. In some embodiments, the selection recognition module 304 provides other modules, such as the story publication management module 108, with an identifier (e.g., a Universally Unique Identifier (UUID), a name, etc.) of a content item that has been selected.
The modified selection recognition module 306 may be configured to recognize a modified selection of a content item or other area of the display. In a touchscreen embodiment, the modified selection recognition module 306 recognizes a long press (e.g., a tap on the screen that exceeds a predetermined length of time) gesture corresponding to the user's tapping of an area on the display. In various embodiments, the modified selection recognition module 306 recognizes a double-click or a right-click from a mouse or trackpad related to an area of the display. In various embodiments, the modified selection recognition module 306 provides other modules, such as the story publication management module 108, with an identifier (e.g., a Universally Unique Identifier (UUID), a name, etc.) of a content item that is the subject of a modified selection.
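On a touchscreen, distinguishing a selection from a modified selection can reduce to comparing the press duration against a predetermined length of time, as described above. A minimal sketch follows; the 0.5-second threshold is a hypothetical value, not one specified by the disclosure.

```python
LONG_PRESS_THRESHOLD = 0.5  # seconds; hypothetical predetermined length of time

def classify_press(press_duration):
    """Return 'modified_selection' for a long press, 'selection' for a tap."""
    if press_duration >= LONG_PRESS_THRESHOLD:
        return "modified_selection"  # e.g., triggers the content reordering screen
    return "selection"
```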
The horizontal motion recognition module 308 may be configured to recognize horizontal motion taken with respect to a content item or other area of the display. In a touchscreen embodiment, the horizontal motion recognition module 308 recognizes when a content item or other area of the display is being moved left or right. The horizontal motion recognition module 308 may also recognize horizontal swipes. In various embodiments, the horizontal motion recognition module 308 may recognize when a content item or other area of the display is being dragged right or left pursuant to instructions from a mouse or a trackpad.
The vertical motion recognition module 310 may be configured to recognize vertical motion taken with respect to a content item or other area of the display. In a touchscreen embodiment, the vertical motion recognition module 310 recognizes when a content item or other area of the display is being moved up or down. The vertical motion recognition module 310 may also recognize vertical swipes. In various embodiments, the vertical motion recognition module 310 may recognize when a content item or other area of the display is being dragged up or down pursuant to instructions from a mouse or a trackpad.
FIG. 4 illustrates an example environment 400 including the display mode rendering module 204, according to an embodiment of the present disclosure. The display mode rendering module 204 may include a story creation screen rendering module 402, a content reordering screen rendering module 404, a reference content item rendering module 406, a content reordering screen scrolling module 408, and a content item insertion rendering module 410. One or more of the story creation screen rendering module 402, the content reordering screen rendering module 404, the reference content item rendering module 406, the content reordering screen scrolling module 408, and the content item insertion rendering module 410 may be coupled to one another and/or to components external to the display mode rendering module 204 and not explicitly shown therein.
The story creation screen rendering module 402 may instruct the display to render a story creation screen. The story creation screen may allow a user to create a story, add annotations to the story, add content items to the story, and publish the story. In some embodiments, the story creation screen rendered by the story creation screen rendering module 402 may receive user input. For example, the story creation screen may include portions that can recognize positions of relevant areas, selections of content items or relevant areas, and modified selections of content items or relevant areas. In an embodiment, the story creation screen may correspond to a screen of a social networking application that asks users to create a story with content items.
The content reordering screen rendering module 404 may instruct the display to render a content reordering screen. The content reordering screen may display content items that were selected for a story in a format that is optimized for viewing on the display. For instance, in embodiments where the display comprises a touchscreen display of a mobile phone or a tablet computing device, the content reordering screen displays content items in a vertical format that allows a user to preview a plurality of content items. The content items may cover a substantial area of the viewport of the display (e.g., they may cover ninety percent of the display), and may each be separated by a fixed distance or a fixed number of pixels. The content reordering screen rendering module 404 may receive user input (e.g., touch positions, selections, modified selections, horizontal motions, vertical motions, etc.) with respect to the content items.
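The vertical format described above, with items covering roughly ninety percent of the viewport and separated by a fixed number of pixels, can be sketched as a simple layout computation. The viewport width and spacing constants below are hypothetical values for illustration.

```python
VIEWPORT_WIDTH = 375  # px; hypothetical mobile viewport width
COVERAGE = 0.9        # items cover roughly ninety percent of the viewport width
ITEM_SPACING = 8      # px; fixed separation between adjacent items

def layout_items(item_heights):
    """Return an (x, y, width, height) frame for each item in a vertical list."""
    width = int(VIEWPORT_WIDTH * COVERAGE)
    x = (VIEWPORT_WIDTH - width) // 2  # center the column horizontally
    frames, y = [], 0
    for height in item_heights:
        frames.append((x, y, width, height))
        y += height + ITEM_SPACING  # next item starts below the fixed gap
    return frames
```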
The reference content item rendering module 406 may instruct the display to render a reference content item in the content reordering screen. In various embodiments, the reference content item rendering module 406 instructs the display to visually emphasize the reference content item by providing a line around the reference content item and highlighting the interior portions of the reference content item. In some embodiments, the reference content item rendering module 406 increases the size of the reference content item relative to the other content items in the content reordering screen. In some embodiments, the reference content item rendering module 406 instructs the display to render the reference content item to the side of other content items so that it appears the order of the reference content item is being changed relative to the other content items in the content reordering screen. In some embodiments, the reference content item rendering module 406 may receive user input with respect to the reference content item. For example, the reference content item rendering module 406 may receive horizontal and/or vertical motions with respect to the reference content item.
The content reordering screen scrolling module 408 may render scrolling of the content reordering screen. More specifically, the content reordering screen scrolling module 408 may make the content items in the content reordering screen appear as if they are scrolling at one or more speeds. In some embodiments, the scrolling may be vertical scrolling. For example, the content items in the content reordering screen may appear to be moving up or down in the opposite direction to the direction a reference content item is being moved. In various embodiments, the content reordering screen scrolling module 408 supports a plurality of scrolling speeds.
In some embodiments, the scroll speed may be dynamic and determined based on the distance between the initial position of a reference content item and the location of the cursor or touchpoint at a particular instant in time during a gesture. For example, the content reordering screen scrolling module 408 may support a scroll speed of zero at which the content items do not scroll when, upon selection of the reference content item, the cursor or touchpoint is within a threshold distance of the initial position of the reference content item. The content reordering screen scrolling module 408 may also cause content items to be scrolled at other speeds based on (e.g., proportional to) the distance between the position of a user's finger or cursor and the initial position of the reference content item in the content reordering screen as the finger or cursor moves. The content reordering screen scrolling module 408 may further support a maximum scroll speed at which content items are scrolled when the distance between the position of a user's finger or cursor and the initial position of the reference content item satisfies (e.g., meets or exceeds) a maximum threshold distance. The content reordering screen scrolling module 408 may receive user input, such as vertical motions, in various embodiments.
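The multi-speed scrolling behavior described above — a zero-speed region near the reference content item's initial position, a proportional region, and a capped maximum — can be sketched as follows. All three constants are hypothetical values chosen for illustration, not values from the disclosure.

```python
DEAD_ZONE = 40      # px; within this distance of the initial position, no scrolling
MAX_DISTANCE = 200  # px; at or beyond this distance, scrolling is capped
MAX_SPEED = 12.0    # px per frame at the maximum scroll speed

def scroll_speed(pointer_y, reference_initial_y):
    """Scroll speed as a function of distance from the reference item's initial position."""
    distance = abs(pointer_y - reference_initial_y)
    if distance <= DEAD_ZONE:
        return 0.0  # pointer still near the initial position: do not scroll
    if distance >= MAX_DISTANCE:
        return MAX_SPEED  # maximum threshold satisfied: scroll at the maximum speed
    # in between: speed proportional to distance past the dead zone
    return MAX_SPEED * (distance - DEAD_ZONE) / (MAX_DISTANCE - DEAD_ZONE)
```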
The content item insertion rendering module 410 may render insertion of content items into the content reordering screen. In various embodiments, the content item insertion rendering module 410 inserts the reference content item into a specified location in the list of content items in the content reordering screen. The content item insertion rendering module 410 may further part content items at an insertion location, and may render the reference content item at or into the insertion location. The content item insertion rendering module 410 may receive user input, such as horizontal motions, in various embodiments.
FIG. 5 illustrates an example environment 500 including the story publication management module 108, according to an embodiment of the present disclosure. The story publication management module 108 includes a story creation module 502, a story annotation module 504, a story content selection module 506, a story content order module 508, a story content order modification module 510, a story publication module 512, and a story content order datastore 514. The story creation module 502, the story annotation module 504, the story content selection module 506, the story content order module 508, the story content order modification module 510, the story publication module 512, and the story content order datastore 514 may be coupled to one another and/or components external to the story publication management module 108 and not explicitly shown therein.
The story creation module 502 may facilitate the creation of stories. More specifically, the story creation module 502 may configure a story creation screen to request a user to create a new story and may create backend processes related to necessary pages, scripts, etc. that would help publish the new story. The story creation module 502 may receive information about the new story from the story publication user interface module 106. In some embodiments, the story creation module 502 may notify a social networking service or a social media service that a new story is being created. The story creation module 502 may request the social networking service or the social media service to update permissions and/or other information related to the user creating the story accordingly. The story creation module 502 may provide information about a new story being created to the story publication user interface module 106 so that the story publication user interface module 106 can request other information from a user, such as annotations for the story, as discussed further herein.
The story annotation module 504 may facilitate the annotation of stories. The story annotation module 504 may configure a story creation screen to accept a title, captions, tags, and other annotations for a story being created. The story annotation module 504 may also update backend processes related to the story to reflect the annotations. The story annotation module 504 may receive the annotations from the story publication user interface module 106. In some embodiments, the story annotation module 504 notifies the social networking service and/or social media service publishing the story of the annotations being added to the story. The story annotation module 504 may provide the annotations to the story publication user interface module 106, so that the story publication user interface module 106 can request other information from the user, such as content items for the story, as discussed further herein.
The story content selection module 506 may facilitate selection of content items for a story. The story content selection module 506 may configure a story creation screen to associate content items with a story being created. In some embodiments, the story content selection module 506 may interface with one or more of a camera, a file system, memory and/or storage, and cloud-based storage of a computing system to facilitate identification of content items relevant to a story. In some embodiments, the story content selection module 506 configures the story publication user interface module 106 to display content items that can be selected for the story. In some embodiments, the story content selection module 506 instructs the story publication user interface module 106 to accept selection of individual content items. In various embodiments, the story content selection module 506 instructs the story publication user interface module 106 to provide a batch uploader that allows selection of a plurality of content items for the story. The story content selection module 506 may provide the identities of selected content items to the story content order module 508.
The story content order module 508 may order the content items selected for a story. More specifically, the story content order module 508 may assign a rank to each content item selected for a story. The rank may comprise a number or other value that facilitates ordering of the content items for the story. In an embodiment, the story content order module 508 may store and/or manage the ranks of the content items in the story content order datastore 514. For example, the story content order module 508 may store and/or manage a database in the story content order datastore 514 that has, as its first column, the names of content items, and as its second column, the ranks of content items. In various embodiments, the story content order module 508 may implement multiple orders of content items. For instance, the story content order module 508 may implement a first order and a second order of content items. In the first order of content items, the rank of each content item may correspond to the order in which specific content items were selected for a story. In the second order of content items, content may be ordered according to modifications by a user. The modifications may be based on instructions from the story content order modification module 510.
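The rank assignment described above can be illustrated with a minimal sketch. The function names and the use of a plain dictionary in place of the two-column datastore are assumptions for illustration only:

```python
def assign_first_order(selected_names):
    """Build the first order: each item's rank is the position in which
    it was selected for the story (0 = selected first)."""
    return {name: rank for rank, name in enumerate(selected_names)}

def ordered(ranks):
    """Return item names sorted by their assigned rank."""
    return sorted(ranks, key=ranks.get)
```

A second order would then be produced by rewriting the rank values while leaving the item names unchanged.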
The story content order modification module 510 may facilitate modifying the order of content items for a story. The story content order modification module 510 may receive instructions to reorder a content item from the story publication user interface module 106. More specifically, the story content order modification module 510 may receive a selection from the story publication user interface module 106 of a reference content item. The story content order modification module 510 may also identify whether the story publication user interface module 106 has instructed the rank of the reference content item to change. The story content order modification module 510 may correspondingly change the rank of the reference content item and other content items selected for the story in the story content order datastore 514. The story content order modification module 510 may support a content reordering screen to reorder the reference content item in relation to the other selected content items, as discussed herein. In some embodiments, the story content order modification module 510 implements an iterative process that reorders content items more than once before a story is published. FIG. 6 shows the story content order modification module 510 in greater detail.
The story publication module 512 may facilitate publication of a story. More specifically, the story publication module 512 may provide instructions to a social networking service or a social media service to publish a story. In an embodiment, the story publication module 512 interfaces with APIs and/or functions of the social networking service or social media service that facilitate publication of the content items. Further, the story publication module 512 may publish the story to a feed associated with the user who created the story. In an embodiment, the story to be published has content items that have been reordered by the story content order modification module 510. The story publication module 512 may receive instructions to publish the story from the story publication user interface module 106.
The story content order datastore 514 may store content items and their specific ranks with respect to the order of content items in a story. The story content order datastore 514 may receive content items and their ranks from the story content order module 508 and the story content order modification module 510.
FIG. 6 illustrates an example environment 600 including the story content order modification module 510, according to an embodiment of the present disclosure. The story content order modification module 510 may include a story content order identification module 602, a reference content item selection module 604, a reference content item rank modification module 606, and a story content order update module 608. One or more of the story content order identification module 602, the reference content item selection module 604, the reference content item rank modification module 606, and the story content order update module 608 may be coupled to one another and/or components external to the story content order modification module 510 and not shown explicitly therein.
The story content order identification module 602 may identify the order of content items in a story. More specifically, the story content order identification module 602 may identify the ranks and/or orders of each content item chosen for a story. The story content order identification module 602 may obtain the orders of content items based on the ranks assigned to content items by the story content order module 508. In some embodiments, the story content order identification module 602 obtains a first order of content items corresponding to the order in which the content items were selected for a story. The story content order identification module 602 may also obtain updated orders of content items from the story content order update module 608. The story content order identification module 602 may provide the order of content items when so requested.
The reference content item selection module 604 may receive a selection of a reference content item that is to be re-ranked. In an embodiment, the reference content item selection module 604 may receive from the story publication user interface module 106 a selection of a reference content item. The selection may identify the reference content item by Universally Unique Identifier (UUID), by name, or by the original ranking of the reference content item. The reference content item selection module 604 may provide the identifier of the reference content item to the reference content item rank modification module 606.
The reference content item rank modification module 606 may modify the rank of the reference content item. More specifically, the reference content item rank modification module 606 may facilitate changing the rank of the reference content item and the ranks of other content items in the order. In an embodiment, the reference content item rank modification module 606 receives the instructions to modify the rank from the story publication user interface module 106. More specifically, as discussed herein, the reference content item rank modification module 606 may receive instructions from the story publication user interface module 106 that a user of the computing device moved the reference content item horizontally out of order from the other content items in the story. The reference content item rank modification module 606 may also receive instructions from the story publication user interface module 106 that a user of the computing device moved the reference content item vertically to a new rank in the order of content items in the story. The reference content item rank modification module 606 may further receive instructions from the story publication user interface module 106 that a user of the computing device reinserted the reference content item into the order of content items in the story, thereby reordering the reference content item with respect to the order of the content items selected for the story. The reference content item rank modification module 606 may provide the new rank of the reference content item, as well as the new ranks of the content items impacted by the reordering of the reference content item, to the story content order update module 608.
The story content order update module 608 may update the order of content items in the story based on the modified rank of the reference content item. In some embodiments, the story content order update module 608 assigns new ranks to all content items in the story having a higher rank than the modified rank of the reference content item. The story content order update module 608 may provide the new ranks of the content items in the story to other modules, such as other modules of the story publication management module 108. In some embodiments, a story may be published with the content items reordered according to the updated orders.
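The rank update described above can be sketched as follows, assuming (for illustration only) that ranks are stored as a name-to-rank dictionary and that every affected item simply receives its new position:

```python
def update_ranks(ranks, reference, new_rank):
    """Recompute all ranks after the reference item moves to `new_rank`.

    Items between the reference item's old and new positions shift by one;
    everything else keeps its relative order. `ranks` maps item name to
    its current integer rank (a hypothetical stand-in for the datastore)."""
    order = sorted(ranks, key=ranks.get)   # current order by rank
    order.remove(reference)                # pull the reference item out
    order.insert(new_rank, reference)      # reinsert at the new position
    return {name: i for i, name in enumerate(order)}
```

This yields a complete second order in one pass, which the datastore could persist wholesale.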
FIG. 7 illustrates an example method 700 for reordering content items in a story with a user interface, according to an embodiment of the present disclosure. It should be appreciated that there can be additional, fewer, or alternative steps performed in similar or alternative orders, or in parallel, within the scope of the various embodiments unless otherwise stated.
The example method 700 can receive a selection of a plurality of content items for a story in a story creation screen at block 702. The content items may include, in various embodiments, any items of digital content, such as digital images, digital audio, digital video, map data, hashtags, and social tags (e.g., user tags, facial recognition tags, location tags, and activity tags).
The example method 700 can display a first order of the plurality of content items in a story creation screen at block 704. The first order of the plurality of content items can correspond to the order in which content items were chosen for upload. The first order of the plurality of content items can also correspond to any other known or convenient order, such as an order of the content items by size, or by alphabetical or reverse alphabetical order based on annotations associated with the content items.
The example method 700 can receive a selection of a reference content item in the story creation screen at block 706. In some embodiments, a modified selection gesture selecting a reference content item is received. The modified selection gesture may include a long-press gesture, or a right-click or double-click, of a reference content item. The reference content item may be a content item the user is attempting to rerank.
The example method 700 can display at least a portion of the plurality of content items in a content reordering screen in a format optimized for viewing in a viewport of the computing device at block 708. In an embodiment, at least some of the plurality of content items may be shown in a content reordering screen. The shown content items may be displayed vertically so that viewing of multiple content items is optimized in the display of a mobile phone or a tablet computing device.
The example method 700 can visually emphasize the reference content item in the content reordering screen at block 710. For example, the reference content item may be expanded and may have an outline around it, while other content items in the content reordering screen may be shrunk and listed in a vertical arrangement. The reordering screen may further be configured to receive gestures relating to the reference content item.
The example method 700 can receive a movement gesture to rerank the reference content item in the content reordering screen at block 712. In some embodiments, a horizontal motion (e.g., a horizontal swipe) of the reference content item is received. Further, a vertical motion (e.g., a vertical swipe) of the reference content item may be received. The reference content item may be dragged and inserted at a selected position into the list of content items. In various embodiments, the reference content item is dragged in an arbitrary direction in the content reordering screen.
The example method 700 can rerank the reference content item in the content reordering screen in response to the movement gesture at block 714. More specifically, the reference content item may be removed and reinserted into the list of content items at a location the user desires. The content items may be reranked according to where the reference content item was inserted into the list of content items.
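The remove-and-reinsert step above amounts to a simple list operation; a minimal sketch, with illustrative function and parameter names not taken from this disclosure:

```python
def rerank(items, reference_index, insert_index):
    """Remove the reference item from `items` and reinsert it at
    `insert_index`, yielding the second order of content items."""
    second = list(items)                 # leave the first order untouched
    ref = second.pop(reference_index)    # pull the reference item out
    second.insert(insert_index, ref)     # drop it at the chosen position
    return second
```

For example, moving the third of four items to the front shifts the first two items down by one position each.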
The example method 700 can display a second order of the plurality of content items in the content reordering screen at block 716. More specifically, the plurality of content items may be displayed according to the second order after the reference content item was reinserted into the plurality of content items.
The example method 700 can display the second order of the plurality of content items in the story creation screen at block 718. For example, the story creation screen may display the content items based on the second order.
FIG. 8 illustrates an example method 800 for reordering content items in a story with a user interface, according to an embodiment of the present disclosure. Again, it should be appreciated that there can be additional, fewer, or alternative steps performed in similar or alternative orders, or in parallel, within the scope of the various embodiments unless otherwise stated.
The example method 800 can display at least a portion of a plurality of content items in a content reordering screen in a format optimized for viewing on a viewport of a computing device at block 802. In an embodiment, at least some of the plurality of content items may be shown in the content reordering screen. The shown content items may be displayed vertically so that viewing of multiple content items is optimized in the display of a mobile phone or a tablet computing device.
The example method 800 can receive a first gesture related to a first instruction to move a reference content item at block 804. In some embodiments, a horizontal motion (or substantially horizontal motion or non-vertical motion) relating to the reference content item may be received in the content reordering screen. More specifically, a horizontal swipe gesture or a horizontal drag instruction may be received. In a specific embodiment, the first gesture may require a minimum amount of horizontal motion before reordering is facilitated. More specifically, the first gesture need not be based purely on where a cursor or touchpoint is relative to a vertical axis of the content reordering screen. As a result, reordering need not be triggered simply by a modified selection of a content item at the extremes of the content reordering screen. The resulting embodiments may ensure content reordering processes that are less sensitive and less prone to user error.
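The minimum-horizontal-motion check described above can be sketched as a small predicate. The function name, the coordinate convention (x, y tuples), and the 40-unit threshold are assumptions for illustration:

```python
def reorder_triggered(start, current, min_dx=40.0):
    """Trigger reordering only after the pointer has moved a minimum
    horizontal distance from where the item was grabbed, rather than
    keying off the pointer's absolute position on screen.

    `start` and `current` are (x, y) pointer positions."""
    dx = abs(current[0] - start[0])
    return dx >= min_dx
```

Because the check is relative to the grab point, a long-press near the edge of the screen does not by itself start a reorder.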
The example method 800 can render and display the reference content item being slid out of order at block 806. More specifically, the content reordering screen may show the reference content item being slid horizontally away from the rest of the plurality of content items. A gap may be created in the space where the reference content item previously resided. The gap may further be closed by relative movement of the content item immediately preceding the reference content item (and adjacent content items) and relative movement of the content item immediately following the reference content item (and adjacent content items) toward one another.
The example method 800 can receive a second gesture related to a second instruction to move the reference content item at block 808. In an embodiment, a vertical motion (or substantially vertical motion or non-horizontal motion) relating to the reference content item may be received in the content reordering screen. More specifically, a vertical swipe gesture or a vertical drag may be received as the second gesture.
The example method 800 can determine a scroll speed based on the difference of an initial location of the reference content item and a location associated with the second gesture at block 810. The scroll speed may be dynamic and determined based on the distance between the initial location of the reference content item and the location of the cursor or touchpoint at a particular instant in time during the second gesture. For example, if the location of the cursor or touchpoint at a particular time during the second gesture is near or within a first threshold distance of the reference content item, the scroll speed may be zero (i.e., the content reordering screen may not scroll at all). As the location of the cursor or touchpoint at a later second time during the second gesture is moved away from the initial location of the reference content item beyond the first threshold distance, the scroll speed may be increased from an earlier scroll speed. If the location of the cursor or touchpoint at a later third time during the second gesture is moved toward the initial location of the reference content item, the scroll speed may be decreased from an earlier value. When the location of the cursor or touchpoint at any time during the second gesture satisfies a second threshold distance from the initial location of the reference content item, the scroll speed may reach a maximum value.
The determination of scroll speeds can be based on various techniques. In some embodiments, the scroll speed may be proportional to, or otherwise correlated with, the distance between the initial location of the reference content item and the location of the cursor or touchpoint at a particular instant in time during the second gesture. In some embodiments, the scroll speed may be calculated depending on the distance of the touchpoint from the edges of the screen and/or the number of content items. In some embodiments, the scroll speed may change continuously. In other embodiments, the scroll speed may change non-continuously in discrete steps. For example, if the distance between the initial location of the reference content item and the location of the cursor or touchpoint at a particular instant in time is within a first selected range of distances, then the scroll speed may be set to a first constant scroll speed. Further to this example, if the distance between the initial location of the reference content item and the location of the cursor or touchpoint at a particular instant in time is within a second range of distances, then the scroll speed may be set to a second constant scroll speed, and so on.
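The discrete-step variant can be sketched with a small lookup over distance ranges. The specific range boundaries and speed values are illustrative assumptions, not values given in this disclosure:

```python
def stepped_scroll_speed(distance,
                         steps=((50.0, 0.0), (150.0, 10.0), (300.0, 20.0)),
                         max_speed=30.0):
    """Discrete-step scroll speed: each (upper_bound, speed) pair assigns
    one constant speed to a range of distances; beyond the last bound,
    the maximum speed applies. Units are hypothetical (e.g., pixels)."""
    for upper_bound, speed in steps:
        if distance < upper_bound:
            return speed
    return max_speed
```

A continuously varying speed would instead interpolate between bounds rather than returning a constant per range.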
The example method 800 can render the plurality of content items being scrolled at the determined scroll speed at block 812. More specifically, the content reordering screen can be scrolled at the scroll speed determined.
The example method 800 can receive a third gesture to insert the reference content item into the list of the plurality of content items at block 814. The third gesture may comprise a horizontal motion, such as a motion to insert the reference content item into the list of the plurality of content items. Examples of such motion include a horizontal swipe gesture and a horizontal drag of the reference content item.
The example method 800 can render the reference content item being inserted into a location associated with the third gesture, thereby creating a second order of the plurality of content items, at block 816. The content reordering screen may allow the reference content item to be inserted at a location corresponding to the third gesture. This may have the effect of reordering the list of the plurality of content items.
FIG. 9 illustrates an example method 900 for reordering content items in a story, according to an embodiment of the present disclosure. Again, it should be appreciated that there can be additional, fewer, or alternative steps performed in similar or alternative orders, or in parallel, within the scope of the various embodiments unless otherwise stated.
The example method 900 can receive an instruction to create a new story at block 902. More specifically, an instruction to create a new story may be received from a story creation screen.
The example method 900 can identify a plurality of content items to associate with the story at block 904. A plurality of content items may be selected from content items from the camera, the file system, or servers coupled to the computing device. The selection of the plurality of content items may be received from the story creation screen.
The example method 900 can receive annotations for the story at block 906. Annotations, such as a title, captions, tags, and other information, may be received for a story from the story creation screen.
The example method 900 can modify the order of content items by modifying the rank of one of the content items at block 908. More specifically, a content reordering screen may be provided. In the content reordering screen, instructions to modify the order of the content items may be received. The instructions may be based, at least in part, on changing the rank of a reference content item. The order of the content items may therefore be modified to have a second order of content items.
The example method 900 can publish the story using the modified order of content items at block 910. More specifically, the story may be published with the modified order of content items to a variety of locations. In various embodiments, the story may be shared with others, published in a feed associated with the user, stored locally on the computing device, or stored remotely on a server coupled to the computing device through a network connection. The story may be published to a social networking service or a social media service.
FIG. 10 illustrates an example method 1000 for reordering content items in a story, according to an embodiment of the present disclosure. Again, it should be appreciated that there can be additional, fewer, or alternative steps performed in similar or alternative orders, or in parallel, within the scope of the various embodiments unless otherwise stated.
The example method 1000 can receive annotations for a story created on a computing device at block 1002. In an embodiment, annotations such as a title, captions, tags, and other information may be received for a story. These annotations may be received from a story creation screen.
The example method 1000 can receive a selection of a plurality of content items for the story at block 1004. A plurality of content items may be selected from content items from the camera, the file system, or servers coupled to the computing device. The selection of the plurality of content items may be received from the story creation screen.
The example method 1000 can identify a first order of the plurality of content items for the story at block 1006. More specifically, a rank may be assigned to each content item to produce the first order. The first order of the plurality of content items can correspond to the order in which content items were chosen for upload. The first order of the plurality of content items can also correspond to any other known or convenient order, such as an order of the content items by size, or by alphabetical or reverse alphabetical order of annotations associated with the content items.
The example method 1000 can receive a selection of a reference content item of the plurality of content items for moving at block 1008. More specifically, a user interface module may provide a notification that a reference content item has been selected.
The example method 1000 can receive a notification that the reference content item was moved at block 1010. More specifically, a notification that the reference content item was reordered in the content reordering screen may be received.
The example method 1000 can rerank the reference content item to create a second order of the plurality of content items at block 1012. In an embodiment, the rank of the reference content item may be adjusted. The ranks of other content items may also be adjusted to create a second order of the plurality of content items.
The example method 1000 can provide the second order of the plurality of content items for use in the story at block 1014. For example, the second order of content items may be provided to the story creation screen so that the story can be modified and/or published.
FIG. 11 illustrates an example screen 1100 of a user interface of a system that facilitates reordering of content items, according to an embodiment of the present disclosure. The screen 1100 may form a part of a story creation screen, as discussed herein. The screen 1100 may include a first content item 1102, a second content item 1104, a third content item 1106, a fourth content item 1108, a fifth content item 1110, and a sixth content item 1112. The screen 1100 may further include a story publication button 1114. In this example, the content items 1102-1112 have already been chosen for publication as a story. Though the screen 1100 contains two columns, it is noted that the screen may include a single column, or more than two columns, in various embodiments. In an embodiment, the screen 1100 contains a single column with content items appearing larger than they would in a content reordering screen.
FIG. 12 illustrates an example screen 1200 of a user interface of a system that facilitates reordering of content items, according to an embodiment of the present disclosure. In the screen 1200, the third content item 1106 has received a modified selection. For example, a user may have long-pressed the third content item 1106 or double-clicked/right-clicked the third content item 1106. The third content item 1106 may have been highlighted in the screen 1200 in response to the modified selection.
FIG. 13 illustrates an example screen 1300 of a user interface of a system that facilitates reordering of content items, according to an embodiment of the present disclosure. The screen 1300 may form part of a content reordering screen, as discussed herein. The screen 1300 may include the first content item 1102, the second content item 1104, the third content item 1106, and the fourth content item 1108. In some embodiments, the content reordering screen may selectively display a portion of the content items of the story creation screen. The selective display of a portion of the content items may be based on threshold proximity of the content items to the selected (reference) content item based on distance or ranking in view of the available display space of the content reordering screen. As shown, the fifth content item 1110 and the sixth content item 1112 are not shown in the screen 1300. The first content item 1102, second content item 1104, third content item 1106, and fourth content item 1108 are shown in a vertical arrangement that is optimized for display in the viewport of the computing device. In this example, the first content item 1102, second content item 1104, third content item 1106, and fourth content item 1108 are sized so that their widths take up approximately 90 percent of the width of the viewport of the screen 1300. Other techniques to resize the content items to other dimensions are possible. The third content item 1106 may be highlighted due to the modified selection of the third content item 1106.
FIG. 14 illustrates an example screen 1400 of a user interface of a system that facilitates reordering of content items, according to an embodiment of the present disclosure. The screen 1400 reflects provision of a user gesture to move the third content item 1106 out of order. More specifically, a user has moved the third content item 1106 right (e.g., horizontally) to pull it away from the second content item 1104 and the fourth content item 1108. This horizontal movement may have been a horizontal swipe gesture or a horizontal drag. The space between the second content item 1104 and the fourth content item 1108 previously occupied by the third content item 1106 has closed after the third content item 1106 is pulled away. The user has also moved the third content item 1106 vertically and upwardly using a vertical motion gesture (e.g., a vertical swipe or vertical drag). As the third content item 1106 is moved in this manner, the screen 1400 is scrolled down to make the third content item 1106 appear as if it is moving up in the order of the content items.
FIG. 15 illustrates an example screen 1500 of a user interface of a system that facilitates reordering of content items, according to an embodiment of the present disclosure. The user is moving the third content item 1106 left (e.g., horizontally) to insert the third content item 1106 between the first content item 1102 and the second content item 1104. The screen 1500 reflects a decision of the user to place the third content item 1106 between the first content item 1102 and the second content item 1104. In some embodiments, a shadow 1502 may automatically appear between the first content item 1102 and the second content item 1104 as the user moves the third content item 1106 to a selected position relative to the first content item 1102 and the second content item 1104. The shadow 1502 may indicate to the user that the third content item 1106 has been moved sufficiently to allow insertion between the first content item 1102 and the second content item 1104. In some embodiments, a shadow may appear whenever movement of a reference content item results in allowable insertion of the reference content item between two other content items. In an embodiment, an indicator of where the reference content item will land is shown using an opaque preview of the reference content item in the new position by moving adjacent content items apart.
FIG. 16 illustrates an example screen 1600 of a user interface of a system that facilitates reordering of content items, according to an embodiment of the present disclosure. The screen 1600 shows the content reordering screen after the third content item 1106 has been inserted between the first content item 1102 and the second content item 1104.
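The reranking illustrated in FIGS. 14-16 amounts to removing the reference content item from its original position in the first order and reinserting it at the position selected by the gesture, yielding the second order. The following is a minimal sketch of that operation; the function name and item labels are illustrative assumptions, not part of the disclosed implementation.

```python
def rerank(content_items, from_index, to_index):
    """Move the reference content item from one position to another,
    producing a new ordering. Returns a new list; the original order
    is left unchanged so it can be restored if the user cancels."""
    items = list(content_items)        # copy so the first order is preserved
    reference = items.pop(from_index)  # pull the reference item away
    items.insert(to_index, reference)  # insert at the selected position
    return items

# Example: the third item (index 2) is moved between the first and second items.
first_order = ["item_1102", "item_1104", "item_1106", "item_1108"]
second_order = rerank(first_order, from_index=2, to_index=1)
# second_order == ["item_1102", "item_1106", "item_1104", "item_1108"]
```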
FIG. 17 illustrates an example screen 1700 of a user interface of a system that facilitates reordering of content items, according to an embodiment of the present disclosure. The screen 1700 illustrates the story creation screen after the third content item 1106 has been reranked and the order of the content items has been modified. In the screen 1700, the story publication button 1114 can be depressed by the user to publish the story to the user's social networking account. For instance, the story may be shared with friends or published to the user's feed.
FIG. 18 illustrates an example screen 1800 of a user interface of a system that facilitates reordering of content items, according to an embodiment of the present disclosure. The screen 1800 may show a feed for another user associated with the user who created the story with the reranking of the third content item 1106. In the screen 1800, the first content item 1102, the third content item 1106, and the second content item 1104 appear in the feed of the other user according to the reranked ordering.
Social Networking System—Example Implementation
FIG. 19 illustrates a network diagram of an example system 1900 that can be utilized in various scenarios, in accordance with an embodiment of the present disclosure. The system 1900 includes one or more user devices 1910, one or more external systems 1920, a social networking system 1930, and a network 1950. In an embodiment, the social networking service, provider, and/or system discussed in connection with the embodiments described above may be implemented as the social networking system 1930. For purposes of illustration, the embodiment of the system 1900, shown in FIG. 19, includes a single external system 1920 and a single user device 1910. However, in other embodiments, the system 1900 may include more user devices 1910 and/or more external systems 1920. In certain embodiments, the social networking system 1930 is operated by a social network provider, whereas the external systems 1920 are separate from the social networking system 1930 in that they may be operated by different entities. In various embodiments, however, the social networking system 1930 and the external systems 1920 operate in conjunction to provide social networking services to users (or members) of the social networking system 1930. In this sense, the social networking system 1930 provides a platform or backbone, which other systems, such as external systems 1920, may use to provide social networking services and functionalities to users across the Internet.
The user device 1910 comprises one or more computing devices that can receive input from a user and transmit and receive data via the network 1950. In one embodiment, the user device 1910 is a conventional computer system executing, for example, a Microsoft Windows compatible operating system (OS), Apple OS X, and/or a Linux distribution. In another embodiment, the user device 1910 can be a device having computer functionality, such as a smart-phone, a tablet, a personal digital assistant (PDA), a mobile telephone, etc. The user device 1910 is configured to communicate via the network 1950. The user device 1910 can execute an application, for example, a browser application that allows a user of the user device 1910 to interact with the social networking system 1930. In another embodiment, the user device 1910 interacts with the social networking system 1930 through an application programming interface (API) provided by the native operating system of the user device 1910, such as iOS and ANDROID. The user device 1910 is configured to communicate with the external system 1920 and the social networking system 1930 via the network 1950, which may comprise any combination of local area and/or wide area networks, using wired and/or wireless communication systems.
In one embodiment, the network 1950 uses standard communications technologies and protocols. Thus, the network 1950 can include links using technologies such as Ethernet, 802.11, worldwide interoperability for microwave access (WiMAX), 3G, 4G, CDMA, GSM, LTE, digital subscriber line (DSL), etc. Similarly, the networking protocols used on the network 1950 can include multiprotocol label switching (MPLS), transmission control protocol/Internet protocol (TCP/IP), User Datagram Protocol (UDP), hypertext transport protocol (HTTP), simple mail transfer protocol (SMTP), file transfer protocol (FTP), and the like. The data exchanged over the network 1950 can be represented using technologies and/or formats including hypertext markup language (HTML) and extensible markup language (XML). In addition, all or some links can be encrypted using conventional encryption technologies such as secure sockets layer (SSL), transport layer security (TLS), and Internet Protocol security (IPsec).
In one embodiment, the user device 1910 may display content from the external system 1920 and/or from the social networking system 1930 by processing a markup language document 1914 received from the external system 1920 and from the social networking system 1930 using a browser application 1912. The markup language document 1914 identifies content and one or more instructions describing formatting or presentation of the content. By executing the instructions included in the markup language document 1914, the browser application 1912 displays the identified content using the format or presentation described by the markup language document 1914. For example, the markup language document 1914 includes instructions for generating and displaying a web page having multiple frames that include text and/or image data retrieved from the external system 1920 and the social networking system 1930. In various embodiments, the markup language document 1914 comprises a data file including extensible markup language (XML) data, extensible hypertext markup language (XHTML) data, or other markup language data. Additionally, the markup language document 1914 may include JavaScript Object Notation (JSON) data, JSON with padding (JSONP), and JavaScript data to facilitate data-interchange between the external system 1920 and the user device 1910. The browser application 1912 on the user device 1910 may use a JavaScript compiler to decode the markup language document 1914.
The markup language document 1914 may also include, or link to, applications or application frameworks such as FLASH™ or Unity™ applications, the SilverLight™ application framework, etc.
In one embodiment, the user device 1910 also includes one or more cookies 1916 including data indicating whether a user of the user device 1910 is logged into the social networking system 1930, which may enable modification of the data communicated from the social networking system 1930 to the user device 1910.
The external system 1920 includes one or more web servers that include one or more web pages 1922a, 1922b, which are communicated to the user device 1910 using the network 1950. The external system 1920 is separate from the social networking system 1930. For example, the external system 1920 is associated with a first domain, while the social networking system 1930 is associated with a separate social networking domain. Web pages 1922a, 1922b, included in the external system 1920, comprise a markup language document 1914 identifying content and including instructions specifying formatting or presentation of the identified content.
The social networking system 1930 includes one or more computing devices for a social network, including a plurality of users, and providing users of the social network with the ability to communicate and interact with other users of the social network. In some instances, the social network can be represented by a graph, i.e., a data structure including edges and nodes. Other data structures can also be used to represent the social network, including but not limited to databases, objects, classes, meta elements, files, or any other data structure. The social networking system 1930 may be administered, managed, or controlled by an operator. The operator of the social networking system 1930 may be a human being, an automated application, or a series of applications for managing content, regulating policies, and collecting usage metrics within the social networking system 1930. Any type of operator may be used.
Users may join the social networking system 1930 and then add connections to any number of other users of the social networking system 1930 to whom they desire to be connected. As used herein, the term “friend” refers to any other user of the social networking system 1930 to whom a user has formed a connection, association, or relationship via the social networking system 1930. For example, in an embodiment, if users in the social networking system 1930 are represented as nodes in the social graph, the term “friend” can refer to an edge formed between and directly connecting two user nodes.
Connections may be added explicitly by a user or may be automatically created by the social networking system 1930 based on common characteristics of the users (e.g., users who are alumni of the same educational institution). For example, a first user specifically selects a particular other user to be a friend. Connections in the social networking system 1930 are usually in both directions, but need not be, so the terms “user” and “friend” depend on the frame of reference. Connections between users of the social networking system 1930 are usually bilateral (“two-way”), or “mutual,” but connections may also be unilateral, or “one-way.” For example, if Bob and Joe are both users of the social networking system 1930 and connected to each other, Bob and Joe are each other's connections. If, on the other hand, Bob wishes to connect to Joe to view data communicated to the social networking system 1930 by Joe, but Joe does not wish to form a mutual connection, a unilateral connection may be established. The connection between users may be a direct connection; however, some embodiments of the social networking system 1930 allow the connection to be indirect via one or more levels of connections or degrees of separation.
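The bilateral/unilateral distinction above can be modeled with directed connection records, where a mutual connection exists only when both directions are present. The sketch below is a simplified illustration under that assumption; the class and method names are hypothetical, not part of the disclosed system.

```python
class ConnectionStore:
    """Minimal sketch of directed connections: a pair (a, b) means
    user a has formed a one-way connection to user b."""

    def __init__(self):
        self._connections = set()  # set of directed (from_user, to_user) pairs

    def add_connection(self, from_user, to_user):
        self._connections.add((from_user, to_user))

    def is_mutual(self, user_a, user_b):
        # A bilateral ("two-way") connection requires both directions.
        return ((user_a, user_b) in self._connections
                and (user_b, user_a) in self._connections)

store = ConnectionStore()
store.add_connection("Bob", "Joe")   # unilateral: Bob connects to Joe
store.is_mutual("Bob", "Joe")        # still one-way at this point
store.add_connection("Joe", "Bob")   # Joe reciprocates
store.is_mutual("Bob", "Joe")        # now bilateral
```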
In addition to establishing and maintaining connections between users and allowing interactions between users, the social networking system 1930 provides users with the ability to take actions on various types of items supported by the social networking system 1930. These items may include groups or networks (i.e., social networks of people, entities, and concepts) to which users of the social networking system 1930 may belong, events or calendar entries in which a user might be interested, computer-based applications that a user may use via the social networking system 1930, transactions that allow users to buy or sell items via services provided by or through the social networking system 1930, and interactions with advertisements that a user may perform on or off the social networking system 1930. These are just a few examples of the items upon which a user may act on the social networking system 1930, and many others are possible. A user may interact with anything that is capable of being represented in the social networking system 1930 or in the external system 1920, separate from the social networking system 1930, or coupled to the social networking system 1930 via the network 1950.
The social networking system 1930 is also capable of linking a variety of entities. For example, the social networking system 1930 enables users to interact with each other as well as external systems 1920 or other entities through an API, a web service, or other communication channels. The social networking system 1930 generates and maintains the “social graph” comprising a plurality of nodes interconnected by a plurality of edges. Each node in the social graph may represent an entity that can act on another node and/or that can be acted on by another node. The social graph may include various types of nodes. Examples of types of nodes include users, non-person entities, content items, web pages, groups, activities, messages, concepts, and any other things that can be represented by an object in the social networking system 1930. An edge between two nodes in the social graph may represent a particular kind of connection, or association, between the two nodes, which may result from node relationships or from an action that was performed by one of the nodes on the other node. In some cases, the edges between nodes can be weighted. The weight of an edge can represent an attribute associated with the edge, such as a strength of the connection or association between nodes. Different types of edges can be provided with different weights. For example, an edge created when one user “likes” another user may be given one weight, while an edge created when a user befriends another user may be given a different weight.
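A social graph with typed, weighted edges as described above can be sketched as a mapping from node pairs to per-type weights. The representation and weight values below are illustrative assumptions for exposition only, not the disclosed data structure.

```python
class SocialGraph:
    """Minimal sketch of a social graph whose edges carry a type and a weight."""

    def __init__(self):
        # (node_a, node_b) -> {edge_type: weight}
        self.edges = {}

    def add_edge(self, node_a, node_b, edge_type, weight):
        """Record an edge of a given type between two nodes; different
        edge types between the same nodes can carry different weights."""
        self.edges.setdefault((node_a, node_b), {})[edge_type] = weight

graph = SocialGraph()
# A "like" edge and a "friend" edge between the same users, weighted differently
# (hypothetical weights chosen only to illustrate the distinction).
graph.add_edge("user_1", "user_2", "like", 0.5)
graph.add_edge("user_1", "user_2", "friend", 1.0)
```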
As an example, when a first user identifies a second user as a friend, an edge in the social graph is generated connecting a node representing the first user and a second node representing the second user. As various nodes relate or interact with each other, the social networking system 1930 modifies edges connecting the various nodes to reflect the relationships and interactions.
The social networking system 1930 also includes user-generated content, which enhances a user's interactions with the social networking system 1930. User-generated content may include anything a user can add, upload, send, or “post” to the social networking system 1930. For example, a user communicates posts to the social networking system 1930 from a user device 1910. Posts may include data such as status updates or other textual data, location information, images such as photos, videos, links, music or other similar data and/or media. Content may also be added to the social networking system 1930 by a third party. Content “items” are represented as objects in the social networking system 1930. In this way, users of the social networking system 1930 are encouraged to communicate with each other by posting text and content items of various types of media through various communication channels. Such communication increases the interaction of users with each other and increases the frequency with which users interact with the social networking system 1930.
The social networking system 1930 includes a web server 1932, an API request server 1934, a user profile store 1936, a connection store 1938, an action logger 1940, an activity log 1942, and an authorization server 1944. In an embodiment of the invention, the social networking system 1930 may include additional, fewer, or different components for various applications. Other components, such as network interfaces, security mechanisms, load balancers, failover servers, management and network operations consoles, and the like are not shown so as not to obscure the details of the system.
The user profile store 1936 maintains information about user accounts, including biographic, demographic, and other types of descriptive information, such as work experience, educational history, hobbies or preferences, location, and the like that has been declared by users or inferred by the social networking system 1930. This information is stored in the user profile store 1936 such that each user is uniquely identified. The social networking system 1930 also stores data describing one or more connections between different users in the connection store 1938. The connection information may indicate users who have similar or common work experience, group memberships, hobbies, or educational history. Additionally, the social networking system 1930 includes user-defined connections between different users, allowing users to specify their relationships with other users. For example, user-defined connections allow users to generate relationships with other users that parallel the users' real-life relationships, such as friends, co-workers, partners, and so forth. Users may select from predefined types of connections, or define their own connection types as needed. Connections with other nodes in the social networking system 1930, such as non-person entities, buckets, cluster centers, images, interests, pages, external systems, concepts, and the like are also stored in the connection store 1938.
The social networking system 1930 maintains data about objects with which a user may interact. To maintain this data, the user profile store 1936 and the connection store 1938 store instances of the corresponding type of objects maintained by the social networking system 1930. Each object type has information fields that are suitable for storing information appropriate to the type of object. For example, the user profile store 1936 contains data structures with fields suitable for describing a user's account and information related to a user's account. When a new object of a particular type is created, the social networking system 1930 initializes a new data structure of the corresponding type, assigns a unique object identifier to it, and begins to add data to the object as needed. This might occur, for example, when a user becomes a user of the social networking system 1930: the social networking system 1930 generates a new instance of a user profile in the user profile store 1936, assigns a unique identifier to the user account, and begins to populate the fields of the user account with information provided by the user.
The connection store 1938 includes data structures suitable for describing a user's connections to other users, connections to external systems 1920 or connections to other entities. The connection store 1938 may also associate a connection type with a user's connections, which may be used in conjunction with the user's privacy setting to regulate access to information about the user. In an embodiment of the invention, the user profile store 1936 and the connection store 1938 may be implemented as a federated database.
Data stored in the connection store 1938, the user profile store 1936, and the activity log 1942 enables the social networking system 1930 to generate the social graph that uses nodes to identify various objects and edges connecting nodes to identify relationships between different objects. For example, if a first user establishes a connection with a second user in the social networking system 1930, user accounts of the first user and the second user from the user profile store 1936 may act as nodes in the social graph. The connection between the first user and the second user stored by the connection store 1938 is an edge between the nodes associated with the first user and the second user. Continuing this example, the second user may then send the first user a message within the social networking system 1930. The action of sending the message, which may be stored, is another edge between the two nodes in the social graph representing the first user and the second user. Additionally, the message itself may be identified and included in the social graph as another node connected to the nodes representing the first user and the second user.
In another example, a first user may tag a second user in an image that is maintained by the social networking system 1930 (or, alternatively, in an image maintained by another system outside of the social networking system 1930). The image may itself be represented as a node in the social networking system 1930. This tagging action may create edges between the first user and the second user as well as create an edge between each of the users and the image, which is also a node in the social graph. In yet another example, if a user confirms attending an event, the user and the event are nodes obtained from the user profile store 1936, where the attendance of the event is an edge between the nodes that may be retrieved from the activity log 1942. By generating and maintaining the social graph, the social networking system 1930 includes data describing many different types of objects and the interactions and connections among those objects, providing a rich source of socially relevant information.
The web server 1932 links the social networking system 1930 to one or more user devices 1910 and/or one or more external systems 1920 via the network 1950. The web server 1932 serves web pages, as well as other web-related content, such as Java, JavaScript, Flash, XML, and so forth. The web server 1932 may include a mail server or other messaging functionality for receiving and routing messages between the social networking system 1930 and one or more user devices 1910. The messages can be instant messages, queued messages (e.g., email), text and SMS messages, or any other suitable messaging format.
The API request server 1934 allows one or more external systems 1920 and user devices 1910 to access information from the social networking system 1930 by calling one or more API functions. The API request server 1934 may also allow external systems 1920 to send information to the social networking system 1930 by calling APIs. The external system 1920, in one embodiment, sends an API request to the social networking system 1930 via the network 1950, and the API request server 1934 receives the API request. The API request server 1934 processes the request by calling an API associated with the API request to generate an appropriate response, which the API request server 1934 communicates to the external system 1920 via the network 1950. For example, responsive to an API request, the API request server 1934 collects data associated with a user, such as the user's connections that have logged into the external system 1920, and communicates the collected data to the external system 1920. In another embodiment, the user device 1910 communicates with the social networking system 1930 via APIs in the same manner as external systems 1920.
The action logger 1940 is capable of receiving communications from the web server 1932 about user actions on and/or off the social networking system 1930. The action logger 1940 populates the activity log 1942 with information about user actions, enabling the social networking system 1930 to discover various actions taken by its users within the social networking system 1930 and outside of the social networking system 1930. Any action that a particular user takes with respect to another node on the social networking system 1930 may be associated with each user's account, through information maintained in the activity log 1942 or in a similar database or other data repository. Examples of actions taken by a user within the social networking system 1930 that are identified and stored may include, for example, adding a connection to another user, sending a message to another user, reading a message from another user, viewing content associated with another user, attending an event posted by another user, posting an image, attempting to post an image, or other actions interacting with another user or another object. When a user takes an action within the social networking system 1930, the action is recorded in the activity log 1942. In one embodiment, the social networking system 1930 maintains the activity log 1942 as a database of entries. When an action is taken within the social networking system 1930, an entry for the action is added to the activity log 1942. The activity log 1942 may be referred to as an action log.
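The pattern described above, in which each user action becomes one entry in an activity log, can be sketched as follows. In practice the log would be a database; the in-memory list, field names, and example actions here are illustrative assumptions only.

```python
import time

# Sketch of an action logger populating an activity log as a list of entries.
activity_log = []

def log_action(actor, action, target):
    """Record one user action, taken by an actor with respect to another
    node (the target), as an entry in the activity log."""
    activity_log.append({
        "actor": actor,
        "action": action,        # e.g. "add_connection", "send_message"
        "target": target,        # the other user or object acted upon
        "timestamp": time.time(),
    })

log_action("user_1", "add_connection", "user_2")
log_action("user_1", "post_image", "image_42")
```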
Additionally, user actions may be associated with concepts and actions that occur within an entity outside of the social networking system 1930, such as an external system 1920 that is separate from the social networking system 1930. For example, the action logger 1940 may receive data describing a user's interaction with an external system 1920 from the web server 1932. In this example, the external system 1920 reports a user's interaction according to structured actions and objects in the social graph.
Other examples of actions where a user interacts with an external system 1920 include a user expressing an interest in an external system 1920 or another entity, a user posting a comment to the social networking system 1930 that discusses an external system 1920 or a web page 1922a within the external system 1920, a user posting to the social networking system 1930 a Uniform Resource Locator (URL) or other identifier associated with an external system 1920, a user attending an event associated with an external system 1920, or any other action by a user that is related to an external system 1920. Thus, the activity log 1942 may include actions describing interactions between a user of the social networking system 1930 and an external system 1920 that is separate from the social networking system 1930.
The authorization server 1944 enforces one or more privacy settings of the users of the social networking system 1930. A privacy setting of a user determines how particular information associated with a user can be shared. The privacy setting comprises the specification of particular information associated with a user and the specification of the entity or entities with whom the information can be shared. Examples of entities with which information can be shared may include other users, applications, external systems 1920, or any entity that can potentially access the information. The information that can be shared by a user comprises user account information, such as profile photos, phone numbers associated with the user, user's connections, actions taken by the user such as adding a connection, changing user profile information, and the like.
The privacy setting specification may be provided at different levels of granularity. For example, the privacy setting may identify specific information to be shared with other users, such as a work phone number or a specific set of related information, including profile photo, home phone number, and status. Alternatively, the privacy setting may apply to all the information associated with the user. The specification of the set of entities that can access particular information can also be specified at various levels of granularity. Various sets of entities with which information can be shared may include, for example, all friends of the user, all friends of friends, all applications, or all external systems 1920. One embodiment allows the specification of the set of entities to comprise an enumeration of entities. For example, the user may provide a list of external systems 1920 that are allowed to access certain information. Another embodiment allows the specification to comprise a set of entities along with exceptions that are not allowed to access the information. For example, a user may allow all external systems 1920 to access the user's work information, but specify a list of external systems 1920 that are not allowed to access the work information. Certain embodiments call the list of exceptions that are not allowed to access certain information a “block list”. External systems 1920 belonging to a block list specified by a user are blocked from accessing the information specified in the privacy setting. Various combinations of granularity of specification of information, and granularity of specification of entities, with which information is shared are possible. For example, all personal information may be shared with friends whereas all work information may be shared with friends of friends.
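The allowed-set-with-block-list scheme described above can be sketched as a simple access check in which the block list takes precedence over the allowed set. The function and field names below are hypothetical, chosen only to illustrate the precedence logic.

```python
def may_access(requesting_system, privacy_setting):
    """Return True if an external system may access the information
    governed by this privacy setting. The block list is checked first,
    so a blocked system is denied even when all systems are allowed."""
    if requesting_system in privacy_setting["block_list"]:
        return False  # block-list exceptions always override the allowed set
    return (privacy_setting["allow_all"]
            or requesting_system in privacy_setting["allowed"])

# Example: all external systems may access work information,
# except those on the block list.
setting = {"allow_all": True, "allowed": set(), "block_list": {"system_x"}}
may_access("system_x", setting)  # denied: on the block list
may_access("system_y", setting)  # granted: all others allowed
```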
The authorization server 1944 contains logic to determine if certain information associated with a user can be accessed by a user's friends, external systems 1920, and/or other applications and entities. The external system 1920 may need authorization from the authorization server 1944 to access the user's more private and sensitive information, such as the user's work phone number. Based on the user's privacy settings, the authorization server 1944 determines if another user, the external system 1920, an application, or another entity is allowed to access information associated with the user, including information about actions taken by the user.
The user device 1910 can include a story publication system 1946. The story publication system 1946 can facilitate effective publication of content items by allowing a user to reorder content items according to a specific narrative of a story the user is trying to tell with the content items. The story publication system 1946 can further allow a user to enter captions, titles, tags, maps, and other metadata associated with a story. The story publication system 1946 can include a story publication user interface module, having the story publication user interface features described herein, and a story publication management module, having the story publication management features described herein. In some embodiments, the story publication system 1946 can be implemented as the story publication system 102 of FIG. 1.
Hardware Implementation
The foregoing processes and features can be implemented by a wide variety of machine and computer system architectures and in a wide variety of network and computing environments. FIG. 20 illustrates an example of a computer system 2000 that may be used to implement one or more of the embodiments described herein in accordance with an embodiment of the invention. The computer system 2000 includes sets of instructions for causing the computer system 2000 to perform the processes and features discussed herein. The computer system 2000 may be connected (e.g., networked) to other machines. In a networked deployment, the computer system 2000 may operate in the capacity of a server machine or a client machine in a client-server network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. In an embodiment of the invention, the computer system 2000 may be the social networking system 1930, the user device 1910, or the external system 1920, or a component thereof. In an embodiment of the invention, the computer system 2000 may be one server among many that constitute all or part of the social networking system 1930.
The computer system 2000 includes a processor 2002, a cache 2004, and one or more executable modules and drivers, stored on a computer-readable medium, directed to the processes and features described herein. Additionally, the computer system 2000 includes a high performance input/output (I/O) bus 2006 and a standard I/O bus 2008. A host bridge 2010 couples the processor 2002 to the high performance I/O bus 2006, whereas an I/O bus bridge 2012 couples the two buses 2006 and 2008 to each other. A system memory 2014 and one or more network interfaces 2016 couple to the high performance I/O bus 2006. The computer system 2000 may further include video memory and a display device coupled to the video memory (not shown). Mass storage 2018 and I/O ports 2020 couple to the standard I/O bus 2008. The computer system 2000 may optionally include a keyboard and pointing device, a display device, or other input/output devices (not shown) coupled to the standard I/O bus 2008. Collectively, these elements are intended to represent a broad category of computer hardware systems, including but not limited to computer systems based on the x86-compatible processors manufactured by Intel Corporation of Santa Clara, Calif., and the x86-compatible processors manufactured by Advanced Micro Devices (AMD), Inc., of Sunnyvale, Calif., as well as any other suitable processor.
An operating system manages and controls the operation of the computer system 2000, including the input and output of data to and from software applications (not shown). The operating system provides an interface between the software applications being executed on the system and the hardware components of the system. Any suitable operating system may be used, such as the LINUX Operating System, the Apple Macintosh Operating System, available from Apple Computer Inc. of Cupertino, Calif., UNIX operating systems, Microsoft® Windows® operating systems, BSD operating systems, and the like. Other implementations are possible.
The elements of the computer system 2000 are described in greater detail below. In particular, the network interface 2016 provides communication between the computer system 2000 and any of a wide range of networks, such as an Ethernet (e.g., IEEE 802.3) network, a backplane, etc. The mass storage 2018 provides permanent storage for the data and programming instructions to perform the above-described processes and features implemented by the respective computing systems identified above, whereas the system memory 2014 (e.g., DRAM) provides temporary storage for the data and programming instructions when executed by the processor 2002. The I/O ports 2020 may be one or more serial and/or parallel communication ports that provide communication between the computer system 2000 and additional peripheral devices, which may be coupled to the computer system 2000.
The computer system 2000 may include a variety of system architectures, and various components of the computer system 2000 may be rearranged. For example, the cache 2004 may be on-chip with the processor 2002. Alternatively, the cache 2004 and the processor 2002 may be packaged together as a “processor module”, with the processor 2002 being referred to as the “processor core”. Furthermore, certain embodiments of the invention may neither require nor include all of the above components. For example, peripheral devices coupled to the standard I/O bus 2008 may couple to the high performance I/O bus 2006. In addition, in some embodiments, only a single bus may exist, with the components of the computer system 2000 being coupled to the single bus. Furthermore, the computer system 2000 may include additional components, such as additional processors, storage devices, or memories.
In general, the processes and features described herein may be implemented as part of an operating system or a specific application, component, program, object, module, or series of instructions referred to as “programs”. For example, one or more programs may be used to execute specific processes described herein. The programs typically comprise one or more instructions in various memory and storage devices in the computer system 2000 that, when read and executed by one or more processors, cause the computer system 2000 to perform operations to execute the processes and features described herein. The processes and features described herein may be implemented in software, firmware, hardware (e.g., an application specific integrated circuit), or any combination thereof.
In one implementation, the processes and features described herein are implemented as a series of executable modules run by the computer system 2000, individually or collectively in a distributed computing environment. The foregoing modules may be realized by hardware, executable modules stored on a computer-readable medium (or machine-readable medium), or a combination of both. For example, the modules may comprise a plurality or series of instructions to be executed by a processor in a hardware system, such as the processor 2002. Initially, the series of instructions may be stored on a storage device, such as the mass storage 2018. However, the series of instructions can be stored on any suitable computer readable storage medium. Furthermore, the series of instructions need not be stored locally, and could be received from a remote storage device, such as a server on a network, via the network interface 2016. The instructions are copied from the storage device, such as the mass storage 2018, into the system memory 2014 and then accessed and executed by the processor 2002. In various implementations, a module or modules can be executed by a processor or multiple processors in one or multiple locations, such as multiple servers in a parallel processing environment.
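As an illustrative sketch only (not part of the claimed subject matter), the load-then-execute sequence described above — a series of instructions read from a storage device into system memory and then executed by a processor — can be approximated in Python using the standard library's importlib machinery. The module name and source below are hypothetical stand-ins for an executable module stored on the mass storage 2018.

```python
import importlib.util
import os
import tempfile

# Write a hypothetical executable module to "mass storage" (here, a temp file).
source = "def run():\n    return 'module executed'\n"
fd, path = tempfile.mkstemp(suffix=".py")
with os.fdopen(fd, "w") as f:
    f.write(source)

# Copy the instructions from the storage device into memory and bind them to a
# module object, mirroring the load-then-execute sequence described above.
spec = importlib.util.spec_from_file_location("example_module", path)
module = importlib.util.module_from_spec(spec)
spec.loader.exec_module(module)

# The processor executes the loaded instructions.
print(module.run())  # → module executed
os.remove(path)
```

The same pattern generalizes to the remote case described above: the source need not come from a local file and could instead be fetched over a network interface before being executed from memory.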
Examples of computer-readable media include, but are not limited to, recordable type media such as volatile and non-volatile memory devices; solid state memories; floppy and other removable disks; hard disk drives; magnetic media; optical disks (e.g., Compact Disk Read-Only Memory (CD-ROMs), Digital Versatile Disks (DVDs)); other similar non-transitory (or transitory), tangible (or non-tangible) storage medium; or any type of medium suitable for storing, encoding, or carrying a series of instructions for execution by the computer system 2000 to perform any one or more of the processes and features described herein.
For purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the description. It will be apparent, however, to one skilled in the art that embodiments of the disclosure can be practiced without these specific details. In some instances, modules, structures, processes, features, and devices are shown in block diagram form in order to avoid obscuring the description. In other instances, functional block diagrams and flow diagrams are shown to represent data and logic flows. The components of block diagrams and flow diagrams (e.g., modules, blocks, structures, devices, features, etc.) may be variously combined, separated, removed, reordered, and replaced in a manner other than as expressly described and depicted herein.
Reference in this specification to “one embodiment”, “an embodiment”, “other embodiments”, “one series of embodiments”, “some embodiments”, “various embodiments”, or the like means that a particular feature, design, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the disclosure. The appearances of, for example, the phrase “in one embodiment” or “in an embodiment” in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Moreover, whether or not there is express reference to an “embodiment” or the like, various features are described, which may be variously combined and included in some embodiments, but also variously omitted in other embodiments. Similarly, various features are described that may be preferences or requirements for some embodiments, but not other embodiments.
The language used herein has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the invention be limited not by this detailed description, but rather by any claims that issue on an application based hereon. Accordingly, the disclosure of the embodiments of the invention is intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the following claims.