BACKGROUND

Software packages traditionally are sold as bundles of functions and/or types of functions that software or application developers believe will be useful and/or popular with users. These software packages typically include a hard-coded user interface (“UI”) that provides a carefully tailored user experience (“UX”) to users of the software packages. The UI, and consequently the UX, of a software package often distinguishes one software package from another, as the underlying functionality of two or more software packages may be similar.
More particularly, software packages often are directed to providing a particular type of functionality. At any particular time, there may exist a number of software packages directed to a particular type of task or functionality, where the software packages may be provided by any number of developers. As such, the UIs and UXs associated with the software packages may vary widely, but the underlying functionality of the two software packages may have many similarities. For example, a particular function may be available in two or more software packages. UI controls associated with the particular function, however, may be located in different places in respective UIs, may have varied appearances, and/or otherwise may be varied among the various software packages.
In a task-based computing environment, tasks associated with one or more functions of a software package are provided to users via an appropriate interface. Software developers do not control the UI providing the tasks to the users. As such, the UX can vary widely, and it may be difficult for users to access certain tasks through a UI presented to the users. Because there may be no underlying data stored at a device until task execution is complete, interrupted tasks must be renewed and completed in their entireties.
It is with respect to these and other considerations that the disclosure made herein is presented.
SUMMARY

Concepts and technologies are described herein for interacting with contextual and task-focused computing environments. According to some embodiments of the concepts and technologies disclosed herein, a discovery engine collects application data that indicates functionality provided by applications. The discovery engine is configured to identify tasks, corresponding to particular functionality of the applications, that can be provided individually to users on-demand and/or in batches of tasks. In some embodiments, the applications are configured to declare tasks provided by the applications, which can allow the tasks to be exposed to users in a more streamlined manner.
UIs can be customized based upon the tasks identified as being relevant to activities occurring at a client device. The UIs can include one or more tasks and workflows corresponding to batches of tasks. The workflows can be executed via the client devices and can be interrupted during execution. When interrupted, the workflows are stored with data indicating progress in the workflow execution, contextual information associated with the device that initiated the workflow execution, a UI used to access the workflow, and other information. This information, referred to herein as the “workflow,” can be stored and/or shared with other users. When execution of the workflow is resumed, the same UI can be provided, or a different UI can be generated and provided if the device used to resume execution of the workflow, and/or other contextual data associated with the device, differs from the stored contextual data. Thus, multiple devices and users can access workflows in parallel to provide collaborative task execution. Also, users can begin, interrupt, and resume execution of one or more workflows, if desired.
According to one aspect, application data corresponding to applications and/or software is generated. The application data is provided to or retrieved by the discovery engine. The discovery engine analyzes the application data to identify functionality provided by the applications. The discovery engine also generates, organizes, categorizes, and stores task data that describes and identifies tasks associated with the applications, the tasks corresponding to the identified functionality of the applications. The task data is stored in a data store such as a database or server that is accessible to a task engine.
According to another aspect, the task engine obtains contextual data indicating activities at one or more client devices. Based upon the contextual data, the task engine searches or queries the task data to identify tasks that are expected to be relevant to the one or more client devices. The relevancy of the tasks can be determined based upon activities occurring at the client devices, files accessed at the client devices, activity history associated with the client devices, interactions between the client devices, and/or the like. The task engine also can obtain or access social networking data associated with a user of the client device. The social networking data can be used in addition to, or instead of, the contextual data to identify tasks that are believed to be relevant to the user of the client device based upon usage, comment, review, or rating by members of the user's social networks.
According to another aspect, the relevant tasks are identified by the task engine, and packaged for presentation to or use by the client device. The task engine is configured to generate a UI for interacting with the tasks and/or workflows corresponding to batches of the tasks, and to provide the UIs for consumption at a device. The task engine also is configured to determine a ranking and/or advertising scheme for the tasks and/or workflows based upon usage history associated with a client device, popularity of the tasks and/or workflows, advertising fees paid by vendors associated with the tasks, usage of the tasks by social network members, numbers of explicit searches for the tasks, other search or usage history of entities that have accessed the tasks, and the like. UI controls for accessing the determined tasks and/or workflows can be provided to the client device in a determined format. Metrics associated with the tasks, workflows, and UIs can be tracked and provided to one or more vendors associated with the tasks, if desired, and/or used for other purposes.
According to various embodiments, the client device is configured to execute a web-based operating system (OS). Thus, the client device may execute an operating system or other base program that is configured to access web-based or other remotely-executed applications and services to provide specific functionality at the client device. The client device therefore may provide various applications and services via a simple operating system or an application comparable to a standard web browser. It should be understood that the client device can execute other web-based and non-web-based operating systems, as is explained in more detail below.
It should be appreciated that the above-described subject matter may be implemented as a computer-controlled apparatus, a computer process, a computing system, or as an article of manufacture such as a computer-readable storage medium. These and various other features will be apparent from a reading of the following Detailed Description and a review of the associated drawings.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended that this Summary be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a system diagram illustrating an exemplary operating environment for the various embodiments disclosed herein.
FIGS. 2A-2C are user interface diagrams showing aspects of exemplary user interfaces for interacting with contextual and task-focused computing environments, according to various embodiments.
FIG. 3 is a flow diagram showing aspects of a method for providing tasks to a client, according to an exemplary embodiment.
FIG. 4 is a flow diagram showing aspects of another method for providing tasks to a client, according to an exemplary embodiment.
FIG. 5 is a flow diagram showing aspects of a method for continuing execution of a workflow, according to an exemplary embodiment.
FIG. 6 is a computer architecture diagram illustrating an exemplary computer hardware and software architecture for a computing system capable of implementing aspects of the embodiments presented herein.
DETAILED DESCRIPTION

The following detailed description is directed to concepts and technologies for interacting with contextual and task-focused computing environments. One or more tasks associated with applications are identified, and task data describing the tasks is stored in a searchable format and location. A task engine can search the task data to identify tasks and/or batches of tasks relevant to activities occurring at a client. The task engine also can be configured to customize UIs based upon the tasks identified as being relevant to activities occurring at a client device. The UIs can include one or more tasks and workflows corresponding to batches of tasks.
The workflows can be executed via the client devices and can be interrupted during execution. When interrupted, the workflows are stored with data indicating progress in the workflow execution, contextual information associated with the device that initiated the workflow execution, a UI used to access the workflow, and other information. The workflow can be stored and/or shared with other users. When execution of the workflow is resumed, the same UI can be provided, or a different UI can be generated and provided if the device used to resume execution of the workflow, and/or other contextual data associated with the device, differs from the stored contextual data. Thus, multiple devices and users can access workflows in parallel to provide collaborative task execution. Also, users can begin, interrupt, and resume execution of one or more workflows, if desired.
The word “application,” and variants thereof, is used herein to refer to computer-executable files for providing functionality to a user. According to various embodiments, the applications can be executed by a device, for example a computer, smartphone, or the like. Additionally, the computer, smartphone, or other device can execute a web browser or operating system that is configured to access remotely-executed applications and/or services such as web-based and/or other remotely-executed applications, web pages, social networking services, and the like. In some embodiments, the applications, web pages, and/or social networking services are provided by a combination of remote and local execution, for example, by execution of JavaScript, DHTML, AJAX, .ASP, and the like. According to other embodiments, the applications include runtime applications built to access remote or local data. These runtime applications can be built using the SILVERLIGHT family of products from Microsoft Corporation in Redmond, Wash., the AIR and FLASH families of products from Adobe Systems Incorporated of San Jose, Calif., and/or other products and technologies.
The word “tasks,” and variants thereof, is used herein to refer to a function and/or a set, subset, or category of functionality associated with an application, routine, or software package. Thus, an application can include any number of tasks, wherein the tasks define individual functions of the applications and/or types, sets, or subsets of the functions associated with the applications. For example, the tasks can include particular features of applications such as a task for playback of an audio file in the case of a media playback application. Similarly, the tasks can include multiple features associated with the applications such as macros and/or other automated tasks associated with an application. These examples are illustrative, and should not be construed as being limiting in any way.
While the subject matter described herein is presented in the general context of program modules that execute in conjunction with the execution of an operating system and application programs on a computer system, those skilled in the art will recognize that other implementations may be performed in combination with other types of program modules. Generally, program modules include routines, programs, components, data structures, and other types of structures that perform particular tasks or implement particular abstract data types. Moreover, those skilled in the art will appreciate that the subject matter described herein may be practiced with other computer system configurations, including hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers, and the like.
In the following detailed description, references are made to the accompanying drawings that form a part hereof, and in which are shown by way of illustration specific embodiments or examples. Referring now to the drawings, in which like numerals represent like elements throughout the several figures, aspects of a computing system, computer-readable storage medium, and computer-implemented methodology for interacting with contextual and task-focused computing environments will be presented.
Referring now to FIG. 1, aspects of one operating environment 100 for the various embodiments presented herein will be described. The operating environment 100 shown in FIG. 1 includes a server computer 102 operating on or in communication with a network 104. According to various embodiments, the functionality of the server computer 102 is provided by a web server operating on or in communication with the Internet, though this is not necessarily the case.
The server computer 102 is configured to execute an application 106 for providing functionality associated with the server computer 102. According to various embodiments, the application 106 provides a mapping application for providing maps, navigation instructions, location based services, and the like. The application 106 also can provide multimedia functionality such as, for example, video and audio streaming, video and audio playback functionality, and the like. The application 106 also can provide tools such as photo, video, and audio editing and creation applications, word processing functionality, data backup and storage functionality, calendaring applications, messaging applications such as email, text messaging, instant messaging, and realtime messaging applications, shopping applications, search applications, and the like. The application 106 also can provide rating and/or review applications, games, and the like. The above lists are not exhaustive, as the application 106 can provide any functionality associated with the server computer 102. While the embodiments described herein include applications 106 executing on server computers 102, it should be understood that client-centric approaches are also possible, wherein client devices execute applications that access data and/or applications hosted by the server computers 102, as described in more detail below. Furthermore, it should be understood that the applications 106 can be executed on the server computers 102 in part and on client devices in part. Thus, the above examples are exemplary and should not be construed as being limiting in any way.
The operating environment 100 further includes a discovery engine 108 operating on or in communication with the network 104. The discovery engine 108 can include a combination of hardware and software for discovering applications such as the application 106, and identifying one or more tasks provided by the applications. In some embodiments, the discovery engine 108 identifies or receives application data 110 corresponding to the application 106.
The application data 110 describes the application 106 and/or functionality associated therewith. The application data 110 can be generated by the application 106, for example via computer executable instructions that, when executed by the server computer 102, cause the server computer 102 to self-describe the application 106 and provide or make available the application data 110. In other embodiments, the discovery engine 108 or other devices or software (such as search engines, not illustrated) identify and describe functionality associated with the server computer 102 and/or the application 106. The application data 110 corresponds, in some embodiments, to metadata describing the application 106 and/or functionality associated therewith.
In some embodiments, the discovery engine 108 analyzes the application data 110 and identifies one or more tasks provided by the application 106, as defined or described by the application data 110. The tasks describe particular functionality of the application 106. For example, if the application 106 provides photo editing functionality, the tasks provided by the application 106 can include, but are not limited to, red-eye removal tasks, color balancing tasks, special effects tasks, sharpness adjustment tasks, blemish removal tasks, image sizing and cropping tasks, blurring tasks, text editing tasks, contrast, hue, and brightness adjustment tasks, other tasks, combinations thereof, and the like. It should be understood that this embodiment is exemplary, and should not be construed as being limiting in any way.
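For illustration only, and not as part of the disclosed embodiments, the following Python sketch shows one way application data 110 describing a photo-editing application might be turned into discrete task records. The field names ("app_id", "functions", and so on) and the "contoso-photo" identifier are assumptions made for the example, not a schema defined by the disclosure.

```python
# Minimal sketch: deriving task records from self-described application data.
# All field names here are illustrative assumptions.

def derive_tasks(application_data):
    """Turn one application's metadata into a list of task records."""
    tasks = []
    for function in application_data.get("functions", []):
        tasks.append({
            "task_id": f'{application_data["app_id"]}:{function["name"]}',
            "name": function["name"],
            "category": function.get("category", "uncategorized"),
            "keywords": function.get("keywords", []),
            "accepts": function.get("input_types", []),   # e.g. ["image/jpeg"]
            "source_app": application_data["app_id"],
        })
    return tasks

# Example: a photo-editing application that self-describes two tasks.
photo_app = {
    "app_id": "contoso-photo",
    "functions": [
        {"name": "remove-red-eye", "category": "image-editing",
         "keywords": ["red-eye", "photo"], "input_types": ["image/jpeg", "image/png"]},
        {"name": "crop", "category": "image-editing",
         "keywords": ["crop", "resize"], "input_types": ["image/jpeg", "image/png"]},
    ],
}

if __name__ == "__main__":
    for task in derive_tasks(photo_app):
        print(task["task_id"], task["category"])
```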
The discovery engine 108 can generate data identifying the tasks associated with an application 106 and store the data as task data 112. In some embodiments, the discovery engine 108 also is configured to organize and categorize the task data 112 according to the tasks described by the task data 112. In the above example of an application 106 for photo editing, the discovery engine 108 can create a category of image editing tasks, wherein the image editing tasks correspond not only to the application 106, but also to other applications 106 provided by any number of server computers 102 or other devices. The discovery engine 108 can categorize and/or organize photo editing tasks for the applications 106 into an image editing category, for example. The discovery engine 108 also can be configured to store the task data 112 corresponding to the catalogued, categorized, and organized tasks for applications 106 at a data storage location such as the data store 114.
In some embodiments, application or task developers publish the task data 112 with the applications 106. For example, the developers can generate text descriptions and/or metadata describing the tasks, input or types of input recognized by the tasks, output or types of output generated by the tasks, keywords, limitations and/or capabilities of the tasks, and the like. Additionally, the applications 106 can be configured by developers to self-declare tasks. Thus, the application data 110 can be generated by the applications 106 without analysis or data farming by the discovery engine 108 or other devices or software. The task data 112 can be stored in a searchable format, if desired, such as extensible markup language (“XML”), text, and other formats. The task data 112 can be queried by devices to identify tasks based upon search query terms.
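Because the task data 112 can be stored in a searchable XML format, a hedged sketch of serializing one task record to XML with the Python standard library is given below. The element and attribute names are assumptions chosen for the example; the disclosure does not prescribe a particular schema.

```python
# Sketch of storing a task record as searchable XML (element names assumed).
import xml.etree.ElementTree as ET

def task_to_xml(task):
    root = ET.Element("task", id=task["task_id"], category=task["category"])
    ET.SubElement(root, "name").text = task["name"]
    keywords = ET.SubElement(root, "keywords")
    for kw in task["keywords"]:
        ET.SubElement(keywords, "keyword").text = kw
    accepts = ET.SubElement(root, "accepts")
    for mime in task["accepts"]:
        ET.SubElement(accepts, "type").text = mime
    return ET.tostring(root, encoding="unicode")

example_task = {
    "task_id": "contoso-photo:crop",
    "name": "crop",
    "category": "image-editing",
    "keywords": ["crop", "resize"],
    "accepts": ["image/jpeg", "image/png"],
}

print(task_to_xml(example_task))
```

A record in this shape can be indexed by keyword, category, or accepted content type, which is all the later relevancy queries need.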
The functionality of the data store 114 can be provided by one or more databases, memory devices, server computers, desktop computers, mobile telephones, laptop computers, other computing systems, and the like. In the illustrated embodiments, the functionality of the data store 114 is provided by a database operating in communication with the network 104. In these embodiments, the data store 114 is configured to receive and respond to queries of the task data 112 by devices configured to communicate with the network 104. It should be understood that these embodiments are exemplary.
The operating environment 100 includes a social networking server 116 (“SN server”) operating on or in communication with the network 104. The SN server 116 executes a social networking application 118 (“SN application”) to provide social networking services. Exemplary social networking services include, but are not limited to, the MYSPACE social networking service, the FOURSQUARE geographic networking service, the TWITTER realtime messaging service, the FACEBOOK social networking service, the LINKEDIN professional networking service, the YAMMER office colleague networking service, and the like. In other embodiments, social networking functionality is provided by other services, sites, and/or providers that are not explicitly known as social networking providers. For example, some web sites allow users to interact with one another via email, commenting, ratings and reviews, realtime messages, chat services, gameplay, and/or other means, without explicitly supporting “social networking services.” Examples of such services include, but are not limited to, the WINDOWS LIVE service from Microsoft Corporation in Redmond, Wash., among others. Therefore, it should be appreciated that the above list of social networking services is not exhaustive, as numerous social networking services are not mentioned herein.
The SN application 118 generates social networking data 120 (“SN data”) associated with one or more users. The SN data 120 describes, for example, social networking graphs associated with users, user content such as status updates, photographs, reviews, links, and the like, contact and biographical information associated with users, and the like. The SN data 120 can include, for example, information describing applications or tasks accessed by users of the social networking service, links and status updates relating to applications and tasks, combinations thereof, and the like. The SN data 120 also can include other information such as likes and dislikes, user comments, user connection requests, and the like.
The operating environment 100 also includes a task engine 122 operating on or in communication with the network 104. The task engine 122 is configured to search for, identify, and provide tasks based upon one or more inputs. In some embodiments, the task engine 122 executes a search application 124 for searching the task data 112 for tasks relevant to a client 128 operating in communication with the task engine 122. According to various embodiments, the search application 124 bases searches of the task data 112, at least in part, upon contextual data 126 associated with the client 128.
According to various embodiments, the client 128 is a personal computer (“PC”) such as a desktop, tablet, or laptop computer system. The client 128 may include other types of computing systems including, but not limited to, server computers, handheld computers, netbook computers, embedded computer systems, personal digital assistants, mobile telephones, smart phones, or other computing devices. Although connections between the client 128 and the network 104 are not illustrated in FIG. 1, it should be understood that the client 128 can communicate with the task engine 122 via the network 104. Furthermore, while only one client 128 is illustrated in FIG. 1, it should be understood that a user or combination of users may communicate with the task engine 122 using two or more clients 128.
The client 128 is configured to execute an operating system 130. According to various embodiments, the operating system 130 executed by the client 128 is a web-based operating system. In some embodiments, the client 128 is not configured or equipped to execute traditional native applications and/or programs at the client-side, and instead accesses remotely-executed applications such as web applications and/or other remote applications, and renders the application data for presentation at the client 128. In still other embodiments, the client 128 is configured to access remotely-executed applications and to execute some local code such as scripts, local searches, and the like. As such, the client 128 can be configured to access or utilize cloud-based, web-based, and/or other remotely executed applications, and to render data associated with the applications at the client 128.
In some embodiments, the client 128 is further configured to execute application programs 132. The application programs 132 can include a web browser or web-based operating system that is configured to access web-based or runtime applications, and to render the data generated by the web-based or runtime applications for use at the client 128. Thus, the application programs 132 can include one or more programs for accessing and rendering web pages, accessing applications, rendering data associated with the applications, accessing services, rendering data associated with the services, combinations thereof, and the like. In some embodiments, the client 128 also is configured to execute stand-alone or runtime applications that are configured to access web-based or remote applications via public or private application programming interfaces (“APIs”). Therefore, while the word “application” and variants thereof is used extensively herein, it should be understood that the applications can include locally-executed and/or remotely-executed applications.
The contextual data 126 describes contextual information associated with the client 128. The contextual data 126 identifies one or more applications 106 being accessed by the client 128, one or more application programs 132 being accessed or executed by the client 128, and/or content data 134 describing data being accessed, edited, created, saved, or otherwise worked with or processed by the OS 130, the applications 106, and/or the application programs 132. The content data 134 can describe documents, audio files, video files, web pages, programs, scripts, images, social networking content, spreadsheets, applications 106, other files and software, combinations thereof, and the like. Thus, the content data 134 can indicate usage or access of one or more web-based or other remotely-executed applications by the client 128, and what type of data is being processed by the one or more web-based or other remotely-executed applications. The content data 134 can be surfaced or provided to the task engine 122 as part of the contextual data 126.
The contextual data 126 also can describe one or more actions taken entirely at the client 128. For example, the contextual data 126 may indicate movement of a cursor or pointer at the client 128, alphanumeric text input at the client 128, clicking at a particular location or region of a display associated with the client 128, and/or other movements or inputs associated with the client 128. These and other inputs can prompt, for example, local execution of scripts and/or code at the client 128. These actions and other actions can be captured by the contextual data 126 and passed to the task engine 122. In some embodiments, these and other actions are mediated by a remote or local application, relative to the client 128, and therefore may be captured by the contextual data 126 not only as particular actions, but additionally, or alternatively, as specific invocation of particular functionality associated with the remote or local application, script, or code execution.
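Purely to make the preceding description concrete, a contextual-data payload carrying the kinds of signals just listed might look like the following sketch. Every key, and the "client-128-example" identifier, is a hypothetical placeholder rather than a format defined by the disclosure.

```python
# Illustrative shape of a contextual-data payload sent to the task engine 122.
import json
import time

contextual_data = {
    "client_id": "client-128-example",          # hypothetical identifier
    "timestamp": time.time(),
    "active_application": "web-photo-viewer",   # application being accessed
    "content": {                                # content data 134
        "type": "image/jpeg",
        "name": "vacation-photo.jpg",
        "size_bytes": 2_457_600,
    },
    "recent_actions": ["open-file", "zoom-in"], # local actions at the client
    "device": {"form_factor": "smartphone", "os": "web-based"},
}

print(json.dumps(contextual_data, indent=2))
```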
The search application 124 is configured to use the contextual data 126 to search the task data 112 for tasks that are expected to be relevant to the client 128 based upon the contextual data 126. As mentioned above, the search application 124 can be configured to query the task data 112, though other methods of searching content including the task data 112 can be used. In an exemplary embodiment, if the contextual data 126 indicates that the client 128 is accessing an image, the search application 124 queries the task data 112 to identify tasks related to images and/or tasks users often use when accessing images such as viewing tasks, editing tasks, printing tasks, and the like. Similarly, if the contextual data 126 indicates that the client 128 is accessing an audio file, the search application 124 can query the task data 112 to identify tasks related to audio files such as, for example, recording tasks, editing tasks, conversion tasks, audio processing tasks, and the like. These examples are illustrative, and should not be construed as being limiting in any way.
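One way such a relevancy query could be realized, offered only as an assumption-laden sketch and not as the claimed implementation, is a simple filter that matches the content type and recent actions in the contextual data against each task record's accepted types and keywords.

```python
# Sketch of a relevancy query over task records. Purely illustrative.

def find_relevant_tasks(task_records, contextual_data):
    content_type = contextual_data.get("content", {}).get("type", "")
    hints = set(contextual_data.get("recent_actions", []))
    relevant = []
    for task in task_records:
        # Match on the broad media class (image, audio, ...) or on keywords.
        type_match = any(content_type.startswith(t.split("/")[0]) for t in task["accepts"])
        keyword_match = bool(hints & set(task["keywords"]))
        if type_match or keyword_match:
            relevant.append(task)
    return relevant

tasks = [
    {"task_id": "contoso-photo:crop", "accepts": ["image/jpeg"], "keywords": ["crop"]},
    {"task_id": "contoso-audio:trim", "accepts": ["audio/mpeg"], "keywords": ["trim"]},
]
context = {"content": {"type": "image/jpeg"}, "recent_actions": ["open-file"]}

print([t["task_id"] for t in find_relevant_tasks(tasks, context)])
# -> ['contoso-photo:crop']
```

A production system would more likely push this matching into the data store 114 as an indexed query, but the filtering idea is the same.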
Although search engines are not illustrated in FIG. 1, search engines also can be used to generate or supplement the contextual data 126 with search histories, preferences associated with users, and the like. Thus, in addition to revealing activities associated with the client 128, the contextual data 126 can indicate activity associated with the client 128 over some time period, for example, during a session, day, week, month, year, and the like. Thus, the contextual data 126 can relate to some, none, or all interactions at the client 128 including web searches, application or task usage, email messaging usage, map usage, and the like.
In some embodiments, the search application 124 receives or retrieves the SN data 120 in addition to, or instead of, the contextual data 126. Additionally, or alternatively, search engines can retrieve the SN data 120 and supplement the contextual data 126 with the SN data 120. The search application 124 can use the SN data 120 to identify tasks or applications 106 used, consumed, reviewed, posted, commented on, or otherwise referenced by one or more members of a social network associated with a particular user, for example, a user associated with the client 128. Thus, the search application 124 can query the task data 112 to identify tasks based not only upon the contextual data 126 associated with the client 128, but also based upon one or more social networks corresponding to a user of the client 128.
In response to searches or queries of the task data 112, the task engine 122 can receive relevant task data 136 identifying tasks that are expected to be relevant to the client 128 based upon explicit search terms, the contextual data 126, and/or the SN data 120. The relevant task data 136 can identify the tasks or applications by one or more addresses, names, applications, categories, functionality descriptions, and the like. In some embodiments, application tasks are identified by one or more Internet protocol (“IP”) addresses associated with server computers 102 hosting the applications 106 with which the tasks are associated, one or more uniform resource locator (“URL”) addresses associated with the applications 106 associated with the application tasks, and/or other information for identifying the identity and/or location of the tasks.
In some embodiments, the task engine 122 includes an interface module 138. The interface module 138 is configured to determine how to present the tasks identified by the relevant task data 136 to the client 128. For example, the interface module 138 is configured to determine a layout for one or more user interfaces (“UIs”) for presenting tasks at the client 128 or other devices. The interface module 138 determines, for example, how to arrange the tasks with respect to the one or more UIs, how to group tasks, what type of UI should be presented, and the like. The type of UI determined by the interface module 138 can include, but is not limited to, one or more of a mobile UI such as a smart phone web browser UI or smart phone web-based OS UI, a desktop PC web browser UI or a desktop PC web-based OS UI, a UI associated with the application 106, a tablet UI, other UIs, combinations thereof, and the like.
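For illustration, choosing among those UI types could reduce to a lookup keyed on the client's form factor, as in the hedged sketch below; the template names are placeholders invented for the example.

```python
# Sketch: selecting a UI template from the client's form factor.
# Template names are illustrative placeholders, not defined by the disclosure.

UI_TEMPLATES = {
    "smartphone": "mobile-web-os-menu",
    "tablet": "tablet-panel",
    "desktop": "desktop-browser-sidebar",
}

def choose_ui_template(contextual_data):
    form_factor = contextual_data.get("device", {}).get("form_factor", "desktop")
    return UI_TEMPLATES.get(form_factor, UI_TEMPLATES["desktop"])

print(choose_ui_template({"device": {"form_factor": "smartphone"}}))
# -> mobile-web-os-menu
```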
The interface module 138 also can determine how to rank tasks and/or arrangements or orders in which to present the ranked tasks. The interface module 138 also can identify advertising for presentation with the UIs, as well as arrangement, ranking, layout, and/or other options for the advertising and/or other aspects of the UIs. In some embodiments, the interface module 138, the search application 124, and/or a combination thereof, identify, configure, provide data for presenting, and manage one or more workflows 140.
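A ranking of this kind might, for example, combine the signals named elsewhere in this disclosure (usage history, popularity, social usage, advertising fees) into a single score. The sketch below is only one possible heuristic; the weights are arbitrary assumptions chosen to illustrate the idea.

```python
# Sketch of a task-ranking heuristic. Weights are arbitrary assumptions.

def rank_tasks(candidates):
    def score(task):
        return (2.0 * task.get("user_usage_count", 0)
                + 1.0 * task.get("popularity", 0)
                + 1.5 * task.get("social_usage", 0)
                + 3.0 * task.get("advertising_fee", 0.0))
    return sorted(candidates, key=score, reverse=True)

candidates = [
    {"task_id": "crop-a", "popularity": 40, "advertising_fee": 0.0},
    {"task_id": "crop-b", "popularity": 25, "advertising_fee": 5.0, "social_usage": 3},
]
print([t["task_id"] for t in rank_tasks(candidates)])
# -> ['crop-b', 'crop-a']
```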
The workflows 140 correspond to batches of tasks and/or data associated with batches of tasks and/or related tasks. The workflows 140 can include data identifying a batch of tasks, contextual data 126 associated with the workflow 140, a user or client 128 associated with the workflow 140, and/or other information associated with the workflow 140. One exemplary workflow 140 includes a trip planner workflow 140. A trip planner workflow 140 can include one or more tasks for searching for and booking flights or other transportation, tasks for searching for and reserving hotel or other accommodations, tasks for searching for restaurants and reviews associated with the restaurants, tasks for making reservations at restaurants, tasks for searching for and booking rental cars, and/or other travel-related tasks. According to embodiments, the workflows 140 are tailored by users to include desired tasks, tailored by the search application 124 and/or the interface module 138, saved by users, and/or otherwise configured. The workflows 140 can be interrupted or shared by users. Users can thus share workflows 140 to enable simultaneous interaction with the workflows 140 by one or more devices or users. The workflows 140 also can be interrupted and saved to allow users to interrupt execution of the workflows 140 and later resume execution of the workflows 140.
Grouping tasks in workflows 140 also can allow data used, consumed, submitted, or otherwise generated with respect to one task to be propagated to other tasks that are part of the same or a related workflow 140. Thus, for example, a user may access a workflow 140 to plan a trip or vacation. In the midst of executing the workflow 140, the user may change the travel dates. Instead of having to search flights, hotels, car rentals, restaurants, events, and the like, with the new dates, the workflow 140 can be used to propagate the changes through the other tasks, thereby allowing the user to avoid duplicative work. If the workflows 140 are interrupted, the task engine 122 can store contextual data 126 associated with a particular instance of the workflow 140 and/or other information such as data identifying an execution point in the workflow 140, a device being used to execute the workflow 140, and the like. Thus, when execution of the workflow 140 resumes, the task engine 122 can determine if the UI determined by the task engine 122 when execution of the workflow 140 was interrupted should be used for the remainder of the workflow 140 execution, or if a new UI should be presented to a device resuming execution of the workflow 140.
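The date-change example could be realized by keeping the shared parameters of a workflow in one place that all remaining tasks read from. The sketch below is an assumption-based illustration of that idea; the class name, task names, and dates are invented for the example.

```python
# Sketch of propagating a changed parameter (travel dates) through the
# remaining tasks of a trip-planner workflow 140. Names are illustrative.
from dataclasses import dataclass, field

@dataclass
class Workflow:
    workflow_id: str
    tasks: list                    # ordered task identifiers
    shared_params: dict = field(default_factory=dict)
    next_task_index: int = 0       # execution point, used on interruption/resumption

    def update_param(self, key, value):
        """Change a shared parameter once; every not-yet-run task sees the new value."""
        self.shared_params[key] = value

trip = Workflow(
    workflow_id="trip-001",
    tasks=["book-flight", "reserve-hotel", "reserve-restaurant", "rent-car"],
    shared_params={"travel_dates": ("2011-06-01", "2011-06-07")},
)

# The user changes dates mid-workflow; downstream tasks reuse shared_params
# instead of the user re-entering the dates task by task.
trip.update_param("travel_dates", ("2011-06-10", "2011-06-16"))
print(trip.shared_params["travel_dates"])
```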
In another embodiment, users can share workflows 140 with other users or devices. The task engine 122 can be configured to enable workflow 140 sharing. For example, the task engine 122 can be used to generate access codes or links that can be shared with other users or devices to access the workflow 140. Thus, one or more users and/or devices can access and execute a workflow 140 in parallel and/or in succession.
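As a hedged sketch of the access-code idea only, sharing could be as simple as mapping a random code to the workflow's identifier; the in-memory dictionary below stands in for whatever storage a real system would use.

```python
# Sketch: issuing a share code for a workflow 140 so another user or device
# can join its execution. In-memory storage is purely for illustration.
import secrets

_shared_workflows = {}

def share_workflow(workflow_id):
    code = secrets.token_urlsafe(8)      # short random access code
    _shared_workflows[code] = workflow_id
    return code

def open_shared_workflow(code):
    return _shared_workflows.get(code)   # None if the code is unknown

code = share_workflow("trip-001")
print(code, "->", open_shared_workflow(code))
```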
Although the search application 124 and the interface module 138 are illustrated as components of the task engine 122, it should be understood that each of these components, or combinations thereof, may be embodied as or in stand-alone devices or components thereof operating on or in communication with the network 104 and/or the client 128. Thus, the illustrated embodiment is exemplary, and should not be construed as being limiting in any way. Additionally, while FIG. 1 illustrates one server computer 102, one network 104, one discovery engine 108, one data store 114, one SN server 116, one task engine 122, and one client 128, it should be understood that some implementations of the operating environment 100 include multiple server computers 102, multiple networks 104, multiple discovery engines 108, multiple data stores 114, multiple SN servers 116, multiple task engines 122, and/or multiple clients 128. Thus, the illustrated embodiments should be understood as being exemplary, and should not be construed as being limiting in any way.
Turning now to FIG. 2A, a user interface diagram showing aspects of a user interface (UI) for presenting tasks at the client 128 in one embodiment will be described. In particular, FIG. 2A shows a screen display 200A generated by one or more of the operating system 130 and/or the application programs 132 executed by the client 128 according to one particular implementation presented herein. It should be appreciated that the UI diagram illustrated in FIG. 2A is exemplary. Furthermore, it should be understood that data corresponding to the UI diagram illustrated in FIG. 2A can be generated by the interface module 138, made available to or transmitted to the client 128, and rendered by the client 128, though this is not necessarily the case.
In the illustrated embodiment, the screen display 200A includes an application window 202A. In some implementations, the application window 202A is displayed on top of or behind other information (not illustrated) displayed on the screen display 200A. Additionally, or alternatively, the application window 202A can fill the screen display 200A and/or can be sized to fit a desired portion or percentage of the screen display 200A. It should be understood that the illustrated layout, proportions, and contents of the illustrated application window 202A are exemplary, and should not be construed as being limiting in any way.
The exemplary application window 202A corresponds to an application window for a web browser, though this example is illustrative. It should be understood that the application window 202A can correspond to an application window for other applications, including native applications such as the application programs 132, web applications, the applications 106, and/or interfaces displayed or rendered by the operating system 130. In the illustrated embodiment, the application window 202A is displaying content 204A. It should be understood that this embodiment is exemplary, and should not be construed as being limiting in any way.
The application window 202A also is displaying two menu areas 206 (“menus”). While the menus 206 are illustrated as areas on the right and left sides of the application window 202A, it should be understood that one or more menu areas can be displayed as floating windows in the application window 202A and/or as areas docked to one or more of a side, top, and/or bottom of the application window 202A, placed into a tool bar or status bar of the application window 202A, and the like. Furthermore, the illustrated size, shape, and configuration of the menus 206 is exemplary and should not be construed as being limiting in any way. In some embodiments, the menus 206 are superimposed in “front” of the content 204A. The menus 206 can be rendered when the application window 202A is rendered and/or can be rendered in response to a keystroke, gesture, voice command, touch command, and/or manipulation of another input device such as a mouse. The menus 206 can be configured to be only partially opaque, such that the content 204A and the menus 206 are simultaneously visible.
The menus 206 are configured to display UI controls for accessing tasks, workflows 140, and/or resources or links relating to the content 204A. In the illustrated embodiment, the content 204A corresponds to a news article. Thus, the menus 206 can include UI controls for various tasks and/or workflows 140 associated with the news article such as publishing tasks, printing tasks, research tasks, text extraction tasks, review tasks, and the like. As will be explained in more detail herein, the UI controls displayed in the menus 206 can be ordered based upon ranking and/or advertising schemes, wherein UI controls associated with tasks or workflows 140 are ordered based, at least in part, upon anticipated relevance, ranking or advertising programs, and the like. The screen display 200A also includes advertising areas 208 (“advertising”) for displaying advertising content. The advertising content can correspond to one or more tasks or workflows 140, if desired, and/or can be targeted advertising based upon the content 204A, one or more users associated with the client 128, and/or other considerations.
According to embodiments, the content, size, configuration, layout, and ordering of the UI controls and advertising in the menus 206 and advertising 208 are determined by the task engine 122 based upon the contextual data 126, the SN data 120, capabilities associated with the client 128, and/or other information. In some embodiments, the task engine 122 is configured to present certain tasks, workflows 140, and/or types of tasks and workflows 140 in designated locations on the screen display 200A. Thus, the interface module 138 can provide a consistent UX to users with regard to particular types of content such as the content 204A, regardless of one or more sources associated with the tasks and/or workflows 140.
Turning now to FIG. 2B, a user interface diagram showing aspects of a user interface (UI) for presenting tasks at the client 128 in another embodiment will be described. In particular, FIG. 2B shows a screen display 200B generated by one or more of the operating system 130 and/or the application programs 132 executed by the client 128 according to one particular implementation presented herein. It should be appreciated that the UI diagram illustrated in FIG. 2B is exemplary. As explained above with regard to FIG. 2A, it should be understood that data corresponding to the UI diagram illustrated in FIG. 2B can be generated by the interface module 138, made available to or transmitted to the client 128, and rendered by the client 128, though this is not necessarily the case.
The screen display 200B includes an application window 202B that can be sized according to various sizes, shapes, and configurations and is not limited to the illustrated content, size, or configuration. In the illustrated exemplary embodiment, the application window 202B includes the content 204B displayed in the application window 202A. The content 204B corresponds to an image. It should be understood that this embodiment is exemplary, and should not be construed as being limiting in any way.
As illustrated in FIG. 2B, the menu 206 is displayed in front of the content 204B. Display of the menu 206 can be triggered, for example, by keystrokes, mouse movements, mouse clicks, gestures, touch commands, voice commands, and/or other actions. The menu 206 includes UI controls 210A-210F, herein collectively referred to as the UI controls 210. Selection of the UI controls 210 can trigger the associated tasks or workflows 140. As illustrated in FIG. 2B, the UI controls 210 can correspond to various tasks and workflows 140 for editing images, though other tasks such as sharing tasks, printing tasks, commenting and reviewing tasks, and the like, are contemplated.
In the illustrated embodiment, the UI control 210C corresponds to a “fix redeye” task for removing redeye from photographs. In the illustrated embodiment, the UI control 210C is displayed in a format indicating that the task is not available or not relevant to the content 204B. It should be understood that in some embodiments the UI control 210C is replaced with a UI control for triggering an available task and that other methods of indicating that a task is unavailable or unnecessary are contemplated. It therefore can be appreciated from FIG. 2B that the task engine 122 can recognize from the contextual data 126 that some tasks and/or workflows 140 are not relevant to activity at the client 128 and/or other devices.
The screen display 200B also displays a secondary menu 212 corresponding to the menu 206. In the illustrated embodiment, the secondary menu 212 displays UI controls 214A-214C for triggering tasks or workflows 140. The UI controls 214A-214C correspond to three tasks or workflows for cropping images, and the display of the secondary menu 212 can be triggered by selection of the UI control 210A. The three tasks or workflows 140 corresponding to the UI controls 214A-214C can be substantially similar in terms of underlying functionality, but may be associated with three developers, suppliers, sites, interfaces, and the like. As shown, users may be charged for access to a task or workflow 140 corresponding to the UI control 214C. Thus, the UI control 214C can be displayed in a manner that conveys to the user that a fee must be paid to access the task or workflow 140. It should be understood that the illustrated scheme for identifying restricted tasks or workflows 140 can be varied depending upon any desired criteria. Thus, it should be understood that the illustrated embodiment is exemplary, and should not be construed as being limiting in any way.
As shown in FIG. 2B, the screen display 200B also can display advertising 208. In the illustrated embodiment, the secondary menu 212 includes the advertising 208. It should be understood that this embodiment is exemplary, and should not be construed as being limiting in any way. The advertising 208 can be displayed in any desired location on the screen display 200B. Additionally, the UI controls 214A-214C can include advertising if desired. Although not shown in FIG. 2B, an interface for paying for access to tasks or workflows 140 can be presented in response to receiving selection of the UI control 214C, if desired. It also should be understood that the UI controls 210A-F and the UI controls 214A-C can be selected, ordered, and arranged in accordance with various advertising and ranking schemes, as well as other considerations such as user search and usage history, SN data 120, combinations thereof, and the like.
Referring now to FIG. 2C, a user interface diagram showing aspects of a user interface (UI) for presenting tasks at the client 128 in another embodiment will be described. In particular, FIG. 2C shows a screen display 200C generated by one or more of the operating system 130 and/or the application programs 132 executed by the client 128 according to one embodiment. It should be appreciated that the UI diagram illustrated in FIG. 2C is exemplary. As explained above, the UI diagram illustrated in FIG. 2C can be generated by the interface module 138, made available to or transmitted to the client 128, and rendered by the client 128, though this is not necessarily the case.
In the embodiment illustrated in FIG. 2C, the screen display 200C includes an application window 202C that can be sized and configured to various sizes and layouts, and is not limited to the illustrated content, size, or configuration. The application window 202C includes content 204C. In the illustrated embodiment, the content 204C corresponds to output generated via execution of the application 106, wherein the application 106 provides a mapping application. In the illustrated embodiment, the content 204C illustrates a route between two points. It should be understood that this embodiment is exemplary, and should not be construed as being limiting in any way.
The screen display 200C can include any number of UI controls for accessing tasks and/or workflows 140 associated with the content 204C. In the illustrated exemplary embodiment, the screen display 200C includes a location tasks area 220 (“location tasks”), a trip planning tasks area 222 (“trip planning tasks”), and a social networking tasks area 224 (“SN tasks”). It should be understood that this embodiment is exemplary, and should not be construed as being limiting in any way.
The location tasks 220 can be configured by the interface module 138 and can include UI controls for accessing one or more tasks or workflows 140 relating to the location(s) corresponding to the map displayed in the content 204C. Exemplary tasks and workflows include, but are not limited to, tasks for searching the displayed area for businesses, tasks for generating product price reports for stores in the displayed area, tasks for identifying social networking connections within the displayed area, tasks for generating navigation directions, other tasks, and the like. The location tasks 220 also can include one or more workflows 140 such as a restaurant reservation workflow 140, which can include a batch of tasks such as a task for polling one or more connections for a type of cuisine and/or a dining time, tasks for reviewing ratings of restaurants matching the type of cuisine and/or dining time identified by the polling, tasks for making reservations, tasks for generating calendar reminders relating to the tasks, other tasks, and the like. It should be understood that this embodiment is exemplary, and should not be construed as being limiting in any way.
The trip planning tasks 222 can include tasks and/or workflows 140 relating to trip planning. Exemplary trip planning tasks include, but are not limited to, tasks for booking flights or other travel arrangements, tasks for reserving hotel rooms or other accommodations, tasks for identifying attractions, other tasks, and the like. Additionally, the trip planning tasks 222 can include one or more workflows 140, as explained above. It should be understood that these tasks are exemplary, and should not be construed as being limiting in any way.
The SN tasks 224 can include various tasks and/or workflows 140 relating to social networking services. The SN tasks 224 can include, for example, tasks for searching the area corresponding to the displayed area for social networking connections, tasks for publishing location information to a social networking service, tasks for searching for social networking or realtime messaging updates emanating from the displayed area, and the like. The SN tasks 224 also can include workflows 140 relating to social networking services. It should be understood that these tasks are exemplary, and should not be construed as being limiting in any way.
The above examples are merely illustrative of how UI controls corresponding to tasks and/or workflows 140 can be displayed for users, and should not be construed as being limiting in any way. Additional and/or alternative categories of tasks and/or workflows 140 can be displayed with respect to the content 204C and/or other content, if desired.
Turning now to FIG. 3, aspects of a method 300 for providing tasks will be described in detail. It should be understood that the operations of the methods disclosed herein are not necessarily presented in any particular order and that performance of some or all of the operations in an alternative order(s) is possible and is contemplated. The operations have been presented in the demonstrated order for ease of description and illustration. Operations may be added, omitted, and/or performed simultaneously, without departing from the scope of the appended claims.
It also should be understood that the illustrated methods can be ended at any time and need not be performed in their respective entireties. Some or all operations of the methods disclosed herein, and/or substantially equivalent operations, can be performed by execution of computer-readable instructions included on a computer-storage medium, as defined above. The term “computer-readable instructions,” and variants thereof, as used in the description and claims, is used expansively herein to include routines, applications, application modules, program modules, programs, components, data structures, algorithms, and the like. Computer-readable instructions can be implemented on various system configurations, including single-processor or multiprocessor systems, minicomputers, mainframe computers, personal computers, hand-held computing devices, microprocessor-based, programmable consumer electronics, combinations thereof, and the like.
Thus, it should be appreciated that the logical operations described herein are implemented (1) as a sequence of computer implemented acts or program modules running on a computing system and/or (2) as interconnected machine logic circuits or circuit modules within the computing system. The implementation is a matter of choice dependent on the performance and other requirements of the computing system. Accordingly, the logical operations described herein are referred to variously as states, operations, structural devices, acts, or modules. These operations, structural devices, acts, and modules may be implemented in software, in firmware, in special purpose digital logic, or any combination thereof.
For purposes of illustrating and describing the concepts of the present disclosure, the methods disclosed herein are described as being performed by the task engine 122. It should be understood that these embodiments are exemplary, and should not be viewed as being limiting in any way. Additional and/or alternative devices can provide the functionality described herein. The method 300 begins at operation 302, wherein the task engine 122 detects an interaction associated with the client 128. The interaction detected by the task engine 122 can include an interaction at the client 128, an interaction with one or more application programs 132 executing at the client 128, an interaction between the client 128 and one or more remotely executed or web-based applications such as the applications 106, and/or access or utilization of a web-based or other remotely executed application by the client 128.
It should be understood that the functionality of the task engine 122 can be provided by one or more of the application programs 132 executed by the client 128. Additionally, the functionality of the task engine 122 can be provided by the operating system 130 of the client 128 and/or by execution of one or more of the application programs 132 executing at the client 128. In other embodiments, the task engine 122 is in communication with the client 128 and detects interactions at the client 128. In any event, the task engine 122 can be configured to detect the interaction associated with the client 128.
From operation 302, the method 300 proceeds to operation 304, wherein the task engine 122 obtains the contextual data 126. As explained above with reference to FIG. 1, the contextual data 126 describes various aspects of one or more interaction(s) occurring at the client 128 such as one or more applications 106 or resources being accessed or utilized by the client 128, operations occurring at the client 128, the particular application programs 132 or types of application programs 132 executing at or being accessed by the client 128, content being used, consumed, or operated on by the client 128, combinations thereof, and the like. Thus, the contextual data 126 describes the types of interactions occurring at the client 128 and types of content being interacted with by the client 128.
From operation 304, the method 300 proceeds to operation 306, wherein the task engine 122 identifies one or more tasks that are relevant to the contextual data 126 associated with the client 128. As explained above, the task engine 122 searches or queries the task data 112 based upon the contextual data 126 to identify tasks that are relevant to activity associated with the client 128. For example, if the client 128 is interacting with a video file, the contextual data 126 may indicate this interaction, as well as file types associated with the video file and/or other information such as, for example, the size, resolution, length, frame rate, and the like, of the video file. Based, at least partially, upon this contextual data 126, the task engine 122 can identify tasks relevant to the client 128.
From operation 306, the method 300 proceeds to operation 308, wherein the task engine 122 configures a UI for interacting with the relevant tasks. As explained above with reference to FIGS. 1-2C, the task engine 122 populates menus and/or UI controls displayed on the menus, determines what tasks are or are not available, determines one or more ranking and/or advertising schemes for the UI controls, and configures the UI based upon these and/or other determinations. Also, the task engine 122 can configure and generate data for presenting one or more UI controls for accessing workflows 140. According to various embodiments, the task engine 122 configures the UIs based upon these and other determinations and makes data describing the UIs available to the client 128. From operation 308, the method 300 proceeds to operation 310. The method 300 ends at operation 310.
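To tie operations 302-308 together, the sketch below strings the four steps into one pass. It is a hedged, self-contained illustration with stand-in helpers and invented task identifiers, not the claimed implementation of the method 300.

```python
# Sketch of the method 300 flow: detect an interaction (302), gather contextual
# data (304), identify relevant tasks (306), and configure a UI (308).

def obtain_contextual_data(interaction):
    # Operation 304: describe what the client is doing (stand-in values).
    return {"content_type": interaction.get("content_type", ""),
            "form_factor": interaction.get("form_factor", "desktop")}

def identify_tasks(context, task_data):
    # Operation 306: keep tasks whose accepted types match the content's media class.
    return [t for t in task_data
            if any(context["content_type"].startswith(a.split("/")[0])
                   for a in t["accepts"])]

def configure_ui(context, tasks):
    # Operation 308: choose a template and order the task controls.
    template = "mobile-menu" if context["form_factor"] == "smartphone" else "sidebar"
    return {"template": template, "controls": [t["task_id"] for t in tasks]}

task_data = [{"task_id": "contoso-video:trim", "accepts": ["video/mp4"]},
             {"task_id": "contoso-photo:crop", "accepts": ["image/jpeg"]}]
interaction = {"content_type": "video/mp4", "form_factor": "smartphone"}  # operation 302

context = obtain_contextual_data(interaction)
print(configure_ui(context, identify_tasks(context, task_data)))
# -> {'template': 'mobile-menu', 'controls': ['contoso-video:trim']}
```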
Turning now to FIG. 4, aspects of another method 400 for providing tasks will be described in detail. The method 400 begins at operation 402, wherein the task engine 122 detects interactions with a workflow 140. As explained above, the workflows 140 can include data identifying bundles of tasks, as well as data indicating an execution point associated with the workflow 140, contextual data 126 associated with a device executing the workflow 140, other information, and the like. Although not illustrated in FIG. 4, it should be understood that the workflows 140 can be generated by users, application or task developers, the discovery engine 108, the search application 124, the task engine 122, and/or the interface module 138. Thus, it should be understood that the contents of the workflows 140 can be based upon one or more of user preferences, contextual data 126, histories associated with devices and/or users, SN data 120, developer information, other information, combinations thereof, and the like.
From operation 402, the method 400 proceeds to operation 404, wherein the task engine 122 determines if interactions with a workflow 140 have been interrupted. Execution of the workflows 140 can be interrupted for various reasons. For example, a device being used to execute a workflow 140 may lose network connectivity or power, a user may log out of a session via which interactions with the workflow 140 are controlled, a user may explicitly stop or pause execution of the workflow 140 to change devices or to share the execution of the workflow with other users or devices, and the like. In some embodiments, some tasks of a workflow 140 may not be executable on a smart phone being used to execute the workflow 140. Thus, when these tasks are encountered during execution of the workflow 140, the task engine 122 can inform the user and interrupt execution of the workflow 140. The workflows 140 can be interrupted for other reasons as well.
If the task engine 122 determines in operation 404 that execution of the workflow 140 has been interrupted, the method 400 proceeds to operation 406, wherein the task engine 122 stores workflow progress information and other information associated with execution of the workflow 140. The other information can include, but is not limited to, contextual data 126 associated with a device that began execution of the workflow 140, the next task in the workflow 140, a UI configured and presented to the user, other information, and the like. Thus, the task engine 122 can store various information allowing the workflow 140 to be resumed at another time, by another user, by another device, at another location, and the like. An exemplary method for resuming execution of the workflow 140 is described below with reference to FIG. 5.
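A non-limiting sketch of the storing of operation 406 might simply serialize the progress information alongside the captured context and UI, as shown below in Python; the JSON file used as a store and the store_workflow_state name are assumptions, and any persistent store could be used.

```python
import json


def store_workflow_state(workflow_id: str, execution_point: int,
                         context: dict, ui_description: dict, path: str) -> None:
    """Persist enough state to resume the workflow later, elsewhere, or by another user."""
    snapshot = {
        "workflow_id": workflow_id,
        "execution_point": execution_point,   # the next task in the workflow 140
        "context": context,                   # contextual data 126 of the device that began execution
        "ui_description": ui_description,     # the UI configured and presented to the user
    }
    with open(path, "w", encoding="utf-8") as handle:
        json.dump(snapshot, handle)


store_workflow_state("wf-1", 2, {"device_type": "desktop"}, {"menu": []}, "wf-1.json")
```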
If the task engine 122 determines in operation 404 that execution of the workflow 140 has not been interrupted, the method 400 proceeds to operation 408, wherein the task engine 122 determines if execution of the workflow 140 has completed. According to various implementations, the task engine 122 can determine if all tasks associated with the workflow 140 have been executed, whether execution of the workflow 140 has been explicitly ended by a user, and the like. If the task engine 122 determines in operation 408 that execution of the workflow 140 has not been completed, the method 400 returns to operation 404. In some embodiments, the operations 404-408 may be iterated until the workflow 140 is interrupted or completed. If the task engine 122 determines in operation 408 that execution of the workflow 140 has been completed, or from operation 406, the method 400 proceeds to operation 410. The method 400 ends at operation 410.
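The iteration of the operations 404-408 can be pictured as a monitoring loop that checks for interruption, stores state if interrupted, and otherwise checks for completion. The Python sketch below assumes hypothetical is_interrupted, is_completed, and store_state callables.

```python
import time


def monitor_workflow(is_interrupted, is_completed, store_state, poll_seconds: float = 1.0) -> str:
    """Iterate operations 404-408 until the workflow 140 is interrupted or completed."""
    while True:
        if is_interrupted():          # operation 404
            store_state()             # operation 406: store progress and other information
            return "interrupted"
        if is_completed():            # operation 408
            return "completed"
        time.sleep(poll_seconds)      # otherwise keep monitoring


# Example: a workflow that is never interrupted and completes immediately.
result = monitor_workflow(lambda: False, lambda: True, lambda: None)
```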
Turning now to FIG. 5, aspects of a method 500 for resuming execution of the workflow 140 will be described in detail. The method 500 begins at operation 502, wherein execution of the workflow 140 is resumed. As will be appreciated from the description herein of FIGS. 1-4, a workflow 140 may be resumed after being interrupted or shared. The task engine 122 can detect resumption of a workflow 140 by detecting a login associated with the workflow 140, by detecting submission of an access code or access of a link associated with the workflow 140, or by other means. In some embodiments, the workflow 140 is executed in parallel by two or more users. As such, the operation 502 includes not only resuming execution of interrupted workflows 140, but also accessing a workflow 140 via a shared access code or another method of accessing a shared workflow 140.
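As one non-limiting illustration, resumption might be detected by looking up a submitted access code against the stored workflows 140; the in-memory SHARED_WORKFLOWS registry in the Python sketch below is a hypothetical stand-in for whatever store holds the workflows 140.

```python
from typing import Dict, Optional

# Hypothetical registry of stored or shared workflows 140, keyed by access code.
SHARED_WORKFLOWS: Dict[str, dict] = {
    "ABC123": {"workflow_id": "wf-1", "execution_point": 2},
}


def detect_resumption(access_code: str) -> Optional[dict]:
    """Return the stored workflow associated with a submitted access code, if any."""
    return SHARED_WORKFLOWS.get(access_code)


resumed = detect_resumption("ABC123")   # two or more users may resolve the same code in parallel
```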
From operation 502, the method 500 proceeds to operation 504, wherein the task engine 122 determines if the contextual data 126 associated with the workflow 140 is the same as contextual data 126 associated with the device resuming execution of the workflow 140. As mentioned above, a user or device may interrupt execution of a workflow 140 being accessed with a first device, and resume execution of the workflow 140 with a second device. The task engine 122 can be configured to access contextual data 126 associated with a device resuming execution of the workflow 140, and to compare that data with contextual data 126 corresponding to a device that initiated execution of the workflow 140.
If the task engine 122 determines in operation 504 that the contextual data 126 associated with the device resuming execution of the workflow 140 is not the same as the contextual data 126 associated with the workflow 140, the method 500 proceeds to operation 506, wherein the task engine 122 generates a new UI for presentation to a user associated with the device resuming execution of the workflow 140. It should be understood that generating the UI in operation 506 can be, but is not necessarily, substantially similar to generating the UI in operation 308 described above with reference to FIG. 3.
If the task engine 122 determines in operation 504 that the contextual data 126 associated with the device resuming execution of the workflow 140 is the same as the contextual data 126 associated with the workflow 140, the method 500 proceeds to operation 508, wherein the task engine 122 recalls the UI associated with the workflow 140. From operation 506 and operation 508, the method 500 proceeds to operation 510, wherein execution of the workflow 140 is resumed. From operation 510, the method 500 proceeds to operation 512. The method 500 ends at operation 512.
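The operations 504-510 can be summarized as comparing the stored contextual data 126 to the resuming device's contextual data 126 and then either recalling the stored UI or configuring a new one. In the Python sketch below, equality of device type and content type stands in for “same context,” which is only one possible comparison; the choose_ui and generate_ui names are hypothetical.

```python
def choose_ui(stored_context: dict, resuming_context: dict,
              stored_ui: dict, generate_ui) -> dict:
    """Operations 504-510: recall the stored UI if the context matches, else generate a new one."""
    same_context = (
        stored_context.get("device_type") == resuming_context.get("device_type")
        and stored_context.get("content_type") == resuming_context.get("content_type")
    )
    if same_context:
        return stored_ui                      # operation 508: recall the UI
    return generate_ui(resuming_context)      # operation 506: configure a new UI (cf. operation 308)


ui = choose_ui(
    stored_context={"device_type": "desktop", "content_type": "video/mp4"},
    resuming_context={"device_type": "smart_phone", "content_type": "video/mp4"},
    stored_ui={"menu": ["full_editor"]},
    generate_ui=lambda ctx: {"menu": ["mobile_player"]},   # stand-in for the configuration of operation 308
)
```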
It should be understood that in accordance with the concepts and technologies disclosed herein, the source or brand of particular applications and/or tasks can be deemphasized, and the functionality associated with a particular task and/or application can be emphasized. In other words, UIs can be tailored based upon functionality associated with tasks and/or workflows 140, and not hard coded by application and/or task developers.
UIs can be configured based upon functionality and not necessarily the source of the applications, tasks, and/or workflows 140 providing the functionality. Thus, the location and/or configuration of UI controls in UIs can be based upon the underlying functionality, such as associated tasks and workflows 140, and therefore can be consistent even if a task associated with a particular source is unavailable. Thus, for example, a UI for viewing streaming video content may include a UI control for streaming a movie. Depending upon the movie searched for and/or selected for streaming, one or more sources of the movie may not be available. Thus, a UI control for streaming the video may be located in a consistent location on the UI, but may be associated with one or more sources of the task for streaming the content. This example is illustrative, and should not be construed as being limiting in any way.
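One non-limiting way to keep a control's placement tied to functionality rather than to a particular source is to key the UI on the task and resolve the available sources at render time, as in the Python sketch below; the TASK_SOURCES mapping, the provider names, and the slot name are hypothetical.

```python
# Hypothetical mapping from a functional task to candidate sources, resolved at render time.
TASK_SOURCES = {"stream_movie": ["provider_a", "provider_b", "provider_c"]}


def build_stream_control(title: str, available_sources: set) -> dict:
    """The control keeps the same UI slot regardless of which source can stream the title."""
    candidates = [s for s in TASK_SOURCES["stream_movie"] if s in available_sources]
    return {
        "slot": "primary_action",      # consistent location in the UI
        "task_id": "stream_movie",
        "label": f"Stream {title}",
        "sources": candidates,         # may shrink to one source, or none, for a given title
        "enabled": bool(candidates),
    }


control = build_stream_control("Example Movie", available_sources={"provider_b"})
```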
In some embodiments, application or task developers can specialize in tasks instead of, and/or in addition to, specializing in a particular application. For example, multiple applications exist for providing image editing. Some image editing applications include similar features, though some features may vary between the applications. Users may purchase one application over another based upon a subset of functionality. In accordance with the concepts and technologies disclosed herein, users can access tasks associated with multiple application developers. Thus, application developers can focus, if desired, upon tasks that set their services and/or products apart from those of other application developers.
In some embodiments, some tasks of workflows 140 can be automated and/or data associated with those tasks can be populated automatically. For example, a user may specify preferences, or the task engine 122 can determine those preferences over time, and those preferences can be used to drive data generation associated with and/or execution of the tasks. As such, users may specify preferences for almost any task, where those preferences are used to drive execution of the tasks and/or the data used to execute those tasks.
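A non-limiting sketch of such preference-driven population might merge stored user preferences into a task's parameters before execution, as shown below in Python; the preference keys and the populate_task_parameters name are invented for illustration.

```python
# Hypothetical preferences, either specified by the user or learned by the task engine 122 over time.
USER_PREFERENCES = {"video_quality": "1080p", "shipping": "two_day", "currency": "USD"}


def populate_task_parameters(parameters: dict) -> dict:
    """Fill unspecified task parameters from the user's stored preferences."""
    filled = dict(parameters)
    for key, value in USER_PREFERENCES.items():
        filled.setdefault(key, value)   # explicitly supplied parameters win over preferences
    return filled


params = populate_task_parameters({"title": "Example Movie", "video_quality": "720p"})
# -> the quality stays 720p; shipping and currency are populated automatically
```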
Some workflows 140 and/or tasks of workflows 140 can be configured to execute for long periods of time and/or until interruption. For example, a user may specify a workflow 140 for a product search, historical data generation and/or tracking, and the like, and the workflow 140 can execute until it expires and/or until it is interrupted by the user. In some embodiments, data associated with the workflows 140 can be made available to any other tasks and/or workflows 140 associated with the user, such that these data can be used to drive execution of other tasks or workflows 140 associated with the user.
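For illustration only, a long-running workflow might be modeled as a periodic check that repeats until an expiration time passes or the user interrupts it, as in the Python sketch below; the run_until_expired name, the product-search callable, and the expiry handling are assumptions.

```python
import time
from datetime import datetime, timedelta


def run_until_expired(check, expires_in: timedelta, interval_seconds: float = 3600.0) -> list:
    """Repeat a check (e.g. a product search) until the workflow expires or is interrupted."""
    deadline = datetime.now() + expires_in
    results = []
    try:
        while datetime.now() < deadline:
            results.append(check())        # e.g. record the current price of a tracked product
            time.sleep(interval_seconds)
    except KeyboardInterrupt:
        pass                               # the user interrupted the workflow
    return results                         # data that can drive other tasks or workflows


history = run_until_expired(lambda: {"price": 19.99}, expires_in=timedelta(seconds=0))
```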
In some embodiments, data corresponding to workflows 140 and/or task execution associated with a user's trusted social networking connections or other entities can be used for task or workflow execution associated with a user. Thus, for example, a user may associate a workflow 140 or task with a trusted entity, and data associated with the trusted entity can be used with respect to execution of the task or workflow 140.
FIG. 6 illustrates an exemplary computer architecture 600 for a device capable of executing the software components described herein for interacting with contextual and task-based computing environments. Thus, the computer architecture 600 illustrated in FIG. 6 represents an architecture for a server computer, a mobile phone, a PDA, a smart phone, a desktop computer, a netbook computer, a tablet computer, and/or a laptop computer. The computer architecture 600 may be utilized to execute any aspects of the software components presented herein.
The computer architecture 600 illustrated in FIG. 6 includes a central processing unit 602 (“CPU”), a system memory 604, including a random access memory 606 (“RAM”) and a read-only memory (“ROM”) 608, and a system bus 610 that couples the memory 604 to the CPU 602. A basic input/output system containing the basic routines that help to transfer information between elements within the computer architecture 600, such as during startup, is stored in the ROM 608. The computer architecture 600 further includes a mass storage device 612 for storing an operating system 614, the search application 124, the interface module 138, and the workflows 140. Although not shown in FIG. 6, the mass storage device 612 also can be configured to store the task data 112, if desired.
The mass storage device 612 is connected to the CPU 602 through a mass storage controller (not shown) connected to the bus 610. The mass storage device 612 and its associated computer-readable media provide non-volatile storage for the computer architecture 600. Although the description of computer-readable media contained herein refers to a mass storage device, such as a hard disk or CD-ROM drive, it should be appreciated by those skilled in the art that computer-readable media can be any available computer storage media or communication media that can be accessed by the computer architecture 600.
Communication media includes computer-readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave or other transport mechanism, and includes any delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics changed or set in a manner so as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared, and other wireless media. Combinations of any of the above should also be included within the scope of computer-readable media.
By way of example, and not limitation, computer storage media may include volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data. For example, computer storage media includes, but is not limited to, RAM, ROM, EPROM, EEPROM, flash memory or other solid state memory technology, CD-ROM, digital versatile disks (“DVD”), HD-DVD, BLU-RAY, or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by the computer architecture 600. For purposes of the claims, the phrase “computer storage medium,” and variations thereof, does not include waves, signals, and/or other transitory and/or intangible communication media, per se.
According to various embodiments, the computer architecture 600 may operate in a networked environment using logical connections to remote computers through a network such as the network 104. The computer architecture 600 may connect to the network 104 through a network interface unit 616 connected to the bus 610. It should be appreciated that the network interface unit 616 also may be utilized to connect to other types of networks and remote computer systems, for example, the server computer 102, the discovery engine 108, the data store 114, the SN server 116, and/or other devices and/or networks. The computer architecture 600 also may include an input/output controller 618 for receiving and processing input from a number of other devices, including a keyboard, mouse, or electronic stylus (not shown in FIG. 6). Similarly, the input/output controller 618 may provide output to a display screen, a printer, or another type of output device (also not shown in FIG. 6).
It should be appreciated that the software components described herein may, when loaded into the CPU 602 and executed, transform the CPU 602 and the overall computer architecture 600 from a general-purpose computing system into a special-purpose computing system customized to facilitate the functionality presented herein. The CPU 602 may be constructed from any number of transistors or other discrete circuit elements, which may individually or collectively assume any number of states. More specifically, the CPU 602 may operate as a finite-state machine, in response to executable instructions contained within the software modules disclosed herein. These computer-executable instructions may transform the CPU 602 by specifying how the CPU 602 transitions between states, thereby transforming the transistors or other discrete hardware elements constituting the CPU 602.
Encoding the software modules presented herein also may transform the physical structure of the computer-readable media presented herein. The specific transformation of physical structure may depend on various factors, in different implementations of this description. Examples of such factors may include, but are not limited to, the technology used to implement the computer-readable media, whether the computer-readable media is characterized as primary or secondary storage, and the like. For example, if the computer-readable media is implemented as semiconductor-based memory, the software disclosed herein may be encoded on the computer-readable media by transforming the physical state of the semiconductor memory. For example, the software may transform the state of transistors, capacitors, or other discrete circuit elements constituting the semiconductor memory. The software also may transform the physical state of such components in order to store data thereupon.
As another example, the computer-readable media disclosed herein may be implemented using magnetic or optical technology. In such implementations, the software presented herein may transform the physical state of magnetic or optical media, when the software is encoded therein. These transformations may include altering the magnetic characteristics of particular locations within given magnetic media. These transformations also may include altering the physical features or characteristics of particular locations within given optical media, to change the optical characteristics of those locations. Other transformations of physical media are possible without departing from the scope and spirit of the present description, with the foregoing examples provided only to facilitate this discussion.
In light of the above, it should be appreciated that many types of physical transformations take place in the computer architecture 600 in order to store and execute the software components presented herein. It also should be appreciated that the computer architecture 600 may include other types of computing devices, including hand-held computers, embedded computer systems, personal digital assistants, and other types of computing devices known to those skilled in the art. It is also contemplated that the computer architecture 600 may not include all of the components shown in FIG. 6, may include other components that are not explicitly shown in FIG. 6, or may utilize an architecture completely different than that shown in FIG. 6.
Based on the foregoing, it should be appreciated that technologies for contextual and task-focused computing have been disclosed herein. Although the subject matter presented herein has been described in language specific to computer structural features, methodological and transformative acts, specific computing machinery, and computer readable media, it is to be understood that the invention defined in the appended claims is not necessarily limited to the specific features, acts, or media described herein. Rather, the specific features, acts and mediums are disclosed as example forms of implementing the claims.
The subject matter described above is provided by way of illustration only and should not be construed as limiting. Various modifications and changes may be made to the subject matter described herein without following the example embodiments and applications illustrated and described, and without departing from the true spirit and scope of the present invention, which is set forth in the following claims.