BACKGROUND

Graphical user interfaces provide a mechanism for users to submit input to software applications, as well as a way for the applications to communicate information to the users. In most aspects, individual applications are developed to manage, process, or address a specific issue or a set of related issues. For example, an accounting application handles accounting issues and an email client processes and handles email-related issues. Some applications may be developed by a common software developer and packaged as a suite of applications that work well with each other. Even if these application suites work well with each other, there might remain a disconnect with other applications outside of the suite of applications.
Applications, whether designed for an enterprise environment, a home office, a mobile device, or other contexts and environments tend to exist in isolation or silos. A user of multiple applications may thus find that they have to monitor numerous different applications, devices, and systems to stay abreast of the many different alerts, schedules, meetings, requests, and messages that may be generated by their devices and their work, social, and/or school related applications.
BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is an illustrative depiction of a block diagram of a platform architecture in an example embodiment;
FIGS. 2A and 2B are illustrative flow diagrams related to aspects of an integrated calendar and timeline application in an example embodiment;
FIG. 3 is an outward view of a displayed calendar day view in a user interface, according to an example embodiment;
FIG. 4A is an outward view of a displayed user interface of a calendar view according to an example embodiment;
FIG. 4B is an outward view of a displayed user interface including a detailed meeting screen according to an example embodiment;
FIG. 5A is an outward view of a displayed user interface including a calendar list view according to some example embodiments;
FIG. 5B is an outward view of a displayed user interface of a detailed view of a calendar meeting event in FIG. 5A, according to an example embodiment;
FIG. 6 is an outward view of a displayed user interface including a timeline view, according to some embodiments;
FIGS. 7A and 7B include outward views of a plurality of displayed user interfaces, including a timeline view at various levels of zoom according to some example embodiments;
FIG. 7C includes outward views of two displayed user interfaces, including a timeline view at different levels of zoom and a center of focus point according to some example embodiments;
FIG. 7D includes outward views of a plurality of displayed user interfaces including a timeline view at various levels of zoom and a view pane, according to some example embodiments;
FIG. 8A includes outward views of a plurality of displayed user interfaces, including some aspects to view and edit details of an event in a timeline view according to an example embodiment;
FIG. 8B includes outward views of a plurality of displayed user interfaces, including some aspects to specify options related to a timeline view according to an example embodiment;
FIGS. 8C and 8D include outward views of a plurality of displayed user interfaces, including some aspects to add and edit details of a subtask for an event in a timeline view according to an example embodiment;
FIGS. 9A and 9B include outward views of a plurality of displayed user interfaces, including some aspects to perform quick edits to a selected event in a timeline view according to an example embodiment;
FIG. 9C includes outward views of two displayed user interfaces, including some aspects to perform quick edits of a subtask event in a timeline view according to an example embodiment;
FIG. 10A is an illustrative flow diagram related to synchronizing calendar views and timeline views, according to some example embodiments;
FIG. 10B is an illustrative flow diagram including a flow for calendar entry synchronizations and a flow for timeline entry synchronizations, according to an example embodiment;
FIG. 11 is an illustrative process flow in an example embodiment;
FIGS. 12A-12C include outward views of a displayed user interface, including UI controls to select a date and time of a calendar event according to an example embodiment;
FIGS. 12D and 12E include outward views of a displayed user interface, including UI controls to select a date and time of a subtask for a timeline event, according to an example embodiment;
FIGS. 13A-13C include outward views of a displayed user interface, including some aspects to specify user roles, according to an example embodiment;
FIGS. 14A-14C include outward views of a displayed user interface, including some aspects to manage pre-defined tasks, according to an example embodiment;
FIGS. 15A-15C include outward views of a displayed user interface, including some aspects to manage appointment types, according to an example embodiment;
FIGS. 16A-16C include outward views of a displayed user interface, including some aspects to manage connected applications, according to an example embodiment;
FIGS. 17A-17D include outward views of a displayed user interface, including some aspects related to recognizing schedule conflicts and options for a resolution, according to an example embodiment;
FIGS. 18A-18E include outward views of a displayed user interface, including some aspects related to scheduling conflicts for a calendar event, according to an example embodiment;
FIGS. 19A-19C include outward views of a displayed user interface, including some aspects to move an appointment to an alternative available day in a calendar view, according to an example embodiment;
FIGS. 20A-20C include outward views of a displayed user interface, including some aspects to move an appointment to an alternative available day via a natural language input in a calendar view, according to an example embodiment;
FIGS. 21A-21C include outward views of a displayed user interface, including some aspects related to conflicts in the scheduling of an appointment according to an example embodiment;
FIG. 22 is an illustrative example of a functional system block diagram, according to an example embodiment;
FIG. 23 is an illustrative example of a block diagram of a software architecture, according to an embodiment;
FIG. 24 is an illustrative example of a system, according to an example embodiment;
FIG. 25 is an outward view of a displayed user interface, including some aspects relating to an alert and notification functionality herein;
FIGS. 26A-26C include outward views of displayed user interfaces, including some aspects relating to settings and filters for an alert and notification functionality herein; and
FIG. 27 is an outward view of a displayed user interface, including some aspects relating to an example list view UI visualization for an alert and notification functionality herein.
DETAILED DESCRIPTION

The following description is provided to enable any person in the art to make and use the described embodiments. Various modifications, however, will remain readily apparent to those in the art.
FIG. 1 is a block diagram of an architecture 100 for an example embodiment including various design concepts and aspects of the present disclosure. The platform or architecture of FIG. 1 is designed to provide a cross-platform and cross-application solution that can support and work with a plurality of different applications, services, devices, and systems. Architecture 100, in some regards, may provide an alternative to device calendars and personal information managers or organizer applications and functions. In some aspects, the architecture of FIG. 1 may interface with and support one or more applications using one or more communication protocols and techniques.
Architecture 100 includes a calendar component 105 that provides calendar and calendar-related functionality and a timeline component 110 that provides event organizational functions. Herein, component 105 may be referred to as calendar 105 and timeline component 110 may be referred to as timeline 110. In general, calendar 105 creates, modifies, and manages events equal to or less than a day in duration, whereas timeline 110 creates, modifies, and otherwise manages events having a duration of greater than one day. These event durations represent a delineation between how some events are handled by architecture 100. In some instances, the durations may be modified by a user, developer, or other entity. Of note, the basis for determining whether an event will primarily be managed by calendar 105 or timeline 110 should be consistently maintained so that the temporal integrity between events can be respected. In the examples herein, example calendar embodiments will manage events a day or less in duration and an example timeline will manage events equal to or greater than a day in length, unless otherwise noted.
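The duration-based delineation between calendar 105 and timeline 110 can be illustrated with a short sketch. This is a minimal, hypothetical example; the type names, the one-day threshold constant, and the routing function are assumptions made for illustration only and are not a definitive implementation of architecture 100:

```typescript
// Minimal sketch: route an event to the calendar or timeline component
// based on its duration, using a configurable threshold (default: one day).

interface ScheduledEvent {
  title: string;
  start: Date;
  end: Date;
}

const MS_PER_DAY = 24 * 60 * 60 * 1000;

// The threshold is assumed to be modifiable by a user or developer, as
// described above; one day is used as the default delineation.
function routeEvent(event: ScheduledEvent, thresholdDays = 1): "calendar" | "timeline" {
  const durationDays = (event.end.getTime() - event.start.getTime()) / MS_PER_DAY;
  // Events a day or less in duration are managed by calendar 105;
  // longer events are managed by timeline 110.
  return durationDays <= thresholdDays ? "calendar" : "timeline";
}

// Example usage:
const meeting: ScheduledEvent = {
  title: "Meeting with James",
  start: new Date("2011-10-28T15:00:00"),
  end: new Date("2011-10-28T16:00:00"),
};
const businessTrip: ScheduledEvent = {
  title: "Business Trip",
  start: new Date("2011-10-30T00:00:00"),
  end: new Date("2011-11-04T00:00:00"),
};

console.log(routeEvent(meeting));      // "calendar"
console.log(routeEvent(businessTrip)); // "timeline"
```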
Regarding calendar 105, events managed thereby can be represented by one or more user interface (UI) elements presented or visualized in a graphical user interface (GUI). A GUI may be referred to as simply a UI herein. Events managed (e.g., created, tracked, saved, modified, deleted, enhanced, annotated, exported, imported, etc.) by calendar 105 may be represented in a number of different visualizations or views 120. Views 120 may be generated by a functionality integral to system 100 or by a system or device external to system 100 but operating in a coordinated effort with system 100 to generate and/or process UIs for the benefit of calendar 105. Calendar 105 may be configured to organize calendar events based on timespans of days, weeks, and months. As such, views 120 can provide calendar views organizing calendar events as group(s) of days, weeks, and months. In some instances, a calendar view can include one or more graphical UI elements, a list view including a text listing, and combinations thereof.
Calendar events managed by calendar 105 may receive temporal information relating to events a day or less in duration from a number of content sources 125. A content source 125 may be a calendar application or function of a particular device (i.e., a smartphone or other device of an individual user, an enterprise calendar system where a particular individual is a registered user, etc.), a calendar service by a third party (e.g., a cloud-based calendar), and other systems, devices, services, and entities that might have information relating to meetings, appointments, calls, visits, etc. for a user entity. Calendar content may be represented as any type of data, including structured, semi-structured, and unstructured data embodied as any type of data structure, unless noted otherwise herein. Calendar content from content sources 125 may be optionally prioritized and/or characterized as private by priority functional module 135. Additional or alternative filters and sorting mechanisms may be applied to the calendar content by filter/sort functional module 140. Alerts & reminders functional module 130 might manage alerts and reminders related to calendar 105 and send such information to calendar 105 and views 120 so that it can be presented, as appropriate and deemed relevant by calendar 105, in views 120.
Regarding timeline 110 and the events managed thereby being, in general, greater in length than one day, a views functional module 150 provides a mechanism to represent and present visualizations of timeline events in a manner that can be readily understood by a user as having an extended duration (e.g., 2 days, 2 months, and 2 years). In some example embodiments, a timeline view generated by views module 150 can include various representations of extended lengths of time (e.g., a graphical representation of a timeline having different divisions of time markings/indicators), a list view including a listing or tabular presentation of the timeline events, and combinations thereof. In some aspects, a timeline view can be represented in one or more levels of detail, wherein the different levels of detail for a timeline view can correspond to different "zoom" levels in an example visualization of a timeline view.
Timeline events managed by timeline 110 may receive temporal information relating to events equal to or greater than a day in duration from a number of timeline content sources 155. A timeline content source 155 may be a timeline or organizer application or function of a particular device (i.e., a smartphone or other device, an enterprise-wide organizer system where a particular individual is a registered user, etc.), an event organizer service by a third party (e.g., cloud-based), and other systems, devices, and services for a user entity. Timeline content may be represented as any type of data, including structured, semi-structured, and unstructured data embodied as any type of data structure, unless noted otherwise herein. Timeline content from sources 155 may be optionally prioritized and/or characterized as private by priority functional module 165, filtered and/or sorted by filter/sort functional module 170, and have alerts & reminders related thereto managed by functional alerts/reminders module 160.
In some embodiments herein, functions provided by calendar 105 and timeline 110 are integrated or otherwise cooperatively operable to manage events being less than a day (or some other pre-determined duration) and events greater than a day in length (or some other pre-determined duration). In some aspects, as will be demonstrated in greater detail herein below, events of any duration for an individual subject entity (e.g., a user registered to receive services provided by system 100) can be managed and reported to the entity (and others if so configured) using a variety of UI elements and visualizations. In some example embodiments, a view presented to a user may seamlessly switch between a calendar view and a timeline view by an integrated calendar and timeline system herein, in synchronization with the tasks being performed and views being presented in response to user interactions with the system.
In some aspects, as further illustrated in the example of FIG. 1, architecture 100 includes an alerts and notification management module 175. As used herein, alerts and notification management module 175 is also referred to as a "fetch" application, service, or system. In some aspects, fetch application 175 (used interchangeably with service and system) is distinct and separate from the integrated calendar and timeline functions of architecture 100. Fetch system 175 may communicate with the integrated calendar and timeline functions 105, 110, but is itself functionally distinct therefrom. Fetch system 175 includes a UI component in some embodiments and can operate to receive and reply to a wide variety of alerts, notifications, events, appointments, milestones, deadlines, reminders, messages, and other triggers related to integrated calendar and timeline functions 105, 110, one or more user device calendars 195, and other fetch enabled applications and backend systems 190.
In some embodiments, fetch system 175 may communicate with fetch enabled applications and backend systems 190 via compatible communication interfaces (e.g., application programming interfaces, etc.) and/or other protocols and techniques to facilitate an exchange of alert and notification information between different devices and systems. In this regard, fetch system 175 may communicate with integrated calendar and timeline functions 105, 110 using similar and/or alternative communication interface techniques.
In some example embodiments, fetch system 175 may receive alerts, notifications, and links (i.e., addresses) to alert and notification content via a fetch settings, filters, and sorting functional module or system 180. The fetch settings of module 180 may include, for example, one or more of the following filter settings, where the filters may be used alone and in any logical combination. In some embodiments, the filter settings are specific to one or more particular fetch enabled applications or, alternatively, they may apply to all fetch enabled applications associated with a user. In some embodiments where fetch 175 is deployed as a standalone application not associated with an enabled application, fetch filter settings may be individually set according to a device or deployment platform. For example, a "fetch" functionality disclosed herein may be deployed on a computer or laptop, smartphone, or wearable as a native app thereon, whereby alerts and notifications from other applications and services (e.g., a cloud-based calendar service) can be presented on the device running the fetch functionality as a native app (e.g., alerts regarding KPIs on a financial document, such as "sales are down 50% this quarter"). Filter settings for fetch settings and filters module 180 can include, for example:
- Show All (Default)
- Show High Priority
- Show Milestones
- Show Appointments
- Show Conference Calls
- Show Event Reminders
- Show Appointment Reminders
- Show Message Alerts
- Show System Alerts and Notifications
- Show Call Reminders
- Connected Application Notifications
- User Action Confirmations
In some embodiments, sort options for fetch module 175 can include, as an example, the options to sort by a date, a priority, and a category, alone or in combinations thereof. In some aspects, an audio alert may be associated with the notifications to be provided by fetch module 175 as configured via the fetch settings, filters, and sorting functional module or system 180.
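One possible way to model the fetch filter and sort settings of module 180 is sketched below. The type names, the filter flags, and the applyFetchSettings helper are illustrative assumptions only; they are not the actual configuration schema of fetch system 175:

```typescript
// Illustrative sketch of fetch filter and sort settings (module 180).
// Filter flags may be combined in any logical combination; sort options
// include date, priority, and category, as described above.

type FetchCategory =
  | "HighPriority" | "Milestone" | "Appointment" | "ConferenceCall"
  | "EventReminder" | "AppointmentReminder" | "MessageAlert"
  | "SystemAlert" | "CallReminder" | "ConnectedAppNotification"
  | "UserActionConfirmation";

interface FetchItem {
  title: string;
  category: FetchCategory;
  priority: number;   // e.g., 1 = highest priority
  date: Date;
}

interface FetchSettings {
  showAll: boolean;                       // "Show All (Default)"
  enabledCategories: Set<FetchCategory>;  // used when showAll is false
  sortBy: "date" | "priority" | "category";
  audioAlert?: boolean;                   // optional audio alert association
}

function applyFetchSettings(items: FetchItem[], settings: FetchSettings): FetchItem[] {
  const visible = settings.showAll
    ? items
    : items.filter((i) => settings.enabledCategories.has(i.category));
  return [...visible].sort((a, b) => {
    if (settings.sortBy === "date") return a.date.getTime() - b.date.getTime();
    if (settings.sortBy === "priority") return a.priority - b.priority;
    return a.category.localeCompare(b.category);
  });
}
```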
In some embodiments, an alert and notification application herein may include a variety of alerts and notifications. Example notifications and alerts may include:
- Calendar Appointments/Meetings Reminders
- Call Reminders (incoming/outgoing/conference)
- Timeline Task Reminders (Task Start/Stop date notifications)
- Alerts (System, Errors, Connection, Missed Deadline, etc.)
- Acknowledgements
- Messages (incoming, pending phone voice msg., etc.)
- Application Updates Available
- Content Downloads Available
- Notification that a backend request (e.g., a search) while offline is available after reconnection is established
In some aspects, additional application and/or system specific notifications and reminders can be managed by an alert and notification management system herein. Specifics regarding such alerts may be dictated by the specific features of the applications and systems. However, the alert and notification management system disclosed herein is flexible and expandable to accommodate many different applications and systems.
In some aspects herein, an alert and notification management system manages links to content, as opposed to the content itself. Accordingly, deleting an alert/notification item does not delete or otherwise change the underlying item being reported on by the alert and notification management system. For example, deleting an alert regarding a meeting scheduled in a calendar application does not delete the meeting itself.
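A minimal sketch of this link-based design is shown below; the AlertItem shape and the deleteAlert helper are hypothetical names used purely to illustrate that removing an alert leaves the referenced content untouched:

```typescript
// Sketch: the alert and notification management system stores a link
// (address) to content, not the content itself.

interface AlertItem {
  id: string;
  message: string;     // e.g., "Meeting with James starts in 15 minutes"
  contentLink: string; // address of the underlying item (e.g., a calendar event URI)
}

// Deleting an alert only removes the alert record; the calendar event,
// message, or other content the link points to is not modified.
function deleteAlert(alerts: Map<string, AlertItem>, alertId: string): void {
  alerts.delete(alertId); // the linked content remains unchanged
}
```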
FIGS. 2A and 2B include an example of a simplified yet illustrative flow diagram relating to some aspects of an integrated calendar and timeline system, application, service, and functions (e.g., FIG. 1, 105 and 110) and an alerts and notifications management system, application, service, and function (e.g., FIG. 1, 175). FIG. 2A primarily relates to an integrated calendar and timeline system as disclosed herein and FIG. 2B primarily relates to an alerts and notifications management system. The operations are presented in a same yet divided drawing figure herein to further convey that the functions provided by each system, application, service, and function are distinct, even in configurations sharing at least some information and/or resources therebetween.
FIG. 2A includes some aspects of a process flow, generally referenced by numeral 200. At operation 205, first temporal information relating to one or more calendar events is received. Such calendar related information can be received by an embodiment of an implementation of a calendar functionality (e.g., FIG. 1, 105). The calendar related information may be received from one or more devices, systems, services, and applications related to, registered with, or otherwise associated with a subject entity or user. In some embodiments, the calendar related information may be generated by the user in any one or more devices, systems, services, and applications related to, registered with, or otherwise associated with a subject entity or user. In some embodiments, the received calendar information relates to events being equal to or less than a day in duration. In some embodiments, the one or more devices related to, registered with, or otherwise associated with a subject entity or user may be synchronized to a service, system, and application when those devices are online and/or reconnected online.
At operation 210, second temporal information is received. This second temporal information relates to one or more timeline events. The timeline related information can be received by an embodiment of a timeline or event organizer (e.g., FIG. 1, 110). The timeline related information may be received from one or more devices, systems, services, and applications related to, registered with, or otherwise associated with the subject entity or user. In some embodiments, the timeline related information may be generated by the user in any one or more devices, systems, services, and applications related to, registered with, or otherwise associated with a subject entity or user. In some example embodiments, the received timeline information relates to events being equal to or greater than a day in duration. In some embodiments, the timeline information relates to a day or point-in-time indicating an initiation or conclusion of timeline information (e.g., a start event or milestone, a deadline or deliverable event, etc.).
Operation 215 of FIG. 2A includes generating a user interface including user interface representations for an integrated calendar and timeline visualization. The generated visualization may be used by a user to interactively navigate and control aspects of both the calendar events and the timeline events referenced in the information received in operations 205 and 210. The integrated calendar and timeline visualization includes UI elements relating to both the calendar information and the timeline information in the same UI. As used herein, a visualization may also be referred to as a view.
In some example embodiments, views relating to one or both of the calendar events and the timeline events may be invoked or initiated from almost any visualization generated in accordance with operation 215, subject to logical constraints imposed by the realities of time-based relationships. For example, one reality of time-based relationships is that a meeting cannot have an end time that is before its start time. With this example reality, as with other logical constraints, attempts to violate the constraint may not be accepted (or even enabled) for entry by the system herein.
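A sketch of how such a logical constraint might be enforced at entry time is given below; the function name and the result shape are assumptions for illustration, not the system's actual validation logic:

```typescript
// Sketch: reject an attempted entry whose end time precedes its start time.

interface EventInput {
  title: string;
  start: Date;
  end: Date;
}

function validateEventTimes(input: EventInput): { ok: true } | { ok: false; reason: string } {
  if (input.end.getTime() < input.start.getTime()) {
    // Attempts to violate the constraint are not accepted for entry.
    return { ok: false, reason: "End time cannot be before start time." };
  }
  return { ok: true };
}
```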
A process flow according to FIG. 2A proceeds to operation 220 where a record is automatically saved. The record includes, at least, an indication of a current view location and a most recent past view location, where the views (i.e., current and most recent past) are recorded relative to at least one of the calendar and the timeline. In some embodiments, the location of the views is recorded with reference to the context of the view. That is, a calendar view (i.e., visualization of an event in a calendar context) is saved with an indication of the location (i.e., where a user has navigated to) with respect to or within the calendar. Likewise, a timeline view (i.e., visualization of an event in a timeline context) is saved with an indication of the location (i.e., where a user has navigated to) with respect to the timeline events, functions, and actions. In some embodiments, a clock functionality can be used by a system herein to, at least in part, keep track of a user entity's current and most recent past location within the calendar and timeline disclosed herein. In some regards, a system herein may track and record where a user is currently located within a calendar (e.g., in a detailed view of a 3:00 PM meeting with Jon Smith) or timeline (e.g., at a task for a project spanning multiple months) so that the user's current location and the location they came from (i.e., most recent past location) can be visualized or otherwise presented to the user. In some respects, the visualization or presentation of the user's current location and the location they came from can provide the user with a sense of where they are in the context of the calendar and/or timeline. This feature may provide context that facilitates a user navigating "back" to their previous location within the calendar or timeline.
For example, a user may drill down to a meeting detail in the calendar context herein by selecting a specific day and further selecting a specific meeting on the selected day. In some embodiments, when the user then navigates back to a higher level, the most recent past location (i.e., the meeting detail) remains highlighted (e.g., by color, such as blue) to indicate where the user was last. This and similar tracking aspects may be included in the calendar and timeline aspects herein, in some example embodiments, to provide context to a user by indicating where the user is navigating from.
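The record of a current view location and a most recent past view location saved per operation 220 could be represented along the following lines. This is a hypothetical sketch; the ViewLocation shape and the navigateTo function are not taken from the disclosure itself:

```typescript
// Sketch: track the user's current view location and most recent past
// view location, in the context of either the calendar or the timeline.

interface ViewLocation {
  context: "calendar" | "timeline";
  // Where the user has navigated to within that context, e.g.
  // "day/2011-10-28/meeting-with-jon-smith" or "project-x/task-3".
  path: string;
  visitedAt: Date;
}

interface NavigationRecord {
  current: ViewLocation;
  mostRecentPast?: ViewLocation;
}

// Automatically save the record whenever the user navigates; the previous
// current location becomes the most recent past location, which can then
// be highlighted (e.g., in blue) when the user navigates back up a level.
function navigateTo(record: NavigationRecord, next: ViewLocation): NavigationRecord {
  return { current: next, mostRecentPast: record.current };
}
```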
FIG. 2B, in some aspects relating to operations 205-220 yet independent therefrom, primarily relates to operations performed by an alerts and notification management function or system (e.g., a "fetch" service, application, or system in some aspects herein).
Operation 225 includes receiving at least one of an alert, a link to an address for a content item, and a notification from at least one application. The information received at operation 225 by the alert and notification management system or application can be used as a basis for visualizations related to the alerts and notifications. The alert, link, and notification information received at operation 225 may be received from one or more other applications, including but not limited to an integrated calendar and timeline application, service, or system herein.
Continuing with flow diagram 200, operation 230 includes generating a UI component including user interface representations for a state of the at least one alert, link to an address for a content item, and notification from at least one application. The UI generated at operation 230 may be, in some instances where an integrated calendar and timeline application, service, or system is interfaced with the alert and notification management system or application executing operations 225 and 230, presented (i.e., visualized) with the UI generated at operation 215. In this manner, some embodiments herein include a UI including integrated calendar and timeline visualizations and alert and notification management in a common or same UI.
Various aspects and features of an integrated calendar and timeline application, service, or system (FIG. 1, 105 and 110) and an alert and notification management application, service, or system (FIG. 1, 175) will be illustrated and disclosed with reference to user interfaces (UIs) and aspects thereof. A number of aspects and features of the present disclosure relate to UIs, the information conveyed therein (e.g., UI elements, text content, colors, etc.), and an order and method of navigation between different UIs, wherein the UIs are synchronized to reflect operations and processes executed by the underlying integrated calendar and timeline application and the alert and notification management application.
FIG. 3 is an illustrative view of at least a portion of a UI for an example embodiment. FIG. 3 is a calendar view, as indicated by the highlighted UI button 305. It is noted that options for both a calendar view and a timeline view are presented in UI 300, as represented by UI buttons 305 and 310, respectively. Furthermore, the current calendar day and location for the user relative to the integrated calendar application are indicated by a gold (or optionally, other colors) coloring of the 28th day in the day view shown in FIG. 3 at 315. The day being selected in FIG. 3 at 320 is visualized with a blue color (optionally, other colors). It is noted that the integrated calendar and timeline application in some embodiments deploys or is connected to a real time clock (e.g., from a network or device) to persistently indicate the current calendar day 315 in a (gold) color, and the selected day (if not the current day) is visualized by a blue color (optionally, other colors), as shown at 320. In some embodiments, the current day indication (e.g., gold colored day UI element) is automatically updated as time advances from a current day to the next current day.
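The persistent current-day and selected-day coloring could be driven by a simple clock-based check such as the sketch below; the color values and helper names are illustrative assumptions rather than the application's actual rendering logic:

```typescript
// Sketch: classify a day cell as the current day (gold), the selected day
// (blue), or an ordinary day, based on a real-time clock.

function dayCellColor(cellDate: Date, selectedDate: Date, now: Date = new Date()): string {
  const sameDay = (a: Date, b: Date) =>
    a.getFullYear() === b.getFullYear() &&
    a.getMonth() === b.getMonth() &&
    a.getDate() === b.getDate();

  if (sameDay(cellDate, now)) return "gold";          // current day indicator
  if (sameDay(cellDate, selectedDate)) return "blue"; // user-selected day
  return "default";
}

// Schedule a re-render at the next midnight so the current-day indicator
// automatically advances from the current day to the next current day.
function msUntilNextMidnight(now: Date = new Date()): number {
  const next = new Date(now);
  next.setHours(24, 0, 0, 0); // rolls over to midnight of the following day
  return next.getTime() - now.getTime();
}
```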
UI 325 is a representation of the subject UI as it may be displayed in a device positioned in a portrait orientation, whereas UI 330 is a representation of the UI displayed by a device positioned in, or at least displaying items in, a landscape mode. In some aspects, UI 300, as well as most other UIs disclosed herein, may dynamically adjust its sizing and/or configuration in response to at least one of an orientation, size, resolution, and other parameters of a display screen or display system, including for example resizing of a browser window. The dynamic adjustment(s) may be accomplished automatically as a function of the integrated calendar and timeline functions herein, as well as for the disclosed alert and notification management function.
In the example of FIG. 3, scrolling through the presented day view 350 can be done by a left/right motion for both UIs, since such a motion can be intuitively understood according to some mobile and desktop scrolling methods and to correspond to moving back in time (i.e., viewing date/time content to the left) and moving forward in time (i.e., viewing date/time content to the right).
FIG. 3 includes an example embodiment of a UI element 345 for user entry of, in some use-cases, natural language. The natural language may be in the form of typed text or spoken inputs. Tapping/clicking on or otherwise selecting the text field 335 replaces the hint text "Type or talk . . . " with a cursor indicating to the user that typing text in the text field in a natural language method is possible. On touch screen mobile devices, this selection may also launch a device's virtual keyboard. Tapping/clicking on or otherwise selecting the "microphone" icon 340 may sound an audio tone indicating to the user that speech recognition is activated for inputting spoken words in a natural language method. In an embodiment, selecting the microphone icon may, either alone or in addition to sounding an audio tone, display a text field popup whereby the recognized speech input is shown in text. As illustrated, the natural language input element 345 may be presented in UI 300 whether the device is in a portrait configuration or a landscape/widescreen configuration. It is noted that the particular placement of UI element 345 is not limited to the specific configurations depicted in FIG. 3 (or other depictions herein). That is, in some example embodiments, the arrangement and layout of the natural language input element 335 may be different from the example(s) shown in the accompanying figures.
FIG. 4A is a day view in portrait mode of an integrated calendar function herein, as illustrated by the UI depiction 400 at 420, and a day view in landscape mode as depicted at 425. The currently selected day 415 (i.e., the "29th") is shown in blue to convey that it has been selected by the user, and the current day, the 28th, is shown in gold to indicate it is the current day. The event (i.e., "meeting with Paul . . . ") 410 is shown in blue to convey that the user has previously selected it to navigate to and display a specific meeting detail screen. An example of a detailed meeting screen corresponding to the selected event 410 is shown in FIG. 4B, generally at 430. As seen, the detailed meeting screen 430 includes the specific information regarding the selected event 410. In this manner, a user can easily ascertain "where" they are located within the visualized representation of the integrated calendar and timeline application, in reference to the current calendar day 405 (Oct. 28, 2011), which is shown in gold (similar to referenced item 315 in FIG. 3). UI 420 is a representation of UI 400 as it may be displayed in a portrait orientation, whereas UI 425 is a representation of UI 400 displayed in a landscape mode or a widescreen display. Meeting details UI 435 may be presented to the user in response to the user selecting the meeting at 410. The meeting details screen shown at UI 435 in FIG. 4B may include specific information regarding the selected meeting 410, including the date, time, company, attendees, etc. associated with the meeting. UIs 420 and 425 illustrate how a user might navigate back to an earlier date from a current location/time by scrolling (dragging or swiping with a mouse or touch gesture) the days ribbon UI element 426 right towards 445 (thus exposing earlier days towards the left side of the days ribbon UI element) or forward to a future date by scrolling left towards 440 (thus exposing future days towards the right side of the days ribbon UI element 426). A user may scroll through days in the days ribbon UI element 426 and select a new day in the scrolling days ribbon UI element. Following that selection, the corresponding meetings/appointments that are displayed within areas 427 in UI 420 and UI 425 in FIG. 4A will automatically scroll horizontally through all day meetings/appointments in an animation from the currently selected day to the newly selected day in the direction 440 or 445, according to whether the newly selected day is in the past or in the future. In an example, if a newly selected date in the days ribbon element 426 is in the future of the currently displayed day (i.e., toward the right), then the meetings/appointments area 427 will automatically scroll towards the left (440), thereby displaying meetings/appointments towards the right and finally stopping on the corresponding selected day in the future. Conversely, if, for example, a newly selected date in the days ribbon element is in the past of the currently displayed day (i.e., toward the left), then the meetings/appointments area 427 will automatically scroll towards the right (445), thereby displaying meetings/appointments towards the left and finally stopping on the corresponding newly selected day in the past. This provides the user or other entity with a context of "traveling" forward or backwards through time (a time travel experience).
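The automatic scroll direction of the meetings/appointments area 427 described above could be determined along the lines of the following sketch; the function and type names are hypothetical:

```typescript
// Sketch: when a new day is selected in the days ribbon, scroll the
// meetings/appointments area left for a future day and right for a past day.

type ScrollDirection = "left" | "right" | "none";

function scrollDirectionFor(currentDay: Date, newlySelectedDay: Date): ScrollDirection {
  if (newlySelectedDay.getTime() > currentDay.getTime()) {
    return "left";  // future selection: content animates toward the left (440)
  }
  if (newlySelectedDay.getTime() < currentDay.getTime()) {
    return "right"; // past selection: content animates toward the right (445)
  }
  return "none";    // same day: no scroll needed
}
```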
FIG. 5A is an illustrative depiction of a UI 500 including a calendar list view 505 for a calendar of an integrated calendar and timeline application, in one example. The calendar list view 505 is displayable to include the same relevant information whether rendered on a display device configured for and/or positioned in portrait orientation 510 or landscape mode 515. Hereto, options to select, and thus invoke, either a calendar perspective view or a timeline perspective view are provided in UI 500 at 520. In some embodiments, an options menu (not shown in FIG. 5A) including "Add New" (calendar event), "Add New Recurring" (calendar event), etc. may be triggered from the calendar list screen by selecting an action button element 512. An example embodiment of a natural language input UI element 525 is also shown in the UIs of FIG. 5A at 525. In some aspects, there may be a variety of calendar and timeline events listed in calendar list view 505, including, for example, a "meeting" as indicated by a calendar icon next to the "Meeting with James" event, a timeline milestone event as indicated by a diamond shape next to the "SAP deadline" timeline event, etc.
FIG. 5B is an example depiction of a detailed view for a calendar item in UI 500 of FIG. 5A. In particular, the details 530 for calendar item 535 ("Meeting with James") are shown in FIG. 5B in response to a selection of the "Meeting with James" calendar item 535 by a user in FIG. 5A. Both FIGS. 5A and 5B include an example embodiment of a UI element 525 for user entry of, in some use-cases, natural language. In some instances, a user might invoke a selection of another calendar item by first selecting the microphone icon and then speaking, for example, the phrase, "show details for my next upcoming meeting with James". The integrated calendar and timeline application would then display the meeting details for a "next" meeting with James. In an embodiment, if a next meeting with James is not scheduled, the system may respond with a system message such as, for example, "You do not have a next meeting with James in the future".
FIG. 6 includes a depiction of an example timeline UI 600. As shown, multiple timeline events 605 are arranged by the application in a cascading fashion from both a vertical and horizontal perspective. Each timeline event 605 is represented by a horizontal bar, the length of which represents the duration of the event (or task) in one or more days, months, or years. Each milestone event, for example milestone 610, has a duration of one day and is shown in a diamond-shaped graphical element to indicate a milestone event. An example embodiment of a natural language input UI element 630 in FIG. 6 (and 725 in FIG. 7A) further illustrates a mechanism by which a user can interact with the integrated calendar and timeline application herein, including using textual and/or speech inputs, in conjunction with conventional manual inputs.
The cascading arrangement from a top-most event 615 to a bottom-most event (which is hidden off screen) is configured according to the start date of each event bar, which is indicated by the left edge of each event bar element in its vertically stacked placement according to a horizontal time scale element 620, whether that start date falls on a day in the past, the present (current) day, or a date in the future. One day milestone events (e.g., milestone 610 on the date October 31 and the milestone on the date March 23) are placed in the cascading event stack screen configuration according to the single date assigned to each milestone. It should be noted that the end date of a timeline event bar (as represented by the right edge of the bar) does not affect its placement within the cascading stack. Also, in some cases when a timeline event has a very long duration (e.g., top-most Marketing Education event 615) where the end date is represented by the length of the event bar, the end date may occur at some day later than timeline events below it in the cascading stack.
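The cascading stack order, which depends only on start dates (and not on end dates), might be computed as in the following sketch; the event shape is an assumption made for illustration:

```typescript
// Sketch: order timeline events for the cascading stack by start date only.
// The end date (right edge / bar length) does not affect stack placement.

interface TimelineEvent {
  title: string;
  start: Date;
  end: Date;            // for a one-day milestone, end equals start
  isMilestone?: boolean;
}

function cascadeOrder(events: TimelineEvent[]): TimelineEvent[] {
  return [...events].sort((a, b) => a.start.getTime() - b.start.getTime());
}
```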
Similar in some respects to some aspects of the calendar UI embodiments herein, the current day (today) can be persistently indicated by a today vertical line 625 that extends down from the current day in the time scale 620 and may be a gold (or another) color. Times (e.g., days, months, years) to the left of the today line 625 are in the past, and times (e.g., days, months, years) to the right of the today line 625 are in the future. In all representations of the timeline UI 600 at all levels and views according to zoom and pan features described below, the timeline events 605 and time scale element 620 are persistently anchored together in the timeline UI 600, and in a vertical or horizontal pan view of UI 600, the time scale 620 is "pinned" above the timeline events display area containing events 605 and is in view in accordance with any x, y view manipulations or selections as further described below.
Generally, timeline UI 600 incorporates a visual design, element layout, scale, and functional embodiments as described below that collectively provide a completely responsive, easy-to-use experience across all devices and corresponding display sizes and resolutions, including desktop computers, laptops, tablets, and smartphones. In at least one embodiment, the timeline UI may operate in landscape mode on devices with small screens, such as a smartphone, so as not to constrain the timeline view and user operation via the UI.
FIG. 7A includes four representative example UIs to illustrate some aspects herein. UIs 705, 710, 715, and 716 all relate to a timeline of an integrated calendar and timeline application. UIs 705, 710, and 715 reflect different zoom levels but from a same center focus perspective of the same timeline UI that may be variably adjusted by a user in embodiments as described further below. UI 716 reflects a view options menu as one of many alternate method embodiments for user adjustment of zoom levels. UI 705 is a high level view of the timeline, where parts of at least three years are depicted in the time scale 730. UI 710 corresponds to UI 705 but zoomed in one level. Accordingly, a smaller section of the timeline is presented in UI 710, although more detailed information is shown for each timeline event and the timeline time scale 730 now shows month and year information. UI 715 is a visualization representing a fully zoomed-in view of the timeline UI of the time depicted in UIs 705 and 710. As such, an even smaller section of the timeline is presented in UI 715 as compared to UIs 705 and 710, although even more detailed information is shown for each timeline event in UI 715 and the timeline time scale 730 now shows month and day information in a larger scale for easy viewing in support of timeline editing functions that may not be available in UIs 705 and 710. UIs 705, 710, and 715 all show the current day (today) vertical line 720 (also shown in FIG. 6 at 625) as extending down from the current day in the time scale, but the width of the today indication line 720 in UI 715 is depicted as a wider line to accurately indicate a day duration according to the time scale 730 for the fully zoomed-in graphical state of UI 715.
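One way to express the zoom levels and their time scale granularity is the small configuration sketch below. The three levels and their scale units are modeled on UIs 705, 710, and 715, but the structure itself and the span values are assumptions chosen purely for illustration:

```typescript
// Sketch: zoom-level configuration for the timeline time scale 730.
// More (or variable) zoom levels are possible; three are shown here to
// mirror the example UIs 705, 710, and 715. The span values are illustrative.

interface ZoomLevel {
  level: number;
  scaleUnits: "years" | "months-years" | "days-months";
  approximateSpanDays: number;   // how much time is visible on screen
  editingEnabled: boolean;       // editing is available only when fully zoomed in
}

const ZOOM_LEVELS: ZoomLevel[] = [
  { level: 1, scaleUnits: "years",        approximateSpanDays: 3 * 365, editingEnabled: false }, // UI 705
  { level: 2, scaleUnits: "months-years", approximateSpanDays: 270,     editingEnabled: false }, // UI 710
  { level: 3, scaleUnits: "days-months",  approximateSpanDays: 30,      editingEnabled: true },  // UI 715
];

function zoomIn(currentLevel: number): number {
  return Math.min(currentLevel + 1, ZOOM_LEVELS.length);
}

function zoomOut(currentLevel: number): number {
  return Math.max(currentLevel - 1, 1);
}
```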
In one example, navigation within a timeline context of the integrated calendar and timeline application herein (as indicated by the highlighted "Timeline" button 702) may be represented by sequentially presenting UIs 705, 710, and 715 to a user via successive selections of the today button 755, as described below. As seen in UIs 705, 710, and 715, the current day line 720 in each UI indicates the current day (today) as being located in the center of each UI. As such, "today" may be the user's center point of interest (i.e., focus) as determined by, for example, the current day having been previously selected in the calendar UI context before the user navigated to timeline UI 710 via the UI button 310 in FIG. 3. In this manner, the user's point of interest of "today" is seen as being synchronized between the calendar and timeline contexts in an integrated calendar and timeline application herein. In a similar manner, if a user selects another day in the calendar context, for example October 29 in FIG. 3 at 320 as shown, then subsequently selects the timeline button 310, the timeline view would place a second "selected date" line in blue (or another color) at that selected date (i.e., October 29) and position in the time scale element 730, in addition to locating the selected date line in the center of the screen to make October 29 the center of focus in the timeline view.
Referring again to the example where "today" is the user's center point focus of interest as presented in FIG. 7A in UIs 705, 710, and 715, selecting the today button 755 in UI 705 will zoom in one level to UI 710. Similarly, selecting the today button 755 in UI 710 will zoom in to the fully zoomed-in view of UI 715. In this manner, a user may quickly and sequentially zoom in on "today" in a simple two-click fashion while also maintaining the center point focus of interest on "today".
Additionally, as a user may often focus on a current day ("today") depending on a particular use case (e.g., "What's happening today?") for both timeline and calendar events, it may be advantageous for the user to quickly, accurately, and efficiently toggle between calendar views and timeline views via UI button element 703 (in timeline view) and UI button 310 in FIG. 3 (in calendar view). In an embodiment, the user may also zoom in and zoom out of a timeline view via touch gestures on mobile touch screen enabled devices. For example, when in UI 705, a two finger pinch-out gesture may be used to zoom in one level to UI 710. Similarly, when in UI 710, pinching out again will zoom in to the fully zoomed-in view of UI 715 while maintaining the center point of focus around "today". In some aspects, when in UI 715, using a two finger pinch-in gesture will zoom out one level to UI 710. Similarly, when in UI 710, pinching in again will zoom out to the fully zoomed-out view of UI 705, while maintaining the center point of focus around "today". It should be noted that, as described herein, zoom levels may include more than the three levels depicted in FIG. 7A at UIs 705, 710, and 715. It may be possible to incorporate variable zoom levels that present many more levels of zoom than those in the specifically disclosed examples, wherein the specifically illustrated zoom levels do not limit the zoom functionality of the present disclosure.
In an embodiment, a user may select the view options button 704 in the UIs of FIG. 7A to open the view options menu 760, as shown in UI 716. As a method in both mouse pointer and touch devices, when in the fully zoomed-in UI 715, selecting option list item element 765 as shown in UI 716 will zoom out one level to UI 710 and close the view options popup 760 while maintaining the center point of focus. Similarly, when in UI 710, the user may again select the view options button 704 to open the view options menu 760 as shown in UI 716 and select option list item element 765 to zoom out one level to UI 705 and close the view options popup 760, while maintaining the center point of focus. Now, in the fully zoomed-out view of UI 705, the user may select the view options button 704 to open the view options menu 760 as shown in UI 716 and select option list item element 765, which will now state "Zoom In" (not shown; instead of "Zoom Out") to zoom in one level to UI 710, with similar actions available from UIs 710 and 715.
FIG. 7B includes three representative example UIs to illustrate some aspects herein. UIs 717, 718, and 719 all relate to a timeline of an integrated calendar and timeline application, each at a different level of zoom and each with a different center point focus, as determined by a user's actions. UI 717 is at a highest level of zoom of the timeline, whereby the center point focus is roughly at a currently selected day represented by the blue line 778 and vertically oriented to visually show a maximum number of timeline events within the screen's view. In some embodiments, this view may be the last viewed timeline view as seen by a user (sticky-view). In another embodiment, this view may have been established by a previous calendar date selection (represented by the blue line) before switching from a calendar context to a timeline context (i.e., as disclosed by the synchronized behaviors between timeline and calendar contexts herein). In an embodiment, as shown in UI 717, a user clicks, or on a touch enabled device taps, on the timeline screen at a point 770 to the right and slightly above the horizontal screen center to designate a new center point of focus and rough estimation of a date that is of interest to the user. After the user clicks or taps the screen at point 770, the timeline UI 717 renders a new day indication line element 779 in blue (or another color) in a wider line format and simultaneously pans and zooms in one level in an animation to transition to UI 718, which shows a perspective according to the selection at 770. Now at this zoom level, the user recognizes that a point of interest is the milestone 773 and adjacent business trip event 775. In the present example, these events may be of interest because the user is currently at the trade show event 771 today and wants to view the next upcoming events after the current trade show ends in 3 days. The user may then click or tap on point 772, to the right of the displayed selected day line 779 and slightly below the horizontal center of UI 718, to designate a new center point of focus 774 and date 779 that is of interest to the user. After the user clicks or taps the screen at point 772, the timeline UI displays a new day indication line element 779 and simultaneously pans and zooms in one level in an animation to transition to UI 719, the fully zoomed-in view that now has an entirely new center point of focus 774 in the timeline UI 719, with the date line element 779 shown in a width to clearly indicate that the newly selected day from UI 718 is November 2nd. Now, the milestone 773 and Business trip 775 are roughly in the center point focus 774 and are easily identifiable. The current day (today) line indication 777 is located well off towards the left side of the screen view as compared to UI 717 and UI 718, as determined by the user selecting the new center point focus and date 772 in UI 718.
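Translating a tap or click position into a newly selected date, as in the transition from UI 717 to UI 718, might look like the following sketch; the pixels-per-day scale, the helper names, and the example values are assumptions for illustration only:

```typescript
// Sketch: map a horizontal tap/click position to a date on the time scale,
// which becomes the new selected-day indication line and center point of focus.

const MS_PER_DAY = 24 * 60 * 60 * 1000;

interface TimeScaleView {
  leftEdgeDate: Date;   // date rendered at the left edge of the time scale
  pixelsPerDay: number; // horizontal scale at the current zoom level
}

function dateAtPixel(view: TimeScaleView, xPixels: number): Date {
  const daysFromLeft = xPixels / view.pixelsPerDay;
  return new Date(view.leftEdgeDate.getTime() + daysFromLeft * MS_PER_DAY);
}

// Example: at a zoom level showing 10 pixels per day, a tap 250 pixels from
// the left edge selects the day 25 days after the left-edge date.
const tapped = dateAtPixel(
  { leftEdgeDate: new Date("2011-10-01T00:00:00"), pixelsPerDay: 10 },
  250
);
console.log(tapped.toDateString());
```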
A milestone herein (e.g., the milestone indicated at 773) is a one day event and is denoted in the present example as a diamond-shape UI element representation, as shown in FIG. 7B, UIs 718 and 719. In some aspects, a milestone may be added to a timeline herein in a variety of methods, including for example, via an options menu (not shown), a natural language input mechanism (e.g., 725, FIG. 7A), and other methods described below.
It should be noted that this "tap or click to zoom and change a center point of focus" feature may occur at zoom levels such as those shown in UIs 717 and 718, but not in the fully zoomed-in view UI 719 (as further zooming in is not possible in that UI in the present example). Also, timeline edit modes and features for timeline events in the timeline UI may be available in the fully zoomed-in view UI 719, which will be described further below.
FIG. 7C includes two example representative UIs to illustrate some aspects herein. UIs 780 and 782 relate to a timeline of an integrated calendar and timeline application, each at a different panned view with a respective different center point focus, as may be determined by a user's interaction with a UI herein (e.g., dragging a screen view). In an embodiment, a user clicks and holds, or on a touch screen device presses and holds, on the screen at point 786 in UI 780 and drags the screen diagonally to point 787 and then releases the screen. During this drag action, the timeline events area 781 pans in real time according to the direction and distance of the user's drag action. Simultaneously, according to the horizontal distance of the drag action, the anchored time scale area 783 slides towards the left to a new scale position 784 as shown in UI 782, thereby accurately correlating the time scale to the timeline events below it as generally shown in UI 782. Now, in UI 782, a new center focus point 788 is established. It should be noted that the drag to pan action in this example of FIG. 7C does not establish a new day as does the tap action described in connection with FIG. 7B. However, at any time in the fully zoomed-in level 3 the user has the option to tap or click on a day in the scale bar. For example, a user may click on a day in the time scale bar as shown in UI 782 at point 789 to select a new day and cause the placement of the indication line element (as shown in FIG. 7B in UI 719 at 779). In some embodiments, a dragging action may be used in any zoom view and in any distance and direction, including up, down, left, right, and any combinations thereof. In an embodiment, a drag action may be interchangeably substituted by a swiping action using a mouse (or other) pointer device on a desktop or laptop or a touch gesture on other (e.g., mobile) devices.
FIG. 7D includes four representative UIs, as an example, to illustrate some aspects herein. UIs 790, 792, 794, and 796 all relate to a timeline of an integrated calendar and timeline application for adjusting both a pan and a zoom view using a "view pane" tool. UI 790 is a representation of a fully zoomed-out timeline view. In this example, a user double clicks via a mouse, or on a touch screen device presses and holds, on location 791 in the timeline. Upon the double click or press and hold action, a view pane element 793 is displayed, as shown in UI 792. The size and shape of the view pane element may correlate to a screen that is zoomed in by one level in UI 792. The user then drags the view pane to a new location from point 795 to point 797, then releases the mouse click or press and hold gesture. As shown in UI 794, the view pane 793 is now located at a position that surrounds timeline events of interest to the user. In this manner, the timeline events of interest are clearly indicated within the view pane element 793. The user may then click or tap on any (arbitrary) location within the view pane element 798, for example point 799, which in turn zooms in one level to yield UI 796, while maintaining the area of focus as defined by the view pane. It should be noted that a view pane herein may be located at any point in the timeline, for example in the lower right corner of the timeline, which will establish a new center point of focus for a subsequent zoomed-in view. In the present example, at UI 796, the user may display another view pane element that will be representative of the size and shape (smaller size) for the next fully zoomed-in view. In some embodiments, at any time when a view pane element is displayed, a user may dismiss it by clicking or tapping at a point outside the view pane element. This action will not change the view or center focus point, but instead only dismiss the view pane element from view.
Collectively, aspects of the embodiments as described in FIGS. 7A-7D may be interchangeably available at any time as desired by a user to adjust timeline views responsively and adaptively across all device types (i.e., desktops, laptops, tablets, smartphones, etc.) and device screen sizes, resolutions, and orientations to provide an optimized user operation and an intuitive user experience.
FIG. 8A includes four representative UIs as examples to illustrate some aspects herein. UIs 800, 802, 804, and 806 relate to a timeline context for drilling down into a timeline event to view and edit details related to a specific timeline event. UI 800 depicts a fully zoomed-in view of a timeline. As such, this fully zoomed-in view enables drill-down and editing of a selected timeline event. In an example, a user clicks or taps on the "Develop Opportunities" event element 801, which then presents the details of the event, as shown in UI 802. As shown, the user may view depicted details of the selected timeline event including, for example, a start date, an end date, a location, etc. The user has the option to navigate back to the timeline view UI 800 via "Back" button element 803. In an embodiment, the user may select the action menu button 805 to view options, as shown in UI 804 and options popup menu 807. In response to a selection of the "Edit" function element 809 and then the OK button element 811, the edit mode UI 806 is presented. The user may edit certain fields and timeline event attributes via UI 806, including, for example, "Activity Title", "Activity Type", "Start Date", "End Date", "Location", etc. in the UI's editable text fields and selection elements. After making edits to the timeline event, the user may select the "Done" button element 812 to save the edits and return to the timeline event detail screen in UI 802. Optionally, the user may select the cancel button element 813 in UI 806 to return to UI 802 without saving any edits performed therein. In UI 802, the user may select the back button element 803 to return to the timeline UI 800. In this manner, easy and intuitive selection and editing of timeline event 801 may be performed. As an example, a milestone event (i.e., a key timeline event having a one day duration) may be specified in UI 806 via the "Activity Type" attribute field shown therein. As a milestone event, the start date and end date for the milestone event will be the same date (i.e., one day in duration).
FIG. 8B includes four representative UIs as examples to illustrate some aspects herein. UIs 815, 817, 819, and 821 relate to a timeline context for selecting a timeline list view and related optional filter selections. UI 815 depicts a partially zoomed-in view by one level. In an embodiment, the user may select the view options button element 816 to open the view options menu 824 shown in UI 817, wherein the user can select list view button element 818 to change the timeline view to a timeline list view as shown in UI 819. In an embodiment, the view options popup menu 824 also presents the user with the option to select or deselect certain filter options (e.g., show or hide milestones or high priority timeline events in the list view as indicated at 822 and 823, respectively). In this manner, a user may customize the list of timeline events to their liking. Upon selecting the "OK" button element 825, the list view is presented in UI 819 to show timeline events in a listed format arrangement. As in the timeline view, the list view is arranged in descending order from the earliest event start date (at the top of the list) to the latest event start date (at the bottom of the list). It should be noted that the list view in UI 819 may be scrollable to view event list items that may be hidden due to device screen size constraints. The list view in UI 819 may also be searchable via an action menu (not shown) or using the natural language input element 826 for searching by a user to find specific event(s). Similar to the embodiments in FIG. 8A, a user may select an event, for example event 820, to view a specific event's details as shown in UI 821.
FIGS. 8C and 8D include eleven example representative UIs to illustrate some aspects herein. UIs 827-837 relate to aspects for adding subtasks to events of a timeline including zoom views herein. UI 827 depicts a fully zoomed-in timeline allowing drill-down and editing of a selected timeline event. As in FIG. 8A, a user clicks or taps on a timeline event 838 in UI 827 (i.e., "Business Trip"). This action presents the details of the event as shown in UI 828. In an embodiment, the user may select the action menu button 839 to view options as shown in UI 829 in options menu popup 840. In an embodiment, upon selecting the Add Subtask element 841 and then the OK button element 842, the add subtask popup UI 830 is presented with the title tab element 843 selected as a default action. In the text field element 844, the user may enter a text title element 845 (i.e., "Calls") via a device keyboard. In an embodiment, adding a title may be required before allowing the user to make any other tab or OK button selections except for canceling the operation via UI button element 846.
Moving to UI 831, the user has selected the Start Date tab element 847 that causes the presentation of a date picker control 848. In an embodiment, date picker slider elements 849, 850, and 851 may be partially constrained (e.g., not selectable) to limit the duration of the subtask to a timespan within the duration of timeline event 838, which in this example represents the overall duration of the timeline event and governs the maximum duration and the related start dates and end dates of its subtasks. Moving on to UI 832, the user has selected the End Date tab element 852 that causes the presentation of a date picker control 853 whereby the user can select a date representing the end date for the subtask entitled "Calls" within the timeline event 838 (i.e., Business Trip). The user can then select an optional color palette button element 854 (indicated in a default color of, for example, purple or another color), which then causes the presentation of a color picker popup 855 as shown in UI 833 of FIG. 8D. As indicated by the checkmark element 856, the default color is purple (or another color). In an example, the user selects the green color element 857, which would remove the purple checkmark element 856 and re-locate it within the green list element 857 while simultaneously changing the selected color display element 858 from purple to green to indicate the color change. Now, having completed entering all subtask parameters, the user may select the OK button element 859 in UI 833 to return to the timeline event detail UI 834. However, now the newly entered and saved subtask of "Calls", element 860, is presented in green colored text as specified by the user.
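By way of illustration only, the constraint described above, in which a subtask's selectable dates are limited to the duration of its parent timeline event, can be expressed as a simple clamping rule. The following is a minimal TypeScript sketch; the names (DateRange, clampSubtaskRange) and the one-day fallback are illustrative assumptions rather than the disclosed interface.

```typescript
interface DateRange {
  start: Date; // inclusive start of the range
  end: Date;   // inclusive end of the range
}

// Clamp a proposed subtask range so it never extends outside its parent timeline event.
function clampSubtaskRange(parent: DateRange, proposed: DateRange): DateRange {
  const start = proposed.start < parent.start ? parent.start : proposed.start;
  const end = proposed.end > parent.end ? parent.end : proposed.end;
  // If the clamped range collapses, fall back to a one-day subtask at the clamped start.
  return end < start ? { start, end: start } : { start, end };
}

// Example: a "Calls" subtask proposed to run past the end of a "Business Trip" event is clamped.
const businessTrip: DateRange = { start: new Date(2011, 9, 24), end: new Date(2011, 9, 28) };
const calls = clampSubtaskRange(businessTrip, {
  start: new Date(2011, 9, 25),
  end: new Date(2011, 9, 31), // outside the parent event, so it is clamped to October 28
});
console.log(calls.end.toDateString()); // Fri Oct 28 2011
```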
In an embodiment, referencing detail UI 834, the user has an option to select Add Subtask element 861 to add an additional subtask using the same method as described in UIs 830, 831, 832, and 833 and illustrated by element 862, "Close Sales".
In an embodiment, subtasks may incorporate overlapping dates and durations within the overall duration, start date, and end date of a timeline event. In this manner, a user may enter any number of subtasks, stacked and/or cascading. In another embodiment, subtasks might not be permitted to incorporate overlapping durations and/or start and end dates, thereby causing such subtasks to be sequential as illustrated in UI 835. It should be noted that selecting color variants for different subtasks provides a readily apparent visual means for identifying and differentiating between subtasks at all levels of zoom and adaptive reductions of visible content (e.g., titles of subtasks) of the timeline and timeline event 863, as shown in UIs 835, 836, and 837. Now, referring back to UI 834, a user may select the back button element 863 to return to the timeline view UI 835. In a fully zoomed-in view, timeline event 863 includes two subtasks occurring in a sequential manner as described above. In an example timeline, event 863 (Business Trip) now includes subtask 864 ("Calls") and subtask 865 ("Close Sales"), as indicated in colors green and purple, respectively.
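For embodiments in which subtask durations may not overlap, a candidate subtask can be checked against the subtasks already attached to the event before it is accepted. The following sketch assumes inclusive date ranges; the function names are illustrative only.

```typescript
type DateRange = { start: Date; end: Date }; // inclusive on both ends

// Two inclusive date ranges overlap unless one ends strictly before the other starts.
function overlaps(a: DateRange, b: DateRange): boolean {
  return a.start <= b.end && b.start <= a.end;
}

// A candidate subtask is acceptable in the "sequential" embodiment only if it
// overlaps none of the subtasks already attached to the timeline event.
function canAddSequentialSubtask(existing: DateRange[], candidate: DateRange): boolean {
  return existing.every((subtask) => !overlaps(subtask, candidate));
}
```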
FIG. 9A includes four representative example UIs to illustrate some aspects herein. UIs 900-903 relate to a timeline for a user to perform quick edits to a selected timeline event without the need to drill down to event details, select an action button to present an options popup menu, and select edit as disclosed in FIG. 8A above. In some timeline applications, adjusting the end date or start date of a timeline event may be one of the most used edits in the course of managing a timeline project. As described below, a "quick edit" functionality may be used as an alternative to using an options popup menu method. Referencing UI 900, in an embodiment, a user clicks down or presses and holds (on touch devices) on the right end (e.g., 904) portion of the Develop Opportunities timeline event bar and then immediately drags the right end portion of the timeline event bar 904 towards the right as shown in UI 901 and UI 902 at 904. In an embodiment, in response to the click and hold or press and hold action (just before the dragging action), the timeline view immediately changes to an edit mode as shown in UI 901 and UI 902 wherein the time scale 907 and timeline event bar element 906 are displayed in a blue color (or another color). In this example, the user desires to extend the timeline event 906 by one day, as illustrated by the distance of the drag action being equal to one day in the time scale element 909 in UI 902. Upon reaching the desired one day scale position, the user releases the down click or press and hold action, thereby exiting the edit mode and returning to the normal state as shown in UI 903 where the Develop Opportunities timeline event end date 910 is extended one day from October 31st to November 1st. In an embodiment and in a similar drag action, a user may adjust the end date to an earlier date by dragging the right (i.e., end date) portion of the timeline event bar towards the left. In an embodiment, the user may use a similar method on the left (i.e., start date) portion of a timeline event bar element to adjust and edit the start date. It should be noted that changing the start date of a timeline event may cause a change in the vertical placement within the cascading stack of timeline events, as described in FIG. 6 above.
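The drag-to-adjust behavior described above reduces to converting the horizontal drag distance, at the current pixels-per-day scale, into a whole number of days applied to the end date. The sketch below is a simplified assumption of that calculation; the function name and the clamping of the end date to the start date are illustrative only.

```typescript
// Convert a horizontal drag distance into a day adjustment of the end date.
function dragEndDate(endDate: Date, dragPixels: number, pixelsPerDay: number, startDate: Date): Date {
  const deltaDays = Math.round(dragPixels / pixelsPerDay); // positive = drag right (later), negative = drag left (earlier)
  const result = new Date(endDate);
  result.setDate(result.getDate() + deltaDays);
  // The end date is never allowed to precede the start date.
  return result < startDate ? new Date(startDate) : result;
}

// Example: a drag of one day's width (e.g., 48 px at 48 px/day) extends an event
// ending October 31, 2011 to November 1, 2011.
const newEnd = dragEndDate(new Date(2011, 9, 31), 48, 48, new Date(2011, 9, 20));
console.log(newEnd.toDateString()); // Tue Nov 01 2011
```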
FIG. 9B includes three representative example UIs to illustrate some aspects herein. UIs 911-913 relate to a timeline for a user to perform quick edits to a selected timeline event without the need to drill down to event details, select an action button to present the options popup menu, and select edit as described in FIG. 8A above. In an embodiment, a user either double clicks or presses and holds (on touch enabled devices) on timeline event bar 914 in UI 911, entitled "Develop Opportunities". The integrated calendar and timeline system herein recognizes either of these actions (depending on the device being used) and presents a quick edit popup 915 as shown in UI 912, while at the same time enabling a timeline edit mode as illustrated by timeline event 914 and the time scale element 917 changing to a blue (or another) color. The quick edit popup 915 includes two user selectable tab elements 918 (start date) and 919 (end date) and a cancel quick edit button element 920. In an example, selecting the cancel quick edit button element 920 will remove the quick edit popup, not save any submitted edit entries, and exit edit mode. In another example, selecting the start date tab 918 displays the date picker popup 921 shown in UI 913, thereby providing a mechanism for a user to adjust the start date as previously described in FIG. 8C at UI 831. Similarly, if desired, a user may select the end date tab 919 in UI 912 or the end date tab 922 in UI 913 to adjust the end date of the timeline event 914. When the user is satisfied with the edits to the timeline event 914, selecting the OK button element 923 saves the changes and closes the date picker popup 921.
FIG. 9C includes two representative UI examples to illustrate some aspects herein. UIs 923 and 924 relate to a timeline for a user to perform quick edits to a selected timeline subtask event without the need to drill down to the overall event details in which subtasks are contained, select a subtask event from a timeline event details page, and edit the subtasks in a fashion similar to that described in FIG. 8A above. In an embodiment, similar to some actions described with regards to FIG. 9B above, a user either double clicks or presses and holds (on touch enabled devices) on a timeline event subtask element 925 ("Calls") or subtask 926 ("Close Sales"), as shown in UI 923. The integrated calendar and timeline system herein can recognize either of these actions on a respective timeline subtask and presents a quick edit popup 927 as shown in UI 924, while at the same time enabling a timeline edit mode as illustrated by timeline subtask 928 and the time scale element 929 changing to a blue (or another) color. Subsequently, user actions, navigations, and selections may be made in a manner similar to those described in FIG. 9B above, thereby keeping the GUIs herein consistent and intuitive.
Collectively, quick edit embodiments as described in FIGS. 9A, 9B, and 9C may be interchangeably available at any time as desired by a user to adjust timeline and timeline subtask start and end dates, responsively and adaptively across all device types (i.e., desktops, laptops, tablets, and smartphones), device screen sizes, resolutions, and orientations to provide an optimized user operation and an intuitive user experience for a synchronized calendar and timeline application.
FIG. 10A includes an example of a simplified, yet illustrative, flow diagram relating to some aspects of an integrated calendar and timeline system application, including aspects relating to synchronizing calendar and timeline views according to user actions, selections and navigations.
Referring to FIG. 10A, calendar view 1000 represents a day view wherein a current day (today) context is displayed (similar to the view shown in FIG. 4A, UI 425, where the current day is shown as October 28, 2011). Referring to FIG. 10A, a selected day may be either the current calendar day (today) or another day selected by a user. All calendar view UIs 1025 persistently contain a day context no matter what view (e.g., day, week, month, or list) is selected. If a user has not selected a particular day, the day context defaults to the current calendar day (today) as indicated by a gold highlight (or another color) or in the timeline by a vertical day line in gold (or another color), as shown in FIG. 6 at 625. Both calendar and timeline current day (today) indications are always (persistently) displayed in all views. In a calendar view, if the user selects another day in the past or future, that selected day is shown in a blue highlight (or another color) in addition to the current day indication as shown in the week view of FIG. 3, UI 325, at 320 (user selected day) and 315 (current day). In the timeline context, a user selected day is shown by a vertical day line in a blue (or another) color as shown in FIG. 7B at 778 and 779, and the current day (today) is shown as a gold (or another color) vertical line, as depicted in FIG. 7B in UI 718 and UI 719 at 776 and 777, respectively. In some aspects, a user may select or change a day selection in either the calendar or timeline contexts in all calendar views (day, week, month), and in a timeline at any zoom level as described in connection with FIG. 7B and FIG. 7C at 779. As related to calendar and timeline synchronized views in diagram 10A, all references to a "selected day" may be to any of a current day (today), or another user selected day from either a calendar view context or timeline view context.
Still referring to FIG. 10A, calendar view 1000 indicates a calendar day view of a selected day. A user may then navigate to a toggle button 1002 via logical connection 1001 and select the timeline view button, which then displays the timeline view 1003 in the fully zoomed-in state (level 1). Timeline view 1003 displays the selected day from the calendar view 1000 as a vertical line indication with the view panned to show events that intersect with the selected day indication line in the center focus of the screen, as can be seen in the example disclosed in FIG. 7B at UI 719. In timeline view 1003, the user may either retain the selected day determined in view 1000 or select a new (different) day. In an embodiment wherein the user does not change the selected day, the user may navigate to the toggle button 1002, select calendar view, and return to the calendar day view and the selected day as was originally indicated in view 1000. In this manner, the calendar and timeline views are contextually synchronized to show relevant screen information in both the calendar view and the timeline view with minimal need, if any, for the user to adjust either view to review relevant event information, wherein the calendar and timeline toggle button 1002 operates as a mechanism to switch between synchronized calendar view 1000 and timeline view 1003.
In an embodiment wherein the user selects a new day, for example in the timeline view 1003, and navigates back to the calendar day view 1000 via the toggle button 1002, the calendar day view 1000 will now indicate the newly selected day as changed in timeline view 1003. In this case, the calendar day view 1000 is synchronized with the timeline view 1003 selected day change, which will be shown in the calendar day view 1000 as the selected day (highlighted). In an embodiment, the reverse synchronization may be performed. For example, if a user changes the selected day in the calendar day view 1000 and navigates to the timeline view 1003 via toggle button 1002, then that new (or changed) selected day will be represented in the timeline view 1003.
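As a simplified illustration of this bidirectional synchronization, the selected day can be held in a single shared piece of state that both the calendar and timeline views read and write, so a change made in either context is reflected when the user toggles views. The class and method names below are illustrative assumptions, not the disclosed API.

```typescript
type ViewContext = "calendar" | "timeline";

class SelectedDayStore {
  private selectedDay: Date = new Date(); // defaults to the current day (today)
  private listeners: Array<(day: Date, source: ViewContext) => void> = [];

  // Either view may change the selection; all registered views are informed.
  select(day: Date, source: ViewContext): void {
    this.selectedDay = day;
    this.listeners.forEach((listener) => listener(day, source));
  }

  get current(): Date {
    return this.selectedDay;
  }

  onChange(listener: (day: Date, source: ViewContext) => void): void {
    this.listeners.push(listener);
  }
}

// Example: a day picked in the timeline is what the calendar highlights after toggling.
const store = new SelectedDayStore();
store.onChange((day, source) => console.log(`${source} selected ${day.toDateString()}`));
store.select(new Date(2011, 9, 28), "timeline");
```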
In an embodiment, a user may change the calendar view to any of a day, week, or month view as indicated in day view 1000, week view 1004, and month view 1007 via logical connections 1016 and 1017, respectively. In an embodiment, a user may change the timeline view to any of a zoom level 1, 2, or 3 as indicated in timeline view 1003, timeline view 1006, and timeline view 1009 via logical connections 1013 and 1014, respectively, including by any of the multiple interchangeable methods as described in FIGS. 7A and 7B.
In an embodiment as shown in diagram 10A, toggling between calendar views 1025 and timeline views 1026 is also hierarchically synchronized. As an example, toggling between calendar view 1000 and timeline view 1003 consistently maintains a hierarchical relationship between the calendar day view 1000 and the timeline zoom level 1 view 1003. In an embodiment, toggling between calendar view 1004 and timeline view 1006 maintains a hierarchical relationship between the calendar week view 1004 and the timeline zoom level 2 view 1006. In an embodiment, toggling between calendar view 1007 and timeline view 1009 maintains a hierarchical relationship between the calendar month view 1007 and timeline zoom level 3 view 1009.
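The hierarchical pairing maintained by the toggle (day with zoom level 1, week with zoom level 2, month with zoom level 3) can be captured as a small lookup in either direction. The following is an illustrative sketch only; the disclosed system may derive this relationship differently.

```typescript
type CalendarView = "day" | "week" | "month";
type TimelineZoom = 1 | 2 | 3;

// Hierarchical pairing: day <-> zoom level 1, week <-> zoom level 2, month <-> zoom level 3.
const toTimelineZoom: Record<CalendarView, TimelineZoom> = { day: 1, week: 2, month: 3 };
const toCalendarView: Record<TimelineZoom, CalendarView> = { 1: "day", 2: "week", 3: "month" };

type ViewState =
  | { kind: "calendar"; view: CalendarView }
  | { kind: "timeline"; zoom: TimelineZoom };

// Toggling always lands on the hierarchically corresponding view.
function toggleView(current: ViewState): ViewState {
  return current.kind === "calendar"
    ? { kind: "timeline", zoom: toTimelineZoom[current.view] }
    : { kind: "calendar", view: toCalendarView[current.zoom] };
}

console.log(toggleView({ kind: "calendar", view: "week" })); // { kind: "timeline", zoom: 2 }
```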
In the synchronized toggle and hierarchical embodiments described above, a selected day as defined above can be persistently indicated in all calendar views (by highlight), in all timeline views (indicated by day line and pan to center focus), and in a list view by live auto-scrolling (in an animated fashion) to show a selected day to facilitate maintaining a user's temporal focus and a contextual relevance between calendar views 1025 and timeline views 1026.
In an embodiment, a user may select a calendar list view 1010 from any one of the calendar day, week, or month views 1000, 1004, and 1007 via logical connections 1022, 1023, and 1024, respectively. In an embodiment, the user may navigate from the list view 1010 back to any of the calendar day, week, or month views 1000, 1004, and 1007 via logical connections 1022, 1023, and 1024, respectively. Similarly, in some embodiments the user may select a timeline list view 1012 from any of the timeline zoom level 1, 2, and 3 views 1003, 1006, and 1009 via logical connections 1019, 1020, and 1021, respectively. In an embodiment, the user may navigate from the list view 1012 back to any of the timeline zoom level 1, 2, or 3 views 1003, 1006, and 1009 via logical connections 1019, 1020, and 1021, respectively.
In an embodiment, an example method for toggling between calendar or timeline views and a corresponding list view as depicted in FIG. 10A may be accomplished as shown in FIG. 8B at UI 815, UI 817, and UI 819. In an embodiment, a user may also interchangeably use the natural language input mechanism(s) to change/toggle between calendar or timeline views and corresponding list views.
FIG. 10B includes an example of a simplified yet illustrative flow diagram relating to some aspects of an integrated calendar and timeline system application, including aspects relating to synchronizing calendar and timeline event entries, edits, and deletions according to user actions, system imposed actions, and temporal aspects.
In FIG. 10B, flow diagram 1030 represents a calendar entry synchronization flow and flow diagram 1040 represents a timeline entry synchronization flow. In flow diagram 1030, operation 1031 represents a user inputting new calendar event entries (e.g., appointments, meetings, calls, etc.) of one day or less in duration, according to a date, time, topic (title), duration, etc. In an embodiment, a user may make edits (or deletions) to existing calendar events in an integrated calendar and timeline application or an event imported from another application registered with the integrated calendar and timeline application. In an embodiment, upon completion or user acceptance, these events may be saved to the system and subsequently made available to be viewed in all calendar views in operation 1032, according to temporal aspects (time, duration, date, etc.). Subsequent to saving the calendar events, edits, or deletions to calendar events within the integrated calendar and timeline system application, these saved events or changes to events are registered with a fetch functionality in operation 1033 to provide multiple alerts, notifications, reminders, calls to action, other fetch enabled push messages, etc. to relevant users of the fetch system, according to a context of the event, edit, or deletion relative to the user, and in many other contexts as described herein. In this manner, the user inputted event (and/or other external event inputs) can be synchronized with many aspects of the system, including multiple external users of the system to coordinate temporal calendar and timeline events, communications, calls to action, and collaborations herein.
In flow diagram 1040, operation 1041 represents a user inputting new timeline event entries (e.g., tasks, milestones, projects, etc.) of one day or greater in duration according to a start date, an end date, a topic (title), a sub-task, etc. In an embodiment, the user may make edits (or deletions) to existing timeline events in an integrated calendar and timeline application or a timeline event imported from another application registered with the integrated calendar and timeline application. In an embodiment, in response to completion or user acceptance, these timeline events may be saved to the system and then subsequently made available to be viewed in all calendar and timeline views in operation 1042, according to temporal aspects (time, duration, date, etc.). Subsequent to saving the timeline events, edits, or deletions to events within the integrated calendar and timeline system application, these saved events or changes to events are registered with the fetch functionality in operation 1043 to provide multiple alerts, notifications, reminders, calls to action, other fetch enabled push messages, etc. to relevant users of the fetch system, according to a context of the event, edit, or deletion, including the user, and in many other contexts as described herein. In this manner, a user inputted event (and/or other external event inputs) can be synchronized with many aspects of the present system, application, and service, including multiple external users of the system to coordinate temporal calendar and timeline events, communications, calls to action, and collaborations herein.
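Both flows share the same save-then-register pattern: an entry is persisted and made visible to the relevant views, and the change is then handed to the fetch functionality so alerts and notifications can be pushed to relevant users. The listing below is a minimal sketch of that pattern under assumed interfaces; none of these names are the disclosed APIs.

```typescript
interface ScheduledEntry {
  title: string;
  start: Date;
  end: Date;
  kind: "calendar" | "timeline"; // one day or less vs. one day or greater in duration
}

interface FetchService {
  // Registers a saved change so alerts, reminders, and push messages can be generated.
  register(entry: ScheduledEntry, change: "created" | "edited" | "deleted"): void;
}

async function saveEntry(
  entry: ScheduledEntry,
  persist: (entry: ScheduledEntry) => Promise<void>,
  fetchService: FetchService,
): Promise<void> {
  await persist(entry);                    // corresponds to operation 1032 / 1042: save and expose to views
  fetchService.register(entry, "created"); // corresponds to operation 1033 / 1043: notify relevant users
}
```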
FIG. 11 illustrates an example embodiment relating to natural language inputs in the context of the present disclosure. Natural language can be used to enter user inputs herein in some embodiments, either via spoken user entries or text entries. A number of the UIs discussed above include a UI element for entering natural language inputs. For example, UI 325 in FIG. 3 includes a natural language input control 345, wherein text input (as indicated at 335) and spoken input (as indicated by the microphone icon 340) can be entered. The natural language submitted by a user, whether spoken or text, may be compared to a knowledge base associated with the user. The knowledge base may be consulted to determine a meaning for the user's input based on, at least in part, the user's previous entries. To facilitate this feature, the systems and applications herein may operate, via machine learning functionalities and processes, to "learn" what a user means as the user interacts and interfaces with the systems and applications disclosed herein. The machine learning components may, in some instances, learn the meaning of words and phrases from a user's most-used, recent, or certain voice prompts and syntax to build a user-specific set of rules and a key-word library. The machine learning aspects and features may be especially useful for executing compound-action navigations and other action requests. This aspect of machine learning may include, but need not be limited to, speech recognition aspects, pattern learning, and user preferences. Systems and applications herein may use speech recognition functions to, for example, identify contextual key words within a prompt to interpret the desired action, and may incorporate and consult speech recognition libraries, databases, and metadata associated with databases. In some aspects, the natural language inputs may be analyzed with respect to time-based contexts including calendar, timeline, date and time contexts, to the exclusion of other contexts and words. In this manner, the focus and relevance of the systems herein may be maintained and the disclosed system(s) may operate efficiently.
When using natural language inputs, a user may, for example, be automatically navigated from a current view location to a target view location in response to one natural language input from a user, even where multiple steps are needed to navigate from the starting location to the target location. In contrast to a typical manual input where the user must manually navigate through each step, embodiments herein automatically navigate from the start location to the target location and include animations and/or transition visualizations for each step in the journey. In some regards, this feature may provide a level of context for a user by visually conveying to the user where they are starting from, where they end (i.e., the target location), and how they got to the target location, automatically.
According to FIG. 11, a user starts at an initial view at 1105. The user then determines (1110) that they would like to go to a different view. For example, the user may want to navigate from a first day to another day in an effort to schedule a meeting on the second date. At 1115, the user enters their desired action ("go to May 4, 2017") using natural language. In response thereto, the system or application automatically determines the meaning of the user's natural language input. Such analysis may consult and compare the input(s) to past inputs from the user, metadata associated with a database of acceptable entries, and other mechanisms in an effort to ascertain a logical and correct meaning for the user's input. In some example embodiments, compound actions may be entered via natural language herein. A compound action may include multiple operations or steps to complete an action, as opposed to actions including a single task. For example, a natural language entry of "schedule a meeting with Bob Vance in the main conference room next Tuesday and notify my assistant of the meeting" might trigger a series of events, including but not limited to, for example, determining what day is "next Tuesday", determining an open time slot (if any) in the user's schedule and the main conference room, obtaining Bob Vance's contact information to invite him to the meeting, and informing the user's assistant of the meeting (e.g., including accessing the user's contacts). In this manner, a user may enter the compound action in a single natural language input entry, as opposed to multiple simple or single job tasks. Upon determining the meaning of the user's natural language input, the system and application may automatically navigate to the location or action determined to be a target location or action based on the analysis of the user's natural language input. In some embodiments, the system automatically generates and presents UI visualizations for each step 1125 of the navigational path from the user's current location to the target location 1135 corresponding to the action specified in the user's natural language input. The visualizations may be presented in the form of animations, transitions between one view and a next view, and otherwise sequencing from one view to a next view as one might see in a manual navigation through each of the corresponding manual navigation steps at 1130. In some embodiments, the system may automatically enter content (auto-fill) into text fields and other UI contexts, including but not limited to a meeting or timeline task subject title, meeting attendee names, task start or end dates, geographical location information, etc., as anticipated and determined via the system's machine learning and other system aspects. For example, a natural language input could be recognized and automatically determined by the system to be calendar-recognized content. As an example, the spoken input of "Sara . . . create a new appointment with Jack from XYZ Corp. for Tuesday during the June conference in Iowa" could be recognized by the system as being a calendar context input. As the system is able to recognize the referenced time (i.e., "next Tuesday") in its timeline, and has access to the XYZ company information and associated contacts, a new (at least partially) pre-populated meeting detail could be generated and presented to the user.
In response, the user could then view his calendar for open times (in some embodiments, some suggested times could be proactively presented to the user) and complete the meeting form with additional details.
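The decomposition of a compound natural language request of the kind described above into an ordered list of single-purpose steps can be sketched as follows. The keyword-based parsing and the step names here are illustrative assumptions only; as described above, the disclosed system relies on a knowledge base and machine learning rather than fixed patterns.

```typescript
type Step =
  | { kind: "resolveDate"; phrase: string }
  | { kind: "findOpenSlot"; location: string }
  | { kind: "inviteContact"; name: string }
  | { kind: "notify"; role: string };

// Break one compound request into an ordered series of steps the system can
// then execute and animate one by one on the way to the target location.
function planCompoundAction(input: string): Step[] {
  const steps: Step[] = [];
  const dayMatch = input.match(/next \w+/i);
  if (dayMatch) steps.push({ kind: "resolveDate", phrase: dayMatch[0] });
  const roomMatch = input.match(/in the ([\w ]+ room)/i);
  if (roomMatch) steps.push({ kind: "findOpenSlot", location: roomMatch[1] });
  const withMatch = input.match(/with ([A-Z][a-z]+ [A-Z][a-z]+)/);
  if (withMatch) steps.push({ kind: "inviteContact", name: withMatch[1] });
  if (/notify my assistant/i.test(input)) steps.push({ kind: "notify", role: "assistant" });
  return steps;
}

console.log(planCompoundAction(
  "schedule a meeting with Bob Vance in the main conference room next Tuesday and notify my assistant of the meeting",
));
// Steps produced: resolveDate "next Tuesday", findOpenSlot "main conference room",
// inviteContact "Bob Vance", notify "assistant"
```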
FIG. 11 also includes manual steps for navigating from the start location to the target location. As shown at 1130, the manual navigation steps are individual user selection steps, whereas the steps 1125 are grouped together since they are executed automatically and presented as one continuous flow, even though multiple steps may be required to reach the desired target location in executing the actions entered via the natural language input, including single and compound actions.
In some aspects, a user may interrupt the execution of an automatic flow (e.g., operations in the automated navigation 1125) at any point in the presentation thereof to stop the flow and/or navigate manually. The option to switch to a manual process may be presented to a user in most UIs herein via a user selectable UI control (not shown), for example a translucent stop/resume toggle button. In an embodiment, the user may simply click or tap on the screen to stop the automatic flow and perform overriding manual inputs or navigations, etc.
In some aspects, natural language input may be incorporated in one or more device platforms (desktop, mobile tablet, smartphone, a wearable device, etc.). FIGS. 12A-12C include illustrative depictions of a set of UI controls that may be included in an example embodiment of a UI herein. UI 1200 relates to a calendar and specifically to time and date "picker" controls of an interface for the calendar to specify a date for a calendar event (e.g., appointment, call, or meeting, etc.). In the example of FIG. 12A, a date tab 1205 is selected for entering a date for a calendar event, and "slider" UI control elements can be used to select a specific year 1210, a month 1215, and a day 1220. As shown, the horizontal slider UI controls operate by dragging/sliding/scrolling right (exposing values back in time) or left (exposing values forward in time) for each of the year, month, and day corresponding time parameter indications within center lens area 1222. Each slider control operates independently of the others, and they collectively operate to indicate a specific date, including the day, month, and year for a calendar event. The slider controls may be manipulated in response to a mouse or other input device (e.g., a keyboard, etc.), a touchscreen, voice prompts, and combinations thereof. Upon aligning the desired year, month, and day as shown in FIG. 12A in the center lens 1222, the user can select the "OK" button at 1225 to enter the selected date (i.e., "Oct. 30, 2011").
FIG. 12B illustrates a mechanism for setting a start time for the calendar event being set in the present example, including using the horizontal slider UI controls to select a time comprising the parameters of an hour (1230), minutes (1235), and period of day (1240). The user selects the "OK" button at 1242 to enter the selected start time (i.e., "5:00 PM").
FIG. 12C illustrates a process for setting the end time for the calendar event in the present example, including using the horizontal slider UI controls to select a time comprising the parameters of an hour (1245), minutes (1250), and period of day (1255). The user selects the "OK" button at 1257 to enter the selected end time of 6:00 PM.
Still referring to FIGS. 12A-12C, UI 1200 includes a user-selectable "All Day" UI element control 1227 that may be selected by a user to indicate a calendar event should be scheduled for the entire day of a selected date.
While FIGS. 12A-12C specifically relate to a calendar functionality, similar controls to specify a period of time of one or more days in duration can be implemented for user entry of times in the context of a timeline functionality herein. FIG. 12D includes an illustrative depiction of a set of UI controls that may be included in an example embodiment of a UI herein. UI 1250 (FIG. 12D) and UI 1260 (FIG. 12E) relate to a date "picker" control of an interface for a timeline to specify a start date, an end date, and a text title for a timeline event. In the example of FIG. 12D, UI 1250 includes a title tab 1254 that can be used to display an editable text field (not shown) that a user may use to enter the text for the title of the timeline event. In the example of FIG. 12D, UI 1250 also includes a start date tab 1252 that can be used to select a start date similar to the UI 1200 in FIG. 12A, including the selection of a specific year, a month, and a day. FIG. 12D further shows the slider controls that may be manipulated in response to a mouse or other input device (e.g., a keyboard, etc.), a touchscreen, voice prompts, and combinations thereof for selecting a start date and end date. In FIG. 12E, UI 1260 includes an end date tab 1264 that illustrates UI elements for the setting of an end date for a subtask, including a day (1265), month (1270), and year (1275) using horizontal "slider" controls that can be manipulated left or right to select different values as indicated in the center lens area 1280. In some aspects, a user can be prevented from entering illogical and reality-restricted entries such as, for example, an end date that is before a start date, wherein such restricted entries are not presented as an option for the user. In an embodiment, a timeline milestone event may be specified by selecting the same day for the start date and the end date, which can be automatically interpreted by the system to indicate the specified event is a milestone type of event.
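The picker-level restrictions just described can be reduced to two small checks: an end date earlier than the start date is never offered, and a start date equal to the end date is interpreted as a one-day milestone. The sketch below uses illustrative function names and is not the disclosed implementation.

```typescript
type TimelineEventKind = "milestone" | "task";

// An end date candidate is only offered by the picker if it does not precede the start date.
function isSelectableEndDate(startDate: Date, candidateEndDate: Date): boolean {
  return candidateEndDate.getTime() >= startDate.getTime();
}

// A timeline event whose start and end fall on the same day is treated as a milestone.
function classifyTimelineEvent(startDate: Date, endDate: Date): TimelineEventKind {
  return startDate.toDateString() === endDate.toDateString() ? "milestone" : "task";
}

console.log(isSelectableEndDate(new Date(2011, 9, 30), new Date(2011, 9, 29))); // false: not presented as an option
console.log(classifyTimelineEvent(new Date(2011, 9, 30), new Date(2011, 9, 30))); // "milestone"
```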
In some embodiments herein, there may be a feature of a user settings menu that includes, for example, one or more user settings or parameters that can be configured to a user's particular preference(s).
One example embodiment may include the settings or parameters of Auto Navigation, All Day Durations, User Profile & System Permissions, a Clock Source, Theming Selections, and Priority Rules. In this example, the settings and their meaning are set forth below.
Auto Navigation: Turn On/Off Auto Navigation Sequence Animations and Transitions—In Voice or Text prompt mode, disables displaying the automated navigation sequence and immediately presents the Target Screen from the initial screen where the prompt was entered.
All Day Durations—Select the "All Day" Time Duration (Uses the Time Picker control); Default is 24 hours (e.g., FIG. 12B where the "All Day" UI element 1227 may be selected to specify a calendar event is one day in duration).
User Profile & System Permissions—User selection or corporate/organization provisioned role names, ID and definitions.
Clock Source—Internal clock or network clock selection.
Theming Selections—User selection of pre-set color themes and semantic color palette options.
Priority Rules—User selection of priority rules in, e.g., a list format by which top priority rules are indicated at the top of the list in a descending order, as indicated by moving items Up/Down in the list. Examples include: (a) Nearest Time to current date, (b) Priority (!) selection, (c) Event or Appointment Type (per User Role), User entered Event or Calendar Topic/Title, (d) Participant/contact name or keyword associated with a Calendar or Timeline item, (e) Other . . . etc. In some aspects, the priority list may be constantly updated according to time and user entries and edits, and priority lists can be saved as user selectable presets to immediately display corresponding priorities (e.g., my "Trade Show Event" priority list, my "Customer" priority list, etc.).
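As an illustration of how an ordered priority rule list of this kind might be applied, rules nearer the top of the list can be evaluated first, with later rules breaking only the ties left by earlier ones. The rule implementations and names below are assumptions made for the sketch, not the disclosed behavior.

```typescript
interface ScheduleItem {
  title: string;
  start: Date;
  highPriority: boolean; // corresponds to the Priority (!) selection
}

// A rule is a comparator; a negative result places "a" ahead of "b".
type PriorityRule = (a: ScheduleItem, b: ScheduleItem) => number;

const nearestToNow: PriorityRule = (a, b) =>
  Math.abs(a.start.getTime() - Date.now()) - Math.abs(b.start.getTime() - Date.now());

const priorityFlag: PriorityRule = (a, b) => Number(b.highPriority) - Number(a.highPriority);

// Rules are applied in list order; a later rule only matters when earlier rules tie.
function applyPriorityRules(items: ScheduleItem[], rules: PriorityRule[]): ScheduleItem[] {
  return [...items].sort((a, b) => {
    for (const rule of rules) {
      const order = rule(a, b);
      if (order !== 0) return order;
    }
    return 0;
  });
}

// A saved "Customer" preset might place the priority flag above the nearest-time rule.
const customerPreset: PriorityRule[] = [priorityFlag, nearestToNow];
```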
In some example embodiments, such as a use-case including an enterprise solution, the integrated calendar and timeline application disclosed herein may be automatically provisioned according to a user's role in the organization. This feature may provide a user with the ability to efficiently and easily import/export relevant date and event data from external role-specific software environments and applications, attach external objects to calendar appointments, timeline events, fetch messages, and combinations thereof. For example, a Software Development Engineering role in some example embodiments herein may be pre-provisioned with pre-defined task & appointment types, such as the following:
- Sprint Reviews
- Quality Gate (Q-Gate) Milestone
- User Testing
- UI Code Release
- Proof of Concept Prototype (POC) development Task
- Emergency Code Correction (ECC) Task or Deadline
- Development Close (DC) Task or Deadline/Milestone
- Release to Customer (RTC) Milestone
- Product Research Task
- Etc.
This Software Development Engineering role may also cause, for such a user, links and integrations with Development Software Data and environments, such as:
- Jira Portal (development issues Tracker or project tasks & Dates, etc.)
- Excel documents
- Project Management software e.g. Microsoft Project, etc.
- Internal corporate, development, or collaboration software or portals, etc.
- Corporate wiki.
FIGS. 13A, 13B, and 13C illustrate example user interfaces for an embodiment to set user roles. FIG. 13A includes a UI 1300 that includes an action button at 1305. Action button 1305 may be selected to cause a presentation of Actions UI 1310. The User Roles 1315 action item can be selected in FIG. 13B in UI 1310 to display the different user role options shown in UI 1320. In an embodiment, an "All" selection 1325 may be selected to select all user role options simultaneously without the need for the user to individually select each item one at a time. After the appropriate role(s) are selected in UI 1320, the "OK" button is selected and the UI can be closed to return to the calendar view of FIG. 13A.
FIGS. 14A, 14B, and 14C illustrate example user interfaces for setting one or more pre-defined tasks. FIG. 14A includes a UI 1400 having an action button at 1405. Action button 1405 may be selected to cause a presentation of Actions UI 1410. The Manage Pre-Defined Tasks 1415 action item can be selected in FIG. 14B at UI 1410 to display the different types of tasks illustrated in UI 1420 that correspond to selected User Roles. Upon selection of one or more types of tasks in UI 1420, the "OK" button in UI 1420 can be selected and the UI can be closed to return to the calendar view of FIG. 14A.
FIGS. 15A, 15B, and 15C illustrate example user interfaces for setting one or more pre-defined appointment types. In this example embodiment, FIG. 15A includes a UI 1500 having an action button at 1505. Action button 1505 can be selected to invoke a presentation of Actions UI 1510 shown in FIG. 15B. The Manage Appointment Types action item 1515 can be selected in UI 1510 of FIG. 15B to display the different types of appointments depicted in UI 1520 of FIG. 15C. Upon selection of one or more types of appointments in 1520, the "OK" button can be selected in UI 1520 to set the available appointment types, close the UI, and return to the calendar view of FIG. 15A.
FIGS. 16A, 16B, and 16C illustrate example user interfaces for setting one or more parameters related to managing connected applications. In this example embodiment, FIG. 16A includes a UI 1600 having an action button at 1605. Action button 1605 can be selected to invoke a presentation of Actions UI 1610, where the Manage Connected Applications action item 1615 can be selected in FIG. 16B at 1615 to cause a display of the different types of applications that can be managed via the Fetch functionality herein, as depicted in UI 1620 of FIG. 16C. Upon selection of one or more applications in 1620, the "OK" button in UI 1620 can be selected to set the indicated link connections, close UI 1620, and return to the calendar view of FIG. 16A. In some embodiments, the selected applications may need to be compatible with the Fetch functionality herein by, for example, having an appropriate plug-in, API, etc.
Some example embodiments herein may use rules-based situation detection to recognize schedule conflicts and push/propose options for a resolution. Referring to FIGS. 17A-17D, in the case of scheduling a calendar appointment that conflicts with a timeline event, a user attempts to schedule an appointment 1705 as shown in FIG. 17B (that may be navigated to from UI 1710 in FIG. 17A) in San Francisco during a trade show event that is taking place in Chicago at the same time. The calendar and timeline scheduler herein may display a dialog 1715 stating the conflict in the form of, for example, "You have a Chicago Trade Show Event on this Date" and provide user-selectable resolution recommendations such as:
- Change this appointment to a phone call;
- Change the date of the San Francisco appointment after the Chicago Trade Show;
- Change the date of the San Francisco appointment before the Chicago Trade Show; and
- Cancel the appointment.
In the present example, the user selects "Cancel the New Appointment" and further invokes the "OK" button 1720 to cause the dialog box 1715 to close, thereby deleting the newly proposed appointment and presenting the UI of FIG. 17D. In some embodiments, the pop-up including the resolution recommendations may, as an addition to the pop-up or as an alternative thereto, inform a user of the recommended resolutions via a text-to-speech functionality.
Another scenario relating to scheduling conflicts might include moving an appointment to another day that has a conflicting calendar appointment already scheduled. In this case and referring to FIGS. 18A-18E, a user, via UI 1800, moves a previously scheduled appointment 1805 to the next day (i.e., from Wednesday, the 28th, to Thursday, the 29th, in FIG. 18A) where another appointment is already scheduled for the next day at the appointed time (FIG. 18B). The system detects this conflict and displays the moved appointment 1810 in a RED semantic color. Here, the user inputs a voice prompt, "Sara . . . move this appointment to nearest available day" via natural language input mechanism 1815. In reply, the system moves the appointment to Friday, where it is displayed in a GREEN semantic color (FIG. 18C, 1820) that indicates no conflicts exist with this appointment. In this example, the user has the option to drag appointment 1820 to other days (FIG. 18D), but decides to accept the system's recommendation by selecting the GREEN indicated appointment 1820, which is then displayed in BLUE (FIG. 18E). The BLUE color indicates that the selected appointment is successfully scheduled, as illustrated at 1825. The system may further automatically send the appointment date change notification to the meeting participant(s) and at the same time add the newly scheduled appointment request to its fetch notifications, which may also cause a display of a fetch banner confirmation 1830.
In one example embodiment, voice prompts may be used to move an appointment to an alternative available day. A first example use-case of this type is illustrated in some aspects in FIGS. 19A-19C. Referring to FIG. 19A, a user selects an appointment 1910, taps or clicks on the microphone icon, and then prompts the system to "Move this appointment to the nearest available day" via a natural language input mechanism 1905 in UI 1900. The system understands the user to be referencing the selected appointment as shown at 1910. The calendar and timeline scheduler herein then moves the selected appointment to Friday, November 1st with the same scheduled time and a GREEN color. The system also displays a dialog box 1920 that includes the wording, "Do you want to send a Meeting Update to all participants?" and includes user input YES/NO buttons. In response to the user selecting the YES user interface button, the appointment's color is changed to BLUE to indicate the appointment has been scheduled, as shown in FIG. 19C at 1925. Also, an update notification may be entered into the fetch functionality herein, as indicated at 1930, and a notification is sent to the meeting's participants that includes a mechanism for the invited participants to accept, tentatively accept, or decline the meeting. Such participant responses are then communicated to the user and the system via the fetch notification functionality herein.
In some aspects and embodiments herein, the system may recognize the word "Sara" as the name identifier for the calendar/timeline scheduler application. In an embodiment, a user saying the name "Sara" invokes the speech recognition function without the need to click or tap on the microphone icon within the natural language input mechanism (e.g., 1815, 1905). In some embodiments, the particular name identifier may be defined by a user, system administrator, or other entity having the appropriate credentials and/or role relative to the disclosed system.
In another example embodiment including voice prompts being used to move an appointment to an alternative available day, FIGS. 20A-20C illustrate a number of aspects of the system disclosed herein. With reference to FIG. 20A, a user inputs a prompt of, "Sara . . . move this appointment to the nearest available day" via a natural language input mechanism 2005 in UI 2000. The system then moves the selected appointment 2010 to Friday, November 1st with the same scheduled time but displays it in a GREEN color since it is recommended/available but not yet scheduled. The user accepts this recommendation 2015 by selecting (e.g., clicking on, pressing on, and other UI entry methods) the recommended appointment, and the appointment's color is changed to BLUE to indicate the appointment has been scheduled, as shown in FIG. 20C at 2020. Here too, an update notification may be entered into the fetch functionality (as indicated in the notification banner at 2025) and a notification is sent to the meeting's participants that includes a mechanism for the invited participants to accept, tentatively accept, or decline the meeting.
FIGS. 21A-21C relate to scheduling an appointment with one or more participants that have conflicting appointments. In this example, a user creates a new appointment with participants that already have a scheduled meeting at the same time as the new appointment, thereby creating conflicts. As the system has insight into both the user's and the participants' calendars, the conflicts are detected and the new appointment is displayed in a RED semantic color at 2105. Additionally, a UI pop-up is displayed including a conflict dialog 2110 describing the conflict. In reply to the reported conflict, the user may engage the system intelligence via the natural language input mechanism 2115 by providing spoken interactions with the system. In this case, the user states, "Sara . . . show me the nearest optional days for this new appointment that would work". The system then displays optional days 2120, 2126, and 2125 that do not have conflicts with either the participants' or the user's calendars in a GREEN semantic color. In this example, the user selects the option at 2126. The system removes the non-selected options (i.e., 2120 and 2125) and displays the selected appointment in BLUE as shown at 2130 of FIG. 21C to indicate that the appointment is now successfully scheduled. In some embodiments, appointment request(s) are automatically sent to the participants and the fetch functionality herein is updated with the newly scheduled appointment.
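The conflict-resolution scenarios in FIGS. 18A-18E, 19A-19C, 20A-20C, and 21A-21C all rest on a similar underlying search: examine days forward from the requested day and propose the first day on which neither the user nor any participant has a conflicting appointment at the requested time. The following is a minimal sketch under assumed data shapes; the names and the fixed search window are illustrative only.

```typescript
type Appointment = { start: Date; end: Date };
type CalendarsByPerson = Map<string, Appointment[]>; // the user's and each participant's appointments

// True if anyone has an appointment that overlaps the requested time slot on the given day.
function conflictsOn(day: Date, slot: Appointment, calendars: CalendarsByPerson): boolean {
  const start = new Date(day);
  start.setHours(slot.start.getHours(), slot.start.getMinutes(), 0, 0);
  const end = new Date(day);
  end.setHours(slot.end.getHours(), slot.end.getMinutes(), 0, 0);
  for (const appointments of calendars.values()) {
    if (appointments.some((a) => a.start < end && start < a.end)) return true;
  }
  return false;
}

// Search forward from the requested day for the nearest conflict-free day.
function nearestAvailableDay(
  requested: Date,
  slot: Appointment,
  calendars: CalendarsByPerson,
  maxDays = 14,
): Date | null {
  for (let offset = 0; offset <= maxDays; offset++) {
    const candidate = new Date(requested);
    candidate.setDate(candidate.getDate() + offset);
    if (!conflictsOn(candidate, slot, calendars)) return candidate;
  }
  return null; // no conflict-free day found within the search window
}
```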
Some aspects herein may provide, support, or at least facilitate a responsive and adaptive solution across different types of device platforms (e.g., a desktop computer, a mobile tablet, a smartphone, a wearable device, and other devices and systems), wherein a user's operational actions and views can be both sticky (i.e., persistent) and fully synchronized. In this manner, operations of a user on one platform device can immediately be moved to and resumed on another device (i.e., synchronized) so that the user can resume their interactions with the system at the location where they previously stopped. The user's last location within the system and applications herein can include last used calendar or timeline screen views, actions, etc. In some embodiments, all data is persistently kept intact without a threat of losing entries or their place in a workflow, or even a need for a user to actively execute a “Save” function (i.e., performed automatically as a background job). In some embodiments, the fetch related content can also be persistently maintained and pushed or otherwise distributed to all relevant devices without a need to actively execute refreshes or other actions in order to acquire up-to-date fetch notifications and messages.
FIG. 22 is an illustrative depiction of an example functional system diagram 2200 for some embodiments herein. As a representation of a functional diagram, the functions depicted in FIG. 22 may be implemented in one or more devices, systems, and components where the various devices, systems, and components may include one or more hardware and software configurations to perform the particularly illustrated functions discretely or as a combination of functions, depending on a particular implementation. FIG. 22 may encompass the calendar, timeline, and fetch functions (i.e., a "scheduler") disclosed herein that can enable enterprise systems and role-based users with a common, user optimized calendar and timeline solution, in accordance with other aspects herein. In an embodiment, the calendar, timeline, and fetch functions (i.e., a "scheduler") disclosed herein can enable consumer role-based users and application software with a common, user optimized calendar and timeline solution. The calendar and timeline solution, in some embodiments, is operable across different applications and devices (e.g., desktop computer, mobile computing tablet, smartphone, smartwatch, etc.). In accordance with some aspects of the present disclosure, the scheduler may deploy a notification and messaging (i.e., "fetch") framework that logically resides on top of the scheduler to provide a unified, easy-to-use UI for system push and pull notifications, alerts, reminders, messages, content delivery, various calendar and timeline user-relevant time/date contexts, and artifacts, all in an intuitive and highly personalized manner. In some embodiments, aspects and features of the scheduler can be deployed within social networks or network groups of a consumer or an enterprise.
In some aspects, enterprise and consumer users might be constantly manipulating various calendar and timeline views to perform time-related operations, activities, and entries, where a number of those activities may be performed on small-screen devices such as, for example, mobile smartphones and tablets. The scheduler herein includes UIs having corresponding interaction and navigation design features that provide numerous variations for carrying-out the various calendar and timeline workflows disclosed herein. Some of those workflows might include calendar and timeline view changes, parameter and user date/time entries, etc. As highlighted in other portions herein, the scheduler may provide mechanisms for natural voice or text inputs, including but not limited to “instructional” commands or directives that rely on system intelligence to, in some instances, minimize user actions and UI selections, and permit the system to intelligently and automatically carry-out system operations and/or immediately display relevant screen views to users.
Functional system diagram2200 may support the various features disclosed herein related to a time relevant experience, including enabling a user to experience “time-travel” as the user navigates backwards and forwards through time in the context of the calendar and timeline functions of the present disclosure. As disclosed herein, selecting a time in the future or past may cause a display of transition and animation effects that actually impart a sense of moving through time. In some embodiments for example, instead of simply snapping to a new screen view, the scheduler herein (interchangeably) uses one or more of horizontal and/or vertical screen animations (depending, for example, on device orientations, configurations, capabilities, and displayed views) to provide users with an actual context-awareness of time. This feature delivers an additional dimension to the user's experience with the system, while maintaining the user's contextual “time-focus” as they and the system herein operate, navigate, and execute the disclosed calendar and timeline processes.
Functional system diagram 2200 may generally include a logic layer 2205; a UI interactions, composite views, user prompts, and actions (i.e., UI interactions) layer 2210; a data modeling layer 2215; intelligence plugins 2220 and an input/output (I/O) plugins layer 2225 that are coupled to logic layer 2205, UI interactions layer 2210, and data modeling layer 2215; and a responsive adaptive UI layer 2235 that facilitates communication between UI interactions layer 2210 and external user devices 2230.
In the example embodiment of FIG. 22, logic layer 2205 includes a plurality of functional modules or features, including a Rules & Error Detection Engine 2201, a User Instruction & Navigation Patterns function 2202, a Navigation Pattern Animations & Decision Paths function 2203, a Natural Language Relevance Analysis feature 2204, a Natural Language Vocabulary & Library feature 2206, an Applying Machine Learning Patterns feature 2207, a Priority & Relevance Determination Engine feature 2208, a Real-Time Clock & Time Relevance feature 2209, a Fetch Logic & Settings feature 2211, a View Logic feature 2213, and a Settings Executions feature 2214. The functionality provided by each of these functional features may be generated, executed, or facilitated by one or more devices, systems, and apparatuses.
Regarding the UI interactions layer 2210, this layer includes, in the example of FIG. 22, functional features or components for Text & Voice Prompts 2271, Manual Selections, Navigations & Animations 2272, Automated Navigations & Transition Animations 2273, Calendar Views & Date Selections 2274, Timeline Views & Zooming 2276, Appointment & Event Views 2278, Fetch Notifications, Alerts & Filter & Messaging 2281, Date & Time Pickers 2282, Timeline Task & Sub-Task Creator & Editing 2284, Settings & Theming & Semantic Color Selection 2286, User Roles and Permissions 2288, and Search functionality as shown at 2289.
Data modeling layer 2215 includes the following functions: Rules-Based Situation Detection 2216, External Content Plugins 2218, Contacts (Internal & External) Plugins 2219, Semantic Vocabulary & Routing 2221, User Profile & Role Data 2222, Appointments & Events Data 2223, External & IoT (Internet of Things) Data 2224, Holidays & Business Specific Dates 2226, Business Data 2228, Fetch Data 2229, External Application Data 2231, and a Screen Capture Images function 2232.
Intelligence plugins 2220 are coupled to and used by the logic layer 2205, UI interactions layer 2210, and data modeling layer 2215 to perform the functions thereof. In the example of FIG. 22, intelligence plugins 2220 include Machine Learning Patterns plugin 2234, Relevance Patterns plugin 2236, Calendar & Timeline Conflicts plugin 2238, Situation Context plugin 2242, Contacts Recognition plugin 2244, AI (Artificial Intelligence) API plugin 2246, Third-Party search engine (e.g., Wolfram Alpha) Search plugin 2248, and Internet Search plugin 2251.
I/O plugins 2225 are coupled to and used by the logic layer 2205, UI interactions layer 2210, and data modeling layer 2215 to perform the functions thereof. In FIG. 22, I/O plugins 2225 include Speech Recognition plugin 2252, Connected Apps plugin 2254, External Email plugin 2256, Device Email plugin 2258, Device Calendar plugin 2261, Device Language Translation plugin 2262, Native Device Voice Recognition plugin 2263, Native Device Speech Synthesis plugin 2264, Native Device Screen Capture plugin 2268, and Native Device Clock plugin 2269.
The external user devices 2230 may include, as shown in the example of FIG. 22, a Desktop Browser 2291, Mobile Tablets 2292, Smartphones 2293, Mobile Native Apps 2294, and Mobile Hybrid Apps 2296. The external user devices 2230 are not limited to the devices specifically shown in FIG. 22 and may include additional, fewer, and alternative devices.
Certain embodiments are described herein as including logic or a number of components, modules, or mechanisms. Modules may constitute either software modules (e.g., code embodied on a tangible machine-readable medium) or hardware modules. A "hardware module" herein refers to a tangible unit capable of performing certain operations and may be configured or arranged in a certain physical manner. In various example embodiments, one or more computer systems (e.g., a standalone computer system, a client computer system, or a server computer system, etc.) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.
In some embodiments, a hardware module may be implemented mechanically, electronically, or in any suitable combination thereof. Examples of a hardware module may include dedicated circuitry or logic that is permanently configured to perform certain operations. For example, a hardware module may include a special-purpose processor, such as a Field-Programmable Gate Array (FPGA) or an Application Specific Integrated Circuit (ASIC). A hardware module may also include programmable logic or circuitry that is temporarily configured by software to perform certain operations. For example, a hardware module may include software executed by a general-purpose processor or other programmable processor. Once configured by such software, hardware modules become specific machines (or specific components of a machine) technically improved and uniquely tailored to perform the configured functions and, as such, are no longer general-purpose processors. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.
The phrase “hardware module” should be understood to encompass a tangible entity, whether that entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein. As used herein, “hardware-implemented module” refers to a hardware module. Considering embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For example, where a hardware module comprises a general-purpose processor configured by software to become a special-purpose processor, the general-purpose processor may be configured as respectively different special-purpose processors (e.g., comprising different hardware modules) at different times. Software accordingly configures a particular processor or processors, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time.
Hardware modules can provide information to, and receive information from, other hardware modules. Accordingly, the hardware modules described herein may be regarded as being communicatively coupled. Where multiple hardware modules exist contemporaneously, communications may be achieved through signal transmissions (e.g., over appropriate circuits and buses) between or among two or more of the hardware modules. In embodiments in which multiple hardware modules may be configured or instantiated at different times, communications between such hardware modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware modules have access. For example, one hardware module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware module may then, at a later time, access the memory device to retrieve and process the stored output. Hardware modules may also initiate communications with input or output devices and can operate on a resource (e.g., a collection of information).
The various operations of example processes disclosed herein may be performed, at least partially, by one or more processors that are temporarily (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions described herein. As used herein, a “processor-implemented module” refers to a hardware module implemented using one or more processors.
The processes described herein may be at least partially processor-implemented, with a particular processor or processors being an example of hardware. For example, at least some of the operations of a process may be performed by one or more processors or processor-implemented modules. Moreover, the one or more processors may also operate to support performance of the relevant operations in a “cloud computing” environment or as a “software as a service” (SaaS). For example, at least some of the operations may be performed by a group of computers (as examples of machines including processors), with these operations being accessible via a network (e.g., the Internet) and via one or more appropriate interfaces (e.g., an API).
The performance of certain of the operations herein may be distributed among multiple processors, not only residing within a single machine but also deployed across a number of machines. In some example embodiments, the processors or processor-implemented modules may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other example embodiments, the processors or processor-implemented modules may be distributed across a number of geographic locations.
The modules, processes, applications, services, and the like described in conjunction with other descriptions and diagrams herein may be implemented, in some embodiments, in the context of a machine and an associated software architecture. The following descriptive disclosure describes representative software architecture(s) and machine (e.g., hardware) architecture(s) that are suitable for use with the disclosed example embodiments.
Software architectures are used in conjunction with hardware architectures to create devices and machines tailored to particular purposes. For example, a particular hardware architecture coupled with a particular software architecture can create a mobile device, such as a mobile phone, tablet device, or so forth. Different combinations and variations of hardware and software architecture may yield a smart device for use in, for example, the “internet of things” (IoT), while yet another combination produces a server computer for use within a cloud computing architecture. Not all combinations of such software and hardware architectures are presented here, as those of skill in the art can readily understand how to implement the inventive subject matter in different contexts from the disclosure contained herein, without undue experimentation.
FIG. 23 is a block diagram 2300 illustrating a representative software architecture 2302 that may be used in conjunction with various hardware architectures described herein. FIG. 23 is presented as a non-limiting example of a software architecture, and it will be appreciated that many other architectures may be implemented to facilitate the functionalities disclosed herein. Software architecture 2302 may execute on hardware such as a machine 2400 of FIG. 24 that includes, inter alia, processors 2410, memory/storage 2430, and I/O components 2450. A representative hardware layer 2304 is illustrated and can represent, for example, machine 2400 of FIG. 24. Representative hardware layer 2304 includes one or more processing units 2306 having associated executable instructions 2308. Executable instructions 2308 represent the executable instructions of the software architecture 2302, including implementation of the processes, modules, and functionalities disclosed herein and in the accompanying drawings. Hardware layer 2304 includes memory and/or storage modules 2310 that also include executable instructions 2308. Hardware layer 2304 may also include other hardware 2312 that represents any other hardware of the hardware layer 2304, such as, for example, the other hardware aspects illustrated as part of machine 2400.
In the example architecture ofFIG. 23,software architecture2302 may be conceptualized as a stack of layers where each layer provides particular functionality. For example,software architecture2302 may include layers such as anoperating system2314,libraries2316, frameworks/middleware2318,applications2320, and apresentation layer2344. Operationally,applications2320 and/or other components within the layers may invoke API calls2324 through the software stack and receive responses, returned values, and so forth, illustrated asmessages2326, in response to the API calls2324. The layers illustrated are representative in nature and not all software architectures have all layers. For example, some mobile or special purpose operating systems may not provide a layer of frameworks/middleware2318, while others may provide such a layer. Other software architectures may include additional or different layers.
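By way of a purely hypothetical sketch (not part of any figure, and with all class and call names invented for illustration), the layered invocation pattern described above, in which an application issues an API call down the stack and receives a message in response, might be modeled as follows:

```python
# Hypothetical, simplified model of a layered software stack: an application
# invokes an API call that passes down through middleware to an operating
# system layer, and a message flows back up as the response.
class OperatingSystemLayer:
    def handle(self, call):
        # The lowest layer in this toy model services the call directly.
        return {"call": call, "status": "ok"}

class FrameworkMiddlewareLayer:
    def __init__(self, lower_layer):
        self.lower_layer = lower_layer

    def handle(self, call):
        # Middleware could add common services (logging, resource management)
        # before delegating the call to the layer below.
        return self.lower_layer.handle(call)

class Application:
    def __init__(self, stack):
        self.stack = stack

    def open_calendar_view(self):
        # The application issues an API call and receives a message in return.
        return self.stack.handle("open_calendar_view")

app = Application(FrameworkMiddlewareLayer(OperatingSystemLayer()))
print(app.open_calendar_view())  # -> {'call': 'open_calendar_view', 'status': 'ok'}
```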
Theoperating system2314 may manage hardware resources and provide common services. Theoperating system2314 may include, for example, akernel2328,services2330, anddrivers2332. Thekernel2328 may act as an abstraction layer between the hardware and the other software layers. For example, thekernel2328 may be responsible for memory management, processor management (e.g., scheduling), component management, networking, security settings, and so on.Services2330 may provide other common services for the other software layers.Drivers2332 may be responsible for controlling or interfacing with the underlying hardware. For instance,drivers2332 may include display drivers, camera drivers, Bluetooth® drivers, flash memory drivers, serial communication drivers (e.g., Universal Serial Bus (USB) drivers), Wi-Fi® drivers, audio drivers, power management drivers, and so forth depending on the hardware configuration.
Libraries2316 may provide a common infrastructure that may be utilized byapplications2320 and/or other components and/or layers.Libraries2316 typically provide functionality that allows other software modules to perform tasks in an easier fashion than by interfacing directly with theunderlying operating system2314 functionality (e.g.,kernel2328,services2330, and/or drivers2332).Libraries2316 may include system libraries2334 (e.g., C standard library) that may provide functions such as memory allocation functions, string manipulation functions, mathematic functions, and the like.Libraries2316 may includeAPI libraries2336 such as media libraries (e.g., libraries to support presentation and manipulation of various media formats such as MPEG4, H.264, MP3, AAC, AMR, JPG, PNG), graphics libraries (e.g., an OpenGL framework that may be used to render 2D and 3D graphic content on a display), database libraries (e.g., SQLite that may provide various relational database functions), web libraries (e.g., WebKit that may provide web browsing functionality), and the like.Libraries2316 may also include a wide variety ofother libraries2338 to provide many other APIs toapplications2320 and other software components/modules.
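As a minimal illustration (not drawn from any figure; the table and field names are invented), an application might rely on a database library such as SQLite, via a language binding, rather than interfacing directly with the operating system's file primitives:

```python
import sqlite3

# Illustrative only: store and query calendar events through SQLite rather
# than through raw file I/O; the schema shown here is hypothetical.
conn = sqlite3.connect(":memory:")  # in-memory database for the example
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, title TEXT, starts_at TEXT)")
conn.execute("INSERT INTO events (title, starts_at) VALUES (?, ?)",
             ("Design review", "2024-05-01T10:00"))
for title, starts_at in conn.execute("SELECT title, starts_at FROM events ORDER BY starts_at"):
    print(title, starts_at)
conn.close()
```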
Frameworks2318 (also sometimes referred to as middleware) may provide a higher-level common infrastructure that may be utilized byapplications2320 and/or other software components/modules. For example,frameworks2318 may provide various graphic user interface (GUI) functions, high-level resource management, high-level location services, and so forth.Frameworks2318 may provide a broad spectrum of other APIs that may be utilized byapplications2320 and/or other software components/modules, some of which may be specific to a particular operating system or platform.
Applications 2320 include built-in applications 2340 and/or third party applications 2342. Examples of representative built-in applications 2340 may include, but are not limited to, a contacts application, a browser application, a book reader application, a location application, a media application, a messaging application, and/or a game application. Third party applications 2342 may include any of the built-in applications 2340 as well as a broad assortment of other applications. In a specific example, third party application 2342 (e.g., an application developed using the Android™ or iOS™ software development kit (SDK) by an entity other than the vendor of the particular platform) may be mobile software running on a mobile operating system such as iOS™, Android™, Windows® Phone, or other mobile operating systems. In this example, third party application 2342 may invoke the API calls 2324 provided by the mobile operating system such as operating system 2314 to facilitate functionality described herein.
Applications2320 may utilize built-inoperating system2314 functions (e.g.,kernel2328,services2330, and/or drivers2332), libraries2316 (e.g.,system libraries2334,API libraries2336, and other libraries2338), and frameworks/middleware2318 to create user interfaces to interact with users of the system. Alternatively, or additionally, in some systems, interactions with a user may occur through a presentation layer, such as thepresentation layer2344. In these systems, the application/module “logic” can be separated from the aspects of the application/module that interact with a user.
Some software architectures utilize virtual machines. In the example ofFIG. 23, this is illustrated by avirtual machine2348. A virtual machine creates a software environment where applications/modules can execute as if they were executing on a hardware machine (such as themachine2400 ofFIG. 24, for example). A virtual machine is hosted by a host operating system (e.g.,operating system2314 inFIG. 23) and typically, although not always, has avirtual machine monitor2346 that manages the operation ofvirtual machine2348, as well as the interface with the host operating system (e.g., operating system2314). A software architecture executes withinvirtual machine2348, such as anoperating system2350,libraries2352, frameworks/middleware2354,applications2356, and/or apresentation layer2358. These layers of software architecture executing withinvirtual machine2348 may be the same as corresponding layers previously described or may be different.
FIG. 24 is a block diagram illustrating components of amachine2400, according to some example embodiments, that is able to read instructions from a machine-readable medium (e.g., a machine-readable storage medium) and perform any one or more of the processes discussed herein. Specifically,FIG. 24 shows an illustrative diagrammatic representation ofmachine2400 in the example form of a computer system, within which instructions2416 (e.g., software, a program, an application, an applet, an app, or other executable code) for causing themachine2400 to perform any one or more of the processes disclosed herein may be executed.Instructions2416 transform the general, non-programmed machine into a particular machine that is technically improved and programmed to carry out the described and illustrated functions in the manner described. In alternative embodiments,machine2400 operates as a standalone device or may be coupled (e.g., networked) to other machines. In a networked deployment,machine2400 may operate in the capacity of a server machine or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.Machine2400 may comprise, but not be limited to, a server computer, a client computer, a PC, a tablet computer, a laptop computer, a netbook, a set-top box (STB), a personal digital assistant (PDA), an entertainment media system, a cellular telephone, a smart phone, a mobile device, a wearable device (e.g., a smartwatch), a smart home device (e.g., a smart appliance), other smart devices, a web appliance, a network router, a network switch, a network bridge, or any machine capable of executinginstructions2416, sequentially or otherwise, that specify actions to be taken by themachine2400. Further, while only asingle machine2400 is illustrated inFIG. 24, the term “machine” herein shall be taken to include a collection ofmachines2400 that individually or jointly executeinstructions2416 to perform any one or more of the processes discussed herein.
Machine 2400 may include processors 2410, memory/storage 2430, and I/O components 2450 that may be configured to communicate with each other such as via a bus 2402. In an example embodiment, processors 2410 (e.g., a Central Processing Unit (CPU), a Reduced Instruction Set Computing (RISC) processor, a Complex Instruction Set Computing (CISC) processor, a Graphics Processing Unit (GPU), a Digital Signal Processor (DSP), an ASIC, a Radio-Frequency Integrated Circuit (RFIC), another processor, or any suitable combination thereof) may include, for example, a processor 2412 and a processor 2414 that may execute the instructions 2416. The term “processor” includes multi-core processors that may comprise two or more independent processors (sometimes referred to as “cores”) that may execute instructions contemporaneously. Although FIG. 24 shows multiple processors 2410, machine 2400 may include a single processor with a single core, a single processor with multiple cores (e.g., a multi-core processor), multiple processors with a single core, multiple processors with multiple cores, or any combination thereof.
Memory/storage2430 may include amemory2432, such as a main memory, or other memory storage, and astorage unit2436, both accessible to theprocessors2410 such as via the bus2402.Storage unit2436 andmemory2432 store theinstructions2416 embodying any one or more of the methodologies or functions described herein.Instructions2416 may also reside, completely or partially, withinmemory2432, withinstorage unit2436, within at least one of processors2410 (e.g., within the processor's cache memory), or any suitable combination thereof, during execution thereof by themachine2400. Accordingly,memory2432,storage unit2436, and the memory of theprocessors2410 are examples of machine-readable media.
As used herein, “machine-readable medium” refers to a tangible device capable of storing instructions and data temporarily or permanently and may include, but is not limited to, random-access memory (RAM), read-only memory (ROM), buffer memory, flash memory, optical media, magnetic media, cache memory, other types of storage (e.g., Electrically Erasable Programmable Read-Only Memory (EEPROM)), and/or any suitable combination thereof. The term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) able to store instructions 2416. The term “machine-readable medium” shall also be taken to include any tangible medium, or combination of multiple media, that is capable of storing instructions (e.g., instructions 2416) for execution by a machine (e.g., machine 2400), such that the instructions, when executed by one or more processors of the machine (e.g., processors 2410), cause the machine to perform any one or more of the methodologies described herein. Accordingly, a “machine-readable medium” refers to a single tangible storage apparatus or device, as well as “cloud-based” storage systems or storage networks that include multiple storage apparatus or devices. The term “machine-readable medium” excludes signals per se.
I/O components2450 may include a wide variety of components to receive input, provide output, produce output, transmit information, exchange information, capture measurements, and so on. The specific I/O components2450 that are included in a particular machine will depend on the type of machine. For example, portable machines such as mobile phones will likely include a touch input device or other such input mechanisms, while a headless server machine will likely not include such a touch input device. It will be appreciated that I/O components2450 may include many other components that are not shown inFIG. 24. I/O components2450 are grouped according to functionality merely for simplifying the following discussion and the grouping is in no way limiting. In various example embodiments, I/O components2450 may includeoutput components2452 and input components2454.Output components2452 may include visual components (e.g., a display such as a plasma display panel (PDP), a light emitting diode (LED) display, a liquid crystal display (LCD), a projector, or a cathode ray tube (CRT)), acoustic components (e.g., speakers), haptic components (e.g., a vibratory motor, resistance mechanisms), other signal generators, and so forth. Input components2454 may include alphanumeric input components (e.g., a keyboard, a touch screen configured to receive alphanumeric input, a photo-optical keyboard, or other alphanumeric input components), point based input components (e.g., a mouse, a touchpad, a trackball, a joystick, a motion sensor, or another pointing instrument), tactile input components (e.g., a physical button, a touch screen that provides location and/or force of touches or touch gestures, or other tactile input components), audio input components (e.g., a microphone), and the like.
I/O components2450 may include, in some embodiments,biometric components2456,motion components2458,environmental components2460, orposition components2462, among a wide array of other components. For example,biometric components2456 may include components to detect expressions (e.g., hand expressions, facial expressions, vocal expressions, body gestures, or eye tracking), measure biosignals (e.g., blood pressure, heart rate, body temperature, perspiration, or brain waves), identify a person (e.g., voice identification, retinal identification, facial identification, fingerprint identification, or electroencephalogram based identification), and the like.Motion components2458 may include acceleration sensor components (e.g., accelerometer), gravitation sensor components, rotation sensor components (e.g., gyroscope), and so forth.Environmental components2460 may include, for example, illumination sensor components (e.g., photometer), temperature sensor components (e.g., one or more thermometers that detect ambient temperature), humidity sensor components, pressure sensor components (e.g., barometer), acoustic sensor components (e.g., one or more microphones that detect background noise), proximity sensor components (e.g., infrared sensors that detect nearby objects), gas sensors (e.g., gas detection sensors to detect concentrations of hazardous gases for safety or to measure pollutants in the atmosphere), or other components that may provide indications, measurements, or signals corresponding to a surrounding physical environment.Position components2462 may include location sensor components (e.g., a Global Position System (GPS) receiver component), altitude sensor components (e.g., altimeters or barometers that detect air pressure from which altitude may be derived), orientation sensor components (e.g., magnetometers), and the like.
Communication to and frommachine2400 may be implemented using a wide variety of technologies. I/O components2450 may includecommunication components2464 operable to couple themachine2400 to anetwork2480 ordevices2470 via acoupling2482 and acoupling2472, respectively. For example,communication components2464 may include a network interface component or other suitable device to interface withnetwork2480. In further examples,communication components2464 may include wired communication components, wireless communication components, cellular communication components, Near Field Communication (NFC) components, Bluetooth® components (e.g., Bluetooth® Low Energy), Wi-Fi® components, and other communication components to provide communication via other modalities.Devices2470 may be another machine or any of a wide variety of peripheral devices (e.g., a peripheral device coupled via a USB).
Communication components2464 may detect identifiers or include components operable to detect identifiers. For example,communication components2464 may include Radio Frequency Identification (RFID) tag reader components, NFC smart tag detection components, optical reader components (e.g., an optical sensor to detect one-dimensional bar codes such as Universal Product Code (UPC) bar code, multi-dimensional bar codes such as Quick Response (QR) code, Aztec code, Data Matrix, Dataglyph, MaxiCode, PDF417, Ultra Code, UCC RSS-2D bar code, and other optical codes), or acoustic detection components (e.g., microphones to identify tagged audio signals). In addition, a variety of information may be derived viacommunication components2464, such as location via Internet Protocol (IP) geolocation, location via Wi-Fi® signal triangulation, location via detecting an NFC beacon signal that may indicate a particular location, and the like.
In various example embodiments, one or more portions ofnetwork2480 may be an ad hoc network, an intranet, an extranet, a virtual private network (VPN), a local area network (LAN), a wireless LAN (WLAN), a WAN, a wireless WAN (WWAN), a metropolitan area network (MAN), the Internet, a portion of the Internet, a portion of the Public Switched Telephone Network (PSTN), a plain old telephone service (POTS) network, a cellular telephone network, a wireless network, a Wi-Fi® network, another type of network, or a combination of two or more such networks. For example,network2480 or a portion ofnetwork2480 may include a wireless or cellular network and thecoupling2482 may be a Code Division Multiple Access (CDMA) connection, a Global System for Mobile communications (GSM) connection, or another type of cellular or wireless coupling. In this example, thecoupling2482 may implement any of a variety of types of data transfer technology, such as Single Carrier Radio Transmission Technology (1xRTT), Evolution-Data Optimized (EVDO) technology, General Packet Radio Service (GPRS) technology, Enhanced Data rates for GSM Evolution (EDGE) technology, third Generation Partnership Project (3GPP) including 3G, fourth generation wireless (4G) networks, Universal Mobile Telecommunications System (UMTS), High Speed Packet Access (HSPA), Worldwide Interoperability for Microwave Access (WiMAX), Long Term Evolution (LTE) standard, others defined by various standard-setting organizations, other long range protocols, or other data transfer technology.
Instructions 2416 may be transmitted or received over network 2480 using a transmission medium via a network interface device (e.g., a network interface component included in communication components 2464) and utilizing any one of a number of well-known transfer protocols (e.g., HTTP). Instructions 2416 may be transmitted or received using a transmission medium via the coupling 2472 (e.g., a peer-to-peer coupling) to devices 2470. The term “transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding, or carrying instructions 2416 for execution by machine 2400, and includes digital or analog communications signals or other intangible media to facilitate communication of such software.
The foregoing diagrams represent logical architectures for describing processes according to some embodiments, and actual implementations may include more or different components arranged in other manners. Other topologies may be used in conjunction with other embodiments. Moreover, each component or device described herein may be implemented by any number of devices in communication via any number of other public and/or private networks. Two or more of such computing devices may be located remote from one another and may communicate with one another via any known manner of network(s) and/or a dedicated connection. Each component or device may comprise any number of hardware and/or software elements suitable to provide the functions described herein as well as any other functions. For example, any computing device used in an implementation of a system according to some embodiments may include a processor to execute program code such that the computing device operates as described herein. All systems and processes discussed herein may be embodied in program code stored on one or more non-transitory computer-readable media. Such media may include, for example, a floppy disk, a CD-ROM, a DVD-ROM, a Flash drive, magnetic tape, and solid state Random Access Memory (RAM) or Read Only Memory (ROM) storage units. Embodiments are therefore not limited to any specific combination of hardware and software.
Referring again to an alert and notification “fetch” application, service, system, or other functionality herein, FIGS. 25-27 illustrate some aspects thereof for some example embodiments. FIGS. 25-27 illustrate example embodiments for presenting or visualizing representations of the alert and notification (i.e., “fetch”) application, service, system, or functionality. FIG. 25 shows a depiction of a dynamic alert and notification banner 2505 in UI 2500, in combination with a calendar view (as indicated by calendar 2502 and highlighted UI button 2510). In some example embodiments, the dynamic alert and notification banner 2505 is automatically and dynamically presented via a sliding down animation in response to a new alert/notification being received from the system. For example, as new alerts/notifications are received, an alert and notification banner is dynamically updated and presented in UI 2500, via an animation, to provide a current status indicator of the alerts/notifications relevant to the user. In the example of FIG. 25, banner 2505 provides text notifying a user of the total number of alerts (“14” as indicated in UI button 2515) and text (i.e., “You have 14 Alerts 2 New 1 Unread 11 Read”) describing the status and number corresponding to each alert/notification status, as either “new”, “unread”, or “read”, where a “new” notification banner 2505 will always be presented in a red (or other specific) color. As shown in banner 2505, the UI button 2515 is shown in a red color to indicate and alert the user that at least one new alert/notification has been received in the alert/notification application UI. In the instance that the status of the alert/notification changes from “new” to another state (e.g., to “unread” or “read”), the color of UI button 2515 can automatically change from the red (or other specific) color to another color to visually indicate that there are no more “new” alerts/notifications. In an embodiment, banner 2505 remains displayed until the user has either selected the “Fetch” button 2517 or, similarly, UI button 2515 to view the alerts/notifications referenced in the alert/notification UI, which in turn dismisses (removes from view) banner 2505 and presents the alert/notification list UI 2700 in FIG. 27. In an embodiment, the user may simply dismiss (remove from view) banner 2505 without viewing the alerts/notifications by selecting the banner dismiss button element 2519.
In an embodiment, if no “new” alert/notifications have been received by the alert/notification UI, then thenotification banner2505 is not presented. In this manner, the absence of an alert/notification banner operates as an efficient mechanism to convey that there are currently no new alerts or notifications (i.e., new fetch items) for a user. In some embodiments, a “new” alert/notification means the alert/notification was received and reported after the user last viewed their alerts/notifications via the alert/notification UI2700 inFIG. 27. An “unread” alert/notification means the user has not yet opened and viewed the alert/notification detail via selecting a specific alert/notification item, and a “read” status for an alert/notification means that the user has viewed (or accessed) a visualization or presentation of the alert/notification detail by selecting the specific item in the alert/notification UI list2700.
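A minimal sketch of the banner behavior described above, assuming a simple list of alerts that each carry a "status" field (the function and field names are hypothetical): the banner text summarizes the counts per status, a red highlight is used while any “new” alert exists, and no banner is produced when there are no alerts.

```python
from collections import Counter

def banner_state(alerts):
    # alerts: list of dicts, each with a "status" of "new", "unread", or "read"
    if not alerts:
        return None  # no banner is presented when there are no alerts
    counts = Counter(a["status"] for a in alerts)
    text = (f"You have {len(alerts)} Alerts "
            f"{counts['new']} New {counts['unread']} Unread {counts['read']} Read")
    # Red highlight while any "new" alert exists; another color otherwise.
    color = "red" if counts["new"] else "grey"
    return {"text": text, "color": color}

example = [{"status": "new"}] * 2 + [{"status": "unread"}] + [{"status": "read"}] * 11
print(banner_state(example))
```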
An example embodiment of a natural language input UI element 2602 is shown in UI 2600 of FIG. 26A, which illustrates a mechanism by which a user can interact with the “fetch” functionality herein, as well as the integrated calendar and timeline application, using textual and/or speech inputs in conjunction with conventional manual inputs.
In some aspects and embodiments herein, the system herein dynamically and intelligently notifies a user of the current, updated status of the alerts and notifications (i.e., fetch items) relevant to them based on, for example, the user's role as specified or otherwise known by the system. The system disclosed herein including the integrated calendar and timeline knows the events, both short-term and long-term, relevant to a user and further includes or relates to a real-time clock functionality for such alerts/notifications. Using these aspects and processing capabilities (at least), a system herein may generate and present alerts, reminders and notifications to a user without direct/explicit directions and/or interactions from a user for each triggering event. For example, a user may enter a deadline with an associated priority, wherein in response to that entry a system herein may automatically generate and provide reminder alerts to a user as the deadline approaches based on the time until the deadline and the deadline's indicated priority (e.g., the frequency of the reminder alerts generated for the entered deadline may increase as the due date approaches, with more alerts being generated for high(er) priority tasks/actions). Additionally, (or alternatively), a system herein may stop sending reminder alerts once a task/action associated with the deadline is completed.
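As one possible (hypothetical) realization of the escalating-reminder behavior described above, the interval between reminder alerts could shrink as the deadline approaches and shrink further for higher-priority items, with reminders stopping once the associated task is marked complete; the thresholds and priority scale below are assumptions for illustration only.

```python
from datetime import datetime, timedelta

def next_reminder_interval(deadline, priority, completed=False, now=None):
    # Hypothetical policy: no further reminders once the task is complete;
    # otherwise, the closer the deadline and the higher the priority, the
    # shorter the interval until the next reminder alert.
    if completed:
        return None
    now = now or datetime.now()
    remaining = deadline - now
    if remaining < timedelta(days=1):
        base = timedelta(hours=2)
    elif remaining < timedelta(days=7):
        base = timedelta(hours=12)
    else:
        base = timedelta(days=1)
    scale = {"high": 0.5, "medium": 1.0, "low": 2.0}.get(priority, 1.0)
    return base * scale

print(next_reminder_interval(datetime.now() + timedelta(days=3), "high"))  # ~6 hours
```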
In general, FIG. 26A is an illustrative depiction of an alert and notification UI 2600 comprising a notification header button 2605. Such a UI may be reserved for relatively small displays and/or devices with a relatively small display screen (e.g., some phones, a smartwatch, a wearable device, etc.), in an effort to, for example, conserve screen space and provide efficient GUI usability characteristics. In some embodiments, if no alerts/notifications are contained in the alert/notification UI 2700, then UI button 2605 is not shown. In the present example embodiment of FIG. 26A, the number of alerts/notifications indicated by UI button 2605 (“14”) corresponds to the total number of “new” (2), “unread” (1), and “read” (11) alert/notifications referenced in the alert/notification banner 2505 shown in UI 2500 of FIG. 25. The alerts/notifications referenced in UI 2600 may be accessed by selecting header button 2605 of FIG. 26A. In some embodiments, if no “new” alert/notifications are contained in the alert/notification UI (although there is some combination of “unread” and “read” alert/notifications), the UI button 2605 may be shown in a grey (or another) color to visually communicate this condition to the user. Header button 2605 may, in some instances, be displayed on larger display devices, such as, for example, a laptop screen, a desktop monitor, a tablet, and a mobile phone with a screen of a minimum size. Rules and/or preference settings may control which type of notification UI elements are presented to a user on any given device, such as, for example, the UI banner 2505 shown in FIG. 25 or the notification button 2605 shown in FIG. 26A. In some embodiments, the rules and/or preference settings regarding alerts and notifications may be specific to a certain one or more fetch enabled applications or alternatively may be common across all fetch enabled applications and device platforms associated with a user. In some embodiments where the “fetch” functionality disclosed herein is deployed as a standalone application and not deployed in a specific enabled application, fetch preference settings may be individually set according to a specific device or deployment platform.
FIGS. 26B and 26C relate to preference and filter settings for a fetch function in some example embodiments herein. In the present example, selecting action button 2610 in UI 2600 shown in FIG. 26A causes the display of fetch preference settings UI 2615, as seen in FIG. 26B. As shown, one or more categories of settings are shown in UI 2615. In particular, there are settings for “App Links” 2620, “Audio Alerts” 2625, “Mark all as Read” 2630, “Clear All” 2640, “Sort” 2645, and “Filter” 2650 that can be further selected to enter a setting value or specific preference.
The “App Links” setting2620 may be selected by a user and provides a mechanism for the user to specify one or more applications, systems, and services that can be linked, associated, communicated, or interfaced with via the “fetch” functionality herein to send and receive alerts and notifications relating to a role of the user. In an embodiment, the one or more linked applications (e.g., a suite of applications from a particular vendor) may be interfaced with the system herein via an application programming interface (API) or other technical communication techniques, tools, and protocols.
The “Audio Alerts”setting category2625 may be selected by a user and provides a mechanism for the user to set preferences for audio alerts related to the alerts and notifications that may be processed by the “fetch” functionality herein. In an embodiment, a user may be presented with a mechanism (e.g., a UI including controls to select and specify values) to select, for example, a sound mode (e.g., sound, mute, vibrate), a volume, a ringtone, and (optionally) other audio settings for particular linked applications. In an embodiment, the audio alerts may be specifically set for individual linked applications.
The “Sort” category of fetch preference and filter settings shown inUI2615 at2645 may be selected by a user and provides a mechanism for the user to specify an order of presentation for the alerts and notifications reported by the fetch functionality herein. In an embodiment, the user may reorder the listing of alerts and notifications to their liking, whereas a default setting may list the alerts and notifications in, for example, an alphabetical or prioritized ordered listing.
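A brief sketch of how the “Sort” preference might behave (a purely illustrative default; the priority ranks and field names are assumptions): alerts are listed by priority and then alphabetically unless the user has supplied their own ordering.

```python
def sort_alerts(alerts, user_order=None):
    # user_order: optional list of alert ids in the user's preferred order.
    if user_order:
        position = {alert_id: i for i, alert_id in enumerate(user_order)}
        return sorted(alerts, key=lambda a: position.get(a["id"], len(position)))
    rank = {"high": 0, "medium": 1, "low": 2}
    # Default ordering: prioritized first, then alphabetical by title.
    return sorted(alerts, key=lambda a: (rank.get(a.get("priority"), 3),
                                         a.get("title", "").lower()))
```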
The settings categories of “Mark all as Read” and “Clear All” may be selected to invoke those actions with respect to all fetch alerts and notifications contained in the alerts/notifications system UI. For example, selecting the “Mark all as Read” setting 2630 will operate to classify all outstanding alerts and notifications as “read”. Selecting the “Clear All” setting will remove (i.e., clear) all of the outstanding alerts and notifications so that, for example, notification banner 2505 in FIG. 25 and notification button 2605 in FIG. 26A are cleared of all notifications and thus are no longer presented. A user may navigate from UI 2615 back to UI 2600 (FIG. 26A) by selecting UI “Back” button 2685.
Referring toFIG. 26C,UI2655 is an example of a UI that may be presented in response to selecting the “Filter” setting2650 inUI2615 ofFIG. 26B. As seen, a number of different filter settings for alerts and notifications can be selected for limiting the type of alerts and notifications to be displayed, including for example, a “Show All”designation2660 that can enable the displaying of all alerts and notifications, including but not limited to all of the different types of alerts and notifications shown inUI2655; a “Show High Priority” setting2665 that specifically enables the displaying of high priority alerts and notifications; a “Show Milestones” filter setting2670 that enables the displaying of milestones (e.g., timeline context associated event milestones); a “Show Appointments” filter setting2675 to enable the displaying of appointments (e.g., a calendar context associated appointment event); and a “Show Conference Calls” filter setting2680 that enables the displaying of scheduled conference calls for a user. A user may navigate fromUI2655 back to UI2615 (FIG. 26B) by selecting UI “Back” button2690.
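The filter settings above might map onto a simple predicate over each alert's category and priority, along the lines of the following sketch; the setting keys and category names (such as "milestone" and "conference_call") are assumptions for illustration.

```python
def filter_alerts(alerts, setting):
    # setting is one of: "show_all", "show_high_priority", "show_milestones",
    # "show_appointments", "show_conference_calls".
    if setting == "show_all":
        return list(alerts)
    if setting == "show_high_priority":
        return [a for a in alerts if a.get("priority") == "high"]
    category_for = {
        "show_milestones": "milestone",
        "show_appointments": "appointment",
        "show_conference_calls": "conference_call",
    }
    wanted = category_for.get(setting)
    return [a for a in alerts if a.get("category") == wanted]
```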
FIG. 27 is an example list view UI visualization for an alert and notification application in some embodiments herein. The list view UI visualization configuration of the alert and notification application may be presented inUI2700, as shown inFIG. 27.UI2700 includes aheader element2705 that includes a status of the outstanding alerts/notifications being reported therein. In the example ofFIG. 27, there are 2 “new” alerts, 1 “unread” alert, and 11 “read” alerts being reported. In addition to the text inheader element2705 conveying the status of the alerts, the alerts each have a color corresponding to their status. For example, the 2 “new” (or updated) alerts are shown colored with the dark highlight color at2710, the 1 “unread” (or unopened) alert is presented colored with the light highlight color at2715, and the 11 “read” (or previously opened) alerts are shown with no coloring (i.e., white) at2720. In some aspects, a fetch banner (2505,FIG. 25) may be replaced with a header button (2605,FIG. 26A) after a user views the contents of a fetch pop-uplist view2700 and closes the list view by selecting UI “Done”button2725 in the alert/notification UI. In some aspects, when theheader button2605 ofUI2600 inFIG. 26A is displayed, selecting thebutton2605 launches (opens) the fetch pop-uplist view UI2700 shown inFIG. 27. In an embodiment, the pop-uplist view UI2700 allows a user to select, read, reply, edit, delete and otherwise navigate the fetch alerts/notifications.
The embodiments described herein are solely for the purpose of illustration. Those skilled in the art will recognize that other embodiments may be practiced with modifications and alterations to those described above.