TECHNICAL FIELD

The present disclosure relates generally to the management of applications executing on a vehicle head unit and their ability to output visual content on a display of the head unit.
BACKGROUND

Description of the Related Art

Automobiles are becoming more and more user friendly and interactive. Many new cars are now manufactured with a user interface, called a head unit, which a user can use to control various aspects of the automobile and access a variety of content or applications. For example, the user can use the head unit to change radio stations, change the temperature of the automobile cabin, access maps and global positioning systems, receive media streams via the Internet, access various Internet-based services, or access or control other accessories of the automobile. The head unit can also provide various types of information or content to the user, such as when the automobile is due for an oil change or when a tire rotation should be performed, to name a few.
To view most of this information, users navigate through multiple screens and options to access different accessory information or applications. This mode of interaction is similar in many respects to that of smartphones, where one may have screen after screen along with multiple folders of apps. This type of user interaction, however, can be a great distraction, especially to the driver. Not only can such distractions be a safety issue, but many jurisdictions are also starting to limit the amount and type of content or interactions that users can perform on a head unit. The phone-like app-as-an-icon model does not always lend itself well to the in-vehicle experience. It is with respect to these and other considerations that the embodiments described herein have been made.
BRIEF SUMMARY

Briefly stated, embodiments are directed towards a system that provides an environment in which multiple user applications, which manifest themselves as services, can be executed in the background of a vehicle head unit without direct screen interaction with the user applications by a user of the head unit. That is, certain applications may be run without a user selecting a dedicated application icon and without the application taking over the display when run. A category-management application on the head unit is configured to communicate with a plurality of user applications that are stored on the head unit or stored on a mobile device of the user. The category-management application receives a request to interact with one or more of the user applications. This request may be directly from a user, from the user via a navigation application, from the user applications themselves, from other user or system applications, or from an accessory of the vehicle. The category-management application provides the request to one or more user applications to cause the user applications to execute in the background of the head unit to fulfill the request. The category-management application receives response information from each user application that fulfills the request and presents associated content to the user. The presentation of the content may include a display of visual content, such as via a map displayed by the navigation application, or an audio output, such as via a speaker in the vehicle.
The environment provided by the system described herein reduces distractions caused by a user interacting with the head unit to access user applications, while also increasing the amount of content that can be presented to the user. It also reduces the complexity of the interface by not having to present the user with a distinct icon and application for each service. For example, if a user likes to get gas at particular gas stations because they receive loyalty points from those gas stations, the user can perform a one-time download of the user applications for each gas station brand they care about. These applications are generally brand-specific services that are accessed from a generic “fuel” application that manages the category of fuel services. When the user decides to get gas, the user can simply activate a gas button on the head unit. Embodiments described herein cause a category-management application to access each of the user's gas-related user applications to get a location of the nearest gas station associated with that application. The navigation application can then be utilized to display a map with icons representative of the locations of those stations. As another example, a geolocation search can automatically find and launch an application based on a triggered event, such as “ignition off.”
Therefore, instead of having the user manually select and interact with each gas application to obtain a nearest location and then remember where that gas station is located while searching through the remaining applications, embodiments described herein provide an automated system for obtaining information from user applications without having the user interact with those applications individually. Accordingly, the user is not distracted by searching multiple user applications, and the user is presented with additional content from each user application. Embodiments described herein also leverage the information obtainable by the user applications to augment the content presented by the head unit, which reduces the need to update the head unit with new content.
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS

Non-limiting and non-exhaustive embodiments are described with reference to the following drawings. In the drawings, like reference numerals refer to like parts throughout the various figures unless otherwise specified.
For a better understanding, reference will be made to the following Detailed Description, which is to be read in association with the accompanying drawings:
FIG. 1 illustrates a context diagram of a vehicle environment that utilizes a head unit to present content to a user in accordance with embodiments described herein;
FIG. 2 shows a use case example of a head unit's utilization of category-management applications and user applications in accordance with embodiments described herein;
FIG. 3 illustrates a logical flow diagram generally showing one embodiment of an overview process for utilizing a category-management application to interact with a user application to present content to a user of a vehicle in accordance with embodiments described herein;
FIG. 4 illustrates a logical flow diagram generally showing one embodiment of an alternative process for utilizing multiple category-management applications to interact with multiple user applications to present content to a user in accordance with embodiments described herein; and
FIG. 5 shows a system diagram that describes one implementation of computing systems for implementing embodiments described herein.
DETAILED DESCRIPTION

The following description, along with the accompanying drawings, sets forth certain specific details in order to provide a thorough understanding of various disclosed embodiments. However, one skilled in the relevant art will recognize that the disclosed embodiments may be practiced in various combinations, without one or more of these specific details, or with other methods, components, devices, materials, etc. In other instances, well-known structures or components that are associated with the environment of the present disclosure, including but not limited to the communication systems and networks and the vehicle environment, have not been shown or described in order to avoid unnecessarily obscuring descriptions of the embodiments. Additionally, the various embodiments may be methods, systems, media, or devices. Accordingly, the various embodiments may be entirely hardware embodiments, entirely software embodiments, or embodiments combining software and hardware aspects.
Throughout the specification, claims, and drawings, the following terms take the meaning explicitly associated herein, unless the context clearly dictates otherwise. The term “herein” refers to the specification, claims, and drawings associated with the current application. The phrases “in one embodiment,” “in another embodiment,” “in various embodiments,” “in some embodiments,” “in other embodiments,” and other variations thereof refer to one or more features, structures, functions, limitations, or characteristics of the present disclosure, and are not limited to the same or different embodiments unless the context clearly dictates otherwise. As used herein, the term “or” is an inclusive “or” operator and is equivalent to the phrases “A or B, or both” or “A or B or C, or any combination thereof,” and lists with additional elements are similarly treated. The term “based on” is not exclusive and allows for being based on additional features, functions, aspects, or limitations not described, unless the context clearly dictates otherwise. In addition, throughout the specification, the meaning of “a,” “an,” and “the” include singular and plural references.
The term “user” is defined as a person or occupant that is in or otherwise being transported by a vehicle. The user may be the driver or a passenger of the vehicle. The term “vehicle” is defined as a device used to transport people or goods (or both), and examples include automobiles, buses, aircraft, boats, or trains. A “processor” is defined as a component with at least some circuitry or other hardware and that can execute instructions. The term “head unit” is defined as a component with at least some circuitry that is part of a vehicle and presents content to a user (as defined above). The term “present” is defined as to bring or introduce to the presence of a user through some sensory interaction.
The term “content” is defined as information that can be presented to a user of the vehicle. Content may include visual content, audio content, tactile content, or some combination thereof. Visual content can include text, graphics, symbols, video, or other information that is displayed to a user. Audio content can include songs, vocals, music, chimes, or other types of sounds. Tactile content can include vibrations, pulses, or other types of touch-based sensations provided via a haptic interface. Generalized types of content can include advertisements, sports scores or information, logos, directions, restaurant menus, prices, hours of operation, coupons, descriptive information, emergency instructions, etc.
FIG. 1 illustrates a context diagram of a vehicle environment that utilizes a head unit to present content to a user in accordance with embodiments described herein. System 100 includes a vehicle 102 that has a head unit 104 and one or more accessories 108a-108c. The vehicle 102 is virtually any means of transportation that includes a computing device and an output interface to provide content to a user of the vehicle 102. In the illustrative examples described herein, the computing device of the vehicle is the head unit 104, although other types of computing devices may be employed.
The head unit 104 is a computing device that provides content, interactive controls, user interfaces, or other information to users of the vehicle 102. In various embodiments, the head unit 104 utilizes one or more input/output interfaces for user interactions, which may be integrated into the head unit 104 (e.g., input/output interfaces 116) or external to the head unit 104 (e.g., other input/output interfaces 126). In some embodiments, the input/output interfaces 116 or a portion thereof may be part of or embedded within the head unit 104. In other embodiments, the other input/output interfaces 126 or a portion thereof may be separate from or independent of the head unit 104. In various embodiments, the head unit 104 may utilize some combination of the input/output interfaces 116 and the input/output interfaces 126. For example, the head unit 104 may include a built-in display device to output visual content and utilize a separate speaker that is external to the head unit 104 to output audio content. The head-unit-integrated input/output interfaces 116 and the other external input/output interfaces 126 may be collectively referred to as input/output interfaces 116, 126.
The input/output interfaces 116, 126 are configured to receive input from a user of the vehicle 102 or to output content to a user of the vehicle 102. The input/output interfaces 116, 126 may include one or more output interfaces, which may include a visual interface, such as a display device; an audio output interface, such as a speaker; a haptic interface, such as a tactile output device; or a combination thereof. Therefore, the input/output interfaces 116, 126 may include one or more output interfaces configured to output visual content, audio content, tactile content, or some combination thereof.
The input/output interfaces 116, 126 may also include one or more input interfaces, which may include input buttons, a touchscreen, a microphone, or other input interfaces. Therefore, the input/output interfaces 116, 126 may include one or more input interfaces configured to receive visual, audio, or physical input commands, or some combination thereof. Embodiments described herein in which the head unit 104 receives input or provides output may be implemented by an internal or integrated input/output interface 116, other external input/output interfaces 126, or some combination thereof. As an illustrative example, the head unit 104 may provide a navigation interface, audio and radio controls, environmental controls, vehicle performance or maintenance information, or other types of content.
An accessory 108a-108c can be any device or process that provides information or data directly or indirectly (such as via the head unit 104) to the user. Examples include the following: gas-level gauge, speedometer, odometer, oil-pressure gauge, temperature gauge, tire-pressure gauge, GPS device, ignition-status indicator, gear-shift mechanics or electronics indicating a gear state or change of state, seat-belt-status indicator, seat-weight sensors, clock, or other vehicle sensor that provides information to a user. Additional information on the accessories 108a-108c will be presented below.
In various embodiments described herein, the head unit 104 provides an environment in which category-management applications access and manage user applications such that the category-management applications present content to a user of the head unit 104 without the user interacting directly with the user applications. As an example, the content may be presented to the user through an interactive element, like a navigation application. These category-management applications may be applications that execute to coordinate requests and information between the user (via the interactive element) and the user applications. The user applications may be individual applications selected or downloaded by the user or pre-installed applications. Accordingly, user applications may obtain or provide content to a user. Moreover, a user application may be self-sufficient and not have to interact with a remote server or service; alternatively, it may exchange content with a remote server. In some embodiments, user applications may be full-service applications that utilize user input to obtain content, present content to the user, or provide content to a remote server. In other embodiments, user applications may be background applications or processes that are not under the direct control of an interactive user.
For example, the head unit 104 may have stored thereon a navigation application and one or more category-management applications, such as a gas application or a food application. The head unit 104 also has stored thereon a plurality of user applications, which, for illustrative purposes, may include examples such as Gas_Station_ABC, Gas_To_Go, SuperStoreGas, Burger_Joint, and Taco_Today. Each of these user applications provides many different functions, from enabling a user to order food to providing a daily coupon to the user. The gas application accesses and interacts with Gas_Station_ABC, Gas_To_Go, and SuperStoreGas, and the food application accesses and interacts with Burger_Joint and Taco_Today.
The user applications can provide information or content to the category-management applications unrequested or in response to a request from the category-management application. The category-management application then determines if and how to present the information or content to the user of the head unit 104. Continuing the example, the gas application can obtain a current location, current route (e.g., a full route to a destination or a partial route for a select amount of time or distance), or destination of the vehicle 102, such as from the navigation application. The gas application then provides this current location information to each user application and requests a nearest store location associated with that user application. In response, one or more of Gas_Station_ABC, Gas_To_Go, and SuperStoreGas provides the gas application with coordinates of its nearest corresponding gas station with respect to the current location of the vehicle 102. The gas category-management application then provides the coordinates to the navigation application to display an icon representative of the location of one or more of the nearest gas stations on a map. In some embodiments, the gas user applications may also present other content to the gas category-management application, such as a company logo, prices, coupons, or other content.
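By way of illustration only, the following Kotlin sketch shows one possible shape of this fan-out and aggregation flow; the interface and function names (GasUserApp, nearestStation, and so on) are hypothetical and are not part of the disclosure.

```kotlin
// Illustrative sketch only: a category-management application fans a location
// request out to brand-specific user applications running in the background
// and aggregates the nearest-station results for the navigation application.
// All names here are hypothetical.

data class Location(val lat: Double, val lon: Double)
data class StationResult(val brand: String, val location: Location)

// Hypothetical contract a gas-related user application might expose.
interface GasUserApp {
    val brand: String
    fun nearestStation(current: Location): StationResult?  // null if none found
}

class GasCategoryApp(private val userApps: List<GasUserApp>) {
    // Fan the request out; user applications that cannot answer return null.
    fun nearestStations(current: Location): List<StationResult> =
        userApps.mapNotNull { it.nearestStation(current) }
}

fun main() {
    val apps = listOf(object : GasUserApp {
        override val brand = "Gas_Station_ABC"
        override fun nearestStation(current: Location): StationResult =
            StationResult(brand, Location(47.61, -122.33))
    })
    // The aggregated results would be handed to the navigation application
    // for display as map icons.
    GasCategoryApp(apps).nearestStations(Location(47.60, -122.33))
        .forEach { println("${it.brand} at ${it.location}") }
}
```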
The navigation application responds to the gas category-management application by presenting the location of the nearest gas stations to the user of the vehicle 102 via the head unit 104. In this way, the user of the head unit 104 can see the closest gas stations associated with the gas user applications without having to open each gas user application to separately request a nearest location. In many instances, user applications are updated from time to time to include more functionality, include additional store locations, etc. The interaction between the category-management applications and the user applications, described herein, allows the head unit 104 to present updated information from the user applications.
As another example, an accessory 108a-108c or some process, like the navigation application, may monitor one or more parameters or variables, such as the vehicle's location or fuel level, and if certain conditions are met, may send messages to user applications related to the parameters or variables or surface the information to the user. In one example, a background process may periodically check the fuel level. When the level is below a threshold value and the navigation application detects that a gas station is close by, the head unit 104 may present that gas station's interface to the user. In another example, a user may have ordered food from a quick-service restaurant. A background process (possibly the same restaurant user application or a generic tracking user application) can send a message to the restaurant user application when the vehicle is within a certain distance from the restaurant. The restaurant user application can then forward the message to the restaurant's web-service interface using its own proprietary mechanisms. This step can allow the restaurant to start preparing the food in anticipation of the arrival of the user. In some embodiments, the generic tracking user application could be a generic in-vehicle user application that allows arbitrary applications to use a subscribe-publish model to request alerts when the vehicle is in a certain area.
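The following sketch illustrates, with hypothetical names and thresholds, both background triggers just described: a low-fuel check that surfaces a nearby station, and a subscribe-publish alert fired when the vehicle nears a subscribed location.

```kotlin
// Illustrative sketch only: a background tracking service with a low-fuel
// trigger and a subscribe-publish geofence. Names and thresholds are
// hypothetical, not part of the disclosure.

import kotlin.math.abs

data class Location(val lat: Double, val lon: Double)

// Coarse closeness test, adequate for a sketch (~0.01 deg of latitude ≈ 1 km).
fun isNear(a: Location, b: Location, degrees: Double = 0.01) =
    abs(a.lat - b.lat) < degrees && abs(a.lon - b.lon) < degrees

class TrackingService {
    private val subs = mutableListOf<Pair<Location, () -> Unit>>()

    // A user application (e.g., a restaurant app) subscribes to an area.
    fun subscribe(target: Location, onArrival: () -> Unit) {
        subs += target to onArrival
    }

    // Called periodically by a background process with current vehicle state.
    fun onLocationUpdate(current: Location, fuelLevel: Double, nearbyStation: Location?) {
        if (fuelLevel < 0.15 && nearbyStation != null && isNear(current, nearbyStation, 0.05))
            println("Presenting gas station interface to the user")
        subs.filter { isNear(current, it.first) }.forEach { it.second() }
    }
}

fun main() {
    val tracker = TrackingService()
    tracker.subscribe(Location(47.62, -122.35)) {
        println("Notify restaurant: start preparing the order")
    }
    tracker.onLocationUpdate(Location(47.62, -122.35), fuelLevel = 0.10,
                             nearbyStation = Location(47.63, -122.35))
}
```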
Although the previous examples describe the user applications as being stored on the head unit 104, embodiments are not so limited. In some embodiments, one or more of the user applications may be stored on a mobile device 114 that is separate from the head unit 104. In this example, the category-management applications executing on the head unit 104 communicate with the mobile device 114 to obtain content or information for presentation to the user via the input/output interfaces 116, 126. The user applications may also be completely cloud hosted, running on one or more remote servers 124. In this case, the category-management applications are configured to be aware of them and how to communicate with them, as necessary. In this instantiation, the user applications operate much like web services. Category-management applications can subscribe to asynchronous notifications of events or send requests and receive responses via a communication network to exchange data with the user applications.
In various embodiments, the head unit 104 may be configured to communicate with other computing devices, such as mobile device 114 or remote server 124. For example, the head unit 104 may interact with user applications executing on the mobile device 114 via mobile device communication network 120. Similarly, the head unit 104 may communicate with remote server 124 via communication network 122 to obtain content or other information, such as in response to a request from a user application, or interact with cloud-based user applications, as noted earlier. In at least one embodiment, the mobile device 114 may act as an intermediate device between the head unit 104 and the remote server 124.
The remote server 124 is any computing device, such as a server computer, cloud resources, a smartphone or other mobile device, or other computing device, which is remote to the vehicle 102 and can provide content or other information to the head unit 104 or the mobile device 114. Although the remote server 124 is illustrated as a single device, embodiments are not so limited. Rather, the remote server 124 may be one or more computing devices, including those that collectively perform functions.
The mobile device 114 includes any device capable of communicating with a head unit 104 of the vehicle 102 or remote server 124. The mobile device 114 is configured and structured to send and receive information, content, or controls to and from the head unit 104 or the remote server 124. Examples of the mobile device 114 include laptop computers, smartphones, tablet computers, wearable computing devices, other smart devices, or other handheld computing devices.
In some embodiments, the remote server 124, the head unit 104, and the mobile device 114 communicate with each other via a communication network 122. The communication network 122 is configured to couple various computing devices to transmit data from one or more devices to one or more other devices. Communication network 122 includes various wireless networks that may be employed using various forms of communication technologies and topologies, such as cellular networks, mesh networks, or the like.
In various embodiments, the head unit 104 communicates with the mobile device 114 via a mobile device communication network 120. The mobile device communication network 120 is configured to couple the mobile device 114 with the head unit 104 to transmit content/data between the mobile device 114 and the head unit 104. The information communicated between devices may include current accessory status or data, vehicle status information, requests to access accessory data, requests to control or modify an accessory, video data, voice data, image data, text data, or other types of content, data, or information. The communication network 120 may include a variety of short-range wireless communication networks, such as personal area networks utilizing classic Bluetooth or Bluetooth Low Energy protocols, Wi-Fi, USB, an IR optical network, or the like, to enable communication between the mobile device 114 and the head unit 104. The communication network 120 may also be implemented using Internet connectivity over wide-area cellular networks (such as 4G and 5G networks).
In various embodiments, the user may interact with the head unit 104 via the mobile device 114 such that the mobile device 114 acts as a virtual head unit. In this way, user input provided to the head unit 104 may be received from the user via the mobile device 114 and transmitted from the mobile device 114 to the head unit 104 for processing by the head unit 104. Conversely, content to be presented to the user may be provided to the mobile device 114 from the head unit 104 and displayed or otherwise output to the user from the mobile device 114. In some other embodiments, the mobile device 114 may perform the functionality of head unit 104 or may project one or more of its applications to the head unit 104.
The mobile device communication network 120, the communication network 122, and the accessory communication network 106 may be separate communication networks, as illustrated, or some of them may form the same communication network or share network components.
The head unit 104 may also be configured to access or receive information or control use of the one or more accessories 108a-108c. The accessories 108a-108c include virtually any vehicle utility or device that provides information or data to the user, including data received from core components of the vehicle 102 via the vehicle's Controller Area Network (CAN bus). Accessories 108a-108c may also include applications executing on the head unit 104 that provide information to the user or have two-way interactions with the user. Examples of these accessories include navigation, audio and radio controls, television or music applications, environmental control applications, vehicle performance or maintenance applications, or other applications.
Accessories 108a-108c may also receive information from other sources. For example, in some embodiments, the accessories 108a-108c may collect “derived accessory data” from internal-facing or external-facing cameras or other sensors. Derived accessory data is information about an environment associated with the vehicle that can provide additional details or aspects of the operation of the vehicle, as described herein. For example, images from a camera on the vehicle may be analyzed to determine which user is in the vehicle, which user is operating the vehicle, where the driver or other user is looking (e.g., whether they are talking to a passenger), whether there are pedestrians nearby, whether there are billboards or store signs next to the road or vehicle, etc.
In some embodiments, the accessories 108a-108c may also include any vehicle utility or device that is controllable by a user. Examples of these accessories include adjustable seats, sun roof, side mirrors, rear-view mirror, air conditioner, power windows, or other controllable features of the vehicle 102.
It should be noted that some accessories may only output data, some accessories may only receive control signals to manipulate the accessory, and some accessories may input and output data. For example, a speedometer may only output the current speed of the vehicle; a power window may only receive control signals to move the window up or down but not return any information to the head unit; and the navigation system may receive a request for a destination and also return a suggested travel route to the destination. It should be further noted that these examples are non-exhaustive and other types of accessories may also be employed.
The head unit 104 can communicate with the accessories 108a-108c via an accessory communication network 106. The accessory communication network 106 is configured to couple the accessories 108a-108c with the head unit 104 to transmit content/data between the accessories 108a-108c and the head unit 104. The information communicated between devices may include current accessory status or data, accessory control data, video data, voice data, image data, text data, or other types of content, data, or information. The accessory communication network 106 may include one or more physical networks; one or more wireless communication networks; one or more application program interfaces; or one or more other networks capable of transmitting data from one accessory to another, from an accessory to the head unit 104, or from the head unit to an accessory; or some combination thereof, depending on the types of accessories communicating with the head unit 104. For example, the accessory communication network 106 may include an automotive body communication network, such as a wired controller area network; a short-range wireless communication network, such as a personal area network utilizing Bluetooth Low Energy protocols; or any other type of network.
In some other embodiments, the head unit 104 may act as an intermediate device that facilitates communication between the mobile device 114 and the accessories 108a-108c. In this way, the head unit 104 can act as a gateway between the mobile device 114 and the accessories 108a-108c to provide authentication and authorization for permitting or restricting the control of accessories 108a-108c and the transfer of accessory information, which can enable a user to access information from or control accessories 108a-108c via mobile device 114.
FIG. 2 shows a use case example of a head unit's utilization of category-management applications and user applications in accordance with embodiments described herein. Example 200 includes a head unit 104 and a remote server 124, and optionally a mobile device 114, similar to what is described above in conjunction with FIG. 1.
The head unit 104 has stored thereon a navigation application 202, a plurality of category-management applications 204, and a plurality of user applications 210. In various embodiments, the navigation application 202, the category-management applications 204, or the user applications 210 may be considered as accessories of the vehicle. Although not illustrated, the head unit may also store system applications, such as a camera application, other map applications, safety applications, etc., that may be built into the head unit by a manufacturer. In another arrangement, one or more of the navigation application 202, category-management applications 204, user applications 210, or system applications may be installed or stored on other devices and selectively presented on the head unit, such as for interaction with a user.
The navigation application 202 is an application that obtains GPS or other location data and outputs visual or audible content to a user of the head unit 104, such as a map or directions. Accordingly, the navigation application 202 can display a map with the vehicle's current location and, optionally, a route to a destination. The navigation application 202 may also display other content to the user. For example, the navigation application 202 may overlay and display icons representative of the location of parks, stores, restaurants, etc., or external conditions, such as weather or traffic. In other embodiments, the navigation application 202 may also display other content received from the category-management applications 204, such as coupons, advertisements, alerts, or other types of content. This additional content may be displayed to a user via a banner, window, or scrolling text or presented through some other audiovisual interface.
The category-management applications 204 are applications that interact with one or more related user applications 210, as well as the navigation application 202. In general, a category-management application 204 serves as an interface for its corresponding user applications 210 so that the user applications 210 can execute in the background without requiring direct user interaction. This process enables the category-management application 204 to aggregate information from its corresponding user applications 210 for efficient presentation to a user. In this example, the category-management applications 204 include a gas application 204a and a food application 204b. Embodiments, however, are not so limited, and other types of category-management applications 204 may be used, such as those related to financial institutions, retail establishments, parks and recreation, repair shops, games, safety, etc.
The user applications 210 are applications for a particular store, service, product, business, shop, station, park, game, etc. These applications are typically considered stand-alone applications in that they can be accessed directly by a user via a graphical user interface and typically execute in the foreground of the computing system. In embodiments described herein, however, the user applications 210 execute in the background without direct interaction with the user.
Rather, the user interacts with the head unit 104 via the navigation application 202, or optionally the category-management applications 204 or even other interfaces 220, which reduces the interactions between the user and the head unit 104 and subsequently reduces the distractions to the user. Therefore, the category-management applications 204 and the navigation application 202 work together to act as an intermediary between the user and the user applications to present content from or for a user application 210 without the user directly interacting with the user applications 210. As a result, the user is not distracted by scrolling through many different windows or visual interfaces to access a user application 210 or directly interacting with the user application 210 itself.
Alternatively, the category-management applications 204 may provide the user applications 210 a display area that can be used directly for user interaction. In this mode, however, the head unit 104 presents the look and feel of using a single application, and the user is unaware of the user applications 210 that are effectively service modules to the category-management application 204. For example, the gas application 204a may show the user the look and feel of the Gas_Station_ABC application 210a, but the actual payment interface may be provided by the Gas_Station_ABC application 210a itself. In either case, the user is unaware of the plurality of user applications 210 or does not select them directly.
In one embodiment, the user application 210 that is presented to the user through the category-management application 204 may depend on one or more factors, like proximity to the vehicle or prior user activity. For example, the Gas_Station_ABC application 210a may be selected as the service module to be presented to the user through the gas application 204a if a gas station associated with this application 210a is closer to the vehicle than a different gas station. As another example, the selection of a user application 210 under this arrangement may stem from the user's repeated visits to an establishment associated with that user application 210. In this case, various machine-learning (“ML”) models, including those onboard or remote to the vehicle, may assist in this selection. Factors other than proximity or previous user actions, such as the type of vehicle, vehicle parameters, settings, or values, or events at the establishments associated with the user applications 210, may drive the selection of a user application 210 for this form of presentation.
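As a purely illustrative sketch of such a selection policy, the following scores candidate user applications by proximity and visit history; the weights and names are hypothetical, and a deployed system might instead rely on the ML models mentioned above.

```kotlin
// Illustrative sketch only: rank candidate user applications for presentation
// by combining proximity and prior visit counts. Weights are hypothetical.

data class Candidate(val appName: String, val distanceKm: Double, val pastVisits: Int)

fun selectServiceModule(candidates: List<Candidate>): Candidate? =
    candidates.maxByOrNull { c ->
        // Closer establishments and frequently visited brands score higher.
        (1.0 / (1.0 + c.distanceKm)) + 0.1 * c.pastVisits
    }

fun main() {
    val pick = selectServiceModule(listOf(
        Candidate("Gas_Station_ABC", distanceKm = 1.2, pastVisits = 8),
        Candidate("Gas_To_Go", distanceKm = 0.8, pastVisits = 0)))
    println(pick?.appName)  // Gas_Station_ABC: visit history outweighs distance here
}
```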
No matter how the category-management applications 204 minimize interactions between a user and the head unit 104, one or more user applications 210 may be assigned or linked to one or more category-management applications 204 as part of that process. This connection may be done automatically or manually (or both). For example, if a gas user application 210 is installed, the head unit 104 (or some other device) may automatically assign the gas user application 210 to a gas category-management application 204. A user may also perform this step manually and can override any automatic linking done by the system. In addition, a user application 210 may be assigned to multiple category-management applications 204, such as if the entity related to that user application offers goods or services that are associated with the themes of two separate category-management applications 204.
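One way such automatic and manual linking could be tracked is sketched below; the registry shape and category names are hypothetical and offered only as an illustration of the assignment and override steps just described.

```kotlin
// Illustrative sketch only: a registry that links installed user applications
// to categories automatically (from the package's declared categories) and
// supports a manual user override. All names are hypothetical.

data class UserAppPackage(val name: String, val declaredCategories: Set<String>)

class CategoryRegistry {
    private val links = mutableMapOf<String, MutableSet<String>>() // category -> app names

    // Automatic assignment at install time, based on the software package.
    fun install(pkg: UserAppPackage) {
        pkg.declaredCategories.forEach { cat ->
            links.getOrPut(cat) { mutableSetOf() } += pkg.name
        }
    }

    // Manual override: the user re-links an app, replacing automatic links.
    fun relink(appName: String, newCategories: Set<String>) {
        links.values.forEach { it.remove(appName) }
        newCategories.forEach { links.getOrPut(it) { mutableSetOf() } += appName }
    }

    fun appsFor(category: String): Set<String> = links[category] ?: emptySet()
}

fun main() {
    val registry = CategoryRegistry()
    // An app may be assigned to multiple categories, as described above.
    registry.install(UserAppPackage("SuperStoreGas", setOf("fuel", "food")))
    println(registry.appsFor("fuel"))  // [SuperStoreGas]
}
```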
The user applications 210 can provide a variety of content (including information) for presentation to a user of the head unit 104, such as nearest store locations, sale items, coupons, advertisements, contact information, hours of operation, prices, etc. In this example, the user applications 210 include a Gas_Station_ABC application 210a, a Gas_To_Go application 210b, a SuperStoreGas application 210c, a Burger_Joint application 210d, and a Taco_Today application 210e. These examples are for illustrative purposes and are not to be construed as limiting.
In various embodiments, the user applications 210 communicate with one or more remote servers 124 to obtain content. For example, the Gas_Station_ABC application 210a may communicate with a corresponding remote server 124 to obtain updated information on the locations of the Gas_Station_ABC service stations or a nearest service station given some location information (e.g., current location of the vehicle, a destination of the vehicle, a home town of an owner of the vehicle, etc.). In some embodiments, the user applications 210 may communicate with the same remote server 124, but in other embodiments, the user applications 210 communicate with their own separate respective remote servers 124.
The user applications 210 communicate with the category-management applications 204 via links 214a-214f, which represent interprocess communications. For example, the Gas_Station_ABC application 210a, the Gas_To_Go application 210b, and the SuperStoreGas application 210c communicate with gas application 204a via links 214a-214c, respectively, and the Burger_Joint application 210d and the Taco_Today application 210e communicate with food application 204b via links 214d and 214e, respectively. In various embodiments, each user application 210 communicates with a single category-management application 204. In other embodiments, one or more user applications 210 may communicate with multiple category-management applications 204. For example, the SuperStoreGas application 210c may communicate with the gas application 204a via link 214c and optionally with the food application 204b via link 214f.
In various embodiments, links 214a-214f may be pipes, sockets, message buses, or other communication channels, protocols, interfaces, etc. In various embodiments, the links 214a-214f between the respective category-management applications 204 and the user applications 210 are established upon installation of the user application 210 onto the head unit 104 (or some other device). In some embodiments, the developer or an administrator of the user applications 210 selects which link 214 to communicate with a specific category-management application 204. For example, the developer of a gas-related user application 210 selects a link 214 to the gas application 204a, whereas a developer of a food-related user application 210 selects a link 214 to the food application 204b. In some other embodiments, the user of the head unit 104 is enabled to select the links 214 between the user applications 210 and the category-management applications 204, such as via a graphical user interface. This functionality allows the user to dynamically change with which user applications 210 each category-management application 204 interacts. In other cases, the system may automatically select the link 214, which can be based on information in the software package of the user application 210. In some embodiments, the user applications 210 may not run as distinct processes but may be modules, such as shared libraries, scripts, or web services, that are linked to a category-management application 204 or otherwise loaded at runtime.
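The following sketch, offered only as an illustration, models each link as a simple in-process request handler established at install time; an actual implementation could equally use pipes, sockets, or a platform message bus, and every name here is hypothetical.

```kotlin
// Illustrative sketch only: each link 214 is modeled as an in-process request
// handler that a user application registers with a category-management
// application when it is installed.

typealias Request = Map<String, String>
typealias Response = Map<String, String>

class CategoryApp(val category: String) {
    private val links = mutableListOf<(Request) -> Response>()

    // Called at install time to establish a link to a user application.
    fun establishLink(handler: (Request) -> Response) {
        links += handler
    }

    // Broadcast a request over every established link and collect responses.
    fun broadcast(request: Request): List<Response> = links.map { it(request) }
}

fun main() {
    val gas = CategoryApp("fuel")
    gas.establishLink { req ->
        mapOf("app" to "Gas_To_Go", "handled" to (req["type"] ?: "unknown"))
    }
    println(gas.broadcast(mapOf("type" to "nearest_station")))
}
```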
As mentioned above, the user may interact with the user applications 210 via the navigation application 202 or some other interface 220. As one example, a user can touch a soft button being displayed by the navigation application 202 to access user-application content. For example, the navigation application 202 may present a “gas” button to enable a user to retrieve content from gas user applications 210. Continuing with this example, the navigation application 202 sends a request to the gas application 204a for the location of gas stations within a certain range of the current location of the vehicle. In this example, the range is a ten-mile radius with the current location of the vehicle at roughly the center of the radius. In various embodiments, the request includes GPS coordinates or other location information with respect to the vehicle. The gas application 204a sends a request to each of the Gas_Station_ABC application 210a, the Gas_To_Go application 210b, and the SuperStoreGas application 210c via links 214a-214c to obtain information regarding gas stations within the ten-mile radius of the vehicle's current location, although such a request may be sent to one or more other applications based on user preferences or activity history.
The Gas_Station_ABC application 210a, the Gas_To_Go application 210b, and the SuperStoreGas application 210c execute in the background to fulfill the request, which, in some embodiments, may include communicating with one or more remote servers 124. In response to obtaining the requested information, the Gas_Station_ABC application 210a, the Gas_To_Go application 210b, and the SuperStoreGas application 210c provide the requested information to the gas application 204a via links 214a-214c. In this example, the requested information may be GPS coordinates and a logo for the corresponding gas station. The gas application 204a aggregates the received information and provides it to the navigation application 202, where the navigation application 202 modifies or augments a displayed map to include the gas-station logos at the respective locations.
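As an illustration of the radius test involved in this aggregation (and the filtering discussed next), the following sketch drops station results outside the ten-mile range using a great-circle distance; the data shapes are hypothetical.

```kotlin
// Illustrative sketch only: aggregate station results from user applications
// and keep only those within the requested radius of the vehicle's location.

import kotlin.math.*

data class Location(val lat: Double, val lon: Double)
data class StationInfo(val logo: String, val location: Location)

// Great-circle (haversine) distance in miles.
fun distanceMiles(a: Location, b: Location): Double {
    val r = 3958.8 // Earth radius in miles
    val dLat = Math.toRadians(b.lat - a.lat)
    val dLon = Math.toRadians(b.lon - a.lon)
    val h = sin(dLat / 2).pow(2) +
            cos(Math.toRadians(a.lat)) * cos(Math.toRadians(b.lat)) * sin(dLon / 2).pow(2)
    return 2 * r * asin(sqrt(h))
}

fun aggregate(current: Location, radiusMiles: Double,
              responses: List<StationInfo>): List<StationInfo> =
    responses.filter { distanceMiles(current, it.location) <= radiusMiles }

fun main() {
    val current = Location(47.6062, -122.3321)
    val responses = listOf(
        StationInfo("Gas_Station_ABC", Location(47.61, -122.33)),  // ~0.3 miles away
        StationInfo("Gas_To_Go", Location(48.00, -122.33)))        // ~27 miles away
    // Only the in-range logo would be handed to the navigation application.
    println(aggregate(current, 10.0, responses).map { it.logo })   // [Gas_Station_ABC]
}
```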
In another arrangement, a user may initiate a request directly through a category-management application 204, as opposed to launching it through the navigation application 202. For example, the user may first select the gas application 204a, which can retrieve and aggregate the appropriate location information from the Gas_Station_ABC application 210a, the Gas_To_Go application 210b, and the SuperStoreGas application 210c. In turn, the gas application 204a can share this content with the navigation application 202, which can cause the logos to be presented on its user interface, as described above. Moreover, the gas application 204a can discard content that may not be useful before sharing it with the navigation application 202. For example, the location of the closest gas station affiliated with the Gas_Station_ABC application 210a may be on the outer boundary of the range (or radius) set by the vehicle's current location, and the vehicle is presently moving away from that location. In this case, the gas application 204a could filter out this content, preventing its display via the navigation application 202.
Although the above examples discuss the category-management application 204 as accessing nearest store-location information from the user applications 210, embodiments are not so limited. In other embodiments, a user application 210 may provide information specific to the user. For example, when the user puts the vehicle into park, the navigation application 202 may provide location information to the gas application 204a and the food application 204b. The gas application 204a and the food application 204b can provide an instruction to the respective user applications 210 requesting nearest store information. When the user applications 210 respond with the nearest store locations, the category-management applications 204 can determine if a nearest store location matches the current location of the vehicle.
For this example, assume the location received from the Gas_Station_ABC application 210a matches the current location of the vehicle, which indicates that the vehicle is at a particular Gas_Station_ABC station. In response, the gas application 204a can request additional information from the Gas_Station_ABC application 210a for that particular station. In this example, the Gas_Station_ABC application 210a may respond with certain content, such as gas prices, number of pumps, coupons, a rewards account number, current rewards points, etc. The gas application 204a then presents the received content to a user, such as via the navigation application 202 or some other interface of the head unit 104 or the other interfaces 220. In one non-limiting example, the gas application 204a may display a coupon or other content for that particular station or output audio content via a speaker indicating a currently available pump number and the current rewards balance for that station.
In some other embodiments, the category-management applications 204 can provide other information to the user applications 210, so that the user applications 210 can perform other actions. For example, the category-management applications can provide a currently entered destination, route, or estimated time of arrival to one or more of the user applications 210. In at least one embodiment, each user application 210 registers with one or more of the category-management applications 204 and notifies the category-management application 204 of the information that that user application 210 is to receive. In this way, a user application 210 can provide additional information to the user based on where the user is going. For example, suppose the user enters a destination of Hardware_Store_MNO into the navigation application. The navigation application can provide this information to a category-management application, which can then forward the information to a user Hardware_Store_MNO application. The Hardware_Store_MNO application can then respond with a coupon or hours of operation prior to the user arriving at Hardware_Store_MNO. This information can also be made available to a companion application running on the mobile device, allowing a user to see the coupon in the car but be able to use it inside the store.
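The registration-and-notification step described above might resemble the following sketch, in which user applications subscribe to destination updates forwarded from the navigation application; all names are hypothetical.

```kotlin
// Illustrative sketch only: user applications register for destination
// updates with a category-management application and are called back when
// the navigation application reports a newly entered destination.

class DestinationNotifier {
    private val subscribers = mutableMapOf<String, (String) -> Unit>() // app -> callback

    // A user application registers and declares the callback it wants.
    fun register(appName: String, onDestination: (String) -> Unit) {
        subscribers[appName] = onDestination
    }

    // Called when the navigation application forwards a new destination.
    fun destinationEntered(destination: String) {
        subscribers.values.forEach { it(destination) }
    }
}

fun main() {
    val notifier = DestinationNotifier()
    notifier.register("Hardware_Store_MNO") { dest ->
        if (dest == "Hardware_Store_MNO")
            println("Sending coupon and hours of operation before arrival")
    }
    notifier.destinationEntered("Hardware_Store_MNO")
}
```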
As another example, the user may interact with the category-management applications 204 via other interfaces 220, such as via a voice-activated interface. In one example scenario, a user of the head unit 104 may speak verbal commands that include food ordering instructions, such as “give me a biggie with a side of wedges.” The other interfaces 220 utilize voice-recognition techniques to obtain the text of the user's instructions. These instructions are then provided to the gas application 204a and the food application 204b. The gas application 204a sends the instructions to the Gas_Station_ABC application 210a, the Gas_To_Go application 210b, and the SuperStoreGas application 210c, and the food application 204b sends the instructions to the Burger_Joint application 210d and the Taco_Today application 210e. If any of the user applications 210 recognize the instructions, then they can respond to the corresponding category-management application 204 indicating that the instructions are received and acknowledged.
In this example, the Burger_Joint application 210d may recognize the instructions as a request to pre-order a “Biggie Meal” with a side order of french fries. The Burger_Joint application 210d can send an acknowledgement request back to the food application 204b, which can output a visible confirmation request via the navigation application 202 or some other interface of the head unit 104, or an audible confirmation request via the other interfaces 220. If the user acknowledges the request, the food application 204b responds to the Burger_Joint application 210d with the acknowledgement. The Burger_Joint application 210d can then place the order by communicating with the remote server 124.
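Purely as an illustration, the following sketch shows how recognized instruction text could be broadcast to a category's user applications, with only the applications that recognize the order returning an acknowledgement request; the parsing shown is deliberately trivial and all names are hypothetical.

```kotlin
// Illustrative sketch only: broadcast a recognized voice instruction to the
// user applications in a category; apps that recognize the order return an
// acknowledgement request for the user to confirm.

data class Ack(val app: String, val confirmation: String)

interface OrderingApp {
    val name: String
    fun tryParse(utterance: String): Ack?  // null if the order is not recognized
}

fun dispatch(utterance: String, apps: List<OrderingApp>): List<Ack> =
    apps.mapNotNull { it.tryParse(utterance) }

fun main() {
    val burgerJoint = object : OrderingApp {
        override val name = "Burger_Joint"
        override fun tryParse(utterance: String): Ack? =
            if ("biggie" in utterance.lowercase())
                Ack(name, "Confirm: Biggie Meal with a side of fries?") else null
    }
    val tacoToday = object : OrderingApp {
        override val name = "Taco_Today"
        override fun tryParse(utterance: String): Ack? = null  // not recognized
    }
    // The food application presents each acknowledgement request to the user.
    dispatch("give me a biggie with a side of wedges", listOf(burgerJoint, tacoToday))
        .forEach { println("${it.app}: ${it.confirmation}") }
}
```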
In some embodiments, the category-management applications 204 may include additional functionality that bypasses or does not utilize a user application 210. For example, the food application 204b may include functionality to enable head unit users to order food from local “mom and pop” restaurants that do not have a user application. In another arrangement, a category-management application 204 may be configured to process the user requests and determine which user application 210 should receive the request. This alternative may reduce the number of signal exchanges within the system. In either case, the category-management applications 204 and the user applications 210 may be configured to rely on local or remote ML models for processing the user requests, or such functionality may be embedded within them.
As yet another example, one of the category-management applications 204 may be a safety-management application (not illustrated). This safety application may communicate with the remote server 124 to provide or obtain various safety information. For example, the safety-management application may receive an amber alert from the remote server 124. In response, the safety-management application may display or audibly present the alert to the user. Likewise, the safety-management application may communicate with one or more accessories on the vehicle to obtain additional information. For example, the safety-management application may collect images from a forward-facing camera. The safety-management application can then analyze the images for license-plate numbers of nearby vehicles to compare against a target license-plate number provided in the amber alert. If there is a match, the safety-management application obtains the current location of the vehicle from the navigation application 202 and provides it to the remote server 124. In some embodiments, the presence of this safety-management application may be completely invisible to the user, with no visual or audio output indicating that it is continuously working in the background.
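A minimal sketch of the plate-matching and reporting step follows; it assumes plate strings have already been extracted from camera images by some recognition step, which is outside the scope of the sketch, and all names are hypothetical.

```kotlin
// Illustrative sketch only: compare license plates recognized from camera
// images against the target plate from an amber alert, and report the
// vehicle's current location on a match.

data class Location(val lat: Double, val lon: Double)

fun checkAmberAlert(
    recognizedPlates: List<String>,      // produced by an image-recognition step
    targetPlate: String,
    currentLocation: () -> Location,     // supplied by the navigation application
    reportToServer: (Location) -> Unit   // supplied by the remote-server link
) {
    if (recognizedPlates.any { it.equals(targetPlate, ignoreCase = true) }) {
        reportToServer(currentLocation())
    }
}

fun main() {
    checkAmberAlert(
        recognizedPlates = listOf("ABC1234", "XYZ9876"),
        targetPlate = "abc1234",
        currentLocation = { Location(47.6062, -122.3321) },
        reportToServer = { println("Match found; reporting location $it") })
}
```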
In various embodiments, the mobile device 114 may also store and execute user applications 230. The user applications 230 are similar to the user applications 210 on the head unit 104. In some embodiments, the user applications 230 on the mobile device 114 may include some of the same user applications 210 on the head unit 104. In other embodiments, the user applications 230 on the mobile device 114 may include different user applications than the head unit 104. In this illustration, the mobile device 114 includes a Gas XYZ application 230a and a GeneralStore application 230b, which are separate and different from the user applications 210 on the head unit 104.
The category-management applications 204 access and communicate with the user applications 230 on the mobile device 114 in a manner similar to what is described above for communicating with the user applications 210 on the head unit 104 to obtain and select content to present to the user via the navigation application 202. The use of such user applications 230 on the mobile device 114 enables the head unit 104 to perform embodiments described herein without having the head unit 104 download and store user applications 210 on the head unit 104, which can save computing resources, as well as the time it would take a user to select such user applications for downloading onto the head unit 104.
As an option, the user applications 230 may be projected from the mobile device 114 to the head unit 104, enabling the user to interact with them via the head unit 104. Any category-management applications installed on the mobile device 114 may also be projected to the head unit 104. The interactions described above may also apply to this configuration, although at least some of the processing may occur on the mobile device 114.
The operation of certain aspects of the disclosure will now be described with respect to FIGS. 3 and 4. In at least one of various embodiments, processes 300 and 400 described in conjunction with FIGS. 3 and 4, respectively, may be implemented by or executed on one or more computing devices, such as head unit 104. These processes are not necessarily limited to the chronological orders shown here and may be executed without adhering to all the steps described here or with steps that are beyond those in the diagrams. To be clear, processes 300 and 400 are merely examples of flows that may be applicable to or adopt the principles described above.
Process 300 begins, after a start block, at block 302, where the head unit stores one or more category-management applications and user applications. In various embodiments, the category-management applications are pre-installed on the head unit, such as at the time of manufacture. One or more of the user applications may also be pre-installed on the head unit, or they may be downloaded to the head unit by the user, such as via a cellular network connection to a server hosting an application store or via a mobile device of the user. Of course, many other configurations may apply, too, including installation of the category-management applications and user applications on other computing devices.
Process 300 proceeds to block 304, where the category-management application receives a request to interact with one of the user applications. In some embodiments, to trigger this interaction, a navigation application receives the request from a user of the head unit, such as via the user activating a button on a graphical user interface, and provides it to the category-management application. In other embodiments, the category-management application receives the request directly from the user or from another input interface, such as an audio interface that captures a voice command provided by the user. In yet other embodiments, the request may be from a user application. For example, a safety application may receive an alert from a remote server and send a request to the category-management application for the category-management application to obtain information from other accessories of the vehicle. In some embodiments, the request may be from another accessory of the vehicle, such as from a vehicle computer when the vehicle is put into park.
In some embodiments, the request may specifically identify the user application. For example, if the navigation application is presenting a map to the user with icons for restaurants and the user activates an icon for Taco_Today, the navigation application provides a request to the category-management application for information from the Taco_Today user application.
In other embodiments, the request may indicate a desire for content or information without identifying a specific user application. Continuing the previous example, if the user activates a button for food in general (whether through the navigation application or directly through the category-management application), the category-management application may identify a nearest restaurant associated with one of the stored user applications and select that user application to process a request for current operating hours. As another example, if the navigation application is presenting a map without icons for restaurants and the user activates the food button, the category-management application can identify and select a user application associated with food to process a request for the nearest location.
Process 300 continues to block 306, where the category-management application provides the request to the user application corresponding to the request. As mentioned herein, this request is for some type of content, which may include coordinates or an address for a nearest store location, hours of operation, coupons, daily information, rewards program information, automatic ordering information, or other information.
The user application then executes in the background to fulfill the request. In some embodiments, the user application queries a database, accesses a web service, or performs other actions to obtain the requested content.
For example, the user application may query a database stored locally on the head unit, or it may communicate with a remote server, such as remote server 124 in FIG. 1, to obtain the requested content. In other embodiments, the user application may send the request or associated information to the remote server to fulfill the request. For example, if the request is to order a pizza from a particular pizzeria, the user application sends the order request to a remote server of the pizzeria to place the order. In some embodiments, the category-management application and the user application may provide one or more communications back and forth to fulfill the request.
Process 300 proceeds next to block 308, where the category-management application receives content from the user application in response to the request. For example, with the example of ordering a pizza, this content may be an acknowledgement that the pizza order was successfully submitted. In other embodiments, the content may be information that is to be presented to the user.
Process 300 continues next at block 310, where content associated with the response information is presented to the user of the head unit. In various embodiments, the category-management application provides the content to a navigation application for presentation to the user. In other embodiments, the category-management application itself employs a graphical user interface on the display device of the head unit to present the content to the user.
In some embodiments, the content is provided to the user via a banner or display window on the display device of the head unit. In other embodiments, one or more content items are selected based on the response and then presented to the user. For example, if the content is an address for a nearest restaurant, the category-management application may provide the address to a navigation application, which selects and displays on a map a pin icon (i.e., the content) representing the address.
In various embodiments, the content is visual content that is displayed to the user via a display device on the head unit (or some other display device within the vehicle). In other embodiments, the content is audio content that is output to the user via a speaker. In yet another embodiment, the content may be provided to a mobile device of the user, such that the content is viewable on the mobile device.
In some embodiments, the content that is presented to the user is prioritized or ranked when provided to the user. For example, the head unit may display a plurality of content in a prioritized list with the highest priority content on top and the lowest priority content on the bottom. The priority of the content may be determined based on factors associated with the user of the vehicle, the current driving conditions, or some combination thereof.
In some embodiments, content obtained by one category-management application may have a higher priority than content obtained by another category-management application. In at least one embodiment, the user may select or set the priority of the content from the different category-management applications. For example, content from a gas category-management application may be set to have a higher priority than content from a food category-management application.
In other embodiments, the current driving conditions may dictate the priority of the content. For example, if there is currently heavy traffic, then food-related content may be given a lower priority, while traffic updates or route instructions may be given a higher priority. The current driving conditions may be determined based on the current speed of the vehicle compared to the posted speed limit (e.g., based on a look-up in a database for the vehicle's current location or using a forward-facing camera and image-recognition techniques to identify speed-limit signs), the current weather (e.g., based on tire slippage or use of a weather application), the amount of braking being done by the driver, vehicle-to-vehicle communications, use of LiDAR or other proximity sensors, etc., or some combination thereof.
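One possible weighting scheme is sketched below; the condition signals, item fields, and offsets are illustrative assumptions rather than limitations:

    def infer_heavy_traffic(conditions):
        # Crawling well below the posted limit is one signal of congestion.
        speed, limit = conditions.get("speed"), conditions.get("speed_limit")
        return bool(speed is not None and limit and speed < 0.5 * limit)

    def adjust_priority(item, conditions):
        """Higher value = displayed nearer the top of the prioritized list."""
        priority = item["base_priority"]
        if infer_heavy_traffic(conditions):
            if item["kind"] in ("route_instruction", "traffic_update"):
                priority += 10   # promote navigation and safety content
            elif item["kind"] == "advertisement":
                priority -= 10   # demote discretionary content
        return priority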
In some other embodiments, the priority of the content may be based on a timeline factor. For example, the highest priority content is associated with a factor the user or vehicle is expected to experience first, the second highest priority content is associated with a factor the user or vehicle is expected to experience next, and so on, such that the lowest priority content is associated with the factor expected furthest in the future. In this way, the content is presented to the user in an order matching the expected chronology of the factors.
For example, assume the plurality of content includes an advertisement for a restaurant 1 km away, a route instruction to turn left in 2 km, a safety recall notification, and a notification for an oil change that is due in 500 km. In this example, the advertisement may have a highest priority because it is likely the next factor that the user is going to experience, followed by the route instruction, safety recall, and then the oil change notification. Once the vehicle passes the restaurant, then that content may be removed from the list, and the priorities of the other content can be adjusted. If the user continues to neglect the safety recall, then its priority may be heightened as time goes on. Conversely, if the user delays the oil change, then its priority may be lowered for a threshold amount of time and possibly increased after this period expires.
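For illustration, a minimal sketch of such timeline ordering follows; the eta_km values are assumptions standing in for route-derived estimates:

    items = [
        {"label": "restaurant ad",  "eta_km": 1},
        {"label": "turn left",      "eta_km": 2},
        {"label": "safety recall",  "eta_km": 100},  # no fixed location; nominal slot
        {"label": "oil change due", "eta_km": 500},
    ]

    # Sort ascending by expected distance-to-event to get the display order.
    timeline = sorted(items, key=lambda item: item["eta_km"])

    def remove_passed(items, label):
        # Once the vehicle passes the restaurant, its item is dropped and the
        # remaining items shift up in priority.
        return [item for item in items if item["label"] != label]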
The prioritized content can include the content obtained by the category-management applications, maintenance notifications, incoming phone calls or other communication messages, radio-related information (e.g., radio station, song title, and artist), advertisements, safety information, etc. The different types of content may be assigned a priority by the user or an administrator or automatically by the system. These priorities may change over time based on the current driving conditions, the user's preferences (e.g., if the user quickly dismisses a maintenance notification but follows an advertisement to a restaurant), or other factors.
In at least one embodiment, the priority changes may be based on a current distraction profile for the driver. For example, if the driver is distracted (e.g., in heavy traffic, during poor weather conditions, with other people in the vehicle, etc.), safety-related content may be given a higher priority in comparison to that of advertisements. In this way, content that has a priority that exceeds a threshold value for the current distraction profile is presented to the user. As the current distraction profile changes, the presentation threshold may change, and lower priority content may be presented to the user.
After block 310, process 300 terminates or otherwise returns to a calling process to perform other actions.
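To illustrate the distraction-profile gating described above, the following minimal sketch filters content against a per-profile threshold; the profile names and threshold values are illustrative assumptions only:

    # Content with priority at or above the current profile's threshold is shown.
    THRESHOLDS = {"calm": 0, "moderate": 5, "distracted": 8}   # 0-10 scale

    def visible_content(items, profile):
        cutoff = THRESHOLDS[profile]
        # Keep only items important enough for the driver's current load; in
        # heavy traffic or bad weather the profile shifts toward "distracted"
        # and mostly safety-related items remain on screen.
        return [item for item in items if item["priority"] >= cutoff]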
Although process 300 describes employing a single category-management application and selecting a single user application, embodiments are not so limited. Rather, in some embodiments, one or more of a plurality of category-management applications may interact with one or more user applications, which is discussed in more detail below in conjunction with FIG. 4.
FIG. 4 illustrates a logical flow diagram generally showing one embodiment of an alternative process for utilizing multiple category-management applications to interact with multiple user applications to present content to a user in accordance with embodiments described herein.
Process 400 begins, after a start block, at block 402, where a plurality of category-management applications, a plurality of user applications, and a navigation application are stored on a head unit or some other device. In various embodiments, block 402 employs embodiments of block 302 in FIG. 3 to store the applications on the head unit or other device.
Process 400 proceeds to block 404, where a request to interact with one or more user applications is received. In various embodiments, block 404 employs embodiments of block 304 in FIG. 3 to receive a request to interact with a user application.
Process 400 continues at block 406, where one or more category-management applications are selected based on the request. As discussed above, the request may itself identify a particular user application. Accordingly, the category-management applications that interact with that particular user application are selected. In other embodiments, the request may itself identify a user application category or a particular category-management application. For example, if the request relates to food, then the corresponding food category-management application is selected.
In various embodiments, if the request is input via a navigation application, the navigation application may select the category-management applications. In other embodiments, the navigation application may provide the request to another application, such as an administrator application that manages the category-management applications, to select the category-management applications. As another option, the user may directly select the category-management applications.
Process 400 proceeds next to block 408, where the selected category-management applications select at least some of the user applications based on the request. In some embodiments, only one user application is selected. In other embodiments, multiple (but not all) user applications are selected. In yet other embodiments, all the user applications are selected.
The selected user applications may be identified in the request or selected by the category-management applications. For example, if the request is related to food, the food category-management application can select user applications associated with food that it can interact with (e.g., via links 214 or via mobile device 114 in FIG. 2). In some embodiments, a category-management application maintains a data structure to identify which user applications to select for which requests.
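Such a data structure could be as simple as the following sketch, reusing the hypothetical Request fields from the earlier sketch; the registry contents and the apps_for_request helper are assumptions (Taco_Today is the example application named above):

    # Hypothetical mapping from request categories to installed user applications.
    REGISTRY = {
        "food": ["Taco_Today", "PizzeriaApp"],
        "fuel": ["BrandA_Fuel", "BrandB_Fuel"],
    }

    def apps_for_request(request):
        if request.app_id:                          # request names one app outright
            return [request.app_id]
        return REGISTRY.get(request.category, [])   # else all apps in the category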
Process 400 continues next at block 410, where the selected category-management applications provide the request to the selected user applications. In various embodiments, block 410 employs embodiments of block 306 to provide the request to the selected user applications.
Process 400 proceeds to block 412, where the selected category-management applications receive content from the selected user applications in response to the request. In various embodiments, block 412 may perform embodiments of block 308 for the category-management applications to receive the responses from the user applications, although multiple category-management applications may be receiving responses from multiple user applications.
Process 400 continues at block 414, where the selected category-management applications select content associated with the responses for presentation to the user. In some embodiments, the selected category-management applications select content for all of the responses from the selected user applications. In other embodiments, the selected category-management applications select content for only a subset of the responses. In one example, the gas category-management application may select content for only the single closest gas station to present to the user, even though it may have received content from five different user gas applications. Similarly, as another example, the location of the closest gas station that is in the user's preconfigured list of preferred gas stations may be presented.
In various embodiments, a subset of the selected category-management applications may select content. For example, each selected category-management application may receive nearest-store locations from each user application, but only the category-management application that identifies the single closest store to the vehicle may select the content to present to the user.
In some embodiments, the category-management applications coordinate the selection of the content together. In other embodiments, the category-management applications select content separately from each other. In at least one such embodiment, the navigation application or some other administrative application can aggregate the selected content and determine what content to present to the user.
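As one non-limiting sketch of such aggregation (the helper names and the planar distance approximation are assumptions), an administrative layer might keep only the closest candidate per category:

    import math

    def dist_km(a, b):
        # Rough planar approximation; adequate for nearby points.
        dx = (a["lon"] - b["lon"]) * 111.0 * math.cos(math.radians(a["lat"]))
        dy = (a["lat"] - b["lat"]) * 111.0
        return math.hypot(dx, dy)

    def closest_per_category(responses, vehicle_pos):
        """responses maps a category name to a list of {lat, lon, name} dicts."""
        best = {}
        for category, candidates in responses.items():
            if candidates:
                best[category] = min(
                    candidates, key=lambda c: dist_km(c, vehicle_pos))
        return best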
Process 400 proceeds next to block 416, where the content is presented to the user. In various embodiments, block 416 performs embodiments of block 310 in FIG. 3 to present the content to a user of the head unit. In some embodiments, content from multiple user applications is concurrently presented to the user. After block 416, process 400 terminates or returns to a calling process to perform other actions.
Although process 300 in FIG. 3 and process 400 in FIG. 4 describe the user applications as being stored on and executed by the head unit, embodiments are not so limited. Rather, as discussed herein, one or more of the user applications may be stored on and executed by a mobile device of the user (e.g., mobile device 114 in FIG. 2) or some other device, whether onboard the vehicle or remote to it. In these embodiments, the category-management application utilizes a network interface to communicate with the user applications on the mobile device or other device.
FIG. 5 shows a system diagram that describes one implementation of computing systems for implementing embodiments described herein. Other computing systems may also be used to implement the concepts presented above. System 500 includes head unit 104 and, optionally, a mobile device 114 or one or more remote servers 124.
The head unit 104 is a computing device that can perform functionality described herein for managing and aggregating results from user applications to select content to display to a user of the head unit 104. One or more special-purpose computing systems may be used to implement the head unit 104. Accordingly, various embodiments described herein may be implemented in software, hardware, firmware, or in some combination thereof. The head unit 104 includes memory 504, one or more processors 522, display 524, input/output (I/O) interfaces 526, other computer-readable media 528, network interface 530, and other components 532.
Processor 522 includes one or more processing devices that execute computer instructions to perform actions, including at least some embodiments described herein. In various embodiments, the processor 522 may include one or more central processing units (CPUs), programmable logic, or other processing circuitry.
Memory 504 may include one or more various types of non-volatile and/or volatile storage technologies. Examples of memory 504 include flash memory, hard disk drives, optical drives, solid-state drives, various types of random-access memory (RAM), various types of read-only memory (ROM), other computer-readable storage media (also referred to as processor-readable storage media), or other memory technologies, or any combination thereof. Memory 504 may be utilized to store information, including computer-readable instructions that are utilized by processor 522 to perform actions, including at least some embodiments described herein.
Memory 504 may have stored thereon various modules, such as a navigation application 202, one or more category-management applications 204, and one or more user applications 210. The navigation application 202 provides functionality to present a map or other content to a user of the head unit 104, such as via display 524, other components 532, or input/output interfaces 116, 126 in FIG. 1. Each category-management application 204 provides functionality to communicate with one or more user applications 210, or user applications 230 on the mobile device 114, to request content, aggregate responses, select content, and coordinate the presentation of the content to the user, such as via the navigation application 202. A user application 210 can be an executable program developed for a particular store, service, product, business, shop, station, etc. In some embodiments, the user applications 210 request associated content from another computing device, such as remote servers 124.
Memory 504 may also store other programs 518 and other content 520. Other programs 518 may include operating systems, user applications, or other computer programs. Other content 520 may include visual, audio, or tactile content to provide to the user, which may or may not be accessible to the user applications 210.
Display 524 is a display device capable of rendering content to a user. The display 524 may be a liquid-crystal display, a light-emitting-diode display, or another type of display device, and may include a touch-sensitive screen capable of receiving inputs from a user's hand, stylus, or other object.
I/O interfaces 526 may include interfaces for various other input or output devices, such as audio interfaces, other video interfaces, USB interfaces, physical buttons, keyboards, or the like.
Other computer-readable media 528 may include other types of stationary or removable computer-readable media, such as removable flash drives, external hard drives, or the like.
Network interfaces 530 are configured to communicate with other computing devices, such as the mobile device 114 or the remote servers 124, via a communication network 534. Network interfaces 530 include transmitters and receivers (not illustrated) to send and receive data as described herein. The communication network 534 may include the communication network 122 or the mobile device communication network 120 of FIG. 1.
The mobile device 114 is a computing device that is separate from the head unit 104, such as a smart phone, tablet computer, laptop computer, etc. One or more special-purpose computing systems may be used to implement the mobile device 114. Accordingly, various embodiments described herein may be implemented in software, hardware, firmware, or in some combination thereof.
Briefly, the mobile device 114 includes a memory 554 and other computing components 562. The memory 554 may include one or more various types of non-volatile and/or volatile storage technologies to store information and computer-readable instructions, similar to memory 504. The memory 554 may store one or more user applications 230. The user applications 230 are similar to the user applications 210 on the head unit 104 and are accessible to the category-management applications 204 on the head unit 104. The memory 554 may also store other programs and content.
The other computing components 562 include one or more processors, I/O interfaces, network interfaces, and other computing components and resources, which are known to those skilled in the art and are not discussed here for brevity.
The remote servers 124 include one or more computing devices that are remote from the head unit 104. These remote servers 124 may be host devices, backend devices, or other servers that provide content or application support to the user applications 210 or 230. One or more special-purpose computing systems may be used to implement each remote server 124. Accordingly, various embodiments described herein may be implemented in software, hardware, firmware, or in some combination thereof.
Briefly, each remote server 124 includes a memory 578 and other computing components 582. The memory 578 may include one or more various types of non-volatile and/or volatile storage technologies to store information and computer-readable instructions, similar to memory 504. The memory 578 may store user application content 580, which can be provided to the head unit 104 or the mobile device 114 upon request from a user application 210 or 230, respectively. The memory 578 may also store other programs and content. The other computing components 582 include one or more processors, I/O interfaces, network interfaces, and other computing components and resources, which are known to those skilled in the art and are not discussed here for brevity.
The various embodiments described above can be combined to provide further embodiments. These and other changes can be made to the embodiments in light of the above-detailed description. In general, in the following claims, the terms used should not be construed to limit the claims to the specific embodiments disclosed in the specification and the claims but should be construed to include all possible embodiments along with the full scope of equivalents to which such claims are entitled. Accordingly, the claims are not limited by the disclosure.