The present patent application is a U.S. National Phase Application under 35 U.S.C. 371 of International Application No. PCT/US2014/013499 filed Jan. 29, 2014, the contents of which are incorporated herein in their entirety by reference.
FIELD
Embodiments described herein generally relate to mobile computer systems. More particularly, embodiments relate to implementation of a secondary display device at a mobile computer system.
BACKGROUND
Current computer systems often implement auxiliary/secondary display devices. However, such systems suffer from a range of problems. For example, existing auxiliary displays are typically not sufficiently large to view or read displayed results, and may be difficult to view at a distance, at oblique angles, or in bright ambient lighting conditions (e.g., outdoors). Additionally, existing auxiliary displays lose displayed images when the computer system enters a sleep mode or turns off.
BRIEF DESCRIPTION OF THE DRAWINGS
Embodiments are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings in which like reference numerals refer to similar elements.
FIG. 1 is a block diagram illustrating one embodiment of a computer system.
FIGS. 2A-2C illustrate embodiments of computer systems with a secondary display.
FIG. 3 is a block diagram illustrating one embodiment of a secondary display architecture.
FIG. 4 illustrates one embodiment of a process for performing secure image posting.
FIG. 5 illustrates one embodiment of a location based application for a secondary display.
FIG. 6 is a flow diagram illustrating one embodiment of a process for tagging images based on location.
FIG. 7 is a flow diagram illustrating one embodiment of a process for displaying location based tagged images.
FIGS. 8A-8C illustrate alternative embodiments for implementing a secondary display microcontroller.
DETAILED DESCRIPTION
In the following description, numerous specific details are set forth. However, embodiments, as described herein, may be practiced without these specific details. In other instances, well-known circuits, structures and techniques have not been shown in detail in order not to obscure the understanding of this description.
Throughout this document, terms like “logic”, “component”, “module”, “framework”, “engine”, “store”, or the like, may be referenced interchangeably and include, by way of example, software, hardware, and/or any combination of software and hardware, such as firmware.
While the concepts of the present disclosure are susceptible to various modifications and alternative forms, specific embodiments thereof have been shown by way of example in the drawings and will be described herein in detail. It should be understood, however, that there is no intent to limit the concepts of the present disclosure to the particular forms disclosed, but on the contrary, the intention is to cover all modifications, equivalents, and alternatives consistent with the present disclosure and the appended claims.
References in the specification to “one embodiment,” “an embodiment,” “an illustrative embodiment,” etc., indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include that particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the art to effect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.
The disclosed embodiments may be implemented, in some cases, in hardware, firmware, software, or any combination thereof. The disclosed embodiments may also be implemented as instructions carried by or stored on a transitory or non-transitory machine-readable (e.g., computer-readable) storage medium, which may be read and executed by one or more processors. A machine-readable storage medium may be embodied as any storage device, mechanism, or other physical structure for storing or transmitting information in a form readable by a machine (e.g., a volatile or non-volatile memory, a media disc, or other media device).
In the drawings, some structural or method features may be shown in specific arrangements and/or orderings. However, it should be appreciated that such specific arrangements and/or orderings may not be required. Rather, in some embodiments, such features may be arranged in a different manner and/or order than shown in the illustrative figures. Additionally, the inclusion of a structural or method feature in a particular figure is not meant to imply that such feature is required in all embodiments and, in some embodiments, may not be included or may be combined with other features.
FIG. 1 illustrates an embodiment of a computing system 100. Computing system 100 represents a range of computing and electronic devices (wired or wireless) including, for example, desktop computing systems, laptop computing systems, cellular telephones, personal digital assistants (PDAs) including cellular-enabled PDAs, set top boxes, smartphones, tablets, etc. Alternate computing systems may include more, fewer and/or different components.
Computing system 100 includes bus 105 (or, for example, a link, an interconnect, or another type of communication device or interface to communicate information) and processor 110 coupled to bus 105 that may process information. While computing system 100 is illustrated with a single processor, computing system 100 may include multiple processors and/or co-processors, such as one or more of central processors, graphics processors, physics processors, etc. Computing system 100 may further include random access memory (RAM) or other dynamic storage device 120 (referred to as main memory), coupled to bus 105, that may store information and instructions that may be executed by processor 110. Main memory 120 may also be used to store temporary variables or other intermediate information during execution of instructions by processor 110.
Computing system 100 may also include read only memory (ROM) and/or other storage device 130 coupled to bus 105 that may store static information and instructions for processor 110. Data storage device 140 may be coupled to bus 105 to store information and instructions. Data storage device 140, such as a magnetic disk or optical disc and a corresponding drive, may be coupled to computing system 100.
Computing system 100 may also be coupled via bus 105 to display device 150, such as a cathode ray tube (CRT), liquid crystal display (LCD) or Organic Light Emitting Diode (OLED) array, to display information to a user. User input device 160, including alphanumeric and other keys, may be coupled to bus 105 to communicate information and command selections to processor 110. Another type of user input device 160 is cursor control 170, such as a mouse, a trackball, a touch screen, a touchpad, or cursor direction keys to communicate direction information and command selections to processor 110 and to control cursor movement on display 150. Camera and microphone arrays 190 of computer system 100 may be coupled to bus 105 to observe gestures, record audio and video and to receive and transmit visual and audio commands.
Computing system 100 may further include network interface(s) 180 to provide access to a network, such as a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), a personal area network (PAN), Bluetooth, a cloud network, a mobile network (e.g., 3rd Generation (3G), etc.), an intranet, the Internet, etc. Network interface(s) 180 may include, for example, a wireless network interface having antenna 185, which may represent one or more antenna(e). Network interface(s) 180 may also include, for example, a wired network interface to communicate with remote devices via network cable 187, which may be, for example, an Ethernet cable, a coaxial cable, a fiber optic cable, a serial cable, or a parallel cable.
Network interface(s) 180 may provide access to a LAN, for example, by conforming to IEEE 802.11b and/or IEEE 802.11g standards, and/or the wireless network interface may provide access to a personal area network, for example, by conforming to Bluetooth standards. Other wireless network interfaces and/or protocols, including previous and subsequent versions of the standards, may also be supported.
In addition to, or instead of, communication via the wireless LAN standards, network interface(s) 180 may provide wireless communication using, for example, Time Division Multiple Access (TDMA) protocols, Global System for Mobile Communications (GSM) protocols, Code Division Multiple Access (CDMA) protocols, and/or any other type of wireless communications protocols.
Network interface(s) 180 may include one or more communication interfaces, such as a modem, a network interface card, or other well-known interface devices, such as those used for coupling to Ethernet, token ring, or other types of physical wired or wireless attachments for purposes of providing a communication link to support a LAN or a WAN, for example. In this manner, the computer system may also be coupled to a number of peripheral devices, clients, control surfaces, consoles, or servers via a conventional network infrastructure, including an intranet or the Internet, for example.
According to one embodiment, computer system 100 also includes a secondary display device 155 coupled to bus 105. In such an embodiment, secondary display 155 is a secondary persistent display (e.g., an Electronic Paper Display (EPD), such as E-Ink) added to an industrial design of a mobile device such as computer system 100. In a further embodiment, secondary display 155 is a limited interactive device interfaced via bus 105, rather than being a graphics device. FIGS. 2A-2C illustrate embodiments of computer systems with a secondary display.
For example, FIG. 2A illustrates one embodiment in which secondary display 155 is implemented as a separate tablet from the tablet at which display device 150 is implemented. FIG. 2B illustrates one embodiment in which secondary display 155 is positioned on the opposite side of a primary display device on a notebook computer system. In such an embodiment, secondary display 155 may be connected to the computer system via a wired connection (e.g., USB) or a wireless connection (e.g., Bluetooth or wireless LAN).
Although shown on the outside of a notebook computer system lid, other embodiments may feature secondary display 155 located on the back of a clamshell/convertible phone. In a further embodiment, contents of secondary display 155 can be viewed by others when the lid is open (e.g., for social broadcasting usages) or used as an e-reader, for messages (e.g., alerts, etc.) when the lid is closed or in a tablet/phone mode.
FIG. 2C illustrates another embodiment in which secondary display 155 is positioned on the opposite side of a primary display device on a notebook computer system. However, in this embodiment, secondary display 155 may be a wrapper/cover of the notebook computer that mechanically slides on the cover and attaches mechanically to receive power and data. In a further embodiment, secondary display 155 may have other configurations, such as being located on a wearable device.
In yet another embodiment, the size of secondary display 155 mounted on a notebook computer system may vary. For instance, secondary display 155 may be a 6 inch or 9 inch display. Additionally, secondary display 155 may be a 9.7 inch E-Ink cover that attaches to the notebook computer system. In the above-described embodiments, secondary display 155 is mechanically mounted in a manner that minimizes thickness, retains robustness (e.g., against breakage), and provides pleasant viewing for the user that reduces eye strain.
According to one embodiment, a user is permitted to select the contents of what is to be displayed on secondary display 155. In this embodiment, a user selects a bitmap image, which is transmitted to secondary display 155 in a manner similar to a print function. In such an embodiment, secondary display 155 includes a controller to receive the bitmap image and output the image to the display. FIG. 3 is a block diagram illustrating one embodiment of a secondary display 155 architecture implemented to perform such a process.
As shown in FIG. 3, secondary display 155 includes display 310, buttons 320 and microcontroller 330. In one embodiment, display 310 is a touch screen which, when touched by a user, initiates an algorithm that routes finger-touch coordinates to a particular application. In a further embodiment, display 310 includes an illumination mechanism (e.g., back-lighting, side-lighting, or front-lighting). In yet a further embodiment, content present on display 310 is persistent (e.g., does not require refresh cycles, such as E-Ink) on the screen even after power is removed from the mobile device. Thus, power is only consumed upon changing an image being displayed.
Buttons 320 may be momentary push-buttons or capacitive buttons that, when pressed by the user, initiate an algorithm that routes a button-input status to a particular application. Microcontroller 330 coordinates the display of content in the form of bitmap data at display 310. In one embodiment, microcontroller 330 facilitates the display of pop-up notifications at display 310 for alerts (e.g., new emails, missed phone calls, stock/news/weather/traffic updates, social network updates, etc.). In such an embodiment, a pop-up may automatically be dismissed (e.g., disappear) after a pre-defined time period, or in response to a user action (e.g., a button 320 press or touch screen touch).
In one embodiment, microcontroller 330 detects an event (e.g., touching at display 310 and/or buttons 320) and forwards the event to device drivers 340. Device drivers 340 interface with microcontroller 330. Thus, device drivers 340 are aware of the attributes of display 310 (e.g., a hard configuration) in order to provide bitmap data in a human interface device (HID) compliant format specific to display 310. In one embodiment, the bitmap data is display independent.
Upon receiving an event, device drivers 340 forward the event, along with the display attributes, to an application programming interface (API) 350 supported by computer system 100. Further, API 350 receives raw bitmap data from an application program 360 in a display independent format. Application program 360 receives the display attributes from API 350 and, in return, forwards a bitmap object to API 350 in an operating system format.
In operation, an application program 360 desires to prepare a bitmap image for display on an arbitrarily sized secondary display 310. The application 360 does not know, in advance, the proper display attributes (e.g., resolution, color depth) of the types of bitmap images supported for rendering by display 310. The application 360 makes an operating system API 350 call to retrieve the display attributes, which are hard-configured in microcontroller 330 firmware.
Once the application 360 knows the valid format, it prepares the bitmap image in its own local memory according to rules mandated by the operating system. If the original bitmap image comes from another (e.g., external) source and has different attributes (resolution, color depth) than the physical display 310, API 350 is implemented to condition (e.g., resize, dither) the image. Subsequently, API 350 makes an operating system API call to transfer the bitmap image in the operating system mandated object form down to device drivers 340. In one embodiment, the operating system API 350 used may be the Printer API, such that the secondary display appears to be a printer as far as the operating system and applications are concerned.
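The attribute-query and conditioning steps above can be sketched in pseudocode-style Python. This is only an illustrative model, not the actual operating system API: the function names, the 4x4 one-bit panel, and the threshold "dither" are assumptions made for the example.

```python
# Illustrative sketch (hypothetical names): an application queries the
# secondary display's hard-configured attributes, then conditions an
# arbitrary grayscale bitmap to match them before handing it to the driver.

def get_display_attributes():
    # Stand-in for the operating system API call; in the text these
    # values are hard-configured in the microcontroller firmware.
    return {"width": 4, "height": 4, "color_depth": 1}  # 1-bit E-Ink-style panel

def condition_bitmap(pixels, attrs):
    """Resize (nearest neighbour) and threshold an 8-bit grayscale bitmap
    so it matches the panel's resolution and color depth."""
    src_h, src_w = len(pixels), len(pixels[0])
    out = []
    for y in range(attrs["height"]):
        row = []
        for x in range(attrs["width"]):
            sx = x * src_w // attrs["width"]   # nearest-neighbour resize
            sy = y * src_h // attrs["height"]
            value = pixels[sy][sx]
            if attrs["color_depth"] == 1:
                value = 1 if value >= 128 else 0  # simple threshold "dither"
            row.append(value)
        out.append(row)
    return out

# An 8x8 grayscale gradient, conditioned for the 4x4 one-bit panel.
src = [[(x + y) * 16 for x in range(8)] for y in range(8)]
panel_image = condition_bitmap(src, get_display_attributes())
```

A real implementation would use proper error diffusion (e.g., Floyd-Steinberg) rather than a bare threshold, but the control flow, that is, query attributes, then resize and reduce depth before transfer, is the same.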
The secondary display device drivers 340 package the bitmap data into a bus-independent and display independent format and transmit the data over bus 105 to microcontroller 330. Upon receiving the data, microcontroller 330 unpacks the (HID formatted bitmap) data and renders it on display 310 according to the display 310 native electrical signaling requirements. Since display 310 is persistent, the bitmap data need be streamed only once directly to the display panel. In one embodiment, microcontroller 330, device drivers 340 and/or application 360 may perform arithmetic algorithms on the bitmap data in order to improve the appearance of the displayed image.
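One way the driver-to-microcontroller packaging might look is sketched below. The field layout is invented for illustration; it is not the actual HID report descriptor used by any particular hardware, but it shows the idea of a self-describing, bus- and display-independent report that the microcontroller can unpack on its own.

```python
import struct

# Hypothetical report layout: fixed header (width, height, color depth)
# followed by the raw pixel payload, little-endian.
HEADER_FMT = "<HHB"

def pack_bitmap(width, height, depth, payload):
    """Driver side: prepend a header so the microcontroller can unpack
    the image without knowing anything about the sending bus or OS."""
    return struct.pack(HEADER_FMT, width, height, depth) + bytes(payload)

def unpack_bitmap(report):
    """Microcontroller side: recover attributes and pixel payload."""
    hdr = struct.calcsize(HEADER_FMT)
    width, height, depth = struct.unpack(HEADER_FMT, report[:hdr])
    return width, height, depth, report[hdr:]

# A 4x4, 1-bit image round-tripped through the report format.
report = pack_bitmap(4, 4, 1, [0, 1] * 8)
w, h, d, pixels = unpack_bitmap(report)
```

Because the display is persistent, this report need only be sent once per image change, after which the panel holds the image with no further traffic or power draw.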
According to one embodiment, a manager module 370 is included to provide an easy-to-find boundary for all applications 360 that are secondary display aware. In such an embodiment, manager module 370 facilitates secure image posting to secondary display 310 by requiring that new applications 360 installed at computer system 100 register with manager module 370 in order to prevent malware from displaying unauthorized images. Thus, users may launch applications 360 from within manager module 370 upon approving or disapproving of an application 360 upon installation. Manager module 370 maintains a permission list of the approved/disapproved applications 360 that is used to subsequently enforce the user's selected preferences.
FIG. 4 illustrates one embodiment of a process for performing secure image posting. Initially, a first application 360 (App1) requests permission from device driver 340 to post to secondary display 155. Subsequently, device driver 340 verifies whether App1 is on the permission list. Since App1 is not currently on the list, device driver 340 requests permission from the user via manager module 370. Subsequently, manager module 370 displays a dialog box, via a graphical user interface (GUI), to receive an “Allow” or “Deny” answer from the user, which is forwarded to device driver 340. Device driver 340 stores the user-chosen permission for future reference. Device driver 340 then communicates the Allow/Deny answer to App1. In this example, App1 now has permission to post an image to display 310.
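The permission flow of FIG. 4 reduces to a cached ask-once policy, which the following minimal sketch models. The class name, the injected prompt callback, and the in-memory permission table are assumptions for illustration, not the driver's real interface.

```python
# Minimal sketch of the FIG. 4 secure-posting flow: the driver consults
# a stored permission list and only prompts the user (via the manager
# module GUI, modeled here as a callback) for unknown applications.

class SecondaryDisplayDriver:
    def __init__(self, ask_user):
        self._permissions = {}     # application name -> True (Allow) / False (Deny)
        self._ask_user = ask_user  # manager-module prompt, injected for the example

    def request_post(self, app_name):
        """Return True if the application may post to the secondary display."""
        if app_name not in self._permissions:
            # Not on the list yet: ask the user once and remember the
            # answer to enforce the same preference on future requests.
            self._permissions[app_name] = bool(self._ask_user(app_name))
        return self._permissions[app_name]

# The "user" in this example allows only App1.
driver = SecondaryDisplayDriver(ask_user=lambda app: app == "App1")
first = driver.request_post("App1")     # unknown -> prompts the user -> Allow
second = driver.request_post("App1")    # served from the stored permission list
denied = driver.request_post("Malware") # unknown -> prompts -> Deny, and stays denied
```

The key property, matching the text, is that malware cannot post images without an explicit user approval having been recorded first.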
In one embodiment, manager module 370 maintains an exact copy of the image currently being displayed at display 310, so that a user does not have to view display 310 to verify what is being displayed. Accordingly, the manager module 370 GUI enables the user to control the images, if any, that are to be displayed at display 310. In a further embodiment, the manager module 370 GUI enables the user to control the brightness level of an illumination mechanism (e.g., front-lighting) of display 310, and to add, remove and/or move registered applications 360 between the approved and disapproved lists of applications. In a further embodiment, the GUI enables the user to designate (e.g., activate or deactivate) a specific image that will be displayed on the secondary display when the computer enters a low-power sleep state (e.g., S3/S4) or turns off (e.g., S5).
In one embodiment, manager module 370, via microcontroller 330, may be implemented to facilitate one or more regions of display 310 being sequestered from user control to permit a permanent image (e.g., OEM/company logo, platform logo content) to be displayed. In a further embodiment, one or more regions of display 310 may be sequestered by the user to add locked content (e.g., not removable by a thief). Such a feature may be referred to as “Digital Engraving,” which may include an owner's name, photo, phone number, and/or an “if lost return to” message. In yet a further embodiment, manager module 370 may communicate with an anti-theft subsystem (e.g., Intel Anti-Theft) to enable display of a message on display 310, in a secure manner that a thief cannot override, indicating that computer system 100 has been stolen.
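Region sequestering amounts to rectangle-overlap checking before any draw is accepted, which the following sketch illustrates. The class, the rectangle representation, and the example regions are all hypothetical; the real enforcement would live in the microcontroller and manager module.

```python
# Illustrative sketch of "Digital Engraving" enforcement: the manager
# marks display regions as locked, and a draw request is refused if its
# rectangle intersects any sequestered region.

class RegionManager:
    def __init__(self):
        self._locked = []  # list of (x, y, w, h) sequestered rectangles

    def sequester(self, rect):
        """Reserve a region for permanent/locked content."""
        self._locked.append(rect)

    def can_draw(self, rect):
        """An application may draw only where no locked region overlaps."""
        x, y, w, h = rect
        for lx, ly, lw, lh in self._locked:
            # Standard axis-aligned rectangle intersection test.
            if x < lx + lw and lx < x + w and y < ly + lh and ly < y + h:
                return False
        return True

regions = RegionManager()
regions.sequester((0, 0, 200, 40))            # owner's "if lost return to" banner
ok = regions.can_draw((0, 50, 100, 100))      # below the banner: allowed
blocked = regions.can_draw((10, 10, 50, 50))  # overlaps the banner: refused
```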
According to one embodiment, manager module 370 may facilitate the display of images on secondary display 155 based on the geographical location of computer system 100. FIG. 5 illustrates one embodiment of such a location based application in which a first image (image 1) is displayed at secondary display 155 if computer system 100 is located at a first location (location A), a second image (image 2) is displayed at secondary display 155 if computer system 100 is located at a second location (location B) and a third image (image 3) is displayed at secondary display 155 if computer system 100 is located at a third location (location C).
To perform the location based image application, images are first tagged for display based on location. FIG. 6 is a flow diagram illustrating one embodiment of a process for tagging images based on location. At processing block 610, one or more images are selected by a user to be used. At processing block 620, a location is selected for association with each of the selected images. In one embodiment, the location can be either the current location or a designated location. If the current location is selected, a global positioning system (GPS) sensor at the computer system is read to determine the current location. If a designated location is selected (e.g., based on a typed address), the GPS coordinates of the designated address are received (e.g., via a cloud service, such as Geocoder). At processing block 630, the location is assigned to the one or more selected images. This process is repeated for each location at which a user wishes to tag images. In other embodiments, alternative forms of location determination may be used in lieu of a GPS sensor, including but not limited to cellular network location determination, WiFi location determination, Bluetooth location determination, and so on.
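The tagging flow of FIG. 6 can be sketched as a small mapping from images to coordinates. The sensor and geocoder stubs, the sample coordinates, and the sample address below are invented stand-ins; a real implementation would call the platform location service and a cloud geocoding API.

```python
# Sketch of the FIG. 6 tagging flow: each selected image is assigned the
# coordinates of either the current GPS fix or a geocoded typed address.

def read_gps_sensor():
    # Stand-in for reading the platform GPS sensor (current location).
    return (45.54, -122.96)

def geocode(address):
    # Stand-in for a cloud geocoding service (e.g., Geocoder);
    # the address and coordinates here are illustrative only.
    lookup = {"123 Example St": (37.42, -122.08)}
    return lookup[address]

def tag_images(images, use_current=True, address=None):
    """Processing blocks 610-630: select images, select a location,
    then assign that location to every selected image."""
    location = read_gps_sensor() if use_current else geocode(address)
    return {img: location for img in images}

tags = {}
tags.update(tag_images(["home1.bmp", "home2.bmp"]))             # current location
tags.update(tag_images(["office.bmp"], use_current=False,
                       address="123 Example St"))               # designated location
```

The resulting `tags` table is the stored association that the display process of FIG. 7 later matches against.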
Once the tagging process has been completed, location based tagged images may be displayed at secondary display 155 upon feature enablement. FIG. 7 is a flow diagram illustrating one embodiment of a process for displaying location based tagged images. At processing block 710, the current location of computer system 100 is detected by reading the GPS sensor, or an alternative form of location determination. At processing block 720, the current location is matched with a location stored during the tagging process. If a match is found, manager module 370 displays the one or more matched location tagged images on secondary display 155.
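Matching a GPS fix against stored tags implies some tolerance, since two fixes at "the same" place never agree exactly. The sketch below uses a haversine distance with a 500 m radius; the radius, the tag table, and the function names are assumptions for illustration, not values given in the text.

```python
import math

# Sketch of the FIG. 7 display flow: match the current fix against the
# stored tags, treating any tag within a radius as "the same location".

def haversine_m(a, b):
    """Great-circle distance between two (lat, lon) points, in metres."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371000 * math.asin(math.sqrt(h))

def images_for_location(current, tags, radius_m=500):
    """Processing block 720: return the images whose tagged location
    matches the current fix to within radius_m metres."""
    return [img for img, loc in tags.items()
            if haversine_m(current, loc) <= radius_m]

tags = {"home.bmp": (45.5400, -122.9600), "office.bmp": (37.42, -122.08)}
# A fix a few metres from the "home" tag matches it and nothing else.
matched = images_for_location((45.5401, -122.9601), tags)
```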
Although described above as being included within secondary display 155, microcontroller 330 may be incorporated within other computer system 100 architecture configurations. FIGS. 8A-8C illustrate alternative embodiments for implementing a secondary display microcontroller 330. FIG. 8A illustrates one embodiment in which microcontroller 330 is external and coupled between secondary display 155 and processor(s) 110. In such an embodiment, microcontroller 330 is used to abstract a display specific interface (e.g., display specific timing, protocols) from the operating system.
FIG. 8B illustrates an embodiment of a cost-optimized design in which microcontroller 330 is included within processor(s) 110 and coupled to secondary display 155. In other embodiments, microcontroller 330 may be an integrated engine such as the ISH (Integrated Sensor Hub) or CSME (Converged Security/Manageability Engine). In the embodiments described in FIGS. 8A and 8B, secondary display 155 includes a display controller 850 in addition to display 310. In such an embodiment, display controller 850 receives images, manipulates images, writes full or partial images to the panel and controls the display 310 pixels (e.g., panel source and gate control, display power control).
FIG. 8C illustrates another embodiment in which the display controller 850 function is performed by host software. In such an embodiment, the host software writes directly to secondary display 155 via bus 105. In this embodiment, no microcontroller 330 is implemented. Secondary display 155 includes display 310 and a display interface 860 that converts bus 105 signals to a panel specific interface (e.g., source and gate control, display power control).
It is to be appreciated that a lesser or more equipped system than the example described above may be preferred for certain implementations. Therefore, the configuration of computing device 102 may vary from implementation to implementation depending upon numerous factors, such as price constraints, performance requirements, technological improvements, or other circumstances. Examples of the electronic device or computing device 102 may include without limitation a mobile device, a personal digital assistant, a mobile computing device, a smart phone, a cellular telephone, a handset, a one-way pager, a two-way pager, a messaging device, a computer, a personal computer (PC), a desktop computer, a laptop computer, a notebook computer, a handheld computer, a tablet computer, a server, a server array or server farm, a web server, a network server, an Internet server, a work station, a mini-computer, a main frame computer, a supercomputer, a network appliance, a web appliance, a distributed computing system, multiprocessor systems, processor-based systems, consumer electronics, programmable consumer electronics, television, digital television, set top box, wireless access point, base station, subscriber station, mobile subscriber center, radio network controller, router, hub, gateway, bridge, switch, machine, or combinations thereof.
Embodiments may be implemented as any or a combination of: one or more microchips or integrated circuits interconnected using a parent board, hardwired logic, software stored by a memory device and executed by a microprocessor, firmware, an application specific integrated circuit (ASIC), and/or a field programmable gate array (FPGA). The term “logic” may include, by way of example, software or hardware and/or combinations of software and hardware.
Embodiments may be provided, for example, as a computer program product which may include one or more machine-readable media having stored thereon machine-executable instructions that, when executed by one or more machines such as a computer, network of computers, or other electronic devices, may result in the one or more machines carrying out operations in accordance with embodiments described herein. A machine-readable medium may include, but is not limited to, floppy diskettes, optical disks, CD-ROMs (Compact Disc-Read Only Memories), and magneto-optical disks, ROMs, RAMs, EPROMs (Erasable Programmable Read Only Memories), EEPROMs (Electrically Erasable Programmable Read Only Memories), magnetic or optical cards, flash memory, or other type of media/machine-readable medium suitable for storing machine-executable instructions.
Moreover, embodiments may be downloaded as a computer program product, wherein the program may be transferred from a remote computer (e.g., a server) to a requesting computer (e.g., a client) by way of one or more data signals embodied in and/or modulated by a carrier wave or other propagation medium via a communication link (e.g., a modem and/or network connection).
As used in the claims, unless otherwise specified the use of the ordinal adjectives “first”, “second”, “third”, etc., to describe a common element, merely indicate that different instances of like elements are being referred to, and are not intended to imply that the elements so described must be in a given sequence, either temporally, spatially, in ranking, or in any other manner.
The following clauses and/or examples pertain to further embodiments or examples. Specifics in the examples may be used anywhere in one or more embodiments. The various features of the different embodiments or examples may be variously combined with some features included and others excluded to suit a variety of different applications. Examples may include subject matter such as a method, means for performing acts of the method, at least one machine-readable medium including instructions that, when performed by a machine, cause the machine to perform acts of the method, or of an apparatus or system for facilitating display of data at a secondary display device according to embodiments and examples described herein.
Some embodiments pertain to Example 1 that includes a computing device having a processor, a bus coupled to the processor, a graphics display device, coupled to the bus, to display graphics data, an interactive display device, coupled to the bus, to display bitmap image data and a manager module to manage the bitmap image data and transmit the bitmap image data to the interactive display device.
Example 2 includes the subject matter of Example 1 and wherein the interactive display device comprises a controller to receive the bitmap image data and a persistent display screen to display the bitmap image data.
Example 3 includes the subject matter of Example 2 and wherein the persistent display screen is a touch screen.
Example 4 includes the subject matter of Example 3 and wherein the interactive display device further comprises one or more buttons.
Example 5 includes the subject matter of Example 4 and wherein the controller detects an event upon activation of a button or the touch screen and transmits the event to the manager module.
Example 6 includes the subject matter of Example 1 and wherein the manager module manages requests from one or more applications to display data at the interactive display device.
Example 7 includes the subject matter of Example 6 and wherein the manager module comprises a graphical user interface to prompt and receive user interaction to approve the requests from the one or more applications to display data at the interactive display device.
Example 8 includes the subject matter of Example 7 and wherein the manager module further comprises a permission list of the applications approved and disapproved to display data at the interactive display device.
Example 9 includes the subject matter of Example 1 and further comprising a global positioning system (GPS) sensor to receive positioning data indicating a location of the computer system, wherein the manager module receives the positioning data and selects bitmap image data for display at the interactive display device based on a location indicated by the positioning data.
Example 10 includes the subject matter of Example 3 and wherein the interactive display device is an Electronic Paper Display (EPD).
Example 11 includes the subject matter of Example 3 and wherein the interactive display device is coupled to the bus via a wireless communications link.
Example 12 includes the subject matter of Example 3 and wherein the graphics display device is mounted on a front side of the computing system and the interactive display device is mounted on a backside of the computing device.
Example 13 includes the subject matter of Example 3 and wherein the interactive display device is located on a tablet device separate from the computing device.
Some embodiments pertain to Example 14 that includes a computer implemented method for displaying data at a secondary display device comprising receiving display independent bitmap image data from an application, determining if the application is authorized to display data at an interactive display device, rendering the display independent bitmap image data to device dependent bitmap image data if the application is authorized to display data at the interactive display device and displaying the device dependent bitmap image data at the interactive display device.
Example 15 includes the subject matter of Example 14 and wherein determining if the application is authorized to display data at the interactive display device comprises receiving a permission request from the application to display data at the interactive display device prior to receiving the display independent bitmap image data from an application and accessing a permission list to determine if the application has been approved to display data at an interactive display device.
Example 16 includes the subject matter of Example 15 and further comprising enabling the application to display data at the interactive display device if the application has been approved to display data at an interactive display device.
Example 17 includes the subject matter of Example 15 and further comprising displaying a prompt at the interactive display device for a user to select options to approve or disapprove the application if the application is not included on the permission list and communicating the option to the application.
Example 18 includes the subject matter of Example 14 and wherein the application is a location based application to display a first image at the interactive display device if a computer system is located at a first location and to display a second image at the interactive display device if the computer system is located at a second location.
Example 19 includes the subject matter of Example 18 and wherein the location based application tags the first image to be associated with the first location and tags the second image to be associated with the second location.
Example 20 includes the subject matter of Example 19 and wherein tagging an image comprises selecting a location to be associated with the image.
Example 21 includes the subject matter of Example 20 and wherein the location is selected by designating global positioning system (GPS) coordinates of the first and second locations.
Example 22 includes the subject matter of Example 20 and wherein the location is selected by determining a location of the computer system during tagging of the first image.
Example 23 includes the subject matter of Example 20 and wherein displaying an image at the interactive display device comprises detecting a current location of the computer system, matching the current location with a tagged location and displaying the image associated with the tagged location.
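The location-based tagging and matching of Examples 18-23 can be sketched as below. The helper names and the coordinate-matching tolerance are assumptions for illustration; the disclosure only describes associating images with locations and matching against the current position of the computer system.

```python
import math


def _distance_km(a, b):
    """Great-circle (haversine) distance between two (lat, lon) pairs, in km."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(h))


class LocationTaggedGallery:
    def __init__(self):
        self.tags = []  # list of ((lat, lon), image) pairs

    def tag_image(self, image, coords):
        # Example 21: the location may be designated as GPS coordinates, or
        # (Example 22) taken from the system's position at tagging time.
        self.tags.append((coords, image))

    def image_for_location(self, current, tolerance_km=0.5):
        # Example 23: detect the current location, match it against the
        # tagged locations, and return the associated image.
        for coords, image in self.tags:
            if _distance_km(current, coords) <= tolerance_km:
                return image
        return None
```

A tolerance radius is used because reported GPS coordinates rarely match a tagged position exactly; the 0.5 km default is an arbitrary illustrative choice.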
Example 24 includes the subject matter of Example 20 and further comprising persistently displaying the data at the interactive display device.
Some embodiments pertain to Example 25 that includes a machine-readable medium comprising a plurality of instructions that in response to being executed on a computing device, causes the computing device to carry out operations according to any one of claims 14 to 24.
Some embodiments pertain to Example 26 that includes a system comprising a mechanism to carry out operations according to any one of claims 1 to 10.
Some embodiments pertain to Example 27 that includes a computing device arranged to carry out operations according to any one of claims 1 to 10.
Some embodiments pertain to Example 28 that includes a communications device arranged to carry out operations according to any one of claims 1 to 10.
Some embodiments pertain to Example 29 that includes a machine-readable medium comprising a plurality of instructions that in response to being executed on a computing device, causes the computing device to carry out operations comprising receiving display independent bitmap image data from an application, determining if the application is authorized to display data at an interactive display device, rendering the display independent bitmap image data to device dependent bitmap image data if the application is authorized to display data at an interactive display device and displaying the device dependent bitmap image data at the interactive display device.
Example 30 includes the subject matter of Example 29 and wherein determining if the application is authorized to display data at the interactive display device comprises receiving a permission request from the application to display data at the interactive display device prior to receiving the display independent bitmap image data from an application and accessing a permission list to determine if the application has been approved to display data at an interactive display device.
Example 31 includes the subject matter of Example 30 and comprising further instructions that in response to being executed on a computing device, causes the computing device to carry out operations comprising enabling the application to display data at the interactive display device if the application has been approved to display data at an interactive display device.
Example 32 includes the subject matter of Example 30 and comprising further instructions that in response to being executed on a computing device, causes the computing device to carry out operations comprising displaying a prompt at the interactive display device for a user to select options to approve or disapprove the application if the application is not included on the permission list and communicating the option to the application.
Example 33 includes the subject matter of Example 29 and wherein the application is a location based application to display a first image at the interactive display device if a computer system is located at a first location and to display a second image at the interactive display device if the computer system is located at a second location.
Example 34 includes the subject matter of Example 33 and wherein the location based application tags the first image to be associated with the first location and tags the second image to be associated with the second location.
Example 35 includes the subject matter of Example 34 and wherein tagging an image comprises selecting a location to be associated with the image.
Example 36 includes the subject matter of Example 35 and wherein the location is selected by designating global positioning system (GPS) coordinates of the first and second locations.
Example 37 includes the subject matter of Example 35 and wherein the location is selected by determining a location of the computer system during tagging of the first image.
Example 38 includes the subject matter of Example 35 and wherein displaying an image at the interactive display device comprises detecting a current location of the computer system, matching the current location with a tagged location and displaying the image associated with the tagged location.
Example 39 includes the subject matter of Example 29 and comprising further instructions that in response to being executed on a computing device, causes the computing device to carry out operations comprising persistently displaying the data at the interactive display device.
Some embodiments pertain to Example 40 that includes an apparatus comprising means for detecting that a media capture device is prepared to capture media data of a scene, means for capturing data associated with the scene, means for analyzing the scene data to identify and classify behavior of one or more objects in the scene and means for adjusting the media capture device based on the scene data analysis to optimize the capture of the media data.
Example 41 includes the subject matter of Example 40 and wherein determining if the application is authorized to display data at the interactive display device comprises receiving a permission request from the application to display data at the interactive display device prior to receiving the display independent bitmap image data from an application and accessing a permission list to determine if the application has been approved to display data at an interactive display device.
Example 42 includes the subject matter of Example 40 and further comprising means for enabling the application to display data at the interactive display device if the application has been approved to display data at an interactive display device.
The drawings and the foregoing description give examples of embodiments. Those skilled in the art will appreciate that one or more of the described elements may well be combined into a single functional element. Alternatively, certain elements may be split into multiple functional elements. Elements from one embodiment may be added to another embodiment. For example, orders of processes described herein may be changed and are not limited to the manner described herein. Moreover, the actions in any flow diagram need not be implemented in the order shown; nor do all of the acts necessarily need to be performed. Also, those acts that are not dependent on other acts may be performed in parallel with the other acts. The scope of embodiments is by no means limited by these specific examples. Numerous variations, whether explicitly given in the specification or not, such as differences in structure, dimension, and use of material, are possible. The scope of embodiments is at least as broad as given by the following claims.