US10748182B2 - Device functionality-based content selection - Google Patents

Device functionality-based content selection

Info

Publication number
US10748182B2
Authority
US
United States
Prior art keywords
content
video
media item
advertisement
skippable
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US16/039,128
Other versions
US20180322530A1 (en)
Inventor
Poorva Arankalle
Brienne M. Finger
Lin Liao
Manish Gupta
Rajas Moonka
Reuven Lax
Jill A. Huchital
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Google LLC
Original Assignee
Google LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Google LLC
Priority to US16/039,128
Assigned to GOOGLE INC. Assignment of assignors interest (see document for details). Assignors: FINGER, BRIENNE; LAX, REUVEN; LIAO, LIN; ARANKALLE, POORVA; GUPTA, MANISH; HUCHITAL, JILL A.; MOONKA, RAJAS
Assigned to GOOGLE LLC. Change of name (see document for details). Assignor: GOOGLE INC.
Publication of US20180322530A1
Priority to US16/943,383 (US11210697B2)
Application granted
Publication of US10748182B2
Priority to US17/552,565 (US11915263B2)
Legal status: Active (current)
Anticipated expiration

Abstract

Techniques for presenting a content item with a media item are described. The techniques include receiving user input indicating a placement preference for a content item to be presented with a media item. The placement preference indicates a presentation preference of the content item relative to presentation of the media item. The placement preference is used to influence selection of a media item with which the content item is to be presented.

Description

CROSS REFERENCE TO RELATED APPLICATIONS
This application claims the benefit of priority under 35 U.S.C. § 120 as a continuation of U.S. patent application Ser. No. 15/638,306, filed Jun. 29, 2017, which claims the benefit of priority under 35 U.S.C. § 120 as a continuation of U.S. patent application Ser. No. 14/164,719, filed on Jan. 27, 2014, which claims the benefit of priority under 35 U.S.C. § 120 as a continuation of U.S. patent application Ser. No. 11/770,585, filed on Jun. 28, 2007, which claims the benefit of priority under 35 U.S.C. § 119 of U.S. Provisional Application No. 60/946,717, filed on Jun. 27, 2007, each of which is hereby incorporated by reference herein in its entirety.
BACKGROUND
Online video is a growing medium. The popularity of online video services reflects this growth. Advertisers see online video as another way to reach their customers. Many advertisers are interested in maximizing the number of actions (e.g., impressions and/or click-throughs) for their advertisements. To achieve this, advertisers make efforts to target advertisements to content, such as videos, that are relevant to their advertisements.
When an advertiser wishes to target advertisements to a video, the advertiser may target advertisements to the video content. For example, if videos are classified into categories, the advertiser can target advertisements to the videos based on the categories.
In some online advertising systems, advertisers pay for their ads through an advertising auction system in which they bid on advertisement placement on a Cost-Per-Click (CPC) or a Cost-Per-Mille (e.g., thousand impressions) (CPM) basis. The advertiser typically has a budget to spend on advertising, and the auction can be run between competing advertisers via each bidder's CPC and/or CPM bid given the advertiser's budget, or through a more complex equation of CPC and CPM, such as one that weighs the advertiser's bid by that advertisement's known Click-Thru-Rate (CTR) or other values. In one variation on the system, an advertiser targets an advertisement at a particular content location, web site, or content category, and the advertiser's bid is weighted by an estimated Click Through Rate (eCTR).
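As a hedged illustration of the eCTR-weighted auction described above, the following Python sketch ranks competing bids by expected value per impression. The names (Bid, rank_bids) and the simple weighting are assumptions made for illustration only, not the auction defined by this disclosure.

```python
# Illustrative sketch only: rank competing bids by expected value per impression.
from dataclasses import dataclass

@dataclass
class Bid:
    advertiser: str
    cpc: float | None = None   # cost per click, if bidding CPC
    cpm: float | None = None   # cost per thousand impressions, if bidding CPM
    ectr: float = 0.0          # estimated click-through rate for this placement

def expected_value_per_impression(bid: Bid) -> float:
    """Convert a CPC or CPM bid into a comparable per-impression value."""
    if bid.cpc is not None:
        return bid.cpc * bid.ectr   # CPC bid weighted by estimated CTR
    if bid.cpm is not None:
        return bid.cpm / 1000.0     # CPM bid spread over 1,000 impressions
    return 0.0

def rank_bids(bids: list[Bid]) -> list[Bid]:
    """Order competing advertisers by their eCTR-weighted value."""
    return sorted(bids, key=expected_value_per_impression, reverse=True)
```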
SUMMARY
In one general aspect, user input indicating a placement preference for an advertisement to be presented with a video is received. The placement preference indicates a presentation preference of the advertisement relative to presentation of feature content of the video. The placement preference is used to influence selection of a video with which the advertisement is to be presented.
In one general aspect, user input indicating a placement preference for a content item to be presented with a media item is received. The placement preference indicates a presentation preference of the content item relative to presentation of the media item. The placement preference is used to influence selection of a media item with which the content item is to be presented.
Implementations may include one or more of the following features. For example, the media item may be one or more of an audio item, a video item, and a combination of a video item and an audio item. The content item may be presented using one or more of text, graphics, still-image, video, audio, banners and links. The placement preference may indicate a presentation preference of a sequence of the content item relative to the presentation of the media item. The placement preference may include one or more of pre-roll placement such that the content item is to be placed prior to playing of feature content of the media item, mid-roll placement such that the content item is to be placed within feature content of the media item, and post-roll placement such that the content item is to be placed once playing of feature content of the media item is completed. The placement preference may include placement of the content item based on whether a viewer of the media item has capability of skipping the content item.
The content item may include an advertisement. Receiving user input may include receiving a bid for placement of the advertisement that reflects placement preference of a sponsor of the advertisement. The placement preference and the bid may be used to influence selection of media with which the advertisement is to be presented.
The placement preference may include a first placement preference. User input indicating a second placement preference for the advertisement to be presented with a media item may be received. The second placement preference may indicate a second presentation preference of the advertisement relative to presentation of the media item. First and second bids for placement of the advertisement may be received. The first and second bids may respectively reflect the first and second placement preferences of a sponsor of the advertisement. The second bid may be different from the first bid and the second placement preference may be different from the first placement preference. The first and second placement preferences and the first and second bids may be used to influence selection of a media item with which the advertisement is to be presented.
In another general aspect, user input indicating a placement preference for a content item to be presented with a media item is received. The placement preference indicates a presentation preference of the content item based on an entity presenting the media item. The placement preference is used to influence selection of a media item with which the content item is to be presented.
Implementations may include one or more of the features noted above. Implementations may also include one or more of the following features. For example, the placement preference may indicate whether the content item is to be presented with an embedded media item. An entity presenting the embedded media item is different from an entity owning the embedded media item. The placement preference may indicate one or more entities and whether the content item may be presented with a media item presented by the one or more entities.
In yet another general aspect, a graphical user interface is generated on a display device for using a computer to specify ad placement preferences. The graphical user interface includes a placement preference region. The placement preference region includes a placement preference that may be modified by a user. The placement preference indicates a presentation preference of an advertisement relative to presentation of feature content. Implementations may include one or more of the features noted above.
In a further general aspect, a graphical user interface is generated on a display device for using a computer to specify ad placement preferences. The graphical user interface includes a placement preference region. The placement preference region includes a placement preference that may be modified by a user. The placement preference indicates a presentation preference of an advertisement based on an entity presenting a media item. Implementations may include one or more of the features noted above.
The details of one or more implementations are set forth in the accompanying drawings and the description below. Other features and advantages will be apparent from the description and drawings as well as from the claims.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 illustrates an example of an environment for providing content.
FIG. 2 is a block diagram illustrating an example environment in which electronic promotional material (e.g., advertising content) may be identified according to targeting criteria.
FIGS. 3 and 4 are examples of a user interface illustrating advertising content displayed on a screen with video content.
FIG. 5 is a flow diagram of an example process flow for providing video advertisements.
FIGS. 6 and 8 are example flow diagrams for using placement preference in selecting advertisements.
FIG. 7 is an example user interface for entering bids for placement relative to feature content.
FIG. 9 is an example user interface for excluding placement of advertisements in video based on presenting entity.
FIG. 10 is a block diagram illustrating an example generic computer and an example generic mobile computer device.
DETAILED DESCRIPTION
FIG. 1 shows an example of an environment 100 for providing content. The content, or “content items,” can include various forms of electronic media. For example, the content can include text, audio, video, advertisements, configuration parameters, documents, video files published on the Internet, television programs, podcasts, video podcasts, live or recorded talk shows, video voicemail, segments of a video conversation, and other distributable resources.
The environment 100 includes, or is communicably coupled with, an advertisement provider 102, a content provider 104, and one or more user devices 106, at least some of which communicate across network 108. In general, the advertisement provider 102 can characterize presented content and provide relevant advertising content (“ad content”) or other relevant content. By way of example, reference is made to delivering ad content, though other forms of content (e.g., other content item types) can be delivered. The presented content may be provided by the content provider 104 through the network 108. The ad content may be distributed, through network 108, to one or more user devices 106 before, during, or after presentation of the material. In some implementations, the advertisement provider 102 may be coupled with an advertising repository 103. The ad repository stores advertising that can be presented with various types of content, including audio and/or video content.
In some implementations, the environment 100 may be used to identify relevant advertising content according to a particular selection of a video or audio content item (e.g., one or more segments of video or audio). For example, the advertisement provider 102 can acquire knowledge about scenes in a video content item, such as content changes in the audio and video data of the video content item. The knowledge can be used to determine targeting criteria for the video content item, which in turn can be used to select relevant advertisements for appropriate places in the video content item. In some implementations, the relevant advertisements can be placed in proximity to or overlaid with the presented content item, such as in a banner, sidebar, or frame.
The selection of advertisements for placement in the video content item is determined based on a placement preference of, for example, an advertiser. The placement preference indicates a presentation preference of an advertisement relative to the presentation of the video content item. For example, a placement preference may include placement of an advertisement relative to video feature content, such as targeting (or excluding) one or more of pre-roll placement, mid-roll placement or post-roll placement. The pre-roll placement or pre-roll advertising (also called pre-watch advertising) refers to advertising presented before the video feature plays. This may be accomplished, for example, by superimposing pixels corresponding to the advertising content over the video playback area of the video player before the video feature begins. The pre-roll advertising may be presented as an opaque display. The pre-roll advertising may be presented so as to allow the viewer to see both the advertising portion and the underlying video feature that is covered by the advertising. The mid-roll placement or mid-roll advertising (also called mid-watch advertising or interstitial advertising) refers to advertising presented while the video feature content has begun or is playing. The post-roll placement or post-roll advertising (also called post-watch advertising) refers to advertising presented after the video feature has finished playing.
The placement preference may also include placement of an advertisement based on whether the viewer has the capability of skipping advertisements, excluding placement of an advertisement in an embedded video, or excluding placement of an advertisement in video presented by web sites identified by the advertiser.
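The placement preferences just described (roll position, skippability, embedded video, presenting site) can be pictured as a small data structure. The sketch below is a hypothetical Python representation; the field names are illustrative and are not part of this disclosure.

```python
# Hypothetical representation of an advertiser's placement preference.
from dataclasses import dataclass, field
from enum import Enum

class RollPosition(Enum):
    PRE_ROLL = "pre-roll"    # before the video feature plays
    MID_ROLL = "mid-roll"    # while the video feature is playing
    POST_ROLL = "post-roll"  # after the video feature has finished

@dataclass
class PlacementPreference:
    # Positions the advertiser targets (an empty set means "no preference").
    allowed_positions: set[RollPosition] = field(default_factory=set)
    # Whether the ad may be shown where the viewer can skip it (None = don't care).
    allow_skippable: bool | None = None
    # Whether the ad may appear with embedded videos (None = don't care).
    allow_embedded: bool | None = None
    # Presenting web sites the advertiser excludes.
    excluded_sites: set[str] = field(default_factory=set)
```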
In some implementations, advertisers may identify preferences for an advertisement or group of advertisements by entering or adjusting bids used to place advertisements in videos, where the bids reflect the advertisers' placement preferences.
In some implementations, the selection of advertisements for placement in a video content item is determined based on a placement preference and a bid of an advertiser. For each placement preference of an advertisement, the advertiser may offer a bid for placement of the advertisement. Among the advertisements having a matching placement preference with a video content item, the advertisement(s) with the highest bid may be presented in the video feature content as specified by the placement preference.
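A minimal sketch of that selection step, assuming the PlacementPreference and RollPosition types from the previous sketch: filter candidate advertisements by placement match, then take the highest bid. The Ad type and helper names are assumptions for illustration.

```python
# Sketch of "highest bid among ads with a matching placement preference".
from dataclasses import dataclass

@dataclass
class Ad:
    ad_id: str
    preference: PlacementPreference
    bid: float  # the advertiser's bid for this placement preference

def preference_matches(pref: PlacementPreference, position: RollPosition,
                       viewer_can_skip: bool) -> bool:
    if pref.allowed_positions and position not in pref.allowed_positions:
        return False
    if pref.allow_skippable is False and viewer_can_skip:
        return False
    return True

def select_ad(candidates: list[Ad], position: RollPosition,
              viewer_can_skip: bool) -> Ad | None:
    eligible = [ad for ad in candidates
                if preference_matches(ad.preference, position, viewer_can_skip)]
    return max(eligible, key=lambda ad: ad.bid, default=None)
```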
In some implementations, a “video content item” is an item of content that includes content that can be perceived visually when played, rendered, or decoded. A video content item includes video data, and optionally audio data and metadata. Video data includes content in the video content item that can be perceived visually when the video content item is played, rendered, or decoded. Audio data includes content in the video content item that can be perceived aurally when the video content item is played, decoded, or rendered. A video content item may include video data and any accompanying audio data regardless of whether or not the video content item is ultimately stored on a tangible medium. A video content item may include, for example, a live or recorded television program, a live or recorded theatrical or dramatic work, a music video, a televised event (e.g., a sports event, a political event, a news event, etc.), video voicemail, etc. Each of different forms or formats of the same video data and accompanying audio data (e.g., original, compressed, packetized, streamed, etc.) may be considered to be a video content item (e.g., the same video content item, or different video content items).
Video content can be consumed at various client locations, using various devices. Examples of the various devices include customer premises equipment which is used at a residence or place of business (e.g., computers, video players, video-capable game consoles, televisions or television set-top boxes, etc.), a mobile telephone with video functionality, a video player, a laptop computer, a set top box, a game console, a car video player, etc. Video content may be transmitted from various sources including, for example, terrestrial television (or data) transmission stations, cable television (or data) transmission stations, satellite television (or data) transmission stations, via satellites, and video content servers (e.g., Webcasting servers, podcasting servers, video streaming servers, video download Websites, etc.), via a network such as the Internet for example, and a video phone service provider network such as the Public Switched Telephone Network (“PSTN”) and the Internet, for example.
A video content item can also include many types of associated data. Examples of types of associated data include video data, audio data, closed-caption or subtitle data, a transcript, content descriptions (e.g., title, actor list, genre information, first performance or release date, etc.), related still images, user-supplied tags and ratings, etc. Some of this data, such as the description, can refer to the entire video content item, while other data (e.g., the closed-caption data) may be temporally-based or timecoded. In some implementations, the temporally-based data may be used to detect scene or content changes to determine relevant portions of that data for targeting ad content to users.
In some implementations, an “audio content item” is an item of content that can be perceived aurally when played, rendered, or decoded. An audio content item includes audio data and optionally metadata. The audio data includes content in the audio content item that can be perceived aurally when the audio content item is played, decoded, or rendered. An audio content item may include audio data regardless of whether or not the audio content item is ultimately stored on a tangible medium. An audio content item may include, for example, a live or recorded radio program, a live or recorded theatrical or dramatic work, a musical performance, a sound recording, a televised event (e.g., a sports event, a political event, a news event, etc.), voicemail, etc. Each of different forms or formats of the audio data (e.g., original, compressed, packetized, streamed, etc.) may be considered to be an audio content item (e.g., the same audio content item, or different audio content items).
Audio content can be consumed at various client locations, using various devices. Examples of the various devices include customer premises equipment which is used at a residence or place of business (e.g., computers, audio players, audio-capable game consoles, televisions or television set-top boxes, etc.), a mobile telephone with audio playback functionality, an audio player, a laptop computer, a car audio player, etc. Audio content may be transmitted from various sources including, for example, terrestrial radio (or data) transmission stations, via satellites, and audio content servers (e.g., Webcasting servers, podcasting servers, audio streaming servers, audio download Websites, etc.), via a network such as the Internet for example, and a video phone service provider network such as the Public Switched Telephone Network (“PSTN”) and the Internet, for example.
An audio content item can also include many types of associated data. Examples of types of associated data include audio data, a transcript, content descriptions (e.g., title, actor list, genre information, first performance or release date, etc.), related album cover image, user-supplied tags and ratings, etc. Some of this data, such as the description, can refer to the entire audio content item, while other data (e.g., the transcript data) may be temporally-based. In some implementations, the temporally-based data may be used to detect scene or content changes to determine relevant portions of that data for targeting ad content to users.
Ad content can include text, graphics, still-images, video, audio, audio and video, banners, links (such as advertising providing a hyperlink to an advertiser's website), and other web or television programming related data. As such, ad content can be formatted differently, based on whether the ad content is primarily directed to websites, media players, email, television programs, closed captioning, etc. For example, ad content directed to a website may be formatted for display in a frame within a web browser. In other examples, ad content may be delivered in an RSS (Really Simple Syndication) feed, or ad content may be delivered relative to a radio item (such as before, during or after a radio item). As yet another example, ad content directed to a video player may be presented “in-stream” as video content is played in the video player. In some implementations, in-stream ad content may replace the video or audio content in a video or audio player for some period of time or may be inserted between portions of the video or audio content. An in-stream advertisement can be placed pre-roll, post-roll, or mid-roll relative to video feature content. An in-stream advertisement may include video, audio, text, animated images, still images, or some combination thereof.
The content provider 104 can present content to users (e.g., user device 106) through the network 108. In some implementations, the content providers 104 are web servers where the content includes webpages or other content written in the Hypertext Markup Language (HTML), or any language suitable for authoring webpages. In general, the content provider 104 can include users, web publishers, and other entities capable of distributing content over a network. For example, a web publisher may create an MP3 audio file and post the file on a publicly available web server. In some implementations, the content provider 104 may make the content accessible through a known Uniform Resource Locator (URL).
The content provider 104 can receive requests for content (e.g., articles, discussion threads, music, audio, video, graphics, search results, webpage listings, etc.). The content provider 104 can retrieve the requested content in response to, or otherwise service, the request. The advertisement provider 102 may broadcast content as well (e.g., not necessarily responsive to a request).
A request for advertisements (or “ad request”) may be submitted to the advertisement provider 102. Such an ad request may include ad spot information (e.g., a number of advertisements desired, a duration, type of ads eligible, etc.). In some implementations, the ad request may also include information about the content item that triggered the request for the advertisements. This information may include the content item itself (e.g., a page, a video file, a segment of an audio stream, data associated with the video or audio file, etc.), one or more categories or topics corresponding to the content item or the content request (e.g., arts, business, computers, arts-movies, arts-music, etc.), part or all of the content request, content age, content type (e.g., text, graphics, video, audio, mixed media, etc.), geo-location information, etc.
In some implementations, the information in the ad request submitted to the advertisement provider 102 may indicate characteristics of a video content item that triggered the request for the advertisements. Such characteristics may be used to determine advertisements having a matching placement preference. For example, the ad request may indicate whether the video content item allows pre-roll placement of ads, mid-roll placement, or post-roll placement. Alternatively or additionally, the ad request may indicate whether a viewer has the capability of skipping advertisements. Then, advertisements with a matching placement preference may be selected to be presented in the video content item. For example, a video content item may allow only post-roll placement of advertisements (e.g., the advertisements may be presented only after the video content has finished playing) and may not allow a viewer to skip advertisements. Then, those ads with the matching placement preference may be selected and placed relative to the video content based on the placement preference (e.g., pre-roll, mid-roll, and/or post-roll).
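Continuing the earlier sketches (and only as an illustration), the post-roll-only, non-skippable case described above might look like this:

```python
# Usage example continuing the Ad / PlacementPreference / select_ad sketches above.
candidates = [
    Ad("ad-1", PlacementPreference({RollPosition.PRE_ROLL}), bid=2.50),
    Ad("ad-2", PlacementPreference({RollPosition.POST_ROLL}), bid=1.75),
    Ad("ad-3", PlacementPreference({RollPosition.POST_ROLL}), bid=3.00),
]
chosen = select_ad(candidates, RollPosition.POST_ROLL, viewer_can_skip=False)
# chosen is ad-3: it matches the post-roll slot and carries the highest bid.
```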
Content provided by the content provider 104 can include news, weather, entertainment, or other consumable textual, audio, or video media. More particularly, the content can include various resources, such as documents (e.g., webpages, plain text documents, Portable Document Format (PDF) documents, images), video or audio clips, etc. In some implementations, the content can be graphic-intensive, media-rich data, such as, for example, Flash-based content that presents video and sound media.
The environment 100 includes one or more user devices 106. The user device 106 can include a desktop computer, laptop computer, a media player (e.g., an MP3 player, a streaming audio player, a streaming video player, a television, a computer, a mobile device, etc.), a mobile phone, a browser facility (e.g., a web browser application), an e-mail facility, telephony means, a set top box, a television device, a radio device, or other device that can access advertisements and other content via network 108. The content provider 104 may permit the user device 106 to access content (e.g., video files for downloading or streaming, audio files for downloading or streaming, etc.).
The network 108 facilitates wireless or wireline communication between the advertisement provider 102, the content provider 104, and any other local or remote computers (e.g., user device 106). The network 108 may be all or a portion of an enterprise or secured network. In another example, the network 108 may be a virtual private network (VPN) between the content provider 104 and the user device 106 across a wireline or a wireless link. While illustrated as a single or continuous network, the network 108 may be logically divided into various sub-nets or virtual networks without departing from the scope of this disclosure, so long as at least a portion of the network 108 may facilitate communications between the advertisement provider 102, the content provider 104, and at least one client (e.g., user device 106). In certain implementations, the network 108 may be a secure network associated with the enterprise and certain local or remote clients 106.
Examples of network 108 include a local area network (LAN), a wide area network (WAN), a wireless phone network, a Wi-Fi network, and the Internet.
In some implementations, a content item is combined with one or more of the advertisements provided by the advertisement provider 102. This combined information, including the content of the content item and advertisement(s), is then forwarded toward a user device 106 that requested the content item or that configured itself to receive the content item, for presentation to a user.
The content provider 104 may transmit information about the ads and how, when, and/or where the ads are to be rendered, and/or information about the results of that rendering (e.g., ad spot, specified segment, position, selection or not, impression time, impression date, size, temporal length, volume, conversion or not, etc.) back to the advertisement provider 102 through the network 108. Alternatively, or in addition, such information may be provided back to the advertisement provider 102 by some other means.
In some implementations, the content provider 104 includes advertisement media as well as other content. In such a case, the advertisement provider 102 can determine and inform the content provider 104 which advertisements to send to the user device 106, for example.
FIG. 2 is a block diagram illustrating an example environment 200 in which electronic promotional material (e.g., advertising content or advertisements) may be identified according to targeting criteria. Environment 200 includes, or is communicatively coupled with, advertisement provider 102, content provider 104, and user device 106, at least some of which communicate across network 108.
In some implementations, the advertisement provider 102 includes a content analyzer 202, a boundary module 204, and an ad server 206. The content analyzer 202 may examine received content items to determine segmentation boundaries and/or targeting criteria for content items. For example, the content analyzer 202 may implement various analysis methods, including, but not limited to, weighting schemes, speech processing, image or object recognition, and statistical methods.
The analysis methods can be applied to the contextual elements of the received content item (e.g., video content, audio content, etc.) to determine boundaries for segmenting the received content and to determine relevant targeting criteria. For example, the received content may undergo one or more of audio volume normalization, automatic speech recognition, transcoding, indexing, image recognition, sound recognition, etc. In some implementations, the content analyzer 202 includes a speech to text module 208, a sound recognition module 210, and an object recognition module 212. Other modules are possible.
The speech to text module 208 can analyze content received in environment 200 to identify speech in the content. For example, a video content item may be received in the environment 200. The speech-to-text module 208 can analyze the video content item as a whole. Textual information may be derived from the speech included in the audio data of the video content item by performing speech recognition on the audio content, producing in some implementations hypothesized words annotated with confidence scores, or in other implementations a lattice which contains many hypotheses. Examples of speech recognition techniques include techniques based on hidden Markov models, dynamic programming, or neural networks.
In some implementations, the speech analysis may include identifying phonemes, converting the phonemes to text, interpreting the phonemes as words or word combinations, and providing a representation of the words and/or word combinations which best corresponds with the received input speech (e.g., speech in the audio data of a video content item). The text can be further processed to determine the subject matter of the video content item. For example, keyword spotting (e.g., word or utterance recognition), pattern recognition (e.g., defining noise ratios, sound lengths, etc.), or structural pattern recognition (e.g., syntactic patterns, grammar, graphical patterns, etc.) may be used to determine the subject matter, including different segments, of the video content item. The identified subject matter in the video content item can be used to identify boundaries for dividing the video content item into segments and to identify relevant targeting criteria. In some implementations, further processing may be carried out on the video content item to refine the identification of subject matter in the video content item.
A video content item can also include timecoded metadata. Examples of timecoded metadata include closed-captions, subtitles, or transcript data that includes a textual representation of the speech or dialogue in the video or audio content item. In some implementations, a caption data module at the advertisement provider 102 (not shown) extracts the textual representation from the closed-caption, subtitle, or transcript data of the content item and uses the extracted text to identify subject matter in the video content item. The extracted text can be a supplement to or a substitute for application of speech recognition on the audio data of the video content item.
Further processing may include sound recognition techniques performed by the sound recognition module 210. Accordingly, the sound recognition module 210 may use sound recognition techniques to analyze the audio data. Understanding the audio data may enable the environment 200 to identify the subject matter in the audio data and to identify likely boundaries for segmenting the content item. For example, the sound recognition module 210 may recognize abrupt changes in the audio or periods of silence in the video, which may be indicia of segment boundaries.
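As one hedged example of how periods of silence might be flagged as candidate segment boundaries, the sketch below assumes the audio is available as PCM samples in a NumPy array; the disclosure does not specify a particular detection method, and the thresholds are arbitrary.

```python
# Minimal energy-based silence detector; illustrative only.
import numpy as np

def candidate_boundaries(samples: np.ndarray, sample_rate: int,
                         window_s: float = 0.5,
                         silence_rms: float = 0.01) -> list[float]:
    """Return timestamps (seconds) of low-energy windows that may mark segment breaks."""
    window = int(window_s * sample_rate)
    boundaries = []
    for start in range(0, len(samples) - window, window):
        rms = float(np.sqrt(np.mean(samples[start:start + window] ** 2)))
        if rms < silence_rms:                 # near-silence: likely a break
            boundaries.append(start / sample_rate)
    return boundaries
```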
Further processing of received content can also include object recognition. For example, automatic object recognition can be applied to received or acquired video data of a video content item to determine targeting criteria for one or more objects associated with the video content item. For example, the object recognition module 212 may automatically extract still frames from a video content item for analysis. The analysis may identify targeting criteria relevant to objects identified by the analysis. The analysis may also identify changes between sequential frames of the video content item that may be indicia of different scenes (e.g., fading to black). If the content item is an audio content item, then object recognition analysis is not applicable (because there is no video content to analyze). Examples of object recognition techniques include appearance-based object recognition and object recognition based on local features, an example of which is disclosed in Lowe, “Object Recognition from Local Scale-Invariant Features,” Proceedings of the Seventh IEEE International Conference on Computer Vision, Volume 2, pp. 1150-1157 (September 1999), which is incorporated by reference in its entirety.
Advertisement provider 102 includes a boundary module 204. The boundary module 204 may be used in conjunction with the content analyzer 202 to place boundaries in the content received at the advertisement provider 102. The boundaries may be placed in text, video, graphical, or audio data based on previously received content. For example, a content item may be received as a whole and the boundaries may be applied based on the subject matter in the textual, audio, or video content. In some implementations, the boundary module 204 may simply be used to interpret existing boundary settings for a particular selection of content (e.g., a previously aired television program). In some implementations, the boundary data are stored separately from the content item (e.g., in a separate text file).
Advertisement provider 102 includes a targeting criteria module 209. The targeting criteria module 209 may be used in conjunction with the content analyzer 202 to identify targeting criteria for content received at the advertisement provider 102. The targeting criteria can include keywords, topics, concepts, categories, and the like.
In some implementations, the information obtained from analyses of a video content item performed by the content analyzer 202 can be used by both the boundary module 204 and the targeting criteria module 209. The boundary module 204 can use the information (e.g., recognized differences between frames, text of speech in the video content item, etc.) to identify multiple scenes in the video content item and the boundaries between the scenes. The boundaries segment the video content item into segments, for which the targeting criteria module 209 can use the same information to identify targeting criteria.
Advertisement provider 102 also includes an ad server 206. Ad server 206 may directly, or indirectly, enter, maintain, and track advertisement information. The ads may be in the form of graphical ads such as so-called banner ads, text only ads, image ads, audio ads, video ads, ads combining one or more of any of such components, etc. The ads may also include embedded information, such as a link, and/or machine executable instructions. User devices 106 may submit requests for ads to, accept ads responsive to their request from, and provide usage information to, the ad server 206. An entity other than a user device 106 may initiate a request for ads. Although not shown, other entities may provide usage information (e.g., whether or not a conversion or selection related to the advertisement occurred) to the ad server 206. For example, this usage information may include measured or observed user behavior related to ads that have been served.
The ad server 206 may include information concerning accounts, campaigns, creatives, targeting, etc. The term “account” relates to information for a given advertiser (e.g., a unique email address, a password, billing information, etc.). A “campaign,” “advertising campaign,” or “ad campaign” refers to one or more groups of one or more advertisements, and may include a start date, an end date, budget information, targeting information, syndication information, etc.
In some implementations, the advertisement provider 102 may receive content from the content provider 104. The techniques and methods discussed in the above description may be applied to the received content. The advertisement provider 102 can then provide advertising content to the content provider 104 that corresponds to the received/analyzed content.
In some implementations, the selection of advertisements for placement in the received/analyzed video content may be determined based on a placement preference determined by, for example, an advertiser. The placement preference indicates a presentation preference of an advertisement relative to the presentation of the video feature content. The advertiser may modify the placement preference for an advertisement to influence the selection of a video content item in which the advertisement is to be presented. The ad server 206 may provide a user interface for the advertiser to enter and modify the placement preference for an advertisement.
The placement preference for an advertisement may include characteristics of a video content item in which the advertisement is to appear. For example, a placement preference may include placement of an advertisement relative to video feature content, such as targeting (or excluding) one or more of pre-roll placement, mid-roll placement or post-roll placement. The placement preference may also include placement of an advertisement based on whether the viewer has the capability of skipping advertisements, excluding placement of an advertisement in an embedded video, or excluding placement of an advertisement in video presented by web sites identified by the advertiser.
The advertisement provider 102 may use one or more advertisement repositories 214 for selecting ads for presentation to a user or other advertisement providers. The repositories 214 may include any memory or database module and may take the form of volatile or non-volatile memory including, without limitation, magnetic media, optical media, random access memory (RAM), read-only memory (ROM), removable media, or any other suitable local or remote memory component.
The content provider 104 includes a video server 216. The video server 216 may be thought of, generally, as a content server in which the content served is simply a video content item, such as a video stream or a video file, for example. Further, video player applications may be used to render video files. Ads may be served in association with video content items. For example, one or more ads may be served before, during, or after a music video, program, program segment, etc. Alternatively, one or more ads may be served in association with a music video, program, program segment, etc. In implementations where audio-only content items can be provided, the video server 216 can be an audio server instead, or, more generally, a content server can serve video content items and audio content items.
The content provider 104 may have access to various content repositories. For example, the video content and advertisement targeting criteria repository 218 may include available video content items (e.g., video content items for a particular website) and their corresponding targeting criteria. In some implementations, the advertisement provider 102 analyzes the material from the repository 218 and determines the targeting criteria for the received material. This targeting criteria can be correlated with the material in the video server 216 for future usage, for example. In some implementations, the targeting criteria for a content item in the repository is associated with a unique identifier of the content item.
In operation, the advertisement provider 102 and the content provider 104 can both provide content to a user device 106. The user device 106 is one example of an advertisement consumer. The user device 106 may include a user device such as a media player (e.g., an MP3 player, a streaming audio player, a streaming video player, a television, a computer, a mobile device, etc.), a browser facility, an e-mail facility, telephony means, etc.
As shown in FIG. 2, the user device 106 includes a video player module 220, a targeting criteria extractor 222, and an ad requester 224. The video player module 220 can execute documents received in the user device 106. For example, the video player module 220 can play back video files or streams. In some implementations, the video player module 220 is a multimedia player module that can play back video files or streams and audio files or streams.
In some implementations, when the user device 106 receives content from the content provider (e.g., video, audio, textual content), the targeting criteria extractor 222 can receive corresponding metadata. The metadata includes targeting criteria. The targeting criteria extractor 222 extracts the targeting criteria from the received metadata. In some implementations, the targeting criteria extractor 222 can be a part of the ad requester 224. In this example, the ad requester 224 extracts the targeting criteria from the metadata. The extracted targeting criteria can be combined with targeting criteria derived from other sources (e.g., web browser type, user profile, etc.), if any, and one or more advertisement requests can be generated based on the targeting criteria.
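A minimal sketch of that extraction-and-merge step, with the metadata layout and helper names assumed purely for illustration:

```python
# Illustrative extraction of targeting criteria and assembly of an ad request.
def extract_targeting_criteria(metadata: dict) -> set[str]:
    """Pull targeting criteria (keywords/topics) out of content metadata."""
    return set(metadata.get("targeting_criteria", []))

def build_ad_request(metadata: dict, browser_type: str = "",
                     user_topics: set[str] | None = None) -> dict:
    criteria = extract_targeting_criteria(metadata)
    criteria |= user_topics or set()      # merge criteria from other sources
    request = {"criteria": sorted(criteria)}
    if browser_type:
        request["browser_type"] = browser_type
    return request
```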
In some other implementations, the metadata, which includes targeting criteria, is received by the user device. A script for sending a request can be run by the ad requester 224. The script operates to send a request using the received targeting criteria, without necessarily extracting the targeting criteria from the metadata.
The ad requester 224 can also simply perform the ad request using the targeting criteria information. For example, the ad requester 224 may submit a request for ads to the advertisement provider 102. Such an ad request may include a number of ads desired. The ad request may also include document request information. This information may include the document itself (e.g., page), a category or topic corresponding to the content of the document or the document request (e.g., arts, business, computers, arts-movies, arts-music, etc.), part or all of the document request, content age, content type (e.g., text, graphics, video, audio, mixed media, etc.), geo-location information, metadata information, etc.
In some implementations, the ad request may include placement information of a video content item. The ad server 206 may use the received placement information of the video content item to determine whether the video content item satisfies a placement preference of an advertisement determined by an advertiser. For example, the placement information may indicate whether the video content item allows pre-roll placement, mid-roll placement, or post-roll placement. Alternatively or additionally, the placement information may indicate whether the video content item allows a viewer to skip advertisements.
In some implementations, the content analyzer 202, boundary module 204, and targeting criteria module 209 can be included in the content provider 104. That is, the analysis of content items and determination of boundaries and targeting criteria can take place at the content provider 104.
Although the foregoing examples described servers as (i) requesting ads, and (ii) combining them with content, one or both of these operations may be performed by a user device (e.g., an end user computer).
FIG. 3 is an example user interface 300 illustrating advertising content displayed on a screen with video content. The user interface 300 illustrates an example web browser user interface. However, the content shown in the user interface 300 can be presented in a webpage, an MP3 player, a streaming audio player, a streaming video player, a television, a computer, a mobile device, etc. The content shown in the user interface 300 may be provided by advertisement provider 102, content provider 104, another networked device, or some combination of those providers.
As shown, the user interface 300 includes a video player region 302 and one or more “other content” regions 304A and 304B. The video display region 302 may include a media player for presenting text, images, video, or audio, or any combination thereof. An example of what can be shown in the video display region 302 is described in further detail below in relation to FIG. 4.
The other content regions 304A and 304B may display links, third party add-ins (e.g., search controls, download buttons, etc.), video and audio clips (e.g., graphics), help instructions (e.g., text, html, pop-up controls, etc.), and advertisements (e.g., banner ads, flash-based video/audio ads, scrolling ads, etc.).
The other content may be related to the content displayed in the video player region 302. For example, boundaries, targeting criteria, and other metadata related to the video player content may have been used to determine the other content displayed in one of the other content regions 304A and 304B. In some implementations, the other content is not related to the content in the video player region 302.
The other content regions 304A and 304B may be in proximity to the video player region 302 during the presentation of video or audio content in the region 302. For example, the other content regions 304A and 304B can be adjacent to the video display region 302, either above, below, or to the side of the video display region 302. For example, the user interface 300 may include an add-on, such as a stock ticker with text advertisements. The stock ticker can be presented in one of the other content regions 304A and 304B.
FIG. 4 illustrates an example user interface that can be displayed in a video player, such as in video player region 302. Content items, such as video, audio, and so forth can be displayed in the video player region 302. The region 302 includes a content display portion 402 for displaying a content item, a portion 404 for displaying information (e.g., title, running time, etc.) about the content item, player controls 405 (e.g., volume adjustment, full-screen mode, play/pause button, progress bar and slider, option menu, etc.), an advertisement display portion 408, and a multi-purpose portion 406 that can be used to display various content (e.g., advertisements, closed-captions/subtitles/transcript of the content item, related links, etc.).
As shown, the content shown represents a video (or audio) interview occurring between a person located in New York City, N.Y., and a person located in Los Angeles, Calif. The interview is displayed in the content display portion 402 of the region 302.
The region 302 may be presented as a stream, upon visiting a particular site presenting the interview, or after the execution of a downloaded file containing the interview or a link to the interview. As such, the region 302 may display additional content (e.g., advertisement content) that relates to the content shown in the video interview. For example, the additional content may change according to what is displayed in the region 302. The additional content can be substantially available as content from the content provider 104 and/or the advertisement provider 102.
An on-screen advertisement is displayed in the multi-purpose portion 406. An additional on-screen advertisement is displayed in the advertisement display portion 408. In some implementations, on-screen advertisements may include text-and-audio, video, text, animated images, still images, or some combination thereof.
In some implementations, the content display portion 402 can display advertisements targeted to audio-only content, such as ads capable of being displayed in-stream with a podcast or web monitored radio broadcasts. For example, the advertisement provider 102 may provide interstitial advertisements, sound bites, or news information in the audio stream of music or disc jockey conversations.
Advertisements may be presented on the content display portion 402. Temporal placement of advertisements relative to a video content item may vary. For example, an advertisement presentation may be pre-roll, mid-roll, or post-roll placement.
In some implementations, the progress bar in the player controls 405 also shows the positions of the advertisement slots in the content item being played.
The multi-purpose portion 406 may also include a skip advertisement link or control 410. When the skip advertisement link 410 is selected by the user, the currently displayed video advertisement is skipped and playback continues from the first frame of the video after the skipped video advertisement (or playback stops if the skipped video advertisement is located at the end of the video). In some implementations, the skip advertisement link or control 410 is a link. In some other implementations, the skip advertisement link or control 410 may be a button, selectable icon, or some other user-selectable user interface object. As described previously with respect to FIGS. 1 and 2, the ability of a user to skip advertisements, for example, by using the skip advertisement link or control 410, may affect the selection of an advertisement to be presented by the ad server 206.
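A hedged sketch of the skip behavior just described, assuming a hypothetical player interface with seek, play, and stop methods (none of which are defined by this disclosure):

```python
# Illustrative skip handling: resume feature playback just past the skipped
# in-stream ad, or stop if the ad closed out the video.
def handle_skip(player, ad_start_s: float, ad_length_s: float,
                video_length_s: float) -> None:
    resume_at = ad_start_s + ad_length_s
    if resume_at >= video_length_s:   # the ad was at the end of the video
        player.stop()
    else:
        player.seek(resume_at)        # first frame after the skipped ad
        player.play()
```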
FIG. 5 is an example flow diagram of a process flow 500 for providing video advertisements. A video is received by a client (502), which, for example, may be an implementation of a user device 106 of FIGS. 1 and 2. In some implementations, after the client sends a request for the video to the publisher, a video is received by a client from the publisher, which, for example, may be an implementation of the content provider 104 of FIGS. 1 and 2. The request may be sent by the client in response to the client attempting to access the video. For example, the client may have loaded, at a user's command, a web page within a web browser application, where the web page has an embedded video, referred to by its URL.
The video is played (504). The video may be played in a standalone video player module or in an embedded player module/plug-in. In an exemplary implementation, the video is played in a video player user interface in a web page, such as that described above with relation to FIGS. 3 and 4. In some implementations, the video begins playing after the entire video is downloaded into memory (volatile and/or non-volatile) at the client. In some other implementations, the video is streamed to the client.
During the playback of the video, an impending advertisement slot in the video is detected (506). Detecting locations for insertions of advertisements in a video stream may be accomplished using the technology, for example, described in U.S. patent application Ser. No. 11/738,292, for “Media Advertising,” which is incorporated by reference in its entirety. One or more video advertisements are requested (508). The video advertisements are requested for placement in the detected advertisement slot and for display to the user when playback of the video reaches the advertisement slot. In some implementations, the request merely asks for one or more advertisements, without requesting for any specific advertisement. In some other implementations, the request may ask for a specific advertisement. In an exemplary implementation, the request includes an identifier of the video (e.g., a video ID), metadata associated with the video, the position of the advertisement slot, and the length of the advertisement slot.
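As an illustration only, an ad request for a detected advertisement slot might carry fields like the following; the exact names are assumptions based on the description above, not a defined request format.

```python
# Hypothetical ad-request payload for a detected advertisement slot.
def slot_ad_request(video_id: str, metadata: dict,
                    slot_position_s: float, slot_length_s: float,
                    max_ads: int = 1) -> dict:
    return {
        "video_id": video_id,                 # identifier of the video being played
        "metadata": metadata,                 # e.g., title, categories, tags
        "slot_position_s": slot_position_s,   # where the slot falls in the video
        "slot_length_s": slot_length_s,       # how much time the slot reserves
        "max_ads": max_ads,                   # number of advertisements desired
    }
```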
The request is received by, for example, an ad server (510). In some implementations, the server may identify the video for which the video advertisement is placed by a video identifier (ID) included in the request. The identity of the video for the video advertisement may be used to track advertisement placements. The server may determine one or more video advertisements for placement based on any number of factors, including but not limited to the position of the advertisement slot relative to video feature content, identity of presenting websites such as represented by a URL, a domain or a sub-domain, ability to skip advertisements, whether the video content item is embedded, the length of the advertisement slot, metadata associated with the video, any categories with which the video is associated, advertisement placement preference or advertisement exclusion preference, etc.
The ad server may compare the information in an ad request from a client with placement preferences of advertisers to determine one or more advertisements for placement. For example, the ad request may indicate that the video allows an advertisement to be presented only after the feature content of the video has finished playing. Based on this information, the ad server identifies advertisements for which placement preferences of advertisers permit post-roll placement of the advertisements. In another example, advertisements may be selected or excluded by other information in the ad request, such as whether a viewer of the video has capability of skipping ads, whether the video is an embedded video, or whether the video is presented by websites identified by the advertiser.
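A sketch of that eligibility comparison, reusing the PlacementPreference fields assumed earlier and assuming the request carries the placement fields named in this paragraph (position, skippability, embedded status, presenting site); this is illustrative, not the server's actual logic.

```python
# Illustrative eligibility test against an advertiser's placement preference.
def ad_is_eligible(pref: PlacementPreference, request: dict) -> bool:
    position = request.get("position")                 # a RollPosition value
    if pref.allowed_positions and position not in pref.allowed_positions:
        return False
    if pref.allow_skippable is False and request.get("viewer_can_skip"):
        return False
    if pref.allow_embedded is False and request.get("is_embedded"):
        return False
    if request.get("presenting_site") in pref.excluded_sites:
        return False
    return True
```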
At least one advertisement is transmitted (512). In some implementations, the advertisement(s) are transmitted from the publisher at the request of the ad server. In some other implementations, the video advertisement(s) are transmitted by the ad server. The advertisement(s) is received by the client (514). The received advertisement(s) is placed in the advertisement slot within the video and, when playback of the video reaches the advertisement slot, the advertisement(s) is presented (518). This may be accomplished using the technology, for example, described in U.S. patent application Ser. No. 11/550,388, for “Using Viewing Signals In Targeted Video Advertising,” which is incorporated by reference in its entirety. In one example, the advertisements may be presented in one or both of the content regions 304A and 304B of FIG. 3.
It should be appreciated that it may be possible that no advertisement is transmitted for an advertisement slot. For example, the ad server may determine that no advertiser provided an advertisement for placement with the video. In another example, the ad server may determine that the ad request does not satisfy any placement preferences of advertisements. When playback of the video reaches the advertisement slot, the advertisement slot may be bypassed, and playback continues from the next portion of the video.
As described above, a video may have one or more advertisement slots. An advertisement slot is a span of time in a video that is reserved for presenting advertisements. In some implementations, an advertisement slot is akin to the well-known commercial break within or between television programs. An advertisement slot may be located anywhere in the video, including at the beginning (before the feature content of the video), in between portions of the video, or at the end. A video may have one or more advertisement slots. An advertisement slot may be of any non-zero length. In an example implementation, the length of an advertisement slot is thirty (30) seconds. In another example implementation, the length of an advertisement slot is sixty (60) seconds. Furthermore, in some implementations, the advertisement slot has a maximum length and the total running time of the one or more advertisements placed in a particular slot may be less than or equal to the maximum length of that slot.
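A simple, assumption-laden sketch of filling a slot without exceeding its maximum length (a greedy pass over already-ranked ads, purely for illustration):

```python
# Greedy slot filling that respects the slot's maximum length.
def fill_slot(ranked_ads: list[tuple[str, float]], max_length_s: float) -> list[str]:
    """ranked_ads: (ad_id, duration_s) pairs in priority order."""
    chosen, used = [], 0.0
    for ad_id, duration in ranked_ads:
        if used + duration <= max_length_s:
            chosen.append(ad_id)
            used += duration
    return chosen
```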
In some implementations, one or more advertisement slots are added to a video by the creator of the video. That is, the creator of the video indicates the positions and lengths of the advertisement slots as part of the process of creating the video or as a subsequent modification to the video. In some other implementations, positions of advertisement slots are determined by automated processes.
FIG. 6 is an example flow diagram of a process 600 for indicating a placement preference for use in selecting advertisements. The process 600 may be executed, for example, by the advertisement provider system 102 of FIGS. 1 and 2.
The process 600 begins when a user input indicating a placement preference is received (610). More particularly, the user input indicates a placement preference for an advertisement to be presented in video relative to the presentation of the video feature content. The user may be an advertiser who wants to specify a placement preference for an advertisement. The placement preference may indicate, for example, the temporal position of the advertisement relative to the video feature content (e.g., pre-roll, mid-roll or post-roll placement). Additionally or alternatively, the placement preference may indicate whether the advertisement may be presented in video content where the viewer may skip the advertisement.
In some implementations, an advertiser may use a graphical user interface to enter or modify a placement preference (or preferences). One example of such a user interface is the user interface 700 described below with respect to FIG. 7.
In some implementations, receiving user input may include receiving a bid from an advertiser for placement of an advertisement, where the bid reflects a placement preference of a sponsor of the advertisement. For example, one bid may be received for an advertisement based on a pre-roll placement, and another bid may be received for the same advertisement based on a post-roll placement.
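One possible way to picture per-placement bidding is a lookup keyed by the pair of advertisement and placement type. The sketch below reuses the example pre-roll and post-roll figures that appear in FIG. 7 for “ABC AD1”; the dictionary layout and the bid_for helper are illustrative assumptions.

    # Distinct bids for the same advertisement, keyed by placement type (illustrative).
    bids_by_placement = {
        ("ABC AD1", "pre-roll"):  {"cpm": 1.00, "cpc": 0.05},
        ("ABC AD1", "post-roll"): {"cpm": 0.05, "cpc": 0.02},
    }

    def bid_for(ad_id, placement):
        """Return the bid recorded for an advertisement at a given placement, if any."""
        return bids_by_placement.get((ad_id, placement))

    print(bid_for("ABC AD1", "post-roll"))   # {'cpm': 0.05, 'cpc': 0.02}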
The received placement preference is stored in association with an advertisement (620). For example, the placement preference is stored in the advertising repository 103 of FIG. 1. The stored placement preference is then used to influence the selection of advertisements for presentation in video (630). In some implementations, the stored placement preference may be used to select an advertisement in response to a request for advertisements from a client, as described previously with respect to FIG. 5. For example, if the request from a client allows a pre-roll placement, advertisements having a placement preference for pre-roll placement may be selected. Alternatively or additionally, if the request is from a client that allows a viewer to skip advertisements, advertisements having a placement preference against the skipping feature may be excluded.
Selecting advertisements for presentation during a video broadcast or in a video stream may be accomplished using the technology, for example, described in U.S. patent application Ser. No. 11/550,249, for “Targeted Video Advertising,” which is incorporated by reference in its entirety.
FIG. 7 is an example user interface 700 for entering bids for advertisement placement relative to feature content. Placement of advertisement(s) during video playback may be accomplished using the technology, for example, described in U.S. Patent Application No. 60/915,654, for “User Interfaces for Web-Based Video Player,” which is incorporated by reference in its entirety. In this example, the selection of advertisements for placement in a video content item is determined based on a placement preference and a bid of an advertiser. For example, among the advertisements having a placement preference that matches a video content item, the advertisement(s) with the highest bid may be presented with the video feature content as specified by the placement preference.
Auctions for particular placement of advertisements may be accomplished using the technology, for example, described in U.S. patent application Ser. No. 11/479,942, for “Slot Preference Auction,” which is incorporated by reference in its entirety.
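A deliberately simplified stand-in for such an auction is shown below: among advertisements whose placement preferences match the video, the one with the highest CPM bid wins. Real systems may rank by expected value, combine CPM and CPC bids, or run a full auction; the select_by_bid helper is an illustrative assumption only.

    def select_by_bid(matching_ads):
        """Among ads whose placement preferences match the video, pick the highest CPM bid.
        matching_ads is a list of (ad_id, cpm_bid) pairs; ties and CPC ranking are ignored."""
        if not matching_ads:
            return None            # no ad is served and the slot may be bypassed
        return max(matching_ads, key=lambda pair: pair[1])[0]

    print(select_by_bid([("ABC AD1", 1.00), ("XYZ AD9", 0.75)]))   # ABC AD1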
The user interface 700 may be used, for example, during the process 600 of FIG. 6. The user interface 700 may be provided by ad server 206 to an advertiser, so that the advertiser may specify a placement preference.
As shown, the user interface 700 includes an advertiser region 702, an advertisement region 704 and a placement preference region 706. The placement preference region 706 includes placement preferences 718, 720, 722, 724 and 726. Each of the placement preferences 718, 720, 722, 724 and 726 specifies factors to be considered in selecting advertisements to be presented in video content. In this example, each of the placement preferences 718, 720, 722, 724 and 726 may specify five factors or columns: a “Video Channel” column 708, a “Placement” column 710, a “Permit Skipping” column 712, a “BID: CPM” column 714 and a “BID: CPC” column 716. The user interface 700 also includes a “save” button 728 and a “cancel” button 730.
The advertiser region 702 identifies the advertiser, for example, “ABC Advertisement, Inc.” The ad server 206 may associate the advertiser specified in the advertiser region 702 with information for the advertiser (e.g., an email address, a password, billing information, etc.). The advertisement region 704 specifies one or more advertisements of the advertiser to which the placement preferences of the placement preference region 706 are to be applied. In this example, a single advertisement, “ABC AD1,” is specified in the advertisement region 704. In some implementations, more than one advertisement may be specified to which the placement preferences 718, 720, 722, 724 and 726 are to be applied.
Each of the placement preferences 718, 720, 722, 724 and 726 specifies each of the five factors 708, 710, 712, 714 and 716 to be used to select advertisements to be presented in video content. The “Video Channel” 708 indicates the type of video content in which the advertisement specified in the advertisement region 704 may be presented, such as “News-Videos” and “Action-Movies.” The “Placement” 710 indicates the temporal position in video content where the advertisement should be placed, such as pre-roll, mid-roll and post-roll placements. The “Permit Skipping” 712 indicates whether the advertisement may be presented in video content where a viewer may skip advertisements. The “BID: CPM” 714 specifies a bid based on Cost-Per-Mille (CPM). The “BID: CPC” 716 specifies a bid based on Cost-Per-Click (CPC).
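One possible in-memory representation of a single row of the placement preference region 706 is sketched below; the class name PlacementPreferenceRow and its field names are assumptions made for the example, not elements recited in this description.

    from dataclasses import dataclass

    @dataclass
    class PlacementPreferenceRow:
        # One row of a FIG. 7-style preference table (field names are illustrative).
        video_channel: str      # e.g., "News-Videos" or "Action-Movies"
        placement: str          # "Pre-Roll", "Mid-Roll" or "Post-Roll"
        permit_skipping: bool   # may the ad appear where the viewer can skip it?
        bid_cpm: float          # cost per thousand impressions, in dollars
        bid_cpc: float          # cost per click, in dollars

    # The values of placement preference 718 as described below.
    row_718 = PlacementPreferenceRow("News-Videos", "Pre-Roll", False, 1.00, 0.05)
    print(row_718)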
In the placement preference 718, the “Video Channel” column 708, “Placement” column 710, “Permit Skipping” column 712, “BID: CPM” 714 and “BID: CPC” 716 are respectively specified as “News-Videos,” “Pre-Roll,” “No,” $1.00 and $0.05. Thus, the placement preference 718 specifies that the advertisement “ABC AD1” should be presented in news videos (“News-Videos”) before the video feature plays (“Pre-Roll”) where a viewer cannot skip advertisements (“No” for “Permit Skipping”). For such a presentation of “ABC AD1,” the advertiser “ABC Advertisement, Inc.” offered a bid of $1.00 based on CPM and a bid of $0.05 based on CPC.
In the placement preference 720, the five factors or columns 708, 710, 712, 714 and 716 are respectively specified as “News-Videos,” “Mid-Roll,” “Yes,” $0.25 and $0.02. Thus, the placement preference 720 specifies that the advertisement “ABC AD1” should be presented in news videos (“News-Videos”) while the video feature is playing (“Mid-Roll”), regardless of whether a viewer may skip advertisements (“Yes” for “Permit Skipping”). For such a presentation of “ABC AD1,” the advertiser “ABC Advertisement, Inc.” offered a bid of $0.25 based on CPM and a bid of $0.02 based on CPC.
In the placement preference 722, the five factors or columns 708, 710, 712, 714 and 716 are respectively specified as “News-Videos,” “Post-Roll,” “Yes,” $0.05 and $0.02. Thus, the placement preference 722 specifies that the advertisement “ABC AD1” should be presented in news videos (“News-Videos”) after the video feature has finished playing (“Post-Roll”), regardless of whether a viewer may skip advertisements (“Yes” for “Permit Skipping”). For such a presentation of “ABC AD1,” the advertiser “ABC Advertisement, Inc.” offered a bid of $0.05 based on CPM and a bid of $0.02 based on CPC.
In the placement preference 724, the five factors or columns 708, 710, 712, 714 and 716 are respectively specified as “Action-Movies,” “Pre-Roll,” “No,” $1.25 and $0.05. Thus, the placement preference 724 specifies that the advertisement “ABC AD1” should be presented in action movies (“Action-Movies”) before the video feature plays (“Pre-Roll”) where a viewer cannot skip advertisements (“No” for “Permit Skipping”). For such a presentation of “ABC AD1,” the advertiser “ABC Advertisement, Inc.” offered a bid of $1.25 based on CPM and a bid of $0.05 based on CPC.
In the placement preference 726, the five factors or columns 708, 710, 712, 714 and 716 are respectively specified as “Action-Movies,” “Pre-Roll,” “Yes,” $0.50 and $0.02. Thus, the placement preference 726 specifies that the advertisement “ABC AD1” should be presented in action movies (“Action-Movies”) before the video feature plays (“Pre-Roll”), regardless of whether a viewer may skip advertisements (“Yes” for “Permit Skipping”). For such a presentation of “ABC AD1,” the advertiser “ABC Advertisement, Inc.” offered a bid of $0.50 based on CPM and a bid of $0.02 based on CPC.
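Putting the five example rows of FIG. 7 together, a request characterized by a channel, a placement, and the viewer's ability to skip advertisements can be resolved to the best compatible CPM bid. The worked example below uses only the figures quoted above; the tuple layout and the best_cpm helper are assumptions, not a prescribed data model.

    # The five example rows of FIG. 7 as (channel, placement, permit_skipping, cpm, cpc) tuples.
    PREFS = [
        ("News-Videos",   "Pre-Roll",  False, 1.00, 0.05),   # 718
        ("News-Videos",   "Mid-Roll",  True,  0.25, 0.02),   # 720
        ("News-Videos",   "Post-Roll", True,  0.05, 0.02),   # 722
        ("Action-Movies", "Pre-Roll",  False, 1.25, 0.05),   # 724
        ("Action-Movies", "Pre-Roll",  True,  0.50, 0.02),   # 726
    ]

    def best_cpm(channel, placement, viewer_can_skip):
        """Return the highest CPM bid among rows compatible with the request, or None."""
        compatible = [
            cpm for (ch, pl, permit_skip, cpm, _cpc) in PREFS
            if ch == channel and pl == placement and (permit_skip or not viewer_can_skip)
        ]
        return max(compatible) if compatible else None

    # A skippable pre-roll slot in an action movie: row 724 is excluded, row 726 wins.
    print(best_cpm("Action-Movies", "Pre-Roll", viewer_can_skip=True))   # 0.5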
The advertiser may modify regions 702, 704 and 706 to modify the selection of a media stream or file in which one or more advertisements are to be presented. For example, the advertiser may add or delete advertisements in the advertisement region 704, thereby determining which advertisements are influenced by the settings of the placement preference region 706. The advertiser may also modify items in the placement preference region to influence the selection of media streams or files. The advertiser may cancel the modification of the settings of the user interface 700 by using the cancel button 730. The advertiser may store the modification and cause it to take effect by using the save button 728.
FIG. 8 is an example flow diagram of a process 800 for indicating a placement preference for use in selecting advertisements. The process 800 may be executed, for example, by the advertisement provider system 102 of FIGS. 1 and 2.
The process begins when a user input indicating a placement preference is received (810). More particularly, the user input indicates a placement preference for an advertisement to be presented in video based on the entity presenting the video. The user may be an advertiser who wants to specify a placement preference for an advertisement. The placement preference may indicate, for example, whether the advertisement may be presented in embedded content. Additionally or alternatively, the placement preference may include a list of entities, indicating that the advertisement should not be presented in video content presented by the entities in the list.
In some implementations, an advertiser may use a graphical user interface to enter or modify a placement preference (or preferences). One example of such a user interface is the user interface 900 described below with respect to FIG. 9. In some implementations, receiving user input may include receiving a bid from an advertiser for placement of an advertisement, where the bid reflects a placement preference of a sponsor of the advertisement. For example, a bid may be received for an advertisement indicating that the advertisement should not be presented in embedded content.
The received placement preference is stored in association with an advertisement (820). For example, the placement preference is stored in the advertising repository 103 of FIG. 1. The stored placement preference is then used to influence the selection of advertisements for presentation in video (830). In some implementations, the stored placement preference may be used to reject a request for an advertisement for embedded content. For example, upon receiving a request for an advertisement, an ad server may determine the domain of the incoming request and compare that domain with the domain of the content owner. If the domains differ, or if the domain of the incoming request cannot be determined, the ad server may determine that the request is for embedded content. Advertisements with a placement preference that prohibits presentation in embedded content will then not be selected by the ad server for presentation in that content. Alternatively or additionally, the stored placement preference may be used to reject a request for an advertisement from certain entities. For example, upon receiving a request for an advertisement, an ad server may determine the domain of the incoming request and determine whether the domain is included in the list of entities indicated by the stored placement preference for an advertisement. If the list includes the domain of the incoming request, the ad server does not select the advertisement for presentation in the content.
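The embedded-content check and the entity exclusion check described above can be pictured roughly as follows; the use of urllib.parse, the dictionary-based advertisement record, and the helper names is_embedded and may_serve are assumptions made for the sake of the example.

    from urllib.parse import urlparse

    def is_embedded(request_url, owner_domain):
        """Treat the request as embedded when its domain differs from the content owner's
        domain or cannot be determined (one possible reading of the heuristic above)."""
        domain = urlparse(request_url).hostname
        return domain is None or domain != owner_domain

    def may_serve(ad, request_url, owner_domain):
        """Apply the embedded-content and excluded-entity preferences to one request."""
        domain = urlparse(request_url).hostname or ""
        if is_embedded(request_url, owner_domain) and not ad["allow_embedded"]:
            return False
        if domain in ad["excluded_domains"]:
            return False
        return True

    ad = {"allow_embedded": False, "excluded_domains": {"negative.example.com"}}
    print(may_serve(ad, "http://video.example.org/watch?v=1", "example.com"))   # False (embedded)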
In some implementations, the selected advertisements may be presented in a media stream or file, other than a video stream or file. For example, the media stream or file may be an audio stream or file, or a combination of video and audio streams or files.
FIG. 9 is an example user interface 900 for excluding placement of advertisements in video based on the entity presenting the video. The user interface 900 may be used, for example, during the process 800 of FIG. 8. The user interface 900 may be provided by ad server 206 to an advertiser, so that the advertiser may specify a placement preference.
As shown, the user interface 900 includes an advertiser region 902, an advertisement region 904 and an advertiser preference region 906. The advertiser preference region 906 includes advertiser preferences 906A, 906B and 906C and a website list region 908. The user interface 900 also includes a “save” button 910 and a “cancel” button 912.
The advertiser region 902 specifies the advertiser, for example, “ABC Advertisement, Inc.” The ad server 206 may associate the advertiser specified in the advertiser region 902 with information for the advertiser (e.g., a unique email address, a password, billing information, etc.). The advertisement region 904 specifies one or more advertisements of the advertiser to which the advertiser preferences of the advertiser preference region 906 will be applied. In this example, the advertiser may select one or more advertisements among “ABC AD1” 904A and “ABC AD2” 904B, or may select all the advertisements of “ABC Advertisement, Inc.” by selecting “All ABC Ads” 904C.
Each of the advertiser preferences 906A, 906B and 906C specifies a placement preference that will influence the selection of a media stream or file in which the advertisements specified in the advertisement region 904 are to be presented. For example, the advertiser preference “Do Not Place AD in Video Content” 906A specifies whether the advertisements may be placed in video content. In the example, the advertiser preference 906A is not activated, as illustrated by the radio button 907A near the advertiser preference 906A. Thus, the advertisements may be placed in video content. The advertiser may prohibit the placement of the advertisements in video content by activating the radio button 907A for the advertiser preference 906A.
The advertiser preference “Show AD in Embedded Videos” 906B specifies whether the advertisements may be shown in embedded videos. In the example, the advertiser preference 906B is not activated, as illustrated by the radio button 907B near the advertiser preference 906B. Thus, the advertisements may be shown in embedded videos. The advertiser may prohibit the placement of the advertisements in embedded videos by activating the radio button 907B for the advertiser preference 906B.
The advertiser preference “Do Not Place AD on These Sites:” 906C dictates that the advertisements should not be shown in videos presented by the websites specified in the website list region 908. In the example, the advertiser preference 906C is activated, as illustrated by the radio button 907C near the advertiser preference 906C. Thus, the advertisements should not be shown in videos presented by any of the websites specified in the website list region 908. As illustrated, www.example.com, example.com, negative.example.com, www.example.com/category, and www.example.com/home.html are excluded.
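One possible reading of how a presenting page might be matched against exclusion entries of the kind shown in the website list region 908 (bare domains, sub-domains, and domain-plus-path entries) is sketched below; the matching rules and the is_excluded helper are assumptions, since the description does not prescribe a matching algorithm.

    from urllib.parse import urlparse

    EXCLUDED = ["www.example.com", "example.com", "negative.example.com",
                "www.example.com/category", "www.example.com/home.html"]

    def is_excluded(page_url, exclusions=EXCLUDED):
        """Return True if the presenting page matches an exclusion entry.
        Entries may be a host alone or a host followed by a path prefix."""
        parsed = urlparse(page_url if "://" in page_url else "http://" + page_url)
        host = (parsed.hostname or "").lower()
        path = parsed.path or "/"
        for entry in exclusions:
            entry_host, _, entry_path = entry.partition("/")
            if host != entry_host.lower():
                continue
            if not entry_path or path.startswith("/" + entry_path):
                return True
        return False

    print(is_excluded("http://www.example.com/category/news"))   # True
    print(is_excluded("http://other.example.net/page"))          # False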
The advertiser may activate one or more of the advertiser preferences 906A, 906B or 906C to influence the selection of media streams or files in which the advertisements are to be presented. The advertiser may also add or delete websites in the website list region 908 to influence on which websites the advertisements should not be shown. The advertiser may add or delete advertisements in the advertisement region 904, thereby determining which advertisements are influenced by the settings of the advertiser preference region 906.
The advertiser may cancel the modification of the settings of the user interface 900 by using the cancel button 912. The advertiser may store the modified settings and cause them to take effect by using the save button 910.
Although the above implementations describe targeting advertisements to content items that include video content and presenting such advertisements, the above implementations are applicable to other types of content items and to the targeting of content other than advertisements to content items. For example, in some implementations, a text advertisement, an image advertisement, an audio-only advertisement, or other content, etc. might be presented with a video content item. Thus, the format of the ad content may, but need not, match that of the video content item with which it is served. The ad content may be rendered in the same screen position as the video content, or in a different screen position (e.g., adjacent to the video content as illustrated in FIG. 3). A video advertisement may include video components, as well as additional components (e.g., text, audio, etc.). Such additional components may be rendered on the same display as the video components, and/or on some other output means of the user device. Similarly, video ads may be played with non-video content items (e.g., a video advertisement with no audio can be played with an audio-only content item).
In some implementations, the content item can be an audio content item (e.g., music file, audio podcast, streaming radio, etc.) and advertisements of various formats can be presented with the audio content item. For example, audio-only advertisements can be presented in-stream with the playback of the audio content item. If the audio content item is played in an on-screen audio player module (e.g., a Flash-based audio player module embedded in a webpage), on-screen advertisements can be presented in proximity to the player module. Further, if the player module can display video as well as play back audio, video advertisements can be presented in-stream with the playback of the audio content item.
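The choice of advertisement format based on the capabilities of the presenting player module might be pictured as follows; the capability flags on_screen and can_render_video and the formats_for_player helper are illustrative assumptions rather than anything prescribed here.

    def formats_for_player(player):
        """Pick advertisement formats that fit the capabilities of the module
        playing an audio content item (field names are illustrative)."""
        formats = ["audio"]                      # audio ads can always run in-stream
        if player.get("on_screen"):
            formats.append("display")            # on-screen companions near the player
        if player.get("can_render_video"):
            formats.append("video")              # in-stream video if the module supports it
        return formats

    print(formats_for_player({"on_screen": True, "can_render_video": False}))
    # ['audio', 'display']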
Further, in some implementations, the content that is identified for presentation based on the targeting criteria (advertisements in the implementations described above) need not be advertisements. The identified content can include non-advertisement content items that are relevant to the original content item in some way. For example, for a respective boundary in a video content item, other videos (that are not necessarily advertisements) relevant to the targeting criteria of one or more segments preceding the boundary can be identified. Information (e.g., a sample frame, title, running time, etc.) and the links to the identified videos can be presented in proximity to the video content item as related videos. In these implementations, the related content provider can be considered a second content provider that includes a content analyzer, boundary module, and a targeting criteria module.
The implementations above were described in reference to a client-server system architecture. It should be appreciated, however, that system architectures other than a client-server architecture can be used. For example, the system architecture can be a peer-to-peer architecture.
FIG. 10 shows an example of a generic computer device 1000 and a generic mobile computer device 1050, which may be used with the techniques described above. Computing device 1000 is intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, television set-top boxes, servers, blade servers, mainframes, and other appropriate computers. Computing device 1050 is intended to represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smartphones, and other similar computing devices. The components shown here, their connections and relationships, and their functions are meant to be exemplary only, and are not meant to limit the implementations described and/or the claims.
Computing device 1000 includes a processor 1002, memory 1004, a storage device 1006, a high-speed interface 1008 connecting to memory 1004 and high-speed expansion ports 1010, and a low-speed interface 1012 connecting to a low-speed bus 1014 and storage device 1006. The components 1002, 1004, 1006, 1008, 1010, and 1012 are interconnected using various buses, and may be mounted on a common motherboard or in other manners as appropriate. The processor 1002 can process instructions for execution within the computing device 1000, including instructions stored in the memory 1004 or on the storage device 1006, to display graphical information for a GUI on an external input/output device, such as display 1016 coupled to high-speed interface 1008. In other implementations, multiple processors and/or multiple buses may be used, as appropriate, along with multiple memories and types of memory. Also, multiple computing devices 1000 may be connected, with each device providing portions of the necessary operations (e.g., as a server bank, a group of blade servers, or a multi-processor system).
The memory 1004 stores information within the computing device 1000. In one implementation, the memory 1004 is a volatile memory unit or units. In another implementation, the memory 1004 is a non-volatile memory unit or units. The memory 1004 may also be another form of computer-readable medium, such as a magnetic or optical disk.
The storage device 1006 is capable of providing mass storage for the computing device 1000. In one implementation, the storage device 1006 may be or contain a computer-readable medium, such as a floppy disk device, a hard disk device, an optical disk device, or a tape device, a flash memory or other similar solid state memory device, or an array of devices, including devices in a storage area network or other configurations. A computer program product can be tangibly embodied in an information carrier. The computer program product may also contain instructions that, when executed, perform one or more methods, such as those described above. The information carrier is a computer- or machine-readable medium, such as the memory 1004, the storage device 1006, memory on processor 1002, or a propagated signal.
The high-speed controller 1008 manages bandwidth-intensive operations for the computing device 1000, while the low-speed controller 1012 manages lower bandwidth-intensive operations. Such allocation of functions is exemplary only. In one implementation, the high-speed controller 1008 is coupled to memory 1004, display 1016 (e.g., through a graphics processor or accelerator), and to high-speed expansion ports 1010, which may accept various expansion cards (not shown). In the implementation, the low-speed controller 1012 is coupled to storage device 1006 and low-speed expansion port 1014. The low-speed expansion port, which may include various communication ports (e.g., USB, Bluetooth, Ethernet, wireless Ethernet), may be coupled to one or more input/output devices, such as a keyboard 1034, a pointing device 1030, a scanner 1036, a printer 1032, or a networking device such as a switch or router, e.g., through a network adapter.
The computing device 1000 may be implemented in a number of different forms, as shown in the figure. For example, it may be implemented as a standard server 1020, or multiple times in a group of such servers. It may also be implemented as part of a rack server system 1024. In addition, it may be implemented in a personal computer such as a laptop computer 1022. Alternatively, components from computing device 1000 may be combined with other components in a mobile device (not shown), such as device 1050. Each of such devices may contain one or more of computing devices 1000, 1050, and an entire system may be made up of multiple computing devices 1000, 1050 communicating with each other.
Computing device 1050 includes a processor 1052, memory 1064, an input/output device such as a display 1054, a communication interface 1066, and a transceiver 1068, among other components. The device 1050 may also be provided with a storage device, such as a microdrive or other device, to provide additional storage. The components 1050, 1052, 1064, 1054, 1066, and 1068 are interconnected using various buses, and several of the components may be mounted on a common motherboard or in other manners as appropriate.
The processor 1052 can execute instructions within the computing device 1050, including instructions stored in the memory 1064. The processor may be implemented as a chipset of chips that include separate and multiple analog and digital processors. The processor may provide, for example, for coordination of the other components of the device 1050, such as control of user interfaces, applications run by device 1050, and wireless communication by device 1050.
Processor 1052 may communicate with a user through control interface 1058 and display interface 1056 coupled to a display 1054. The display 1054 may be, for example, a TFT (Thin-Film-Transistor Liquid Crystal Display) display or an OLED (Organic Light Emitting Diode) display, or other appropriate display technology. The display interface 1056 may comprise appropriate circuitry for driving the display 1054 to present graphical and other information to a user. The control interface 1058 may receive commands from a user and convert them for submission to the processor 1052. In addition, an external interface 1062 may be provided in communication with processor 1052, so as to enable near-area communication of device 1050 with other devices. External interface 1062 may provide, for example, for wired communication in some implementations, or for wireless communication in other implementations, and multiple interfaces may also be used.
The memory 1064 stores information within the computing device 1050. The memory 1064 can be implemented as one or more of a computer-readable medium or media, a volatile memory unit or units, or a non-volatile memory unit or units. Expansion memory 1074 may also be provided and connected to device 1050 through expansion interface 1072, which may include, for example, a SIMM (Single In Line Memory Module) card interface. Such expansion memory 1074 may provide extra storage space for device 1050, or may also store applications or other information for device 1050. Specifically, expansion memory 1074 may include instructions to carry out or supplement the processes described above, and may include secure information also. Thus, for example, expansion memory 1074 may be provided as a security module for device 1050, and may be programmed with instructions that permit secure use of device 1050. In addition, secure applications may be provided via the SIMM cards, along with additional information, such as placing identifying information on the SIMM card in a non-hackable manner.
The memory may include, for example, flash memory and/or NVRAM memory, as discussed below. In one implementation, a computer program product is tangibly embodied in an information carrier. The computer program product contains instructions that, when executed, perform one or more methods, such as those described above. The information carrier is a computer- or machine-readable medium, such as the memory 1064, expansion memory 1074, memory on processor 1052, or a propagated signal that may be received, for example, over transceiver 1068 or external interface 1062.
Device 1050 may communicate wirelessly through communication interface 1066, which may include digital signal processing circuitry where necessary. Communication interface 1066 may provide for communications under various modes or protocols, such as GSM voice calls, SMS, EMS, or MMS messaging, CDMA, TDMA, PDC, WCDMA, CDMA2000, or GPRS, among others. Such communication may occur, for example, through radio-frequency transceiver 1068. In addition, short-range communication may occur, such as using a Bluetooth, WiFi, or other such transceiver (not shown). In addition, a GPS (Global Positioning System) receiver module 1070 may provide additional navigation- and location-related wireless data to device 1050, which may be used as appropriate by applications running on device 1050.
Device 1050 may also communicate audibly using audio codec 1060, which may receive spoken information from a user and convert it to usable digital information. Audio codec 1060 may likewise generate audible sound for a user, such as through a speaker, e.g., in a handset of device 1050. Such sound may include sound from voice telephone calls, may include recorded sound (e.g., voice messages, music files, etc.) and may also include sound generated by applications operating on device 1050.
The computing device 1050 may be implemented in a number of different forms, as shown in the figure. For example, it may be implemented as a cellular telephone 1080. It may also be implemented as part of a smartphone 1082, personal digital assistant, or other similar mobile device.
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, specially designed ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various implementations can include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device.
These computer programs (also known as programs, software, software applications or code) include machine instructions for a programmable processor, and can be implemented in a high-level procedural and/or object-oriented programming language, and/or in assembly/machine language. As used herein, the terms “machine-readable medium” and “computer-readable medium” refer to any computer program product, apparatus and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term “machine-readable signal” refers to any signal used to provide machine instructions and/or data to a programmable processor.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to the user and a keyboard and a pointing device (e.g., a mouse or a trackball) by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user can be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front end component (e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back end, middleware, or front end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network (“LAN”), a wide area network (“WAN”), and the Internet.
The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
A number of implementations have been described. Nevertheless, it will be understood that various modifications may be made. For example, various forms of the flows shown above may be used, with steps re-ordered, added, or removed. Other implementations are within the scope of the following claims.

Claims (20)

What is claimed is:
1. A system for functionality-based content selection, comprising:
a data processing system including one or more processors to:
receive, from a first provider device, first input indicating a first media item eligible for presentation to client devices;
identify, for the first media item, a first presentation preference established for the first media item by the first provider device that configures the first media item to receive one of skippable content items or content items that are not skippable;
receive, responsive to execution of a script at a client device presenting the first media item, a request for content to be provided to the client device, the request generated to include an indication about the first media item, and the first presentation preference;
determine, from the indication in the request and based on a capability of the client device, that the client device presents the first media item established with the first presentation preference to receive skippable content items; and
select, responsive to the request and the capability of the client device, from among at least the skippable content items and the content items that are not skippable, a first skippable content item to provide with the first media item, the first skippable content item selected by the data processing system based on the determination that the client device presents the first media item configured to skip presentation of content items.
2. The system of claim 1, wherein the first skippable content item comprises audio content.
3. The system of claim 1, wherein the first provider device corresponds to a first source different than a second source of a second provider device.
4. The system of claim 1, wherein the first provider device corresponds to a first video stream different than a second video stream of a second provider device.
5. The system of claim 1, wherein the first provider device grants permission to the client device to access the first media item, and the client device accesses the first media item responsive to the permission.
6. The system of claim 1, wherein the first provider device establishes the first presentation preference responsive to the client device having a capability to skip content items.
7. The system of claim 1, wherein the first provider device establishes the first presentation preference responsive to the client device being permitted to access the first media item.
8. The system of claim 1, wherein the first provider device establishes the first presentation preference that configures the first media item to receive the one of the skippable content items or the content items that are not skippable based on a video channel type or a video stream type.
9. The system of claim 1, wherein the first provider device establishes the first presentation preference that configures the first media item to receive the one of the skippable content items or the content items that are not skippable based on an entity providing the first media item.
10. The system of claim 1, comprising:
the data processing system to receive the request comprising metadata about the first media item.
11. The system of claim 1, wherein the client device comprises a digital assistant.
12. The system of claim 1, comprising:
the data processing system to select the first media item based on a speech processing technique.
13. The system of claim 1, comprising:
the data processing system to select the first skippable content item based on a speech processing technique.
14. The system of claim 1, comprising:
the data processing system to identify keywords from speech at the client device based on at least one of keyword spotting or pattern recognition.
15. A method of selecting content based on functionality, comprising:
receiving, by a data processing system comprising one or more processors, from a first provider device, first input indicating a first media item eligible for presentation to client devices;
identifying, by the data processing system for the first media item, a first presentation preference established for the first media item by the first provider device that configures the first media item to receive one of skippable content items or content items that are not skippable;
receiving, by the data processing system responsive to execution of a script at a client device presenting the first media item, a request for content to be provided to the client device, the request generated to include an indication about the first media item, and the first presentation preference of the first media item;
determining, by the data processing system from the indication in the request and based on a capability of the client device, that the client device presents the first media item established with the first presentation preference to receive skippable content items; and
selecting, by the data processing system responsive to the request and the capability of the client device, from among at least the skippable content items and the content items that are not skippable, a first skippable content item to provide with the first media item, the first skippable content item selected by the data processing system based on the determination that the client device presents the first media item configured to skip presentation of content items.
16. The method of claim 15, wherein the first provider device corresponds to a first source different than a second source of a second provider device.
17. The method of claim 15, wherein the first provider device corresponds to a first video stream different than a second video stream of a second provider device.
18. The method of claim 15, wherein the first provider device establishes the first presentation preference responsive to the client device having a capability to skip content items.
19. The method of claim 15, wherein the first provider device establishes the first presentation preference that configures the first media item to receive the one of the skippable content items or the content items that are not skippable based on a video channel type or a video stream type.
20. The method of claim 15, wherein the first provider device establishes the first presentation preference that configures the first media item to receive the one of the skippable content items or the content items that are not skippable based on an entity providing the first media item.
US16/039,128 | 2007-06-27 | 2018-07-18 | Device functionality-based content selection | Active | US10748182B2 (en)

Priority Applications (3)

Application Number | Publication | Priority Date | Filing Date | Title
US16/039,128 | US10748182B2 (en) | 2007-06-27 | 2018-07-18 | Device functionality-based content selection
US16/943,383 | US11210697B2 (en) | 2007-06-27 | 2020-07-30 | Device functionality-based content selection
US17/552,565 | US11915263B2 (en) | 2007-06-27 | 2021-12-16 | Device functionality-based content selection

Applications Claiming Priority (5)

Application Number | Publication | Priority Date | Filing Date | Title
US94671707P | - | 2007-06-27 | 2007-06-27 | -
US11/770,585 | US8661464B2 (en) | 2007-06-27 | 2007-06-28 | Targeting in-video advertising
US14/164,719 | US9697536B2 (en) | 2007-06-27 | 2014-01-27 | Targeting in-video advertising
US15/638,306 | US10032187B2 (en) | 2007-06-27 | 2017-06-29 | Device functionality-based content selection
US16/039,128 | US10748182B2 (en) | 2007-06-27 | 2018-07-18 | Device functionality-based content selection

Related Parent Applications (1)

Application Number | Relation | Publication | Priority Date | Filing Date | Title
US15/638,306 | Continuation | US10032187B2 (en) | 2007-06-27 | 2017-06-29 | Device functionality-based content selection

Related Child Applications (1)

Application Number | Relation | Publication | Priority Date | Filing Date | Title
US16/943,383 | Continuation | US11210697B2 (en) | 2007-06-27 | 2020-07-30 | Device functionality-based content selection

Publications (2)

Publication Number | Publication Date
US20180322530A1 (en) | 2018-11-08
US10748182B2 (en) | 2020-08-18

Family

ID=40161713

Family Applications (6)

Application Number | Status | Publication | Priority Date | Filing Date | Title
US11/770,585 | Active (2029-10-16) | US8661464B2 (en) | 2007-06-27 | 2007-06-28 | Targeting in-video advertising
US14/164,719 | Active (2028-01-25) | US9697536B2 (en) | 2007-06-27 | 2014-01-27 | Targeting in-video advertising
US15/638,306 | Active | US10032187B2 (en) | 2007-06-27 | 2017-06-29 | Device functionality-based content selection
US16/039,128 | Active | US10748182B2 (en) | 2007-06-27 | 2018-07-18 | Device functionality-based content selection
US16/943,383 | Active | US11210697B2 (en) | 2007-06-27 | 2020-07-30 | Device functionality-based content selection
US17/552,565 | Active (2027-11-29) | US11915263B2 (en) | 2007-06-27 | 2021-12-16 | Device functionality-based content selection

Family Applications Before (3)

Application Number | Status | Publication | Priority Date | Filing Date | Title
US11/770,585 | Active (2029-10-16) | US8661464B2 (en) | 2007-06-27 | 2007-06-28 | Targeting in-video advertising
US14/164,719 | Active (2028-01-25) | US9697536B2 (en) | 2007-06-27 | 2014-01-27 | Targeting in-video advertising
US15/638,306 | Active | US10032187B2 (en) | 2007-06-27 | 2017-06-29 | Device functionality-based content selection

Family Applications After (2)

Application Number | Status | Publication | Priority Date | Filing Date | Title
US16/943,383 | Active | US11210697B2 (en) | 2007-06-27 | 2020-07-30 | Device functionality-based content selection
US17/552,565 | Active (2027-11-29) | US11915263B2 (en) | 2007-06-27 | 2021-12-16 | Device functionality-based content selection

Country Status (6)

Country | Link
US (6) | US8661464B2 (en)
EP (1) | EP2176821A4 (en)
AU (1) | AU2008268134B2 (en)
BR (1) | BRPI0812926A2 (en)
CA (1) | CA2692921A1 (en)
WO (1) | WO2009003162A2 (en)

US10430838B1 (en)2016-06-282019-10-01Snap Inc.Methods and systems for generation, curation, and presentation of media collections with automated advertising
US10733255B1 (en)2016-06-302020-08-04Snap Inc.Systems and methods for content navigation with automated curation
US10360708B2 (en)2016-06-302019-07-23Snap Inc.Avatar based ideogram generation
US10855632B2 (en)2016-07-192020-12-01Snap Inc.Displaying customized electronic messaging graphics
US10104417B2 (en)2016-07-262018-10-16At&T Mobility Ii LlcMethod and apparatus for sponsored messaging
US10171601B2 (en)*2016-08-212019-01-01International Business Machines CorporationAvailability-based video presentation management
WO2018045076A1 (en)2016-08-302018-03-08C3D Augmented Reality Solutions LtdSystems and methods for simultaneous localization and mapping
US10432559B2 (en)2016-10-242019-10-01Snap Inc.Generating and displaying customized avatars in electronic messages
US10684738B1 (en)*2016-11-012020-06-16Target Brands, Inc.Social retail platform and system with graphical user interfaces for presenting multiple content types
CN109952610B (en)2016-11-072021-01-08斯纳普公司Selective identification and ordering of image modifiers
US20180160158A1 (en)*2016-12-062018-06-07Bing LiuMethod and system for live stream broadcast and content monetization
US10203855B2 (en)2016-12-092019-02-12Snap Inc.Customized user-controlled media overlays
US11616745B2 (en)2017-01-092023-03-28Snap Inc.Contextual generation and selection of customized media content
US10454857B1 (en)2017-01-232019-10-22Snap Inc.Customized digital avatar accessories
US10915911B2 (en)2017-02-032021-02-09Snap Inc.System to determine a price-schedule to distribute media content
US11250075B1 (en)2017-02-172022-02-15Snap Inc.Searching social media content
US10319149B1 (en)2017-02-172019-06-11Snap Inc.Augmented reality anamorphosis system
US10074381B1 (en)2017-02-202018-09-11Snap Inc.Augmented reality speech balloon system
US10565795B2 (en)2017-03-062020-02-18Snap Inc.Virtual vision system
US10523625B1 (en)2017-03-092019-12-31Snap Inc.Restricted group content collection
US10582277B2 (en)2017-03-272020-03-03Snap Inc.Generating a stitched data stream
US10581782B2 (en)2017-03-272020-03-03Snap Inc.Generating a stitched data stream
US11170393B1 (en)2017-04-112021-11-09Snap Inc.System to calculate an engagement score of location based media content
US10387730B1 (en)2017-04-202019-08-20Snap Inc.Augmented reality typography personalization system
US10212541B1 (en)2017-04-272019-02-19Snap Inc.Selective location-based identity communication
US11893647B2 (en)2017-04-272024-02-06Snap Inc.Location-based virtual avatars
CN110800018A (en)2017-04-272020-02-14斯纳普公司Friend location sharing mechanism for social media platform
US10467147B1 (en)2017-04-282019-11-05Snap Inc.Precaching unlockable data elements
US10803120B1 (en)2017-05-312020-10-13Snap Inc.Geolocation based playlists
US10511692B2 (en)2017-06-222019-12-17Bank Of America CorporationData transmission to a networked resource based on contextual information
US10524165B2 (en)2017-06-222019-12-31Bank Of America CorporationDynamic utilization of alternative resources based on token association
US10313480B2 (en)2017-06-222019-06-04Bank Of America CorporationData transmission between networked resources
US11475254B1 (en)2017-09-082022-10-18Snap Inc.Multimodal entity identification
US10740974B1 (en)2017-09-152020-08-11Snap Inc.Augmented reality system
US10499191B1 (en)2017-10-092019-12-03Snap Inc.Context sensitive presentation of content
EP3474161A1 (en)*2017-10-172019-04-24Spotify ABPlayback of audio content along with associated non-static media content
US10573043B2 (en)2017-10-302020-02-25Snap Inc.Mobile-based cartographic control of display content
JP6463826B1 (en)*2017-11-272019-02-06株式会社ドワンゴ Video distribution server, video distribution method, and video distribution program
US11265273B1 (en)2017-12-012022-03-01Snap, Inc.Dynamic media overlay with smart widget
EP3609190A1 (en)2017-12-122020-02-12Spotify ABMethods, computer server systems and media devices for media streaming
US11017173B1 (en)2017-12-222021-05-25Snap Inc.Named entity recognition visual context and caption data
US10678818B2 (en)2018-01-032020-06-09Snap Inc.Tag distribution visualization system
US20190221208A1 (en)*2018-01-122019-07-18Kika Tech (Cayman) Holdings Co., LimitedMethod, user interface, and device for audio-based emoji input
US20190253751A1 (en)*2018-02-132019-08-15Perfect Corp.Systems and Methods for Providing Product Information During a Live Broadcast
US11507614B1 (en)2018-02-132022-11-22Snap Inc.Icon based tagging
US10979752B1 (en)2018-02-282021-04-13Snap Inc.Generating media content items based on location information
US10885136B1 (en)2018-02-282021-01-05Snap Inc.Audience filtering system
US10327096B1 (en)2018-03-062019-06-18Snap Inc.Geo-fence selection system
EP3766028A1 (en)2018-03-142021-01-20Snap Inc.Generating collectible items based on location information
US11163941B1 (en)2018-03-302021-11-02Snap Inc.Annotating a collection of media content items
US10219111B1 (en)2018-04-182019-02-26Snap Inc.Visitation tracking system
US10896197B1 (en)2018-05-222021-01-19Snap Inc.Event detection system
US10679393B2 (en)2018-07-242020-06-09Snap Inc.Conditional modification of augmented reality object
US10997760B2 (en)2018-08-312021-05-04Snap Inc.Augmented reality anthropomorphization system
US10698583B2 (en)2018-09-282020-06-30Snap Inc.Collaborative achievement interface
US10778623B1 (en)2018-10-312020-09-15Snap Inc.Messaging and gaming applications communication platform
US10939236B1 (en)2018-11-302021-03-02Snap Inc.Position service to determine relative position to map features
US11199957B1 (en)2018-11-302021-12-14Snap Inc.Generating customized avatars based on location information
US12411834B1 (en)2018-12-052025-09-09Snap Inc.Version control in networked environments
US11032670B1 (en)2019-01-142021-06-08Snap Inc.Destination sharing in location sharing system
US10939246B1 (en)2019-01-162021-03-02Snap Inc.Location-based context information sharing in a messaging system
US11294936B1 (en)2019-01-302022-04-05Snap Inc.Adaptive spatial density based clustering
US11972529B2 (en)2019-02-012024-04-30Snap Inc.Augmented reality system
US10936066B1 (en)2019-02-132021-03-02Snap Inc.Sleep detection in a location sharing system
US10838599B2 (en)2019-02-252020-11-17Snap Inc.Custom media overlay system
US10964082B2 (en)2019-02-262021-03-30Snap Inc.Avatar based on weather
US10852918B1 (en)2019-03-082020-12-01Snap Inc.Contextual information in chat
US12242979B1 (en)2019-03-122025-03-04Snap Inc.Departure time estimation in a location sharing system
US11868414B1 (en)2019-03-142024-01-09Snap Inc.Graph-based prediction for contact suggestion in a location sharing system
US11025976B2 (en)2019-03-182021-06-01At&T Intellectual Property I, L.P.System and method for state based content delivery to a client device
US11852554B1 (en)2019-03-212023-12-26Snap Inc.Barometer calibration in a location sharing system
US11249614B2 (en)2019-03-282022-02-15Snap Inc.Generating personalized map interface with enhanced icons
US11166123B1 (en)2019-03-282021-11-02Snap Inc.Grouped transmission of location data in a location sharing system
US10810782B1 (en)2019-04-012020-10-20Snap Inc.Semantic texture mapping system
US10582453B1 (en)2019-05-302020-03-03Snap Inc.Wearable device location systems architecture
US10560898B1 (en)2019-05-302020-02-11Snap Inc.Wearable device location systems
US10575131B1 (en)2019-05-302020-02-25Snap Inc.Wearable device location accuracy systems
US10893385B1 (en)2019-06-072021-01-12Snap Inc.Detection of a physical collision between two client devices in a location sharing system
US11134036B2 (en)2019-07-052021-09-28Snap Inc.Event planning in a content sharing platform
US11307747B2 (en)2019-07-112022-04-19Snap Inc.Edge gesture interface with smart interactions
US11250045B2 (en)*2019-09-102022-02-15Kyndryl, Inc.Media content modification
US11821742B2 (en)2019-09-262023-11-21Snap Inc.Travel based notifications
US11218838B2 (en)2019-10-312022-01-04Snap Inc.Focused map-based context information surfacing
US11429618B2 (en)2019-12-302022-08-30Snap Inc.Surfacing augmented reality objects
US11128715B1 (en)2019-12-302021-09-21Snap Inc.Physical friend proximity in chat
US11169658B2 (en)2019-12-312021-11-09Snap Inc.Combined map icon with action indicator
US11343323B2 (en)2019-12-312022-05-24Snap Inc.Augmented reality objects registry
US11228551B1 (en)2020-02-122022-01-18Snap Inc.Multiple gateway message exchange
US11172269B2 (en)2020-03-042021-11-09Dish Network L.L.C.Automated commercial content shifting in a video streaming system
US11516167B2 (en)2020-03-052022-11-29Snap Inc.Storing data based on device location
US11619501B2 (en)2020-03-112023-04-04Snap Inc.Avatar based on trip
US10956743B1 (en)2020-03-272021-03-23Snap Inc.Shared augmented reality system
US11430091B2 (en)2020-03-272022-08-30Snap Inc.Location mapping for large scale augmented-reality
US11411900B2 (en)2020-03-302022-08-09Snap Inc.Off-platform messaging system
US11314776B2 (en)2020-06-152022-04-26Snap Inc.Location sharing using friend list versions
US11290851B2 (en)2020-06-152022-03-29Snap Inc.Location sharing using offline and online objects
US11503432B2 (en)2020-06-152022-11-15Snap Inc.Scalable real-time location sharing framework
US11483267B2 (en)2020-06-152022-10-25Snap Inc.Location sharing using different rate-limited links
US11308327B2 (en)2020-06-292022-04-19Snap Inc.Providing travel-based augmented reality content with a captured image
US11349797B2 (en)2020-08-312022-05-31Snap Inc.Co-location connection service
US11533543B1 (en)*2021-03-182022-12-20Twitch Interactive, Inc.Community boosting of stream visibility
US11606756B2 (en)2021-03-292023-03-14Snap Inc.Scheduling requests for location data
US11645324B2 (en)2021-03-312023-05-09Snap Inc.Location-based timeline media content system
US12026362B2 (en)2021-05-192024-07-02Snap Inc.Video editing application for mobile devices
US12141826B2 (en)2021-06-232024-11-12Rivit TV, Inc.Systems and methods for alternative adverts
US11917263B2 (en)2021-06-232024-02-27Rivit TV, Inc.Device, method, and graphical user interface for alternative advert system
US12056735B2 (en)*2021-06-232024-08-06Rivit TV, Inc.Systems and methods of providing alternative advert (AA) query items and AA offers for an AA system
US12294767B2 (en)2021-06-232025-05-06Rivit TV, Inc.Systems and methods for alternative adverts
US12166839B2 (en)2021-10-292024-12-10Snap Inc.Accessing web-based fragments for display
US11829834B2 (en)2021-10-292023-11-28Snap Inc.Extended QR code
US12001750B2 (en)2022-04-202024-06-04Snap Inc.Location-based shared augmented reality experience system
US12243167B2 (en)2022-04-272025-03-04Snap Inc.Three-dimensional mapping using disparate visual datasets
US12164109B2 (en)2022-04-292024-12-10Snap Inc.AR/VR enabled contact lens
US11973730B2 (en)2022-06-022024-04-30Snap Inc.External messaging function for an interaction system
US12020384B2 (en)2022-06-212024-06-25Snap Inc.Integrating augmented reality experiences with other components
US12020386B2 (en)2022-06-232024-06-25Snap Inc.Applying pregenerated virtual experiences in new location
US11948172B2 (en)*2022-07-082024-04-02Roku, Inc.Rendering a dynamic endemic banner on streaming platforms using content recommendation systems and content affinity modeling
US12265664B2 (en)2023-02-282025-04-01Snap Inc.Shared augmented reality eyewear device with hand tracking alignment
US12361664B2 (en)2023-04-192025-07-15Snap Inc.3D content display using head-wearable apparatuses

Citations (41)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
WO2001050296A2 (en)1999-12-302001-07-12Nokia CorporationSelective media stream advertising technique
US6275806B1 (en)1999-08-312001-08-14Andersen Consulting, LlpSystem method and article of manufacture for detecting emotion in voice signals by utilizing statistics for voice signal parameters
US20020090198A1 (en)2000-12-272002-07-11Scott RosenbergAdvertisements in a television recordation system
US20020100042A1 (en)2000-01-192002-07-25Denis KhooMethod and system for providing intelligent advertisement placement in a motion picture
US20020144262A1 (en)2001-04-032002-10-03Plotnick Michael A.Alternative advertising in prerecorded media
US20030112258A1 (en)2001-12-132003-06-19International Business Machines CorporationStreaming internet media record and playback software program
US20030149975A1 (en)2002-02-052003-08-07Charles ElderingTargeted advertising in on demand programming
US20030149618A1 (en)2002-02-012003-08-07Microsoft CorporationFlexible dynamic advertising
US6684249B1 (en)2000-05-262004-01-27Sonicbox, Inc.Method and system for adding advertisements over streaming audio based upon a user profile over a world wide area network of computers
US20040030599A1 (en)2002-06-252004-02-12Svod LlcVideo advertising
US6857007B1 (en)2000-08-302005-02-15Bloomfield Enterprises, LlcPersonal digital assistant facilitated communication system
US20050171843A1 (en)2004-02-032005-08-04Robert BrazellSystems and methods for optimizing advertising
US20060074752A1 (en)2004-10-052006-04-06David NewmarkMethod of facilitating placement of advertising
US7085732B2 (en)2001-09-182006-08-01Jedd Adam GouldOnline trading for the placement of advertising in media
US20060179453A1 (en)2005-02-072006-08-10Microsoft CorporationImage and other analysis for contextual ads
US20060212897A1 (en)2005-03-182006-09-21Microsoft CorporationSystem and method for utilizing the content of audio/video files to select advertising content for display
US20070055986A1 (en)2005-05-232007-03-08Gilley Thomas SMovie advertising placement optimization based on behavior and content analysis
US20070078718A1 (en)2005-05-202007-04-05Anchorfree, Inc.System and method for monetizing internet usage
US20070097975A1 (en)2005-11-022007-05-03Sbc Knowledge Ventures, L.P.Service to push author-spoken audio content with targeted audio advertising to users
US20070127688A1 (en)2006-02-102007-06-07Spinvox LimitedMass-Scale, User-Independent, Device-Independent Voice Messaging System
US20070162335A1 (en)2006-01-112007-07-12Mekikian Gary CAdvertiser Sponsored Media Download and Distribution Using Real-Time Ad and Media Matching and Concatenation
US20080004962A1 (en)2006-06-302008-01-03Muthukrishnan ShanmugavelayuthSlot preference auction
US20080066107A1 (en)2006-09-122008-03-13Google Inc.Using Viewing Signals in Targeted Video Advertising
US20080152300A1 (en)*2006-12-222008-06-26Guideworks, LlcSystems and methods for inserting advertisements during commercial skip
WO2008137696A1 (en)2007-05-022008-11-13Google Inc.User interfaces for web-based video player
WO2008137482A1 (en)2007-05-022008-11-13Google Inc.Animated video overlay
US20080300872A1 (en)*2007-05-312008-12-04Microsoft CorporationScalable summaries of audio or visual content
US20090006191A1 (en)2007-06-272009-01-01Google Inc.Targeting in-video advertising
US7806329B2 (en)2006-10-172010-10-05Google Inc.Targeted video advertising
US7853255B2 (en)2004-04-162010-12-14Broadcom CorporationDigital personal assistance via a broadband access gateway
US8195133B2 (en)2005-09-142012-06-05Jumptap, Inc.Mobile dynamic advertisement creation and placement
US20120265528A1 (en)2009-06-052012-10-18Apple Inc.Using Context Information To Facilitate Processing Of Commands In A Virtual Assistant
US8386386B1 (en)2009-01-052013-02-26Sprint Communications Company L.P.Phone usage pattern as credit card fraud detection trigger
US20130117022A1 (en)2010-01-182013-05-09Apple Inc.Personalized Vocabulary for Digital Assistant
US20130275164A1 (en)2010-01-182013-10-17Apple Inc.Intelligent Automated Assistant
US20130304758A1 (en)2012-05-142013-11-14Apple Inc.Crowd Sourcing Information to Fulfill User Requests
US20170092278A1 (en)2015-09-302017-03-30Apple Inc.Speaker recognition
US20170110130A1 (en)2015-10-162017-04-20Google Inc.Hotword recognition
US20170110144A1 (en)2015-10-162017-04-20Google Inc.Hotword recognition
US20170132019A1 (en)2015-11-062017-05-11Apple Inc.Intelligent automated assistant in a messaging environment
US20170358301A1 (en)2016-06-102017-12-14Apple Inc.Digital assistant providing whispered speech

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US20030190961A1 (en)*2002-02-072003-10-09Seidman Charles B.DVD and method of using the same
CA2672735A1 (en)*2006-12-132008-06-19Quickplay Media Inc.Mobile media platform

Patent Citations (48)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US6275806B1 (en)1999-08-312001-08-14Andersen Consulting, LlpSystem method and article of manufacture for detecting emotion in voice signals by utilizing statistics for voice signal parameters
WO2001050296A2 (en)1999-12-302001-07-12Nokia CorporationSelective media stream advertising technique
US20020100042A1 (en)2000-01-192002-07-25Denis KhooMethod and system for providing intelligent advertisement placement in a motion picture
US6684249B1 (en)2000-05-262004-01-27Sonicbox, Inc.Method and system for adding advertisements over streaming audio based upon a user profile over a world wide area network of computers
US6857007B1 (en)2000-08-302005-02-15Bloomfield Enterprises, LlcPersonal digital assistant facilitated communication system
US20020090198A1 (en)2000-12-272002-07-11Scott RosenbergAdvertisements in a television recordation system
US20020144262A1 (en)2001-04-032002-10-03Plotnick Michael A.Alternative advertising in prerecorded media
US7085732B2 (en)2001-09-182006-08-01Jedd Adam GouldOnline trading for the placement of advertising in media
US20030112258A1 (en)2001-12-132003-06-19International Business Machines CorporationStreaming internet media record and playback software program
US20030149618A1 (en)2002-02-012003-08-07Microsoft CorporationFlexible dynamic advertising
US20030149975A1 (en)2002-02-052003-08-07Charles ElderingTargeted advertising in on demand programming
US20040030599A1 (en)2002-06-252004-02-12Svod LlcVideo advertising
US20050171843A1 (en)2004-02-032005-08-04Robert BrazellSystems and methods for optimizing advertising
US7853255B2 (en)2004-04-162010-12-14Broadcom CorporationDigital personal assistance via a broadband access gateway
US20060074752A1 (en)2004-10-052006-04-06David NewmarkMethod of facilitating placement of advertising
US20060179453A1 (en)2005-02-072006-08-10Microsoft CorporationImage and other analysis for contextual ads
US20060212897A1 (en)2005-03-182006-09-21Microsoft CorporationSystem and method for utilizing the content of audio/video files to select advertising content for display
US20070078718A1 (en)2005-05-202007-04-05Anchorfree, Inc.System and method for monetizing internet usage
US20070055986A1 (en)2005-05-232007-03-08Gilley Thomas SMovie advertising placement optimization based on behavior and content analysis
US8195133B2 (en)2005-09-142012-06-05Jumptap, Inc.Mobile dynamic advertisement creation and placement
US20070097975A1 (en)2005-11-022007-05-03Sbc Knowledge Ventures, L.P.Service to push author-spoken audio content with targeted audio advertising to users
US20070162335A1 (en)2006-01-112007-07-12Mekikian Gary CAdvertiser Sponsored Media Download and Distribution Using Real-Time Ad and Media Matching and Concatenation
US20070127688A1 (en)2006-02-102007-06-07Spinvox LimitedMass-Scale, User-Independent, Device-Independent Voice Messaging System
US20080004962A1 (en)2006-06-302008-01-03Muthukrishnan ShanmugavelayuthSlot preference auction
US20080066107A1 (en)2006-09-122008-03-13Google Inc.Using Viewing Signals in Targeted Video Advertising
US7806329B2 (en)2006-10-172010-10-05Google Inc.Targeted video advertising
US20080152300A1 (en)*2006-12-222008-06-26Guideworks, LlcSystems and methods for inserting advertisements during commercial skip
WO2008137696A1 (en)2007-05-022008-11-13Google Inc.User interfaces for web-based video player
US20130247096A1 (en)2007-05-022013-09-19Google Inc.User Interfaces For Web-Based Video Player
US8468562B2 (en)2007-05-022013-06-18Google Inc.User interfaces for web-based video player
WO2008137482A1 (en)2007-05-022008-11-13Google Inc.Animated video overlay
US8281332B2 (en)2007-05-022012-10-02Google Inc.Animated video overlays
US20120320091A1 (en)2007-05-022012-12-20Google Inc.Animated Video Overlays
US8310443B1 (en)2007-05-022012-11-13Google Inc.Pie chart time indicator
US20080300872A1 (en)*2007-05-312008-12-04Microsoft CorporationScalable summaries of audio or visual content
US20090006191A1 (en)2007-06-272009-01-01Google Inc.Targeting in-video advertising
US8661464B2 (en)2007-06-272014-02-25Google Inc.Targeting in-video advertising
US8386386B1 (en)2009-01-052013-02-26Sprint Communications Company L.P.Phone usage pattern as credit card fraud detection trigger
US20120265528A1 (en)2009-06-052012-10-18Apple Inc.Using Context Information To Facilitate Processing Of Commands In A Virtual Assistant
US8903716B2 (en)2010-01-182014-12-02Apple Inc.Personalized vocabulary for digital assistant
US20130117022A1 (en)2010-01-182013-05-09Apple Inc.Personalized Vocabulary for Digital Assistant
US20130275164A1 (en)2010-01-182013-10-17Apple Inc.Intelligent Automated Assistant
US20130304758A1 (en)2012-05-142013-11-14Apple Inc.Crowd Sourcing Information to Fulfill User Requests
US20170092278A1 (en)2015-09-302017-03-30Apple Inc.Speaker recognition
US20170110130A1 (en)2015-10-162017-04-20Google Inc.Hotword recognition
US20170110144A1 (en)2015-10-162017-04-20Google Inc.Hotword recognition
US20170132019A1 (en)2015-11-062017-05-11Apple Inc.Intelligent automated assistant in a messaging environment
US20170358301A1 (en)2016-06-102017-12-14Apple Inc.Digital assistant providing whispered speech

Non-Patent Citations (94)

* Cited by examiner, † Cited by third party
Title
"Adsense" Jun. 21, 2007, from Wikipedia, the free encyclopedia. Retrieved from the Internet: URL:http://en.wikipedia.org/w/index.php?title=Adsense&oldid=139743978[retrieved on Apr. 9, 2013], XP055058840, 4 pages.
"Walmart and Google to offer voice-enabled shopping", BBC News, Aug. 23, 2017 (10 pages).
"YouTube Embedded Player Featuring Google Video Ads?", Jun. 8, 2007, Xp055058826, Retrieved from the Internet: URL:http://www.searchenginejournal.com/new-youtube-embedded-player-features-google-video-ads/5064 [retrieved on Apr. 9, 2013], 8 pages.
Abrams, Help users find, interact & re-engage with your app on the Google Assistant, Google Developers Blog, Nov. 15, 2017 (16 pages).
Albrecht, "Alexa, How Can You Be Used in Restaurants?", the spoon, Dec. 10, 2017 (6 pages).
Amazon, "Echo Look | Hands-Free Camera and Style Assistant", reprinted from https://www.amazon.com/gp/product/B0186JAEWK?ref%5F=cm%5Fsw%5Fr%5Ffa%5Fdp%5Ft2%5FC5oazbJTKCB18&pldnSite=1 on Aug. 22, 2017 (7 pages).
Barr, "AWS DeepLens-Get Hands-On Experience with Deep Learning With Our New Video Camera", AWS News Blog, Nov. 29, 2017 (11 pages).
Barr, "AWS DeepLens—Get Hands-On Experience with Deep Learning With Our New Video Camera", AWS News Blog, Nov. 29, 2017 (11 pages).
Berg J: "Google AdSense to Benefit YouTube", Oct. 10, 2006, Xp002574898, Retrieved from the Internet: URL:http://www.imediaconnection.com/content/11634.imc [retrieved on Mar. 22, 2010] p. 1.
Broussard, Mitchel, "Chatbot-Like Siri Patent Includes Intelligent Image, Video, and Audio Recognition Within Messages", May 11, 2017, 11 pages.
Buckland et al., "Amazon's Alexa Takes Open-Source Route to Beat Google Into Cars", Bloomberg, Feb. 27, 2018 (6 pages).
Chen, Yilun Lulu, "Alibaba Challenges Google, Amazon With New Echo-Like Device", Bloomberg, Jul. 5, 2017, 3 pages.
Close, "Amazon Echo Will Give You These Deals If You Order Through Alexa This Weekend," Web Article, Nov. 18, 2016, Time.com (2 pages).
Clover, Juli, "Amazon and Google Want to Turn Their Smart Home Speakers Into Telephone Replacements", MacRumors, Feb. 15, 2017 (5 pages).
Coberly, "Apple patent filing reveals potential whispering Siri functionality", Techspot, Dec. 14, 2017 (4 pages).
Collins, et al., "Can Twitter Save Itself?", cnet, Apr. 26, 2017, reprinted from https://www.cnet.com/news/twitter-q1-2017-earnings/ on Aug. 22, 2017 (2 pages).
Communication Pursuant to Article 94(3) EPC for EPO Appl. Ser. No. 08796039.9 dated May 10, 2013 (5 pages).
Cook, "A Siri for advertising: These mobile ads talk back to you," Web Article, Apr. 1, 2013, Geekwire.com (7 pages).
Crist, Ry, "Logitech Harmony's Alexa Skill just got a whole lot better", cnet, Jul. 13, 2017 (2 pages).
Estes, "Amazon's Newest Gadget Is a Tablet That's Also an Echo", Gizmodo, Sep. 19, 2017 (3 pages).
Examiner's First Report for AU Patent Application No. 2008268134, dated Apr. 16, 2012, 7 pages.
Foghorn Labs, "10 Tips to Improve the Performance of Google Product Listing Ads," printed from Internet address: http://www.foghornlabs.com/2012/11/21/product-listing-ads-best-practices/, on Mar. 18, 2013 (5 pages).
Forrest, Conner, "Essential Home wants to be 'bridge' between Amazon Alexa, Apple's Siri, and Google Assistant," TechRepublic, May 31, 2017, 9 pages.
Forrest, Conner, "Essential Home wants to be ‘bridge’ between Amazon Alexa, Apple's Siri, and Google Assistant," TechRepublic, May 31, 2017, 9 pages.
Foxx, Chris, "Apple reveals HomePod smart speaker", BBC, Jun. 5, 2017, 9 pages.
Gebhart, Andrew, "Google Assistant is spreading, but it needs its own 'Echo Dot'", Cnet, May 20, 2017, 6 pages.
Gebhart, Andrew, "Google Home to the Amazon Echo: 'Anything you can do . . . '", cnet, May 18, 2017 (7 pages).
Gebhart, Andrew, "Google Assistant is spreading, but it needs its own ‘Echo Dot’", Cnet, May 20, 2017, 6 pages.
Gebhart, Andrew, "Google Home to the Amazon Echo: ‘Anything you can do . . . ’", cnet, May 18, 2017 (7 pages).
Gibbs, Samuel, "Your Facebook messenger app is about to be filled with ads", The Guardian, Jul. 12, 2017 (3 pages).
Golgowski, Nina, "This Burger King Ad Is Trying to Control Your Google Home Device", Huffpost, Apr. 12, 2017 (7 pages).
Google Developers Newsletter "Google Assistant SDK", developers.google.com, retrieved on Jul. 12, 2017, 2 pages.
Google Inc., "Products Feed Specification," printed from Internet address: http://www.support.google.com/merchants/bin/answer.py?hl=en&answer=188494#US, on Mar. 18, 2013 (6 pages).
Google Inc., "Supported File Formats," printed from Internet address: http://www.support.google.com/merchants/bin/answer.py?hl=en&answer=160567, on Mar. 18, 2013 (1 page).
Gurman, Mark and Webb, Alex, "Apple Is Manufacturing a Siri Speaker to Outdo Google and Amazon", Bloomberg, May 31, 2017, 3 pages.
Hardwick, Tim, "Facebook Smart Speaker Coming Next Year With 15-inch Touch Panel", MacRumors, Jul. 25, 2017 (5 pages).
Heater, "Amazon Alexa devices can finally tell voices apart", TechCrunch, Oct. 11, 2017 (6 pages).
International Preliminary Report on Patentability issued in Appl. Ser. No. PCT/US2008/068552 dated Jan. 14, 2010 (7 pages).
International Search Report and Written Opinion for International Application No. PCT/US2008/068552, dated Jan. 15, 2009 (12 pages).
Johnston, "Amazon Whirlwind: New Echo, Plus, Spot, Connect, Fire TV Take The Stage", Twice, Sep. 27, 2017 (4 pages).
Kelion, "Amazon revamps Echo smart speaker family", BBC News, Sep. 27, 2017 (11 pages).
Kelion, Leo, "Amazon's race to make Alexa smarter", BBC News, Jul. 28, 2017 (8 pages).
Koetsier, John, "Ads on Amazon Echo: Wendy's, ESPN, and Progressive Among Brands Testing", Forbes, May 11, 2017, 3 pages.
Krishna, "Jim Beam's smart decanter will pour you a shot when you ask", engadget, Nov. 29, 2017 (3 pages).
Lacy, "Improving search and advertising are the next frontiers for voice-activated devices", TechCrunch, Dec. 20, 2017 (13 pages).
Larson, Selena, "Google Home now recognizes your individual voice", CNN Money, San Francisco, California, Apr. 20, 2017 (3 pages).
Lee, Dave, "The five big announcements from Google I/O", BBC, May 18, 2017, 11 pages.
Lee, Take Two for Samsung's troubled Bixby assistant, BBC News, Oct. 19, 2017 (6 pages).
Lowe, "Object Recognition from Local Scale-Invariant Features", Proceedings of the Seventh IEEE International Conference on Computer Vision, vol. 2, Sep. 1999, pp. 1150-1157, [online][retrieved on Feb. 25, 2009][retrieved from: http://www.cs.ubc.ca/˜lowe/papers/iccv99.pdf] pp. 1-8.
Lund, Pamela, Mastering Google Product Feeds and Product Listing Ads 2013 Part 1, found at http://www.blueglass.com/blog/mastering-google-product-feeds-and-product-listing-ads-part-1/#comments, Dec. 28, 2013 (17 pages).
Nieva, Richard, "Google Home and eBay can tell you how much that's worth", cnet, Mar. 8, 2017 (3 pages).
Novet, et al., "Amazon is getting ready to bring Alexa to work", CNBC, Nov. 29, 2017 (4 pages).
Office Action for Canadian Appl. Ser. No. 2692921 dated Mar. 24, 2016 (6 pages).
Office Action for Indian Patent Application No. 2412/MUMNP/2009 dated Mar. 13, 2018.
Office Action issued in Canadian Application Ser. No. 2,692,921 dated Feb. 12, 2015 (4 pages).
Palladino, "Garmin teamed up with Amazon to make a tiny Echo Dot for your car", ars Technica, Oct. 17, 2017 (2 pages).
Patently Apple, "Apple Patent Reveals a New Security Feature Coming to Siri", Apr. 4, 2017, reprinted from http://www.patentlyapple.com/patently-apple/2017/04/apple-patent-reveals-a-new-security-feature-coming-to-siri.html, on Aug. 22, 2017 (6 pages).
Patently Mobile, "The Patent behind Google Home's new Feature of Understanding Different Voices in the Home Surfaced Today", Apr. 20, 2017, reprinted from http://www.patentlymobile.com/2017/04/the-patent-behind-google-homes-new-feature-of-understanding-different-voices-in-the-home-surfaced-today.html, on Aug. 22, 2017 (3 pages).
Perez, "Alexa's 'Routines' will combine smart home control with other actions, like delivering your news and weather", TechCrunch, Sep. 28, 2017 (10 pages).
Perez, "Alexa's ‘Routines’ will combine smart home control with other actions, like delivering your news and weather", TechCrunch, Sep. 28, 2017 (10 pages).
Perez, Sarah, "The first ad network for Alexa Skills shuts down following Amazon's policy changes", Tech Crunch, Jun. 15, 2017, 8 pages.
Porter, Jon, "Amazon Echo Show release date, price, news and features", Tech Radar, Jun. 26, 2017, 11 pages.
Pringle, "'I'm sorry to hear that': Why training Siri to be a therapist won't be easy", CBC News, Sep. 24, 2017 (3 pages).
Pringle, "‘I'm sorry to hear that’: Why training Siri to be a therapist won't be easy", CBC News, Sep. 24, 2017 (3 pages).
Purcher, Jack, Today Google Home's Virtual Assistant can learn its Owner's voice for Security Reasons like Apple's Patent Pending Idea, Apr. 20, 2017 (4 pages).
Sablich, Justin, "Planning a Trip With the Help of Google Home", New York Times, dated May 31, 2017, 6 pages.
Seifert, Dan, "Samsung's new virtual assistant will make using your phone easier", The Verge, Mar. 20, 2017 (6 pages).
Sherr, Ian, "IBM built a voice assistant for cybersecurity", cnet, Feb. 13, 2017 (2 pages).
Siegal, Daniel, "IP Attys Load Up Apps' Legal Challenges at 'Silicon Beach'", Law360, Los Angeles, California, Feb. 2, 2017 (4 pages).
Siegal, Daniel, "IP Attys Load Up Apps' Legal Challenges at ‘Silicon Beach’", Law360, Los Angeles, California, Feb. 2, 2017 (4 pages).
Simonite, "How Alexa, Siri, and Google Assistant Will Make Money Off You," Web Article, May 31, 2016, technologyreview.com (11 pages).
Simonite, "How Assistant Could End Up Eating Google's Lunch," Web Article, Sep. 23, 2016, technologyreview.com (9 pages).
Smith, Dave, "The Amazon Echo got 2 incredibly useful features thanks to a new update", Business Insider, Jun. 1, 2017, 2 pages.
Supplementary European Search Report for EPO Appl. Ser. No. 08796039.9 dated Apr. 22, 2013 (4 pages).
U.S. Appl. No. 11/479,942, filed Jan. 3, 2008, Muthukrishnan et al.
U.S. Appl. No. 11/550,249, filed Apr. 17, 2008, Dmitriev et al.
U.S. Appl. No. 11/550,388, filed Mar. 13, 2008, Moonka et al.
U.S. Appl. No. 60/915,654, Klein et al.
U.S. Final Office Action on U.S. Appl. No. 15/638,306, dated Feb. 26, 2018, 9 pages.
U.S. Notice of Allowance on U.S. Appl. No. 11/770,585 dated Oct. 21, 2013 (21 pages).
U.S. Notice of Allowance on U.S. Appl. No. 14/164,719 dated Mar. 1, 2017 (11 pages).
U.S. Notice of Allowance on U.S. Appl. No. 15/638,306, dated Mar. 26, 2018, 11 pages.
U.S. Office Action on U.S. Appl. No. 15/638,306, dated Sep. 19, 2017, 13 pages.
U.S. Office Action on U.S. Appl. No. 11/770,585 dated Dec. 10, 2009 (25 pages).
U.S. Office Action on U.S. Appl. No. 11/770,585 dated Jan. 13, 2013 (32 pages).
U.S. Office Action on U.S. Appl. No. 11/770,585 dated Jan. 24, 2012 (21 pages).
U.S. Office Action on U.S. Appl. No. 11/770,585 dated Jun. 5, 2012 (28 pages).
U.S. Office Action on U.S. Appl. No. 11/770,585 dated May 15, 2009 (26 pages).
U.S. Office Action on U.S. Appl. No. 14/164,719 dated Mar. 26, 2015 (13 pages).
U.S. Office Action on U.S. Appl. No. 14/164,719 dated Oct. 28, 2014 (12 pages).
U.S. Office Action on U.S. Appl. No. 14/164,719 dated Sep. 1, 2016 (22 pages).
Unknown Author, "'Dolphin' attacks fool Amazon, Google voice assistants", BBC News, Sep. 7, 2017 (8 pages).
Unknown Author, "‘Dolphin’ attacks fool Amazon, Google voice assistants", BBC News, Sep. 7, 2017 (8 pages).
Willens, Max, "For publishers, Amazon Alexa holds promise but not much money (yet)", Digiday, Jul. 6, 2017, 5 pages,

Also Published As

Publication number | Publication date
US11210697B2 (en)2021-12-28
US11915263B2 (en)2024-02-27
US20090006191A1 (en)2009-01-01
US9697536B2 (en)2017-07-04
US10032187B2 (en)2018-07-24
US20200357020A1 (en)2020-11-12
EP2176821A2 (en)2010-04-21
US20140143800A1 (en)2014-05-22
US20180322530A1 (en)2018-11-08
AU2008268134A1 (en)2008-12-31
AU2008268134B2 (en)2012-06-07
BRPI0812926A2 (en)2014-12-09
EP2176821A4 (en)2013-05-22
US8661464B2 (en)2014-02-25
US20170300965A1 (en)2017-10-19
WO2009003162A2 (en)2008-12-31
US20220108351A1 (en)2022-04-07
CA2692921A1 (en)2008-12-31
WO2009003162A3 (en)2009-03-19

Similar Documents

Publication | Publication Date | Title
US11915263B2 (en)Device functionality-based content selection
US20080276266A1 (en)Characterizing content for identification of advertising
US8433611B2 (en)Selection of advertisements for placement with content
US10299015B1 (en)Time-based content presentation
US11778272B2 (en)Delivery of different services through different client devices
US8315423B1 (en)Providing information in an image-based information retrieval system
EP3346715B1 (en)Multifunction multimedia device
US9043828B1 (en)Placing sponsored-content based on images in video content
US9106979B2 (en)Sentiment mapping in a media content item
WO2018028533A1 (en)Media information publishing method, terminal, server, system and storage medium
US20170213248A1 (en)Placing sponsored-content associated with an image
US20090172727A1 (en)Selecting advertisements to present
AU2010287064B2 (en)Informational content scheduling system and method
US8346604B2 (en)Facilitating bidding on images
US10681427B2 (en)Sentiment mapping in a media content item
US20220036407A1 (en)Methods and systems to increase user engagement with advertisements

Legal Events

Date | Code | Title | Description
FEPP: Fee payment procedure

Free format text:ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

AS: Assignment

Owner name:GOOGLE INC., CALIFORNIA

Free format text:ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ARANKALLE, POORVA;FINGER, BRIENNE;LIAO, LIN;AND OTHERS;SIGNING DATES FROM 20070625 TO 20070627;REEL/FRAME:046416/0873

Owner name:GOOGLE LLC, CALIFORNIA

Free format text:CHANGE OF NAME;ASSIGNOR:GOOGLE INC.;REEL/FRAME:046599/0677

Effective date:20170929

STPP: Information on status: patent application and granting procedure in general

Free format text:DOCKETED NEW CASE - READY FOR EXAMINATION

STPP: Information on status: patent application and granting procedure in general

Free format text:NON FINAL ACTION MAILED

STPP: Information on status: patent application and granting procedure in general

Free format text:RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP: Information on status: patent application and granting procedure in general

Free format text:NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP: Information on status: patent application and granting procedure in general

Free format text:PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF: Information on status: patent grant

Free format text:PATENTED CASE

MAFP: Maintenance fee payment

Free format text:PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment:4

