
User identification and personalization based on automotive identifiers

Info

Publication number: US9324002B2
Authority: US (United States)
Prior art keywords: user, vehicle, identification information, data, sale
Legal status: Active, expires (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application number: US13/706,678
Other versions: US20130216102A1 (en)
Inventors: Michael Joseph Ryan, Christopher Dennis Boncimino
Current assignee: PayPal Inc (the listed assignees may be inaccurate; Google has not performed a legal analysis)
Original assignee: PayPal Inc
Assignment history: assigned by the inventors to eBay Inc.; subsequently assigned by eBay Inc. to PayPal, Inc.
Related filings: CA2866482C; PCT/US2013/027426 (WO2013126772A1); AU2013222234B2; continuation US15/066,533 (US9996861B2)
Status: application granted; active


Abstract

A system and method for user identification and personalization based on automotive identifiers are described. Image data of a vehicle is received from an image capture device. Vehicle identification information is extracted from the image data. A data record associated with a user is retrieved using the vehicle identification information. A personalized communication for the user is generated based on the retrieved data record. The personalized communication may be transmitted to a device. The personalized communication may comprise a recommendation.

Description

CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims priority to U.S. Provisional Application No. 61/601,972, filed on Feb. 22, 2012, and entitled, “USER IDENTIFICATION AND PERSONALIZATION BASED ON AUTOMOTIVE IDENTIFIERS,” which is hereby incorporated by reference in its entirety as if set forth herein.
TECHNICAL FIELD
The present application relates generally to the technical field of information retrieval, and, in various embodiments, to systems and methods of user personalization based on automotive identifiers.
BACKGROUND
Certain segments of the retail industry attempt to provide users with quick and convenient methods to acquire goods and services. One known method is drive-through service, in which retailers, predominantly restaurants, offer users the ability to place and receive orders from automobiles. While drive-through service offers convenience for a user, the user still must peruse a menu, place an order, and pay for the order.
BRIEF DESCRIPTION OF THE DRAWINGS
Some embodiments of the present disclosure are illustrated by way of example and not limitation in the figures of the accompanying drawings, in which like reference numbers indicate similar elements and in which:
FIG. 1 is a block diagram depicting a network architecture of a system, according to some embodiments, having a client-server architecture configured for exchanging data over a network;
FIG. 2 is a block diagram illustrating an example embodiment of a personalization system;
FIG. 3 illustrates an example embodiment of an implementation of a personalization system;
FIG. 4 illustrates another example embodiment of an implementation of a personalization system;
FIG. 5 is a flowchart illustrating an example embodiment of a method for user personalization;
FIG. 6 is a flowchart illustrating another example embodiment of a method for user personalization; and
FIG. 7 shows a diagrammatic representation of a machine in the example form of a computer system within which a set of instructions may be executed to cause the machine to perform any one or more of the methodologies discussed herein.
DETAILED DESCRIPTION
The description that follows includes illustrative systems, methods, techniques, instruction sequences, and computing machine program products that embody illustrative embodiments. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide an understanding of various embodiments of the inventive subject matter. It will be evident, however, to those skilled in the art that embodiments of the inventive subject matter may be practiced without these specific details. In general, well-known instruction instances, protocols, structures, and techniques have not been shown in detail.
Methods and systems for user identification and personalization based on automotive identifiers are disclosed. In some embodiments, image data of a vehicle may be received from an image capture device. Vehicle identification information may then be extracted from the image data. A data record associated with a user may be retrieved using the vehicle identification information. A personalized communication may then be generated based on the retrieved data record.
In some embodiments, the image data comprises a still image or video. In some embodiments, the vehicle identification information comprises at least one of a license plate number of the vehicle, a make of the vehicle, a model of the vehicle, and a color of the vehicle. In some embodiments, user identification information is extracted from the image data. The user identification information may be used along with the vehicle identification information in retrieving the data record. In some embodiments, the personalized communication is presented on a display proximate the user. In some embodiments, the image capture device may be located proximate a point-of-sale device that is configured to complete a transaction for an item. In some embodiments, the data record comprises a history of transactions for the user or preferences of the user. In some embodiments, the personalized communication is a recommendation related to at least one item for sale.
In some embodiments, a system comprises at least one processor, an imaging module, a database interface module, and a personalized communication module. The imaging module may be executable by the at least one processor and configured to receive image data of a vehicle from an image capture device and to extract vehicle identification information from the image data. The database interface module may be executable by the at least one processor and configured to use the vehicle identification information to retrieve a data record associated with a user. The personalized communication module may be executable by the at least one processor and configured to generate a personalized communication for the user based on the retrieved data record.
In some embodiments, the image data comprises a still image or video. In some embodiments, the vehicle identification information comprises at least one of a license plate number of the vehicle, a make of the vehicle, a model of the vehicle, and a color of the vehicle. In some embodiments, the imaging module is further configured to extract user identification information from the image data, and the database interface module is further configured to use the user identification information along with the vehicle identification information to retrieve the data record. In some embodiments, the personalized communication module is further configured to cause the personalized communication to be presented on a display proximate the user. In some embodiments, the image capture device is located proximate a point-of-sale device configured to complete a transaction for an item. In some embodiments, the data record comprises a history of transactions for the user or preferences of the user. In some embodiments, the personalized communication is a recommendation related to at least one item for sale.
FIG. 1 shows a network diagram depicting a network system 100, according to various embodiments, having a client-server architecture configured for exchanging data over a network. For example, the network system 100 may comprise a network-based publication system (or interchangeably “network-based publisher”) 102 where clients may communicate and exchange data within the network system 100. The data may pertain to various functions (e.g., selling and purchasing of items) and aspects (e.g., data describing items listed on the publication/publisher system) associated with the network system 100 and its users. In some embodiments, the data may correspond to multimedia content, audio content, or visual content. Although illustrated herein as a client-server architecture as an example, other example embodiments may include other network architectures, such as a peer-to-peer or distributed network environment.
A data exchange platform, in an example form of the network-based publisher 102, may provide server-side functionality, via a network 104 (e.g., the Internet), to one or more clients. The one or more clients may include users that utilize the network system 100, and more specifically the network-based publisher 102, to exchange data over the network 104. These transactions may include transmitting, receiving (communicating), and processing data to, from, and regarding content and users of the network system 100. The data may include, but are not limited to, content and user data such as feedback data; user reputation values; user profiles; user attributes; product and service reviews; product, service, manufacture, and vendor recommendations and identifiers; product and service listings associated with buyers and sellers; auction bids; transaction data; and payment data, among other things.
In various embodiments, the data exchanges within the network system 100 may be dependent upon user-selected functions available through one or more client or user interfaces (UIs). The UIs may be associated with a client machine, such as a client machine 106 using a web client 110. The web client 110 may be in communication with the network-based publisher 102 via a web server 120. The UIs may also be associated with a client machine 108 using a programmatic client 112, such as a client application, or a third party server 114 hosting a third party application 116. It can be appreciated in various embodiments that the client machine 106, 108, or third party server 114 may be associated with a buyer, a seller, a third party electronic commerce platform, a payment service provider, or a shipping service provider, each in communication with the network-based publisher 102 and optionally each other. The buyers and sellers may be any one of individuals, merchants, or service providers, among other things.
In various embodiments, the client machine may be connected to the network 104, through which the client machine requests and accesses content from one or more content providers. The content may be broadcasted, multicasted, streamed, or otherwise transmitted to the client device by the content providers. In some embodiments, the client machine may store content previously retrieved from a content provider and may access the stored content. In addition to the above-disclosed embodiments, in various embodiments, the client machine may be associated with a user or content viewer.
Turning specifically to the network-based publisher 102, an application program interface (API) server 118 and a web server 120 are coupled to, and provide programmatic and web interfaces respectively to, one or more application servers 122. The application servers 122 host one or more publication application(s) 124. The application servers 122 are, in turn, shown to be coupled to one or more database server(s) 126 that facilitate access to one or more database(s) 128.
In one embodiment, the web server 120 and the API server 118 communicate and receive data pertaining to listings, transactions, feedback, and content items, among other things, via various user input tools. For example, the web server 120 may send and receive data to and from a toolbar or webpage on a browser application (e.g., web client 110) operating on a client machine (e.g., client machine 106). The API server 118 may send and receive data to and from an application (e.g., programmatic client 112 or third party application 116) running on another client machine (e.g., client machine 108 or third party server 114).
The publication application(s) 124 may provide a number of publisher functions and services (e.g., search, listing, content viewing, payment, etc.) to users that access the network-based publisher 102. For example, the publication application(s) 124 may provide a number of services and functions to users for listing goods and/or services for sale, searching for goods and services, facilitating transactions, and reviewing and providing feedback about transactions and associated users. Additionally, the publication application(s) 124 may track and store data and metadata relating to listings, transactions, and user interactions with the network-based publisher 102. In some embodiments, the publication application(s) 124 may publish or otherwise provide access to content items stored in application servers 122 or database(s) 128 accessible to the application servers 122 and/or the database server(s) 126.
FIG. 1 also illustrates a third party application 116 that may execute on a third party server 114 and may have programmatic access to the network-based publisher 102 via the programmatic interface provided by the API server 118. For example, the third party application 116 may use information retrieved from the network-based publisher 102 to support one or more features or functions on a website hosted by the third party. The third party website may, for example, provide one or more listing, feedback, publisher, or payment functions that are supported by the relevant applications of the network-based publisher 102.
While the example network system 100 of FIG. 1 employs a client-server architecture, the present disclosure is not limited to such an architecture. The example network system 100 can equally well find application in, for example, a distributed or peer-to-peer architecture system.
FIG. 2 is a block diagram illustrating an example embodiment of a personalization system 200. The personalization system 200 may comprise an imaging module 210, a database interface module 220, and a personalized communication module 230.
The imaging module 210 may be configured to receive image data of a vehicle from an image capture device and to extract identification information corresponding to the vehicle from the image data. The image data may comprise still image data and/or video image data. In some embodiments, the identification information may comprise vehicle identification information. In some embodiments, the imaging module 210 may receive image data from one or more image capture devices. Examples of image capture devices include, but are not limited to, video cameras and still picture cameras.
Image data may be processed by the imaging module 210 to recognize information contained in the image data that may identify a user. To the extent needed, the imaging module 210 may perform image recognition on the image data to identify information captured in the data. Image recognition techniques that may be used include, but are not limited to, optical character recognition (OCR), face detection techniques, edge detection, color and pattern matching, and so forth.
In some embodiments, identification information may comprise vehicle identification information. Examples of vehicle identification information include, but are not limited to, a license plate number, a geographic location or license plate issuing authority/entity (e.g., a state or country government) to which a license plate belongs or is associated with, a make and/or model of a vehicle, and a color of a vehicle. Other examples of vehicle identification information include, but are not limited to, one or more other distinguishing features of a vehicle, such as dents, scratches, bumper stickers, emblems, decals, and various vehicle features (e.g., sunroof, spoiler, rims or hubcaps, and exhaust pipes).
In some embodiments, identification information may further comprise user identification information. Examples of user identification information may include, but are not limited to, an image of a user (e.g., a user's face), clothing worn by a user, and one or more identifying features of a user (e.g., tattoos, scars, piercings, hair style, facial hair, glasses, and accessories). In some embodiments, the image data may be captured at specific locations relative to a retailer location (e.g., a retail store). For example, cameras may be placed at certain areas of a drive-through lane, at entrances to a parking lot, at entrances to a physical store, and so forth.
The personalization system 200 may be used in a variety of different environments. Environments in which the personalization system 200 may be implemented include, but are not limited to, restaurants, fast food locations, quick serve locations, retail stores, parking locations, fuel stations, car washes, hotels and other lodging environments, and other commerce environments.
The database interface module 220 may be configured to use the identification information to retrieve a data record 227. In some embodiments, the data record 227 may be associated with a user. In some embodiments, the database interface module 220 may receive one or more pieces of identification information from the imaging module 210 and may use the piece(s) of information as key(s) or search query terms to perform a search of one or more databases 225, which may store data records 227 for users and/or vehicles. For example, a license plate number may be used to look up one or more data records 227. The data records 227 may identify a user associated with the vehicle. The data records 227 also may comprise user history information. One example of user history information is an order history for the user with respect to a specific retailer or online marketplace. The data records 227 may also comprise user preference information. User preference information may be submitted by a user or may be inferred or determined from accumulated user data.
In some embodiments, history and preference data may be stored in a data record 227 associated with a specific vehicle as opposed to a specific user. In some embodiments, additional identification information may be submitted to further refine the search. For example, a vehicle may be shared by multiple users. By including user information extracted from the image data, data records for a specific user and a specific vehicle may be retrieved.
In some embodiments, the database interface module 220 may retrieve a data record 227 identifying a vehicle and/or a user associated with the vehicle. The database interface module 220 may use this record 227 to identify linked data records, which may be separately maintained in the same or different databases. The linked data records may store the user history and/or user preference information. In some embodiments, portions of the vehicle and/or user information may be stored in third party databases. For example, in some embodiments, license plate information may be stored in a third party database maintained by a state's Department of Motor Vehicles.
The personalized communication module 230 may be configured to generate a personalized communication based on the retrieved data record 227. The personalized communication module 230 may receive one or more data records 227 from the database interface module 220 and may generate one or more personalized communications for presentation to a user. In some embodiments, the personalized communications comprise recommendations. The recommendations may be based on the history of a user, the preferences of a user, or both. In some embodiments, the recommendations may comprise offers. In some embodiments, the personalized communication module 230 may generate or retrieve one or more offers based on the user history and/or user preferences for presentation to the user. Offers may include coupons, discounts, and so forth. In some embodiments, the personalized communications may comprise advertisements. These advertisements may be generated or retrieved based on the user history and/or user preferences. In some embodiments, recommendations, offers, and advertisements may be generated based on observed trends from user history data. For example, if a user is ordering food from a drive-through lane, an interactive menu may present one or more healthy menu options based on an observed trend that the user has recently been ordering low-fat menu items. In some embodiments, the recommendations, offers, and advertisements may be transmitted to a client device, such as a display that includes static areas and dynamic areas. For example, the display may include a static area that features fixed or manually changeable menu items and a dynamic area that comprises an electronic display screen capable of presenting dynamic information. In some embodiments, the personalized communication module 230 may present additional information for one or more items, such as further menu or product information.
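The trend-based recommendation in the low-fat example above might be implemented as follows. The menu items, the tag vocabulary, and the 60% threshold over the last five orders are all illustrative assumptions; the patent describes the behavior, not a specific algorithm.

```python
# Sketch of trend-based recommendation: if most recent orders carried the
# "low-fat" tag, surface low-fat menu items first. Menu and threshold are
# hypothetical.
MENU = [
    {"name": "double burger", "tags": set()},
    {"name": "grilled wrap", "tags": {"low-fat"}},
]

def recommend(order_history: list[set[str]], menu: list[dict] = MENU) -> list[str]:
    """Order menu item names, promoting low-fat items when a trend is observed."""
    recent = order_history[-5:]  # look at the five most recent orders
    lowfat_share = sum("low-fat" in tags for tags in recent) / max(len(recent), 1)
    if lowfat_share >= 0.6:
        # Stable sort: low-fat items float to the front, relative order kept.
        menu = sorted(menu, key=lambda item: "low-fat" not in item["tags"])
    return [item["name"] for item in menu]
```

A user whose recent orders were all tagged low-fat sees the wrap promoted; otherwise the menu keeps its default order.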
In some embodiments, an advertisement may be transmitted to a display screen to show optimal advertisements at a given time based on image recognition. For example, an advertisement may be transmitted in the form of a personalized communication to a dynamic billboard on a highway or in a parking lot in response to an identification of a particular user or a particular vehicle.
In some embodiments, one or more of the modules described with reference to FIG. 2 may be implemented or executed by one or more processors. Additionally, in some embodiments, one or more of the modules described with reference to FIG. 2 may comprise one or more modules to carry out specific operations or tasks. In some embodiments, some or all of the modules described with reference to FIG. 2 may reside in an application executing on a client device. In some embodiments, some or all of the modules of FIG. 2 may reside on one or more servers of the publication system 102 of FIG. 1. In addition, the modules of FIG. 2 may have separate utility and application outside of the publication system 102 of FIG. 1. The publication system 102 may be hosted on dedicated or shared server machines (not shown) that are communicatively coupled to enable communications between the server machines. The multiple components themselves may be communicatively coupled (e.g., via appropriate interfaces), either directly or indirectly, to each other and to various data sources, to allow information to be passed between the components or to allow the components to share and access common data. Furthermore, the components may access the one or more database(s) 128 via the one or more database servers 126, both shown in FIG. 1.
In some embodiments, client machines or devices (e.g., client machines 106, 108 of FIG. 1) may be employed to use the personalization system 200 of the present disclosure. Client machines may comprise interactive displays that present data (e.g., data published by the network-based publication system 102 of FIG. 1) for viewing and selection by a user. The user may interact with the displays using audio inputs, touch inputs, other biometric inputs (e.g., vision or eye detection, gestures), I/O device inputs, or any other means of interaction.
One or more client machines or devices may communicate and exchange data with one or more application servers (e.g., application server 122 of FIG. 1). In some embodiments, one or more of the client machines may be point-of-sale (POS) devices that are capable of completing transactions involving items desired to be purchased by one or more users. The client machines also may include one or more displays that users may interact with. The displays may present item options to users and may include one or more interfaces that enable users to communicate with a publisher system (e.g., publication system 102 of FIG. 1). For example, users may communicate through microphones, video cameras, keyboards, touch screens, or other user input devices that are part of the displays.
The client devices also may include one or more data capture devices that may capture data from one or more users interfacing with other client devices. For example, a camera may capture and/or record video and/or still images, from which information related to a user or an item associated with a user may be ascertained.
FIG. 3 illustrates an example embodiment of an implementation of a personalization system 200. In some embodiments, a user 315 may drive his or her vehicle 310 through a drive-through lane of a retail store 350. The drive-through lane may comprise an interactive display 330, which may present information 335 (e.g., menu options) to the user 315. In some embodiments, the display 330 may comprise an image capture device 320. The image capture device 320 may capture image data related to the motor vehicle 310 being driven by the user 315. As previously mentioned, this image data may include, but is not limited to, the license plate, make and/or model of the vehicle 310, and any other identifying features of the vehicle 310, such as stickers (located on the bumper or otherwise), any dents or scratches, emblems, decals, type of hubcaps or rims, type of tires, paint color, the presence of a sunroof, the presence of a spoiler, and so forth.
In addition, the image capture device 320 may capture information about the user 315. As previously discussed, this information about the user 315 may include, but is not limited to, an image of the user 315, clothing worn by the user 315, any identifying features (e.g., tattoos, scars, moles, hair style, facial hair, glasses, jewelry and other accessories) of the user 315 visible to the image capture device 320, and so forth. In addition, the image capture device 320 may capture a number of people located in the vehicle 310 along with any identifying information associated with each person.
In some embodiments, an image capture device 322, having the same functional capability as image capture device 320, may be positioned in a location other than alongside or integrated with the display 330. For example, image capture device 322 may be coupled to the retail store 350, such as above the entrance of the retail store 350.
The image data captured by one or more of the image capture devices 320, 322 may be transmitted to the personalization system 200 (e.g., via a network 340) and may be stored in one or more database(s). One or more pieces of the image data may be used by the personalization system 200 to retrieve any data records 227 associated with the user 315. In some embodiments, as previously discussed, the data records 227 may comprise user preference and/or user history data. In some embodiments, the user preference and/or user history data may be specific to an entity, such as a retailer. In some embodiments, the data records 227 may be linked or associated with other records that store the user preference and/or user history data.
In some example embodiments, a license plate of the vehicle 310 being driven by the user may be captured by one of the image capture devices 320, 322 and submitted to the personalization system 200. In some embodiments, the personalization system 200 may perform image recognition on the license plate to recognize the letters, numbers, and/or symbols of the license plate (generally referred to herein as the “license plate number”). In some embodiments, the license plate number may be a unique identifier associated with a user 315. The license plate number may be used to retrieve a user data record 227 and any user history or preference information stored therewith.
Based on the retrieved user data record 227, the personalization system 200 may generate a personalized communication and send this personalized communication to one or more devices. In some embodiments, the personalized communication is sent to the display 330 located outside of the retail store 350. In some embodiments, this personalized communication may comprise one or more recommendations. For example, based on a user's data record 227, which may include the user's order history, the interactive display 330 may offer recommendations to the user while the user peruses the display 330. The recommendations may include the user's favorite menu options, the user's last order, one or more items that the user may like based on the user's order history, one or more items fitting within a nutritional profile of the user, and so forth. In addition, one or more offers may be presented to the user on the display 330 to reward the user for the user's business or to incentivize the user to try a new item. When the user orders one or more items, the order may be recorded and the user's data record 227 may be updated.
In some embodiments, an order history and other data related to prior transactions and interactions may be stored for thevehicle310 rather than for auser315. That is, one ormore data records227 associated with the license plate (or other identifier) may store history and/or preference information for thevehicle310. Thus, when thevehicle310 next enters a drive-through lane and has its license plate number captured, thepersonalization system200 may retrieve adata record227 associated with thevehicle310. Recommendations and other personalized information may then be presented to an occupant of thevehicle310.
Although the foregoing examples have been discussed with reference to the use of a license plate number as an identifier for providing personalized user services, it will be appreciated that other data items (e.g., license plate number, make, model, stickers, decals, user image, user clothing, etc.) that may be captured byimage capture device320,322 may be used to identify and retrieve user-related information. Additionally, combinations of data items may be used to identify avehicle310 and/or auser315 associated with thevehicle310 and enable retrieval of user-related information.
In some embodiments, a personalized communication generated by thepersonalization system200 may alternatively or additionally be sent to a user's personal handheld device (e.g., the user's cell phone) or to a display device of the user'svehicle310.
In some embodiments, the personalized communication may be sent to adisplay360 inside theretail store350. Although the previous discussion with respect toFIG. 3 has been directed towards an embodiment of thepersonalization system200 being implemented in the context of aretail store350 having a drive-through lane, it is contemplated that thepersonalization system200 may also be implemented in the context of aretail store350 that does not employ a drive-through lane. It is contemplated that theretail store350 may also be a restaurant or any other place of commerce. In some embodiments, the personalized communication may be used to arrange for personalized service for theuser315 when theuser315 enters theretail store350. In some embodiments,display360 inside theretail store350 may displayinformation365 based on the personalized communication. Thisinformation365 may then be viewed and used by anemployee370 of theretail store350. In some embodiments, the personalized communication may comprise instructions, recommendations, or otherwise actionable information and may be communicated only to devices used by employees of thestore350 and not to theuser315. In some embodiments, the personalized communication is sent to a device used by an employee of theretail store350 before being sent to theuser315.
In some embodiments, a concierge service, a favorite salesperson, or other personalized service offerings may greet the user 315 as the user 315 enters the store 350. Depending on the context, if the user 315 has pre-ordered items or arranged for a retail store 350 to hold items for pick up, the store 350 may be provided advance notice of the user's visit in the form of the personalized communication and may arrange for the items to be available for the user 315 for pick up, such as when the user 315 enters the store 350.
In some embodiments, a user 315 may interface with an application associated with a retailer and may build, compile, assemble, or otherwise select one or more items that the user 315 is interested in purchasing via the application. When a user 315 nears a retail store 350 and is detected by one or more image capture devices 320, 322, the application may be triggered by the personalization system 200 to submit the user's list of items to the retail store 350. In some embodiments, the application may be part of the personalization system 200, or the personalization system 200 may be part of the application. The application may reside on the retailer's on-site computer system or on an off-site computer system. In some embodiments, the application may reside on a user's personal device, such as a cell phone. In some embodiments, the application may be triggered based on a detection of a location of a user device executing the application within a predetermined proximity to the store. For example, the user 315 may be executing the application using his or her cell phone, the detection of which within a predetermined proximity to the retail store 350 may trigger a submission. The application may operate in conjunction with the personalization system 200 to use geo-location or geo-fencing to determine when a retailer is supposed to begin fulfilling the user's order. In some embodiments, the application may be triggered and the retailer may be notified of the user's presence through a check-in performed by the user 315 within the application. The check-in may notify the retailer that the user is within physical proximity of the physical retail store 350 and that the retailer should begin preparing the user's order.
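The geo-fencing trigger described above can be sketched as a simple distance test between the user device's reported coordinates and the store's location. This is a minimal illustration, not the disclosed implementation: the radius, the coordinates, and the `submit_order` callback are all hypothetical.

```python
# Minimal geo-fence sketch: submit the user's saved order once the device
# reporting its location comes within a hypothetical radius of the store.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def maybe_submit_order(user_pos, store_pos, radius_m, submit_order):
    """Fire the order-submission callback only inside the geo-fence."""
    if haversine_m(*user_pos, *store_pos) <= radius_m:
        submit_order()
        return True
    return False
```

A real deployment would receive positions asynchronously from the device and would debounce repeated fence entries; the sketch shows only the core proximity decision.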
In some embodiments, one or more image capture device(s) 320, 322 may transmit a signal to the application based on a detection of the user 315 and a retrieval of a record of the user 315. The signal may instruct the application to transmit the contents of the order to the retail store 350. The retail store 350 may then prepare the items for pickup by the user 315. In some embodiments, the order may comprise a list of food and beverage items selected from a menu. In this case, the retail store 350 (e.g., restaurant) may begin preparing the food and beverage items so that the user 315 may receive them promptly. In some embodiments, the order may comprise one or more items that the retail store 350 may gather from store inventory.
In some embodiments, with or separate from the order, the application may transmit payment information to the retail store 350 to further streamline the transaction. The payment information may be processed by the retail store 350 to complete the transaction. When the user 315 arrives at the retail store 350, the user 315 may then pick up the ordered items without having to stop to select or pay for the items. In some embodiments, a physical store 350 with a drive-through lane may have multiple drive-through lanes. In one or more of the drive-through lanes, an unattended holding area for an item (e.g., a container, locker, dumbwaiter, or receptacle) may replace or supplement a drive-through window. A user 315 with a mobile device may scan his or her mobile device at an interface near the unattended holding area to open a door of the holding area in order to retrieve his or her order. In some embodiments, the user is allowed access to the inside of the holding area based on the identification of the user 315 or the vehicle 310 by the personalization system 200 without the need for a scan of a mobile device. The identification itself may enable access to the contents within the holding area.
In some embodiments, instead of the application transmitting payment information, the application may call or cause to be executed a second application to handle payment of the order. The second application may be a mobile payment solution, such as a virtual wallet or other payment mechanism. The second application may securely transmit (e.g., via encrypted methods, one-time payment methods, and so forth) payment information for the user 315 to the retail store 350 and may complete a transaction with the retail store 350. Confirmation of payment may be received by the second application, and notification of the confirmation may be provided to the first application. In some embodiments, a POS device associated with the physical retail store 350 may transmit received payment information securely to one or more verification services (e.g., via Wi-Fi using a Wi-Fi adapter connected to a USB port on the POS device). The verification services may verify the account information contained in the payment information and confirm the transaction. For example, the verification service may be a credit card company or issuer. If a user 315 uses a Visa credit card to pay for the transaction, the POS device may securely communicate with a server associated with Visa or the card issuer to confirm the account information and receive a confirmation number. In other embodiments, the verification service may be an entity responsible for maintaining the virtual wallet or other payment mechanism used to pay for the order. In some embodiments, the personalization system 200 may authorize a payment with a merchant based on the identification of the user 315 or the vehicle 310. Examples of authorizing a payment may include, but are not limited to, opening a tab, such as using a check-in application that enables a user to check in to a store and then pay for goods and services with an online money transfer account (e.g., a PayPal account).
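The two-application hand-off described above, where the ordering application delegates payment to a wallet application and receives back a confirmation, can be sketched as follows. This is a hypothetical illustration: the class names, the `pay` interface, and the use of a random hex token as a confirmation number are assumptions, not the disclosed design, and no real payment API is modeled.

```python
# Hypothetical sketch of the first-application / second-application hand-off:
# the ordering app calls the wallet app, which completes payment and returns
# a confirmation that is relayed back to the ordering app.
import uuid

class WalletApp:
    """Stand-in for a mobile wallet (the 'second application')."""
    def pay(self, merchant: str, amount_cents: int) -> dict:
        # A real wallet would transmit encrypted or one-time payment
        # credentials and wait for a verification service to confirm.
        return {"merchant": merchant, "amount_cents": amount_cents,
                "confirmation": uuid.uuid4().hex}

class OrderingApp:
    """Stand-in for the ordering app (the 'first application')."""
    def __init__(self, wallet: WalletApp):
        self.wallet = wallet
        self.last_confirmation = None

    def checkout(self, merchant: str, amount_cents: int) -> str:
        receipt = self.wallet.pay(merchant, amount_cents)  # delegate payment
        self.last_confirmation = receipt["confirmation"]   # notification back
        return self.last_confirmation
```

The point of the separation is that payment credentials stay inside the wallet application; the ordering application only ever sees an opaque confirmation token.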
Although FIG. 3 shows the personalization system 200 and its modules as separate from the image capture devices 320, 322 and the displays 330, 360, it is contemplated that, in some embodiments, the personalization system 200 or any of its modules may be incorporated into the image capture devices 320, 322 or the displays 330, 360 or any other devices. For example, in some embodiments, any or all modules of the personalization system 200 may be incorporated into a retailer's on-site computer system or display system.
As previously discussed, although the foregoing examples have been discussed with respect to a drive-through lane of a restaurant, embodiments of the disclosure are not so limited. FIG. 4 illustrates another example embodiment of an implementation of a personalization system in a context of a retail store 450 without a drive-through lane. In some embodiments, one or more image capture devices 420, 422, 424 may be strategically placed in areas near the retail store 450. It is contemplated that image capture devices 420, 422, 424 may have the same functional capabilities as those discussed above for image capture devices 320, 322 of FIG. 3. In some embodiments, one or more image capture devices 420 may be located at the entrance to a parking lot 430 of a retail store 450. In some embodiments, one or more image capture devices 422 may be located within the parking lot 430, between the entrance to the parking lot 430 and the entrance to the retail store 450. In some embodiments, one or more image capture devices 424 may be coupled to the retail store 450, for example, next to the entrance of the retail store 450.
The image capture device(s) 420, 422, 424 may capture image data related to a user and/or a vehicle being driven by the user. The image capture device(s) 420, 422, 424 may capture image data of a vehicle 410 entering the parking lot 430, image data of a vehicle 412 driving through the parking lot 430, or image data of a vehicle 414 parked in the parking lot 430. The image data may be transmitted to the personalization system 200 (not shown in FIG. 4). As previously described above, the personalization system 200 may then use the image data to generate a personalized communication. The personalization system 200 or any of its components may be associated with the retail store 450 and be incorporated into the retail store's on-site computer system, or the personalization system 200 may be separate from the retail store's on-site computer system. The personalization system 200 may transmit the personalized communication to the retail store 450 to notify the retailer that a user is likely to visit the retail store 450. It is contemplated that the personalization system 200 may send a personalized communication to any device. For example, the personalization system 200 may send the personalized communication to any of the devices discussed above with respect to FIG. 3, for example, a user's handheld device, a device coupled to the user's vehicle, a device external to the retail store 450, or a device internal to the retail store 450. Additionally, the personalized communication discussed in the embodiments of FIG. 4 may comprise any type of personalized communication previously discussed, for example, recommendations, discounts, advertisements, orders, payment information, and so forth.
In some embodiments, the personalization system 200 may be configured to receive audio data of a vehicle from an audio capture device (not shown) and to extract identification information corresponding to the vehicle from the audio data. The audio capture device may be separate from the image capture device (e.g., a separate audio recorder) or may be incorporated into the image capture device (e.g., a video camera that records audio in addition to video). Referring back to FIG. 2, in some embodiments, an audio module 240 may be configured to receive the audio data of a vehicle from an audio capture device and to extract identification information corresponding to the vehicle from the audio data. Database interface module 220 may be configured to use the identification information to retrieve a data record 227. The personalized communication module 230 may then generate a personalized communication based on the retrieved data record 227. In some embodiments, database interface module 220 may be configured to use the identification information extracted from the audio data to retrieve a data record 227 without using identification information extracted from image data. In some embodiments, database interface module 220 may be configured to use the identification information extracted from the audio data along with identification information extracted from image data to retrieve a data record 227.
Audio data may be processed by the audio module 240 to recognize information contained in the audio data that may identify a user or vehicle. To the extent needed, the audio module 240 may perform audio recognition on the audio data to identify information captured in the audio data. It is contemplated that any audio recognition techniques may be used to identify information captured in the audio data. In some embodiments, the natural noise of a vehicle may be used to validate or confirm an identity match. For example, the personalization system 200 may use a unique engine sound signature to identify or verify the identification of a vehicle based on differences in engine sound that may exist from one car to another car. In some embodiments, an apparatus may be added to a vehicle that may broadcast a unique audio pattern, which may be used by the personalization system 200 to identify the vehicle. In some embodiments, this unique audio pattern may be broadcast in a way not detectable by the human ear.
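Engine-sound matching of the kind described above can be sketched as comparing a stored signature against an observed one. The following is an assumed illustration only: real systems would use a proper spectral fingerprint (e.g., an FFT-based one), whereas this sketch uses crude per-window energies, and the distance threshold is hypothetical.

```python
# Assumed sketch: compare a captured engine sound against stored signatures.
import math

def band_energies(samples, bands=8):
    """Split a sample window into equal chunks and return normalized energy
    per chunk -- a crude stand-in for a real spectral fingerprint."""
    n = len(samples) // bands
    energies = [sum(s * s for s in samples[i * n:(i + 1) * n]) for i in range(bands)]
    total = sum(energies) or 1.0
    return [e / total for e in energies]

def signature_distance(sig_a, sig_b):
    """Euclidean distance between two normalized signatures (0 = identical)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(sig_a, sig_b)))

def best_match(observed, known, max_distance=0.1):
    """Return the vehicle id whose stored signature is closest, if close enough;
    otherwise None, so audio only *confirms* rather than forces a match."""
    vid, sig = min(known.items(), key=lambda kv: signature_distance(observed, kv[1]))
    return vid if signature_distance(observed, sig) <= max_distance else None
```

Consistent with the text, this is best used to validate an identification already made from image data: a near-miss returns `None` rather than a wrong vehicle.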
FIG. 5 is a flowchart illustrating an example embodiment of a method 500 for user personalization. It is contemplated that any of the previously discussed features may be incorporated into the method 500.
At operation 510, a personalization system (e.g., personalization system 200) receives image data of a vehicle. The image data may have been obtained by an image capture device. In some embodiments, the image data comprises a still image or video.
At operation 520, the personalization system extracts vehicle identification information from the image data. Examples of vehicle identification information include, but are not limited to, a license plate number, a make and/or model of a vehicle, and a color of a vehicle. Other examples of vehicle identification information include, but are not limited to, one or more other distinguishing features of a vehicle, such as dents, scratches, bumper stickers, emblems, decals, and various vehicle features (e.g., sunroof, spoiler, rims or hubcaps, and exhaust pipes).
At operation 530, the personalization system retrieves a user data record using the vehicle identification information. In some embodiments, the user data record may comprise user history information or user preference information.
At operation 540, the personalization system generates a personalized communication based on the retrieved data record. The personalized communication may comprise a variety of different types of information, including, but not limited to, recommendations, offers, notifications of a user's location, an order, payment information, and a prompting of an action by a store employee to service a user.
At operation 550, the personalization system sends the personalized communication to one or more of a variety of devices, including, but not limited to, a user's handheld device (e.g., a cell phone), a device of the vehicle (e.g., display system in vehicle), a display external to a retail store or any other location or structure associated with the sale of goods and/or services (e.g., a drive-through display), and an on-site computer system located internally within the retail store or any other location or structure associated with the sale of goods and/or services (e.g., a device that is part of the on-site computer system and used by employees of the retail store).
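Operations 510 through 550 of method 500 can be sketched end to end with stubs. This is an illustrative sketch only: the plate extractor, the in-memory database, and the delivery callback are hypothetical stand-ins for the imaging module, database interface module, and personalized communication module, respectively.

```python
# Minimal end-to-end sketch of method 500 (operations 510-550) with stubs.

def extract_vehicle_id(image_data: bytes) -> str:
    """Operation 520 stand-in: pretend the image decodes straight to a plate
    (a real system would run license-plate recognition here)."""
    return image_data.decode()

def personalize(image_data: bytes, user_db: dict, send) -> bool:
    plate = extract_vehicle_id(image_data)            # operation 520
    record = user_db.get(plate)                       # operation 530
    if record is None:
        return False                                  # unknown vehicle: no message
    message = f"Welcome back! Your usual order: {record['usual_order']}"  # op 540
    send(record["device"], message)                   # operation 550
    return True

sent = []
db = {"ABC123": {"usual_order": "coffee", "device": "cell:alice"}}
personalize(b"ABC123", db, lambda dev, msg: sent.append((dev, msg)))  # op 510 input
print(sent[0][0])  # → cell:alice
```

The same skeleton covers method 600 as well; there the lookup key would be user identification information (or both kinds combined) rather than the plate alone.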
FIG. 6 is a flowchart illustrating another example embodiment of a method 600 for user personalization. It is contemplated that any of the previously discussed features may be incorporated into the method 600.
At operation 610, a personalization system (e.g., personalization system 200) receives image data of a vehicle and/or image data of a user. The image data may have been obtained by an image capture device. In some embodiments, the image data comprises a still image or video.
At operation 620, the personalization system extracts identification information from the image data. The identification information may comprise information to identify a user and/or information to identify a vehicle. Examples of vehicle identification information include, but are not limited to, a license plate number, a make and/or model of a vehicle, and a color of a vehicle. Other examples of vehicle identification information include, but are not limited to, one or more other distinguishing features of a vehicle, such as dents, scratches, bumper stickers, emblems, decals, and various vehicle features (e.g., sunroof, spoiler, rims or hubcaps, and exhaust pipes). Examples of user identification information include, but are not limited to, an image of a user (e.g., a user's face), clothing worn by a user, and one or more identifying features of a user (e.g., tattoos, scars, piercings, hair style, facial hair, glasses, and accessories).
At operation 630, the personalization system retrieves a data record using the identification information. In some embodiments, the data record may comprise a data record associated with a user. In some embodiments, the user data record may comprise user history information or user preference information. In some embodiments, the data record may be associated with a vehicle. In some embodiments, the vehicle data record may comprise history information or preference information associated with a vehicle.
At operation 640, the personalization system generates a personalized communication based on the retrieved data record. The personalized communication may comprise a variety of different types of information, including, but not limited to, recommendations, offers, notifications of a user's location, an order, and payment information.
At operation 650, the personalization system sends the personalized communication to one or more of a variety of devices, including, but not limited to, a user's handheld device (e.g., a cell phone), a device of the vehicle (e.g., display system in vehicle), a display external to a retail store (e.g., a drive-through display), and an on-site computer system located internally within the retail store.
Modules, Components and Logic
Certain embodiments are described herein as including logic or a number of components, modules, or mechanisms. Modules may constitute either software modules (e.g., code embodied on a machine-readable medium or in a transmission signal) or hardware modules. A hardware module is a tangible unit capable of performing certain operations and may be configured or arranged in a certain manner. In example embodiments, one or more computer systems (e.g., a standalone, client, or server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.
In various embodiments, a hardware module may be implemented mechanically or electronically. For example, a hardware module may comprise dedicated circuitry or logic that is permanently configured (e.g., as a special-purpose processor, such as a field programmable gate array (FPGA) or an application-specific integrated circuit (ASIC)) to perform certain operations. A hardware module may also comprise programmable logic or circuitry (e.g., as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software to perform certain operations. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.
Accordingly, the term “hardware module” should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired) or temporarily configured (e.g., programmed) to operate in a certain manner and/or to perform certain operations described herein. Considering embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For example, where the hardware modules comprise a general-purpose processor configured using software, the general-purpose processor may be configured as respective different hardware modules at different times. Software may accordingly configure a processor, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time.
Hardware modules can provide information to, and receive information from, other hardware modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple of such hardware modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) that connect the hardware modules. In embodiments in which multiple hardware modules are configured or instantiated at different times, communications between such hardware modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware modules have access. For example, one hardware module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware module may then, at a later time, access the memory device to retrieve and process the stored output. Hardware modules may also initiate communications with input or output devices and can operate on a resource (e.g., a collection of information).
The various operations of example methods described herein may be performed, at least partially, by one or more processors that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions. The modules referred to herein may, in some example embodiments, comprise processor-implemented modules.
Similarly, the methods described herein may be at least partially processor-implemented. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented modules. The performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the processor or processors may be located in a single location (e.g., within a home environment, an office environment or as a server farm), while in other embodiments the processors may be distributed across a number of locations.
The one or more processors may also operate to support performance of the relevant operations in a “cloud computing” environment or as a “software as a service” (SaaS). For example, at least some of the operations may be performed by a group of computers (as examples of machines including processors), these operations being accessible via a network (e.g., the network 104 of FIG. 1) and via one or more appropriate interfaces (e.g., APIs).
Electronic Apparatus and System
Example embodiments may be implemented in digital electronic circuitry, or in computer hardware, firmware, software, or in combinations of them. Example embodiments may be implemented using a computer program product, e.g., a computer program tangibly embodied in an information carrier, e.g., in a machine-readable medium for execution by, or to control the operation of, data processing apparatus, e.g., a programmable processor, a computer, or multiple computers.
A computer program can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, subroutine, or other unit suitable for use in a computing environment. A computer program can be deployed to be executed on one computer or on multiple computers at one site or distributed across multiple sites and interconnected by a communication network.
In example embodiments, operations may be performed by one or more programmable processors executing a computer program to perform functions by operating on input data and generating output. Method operations can also be performed by, and apparatus of example embodiments may be implemented as, special purpose logic circuitry (e.g., an FPGA or an ASIC).
A computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. In embodiments deploying a programmable computing system, it will be appreciated that both hardware and software architectures merit consideration. Specifically, it will be appreciated that the choice of whether to implement certain functionality in permanently configured hardware (e.g., an ASIC), in temporarily configured hardware (e.g., a combination of software and a programmable processor), or a combination of permanently and temporarily configured hardware may be a design choice. Below are set out hardware (e.g., machine) and software architectures that may be deployed, in various example embodiments.
Example Machine Architecture and Machine-Readable Medium
FIG. 7 is a block diagram of a machine in the example form of a computer system 700 within which instructions 724 for causing the machine to perform any one or more of the methodologies discussed herein may be executed. In alternative embodiments, the machine operates as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine may operate in the capacity of a server or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. The machine may be a personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
The example computer system 700 includes a processor 702 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), or both), a main memory 704 and a static memory 706, which communicate with each other via a bus 708. The computer system 700 may further include a video display unit 710 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)). The computer system 700 also includes an alphanumeric input device 712 (e.g., a keyboard), a user interface (UI) navigation (or cursor control) device 714 (e.g., a mouse), a disk drive unit 716, a signal generation device 718 (e.g., a speaker) and a network interface device 720.
Machine-Readable Medium
The disk drive unit 716 includes a machine-readable medium 722 on which is stored one or more sets of data structures and instructions 724 (e.g., software) embodying or utilized by any one or more of the methodologies or functions described herein. The instructions 724 may also reside, completely or at least partially, within the main memory 704 and/or within the processor 702 during execution thereof by the computer system 700, the main memory 704 and the processor 702 also constituting machine-readable media. The instructions 724 may also reside, completely or at least partially, within the static memory 706.
While the machine-readable medium 722 is shown in an example embodiment to be a single medium, the term “machine-readable medium” may include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more instructions 724 or data structures. The term “machine-readable medium” shall also be taken to include any tangible medium that is capable of storing, encoding or carrying instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present embodiments, or that is capable of storing, encoding or carrying data structures utilized by or associated with such instructions. The term “machine-readable medium” shall accordingly be taken to include, but not be limited to, solid-state memories, and optical and magnetic media. Specific examples of machine-readable media include non-volatile memory, including by way of example semiconductor memory devices (e.g., Erasable Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), and flash memory devices); magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and compact disc-read-only memory (CD-ROM) and digital versatile disc (or digital video disc) read-only memory (DVD-ROM) disks.
Transmission Medium
The instructions 724 may further be transmitted or received over a communications network 726 using a transmission medium. The instructions 724 may be transmitted using the network interface device 720 and any one of a number of well-known transfer protocols (e.g., HTTP). Examples of communication networks include a LAN, a WAN, the Internet, mobile telephone networks, POTS networks, and wireless data networks (e.g., WiFi and WiMax networks). The term “transmission medium” shall be taken to include any intangible medium capable of storing, encoding, or carrying instructions for execution by the machine, and includes digital or analog communications signals or other intangible media to facilitate communication of such software.
Although an embodiment has been described with reference to specific example embodiments, it will be evident that various modifications and changes may be made to these embodiments without departing from the broader spirit and scope of the present disclosure. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense. The accompanying drawings that form a part hereof, show by way of illustration, and not of limitation, specific embodiments in which the subject matter may be practiced. The embodiments illustrated are described in sufficient detail to enable those skilled in the art to practice the teachings disclosed herein. Other embodiments may be utilized and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. This Detailed Description, therefore, is not to be taken in a limiting sense, and the scope of various embodiments is defined only by the appended claims, along with the full range of equivalents to which such claims are entitled.
Such embodiments of the inventive subject matter may be referred to herein, individually and/or collectively, by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any single invention or inventive concept if more than one is in fact disclosed. Thus, although specific embodiments have been illustrated and described herein, it should be appreciated that any arrangement calculated to achieve the same purpose may be substituted for the specific embodiments shown. This disclosure is intended to cover any and all adaptations or variations of various embodiments.
Combinations of the above embodiments, and other embodiments not specifically described herein, will be apparent to those of skill in the art upon reviewing the above description.
The Abstract of the Disclosure is provided to comply with 37 C.F.R. §1.72(b), requiring an abstract that will allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separate embodiment.

Claims (23)

What is claimed is:
1. A system comprising:
at least one processor;
an imaging module, executable by the at least one processor, configured to receive image data of a vehicle from an image capture device and to extract vehicle identification information from the image data;
a database interface module, executable by the at least one processor, configured to use the vehicle identification information to retrieve a data record associated with a user;
a personalized communication module, executable by the at least one processor, configured to generate a personalized communication, relating to physical goods for sale proximate to the image capture device, based on the retrieved data record; and
a point-of-sale device, located proximate to the physical goods for sale, configured to complete a transaction of at least one of the physical goods for sale.
2. The system of claim 1, wherein the image data comprises a still image or video.
3. The system of claim 1, wherein the vehicle identification information comprises at least one of a license plate number of the vehicle, a make of the vehicle, a model of the vehicle, and a color of the vehicle.
4. The system of claim 1, wherein:
the imaging module is further configured to extract user identification information from the image data; and
the database interface module is further configured to use the user identification information along with the vehicle identification information to retrieve the data record.
5. The system of claim 1, wherein the personalized communication module is further configured to cause the personalized communication to be presented on a display proximate the user.
6. The system of claim 1, wherein the data record comprises a history of transactions for the user or preferences of the user.
7. The system of claim 1, wherein the personalized communication is a recommendation related to at least one of the physical goods for sale.
8. A computer-implemented method comprising:
receiving image data of a vehicle from an image capture device;
extracting vehicle identification information from the image data;
retrieving a data record associated with a user using the vehicle identification information;
generating a personalized communication, relating to physical goods for sale proximate to the image capture device, based on the retrieved data record;
sending the personalized communication first or only to a device operated by an employee of a store offering the physical goods for sale; and
completing a transaction for at least one of the physical goods for sale with a point of sale device located proximate to physical goods.
9. The method of claim 8, wherein the image data comprises a still image or video.
10. The method of claim 8, wherein the vehicle identification information comprises at least one of a license plate number of the vehicle, a make of the vehicle, a model of the vehicle, and a color of the vehicle.
11. The method of claim 8 further comprising extracting user identification information from the image data, the user identification information being used along with the vehicle identification information in retrieving the data record.
12. The method of claim 8 further comprising causing the personalized communication to be presented on a display proximate the user.
13. The method of claim 8, wherein the data record comprises a history of transactions for the user or preferences of the user.
14. The method of claim 8, wherein the personalized communication is a recommendation related to at least one of the physical goods for sale.
15. The method of claim 8, wherein the store is proximate the image capture device.
16. A non-transitory machine-readable storage device storing a set of instructions that, when executed by at least one processor, causes the at least one processor to perform a set of operations comprising:
receiving image data of a vehicle from an image capture device;
extracting vehicle identification information from the image data;
retrieving a data record associated with a user using the vehicle identification information;
generating a personalized communication for the user, relating to physical goods for sale proximate to the image capture device based on the retrieved data record; and
receiving transaction data for at least one of the physical goods for sale from a point of sale device located proximate to physical goods.
17. The device of claim 16, wherein the image data comprises a still image or video.
18. The device of claim 16, wherein the vehicle identification information comprises at least one of a license plate number of the vehicle, a make of the vehicle, a model of the vehicle, and a color of the vehicle.
19. The device of claim 16, wherein the set of operations further comprises extracting user identification information from the image data, the user identification information being used along with the vehicle identification information in retrieving the data record.
20. The device of claim 16, wherein the set of operations further comprises causing the personalized communication to be presented on a display proximate the user.
21. The device of claim 16, wherein the data record comprises a history of transactions for the user or preferences of the user.
22. The system of claim 1, wherein the personalized communication comprises an offer to sell at least one of the physical goods.
23. The system of claim 1, wherein the personalized communication comprises a coupon.
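The method claims describe a pipeline: receive image data of a vehicle, extract vehicle identification information (for example a license plate number, per claims 3 and 10), use that information as a key to retrieve a user's data record, and generate a personalized communication such as a recommendation. The following is a minimal sketch of that flow, not the patented implementation: the OCR stage is stubbed with pre-recognized text, and the plate pattern, record store, and function names are all hypothetical.

```python
import re

# Illustrative sketch of the claimed flow (claims 8 and 16). `ocr_text`
# stands in for text already recognized from the image data; all names
# and records below are hypothetical.

PLATE_PATTERN = re.compile(r"[A-Z0-9]{2,8}")  # toy license-plate shape


def extract_vehicle_id(ocr_text):
    """Extract vehicle identification information (a plate-like token)."""
    match = PLATE_PATTERN.search(ocr_text.upper().replace(" ", ""))
    return match.group(0) if match else None


# Stand-in for data records keyed by vehicle identification information.
USER_RECORDS = {
    "ABC1234": {"name": "J. Doe", "preferences": ["motor oil", "wiper blades"]},
}


def personalized_communication(ocr_text, goods_for_sale):
    """Retrieve the user's record and build a recommendation (claims 7/14)."""
    plate = extract_vehicle_id(ocr_text)
    record = USER_RECORDS.get(plate)
    if record is None:
        return None  # unknown vehicle: no personalization
    # Recommend goods for sale that match the user's stored preferences.
    picks = [g for g in goods_for_sale if g in record["preferences"]]
    return {"user": record["name"], "recommended": picks}


print(personalized_communication("abc 1234", ["motor oil", "soda", "wiper blades"]))
```

In a deployed system the plate pattern would be jurisdiction-specific and the lookup would hit a real database; claims 4, 11, and 19 additionally fold user identification information extracted from the same image into the lookup key.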
Application US13/706,678 (priority 2012-02-22, filed 2012-12-06): User identification and personalization based on automotive identifiers. Status: Active, anticipated expiration 2033-11-12. Granted as US9324002B2.

Priority Applications (5)

Application Number | Publication | Priority Date | Filing Date | Title
US13/706,678 | US9324002B2 | 2012-02-22 | 2012-12-06 | User identification and personalization based on automotive identifiers
CA2866482A | CA2866482C | 2012-02-22 | 2013-02-22 | User identification and personalization based on automotive identifiers
PCT/US2013/027426 | WO2013126772A1 | 2012-02-22 | 2013-02-22 | User identification and personalization based on automotive identifiers
AU2013222234A | AU2013222234B2 | 2012-02-22 | 2013-02-22 | User identification and personalization based on automotive identifiers
US15/066,533 | US9996861B2 | 2012-02-22 | 2016-03-10 | User identification and personalization based on automotive identifiers

Applications Claiming Priority (2)

Application Number | Publication | Priority Date | Filing Date | Title
US201261601972P | | 2012-02-22 | 2012-02-22 |
US13/706,678 | US9324002B2 | 2012-02-22 | 2012-12-06 | User identification and personalization based on automotive identifiers

Related Child Applications (1)

Application Number | Relation | Publication | Priority Date | Filing Date | Title
US15/066,533 | Continuation | US9996861B2 | 2012-02-22 | 2016-03-10 | User identification and personalization based on automotive identifiers

Publications (2)

Publication Number | Publication Date
US20130216102A1 | 2013-08-22
US9324002B2 | 2016-04-26

Family

Family ID: 48982290

Family Applications (2)

Application Number | Status | Publication | Priority Date | Filing Date | Title
US13/706,678 | Active (expires 2033-11-12) | US9324002B2 | 2012-02-22 | 2012-12-06 | User identification and personalization based on automotive identifiers
US15/066,533 | Active | US9996861B2 | 2012-02-22 | 2016-03-10 | User identification and personalization based on automotive identifiers

Family Applications After (1)

Application Number | Status | Publication | Priority Date | Filing Date | Title
US15/066,533 | Active | US9996861B2 | 2012-02-22 | 2016-03-10 | User identification and personalization based on automotive identifiers

Country Status (4)

Country | Link
US (2) | US9324002B2
AU (1) | AU2013222234B2
CA (1) | CA2866482C
WO (1) | WO2013126772A1

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US20160132889A1* | 2014-03-01 | 2016-05-12 | Govindaraj Setlur | System and method for payer controlled payment processing system
US20160189252A1* | 2012-02-22 | 2016-06-30 | Paypal, Inc. | User identification and personalization based on automotive identifiers
US20180033090A1* | 2016-07-26 | 2018-02-01 | Samsung Electronics Co., Ltd | System and method for universal card acceptance
US20180165678A1* | 2016-12-14 | 2018-06-14 | Mastercard International Incorporated | Methods and systems for processing a payment transaction
US10341854B2 | 2016-11-30 | 2019-07-02 | Bank Of America Corporation | Creating a secure physical connection between a computer terminal and a vehicle
US10402892B2* | 2016-11-30 | 2019-09-03 | Bank Of America Corporation | Resource delivery via automated channel and virtual record
US10430566B2* | 2016-12-27 | 2019-10-01 | Paypal, Inc. | Vehicle based electronic authentication and device management
US10528929B2 | 2016-11-30 | 2020-01-07 | Bank Of America Corporation | Computer terminal having a detachable item transfer mechanism for dispensing and collecting items
US11195158B2* | 2012-09-12 | 2021-12-07 | Shreyas Kamat | Communicating payments
US20230153893A1* | 2018-12-06 | 2023-05-18 | Walmart Apollo, Llc | Systems and methods for handling alternate pick-up using vehicle recognition
US20240220945A1* | 2022-12-29 | 2024-07-04 | American Express Travel Related Services Company, Inc. | Overlay network for real-time payment networks
US12380457B2 | 2023-01-26 | 2025-08-05 | American Express Travel Related Services Company, Inc. | Optimal routing of payments

Families Citing this family (52)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US20140278030A1* | 2013-03-14 | 2014-09-18 | Harman International Industries, Incorported | Automobile traffic detection system
US10628815B1 | 2013-09-27 | 2020-04-21 | Groupon, Inc. | Systems and methods for programmatically grouping consumers
US20150125042A1* | 2013-10-08 | 2015-05-07 | Smartlanes Technologies, Llc | Method and system for data collection using processed image data
US10929661B1* | 2013-12-19 | 2021-02-23 | Amazon Technologies, Inc. | System for user identification
JP2015156152A* | 2014-02-20 | 2015-08-27 | テックファーム株式会社 | Service information management system and service information management method
KR101552873B1* | 2014-05-08 | 2015-09-14 | 현대자동차주식회사 | Vehicle, Terminal and method for controlling the vehicle
US9760776B1 | 2014-06-27 | 2017-09-12 | Blinker, Inc. | Method and apparatus for obtaining a vehicle history report from an image
US9773184B1 | 2014-06-27 | 2017-09-26 | Blinker, Inc. | Method and apparatus for receiving a broadcast radio service offer from an image
US10579892B1 | 2014-06-27 | 2020-03-03 | Blinker, Inc. | Method and apparatus for recovering license plate information from an image
US10733471B1 | 2014-06-27 | 2020-08-04 | Blinker, Inc. | Method and apparatus for receiving recall information from an image
US10867327B1 | 2014-06-27 | 2020-12-15 | Blinker, Inc. | System and method for electronic processing of vehicle transactions based on image detection of vehicle license plate
US9779318B1 | 2014-06-27 | 2017-10-03 | Blinker, Inc. | Method and apparatus for verifying vehicle ownership from an image
US10515285B2 | 2014-06-27 | 2019-12-24 | Blinker, Inc. | Method and apparatus for blocking information from an image
US9558419B1 | 2014-06-27 | 2017-01-31 | Blinker, Inc. | Method and apparatus for receiving a location of a vehicle service center from an image
US9754171B1 | 2014-06-27 | 2017-09-05 | Blinker, Inc. | Method and apparatus for receiving vehicle information from an image and posting the vehicle information to a website
US9607236B1 | 2014-06-27 | 2017-03-28 | Blinker, Inc. | Method and apparatus for providing loan verification from an image
US9563814B1 | 2014-06-27 | 2017-02-07 | Blinker, Inc. | Method and apparatus for recovering a vehicle identification number from an image
US9589201B1 | 2014-06-27 | 2017-03-07 | Blinker, Inc. | Method and apparatus for recovering a vehicle value from an image
US9589202B1 | 2014-06-27 | 2017-03-07 | Blinker, Inc. | Method and apparatus for receiving an insurance quote from an image
US10572758B1 | 2014-06-27 | 2020-02-25 | Blinker, Inc. | Method and apparatus for receiving a financing offer from an image
US9600733B1 | 2014-06-27 | 2017-03-21 | Blinker, Inc. | Method and apparatus for receiving car parts data from an image
US9818154B1 | 2014-06-27 | 2017-11-14 | Blinker, Inc. | System and method for electronic processing of vehicle transactions based on image detection of vehicle license plate
US10540564B2 | 2014-06-27 | 2020-01-21 | Blinker, Inc. | Method and apparatus for identifying vehicle information from an image
US9892337B1 | 2014-06-27 | 2018-02-13 | Blinker, Inc. | Method and apparatus for receiving a refinancing offer from an image
US9594971B1 | 2014-06-27 | 2017-03-14 | Blinker, Inc. | Method and apparatus for receiving listings of similar vehicles from an image
MX2017001114A* | 2014-08-08 | 2017-08-10 | Oney Servicios Financieros Efc S A U | Transaction management method by recognition of the registration number of a vehicle.
CN104378384A* | 2014-12-01 | 2015-02-25 | 深圳如果技术有限公司 | Access control method, access control equipment and cloud server
WO2017151108A1* | 2016-03-01 | 2017-09-08 | Ford Global Technologies, Llc | Filtering a dsrc broadcast based on user-defined preferences
EP3445706B1 | 2016-04-21 | 2023-10-11 | Wayne Fueling Systems Llc | Intelligent fuel dispensers
US12248995B2 | 2016-04-21 | 2025-03-11 | Wayne Fueling Systems Llc | Intelligent fuel dispensers
WO2017197444A1* | 2016-05-20 | 2017-11-23 | Algodriven Limited | Systems and methods for determining attributes of motor vehicles
BR112018074266A2 | 2016-05-27 | 2019-03-06 | Wayne Fueling Systems Llc | Transparent fuel supply pump
CN106372652A* | 2016-08-28 | 2017-02-01 | 乐视控股(北京)有限公司 | Hair style identification method and hair style identification apparatus
US11361345B2 | 2016-11-11 | 2022-06-14 | Craig Hacker | Targeted advertising system and method for drivers
CN106777169A* | 2016-12-21 | 2017-05-31 | 北京车网互联科技有限公司 | A kind of user's trip hobby analysis method based on car networking data
AU2017235967B2* | 2017-02-23 | 2022-08-04 | Plate Properties Pty Ltd | Computer System Configured for Issuing a Personalised Vehicle Number Plate
CN107170284A* | 2017-06-23 | 2017-09-15 | 深圳市盛路物联通讯技术有限公司 | A kind of washing bay management service end and data processing method
US11321790B2 | 2018-04-25 | 2022-05-03 | Tanku LTD. | System and method for vehicle identification based on fueling captures
WO2020023390A1 | 2018-07-25 | 2020-01-30 | Modernatx, Inc. | Mrna based enzyme replacement therapy combined with a pharmacological chaperone for the treatment of lysosomal storage disorders
US10460330B1 | 2018-08-09 | 2019-10-29 | Capital One Services, Llc | Intelligent face identification
JP7119863B2* | 2018-10-02 | 2022-08-17 | トヨタ自動車株式会社 | Roadside device, center server, and information processing method
CN112417922B* | 2019-08-20 | 2024-06-28 | 杭州海康威视数字技术股份有限公司 | Target recognition method and device
US12387384B2* | 2020-03-06 | 2025-08-12 | Oshkosh Corporation | Systems and methods for augmented reality application
US11805160B2 | 2020-03-23 | 2023-10-31 | Rovi Guides, Inc. | Systems and methods for concurrent content presentation
US11790364B2 | 2020-06-26 | 2023-10-17 | Rovi Guides, Inc. | Systems and methods for providing multi-factor authentication for vehicle transactions
US11599880B2* | 2020-06-26 | 2023-03-07 | Rovi Guides, Inc. | Systems and methods for providing multi-factor authentication for vehicle transactions
WO2022015849A1* | 2020-07-15 | 2022-01-20 | Mccuskey Scott A | Tag-based social interaction computing system and method
US12211061B2 | 2020-07-31 | 2025-01-28 | Adeia Guides Inc. | Systems and methods for providing an offer based on calendar data mining
MX2023001144A | 2020-12-15 | 2023-04-14 | Selex Es Inc | Systems and methods for electronic signature tracking.
JP2024512026A | 2021-03-24 | 2024-03-18 | モデルナティエックス インコーポレイテッド | Lipid nanoparticles and polynucleotide encoding ornithine transcarbamylase for the treatment of ornithine transcarbamylase deficiency
DE22868062T1 | 2021-09-09 | 2024-10-24 | Leonardo Us Cyber And Security Solutions, Llc | Systems and methods for electronic signature tracking and analysis
US12347315B2 | 2022-01-24 | 2025-07-01 | Leonardo Us Cyber And Security Solutions Llc | Systems and methods for parking management

Citations (10)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US20020032613A1* | 2000-04-18 | 2002-03-14 | Buettgenbach Thomas H. | Methods and systems for the physical delivery of goods ordered through an electronic network
US20020111881A1 | 1998-10-05 | 2002-08-15 | Walker Jay S. | Method and apparatus for maintaining a customer database using license plate scanning
US20020178073A1 | 2001-05-25 | 2002-11-28 | Kelly Gravelle | AVI for expedited mobile ordering and fulfillment
US20030042303A1* | 1999-06-07 | 2003-03-06 | Metrologic Instruments, Inc. | Automatic vehicle identification (AVI) system employing planar laser illumination imaging (PLIIM) based subsystems
US20030195821A1 | 2002-04-12 | 2003-10-16 | Kennamer Jack J. | QSR ordering system and method for drive thru operations
US20060278705A1* | 2003-02-21 | 2006-12-14 | Accenture Global Services Gmbh | Electronic Toll Management and Vehicle Identification
US20080167966A1 | 2004-10-26 | 2008-07-10 | The Coca-Cola Company | Transaction System and Method
US20110025842A1* | 2009-02-18 | 2011-02-03 | King Martin T | Automatically capturing information, such as capturing information using a document-aware device
WO2013126772A1 | 2012-02-22 | 2013-08-29 | Ebay Inc. | User identification and personalization based on automotive identifiers
US8768009B1* | 2011-07-26 | 2014-07-01 | Shawn B. Smith | Locating persons of interest based on license plate recognition information

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
WO2001045004A1* | 1999-12-17 | 2001-06-21 | Promo Vu | Interactive promotional information communicating system
AU2002363055A1* | 2001-10-19 | 2003-05-06 | Bank Of America Corporation | System and method for interative advertising
JP2006343632A* | 2005-06-10 | 2006-12-21 | Pentax Corp | Optical system for oblique projection projector
US20080040278A1* | 2006-08-11 | 2008-02-14 | Dewitt Timothy R | Image recognition authentication and advertising system
US9875719B2* | 2009-12-23 | 2018-01-23 | Gearbox, Llc | Identifying a characteristic of an individual utilizing facial recognition and providing a display for the individual
IT1400153B1* | 2010-05-21 | 2013-05-17 | Carnevale | METHODS OF USE OF BAGS FOR EXPENDITURE OF THE REUSABLE TYPE WITH DIGITAL IDENTIFICATION.
US9070242B2* | 2011-07-01 | 2015-06-30 | Digital Creations, LLC | Techniques for controlling game event influence and/or outcome in multi-player gaming environments
US20130132102A1* | 2011-11-17 | 2013-05-23 | International Business Machines Corporation | Smart parking space allocation system and method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
"International Application Serial No. PCT/US2013/027426, International Search Report mailed May 2, 2013", 2 pgs.
"International Application Serial No. PCT/US2013/027426, Written Opinion mailed May 2, 2013", 4 pgs.

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US20160189252A1* | 2012-02-22 | 2016-06-30 | Paypal, Inc. | User identification and personalization based on automotive identifiers
US9996861B2* | 2012-02-22 | 2018-06-12 | Paypal, Inc. | User identification and personalization based on automotive identifiers
US11195158B2* | 2012-09-12 | 2021-12-07 | Shreyas Kamat | Communicating payments
US20160132889A1* | 2014-03-01 | 2016-05-12 | Govindaraj Setlur | System and method for payer controlled payment processing system
US11120511B2* | 2016-07-26 | 2021-09-14 | Samsung Electronics Co., Ltd. | System and method for universal card acceptance
US20180033090A1* | 2016-07-26 | 2018-02-01 | Samsung Electronics Co., Ltd | System and method for universal card acceptance
US10341854B2 | 2016-11-30 | 2019-07-02 | Bank Of America Corporation | Creating a secure physical connection between a computer terminal and a vehicle
US10402892B2* | 2016-11-30 | 2019-09-03 | Bank Of America Corporation | Resource delivery via automated channel and virtual record
US10528929B2 | 2016-11-30 | 2020-01-07 | Bank Of America Corporation | Computer terminal having a detachable item transfer mechanism for dispensing and collecting items
US20180165678A1* | 2016-12-14 | 2018-06-14 | Mastercard International Incorporated | Methods and systems for processing a payment transaction
US10430566B2* | 2016-12-27 | 2019-10-01 | Paypal, Inc. | Vehicle based electronic authentication and device management
US20230153893A1* | 2018-12-06 | 2023-05-18 | Walmart Apollo, Llc | Systems and methods for handling alternate pick-up using vehicle recognition
US20240220945A1* | 2022-12-29 | 2024-07-04 | American Express Travel Related Services Company, Inc. | Overlay network for real-time payment networks
US12406238B2* | 2022-12-29 | 2025-09-02 | American Express Travel Related Services Company, Inc. | Overlay network for real-time payment networks
US12380457B2 | 2023-01-26 | 2025-08-05 | American Express Travel Related Services Company, Inc. | Optimal routing of payments

Also Published As

Publication number | Publication date
WO2013126772A1 | 2013-08-29
US20130216102A1 | 2013-08-22
US20160189252A1 | 2016-06-30
CA2866482A1 | 2013-08-29
AU2013222234B2 | 2016-06-02
AU2013222234A1 | 2014-09-25
US9996861B2 | 2018-06-12
CA2866482C | 2017-02-28

Similar Documents

Publication | Title
US9996861B2 | User identification and personalization based on automotive identifiers
CN112669109B | Interactive mirror display system and corresponding method and machine-readable medium
KR102658873B1 | Augmented reality devices, systems and methods for purchasing
US11037202B2 | Contextual data in augmented reality processing for item recommendations
US11893615B2 | Spot market: location aware commerce for an event
US20070136140A1 | Provision of shopping information to mobile devices
US20160086254A1 | 3d printing: marketplace with federated access to printers
US20170287018A1 | Methods and systems for performing an advertisement-based electronic transaction
WO2014210020A1 | Integrated online and offline inventory management
US10489840B2 | System, method, and non-transitory computer-readable storage media related to providing real-time price matching and time synchronization encryption
CN106796700A | User interface using tagged media, 3D indexed virtual reality images and GPS location for e-commerce
US20170364889A1 | Retail checkout systems and methods
US20160350365A1 | Mobile search
US12008616B2 | Systems and methods for providing an enhanced analytical engine
US20180165738A1 | Enhanced View System
WO2015066025A2 | User susceptibility profiles in marketplace environments
KR20140007274A | A server and a product trading terminal for providing trading service on the on/of line, and a method for providing trading service

Legal Events

Date | Code | Title | Description
AS | Assignment

Owner name: EBAY INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:RYAN, MICHAEL JOSEPH;BONCIMINO, CHRISTOPHER DENNIS;REEL/FRAME:029926/0864

Effective date: 20121213

AS | Assignment

Owner name: PAYPAL, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:EBAY INC.;REEL/FRAME:036170/0202

Effective date: 20150717

STCF | Information on status: patent grant

Free format text: PATENTED CASE

MAFP | Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4

MAFP | Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8

