US9270885B2 - Method, system, and computer program product for gamifying the process of obtaining panoramic images - Google Patents

Method, system, and computer program product for gamifying the process of obtaining panoramic images

Info

Publication number
US9270885B2
US9270885B2; US13/662,073; US201213662073A
Authority
US
United States
Prior art keywords
image
user device
quality
capture
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US13/662,073
Other versions
US20140118479A1 (en)
Inventor
Evan Rapoport
Scott Ettinger
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Google LLC
Original Assignee
Google LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Google LLC
Priority to US13/662,073
Assigned to GOOGLE INC. Assignment of assignors interest (see document for details). Assignors: RAPOPORT, EVAN; ETTINGER, SCOTT
Priority to PCT/US2013/029772 (WO2014065854A1)
Publication of US20140118479A1
Priority to US14/990,238 (US9667862B2)
Application granted
Publication of US9270885B2
Priority to US15/491,573 (US9832374B2)
Assigned to GOOGLE LLC. Change of name (see document for details). Assignors: GOOGLE INC.
Priority to US15/791,580 (US10165179B2)
Status: Active
Adjusted expiration

Abstract

Systems, methods, and computer readable mediums are provided to generate a number of targets for a panoramic image, each of the targets defining a portion of the panoramic image, monitor a position of a user device with respect to a current target, responsive to determining that the user device is properly positioned with respect to the current target, capture a first image for the current target using a camera of the user device, monitor the position of the user device with respect to a next target, responsive to determining that the user device is properly positioned with respect to the next target, capture a second image for the next target using the camera of the user device; and generate the panoramic image using the first image and the second image.

Description

TECHNICAL FIELD
The present disclosure relates to a process for obtaining panoramic images. More specifically, embodiments of the present disclosure use an augmented video stream to encourage users to properly capture panoramic images using an image capture device.
BACKGROUND
Panoramic photography involves capturing images with enlarged fields of view. Specialized hardware and/or software is typically used to capture individual images, which are then stitched together to form panoramic images. For example, a digital camera may be equipped with video capture capabilities such that when a user sweeps the camera through a field of view, individual images are continuously captured and then used to form a panoramic image. At this stage, the digital camera includes software to stitch the individual images together in order to create a panoramic image with a wider field of view. In this example, the quality of the panoramic image is affected by the velocity and steadiness of the digital camera as it is swept through the field of view. To improve the quality of panoramic images, digital cameras typically include functionality to indicate whether the digital camera is moving at an appropriate velocity.
Typical panoramic photography techniques are often time-consuming and tedious. Users may grow uninterested during a panoramic image capture, resulting in low quality images that are not suitable for stitching into a panoramic image. Or users may not be motivated enough to invest the amount of time required to learn and then properly execute a panoramic image capture.
SUMMARY
Various embodiments of systems, methods, and computer readable mediums for obtaining panoramic images are described herein. In some aspects, provided are a system, method, and computer readable medium for generating a number of targets for a panoramic image, each of the targets defining a portion of the panoramic image, monitoring a position of a user device with respect to a current target, responsive to determining that the user device is properly positioned with respect to the current target, capturing a first image for the current target using a camera of the user device, monitoring the position of the user device with respect to a next target, responsive to determining that the user device is properly positioned with respect to the next target, capturing a second image for the next target using the camera of the user device; and generating the panoramic image using the first image and the second image.
In some aspects, the system, method, and computer readable medium are further for, responsive to determining that a targeting guide of the user device is within a threshold distance of the current target, displaying a high quality indicator at the current target. In some aspects, the system, method, and computer readable medium are further for displaying a low quality indicator at the current target while the targeting guide of the user device is outside the threshold distance.
In some aspects, the system, method, and computer readable medium are further for calculating a quality of the first image based on quality factors (e.g., velocity of the user device during the capture of the first image, exposure of the camera during the capture of the first image, rotational position of the user device during the capture of the first image, distance of a targeting guide of the user device from the first target during the capture of the first image, overlap of the first image with the second image) and determining whether the quality of the first image satisfies a quality threshold.
In some aspects, the system, method, and computer readable medium are further for, responsive to determining that the quality of the first image is below the quality threshold, discarding the first image and capturing a new image for the current target using the camera of the user device. In some aspects, the quality threshold is determined based on historical quality data of a user of the user device, the historical quality data being generated based on image quality of previously captured panoramic images.
In some aspects, the system, method, and computer readable medium are further for calculating a quality of the second image based on at least one of velocity of the user device during the capture of the second image, exposure of the camera during the capture of the second image, and rotational position of the user device during the capture of the second image and determining a quality of the panoramic image based on the quality of the first image and the quality of the second image.
BRIEF DESCRIPTION OF THE DRAWINGS
FIGS. 1A-1D show example user interfaces in accordance with one or more embodiments.
FIGS. 2 and 3A-3C show diagrams of systems in accordance with one or more embodiments.
FIG. 4 shows a flow chart in accordance with one or more embodiments.
FIG. 5 shows an example image target graph in accordance with one or more embodiments.
FIG. 6 shows a flow chart in accordance with one or more embodiments.
FIG. 7 shows an example user interface in accordance with one or more embodiments.
While obtaining panoramic images is susceptible to various modifications and alternative forms, specific embodiments thereof are shown by way of example in the drawings and will herein be described in detail. The drawings may not be to scale. It should be understood, however, that the drawings and detailed description thereto are not intended to limit obtaining panoramic images to the particular form disclosed, but to the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present disclosure as defined by the appended claims.
DETAILED DESCRIPTION
As discussed in more detail below, provided in some embodiments are systems and methods for obtaining panoramic images using an augmented video stream. In one embodiment, the process for obtaining panoramic images using an augmented video stream includes the steps of generating a number of targets for a panoramic image, each of the targets defining a portion of the panoramic image, monitoring a position of the user device with respect to a current target, responsive to determining that the user device is properly positioned with respect to the current target, capturing a first image for the current target using a camera of the user device, monitoring the position of the user device with respect to a next target, responsive to determining that the user device is properly positioned with respect to the next target, capturing a second image for the next target using the camera of the user device; and generating the panoramic image using the first image and the second image.
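To make the flow concrete, the loop below sketches the sequence just described. This is an editorial illustration, not code from the disclosure: the helper names (generate_targets, device_pose, angular_distance, capture_frame, stitch) and the alignment tolerance are assumptions.

```python
import time

# Hypothetical sketch of the capture loop described above; all helpers
# and constants are assumed names, not APIs from the patent.
ALIGNMENT_THRESHOLD = 0.05  # assumed tolerance between guide and target

def capture_panorama(camera):
    # Generate a number of targets, each defining a portion of the panorama.
    targets = generate_targets(camera.angle_of_view)
    images = []
    for target in targets:
        # Monitor the device position with respect to the current target.
        while angular_distance(device_pose(), target) > ALIGNMENT_THRESHOLD:
            time.sleep(0.05)  # in practice, keep refreshing the video overlay
        # Properly positioned: capture an image for this target.
        images.append(capture_frame(camera))
    # Combine the captured images into the panoramic image.
    return stitch(images)
```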
A panoramic image is an image having an expanded field of view that exceeds the bounds of individual images that can be captured by a camera's lens. In some embodiments, the panoramic image is generated by stitching together overlapping images that in combination cover the expanded field of view. The overlapping images can be captured by, for example, an image capture device at intervals or a video capture device in a continuous stream as the camera is swept across the expanded field of view. The field of view of a panoramic image can be expanded both horizontally and vertically. For instance, a 360-degree panoramic image can be obtained by capturing overlapping images as a camera is completely rotated around a fixed point.
FIGS. 1A-1D show example interfaces in accordance with embodiments of obtaining panoramic images. More specifically, FIGS. 1A-1D show example user interfaces for performing panoramic image capture on a user device 102.
In FIG. 1A, the user device 102 includes a device display 103 displaying an augmented video stream provided by a camera, such as an image capture device disposed facing out, away from a rear face of the user device 102, which is into the page in the view of FIG. 1A. Examples of device display 103 technologies include multi-touch capacitive screens, organic light emitting diode (OLED) screens, etc. The augmented video stream may display targets 105, which are shown as dotted circles. The targets 105 are located at the center of overlapping target images that, once captured, are to be combined in a panoramic image.
In this example, the user has already captured a target image near the current location of the targeting guide 109. The targeting guide 109 is located at the vertical and horizontal center of the device display 103 and is used by the user to redirect the camera of the user device 102 towards each of the targets 105. As target images are captured at each of the targets 105, a preview 106 shown at the bottom of the device display 103 is updated to include the captured target images. The preview 106 shows a panoramic image where each of the target images is projected into a flat image that accounts for the spherical angle of view used to capture the target images. Examples of image projections that may be used to generate the panoramic image include, but are not limited to, an equirectangular projection, a cylindrical projection, a rectilinear projection, a fisheye projection, a Mercator projection, a sinusoidal projection, and a stereographic projection. The projection reduces distortion caused by the wider angle of view of the panoramic image.
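As one concrete example of the projections named above, an equirectangular projection maps camera angles linearly onto flat image coordinates. The sketch below is illustrative only; the image dimensions and the yaw/pitch conventions are assumptions, not values from the disclosure.

```python
import math

# Assumed preview dimensions for the sketch (a 2:1 aspect ratio is
# typical for a full spherical equirectangular image).
WIDTH, HEIGHT = 2048, 1024

def equirectangular(yaw, pitch):
    """Map yaw in [-pi, pi] and pitch in [-pi/2, pi/2] to (x, y) pixels."""
    x = (yaw + math.pi) / (2 * math.pi) * WIDTH
    y = (math.pi / 2 - pitch) / math.pi * HEIGHT
    return x, y
```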
The targets 105 may be captured by the user in any order. As target images are captured for each of the targets 105, additional targets may be dynamically added to further direct the user to redirect the camera of the user device 102. As the user repositions the user device 102, the targets 105 remain affixed to the field of view displayed in the device display 103. In other words, the representation of targets 105 in the device display 103 shifts with the field of view shown on the device display 103 as the user device 102 is repositioned.
In some embodiments, the user device 102 includes a capture button 112 for initiating a panoramic image capture. In response to the user selecting the capture button 112, a camera application of the user device 102 may present the augmented video stream shown on the device display 103 to initiate the panoramic image capture. The panoramic image capture may be completed when target images have been captured for all presented targets 105 or when the user selects the capture button 112 a second time to stop the panoramic image capture. At this stage, the user device 102 may generate the panoramic image by stitching together the target images. In some embodiments, the target images are projected from a three-dimensional (3D) coordinate system to a two-dimensional (2D) perspective projection when generating the panoramic image, similar to the projection discussed above with respect to the preview 106.
In FIG. 1B, the device display 103 is displaying an augmented video stream similar to the one shown in FIG. 1A except that a single target 105 is displayed and the preview of FIG. 1A is replaced with a soft capture button 112. The soft capture button 112 includes an indicator of the current state of the panoramic capture process. In this example, the indicator of the soft capture button 112 shows a square stop sign, which indicates that the user may halt the panoramic capture process by selecting the soft capture button 112.
FIG. 1B also differs from FIG. 1A in that the targeting guide 109 is presented as an incomplete circle in the center of which the target 105 should be placed in order to properly position the user device 102 for the next image capture of the panoramic capture process. In this example, the targeting guide 109 is also surrounded by target bounds 113 that show the extent of the images captured during the panoramic capture process.
In FIG. 1C, the device display 103 is displaying an augmented video stream similar to the one shown in FIG. 1A except that the targets 105 of FIG. 1A are replaced with quality indicators (e.g., low quality indicators 110, high quality indicator 111). The quality indicators (e.g., low quality indicators 110, high quality indicator 111) notify the user of the quality of the positioning of the user device 102 with respect to each of the quality indicators. The high quality indicator 111 is a happy face, which indicates that the user device 102 is well positioned for capturing an image at the high quality indicator 111. The low quality indicators 110 are sad faces, which indicate that the user device 102 is not well positioned for capturing images at each of the low quality indicators 110.
In some embodiments, the quality of the positioning of the user device 102 may be determined based on the position of the targeting guide 109 with respect to the quality indicators (e.g., low quality indicators 110, high quality indicator 111). For example, a high quality position would be indicated if the targeting guide 109 is proximate to a quality indicator, as shown for the high quality indicator 111 in FIG. 1C.
The low quality indicators 110 can continue to display sad faces until the targeting guide 109 is detected to be within a threshold distance of one of the low quality indicators 110. Once the targeting guide 109 is within the threshold distance, the low quality indicator 110 may change from a sad face to a happy face to indicate that the user device 102 is properly positioned to capture a target image at that location. Those skilled in the art will appreciate that other styles of quality indicators (e.g., low quality indicators 110, high quality indicator 111) may be used to indicate the quality of current positioning. For example, the quality indicators (e.g., low quality indicators 110, high quality indicator 111) may be rendered as butterflies that begin flapping their wings when the user device 102 is properly positioned. In another example, the quality indicators (e.g., low quality indicators 110, high quality indicator 111) may be rendered as targets that are struck by bullets when the user device 102 is properly positioned.
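A minimal sketch of this indicator logic follows, assuming screen-space positions for the targeting guide and targets; the pixel threshold is an assumed value.

```python
import math

THRESHOLD_PX = 40  # assumed threshold distance, in pixels

def indicator_state(guide_xy, target_xy):
    """Pick the face to draw at a target: happy once the guide is close."""
    if math.dist(guide_xy, target_xy) <= THRESHOLD_PX:
        return "happy"  # device is properly positioned for this target
    return "sad"        # device is not yet properly positioned
```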
In FIG. 1D, the device display 103 is displaying an augmented video stream similar to the one shown in FIG. 1B except that it also includes a capture selection control 114. The capture selection control 114 allows the user to switch between different camera modes of the user device 102. In this example, from top to bottom, the capture selection control 114 includes selections for (1) spherical panoramic capture; (2) panoramic capture; (3) video capture; and (4) image capture. FIG. 1D shows the spherical panoramic capture as being selected within the capture selection control 114.
FIG. 2 shows a diagram of a system in accordance with one embodiment. The system of this embodiment includes user devices (e.g., user device A 102A and user device N 102N) interacting with application server(s) 208. Further, the illustrated application server 208 stores information in an image repository 210. FIGS. 3A-3C describe further aspects of the aforementioned components of FIG. 2.
Examples of user devices (e.g., user device A 102A, user device N 102N) include digital cameras, smartphones, tablet computers, laptop computers, augmented reality head-mounted displays, etc. Each of the user devices (e.g., user device A 102A, user device N 102N) is equipped with a camera configured to capture images. Specifically, the user devices (e.g., user device A 102A, user device N 102N) may be configured to capture individual images that can be stitched together in order to create a panoramic image. As shown in FIG. 2, the user devices (e.g., user device A 102A, user device N 102N) in this example are operated by users (e.g., user A 204A, user N 204N).
In some embodiments, the application server(s) 208 may include an image service server and a map service server. Each of the application server(s) 208 may be implemented on multiple computing devices (i.e., servers), where a load balancing scheme distributes requests across the multiple computing devices. The image service server 208 may be substantially similar to the application server discussed below with respect to FIGS. 3A-3C. The map service server 208 may be configured to provide spatial data (e.g., maps, geographic coordinates, directions, etc.) to the user devices (e.g., user device A 102A, user device N 102N). For example, the map service server 208 may provide a map displayed on user device A 102A, where user A 204A uses the map to locate a nearby point of interest. Alternatively or additionally, the map service server 208 may, in some embodiments, also provide images for the points of interest, such as images of a building, path, road, waterway, or other feature, which are viewed by the user A 204A on the user device A 102A. The map service server 208 may be configured to obtain images for maps from the image repository 210. In some embodiments, additional repositories at the same or different location as the image repository 210 may also be queried by the map service server 208 to generate maps of the points of interest or geographic areas for the user devices (e.g., user device A 102A, user device N 102N).
In some embodiments, the image service server 208 is configured to obtain and store images, where the stored images may be associated with the corresponding users (e.g., user A 204A, user N 204N) for use via the Internet (e.g., sharing on social networks, cloud storage, etc.). As images are received by the image service server 208 in some embodiments, the images are stored in the image repository 210, where the image service server 208 may associate the stored images with points of interest and geographic areas.
FIG. 3A shows a diagram of a system in accordance with some embodiments of obtaining panoramic images. The example system includes a user device 102 interacting with an application server 208. Further, the application server 208 of this embodiment stores information in an image repository 210 and interacts with a social networking service 207.
In some embodiments, the user device 102 is a mobile computing device. For example, the user device 102 may be a digital camera, a laptop computer, a smartphone, a tablet computer, a wirelessly-networked imaging device, an augmented reality head-mounted display, or other image capture device configured to be readily transported with a user over a distance. In some embodiments, the user device 102 includes a camera 324 configured to capture images, such as in a video format or as still images, including stereoscopic video or still images. For instance, the camera 324 may include one or more image sensors configured to capture images of light within the visible spectrum for use by the user device 102.
In some embodiments, the user device 102 includes a processor 318, an input/output module 320, and a memory 322. The user device 102 may be implemented as a computing device with an operating system, stored in the memory 322, for interacting with a user. For example, the operating system may be configured to provide applications (e.g., camera application, map application, social networking application, etc.) to the user. In some embodiments, the memory 322 includes an image storage unit 326 and a target display unit 330.
In some embodiments, the image storage unit 326 of the user device 102 is configured to manage the images captured by the camera 324. For example, the image storage unit 326 may be configured to (1) store images 354 of FIG. 3B captured by the camera 324 and/or (2) transmit the stored images 354 of FIG. 3B to the application server 208. In some embodiments, the stored images 354 of FIG. 3B may be stored on a local, tangible storage medium (e.g., random access memory, flash memory, etc.) of the user device 102.
In some embodiments, the image storage unit 326 may further include a stitching module 356 of FIG. 3B that is configured to stitch images together by (1) matching objects (i.e., overlapping the same object in neighboring images) across images; (2) calibrating the images to minimize differences between the images (e.g., optical defects, exposure differences, image capture device parameters when the image was captured, etc.); and (3) blending the images based on the calibration (e.g., adjusting colors for exposure differences, motion compensation, etc.).
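For a sense of how such a pipeline behaves in practice, OpenCV's high-level stitcher performs the same three stages (matching, calibration, blending) internally. This is a stand-in for illustration, not the stitching module 356 of the disclosure.

```python
import cv2

def stitch_images(paths):
    """Stitch the images at the given file paths into one panorama."""
    images = [cv2.imread(p) for p in paths]
    stitcher = cv2.Stitcher.create(cv2.Stitcher_PANORAMA)
    status, panorama = stitcher.stitch(images)  # match, calibrate, blend
    if status != cv2.Stitcher_OK:
        raise RuntimeError(f"stitching failed with status {status}")
    return panorama
```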
In some embodiments, the stitching module 356 of FIG. 3B is configured to improve the image stitching process by using location information associated with the images. For example, the stitching module 356 of FIG. 3B may be configured to use the location information of the images to account for the different perspective points of each of the images when identifying matching objects in the images. In this example, the location information may be obtained from a positioning device such as the Global Positioning System or from a motion detection device 325 of the user device 102 as discussed below.
The image storage unit 326 may associate operating parameters of the camera 324 with the image, e.g., the time at which the image was captured and data indicative of the quality of the image, such as the velocity of the user device 102 during image capture. In some embodiments, camera settings and attributes of the user device at the time the image was captured may also be associated with the image by the image storage unit 326, such as resolution, exposure time, aperture, depth of focus, and post-processing settings (e.g., white balance, compression settings, sharpness adjustments). For instance, the user device 102 may include a motion detection device 325 (e.g., an accelerometer such as a 3-axis accelerometer or a 6-axis accelerometer), and based on signals from the motion detection device 325, aspects of the positioning of the user device 102, such as the altitude and orientation between a portrait or landscape view (e.g., angular position of the image sensor about a horizontal axis) of the user device 102, may be associated with the image by the image storage unit 326. The motion detection device 325 may also include a magnetometer or other sensor configured to determine the azimuth of the user device 102 at the time the image is captured. In this case, the azimuth may also be associated with the image by the image storage unit 326.
In some embodiments, the image storage unit 326 may include a server interface 358 of FIG. 3B that is configured to provide images and associated data to the application server 208. For example, the server interface 358 of FIG. 3B may provide panoramic images to the application server 208 along with a request to store the panoramic images in cloud storage or to share the panoramic images on a social network. In some embodiments, the request may also include quality data (e.g., coverage of the panoramic image, proper overlap between images during stitching, clarity of the image, etc.) associated with the panoramic images. For example, the server interface 358 of FIG. 3B may be configured to provide a panoramic image and a calculated quality of the image to the application server 208 for sharing on a social networking service, where the user of the user device 102 earns points or achievements based on the quality of the panoramic image.
In some embodiments, the target display unit 330 includes a user interface controller 362 of FIG. 3B configured to display guidance for capturing a panoramic image on a display screen (not shown) of the user device 102. For example, the user interface controller 362 of FIG. 3B may superimpose a targeting guide and targets in a video stream obtained from the camera 324 in order to encourage the user to properly reposition the user device 102. In this example, the operating parameters of the camera 324 may be analyzed by the user interface controller 362 of FIG. 3B to place targets in the video stream where the user should capture images, which can then be stitched into a panoramic image by the stitching module 356 of FIG. 3B. Further, the user interface controller 362 of FIG. 3B may also be configured to monitor the position and orientation (obtained from, for example, accelerometers) of the camera 324 to track the movement of the user in order to update the superimposed user interface elements as the user device 102 is repositioned. In some embodiments, the user interface elements superimposed by the user interface controller 362 of FIG. 3B may be as discussed above with respect to FIGS. 1A-1D.
In some embodiments, the user interface controller 362 of FIG. 3B of the target display unit 330 is further configured to display confirmation of each successful image capture on the display screen of the user device 102. For example, the user interface controller 362 of FIG. 3B may notify the user as each image is properly captured during a panoramic image capture and update a preview of the panoramic image showing a panorama including the currently captured images. In this example, the user interface controller 362 of FIG. 3B may also be configured to display the completed panoramic image for review by the user after all the images have been captured. In some embodiments, the user interface controller 362 of FIG. 3B may also be configured to display quality indicators at the targets, where the quality indicators notify the user as to whether the user device 102 is properly positioned to capture an image at each of these targets. In this example, the quality of each of the individual images may be calculated based on how closely centered and properly aligned the targeting guide is with respect to the corresponding image target (e.g., targets or quality indicators as discussed above with respect to FIGS. 1A-1D) when the individual images are captured.
In some embodiments, the user interface controller 362 of FIG. 3B may also be configured to track the quality of panoramic images captured by the camera 324 as historical quality data. The historical quality data may be used to adjust a quality threshold for the individual images captured by the camera 324, where the quality threshold specifies the minimum quality that should be achieved before an individual image can be included in a panoramic image. In some embodiments, the historical quality data allows the quality threshold to be adjusted based on the image capturing capabilities of the user. For example, an experienced user that consistently captures high quality images may have a higher quality threshold, whereas a novice user that captures images of inconsistent quality may have a lower quality threshold.
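A minimal sketch of such an adaptive threshold follows; the scores are assumed to be normalized to [0, 1], and the baseline, bounds, and scaling factor are assumptions chosen only for illustration.

```python
def adjusted_quality_threshold(historical_scores, baseline=0.5):
    """Raise the bar for consistently good users, lower it for novices."""
    if not historical_scores:
        return baseline  # no history yet: fall back to the default
    mean_quality = sum(historical_scores) / len(historical_scores)
    # Clamp so the threshold never becomes trivially easy or impossible.
    return min(0.9, max(0.2, 0.8 * mean_quality))
```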
In some embodiments, the target display unit 330 further includes a target graph module 364 of FIG. 3B configured to generate an image target graph that defines the bounds of the images to be captured in order to generate a panoramic image. For example, when a panoramic image capture is initiated by the user, the target graph module 364 of FIG. 3B may generate an image target graph based on the current orientation and the angle of view of the camera 324. In this example, the image target graph may be used by the user interface controller 362 of FIG. 3B to place superimposed user interface elements as discussed above on the video stream of the camera 324 during a panoramic image capture.
In some embodiments, the application server 208 is a computing device configured to provide application services (e.g., image services, map services, etc.) to a number of client devices such as the user device 102. In some embodiments, the application server 208 includes a processor 332, an input/output module 334, and a memory 336. The application server 208 may include various types of computing devices that execute an operating system. The processor 332 may execute instructions, including instructions stored in the memory 336. The instructions, like the other instructions executed by computing devices herein, may be stored on a non-transitory computer readable medium such as an optical disk (e.g., compact disc, digital versatile disk, etc.), a flash drive, a hard drive, or any other computer readable storage device. The input/output module 334 of the application server 208 may include an input module, such as a radio frequency sensor, a keyboard, and/or a mouse, and an output module, such as a radio frequency transmitter, a printer, and/or a monitor. The application server 208 may be connected to a local area network (LAN) or a wide area network (e.g., the Internet) via a network interface connection. In some embodiments, the input/output module 334 may take other forms.
In some embodiments, the memory 336 includes a device authorizer 340 and an image manager 342. The aforementioned components of the application server 208 may be implemented on multiple computing devices (i.e., servers), where a load balancing scheme distributes requests across the multiple computing devices.
In some embodiments, the device authorizer module 340 of the application server 208 is configured to manage user sessions for user devices 204. For example, the device authorizer module 340 of this embodiment includes a device interface 370 of FIG. 3C configured to authenticate credentials from the user device 102 when initiating a user session. In this example, the user device 102 is not authorized to interact with the application server 208 until the credentials are confirmed to be valid by the device interface 370 of FIG. 3C. In some embodiments, the device authorizer 340 also includes a credentials repository 372 of FIG. 3C configured to store encrypted credentials used to authorize the users of the application server 208.
In some embodiments, the device interface 370 of FIG. 3C of the device authorizer module 340 may also be configured to interact with a social networking service 207 on behalf of the user device 102. In this case, the device interface 370 of FIG. 3C is configured to request authorization to access the social networking service 207 from the user device 102. Once authorized, the device interface 370 of FIG. 3C may interact with the social networking service 207 to post images provided by the user device 102 and provide social rewards in response to those images to the user.
In some embodiments, the image manager module 342 of the application server 208 is configured to manage images received from user devices 102. Specifically, the image manager module 342 may include: (1) a spatial data processor 366 of FIG. 3C configured to associate the images with geographic locations; (2) a repository interface 376 of FIG. 3C configured to store and manage images in the image repository 210; and (3) a location receiver 382 of FIG. 3C configured to process location information received from user devices 102.
In some embodiments, the location receiver 382 of FIG. 3C of the image manager module 342 may be configured to manage location information received from user devices 102. The location receiver 382 of FIG. 3C may be configured to receive location information from the user device 102, where the location information is associated with corresponding images provided by the user device 102. In this case, the location receiver 382 of FIG. 3C may be configured to anonymize the location information before it is stored to protect the identity of the user. For example, personal data identifying the user is stripped from the location information before the location information is stored in the memory 336. Further, the location information may only be obtained from the user device 102 if the user elects to participate in image collection for the application server 208.
In some embodiments, the spatial data processor 366 of FIG. 3C may be configured to associate images provided by user devices 102 with geographic locations. For example, the images provided by the user devices 102 may include embedded location information such as geographic coordinates. In this example, the spatial data processor 366 of FIG. 3C may associate the images with their corresponding geographic locations in a spatial database. The spatial database may then be used to provide images that are included in maps of the geographic locations.
In some embodiments, the repository interface 376 of FIG. 3C is configured to store images in the image repository 210. For example, the repository interface 376 of FIG. 3C may store the images in the image repository 210 in order to provide a cloud storage service. In this example, the repository interface 376 of FIG. 3C may also be configured to retrieve and provide the images to the user device 102 for the user. In another example, the stored images may be related to location information (i.e., geographic coordinates), allowing a map service to use the stored images as spatial data for generating maps. The image repository 210 may correspond to a server, a database, files, a memory cache, etc. that is stored locally (e.g., located on the application server) or shared on a network (e.g., a database server). The user device 102 may interact directly with the image repository 210 to directly store captured images. In some embodiments, metadata associated with the stored images is stored in a separate repository (not shown). For example, the image repository 210 and the separate repository may be organized in a distributed relational database architecture.
In some embodiments, the image repository 210, or a related repository, is configured to store information related to the stored images. For example, the image repository 210 may also store results of analysis (e.g., object recognition, etc.) performed on the stored images. In another example, the image repository 210 may also store metadata (e.g., geographic location of image, timestamp of image, format of image, etc.) related to the stored images.
FIG. 4 shows a flow chart in accordance with certain embodiments. More specifically, FIG. 4 is a flow chart of a method performed by a user device to obtain a panoramic image. As is the case with the other processes described herein, various embodiments may not include all of the steps described below, may include additional steps, and may sequence the steps differently. Accordingly, the specific arrangement of steps shown in FIG. 4 should not be construed as limiting the scope of obtaining panoramic images.
In step 402 of this embodiment, a request for a panoramic image capture is received by the user device. The request may be initiated by a user selecting a panoramic command (e.g., button on a touch screen interface, manual button on a digital camera, etc.) on the user interface of the user device. In some embodiments, the request from the user includes operating parameters such as an orientation and angle of view of a camera of the user device. The request from the user may also include a stitching overlap threshold, which specifies an amount of overlap that should occur between neighboring individual images. In some embodiments, the stitching overlap threshold may have a default value that is configurable by the user.
In step 404 of this embodiment, a set of targets is generated based on the operating parameters of the camera. The set of targets may be generated as an image target graph that includes image targets for obtaining overlapping images sufficient to build a 360-degree panoramic view at the current point of view of the user. For example, based on the angle of view and orientation of the camera, the image target graph is generated with a number of image targets represented as target nodes, where neighboring target nodes are connected by proximate edges. In this example, both the vertical and horizontal angle of view of the camera may be accounted for when determining the image targets of the image target graph. Further, as the stitching overlap threshold increases, the number of individual images required for a complete 360-degree panoramic view increases. An example image target graph for a wide lens camera in a landscape orientation is discussed below with respect to FIG. 5.
In some embodiments, an image target graph such as the one shown in FIG. 5 is generated by selecting image targets according to the vertical and horizontal angle of view of the camera. In this case, the image targets within the image target graph are positioned such that each image captured at an image target overlaps, both vertically and horizontally, any neighboring images by the overlap threshold. Further, the selection of image targets for the image target graph varies depending upon the operating parameters of the camera. Generally, a camera with a wider angle of view has fewer image targets within its image target graph because each captured image covers a larger proportion of a full 360-degree panorama. Conversely, a camera with a narrower angle of view has more image targets within its image target graph because each captured image covers a smaller proportion of a full 360-degree panorama. The image targets selected are also similarly affected by the overlap threshold. Specifically, as the overlap threshold increases, the number of image targets within the image target graph increases, and vice versa.
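The relationship between angle of view, overlap threshold, and target count can be illustrated with a back-of-the-envelope calculation for a single horizontal ring of targets; the formula is an assumption consistent with the text, not one given in the disclosure.

```python
import math

def targets_per_ring(angle_of_view_deg, overlap_fraction):
    """Targets needed to cover 360 degrees with the given overlap."""
    # Each new image contributes only the non-overlapping part of its view.
    effective_deg = angle_of_view_deg * (1 - overlap_fraction)
    return math.ceil(360 / effective_deg)

# e.g., a 60-degree lens with 25% required overlap needs
# ceil(360 / 45) = 8 targets; raising the overlap raises the count.
```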
In step 406 of this embodiment, target(s) are presented on the display of the user device. Initially, the target may correspond to an initial target node at the center of the image target graph. In this case, the user may be encouraged to position a targeting guide displayed on the user device over the initial target. For example, the initial target may flash or otherwise notify the user that it is the current target for the panoramic image capture.
In step 408 of this embodiment, a determination is made as to whether the user device is properly positioned. The user device may be determined to be properly positioned when the targeting guide is centered over and aligned with one of the target(s). If the user device is not properly positioned, the process returns to step 406.
Once the user device is properly positioned, an image is captured in step 410. The user device may be determined to be properly positioned when target requirements for a target of the panoramic image capture are satisfied. For example, the target requirements may correspond to image quality factors (e.g., velocity of the user device, distance of the targeting guide from the image target, etc.) that should be satisfied before an image is captured at the target. In this example, thresholds for the image quality factors can be configured by the user to, for example, be consistent with the proficiency level of the user (i.e., more experienced users may specify higher thresholds for the image quality factors).
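The positioning check can be sketched as a simple comparison of each quality factor against its configured threshold; the factor names and values below are illustrative assumptions.

```python
def properly_positioned(factors, thresholds):
    """True only when every quality factor is within its threshold."""
    return all(factors[name] <= limit for name, limit in thresholds.items())

# Example: relatively lenient limits, as a novice user might configure.
limits = {"velocity": 0.3, "guide_distance_px": 50}
print(properly_positioned({"velocity": 0.1, "guide_distance_px": 12}, limits))
# -> True, so the capture at this target may proceed
```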
In step 412 of this embodiment, a determination is made as to whether there is sufficient coverage for the panoramic image. Sufficient coverage for the panoramic image may be, for example, sufficient images to generate a complete 360-degree panoramic image at the current position of the user. In other embodiments, it may be determined that there is sufficient coverage for the panoramic image in response to a request from the user to stop the panoramic image capture.
If the current coverage of the images is not sufficient to generate the panoramic image, the proximate targets of the last target captured may be updated for the user in step 414. The proximate targets may be determined according to the image target graph generated in step 404. For example, if the last target captured is the initial target, the device display may be updated to also display the proximate targets that neighbor the initial target node in the image target graph. Similarly, as each image of the panorama is captured in step 410, the device display is updated to also display the proximate targets of the last target captured.
In step 416 of this embodiment, a panoramic preview may be refreshed to include the image captured in step 410. As each of the individual images of the panorama is captured in step 410, the panoramic preview may be updated with the latest captured image. The panoramic preview allows the user to view the overall progress of the panoramic image as the user device is repositioned to obtain each of the individual images. After the proximate targets and panoramic preview are updated, the method may return to step 408 to determine whether to capture the next image in a similar manner as discussed above with respect to steps 410-416. In some embodiments, when multiple targets are displayed on the display of the user device, the user may be allowed to capture images for the targets in any order.
If it is determined that there is sufficient coverage for the panoramic image, the panoramic image may be generated in step 418 of this embodiment. The panoramic image may be stitched together from the individual images captured in step 410. In some embodiments, the image targets used to capture the individual images may also be used to generate the panoramic image. In this case, the image targets may be used to properly place and stitch the individual images together to form the panoramic image.
In step 420 of this embodiment, a quality rating of the panoramic image is determined. The quality rating of the panoramic image describes an overall quality of the panoramic image based on, for example, the quality factors of the individual images discussed above in step 410. In this example, other quality information related to the panoramic image may also be used to determine the quality rating, such as dynamic range, contrast, sharpness, and color accuracy of the generated panoramic image.
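One plausible way to combine these signals into a single rating is a weighted average; the weights below are assumptions for illustration, not values from the disclosure.

```python
def panorama_quality_rating(image_scores, dynamic_range, contrast, sharpness):
    """Blend per-image quality with whole-panorama quality signals."""
    per_image = sum(image_scores) / len(image_scores)
    return (0.55 * per_image        # quality factors of the individual images
            + 0.15 * dynamic_range  # whole-panorama signals, each in [0, 1]
            + 0.15 * contrast
            + 0.15 * sharpness)
```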
FIG. 5 shows an example image target graph in accordance with embodiments of obtaining panoramic images. More specifically, FIG. 5 shows an example image target graph for a user device with a camera operating in landscape mode.
In FIG. 5, the image target graph includes target nodes 502 and proximate edges 504, which connect neighboring target nodes 502. Each of the target nodes 502 represents an image target for generating a panoramic image. In this example, the image targets are positioned such that if images are captured at each of the image targets, the images can be stitched together to generate a full 360-degree panoramic image.
An initial node 506 is shown at the center of the image target graph. After a panoramic image capture is initiated, an initial target corresponding to the initial node 506 may be displayed on a user device as discussed above with respect to FIG. 4. In response to an image being captured at the initial target, the display of the user device may be updated to include targets corresponding to the target nodes 502 that are connected to the initial node 506 via proximate edges 504. As additional images are captured for the newly included image targets, the display of the user device may be updated to further include additional targets for target nodes 502 connected by proximate edges 504.
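Treating the image target graph as an adjacency map, this incremental reveal can be sketched in a few lines; the names are illustrative, not from the disclosure.

```python
def reveal_proximate_targets(captured_node, graph, visible):
    """After a capture, make the captured node's neighbors visible."""
    visible.update(graph[captured_node])  # follow the proximate edges
    return visible

# Starting from the initial node, each capture expands the visible set:
# visible = {initial}; reveal_proximate_targets(initial, graph, visible)
```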
FIG. 6 shows a flow chart in accordance with certain embodiments. More specifically, FIG. 6 is a flow chart of a method performed by a user device to obtain a panoramic image. As is the case with the other processes described herein, various embodiments may not include all of the steps described below, may include additional steps, and may sequence the steps differently. Accordingly, the specific arrangement of steps shown in FIG. 6 should not be construed as limiting the scope of obtaining panoramic images.
In step 602 of this embodiment, a request for a panoramic image capture is received by the user device. The request may be initiated by a user selecting a panoramic command (e.g., button on a touch screen interface, manual button on a digital camera, etc.) on the user interface of the user device. In some embodiments, the request from the user includes operating parameters such as an image quality threshold and a target size of the panoramic image. The image quality threshold may specify the minimum quality required for each of the individual images that are to be used to generate the panoramic image.
In step 604 of this embodiment, an image target is presented on the display of the user device. The image target may be a quality indicator that also provides an indication of whether the user device is properly positioned. A targeting guide may also be presented on the display of the user device, where the user is encouraged to position the targeting guide over the image target. For example, the image target may be presented as a cartoon face that is sad when the user device is improperly positioned and is happy when the user device is properly positioned.
In step 606 of this embodiment, a determination is made as to whether the user device is properly positioned. The user device may be determined to be properly positioned when the targeting guide is centered over and aligned with the image target. If the user device is not properly positioned, a warning that the user device is not properly positioned is presented to the user in step 608. For example, if the image target is a quality indicator, the warning of improper position may be presented as a sad face. In some embodiments, the presentation of the quality indicator may be correspondingly updated as the user device is repositioned. In this example, the presentation of the face may correspond to the distance the targeting guide is from the image target (e.g., the face may change from sad to neutral to happy as the targeting guide moves closer to the image target).
Once the user device is properly positioned, an image is captured in step 610. The user device may be determined to be properly positioned when target requirements for the panoramic image capture are satisfied. For example, the target requirements may correspond to image quality factors (e.g., velocity of the user device, distance of the targeting guide from the image target, etc.) that should be satisfied before an image is captured. In this example, thresholds for the image quality factors can be configured by the user to, for example, be consistent with the proficiency level of the user (i.e., more experienced users may specify higher thresholds for the image quality factors).
In step 612 of this embodiment, a determination is made as to whether there is sufficient coverage for the panoramic image. Sufficient coverage for the panoramic image may be, for example, sufficient images to generate a complete 360-degree panoramic image at the current position of the user. In other embodiments, it may be determined that there is sufficient coverage for the panoramic image in response to a request from the user to stop the panoramic image capture.
If the current coverage of the images is not sufficient to generate the panoramic image, the proximate targets of the last target captured may be updated for the user in step 614. The proximate targets may be determined according to the image target graph generated in step 604. For example, if the last target captured is the initial target, the device display may be updated to also display the proximate targets that neighbor the initial target node in the image target graph. Similarly, as each of the following images of the panorama is captured in step 610, the device display is updated to also display the proximate targets of the last target captured. Each of the image targets added in step 614 is a quality indicator that also provides an indication of whether the user device is properly positioned, as discussed above.
In step 616 of this embodiment, a panoramic preview may be refreshed to include the image captured in step 610. As each of the individual images of the panorama is captured in step 610, the panoramic preview may be updated with the latest captured image. The panoramic preview allows the user to view the overall progress of the panoramic image as the user device is repositioned to obtain each of the individual images. After the proximate targets and panoramic preview are updated, the method may return to step 606 to determine whether to capture the next image in a similar manner as discussed above with respect to steps 610-616. In some embodiments, when multiple targets are displayed on the display of the user device, the user may be allowed to capture images for the targets in any order.
If it is determined that there is sufficient coverage for the panoramic image, the panoramic image may be generated in step 618 of this embodiment. The panoramic image may be stitched together from the individual images captured in step 610. In some embodiments, the image targets used to capture the individual images may also be used to generate the panoramic image. In this case, the image targets may be used to place and stitch the individual images together to form the panoramic image.
In step 620 of this embodiment, a quality rating of the panoramic image is determined. The quality rating of the panoramic image describes an overall quality of the panoramic image based on, for example, the quality factors of the individual images discussed above in step 610. In this example, other quality information related to the panoramic image may also be used to determine the quality rating, such as dynamic range, contrast, sharpness, and color accuracy of the generated panoramic image.
FIG. 7 shows an example interface in accordance with embodiments of obtaining panoramic images. More specifically, FIG. 7 shows an example user interface for performing panoramic image capture on a user device 102.
In FIG. 7, the user device 102, which may be substantially similar to the user device 102 described above with respect to FIGS. 1A-1D, includes a device display 103 displaying an augmented video stream provided by a camera. FIG. 7 shows a target grid 107 for obtaining six individual images that are to be used to generate a panoramic image. The target grid 107 includes cells that correspond to, for example, target nodes of an image target graph. As each image is captured, a preview 106 of the panoramic image may be updated to include the captured images.
In some embodiments, individual images are obtained for each of the quality indicators (e.g., captured indicators 702, low quality indicators 110, and high quality indicator 111) as discussed above with respect to FIG. 6. The captured indicators 702 show the targets at which an image has already been captured. The low quality indicators 110 show the targets at which the user device is poorly positioned for capturing an image. The high quality indicator 111 shows the target at which the user device 102 is properly positioned for capturing an image. In this example, the user device 102 is properly positioned when the targeting guide 109 is within a threshold distance of the high quality indicator 111.
In this example, after all the individual images are obtained, the six images are stitched together and included in a panoramic image. A quality rating of the panoramic image may be determined based on how closely centered and properly aligned the targeting guide 109 is with respect to the corresponding image target (e.g., low quality indicators 110, high quality indicator 111) when the individual images are captured.
In some embodiments of the invention, the user device 102 may present the quality rating to the user. The quality rating may be presented on a user interface of an application (e.g., camera application, social networking application, game, etc.) that is configured to monitor the quality ratings of panoramic images captured by the user device 102. For example, a camera application may monitor the quality ratings of panoramic images in order to provide the user with achievements or other rewards. In this example, the user's proficiency may be ranked by the camera application based on the quality ratings of the panoramic images captured by the user. In response to an increase in the user's proficiency, the camera application may modify operating parameters of the camera application such as the image quality threshold.
While obtaining panoramic images has been described in conjunction with specific embodiments thereof, it is evident that many alternatives, modifications, and variations will be apparent to those skilled in the art in light of the foregoing description. Accordingly, it is intended to embrace all such alternatives, modifications, and variations as fall within the spirit and broad scope of the appended claims. The present embodiments may suitably comprise, consist or consist essentially of the elements disclosed and may be practiced in the absence of an element not disclosed.
As used throughout this application, the word “may” is used in a permissive sense (i.e., meaning having the potential to), rather than the mandatory sense (i.e., meaning must). The words “include”, “including”, and “includes” mean including, but not limited to. As used throughout this application, the singular forms “a”, “an” and “the” include plural referents unless the content clearly indicates otherwise. Thus, for example, reference to “an element” includes two or more elements. Unless specifically stated otherwise, as apparent from the discussion, it is appreciated that throughout this specification discussions utilizing terms such as “processing”, “computing”, “calculating”, “determining” or the like refer to actions or processes of a specific apparatus, such as a special purpose computer or a similar special purpose electronic processing/computing device. In the context of this specification, a special purpose computer or a similar special purpose electronic processing/computing device is capable of manipulating or transforming signals, typically represented as physical electronic or magnetic quantities within memories, registers, or other information storage devices, transmission devices, or display devices of the special purpose computer or similar special purpose electronic processing/computing device.

Claims (20)

We claim:
1. A computer-implemented method of obtaining panoramic images by a user device, the computer-implemented method comprising:
generating a plurality of targets for a panoramic image, each of the plurality of targets defining a portion of the panoramic image, and each of the plurality of targets being located at a center of a prospective target image;
monitoring a position of the user device with respect to a current target of the plurality of targets;
responsive to determining that the user device is properly positioned with respect to the current target in accordance with one or more thresholds for corresponding image quality factors, capturing a first image for the current target using a camera of the user device, the first image defining a first portion of the panoramic image;
monitoring the position of the user device with respect to a next target of the plurality of targets;
responsive to determining that the user device is properly positioned with respect to the next target in accordance with the one or more thresholds for corresponding image quality factors, capturing a second image for the next target using the camera of the user device, the second image defining a second portion of the panoramic image; and
generating the panoramic image using the first image and the second image.
2. The computer-implemented method of claim 1, further comprising:
responsive to determining that a targeting guide of the user device is within a threshold distance of the current target, displaying a high quality indicator at the current target.
3. The computer-implemented method of claim 2, further comprising:
displaying a low quality indicator at the current target while the targeting guide of the user device is outside the threshold distance.
4. The computer-implemented method of claim 1, further comprising:
calculating a quality of the first image based on at least one of velocity of the user device during the capture of the first image, exposure of the camera during the capture of the first image, rotational position of the user device during the capture of the first image, distance of a targeting guide of the user device from the current target during the capture of the first image, and overlap of the first image with the second image; and
determining whether the quality of the first image satisfies a quality threshold.
5. The computer-implemented method of claim 4, further comprising:
responsive to determining that the quality of the first image is below the quality threshold, discarding the first image; and
capturing a new image for the current target using the camera of the user device.
6. The computer-implemented method of claim 4, wherein the quality threshold is determined based on historical quality data of a user of the user device, the historical quality data being generated based on image quality of a plurality of previously captured panoramic images.
7. The computer-implemented method of claim 4, further comprising:
calculating a quality of the second image based on at least one of velocity of the user device during the capture of the second image, exposure of the camera during the capture of the second image, rotational position of the user device during the capture of the second image, distance of a targeting guide of the user device from the next target during the capture of the second image, and overlap of the second image with the first image; and
determining a quality of the panoramic image based on the quality of the first image and the quality of the second image.
8. A system, comprising:
one or more memories;
one or more processors, each operatively connected to the one or more memories;
a target graph module stored on the one or more memories and configured to be executed by the one or more processors to generate a plurality of targets for a panoramic image, each of the plurality of targets defining a portion of the panoramic image, and each of the plurality of targets being located at a center of a prospective target image;
a user interface controller stored on the one or more memories and configured to be executed by the one or more processors to:
monitor a position of a user device with respect to a current target of the plurality of targets,
responsive to determining that the user device is properly positioned with respect to the current target in accordance with one or more thresholds for corresponding image quality factors, request that a camera of the user device capture a first image for the current target, the first image defining a first portion of the panoramic image,
monitor the position of the user device with respect to a next target of the plurality of targets, and
responsive to determining that the user device is properly positioned with respect to the next target in accordance with the one or more thresholds for corresponding image quality factors, request that a camera of the user device capture a second image for the next target, the second image defining a second portion of the panoramic image;
a stitching module stored on the one or more memories and configured to be executed by the one or more processors to generate the panoramic image using the first image and the second image; and
the camera configured to capture the first image and the second image.
9. The system of claim 8, wherein the user interface controller is further configured to be executed by the one or more processors to:
responsive to determining that a targeting guide of the user device is within a threshold distance of the current target, display a high quality indicator at the current target.
10. The system of claim 9, wherein the user interface controller is further configured to be executed by the one or more processors to:
display a low quality indicator at the current target while the targeting guide of the user device is outside the threshold distance.
11. The system of claim 8, wherein the user interface controller is further configured to be executed by the one or more processors to:
calculate a quality of the first image based on at least one of velocity of the user device during the capture of the first image, exposure of the camera during the capture of the first image, rotational position of the user device during the capture of the first image, distance of a targeting guide of the user device from the current target during the capture of the first image, and overlap of the first image with the second image, and
determine whether the quality of the first image satisfies a quality threshold.
12. The system of claim 11, wherein the user interface controller is further configured to be executed by the one or more processors to:
responsive to determining that the quality of the first image is below the quality threshold, discard the first image, and
request that the camera of the user device capture a new image for the current target.
13. The system of claim 11, wherein the quality threshold is determined based on historical quality data of a user of the user device, the historical quality data being generated based on image quality of a plurality of previously captured panoramic images.
14. The system of claim 11, wherein the user interface controller is further configured to be executed by the one or more processors to:
calculate a quality of the second image based on at least one of velocity of the user device during the capture of the second image, exposure of the camera during the capture of the second image, rotational position of the user device during the capture of the second image, distance of a targeting guide of the user device from the next target during the capture of the second image, and overlap of the second image with the first image, and
determine a quality of the panoramic image based on the quality of the first image and the quality of the second image.
15. A non-transitory computer readable medium having computer-executable program instructions embodied therein that when executed cause a computer processor to:
generate a plurality of targets for a panoramic image, each of the plurality of targets defining a portion of the panoramic image, and each of the plurality of targets being located at a center of a prospective target image;
monitor a position of a user device with respect to a current target of the plurality of targets;
responsive to determining that a targeting guide of the user device is within a threshold distance of the current target, display a high quality indicator at the current target;
responsive to determining that the user device is properly positioned with respect to the current target in accordance with one or more thresholds for corresponding image quality factors, capture a first image for the current target using a camera of the user device, the first image defining a first portion of the panoramic image;
monitor the position of the user device with respect to a next target of the plurality of targets;
responsive to determining that the user device is properly positioned with respect to the next target in accordance with the one or more thresholds for corresponding image quality factors, capture a second image for the next target using the camera of the user device, the second image defining a second portion of the panoramic image; and
generate the panoramic image using the first image and the second image.
16. The computer readable medium of claim 15, wherein the instructions when executed further cause the computer processor to:
display a low quality indicator at the current target while the targeting guide of the user device is outside the threshold distance.
17. The computer readable medium of claim 15, wherein the instructions when executed further cause the computer processor to:
calculate a quality of the first image based on at least one of velocity of the user device during the capture of the first image, exposure of the camera during the capture of the first image, rotational position of the user device during the capture of the first image, distance of a targeting guide of the user device from the current target during the capture of the first image, and overlap of the first image with the second image; and
determine whether the quality of the first image satisfies a quality threshold.
18. The computer readable medium of claim 17, wherein the instructions when executed further cause the computer processor to:
responsive to determining that the quality of the first image is below the quality threshold, discard the first image; and
capture a new image for the current target using the camera of the user device.
19. The computer readable medium of claim 17, wherein the quality threshold is determined based on historical quality data of a user of the user device, the historical quality data being generated based on image quality of a plurality of previously captured panoramic images.
20. The computer readable medium of claim 17, wherein the instructions when executed further cause the computer processor to:
calculate a quality of the second image based on at least one of velocity of the user device during the capture of the second image, exposure of the camera during the capture of the second image, rotational position of the user device during the capture of the second image, distance of a targeting guide of the user device from the next target during the capture of the second image, and overlap of the second image with the first image; and
determine a quality of the panoramic image based on the quality of the first image and the quality of the second image.
US13/662,073 | 2012-10-26 | 2012-10-26 | Method, system, and computer program product for gamifying the process of obtaining panoramic images | Active, expires 2034-05-25 | US9270885B2 (en)

Priority Applications (5)

Application Number | Priority Date | Filing Date | Title
US13/662,073 (US9270885B2) | 2012-10-26 | 2012-10-26 | Method, system, and computer program product for gamifying the process of obtaining panoramic images
PCT/US2013/029772 (WO2014065854A1) | 2012-10-26 | 2013-03-08 | Method, system and computer program product for gamifying the process of obtaining panoramic images
US14/990,238 (US9667862B2) | 2012-10-26 | 2016-01-07 | Method, system, and computer program product for gamifying the process of obtaining panoramic images
US15/491,573 (US9832374B2) | 2012-10-26 | 2017-04-19 | Method, system, and computer program product for gamifying the process of obtaining panoramic images
US15/791,580 (US10165179B2) | 2012-10-26 | 2017-10-24 | Method, system, and computer program product for gamifying the process of obtaining panoramic images

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
US13/662,073 (US9270885B2) | 2012-10-26 | 2012-10-26 | Method, system, and computer program product for gamifying the process of obtaining panoramic images

Related Child Applications (1)

Application Number | Relation | Priority Date | Filing Date | Title
US14/990,238 (US9667862B2) | Continuation | 2012-10-26 | 2016-01-07 | Method, system, and computer program product for gamifying the process of obtaining panoramic images

Publications (2)

Publication Number | Publication Date
US20140118479A1 (en) | 2014-05-01
US9270885B2 (en) | 2016-02-23

Family

ID=50545053

Family Applications (4)

Application Number | Status | Priority Date | Filing Date | Title
US13/662,073 (US9270885B2) | Active, expires 2034-05-25 | 2012-10-26 | 2012-10-26 | Method, system, and computer program product for gamifying the process of obtaining panoramic images
US14/990,238 (US9667862B2) | Active | 2012-10-26 | 2016-01-07 | Method, system, and computer program product for gamifying the process of obtaining panoramic images
US15/491,573 (US9832374B2) | Expired - Fee Related | 2012-10-26 | 2017-04-19 | Method, system, and computer program product for gamifying the process of obtaining panoramic images
US15/791,580 (US10165179B2) | Active | 2012-10-26 | 2017-10-24 | Method, system, and computer program product for gamifying the process of obtaining panoramic images

Country Status (2)

Country | Link
US (4) | US9270885B2 (en)
WO (1) | WO2014065854A1 (en)

Citations (42)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US20020107965A1 (en) | 2001-02-05 | 2002-08-08 | Piccionelli Gregory A. | Performance distribution method
DE10156414A1 (en) | 2001-02-05 | 2002-09-12 | Siemens Ag | Execution of surveys using mobile phone text messaging, e.g. for carrying out surveys during a conference or commercial fair, using a method that is simple and economical to implement
US20050206743A1 (en) | 2004-03-16 | 2005-09-22 | Sim Wong H | Digital still camera and method of forming a panoramic image
US20060079200A1 (en) | 2003-07-04 | 2006-04-13 | Kiyoshi Hirouchi | Disaster system control method and disaster system control apparatus
US20070024527A1 (en) | 2005-07-29 | 2007-02-01 | Nokia Corporation | Method and device for augmented reality message hiding and revealing
US20080045236A1 (en) | 2006-08-18 | 2008-02-21 | Georges Nahon | Methods and apparatus for gathering and delivering contextual messages in a mobile communication system
US20080119131A1 (en) | 2006-11-22 | 2008-05-22 | Bindu Rama Rao | System for providing interactive user interest survey to user of mobile devices
US20080118184A1 (en) * | 2006-11-17 | 2008-05-22 | Microsoft Corporation | Swarm imaging
US7424218B2 (en) | 2005-07-28 | 2008-09-09 | Microsoft Corporation | Real-time preview for panoramic images
AU2008212499A1 (en) | 2000-04-07 | 2008-12-04 | Zyzeba Holdings Limited | Interactive marketing system
US7535492B2 (en) | 2002-07-02 | 2009-05-19 | Lightsurf Technologies, Inc. | Imaging system providing automated fulfillment of image photofinishing based on location
US20090157876A1 (en) | 2007-12-17 | 2009-06-18 | Lection David B | Methods, Systems, And Computer Readable Media For Managing User Access To An Electronic Media Sharing Environment
WO2009086235A2 (en) | 2007-12-21 | 2009-07-09 | Wikiatlas Corporation | Contributor compensation system and method
US20100009700A1 (en) | 2008-07-08 | 2010-01-14 | Sony Ericsson Mobile Communications Ab | Methods and Apparatus for Collecting Image Data
US20100161506A1 (en) | 2008-12-19 | 2010-06-24 | Nurago GmbH | Mobile device and method for providing logging and reporting of user-device interaction
WO2010090906A2 (en) | 2009-02-04 | 2010-08-12 | Motorola, Inc. | Method and apparatus for creating virtual graffiti in a mobile virtual and augmented reality system
KR20100124947A (en) | 2009-05-20 | 2010-11-30 | 삼성에스디에스 주식회사 | AR contents providing system and method providing a portable terminal real-time regional information by using augmented reality technology
US20100309225A1 (en) | 2009-06-03 | 2010-12-09 | Gray Douglas R | Image matching for mobile augmented reality
US20110071843A1 (en) | 2009-09-18 | 2011-03-24 | Michael Gilvar | Occurrence marketing tool
US7936915B2 (en) | 2007-05-29 | 2011-05-03 | Microsoft Corporation | Focal length estimation for panoramic stitching
US20110102605A1 (en) | 2009-11-02 | 2011-05-05 | Empire Technology Development Llc | Image matching to augment reality
WO2011059780A1 (en) | 2009-10-28 | 2011-05-19 | Google Inc. | Navigation images
US20110154264A1 (en) | 2006-03-06 | 2011-06-23 | Veveo, Inc. | Methods and Systems for Selecting and Presenting Content Based on Learned Periodicity of User Content Selection
US20110161002A1 (en) | 2004-06-30 | 2011-06-30 | Devries Steven P | Method of Collecting Information for a Geographic Database for use with a Navigation System
US20110165893A1 (en) | 2010-01-04 | 2011-07-07 | Samsung Electronics Co., Ltd. | Apparatus to provide augmented reality service using location-based information and computer-readable medium and method of the same
US20110169947A1 (en) | 2010-01-12 | 2011-07-14 | Qualcomm Incorporated | Image identification using trajectory-based location determination
US20110234750A1 (en) * | 2010-03-24 | 2011-09-29 | Jimmy Kwok Lap Lai | Capturing Two or More Images to Form a Panoramic Image
US20110312374A1 (en) | 2010-06-18 | 2011-12-22 | Microsoft Corporation | Mobile and server-side computational photography
US8120641B2 (en) | 2007-02-14 | 2012-02-21 | Samsung Electronics Co., Ltd. | Panoramic photography method and apparatus
US20120076426A1 (en) | 2009-09-16 | 2012-03-29 | Olaworks, Inc. | Method and system for matching panoramic images using a graph structure, and computer-readable recording medium
US8189964B2 (en) | 2009-12-07 | 2012-05-29 | Google Inc. | Matching an approximately located query image against a reference image set
WO2012131151A1 (en) | 2011-03-28 | 2012-10-04 | Nokia Corporation | Methods and apparatuses for generating a panoramic image
US8310522B2 (en) | 2006-12-27 | 2012-11-13 | Samsung Electronics Co., Ltd. | Method for photographing panoramic image
US20120300020A1 (en) | 2011-05-27 | 2012-11-29 | Qualcomm Incorporated | Real-time self-localization from panoramic images
US20130033568A1 (en) | 2007-08-29 | 2013-02-07 | Samsung Electronics Co., Ltd. | Method for photographing panoramic picture
US8600194B2 (en) * | 2011-05-17 | 2013-12-03 | Apple Inc. | Positional sensor-assisted image registration for panoramic photography
US8655620B2 (en) * | 2010-12-29 | 2014-02-18 | National Tsing Hua University | Method and module for measuring rotation and portable apparatus comprising the module
US8773502B2 (en) * | 2012-10-29 | 2014-07-08 | Google Inc. | Smart targets facilitating the capture of contiguous images
US8902335B2 (en) * | 2012-06-06 | 2014-12-02 | Apple Inc. | Image blending operations
US8933986B2 (en) * | 2010-05-28 | 2015-01-13 | Qualcomm Incorporated | North centered orientation tracking in uninformed environments
US8941716B2 (en) * | 2010-09-27 | 2015-01-27 | Casio Computer Co., Ltd. | Image capturing apparatus capable of capturing a panoramic image
US8957944B2 (en) * | 2011-05-17 | 2015-02-17 | Apple Inc. | Positional sensor-assisted motion filtering for panoramic photography

Non-Patent Citations (8)

* Cited by examiner, † Cited by third party
Title
Co-pending U.S. Appl. No. 13/272,556 entitled "Using augmented reality to drive user-generated photos" by David Bort, filed Oct. 13, 2011.
Co-pending U.S. Appl. No. 13/315,748 entitled "Method, System, and Computer Program Product for Obtaining Crowd-Sourced Location Information" by Martha Welsh, filed Dec. 9, 2011.
Co-pending U.S. Appl. No. 13/662,124 entitled "Method, System, and Computer Program Product for Providing a Target User Interface for Capturing Panoramic Images" by Scott Ettinger, filed Oct. 26, 2012.
Co-pending U.S. Appl. No. 13/858,514 entitled "Systems and Methods for Updating Panoramic Images" by Daniel Joseph Filip, filed Apr. 8, 2013.
Dao, et al., "Location Based Services: Technical and Service Issues", GPS Solutions, 2002.
ISR/WO for PCT App. No. PCT/US12/59853, mailed on Mar. 8, 2013. (pp. 1-10).
ISR/WO for PCT App. No. PCT/US13/29772, mailed on May 21, 2013. (pp. 1-8).
Nurminen, Antti, "Mobile 3D City Maps", IEEE Computer Graphics and Applications, Jul./Aug. 2008, pp. 20-31.

Also Published As

Publication number | Publication date
US20140118479A1 (en) | 2014-05-01
US9667862B2 (en) | 2017-05-30
WO2014065854A1 (en) | 2014-05-01
US20160119537A1 (en) | 2016-04-28
US9832374B2 (en) | 2017-11-28
US20180048809A1 (en) | 2018-02-15
US20170223266A1 (en) | 2017-08-03
US10165179B2 (en) | 2018-12-25

Legal Events

Date | Code | Title | Description
AS | Assignment

Owner name:GOOGLE INC., CALIFORNIA

Free format text:ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:RAPOPORT, EVAN;ETTINGER, SCOTT;SIGNING DATES FROM 20121126 TO 20121218;REEL/FRAME:029702/0441

STCF | Information on status: patent grant

Free format text:PATENTED CASE

AS | Assignment

Owner name:GOOGLE LLC, CALIFORNIA

Free format text:CHANGE OF NAME;ASSIGNOR:GOOGLE INC.;REEL/FRAME:044566/0657

Effective date:20170929

MAFP | Maintenance fee payment

Free format text:PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment:4

MAFP | Maintenance fee payment

Free format text:PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment:8

