US7382399B1 - Omniview motionless camera orientation system

Info

Publication number: US7382399B1
Application number: US09/315,962
Authority: US (United States)
Prior art keywords: image, cos, sin, angle, selected portion
Legal status: Expired - Fee Related (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Inventors: Danny A. McCall, H. Lee Martin
Current Assignee: Sony Corp (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original Assignee: Sony Corp
Priority claimed from: US07/699,366 (US5185667A), US08/014,508 (US5359363A), US08/189,585 (US5384588A), US08/373,446 (US6243131B1), and US08/863,584 (US6002430A)
Application filed by: Sony Corp
Related later application: US12/102,699 (US20090040291A1)
Assignments and legal events:
    • Intellectual property security agreements assigning the interests of Interactive Pictures Corporation, PW Technology, Inc., and Internet Pictures Corporation to Image Investor Portfolio, a separate series of Memphis Angels, LLC, a Delaware limited liability company
    • Releases from Image Investor Portfolio back to Internet Pictures Corporation, PW Technology, Inc., and Interactive Pictures Corporation
    • Merger of bamboo.com, Interactive Pictures Corporation, and Internet Pictures Corporation into Internet Pictures Corporation
    • Change of name from Internet Pictures Corporation to IPIX Corporation
    • Assignment from IPIX Corporation to Sony Corporation
    • Application granted; anticipated expiration; current status Expired - Fee Related

Abstract

A method and apparatus for capture of a spherical image is disclosed. The present invention includes at least one camera having a lens with at least a 180° field-of-view for capturing a hemispherical image. In a first embodiment, a second hemispherical image is created corresponding to a mirror image of the hemispherical image captured by the camera. In a second embodiment, two back-to-back cameras capture first and second hemispherical images, respectively. In both embodiments, a converter combines the two images along their outside edges to form a single, spherical image. Finally, the converter stores the complete spherical image for later retrieval and perspective corrected viewing.

Description

This application is a divisional of application Ser. No. 08/863,584 filed May 27, 1997, now U.S. Pat. No. 6,002,430, which is a continuation-in-part of U.S. application Ser. No. 08/386,912 filed Feb. 8, 1995, now abandoned, which is a continuation of U.S. application Ser. No. 08/339,663 filed Nov. 14, 1994, now abandoned, which is a continuation of U.S. application Ser. No. 08/189,585 filed Jan. 31, 1994 (now U.S. Pat. No. 5,384,588), which is a continuation-in-part of U.S. application Ser. No. 08/014,508 filed Feb. 8, 1993 (now U.S. Pat. No. 5,359,363), which is a continuation-in-part of U.S. application Ser. No. 07/699,366 filed May 13, 1991 (now U.S. Pat. No. 5,185,667). This application is also a continuation-in-part of U.S. application Ser. No. 08/373,446 filed Jan. 17, 1995, which is a continuation-in-part of U.S. application Ser. No. 08/189,585 filed Jan. 31, 1994 (now U.S. Pat. No. 5,384,588).
This invention was made with Government support under contract NAS1-18855 awarded by NASA. The Government has certain rights in this invention.
BACKGROUND OF THE INVENTION
1. Technical Field
This invention relates generally to an apparatus and method for capturing an image having a spherical field-of-view for subsequent viewing. Specifically, the present invention relates to a system involving a single camera having a lens with a field of view of at least 180° and associated method for capturing a first hemispherical image for subsequent combination into the spherical image. Alternatively, when the system comprises two cameras with such lenses mounted in a securely attached back-to-back arrangement, the system and method captures two distinct hemispherical images for subsequent combination into the spherical image. The preferred system includes a single-use, still image camera.
2. Background Art
The discussion of the background art related to the invention described herein relates to two subjects: spherical image capture and subsequent captured image transformations.
Spherical Image Capture
The goal of imaging technology is to make the observer feel as though he or she is part of the image. Prior art systems have partially accomplished this goal. Unfortunately, the ability of prior art systems to make the user feel part of the captured images is proportional to the cost of the image capture system.
Relating to inexpensive image capturing systems, camera companies have introduced disposable cameras. A disposable camera generally refers to a single-use camera that includes film, a lens, and a camera body, all in a single compact shell. The film includes either a single frame of film or multiple frames of film. After the entire roll of film has been exposed, the entire camera is returned for film developing. All the photographer receives back are the developed prints or slides. The manufacturer then recycles the parts from the returned camera, adds film, and ships the camera to a retailer for sale again. Disposable cameras come in various types including regular magnification cameras, telephoto cameras, water resistant cameras, and panoramic cameras.
Images captured by panoramic cameras provide wide angle horizontal images (left to right) but lack wide angle vertical images (up and down). Accordingly, while capturing a wide field-of-view on one plane (horizontal), the photographer loses the wide field-of-view on the other plane (vertical). Rotating the camera only alters the wide angle direction. The following example illustrates this shortcoming. Suppose a photographer desires to capture the grandeur of a dense forest from within the forest. While an image captured by a panoramic camera would include a sweeping cross section of trees (left to right), it would only include, at most, the middle portions of the nearest trees. To capture the forest floor and canopy, the photographer would have to take multiple panoramic photographs from looking almost straight down to looking straight up. The final image of the forest would then only be realized with the laborious task of manually cutting and pasting the different images together. Unfortunately, the left and right ends of the final image become distorted and cannot be easily resolved. The distortions created are similar to those encountered in map-making where one tries to represent a round earth on a flat map. Specifically, objects and relative distances near the extremes of the wide angle image become distorted. Additionally, this approach wastes film.
A slightly more complex panoramic camera employs a scanning drive mechanism which selectively exposes vertical strips of film as the camera scans from extreme to extreme. However, scanning panoramic cameras invariably introduce noise into captured images through vibrations generated from their scanning motions as well as take a relatively long period of time to capture the image.
Other wide-angle image capturing systems exist. For example, IMAX and 70 mm films provide high definition images on a large screen. However, these screens are flat. While a viewer can feel part of the scene when staring straight ahead, this feeling dissipates where the screen ends.
Another imaging system includes the OMNIMAX camera and projection system where an image was recorded and later projected on a spherical screen to produce an image 180 degrees wide, 100 degrees up from the horizon and 20 degrees below. While this system offers significant improvements over a flat screen projection system, the viewer's absorption into the displayed images is limited by the edges of the displayed image.
Another image capture and display system is disclosed in U.S. Pat. No. 5,023,725 to McCutchen. McCutchen discloses a dodecahedral imaging system which breaks a sphere into 12 discrete polyhedrons. Each section has its own dedicated CCD camera. The images are captured and displayed on the walls of a hemispherical room. This system offers increased resolution through increasing the number of cameras used. However, as the number of cameras increases, the bulk of the imaging system likewise increases. Additionally, each camera has to be perfectly aligned with respect to the other cameras to adequately capture a spherical image. Using McCutchen's system, increased resolution requires more bulk and more expense. Furthermore, the images of each camera are not integrated together. Accordingly, the system fails to account for the seams between the displayed images. While quickly moving images may mask these edge effects, the edge effects may be more noticeable with slow moving images.
Captured Image Transformations
Camera viewing systems are used in abundance for surveillance, inspection, security, and remote sensing. Remote viewing is critical, for example, for robotic manipulation tasks. Close viewing is necessary for detailed manipulation tasks while wide-angle viewing aids positioning of the robotic system to avoid collisions with the work space. Most of these systems use either a fixed-mount camera with a limited viewing field to reduce distortion, or they utilize mechanical pan-and-tilt platforms and mechanized zoom lenses to orient the camera and magnify its image. In the application where orientation of the camera and magnification of its image are required, the mechanical solution is large in size and can subtend a significant volume making the viewing system difficult to conceal or use in close quarters. Several cameras are usually necessary to provide wide-angle viewing of the work space.
In order to provide a maximum amount of viewing coverage or subtended angle, mechanical pan/tilt mechanisms usually use motorized drives and gear mechanisms to manipulate the vertical and horizontal orientation. An example of such a device is shown in U.S. Pat. No. 4,728,839 issued to J. B. Coughlan, et al, on Mar. 1, 1988. Collisions with the working environment caused by these mechanical pan/tilt orientation mechanisms can damage both the camera and the work space and impede the remote handling operation. Simultaneously, viewing in said remote environments is extremely important to the performance of inspection and manipulation activities.
Camera viewing systems that use internal optics to provide wide viewing angles have also been developed in order to minimize the size and volume of the camera and the intrusion into the viewing area. These systems rely on the movement of either a mirror or prism to change the tilt-angle of orientation and provide mechanical rotation of the entire camera to change the pan angle of orientation. Additional lenses are used to minimize distortion. Using this means, the size of the camera orientation system can be minimized, but “blind spots” in the center of the view result. Also, these systems typically have no means of magnifying the image and/or producing multiple images from a single camera.
Further, references that may be relevant to the evaluation of the captured image transformations as described herein include U.S. Pat. No. 4,772,942 issued to M. J. Tuck on Sep. 20, 1988; U.S. Pat. No. 5,067,019 issued to R. D. Juday on Nov. 19, 1991; and U.S. Pat. No. 5,068,735 issued to K. Tuchiya, et al on Nov. 26, 1991.
OBJECTS OF THE INVENTION
Accordingly, it is an object of the present invention to provide an apparatus that captures at least one hemispherical image for later manipulation.
Another object of the invention is to provide an apparatus which captures a spherical image from two images produced by two cameras.
Another object of the invention is to form a single spherical image from the captured image or images.
It is a further object of the invention to provide a spherical image capture system and method without the bulk of a large number of cameras and the necessity of multiple camera alignment.
Another object of the invention is to reduce the number of seams in a formed image.
Another object of the invention is to accomplish the above objectives using a single-use, disposable camera.
Another object of the invention is to provide a system for displaying a complete spherical image with perspective correction and without edge effects and image distortion.
Another object of the invention is to enable interaction with any portion of the spherical image with the selected portion being perspective corrected.
It is another object of the present invention to provide horizontal orientation (pan), vertical orientation (tilt) and rotational orientation (rotation) of the viewing direction with no moving mechanisms.
It is another object of the present invention to provide the ability to magnify or scale the image (zoom in and out) electronically.
It is another object of the present invention to provide electronic control of the image intensity (iris level).
It is another object of the present invention to be able to accomplish pan, tilt, zoom, rotation, and iris adjustment with simple inputs made by a lay person from a joystick, keyboard controller, or computer controlled means.
It is also an object of the present invention to provide accurate control of the absolute viewing direction and orientations using said input devices.
A further object of the present invention is to provide the ability to produce multiple images with different orientations and magnifications simultaneously from a single input image.
Another object of the present invention is to be able to provide these images at real-time video rate, e.g. thirty transformed images per second, and to support various display format standards such as the National Television Standards Committee RS-170 signal format and/or higher resolution formats currently under development and to provide the images to a computer display performing perspective correction transforms on a personal computer system.
It is also an object of the present invention to provide a system that can be used for automatic or manual surveillance of selected environments, with optical views of these environments corrected electronically to remove distortion so as to facilitate this surveillance.
It is another object of this invention to provide a means for directly addressing each picture element of an analog image captured with an imaging device having a field-of-view, the picture elements being addressed in a non-linear sequence determined in a manner similar to that described by U.S. Pat. No. 5,185,667 to provide a distortion-corrected image without requiring the use of filters and memory holding buffers.
Another object of the present invention is to provide a means for directly addressing each picture element of an image (still or video) captured using an imaging device having a two-dimensional field-of-view.
SUMMARY OF THE INVENTION
According to the principles of the present invention, at least one camera with a 180° or greater field-of-view lens captures a spherical image. When the system employs two cameras with such lenses, the cameras and lenses are mounted in a back-to-back arrangement. When used in this disclosure and attached claims, “back-to-back” means two cameras clasped together such that the image planes of the lenses fall between each of the lenses and both lenses' optical axes are collinear with a single line which passes through each lens and camera. An imaging element or elements capture the images produced by the lenses. When used herein and in the claims, an “imaging element” or “imaging elements” refer to both film and linear scanning devices and alternatives thereof upon which an image is focused and captured. The captured images from each camera are stored and combined to form a single, spherical image (a final, formed image). When used herein and in the claims, “stored” not only means to digitally store an image in a retrievable form but also means to capture the image on film. To form the spherical image, the system includes a converter which identifies, joins, and smooths the edges (also referred to as the “seams”) of each hemispherical image. When used herein and in the claims, a “converter” refers to not only a manual system (splicing by hand and airbrush image altering techniques) but also an automatic image processing system (digital processing by a computer where images are altered automatically) for combining the two images together. Where a partial overlap exists between the two hemispherical images, the converter processes the partial overlap to remove the overlap and any distortion and create a single, complete, formed spherical image. Finally, a selected planar portion of the spherical image may be displayed on a personal computer using perspective correction software or hardware.
A method for capturing a spherical image includes the steps of capturing a first hemispherical image with a first camera including a first 180° or greater field-of-view lens; receiving a second hemispherical image either by capturing the second hemispherical image by means of a second camera including a second oppositely directed 180° or greater field-of-view lens or by creating a mirror image of the first hemispherical image; and, combining the first and second oppositely directed hemispherical images to create a spherical image.
An apparatus capturing a spherical image includes a first camera equipped with a 180° or greater field-of-view lens, the first camera and the lens directed in a first direction, the first camera capturing a first image; a second device either forming a second image corresponding to a mirror image of the first image or including a second camera equipped with a 180° or greater field-of-view lens, directed in a second direction opposite to the first direction, the second camera capturing the second image; and, a combining system for combining the first and second images into a formed spherical image.
The cameras disclosed above capture high resolution images. Various cameras may be used including still cameras, video cameras, and CCD, CID, or CMOS APS cameras. With high resolution (crystal clear) images as a goal, the system employs a still camera capturing a high resolution image on a fine grain film. Film generally comprises a layer of silver halide crystals. Upon exposure to light, this silver halide layer picks up the image exposed to it. The greater the number of separate halide crystals, the greater the resolution of the film. Thus, a finer grain size refers to an increase in the number of silver halide crystals per unit area of film, which in turn refers to an increase in the potential resolution of the film medium.
When capturing a spherical image with two single-use cameras, the cameras include additional features allowing for dual image capture. Where “single-use camera” is referred to herein and in the claims, it refers to a disposable camera or other alternative. The additional features which aid in spherical image capture include attachment devices which attach the backs of the cameras to each other. When used herein, “attachment devices” refer to locking pins, locking clasps, lever and hook systems, and alternatives thereof. Also, each camera's shutter release may be controlled by a single button (common shutter release control) with either a mechanical or electrical servo linkage releasing each camera's shutter. Additionally, to allow a photographer to keep his or her image from being captured by the spherical image capture system, the dual camera system includes a shutter auto timer or a remote shutter activation control controlling the common shutter release control. The remote shutter control may be an IR transmitter or remote shutter release cable. Further, the dual camera system may include two different shutters operable independently or sequentially. The sequential shutter operations allow the photographer to walk around to the other side of the dual camera system so as not to become part of the captured spherical image.
According to the present invention, when using a still image recorded on film, after developing the film, a high resolution digital scanner scans and digitizes the image contained in the developed film and stores the digitized image in a retrievable medium. The retrievable medium includes, inter alia, CD-ROMs, magnetic disks and tapes, semiconductor devices, and magneto-optical disks.
As referred to above, the second image may be created from the first image. This may be accomplished by at least one of two methods: first, manually, by forming the second image by hand and, second, automatically, by means of a computer running image processing software. As to manually creating the image, the film developing and printing steps generate the second image. For example, after printing or scanning the first hemispherical image, a technician or device flips or likewise reverses the film storing the at least hemispherical image (from left to right orientation to right to left orientation) and scans or prints the film again.
The automatic printing or scanning technique creates the second hemispherical image (also known as a mirror image of the first image) through appropriate software. Alternatively, image processing software or hardware may reverse the scanned image without the need to manually flip a developed piece of film.
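As a concrete illustration of the automatic approach, the short sketch below uses Python with NumPy (an implementation choice assumed here; the patent does not specify one) to create the second hemispherical image as a left-to-right mirror of the first. The function name and array layout are hypothetical.

```python
import numpy as np

def make_mirror_hemisphere(first_hemi: np.ndarray) -> np.ndarray:
    """Create the second hemispherical image as a mirror image of the first.

    first_hemi: H x W x 3 array holding the digitized first hemispherical image.
    Returns an array of the same shape flipped left to right, playing the role
    of the second, oppositely directed hemisphere described in the text.
    """
    return np.flip(first_hemi, axis=1)  # reverse the horizontal (column) axis
```

In the film-based workflow described above, the same effect is obtained by physically flipping the developed film before rescanning; the software flip is simply the digital counterpart.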
The converter (automatic or manual) seams the two hemispherical images together and stores a generated, complete spherical image in a storage medium including CD-ROMs, magnetic disks and tapes, semiconductor devices and magneto-optical disks. This converting may be accomplished by sending the camera and/or film to a processing center which sends back the spherical image stored in one of the above storage mediums.
Finally, using the perspective correction and manipulation system as disclosed in U.S. Pat. No. 5,185,667 and its progeny including U.S. Pat. Nos. 5,359,363 and 5,313,306 and Ser. Nos. 08/189,585, 08/339,663, and 08/373,446, the formed, seamless, spherical image may be explored. These patents and applications and others herein are expressly incorporated by reference.
Preferably, a personal computer system runs the perspective correction software or hardware. These computers may be directly linked to the image capturing system (allowing viewing of the spherical image as captured by the hemispherical camera or cameras and manipulated by the perspective correction system) or may remain completely separate (photographing an image, sending the film to a processing center which creates a spherical image from the photograph or photographs, and returning the spherical image stored in a retrievable form for display on a personal computer).
BRIEF DESCRIPTION OF THE DRAWINGS
The above mentioned features of the invention will become more clearly understood from the following detailed description of the invention read together with the drawings in which:
FIG. 1 is a diagram of the fields of view for 180° and greater than 180° fields of view lenses as mounted to a single camera body.
FIG. 2 shows two back-to-back cameras each capturing more than 180° fields of view images.
FIG. 3 shows two back-to-back cameras each capturing 180° fields of view images.
FIG. 4 shows an alternate embodiment of the spherical capture system of the present invention.
FIGS. 5A and 5B relate to the elements used to capture a spherical image. FIG. 5A shows two hemispherical lenses capturing complementary hemispherical images and feeding them to remote cameras. FIG. 5B shows a hemispherical lens capturing a hemispherical image and a mirror image converter for converting the first hemispherical image into a second hemispherical image.
FIG. 6A shows two hemispherical lenses similar to that of FIG. 5A passing images to local cameras through reflective and refractive optics. FIG. 6B shows two hemispherical lenses conveying images to a single camera.
FIGS. 7A and 7B represent two hemispherical images combined into a single spherical image.
FIG. 8 shows a storage/display option of the instant invention.
FIGS. 9A and 9B show a schematic block diagram of the signal processing portion of the present invention illustrating the major components thereof. FIG. 9A shows the perspective correction process implemented in hardware. FIG. 9B shows the perspective correction process implemented in software, operating inside a personal computer.
FIG. 10 is an exemplary drawing of an at least hemispherical image used as input by the present invention. Lenses having other field-of-view values will produce images with similar distortion, particularly when the field-of-view is about eighty degrees or greater.
FIG. 11 is an exemplary drawing of the output image after correction for a desired image or orientation and magnification within the original image.
FIG. 12 is a schematic diagram of the fundamental geometry that the present invention embodies to accomplish the image transformation.
FIG. 13 is a schematic diagram demonstrating the projection of the object plane and position vector into image plane coordinates.
DETAILED DESCRIPTION OF THE INVENTION
Spherical Image Capture
The disclosed Spherical Image Capture system employs the components disclosed in FIGS. 1-8 to capture hemispherical images and form spherical images. The image transform engine as disclosed in FIGS. 9-13 operates to transform selected portions of the formed spherical images into planar, perspective corrected portions.
Referring to FIG. 1, camera 601 includes lens 602 with optical axis A, image plane I, and a field-of-view of 180° or greater. If lens 602 has a 180° field-of-view, it captures at most the image from hemisphere 603. On the other hand, if lens 602 has a field-of-view greater than 180°, then it captures the image from sector 604 (shown by dotted lines) as well as that of hemisphere 603.
FIG. 2 shows a camera body 701 (which may include two cameras) connected to lenses 702 and 703 (with image planes I702 and I703, respectively). Each of lenses 702 and 703 has a field of view greater than 180°. Placed in a back-to-back arrangement where the lenses are mounted such that the image planes I702 and I703 from the lenses fall between each of the lenses and both lenses' optical axes A coincide in a single line which passes through each lens and camera, they capture the spherical image surrounding the camera body 701. It should be noted, however, that the thickness of the camera body 701 plays a role in how much of the spherical image surrounding the camera is captured. Specifically, the objects on the sides of the camera may or may not be completely photographed depending on their distances from the camera body 701. For example, if objects are within boundary 704, some of the objects may fall into the camera's blind spots 707 and not be completely photographed. On the other hand, because of the converging angles of the lenses' greater than 180° fields of view, objects within sectors 705 will be photographed twice: first, by means of the image captured by lens 702 and, second, by means of the image captured by lens 703. Decreasing the distance between the lenses reduces blind spots 707 of the spherical capture system. In this example, reducing the distance between the lenses means reducing the thickness of the camera body 701. Reducing the camera body thickness can be accomplished, for example, by using smaller imaging and recording elements such as a CCD, CID, or CMOS APS camera as disclosed in U.S. Ser. No. 08/373,446, expressly incorporated herein by reference. Additionally, the distance between image planes I702 and I703 of lenses 702 and 703, respectively, may be reduced to the point where the image planes coincide, further reducing the thickness of the camera body.
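To make the double-coverage remark concrete, a minimal sketch follows (Python, illustrative only, idealized to a common viewpoint so the camera body thickness that creates blind spots 707 is ignored): for two back-to-back lenses each having a field-of-view F greater than 180°, the band of directions seen by both lenses spans F - 180 degrees.

```python
def overlap_band_degrees(fov_degrees: float) -> float:
    """Angular width (in degrees) of the band of directions photographed twice
    by two back-to-back lenses of equal field-of-view, assuming an idealized
    common viewpoint (camera body thickness neglected)."""
    return max(0.0, fov_degrees - 180.0)

# Example: lenses with a 190-degree field-of-view overlap over a 10-degree band.
print(overlap_band_degrees(190.0))
```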
FIG. 3 discloses camera body 801, similar to that of camera body 701, and lenses 802 and 803 with image planes I802 and I803, respectively, each having a field-of-view of exactly 180°. Lens 802 receives the image of hemisphere 804 and lens 803 receives the image of hemisphere 805. Similar to FIG. 2 above, the lenses attach to camera body 801 in a back-to-back arrangement where the lenses are mounted such that the image planes I802 and I803 from the lenses fall between each of the lenses and both lenses' optical axes A coincide in a single line which passes through each lens and camera. As discussed with reference to FIG. 2 above, because camera body 801 has a thickness (i.e., the distance between lenses 802 and 803 is greater than zero), the image capture system 800 has blind spots 806 on the sides of the camera body 801. These blind spots may be reduced by decreasing the distance between lenses 802 and 803. Here, this means reducing the thickness of camera body 801. This may be accomplished, inter alia, by reducing the size of the imaging and recording components as discussed above in reference to FIG. 2.
Referring now to FIG. 4, two cameras 201 and 202 equipped with lenses 203, 204, each having a field-of-view (FOV) greater than 180°, are disclosed in a back-to-back arrangement (the image planes (not shown) falling between each of the lenses, and the optical axes of the lenses 203 and 204 are collinear as designated by line A). Because each camera 201, 202 has a lens (203, 204) which has a field-of-view (FOV) greater than 180°, each captures more than the image of a complete hemisphere. By employing two cameras in this arrangement, the camera system captures a complete spherical image. The types of cameras employed are chosen from the group comprising at least still cameras with loaded film or digital image capture, motion picture cameras with loaded film or digital image capture, the KODAK™ digital image capture system, video, and linear scanning CID, CCD, or CMOS APS camera arrays. The outputs of cameras 201 and 202 connect by means of electrical, optical, or electro-optical links 215 to hemispherical-to-spherical image converter 216. When the captured hemispherical images are stored on film, optical-to-electrical converter 215A converts the stored images into a form usable by hemispherical-to-spherical image converter 216. Optical-to-electrical converter 215A includes a scanning system which scans a photographed image and outputs a high resolution, electronic replica of the photographed image. One converter includes the Kodak™ Photo-CD ROM converter which takes a photograph and converts it into a high resolution digital form which then may be stored on a compact disk. Hemispherical-to-spherical converter 216 receives the hemispherical images from cameras 201 and 202 (or alternatively, from optical-to-electrical converter 215A).
The cameras include additional features allowing for dual image capture. For example, the backs of the cameras are attached to each other via separable attachment devices 401. Attachment devices 401 may be locking pins, locking clasps, lever and clip systems, etc. Also, each camera's shutter release may be controlled by a single button 402A (common shutter release control) with either a mechanical or electrical servo linkage releasing each camera's shutter. Additionally, to allow a photographer to ensure his or her image is not recorded by the spherical image capture system, the dual camera system includes a shutter auto timer or a remote shutter activation control 403 controlling the common shutter release control, allowing the photographer to move to a concealed or non-image-captured position. The remote shutter control 403 may be an IR transmitter or remote shutter release cable. Further, the dual camera system may include two different shutter release control buttons 402B operable independently or sequentially. The sequential shutter operations allow the photographer to walk around to the other side of the dual camera system so as not to become part of the captured spherical image.
Next, hemispherical-to-spherical converter 216 combines the hemispherical images into a single, complete spherical image. Finally, the edges of the two hemispherical images may be combined to form a seamless spherical image. Removing the seams from the two hemispherical images may be accomplished in a number of ways. For example, the two images may be “airbrushed” together (where any differences between the two images at the periphery of the images are smoothed together). Alternatively, a more complex method of seaming the two images together may include matching related pixels by their luminance and chrominance values and interpolating the corresponding values for interstitial pixels. In the event that a partial overlap exists between the two hemispherical images, the converter processes the spherical image to remove the partial overlap and any distortion and creates a single, complete, formed image. The processing may include choosing and displaying one hemisphere over the other, weighted and non-weighted averaging of the overlapping sections, and linear and non-linear approximations creating intermediary images.
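One way to read the weighted-averaging option is sketched below. It is a minimal illustration in Python with NumPy (not taken from the patent) that cross-fades two already-aligned edge strips of the hemispherical images across their overlap; the function name and strip layout are assumptions.

```python
import numpy as np

def blend_overlap(strip_a: np.ndarray, strip_b: np.ndarray) -> np.ndarray:
    """Weighted average of two aligned overlap strips of equal shape (H x W x 3).

    The weight ramps linearly from 1.0 (all strip_a) at one edge of the overlap
    to 0.0 (all strip_b) at the other, smoothing the seam between hemispheres.
    """
    w = strip_a.shape[1]
    ramp = np.linspace(1.0, 0.0, w).reshape(1, w, 1)  # per-column weight
    return ramp * strip_a.astype(float) + (1.0 - ramp) * strip_b.astype(float)
```

A production converter would first register the two edges (the pixel-matching step described above) before blending; this sketch shows only the averaging itself.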
FIG. 5A shows lenses 203 and 204 positioned remotely from cameras 201 and 202. Here, the image planes I203 and I204 fall between lenses 203 and 204 and the optical axes of the lenses 203 and 204 are collinear as designated by line A. Electrical, optical (including fiber optic lines), or electro-optical links 215 connect the images received from lenses 203 and 204 to the cameras 201 and 202. Next, hemispherical-to-spherical image converter 216 receives the outputs from cameras 201 and 202 and outputs a spherical image as described in relation to FIG. 4.
FIG. 5B shows single lens 203 positioned remotely from camera 201. Electrical, optical (including fiber optic lines), or electro-optical links 215 connect the image received from lens 203 to the camera 201. Next, camera 201 captures a first hemispherical image. The output of camera 201 (a still or video image contained in a frame or frames of film, digital or analog signal) is sent to mirror image converter 901 and one input of the hemispherical-to-spherical image converter 216. The mirror image converter 901 assumes many forms depending on the form of image relayed to it. For developed film, converter 901 refers to a re-scanning system re-scanning the developed film with the film flipped (flipped from a left to right orientation to a right to left orientation). For an optical or electrical signal, converter 901 refers to a signal processing system which automatically creates a second hemispherical image from the first hemispherical image. The output of converter 901 flows to the hemispherical-to-spherical image converter 216 as the second hemispherical image. Finally, hemispherical-to-spherical image converter 216 outputs a spherical image as described in relation to FIG. 4.
FIG. 6A shows an alternative arrangement of the cameras 201 and 202 and the lenses 203 and 204. The optical axes of the lenses 203 and 204 are collinear as designated by line A. Here, the devices used to convey the images from lenses 203 and 204 to cameras 201 and 202 include hollow chambers with reflective optics 215B and refractive optics 215C, as necessary for proper transmission of the hemispherical images. The reflective optics 215B allow the cameras 201 and 202 to be moved from a location directly behind each lens 203, 204. The refractive optics 215C aid in focusing the hemispherical images generated by lenses 203 and 204. This movement of the cameras from behind the lenses allows the lenses to be moved closer together, maximizing the area photographed.
A further modification includes the substitution of the APS camera array of co-pending U.S. application Ser. No. 08/373,446 (expressly incorporated herein by reference) for the optical system described above. Because of the small size of an APS camera array, two arrays may be placed back to back to further maximize the content of each hemispherical image. An advantage of using APS camera arrays is the shifted processing location of the Omniview engine. Specifically, by adding additional processing circuitry on the APS camera array chip, the selection and “dewarping” transformations may be performed locally on the APS chip. This results in less subsequent processing of the image as well as a reduction in the bandwidth required for sending each hemispherical image to an external processing device.
Furthermore, as described above, image conduits 215 may include optical fibers instead of the reflective optics 215B and refractive optics 215C. An imaging system including optical fibers connected between a hemispherical lens and imaging array is found in U.S. Pat. No. 5,313,306 to Martin, which is expressly incorporated by reference. The present invention includes the application of the spherical imaging system with a combination of an endoscope and dual hemispherical lenses to capture hemispherical images of remote locations. Converter 216 combines the hemispherical images to form a complete, spherical image.
FIG. 6B relates to another embodiment where a single camera 201A captures the images produced by lenses 203 and 204. The optical axes of the lenses 203 and 204 are collinear as designated by line A. Here, employing a single camera to capture both hemispherical images (from lenses 203 and 204) eliminates the bulk of the second camera. For example, where camera 201A is a still camera, the camera records the two hemispherical images in a single frame in a side-by-side relationship, exposed at the same time or during related time intervals. Alternatively, the two images may be captured in separate frames, exposed at the same time or during related time intervals. The same applies to video and motion picture cameras as well. Image capture with a single camera may be used in the other embodiments described in greater detail herein. A system of FIG. 6B including an APS camera array may be mounted onto a single silicon chip. This combination has multiple advantages including reduced size of the image capture system, reduced bulk from extra cameras, and higher resolution from the APS camera arrays.
FIG. 7A shows first 205 and second 206 hemispherical images, each taken from one of cameras 201 or 202. FIG. 7A also shows the edges 207, 208 (or seams) of each hemispherical image. FIG. 7B shows the two images 205 and 206 combined into a single, spherical image 209. Seams 207 and 208 have been combined to form the single, seamless image 209.
FIG. 8 shows a possible future viewing system for viewing the formed spherical image. The image planes I203 and I204 fall between lenses 203 and 204 and the optical axes of the lenses 203 and 204 are collinear as designated by line A. Image input buffer 217 temporarily stores images received from cameras 201 and 202 until hemispherical-to-spherical image converter 216 accepts the stored images. Also, FIG. 8 includes options for the spherical images. For example, after combining the two hemispherical images into a single, spherical image in converter 216, the spherical image may be immediately viewed through viewing engine 218. Viewing engine 218 includes the Omniview calculation engine with viewer communication interface 124 as shown in FIG. 1 of co-pending U.S. Ser. No. 08/373,446 (expressly incorporated herein by reference). Here, the user may view selected portions of the formed spherical image as output from the hemispherical-to-spherical image converter 216. Alternatively, the spherical image may be stored in storage device 219. The storage device 219 may include video tape, CD-ROM, semiconductor devices, magnetic or magneto-optical disks, or laser disks as the storage medium. By the interconnections between viewing engine 218 and storage device 219, a new spherical image may be displayed and saved in storage device 219 as well as saved in storage device 219 and viewed at a later time.
Further enhancements include using two side-by-side hemispherical lens equipped cameras for stereo-optical viewing. Additionally, the back-to-back camera system described herein may be attached to the exterior of any of a number of different vehicles for spherical image capture of a number of different environments.
Captured Image Transformation
FIGS. 9-13 relate to the captured image transformation system.
In order to minimize the size of the camera orientation system while maintaining the ability to zoom, a camera orientation system that utilizes electronic image transformation rather than mechanisms was developed. While numerous patents on mechanical pan-and-tilt systems have been filed, no approach using strictly electronic transforms and 180° or greater field of view optics is known to have been successfully implemented. In addition, the electro-optical approach utilized in the present invention allows multiple images to be extracted from the output of a single camera. These images can then be utilized to energize appropriate alarms, for example, as a specific application of the basic image transformation in connection with a surveillance system. As utilized herein, the term “surveillance” has a wide range including, but not limited to, determining ingress or egress from a selected environment. Further, the term “wide angle” as used herein means a field-of-view of about eighty degrees or greater. Motivation for this device came from viewing system requirements in remote handling applications where the operating envelope of the equipment is a significant constraint to task accomplishment.
The principles of the optical transform utilized in the present invention can be understood by reference to the system 10 of FIGS. 9A and 9B. (This is also set forth in the aforecited U.S. patent application Ser. No. 07/699,366 that is incorporated herein by reference.) Referring to FIG. 9A, shown schematically at 11 is a wide angle, e.g., a hemispherical, lens that provides an image of the environment with a 180 degree or greater field-of-view. The lens is attached to a camera 12 which converts the optical image into an electrical signal. These signals are then digitized electronically 13 and stored in an image buffer 14 within the present invention. An image processing system consisting of an X-MAP and a Y-MAP processor shown as 16 and 17, respectively, performs the two-dimensional transform mapping. The image transform processors are controlled by the microcomputer and control interface 15. The microcomputer control interface provides initialization and transform parameter calculation for the system. The control interface also determines the desired transformation coefficients based on orientation angle, magnification, rotation, and light sensitivity input from an input means such as a joystick controller 22 or computer input means 23. The transformed image is filtered by a 2-dimensional convolution filter 18 and the output of the filtered image is stored in an output image buffer 19. The output image buffer 19 is scanned out by display electronics 20 to a video display device 21 for viewing.
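The buffer-to-buffer remapping performed by the X-MAP and Y-MAP processors can be pictured in software as a lookup-table fetch. The Python/NumPy sketch below is only an illustration of that data flow (the patent implements it in dedicated hardware or on a personal computer); the map arrays are assumed to have been filled from the transform of Equations 17 and 18 developed later in this description, and the filtering stage is omitted.

```python
import numpy as np

def remap_with_maps(input_buffer: np.ndarray,
                    x_map: np.ndarray,
                    y_map: np.ndarray) -> np.ndarray:
    """Fill the output buffer by fetching, for every output pixel, the input
    pixel whose source coordinates are held in the X-MAP and Y-MAP tables.

    input_buffer: H_in x W_in (x channels) digitized wide-angle image.
    x_map, y_map: H_out x W_out integer tables of source column/row indices.
    """
    x = np.clip(x_map, 0, input_buffer.shape[1] - 1)
    y = np.clip(y_map, 0, input_buffer.shape[0] - 1)
    return input_buffer[y, x]  # NumPy integer indexing performs the per-pixel fetch
```

Because only the map tables change when the operator pans, tilts, rotates, or zooms, the per-frame work in such a sketch reduces to a fixed-cost remap, which is consistent with the real-time operation described below.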
A range of lens types can be accommodated to support various fields of view. The lens optics 11 correspond directly with the mathematical coefficients used with the X-MAP and Y-MAP processors 16 and 17 to transform the image. The capability to pan and tilt the output image remains even though a different maximum field-of-view is provided with a different lens element.
The invention can be realized by proper combination of a number of optical and electronic devices. The lens 11 is exemplified by any of a series of wide angle lenses from, for example, Nikon, particularly the 8 mm F2.8. Any video source 12 and image capturing device 13 that converts the optical image into electronic memory can serve as the input for the invention, such as a Videk Digital Camera interfaced with Texas Instruments' TMS 34061 integrated circuits. Input and output image buffers 14 and 19 can be constructed using Texas Instruments TMS44C251 video random access memory chips or their equivalents. The control interface can be accomplished with any of a number of microcontrollers including the Intel 80C196. The X-MAP and Y-MAP transform processors 16 and 17 and image filtering 19 can be accomplished with application specific integrated circuits or other means as will be known to persons skilled in the art. The display driver can also be accomplished with integrated circuits such as the Texas Instruments TMS34061. The output video signal can be of the NTSC RS-170 format, for example, compatible with most commercial television displays in the United States. Remote control 22 and computer control 23 are accomplished via readily available switches and/or computer systems that also will be well known. These components function as a system to select a portion of the input image (hemispherical or other wide angle) and then mathematically transform the image to provide the proper perspective for output. The keys to the success of the perspective correction system include:
    • (1) the entire input image need not be transformed, only the portion of interest;
    • (2) the required mathematical transform is predictable based on the lens characteristics; and
    • (3) calibration coefficients can be modified by the end user to correct for any lens/camera combination supporting both new and retrofit applications.
FIG. 9B contains elements similar to those of FIG. 9A but is implemented in a personal computer represented by dashed line D. The personal computer includes central processing unit 15′ performing the perspective correction algorithms X-MAP 16′ and Y-MAP 17′ as stored in RAM, ROM, or some other form. The display driver 20 outputs the perspective corrected image to computer display monitor 21′.
The transformation that occurs between the input memory buffer 14 and the output memory buffer 19, as controlled by the two coordinated transformation circuits 16 and 17 of FIG. 9A (or the algorithms stored at 16′ and 17′ of FIG. 9B), is better understood by referring to FIG. 10, which is a rendering of the image of a grid pattern produced by a hemispherical lens. This image has a field-of-view of 180 degrees and shows the contents of the environment throughout an entire hemisphere. Notice that the resulting image in FIG. 10 is significantly distorted relative to human perception. Similar distortion will be obtained even with lesser field-of-view lenses. Vertical grid lines in the environment appear in the image plane as 24a, 24b, and 24c. Horizontal grid lines in the environment appear in the image plane as 25a, 25b, and 25c. The image of an object is exemplified by 26. A portion of the image in FIG. 10 has been corrected, magnified, and rotated to produce the image shown in FIG. 11. Item 27 shows the corrected representation of the object in the output display. The results shown in the image in FIG. 11 can be produced from any portion of the image of FIG. 10 using the present invention. The corrected perspective of the view is demonstrated by the straightening of the grid pattern displayed in FIG. 11. In the present invention, these transformations can be performed at real-time video rates (e.g., thirty times per second), compatible with commercial video standards.
The transformation portion of the invention as described has the capability to pan and tilt the output image through the entire field-of-view of the lens element by changing the input means, e.g. the joystick or computer, to the controller. This allows a large area to be scanned for information as can be useful in security and surveillance applications. The image can also be rotated through any portion of 360 degrees on its axis, changing the perceived vertical of the displayed image. This capability provides the ability to align the vertical image with the gravity vector to maintain a proper perspective in the image display regardless of the pan or tilt angle of the image. The invention also supports modifications in the magnification used to display the output image. This is commensurate with a zoom function that allows a change in the field-of-view of the output image. This function is extremely useful for inspection and surveillance operations. The magnitude of zoom provided is a function of the resolution of the input camera, the resolution of the output display, the clarity of the output display, and the amount of picture element (pixel) averaging that is used in a given display. The invention supports all of these functions to provide capabilities associated with traditional mechanical pan (through 180 degrees), tilt (through 180 degrees), rotation (through 360 degrees), and zoom devices. The digital system also supports image intensity scaling that emulates the functionality of a mechanical iris by shifting the intensity of the displayed image based on commands from the user or an external computer.
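The intensity scaling mentioned at the end of the paragraph amounts to applying a user-commanded gain to every pixel. The short Python/NumPy sketch below (illustrative only, with an 8-bit image assumed) shows such a digital iris.

```python
import numpy as np

def apply_iris(image: np.ndarray, gain: float) -> np.ndarray:
    """Shift the displayed intensity by a commanded gain, emulating a mechanical iris."""
    return np.clip(image.astype(float) * gain, 0, 255).astype(np.uint8)
```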
The postulates and equations that follow are based on the image transformation portion of the present invention utilizing a wide angle lens as the optical element. These also apply to other field-of-view lens systems. There are two basic properties and two basic postulates that describe the perfect wide angle lens system. The first property of such a lens is that the lens has a 2π steradian field-of-view and the image it produces is a circle. The second property is that all objects in the field-of-view are in focus, i.e. the perfect wide angle lens has an infinite depth-of-field. The two important postulates of this lens system (refer to FIGS. 12 and 13) are stated as follows:
Postulate 1: Azimuth angle invariability—For object points that lie in a content plane that is perpendicular to the image plane and passes through the image plane origin, all such points are mapped as image points onto the line of intersection between the image plane and the content plane, i.e. along a radial line. The azimuth angle of the image points is therefore invariant to elevation and object distance changes within the content plane.
Postulate 2: Equidistant Projection Rule—The radial distance, r, from the image plane origin along the azimuth angle containing the projection of the object point is linearly proportional to the zenith angle β, where β is defined as the angle between a perpendicular line through the image plane origin and the line from the image plane origin to the object point. Thus the relationship:
r = kβ  (1)
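As a brief worked example under the equidistant projection assumption (the numbers are illustrative, not taken from the patent): a lens imaging a full hemisphere onto an image circle of radius R maps zenith angles from 0 to π/2 onto radial distances from 0 to R, so k = 2R/π; an object point at β = π/4, that is 45 degrees off the lens axis, therefore lands at r = R/2, halfway out along its radial line, whatever its azimuth.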
Using these properties and postulates as the foundation of the lens system, the mathematical transformation for obtaining a perspective corrected image can be determined. FIG. 12 shows the coordinate reference frames for the object plane and the image plane. The coordinates u,v describe object points within the object plane. The coordinates x,y,z describe points within the image coordinate frame of reference.
The object plane shown in FIG. 12 is a typical region of interest to determine the mapping relationship onto the image plane to properly correct the object. The direction of view vector, DOV[x,y,z], determines the zenith and azimuth angles for mapping the object plane, UV, onto the image plane, XY. The object plane is defined to be perpendicular to the vector, DOV[x,y,z].
The location of the origin of the object plane in terms of the image plane [x,y,z] in spherical coordinates is given by:
x = D sin β cos ∂
y = D sin β sin ∂
z = D cos β  (2)
where D = scalar length from the image plane origin to the object plane origin, β is the zenith angle, and ∂ is the azimuth angle in image plane spherical coordinates. The origin of the object plane is represented as a vector using the components given in Equation 2 as:
DOV[x,y,z] = [D sin β cos ∂, D sin β sin ∂, D cos β]  (3)
DOV[x,y,z] is perpendicular to the object plane and its scalar magnitude D provides the distance to the object plane. By aligning the XY plane with the direction of action of DOV[x,y,z], the azimuth angle ∂ becomes either 90 or 270 degrees and therefore the x component becomes zero, resulting in the DOV[x,y,z] coordinates:
DOV[x,y,z] = [0, −D sin β, D cos β]  (4)
Referring now to FIG. 13, the object point relative to the UV plane origin in coordinates relative to the origin of the image plane is given by the following:
x = u
y = v cos β
z = v sin β  (5)
therefore, the coordinates of a point P(u,v) that lies in the object plane can be represented as a vector P[x,y,z] in image plane coordinates:
P[x,y,z] = [u, v cos β, v sin β]  (6)
where P[x,y,z] describes the position of the object point in image coordinates relative to the origin of the UV plane. The object vector O[x,y,z] that describes the object point in image coordinates is then given by:
O[x,y,z] = DOV[x,y,z] + P[x,y,z]  (7)
O[x,y,z] = [u, v cos β − D sin β, v sin β + D cos β]  (8)
Projection onto a hemisphere of radius R attached to the image plane is determined by scaling the object vector O[x,y,z] to produce a surface vector S[x,y,z]:

S[x,y,z] = R O[x,y,z] / |O[x,y,z]|  (9)
By substituting for the components of O[x,y,z] from Equation 8, the vector S[x,y,z] describing the image point mapping onto the hemisphere becomes:

S[x,y,z] = R [u, (v cos β − D sin β), (v sin β + D cos β)] / sqrt(u² + (v cos β − D sin β)² + (v sin β + D cos β)²)  (10)
The denominator in Equation 10 represents the length or absolute value of the vector O[x,y,z] and can be simplified through algebraic and trigonometric manipulation to give:

S[x,y,z] = R [u, (v cos β − D sin β), (v sin β + D cos β)] / sqrt(u² + v² + D²)  (11)
From Equation 11, the mapping onto the two-dimensional image plane can be obtained for both x and y as:

x = R u / sqrt(u² + v² + D²)  (12)

y = R (v cos β − D sin β) / sqrt(u² + v² + D²)  (13)
Additionally, the image plane center to object plane distance D can be represented in terms of the image circular radius R by the relation:
D = mR  (14)
    • where m represents the scale factor in radial units R from the image plane origin to the object plane origin. Substituting Equation 14 into Equations 12 and 13 provides a means for obtaining an effective scaling operation or magnification which can be used to provide zoom operation.

x = R u / sqrt(u² + v² + m²R²)  (15)

y = R (v cos β − m R sin β) / sqrt(u² + v² + m²R²)  (16)
Using the equations for two-dimensional rotation of axes for both the UV object plane and the XY image plane, the last two equations can be further manipulated to provide a more general set of equations that provides for rotation within the image plane and rotation within the object plane.

x = R [u A − v B + m R sin β sin ∂] / sqrt(u² + v² + m²R²)  (17)

y = R [u C − v D − m R sin β cos ∂] / sqrt(u² + v² + m²R²)  (18)
    • where:
      A=(cos ø cos ∂−sin ø sin ∂ cos β)
      B=(sin ø cos ∂+cos ø sin ∂ cos β)
      C=(cos ø sin ∂+sin ø cos ∂ cos β)
      D=(sin ø sin ∂−cos ø cos ∂ cos β)  (19)
    • and where:
    • R=radius of the image circle
    • β=zenith angle
    • ∂=Azimuth angle in image plane
    • ø=Object plane rotation angle
    • m=Magnification
    • u,v=object plane coordinates
    • x,y=image plane coordinates
Equations 17 and 18 provide a direct mapping from the UV space to the XY image space and are the fundamental mathematical result that supports the functioning of the present omnidirectional viewing system with no moving parts. By knowing the desired zenith, azimuth, and object plane rotation angles and the magnification, the locations of x and y in the imaging array can be determined. This approach provides a means to transform an image from the input video buffer to the output video buffer exactly. Also, the image system is completely symmetrical about the zenith; therefore, the vector assignments and resulting signs of various components can be chosen differently depending on the desired orientation of the object plane with respect to the image plane. In addition, these postulates and mathematical equations can be modified for various lens elements as necessary for the desired field-of-view coverage in a given application.
The input means defines the zenith angle, β, the azimuth angle, ∂, the object rotation, ø, and the magnification, m. These values are substituted into Equations 19 to determine values for substitution into Equations 17 and 18. The image circle radius, R, is a fixed value that is determined by the camera lens and element relationship. The variables u and v vary throughout the object plane, determining the values for x and y in the image plane coordinates.
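One way to read the buffer-to-buffer transform described above is as an inverse mapping: each output pixel is treated as an object-plane point (u, v), Equations 17 and 18 give the corresponding location (x, y) in the distorted input image, and that sample is copied into the output. The sketch below assumes the illustrative fisheye_xy helper shown earlier, nearest-neighbour sampling, and an input image stored as a list of pixel rows with the image-circle centre at (cx, cy); none of these choices are prescribed by the patent.

    def correct_view(input_img, out_w, out_h, beta, delta, phi, m, R, cx, cy):
        # Fill an out_h x out_w raster with the perspective-corrected view.
        output = [[(0, 0, 0)] * out_w for _ in range(out_h)]
        for row in range(out_h):
            for col in range(out_w):
                u = col - out_w / 2.0          # object-plane coordinates centred on (0, 0)
                v = row - out_h / 2.0
                x, y = fisheye_xy(u, v, beta, delta, phi, m, R)
                src_col = int(round(cx + x))
                src_row = int(round(cy + y))
                if 0 <= src_row < len(input_img) and 0 <= src_col < len(input_img[0]):
                    output[row][col] = input_img[src_row][src_col]
        return output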
From the foregoing, it can be seen that a wide angle lens provides a substantially hemispherical view that is captured by a camera. The image is then transformed into a corrected image at a desired pan, tilt, magnification, rotation, and focus based on the desired view as described by a control input. The image is then output to a television display with the perspective corrected. Accordingly, no mechanical devices are required to attain this extensive analysis and presentation of the view of an environment through 180 degrees of pan, 180 degrees of tilt, 360 degrees of rotation, and various degrees of zoom magnification.
As indicated above, one application for the perspective correction of images obtained with a motionless wide angle camera is in the field of surveillance. The term “surveillance” is meant to include inspection and like operations as well. It is often desired to continuously or periodically view a selected environment to determine activity in that environment. The term “environment” is meant to include such areas as rooms, warehouses, parks and the like. This activity might be, for example, ingress and egress of some object relative to that environment. It might also be some action that is taking place in that environment. It may be desired to carry out this surveillance either automatically at the desired frequency (or continuously), or upon demand by an operator. The size of the environment may require more than one motionless camera for complete surveillance.
While a preferred embodiment has been shown and described, it will be understood that it is not intended to limit the disclosure, but rather it is intended to cover all modifications and alternate methods falling within the spirit and the scope of the invention as defined in the appended claims. All of the above referenced U.S. patents and pending applications referenced herein are expressly incorporated by reference.
Having thus described the aforementioned invention,

Claims (38)

1. A system for providing perspective corrected views of a selected portion of a received optical image captured using a wide angle lens, the received optical image being distorted, the system comprising:
image capture means for receiving signals corresponding to said received optical image and for digitising said signal;
input image memory means for receiving said digitised signal;
input means for selecting a non-predetermined portion of said received image to view;
image transform processor means for processing said digitised signals to produce an output signal corresponding to a perspective corrected image of said selected portion of said received image;
output image memory means for receiving said output signal from said image transform processor means; and
output means connected to said output image memory means for recording or displaying said perspective corrected image of said selected portion;
characterised in that said image transform processor means comprises transform parameter calculation means for calculating transform parameters in response to the selection of said selected portion of said image and processes said digitised signal based on said calculated transform parameters to generate said output signal.
13. A method for providing perspective corrected views of a selected portion of an optical image captured with a wide angle lens, the received optical image being distorted, the method comprising:
providing a digitised signal corresponding to said optical image;
selecting a non-predetermined portion of said optical image;
transforming said digitised signal to produce an output signal corresponding to a perspective corrected image of said selected portion of said received image; and
displaying or recording said perspective corrected image of said selected portion;
characterised in that said step of transforming said digitised signal comprises calculating transform parameters in response to the selection of said selected portion of said image, said calculated transform parameters being used to control said transformation of the digitised signal to generate said output signal.
29. A method for providing perspective corrected views of a selected portion of a spherical image comprising two images captured with a fisheye lens, the received spherical image being distorted, the method comprising:
providing a digitised signal corresponding to said spherical image;
selecting a portion of said spherical image;
transforming said digitised signal to produce an output signal corresponding to a perspective corrected image of said selected portion of said spherical image; and
displaying or recording said perspective corrected image of said selected portion;
characterised in that said step of transforming said digitised signal comprises calculating transform parameters for said selected portion of said image, said calculated transform parameters being used to control said transformation of the digitised signal to generate said output signal.
30. A system for providing perspective corrected views of a selected portion of a received optical image captured using a wide angle lens, the received optical image being distorted, the system comprising:
image capture means for receiving signals corresponding to said received optical image and for digitizing said signal;
input image memory means for receiving said digitized signal;
input means for selecting a portion of said received image to view;
image transform processor means for processing said digitized signals to produce an output signal corresponding to a perspective corrected image of said selected portion of said received image;
output image memory means for receiving said output signal from said image transform processor means; and
output means connected to said output image memory means for recording or displaying said perspective corrected image of said selected portion;
characterized in that the input means is adapted to input a pan, tilt and magnification for the selected portion; and said image transform processor means comprises transform parameter calculation means for calculating transform parameters in response to the input pan, tilt and magnification for the selected portion of said image and processes said digitized signal based on said calculated transform parameters to generate said output signal.
31. A method for providing perspective corrected views of a selected portion of an optical image captured with a wide angle lens, the received optical image being distorted, the method comprising:
providing a digitized signal corresponding to said optical image;
selecting a portion of said optical image; transforming said digitized signal to produce an output signal corresponding to a perspective corrected image of said selected portion of said received image; and
displaying or recording said perspective corrected image of said selected portion;
characterized in that the step of selecting the portion comprises inputting a pan, tilt and magnification; and said step of transforming said digitized signal comprises calculating transform parameters in response to the input pan, tilt and magnification for the selected portion of said image, said calculated transform parameters being used to control said transformation of the digitized signal to generate said output signal.
US09/315,9621991-05-131999-05-21Omniview motionless camera orientation systemExpired - Fee RelatedUS7382399B1 (en)

Priority Applications (2)

Application NumberPriority DateFiling DateTitle
US09/315,962US7382399B1 (en)1991-05-131999-05-21Omniview motionless camera orientation system
US12/102,699US20090040291A1 (en)1991-05-132008-04-14Omniview motionless camera orientation system

Applications Claiming Priority (8)

Application NumberPriority DateFiling DateTitle
US07/699,366US5185667A (en)1991-05-131991-05-13Omniview motionless camera orientation system
US08/014,508US5359363A (en)1991-05-131993-02-08Omniview motionless camera surveillance system
US08/189,585US5384588A (en)1991-05-131994-01-31System for omindirectional image viewing at a remote location without the transmission of control signals to select viewing parameters
US33966394A1994-11-141994-11-14
US08/373,446US6243131B1 (en)1991-05-131995-01-17Method for directly scanning a rectilinear imaging element using a non-linear scan
US38691295A1995-02-081995-02-08
US08/863,584US6002430A (en)1994-01-311997-05-27Method and apparatus for simultaneous capture of a spherical image
US09/315,962US7382399B1 (en)1991-05-131999-05-21Omniview motionless camera orientation system

Related Parent Applications (1)

Application NumberTitlePriority DateFiling Date
US08/863,584DivisionUS6002430A (en)1991-05-131997-05-27Method and apparatus for simultaneous capture of a spherical image

Related Child Applications (1)

Application NumberTitlePriority DateFiling Date
US12/102,699DivisionUS20090040291A1 (en)1991-05-132008-04-14Omniview motionless camera orientation system

Publications (1)

Publication NumberPublication Date
US7382399B1true US7382399B1 (en)2008-06-03

Family

ID=40346066

Family Applications (2)

Application NumberTitlePriority DateFiling Date
US09/315,962Expired - Fee RelatedUS7382399B1 (en)1991-05-131999-05-21Omniview motionless camera orientation system
US12/102,699AbandonedUS20090040291A1 (en)1991-05-132008-04-14Omniview motionless camera orientation system

Family Applications After (1)

Application NumberTitlePriority DateFiling Date
US12/102,699AbandonedUS20090040291A1 (en)1991-05-132008-04-14Omniview motionless camera orientation system

Country Status (1)

CountryLink
US (2)US7382399B1 (en)

Cited By (90)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US20060043264A1 (en)*2004-09-022006-03-02Casio Computer Co., Ltd.Imaging apparatus, image processing method for imaging apparatus, recording medium and carrier wave signal
US20070263093A1 (en)*2006-05-112007-11-15Acree Elaine SReal-time capture and transformation of hemispherical video images to images in rectilinear coordinates
US20090002855A1 (en)*2007-06-292009-01-01Cameron WagnerThermally controlled solid immersion lens fixture
US20100010301A1 (en)*2008-07-082010-01-14Hale Eric LSolid State Variable Direction of View Endoscope
US20100013906A1 (en)*2008-07-172010-01-21Border John NZoom by multiple image capture
US7714936B1 (en)*1991-05-132010-05-11Sony CorporationOmniview motionless camera orientation system
US20110134245A1 (en)*2009-12-072011-06-09Irvine Sensors CorporationCompact intelligent surveillance system comprising intent recognition
US8209051B2 (en)2002-07-252012-06-26Intouch Technologies, Inc.Medical tele-robotic system
CN102694968A (en)*2011-03-252012-09-26鸿富锦精密工业(深圳)有限公司 Camera device and surrounding view monitoring method thereof
US8340819B2 (en)2008-09-182012-12-25Intouch Technologies, Inc.Mobile videoconferencing robot system with network adaptive driving
US8384755B2 (en)2009-08-262013-02-26Intouch Technologies, Inc.Portable remote presence robot
US20130050408A1 (en)*2011-08-312013-02-28Kensuke MasudaImaging optical system, imaging device and imaging system
US8401275B2 (en)2004-07-132013-03-19Intouch Technologies, Inc.Mobile robot with a head-based movement mapping scheme
US8515577B2 (en)2002-07-252013-08-20Yulun WangMedical tele-robotic system with a master remote station with an arbitrator
US8670017B2 (en)2010-03-042014-03-11Intouch Technologies, Inc.Remote presence system including a cart that supports a robot face and an overhead camera
US20140071226A1 (en)*2012-09-112014-03-13Hiroyuki SatohImage capture system and imaging optical system
US20140092017A1 (en)*2012-09-282014-04-03National Taiwan Normal UniversityInteractive simulated-globe display system
US8718837B2 (en)2011-01-282014-05-06Intouch TechnologiesInterfacing with a mobile telepresence robot
EP2727513A1 (en)2012-11-012014-05-07Karl Storz Imaging Inc.Solid state variable direction of view endoscope with rotatable wide-angle field for maximal image performance
WO2014093902A1 (en)*2012-12-132014-06-19Microsoft CorporationDisplacing image on imager in multi-lens cameras
US8758234B2 (en)2008-07-082014-06-24Karl Storz Imaging, Inc.Solid state variable direction of view endoscope
US8771177B2 (en)2008-07-082014-07-08Karl Storz Imaging, Inc.Wide angle flexible endoscope
US8836751B2 (en)2011-11-082014-09-16Intouch Technologies, Inc.Tele-presence system with a user interface that displays different communication links
US8849680B2 (en)2009-01-292014-09-30Intouch Technologies, Inc.Documentation through a remote presence robot
US8849679B2 (en)2006-06-152014-09-30Intouch Technologies, Inc.Remote controlled robot system that provides medical images
US8861750B2 (en)2008-04-172014-10-14Intouch Technologies, Inc.Mobile tele-presence system with a microphone system
US8892260B2 (en)2007-03-202014-11-18Irobot CorporationMobile robot for telecommunication
US8897920B2 (en)2009-04-172014-11-25Intouch Technologies, Inc.Tele-presence robot system with software modularity, projector and laser pointer
US8902278B2 (en)2012-04-112014-12-02Intouch Technologies, Inc.Systems and methods for visualizing and managing telepresence devices in healthcare networks
US8902322B2 (en)2012-11-092014-12-02Bubl Technology Inc.Systems and methods for generating spherical images
US8930019B2 (en)2010-12-302015-01-06Irobot CorporationMobile human interface robot
US8935005B2 (en)2010-05-202015-01-13Irobot CorporationOperating a mobile robot
US8996165B2 (en)2008-10-212015-03-31Intouch Technologies, Inc.Telepresence robot with a camera boom
US9014848B2 (en)2010-05-202015-04-21Irobot CorporationMobile robot system
KR20150066930A (en)*2013-12-092015-06-17씨제이씨지브이 주식회사Method for generating images of multi-projection theater and image manegement apparatus using the same
EP2779620A4 (en)*2011-11-072015-06-24Sony Computer Entertainment Inc IMAGE GENERATING DEVICE AND IMAGE GENERATING METHOD
US9098611B2 (en)2012-11-262015-08-04Intouch Technologies, Inc.Enhanced video interaction for a user interface of a telepresence network
US20150235383A1 (en)*2014-02-142015-08-20Daum Communications Corp.Image Database Constructing Method and Device Using the Same
US9138891B2 (en)2008-11-252015-09-22Intouch Technologies, Inc.Server connectivity control for tele-presence robot
US9160783B2 (en)2007-05-092015-10-13Intouch Technologies, Inc.Robot system that operates through a network firewall
US9174342B2 (en)2012-05-222015-11-03Intouch Technologies, Inc.Social behavior rules for a medical telepresence robot
US9193065B2 (en)2008-07-102015-11-24Intouch Technologies, Inc.Docking system for a tele-presence robot
US9198728B2 (en)2005-09-302015-12-01Intouch Technologies, Inc.Multi-camera mobile teleconferencing platform
US9251313B2 (en)2012-04-112016-02-02Intouch Technologies, Inc.Systems and methods for visualizing and managing telepresence devices in healthcare networks
US9264664B2 (en)2010-12-032016-02-16Intouch Technologies, Inc.Systems and methods for dynamic bandwidth allocation
RU2579004C1 (en)*2015-05-052016-03-27Вячеслав Михайлович СмелковDevice for computer system for panoramic television surveillance with implementation of exchange of image parameters
US9296107B2 (en)2003-12-092016-03-29Intouch Technologies, Inc.Protocol for a remotely controlled videoconferencing robot
US9323250B2 (en)2011-01-282016-04-26Intouch Technologies, Inc.Time-dependent navigation of telepresence robots
US9361021B2 (en)2012-05-222016-06-07Irobot CorporationGraphical user interfaces including touchpad driving interfaces for telemedicine devices
US9498886B2 (en)2010-05-202016-11-22Irobot CorporationMobile human interface robot
US9529824B2 (en)*2013-06-052016-12-27Digitalglobe, Inc.System and method for multi resolution and multi temporal image search
US9582731B1 (en)*2014-04-152017-02-28Google Inc.Detecting spherical images
WO2017120379A1 (en)*2016-01-062017-07-13360fly, Inc.Modular panoramic camera systems
US9729788B2 (en)2011-11-072017-08-08Sony CorporationImage generation apparatus and image generation method
US9763563B2 (en)2012-07-112017-09-19Karl Storz Imaging, Inc.Endoscopic camera single-button mode activation
US20170270633A1 (en)*2016-03-152017-09-21Microsoft Technology Licensing, LlcBowtie view representing a 360-degree image
US9842192B2 (en)2008-07-112017-12-12Intouch Technologies, Inc.Tele-presence robot system with multi-cast features
US9883101B1 (en)*2014-07-232018-01-30Hoyos Integrity CorporationProviding a real-time via a wireless communication channel associated with a panoramic video capture device
US9894272B2 (en)2011-11-072018-02-13Sony Interactive Entertainment Inc.Image generation apparatus and image generation method
US9930225B2 (en)2011-02-102018-03-27Villmer LlcOmni-directional camera and related viewing software
US9974612B2 (en)2011-05-192018-05-22Intouch Technologies, Inc.Enhanced diagnostics for a telepresence robot
US10002406B2 (en)2016-10-032018-06-19Samsung Electronics Co., Ltd.Consistent spherical photo and video orientation correction
US10059000B2 (en)2008-11-252018-08-28Intouch Technologies, Inc.Server connectivity control for a tele-presence robot
US10092169B2 (en)2008-07-082018-10-09Karl Storz Imaging, Inc.Solid state variable direction of view endoscope
US20190005709A1 (en)*2017-06-302019-01-03Apple Inc.Techniques for Correction of Visual Artifacts in Multi-View Images
US10284776B2 (en)2011-11-072019-05-07Sony Interactive Entertainment Inc.Image generation apparatus and image generation method
US10343283B2 (en)2010-05-242019-07-09Intouch Technologies, Inc.Telepresence robot system that can be accessed by a cellular phone
US10444955B2 (en)2016-03-152019-10-15Microsoft Technology Licensing, LlcSelectable interaction elements in a video stream
US10471588B2 (en)2008-04-142019-11-12Intouch Technologies, Inc.Robotic based health care system
US20190385274A1 (en)*2015-08-122019-12-19Gopro, Inc.Equatorial stitching of hemispherical images in a spherical image capture system
US10593014B2 (en)*2018-03-262020-03-17Ricoh Company, Ltd.Image processing apparatus, image processing system, image capturing system, image processing method
US10754242B2 (en)2017-06-302020-08-25Apple Inc.Adaptive resolution and projection format in multi-direction video
US10769739B2 (en)2011-04-252020-09-08Intouch Technologies, Inc.Systems and methods for management of information among medical providers and facilities
US10769471B2 (en)*2018-10-032020-09-08Karl Storz Se & Co. KgSystem and method for holding an image display apparatus
US10808882B2 (en)2010-05-262020-10-20Intouch Technologies, Inc.Tele-robotic system with a robot face placed on a chair
US10875182B2 (en)2008-03-202020-12-29Teladoc Health, Inc.Remote presence system mounted to operating room hardware
US10924747B2 (en)2017-02-272021-02-16Apple Inc.Video coding techniques for multi-view video
US10999602B2 (en)2016-12-232021-05-04Apple Inc.Sphere projected motion estimation/compensation and mode decision
US11093752B2 (en)2017-06-022021-08-17Apple Inc.Object tracking in multi-view video
US11154981B2 (en)2010-02-042021-10-26Teladoc Health, Inc.Robot user interface for telepresence robot system
US11252328B2 (en)*2019-02-082022-02-15Canon Kabushiki KaishaElectronic device and method for controlling the same
US11259046B2 (en)2017-02-152022-02-22Apple Inc.Processing of equirectangular object data to compensate for distortion by spherical projections
US11389064B2 (en)2018-04-272022-07-19Teladoc Health, Inc.Telehealth cart that supports a removable tablet with seamless audio/video switching
US11399153B2 (en)2009-08-262022-07-26Teladoc Health, Inc.Portable telepresence apparatus
US11636944B2 (en)2017-08-252023-04-25Teladoc Health, Inc.Connectivity infrastructure for a telehealth platform
US11742094B2 (en)2017-07-252023-08-29Teladoc Health, Inc.Modular telehealth cart with thermal imaging and touch screen user interface
US11862302B2 (en)2017-04-242024-01-02Teladoc Health, Inc.Automated transcription and documentation of tele-health encounters
US12093036B2 (en)2011-01-212024-09-17Teladoc Health, Inc.Telerobotic system with a dual application screen presentation
EP4407373A3 (en)*2015-03-182024-10-30GoPro, Inc.Unibody dual-lens mount for a spherical camera
US12224059B2 (en)2011-02-162025-02-11Teladoc Health, Inc.Systems and methods for network-based counseling

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
KR101473215B1 (en)*2008-04-182014-12-17삼성전자주식회사Apparatus for generating panorama image and method therof
JP5724755B2 (en)*2011-08-262015-05-27株式会社リコー Imaging system
JP2013214947A (en)*2012-03-092013-10-17Ricoh Co LtdImage capturing apparatus, image capturing system, image processing method, information processing apparatus, and program
US10666860B2 (en)*2012-09-112020-05-26Ricoh Company, Ltd.Image processor, image processing method and program, and imaging system
KR102172354B1 (en)*2013-06-282020-10-30삼성전자주식회사Image file generating method and apparatus thereof
CN104883513A (en)*2014-02-282015-09-02系统电子工业股份有限公司 Image processing device for 720-degree surround photography
JP5920507B1 (en)*2015-03-102016-05-18株式会社リコー Image processing system, image processing method, and program
US20160295126A1 (en)*2015-04-032016-10-06Capso Vision, Inc.Image Stitching with Local Deformation for in vivo Capsule Images
KR102249946B1 (en)*2015-09-042021-05-11삼성전자주식회사Apparatus and method for controlling a image capture and a image output
EP3190460A1 (en)*2016-01-052017-07-12GiropticImage capturing device on a moving body
CN108462838B (en)*2018-03-162020-10-02影石创新科技股份有限公司Panoramic video anti-shake method and device and portable terminal

Citations (72)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US1282177A (en)1916-11-061918-10-22George Stevens BlankenhornMethod of and apparatus for making panoramic pictures.
US3240140A (en)1962-05-171966-03-15Bank Of America Nat Trust & Savings AssPanoramic camera with removable film casette
US4152724A (en)1975-05-211979-05-01Elliott Brothers (London) LimitedMissile guidance systems
EP0011909A1 (en)1978-08-111980-06-11E.I. Du Pont De Nemours And CompanyX-ray intensifying screen based on a tantalate phosphor and process for producing the phosphor
US4214821A (en)1978-10-301980-07-29Termes Richard ATotal environment photographic mount and photograph
WO1982003712A1 (en)1981-04-101982-10-28Gabriel Steven AllenController for system for spatially transforming images
US4406530A (en)1981-05-121983-09-27Riso Kagaku CorporationReflecting type overhead projector with automatically variable safety illumination
US4463372A (en)1982-03-241984-07-31Ampex CorporationSpatial transformation system including key signal generator
US4463380A (en)1981-09-251984-07-31Vought CorporationImage processing system
US4472732A (en)1981-04-101984-09-18Ampex CorporationSystem for spatially transforming images
US4493554A (en)1979-02-271985-01-15DiffractoMethod and apparatus for determining physical characteristics of objects and object surfaces
US4513374A (en)1981-09-251985-04-23Ltv Aerospace And DefenseMemory system
US4528585A (en)*1983-03-301985-07-09Rca CorporationTelevision receiver having picture magnifying apparatus
US4549208A (en)*1982-12-221985-10-22Hitachi, Ltd.Picture processing apparatus
US4613898A (en)1983-05-161986-09-23Barr & Stroud LimitedImaging systems
US4631750A (en)1980-04-111986-12-23Ampex CorporationMethod and system for spacially transforming images
US4656506A (en)1983-02-251987-04-07Ritchey Kurtis JSpherical projection system
US4660969A (en)1984-08-081987-04-28Canon Kabushiki KaishaDevice for searching objects within wide visual field
US4672435A (en)1984-07-211987-06-09Krauss-Maffei A.G.Observation and reconnaissance system for armored vehicles
US4736436A (en)1984-04-131988-04-05Fujitsu LimitedInformation extraction by mapping
US4772942A (en)1986-01-111988-09-20Pilkington P.E. LimitedDisplay system having wide field of view
US4807158A (en)1986-09-301989-02-21Daleco/Ivex Partners, Ltd.Method and apparatus for sampling images to simulate movement within a multidimensional space
US4841292A (en)1986-08-111989-06-20Allied-Signal Inc.Third dimension pop up generation from a two-dimensional transformed image display
US4858149A (en)1986-09-031989-08-15International Business Machines CorporationMethod and system for solid modelling
US4899293A (en)1988-10-241990-02-06Honeywell Inc.Method of storage and retrieval of digital map data based upon a tessellated geoid system
US4908874A (en)1980-04-111990-03-13Ampex CorporationSystem for spatially transforming images
JPH02127877A (en)1988-11-081990-05-16Casio Comput Co Ltd Electronic still camera with fisheye lens
US4949108A (en)1986-08-181990-08-14Verret Jean MichelImage shooting method for recording visual spheres and device for implementing such method
US4965844A (en)1985-04-031990-10-23Sony CorporationMethod and system for image transformation
US4989084A (en)1989-11-241991-01-29Wetzel Donald CAirport runway monitoring system
US5023725A (en)1989-10-231991-06-11Mccutchen DavidMethod and apparatus for dodecahedral imaging system
US5040746A (en)1990-08-141991-08-20The United States Of America As Represented By The Secretary Of The ArmyFinned projectile with supplementary fins
US5067019A (en)1989-03-311991-11-19The United States Of America As Represented By The Administrator Of The National Aeronautics And Space AdministrationProgrammable remapper for image processing
US5068735A (en)1989-08-221991-11-26Fuji Photo Optical Co., Ltd.System for controlling the aiming direction, focus, zooming, and/or position of a television camera
US5083389A (en)1988-07-151992-01-28Arthur AlperinPanoramic display device and method of making the same
US5130794A (en)1990-03-291992-07-14Ritchey Kurtis JPanoramic display system
US5175808A (en)1989-09-121992-12-29PixarMethod and apparatus for non-affine image warping
US5185667A (en)1991-05-131993-02-09Telerobotics International, Inc.Omniview motionless camera orientation system
US5200818A (en)1991-03-221993-04-06Inbal NetaVideo imaging system with interactive windowing capability
US5313306A (en)1991-05-131994-05-17Telerobotics International, Inc.Omniview motionless camera endoscopy system
US5359363A (en)1991-05-131994-10-25Telerobotics International, Inc.Omniview motionless camera surveillance system
US5384588A (en)1991-05-131995-01-24Telerobotics International, Inc.System for omindirectional image viewing at a remote location without the transmission of control signals to select viewing parameters
US5395363A (en)1993-06-291995-03-07Utah Medical ProductsDiathermy coagulation and ablation apparatus and method
US5396583A (en)1992-10-131995-03-07Apple Computer, Inc.Cylindrical to planar image mapping using scanline coherence
US5444478A (en)1992-12-291995-08-22U.S. Philips CorporationImage processing method and device for constructing an image from adjacent images
US5446833A (en)1992-05-081995-08-29Apple Computer, Inc.Textured sphere and spherical environment map rendering using texture map double indirection
WO1996008105A1 (en)1994-09-091996-03-14Motorola Inc.Method for creating image data
WO1996026610A1 (en)1995-02-231996-08-29Motorola Inc.Broadcasting plural wide angle images
US5657073A (en)1995-06-011997-08-12Panoramic Viewing Systems, Inc.Seamless multi-camera panoramic imaging with distortion correction and selectable field of view
US5684937A (en)1992-12-141997-11-04Oxaal; FordMethod and apparatus for performing perspective transformation on visible stimuli
US5764276A (en)1991-05-131998-06-09Interactive Pictures CorporationMethod and apparatus for providing perceived video viewing experiences using still images
US5903319A (en)1991-05-131999-05-11Interactive Pictures CorporationMethod for eliminating temporal and spacial distortion from interlaced video signals
US5903782A (en)1995-11-151999-05-11Oxaal; FordMethod and apparatus for producing a three-hundred and sixty degree spherical visual data set
US5990941A (en)1991-05-131999-11-23Interactive Pictures CorporationMethod and apparatus for the interactive display of any portion of a spherical image
US6002430A (en)1994-01-311999-12-14Interactive Pictures CorporationMethod and apparatus for simultaneous capture of a spherical image
US6005611A (en)*1994-05-271999-12-21Be Here CorporationWide-angle image dewarping method and apparatus
US6118454A (en)1996-10-162000-09-12Oxaal; FordMethods and apparatuses for producing a spherical visual data set using a spherical mirror and one or more cameras with long lenses
US6147709A (en)1997-04-072000-11-14Interactive Pictures CorporationMethod and apparatus for inserting a high resolution image into a low resolution interactive image to produce a realistic immersive experience
US6201574B1 (en)1991-05-132001-03-13Interactive Pictures CorporationMotionless camera orientation system distortion correcting sensing element
US6243099B1 (en)1996-11-142001-06-05Ford OxaalMethod for interactive viewing full-surround image data and apparatus therefor
US6243131B1 (en)1991-05-132001-06-05Interactive Pictures CorporationMethod for directly scanning a rectilinear imaging element using a non-linear scan
US6256061B1 (en)1991-05-132001-07-03Interactive Pictures CorporationMethod and apparatus for providing perceived video viewing experiences using still images
US6301447B1 (en)1991-05-132001-10-09Interactive Pictures CorporationMethod and system for creation and interactive viewing of totally immersive stereoscopic images
FR2821172A1 (en)2001-02-162002-08-23Immervision Internat Pte Ltd METHOD AND DEVICE FOR ORIENTATION OF A DIGITAL PANORAMIC IMAGE
FR2821156A1 (en)2001-02-162002-08-23Immervision Internat Pte Ltd METHOD AND DEVICE FOR OBTAINING A DIGITAL PANORAMIC IMAGE WITH CONSTANT TINT
FR2821167A1 (en)2001-02-162002-08-23Immervision Internat Pte Ltd PHOTOGRAPHIC SUPPORT DEVICE
WO2002093908A2 (en)2001-05-112002-11-216115187 Canada Inc.Method for capturing and displaying a variable-resolution digital panoramic image
US6492985B1 (en)1999-07-062002-12-10Internet Pictures CorporationPresenting manipulating and serving immersive images
FR2827680A1 (en)2001-07-202003-01-24Immervision Internat Pte LtdComputer screen display digital panoramic image having fish eye objective panorama image detector projected without reducing field forming non circular image/covering increase number pixels.
US6687387B1 (en)1999-12-272004-02-03Internet Pictures CorporationVelocity-dependent dewarping of images
US6731284B1 (en)1992-12-142004-05-04Ford OxaalMethod of and apparatus for performing perspective transformation of visible stimuli
US6788211B2 (en)2000-06-142004-09-07Edwards Systems Technology, Inc.Apparatus and method using smoke and/or gas sensing in cooking devices

Patent Citations (89)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US1282177A (en)1916-11-061918-10-22George Stevens BlankenhornMethod of and apparatus for making panoramic pictures.
US3240140A (en)1962-05-171966-03-15Bank Of America Nat Trust & Savings AssPanoramic camera with removable film casette
US4152724A (en)1975-05-211979-05-01Elliott Brothers (London) LimitedMissile guidance systems
EP0011909A1 (en)1978-08-111980-06-11E.I. Du Pont De Nemours And CompanyX-ray intensifying screen based on a tantalate phosphor and process for producing the phosphor
US4214821A (en)1978-10-301980-07-29Termes Richard ATotal environment photographic mount and photograph
US4493554A (en)1979-02-271985-01-15DiffractoMethod and apparatus for determining physical characteristics of objects and object surfaces
US4908874A (en)1980-04-111990-03-13Ampex CorporationSystem for spatially transforming images
US4631750A (en)1980-04-111986-12-23Ampex CorporationMethod and system for spacially transforming images
WO1982003712A1 (en)1981-04-101982-10-28Gabriel Steven AllenController for system for spatially transforming images
US4468688A (en)1981-04-101984-08-28Ampex CorporationController for system for spatially transforming images
US4472732A (en)1981-04-101984-09-18Ampex CorporationSystem for spatially transforming images
US4406530A (en)1981-05-121983-09-27Riso Kagaku CorporationReflecting type overhead projector with automatically variable safety illumination
US4513374A (en)1981-09-251985-04-23Ltv Aerospace And DefenseMemory system
US4463380A (en)1981-09-251984-07-31Vought CorporationImage processing system
US4463372A (en)1982-03-241984-07-31Ampex CorporationSpatial transformation system including key signal generator
US4549208A (en)*1982-12-221985-10-22Hitachi, Ltd.Picture processing apparatus
US4656506A (en)1983-02-251987-04-07Ritchey Kurtis JSpherical projection system
US4528585A (en)*1983-03-301985-07-09Rca CorporationTelevision receiver having picture magnifying apparatus
US4613898A (en)1983-05-161986-09-23Barr & Stroud LimitedImaging systems
US4736436A (en)1984-04-131988-04-05Fujitsu LimitedInformation extraction by mapping
US4672435A (en)1984-07-211987-06-09Krauss-Maffei A.G.Observation and reconnaissance system for armored vehicles
US4660969A (en)1984-08-081987-04-28Canon Kabushiki KaishaDevice for searching objects within wide visual field
US4965844A (en)1985-04-031990-10-23Sony CorporationMethod and system for image transformation
US4772942A (en)1986-01-111988-09-20Pilkington P.E. LimitedDisplay system having wide field of view
US4841292A (en)1986-08-111989-06-20Allied-Signal Inc.Third dimension pop up generation from a two-dimensional transformed image display
US4949108A (en)1986-08-181990-08-14Verret Jean MichelImage shooting method for recording visual spheres and device for implementing such method
US4858149A (en)1986-09-031989-08-15International Business Machines CorporationMethod and system for solid modelling
US4807158A (en)1986-09-301989-02-21Daleco/Ivex Partners, Ltd.Method and apparatus for sampling images to simulate movement within a multidimensional space
US5083389A (en)1988-07-151992-01-28Arthur AlperinPanoramic display device and method of making the same
US4899293A (en)1988-10-241990-02-06Honeywell Inc.Method of storage and retrieval of digital map data based upon a tessellated geoid system
JPH02127877A (en)1988-11-081990-05-16Casio Comput Co Ltd Electronic still camera with fisheye lens
US5067019A (en)1989-03-311991-11-19The United States Of America As Represented By The Administrator Of The National Aeronautics And Space AdministrationProgrammable remapper for image processing
US5068735A (en)1989-08-221991-11-26Fuji Photo Optical Co., Ltd.System for controlling the aiming direction, focus, zooming, and/or position of a television camera
US5175808A (en)1989-09-121992-12-29PixarMethod and apparatus for non-affine image warping
US5023725A (en)1989-10-231991-06-11Mccutchen DavidMethod and apparatus for dodecahedral imaging system
US4989084A (en)1989-11-241991-01-29Wetzel Donald CAirport runway monitoring system
US5130794A (en)1990-03-291992-07-14Ritchey Kurtis JPanoramic display system
US5040746A (en)1990-08-141991-08-20The United States Of America As Represented By The Secretary Of The ArmyFinned projectile with supplementary fins
US5200818A (en)1991-03-221993-04-06Inbal NetaVideo imaging system with interactive windowing capability
US5313306A (en)1991-05-131994-05-17Telerobotics International, Inc.Omniview motionless camera endoscopy system
US5764276A (en)1991-05-131998-06-09Interactive Pictures CorporationMethod and apparatus for providing perceived video viewing experiences using still images
US5359363A (en)1991-05-131994-10-25Telerobotics International, Inc.Omniview motionless camera surveillance system
US5384588A (en)1991-05-131995-01-24Telerobotics International, Inc.System for omindirectional image viewing at a remote location without the transmission of control signals to select viewing parameters
US6603502B2 (en)1991-05-132003-08-05Internet Pictures CorporationSystem for omnidirectional image viewing at a remote location without the transmission of control signals to select viewing parameters
US6201574B1 (en)1991-05-132001-03-13Interactive Pictures CorporationMotionless camera orientation system distortion correcting sensing element
EP0971540B1 (en)1991-05-132002-06-26Interactive Pictures CorporationOmniview motionless camera orientation system
US5903319A (en)1991-05-131999-05-11Interactive Pictures CorporationMethod for eliminating temporal and spacial distortion from interlaced video signals
US6301447B1 (en)1991-05-132001-10-09Interactive Pictures CorporationMethod and system for creation and interactive viewing of totally immersive stereoscopic images
US6256061B1 (en)1991-05-132001-07-03Interactive Pictures CorporationMethod and apparatus for providing perceived video viewing experiences using still images
USRE36207E (en)1991-05-131999-05-04Omniview, Inc.Omniview motionless camera orientation system
US5990941A (en)1991-05-131999-11-23Interactive Pictures CorporationMethod and apparatus for the interactive display of any portion of a spherical image
US6243131B1 (en)1991-05-132001-06-05Interactive Pictures CorporationMethod for directly scanning a rectilinear imaging element using a non-linear scan
US5185667A (en)1991-05-131993-02-09Telerobotics International, Inc.Omniview motionless camera orientation system
US5877801A (en)1991-05-131999-03-02Interactive Pictures CorporationSystem for omnidirectional image viewing at a remote location without the transmission of control signals to select viewing parameters
US5561756A (en)1992-05-081996-10-01Apple Computer, Inc.Textured sphere and spherical environment map rendering using texture map double indirection
US5446833A (en)1992-05-081995-08-29Apple Computer, Inc.Textured sphere and spherical environment map rendering using texture map double indirection
US5396583A (en)1992-10-131995-03-07Apple Computer, Inc.Cylindrical to planar image mapping using scanline coherence
US6252603B1 (en)1992-12-142001-06-26Ford OxaalProcesses for generating spherical image data sets and products made thereby
US6271853B1 (en)1992-12-142001-08-07Ford OxaalMethod for generating and interactively viewing spherical image data
US6731284B1 (en)1992-12-142004-05-04Ford OxaalMethod of and apparatus for performing perspective transformation of visible stimuli
US6157385A (en)1992-12-142000-12-05Oxaal; FordMethod of and apparatus for performing perspective transformation of visible stimuli
US6323862B1 (en)1992-12-142001-11-27Ford OxaalApparatus for generating and interactively viewing spherical image data and memory thereof
US5936630A (en)1992-12-141999-08-10Oxaal; FordMethod of and apparatus for performing perspective transformation of visible stimuli
US5684937A (en)1992-12-141997-11-04Oxaal; FordMethod and apparatus for performing perspective transformation on visible stimuli
US5444478A (en)1992-12-291995-08-22U.S. Philips CorporationImage processing method and device for constructing an image from adjacent images
US5395363A (en)1993-06-291995-03-07Utah Medical ProductsDiathermy coagulation and ablation apparatus and method
US6002430A (en)1994-01-311999-12-14Interactive Pictures CorporationMethod and apparatus for simultaneous capture of a spherical image
US6005611A (en)*1994-05-271999-12-21Be Here CorporationWide-angle image dewarping method and apparatus
WO1996008105A1 (en)1994-09-091996-03-14Motorola Inc.Method for creating image data
WO1996026610A1 (en)1995-02-231996-08-29Motorola Inc.Broadcasting plural wide angle images
US5657073A (en)1995-06-011997-08-12Panoramic Viewing Systems, Inc.Seamless multi-camera panoramic imaging with distortion correction and selectable field of view
US6795113B1 (en)1995-06-232004-09-21Ipix CorporationMethod and apparatus for the interactive display of any portion of a spherical image
US5903782A (en)1995-11-151999-05-11Oxaal; FordMethod and apparatus for producing a three-hundred and sixty degree spherical visual data set
US6118454A (en)1996-10-162000-09-12Oxaal; FordMethods and apparatuses for producing a spherical visual data set using a spherical mirror and one or more cameras with long lenses
US6243099B1 (en)1996-11-142001-06-05Ford OxaalMethod for interactive viewing full-surround image data and apparatus therefor
US6147709A (en)1997-04-072000-11-14Interactive Pictures CorporationMethod and apparatus for inserting a high resolution image into a low resolution interactive image to produce a realistic immersive experience
US6492985B1 (en)1999-07-062002-12-10Internet Pictures CorporationPresenting manipulating and serving immersive images
US6687387B1 (en)1999-12-272004-02-03Internet Pictures CorporationVelocity-dependent dewarping of images
US6788211B2 (en)2000-06-142004-09-07Edwards Systems Technology, Inc.Apparatus and method using smoke and/or gas sensing in cooking devices
FR2821156A1 (en)2001-02-162002-08-23Immervision Internat Pte Ltd METHOD AND DEVICE FOR OBTAINING A DIGITAL PANORAMIC IMAGE WITH CONSTANT TINT
FR2821167A1 (en)2001-02-162002-08-23Immervision Internat Pte Ltd PHOTOGRAPHIC SUPPORT DEVICE
WO2002067572A1 (en)2001-02-162002-08-29Immervision International Pte LtdMethod and device for collimating a digital panoramic image
FR2821172A1 (en)2001-02-162002-08-23Immervision Internat Pte Ltd METHOD AND DEVICE FOR ORIENTATION OF A DIGITAL PANORAMIC IMAGE
WO2002067048A2 (en)2001-02-162002-08-29Immervision International Pte LtdMethod and device for obtaining a constant-hue digital panoramic image
WO2002067016A2 (en)2001-02-162002-08-29Immervision International Pte LtdPhotographic camera support device
WO2002093908A2 (en)2001-05-112002-11-216115187 Canada Inc.Method for capturing and displaying a variable-resolution digital panoramic image
FR2826221A1 (en)2001-05-112002-12-20Immervision Internat Pte Ltd METHOD FOR OBTAINING AND DISPLAYING A VARIABLE RESOLUTION DIGITAL PANORAMIC IMAGE
FR2827680A1 (en)2001-07-202003-01-24Immervision Internat Pte LtdComputer screen display digital panoramic image having fish eye objective panorama image detector projected without reducing field forming non circular image/covering increase number pixels.
WO2003010599A1 (en)2001-07-202003-02-066115187 Canada Inc.Method for capturing a panoramic image using a rectangular image sensor

Non-Patent Citations (83)

* Cited by examiner, † Cited by third party
Title
"Declaration of Scott Gilbert in Support of Defendant Infinite Pictures" Memorandum in Opposition to Plaintiff's Motion For Preliminary Injunction, Omniview, Inc. v. Infinite Pictures, Inc., Civ. Action No. 3-96-849.
A. Paeth, "Digital Cartography For Computer Graphics," Graphics Gems, 1990, pp. 307-320.
ACM Computing Surveys, vol. 24, No. 4, Dec. 1992, A Survey of Image Registration Techniques, Lisa Gottesfeld Brown.
Annex to the communication-Opposition; EP0971540; dated Sep. 15, 2005.
Brief Communication-Opposition proceedings; EP0971540; dated Dec. 6, 2004.
Claims (clean); EP0971540; dated Aug. 26, 2005.
Claims (marked); EP0971540; dated Aug. 26, 2005.
Claims; EP0971540; dated Aug. 18, 2005.
Communication of a notice of opposition-first info of patent proprietor with annexes; EP0971540; dated Mar. 30, 2003.
Communication pursuant to Article 101(2) and Rule 58(1)-(4) EPC with annex; EP0971540; dated Aug. 24, 2004.
Communication pursuant to Article 101(2) and Rule 58(1)-(4) EPC with annex; EP0971540; dated Feb. 10, 2004.
Communication under Rule 51(4) EPC; EP0971540; dated Oct. 29, 2001.
Communications of the acm, "Interactive Technologies," Association For Computing Machinery, vol. 32, No. 7, Jul. 1989.
Complaint, Grandeye v. IPIX Corporation, Eastern District of Virginia, 2:05CV134, dated Mar. 4, 2005.
Computer Graphics World, DVI Video/Graphics, Douglas F. Dixon et al. 1987.
Data Sheets For Imaging Products, Defendant's Exhibit 202, pp. 40-63.
Data Sheets for Simplified Block Diagraph, Plantiff's Exhibit 409, pp. 41-77.
Data Sheets For TMC2301, TMC2302, Defendant's Exhibit 402, 1 sheet.
Declaration of Ernest L. Hall dated Jul. 16, 2003; Internet Pictures Corporation v. Ford Oxaal; Case No. 3:03-CV-317 in the Eastern District of Tennessee.
Declaration of Jake Richter dated Jul. 15, 2003; Internet Pictures Corporation v. Ford Oxaal; Case No. 3:03-CV-317 in the Eastern District of Tennessee.
Declaration of Leif Ford Oxaal dated Jul. 16, 2003; Internet Pictures Corporation v. Ford Oxaal, Case No. 3:03-CV-317 in the Eastern District of Tenessee.
Declaration of Paul E. Satterlee, Jr. dated Jul. 15, 2003; Internet Pictures v. Ford Oxaal; Case No. 3:03-CV-317 in the Eastern District of Tennessee.
Deposition of Gerald L. Greenberg taken Nov. 10, 2000 in Ford Oxaal v. Interactive Pictures Corp. et al., No. 99-CV-0802(LEK/DRH) (N.D.N.Y., filed May 20, 1999), including Exhibits Nos. 1-15.
Deposition of Jacquelyne E. Parker taken Nov. 8, 2000 in Ford Oxaal v. Interactive Pictures Corp. et al., No. 99-CV-0802(LEK/DRH) (N.D.N.Y., filed May 20, 1999), including Exhibits Nos. 1-11.
Deposition of Miles Johnson taken Nov. 7, 2000 in Ford Oxaal v. Interactive Pictures Corp. et al., No. 99-CV-0802(LEK/DRH) (N.D.N.Y., filed May 20, 1999), including Exhibits Nos. 1-8.
Deposition of Richard J. Felix taken Nov. 9, 2000 in Ford Oxaal v. Interactive Pictures Corp. et al., No. 99-CV-0802(LEK/DRH) (N.D.N.Y., filed May 20, 1999), including Exhibits Nos. 1-47.
Deposition of Steve Zimmermann taken Apr. 26, 2000 in Ford Oxaal v. Interactive Pictures Corp. et al., No. 99-CV-0802(LEK/DRH) (N.D.N.Y., filed May 20, 1999), transcript pp. 105-140 and Exhibit No. 37.
EPO search report with annexes, EP0971540; dated Nov. 23, 1999.
Examination report with annex; EP0971540; dated Mar. 23,2001.
Exhibit 5 of Expert Report of Dr. J. D. Birdwell in Ford Oxaal v. Interactive Pictures Corp., et al., No. 99-CV-0802(LEK/DRH) (N.D.N.Y., filed May 20, 1999) ("Examples and Photographs of Panoramic Cameras," Jul. 14, 1999).
F. Kenton Musgrave, "A Panoramic Virtual Screen For Ray Tracing," Graphics Gems, 1992, pp. 288-294.
F. Pearson II, "Map Projections Theory and Applications," CRC Press, Inc., 1990, pp. 215-345.
File Wrapper for U.S. Pat. No. 6,252,603.
G. David Ripley, "DVI-A Digital Multimedia Technology," Communications of the ACM Jul. 1989, vol. 32, No. 7, pp. 811-822.
G. Wolberg, "Digital Image Warping," IEEE Computer Society Press, 1988.
Heckbert, "The PMAT and Poly User's Manual," NYIT Document, 1983.
Heckbert, Fundamentals of Texture Mapping and Image Warping, Report No. UCB/CSD 89/516, Jun. 1989.
Intel Corporation, "Action Media 750 Production Tool Reference," 1998, 1991.
IPIX's Supplemental Responses To Oxaal's Interrogatories Nos. 1, 6, 7, 9, 11, 12 and 36, in Ford Oxaal v. Interactive Pictures Corp., et al., No. 99-CV-0802(LEK/DRH) (N.D.N.Y., filed May 20, 1999).
J. Blinn et al., "Texture and Reflection in Computer Generated Images," Comm. ACM, vol. 19, No. 10, 1976, pp. 542-547.
J.D. Foley et al., "Computer Graphics: Principles and Practice," 1990, 1996, pp. 229-381.
Leonardo, vol. 16, No. 1, pp. 1-9, 1983, Flat-Sphere Perspective, Fernando R. Casas.
Leonardo, vol. 25, No. 3 and 4, 1992, New Representative Methods for Real and Imaginary Environments, Emilio Frisia, pp. 369-376.
Letter regarding the Opposition procedure (no time limit) with annex; EP0971540; dated Jan. 16, 2004.
Letter regarding the Opposition procedure (no time limit) with annexes; EP0971540; dated Jun. 23, 2004.
Letter regarding the Opposition procedure (no time limit) with translation; EP0971540; dated Mar. 30, 2004.
Letter regarding the Opposition procedure (no time limit); EP0971540; dated Aug. 18, 2005.
Letter regarding the Opposition procedure (no time limit); EP0971540; dated Aug. 26, 2005.
Letter regarding the Opposition procedure (no time limit); EP0971540; dated Jul. 2, 2004.
M. Onoe et al., "Digital Processing of Images Taken by Fish-Eye Lens," IEEE: Proceedings, New York, 1982, vol. 1, p. 105-8.
Matter concerning the application; EP0971540; dated Jan. 18, 2000.
Memorandum in Support of Minds-Eye-View, Inc. and Ford Oxaal's Opposition to IPIX's Motion for Preliminary Injunction (public version), Internet Pictures Corporation v. Ford Oxaal and Minds-Eye-View, Inc., Eastern District of Tennessee, 3:03CV317, dated Jul. 18, 2003.
Mohammad Ehtashami, SPNG J. OH, and Ernest L. Hall, Omnidirectional Position Location for Mobile Robots, Center for Robotics Research, Cincinnati, Ohio, 1984.
N. Greene et al., "Creating Raster Omnimax Images From Multiple Perspective Views Using the Elliptical Weighted Average Filter," IEEE Computer Graphics and Applications, Jun. 1986, pp. 21-27.
N. Greene, "A Method of Modeling Sky For Computer Animations," Proc. First Int'l Conf. Engineering and Computer Graphics, Aug. 1984, pp. 297-300.
N. Greene, "Environmental Mapping and other Applications of World Projections," IEEE Computer Graphics and Applications, Nov. 1986, pp. 21-29.
Nicolas Alvertos, E. L. Hall, and R. L. Anderson; Omnidirectional Viewing: The Fish-Eye Problem, 1983.
Printout from http://ofi.epoline.org/view/GetDossier, dated May 3, 2005, listing EPO docket for EP0971540 (additional documents can be downloaded and viewed from this website).
Printout from http://ofi.epoline.org/view/GetDossier, dated Oct. 10, 2005, listing EPO docket for EP0971540 (additional documents can be downloaded and viewed from this website).
Production document No. F 000070 in Ford Oxaal v. Interactive Pictures Corp. et al., No. 99-CV-0802(LEK/DRH) (N.D.N.Y., filed May 20, 1999).
Production document No. OX 002846 in Ford Oxaal v. Interactive Pictures Corp. et al., No. 99-CV-0802(LEK/DRH) (N.D.N.Y., filed May 20, 1999) ("Lovers Leap" VHS videotape sleeve).
Production documents Nos. I 053110-I 053134 in Ford Oxaal v. Interactive Pictures Corp. et al., No. 99-CV-0802(LEK/DRH) (N.D.N.Y., filed May 20, 1999) (documents from prosecution history of Oxaal U.S. Patent No. 5,903,782).
Production documents Nos. OX 003774-3774A; OX 003843-3854; and OX 003887 in Ford Oxaal v. Interactive Pictures Corp. et al., No. 99-CV-0802(LEK/DRH) (N.D.N.Y., filed May 20, 1999) (Artintact Artists' Interactive CD-ROMagazine).
Production documents Nos. OX 1480-OX 001516 in Ford Oxaal v. Interactive Pictures Corp. et al., No. 99-CV-0802(LEK/DRH) (N.D.N.Y., filed May 20, 1999) (Apple Quicktime VR related documents).
Production documents Nos. Z 000055-Z 000110 in Ford Oxaal v. Interactive Pictures Corp., et al., No. 99-CV-0802(LEK/DRH (N.D.N.Y., filed May 20, 1999).
Production No. F 000806 in Ford Oxaal v. Interactive Pictures Corp. et al., No. 99-CV-0802(LEK/DRH) (N.D.N.Y., filed May 20, 1999) (Omnigraph demonstration video referenced in Production document No. F 000070).
Prosecution history for U.S. Reissue Patent No. Re 36,207.
R. Kingslake, "Optical System Design," Academic Press, 1983, pp. 86-87.
R. L. Anderson, N Alvertos, and E. L. Hall; Omnidirectional real time imaging using digital restoration, SPIE vol. 348 High Speed Photograph, San Diego, 1982.
Reply to an examination report in opposition proceedings (comm. Art. 101(2) and Rule 58(1) to (4) EPC); EP0971540; dated Mar. 1, 2005.
Reply to the communication under Rule 51(6) EPC-Filing of the translations of the claims with annexes; EP0971540; dated Mar. 6, 2002.
Request for accelerated examination; EP0971540; dated Aug. 2, 2001.
Request for the Umschreibstelle; EP0971540; dated Apr. 12, 2002.
S. Morris, "Digital Video Interactive-A New Integrated Format For Multi-Media Information," Microcomputer For Information Management, Dec. 1987, 4(4):249-261.
S. Ray, "The Lens in Action," Hastings House, 1976, pp. 114-117.
Science & Technology, Mar. 6, 1995, pp. 54-55, NASA's Tiny Camera Has A Wide-Angle Future, Larry Armstrong and Larry Holyoke.
SPIE vol. 1668 Visual Data Interpretation (1992), Image based panoramic virtual reality system, Kurtis J. Ritchey.
SPIE vol. 348 High Speed Photography (San Diego 1982), Omnidirectional real time imaging using digital restoration, R. L. Anderson, N. Alvertos, and E. L. Hall.
Summons to attend oral proceedings pursuant to Rule 71(1) EPC with annex; EP0971540; dated Apr. 29, 2005.
Two (2) Japanese prior art articles authorized by Dr. Morio Kuno (1980).
Video Tape-IPIX v. Infinite Pictures, Ref. No. 01096.58462, Exhibit Nos. 216 & 217.
Withdrawal of an opposition; EP0971540; dated Nov. 17, 2004.
Zuo L Cao, Sung J Oh, Ernest L Hall, Dynamic omnidirectional vision for mobile rebots, SPIE vol. 579 Intelligent Robots and Computer Vision, 1985.

Cited By (175)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US7714936B1 (en)*1991-05-132010-05-11Sony CorporationOmniview motionless camera orientation system
US20110007129A1 (en)*1991-05-132011-01-13Sony CorporationOmniview motionless camera orientation system
US9849593B2 (en)2002-07-252017-12-26Intouch Technologies, Inc.Medical tele-robotic system with a master remote station with an arbitrator
US10315312B2 (en)2002-07-252019-06-11Intouch Technologies, Inc.Medical tele-robotic system with a master remote station with an arbitrator
US8515577B2 (en)2002-07-252013-08-20Yulun WangMedical tele-robotic system with a master remote station with an arbitrator
USRE45870E1 (en)2002-07-252016-01-26Intouch Technologies, Inc.Apparatus and method for patient rounding with a remote controlled robot
US8209051B2 (en)2002-07-252012-06-26Intouch Technologies, Inc.Medical tele-robotic system
US9296107B2 (en)2003-12-092016-03-29Intouch Technologies, Inc.Protocol for a remotely controlled videoconferencing robot
US9375843B2 (en)2003-12-092016-06-28Intouch Technologies, Inc.Protocol for a remotely controlled videoconferencing robot
US9956690B2 (en)2003-12-092018-05-01Intouch Technologies, Inc.Protocol for a remotely controlled videoconferencing robot
US10882190B2 (en)2003-12-092021-01-05Teladoc Health, Inc.Protocol for a remotely controlled videoconferencing robot
US8401275B2 (en)2004-07-132013-03-19Intouch Technologies, Inc.Mobile robot with a head-based movement mapping scheme
US9766624B2 (en)2004-07-132017-09-19Intouch Technologies, Inc.Mobile robot with a head-based movement mapping scheme
US8983174B2 (en)2004-07-132015-03-17Intouch Technologies, Inc.Mobile robot with a head-based movement mapping scheme
US10241507B2 (en)2004-07-132019-03-26Intouch Technologies, Inc.Mobile robot with a head-based movement mapping scheme
US20060043264A1 (en)*2004-09-022006-03-02Casio Computer Co., Ltd.Imaging apparatus, image processing method for imaging apparatus, recording medium and carrier wave signal
US7643701B2 (en)*2004-09-022010-01-05Casio Computer Co., Ltd.Imaging apparatus for correcting a distortion of an image
US10259119B2 (en)2005-09-302019-04-16Intouch Technologies, Inc.Multi-camera mobile teleconferencing platform
US9198728B2 (en)2005-09-302015-12-01Intouch Technologies, Inc.Multi-camera mobile teleconferencing platform
US8160394B2 (en)2006-05-112012-04-17Intergraph Software Technologies, CompanyReal-time capture and transformation of hemispherical video images to images in rectilinear coordinates
US20070263093A1 (en)*2006-05-112007-11-15Acree Elaine SReal-time capture and transformation of hemispherical video images to images in rectilinear coordinates
US8849679B2 (en)2006-06-152014-09-30Intouch Technologies, Inc.Remote controlled robot system that provides medical images
US8892260B2 (en)2007-03-202014-11-18Irobot CorporationMobile robot for telecommunication
US9296109B2 (en)2007-03-202016-03-29Irobot CorporationMobile robot for telecommunication
US9160783B2 (en)2007-05-092015-10-13Intouch Technologies, Inc.Robot system that operates through a network firewall
US10682763B2 (en)2007-05-092020-06-16Intouch Technologies, Inc.Robot system that operates through a network firewall
US7660054B2 (en)*2007-06-292010-02-09Intel CorporationThermally controlled sold immersion lens fixture
US20090002855A1 (en)*2007-06-292009-01-01Cameron WagnerThermally controlled solid immersion lens fixture
TWI401468B (en)*2007-06-292013-07-11Intel Corp Thermally controlled solid immersion lens device
US11787060B2 (en)2008-03-202023-10-17Teladoc Health, Inc.Remote presence system mounted to operating room hardware
US10875182B2 (en)2008-03-202020-12-29Teladoc Health, Inc.Remote presence system mounted to operating room hardware
US10471588B2 (en)2008-04-142019-11-12Intouch Technologies, Inc.Robotic based health care system
US11472021B2 (en)2008-04-142022-10-18Teladoc Health, Inc.Robotic based health care system
US8861750B2 (en)2008-04-172014-10-14Intouch Technologies, Inc.Mobile tele-presence system with a microphone system
US10092169B2 (en)2008-07-082018-10-09Karl Storz Imaging, Inc.Solid state variable direction of view endoscope
US20100010301A1 (en)*2008-07-082010-01-14Hale Eric LSolid State Variable Direction of View Endoscope
US8814782B2 (en)2008-07-082014-08-26Karl Storz Imaging, Inc.Solid state variable direction of view endoscope
US8771177B2 (en)2008-07-082014-07-08Karl Storz Imaging, Inc.Wide angle flexible endoscope
US8758234B2 (en)2008-07-082014-06-24Karl Storz Imaging, Inc.Solid state variable direction of view endoscope
US8992423B2 (en)2008-07-082015-03-31Karl Storz Imaging, Inc.Solid state variable direction of view endoscope
US10493631B2 (en)2008-07-102019-12-03Intouch Technologies, Inc.Docking system for a tele-presence robot
US9193065B2 (en)2008-07-102015-11-24Intouch Technologies, Inc.Docking system for a tele-presence robot
US10878960B2 (en)2008-07-112020-12-29Teladoc Health, Inc.Tele-presence robot system with multi-cast features
US9842192B2 (en)2008-07-112017-12-12Intouch Technologies, Inc.Tele-presence robot system with multi-cast features
US20100013906A1 (en)*2008-07-172010-01-21Border John NZoom by multiple image capture
US8134589B2 (en)*2008-07-172012-03-13Eastman Kodak CompanyZoom by multiple image capture
US9429934B2 (en)2008-09-182016-08-30Intouch Technologies, Inc.Mobile videoconferencing robot system with network adaptive driving
US8340819B2 (en)2008-09-182012-12-25Intouch Technologies, Inc.Mobile videoconferencing robot system with network adaptive driving
US8996165B2 (en)2008-10-212015-03-31Intouch Technologies, Inc.Telepresence robot with a camera boom
US10875183B2 (en)2008-11-252020-12-29Teladoc Health, Inc.Server connectivity control for tele-presence robot
US10059000B2 (en)2008-11-252018-08-28Intouch Technologies, Inc.Server connectivity control for a tele-presence robot
US12138808B2 (en)2008-11-252024-11-12Teladoc Health, Inc.Server connectivity control for tele-presence robots
US9138891B2 (en)2008-11-252015-09-22Intouch Technologies, Inc.Server connectivity control for tele-presence robot
US8849680B2 (en)2009-01-292014-09-30Intouch Technologies, Inc.Documentation through a remote presence robot
US10969766B2 (en)2009-04-172021-04-06Teladoc Health, Inc.Tele-presence robot system with software modularity, projector and laser pointer
US8897920B2 (en)2009-04-172014-11-25Intouch Technologies, Inc.Tele-presence robot system with software modularity, projector and laser pointer
US10911715B2 (en)2009-08-262021-02-02Teladoc Health, Inc.Portable remote presence robot
US11399153B2 (en)2009-08-262022-07-26Teladoc Health, Inc.Portable telepresence apparatus
US9602765B2 (en)2009-08-262017-03-21Intouch Technologies, Inc.Portable remote presence robot
US8384755B2 (en)2009-08-262013-02-26Intouch Technologies, Inc.Portable remote presence robot
US10404939B2 (en)2009-08-262019-09-03Intouch Technologies, Inc.Portable remote presence robot
US20110134245A1 (en)*2009-12-072011-06-09Irvine Sensors CorporationCompact intelligent surveillance system comprising intent recognition
US11154981B2 (en)2010-02-042021-10-26Teladoc Health, Inc.Robot user interface for telepresence robot system
US9089972B2 (en)2010-03-042015-07-28Intouch Technologies, Inc.Remote presence system including a cart that supports a robot face and an overhead camera
US10887545B2 (en)2010-03-042021-01-05Teladoc Health, Inc.Remote presence system including a cart that supports a robot face and an overhead camera
US8670017B2 (en)2010-03-042014-03-11Intouch Technologies, Inc.Remote presence system including a cart that supports a robot face and an overhead camera
US11798683B2 (en)2010-03-042023-10-24Teladoc Health, Inc.Remote presence system including a cart that supports a robot face and an overhead camera
US9014848B2 (en)2010-05-202015-04-21Irobot CorporationMobile robot system
US9902069B2 (en)2010-05-202018-02-27Irobot CorporationMobile robot system
US8935005B2 (en)2010-05-202015-01-13Irobot CorporationOperating a mobile robot
US9498886B2 (en)2010-05-202016-11-22Irobot CorporationMobile human interface robot
US10343283B2 (en)2010-05-242019-07-09Intouch Technologies, Inc.Telepresence robot system that can be accessed by a cellular phone
US11389962B2 (en)2010-05-242022-07-19Teladoc Health, Inc.Telepresence robot system that can be accessed by a cellular phone
US10808882B2 (en)2010-05-262020-10-20Intouch Technologies, Inc.Tele-robotic system with a robot face placed on a chair
US9264664B2 (en)2010-12-032016-02-16Intouch Technologies, Inc.Systems and methods for dynamic bandwidth allocation
US10218748B2 (en)2010-12-032019-02-26Intouch Technologies, Inc.Systems and methods for dynamic bandwidth allocation
US8930019B2 (en)2010-12-302015-01-06Irobot CorporationMobile human interface robot
US12093036B2 (en)2011-01-212024-09-17Teladoc Health, Inc.Telerobotic system with a dual application screen presentation
US11468983B2 (en)2011-01-282022-10-11Teladoc Health, Inc.Time-dependent navigation of telepresence robots
US10399223B2 (en)2011-01-282019-09-03Intouch Technologies, Inc.Interfacing with a mobile telepresence robot
US9469030B2 (en)2011-01-282016-10-18Intouch TechnologiesInterfacing with a mobile telepresence robot
US8965579B2 (en)2011-01-282015-02-24Intouch TechnologiesInterfacing with a mobile telepresence robot
US11289192B2 (en)2011-01-282022-03-29Intouch Technologies, Inc.Interfacing with a mobile telepresence robot
US8718837B2 (en)2011-01-282014-05-06Intouch TechnologiesInterfacing with a mobile telepresence robot
US9323250B2 (en)2011-01-282016-04-26Intouch Technologies, Inc.Time-dependent navigation of telepresence robots
US9785149B2 (en)2011-01-282017-10-10Intouch Technologies, Inc.Time-dependent navigation of telepresence robots
US10591921B2 (en)2011-01-282020-03-17Intouch Technologies, Inc.Time-dependent navigation of telepresence robots
US9930225B2 (en)2011-02-102018-03-27Villmer LlcOmni-directional camera and related viewing software
US12224059B2 (en)2011-02-162025-02-11Teladoc Health, Inc.Systems and methods for network-based counseling
CN102694968B (en)*2011-03-252016-03-30中山市云创知识产权服务有限公司Camera device and environment monitoring method thereof
CN102694968A (en)*2011-03-252012-09-26鸿富锦精密工业(深圳)有限公司 Camera device and surrounding view monitoring method thereof
US10769739B2 (en)2011-04-252020-09-08Intouch Technologies, Inc.Systems and methods for management of information among medical providers and facilities
US9974612B2 (en)2011-05-192018-05-22Intouch Technologies, Inc.Enhanced diagnostics for a telepresence robot
US10788652B2 (en)2011-08-312020-09-29Ricoh Company, Ltd.Imaging optical system, imaging device and imaging system
US9110273B2 (en)*2011-08-312015-08-18Ricoh Company, Ltd.Imaging optical system, imaging device and imaging system
US9739983B2 (en)2011-08-312017-08-22Ricoh Company, Ltd.Imaging optical system, imaging device and imaging system
US20130050408A1 (en)*2011-08-312013-02-28Kensuke MasudaImaging optical system, imaging device and imaging system
US10295797B2 (en)2011-08-312019-05-21Ricoh Company, Ltd.Imaging optical system, imaging device and imaging system
US10284776B2 (en)2011-11-072019-05-07Sony Interactive Entertainment Inc.Image generation apparatus and image generation method
US9894272B2 (en)2011-11-072018-02-13Sony Interactive Entertainment Inc.Image generation apparatus and image generation method
US9729788B2 (en)2011-11-072017-08-08Sony CorporationImage generation apparatus and image generation method
US9560274B2 (en)2011-11-072017-01-31Sony CorporationImage generation apparatus and image generation method
EP2779620A4 (en)*2011-11-072015-06-24Sony Computer Entertainment Inc IMAGE GENERATING DEVICE AND IMAGE GENERATING METHOD
US8836751B2 (en)2011-11-082014-09-16Intouch Technologies, Inc.Tele-presence system with a user interface that displays different communication links
US9715337B2 (en)2011-11-082017-07-25Intouch Technologies, Inc.Tele-presence system with a user interface that displays different communication links
US10331323B2 (en)2011-11-082019-06-25Intouch Technologies, Inc.Tele-presence system with a user interface that displays different communication links
US10762170B2 (en)2012-04-112020-09-01Intouch Technologies, Inc.Systems and methods for visualizing patient and telepresence device statistics in a healthcare network
US11205510B2 (en)2012-04-112021-12-21Teladoc Health, Inc.Systems and methods for visualizing and managing telepresence devices in healthcare networks
US8902278B2 (en)2012-04-112014-12-02Intouch Technologies, Inc.Systems and methods for visualizing and managing telepresence devices in healthcare networks
US9251313B2 (en)2012-04-112016-02-02Intouch Technologies, Inc.Systems and methods for visualizing and managing telepresence devices in healthcare networks
US10892052B2 (en)2012-05-222021-01-12Intouch Technologies, Inc.Graphical user interfaces including touchpad driving interfaces for telemedicine devices
US10061896B2 (en)2012-05-222018-08-28Intouch Technologies, Inc.Graphical user interfaces including touchpad driving interfaces for telemedicine devices
US11453126B2 (en)2012-05-222022-09-27Teladoc Health, Inc.Clinical workflows utilizing autonomous and semi-autonomous telemedicine devices
US10328576B2 (en)2012-05-222019-06-25Intouch Technologies, Inc.Social behavior rules for a medical telepresence robot
US11515049B2 (en)2012-05-222022-11-29Teladoc Health, Inc.Graphical user interfaces including touchpad driving interfaces for telemedicine devices
US10603792B2 (en)2012-05-222020-03-31Intouch Technologies, Inc.Clinical workflows utilizing autonomous and semiautonomous telemedicine devices
US9174342B2 (en)2012-05-222015-11-03Intouch Technologies, Inc.Social behavior rules for a medical telepresence robot
US9776327B2 (en)2012-05-222017-10-03Intouch Technologies, Inc.Social behavior rules for a medical telepresence robot
US11628571B2 (en)2012-05-222023-04-18Teladoc Health, Inc.Social behavior rules for a medical telepresence robot
US9361021B2 (en)2012-05-222016-06-07Irobot CorporationGraphical user interfaces including touchpad driving interfaces for telemedicine devices
US10780582B2 (en)2012-05-222020-09-22Intouch Technologies, Inc.Social behavior rules for a medical telepresence robot
US10658083B2 (en)2012-05-222020-05-19Intouch Technologies, Inc.Graphical user interfaces including touchpad driving interfaces for telemedicine devices
US9763563B2 (en)2012-07-112017-09-19Karl Storz Imaging, Inc.Endoscopic camera single-button mode activation
US20150015766A1 (en)*2012-09-112015-01-15Hiroyuki SatohImage capture system and imaging optical system
US10816778B2 (en)2012-09-112020-10-27Ricoh Company, Ltd.Image capture system and imaging optical system
US9013544B2 (en)*2012-09-112015-04-21Ricoh Company, Ltd.Image capture system and imaging optical system
US20140071226A1 (en)*2012-09-112014-03-13Hiroyuki SatohImage capture system and imaging optical system
US9798117B2 (en)2012-09-112017-10-24Ricoh Company, Ltd.Image capture system and imaging optical system
US10151905B2 (en)2012-09-112018-12-11Ricoh Company, Ltd.Image capture system and imaging optical system
US9413955B2 (en)*2012-09-112016-08-09Ricoh Company, Ltd.Image capture system and imaging optical system
US20140092017A1 (en)*2012-09-282014-04-03National Taiwan Normal UniversityInteractive simulated-globe display system
US8982049B2 (en)*2012-09-282015-03-17National Taiwan Normal UniversityInteractive simulated-globe display system
US9408527B2 (en)2012-11-012016-08-09Karl Storz Imaging, Inc.Solid state variable direction of view endoscope with rotatable wide-angle field for maximal image performance
EP2727513A1 (en)2012-11-012014-05-07Karl Storz Imaging Inc.Solid state variable direction of view endoscope with rotatable wide-angle field for maximal image performance
US8902322B2 (en)2012-11-092014-12-02Bubl Technology Inc.Systems and methods for generating spherical images
US10334205B2 (en)2012-11-262019-06-25Intouch Technologies, Inc.Enhanced video interaction for a user interface of a telepresence network
US9098611B2 (en)2012-11-262015-08-04Intouch Technologies, Inc.Enhanced video interaction for a user interface of a telepresence network
US11910128B2 (en)2012-11-262024-02-20Teladoc Health, Inc.Enhanced video interaction for a user interface of a telepresence network
US10924708B2 (en)2012-11-262021-02-16Teladoc Health, Inc.Enhanced video interaction for a user interface of a telepresence network
CN105191280A (en)*2012-12-132015-12-23微软技术许可有限责任公司Displacing image on imager in multi-lens cameras
WO2014093902A1 (en)*2012-12-132014-06-19Microsoft CorporationDisplacing image on imager in multi-lens cameras
US9094540B2 (en)2012-12-132015-07-28Microsoft Technology Licensing, LlcDisplacing image on imager in multi-lens cameras
US9529824B2 (en)*2013-06-052016-12-27Digitalglobe, Inc.System and method for multi resolution and multi temporal image search
US10096085B2 (en)2013-12-092018-10-09Cj Cgv Co., Ltd.Method for generating images for multi-projection theater and image management apparatus using the same
WO2015088187A1 (en)*2013-12-092015-06-18Cj Cgv Co., Ltd.Method for generating images for multi-projection theater and image management apparatus using the same
KR20150066930A (en)*2013-12-09CJ CGV Co., Ltd.Method for generating images of multi-projection theater and image management apparatus using the same
US9508159B2 (en)*2014-02-142016-11-29Kakao Corp.Image database constructing method and device using the same
US20150235383A1 (en)*2014-02-142015-08-20Daum Communications Corp.Image Database Constructing Method and Device Using the Same
US9582731B1 (en)*2014-04-152017-02-28Google Inc.Detecting spherical images
US9883101B1 (en)*2014-07-232018-01-30Hoyos Integrity CorporationProviding a real-time via a wireless communication channel associated with a panoramic video capture device
EP4407373A3 (en)*2015-03-182024-10-30GoPro, Inc.Unibody dual-lens mount for a spherical camera
RU2579004C1 (en)*2015-05-052016-03-27Вячеслав Михайлович СмелковDevice for computer system for panoramic television surveillance with implementation of exchange of image parameters
US10650487B2 (en)*2015-08-122020-05-12Gopro, Inc.Equatorial stitching of hemispherical images in a spherical image capture system
US11195253B2 (en)2015-08-122021-12-07Gopro, Inc.Equatorial stitching of hemispherical images in a spherical image capture system
US20190385274A1 (en)*2015-08-122019-12-19Gopro, Inc.Equatorial stitching of hemispherical images in a spherical image capture system
US11631155B2 (en)2015-08-122023-04-18Gopro, Inc.Equatorial stitching of hemispherical images in a spherical image capture system
WO2017120379A1 (en)*2016-01-062017-07-13360fly, Inc.Modular panoramic camera systems
US20170270633A1 (en)*2016-03-152017-09-21Microsoft Technology Licensing, LlcBowtie view representing a 360-degree image
US10444955B2 (en)2016-03-152019-10-15Microsoft Technology Licensing, LlcSelectable interaction elements in a video stream
US10204397B2 (en)*2016-03-152019-02-12Microsoft Technology Licensing, LlcBowtie view representing a 360-degree image
US10002406B2 (en)2016-10-032018-06-19Samsung Electronics Co., Ltd.Consistent spherical photo and video orientation correction
US10999602B2 (en)2016-12-232021-05-04Apple Inc.Sphere projected motion estimation/compensation and mode decision
US11818394B2 (en)2016-12-232023-11-14Apple Inc.Sphere projected motion estimation/compensation and mode decision
US11259046B2 (en)2017-02-152022-02-22Apple Inc.Processing of equirectangular object data to compensate for distortion by spherical projections
US10924747B2 (en)2017-02-272021-02-16Apple Inc.Video coding techniques for multi-view video
US11862302B2 (en)2017-04-242024-01-02Teladoc Health, Inc.Automated transcription and documentation of tele-health encounters
US11093752B2 (en)2017-06-022021-08-17Apple Inc.Object tracking in multi-view video
US10754242B2 (en)2017-06-302020-08-25Apple Inc.Adaptive resolution and projection format in multi-direction video
US20190005709A1 (en)*2017-06-302019-01-03Apple Inc.Techniques for Correction of Visual Artifacts in Multi-View Images
US11742094B2 (en)2017-07-252023-08-29Teladoc Health, Inc.Modular telehealth cart with thermal imaging and touch screen user interface
US11636944B2 (en)2017-08-252023-04-25Teladoc Health, Inc.Connectivity infrastructure for a telehealth platform
US10593014B2 (en)*2018-03-262020-03-17Ricoh Company, Ltd.Image processing apparatus, image processing system, image capturing system, image processing method
US11389064B2 (en)2018-04-272022-07-19Teladoc Health, Inc.Telehealth cart that supports a removable tablet with seamless audio/video switching
US10769471B2 (en)*2018-10-032020-09-08Karl Storz Se & Co. KgSystem and method for holding an image display apparatus
US11252328B2 (en)*2019-02-082022-02-15Canon Kabushiki KaishaElectronic device and method for controlling the same

Also Published As

Publication number | Publication date
US20090040291A1 (en) | 2009-02-12

Similar Documents

Publication | Publication Date | Title
US7382399B1 (en)Omniview motionless camera orientation system
US6002430A (en)Method and apparatus for simultaneous capture of a spherical image
JP3290993B2 (en) Method and apparatus for creating a spherical image
EP0971540B1 (en)Omniview motionless camera orientation system
EP0610863B1 (en)Omniview motionless camera surveillance system
US7714936B1 (en)Omniview motionless camera orientation system
US6201574B1 (en)Motionless camera orientation system distortion correcting sensing element
US6977676B1 (en)Camera control system
US7583858B2 (en)Image processing based on direction of gravity
US5508734A (en)Method and apparatus for hemispheric imaging which emphasizes peripheral content
US7176960B1 (en)System and methods for generating spherical mosaic images
US6215519B1 (en)Combined wide angle and narrow angle imaging system and method for surveillance and monitoring
CA2639527C (en)Security camera system and method of steering beams to alter a field of view
JP3463612B2 (en) Image input method, image input device, and recording medium
US20060078215A1 (en)Image processing based on direction of gravity
JP2002503893A (en) Virtual reality camera
EP0735745B1 (en)Visual information processing method and apparatus
JP2001333303A (en)Omnidirectional vision system
CN100562102C (en) Method and system for capturing wide field of view images and regions of interest therein
KR20060094957A (en) Method and system for capturing wide image and its region of interest
JPH1141509A (en)Image pickup device
JPH1118007A (en)Omnidirectional image display system
JPH03217978A (en)Picture display device
JPH06105232A (en) Image synthesizer
Edwards360° camera systems for surveillance and security
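
For context, the core documents in this family (the "Omniview motionless camera orientation system" patents and the related hemispheric-imaging and distortion-correction references above) describe selecting a pan/tilt/zoom view out of a fixed wide-angle image by trigonometric pixel remapping rather than by physically steering the camera. The sketch below is a rough, generic illustration of that kind of dewarping only: it assumes an equidistant fisheye lens model and hypothetical names (dewarp_view, f_fish), and it is not the formulation claimed in any of the patents listed.

import numpy as np

def dewarp_view(fisheye, pan, tilt, fov_deg=60.0, out_size=(480, 640)):
    # Map an equidistant fisheye frame to a flat perspective view at (pan, tilt).
    # Angles are in radians; the circular fisheye image is assumed centered and
    # to span 180 degrees across its shorter dimension. Illustrative sketch only.
    h_out, w_out = out_size
    h_in, w_in = fisheye.shape[:2]
    cx, cy = w_in / 2.0, h_in / 2.0
    f_fish = min(cx, cy) / (np.pi / 2.0)  # pixels per radian of field angle

    # Ray direction for every output pixel of a pinhole view looking along +z.
    focal = (w_out / 2.0) / np.tan(np.radians(fov_deg) / 2.0)
    xs, ys = np.meshgrid(np.arange(w_out) - w_out / 2.0,
                         np.arange(h_out) - h_out / 2.0)
    rays = np.stack([xs, ys, np.full_like(xs, focal)], axis=-1)
    rays /= np.linalg.norm(rays, axis=-1, keepdims=True)

    # Rotate the view direction: tilt about the x axis, then pan about the y axis.
    ct, st = np.cos(tilt), np.sin(tilt)
    cp, sp = np.cos(pan), np.sin(pan)
    rot_x = np.array([[1.0, 0.0, 0.0], [0.0, ct, -st], [0.0, st, ct]])
    rot_y = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    rays = rays @ (rot_y @ rot_x).T

    # Project the rotated rays back onto the fisheye image (r = f * theta).
    theta = np.arccos(np.clip(rays[..., 2], -1.0, 1.0))  # angle off the optical axis
    phi = np.arctan2(rays[..., 1], rays[..., 0])
    u = (cx + f_fish * theta * np.cos(phi)).astype(int)
    v = (cy + f_fish * theta * np.sin(phi)).astype(int)

    valid = (u >= 0) & (u < w_in) & (v >= 0) & (v < h_in)
    out = np.zeros((h_out, w_out) + fisheye.shape[2:], dtype=fisheye.dtype)
    out[valid] = fisheye[v[valid], u[valid]]
    return out

A caller would pass the captured hemispherical frame and the desired view angles in radians, e.g. view = dewarp_view(frame, pan=0.5, tilt=0.2); repeating this per frame with new angles gives the electronically steered view the listed documents describe, without any moving parts.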

Legal Events

Date | Code | Title | Description
AS | Assignment

Owner name:IMAGE INVESTOR PORTFOLIO, A SEPARATE SERIES OF MEM

Free format text:INTELLECTUAL PROPERTY SECURITY AGREEMENT;ASSIGNOR:INTERNET PICTURES CORPORATION;REEL/FRAME:011828/0054

Effective date:20010514

Owner name:IMAGE INVESTOR PORTFOLIO, A SEPARATE SERIES OF MEM

Free format text:INTELLECTUAL PROPERTY SECURITY AGREEMENT;ASSIGNOR:PW TECHNOLOGY, INC.;REEL/FRAME:011828/0088

Effective date:20010514

Owner name:IMAGE INVESTOR PORFOLIO, A SEPARATE SERIES OF MEMP

Free format text:INTELLECTUAL PROPERTY SECURITY AGREEMENT;ASSIGNOR:INTERACTIVE PICTURES CORPORATION;REEL/FRAME:011837/0431

Effective date:20010514

AS | Assignment

Owner name:PW TECHNOLOGY, INC., CALIFORNIA

Free format text:RELEASE;ASSIGNOR:IMAGE INVESTOR PORTFOLIO, A SEPARATE SERIES OF MEMPHIS ANGELS, LLC;REEL/FRAME:012295/0978

Effective date:20010926

Owner name:INTERACTIVE PICTURES CORPORATION, TENNESSEE

Free format text:RELEASE;ASSIGNOR:IMAGE INVESTOR PORTFOLIO, A SEPARATE SERIES OF MEMPHIS ANGELS, LLC;REEL/FRAME:012295/0982

Effective date:20010926

Owner name:INTERNET PICTURES CORPORATION, TENNESSEE

Free format text:RELEASE;ASSIGNOR:IMAGE INVESTOR PORTFOLIO, A SEPARATE SERIES OF MEMPHIS ANGELS, LLC;REEL/FRAME:012295/0986

Effective date:20010926

Owner name:PW TECHNOLOGY, INC.,CALIFORNIA

Free format text:RELEASE;ASSIGNOR:IMAGE INVESTOR PORTFOLIO, A SEPARATE SERIES OF MEMPHIS ANGELS, LLC;REEL/FRAME:012295/0978

Effective date:20010926

Owner name:INTERACTIVE PICTURES CORPORATION,TENNESSEE

Free format text:RELEASE;ASSIGNOR:IMAGE INVESTOR PORTFOLIO, A SEPARATE SERIES OF MEMPHIS ANGELS, LLC;REEL/FRAME:012295/0982

Effective date:20010926

Owner name:INTERNET PICTURES CORPORATION,TENNESSEE

Free format text:RELEASE;ASSIGNOR:IMAGE INVESTOR PORTFOLIO, A SEPARATE SERIES OF MEMPHIS ANGELS, LLC;REEL/FRAME:012295/0986

Effective date:20010926

AS | Assignment

Owner name:SONY CORPORATION, JAPAN

Free format text:ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:IPIX CORPORATION;REEL/FRAME:019084/0034

Effective date:20070222

Owner name:SONY CORPORATION,JAPAN

Free format text:ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:IPIX CORPORATION;REEL/FRAME:019084/0034

Effective date:20070222

AS | Assignment

Owner name:IPIX CORPORATION, TENNESSEE

Free format text:CHANGE OF NAME;ASSIGNOR:INTERNET PICTURES CORPORATION;REEL/FRAME:020172/0380

Effective date:20040322

Owner name:INTERNET PICTURES CORPORATION, TENNESSEE

Free format text:MERGER;ASSIGNORS:BAMBOO.COM;INTERACTIVE PICTURES CORPORATION;INTERNET PICTURES CORPORATION;REEL/FRAME:020172/0348

Effective date:20000119

FEPP | Fee payment procedure

Free format text:PAYER NUMBER DE-ASSIGNED (ORIGINAL EVENT CODE: RMPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Free format text:PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FEPP | Fee payment procedure

Free format text:PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

REMI | Maintenance fee reminder mailed
LAPS | Lapse for failure to pay maintenance fees
STCH | Information on status: patent discontinuation

Free format text:PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP | Lapsed due to failure to pay maintenance fee

Effective date:20120603

