USRE49105E1 - Self-calibrated, remote imaging and data processing system - Google Patents

Self-calibrated, remote imaging and data processing system

Info

Publication number
USRE49105E1
Authority
US
United States
Prior art keywords
imaging sensor
image
mount unit
imaging
pixels
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
US16/661,868
Inventor
Chester L. Smitherman
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vi Technologies LLC
Original Assignee
Vi Technologies LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US10/664,737 (external priority), patent US7127348B2
Priority claimed from US11/581,235 (external priority), patent US7725258B2
Priority claimed from US12/798,899 (external priority), patent US8483960B2
Assigned to VISUAL INTELLIGENCE LP. Change of name (see document for details). Assignors: M7 VISUAL INTELLIGENCE LP
Priority to US16/661,868, patent USRE49105E1
Application filed by Vi Technologies LLC
Assigned to M7 VISUAL INTELLIGENCE, L.P. Assignment of assignors interest (see document for details). Assignors: SMITHERMAN, CHESTER L.
Assigned to VI TECHNOLOGIES, LLC. Assignment of assignors interest (see document for details). Assignors: Visual Intelligence, LP
Publication of USRE49105E1
Application granted
Anticipated expiration
Legal status: Expired - Lifetime (Current)


Abstract

An imaging sensor system having a view of a target area, comprising: a rigid mount unit having at least two imaging sensors disposed within the mount unit, wherein a first imaging sensor and a second imaging sensor each has a focal axis passing through an aperture in the mount unit, wherein the first imaging sensor generates a first image area comprising a first data array of pixels and the second imaging sensor generates a second image area comprising a second data array of pixels, wherein the first and second imaging sensors are offset to have a first image overlap area in the target area, and wherein the first sensor's image data bisects the second sensor's image data in the first image overlap area.

Description

CROSS-REFERENCE TO RELATED APPLICATION
This application is a reissue of U.S. patent application Ser. No. 15/200,883 (now issued as U.S. Pat. No. 9,797,980), filed on Jul. 1, 2016, which was a continuation of U.S. patent application Ser. No. 13/772,994 (now issued as U.S. Pat. No. 9,389,298), filed on Feb. 21, 2013, which was a continuation of U.S. patent application Ser. No. 12/798,899 (now issued as U.S. Pat. No. 8,483,960), filed on Apr. 13, 2010, which was a continuation-in-part of U.S. patent application Ser. No. 11/581,235 (now issued as U.S. Pat. No. 7,725,258), filed on Oct. 11, 2006, which was a continuation-in-part of and claimed priority to U.S. patent application Ser. No. 10/664,737 (now issued as U.S. Pat. No. 7,127,348), filed on Sep. 18, 2003, which claimed priority to U.S. Provisional Patent Application Ser. No. 60/412,504, filed on Sep. 20, 2002, for “Vehicle Based Data Collection and Processing System.”
TECHNICAL FIELD OF THE INVENTION
The present invention relates, generally, to the field of remote imaging techniques and, more particularly, to a system for rendering high-resolution, high-accuracy, low-distortion digital images over very large fields of view.
BACKGROUND OF THE INVENTION
Remote sensing and imaging are broad-based technologies having a number of diverse and extremely important practical applications—such as geological mapping and analysis, and meteorological forecasting. Aerial and satellite-based photography and imaging are especially useful remote imaging techniques that have, over recent years, become heavily reliant on the collection and processing of data for digital images, including spectral, spatial, elevation, and vehicle location and orientation parameters. Spatial data—characterizing real estate improvements and locations, roads and highways, environmental hazards and conditions, utilities infrastructures (e.g., phone lines, pipelines), and geophysical features—can now be collected, processed, and communicated in a digital format to conveniently provide highly accurate mapping and surveillance data for various applications (e.g., dynamic GPS mapping). Elevation data may be used to improve the overall system's spatial and positional accuracy and may be acquired either from existing Digital Elevation Model (DEM) data sets or collected with the spectral sensor data from active, radiation-measuring Doppler-based devices, or from passive, stereographic calculations.
Major challenges facing remote sensing and imaging applications are spatial resolution and spectral fidelity. Photographic issues, such as spherical aberrations, astigmatism, field curvature, distortion, and chromatic aberrations are well-known problems that must be dealt with in any sensor/imaging application. Certain applications require very high image resolution—often with tolerances of inches. Depending upon the particular system used (e.g., aircraft, satellite, or space vehicle), an actual digital imaging device may be located anywhere from several feet to miles from its target, resulting in a very large scale factor. Providing images with very large scale factors, that also have resolution tolerances of inches, poses a challenge to even the most robust imaging system. Thus, conventional systems usually must make some trade-off between resolution quality and the size of a target area that can be imaged. If the system is designed to provide high-resolution digital images, then the field of view (FOV) of the imaging device is typically small. If the system provides a larger FOV, then usually the resolution of the spectral and spatial data is decreased and distortions are increased.
Ortho-imaging is an approach that has been used in an attempt to address this problem. In general, ortho-imaging renders a composite image of a target by compiling varying sub-images of the target. Typically, in aerial imaging applications, a digital imaging device that has a finite range and resolution records images of fixed subsections of a target area sequentially. Those images are then aligned according to some sequence to render a composite of a target area.
Often, such rendering processes are very time-consuming and labor intensive. In many cases, those processes require iterative processing that measurably degrades image quality and resolution—especially in cases where thousands of sub-images are being rendered. In cases where the imaging data can be processed automatically, that data is often repetitively transformed and sampled—reducing color fidelity and image sharpness with each successive manipulation. If automated correction or balancing systems are employed, such systems may be susceptible to image anomalies (e.g., unusually bright or dark objects)—leading to over or under-corrections and unreliable interpretations of image data. In cases where manual rendering of images is required or desired, time and labor costs are immense.
There is, therefore, a need for an ortho-image rendering system that provides efficient and versatile imaging for very large FOVs and associated data sets, while maintaining image quality, accuracy, positional accuracy and clarity. Additionally, automation algorithms are applied extensively in every phase of planning, collection, navigation, and processing of all related operations.
SUMMARY OF THE INVENTION
The present invention relates to a remote data collection and processing system using a variety of sensors. The system may include computer console units that control vehicle and system operations in real-time. The system may also include global positioning systems that are linked to and communicate with the computer consoles. Additionally, cameras and/or camera array assemblies can be employed for producing an image of a target viewed through an aperture. The camera array assemblies are communicatively connected to the computer consoles. The camera array assembly has a mount housing and a first imaging sensor centrally coupled to the housing having a first focal axis passing through the aperture. The camera array assembly also has a second imaging sensor coupled to the housing and offset from the first imaging sensor along an axis, that has a second focal axis passing through the aperture and intersecting the first focal axis within an intersection area. The camera array assembly has a third imaging sensor, coupled to the housing and offset from the first imaging sensor along the axis, opposite the second imaging sensor, that has a third focal axis passing through the aperture and intersecting the first focal axis within the intersection area. Any number of one-to-n cameras may be used in this manner, where “n” can be any odd or even number.
The system may also include an Attitude Measurement Unit (AMU) such as inertial, optical, or similar measurement units communicatively connected to the computer consoles and the camera array assemblies. The AMU may determine the yaw, pitch, and/or roll of the aircraft at any instant in time, and successive DGPS positions may be used to measure the vehicle heading with relation to geodesic north. The AMU data is integrated with the precision DGPS data to produce a robust, real-time AMU system. The system may further include a mosaicing module housed within the computer consoles. The mosaicing module includes a first component for performing initial processing on an input image. The mosaicing module also includes a second component for determining geographical boundaries of an input image, with the second component being cooperatively engaged with the first component. The mosaicing module further includes a third component for mapping an input image into the composite image with accurate geographical position. The third component is cooperatively engaged with the first and second components. A fourth component is also included in the mosaicing module for balancing color of the input images mapped into the composite image. The fourth component can be cooperatively engaged with the first, second and third components. Additionally, the mosaicing module can include a fifth component for blending borders between adjacent input images mapped into the composite image. The fifth component is cooperatively engaged with the first, second, third and fourth components.
A sixth component, an optional forward oblique and/or optional rear oblique camera array system, may be implemented that collects oblique image data and merges the image data with attitude and positional measurements in order to create a digital elevation model using stereographic techniques. Creation of the model may be performed in real-time onboard the vehicle or post-processed later. This sixth component works cooperatively with the other components. All components may be mounted to a rigid platform for the purpose of providing co-registration of sensor data. Vibrations, turbulence, and other forces may act on the vehicle in such a way as to create errors in the alignment relationship between sensors. Utilization of a common, rigid platform mount for the sensors provides a significant advantage over other systems that do not use this co-registration architecture.
Further, the present invention may employ a certain degree of lateral oversampling to improve output quality and/or co-mounted, co-registered oversampling to overcome physical pixel resolution limits.
BRIEF DESCRIPTION OF THE DRAWINGS
For a better understanding of the invention, and to show by way of example how the same may be carried into effect, reference is now made to the detailed description of the invention along with the accompanying figures in which corresponding numerals in the different figures refer to corresponding parts and in which:
FIG. 1 illustrates a vehicle based data collection and processing system of the present invention;
FIG. 1A illustrates a portion of the vehicle based data collection and processing system of FIG. 1;
FIG. 1B illustrates a portion of the vehicle based data collection and processing system of FIG. 1;
FIG. 2 illustrates a vehicle based data collection and processing system of FIG. 1 with the camera array assembly of the present invention shown in more detail;
FIG. 3 illustrates a camera array assembly in accordance with certain aspects of the present invention;
FIG. 4 illustrates one embodiment of an imaging pattern retrieved by the camera array assembly of FIG. 1;
FIG. 5 depicts an imaging pattern illustrating certain aspects of the present invention;
FIG. 6 illustrates an image strip in accordance with the present invention;
FIG. 7 illustrates another embodiment of an image strip in accordance with the present invention;
FIG. 8 illustrates one embodiment of an imaging process in accordance with the present invention;
FIG. 9 illustrates diagrammatically how photos taken with the camera array assembly can be aligned to make an individual frame;
FIG. 10 is a block diagram of the processing logic according to certain embodiments of the present invention;
FIG. 11 is an illustration of lateral oversampling looking down from a vehicle according to certain embodiments of the present invention;
FIG. 12 is an illustration of lateral oversampling looking down from a vehicle according to certain embodiments of the present invention;
FIG. 13 is an illustration of flight line oversampling looking down from a vehicle according to certain embodiments of the present invention;
FIG. 14 is an illustration of flight line oversampling looking down from a vehicle according to certain embodiments of the present invention;
FIG. 15 is an illustration of progressive magnification looking down from a vehicle according to certain embodiments of the present invention;
FIG. 16 is an illustration of progressive magnification looking down from a vehicle according to certain embodiments of the present invention;
FIG. 17 is an illustration of progressive magnification looking down from a vehicle according to certain embodiments of the present invention;
FIG. 18 is a schematic of the system architecture according to certain embodiments of the present invention;
FIG. 19 is an illustration of lateral co-mounted, co-registered oversampling in a sidelap sub-pixel area for a single camera array looking down from a vehicle according to certain embodiments of the present invention;
FIG. 20 is an illustration of lateral co-mounted, co-registered oversampling in a sidelap sub-pixel area for two overlapping camera arrays looking down from a vehicle according to certain embodiments of the present invention; and
FIG. 21 is an illustration of fore and lateral co-mounted, co-registered oversampling in sidelap sub-pixel areas for two stereo camera arrays looking down from a vehicle according to certain embodiments of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
While the making and using of various embodiments of the present invention are discussed in detail below, it should be appreciated that the present invention provides many applicable inventive concepts, which can be embodied in a wide variety of specific contexts. The specific embodiments discussed herein are merely illustrative of specific ways to make and use the invention and do not limit the scope of the invention.
A vehicle based data collection and processing system 100 of the present invention is shown in FIGS. 1, 1A, and 1B. Additional aspects and embodiments of the present invention are shown in FIGS. 2 and 18. System 100 includes one or more computer consoles 102. The computer consoles contain one or more computers 104 for controlling both vehicle and system operations. Examples of the functions of the computer console are controlling the digital color sensor systems that can be associated with the data collection and processing system, providing display data to a pilot, coordinating the satellite generated GPS pulse-per-second (PPS) event trigger (which may be 20 or more pulses per second), data logging, sensor control and adjustment, checking and alarming for error events, recording and indexing photos, storing and processing data, flight planning capability that automates the navigation of the vehicle, and providing a real-time display of pertinent information. A communications interface between the control computer console and the vehicle autopilot control provides the ability to actually control the flight path of the vehicle in real-time. This results in a more precise control of the vehicle's path than is possible by a human being. All of these functions can be accomplished by the use of various computer programs that are synchronized to the GPS PPS signals and take into account the various electrical latencies of the measurement devices. In an embodiment, the computer is embedded within the sensor.
One or more differential global positioning systems 106 are incorporated into the system 100. The global positioning systems 106 are used to navigate and determine precise flight paths during vehicle and system operations. To accomplish this, the global positioning systems 106 are communicatively linked to the computer console 102 such that the information from the global positioning systems 106 can be acquired and processed without flight interruption. Zero or more GPS units may be located at known survey points in order to provide a record of each sub-second's GPS satellite-based errors in order to be able to back correct the accuracy of the system 100. GPS and/or ground based positioning services may be used that eliminate the need for ground control points altogether. This technique results in greatly improved, sub-second by sub-second positional accuracy of the data capture vehicle.
One or more AMUs 108 that provide real-time yaw, pitch, and roll information that is used to accurately determine the attitude of the vehicle at the instant of data capture are also communicatively linked to the computer console 102. The present attitude measurement unit (AMU) (e.g., Applanix POS AV) uses three high performance fiber optic gyros, one gyro each for yaw, pitch, and roll measurement. AMUs from other manufacturers, and AMUs that use other inertial measurement devices, can be used as well. Additionally, an AMU may be employed to determine the instantaneous attitude of the vehicle and make the system more fault tolerant to statistical errors in AMU readings. Connected to the AMU can be one or more multi-frequency DGPS receivers 110. The multi-frequency DGPS receivers 110 can be integrated with the AMU's yaw, pitch, and roll attitude data in order to more accurately determine the location of the remote sensor platform in three dimensional space. Additionally, the direction of geodesic North may be determined by the vector created by successive DGPS positions, recorded in a synchronized manner with the GPS PPS signals.
One or more camera array assemblies 112 for producing an image of a target viewed through an aperture are also communicatively connected to the one or more computer consoles 102. The camera array assemblies 112, which will be described in greater detail below, provide the data collection and processing system with the ability to capture high resolution, high precision progressive scan or line scan, color digital photography.
The system may also include DC power and conditioning equipment 114 to condition DC power and to invert DC power to AC power in order to provide electrical power for the system. The system may further include a navigational display 116, which graphically renders the position of the vehicle versus the flight plan for use by the pilot (either onboard or remote) of the vehicle to enable precision flight paths in horizontal and vertical planes. The system may also include an EMU module 118 comprised of LIDAR, SAR, or a forward and rear oblique camera array for capturing three dimensional elevation/relief data. The EMU module 118 can include a laser unit 120, an EMU control unit 122, and an EMU control computer 124. Temperature controlling devices, such as solid state cooling modules, can also be deployed as needed in order to provide the proper thermal environment for the system.
The system also includes a mosaicing module, not depicted, housed within the computer console 102. The mosaicing module, which will be described in further detail below, provides the system the ability to gather data acquired by the global positioning system 106, the AMU 108, and the camera system 112 and process that data into useable orthomaps.
The system 100 also can include a self-locking flight path technique that provides the ability to micro-correct the positional accuracy of adjacent flight paths in order to realize precision that exceeds the native precision of the AMU and DGPS sensors alone.
A complete flight planning methodology is used to micro plan all aspects of missions. The inputs are the various mission parameters (latitude/longitude, resolution, color, accuracy, etc.) and the outputs are detailed on-line digital maps and data files that are stored onboard the data collection vehicle and used for real-time navigation and alarms. The ability to interface the flight planning data directly into the autopilot is an additional integrated capability. A computer program may be used that automatically controls the flight path, attitude adjustments, graphical display, moving maps of the vehicle path, checks for alarm conditions and corrective actions, notifies the pilot and/or crew of overall system status, and provides for fail-safe operations and controls. Safe operations parameters may be constantly monitored and reported. Whereas the current system uses a manned crew, the system is designed to perform equally well in an unmanned vehicle.
FIG. 2 shows another depiction of the present invention. In FIG. 2, the camera array assembly 112 is shown in more detail. As is shown, the camera array assembly 112 allows for images to be acquired from the rear oblique, the forward obliques and the nadir positions. FIG. 3 describes in more detail a camera array assembly of the present invention. FIG. 3 provides a camera array assembly 300 airborne over target 302 (e.g., terrain). For illustrative purposes, the relative size of assembly 300, and the relative distance between it and terrain 302, are not depicted to scale in FIG. 3. The camera array assembly 300 comprises a housing 304 within which imaging sensors 306, 308, 310, 312 and 314 are disposed along a concave curvilinear axis 316. The radius of curvature of axis 316 may vary or be altered dramatically, providing the ability to effect very subtle or very drastic degrees of concavity in axis 316. Alternatively, axis 316 may be completely linear—having no curvature at all. The imaging sensors 306, 308, 310, 312 and 314 couple to the housing 304, either directly or indirectly, by attachment members 318. Attachment members 318 may comprise a number of fixed or dynamic, permanent or temporary, connective apparatus. For example, the attachment members 318 may comprise simple welds, removable clamping devices, or electro-mechanically controlled universal joints.
Additionally, the system 100 may have a real-time, onboard navigation system to provide a visual, bio-feedback display to the vehicle pilot, or remote display in the case of operations in an unmanned vehicle. The pilot is able to adjust the position of the vehicle in real-time in order to provide a more accurate flight path. The pilot may be onboard the vehicle or remotely located and using the flight display to control the vehicle through a communication link.
The system 100 may also use highly fault-tolerant methods that have been developed to provide a software interleaved disk storage methodology that allows one or two hard drives to fail and still not lose target data that is stored on the drives. This software interleaved disk storage methodology provides superior fault-tolerance and portability versus other, hardware methodologies, such as RAID-5.
The system 100 may also incorporate a methodology that has been developed that allows for a short calibration step just before mission data capture. The calibration methodology step adjusts the camera settings, mainly exposure time, based on sampling the ambient light intensity and setting near optimal values just before reaching the region of interest. A moving average algorithm is then used to make second-by-second camera adjustments in order to deliver improved, consistent photo results. This improves the color processing of the orthomaps. Additionally, the calibration may be used to check or to establish the exact spatial position of each sensor device (cameras, DPG, AMU, EMU, etc.). In this manner, changes that may happen in the spatial location of these devices may be accounted for to maintain overall system precision metrics.
Additionally, the system 100 may incorporate a methodology that has been developed that allows for calibrating the precision position and attitude of each sensor device (cameras, DPG, AMU, EMU, etc.) on the vehicle by flying over an area that contains multiple known, visible, highly accurate geographic positions. A program takes this data as input and outputs the micro positional data that is then used to precisely process the orthomaps.
As depicted in FIG. 3, housing 304 comprises a simple enclosure inside of which imaging sensors 306, 308, 310, 312 and 314 are disposed. Whereas FIG. 3 depicts a 5-camera array, the system works equally well when utilizing any number of camera sensors from 1 to any number. Sensors 306 through 314 couple, via the attachment members 318, either collectively to a single transverse cross member, or individually to lateral cross members disposed between opposing walls of the housing 304. In alternative embodiments, the housing 304 may itself comprise only a supporting cross member of concave curvature to which the imaging sensors 306 through 314 couple, via members 318. In other embodiments, the housing 304 may comprise a hybrid combination of enclosure and supporting cross member. The housing 304 further comprises an aperture 320 formed in its surface, between the imaging sensors and target 302. Depending upon the specific type of host craft, the aperture 320 may comprise only a void, or it may comprise a protective screen or window to maintain environmental integrity within the housing 304. In the event that a protective transparent plate is used for any sensor, special coatings may be applied to the plate to improve the quality of the sensor data. Optionally, the aperture 320 may comprise a lens or other optical device to enhance or alter the nature of the images recorded by the sensors. The aperture 320 is formed with a size and shape sufficient to provide the imaging sensors 306 through 314 proper lines of sight to a target region 322 on terrain 302.
The imaging sensors 306 through 314 are disposed within or along housing 304 such that the focal axes of all sensors converge and intersect each other within an intersection area bounded by the aperture 320. Depending upon the type of image data being collected, the specific imaging sensors used, and other optics or equipment employed, it may be necessary or desirable to offset the intersection area or point of convergence above or below the aperture 320. The imaging sensors 306 through 314 are separated from each other at angular intervals. The exact angle of displacement between the imaging sensors may vary widely depending upon the number of imaging sensors utilized and on the type of imaging data being collected. The angular displacement between the imaging sensors may also be unequal, if required, so as to provide a desired image offset or alignment. Depending upon the number of imaging sensors utilized, and the particular configuration of the array, the focal axes of all imaging sensors may intersect at exactly the same point, or may intersect at a plurality of points, all within close proximity to each other and within the intersection area defined by the aperture 320.
As depicted in FIG. 3, the imaging sensor 310 is centrally disposed within the housing 304 along axis 316. The imaging sensor 310 has a focal axis 324, directed orthogonally from the housing 304 to align the line of sight of the imaging sensor with the image area 326 of the region 322. The imaging sensor 308 is disposed within the housing 304 along the axis 316, adjacent to the imaging sensor 310. The imaging sensor 308 is aligned such that its line of sight coincides with the image area 328 of the region 322, and such that its focal axis 330 converges with and intersects the axis 324 within the area bounded by the aperture 320. The imaging sensor 312 is disposed within the housing 304 adjacent to the imaging sensor 310, on the opposite side of the axis 316 as the imaging sensor 308. The imaging sensor 312 is aligned such that its line of sight coincides with the image area 332 of the region 322, and such that its focal axis 334 converges with and intersects axes 324 and 330 within the area bounded by the aperture 320. The imaging sensor 306 is disposed within the housing 304 along the axis 316, adjacent to the sensor 308. The imaging sensor 306 is aligned such that its line of sight coincides with the image area 336 of region 322, and such that its focal axis 338 converges with and intersects the other focal axes within the area bounded by aperture 320. The imaging sensor 314 is disposed within housing 304 adjacent to sensor 312, on the opposite side of axis 316 as sensor 306. The imaging sensor 314 is aligned such that its line of sight coincides with image area 340 of region 322, and such that its focal axis 344 converges with and intersects the other focal axes within the area bounded by aperture 320.
The imaging sensors 306 through 314 may comprise a number of digital imaging devices including, for example, individual area scan cameras, line scan cameras, infrared sensors, hyperspectral and/or seismic sensors. Each sensor may comprise an individual imaging device, or may itself comprise an imaging array. The imaging sensors 306 through 314 may all be of a homogenous nature, or may comprise a combination of varied imaging devices. For ease of reference, the imaging sensors 306 through 314 are hereafter referred to as cameras 306 through 314, respectively.
In large-format film or digital cameras, lens distortion is typically a source of imaging problems. Each individual lens must be carefully calibrated to determine precise distortion factors. In one embodiment of this invention, small-format digital cameras having lens angle widths of 17 degrees or smaller are utilized. This alleviates noticeable distortion efficiently and affordably.
Cameras 306 through 314 are alternately disposed within housing 304 along axis 316 such that each camera's focal axis converges upon aperture 320, crosses focal axis 324, and aligns its field of view with a target area opposite its respective position in the array, resulting in a “cross-eyed”, retinal relationship between the cameras and the imaging target(s). The camera array assembly 300 is configured such that adjoining borders of image areas 326, 328, 332, 336 and 340 overlap slightly.
If the attachment members 318 are of a permanent and fixed nature (e.g., welds), then the spatial relationship between the aperture 320, the cameras, and their lines of sight remains fixed, as will the spatial relationship between image areas 326, 328, 332, 336 and 340. Such a configuration may be desirable in, for example, a satellite surveillance application where the camera array assembly 300 will remain at an essentially fixed distance from region 322. The position and alignment of the cameras is set such that areas 326, 328, 332, 336 and 340 provide full imaging coverage of region 322. If the attachment members 318 are of a temporary or adjustable nature, however, it may be desirable to selectively adjust, either manually or by remote automation, the position or alignment of the cameras so as to shift, narrow or widen areas 326, 328, 332, 336 and 340—thereby enhancing or altering the quality of images collected by the camera array assembly 300.
In an embodiment, multiple, i.e., at least two, rigid mount units are affixed to the same rigid mount plate. The mount unit is any rigid structure to which at least one imaging sensor may be affixed. The mount unit is preferably a housing, which encloses the imaging sensor, but may be any rigid structure including a brace, tripod, or the like. For the purposes of this disclosure, an imaging sensor means any device capable of receiving and processing active or passive radiometric energy, i.e., light, sound, heat, gravity, and the like, from a target area. In particular, imaging sensors may include any number of digital cameras, including those that utilize a red-blue-green filter, a pushbroom filter, or a hyperspectral filter, LIDAR sensors, infrared sensors, heat-sensing sensors, gravitometers and the like. Imaging sensors do not include attitude measuring sensors such as gyroscopes, GPS devices, and the like devices, which serve to orient the vehicle with the aid of satellite data and/or inertial data. Preferably, the multiple sensors are different.
In the embodiment wherein the imaging sensor is a camera, LIDAR, or the like imaging sensor, the mount unit preferably has an aperture through which light and/or energy may pass. The mount plate is preferably planar, but may be non-planar. In the embodiment wherein the imaging sensor is a camera, LIDAR, or the like imaging sensor, the mount plate preferably has aperture(s) in alignment with the aperture(s) of the mount unit(s) through which light and/or energy may pass.
A rigid structure is one that flexes less than about a 100th of a degree, preferably less than about a 1,000th of a degree, more preferably less than about a 10,000th of a degree while in use. Preferably, the rigid structure is one that flexes less than about a 100th of a degree, preferably less than about a 1,000th of a degree, more preferably less than about a 10,000th of a degree while secured to an aircraft during normal, i.e., non-turbulent, flight. Objects are rigidly affixed to one another if during normal operation they flex from each other less than about a 100th of a degree, preferably less than about a 1,000th of a degree, more preferably less than about a 10,000th of a degree.
Camera 310 is designated as the principal camera. The image plane 326 of camera 310 serves as a plane of reference. The orientations of the other cameras 306, 308, 312 and 314 are measured relative to the plane of reference. The relative orientations of each camera are measured in terms of the yaw, pitch and roll angles required to rotate the image plane of the camera to become parallel to the plane of reference. The order of rotations is preferably yaw, pitch, and roll.
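As an illustrative sketch only (the patent gives no code), the following composes yaw, pitch, and roll rotations in that order to express one camera's image-plane orientation relative to the plane of reference; the rotation-axis conventions and the angle values are assumptions, not calibration data from the patent.

```python
import numpy as np

def rotation_matrix(yaw_deg, pitch_deg, roll_deg):
    """Compose rotations in the order yaw, then pitch, then roll (degrees)."""
    y, p, r = np.radians([yaw_deg, pitch_deg, roll_deg])
    Rz = np.array([[np.cos(y), -np.sin(y), 0.0],
                   [np.sin(y),  np.cos(y), 0.0],
                   [0.0,        0.0,       1.0]])   # yaw about the z axis
    Ry = np.array([[ np.cos(p), 0.0, np.sin(p)],
                   [ 0.0,       1.0, 0.0      ],
                   [-np.sin(p), 0.0, np.cos(p)]])   # pitch about the y axis
    Rx = np.array([[1.0, 0.0,        0.0       ],
                   [0.0, np.cos(r), -np.sin(r)],
                   [0.0, np.sin(r),  np.cos(r)]])   # roll about the x axis
    # Applied to a vector, yaw acts first, then pitch, then roll.
    return Rx @ Ry @ Rz

# Hypothetical relative orientation of camera 308 with respect to camera 310's plane.
R_308_to_reference = rotation_matrix(yaw_deg=0.5, pitch_deg=-12.0, roll_deg=0.1)
```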
The imaging sensors affixed to the mount unit(s) may not be aligned in the same plane. Instead, the angle of their mount relative to the mount angle of a first sensor affixed to the first mount unit, preferably the principal nadir camera of the first mount unit, may be offset. Accordingly, the imaging sensors may be co-registered to calibrate the physical mount angle offset of each imaging sensor relative to each other. In an embodiment, multiple, i.e., at least two, rigid mount units are affixed to the same rigid mount plate and are co-registered. In an embodiment, the cameras 306 through 314 are affixed to a rigid mount unit and co-registered. In this embodiment, the geometric centerpoint of the AMU, preferably a gyroscope, is determined using GPS and inertial data. The physical position of the first sensor affixed to the first mount unit, preferably the principal nadir camera of the first mount unit, is calculated relative to a reference point, preferably the geometric centerpoint of the AMU. Likewise, the physical positions of all remaining sensors within all mount units are calculated—directly or indirectly—relative to the same reference point.
The boresight angle of a sensor is defined as the angle from the geometric center of that sensor to a reference plane. Preferably the reference plane is orthogonal to the target area. The boresight angle of the first sensor may be determined using the ground target points. The boresight angles of subsequent sensors are preferably calculated with reference to the boresight angle of the first sensor. The sensors are preferably calibrated using known ground targets, which are preferably photo-identifiable, and alternatively calibrated using a self-locking flight path or any other method as disclosed in U.S. Publication No. 2004/0054488A1, now U.S. Pat. No. 7,212,938B2, the disclosure of which is hereby incorporated by reference in full.
The imaging sensor within the second mount unit may be any imaging sensor, and is preferably a LIDAR. Alternatively, the second imaging sensor is a digital camera, or array of digital cameras. In an embodiment, the boresight angle of the sensor(s) affixed to the second mount unit is calculated with reference to the boresight angle of the first sensor. The physical offset of the imaging sensor(s) within the second mount unit may be calibrated with reference to the boresight angle of the first sensor within the first mount unit.
In this manner, all of the sensors are calibrated at substantially the same epoch, using the same GPS signal, the same ground target(s), and under substantially the same atmospheric conditions. This substantially reduces compounded error realized when calibrating each sensor separately, using different GPS signals, against different ground targets, and under different atmospheric conditions.
Referring now to FIG. 4, images of areas 336, 328, 326, 332 and 340 taken by cameras 306 through 314, respectively, are illustrated from an overhead view. Again, because of the “cross-eyed” arrangement, the image of area 336 is taken by camera 306, the image of area 340 is taken by camera 314, and so on. In one embodiment of the present invention, images other than those taken by the center camera 310 take on a trapezoidal shape after perspective transformation. Cameras 306 through 314 form an array along axis 316 that is, in most applications, pointed down vertically. In an alternative embodiment, a second array of cameras, configured similarly to the array of cameras 306 through 314, is aligned with respect to the first array of cameras to have an oblique view providing a “heads-up” perspective. The angle of declination from horizontal of the heads-up camera array assembly may vary due to mission objectives and parameters, but angles of 25-45 degrees are typical. Other alternative embodiments, varying the mounting of camera arrays, are similarly comprehended by the present invention. In all such embodiments, the relative positions and attitudes of the cameras are precisely measured and calibrated so as to facilitate image processing in accordance with the present invention.
In one embodiment of the present invention, an external mechanism (e.g., a GPS timing signal) is used to trigger the cameras simultaneously thereby capturing an array of input images. A mosaicing module then renders the individual input images from such an array into an ortho-rectified compound image (or “mosaic”), without any visible seams between the adjacent images. The mosaicing module performs a set of tasks comprising: determining the geographical boundaries and dimensions of each input image; projecting each input image onto the mosaic with accurate geographical positioning; balancing the color of the images in the mosaic; and blending adjacent input images at their shared seams. The exact order of the tasks performed may vary, depending upon the size and nature of the input image data. In certain embodiments, the mosaicing module performs only a single transformation to an original input image during mosaicing. That transformation can be represented by a 4×4 matrix. By combining multiple transformation matrices into a single matrix, processing time is reduced and original input image sharpness is retained.
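To illustrate the single-transformation point, the sketch below folds several hypothetical component matrices into one 4×4 matrix that is then applied to homogeneous pixel coordinates. The matrices shown are placeholders for illustration, not the patent's actual calibration values.

```python
import numpy as np

# Hypothetical component transformations (identity/scale placeholders).
align_to_amu    = np.eye(4)                       # camera image plane -> AMU plane
align_to_ground = np.eye(4)                       # AMU plane -> ground plane
scale_to_mosaic = np.diag([0.5, 0.5, 1.0, 1.0])   # ground units -> mosaic pixel units

# Combine once; the single matrix is applied to every input pixel,
# so the original image is transformed and resampled only one time.
combined = scale_to_mosaic @ align_to_ground @ align_to_amu

pixel = np.array([1024.0, 768.0, 0.0, 1.0])       # homogeneous input coordinate
mosaic_coord = combined @ pixel
```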
During mapping of the input images to the mosaic, especially when mosaicing is performed at high resolutions, pixels in the mosaic (i.e., output pixels) may not be mapped to by any pixels in the input images (i.e., input pixels). Warped lines could potentially result as artifacts in the mosaic. Certain embodiments of the present invention overcome this with a super-sampling system, where each input and output pixel is further divided into an n×m grid of sub-pixels. Transformation is performed from sub-pixels to sub-pixels. The final value of an output pixel is the average value of its sub-pixels for which there is a corresponding input sub-pixel. Larger n and m values produce mosaics of higher resolution, but do require extra processing time.
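A minimal sketch of the super-sampling idea, assuming an n×m sub-pixel grid and a hypothetical transform_inverse function that stands in for the single mosaicing transformation by mapping an output position back to input-image coordinates.

```python
import numpy as np

def supersampled_output_pixel(input_img, transform_inverse, out_x, out_y, n=3, m=3):
    """Average the input values hit by the n x m sub-pixels of one output pixel."""
    values = []
    for i in range(n):
        for j in range(m):
            # Center of sub-pixel (i, j) inside the output pixel.
            sx = out_x + (i + 0.5) / n
            sy = out_y + (j + 0.5) / m
            ix, iy = transform_inverse(sx, sy)
            if 0 <= int(iy) < input_img.shape[0] and 0 <= int(ix) < input_img.shape[1]:
                values.append(input_img[int(iy), int(ix)])
    # The output pixel is the average over sub-pixels that had a corresponding input sub-pixel.
    return float(np.mean(values)) if values else 0.0
```

Larger n and m follow the text's trade-off: finer averaging at the cost of more processing time.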
During its processing of image data, the mosaicing module may utilize the following information: the spatial position (e.g., x, y, z coordinates) of each camera's focal point at the time an input image is captured; the attitude (i.e., yaw, pitch, roll) of each camera's image plane relative to the target region's ground plane at the time an input image was captured; each camera's fields of view (i.e., along track and cross track); and the Digital Terrain Model (DTM) of the area. The attitude can be provided by the AMUs associated with the system. Digital terrain models (DTMs) or Digital surface models (DSMs) can be created from information obtained using a LIDAR module 118. LIDAR is similar to the more familiar radar, and can be thought of as laser radar. In radar, radio waves are transmitted into the atmosphere, which scatters some of the energy back to the radar's receiver. LIDAR also transmits and receives electromagnetic radiation, but at a higher frequency since it operates in the ultraviolet, visible and infrared region of the electromagnetic spectrum. In operation, LIDAR transmits light out to a target area. The transmitted light interacts with and is changed by the target area. Some of this light is reflected/scattered back to the LIDAR instrument where it can be analyzed. The change in the properties of the light enables some property of the target area to be determined. The time for the light to travel out to the target area and back to the LIDAR device is used to determine the range to the target.
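The final sentence is the standard time-of-flight relation; a one-line sketch:

```python
SPEED_OF_LIGHT = 299_792_458.0            # meters per second

def lidar_range(round_trip_time_s):
    """Range to target from the measured out-and-back travel time."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# Example: a 6.67 microsecond round trip corresponds to roughly 1 km.
print(lidar_range(6.67e-6))               # ~1000 m
```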
DTM and DSM data sets can also be captured from the camera array assembly. Traditional means of obtaining elevation data may also be used such as stereographic techniques.
There are presently three basic types of LIDAR: Range finders, Differential Absorption LIDAR (DIAL) and Doppler LIDAR. Range finder LIDAR is the simplest LIDAR and is used to measure the distance from the LIDAR device to a solid or hard target. DIAL LIDAR is used to measure chemical concentrations (such as ozone, water vapor, pollutants) in the atmosphere. A DIAL LIDAR uses two different laser wavelengths that are selected so that one of the wavelengths is absorbed by the molecule of interest while the other wavelength is not. The difference in intensity of the two return signals can be used to deduce the concentration of the molecule being investigated. Doppler LIDAR is used to measure the velocity of a target. When the light transmitted from the LIDAR hits a target moving towards or away from the LIDAR, the wavelength of the light reflected/scattered off the target will be changed slightly. This is known as a Doppler shift, hence Doppler LIDAR. If the target is moving away from the LIDAR, the return light will have a longer wavelength (sometimes referred to as a red shift); if moving towards the LIDAR, the return light will be at a shorter wavelength (blue shifted). The target can be either a hard target or an atmospheric target (e.g., microscopic dust and aerosol particles that are carried by the wind).
A camera's focal point is preferably used as a perspective transformation center. Its position in space may be determined, for example, by a multi-frequency carrier phase post-processed GPS system mounted on the host craft. The offsets, in three dimensions, of a camera's focal point are preferably carefully measured against the center of the GPS antenna. These offsets may be combined with the position of the GPS antenna, and the orientation of the host craft, to determine the exact position of the camera's focal point. The position of the GPS antenna is preferably determined by processing of collected GPS data against similar ground-based GPS antennas deployed at precisely surveyed points.
One or more AMUs (e.g., the Applanix POS AV) are preferably mounted onboard for attitude determination. The attitude of the AMU reference plane relative to the target region's ground plane is preferably measured and recorded at short intervals, with accuracy better than one-hundredth of one degree. The attitude of the AMU reference plane may be defined as the series of rotations that can be performed on the axes of this plane to make it parallel to the ground plane. The term “align” could also be used to describe this operation.
The attitude of center camera 310 (i.e., its image plane), relative to the AMU, is preferably precisely calibrated. The attitude of each of the other cameras, relative to center camera 310, is preferably also carefully calibrated. This dependent calibration is more efficient than directly calibrating each camera. When the camera array assembly 300 is remounted, only center camera 310 needs to be recalibrated. Effectively, a series of two transformations is applied to an input image from center camera 310. First, the center camera's image plane is aligned to the AMU plane. Then, the AMU plane is aligned again to the ground plane. These transformations, however, combine into a single operation by multiplying their respective transformation matrices. For images from each of the other cameras, an additional transformation is first performed to align it with the center camera's image plane.
The position of the focal point of center camera 310 may be determined as described above. The x and y components of this position preferably determine the position of the mosaic's nadir point 400 on the ground. Field of view (FOV) angles of each camera are known, thus the dimensions of each input image may be determined by the z component of that camera's focal point. An average elevation of the ground is preferably determined by computing the average elevation of points in the DTMs of the area, and then each input image is projected to an imaginary horizontal plane at this elevation. Relief displacement is then preferably applied using the DTMs of the area. The DTMs can be obtained from many sources including: the USGS 30- or 10-meter DTMs available for most of the US; commercial DTMs; or DTMs obtained by a LIDAR or SAR EMU device mounted on the host craft that captures data concurrently with the cameras.
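As a rough sketch of how the known FOV angles and the z component (height above the projection plane) fix each input image's ground dimensions; the function name and the numeric values are hypothetical.

```python
import math

def ground_footprint(height_m, fov_along_deg, fov_cross_deg):
    """Ground extent of a nadir image along- and cross-track from height and FOV angles."""
    along = 2.0 * height_m * math.tan(math.radians(fov_along_deg) / 2.0)
    cross = 2.0 * height_m * math.tan(math.radians(fov_cross_deg) / 2.0)
    return along, cross

# Hypothetical: 2000 m above the average terrain elevation, 17 x 24 degree FOV.
print(ground_footprint(2000.0, 17.0, 24.0))   # roughly (598 m, 850 m)
```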
Besides being geographically correctly placed, the resulting compound image also needs to have radiometric consistency throughout, and no visible seams at the joints between two adjacent images. The present invention provides a number of techniques for achieving this goal.
A characteristic of a conventional camera is the exposure time (i.e., the time the shutter is open to collect light onto the image plane). The longer the exposure time, the lighter the resultant image becomes. Exposure time must adapt to changes in ambient lighting caused by conditions such as: cloud coverage; the angle and position of the sun relative to the camera; and so forth. Optimal exposure time may also depend on a camera's orientation with respect to lighting sources (e.g., cameras pointing towards a sunlit object typically receive more ambient light than those pointing towards a shaded object). Exposure time is adjusted to keep the average intensity of an image within a certain desired range. For example, in 24-bit color images each Red, Green and Blue component can have intensity values from 0 to 255. In most instances, however, it is desirable to keep the average intensity at a mean value (i.e., 127).
In the present invention, an exposure control module controls exposure time for each of the cameras or imaging sensors. It examines each input image and calculates average image intensity. Based on a moving average (i.e., average intensity of the last X number of images), the exposure control module determines whether to increase or decrease exposure time. The module can use a longer running average to effect a slower reaction to changes in lighting conditions, with less susceptibility to unusually dark or light images (e.g., asphalt roads or water). The exposure control module controls exposure time for each camera separately.
In systems where cameras are mounted without forward-motion compensation mechanisms, there must be a maximum limit for exposure time. Setting exposure time to a value larger than the maximum may cause motion-induced blurriness. For example, assume cameras are mounted on an airplane traveling at 170 miles/hour (or about 3 inches/ms). Assume the desired pixel resolution is 6 inches. Forward motion during image capture should be limited to half a pixel size—which in this case equals 3 inches. Thus, the maximum exposure in this example is 1 millisecond.
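The arithmetic in this example can be checked directly; a small sketch with the unit conversion spelled out (the function name is illustrative only):

```python
def max_exposure_ms(speed_mph, pixel_resolution_in, max_blur_fraction=0.5):
    """Longest exposure (ms) keeping forward motion under a fraction of one pixel."""
    inches_per_ms = speed_mph * 63360.0 / 3_600_000.0   # mph -> inches per millisecond
    return pixel_resolution_in * max_blur_fraction / inches_per_ms

# 170 mph is about 3 inches/ms; with 6-inch pixels and a half-pixel limit,
# the maximum exposure comes out to about 1 ms, as stated in the text.
print(max_exposure_ms(170.0, 6.0))   # ~1.0
```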
In controlling imaging quality, it is useful to be able to determine if changes in light intensity are caused either due to a change in ambient light or due to the presence of unusually light or dark objects (e.g., reflecting water body, metal roofs, asphalts, etc.). Certain applications of this invention involve aerial photography or surveillance. It is observed that aerial images of the ground usually contain plants and vegetation—which have more consistent reflectivity than water bodies or man-made structures such as roads and buildings. Of course, images of plants and vegetation are usually green-dominant (i.e., the green component is the greatest of the red, green and blue values). Therefore, intensity correlation can be made more accurate by focusing on the green-dominant pixels.
The exposure control module computes the average intensity of an image by selecting only green-dominant pixels. For example, if an image has 1 million pixels and 300,000 are green-dominant, only those 300,000 green-dominant pixels are included in the calculation of average intensity. This results in an imaging process that is less susceptible to biasing caused by man-made structures and water bodies, whose pixels are usually not green-dominant. As previously noted, it is desirable to maintain an intensity value of about 127. When the intensity value is over 127 (i.e., over-exposed), exposure time is reduced so that less light is captured. Similarly, when the intensity value is under 127 (i.e., under-exposed), exposure time is increased so that more light is captured. For example, consider a system flying over a target terrain area having many white roofs, whose intensities are very high. Average intensity for the images captured would tend to be high. In most conventional systems, exposure time would be reduced in order to compensate. In such an example, however, reducing exposure time is not proper, because the average intensity of the images has been biased by the bright roofs. Reducing exposure time would result in images where the ground is darker than it should be. In contrast, if only green-dominant pixels are processed in accordance with the present invention, then pixels representing the overly bright roofs do not bias the average intensity and the exposure time is not changed.
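A minimal sketch of the green-dominant intensity measure and the moving-average exposure decision described above. The adjustment step sizes and fallback behavior are assumptions; the 127 target and the 1 ms ceiling come from the text.

```python
import numpy as np

def green_dominant_mean_intensity(rgb):
    """Mean intensity over pixels whose green value exceeds both red and blue."""
    r, g, b = (rgb[..., k].astype(float) for k in range(3))
    mask = (g > r) & (g > b)
    if not mask.any():
        mask = np.ones(r.shape, dtype=bool)       # assumption: fall back to all pixels
    return ((r + g + b) / 3.0)[mask].mean()

def adjust_exposure(exposure_ms, recent_intensities, target=127.0, max_exposure_ms=1.0):
    """Nudge exposure time using a moving average of recent image intensities."""
    avg = float(np.mean(recent_intensities))
    if avg > target:
        exposure_ms *= 0.95                       # over-exposed: shorten exposure (step size assumed)
    elif avg < target:
        exposure_ms *= 1.05                       # under-exposed: lengthen exposure (step size assumed)
    return min(exposure_ms, max_exposure_ms)      # respect the motion-blur limit
```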
Thus, the exposure control module reduces intensity differences between input images. Nonetheless, further processing is provided to enhance tonal balance. There are a number of factors (e.g., lens physics, atmospheric conditions, spatial/positional relationships of imaging devices) that cause an uneven reception of light from the image plane. More light is received in the center of a camera or sensor than at the edges.
The mosaicing module of the present invention addresses this with an anti-vignetting function, illustrated with reference now to FIG. 5. A number of focal columns 500, 502, 504, 506 and 508 converge from image plane 509 and cross through focal point 510 as they range across imaging target area 512 (e.g., ground terrain). Columns 500 through 508 may comprise individual resolution columns of a single camera or sensor, or may represent the focal axes of a number of independent cameras or sensors. For reference purposes, column 504 serves as the axis, and point 513, at which column 504 intersects image plane 509, serves as a principal point. The exposure control module applies an anti-vignetting function, multiplying the original intensity of an input pixel with a column-dependent anti-vignetting factor. Because the receiving surface is represented as a plane with a coordinate system, each column will have a number of resolution rows (not shown). This relationship may be expressed, for a pixel p at column x and row y, as follows:
<adjusted intensity>=<original intensity>*ƒ(x);
where ƒ(x) is a function of the form:
ƒ(x)=cos(off-axis angle)**4.
The off-axis angle 514 is: zero for center column 504; larger for columns 502 and 506; and larger still for columns 500 and 508. The overall field of view angle 516 (FOVx angle) is depicted between columns 504 and 508.
The function ƒ(x) can be approximated by a number of line segments between columns. For a point falling within a line segment between any given columns c1 and c2, an adjustment factor is computed as follows:
<adjustment factor for c>=ƒ(c1)+[ƒ(c2)−ƒ(c1)]*(c−c1)/(c2−c1);
where ƒ(c1) and ƒ(c2) are the ƒ function values of the off-axis angles at column c1 and c2, respectively.
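A minimal sketch of the cos**4 factor and its piecewise-linear approximation between two sampled columns; the column geometry and numeric values are hypothetical.

```python
import math

def f(off_axis_angle_deg):
    """Anti-vignetting factor: cos(off-axis angle) raised to the fourth power."""
    return math.cos(math.radians(off_axis_angle_deg)) ** 4

def adjustment_factor(c, c1, c2, angle_c1_deg, angle_c2_deg):
    """Linear interpolation of f between sampled columns c1 and c2 for a column c."""
    f1, f2 = f(angle_c1_deg), f(angle_c2_deg)
    return f1 + (f2 - f1) * (c - c1) / (c2 - c1)

# Hypothetical: adjust a pixel halfway between columns sampled at 0 and 10 degrees off-axis.
original_intensity = 180.0
adjusted_intensity = original_intensity * adjustment_factor(5.0, 0.0, 10.0, 0.0, 10.0)
```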
Each set of input images needs to be stitched into a mosaic image. Even though the exposure control module regulates the amount of light each camera or sensor receives, the resulting input images may still differ in intensity. The present invention provides an intensity-balancing module that compares overlapping area between adjacent input images, to further balance the relative intensities. Because adjoining input images are taken simultaneously, the overlapping areas should, in theory, have identical intensity in both input images. However, due to various factors, the intensity values are usually not the same. Some such factors causing intensity difference could include, for example, the exposure control module being biased by unusually bright or dark objects present in the field of view of only a particular camera, or the boresight angles of cameras being different (i.e., cameras that are more slanted receive less light than those more vertical).
To balance two adjacent images, one is chosen as the reference image and the other is the secondary image. A correlation vector (FR, FG, FB) is determined using, for example, the following process. Let V be a 3×1 vector representing the values (R, G and B) of a pixel:
V = [R, G, B]ᵀ (a column vector).
A correlation matrix C may be derived as:
C = diag(FR, FG, FB), a 3×3 matrix with FR, FG and FB on its diagonal and zeros elsewhere;
where FR=AvgIr/AvgIn; AvgIr=Red average intensity of overlapped region in reference image; AvgIn=Red average intensity of overlapped region in new image; and FG and FB are similarly derived.
The correlation matrix scales pixel values of the secondary image so that the average intensity of the overlapping area of the secondary image becomes identical to the average intensity of the overlapping area of the reference image. The secondary image can be balanced to the reference image by multiplying its pixel values by the correlation matrix.
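A minimal sketch of deriving the diagonal correlation matrix from the two overlapping areas and applying it to the secondary image; the array names and shapes are assumptions.

```python
import numpy as np

def correlation_matrix(ref_overlap, new_overlap):
    """Diagonal matrix C = diag(FR, FG, FB) from per-channel average intensities."""
    ref_means = ref_overlap.reshape(-1, 3).astype(float).mean(axis=0)
    new_means = new_overlap.reshape(-1, 3).astype(float).mean(axis=0)
    return np.diag(ref_means / new_means)

def balance(secondary_img, C):
    """Scale every pixel of the secondary image by the correlation matrix."""
    flat = secondary_img.reshape(-1, 3).astype(float)
    return (flat @ C.T).reshape(secondary_img.shape)

# ref_overlap and new_overlap are the same ground area as seen in the reference
# and secondary images (H x W x 3 arrays); shapes here are hypothetical.
```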
Thus, in one embodiment of a balancing process according to the present invention, a center image is considered the reference image. The reference image is first copied to the compound image (or mosaic). Overlapping areas between the reference image and an adjoining image (e.g., the near left image) are correlated to compute a balancing correlation matrix (BCM). The BCM will be multiplied with vectors representing pixels of the adjoining image to make the intensity of the overlapping area identical in both images. One embodiment of this relationship may be expressed as:
Let I(center)=Average intensity of overlapping area in center image;
I(adjoining)=Average intensity of overlap in adjoining image; then
Balancing factor=I(center)/I(adjoining).
The balancing factor for each color channel (i.e., red, green and blue) is independently computed. These three values form the BCM. The now-balanced adjoining image is copied to the mosaic. Smooth transitioning at the border of the copied image is provided by “feathering” with a mask. This mask has the same dimensions as the adjoining image and comprises a number of elements. Each element in the mask indicates the weight of the corresponding adjoining image pixel in the mosaic. The weight is zero for pixels at the boundary (i.e., the output value is taken from the reference image), and increases gradually in the direction of the adjoining image until it becomes unity after a chosen blending width has been reached. Beyond the blending area, the mosaic is entirely determined by the pixels of the adjoining image. Similarly, the overlaps between all the other constituent input images are analyzed and processed to compute the correlation vectors and to balance the intensities of the images.
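A simplified sketch of such a feathering mask and its use, assuming a single-band image whose shared boundary is its left edge (an assumption made only for illustration; the actual boundary geometry depends on the mosaic layout):

    import numpy as np

    def feather_mask(height, width, blend_width):
        # Weight of each adjoining-image pixel in the mosaic: 0 at the boundary
        # column, rising linearly to 1 after blend_width columns.
        ramp = np.clip(np.arange(width) / float(blend_width), 0.0, 1.0)
        return np.tile(ramp, (height, 1))

    def feather_copy(reference_region, adjoining_image, mask):
        # Weighted composite: reference where the mask is 0, adjoining where it is 1.
        return (1.0 - mask) * reference_region + mask * adjoining_image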
A correlation matrix is determined using, for example, the following process with reference to FIG. 6. FIG. 6 depicts a strip 600 being formed in accordance with the present invention. A base mosaic 602 and a new mosaic 604, added along path (or track) 606, overlap each other in region 608. Let V be a vector that represents the R, G and B values of a pixel:
V=[R G B]T
Let h be the transition width of region 608, and y be the along-track 606 distance from the boundary 610 of the overlapped region to a point A, whose pixel values are represented by V. Let C be the correlation matrix:
C=diag(FR, FG, FB)
The balanced value of V, called V′ is:
V′=[y/h·I+(1−y/h)·C]×V, for 0<y<h;
V′=V, for y>=h;
where I is the identity matrix:
I=diag(1, 1, 1).
Note that the “feathering” technique is also used in combination with the gradient to minimize seam visibility.
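The gradient blend defined above can be sketched as follows (a hedged illustration; V, C, y and h follow the definitions given, and the helper name is hypothetical):

    import numpy as np

    def gradient_balanced(V, C, y, h):
        # V' = [y/h * I + (1 - y/h) * C] * V inside the transition, V' = V beyond it.
        V = np.asarray(V, dtype=float)
        if y >= h:
            return V
        M = (y / h) * np.eye(3) + (1.0 - y / h) * np.asarray(C, dtype=float)
        return M @ V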
When mosaics are long, differences in intensity at the overlap may change from one end of the mosaic to the other. Computing a single correlation vector to avoid creating visible seams may not be possible. The mosaic can be divided into a number of segments corresponding to the position of the original input images that make up the mosaic. The process described above is applied to each segment separately to provide better local color consistency.
Under this refined algorithm, pixels at the border of two segments may create vertical seams (assuming north-south flight lines). To avoid this problem, balancing factors for pixels in this area have to be “transitioned” from that of one segment to the other. This is explained now with reference toFIG. 7.
FIG. 7 depicts a strip 700 being formed in accordance with the present invention. A base mosaic 702 and a new segment 704 overlap in area 706. Mosaic 702 and another new segment 708 overlap in area 710. Segments 704 and 708 overlap in area 712, and areas 706, 710 and 712 all overlap and coincide at area 714. For explanation purposes, point 716 serves as an origin for y-axis 718 and x-axis 720. Movement along y-axis 718 represents movement along the flight path of the imaging system. Point 716 is located at the lower left of area 714.
According to the present invention, the dimensions of a strip are determined by the minimum and maximum x and y values of the constituent mosaics. An output strip is initialized to a background color. A first mosaic is transferred to the strip. The next mosaic (along the flight path) is processed next. Intensity values of the overlapping areas of the new mosaic and the first mosaic are correlated, separately for each color channel. The new mosaic is divided into a number of segments corresponding to the original input images that made up the mosaic. A mask matrix, comprising a number of mask elements, is created for the new mosaic. A mask element contains the correlation matrix for a corresponding pixel in the new mosaic. All elements in the mask are initialized to unity. The size of the mask can be limited to just the transition area of the new mosaic. The correlation matrix is calculated for the center segment. The mask area corresponding to the center segment is processed. The values of the elements at the edge of the overlap area are set to the correlation vector. Then, gradually moving away from the first mosaic along the strip, the components of the correlation matrix are either increased or decreased (depending on whether they are less than or greater than unity, respectively) until they become unity at a predetermined transition distance. The area of the mask corresponding to a segment adjoining the center segment is then processed similarly. However, the area 714 formed by the first mosaic and the center and adjoining segments of the new image requires special treatment. Because the correlation matrix for the adjoining segment may not be identical to that of the center segment, a seam may appear at the border of the two segments in the overlap area 714 with the first mosaic. Therefore, the corner is influenced by the correlation matrices from both segments. For a mask cell A at distance x to the border with the center segment and distance y to the overlap edge, its correlation matrix is the distance-weighted average of the two segments, evaluated as follows:
  • For pixel A(x, y) in area 714 at distance x to the border with the center segment, its balanced values are computed as the distance-weighted averages of the values computed using the two segments;
  • V1 is the balanced RGB vector based on segment 704;
  • V2 is the balanced RGB vector based on segment 708;
  • V′ is the combined (final) balanced RGB vector:
    V′=((d−x)/d)·V1+(x/d)·V2;
    where
    • the x-axis is the line going through the bottom of the overlapped region;
    • the y-axis is the line going through the left side of the overlapped region between segments 704 and 708;
    • h is the transition width; and
    • d is the width of the overlapped region between segments 704 and 708.
      The mask areas corresponding to other adjoining segments are computed similarly.
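As a hedged sketch of the corner treatment above (segment and helper names are illustrative only):

    import numpy as np

    def corner_balanced(V, C_center, C_adjoining, x, d):
        # Balanced values obtained from each segment's correlation matrix ...
        V = np.asarray(V, dtype=float)
        V1 = np.asarray(C_center, dtype=float) @ V       # based on the center segment
        V2 = np.asarray(C_adjoining, dtype=float) @ V    # based on the adjoining segment
        # ... combined by distance-weighted averaging across the overlap of width d.
        return ((d - x) / d) * V1 + (x / d) * V2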
Further according to the present invention, a color fidelity (i.e., white-balance) filter is applied. This multiplies the R and B components with a determinable factor to enhance color fidelity. The factor may be determined by calibrating the cameras and lenses. The color fidelity filter ensures that the colors in an image retain their fidelity, as perceived directly by the human eye. Within the image capture apparatus, the red, green and blue light receiving elements may have different sensitivities to the color they are supposed to capture. A “white-balance” process is applied, in which an image of a white object is captured. Theoretically, pixels in the image of that white object should have equivalent R, G and B values. In reality, however, due to different sensitivities and other factors, the average color values for R, G and B may be avgR, avgG and avgB, respectively. To equalize the color components, the R, G and B values of the pixels are multiplied by the following ratios:
R values are multiplied by the ratio avgG/avgR; and
B values are multiplied by the ratio avgG/avgB.
The end result is that the image of the white object is set to have equal R, G and B components.
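A minimal sketch of this white-balance step, assuming H×W×3 arrays and a captured image of a white object (the function name is illustrative):

    import numpy as np

    def white_balance(image, white_image):
        # Average R, G and B of the captured white object.
        avg_r, avg_g, avg_b = white_image.reshape(-1, 3).mean(axis=0)
        # Scale R by avgG/avgR and B by avgG/avgB; leave G unchanged.
        gains = np.array([avg_g / avg_r, 1.0, avg_g / avg_b])
        return image * gains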
In most applications, a strip usually covers a large area of non-water surface. Thus, average intensity for the strip is unlikely to be skewed by anomalies such as highly reflecting surfaces. The present invention provides an intensity normalization module that normalizes the average intensity of each strip so that the mean and standard deviation are of a desired value. For example, a mean of 127 is the norm in photogrammetry. A standard deviation of 51 helps to spread the intensity value over an optimal range for visual perception of image features. Each strip may have been taken in different lighting conditions and, therefore, may have different imaging data profiles (i.e., mean intensity and standard deviation). This module normalizes the strips, such that all have the same mean and standard deviation. This enables the strips to be stitched together without visible seams.
This intensity normalization comprises a computation of the mean intensity for each channel R, G and B, and for all channels. The overall standard deviation is then computed. Each R, G and B value of each pixel is transformed to the new mean and standard deviation:
new value=new mean+(old value−old mean)*(new std/old std).
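Applied per channel (or to all channels together), this transformation might be sketched as follows; the default target values 127 and 51 are the example values cited above:

    import numpy as np

    def normalize_strip(strip, new_mean=127.0, new_std=51.0):
        # new value = new mean + (old value - old mean) * (new std / old std)
        old_mean = strip.mean()
        old_std = strip.std()
        return new_mean + (strip - old_mean) * (new_std / old_std)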
Next, multiple adjacent strips are combined to produce tiled mosaics for an area of interest. Finished tiles can correspond to the USGS quads or quarter-quads. Stitching strips into mosaics is similar to stitching mosaics together to generate strips, with strips now taking the role of the mosaics. At the seam line between two strips, problems may arise if the line crosses elevated structures such as buildings, bridges, etc. This classic problem in photogrammetry arises from the parallax caused by the same object being looked at from two different perspectives. During imaging of a building, for example, one strip may present a view from one side of the building while another strip presents a view from another side of the building. After the images are stitched together, the resulting mosaic may look like a tepee. In order to address this, a terrain-guided mosaicing process may be implemented to guide the placement of a seam line. For example, LIDAR or DEM data collected with, or analyzed from, image data may be processed to determine the configuration and shaping of images as they are mosaiced together. Thus, in some mosaiced images, a seam line may not be a straight line—instead comprising a seam line that shifts back and forth to snake through elevated structures.
Referring now toFIG. 8, one embodiment of animaging process800 is illustrated in accordance with the present invention as described above.Process800 begins with aseries802 of one, or more, raw collected images.Images802 are then processed through a white-balancingprocess804, transforming them into a series of intermediate images.Series802 is then processed throughanti-vignetting function806 before progressing to theorthorectification process808. As previously noted, orthorectification may rely on position andattitude data810 from the imaging sensor system or platform, and onDTM data812.DTM data812 may be developed fromposition data810 and from, for example,USGS DTM data814 orLIDAR data816.Series802 is now orthorectified and processing continues with color balancing818. After color balancing,series802 is converted bymosaicing module820 intocompound image822.Module820 performs the mosaicing and feathering processes during this conversion. Now, one ormore compound images822 are further combined instep824, by mosaicing with a gradient and feathering, intoimage strip826. Image strips are processed throughintensity normalization828. The now normalizedstrips828 are mosaiced together instep830, again by mosaicing with a gradient and feathering, rendering a finishingtiled mosaic832. The mosaicing performed instep830 may comprise a terrain-guided mosaicing, relying onDTM data812 orLIDAR data816.
FIG. 9 illustrates diagrammatically how photos taken with the camera array assembly may be aligned to make an individual frame. This embodiment shows a photo pattern illustration looking down from a vehicle, using data ortho-rectified from five cameras.
FIG. 10 is a block diagram of the processing logic according to certain embodiments of the present invention. As shown in block diagram 1000, the processing logic accepts one or more inputs, which may include elevation measurements 1002, attitude measurements 1004 and/or photo and sensor imagery 1006. Certain inputs may be passed through an initial processing step prior to analysis, as is shown in block 1008, wherein the attitude measurements are combined with data from ground control points. Elevation measurements 1002 and attitude measurements 1004 may be combined to generate processed elevation data 1010. Processed elevation data 1010 may then be used to generate elevation DEM 1014 and DTM 1016. Similarly, attitude measurements 1004 may be combined with photo and sensor imagery 1006 to generate georeferenced images 1012, which then undergo image processing 1018, which may include color balancing and gradient filtering.
Depending on the data set to be used (1020), either DTM 1016 or a USGS DEM 1022 is combined with processed images 1018 to generate orthorectified imagery 1024. Orthorectified imagery 1024 then feeds into self-locking flightlines 1026. Balancing projection mosaicing 1028 then follows, to generate final photo output 1030.
The present invention may employ a certain degree of lateral oversampling to improve output quality. FIG. 11 is an illustration of a lateral oversampling pattern 1100 looking down from a vehicle according to certain embodiments of the present invention showing minimal lateral oversampling. In this illustration, the central nadir region 1102 assigned to the center camera overlaps only slightly with the left nadir region 1104 and right nadir region 1106, so that overlap is minimized. FIG. 12 is an illustration of a lateral oversampling pattern 1200 looking down from a vehicle according to certain embodiments of the present invention showing a greater degree of lateral oversampling. In this illustration, the central nadir region 1202 shows a high degree of overlap with left nadir region 1204 and right nadir region 1206.
In addition to the use of lateral oversampling as shown inFIGS. 11 and 12, the present invention may employ flight line oversampling as well.FIG. 13 is an illustration of a flightline oversampling pattern1300 looking down from a vehicle according to certain embodiments of the present invention showing a certain degree of flight line oversampling but minimal lateral oversampling.Central nadir regions1302 and1304 are overlapped to one another along the flight line, but do not overlap laterally withleft nadir regions1306 and1308 or withright nadir regions1310 and1312.
FIG. 14 is an illustration of flight line oversampling looking down from a vehicle according to certain embodiments of the present invention showing significant flight line oversampling as well as significant lateral oversampling. It can be seen that each of the central nadir regions 1402 through 1406 is significantly overlapped with the others as well as with left nadir regions 1408 through 1412 and right nadir regions 1414 through 1418. Left nadir regions 1408 through 1412 are overlapped with one another, as are right nadir regions 1414 through 1418. Accordingly, each point on the surface is sampled at least twice, and in some cases as many as four times. This technique uses the fact that in the area of an image that is covered twice, or more, by different camera sensors, a doubling of the image resolution is possible in both the lateral (across path) and flight line (along path) directions, for an overall quadrupling of the resolution. In practice, the improvement in image/sensor resolution is somewhat less than doubled in each of the dimensions, approximately 40% in each dimension, or 1.4×1.4=˜2 times. This is due to the statistical variations of the sub-pixel alignment/orientation. In effect, the pixel grid is rarely exactly equidistant from the overlaid pixel grid. If extremely precise lateral camera sensor alignments were made at the sub-pixel level, a quadrupling of image resolution could be realized.
FIG. 15 is an illustration of aprogressive magnification pattern1500 looking down from a vehicle according to certain embodiments of the present invention.Central nadir region1502 is bounded on its left and right edges by innerleft nadir region1504 and innerright nadir region1506, respectively. Innerleft nadir region1504 is bounded on its left edge by outerleft nadir region1508, while innerright nadir region1506 is bounded on its right edge by outerright nadir region1510. Note that these regions exhibit a minimal degree of overlap and oversampling from one to another.
FIG. 16 is an illustration of aprogressive magnification pattern1600 looking down from a vehicle according to certain embodiments of the present invention.Central nadir region1602 is bounded on its left and right edges by innerleft nadir region1604 and innerright nadir region1606, respectively. Innerleft nadir region1604 is bounded on its left edge by outerleft nadir region1608, while innerright nadir region1606 is bounded on its right edge by outerright nadir region1610. Note that, as above, these regions exhibit a minimal degree of overlap and oversampling from one to another. Within each of thenadir regions1604 through1610, there is acentral image region1614 through1620 shown shaded in grey.
FIG. 17 is an illustration of aprogressive magnification pattern1700 looking down from a vehicle according to certain embodiments of the present invention. In the center ofpattern1700, a leftinner nadir region1702 and a rightinner nadir region1704 overlap in the center. A leftintermediate nadir region1706 and a rightintermediate nadir region1708 are disposed partly outside ofregions1702 and1704, respectively, each sharing an overlapping area with the respective adjacent area by approximately 50%. An outerleft nadir region1710 and an outerright nadir region1712 are disposed partly outside ofregions1706 and1708, respectively, each sharing an overlapping area with the respective adjacent area by approximately 50%. Acentral image region1714 is disposed in the center ofpattern1700, comprised of the central portions ofnadir regions1702 through1712.
FIG. 18 depicts a schematic of the architecture of asystem1800 according to certain embodiments of the present invention.System1800 may include one ormore GPS satellites1802 and one ormore SATCOM satellites1804. One or moreGPS location systems1806 may also be included, operably connected to one ormore modules1808 collecting LIDAR, GPS and/or X, Y, Z location data and feeding such information to one or more datacapture system applications1812. One or more datacapture system applications1812 may also receive spectral data from acamera array1822. ADGPS1810 may communicate with one ormore SATCOM satellites1804 via a wireless communications link1826. One ormore SATCOM satellites1804 may, in turn, communicate with one or more datacapture system applications1812.
One or more datacapture system applications1812 may interface with anautopilot1816, an SSD and/or aRealTime StitchG system1820, which may also interact with one another.SSD1814 may be operably connected toRealTime DEM1818. Finally,RealTime DEM1818 andRealTime StitchG1820 may be connected to a storage device, such asdisk array1824.
The present invention may employ a certain degree of co-mounted, co-registered oversampling to overcome physical pixel resolution limits.FIG. 19 is an illustration of a lateral co-mounted,co-registered oversampling configuration1900 for asingle camera array112 looking down from a vehicle according to certain embodiments of the present invention showing minimal lateral oversampling. The cameras overlap a few degrees in thevertical sidelap area1904 and1908. WhereasFIG. 19 depicts a 3-camera array, these subpixel calibration techniques work equally well when utilizing any number of camera sensors from 2 to any number of cameras being calibrated.
Similar to the imaging sensors inFIGS. 3 and 4, the camera sensors may be co-registered to calibrate the physical mount angle offset of each sensor relative to each other and/or to the nadir camera. This provides an initial, “close” calibration. These initial calibration parameters may be entered into anonboard computer system104 in thesystem100, and updated during flight using oversampling techniques.
Referring now toFIG. 19, the rectangles labeled A, B, and C representimage areas1902,1906 and1910 from a 3-camera array C-B-A (not shown). Images ofareas1902,1906 and1910 taken by cameras A through C (not shown), respectively, are illustrated from an overhead view. Again, similar toFIGS. 3 and 4, because of the “cross-eyed” arrangement, the image ofarea1902 is taken by right camera A, the image ofarea1906 is taken by center/nadir camera B, and the image ofarea1910 is taken by left camera C. Cameras A through C form an array (not shown) that is, in most applications, pointed down vertically.
InFIG. 19, the hatched areas labeled A/B and B/C sidelaps representimage overlap areas1904 and1908, respectively. The leftimage overlap area1904 is where right camera A overlaps with the center/nadir camera B, and the rightimage overlap area1908 is where the left camera C overlaps with the center/nadir camera B. In thesesidelap areas1904 and1908, the camera sensor grid bisects each pixel in theoverlap areas1904 and1908, which effectively quadruples the image resolution in theseareas1904 and1908 via the mechanism of co-mounted, co-registered over-sampling. In effect, the improvement in image/sensor resolution is doubled in each dimension, or 2×2=4 times. This quadrupling of the image resolution also quadruples the alignment precision between adjacent cameras.
Further, this quadrupling of alignment precision between adjacent cameras improves the alignment precision of the system 100 for all sensors affixed to a rigid mount plate. The cameras and sensors are affixed to a rigid mount unit, which is affixed to the rigid mount plate, as discussed above. In particular, when the angular alignment of adjacent cameras affixed to the rigid mount unit is improved, the angular alignment of the other sensors is also enhanced. This enhancement of alignment precision for the other sensors affixed to the rigid mount plate also improves the image resolution for those sensors.
A lateral co-mounted,co-registered oversampling configuration2000 for two overlappingcamera arrays112 is illustrated inFIG. 20. In particular,FIG. 20 is an illustration of a lateral co-mounted,co-registered oversampling configuration2000 for two overlappingcamera arrays112 looking down from a vehicle according to certain embodiments of the present invention showing maximum lateral oversampling. The adjacent cameras overlap a few degrees in the vertical sidelap areas2006,2008,2014 and2016, and the corresponding cameras overlap completely in theimage areas2002,2010,2018 and2004,2012,2020. WhereasFIG. 20 depicts two 3-camera arrays, these subpixel calibration techniques work equally well when utilizing two overlapping camera arrays with any number of camera sensors from 2 to any number of cameras being calibrated.
Similar to the imaging sensors inFIGS. 3 and 4, the camera sensors may be co-registered to calibrate the physical mount angle offset of each sensor relative to each other and/or to the nadir camera. In this embodiment, multiple, i.e., at least two, rigid mount units are affixed to a rigid mount plate and are co-registered. This provides an initial, “close” calibration. These initial calibration parameters may be entered into anonboard computer system104 in thesystem100, and updated during flight.
Referring now toFIG. 20, the rectangles labeled A, B, and C representimage areas2002,2010,2018, and2004,2012,2020 from two overlapping 3-camera arrays C-B-A (not shown), respectively. Images ofareas2002,2010,2018, and2004,2012,2020 taken by cameras A through C (not shown) and overlapping cameras A′ through C′ (not shown), respectively, are illustrated from an overhead view. Again, similar toFIGS. 3 and 4, because of the “cross-eyed” arrangement, the image ofarea2002 is taken by right camera A, the image ofarea2010 is taken by center/nadir camera B, and the image ofarea2018 is taken by left camera C. Further, the image ofarea2004 is taken by right camera A′, the image ofarea2012 is taken by center camera B′, and the image ofarea2020 is taken by left camera C′. Cameras A through C and overlapping cameras A′ through C′ form arrays (not shown) that are, in most applications, pointed down vertically.
In FIG. 20, the hatched areas labeled A/B and B/C sidelaps represent two overlapping image overlap areas 2006, 2008 and 2014, 2016, respectively. The left image overlap areas 2006 and 2008 are where right camera A overlaps with the center/nadir camera B, and where right camera A′ overlaps with the center camera B′, respectively. The right image overlap areas 2014 and 2016 are where the left camera C overlaps with the center/nadir camera B, and where the left camera C′ overlaps with the center camera B′. In these sidelap areas 2006, 2008 and 2014, 2016, respectively, the camera sensor grid bisects each pixel in the overlap areas 2006, 2008 and 2014, 2016, which effectively quadruples the image resolution in these areas 2006, 2008 and 2014, 2016 via the mechanism of co-mounted, co-registered oversampling. In effect, the improvement in image/sensor resolution is doubled in each dimension, or 2×2=4 times. This quadrupling of the image resolution quadruples the alignment precision between adjacent cameras, as discussed above.
By having two overlapping camera arrays, the image resolution is effectively quadrupled again for the overlapping sidelap overlap areas2006,2008 and2014,2016. This results in an astounding overall 64 times improvement insystem100 calibration and camera alignment.
In the overlapping sidelap areas 2006 and 2008, the overlapping camera sensor grids bisect each pixel in the sidelap areas 2006 and 2008, which effectively quadruples the image resolution in these areas 2006 and 2008 via the mechanism of co-mounted, co-registered oversampling. Similarly, in the overlapping sidelap areas 2014 and 2016, the overlapping camera sensor grids bisect each pixel in the sidelap areas 2014 and 2016, which effectively quadruples the image resolution in these areas 2014 and 2016. In effect, the improvement in image/sensor resolution is again doubled in each dimension, or 2×2×2×2×2×2=64 times. This overall 64 times improvement of the image resolution also enhances alignment precision by 64 times between adjacent cameras.
This 64 times improvement of alignment precision between adjacent and corresponding cameras enhances the alignment precision of the system 100 for all sensors affixed to a rigid mount plate. Cameras A through C and, optionally, other sensors are affixed to a first rigid mount unit, and cameras A′ through C′ and, optionally, other sensors are affixed to a second rigid mount unit, which are each affixed to a rigid mount plate. In particular, when the angular alignment of adjacent and/or corresponding cameras affixed to the first and/or second rigid mount units is improved, the angular alignment of the other sensors is also enhanced. This enhancement of alignment precision for the other sensors affixed to the rigid mount plate also improves the image resolution for those sensors.
By having two overlapping camera arrays, the image resolution is effectively quadrupled for the entire image, not just for the A/B and B/C sidelap overlap areas. Referring now to FIG. 20, the overlapping grid detail labeled “OVERLAPPING GRID 4X” represents overlapping areas 2022 and 2024 in right image areas 2018 and 2020, respectively. In the overlapping areas 2022 and 2024, the overlapping camera sensor grids bisect each pixel in the overlapping areas 2022 and 2024, which effectively quadruples the image resolution in these areas 2022 and 2024 via the mechanism of co-mounted, co-registered oversampling. In effect, the improvement in image resolution is doubled in each dimension, or 2×2=4 times.
In a preferred embodiment, one camera array is monochrome, and another camera array is red-green-blue. Even though each array covers different color bands, simple image processing techniques are used so that all color bands realize the benefit of this increased resolution. Another advantage provided by these techniques is that, in the case where one camera array is red-green-blue and the other, overlapping camera array is infrared or near infrared (or some other bandwidth), the combination results in a superior multi-spectral image.
Accordingly, all of the improvements (i.e., 4 times) identified for the embodiment of FIG. 19 discussed above apply to the embodiment of FIG. 20; however, additional significant enhancements (i.e., 64 times) to the calibration precision and overall image resolution of the system 100 may be realized through the two overlapping camera arrays.
FIG. 21 is an illustration of a fore and lateral co-mounted,co-registered oversampling configuration2100 for twocamera arrays112 looking down from a vehicle according to certain embodiments of the present invention. In particular,FIG. 21 is an illustration of a fore and lateral co-mounted,co-registered oversampling configuration2100 for two overlappingcamera arrays112 looking down from a vehicle according to certain embodiments of the present invention showing minimal fore and minimal lateral oversampling. The adjacent cameras overlap a few degrees in thevertical sidelap areas2104,2108,2124 and2128, and the corresponding cameras overlap a few degrees along thehorizontal forelap areas2112,2116 and2120. WhereasFIG. 21 depicts two 3-camera arrays, these subpixel calibration techniques work equally well when utilizing two overlapping camera arrays with any number of camera sensors from 2 to any number of cameras being calibrated.
Similar to the imaging sensors inFIGS. 3 and 4, the camera sensors may be co-registered to calibrate the physical mount angle offset of each sensor relative to each other and/or to the nadir camera. In this embodiment, multiple, i.e., at least two, rigid mount units are affixed to a rigid mount plate and are co-registered. This provides an initial, “close” calibration. These initial calibration parameters may be entered into anonboard computer system104 in thesystem100, and updated during flight.
Referring now to FIG. 21, the rectangles labeled A, B, and C represent image areas 2102, 2106 and 2110 from a 3-camera array C-B-A (not shown), and the rectangles D, E, and F represent image areas 2122, 2126 and 2130 from a 3-camera array F-E-D (not shown), respectively. Images of areas 2102, 2106 and 2110 taken by cameras A through C (not shown), and images of areas 2122, 2126 and 2130 taken by cameras D through F (not shown), respectively, are illustrated from an overhead view. Again, similar to FIGS. 3 and 4, because of the “cross-eyed” arrangement, the rear, left image of area 2102 is taken by rear, right camera A, the rear, center image of area 2106 is taken by rear, center/nadir camera B, and the rear, right image of area 2110 is taken by rear, left camera C. Further, the forward, left image of area 2122 is taken by forward, right camera D, the forward, center image of area 2126 is taken by forward, center camera E, and the forward, right image of area 2130 is taken by forward, left camera F. Cameras A through C and overlapping cameras D through F form arrays (not shown) that are, in most applications, pointed down vertically.
InFIG. 21, the vertical hatched areas represent fourimage overlap areas2104,2108,2124 and2128. The rear, leftimage overlap area2104 is where rear, right camera A overlaps with the center/nadir camera B, and the rear, rightimage overlap area2108 is where rear, left camera C overlaps with the center/nadir camera B. The forward, leftimage overlap area2124 is where forward, right camera D overlaps with the center/nadir camera E, and the forward, rightimage overlap area2128 is where forward, left camera F overlaps with the center camera E.
Referring now to FIG. 21, the overlapping grid detail labeled “SIDELAP AREA 4:1” represents overlapping sidelap overlap areas 2104, 2108 and 2124, 2128. In these sidelap overlap areas 2104, 2108, 2124 and 2128, the camera sensor grid bisects each pixel in the overlap areas 2104, 2108, 2124 and 2128, which effectively quadruples the image resolution in these areas 2104, 2108, 2124 and 2128 via the mechanism of co-mounted, co-registered oversampling. In effect, the improvement in image/sensor resolution is doubled in each dimension, or 2×2=4 times. This quadrupling of the image resolution quadruples the alignment precision between adjacent cameras, as discussed above.
This quadrupling of alignment precision between adjacent cameras improves the alignment precision of the system 100 for all sensors affixed to a rigid mount plate. Cameras A through C and, optionally, other sensors are affixed to a first rigid mount unit, and cameras D through F and, optionally, other sensors are affixed to a second rigid mount unit, which are each affixed to a rigid mount plate. In particular, when the angular alignment of adjacent cameras affixed to the first or second rigid mount units is improved, the angular alignment of the other sensors affixed to the mount unit is also enhanced. This enhancement of alignment precision for the other sensors affixed to the rigid mount plate also improves the image resolution for those sensors.
Similarly, the horizontal hatched areas represent threeimage overlap areas2112,2116 and2120. The forward, leftimage overlap area2112 is where rear, right camera A overlaps with the forward, right camera D, forward, center image overlap area2116 is where rear, center/nadir camera B overlaps with the forward, center camera E, and the rear, right image overlap area2120 is where rear, left camera C overlaps with forward, left camera F.
Referring now to FIG. 21, the overlapping grid detail labeled “FORELAP AREA 4:1” represents overlapping forelap overlap areas 2112, 2116 and 2120. In these forelap overlap areas 2112, 2116 and 2120, the camera sensor grid bisects each pixel in the overlap areas 2112, 2116 and 2120, which effectively quadruples the image resolution in these areas 2112, 2116 and 2120 via the mechanism of co-mounted, co-registered oversampling. In effect, the improvement in image/sensor resolution is doubled in each dimension, or 2×2=4 times. This quadrupling of the image resolution quadruples the alignment precision between corresponding cameras.
This quadrupling of alignment precision between corresponding cameras improves the alignment precision of the system 100 for all sensors affixed to a rigid mount plate. Cameras A through C and, optionally, other sensors are affixed to a first rigid mount unit, and cameras D through F and, optionally, other sensors are affixed to a second rigid mount unit, which are each affixed to a rigid mount plate. In particular, when the angular alignment of corresponding cameras affixed to the first or second rigid mount units is improved, the angular alignment of the other sensors is also enhanced. This enhancement of alignment precision for the other sensors affixed to the rigid mount plate also improves the image resolution for those sensors.
Similar to the overlapping sidelap overlap areas 2006, 2008 and 2014, 2016 in FIG. 20, the intersecting forelap and sidelap overlap areas 2114 and 2118 in FIG. 21 result in an astounding overall 64 times improvement in system calibration and camera alignment. Referring now to FIG. 21, the intersecting grid detail labeled “QUAD OVERLAP AREA 64:1” represents intersecting forelap and sidelap overlap area 2118. In the intersecting forelap and sidelap overlap areas 2114 and 2118, the overlapping camera sensor grids bisect each pixel in the intersecting areas 2114 and 2118, which effectively quadruples the image resolution in these areas 2114 and 2118 via the mechanism of co-mounted, co-registered oversampling. In effect, the improvement in image/sensor resolution is again doubled in each dimension, or 2×2×2×2×2×2=64 times. This overall 64 times improvement of the image resolution also enhances alignment precision by 64 times between adjacent cameras.
This 64 times improvement of alignment precision between adjacent and corresponding cameras enhances the alignment precision of the system 100 for all sensors affixed to a rigid mount plate. Cameras A through C and, optionally, other sensors are affixed to a first rigid mount unit, and cameras D through F and, optionally, other sensors are affixed to a second rigid mount unit, which are each affixed to a rigid mount plate. In particular, when the angular alignment of adjacent and/or corresponding cameras affixed to the first and/or second rigid mount units is improved, the angular alignment of the other sensors is also enhanced. This enhancement of alignment precision for the other sensors affixed to the rigid mount plate also improves the image resolution for those sensors.
In a preferred embodiment, one camera array is monochrome, and another camera array is red-green-blue. Even though each array covers different color bands, simple image processing techniques are used so that all color bands realize the benefit of this increased resolution. Another advantage provided by these techniques is that, in the case where one camera array is red-green-blue and the other, overlapping camera array is infrared or near infrared (or some other bandwidth), the combination results in a superior multi-spectral image.
As shown in FIGS. 19-21, these techniques may be used to overcome the resolution limits imposed on camera systems due to the inability of optical glass to resolve “very small” objects. In particular, there are known physical limits to the ability of optical glass in camera lenses to resolve very small objects. This is often called “the resolving limit of glass”. For example, if 1 millimeter pixels are required from 10,000 feet of altitude, the use of an extremely high magnification telescopic lens would be required to obtain a ground swath of about 100 feet. This is because no matter how many pixels can be produced by a charge-coupled device sensor (e.g., 1 billion pixels), the resolving power of the purest glass would not permit image resolution to 1 millimeter pixels at 10,000 feet of altitude. This example is used to make the point that there are physical limits for pixel resolution in glass as well as pixel density limits for an imaging sensor.
The imaging sensor alignment of the system 100 in the rigid mount unit(s) affixed to the rigid mount plate, together with the related calibration techniques, provides a unique solution to this problem, as described above. By using these techniques, the resolving limitations of glass can effectively be overcome. For example, a single camera array results in 1 times (or no) oversampling benefit. However, two overlapping camera arrays result in a 4 times overall improvement in both image resolution and overall geospatial horizontal and vertical accuracy. Further, three overlapping camera arrays result in a 16 times overall improvement, four overlapping camera arrays result in a 64 times overall improvement, and so on.
As can be deduced from these examples, the equation for overall improvement is as follows:
overall improvement=4**N
where N is the number of overlapping camera arrays.
If there are four camera arrays, then there are three overlapping camera arrays (i.e., N=3). Accordingly, four camera arrays provide a 64 times (i.e., 4**3=64 times) overall improvement in both the image resolution and overall geospatial horizontal and vertical accuracy.
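A trivial sketch of this relationship (the helper name is hypothetical):

    def overall_improvement(total_camera_arrays):
        # N = number of overlapping camera arrays = total arrays - 1 (per the
        # four-array example above); overall improvement = 4**N.
        return 4 ** (total_camera_arrays - 1)

    # e.g. overall_improvement(2) -> 4, overall_improvement(4) -> 64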
Further, these subpixel calibration techniques may be combined with the self-locking flight path techniques, as disclosed in U.S. Publication No. 2004/0054488A1, now U.S. Pat. No. 7,212,938B2, the disclosure of which is hereby incorporated by reference in full.
In addition to fore and/or lateral co-mounted, co-registered oversampling as shown inFIGS. 19-21, the present invention may also employ flight line oversampling as well to further improve the image resolution, as shown inFIGS. 13-17. As shown inFIGS. 13-17, the flight lines overlap each other in an image region because each flight line is parallel to one another. These overlapping image regions may be used to calibrate the sensors by along-track and cross-track parallax of images in adjacent flight lines using stereographic techniques.
In an embodiment, the self-locking flight path may comprise any pattern that produces at least three substantially parallel travel lines out of a group of three or more travel lines. Further, at least one of the travel lines should be in an opposing direction to the other substantially parallel travel lines. In a preferred embodiment, the travel pattern comprises at least one pair of travel lines in a matching direction and at least one pair of travel lines in an opposing direction.
When using the self-locking flight path in opposite directions, the observable positional error may be doubled in some image regions. Accordingly, the self-locking flight path technique includes an algorithm to significantly reduce these positional errors. This reduction in positional errors is especially important in the outside, or far left and far right “wing” image areas, where the greatest positional errors occur.
In an embodiment, these positional improvements may be realized by using a pattern matching technique to automatically match a pixel pattern area obtained from a flight line (e.g., North/South) with the same pixel pattern area obtained from an adjacent flight line (e.g., North/South). In a preferred embodiment, the latitude/longitude coordinates from one or more GPS location systems may be used to accelerate this pattern matching process.
Similarly, these subpixel calibration and self-locking flight path techniques may be combined with stereographic techniques because stereographic techniques rely heavily on the positional accuracy of each pixel relative to all other pixels. In particular, these techniques improve the stereographic image resolution and overall geospatial horizontal and vertical accuracy, especially in the far left and far right “wing” image areas, where the greatest positional errors occur. Further, stereographic techniques are used to match known elevation data with the improved stereographic datasets. Accordingly, the combined subpixel calibration, self-locking flight path and stereographic techniques provide a greatly improved Digital Elevation Model, which results in a superior image.
Further, these subpixel calibration and self-locking flight path techniques may be used to provide a dynamic, RealTime calibration of thesystem100. In particular, these techniques provide the ability to rapidly “roll on” one or morecamera array assemblies112 onto thesystem100, to immediately begin collecting image data of a target area and to quickly produce high-quality images because the individual sensors have been initially calibrated in the rigid mount unit(s) affixed to the rigid mount plate, as discussed above. In particular, the camera sensors are co-registered to calibrate the physical mount angle offset of each sensor relative to each other and/or to the nadir camera. In an embodiment, multiple, i.e., at least two, rigid mount units are affixed to a rigid mount plate and are co-registered. This provides an initial, “close” calibration. These initial calibration parameters may be entered into anonboard computer system104 in thesystem100, and updated during flight using oversampling techniques, as discussed above.
In an embodiment, the system 100 comprises a RealTime, self-calibrating system to update the calibration parameters. In particular, the onboard computer 104 software comprises a RealTime software “daemon” (i.e., a background closed-loop monitoring software) to constantly monitor and update the calibration parameters using the co-mounted, co-registered oversampling and flight line oversampling techniques, as discussed above. In a preferred embodiment, the RealTime daemon combines subpixel calibration, self-locking flight path and stereographic techniques to improve the stereographic image resolution and overall geospatial horizontal and vertical accuracy. In particular, stereographic techniques are used to match known elevation data to the improved stereographic datasets. Accordingly, the combined subpixel calibration, self-locking flight path and stereographic techniques provide a greatly improved Digital Elevation Model, which results in a superior image.
In an embodiment, the system 100 comprises a RealTime GPS data system to provide GPS input data. Calibration accuracy is driven by input data from electronic devices such as a GPS and an IMU, and by calibration software which is augmented by industry standard GPS and IMU software systems. Accordingly, a key component of this RealTime, self-calibrating system is RealTime GPS input data delivered via a potentially low-bandwidth communication channel such as a satellite phone, cell phone, RF modem, or similar device. Potential sources for the RealTime GPS input data include project-controlled ad-hoc stations, fixed broadcast GPS locations (or similar), or inertial navigation via an onboard IMU.
The modules, algorithms and processes described above can be implemented in a number of technologies and configurations. Embodiments of the present invention may comprise functional instances of software or hardware, or combinations thereof. Furthermore, the modules and processes of the present invention may be combined together in a single functional instance (e.g., one software program), or may comprise operatively associated separate functional devices (e.g., multiple networked processor/memory blocks). All such implementations are comprehended by the present invention.
The embodiments and examples set forth herein are presented to best explain the present invention and its practical application and to thereby enable those skilled in the art to make and utilize the invention. However, those skilled in the art will recognize that the foregoing description and examples have been presented for the purpose of illustration and example only. The description as set forth is not intended to be exhaustive or to limit the invention to the precise form disclosed. Many modifications and variations are possible in light of the above teaching without departing from the spirit and scope of the following claims.

Claims (77)

What is claimed is:
1. A system for generating a map of a target area, comprising:
a global positioning receiver;
an imaging sensor system having a view of the target area, comprising:
a rigid mount unit having at least two imaging sensors disposed within the mount unit,
wherein a first imaging sensor and a second imaging sensor each has a focal axis passing through an aperture in the mount unit,
wherein the first imaging sensor generates a first image area comprising a first data array of pixels and the second imaging sensor generates a second image area comprising a second data array of pixels,
wherein the first and second imaging sensors are offset to have a first image overlap area in the target area,
wherein the first sensors image data bisects the second sensors image data in the first image overlap area; and
a computer in communication with a global positioning antenna, the first imaging sensor, and the second imaging sensor; correlating at least a portion of the image areas from the first imaging sensor and the second imaging sensor to a portion of the target area based on input from the global positioning antenna.
2. The system ofclaim 1 further comprising:
a third imaging sensor disposed within the mount unit, wherein the third imaging sensor has a focal axis passing through the aperture in the mount unit, wherein the third imaging sensor generates a third image area comprising a third data array of pixels.
3. The system ofclaim 2, further comprising:
a fourth imaging sensor disposed within the mount unit, wherein the fourth imaging sensor has a focal axis passing through the aperture in the mount unit, wherein the fourth imaging sensor generates a fourth image area comprising a fourth data array of pixels, wherein the third and fourth imaging sensors are offset to have a second image overlap area in the target area, wherein the third sensors image data bisects the fourth sensors image data in the second image overlap area.
4. The system ofclaim 3, wherein a first sensor array comprising the first and second image sensors and a second sensor array comprising the third and fourth image sensors are offset to have a third image overlap area in the target area, wherein the first sensor arrays image data bisects the second sensor arrays image data in the third overlap area.
5. The system ofclaim 3, wherein the first sensors arrays image data completely overlaps the second sensors arrays image data.
6. The system ofclaim 3, wherein third and fourth imaging sensors are selected from the group consisting of digital cameras, light detection and ranging (LIDAR), infrared, heat-sensing and gravitometers.
7. The system ofclaim 3, wherein the first and second imaging sensors are a digital camera and the third imaging sensor is a light detection and ranging (LIDAR).
8. The system ofclaim 2, wherein the third imaging sensor is selected from the group consisting of digital cameras, light detection and ranging (LIDAR), infrared, heat-sensing and gravitometers.
9. The system ofclaim 2, wherein the third imaging sensor is selected from the group consisting of a digital camera having a hyperspectral filter and a light detection and ranging (LIDAR).
10. The system ofclaim 2, wherein the first and second imaging sensors are a digital camera and the third imaging sensor is a light detection and ranging (LIDAR).
11. The system ofclaim 1, wherein the mount unit flexes less than 100th of a degree during operation.
12. The system ofclaim 11, wherein the mount unit flexes less than 1,000th of a degree during operation.
13. The system ofclaim 12, wherein the mount unit flexes less than 10,000th of a degree during operation.
14. The system ofclaim 1, wherein the first imaging sensor is calibrated relative to one or more attitude measuring devices selected from the group consisting of a gyroscope, an inertial measurement unit (IMU), and a global positioning system (GPS).
15. The system ofclaim 1, wherein the first and second imaging sensors are selected from the group consisting of digital cameras, light detection and ranging (LIDAR), infrared, heat-sensing and gravitometers.
16. An imaging sensor system comprising:
a mount unit in alignment with a target area, having at least two imaging sensors disposed within the mount unit, wherein a first imaging sensor and a second imaging sensor each has a focal axis passing through an aperture in the mount unit, wherein the first imaging sensor generates a first image area comprising a first data array of pixels and the second imaging sensor generates a second image area comprising a second data array of pixels, wherein the first and second imaging sensors are offset to have a first image overlap area in the target area, wherein the first sensors image data bisects the second sensors image data in the first image overlap area.
17. The system ofclaim 16 further comprising:
a third imaging sensor disposed within the mount unit, wherein the third imaging sensor has a focal axis passing through the aperture in the mount unit, wherein the third imaging sensor generates a third image area comprising a third data array of pixels.
18. The system ofclaim 17 further comprising:
a fourth imaging sensor disposed within the mount unit, wherein the fourth imaging sensor has a focal axis passing through the aperture in the mount unit, wherein the fourth imaging sensor generates a fourth image area comprising a fourth data array of pixels, wherein the third and fourth imaging sensors are offset to have a second image overlap area in the target area, wherein the third sensors image data bisects the fourth sensors image in the second image overlap area.
19. The system ofclaim 18, wherein a first sensors array comprising the first and the second image sensor and a second sensors array comprising the third and the fourth image sensor are offset to have a third image overlap area in the target area, wherein first sensor arrays image data bisects the second sensor arrays image data in the third image overlap area.
20. The system ofclaim 18, wherein the first sensors arrays image data completely overlaps the second sensors arrays image data.
21. The system ofclaim 18, wherein the third and fourth imaging sensors are selected from the group consisting of digital cameras, light detection and ranging (LIDAR), infrared, heat-sensing and gravitometers.
22. The system ofclaim 18, wherein the first and second imaging sensors are a digital camera and the third imaging sensor is a light detection and ranging (LIDAR).
23. The system ofclaim 17, wherein the third imaging sensor is selected from the group consisting of digital cameras, light detection and ranging (LIDAR), infrared, heat-sensing and gravitometers.
24. The system ofclaim 17, wherein the third imaging sensor is selected from the group consisting of a digital camera having a hyperspectral filter and a light detection and ranging (LIDAR).
25. The system ofclaim 17, wherein the first and second imaging sensors are a digital camera and the third imaging sensor is a light detection and ranging (LIDAR).
26. The system ofclaim 16, wherein the mount unit flexes less than 100th of a degree during operation.
27. The system ofclaim 26, wherein the mount unit flexes less than 1,000th of a degree during operation.
28. The system ofclaim 27, wherein the mount unit flexes less than 10,000th of a degree during operation.
29. The system ofclaim 16, wherein the first imaging sensor is calibrated relative to one or more attitude measuring devices selected from the group consisting of a gyroscope, an inertial measurement unit (IMU), and a global positioning receiver (GPS).
30. The system ofclaim 16, wherein the first and second imaging sensors are selected from the group consisting of digital cameras, light detection and ranging (LIDAR), infrared, heat-sensing and gravitometers.
31. A method of calibrating imaging sensors comprising the steps of:
performing an initial calibration of the imaging sensors comprising:
determining the position of an attitude measurement unit (AMU) selected from the group consisting of a gyroscope, an inertial measurement unit (IMU), and a global positioning system (GPS);
determining the position of a first imaging sensor within a rigid mount unit relative to the AMU;
determining the position of a second imaging sensor within the rigid mount unit relative to the AMU;
calibrating the first imaging sensor against a target area and determining a boresight angle of the first imaging sensor; and
calculating the position of one or more subsequent imaging sensors within the rigid mount unit relative to the first imaging sensor; and
calibrating the one or more subsequent imaging sensors using the boresight angle of the first imaging sensor; and
using oversampling techniques to update at least one initial calibration parameter of the first imaging sensor against a target area and the boresight angle of the first imaging sensor;
using oversampling techniques to update the position of one or more subsequent imaging sensors within the rigid mount unit relative to the first imaging sensor; and
updating at least one calibration parameter of one or more subsequent imaging sensors within the rigid mount using the updated boresight angle of the first imaging sensor.
32. The method ofclaim 31, wherein the initial calibration step further comprises the step of:
calibrating the second imaging sensor using the updated boresight angle of the first imaging sensor.
33. The method ofclaim 32, further comprising the step of:
using oversampling techniques to update the position of the second imaging sensor within the rigid mount unit relative to the first imaging sensor.
34. The method ofclaim 31, further comprising the steps of:
using flight line oversampling techniques to update the calibration of the first imaging sensor against a target area and the boresight angle of the first imaging sensor; and
using flight line oversampling techniques to update the position of one or more subsequent imaging sensors within the rigid mount unit relative to the first imaging sensor.
35. The method ofclaim 34, further comprising the steps of:
using flight line oversampling techniques to update the position of the second imaging sensor within the rigid mount unit relative to the first imaging sensor;
using flight line oversampling techniques to update the position of one or more subsequent imaging sensors within the rigid mount unit relative to the first imaging sensor; and
updating at least one calibration parameter of one or more subsequent imaging sensors within the rigid mount using the updated boresight angle of the first imaging sensor.
36. A system for generating a map of a surface, comprising:
a global position receiver;
a global positioning antenna;
an imaging array, having a view of the surface, comprising:
a mount unit;
an aperture, formed in the mount unit;
a first imaging sensor, coupled to the mount unit, having a first focal axis passing through the aperture, wherein the first image sensor generates a first image area of the surface comprising a first data array of pixels, wherein the first data array of pixels is at least two dimensional; and
a second imaging sensor, coupled to the mount unit and offset from the first imaging sensor, having a second focal axis passing through the aperture and intersecting the first focal axis, wherein the second imaging sensor generates a second image area of the surface comprising a second data array of pixels, wherein the second data array of pixels is at least two dimensional; and
a computer, connected to the global positioning antenna, and first and second imaging sensors; correlating at least a portion of the image area from the first and second imaging sensors to a portion of the surface based on input from the global positioning antenna.
37. The system of claim 36, further comprising a third imaging sensor, coupled to the mount unit and offset from the first imaging sensor, having a third focal axis passing through the aperture and intersecting the first focal axis within an intersection area.
38. The system of claim 37, wherein the focal axis of the third imaging sensor lies in a common plane with the focal axes of the first and second imaging sensors.
39. The system of claim 37, wherein the focal axes of the first and second imaging sensors lie in a first common plane and the focal axis of the third imaging sensor lies in a plane orthogonal to the first common plane.
40. A system for generating a map of a surface, comprising:
a global position receiver;
a global positioning antenna;
a first imaging sensor, having a view of the surface, having a focal axis disposed in the direction of the surface, wherein the first imaging sensor generates an image area comprising a first data array of pixels, wherein the first data array of pixels is at least two dimensional; and
a computer, connected to the global positioning antenna, and the first imaging sensor; generating a calculated longitude and calculated latitude value for a coordinate corresponding to at least one pixel in the array based on input from the global positioning antenna.
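Claim 40 has the computer generate a calculated longitude and latitude for a coordinate corresponding to a pixel. Assuming a nadir-pointing sensor and a locally flat earth, one simple way to picture that mapping is to scale the pixel's offset from the image center by the ground sample distance and convert the metric offsets to degrees. The function below is only such a sketch; the parameter names and the 111,320 metres-per-degree approximation are illustrative assumptions, not claim language.

```python
import math

def pixel_to_lat_lon(gps_lat, gps_lon, altitude_m, focal_length_m,
                     pixel_pitch_m, row, col, rows, cols):
    """Approximate latitude/longitude of one pixel for a nadir view over
    locally flat terrain (hypothetical parameters, for illustration)."""
    gsd = altitude_m * pixel_pitch_m / focal_length_m   # ground sample distance, m/pixel
    east_m = (col - cols / 2.0) * gsd                    # offset east of the nadir point
    north_m = (rows / 2.0 - row) * gsd                   # offset north of the nadir point
    lat = gps_lat + north_m / 111_320.0                  # ~metres per degree of latitude
    lon = gps_lon + east_m / (111_320.0 * math.cos(math.radians(gps_lat)))
    return lat, lon

# Example: 1,500 m above ground, 60 mm focal length, 9 micron pixels.
print(pixel_to_lat_lon(29.76, -95.37, 1500.0, 0.060, 9e-6, 0, 0, 4096, 4096))
```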
41. A system for generating a map of a target area, comprising:
a global position receiver;
a global positioning antenna;
an imaging sensor system, having a view of the target area, comprising:
a mount unit, having a first and second imaging sensor disposed within the mount unit, wherein the first and second imaging sensors each have a focal axis passing through an aperture in the mount unit, wherein the first imaging sensor generates a first image area comprising a first data array of pixels and the second imaging sensor generates a second image area comprising a second data array of pixels, wherein the first and second data arrays of pixels are at least two dimensional; and
a computer in communication with the global positioning antenna, the first imaging sensor, and the second imaging sensor; correlating at least a portion of the image area from the first imaging sensor and the second imaging sensor to a portion of the target area based on input from the global positioning antenna.
42. The system of claim 41, further comprising a third imaging sensor disposed within the mount unit, wherein the third imaging sensor has a focal axis passing through an aperture in the mount unit, wherein the third imaging sensor generates a third image area comprising a third data array of pixels.
43. An imaging sensor system comprising:
a mount unit, having first and second imaging sensors disposed within the mount unit, wherein the first and second imaging sensors each have a focal axis passing through an aperture in the mount unit, wherein the first imaging sensor generates a first image area comprising a first data array of pixels and the second imaging sensor generates a second image area comprising a second data array of pixels, wherein the first and second data arrays of pixels are at least two dimensional.
44. The system of claim 43, further comprising a third imaging sensor disposed within the mount unit, wherein the third imaging sensor has a focal axis passing through an aperture in the mount unit, wherein the third imaging sensor generates a third image area comprising a third data array of pixels.
45. A system for generating a map of a target area, comprising:
a global positioning receiver;
an imaging sensor system having a view of the target area, comprising:
a rigid mount unit having a first imaging sensor and a second imaging sensor,
wherein the first imaging sensor generates a first image area comprising a first data array of pixels and the second imaging sensor generates a second image area comprising a second data array of pixels,
wherein the first and second imaging sensors are offset to have a first image overlap area in the target area,
wherein the first data array of pixels bisects the second data array of pixels in the first image overlap area; and
a computer in communication with the global positioning receiver, the first imaging sensor, and the second imaging sensor, wherein at least a portion of the first image area from the first imaging sensor is correlated to a portion of the target area based on input from the global positioning receiver or other geographical positioning technique.
46. The system of claim 45, wherein the first image overlap area comprises at least one oversampling pattern.
47. The system of claim 46, wherein the at least one oversampling pattern comprises a plurality of pixels that have been positioned using an oversampling technique to a precision that is less than one pixel in magnitude.
48. The system of claim 47, wherein the mount unit flexes less than 100th of a degree during operation.
49. The system of claim 47, wherein the mount unit flexes less than 1,000th of a degree during operation.
50. The system of claim 47, wherein the mount unit flexes less than 10,000th of a degree during operation.
51. An imaging sensor system comprising:
a rigid mount unit in alignment with a target area;
a first imaging sensor rigidly connected to the mount unit, wherein the first imaging sensor generates a first image area comprising a first data array of pixels;
a second imaging sensor rigidly connected to the mount unit, wherein the second imaging sensor generates a second image area comprising a second data array of pixels;
wherein the first and second imaging sensors are offset to have a first image overlap area in the target area,
wherein the first data array of pixels bisects the second data array of pixels in the first image overlap area.
52. The system of claim 51, wherein the first image overlap area comprises at least one oversampling pattern.
53. The system of claim 52, wherein the at least one oversampling pattern comprises a plurality of pixels that have been positioned using an oversampling technique to a precision that is less than one pixel in magnitude.
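Claims 47 and 53 recite pixels positioned to a precision finer than one pixel by means of an oversampling technique. Purely as an illustration (the claims do not prescribe any particular algorithm), one generic way to obtain such sub-pixel registration in an overlap area is correlation of the two overlapping patches followed by a parabolic refinement of the correlation peak:

```python
import numpy as np

def subpixel_shift(patch_a, patch_b):
    """Return the (row, col) shift that re-aligns patch_b onto patch_a,
    refined to a fraction of a pixel: FFT-based correlation, then a
    three-point parabolic fit around the correlation peak."""
    corr = np.fft.ifft2(np.fft.fft2(patch_a) * np.conj(np.fft.fft2(patch_b))).real
    peak_r, peak_c = np.unravel_index(np.argmax(corr), corr.shape)
    shift = []
    for axis, idx in ((0, peak_r), (1, peak_c)):
        n = corr.shape[axis]
        line = corr[:, peak_c] if axis == 0 else corr[peak_r, :]
        y0, y1, y2 = line[(idx - 1) % n], line[idx], line[(idx + 1) % n]
        denom = y0 - 2.0 * y1 + y2
        frac = 0.0 if denom == 0 else 0.5 * (y0 - y2) / denom
        pos = idx + frac
        shift.append(pos if pos <= n / 2 else pos - n)   # wrap to a signed shift
    return tuple(shift)

rng = np.random.default_rng(0)
patch = rng.random((64, 64))
shifted = np.roll(patch, (3, -2), axis=(0, 1))           # known integer displacement
print(subpixel_shift(patch, shifted))                    # ~(-3.0, 2.0): undoes the (3, -2) shift
```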
54. A method of calibrating imaging sensors comprising the steps of:
performing an initial calibration of the imaging sensors comprising:
determining the position of an attitude measurement unit (AMU) selected from the group consisting of a gyroscope, an inertial measurement unit (IMU), and a global positioning system (GPS);
determining the position of a first imaging sensor relative to the AMU;
calibrating the first imaging sensor against a target area and determining a boresight angle of the first imaging sensor; and
determining the position of the second imaging sensor relative to the first imaging sensor; and
calibrating the second imaging sensor using the boresight angle of the first imaging sensor; and
using oversampling techniques to update at least one initial calibration parameter of the first imaging sensor against a target area and the boresight angle of the first imaging sensor;
using oversampling techniques to update the position of the second imaging sensor relative to the first imaging sensor; and
updating at least one calibration parameter of the second imaging sensor using the updated boresight angle of the first imaging sensor.
55. The method of claim 54, wherein the oversampling techniques comprise flight line oversampling.
56. The method of claim 54, wherein the oversampling techniques comprise lateral oversampling.
57. The method of claim 54, wherein the oversampling techniques comprise generating at least one pixel that has been positioned using the oversampling techniques to a precision that is less than one pixel in magnitude.
58. The method of claim 54, wherein the oversampling techniques comprise generating a plurality of pixels that have been positioned using the oversampling techniques to a precision that is less than one pixel in magnitude.
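Claims 55 and 56 distinguish flight line oversampling from lateral oversampling. Read geometrically, lateral oversampling is across-track overlap between the two sensors' footprints on a single pass, while flight line oversampling is along-track overlap between successive exposures or adjacent flight lines. The figures below are assumptions chosen only to make that distinction concrete, expressing both as an overlap fraction:

```python
def overlap_fraction(footprint_m, spacing_m):
    """Fraction of one footprint covered by its neighbour."""
    return max(0.0, (footprint_m - spacing_m) / footprint_m)

# Hypothetical geometry: two sensors 450 m apart across-track, each imaging
# a 900 m swath; exposures triggered every 120 m along a 600 m frame.
lateral = overlap_fraction(900.0, 450.0)        # across-track (between sensors)
flight_line = overlap_fraction(600.0, 120.0)    # along-track (between exposures)
print("lateral oversampling fraction:    ", lateral)
print("flight line oversampling fraction:", flight_line)
```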
59. A system for generating an image of a surface, comprising:
a global position receiver;
an imaging array, having a view of the surface, comprising:
a mount unit;
a first imaging sensor, coupled to the mount unit, having a first focal axis passing through an aperture in the mount unit,
wherein the first imaging sensor generates a first image area of the surface comprising a first data array of pixels,
wherein the first data array of pixels is at least two dimensional; and
a second imaging sensor, coupled to the mount unit and offset from the first imaging sensor, having a second focal axis passing through an aperture in the mount unit,
wherein the second imaging sensor generates a second image area of the surface comprising a second data array of pixels,
wherein the second data array of pixels is at least two dimensional; and
a computer, connected to the global position receiver, the first imaging sensor, and the second imaging sensor, wherein at least a portion of the first image area from the first imaging sensor is correlated to a portion of the surface based on input from the global position receiver.
60. The system of claim 59, wherein the mount unit flexes less than 100th of a degree during operation.
61. The system of claim 59, wherein the mount unit flexes less than 1,000th of a degree during operation.
62. The system of claim 59, wherein the mount unit flexes less than 10,000th of a degree during operation.
63. A system for generating an image of a surface, comprising:
a global position receiver;
a first imaging sensor adapted to view a surface and disposed at least partially in a mount unit, wherein the mount unit flexes less than 100th of a degree during operation,
wherein the first imaging sensor generates a first image area comprising a first data array of pixels,
wherein the first data array of pixels is at least two dimensional; and
a computer connected to the global position receiver and the first imaging sensor, wherein a calculated longitude value and a calculated latitude value are generated for at least one pixel in the first data array of pixels based on input from the global position receiver.
64. The system of claim 63, wherein the mount unit flexes less than 1,000th of a degree during operation.
65. The system of claim 63, wherein the mount unit flexes less than 10,000th of a degree during operation.
66. A system for generating an image, comprising:
a rigid mount unit;
a first imaging sensor disposed at least partially within the rigid mount unit, wherein the first imaging sensor generates a first image area comprising a first two-dimensional data array of pixels;
a second imaging sensor disposed at least partially within the rigid mount unit, wherein the second imaging sensor generates a second image area comprising a second two-dimensional data array of pixels;
wherein the first imaging sensor and the second imaging sensor are offset such that the first image area overlaps with the second image area to form a first image overlap area;
a computer in communication with the first imaging sensor and the second imaging sensor;
a mosaicking module associated with the computer for balancing the color of the second two-dimensional data array of pixels based on the average intensity of green-dominant pixels in the first two-dimensional data array of pixels.
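Claim 66 recites a mosaicking module that balances the color of the second pixel array against the average intensity of green-dominant pixels in the first. One plausible reading, sketched below under the assumed definitions of a "green exceeds red and blue" pixel test and a single multiplicative gain (neither of which the claim specifies), is:

```python
import numpy as np

def green_dominant_mask(rgb):
    """Pixels whose green channel exceeds both red and blue (assumed test)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return (g > r) & (g > b)

def balance_to_reference(reference_rgb, target_rgb):
    """Scale the target image so its green-dominant pixels match the average
    intensity of the reference image's green-dominant pixels."""
    ref_mean = reference_rgb[green_dominant_mask(reference_rgb)].mean()
    tgt_mean = target_rgb[green_dominant_mask(target_rgb)].mean()
    return np.clip(target_rgb * (ref_mean / tgt_mean), 0.0, 1.0)

rng = np.random.default_rng(1)
reference = rng.random((128, 128, 3))
target = np.clip(reference * 0.8, 0.0, 1.0)       # darker rendering of the same scene
balanced = balance_to_reference(reference, target)
print(round(float(balanced.mean() / reference.mean()), 3))   # ~1.0 after balancing
```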
67. The system of claim 66, wherein the first image overlap area comprises at least one oversampling pattern.
68. The system of claim 67, wherein the at least one oversampling pattern comprises a plurality of pixels that have been positioned using an oversampling technique to a precision that is less than one pixel in magnitude.
69. The system of claim 66, wherein the mount unit flexes less than 100th of a degree during operation.
70. The system of claim 66, wherein the mount unit flexes less than 1,000th of a degree during operation.
71. The system of claim 66, wherein the mount unit flexes less than 10,000th of a degree during operation.
72. An imaging sensor system comprising:
a rigid mount unit;
a first imaging sensor disposed at least partially within the rigid mount unit, wherein the first imaging sensor generates a first image area comprising a first two-dimensional data array of pixels;
a second imaging sensor disposed at least partially within the rigid mount unit, wherein the second imaging sensor generates a second image area comprising a second two-dimensional data array of pixels;
wherein the first imaging sensor and the second imaging sensor are offset such that the first image area overlaps with the second image area to form a first image overlap area;
a computer in communication with the first imaging sensor and the second imaging sensor; and
an intensity balancing module associated with the computer for balancing the intensity of the first two-dimensional data array of pixels and the second two-dimensional data array of pixels using a balancing correlation matrix.
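Claim 72 does not define its "balancing correlation matrix". One assumed interpretation, offered only as a sketch, treats it as a per-band linear map fitted by least squares over the overlap area so that the second sensor's intensities are carried onto the first sensor's:

```python
import numpy as np

def fit_balancing_matrix(overlap_ref, overlap_tgt):
    """Least-squares 3x3 matrix M such that overlap_tgt @ M ~ overlap_ref,
    with both overlaps given as (N, 3) arrays of band intensities."""
    m, _, _, _ = np.linalg.lstsq(overlap_tgt, overlap_ref, rcond=None)
    return m

rng = np.random.default_rng(2)
ref = rng.random((500, 3))                                    # first sensor, overlap area
tgt = ref @ np.diag([1.1, 0.9, 1.05]) + 0.01 * rng.standard_normal((500, 3))
M = fit_balancing_matrix(ref, tgt)
balanced = tgt @ M                                            # second sensor, after balancing
print(np.round(M, 2), float(np.abs(balanced - ref).mean()))
```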
73. The system of claim 72, wherein the first image overlap area comprises at least one oversampling pattern.
74. The system of claim 73, wherein the at least one oversampling pattern comprises a plurality of pixels that have been positioned using an oversampling technique to a precision that is less than one pixel in magnitude.
75. The system of claim 72, wherein the mount unit flexes less than 100th of a degree during operation.
76. The system of claim 72, wherein the mount unit flexes less than 1,000th of a degree during operation.
77. The system of claim 72, wherein the mount unit flexes less than 10,000th of a degree during operation.

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
US16/661,868 (USRE49105E1) | 2002-09-20 | 2019-10-23 | Self-calibrated, remote imaging and data processing system

Applications Claiming Priority (7)

Application Number | Priority Date | Filing Date | Title
US41250402P | 2002-09-20 | 2002-09-20 | —
US10/664,737 (US7127348B2) | 2002-09-20 | 2003-09-18 | Vehicle based data collection and processing system
US11/581,235 (US7725258B2) | 2002-09-20 | 2006-10-11 | Vehicle based data collection and processing system and imaging sensor system and methods thereof
US12/798,899 (US8483960B2) | 2002-09-20 | 2010-04-13 | Self-calibrated, remote imaging and data processing system
US13/772,994 (US9389298B2) | 2002-09-20 | 2013-02-21 | Self-calibrated, remote imaging and data processing system
US15/200,883 (US9797980B2) | 2002-09-20 | 2016-07-01 | Self-calibrated, remote imaging and data processing system
US16/661,868 (USRE49105E1) | 2002-09-20 | 2019-10-23 | Self-calibrated, remote imaging and data processing system

Related Parent Applications (1)

Application Number | Title | Priority Date | Filing Date
US15/200,883 (US9797980B2, Reissue) | Self-calibrated, remote imaging and data processing system | 2002-09-20 | 2016-07-01

Publications (1)

Publication Number | Publication Date
USRE49105E1 | 2022-06-14

Family

ID=81926290

Family Applications (1)

Application Number | Status | Publication | Filing Date | Title
US16/661,868 | Expired - Lifetime | USRE49105E1 | 2019-10-23 | Self-calibrated, remote imaging and data processing system

Country Status (1)

Country | Link
US (1) | USRE49105E1

Cited By (2)

* Cited by examiner, † Cited by third party
Publication Number | Priority Date | Publication Date | Assignee | Title
US20200294620A1 * | 2017-10-04 | 2020-09-17 | KWS SAAT SE & Co. KGaA | Method and system for performing data analysis for plant phenotyping
US20220342161A1 * | 2021-04-21 | 2022-10-27 | Panasonic Intellectual Property Management Co., Ltd. | Optical measuring device, assembling device of mounting substrate, and assembling method for mounting substrate

Patent Citations (202)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US1699136A (en)1929-01-15eliel
US1910425A (en)1928-07-261933-05-23Brock & Weymouth IncMethod of making maps
US2036062A (en)1935-07-191936-03-31Fairchild Aerial Camera CorpCamera positioning device
US2104976A (en)1936-02-291938-01-11Leon T ElielApparatus for aerial photography
US2433534A (en)1944-04-141947-12-30Chicago Aerial Survey CompanyStereoscopic camera
US2720029A (en)1952-09-221955-10-11Fairchild Aerial Surveys IncPhotogrammetric apparatus
US2747012A (en)1953-04-101956-05-22Vitarama CorpClosed link electronic camera chain
US2896501A (en)1953-05-281959-07-28Faximile IncApparatus for outlining contours
US2988953A (en)1957-11-291961-06-20Photographic Analysis IncApparatus for contour plotting
US2955518A (en)1958-07-031960-10-11Texas Instruments IncAerial camera
US3109057A (en)1960-12-081963-10-29Singer Inc H R BStereo scanning unit and system
US3527880A (en)1967-01-251970-09-08Mc Donnell Douglas CorpPseudo stereo-optical observation means
US3518929A (en)1967-03-061970-07-07Gen ElectricThree dimensional camera
US4217607A (en)1974-05-071980-08-12Societe Anonyme De TelecommunicationsProcess and device for the instantaneous display of a countryside scanned by a camera of the single line scanning type
DE2811428A1 (en)1978-03-161979-09-20Bosch Gmbh RobertHeadlights unit for motor vehicle - has moulded plastics vibration damping sealing ring between lens and body
US4398195A (en)1979-07-021983-08-09Del Norte Technology, Inc.Method of and apparatus for guiding agricultural aircraft
US4313678A (en)1979-09-241982-02-02The United States Of America As Represented By The Secretary Of The InteriorAutomated satellite mapping system (MAPSAT)
US4689748A (en)1979-10-091987-08-25Messerschmitt-Bolkow-Blohm Gesellschaft Mit Beschrankter HaftungDevice for aircraft and spacecraft for producing a digital terrain representation
US6204799B1 (en)1980-05-272001-03-20William J. Caputi, Jr.Three dimensional bistatic imaging radar processing for independent transmitter and receiver flightpaths
US4322741A (en)1980-08-261982-03-30Jun KawabayashiImage dividing system for use in television
US4504914A (en)1980-11-191985-03-12Messerschmitt-Bolkow-Blohm Gesellschaft Mit Beschrankter HaftungPhotogrammetric device for aircraft and spacecraft for producing a digital terrain representation
US4583703A (en)1982-03-171986-04-22The United States Of America As Represented By The Secretary Of The ArmyOne fin orientation and stabilization device
US5308022A (en)1982-04-301994-05-03Cubic CorporationMethod of generating a dynamic display of an aircraft from the viewpoint of a pseudo chase aircraft
US4708472A (en)1982-05-191987-11-24Messerschmitt-Bolkow-Blohm GmbhStereophotogrammetric surveying and evaluation method
US4543603A (en)1982-11-301985-09-24Societe Nationale Industrielle Et AerospatialeReconnaissance system comprising an air-borne vehicle rotating about its longitudinal axis
US4686474A (en)1984-04-051987-08-11Deseret Research, Inc.Survey system for collection and real time processing of geophysical data
US4814711A (en)1984-04-051989-03-21Deseret Research, Inc.Survey system and method for real time collection and processing of geophysicals data using signals from a global positioning satellite network
US4750810A (en)1985-11-081988-06-14British Telecommunications PlcCamera optics for producing a composite image from two scenes
US4650305A (en)1985-12-191987-03-17HineslabCamera mounting apparatus
US4712010A (en)1986-01-301987-12-08Hughes Aircraft CompanyRadiator scanning with image enhancement and noise reduction
US5104217A (en)1986-03-171992-04-14Geospectra CorporationSystem for determining and controlling the attitude of a moving airborne or spaceborne platform or the like
US4724449A (en)1986-03-251988-02-09Douglas WrightMethod and apparatus for stereoscopic photography
US4757378A (en)1986-09-301988-07-12The Boeing CompanyMonocular scene generator for biocular wide field of view display system
US4754327A (en)1987-03-201988-06-28Honeywell, Inc.Single sensor three dimensional imaging
US4764008A (en)1987-11-191988-08-16Wren Clifford TSurveillance housing assembly
US4887779A (en)1987-12-011989-12-19The Boeing CompanyRoll drum sensor housing having sliding window
US4951136A (en)1988-01-261990-08-21Deutsche Forschungs- Und Versuchsanstalt Fur Luft- Und Raumfahrt E.V.Method and apparatus for remote reconnaissance of the earth
US4965572A (en)1988-06-101990-10-23Turbulence Prediction SystemsMethod for producing a warning of the existence of low-level wind shear and aircraftborne system for performing same
US5013917A (en)1988-07-071991-05-07Kaman Aerospace CorporationImaging lidar system using non-visible light
US5027199A (en)1988-09-201991-06-25Nec CorporationImage pickup system capable of obtaining a plurality of stereo images with different base height ratios
JPH0728400Y2 (en)1988-10-071995-06-28文化シャッター株式会社 Shutter winding device
US4935629A (en)1988-10-241990-06-19Honeywell Inc.Detector array for high V/H infrared linescanners
US4956705A (en)1989-03-101990-09-11Dimensional Visions GroupElectronic method and apparatus for stereoscopic photography
US5029009A (en)1989-05-081991-07-02Kaman Aerospace CorporationImaging camera with adaptive range gating
US5193124A (en)1989-06-291993-03-09The Research Foundation Of State University Of New YorkComputational methods and electronic camera apparatus for determining distance of objects, rapid autofocusing, and obtaining improved focus images
US5045937A (en)1989-08-251991-09-03Space Island Products & Services, Inc.Geographical surveying using multiple cameras to obtain split-screen images with overlaid geographical coordinates
US5166789A (en)1989-08-251992-11-24Space Island Products & Services, Inc.Geographical surveying using cameras in combination with flight computers to obtain images with overlaid geographical coordinates
US5266799A (en)1989-09-151993-11-30State Of Israel, Ministry Of Energy & InfastructureGeophysical survey system
US4964721A (en)1989-10-121990-10-23Kaman Aerospace CorporationImaging lidar system
US5262953A (en)1989-10-311993-11-16Agence Spatiale EuropeenneMethod of rectifying images from geostationary meteorological satellites in real time
US5231401A (en)1990-08-101993-07-27Kaman Aerospace CorporationImaging lidar system
US5249034A (en)1991-01-291993-09-28Toyo Glass Co., Ltd.Method of and apparatus for inspecting end of object for defect
US5259037A (en)1991-02-071993-11-02Hughes Training, Inc.Automated video imagery database generation using photogrammetry
US5276321A (en)1991-04-151994-01-04Geophysical & Environmental Research Corp.Airborne multiband imaging spectrometer
US5347539A (en)1991-04-151994-09-13Codex CorporationHigh speed two wire modem
US5371358A (en)1991-04-151994-12-06Geophysical & Environmental Research Corp.Method and apparatus for radiometric calibration of airborne multiband imaging spectrometer
US5450125A (en)1991-04-241995-09-12Kaman Aerospace CorporationSpectrally dispersive imaging lidar system
US5555018A (en)1991-04-251996-09-10Von Braun; Heiko S.Large-scale mapping of parameters of multi-dimensional structures in natural environments
US5187754A (en)1991-04-301993-02-16General Electric CompanyForming, with the aid of an overview image, a composite image from a mosaic of images
US5138444A (en)1991-09-051992-08-11Nec CorporationImage pickup system capable of producing correct image signals of an object zone
EP0494700A2 (en)1991-10-111992-07-15Kaman Aerospace CorporationImaging lidar system for shallow and coastal water
US5647015A (en)1991-12-111997-07-08Texas Instruments IncorporatedMethod of inferring sensor attitude through multi-feature tracking
US5198657A (en)1992-02-051993-03-30General AtomicsIntegrated imaging and ranging lidar receiver
US5247356A (en)1992-02-141993-09-21Ciampa John AMethod and apparatus for mapping and measuring land
US5332968A (en)1992-04-211994-07-26University Of South FloridaMagnetic resonance imaging color composites
US5317394A (en)1992-04-301994-05-31Westinghouse Electric Corp.Distributed aperture imaging and tracking system
US5379065A (en)1992-06-221995-01-03The United States Of America As Represented By The Administrator Of The National Aeronautics And Space AdministrationProgrammable hyperspectral image mapper with on-array processing
US5471056A (en)1992-09-251995-11-28Texaco Inc.Airborne scanner image spectrometer
US5625409A (en)1992-10-141997-04-29Matra Cap SystemesHigh resolution long-range camera for an airborne platform
US5414462A (en)1993-02-111995-05-09Veatch; John W.Method and apparatus for generating a comprehensive survey map
US5721611A (en)1993-02-151998-02-24E.M.S. Technik, GmbhPhotogrammetric camera, in particular for photogrammetric measurements of technical objects
US5517419A (en)1993-07-221996-05-14Synectics CorporationAdvanced terrain mapping system
US5734507A (en)1993-11-291998-03-31Hadland Photonics LimitedOptical beam splitter and electronic high speed camera incorporating such a beam splitter
GB2284273A (en)1993-11-291995-05-31Hadland Photonics LimitedOptical beam splitter and electronic high speed camera incorporating it
US5765044A (en)1993-12-131998-06-09Core Corp.Airborne photographing apparatus
US5467271A (en)1993-12-171995-11-14Trw, Inc.Mapping and analysis system for precision farming applications
US5815314A (en)1993-12-271998-09-29Canon Kabushiki KaishaImage display apparatus and image display method
US5633946A (en)1994-05-191997-05-27Geospan CorporationMethod and apparatus for collecting and processing visual and spatial position information from a moving platform
JPH0830194A (en)1994-07-111996-02-02Mitsubishi Precision Co LtdMethod for forming geospecific texture
US5448936A (en)1994-08-231995-09-12Hughes Aircraft CompanyDestruction of underwater objects
US5557397A (en)1994-09-211996-09-17Airborne Remote Mapping, Inc.Aircraft-based topographical data collection and processing system
US5639964A (en)1994-10-241997-06-17Djorup; Robert S.Thermal anemometer airstream turbulent energy detector
US6393163B1 (en)1994-11-142002-05-21Sarnoff CorporationMosaic based image processing system
US5596494A (en)1994-11-141997-01-21Kuo; ShihjongMethod and apparatus for acquiring digital maps
US5426476A (en)1994-11-161995-06-20Fussell; James C.Aircraft video camera mount
US5999211A (en)1995-05-241999-12-07Imageamerica, Inc.Direct digital airborne panoramic camera system and method
US5604534A (en)1995-05-241997-02-18Omni Solutions International, Ltd.Direct digital airborne panoramic camera system and method
US5668593A (en)1995-06-071997-09-16Recon/Optical, Inc.Method and camera system for step frame reconnaissance with motion compensation
JPH08335298A (en)1995-06-081996-12-17Mitsubishi Electric Corp Vehicle driving support device
US5878356A (en)1995-06-141999-03-02Agrometrics, Inc.Aircraft based infrared mapping system for earth based resources
US5963664A (en)1995-06-221999-10-05Sarnoff CorporationMethod and system for image combination using a parallax-based technique
US6211906B1 (en)1995-09-072001-04-03Flight Landata, Inc.Computerized component variable interference filter imaging spectrometer system method and apparatus
US5790188A (en)1995-09-071998-08-04Flight Landata, Inc.Computer controlled, 3-CCD camera, airborne, variable interference filter imaging spectrometer system
US6282362B1 (en)*1995-11-072001-08-28Trimble Navigation LimitedGeographical position/image digital recording and display system
US6055012A (en)1995-12-292000-04-25Lucent Technologies Inc.Digital multi-view video compression with complexity and compatibility constraints
US5894323A (en)1996-03-221999-04-13Tasc, Inc,Airborne imaging system using global positioning system (GPS) and inertial measurement unit (IMU) data
US5798786A (en)1996-05-071998-08-25Recon/Optical, Inc.Electro-optical imaging detector array for a moving vehicle which includes two axis image motion compensation and transfers pixels in row directions and column directions
US5982951A (en)1996-05-281999-11-09Canon Kabushiki KaishaApparatus and method for combining a plurality of images
US5953054A (en)1996-05-311999-09-14Geo-3D Inc.Method and system for producing stereoscopic 3-dimensional images
US6075905A (en)1996-07-172000-06-13Sarnoff CorporationMethod and apparatus for mosaic image construction
US6005987A (en)1996-10-171999-12-21Sharp Kabushiki KaishaPicture image forming apparatus
US5872590A (en)1996-11-111999-02-16Fujitsu Ltd.Image display apparatus and method for allowing stereoscopic video image to be observed
US6173087B1 (en)1996-11-132001-01-09Sarnoff CorporationMulti-view image registration with application to mosaicing and lens distortion correction
US5937212A (en)1996-11-151999-08-10Canon Kabushiki KaishaImage pickup apparatus
DE19714396A1 (en)1997-04-081998-10-15Zeiss Carl FaPhotogrammetric camera used in aircraft or satellite
US6473119B1 (en)1997-04-082002-10-29Carl-Zeiss-StiftungPhotogrammetic camera
US20020085094A1 (en)1997-04-082002-07-04Teuchert Wolf DieterPhotogrammetric camera
US6597818B2 (en)1997-05-092003-07-22Sarnoff CorporationMethod and apparatus for performing geo-spatial registration of imagery
US6002815A (en)1997-07-161999-12-14Kinetic Sciences, Inc.Linear sensor imaging method and apparatus
US6078701A (en)1997-08-012000-06-20Sarnoff CorporationMethod and apparatus for performing local to global multiframe alignment to construct mosaic images
US5886821A (en)1997-10-021999-03-23Fresnel Technologies, Inc.Lens assembly for miniature motion sensor
WO1999018732A1 (en)1997-10-061999-04-15Ciampa John ADigital-image mapping
US6434280B1 (en)1997-11-102002-08-13Gentech CorporationSystem and method for generating super-resolution-enhanced mosaic images
EP1069547A1 (en)1997-12-252001-01-17Toyota Jidosha Kabushiki KaishaMethod and apparatus for processing digital map data
WO1999034346A1 (en)1997-12-251999-07-08Toyota Jidosha Kabushiki KaishaMethod and apparatus for processing digital map data
US7006132B2 (en)1998-02-252006-02-28California Institute Of TechnologyAperture coded camera for three dimensional imaging
US6281970B1 (en)1998-03-122001-08-28Synergistix LlcAirborne IR fire surveillance system providing firespot geopositioning
US6353409B1 (en)1998-05-042002-03-05Trimble Navigation LimitedGPS guidance system for use with circular cultivated agricultural fields
US6087984A (en)1998-05-042000-07-11Trimble Navigation LimitedGPS guidance system for use with circular cultivated agricultural fields
US6323858B1 (en)1998-05-132001-11-27Imove Inc.System for digitally capturing and recording panoramic movies
US6125329A (en)1998-06-172000-09-26Earth Satellite CorporationMethod, system and programmed medium for massive geodetic block triangulation in satellite imaging
US6130705A (en)1998-07-102000-10-10Recon/Optical, Inc.Autonomous electro-optical framing camera system with constant ground resolution, unmanned airborne vehicle therefor, and methods of use
US6570612B1 (en)1998-09-212003-05-27Bank One, Na, As Administrative AgentSystem and method for color normalization of board images
US6611289B1 (en)1999-01-152003-08-26Yanbin YuDigital cameras using multiple sensors with multiple lenses
US6356646B1 (en)*1999-02-192002-03-12Clyde H. SpencerMethod for creating thematic maps using segmentation of ternary diagrams
US6282301B1 (en)1999-04-082001-08-28The United States Of America As Represented By The Secretary Of The ArmyAres method of sub-pixel target detection
US6209834B1 (en)1999-04-122001-04-03Verimap Plus Inc.Optical imaging mount apparatus
CA2268611A1 (en)1999-04-132000-10-13Richard J. PollockCamera mount for aerial photography
US6456938B1 (en)1999-07-232002-09-24Kent Deon BarnardPersonal dGPS golf course cartographer, navigator and internet web site with map exchange and tutor
US6694064B1 (en)*1999-11-192004-02-17Positive Systems, Inc.Digital aerial image mosaic method and apparatus
US6711475B2 (en)2000-03-162004-03-23The Johns Hopkins UniversityLight detection and ranging (LIDAR) mapping system
US6422508B1 (en)2000-04-052002-07-23Galileo Group, Inc.System for robotic control of imaging data having a steerable gimbal mounted spectral sensor and methods
US7019777B2 (en)2000-04-212006-03-28Flight Landata, Inc.Multispectral imaging system with spatial resolution enhancement
US7184072B1 (en)2000-06-152007-02-27Power View Company, L.L.C.Airborne inventory and inspection system and apparatus
US6834163B2 (en)2000-07-142004-12-21Z/I Imaging GmbhCamera system having at least two first cameras and two second cameras
WO2002006892A2 (en)2000-07-142002-01-24Z/I Imaging GmbhCamera system with at least two first and second cameras
US20030138247A1 (en)2000-07-142003-07-24Michael TrunzCamera system having at least two first cameras and two second cameras
US20020060784A1 (en)2000-07-192002-05-23Utah State University3D multispectral lidar
US6664529B2 (en)2000-07-192003-12-16Utah State University3D multispectral lidar
EP1178283A1 (en)2000-07-312002-02-06CO.RI.AL. Consorzio Ricerche Alimentari S.C.p.A.Airborne spectroscopic digital camera imaging system
WO2002012830A1 (en)2000-08-072002-02-14Bae Systems PlcHeight measurement apparatus
US6694094B2 (en)2000-08-312004-02-17Recon/Optical, Inc.Dual band framing reconnaissance camera
US6826358B2 (en)2000-08-312004-11-30Recon/Optical, Inc.Dual band hyperspectral framing reconnaissance camera
EP1189021A1 (en)2000-09-132002-03-20Roke Manor Research LimitedImprovements in or relating to camera systems
US6553311B2 (en)2000-12-082003-04-22Trimble Navigation LimitedNavigational off- line and off-heading indication system and method
US20020101438A1 (en)2001-01-312002-08-01Harris CorporationSystem and method for identifying tie point collections used in imagery
EP1231780A3 (en)2001-02-072004-01-14Sony CorporationImage pickup apparatus
EP1231780A2 (en)2001-02-072002-08-14Sony CorporationImage pickup apparatus
WO2002065155A1 (en)2001-02-092002-08-22Commonwealth Scientific And Industrial Research OrganisationLidar system and method
US20030198364A1 (en)*2001-03-222003-10-23Yonover Robert N.Video search and rescue device
US6597991B1 (en)2001-03-282003-07-22Agrosense Ltd.System and method for remote monitoring of water stress status of growing crops
US6542831B1 (en)2001-04-182003-04-01Desert Research InstituteVehicle particulate sensor system
US7009638B2 (en)2001-05-042006-03-07Vexcel Imaging GmbhSelf-calibrating, digital, large format camera with single or multiple detector arrays and single or multiple optical systems
US20020163582A1 (en)2001-05-042002-11-07Gruber Michael A.Self-calibrating, digital, large format camera with single or mulitiple detector arrays and single or multiple optical systems
US7339614B2 (en)2001-05-042008-03-04Microsoft CorporationLarge format camera system with multiple coplanar focusing systems
US20030210336A1 (en)2001-05-092003-11-13Sal KhanSecure access camera and method for camera control
US6839972B2 (en)*2001-06-152005-01-11Snap-On IncorporatedSelf-calibrating position determination system
US6526352B1 (en)2001-07-192003-02-25Intelligent Technologies International, Inc.Method and arrangement for mapping a road
US20030048357A1 (en)2001-08-292003-03-13Geovantage, Inc.Digital imaging system for airborne applications
US20040257441A1 (en)2001-08-292004-12-23Geovantage, Inc.Digital imaging system for airborne applications
US6747686B1 (en)2001-10-052004-06-08Recon/Optical, Inc.High aspect stereoscopic mode camera and method
US20030081827A1 (en)2001-10-302003-05-01Eastman Kodak CompanySuperimposing graphic representations of ground locations onto ground location images after detection of failures
US20030169259A1 (en)2002-03-082003-09-11Lavelle Michael G.Graphics data synchronization with multiple data paths in a graphics accelerator
US6781707B2 (en)2002-03-222004-08-24Orasee Corp.Multi-spectral display
US6771208B2 (en)2002-04-242004-08-03Medius, Inc.Multi-sensor system
US6766226B2 (en)2002-05-162004-07-20Andersen Aeronautical Technologies, Ltd.Method of monitoring utility lines with aircraft
US7006709B2 (en)2002-06-152006-02-28Microsoft CorporationSystem and method deghosting mosaics using multiperspective plane sweep
US20090295924A1 (en)2002-08-282009-12-03M7 Visual Intelligence, L.P.Retinal concave array compound camera system
WO2004021692A2 (en)2002-08-282004-03-11M7 Visual Intelligence, LpRetinal array compound camera system
US20040041914A1 (en)2002-08-282004-03-04Peters Leo J.Retinal array compound camera system
US7212938B2 (en)*2002-09-172007-05-01M7 Visual Intelligence, LpMethod of using a self-locking travel pattern to achieve calibration of remote sensors using conventionally collected data
US20040054488A1 (en)2002-09-172004-03-18Mai Tuy VuMethod of using a self-locking travel pattern to achieve calibration of remote sensors using conventionally collected data
WO2004028134A2 (en)2002-09-202004-04-01M7 Visual Intelligence, LpVehicule based data collection and porcessing system
CA2534968A1 (en)2002-09-202004-04-01M7 Visual Intelligence, LpVehicle based data collection and processing system
US7127348B2 (en)2002-09-202006-10-24M7 Visual Intelligence, LpVehicle based data collection and processing system
CA2534968C (en)2002-09-202013-06-18M7 Visual Intelligence, LpVehicle based data collection and processing system
US20070046448A1 (en)2002-09-202007-03-01M7 Visual IntelligenceVehicle based data collection and processing system and imaging sensor system and methods thereof
US20100235095A1 (en)2002-09-202010-09-16M7 Visual Intelligence, L.P.Self-calibrated, remote imaging and data processing system
US7725258B2 (en)2002-09-202010-05-25M7 Visual Intelligence, L.P.Vehicle based data collection and processing system and imaging sensor system and methods thereof
US8068643B2 (en)2002-11-082011-11-29Pictometry International Corp.Method and apparatus for capturing, geolocating and measuring oblique images
US20120020571A1 (en)2002-11-082012-01-26Schultz Stephen LMethod and apparatus for capturing, geolocating and measuring oblique images
US7995799B2 (en)2002-11-082011-08-09Pictometry International CorporationMethod and apparatus for capturing geolocating and measuring oblique images
US7424133B2 (en)2002-11-082008-09-09Pictometry International CorporationMethod and apparatus for capturing, geolocating and measuring oblique images
US20110091076A1 (en)2002-11-082011-04-21Schultz Stephen LMethod and apparatus for capturing, geolocating and measuring oblique images
US7787659B2 (en)2002-11-082010-08-31Pictometry International Corp.Method and apparatus for capturing, geolocating and measuring oblique images
US7365774B2 (en)2002-12-132008-04-29Pierre LouisDevice with camera modules and flying apparatus provided with such a device
DE10341822A1 (en)2003-09-092005-09-29Clauß, Ulrich, Dr.-Ing.Three dimensional object photogrammetry recording method, e.g. for use in geological survey, involves storing picture information in polar coordinate system, where information allows eventual turning or tilting of camera
US6954310B2 (en)2003-09-252005-10-11University Of Florida Research Foundation, Inc.High resolution multi-lens imaging device
JP2005333336A (en)2004-05-192005-12-02Olympus Corp Imaging device
JP2006217131A (en)2005-02-022006-08-17Matsushita Electric Ind Co Ltd Imaging device
JP2009501350A (en)2005-07-142009-01-15カール・ツァイス・エスエムティー・アーゲー Optical element
US20080278828A1 (en)2005-07-142008-11-13Carl Zeiss Smt AgOptical Element
US7437062B2 (en)2005-11-102008-10-14Eradas, Inc.Remote sensing system capable of coregistering data from sensors potentially having unique perspectives
JP2007323615A (en)2006-06-052007-12-13Topcon Corp Image processing apparatus and processing method thereof
JP2008109477A (en)2006-10-262008-05-08Fuji Electric Holdings Co Ltd Image generating apparatus and image generating method
CN101344391A (en)2008-07-182009-01-14北京工业大学 Autonomous determination method of lunar rover position and orientation based on full-function solar compass
JP2010085719A (en)2008-09-302010-04-15Fujinon CorpLens frame, lens assembly and photographing apparatus
US8462209B2 (en)2009-06-262013-06-11Keyw CorporationDual-swath imaging system
CN103038761A (en)2010-04-132013-04-10视觉智能有限合伙公司Self-calibrated, remote imaging and data processing system
CN103038761B (en)2010-04-132016-07-06视觉智能有限合伙公司Self-alignment long-range imaging and data handling system
CN102506868A (en)2011-11-212012-06-20清华大学SINS (strap-down inertia navigation system)/SMANS (scene matching auxiliary navigation system)/TRNS (terrain reference navigation system) combined navigation method based on federated filtering and system
JP7028400B2 (en)2015-06-112022-03-02レンチング アクチエンゲゼルシャフト Use of cellulosic fibers to manufacture non-woven fabrics

Non-Patent Citations (379)

* Cited by examiner, † Cited by third party
Title
3Di Press Release, 3Di Acquires Eagle Scan; Spatial Data Technology Firm Enhances Digital Photography and LIDAR Mapping with Eagle Scan Acquisition, May 9, 2000.
Abd-Elrahman, et al., Detection of positional errors in systems utilizing small-format digital aerial imagery and navigation sensors using area-based matching techniques, Photogrammetric Engineering & Remote Sensing, 67(7) (Jul. 2001) 825-31.
Abstract, Inexpensive 6-inch resolute, digital ortho-imagery with sub-meter accuracy, dated Apr. 13, 2000.
Ackermann, Airborne laser scanning for elevation models, Geomatics Info Magazine 10(10), Feature 1 (Oct. 1996) 24-25.
AirRECON III—The 3rd Generation micropixel digital color aerial photo system, as published on VISI website, archived by the Wayback Machine on Oct. 28, 2000.
Alamus and Talaya, Airborne Sensor Integration and Direct Orientation of the CASI System, IAPRS vol. XXXIII, Part B1, pp. 5-11 (ISPRS Congress, Amsterdam 2000).
Alamus, et al., On the accuracy and performance of the GeoMobil System, IAPRS vol. XXXV, Part B5, Comm. V (XXth ISPRS Congress, Istanbul, Jul. 12-23, 2004) 262-67, available at http://www.isprs.org/proceedings/XXXV/congress/comm5/comm5.aspx.
Al-Bayari et al., Quality Assessment of DTM and Orthophoto Generated by Airborne Laser Scanning System Using Automated Digital Photogrammetry, Commission III, PCV Symposium 2002.
Allan and Holland, Digital Photogrammetry, Developments at Ordnance Survey, IAPRS vol. XXXIII, Part B2, pp. 46-51 (ISPRS Congress, Amsterdam 2000).
Ambrosia, et al., Remotely sensed wildland fire data and information product processing and delivery report, The Institute for the Application of Geospatial Technology (IAGT) at Cayuga Community College, Auburn, NY (Dec. 2003) (151 pages).
Apr. 1, 2013 Notice of Allowance/Allowability for U.S. Appl. No. 12/798,899, filed Apr. 13, 2010.
Apr. 11, 2006 Response to Office Action dated Jan. 10, 2006 for U.S. Appl. No. 10/244,980, filed Sep. 17, 2002.
Apr. 12, 2007 Response to Advisory Action dated Nov. 17, 2006 for U.S. Appl. No. 10/229,626.
Apr. 14, 2005 Response to Office Action/Non-Compliant Amendment dated Mar. 14, 2005 for U.S. Appl. No. 10/664,737, filed Sep. 18, 2003.
Artan, et al., Characteristic length scale of input data in distributed models: Implications for modeling grid size, J. Hydrology 227(1-4) (Jan. 31, 2000) 128-39.
ASPRS Camera Calibration Panel Report, sponsored by U.S. Geological Survey, Jan. 2000.
AU Apr. 11, 2008 Examiner's Report issued for Australian Patent Application 2003/273338.
AU Aug. 20, 2007 Examiner's Report issued for Australian Patent Application 2003/262941.
Aug. 19, 2005 Examiner's Interview Summary mailed for U.S. Appl. No. 10/664,737, filed Sep. 18, 2003.
Aug. 20, 2008 Office Action/Non-Final Rejection mailed for U.S. Appl. No. 10/229,626, filed Aug. 28, 2002.
Aug. 27, 2012 Notice of Allowability mailed for U.S. Appl. No. 11/805,109, filed May 22, 2007.
Aug. 27, 2014 Notice of Allowance and Notice of Allowability with Examiner's Statement of Reasons for Allowance mailed in U.S. Appl. No. 12/462,533, filed Aug. 5, 2009.
Aug. 31, 2009 Response to Office Action dated Aug. 4, 2009 for U.S. Appl. No. 11/581,235, filed Oct. 11, 2006.
Aug. 4, 2009 Office Action/Non-Final Rejection mailed for U.S. Appl. No. 11/581,235, filed Oct. 11, 2006.
Aug. 5, 2010 Office Action/Final Rejection for U.S. Appl. No. 11/805,109, filed May 22, 2007.
Axelsson, Integrated sensors for platform orientation and topographic data acquisition (Proceedings of Symposium on Digital Photogrammetry, Istanbul, May 21-22, 1998) 1-11.
Axholt, et al., User boresighting for AR calibration: A preliminary analysis, IEEE Virtual Reality 2008, Reno, Nevada (Mar. 8-12, 2008) 43-46.
Bagley, Concerning Aerial Photographic Mapping: A Review, Geographical Review 12(4) (Oct. 1922) 628-635, American Geographical Society, Article Stable URL: http://www.jstor.org/stable/208595.
Bagley, James W., Aerophotography and Aerosurveying, New York: McGraw-Hill Book Company, Inc., OCLC: 938332 (1st ed. 1941)—Part 1 (pp. 1-86).
Bagley, James W., Aerophotography and Aerosurveying, New York: McGraw-Hill Book Company, Inc., OCLC: 938332 (1st ed. 1941)—Part 2 (pp. 87-189).
Bagley, James W., Aerophotography and Aerosurveying, New York: McGraw-Hill Book Company, Inc., OCLC: 938332 (1st ed. 1941)—Part 3 (pp. 190-277).
Bagley, James W., Aerophotography and Aerosurveying, New York: McGraw-Hill Book Company, Inc., OCLC: 938332 (1st ed. 1941)—Part 4 (pp. 278-324).
Bagley, Stereophotography in Aerial Mapping, The Military Engineer 16(88) (Jul.-Aug. 1924) 303-306.
Bagley, Study of Search-Light Triangulation, The Military Engineer 18(100) (Jul.-Aug. 1926) 280-283.
Bagley, Surveying with the Five-Lens Camera, The Military Engineer 24(134) (Mar.-Apr. 1932) 111-114.
Bagley, The Tri-Lens Camera in Aerial Photography and Photographic Mapping, The Military Engineer (May-Jun. 1920) 358-363.
Bagley, The Use of the Panoramic Camera in Topographic Surveying With Notes on the Application of Photogrammetry to Aerial Surveys, U.S. Geological Survey Bulletin No. 657, Washington: Government Printing Office (1917) 1-88.
Bagley, Topographic Surveying from the Air, The Military Engineer 15(84) (Nov.-Dec. 1923) 505-515.
Beard et al, The Ultracam Camera Control and Data Acquisition System, in Advanced Telescope and Instrumentation Control Software II, Lewis, H., SPIE 4848, Astronomical Telescopes and Instrumentation, Waikoloa 2002.
Berg and Ferguson, Airborne Laser Mapping for Highway Engineering Applications, Proceedings of the ASPRS Annual Convention, St. Louis, USA. (2001).
Bossler and Schmidley, Airborne Integrated Mapping System (AIMS): Recent Results in Applications for Large-Scale Mapping, URISA 1997 Annual Conference Proceedings, Toronto, Canada, Jul. 1997.
Brochure, Limited Time Special, dated Feb. 1, 2000.
Brochure, Real-Time Disaster Assessment System, dated Jun. 6, 2002.
Brochure, Real-Time Disaster Assessment System, dated Oct. 13, 2001.
Brochure, Visual Intelligence Products and Services, dated Aug. 14, 1999.
Brochure, Visual Intelligence Products and Services, dated Oct. 21, 1999.
Brochure, Visual Intelligence Systems, Inc. 15-cm color orthophotography 480 band, .5-m hyperspectral 15-cm LIDAR, dated Feb. 7, 2001.
Brochure, With Visual Intelligence's high resolution, fast delivery and low cost, you get better alignment sheets and more accurate census counts, dated Aug. 30, 2000.
Brown et al., Inertial Instrument System for Aerial Surveying, U.S. Geological Survey Professional Paper 1390 (1987).
Burt, Peter J., et al. "A Multiresolution Spline with Application to Image Mosiacs" ACM Transactions on Graphics, vol. 2, No. 4, Oct. 1983, pp. 217-236.
CA Aug. 3, 2012 Response to Examiner's Report issued on Feb. 13, 2012 for Canadian Patent Application 2,534,978.
CA Dec. 12, 2008 Voluntary Amendment "A" for Canadian Patent Application 2,534,968.
CA Dec. 19, 2011 Examiner's Report issued for Canadian Patent Application 2,534,968.
CA Feb. 13, 2012 Examiner's Report issued for Canadian Patent Application 2,534,978.
CA Jan. 11, 2016 Response (Amendment "H") to Examiner's Report dated Jul. 20, 2015 in Canadian Patent Application No. 2,534,978 filed Aug. 28, 2003.
CA Jan. 30, 2015 Response to Examiner's Report dated Oct. 23, 2014 in Canadian Patent Application No. 2534978 filed Aug. 28, 2003.
CA Jul. 20, 2015 Examiner's Report issued for Canadian Patent Application No. 2,534,978, filed Aug. 28, 2003.
CA Jun. 6, 2011 Amendment "C" Response to Examiner's Report dated Nov. 26, 2010 for Canadian Patent Application 2,534,968.
CA Mar. 31, 2011 Voluntary Amendment "D" for Canadian Patent Application 2,534,978.
CA May 26, 2011 Response to Examiner's Report for Nov. 26, 2010 for Canadian Patent Application 2,534,968.
CA May 27, 2013 Examiner's Report for Canadian Patent Application No. 2,534,978.
CA May 28, 2012 Amendment "D" to Examiner's Report issued Dec. 19, 2011 for Canadian Application 2,534,968.
CA Nov. 23, 2016 Examiner's Report mailed in Canadian Patent Application No. 2,796,162 filed Mar. 31, 2011.
CA Nov. 26, 2010 Examiner's Report issued for Canadian Patent Application 2,534,968.
CA Nov. 26, 2013 Amendment "F" Response to Examiner's Report dated May 27, 2013 for Canadian Patent Application No. 2,534,978.
CA Oct. 23, 2014 Examiner's Report issued for Canadian Patent Application 2,534,978.
CA Oct. 25, 2016 Examiner's Report mailed in Canadian Patent Application No. 2,534,978 filed Aug. 28, 2003.
Campbell, Origins of Aerial Photographic Interpretation, U.S. Army, 1916 to 1918, Photogrammetric Engineering & Remote Sensing 74(1) (Jan. 2008) 77-93.
Campos-Marquetti et al., Digital Terrain Mapping of Bernalillo County, New Mexico using Digital Orthophotography and Airborne LIDAR.
Cascade Siskiyou National Monument Hyperspectral Imagery / LIDAR Project Final Report, Oct. 23, 2002.
CN Apr. 25, 2007 Response to Office Action for Chinese Patent Application 03820463.0.
CN Feb. 27, 2015 Notification of First Office Action mailed in Chinese Patent Application No. 201180029220.1 filed Mar. 31, 2011 (with English translation).
CN Jan. 12, 2007 Office Action issued for Chinese Patent Application 03820463.0 (with English translation).
CN Jan. 7, 2016 Response to Second Office Action in Chinese Patent Application No. 201180029220.1 filed Mar. 31, 2011 (with English translation of Amended Claims).
CN Jul. 13, 2015 Response to Feb. 27, 2015 First Office Action filed in Chinese Patent Application No. 201180029220.1 filed Mar. 31, 2011 (with English translation).
CN Mar. 16, 2017 First Office Action issued for Chinese Patent Application No. 201380053255.8 filed Jul. 26, 2013 (with English translation).
CN Oct. 19, 2007 Office Action issued for Chinese Patent Application 03820463.0 (with English translation).
CN Oct. 26, 2015 Notification of the Second Office Action issued in Chinese Patent Application No. 201180029220.1 filed Mar. 31, 2011 (with English translation).
Crosby, et al., Remote sensing inputs and a GIS interface for distributed hydrological modelling, Remote Sensing and Hydrology 2000 (Proceedings of Symposium, Santa Fe, New Mexico, Apr. 2000), IAHS Pub. No. 267 (2001) 421-26.
Dec. 13, 2010 Notice of Allowability for U.S. Appl. No. 12/462,563, filed Aug. 5, 2009.
Dec. 14, 2009 Office Action/Final Rejection mailed for U.S. Appl. No. 11/581,235, filed Oct. 11, 2006.
Dec. 17, 2009 Response to Office Action/Final Rejection dated Dec. 14, 2009 for U.S. Appl. No. 11/581,235, filed Oct. 11, 2006.
Dec. 19, 2008 Response to Office Action/Non-Final Rejection dated Aug. 20, 2008 for U.S. Appl. No. 10/229,626, filed Aug. 28, 2002.
Dec. 22, 2004 Office Action/Final Rejection mailed for U.S. Appl. No. 10/247,441, filed Sep. 19, 2002.
Dec. 4, 2013 Response to Office Action/Non-Final Rejection dated Sep. 20, 2013 for U.S. Appl. No. 12/462,533, filed Aug. 5, 2009.
Dec. 5, 2005 Advisory Action mailed for U.S. Appl. No. 10/664,737, filed Sep. 18, 2003.
Dec. 7, 2004 Office Action mailed for U.S. Appl. No. 10/664,737, filed Sep. 18, 2003.
Dec. 7, 2007 Office Action mailed for U.S. Appl. No. 10/229,626, filed Aug. 28, 2002.
Dec. 8, 2005 Response to Office Action and Request for Continued Examination for U.S. Appl. No. 10/664,737, filed Sep. 18, 2003.
Diener et al., Radiometric Normalisation and Colour Composite Generation of the DMC, IAPRS vol. XXXIII, Part B1, pp. 83-88 (ISPRS Congress, Amsterdam 2000).
Digital Mapping Camera Brochure (2002).
E.P. Baltsavias, Airborne laser scanning: existing systems and firms and other resources, ISPRS Journal of Photogrammetry & Remote Sensing 54 (1999), pp. 164-198.
EA Nov. 25, 2005 Response to Office Action dated Oct. 20, 2005 for Eurasian Application 200500412.
EA Oct. 20, 2005 Office Action issued for Eurasian Patent Application 200500412 (with English translation).
EA Sep. 28, 2006 Notification of Preparedness for Granting (Decision of Grant) issued for Eurasian Patent Application 200500513 (with English translation).
Ellis and Dodd, Applications and Lessons Learned with Airborne Multispectral Imaging, Fourteenth International Conference on Applied Geologic Remote Sensing, Las Vegas, Nevada, Nov. 6-8, 2000.
Ellum and El-Sheimy, Land-Based Mobile Mapping Systems, Photogrammetric Engineering & Remote Sensing Jan. 2002, pp. 13-28.
El-Sheimy, A Mobile Multi-Sensor System for GIS Applications in Urban Centers, IAPRS vol. XXXI, Part B2, ISPRS Comm. II, pp. 95-100 (Vienna, Jul. 9-19, 1996).
EP Apr. 15, 2016 Partial Supplementary European Search Report mailed in related European Patent Application No. EP 13831711.0.
EP Aug. 18, 2016 Supplementary European Search Report and Search Opinion mailed in European Patent Application No. 13831711.0 filed Jul. 26, 2013.
EP Dec. 23, 2008 Response to Oct. 30, 2008 Communication for European Patent Application 03755838.4.
EP Feb. 3, 2015 Response to Supplementary European Search Report dated Jul. 9, 2014 in European Patent Application No. 11862219.0 filed Mar. 31, 2011.
EP Jan. 10, 2017 EPO Communication Pursuant to Article 94(3) EPC (first examination report) issued for European Patent Application No. 11862219.0, filed Mar. 31, 2011.
EP Jan. 18, 2011 Response to Communication of Jul. 15, 2010 for European Patent Application 03755838.4.
EP Jan. 4, 2012 Response to Oct. 10, 2011 Summons to Attend Oral Proceedings for European Patent Application 03755838.4.
EP Jan. 6, 2012 Supplemental Response to Oct. 10, 2010 Summons to Attend Oral Proceedings for European Patent Application 03755838.4.
EP Jul. 13, 2012 Statement of Grounds of Appeal for European Patent Application 03755838.4.
EP Jul. 15, 2010 Communication Pursuant to Article 94(3) EPC for European Patent Application 03755838.3.
EP Jul. 29, 2010 Result of Consultation for European Patent Application 03755838.4.
EP Jul. 9, 2014 Supplementary European Search Report and Written Opinion mailed in European Patent Application EP11862219.0.
EP Jun. 11, 2013 Response to EPO communication under Rules 161(2) and 162 EPC dated Dec. 6, 2012, with amended claims, filed in European Patent Application EP11862219.0.
EP Jun. 26, 2017 Summons to Oral Proceedings Pursuant to Rule 115(1) EPC issued for European Patent Application No. 03755838.4, filed Sep. 18, 2003.
EP Mar. 8, 2012 Decision to Refuse for European Patent Application 03755838.4.
EP May 14, 2007 European Supplemental Search Report issued during prosecution of EP 03791891.
EP May 22, 2017 Response to EPO Communication (first examination report) dated Jan. 10, 2017 for European Patent Application No. 11862219.0, filed Mar. 31, 2011.
EP May 27, 2011 Response to Communication of Jul. 15, 2010 for European Patent Application 03755838.4.
EP Oct. 10, 2011 Summons to Attend Oral Proceedings for European Patent Application 03755838.4.
EP Oct. 30, 2008 Supplemental Search Report for European Patent Application 03755838.4.
EP Sep. 29, 2008 European Supplemental Search Report issued for European Patent Application EP 03755838.4.
EP Sep. 7, 2007 Communication Pursuant to Article 96(2) issued for European Patent Application 03791891.9.
Feb. 13, 2006 Notice of Allowability mailed for U.S. Appl. No. 10/664,737, filed Sep. 18, 2003.
Feb. 19, 2008 Response to Office Action/Non-Final Rejection dated Jun. 7, 2007 for U.S. Appl. No. 11/128,656, filed May 13, 2005.
Feb. 22, 2005 Response to Office Action/Final Rejection dated Dec. 22, 2004 for U.S. Appl. No. 10/247,441, filed Sep. 19, 2002.
Feb. 27, 2013 Notice of Allowability mailed for U.S. Appl. No. 12/583,815, filed Aug. 26, 2009.
Feb. 27, 2013 Response to Office Action dated Sep. 6, 2012 for U.S. Appl. No. 12/798,899, filed Apr. 13, 2010.
Feb. 28, 2007 Response to Office Action/Final Rejection for U.S. Appl. No. 10/229,626, filed Aug. 28, 2002.
Feb. 7, 2014 Office Action/Final Rejection mailed for U.S. Appl. No. 12/462,533, filed Aug. 5, 2009.
Fischer, et al., Fusion of digital multispectral videography with interferometric synthetic aperture radar, U.S. Army Topographic Engineering Center, Alexandria, VA (1997), Pub. No. 091 (Oct. 26, 1998) (11 pages).
Fischer, et al., The use of digital multispectral video for littoral zone applications, U.S. Army Topographic Engineering Center, Alexandria, VA (1997), Pub. No. 051 (Oct. 20, 1998) (6 pages).
GeoVantage Aerial Digital Imaging GeoScanner Airborne Imagery Collection System Support Manual Build 1—May 2000 (Mar. 15, 2001).
GeoVantage Aerial Digital Imaging Operations Manual (2000), pp. 1-26.
GeoVantage Digital Camera System Description (Aug. 20, 2001), pp. 1-11.
Gonsalves, A comprehensive uncertainty analysis and method of geometric calibration for a circular scanning airborne LIDAR (Dec. 2010) (Ph.D. Dissertation, U. S. Miss.) (416 pages).
Graefe et al., The road data acquisition system MoSES—determination and accuracy of trajectory data gained with the Applanix POS/LV, Proceedings of 3rd International Symposium on Mobile Mapping Technology, Jan. 2001.
Graefe, Quality Management in Kinematic Laser Scanning Applications (2007).
Grejner-Brzezinska and Toth, Precision Mapping of Highway Linear Features, IAPRS, vol. XXXIII, Part B2, pp. 233-240 (2000).
Grejner-Brzezinska and Wang, Gravity Modeling for High-Accuracy GPS/INS Integration, Navigation, 45(3) (1998), pp. 209-220.
Grejner-Brzezinska et al., Multi-Sensor Systems for Land-Based and Airborne Mapping: Technology of the Future?, IAPRS vol. XXXIV, Part 2, Comm. II, pp. 31-42 (Xi'an, Aug. 20-23, 2002).
Grejner-Brzezinska, Airborne Integrated Mapping System: Positioning Component, Proceedings of the 53rd Annual Meeting of the Institute of Navigation, Albuquerque, NM, Jun. 1997, 225-235.
Grejner-Brzezinska, D. A., Direct georeferencing at the Ohio State University: A historical perspective, Photogrammetric Engineering and Remote Sensing Journal of the American Society for Photogrammetry and Remote Sensing, vol. 68, No. 6 (Jun. 2002), pp. 557-560.
Grejner-Brzezinska, D., and B. Phuyal, Positioning accuracy of the airborne integrated mapping system, Proceedings of the 1998 National Technical Meeting of the Institute of Navigation, Long Beach, CA, Jan. 1998, pp. 713-721.
Grejner-Brzezinska, Direct Exterior Orientation of Airborne Imagery with GPS/INS System: Performance Analysis, Navigation, vol. 46, No. 4, pp. 261-270 (1999).
Grejner-Brzezinska, Direct Sensor Orientation in Airborne and Land-based Mapping Applications, Report No. 461, Geodetic GeoInformation Science, Dept. of Civil and Environmental Engineering and Geodetic Science, Jun. 2001.
Guangping He, Design and Application of the GPSVision Mobile Mapping System, IAPRS vol. XXXIV, Part 2, Comm. II (Xi'an Aug. 20-23, 2002), pp. 163-168.
Guangping He, Design of a Mobile Mapping System for GIS Data Collection, IAPRS vol. XXXI, Part B2 (Vienna 1996), pp. 154-159.
Haala et al., Calibration of Directly Measured Position and Attitude by Aerotriangulation of Three-Line Airborne Imagery, Commission III, Working Group 1 (2000).
Haala et al., On the Performance of Digital Airborne Pushbroom Cameras for Photogrammetric Data Processing—A Case Study, IAPRS vol. XXXIII (ISPRS Congress, Amsterdam 2000).
Haala et al., On the Use of Multispectral and Stereo Data From Airborne Scanning Systems for DTM Generation and Landuse Classification, IAPRS vol. 32/4, ISPRS Comm. IV Symposium on GIS—Between Visions and Applications (1998).
Hagolle, et al., How to double the spatial resolution of a push-broom instrument, Geoscience and Remote Sensing Symposium, 1994. IGARSS '94. Surface and Atmospheric Remote Sensing: Technologies, Data analysis and Interpretation., International, Pasadena, CA, USA (Aug. 8-12, 1994), New York, NY, USA, IEEE, vol. 3 (Aug. 8, 1994), 1553-1555.
He and Orvets, Capturing Road Network Using Mobile Mapping Technology, IAPRS vol. XXXIII, Part B2, pp. 272-277 (ISPRS Congress, Amsterdam 2000).
He, Guangping, Kurt Novak, and Wenhao Feng, On the integrated calibration of a digital stereo-vision system, IAPRS vol. XXIX, Part B5, pp. 139-145 (Washington, D.C., Aug. 2-14, 1992).
Heier, Deploying DMC in today's workflow, Photogrammetric Week 2001, pp. 35-45.
Heier, et al., Calibration of the digital modular camera, FIG XXII International Congress, Washington, D.C. (Apr. 19-26, 2002) 1-11.
Heier, Helmut, and Alexander Hinz, A digital airborne camera system for photogrammetry and thematic applications, ISPRS Joint Workshop on Sensors and Mapping from Space, Hanover, Germany, Sep. 27-30, 1999.
Hess et al., Geocoded digital videography for validation of land cover mapping in the Amazon Basin, International Journal of Remote Sensing, vol. 23, Issue 7, Feb. 2002.
Hiatt, Sensor integration aids mapping at ground zero, Photogrammetric Engineering & Remote Sensing (Sep. 2002) 877 & 879.
Hill et al., Wide-Area Topographic Mapping and Applications Using Airborne Light Detection and Ranging (LIDAR) Technology, Photogrammetric Engineering & Remote Sensing, Aug. 2000, pp. 908-914.
Hinsken, L. et al., Triangulation of LH Systems' ADS40 imagery using Orima GPS/IMU, IAPRS 34.3/A (2002), pp. 156-162.
Hinz et al., Digital Modular Camera: System Concept and Data Processing Workflow, IAPRS vol. XXXIII, Part B2, pp. 164-171 (ISPRS Congress, Amsterdam 2000).
Hinz et al., DMC—The Digital Sensor Technology of Z/I-Imaging, Photogrammetric Week '01, pp. 93-103 (2001).
Hinz, The Z/I Imaging Digital Aerial Camera System, Photogrammetric Week '99, pp. 109-115 (1999).
Holzwarth, et al., Determination and monitoring of boresight misalignment angles during the HyMap campaigns HyEurope 2003 and HyEurope 2004, Proceedings of 4th EARSeL Workshop on Imaging Spectroscopy (2005) 91-100.
Huyck, et al., Engineering and organizational issues related to the World Trade Center terrorist attack: Emergency response in the wake of the World Trade Center attack: The remote sensing perspective, MCEER Special Report Series, vol. 3 (Jun. 2002) (60 pages).
Image America brochure (2002).
IN Apr. 13, 2007 Response to Office Action dated Feb. 12, 2007 for Indian Patent Application 773/DELNP/2005.
IN Feb. 12, 2007 Office Action for Indian Patent Application 773/DELNP/2005.
Ip et al., System Performance Analysis of INS/DGPS Integrated System for Mobile Mapping System (MMS), Commission VI, WG VI/4 (2004).
Ip, Analysis of Integrated Sensor Orientation for Aerial Mapping, UCGE Reports No. 20204, Jan. 2005.
Jacobsen, Calibration Aspects in Direct Georeferencing of Frame Imagery, ISPRS Commission I/FIEOS 2002 Conference Proceedings.
Jacobsen, et al., Dependencies and problems of direct sensor orientation, Institute of Photogrammetry and GeoInformation, Univ. of Hannover, Germany, OEEPE Official Pub. No. 43 (2002) 73-84 (11 pages).
Jacobsen, Potential and Limitation of Direct Sensor Orientation, IAPRS vol. XXXIII, Part B3, pp. 429-435 (ISPRS Congress, Amsterdam 2000).
Jacobsen, System calibration for direct and integrated sensor orientation, Institute of Photogrammetry and GeoInformation, Univ. of Hannover, Germany, Proceedings of RSPRS Workshop for Working Group 5 (2003) (6 pages).
Jan. 10, 2006 Office Action mailed for U.S. Appl. No. 10/244,980, filed Sep. 17, 2002.
Jan. 10, 2011 Notice of Allowability for U.S. Appl. No. 11/805,109, filed May 22, 2007.
Jan. 11, 2006 Office Action/Non-Final Rejection mailed for U.S. Appl. No. 10/229,626.
Jan. 12, 2010 Notice of Allowability mailed for U.S. Appl. No. 11/581,235, filed Oct. 11, 2006.
Jan. 5, 2007 Notice of Allowability mailed for U.S. Appl. No. 10/244,980, filed Sep. 17, 2002.
Jan. 7, 2004 Office Action/Non-Final Rejection mailed for U.S. Appl. No. 10/247,441, filed Sep. 19, 2002.
Jan. 8, 2008 Examiner Interview Summary for U.S. Appl. No. 10/229,626, filed Aug. 28, 2002.
JP Apr. 14, 2015 Decision of Rejection issued in Japanese Patent Application No. 2013-518368 filed Mar. 31, 2011 (with English translation).
JP Dec. 16, 2014 Office Action mailed in Japanese Patent Application No. 2013-518368 filed Mar. 31, 2011 (with English translation).
JP Jun. 27, 2017 First Office Action Notice of Reasons for Refusal issued for Japanese Patent Application No. 2015-528490, filed Jul. 26, 2013.
JP Mar. 24, 2015 Response to Office Action filed in Japanese Patent Application No. 2013-518368 filed Mar. 31, 2011 (with English translation of amended claims).
Jul. 13, 2006 Office Action/Non-Final Rejection mailed for U.S. Appl. No. 10/244,980, filed Sep. 17, 2002.
Jul. 13, 2009 Response to Office Action dated Mar. 17, 2009 for U.S. Appl. No. 10/229,626, filed Aug. 28, 2002.
Jul. 17, 2008 Office Action mailed for U.S. Appl. No. 11/128,656, filed May 13, 2005.
Jul. 30, 2009 Notice of Allowability mailed for U.S. Appl. No. 11/128,656, filed May 13, 2005.
Jul. 30, 2012 Preliminary Amendment for U.S. Appl. No. 12/462,533, filed Aug. 5, 2009.
Jul. 6, 2007 Office Action/Non-Final Rejection mailed for U.S. Appl. No. 10/229,626.
Jul. 7, 2004 Response to Office Action dated Jan. 7, 2004 for U.S. Appl. No. 10/247,441, filed Sep. 19, 2002.
Jun. 1, 2010 Response to Office Action/Non-Final Rejection dated Mar. 5, 2010 for U.S. Appl. No. 11/805,109, filed May 22, 2007.
Jun. 10, 2010 Office Action mailed for U.S. Appl. No. 12/462,563, filed Aug. 5, 2009.
Jun. 11, 2009 Advisory Action mailed for U.S. Appl. No. 10/229,626, filed Aug. 28, 2002.
Jun. 20, 2006 Notice of Allowance mailed for U.S. Appl. No. 10/664,737, filed Sep. 18, 2003.
Jun. 20, 2006 Response to Office Action dated Jan. 11, 2006 for U.S. Appl. No. 10/229,626, filed Aug. 28, 2002.
Jun. 21, 2010 Notice of Allowability mailed for U.S. Appl. No. 10/229,626, filed Aug. 28, 2002.
Jun. 5, 2008 Response to Office Action/Final Rejection dated Dec. 6, 2007/Request for Continued Examination for U.S. Appl. No. 10/229,626, filed Aug. 28, 2002.
Jun. 5, 2014 Response to Office Action/Final Rejection dated Feb. 7, 2014 for U.S. Appl. No. 12/462,533, filed Aug. 5, 2009.
Jun. 7, 2007 Office Action mailed for U.S. Appl. No. 11/128,656, filed May 13, 2005.
Jun. 7, 2013 Response to Office Action/Restriction Requirement dated May 20, 2013 for U.S. Appl. No. 12/462,533, filed Aug. 5, 2009.
Jun. 8, 2005 Office Action/Final Rejection mailed for U.S. Appl. No. 10/664,737, filed Sep. 18, 2003.
Kocaman, GPS and INS Integration with Kalman Filtering for Direct Georeferencing of Airborne Imagery, Geodetic Seminar Report, Institute of Geodesy and Photogrammetry, Jan. 30, 2003.
Kurz, et al., Calibration of a wide-angle digital camera system for near real time scenarios, Proceedings ISPRS Hannover Workshop (2007) 1682-777 (6 pages).
Leberl and Gruber, Flying the New Large Format Digital Aerial Camera Ultracam, Photogrammetric Week '03, D. Fritsch, Ed., Wichmann-Verlag, Heidelberg, pp. 67-76 (2003).
Leberl et al., Novel Concepts for Aerial Digital Cameras, ISPRS Commission I Symposium, Denver, Colorado, Nov. 2002.
Leberl et al., The Ultracam Large Format Aerial Digital Camera System, Proceedings of the American Society for Photogrammetry & Remote Sensing, Anchorage, Alaska, May 5-9, 2003, pp. 1-6.
Leberl, et al., The UltraCam Story, Photogrammetry, Remote Sensing & Spatial Information Sciences 39(B1) (2012) 39-44, available at URL http://download.microsoft.com/download/C/7/0/C70BAE4C-4A56-4410-9A4D-64533D70B66A/papersandpublications/TheUltraCamStory.pdf.
Lee, et al., Boresight calibration of the aerial multi-head camera system, Proceedings of SPIE vol. 8059 (2011).
Leica Geosystems Brochure: ALS40 Airborne Laser Scanner—Airborne LIDAR for Professionals (Aug. 9, 2002).
Leica Press Release, Leica ADS40 to Make American Debut Apr. 12, 2002, Apr. 12, 2002.
Li et al., Object Recognition from AIMS Data Using Neural Networks, Report No. 462, Geodetic and GeoInformation Science, Dept. of Civil and Environmental Engineering and Geodetic Science, Dec. 1998.
LIDAR for Hire, Point of Beginning Magazine, Jan. 26, 2001.
Lithopoulos, E., Blake Reid, and Bruno Scherzinger, The position and orientation system (POS) for survey applications, IAPRS 31 (1996), pp. 467-471.
Louis, John, et al., Operational use and calibration of airborne video imagery for agricultural and environmental land management applications, Proceedings of the 15th Biennial Workshop on Color Photography and Air Videography (1995).
Lutes, DAIS: A Digital Airborne Imaging System, ISPRS Commission I/FIEOS 2002 Conference Proceedings, Nov. 2002.
Mann and Chiarito, Technologies for Positioning and Placement of Underwater Structures, prepared for U.S. Army Corps of Engineers, ERDC TR-INP-00-1, Mar. 2000.
Mar. 1, 2010 Examiner's Interview Summary mailed for U.S. Appl. No. 10/229,626, filed Aug. 28, 2002.
Mar. 16, 2005 Notice of Allowability mailed for U.S. Appl. No. 10/247,441, filed Sep. 19, 2002.
Mar. 17, 2009 Office Action mailed for U.S. Appl. No. 10/229,626, filed Aug. 28, 2002.
Mar. 17, 2010 Preliminary Amendment for U.S. Appl. No. 12/462,533, filed Aug. 5, 2009.
Mar. 18, 2010 Preliminary Amendment for U.S. Appl. No. 12/583,815, filed Aug. 26, 2009.
Mar. 2, 2005 Response to Office Action dated Dec. 7, 2004 for U.S. Appl. No. 10/664,737, filed Sep. 18, 2003.
Mar. 2, 2010 Response to Office Action dated Oct. 5, 2009 for U.S. Appl. No. 10/229,626, filed Aug. 28, 2002.
Mar. 21, 2007 Response to Office Action dated Nov. 20, 2006 for U.S. Appl. No. 11/128,656, filed May 13, 2005.
Mar. 29, 2007 Advisory Action for U.S. Appl. No. 10/229,626, filed Aug. 28, 2002.
Mar. 3, 2009 Office Action/Final Rejection mailed for U.S. Appl. No. 11/128,656, filed May 13, 2005.
Mar. 5, 2010 Office Action/Non-Final Rejection mailed for U.S. Appl. No. 11/805,109, filed May 22, 2007.
May 12, 2009 Response to Office Action/Non-Final Rejection dated Mar. 3, 2009 for U.S. Appl. No. 11/128,656, filed May 13, 2005.
May 19, 2005 Office Action mailed for U.S. Appl. No. 10/244,980, filed Sep. 17, 2002.
May 19, 2009 Response to Office Action dated Mar. 17, 2009 for U.S. Appl. No. 10/229,626, filed Aug. 28, 2002.
May 20, 2013 Office Action/Restriction Requirement mailed for U.S. Appl. No. 12/462,533, filed Aug. 5, 2009.
May 25, 2016 Notice of Allowance with Examiner's Amendment/Comment mailed in U.S. Appl. No. 13/772,994, filed Feb. 21, 2013.
Mietz et al., An Evaluation of LIDAR Vertical Accuracy in Grand Canyon, Arizona, Jul. 2002, pp. 1-15.
Mohamed and Price, Near the Speed of Flight: Aerial Mapping with GPS/INS Direct Georeferencing, GPS World, Mar. 2002, pp. 40-45.
Mohamed et al., The Development of DORIS: An Overview, 3rd International Symposium on Mobile Mapping Technology, Cairo—Egypt, Jan. 3-5, 2001.
Mohamed, Advancements in the Development of DORIS, Photogrammatic Week '01, pp. 1-11 (2001).
Mohamed, M.R. Mostafa, et al. Emerge DSS: A Fully Integrated Digital System for Airborne Mapping: Sep. 22-23, 2003; http://www.isprs.org/commission1/theory_tech_realities/.
Mohamed, Navigating the Ground from Air: Active Monitoring with GPS/INS Geo-referenced LiDAR, Proceedings of the 2003 National Technical Meeting of the Institute of Navigation, Anaheim, CA, Jan. 2003, pp. 593-601.
Mostafa and Hutton, Direct Positioning and Orientation Systems: How Do They Work? What is the Attainable Accuracy?, Proceedings, American Society of Photogrammetry and Remote Sensing Annual Meeting, St. Louis, MO, USA, Apr. 24-27, 2001.
Mostafa and Hutton, Emerge DSS: A Fully Integrated Digital System for Airborne Mapping, Proceedings—ISPRS Int'l Workshop—Working Group I/5, Castelldefels, Spain, Sep. 22-23, 2003.
Mostafa and Schwarz, Multi-Sensor System for Airborne Image Capture and Georeferencing, Photogrammetric Engineering & Remote Sensing, vol. 66, No. 12, Dec. 2000, pp. 1417-1423.
Mostafa and Schwarz, The Development and Testing of an Integrated GPS/SINS/Multi-Camera System for Airborne Mapping, The 3rd International Symposium on Mobile Mapping Technology, Cairo, Egypt, Jan. 3-5, 2001.
Mostafa et al., Airborne Direct Georeferencing of Frame Imagery: An Error Budget, The 3rd International Symposium on Mobile Mapping Technology, Cairo, Egypt, Jan. 3-5, 2001.
Mostafa et al., Ground Accuracy from Directly Georeferenced Imagery, GIM International, vol. 14, N. 12, Dec. 2000.
Mostafa, Airborne Image Georeferencing by GPS-aided Inertial Systems: Concepts and Performance Analysis, 22nd Asian Conf. on Remote Sensing, Nov. 5-9, 2001.
Mostafa, Boresight Calibration of Integrated Inertial/Camera Systems, Proceedings of the International Symposium on Kinematic Systems in Geodesy, Geomatics and Navigation—KIS 2001, Banff, Canada, Jun. 5-8, 2001, pp. 440-445.
Mostafa, Camera/IMU boresight calibration: New advances and performance analysis (2002), available at URL http://applanix.com/articles-and-papers/pos-av/256-cameraimu-boresight-calibration-new-advances-and-performance-analysis.html (12 pages).
Mostafa, et al., A fully digital system for airborne mapping, Proceedings of the International Symposium on Geodesy, Geomatics, and Navigation—KIS 1997, Banff, Canada, Jun. 3-6, 1997, pp. 463-471.
Mostafa, Performance analysis of the DSS in map production environment (2004), available at URL http://applanix.com/articles-and-papers/dss.html (6 pages).
Neale et al., Spatial mapping of evapotranspiration and energy balance components over riparian vegetation using airborne remote sensing, Remote Sensing and Hydrology 2000 (Proceedings of a symposium held at Santa Fe, New Mexico, USA, Apr. 2000), IAHS Publ. No. 267, 2001.
Neukum, et al., The airborne HRSC-AX cameras: evaluation of the technical concept and presentation of application results after one year of operations, Photogrammetric Week 01 (D. Fritsch & R. Spiller, Eds.) (2001) 117-31, available at URL http://www.ifp.uni-stuttgart.de/publications/phowo01/Neukum.pdf.
Nov. 10, 2006 Response to Office Action/Non-Final Rejection dated Jul. 13, 2006 for U.S. Appl. No. 10/244,980, filed Sep. 17, 2002.
Nov. 17, 2006 Office Action/Final Rejection mailed for U.S. Appl. No. 10/229,626.
Nov. 18, 2010 Response to Office Action/Non-Final Rejection dated Jun. 10, 2010 for U.S. Appl. No. 12/462,563, filed Aug. 5, 2009.
Nov. 19, 2014 Notice of Allowance/Allowability mailed in U.S. Appl. No. 13/590,735, filed Aug. 21, 2012.
Nov. 2, 2010 Response to Office Action Non-Final Rejection dated Aug. 5, 2010 / Request for Continued Examination for U.S. Appl. No. 11/805,109, filed May 22, 2007.
Nov. 20, 2006 Examiner's Interview Summary Record for U.S. Appl. No. 10/244,980, filed Sep. 17, 2002.
Nov. 20, 2006 Office Action mailed for U.S. Appl. No. 11/128,656.
Nov. 26, 2008 Response to Office Action dated Jul. 17, 2008 for U.S. Appl. No. 11/128,656, filed May 13, 2005.
Oct. 18, 2005 Response to Office Action dated May 19, 2005 for U.S. Appl. No. 10/244,980, filed Sep. 17, 2002.
Oct. 28, 2005 Response to Office Action and Request for Reconsideration of Restriction Requirement dated Sep. 30, 2005 for U.S. Appl. No. 10/229,626, filed Aug. 28, 2002.
Oct. 5, 2009 Office Action/Non-Final Rejection mailed for U.S. Appl. No. 10/229,626.
Okubo, Airborne Laser Measurement Technology in Japan, Proceedings of FIG Working Week 2001, Seoul, Korea, May 6-11, 2001.
Optech ALTM Brochure (Aug. 1994).
PCT Apr. 30, 2004 International Search Report issued for International Application PCT/US03/28727.
PCT Apr. 6, 1999 International Search Report issued for International Application No. PCT/JP98/05679 (Revised Version Aug. 26, 1999) (with English translation).
PCT Aug. 4, 2005 International Preliminary Examination Report issued for International Application PCT/US2003/029375.
PCT Dec. 13, 2004 International Preliminary Examination Report issued for International Application No. PCT/US2003/26950 dated Jan. 10, 2005.
PCT Dec. 9, 2004 International Preliminary Examination Report issued for International Application PCT/US03/028420.
PCT Feb. 18, 2005 International Preliminary Examination Report issued for International Application PCT/US03/28420.
PCT Feb. 27, 2004 International Search Report issued for International Application PCT/US03/28420.
PCT Feb. 7, 2005 International Search Report issued for International Application PCT/US2003/029375.
PCT Jan. 10, 2005 International Preliminary Examination Report issued for International Application PCT/US03/26950.
PCT Jan. 7, 2014 International Search Report and Written Opinion issued for International Application No. PCT/US2013/052278.
PCT Jun. 27, 2011 International Search Report issued for International Application PCT/US2011/000575.
PCT Jun. 3, 2005 Response to Written Opinion issued for International Application PCT/US03/29375.
PCT May 13, 2004 International Written Opinion issued for International Application PCT/US03/28420.
PCT May 5, 2005 International Written Opinion issued for International Application PCT/US03/29375.
PCT Nov. 17, 2004 Response to Written Opinion for International Application PCT/US03/28727.
PCT Oct. 20, 2004 International Search Report issued for International Application PCT/US2003/026950.
PCT Oct. 23, 2012 International Preliminary Report on Patentability issued for International Application PCT/US2011/000575.
PCT Sep. 17, 2004 International Written Opinion issued for International Application PCT/US2003/028727.
PCT Sep. 2, 2005 International Preliminary Examination Report issued for International Application PCT/US03/29375.
Pendleton, Map Compilation from Aerial Photographs, USGS Bulletin: 788-F (1928), pp. 379-432.
Petrie, Déjà Vu—The Configurations of the New Airborne Digital Imagers Are All Rooted in the Distant Past!, GeoInformatics, Jul./Aug. 2000.
Petrie, Further Advances in Airborne Digital Imaging: Several New Imagers Introduced at ASPRS, GeoInformatics, Jul./Aug. 2006.
Petrie, ISPRS 2000 Technical Exhibition, GeoInformatics Oct./Nov. 2000, pp. 30-35.
Petrie, Optical Imagery from Airborne & Spaceborne Platforms: Comparisons of Resolution, Coverage, & Geometry for a Given Ground Pixel Size, GeoInformatics, Jan./Feb. 2002, pp. 28-35.
Photograph of the 1934 Fairchild T-3A Five Lens Camera.
Photograph of the Bagley Three Lens Camera 1919-1928.
Pollock, Development of a Highly Automated System for the Remote Evaluation of Individual Tree Parameters, Integrated Tools Proceedings, Boise, Idaho, Aug. 16-20, 1998, pp. 638-645.
Press Release, AirRECON III delivers substantial cost and time savings to Enron subsidiary by integrating Visual Intelligence's proprietary class location software with its seamless color, rectified digitally collected photography, dated Jan. 10, 2000.
Press Release, EnerQuest Systems Announces LIDAR Point Classification Technology Sep. 16, 2002, Sep. 16, 2002.
Press Release, EnerQuest Wins Contract for LIDAR Mapping of Grand Canyon, Oct. 4, 2000.
Press Release, Houston Company to provide before and after photography of hurricane path on web, dated Aug. 27, 1998.
Printout of Optech's "ALTM Products" webpages, dated May 1, 2002.
Printout of Optech's "ALTM REALM Software" webpage, dated May 1, 2002.
Printout of Optech's "ALTM Specifications" webpages, dated May 1, 2002.
Printout of Optech's "ALTM Survey Operation" webpages, dated May 1, 2002.
Printout of Optech's "ALTM System Components" webpage, dated May 1, 2002.
Printout of Optech's "How ALTMs Work" webpage, dated May 1, 2002.
Reiger, et al., Boresight alignment method for mobile laser scanning systems, Proceedings RSPRS Conference in Moscow (Dec. 2008), published J. Applied Geodesy 4(1) (Jan. 2010) 13-21.
Renslow et al, Evaluation of Multi-Return LIDAR for Forestry Applications, RSAC-2060/4810-LSP-0001-RPT1, Nov. 2000.
Roeser et al., New Potential and Application of ADS, IAPRS vol. XXXIII, Part B1, pp. 251-257 (ISPRS Congress, Amsterdam 2000).
Sanchez, Richard D. "Airborne Digital Sensor System and GPS-aided Inertial Technology for Direct Geopositioning in Rough Terrain". Open-File Report 2004-1391; USGS; Reston, VA.
Sandau, Rainer, et al., Design principles of the LH Systems ADS40 airborne digital sensor, IAPRS vol. XXXIII, Part B1, pp. 258-265 (ISPRS Congress, Amsterdam 2000).
Santmire, Technology Changing View for Aerial Photography Pros, Houston Business Journal, Jul. 2, 2000.
Savopol et al., A Digital Multi CCD Camera System for Near Real-Time Mapping, IAPRS vol. XXXIII, Part B1, pp. 266-271 (ISPRS Congress, Amsterdam 2000).
Schade, Combining GPS and photogrammetry for a kinematic surveying system, Geodetical Information Magazine 8(12) (Dec. 1994) 32-33 & 35.
Schenk and Csatho, Fusion of LIDAR data and aerial imagery for a more complete surface description, International Archives of Photogrammetry Remote Sensing and Spatial Information Sciences 34.3/A (2002), pp. 310-317.
Schultz et al., Integrating Small Format Aerial Photography, Videography, and a Laser Profiler for Environmental Monitoring, ISPRS Comm. II, WG 1 Workshop on Integrated Sensor Calibration and Orientation (1999).
Schultz, et al., A system for real-time generation of geo-referenced terrain models, SPIE Enabling Technologies for Law Enforcement, Boston, MA (Nov. 5-8, 2000) (8 pages).
Schultz, et al., Cost-effective determination of biomass from aerial images, Integrated Spatial Databases Digital Images and GIS, International Workshop ISD'99, Portland, ME (Jun. 14-16, 1999) 67-76.
Sep. 13, 2007 Examiner Interview Summary for U.S. Appl. No. 10/229,626, filed Aug. 28, 2002.
Sep. 17, 2006 Office Action/Final Rejection mailed for U.S. Appl. No. 10/229,626.
Sep. 17, 2007 Response to Office Action/Non-Final Rejection dated Jul. 6, 2007 for U.S. Appl. No. 10/229,626, filed Aug. 28, 2002.
Sep. 20, 2013 Office Action/Non-Final Rejection mailed for U.S. Appl. No. 12/462,533, filed Aug. 5, 2009.
Sep. 6, 2012 Office Action mailed for U.S. Appl. No. 12/798,899, filed Apr. 13, 2010.
Sep. 7, 2005 Response to Office Action/Final Rejection dated Jun. 8, 2005 for U.S. Appl. No. 10/664,737, filed Sep. 18, 2003.
Skaloud, et al., Rigorous approach to bore-sight self-calibration in airborne laser scanning, ISPRS J. Photogrammetry & Remote Sensing 61 (2006) 47-59.
Snider, et al., Use of aerial videography to evaluate the effects of Flaming Gorge Dam operations on natural resources of the Green River, Report of the Environmental Assessment Division, Argonne National Laboratory (1993) (10 pages).
Sun et al., Spatial Resolution Enhancement and Dynamic Range Extending of a Computerized Airborne Multicamera Imaging System, in Sensor Fusion: Architectures, Algorithms and Applications IV, Belur V. Dasarathy (Ed.), Proceedings of SPIE vol. 4051 (2000), pp. 118-125.
Surveys From Above: A Brief History of Aerial Survey Photography in the C&GS (Jul. 17, 2009), http://www.ngs.noaa.gov/web/about_ngs/history/camera_timeline_web.pdf.
Talaya et al., Integration of a terrestrial laser scanner with GPS/IMU orientation sensors, Proceedings of the XXth ISPRS Congress, vol. 35 (2004).
Talley, Mapping by the Use of Aerial Photographs, The Military Engineer 27(155) (Sep.-Oct. 1935) 357-361.
Tang et al., Geometric Accuracy Potential of the Digital Modular Camera, IAPRS vol. XXXIII, Part B4, pp. 1051-1057 (ISPRS Congress, Amsterdam 2000).
Tao, Automated Approaches to Object Measurement and Feature Extraction from Georeferenced Mobile Mapping Image Sequences, Ph.D Dissertation (Dept. of Geomatics Engineering, Calgary, Alberta) (Oct. 1997).
Tao, Innovations on Multi-Sensor and Multi-Platform Integrated Data Acquisition (1999).
Tao, Mobile Mapping Technology for Road Network Data Acquisition, Journal of Geospatial Engineering, vol. 2, No. 2, pp. 1-13 (2000).
Tempelmann, Udo, et al., Photogrammetric software for the LH Systems ADS40 airborne digital sensor, IAPRS, vol. XXXIII, Part B2, pp. 552-559 (ISPRS Congress, Amsterdam 2000).
Toth, C., and Dorota A. Grejner-Brzezinska, Complementarity of LIDAR and stereo imagery for enhanced surface extraction, IAPRS vol. XXXIII, Part B3/2, pp. 897-904 (ISPRS Congress, Amsterdam 2000).
Toth, Calibrating Airborne Lidar Systems, IAPRS vol. XXXIV, Part 2, Comm. II, pp. 475-480 (Xi'an, Aug. 20-23, 2002).
Toth, Charles K., and Dorota A. Grejner-Brzezinska, Performance analysis of the airborne integrated mapping system (AIMS), IAPRS 32 (1997), pp. 320-326, available at https://www.cfm.ohio-state.edu/research/AIMS/paper3.htm.
Toth, Charles, and D. Grejner-Brzezinska, DEM Extraction from High-Resolution Direct-Digital Airborne Imagery, ISPRS Commission III Symposium on Object Recognition and Scene Classification from Multispectral and Multisensor Pixels, IAPRS vol. XXXII, Part 3/1, pp. 184-189 (1998).
Toth, Direct Platform Orientation of MultiSensor Data Acquisition Systems, IAPRS vol. 32/4, ISPRS Commission IV Symposium on GIS—Between Visions and Applications, pp. 629-634 (Stuttgart, Germany 1998).
Toth, Experiences with frame CCD arrays and direct georeferencing, Photogrammetric Week '99, pp. 95-109 (1999).
Track'Air MIDAS (Multi-cameras Integrated Digital Acquisition System) brochure (2007) (10 pages).
Track'Air NANOtrack Installation manual (May 2010).
Track'Air The Tracker: snapBASE Aerial Survey Management Utility Users Manual (2008) (5 pages).
Track'Air The Tracker: snapBASE Aerial Survey Management Utility Users Manual (2008) (51 pages).
Track'Air The Tracker: snapPLOT Aerial Survey Printing and Plotting Utility Users Manual (2008) (40 pages).
Track'Air The Tracker: snapSHOT Aerial Survey Navigation and Photography Users Manual (2008) (141 pages).
Track'Air The Tracker: snapXYZ Aerial Survey Coordinates Entry Utility Users Manual (Apr. 29, 2002) (58 pages).
Track'Air TRACKER system: Hardware installation Users Manual (2008) (49 pages).
Track'Air—Lead'Air MIDAS-5 brochure (2010-2015).
Tuck Mapping Solutions Brochure: Air Force Base Mapping re eagleeye mapping system (2003).
Tuck Mapping Solutions: Eagleeye Mapping System Project Summaries (2006).
Tuck, LIDAR: A New Perspective, Professional Surveyor Magazine 24(11) (Nov. 2004), 26-28.
VISI Air DAVe web page, archived on Feb. 21, 1999.
Xuan et al.; A combined sensor system of digital camera with LiDAR; 2007 IEEE Intl. Geoscience and Remote Sensing Sym., Barcelona, 2007, pp. 589-592.*
Yastikli, et al., In-situ camera and boresight calibration with LIDAR data, Proceedings of 5th International Symposium on Mobile Mapping Technology vol. 7 (2007) (6 pages).
Zajkowski, et al., Infrared Field Users' Guide and Vendor Listings, USDA Forest Service Report RSAC-1309-RPT1 (Oct. 2003) (23 pages).
Zajkowski, et al., Infrared Field Users' Guide, USDA Forest Service Report RSAC-1309-RPT3 (Mar. 2011) (Update) (15 pages).
Zeitler and Doerstel, Geometric Calibration of the DMC: Method and Results, Pecora 15/Land Satellite Information IV/ISPRS Commission I/FIEOS 2002 Conference Proceedings (2002).
Zhu, et al., Stereo mosaics from a moving video camera for environmental monitoring, First Int'l Workshop on Digital and Computational Video, Tampa, FL (Dec. 10, 1999) (10 pages).

Cited By (3)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US20200294620A1 (en)*2017-10-042020-09-17KWS SAAT SE & Co. KGaAMethod and system for performing data analysis for plant phenotyping
US12119086B2 (en)*2017-10-042024-10-15KWS SAAT SE & Co. KGaAMethod and system for performing data analysis for plant phenotyping
US20220342161A1 (en)*2021-04-212022-10-27Panasonic Intellectual Property Management Co., Ltd.Optical measuring device, assembling device of mounting substrate, and assembling method for mounting substrate

Similar Documents

PublicationPublication DateTitle
US9797980B2 (en)Self-calibrated, remote imaging and data processing system
US7725258B2 (en)Vehicle based data collection and processing system and imaging sensor system and methods thereof
US7127348B2 (en)Vehicle based data collection and processing system
US8994822B2 (en)Infrastructure mapping system and method
JP6282275B2 (en) Infrastructure mapping system and method
EP2558953A1 (en)Self-calibrated, remote imaging and data processing system
US6928194B2 (en)System for mosaicing digital ortho-images
CN110296688A (en)A kind of detecting one inclination aerial survey gondola based on passive geographic positioning technology
USRE49105E1 (en)Self-calibrated, remote imaging and data processing system
JP2014511155A (en) Self-calibrating remote imaging and data processing system
Mostafa et al.GPS/INS integrated navigation system in support of digital image georeferencing
US11568556B2 (en)Device, method and system for determining flight height of unmanned aerial vehicle
Garg et al.Geometric Correction and Mosaic Generation of Geo High Resolution Camera Images
HK1182192A (en)Self-calibrated, remote imaging and data processing system
HK1182192B (en)Self-calibrated, remote imaging and data processing system
Mostafa et al.An Autonomous System for Aerial Image Acquisition and Georeferencing
Rosiek et al.Exploiting global positioning system and inertial measurement unit-controlled image sensors
KR20080033287A (en) Method and apparatus for determining location relative to an image

Legal Events

DateCodeTitleDescription
ASAssignment

Owner name:M7 VISUAL INTELLIGENCE, L.P., TEXAS

Free format text:ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SMITHERMAN, CHESTER L.;REEL/FRAME:050808/0063

Effective date:20100410

Owner name:VISUAL INTELLIGENCE LP, TEXAS

Free format text:CHANGE OF NAME;ASSIGNOR:M7 VISUAL INTELLIGENCE LP;REEL/FRAME:050810/0325

Effective date:20100902

FEPPFee payment procedure

Free format text:ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

FEPPFee payment procedure

Free format text:ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: SMAL); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

ASAssignment

Owner name:VI TECHNOLOGIES, LLC, TEXAS

Free format text:ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:VISUAL INTELLIGENCE, LP;REEL/FRAME:054600/0663

Effective date:20201124

ASAssignment

Owner name:VI TECHNOLOGIES, LLC, TEXAS

Free format text:ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:VISUAL INTELLIGENCE, LP;REEL/FRAME:055646/0409

Effective date:20210312

