US5155683A - Vehicle remote guidance with path control - Google Patents

Vehicle remote guidance with path control

Info

Publication number: US5155683A
Authority: US (United States)
Prior art keywords: vehicle, screen, camera, path, operator
Legal status: Expired - Fee Related
Application number: US07/683,706
Inventor: Wadiatur Rahim
Current Assignee: Individual
Original Assignee: Individual
Priority date: 1991-04-11
Filing date: 1991-04-11
Publication date: 1992-10-13

Legal events:
Application filed by Individual
Priority to US07/683,706
Application granted
Publication of US5155683A
Anticipated expiration
Current status: Expired - Fee Related


Abstract

In a remotely piloted surface vehicle, a television camera is used to send images to a screen at the operator's station. A computer displays the vehicle's intended path on the screen. The path appears as a computer-generated line superimposed on the image of the vehicle's environment, like a stripe painted on the ground. The operator can change or advance the path on the screen with a cursor control. A computer picks certain discrete screen points along the line and maps those screen positions onto ground positions called "waypoints". These are sent to the vehicle's guidance system to direct the vehicle along a path through the waypoints. The transform which maps the screen path onto the ground path depends on the camera orientation and lens. The transform parameters can be adjusted as the camera zooms, pans and tilts. Each time the screen is refreshed, the path line for that screen is calculated by the computer from the ground path, to present the ground path correctly in the new screen. The operator can extend or modify the path at will. The system is especially adapted to cases where the narrow bandwidth of the radio link between the camera and the station limits the screen refresh rate. The system maximizes the possible speed of the vehicle by presenting the path information in a format quickly grasped by the operator.

Description

FIELD OF THE INVENTION
The present invention relates to remote control systems for unmanned surface vehicles which are guided by an operator, in systems where the operator can see the vehicle environment by a television link.
DESCRIPTION OF THE PRIOR ART
Remote vehicle guidance has many areas of possible application. An operator riding within the vehicle may be impractical if the vehicle is small or underpowered. A vehicle operator may also be endangered in poisonous environments or in military vehicles during a battle.
A system for remote control must transmit commands from the operator's station to the vehicle, and the operator also needs feedback from the vehicle. The typical way to feed back information to the operator is through a video link from a television camera on the vehicle. Television cameras are now small and rugged enough for almost any application. It might be expected that an operator sitting at a remote station could control a remote vehicle as well by watching the environment through a high-resolution TV screen as by sitting in the vehicle and looking through a window. Unfortunately, however, the usefulness of remotely controlled vehicles is often limited by poor image feedback to the operator. In most cases this is because radio links are the only practical way to send the images from the TV camera on the vehicle back to the operator's station.
Television, or any other system which transmits images, requires a large radio bandwidth because information is transmitted at a high rate. Images which are sent in real time, and which have reasonable resolution, may require a megahertz or more of bandwidth. Clearly, finding a place in the radio spectrum for the video link can be hard. In the case of military vehicles, there could be hundreds or even thousands of vehicles all operating at once during a battle, each needing its own uncluttered band.
Single-sideband radio, data compression techniques, and computer image enhancement can help somewhat. Special techniques like fiber-optic tethers and laser or microwave links could entirely solve the bandwidth problem, but they are impractical for other reasons. Tethers are obviously limited by length, fragility, and fouling. Infrared, visible and microwave links can only be used as line-of-sight beams, which must be accurately aimed at the moving vehicle and which are cut off by intervening objects, including the horizon. For radio links of ordinary frequencies and ranges, the fundamental constraints imposed by the mathematics of Fourier's Theorem and information theory often will mean that the images coming to the operator are either very grainy, or alternatively, that they can only be refreshed at intervals greater than the persistence time of the eye (about a twentieth of a second) and so will flicker or appear as a series of still images. Image resolution and refresh rate can be traded off against one another, but the graininess of the image cannot be arbitrarily increased; so with limited bandwidth, the refresh rate will be slowed.
(Even if the refresh rate is of the order of a minute, useful information will be presented at intervals; but if the grain is too coarse, objects will not be seen and the operator will end up colliding the vehicle with some obstacle.)
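To make the bandwidth/refresh-rate tradeoff concrete, here is a back-of-the-envelope calculation. The resolution, bit depth, and link rate below are illustrative assumptions, not figures from the patent:

```python
# Back-of-the-envelope only: resolution, bit depth, and link rate are assumptions.
bits_per_frame = 320 * 240 * 8   # assumed 320x240 frame, 8 bits per pixel
link_rate_bps = 64_000           # assumed narrow 64 kbit/s radio link

seconds_per_frame = bits_per_frame / link_rate_bps
print(f"refresh interval: {seconds_per_frame:.1f} s per frame")  # ~9.6 s
```

At roughly ten seconds per frame, the operator is far past the point where the screen can be treated as real time, which is exactly the regime the following paragraphs address.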
Once the screen refresh rate drops to the point where a second or more is elapsing between frames, driving the remote vehicle becomes very difficult. The operator will tend to react to a screen as if it is in real time, that is, as if the vehicle were at the ground position from which the screen image is taken; but the vehicle is elsewhere by the time the operator views the image. A delay is introduced into the feedback loop which sends the operator's commands to the vehicle and the camera's information to the operator. If the operator is under stress or must quickly make many decisions, as in a battle, he or she is likely to control the vehicle badly--even if trained to take the delay into account. As the delay becomes longer, the problem is aggravated.
The prior art has dealt with this problem in various ways. One approach is to limit high resolution to only a portion of the viewing screen; this portion is picked by the operator to view an area of interest. The area of interest appears in focus, surrounded by a blurred area. This approach is discussed by Kanaly in U.S. Pat. No. 4,405,943. The loss of information with this system is obvious. Complication and operator confusion are introduced by the requirement of picking an area, and by the extra hardware and/or software required.
Graham, in U.S. Pat. No. 4,682,225, discloses another system. Graham discusses image compression, which involves sampling the data stream from the camera at intervals and transmitting only the sampled data. The resulting screen image is blurred. If the camera is almost stationary, image clarity can be improved by superimposing the blurred images of the sampled data. (The human eye will itself do this over intervals less than a second.) Basically, this system trades off clarity or detail in favor of an appearance of continuous "real time" motion on the screen. The same bandwidth which could transmit high-resolution screens at intervals instead transmits a multitude of blurred screens. If the camera is panned, zoomed, or jiggled, the technique is totally ineffective. Also, if the superposition is done by hardware or software rather than in the eye, cost and complexity are involved.
Hinman, in U.S. Pat. No. 4,661,849, discusses interpolation between discrete screen images or frames by computer simulation. This presents an appearance of smooth motion to the viewer. Such a system is costly in computer hardware and software running time, and may mislead the operator by presenting an impression of real time events which are fictitious projections instead of the real environment.
Narendra et al. (U.S. Pat. No. 4,855,822) also employ a computer to generate interpolated images between the discrete images sent by the camera, so as to present an impression of continuous motion to the operator. Their interpolations are determined by the motion of the vehicle. Narendra et al. also disclose the idea of superimposing an image of the vehicle on the screen. Conventional bandwidth compression techniques are used by Narendra et al.
The Jet Propulsion Laboratory has developed a system called Computer Aided Remote Driving (CARD). The CARD system is described in a paper, "Computer-Aided Remote Driving", presented at the 13th annual meeting of the Association for Unmanned Vehicle Systems in Boston, MA on Jul. 21-23, 1986. The paper is authored by Brian H. Wilcox, Robert Salo, Brian Cooper, and Richard Killon, Technical Group Supervisor at the Jet Propulsion Laboratory of California Institute of Technology in Pasadena, CA.
CARD is intended for interplanetary remote control of vehicles and for military applications. In remotely driving a vehicle on another planet, narrow-bandwidth data restrictions are compounded by message delays due to the finite speed of radio signals. Vehicle speed is not crucial on another planet, but may be in a military application.
The CARD system uses two high-resolution cameras to generate a stereo image for the operator. The operator views both images at once, one with each eye, to see the vehicle environment in three dimensions. The viewing system has two screens with relatively crossed Polaroid filters, two half-silvered mirrors to superimpose the images, and Polaroid glasses worn by the operator to isolate the two images. Three-dimensional viewing may be helpful when the operator is viewing an extraterrestrial environment and is less able to extrapolate distances from unfamiliar objects.
The CARD operator at the control station sends a signal to the vehicle to transmit the stereo images, and waits for all the data for both screens to arrive at the station and to appear in the stereo viewer. Then the operator uses a three-dimensional control to denote points in the space seen in the viewer. (A three-dimensional control is one with three degrees of freedom; CARD uses a joystick with a rotatable knob.) The control drives a cursor which is superimposed on the picture which the operator sees, and which appears to move about in space in response to the operator's motion of the three-dimensional control.
A computer at the station takes the three dimensions of joystick motion and turns them into Cartesian space coordinates x, y, z at the vehicle location; it then transforms those coordinates into individual screen positions for the two viewing screens, so that the operator sees the cursor located in space in the stereo image. The transform from space coordinates to screen coordinates can easily be programmed from the geometry.
The operator, by depressing a button, can denote any cursor position as a waypoint. He or she denotes a series of waypoints to define points of a path in the space seen in the viewer, over which the operator wants the vehicle to travel. When all the waypoints are denoted, the operator pushes the "go" button. The station computer then takes the control readings recorded from the waypoints, transforms them into the appropriate commands (vehicle angles, segment lengths, compass headings), and relays these commands to the vehicle. The received commands tell the vehicle's guidance system how to proceed. The vehicle automatically responds to the commands by moving to the next waypoint; eventually it reaches the final point. It then begins the process over by sending two more images.
Neither the station computer nor the on-board computer calculates any curve from the waypoints: the vehicle moves straight to the next point, turns abruptly, and then goes on to the next. The station computer interrogates the vehicle's on-board computer about the vehicle's heading after each leg of the path is traversed, and instructs the vehicle for the next leg of the path.
CARD avoids the feedback problem by eliminating any semblance of real-time driving, and instead presenting the operator with a static problem: given a picture, chart a path through it. Operator confusion is eliminated, but at the cost of dead-time in the control cycle. The operator must wait while the vehicle laboriously goes through all of the waypoints, takes a picture, and transmits the image over the slow radio.
Being sluggish, CARD is not adapted to any use in which the operator should react quickly to changes in the vehicle environment, such as military use. CARD also has the drawback that it effectively halves the bandwidth of the radio link by presenting two stereo images, instead of only one. Moreover, the resolution needed for each of the combined stereo images is substantially greater than the resolution needed for a single monoscopic image of equal clarity. This is because higher resolution is needed to locate objects in the depth dimension when the images are combined. This need further decreases the effective bandwidth.
CARD uses conventional data compression techniques to decrease the bandwidth by about a factor of four. Such techniques are too slow for real time video, but are effective with slower transmission.
The CARD prototype described in the paper uses solid-state cameras 0.5 m apart. The cameras have a grain of 320 pixels per horizontal line, giving a 1-pixel stereo offset for objects at a range of 300 m. The vehicle includes a magnetic compass and odometer for dead reckoning calculation of vehicle position by the small on-board computer.
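These figures can be sanity-checked with the standard stereo disparity relation, disparity = f·B/Z, where f is the focal length in pixels, B the baseline, and Z the range. The field of view assumed below is not stated in the paper; it is chosen only to show the numbers are mutually consistent:

```python
import math

baseline_m = 0.5        # camera separation (from the paper)
range_m = 300.0         # range at which stereo offset is 1 pixel (from the paper)
pixels_per_line = 320   # horizontal resolution (from the paper)

fov_deg = 30.0          # assumed horizontal field of view (not in the text)
f_px = (pixels_per_line / 2) / math.tan(math.radians(fov_deg / 2))

disparity_px = f_px * baseline_m / range_m
print(f"f ~ {f_px:.0f} px, disparity at 300 m ~ {disparity_px:.2f} px")  # ~1 px
```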
None of the above inventions and patents, taken either singly or in combination, is seen to describe the instant invention as claimed.
The prior art does not disclose any system for driving a vehicle by remote imaging under low screen refresh rates which is adapted to real-time driving; which is easy and natural for an operator to use; which is simple and reliable; which is inexpensive; and which allows the operator to react in the least possible time.
Accordingly, one object of the present invention is a vehicle remote imaging control system which does not confuse the operator with time delays in the control loop.
Another object is a system which is as simple as possible given the constraints of slow image data transfer, so as to be reliable and inexpensive.
A further object is a system with minimal computer hardware and software requirements.
An additional object is a system which allows a vehicle to proceed at the utmost speed.
A final object is a system which uses only available, proven technology.
These and other objects of the present invention will become readily apparent upon further review of the following specification and drawings.
SUMMARY OF THE INVENTION
In a remotely piloted surface vehicle, where a television camera on the vehicle is used to send images to a screen at a vehicle operator's station, the present invention comprises an improved and simplified system for remote driving. The system is especially adapted to slow video data transfer rate situations, where real-time video is unavailable and the operator can see only discrete "snapshot" image frames on the screen.
The vehicle's intended path is displayed on the operator's viewing screen. The path appears as a computer-generated line superimposed on the image of the vehicle's environment, appearing like a stripe painted on the ground. A screen cursor appears at the end of the line. The operator can change or advance the path line on the screen with a cursor control device, which might be a joystick, mouse, steering wheel and pedals, or any other control having two degrees of freedom.
As the line is extended by the operator, the computer picks certain discrete screen points along the line extension. The computer then maps these points onto ground positions in the vehicle environment by a mathematical transform. The ground positions are called "waypoints". These are sent to the vehicle's guidance system to direct the vehicle along a path through the waypoints. The guidance system has a memory which stores the waypoints and directs the vehicle successively over them.
The transform which maps the screen path onto the ground path uses simple trigonometric formulas and perhaps coordinate transformations. The transform and its parameters depend on the camera orientation and lens. The transform parameters can be continuously adjusted if the camera zooms, pans or tilts.
In the usual low video data rate situation, the operator will see a sequence of still frames or "snapshots". A new frame will replace the old one automatically at intervals determined by the data rate and the screen resolution.
The vehicle's computer includes an image buffer to store data from an instantaneous view of the camera. This data is sent to the station. Once all the data have arrived, that frame is automatically displayed on the operator's screen. The operator sees a frame taken some time ago.
For each new screen the path line is recalculated from the reported position of the vehicle relative to the ground points. The recalculated path line is then superimposed on the screen so as again to appear to lie on the surface, and the operator can quickly perceive the new situation of the vehicle and correct the projected path.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a perspective view of an operator controlling the radio message from a computer.
FIG. 2 is a perspective view of the controlled vehicle, showing obstacles, a trapezoidal area seen by the camera on the vehicle, and waypoints on the ground.
FIG. 3 is a schematic elevation view showing a screen path and rectangular coordinates on the screen.
FIG. 4 is a schematic plan view showing the rectangular coordinates and screen path of FIG. 3 transformed into ground coordinates and waypoints on the ground.
FIG. 5 is a perspective view of the vehicle.
FIG. 6 is a schematic showing the flow of vehicle control information through the system.
FIG. 7 is a schematic showing the flow of camera control information through the system where the camera is movable relative to the vehicle body.
Similar reference characters denote corresponding features consistently throughout the attached drawings.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
Some definitions of terms, used in the following Description and Claims, are:
"Camera" means film camera, infrared TV, radar, imaging sonar, or any other imaging apparatus whatsoever, as well as the typical visible light scanning television camera.
"Computer" means a digital computer, analog computer, hard-wired logic circuits, or any calculating device with memory.
"Cursor" means any visible, movable highlight, emblem, outline shape, etc. which can denote a position or area on a screen, and which can move in at least two dimensions. The cursor may include means for denoting direction as well as position, particularly by its shape. Typically, the cursor will be a flashing highlight on a CRT screen. The cursor may be shaped as a point, an arrow, an outline of the vehicle as seen in perspective at the cursor location, or any other shape.
"Downlink" refers to radio transmission of information from the vehicle to the station.
"Frame" means a single static view or image which fills the screen.
"Ground" means a surface over which the vehicle moves, be it concrete, carpeting, water, or any other surface. The invention is not limited to land vehicles traveling over earth.
"Radio" means communication by waves, such as electromagnetic waves. Sonar is one possible such wave.
"Screen" means any viewing device which appears to be flat or two-dimensional to the operator or viewer (as opposed to stereoscopic). Ordinarily the screen of the present invention will physically be a surface, such as the glass surface of a CRT. A liquid crystal screen, or any other surface capable of displaying a path and some image corresponding to the vehicle environment, may be used. Emblems or icons whose positions correspond to those of objects or conditions near the vehicle may be used instead of, or in addition to, regular transmitted pictures. The definition does not exclude monocular viewers such as helmet-mounted devices, or viewing goggles with individual lens systems presenting the same image, which do not incorporate a physical surface for viewing. Enhanced images are also within the scope of the invention.
"Uplink" refers to radio transmission of information from the station to the vehicle.
"Vehicle" means any device adapted to travel on or above a ground surface, having means to power the device to move and having means for controlling the motions.
"Waypoint" means a ground position referable to the coordinates of the vehicle guidance system or steering geometry. A waypoint may also be a position combined with a heading and/or a speed. It may also be any time derivative of position, such as acceleration. In general, it is a datum or data set to which vehicle position is referable. A waypoint may include elevation as well as ground position.
The present invention, as seen in FIGS. 1 and 2, is a system for presenting future vehicle path information to the operator O (FIG. 1) of a remotely controlled vehicle V (FIG. 2), where the vehicle V sends images back to the operator's station S from a camera 30 mounted on the vehicle V. It employs a station computer 16 to superimpose on a station screen 14 a screen path line 12 showing the intended future path of the vehicle V over the surface. The operator O takes the line 12 to lie on the ground surface, like a painted stripe on a highway; the system is designed for this operator perception, and includes software to generate a projected ground path as a line on screen.
Radios 24 on the vehicle V and at the station S send data back and forth.
The operator O, seen at the control station in FIG. 1, uses a cursor control joystick 10 to extend or modify the screen path line 12 as seen on the screen 14. The operator O traces out an apparent path for the vehicle V with the aid of a cursor 18 which is superimposed on the viewing image at the end of the line 12. The operator O can move the cursor 18 about at will with the joystick 10.
The line 12 must be transformed into a planned ground surface path by a simple computer program, which maps the position of any point on the screen 14 into a corresponding ground point on the ground traversed by the vehicle V. The mapping transform will ordinarily map the rectangle of the screen onto the trapezoid on the ground which is seen by the camera 30, as shown in FIG. 2. The mapping program parameters will depend upon the altitude, attitude, and focal length of the camera lens 36.
The transform idea is illustrated schematically in FIGS. 3 and 4. FIG. 3 shows the viewing screen of FIG. 1, but no camera image of the vehicle environment is shown: instead is shown a rectangular grid corresponding to a possible system of screen coordinates. (This grid is not a physical image seen by the operator O, but is rather a mathematical projection to explain the transform.) A screen path is also shown, in the form of a line 12. The screen path ends with a cursor 18 in the shape of an arrow. FIG. 4 is a bird's eye or plan view of the ground in front of the vehicle V, showing the rectangular grid of FIG. 3 transformed into a trapezoidal grid. The screen path of FIG. 3 has been transformed into discrete "waypoints", labeled 20, which outline the planned ground path of the vehicle; the generation of the waypoints 20 is explained below. Both the grid lines and waypoints of FIG. 4 are, like the grid of FIG. 3, non-physical.
The transform is performed by a computer or computers at the station. This function is illustrated by the box labeled "coordinate transform and waypoint generator" in schematic FIG. 6. FIG. 6 shows the flow of information in the present invention: arrows represent information, and boxes are devices and/or processes such as computer algorithms. The reader will find reference to FIG. 6 helpful in the following discussion.
It is possible for the operator's cursor control to directly generate planned ground coordinates. The computer would in this case transform those coordinates back to screen coordinates for screen 14 display. The essential thing is that the operator work in the view of the screen and that the screen path be correctly transformed into a planned ground path.
Usually the transform will require only trigonometry. If an unusual camera lens 36 is used, for example a fisheye lens, more complex coordinate transformations will be needed. In any case, the formulas of transformation are simple, straightforward, and easy for one skilled in the art to implement.
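As a concrete illustration of such a trigonometric transform, here is a minimal sketch, not the patent's specified program: it assumes a flat ground plane and an ordinary pinhole camera at a known height above the ground, tilted down from horizontal, with focal length expressed in pixels. All names and units are hypothetical:

```python
import math

def screen_to_ground(u, v, f_px, cam_height_m, tilt_rad):
    """Map a screen offset (u right, v down, in pixels from image center)
    to a ground point (x forward, y right, in meters) in vehicle coordinates.

    Assumes a pinhole camera cam_height_m above a flat ground plane,
    pitched down from horizontal by tilt_rad, with focal length f_px pixels.
    """
    # Denominator is proportional to how far below the horizon the pixel lies;
    # pixels on or above the horizon (v <= -f_px * tan(tilt)) hit no ground.
    denom = v * math.cos(tilt_rad) + f_px * math.sin(tilt_rad)
    if denom <= 0:
        raise ValueError("screen point on or above the horizon")
    s = cam_height_m / denom
    x = s * (f_px * math.cos(tilt_rad) - v * math.sin(tilt_rad))  # forward
    y = s * u                                                     # lateral
    return x, y

# Example: camera 1.5 m up, tilted 20 degrees down, f = 600 px; the image
# center maps to where the optical axis meets the ground (h / tan(tilt)).
print(screen_to_ground(0, 0, 600.0, 1.5, math.radians(20)))  # (~4.12, 0.0)
```

Note how the horizon emerges naturally from the geometry: screen points at or above v = -f·tan(tilt) have no ground intersection, matching the horizon behavior discussed later in this description.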
The control joystick 10 shown in FIG. 1 is well adapted to a simple point cursor 18. Other controls might be adapted to other sorts of cursors. For example, if an angle control such as a steering wheel is used, the cursor 18 might be in the shape of an arrow, whose angle corresponds to the angle of the steering wheel. The joystick 10 of FIG. 1 can include a rotatable knob for angle.
The cursor 18 may also be made large and shaped like the outline of the vehicle V. This form of cursor 18 is illustrated in FIG. 1. The operator O may then guide the cursor 18 outline through narrow gaps as shown in FIG. 1, and avoid a path if the outline will not fit through. In this case there could be two screen path lines 12 instead of the one illustrated in FIGS. 1 and 3, one line trailing from each outer edge of the vehicle outline cursor 18. The outline should be displayed in perspective, so as to appear on the screen image of the ground as would a shadow of the vehicle V cast by a sun directly overhead. The size and shape of the outline can easily be calculated for the display. Such an outline is easily programmed by one skilled in the art. The outline of the vehicle V may also be simplified to a rectangle or bar indicating the width of the vehicle V.
Preferably, the cursor 18 sweeps out the line 12 on the screen, just as a marker leaves a line on paper, to denote the screen path. As an alternative, the cursor 18 might be used to set the screen positions of the individual ground waypoints 20 through which the vehicle V would pass, like dots on paper. The screen points could be clicked on with a button and appear on the screen as dots or cursor shapes. The computer 16 would then transform the coordinates of the screen path points into ground waypoint coordinates for radio uplink transmission to the vehicle's guidance system. However, operator choice of waypoints 20 puts an extra workload on the operator, so the waypoint choice is best left to the station computer, which can pick waypoints 20 without inordinate hardware and software requirements.
If the path line 12 is continuous, the station computer 16 will pick ground waypoints 20 based on some decision algorithm which keeps the waypoints 20 spaced closely enough to track the vehicle V. The infinity of points defining a continuous line cannot be transmitted to the vehicle V, and is not needed. The guidance system of the vehicle can regenerate a planned ground path through the waypoints 20 and then follow that path. The required spacing of the transmitted waypoints 20 will depend upon the sophistication of the path regeneration program of the vehicle's guidance system.
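One simple decision algorithm is to walk along the transformed ground path and emit a waypoint after every fixed increment of distance. A minimal sketch, assuming the screen path has already been transformed into a dense list of ground points (the patent leaves the choice of algorithm open):

```python
import math

def pick_waypoints(ground_path, spacing_m):
    """Pick waypoints at roughly even spacing along a dense ground path.

    ground_path: non-empty list of (x, y) ground points, i.e. the screen
    path already transformed to ground coordinates. Returns a much shorter
    list suitable for uplink transmission. One possible algorithm of many.
    """
    waypoints = [ground_path[0]]
    travelled = 0.0
    for (x0, y0), (x1, y1) in zip(ground_path, ground_path[1:]):
        travelled += math.hypot(x1 - x0, y1 - y0)
        if travelled >= spacing_m:
            waypoints.append((x1, y1))
            travelled = 0.0
    return waypoints
```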
The coordinate transform and waypoint generation computer may display the screen path swept out by the cursor 18 as a line 12, or may generate multiple images on the screen, located at the waypoints or elsewhere. Any display which informs the operator O of the screen path of the vehicle is within the scope of the present invention.
Since it may be necessary to readjust the projected path of the vehicle in the face of emergencies or miscalculations, the cursor control should have the capability of erasing the end of the screen path, that is, "back-tracking". It may be helpful to have a separate cursor reverse control which would erase the screen path line 12 from the end of the line back toward the vehicle.
The vehicle guidance system, which receives the waypoint coordinates from the station's transform and waypoint generation computer by way of the radio uplink, must include a guidance system computer 26 as seen in FIG. 1, or equivalent hard-wired circuitry, which has a memory. The memory stores the coordinates of the projected waypoints 20, which define a ground path for the vehicle V. This allows the operator O to set a planned ground path for the vehicle V to follow while the next frame is being transmitted. The guidance memory is an essential element of the present invention, as it allows driving in the future time of the instant of the snapshot frame, and avoids feedback loop trouble.
Because the waypoints 20 need to be erased when the path is changed, the guidance system computer memory may conveniently be of the last-in, first-out stack type.
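A sketch of such a memory (hypothetical interface, not the patent's specified implementation): the vehicle consumes waypoints oldest-first as it drives, while operator back-tracking erases them newest-first, so a double-ended queue serves both needs:

```python
from collections import deque

class WaypointMemory:
    """Waypoint store for the guidance computer: waypoints are consumed
    oldest-first as the vehicle drives, but erased newest-first when the
    operator back-tracks (the last-in, first-out behavior noted above)."""
    def __init__(self):
        self._points = deque()

    def add(self, waypoint):      # new waypoint received over the uplink
        self._points.append(waypoint)

    def backtrack(self):          # operator erased the end of the path
        return self._points.pop() if self._points else None

    def next_waypoint(self):      # next point for the vehicle to steer to
        return self._points.popleft() if self._points else None
```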
The guidance system may be of any type which allows the vehicle V to automatically steer through the waypoints 20. The vehicle may rely on dead reckoning to guide it on its path along the waypoints. Any method of tracking distance and direction, such as odometers, integrating accelerometers, compasses, gyroscopes, or other conventional guidance means, is feasible.
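A minimal dead-reckoning update of the conventional kind mentioned here, assuming an odometer increment and compass headings as inputs (all symbols are illustrative):

```python
import math

def dead_reckon(x, y, prev_heading_rad, new_heading_rad, odo_delta_m):
    """One dead-reckoning position update from an odometer increment and
    two successive compass readings. A sketch of the conventional method;
    heading is measured from the x axis."""
    # Integrate the travelled distance along the average heading over the
    # (assumed short) interval between readings.
    mid = (prev_heading_rad + new_heading_rad) / 2.0
    return x + odo_delta_m * math.cos(mid), y + odo_delta_m * math.sin(mid)
```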
The guidance system need not receive operator commands directed toward the motion parameters of the vehicle V, such as vehicle speed and steering wheel angle. The transmitted waypoints 20 are sufficient to track the vehicle V and control its speed (although direct speed control may be advantageous in some cases, and is not outside the scope of the invention). The more sophisticated the guidance program, the less need there will be for commands additional to the waypoint coordinates.
The guidance computer accepts the waypoint coordinates from the radio uplink as input, and outputs signals directly to the vehicle servomechanisms which control the motor, brakes and steering of the vehicle V. (Motor, etc. are listed as examples. Any sort of physical controls, depending on the particular type of vehicle, may be part of the invention. In this Description and in the following Claims, "vehicle servomechanism" means any means for physically controlling a vehicle according to signals from the guidance system.)
The physical motions of the vehicle V in response to the guidance system signals will not ordinarily be completely predictable. Wheel slip, steering gear backlash, or water or air currents (in the cases of vehicles which are boats or planes), will all throw off the intended path. In other words, the planned ground path and the executed ground path may differ. For this reason, the vehicle will ordinarily include a navigation system consisting of sensors and means for reporting the vehicle's attitude, speed, etc. to the vehicle guidance system and to the operator's station computer.
The sensors may be odometers, magnetic compasses, accelerometers with position integrators, speedometers, gyrocompasses, satellite position sensors, or any other means of detecting the position, attitude, or state of the vehicle V.
(The guidance system may act upon the input of non-navigational sensors as well. Such sensors would detect vehicle environmental conditions or dangers, such as mines, quicksand, etc. Such a sensor 22 is shown in FIG. 5.)
The feedback of the navigation system to the guidance system is optional in cases where the vehicle motion is predictable from the signals sent to the vehicle servomechanisms. However, the feedback of the vehicle V position to the station computer is not optional, because the vehicle position at the time of the frame snapshot must be known to the station computer's screen path generator if the screen path is to be correctly displayed on thescreen 14. Therefore a device to maintain and report the vehicle's current position at any time is an essential element of the present invention. The navigation system with its sensors may be omitted if dead reckoning calculations in the guidance system are relied upon to maintain a current vehicle position, and the guidance system has means for reporting the position at the time of a snapshot to the station.
The projected path of the vehicle V extends from the vehicle position where the most recent frame was taken by the camera; this is the reference point for all calculations and motions. The path will extend up to the last waypoint 20 transmitted by the operator O. The guidance system need not stop or slow the vehicle V until it nears the last waypoint 20 in the path.
The vehicle, when it transmits a frame, will also transmit a report on its position at the time of the "snapshot". This frame position will be referenced to the waypoints. The current vehicle position, as discussed above, is maintained by the navigation system or guidance system.
Each time the screen is refreshed with a new frame, a new screen path line 12 for that frame is calculated by the computer of the screen path generator. Input for the calculation is the vehicle's reported frame position and the stored waypoint positions; output is the placement of the line 12 on the screen. The line 12 is constructed from the waypoints by an algorithm which operates inversely to the algorithm which picks the waypoints from the line.
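Under the same flat-ground, pinhole-camera assumptions as the earlier sketch, the inverse mapping that places a stored ground point back on the screen might look like this (hypothetical names; each waypoint would first be expressed relative to the reported frame position and heading):

```python
import math

def ground_to_screen(x, y, f_px, cam_height_m, tilt_rad):
    """Inverse of the screen-to-ground sketch: project a ground point
    (x forward, y right, meters, relative to the camera's ground position)
    back to screen pixels (u right, v down from image center)."""
    # Depth along the optical axis; points behind the camera can't be drawn.
    depth = x * math.cos(tilt_rad) + cam_height_m * math.sin(tilt_rad)
    if depth <= 0:
        raise ValueError("ground point behind the camera")
    u = f_px * y / depth
    v = f_px * (cam_height_m * math.cos(tilt_rad) - x * math.sin(tilt_rad)) / depth
    return u, v

# Round trip with the earlier example: the on-axis ground point at ~4.12 m
# forward projects back to the image center (0, 0).
print(ground_to_screen(4.121, 0.0, 600.0, 1.5, math.radians(20)))
```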
The position of the vehicle V at the time the frame is taken will generally be intermediate between waypoints. The intermediate position may be referenced in various ways: the coordinate origin may be shifted to the last-passed waypoint; the position may be specified as a path distance from the last waypoint, and the path reconstructed by a station computer according to the same algorithm which the guidance system computer uses to generate the ground path from the waypoints; or some other methods may be used.
It should be noted that the screen path is referenced to both the ground position of the vehicle at some instant and the frame snapshot taken at the same instant. Even if the executed ground path has drifted away from the planned ground path, and the navigation system has not corrected the error, the screen path, the image of the frame, and the ground path are all kept synchronized by being reset to the same position and orientation with each new frame. There is no accumulated drift to invalidate the operator's commands. Thus both direction and distance sensors may be simple, relatively low-accuracy types, as they are constantly "recalibrated".
Because the operator O is using the cursor control to pick the vehicle's future path, and not its present actions, there is no feedback lag to throw the operator's reactions off.
Ordinarily, the TV camera 30 will be a visible light camera with a low-distortion lens 36. The operator's screen 14 will then present an image similar to that which he or she would see through a window on the vehicle. As discussed above, the coordinate transform of the screen path to the planned ground path maps the line 12 into waypoints 20 on the ground. The ground trapezoid is the transformed shape of the rectangular screen 14. (A low-distortion lens maps a ground trapezoid onto its rectangular film plane.)
If the camera is tilted to roughly horizontal, the horizon will be in the picture, and the trapezoid will extend to infinity. The portion of the frame above the horizon will be outside the range of the mapping transform.
Partial simulations, such as enhanced images resulting from calculations between adjacent pixels, are within the scope of the present invention. Images which are transformed to overcome lens distortion or to present a rotated, expanded, or intentionally distorted image to aid operator comprehension, are also within the scope of the present invention, as are added or enhanced portions of the screen image (for example, target sights or flashing highlights) and inset or superimposed images to denote objects identified by auxiliary systems.
Images which are generated as time projections or predictions from data received, or which are interpolations of discrete screen views separated in time, are not within the scope of the present invention.
In the present invention, the operator O is presented with a "snapshot" still frame of the ground terrain whenever the screen is refreshed. This is accomplished by video data storage in two buffers.
The camera 30 will typically be a horizontal-sweep, top-to-bottom scan TV. The camera 30 will scan once in a short time, to "snap" a picture or frame, and send the image data to an image buffer in one of the vehicle's on-board computers for storage. The stored data represents one complete frame. This video data is sequentially retrieved from the image buffer and sent over the radio downlink to the station, where the video data is again stored in a display buffer of the station computer memory as it comes in. Once all the data for a complete frame has arrived, the stored frame is displayed on the station screen 14 continuously until the data for the next complete frame has arrived. The operator O thus sees a still picture frame taken from the position of the vehicle V at an instant in the past, which remains on the screen until the next frame is ready for display. The time between the snapshot and display is nearly equal to the time required to transmit the video data over the radio downlink, because the downlink sends only the vehicle position in addition to the video data, and the transmission time is very small. If transceiver radios 24 are used, the uplink time must be included in the complete frame cycle of radio transmission, but this also requires only a brief interval compared to the time needed to transmit the great amount of video data. This is the preferred method, because only one radio band is needed for each vehicle; the total bandwidth for a set of vehicles is halved.
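The station-side display buffering described above can be sketched as follows (a hypothetical interface assuming a fixed frame size, not the patent's hardware): data accumulates as it trickles over the downlink, and the visible frame is swapped only when a complete frame has arrived.

```python
class DisplayBuffer:
    """Accumulate downlinked video data and swap the displayed frame only
    when a complete frame has arrived, so the operator always sees one
    whole still picture. Sketch with hypothetical interfaces."""
    def __init__(self, frame_size_bytes):
        self.frame_size = frame_size_bytes
        self.incoming = bytearray()
        self.displayed = None      # frame currently held on the screen

    def receive(self, chunk):
        """Called for each burst of video data arriving over the downlink."""
        self.incoming.extend(chunk)
        if len(self.incoming) >= self.frame_size:
            self.displayed = bytes(self.incoming[:self.frame_size])
            del self.incoming[:self.frame_size]
```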
(The system of the present invention is also adapted to wide-band burst transmission, in which each vehicle uses the same radio frequency in turn. In this case the image can be sent as it is scanned, and no storage buffer is needed on the vehicle. A display buffer is still needed at the station if the time between frames is greater than the persistence time of the eye, that is, about a twentieth of a second.)
In some cases the camera 30 should have the ability to tilt, pan and zoom in response to operator commands. It may also be necessary to move the camera 30 from place to place on the vehicle V, or extend it on a boom. If this capability exists, the parameters of the coordinate transforms from the line 12 to the ground waypoints 20 will change as the camera moves and changes its focal length. These changes must be reported to the station computer for generating the screen path and the waypoints. The camera position will not automatically "reset" as will the vehicle position. The camera must be kept in reference to the vehicle body for the present invention to work.
Camera motion relative to the vehicle V requires a camera mount 32, shown in FIG. 5. The mount 32 will have servomechanisms 34, and may include sensors for detecting the position of the camera 30. In FIG. 5, the sensors are incorporated into the servos 34. FIG. 5 shows a mount having tilt, pan, and transverse translation mechanisms. These are illustrative only: mounts allowing any sort of motion are within the scope of the present invention. FIG. 5 also shows a zoom servomechanism and sensor 34 mounted on the lens 36.
A schematic of a camera control system is shown in FIG. 7. If the camera controls are positive, such as by step motors, the station computer can keep track of the camera position by integrating all the camera commands as time goes by, and calculating the transform parameters at the same time; in this case the control will follow the solid arrow lines of FIG. 7. If on the other hand the camera controls are of the type which can drift, such as belt drives, then the camera mount must include sensors and means to report the attitude and focal length of the camera and lens. The extra information flow for this case is shown by the dashed lines in FIG. 7. The station computer cannot know the position of the camera by integrating past commands sent over the uplink, and so must receive reports from the camera sensors by way of the downlink.
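For the positive-control case, the station-side bookkeeping amounts to integrating the uplinked camera commands (a sketch with hypothetical fields and units):

```python
class CameraPoseTracker:
    """Station-side estimate of camera pose for positive (e.g. step-motor)
    controls: each uplinked command is a known increment, so integrating
    the commands keeps the screen/ground transform parameters current
    without downlinked sensor reports. Hypothetical fields and units."""
    def __init__(self, pan_rad=0.0, tilt_rad=0.0, focal_px=600.0):
        self.pan_rad = pan_rad
        self.tilt_rad = tilt_rad
        self.focal_px = focal_px

    def apply_command(self, d_pan=0.0, d_tilt=0.0, d_focal=0.0):
        # Positive controls cannot drift, so simple summation suffices.
        self.pan_rad += d_pan
        self.tilt_rad += d_tilt
        self.focal_px += d_focal
```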
The camera may be controlled by operator commands over the uplink. The camera may also be controlled automatically, that is, by a computer. Either the guidance system computer on board the vehicle, or a station computer, may direct the camera motions. If the guidance system controls the camera, then feedback information on camera position which is needed at the station may be sent either from camera mount sensors, or else by relaying the guidance system's camera commands, over the downlink to the station. If camera control originates at the station, then feedback on camera position from the vehicle may be needed.
The present invention is intended primarily for ground surfaces which are two dimensional, but it is also adaptable to piloting vehicles which move in three-dimensional space above a surface, for example, drone airplanes and shallow-water submarines. Vehicles like these are not conceptually different from ground vehicles in which the camera is elevated to various heights on a boom or telescoping mast. Such space-traversing vehicles must include an altitude sensor, whose output is downlinked to the station computer to be used as a screen/ground transform parameter.
Two-dimensional surfaces are rarely flat. If the navigation sensors include inclinometers, and the guidance system of the vehicle has a program adapted to dealing with hills and ravines, the vehicle will guide itself more accurately. Such a program may account for foreshortening of the path as laid out on the screen, as for example, by multiplying odometer readings by the cosine of the slope to obtain the horizontal distance traveled when the waypoint generator has assumed level ground. A more sophisticated program might allow the operator O to break the path into discontinuous segments, for use when a portion of the ground is invisible on the screen, as at the crest of a hill.
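The cosine correction mentioned above, as a one-line sketch (illustrative only):

```python
import math

def horizontal_distance(odometer_m, slope_rad):
    """Correct an odometer reading for ground slope: the waypoint generator
    assumes level ground, so distance driven on a grade is scaled by
    cos(slope) to get the horizontal distance actually covered."""
    return odometer_m * math.cos(slope_rad)

# e.g., 100 m driven up a 10-degree grade covers ~98.5 m horizontally
print(horizontal_distance(100.0, math.radians(10)))
```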
As can be seen in FIG. 6, the present invention incorporates three basic feedback loops whose names are centered inside the respective loops: the operator loop, wherein the operator O determines the path; the vehicle loop, wherein the vehicle V is guided according to waypoints and commands transmitted over the uplink; and the system loop, which incorporates the radio links. The vehicle loop is not an essential element of the present invention, as the navigation system is optional. The other two loops are essential.
The discussion above outlines the preferred embodiment of the present invention, which allows the maximum vehicle speed possible given still picture frames. Video data storage in a storage buffer in the on-board computer is necessary in this embodiment. There is another embodiment which is within the scope of the invention, when used with continuous transmission on a narrow radio link (as opposed to burst transmission).
In this second embodiment the camera scans continuously and the data is sent continuously to the operator's station. The display buffer and the station computer are adapted to present a screen with two frames intermixed; the border between the two views scans down and a single frame would appear only at the beginning (or end) of a sweep. Real-time images are just above the horizontal border line; time delay increases with distance upward from the border to the top of the screen, and then increases yet more as the view "wraps around" to the bottom: the image just below the border is that part farthest in the past.
This method of image transmission is the closest to real time that is possible with a slow radio link (one part of the image is always in real time). However, distortion will be introduced into the views by this method, since the vehicle is moving while the TV camera scans. As a result, circular objects would appear egg-shaped. Since scanned frames present the most up-to-date information possible, some distortion might be tolerable; alternatively, the display screen or the station computer could compensate for the distortion.
This second embodiment also requires adjustment of the vehicle computer guidance system's reporting of the vehicle position. Some particular time must be designated for position references. Also, the programs used to calculate the screen path/ground path transforms would need to be modified.
In the present invention, the various components discussed above may be replaced by functional equivalents which are within the scope of the present invention according to the definitions above. It is to be understood that the present invention is not limited to the sole embodiment described above, but encompasses any and all embodiments within the scope of the following claims.

Claims (10)

I claim:
1. A control system for an operator to control a vehicle from a remote station, said system comprising:
a camera on said vehicle for gathering image data;
means for sending said image data from said camera to said station, said means for sending said image data including a radio downlink from said vehicle to said station;
a screen at said station for displaying images from said camera, for viewing by the operator;
means for generating a cursor and a screen path on said screen, for viewing by the operator;
a cursor control for the operator to move said cursor on said screen to determine placement of said screen path on said screen;
transform and waypoint generation means for geometrically transforming said screen path into a planned ground path, for determining waypoints along said planned ground path, and for assigning waypoint coordinates to said waypoints;
a radio uplink from said station to said vehicle for sending said waypoint coordinates to said vehicle;
a vehicle guidance system to guide said vehicle over an executed ground path, said executed ground path including said waypoints, said vehicle guidance system adapted to accept said waypoint coordinates, said vehicle guidance system including a guidance system memory for storing said waypoint coordinates, said vehicle guidance system including vehicle servomechanisms to physically control motions of said vehicle; and
means for reporting a frame position of said vehicle to said transform and waypoint generation means over said downlink; whereby
said screen will display an image of said vehicle environment taken by said camera at said frame position, the operator will trace out said screen path with said cursor control, and said vehicle will follow said corresponding ground path automatically.
2. The control system as in claim 1, wherein said means for reporting said frame position of said vehicle includes
a navigation system for maintaining a current vehicle position relative to said waypoints, said navigation system including sensors for detecting position, motion or orientation of said vehicle, or conditions of said vehicle environment.
3. The control system as in claim 2, wherein said navigation system reports said current vehicle position to said guidance system, for vehicle feedback to said guidance system.
4. The control system as in claim 1, wherein
said means for sending image data includes an image buffer for storing said image data of one frame from said camera while said downlink transmits said data to said station, and
said screen includes a display buffer for maintaining said one frame stationary on said screen, whereby
the operator will continuously view a still picture taken by said camera in the past.
5. The control system as in claim 1, wherein
said control system includes command generation means for the operator to generate commands, and wherein
said uplink transmits said commands to said guidance system for controlling said vehicle.
6. The control system as in claim 5, including a mount connecting said camera to said vehicle, said mount including camera motion means for moving said camera relative to said vehicle, said camera motion means including mount servomechanisms;
wherein said commands include camera motion commands;
wherein said uplink transmits said camera motion commands to said camera motion means; and
wherein said transform and waypoint generation means obtains information on camera position from said downlink for transforming said screen path into a planned ground path according to said camera position; whereby
said camera will pan, tilt, or translate at said commands from the operator and said control system will function properly.
7. The control system as in claim 6, including means for automatically controlling said camera position whereby
said means for automatically controlling contains means for comparing said waypoints transmitted by said transform and waypoint generation means with said frame position of said vehicle.
8. The control system as in claim 6, wherein said camera includes a variable-focal-length zoom lens and zoom control servomechanism, said camera motion commands include zoom commands, and wherein said transform and waypoint generation means obtains information on camera lens focal length from said zoom control servomechanisms for transforming said screen path into a planned ground path according to said focal length; whereby
said camera lens will zoom at the command of the operator and said control system will function properly.
9. The control system as in claim 8, including means for automatically controlling said camera position whereby
said means for automatically controlling contains means for comparing said waypoints transmitted by said transform and waypoint generation means with said frame position of said vehicle.
10. The control system as in claim 1, wherein said transform and waypoint generation means includes means for automatically determining said waypoints.
US07/683,706, filed 1991-04-11, priority date 1991-04-11: Vehicle remote guidance with path control. US5155683A, Expired - Fee Related.

Priority Applications (1)

US07/683,706, priority date 1991-04-11, filing date 1991-04-11: US5155683A, Vehicle remote guidance with path control

Publications (1)

US5155683A, published 1992-10-13

Family

ID=24745123

Family Applications (1)

US07/683,706, priority date 1991-04-11, filing date 1991-04-11: Vehicle remote guidance with path control (Expired - Fee Related)

Country Status (1)

US: US5155683A (en)

US9939529B2 (en)2012-08-272018-04-10Aktiebolaget ElectroluxRobot positioning system
US9946263B2 (en)2013-12-192018-04-17Aktiebolaget ElectroluxPrioritizing cleaning areas
IT201600101337A1 (en)*2016-11-032018-05-03Srsd Srl MOBILE TERRESTRIAL OR NAVAL SYSTEM, WITH REMOTE CONTROL AND CONTROL, WITH PASSIVE AND ACTIVE DEFENSES, EQUIPPED WITH SENSORS AND COMPLETE ACTUATORS CONTEMPORARY COVERAGE OF THE SURROUNDING SCENARIO
US9969428B2 (en)2011-04-192018-05-15Ford Global Technologies, LlcTrailer backup assist system with waypoint selection
US10045675B2 (en)2013-12-192018-08-14Aktiebolaget ElectroluxRobotic vacuum cleaner with side brush moving in spiral pattern
US10112646B2 (en)2016-05-052018-10-30Ford Global Technologies, LlcTurn recovery human machine interface for trailer backup assist
US10149589B2 (en)2013-12-192018-12-11Aktiebolaget ElectroluxSensing climb of obstacle of a robotic cleaning device
US10212396B2 (en)2013-01-152019-02-19Israel Aerospace Industries LtdRemote tracking of objects
US10209080B2 (en)2013-12-192019-02-19Aktiebolaget ElectroluxRobotic cleaning device
US10219665B2 (en)2013-04-152019-03-05Aktiebolaget ElectroluxRobotic vacuum cleaner with protruding sidebrush
US10231591B2 (en)2013-12-202019-03-19Aktiebolaget ElectroluxDust container
US10353400B2 (en)*2016-05-232019-07-16Asustek Computer Inc.Navigation system and navigation method
EP3547059A1 (en)*2018-03-292019-10-02Technische Hochschule KölnMethod for controlling unmanned vehicles and add-on module for retrofitting of unmanned, remotely controllable vehicles
US10433697B2 (en)2013-12-192019-10-08Aktiebolaget ElectroluxAdaptive speed control of rotating side brush
US10448555B2 (en)2016-05-272019-10-22Cnh Industrial America LlcSystem and method for scouting vehicle mapping
US10448794B2 (en)2013-04-152019-10-22Aktiebolaget ElectroluxRobotic vacuum cleaner
US10499778B2 (en)2014-09-082019-12-10Aktiebolaget ElectroluxRobotic vacuum cleaner
US10518416B2 (en)2014-07-102019-12-31Aktiebolaget ElectroluxMethod for detecting a measurement error in a robotic cleaning device
US10534367B2 (en)2014-12-162020-01-14Aktiebolaget ElectroluxExperience-based roadmap for a robotic cleaning device
US10546441B2 (en)2013-06-042020-01-28Raymond Anthony JoaoControl, monitoring, and/or security, apparatus and method for premises, vehicles, and/or articles
US10551474B2 (en)2013-01-172020-02-04Israel Aerospace Industries Ltd.Delay compensation while controlling a remote sensor
US10599161B2 (en)*2017-08-082020-03-24Skydio, Inc.Image space motion planning of an autonomous vehicle
US10617271B2 (en)2013-12-192020-04-14Aktiebolaget ElectroluxRobotic cleaning device and method for landmark recognition
US10678251B2 (en)2014-12-162020-06-09Aktiebolaget ElectroluxCleaning method for a robotic cleaning device
US10729297B2 (en)2014-09-082020-08-04Aktiebolaget ElectroluxRobotic vacuum cleaner
US10874271B2 (en)2014-12-122020-12-29Aktiebolaget ElectroluxSide brush and robotic cleaner
US10877484B2 (en)2014-12-102020-12-29Aktiebolaget ElectroluxUsing laser sensor for floor type detection
US10874274B2 (en)2015-09-032020-12-29Aktiebolaget ElectroluxSystem of robotic cleaning devices
WO2020262222A1 (en)*2019-06-242020-12-30株式会社ClueControl system for flying vehicle
US11048277B1 (en)*2018-01-242021-06-29Skydio, Inc.Objective-based control of an autonomous unmanned aerial vehicle
US11072368B2 (en)*2019-01-222021-07-27Deere & CompanyDynamically augmented bird's-eye view
US11099554B2 (en)2015-04-172021-08-24Aktiebolaget ElectroluxRobotic cleaning device and a method of controlling the robotic cleaning device
US11122953B2 (en)2016-05-112021-09-21Aktiebolaget ElectroluxRobotic cleaning device
US11169533B2 (en)2016-03-152021-11-09Aktiebolaget ElectroluxRobotic cleaning device and a method at the robotic cleaning device of performing cliff detection
EP3368957B1 (en)*2015-10-302022-02-09SZ DJI Technology Co., Ltd.Systems and methods for uav path planning and control
US11307584B2 (en)2018-09-042022-04-19Skydio, Inc.Applications and skills for an autonomous unmanned aerial vehicle
CN114397903A (en)*2017-05-242022-04-26深圳市大疆创新科技有限公司Navigation processing method and control equipment
US20220236741A1 (en)*2021-01-282022-07-28Caterpillar Inc.Visual overlays for providing perception of depth
US20220264607A1 (en)*2015-06-122022-08-18Comcast Cable Communications, LlcScheduling Resource Allocation in Wireless Network
US11474533B2 (en)2017-06-022022-10-18Aktiebolaget ElectroluxMethod of detecting a difference in level of a surface in front of a robotic cleaning device
US11635775B2 (en)2015-09-152023-04-25SZ DJI Technology Co., Ltd.Systems and methods for UAV interactive instructions and control
US20230311769A1 (en)*2022-03-312023-10-05Cnh Industrial America LlcSystem and method for an agricultural applicator
US11818071B2 (en)2015-06-122023-11-14Comcast Cable Communications, LlcScheduling request on a secondary cell of a wireless device
WO2024011210A1 (en)*2022-07-082024-01-11Polaris Industries Inc.Autonomous-ready vehicle
US11921517B2 (en)2017-09-262024-03-05Aktiebolaget ElectroluxControlling movement of a robotic cleaning device
US11943154B2 (en)2015-06-152024-03-26Comcast Cable Communications, LlcWireless uplink resource allocation

Citations (3)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US4405943A (en)*1981-08-191983-09-20Harris CorporationLow bandwidth closed loop imagery control and communication system for remotely piloted vehicle
US4855822A (en)*1988-01-261989-08-08Honeywell, Inc.Human engineered remote driving system
US4926346A (en)*1985-12-271990-05-15Aisin-Warner Kabushiki KaishaRoad image input system for vehicle control

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US4405943A (en) * | 1981-08-19 | 1983-09-20 | Harris Corporation | Low bandwidth closed loop imagery control and communication system for remotely piloted vehicle
US4926346A (en) * | 1985-12-27 | 1990-05-15 | Aisin-Warner Kabushiki Kaisha | Road image input system for vehicle control
US4855822A (en) * | 1988-01-26 | 1989-08-08 | Honeywell, Inc. | Human engineered remote driving system

Cited By (187)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US5624004A (en) * | 1992-10-14 | 1997-04-29 | Daifuku Co., Ltd. | Wheel support apparatus for a carriage, a carriage having same, and an article transport system having such carriages
EP0606173A1 (en) * | 1993-01-05 | 1994-07-13 | Sfim Industries | Guiding assembly
US5448487A (en) * | 1993-04-15 | 1995-09-05 | Fuji Jukogyo Kabushiki Kaisha | Vehicle navigation control system
EP0971537A3 (en) * | 1993-09-20 | 2000-06-14 | Canon Kabushiki Kaisha | Video system
US20040150725A1 (en) * | 1993-09-20 | 2004-08-05 | Canon Kabushiki Kaisha | Video system for use with video telephone and video conferencing
US7298400B2 (en) | 1993-09-20 | 2007-11-20 | Canon Kabushiki Kaisha | Video system for use with video telephone and video conferencing
US6665006B1 (en) | 1993-09-20 | 2003-12-16 | Canon Kabushiki Kaisha | Video system for use with video telephone and video conferencing
US5550758A (en) * | 1994-03-29 | 1996-08-27 | General Electric Company | Augmented reality maintenance system with flight planner
AU685295B2 (en) * | 1994-09-01 | 1998-01-15 | Caterpillar Inc. | Remote control system and method for an autonomous vehicle
US5448479A (en) * | 1994-09-01 | 1995-09-05 | Caterpillar Inc. | Remote control system and method for an autonomous vehicle
US6304290B1 (en) * | 1994-09-27 | 2001-10-16 | Societe M 5 | Method for the video-assisted remote control of machines, especially vehicles, and device for the implementation of this method
US5862498A (en) * | 1994-11-11 | 1999-01-19 | Xanavi Informatics Corporation | Map display apparatus for motor vehicle
US6421604B1 (en) * | 1994-11-11 | 2002-07-16 | Xanavi Informatics Corporation | Map display apparatus for motor vehicle
US6012014A (en) * | 1994-11-11 | 2000-01-04 | Xanavi Informatics Corporation | Map display apparatus for motor vehicle
USRE44925E1 (en) | 1995-01-31 | 2014-06-03 | Transcenic, Inc. | Spatial referenced photographic system with navigation arrangement
EP0813796A4 (en) * | 1995-03-09 | 1999-11-03 | Kevin Dole | Miniature vehicle video production system
US5652849A (en) * | 1995-03-16 | 1997-07-29 | Regents Of The University Of Michigan | Apparatus and method for remote control using a visual information stream
US20040169653A1 (en) * | 1995-04-20 | 2004-09-02 | Yoshinori Endo | Bird's-eye view forming method, map display apparatus and navigation system
US5684697A (en) * | 1995-06-26 | 1997-11-04 | Mullen; Charles H. | Driver emulative vehicle control system
US6226573B1 (en) * | 1995-08-01 | 2001-05-01 | Komatsu Ltd. | Course generator of moving body
US5668555A (en) * | 1995-09-01 | 1997-09-16 | Starr; Jon E. | Imaging system and apparatus
US5706195A (en) * | 1995-09-05 | 1998-01-06 | General Electric Company | Augmented reality maintenance system for multiple rovs
US5987364A (en) * | 1995-12-27 | 1999-11-16 | Dassault Electronique | Control device for making safe a fast vehicle, in particular guided by an operator on board the vehicle or otherwise
FR2743162A1 (en) * | 1995-12-27 | 1997-07-04 | Dassault Electronique | CONTROL DEVICE FOR SECURING A FAST VEHICLE, ESPECIALLY GUIDED BY AN OPERATOR ON OR OFF IN THE VEHICLE
EP0781679A1 (en) * | 1995-12-27 | 1997-07-02 | Dassault Electronique | Control device for increasing safety in a fast vehicle, especially for a vehicle guided by an operator who may be located inside or outside the vehicle
US5904724A (en) * | 1996-01-19 | 1999-05-18 | Margolin; Jed | Method and apparatus for remotely piloting an aircraft
EP0822474A1 (en) * | 1996-08-01 | 1998-02-04 | Tecma S.r.l. | System for orienting and guiding selfpropelled vehicles, and associated equipment
US5992758A (en) * | 1996-09-23 | 1999-11-30 | Agro-Mack Enterprises Ltd. | Proximity detector for ground-based implements
US6269291B1 (en) * | 1997-08-04 | 2001-07-31 | Frog Navigation Systems B.V. | System and method for controlling of vehicles
US9075136B1 (en) | 1998-03-04 | 2015-07-07 | Gtj Ventures, Llc | Vehicle operator and/or occupant information apparatus and method
US6778211B1 (en) | 1999-04-08 | 2004-08-17 | Ipix Corp. | Method and apparatus for providing virtual processing effects for wide-angle video images
US20050007483A1 (en) * | 1999-04-08 | 2005-01-13 | Zimmermann Steven Dwain | Method and apparatus for providing virtual processing effects for wide-angle video images
US20050062869A1 (en) * | 1999-04-08 | 2005-03-24 | Zimmermann Steven Dwain | Immersive video presentations
US7312820B2 (en) | 1999-04-08 | 2007-12-25 | Ipix Corporation | Method and apparatus for providing virtual processing effects for wide-angle video images
US6321147B1 (en) * | 1999-05-21 | 2001-11-20 | Komatsu Ltd. | Unmanned vehicle running system
US8164627B1 (en) * | 1999-10-16 | 2012-04-24 | Bayerische Motoren Werke Aktiengesellschaft | Camera system for vehicles
US6738158B1 (en) * | 1999-12-02 | 2004-05-18 | Xerox Corporation | Digital scanner for capturing and processing images
EP2363775A1 (en) * | 2000-05-01 | 2011-09-07 | iRobot Corporation | Method and system for remote control of mobile robot
EP1279081B1 (en) * | 2000-05-01 | 2012-01-04 | iRobot Corporation | Method and system for remote control of mobile robot
US6577933B2 (en) * | 2001-05-07 | 2003-06-10 | Hoton How | Electronically tracked road-map system
US7054466B2 (en) * | 2001-08-10 | 2006-05-30 | International Business Machines Corporation | Orientation determination
US20040017931A1 (en) * | 2001-08-10 | 2004-01-29 | International Business Machines Corporation | Orientation determination
EP1468241B2 (en) | 2001-12-27 | 2011-03-23 | Rafael-Armament Development Authority Ltd. | Method and system for guiding a remote vehicle via lagged communication channel
US7620483B2 (en) | 2001-12-27 | 2009-11-17 | Rafael-Armament Development Authority Ltd. | Method for guiding a remote vehicle via lagged communication channel
US20050119801A1 (en) * | 2001-12-27 | 2005-06-02 | Itzhak Florentin | Method and system for guiding a remote vehicle via lagged communication channel
US7305149B2 (en) * | 2002-03-11 | 2007-12-04 | Mitsubishi Denki Kabushiki Kaisha | Image pickup information recognition system
US20030169903A1 (en) * | 2002-03-11 | 2003-09-11 | Mitsubishi Denki Kabushiki Kaisha | Image pickup information recognition system
ES2241397B1 (en) * | 2002-10-29 | 2006-12-16 | Universitat Politecnica De Catalunya | NAVIGATION SYSTEM WITH EXTERNAL CONTROL BASE AND SELF GUIDED BEARING.
ES2241397A1 (en) * | 2002-10-29 | 2005-10-16 | Universitat Politecnica De Catalunya | Homing beacon positioning system for use in e.g. marine rescue zone, has computer connected to global positioning system reader, specific software and transmission/receiving station
US7714895B2 (en) * | 2002-12-30 | 2010-05-11 | Abb Research Ltd. | Interactive and shared augmented reality system and method having local and remote access
US20040189675A1 (en) * | 2002-12-30 | 2004-09-30 | John Pretlove | Augmented reality system and method
US20090167929A1 (en) * | 2003-03-13 | 2009-07-02 | Toshiyuki Nagaoka | Imaging apparatus
US20040179125A1 (en) * | 2003-03-13 | 2004-09-16 | Olympus Corporation | Imaging apparatus
US20060187224A1 (en) * | 2003-07-29 | 2006-08-24 | Avshalom Ehrlich | Predictive display for a system having delayed feedback of a command issued
US7761173B2 (en) * | 2003-07-29 | 2010-07-20 | Rafael Advanced Defense Systems Ltd. | Predictive display for a system having delayed feedback of a command issued
US20070040832A1 (en) * | 2003-07-31 | 2007-02-22 | Tan Tiow S | Trapezoidal shadow maps
US20050090972A1 (en) * | 2003-10-23 | 2005-04-28 | International Business Machines Corporation | Navigating a UAV
US7153378B2 (en) * | 2003-11-21 | 2006-12-26 | Joe & Samia Management Inc. | Product labelling
US20050109443A1 (en) * | 2003-11-21 | 2005-05-26 | Sleiman Joseph Z. | Product labelling
US7624667B2 (en) * | 2004-05-18 | 2009-12-01 | San Kilkis | Method and apparatus for remotely piloted landmine clearing platform with multiple sensing means
US20050262995A1 (en) * | 2004-05-18 | 2005-12-01 | San Kilkis | Method and apparatus for remotely piloted landmine clearing platform with multiple sensing means
US8780050B2 (en) | 2004-08-31 | 2014-07-15 | Blackberry Limited | Handheld electronic device with text disambiguation
JPWO2007086431A1 (en) * | 2006-01-25 | 2009-06-18 | Panasonic Corporation | Video display device
US20090002142A1 (en) * | 2006-01-25 | 2009-01-01 | Akihiro Morimoto | Image Display Device
US9566911B2 (en) | 2007-03-21 | 2017-02-14 | Ford Global Technologies, Llc | Vehicle trailer angle detection system and method
US9971943B2 (en) | 2007-03-21 | 2018-05-15 | Ford Global Technologies, Llc | Vehicle trailer angle detection system and method
US20090094140A1 (en) * | 2007-10-03 | 2009-04-09 | Ncr Corporation | Methods and Apparatus for Inventory and Price Information Management
WO2009058697A1 (en) * | 2007-10-30 | 2009-05-07 | Raytheon Company | Unmanned vehicle route management system
CN101842758B (en) * | 2007-10-30 | 2014-07-16 | Raytheon Company | Unmanned vehicle route management system
US20100070124A1 (en) * | 2007-10-30 | 2010-03-18 | Yeager Matthew R | Unmanned Vehicle Route Management System
JP2011502310A (en) * | 2007-10-30 | 2011-01-20 | Raytheon Company | Unmanned vehicle route management system
US8948932B2 (en) | 2007-10-30 | 2015-02-03 | Raytheon Company | Unmanned vehicle route management system
AU2008318929B2 (en) * | 2007-10-30 | 2013-01-17 | Raytheon Company | Unmanned vehicle route management system
US20100274414A1 (en) * | 2007-12-09 | 2010-10-28 | Bonglae Park | Narrow space slow moving operational device for vehicle and operation method thereof
US8483270B2 (en) * | 2008-01-17 | 2013-07-09 | Ballistic Applications And Materials International, Llc | Method and system for adapting use of a radio link between a remotely controlled device and an operator control unit
US20090185617A1 (en) * | 2008-01-17 | 2009-07-23 | Houghton Ricky A | Method and system for adapting use of a radio link between a remotely controlled device and an operator control unit
US20090208059A1 (en) * | 2008-02-20 | 2009-08-20 | Amir Geva | Fast License Plate Verifier
US8229168B2 (en) * | 2008-02-20 | 2012-07-24 | International Business Machines Corporation | Fast license plate verifier
US8240238B2 (en) * | 2008-05-23 | 2012-08-14 | Willner Byron J | Methods and apparatuses for detecting and neutralizing remotely activated explosives
US20100170383A1 (en) * | 2008-05-23 | 2010-07-08 | Willner Byron J | Methods and apparatuses for detecting and neutralizing remotely activated explosives
US20090309970A1 (en) * | 2008-06-04 | 2009-12-17 | Sanyo Electric Co., Ltd. | Vehicle Operation System And Vehicle Operation Method
US8384776B2 (en) | 2009-04-22 | 2013-02-26 | Toyota Motor Engineering And Manufacturing North America, Inc. | Detection of topological structure from sensor data with application to autonomous driving in semi-structured environments
WO2010124056A1 (en) * | 2009-04-22 | 2010-10-28 | Toyota Motor Engineering And Manufacturing N.A. | Detection of topological structure from sensor data with application to autonomous driving in semi-structured environments
US20100274430A1 (en) * | 2009-04-22 | 2010-10-28 | Toyota Motor Engin. & Manufact. N.A. (TEMA) | Detection of topological structure from sensor data with application to autonomous driving in semi-structured environments
US8473101B2 (en) * | 2009-08-21 | 2013-06-25 | Harris Corporation | Coordinated action robotic system and related methods
US20110046781A1 (en) * | 2009-08-21 | 2011-02-24 | Harris Corporation, Corporation Of The State Of Delaware | Coordinated action robotic system and related methods
US8502533B2 (en) * | 2009-09-11 | 2013-08-06 | Siemens Aktiengesellschaft | Combined imaging system, including a magnetic resonance system and a UWB radar
US20110062958A1 (en) * | 2009-09-11 | 2011-03-17 | Wilfried Schnell | Combined imaging system, including a magnetic resonance system and a uwb radar
US8665116B2 (en) | 2010-07-18 | 2014-03-04 | Ford Global Technologies | Parking assist overlay with variable brightness intensity
US20140358429A1 (en) * | 2011-04-19 | 2014-12-04 | Ford Global Technologies, Llc | Method of inputting a path for a vehicle and trailer
US9969428B2 (en) | 2011-04-19 | 2018-05-15 | Ford Global Technologies, Llc | Trailer backup assist system with waypoint selection
US20140358424A1 (en) * | 2011-04-19 | 2014-12-04 | Ford Global Technologies, Llc | System and method of inputting an intended backing path
US9854209B2 (en) | 2011-04-19 | 2017-12-26 | Ford Global Technologies, Llc | Display system utilizing vehicle and trailer dynamics
US9723274B2 (en) | 2011-04-19 | 2017-08-01 | Ford Global Technologies, Llc | System and method for adjusting an image capture setting
US9683848B2 (en) | 2011-04-19 | 2017-06-20 | Ford Global Technologies, Llc | System for determining hitch angle
US9374562B2 (en) | 2011-04-19 | 2016-06-21 | Ford Global Technologies, Llc | System and method for calculating a horizontal camera to target distance
US10609340B2 (en) | 2011-04-19 | 2020-03-31 | Ford Global Technologies, Llc | Display system utilizing vehicle and trailer dynamics
US9500497B2 (en) * | 2011-04-19 | 2016-11-22 | Ford Global Technologies, Llc | System and method of inputting an intended backing path
US9506774B2 (en) * | 2011-04-19 | 2016-11-29 | Ford Global Technologies, Llc | Method of inputting a path for a vehicle and trailer
US9926008B2 (en) | 2011-04-19 | 2018-03-27 | Ford Global Technologies, Llc | Trailer backup assist system with waypoint selection
US9555832B2 (en) | 2011-04-19 | 2017-01-31 | Ford Global Technologies, Llc | Display system utilizing vehicle and trailer dynamics
US10192139B2 (en) | 2012-05-08 | 2019-01-29 | Israel Aerospace Industries Ltd. | Remote tracking of objects
WO2013168169A3 (en) * | 2012-05-08 | 2014-01-03 | Israel Aerospace Industries Ltd. | Remote tracking of objects
US8797190B2 (en) | 2012-07-26 | 2014-08-05 | General Electric Company | Method for displaying a user entered flight path
US9939529B2 (en) | 2012-08-27 | 2018-04-10 | Aktiebolaget Electrolux | Robot positioning system
US10212396B2 (en) | 2013-01-15 | 2019-02-19 | Israel Aerospace Industries Ltd | Remote tracking of objects
US10551474B2 (en) | 2013-01-17 | 2020-02-04 | Israel Aerospace Industries Ltd. | Delay compensation while controlling a remote sensor
US9511799B2 (en) | 2013-02-04 | 2016-12-06 | Ford Global Technologies, Llc | Object avoidance for a trailer backup assist system
US9592851B2 (en) | 2013-02-04 | 2017-03-14 | Ford Global Technologies, Llc | Control modes for a trailer backup assist system
US10219665B2 (en) | 2013-04-15 | 2019-03-05 | Aktiebolaget Electrolux | Robotic vacuum cleaner with protruding sidebrush
US10448794B2 (en) | 2013-04-15 | 2019-10-22 | Aktiebolaget Electrolux | Robotic vacuum cleaner
US10546441B2 (en) | 2013-06-04 | 2020-01-28 | Raymond Anthony Joao | Control, monitoring, and/or security, apparatus and method for premises, vehicles, and/or articles
US20150161795A1 (en) * | 2013-12-10 | 2015-06-11 | GM Global Technology Operations LLC | Distance determination using a monoscopic imager in a vehicle
US9280826B2 (en) * | 2013-12-10 | 2016-03-08 | GM Global Technologies Operations LLC | Distance determination using a monoscopic imager in a vehicle
US10045675B2 (en) | 2013-12-19 | 2018-08-14 | Aktiebolaget Electrolux | Robotic vacuum cleaner with side brush moving in spiral pattern
US10617271B2 (en) | 2013-12-19 | 2020-04-14 | Aktiebolaget Electrolux | Robotic cleaning device and method for landmark recognition
US9811089B2 (en) | 2013-12-19 | 2017-11-07 | Aktiebolaget Electrolux | Robotic cleaning device with perimeter recording function
US9946263B2 (en) | 2013-12-19 | 2018-04-17 | Aktiebolaget Electrolux | Prioritizing cleaning areas
US10433697B2 (en) | 2013-12-19 | 2019-10-08 | Aktiebolaget Electrolux | Adaptive speed control of rotating side brush
US10149589B2 (en) | 2013-12-19 | 2018-12-11 | Aktiebolaget Electrolux | Sensing climb of obstacle of a robotic cleaning device
US10209080B2 (en) | 2013-12-19 | 2019-02-19 | Aktiebolaget Electrolux | Robotic cleaning device
US10231591B2 (en) | 2013-12-20 | 2019-03-19 | Aktiebolaget Electrolux | Dust container
WO2015142166A1 (en) * | 2014-03-20 | 2015-09-24 | Lely Patent N.V. | Method and system for navigating an agricultural vehicle on a land area
NL2012485A (en) * | 2014-03-20 | 2015-12-10 | Lely Patent Nv | Method and system for navigating an agricultural vehicle on a land area.
US10518416B2 (en) | 2014-07-10 | 2019-12-31 | Aktiebolaget Electrolux | Method for detecting a measurement error in a robotic cleaning device
US10499778B2 (en) | 2014-09-08 | 2019-12-10 | Aktiebolaget Electrolux | Robotic vacuum cleaner
US10729297B2 (en) | 2014-09-08 | 2020-08-04 | Aktiebolaget Electrolux | Robotic vacuum cleaner
US9522677B2 (en) | 2014-12-05 | 2016-12-20 | Ford Global Technologies, Llc | Mitigation of input device failure and mode management
US9533683B2 (en) | 2014-12-05 | 2017-01-03 | Ford Global Technologies, Llc | Sensor failure mitigation system and mode management
US10877484B2 (en) | 2014-12-10 | 2020-12-29 | Aktiebolaget Electrolux | Using laser sensor for floor type detection
US10874271B2 (en) | 2014-12-12 | 2020-12-29 | Aktiebolaget Electrolux | Side brush and robotic cleaner
US10678251B2 (en) | 2014-12-16 | 2020-06-09 | Aktiebolaget Electrolux | Cleaning method for a robotic cleaning device
US10534367B2 (en) | 2014-12-16 | 2020-01-14 | Aktiebolaget Electrolux | Experience-based roadmap for a robotic cleaning device
US10171796B2 (en) | 2015-01-09 | 2019-01-01 | Ricoh Company, Ltd. | Moving body system
EP3043202A1 (en) * | 2015-01-09 | 2016-07-13 | Ricoh Company, Ltd. | Moving body system
US9607242B2 (en) | 2015-01-16 | 2017-03-28 | Ford Global Technologies, Llc | Target monitoring system with lens cleaning device
US11099554B2 (en) | 2015-04-17 | 2021-08-24 | Aktiebolaget Electrolux | Robotic cleaning device and a method of controlling the robotic cleaning device
US11765718B2 (en) * | 2015-06-12 | 2023-09-19 | Comcast Cable Communications, Llc | Scheduling resource allocation in wireless network
US20220264607A1 (en) * | 2015-06-12 | 2022-08-18 | Comcast Cable Communications, Llc | Scheduling Resource Allocation in Wireless Network
US11818071B2 (en) | 2015-06-12 | 2023-11-14 | Comcast Cable Communications, Llc | Scheduling request on a secondary cell of a wireless device
US11943154B2 (en) | 2015-06-15 | 2024-03-26 | Comcast Cable Communications, Llc | Wireless uplink resource allocation
US12309079B2 (en) | 2015-06-15 | 2025-05-20 | Comcast Cable Communications, Llc | Wireless uplink resource allocation
US10874274B2 (en) | 2015-09-03 | 2020-12-29 | Aktiebolaget Electrolux | System of robotic cleaning devices
US11712142B2 (en) | 2015-09-03 | 2023-08-01 | Aktiebolaget Electrolux | System of robotic cleaning devices
US9896130B2 (en) | 2015-09-11 | 2018-02-20 | Ford Global Technologies, Llc | Guidance system for a vehicle reversing a trailer along an intended backing path
US12181879B2 (en) | 2015-09-15 | 2024-12-31 | SZ DJI Technology Co., Ltd. | System and method for supporting smooth target following
US11635775B2 (en) | 2015-09-15 | 2023-04-25 | SZ DJI Technology Co., Ltd. | Systems and methods for UAV interactive instructions and control
US9836060B2 (en) | 2015-10-28 | 2017-12-05 | Ford Global Technologies, Llc | Trailer backup assist system with target management
US10496101B2 (en) | 2015-10-28 | 2019-12-03 | Ford Global Technologies, Llc | Trailer backup assist system with multi-purpose camera in a side mirror assembly of a vehicle
EP3368957B1 (en) * | 2015-10-30 | 2022-02-09 | SZ DJI Technology Co., Ltd. | Systems and methods for uav path planning and control
US11169533B2 (en) | 2016-03-15 | 2021-11-09 | Aktiebolaget Electrolux | Robotic cleaning device and a method at the robotic cleaning device of performing cliff detection
US10112646B2 (en) | 2016-05-05 | 2018-10-30 | Ford Global Technologies, Llc | Turn recovery human machine interface for trailer backup assist
US11122953B2 (en) | 2016-05-11 | 2021-09-21 | Aktiebolaget Electrolux | Robotic cleaning device
US10353400B2 (en) * | 2016-05-23 | 2019-07-16 | Asustek Computer Inc. | Navigation system and navigation method
US10448555B2 (en) | 2016-05-27 | 2019-10-22 | Cnh Industrial America Llc | System and method for scouting vehicle mapping
IT201600101337A1 (en) * | 2016-11-03 | 2018-05-03 | Srsd Srl | MOBILE TERRESTRIAL OR NAVAL SYSTEM, WITH REMOTE CONTROL AND CONTROL, WITH PASSIVE AND ACTIVE DEFENSES, EQUIPPED WITH SENSORS AND COMPLETE ACTUATORS CONTEMPORARY COVERAGE OF THE SURROUNDING SCENARIO
CN114397903A (en) * | 2017-05-24 | 2022-04-26 | SZ DJI Technology Co., Ltd. | Navigation processing method and control equipment
US11474533B2 (en) | 2017-06-02 | 2022-10-18 | Aktiebolaget Electrolux | Method of detecting a difference in level of a surface in front of a robotic cleaning device
US20230257116A1 (en) * | 2017-08-08 | 2023-08-17 | Skydio, Inc. | Image Space Motion Planning Of An Autonomous Vehicle
US20230257115A1 (en) * | 2017-08-08 | 2023-08-17 | Skydio, Inc. | Image Space Motion Planning Of An Autonomous Vehicle
US11347244B2 (en) * | 2017-08-08 | 2022-05-31 | Skydio, Inc. | Image space motion planning of an autonomous vehicle
US12330784B2 (en) * | 2017-08-08 | 2025-06-17 | Skydio, Inc. | Image space motion planning of an autonomous vehicle
US11592845B2 (en) * | 2017-08-08 | 2023-02-28 | Skydio, Inc. | Image space motion planning of an autonomous vehicle
US11592844B2 (en) * | 2017-08-08 | 2023-02-28 | Skydio, Inc. | Image space motion planning of an autonomous vehicle
US20220050477A1 (en) * | 2017-08-08 | 2022-02-17 | Skydio, Inc. | Image Space Motion Planning Of An Autonomous Vehicle
US20220050478A1 (en) * | 2017-08-08 | 2022-02-17 | Skydio, Inc. | Image Space Motion Planning Of An Autonomous Vehicle
US11858628B2 (en) * | 2017-08-08 | 2024-01-02 | Skydio, Inc. | Image space motion planning of an autonomous vehicle
US20240067334A1 (en) * | 2017-08-08 | 2024-02-29 | Skydio, Inc. | Image Space Motion Planning Of An Autonomous Vehicle
US10599161B2 (en) * | 2017-08-08 | 2020-03-24 | Skydio, Inc. | Image space motion planning of an autonomous vehicle
US11787543B2 (en) * | 2017-08-08 | 2023-10-17 | Skydio, Inc. | Image space motion planning of an autonomous vehicle
US12296951B2 (en) * | 2017-08-08 | 2025-05-13 | Skydio, Inc. | Image space motion planning of an autonomous vehicle
US11921517B2 (en) | 2017-09-26 | 2024-03-05 | Aktiebolaget Electrolux | Controlling movement of a robotic cleaning device
US11048277B1 (en) * | 2018-01-24 | 2021-06-29 | Skydio, Inc. | Objective-based control of an autonomous unmanned aerial vehicle
US11755041B2 (en) | 2018-01-24 | 2023-09-12 | Skydio, Inc. | Objective-based control of an autonomous unmanned aerial vehicle
CN110320928A (en) * | 2018-03-29 | 2019-10-11 | Technische Hochschule Köln | Method for controlling unmanned vehicles and add-on module for retrofitting of unmanned, remotely controllable vehicles
EP3547059A1 (en) * | 2018-03-29 | 2019-10-02 | Technische Hochschule Köln | Method for controlling unmanned vehicles and add-on module for retrofitting of unmanned, remotely controllable vehicles
US11829139B2 (en) | 2018-09-04 | 2023-11-28 | Skydio, Inc. | Applications and skills for an autonomous unmanned aerial vehicle
US12292737B2 (en) | 2018-09-04 | 2025-05-06 | Skydio, Inc. | Applications and skills for an autonomous unmanned aerial vehicle
US11307584B2 (en) | 2018-09-04 | 2022-04-19 | Skydio, Inc. | Applications and skills for an autonomous unmanned aerial vehicle
US11072368B2 (en) * | 2019-01-22 | 2021-07-27 | Deere & Company | Dynamically augmented bird's-eye view
WO2020262222A1 (en) * | 2019-06-24 | 2020-12-30 | Clue Co., Ltd. | Control system for flying vehicle
US20240028042A1 (en) * | 2021-01-28 | 2024-01-25 | Caterpillar Inc. | Visual overlays for providing perception of depth
US11860641B2 (en) * | 2021-01-28 | 2024-01-02 | Caterpillar Inc. | Visual overlays for providing perception of depth
US20220236741A1 (en) * | 2021-01-28 | 2022-07-28 | Caterpillar Inc. | Visual overlays for providing perception of depth
US12085950B2 (en) * | 2021-01-28 | 2024-09-10 | Caterpillar Inc. | Visual overlays for providing perception of depth
US20230311769A1 (en) * | 2022-03-31 | 2023-10-05 | Cnh Industrial America Llc | System and method for an agricultural applicator
WO2024011210A1 (en) * | 2022-07-08 | 2024-01-11 | Polaris Industries Inc. | Autonomous-ready vehicle

Similar Documents

Publication | Publication Date | Title
US5155683A (en) | Vehicle remote guidance with path control
KR102001728B1 (en) | Method and system for acquiring three dimentional position coordinates in non-control points using stereo camera drone
US8229163B2 (en) | 4D GIS based virtual reality for moving target prediction
KR102471347B1 (en) | Stabilization and display of remote images
US11310412B2 (en) | Autofocusing camera and systems
EP1974331B1 (en) | Real-time, three-dimensional synthetic vision display of sensor-validated terrain data
US8994822B2 (en) | Infrastructure mapping system and method
RU2481612C2 (en) | Method and system of controlling device operation using integrated simulation with time shift option
ES2923956T3 (en) | Camera-driven automatic aircraft control for radar activation
US8314816B2 (en) | System and method for displaying information on a display element
IL188655A (en) | System and method for navigating a remote control vehicle past obstacles
KR20060017759A (en) | Method and apparatus for video on demand
CN110574366A (en) | Image generation device
WO2018216536A1 (en) | Video image generation device and video image generation method
US10424105B2 (en) | Efficient airborne oblique image collection
US4827252A (en) | Display methods and apparatus
Miller et al. | UAV navigation based on videosequences captured by the onboard video camera
IL267309B2 (en) | Terrestrial observation device having location determination functionality
Bhanu et al. | A system for obstacle detection during rotorcraft low altitude flight
US12406586B2 (en) | 3D localization and mapping systems and methods
US11415990B2 (en) | Optical object tracking on focal plane with dynamic focal length
KR102181809B1 (en) | Apparatus and method for checking facility
KR101027533B1 (en) | Video Surveillance Device and Method
Rosiek et al. | Exploiting global positioning system and inertial measurement unit-controlled image sensors
Wilcox | Vision-based planetary rover navigation

Legal Events

Date | Code | Title | Description
FPAY | Fee payment | Year of fee payment: 4
FPAY | Fee payment | Year of fee payment: 8
REMI | Maintenance fee reminder mailed
LAPS | Lapse for failure to pay maintenance fees
STCH | Information on status: patent discontinuation | Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362
FP | Lapsed due to failure to pay maintenance fee | Effective date: 2004-10-13

