HK1167920A - Data processing system and method for providing at least one driver assistance function - Google Patents

Data processing system and method for providing at least one driver assistance function

Info

Publication number
HK1167920A
HK1167920A
Authority
HK
Hong Kong
Prior art keywords
vehicle
data
driver assistance
image
stationary
Prior art date
Application number
HK12108581.4A
Other languages
Chinese (zh)
Inventor
哈弗梅.麦思亚斯
汤米.金
Original Assignee
海拉胡克双合有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 海拉胡克双合有限公司
Publication of HK1167920A

Description

Data processing system and method for providing at least one driver assistance function
The present invention relates to a data processing system and a method for providing at least one driver assistance function. At least one image of the surroundings of the vehicle is generated by at least one image capturing unit of the vehicle. Based on the image data, driver assistance data with at least one type of driver assistance information are generated, and a driver assistance function is provided in the vehicle in accordance therewith.
A large number of camera-based driver assistance systems designed to improve the comfort and driving safety of motor vehicles are known. These driver assistance systems are in particular warning systems which warn the driver of an unintended lane departure (lane departure warning, LDW) or which support the driver in keeping his own lane while driving (lane keeping support). Also known are driver assistance systems for longitudinal control of vehicles (ACC), for controlling the light emitted by the headlights of vehicles, for traffic sign recognition and for complying with the traffic regulations specified by traffic signs, as well as blind spot warning systems, distance measuring systems with an advance collision warning or braking function, and brake and overtaking assistance systems. For image capture, known driver assistance systems typically use a camera mounted in or on the vehicle. Advantageously, the camera is arranged behind the windshield in the region of the interior rear-view mirror; of course, other locations are possible.
The known vehicle camera is preferably designed as a video camera which captures several images in succession to form an image sequence. With this camera, an image of at least a part of the road surface in a detection area in front of the vehicle is captured, and image data corresponding to the image are generated. These image data are then processed by suitable algorithms for object recognition, object separation and object tracking across several images. Objects classified as relevant and processed further are in particular those relevant for the respective driver assistance function, such as oncoming and preceding vehicles, lane markings, obstacles on the lane, pedestrians on the lane and/or in the adjacent lane, traffic signs, as well as traffic light systems and street lights.
Document WO 2008/019907 A1 discloses a method and an apparatus for providing driver assistance by generating lane information for supporting or replacing the lane information of a video-based lane information apparatus. A reliability parameter of the determined lane information is ascertained, and lane information of at least one other vehicle, transmitted via a vehicle-to-vehicle communication device, is additionally determined.
Document EP 1016268 B1 discloses a lamp control system for a motor vehicle. At least one image is processed by means of a microprocessor in order to detect the headlights of oncoming vehicles and the tail lights of preceding vehicles and to determine control signals for controlling the headlights of the vehicle.
Document WO 2008/068837 A1 discloses a traffic condition display method which improves traffic safety by displaying the position of a vehicle on the basis of a sequence of images.
With camera-based driver assistance systems in a vehicle, the problem arises that, owing to the limited space in the vehicle, only relatively limited processing resources, i.e. relatively little computing capacity and storage capacity, are available for processing the image data and providing an assistance function for the driver. Providing more resources in the vehicle means higher costs; however, high-quality driver assistance functions can only be realized with a large amount of resources in the vehicle. As a compromise, the driver assistance functions actually provided are only a part of the driver assistance functions that could be provided. In addition, the algorithms required to process the image data and analyze the image information must be adapted to the specific conditions of the vehicle and its surroundings. If the system is already built into a vehicle, updating it requires relatively complex software updates.
Likewise, when country-specific or region-specific features are taken into account in processing the image data to provide certain driver assistance functions, country-specific data sets must be stored in the vehicle. In addition, these data sets need to be updated periodically.
The object of the present invention is to specify a data processing system and a method for providing at least one driver assistance function which require only a small amount of resources in the vehicle for providing the driver assistance function.
This object is achieved by a data processing system having the features of claim 1 and by a method according to the independent claim of the method class. Advantageous developments of the invention that can be carried out are specified in the dependent claims.
By transferring the image data from the vehicle to a stationary processing unit, the processing resources required in the vehicle to provide driver assistance functions can be substantially reduced. In addition, supplementary information from the vehicle as well as information not originating from the vehicle can easily be taken into account when providing the driver assistance function. Furthermore, the driver assistance functionality provided in the vehicle can easily be expanded or restricted, since only the desired and/or agreed driver assistance information needs to be contained in the driver assistance data transmitted from the stationary processing unit to the vehicle. In particular, an image capturing unit of simple construction, for example a simple camera, and a transmitting unit of simple construction for transmitting the image data to a stationary receiving unit can be installed in the vehicle. Only relatively little control electronics are required for this, so that the camera, a transmitting unit for transmitting the image data and a receiving unit for receiving the driver assistance data occupy only a small space within the vehicle, and these components can be installed in a large number of vehicles at relatively low cost. In this way, a location-dependent driver assistance function is easily obtained, in particular one taking into account the country-specific characteristics of the country in which the vehicle is actually located. These country-specific features relate in particular to country-specific traffic signs and/or traffic guidance systems. The vehicle position can be determined by the vehicle and transmitted to the stationary receiving unit, or it can be derived from the position of the stationary receiving unit.
In a preferred embodiment of the invention, an image capture system is provided in the vehicle which captures a plurality of images, each with a representation of an area surrounding the vehicle, as an image sequence and generates image data corresponding to the representation for each captured image. In addition, a vehicle transmitting unit is provided which transmits at least a portion of the image data to the stationary receiving unit. The image capture system in particular generates compressed image data, compressed for example by a JPEG compression program or a program for MP4 compression. Furthermore, only the image data of a local detail of an image captured by the image capture system may be transmitted to the stationary receiving unit, and the stationary processing unit may then process only the image data of this local detail. In contrast to a component arranged in or on the vehicle, which is thereby a mobile unit or vehicle unit, a stationary unit has a fixed geographical position at least during its operation. In particular, the stationary units remain in their respective geographical positions during the processing of the image data and during the generation of the driver assistance data.
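The compress-before-transmit step can be sketched as follows. This is a minimal illustration only: Python's `zlib` stands in for the lossy JPEG/MP4 codecs named above, and the raw frame format is an assumption, not something the patent specifies.

```python
import zlib

def compress_frame(raw_frame: bytes, level: int = 6) -> bytes:
    """Compress one captured frame before wireless transmission.

    zlib is used here as a stand-in for the JPEG or MP4 compression
    mentioned in the text; a real image capture system would use a
    lossy image or video codec.
    """
    return zlib.compress(raw_frame, level)

def decompress_frame(payload: bytes) -> bytes:
    """Inverse step, as performed in the stationary processing unit."""
    return zlib.decompress(payload)

# A synthetic grayscale frame (640x480, one byte per pixel).
frame = bytes(640 * 480)
payload = compress_frame(frame)
assert decompress_frame(payload) == frame
assert len(payload) < len(frame)  # uniform data compresses well
```

With a lossless codec, as here, the round trip reproduces the frame exactly; a deployed system using JPEG would accept some loss in exchange for a much smaller payload.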
The image capture system can in particular capture 10 to 30 images per second and send their image data to a stationary receiving unit. The transmission between the vehicle and a stationary receiving unit located within the transmission range of the vehicle preferably takes place by wireless data transmission, for example via known WLAN or mobile wireless data transmission links. Alternatively, the transmission may take place via line-of-sight links, for example microwave or laser transmission links.
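Since only a portion of the captured image data needs to be transmitted, the sequence can be thinned before sending. The every-nth-frame policy below is an assumed sketch; the patent only states the 10 to 30 images per second capture rate, not a selection strategy.

```python
def select_frames(frames, capture_rate_hz=30, transmit_rate_hz=10):
    """Thin a captured image sequence before wireless transmission.

    Keeps roughly transmit_rate_hz out of every capture_rate_hz frames
    by taking every n-th frame. The rates and the policy itself are
    illustrative assumptions.
    """
    step = max(1, capture_rate_hz // transmit_rate_hz)
    return frames[::step]

frames = list(range(60))        # two seconds of capture at 30 fps
sent = select_frames(frames)    # every third frame -> 10 fps
assert len(sent) == 20
assert sent[:3] == [0, 3, 6]
```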
Further, it is advantageous to provide a vehicle receiving unit for receiving the driver assistance data emitted by the stationary transmitting unit. The data transmitted from the vehicle to the stationary receiving unit and the data transmitted from the stationary transmitting unit to the vehicle receiving unit are preferably provided with a user identifier or vehicle identifier of the vehicle, to ensure that the driver assistance data are delivered to the vehicle from which the processed image data originated. Furthermore, it is advantageous to provide a processing unit arranged in the vehicle which processes the received driver assistance data and outputs information to the driver via a human-machine interface (HMI). Alternatively or additionally, the processing unit may control at least one vehicle system of the vehicle in dependence on the received driver assistance data. In particular, the vehicle system may be a lighting system, a braking system, a steering system, a transmission system, a safety system and/or a warning system. The assistance system can thus, if desired, actively intervene in the guidance of the vehicle, which can prevent dangerous situations or reduce the occurrence of accidents.
Further, it is advantageous for the stationary processing unit to detect and classify the representations of objects in the image when processing the received image data, and to generate the driver assistance data on the basis of the classified objects. By classifying the representations of objects, traffic conditions, hazards and related information can be derived.
Further, the stationary processing unit may determine the image position of a classified object and/or the relative position of the classified object with respect to the vehicle and/or the position of the classified object in a coordinate system independent of the vehicle, e.g. a world coordinate system. In this way, further traffic situations can be characterized and specific dangerous situations can be detected.
Further, it is advantageous to provide an image capture system that includes at least one stereo camera. The images of the individual cameras of the stereo camera can be transmitted as image data in the form of image pairs from the vehicle transmitting unit to the stationary receiving unit and on to the stationary processing unit. The stationary processing unit can then determine the representations of the same object in the images of each image pair, determine their image positions, and from these image positions determine the distance of the object relative to the stereo camera and thus relative to the vehicle. In this way, the distance of the vehicle from the object can be determined relatively accurately.
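The distance determination from the image positions in an image pair can be sketched with the standard stereo triangulation relation Z = f·B/d. The pinhole camera model, the parameter names and the example values are assumptions; the patent does not fix a particular computation.

```python
def stereo_distance(focal_px: float, baseline_m: float,
                    x_left_px: float, x_right_px: float) -> float:
    """Distance of an object from the stereo camera via triangulation.

    The same object is located in both images of an image pair; the
    difference of its horizontal image positions (the disparity d)
    yields the distance Z = f * B / d, with focal length f in pixels
    and camera baseline B in meters. Model and names are assumptions.
    """
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("object in front of the camera needs positive disparity")
    return focal_px * baseline_m / disparity

# Example: 800 px focal length, 30 cm baseline, 8 px disparity.
z = stereo_distance(800.0, 0.30, 412.0, 404.0)
assert abs(z - 30.0) < 1e-9  # 800 * 0.30 / 8 = 30 m
```

Note the inverse relation: halving the disparity doubles the estimated distance, which is why distant objects are measured less accurately than near ones.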
Further, the stationary receiving unit may receive, in addition to the image data from the vehicle, additional data with supplementary information. The supplementary information may comprise, in particular, the current position of the vehicle, the speed of the vehicle, information about weather conditions at the location of the vehicle, information about visibility conditions in the region of the vehicle, information about settings and/or operating states of the vehicle, for example the set light distribution of the vehicle's headlights, and/or information detected by vehicle sensors, for example detected lane markings or a determined distance from objects, in particular from other vehicles. In this way, more source information is available for generating the driver assistance data, so that the driver assistance information contained in the driver assistance data can be determined more accurately and/or at relatively low cost.
The method having the features of the independent method claim can be extended in the same way as explained for the data processing system according to the invention.
Other features and advantages of the present invention will be pointed out with particularity in the description that follows, and in more detail, specific embodiments of the present invention will be described with reference to the accompanying drawings.
Fig. 1 shows a schematic overall configuration of a driver assistance system according to a first embodiment of the invention.
Fig. 2 shows a block diagram of a driver assistance system according to a second embodiment of the invention.
Fig. 3 shows a flow chart of the data transmission of the driver assistance system according to the invention.
Fig. 1 shows a schematic overall configuration of a driver assistance system 10 according to a first embodiment of the invention. The vehicle 12 traveling on the lane 14 of the road 16 has a camera 20 for capturing images of the road area in front of the vehicle 12; the camera 20 is disposed inside the windshield of the vehicle 12, between the windshield and the interior rearview mirror 18. Solid lines 22 and 24 show the outer sight lines of the camera 20. The elliptical areas enveloped between the sight lines 22, 24 indicate the detection area of the camera 20 at various distances. The vehicle 12 also has a transmitting/receiving unit 26 for transmitting the image data generated by means of the camera 20. The image data are sent to the stationary transmitting/receiving unit 30a. Along the road 16, stationary transmitting and receiving units, such as the stationary transmitting/receiving units 30b and 30c shown in Fig. 1, are disposed at appropriate intervals. The image data are preferably transmitted in a compressed format between the transmitting/receiving unit 26 of the vehicle 12 and each of the stationary transmitting/receiving units 30a to 30c. The transmitting/receiving units 26, 30a to 30c are also referred to as transceivers.
The image data received by the stationary transmitting/receiving units 30a to 30c are transmitted to the stationary processing unit within the data processing center 40, where they are preferably decompressed in the conversion module 42 of the stationary processing unit and provided to the respective modules 44, 46, which generate the driver assistance information. By means of the modules 44, 46, representations of objects relevant to the driver assistance system can be detected in the images, classified and, if necessary, tracked through several successively captured images. Based on the driver assistance information generated by the modules 44, 46, driver assistance data containing the driver assistance information required for providing the driver assistance function in the vehicle 12 are generated in an output module 48 and transmitted to at least one stationary transmitting/receiving unit 30a to 30c located within the transmission range of the vehicle 12. The driver assistance data are then transmitted from this transmitting/receiving unit 30a to 30c to the vehicle 12. In the vehicle 12, a control unit (not shown) processes the driver assistance data and, depending on the driver assistance function to be performed, provides driver assistance information to a control unit for controlling vehicle components and/or outputs corresponding information on a display unit or informs the driver of the vehicle 12 via a loudspeaker.
Fig. 2 shows a block diagram of a driver assistance system according to a second embodiment of the invention. Elements having the same structure or the same function are identified by the same reference numerals. In the second embodiment, the camera system 20 of the vehicle 12 is designed as a stereo camera, wherein each individual camera of the camera system 20 generates a single image at the time of capture, and the captured images are then further processed as an image pair. The captured image data are sent from the camera system 20 to the conversion module 52, which compresses the image data and adds supplementary data with additional information. In particular, the image data receive a time stamp generated by the time stamp module 54. Data with additional information are in particular vehicle data such as the activation of the direction indicators, the setting of the headlights, the activation of the reversing and brake lights, information relating to the activation of the brakes, and other vehicle data, preferably provided via a vehicle bus. In addition, position data are preferably sent to the conversion module 52 from the position determination module 58, which is part of the navigation system of the vehicle 12. The time stamp, the vehicle data and the position data are transmitted as additional data together with the image data to the transmitting/receiving unit 26 of the vehicle, from which they are transmitted via a wireless data connection of the communication network 30 to the transmitting/receiving unit 30c. The received data are forwarded from the transmitting/receiving unit 30c to the data processing center 40. In contrast to the first embodiment, an additional memory element 49 is provided in the data processing center 40, in which the image data can be temporarily stored.
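The bundling performed by the conversion module 52 — compressed image pair plus time stamp, vehicle data and position data — might be sketched as below. The JSON envelope, field names and codecs are illustrative assumptions; the patent does not specify a wire format.

```python
import base64
import json
import time
import zlib

def build_message(image_pair, vehicle_data, position):
    """Bundle compressed image data with the additional data described
    above (time stamp, vehicle data, position data) into one payload
    for the transmitting/receiving unit. Envelope and field names are
    assumptions made for this sketch.
    """
    return json.dumps({
        "timestamp": time.time(),                      # time stamp module 54
        "images": [base64.b64encode(zlib.compress(img)).decode("ascii")
                   for img in image_pair],             # conversion module 52
        "vehicle": vehicle_data,                       # e.g. lights, brakes
        "position": position,                          # position module 58
    })

msg = build_message([b"left-image", b"right-image"],
                    {"brake_active": False, "indicator": "off"},
                    {"lat": 51.67, "lon": 8.34})
decoded = json.loads(msg)
assert len(decoded["images"]) == 2
assert zlib.decompress(base64.b64decode(decoded["images"][0])) == b"left-image"
```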
Preferably, the stored image data are deleted after a preset time, for example after one day, unless there is a need to store the data permanently. This is particularly useful when the vehicle camera 20 has captured images of an accident, since these images are to be stored for later evaluation.
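The retention policy for the memory element 49 can be sketched as follows. The record layout and the "keep" flag for accident footage are assumptions; the one-day retention period comes from the text.

```python
import time

RETENTION_SECONDS = 24 * 60 * 60  # "after one day", per the text

def purge_stored_images(store, now=None):
    """Drop temporarily stored image records once they exceed the
    retention period, unless they are flagged for permanent storage
    (e.g. images of an accident kept for later evaluation). The
    record structure is an assumed sketch.
    """
    now = time.time() if now is None else now
    return [rec for rec in store
            if rec["keep"] or now - rec["stored_at"] < RETENTION_SECONDS]

store = [
    {"id": 1, "stored_at": 0.0, "keep": False},       # expired
    {"id": 2, "stored_at": 0.0, "keep": True},        # accident footage
    {"id": 3, "stored_at": 90_000.0, "keep": False},  # still fresh
]
kept = purge_stored_images(store, now=100_000.0)
assert [r["id"] for r in kept] == [2, 3]
```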
The evaluation of the transmitted image data, the generation of the driver assistance information and the transmission of the generated driver assistance information to the transmitting/receiving unit 26 of the vehicle 12 by means of the respective driver assistance data take place in the same way as described with reference to Fig. 1. The received driver assistance data are provided to the control unit 60, which generates data from the driver assistance information and provides them to the module 56 for output via an output unit of the vehicle 12. Additionally or alternatively, the control unit 60 may generate control data for vehicle modules, such as control data for activating a braking system 62, a steering system 64, a seatbelt buckle actuator 66 and a headrest actuator 68.
Fig. 3 illustrates the flow of generating and transferring data between the vehicle 12 and the stationary processing unit of the data processing center 40. In step S10, the camera 20 generates image data, which are compressed in step S12. In parallel with step S10, vehicle data are determined in step S14, position data are determined in step S16, a time stamp is generated in step S18, and data from other data sources of the vehicle 12 are determined in step S20. When the image data are compressed in step S12, a part of the image data generated by the camera 20 may be selected and prepared for transmission. In step S24, the compressed image data are transmitted, together with the additional data determined in steps S14 to S20, from the transmitting/receiving unit 26 of the vehicle 12 to the stationary transmitting/receiving unit 30c, which receives the transmitted data in step S30. The received image data and preferably the transmitted additional data are then processed by the stationary processing unit 40 in step S32, wherein the image data are decompressed in step S34 and analyzed in step S36 together with the additional data. The image data, or the information determined from the image data, and, if necessary, the transmitted additional information are supplied to the modules for generating driver assistance information. In step S38, these modules generate the driver assistance information. These modules include, in particular, at least one lane recognition module, a traffic sign recognition module, a lamp control module, an object detection module, an object verification module and a so-called night vision module, which enhances the visibility of poorly visible objects to the driver by projecting them onto the windshield.
Basically, modules may be provided for all known driver assistance functions and for further driver assistance functions; in step S38, these modules generate the respective driver assistance information required for the respective driver assistance function in the vehicle 12. In step S40, the driver assistance data with the driver assistance information are then transmitted to the transmitting/receiving unit 26 of the vehicle 12 via the stationary transmitting unit 30c.
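The per-module generation of driver assistance information in step S38 can be sketched as a simple dispatch over a module registry. The two example modules, their outputs and the registry structure are illustrative assumptions based on the module list above.

```python
def lane_recognition(image_data, extra):
    """Hypothetical lane recognition module (illustrative only)."""
    return {"type": "lane", "departure_warning": False}

def traffic_sign_recognition(image_data, extra):
    """Hypothetical traffic sign recognition module (illustrative only)."""
    return {"type": "sign", "speed_limit_kmh": extra.get("limit")}

# Module registry of the data processing center; the set of modules
# and their result fields are assumptions for this sketch.
MODULES = [lane_recognition, traffic_sign_recognition]

def generate_driver_assistance_data(image_data, extra):
    """Step S38: each module derives its driver assistance information
    from the decompressed image data and the transmitted additional
    data; the collected results form the driver assistance data sent
    back to the vehicle in step S40."""
    return [module(image_data, extra) for module in MODULES]

data = generate_driver_assistance_data(b"frame", {"limit": 80})
assert data[1]["speed_limit_kmh"] == 80
```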
In step S42, the transmitting/receiving unit 26 of the vehicle 12 receives the driver assistance data and provides them to the information module, the warning module and the action module of the vehicle 12. The vehicle 12 processes the driver assistance data in step S44 and outputs corresponding information to the driver via a human-machine interface (HMI); additionally or alternatively, the vehicle 12 initiates an action of a vehicle component in step S48, for example the activation of the vehicle's braking system, steering system, safety equipment and/or lighting system.
It is particularly advantageous to design the vehicle components required for the driver assistance system according to the invention as structurally simple components requiring little space; such components are easier to install in new vehicles and can be retrofitted to existing vehicles. Likewise, the modules for generating the required driver assistance information can easily be updated and expanded centrally in the data processing center 40. In this way, these functions are also easily available on demand. Region-specific data, in particular country-specific data, for example for traffic sign recognition and lane recognition, may also be stored centrally in the stationary processing unit 40 and used for generating driver assistance information depending on the position of the vehicle 12.
The image data may be transmitted from the vehicle 12 to the stationary receiving unit 30 using a known mobile wireless network, a wireless LAN, or a broadband data network currently under test in the field of mobile communications. Alternatively or additionally, a line-of-sight microwave link may be employed to transmit data between the vehicle 12 and the stationary receiving/transmitting unit 30c. As an alternative to the illustrated embodiment, each stationary transmitting/receiving unit 30a to 30c may include a stationary processing unit 40 for processing the image data transmitted from the vehicle 12, or may be connected to such a processing unit 40.
By means of the invention, a space-saving design of the vehicle camera 20 and the transmitting/receiving unit 26 of the vehicle 12 can be achieved, so that one and the same configuration can be used as universally as possible in a plurality of vehicles. These vehicle components 20, 26 are suitable for use in any country without requiring country-specific adjustments to the software and/or hardware in the vehicle. Country-specific features are taken into account by selecting or configuring software modules in the data processing center 40. For object recognition, representations of traffic signs, lanes and other objects are evaluated. On this basis, assistance can be provided, for example, for light control and/or other presently known driver assistance functions; the illustrated system is, however, readily extendable to other applications as well. The preparation of the image information detected by the camera 20, preferably as compressed image data, can be carried out by suitable electronic means, preferably a microprocessor; the transmitting/receiving unit 26 transmits these data, together with additional data if the application requires them, to the stationary transmitting/receiving units 30a to 30c. In the data processing center 40, the image data are evaluated with regard to the features relevant for the driver assistance function. On this basis, driver assistance information is generated, which is transmitted in the form of data from the data processing center 40 to the stationary transmitting/receiving units 30a to 30c, and from these to the transmitting/receiving unit 26 of the vehicle 12. In the vehicle 12, at least one image sensor 20, i.e. at least one monocular camera, is provided, by means of which preferably a road area in front of the vehicle is captured.
The driver assistance functions generated from the driver assistance data may comprise, in particular, conventional information and/or warning or action information for the driver. Because the image information is evaluated outside the vehicle 12, the vehicle 12 needs only relatively few resources to provide the driver assistance functionality. Likewise, the vehicle 12 requires no, or only a relatively small, memory capacity for storing comparison data used to classify objects. By processing and evaluating the image data in the central data processing center 40, country-dependent or region-dependent image recognition can be carried out. In addition, when generating the driver assistance information, the stationary processing unit 40 can take into account rapidly changing road conditions, such as changes in the course of a road due to road construction, and, when determining the driver assistance data, information transmitted by other vehicles. As described with reference to Fig. 2, the images sent to the stationary processing unit 40 may be stored by suitable storage means for at least a defined time. In addition to the already mentioned use as accident documentation, driver assistance information generated from the images, for example allegedly incorrect driver assistance information, can be verified against the stored images, which helps in handling driver complaints.
It is particularly advantageous if the updating and expansion of the modules for generating driver assistance information from the provided image data can be carried out centrally in the data processing center 40. The driver assistance information generated in the data processing center 40 from the transmitted image data, and/or the driver assistance information transmitted to the vehicle, can be defined according to the driver assistance functions, software licenses and/or software modules activated for the vehicle 12. Such activation may be based, for example, on user identification and/or vehicle identification. Individual driver assistance functions may also be limited to one country, for example. For instance, a driver or customer may book the traffic sign recognition module for Germany; the data processing center 40 then generates the corresponding driver assistance information from the image data transmitted to it and transmits this information to the vehicle 12. On this basis, visual and/or acoustic information about the recognized traffic signs is output to the driver. Additionally or alternatively, the transmitted driver assistance information can be processed further, for example in the event of speeding, by providing it to a system for generating a warning function or to a drive control for limiting the speed.
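The license-based activation described above amounts to gating the available modules per vehicle before any driver assistance information is generated. The identifiers, module names and registry below are hypothetical, introduced only to illustrate the gating step.

```python
# Hypothetical activation registry of the data processing center;
# vehicle IDs and module names are invented for this sketch.
ACTIVATED_MODULES = {
    "VEHICLE-0001": {"traffic_sign_recognition_de", "lane_keeping"},
    "VEHICLE-0002": {"traffic_sign_recognition_de"},
}

def modules_for_vehicle(vehicle_id, available_modules):
    """Return only the driver assistance modules activated for this
    vehicle (per booked software license), so the data processing
    center generates and transmits only the agreed driver assistance
    information for it.
    """
    active = ACTIVATED_MODULES.get(vehicle_id, set())
    return [m for m in available_modules if m in active]

available = ["lane_keeping", "traffic_sign_recognition_de", "night_vision"]
assert modules_for_vehicle("VEHICLE-0002", available) == ["traffic_sign_recognition_de"]
assert modules_for_vehicle("UNKNOWN-VEHICLE", available) == []
```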
Both monocular and stereo cameras capturing color images or grayscale images may be used as the vehicle camera 20. These cameras comprise, in particular, at least one CMOS sensor or at least one CCD sensor for capturing images.

Claims (12)

1. A data processing system for providing at least one driver assistance function, comprising
At least one stationary receiving unit (30a to 30c) for receiving image data which have been generated by capturing at least one image of the surroundings of a vehicle (12) by means of at least one image capturing unit (20) of the vehicle (12);
at least one stationary processing unit (40) for processing at least a part of the received image data, wherein the stationary processing unit (40) generates driver assistance data with at least one type of driver assistance information on the basis of the image data, wherein at least one driver assistance function can be generated in the vehicle (12) depending on the generated driver assistance information; and
at least one transmitting unit (30a to 30c) for transmitting the driver assistance data to the vehicle (12).
2. A data processing system according to claim 1, characterized in that an image capturing unit (20) of the vehicle (12) captures a plurality of images having an appearance of the surrounding area of the vehicle (12) to form an image sequence, and generates image data corresponding to the appearance for each captured image; and
a vehicle transmission unit (26) transmits at least a part of the image data of the images to the stationary receiving units (30a to 30 c).
3. The data processing system as claimed in any one of the preceding claims, characterized in that a vehicle receiving unit (26) receives the driver assistance data transmitted by the stationary transmitting units (30a to 30c).
4. A data processing system according to claim 3, characterized in that a processing unit arranged in the vehicle (12) processes the received driver assistance data and outputs information via a human-machine interface and/or controls at least one vehicle system of the vehicle (12).
5. A data processing system according to claim 4, characterized in that the vehicle system comprises a lighting system, a braking system, a steering system, a transmission system and/or a warning system.
6. The data processing system of one of the preceding claims, wherein the stationary processing unit (40) detects and classifies objects imaged in the received image data when processing the image data, and generates the driver assistance data on the basis of the classified objects.
7. The data processing system according to claim 6, characterized in that the stationary processing unit (40) determines the image position of a classified object and/or the relative position of the classified object with respect to the vehicle (12) and/or the position of the classified object in a coordinate system independent of the vehicle (12).
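Claim 7 distinguishes three position representations for a classified object: its image position, its position relative to the vehicle, and its position in a vehicle-independent coordinate system. The conversion between the last two can be sketched as a planar rigid transform, assuming a flat 2-D world; the vehicle pose values and function names are illustrative assumptions, not from the patent:

```python
# Sketch of converting a vehicle-relative object position into a
# vehicle-independent (world) coordinate system, as distinguished in claim 7.
# Assumes a flat 2-D world: rotate by the vehicle heading, then translate
# by the vehicle's own world position. All values are illustrative.
import math

def object_world_position(rel_x: float, rel_y: float,
                          veh_x: float, veh_y: float,
                          veh_heading_rad: float) -> tuple[float, float]:
    """rel_x points forward along the vehicle axis, rel_y to the left;
    (veh_x, veh_y) and veh_heading_rad give the vehicle pose in the
    world frame."""
    wx = veh_x + rel_x * math.cos(veh_heading_rad) - rel_y * math.sin(veh_heading_rad)
    wy = veh_y + rel_x * math.sin(veh_heading_rad) + rel_y * math.cos(veh_heading_rad)
    return (wx, wy)

# Vehicle at the world origin heading 90 degrees (north); object 10 m ahead
print(object_world_position(10.0, 0.0, 0.0, 0.0, math.pi / 2))
```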
8. Data processing system according to one of the preceding claims, characterized in that the image capturing unit comprises at least one stereo camera (20), wherein the images captured by the stereo camera (20) are transmitted from the vehicle transmission unit (26) to the stationary receiving unit (30a to 30c) as image data of image pairs.
9. The data processing system of claim 8, characterized in that the stationary processing unit (40) determines the image of the same object in both images of an image pair, determines the respective image positions, and determines the distance of the object relative to the stereo camera (20) on the basis of these image positions.
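The distance estimation of claim 9 corresponds to classical stereo triangulation: the same object is located in both images of an image pair, and its distance follows from the disparity between the two image positions. A minimal sketch, assuming a rectified pin-hole stereo pair; the focal length, baseline, and pixel coordinates are illustrative assumptions, not values from the patent:

```python
# Minimal stereo-triangulation sketch for the distance estimation of claim 9.
# For a rectified stereo pair: distance = focal_length * baseline / disparity.
# All parameters are illustrative assumptions.

def stereo_distance(x_left: float, x_right: float,
                    focal_length_px: float, baseline_m: float) -> float:
    """Distance of an object from a rectified stereo camera.

    x_left / x_right : horizontal image position (pixels) of the same
                       object in the left and right image of the pair.
    focal_length_px  : focal length expressed in pixel units.
    baseline_m       : distance between the two camera centres in metres.
    """
    disparity = x_left - x_right  # pixels; larger for nearer objects
    if disparity <= 0:
        raise ValueError("object must have positive disparity")
    return focal_length_px * baseline_m / disparity

# Example: 700 px focal length, 0.25 m baseline, 10 px disparity
# -> 700 * 0.25 / 10 = 17.5 m
print(stereo_distance(400.0, 390.0, 700.0, 0.25))
```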
10. The data processing system as claimed in one of the preceding claims, characterized in that the stationary receiving unit (30a to 30c) receives, in addition to the image data, additional data with supplementary information from the vehicle (12).
11. Data processing system according to claim 10, characterized in that the supplementary information comprises the current position of the vehicle (12), its speed, information about weather conditions, information about visibility conditions, information about settings and/or operating states of the vehicle (12), such as the light distribution set for the headlights of the vehicle (12), and/or information detected by vehicle sensors, such as detected lane markings or determined distances to objects, in particular to other vehicles.
12. A method of providing at least one driver assistance function,
wherein image data which have been generated by capturing at least one image of the surroundings of a vehicle (12) by means of at least one image capturing unit (20) of the vehicle (12) are received by a stationary receiving unit (30a to 30c);
processing at least a portion of the received image data by a stationary processing unit (40), wherein the stationary processing unit (40) generates driver assistance data with at least one type of driver assistance information based on the image data;
wherein at least one driver assistance function can be generated in the vehicle (12) on the basis of the generated driver assistance information; and wherein the driver assistance data are transmitted to the vehicle (12) by a transmitting unit (30a to 30c).
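The method of claim 12 amounts to a round trip: the vehicle transmits captured image data to a stationary unit, which processes it into driver assistance data and sends that data back. A hypothetical end-to-end sketch; the function names, data structure, and the stand-in "detection" result are illustrative assumptions only:

```python
# Hypothetical sketch of the off-board processing loop of claim 12:
# the vehicle transmits captured image data to a stationary unit, which
# classifies objects and returns driver assistance data to the vehicle.
# Names and the fixed dummy detection are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class DriverAssistanceData:
    object_class: str   # e.g. "vehicle", "traffic_sign"
    distance_m: float   # distance relative to the transmitting vehicle

def stationary_processing_unit(image_data: bytes) -> list[DriverAssistanceData]:
    """Stand-in for the stationary processing unit (40): classify objects
    in the received image data and derive driver assistance information.
    Here a fixed dummy result replaces real object detection."""
    detections = [("vehicle", 17.5)]  # placeholder for a real detector
    return [DriverAssistanceData(c, d) for c, d in detections]

def vehicle_round_trip(image_data: bytes) -> list[DriverAssistanceData]:
    """Vehicle side: transmit image data, receive assistance data back."""
    return stationary_processing_unit(image_data)

result = vehicle_round_trip(b"\x00" * 16)
print(result[0].object_class, result[0].distance_m)
```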
Application HK12108581.4A, filed 2010-03-31, priority 2009-04-06: Data processing system and method for providing at least one driver assistance function, HK1167920A (en)

Applications Claiming Priority (1)

Application Number: DE102009016580.0, Priority Date: 2009-04-06

Publications (1)

Publication Number: HK1167920A, Publication Date: 2012-12-14
