CN109116357B - Method, device and server for synchronizing time - Google Patents

Method, device and server for synchronizing time

Info

Publication number
CN109116357B
Authority
CN
China
Prior art keywords: time, data, information, real, vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710497226.0A
Other languages
Chinese (zh)
Other versions
CN109116357A (en)
Inventor
李冲冲
姜媛
王超
石庭敏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN201710497226.0A
Publication of CN109116357A
Application granted
Publication of CN109116357B
Legal status: Active (current)
Anticipated expiration

Abstract

The application discloses a method, a device and a server for synchronizing time. One embodiment of the method comprises: acquiring vehicle speed data and radar data; screening out, from the vehicle speed data, data collected while the unmanned vehicle travels straight as reference data, and screening out, from the radar data, data in the same time period as the reference data as data to be synchronized; and matching the same data in the reference data and the data to be synchronized, and determining the time for synchronizing the radar data according to the first synchronization time information and the second synchronization time information corresponding to the same data. According to this embodiment, the time for synchronizing the radar data is determined by identifying the same data in the reference data and the data to be synchronized, which improves the accuracy of the radar data.

Description

Method, device and server for synchronizing time
Technical Field
The present application relates to the field of data processing technologies, in particular to the field of driverless technologies, and more particularly to a method, an apparatus, and a server for synchronizing time.
Background
The automobile extends the range over which people can travel, makes travel more convenient, and improves quality of life. With the development and progress of science and technology, an unmanned vehicle controlled by an intelligent system can acquire more driving information than a manned vehicle and offers higher safety, making it an important trend in future vehicle development. The unmanned vehicle uses a robot operating system to transmit information, and can run automatically and safely without human assistance by relying on the cooperation of artificial intelligence, visual computing, video cameras, radar sensors, lidar, and a GPS (Global Positioning System) positioning system.
Unmanned vehicles carry a variety of sensors, each of which collects data, and the processor controls the unmanned vehicle based on the collected data. However, because of differences in data size, transmission time, sampling frequency, and so on, data collected by different sensors at the same moment often reaches the processor at different times, which can lead the processor to make erroneous judgments.
Disclosure of Invention
An object of the embodiments of the present application is to provide a method, an apparatus, and a server for synchronizing time, so as to solve the technical problems mentioned in the background section above.
In a first aspect, an embodiment of the present application provides a method for synchronizing time, which is applied to an unmanned vehicle including a combined navigation system and an onboard radar, and includes: acquiring vehicle speed data and radar data, where the vehicle speed data includes real-time vehicle speed information and real-time driving angle information of the unmanned vehicle measured by the combined navigation system, and first time information corresponding to the real-time vehicle speed information and the real-time driving angle information, and the radar data includes relative speed information and relative angle information, measured by a vehicle-mounted radar, of objects other than the unmanned vehicle relative to the unmanned vehicle, and second time information corresponding to the relative speed information and the relative angle information; screening out, from the vehicle speed data, the data collected while the unmanned vehicle travels straight as reference data, and screening out, from the radar data, the data in the same time period as the reference data as data to be synchronized; and matching the same data in the reference data and the data to be synchronized, and determining the time for synchronizing the radar data according to the first synchronization time information and the second synchronization time information corresponding to the same data.
In some embodiments, the step of screening the data when the unmanned vehicle travels straight from the vehicle speed data as reference data includes: calculating the angle change rate according to the real-time driving angle information; and taking the corresponding real-time vehicle speed information and the first time information when the angle change rate is 0 as reference real-time vehicle speed information and reference first time information.
In some embodiments, the screening out data of the same time period as the reference data from the radar data as data to be synchronized includes: selecting second time information to be synchronized, which has the same time period as the time period corresponding to the reference first time information, from the radar data; and determining the relative speed information to be synchronized corresponding to the second time information to be synchronized.
In some embodiments, the matching the same data in the reference data and the data to be synchronized includes: respectively drawing a first speed waveform of the reference real-time speed information and a second speed waveform of the relative speed information to be synchronized based on the reference first time information and the second time information to be synchronized; setting a plurality of mark points for the first speed waveform, and determining waveform data corresponding to the mark points as first identical data, wherein the mark points are used for marking the waveform characteristics of the first speed waveform; and matching data corresponding to the plurality of mark points from the second velocity waveform as second identical data.
In some embodiments, the determining the time for synchronizing the radar data according to the first synchronization time information and the second synchronization time information corresponding to the same data includes: respectively determining first synchronization time information and second synchronization time information corresponding to the first identical data and the second identical data; and calculating a time difference between a time corresponding to the first synchronization time information and a time corresponding to the second synchronization time information, and setting the time difference as a time for synchronizing the radar data.
In a second aspect, the present application provides an apparatus for synchronizing time, which is applied to an unmanned vehicle including a combined navigation system and a vehicle-mounted radar, and includes: a data acquisition unit, configured to acquire vehicle speed data and radar data, where the vehicle speed data includes real-time vehicle speed information and real-time driving angle information of the unmanned vehicle measured by the combined navigation system, and first time information corresponding to the real-time vehicle speed information and the real-time driving angle information, and the radar data includes relative speed information and relative angle information, measured by the vehicle-mounted radar, of objects other than the unmanned vehicle relative to the unmanned vehicle, and second time information corresponding to the relative speed information and the relative angle information; a data extraction unit, configured to screen out, from the vehicle speed data, the data collected while the unmanned vehicle travels straight as reference data, and to screen out, from the radar data, the data in the same time period as the reference data as data to be synchronized; and a synchronization unit, configured to match the same data in the reference data and the data to be synchronized, and to determine the time for synchronizing the radar data according to the first synchronization time information and the second synchronization time information corresponding to the same data.
In some embodiments, the data extracting unit includes: the angle change rate calculation subunit is used for calculating the angle change rate according to the real-time driving angle information; and a reference data determining subunit, configured to use the real-time vehicle speed information and the first time information corresponding to the angular change rate of 0 as reference real-time vehicle speed information and reference first time information.
In some embodiments, the data extracting unit includes: a to-be-synchronized second time information selecting subunit, configured to select, from the radar data, the to-be-synchronized second time information in the same time period as the time period corresponding to the reference first time information; and a to-be-synchronized relative speed information determining subunit, configured to determine the to-be-synchronized relative speed information corresponding to the to-be-synchronized second time information.
In some embodiments, the synchronization unit includes: a speed waveform drawing subunit, configured to draw a first speed waveform of the reference real-time vehicle speed information and a second speed waveform of the to-be-synchronized relative speed information, respectively, based on the reference first time information and the to-be-synchronized second time information; a first identical data acquiring subunit, configured to set a plurality of mark points for the first speed waveform, and determine waveform data corresponding to the mark points as first identical data, where the mark points are used to mark a waveform feature of the first speed waveform; and a second identical data acquisition subunit, configured to match data corresponding to the multiple mark points from the second velocity waveform as second identical data.
In some embodiments, the synchronization unit includes: a synchronization time information determining subunit, configured to determine first synchronization time information and second synchronization time information corresponding to the first identical data and the second identical data, respectively; and a synchronization time setting subunit configured to calculate a time difference between a time corresponding to the first synchronization time information and a time corresponding to the second synchronization time information, and set the time difference as a time for synchronizing the radar data.
In a third aspect, an embodiment of the present application provides a server, including: one or more processors; a memory for storing one or more programs; an integrated navigation system for detecting, in real time, real-time vehicle speed information and real-time driving angle information of the unmanned vehicle, and first time information corresponding to the real-time vehicle speed information and the real-time driving angle information; and a vehicle-mounted radar for monitoring relative speed information and relative angle information of objects other than the unmanned vehicle relative to the unmanned vehicle, and second time information corresponding to the relative speed information and the relative angle information, where the one or more programs, when executed by the one or more processors, cause the one or more processors to perform the method for synchronizing time of the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium on which a computer program is stored, the computer program, when executed by a processor, implementing the method for synchronizing time of the first aspect.
According to the method, the device and the server for synchronizing time provided by the embodiments of the present application, the reference data collected during straight-line driving are determined from the vehicle speed data monitored by the integrated navigation system, the data to be synchronized corresponding to the reference data are found from the radar data monitored by the vehicle-mounted radar, and the time for synchronizing the radar data is determined by identifying the same data in the reference data and the data to be synchronized, thereby improving the accuracy of the radar data.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is an exemplary system architecture diagram in which the present application may be applied;
FIG. 2 is a flow diagram for one embodiment of a method for synchronizing time according to the present application;
FIG. 3 is a waveform diagram of a portion of vehicle speed data detected by the integrated navigation system;
FIG. 4 is a waveform diagram of a portion of radar data detected by the onboard radar;
FIG. 5 is a real-time vehicle speed waveform plot for straight-line travel in vehicle speed data;
FIG. 6 is a waveform diagram of relative vehicle speed when driving straight in radar data;
FIG. 7 is a schematic diagram of an application scenario of a method for synchronizing time according to the present application;
FIG. 8 is a schematic block diagram illustrating one embodiment of an apparatus for synchronizing time according to the present application;
FIG. 9 is a block diagram of a computer system suitable for use in implementing a server according to embodiments of the present application.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings.
It should be noted that the embodiments and the features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments and the attached drawings.
Fig. 1 illustrates an exemplary system architecture 100 to which embodiments of the present method for synchronizing time or apparatus for synchronizing time may be applied.
As shown in fig. 1, the system architecture 100 may include unmanned vehicles 101, 102, 103, a network 104, and a server 105. Network 104 serves as a medium to provide communication links between unmanned vehicles 101, 102, 103 and server 105. Network 104 may include various connection types, such as wired or wireless communication links, or fiber optic cables, to name a few.
The user may use the unmanned vehicles 101, 102, 103 to interact with the server 105 over the network 104 to receive or send messages or the like. The unmanned vehicles 101, 102, 103 may have various information monitoring devices mounted thereon, such as a combined navigation system, an in-vehicle radar, a GPS, and the like.
The unmanned vehicles 101, 102, 103 may be various vehicles having a plurality of information monitoring devices and data processing units, including but not limited to electric vehicles, hybrid electric vehicles, internal combustion engine vehicles, and the like.
The server 105 may be a server that provides various services, such as a server that processes data obtained by the unmanned vehicles 101, 102, 103 and derives control instructions for the unmanned vehicles 101, 102, 103. The server may receive vehicle speed data and radar data monitored by the unmanned vehicles 101, 102, 103, determine information such as the relative distances and relative speeds between the unmanned vehicles 101, 102, 103 and other objects and the speeds of the unmanned vehicles 101, 102, 103 themselves according to the vehicle speed data and the radar data, determine the driving states of the unmanned vehicles 101, 102, 103 according to this information, and control the driving of the unmanned vehicles 101, 102, 103.
It should be noted that the method for synchronizing time provided by the embodiment of the present application is generally performed by the unmanned vehicles 101, 102, 103, and accordingly, the device for synchronizing time is generally disposed in the unmanned vehicles 101, 102, 103.
It should be understood that the number of unmanned vehicles, networks, and servers in fig. 1 is merely illustrative. There may be any number of unmanned vehicles, networks, and servers, as desired for implementation.
With continued reference to FIG. 2, a flow 200 of one embodiment of a method for synchronizing time in accordance with the present application is shown. The method for synchronizing time may include the steps of:
step 201, vehicle speed data and radar data are acquired.
In the present embodiment, an electronic device (for example, the unmanned vehicles 101, 102, 103 shown in fig. 1) on which the method for synchronizing time runs may acquire vehicle speed data and radar data through a wired connection or a wireless connection, where the vehicle speed data includes real-time vehicle speed information and real-time travel angle information of the unmanned vehicle measured by a combined navigation system, and first time information corresponding to the real-time vehicle speed information and the real-time travel angle information, and the radar data includes relative speed information and relative angle information, measured by a vehicle-mounted radar, of objects other than the unmanned vehicle relative to the unmanned vehicle, and second time information corresponding to the relative speed information and the relative angle information. It should be noted that the wireless connection may include, but is not limited to, a 3G/4G connection, a WiFi connection, a Bluetooth connection, a WiMAX connection, a Zigbee connection, a UWB (Ultra Wideband) connection, and other wireless connection means now known or developed in the future.
The unmanned vehicles 101, 102, 103 need to acquire information about surrounding objects in order to travel according to that information. The unmanned vehicles 101, 102, 103 in the present application can acquire information about themselves and about surrounding objects by means of a combined navigation system, a vehicle-mounted radar, and the like. The combined navigation system is installed on the unmanned vehicles 101, 102, 103 and is used for monitoring data of the unmanned vehicles 101, 102, 103. It can acquire the real-time vehicle speed information and real-time travel angle information of the unmanned vehicles 101, 102, 103, together with the time information corresponding to them; that is, the information acquired by the combined navigation system is information without time delay, as shown in fig. 3. In fig. 3, the left ordinate is the real-time vehicle speed coordinate, the right ordinate is the real-time travel angle coordinate, and the abscissa is the first time coordinate. As can be seen from fig. 3, during seconds 1-3 the unmanned vehicles 101, 102, 103 change the driving angle from about 50 degrees to 90 degrees while the vehicle speed decreases from 90 km/h to 40 km/h, indicating that the unmanned vehicles 101, 102, 103 are in a deceleration turning stage; during seconds 3-11 the driving angle does not change while the speed increases from 40 km/h to 100 km/h and then decreases from 100 km/h to 55 km/h, indicating a straight-going stage; during seconds 11-14 the driving angle changes from about 90 degrees to 120 degrees while the vehicle speed decreases from 55 km/h to 40 km/h, indicating that the unmanned vehicles 101, 102, 103 are again in a deceleration turning stage.
The vehicle-mounted radar is generally installed at the head position of the unmanned vehicles 101, 102, 103 and is configured to determine the relative speed information, relative angle information, and corresponding time information of other objects relative to the unmanned vehicles 101, 102, 103 by monitoring information such as the distances and angles between the unmanned vehicles 101, 102, 103 and objects other than the unmanned vehicles 101, 102, 103 (such as vehicles, buildings, trees, pedestrians, and the like), as shown in fig. 4. In fig. 4, the left ordinate is the relative vehicle speed coordinate, the right ordinate is the relative travel angle coordinate, and the abscissa is the second time coordinate. The waveforms in fig. 4 and fig. 3 are similar because both the combined navigation system and the onboard radar can measure accurate data for the unmanned vehicles 101, 102, 103, apart from the time delay between the two. In fig. 4, only the relative speed information, the relative angle information, and the second time information are displayed; other information (e.g., distance information) is not shown. The vehicle-mounted radar may be installed at the head position of the unmanned vehicles 101, 102, 103, or at the tail or other positions. It can simultaneously monitor information such as the distances and angles of dozens or even hundreds of objects. In practice, the number of moving objects around the unmanned vehicles 101, 102, 103 is not large, so that even in extreme cases (e.g., many vehicles moving around the unmanned vehicles 101, 102, 103) the onboard radar obtains data from only a small number of moving objects; most of the data monitored by the onboard radar belongs to stationary objects (e.g., parked vehicles, buildings, trees, and the like) surrounding the unmanned vehicles 101, 102, 103. When determining information such as the relative speed and relative angle of other objects with respect to the unmanned vehicles 101, 102, 103, the data that occupies the dominant proportion may be selected from the data obtained by the in-vehicle radar. For example, suppose the vehicle-mounted radar monitors data of 100 objects, of which 90 show consistent data change trends (speed change trend and angle change trend) and 10 show irregular trends. This indicates that the 90 objects are stationary, while the 10 objects may be other vehicles traveling in the same direction as the unmanned vehicles 101, 102, 103, other vehicles traveling in a different direction, or pedestrians, animals, and the like in motion. The vehicle-mounted radar can then determine information such as the relative speed and relative angle of other objects with respect to the unmanned vehicles 101, 102, 103 from the monitored data of those 90 objects, which is also the information of the stationary objects around the unmanned vehicles 101, 102, 103.
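By way of illustration only, the following is a minimal sketch of how the dominant-proportion selection just described could be realized, assuming the radar output has already been grouped into per-object relative-speed tracks sampled on a common set of frames; the function name, the tolerance parameter, and the data layout are assumptions made for this example, not part of the patent.

```python
import numpy as np

def dominant_relative_speed(object_tracks, trend_tolerance=0.5):
    """Pick the dominant (most common) speed trend among radar targets.

    object_tracks: dict mapping object id -> 1-D array of relative speeds
    sampled at the radar frame times. Objects whose speed trend matches the
    majority are assumed stationary, so their mean relative speed per frame
    approximates the vehicle's own motion.
    """
    ids = list(object_tracks)
    # Characterize each track by its frame-to-frame speed changes.
    deltas = {i: np.diff(object_tracks[i]) for i in ids}
    # Count, for each track, how many tracks change in step with it.
    support = {
        i: sum(
            np.allclose(deltas[i], deltas[j], atol=trend_tolerance)
            for j in ids
        )
        for i in ids
    }
    # Keep the tracks that agree with the best-supported trend.
    anchor = max(support, key=support.get)
    consistent = [
        object_tracks[i]
        for i in ids
        if np.allclose(deltas[i], deltas[anchor], atol=trend_tolerance)
    ]
    # Average the consistent (presumably stationary) targets per frame.
    return np.mean(np.stack(consistent), axis=0)
```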
Step 202, screening out data when the unmanned vehicle runs straight from the vehicle speed data as reference data, and screening out data in the same time period as the reference data from the radar data as data to be synchronized.
The vehicle-mounted radar can monitor information such as the distances and angles of objects other than the unmanned vehicles 101, 102, 103, which is important for determining the driving states of the unmanned vehicles 101, 102, 103 and for determining their control commands. In practice, the amount of data acquired by the vehicle-mounted radar, and the storage space it occupies, are large, so these data cannot be transmitted to the data processing units of the unmanned vehicles 101, 102, 103 in real time. A certain time elapses before the data transmitted from the onboard radar reaches the data processing unit, and the data processing unit then stamps the data with the current system time of the unmanned vehicles 101, 102, 103, so the time indicated by the time stamp does not correspond to the actual time at which the data was acquired. This in turn affects the subsequent determination of the driving states of the unmanned vehicles 101, 102, 103 from these data and the determination of their control commands.
In practice, when the position of the vehicle-mounted radar on the unmanned vehicles 101, 102, 103 is fixed and the line between the vehicle-mounted radar and the data processing unit of the unmanned vehicles 101, 102, 103 is fixed, the time taken for the data to be transmitted from the vehicle-mounted radar to the data processing unit (the delay time) is also fixed. Once this fixed delay time is calculated, the data processing unit can use it, after acquiring the data monitored by the vehicle-mounted radar, to determine the actual acquisition time of that data.
The unmanned vehicles 101, 102, 103 may encounter a variety of road conditions during actual travel. To make it easier to match the data monitored by the integrated navigation system with the data monitored by the vehicle-mounted radar, the data collected while the unmanned vehicles 101, 102, 103 are in a straight-line driving state are screened out from the vehicle speed data monitored by the integrated navigation system and serve as reference data. In the straight-line driving state, the angle corresponding to the real-time driving angle information in the vehicle speed data is unchanged; correspondingly, in the same straight-line driving state, the angle corresponding to the relative angle information in the radar data is also unchanged, so the radar data corresponding to the straight-line driving state are taken as the data to be synchronized.
In some optional implementations of the embodiment, the step of screening the data when the unmanned vehicle travels straight as the reference data from the vehicle speed data may include the following steps:
and step one, calculating the angle change rate according to the real-time running angle information.
The data monitored by the integrated navigation system include the real-time vehicle speed information and real-time driving angle information of the unmanned vehicles 101, 102, 103, and the first time information corresponding to them. When the vehicle travels in a straight line, the angle corresponding to the real-time driving angle information is unchanged. Therefore, the change rate of the angle corresponding to the real-time driving angle information can be calculated, and whether the travel route of the unmanned vehicles 101, 102, 103 is a straight line or a curve can be determined from this angle change rate.
Step two: take the real-time vehicle speed information and the first time information corresponding to an angle change rate of 0 as the reference real-time vehicle speed information and the reference first time information.
When the unmanned vehicles 101, 102, 103 travel straight, the angle corresponding to the real-time travel angle information does not change, and the corresponding angle change rate is 0. Therefore, the real-time vehicle speed information and the first time information corresponding to an angle change rate of 0 are selected as the reference real-time vehicle speed information and the reference first time information; these are the data collected while the unmanned vehicles 101, 102, 103 travel straight. Preferably, the real-time vehicle speed information and the first time information covering the longest continuous period in which the angle change rate is 0 may be selected as the reference real-time vehicle speed information and the reference first time information.
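As a concrete illustration of the two steps above, here is a minimal sketch that computes the angle change rate and keeps the longest continuous straight-line stretch; the function name, the array-based interface, and the small tolerance standing in for "angle change rate is 0" are assumptions made for the example.

```python
import numpy as np

def straight_line_reference(speeds, angles, times, rate_threshold=0.01):
    """Return (reference speeds, reference first time information) for the
    longest stretch of straight-line driving, i.e. where the angle change
    rate is approximately 0.

    speeds, angles, times: equal-length 1-D arrays from the navigation system.
    """
    # Angle change rate between consecutive samples (degrees per second).
    rate = np.abs(np.diff(angles) / np.diff(times))
    straight = rate < rate_threshold          # one boolean per interval

    # Find the longest run of consecutive straight intervals.
    best_start, best_len, start, length = 0, 0, 0, 0
    for i, flag in enumerate(straight):
        if flag:
            if length == 0:
                start = i
            length += 1
            if length > best_len:
                best_start, best_len = start, length
        else:
            length = 0

    sel = slice(best_start, best_start + best_len + 1)  # intervals -> samples
    return speeds[sel], times[sel]
```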
In some optional implementations of this embodiment, the screening out data in the same time period as the reference data from the radar data as the data to be synchronized may include the following steps:
the method includes the first step of selecting second time information to be synchronized, which is the same as a time period corresponding to the reference first time information, from the radar data.
The delay between the data leaving the onboard radar and reaching the data processing unit is typically not very long, and the time during which the unmanned vehicles 101, 102, 103 travel straight is typically much longer than this delay. After the reference first time information is determined, the to-be-synchronized second time information can be determined according to the time period corresponding to the reference first time information (e.g., 9:00:00 to 9:01:20). Since the radar data receives its time stamp only after reaching the data processing unit, the time corresponding to the reference first time information does not completely coincide with the time corresponding to the to-be-synchronized second time information; however, the overlap between the two is sufficient to represent the straight-line traveling state of the unmanned vehicles 101, 102, 103.
Step two: determine the to-be-synchronized relative speed information corresponding to the to-be-synchronized second time information.
The radar data includes the relative speed information and relative angle information, measured by the vehicle-mounted radar, of objects other than the unmanned vehicle relative to the unmanned vehicle, and the second time information corresponding to the relative speed information and the relative angle information; that is, the relative speed information, the relative angle information, and the second time information correspond to one another. Once the to-be-synchronized second time information is determined, the to-be-synchronized relative speed information corresponding to it can be found.
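The selection of the data to be synchronized can be sketched as a simple time-window filter on the radar stream; the names and the optional margin parameter below are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def radar_data_to_synchronize(radar_times, radar_rel_speeds, ref_times, margin=0.0):
    """Select radar samples whose (unsynchronized) time stamps fall in the
    same period as the reference first time information.

    radar_times, radar_rel_speeds: second time information and the
    corresponding relative speed information from the onboard radar.
    ref_times: reference first time information from the navigation system.
    margin: optional padding in seconds to tolerate the unknown delay.
    """
    lo, hi = ref_times.min() - margin, ref_times.max() + margin
    mask = (radar_times >= lo) & (radar_times <= hi)
    return radar_times[mask], radar_rel_speeds[mask]
```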
And step 203, matching the same data in the reference data and the data to be synchronized, and determining the time for synchronizing the radar data according to the first synchronization time information and the second synchronization time information corresponding to the same data.
Both the combined navigation system and the onboard radar can measure accurate data for the unmanned vehicles 101, 102, 103, apart from the delay time between the two. Therefore, the same data can be found in the reference data and in the data to be synchronized, corresponding respectively to the first synchronization time information and the second synchronization time information. The time for synchronizing the radar data may then be determined based on the first synchronization time information and the second synchronization time information.
In some optional implementations of this embodiment, the matching the same data in the reference data and the data to be synchronized may include the following steps:
and in the first step, respectively drawing a first speed waveform of the reference real-time vehicle speed information and a second speed waveform of the relative speed information to be synchronized based on the reference first time information and the second time information to be synchronized.
The waveform can visually reflect the characteristics of data change. Therefore, the first speed waveform of the reference real-time vehicle speed information and the second speed waveform of the relative speed information to be synchronized can be respectively drawn based on the reference first time information and the second time information to be synchronized. In the first velocity waveform and the second velocity waveform, the abscissa may be time and the ordinate may be velocity.
Step two: set a plurality of mark points for the first speed waveform, and determine the waveform data corresponding to the mark points as the first identical data.
In practice, the travel speed of the unmanned vehicles 101, 102, 103 is usually not constant. Correspondingly, the speed waveform is not a straight line but a varying curve. Such a curve has its own characteristics, so a speed waveform can be distinguished from other speed waveforms by setting a plurality of mark points on it. In this implementation, a plurality of mark points are set on the first speed waveform, and the waveform data corresponding to these mark points are determined as the first identical data; that is, the first identical data are the speed data of the first speed waveform at the mark points. The mark points are used to mark the waveform features of the first speed waveform.
Step three: match the data corresponding to the plurality of mark points from the second speed waveform as the second identical data.
As described above, the combined navigation system and the onboard radar can both measure accurate data for the unmanned vehicles 101, 102, 103, apart from the time delay between the two. Therefore, the second identical data corresponding to the first identical data at the plurality of mark points of the first speed waveform can also be matched in the second speed waveform. One matching approach is as follows: determine the distances and angles between the mark points; connect the mark points in sequence to obtain a broken line; and search the second speed waveform for the data corresponding to this broken line to obtain the second identical data, i.e., the matching speed data of the second speed waveform. Other matching approaches may also be used and are not described in detail herein.
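One possible way to realize the mark-point matching described above is to place mark points on the first speed waveform and scan candidate time shifts until the second waveform reproduces the mark-point speeds. This is only a hedged sketch: the choice of evenly spaced mark points, the ±2 s search range, the 0.01 s step, and the least-squares criterion are assumptions, not necessarily the matching mode intended by the applicant.

```python
import numpy as np

def match_mark_points(ref_t, ref_v, sync_t, sync_v, n_marks=8):
    """Estimate where the mark points of the first speed waveform reappear
    in the second speed waveform by scanning candidate time shifts.

    Returns (first_sync_times, second_sync_times): the mark-point times in
    the reference waveform and the matched times in the to-be-synchronized
    waveform.
    """
    # Mark points: evenly spaced samples of the reference waveform
    # (local extrema would serve equally well as waveform features).
    idx = np.linspace(0, len(ref_t) - 1, n_marks).astype(int)
    mark_t, mark_v = ref_t[idx], ref_v[idx]

    # Try candidate shifts and keep the one whose interpolated speeds
    # best reproduce the mark-point speeds (least squared error).
    shifts = np.arange(-2.0, 2.0, 0.01)      # seconds, assumed search range
    errors = [
        np.sum((np.interp(mark_t + s, sync_t, sync_v) - mark_v) ** 2)
        for s in shifts
    ]
    best_shift = shifts[int(np.argmin(errors))]
    return mark_t, mark_t + best_shift
```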
In some optional implementation manners of this embodiment, the determining, according to the first synchronization time information and the second synchronization time information corresponding to the same data, a time for synchronizing the radar data may include:
the method comprises the steps of firstly, respectively determining first synchronous time information and second synchronous time information corresponding to the first same data and the second same data.
After the first identical data are determined through the mark points, the first synchronization time information corresponding to the first identical data can be determined; that is, the first synchronization time information is the time information in the first time information that corresponds to the first identical data. Similarly, the second synchronization time information is the time information in the second time information that corresponds to the second identical data.
Step two: calculate the time difference between the time corresponding to the first synchronization time information and the time corresponding to the second synchronization time information, and set the time difference as the time for synchronizing the radar data.
The first synchronization time information and the second synchronization time information both correspond to the system time of the unmanned vehicles 101, 102, 103, differing only by a time difference, which equals the delay caused by the transmission route and the like, i.e., the time for synchronizing the radar data.
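In code, the time difference amounts to subtracting the matched first synchronization times from the matched second synchronization times; averaging over several mark points (an assumption made here to smooth noise) gives a single offset to use as the time for synchronizing the radar data.

```python
import numpy as np

def synchronization_offset(first_sync_times, second_sync_times):
    """Average difference between matched second (radar) and first
    (navigation) synchronization times; a positive result means the radar
    time stamps lag by that amount."""
    first = np.asarray(first_sync_times, dtype=float)
    second = np.asarray(second_sync_times, dtype=float)
    return float(np.mean(second - first))
```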
As can be seen from the above description, the reference first time information when the angle change rate corresponding to the real-time driving angle information is 0 is 3 to 11 seconds, and the second time information to be synchronized is also selected from 3 to 11 seconds. And respectively drawing a first speed waveform (figure 5) and a second speed waveform (figure 6) through reference first time information, reference real-time vehicle speed information corresponding to the reference first time information, second time information to be synchronized and relative speed information to be synchronized corresponding to the second time information to be synchronized.
Since fig. 5 shows the real-time vehicle speed information acquired by the integrated navigation system, the time difference in fig. 6 can be calculated using fig. 5 as the standard. A plurality of mark points (the black points in fig. 5) are set in fig. 5, and after the mark points in fig. 5 are matched with the waveform in fig. 6, the time difference can be determined from the first synchronization time and the second synchronization time corresponding to the mark points in fig. 5 and fig. 6. The time difference between the same mark point in fig. 5 and fig. 6 is 0.4 seconds (8.80 - 8.40 = 0.4). Therefore, the times corresponding to all of the vehicle-mounted radar data acquired by the unmanned vehicles 101, 102, 103 can be advanced by 0.4 seconds, which gives the times at which the data were actually generated.
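Reproducing the worked example above, a short sketch of applying the 0.4 second offset to radar time stamps might look as follows; the sample time stamp array is hypothetical and only illustrates advancing every radar time by the computed delay.

```python
import numpy as np

# Worked example from the description: the same mark point is stamped at
# 8.40 s in the navigation waveform (fig. 5) and at 8.80 s in the radar
# waveform (fig. 6), so the radar data is 8.80 - 8.40 = 0.4 s late.
delay = 8.80 - 8.40

# Hypothetical radar time stamps as assigned by the data processing unit.
radar_timestamps = np.array([8.80, 8.85, 8.90, 8.95])

# Advance every radar time stamp by the delay to recover the times at
# which the data were actually generated.
true_timestamps = radar_timestamps - delay
print(true_timestamps)  # approximately [8.4, 8.45, 8.5, 8.55]
```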
With continued reference to fig. 7, fig. 7 is a schematic diagram of an application scenario of the method for synchronizing time according to the present embodiment. In the application scenario of fig. 7, the in-vehicle radar 701 may monitor radar data such as the distance, angle, and speed of moving vehicles on the road and of stationary trees on the roadside. The integrated navigation system 702 can monitor vehicle speed data of the unmanned vehicle 703, such as its speed and angle, in real time. After the unmanned vehicle 703 acquires the radar data monitored by the vehicle-mounted radar 701, the system time of the unmanned vehicle 703 is assigned as the time of the radar data, whereas the vehicle speed data is real-time. In order to synchronize the radar data, the data collected while the unmanned vehicle travels straight are screened out from the vehicle speed data as reference data, and the data in the same time period as the reference data are screened out from the radar data as data to be synchronized. The same data in the reference data and the data to be synchronized are matched, and the time for synchronizing the radar data is determined according to the first synchronization time information and the second synchronization time information corresponding to the same data, thereby obtaining the time at which the radar data were truly generated.
The method provided by the embodiment of the application determines the reference data during straight-line driving from the vehicle speed data monitored by the integrated navigation system, finds the data to be synchronized corresponding to the reference data from the radar data monitored by the vehicle-mounted radar, and determines the time for synchronizing the radar data by determining the same data in the reference data and the data to be synchronized, so that the accuracy of the radar data is improved.
With further reference to fig. 8, as an implementation of the methods shown in the above-mentioned figures, the present application provides an embodiment of an apparatus for synchronizing time, which corresponds to the method embodiment shown in fig. 2, and which is particularly applicable to various electronic devices.
As shown in fig. 8, the apparatus 800 for synchronizing time of the present embodiment may include: a data acquisition unit 801, a data extraction unit 802, and a synchronization unit 803. The data acquisition unit 801 is configured to acquire vehicle speed data and radar data, where the vehicle speed data includes real-time vehicle speed information and real-time driving angle information of the unmanned vehicle measured by a combined navigation system, and first time information corresponding to the real-time vehicle speed information and the real-time driving angle information, and the radar data includes relative speed information and relative angle information, measured by a vehicle-mounted radar, of objects other than the unmanned vehicle relative to the unmanned vehicle, and second time information corresponding to the relative speed information and the relative angle information; the data extraction unit 802 is configured to screen out, from the vehicle speed data, the data collected while the unmanned vehicle travels straight as reference data, and to screen out, from the radar data, the data in the same time period as the reference data as data to be synchronized; the synchronization unit 803 is configured to match the same data in the reference data and the data to be synchronized, and to determine the time for synchronizing the radar data according to the first synchronization time information and the second synchronization time information corresponding to the same data.
In some optional implementations of the present embodiment, the data extraction unit 802 may include: an angle change rate calculation subunit (not shown in the figure) and a reference data determination subunit (not shown in the figure). The angle change rate calculation subunit is used for calculating the angle change rate according to the real-time driving angle information; and the reference data determining subunit is used for taking the real-time vehicle speed information and the first time information corresponding to an angle change rate of 0 as the reference real-time vehicle speed information and the reference first time information.
In some optional implementations of the present embodiment, the data extraction unit 802 may include: a to-be-synchronized second time information selecting subunit (not shown in the figure) and a to-be-synchronized relative speed information determining subunit (not shown in the figure). The to-be-synchronized second time information selecting subunit is used for selecting, from the radar data, the to-be-synchronized second time information in the same time period as the time period corresponding to the reference first time information; and the to-be-synchronized relative speed information determining subunit is used for determining the to-be-synchronized relative speed information corresponding to the to-be-synchronized second time information.
In some optional implementations of this embodiment, the synchronization unit 803 may include: a speed waveform drawing subunit (not shown), a first identical data obtaining subunit (not shown), and a second identical data obtaining subunit (not shown). The speed waveform drawing subunit is configured to draw a first speed waveform of the reference real-time vehicle speed information and a second speed waveform of the to-be-synchronized relative speed information, respectively, based on the reference first time information and the to-be-synchronized second time information; the first identical data acquisition subunit is configured to set a plurality of mark points for the first speed waveform, and determine the waveform data corresponding to the mark points as the first identical data, where the mark points are used to mark the waveform features of the first speed waveform; and the second identical data acquisition subunit is used for matching the data corresponding to the plurality of mark points from the second speed waveform as the second identical data.
In some optional implementations of this embodiment, the synchronization unit 803 may include: a synchronization time information determining subunit (not shown in the figure) and a synchronization time setting subunit (not shown in the figure). The synchronization time information determining subunit is configured to determine the first synchronization time information and the second synchronization time information corresponding to the first identical data and the second identical data, respectively; the synchronization time setting subunit is configured to calculate the time difference between the time corresponding to the first synchronization time information and the time corresponding to the second synchronization time information, and set the time difference as the time for synchronizing the radar data.
The present embodiment further provides a server, including: one or more processors; a memory for storing one or more programs; the integrated navigation system is used for detecting real-time vehicle speed information and real-time driving angle information of the unmanned vehicle in real time and first time information corresponding to the real-time vehicle speed information and the real-time driving angle information; the vehicle-mounted radar is used for monitoring relative speed information and relative angle information of objects except the unmanned vehicle relative to the unmanned vehicle, and second time information corresponding to the relative speed information and the relative angle information; the one or more programs, when executed by the one or more processors, cause the one or more processors to perform the method for synchronizing time described above.
The present embodiment also provides a computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the above-mentioned method for synchronizing time.
Referring now to FIG. 9, shown is a block diagram of a computer system 900 suitable for use in implementing a server according to embodiments of the present application. The server shown in fig. 9 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 9, the computer system 900 includes a Central Processing Unit (CPU) 901 that can perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM) 902 or a program loaded from a storage section 908 into a Random Access Memory (RAM) 903. In the RAM 903, various programs and data necessary for the operation of the system 900 are also stored. The CPU 901, ROM 902, and RAM 903 are connected to each other via a bus 904. An input/output (I/O) interface 905 is also connected to the bus 904.
The following components are connected to the I/O interface 905: an input portion 906 including a keyboard, a mouse, and the like; an output section 907 including components such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and the like, and a speaker; a storage portion 908 including a hard disk and the like; and a communication section 909 including a network interface card such as a LAN card, a modem, or the like. The communication section 909 performs communication processing via a network such as the Internet. The drive 910 is also connected to the I/O interface 905 as necessary. A removable medium 911 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 910 as necessary, so that a computer program read out therefrom is installed into the storage section 908 as necessary.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 909, and/or installed from the removable medium 911. The above-described functions defined in the method of the present application are executed when the computer program is executed by the Central Processing Unit (CPU) 901.
It should be noted that the computer readable medium mentioned above in the present application may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In this application, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or hardware. The described units may also be provided in a processor, and may be described as: a processor includes a data acquisition unit, a data extraction unit, and a synchronization unit. Where the names of these units do not in some cases constitute a limitation of the unit itself, for example, a synchronization unit may also be described as a "unit for synchronizing the time of radar data".
As another aspect, the present application also provides a computer-readable medium, which may be contained in the apparatus described in the above embodiments, or may exist separately without being assembled into the apparatus. The computer readable medium carries one or more programs which, when executed by the apparatus, cause the apparatus to: acquire vehicle speed data and radar data, where the vehicle speed data includes real-time vehicle speed information and real-time driving angle information of the unmanned vehicle measured by a combined navigation system, and first time information corresponding to the real-time vehicle speed information and the real-time driving angle information, and the radar data includes relative speed information and relative angle information, measured by a vehicle-mounted radar, of objects other than the unmanned vehicle relative to the unmanned vehicle, and second time information corresponding to the relative speed information and the relative angle information; screen out, from the vehicle speed data, the data collected while the unmanned vehicle travels straight as reference data, and screen out, from the radar data, the data in the same time period as the reference data as data to be synchronized; and match the same data in the reference data and the data to be synchronized, and determine the time for synchronizing the radar data according to the first synchronization time information and the second synchronization time information corresponding to the same data.
The above description is only a preferred embodiment of the application and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention herein disclosed is not limited to the particular combination of features described above, but also encompasses other arrangements formed by any combination of the above features or their equivalents without departing from the spirit of the invention. For example, the above features may be replaced with (but not limited to) features having similar functions disclosed in the present application.

Claims (14)

CN201710497226.0A | 2017-06-26 | 2017-06-26 | Method, device and server for synchronizing time | Active | CN109116357B (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN201710497226.0A (CN109116357B) | 2017-06-26 | 2017-06-26 | Method, device and server for synchronizing time

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN201710497226.0A (CN109116357B) | 2017-06-26 | 2017-06-26 | Method, device and server for synchronizing time

Publications (2)

Publication Number | Publication Date
CN109116357A (en) | 2019-01-01
CN109116357B (en) | 2020-12-11

Family

ID=64821756

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN201710497226.0A (Active, CN109116357B) | Method, device and server for synchronizing time | 2017-06-26 | 2017-06-26

Country Status (1)

Country | Link
CN (1) | CN109116357B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN114548058B (en)* | 2022-02-16 | 2024-11-08 | 东风商用车有限公司 | A method for intelligent driving data analysis and synchronization based on ROS
CN115941612B (en)* | 2022-11-14 | 2024-05-17 | 中国联合网络通信集团有限公司 | Data processing method, device and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN103797735A (en)* | 2011-09-12 | 2014-05-14 | 大陆-特韦斯贸易合伙股份公司及两合公司 | Method and device for synchronizing network subscribers in an on-board network of a vehicle
CN104569965A (en)* | 2014-12-24 | 2015-04-29 | 西安电子工程研究所 | Method for synchronizing time and frequency of motor-driven configured bistatic radar
CN104584101A (en)* | 2012-08-16 | 2015-04-29 | 大陆-特韦斯贸易合伙股份公司及两合公司 | Method and system for compensating for time offsets
CN106101256A (en)* | 2016-07-07 | 2016-11-09 | 百度在线网络技术(北京)有限公司 | Method and apparatus for synchrodata

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
DE10210711A1 (en)* | 2002-03-12 | 2003-11-13 | Deutsche Telekom Ag | Method for time synchronization of at least two measuring computers cooperating with each other via a telecommunication network, such as internet, intranet or the like


Also Published As

Publication number | Publication date
CN109116357A (en) | 2019-01-01

Similar Documents

Publication | Title
CN109141464B (en) | Navigation lane change prompting method and device
CN107063713B (en) | Test method and device applied to unmanned automobile
CN113127583B (en) | Data transmission method and device
CN110654381B (en) | Method and apparatus for controlling a vehicle
CN110019570B (en) | Map construction method and device and terminal equipment
CN109709966B (en) | Control method and device for unmanned vehicle
CN110696826B (en) | Method and device for controlling a vehicle
CN111401255B (en) | Method and device for identifying bifurcation junctions
CN110163153B (en) | Method and device for recognizing traffic sign board boundary
US11999371B2 | Driving assistance processing method and apparatus, computer-readable medium, and electronic device
CN112590813A (en) | Method, apparatus, electronic device, and medium for generating information of autonomous vehicle
US10018732B2 | Information processing apparatus and information processing system
CN114750759A (en) | Following target determination method, device, equipment and medium
JP2019203823A (en) | Travel plan generator, travel plan generation method, and control program
CN110619666B (en) | Method and device for calibrating camera
CN110717918A (en) | Pedestrian detection method and device
CN109116357B (en) | Method, device and server for synchronizing time
CN115235487B (en) | Data processing method, device, equipment and medium
CN112859109A (en) | Unmanned aerial vehicle panoramic image processing method and device and electronic equipment
CN114056337B (en) | Method, device and computer program product for predicting vehicle running behavior
CN107092253B (en) | Method and device for controlling unmanned vehicle and server
CN112305499A (en) | Method and device for positioning according to light source
CN111340880B (en) | Method and apparatus for generating predictive model
CN110588666B (en) | Method and device for controlling vehicle travel
US12067790B2 | Method and system for identifying object

Legal Events

Date | Code | Title | Description
PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant
