CN113189583B - Time-space synchronization millimeter wave radar and visual information fusion method - Google Patents


Info

Publication number: CN113189583B
Application number: CN202110455091.8A
Authority: CN (China)
Prior art keywords: track, wave radar, point, millimeter wave, coordinate
Legal status: Active (stated by Google as an assumption, not a legal conclusion)
Other versions: CN113189583A (Chinese, zh)
Inventors: 丁雅斌, 王晨迁, 李云飞
Original and current assignee: Tianjin University
Application filed by Tianjin University; priority to CN202110455091.8A; publication of CN113189583A; application granted; publication of CN113189583B


Abstract

The invention discloses a time-space synchronized millimeter-wave radar and visual information fusion method comprising three main steps. First, preliminary positioning of the same-track train ahead is completed by parsing the message data of a millimeter-wave radar sensor. Second, a vision sensor detects, by image processing, the travel track of the running train and the image position of the same-track train ahead. Third, the millimeter-wave radar information and the visual information are fused by a joint calibration method based on time-space synchronization to accurately identify and locate the same-track train ahead. The method overcomes shortcomings of single-sensor detection such as insufficient precision and low adaptability, realizes real-time monitoring of the distance between the running train and the same-track train ahead, and improves the operational safety of the train.

Description

A Time-Space Synchronized Millimeter-Wave Radar and Visual Information Fusion Method

Technical Field

The invention relates to the field of millimeter-wave radar and visual measurement, and in particular to a method for ranging rail targets based on the fusion of millimeter-wave radar and visual information.

Background

To improve operational safety in a rail transit system, the distance between a running train and the target ahead of it must be monitored in real time. Detection with a single sensor suffers from shortcomings such as insufficient precision and low adaptability. A millimeter-wave radar sensor can accurately output distance information for all targets ahead, but without intuitive image information it cannot accurately identify those targets; moreover, because millimeter-wave radar is highly sensitive to metal, clutter easily causes shifted or missed detections, seriously degrading the real-time performance and stability of target tracking. A camera can capture real-time images of the targets ahead, but the actual relative position between the camera and a target is difficult to obtain from images alone, so a camera by itself also cannot meet the requirement of real-time position detection of the target ahead.

Summary of the Invention

The purpose of the invention is to overcome the shortcomings of the prior art by providing a time-space synchronized millimeter-wave radar and vision sensor information fusion method that improves target detection accuracy and the real-time performance of ranging results.

A time-space synchronized millimeter-wave radar and vision sensor information fusion method of the invention comprises the following steps:

Step 1. Complete preliminary positioning of the measured target by parsing the message data of the millimeter-wave radar sensor, as follows:

First, the millimeter-wave radar sensor is mounted at the head of the running train. A three-dimensional Cartesian radar coordinate system is established with the geometric center of the radar's largest plane as the origin, the forward direction of the running train as the Yrw axis, the vertical upward direction as the Zrw axis, and the direction to the right of the running train as the Xrw axis; the pitch, yaw, and roll angles of the radar sensor in this coordinate system are all zero. The radar sensor is connected to a computer through a CAN bus and acquires the message data obtained by detecting all target trains ahead; the radar messages are then parsed using the computer's MFC functionality and the millimeter-wave radar communication protocol. The target trains ahead comprise the same-track train ahead and trains on adjacent tracks ahead. The message data include the lateral distance dx and the longitudinal distance dy between each target ahead and the radar coordinate origin;

Step 2. Use the camera, based on image processing, to detect the travel track of the running train and the position of the same-track train ahead, as follows:

First, the camera is mounted on the head of the running train directly below the millimeter-wave radar sensor. A three-dimensional Cartesian camera coordinate system Xcw-Ycw-Zcw is established with the camera's optical center as the origin; each axis is parallel to the corresponding axis of the radar coordinate system, the Zrw axis coincides with the Zcw axis, and the pitch, yaw, and roll angles of the camera in this coordinate system are all zero. The camera is connected to the millimeter-wave radar and to the computer by USB cables. The camera collects real-time images of the scene ahead of the running train; each collected image contains all target trains ahead and the travel track of the running train. An image coordinate system Xp-Yp is established with its origin at the intersection of the camera's optical axis and the image plane, with Xp and Yp along the length and width directions of the image, respectively;

Second, detect the travel-track lines of the running train in the scene image using the cumulative probability Hough transform, and preliminarily screen the travel-track lines based on line slope;

Third, apply DBSCAN probability-density clustering and queue-based correction to the multiple candidate lines (including the left and right rails) obtained in the second step, yielding corrected line positions for both rails. The left and right rail lines are denoted l_left and l_right, with slopes k_left and k_right; the intersection of l_left and l_right is denoted p0;

Fourth, using the corrected rail-line positions from the third step, traverse each rail line with a logarithm-based scheme that samples points densely near the same-track train ahead and sparsely far from it, obtaining the traversal-point coordinates p_left(x_left, y_left) and p_right(x_right, y_right) of both rail lines in the scene image;

Fifth, using the traversal-point coordinates p_left(x_left, y_left) and p_right(x_right, y_right) from the fourth step, identify the same-track train ahead from the gradient of gray values along the traversal points: a position with an abrupt gray-value change along the left and right rail traversal points is determined to be the position of the same-track train ahead;

Step 3. Fuse the millimeter-wave radar information and the visual information by a joint calibration method based on time-space synchronization to accurately identify and range the same-track train ahead, as follows:

First, synchronize the millimeter-wave radar and the camera with multiple threads;

During data collection, a three-thread scheme comprising a radar data receiving thread, a camera receiving thread, and a computer data processing thread realizes multi-threaded time synchronization of the millimeter-wave radar and visual information;

Second, use the translation and rotation relationships among the radar coordinate system, the camera coordinate system, and the image coordinate system to map any radar point from the radar coordinate system into the image coordinate system; then convert the image position of the bottom-center point p(x_bottom, y_bottom) of the same-track train ahead, obtained in the sixth step of Step 2, into coordinates in the radar coordinate system; finally, compute the relative distance d_w of p(x_bottom, y_bottom) in the radar coordinate system;
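The radar-to-image mapping described above can be sketched as a pure translation followed by a pinhole projection. This is a minimal sketch: the 5 cm vertical offset between radar and camera appears in the detailed description, while the focal lengths and principal point below are illustrative assumptions, not values from the patent.

```python
# Sketch of mapping a radar-frame point into the image coordinate system.
# Axes follow the patent's convention: X right, Y forward, Z up, camera
# mounted directly below the radar. FX/FY/CX/CY are assumed intrinsics.

T_RC = (0.0, 0.0, -0.05)   # radar origin -> camera origin (camera 5 cm lower)
FX, FY = 800.0, 800.0      # assumed focal lengths in pixels
CX, CY = 320.0, 240.0      # assumed principal point (image center)

def radar_to_image(p_radar):
    """Project a radar-frame point (Xrw, Yrw, Zrw) to image pixels (Xp, Yp)."""
    x = p_radar[0] - T_RC[0]
    y = p_radar[1] - T_RC[1]   # forward axis is the optical axis (depth)
    z = p_radar[2] - T_RC[2]
    u = CX + FX * x / y
    v = CY - FY * z / y        # image Yp grows downward
    return u, v

# A target 1.5 m to the left of and 40 m ahead of the radar, 1.2 m below it.
u, v = radar_to_image((-1.5, 40.0, -1.2))
print(round(u, 1), round(v, 1))
```

With these assumed intrinsics the target projects left of and below the image center, as expected for a leftward, below-axis point.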

Third, first convert the lateral distance dx and longitudinal distance dy of every target obtained by the radar in Step 1 into the camera's image coordinate system Xp-Yp, displaying them in the scene image as radar point coordinates p_i(x_p, y_p); then convert the lateral distance dx and longitudinal distance dy between each target ahead and the radar coordinate origin into the relative distance sqrt(dx² + dy²) in the radar coordinate system; finally, screen the radar point coordinates p_i(x_p, y_p) by fusing the spatial distance information of the millimeter-wave radar sensor and the camera, obtaining the radar point position of the same-track train ahead.

The invention has the following beneficial effects:

1. The invention realizes rail-target ranging based on the fusion of millimeter-wave radar and visual information, overcomes the deficiencies of single-sensor ranging, and improves target detection accuracy;

2. The algorithms used in the invention run at a high computational rate, so they meet the rate requirements of the ranging process and improve the real-time performance of the ranging results.

Brief Description of the Drawings

Figure 1 is a flow chart of rail-target ranging based on millimeter-wave radar and visual information fusion;

Figure 2 is a schematic diagram of probability-density clustering results;

Figure 3 is a schematic diagram of traversal of the left and right rail-line points;

Figure 4 is a schematic diagram of the joint calibration of the millimeter-wave radar and the camera;

Figure 5 is a schematic diagram of radar point screening by fusion of millimeter-wave radar and visual spatial distance information.

Detailed Description

The invention is described in further detail below with reference to the accompanying drawings and embodiments.

As shown in the drawings, a time-space synchronized millimeter-wave radar and visual information fusion method of the invention comprises the following steps:

Step 1. Complete preliminary positioning of the measured target by parsing the message data of the millimeter-wave radar sensor, as follows:

First, the millimeter-wave radar sensor is mounted at the head of the running train. A three-dimensional Cartesian radar coordinate system is established with the geometric center of the radar's largest plane as the origin, the forward direction of the running train as the Yrw axis, the vertical upward direction as the Zrw axis, and the direction to the right of the running train as the Xrw axis. The pitch, yaw, and roll angles of the radar sensor in this coordinate system are all zero, as shown at 8 in Figure 4. The radar sensor is connected to a computer through a CAN bus and acquires the message data obtained by detecting all target trains ahead. The radar messages are then parsed using the computer's MFC functionality and the millimeter-wave radar communication protocol (see Wang Shengliang. ARS408_ARS404_SRR308 Communication Protocol [M]. Technical_Documentation, 2019-10-01). The target trains ahead comprise the same-track train ahead (a stationary same-track train, or a same-track train moving toward the running train) and adjacent-track trains ahead (stationary trains, or adjacent-track trains moving toward the running train). The message data include the lateral distance dx and the longitudinal distance dy between each target ahead and the radar coordinate origin;
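The message-parsing step above can be sketched as follows. This is a minimal sketch: the 8-byte field layout and the 1 cm-per-LSB scaling are illustrative placeholders, not the actual ARS408/ARS404/SRR308 protocol, whose message layout must be taken from the cited communication-protocol document.

```python
import struct

# Sketch of Step 1: unpack one radar target message into the lateral
# distance dx and longitudinal distance dy used by the method.
# Field layout and scaling are HYPOTHETICAL, for illustration only.

def parse_target_message(payload: bytes):
    """Decode one hypothetical 8-byte CAN payload into (dx, dy) in meters."""
    raw_dx, raw_dy = struct.unpack(">hh", payload[:4])  # two signed 16-bit fields
    dx = raw_dx * 0.01   # assumed resolution: 1 cm per LSB
    dy = raw_dy * 0.01
    return dx, dy

# A target 1.5 m to the left and 42.5 m ahead of the radar origin.
payload = struct.pack(">hhi", -150, 4250, 0)
print(parse_target_message(payload))
```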

Step 2. Use the camera, based on image processing, to detect the travel track of the running train and the position of the same-track train ahead, as follows:

First, the camera is mounted on the head of the running train directly below the millimeter-wave radar sensor, typically 5 cm below it. A three-dimensional Cartesian camera coordinate system Xcw-Ycw-Zcw is established with the camera's optical center as the origin; each axis is parallel to the corresponding axis of the radar coordinate system, and the Zrw axis coincides with the Zcw axis. The pitch, yaw, and roll angles of the camera in this coordinate system are all zero, as shown at 8 in Figure 4. The camera is connected to the millimeter-wave radar and to the computer by USB cables. The camera collects real-time images of the scene ahead of the running train; each collected image contains all target trains ahead and the travel track of the running train. An image coordinate system Xp-Yp is established, as shown at 9 in Figure 4, with its origin at the intersection of the camera's optical axis and the image plane, and Xp and Yp along the length and width directions of the image, respectively.

Second, detect the travel-track lines of the running train in the scene image using the cumulative probability Hough transform (see Qiu Dong, Weng Meng, Yang Hongtao. Fast lane detection method based on improved probabilistic Hough transform. Computer Technology and Development [J]. 2020, 30(05)), and preliminarily screen the travel-track lines based on line slope.

The preliminary slope-based screening proceeds as follows: based on the position of the travel track in the scene image (including slope and starting-point position) and the camera's mounting position, a slope threshold is chosen. The threshold must include the slopes of both the left and right rails while excluding, as far as possible, unnecessary lines such as near-horizontal ones. The result is a set of lines expressed in point-slope form and including the left and right rails, completing the preliminary screening of the travel-track lines detected by the cumulative probability Hough transform.
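A minimal sketch of this slope-based preliminary screening, assuming segments in the (x1, y1, x2, y2) form typically returned by a probabilistic Hough transform; the threshold values k_min and k_max are illustrative, not those of the patent.

```python
# Keep only segments whose absolute slope lies in [k_min, k_max]:
# rails seen from a forward-facing camera are steep in the image, while
# clutter such as sleepers tends to produce near-horizontal lines.

def screen_by_slope(segments, k_min=0.3, k_max=10.0):
    """Return the segments whose absolute slope is within [k_min, k_max]."""
    kept = []
    for x1, y1, x2, y2 in segments:
        if x1 == x2:            # vertical segment: slope undefined, skip here
            continue
        k = (y2 - y1) / (x2 - x1)
        if k_min <= abs(k) <= k_max:
            kept.append((x1, y1, x2, y2))
    return kept

candidates = [
    (100, 400, 220, 100),   # left-rail candidate, slope -2.5
    (540, 400, 420, 100),   # right-rail candidate, slope 2.5
    (0, 300, 640, 310),     # near-horizontal clutter, slope ~0.016
]
print(screen_by_slope(candidates))
```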

Third, apply DBSCAN probability-density clustering and queue-based correction to the multiple candidate lines (including the left and right rails) obtained in the second step, yielding corrected line positions for both rails. The left and right rail lines are denoted l_left and l_right, with slopes k_left and k_right, and the intersection of l_left and l_right is denoted p0. The procedure is as follows:

Step 101: take the candidate lines obtained from the preliminary screening and precisely identify the left and right rails with the DBSCAN probability-density clustering algorithm (see Abdellah Idrissi, Altaf Alaoui. A Multi-Criteria Decision Method in the DBSCAN Algorithm for Better Clustering [J]. International Journal of Advanced Computer Science and Applications, 2016), obtaining preliminary line positions for both rails in point-slope form with slopes k_j, where j = 1, 2, ..., n (n < 5) indexes the lines initially obtained by the DBSCAN clustering. Because detection runs in real time, the rail-line positions deviate in some periods, so Step 102 is executed;

Step 102: assign each rail's preliminary line positions a queue of empirical length Length = 5, ensuring that all lines produced by each DBSCAN pass enter the queue;

Step 103: push the slopes k_j of the preliminary rail-line positions computed by the DBSCAN probability-density clustering into the queue in turn, and take the queue average

k_mean = (k_1 + k_2 + ... + k_Length) / Length

as the comparison value. Then, based on the pixel width between the two rails in the scene image, set an empirical distance threshold Distance = 5 (in pixels) and apply the following criterion to each preliminary rail-line slope: if |k_j − k_mean| ≤ Distance, k_j remains in the queue; otherwise it is discarded. The average k_mean of the final updated queue is the rail-line slope; this completes the slope correction for both rails and yields the corrected rail-line positions in point-slope form. The left and right rail lines are denoted l_left and l_right, with slopes k_left and k_right, and the intersection of l_left and l_right is denoted p0, shown as black dot 5 in Figure 3.

Fourth, using the corrected rail-line positions from the third step, traverse each rail line with a logarithm-based scheme that samples points densely near the same-track train ahead and sparsely far from it, obtaining the traversal-point coordinates p_left(x_left, y_left) and p_right(x_right, y_right) of both rail lines in the scene image, as follows:

Step 101: take the intersection p0 of the rail lines l_left and l_right as the initial traversal point;

Step 102: traverse both rail lines in the scene image along the Yp-axis direction of Figure 4, taking the same traversal distance for both rails:

L = Width − y_p0

where y_p0 is the Yp-axis coordinate of the intersection p0 and Width is the width of the scene image;

Step 103: obtain the Yp-axis coordinates of the rail-line traversal points:

y_left = y_right = y_p0 + i·Δy,  i = 1, 2, ..., n    (1)

In formula (1), y_left and y_right are the Yp-axis coordinates of the traversal points on the left and right rail lines, y_p0 is the Yp-axis coordinate of the initial traversal point p0 of the rail lines l_left and l_right, and k_left and k_right are the slopes of the left and right rails. Both rail lines use the same traversal interval Δy = log_a L, the number of traversal points is n = L / log_a L, and L is the traversal distance;

Step 104: following Steps 101-103, obtain the Xp-axis coordinates of the traversal points on both rails, and thus the full set of logarithm-based traversal points p_left(x_left, y_left) and p_right(x_right, y_right) of both rails in the scene image. The result is shown in Figure 3, where 6 and 7 denote the traversal points of the left and right rail lines.

x_left = x_p0 + (y_left − y_p0) / k_left
x_right = x_p0 + (y_right − y_p0) / k_right    (2)

In formula (2), x_left and x_right are the Xp-axis coordinates of the traversal points on the left and right rails, y_left and y_right are their Yp-axis coordinates, y_p0 and x_p0 are the Yp-axis and Xp-axis coordinates of the initial traversal point p0 of the two rails, and k_left and k_right are the slopes of the rail lines l_left and l_right;

Fifth, using the traversal-point coordinates p_left(x_left, y_left) and p_right(x_right, y_right) from the fourth step, identify the same-track train ahead from the gray-value gradient along the rail traversal points. Because the rails have high gray values while the underside of the train ahead has low gray values, a position with an abrupt gray-value change along the left and right rail traversal points is determined to be the position of the same-track train ahead. The procedure is as follows:

Step 101: homogenize the gray values of the rail-line traversal points. To suppress jitter in the traversal-point gray values, the mean gray value of every four consecutive traversal points is taken as a new averaged traversal point, with coordinates p_mean_left(x_mean_left, y_mean_left) and p_mean_right(x_mean_right, y_mean_right);

Step 102: determine the position of the abrupt gray-value change. The coordinates p_mean_left(x_mean_left, y_mean_left) and p_mean_right(x_mean_right, y_mean_right) at that position are taken as the bottom-left and bottom-right points of the same-track train ahead; their arithmetic mean is then taken as the bottom-center point of the same-track train ahead:

x_bottom = (x_mean_left + x_mean_right) / 2
y_bottom = (y_mean_left + y_mean_right) / 2    (3)

In formula (3), x_bottom and y_bottom are the Xp-axis and Yp-axis coordinates of the bottom-center point of the same-track train ahead, x_mean_left and y_mean_left those of its bottom-left point, and x_mean_right and y_mean_right those of its bottom-right point. The result is the bottom point coordinate p(x_bottom, y_bottom) of the same-track train ahead, shown as black dot 12 in Figure 5;

Sixth step, select the middle position point at the bottom of the same-track train ahead with coordinates p(x_bottom, y_bottom) and correct its position based on Kalman filtering, obtaining smoothly transitioning bottom-center point information in each cycle. The specific implementation is as follows:

To handle position-point drift and cycles in which the same-track train ahead is not detected, first set an empirical Kalman-filter distance threshold d_threshold = 50 (see Alessio Gagliardi, Francesco de Gioia, Sergio Saponara. A real-time video smoke detection algorithm based on Kalman filter and CNN [J]. Journal of Real-Time Image Processing, 2021). When the Euclidean distance between the bottom-center point p(x_bottom, y_bottom)_i of cycle i and the bottom-center point p(x_bottom, y_bottom)_{i-1} of cycle i-1 exceeds d_threshold, the current cycle's bottom-center target point is discarded and replaced by that of the previous cycle. Meanwhile, all bottom-center target points within the distance threshold are Kalman-filtered to achieve a smooth transition of the point position;
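A minimal sketch of the gating-plus-smoothing rule above, assuming a per-axis constant-position Kalman model with illustrative noise parameters q and r (the patent does not specify the filter model):

```python
import math

D_THRESHOLD = 50.0  # empirical gating distance from the text

class BottomPointFilter:
    """Reject a measurement that jumps more than D_THRESHOLD pixels from the
    previous estimate (reusing the previous point), otherwise smooth it with
    a per-axis constant-position Kalman update."""
    def __init__(self, q=0.01, r=1.0):
        self.q, self.r = q, r        # process / measurement noise (assumed)
        self.state = None            # current (x, y) estimate
        self.var = [1.0, 1.0]        # per-axis estimate variance

    def update(self, point):
        if self.state is None:                 # first detection initialises state
            self.state = tuple(point)
            return self.state
        if math.dist(self.state, point) > D_THRESHOLD:
            return self.state                  # outlier: keep previous cycle's point
        smoothed = []
        for i, (xi, zi) in enumerate(zip(self.state, point)):
            p = self.var[i] + self.q           # predict
            k = p / (p + self.r)               # Kalman gain
            smoothed.append(xi + k * (zi - xi))
            self.var[i] = (1 - k) * p
        self.state = tuple(smoothed)
        return self.state
```

With these assumed noise values the filter pulls each accepted measurement toward the running estimate, while any jump larger than 50 pixels is replaced by the previous cycle's point, matching the replacement rule in the text.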

Step three, fuse the millimeter wave radar information and the visual information through a joint calibration method based on time-space synchronization to complete accurate identification and ranging of the same-track train ahead, specifically comprising the following steps:

First step, multithread synchronization of the millimeter wave radar and the camera;

During data acquisition, so that the millimeter wave radar and the camera capture target data at the same instant, a three-thread fusion scheme comprising a millimeter wave radar data-receiving thread, a camera-receiving thread, and a computer data-processing thread is adopted to achieve multithreaded time synchronization of the millimeter wave radar and visual information (see Luo Bin, Fei Xianglin. Research and Application of Multithreading Technology [J]. Computer Research and Development, 2000, (04));
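The three-thread arrangement can be sketched with Python's standard threading and queue modules; the frame contents, rates, and queue sizes here are illustrative assumptions, not the patent's implementation:

```python
import queue
import threading
import time

radar_q = queue.Queue(maxsize=1)   # latest radar frame
cam_q = queue.Queue(maxsize=1)     # latest camera frame

def sensor_thread(q, tag, stop):
    """Radar or camera receiving thread: push timestamped frames."""
    while not stop.is_set():
        try:
            q.put((tag, time.time()), timeout=0.1)
        except queue.Full:
            pass                   # consumer is behind; drop this frame
        time.sleep(0.02)

def fusion_thread(pairs, stop):
    """Data-processing thread: block until one frame from each sensor is
    available, then treat the pair as time-synchronized."""
    while not stop.is_set():
        try:
            radar = radar_q.get(timeout=0.1)
            image = cam_q.get(timeout=0.1)
        except queue.Empty:
            continue
        pairs.append((radar, image))

stop = threading.Event()
pairs = []
threads = [
    threading.Thread(target=sensor_thread, args=(radar_q, "radar", stop)),
    threading.Thread(target=sensor_thread, args=(cam_q, "camera", stop)),
    threading.Thread(target=fusion_thread, args=(pairs, stop)),
]
for t in threads:
    t.start()
time.sleep(0.3)
stop.set()
for t in threads:
    t.join()
print(len(pairs) > 0)  # at least one synchronized radar/image pair was formed
```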

Second step, use the translation and rotation relationships among the millimeter wave radar coordinate system, the camera coordinate system, and the image coordinate system to obtain the position to which any radar point in the millimeter wave radar coordinate system is converted in the image coordinate system. The conversion relationship is as follows (see Luo Xiao, Yao Yuan, Zhang Jinhuan. A joint calibration method of millimeter-wave radar and camera [J]. Journal of Tsinghua University, 2014, Vol. 54, No. 3):

x_p = f_x · (x_rw + L_x) / (y_rw + L_y) + C_x,  y_p = f_y · H / (y_rw + L_y) + C_y    (4)

In formula (4), x_p, y_p are the X_p-axis and Y_p-axis coordinates in the image coordinate system of a point detected by the millimeter wave radar; x_rw, y_rw are the X_rw-axis and Y_rw-axis coordinates of that point in the millimeter wave radar coordinate system; C_x is the offset of the camera optical axis along the X_rw-axis direction; C_y is the offset of the camera optical axis along the Y_p-axis direction; f_x is the camera focal length along the X_p-axis direction; f_y is the camera focal length along the Y_p-axis direction; L_x is the spacing between the X axes of the radar projection coordinate system and the camera projection coordinate system; L_y is the spacing between their Y axes; and H is the camera mounting height.

Finally, using formula (4), convert the image position information of the bottom-center point p(x_bottom, y_bottom) of the same-track train ahead, obtained in the sixth step of step two, into coordinates in the millimeter wave radar coordinate system, and then calculate the relative distance d_w of the bottom-center point p(x_bottom, y_bottom) in the millimeter wave radar coordinate system.

The specific conversion steps are: first obtain the X_p-axis position x_bottom and Y_p-axis position y_bottom of the bottom-center point p(x_bottom, y_bottom) of the same-track train, then convert x_bottom, y_bottom into the coordinates x_w, y_w in the millimeter wave radar coordinate system, and compute the relative distance of the bottom-center point p(x_bottom, y_bottom) in the millimeter wave radar coordinate system as

d_w = sqrt(x_w^2 + y_w^2).

Third step, first convert the lateral distances d_x and longitudinal distances d_y of all targets obtained by the radar in step one into the camera image coordinate system X_p-Y_p using formula (4), and display them in the front scene image as radar point coordinates p_i(x_p, y_p), shown as black squares 10, 11, and 13 in Figure 5. Then convert the lateral distance d_x and longitudinal distance d_y between every target ahead and the millimeter wave radar coordinate origin in step one into the relative distance in the millimeter wave radar coordinate system,

d_r = sqrt(d_x^2 + d_y^2).

Finally, fuse the millimeter wave radar sensor and camera spatial distance information to complete the screening of the radar point coordinates p_i(x_p, y_p) and obtain the radar point position information of the same-track train ahead, comprising the following steps:

Step 101: during radar detection, secondary reflections cause the same target to produce several groups of radar points whose ranges are multiples of one another, as shown by 10 and 11 in Figure 5, where the actual relative distance of 10 is far greater than that of 11. First convert the lateral distance d_x and longitudinal distance d_y of every target ahead relative to the millimeter wave radar coordinate origin into the relative distance in the millimeter wave radar coordinate system,

d_r = sqrt(d_x^2 + d_y^2),

then compare d_r with the relative distance d_w of the bottom-center point p(x_bottom, y_bottom) of the same-track train in the millimeter wave radar coordinate system. If the absolute difference satisfies |d_w − d_r| < Δd_threshold, the radar point corresponding to d_r is retained; otherwise it is deleted. This coarse screening of the radar point coordinates p_i(x_p, y_p) yields the coarsely screened radar point coordinates p_j(x_p, y_p). Since secondary reflection at least doubles the apparent relative distance of a radar point, the distance threshold is set to 1.5 times the vision-based actual relative distance: Δd_threshold = 1.5 d_w.
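A sketch of the coarse-screening rule of step 101, under the stated gate Δd_threshold = 1.5·d_w; the function name and the tuple layout of the radar targets are assumptions:

```python
import math

def coarse_screen(radar_targets, d_w):
    """Keep radar points whose range d_r agrees with the vision-derived
    distance d_w within the gate |d_w - d_r| < 1.5 * d_w, which removes
    far secondary-reflection ghosts of the same target."""
    kept = []
    for dx, dy in radar_targets:
        d_r = math.hypot(dx, dy)            # d_r = sqrt(dx^2 + dy^2)
        if abs(d_w - d_r) < 1.5 * d_w:
            kept.append((dx, dy, d_r))
    return kept

# vision distance 40 m: the ~110 m secondary-reflection ghost is rejected
survivors = coarse_screen([(2.0, 40.0), (5.0, 110.0)], d_w=40.0)
print([(dx, dy) for dx, dy, _ in survivors])  # [(2.0, 40.0)]
```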

Step 102: for radar points belonging to adjacent-track vehicles detected during radar detection, shown as 13 in Figure 5, take the coarsely screened radar point coordinates p_j(x_p, y_p). In the front scene image, according to the position information of the radar point coordinates p_j(x_p, y_p), use the bottom-center point information p(x_bottom, y_bottom) of the same-track train in the front scene image obtained in step two as the reference center, and, following the minimum-Euclidean-distance constraint principle, select the radar point coordinates p_j(x_p, y_p) closest to the bottom-center point p(x_bottom, y_bottom) of the same-track train ahead as the final radar point screening result; the relative distance information d_r corresponding to the finally screened radar point is taken as the final ranging result.

min_i sqrt((x_pi − x_bottom)^2 + (y_pi − y_bottom)^2)    (5)

In formula (5): x_pi, y_pi are the horizontal and vertical coordinates of a radar point in the image coordinate system, and x_bottom, y_bottom are the horizontal and vertical coordinates of the bottom-center point of the same-track train ahead. The bottom-center point of the same-track train ahead is shown as 12 in Figure 5; the radar point nearest to it is selected as the final radar point detection result, screening out the adjacent-vehicle radar point 13 and leaving the vision-matched radar point 11. The distance information of this radar point is taken as the final radar point screening result, and its corresponding relative distance information d_r as the final ranging result.
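The final matching of step 102 reduces to a nearest-neighbour selection in the image plane; a sketch, assuming each candidate carries its image coordinates and its relative distance d_r:

```python
import math

def match_radar_point(candidates, p_bottom):
    """Among coarse-screened radar points (x_p, y_p, d_r), return the one
    whose image position is nearest the train-bottom centre point, per the
    minimum-Euclidean-distance constraint of formula (5)."""
    return min(candidates, key=lambda c: math.dist(c[:2], p_bottom))

# the adjacent-track point at (520, 300) is screened out
candidates = [(362.0, 405.0, 41.2), (520.0, 300.0, 43.7)]
x_p, y_p, d_r = match_radar_point(candidates, (360.0, 410.0))
print(d_r)  # 41.2 -> taken as the final ranging result
```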

Claims (5)

1. A time-space synchronization millimeter wave radar and visual information fusion method is characterized by comprising the following steps:
the method comprises the following steps of firstly, completing primary positioning of a target to be detected by analyzing message data of a millimeter wave radar sensor, and specifically comprising the following steps:
firstly, a millimeter wave radar sensor is arranged at the head position of a running train; taking the geometric center of the largest plane of the millimeter wave radar as the coordinate origin, the advancing direction of the running train as the Y_rw axis, the vertically upward direction as the Z_rw axis, and the rightward direction of the running train as the X_rw axis, a millimeter wave radar three-dimensional rectangular coordinate system is established, in which the pitch angle, yaw angle, and roll angle of the millimeter wave radar sensor are all zero; the millimeter wave radar sensor is connected with a computer through a CAN bus and is used for acquiring the message data information obtained by detecting all target trains in front, and radar message data analysis is then completed by utilizing the MFC function of the computer and the millimeter wave radar communication protocol; all target trains in front comprise a front same-rail train and front adjacent-rail trains, and the message data information comprises the transverse distance d_x and the longitudinal distance d_y between each front target and the millimeter wave radar coordinate origin;
Secondly, the detection of the advancing track of the running train and the detection of the position of the front same-track train are completed by utilizing a camera based on an image processing technology, and the method specifically comprises the following steps:
firstly, a camera is arranged on the head of the running train directly below the millimeter wave radar sensor, and a camera three-dimensional rectangular coordinate system X_cw-Y_cw-Z_cw is established with the optical center of the camera as the coordinate origin; each coordinate axis of the camera three-dimensional rectangular coordinate system is parallel to the corresponding axis of the millimeter wave radar three-dimensional rectangular coordinate system, and the Z_rw axis coincides with the Z_cw axis; the pitch angle, yaw angle, and roll angle of the camera in the camera coordinate system are all zero; the camera is connected with the millimeter wave radar and with the computer through USB data lines respectively, and is used for collecting real-time images of the scene in front of the running train, the collected front scene images comprising all target trains in front and the advancing track of the running train; an image coordinate system X_p-Y_p is established whose coordinate origin is located at the intersection of the camera optical axis and the image plane, with X_p, Y_p respectively along the length direction and the width direction of the front scene image;
secondly, completing the straight line detection of the advancing track of the running train based on the accumulated probability Hough transformation and completing the primary screening of the advancing track of the running train based on the straight line slope on the front scene image collected by the camera in the first step;
thirdly, performing track straight-line screening based on DBSCAN probability density clustering and queue-based track straight-line correction on the plurality of straight lines, including the left and right tracks, obtained in the second step, to obtain corrected straight-line position information of the tracks on both sides; the straight-line positions of the left and right tracks are denoted l_left, l_right respectively, their slopes are k_left, k_right respectively, and the intersection point of the left and right track straight lines l_left, l_right is denoted p_0;
fourthly, selecting the corrected straight-line position information of the two-side tracks obtained in the third step, and using a logarithm-based track straight-line traversal mode to realize high-density traversal of points near the same-track train along the track direction and low-density traversal far from the same-track train position, obtaining the point coordinates p_left(x_left, y_left), p_right(x_right, y_right) of the traversal points of the left and right track straight lines in the front scene image;
fifthly, selecting the point coordinates p_left(x_left, y_left), p_right(x_right, y_right) of the left and right track traversal points obtained in the fourth step in the front scene image, completing identification of the front same-rail train based on the gray-value gradient changes of the traversal points of the track straight lines on the left and right sides, and determining positions where a gray-value mutation exists among the left and right traversal points as the position of the front same-rail train;
sixthly, choosing the middle position point at the bottom of the front same-rail train with coordinates p(x_bottom, y_bottom) and correcting it based on Kalman filtering to obtain smoothly transitioning information of the middle position point at the bottom of the front same-rail train in each period;
step three, fusing millimeter wave radar information and visual information by a time-space synchronization-based combined calibration method to complete accurate identification and distance measurement of the front same-rail train, and specifically comprising the following steps of:
firstly, multithreading synchronization of a millimeter wave radar and a camera;
when data acquisition is carried out, a three-thread fusion mode of a millimeter wave radar data receiving thread, a camera receiving thread and a computer data processing thread is selected to realize multithreading time synchronization based on the millimeter wave radar and visual information;
secondly, obtaining the position to which any radar point in the millimeter wave radar coordinate system is converted in the image coordinate system by utilizing the translation and rotation relations among the millimeter wave radar coordinate system, the camera coordinate system, and the image coordinate system; then converting the image position information of the middle position point p(x_bottom, y_bottom) at the bottom of the front same-rail train obtained in the sixth step of step two into coordinates in the millimeter wave radar coordinate system, and finally calculating the relative distance d_w of the middle position point p(x_bottom, y_bottom) at the bottom of the same-rail train in the millimeter wave radar coordinate system;
thirdly, firstly converting the transverse distances d_x and longitudinal distances d_y of all targets obtained by the radar in the first step into the image coordinate system X_p-Y_p of the camera and displaying them in the front scene image as radar point coordinates p_i(x_p, y_p); then converting the transverse distance d_x and longitudinal distance d_y between each front target and the millimeter wave radar coordinate origin in step one into the relative distance in the millimeter wave radar coordinate system,

d_r = sqrt(d_x^2 + d_y^2);

finally, fusing the millimeter wave radar sensor and camera spatial distance information to complete the screening of the radar point coordinates p_i(x_p, y_p) and obtain the radar point position information of the front same-rail train.
2. The time-space synchronized millimeter wave radar and visual information fusion method of claim 1, wherein:
the specific processes of the second step and the third step are as follows:
step 101, selecting the plurality of straight lines obtained by preliminarily screening the straight lines of the advancing track of the running train, accurately identifying the tracks on the left and right sides of the advancing track based on the DBSCAN probability density clustering algorithm, and obtaining preliminary straight-line position information of the two-side tracks expressed in point-slope form, with slope k_j, where j denotes the number of lines initially obtained by the DBSCAN clustering algorithm, j = 1, 2, ..., and j < 5;
102, respectively setting a queue with an empirical Length of 5 for the preliminary straight line position information of each side track to ensure that all straight lines obtained by the DBSCAN clustering algorithm enter the queue each time;
step 103, sequentially calculating the straight-line slopes k_j of the preliminary straight-line position information of the two-side tracks obtained by the DBSCAN probability density clustering algorithm and entering them into the queue, and selecting the mean value k_mean of the queue as a comparison value,

k_mean = (1/Length) · Σ k_j,

then setting an empirical Distance threshold Distance = 5 according to the pixel size of the width between the two-side tracks in the front scene image, and applying the following criterion to the preliminary straight-line slope of each side track:

[criterion formula — equation image in source not reproduced]

the mean value k_mean of the finally updated queue obtained by the above judgment is the slope of the track straight line; the correction of the slopes of the two-side track straight lines is thereby completed, obtaining the corrected point-slope straight-line position information of the two-side tracks.
3. The time-space synchronized millimeter wave radar and visual information fusion method according to claim 1 or 2, characterized in that: the fourth step of the second step is realized by the following specific method:
step 101, taking the intersection point p_0 of the straight lines l_left, l_right of the two-side tracks as the initial traversal point;
step 102, traversing the straight lines of the two-side tracks in the front scene image along the Y_p-axis direction respectively, the two-side tracks taking the same traversal distance

L = Width − y_p0,

wherein y_p0 is the Y_p-axis coordinate of the intersection point p_0 and Width is the width of the front scene image;
step 103, acquiring the Y_p-axis coordinates of the straight-line traversal points of the left and right tracks, comprising the following steps:

[formula (1) — equation image in source not reproduced]

in formula (1): y_left, y_right are respectively the Y_p-axis coordinates of the straight-line traversal points of the left and right tracks, y_p0 is the Y_p-axis coordinate of the initial traversal point p_0 of the left and right track straight lines l_left, l_right, and k_left, k_right are respectively the slopes of the tracks on the left and right sides; the straight-line traversal intervals of the two-side tracks are the same, with value Δy = log_a L, and the number of traversal points is n = L/Δy, where L is the traversal distance;
step 104, obtaining the X_p-axis coordinates of the traversal points of the left and right tracks according to steps 101-103, finally obtaining the logarithm-based point coordinates p_left(x_left, y_left), p_right(x_right, y_right) of all traversal points of the left and right tracks in the front scene image:

x_left = x_p0 + (y_left − y_p0) / k_left,  x_right = x_p0 + (y_right − y_p0) / k_right    (2)

in formula (2): x_left, x_right are respectively the X_p-axis coordinates of the left and right track traversal points, y_left, y_right are respectively the Y_p-axis coordinates of the left and right track traversal points, y_p0 is the Y_p-axis coordinate of the initial traversal point p_0 of the left and right tracks, x_p0 is the X_p-axis coordinate of the initial traversal point p_0 of the left and right tracks, and k_left, k_right are respectively the slopes of the left and right track straight lines l_left, l_right.
4. The time-space synchronized millimeter wave radar and visual information fusion method of claim 3, wherein: the concrete implementation method of the fifth step and the sixth step is as follows:
step 101, carrying out a homogenization operation on the gray values of the traversal points of the track straight lines, namely calculating the mean of the gray values of 4 continuous traversal points as a new mean traversal point, with coordinate values p_mean_left(x_mean_left, y_mean_left), p_mean_right(x_mean_right, y_mean_right);
step 102, determining the position of the gray-value mutation: taking the coordinates p_mean_left(x_mean_left, y_mean_left), p_mean_right(x_mean_right, y_mean_right) as the coordinates of the left and right position points at the bottom of the front same-rail train, then taking the arithmetic mean of these coordinates as the coordinates of the middle position point at the bottom of the front same-rail train:

x_bottom = (x_mean_left + x_mean_right) / 2,  y_bottom = (y_mean_left + y_mean_right) / 2    (3)

in formula (3): x_bottom, y_bottom are the coordinates of the middle position point at the bottom of the front same-rail train along the X_p and Y_p axes, x_mean_left, y_mean_left are the coordinates of the left position point at the bottom of the front same-rail train along the X_p and Y_p axes, and x_mean_right, y_mean_right are the coordinates of the right position point at the bottom of the front same-rail train along the X_p and Y_p axes, finally obtaining the coordinates p(x_bottom, y_bottom) of the bottom position point of the front same-rail train;
the sixth step, choosing the middle position point at the bottom of the front same-rail train with coordinates p(x_bottom, y_bottom) and correcting it based on Kalman filtering to obtain smoothly transitioning information of the middle position point at the bottom of the front same-rail train in each period, with d_threshold = 50; when the Euclidean distance between p(x_bottom, y_bottom)_i and p(x_bottom, y_bottom)_{i-1} is greater than the distance threshold d_threshold, the middle target position point at the bottom of the front same-rail train in the current period is abandoned and replaced by that of the previous period; meanwhile, Kalman filtering is carried out on all middle target position points at the bottom of the front same-rail train within the distance threshold, realizing smooth transition of the point positions.
5. The time-space synchronized millimeter wave radar and visual information fusion method of claim 3, wherein: the third step of the step comprises the following steps:
step 101, firstly converting the transverse distance d_x and longitudinal distance d_y between each front target and the millimeter wave radar coordinate origin into the relative distance in the millimeter wave radar coordinate system,

d_r = sqrt(d_x^2 + d_y^2),

then comparing d_r with the relative distance d_w of the middle position point p(x_bottom, y_bottom) at the bottom of the same-rail train in the millimeter wave radar coordinate system; if the absolute value of the difference satisfies |d_w − d_r| < Δd_threshold, the radar point corresponding to the relative distance d_r is retained, otherwise it is deleted, realizing coarse screening of the radar point coordinates p_i(x_p, y_p) and obtaining the coarsely screened radar point coordinates p_j(x_p, y_p);
step 102, selecting the coordinates p_j(x_p, y_p) of the radar points obtained by coarse screening; in the front scene image, according to the position information of the radar point coordinates p_j(x_p, y_p), taking the middle position point information p(x_bottom, y_bottom) at the bottom of the same-rail train in the front scene image obtained in step two as the reference center, and selecting, according to the minimum-Euclidean-distance constraint principle, the radar point coordinates p_j(x_p, y_p) nearest to the middle position point p(x_bottom, y_bottom) at the bottom of the front same-rail train as the final radar point screening result, with the relative distance information d_r corresponding to the finally screened radar point as the final ranging result.
CN202110455091.8A2021-04-262021-04-26Time-space synchronization millimeter wave radar and visual information fusion methodActiveCN113189583B (en)


Publications (2)

CN113189583A (en) — 2021-07-30
CN113189583B (en) — 2022-07-01



Citations (7)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
CN107818557A (en)*2016-09-122018-03-20德尔福技术有限公司Enhanced camera object for automotive vehicle detects
CN108960183A (en)*2018-07-192018-12-07北京航空航天大学A kind of bend target identification system and method based on Multi-sensor Fusion
WO2020134512A1 (en)*2018-12-292020-07-02南京慧尔视智能科技有限公司Traffic detection system based on millimeter wave radar and video
CN111368706A (en)*2020-03-022020-07-03南京航空航天大学 Data fusion dynamic vehicle detection method based on millimeter wave radar and machine vision
CN111461088A (en)*2020-06-172020-07-28长沙超创电子科技有限公司Rail transit obstacle avoidance system based on image processing and target recognition
CN111546328A (en)*2020-04-022020-08-18天津大学Hand-eye calibration method based on three-dimensional vision measurement
CN111856441A (en)*2020-06-092020-10-30北京航空航天大学 A train positioning method based on fusion of vision and millimeter wave radar

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US9052393B2 (en)*2013-01-182015-06-09Caterpillar Inc.Object recognition system having radar and camera input
CN107609522B (en)*2017-09-192021-04-13东华大学 An information fusion vehicle detection system based on lidar and machine vision
US11287523B2 (en)*2018-12-032022-03-29CMMB Vision USA Inc.Method and apparatus for enhanced camera and radar sensor fusion
CN110208793B (en)*2019-04-262022-03-11纵目科技(上海)股份有限公司Auxiliary driving system, method, terminal and medium based on millimeter wave radar
CN111832410B (en)*2020-06-092022-09-20北京航空航天大学Forward train detection method based on fusion of vision and laser radar

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN107818557A (en)* | 2016-09-12 | 2018-03-20 | Delphi Technologies, Inc. | Enhanced camera object detection for automotive vehicles
CN108960183A (en)* | 2018-07-19 | 2018-12-07 | Beihang University | Bend target identification system and method based on multi-sensor fusion
WO2020134512A1 (en)* | 2018-12-29 | 2020-07-02 | Nanjing Hurys Intelligent Technology Co., Ltd. | Traffic detection system based on millimeter wave radar and video
CN111368706A (en)* | 2020-03-02 | 2020-07-03 | Nanjing University of Aeronautics and Astronautics | Data fusion dynamic vehicle detection method based on millimeter wave radar and machine vision
CN111546328A (en)* | 2020-04-02 | 2020-08-18 | Tianjin University | Hand-eye calibration method based on three-dimensional vision measurement
CN111856441A (en)* | 2020-06-09 | 2020-10-30 | Beihang University | Train positioning method based on fusion of vision and millimeter wave radar
CN111461088A (en)* | 2020-06-17 | 2020-07-28 | Changsha Chaochuang Electronic Technology Co., Ltd. | Rail transit obstacle avoidance system based on image processing and target recognition

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Z. Wang, G. Yu, B. Zhou, P. Wang and X. Wu, "A Train Positioning Method Based-On Vision and Millimeter-Wave Radar Data Fusion," IEEE Transactions on Intelligent Transportation Systems, 2021-02-03, pp. 1-11. *
Ding Yabin, Peng Xiang, Liu Zeyi, Niu Hanben, "Multi-field-of-view depth image fusion based on generalized isosurface extraction," Journal of Engineering Graphics, 2004. *
Yao Wentao, Shen Chunfeng, Dong Wensheng, "An adaptive joint calibration algorithm for camera and lidar," Control Engineering of China, 2017. *
Zheng Yunshui, Guo Shuangquan, Dong Yu, "Research on obstacle detection methods ahead of running trains based on radar measurement data," Journal of the China Railway Society, 2021. *

Also Published As

Publication number | Publication date
CN113189583A (en) | 2021-07-30

Similar Documents

Publication | Title
CN113189583B (en) | Time-space synchronization millimeter wave radar and visual information fusion method
CN109684921B (en) | A Road Boundary Detection and Tracking Method Based on 3D LiDAR
Zhangyu et al. | A camera and LiDAR data fusion method for railway object detection
CN104021676B (en) | Vehicle location based on vehicle dynamic video features and vehicle speed measurement method
CN111461088B (en) | Rail transit obstacle avoidance system based on image processing and target recognition
CN112991391A (en) | Vehicle detection and tracking method based on radar signal and vision fusion
Perrollaz et al. | Long range obstacle detection using laser scanner and stereovision
CN112698302A (en) | Sensor fusion target detection method under bumpy road conditions
CN115113206B (en) | Pedestrian and obstacle detection method for assisting driving of underground rail cars
CN112991369A (en) | Method for detecting overall dimensions of a running vehicle based on binocular vision
WO2021253245A1 (en) | Method and device for identifying vehicle lane changing tendency
CN115856872B (en) | Vehicle motion trail continuous tracking method
Wang et al. | Object tracking based on the fusion of roadside LiDAR and camera data
CN113850102A (en) | Vehicle-mounted visual detection method and system based on millimeter-wave radar assistance
CN114357019A (en) | Method for monitoring data quality of roadside sensing units in an intelligent networking environment
CN105913454A (en) | Pixel coordinate trajectory prediction method for moving objects in video images
WO2025050329A1 (en) | Multi-roadside laser radar sensor point cloud data temporal-spatial registration method
CN110443819A (en) | Track detection method and device for monorail trains
Murmu et al. | Relative velocity measurement using low cost single camera-based stereo vision system
Li et al. | Multi-sensor fusion for robust localization with moving object segmentation in complex dynamic 3D scenes
CN115980754A (en) | Vehicle detection and tracking method fusing sensor information
CN117974773A (en) | Method for calibrating bow direction based on geographic azimuth under static ship conditions in a ship lock
CN117671972A (en) | Vehicle speed detection method and device for slow traffic systems
Huang et al. | FRVO-Mono: Feature-based railway visual odometry with monocular camera
CN115601388A (en) | Image multi-target vehicle tracking and track checking method carrying following model

Legal Events

Code | Title
PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant
