CN109035204B - Real-time detection method for weld joint target - Google Patents

Real-time detection method for weld joint target

Info

Publication number
CN109035204B
Authority
CN
China
Prior art keywords
welding seam
training
welding
sample
detector
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201810666919.2A
Other languages
Chinese (zh)
Other versions
CN109035204A (en)
Inventor
邹焱飚 (Zou Yanbiao)
王灿烨 (Wang Canye)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South China University of Technology SCUT
Original Assignee
South China University of Technology SCUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South China University of Technology (SCUT)
Priority to CN201810666919.2A
Publication of CN109035204A
Application granted
Publication of CN109035204B
Legal status: Expired - Fee Related
Anticipated expiration

Abstract

(Translated from Chinese)

The invention belongs to the technical field of detection and in particular relates to a real-time detection method for a weld seam target, comprising the following steps: establishing a training sample set, configured to collect weld images of different shapes as source samples and to preprocess the source samples into training samples; training a detector offline, configured to train a neural network on the training samples under different initial conditions and to take the optimal model from multiple training runs as the weld detector; and detecting online, configured to acquire a detection image, detect the weld seam with the weld detector, and output the detection result. For welds of different shapes, the invention uses a weld detector trained on weld images formed by line-laser imaging, overcoming the poor robustness of locating welds by morphological methods; it achieves accurate localization of welds of different shapes, classifies different weld types accurately, and attains high positioning accuracy.


Description

Real-time detection method for weld joint target
Technical Field
The invention relates to a real-time detection method for a weld joint target, in particular to a real-time detection method for a robot line laser weld joint.
Background
Because manual welding involves high labor intensity, a harsh working environment, low efficiency and similar problems, welding robots have gradually been applied to welding production at home and abroad. Most traditional welding robots are teach-and-playback robots that use sensors to acquire external information and adjust the welding trajectory. In recent years, with the development of computer vision, weld seam tracking systems based on laser and vision-sensor imaging have begun to appear; they are characterized by locating weld feature points in a tracking mode. Because most tracking algorithms are based on spatio-temporal context learning, an identified weld starting point is required, which places a demand on the positioning accuracy of that starting point. Most existing seam tracking systems locate weld feature points either by manual marking or by traditional image processing; manual marking is time-consuming and labor-intensive, while traditional methods such as morphological processing require considerable time to build feature engineering and tune classifier parameters, and cannot reliably determine the positions of weld starting points of different shapes.
The main technical index of a real-time seam tracking system is the distance d between the laser stripe and the weld pool: the smaller d is, the higher the tracking accuracy, and d is usually expected to be less than 30 mm. The welding starting point serves as the reference point of the tracking algorithm, so acquiring it quickly and accurately is the goal of a real-time weld seam detection method.
Disclosure of Invention
Aiming at these problems, the invention provides a real-time detection method for a weld seam target. By acquiring and preprocessing training samples, a weld detector based on a convolutional neural network is trained so that it can quickly and accurately identify and locate different weld positions, effectively solving the inaccurate positioning and poor robustness of morphological methods for locating the welding starting point in current automatic seam tracking systems.
The invention is realized by adopting the following technical scheme: a real-time detection method for a weld joint target comprises the following steps:
establishing a training sample set, wherein the training sample set is constructed by collecting welding seam images in different forms as source samples and preprocessing the source samples to form training samples;
training a detector offline, configured to train the neural network under different initial conditions with the training samples, the optimal neural network model obtained over multiple training runs being used as the weld detector;
and detecting online, configured to acquire a detection image, perform weld detection with the weld detector, and output the detection result.
Compared with the prior art, the invention has the beneficial effects that:
(1) A convolutional-neural-network-based seam detector is trained offline to identify and locate seams of different types quickly and accurately, effectively solving the inaccurate positioning and poor robustness of morphological welding-start-point localization in current automatic seam tracking systems.
(2) The weld image is generated and collected by the line laser sensor and the embedded controller, giving clear imaging that is hard to disturb with external ambient light sources, simple operation and a simple structure;
(3) Preprocessing the acquired training samples yields evenly distributed sample classes with variable weld positions and angles, saving the time and labor of repeatedly collecting training samples;
(4) The convolutional neural network learns the essential features of the weld from a large sample set, ensuring that the features are strongly separable; compared with traditional morphological detection methods, the method is more robust and locates welds of different types more accurately.
Drawings
FIG. 1 is a flowchart illustrating the overall operation of a weld target detection method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of an image structure acquired by a line laser real-time weld detection system of a six-degree-of-freedom welding robot according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of the internal structure of a line laser sensor according to an embodiment of the present invention.
Detailed Description
The purpose of the present invention is described in further detail below with reference to specific embodiments; the embodiments of the present invention are not limited thereto.
A real-time detection method for a weld seam target is based on the detection system shown in Fig. 2, which comprises a six-degree-of-freedom mechanical arm 1, a welding gun 2, a line laser sensor 3, a workbench 4, an embedded controller 5 and a workpiece 7. The workpiece 7 is placed on the workbench 4, the line laser vision sensor 3 is mounted on the welding gun 2, and the welding gun 2 is arranged at the end of the six-degree-of-freedom mechanical arm 1; the line laser sensor 3 and the welding gun 2 change their position in space through the movement of the mechanical arm 1. The internal structure of the line laser sensor 3 is shown in Fig. 3 and includes a camera 6 and a laser generator 8.
as shown in fig. 1, in one embodiment, a method for detecting a weld target in real time includes the following steps:
s1, establishing a training sample set;
In one embodiment, the specific process of S1 is as follows: collecting weld images of different forms as source samples, and preprocessing the source samples to form training samples.
In this embodiment, the S1 establishing the training sample set includes:
S11, collecting source samples: capturing images before welding starts with the camera 6 in the line laser sensor 3;
s12, preprocessing the source sample;
S13, generating the target samples: taking the smallest class count among the 4 types of preprocessed weld samples as a reference, and downsampling the other 3 types until each class matches the reference, to obtain the final training samples (a balancing sketch follows).
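As a concrete illustration of S13, a minimal balancing sketch is shown below; the grouping of samples by weld type is a hypothetical data layout, not specified by the patent:

```python
import random

def balance(samples_by_class: dict) -> list:
    """Downsample every class to the size of the smallest class (S13)."""
    n_min = min(len(items) for items in samples_by_class.values())
    balanced = []
    for weld_type, items in samples_by_class.items():
        balanced.extend(random.sample(items, n_min))   # random downsampling
    return balanced
```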
In this embodiment, the S11 collecting the source sample includes:
S111, adjusting the position of the mechanical arm 1 of the six-degree-of-freedom welding robot so that the tip of the welding gun 2 is directly above the weld position of the workpiece 7 to be welded, and the line laser sensor 3 fixed on the welding gun 2 is at its optimal working position, so that clear images can be captured during welding without interference between sensor and workpiece;
and S112, emitting laser with the laser generator 8, acquiring images with the camera 6 in the line laser sensor 3, and sending them to the embedded industrial controller 5.
In this embodiment, the preprocessing the source sample by the S12 includes:
S121, eliminating weld samples containing arc light and spatter from the source samples, keeping only clean pre-welding samples free of arc light and spatter;
S122, scaling the clean, arc-free and spatter-free weld samples to a uniform size of 1280×1024;
S123, horizontally flipping the scaled weld samples;
S124, randomly applying translation, rotation, brightness and contrast transformations and adding Gaussian white noise to the horizontally flipped weld samples;
and S125, normalizing the weld samples after the Gaussian white noise is added.
In this embodiment, 1200 weld images of each of four different types (L-weld, V-weld, I-weld and Open-weld) were collected by the camera in the line laser sensor, 4800 weld samples in total. The sizes were unified to 1280×1024 pixels. Horizontal flipping, translation ([-40, +40] pixels), rotation ([-20, +20] degrees), brightness scaling ([0.8, 1.2] times), contrast scaling ([0.8, 1.2] times) and Gaussian white noise (μ = 0, σ = 20) were then applied randomly to obtain another 1200 randomly processed weld images per type, so that each weld type has 2400 samples and the total is 9600. The pixel values of all samples were divided by 255, normalizing them to [0, 1].
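A minimal sketch of this preprocessing, assuming grayscale numpy arrays; the patent does not give an implementation, and scipy is used here only for the geometric transforms:

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng()

def augment(img: np.ndarray) -> np.ndarray:
    """Apply the random transforms of S12 to one grayscale image (H, W)."""
    if rng.random() < 0.5:                             # random horizontal flip
        img = img[:, ::-1]
    dy, dx = rng.integers(-40, 41, size=2)             # translation, [-40, +40] px
    img = ndimage.shift(img, (dy, dx), mode="nearest")
    angle = rng.uniform(-20.0, 20.0)                   # rotation, [-20, +20] deg
    img = ndimage.rotate(img, angle, reshape=False, mode="nearest")
    img = img * rng.uniform(0.8, 1.2)                  # brightness, [0.8, 1.2]x
    mean = img.mean()
    img = mean + (img - mean) * rng.uniform(0.8, 1.2)  # contrast, [0.8, 1.2]x
    img = img + rng.normal(0.0, 20.0, size=img.shape)  # Gaussian noise, sigma=20
    return np.clip(img, 0.0, 255.0) / 255.0            # normalize to [0, 1]
```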
S2 training the detector offline;
In one embodiment, the specific process of S2 is as follows: training the neural network under different initial conditions with the training samples from S1, and taking the optimal neural network model obtained over multiple training runs as the weld detector.
In this embodiment, the S2 offline training detector includes:
S21, training the weld detector with the Faster-RCNN algorithm: first extracting features of the input image with a shared feature layer, then outputting candidate regions with a Region Proposal Network (RPN), finally using Fast-RCNN as the classifier to output the classification result for each candidate region, and keeping only the highest-scoring candidate region with a non-maximum suppression (NMS) algorithm (a sketch follows this list);
S22, initializing the parameters of the RPN and the Fast-RCNN classifier under different initialization conditions, training until a preset maximum number of iterations is reached or the error rate on the validation set no longer decreases, and taking the optimal model obtained over multiple training runs as the weld detector.
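The non-maximum suppression step named in S21 can be illustrated with a minimal greedy, IoU-based sketch; this is illustrative rather than the authors' implementation, and the IoU threshold is an assumption:

```python
import numpy as np

def nms(boxes: np.ndarray, scores: np.ndarray, iou_thr: float = 0.7) -> list:
    """boxes: (N, 4) as (x1, y1, x2, y2); returns indices of kept boxes."""
    order = scores.argsort()[::-1]          # process highest score first
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(int(i))
        rest = order[1:]
        # intersection of box i with all remaining boxes
        xx1 = np.maximum(boxes[i, 0], boxes[rest, 0])
        yy1 = np.maximum(boxes[i, 1], boxes[rest, 1])
        xx2 = np.minimum(boxes[i, 2], boxes[rest, 2])
        yy2 = np.minimum(boxes[i, 3], boxes[rest, 3])
        inter = np.clip(xx2 - xx1, 0, None) * np.clip(yy2 - yy1, 0, None)
        area_i = (boxes[i, 2] - boxes[i, 0]) * (boxes[i, 3] - boxes[i, 1])
        areas = (boxes[rest, 2] - boxes[rest, 0]) * (boxes[rest, 3] - boxes[rest, 1])
        iou = inter / (area_i + areas - inter)
        order = rest[iou <= iou_thr]        # drop boxes overlapping box i
    return keep
```

Keeping only the single highest-scoring region, as S21 describes, corresponds to taking the first kept index.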
In this embodiment, training the weld detector with the Faster-RCNN algorithm in S21 comprises the following steps:
s211, pre-training parameters of a convolutional neural network shared feature layer by using an ImageNet data set;
This embodiment adopts the Faster-RCNN algorithm, and the network structure of the shared feature layer adopts the Inception v2 network, which applies convolution kernels of different sizes within one convolution layer to obtain features of different sizes; these features are concatenated before being fed together into the next layer.
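The multi-scale convolution idea can be illustrated with a small PyTorch module; this is a schematic of parallel kernels with concatenated outputs, not the actual Inception v2 topology, and the channel counts are arbitrary:

```python
import torch
import torch.nn as nn

class MultiScaleBlock(nn.Module):
    """Parallel 1x1 / 3x3 / 5x5 convolutions, concatenated along channels."""
    def __init__(self, c_in: int):
        super().__init__()
        self.b1 = nn.Conv2d(c_in, 32, kernel_size=1)
        self.b3 = nn.Conv2d(c_in, 32, kernel_size=3, padding=1)
        self.b5 = nn.Conv2d(c_in, 32, kernel_size=5, padding=2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # features at several receptive-field sizes, merged before the next layer
        return torch.cat([self.b1(x), self.b3(x), self.b5(x)], dim=1)
```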
S212, adjusting the parameters of the region proposal network: fixing the aspect ratio of the output candidate regions to 1, setting their side lengths to the three scales 0.5, 0.75 and 1, and setting the total number of output candidate regions to 100;
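A hedged sketch of the anchor layout implied by S212 follows: square anchors (aspect ratio 1) at three relative side lengths. The base anchor size and feature-map stride are illustrative assumptions, and the score-based selection of the top 100 regions is omitted:

```python
import numpy as np

def make_anchors(feat_h: int, feat_w: int, stride: int = 16, base: float = 256.0):
    """Return (feat_h * feat_w * 3, 4) square anchors as (x1, y1, x2, y2)."""
    sides = np.array([0.5, 0.75, 1.0]) * base            # three side lengths
    cy, cx = np.meshgrid(np.arange(feat_h), np.arange(feat_w), indexing="ij")
    centers = np.stack([cx, cy], -1).reshape(-1, 2) * stride + stride / 2.0
    anchors = [np.concatenate([centers - s / 2, centers + s / 2], axis=1)
               for s in sides]                           # aspect ratio fixed to 1
    return np.concatenate(anchors, axis=0)
```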
S213, defining the loss function: the loss function is the sum of the classification error and the position deviation of the candidate regions;
The labeling principle for the loss function in this embodiment is: a candidate region whose IoU with the real target region is the maximum is marked as a positive sample; a candidate region whose IoU with the real target region is greater than 0.7 is also marked as a positive sample.
According to the above definitions, the loss function is defined as formula (1):

$$L(\{p_i\},\{t_i\}) = \frac{1}{N_{cls}}\sum_i L_{cls}(p_i, p_i^*) + \lambda\,\frac{1}{N_{reg}}\sum_i p_i^*\,L_{reg}(t_i, t_i^*) \tag{1}$$

In formula (1), $i$ denotes the index of a candidate region within a mini-batch and $p_i$ the prediction for region $i$; under the labeling rule above, $p_i^*$ equals 1 if the candidate region is marked as a positive sample and 0 otherwise. $t_i = \{t_x, t_y, t_w, t_h\}$ is a vector of the predicted candidate region's center-point position coordinates and its width and height, and $t_i^*$ denotes the center-point position coordinates and the width and height of the real target region. The classification loss $L_{cls}$ is a two-class cross-entropy function, and the regression loss $L_{reg}$ uses the numerically stable Smooth L1 loss. $\lambda$ defaults to 10, so that the classification and regression losses carry approximately equal weight. $N_{cls}$ denotes the total number of samples in the mini-batch and $N_{reg}$ the total number of target-region coordinates in the mini-batch.
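The following is a minimal PyTorch-style sketch of loss (1); it is not the authors' code, `p` is assumed to hold objectness probabilities, and normalizing the regression term by the number of positive anchors is an assumption about how $N_{reg}$ is applied:

```python
import torch
import torch.nn.functional as F

def rpn_loss(p: torch.Tensor, p_star: torch.Tensor,
             t: torch.Tensor, t_star: torch.Tensor, lam: float = 10.0):
    """p, p_star: (N,) predicted probabilities and 0/1 float labels;
    t, t_star: (N, 4) predicted / ground-truth boxes encoded as in Eq. (2)."""
    l_cls = F.binary_cross_entropy(p, p_star)            # two-class cross entropy
    per_box = F.smooth_l1_loss(t, t_star, reduction="none").sum(dim=1)
    n_pos = p_star.sum().clamp(min=1.0)                  # positive anchors only
    l_reg = (p_star * per_box).sum() / n_pos             # Smooth L1 regression
    return l_cls + lam * l_reg                           # lambda defaults to 10
```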
For the regression loss, the present embodiment parameterizes the candidate-region coordinates by the method of formula (2):

$$t_x = \frac{x - x_a}{w_a},\quad t_y = \frac{y - y_a}{h_a},\quad t_w = \log\frac{w}{w_a},\quad t_h = \log\frac{h}{h_a};\qquad t_x^* = \frac{x^* - x_a}{w_a},\quad t_y^* = \frac{y^* - y_a}{h_a},\quad t_w^* = \log\frac{w^*}{w_a},\quad t_h^* = \log\frac{h^*}{h_a} \tag{2}$$

In formula (2), $x$, $y$, $w$ and $h$ denote the center coordinates of a region and its width and height, respectively, and the variables $x$, $x_a$ and $x^*$ (likewise for $y$, $w$ and $h$) refer to the predicted region, the anchor box, and the ground-truth box, respectively.
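Formula (2) translates directly into code; a minimal sketch, with boxes given as center/size tuples:

```python
import numpy as np

def encode(box, anchor) -> np.ndarray:
    """box, anchor: (x, y, w, h); returns (t_x, t_y, t_w, t_h) per Eq. (2)."""
    x, y, w, h = box
    xa, ya, wa, ha = anchor
    return np.array([(x - xa) / wa,     # t_x
                     (y - ya) / ha,     # t_y
                     np.log(w / wa),    # t_w
                     np.log(h / ha)])   # t_h
```

Applying the same encoding to the ground-truth box against its anchor yields the starred targets.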
S214, computing the gradient with the back-propagation (BP) algorithm, setting the learning rate to 0.0001, and updating the parameters of the shared feature layer in an approximate joint training manner.
In this embodiment, the ratio of the scaling coefficients of the classification error and the position deviation in the loss function of S213 is 1:1.5.
In this embodiment, the S214 includes:
S2141, when computing the gradient with the BP algorithm, treating the extracted ROI regions as fixed values and computing the network error in mini-batch mode;
S2142, updating the parameters of the RPN network and the Fast-RCNN classifier simultaneously;
S2143, merging the gradient from the RPN network with the gradient of the Fast-RCNN classifier, and feeding the combined gradient into the shared feature layer for its parameter update.
S3, carrying out online detection;
In one embodiment, the specific process of S3 is as follows: acquiring a detection image with the camera 6 in the line laser sensor 3 and the embedded controller 5, detecting the weld with the weld detector obtained in S2, and outputting the detection result.
In this embodiment, the online detection in step S3 includes:
S31, acquiring the detection image: capturing a weld image with the camera of the line laser sensor and sending it to the embedded industrial controller;
S32, preprocessing the acquired image: normalizing the image;
S33, calling the weld detector trained in S2 and detecting the image with the Faster-RCNN algorithm;
S34, outputting the detection result: outputting the weld classification result and the weld position in the image, completing the weld detection before welding starts.
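To make the online loop concrete, a hedged sketch follows. `detector` stands in for the trained Faster-RCNN model; its return convention and the 0.5 score cutoff are assumptions, not part of the patent:

```python
import numpy as np

def detect_weld(frame: np.ndarray, detector) -> list:
    """One pass of S31-S34; assumes detector returns numpy arrays of
    class labels, boxes (x, y, w, h), and confidence scores."""
    x = frame.astype(np.float32) / 255.0           # S32: normalize as in training
    classes, boxes, scores = detector(x[None])     # S33: Faster-RCNN forward pass
    keep = scores > 0.5                            # keep confident detections only
    # S34: weld type plus position in the image
    return [(c, tuple(b)) for c, b in zip(classes[keep], boxes[keep])]
```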
This embodiment solves the difficulty of locating the welding starting point in current seam tracking systems, offering high positioning accuracy of the welding starting point, high detection speed, and strong robustness.
The above examples are merely intended to illustrate the present invention clearly and do not limit its embodiments. Those skilled in the art may make variations or modifications in other forms on the basis of the foregoing description; it is neither necessary nor possible to exhaust all embodiments here. Any modification, equivalent replacement or improvement made within the spirit and principle of the present invention shall fall within the protection scope of the claims.

Claims (9)

1. A real-time detection method for a weld joint target is characterized by comprising the following steps:
establishing a training sample set, constructed by using a line laser sensor to acquire weld images of L-shaped, V-shaped, I-shaped and Open-shaped welds as source samples; preprocessing the source samples, including horizontal flipping, translation, rotation, brightness and contrast transformations, addition of Gaussian white noise, and normalization, to obtain preprocessed weld samples with balanced distribution and variable weld position and angle; and taking the smallest class count among the 4 types of preprocessed weld samples as a reference and downsampling the other 3 types until each matches the reference, to obtain the training samples;
training a detector offline, configured to train the neural network under different initial conditions with the training samples, the optimal neural network model obtained over multiple training runs being used as the weld detector;
wherein the network structure of the shared feature layer of the weld detector adopts an Inception V2 network, in which convolution kernels of different sizes operate within one convolution layer to obtain features of different sizes, and these features are combined before being input together into the lower-layer network;
the loss function of the weld detector is defined as:

$$L(\{p_i\},\{t_i\}) = \frac{1}{N_{cls}}\sum_i L_{cls}(p_i, p_i^*) + \lambda\,\frac{1}{N_{reg}}\sum_i p_i^*\,L_{reg}(t_i, t_i^*)$$

wherein: $i$ denotes the index of a candidate region within a mini-batch, and $p_i$ denotes the prediction result for candidate region $i$; $p_i^*$ equals 1 if candidate region $i$ is marked as a positive sample and 0 otherwise; $t_i = \{t_x, t_y, t_w, t_h\}$ is a vector of the predicted candidate region's center-point position coordinates and its width and height; $t_i^*$ denotes the center-point position coordinates and the width and height of the real target region; the classification loss function $L_{cls}$ is a two-class cross-entropy function, and the regression loss function $L_{reg}$ uses the numerically stable Smooth L1 loss; $\lambda$ is set to 10 by default, so that the classification loss function and the regression loss function carry approximately equal weight; $N_{cls}$ denotes the total number of samples in the mini-batch, and $N_{reg}$ denotes the total number of target-region coordinates in the mini-batch;
and detecting online, configured to acquire a detection image, perform weld detection with the weld detector, output the type and position of the weld, and complete the positioning of the weld starting point before welding starts.
2. The method of claim 1, wherein training the detector off-line comprises the steps of:
training the weld detector with a Faster-RCNN algorithm, constructed such that the shared feature layer extracts the features of the input image, a region proposal network then outputs candidate regions, Fast-RCNN finally serves as the classifier outputting the classification result of the candidate regions, and only the highest-scoring candidate region is kept by a non-maximum suppression algorithm;
and initializing the parameters of the RPN and the Fast-RCNN classifier under different initialization conditions, training until a preset maximum number of iterations is reached or the error rate on the validation set no longer decreases, and training multiple times to obtain the optimal model as the weld detector.
3. The method according to claim 2, wherein training the weld detector using the Faster-RCNN algorithm comprises the steps of:
pre-training parameters of a convolutional neural network sharing feature layer by using an ImageNet data set;
adjusting parameters of the regional candidate network;
defining a loss function constructed as a sum of the classification error and a positional deviation of the candidate region;
and calculating the gradient by using a BP algorithm, performing back propagation, and updating the parameters of the shared characteristic layer by adopting an approximate joint training mode.
4. The method of claim 3, wherein the ratio of the scaling coefficients of the classification error and the position deviation in the loss function is 1:1.5.
5. The method of claim 3, wherein calculating the gradient by using the BP algorithm and performing back propagation, and updating the parameters of the shared feature layer by adopting an approximate joint training mode, comprises the following steps:
when the BP algorithm computes the gradient, taking the extracted ROI as a fixed value and calculating the network error in mini-batch mode;
simultaneously updating parameters of a Fast-RCNN classifier and an RPN network;
and combining the gradient from the RPN network with the gradient of the Fast-RCNN classifier, and inputting the combined gradient into the shared feature layer for parameter updating.
6. The method according to any one of claims 2 to 5, wherein the detection system on which the method is based comprises a robot arm, a welding torch, a line laser sensor, a worktable, a workpiece, the workpiece being placed on the worktable, the line laser vision sensor being mounted on the welding torch, the welding torch being placed at the end of the robot arm, the line laser sensor and the welding torch changing their positions in space by the movement of the robot arm, the line laser sensor comprising a camera and a laser generator; the method is characterized in that the online detection comprises the following steps:
acquiring a detection image, and acquiring a welding seam image by using a camera of a linear laser sensor and sending the welding seam image to an embedded industrial controller;
preprocessing the acquired image, and normalizing the image;
calling the welding seam detector, and detecting the image by using a Faster-RCNN algorithm;
and outputting the classification result of the welding seam and the position of the welding seam in the image, and finishing the detection of the welding seam before the welding starts.
7. The method of claim 1, wherein establishing a training sample set comprises the steps of:
collecting a source sample configured to capture an image prior to initiation of welding by a camera in a line laser sensor;
preprocessing the source sample;
and generating a target sample to obtain an offline training sample set.
8. The method of claim 7, wherein collecting a source sample comprises the steps of:
adjusting the position of a mechanical arm of the welding robot to enable the tail end of a welding gun to be positioned right above the position of a welding seam of a workpiece to be welded;
and a camera in the line laser sensor is used for acquiring an image and sending the image to the embedded industrial controller.
9. The method of claim 7, wherein the source sample is pre-processed comprising the steps of:
eliminating weld samples with arc light and spatter from the source samples, and keeping only clean pre-welding samples free of arc light and spatter;
scaling the clean, arc-free and spatter-free weld samples to a uniform size;
horizontally flipping the scaled weld samples;
randomly applying translation, rotation, brightness and contrast transformations and adding Gaussian white noise to the horizontally flipped weld samples;
and normalizing the weld samples after the Gaussian white noise is added.

Priority Applications (1)

Application Number · Priority Date · Filing Date · Title
CN201810666919.2A (CN109035204B, en) · 2018-06-25 · 2018-06-25 · Real-time detection method for weld joint target

Applications Claiming Priority (1)

Application Number · Priority Date · Filing Date · Title
CN201810666919.2A (CN109035204B, en) · 2018-06-25 · 2018-06-25 · Real-time detection method for weld joint target

Publications (2)

Publication Number · Publication Date
CN109035204A (en) · 2018-12-18
CN109035204B (en) · 2021-06-08

Family

ID=64610519

Family Applications (1)

Application Number · Title · Priority Date · Filing Date
CN201810666919.2A (CN109035204B, en; Expired - Fee Related) · Real-time detection method for weld joint target · 2018-06-25 · 2018-06-25

Country Status (1)

Country · Link
CN · CN109035204B (en)

Families Citing this family (23)

* Cited by examiner, † Cited by third party
Publication number · Priority date · Publication date · Assignee · Title
CN109977913B (en)* · 2019-04-08 · 2021-11-05 · 北京奇艺世纪科技有限公司 · Target detection network training method and device and electronic equipment
CN110135513A (en)* · 2019-05-22 · 2019-08-16 · 广东工业大学 · A welding seam recognition method for welding robot based on deep learning
CN110245689A (en)* · 2019-05-23 · 2019-09-17 · 杭州有容智控科技有限公司 · Shield cutter identification and position finding and detection method based on machine vision
CN110210497B (en)* · 2019-05-27 · 2023-07-21 · 华南理工大学 · A Robust Real-time Weld Feature Detection Method
CN110163859B (en)* · 2019-05-29 · 2023-05-05 · 广东工业大学 · PoseCNN-based weld joint welding method, device and equipment
CN110264457B (en)* · 2019-06-20 · 2020-12-15 · 浙江大学 · Weld Seam Autonomous Identification Method Based on Rotation Region Candidate Network
CN112440039A (en)* · 2019-08-31 · 2021-03-05 · 南京理工大学 · Intelligent photoelectric tracking system and measuring method for welding seam based on multi-line structured light projection
CN111037549B (en)* · 2019-11-29 · 2022-09-09 · 重庆顺泰铁塔制造有限公司 · Welding track processing method and system based on 3D scanning and TensorFlow algorithm
CN110977292A (en)* · 2019-12-12 · 2020-04-10 · 天津博迈科海洋工程有限公司 · Automatic detection method for welding seam of ocean platform module structure
CN110991385A (en)* · 2019-12-13 · 2020-04-10 · 珠海大横琴科技发展有限公司 · Method and device for identifying ship driving track and electronic equipment
CN111069736A (en)* · 2019-12-27 · 2020-04-28 · 唐山松下产业机器有限公司 · Storage medium, welding equipment, welding abnormity detection method and device
CN111289251A (en)* · 2020-02-27 · 2020-06-16 · 湖北工业大学 · A fine-grained fault identification method for rolling bearings
CN112001935B (en)* · 2020-07-28 · 2023-07-18 · 上海巧视智能科技有限公司 · T-shaped weld polishing method, system, medium and terminal based on laser scanning
CN112285114A (en)* · 2020-09-29 · 2021-01-29 · 华南理工大学 · Enameled wire spot welding quality detection system and method based on machine vision
CN112801984B (en)* · 2021-01-29 · 2022-10-21 · 华南理工大学 · Weld joint positioning method based on countermeasure learning under laser vision system
CN113012228B (en)* · 2021-03-23 · 2023-06-20 · 华南理工大学 · A workpiece positioning system and a workpiece positioning method based on deep learning
CN113177914B (en)* · 2021-04-15 · 2023-02-17 · 青岛理工大学 · Robot welding method and system based on semantic feature clustering
CN113674218B (en)* · 2021-07-28 · 2024-06-14 · 中国科学院自动化研究所 · Weld feature point extraction method and device, electronic equipment and storage medium
CN113828947B (en)* · 2021-11-23 · 2022-03-08 · 昆山宝锦激光拼焊有限公司 · BP neural network laser welding seam forming prediction method based on double optimization
CN114453809B (en)* · 2022-03-17 · 2024-07-09 · 中国铁建重工集团股份有限公司 · Cavity welding positioning method
CN114769800B (en)* · 2022-06-20 · 2022-09-27 · 中建五洲工程装备有限公司 · Intelligent operation control system and method for welding process
CN115018819B (en)* · 2022-07-04 · 2025-08-01 · 泉州装备制造研究所 · Weld joint position extraction method based on Transformer neural network
CN115950901A (en)* · 2022-12-16 · 2023-04-11 · 广州航海学院 · A hull weld flaw detection imaging recognition method and information processing method

Citations (2)

* Cited by examiner, † Cited by third party
Publication number · Priority date · Publication date · Assignee · Title
CN107392901A (en)* · 2017-07-24 · 2017-11-24 · 国网山东省电力公司信息通信公司 · A kind of method for transmission line part intelligence automatic identification
CN107451997A (en)* · 2017-07-31 · 2017-12-08 · 南昌航空大学 · A kind of automatic identifying method of the welding line ultrasonic TOFD D scanning defect types based on deep learning

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number · Priority date · Publication date · Assignee · Title
ATE499174T1 (en)* · 2003-12-10 · 2011-03-15 · Vietz GmbH · Orbital welding device for pipeline construction


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
探伤机器人的焊缝图像检测技术研究 (Research on weld seam image detection technology for flaw detection robots); 王姗姗 (Wang Shanshan); 《中国优秀硕士学位论文全文数据库》 (China Masters' Theses Full-text Database); 2018-04-15; abstract and pages 31-32, 44-51 of the main text *

Also Published As

Publication number · Publication date
CN109035204A (en) · 2018-12-18

Similar Documents

Publication · Title
CN109035204B (en) · Real-time detection method for weld joint target
Zou et al. · Laser vision seam tracking system based on image processing and continuous convolution operator tracker
CN106392267B (en) · A kind of real-time welding seam tracking method of six degree of freedom welding robot line laser
Zou et al. · Real-time seam tracking control system based on line laser visions
CN106181162B (en) · A kind of real-time weld joint tracking detection method based on machine vision
CN206263418U (en) · A kind of real-time seam tracking system of six degree of freedom welding robot line laser
CN109693018B (en) · Autonomous mobile robot welding line visual tracking system and tracking method
CN115592324B (en) · Automatic welding robot control system based on artificial intelligence
He et al. · Autonomous detection of weld seam profiles via a model of saliency-based visual attention for robotic arc welding
Zou et al. · An end-to-end calibration method for welding robot laser vision systems with deep reinforcement learning
CN114714355B (en) · Embedded vision tracking control system of autonomous mobile welding robot
Hou et al. · A teaching-free welding method based on laser visual sensing system in robotic GMAW
Zou et al. · Light-weight segmentation network based on SOLOv2 for weld seam feature extraction
CN113369761B (en) · A method and system for welding seam positioning based on vision-guided robot
CN113920060A (en) · Welding robot autonomous operation method, device, electronic device and storage medium
Liu et al. · Seam tracking system based on laser vision and CGAN for robotic multi-layer and multi-pass MAG welding
Nguyen et al. · Development of a vision system integrated with industrial robots for online weld seam tracking
Gao et al. · Weld-pool image centroid algorithm for seam-tracking vision model in arc-welding process
Zou et al. · Automatic seam detection and tracking system for robots based on laser vision
Xiao et al. · An automatic calibration algorithm for laser vision sensor in robotic autonomous welding system
Yang et al. · Detection of weld groove edge based on multilayer convolution neural network
Yang et al. · Automatic extraction and identification of narrow butt joint based on ANFIS before GMAW
Ryberg et al. · Stereo vision for path correction in off-line programmed robot welding
Baek et al. · Optimization of weld penetration prediction based on weld pool image and deep learning approach in gas tungsten arc welding
Ye et al. · Weld seam tracking based on laser imaging binary image preprocessing

Legal Events

Code · Title/Description
PB01 · Publication
SE01 · Entry into force of request for substantive examination
GR01 · Patent grant
CF01 · Termination of patent right due to non-payment of annual fee · Granted publication date: 2021-06-08
