CN108453707B - Robot dragging teaching track generation method - Google Patents

Robot dragging teaching track generation method

Info

Publication number
CN108453707B (application CN201810323750.0A)
Authority
CN
China
Prior art keywords
robot
straight line
track
line segment
joint position
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810323750.0A
Other languages
Chinese (zh)
Other versions
CN108453707A (en)
Inventor
庹华
袁顺宁
韩建欢
韩峰涛
张雷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Rokae Shandong Intelligent Technology Co ltd
Original Assignee
Rokae Shandong Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Rokae Shandong Intelligent Technology Co ltd
Priority to CN201810323750.0A
Publication of CN108453707A
Application granted
Publication of CN108453707B
Legal status: Active
Anticipated expiration


Abstract

The invention provides a robot drag-teaching trajectory generation method comprising the following steps: receive the joint position values acquired by the robot's sensors, filter them to eliminate high-frequency jitter, and differentiate the filtered values to obtain the robot's velocity and acceleration; from the resulting joint positions, velocities and accelerations, approximate the user's dragged teaching trajectory by piecewise straight-line fitting and compute a description of the robot's trajectory in Cartesian space; and from the resulting straight-line segments, turning zones and speeds, generate a user program file describing the trajectory, converting the dragged trajectory into text instructions so that the taught trajectory can be modified and optimized. The method improves the smoothness of the generated trajectory; the fitting precision can be set by the user as the situation requires, giving high flexibility; and the turning zones that smoothly join the straight-line segments ensure trajectory continuity.

Description

Robot dragging teaching track generation method
Technical Field
The invention relates to the technical field of industrial robots, in particular to a robot dragging teaching track generation method.
Background
The range of applications of industrial robots is continuously expanding: they are gradually spreading from traditional scenarios such as automobile manufacturing, electronic assembly and food processing into emerging fields such as consumer and service applications. These fields place higher demands on the ease of use and convenience of robots.
Industrial robots are typically taught in order to determine the desired target points and trajectories, and then repeatedly execute the taught trajectory in response to external signals. A traditional industrial robot obtains its user program either through on-site programming with a teach pendant or through offline programming software. This requires considerable professional skill from the user, so the barrier to use is high; moreover, the teach-pendant approach is not intuitive, and programming can take a long time for users unfamiliar with the robot.
Disclosure of Invention
The object of the present invention is to solve at least one of the technical drawbacks mentioned.
Therefore, the invention aims to provide a robot dragging teaching track generation method.
In order to achieve the above object, an embodiment of the present invention provides a method for generating a robot dragging teaching track, including the following steps:
step S1, receiving the joint position values collected by the robot's sensors, filtering them to eliminate high-frequency jitter, and differentiating the filtered values to obtain the robot's velocity and acceleration;
step S2, using the joint position values, velocities and accelerations produced in step S1, approximating the user's dragged teaching trajectory by piecewise straight-line fitting and computing the description of the robot's trajectory information in Cartesian space, including: forming a continuous Cartesian-space position P from the continuous joint position values via the robot's forward kinematics and obtaining straight-line segments by iterative fitting; constructing turning zones between the straight-line segments to achieve smooth transitions between them; and taking the average speed of the path on the teaching curve as the desired speed of each straight-line segment, where the Cartesian-space velocity is v(t) = Jac(t) × J_dot(t), Jac(t) being the Jacobian matrix at the corresponding time and J_dot(t) the joint angular velocity output in step S1;
and step S3, generating a user program file describing the trajectory from the straight-line segments, turning zones and speeds obtained in step S2, converting the user's dragged trajectory into text instructions so that the taught trajectory can be modified and optimized.
Further, in the step S1, joint position values are collected by an encoder installed at a robot joint or a motor end.
Further, in step S1, the joint position value is filtered by combining a band-stop filter and a band-pass filter.
Further, in the step S2, the fitting to obtain a straight line segment includes the following steps:
given the joint positions J(t), the Cartesian-space position P(t) = FK(J(t)) is obtained via the robot's forward kinematics; the continuous joint positions J form a continuous Cartesian-space position P, where t is the acquisition time;
let F0 be the start point of a straight-line segment and F1 its end point, both points on P, and let L be the straight line through F0 and F1; integrate fabs(L(t) - P(t)), where fabs is the absolute-value function; the result of the integration represents the error between the fitted line and the actual curve, and F1 is a valid end point of the straight-line path when this error is smaller than a set threshold; by searching for valid F1 on P and iterating continuously, the fitted straight-line segments are obtained.
Further, in the step S2, the radius of the turning area is determined according to the front-rear straight line segment length.
Further, in the step S3, the user program file includes: movel P0, V100, Z20, where P0 is the straight line segment target position, V100 represents velocity magnitude, and Z20 represents turn zone radius.
The robot dragging teaching track generation method has the following beneficial effects:
(1) filtering eliminates friction effects and the high-frequency jitter introduced by the human hand, improving the smoothness of the subsequently generated trajectory;
(2) the user's dragged teaching trajectory is fitted with straight lines in Cartesian space, and the fitting precision can be set by the user as the situation requires, giving high flexibility;
(3) the turning zones smoothly join the straight-line segments, ensuring trajectory continuity;
(4) the user trajectory is stored as text instructions, so target points, speeds or turning zones can be modified and refined as needed, which is convenient and flexible.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The above and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a flowchart of a robot drag teaching trajectory generation method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of velocities resulting from directly differencing raw position data according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a fitted straight line compared to a taught trace according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of turn zone generation according to an embodiment of the present invention;
fig. 5 is a schematic diagram of a robot dragging teaching track generation method according to an embodiment of the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or to elements having the same or similar function throughout. The embodiments described below with reference to the drawings are illustrative; they are intended to explain the invention and are not to be construed as limiting it.
As shown in fig. 1 and fig. 5, a robot dragging teaching track generating method according to an embodiment of the present invention includes the following steps:
and step S1, receiving the joint position value collected by the sensor of the robot, filtering the joint position value to eliminate high-frequency jitter, and carrying out difference processing on the filtered joint position value to obtain the speed and the acceleration of the robot.
In one embodiment of the invention, joint position values are collected by encoders mounted at the robot's joints or motor ends, and the data acquired from these sensors are first given preliminary processing. The raw data collected are joint position values.
Trajectory generation requires velocity and acceleration in addition to position, but these generally cannot be measured directly and must be obtained by differentiating the position. Analysis of actual experimental data shows that, whether dragging at low or high speed, the acquired positions contain a certain amount of high-frequency jitter, which becomes more pronounced after differentiation into velocity and acceleration (fig. 2 shows the velocity obtained by directly differencing the raw data). If a trajectory were generated directly from these data, the robot would be prone to jitter when playing it back; the data are therefore processed further before the trajectory information is extracted.
There are two main sources of fluctuation in the position data. One is friction in the robot joints, which is nonlinear and can be described by the following formula, where fe is the external force, fs the maximum static friction, fv the viscous friction coefficient and v the relative velocity:
f = fe when v = 0 and |fe| < fs; f = fs·sign(v) + fv·v when v ≠ 0 (the formula appears only as an image in the original patent; this is the standard stick-slip form consistent with the variables listed).
Because the parameters of this friction model are time-varying and hard to determine, friction is difficult to eliminate by friction compensation alone; its presence affects the feel of dragging, so some overshoot and oscillation are unavoidable when a human hand drags the robot. The other source is the natural tremor frequency of the human hand during dragging, which is likewise difficult to avoid.
To address these problems, the invention filters the joint position values with a combination of a band-stop filter and a band-pass filter: the raw joint position data are filtered to eliminate high-frequency jitter, after which velocity and acceleration are obtained by differencing. The band-stop filter removes the hand's characteristic tremor frequency, and the band-pass filter eliminates the high-frequency jitter caused by friction while preserving the intended trajectory information.
And step S2, approaching the user dragging teaching track by adopting a piecewise straight line fitting mode according to the joint position value, the speed and the acceleration generated in the step S1, and calculating the description of the track information of the robot in a Cartesian space.
(1) And (3) forming a continuous Cartesian space position P through the positive solution of the robot according to the continuous joint position values, and obtaining a straight line segment through iterative fitting.
A robot trajectory can be represented in two ways: in joint space or in Cartesian space. From the user's point of view, the expected trajectory description is the motion of the tool relative to the workpiece, so a Cartesian-space description is more appropriate. Since the trajectory of a user's drag-teaching can be an arbitrary spatial curve, it cannot be described by a single uniform expression; the invention therefore approximates the dragged teaching trajectory by piecewise straight-line fitting.
Specifically, the fitting to obtain the straight line segment comprises the following steps:
Given the joint positions J(t), the Cartesian-space position P(t) = FK(J(t)) is obtained via the robot's forward kinematics; the continuous joint positions J form a continuous Cartesian-space position P, with t the acquisition time.
Let F0 be the start point of a straight-line segment and F1 its end point, both points on P, and let L be the straight line through F0 and F1. Integrate fabs(L(t) - P(t)), where fabs is the absolute-value function; the result represents the error between the fitted line and the actual curve, and F1 is a valid segment end point when this error is below a set threshold. The fitted straight-line segments are obtained iteratively by searching for valid F1 on P, as shown in fig. 3.
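The forward-kinematics mapping and the iterative endpoint search above can be sketched as follows. A minimal planar 2-link arm stands in for the real robot's FK, and the greedy search extends each segment endpoint F1 as far along P as the integrated deviation allows; the link lengths and the error threshold are illustrative assumptions, and the exact search order in the patent may differ.

```python
import numpy as np

def fk_planar_2link(joints, l1=0.4, l2=0.3):
    """P = FK(J) for an illustrative planar 2-link arm.

    Link lengths l1, l2 are placeholders; the patent assumes the real
    robot's kinematic model.
    """
    t1, t2 = joints
    return np.array([l1 * np.cos(t1) + l2 * np.cos(t1 + t2),
                     l1 * np.sin(t1) + l2 * np.sin(t1 + t2)])

def dist_to_chord(p, a, b):
    """Distance from curve sample p to the chord from a to b."""
    ab = b - a
    denom = np.dot(ab, ab)
    s = 0.0 if denom == 0 else np.clip(np.dot(p - a, ab) / denom, 0.0, 1.0)
    return np.linalg.norm(p - (a + s * ab))

def fit_line_segments(P, dt, err_max):
    """Greedy piecewise-linear fit of a sampled Cartesian path P (N, dim).

    Extend the segment end index i1 as far along P as the integrated
    absolute deviation of the curve from the candidate chord stays under
    err_max, then start the next segment there.
    """
    segments, i0 = [], 0
    while i0 < len(P) - 1:
        i1 = i0 + 1
        for j in range(i0 + 2, len(P)):
            # discrete version of integrating fabs(L(t) - P(t))
            err = dt * sum(dist_to_chord(P[k], P[i0], P[j])
                           for k in range(i0, j + 1))
            if err > err_max:
                break
            i1 = j
        segments.append((i0, i1))
        i0 = i1
    return segments
```

On an L-shaped sampled path, for example, a tight threshold yields exactly two segments, one per leg of the L.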
(2) And (4) constructing a turning area between the straight line sections to realize smooth transition between the straight line sections.
The straight-line segments fitted above are continuous only in position; the slope changes abruptly between them, which would cause discontinuous robot motion. A turning zone is therefore constructed between consecutive straight-line segments, with its radius determined by the lengths of the two adjacent segments. The turning zone provides a smooth transition between the segments, as shown in fig. 4.
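The patent states only that the turning-zone radius depends on the lengths of the two adjacent segments. One simple rule consistent with that statement, shown here purely as an illustrative assumption (the ratio and cap values are not from the patent), takes a fraction of the shorter neighbor and caps the result so the blend never consumes a whole segment:

```python
def turn_zone_radius(len_prev, len_next, ratio=0.25, r_max=0.05):
    """Blend radius from the two adjacent segment lengths (meters).

    The patent only states the dependence on both lengths; the
    ratio/cap rule here is an illustrative choice.
    """
    return min(ratio * min(len_prev, len_next), r_max)
```

Tying the radius to the shorter neighbor keeps the blended arc inside both segments even when one of them is very short.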
(3) The desired speed of each straight-line segment is the average speed of the path on the teaching curve. The Cartesian-space velocity is v(t) = Jac(t) × J_dot(t), where Jac(t) is the Jacobian matrix at the corresponding time and J_dot(t) is the joint angular velocity output in step S1.
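The velocity mapping v(t) = Jac(t) × J_dot(t) can be made concrete with the same illustrative planar 2-link arm (link lengths are placeholder values; the real robot would use its own analytic Jacobian):

```python
import numpy as np

def jacobian_planar_2link(joints, l1=0.4, l2=0.3):
    """Analytic Jacobian Jac(t) of an illustrative planar 2-link arm."""
    t1, t2 = joints
    s1, c1 = np.sin(t1), np.cos(t1)
    s12, c12 = np.sin(t1 + t2), np.cos(t1 + t2)
    return np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                     [ l1 * c1 + l2 * c12,  l2 * c12]])

def cartesian_velocity(joints, joint_vel):
    """v(t) = Jac(t) @ J_dot(t): map joint angular rates to tool velocity."""
    return jacobian_planar_2link(joints) @ np.asarray(joint_vel)
```

Averaging the norm of v(t) over the samples belonging to one fitted segment then gives that segment's desired speed.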
And step S3, generating a user program file for describing the track according to the straight line segment, the turning area and the speed obtained in the step S2, and converting the track dragged by the user into a text instruction so as to modify and optimize the taught track.
In one embodiment of the invention, the user program file contains instructions such as: Movel P0, V100, Z20, where P0 is the target position of the straight-line segment, V100 the speed, and Z20 the turning-zone radius. Each segment of the user's dragged trajectory is thus converted into a line of text, making it easy to modify and optimize the taught trajectory.
With the robot drag-teaching trajectory generation method of the present invention, the user drags the robot by hand to a specified pose or along a specific path; the robot collects sensor information, records the target-point or trajectory data, and generates the teaching trajectory after processing, which the user then reproduces through trajectory playback. This intuitive teaching mode lowers the demands on operators, can greatly shorten the time cost of application deployment, and improves programming efficiency, giving it clear practicality and economic value.
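Serializing the fitted segments into the text form described above is straightforward; the sketch below emits one Movel line per segment in the patent's example format, with the point naming (P0, P1, …) and integer rounding as illustrative choices:

```python
def emit_user_program(speeds, zones):
    """Serialize fitted segments into editable text instructions of the
    form 'Movel P0, V100, Z20'.

    speeds: desired speed per segment; zones: turning-zone radius per
    segment. Instruction format follows the patent's example; the point
    naming and rounding here are illustrative.
    """
    return "\n".join(f"Movel P{i}, V{round(v)}, Z{round(z)}"
                     for i, (v, z) in enumerate(zip(speeds, zones)))
```

Because the output is plain text, the user can afterwards edit a target point, speed or zone value directly in the program file, which is the flexibility the method aims for.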
The robot dragging teaching track generation method has the following beneficial effects:
(1) filtering eliminates friction effects and the high-frequency jitter introduced by the human hand, improving the smoothness of the subsequently generated trajectory;
(2) the user's dragged teaching trajectory is fitted with straight lines in Cartesian space, and the fitting precision can be set by the user as the situation requires, giving high flexibility;
(3) the turning zones smoothly join the straight-line segments, ensuring trajectory continuity;
(4) the user trajectory is stored as text instructions, so target points, speeds or turning zones can be modified and refined as needed, which is convenient and flexible.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made in the above embodiments by those of ordinary skill in the art without departing from the principle and spirit of the present invention. The scope of the invention is defined by the appended claims and equivalents thereof.

Claims (5)

1. A robot dragging teaching track generation method is characterized by comprising the following steps:
step S1, receiving joint position values collected by a sensor of the robot in the process of carrying out handheld traction teaching on the robot, carrying out filtering processing on the joint position values to eliminate high-frequency jitter, and carrying out differential processing on the filtered joint position values to obtain the speed and the acceleration of the robot;
step S2, according to the joint position value, velocity and acceleration generated in step S1, approximating the user dragging teaching trajectory by means of piecewise linear fitting, and calculating the description of the trajectory information of the robot in Cartesian space, including: forming a continuous Cartesian space position P through the forward kinematics of the robot according to the continuous joint position values, and obtaining a straight line segment through iterative fitting; constructing a turning area between the straight line sections to realize smooth transition between the straight line sections; taking an average speed of the path on the teaching curve as a desired speed of the straight line segment, wherein the speed in Cartesian space is v(t) = Jac(t) × J_dot(t), Jac(t) is the Jacobian matrix at the corresponding time, and J_dot(t) is the joint angular speed output in step S1;
step S3, generating a user program file for describing the track according to the straight line segment, the turning area and the speed obtained in the step S2, and converting the track dragged by the user into a text instruction so as to modify and optimize the taught track;
in the step S1, a band elimination filter and a band pass filter are combined to filter the joint position value, where the band elimination filter filters out the hand shake with specific frequency, and the band pass filter eliminates the high-frequency shake caused by the friction force of the robot joint.
2. A robot drag teaching trajectory generation method according to claim 1, wherein in step S1, joint position values are acquired by an encoder mounted on a robot joint or a motor terminal.
3. A robot drag teaching trajectory generation method as claimed in claim 1, wherein in said step S2, fitting a straight line segment comprises the steps of:
the joint position J(t) is known, the Cartesian-space position P(t) = FK(J(t)) is obtained through the forward kinematics of the robot, the continuous Cartesian space position P is formed from the continuous joint positions J, and t is the acquisition time;
setting the starting point of a straight line segment as F0 and its end point as F1, F0 and F1 each being a point on P; the straight line formed by F0 and F1 is L; integrating fabs(L(t) - P(t)), wherein fabs is the absolute value function and the integrated result represents the error between the fitted line and the actual curve; when the error is smaller than a set threshold, F1 is a valid straight-line-segment path end point; searching for valid F1 on P and iterating continuously yields the fitted straight line segments.
4. A robot drag teaching trajectory generating method according to claim 1, wherein in said step S2, the radius of said turning area is determined based on the length of the front and rear straight line segments.
5. The robot drag teaching trajectory generating method according to claim 1, wherein in said step S3, the user program file includes: movel P0, V100, Z20, where P0 is the straight line segment target position, V100 represents velocity magnitude, and Z20 represents turn zone radius.
CN201810323750.0A · filed 2018-04-12 · Robot dragging teaching track generation method · Active · CN108453707B (en)

Priority Applications (1)

Application Number · Priority Date · Filing Date · Title
CN201810323750.0A (CN108453707B) · 2018-04-12 · 2018-04-12 · Robot dragging teaching track generation method

Applications Claiming Priority (1)

Application Number · Priority Date · Filing Date · Title
CN201810323750.0A (CN108453707B) · 2018-04-12 · 2018-04-12 · Robot dragging teaching track generation method

Publications (2)

Publication Number · Publication Date
CN108453707A (en) · 2018-08-28
CN108453707B (en) · 2021-11-19

Family

ID=63234675

Family Applications (1)

Application Number · Title · Priority Date · Filing Date
CN201810323750.0A (Active, CN108453707B) · Robot dragging teaching track generation method · 2018-04-12 · 2018-04-12

Country Status (1)

Country · Link
CN · CN108453707B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number · Priority date · Publication date · Assignee · Title
CN109483517A (en)* · 2018-10-22 · 2019-03-19 · 天津扬天科技有限公司 · A kind of cooperation robot teaching method based on the tracking of hand appearance
CN109648571A (en)* · 2018-12-28 · 2019-04-19 · 深圳市越疆科技有限公司 · Teaching trajectory reproducing method, system and the robot of industrial robot
CN109648549A (en)* · 2018-12-30 · 2019-04-19 · 江苏集萃智能制造技术研究所有限公司 · Method for planning track of robot and robot
CN111482957B (en)* · 2019-07-12 · 2020-12-29 · 上海智殷自动化科技有限公司 · Vision offline demonstrator registration method
CN110561421B (en)* · 2019-08-09 · 2021-03-19 · 哈尔滨工业大学(深圳) · Mechanical arm indirect dragging demonstration method and device
CN112276947B (en)* · 2020-10-21 · 2021-06-15 · 乐聚(深圳)机器人技术有限公司 · Robot motion simulation method, device, equipment and storage medium
CN113618710B (en)* · 2021-07-21 · 2023-03-24 · 慧灵科技(深圳)有限公司 · Dragging teaching method and device and dragging teaching robot
CN114227688B (en)* · 2021-12-29 · 2023-08-04 · 同济大学 · A Teaching Trajectory Learning Method Based on Curve Registration
CN114800513B (en)* · 2022-05-10 · 2024-03-29 · 上海交通大学 · System and method for automatically generating robot shaft hole assembly program based on single drag teaching
CN115685890B (en)* · 2022-11-04 · 2025-01-07 · 深圳市灵手科技有限公司 · Multi-joint equipment trajectory determination method, system, device and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number · Priority date · Publication date · Assignee · Title
CN105005381A (en)* · 2015-07-08 · 2015-10-28 · 西安电子科技大学 · Shake elimination method for virtual robot arm interaction
CN105425725A (en)* · 2015-12-09 · 2016-03-23 · 华中科技大学 · Curve fitting method for discrete cutter path
CN105573315A (en)* · 2015-12-01 · 2016-05-11 · 珞石(北京)科技有限公司 · Geometric smoothing method for Cartesian space trajectory of industrial robot
CN105710881A (en)* · 2016-03-16 · 2016-06-29 · 杭州娃哈哈精密机械有限公司 · Continuous trajectory planning transition method for robot tail end
CN107127751A (en)* · 2017-03-21 · 2017-09-05 · 宁波韦尔德斯凯勒智能科技有限公司 · Articulated manipulator controls integral control system and control method
CN107544299A (en)* · 2017-08-07 · 2018-01-05 · 浙江工业大学 · PC (personal computer) end APP (application) system for teaching control of six-degree-of-freedom mechanical arm
CN107571261A (en)* · 2017-08-30 · 2018-01-12 · 中国科学院自动化研究所 · The smooth transient method and device of the more space tracking planning of teaching robot

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number · Priority date · Publication date · Assignee · Title
JP5752500B2 (en)* · 2011-06-27 · 2015-07-22 · 本田技研工業株式会社 · Orbit generation system


Also Published As

Publication number · Publication date
CN108453707A (en) · 2018-08-28

Similar Documents

Publication · Title
CN108453707B (en) · Robot dragging teaching track generation method
CN108340351B (en) · Robot teaching device and method and teaching robot
CN108789418B (en) · Control method of flexible manipulator
KR20200031081A (en) · Vibration control of configuration-dependent dynamic systems
Berghuis · Model-based robot control: From theory to practice
CN106774181B (en) · The method for control speed of high-precision traction teaching robot based on impedance model
Jin et al. · Real-time quadratic sliding mode filter for removing noise
CN106444372A (en) · Sliding mode repetitive controller for motor servo system
WO2020135608A1 (en) · Industrial robot demonstration track recurrence method and system and robot
CN105710881A (en) · Continuous trajectory planning transition method for robot tail end
CN107214702A (en) · The method and system for planning of robot trajectory is determined using virtual reality handle
CN102707671A (en) · Processing path optimization method applied to machine tool
CN110703684B (en) · Trajectory planning method and device with unlimited endpoint speed
CN110281237A (en) · A kind of serial manipulator joint-friction power discrimination method based on machine learning
TWI271605B (en) · Method of determining permissible feed speed of an object and controlling the object
GB2569139A (en) · Three-dimensional drawing tool and method
Fliess et al. · Revisiting some practical issues in the implementation of model-free control
CN105563482A (en) · Rotation movement planning method for end effector of industrial robot
CN108051001B (en) · A robot movement control method, system and inertial sensing control device
Ying et al. · A Human Intention Based Fuzzy Variable Admittance Control System for Physical Human–Robot Interaction
CN102298325B (en) · Variable parameter control method of sine instruction
CN118031812A (en) · Time grating displacement high-speed high-precision measurement method based on dynamic prediction interpolation resampling
CN117506867A (en) · Wearable exoskeleton arm equipment and teleoperation control method thereof
CN115781683A (en) · Online trajectory planning method and device for mechanical arm and computer readable medium
CN116160443A (en) · A Data-Driven Rolling Time Domain Dynamics Forecasting Method

Legal Events

Date · Code · Title · Description
PB01 · Publication
SE01 · Entry into force of request for substantive examination
GR01 · Patent grant
