CN118802992A - A robot remote control method and system - Google Patents

A robot remote control method and system

Info

Publication number
CN118802992A
Authority
CN
China
Prior art keywords
mode
robot
gesture
operation mode
gesture image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310439912.8A
Other languages
Chinese (zh)
Inventor
冯伟
周凯臣
王卫军
安鲸
许睿烁
车其姝
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Institute of Advanced Technology of CAS
Original Assignee
Shenzhen Institute of Advanced Technology of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Institute of Advanced Technology of CAS
Priority to CN202310439912.8A
Priority to PCT/CN2023/137167 (WO2024212553A1)
Publication of CN118802992A
Status: Pending


Abstract

Translated from Chinese

The present application relates to a robot remote control method and system. The method comprises: acquiring the operator's gesture data through a detection device; projecting and mapping the gesture data into Unity, and displaying the gesture image through a VR operating system interface; identifying the operation mode corresponding to the gesture image according to preset mode parameters; and remotely controlling the robot to perform the corresponding operation action according to the operation mode. The present application solves problems such as the perception error caused by the back of the hand occluding the fingers when the Leap Motion device is mounted on the head. Gesture recognition and remote control can be completed with simple one-handed gestures, without extensive gesture training, which makes operation simpler.

Description

Translated from Chinese
A robot remote control method and system

Technical Field

The present application belongs to the field of human-computer interaction technology, and in particular relates to a robot remote control method and system.

Background Art

In today's highly industrialized world, robots account for an ever larger share of production environments and are used ever more widely. With the emergence and rise of digital twin and mixed reality technologies, their application to remote robots has attracted increasing attention. A digital twin is a virtual model of a physical object created digitally; it offers real-time synchronization, high fidelity and many other characteristics that promote the interaction and integration of the physical world and the information world. These characteristics make teleoperation through remote human-computer interaction an achievable future: for example, in special settings such as factory workshops with hazardous factors or sterile production workshops, teleoperation through remote human-computer interaction can be used to achieve safe and qualified production.

At present, there are various methods for controlling robots through remote human-computer interaction. However, in mixed reality scenarios, existing methods still have the following shortcomings:

1. Traditional methods of controlling robots through remote human-computer interaction mostly rely on control devices such as mice, keyboards or handheld controllers, which makes operation inconvenient.

2. In the prior art, when gestures are used for remote robot control, the operator must use both hands, and a large amount of gesture data is required for training.

3. When a device such as a Leap Motion (somatosensory controller) is mounted on the head for remote robot control, perception errors easily arise, resulting in unstable output data and unstable robot motion.

Summary of the Invention

The present application provides a robot remote control method and system, aiming to solve, at least to a certain extent, one of the above technical problems in the prior art.

To solve the above problems, the present application provides the following technical solutions:

A robot remote control method, comprising:

acquiring the operator's gesture data through a detection device;

projecting and mapping the gesture data into Unity, and displaying the gesture image through a VR operating system interface;

identifying the operation mode corresponding to the gesture image according to preset mode parameters, and remotely controlling the robot to perform the corresponding operation action according to the operation mode.

The technical solution adopted in the embodiments of the present application further includes: the detection device is a Leap Motion device.

The technical solution adopted in the embodiments of the present application further includes: the mode parameters include rotation movements and angle parameters of the operator's fingers, gestures and/or wrist, and the operation modes include a no-control mode, a direction control mode, a position control mode and a grasping mode.

The technical solution adopted in the embodiments of the present application further includes: identifying the operation mode corresponding to the gesture image according to the preset mode parameters, and remotely controlling the robot to perform the corresponding operation action according to the operation mode, comprises:

identifying, according to the preset mode parameters, whether the operation mode of the gesture image is the no-control mode, and if so, controlling the robot to suspend operation; wherein the no-control mode is determined as follows: judge whether the hand-opening angle in the gesture image is 0 radians; if it is 0 radians, the operation mode of the gesture image is determined to be the no-control mode.

The technical solution adopted in the embodiments of the present application further includes: identifying the operation mode corresponding to the gesture image according to the preset mode parameters, and remotely controlling the robot to perform the corresponding operation action according to the operation mode, comprises:

identifying, according to the preset mode parameters, whether the operation mode of the gesture image is the direction control mode, and if so, controlling the robot to perform a rolling operation; wherein the direction control mode is determined as follows: judge whether the gesture image shows rotation with the hand open and an angle of 20° or more between the wrist joint and the palm; if so, the operation mode of the gesture image is determined to be the direction control mode.

The technical solution adopted in the embodiments of the present application further includes: identifying the operation mode corresponding to the gesture image according to the preset mode parameters, and remotely controlling the robot to perform the corresponding operation action according to the operation mode, comprises:

identifying, according to the preset mode parameters, whether the operation mode of the gesture image is the position control mode; the position control mode is determined as follows: detect whether the operator extends only a single finger; if so, the operation mode corresponding to the gesture image is determined to be the position control mode, the pointing direction of the finger is detected, and the robot is controlled to move in the direction the finger points.

The technical solution adopted in the embodiments of the present application further includes: identifying the operation mode corresponding to the gesture image according to the preset mode parameters, and remotely controlling the robot to perform the corresponding operation action according to the operation mode, comprises:

identifying, according to the preset mode parameters, whether the operation mode of the gesture image is the grasping control mode, and if so, controlling the robot to perform a grasping operation; wherein the grasping control mode is determined as follows: judge whether the operator makes a fist; if so, the operation mode of the gesture image is determined to be the grasping control mode.

Another technical solution adopted in the embodiments of the present application is a robot remote control system, comprising:

a detection module, used to acquire the operator's gesture data;

a projection module, used to project and map the gesture data into Unity and display the gesture image through a VR operating system interface;

a mode recognition module, used to identify the operation mode corresponding to the gesture image according to the preset mode parameters;

a remote control module, used to remotely control the robot to perform the corresponding operation action according to the operation mode.

Compared with the prior art, the beneficial effects of the embodiments of the present application are as follows: the robot remote control method and system of the embodiments combine a Leap Motion device with one-handed gestures, recognize the operation mode of the operator's gesture according to preset mode parameters, and remotely control the robot according to the recognition result. This solves problems such as the perception error caused by the back of the hand occluding the fingers when the Leap Motion device is mounted on the head, thereby ensuring that the robot can move stably. The embodiments complete gesture recognition and remote control with simple one-handed gestures, without extensive gesture training, which makes operation simpler; no handheld controller, mouse or similar device is needed, which greatly reduces the dependence on external devices; and after optimization the system can achieve more accurate recognition.

Brief Description of the Drawings

FIG. 1 is a flowchart of the robot remote control method according to the first embodiment of the present application;

FIG. 2 is a flowchart of the robot remote control method according to the second embodiment of the present application;

FIG. 3 is a schematic diagram of the gesture corresponding to the no-control mode in an embodiment of the present application;

FIG. 4 is a flowchart of the robot remote control method according to the third embodiment of the present application;

FIG. 5 is a schematic diagram of the gesture corresponding to the direction control mode in an embodiment of the present application;

FIG. 6 is a flowchart of the robot remote control method according to the fourth embodiment of the present application;

FIG. 7 is a schematic diagram of the gesture corresponding to the position control mode in an embodiment of the present application;

FIG. 8 is a flowchart of the robot remote control method according to the fifth embodiment of the present application;

FIG. 9 is a schematic diagram of the gesture corresponding to the grasping control mode in an embodiment of the present application;

FIG. 10 is a schematic diagram of the structure of the robot remote control system according to an embodiment of the present application.

Detailed Description

The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. Based on the embodiments of the present application, all other embodiments obtained by a person of ordinary skill in the art without creative effort fall within the scope of protection of the present application.

The terms "first", "second" and "third" in the present application are used for descriptive purposes only and shall not be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature qualified by "first", "second" or "third" may expressly or implicitly include at least one such feature. In the description of the present application, "a plurality of" means at least two, for example two or three, unless otherwise expressly and specifically defined. In the embodiments of the present application, all directional indications (such as up, down, left, right, front, rear, ...) are used only to explain the relative positional relationship, movement and so on between components in a particular posture (as shown in the accompanying drawings); if that particular posture changes, the directional indication changes accordingly. In addition, the terms "comprising" and "having", and any variations thereof, are intended to cover non-exclusive inclusion. For example, a process, method, system, product or computer device comprising a series of steps or units is not limited to the listed steps or units, but optionally further comprises steps or units not listed, or optionally further comprises other steps or units inherent to the process, method, product or computer device.

Reference herein to "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment may be included in at least one embodiment of the present application. The appearance of this phrase in various places in the specification does not necessarily refer to the same embodiment, nor to an independent or alternative embodiment mutually exclusive with other embodiments. It is explicitly and implicitly understood by those skilled in the art that the embodiments described herein may be combined with other embodiments.

Referring to FIG. 1, a flowchart of the robot remote control method according to the first embodiment of the present application, the method comprises the following steps:

S100: acquiring the operator's gesture data through a detection device;

In this step, the detection device is a somatosensory interactive device with a camera function, such as a Leap Motion, which tracks gesture data such as the operator's hand movements and motion trajectories in order to perform different remote control operations on the robot.

S110: projecting and mapping the acquired gesture data into Unity, and displaying the gesture image through the VR operating system interface;

In this step, Unity is a platform for creating and operating real-time interactive 3D content; it can be used to create, operate and monetize any real-time interactive 2D and 3D content and is a basic tool for developing virtual reality software. In the embodiments of the present application, the gesture data acquired by the Leap Motion device is projected and mapped into Unity in real time, and the gesture image is displayed through the VR operating system interface, so as to achieve a digital twin effect.
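For illustration only, the sketch below shows one way the projection-and-mapping step could convert a hand position reported by the sensor into the coordinate frame of the virtual scene. It is a minimal Python sketch under assumed conventions: the millimetre units, axis convention, scale factor, offset and function names are assumptions for the example and are not taken from the patent.

```python
import numpy as np

# Assumed: the sensor reports positions in millimetres in its own right-handed
# frame, while the virtual scene uses metres with a different axis convention.
MM_TO_M = 0.001

# Illustrative rotation that remaps axes between the two frames, plus an
# offset placing the sensor origin somewhere inside the scene (assumed values).
SENSOR_TO_SCENE_ROTATION = np.array([
    [1.0, 0.0,  0.0],
    [0.0, 0.0, -1.0],
    [0.0, 1.0,  0.0],
])
SENSOR_ORIGIN_IN_SCENE = np.array([0.0, 1.2, 0.3])  # metres

def sensor_to_scene(p_sensor_mm: np.ndarray) -> np.ndarray:
    """Map one 3D point from the sensor frame to the virtual-scene frame."""
    p_m = p_sensor_mm * MM_TO_M
    return SENSOR_TO_SCENE_ROTATION @ p_m + SENSOR_ORIGIN_IN_SCENE

if __name__ == "__main__":
    palm_mm = np.array([12.0, 180.0, -35.0])  # example palm position from the sensor
    print(sensor_to_scene(palm_mm))           # where the virtual palm would be drawn
```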

S120: identifying the operation mode corresponding to the gesture image according to the preset mode parameters, and remotely controlling the robot to perform the corresponding operation action according to the operation mode;

In this step, the operation modes include, but are not limited to, the no-control mode, the direction control mode, the position control mode and the grasping mode. The mode parameters corresponding to each operation mode can be preset according to the actual application scenario and include, but are not limited to, parameters such as the rotation movements and angles of the fingers, gestures and/or wrist. During remote control of the robot, the operator makes the corresponding gesture according to the preset mode parameters so as to control the robot to complete the corresponding operation function.
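As a concrete illustration of this rule-based recognition, the following minimal Python sketch applies the thresholds described in this application (a hand-opening angle of 0 radians, a wrist-palm angle of 20° or more, a single extended finger, a fist). The field names, the precedence of the checks and the small tolerance are assumptions added for the example; the text specifies the individual conditions but not how conflicts between them are resolved.

```python
from enum import Enum, auto

class Mode(Enum):
    NO_CONTROL = auto()
    DIRECTION = auto()
    POSITION = auto()
    GRASP = auto()

def recognize_mode(hand_open_angle_rad: float,
                   wrist_palm_angle_deg: float,
                   extended_fingers: int,
                   is_fist: bool,
                   eps: float = 1e-2) -> Mode:
    """Rule-based recognition using the thresholds described in the text.

    The ordering of the checks and the tolerance eps are assumptions; the
    application describes each rule but not their precedence.
    """
    if is_fist:                               # fist -> grasping control mode
        return Mode.GRASP
    if extended_fingers == 1:                 # exactly one extended finger -> position control
        return Mode.POSITION
    if abs(hand_open_angle_rad) < eps:        # hand-opening angle of 0 rad -> no control
        return Mode.NO_CONTROL
    if wrist_palm_angle_deg >= 20.0:          # open hand rotated, wrist-palm angle >= 20 degrees
        return Mode.DIRECTION
    return Mode.NO_CONTROL                    # default: leave the robot paused

# Example: an open hand rotated by 30 degrees selects the direction control mode.
print(recognize_mode(1.2, 30.0, extended_fingers=5, is_fist=False))
```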

Referring to FIG. 2, a flowchart of the robot remote control method according to the second embodiment of the present application, the method comprises the following steps:

S200: acquiring the operator's gesture data through a detection device;

In this step, the gesture data detection process is the same as in S100 and, to avoid redundancy, is not repeated here.

S210: projecting and mapping the acquired gesture data into Unity, and displaying the gesture image through the VR operating system interface;

In this step, the projection of the gesture data is the same as in S110 and, to avoid redundancy, is not repeated here.

S220: identifying, according to the preset mode parameters, whether the operation mode of the gesture image is the no-control mode, and if so, executing S230;

In this step, whether the operation mode of the gesture image is the no-control mode is identified according to the preset mode parameters as follows: judge whether the hand-opening angle in the gesture image is 0 radians; if it is 0 radians, the operation mode corresponding to the current gesture image is determined to be the no-control mode, as shown in FIG. 3, a schematic diagram of the gesture corresponding to the no-control mode in an embodiment of the present application. It can be understood that the mode parameters corresponding to this mode are not limited thereto and can be set according to the application scenario.
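In a practical implementation, comparing the measured angle with exactly 0 radians would be brittle against sensor noise. The sketch below is one possible, assumed refinement: it uses a small tolerance and requires the condition to hold over several consecutive frames before pausing the robot; neither the tolerance nor the frame count comes from the patent.

```python
class NoControlDetector:
    """Debounced check for the no-control gesture (hand-opening angle near 0 rad).

    The tolerance and the number of consecutive frames are assumed values; the
    text only states the ideal condition of exactly 0 radians.
    """

    def __init__(self, tol_rad: float = 0.05, hold_frames: int = 5):
        self.tol_rad = tol_rad
        self.hold_frames = hold_frames
        self._count = 0

    def update(self, hand_open_angle_rad: float) -> bool:
        """Return True when the robot should be paused."""
        if abs(hand_open_angle_rad) < self.tol_rad:
            self._count += 1
        else:
            self._count = 0
        return self._count >= self.hold_frames
```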

S230: controlling the robot to suspend operation.

Referring to FIG. 4, a flowchart of the robot remote control method according to the third embodiment of the present application, the method comprises the following steps:

S300: acquiring the operator's gesture data through a detection device;

In this step, the gesture data detection process is the same as in S100 and, to avoid redundancy, is not repeated here.

S310: projecting and mapping the acquired gesture data into Unity, and displaying the gesture image through the VR operating system interface;

In this step, the projection of the gesture data is the same as in S110 and, to avoid redundancy, is not repeated here.

S320: identifying, according to the preset mode parameters, whether the operation mode of the gesture image is the direction control mode, and if so, executing S330;

In this step, the direction control mode controls the direction of motion of the robot. Whether the operation mode of the gesture image is the direction control mode is identified according to the preset mode parameters as follows: judge whether the gesture image shows rotation with the hand open and an angle of 20° or more between the wrist joint and the palm; if so, the operation mode of the current gesture image is determined to be the direction control mode, as shown in FIG. 5, a schematic diagram of the gesture corresponding to the direction control mode in an embodiment of the present application. It can be understood that the mode parameters corresponding to this mode are not limited thereto and can be set according to the application scenario.
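A minimal sketch of how the measured wrist-palm angle might be turned into a roll command, assuming a linear scaling above the 20° threshold; the upper angle limit, the maximum roll rate and the sign convention are illustrative assumptions, not values from the patent.

```python
def roll_command(wrist_palm_angle_deg: float,
                 direction: int = 1,
                 threshold_deg: float = 20.0,
                 max_angle_deg: float = 60.0,
                 max_roll_rate: float = 0.5) -> float:
    """Map the wrist-palm angle to a roll-rate command for the robot.

    Angles below the 20-degree threshold give no motion, matching the mode test
    above; between the threshold and max_angle_deg the magnitude scales
    linearly. direction (+1/-1) encodes the sense of the wrist rotation;
    max_angle_deg and max_roll_rate (rad/s) are assumed tuning values.
    """
    if wrist_palm_angle_deg < threshold_deg:
        return 0.0
    span = max_angle_deg - threshold_deg
    scale = min((wrist_palm_angle_deg - threshold_deg) / span, 1.0)
    return direction * scale * max_roll_rate

# Example: a 35-degree wrist rotation commands a moderate roll rate.
print(roll_command(35.0, direction=1))
```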

S330: controlling the robot to perform a rolling operation.

Referring to FIG. 6, a flowchart of the robot remote control method according to the fourth embodiment of the present application, the method comprises the following steps:

S400: acquiring the operator's gesture data through a detection device;

In this step, the gesture data detection process is the same as in S100 and, to avoid redundancy, is not repeated here.

S410: projecting and mapping the acquired gesture data into Unity, and displaying the gesture image through the VR operating system interface;

In this step, the projection of the gesture data is the same as in S110 and, to avoid redundancy, is not repeated here.

S420: identifying, according to the preset mode parameters, whether the operation mode of the gesture image is the position control mode, and if so, executing S430;

In this step, the position control mode controls the position and pointing of the robot. Whether the operation mode of the gesture image is the position control mode is identified according to the preset mode parameters as follows: detect whether the operator extends only a single finger; if so, the operation mode corresponding to the current gesture image is determined to be the position control mode, and the pointing direction of the extended finger is further detected, as shown in FIG. 7, a schematic diagram of the gesture corresponding to the position control mode in an embodiment of the present application. As a preference, the embodiments of the present application take an extended index finger as an example; the specific finger can be set according to the application scenario.
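The sketch below illustrates one assumed way of converting the detected pointing direction into a Cartesian velocity command for the robot; the constant speed and the assumption that the direction has already been transformed into the robot's base frame are choices made for the example.

```python
import numpy as np

def velocity_from_pointing(finger_direction: np.ndarray,
                           speed: float = 0.05) -> np.ndarray:
    """Turn the detected pointing direction into a Cartesian velocity command.

    finger_direction is the 3D direction of the extended finger, assumed to be
    expressed already in the robot's base frame; speed is an assumed constant
    translation speed in m/s.
    """
    norm = np.linalg.norm(finger_direction)
    if norm < 1e-6:                 # degenerate direction: do not move
        return np.zeros(3)
    return (finger_direction / norm) * speed

# Example: pointing roughly forward and slightly up.
print(velocity_from_pointing(np.array([0.9, 0.0, 0.3])))
```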

S430: controlling the robot to move in the direction the finger points.

Further, during gesture detection with a conventional Leap Motion, the fingers are very easily occluded by the back of the hand, which introduces errors into the gesture detection. In this embodiment, to mitigate this error, the gesture detection in the position control mode is optimized as follows: the index finger is first modeled; when index-finger pointing is detected, the fingertip position of the index finger is acquired and set as the initial value; the position coordinates of the palm are then acquired, and the palm position coordinates, which carry a comparatively small perception error, are used to estimate the fingertip position of the index finger, thereby reducing the gesture detection error caused by the finger being occluded by the back of the hand.
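One plausible reading of this optimization, written as a minimal Python sketch: while the fingertip is tracked confidently, remember its offset from the palm; when tracking confidence drops, reuse that offset on the more reliably tracked palm position. The confidence threshold, the offset scheme and the class interface are assumptions made for illustration and are not stated in the patent.

```python
import numpy as np

class FingertipEstimator:
    """Estimate the index fingertip from the palm when the finger is occluded.

    Assumed scheme: when the fingertip is tracked confidently, store its offset
    from the palm; when confidence drops (e.g. the back of the hand occludes
    the finger), apply that stored offset to the current palm position.
    """

    def __init__(self, confidence_threshold: float = 0.6):
        self.confidence_threshold = confidence_threshold
        self._offset = None  # palm -> fingertip offset from the last good frame

    def update(self, palm_pos: np.ndarray, tip_pos: np.ndarray,
               tip_confidence: float) -> np.ndarray:
        if tip_confidence >= self.confidence_threshold:
            self._offset = tip_pos - palm_pos   # trust the sensor, refresh the offset
            return tip_pos
        if self._offset is not None:            # occluded: fall back to palm + offset
            return palm_pos + self._offset
        return tip_pos                          # no history yet: use the raw reading
```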

Referring to FIG. 8, a flowchart of the robot remote control method according to the fifth embodiment of the present application, the method comprises the following steps:

S500: acquiring the operator's gesture data through a detection device;

In this step, the gesture data detection process is the same as in S100 and, to avoid redundancy, is not repeated here.

S510: projecting and mapping the acquired gesture data into Unity, and displaying the gesture image through the VR operating system interface;

In this step, the projection of the gesture data is the same as in S110 and, to avoid redundancy, is not repeated here.

S520: identifying, according to the preset mode parameters, whether the operation mode of the gesture image is the grasping control mode, and if so, executing S530;

In this step, the grasping control mode controls the robot to perform a grasping action. Whether the operation mode of the gesture image is the grasping control mode is identified according to the preset mode parameters as follows: detect whether the operator makes a fist; if so, the operation mode corresponding to the current gesture image is determined to be the grasping control mode, as shown in FIG. 9, a schematic diagram of the gesture corresponding to the grasping control mode in an embodiment of the present application. It can be understood that the mode parameters corresponding to this mode are not limited thereto and can be set according to the application scenario.
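If the sensor exposes a continuous grab-strength value in addition to a boolean fist flag, a little hysteresis keeps the gripper from chattering when the hand hovers near the threshold. This is an assumed refinement sketched in Python below; the thresholds and the grab-strength signal itself are not part of the patent's description.

```python
from typing import Optional

def gripper_command(is_fist: bool,
                    currently_closed: bool,
                    grab_strength: Optional[float] = None,
                    close_above: float = 0.9,
                    open_below: float = 0.5) -> bool:
    """Return True if the gripper should be closed.

    When only a boolean fist flag is available it is used directly; when a
    continuous grab strength in [0, 1] is available, hysteresis thresholds
    avoid rapid open/close chatter near the boundary. Thresholds are assumed.
    """
    if grab_strength is None:
        return is_fist
    if currently_closed:
        return grab_strength > open_below   # stay closed until the hand clearly opens
    return grab_strength > close_above      # close only on a clear fist

# Example: with the gripper open, a grab strength of 0.95 commands it to close.
print(gripper_command(is_fist=True, currently_closed=False, grab_strength=0.95))
```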

S530: controlling the robot to perform a grasping operation.

Based on the above, the robot remote control method of the embodiments of the present application combines a Leap Motion device with one-handed gestures, recognizes the operation mode of the operator's gesture according to preset mode parameters, and remotely controls the robot according to the recognition result. This solves problems such as the perception error caused by the back of the hand occluding the fingers when the Leap Motion device is mounted on the head, thereby ensuring that the robot can move stably. The embodiments complete gesture recognition and remote control with simple one-handed gestures, without extensive gesture training, which makes operation simpler; no handheld controller, mouse or similar device is needed, which greatly reduces the dependence on external devices; and after optimization the system can achieve more accurate recognition.

Referring to FIG. 10, a schematic diagram of the structure of the robot remote control system according to an embodiment of the present application, the robot remote control system 40 of the embodiments of the present application comprises: a detection module, used to acquire the operator's gesture data; a projection module, used to project and map the gesture data into Unity and display the gesture image through the VR operating system interface; a mode recognition module, used to identify the operation mode corresponding to the gesture image according to the preset mode parameters; and a remote control module, used to remotely control the robot to perform the corresponding operation action according to the operation mode.
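As a rough illustration of how these four modules could be wired together in software, the Python sketch below runs a simple sense-display-classify-command loop. The class names, method signatures, stubbed bodies and loop rate are assumptions made for the example, not part of the claimed system.

```python
import time

class DetectionModule:
    """Acquires the operator's gesture data from the sensing device."""
    def read_frame(self) -> dict:
        # Stub: a real implementation would wrap the sensing device's SDK.
        return {"hand_open_angle": 0.0, "wrist_palm_angle": 0.0,
                "extended_fingers": 0, "is_fist": False}

class ProjectionModule:
    """Forwards the gesture data to the Unity/VR front end for display."""
    def show(self, frame: dict) -> None:
        pass  # e.g. serialize the frame and send it to the virtual scene

class ModeRecognitionModule:
    """Maps a gesture frame to an operation mode, following the rules above."""
    def classify(self, frame: dict) -> str:
        if frame["is_fist"]:
            return "grasp"
        if frame["extended_fingers"] == 1:
            return "position"
        if abs(frame["hand_open_angle"]) < 1e-2:
            return "no_control"
        if frame["wrist_palm_angle"] >= 20.0:
            return "direction"
        return "no_control"

class RemoteControlModule:
    """Issues the robot command that corresponds to the recognized mode."""
    def execute(self, mode: str, frame: dict) -> None:
        print("robot command:", mode)

def control_loop(cycles: int = 100, hz: float = 30.0) -> None:
    detection, projection = DetectionModule(), ProjectionModule()
    recognition, control = ModeRecognitionModule(), RemoteControlModule()
    for _ in range(cycles):
        frame = detection.read_frame()
        projection.show(frame)              # keep the digital twin display in sync
        mode = recognition.classify(frame)
        control.execute(mode, frame)
        time.sleep(1.0 / hz)
```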

Based on the above, the robot remote control system of the embodiments of the present application combines a Leap Motion device with one-handed gestures, recognizes the operation mode of the operator's gesture according to preset mode parameters, and remotely controls the robot according to the recognition result. This solves problems such as the perception error caused by the back of the hand occluding the fingers when the Leap Motion device is mounted on the head, thereby ensuring that the robot can move stably. The embodiments complete gesture recognition and remote control with simple one-handed gestures, without extensive gesture training, which makes operation simpler; no handheld controller, mouse or similar device is needed, which greatly reduces the dependence on external devices; and after optimization the system can achieve more accurate recognition.

In the several embodiments provided in the present application, it should be understood that the disclosed system and method may be implemented in other ways. For example, the system embodiments described above are merely illustrative: the division into units is only a division by logical function, and other divisions are possible in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual coupling, direct coupling or communication connection shown or discussed may be indirect coupling or communication connection through interfaces, systems or units, and may be electrical, mechanical or in other forms.

In addition, the functional units in the embodiments of the present application may be integrated into one processing unit, each unit may exist physically separately, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit. The above is only an implementation of the present application and does not limit the patent scope of the present application; any equivalent structural or process transformation made using the contents of the description and drawings of the present application, or any direct or indirect application in other related technical fields, is likewise included within the patent protection scope of the present application.

Claims (10)

CN202310439912.8A (filed 2023-04-12, priority 2023-04-12): A robot remote control method and system. Pending. Published as CN118802992A (en).

Priority Applications (2)

Application Number | Priority Date | Filing Date | Title
CN202310439912.8A (CN118802992A (en)) | 2023-04-12 | 2023-04-12 | A robot remote control method and system
PCT/CN2023/137167 (WO2024212553A1 (en)) | - | 2023-12-07 | Robot remote control method and system

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN202310439912.8A (CN118802992A (en)) | 2023-04-12 | 2023-04-12 | A robot remote control method and system

Publications (1)

Publication Number | Publication Date
CN118802992A | 2024-10-18

Family

ID=93032493

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN202310439912.8A (CN118802992A (en), Pending) | A robot remote control method and system | 2023-04-12 | 2023-04-12

Country Status (2)

Country | Link
CN (1) | CN118802992A (en)
WO (1) | WO2024212553A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN119781620A (en)* | 2024-12-25 | 2025-04-08 | 江苏集萃智能制造技术研究所有限公司 | A remote control method for a semi-humanoid robot based on VR equipment

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN107765855A (en)* | 2017-10-25 | 2018-03-06 | University of Electronic Science and Technology of China | A method and system for controlling robot motion based on gesture recognition
CN108044625B (en)* | 2017-12-18 | 2019-08-30 | Central South University | A robot manipulator control method based on multi-Leapmotion virtual gesture fusion
CN108509026B (en)* | 2018-02-06 | 2020-04-14 | Xidian University | Remote maintenance support system and method based on enhanced interaction
WO2020221311A1 (en)* | 2019-04-30 | 2020-11-05 | Qilu University of Technology | Wearable device-based mobile robot control system and control method
WO2022021432A1 (en)* | 2020-07-31 | 2022-02-03 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Gesture control method and related device

Also Published As

Publication number | Publication date
WO2024212553A1 (en) | 2024-10-17


Legal Events

Date | Code | Title | Description
PB01 | Publication
SE01 | Entry into force of request for substantive examination
