CN113534835A - A kind of tourism virtual remote experience system and method - Google Patents

A kind of tourism virtual remote experience system and method

Info

Publication number
CN113534835A
CN113534835A (application CN202110754987.6A; granted as CN113534835B)
Authority
CN
China
Prior art keywords
unmanned aerial
aerial vehicle
flight
module
sight
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110754987.6A
Other languages
Chinese (zh)
Other versions
CN113534835B (en)
Inventor
周震宇
叶琴
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xiangnan University
Original Assignee
Xiangnan University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xiangnan University
Priority to CN202110754987.6A
Publication of CN113534835A
Application granted
Publication of CN113534835B
Legal status: Active
Anticipated expiration

Abstract


The present invention provides a tourism virtual remote experience system and method, comprising: an unmanned aerial vehicle equipped with a binocular camera, used to collect panoramic video frames in real time; a sight tracking module, which obtains the user's line-of-sight direction using a trained deep neural network model; a flight control module, which uses the tracked line-of-sight direction to control the flight direction of the unmanned aerial vehicle, realizing remote flight control; a server processing module, which processes the panoramic video frames collected by the binocular camera and sends them to a naked-eye 3D display screen; and the naked-eye 3D display screen, which displays the information and provides the user with an immersive flight experience. The invention enables control of the drone by line of sight alone and, combined with a high-speed 5G network and a naked-eye 3D display screen, delivers an excellent immersive experience.


Description

Tourism virtual remote experience system and method
Technical Field
The invention relates to a tourism experience technology, in particular to a tourism virtual remote experience system and a tourism virtual remote experience method, and the related technology comprises the technical fields of unmanned aerial vehicle control, sight tracking, deep learning and the like.
Background
As living standards improve, people's demand for tourism has grown vigorously. However, the global epidemic has done great harm to the tourism industry. To avoid crowds, many amusement venues have been closed; for example, the famous Canton Tower (Guangzhou Tower) was closed during May and June 2021 because of a local outbreak and received no visitors.
Virtual tourism has therefore become a real need. Compared with traditional tourism, it requires no travel and no crowding, and it is low-carbon, environmentally friendly, inexpensive in time, fast, and efficient. Some virtual tourism methods exist in the prior art, but their control experience is poor: interaction relies on manual control, immersion is lacking, and the scenic spots that can be visited are limited. For example, a given scenic spot must first be modeled, only tours of that fixed spot are possible, and the modeled scene is not the real scene, so the experience is not immersive.
In the tourism virtual remote experience system and method of the present invention, an unmanned aerial vehicle acquires information about the tourism scene, and the user's line of sight is tracked to control the flight direction and speed of the unmanned aerial vehicle, achieving a man-machine-integrated state: the user feels that he or she is the aircraft rather than merely controlling it. In the prior art, unmanned aerial vehicle information is received passively, and this man-machine-integrated state cannot be achieved through sight tracking. Combined with a naked-eye 3D display screen, a truly immersive experience is obtained.
The main innovations of the invention are as follows:
1) The tourism virtual remote experience system and method provided by the invention are suitable for remote virtual tourism. Introducing an unmanned aerial vehicle avoids the single-scene problem of traditional virtual tourism, and because the vehicle flies flexibly, it can perform close-range and/or long-range observation at any time according to the user's viewpoint.
2) The invention uses a deep learning method to realize sight tracking, which controls the flight state of the unmanned aerial vehicle, including both speed and direction. Deep learning solves the problem of tracking the user's line of sight and converting it into flight control instructions, so that participants obtain a man-machine-integrated, immersive virtual tourism experience.
Disclosure of Invention
The invention provides a tourism virtual remote experience system and method, the system comprising:
the unmanned aerial vehicle is provided with a binocular camera and is used for acquiring panoramic video frames in real time;
the sight tracking module is used for acquiring the sight direction of the user by utilizing the trained first deep neural network model;
the flight control module controls the flight direction of the unmanned aerial vehicle by utilizing the traced sight direction, so as to realize remote flight control of the unmanned aerial vehicle;
the server processing module is used for processing the panoramic video frames collected by the binocular camera and sending the panoramic video frames to a naked eye 3D display screen;
the naked eye 3D display screen is used for displaying information and providing immersive flight experience for users.
Optionally, the unmanned aerial vehicle and the server processing module perform high-speed communication through the 5G communication module, and send panoramic video frame data acquired by the binocular camera to the server processing module.
Optionally, the experience system further includes: and the 3D sound box module is used for playing audio information captured in the flight process of the unmanned aerial vehicle.
Optionally, the control module controls the flight direction and speed of the unmanned aerial vehicle through the line-of-sight direction and the line-of-sight concentration: the higher the concentration, the faster the forward speed; the flight direction is adjusted to follow the line-of-sight direction.
Optionally, the system further comprises a console module, through whose operating buttons the unmanned aerial vehicle is remotely controlled; and/or a voice control module, which realizes remote control of the unmanned aerial vehicle through voice commands; and/or a gesture control module, which realizes remote control through recognized gestures.
Correspondingly, the invention also provides a virtual tourism remote experience method, which is characterized by comprising the following steps:
acquiring a panoramic video frame in real time by using an unmanned aerial vehicle, wherein the unmanned aerial vehicle is provided with a binocular camera;
obtaining a sight direction of a user by using a sight tracking module, wherein the sight direction is obtained by using a trained first deep neural network model;
the flight control module is used for realizing remote flight control of the unmanned aerial vehicle, and the flight control module controls the flight direction of the unmanned aerial vehicle by using the traced sight direction;
processing the panoramic video frames collected by the binocular camera by using a server processing module, and sending the panoramic video frames to a naked eye 3D display screen;
and information is displayed by utilizing a naked eye 3D display screen, and immersive flight experience is provided for a user.
Optionally, the unmanned aerial vehicle and the server processing module perform high-speed communication through the 5G communication module, and send panoramic video frame data acquired by the binocular camera to the server processing module.
Optionally, the method further includes: and playing audio information captured in the flight process of the unmanned aerial vehicle by using the 3D sound box module.
Optionally, the method further includes: the control module controls the flight direction and speed of the unmanned aerial vehicle through the line-of-sight direction and the line-of-sight concentration; the higher the concentration, the faster the forward speed; the flight direction is adjusted to follow the line-of-sight direction.
Optionally, the method further includes: realizing remote control of the unmanned aerial vehicle with a console module, through its operating buttons; and/or with a voice control module, through voice commands; and/or with a gesture control module, through recognized gestures.
Advantageous effects:
1) The tourism virtual remote experience system and method provided by the invention are suitable for remote virtual tourism. Introducing an unmanned aerial vehicle avoids the single-scene problem of traditional virtual tourism; for example, it can fly over the Canton Tower, Baiyun Mountain, or the Pearl River for day or night sightseeing. Because the vehicle flies flexibly, it can perform close-range and/or long-range observation at any time according to the user's viewpoint.
2) The invention uses a deep learning method to realize sight tracking, which controls the flight state of the unmanned aerial vehicle, including both speed and direction. Deep learning solves the problem of tracking the user's line of sight and converting it into flight control instructions, so that participants obtain a man-machine-integrated, immersive virtual tourism experience.
Drawings
FIG. 1 is a functional schematic diagram of a travel virtual remote experience system.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in detail with reference to the accompanying drawings and embodiments.
As shown in FIG. 1, the present invention provides a virtual travel remote experience system, comprising:
the unmanned aerial vehicle is provided with a binocular camera and is used for acquiring panoramic video frames in real time;
the sight tracking module is used for acquiring the sight direction of the user by utilizing the trained first deep neural network model;
the flight control module controls the flight direction of the unmanned aerial vehicle by utilizing the traced sight direction, so as to realize remote flight control of the unmanned aerial vehicle;
the server processing module is used for processing the panoramic video frames collected by the binocular camera and sending the panoramic video frames to a naked eye 3D display screen;
the naked eye 3D display screen is used for displaying information and providing immersive flight experience for users.
Optionally, the unmanned aerial vehicle and the server processing module perform high-speed communication through the 5G communication module, and send panoramic video frame data acquired by the binocular camera to the server processing module.
Optionally, the experience system further includes: and the 3D sound box module is used for playing audio information captured in the flight process of the unmanned aerial vehicle.
Optionally, the control module controls the flight direction and speed of the unmanned aerial vehicle through the line-of-sight direction and the line-of-sight concentration; the higher the concentration, the faster the forward speed; the flight direction is adjusted to follow the line-of-sight direction.
Optionally, the system further comprises a console module, through whose operating buttons the unmanned aerial vehicle is remotely controlled; and/or a voice control module, which realizes remote control of the unmanned aerial vehicle through voice commands; and/or a gesture control module, which realizes remote control through recognized gestures.
Optionally, where the control distance of the unmanned aerial vehicle permits, the user may supply and configure his or her own vehicle; otherwise, the user may pay to rent an unmanned aerial vehicle from the scenic spot, whose integrated control center is responsible for managing the fleet. For example, when several unmanned aerial vehicles risk a flight-position conflict during aerial flight, the center raises an alarm in advance and assigns flight positions and speeds that satisfy the constraints; in an emergency, the lessor directly takes over flight authority so that different unmanned aerial vehicles do not collide. Once the risk is resolved, control authority is returned to the user.
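As a sketch of the advance-alarm idea described above, a simple constant-velocity prediction can flag two drones whose forecast paths come too close. The function name, the straight-line prediction model, and the numeric limits below are illustrative assumptions, not part of the patent:

```python
import math

def predicted_conflict(p1, v1, p2, v2, horizon=5.0, min_sep=10.0, dt=0.5):
    """Return True if two drones flying straight at constant velocity come
    within min_sep metres of each other during the next `horizon` seconds.
    p1, p2: (x, y, z) positions in metres; v1, v2: velocities in m/s."""
    t = 0.0
    while t <= horizon:
        a = [p + v * t for p, v in zip(p1, v1)]  # predicted position of drone 1
        b = [p + v * t for p, v in zip(p2, v2)]  # predicted position of drone 2
        if math.dist(a, b) < min_sep:
            return True                          # raise the alarm in advance
        t += dt
    return False
```

A positive result would trigger the control center's alarm and reassignment of flight positions and speeds.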
Optionally, the first deep neural network is used to track the line of sight and determine its state of change, which is then converted into a flight control command, for example: left, right, up, down, and so on.
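A minimal sketch of converting a tracked change in gaze direction into one of the discrete flight commands named above (left, right, up, down). The coordinate convention, the dead-zone threshold, and the function name are assumptions for illustration:

```python
def gaze_to_command(dx: float, dy: float, threshold: float = 0.15) -> str:
    """Map a gaze-direction change (dx, dy), in normalized screen
    coordinates, to one of: 'left', 'right', 'up', 'down', 'hold'."""
    if abs(dx) < threshold and abs(dy) < threshold:
        return "hold"            # gaze is steady: keep the current heading
    if abs(dx) >= abs(dy):       # horizontal change dominates
        return "right" if dx > 0 else "left"
    return "up" if dy > 0 else "down"
```

A steady gaze maps to "hold", so small involuntary eye movements do not jitter the drone's heading.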
Optionally, the first deep neural network is a DRCNN network, and the DRCNN comprises: one or more convolutional layers, one or more pooling layers, and fully connected layers; the convolutional layers use 3×3 convolution kernels; the excitation function adopted by the DRCNN is the sigmoid function.
optionally, the DRCNN utilizes a Sine-Index-Softmax (Sine-Index-Softmax) to improve the accuracy of the gaze tracking; the sine exponential loss function is:
$$\mathrm{LOSS}_{SIS} = -\frac{1}{N}\sum_{i=1}^{N}\log\frac{e^{\|w_{y_i}\|\sin(\theta_{y_i})+b_{y_i}}}{\sum_{j}e^{\|w_j\|\sin(\theta_j)+b_j}}$$
where θ_{y_i} denotes the angle between sample i and the weight vector of its corresponding label y_i; b_{y_i} denotes the deviation of sample i at its label y_i; b_j denotes the deviation at output node j; N denotes the number of training samples; and w_{y_i} denotes the weight of sample i on its label y_i.
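Assuming LOSS_SIS is a softmax cross-entropy over sine-scaled logits (one plausible reading of the symbol definitions above; the exact form of the logits is an assumption, since the patent gives the equation only as an image), a NumPy sketch might look as follows:

```python
import numpy as np

def sine_index_softmax_loss(theta, w_norm, b, labels):
    """theta:  (N, C) angles between each sample and each class weight vector
       w_norm: (C,)   norms of the class weight vectors
       b:      (C,)   per-class deviations b_j
       labels: (N,)   integer labels y_i"""
    logits = np.sin(theta) * w_norm + b          # sine term in the exponent
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    p = np.exp(logits)
    p /= p.sum(axis=1, keepdims=True)            # softmax probabilities
    n = len(labels)
    return -np.log(p[np.arange(n), labels]).mean()
```

When the correct class has the larger sine-scaled logit, its probability is higher and the loss is smaller, as a cross-entropy should behave.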
Optionally, the pooling method of the pooling layer is as follows:
S = f(e^{log w} + LOSS_{SIS});

where S represents the output of the current layer, f(·) represents the activation function, and w represents the weight of the current layer.
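Since e^{log w} simplifies to w for positive weights, the pooling step above reduces to applying the activation to the layer weight plus the loss value. A minimal sketch, with tanh standing in for the unspecified activation f (an assumption for illustration):

```python
import math

def pooled_output(w: float, loss_sis: float, f=math.tanh) -> float:
    """Pooling step S = f(e^{log w} + LOSS_SIS) from the text.
    e^{log w} equals w for w > 0, so the loss is added to the layer
    weight before the activation f is applied."""
    return f(math.exp(math.log(w)) + loss_sis)
```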
Optionally, the gaze concentration is obtained by a second deep neural network implemented with an attention mechanism (an attention neural network). Optionally, it may share convolutional features with the first network, or it may be trained independently to obtain convolutional features suited to its own model. The attention neural network divides the user's concentration into several speed levels, optionally numbered 1, 2, 3, 4, 5, 6, 7 … N. The number denotes the speed level: the smaller the number, the faster the flight speed; the larger the number, the slower the flight speed.
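A sketch of mapping a concentration level to a flight speed, where level 1 (most concentrated) flies fastest and level N slowest, per the text. The linear mapping, the number of levels, and the maximum speed are illustrative assumptions:

```python
def level_to_speed(level: int, n_levels: int = 7, v_max: float = 10.0) -> float:
    """Map a concentration level (1 = most concentrated) to a flight
    speed in m/s: level 1 gives v_max, level n_levels the slowest speed."""
    if not 1 <= level <= n_levels:
        raise ValueError("level out of range")
    return v_max * (n_levels - level + 1) / n_levels
```

The flight control module would feed this speed, together with the gaze direction, into the drone's velocity command.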
The excitation function adopted by the attention neural network is a cosine exponential excitation function, denoted g(·):

$$g = \frac{1}{N}\sum_{i=1}^{N} e^{\|w_{y_i}\|\cos(\theta_{y_i})}$$

where θ_{y_i} denotes the angle between sample i and the weight vector of its corresponding label y_i; N denotes the number of training samples; and w_{y_i} denotes the weight of sample i on its label y_i.
Correspondingly, the invention also provides a virtual tourism remote experience method, which is characterized by comprising the following steps:
acquiring a panoramic video frame in real time by using an unmanned aerial vehicle, wherein the unmanned aerial vehicle is provided with a binocular camera;
obtaining a sight direction of a user by using a sight tracking module, wherein the sight direction is obtained by using a trained first deep neural network model;
the flight control module is used for realizing remote flight control of the unmanned aerial vehicle, and the flight control module controls the flight direction of the unmanned aerial vehicle by using the traced sight direction;
processing the panoramic video frames collected by the binocular camera by using a server processing module, and sending the panoramic video frames to a naked eye 3D display screen;
and information is displayed by utilizing a naked eye 3D display screen, and immersive flight experience is provided for a user.
Optionally, the unmanned aerial vehicle and the server processing module perform high-speed communication through the 5G communication module, and send panoramic video frame data acquired by the binocular camera to the server processing module.
Optionally, the method further includes: and playing audio information captured in the flight process of the unmanned aerial vehicle by using the 3D sound box module.
Optionally, the method further includes: the control module controls the flight direction and speed of the unmanned aerial vehicle through the line-of-sight direction and the line-of-sight concentration; the higher the concentration, the faster the forward speed; the flight direction is adjusted to follow the line-of-sight direction.
Optionally, the method further includes: realizing remote control of the unmanned aerial vehicle with a console module, through its operating buttons; and/or with a voice control module, through voice commands; and/or with a gesture control module, through recognized gestures.
Optionally, where the control distance of the unmanned aerial vehicle permits, the user may supply and configure his or her own vehicle; otherwise, the user may pay to rent an unmanned aerial vehicle from the scenic spot, whose integrated control center is responsible for managing the fleet. For example, when several unmanned aerial vehicles risk a flight-position conflict during aerial flight, the center raises an alarm in advance and assigns flight positions and speeds that satisfy the constraints; in an emergency, the lessor directly takes over flight authority so that different unmanned aerial vehicles do not collide. Once the risk is resolved, control authority is returned to the user.
Optionally, the first deep neural network is used to track the line of sight and determine its state of change, which is then converted into a flight control command, for example: left, right, up, down, and so on.
Optionally, the first deep neural network is a DRCNN network, and the DRCNN comprises: one or more convolutional layers, one or more pooling layers, and fully connected layers; the convolutional layers use 3×3 convolution kernels; the excitation function adopted by the DRCNN is the sigmoid function.
Optionally, the DRCNN uses a sine exponential softmax loss (Sine-Index-Softmax, denoted LOSS_SIS) to improve the accuracy of gaze tracking; the sine exponential loss function is:
$$\mathrm{LOSS}_{SIS} = -\frac{1}{N}\sum_{i=1}^{N}\log\frac{e^{\|w_{y_i}\|\sin(\theta_{y_i})+b_{y_i}}}{\sum_{j}e^{\|w_j\|\sin(\theta_j)+b_j}}$$
where θ_{y_i} denotes the angle between sample i and the weight vector of its corresponding label y_i; b_{y_i} denotes the deviation of sample i at its label y_i; b_j denotes the deviation at output node j; N denotes the number of training samples; and w_{y_i} denotes the weight of sample i on its label y_i.
Optionally, the pooling method of the pooling layer is as follows:
S = f(e^{log w} + LOSS_{SIS});

where S represents the output of the current layer, f(·) represents the activation function, and w represents the weight of the current layer.
Optionally, the gaze concentration is obtained by a second deep neural network implemented with an attention mechanism (an attention neural network). Optionally, it may share convolutional features with the first network, or it may be trained independently to obtain convolutional features suited to its own model. The attention neural network divides the user's concentration into several speed levels, optionally numbered 1, 2, 3, 4, 5, 6, 7 … N. The number denotes the speed level: the smaller the number, the faster the flight speed; the larger the number, the slower the flight speed.
The excitation function adopted by the attention neural network is a cosine exponential excitation function, denoted g(·):

$$g = \frac{1}{N}\sum_{i=1}^{N} e^{\|w_{y_i}\|\cos(\theta_{y_i})}$$

where θ_{y_i} denotes the angle between sample i and the weight vector of its corresponding label y_i; N denotes the number of training samples; and w_{y_i} denotes the weight of sample i on its label y_i.
The present application also proposes a computer-readable medium storing computer program instructions that, when executed, carry out any of the methods proposed by the invention.
In the description herein, references to the description of "one embodiment," "an example," "a specific example" or the like are intended to mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. The computer readable storage medium may be, for example, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server.
In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). The integrated unit implemented in the form of a software functional unit may be stored in a computer readable storage medium. The software functional unit is stored in a storage medium and includes several instructions to enable a computer device (which may be a personal computer, a server, or a network device) or a processor (processor) to execute some steps of the methods according to the embodiments of the present invention. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, and an optical disk.
The above description is only for the preferred embodiment of the present invention and is not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, or direct or indirect applications in other related fields, which are made by the present specification and drawings, are included in the scope of the present invention. The preferred embodiments of the invention disclosed above are intended to be illustrative only. The preferred embodiments are not intended to be exhaustive or to limit the invention to the precise embodiments disclosed. Obviously, many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the invention and the practical application, to thereby enable others skilled in the art to best utilize the invention. The invention is limited only by the claims and their full scope and equivalents.

Claims (10)

1. A virtual remote experience system and method for travel, the system comprising:
the unmanned aerial vehicle is provided with a binocular camera and is used for acquiring panoramic video frames in real time;
the sight tracking module is used for acquiring the sight direction of the user by utilizing the trained first deep neural network model;
the flight control module controls the flight direction of the unmanned aerial vehicle by utilizing the traced sight line direction, so that the remote flight control of the unmanned aerial vehicle is realized;
the server processing module is used for processing the panoramic video frames collected by the binocular camera and sending the panoramic video frames to a naked eye 3D display screen;
the naked eye 3D display screen is used for displaying information and providing immersive flight experience for users.
2. The system of claim 1, wherein the drone communicates with the server processing module via the 5G communication module to send the panoramic video frame data captured by the binocular camera to the server processing module.
3. The system of claim 1, the experience system further comprising: and the 3D sound box module is used for playing audio information captured in the flight process of the unmanned aerial vehicle.
4. The system of claim 1, wherein the control module controls the flight direction and speed of the drone through the line-of-sight direction and the line-of-sight concentration; the higher the concentration, the faster the forward speed; the flight direction is adjusted to follow the line-of-sight direction.
5. The system of claim 1, further comprising a console module for remotely controlling the drone via the operating buttons; and/or the voice control module is used for realizing the remote control of the unmanned aerial vehicle through a voice command; and/or the gesture control module is used for realizing the remote control of the unmanned aerial vehicle through the recognized gesture.
6. A tourism virtual remote experience method, characterized by comprising:
acquiring a panoramic video frame in real time by using an unmanned aerial vehicle, wherein the unmanned aerial vehicle is provided with a binocular camera;
obtaining a sight direction of a user by using a sight tracking module, wherein the sight direction is obtained by using a trained first deep neural network model;
the flight control module is used for realizing remote flight control of the unmanned aerial vehicle, and the flight control module is used for controlling the flight direction of the unmanned aerial vehicle by utilizing the traced sight line direction;
processing the panoramic video frames collected by the binocular camera by using a server processing module, and sending the panoramic video frames to a naked eye 3D display screen;
and information is displayed by utilizing a naked eye 3D display screen, and immersive flight experience is provided for a user.
7. The method of claim 6, wherein the UAV communicates with the server processing module via the 5G communication module to send the panoramic video frame data acquired by the binocular camera to the server processing module.
8. The method of claim 6, further comprising: and playing audio information captured in the flight process of the unmanned aerial vehicle by using the 3D sound box module.
9. The method of claim 6, further comprising: controlling, by the control module, the flight direction and speed of the unmanned aerial vehicle through the line-of-sight direction and the line-of-sight concentration; the higher the concentration, the faster the forward speed; the flight direction is adjusted to follow the line-of-sight direction.
10. The method of claim 6, further comprising: remotely controlling the unmanned aerial vehicle with a console module via operating buttons; and/or remotely controlling the unmanned aerial vehicle with a voice control module via voice commands; and/or remotely controlling the unmanned aerial vehicle with a gesture control module via recognized gestures.
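The direction-plus-concentration rule of claim 9 can be sketched in a few lines: the drone's heading follows the tracked line-of-sight direction, while its forward speed scales with how concentrated the gaze is. This is an illustrative sketch only, not the patented implementation; the function and parameter names (`gaze_to_command`, `max_speed_mps`, a concentration score in [0, 1]) are hypothetical assumptions.

```python
from dataclasses import dataclass


@dataclass
class FlightCommand:
    yaw_deg: float      # desired heading change, degrees
    pitch_deg: float    # desired climb/descend angle, degrees
    speed_mps: float    # forward speed, metres per second


def gaze_to_command(gaze_yaw_deg: float,
                    gaze_pitch_deg: float,
                    concentration: float,
                    max_speed_mps: float = 5.0) -> FlightCommand:
    """Map a tracked line-of-sight direction and a concentration score
    in [0, 1] to a drone flight command (hypothetical interface)."""
    # Clamp the concentration score to [0, 1] before scaling.
    concentration = min(max(concentration, 0.0), 1.0)
    # Higher line-of-sight concentration -> faster forward speed.
    speed = concentration * max_speed_mps
    # The flight direction simply follows the line-of-sight direction.
    return FlightCommand(yaw_deg=gaze_yaw_deg,
                         pitch_deg=gaze_pitch_deg,
                         speed_mps=speed)


cmd = gaze_to_command(gaze_yaw_deg=15.0, gaze_pitch_deg=-5.0, concentration=0.8)
print(cmd.speed_mps)  # 4.0
```

In a real system the command would be forwarded to the flight controller at a fixed rate, with the gaze estimates smoothed over a short window so momentary blinks do not stall the drone.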
CN202110754987.6A | 2021-07-01 | 2021-07-01 | A kind of tourism virtual remote experience system and method | Active | Granted as CN113534835B (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN202110754987.6A | 2021-07-01 | 2021-07-01 | A kind of tourism virtual remote experience system and method (granted as CN113534835B (en))

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN202110754987.6A | 2021-07-01 | 2021-07-01 | A kind of tourism virtual remote experience system and method (granted as CN113534835B (en))

Publications (2)

Publication Number | Publication Date
CN113534835A (en) | 2021-10-22
CN113534835B (en) | 2022-05-31

Family

Family ID: 78126648

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN202110754987.6A (Active, granted as CN113534835B (en)) | A kind of tourism virtual remote experience system and method | 2021-07-01 | 2021-07-01

Country Status (1)

Country | Link
CN (1) | CN113534835B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN119648478A (en)* | 2024-11-13 | 2025-03-18 | 贵州迦太利华信息科技有限公司 | Immersive digital tourism experience configuration method and system based on eye tracking technology

Citations (7)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US20160313743A1 (en)* | 2015-04-24 | 2016-10-27 | Samsung Display Co., Ltd. | Flying display device
CN107065905A (en)* | 2017-03-23 | 2017-08-18 | 东南大学 | An immersive unmanned aerial vehicle control system and control method therefor
CN109032183A (en)* | 2018-08-23 | 2018-12-18 | 广州创链科技有限公司 | An unmanned aerial vehicle control device and method based on pupil recognition
US20190204824A1 (en)* | 2013-10-25 | 2019-07-04 | Ioannis Micros | Optically assisted landing and takeoff of drones
CN110412996A (en)* | 2019-06-18 | 2019-11-05 | 中国人民解放军军事科学院国防科技创新研究院 | An unmanned aerial vehicle control method, device and system based on gestures and eye movement
CN111277756A (en)* | 2020-02-13 | 2020-06-12 | 西安交通大学 | Camera control method of small multi-rotor UAV based on eye recognition and tracking technology
CN112738498A (en)* | 2020-12-24 | 2021-04-30 | 京东方科技集团股份有限公司 | A virtual tour system and method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
无双: "凝视控制系统——人眼控制无人机" [Gaze control system: controlling drones with the human eye], WeChat Official Account (《微信公众号》)*

Also Published As

Publication number | Publication date
CN113534835B (en) | 2022-05-31

Similar Documents

Publication | Title
AU2024203150B2 | System and method for augmented and virtual reality
JP2022046670A | System, method, and medium for displaying interactive augmented reality presentation
EP3224574B1 | Street-level guidance via route path
Hildebrand | Aerial play: Drone medium, mobility, communication, and culture
CN117271687A | Track playback method, track playback device, electronic equipment and storage medium
CN113534835A | A kind of tourism virtual remote experience system and method
Beesley | Head in the Clouds: documenting the rise of personal drone cultures
CN113467616A | Augmented reality processing method and related device, vehicle and storage medium
Hildebrand | Consumer Drones as Mobile Media: A Technographic Study of Seeing, Moving, and Being (with) Drones
US20250252670A1 | Method for dynamic navigation mapping in virtual reality environments based on hybrid data integration
CN119603558A | Shooting control method, device, system and computer readable storage medium
CN119031112A | HUD screen display method, device, equipment, medium and vehicle
Hildebrand | Aerial Play

Legal Events

Code | Title
PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant
