CN113055598A - Orientation data compensation method and device, electronic equipment and readable storage medium - Google Patents

Orientation data compensation method and device, electronic equipment and readable storage medium

Info

Publication number
CN113055598A
CN113055598A (application CN202110321774.4A; granted as CN113055598B)
Authority
CN
China
Prior art keywords
orientation data
sequence
orientation
time
shooting time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110321774.4A
Other languages
Chinese (zh)
Other versions
CN113055598B (en)
Inventor
黄凯
王楠
章国锋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Shangtang Technology Development Co Ltd
Original Assignee
Zhejiang Shangtang Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Shangtang Technology Development Co Ltd
Priority to CN202110321774.4A
Publication of CN113055598A
Application granted
Publication of CN113055598B
Legal status: Active
Anticipated expiration

Abstract

The invention provides an orientation data compensation method, an orientation data compensation device, an electronic device and a computer-readable storage medium, wherein the compensation method comprises the following steps: acquiring a first orientation data sequence and a shooting time sequence of an image sequence shot in a preset time; acquiring a second orientation data sequence of the orientation meter in the preset time; calculating the offset speed of the yaw angle of the orientation meter according to the first orientation data sequence, the second orientation data sequence and the shooting time sequence; obtaining compensation data corresponding to each second orientation data according to the offset speed; compensating each of the second orientation data in the second orientation data sequence based on the compensation data. The method is used for compensating the yaw angle, so that the negative influence of the yaw angle deviation on the tracking precision is eliminated or reduced.

Description

Orientation data compensation method and device, electronic equipment and readable storage medium
Technical Field
The invention relates to the technical field of computer vision, in particular to an orientation data compensation method, an orientation data compensation device, electronic equipment and a readable storage medium.
Background
Visual odometry is a core problem in many current applications of computer vision, with important uses in fields such as AR/VR, autonomous driving, and indoor navigation.
On a smartphone platform, the system estimates and maintains the orientation of the phone based on an accelerometer and a gyroscope. Under this approach, when the phone is in continuous motion, the estimated orientation exhibits a persistent, uncorrected drift in the yaw angle. This constant drift causes an ever-growing error in the yaw-angle component of the odometer, which ultimately has an increasingly negative effect on the tracking accuracy of the odometer.
Disclosure of Invention
The invention provides an orientation data compensation method and device, an electronic device, and a readable storage medium, which are used to eliminate or reduce the negative influence of yaw-angle drift on tracking accuracy.
In order to solve the above technical problems, a first technical solution provided by the present invention is: there is provided an orientation data compensation method, comprising: acquiring a first orientation data sequence and a shooting time sequence of an image sequence shot in a preset time; acquiring a second orientation data sequence of the orientation meter in the preset time; calculating the offset speed of the yaw angle of the orientation meter according to the first orientation data sequence, the second orientation data sequence and the shooting time sequence; obtaining compensation data corresponding to each second orientation data according to the offset speed; compensating each of the second orientation data in the second orientation data sequence based on the compensation data.
Wherein the calculating of the offset speed of the yaw angle of the orientation meter according to the first orientation data sequence, the second orientation data sequence, and the shooting time sequence includes: calculating a first difference between each first orientation data and the initial first orientation data in the first orientation data sequence; calculating a second difference between each second orientation data and the initial second orientation data in the second orientation data sequence; calculating a shooting time difference between each shooting time and the initial shooting time in the shooting time sequence; and calculating the offset speed of the yaw angle of the orientation meter by using the first difference, the second difference, and the shooting time difference corresponding to each shooting time.
Wherein the calculating of the offset speed of the yaw angle of the orientation meter by using the first difference, the second difference, and the shooting time difference corresponding to each shooting time includes: calculating the offset speed of the yaw angle of the orientation meter corresponding to each shooting time by using the first difference, the second difference, and the shooting time difference corresponding to that shooting time; and calculating the offset speed of the yaw angle of the orientation meter according to the offset speeds corresponding to all shooting times.
Wherein the method further comprises: acquiring a statistical time sequence corresponding to the second orientation data sequence, wherein the statistical times in the statistical time sequence correspond one to one with the second orientation data in the second orientation data sequence; and the calculating of the offset speed of the yaw angle of the orientation meter corresponding to each shooting time by using the first difference, the second difference, and the shooting time difference corresponding to each shooting time comprises: for any target shooting time, determining a corresponding target first difference according to the first orientation data corresponding to the target shooting time; determining the target statistical time closest to the target shooting time in the statistical time sequence, and determining a corresponding target second difference according to the second orientation data corresponding to the target statistical time; and calculating the offset speed of the yaw angle of the orientation meter corresponding to the target shooting time according to the target first difference, the target second difference, and the shooting time difference corresponding to the target shooting time.
Wherein the initial first orientation data and the initial second orientation data correspond to the same time.
Wherein the method further includes: acquiring a statistical time sequence corresponding to the second orientation data sequence, wherein the statistical times in the statistical time sequence correspond one to one with the second orientation data in the second orientation data sequence; and the obtaining of compensation data corresponding to each second orientation data according to the offset speed includes: calculating a statistical time difference between each statistical time and the initial statistical time in the statistical time sequence; and calculating the compensation data corresponding to each second orientation data according to the statistical time difference and the offset speed.
Wherein the compensating of each second orientation data in the second orientation data sequence based on the compensation data comprises: calculating the product of each second orientation data and the compensation data to obtain each compensated second orientation data.
Wherein the acquiring of the first orientation data sequence and the capturing time sequence of the image sequence captured within the predetermined time includes: acquiring an image sequence, and acquiring the shooting time of each image to obtain the shooting time sequence; processing each image by using a neural network algorithm to obtain first orientation data corresponding to each image, and obtaining the first orientation data sequence according to each first orientation data.
In order to solve the above technical problems, a second technical solution provided by the present invention is: provided is a yaw angle compensation device including: the data acquisition module is used for acquiring a first orientation data sequence and a shooting time sequence of the image sequence shot in a preset time; acquiring a second orientation data sequence of the orientation meter in the preset time; the speed calculation module is used for calculating the offset speed of the yaw angle of the orientation meter according to the first orientation data sequence, the second orientation data sequence and the shooting time sequence; the compensation data calculation module is used for obtaining compensation data corresponding to each second orientation data according to the offset speed; a compensation module for compensating each of the second orientation data in the second orientation data sequence based on the compensation data.
In order to solve the above technical problems, a third technical solution provided by the present invention is: provided is an electronic device including: the orientation data compensation method comprises a memory and a processor, wherein the memory stores program instructions, and the processor calls the program instructions from the memory to execute the orientation data compensation method.
In order to solve the above technical problems, a fourth technical solution provided by the present invention is: there is provided a computer readable storage medium storing a program file executable to implement the orientation data compensation method as described above.
Different from the prior art, the beneficial effect of the method is that the offset speed of the yaw angle of the orientation meter is obtained through calculation; compensation data corresponding to each second orientation data is obtained according to the offset speed; and each second orientation data in the second orientation data sequence is compensated based on the compensation data. The yaw angle is thereby compensated, eliminating or reducing the negative influence of yaw-angle drift on tracking accuracy.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without inventive effort, wherein:
FIG. 1 is a flow chart illustrating a method for orientation data compensation according to an embodiment of the present invention;
FIG. 2 is a flowchart illustrating an embodiment of step S11 in FIG. 1;
FIG. 3 is a flowchart illustrating an embodiment of step S12 in FIG. 1;
FIG. 4 is a flowchart illustrating an embodiment of step S13 in FIG. 1;
FIG. 5 is a schematic structural diagram of a yaw angle compensation apparatus according to an embodiment of the present invention;
FIG. 6 is a schematic structural diagram of an electronic device according to an embodiment of the invention;
FIG. 7 is a schematic structural diagram of a storage medium according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Visual odometry is a core problem in many current applications of computer vision, with important uses in fields such as AR/VR, autonomous driving, and indoor navigation.
On a smartphone platform, the system estimates and maintains the orientation of the phone based on an accelerometer and a gyroscope: the bottom layer estimates the current orientation of the phone through a series of filtering algorithms and exposes a software sensor, the orientation meter, for use by visual odometry methods. Re-estimating the camera orientation from the raw accelerometer and gyroscope inside the visual odometry method is redundant; especially in application layers far from the bottom layer (such as the Web end of a non-native application), the computation time multiplies, so directly reading the orientation meter is the best-performing scheme. However, most smartphones do not use a magnetometer and maintain orientation based only on the accelerometer and gyroscope. This algorithm is not observable in the yaw-angle dimension, so under continuous motion there is a persistent, uncorrected drift in the estimated yaw angle. This constant drift causes an ever-growing error in the yaw-angle component of the odometer, which ultimately has an increasingly negative effect on the tracking accuracy of the odometer.
Referring to fig. 1, a schematic flow chart of a first embodiment of the orientation data compensation method according to the present invention includes:
step S11: acquiring a first orientation data sequence and a shooting time sequence of an image sequence shot in a preset time; and acquiring a second orientation data sequence of the orientation meter in a preset time.
In this embodiment, a mobile terminal is taken as an example for orientation data compensation; the mobile terminal may be a mobile phone, smart glasses, a watch, etc., and a mobile phone is taken as the example in the following description. The mobile phone is provided with a camera and shoots continuously within a predetermined time period to obtain all captured images in that period. A shooting time is recorded for each captured image, forming a shooting time sequence, and the captured images can be sorted by shooting time to form an image sequence. Furthermore, the mobile phone can obtain corresponding first orientation data from the captured images, where the first orientation data represents the orientation of the camera in the mobile phone at the image shooting time and may be in the form of a unit quaternion.
The mobile phone is also provided with an orientation meter, which is a device with a data processing function integrated in the mobile terminal and can be used to measure the orientation of the camera in the mobile terminal. The orientation meter counts the second orientation data within each predetermined time period, and each count records a statistical time. The second orientation data represents a state quantity describing the orientation of the camera in the mobile terminal at the current moment and may be in the form of a unit quaternion.
Specifically, the mobile phone obtains each second orientation data counted by the orientation meter within the predetermined time period, together with the corresponding statistical time of each second orientation data, yielding a second orientation data sequence and a statistical time sequence. For example, suppose the orientation meter in a mobile phone collects and counts orientation data i+1 times in the period 9:00–9:10, obtaining second orientation data recorded as qa0, qa1, qa2, qa3, …, qai, with statistical times recorded as ta0, ta1, ta2, ta3, …, tai. Sorting the second orientation data chronologically, the second orientation data sequence counted by the orientation meter in 9:00–9:10 is qa = (qa0, qa1, qa2, qa3, …, qai), and the corresponding statistical time sequence is ta = (ta0, ta1, ta2, ta3, …, tai). Note that the above sequences are ordered chronologically; in practice they may also be ordered in reverse chronological order or randomly, but it should be understood that each second orientation data in the second orientation data sequence corresponds to a statistical time in the statistical time sequence, e.g., the statistical time of the nth second orientation data in the second orientation data sequence is the nth datum in the statistical time sequence.
Both the first orientation data and the second orientation data are unit quaternions, representing a state quantity that indicates the orientation of the camera in the mobile terminal at the corresponding time, and the yaw angle refers to a relative change; for example, the change in yaw angle between time tai and time ta0 is denoted yaw(qai − qa0).
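To make the quaternion conventions above concrete, the following is a minimal Python sketch, written by the editor and not taken from the patent, of extracting the yaw angle from a unit quaternion and computing the relative yaw change yaw(qai − qa0). The Z-Y-X (yaw-pitch-roll) Euler convention and (w, x, y, z) component order are assumptions; the patent does not fix either.

```python
import math

def yaw_from_quaternion(w, x, y, z):
    """Yaw (heading) angle in radians of a unit quaternion,
    under the standard Z-Y-X Euler convention (an assumption)."""
    return math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))

def quat_conjugate(q):
    """Conjugate (inverse for unit quaternions)."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def quat_multiply(a, b):
    """Hamilton product of two quaternions in (w, x, y, z) order."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (
        aw * bw - ax * bx - ay * by - az * bz,
        aw * bx + ax * bw + ay * bz - az * by,
        aw * by - ax * bz + ay * bw + az * bx,
        aw * bz + ax * by - ay * bx + az * bw,
    )

def relative_yaw(q0, qi):
    """yaw(qi - q0): yaw of the relative rotation from q0 to qi."""
    q_rel = quat_multiply(quat_conjugate(q0), qi)
    return yaw_from_quaternion(*q_rel)
```

For instance, if q0 is the identity and qi is a 90° rotation about the z axis, `relative_yaw(q0, qi)` returns π/2.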
Referring to fig. 2, step S11 includes:
step S111: acquiring an image sequence, and acquiring the shooting time of each image to obtain the shooting time sequence.
In a specific embodiment, the mobile phone shoots continuously with the camera within the predetermined time period, obtaining the captured images for that period; a shooting time is recorded for each captured image, forming a shooting time sequence, and the images can be sorted by time to obtain the corresponding image sequence. For example, if images are captured at times tv0, tv1, tv2, tv3, …, tvi within the period 9:00–9:10, the shooting time sequence of the captured images is tv = (tv0, tv1, tv2, tv3, …, tvi), and the images can be sorted by shooting time to form an image sequence.
Step S112: processing each image by using a neural network algorithm to obtain first orientation data corresponding to each image, and obtaining a first orientation data sequence according to each first orientation data.
Specifically, each captured image may be processed with a neural network algorithm to obtain the first orientation data corresponding to that image, and the first orientation data sequence corresponding to the image sequence is then obtained from the first orientation data. Neural network algorithms include, but are not limited to, convolutional neural network algorithms, deep neural network algorithms, and the like. For example, the first orientation data sequence corresponding to the images captured within 9:00–9:10 is qv = (qv0, qv1, qv2, qv3, …, qvi). It should be noted that the processing of the images by the neural network algorithm may be executed locally by the mobile terminal; it may also be executed by a cloud server, i.e., after the mobile terminal obtains an image, it sends the image to the cloud server, the cloud server processes the image with the neural network algorithm to obtain the corresponding first orientation data, and the first orientation data is then returned to the mobile terminal. Through the neural network algorithm, orientation data corresponding to the visual images can be accurately obtained.
Step S12: and calculating the offset speed of the yaw angle of the orientation meter according to the first orientation data sequence, the second orientation data sequence and the shooting time sequence.
In a specific implementation, the yaw angle of the mobile phone can be estimated from data collected by the orientation meter, but this estimate has an error; that is, the yaw angle estimated by the orientation meter (hereinafter, the "yaw angle of the orientation meter") has an error, and the error accumulates continuously (the yaw angle continuously drifts). In this regard, an offset speed of the yaw angle of the orientation meter may be defined to characterize the speed at which the orientation-meter error accumulates, where the change in yaw angle between time tai and time ta0 is denoted yaw(qai − qa0). In one embodiment, the offset speed of the yaw angle of the orientation meter can be calculated from the first orientation data sequence, the second orientation data sequence, and the shooting time sequence.
Specifically, referring to fig. 3, step S12 includes:
step S121: calculating a first difference between each first orientation data and the initial first orientation data in the first orientation data sequence; calculating a second difference between each second orientation data and the initial second orientation data in the second orientation data sequence; and calculating a shooting time difference between each shooting time and the initial shooting time in the shooting time sequence.
In an embodiment, the shooting times in the shooting time sequence and the statistical times in the statistical time sequence are sorted chronologically, and the first shooting time and the first statistical time may be the same moment (the initial time); that is, the initial first orientation data and the initial second orientation data correspond to the same time, i.e., tv0 = ta0, and the orientation at this initial time can be considered free of yaw-angle drift.
Specifically, a first difference between each first orientation data in the first orientation data sequence qv and the initial first orientation data is calculated, e.g., Δv0 = qv0 − qv0, Δv1 = qv1 − qv0, Δv2 = qv2 − qv0, Δv3 = qv3 − qv0, …, Δvi = qvi − qv0. A second difference between each second orientation data in the second orientation data sequence and the initial second orientation data is calculated, e.g., Δa0 = qa0 − qa0, Δa1 = qa1 − qa0, Δa2 = qa2 − qa0, Δa3 = qa3 − qa0, …, Δai = qai − qa0. The shooting time difference between each shooting time and the initial shooting time in the shooting time sequence is calculated, e.g., Δtv0 = tv0 − tv0, Δtv1 = tv1 − tv0, Δtv2 = tv2 − tv0, Δtv3 = tv3 − tv0, …, Δtvi = tvi − tv0.
Step S122: and calculating the offset speed of the yaw angle of the orientation meter by using the first difference, the second difference and the shooting time difference corresponding to the shooting time.
Once the first difference, the second difference, and the shooting time difference are obtained, the offset speed of the yaw angle of the orientation meter can be calculated from them. The first difference is obtained from the captured images, and can therefore be regarded as global information obtained through visual positioning, used to determine the drift pattern of the orientation-meter yaw angle.
Analysis shows that the drift of the orientation-meter yaw angle fluctuates within a short time window but is relatively stable over a long time window, with the drift angle in a linear relationship with time. The orientation meter counts the second orientation data sequence at a high frequency; in the actual tracking process of the visual odometer, however, each image frame corresponds to one first orientation data, and capturing each image takes a certain time, so the first orientation data sequence is obtained at a lower frequency. Accordingly, the shooting times may be used as the reference: for each shooting time, the corresponding first difference, second difference, and shooting time difference are obtained, and the offset speed of the orientation-meter yaw angle corresponding to that shooting time is calculated.
Specifically, any shooting time in the shooting time sequence may be taken as a target shooting time. For the target shooting time, there is a corresponding captured image and first orientation data, and from the first orientation data a corresponding first difference, called the target first difference, may be determined. Next, the target statistical time closest to the target shooting time can be determined in the statistical time sequence; for example, if the target shooting time is 9:01:10 and the two surrounding statistical times in the statistical time sequence are 9:01:06 and 9:01:11, then 9:01:11 is the target statistical time. After the target statistical time is determined, the corresponding second orientation data can be determined from it, and the corresponding second difference, called the target second difference, is determined from that second orientation data. Then, the offset speed of the yaw angle of the orientation meter corresponding to the target shooting time may be determined from the target first difference, the target second difference, and the shooting time difference corresponding to the target shooting time (i.e., the difference between the target shooting time and the initial shooting time): vdi = (Δvi − Δai)/Δtvi = (yaw(qvi − qv0) − yaw(qai − qa0))/(tvi − tv0).
For each shooting time in the shooting time sequence, the above steps can be executed to obtain the offset speed of the yaw angle corresponding to that shooting time, yielding an offset speed sequence. The average of the offset speeds of the yaw angle corresponding to all shooting times can then be calculated, giving the offset speed of the yaw angle of the orientation meter over the whole predetermined time period, i.e., the mean of all offset speeds in the offset speed sequence, which captures the drift pattern of the orientation meter. Of course, in practice, statistical indexes such as the median or mode of the offset speed sequence may also be used as the offset speed of the yaw angle of the orientation meter.
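The per-shooting-time computation of steps S121–S122 can be sketched in Python as follows. This is a minimal illustration by the editor, not code from the patent: it assumes the yaw angles have already been extracted from the quaternion sequences (yv from the visual first orientation data, ya from the orientation meter), assumes tv[0] = ta0, and uses the mean as the aggregate statistic.

```python
import bisect

def nearest_index(ta, t):
    """Index of the statistical time in ta closest to t (ta sorted ascending)."""
    j = bisect.bisect_left(ta, t)
    if j == 0:
        return 0
    if j == len(ta):
        return len(ta) - 1
    # pick whichever neighbour is closer to t
    return j if ta[j] - t < t - ta[j - 1] else j - 1

def estimate_drift_speed(yv, tv, ya, ta):
    """Mean yaw drift speed of the orientation meter (steps S121-S122).

    yv, tv: visual yaw angles and shooting times (drift-free reference).
    ya, ta: orientation-meter yaw angles and statistical times.
    Assumes tv[0] == ta[0], i.e. no drift at the initial time."""
    speeds = []
    for i in range(1, len(tv)):
        j = nearest_index(ta, tv[i])   # target statistical time
        dv = yv[i] - yv[0]             # target first difference  (Δvi)
        da = ya[j] - ya[0]             # target second difference (Δai)
        dt = tv[i] - tv[0]             # shooting time difference (Δtvi)
        speeds.append((dv - da) / dt)  # vdi = (Δvi − Δai)/Δtvi
    return sum(speeds) / len(speeds)   # mean over all shooting times
```

With yv = [0, 0.1, 0.2] rad at tv = [0, 1, 2] s against ya = [0, 0.05, 0.1] rad at the same statistical times, the estimated drift speed is 0.05 rad/s.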
Step S13: and obtaining compensation data corresponding to each second orientation data according to the offset speed.
After the offset speed of the yaw angle of the heading meter in the whole preset time period is determined, the compensation data corresponding to each second heading data can be determined according to the offset speed.
Referring to fig. 4, step S13 specifically includes:
step S131: and calculating to obtain the statistical time difference between each statistical time and the first statistical time in the statistical time sequence.
Specifically, the statistical time difference between each statistical time and the initial statistical time in the statistical time sequence is calculated, e.g., Δta0 = ta0 − ta0, Δta1 = ta1 − ta0, Δta2 = ta2 − ta0, Δta3 = ta3 − ta0, …, Δtai = tai − ta0.
Step S132: and calculating to obtain compensation data corresponding to each second orientation data according to the statistical time difference and the offset speed.
Specifically, each second orientation data in the second orientation data sequence counted by the orientation meter corresponds to an elapsed time Δtai = tai − ta0. According to the offset speed vd calculated in step S12, the compensation data corresponding to each second orientation data is obtained as X = vd × Δtai, and the calculated compensation data is converted into a corresponding relative rotation, e.g., Y = rot(X).
Step S14: each second orientation data in the second orientation data sequence is compensated based on the compensation data.
Specifically, the product of each second orientation data and its compensation data is calculated, yielding each compensated second orientation data. The compensated second orientation data is then input into the visual odometry calculation for computer-vision positioning, making the visual positioning result, and hence the tracking accuracy of the visual odometer, more accurate.
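Steps S131–S14 can be sketched as follows. This is an editor's illustration under stated assumptions, not the patent's implementation: the rot(X) conversion is taken to be a rotation about the z (yaw) axis, quaternions use (w, x, y, z) order, and the correction is applied as a right-hand quaternion product; whether the correction multiplies on the right (body frame) or the left (world frame) is not fixed by the source text.

```python
import math

def quat_multiply(a, b):
    """Hamilton product of two quaternions in (w, x, y, z) order."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (
        aw * bw - ax * bx - ay * by - az * bz,
        aw * bx + ax * bw + ay * bz - az * by,
        aw * by - ax * bz + ay * bw + az * bx,
        aw * bz + ax * by - ay * bx + az * bw,
    )

def rot(angle):
    """rot(X) of step S132: unit quaternion for a rotation of `angle`
    radians about the z (yaw) axis (an assumed convention)."""
    return (math.cos(angle / 2.0), 0.0, 0.0, math.sin(angle / 2.0))

def compensate_sequence(qa, ta, drift_speed):
    """Steps S131-S14: for each second orientation datum qa[i], accumulate
    the drift X = drift_speed * (ta[i] - ta[0]), convert it to a relative
    rotation with rot(X), and apply it by quaternion product."""
    out = []
    for q, t in zip(qa, ta):
        x = drift_speed * (t - ta[0])          # compensation data X = vd * Δta
        out.append(quat_multiply(q, rot(x)))   # compensated second orientation data
    return out
```

The compensated sequence returned here is what would then be fed to the visual odometry front end in place of the raw orientation-meter readings.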
The method obtains the offset speed of the yaw angle of the orientation meter through calculation; obtains compensation data corresponding to each second orientation data according to the offset speed; and compensates each second orientation data in the second orientation data sequence based on the compensation data, thereby compensating the yaw angle and eliminating or reducing the negative influence of yaw-angle drift on tracking accuracy. The global information of visual positioning is used to make up for the continuous drift of the orientation-meter yaw angle, improving the tracking accuracy of the visual odometry method.
Referring to fig. 5, a schematic structural diagram of the yaw angle compensation device of the present invention includes: a data acquisition module 51, a speed calculation module 52, a compensation data calculation module 53, and a compensation module 54.
The data acquisition module 51 is configured to acquire a first orientation data sequence and a shooting time sequence of an image sequence shot within a predetermined time, and to acquire a second orientation data sequence of the orientation meter within the predetermined time.
The speed calculation module 52 is configured to calculate the offset speed of the yaw angle of the orientation meter according to the first orientation data sequence, the second orientation data sequence, and the shooting time sequence.
The compensation data calculation module 53 is configured to obtain compensation data corresponding to each second orientation data according to the offset speed.
The compensation module 54 is configured to compensate each second orientation data in the second orientation data sequence based on the compensation data. In an embodiment, the product of each second orientation data and the compensation data is calculated to obtain each compensated second orientation data. The compensated second orientation data may then be input into the visual odometry calculation, making the visual positioning result more accurate.
Further, the speed calculation module 52 is further configured to calculate a first difference between each item of first orientation data in the first orientation data sequence and the first item of first orientation data; calculate a second difference between each item of second orientation data in the second orientation data sequence and the first item of second orientation data; calculate a shooting time difference between each shooting time in the shooting time sequence and the first shooting time; and calculate the offset speed of the yaw angle of the orientation meter by using the first difference, the second difference, and the shooting time difference corresponding to each shooting time.
Further, the speed calculation module 52 is further configured to calculate, by using the first difference, the second difference, and the shooting time difference corresponding to each shooting time, the offset speed of the yaw angle of the orientation meter corresponding to that shooting time; and to calculate the offset speed of the yaw angle of the orientation meter from the offset speeds corresponding to all the shooting times.
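The two steps above can be sketched as follows. The function and variable names are illustrative (the patent fixes no API), and yaw is treated as a plain angle in radians; combining the per-time estimates by a simple mean is one possible choice:

```python
def yaw_offset_speed(first_yaw, second_yaw, shot_times):
    """Estimate the offset speed (drift rate, rad/s) of the orientation
    meter's yaw from visually estimated yaw ("first orientation data")
    and the orientation meter's yaw ("second orientation data") sampled
    at the shooting times.  Illustrative sketch only."""
    d_first = [y - first_yaw[0] for y in first_yaw]     # first differences
    d_second = [y - second_yaw[0] for y in second_yaw]  # second differences
    d_time = [t - shot_times[0] for t in shot_times]    # shooting time differences

    # Per-shooting-time offset speed: drift accumulated so far divided by
    # elapsed time (the first sample has zero elapsed time and is skipped).
    per_time = [(ds - df) / dt
                for df, ds, dt in zip(d_first[1:], d_second[1:], d_time[1:])]
    # Combine the per-time estimates; a simple mean is one choice.
    return sum(per_time) / len(per_time)
```

With a drift-free visual yaw and an orientation meter drifting at 0.05 rad/s, `yaw_offset_speed([0.0, 0.1, 0.2], [0.0, 0.15, 0.3], [0.0, 1.0, 2.0])` returns approximately 0.05.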
Further, the data acquisition module 51 is further configured to acquire a statistical time sequence corresponding to the second orientation data sequence, where the statistical times in the statistical time sequence correspond one-to-one to the second orientation data in the second orientation data sequence;
the speed calculation module 52 is further configured to determine, for any target shooting time, the corresponding target first difference according to the first orientation data corresponding to the target shooting time; determine the target statistical time in the statistical time sequence closest to the target shooting time, and determine the corresponding target second difference according to the second orientation data corresponding to that target statistical time; and calculate the offset speed of the yaw angle of the orientation meter corresponding to the target shooting time according to the target first difference, the target second difference, and the shooting time difference corresponding to the target shooting time.
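Since the orientation meter's statistical times need not coincide with the camera's shooting times, the closest statistical sample can be looked up as below (a minimal sketch; the names are assumptions):

```python
def nearest_stat_sample(stat_times, second_data, target_shot_time):
    """Return the second orientation datum whose statistical time is
    closest to the target shooting time (illustrative helper)."""
    idx = min(range(len(stat_times)),
              key=lambda i: abs(stat_times[i] - target_shot_time))
    return second_data[idx]
```

For long sequences sorted by time, a binary search (e.g. with Python's `bisect` module) would replace the linear scan, but the linear form keeps the matching rule explicit.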
Further, the first item of first orientation data and the first item of second orientation data correspond to the same time.
Further, the data acquisition module 51 is further configured to acquire a statistical time sequence corresponding to the second orientation data sequence, where the statistical times in the statistical time sequence correspond one-to-one to the second orientation data in the second orientation data sequence;
the compensation data calculation module 53 is further configured to calculate a statistical time difference between each statistical time in the statistical time sequence and the first statistical time, and to calculate the compensation data corresponding to each item of second orientation data from the statistical time difference and the offset speed.
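A sketch of this step, assuming the offset speed is in radians per second and the compensation for each sample is the drift accumulated since the first statistical time (names are illustrative):

```python
def compensation_data(stat_times, offset_speed):
    """Compensation for each second orientation datum: the statistical
    time difference to the first statistical time multiplied by the
    offset speed (illustrative sketch)."""
    t0 = stat_times[0]
    return [(t - t0) * offset_speed for t in stat_times]
```

For example, `compensation_data([0.0, 1.0, 2.0], 0.05)` yields compensations of approximately 0.0, 0.05, and 0.1 rad for the three samples.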
Further, the compensation module 54 is further configured to calculate the product of each item of second orientation data and its compensation data, so as to obtain the compensated second orientation data.
Further, the data acquisition module 51 is further configured to acquire the image sequence and the shooting time of each image to obtain the shooting time sequence; and to process each image with a neural network algorithm to obtain the first orientation data corresponding to each image, the first orientation data sequence being formed from the items of first orientation data.
For the specific functional implementation of each module of the yaw angle compensation device, reference may be made to the embodiments of the orientation data compensation method, which are not repeated here. The yaw angle compensation device calculates the offset speed of the yaw angle of the orientation meter; obtains the compensation data corresponding to each item of second orientation data according to the offset speed; and compensates each item of second orientation data in the second orientation data sequence based on the compensation data, thereby compensating the yaw angle and eliminating or reducing the negative influence of yaw-angle drift on tracking accuracy. The global information of visual positioning is used to correct the continuous drift of the orientation meter's yaw angle, improving the tracking accuracy of visual odometry.
Fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the invention. The electronic device comprises a memory 202 and a processor 201 connected to each other.
The memory 202 is used to store program instructions implementing the above orientation data compensation method; for details, reference may be made to the embodiments of the orientation data compensation method, which are not repeated here.
The processor 201 is used to execute the program instructions stored in the memory 202.
The processor 201 may also be referred to as a Central Processing Unit (CPU). The processor 201 may be an integrated circuit chip with signal processing capability. The processor 201 may also be a general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 202 may be a memory module, a TF card, or the like, and may store all information in the electronic device, including input raw data, computer programs, intermediate operation results, and final operation results, storing and retrieving information at the locations specified by the controller. Only with a memory can the electronic device retain data and operate normally. By use, the storage of an electronic device can be divided into main storage (internal storage) and auxiliary storage (external storage). External storage is usually a magnetic medium, an optical disc, or the like, and can store information for long periods. Internal storage refers to the storage components on the main board, which hold the data and programs currently being executed; it stores programs and data only temporarily, and the data is lost when the power is turned off.
In the several embodiments provided in the present application, it should be understood that the disclosed method and apparatus may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative: the division into modules or units is only a logical division, and an actual implementation may divide them differently; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
Units described as separate parts may or may not be physically separate, and parts shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the embodiments.
In addition, the functional units in the embodiments of the present application may be integrated into one processing unit, each unit may exist physically alone, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence, or the part that contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute all or part of the steps of the methods of the embodiments of the present application.
Fig. 7 is a schematic structural diagram of a computer-readable storage medium according to the present invention. The storage medium of the present application stores a program file 203 capable of implementing all of the above orientation data compensation methods. The program file 203 may be stored in the storage medium in the form of a software product and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) or a processor to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disc, or terminal devices such as a computer, a server, a mobile phone, or a tablet.
The above description is only an embodiment of the present invention and is not intended to limit the scope of the present invention. All equivalent structural or process modifications made using the contents of this specification and the accompanying drawings, or direct or indirect applications in other related technical fields, are likewise included within the scope of patent protection of the present invention.

Claims (11)

CN202110321774.4A | Priority 2021-03-25 | Filed 2021-03-25 | Orientation data compensation method and device, electronic equipment and readable storage medium | Active | Granted as CN113055598B (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN202110321774.4A | 2021-03-25 | 2021-03-25 | Orientation data compensation method and device, electronic equipment and readable storage medium

Publications (2)

Publication Number | Publication Date
CN113055598A | 2021-06-29
CN113055598B (en) | 2022-08-05

Family

ID=76515692

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN202110321774.4A | Active | CN113055598B (en)

Country Status (1)

Country | Link
CN (1) | CN113055598B (en)

Citations (18)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN1334915A (en)* | 1998-12-17 | 2002-02-06 | Tokin Corporation | Orientation angle detector
US20040174453A1 (en)* | 2003-03-05 | 2004-09-09 | Fuji Jukogyo Kabushiki Kaisha | Image compensation apparatus
CN101460810A (en)* | 2006-04-28 | 2009-06-17 | Nokia Corporation | Calibration
US20110175806A1 (en)* | 2010-01-06 | 2011-07-21 | Cywee Group Ltd. | Electronic device for use in motion detection and method for obtaining resultant deviation thereof
US20150298822A1 (en)* | 2014-04-16 | 2015-10-22 | Parrot | Rotary-wing drone provided with a video camera delivering stabilized sequences of images
JP2017106891A (en)* | 2015-11-30 | 2017-06-15 | Ricoh Co., Ltd. | Inertia device, program, and positioning method
US20170193830A1 (en)* | 2016-01-05 | 2017-07-06 | California Institute Of Technology | Controlling unmanned aerial vehicles to avoid obstacle collision
CN107167131A (en)* | 2017-05-23 | 2017-09-15 | Beijing Institute of Technology | Method and system for deep fusion and real-time compensation of micro-inertial measurement information
CN107945220A (en)* | 2017-11-30 | 2018-04-20 | Huazhong University of Science and Technology | Reconstruction method based on binocular vision
CN108561274A (en)* | 2017-12-29 | 2018-09-21 | China Resources Power Wind Energy (Shantou Chaonan) Co., Ltd. | Wind turbine yaw correction method and device, computer device, and readable storage medium
CN108737734A (en)* | 2018-06-15 | 2018-11-02 | Guangdong OPPO Mobile Telecommunications Corp., Ltd. | Image compensation method and device, computer-readable storage medium and electronic equipment
JP2018190024A (en)* | 2017-04-28 | 2018-11-29 | Osaka City University | Aircraft control system and aircraft
CN110377058A (en)* | 2019-08-30 | 2019-10-25 | Shenzhen Autel Intelligent Aviation Technology Co., Ltd. | Yaw angle correction method and device for an aircraft, and aircraft
US20190365479A1 (en)* | 2018-05-30 | 2019-12-05 | Auris Health, Inc. | Systems and methods for location sensor-based branch prediction
CN110595464A (en)* | 2019-08-19 | 2019-12-20 | Beijing Shuyan Technology Development Co., Ltd. | IMU and visual sensor fusion positioning method and device
CN110775288A (en)* | 2019-11-26 | 2020-02-11 | Harbin Institute of Technology (Shenzhen) | Bionics-based flying mechanical neck-eye system and control method
CN111207736A (en)* | 2016-07-26 | 2020-05-29 | Guangzhou EHang Intelligent Technology Co., Ltd. | Method, system, equipment and readable storage medium for calibrating the yaw angle of an unmanned aerial vehicle
JP2021028188A (en)* | 2019-08-09 | 2021-02-25 | Fujitsu Limited | Drone imaging device and method

Non-Patent Citations (1)

Title
ZHANG Xiao et al.: "Simulation research on optimized control of UAV attitude stability", Computer Simulation (《计算机仿真》)*

Also Published As

Publication number | Publication date
CN113055598B | 2022-08-05

Similar Documents

Publication | Title
CN111121768B (en) | Robot pose estimation method and device, readable storage medium and robot
CN113034594A (en) | Pose optimization method and device, electronic equipment and storage medium
CN114119744B (en) | Method, device, equipment and storage medium for constructing a point cloud map
CN110160524A (en) | Sensing data acquisition method and device for an inertial navigation system
CN111220155A (en) | Method, device and processor for estimating pose based on a binocular visual-inertial odometer
CN108827341A (en) | Method for determining the deviation in an inertial measurement unit of an image collecting device
CN112762944A (en) | Zero-speed interval detection and zero-speed update method
CN111998870B (en) | Calibration method and device of a camera inertial navigation system
CN114964270B (en) | Fusion positioning method, device, vehicle and storage medium
CN113055598B (en) | Orientation data compensation method and device, electronic equipment and readable storage medium
CN112461258A (en) | Parameter correction method and device
CN118984391A (en) | Sampling rate adjustment method, device, computer equipment, and storage medium
WO2022179047A1 (en) | State information estimation method and apparatus
US12294773B2 (en) | Method and apparatus for generating video digest and readable storage medium
CN115507846B (en) | Positioning accuracy evaluation method, device and computer-readable storage medium
US20150142310A1 (en) | Self-position measuring terminal
CN114322996B (en) | Pose optimization method and device of a multi-sensor fusion positioning system
CN111829552A (en) | Error correction method and device for a visual inertial system
CN115906641A (en) | IMU gyroscope random error compensation method and device based on deep learning
CN115311624A (en) | Slope displacement monitoring method and device, electronic equipment and storage medium
CN115560744A (en) | Robot, multi-sensor-based three-dimensional mapping method and storage medium
CN114979456A (en) | Anti-shake processing method and device for video data, computer equipment and storage medium
CN119665994A (en) | Method, device and electronic equipment for determining the position and posture of an autonomous driving vehicle
CN118225080A (en) | Inertial navigation data acquisition method and device
CN114545017B (en) | Speed fusion method, device and computer equipment based on optical flow and accelerometer

Legal Events

Date | Code | Title | Description
PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant
