CN104361248A - Method for transforming actual motion of human body into motion in virtual scene - Google Patents

Method for transforming actual motion of human body into motion in virtual scene

Info

Publication number
CN104361248A
Authority
CN
China
Prior art keywords
motion
virtual
energy consumption
movement
actual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201410686154.0A
Other languages
Chinese (zh)
Other versions
CN104361248B (en)
Inventor
李旋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN201410686154.0A
Publication of CN104361248A
Priority to PCT/CN2015/093639 (WO2016082660A1)
Application granted
Publication of CN104361248B
Expired - Fee Related
Anticipated expiration

Abstract

The invention relates to a method for transforming the actual motion of a human body into motion in a virtual scene, and belongs to the technical field of human-body energy consumption measurement. The method comprises measuring the energy consumed by the actual motion of the human body, using the consumed energy as a medium, mapping the measured energy consumed by the actual motion into the energy consumed by a virtual motion, obtaining the distance covered by the virtual motion along a motion path in the virtual scene, and thereby obtaining the end point of the virtual motion. The method has the advantage that the motion of the human body is transformed into motion between two points on a map in the virtual scene, so that the level of energy consumed by the motion of the human body can be judged from the distance between the two points in the virtual scene.

Description

Method for transforming actual motion of human body into motion in virtual scene
Technical Field
The invention relates to a method for mapping human body movement to movement in a virtual scene through energy consumption, and belongs to the technical field of human body energy consumption measurement.
Background
With the continuous development of the economy and the continuous improvement of living standards, people pay more and more attention to their health and make various exercise plans to build up their bodies, so various kinds of exercise equipment and devices for monitoring exercise plans have appeared.
A conventional device for monitoring an exercise plan can detect the energy consumption of the exercise in real time and can also record the exercise distance, but because the distance covered differs between exercise types, the total energy consumed by an exercise cannot be judged from the exercise distance alone.
In addition, the prior art can only monitor real-time energy consumption and movement distance in the real environment; it cannot estimate the distance that could be covered, for the same expenditure of physical strength, on a path in a different geographic environment.
Disclosure of Invention
To address these shortcomings, the invention provides a method for converting the actual motion of a human body into motion in a virtual scene; with this method, the energy consumption of different exercises, or of the same exercise performed by different people, can be compared through the position of the end point in the virtual scene.
The technical scheme provided by the invention for solving the technical problems is as follows: the method for transforming the actual motion of the human body into the motion in the virtual scene comprises the steps of measuring the energy consumed by the actual motion of the human body, mapping the measured energy consumed by the actual motion of the human body into the energy consumed by the virtual motion by taking the energy consumed by the motion as a medium, obtaining the distance of the virtual motion in the motion path of the virtual scene, and further obtaining the motion end point of the virtual motion.
The method comprises the following steps:
step one, dividing movement into different movement modes, and obtaining the corresponding acceleration information generated during movement through a three-axis acceleration sensor worn on the human body;
step two, sampling: for each motion mode, sampling and testing the acceleration information generated by motion in that mode, determining the motion mode check interval of the corresponding mode, and at the same time obtaining the virtual motion speed in each motion mode;
step three, establishing a model: establishing energy consumption models corresponding to the motion modes according to the height, the weight, the age and the sex of the human body, the sampled acceleration information and the motion modes;
step four, when the human body actually moves, acquiring acceleration information corresponding to the movement generated at the moment through a three-axis acceleration sensor, and comparing and checking the acceleration information with the movement mode checking interval determined in the step two to determine the movement mode of the human body;
step five, selecting a corresponding energy consumption model in the step three according to the motion mode judged in the step four, substituting acceleration information measured by the actual motion of the human body in the step four into the energy consumption model for solving, and thus obtaining the energy consumed by the actual motion;
step six, planning a motion path for the virtual motion in the virtual scene; letting A and B be two points on the path, A the starting point and B the motion end point; recording the distance from point A to point B as dAB and the height difference from point A to point B as hAB; and establishing models relating the distance dAB and the height hAB to the position information of A and B respectively, where the position information of point A is known;
step seven, selecting a motion mode on the virtual motion path planned in step six, and establishing a virtual energy consumption model in terms of the motion mode, the height, the weight, the age, the sex, the acceleration information, the virtual motion time, the wind speed, the altitude, the rainfall, the snowfall and the temperature;
step eight, according to the energy consumed by the actual movement obtained in the step five and the virtual energy consumption model obtained in the step seven, obtaining the virtual movement time of the virtual movement on the virtual movement path by the fact that the energy consumed by the actual movement is equal to the virtual energy consumption;
step nine, establishing a model relating the virtual motion speed, the virtual motion time and the distance dAB; obtaining the distance dAB from the virtual motion speed obtained in step two and the virtual motion time obtained in step eight; and determining the position of point B from the models of step six that relate the distance dAB and the height hAB to the position information of A and B, thereby obtaining the position information of point B.
The method for determining the motion mode check interval of each motion mode in step two comprises the following steps:
step 2.1, sampling: a motion mode is performed for a sampling duration, and the acceleration information of the three-axis acceleration sensor over that duration is obtained;
step 2.2, determining the check interval: from the acceleration information of the three-axis acceleration sensor, the average power of the acceleration signal is calculated and its fluctuation interval is determined, thereby determining the acceleration information check interval of the three-axis acceleration sensor;
step 2.3, switching to another motion mode and repeating steps 2.1 and 2.2 to obtain the acceleration information check interval of that motion mode; the acceleration information check interval is the motion mode check interval.
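As an illustration of this check-interval construction, the following sketch (not part of the patent) computes the average power of the sampled tri-axial acceleration magnitude and bounds it by its fluctuation; the use of the acceleration magnitude and of a standard-deviation band are assumptions, since the patent does not fix how the fluctuation interval is measured, and all names are illustrative.

    import numpy as np

    def motion_mode_check_interval(ax, ay, az, k=3.0):
        # ax, ay, az: tri-axial acceleration samples recorded while one motion
        # mode is performed for the whole sampling duration.
        # k: assumed width of the fluctuation band, in standard deviations.
        a = np.sqrt(np.asarray(ax) ** 2 + np.asarray(ay) ** 2 + np.asarray(az) ** 2)
        power = a ** 2                      # instantaneous "power" of the acceleration signal
        mean_power = power.mean()           # average power over the sampling duration
        fluctuation = power.std()           # spread used to bound the check interval
        return (mean_power - k * fluctuation, mean_power + k * fluctuation)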
The method for determining the virtual movement speed in each movement mode in the second step comprises the following steps:
step 2.4, sampling: a motion mode is performed for a sampling duration, and the acceleration information of the three-axis acceleration sensor over that duration is obtained;
step 2.5, determining the motion speed of the motion mode from the acceleration information of the three-axis acceleration sensor, averaging the speeds obtained over the sampling duration, and taking the average as the virtual motion speed of that motion mode;
step 2.6, switching to another motion mode and repeating steps 2.4 and 2.5 to obtain the virtual motion speed of that motion mode.
The energy consumed by the movement comprises the sum of basic energy consumption and energy consumption corresponding to each movement mode; the exercise modes in the first step comprise three exercise modes of walking, running and cycling; the energy consumption model corresponding to each motion mode in the third step comprises walking, running and riding energy consumption models;
Basic energy consumption model: male and female forms are used, wherein E11 and E12, the basic energy consumption of the male and the female respectively, are functions of the height H, the weight W, the age N and the exercise time T together with correction coefficients.
When the exercise mode is walking, the walking energy consumption values E21 (male) and E22 (female) are functions of H, W, the walking speed V2 and the walking time T2 together with correction coefficients.
When the exercise mode is running, the running energy consumption values E31 (male) and E32 (female) are functions of H, W, the running speed V3 and the running time T3 together with correction coefficients.
When the exercise mode is cycling, the riding energy consumption values E41 (male) and E42 (female) are functions of H, W, the rotating speed V4 and the riding time T4 together with correction coefficients.
The concrete forms of these models are given in the embodiment below.
the energy consumed by the actual movement in the step five comprises the sum of actual basic energy consumption and energy consumption corresponding to each movement mode; energy consumed by the actual movementThe above-mentionedE1,E2,E3,E4Energy consumed by actual exercise, actual basic energy consumption, actual walking exercise energy consumption, actual running exercise energy consumption and bicycle riding exercise energy consumption are respectively; the motion time of each motion mode is the time of the acceleration information of each motion mode in an acceleration information inspection interval; the walking speed and the running speed are obtained by calculating the walking frequency and the stride of the human body walking measured by the three-axis acceleration sensor; the rotating speed is obtained by calculating human body pedaling step frequency parameters measured by a three-axis acceleration sensor, and the obtained actual parameters are brought into each model in the third step, so that the actual basic energy consumption, the actual walking exercise energy consumption, the actual running exercise energy consumption and the bicycle riding exercise energy consumption can be obtained.
The virtual energy consumption in step seven is the sum E5 + E6 + E7 + E8 + E9, where E5 is the virtual basic energy consumption, E6 the energy consumption of the selected virtual movement mode, E7 the work done against gravity, E8 the wind energy consumption, and E9 the energy consumption of other environmental factors.
The gravity work consumption model is: E7 = β7·W·hAB;
wherein E7 is the gravity work consumption, W is the body weight, hAB is the height difference between A and B, and β7 is a correction coefficient, β7 ∈ (2.5, 5.3);
The wind energy consumption model is: E8 = (ρ·p·v2² + c)·dAB·cos θ;
wherein ρ is the wind resistance, p is the air mass density, v2 is the wind speed, c is a constant, and θ is the angle between the direction from point A to point B and the wind direction;
The other environmental factor consumption model is: E9 = (i·O + j·R + k·Z + m·Q)·(α9·H + β9·W)·t + C;
wherein E9 is the environmental energy consumption; i, j, k, m, α9 and β9 are correction coefficients; O, R, Z and Q are respectively the altitude, the rainfall, the snowfall and the temperature; H, W and t are respectively the height, the weight and the virtual exercise time; C is a constant; i ∈ (0.003, 0.004), j ∈ (0.38, 0.42), k ∈ (0.49, 0.51), m ∈ (0.50, 0.55).
The virtual basic energy consumption E5 is the basic energy consumption evaluated over the virtual movement time, and the energy consumption E6 of the selected virtual movement mode is that mode's energy consumption evaluated over the virtual movement time.
Compared with the prior art, the method for transforming the actual motion of the human body into the motion in the virtual scene has the following beneficial effects:
1. By measuring the energy consumed by the movement of the human body and using that consumed energy as a medium, the invention converts the actual movement into a virtual movement, consuming the same energy, from one point to another point in the virtual scene, thereby determining the position of the second point. For the same starting point, the energy consumed by the movement can then be judged from the position of the end point.
2. The invention converts the actual motion of the human body into virtual motion in different geographic environments through the energy consumed by the motion, so the consumption of the same motion on paths in other geographic environments can be estimated from the actual motion. This helps people plan future journeys scientifically and prevents accidents caused by insufficient physical strength when exercising in such environments.
Detailed Description
In order to better explain the technical solution of the present invention, it is described in detail below.
Examples
To better illustrate the invention, an example is given below:
measuring the energy consumed by the actual motion of the human body, taking the energy consumed by the motion as a medium, mapping the measured energy consumed by the actual motion of the human body into the energy consumed by the virtual motion, obtaining the distance of the virtual motion on the motion path of the virtual scene, and further obtaining the motion end point of the virtual motion. Namely, the energy consumed by the actual motion of the human body is taken as the energy consumed by the virtual motion, the motion time of the human body in the virtual scene is calculated according to the energy consumed by the virtual motion, the motion distance of the human body in the virtual scene is calculated, and the motion endpoint of the human body in the virtual scene is further calculated.
The method is divided into a sampling and model-establishing stage, a consumption-measuring stage for each motion mode, and a virtual scene end point calculation stage.
Sampling and modeling stage:
Step one: the movement is divided into different movement modes, specifically the three modes of walking, running and cycling. Three-axis acceleration sensors are worn on the limb joints, the waist and the head of the human body, and the corresponding acceleration information generated during movement is obtained through these sensors;
step two, sampling: for each motion mode, the acceleration information generated by motion in that mode is sampled and tested, the motion mode check interval of the corresponding mode is determined, and the virtual motion speed of each motion mode is obtained at the same time.
The method for determining the motion mode check interval of each motion mode in step two comprises the following steps:
step 2.1, sampling: a motion mode is performed for a sampling duration, and the acceleration information of the three-axis acceleration sensor over that duration is obtained; step 2.2, determining the check interval: from the acceleration information of the three-axis acceleration sensor, the average power of the acceleration signal is calculated and its fluctuation interval is determined, thereby determining the acceleration information check interval of the three-axis acceleration sensor;
step 2.3, switching to another motion mode and repeating steps 2.1 and 2.2 to obtain the acceleration information check interval of that motion mode; the acceleration information check interval is the motion mode check interval.
The method for determining the virtual movement speed in each movement mode in the second step comprises the following steps:
step 2.4, sampling: a motion mode is performed for a sampling duration, and the acceleration information of the three-axis acceleration sensor over that duration is obtained;
step 2.5, determining the motion speed of the motion mode from the acceleration information of the three-axis acceleration sensor, averaging the speeds obtained over the sampling duration, and taking the average as the virtual motion speed of that motion mode;
step 2.6, switching to another motion mode and repeating steps 2.4 and 2.5 to obtain the virtual motion speed of that motion mode.
For example, in the walking mode a person walks at a normal walking speed while the acceleration information of the three-axis acceleration sensors is collected at a fixed frequency, generally once every 0.5 seconds, and the walking speed corresponding to each sample is calculated from the acceleration information. Sampling continues in this way throughout the sampling duration, the walking speed corresponding to each collected sample is obtained, the arithmetic mean of these walking speeds is taken, and this average is used as the virtual walking speed. A typical sampling duration is 30 to 45 minutes. The virtual running speed and the virtual cycling speed are obtained in the same way.
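A minimal sketch of this averaging step is given below; the per-sample speed estimator (step frequency times stride) is only one plausible reading of the patent's description, and all names are illustrative.

    import numpy as np

    def walking_speed_from_sample(step_frequency_hz, stride_m):
        # The patent derives the walking speed from the step frequency and stride
        # measured by the tri-axial sensor; the exact estimator is not specified,
        # so cadence * stride is used here as an assumption.
        return step_frequency_hz * stride_m

    def virtual_walking_speed(per_sample_speeds):
        # Arithmetic mean of the walking speeds computed from the samples taken
        # every 0.5 s over the 30-45 minute sampling duration.
        return float(np.mean(per_sample_speeds))

    # Example: a 40-minute session sampled every 0.5 s gives 4800 speed values.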
Step three, establishing a model: establishing energy consumption models corresponding to the motion modes according to the height, the weight, the age and the sex of the human body, the sampled acceleration information and the motion modes;
the energy consumed by the movement comprises the sum of basic energy consumption and energy consumption corresponding to each movement mode; the energy consumption model corresponding to each motion mode comprises walking, running and riding energy consumption models;
basic energy consumption model:
male: E11 = (0.2085H + 0.5731W - 0.2815N + 2.7697)T;
female: E12 = (0.1771H + 0.3985W - 0.1948N + 27.2956)T;
wherein E11 and E12 are the basic energy consumption of the male and the female respectively, and H, W, N and T are respectively the height, the weight, the age and the exercise time;
when the exercise mode is the walking mode, the exercise energy consumption model is as follows:
male: E21 = (0.2145H + 0.3724W + 11.257)(V2)²T2;
female: E22 = (0.2683H + 0.3225W + 14.749)(V2)²T2;
wherein E21 and E22 are the walking energy consumption of the male and the female respectively, and H, W, V2 and T2 are respectively the height, the weight, the walking speed and the walking time.
When the exercise mode is running, the exercise energy consumption model is as follows:
male: E31 = (0.2351H + 0.6662W + 7.932)V3T3;
female: E32 = (0.1855H + 0.5523W + 16.327)V3T3;
wherein E31 and E32 are the running energy consumption of the male and the female respectively, and H, W, V3 and T3 are respectively the height, the weight, the running speed and the running time.
When the exercise mode is the bicycle riding mode, the exercise energy consumption model is as follows:
male: E41 = (0.1147H + 0.3492W + 8.794)V4T4;
female: E42 = (0.1152H + 0.5578W + 7.452)V4T4;
wherein E41 and E42 are the cycling energy consumption of the male and the female respectively, and H, W, V4 and T4 are respectively the height, the weight, the rotating (pedaling) speed and the riding time.
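The four models above can be written directly as functions. The sketch below copies the embodiment's coefficients verbatim, while the choice of units for H, W, N, the speeds and the times is left open, since the patent does not state them; the function and parameter names are illustrative.

    def basic_energy(sex, H, W, N, T):
        # Basic energy consumption model (male / female forms).
        if sex == "male":
            return (0.2085 * H + 0.5731 * W - 0.2815 * N + 2.7697) * T
        return (0.1771 * H + 0.3985 * W - 0.1948 * N + 27.2956) * T

    def walking_energy(sex, H, W, V2, T2):
        # Walking mode energy consumption model.
        if sex == "male":
            return (0.2145 * H + 0.3724 * W + 11.257) * V2 ** 2 * T2
        return (0.2683 * H + 0.3225 * W + 14.749) * V2 ** 2 * T2

    def running_energy(sex, H, W, V3, T3):
        # Running mode energy consumption model.
        if sex == "male":
            return (0.2351 * H + 0.6662 * W + 7.932) * V3 * T3
        return (0.1855 * H + 0.5523 * W + 16.327) * V3 * T3

    def cycling_energy(sex, H, W, V4, T4):
        # Cycling mode energy consumption model.
        if sex == "male":
            return (0.1147 * H + 0.3492 * W + 8.794) * V4 * T4
        return (0.1152 * H + 0.5578 * W + 7.452) * V4 * T4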
Consumption measuring stage for each motion mode
And step four, when the human body actually moves, acquiring acceleration information corresponding to the movement generated at the moment through a three-axis acceleration sensor, and comparing and checking the acceleration information with the movement mode checking interval determined in the step two to determine the movement mode of the human body.
The motion mode check intervals were obtained in the sampling stage, so when the human body actually moves, the acceleration information generated by the motion is compared with the motion mode check intervals; if the acceleration information falls within the check interval of a certain motion mode, the motion at that moment belongs to that mode. For example, if the acceleration information falls within the check interval of running, the exercise at that moment is running. Meanwhile, the length of time during which the motion stays within the same motion mode check interval is taken as the actual motion time of that mode.
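A sketch of this comparison step is shown below; it assumes the same per-sample average-power statistic used when the check intervals were built and the 0.5 s sample spacing from the sampling example above, and the names are illustrative.

    def classify_and_time(power_samples, check_intervals, dt=0.5):
        # power_samples: average power of the acceleration signal for each sample
        # check_intervals: dict mapping mode name -> (low, high) check interval
        # dt: seconds represented by one sample (0.5 s in the sampling example)
        time_in_mode = {mode: 0.0 for mode in check_intervals}
        for p in power_samples:
            for mode, (low, high) in check_intervals.items():
                if low <= p <= high:
                    time_in_mode[mode] += dt   # accumulate actual motion time per mode
                    break                      # a sample is assigned to at most one mode
        return time_in_mode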
Step five, selecting a corresponding energy consumption model in the step three according to the motion mode judged in the step four, substituting acceleration information measured by the actual motion of the human body in the step four into the energy consumption model for solving, and thus obtaining the energy consumed by the actual motion;
The energy consumed by the actual movement in step five comprises the sum of the actual basic energy consumption and the energy consumption of each motion mode, i.e. E = E1 + E2 + E3 + E4, where E, E1, E2, E3 and E4 are respectively the energy consumed by the actual movement, the actual basic energy consumption, the actual walking energy consumption, the actual running energy consumption and the actual cycling energy consumption. The motion time of each motion mode is the time during which the acceleration information stays within that mode's acceleration information check interval. The walking speed and the running speed are calculated from the step frequency and stride measured by the three-axis acceleration sensor, and the rotating speed is calculated from the pedaling cadence measured by the three-axis acceleration sensor. Substituting the measured actual parameters into the models of step three gives the actual basic energy consumption, the actual walking energy consumption, the actual running energy consumption and the actual cycling energy consumption.
Actual basic energy consumption model:
male: E11 = (0.2085H + 0.5731W - 0.2815N + 2.7697)T′;
female: E12 = (0.1771H + 0.3985W - 0.1948N + 27.2956)T′;
wherein E11 and E12 are the basic energy consumption of the male and the female respectively; H, W, N and T′ are the height, the weight, the age and the actual exercise time; the actual basic energy consumption E1 is obtained by selecting the male or female basic energy consumption model according to the actual situation;
when the exercise mode is the walking mode, the exercise energy consumption model is as follows:
male: E21 = (0.2145H + 0.3724W + 11.257)(V2)²T2′;
female: E22 = (0.2683H + 0.3225W + 14.749)(V2)²T2′;
wherein E21 and E22 are the walking energy consumption of the male and the female respectively; H, W, V2 and T2′ are the height, the weight, the walking speed and the actual walking time; the actual walking energy consumption E2 is obtained by selecting the male or female walking mode energy consumption model according to the actual situation.
When the exercise mode is running, the exercise energy consumption model is as follows:
male: E31 = (0.2351H + 0.6662W + 7.932)V3T3′;
female: E32 = (0.1855H + 0.5523W + 16.327)V3T3′;
wherein E31 and E32 are the running energy consumption of the male and the female respectively; H, W, V3 and T3′ are the height, the weight, the running speed and the actual running time. The actual running energy consumption E3 is obtained by selecting the male or female running mode energy consumption model according to the actual situation.
When the exercise mode is the bicycle riding mode, the exercise energy consumption model is as follows:
male: E41 = (0.1147H + 0.3492W + 8.794)V4T4′;
female: E42 = (0.1152H + 0.5578W + 7.452)V4T4′;
wherein E41 and E42 are the cycling energy consumption of the male and the female respectively; H, W, V4 and T4′ are the height, the weight, the rotating speed and the actual riding time. The actual cycling energy consumption E4 is obtained by selecting the male or female cycling mode energy consumption model according to the actual situation.
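Putting these together, the total actual energy E = E1 + E2 + E3 + E4 can be sketched as below; it reuses the model functions from the earlier sketch and the per-mode times from the check-interval comparison, and the dictionary keys are illustrative.

    def actual_energy(sex, H, W, N, time_in_mode, speeds):
        # time_in_mode: actual time spent in each mode ("walk", "run", "cycle")
        # speeds: measured walking / running / pedaling speeds for those modes
        T_total = sum(time_in_mode.values())        # total actual exercise time T'
        E1 = basic_energy(sex, H, W, N, T_total)
        E2 = walking_energy(sex, H, W, speeds.get("walk", 0.0), time_in_mode.get("walk", 0.0))
        E3 = running_energy(sex, H, W, speeds.get("run", 0.0), time_in_mode.get("run", 0.0))
        E4 = cycling_energy(sex, H, W, speeds.get("cycle", 0.0), time_in_mode.get("cycle", 0.0))
        return E1 + E2 + E3 + E4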
Step six: a motion path for the virtual motion is planned in the virtual scene; let A and B be two points on the path, A the starting point and B the motion end point; record the distance from point A to point B as dAB and the height difference from point A to point B as hAB; and establish models relating the distance dAB and the height hAB to the position information of A and B respectively, where the position information of point A is known.
The invention uses GPS as the positioning system of the virtual scene. The motion path of the virtual motion is planned through GPS: a starting point and an end point are given on a map, and the path between them is obtained through GPS. Let A and B be two points on the path, A the starting point and B the motion end point; record the distance from point A to point B as dAB and the height difference from point A to point B as hAB; and establish models relating dAB and hAB to the position information of A and B respectively, where the position information of point A, i.e. its longitude and latitude, is known and can be provided by GPS.
Note that the longitude and latitude of A, B are (jA, wA), (jB, wB), respectively, and R is the earth radius;
dAB=R*arccos[sin(wA)sin(wB)+cos(wA)cos(wB)*cos(jA-jB)];
hAB=R[sin(wB)-sin(wA)];
Since both A and B are points on the map, once the distance dAB from point A to point B has been calculated, the end point B can be determined from the starting point A, and the position information of point B can thus be obtained.
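The two formulas above translate directly into code. In the sketch below the longitudes j and latitudes w are taken in radians and the earth radius R in metres; both unit choices are assumptions, since the patent leaves them symbolic.

    import math

    def distance_and_height(jA, wA, jB, wB, R=6371000.0):
        # dAB: great-circle distance between A and B on a sphere of radius R.
        cos_angle = (math.sin(wA) * math.sin(wB)
                     + math.cos(wA) * math.cos(wB) * math.cos(jA - jB))
        cos_angle = max(-1.0, min(1.0, cos_angle))   # guard against rounding error
        dAB = R * math.acos(cos_angle)
        # hAB: the patent's height term R*(sin(wB) - sin(wA)) between A and B.
        hAB = R * (math.sin(wB) - math.sin(wA))
        return dAB, hAB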
Step seven: a motion mode is selected on the virtual motion path planned in step six, and a virtual energy consumption model is established in terms of the motion mode, the height, the weight, the age, the sex, the acceleration information, the virtual motion time, the wind speed, the altitude, the rainfall, the snowfall and the temperature.
The virtual energy consumption in step seven is the sum E5 + E6 + E7 + E8 + E9, where E5 is the virtual basic energy consumption, E6 the energy consumption of the selected virtual movement mode, E7 the work done against gravity, E8 the wind energy consumption, and E9 the energy consumption of other environmental factors.
In this embodiment, if the selected virtual exercise mode is walking, then:
Energy consumption E6 of the selected virtual movement mode:
When the exercise mode is the walking mode, the exercise energy consumption model is as follows:
male: E61 = (0.2145H + 0.3724W + 11.257)(V2)²t;
female: E62 = (0.2683H + 0.3225W + 14.749)(V2)²t;
wherein E61 and E62 are the virtual walking energy consumption of the male and the female respectively; H, W, V2 and t are the height, the weight, the walking speed and the virtual walking time; the virtual walking energy consumption E6 is obtained by selecting the male or female virtual walking model according to the actual situation. The exercise energy consumption models for the running and cycling modes are obtained similarly.
Virtual base energy consumption model:
male: E51 = (0.2085H + 0.5731W - 0.2815N + 2.7697)t;
female: E52 = (0.1771H + 0.3985W - 0.1948N + 27.2956)t;
wherein E51 and E52 are the virtual basic energy consumption of the male and the female respectively; H, W, N and t are respectively the height, the weight, the age and the virtual movement time; the virtual basic energy consumption E5 is obtained by selecting the male or female virtual basic energy consumption model according to the actual situation;
The gravity work consumption model is: E7 = 2.68·W·hAB;
wherein E7 is the gravity work consumption, W is the body weight, and hAB is the height difference between A and B;
The wind energy consumption model is: E8 = (ρ·p·v2² + c)·dAB·cos θ;
wherein ρ is the wind resistance, p is the air mass density, v2 is the wind speed, c is a constant, and θ is the angle between the direction from point A to point B and the wind direction;
other environmental factor consumption models:
E9 = (0.0032O + 0.3859R + 0.4953Z + 0.5231Q)(0.0845H + 0.8282W)t + 56; wherein E9 is the environmental energy consumption; O, R, Z and Q are respectively the altitude, the rainfall, the snowfall and the temperature, the snowfall being expressed as the equivalent rainfall during snow and the altitude being provided by GPS according to the current position; H, W and t are respectively the height, the weight and the virtual movement time.
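Collecting the five terms for the walking example gives the sketch below; the coefficient 0.2145 follows the walking model given earlier, the environmental inputs are treated as plain numbers in the patent's unstated units, and the names are illustrative.

    import math

    def virtual_energy_walking(sex, H, W, N, t, v_walk, d_AB, h_AB,
                               rho, p, v_wind, c, theta, O, R_rain, Z, Q):
        # E5: virtual basic energy consumption over the virtual movement time t
        # E6: virtual walking energy consumption over t
        if sex == "male":
            E5 = (0.2085 * H + 0.5731 * W - 0.2815 * N + 2.7697) * t
            E6 = (0.2145 * H + 0.3724 * W + 11.257) * v_walk ** 2 * t
        else:
            E5 = (0.1771 * H + 0.3985 * W - 0.1948 * N + 27.2956) * t
            E6 = (0.2683 * H + 0.3225 * W + 14.749) * v_walk ** 2 * t
        E7 = 2.68 * W * h_AB                                       # work against gravity
        E8 = (rho * p * v_wind ** 2 + c) * d_AB * math.cos(theta)  # wind energy consumption
        E9 = ((0.0032 * O + 0.3859 * R_rain + 0.4953 * Z + 0.5231 * Q)
              * (0.0845 * H + 0.8282 * W) * t + 56)                # other environmental factors
        return E5 + E6 + E7 + E8 + E9

In step eight, d_AB and h_AB are themselves expressed through the virtual movement time (dAB = v·t), so the whole expression becomes a function of t alone.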
Step eight, according to the energy consumed by the actual movement obtained in the step five and the virtual energy consumption model obtained in the step seven, obtaining the virtual movement time of the virtual movement on the virtual movement path by the fact that the energy consumed by the actual movement is equal to the virtual energy consumption;
and (4) solving the model obtained in the step seven according to the fact that the energy consumed by the actual movement is equal to the virtual energy consumption, and obtaining a relational expression of the distance and the height between the virtual movement time and A, B.
Step nine: a model relating the virtual motion speed, the virtual motion time and the distance dAB is established; the distance dAB is obtained from the virtual motion speed obtained in step two and the virtual motion time obtained in step eight; and the position of point B is determined from the models of step six that relate the distance dAB and the height hAB to the position information of A and B, thereby obtaining the position information of point B.
From the virtual movement speed obtained in step two and the virtual movement time obtained in step eight:
dAB = v·t, where v and t are the virtual movement speed and the virtual movement time of the person on the virtual map.
Combining the relation between the virtual movement time and the distance and height between A and B obtained in step eight with the position-information relations obtained in step six, dAB = R*arccos[sin(wA)sin(wB) + cos(wA)cos(wB)*cos(jA - jB)] and hAB = R[sin(wB) - sin(wA)], these trigonometric relations are solved for the position information (longitude and latitude) of point B, and point B is then located by GPS according to this longitude and latitude, thereby determining its position.
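One way to carry out this final step numerically, under the assumptions that the total virtual energy is monotone in the virtual movement time and that the planned path can be locally described by a bearing from A, is sketched below; energy_of_time would wrap the virtual-energy model above with dAB = v·t, and all names are illustrative.

    import math

    def destination(wA, jA, bearing, d, R=6371000.0):
        # Standard forward great-circle step: the point reached from latitude wA,
        # longitude jA after travelling distance d along a fixed bearing
        # (all angles in radians, distances in the same unit as R).
        delta = d / R
        wB = math.asin(math.sin(wA) * math.cos(delta)
                       + math.cos(wA) * math.sin(delta) * math.cos(bearing))
        jB = jA + math.atan2(math.sin(bearing) * math.sin(delta) * math.cos(wA),
                             math.cos(delta) - math.sin(wA) * math.sin(wB))
        return wB, jB

    def endpoint_from_energy(E_actual, energy_of_time, v, wA, jA, bearing,
                             t_max=48 * 3600.0, iters=60):
        # Bisection on the virtual movement time t so that energy_of_time(t) = E_actual,
        # then dAB = v * t and B is the point dAB along the path from A.
        low, high = 0.0, t_max
        for _ in range(iters):
            mid = 0.5 * (low + high)
            if energy_of_time(mid) < E_actual:
                low = mid
            else:
                high = mid
        t = 0.5 * (low + high)
        dAB = v * t
        wB, jB = destination(wA, jA, bearing, dAB)
        return (wB, jB), t, dAB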
From the above, the present invention has the following features:
1. The virtual scene of the present invention may be a real, concrete scene and is not limited to a purely virtual one. The conversion uses energy consumption as the medium; the speed and time of the movement are not simply mapped onto a map. If the movement were mapped onto the map directly through the movement speed, the energy consumed by different people would be judged by the straight distance between two points on the map, which is not scientific. The invention maps the actual movement into the virtual scene with energy consumption as the intermediate variable: the energy consumed by the actual movement is unchanged, while the distance that this movement covers in the virtual scene changes with the scene, and the consumption of the actual movement is judged by the distance in the virtual scene, so the result is accurate and a useful reference.
2. If the movement distance in a different geographic environment were calculated directly by multiplying the actual movement speed by the movement time, the result would differ seriously from the actual outcome, because different geographic environments affect the body's energy consumption very differently. For example, if the walking speed on a plain is used directly to calculate the movement distance in a high-altitude area, the result is wrong: affected by altitude, climate and other factors, the human body consumes far more energy than on the plain, so in the same time the actual movement distance is far smaller than the calculated one. Calculating the movement distance in one geographic environment from the movement speed in another is therefore not scientific. The invention solves this problem well: taking the energy consumed by the movement as the medium, it maps the measured energy consumed by the actual movement of the human body into the energy consumed by the virtual movement, obtains the distance covered by the virtual movement along the motion path in the virtual scene, and then obtains the end point of the virtual movement, so the movement distance in another geographic environment can be calculated properly.
The preferred embodiments of the present invention described above are only for illustrating the embodiments of the present invention and are not to be construed as limiting the objects of the invention described above and the contents and scope of the appended claims, and any simple modification, equivalent change and modification made to the above embodiments according to the technical essence of the present invention still fall within the technical and claim protection scope of the present invention.

Claims (8)

6. The method of transforming actual motion of a human body into motion in a virtual scene of claim 5, wherein: the energy consumed by the actual movement in step five comprises the sum of the actual basic energy consumption and the energy consumption of each motion mode, i.e. E = E1 + E2 + E3 + E4, where E, E1, E2, E3 and E4 are respectively the energy consumed by the actual movement, the actual basic energy consumption, the actual walking energy consumption, the actual running energy consumption and the actual cycling energy consumption; the motion time of each motion mode is the time during which the acceleration information stays within that mode's acceleration information check interval; the walking speed and the running speed are calculated from the step frequency and stride measured by the three-axis acceleration sensor; the rotating speed is calculated from the pedaling cadence measured by the three-axis acceleration sensor; and substituting the measured actual parameters into the models of step three gives the actual basic energy consumption, the actual walking energy consumption, the actual running energy consumption and the actual cycling energy consumption.
CN201410686154.0A | 2014-11-25 | 2014-11-25 | Method for transforming actual motion of human body into motion in virtual scene | Expired - Fee Related | CN104361248B (en)

Priority Applications (2)

Application Number | Priority Date | Filing Date | Title
CN201410686154.0A (CN104361248B) | 2014-11-25 | 2014-11-25 | Method for transforming actual motion of human body into motion in virtual scene
PCT/CN2015/093639 (WO2016082660A1) | 2014-11-25 | 2015-11-03 | Method for converting actual motion of human body into motion in virtual scenario

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN201410686154.0A | 2014-11-25 | 2014-11-25 | Method for transforming actual motion of human body into motion in virtual scene

Publications (2)

Publication Number | Publication Date
CN104361248A | 2015-02-18
CN104361248B | 2017-04-19

Family

ID=52528507

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN201410686154.0A (Expired - Fee Related, granted as CN104361248B) | Method for transforming actual motion of human body into motion in virtual scene | 2014-11-25 | 2014-11-25

Country Status (2)

Country | Link
CN (1) | CN104361248B (en)
WO (1) | WO2016082660A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
WO2016082660A1 (en)* | 2014-11-25 | 2016-06-02 | 李旋 | Method for converting actual motion of human body into motion in virtual scenario
CN109151555A (en)* | 2018-10-29 | 2019-01-04 | 奇想空间(北京)教育科技有限公司 | Amusement facility and method for handling video image
CN109682394A (en)* | 2019-01-28 | 2019-04-26 | 百度在线网络技术(北京)有限公司 | Method and apparatus for pushing walking route information

Citations (5)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US20030004444A1 (en)* | 2000-05-30 | 2003-01-02 | Barbara Perner | Rehabilitation device for persons with paresis of lower limbs enabling them to walk
CN101553860A (en)* | 2005-11-28 | 2009-10-07 | 鲍尔格力德健身器材公司 | Method and apparatus for operatively controlling a virtual reality scenario with an isometric exercise system
CN102226880A (en)* | 2011-06-03 | 2011-10-26 | 北京新岸线网络技术有限公司 | Somatosensory operation method and system based on virtual reality
CN103181767A (en)* | 2011-12-28 | 2013-07-03 | 三星电子株式会社 | Method for measuring quantity of exercise and display apparatus thereof
CN103258338A (en)* | 2012-02-16 | 2013-08-21 | 克利特股份有限公司 | Method and system for driving simulated virtual environments with real data

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
JP4907128B2 (en)* | 2005-08-30 | 2012-03-28 | 任天堂株式会社 | Game system and game program
JP2007144107A (en)* | 2005-10-25 | 2007-06-14 | Vr Sports:Kk | Exercise assisting system
KR101339431B1 (en)* | 2010-11-19 | 2013-12-09 | 도시바삼성스토리지테크놀러지코리아 주식회사 | Game controller, game machine, and game system employing the game controller
CN103656988A (en)* | 2013-08-06 | 2014-03-26 | 刘涛 | Electricity-saving intelligent game running machine
CN104111978B (en)* | 2014-06-25 | 2017-08-29 | 京东方科技集团股份有限公司 | Energy consumption measurement method and energy consumption measurement system
CN104361248B (en)* | 2014-11-25 | 2017-04-19 | 李旋 | Method for transforming actual motion of human body into motion in virtual scene

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US20030004444A1 (en)* | 2000-05-30 | 2003-01-02 | Barbara Perner | Rehabilitation device for persons with paresis of lower limbs enabling them to walk
CN101553860A (en)* | 2005-11-28 | 2009-10-07 | 鲍尔格力德健身器材公司 | Method and apparatus for operatively controlling a virtual reality scenario with an isometric exercise system
CN102226880A (en)* | 2011-06-03 | 2011-10-26 | 北京新岸线网络技术有限公司 | Somatosensory operation method and system based on virtual reality
CN103181767A (en)* | 2011-12-28 | 2013-07-03 | 三星电子株式会社 | Method for measuring quantity of exercise and display apparatus thereof
CN103258338A (en)* | 2012-02-16 | 2013-08-21 | 克利特股份有限公司 | Method and system for driving simulated virtual environments with real data

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
WO2016082660A1 (en)* | 2014-11-25 | 2016-06-02 | 李旋 | Method for converting actual motion of human body into motion in virtual scenario
CN109151555A (en)* | 2018-10-29 | 2019-01-04 | 奇想空间(北京)教育科技有限公司 | Amusement facility and method for handling video image
CN109151555B (en)* | 2018-10-29 | 2021-11-19 | 北京西潼科技有限公司 | Amusement apparatus and method of processing video images
CN109682394A (en)* | 2019-01-28 | 2019-04-26 | 百度在线网络技术(北京)有限公司 | Method and apparatus for pushing walking route information
CN109682394B (en)* | 2019-01-28 | 2021-08-03 | 百度在线网络技术(北京)有限公司 | Method and device for pushing walking route information

Also Published As

Publication number | Publication date
CN104361248B (en) | 2017-04-19
WO2016082660A1 (en) | 2016-06-02

Similar Documents

Publication | Publication Date | Title
CN104061934B (en)Pedestrian indoor position tracking method based on inertial sensor
CN103411607B Pedestrian's step-size estimation and dead reckoning method
CN104790283B (en) A rapid detection system for road surface roughness based on vehicle accelerometer
JP5344491B2 (en) Method, apparatus, display unit, and system for measuring the advancement of a moving person
KR102111104B1 (en)Route smoothing
CN105608326B (en)Mountainous area complex terrain wind field large vortex simulation entrance boundary condition input method
CN105190238B (en)Method and apparatus for improving navigation of riding
CN108955675A (en)A kind of underground piping track detection system and method based on inertia measurement
CN104197935B (en)Indoor localization method based on mobile intelligent terminal
CN109725284B (en)Method and system for determining a direction of motion of an object
CN103093088A (en)Safety evaluation method for steep slope and winding road
CN104361248B (en)Method for transforming actual motion of human body into motion in virtual scene
CN111831960B (en) Dynamic measurement method of network-connected truck load based on identification and elimination of slope disturbance
CN103175502A (en)Attitude angle detecting method based on low-speed movement of data glove
CN106644208A (en)Riding capability analyzing system and riding capability analyzing method
CN109459028A (en)A kind of adaptive step estimation method based on gradient decline
CN108303043A (en)Plant leaf area index detection method and system combined of multi-sensor information
CN105698795A (en)Indoor localization step size calculation method
d’Auteuil et al.Riverine hydrokinetic resource assessment using low cost winter imagery
CN112067058A (en)Automatic monitoring equipment for detecting karst channel and use method
CN108939488A (en)A kind of sailing boat supplemental training device based on augmented reality and training paths planning method
CN112004183A (en) A robot autonomous localization method based on convolutional neural network fusion of IMU and WiFi information
CN107339982A (en)High ferro wire plotting method
CN113325455B (en)Method and system for tracking and determining indoor position of object
CN107727045B (en) Road flat curve radius measurement method based on driving track

Legal Events

Date | Code | Title | Description
C06 | Publication
PB01 | Publication
C10 | Entry into substantive examination
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant
GR01 | Patent grant
CF01 | Termination of patent right due to non-payment of annual fee

Granted publication date: 2017-04-19

Termination date: 2018-11-25

CF01 | Termination of patent right due to non-payment of annual fee
