# Dene33/video_to_bvh
Convert human motion from video to .bvh with Google Colab
- Go to https://colab.research.google.com
- `File > Upload notebook... > GitHub`, then paste this link:
  https://github.com/Dene33/video_to_bvh/blob/master/video_to_bvh.ipynb
- Ensure that `Runtime > Change runtime type` is set to `Python 3` with `GPU` as the hardware accelerator.
The second step is to install all the required dependencies. Select the first code cell and push `shift+enter`. You'll see lines of executing code; wait until it's done (1-2 minutes).
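Once the install cell finishes, a quick way to confirm the environment is ready is to check that the needed packages import. This is an optional, hypothetical helper (the package names passed in below are placeholders, not the notebook's actual dependency list):

```python
import importlib.util

def missing_packages(names):
    """Return the subset of package names that cannot be imported."""
    return [n for n in names if importlib.util.find_spec(n) is None]

# Placeholder names; substitute the packages the install cell sets up.
print(missing_packages(["numpy", "json"]))
```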
- Select the code cell and push `shift+enter`
- Push the `select files` button and select the video you want to process (it should contain only one person, with all body parts in frame; long videos will take a lot of time to process)
- Specify the desired `fps` rate at which to convert the video to images. Lower fps = faster processing
- Select the code cell and push `shift+enter`
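The `fps` value controls how many frames are extracted, which is why a lower rate processes faster. The extraction is roughly equivalent to an ffmpeg call like the one sketched below (the function name and output filename pattern are assumptions, not the notebook's actual code):

```python
def ffmpeg_frames_cmd(video_path, out_dir, fps):
    """Build an ffmpeg command that dumps frames at the given fps.

    The %04d.png output pattern is an assumption; the notebook's own
    naming scheme may differ.
    """
    return [
        "ffmpeg", "-i", video_path,
        "-vf", f"fps={fps}",          # sample frames at the desired rate
        f"{out_dir}/%04d.png",
    ]

print(ffmpeg_frames_cmd("input.mp4", "frames", 12))
```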
This step does all the job:

- Conversion of the video to images (images are required for pose estimation to work)
- 2D pose estimation. For each image, a corresponding .json file is created with 2D joints, in a format similar to the output .json format of the original openpose. A fork of keras_Realtime_Multi-Person_Pose_Estimation is used.
- 3D pose estimation. Creates a .csv file with 3D joint coordinates for all the frames of the video. A fork of End-to-end Recovery of Human Shape and Pose is used.
- Conversion of the estimated .csv files to .bvh with the help of a custom script and a .blend file.
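If you want to inspect the intermediate 2D results, the per-image .json files can be read with a few lines. This sketch assumes the openpose layout (a `people` list whose entries carry a flat `pose_keypoints_2d` array of x, y, confidence values); the fork used by the notebook only produces a *similar* format, so the field names may differ:

```python
import json

def read_2d_joints(json_text):
    """Extract (x, y, confidence) triples from an openpose-style .json."""
    data = json.loads(json_text)
    person = data["people"][0]            # the pipeline expects one person
    flat = person["pose_keypoints_2d"]    # flat [x, y, c, x, y, c, ...] array
    return [tuple(flat[i:i + 3]) for i in range(0, len(flat), 3)]

sample = '{"people": [{"pose_keypoints_2d": [10.0, 20.0, 0.9, 30.0, 40.0, 0.8]}]}'
print(read_2d_joints(sample))
```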
- Select the code cell and push `shift+enter`. The .bvh will be saved to your PC.
- If you want to preview it, run Blender on your PC: `File > Import > Motion Capture (.bvh)`, then push `alt+a` to play the animation.
- Select the code cell and push `shift+enter`.
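Before importing into Blender, you can sanity-check the exported file: a .bvh starts with a plain-text `HIERARCHY` section in which joints are declared on `ROOT`/`JOINT` lines, so listing joint names takes only a few lines. This is a standalone sketch, independent of the notebook's code:

```python
def bvh_joint_names(bvh_text):
    """List joint names from the HIERARCHY section of a .bvh file.

    Joints are declared on lines like "ROOT Hips" or "JOINT Spine".
    """
    names = []
    for line in bvh_text.splitlines():
        parts = line.split()
        if len(parts) == 2 and parts[0] in ("ROOT", "JOINT"):
            names.append(parts[1])
    return names

sample = """HIERARCHY
ROOT Hips
{
  JOINT Spine
  {
  }
}
MOTION
"""
print(bvh_joint_names(sample))
```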