[TOG & SIGGRAPH 2024] Joint Stroke Tracing and Correspondence for 2D Animation

MarkMoHR/JoSTC


[Paper] | [Paper (ACM)] | [Project Page]

This code produces stroke tracing and correspondence results, which can be imported into an inbetweening product named CACANi for making 2D animations.

Outline

Dependencies

Quick Start

Model Preparation

Download the models here, and place them in this file structure:

models/
    quickdraw-perceptual.pth
    point_matching_model/
        sketch_correspondence_50000.pkl
        transform_module/
            sketch_transform_30000.pkl
    stroke_tracing_model/
        sketch_tracing_30000.pkl

Create Reference Vector Frames with Krita

Our method takes as inputs consecutive raster keyframes and a single vector drawing from the starting keyframe, and then generates vector images for the remaining keyframes with one-to-one stroke correspondence. So we have to create the vector image for the reference frame here.

Note: We provide several examples for testing in the directory sample_inputs/. If you use them, you can skip step 1 and step 2 below and execute step 3 directly.

  1. Our method takes squared images as input, so please preprocess the images first using tools/image_squaring.py:
python3 tools/image_squaring.py --file path/to/the/image.png
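The squaring step can be pictured as padding the image onto a square canvas. Below is a minimal sketch of that idea with NumPy; pad_to_square is a hypothetical helper, and the actual tools/image_squaring.py may crop, resize, or pad differently:

```python
import numpy as np

def pad_to_square(img: np.ndarray, fill: int = 255) -> np.ndarray:
    """Pad an H x W x C image with a constant color so that H == W.

    Hypothetical illustration of what an image-squaring step might do;
    the repository's tools/image_squaring.py may differ.
    """
    h, w = img.shape[:2]
    size = max(h, w)
    # Center the original image on a square canvas filled with `fill`.
    canvas = np.full((size, size) + img.shape[2:], fill, dtype=img.dtype)
    top = (size - h) // 2
    left = (size - w) // 2
    canvas[top:top + h, left:left + w] = img
    return canvas
```

For example, a 100x60 RGB image padded this way becomes 100x100, with the original content centered on a white background.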
  2. Follow the tutorial here to make vector frames as a reference in svg format with Krita.
  3. Place the svg files in sample_inputs/*/svg/. Then, convert them into npz format using tools/svg_to_npz.py:
cd tools/
python3 svg_to_npz.py --database ../sample_inputs/rough/ --reference 23-0.png
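To get a feel for the stroke data an SVG reference frame carries, here is a hedged sketch of reading point lists with only the standard library. This is not the repository's converter (tools/svg_to_npz.py likely also parses full <path> data and writes npz arrays); it only reads explicit <polyline> points:

```python
import xml.etree.ElementTree as ET

SVG_NS = "{http://www.w3.org/2000/svg}"

def polyline_points(svg_text: str):
    """Collect the point lists of all <polyline> elements in an SVG string.

    A simplified stand-in for the real svg-to-npz conversion; each
    returned entry is one stroke as a list of (x, y) tuples.
    """
    root = ET.fromstring(svg_text)
    strokes = []
    for node in root.iter(SVG_NS + "polyline"):
        raw = node.attrib.get("points", "")
        pts = [tuple(float(v) for v in p.split(","))
               for p in raw.split() if "," in p]
        strokes.append(pts)
    return strokes
```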

Execute the Main Code

Perform joint stroke tracing and correspondence using sketch_tracing_inference.py. We provide several examples for testing in the directory sample_inputs/.

python3 sketch_tracing_inference.py --dataset_base sample_inputs --data_type rough --img_seq 23-0.png 23-1.png 23-2.png
  • --data_type: specify the image type with clean or rough.
  • --img_seq: specify the animation frames here. The first one should be the reference frame.
  • The results are placed in outputs/inference/*. Inside this folder:
    • raster/ stores rendered line drawings of the target vector frames.
    • rgb/ stores visualization (with target images underneath) of the vector stroke correspondence to the reference frame.
    • rgb-wo-bg/ stores visualization without target images underneath.
    • parameter/ stores vector stroke parameters.
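The parameter/ files use NumPy's npz container. The array names inside are not documented above, so the following only sketches the npz mechanics with a made-up key:

```python
import io
import numpy as np

def save_and_load_strokes(strokes: dict) -> dict:
    """Round-trip a dict of stroke-parameter arrays through the npz format.

    The key names here are illustrative; the arrays stored by the
    repository under parameter/ may be named and shaped differently.
    """
    buf = io.BytesIO()
    np.savez(buf, **strokes)  # one named array per key
    buf.seek(0)
    with np.load(buf) as data:
        return {k: data[k] for k in data.files}
```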

Create Inbetweening and Animation with CACANi

  1. Convert the output npz files (vector stroke parameters) into svg format using tools/npz_to_svg.py:
cd tools/
python3 npz_to_svg.py --database_input ../sample_inputs/ --database_output ../outputs/inference/ --data_type rough --file_names 23-1.png 23-2.png
  • The results are placed in outputs/inference/*/svg. There are two kinds of results:
    • chain/: the svg files store stroke chains, defining each path as a chain.
    • separate/: the svg files store separated strokes, defining each path as a single stroke. Note that the automatic inbetweening in CACANi relies on this format.
  2. Follow the tutorial here to generate inbetweening and 2D animation with CACANi.
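The separate/ convention above (one path element per stroke) can be sketched with the standard library. strokes_to_svg is a hypothetical illustration, not the repository's tools/npz_to_svg.py:

```python
import xml.etree.ElementTree as ET

def strokes_to_svg(strokes, width=256, height=256) -> str:
    """Write each stroke (a list of (x, y) points) as its own <path>.

    Mirrors the 'separate/' convention of one path per stroke; the
    real converter in the repository may structure its output differently.
    """
    svg = ET.Element("svg", xmlns="http://www.w3.org/2000/svg",
                     width=str(width), height=str(height))
    for pts in strokes:
        # Encode the polyline as an SVG path: move to the first point,
        # then draw line segments to the rest.
        d = "M " + " L ".join(f"{x} {y}" for x, y in pts)
        ET.SubElement(svg, "path", d=d, fill="none", stroke="black")
    return ET.tostring(svg, encoding="unicode")
```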

More Tools

  • tools/vis_difference.py: visualize the difference between the reference image and the target one. The results are placed in sample_inputs/*/raster_diff/:
cd tools/
python3 vis_difference.py --database_input ../sample_inputs --data_type rough --reference_image 23-0.png --target_image 23-1.png
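One plausible way such a difference visualization works is to tint reference and target stroke pixels in contrasting colors. The sketch below is a hypothetical version of the idea; the actual tools/vis_difference.py may render differences differently:

```python
import numpy as np

def overlay_difference(ref: np.ndarray, tgt: np.ndarray) -> np.ndarray:
    """Overlay two binary line drawings: reference in red, target in blue.

    Inputs are H x W boolean arrays where True marks a stroke pixel.
    Pixels covered by both drawings are shown in black.
    """
    h, w = ref.shape
    vis = np.full((h, w, 3), 255, dtype=np.uint8)  # white background
    vis[ref] = (255, 0, 0)                         # reference strokes: red
    vis[tgt] = (0, 0, 255)                         # target strokes: blue
    vis[ref & tgt] = (0, 0, 0)                     # overlapping pixels: black
    return vis
```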

Vector Stroke Correspondence Dataset

We collect a dataset for training with 10k+ pairs of raster frames and their vector drawings with stroke correspondence. Please download it here. We provide reference code in dataset_utils/tuberlin_dataset_util.py showing how to use the data.


Citation

If you use the code and models, please cite:

@article{mo2024joint,
  title={Joint Stroke Tracing and Correspondence for 2D Animation},
  author={Mo, Haoran and Gao, Chengying and Wang, Ruomei},
  journal={ACM Transactions on Graphics},
  volume={43},
  number={3},
  pages={1--17},
  year={2024},
  publisher={ACM New York, NY}
}

