# [TOG & SIGGRAPH 2024] Joint Stroke Tracing and Correspondence for 2D Animation
[Paper] | [Paper (ACM)] | [Project Page]
This code produces stroke tracing and correspondence results, which can be imported into the inbetweening product CACANi for making 2D animations.
## Dependencies

- cudatoolkit == 11.0.3
- cudnn == 8.4.1.50
- pytorch == 1.9.0
- torchvision == 0.9.0
- diffvg
- Krita: for making the reference vector frame
- CACANi: for making inbetweenings and 2D animations
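A quick, optional sanity check of the environment, assuming the pinned packages above are installed and that diffvg built successfully (diffvg's Python module is `pydiffvg`):

```python
import torch
import torchvision

# Confirm the environment matches the versions pinned above.
print("torch:", torch.__version__)              # expected: 1.9.0
print("torchvision:", torchvision.__version__)  # expected: 0.9.0
print("CUDA available:", torch.cuda.is_available())

# Importing pydiffvg confirms the diffvg build is usable from Python.
import pydiffvg
print("pydiffvg loaded from:", pydiffvg.__file__)
```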
## Model Preparation

Download the models here, and place them in the following file structure:
```
models/
  quickdraw-perceptual.pth
  point_matching_model/
    sketch_correspondence_50000.pkl
  transform_module/
    sketch_transform_30000.pkl
  stroke_tracing_model/
    sketch_tracing_30000.pkl
```
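As an optional check that the download succeeded, a minimal sketch like the following should deserialize each checkpoint (this assumes the files were saved with `torch.save`; the paths mirror the layout above):

```python
import torch

# Sanity check: confirm each downloaded checkpoint deserializes.
# Adjust the paths if you placed the files elsewhere.
checkpoints = [
    "models/quickdraw-perceptual.pth",
    "models/point_matching_model/sketch_correspondence_50000.pkl",
    "models/transform_module/sketch_transform_30000.pkl",
    "models/stroke_tracing_model/sketch_tracing_30000.pkl",
]
for path in checkpoints:
    state = torch.load(path, map_location="cpu")
    print(f"{path}: {type(state).__name__}")
```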
## Reference Preparation

Our method takes as input consecutive raster keyframes and a single vector drawing of the starting keyframe, and then generates vector images for the remaining keyframes with one-to-one stroke correspondence. So we first have to create the vector image for the reference frame.
Note: we provide several examples for testing in the directory `sample_inputs/`. If you use them, you can skip steps 1 and 2 below and execute step 3 directly.
1. Our method takes square images as input, so please preprocess the images first using `tools/image_squaring.py` (a minimal sketch of the idea appears after these steps):

   ```
   python3 tools/image_squaring.py --file path/to/the/image.png
   ```
2. Follow the tutorial here to make vector frames as a reference in SVG format with Krita.
3. Place the SVG files in `sample_inputs/*/svg/`. Then, convert them into npz format using `tools/svg_to_npz.py`:

   ```
   cd tools/
   python3 svg_to_npz.py --database ../sample_inputs/rough/ --reference 23-0.png
   ```
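For intuition, step 1's square-input requirement amounts to padding each frame onto a square canvas. Below is a minimal Pillow sketch of that idea; it is not the repository's `tools/image_squaring.py`, whose padding color and output naming may differ:

```python
from PIL import Image

def pad_to_square(path_in, path_out, fill=(255, 255, 255)):
    """Pad a raster frame to a square canvas, keeping the drawing centered."""
    img = Image.open(path_in).convert("RGB")
    side = max(img.size)
    canvas = Image.new("RGB", (side, side), fill)
    # Center the original image on the square canvas.
    canvas.paste(img, ((side - img.width) // 2, (side - img.height) // 2))
    canvas.save(path_out)

pad_to_square("path/to/the/image.png", "path/to/the/image_square.png")
```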
## Joint Stroke Tracing and Correspondence

Perform joint stroke tracing and correspondence using `sketch_tracing_inference.py`. We provide several examples for testing in the directory `sample_inputs/`:
```
python3 sketch_tracing_inference.py --dataset_base sample_inputs --data_type rough --img_seq 23-0.png 23-1.png 23-2.png
```
- `--data_type`: specify the image type with `clean` or `rough`.
- `--img_seq`: specify the animation frames here. The first one should be the reference frame.
- The results are placed in `outputs/inference/*`. Inside this folder:
  - `raster/` stores rendered line drawings of the target vector frames.
  - `rgb/` stores visualizations (with target images underneath) of the vector stroke correspondence to the reference frame.
  - `rgb-wo-bg/` stores visualizations without target images underneath.
  - `parameter/` stores the vector stroke parameters.
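To process several sequences in one go, a small driver script can loop over the documented flags. This is a hypothetical convenience wrapper, not part of the repository:

```python
import subprocess

# Hypothetical batch driver: run inference over several frame sequences using
# only the flags documented above. The sample sequence is the provided example;
# swap in your own frames.
sequences = [
    ("rough", ["23-0.png", "23-1.png", "23-2.png"]),
]
for data_type, frames in sequences:
    subprocess.run(
        ["python3", "sketch_tracing_inference.py",
         "--dataset_base", "sample_inputs",
         "--data_type", data_type,
         "--img_seq", *frames],
        check=True,
    )
```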
- Convert the output npz files (vector stroke parameters) into SVG format using `tools/npz_to_svg.py`:

  ```
  cd tools/
  python3 npz_to_svg.py --database_input ../sample_inputs/ --database_output ../outputs/inference/ --data_type rough --file_names 23-1.png 23-2.png
  ```
- The results are placed in `outputs/inference/*/svg`. There are two kinds of results:
  - `chain/`: the SVG files store stroke chains, defining each path as a chain.
  - `separate/`: the SVG files store separated strokes, defining each path as a single stroke. Note that the automatic inbetweening in CACANi relies on this format.
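One way to see the difference between the two export styles is to count the `<path>` elements in each file: the `separate/` variant stores one path per stroke, so it typically contains more paths than the `chain/` variant. The example file paths below are assumptions based on the output layout described above:

```python
import xml.etree.ElementTree as ET

SVG_NS = {"svg": "http://www.w3.org/2000/svg"}

def count_paths(svg_file):
    """Count <path> elements in an SVG file."""
    root = ET.parse(svg_file).getroot()
    return len(root.findall(".//svg:path", SVG_NS))

# Hypothetical output locations; adjust to your actual results.
print("chain:   ", count_paths("outputs/inference/rough/svg/chain/23-1.svg"))
print("separate:", count_paths("outputs/inference/rough/svg/separate/23-1.svg"))
```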
## Tools

- `tools/vis_difference.py`: visualize the difference between the reference image and the target one. The results are placed in `sample_inputs/*/raster_diff/`:

  ```
  cd tools/
  python3 vis_difference.py --database_input ../sample_inputs --data_type rough --reference_image 23-0.png --target_image 23-1.png
  ```
- `tools/make_inbetweening.py`: visualize the inbetweening in a single image or a gif file (an illustrative sketch of the gif assembly follows this list).
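For a rough idea of what such a preview involves, here is an illustrative Pillow sketch that stitches rendered frames into a gif. It is not the repository's `tools/make_inbetweening.py`, and the frame paths are assumptions based on the output layout described earlier:

```python
from PIL import Image

# Stitch the rendered line drawings from outputs/inference/*/raster/ into an
# animated preview gif. Paths are placeholders for the sample sequence.
frame_paths = [f"outputs/inference/rough/raster/23-{i}.png" for i in range(3)]
frames = [Image.open(p).convert("RGB") for p in frame_paths]
frames[0].save("preview.gif", save_all=True, append_images=frames[1:],
               duration=200, loop=0)
```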
## Dataset

We collected a dataset for training with 10k+ pairs of raster frames and their vector drawings with stroke correspondence. Please download it here. We provide reference code in `dataset_utils/tuberlin_dataset_util.py` showing how to use the data.
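As a hypothetical first look at the downloaded data, the following sketch enumerates a few samples. The actual directory layout and array names are defined by the dataset itself and by `dataset_utils/tuberlin_dataset_util.py`, so treat the glob pattern below as a placeholder:

```python
import glob
import numpy as np

# Enumerate a handful of samples and print the arrays each one contains,
# without assuming specific key names.
for npz_path in sorted(glob.glob("dataset/**/*.npz", recursive=True))[:5]:
    sample = np.load(npz_path, allow_pickle=True)
    print(npz_path, "->", list(sample.files))
```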
## Citation

If you use the code and models, please cite:
```
@article{mo2024joint,
  title={Joint Stroke Tracing and Correspondence for 2D Animation},
  author={Mo, Haoran and Gao, Chengying and Wang, Ruomei},
  journal={ACM Transactions on Graphics},
  volume={43},
  number={3},
  pages={1--17},
  year={2024},
  publisher={ACM New York, NY}
}
```