# [CVPR'24] 3D Neural Edge Reconstruction
Lei Li · Songyou Peng · Zehao Yu · Shaohui Liu · Rémi Pautrat
Xiaochuan Yin · Marc Pollefeys
EMAP enables 3D edge reconstruction from multi-view 2D edge maps.
```shell
git clone https://github.com/cvg/EMAP.git
cd EMAP
conda create -n emap python=3.8
conda activate emap
conda install pytorch==2.1.0 torchvision==0.16.0 torchaudio==2.1.0 pytorch-cuda=12.1 -c pytorch -c nvidia
pip install -r requirements.txt
```
Download datasets:
```shell
python scripts/download_data.py
```
The data is organized as follows:
```
<scan_id>
|-- meta_data.json          # camera parameters
|-- color                   # images for each view
    |-- 0_colors.png
    |-- 1_colors.png
    ...
|-- edge_DexiNed            # edge maps extracted from DexiNed
    |-- 0_colors.png
    |-- 1_colors.png
    ...
|-- edge_PidiNet            # edge maps extracted from PidiNet
    |-- 0_colors.png
    |-- 1_colors.png
    ...
```
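As a quick sanity check, a small helper along these lines can verify that a scan folder matches the layout above. This is a hypothetical snippet, not part of the repository; since the fields inside `meta_data.json` are not documented here, it only checks that the file parses as JSON.

```python
import json
from pathlib import Path

def check_scan(scan_dir):
    """Verify a <scan_id> folder matches the expected layout.

    Returns the view indices that have a color image plus both edge maps
    (DexiNed and PidiNet). `check_scan` is a hypothetical helper, not a
    function from the EMAP codebase.
    """
    scan_dir = Path(scan_dir)
    meta = scan_dir / "meta_data.json"
    if not meta.is_file():
        raise FileNotFoundError(f"missing camera parameters: {meta}")
    json.loads(meta.read_text())  # must at least be valid JSON

    views = []
    for color in sorted((scan_dir / "color").glob("*_colors.png")):
        idx = int(color.name.split("_")[0])
        # a view is complete only if both edge detectors produced a map
        if all((scan_dir / d / color.name).is_file()
               for d in ("edge_DexiNed", "edge_PidiNet")):
            views.append(idx)
    return views
```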
To train and extract edges on different datasets, use the following commands:
```shell
bash scripts/run_ABC.bash
bash scripts/run_Replica.bash
bash scripts/run_DTU.bash
```
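The three scripts are independent runs, so they can be driven sequentially by a small wrapper such as the sketch below. `run_all` and its `dry_run` flag are hypothetical conveniences, not part of the repository.

```python
import subprocess
import sys

# Per-dataset scripts, taken from the commands above.
SCRIPTS = ["scripts/run_ABC.bash", "scripts/run_Replica.bash", "scripts/run_DTU.bash"]

def run_all(scripts=SCRIPTS, dry_run=False):
    """Run each dataset script in order, stopping at the first failure.

    With dry_run=True, only collect the commands that would be executed.
    Returns the list of commands launched (or planned).
    """
    completed = []
    for script in scripts:
        cmd = ["bash", script]
        if not dry_run:
            subprocess.run(cmd, check=True)  # raises on non-zero exit
        completed.append(cmd)
    return completed

if __name__ == "__main__":
    run_all(dry_run="--dry-run" in sys.argv)
```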
We have uploaded the model checkpoints to Google Drive.
To evaluate the extracted edges on the ABC-NEF_Edge dataset, use the following command:

```shell
python src/eval/eval_ABC.py
```
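The actual metrics live in `eval_ABC.py`; a common way to score reconstructed edge points against ground truth is precision/recall at a distance threshold, sketched here in plain Python. The function names and the default threshold are illustrative assumptions, not the script's API.

```python
import math

def _min_dist(p, points):
    # distance from point p to its nearest neighbour in `points` (brute force)
    return min(math.dist(p, q) for q in points)

def edge_scores(pred, gt, tau=0.02):
    """Precision / recall / F-score of predicted edge points vs. ground truth.

    A predicted point counts as correct if it lies within `tau` of some
    ground-truth point; recall mirrors this from the ground-truth side.
    Illustrative only; the repository's eval_ABC.py defines the real metrics.
    """
    precision = sum(_min_dist(p, gt) <= tau for p in pred) / len(pred)
    recall = sum(_min_dist(q, pred) <= tau for q in gt) / len(gt)
    f = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f
```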
- Training Code
- Inference Code
- Evaluation Code
- Custom Dataset Support
The majority of EMAP is licensed under the MIT License.
If you find the code useful, please consider citing the following BibTeX entry.
```bibtex
@InProceedings{li2024neural,
  title     = {3D Neural Edge Reconstruction},
  author    = {Li, Lei and Peng, Songyou and Yu, Zehao and Liu, Shaohui and Pautrat, R{\'e}mi and Yin, Xiaochuan and Pollefeys, Marc},
  booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  year      = {2024},
}
```
If you encounter any issues, you can also contact Lei through lllei.li0386@gmail.com.
This project is built upon NeuralUDF, NeuS and MeshUDF. We use pretrained DexiNed and PidiNet for edge map extraction. We thank all the authors for their great work and repos.