adnan0819/Urban-Tree-Generator

CGI 2022 Paper Accompanying Code Base and Data

This repository contains our dataset contribution and the codebase to reproduce our paper titled "Urban Tree Generator: Spatio-Temporal and Generative Deep Learning for Urban Tree Localization and Modeling", presented at CGI 2022 and published in The Visual Computer journal.

Read our paper here.

Beyond reproducing our results, we hope that the dataset and codebase will help other researchers extend our methods to other domains as well.

Annotated Dataset

As described in Sec. 3.1 of our paper, the annotated dataset of four cities (Chicago, Indianapolis, Austin, and Lagos), labeled into three classes (tree, grass, other), can be downloaded from here.

A sample annotation of Indianapolis is shown below (green = tree, red = grass):

[Figure: annotation sample]
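
For illustration only, the snippet below shows one way such a color-coded annotation image could be decoded into integer class labels. The green = tree / red = grass coding follows the sample above; the exact pixel values and file format used in the released dataset may differ, so treat this as a sketch rather than the dataset's official loader.

```python
# Illustrative only: decode a color-coded annotation image into integer class
# labels (0 = other, 1 = tree, 2 = grass). The pure-green/pure-red values below
# are assumptions based on the sample figure; the released files may differ.
import numpy as np
from PIL import Image

CLASS_COLORS = {
    (0, 255, 0): 1,  # green -> tree (assumed encoding)
    (255, 0, 0): 2,  # red   -> grass (assumed encoding)
}

def decode_annotation(path):
    rgb = np.array(Image.open(path).convert("RGB"))
    labels = np.zeros(rgb.shape[:2], dtype=np.uint8)  # default class: other
    for color, class_id in CLASS_COLORS.items():
        mask = np.all(rgb == np.array(color, dtype=np.uint8), axis=-1)
        labels[mask] = class_id
    return labels
```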

Codebase

Requirements/Prerequisites for installation/training

All required libraries are listed in requirements.txt. To install them directly with pip, run:

pip install -r requirements.txt
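
As a quick sanity check after installation (assuming TensorFlow is among the listed libraries, since the localization model builds on the TensorFlow cGAN network), you can confirm the framework version and GPU visibility:

```python
# Quick environment check (assumes TensorFlow is pinned in requirements.txt).
import tensorflow as tf

print("TensorFlow version:", tf.__version__)
print("GPUs visible to TensorFlow:", tf.config.list_physical_devices("GPU"))
```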

Segmentation and clustering

The repository is organized into directories so that each stage of the paper can be reproduced independently. The directory Segmentation_and_clustering contains all the code needed to train and run inference for the segmentation and clustering stage described in the paper. Here are the prerequisite steps:

  • Clone the repository and change into this directory.
  • Download the preprocessed training data from here.
  • Place the zip file inside the Segmentation_and_clustering directory and unzip it.
  • A directory called Data will be created.
  • Run python main.py from inside Segmentation_and_clustering to train.
  • Inference and usage of pre-trained models are documented and commented inside main.py; an illustrative inference-and-clustering sketch is given after this list.
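
The actual inference entry points live in main.py; the sketch below only illustrates the kind of pipeline this stage describes: a trained segmentation model predicts per-pixel classes, and tree pixels are then grouped into clusters. The checkpoint path, class-index order, and clustering parameters are assumptions for illustration, not the values used in this repository.

```python
# Hypothetical inference sketch; the real entry points are in main.py.
# Checkpoint path, tile shape, class order, and DBSCAN parameters are assumed.
import numpy as np
import tensorflow as tf
from sklearn.cluster import DBSCAN

model = tf.keras.models.load_model("checkpoints/segmentation_model")  # hypothetical path

def segment_and_cluster(tile):
    """tile: float32 array of shape (H, W, C), scaled to [0, 1]."""
    logits = model.predict(tile[np.newaxis, ...])[0]   # (H, W, num_classes)
    class_map = np.argmax(logits, axis=-1)             # 0 = other, 1 = tree, 2 = grass (assumed order)
    tree_pixels = np.argwhere(class_map == 1)          # (N, 2) row/col coordinates of tree pixels
    if len(tree_pixels) == 0:
        return class_map, np.empty((0,), dtype=int)
    # Group nearby tree pixels into per-tree clusters (illustrative parameters).
    cluster_ids = DBSCAN(eps=3, min_samples=10).fit_predict(tree_pixels)
    return class_map, cluster_ids
```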

Localization

The directory Localization contains all the code needed to train and run inference for the localization stage described in the paper (Sec. 4). Here are the prerequisite steps:

  • Clone the repository and change into this directory.
  • Download the preprocessed training data from here.
  • Place the zip file inside the Localization directory and unzip it.
  • Run python train_localization.py from inside Localization to train the cGAN model.
  • Inference and usage of pre-trained models are documented and commented inside train_localization.py; an illustrative training-step sketch is given after this list.
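
Because train_localization.py is described as being inspired by the standard TensorFlow cGAN (pix2pix) network, the sketch below illustrates what one such conditional-GAN training step typically looks like. The loss weighting (LAMBDA = 100), optimizer settings, and the generator/discriminator objects follow the TensorFlow pix2pix tutorial and are assumptions here, not necessarily the values used in this repository.

```python
# Minimal pix2pix-style cGAN training step, modeled on the TensorFlow cGAN
# (pix2pix) tutorial. Hyperparameters and model objects are illustrative.
import tensorflow as tf

LAMBDA = 100  # L1 weight from the pix2pix tutorial; may differ in this repo
bce = tf.keras.losses.BinaryCrossentropy(from_logits=True)

generator_optimizer = tf.keras.optimizers.Adam(2e-4, beta_1=0.5)
discriminator_optimizer = tf.keras.optimizers.Adam(2e-4, beta_1=0.5)

def generator_loss(disc_fake_output, fake_target, real_target):
    adv_loss = bce(tf.ones_like(disc_fake_output), disc_fake_output)
    l1_loss = tf.reduce_mean(tf.abs(real_target - fake_target))
    return adv_loss + LAMBDA * l1_loss

def discriminator_loss(disc_real_output, disc_fake_output):
    real_loss = bce(tf.ones_like(disc_real_output), disc_real_output)
    fake_loss = bce(tf.zeros_like(disc_fake_output), disc_fake_output)
    return real_loss + fake_loss

@tf.function
def train_step(generator, discriminator, input_image, target):
    # The discriminator is assumed to take the conditioning image and the
    # (real or generated) target as a pair, as in the pix2pix tutorial.
    with tf.GradientTape() as gen_tape, tf.GradientTape() as disc_tape:
        fake_target = generator(input_image, training=True)
        disc_real = discriminator([input_image, target], training=True)
        disc_fake = discriminator([input_image, fake_target], training=True)
        gen_loss = generator_loss(disc_fake, fake_target, target)
        disc_loss = discriminator_loss(disc_real, disc_fake)
    gen_grads = gen_tape.gradient(gen_loss, generator.trainable_variables)
    disc_grads = disc_tape.gradient(disc_loss, discriminator.trainable_variables)
    generator_optimizer.apply_gradients(zip(gen_grads, generator.trainable_variables))
    discriminator_optimizer.apply_gradients(zip(disc_grads, discriminator.trainable_variables))
    return gen_loss, disc_loss
```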

Model Architectures

Below is our deep learning model for segmentation of trees (Sec. 3).

[Figure: segmentation model]

For the model of our localization network (Sec. 4), please see the implementation inside train_localization.py, which is inspired by the standard TensorFlow cGAN network. The figure is reproduced below.

[Figure: localization model]

Ref.: TensorFlow cGAN network.
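
For orientation, below is a condensed sketch of the U-Net-style generator used in the TensorFlow cGAN (pix2pix) tutorial that the localization network is modeled after. The depth, filter counts, and input/output channels are illustrative; see train_localization.py for the architecture actually used.

```python
# Condensed U-Net generator skeleton in the style of the TensorFlow cGAN
# (pix2pix) tutorial. Filter counts, depth, and channel sizes are illustrative.
import tensorflow as tf

def downsample(filters, size, apply_batchnorm=True):
    block = tf.keras.Sequential()
    block.add(tf.keras.layers.Conv2D(filters, size, strides=2, padding="same", use_bias=False))
    if apply_batchnorm:
        block.add(tf.keras.layers.BatchNormalization())
    block.add(tf.keras.layers.LeakyReLU())
    return block

def upsample(filters, size):
    block = tf.keras.Sequential()
    block.add(tf.keras.layers.Conv2DTranspose(filters, size, strides=2, padding="same", use_bias=False))
    block.add(tf.keras.layers.BatchNormalization())
    block.add(tf.keras.layers.ReLU())
    return block

def build_generator(input_shape=(256, 256, 3), output_channels=3):
    inputs = tf.keras.layers.Input(shape=input_shape)
    down_stack = [downsample(64, 4, apply_batchnorm=False), downsample(128, 4), downsample(256, 4)]
    up_stack = [upsample(128, 4), upsample(64, 4)]

    # Encoder with saved skip connections, then decoder with concatenation.
    x = inputs
    skips = []
    for down in down_stack:
        x = down(x)
        skips.append(x)
    for up, skip in zip(up_stack, reversed(skips[:-1])):
        x = up(x)
        x = tf.keras.layers.Concatenate()([x, skip])
    outputs = tf.keras.layers.Conv2DTranspose(
        output_channels, 4, strides=2, padding="same", activation="tanh")(x)
    return tf.keras.Model(inputs=inputs, outputs=outputs)
```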

Localization Output Example

An illustrative example of our localization output (bottom row) and the ground truth (top row) for a segment of Chicago is shown below (see Sec. 4 and Sec. 5 of our paper for more results and examples).

[Figure: localization example]

To Cite

If our paper, data, and/or approach is in any way helpful to you and your research, please cite the paper from here as BibTeX.

Alternatively, cite as:

Firoze, A., Benes, B. & Aliaga, D. Urban tree generator: spatio-temporal and generative deep learning for urban tree localization and modeling. Vis Comput (2022). https://doi.org/10.1007/s00371-022-02526-x

