Code Release
tovacinni committed May 12, 2021
1 parent 93d6694 commit 707f1b5
Showing 103 changed files with 11,202 additions and 2 deletions.
14 changes: 14 additions & 0 deletions .gitattributes
Original file line number Diff line number Diff line change
@@ -0,0 +1,14 @@
*.png filter=lfs diff=lfs merge=lfs -text
*.jpg filter=lfs diff=lfs merge=lfs -text
*.jpeg filter=lfs diff=lfs merge=lfs -text
*.mp4 filter=lfs diff=lfs merge=lfs -text
*.gif filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.tar.xz filter=lfs diff=lfs merge=lfs -text
*.pdf filter=lfs diff=lfs merge=lfs -text
*.ppt filter=lfs diff=lfs merge=lfs -text
*.pptx filter=lfs diff=lfs merge=lfs -text
*.obj filter=lfs diff=lfs merge=lfs -text
*.deb filter=lfs diff=lfs merge=lfs -text
*.dll filter=lfs diff=lfs merge=lfs -text
*.exe filter=lfs diff=lfs merge=lfs -text
5 changes: 5 additions & 0 deletions .gitignore
Original file line number Diff line number Diff line change
@@ -0,0 +1,5 @@
.DS_Store
*.sw*
.ipynb_checkpoints
compile_commands.json
.vscode
20 changes: 20 additions & 0 deletions LICENSE
Original file line number Diff line number Diff line change
@@ -0,0 +1,20 @@
The MIT License (MIT)

Copyright (c) 2021, NVIDIA CORPORATION.

Permission is hereby granted, free of charge, to any person obtaining a copy of
this software and associated documentation files (the "Software"), to deal in
the Software without restriction, including without limitation the rights to
use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of
the Software, and to permit persons to whom the Software is furnished to do so,
subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS
FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR
COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER
IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
165 changes: 163 additions & 2 deletions README.md
Original file line number Diff line number Diff line change
@@ -1,8 +1,169 @@
# Neural Geometric Level of Detail: Real-time Rendering with Implicit 3D Surfaces

Official code release for NGLOD. For technical details, please refer to:

**Neural Geometric Level of Detail: Real-time Rendering with Implicit 3D Surfaces**
[Towaki Takikawa*](https://tovacinni.github.io), [Joey Litalien*](https://joeylitalien.github.io), [Kangxue Yin](https://kangxue.org/), [Karsten Kreis](https://scholar.google.de/citations?user=rFd-DiAAAAAJ), [Charles Loop](https://research.nvidia.com/person/charles-loop), [Derek Nowrouzezahrai](http://www.cim.mcgill.ca/~derek/), [Alec Jacobson](https://www.cs.toronto.edu/~jacobson/), [Morgan McGuire](https://casual-effects.com/), and [Sanja Fidler](https://www.cs.toronto.edu/~fidler/)\
In Computer Vision and Pattern Recognition (CVPR), 2021\
**[[Paper](https://arxiv.org/abs/2101.10994)] [[Bibtex](https://nv-tlabs.github.io/nglod/assets/nglod.bib)] [[Project Page](https://nv-tlabs.github.io/nglod/)]**

![](imgs/imgs_teaser.jpg)

If you find this code useful, please consider citing:

```
@inproceedings{takikawa2021nglod,
title = {Neural Geometric Level of Detail: Real-time Rendering with Implicit {3D} Surfaces},
author = {Towaki Takikawa and
Joey Litalien and
Kangxue Yin and
Karsten Kreis and
Charles Loop and
Derek Nowrouzezahrai and
Alec Jacobson and
Morgan McGuire and
Sanja Fidler},
booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
year = {2021},
}
```

## Directory Structure

`sol-renderer` contains our real-time rendering code.

`sdf-net` contains our training code.

Within `sdf-net`:

`sdf-net/lib` contains all of our core codebase.

`sdf-net/app` contains standalone applications that users can run.

## Getting started

### Python dependencies
The easiest way to get started is to create a virtual Python 3.8 environment:
```
conda create -n nglod python=3.8
conda activate nglod
pip install --upgrade pip
pip install -r ./infra/requirements.txt
```

The code also relies on [OpenEXR](https://www.openexr.com/), which requires a system library:

```
sudo apt install libopenexr-dev
pip install pyexr
```

For the full list of dependencies, see the [requirements file](infra/requirements.txt).

### Building CUDA extensions
To build the corresponding CUDA kernels, run:
```
cd sdf-net/lib/extensions
chmod +x build_ext.sh && ./build_ext.sh
```

The above instructions were tested on Ubuntu 18.04/20.04 with CUDA 10.2/11.1.

**Note.** If you wish to use CUDA 10.x, you must install the matching
[CuPy](https://pypi.org/project/cupy/) package (e.g. `pip install cupy-cuda102`); the default requirements target CUDA 11.1.


## Training & Rendering

**Note.** All of the following commands should be run within the `sdf-net` directory.

### Download sample data

To download a cool armadillo:

```
wget https://raw.githubusercontent.com/alecjacobson/common-3d-test-models/master/data/armadillo.obj -P data/
```

To download a cool matcap file:

```
wget https://raw.githubusercontent.com/nidorx/matcaps/master/1024/6E8C48_B8CDA7_344018_A8BC94.png -O data/matcap/green.png
```

### Training from scratch

```
python app/main.py \
--net OctreeSDF \
--num-lods 5 \
--dataset-path data/armadillo_normalized.obj \
--raw-obj-path data/armadillo.obj \
--epoch 250 \
--exp-name armadillo
```

This will populate `_results` with TensorBoard logs.

### Rendering the trained model

If you set custom network parameters during training, you need to pass the same values to the renderer.

For example, if you set `--feature-dim 16` above, you need to set it here too.

```
python app/sdf_renderer.py \
--net OctreeSDF \
--num-lods 5 \
--pretrained _results/models/armadillo.pth \
--render-res 1280 720 \
--shading-mode matcap \
--lod 4
```

By default, this will populate `_results` with the rendered image.

If you want to export a `.npz` model which can be loaded into the C++ real-time renderer, add the argument
`--export path/file.npz`. Note that the renderer only supports the base Neural LOD configuration
(the default parameters with `OctreeSDF`).

## Core Library Development Guide

To add new functionality, you will likely want to make edits to the files in `lib`.

We try to keep the code modular, so that key components such as `trainer.py` and `renderer.py`
rarely need to be modified to add new functionality.

For example, to add a new network architecture, simply add a new Python file in `lib/models` that
inherits from a base class of your choice. You will usually only need to implement the `sdf` method,
which implements the forward pass, but you can override other methods if you need more custom
behavior.
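The pattern above can be sketched with a toy architecture. This is a hypothetical stand-in, not the actual `lib/models` API: `BaseSDF` and `SphereSDF` are illustrative names, and only the `sdf` method comes from the text above.

```python
class BaseSDF:
    """Stand-in for a base network class in lib/models."""

    def __call__(self, coords):
        # Dispatch the forward pass to the subclass's sdf method.
        return self.sdf(coords)

    def sdf(self, coords):
        raise NotImplementedError


class SphereSDF(BaseSDF):
    """Toy 'architecture': the analytic SDF of a unit sphere."""

    def sdf(self, coords):
        # Signed distance of each 3D point to the unit sphere's surface:
        # negative inside, zero on the surface, positive outside.
        return [(x * x + y * y + z * z) ** 0.5 - 1.0 for (x, y, z) in coords]
```

A real model would hold learnable parameters and evaluate a network in `sdf`, but the contract is the same: take a batch of coordinates, return a signed distance per point.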

By default, the loss functions are specified through a CLI argument, which the code parses and
iterates through automatically. The network architecture class is selected the same way: pass the
exact class name, and don't forget to add a line in `__init__.py` to resolve the
namespace.
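A minimal sketch of that CLI-driven selection follows. The `MODEL_REGISTRY` dict and the `--loss` default are illustrative assumptions; in the repo, the class name is resolved against the names exported from `lib/models/__init__.py`.

```python
import argparse


class OctreeSDF:
    """Stand-in for the real OctreeSDF class in lib/models."""


# Maps a class name from the CLI to the class itself. In the repo this role
# is played by the lib.models namespace, which is why new architectures must
# be added to __init__.py.
MODEL_REGISTRY = {"OctreeSDF": OctreeSDF}

parser = argparse.ArgumentParser()
parser.add_argument("--net", default="OctreeSDF")
parser.add_argument("--loss", nargs="+", default=["l2_loss"])
args = parser.parse_args([])  # empty list: use defaults instead of sys.argv

net = MODEL_REGISTRY[args.net]()  # resolve class name -> instance
losses = list(args.loss)          # the trainer iterates over these
```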

## App Development Guide

To make apps that use the core library, add the `sdf-net` directory to the Python `sys.path` so
the modules can be loaded correctly. Then you will likely want to reuse the CLI parser defined
in `lib/options.py` to save time. You can add a new argument group `app` to the parser for custom
CLI arguments used alongside the defaults. See `app/sdf_renderer.py` for an example.
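The steps above can be sketched as follows. This assumes the app file sits in `sdf-net/app/`; the fresh `ArgumentParser` stands in for the parser you would inherit from `lib/options.py`, and `--export` is just one example argument.

```python
import argparse
import os
import sys

# Make the sdf-net modules importable: the app lives in sdf-net/app/, so its
# parent directory is sdf-net itself. (Fall back to cwd when __file__ is unset.)
app_dir = (os.path.dirname(os.path.abspath(__file__))
           if "__file__" in globals() else os.getcwd())
sys.path.append(os.path.dirname(app_dir))


def parse_options():
    # In the repo you would start from the parser in lib/options.py;
    # a plain ArgumentParser stands in for it here.
    parser = argparse.ArgumentParser(description="Example app")
    app_group = parser.add_argument_group("app")
    app_group.add_argument("--export", type=str, default=None,
                           help="Example app-specific argument.")
    return parser


args = parse_options().parse_args([])  # defaults only, for demonstration
```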

Examples of things that are considered `apps` include, but are not limited to:

- visualizers
- training code
- downstream applications

## Third-Party Libraries

This repository includes code derived from three third-party libraries, all distributed under the MIT License:

https://github.com/zekunhao1995/DualSDF

https://github.com/rogersce/cnpy

https://github.com/krrish94/nerf-pytorch

3 changes: 3 additions & 0 deletions imgs/imgs_teaser.jpg
17 changes: 17 additions & 0 deletions infra/requirements.txt
Original file line number Diff line number Diff line change
@@ -0,0 +1,17 @@
torch==1.6.0
torchvision==0.7.0
tensorboard
matplotlib
cupy-cuda111
git+https://github.com/tinyobjloader/tinyobjloader.git#subdirectory=python
pybind11
trimesh>=3.0
tqdm
Pillow
scipy
scikit-image
six==1.12.0
moviepy
opencv-python
plyfile
polyscope
2 changes: 2 additions & 0 deletions sdf-net/.gitattributes
Original file line number Diff line number Diff line change
@@ -0,0 +1,2 @@
*.hdr filter=lfs diff=lfs merge=lfs -text
*.exr filter=lfs diff=lfs merge=lfs -text
8 changes: 8 additions & 0 deletions sdf-net/.gitignore
Original file line number Diff line number Diff line change
@@ -0,0 +1,8 @@
__pycache__
sandbox.ipynb
logs
*.so
imgs
images*
_*/
.polyscope.ini
1 change: 1 addition & 0 deletions sdf-net/.vimrc
Original file line number Diff line number Diff line change
@@ -0,0 +1 @@
set makeprg=./bdr.sh
91 changes: 91 additions & 0 deletions sdf-net/README.md
Original file line number Diff line number Diff line change
@@ -0,0 +1,91 @@
## Directory Structure

`lib` contains all of our core codebase.

`app` contains standalone applications that users can run.

## Training & Rendering

**Note.** All of the following commands should be run within the `sdf-net` directory.

### Download sample data

To download a cool armadillo:

```
wget https://raw.githubusercontent.com/alecjacobson/common-3d-test-models/master/data/armadillo.obj -P data/
```

To download a cool matcap file:

```
wget https://raw.githubusercontent.com/nidorx/matcaps/master/1024/6E8C48_B8CDA7_344018_A8BC94.png -O data/matcap/green.png
```

### Training from scratch

```
python app/main.py \
--net OctreeSDF \
--num-lods 5 \
--dataset-path data/armadillo_normalized.obj \
--raw-obj-path data/armadillo.obj \
--epoch 250 \
--exp-name armadillo
```

This will populate `_results` with TensorBoard logs.

### Rendering the trained model

If you set custom network parameters during training, you need to pass the same values to the renderer.

For example, if you set `--feature-dim 16` above, you need to set it here too.

```
python app/sdf_renderer.py \
--net OctreeSDF \
--num-lods 5 \
--pretrained _results/models/armadillo.pth \
--render-res 1280 720 \
--shading-mode matcap \
--lod 4
```

By default, this will populate `_results` with the rendered image.

If you want to export a `.npz` model which can be loaded into the C++ real-time renderer, add the argument
`--export path/file.npz`. Note that the renderer only supports the base Neural LOD configuration
(the default parameters with `OctreeSDF`).

## Core Library Development Guide

To add new functionality, you will likely want to make edits to the files in `lib`.

We try to keep the code modular, so that key components such as `trainer.py` and `renderer.py`
rarely need to be modified to add new functionality.

For example, to add a new network architecture, simply add a new Python file in `lib/models` that
inherits from a base class of your choice. You will usually only need to implement the `sdf` method,
which implements the forward pass, but you can override other methods if you need more custom
behavior.

By default, the loss functions are specified through a CLI argument, which the code parses and
iterates through automatically. The network architecture class is selected the same way: pass the
exact class name, and don't forget to add a line in `__init__.py` to resolve the
namespace.

## App Development Guide

To make apps that use the core library, add the `sdf-net` directory to the Python `sys.path` so
the modules can be loaded correctly. Then you will likely want to reuse the CLI parser defined
in `lib/options.py` to save time. You can add a new argument group `app` to the parser for custom
CLI arguments used alongside the defaults. See `app/sdf_renderer.py` for an example.

Examples of things that are considered `apps` include, but are not limited to:

- visualizers
- training code
- downstream applications

