This is the final project for the Computer Vision course taught by Prof. Pengshuai Wang at Peking University. In this project, we implement the classic NeRF model with positional encoding and fit it on multi-view images. We also implement extensions of NeRF that can model dynamic scenes, using both the straightforward formulation that represents the scene with a 6D input (the query 3D location, viewing direction, and time) and the D-NeRF formulation that warps each query point into a canonical space with a deformation network.
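As a quick reference, positional encoding maps each input coordinate to sines and cosines at exponentially growing frequencies before it enters the MLP. The snippet below is only a minimal sketch of this idea in PyTorch; the function name and defaults are ours, not the actual API in run.py.

```python
import torch

def positional_encoding(x, num_freqs=10, include_input=True):
    """Encode each coordinate as [sin(2^k x), cos(2^k x)] for k = 0..num_freqs-1.

    x: tensor of shape (..., D), e.g. a 3D location or a viewing direction.
    """
    freqs = 2.0 ** torch.arange(num_freqs, dtype=x.dtype, device=x.device)   # 1, 2, 4, ...
    scaled = x[..., None] * freqs                                    # (..., D, num_freqs)
    enc = torch.cat([torch.sin(scaled), torch.cos(scaled)], dim=-1)  # (..., D, 2*num_freqs)
    enc = enc.flatten(start_dim=-2)                                  # (..., D*2*num_freqs)
    if include_input:
        enc = torch.cat([x, enc], dim=-1)
    return enc

# Example: a batch of 3D points with 10 frequency bands -> 3 + 3*2*10 = 63 features.
pts = torch.rand(1024, 3)
print(positional_encoding(pts).shape)  # torch.Size([1024, 63])
```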
├── logs                # saved checkpoints and pre-trained weights
├── data                # datasets for training and testing
├── configs             # training/testing parameters for the different models
├── requirements.txt    # environment required to run the code
├── run.py              # main entry point for training, testing, and rendering
├── ...
git clone https://github.com/MAMBA4L924/GT-NeRF.git
cd GT-NeRF
pip install -r requirements.txt
cd ..
You can download the pre-trained models from drive. Unzip them into the project root directory so they can be used for testing later. See the following directory structure for an example:
├── logs
│   ├── NeRF
│   │   ├── Lego
│   │   ├── Fern
│   │   ├── ...
│   ├── D-NeRF
│   │   ├── standup
│   │   ├── mutant
│   │   ├── lego
You can download the datasets from drive. Unzip them into the project root directory before training. See the following directory structure for an example:
├── data
│   ├── NeRF
│   │   ├── Lego
│   │   ├── Fern
│   │   ├── ...
│   ├── D-NeRF
│   │   ├── standup
│   │   ├── mutant
│   │   ├── lego
│   │   ├── ...
These .txt files contain the basic parameters for training, data loading, and rendering. You can adjust them yourself to switch between and tweak the different models. An illustrative example config is sketched after the listing below.
├── configs
│   ├── NeRF
│   │   ├── Lego.txt
│   │   ├── Fern.txt
│   │   ├── ...
│   ├── D-NeRF
│   │   ├── standup.txt
│   │   ├── mutant.txt
│   │   ├── lego.txt
│   │   ├── ...
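For illustration only, a config .txt might look like the sketch below. The keys follow the configargparse style used by nerf-pytorch [2]; the exact option names and values in this repo may differ, so check the provided config files rather than copying this verbatim.

```
# illustrative example, not an actual file from this repo
expname = mutant
basedir = ./logs/D-NeRF
datadir = ./data/D-NeRF/mutant
dataset_type = blender

N_rand = 1024        # rays per gradient step
N_samples = 64       # coarse samples per ray
N_importance = 128   # additional fine samples per ray
use_viewdirs = True
white_bkgd = True
lrate = 5e-4
lrate_decay = 500
half_res = True
```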
We provide simple Jupyter notebooks to explore the model. To use them, first download the pre-trained weights and the dataset.
| Description | Jupyter Notebook |
|---|---|
| Synthesize novel views at an arbitrary point in time. | render.ipynb |
| Reconstruct mesh at an arbitrary point in time. | reconstruct.ipynb |
| Quantitatively evaluate trained model. | metrics.ipynb |
First download the pre-trained weights and the dataset. Then run:
python run.py --config configs/D-NeRF/mutant.txt --render_only --render_test
This command will run the mutant experiment. When finished, results are saved to ./D-NeRF/logs/mutant/renderonly_test_799999.
To quantitatively evaluate the trained model, run the metrics.ipynb notebook.
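As a rough idea of what the quantitative evaluation involves, NeRF-style projects typically report PSNR (and often SSIM/LPIPS) between rendered and ground-truth test views. The snippet below is a minimal PSNR sketch and is not taken from metrics.ipynb:

```python
import torch

def psnr(rendered, target):
    """Peak signal-to-noise ratio for images with values in [0, 1]."""
    mse = torch.mean((rendered - target) ** 2)
    return -10.0 * torch.log10(mse)

# Example with dummy data; in practice, load the rendered test frames
# (e.g. from the renderonly_test_* output directory) and the ground-truth views.
rendered = torch.rand(800, 800, 3)
target = torch.rand(800, 800, 3)
print(f"PSNR: {psnr(rendered, target):.2f} dB")
```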
First download the dataset. Then run one of the following:
- For NeRF:
python run.py --config configs/NeRF/Lego.txt
- For D-NeRF:
python run.py --config configs/D-NeRF/mutant.txt
- For T-NeRF:
python run.py --config configs/D-NeRF/mutant.txt --is_straightforward True
- For GT-NeRF:
python run.py --config configs/D-NeRF/mutant.txt --is_ViT True
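Conceptually, T-NeRF feeds time directly to the radiance MLP together with position and viewing direction, whereas D-NeRF [1] first predicts a deformation that warps each point into a canonical space and queries a canonical NeRF there. The sketch below only illustrates that difference; the class and variable names are ours, positional encoding is omitted for brevity, and it does not mirror the actual modules in this repo.

```python
import torch
import torch.nn as nn

class TNeRFHead(nn.Module):
    """Straightforward formulation: concatenate (x, d, t) and regress (rgb, sigma)."""
    def __init__(self, x_dim=3, d_dim=3, t_dim=1, hidden=256):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(x_dim + d_dim + t_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 4))  # 3 for rgb + 1 for density

    def forward(self, x, d, t):
        return self.mlp(torch.cat([x, d, t], dim=-1))

class DNeRFHead(nn.Module):
    """D-NeRF-style formulation: a deformation net predicts an offset from (x, t),
    then a canonical NeRF is queried at the warped point."""
    def __init__(self, x_dim=3, d_dim=3, t_dim=1, hidden=256):
        super().__init__()
        self.deform = nn.Sequential(nn.Linear(x_dim + t_dim, hidden), nn.ReLU(),
                                    nn.Linear(hidden, x_dim))  # predicts delta x
        self.canonical = nn.Sequential(nn.Linear(x_dim + d_dim, hidden), nn.ReLU(),
                                       nn.Linear(hidden, 4))

    def forward(self, x, d, t):
        x_canonical = x + self.deform(torch.cat([x, t], dim=-1))
        return self.canonical(torch.cat([x_canonical, d], dim=-1))

# Example query: 1024 points and directions, all at the same time value.
x, d = torch.rand(1024, 3), torch.rand(1024, 3)
t = torch.full((1024, 1), 0.5)
print(DNeRFHead()(x, d, t).shape)  # torch.Size([1024, 4])
```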
This project was done by me (@Kuangzhi Ge) and @Yiyang Tian. Yiyang is mainly responsible for:
- the implementation of positional encoding
- the implementation of the NeRF model
- LLFF NeRF: PM_Model

I am responsible for:
- the implementation of D-NeRF and T-NeRF
- the proposal of GT-NeRF

We are co-authors of the final report for this project.
[1] @article{pumarola2020d, title={D-NeRF: Neural Radiance Fields for Dynamic Scenes}, author={Pumarola, Albert and Corona, Enric and Pons-Moll, Gerard and Moreno-Noguer, Francesc}, journal={arXiv preprint arXiv:2011.13961}, year={2020} }
[2] @misc{lin2020nerfpytorch, title={NeRF-pytorch}, author={Yen-Chen, Lin}, publisher = {GitHub}, journal = {GitHub repository}, howpublished={\url{https://github.com/yenchenlin/nerf-pytorch/}}, year={2020} }