UC-NeRF: Uncertainty-Aware Conditional Neural Radiance Fields From Endoscopic Sparse Views

Jiaxin Guo¹    Jiangliu Wang¹    Ruofeng Wei¹    Di Kang²    Qi Dou¹    Yun-hui Liu¹,³
¹CUHK    ²Tencent AI Lab    ³HKCLR

This repository contains the official implementation of the TMI 2024 paper UC-NeRF: Uncertainty-Aware Conditional Neural Radiance Fields From Endoscopic Sparse Views. In the paper, we propose an uncertainty-aware conditional NeRF for novel view synthesis that tackles the severe shape-radiance ambiguity arising from sparse surgical views. The core of UC-NeRF is to condition the neural radiance field on multi-view uncertainty estimates so that it adaptively models severe photometric inconsistencies.

Pipeline overview figure
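As a rough illustration of the idea (a hypothetical sketch, not the official UC-NeRF architecture), the snippet below shows a NeRF-style MLP whose color head is conditioned on a per-sample uncertainty embedding; all names and dimensions here are assumptions for illustration.

# Minimal sketch (hypothetical, not the official UC-NeRF model): a
# radiance-field MLP whose color head is conditioned on a multi-view
# uncertainty embedding so rendering can adapt to photometric
# inconsistencies.
import torch
import torch.nn as nn

class UncertaintyConditionedNeRF(nn.Module):
    def __init__(self, pos_dim=63, uncert_dim=8, hidden=256):
        super().__init__()
        # Backbone over positionally encoded 3D sample points.
        self.backbone = nn.Sequential(
            nn.Linear(pos_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        # Density depends on geometry features only.
        self.sigma_head = nn.Linear(hidden, 1)
        # Color is additionally conditioned on the uncertainty embedding.
        self.rgb_head = nn.Sequential(
            nn.Linear(hidden + uncert_dim, hidden // 2), nn.ReLU(),
            nn.Linear(hidden // 2, 3), nn.Sigmoid(),
        )

    def forward(self, x_enc, uncert_feat):
        h = self.backbone(x_enc)
        sigma = torch.relu(self.sigma_head(h))
        rgb = self.rgb_head(torch.cat([h, uncert_feat], dim=-1))
        return rgb, sigma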

Installation

Set up the environment:

conda create -n ucnerf python=3.9
conda activate ucnerf
pip install torch==1.12.1+cu116 torchvision==0.13.1+cu116 torchaudio==0.12.1 --extra-index-url https://download.pytorch.org/whl/cu116
pip install -r requirements.txt

Our code is tested on Ubuntu 20.04 + CUDA 11.6 + PyTorch 1.12.1.
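As an optional sanity check (not part of the repo), you can confirm that the pinned CUDA build is active:

import torch
print(torch.__version__)          # expected: 1.12.1+cu116
print(torch.cuda.is_available())  # expected: True on a CUDA 11.6 machine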

Dataset Preprocessing

To reproduce the results, please download the preprocessed SCARED and Hamlyn datasets and unzip them.

To test our method on your own data, please follow the preprocessing steps below:

  • Your data should follow the directory structure below:
├── data
│   ├── scene01
│   │   └── images
│   ├── scene02
│   │   └── images
  • Run COLMAP to obtain camera poses and a sparse point cloud:
python preprocess/colmap/img2poses.py <your_scene_folder> 
  • Run a monocular depth model to obtain depth priors. We provide a script that uses DPT, following the paper's implementation:
python preprocess/DPT/run_monodepth.py -i <your_scene_folder> 

You can find the generated monocular depth under <your_scene_folder>/dpt/.
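If you want to verify a scene folder before training, a hypothetical helper like the one below (not included in the repo) can check for the expected inputs. The sparse/ subfolder name assumes the usual COLMAP output layout; dpt/ is the depth folder mentioned above.

import sys
from pathlib import Path

def check_scene(scene: Path) -> None:
    # 'images' holds the input frames; 'sparse' is the usual COLMAP
    # output (an assumption here); 'dpt' holds the generated depth maps.
    for sub in ("images", "sparse", "dpt"):
        status = "ok" if (scene / sub).is_dir() else "MISSING"
        print(f"{scene / sub}: {status}")

if __name__ == "__main__":
    check_scene(Path(sys.argv[1]))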

Evaluation

  • Our pretrained model is included in the pretrained_weights folder. To reproduce the experimental results, run the evaluation script:
sh scripts/eval.sh
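For reference, PSNR is the standard novel-view-synthesis metric; a minimal version for images in [0, 1] looks like the sketch below (the official script computes the paper's full metric suite).

import torch

def psnr(pred: torch.Tensor, gt: torch.Tensor) -> torch.Tensor:
    # Both images in [0, 1]; higher is better.
    mse = torch.mean((pred - gt) ** 2)
    return -10.0 * torch.log10(mse)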

Training

Please see the subsection below for each of the available training datasets:

SCARED

To obtain the dataset and code, please sign the challenge rules and email them to [email protected].

To reproduce our results quickly, we provide training samples from the SCARED dataset; please download them here and unzip.

  • To train the model, run the training script:
sh scripts/train_scared.sh

Hamlyn

  • To train the model on the Hamlyn dataset, run the training script:
sh scripts/train_hamlyn.sh
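For intuition on how uncertainty can shape the optimization, the sketch below shows a common uncertainty-weighted photometric loss (a Gaussian negative log-likelihood) in which high predicted variance down-weights photometrically inconsistent pixels. This is an illustrative stand-in, not the paper's exact loss; all names are hypothetical.

import torch

def uncertainty_weighted_loss(rgb_pred, rgb_gt, var, eps=1e-6):
    # var: predicted per-pixel variance (illustrative); large variance
    # reduces the penalty on inconsistent observations, while the
    # log-variance term discourages trivially large uncertainty.
    var = var.clamp_min(eps)
    return (((rgb_pred - rgb_gt) ** 2) / var + torch.log(var)).mean()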

Citing

If you find our work helpful, please cite:

@article{guo2024uc,
  title={UC-NeRF: Uncertainty-aware Conditional Neural Radiance Fields from Endoscopic Sparse Views},
  author={Guo, Jiaxin and Wang, Jiangliu and Wei, Ruofeng and Kang, Di and Dou, Qi and Liu, Yun-hui},
  journal={IEEE Transactions on Medical Imaging},
  year={2024},
  publisher={IEEE}
}
