NeuroMorph: Unsupervised Shape Interpolation and Correspondence in One Go

Overview

This repository provides our implementation of the CVPR 2021 paper NeuroMorph. Our algorithm produces, in one go (i.e., in a single feed-forward pass), a smooth interpolation and point-to-point correspondences between two input 3D shapes. It is learned in a self-supervised manner from an unlabelled collection of deformable and heterogeneous shapes.

If you use our work, please cite:

@inproceedings{eisenberger2021neuromorph, 
  title={NeuroMorph: Unsupervised Shape Interpolation and Correspondence in One Go}, 
  author={Eisenberger, Marvin and Novotny, David and Kerchenbaum, Gael and Labatut, Patrick and Neverova, Natalia and Cremers, Daniel and Vedaldi, Andrea}, 
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition}, 
  pages={7473--7483}, 
  year={2021}
}

Requirements

The code was tested on Python 3.8.10 with PyTorch 1.9.1 and CUDA 10.2. It also requires the pytorch-geometric library (see its installation instructions) and matplotlib. Finally, MATLAB with the Statistics and Machine Learning Toolbox is used to pre-process certain datasets (we tested MATLAB versions 2019b and 2021b). The code should run on Linux, macOS, and Windows.

Installing NeuroMorph

Using Anaconda, you can install the required dependencies as follows:

conda create -n neuromorph python=3.8
conda activate neuromorph
conda install pytorch cudatoolkit=10.2 -c pytorch
conda install matplotlib
conda install pyg -c pyg -c conda-forge
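
To confirm that the environment is set up correctly, a quick sanity check along these lines can help (this snippet is not part of the repository; it only assumes the packages above installed cleanly):

# Quick environment check: prints the versions of the main Python dependencies.
import torch
import torch_geometric
import matplotlib

print("PyTorch:", torch.__version__, "| CUDA:", torch.version.cuda)
print("PyTorch Geometric:", torch_geometric.__version__)
print("Matplotlib:", matplotlib.__version__)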

Running NeuroMorph

In order to run NeuroMorph:

  • Specify the location of datasets on your device under data_folder_ in param.py (see the illustrative sketch below).
  • To use your own data, create a new dataset in data/data.py.
  • To train on FAUST remeshed, run the main script main_train.py. Modify the script as needed to train on different data.
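
For example, pointing the code at a local copy of FAUST remeshed could look roughly like this in param.py; the variable names below are purely illustrative, so check param.py for the actual names and defaults:

# Illustrative sketch only -- the real variable names are defined in param.py.
data_folder_faust_remeshed = "data/meshes/FAUST_r"  # hypothetical name for the dataset root
data_folder_out = "data/out"  # hypothetical name for the checkpoint/output folder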

For a more detailed tutorial, see the next section.

Reproducing the experiments

We show below how to reproduce the experiments on the FAUST remeshed data.

Data download

You can download the experimental mesh data provided by the authors of Deep Geometric Functional Maps. Download the FAUST_r.zip file from their site, unzip it, and move the contents of the directory to data/meshes/FAUST_r.
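
If you prefer to script this step, a minimal sketch is shown below; it assumes FAUST_r.zip sits in the repository root and contains a top-level FAUST_r folder:

# Unpack FAUST_r.zip into data/meshes/ (assumes the archive holds a top-level FAUST_r folder).
import zipfile
from pathlib import Path

Path("data/meshes").mkdir(parents=True, exist_ok=True)
with zipfile.ZipFile("FAUST_r.zip") as archive:
    archive.extractall("data/meshes")

target = Path("data/meshes/FAUST_r")
if target.is_dir():
    print(f"{len(list(target.glob('*.off')))} .off meshes in {target}")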

Data preprocessing

Before the learning code runs, the meshes must be subsampled and remeshed (for data augmentation during training) and geodesic distance matrices must be computed. For this, we use the data_preprocessing/preprocess_dataset.m MATLAB script (tested with MATLAB R2019b and R2021b).

Start MATLAB and do the following:

cd <NEUROMORPH_ROOT>/data_preprocessing
preprocess_dataset("../data/meshes/FAUST_r/", ".off")

The result should be a list of MATLAB mesh files in a mat subfolder (e.g., data/meshes/FAUST_r/mat), plus additional data.
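
A quick way to confirm that preprocessing produced the expected files (this check is not part of the repository; paths follow the example above):

# Count the preprocessed .mat files produced by preprocess_dataset.m.
from pathlib import Path

mat_dir = Path("data/meshes/FAUST_r/mat")
mat_files = sorted(mat_dir.glob("*.mat"))
print(f"Found {len(mat_files)} .mat files in {mat_dir}")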

Model training

If you stored the data in the directory given above, you can train the model by running:

mkdir -p data/{checkpoint,out}
python main_train.py

The trained models will be saved as a series of checkpoints in data/out/. If you stored the data elsewhere, edit param.py to change the paths.

Model testing

Upon completion, evaluate the trained model with main_test.py, specifying the checkpoint folder name as an argument:

python main_test.py <TIME_STAMP_FAUST>

Here <TIME_STAMP_FAUST> is the name of any of the checkpoint directories saved in data/out/. This automatically saves correspondences and interpolations on the FAUST remeshed test set to data/out/. For reference, on FAUST you should expect a validation error of around 0.25 after 400 epochs.
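
If you are unsure which folder name to pass, a small helper like this (not part of the repository) lists the run directories in data/out/, newest first:

# List checkpoint directories in data/out/ (newest first) to pick a <TIME_STAMP_FAUST>.
from pathlib import Path

runs = sorted(Path("data/out").iterdir(), key=lambda p: p.stat().st_mtime, reverse=True)
for run in runs:
    print(run.name)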

Contributing

See the CONTRIBUTING file for how to help out.

License

NeuroMorph is MIT licensed, as described in the LICENSE file. NeuroMorph includes a few files from other open source projects, as further detailed in the same LICENSE file.
