Dual-Resolution Correspondence Network (NeurIPS 2020)

Overview

This repository contains the implementation of Dual-Resolution Correspondence Networks (DualRC-Net), NeurIPS 2020.

Dependencies

All dependencies are included in asset/dualrcnet.yml. You need to install conda first, and then run

conda env create --file asset/dualrcnet.yml 

To activate the environment, run

conda activate dualrcnet
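
Before training, it can be worth a quick sanity check that the new environment sees your GPU. The snippet below is a generic check, not part of the repo, and only assumes that PyTorch is installed by the environment file (a safe bet for this codebase):

# Hypothetical sanity check (not part of the repo): verify that PyTorch
# inside the dualrcnet environment is importable and can see the GPU.
import torch

print("PyTorch version:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))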

Preparing data

We train our model on the MegaDepth dataset. To prepare the data, download the MegaDepth SfM models from the MegaDepth website and download training_pairs.txt and validation_pairs.txt from this link. Then place both training_pairs.txt and validation_pairs.txt under the downloaded MegaDepth_v1_SfM directory.
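
A small hypothetical helper (not part of the repo) can confirm that the two pair files sit directly under MegaDepth_v1_SfM as described above before you start training:

# Hypothetical check (not part of the repo): confirm the pair files are
# placed under the MegaDepth_v1_SfM directory.
from pathlib import Path

root = Path("path/to/MegaDepth_v1_SfM")  # adjust to your download location
for name in ("training_pairs.txt", "validation_pairs.txt"):
    f = root / name
    print(f, "found" if f.is_file() else "MISSING")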

Training

After downloading the training data, run

python train.py --training_file path/to/training_pairs.txt --validation_file path/to/validation_pairs.txt --image_path path/to/MegaDepth_v1_SfM
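
If you prefer to launch the run from Python, for example to keep a log file next to the checkpoints, a hypothetical wrapper around the exact command above could look like the sketch below; the flag names are the ones shown above and nothing else about train.py is assumed:

# Hypothetical launcher (not part of the repo): build the training command
# from the prepared paths and stream its output to a log file.
import subprocess

cmd = [
    "python", "train.py",
    "--training_file", "path/to/training_pairs.txt",
    "--validation_file", "path/to/validation_pairs.txt",
    "--image_path", "path/to/MegaDepth_v1_SfM",
]
with open("train.log", "w") as log:
    subprocess.run(cmd, stdout=log, stderr=subprocess.STDOUT, check=True)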

Pre-trained model

We also provide our pre-trained model. You can download dualrc-net.pth.tar from this link and place it under the directory trained_models.
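
If you want to peek inside the checkpoint before evaluation, a small sketch like the one below works; the internal structure of dualrc-net.pth.tar is not documented here, so the snippet only lists whatever top-level entries the file actually contains rather than assuming a fixed schema:

# Hypothetical inspection snippet (not part of the repo): load the checkpoint
# on CPU and list its top-level entries without assuming a specific layout.
import torch

ckpt = torch.load("trained_models/dualrc-net.pth.tar", map_location="cpu")
if isinstance(ckpt, dict):
    for key in ckpt:
        print(key, type(ckpt[key]).__name__)
else:
    print(type(ckpt).__name__)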

Evaluation on HPatches

The dataset can be downloaded from the HPatches repo. You need to download the HPatches full sequences.
After downloading the dataset:

  1. Browse to HPatches/
  2. Run python eval_hpatches.py --checkpoint path/to/model --root path/to/parent/directory/of/hpatches_sequences. This will generate a text file with the results in the current directory.
  3. Open draw_graph.py, change the relevant paths accordingly, and run the script to plot the results.

We provide the results of DualRC-Net alongside the results of other methods in the directory cache-top.
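
The script above already produces the numbers used for plotting; the sketch below only illustrates the kind of metric involved. It assumes the standard HPatches layout (each sequence holds images 1 to 6 and plain-text 3x3 homographies H_1_2 … H_1_6, loadable with np.loadtxt) and hypothetical match arrays; it scores matches by the fraction whose reprojection error under the ground-truth homography falls below a pixel threshold:

# Hypothetical illustration (not part of the repo) of a standard HPatches
# matching score: warp keypoints from image 1 with the ground-truth
# homography and count matches that land within a pixel threshold.
import numpy as np

def match_accuracy(kpts1, kpts2, H, threshold=3.0):
    # kpts1, kpts2: (N, 2) arrays of matched (x, y) pixel coordinates
    # H: 3x3 ground-truth homography, e.g. np.loadtxt("sequence/H_1_2")
    pts = np.concatenate([kpts1, np.ones((len(kpts1), 1))], axis=1)  # homogeneous
    warped = pts @ H.T
    warped = warped[:, :2] / warped[:, 2:3]                          # back to pixels
    errors = np.linalg.norm(warped - kpts2, axis=1)
    return float((errors < threshold).mean())

# Synthetic usage example: perfect matches under the identity homography give 1.0
kpts1 = np.random.rand(100, 2) * 640
print(match_accuracy(kpts1, kpts1.copy(), np.eye(3)))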

Evaluation on InLoc

In order to run the InLoc evaluation, you first need to clone the InLoc demo repo, and download and compile all the required dependencies. Then:

  1. Browse to inloc/.
  2. Run python eval_inloc_extract.py adjusting the checkpoint and experiment name. This will generate a series of matches files in the inloc/matches/ directory that then need to be fed to the InLoc evaluation Matlab code.
  3. Modify the provided inloc/eval_inloc_compute_poses.m file to indicate the path of the InLoc demo repo and the name of the experiment (the particular directory name inside inloc/matches/), and run it using Matlab.
  4. Use the inloc/eval_inloc_generate_plot.m file to plot the results from the shortlist file generated in the previous stage: /your_path_to/InLoc_demo_old/experiment_name/shortlist_densePV.mat. Precomputed shortlist files are provided in inloc/shortlist.

Evaluation on Aachen Day-Night

In order to run the Aachen Day-Night evaluation, you first need to clone the Visual Localization benchmark repo, and download and compile all the required dependencies (note that you'll need to compile COLMAP if you have not done so yet). Then:

  1. Browse to aachen_day_and_night/.
  2. Run python eval_aachen_extract.py adjusting the checkpoint and experiment name.
  3. Copy the eval_aachen_reconstruct.py file to visuallocalizationbenchmark/local_feature_evaluation and run it in the following way:
python eval_aachen_reconstruct.py \
	--dataset_path /path_to_aachen/aachen \
	--colmap_path /local/colmap/build/src/exe \
	--method_name experiment_name
  4. Upload the file /path_to_aachen/aachen/Aachen_eval_[experiment_name].txt to https://www.visuallocalization.net/ to get the results on this benchmark.

BibTex

If you use this code, please cite our paper

@inproceedings{li20dualrc,
  author    = {Xinghui Li and Kai Han and Shuda Li and Victor Prisacariu},
  title     = {Dual-Resolution Correspondence Networks},
  booktitle = {Conference on Neural Information Processing Systems (NeurIPS)},
  year      = {2020},
}

Acknowledgement

Our code is based on the wonderful code provided by NCNet, Sparse-NCNet and ANC-Net.
