Meta-Learning Sparse Implicit Neural Representations (NeurIPS 2021)

Overview

Meta-SparseINR

Official PyTorch implementation of "Meta-learning Sparse Implicit Neural Representations" (NeurIPS 2021) by Jaeho Lee*, Jihoon Tack*, Namhoon Lee, and Jinwoo Shin.

TL;DR: We develop a scalable method to learn sparse neural representations for a large set of signals.

Figure: Illustrations of (a) an implicit neural representation, (b) the standard pruning algorithm, which prunes and retrains the model separately for each signal, and (c) the proposed Meta-SparseINR procedure, which finds a sparse initial INR that can be trained further to fit each signal.
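For concreteness, here is a minimal sketch of an implicit neural representation: a SIREN-style MLP that maps 2-D pixel coordinates to RGB values and is fit to a single image by gradient descent. The layer sizes, omega_0, and learning rate are illustrative assumptions, not the repository's exact configuration.

import torch
import torch.nn as nn

class SineLayer(nn.Module):
    # Linear layer followed by a sine activation, as in SIREN.
    def __init__(self, in_features, out_features, omega_0=30.0):
        super().__init__()
        self.omega_0 = omega_0
        self.linear = nn.Linear(in_features, out_features)

    def forward(self, x):
        return torch.sin(self.omega_0 * self.linear(x))

class INR(nn.Module):
    # MLP mapping (x, y) coordinates to RGB values.
    def __init__(self, width=256, depth=4):
        super().__init__()
        layers = [SineLayer(2, width)]
        layers += [SineLayer(width, width) for _ in range(depth - 1)]
        self.net = nn.Sequential(*layers, nn.Linear(width, 3))

    def forward(self, coords):
        return self.net(coords)

# One gradient step on a single signal (random stand-ins for real pixels).
model = INR()
coords = torch.rand(1024, 2) * 2 - 1   # coordinates sampled in [-1, 1]^2
pixels = torch.rand(1024, 3)           # ground-truth RGB values would go here
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
loss = ((model(coords) - pixels) ** 2).mean()
loss.backward()
opt.step()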

1. Requirements

conda create -n inrprune python=3.7
conda activate inrprune

conda install pytorch torchvision cudatoolkit=11.1 -c pytorch -c nvidia

pip install torchmeta
pip install imageio einops tensorboardX
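A quick import check (not part of the repository) can confirm the environment is set up correctly:

# Sanity check for the dependencies installed above.
import torch, torchvision, torchmeta, imageio, einops, tensorboardX
print("torch", torch.__version__, "| CUDA available:", torch.cuda.is_available())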

Datasets

  • Download the Imagenette and SDF datasets from the following page:
  • Place the downloaded datasets in the /data folder (an assumed layout is sketched below).
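The exact directory layout expected by the data loaders is not spelled out here; a plausible structure, which you may need to adjust to match the loader paths, would be:

/data
    celeba/
    imagenette/
    sdf/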

2. Training

Training option

The training scripts take the following option:

  • <DATASET>: {celeba,sdf,imagenette}

Meta-SparseINR (ours)

# Train dense model first
python main.py --exp meta_baseline --epoch 150000 --data <DATASET>

# Iterative pruning (magnitude pruning)
python main.py --exp metaprune --epoch 30000 --pruner MP --amount 0.2 --data <DATASET>
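Under the hood, one magnitude-pruning round removes the smallest-magnitude fraction (--amount 0.2) of the remaining weights and then meta-trains the survivors again. Below is a minimal sketch of such a step using torch.nn.utils.prune; the repository's own pruner may differ in detail.

import torch.nn as nn
import torch.nn.utils.prune as prune

def magnitude_prune_step(model, amount=0.2):
    # Globally remove the 20% of remaining weights with smallest |w|.
    params = [(m, "weight") for m in model.modules() if isinstance(m, nn.Linear)]
    prune.global_unstructured(params, pruning_method=prune.L1Unstructured, amount=amount)

# Each pruning round is followed by another meta-training phase
# (--epoch 30000 above) before the next round of pruning.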

Random Pruning

# Train dense model first
python main.py --exp meta_baseline --epoch 150000 --data <DATASET>

# Iterative pruning (random pruning)
python main.py --exp metaprune --epoch 30000 --pruner RP --amount 0.2 --data <DATASET>
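Random pruning (RP) follows the same schedule but removes weights uniformly at random instead of by magnitude. A sketch of the corresponding step, again assuming torch.nn.utils.prune:

import torch.nn as nn
import torch.nn.utils.prune as prune

def random_prune_step(model, amount=0.2):
    # Remove 20% of the remaining weights in each linear layer at random.
    for m in model.modules():
        if isinstance(m, nn.Linear):
            prune.random_unstructured(m, name="weight", amount=amount)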

Dense-Narrow

# Train dense model with a given width

# Shell script style
widthlist="230 206 184 164 148 132 118 106 94 84 76 68 60 54 48 44 38 34 32 28"
for width in $widthlist
do
    python main.py --exp meta_baseline --epoch 150000 --data <DATASET> --width $width --id width_$width
done
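The width schedule above appears to track the parameter counts of the iteratively pruned models (each round keeps 80% of the weights), so the dense-narrow baselines are roughly size-matched; this reading is an inference, not stated in the repository. A quick check, assuming the same MLP architecture as the INR sketch in the Overview with a dense width of 256:

def mlp_params(width, depth=4, d_in=2, d_out=3):
    # Weights + biases of an MLP with one input layer,
    # depth - 1 hidden layers of equal width, and an output layer.
    n = d_in * width + width                    # input layer
    n += (depth - 1) * (width * width + width)  # hidden layers
    n += width * d_out + d_out                  # output layer
    return n

dense = mlp_params(256)
for k, width in enumerate([230, 206, 184, 164, 148], start=1):
    # Dense-narrow parameter count vs. 0.8^k of the dense model.
    print(width, mlp_params(width), round(dense * 0.8 ** k))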

3. Evaluation

Evaluation option

The options for the evaluation scripts are as follows:

  • <DATASET>: {celeba,sdf,imagenette}
  • <OPT_TYPE>: {default,two_step_sgd}; default denotes the Adam optimizer with 100 steps (a sketch of this fitting loop follows below).

We assume all checkpoints have been trained as described in Section 2.
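For reference, the default evaluation fits the meta-learned (sparse) initialization to each test signal with Adam for 100 steps and reports PSNR. A minimal sketch of that inner loop, with illustrative hyperparameters and placeholder data:

import torch

def fit_and_eval(model, coords, pixels, steps=100, lr=1e-4):
    # Fit one signal starting from the meta-learned initialization,
    # then report PSNR on that signal.
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = ((model(coords) - pixels) ** 2).mean()
        loss.backward()
        opt.step()
    mse = ((model(coords) - pixels) ** 2).mean()
    return -10 * torch.log10(mse)  # PSNR, assuming signals scaled to [0, 1]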

Meta-SparseINR (ours)

python eval.py --exp prune --pruner MP --data <DATASET> --opt_type <OPT_TYPE>

Baselines

# Random pruning
python eval.py --exp prune --pruner RP --data <DATASET> --opt_type <OPT_TYPE>

# Dense-Narrow
python eval.py --exp dense_narrow --data <DATASET> --opt_type <OPT_TYPE>

# MAML + One-Shot
python eval.py --exp one_shot --data <DATASET> --opt_type default

# MAML + IMP
python eval.py --exp imp --data <DATASET> --opt_type default

# Scratch
python eval.py --exp scratch --data <DATASET> --opt_type <OPT_TYPE>

4. Experimental Results

Experimental results on CelebA, Imagenette, and SDF are reported in the paper.

Citation

@inproceedings{lee2021meta,
  title={Meta-learning Sparse Implicit Neural Representations},
  author={Jaeho Lee and Jihoon Tack and Namhoon Lee and Jinwoo Shin},
  booktitle={Advances in Neural Information Processing Systems},
  year={2021}
}
