Surrogate- and Invariance-Boosted Contrastive Learning (SIB-CL)


This repository contains all source code used to generate the results in the article "Surrogate- and invariance-boosted contrastive learning for data-scarce applications in science" (URL: to-be-updated).

  • The folder generate_datasets contains all numerical programs used to generate the datasets, for both Photonic Crystals (PhC) and the Time-independent Schrödinger Equation (TISE).
  • main.py is the main code used to train the neural networks (explained in detail below).

Dependencies

Please install the required Python packages: pip install -r requirements.txt

A Python 3 environment can be created beforehand: conda create -n sibcl python=3.8; conda activate sibcl

Access to MATLAB is required to calculate the density-of-states (DOS) of the PhCs.

Dataset Generation

Photonic Crystals (PhCs)

The relevant code is stored in generate_datasets/PhC/. Periodic unit cells are defined using a level set of a Fourier sum; different unit cells can be generated using the get_random() method of the FourierPhC class defined in fourier_phc.py.
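
For orientation, usage might look roughly like the sketch below; the constructor arguments are assumptions (named after the CLI flags further down), so check fourier_phc.py for the actual signature:

```python
# Hypothetical usage sketch of FourierPhC -- the constructor arguments are
# assumed, not taken from fourier_phc.py; only get_random() is named in this README.
from fourier_phc import FourierPhC

phc = FourierPhC(maxF=1, seed=1)  # assumed args, mirroring the --maxF/--seed flags
unitcell = phc.get_random()       # sample one random periodic unit cell
```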

To generate the labeled PhC datasets, we first compute their band structures using MPB:

  • for the target dataset of random Fourier unit cells: python phc_gendata.py --h5filename="mf1-s1" --pol="tm" --nsam=5000 --maxF=1 --seed=1
  • for the source dataset of simple cylinders: python phc_gencylin.py --h5filename="cylin" --pol="tm" --nsam=10000

Each program creates a dataset with the eigenfrequencies, group velocities, etc., stored in a .h5 file (which can be accessed using the h5py package). We then calculate the DOS using the generalized Gilat-Raubenheimer (GGR) method via the MATLAB code at https://github.com/boyuanliuoptics/DOS-calculation/blob/master/DOS_GGR.m. To do so, we first parse the data to create the .txt files required as inputs to the program, compute the DOS using MATLAB, and then add the DOS labels back to the original .h5 files. These steps are executed automatically by running the shell script get_DOS.sh after modifying the h5 filename identifier defined at the top of the script. Note that for this to run smoothly, both python and MATLAB need to be added to PATH.
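
As a quick sanity check, a generated file can be inspected with h5py; the filename below is an assumption following the --h5filename flag above, and the internal layout is best read off the file itself:

```python
# Minimal sketch: list everything stored in a generated PhC dataset.
# The filename is an assumption; adjust to whatever --h5filename produced.
import h5py

with h5py.File("mf1-s1.h5", "r") as f:
    f.visit(print)  # prints every group/dataset path in the file
```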

Time-independent Schrödinger Equation (TISE)

The relevant code is stored in generate_datasets/TISE/. Example usage:

  • to generate the target dataset, e.g. in 3D: python tise_gendata.py --h5filename="tise3d" --ndim 3 --nsam 5000
  • to generate the low-resolution dataset: python tise_gendata.py --h5filename='tise3d_lr' --ndim 3 --nsam 10000 --lowres --orires=32 (--orires defines the resolution of the input to the neural network)
  • to generate the QHO (quantum harmonic oscillator) dataset: python tise_genqho.py --h5filename='tise2d_qho' --ndim 2 --nsam 10000
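
For intuition, the low-resolution surrogate amounts to coarse-graining a high-resolution potential grid, along the lines of this illustrative sketch (the actual logic lives in tise_gendata.py):

```python
# Illustrative only: average-pool a high-resolution 3D potential down by an
# integer factor, mimicking how a low-resolution surrogate sample might arise.
import numpy as np

def downsample(potential: np.ndarray, factor: int) -> np.ndarray:
    n = potential.shape[0] // factor
    v = potential[:n * factor, :n * factor, :n * factor]  # trim to a multiple of factor
    return v.reshape(n, factor, n, factor, n, factor).mean(axis=(1, 3, 5))

coarse = downsample(np.random.rand(32, 32, 32), 2)  # 32^3 grid -> 16^3 grid
```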

SIB-CL and baselines training

Training of the neural networks for all problems introduced in the article (i.e. PhC DOS prediction, PhC band structure prediction, and TISE ground-state-energy prediction using either low-resolution or QHO data as the surrogate) can be executed using main.py with the appropriate flags (see below). The same code selects between the SIB-CL framework and any of the baselines, again via the appropriate flag, and also supports prediction problems not presented in the article, such as predicting higher energy states of the TISE, TISE wavefunctions, and single band structures.

Important flags:

--path_to_h5: directory where the .h5 datasets are located. The h5 filenames defined in the dataset classes in datasets_PhC_SE.py should also be modified to match the names used during dataset generation.

--predict: defines the prediction task. Options: 'DOS', 'bandstructures', 'eigval', 'oneband', 'eigvec'.

--train: specifies whether to train via SIB-CL or a baseline. Options: 'sibcl', 'tl', 'sl', 'ssl' ('ssl' performs regular contrastive learning without the surrogate dataset). For invariance-boosted baselines, e.g. TL-I or SL-I, specify 'tl' or 'sl' here and add the relevant invariance flags (see below).

--iden: required; identifier used when saving models, training logs, and results.

Invariance flags:

  • --translate_pbc: include rolling (periodic) translations
  • --pg_uniform: uniformly sample the point-group symmetry transformations
  • --scale: scale the unit cell (used for PhC)
  • --rotate: apply 4-fold rotations
  • --flip: apply horizontal and vertical mirrors

If --pg_uniform is used, there is no need to also include --rotate and --flip.

Other optional flags can be displayed via python main.py --help. Example shell scripts can be found in the sh_scripts folder.
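
For instance, an SIB-CL run on the PhC DOS task, composed from the flags documented above, might look like this (the dataset path and identifier are placeholders): python main.py --path_to_h5="./datasets" --predict='DOS' --train='sibcl' --translate_pbc --pg_uniform --scale --iden='sibcl_dos_seed1'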

Training outputs:

By default, running main.py will create 3 subdirectories:

  • ./pretrained_models/: state dictionaries of the pretrained models, saved at the epochs listed in the eplist variable. These models are used for further fine-tuning.
  • ./dicts/: evaluation losses on the test set, stored as dictionaries saved in .json files. The results can then be plotted using plot_results.py.
  • ./tlogs/: training curves for pre-training and fine-tuning, stored as dictionaries saved in .json files. The training curves can be plotted using get_training_logs.py. Alternatively, set the --log_to_tensorboard flag to view the training curves in TensorBoard; in this case, the .json dictionaries are not generated.
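
A saved results dictionary can then be inspected like so (the filename is a placeholder tied to the --iden flag, and the key names depend on the run):

```python
# Hedged sketch: load an evaluation-loss dictionary from ./dicts/.
# The filename pattern is an assumption based on the --iden flag.
import json

with open("./dicts/sibcl_dos_seed1.json") as f:
    results = json.load(f)

print(sorted(results))  # see which entries (e.g. fine-tuning set sizes) were logged
```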