Anomaly Toolbox

Description

Anomaly Toolbox Powered by GANs.

This is the accompanying toolbox for the paper "A Survey on GANs for Anomaly Detection" (https://arxiv.org/pdf/1906.11632.pdf).

The toolbox is meant to be used to explore the performance of different GAN-based architectures (referred to in our work as "experiments"). It also already provides some datasets to perform experiments on:

  • MNIST
  • Corrupted MNIST
  • Surface Cracks
  • MVTec AD

We provide the MNIST dataset because the original works use it extensively. We have also added the other datasets listed above, both because some of them are used by a particular architecture and because they provide a good benchmark for the models we have implemented.

All the architectures were tested on commonly used datasets such as MNIST, FashionMNIST, CIFAR-10, and KDD99. Some of them were also tested on more specific datasets, such as an X-ray dataset that we could not provide because the data could not be obtained (privacy reasons).

The user can create their own dataset and use it to test the models.

Quick Start

  • First things first, install the toolbox:
pip install anomaly-toolbox

Then you can choose what experiment to run. For example:

  • Run the GANomaly experiment (i.e., the GANomaly architecture) with hyperparameter tuning enabled, the pre-defined hyperparameters file hparams.json, and the MNIST dataset (see the note on the --tuning flag after this list):
anomaly-box.py --experiment GANomalyExperiment --hps-path path/to/config/hparams.json --dataset MNIST
  • Otherwise, you can run all the experiments using the pre-defined hyperparameters file hparams.json and the MNIST dataset:
anomaly-box.py --run-all --hps-path path/to/config/hparams.json --dataset MNIST
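
Hyperparameter tuning is controlled by the --tuning option documented in the Usage section below, and it defaults to False. Assuming the flag accepts a boolean value as described there, tuning can be enabled explicitly, for example:

anomaly-box.py --experiment GANomalyExperiment --tuning True --hps-path path/to/config/hparams.json --dataset MNIST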

For any other information, feel free to check the help:

anomaly-box.py --help

Contribution

This work is completely open source, and we would appreciate any contribution to the code. Any merge request to enhance, correct or expand the work is welcome.

Notes

The structures of the models inside the toolbox come from their respective papers, and we have tried to respect them as much as possible. However, due to implementation issues, we sometimes had to make minor changes. For this reason, in some cases, details such as the number of layers, the size of kernels, or other such settings may differ from the originals.

However, you don't have to worry: the heart and purpose of the architectures remain intact.

Installation

pip install anomaly-toolbox

Usage

Options:
  --experiment [AnoGANExperiment|DeScarGANExperiment|EGBADExperiment|GANomalyExperiment]
                                  Experiment to run.
  --hps-path PATH                 When running an experiment, the path of the
                                  JSON file where all the hyperparameters are
                                  located.  [required]
  --tuning BOOLEAN                If you want to use hyperparameters tuning,
                                  use 'True' here. Default is False.
  --dataset TEXT                  The dataset to use. Can be a ready to use
                                  dataset, or a .py file that implements the
                                  AnomalyDetectionDataset interface
                                  [required]
  --run-all BOOLEAN               Run all the available experiments
  --help                          Show this message and exit.

Datasets and Custom Datasets

The provided datasets are:

  • MNIST
  • Corrupted MNIST
  • Surface Cracks
  • MVTec AD

and they are automatically downloaded when the user makes the corresponding choice: ["MNIST", "CorruptedMNIST", "SurfaceCracks", "MVTecAD"].

The user can also add their own specific dataset. To do this, the new dataset should inherit from the AnomalyDetectionDataset abstract class and implement its own configure method. For a more detailed guide, the user can refer to the README.md file inside the src/anomaly_toolbox/datasets folder. Moreover, in the examples folder, the user can find a dummy.py module with the basic skeleton code to implement a dataset. A minimal sketch is shown below.
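
The following is a minimal sketch of what such a module could look like. The import path, the configure parameters (batch_size, new_size, anomalous_label, shuffle_buffer_size), and the attribute names used to expose the tf.data pipelines are assumptions made for illustration only; the README.md and dummy.py files mentioned above document the actual interface.

# my_dataset.py - hypothetical custom dataset for the anomaly toolbox.
# The import path, the configure() signature, and the attribute names below
# are assumptions; check src/anomaly_toolbox/datasets/README.md and
# examples/dummy.py for the exact interface.
from typing import Tuple

import tensorflow as tf

from anomaly_toolbox.datasets.dataset import AnomalyDetectionDataset  # assumed path


class MyDataset(AnomalyDetectionDataset):
    """Toy dataset: random images where label 1 marks the anomalous class."""

    def configure(
        self,
        batch_size: int,
        new_size: Tuple[int, int] = (32, 32),
        anomalous_label: int = 1,
        shuffle_buffer_size: int = 10000,
    ) -> None:
        # In a real dataset, read and decode your own files here.
        images = tf.random.uniform((200, *new_size, 3))
        labels = tf.concat(
            [tf.zeros(100, dtype=tf.int32), tf.ones(100, dtype=tf.int32)], axis=0
        )
        dataset = tf.data.Dataset.from_tensor_slices((images, labels)).shuffle(
            shuffle_buffer_size
        )

        # Keep only normal samples for training, everything for testing
        # (attribute names are illustrative, not the toolbox's real ones).
        self._train_normal = dataset.filter(
            lambda _, label: tf.not_equal(label, anomalous_label)
        ).batch(batch_size)
        self._test = dataset.batch(batch_size)

Once saved as, e.g., my_dataset.py, the module can be passed to the CLI through the --dataset option described above, for example:

anomaly-box.py --experiment GANomalyExperiment --hps-path path/to/config/hparams.json --dataset path/to/my_dataset.py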

