SIR model parameter estimation using a novel algorithm for differentiated uniformization.

Overview

TenSIR

Parameter estimation on epidemic data under the SIR model using a novel algorithm for differentiated uniformization of Markov transition rate matrices in tensor representation.

This repository contains the code for the accompanying paper.

Data

We used data from the Austrian BMSGPK on the COVID-19 pandemic in Austria from March 2020 to August 2020. A CSV file containing the data we used can be found here in case the API changes in the future.

Results

Kernel density estimation plot of points generated by Hamiltonian Monte Carlo simulation

[Figure: HMC plot]

The x marks the least-squares estimate after a grid search using the default deterministic model (a system of ODEs).

Numbers of people susceptible to and infected with COVID-19 in Austria during the early months of the pandemic

[Figure: Timeline plot]

Reproducing the results

Prerequisites

  • Python 3.7+ with Pip (tested with Python 3.9 and 3.10)

Setup

We advise using a virtual environment to run the code. After you have activated it, change to the source directory and run

pip install -r requirements.txt
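
For reference, a complete setup session on Linux or macOS might look like this (a minimal sketch assuming python3 points to Python 3.7 or newer; adapt the activation command to your shell or OS):

python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt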

Generating points

To reproduce our results exactly, use the generate-points.py script as

python generate-points.py <month> <run>

where <month> is a number from 3 (March 2020) to 8 (August 2020) and <run> is the number of an independent HMC run. The random number generator is seeded uniquely for each run via seed = month * 1000 + run; for example, month 4, run 7 uses seed 4007. For the HMC simulation, we did 10 runs (numbered 0 to 9) for each month (3 to 8), resulting in 60 runs in total.

Note that the script assumes 48 CPU threads. This can be changed in the script, though diminishing returns are expected for thread counts greater than 60. More runs can of course be computed independently in parallel, as sketched below.
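
For instance, all 60 runs can be reproduced sequentially with a simple loop (a sketch for a POSIX-style shell with seq available; in practice you would distribute these invocations across machines or jobs):

for month in $(seq 3 8); do
  for run in $(seq 0 9); do
    python generate-points.py "$month" "$run"
  done
done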

The parameters for all simulations were as follows (as seen in generate-points.py):

  • Initial parameter Theta0 = (0.1, 0.1) (*)
  • Covariance matrix M = diag(2)
  • "Learning rate" epsilon = 0.05
  • Leapfrog count L = 5 per generated point
  • Simulation until 100 points are accepted for each run
  • Discard the first 10% of accepted points as "burn-in" before plotting

(*) In our framework we use the convention Theta = (alpha, beta) and theta = (log(alpha), log(beta)) where alpha, beta are the parameters of the SIR model.
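
To make the convention concrete, here is a minimal sketch of the mapping between the two scales (plain NumPy; the variable names are ours for illustration and not necessarily those used in generate-points.py):

import numpy as np

# Natural scale: Theta = (alpha, beta), the SIR model parameters
Theta0 = np.array([0.1, 0.1])

# Log scale used by HMC: theta = (log(alpha), log(beta))
theta0 = np.log(Theta0)

# Map a sampled point back to the natural scale
Theta = np.exp(theta0)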

Leveraging HPC clusters

Especially for the months March, April, and August, the simulation can take quite some time. If you have access to a compute cluster that uses Slurm, the provided slurm-job-template.sh can be used. Note that the virtual environment must be set up on the target architecture.
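
As an example, all 60 runs could be submitted with a loop like the following (a hypothetical sketch: it assumes slurm-job-template.sh forwards its two arguments to generate-points.py, so check and adapt the template before use):

for month in $(seq 3 8); do
  for run in $(seq 0 9); do
    sbatch slurm-job-template.sh "$month" "$run"
  done
done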

Owner
The Spang Lab
Statistical Bioinformatics Department, University of Regensburg, Germany