Deep Survival Machines - Fully Parametric Survival Regression

Overview


Package: dsm

The Python package dsm provides an API to train Deep Survival Machines and associated models for problems in survival analysis. The underlying models are implemented in PyTorch.

For full documentation of the module, please see https://autonlab.github.io/DeepSurvivalMachines/

What is Survival Analysis?

Survival analysis involves estimating when an event of interest, T, will take place given some features or covariates X. In statistics and ML, these scenarios are modelled as regression problems that estimate the conditional survival distribution P(T > t | X).
Compared to typical regression problems, survival analysis differs in two major ways:

  • The event time T has positive support, i.e. T ∈ [0, ∞).
  • Censoring is present, i.e. a large number of instances are lost to follow-up.

Deep Survival Machines

Deep Survival Machines (DSM) is a fully parametric approach to modelling time-to-event outcomes in the presence of censoring, first introduced in [1]. In healthcare ML and biostatistics this problem is known as 'survival analysis'. The key idea behind Deep Survival Machines is to model the underlying event outcome distribution as a mixture of a fixed number (K) of parametric distributions. The parameters of these mixture components, as well as the mixing weights, are modelled using neural networks, as sketched below.
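
The mixture can be written schematically as follows (our notation, not part of the package API; π_k denotes the network-predicted mixing weights and β_k, η_k the shape and scale parameters of the k-th primitive distribution):

S(t | X) = Σ_{k=1..K} π_k(X) · S_k(t | β_k(X), η_k(X))

where each S_k is the survival function of a primitive distribution such as a Weibull or Log-Normal.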

Usage Example

from dsm import DeepSurvivalMachines
# x: covariates, t: event or censoring times, e: event indicators (1 = event, 0 = censored),
# times: horizons at which to evaluate the predicted risk
model = DeepSurvivalMachines()
model.fit(x, t, e)
out_risk = model.predict_risk(x, times)
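
A slightly fuller sketch, assuming the bundled dsm.datasets.load_dataset helper and illustrative hyperparameter values (the 'SUPPORT' loader argument, the chosen settings and the evaluation horizons are assumptions for demonstration, not a fixed API contract):

import numpy as np
from dsm import DeepSurvivalMachines, datasets

# Covariates x, event/censoring times t, event indicators e (1 = event, 0 = censored).
x, t, e = datasets.load_dataset('SUPPORT')

# Simple train/test split.
idx = np.random.permutation(len(x))
tr, te = idx[:int(0.8 * len(x))], idx[int(0.8 * len(x)):]

# K Weibull primitives whose parameters and mixing weights are produced by an MLP.
model = DeepSurvivalMachines(k=3, distribution='Weibull', layers=[100])
model.fit(x[tr], t[tr], e[tr], iters=100, learning_rate=1e-3)

# Risk and survival probability at the 25th, 50th and 75th percentile event times.
horizons = np.quantile(t[e == 1], [0.25, 0.5, 0.75]).tolist()
risk = model.predict_risk(x[te], horizons)
surv = model.predict_survival(x[te], horizons)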

Recurrent Deep Survival Machines

Recurrent Deep Survival Machines (RDSM) builds on the original DSM model and allows the representations of the input covariates to be learnt with recurrent neural networks such as LSTMs or GRUs. RDSM is a natural fit for problems with time-dependent covariates, as in the sketch below.
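
A minimal sketch, assuming the DeepRecurrentSurvivalMachines class exposed by the package; the variable names and the shape of the longitudinal input are illustrative:

from dsm import DeepRecurrentSurvivalMachines

# x_seq: padded array of longitudinal covariates, shape (n_samples, max_seq_length, n_features);
# t, e: event/censoring times and event indicators aligned with the sequences.
model = DeepRecurrentSurvivalMachines()
model.fit(x_seq, t, e)
risk = model.predict_risk(x_seq, times)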

Deep Convolutional Survival Machines

Predictive maintenance and medical imaging sometimes require working with image streams. Deep Convolutional Survival Machines extends DSM and RDSM to learn representations of the input image data using convolutional layers. When working with streaming data, the learnt representations are then passed through an LSTM to model temporal dependencies before the underlying survival distributions are determined.

⚠️ Not Implemented Yet!

Deep Cox Mixtures

Deep Cox Mixtures assumes that the survival function of each individual is a mixture of K Cox models. Conditioned on each subgroup Z = k, the proportional hazards (PH) assumption is taken to hold, and the baseline hazard rate is estimated non-parametrically with a spline-interpolated Breslow estimator. For full details on Deep Cox Mixtures, refer to the paper:

Deep Cox Mixtures for Survival Regression. Machine Learning for Healthcare Conference (2021)
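
Schematically (our notation, not the paper's exact symbols; P(Z = k | X) is the gating network's mixture weight and f_k the log partial hazard of the k-th Cox model):

S(t | X) = Σ_{k=1..K} P(Z = k | X) · S_0k(t)^exp(f_k(X))

where S_0k is the baseline survival function of subgroup k, recovered from the spline-interpolated Breslow estimate of the baseline hazard.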

Installation

$ git clone https://github.com/autonlab/DeepSurvivalMachines.git
$ cd DeepSurvivalMachines
$ pip install -r requirements.txt
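
To also compute the standard evaluation metrics mentioned under Compatibility below, scikit-survival can be installed separately (an optional step that is not needed for training):

$ pip install scikit-survival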

Examples

  1. Deep Survival Machines on the SUPPORT Dataset
  2. Recurrent Deep Survival Machines on the PBC Dataset

References

Please cite the following papers if you are using the dsm package.

[1] Deep Survival Machines: Fully Parametric Survival Regression and Representation Learning for Censored Data with Competing Risks. IEEE Journal of Biomedical & Health Informatics (2021)

@article{nagpal2021deep,
  title={Deep Survival Machines: Fully Parametric Survival Regression and Representation Learning for Censored Data with Competing Risks},
  author={Nagpal, Chirag and Li, Xinyu and Dubrawski, Artur},
  journal={IEEE Journal of Biomedical and Health Informatics},
  year={2021}
}

[2] Deep Parametric Time-to-Event Regression with Time-Varying Covariates. AAAI Spring Symposium (2021)

@InProceedings{pmlr-v146-nagpal21a,
  title={Deep Parametric Time-to-Event Regression with Time-Varying Covariates},
  author={Nagpal, Chirag and Jeanselme, Vincent and Dubrawski, Artur},
  booktitle={Proceedings of AAAI Spring Symposium on Survival Prediction - Algorithms, Challenges, and Applications 2021},
  series={Proceedings of Machine Learning Research},
  publisher={PMLR},
  year={2021}
}

[3] Deep Cox Mixtures for Survival Regression. Machine Learning for Healthcare (2021)

@InProceedings{nagpal2021dcm,
  title={Deep Cox Mixtures for Survival Regression},
  author={Nagpal, Chirag and Yadlowsky, Steve and Rostamzadeh, Negar and Heller, Katherine},
  booktitle={Proceedings of the 6th Machine Learning for Healthcare Conference},
  pages={674--708},
  year={2021},
  volume={149},
  series={Proceedings of Machine Learning Research},
  publisher={PMLR},
}

Compatibility

dsm requires Python 3.5+ and PyTorch 1.1+.

To evaluate performance using standard metrics, dsm requires scikit-survival.

Contributing

dsm is on GitHub. Bug reports and pull requests are welcome.

License

MIT License

Copyright (c) 2020 Carnegie Mellon University, Auton Lab

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
