An interactive DNN model deployed on the web that predicts a patient's chance of heart failure, with an accuracy of about 98%.

Overview

Heart Failure Predictor

About

A dense neural network model built with TensorFlow and deployed behind a web UI. It predicts whether a patient is healthy or at risk of heart disease and reports the associated probability.

Dataset

The dataset used is the Heart Failure Prediction Dataset from Kaggle.

  • Cardiovascular diseases (CVDs) are the number 1 cause of death globally, taking an estimated 17.9 million lives each year, which accounts for 31% of all deaths worldwide. Four out of five CVD deaths are due to heart attacks and strokes, and one-third of these deaths occur prematurely in people under 70 years of age. Heart failure is a common event caused by CVDs, and this dataset contains 11 features that can be used to predict a possible heart disease.
  • People with cardiovascular disease or who are at high cardiovascular risk (due to the presence of one or more risk factors such as hypertension, diabetes, hyperlipidaemia or already established disease) need early detection and management, where a machine learning model can be of great help.
  • This dataset was created by combining different datasets that were already available independently but had not been combined before. Five heart datasets are combined over 11 common features, which makes it the largest heart disease dataset available so far for research purposes.

UI Demonstration

This is an interactive website made with the Python library Streamlit that serves the neural network model. You can view the dataset (scrollable and expandable) and several plots that give good insights into the data. For prediction, the user enters details about the patient being tested into a form: values such as age, blood pressure, and maximum heart rate are filled in with numerical inputs and sliders, while categorical options are chosen from select boxes. Click the Submit button and then the Predict button to see whether the patient is likely to have heart disease and the probability of it.

ui_demonstration.mp4

To run the UI, open the project directory in a terminal and run:

streamlit run interface.py
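A minimal sketch of how such a prediction form can be wired up in Streamlit is shown below. It is illustrative only: the widget labels follow the dataset attributes listed in the next section, the model path "saved_model" matches the folder described later in this README, and the Submit/Predict steps are collapsed into a single button for brevity; the real interface.py may differ in all of these details.

import numpy as np
import streamlit as st
import tensorflow as tf

# Load the exported model once (assumed to live in the saved_model folder of this repo).
model = tf.keras.models.load_model("saved_model")

st.title("Heart Failure Predictor")

with st.form("patient_details"):
    age        = st.slider("Age (years)", 20, 100, 50)
    sex        = st.selectbox("Sex", ["M", "F"])
    chest_pain = st.selectbox("ChestPainType", ["TA", "ATA", "NAP", "ASY"])
    resting_bp = st.number_input("RestingBP (mm Hg)", 80, 220, 120)
    chol       = st.number_input("Cholesterol (mg/dl)", 0, 700, 200)
    fasting_bs = st.selectbox("FastingBS (> 120 mg/dl)", [0, 1])
    rest_ecg   = st.selectbox("RestingECG", ["Normal", "ST", "LVH"])
    max_hr     = st.slider("MaxHR", 60, 202, 140)
    ex_angina  = st.selectbox("ExerciseAngina", ["Y", "N"])
    oldpeak    = st.number_input("Oldpeak", -3.0, 7.0, 0.0)
    st_slope   = st.selectbox("ST_Slope", ["Up", "Flat", "Down"])
    submitted  = st.form_submit_button("Predict")

if submitted:
    # Build a single-row feature dict matching the attribute names below.
    sample = {
        "Age": np.array([age]), "Sex": np.array([sex]),
        "ChestPainType": np.array([chest_pain]), "RestingBP": np.array([resting_bp]),
        "Cholesterol": np.array([chol]), "FastingBS": np.array([fasting_bs]),
        "RestingECG": np.array([rest_ecg]), "MaxHR": np.array([max_hr]),
        "ExerciseAngina": np.array([ex_angina]), "Oldpeak": np.array([oldpeak]),
        "ST_Slope": np.array([st_slope]),
    }
    prob = float(model.predict(sample)[0][0])   # sigmoid output = P(heart disease)
    st.write(f"Estimated probability of heart disease: {prob:.1%}")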

Attribute Information
  • Age: age of the patient (years)
  • Sex: sex of the patient (M: Male, F: Female)
  • ChestPainType: chest pain type (TA: Typical Angina, ATA: Atypical Angina, NAP: Non-Anginal Pain, ASY: Asymptomatic)
  • RestingBP: resting blood pressure (mm Hg)
  • Cholesterol: serum cholesterol (mg/dl)
  • FastingBS: fasting blood sugar (1: if FastingBS > 120 mg/dl, 0: otherwise)
  • RestingECG: resting electrocardiogram results (Normal: Normal, ST: having ST-T wave abnormality (T wave inversions and/or ST elevation or depression of > 0.05 mV), LVH: showing probable or definite left ventricular hypertrophy by Estes' criteria)
  • MaxHR: maximum heart rate achieved (Numeric value between 60 and 202)
  • ExerciseAngina: exercise-induced angina (Y: Yes, N: No)
  • Oldpeak: ST depression induced by exercise relative to rest (numeric value)
  • ST_Slope: the slope of the peak exercise ST segment (Up: upsloping, Flat: flat, Down: downsloping)
  • HeartDisease: output class (1: heart disease, 0: Normal)
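For reference, a quick way to load and inspect these attributes with pandas is sketched below; the file name heart.csv is the name of the Kaggle download and is an assumption here.

import pandas as pd

# Load the Kaggle CSV (file name assumed to be heart.csv as distributed on Kaggle).
df = pd.read_csv("heart.csv")

print(df.shape)        # number of patients x 12 columns (11 features + HeartDisease)
print(df.dtypes)       # numeric vs. categorical attributes listed above
print(df["HeartDisease"].value_counts(normalize=True))   # class balance of the target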

DNN Model (Keras)

The model used is shown in the code block below:

import tensorflow as tf
from tensorflow.keras import layers

model = tf.keras.Sequential([
    # DenseFeatures turns the dict of feature columns into a single dense input tensor.
    layers.DenseFeatures(feature_cols.values()),
    layers.BatchNormalization(),
    layers.Dense(256, activation='relu', kernel_regularizer='l2'),
    layers.BatchNormalization(),
    layers.Dropout(0.4),
    layers.Dense(256, activation='relu', kernel_regularizer='l2'),
    layers.BatchNormalization(),
    layers.Dropout(0.4),
    # Single sigmoid unit: probability of heart disease.
    layers.Dense(1, activation='sigmoid')
])

model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.001),
              loss='binary_crossentropy',
              metrics=['accuracy', tf.keras.metrics.AUC()])

The model is quite dense while the dataset is small, so several regularization methods are used to avoid overfitting (a training sketch using the early-stopping callback follows the list):

  • Batch Normalization
  • Dropout Layers
  • L2 Regularization
  • Early Stopping Callback
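A minimal sketch of how the early-stopping callback might be attached to training is shown below; train_ds and val_ds are assumed to be the tf.data.Dataset splits described in the next paragraph, and the patience and epoch count are illustrative rather than the exact values used in pred_model.ipynb.

# Stop training once validation loss stops improving and keep the best weights.
early_stop = tf.keras.callbacks.EarlyStopping(
    monitor="val_loss", patience=10, restore_best_weights=True)

history = model.fit(
    train_ds,                 # assumed tf.data.Dataset of (features, label) batches
    validation_data=val_ds,   # assumed held-out validation split
    epochs=200,
    callbacks=[early_stop])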

Feature columns are used, and the data is converted into tf.data.Dataset objects for faster processing. The Age feature is bucketized, all other numerical features are passed as numeric feature columns, and categorical features are passed as categorical feature columns, as sketched below.
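The feature-column setup described above could look roughly like the following; the bucket boundaries, vocabulary lists, and batch size are assumptions for illustration, not the exact values from pred_model.ipynb.

# Build feature columns: bucketized Age, numeric columns for the other numerical
# attributes, and indicator columns for the categorical attributes.
feature_cols = {}

age = tf.feature_column.numeric_column("Age")
feature_cols["Age"] = tf.feature_column.bucketized_column(
    age, boundaries=[30, 40, 50, 60, 70])          # illustrative bucket edges

for name in ["RestingBP", "Cholesterol", "FastingBS", "MaxHR", "Oldpeak"]:
    feature_cols[name] = tf.feature_column.numeric_column(name)

categorical = {
    "Sex": ["M", "F"],
    "ChestPainType": ["TA", "ATA", "NAP", "ASY"],
    "RestingECG": ["Normal", "ST", "LVH"],
    "ExerciseAngina": ["Y", "N"],
    "ST_Slope": ["Up", "Flat", "Down"],
}
for name, vocab in categorical.items():
    cat = tf.feature_column.categorical_column_with_vocabulary_list(name, vocab)
    feature_cols[name] = tf.feature_column.indicator_column(cat)

def df_to_dataset(dataframe, shuffle=True, batch_size=32):
    """Convert a pandas DataFrame into a batched tf.data.Dataset of (features, label)."""
    df = dataframe.copy()
    labels = df.pop("HeartDisease")
    ds = tf.data.Dataset.from_tensor_slices((dict(df), labels))
    if shuffle:
        ds = ds.shuffle(buffer_size=len(df))
    return ds.batch(batch_size)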

The model reaches an accuracy of approximately 98% on the test dataset and an AUC (area under the ROC curve) of 1.00. Model training is visualized in TensorBoard, as sketched below.
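The TensorBoard visualization mentioned above can be reproduced with the standard logging callback along these lines; the log directory name is an assumption.

# Log training curves so they can be viewed in TensorBoard (log dir name assumed).
tensorboard_cb = tf.keras.callbacks.TensorBoard(log_dir="logs/heart_dnn")

# Pass it alongside the early-stopping callback via model.fit(callbacks=[...]),
# then inspect the run from a terminal with:
#   tensorboard --logdir logs/heart_dnn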

About files in repo

  • pred_model.ipynb: Jupyter notebook with the code used to build the DNN and the exploratory data analysis done with pandas, matplotlib and seaborn
  • interface.py: runs the website for the interactive UI
  • model_py.py: the DNN model code in .py format
  • saved_model folder: contains the DNN model saved in .pb format, which can be imported into any Python file
Owner
Adit Ahmedabadi
ML and DL enthusiast | Pursuing a B.Tech degree in Electrical Engineering at Sardar Patel College of Engineering, Mumbai.