livelossplot

Live training loss plot in Jupyter Notebook for Keras, PyTorch and other frameworks


Don't train deep learning models blindfolded! Be impatient and look at each epoch of your training!

(RECENT CHANGES, EXAMPLES IN COLAB, API LOOKUP, CODE)

A live training loss plot in Jupyter Notebook for Keras, PyTorch and other frameworks. An open-source Python package by Piotr Migdał, Bartłomiej Olechno and others. Open for collaboration! (Some tasks are as simple as writing code docstrings, so - no excuses! :))

from livelossplot import PlotLossesKeras

model.fit(X_train, Y_train,
          epochs=10,
          validation_data=(X_test, Y_test),
          callbacks=[PlotLossesKeras()],
          verbose=0)
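
For completeness, here is a minimal sketch of the setup the snippet above assumes: a tiny standalone Keras classifier trained on placeholder NumPy arrays. The data and architecture are illustrative only, not part of livelossplot (and with tensorflow.keras you would use PlotLossesKerasTF, listed below, instead):

import numpy as np
import keras
from livelossplot import PlotLossesKeras

# Placeholder data: 1000 train / 200 test samples, 20 features, binary labels
X_train, Y_train = np.random.rand(1000, 20), np.random.randint(0, 2, 1000)
X_test, Y_test = np.random.rand(200, 20), np.random.randint(0, 2, 200)

# Any small model will do; this dense classifier is only an illustration
model = keras.Sequential([
    keras.Input(shape=(20,)),
    keras.layers.Dense(32, activation='relu'),
    keras.layers.Dense(1, activation='sigmoid'),
])
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])

# ...then call model.fit(...) with callbacks=[PlotLossesKeras()] exactly as above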

Animated fig for livelossplot tracking log-loss and accuracy

  • (The most FA)Q: Why not TensorBoard?
  • A: Jupyter Notebook compatibility (for exploration and teaching). The simplicity of use.

Installation

To install this version from PyPI, type:

pip install livelossplot

To get the newest one from this repo (note that we are in the alpha stage, so there may be frequent updates), type:

pip install git+git://github.com/stared/livelossplot.git

Examples

Look at notebook files with full working examples:

You can run the examples in Colab.

Overview

Text logs are easy, but it's easy to miss the most crucial information: is it learning, doing nothing, or overfitting? Visual feedback lets us keep track of the training process, and now there is a tool for that in Jupyter Notebook.

If you want to get serious, use TensorBoard. But what if you just want to train a small model in Jupyter Notebook? Here is a way to do so, using livelossplot as a plug&play component:

from livelossplot import ...

PlotLosses for a generic API.

plotlosses = PlotLosses()
plotlosses.update({'acc': 0.7, 'val_acc': 0.4, 'loss': 0.9, 'val_loss': 1.1})
plotlosses.send()  # draw, update logs, etc.
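
The same two calls drive a custom training loop in any framework: update() takes a dictionary with the current metric values, send() redraws the plot. A minimal sketch, where train_one_epoch() and evaluate() stand in for your own training and validation code (they are not part of livelossplot):

from livelossplot import PlotLosses

plotlosses = PlotLosses()
for epoch in range(10):
    train_metrics = train_one_epoch()  # your code, e.g. {'loss': 0.9, 'acc': 0.7}
    val_metrics = evaluate()           # your code, e.g. {'val_loss': 1.1, 'val_acc': 0.4}
    plotlosses.update({**train_metrics, **val_metrics})
    plotlosses.send()  # redraw the plots with the new epoch appended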

There are callbacks for common libraries and frameworks: PlotLossesKeras, PlotLossesKerasTF, PlotLossesPoutyne, PlotLossesIgnite.

Feel invited to write and contribute your own adapter. If you want to use a bare logger, there is MainLogger.

from livelossplot.outputs import ...

Plots: MatplotlibPlot, BokehPlot.

Loggers: ExtremaPrinter (to standard output), TensorboardLogger, TensorboardTFLogger, NeptuneLogger.

To use them, initialize PlotLosses with some outputs:

plotlosses = PlotLosses(outputs=[MatplotlibPlot(), TensorboardLogger()])

There are custom matplotlib plots in livelossplot.outputs.matplotlib_subplots that you can pass in as MatplotlibPlot arguments.

If you prefer to plot with Bokeh instead of matplotlib, use:

plotlosses = PlotLosses(outputs=[BokehPlot()])
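
Whatever outputs you pick, the generic update()/send() workflow stays the same. A minimal end-to-end sketch reusing the classes listed above (both outputs are constructed with their defaults here, as in the MatplotlibPlot example):

from livelossplot import PlotLosses
from livelossplot.outputs import MatplotlibPlot, ExtremaPrinter

# Plot in the notebook and print metric extrema to standard output
plotlosses = PlotLosses(outputs=[MatplotlibPlot(), ExtremaPrinter()])
plotlosses.update({'loss': 0.9, 'val_loss': 1.1})
plotlosses.send()  # draws the figure and prints the extrema summary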

Sponsors

This project is supported by Jacek Migdał, Marek Cichy, Casper da Costa-Luis, and Piotr Zientara. Join the sponsors - show your ❤️ and support, and appear on the list! It will give me time and energy to work on this project.

This project is also supported by the European program Program Operacyjny Inteligentny Rozwój for GearShift - building the engine of behavior of wheeled motor vehicles and map generation based on artificial intelligence algorithms implemented on the Unreal Engine platform, led by ECC Games (NCBR grant GameINN).

Trivia

It started as this gist. Since it became popular, I decided to rewrite it as a package.

Oh, and I am interested in data vis in general - see Simple diagrams of convoluted neural networks (an overview of deep learning architecture diagrams):

A good diagram is worth a thousand equations — let’s create more of these!

...or my other data vis projects.

Todo

If you want more functionality, open an Issue or, even better, prepare a Pull Request.
