Forecasting for knowable future events using Bayesian informative priors (forecasting with judgmental adjustment).

Overview

What is judgyprophet?

judgyprophet is a Bayesian forecasting algorithm based on Prophet that enables forecasting while using information the business knows about future events. The aim is to enable users to perform forecasting with judgmental adjustment in a way that is as mathematically sound as possible.

Some events will have a big effect on your timeseries, and some of these you will be aware of ahead of time. For example:

  • An existing product entering a new market.
  • A price change to a product.

These events will typically cause a large change in your timeseries (e.g. of product sales), which a standard statistical forecast will consistently underestimate.

The business will often have good estimates (or at least better ones than your statistical forecast) of how these events will affect your timeseries. But this information is difficult to encode into a statistical forecasting algorithm. One option is to use a regressor, but this typically works poorly: you have no data on the event before it occurs, and the statistical forecast does not know how to balance the information in the regressor against the trend after the event occurs (which can lead to erratic behaviour).

judgyprophet solves this problem by encoding the business estimate of how the event will affect the forecast (the judgmental adjustment) as a Bayesian informative prior.

Before the event occurs, this business information is used to inform the forecast of what will happen post-event, e.g. the estimated uplift in product sales once the event has happened. After the event occurs, we update what the business thinks will happen with what we actually see happening in the actuals. This is done using standard Bayesian updating.
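
To give some intuition for this updating, here is a minimal sketch of a conjugate Normal-Normal update. This is an illustration of the idea only, not judgyprophet's internal code, and the numbers are hypothetical: the business estimate of the post-event slope change acts as an informative prior, and post-event actuals pull it towards what was actually observed.

import numpy as np

# Business estimate of the post-event slope uplift, expressed as a Normal prior
m0, prior_sd = 6.0, 1.0

# Hypothetical post-event slope observations and their assumed noise level
observed_slopes = np.array([4.8, 5.1, 4.9])
obs_sd = 1.0

# Standard Normal-Normal conjugate update of the prior with the observations
prior_prec = 1.0 / prior_sd ** 2
obs_prec = len(observed_slopes) / obs_sd ** 2
posterior_mean = (prior_prec * m0 + obs_prec * observed_slopes.mean()) / (prior_prec + obs_prec)
posterior_sd = (prior_prec + obs_prec) ** -0.5

# The posterior mean is pulled from the business estimate (6) towards the data (~4.9)
print(posterior_mean, posterior_sd)

Before the event there are no post-event observations, so the posterior is just the prior, i.e. the business estimate; as actuals accumulate, the data term dominates.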

Installation

1. Install the judgyprophet Python package using pip

pip install judgyprophet

2. Compile the STAN model

judgyprophet depends on STAN, whose models have to be compiled before running.

So to use judgyprophet, you have to compile the model. Do this in the shell using

python -c "from judgyprophet import JudgyProphet; JudgyProphet().compile()"

or in Python using

from judgyprophet import JudgyProphet

JudgyProphet().compile()

This will take a while, but you only need to run it once, after the initial install.

Documentation

Full documentation is available on our Github Pages site here.

Scroll down for a quickstart tutorial.

A runnable Jupyter notebook version of the quickstart tutorial is available here.

Roadmap

Some things on our roadmap:

  • Currently the judgyprophet STAN file is only tested on Unix-based (Linux and Mac) machines. We aim to fully test on Windows ASAP.
  • Option to run full MCMC, rather than just L-BFGS.
  • Prediction intervals.
  • Regressors/holidays.

Quickstart Tutorial

Imagine your business currently operates in the US, but is launching its product in Europe. As a result it anticipates a sharp increase in sales, which it has an estimate of. As the forecasting team, you are asked to account for this in your forecast.

Let's look at how we might do this using judgyprophet, with some example data where we know what happened. First let's plot it:

from judgyprophet.tutorials.resources import get_trend_event

example_data = get_trend_event()
p = example_data.plot.line()

[Plot: the example product sales timeseries]

We can see that product sales increased sharply from about September 2020. Suppose this was a launch in a new market, and that the business had an initial estimate of the impact in May 2020: it expected the slope increase to be 6.

Let's use judgyprophet to forecast this series from May 2020. We do this by encoding the initial business estimate as a trend event.

from judgyprophet import JudgyProphet
import pandas as pd
import seaborn as sns

# Create the expected trend events by consulting with the business
trend_events = [
    {'name': "New market entry", 'index': '2020-09-01', 'm0': 6}
]


# Cutoff the data to May 2020
data_may2020 = example_data.loc[:"2020-05-01"]

# Make the forecast with the business estimated trend event
# We have no level events, so just provide the empty list.
jp = JudgyProphet()
# Because the event is beyond the actuals, judgyprophet throws a warning.
#    This is just because the Bayesian model at the event has no actuals to learn from.
#    The event is still used in predictions.
jp.fit(
    data=data_may2020,
    level_events=[],
    trend_events=trend_events,
    # Set random seed for reproducibility
    seed=13
)
predictions = jp.predict(horizon=12)
INFO:judgyprophet.judgyprophet:Rescaling onto 0-mean, 1-sd.
WARNING:judgyprophet.judgyprophet:Post-event data for trend event New market entry less than 0 points. Event deactivated in model. Event index: 2020-09-01, training data end index: 2019-06-01 00:00:00
WARNING:judgyprophet.utils:No active trend or level events (i.e. no event indexes overlap with data). The model will just fit a base trend to the data.


Initial log joint probability = -3.4521
    Iter      log prob        ||dx||      ||grad||       alpha      alpha0  # evals  Notes
       3      -2.92768      0.054987   8.11433e-14           1           1        7
Optimization terminated normally:
  Convergence detected: gradient norm is below tolerance

Because we are in May 2020, the forecasting algorithm has no post-event actuals to learn from, so it just uses the business estimate. Let's plot the result:

from judgyprophet.tutorials.resources import plot_forecast

plot_forecast(
    actuals=example_data,
    predictions=predictions,
    cutoff="2020-05-01",
    events=trend_events
)
INFO:prophet:Disabling yearly seasonality. Run prophet with yearly_seasonality=True to override this.
INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.
INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.



Initial log joint probability = -17.0121
Iteration  1. Log joint probability =    10.4753. Improved by 27.4875.
Iteration  2. Log joint probability =    12.7533. Improved by 2.27796.
Iteration  3. Log joint probability =    25.4696. Improved by 12.7163.
Iteration  4. Log joint probability =     26.707. Improved by 1.2374.
Iteration  5. Log joint probability =    26.7075. Improved by 0.000514342.
Iteration  6. Log joint probability =    26.7104. Improved by 0.00296558.
Iteration  7. Log joint probability =    26.7122. Improved by 0.00171322.
Iteration  8. Log joint probability =    26.7157. Improved by 0.00351772.
Iteration  9. Log joint probability =    26.7159. Improved by 0.000208268.
Iteration 10. Log joint probability =    26.7159. Improved by 6.64977e-05.
Iteration 11. Log joint probability =     26.716. Improved by 6.89899e-05.
Iteration 12. Log joint probability =     26.716. Improved by 3.06578e-05.
Iteration 13. Log joint probability =     26.716. Improved by 8.91492e-07.
Iteration 14. Log joint probability =     26.716. Improved by 8.71052e-09.

[Plot: forecast from the May 2020 cutoff against the actuals, with the trend event marked]

We can see judgyprophet is accounting for the increased trend, but the business slightly overestimated the increase in sales due to the product launch.

Let's fast forward to January 2021: the business wants to reforecast based on its estimate and what it has seen so far of the product launch. This is where judgyprophet comes into its own.

Once actuals are observed after the event has taken place, judgyprophet updates its estimate of what the event impact is. Let's look at this in action:

# Cutoff the data to January 2021
data_jan2021 = example_data.loc[:"2021-01-01"]

# Reforecast using the new actuals, now that we are at Jan 2021
jp = JudgyProphet()
jp.fit(
    data=data_jan2021,
    level_events=[],
    trend_events=trend_events,
    # Set random seed for reproducibility
    seed=13
)
predictions = jp.predict(horizon=12)
INFO:judgyprophet.judgyprophet:Rescaling onto 0-mean, 1-sd.
INFO:judgyprophet.judgyprophet:Adding trend event New market entry to model. Event index: 2020-09-01, training data start index: 2019-06-01 00:00:00, training data end index: 2021-01-01 00:00:00. Initial gradient: 6. Damping: None.


Initial log joint probability = -309.562
    Iter      log prob        ||dx||      ||grad||       alpha      alpha0  # evals  Notes
      10      -1.64341   2.10244e-05   3.61281e-06           1           1       15
Optimization terminated normally:
  Convergence detected: relative gradient magnitude is below tolerance

Now let's plot the results:

plot_forecast(actuals=example_data, predictions=predictions, cutoff="2021-01-01", events=trend_events)
INFO:prophet:Disabling yearly seasonality. Run prophet with yearly_seasonality=True to override this.
INFO:prophet:Disabling weekly seasonality. Run prophet with weekly_seasonality=True to override this.
INFO:prophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.



Initial log joint probability = -24.5881
Iteration  1. Log joint probability =   -1.06803. Improved by 23.5201.
Iteration  2. Log joint probability =    11.6215. Improved by 12.6895.
Iteration  3. Log joint probability =    36.5271. Improved by 24.9056.
Iteration  4. Log joint probability =    37.3776. Improved by 0.850488.
Iteration  5. Log joint probability =    37.6489. Improved by 0.271259.
Iteration  6. Log joint probability =    37.6547. Improved by 0.00580657.
Iteration  7. Log joint probability =    37.7831. Improved by 0.128419.
Iteration  8. Log joint probability =    37.7884. Improved by 0.00527858.
Iteration  9. Log joint probability =     37.789. Improved by 0.000612124.
Iteration 10. Log joint probability =    37.7891. Improved by 9.93823e-05.
Iteration 11. Log joint probability =    37.7902. Improved by 0.00112416.
Iteration 12. Log joint probability =    37.7902. Improved by 3.17397e-06.
Iteration 13. Log joint probability =    37.7902. Improved by 1.59404e-05.
Iteration 14. Log joint probability =    37.7902. Improved by 5.06854e-07.
Iteration 15. Log joint probability =    37.7902. Improved by 6.87792e-07.
Iteration 16. Log joint probability =    37.7902. Improved by 4.82761e-08.
Iteration 17. Log joint probability =    37.7902. Improved by 2.50385e-07.
Iteration 18. Log joint probability =    37.7902. Improved by 6.60322e-09.

[Plot: forecast from the January 2021 cutoff against the actuals, with the trend event marked]

In this case, once judgyprophet observes the data post-event, the Bayesian updating starts to realise the business estimate is a bit too large, so it reduces it.

This was a simple example to demonstrate judgyprophet. You can add many trend events into a single forecasting horizon, and add damping. You can also add level events (changes in the forecasting level) and seasonality; see our other tutorials for details, and the sketch below for a flavour of the configuration.
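
For illustration, a configuration combining a damped trend event with a level event might look like the sketch below. The trend-event keys ('name', 'index', 'm0', and the optional 'gamma' damping between 0 and 1) follow the judgyprophet.fit() docstring; the level-event keys shown (e.g. 'c0' for the estimated level change) are an assumption here, so check the level-event tutorial for the exact schema.

# Damped trend event: gamma close to 1 means the extra growth fades out slowly
trend_events = [
    {'name': "New market entry", 'index': '2020-09-01', 'm0': 6, 'gamma': 0.9},
]

# Hypothetical level event, e.g. a price change shifting the sales level;
# the 'c0' key (estimated level change) is assumed -- see the level-event tutorial
level_events = [
    {'name': "Price change", 'index': '2021-03-01', 'c0': 10},
]

jp = JudgyProphet()
jp.fit(
    data=data_jan2021,
    level_events=level_events,
    trend_events=trend_events,
    seed=13,
)
predictions = jp.predict(horizon=12)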
