JumpDiff: Non-parametric estimators for jump-diffusion processes in Python

Overview


jumpdiff

jumpdiff is a Python library with non-parametric Nadaraya–Watson estimators to extract the parameters of jump-diffusion processes. With jumpdiff one can extract the parameters of a jump-diffusion process from one-dimensional timeseries, employing a kernel-density estimation method combined with a set of second-order corrections for a precise retrieval of the parameters, even for short timeseries.

Installation

To install jumpdiff, run

   pip install jumpdiff

Then on your favourite editor just use

   import jumpdiff as jd

Dependencies

The parameter estimation depends solely on numpy and scipy. The mathematical formulae depend on sympy. The library stems from the kramersmoyal project (Ref. 3), but functions independently of it.

Documentation

You can find the documentation here.

Jump-diffusion processes

The theory

Jump-diffusion processes (Ref. 1), as the name suggests, are a mixed type of stochastic processes with a diffusive and a jump term. One form of these processes which is mathematically tractable is given by the stochastic differential equation

   dX(t) = a(X,t) dt + b(X,t) dW(t) + ξ dJ(t),

which has 4 main elements: a drift term a(X,t), a diffusion term b(X,t), a jump amplitude term ξ, which is drawn from a Gaussian distribution, and finally a jump rate λ. Here dW(t) is a Wiener process and dJ(t) a Poisson jump process. You can find a good review on this topic in Ref. 2.
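
To make the structure of this SDE concrete, here is a minimal Euler-type sketch of such a process. This is only an illustration under stated assumptions (ξ is taken as the variance of the Gaussian jump amplitude), not jumpdiff's own integrator, which is used below.

import numpy as np

# Minimal Euler-type sketch of dX = a(X)dt + b(X)dW + xi dJ. Assumptions:
# 'xi' is the *variance* of the Gaussian jump amplitude, and the number of
# jumps per step is Poisson-distributed with mean lamb*delta_t.
def euler_jump_diffusion(a, b, xi, lamb, t_final, delta_t, x0=0.0, seed=1):
    rng = np.random.default_rng(seed)
    n = int(t_final / delta_t)
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        dw = rng.normal(0.0, np.sqrt(delta_t))    # Wiener increment
        n_jumps = rng.poisson(lamb * delta_t)     # jumps in this time step
        jump = rng.normal(0.0, np.sqrt(xi), size=n_jumps).sum()
        x[i] = x[i-1] + a(x[i-1])*delta_t + b(x[i-1])*dw + jump
    return x

jumpdiff's own jd_process, used below, handles this integration for you, including a Milstein scheme (see the last section).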

Integrating a jump-diffusion process

Let us use the functions in jumpdiff to generate a jump-diffusion process, and subsequently retrieve the parameters. This is a good way to understand the usage of the integrator and the non-parametric retrieval of the parameters.

First we need to load our library. We will call it jd

import jumpdiff as jd

Let us now define a jump-diffusion process and use jd_process to integrate it. Notice that the drift and diffusion terms must be given as functions.

# integration time and time sampling
t_final = 10000
delta_t = 0.001

# A drift function
def a(x):
    return -0.5*x

# and a (constant) diffusion term
def b(x):
    return 0.75

# Now define a jump amplitude and rate
xi = 2.5
lamb = 1.75

# and simply call the integration function
X = jd.jd_process(t_final, delta_t, a=a, b=b, xi=xi, lamb=lamb)

This will generate a jump-diffusion process X of length int(10000/0.001) with the given parameters.
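
For a quick sanity check, one can confirm the length and plot a short stretch of the trajectory, where the jumps are clearly visible:

import numpy as np
import matplotlib.pyplot as plt

print(len(X) == int(t_final / delta_t))   # True

# plot the first 10 time units of the trajectory
time = np.arange(0, 10, delta_t)
plt.plot(time, X[:time.size])
plt.xlabel('t')
plt.ylabel('X(t)')
plt.show()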

Using jumpdiff to retrieve the parameters

Moments and Kramers–Moyal coefficients

Take the timeseries X and use the function moments to retrieve the conditional moments of the process. For now let us focus on the shortest time lag, so we can best approximate the Kramers–Moyal coefficients. For this case we can simply employ

edges, moments = jd.moments(timeseries = X)

The array edges holds the limits of our space, and the array moments records all 6 powers/orders of our conditional moments. Let us take a look at these before we proceed, to get acquainted with them.
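
A quick way to get acquainted with the returned arrays is to inspect their shapes (treat this as a sketch; the exact shapes depend on jumpdiff's defaults):

# the leading axis of 'moments' indexes the power of the conditional moment,
# so 'moments[1,...]' and 'moments[2,...]' below are the first and second powers
print(edges.shape, moments.shape)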

We can plot the first moment with any conventional plotter, so let's use pyplot from matplotlib

import matplotlib.pyplot as plt

# we want the first power, so we need 'moments[1,...]'
plt.plot(edges, moments[1,...])

The first moment here (i.e., the first Kramers–Moyal coefficient) is given solely by the drift term that we have selected, -0.5*x.

And the second moment (i.e., the second Kramers–Moyal coefficient) is a mixture of the contributions of both the diffusive term b(x) and the jump terms ξ and λ.

You have this stored in moments[2,...].
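
As a rough visual check, one can overlay the chosen drift on the estimated first moment. The delta_t scaling below is an assumption about the normalisation of moments; if jumpdiff already divides by the time lag, drop that factor:

import matplotlib.pyplot as plt

# compare the estimated first conditional moment with the drift a(x) = -0.5*x
plt.plot(edges, moments[1,...], label='estimated 1st moment')
plt.plot(edges, a(edges) * delta_t, '--', label='a(x) * delta_t')
plt.legend()
plt.show()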

Retrieving the jump-related terms

Naturally one of the most pertinent questions when addressing jump-diffusion processes is the possibility of recovering these same parameters from data. For the given jump-diffusion process we can use the jump_amplitude and jump_rate functions to non-parametrically estimate the jump amplitude and jump rate terms.

After having the moments in hand, all we need is

# first estimate the jump amplitude
xi_est = jd.jump_amplitude(moments = moments)

# and now estimate the jump rate
lamb_est = jd.jump_rate(moments = moments)

which in our case results in (xi_est) ξ = 2.43 ± 0.17 and (lamb_est) λ = 1.744 * delta_t (don't forget to divide lamb_est by delta_t)!
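
Putting the two estimates together, with the rate converted back to physical units (a short sketch):

# jump_rate returns the rate per sampling interval, so divide by delta_t
print('jump amplitude xi :', xi_est)              # compare with xi = 2.5
print('jump rate lambda  :', lamb_est / delta_t)  # compare with lamb = 1.75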

Other functions and options

Included in this package is also the Milstein scheme of integration, particularly important when the diffusion term has some spatial dependence on x. moments can also calculate the conditional moments for different time lags, using the parameter lag, as sketched below.
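
For example, requesting several lags at once might look as follows — the exact form accepted by lag should be checked in the documentation; a list of integer multiples of the sampling interval is assumed here:

# conditional moments at lags of 1, 2, 5, and 10 sampling intervals (assumed form)
edges, moments = jd.moments(timeseries=X, lag=[1, 2, 5, 10])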

In formulae, the set of formulas needed to calculate the second-order corrections is given (in sympy).

Contributions

We welcome reviews and ideas from everyone. If you want to share your ideas, upgrades, doubts, or simply report a bug, open an issue here on GitHub, or contact us directly. If you need help with the code, the theory, or the implementation, drop us an email. We abide by a Conduct of Fairness.

Changelog

  • Version 0.4 - Added a set of self-consistency checks, the documentation, examples, and trial code. Code at PyPI.
  • Version 0.3 - Added a straightforward procedure to retrieve the jump amplitude and jump rate functions, alongside an easy sympy display of the corrections.
  • Version 0.2 - Introduced the second-order corrections to the moments.
  • Version 0.1 - Designed an implementation of the moments function, generalising kramersmoyal's km.

Literature and Support

History

This project was started in 2017 at the neurophysik by Leonardo Rydin Gorjão, Jan Heysel, Klaus Lehnertz, and M. Reza Rahimi Tabar, and separately by Pedro G. Lind, at the Department of Computer Science, Oslo Metropolitan University. From 2019 to 2021, Pedro G. Lind, Leonardo Rydin Gorjão, and Dirk Witthaut developed a set of corrections and an implementation for Python, presented here.

Funding

This work was supported by the Helmholtz Association Initiative Energy System 2050 - A Contribution of the Research Field Energy, grant No. VH-NG-1025, and by the STORM - Stochastics for Time-Space Risk Models project of the Research Council of Norway (RCN), grant No. 274410.


Bibliography

1 Tabar, M. R. R. Analysis and Data-Based Reconstruction of Complex Nonlinear Dynamical Systems. Springer, International Publishing (2019), Chapter Stochastic Processes with Jumps and Non-vanishing Higher-Order Kramers–Moyal Coefficients.

2 Friedrich, R., Peinke, J., Sahimi, M., Tabar, M. R. R. Approaching complexity by stochastic methods: From biological systems to turbulence, Physics Reports 506, 87–162 (2011).

3 Rydin Gorjão, L., Meirinhos, F. kramersmoyal: Kramers–Moyal coefficients for stochastic processes. Journal of Open Source Software, 4(44) (2019).

Extended Literature

You can find further reading on SDEs, non-parametric estimation, and the general principles of the Fokker–Planck equation, Kramers–Moyal expansion, and related topics in the classic (physics) books

  • Risken, H. The Fokker–Planck equation. Springer, Berlin, Heidelberg (1989).
  • Gardiner, C.W. Handbook of Stochastic Methods. Springer, Berlin (1985).

An extensive review on the subject can be found here.
