More than a hundred strange attractors

dysts

Analyze more than a hundred chaotic systems.

(Figure: an embedding of all chaotic systems in the collection.)

Basic Usage

Import a model and run a simulation with default initial conditions and parameter values

from dysts.flows import Lorenz

model = Lorenz()
sol = model.make_trajectory(1000)
# plt.plot(sol[:, 0], sol[:, 1])
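
The trajectory sol is an array with one row per timepoint and one column per dynamical variable (three for Lorenz). A minimal way to visualize it, assuming matplotlib is installed:

import matplotlib.pyplot as plt

# project the three-dimensional Lorenz trajectory onto its first two coordinates
plt.plot(sol[:, 0], sol[:, 1])
plt.xlabel("x")
plt.ylabel("y")
plt.show()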

Modify a model's parameter values and re-integrate

model = Lorenz()
model.beta = 1  # the Lorenz parameters are sigma, rho, and beta
model.ic = [0.1, 0.0, 5.0]  # start off the invariant z-axis so the trajectory reaches the attractor
sol = model.make_trajectory(1000)
# plt.plot(sol[:, 0], sol[:, 1])

Load a precomputed trajectory for the model

eq = Lorenz()
sol = eq.load_trajectory(subsets="test", noise=False, granularity="fine")
# plt.plot(sol[:, 0], sol[:, 1])

Integrate new trajectories from all 131 chaotic systems with a custom granularity

from dysts.base import make_trajectory_ensemble

all_out = make_trajectory_ensemble(100, resample=True, pts_per_period=75)
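
The ensemble call returns one trajectory per system. In recent versions the result is a dictionary keyed by system name (an assumption worth checking against the installed version), so individual trajectories can be pulled out directly:

# assumes make_trajectory_ensemble returns a dict keyed by system name
lorenz_traj = all_out["Lorenz"]
print(lorenz_traj.shape)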

Load a precomputed collection of time series from all 131 chaotic systems

from dysts.datasets import load_dataset

data = load_dataset(subsets="train", data_format="numpy", standardize=True)
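
The exact container returned depends on data_format; a quick, hedged way to inspect what came back without assuming a particular return type:

# inspect the returned object; with data_format="numpy" it should behave like an array
print(type(data))
print(getattr(data, "shape", "no shape attribute"))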

Additional functionality and examples can be found in the demonstrations notebook. The full API documentation can be found here.

Reference

For additional details, please see the preprint. If using this code for published work, please consider citing the paper.

William Gilpin. "Chaos as an interpretable benchmark for forecasting and data-driven modelling." Advances in Neural Information Processing Systems (NeurIPS) 2021. https://arxiv.org/abs/2110.05266

Installation

Install from PyPI

pip install dysts

To obtain the latest version, including new features and bug fixes, download and install the project repository directly from GitHub

git clone https://github.com/williamgilpin/dysts
cd dysts
pip install -I . 

Test that everything is working

python -m unittest

Alternatively, to use this as a regular package without downloading the full repository, install directly from GitHub

pip install git+https://github.com/williamgilpin/dysts

The key dependencies are

  • Python 3+
  • numpy
  • scipy
  • pandas
  • sdeint (optional, but required for stochastic dynamics)
  • numba (optional, but speeds up generation of trajectories)

These additional optional dependencies are needed to reproduce some portions of this repository, such as benchmarking experiments and estimation of invariant properties of each dynamical system:

  • nolds (used for calculating the correlation dimension)
  • darts (used for forecasting benchmarks)
  • sktime (used for classification benchmarks)
  • tsfresh (used for statistical quantity extraction)
  • pytorch (used for neural network benchmarks)

Contributing

New systems. If you know of any systems that should be included, please feel free to submit an issue or pull request. The biggest bottleneck when adding new models is a lack of known parameter values and initial conditions, so please provide a reference or code that contains all parameter values necessary to reproduce the claimed dynamics. Because there are infinitely many chaotic systems, we are currently only including systems that have appeared in published work.

Development and Maintenance. We are very grateful for any suggestions or contributions. See the to-do list below for some of the ongoing work.

Benchmarks

The benchmarks reported in our preprint can be found in benchmarks. An overview of the contents of the directory can be found in BENCHMARKS.md, while individual task areas are summarized in corresponding Jupyter Notebooks within the top level of the directory.

Contents

  • Code to generate benchmark forecasting and training experiments is included in benchmarks
  • Pre-computed time series with training and test partitions are included in data
  • The raw definitions and metadata for all chaotic systems are included in the database file chaotic_attractors. The Python implementations of the differential equations can be found in the flows module

Implementation Notes

  • Currently there are 131 continuous-time models, including several delay differential equations. There is also a separate module with 10 discrete maps, which is currently being expanded.
  • The right hand side of each dynamical equation is compiled using numba, wherever possible. Ensembles of trajectories are vectorized where needed.
  • Attractor names, default parameter values, references, and other metadata are stored in parseable JSON database files. Parameter values are based on standard or published values, and default initial conditions were generated by running each model until the moments of the autocorrelation function all become stationary.
  • The default integration step is stored in each continuous-time model's dt field. This integration timestep was chosen based on the highest significant frequency observed in the power spectrum, with significance determined relative to random phase surrogates. The period field contains the timescale associated with the dominant frequency in each system's power spectrum. When calling model.make_trajectory() with the optional setting resample=True, integration is performed at the default dt and the integrated trajectory is then resampled based on the period, so the resulting trajectories have consistent dominant timescales across models despite having different integration timesteps (see the sketch after this list).
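
A minimal sketch of the resampling behavior described above, comparing two systems integrated with resample=True so that each trajectory has the same number of points per dominant period (the choice of systems here is purely illustrative):

from dysts.flows import Lorenz, Rossler

# each model is integrated at its own default dt, then resampled so that
# every trajectory has pts_per_period points per dominant period
for model in (Lorenz(), Rossler()):
    sol = model.make_trajectory(1000, resample=True, pts_per_period=100)
    print(type(model).__name__, model.dt, model.period, sol.shape)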

Acknowledgements

  • Two existing collections of named systems can be found on the webpages of Jürgen Meier and J. C. Sprott. The current version of dysts contains all systems from both collections.
  • Several of the analysis routines (such as calculation of the correlation dimension) use the library nolds. If re-using the fractal dimension code that depends on nolds, please be sure to credit that library and heed its license. The Lyapunov exponent calculation is based on the QR factorization approach used by Wolf et al. (1985) and Eckmann et al. (1986), with implementation details adapted from conventions in the Julia library DynamicalSystems.jl.

Ethics & Reporting

Dataset datasheets and metadata are reported using the dataset documentation guidelines described in Gebru et al 2018; please see our preprint for a full dataset datasheet and other information. We note that all datasets included here are mathematical in nature, and do not contain human or clinical observations. If any users become aware of unintended harms that may arise due to the use of this data, we encourage reporting them by submitting an issue on this repository.

Development to-do list

A partial list of potential improvements in future versions

  • Speed up the delay equation implementation
    • We need to roll our own implementation of DDE23 in the utils module.
  • Improve calculations of Lyapunov exponents for delay systems
  • Implement multivariate multiscale entropy and re-calculate for all attractors
  • Add a method for integrating multiple systems in parallel, based on a list of names and a set of shared settings (see the sketch after this list)
    • Can use multiprocessing for a few systems, but greater speedups might be possible by compiling all right hand sides into a single function acting on a large vector.
    • Can also use this same utility to integrate multiple initial conditions for the same model
  • Add a separate Jacobian database file, and add an attribute that can be used to check whether an analytical Jacobian exists. This will speed up numerical integration and potentially aid in calculating Lyapunov exponents.
  • Align the initial phases, potentially by picking default initial conditions that lie on the attractor but are as close as possible to the origin
  • Expand and finalize the discrete dysts.maps module
    • Maps are deterministic but not differentiable, and so not all analysis methods will work on them. Will probably need a decorator to declare whether utilities work on flows, maps, or both
  • Switch stochastic integration to a newer package, like torchsde or sdepy
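
A rough sketch of the parallel-integration idea above, using the standard library's multiprocessing to integrate several named systems with shared settings; the helper function and the specific system names are illustrative, not part of the dysts API:

import multiprocessing as mp

import dysts.flows as flows

def _integrate(name, n=1000, pts_per_period=75):
    # hypothetical helper: look up a system class by name and integrate it
    model = getattr(flows, name)()
    return name, model.make_trajectory(n, resample=True, pts_per_period=pts_per_period)

if __name__ == "__main__":
    names = ["Lorenz", "Rossler", "Thomas"]
    with mp.Pool(processes=len(names)) as pool:
        results = dict(pool.starmap(_integrate, [(name,) for name in names]))
    print({name: traj.shape for name, traj in results.items()})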