Deep Distributed Control of Port-Hamiltonian Systems

Overview

De(e)pendable Distributed Control of Port-Hamiltonian Systems (DeepDisCoPH)

This repository is associated with the paper [1] and contains:

  1. The full paper manuscript.
  2. The code to reproduce numerical experiments.

Summary

By embracing the compositional properties of port-Hamiltonian (pH) systems, we characterize deep Hamiltonian control policies with built-in closed-loop stability guarantees — irrespective of the interconnection topology and the chosen neural network parameters. Furthermore, our setup enables leveraging recent results on well-behaved neural ODEs to prevent the phenomenon of vanishing gradients by design [2]. The numerical experiments described in the report and available in this repository corroborate the dependability of the proposed DeepDisCoPH architecture, while matching the performance of general neural network policies.
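
For intuition, the Hamiltonian deep networks of [2] build each layer as a discretized Hamiltonian flow with a skew-symmetric interconnection matrix, which is what underlies the non-vanishing-gradient property mentioned above. The following is a minimal, PyTorch-style sketch of such a layer; it is purely illustrative and is not the implementation used in this repository.

import torch
import torch.nn as nn

class HamiltonianLayer(nn.Module):
    """One forward-Euler step x_{k+1} = x_k + h * J * grad_x H(x_k)
    with H(x) = sum(log(cosh(K x + b))) and J skew-symmetric.
    Illustrative sketch only; not the DeepDisCoPH code."""

    def __init__(self, dim, step=0.1):
        super().__init__()
        self.K = nn.Parameter(0.1 * torch.randn(dim, dim))  # trainable weights
        self.b = nn.Parameter(torch.zeros(dim))              # trainable bias
        self.step = step                                     # integration step h
        S = torch.randn(dim, dim)
        self.register_buffer("J", S - S.T)                   # skew-symmetric: J = -J^T

    def forward(self, x):
        # grad_x H(x) = K^T tanh(K x + b), computed row-wise for a batch.
        grad_H = torch.tanh(x @ self.K.T + self.b) @ self.K
        return x + self.step * grad_H @ self.J.T

# Example: a batch of 8 states of dimension 4 passed through one layer.
layer = HamiltonianLayer(dim=4)
y = layer(torch.randn(8, 4))

Because J is skew-symmetric, the continuous-time flow underlying this update preserves the Hamiltonian structure that [2] exploits to rule out vanishing gradients.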

Report

The report as well as the corresponding Appendices can be found in the docs folder.

Installation of DeepDisCoPH

The following commands install the De(e)pendable Distributed Control of Port-Hamiltonian Systems (DeepDisCoPH) package:

git clone https://github.com/DecodEPFL/DeepDisCoPH.git

cd DeepDisCoPH

python setup.py install
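
Alternatively, a pip-based install from the repository root should also work on most setups, since the package provides a standard setup.py (this is an assumption about the packaging, not a documented command):

pip install .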

Basic usage

To train distributed controllers for the 12 robots in the xy-plane:

./run.py --model [MODEL]

where available values for MODEL are distributed_HDNN, distributed_HDNN_TI and distributed_MLP.
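
For example, to train the distributed H-DNN controllers for this formation task:

./run.py --model distributed_HDNN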

To plot the norms of the backward sensitivity matrices (BSMs) when training a distributed H-DNN controller, as in the previous example, run:

./bsm.py --layer [LAYER]

where available values for LAYER are 1, 2, ..., 100. If LAYER = -1, it defaults to N, the total number of layers. The LAYER parameter indicates the layer at which the loss function is considered to be evaluated.
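
For instance, to plot the BSM norms with the loss considered at the last layer:

./bsm.py --layer -1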

Examples: formation control with collision avoidance

The following GIFs show the trajectories of the robots before and after training a distributed H-DNN controller. The goal is to reach the target positions within T = 5 seconds while avoiding collisions.

[GIF: robot trajectories before training]  [GIF: robot trajectories after training a distributed H-DNN controller]

Training is performed for t in [0, 5]. Trajectories are shown for t in [0, 6], highlighting that the robots stay close to their desired positions when the time horizon is extended (grey background).
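
For reference, the training objective in such formation-control tasks typically combines a target-tracking term with a pairwise collision penalty. The sketch below is a hypothetical loss of this kind; the function name, weights and safety distance are illustrative assumptions, not the loss implemented in this repository.

import torch

def formation_loss(traj, targets, safe_dist=0.5, alpha=1.0, beta=100.0):
    """Hypothetical loss: target tracking plus collision avoidance.

    traj:    (T, n_robots, 2) xy positions along the closed-loop trajectory
    targets: (n_robots, 2)    desired final positions
    All weights and the safety distance are illustrative assumptions."""
    # Tracking term: squared distance of every robot from its target, averaged over time.
    tracking = ((traj - targets) ** 2).sum(dim=-1).mean()

    # Collision term: penalize robot pairs that come closer than safe_dist.
    diff = traj[:, :, None, :] - traj[:, None, :, :]   # (T, n, n, 2) pairwise offsets
    dist = diff.norm(dim=-1)                           # (T, n, n) pairwise distances
    n = traj.shape[1]
    off_diag = ~torch.eye(n, dtype=torch.bool)         # exclude each robot vs itself
    penalty = torch.relu(safe_dist - dist[:, off_diag]).pow(2).mean()

    return alpha * tracking + beta * penalty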

Early stopping of the training

We verify that DeepDisCoPH controllers ensure closed-loop stability by design even during exploration. We train the DeepDisCoPH controller for 25%, 50% and 75% of the total number of iterations and report the results in the following GIFs.

[GIF: robot trajectories after 25% of training]  [GIF: after 50% of training]  [GIF: after 75% of training]

Training is performed for t in [0, 5]. Trajectories are shown for t in [0, 15]; the extended horizon, i.e. t in [5, 15], is shown with a grey background. Partially trained distributed controllers exhibit suboptimal behavior but never compromise closed-loop stability.

References

[1] Luca Furieri, Clara L. Galimberti, Muhammad Zakwan and Giancarlo Ferrari Trecate. "Distributed neural network control with dependability guarantees: a compositional port-Hamiltonian approach," under review.

[2] Clara L. Galimberti, Luca Furieri, Liang Xu and Giancarlo Ferrari Trecate. "Hamiltonian Deep Neural Networks Guaranteeing Non-vanishing Gradients by Design," arXiv:2105.13205, 2021.

Owner
Dependable Control and Decision group - EPFL