Multivariate Boosted TRee

Overview


What is MBTR

MBTR is a Python package for multivariate boosted tree regressors trained in parameter space. The package can handle arbitrary multivariate losses, as long as their gradient and Hessian are known. Gradient boosted trees are competition-winning, general-purpose, non-parametric regressors that exploit sequential model fitting and gradient descent to minimize a specific loss function. The most popular implementations are tailored to univariate regression and classification tasks, which precludes capturing cross-correlations between multivariate targets and applying conditional penalties to the predictions. This package allows the predictions to be arbitrarily regularized, so that properties such as smoothness, consistency and functional relations can be enforced.
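
To make "gradient and Hessian are known" concrete, the following is a minimal numpy sketch of those two ingredients for a plain multivariate squared-error loss; the function names are purely illustrative and are not part of the mbtr API:

import numpy as np

def mse_loss(y, y_hat):
    # L(y, y_hat) = sum of squared errors over all observations and all targets
    return np.sum((y - y_hat) ** 2)

def mse_gradient(y, y_hat):
    # dL/dy_hat: one row per observation, one column per target dimension
    return 2.0 * (y_hat - y)

def mse_hessian(y, y_hat):
    # d^2L/dy_hat^2 is constant for the squared error: twice the identity over the targets
    return 2.0 * np.eye(y.shape[1])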

Installation

pip install --upgrade git+https://github.com/supsi-dacd-isaac/mbtr.git

Usage

The MBT regressor follows the scikit-learn syntax for regressors. Creating a default instance and training it is as simple as:

m = MBT().fit(x, y)

while predictions for the test set are obtained through:

y_hat = m.predict(x_te)

The most important parameters are n_boosts (the number of fitted trees), learning_rate and loss_type. An extensive explanation of the different parameters can be found in the documentation.
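
Putting the pieces together, a minimal end-to-end sketch might look as follows; the toy data and the parameter values are illustrative placeholders, not recommendations:

import numpy as np
from mbtr.mbtr import MBT

# Toy multivariate regression data: 5 features, 3 correlated targets (illustrative only)
x = np.random.randn(500, 5)
y = x[:, :3] + 0.1 * np.random.randn(500, 3)
x_tr, y_tr, x_te, y_te = x[:400], y[:400], x[400:], y[400:]

# loss_type, n_boosts and learning_rate are the parameters discussed above;
# the values used here are placeholders
m = MBT(loss_type='mse', n_boosts=30, learning_rate=0.1).fit(x_tr, y_tr)
y_hat = m.predict(x_te)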

Documentation

Documentation and usage examples can be found in the docs.

Reference

If you make use of this software for your work, we would appreciate it if you would cite us:

Lorenzo Nespoli and Vasco Medici (2020). Multivariate Boosted Trees and Applications to Forecasting and Control. arXiv preprint arXiv:2003.03835.

@article{nespoli2020multivariate,
  title={Multivariate Boosted Trees and Applications to Forecasting and Control},
  author={Nespoli, Lorenzo and Medici, Vasco},
  journal={arXiv preprint arXiv:2003.03835},
  year={2020}
}

Acknowledgments

The authors would like to thank the Swiss Federal Office of Energy (SFOE) and the Swiss Competence Center for Energy Research - Future Swiss Electrical Infrastructure (SCCER-FURIES), for their financial and technical support to this research work.


Comments
  • Is it possible to define a custom loss function?

    Dear all, first of all thank you for developing this tool, which I believe is of great interest. I am working with:

    • environmental variables (e.g. temperature, salinity)
    • multi-dimensional targets, which are relative abundances that sum to 1 for each site

    Therefore, I was wondering if it is possible to implement a custom loss function in the mbtr framework that would be suited to proportions. Please note that I am quite new to Python.

    To do some testing, I tried to duplicate the mse loss function under another name in the losses.py file and to add the new loss to the LOSS_MAP in __init__.py. Then I compiled the files. However, I get this error when trying to run the model from the multi_reg.py example:

    >>> m = MBT(loss_type = 'mse', n_boosts=30,  min_leaf=100, lambda_weights=1e-3).fit(x_tr, y_tr, do_plot=True)
      3%|▎         | 1/30 [00:03<01:45,  3.63s/it]
    >>> m = MBT(loss_type = 'custom_mse', n_boosts=30,  min_leaf=100, lambda_weights=1e-3).fit(x_tr, y_tr, do_plot=True)
      0%|          | 0/30 [00:00<?, ?it/s]KeyError: 'custom_mse'
    

    It seems that the new loss is not recognised in LOSS_MAP:

    >>> LOSS_MAP = {'custom_mse': losses.custom_MSE,
    ...             'mse': losses.MSE,
    ...             'time_smoother': losses.TimeSmoother,
    ...             'latent_variable': losses.LatentVariable,
    ...             'linear_regression': losses.LinRegLoss,
    ...             'fourier': losses.FourierLoss,
    ...             'quantile': losses.QuantileLoss,
    ...             'quadratic_quantile': losses.QuadraticQuantileLoss}
    AttributeError: module 'mbtr.losses' has no attribute 'custom_MSE'
    

    I guess that I missed something when trying to duplicate and rename the mse loss. I would appreciate any help, if defining a custom loss function is indeed possible.
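
    For reference, the change I made in losses.py was roughly the following sketch (the name custom_MSE and the assumption that it can simply subclass the existing MSE class are my own guesses, not something documented in mbtr):

    # mbtr/losses.py -- my attempt: reuse the built-in MSE behaviour under a new name,
    # so that 'custom_mse' can then be mapped to it in LOSS_MAP (see above)
    class custom_MSE(MSE):
        pass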

    Best regards,

    opened by alexschickele 2
  • Dataset cannot be reached

    Hi, thank you for your effort in creating this. I want to try it, but I can neither download the dataset nor visit the website that you provided in the multivariate_forecas.py example.

    Is there an alternative link for that dataset? Thank you, regards!

    opened by kristfrizh 1
  • Error at import time with Python 3.10.*

    I want to use MBTR in a teaching module, where I need to run jupyter-lab inside a conda environment. While MBTR works as expected in a vanilla Python 3.8, it errors out (on the same machine) in a conda environment using Python 3.10.

    Steps to reproduce

    conda create --name testenv
    conda activate testenv
    
    conda install -c conda-forge jupyterlab
    pip install --upgrade git+https://github.com/supsi-dacd-isaac/mbtr.git
    # to make sure to get the latest version; but the version on pypi gives the same error 
    

    Then

    python
    

    and in python

    from mbtr.mbtr import MBT
    

    which outputs the following error

    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "/home/myself/.conda/envs/testenv/lib/python3.10/site-packages/mbtr/mbtr.py", line 317, in <module>
        def leaf_stats(y, edges, x, order):
      File "/home/myself/.conda/envs/testenv/lib/python3.10/site-packages/numba/core/decorators.py", line 219, in wrapper
        disp.compile(sig)
      File "/home/myself/.conda/envs/testenv/lib/python3.10/site-packages/numba/core/dispatcher.py", line 965, in compile
        cres = self._compiler.compile(args, return_type)
      File "/home/myself/.conda/envs/testenv/lib/python3.10/site-packages/numba/core/dispatcher.py", line 129, in compile
        raise retval
      File "/home/myself/.conda/envs/testenv/lib/python3.10/site-packages/numba/core/dispatcher.py", line 139, in _compile_cached
        retval = self._compile_core(args, return_type)
      File "/home/myself/.conda/envs/testenv/lib/python3.10/site-packages/numba/core/dispatcher.py", line 152, in _compile_core
        cres = compiler.compile_extra(self.targetdescr.typing_context,
      File "/home/myself/.conda/envs/testenv/lib/python3.10/site-packages/numba/core/compiler.py", line 716, in compile_extra
        return pipeline.compile_extra(func)
      File "/home/myself/.conda/envs/testenv/lib/python3.10/site-packages/numba/core/compiler.py", line 452, in compile_extra
        return self._compile_bytecode()
      File "/home/myself/.conda/envs/testenv/lib/python3.10/site-packages/numba/core/compiler.py", line 520, in _compile_bytecode
        return self._compile_core()
      File "/home/myself/.conda/envs/testenv/lib/python3.10/site-packages/numba/core/compiler.py", line 499, in _compile_core
        raise e
      File "/home/myself/.conda/envs/testenv/lib/python3.10/site-packages/numba/core/compiler.py", line 486, in _compile_core
        pm.run(self.state)
      File "/home/myself/.conda/envs/testenv/lib/python3.10/site-packages/numba/core/compiler_machinery.py", line 368, in run
        raise patched_exception
      File "/home/myself/.conda/envs/testenv/lib/python3.10/site-packages/numba/core/compiler_machinery.py", line 356, in run
        self._runPass(idx, pass_inst, state)
      File "/home/myself/.conda/envs/testenv/lib/python3.10/site-packages/numba/core/compiler_lock.py", line 35, in _acquire_compile_lock
        return func(*args, **kwargs)
      File "/home/myself/.conda/envs/testenv/lib/python3.10/site-packages/numba/core/compiler_machinery.py", line 311, in _runPass
        mutated |= check(pss.run_pass, internal_state)
      File "/home/myself/.conda/envs/testenv/lib/python3.10/site-packages/numba/core/compiler_machinery.py", line 273, in check
        mangled = func(compiler_state)
      File "/home/myself/.conda/envs/testenv/lib/python3.10/site-packages/numba/core/typed_passes.py", line 105, in run_pass
        typemap, return_type, calltypes, errs = type_inference_stage(
      File "/home/myself/.conda/envs/testenv/lib/python3.10/site-packages/numba/core/typed_passes.py", line 83, in type_inference_stage
        errs = infer.propagate(raise_errors=raise_errors)
      File "/home/myself/.conda/envs/testenv/lib/python3.10/site-packages/numba/core/typeinfer.py", line 1086, in propagate
        raise errors[0]
    numba.core.errors.TypingError: Failed in nopython mode pipeline (step: nopython frontend)
    No conversion from UniTuple(none x 2) to UniTuple(array(float64, 2d, A) x 2) for '$116return_value.7', defined at None
    
    File ".conda/envs/testenv/lib/python3.10/site-packages/mbtr/mbtr.py", line 327:
    def leaf_stats(y, edges, x, order):
        <source elided>
            s_left, s_right = None, None
        return s_left, s_right
        ^
    
    During: typing of assignment at /home/myself/.conda/envs/testenv/lib/python3.10/site-packages/mbtr/mbtr.py (327)
    
    File ".conda/envs/test/lib/python3.10/site-packages/mbtr/mbtr.py", line 327:
    def leaf_stats(y, edges, x, order):
        <source elided>
            s_left, s_right = None, None
        return s_left, s_right
        ^
    

    Thanks in advance for any pointers or help. The course where I want to present this is a summer course, and it is closing in on me 😉

    opened by jiho 0
Releases: v0.1.3
Owner: SUPSI-DACD-ISAAC