Overview

snc4onnx

Simple tool to combine (merge) ONNX models. Simple Network Combine Tool for ONNX.

https://github.com/PINTO0309/simple-onnx-processing-tools


1. Setup

1-1. HostPC

### option
$ echo export PATH="~/.local/bin:$PATH" >> ~/.bashrc \
&& source ~/.bashrc

### run
$ pip install -U onnx \
&& pip install -U onnx-simplifier \
&& python3 -m pip install -U onnx_graphsurgeon --index-url https://pypi.ngc.nvidia.com \
&& pip install -U snc4onnx

1-2. Docker

### docker pull
$ docker pull pinto0309/snc4onnx:latest

### docker build
$ docker build -t pinto0309/snc4onnx:latest .

### docker run
$ docker run --rm -it -v `pwd`:/workdir pinto0309/snc4onnx:latest
$ cd /workdir

2. CLI Usage

$ snc4onnx -h

usage:
  snc4onnx [-h]
    --input_onnx_file_paths INPUT_ONNX_FILE_PATHS [INPUT_ONNX_FILE_PATHS ...]
    --srcop_destop SRCOP_DESTOP [SRCOP_DESTOP ...]
    [--op_prefixes_after_merging OP_PREFIXES_AFTER_MERGING [OP_PREFIXES_AFTER_MERGING ...]]
    [--output_onnx_file_path OUTPUT_ONNX_FILE_PATH]
    [--output_of_onnx_file_in_the_process_of_fusion]
    [--non_verbose]

optional arguments:
  -h, --help
    show this help message and exit

  --input_onnx_file_paths INPUT_ONNX_FILE_PATHS [INPUT_ONNX_FILE_PATHS ...]
    Input onnx file paths. At least two onnx files must be specified.

  --srcop_destop SRCOP_DESTOP [SRCOP_DESTOP ...]
    The names of the output OP to join from and the input OP to join to,
    in the format: out1 in1 out2 in2 out3 in3 ...
    In other words, to combine model1 and model2:
    --srcop_destop model1_out1 model2_in1 model1_out2 model2_in2
    --srcop_destop can also be specified multiple times.
    The first --srcop_destop specifies the correspondence between model1 and model2,
    and the second --srcop_destop specifies the correspondence
    between the combined model1/model2 and model3.
    Note that the prefix specified in op_prefixes_after_merging
    is prepended to each OP name.
    e.g. to combine model1 with model2 and model3:
    --srcop_destop model1_src_op1 model2_dest_op1 model1_src_op2 model2_dest_op2 ...
    --srcop_destop comb_model12_src_op1 model3_dest_op1 comb_model12_src_op2 model3_dest_op2 ...

  --op_prefixes_after_merging OP_PREFIXES_AFTER_MERGING [OP_PREFIXES_AFTER_MERGING ...]
    Since a single ONNX file cannot contain multiple OPs with the same name,
    a prefix is added to all OPs in each input ONNX model to avoid duplication.
    Specify the same number of prefixes as input_onnx_file_paths.
    e.g. --op_prefixes_after_merging model1_prefix model2_prefix model3_prefix ...

  --output_onnx_file_path OUTPUT_ONNX_FILE_PATH
    Output onnx file path.

  --output_of_onnx_file_in_the_process_of_fusion
    Output of onnx files in the process of fusion.

  --non_verbose
    Do not show all information logs. Only error logs are displayed.

3. In-script Usage

$ python
>>> from snc4onnx import combine
>>> help(combine)

Help on function combine in module snc4onnx.onnx_network_combine:

combine(
  srcop_destop: List[str],
  op_prefixes_after_merging: Union[List[str], NoneType] = [],
  input_onnx_file_paths: Union[List[str], NoneType] = [],
  onnx_graphs: Union[List[onnx.onnx_ml_pb2.ModelProto], NoneType] = [],
  output_onnx_file_path: Union[str, NoneType] = '',
  output_of_onnx_file_in_the_process_of_fusion: Union[bool, NoneType] = False,
  non_verbose: Union[bool, NoneType] = False
) -> onnx.onnx_ml_pb2.ModelProto

    Parameters
    ----------
    srcop_destop: List[str]
        The names of the output OP to join from and the input OP to join to,
        in the format [["out1","in1"], ["out2","in2"], ["out3","in3"]].

        In other words, to combine model1 and model2,
        srcop_destop =
            [
                ["model1_out1_opname","model2_in1_opname"],
                ["model1_out2_opname","model2_in2_opname"]
            ]

        The first srcop_destop specifies the correspondence between model1 and model2,
        and the second srcop_destop specifies the correspondence
        between the combined model1/model2 and model3.
        Note that the prefix specified in op_prefixes_after_merging
        is prepended to each OP name.

        e.g. to combine model1 with model2 and model3:
        srcop_destop =
            [
                [
                    ["model1_src_op1","model2_dest_op1"],
                    ["model1_src_op2","model2_dest_op2"]
                ],
                [
                    ["combined_model1.2_src_op1","model3_dest_op1"],
                    ["combined_model1.2_src_op2","model3_dest_op2"]
                ],
                ...
            ]

    op_prefixes_after_merging: List[str]
        Since a single ONNX file cannot contain multiple OPs with the same name,
        a prefix is added to all OPs in each input ONNX model to avoid duplication.
        Specify the same number of prefixes as input_onnx_file_paths.
        e.g. op_prefixes_after_merging = ["model1_prefix","model2_prefix","model3_prefix", ...]

    input_onnx_file_paths: Optional[List[str]]
        Input onnx file paths. At least two onnx files must be specified.
        Either input_onnx_file_paths or onnx_graphs must be specified.
        If onnx_graphs is specified, input_onnx_file_paths is ignored and onnx_graphs is processed.
        e.g. input_onnx_file_paths = ["model1.onnx", "model2.onnx", "model3.onnx", ...]

    onnx_graphs: Optional[List[onnx.ModelProto]]
        List of onnx.ModelProto. At least two onnx graphs must be specified.
        Either input_onnx_file_paths or onnx_graphs must be specified.
        If onnx_graphs is specified, input_onnx_file_paths is ignored and onnx_graphs is processed.
        e.g. onnx_graphs = [graph1, graph2, graph3, ...]

    output_onnx_file_path: Optional[str]
        Output onnx file path.
        If not specified, .onnx is not output.
        Default: ''

    output_of_onnx_file_in_the_process_of_fusion: Optional[bool]
        Output of onnx files in the process of fusion.
        Default: False

    non_verbose: Optional[bool]
        Do not show all information logs. Only error logs are displayed.
        Default: False

    Returns
    -------
    combined_graph: onnx.ModelProto
        Combined onnx ModelProto
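
For example, a minimal sketch of chaining three models in a script, following the
nested srcop_destop structure described in the help above (all file names and OP
names here are hypothetical and only illustrate the shape of the arguments):

from snc4onnx import combine

# Hypothetical chain: model1 -> model2, then the combined model1+model2 -> model3.
combined_graph = combine(
    srcop_destop = [
        [
            ['model1_src_op1', 'model2_dest_op1'],
        ],
        [
            ['comb_model12_src_op1', 'model3_dest_op1'],
        ],
    ],
    op_prefixes_after_merging = [
        'model1',
        'model2',
        'model3',
    ],
    input_onnx_file_paths = [
        'model1.onnx',
        'model2.onnx',
        'model3.onnx',
    ],
)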

4. CLI Execution

$ snc4onnx \
--input_onnx_file_paths crestereo_init_iter2_120x160.onnx crestereo_next_iter2_240x320.onnx \
--srcop_destop output flow_init \
--op_prefixes_after_merging init next

5. In-script Execution

5-1. ONNX files

from snc4onnx import combine

combined_graph = combine(
    srcop_destop = [
        ['output', 'flow_init']
    ],
    op_prefixes_after_merging = [
        'init',
        'next',
    ],
    input_onnx_file_paths = [
        'crestereo_init_iter2_120x160.onnx',
        'crestereo_next_iter2_240x320.onnx',
    ],
    non_verbose = True,
)
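
The returned object is a plain onnx.ModelProto, so it can be checked and saved with
the standard onnx APIs, for example (a minimal sketch; the output file name is arbitrary):

import onnx

# Sanity-check the merged graph and write it to disk.
onnx.checker.check_model(combined_graph)
onnx.save(combined_graph, 'crestereo_combined_240x320.onnx')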

5-2. onnx.ModelProtos

from snc4onnx import combine

combined_graph = combine(
    srcop_destop = [
        ['output', 'flow_init']
    ],
    op_prefixes_after_merging = [
        'init',
        'next',
    ],
    onnx_graphs = [
        graph1,
        graph2,
    ],
    non_verbose = True,
)
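
Here graph1 and graph2 are ordinary onnx.ModelProto objects loaded beforehand, for
example (a minimal sketch using the same CREStereo files as in 5-1):

import onnx

# Load the models to be combined as onnx.ModelProto objects.
graph1 = onnx.load('crestereo_init_iter2_120x160.onnx')
graph2 = onnx.load('crestereo_next_iter2_240x320.onnx')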

6. Sample

6-1 INPUT <-> OUTPUT

  • Summary

    image

  • Model.1

    image

  • Model.2

    image

  • Merge

    $ snc4onnx \
    --input_onnx_file_paths crestereo_init_iter2_120x160.onnx crestereo_next_iter2_240x320.onnx \
    --op_prefixes_after_merging init next \
    --srcop_destop output flow_init
  • Result

    image image

6-2 INPUT + INPUT

  • Summary

    image

  • Model.1

    image

  • Model.2

    image

  • Merge

    $ snc4onnx \
    --input_onnx_file_paths objectron_camera_224x224.onnx objectron_chair_224x224.onnx \
    --srcop_destop input_1 input_1 \
    --op_prefixes_after_merging camera chair \
    --output_onnx_file_path objectron_camera_chair_224x224.onnx
  • Result

    image image

7. Reference

  1. https://github.com/onnx/onnx/blob/main/docs/PythonAPIOverview.md
  2. https://github.com/PINTO0309/sne4onnx
  3. https://github.com/PINTO0309/snd4onnx
  4. https://github.com/PINTO0309/scs4onnx
  5. https://github.com/PINTO0309/sog4onnx
  6. https://github.com/PINTO0309/PINTO_model_zoo

8. Issues

https://github.com/PINTO0309/simple-onnx-processing-tools/issues

Releases(1.0.11)
  • 1.0.11(Jan 2, 2023)

  • 1.0.10(Jan 2, 2023)

  • 1.0.9(Sep 7, 2022)

    • Added short-form parameters

      $ snc4onnx -h
      
      usage:
        snc4onnx [-h]
          -if INPUT_ONNX_FILE_PATHS [INPUT_ONNX_FILE_PATHS ...]
          -sd SRCOP_DESTOP [SRCOP_DESTOP ...]
          [-opam OP_PREFIXES_AFTER_MERGING [OP_PREFIXES_AFTER_MERGING ...]]
          [-of OUTPUT_ONNX_FILE_PATH]
          [-f]
          [-n]
      
      optional arguments:
        -h, --help
          show this help message and exit.
      
        -if INPUT_ONNX_FILE_PATHS [INPUT_ONNX_FILE_PATHS ...], --input_onnx_file_paths INPUT_ONNX_FILE_PATHS [INPUT_ONNX_FILE_PATHS ...]
            Input onnx file paths. At least two onnx files must be specified.
      
        -sd SRCOP_DESTOP [SRCOP_DESTOP ...], --srcop_destop SRCOP_DESTOP [SRCOP_DESTOP ...]
            The names of the output OP to join from and the input OP to join to are
            out1 in1 out2 in2 out3 in3 ....
            format.
            In other words, to combine model1 and model2,
            --srcop_destop model1_out1 model2_in1 model1_out2 model2_in2
            Also, --srcop_destop can be specified multiple times.
            The first --srcop_destop specifies the correspondence between model1 and model2,
            and the second --srcop_destop specifies the correspondence between
            model1 and model2 combined and model3.
            It is necessary to take into account that the prefix specified
            in op_prefixes_after_merging is
            given at the beginning of each OP name.
            e.g. To combine model1 with model2 and model3.
            --srcop_destop model1_src_op1 model2_dest_op1 model1_src_op2 model2_dest_op2 ...
            --srcop_destop combined_model1.2_src_op1 model3_dest_op1 combined_model1.2_src_op2 model3_dest_op2 ...
      
        -opam OP_PREFIXES_AFTER_MERGING [OP_PREFIXES_AFTER_MERGING ...], --op_prefixes_after_merging OP_PREFIXES_AFTER_MERGING [OP_PREFIXES_AFTER_MERGING ...]
            Since a single ONNX file cannot contain multiple OPs with the same name,
            a prefix is added to all OPs in each input ONNX model to avoid duplication.
            Specify the same number of paths as input_onnx_file_paths.
            e.g. --op_prefixes_after_merging model1_prefix model2_prefix model3_prefix ...
      
        -of OUTPUT_ONNX_FILE_PATH, --output_onnx_file_path OUTPUT_ONNX_FILE_PATH
            Output onnx file path.
      
        -f, --output_of_onnx_file_in_the_process_of_fusion
            Output of onnx files in the process of fusion.
      
        -n, --non_verbose
            Do not show all information logs. Only error logs are displayed.
      
  • 1.0.8(Sep 6, 2022)

    1. Fixed a bug that caused INPUT names to be corrupted. There was a problem with the removal of prefixes added during the model merging process.
      • before: main_input -> put (bug)
      • after: main_input -> input
      • Stopped using lstrip and changed to forward-matching logic with re.sub (see the sketch below)
    2. Added a process to clean up OUTPUT prefixes as much as possible image
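
    A minimal sketch of the behavior described in item 1, assuming a prefix "main" and
    an input named "input" (the release code itself may differ; this only illustrates
    why lstrip corrupted the name and why an anchored re.sub does not):

      import re

      prefix = 'main'
      name = 'main_input'

      # str.lstrip treats its argument as a set of characters, not a literal prefix,
      # so it also strips the leading 'i' and 'n' of 'input'.
      name.lstrip(prefix + '_')                    # -> 'put'  (the bug)

      # An anchored re.sub removes the prefix exactly once, at the start only.
      re.sub(rf'^{re.escape(prefix)}_', '', name)  # -> 'input'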
  • 1.0.7(May 25, 2022)

  • 1.0.6(May 7, 2022)

  • 1.0.5(May 1, 2022)

  • 1.0.4(Apr 27, 2022)

    • Changed op_prefixes_after_merging to be optional
    • Added duplicate OP name check
      • If there is a duplicate OP name, the model cannot be combined and the process is aborted with the following error message.
        ERROR: 
        There is a duplicate OP name after merging models.
        op_name:input count:2, op_name:output count:2
        Avoid duplicate OP names by specifying a prefix in op_prefixes_after_merging.
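
      A minimal in-script sketch of how the prefixes avoid this error (the model files
      and OP names below are hypothetical; both graphs are assumed to expose OPs named
      input and output):

        from snc4onnx import combine

        combined_graph = combine(
            srcop_destop = [
                ['output', 'input']
            ],
            # Without these prefixes both graphs would contribute OPs named
            # 'input' and 'output', triggering the duplicate-OP-name error above.
            op_prefixes_after_merging = [
                'model1',
                'model2',
            ],
            input_onnx_file_paths = [
                'model1.onnx',
                'model2.onnx',
            ],
        )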
        
  • 1.0.3(Apr 24, 2022)

  • 1.0.2(Apr 11, 2022)

  • 1.0.1(Apr 10, 2022)

  • 1.0.0(Apr 10, 2022)

Owner
Katsuya Hyodo
Hobby programmer. Intel Software Innovator Program member.