
Couler

What is Couler?

Couler aims to provide a unified interface for constructing and managing workflows on different workflow engines, such as Argo Workflows, Tekton Pipelines, and Apache Airflow.

Couler is included in CNCF Cloud Native Landscape and LF AI Landscape.

Who uses Couler?

You can find a list of organizations who are using Couler in ADOPTERS.md. If you'd like to add your organization to the list, please send us a pull request.

Why use Couler?

Many workflow engines exist nowadays, e.g., Argo Workflows, Tekton Pipelines, and Apache Airflow. However, their programming experiences vary, and they expose different levels of abstraction that are often obscure and complex. The code snippets below are examples of constructing workflows with Apache Airflow and Kubeflow Pipelines.

Apache Airflow:

from datetime import datetime

from airflow import DAG
from airflow.operators.python_operator import PythonOperator


def create_dag(dag_id,
               schedule,
               dag_number,
               default_args):
    def hello_world_py(*args):
        print('Hello World')

    dag = DAG(dag_id,
              schedule_interval=schedule,
              default_args=default_args)
    with dag:
        t1 = PythonOperator(
            task_id='hello_world',
            python_callable=hello_world_py,
            dag_number=dag_number)
    return dag

for n in range(1, 10):
    dag_id = 'hello_world_{}'.format(str(n))
    default_args = {'owner': 'airflow',
                    'start_date': datetime(2018, 1, 1)}
    globals()[dag_id] = create_dag(dag_id,
                                   '@daily',
                                   n,
                                   default_args)

Kubeflow Pipelines:

import kfp.dsl as dsl
from kfp.dsl import graph_component

class FlipCoinOp(dsl.ContainerOp):
    """Flip a coin and output heads or tails randomly."""
    def __init__(self):
        super(FlipCoinOp, self).__init__(
            name='Flip',
            image='python:alpine3.6',
            command=['sh', '-c'],
            arguments=['python -c "import random; result = \'heads\' if random.randint(0,1) == 0 '
                       'else \'tails\'; print(result)" | tee /tmp/output'],
            file_outputs={'output': '/tmp/output'})

class PrintOp(dsl.ContainerOp):
    """Print a message."""
    def __init__(self, msg):
        super(PrintOp, self).__init__(
            name='Print',
            image='alpine:3.6',
            command=['echo', msg],
        )

# define the recursive operation
@graph_component
def flip_component(flip_result):
    print_flip = PrintOp(flip_result)
    flipA = FlipCoinOp().after(print_flip)
    with dsl.Condition(flipA.output == 'heads'):
        flip_component(flipA.output)

@dsl.pipeline(
    name='pipeline flip coin',
    description='shows how to use graph_component.'
)
def recursive():
    flipA = FlipCoinOp()
    flipB = FlipCoinOp()
    flip_loop = flip_component(flipA.output)
    flip_loop.after(flipB)
    PrintOp('cool, it is over. %s' % flipA.output).after(flip_loop)

Couler provides a unified interface for constructing and managing workflows that offers the following:

  • Simplicity: Unified interface and imperative programming style for defining workflows, with automatic construction of the directed acyclic graph (DAG); see the minimal sketch after this list.
  • Extensibility: Extensible to support various workflow engines.
  • Reusability: Reusable steps for tasks such as distributed training of machine learning models.
  • Efficiency: Automatic workflow and resource optimizations under the hood.
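
For example, the following is a minimal sketch of an imperative Couler workflow that runs a single container and submits it to Argo Workflows. It only uses APIs that appear later in this README (couler.run_container(), ArgoSubmitter, and couler.run()); the whalesay image and the message are illustrative placeholders.

import couler.argo as couler
from couler.argo_submitter import ArgoSubmitter


def hello(message):
    # A single step that runs a container; Couler assembles the workflow DAG.
    return couler.run_container(
        image="docker/whalesay:latest", command=["cowsay"], args=[message]
    )


hello("hello couler")
submitter = ArgoSubmitter()
couler.run(submitter=submitter)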

Please see the following sections for installation guide and examples.

Installation

  • Couler currently only supports Argo Workflows. Please see instructions here to install Argo Workflows on your Kubernetes cluster.
  • Install Python 3.6+
  • Install Couler Python SDK via the following pip command:
pip install git+https://github.com/couler-proj/couler

Alternatively, you can clone this repository and then run the following to install:

python setup.py install
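
To verify that the SDK is importable (a quick sanity check; this command is an assumption, not part of the project's documented instructions), run:

python -c "import couler.argo"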

Examples

Coin Flip

This example combines the use of a Python function result with conditionals to take a dynamic path in the workflow: depending on the result of the first step defined in flip_coin(), the workflow runs either the heads() step or the tails() step.

Steps can be defined via either couler.run_script() for Python functions or couler.run_container() for containers. In addition, the conditional logic that decides which step to run based on the flip result is expressed via the combined use of couler.when() and couler.equal().

import couler.argo as couler
from couler.argo_submitter import ArgoSubmitter


def random_code():
    import random

    res = "heads" if random.randint(0, 1) == 0 else "tails"
    print(res)


def flip_coin():
    return couler.run_script(image="python:alpine3.6", source=random_code)


def heads():
    return couler.run_container(
        image="alpine:3.6", command=["sh", "-c", 'echo "it was heads"']
    )


def tails():
    return couler.run_container(
        image="alpine:3.6", command=["sh", "-c", 'echo "it was tails"']
    )


result = flip_coin()
couler.when(couler.equal(result, "heads"), lambda: heads())
couler.when(couler.equal(result, "tails"), lambda: tails())

submitter = ArgoSubmitter()
couler.run(submitter=submitter)

DAG

This example demonstrates different ways to define the workflow as a directed acyclic graph (DAG) by specifying the dependencies of each task via couler.set_dependencies() and couler.dag(). Please see the code comments for the specific DAG shapes defined in linear() and diamond().

import couler.argo as couler
from couler.argo_submitter import ArgoSubmitter


def job_a(message):
    couler.run_container(
        image="docker/whalesay:latest",
        command=["cowsay"],
        args=[message],
        step_name="A",
    )


def job_b(message):
    couler.run_container(
        image="docker/whalesay:latest",
        command=["cowsay"],
        args=[message],
        step_name="B",
    )


def job_c(message):
    couler.run_container(
        image="docker/whalesay:latest",
        command=["cowsay"],
        args=[message],
        step_name="C",
    )


def job_d(message):
    couler.run_container(
        image="docker/whalesay:latest",
        command=["cowsay"],
        args=[message],
        step_name="D",
    )

#     A
#    / \
#   B   C
#  /
# D
def linear():
    couler.set_dependencies(lambda: job_a(message="A"), dependencies=None)
    couler.set_dependencies(lambda: job_b(message="B"), dependencies=["A"])
    couler.set_dependencies(lambda: job_c(message="C"), dependencies=["A"])
    couler.set_dependencies(lambda: job_d(message="D"), dependencies=["B"])


#   A
#  / \
# B   C
#  \ /
#   D
def diamond():
    couler.dag(
        [
            [lambda: job_a(message="A")],
            [lambda: job_a(message="A"), lambda: job_b(message="B")],  # A -> B
            [lambda: job_a(message="A"), lambda: job_c(message="C")],  # A -> C
            [lambda: job_b(message="B"), lambda: job_d(message="D")],  # B -> D
            [lambda: job_c(message="C"), lambda: job_d(message="D")],  # C -> D
        ]
    )


linear()
submitter = ArgoSubmitter()
couler.run(submitter=submitter)
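
To submit the diamond-shaped workflow instead, call diamond() in place of linear() before creating the ArgoSubmitter.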

Note that the current version only works with Argo Workflows, but we are actively working on the design of a unified interface that is extensible to additional workflow engines. Please stay tuned for more updates; we welcome any feedback and contributions from the community.

Community Blogs and Presentations

Comments
  • feat: Support all script template fields

    What changes were proposed in this pull request?

    This PR makes it so that script templates inherit from the container template and support all the fields that Argo supports.

    Why are the changes needed?

    We were not able to use the script template as is, since a lot of fields were missing.

    Does this PR introduce any user-facing change?

    Yes, the script template signature has additional arguments.

    How was this patch tested?

    We made sure the current tests pass and this use case for script templates is supported.

    opened by rushtehrani 12
  • fix: Fix issue with volume mount

    What changes were proposed in this pull request?

    fix Issue #193

    Why are the changes needed?

    fix Issue #193

    Does this PR introduce any user-facing change?

    No

    How was this patch tested?

    volume_test.py

    opened by peiniliu 8
  • feat: removing stray yaml dump

    What changes were proposed in this pull request?

    This fixes https://github.com/couler-proj/couler/issues/248

    In that issue, @merlintang mentions this section can be removed.

    Why are the changes needed?

    Since this code is not contained in a method, it ends up being called outside of the context of couler/argo.

    Does this PR introduce any user-facing change?

    Only the reduction of output.

    How was this patch tested?

    Removed this and raised an exception and didn't get the yaml dump

    opened by dmerrick 7
  • feat: Add image pull secret support for python src

    What changes were proposed in this pull request?

    add image pull secret config for workflow

    Why are the changes needed?

    Because our project needs to pull images from a private hub.

    Does this PR introduce any user-facing change?

    I think no; it was a new feature.

    How was this patch tested?

    Added one unit test, and our project uses this code to pull images.

    opened by lcgash 7
  • Install instructions don't work in Pycharm `venv`

    Summary

    I'm working with a PyCharm-generated venv (3.8.5), and running the install scripts supplied here doesn't work. The folder in site-packages has no real code in it.

    When I run it from a normal terminal, that is, Python 3.8.5 sans venv, it installs an egg. I don't think this is expected behaviour - I was expecting a wheel...

    Diagnostics

    Mac Big Sur 11.3.1, latest repository version 0.1.1rc8.

    bug 
    opened by moshewe 7
  • [RFC] Add initial design doc for core Couler API

    Hi community,

    We are opening this PR to share our thoughts on supporting multiple workflow engines. We'd appreciate any feedback and suggestions. In addition, if you are interested in contributing either a new backend or functionalities of the existing backend, please let us know in this pull request.

    opened by terrytangyuan 7
  • fix: Relax dependency version pinning.

    What changes were proposed in this pull request?

    To avoid dependency version conflicts with other libraries, pin dependencies to version ranges rather than exact versions.

    Why are the changes needed?

    Let me know if you disagree, but in general I think library code should try to avoid pinning to exact versions. Otherwise it can get very tricky to resolve conflicts for non-trivial projects.

    Does this PR introduce any user-facing change?

    It's possible that there are breaking changes in dependencies that I don't know about! I'll wait for tests to run and see if anything breaks.

    How was this patch tested?

    See above: this change shouldn't require new tests.

    opened by jmcarp 6
  • container volumeMounts cannot mount an existing PVC.

    Summary

    What happened/what you expected to happen?

    We expect the container to use an existing volume defined inside the workflow, as in volumes-existing.yaml.

    example for testing:

    import os
    
    import couler.argo as couler
    from couler.argo_submitter import ArgoSubmitter
    from couler.core.templates.volume import VolumeMount, Volume
    
    couler.add_volume(Volume("apppath", "mnist"))
    
    mount = VolumeMount("apppath", "/data/")
    command = ["ls", mount.mount_path]
    
    couler.run_container(
        image="alpine:3.12.0", command=command, volume_mounts=[mount]
    )
    
    submitter = ArgoSubmitter(namespace="testagent")
    couler.run(submitter=submitter)
    

    OrderedDict([('apiVersion', 'argoproj.io/v1alpha1'), ('kind', 'Workflow'), ('metadata', {'generateName': 'runpy-'}), ('spec', {'entrypoint': 'runpy', 'volumes': [OrderedDict([('name', 'apppath'), ('persistentVolumeClaim', {'claimName': 'mnist'})])], 'templates': [{'name': 'runpy', 'steps': [[OrderedDict([('name', 'module-3418'), ('template', 'module')])]]}, OrderedDict([('name', 'module'), ('container', OrderedDict([('image', 'alpine:3.12.0'), ('command', ['ls', '/data/']), ('volumeMounts', [OrderedDict([('name', 'apppath'), ('mountPath', '/data/')])])])), ('volumes', [{'name': 'apppath', 'emptyDir': {}}])])]})])

    This causes the problem: the volumeMounts inside the container resolve to the volumes generated on the container ('emptyDir') rather than the volumes defined on the workflow (the PVC).

    The reason is this code, which automatically generates volumes for the volumeMounts inside the container:

    https://github.com/couler-proj/couler/blob/d20f874882e55c5e3aa53ffaf78670f6b4d314a0/couler/core/templates/container.py#L146-L153

    After removing the automatically generated 'emptyDir: {}', the volumeMount points to the right volume definition inside the workflow.

    OrderedDict([('apiVersion', 'argoproj.io/v1alpha1'), ('kind', 'Workflow'), ('metadata', {'generateName': 'runpy-'}), ('spec', {'entrypoint': 'runpy', 'volumes': [OrderedDict([('name', 'apppath'), ('persistentVolumeClaim', {'claimName': 'mnist'})])], 'templates': [{'name': 'runpy', 'steps': [[OrderedDict([('name', 'module-3418'), ('template', 'module')])]]}, OrderedDict([('name', 'module'), ('container', OrderedDict([('image', 'alpine:3.12.0'), ('command', ['ls', '/data/']), ('volumeMounts', [OrderedDict([('name', 'apppath'), ('mountPath', '/data/')])])])), ('volumes', [])])]})])

    Diagnostics

    What is the version of Couler you are using?

    latest v0.1.1rc8

    What is the version of the workflow engine you are using?

    argo: v3.0.0-rc3

    Any logs or other information that could help debugging?


    Message from the maintainers:

    Impacted by this bug? Give it a 👍. We prioritize the issues with the most 👍.

    bug good first issue 
    opened by peiniliu 6
  • fix: Support multiple function arguments in couler.map()

    What changes were proposed in this pull request?

    Added changes to accept multiple function arguments in couler.map() (#169), continuing the fix and adding a test function for the modifications.

    I had to start again from scratch. Reason: the previous idea pretty much just loops the arguments through map():

    return map(map(function, input_list), *other)

    But the return value is a Step class, not a function, so the first map() will succeed but not the second or the third, because map() checks the function first.

    inner_step = Step(name=inner_dict["id"], template=template_name)

    return inner_step #32 issue

    Why are the changes needed?

    #32 issue

    Does this PR introduce any user-facing change?

    No.

    How was this patch tested?

    A test was created for it, similar to the one-argument test.

    opened by nooraangelva 5
  • fix: Volume_claim to dynamic

    What changes were proposed in this pull request?

    The proposal would make the VolumeClaimTemplate's size and accessModes dynamic by letting the user input them, so they would not be hardcoded.

    Why are the changes needed?

    #210: While working on a workflow, I noticed that it required accessModes to be ReadWriteMany, not ReadWriteOnce. Since ReadWriteOnce was hardcoded, I thought it would be better if the workflow creator could set it the way she/he needs it. Same for the size.

    Does this PR introduce any user-facing change?

    Users would have to write 3 arguments to the VolumeClaimTemplate instead of one.

    Previous version: volume = VolumeClaimTemplate("workdir")

    Updated version: volume = VolumeClaimTemplate("workdir", ['ReadWriteMany'], '1Gi')

    How was this patch tested?

    The testing was done by following Couler's instructions.

    I added a new scripts/integration_tests.Unix.sh because I encountered the following problem; the solution is in the link as well.

    opened by nooraangelva 5
  • How to specify securityContext

    The k8s cluster I deploy to has a pod security policy, and requires that the Argo workflows have the following, top-level securityContext:

    apiVersion: argoproj.io/v1alpha1
    kind: Workflow
    metadata:
      generateName: main-
    spec:
      securityContext:
         fsGroup: 2000
         runAsNonRoot: true
         runAsUser: 1000
    ...
    

    How can I specify that via couler? I couldn't find anything in the docs.

    opened by kodeninja 5
  • Kubernetes API exception 404

    Hi, I am trying out one of the examples using Argo Workflows and I am getting a Kubernetes API exception. Please find the logs:

    (venv) (base) [email protected] pythonProject1 % python main.py
    INFO:root:Argo submitter namespace: argo
    INFO:root:Found local kubernetes config. Initialized with kube_config.
    INFO:root:Checking workflow name/generatedName main-
    INFO:root:Submitting workflow to Argo
    ERROR:root:Failed to submit workflow
    Traceback (most recent call last):
      File "main.py", line 9, in <module>
        result = couler.run(submitter=submitter)
      File "/Users/saurav/PycharmProjects/pythonProject1/venv/lib/python3.8/site-packages/couler/argo.py", line 73, in run
        res = submitter.submit(wf, secrets=secrets)
      File "/Users/saurav/PycharmProjects/pythonProject1/venv/lib/python3.8/site-packages/couler/argo_submitter.py", line 151, in submit
        return self._create_workflow(workflow_yaml)
      File "/Users/saurav/PycharmProjects/pythonProject1/venv/lib/python3.8/site-packages/couler/argo_submitter.py", line 174, in _create_workflow
        raise e
      File "/Users/saurav/PycharmProjects/pythonProject1/venv/lib/python3.8/site-packages/couler/argo_submitter.py", line 158, in _create_workflow
        response = self._custom_object_api_client.create_namespaced_custom_object(  # noqa: E501
      File "/Users/saurav/PycharmProjects/pythonProject1/venv/lib/python3.8/site-packages/kubernetes/client/api/custom_objects_api.py", line 225, in create_namespaced_custom_object
        return self.create_namespaced_custom_object_with_http_info(group, version, namespace, plural, body, **kwargs)  # noqa: E501
      File "/Users/saurav/PycharmProjects/pythonProject1/venv/lib/python3.8/site-packages/kubernetes/client/api/custom_objects_api.py", line 344, in create_namespaced_custom_object_with_http_info
        return self.api_client.call_api(
      File "/Users/saurav/PycharmProjects/pythonProject1/venv/lib/python3.8/site-packages/kubernetes/client/api_client.py", line 348, in call_api
        return self.__call_api(resource_path, method,
      File "/Users/saurav/PycharmProjects/pythonProject1/venv/lib/python3.8/site-packages/kubernetes/client/api_client.py", line 180, in __call_api
        response_data = self.request(
      File "/Users/saurav/PycharmProjects/pythonProject1/venv/lib/python3.8/site-packages/kubernetes/client/api_client.py", line 391, in request
        return self.rest_client.POST(url,
      File "/Users/saurav/PycharmProjects/pythonProject1/venv/lib/python3.8/site-packages/kubernetes/client/rest.py", line 275, in POST
        return self.request("POST", url,
      File "/Users/saurav/PycharmProjects/pythonProject1/venv/lib/python3.8/site-packages/kubernetes/client/rest.py", line 234, in request
        raise ApiException(http_resp=r)
    kubernetes.client.exceptions.ApiException: (404)
    Reason: Not Found
    HTTP response headers: HTTPHeaderDict({'Audit-Id': '47ae3075-3530-4e82-b495-fc359d544f51', 'Cache-Control': 'no-cache, private', 'Content-Type': 'text/plain; charset=utf-8', 'X-Content-Type-Options': 'nosniff', 'X-Kubernetes-Pf-Flowschema-Uid': '52efdd2c-fc48-4b6a-a014-0f20cb376b7f', 'X-Kubernetes-Pf-Prioritylevel-Uid': '4dbba312-e8f6-4b16-97b2-90fe3b9d7dc0', 'Date': 'Tue, 26 Jul 2022 06:20:02 GMT', 'Content-Length': '19'})
    HTTP response body: 404 page not found

    bug 
    opened by smetal1 0
  • Documentation: Difference to other meta workflow engines

    Couler is a meta workflow engine, but I know of at least ZenML and Kedro, which are meta workflow engines as well. While the self-presentation of these other two is oriented towards machine learning, it seems to me they are generally usable for pretty much any type of work. What is Couler aiming to do differently from ZenML and/or Kedro?

    opened by Make42 3
  • Need additional clarification/examples around using set_dependencies+map

    Summary

    I'm confused on how to properly use dependencies. Let's say I have a workflow with 4 groups of steps (A, B, C, D) and each has multiple subtasks that can happen in parallel (A1, A2, ..., B1, B2, ...). Currently, I'm adding all the A steps using couler.map, then adding all the B steps with couler.map, etc. This correctly parallelizes across A1, A2, ..., but none of the B steps start until all the A steps have completed, despite the fact that I never explicitly set dependencies.

    In this case, I want A and B to run in parallel, then C then D. Having this run sequentially as A, B, C, D is technically correct, but not ideally performant. However, given that I'm not setting dependencies, and they're still running sequentially, I feel like using the set_dependencies function wouldn't help. Also, when I tried to use the set_dependencies function, the couler code errored on parsing its own generated yaml due to duplicate anchor definitions. Would definitely like to see a more in-depth example than those currently present in the README which shows how to properly use set_dependencies in combination with functions like map.

    Use Cases

    Mostly explained above.
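
    For reference, the parallel-groups shape described above (A and B in parallel, then C, then D) can be expressed with explicit dependencies; below is a minimal sketch that reuses couler.set_dependencies() and couler.run_container() exactly as in the DAG example earlier in this README. The group_* step functions are hypothetical placeholders, the sketch assumes the dependencies list may contain multiple step names, and it does not show how this composes with couler.map, which is what this issue asks about.

    import couler.argo as couler

    # Hypothetical stand-ins for the A/B/C/D groups of steps.
    def group_a():
        couler.run_container(image="alpine:3.6", command=["echo", "A"], step_name="A")

    def group_b():
        couler.run_container(image="alpine:3.6", command=["echo", "B"], step_name="B")

    def group_c():
        couler.run_container(image="alpine:3.6", command=["echo", "C"], step_name="C")

    def group_d():
        couler.run_container(image="alpine:3.6", command=["echo", "D"], step_name="D")

    # A and B have no dependencies, so they can run in parallel;
    # C waits on both A and B (assuming multiple entries are allowed), and D waits on C.
    couler.set_dependencies(lambda: group_a(), dependencies=None)
    couler.set_dependencies(lambda: group_b(), dependencies=None)
    couler.set_dependencies(lambda: group_c(), dependencies=["A", "B"])
    couler.set_dependencies(lambda: group_d(), dependencies=["C"])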


    Message from the maintainers:

    Impacted by this bug? Give it a 👍. We prioritize the issues with the most 👍.

    enhancement 
    opened by varunm22 2
  • Revert "home brew" code to official Python client

    Summary

    Change custom code to use the official Python API

    Use Cases

    It's difficult developing new features and onboarding new developers to the codebase, as the underlying structures don't follow the (well-documented) official Python API.


    Message from the maintainers:

    Impacted by this bug? Give it a 👍. We prioritize the issues with the most 👍.

    enhancement 
    opened by moshewe 6
  • Explicit parameter passing between steps

    Summary

    Example:

    out1 = couler.create_parameter_artifact(path="/mnt/test.txt")
    out2 = couler.create_parameter_artifact(path="/mnt/test2.txt")
    
    
    def producer(name):
        return couler.run_container(
            image="alpine:3.6", command=["sh", "-c", 'echo "test" > /mnt/test.txt']
            , step_name=name, output=[out1, out2]
        )
    
    
    def consumer(name):
        inputs = couler.get_step_output(step_name="1")
        return couler.run_container(
            image="alpine:3.6", command=["sh", "-c", 'cat /mnt/test.txt']
            , step_name=name, args=[inputs[0]],
        )
    
    
    couler.set_dependencies(lambda: producer("1"), dependencies=None)
    couler.set_dependencies(lambda: consumer("2"), dependencies=["1"])
    

    Now arguments in consumer template look like this:

    arguments:
      parameters:
        - name: para-2-0
          value: "{{tasks.1.outputs.parameters.output-id-15}}"
        - name: para-2-1
          value: "{{tasks.1.outputs.parameters.output-id-15}}"
        - name: para-2-2
          value: "{{tasks.1.outputs.parameters.output-id-16}}"
    

    It would be useful if there were a way to set a dependency between steps without implicit parameter passing.

    For myself, I just added a flag to run_container that turns off this behavior.

    Use Cases

    I have one parent step that generates a couple of outputs, and I have multiple child steps; each of them needs only a proper subset of the parent's outputs, and the rest of the information would be redundant.


    Message from the maintainers:

    Impacted by this bug? Give it a 👍. We prioritize the issues with the most 👍.

    enhancement 
    opened by hcnt 1
  • Example for usage of secret

    Summary

    What change needs making? An example script showing how to use a created secret successfully. I have tried it myself using the following code from the tests; it creates the secret successfully but fails to echo the secret's value.

    Use Cases

    When would you use this? When I need to use a secret, for example for authentication.

    Message from the maintainers:

    Impacted by this bug? Give it a 👍. We prioritize the issues with the most 👍.

    enhancement good first issue help wanted 
    opened by nooraangelva 2
Releases(v0.1.1rc8-stable)
  • v0.1.1rc8-stable(Apr 12, 2021)

    This release includes compatibility fixes for different protobuf versions, as well as a fix for an unnecessarily raised exception when using /tmp as the mount path. It also introduces support for workflow memoization caches.

    List of changes since the last release can be found here.

  • v0.1.1rc8(Mar 23, 2021)

  • v0.1.1rc7(Dec 2, 2020)

  • v0.1.1rc6(Sep 25, 2020)

    This release includes several bug fixes and enhancements. Below are some of the notable changes:

    • Bump the dependency of Argo Python client to v3.5.1 and re-enable Argo Workflow spec validation.
    • Fix incorrect ApiException import path for Kubernetes Python client with version 11.0.0 and above.
    • Support callables for Couler core APIs instead of only types.FunctionType as before.
    • Switch to Argo Workflows v2.10.2 for integration tests.
  • v0.1.1rc5(Sep 15, 2020)
