A Python pickling decompiler and static analyzer

Overview

Fickling

Fickling is a decompiler, static analyzer, and bytecode rewriter for Python pickle object serializations.

Pickled Python objects are in fact bytecode that is interpreted by a stack-based virtual machine built into Python called the "Pickle Machine". Fickling can take pickled data streams and decompile them into human-readable Python code that, when executed, will deserialize to the original serialized object.
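
For a quick illustration of that bytecode, the standard library's pickletools module (not part of Fickling) can disassemble a pickle into its raw opcodes:

>>> import pickle
>>> import pickletools
>>> pickletools.dis(pickle.dumps([1, 2, 3, 4], protocol=0))  # prints one line per opcode: MARK, LIST, PUT 0, INT 1, APPEND, ..., STOP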

The authors do not prescribe any meaning to the “F” in Fickling; it could stand for “fickle,” … or something else. Divining its meaning is a personal journey in discretion and is left as an exercise to the reader.

Installation

Fickling has been tested on Python 3.6 through Python 3.9 and has very few dependencies. It can be installed through pip:

pip3 install fickling

This installs both the library and the command line utility.

Usage

Fickling can be run programmatically:

>>> import ast
>>> import pickle
>>> from fickling.pickle import Pickled
>>> print(ast.dump(Pickled.load(pickle.dumps([1, 2, 3, 4])).ast, indent=4))
Module(
    body=[
        Assign(
            targets=[
                Name(id='result', ctx=Store())],
            value=List(
                elts=[
                    Constant(value=1),
                    Constant(value=2),
                    Constant(value=3),
                    Constant(value=4)],
                ctx=Load()))])

Fickling can also be run as a command line utility:

$ fickling pickled.data
result = [1, 2, 3, 4]

This is of course a simple example. However, Python pickle bytecode can run arbitrary Python commands (such as exec or os.system), so it is a security risk to unpickle untrusted data. You can test for common patterns of malicious pickle files with the --check-safety option:

$ fickling --check-safety pickled.data
Warning: Fickling failed to detect any overtly unsafe code, but the pickle file may still be unsafe.
Do not unpickle this file if it is from an untrusted source!
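
To see the kind of pickle that --check-safety is designed to catch, here is a minimal sketch that builds an overtly unsafe pickle via __reduce__; the file name and shell payload are purely illustrative:

import os
import pickle

class Evil:
    def __reduce__(self):
        # On unpickling, the pickle machine imports os and calls os.system
        return os.system, ("echo pwned",)

with open("evil.pkl", "wb") as f:
    pickle.dump(Evil(), f)

# Scan it instead of unpickling it:
#   $ fickling --check-safety evil.pkl
# Fickling should flag the call to os.system as overtly unsafe.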

You can also safely trace the execution of the Pickle virtual machine without exercising any malicious code with the --trace option.

Finally, you can inject arbitrary Python code that will be run on unpickling into an existing pickle file with the --inject option.
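
The programmatic counterpart to --inject appears to be Pickled.insert_python_exec; the method name comes from user code in the issues below, and the exact signature is an assumption, so check the source before relying on it:

>>> import pickle
>>> from fickling.pickle import Pickled
>>> p = Pickled.load(pickle.dumps([1, 2, 3, 4]))
>>> p.insert_python_exec('print("hello from the pickle machine")')  # assumed to take a string of Python source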

License

This utility was developed by Trail of Bits. It is licensed under the GNU Lesser General Public License v3.0. Contact us if you're looking for an exception to the terms. © 2021, Trail of Bits.

Comments
  • Injections not cleaning up after themselves.

    The injected malicious code doesn't clean up the stack after itself, which is what prevents it from being injected into arbitrary locations. It is also the easiest way to detect pickles you've injected into: a "correct" pickle leaves only one value on the stack when everything is done, the pointer to the final object. I've never seen a real pickle not comply with this, so using pickletools.dis or your symbolic interpreter you can detect pickles you've injected into, because they leave two values on the stack whether you inject at the beginning or the end.

    You can make the injections more covert by adding a pop instruction to the end so that it cleans up after itself. You would then also be able to inject into an arbitrary location like I do in https://github.com/coldwaterq/pickle_injector/blob/main/inject.py.

    For replacing the output you would add the pop instruction to the beginning of your payload / end of the real pickle, to throw away everything created and replace it with what you create.
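
    A rough sketch of the detection idea described above, relying on pickletools' built-in stack bookkeeping (the function name is just illustrative):

    import io
    import pickletools

    def looks_injected(pickled_bytes: bytes) -> bool:
        # A well-formed pickle leaves exactly one value on the stack; a payload
        # that never POPs leaves extra values, which pickletools.dis reports as
        # "stack not empty after STOP".
        try:
            pickletools.dis(pickled_bytes, out=io.StringIO())
            return False
        except ValueError as e:
            return "stack not empty after STOP" in str(e)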

    opened by coldwaterq 4
  • NotImplementedError: TODO: Add support for Opcode BININT

    File "/Users/abenavides/workspace/enricher/fury_fda-models-hub-enricher/venvpython3/lib/python3.8/site-packages/fickling/pickle.py", line 106, in new raise NotImplementedError(f"TODO: Add support for Opcode {info.name}") NotImplementedError: TODO: Add support for Opcode BININT

    bug enhancement 
    opened by abenavidesmeli 2
  • NotImplementedError: TODO: Add support for Opcode BINFLOAT

    I was trying out something sophisticated with a simple model pre-trained on MNIST, but I got this error:

    Traceback (most recent call last):
      File ".\pytorch_poc.py", line 147, in <module>
        exfil_model.pickled.insert_python_exec(
      File ".\pytorch_poc.py", line 58, in pickled
        self._pickled = Pickled.load(pickle_file)
      File "C:\Python38\lib\site-packages\fickling\pickle.py", line 343, in load
        opcodes.append(Opcode(info=info, argument=arg, data=data, position=pos))
      File "C:\Python38\lib\site-packages\fickling\pickle.py", line 105, in __new__
        raise NotImplementedError(f"TODO: Add support for Opcode {info.name}")
    NotImplementedError: TODO: Add support for Opcode BINFLOAT
    

    I guess the project still needs more work before a full-fledged ML-based attack is possible. Any plans for when this will be completed?

    opened by shreyansh26 2
  • Bump pypa/gh-action-pip-audit from 1.0.2 to 1.0.3

    Bumps pypa/gh-action-pip-audit from 1.0.2 to 1.0.3.

    Release notes

    Sourced from pypa/gh-action-pip-audit's releases.

    Release 1.0.3

    Full Changelog: https://github.com/pypa/gh-action-pip-audit/compare/v1.0.2...v1.0.3

    Commits

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


    Dependabot commands and options

    You can trigger Dependabot actions by commenting on this PR:

    • @dependabot rebase will rebase this PR
    • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
    • @dependabot merge will merge this PR after your CI passes on it
    • @dependabot squash and merge will squash and merge this PR after your CI passes on it
    • @dependabot cancel merge will cancel a previously requested merge and block automerging
    • @dependabot reopen will reopen this PR if it is closed
    • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
    • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
    dependencies 
    opened by dependabot[bot] 1
  • Bump pypa/gh-action-pip-audit from 1.0.0 to 1.0.1

    Bumps pypa/gh-action-pip-audit from 1.0.0 to 1.0.1.

    Release notes

    Sourced from pypa/gh-action-pip-audit's releases.

    Release 1.0.1

    What's Changed

    New Contributors

    Full Changelog: https://github.com/pypa/gh-action-pip-audit/compare/v1.0.0...v1.0.1

    Commits

    Dependabot compatibility score

    dependencies 
    opened by dependabot[bot] 1
  • Make sure to inject our code after PROTO and FRAME

    Resolves issue #30.

    When injecting new code, preserve leading PROTO and FRAME opcodes.

    Also adds an analysis to detect invalid PROTO opcodes that can be an indicator of tampering.
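
    Not the PR's actual implementation, but a rough sketch of the kind of PROTO check being described, using pickletools:

    import pickletools

    def has_out_of_place_proto(pickled_bytes: bytes) -> bool:
        # A PROTO opcode anywhere but offset 0, or more than one of them, is a
        # plausible sign that two pickles were spliced together. (Protocol 0/1
        # pickles legitimately have no PROTO opcode at all.)
        positions = [pos for op, arg, pos in pickletools.genops(pickled_bytes) if op.name == "PROTO"]
        return any(pos != 0 for pos in positions) or len(positions) > 1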

    bug enhancement 
    opened by ESultanik 1
  • Errors when scanning Stable Diffusion/Textual Inversion embeddings pickle file

    I'm trying to give the Stable Diffusion community the ability to trade Textual Inversion embeddings (basically, fine-tuning the model) with each other. When I run fickling against one of my embeddings, I see this:

    (base) [email protected]:~/Downloads/archive$ fickling -t data.pkl

    PROTO
    EMPTY_DICT
        Pushed {}
    BINPUT
        Memoized 0 -> {}
    MARK
        Pushed MARK
    BINUNICODE
        Pushed 'string_to_token'
    BINPUT
        Memoized 1 -> 'string_to_token'
    EMPTY_DICT
        Pushed {}
    BINPUT
        Memoized 2 -> {}
    BINUNICODE
        Pushed ''
    BINPUT
        Memoized 3 -> ''
    GLOBAL
    Traceback (most recent call last):
      File "/home/berble/.local/bin/fickling", line 8, in <module>
        sys.exit(main())
      File "/home/berble/.local/lib/python3.8/site-packages/fickling/cli.py", line 82, in main
        print(unparse(trace.run()))
      File "/home/berble/.local/lib/python3.8/site-packages/fickling/tracing.py", line 54, in run
        self.on_statement(added)
      File "/home/berble/.local/lib/python3.8/site-packages/fickling/tracing.py", line 38, in on_statement
        print(f"\t{unparse(statement).strip()}")
      File "/home/berble/.local/lib/python3.8/site-packages/astunparse/__init__.py", line 13, in unparse
        Unparser(tree, file=v)
      File "/home/berble/.local/lib/python3.8/site-packages/astunparse/unparser.py", line 38, in __init__
        self.dispatch(tree)
      File "/home/berble/.local/lib/python3.8/site-packages/astunparse/unparser.py", line 66, in dispatch
        meth(tree)
      File "/home/berble/.local/lib/python3.8/site-packages/astunparse/unparser.py", line 113, in _ImportFrom
        interleave(lambda: self.write(", "), self.dispatch, t.names)
      File "/home/berble/.local/lib/python3.8/site-packages/astunparse/unparser.py", line 19, in interleave
        f(next(seq))
      File "/home/berble/.local/lib/python3.8/site-packages/astunparse/unparser.py", line 66, in dispatch
        meth(tree)
      File "/home/berble/.local/lib/python3.8/site-packages/astunparse/unparser.py", line 856, in _alias
        if t.asname:
    AttributeError: 'alias' object has no attribute 'asname'

    Any idea where I could start looking? We'd really like to be able to share embeddings safely!

    Here's a base64-encoded copy of my data.pkl:

    gAJ9cQAoWA8AAABzdHJpbmdfdG9fdG9rZW5xAX1xAlgBAAAAKnEDY3RvcmNoLl91dGlscwpfcmVi dWlsZF90ZW5zb3JfdjIKcQQoKFgHAAAAc3RvcmFnZXEFY3RvcmNoCkxvbmdTdG9yYWdlCnEGWAEA AAAwcQdYAwAAAGNwdXEIS010cQlRSwEpKYljY29sbGVjdGlvbnMKT3JkZXJlZERpY3QKcQopUnEL dHEMUnENc1gPAAAAc3RyaW5nX3RvX3BhcmFtcQ5jdG9yY2gubm4ubW9kdWxlcy5jb250YWluZXIK UGFyYW1ldGVyRGljdApxDymBcRB9cREoWAgAAAB0cmFpbmluZ3ESiFgLAAAAX3BhcmFtZXRlcnNx E2gKKVJxFGgDY3RvcmNoLl91dGlscwpfcmVidWlsZF9wYXJhbWV0ZXIKcRVoBCgoaAVjdG9yY2gK RmxvYXRTdG9yYWdlCnEWWAEAAAAxcRdYBgAAAGN1ZGE6MHEYTQADdHEZUUsASwFNAAOGcRpNAANL AYZxG4loCilScRx0cR1ScR6IaAopUnEfh3EgUnEhc1gIAAAAX2J1ZmZlcnNxImgKKVJxI1gbAAAA X25vbl9wZXJzaXN0ZW50X2J1ZmZlcnNfc2V0cSRjX19idWlsdGluX18Kc2V0CnElXXEmhXEnUnEo WA8AAABfYmFja3dhcmRfaG9va3NxKWgKKVJxKlgWAAAAX2lzX2Z1bGxfYmFja3dhcmRfaG9va3Er TlgOAAAAX2ZvcndhcmRfaG9va3NxLGgKKVJxLVgSAAAAX2ZvcndhcmRfcHJlX2hvb2tzcS5oCilS cS9YEQAAAF9zdGF0ZV9kaWN0X2hvb2tzcTBoCilScTFYGgAAAF9sb2FkX3N0YXRlX2RpY3RfcHJl X2hvb2tzcTJoCilScTNYGwAAAF9sb2FkX3N0YXRlX2RpY3RfcG9zdF9ob29rc3E0aAopUnE1WAgA AABfbW9kdWxlc3E2aAopUnE3WAUAAABfa2V5c3E4fXE5aANOc3VidS4=

    bug 
    opened by BeanCounterTop 1
  • Error using check-safety/trace features (AttributeError: 'alias' object has no attribute 'asname')

    Hello! Great tool, I like that it also includes a way to check for potentially malicious opcodes in pickle files.

    I injected a payload into a stylegan2-ada pickle file and it behaves as expected. :)

    Now, when running either the --check-safety or the --trace command, the following error is shown:

    !fickling --check-safety /tmp/network-snapshot-000250.backdoor.pkl
    
    Traceback (most recent call last):
      File "/usr/local/bin/fickling", line 8, in <module>
        sys.exit(main())
      File "/usr/local/lib/python3.7/dist-packages/fickling/cli.py", line 79, in main
        return [1, 0][check_safety(pickled)]
      File "/usr/local/lib/python3.7/dist-packages/fickling/analysis.py", line 38, in check_safety
        shortened, already_reported = shorten_code(node)
      File "/usr/local/lib/python3.7/dist-packages/fickling/analysis.py", line 23, in shorten_code
        code = unparse(ast_node).strip()
      File "/usr/local/lib/python3.7/dist-packages/astunparse/__init__.py", line 13, in unparse
        Unparser(tree, file=v)
      File "/usr/local/lib/python3.7/dist-packages/astunparse/unparser.py", line 38, in __init__
        self.dispatch(tree)
      File "/usr/local/lib/python3.7/dist-packages/astunparse/unparser.py", line 66, in dispatch
        meth(tree)
      File "/usr/local/lib/python3.7/dist-packages/astunparse/unparser.py", line 113, in _ImportFrom
        interleave(lambda: self.write(", "), self.dispatch, t.names)
      File "/usr/local/lib/python3.7/dist-packages/astunparse/unparser.py", line 19, in interleave
        f(next(seq))
      File "/usr/local/lib/python3.7/dist-packages/astunparse/unparser.py", line 66, in dispatch
        meth(tree)
      File "/usr/local/lib/python3.7/dist-packages/astunparse/unparser.py", line 856, in _alias
        if t.asname:
    AttributeError: 'alias' object has no attribute 'asname'
    

    Let me know if there is anything more needed to debug the issue.

    Greetings!

    bug duplicate 
    opened by wunderwuzzi23 1
  • NotImplementedError: TODO: Add support for Opcode LONG1

    When attempting to use fickling on PyTorch models I get this error. I believe these models were just the weights. So I'm curious whether this is hard to fix, and if you don't have time to fix it, any guidance you can give about the code base to help me attempt to patch it would be appreciated.

    opened by coldwaterq 1
  • Add DICT and INT opcodes

    Two very simple additions: The INT opcode is used to declare constant integer values. The DICT opcode reads values from the stack until it reaches a MARK opcode, alternating between keys and values.

    Take the following script:

    import ast
    import pickletools
    from fickling.pickle import Pickled

    if __name__ == '__main__':
        pickled = b"(I1\nI2\nd."

        for op, arg, pos in pickletools.genops(pickled):
            print(f"{pos}: {op.name} {arg}")

        ast_data = Pickled.load(pickled).ast
        print(ast.dump(ast_data))
    

    Before, it would output:

    0: MARK None
    1: INT 1
    4: INT 2
    7: DICT None
    8: STOP None
    Traceback (most recent call last):
      File "test.py", line 12, in <module>
        ast_data = Pickled.load(pickled).ast
      File "/home/carlos/.local/lib/python3.7/site-packages/fickling/pickle.py", line 343, in load
        opcodes.append(Opcode(info=info, argument=arg, data=data, position=pos))
      File "/home/carlos/.local/lib/python3.7/site-packages/fickling/pickle.py", line 105, in __new__
        raise NotImplementedError(f"TODO: Add support for Opcode {info.name}")
    NotImplementedError: TODO: Add support for Opcode INT
    

    Now it outputs:

    0: MARK None
    1: INT 1
    4: INT 2
    7: DICT None
    8: STOP None
    Module(body=[Assign(targets=[Name(id='result', ctx=Store())], value=Dict(keys=[Constant(value=1)], values=[Constant(value=2)]))])
    
    opened by 00xc 1
  • Add EMPTY_SET Opcode

    Thanks for this great tool! This opcode appears to be necessary to fickle some uses of PyTorch modules, such as torch.nn.Linear, which I'd love to have support for.

    Here is an example which triggers the following error:

    # example.py
    import pickle
    from fickling.pickle import Pickled
    from torch import nn
    
    filename = "model.pt"
    model = nn.Linear(2, 1)
    
    with open(filename, "wb") as model_file:
        pickle.dump(model, model_file)
    
    with open(filename, "rb") as model_file:
        pickled = Pickled.load(model_file)
    
    $ python example.py
    Traceback (most recent call last):
      File "~/ml-attacks/src/pickle_deserialization/example.py", line 12, in <module>
        pickled = Pickled.load(model_file)
      File "~/ml-attacks/.venv/lib/python3.9/site-packages/fickling/pickle.py", line 343, in load
        opcodes.append(Opcode(info=info, argument=arg, data=data, position=pos))
      File "~/ml-attacks/.venv/lib/python3.9/site-packages/fickling/pickle.py", line 105, in __new__
        raise NotImplementedError(f"TODO: Add support for Opcode {info.name}")
    NotImplementedError: TODO: Add support for Opcode EMPTY_SET
    

    Is anything else needed?

    opened by willclarktech 1
  • Bump pypa/gh-action-pip-audit from 1.0.2 to 1.0.4

    Bumps pypa/gh-action-pip-audit from 1.0.2 to 1.0.4.

    Release notes

    Sourced from pypa/gh-action-pip-audit's releases.

    Release 1.0.4

    Full Changelog: https://github.com/pypa/gh-action-pip-audit/compare/v1.0.3...v1.0.4

    Release 1.0.3

    Full Changelog: https://github.com/pypa/gh-action-pip-audit/compare/v1.0.2...v1.0.3

    Commits

    Dependabot compatibility score

    dependencies 
    opened by dependabot[bot] 0
  • runpy._run_code and torch.jit.unsupported_tensor_ops.execWrapper

    Hello,

    I've been playing around with some alternative ways to execute Python via pickles, and discovered both runpy._run_code and torch.jit.unsupported_tensor_ops.execWrapper can be used to call into exec without fickling detecting it. I have some demo code here that will create pickles using these techniques: https://bitbucket.org/hiddenlayersec/sai/src/master/pytorch_inject/torch_picke_inject.py

    runpy._run_code produces no warnings, and execWrapper generates a "Call to execWrapper(...) can execute arbitrary code and is inherently unsafe" warning.

    It might be worth adding explicit checks for both of these methods and flagging them as overtly unsafe.
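
    For reference, a minimal sketch of the runpy-based gadget described above; the payload string and class name are just illustrative, and the linked demo repository is the authoritative version:

    import pickle
    import runpy

    class Payload:
        def __reduce__(self):
            # runpy._run_code ultimately calls exec(code, run_globals), so a plain
            # source string and an empty globals dict are enough
            return runpy._run_code, ("print('code executed on unpickling')", {})

    malicious = pickle.dumps(Payload())
    # Unpickling would run the embedded code; per the report above, fickling
    # currently raises no warning for this pattern.
    # pickle.loads(malicious)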

    Many thanks btw for the awesome library!

    Best regards,

    Tom

    opened by hidden-tom 0
  • Possible to apply heuristics scan to pickle files?

    I'm not so familiar with pickling and these scans. However, I wondered if maybe there are heuristics or signatures for certain types of pickle files that could be evaluated.

    If you knew for example that a pickle file should be for a stable diffusion model, some properties could be examined that might help to verify a bit more.

    If so, you could set up something like a /signatures directory and let people pull-request definitions in; then you could scan with something like -security -sig='signatures/typename'.

    This can be closed, just wanted to pass the idea by in case it could be useful

    opened by neural-loop 0
  • Add direct support for PyTorch/TorchScript serialized models

    Right now, pytorch_poc.py injects malicious code into the pickle files contained within the standard PyTorch model format. Both this format and the TorchScript serialization format are ZIP archives containing pickle files. It would be great to expand upon this and provide users with easy-to-use functions that can directly manipulate these files, since they're relatively common.
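
    A rough sketch of what such a helper might wrap today; the *.pkl entry lookup is an assumption about the archive layout, and Fickling may still lack support for some opcodes these files use, as other issues here show:

    import zipfile
    from fickling.pickle import Pickled

    with zipfile.ZipFile("model.pt") as archive:
        # PyTorch's ZIP-based format stores the object graph in a *.pkl entry
        pickle_entries = [name for name in archive.namelist() if name.endswith(".pkl")]
        with archive.open(pickle_entries[0]) as pickle_file:
            pickled = Pickled.load(pickle_file)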

    enhancement 
    opened by suhacker1 2