Tortoise ORM is an easy-to-use asyncio ORM inspired by Django.

Overview

Tortoise ORM

Introduction

Tortoise ORM is an easy-to-use asyncio ORM (Object Relational Mapper) inspired by Django.

Tortoise ORM was built with relations in mind, and with admiration for the excellent and popular Django ORM. It is engraved in its design that you are working not just with tables, but with relational data.

You can find the docs at ReadTheDocs.

Note

Tortoise ORM is a young project, so breaking changes are to be expected. We keep a Changelog, and any possible breakage will be clearly documented there.

Tortoise ORM is supported on CPython >= 3.7 for SQLite, MySQL and PostgreSQL.

Why was Tortoise ORM built?

Python has many existing and mature ORMs; unfortunately, they are designed with an opposing paradigm of how I/O gets processed. asyncio is a relatively new technology with a very different concurrency model, and the largest change concerns how I/O is handled.

However, Tortoise ORM is not the first attempt at building an asyncio ORM. There are many cases of developers attempting to map synchronous Python ORMs to the async world, and the initial attempts did not have a clean API.

Hence we started Tortoise ORM.

Tortoise ORM is designed to be functional, yet familiar, to ease the migration of developers wishing to switch to asyncio.

It also performs well when compared to other Python ORMs, trading places with Pony ORM:

https://raw.githubusercontent.com/tortoise/tortoise-orm/develop/docs/ORM_Perf.png

How is an ORM useful?

When you build an application or service that uses a relational database, there comes a point when you can't get away with just parameterized queries or even a query builder: you keep repeating yourself, writing slightly different code for each entity. The code has no idea about the relations between your data, so you end up joining it together almost manually. It is also easy to make mistakes in how you access your database, which opens the door to SQL-injection attacks. Your data rules end up scattered across the codebase, increasing the complexity of managing your data, and, even worse, being applied inconsistently.

An ORM (Object Relational Mapper) is designed to address these issues by centralising your data model and data rules, ensuring that your data is accessed safely (providing immunity to SQL injection), and keeping track of relationships so you don't have to.
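The "parameterized queries" baseline mentioned above can be illustrated with the standard library alone; a minimal sketch (the table and data are made up for the example):

```python
import sqlite3

# Minimal illustration of why parameterized queries resist SQL injection:
# user input is bound as data, never spliced into the SQL text.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users (name) VALUES (?)", ("alice",))

# A classic injection payload stays inert as a bound parameter.
payload = "alice' OR '1'='1"
rows = conn.execute("SELECT id FROM users WHERE name = ?", (payload,)).fetchall()
print(rows)  # → [] (the payload matched nothing; it was treated as a literal)
```

An ORM gives you this safety by default, since it builds every query with bound parameters.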

Getting Started

Installation

First, install Tortoise ORM:

pip install tortoise-orm

You can also install it with your db driver (aiosqlite is built in). For PostgreSQL:

pip install tortoise-orm[asyncpg]

Or for MySQL:

pip install tortoise-orm[aiomysql]

Or another asyncio MySQL driver asyncmy:

pip install tortoise-orm[asyncmy]

Quick Tutorial

The primary entity of Tortoise ORM is tortoise.models.Model. You can start writing models like this:

from tortoise.models import Model
from tortoise import fields

class Tournament(Model):
    id = fields.IntField(pk=True)
    name = fields.TextField()

    def __str__(self):
        return self.name


class Event(Model):
    id = fields.IntField(pk=True)
    name = fields.TextField()
    tournament = fields.ForeignKeyField('models.Tournament', related_name='events')
    participants = fields.ManyToManyField('models.Team', related_name='events', through='event_team')

    def __str__(self):
        return self.name


class Team(Model):
    id = fields.IntField(pk=True)
    name = fields.TextField()

    def __str__(self):
        return self.name

After defining all your models, Tortoise ORM needs you to initialise them, in order to create backward relations between models and match your db client with the appropriate models.

You can do it like this:

from tortoise import Tortoise

async def init():
    # Here we connect to a SQLite DB file.
    # also specify the app name of "models"
    # which contain models from "app.models"
    await Tortoise.init(
        db_url='sqlite://db.sqlite3',
        modules={'models': ['app.models']}
    )
    # Generate the schema
    await Tortoise.generate_schemas()

Here we create a connection to a SQLite database file in the local directory, called db.sqlite3, and then discover & initialise the models.

Tortoise ORM currently supports the following databases:

  • SQLite (requires aiosqlite)
  • PostgreSQL (requires asyncpg)
  • MySQL (requires aiomysql)

generate_schemas generates the schema on an empty database. Tortoise generates schemas in safe mode by default, which includes the IF NOT EXISTS clause, so you can safely include this call in your main code.
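What "safe mode" means at the SQL level can be sketched with the stdlib sqlite3 module (the DDL below is illustrative, not Tortoise's exact output):

```python
import sqlite3

# Safe-mode schema generation emits CREATE TABLE IF NOT EXISTS, so running
# it against a database that already has the table is a no-op, not an error.
ddl = "CREATE TABLE IF NOT EXISTS tournament (id INTEGER PRIMARY KEY, name TEXT)"
conn = sqlite3.connect(":memory:")
conn.execute(ddl)
conn.execute(ddl)  # second run succeeds only because of IF NOT EXISTS
print("schema created once, re-run safely")
```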

After that you can start using your models:

# Create instance by save
tournament = Tournament(name='New Tournament')
await tournament.save()

# Or by .create()
await Event.create(name='Without participants', tournament=tournament)
event = await Event.create(name='Test', tournament=tournament)
participants = []
for i in range(2):
    team = await Team.create(name='Team {}'.format(i + 1))
    participants.append(team)

# M2M Relationship management is quite straightforward
# (also look for methods .remove(...) and .clear())
await event.participants.add(*participants)

# You can query related entity just with async for
async for team in event.participants:
    pass

# After making related query you can iterate with regular for,
# which can be extremely convenient for using with other packages,
# for example some kind of serializers with nested support
for team in event.participants:
    pass


# Or you can make preemptive call to fetch related objects
selected_events = await Event.filter(
    participants=participants[0].id
).prefetch_related('participants', 'tournament')

# Tortoise supports variable depth of prefetching related entities
# This will fetch all events for team and in those events tournaments will be prefetched
await Team.all().prefetch_related('events__tournament')

# You can filter and order by related models too
await Tournament.filter(
    events__name__in=['Test', 'Prod']
).order_by('-events__participants__name').distinct()

Migration

Tortoise ORM uses Aerich as its database migration tool; see its docs for more detail.

Contributing

Please have a look at the Contribution Guide

License

This project is licensed under the Apache License - see the LICENSE.txt file for details

Comments
  • A unified, robust and bug-free connection management interface for the ORM

    A unified, robust and bug-free connection management interface for the ORM

    Description

    This PR provides for a much more robust implementation of the connection management interface of the ORM which is primarily geared towards improving performance and usability.

    Motivation and Context

    I was really excited that a native asyncIO based ORM was in town and implemented a lot of API constructs similar to the Django ORM since I personally like the way queries are expressed in Django. When I was tinkering around,

    • The first thing I noticed was that connections to the DB were being established on boot-up which I wasn't very comfortable with.
    • The second thing I noticed, after tinkering around with the source code, was that though the docs said that Tortoise.get_connection returns a connection for a given DB alias, there was another construct named current_transaction_map and another method named get_connection in tortoise/transactions.py that was being used internally for storing and retrieving connections.

    This PR tries to address the above problems/inconsistencies as follows:

    • A unified connection management interface for accessing, modifying and deleting connections using asyncio native constructs.
    • Lazy connection creation i.e the underlying connection to the DB gets established only upon execution of a query.
    • Fixes a lot of bugs that could occur due to connections being stored in two different places ( Tortoise._connections and current_transaction_map). Also, the usage of ContextVars has been done in the most optimal way.
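    The lazy-connection idea described above can be sketched in plain asyncio (illustrative only, not the PR's actual code; the class and method names are hypothetical):

```python
import asyncio

class LazyConnection:
    """Sketch of lazy connection creation: the expensive connect happens
    on first query, not at init/boot time."""

    def __init__(self, dsn: str):
        self.dsn = dsn
        self._conn = None  # nothing established at construction time

    async def _ensure_connected(self):
        if self._conn is None:
            await asyncio.sleep(0)  # stand-in for the real handshake
            self._conn = object()   # stand-in for a driver connection

    async def execute(self, query: str):
        await self._ensure_connected()  # connect only on first use
        return f"executed: {query}"

async def main():
    db = LazyConnection("sqlite://db.sqlite3")
    assert db._conn is None       # no connection after construction
    print(await db.execute("SELECT 1"))
    assert db._conn is not None   # established by the first query

asyncio.run(main())
```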

    How Has This Been Tested?

    • All test cases present currently pass for all environments. Once the design and solution are agreed upon, I could write tests for the newly added code.
    • Any code attempting to access a specific connection using an alias has now been refactored to use the new connections interface (including all existing tests)

    Once the solution has been accepted and reviewed by the core team, I can update the tests, docs and changelog accordingly.

    Checklist:

    • [x] My code follows the code style of this project.
    • [x] My change requires a change to the documentation.
    • [ ] I have updated the documentation accordingly.
    • [ ] I have added the changelog accordingly.
    • [x] I have read the CONTRIBUTING document.
    • [ ] I have added tests to cover my changes.
    • [x] All new and existing tests passed.
    opened by blazing-gig 52
  • Migrations

    Migrations

    (edited by @grigi ) Migrations is currently planned as a post-v1.0 feature.

    Some useful resources for getting this working right now:

    • https://github.com/tortoise/tortoise-orm/issues/8#issuecomment-534946871 (Native python, MySQL)
    • https://github.com/tortoise/tortoise-orm/issues/8#issuecomment-575982472 (Go, MySQL/PostgreSQL/SQLite)

    Forward migrations

    Best guess at this time for a complete solution, not a "quick" solution:

    • [ ] Make the describe_model() contain all the data one needs to generate DDL from.
    • [ ] Generate a sequence of high-level DDL instructions from describe_model()
    • [ ] Port generate_schema to use these high-level DDL instructions to generate a schema in the requested dialect. (This would require some decoupling from the driver instance; look at #72)
    • [ ] Build a persistence model (read/write) for high-level DDL instructions, much like Django Migrations
    • [ ] And need a way to run "management" commands. So possibly a standard command-line utility.
    • [ ] Build a full model (e.g. the same format as describe_model()) from a series of high-level DDL instructions (as in, read the migration persistence model)
    • [ ] Diff models, and generate a diff DDL instruction set.
    • [ ] If a Diff requires a default not pre-specified, we need to ask the user for something.
    • [ ] Have some way of determining the "version" of the models persisted in the DB.
    • [ ] Put it all together and make it work

    I'm not particularly happy about many migration systems storing state in the DB itself, as for some environments that won't work, but it is a very good place from a consistency PoV. We should have a backup for when we generate DDL that one would pass on to the DB team, in cases where DDL changes have an established process (e.g. enterprise/consulting).
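    The "diff models, and generate a diff DDL instruction set" step could look roughly like this (purely illustrative; the column descriptions and emitted SQL are hypothetical, not describe_model()'s real format):

```python
# Illustrative-only sketch of diffing two model descriptions into DDL.
def diff_columns(old: dict, new: dict, table: str) -> list:
    """Compare two {column: type} mappings and emit ALTER statements."""
    stmts = []
    for col, typ in new.items():
        if col not in old:
            stmts.append(f'ALTER TABLE "{table}" ADD COLUMN "{col}" {typ}')
    for col in old:
        if col not in new:
            stmts.append(f'ALTER TABLE "{table}" DROP COLUMN "{col}"')
    return stmts

old = {"id": "INT", "name": "TEXT"}
new = {"id": "INT", "name": "TEXT", "created": "TIMESTAMP"}
print(diff_columns(old, new, "event"))
# → ['ALTER TABLE "event" ADD COLUMN "created" TIMESTAMP']
```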

    Data migrations

    Proper data migration management makes this very useful:

    • [ ] Allow custom user scripts as migrations
    • [ ] Allow data-migrations to get a version of a model as it was at that time. So ability to build a model class from the intermediate describe_model() data
    • [ ] Handle DB edge cases, like Postgres being able to do DDL changes in a transaction OR data changes in a transaction, but not both.

    Backward migrations

    There are some cases where backwards migrations cannot be done, because a forward migration clobbered/destroyed some data. In cases where we detect that we need missing data, we should request a sane default. What do we do about data migrations? Is it safe to just emit a warning?

    discussion Future Release 
    opened by abondar 32
  • pymysql.err.InternalError: Packet sequence number wrong - got X expected 1

    pymysql.err.InternalError: Packet sequence number wrong - got X expected 1

    Describe the bug I use aiohttp + tortoise-orm + MySQL for a simple API, which can only insert a row into a single MySQL table called "TableName" and read rows from it. If my API is idle for some time, say 3 hours, and I then make an API request that should return table rows, it shows "pymysql.err.InternalError: Packet sequence number wrong - got 0 expected 1" (not necessarily 0; it could be another number) and status 500 instead of returning the rows.

    It shows this error when running this piece of code: rows = await TableName.filter(some_field=some_field)

    To Reproduce

    1. Create models.py file:
    from tortoise import Model, fields
    
    
    class TableName(Model):
        id = fields.IntField(pk=True)
        some_field = fields.CharField(250)
    
    2. Create main.py file:
    from aiohttp import web
    from tortoise.contrib.aiohttp import register_tortoise
    
    from models import TableName
    
    
    async def list_rows(request):
        data = await request.json()
        try:
            some_field = data['some_field']
        except KeyError:
            return web.json_response({'message': f'Not all fields were specified'}, status=400)
    
        rows = await TableName.filter(some_field=some_field)
    
        rows_json = [row.__dict__ for row in rows]
        return web.json_response(rows_json)
    
    
    app = web.Application()
    app.add_routes([
        web.post("/rows", list_rows)
    ])
    
    register_tortoise(
        app, db_url="mysql://user:[email protected]:3306/any-database-name", modules={"models": ["models"]},
        generate_schemas=True
    )
    
    if __name__ == "__main__":
        web.run_app(app, port=6000)
    
    3. Pack it into a docker container with a Dockerfile like this:
    FROM python:3.9.5
    COPY requirements.txt . 
    RUN pip install -r requirements.txt
    COPY . .
    CMD ["python", "main.py"]
    

    docker build -t image-name:tag-name .

    4. Run the API with this command:

    docker run -d -p 6000:6000 image-name:tag-name

    5. After 3 hours of running, make a curl request:

    curl -i -X POST --data '{"some_field": "blabla"}' http://localhost:6000/rows

    6. curl will return a 500 error; inside the container's logs you will see:
    pymysql.err.InternalError: Packet sequence number wrong - got 35 expected 1
    

    Expected behavior curl query should return json with rows' values in it.

    Additional context Full trace:

    ERROR:aiohttp.server:Error handling request
    Traceback (most recent call last):
      File "/usr/local/lib/python3.9/site-packages/tortoise/backends/mysql/client.py", line 44, in translate_exceptions_
        return await func(self, *args)
      File "/usr/local/lib/python3.9/site-packages/tortoise/backends/mysql/client.py", line 199, in execute_query
        await cursor.execute(query, values)
      File "/usr/local/lib/python3.9/site-packages/aiomysql/cursors.py", line 239, in execute
        await self._query(query)
      File "/usr/local/lib/python3.9/site-packages/aiomysql/cursors.py", line 457, in _query
        await conn.query(q)
      File "/usr/local/lib/python3.9/site-packages/aiomysql/connection.py", line 428, in query
        await self._read_query_result(unbuffered=unbuffered)
      File "/usr/local/lib/python3.9/site-packages/aiomysql/connection.py", line 622, in _read_query_result
        await result.read()
      File "/usr/local/lib/python3.9/site-packages/aiomysql/connection.py", line 1105, in read
        first_packet = await self.connection._read_packet()
      File "/usr/local/lib/python3.9/site-packages/aiomysql/connection.py", line 574, in _read_packet
        raise InternalError(
    pymysql.err.InternalError: Packet sequence number wrong - got 0 expected 1
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "/usr/local/lib/python3.9/site-packages/aiohttp/web_protocol.py", line 422, in _handle_request
        resp = await self._request_handler(request)
      File "/usr/local/lib/python3.9/site-packages/aiohttp/web_app.py", line 499, in _handle
        resp = await handler(request)
      File "//main.py", line 37, in list_rowa
        rows = await TableName.filter(some_field=some_field)
      File "/usr/local/lib/python3.9/site-packages/tortoise/queryset.py", line 879, in _execute
        instance_list = await self._db.executor_class(
      File "/usr/local/lib/python3.9/site-packages/tortoise/backends/base/executor.py", line 124, in execute_select
        _, raw_results = await self.db.execute_query(query.get_sql())
      File "/usr/local/lib/python3.9/site-packages/tortoise/backends/mysql/client.py", line 52, in translate_exceptions_
        raise OperationalError(exc)
    tortoise.exceptions.OperationalError: Packet sequence number wrong - got 0 expected 1
    
    opened by osintegrator 27
  • Builtin TestCase from python 3.8

    Builtin TestCase from python 3.8

    First and foremost, congrats: I really like this library and am looking forward to contributing and following the path of its evolution. I embraced this library mostly because it is async and it resembles Django, so it was very easy to get started with.

    Is your feature request related to a problem? Please describe. I am using Tortoise with Python 3.8; as of Python 3.8, asyncio unit testing is possible without asynctest. https://docs.python.org/3/library/unittest.html#unittest.IsolatedAsyncioTestCase

    There is also AsyncMock, and patch supports async methods.

    Describe the solution you'd like I would like asynctest not to be required when using Python 3.8 and above.

    Describe alternatives you've considered I have considered no other alternatives, but I am open to consider other alternatives.

    Additional context I have been using a workaround by copying some code from IsolatedTestCase

        async def asyncSetUp(self) -> None:
            config = generate_config(db_url='sqlite://:memory:', app_modules={'models': ['lib.models']})
            await Tortoise.init(config, _create_db=True)
            await Tortoise.generate_schemas(safe=False)
            self._connections = Tortoise._connections.copy()
    
        async def asyncTearDown(self) -> None:
            Tortoise._connections = self._connections.copy()
            await Tortoise._drop_databases()
    
            Tortoise.apps = {}
            Tortoise._connections = {}
            Tortoise._inited = False
    

    I am open to open a PR with the fixes.
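    For reference, the stdlib-only path looks roughly like this (a minimal sketch with no Tortoise involved; IsolatedAsyncioTestCase and AsyncMock are the Python 3.8 pieces mentioned above):

```python
import unittest
from unittest.mock import AsyncMock

class ExampleTest(unittest.IsolatedAsyncioTestCase):
    """Async tests without asynctest (Python 3.8+)."""

    async def test_async_mock(self):
        # AsyncMock-produced mocks are awaitable out of the box.
        fetch = AsyncMock(return_value=[1, 2, 3])
        rows = await fetch("SELECT 1")
        self.assertEqual(rows, [1, 2, 3])
        fetch.assert_awaited_once_with("SELECT 1")

suite = unittest.TestLoader().loadTestsFromTestCase(ExampleTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # → True
```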

    Another question: why are the testing modules in contrib? Testing is vital to every piece of software; is testing not fully supported, or are you planning to change the API?

    enhancement 
    opened by WisdomPill 26
  • Add contains and contained_by filter to JSONField

    Add contains and contained_by filter to JSONField

    Description

    JSON is a native type in PostgreSQL (since 9.2), and you can apply filters to JSON objects. Tortoise ORM did not support some JSONField filters, such as contains, contained_by, and filtering by key and index.

    Motivation and Context

    For example, when we have an array of objects in JSON, we need to filter the JSONField on whether it contains some objects, or vice versa; also to search inside the object by key and index, with the equal, is_null, not_is_null and not options.

    How Has This Been Tested?

    I have added some methods in TestJSONFields class for testing.

    Checklist:

    • [x] My code follows the code style of this project.
    • [x] My change requires a change to the documentation.
    • [x] I have updated the documentation accordingly.
    • [x] I have read the CONTRIBUTING document.
    • [x] I have added tests to cover my changes.
    • [x] All new and existing tests passed.
    opened by ahmadgh74 22
  • fix bug with select_related not yielding null #up

    fix bug with select_related not yielding null #up

    Fixed a bug where select_related yielded unfilled instances of related objects instead of just nulls.

    Description

    Ran into this unexpected behaviour with something like this:

    class B(Model):
        id = fields.UUIDField(pk=True)

    class A(Model):
        the_b = fields.OneToOneField(
            "models.B", related_name="the_a", on_delete=fields.SET_NULL, null=True
        )
    

    then I got it like:

    a1 = await A.create(the_b=await B.create())
    a2 = await A.create(the_b=None)
    a = await A.all().select_related('the_b').get(id=a2.id)
    # and for some reason the next checks failed
    assert a.the_b is None
    assert a.the_b == a2.the_b
    

    Motivation and Context

    It looks like this package follows the ideas of the Django ORM and duplicates its best features in most cases, so I thought select_related behaviour should at least be the same as in the Django ORM. Linked to this issue: https://github.com/tortoise/tortoise-orm/issues/825.

    How Has This Been Tested?

    Tested this with code like the one above, plus as an import dependency in my pet project, where I made sure it doesn't give me issues anymore.

    Checklist:

    • [x] My code follows the code style of this project.
    • [ ] My change requires a change to the documentation.
    • [ ] I have updated the documentation accordingly.
    • [x] I have read the CONTRIBUTING document.
    • [x] I have added tests to cover my changes.
    • [x] All new and existing tests passed.
    opened by urm8 19
  • Duplicate model instance

    Duplicate model instance

    Is your feature request related to a problem? Please describe. In Django, to duplicate a model instance without copying every field, there is a workaround that is pretty straightforward and easy to do: setting the pk to None. With Tortoise it is not that easy; setting the attribute ._saved_in_db is also necessary.

    Describe the solution you'd like Not having to access private variables for this trick to work.

    Describe alternatives you've considered Maybe a copy method as well; that would be much more intuitive.

    Additional context Example of the code needed as of now

    book = Book(title='1984', author='George Orwell')
    
    print(book.pk)
    
    await book.save()
    
    print(book.pk)
    
    book.pk = None
    book._saved_in_db = False
    book.title = 'Animal farm'
    await book.save()
    
    print(book.pk)
    
    print(await Book.all().count())
    

    Example of the code that I would like to use instead

    book = Book(title='1984', author='George Orwell')
    
    print(book.pk)
    
    await book.save()
    
    print(book.pk)
    
    book.pk = None
    book.title = 'Animal farm'
    await book.save()
    
    print(book.pk)
    
    print(await Book.all().count())
    
    enhancement Next Release 
    opened by WisdomPill 18
  • Ability to SUM multiple columns

    Ability to SUM multiple columns

    Is your feature request related to a problem? Please describe. I'd like to be able to use the Sum function over multiple columns.

    Example for a model

    class Stats(Model):
        one_column = fields.IntField(default=0)
        two_column = fields.IntField(default=0)
    

    I'd like to be able to run a query like select sum(one_column+two_column) from stats

    Describe the solution you'd like Stats.annotate(total_sum=Sum('one_column+two_column')) (or something similar)

    Additional context

    Pypika can already resolve this as seen in this test:

    https://github.com/kayak/pypika/blob/16339f85ed871d937609e9dbe26487b382ca5211/pypika/tests/test_formats.py

    fn.Sum(self.table_abc.fizz + self.table_abc.buzz)
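    What the requested aggregate computes is plain SQL; a stdlib sqlite3 sketch (the table mirrors the example model above):

```python
import sqlite3

# SELECT SUM(one_column + two_column) sums the per-row totals.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stats (one_column INTEGER, two_column INTEGER)")
conn.executemany("INSERT INTO stats VALUES (?, ?)", [(1, 2), (3, 4)])
(total,) = conn.execute("SELECT SUM(one_column + two_column) FROM stats").fetchone()
print(total)  # → 10
```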

    enhancement 
    opened by bbedward 18
  • Q-objects

    Q-objects

    After working with Q objects, I have some questions, remarks and things that don't work as intended/expected:

    • a blank Q object in AND mode works with just 1 argument to filter on, however OR mode requires a minimum of 2, which is quite inconvenient when your Q object is assembled based on client data
    • you can't combine keywords and nested Q objects in the same Q object; you have to wrap them in separate Q objects and then combine those into one master Q object to handle both. This throws an explicit exception, so it seems to be intended behaviour, but it makes things more complex
    • are there any performance penalties to using nested Q objects several layers deep (other than, of course, the overhead of Python constructing objects)?
    bug enhancement Next Release Waiting for feedback 
    opened by AEnterprise 18
  • New pool (WIP)

    New pool (WIP)

    Now that the code base is simpler and the test runner should be more sane, attempt to add connection pooling for the third time.

    The plan is to change to connection pooling ONLY, as we currently implement persistent connections, but only one persistent connection. A connection pool should:

    • add robustness (if connection dies, then reconnect)
    • Allow multiple DB clients to operate at the same time (up to maxsize)
    • Allow more conflicts, so we need to handle rollback/retries explicitly.

    Things done:

    • [x] Change to a connection pooling system for MySQL.
    • [ ] Add tests for concurrency
    • [ ] Add tests for robustness (hackery allowed)
    • [ ] Add tests for handling conflicts.

    Concerns:

    • Can SQLite be concurrent at all? If difficult, should we limit it?
    • We need to add concurrency to the benchmarks, to manage performance
    opened by grigi 18
  • Tortoise orm not creating columns in postgres database in all tables

    Tortoise orm not creating columns in postgres database in all tables

    Hi, I am having a serious issue. I have read all the docs and searched for a solution, but my code is correct. What is actually happening: I have made models, and when I run my Python file it creates the tables in the database, but it does not create some of the columns in most of the tables. Here is the code of one of my tables:

    class GuildData(models.Model):
        class Meta:
            table = "server_configs"
        guild_id = fields.BigIntField(pk=True)
        is_bot_setuped = fields.BooleanField(default=False)
        scrims_manager = fields.BooleanField(default=False)
        autorole_toggle = fields.BooleanField(default=False)
        autorole_bot_toggle = fields.BooleanField(default=False)
        autorole_human_toggle = fields.BooleanField(default=False)
        autorole_human = fields.BigIntField(null = True)
        autorole_bot = fields.BigIntField(null = True)
        automeme_toogle = fields.BooleanField(default=False) 
        automeme_channel_id = fields.BigIntField(null = True)
        is_guild_premium = fields.BooleanField(default = False)
    

    Some of the columns are not getting created. Naming the columns which have been created: guild_id, scrims_manager, autorole_bot, autorole_human. These columns get created, but the others don't. Here is the output logged by tortoise logging:

    CREATE TABLE IF NOT EXISTS "server_configs" (
        "guild_id" BIGSERIAL NOT NULL PRIMARY KEY,
        "scrims_manager" BOOL NOT NULL  DEFAULT False,
        "autorole_human" BIGINT,
        "autorole_bot" BIGINT
    );
    

    You can see that most of the columns are missing, and the same thing is happening in other tables too. Here is an image of the table in pgAdmin: https://cdn.discordapp.com/attachments/754690035319701545/856917612395102218/Screenshot_from_2021-06-22_20-54-04.png.

    Kindly solve the issue fast, because I am getting late with my bot's update because of this.

    Thanks

    opened by TierGamerpy 17
  • TypeError: error: Incompatible types in assignment (expression has type "Clan", variable has type "Optional[ForeignKeyFieldInstance[Clan]]")

    TypeError: error: Incompatible types in assignment (expression has type "Clan", variable has type "Optional[ForeignKeyFieldInstance[Clan]]")

    Describe the bug Getting a type error when updating the foreign key to a different clan on a nullable ForeignKeyField instance.

    To Reproduce

    if await Clan.get_or_none(tag=clantag):
    	clan = await Clan.get(tag=clantag)
    	if await Player.get_or_none(name=playername):
    		if await Season.active_seasons.all().exists():
    			await ctx.send("Cannot move players between clans during an active season.")
    		else:
    			player = await Player.get(name=playername)
    			if player.is_enabled():
    				player.clan = clan
    				await player.save()
    				await ctx.send(
    					f"Welcome @{playername} to the [{clan.tag}] {clan.name} Clan roster!"
    				)
    			else:
    				player.clan = clan
    				await player.save()
    				await ctx.send(
    					f"Welcome @{playername} to the [{clan.tag}] {clan.name} Clan roster!"
    				)
    	else:
    		await Player.create(name=playername, clan=clan, enabled=True)
    		await ctx.send(
    			f"Welcome @{playername} to the [{clan.tag}] {clan.name} Clan roster!"
    		)
    else:
    	await ctx.send(f"Clan {clantag} does not exist.")
    

    Expected behavior This shouldn't error, as the code works fine and the foreign key field gets updated fine.

    Additional context Add any other context about the problem here.

    opened by adambirds 0
  • Integration with existing Django projects

    Integration with existing Django projects

    If you are looking for how to use Tortoise ORM with Django, here is a lib: https://github.com/KhDenys/django-tortoise. It generates Tortoise models and creates connections automatically based on Django's models.

    opened by KhDenys 0
  • Unable to set empty sql_mode for MySQL

    Unable to set empty sql_mode for MySQL

    Describe the bug Empty connection url parameters are ignored by Tortoise.init.

    To Reproduce

    1. Set connection url to something like mysql://myuser:[email protected]:3306/somedb?sql_mode= (empty value for sql_mode).
    2. Call Tortoise.init(...) with debug logs enabled.
    3. Get log message Created connection pool with params: {'host': 'db.host', 'port': 3306, 'user': 'myuser', 'db': 'somedb', 'autocommit': True, 'charset': 'utf8mb4', 'minsize': 1, 'maxsize': 5, 'sql_mode': 'STRICT_TRANS_TABLES'}

    Expected behavior Created connection pool with params: {'host': 'db.host', 'port': 3306, 'user': 'myuser', 'db': 'somedb', 'autocommit': True, 'charset': 'utf8mb4', 'minsize': 1, 'maxsize': 5, 'sql_mode': ''} (empty string as sql_mode).

    Additional context It seems like the problem is caused by parse_qs in expand_db_url, which sets keep_blank_values=False by default. Here https://github.com/tortoise/tortoise-orm/blob/develop/tortoise/backends/base/config_generator.py#L148.

    opened by sda97ghb 0
  •  TimeField native time format

    TimeField native time format

    Describe the bug TimeField is not in native time format. The TimeField is shown in the table as '00:00:04+00'; I just want it to be '00:00:04'. With 'use_tz': False the same problem persists.
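    At the display layer the offset can be stripped with stdlib datetime (a workaround sketch, assuming the value arrives as a tz-aware datetime.time):

```python
from datetime import time, timezone

# A tz-aware time renders with its UTC offset attached...
aware = time(0, 0, 4, tzinfo=timezone.utc)
print(aware.isoformat())  # → 00:00:04+00:00

# ...dropping tzinfo before formatting yields the naive form.
print(aware.replace(tzinfo=None).isoformat())  # → 00:00:04
```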

    To Reproduce

    from tortoise import Model, fields
    
    SUPPORT_TYPE = (
        (1, 'Payment'),
        (2, 'Free'),
        (3, 'Continuous'),
    )
    
    class User(Model):
        id = fields.IntField(pk=True)
        username = fields.CharField(max_length=20, unique=True)
        password = fields.CharField(max_length=128, null=True)
        first_name = fields.CharField(max_length=60, null=True)
        last_name = fields.CharField(max_length=60, null=True)
        email = fields.CharField(max_length=60, unique=True)
        image = fields.CharField(max_length=60, null=True)
        phone = fields.CharField(max_length=30, null=True)
        created_at = fields.DatetimeField(auto_now_add=True)
        updated_at = fields.DatetimeField(auto_now=True)
    
        def full_name(self) -> str:
            if self.first_name or self.last_name:
                return f"{self.first_name or ''} {self.last_name or ''}".strip()
            return self.username
    
    class Customer(Model):
        id = fields.BigIntField(pk=True)
        code = fields.CharField(max_length=30, null=True)
        name = fields.CharField(max_length=200)
        image = fields.CharField(max_length=60, null=True)
        website = fields.CharField(max_length=120, null=True)
        description = fields.TextField(null=True)
    
        class Meta:
            ordering = ["name"]
    
        def __str__(self) -> str:
            return self.name
    
    class Support(Model):
        id = fields.BigIntField(pk=True)
        customer = fields.ForeignKeyField('models.Customer', on_delete=fields.CASCADE)
        user = fields.ForeignKeyField('models.User', on_delete=fields.CASCADE)
        support_type = fields.SmallIntField(choices=SUPPORT_TYPE)
        created_date = fields.DateField(auto_now_add=True)
        created_time = fields.TimeField()
        duration = fields.TimeField()
    
        class Meta:
            table = 'support'
    
        def __str__(self) -> str:
            return self.customer.name
    
    from models import User, Support
    from sanic import Sanic, response
    from tortoise.contrib.sanic import register_tortoise
    
    app = Sanic("WebServer")
    
    
    @app.get("/support_list")
    @app.ext.template("support_list.html")
    async def list_support(request):
        objects = await Support.all()
        return {"objects": objects}
    
    
    register_tortoise(
        app,
        db_url="postgres://postgres:[email protected]:5432/myweb",
        modules={"models": ["models"]},
        generate_schemas=True
    )
    
    if __name__ == '__main__':
        app.run(host="0.0.0.0", port=8000, debug=True, auto_reload=True)
    
    

    Expected behavior I am using the Sanic framework with a PostgreSQL database and Tortoise ORM; I expect the time to be shown as '00:00:04', without the timezone offset.

    Additional context sanic_tortoiseorm
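The '+00' suffix is what a timezone-aware datetime.time renders as, which suggests the column value comes back tz-aware. A stdlib sketch of one possible workaround (stripping tzinfo before display, e.g. in the template layer) is:

```python
from datetime import time, timezone

# A timezone-aware time, as a tz-aware time column would be returned.
aware = time(0, 0, 4, tzinfo=timezone.utc)
print(aware.isoformat())  # 00:00:04+00:00

# Dropping tzinfo yields the plain wall-clock value the report asks for.
naive = aware.replace(tzinfo=None)
print(naive.isoformat())  # 00:00:04
```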

    opened by ystekno 0
  • OneToOne In Pydantic Model with pydantic_model_creator method

    OneToOne In Pydantic Model with pydantic_model_creator method

    Describe the bug I cannot use OneToOne relation field inside a pydantic model creator's method.

    To Reproduce When I try to create a pydantic model using the pydantic_model_creator method, it ignores the OneToOne field as if it doesn't exist. When I then wrote a validator for that field, an error was raised because the field does not exist on the pydantic model. It is also not included in the Swagger document, so it is not detected as a field at all.

    models.py

    from tortoise import fields, models
    from tortoise.contrib.pydantic import pydantic_model_creator
    from pydantic import validator
    
    class Supplier(models.Model):
        id = fields.UUIDField(pk=True)
        user = fields.OneToOneField('models.Users', on_delete=fields.SET_NULL, null=True, related_name='supplier')
        comission = fields.DecimalField(max_digits=30, decimal_places=2)
        current_debt = fields.DecimalField(max_digits=30, decimal_places=2, default=0)
        last_time_refresh = fields.DatetimeField(null=True, blank=True)
        shared_proxies = fields.BooleanField(default=False)
        wallet = fields.TextField(null=True, blank=True, )
        is_vip = fields.BooleanField(default=False)
        server = fields.CharField(default='1.1.1.1', max_length=15)
    
    SupplierPydanticIn = pydantic_model_creator(models.Supplier, name="SupplierIn", 
        include=('user', 'comission', 'current_debt', 'shared_proxies', 'server')
    )
    
    class SupplierIn(SupplierPydanticIn):
        @validator('user')
        def validate_user(cls, value):
            print("Just for test purpose!")
            return value
    


    Expected behavior should get related field's ID as in pydantic model

    opened by ManiMozaffar 1
  • Support python 3.11(TestCase)

    Support python 3.11(TestCase)

    Is your feature request related to a problem? Please describe. If Python 3.11 is already supported, then I'm really sorry.

    Describe the solution you'd like Support Python 3.11.

    Additional context The connection is OK, but TestCase fails with Python 3.11.1 and [email protected]

    Details
    ../../../.cache/pypoetry/virtualenvs/shopyo-sHG86FW6-py3.11/lib/python3.11/site-packages/tortoise/contrib/test/__init__.py:297: in asyncSetUp
        await self.__transaction__.__aenter__()  # type: ignore
    ../../../.cache/pypoetry/virtualenvs/shopyo-sHG86FW6-py3.11/lib/python3.11/site-packages/tortoise/contrib/test/__init__.py:273: in __aenter__
        self.connection._connection = await self.connection._parent._pool.acquire()
    ../../../.cache/pypoetry/virtualenvs/shopyo-sHG86FW6-py3.11/lib/python3.11/site-packages/asyncpg/pool.py:838: in _acquire
        return await _acquire_impl()
    ../../../.cache/pypoetry/virtualenvs/shopyo-sHG86FW6-py3.11/lib/python3.11/site-packages/asyncpg/pool.py:823: in _acquire_impl
        proxy = await ch.acquire()  # type: PoolConnectionProxy
    ../../../.cache/pypoetry/virtualenvs/shopyo-sHG86FW6-py3.11/lib/python3.11/site-packages/asyncpg/pool.py:137: in acquire
        await self.connect()
    ../../../.cache/pypoetry/virtualenvs/shopyo-sHG86FW6-py3.11/lib/python3.11/site-packages/asyncpg/pool.py:129: in connect
        self._con = await self._pool._get_new_connection()
    ../../../.cache/pypoetry/virtualenvs/shopyo-sHG86FW6-py3.11/lib/python3.11/site-packages/asyncpg/pool.py:521: in _get_new_connection
        con = await connect_utils._connect_addr(
    ../../../.cache/pypoetry/virtualenvs/shopyo-sHG86FW6-py3.11/lib/python3.11/site-packages/asyncpg/connect_utils.py:773: in _connect_addr
        return await __connect_addr(params, timeout, True, *args)
    ../../../.cache/pypoetry/virtualenvs/shopyo-sHG86FW6-py3.11/lib/python3.11/site-packages/asyncpg/connect_utils.py:825: in __connect_addr
        tr, pr = await compat.wait_for(connector, timeout=timeout)
    ../../../.cache/pypoetry/virtualenvs/shopyo-sHG86FW6-py3.11/lib/python3.11/site-packages/asyncpg/compat.py:56: in wait_for
        return await asyncio.wait_for(fut, timeout)
    /opt/hostedtoolcache/Python/3.11.1/x64/lib/python3.11/asyncio/tasks.py:479: in wait_for
        return fut.result()
    ../../../.cache/pypoetry/virtualenvs/shopyo-sHG86FW6-py3.11/lib/python3.11/site-packages/asyncpg/connect_utils.py:684: in _create_ssl_connection
        tr, pr = await loop.create_connection(
    /opt/hostedtoolcache/Python/3.11.1/x64/lib/python3.11/asyncio/base_events.py:1039: in create_connection
        infos = await self._ensure_resolved(
    /opt/hostedtoolcache/Python/3.11.1/x64/lib/python3.11/asyncio/base_events.py:1413: in _ensure_resolved
        return await loop.getaddrinfo(host, port, family=family, type=type,
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
    self = <_UnixSelectorEventLoop running=False closed=False debug=False>
    host = 'localhost', port = 15432
        async def getaddrinfo(self, host, port, *,
                              family=0, type=0, proto=0, flags=0):
            if self._debug:
                getaddr_func = self._getaddrinfo_debug
            else:
                getaddr_func = socket.getaddrinfo
        
    >       return await self.run_in_executor(
                None, getaddr_func, host, port, family, type, proto, flags)
    E       RuntimeError: Task <Task pending name='Task-782' coro=<_create_ssl_connection() running at /home/runner/.cache/pypoetry/virtualenvs/shopyo-sHG86FW6-py3.11/lib/python3.11/site-packages/asyncpg/connect_utils.py:684> cb=[remove_from_cache() at /home/runner/.cache/pypoetry/virtualenvs/shopyo-sHG86FW6-py3.11/lib/python3.11/site-packages/newrelic/hooks/coroutines_asyncio.py:20, _release_waiter(<Future pendi...events.py:427>)() at /opt/hostedtoolcache/Python/3.11.1/x64/lib/python3.11/asyncio/tasks.py:421] created at /home/runner/.cache/pypoetry/virtualenvs/shopyo-sHG86FW6-py3.11/lib/python3.11/site-packages/newrelic/common/object_wrapper.py:384> got Future <Future pending cb=[_chain_future.<locals>._call_check_cancel() at /opt/hostedtoolcache/Python/3.11.1/x64/lib/python3.11/asyncio/futures.py:387]> attached to a different loop
    /opt/hostedtoolcache/Python/3.11.1/x64/lib/python3.11/asyncio/base_events.py:867: RuntimeError
    _______ TestAPI.test_api ________
    self = <test_api.TestAPI testMethod=test_api>
        async def asyncSetUp(self) -> None:
    >       await super().asyncSetUp()
    app/tests/api/test_api.py:69: 
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
    ../../../.cache/pypoetry/virtualenvs/shopyo-sHG86FW6-py3.11/lib/python3.11/site-packages/tortoise/contrib/test/__init__.py:297: in asyncSetUp
        await self.__transaction__.__aenter__()  # type: ignore
    ../../../.cache/pypoetry/virtualenvs/shopyo-sHG86FW6-py3.11/lib/python3.11/site-packages/tortoise/contrib/test/__init__.py:273: in __aenter__
        self.connection._connection = await self.connection._parent._pool.acquire()
    ../../../.cache/pypoetry/virtualenvs/shopyo-sHG86FW6-py3.11/lib/python3.11/site-packages/asyncpg/pool.py:838: in _acquire
    
    enhancement 
    opened by hyeongguen-song 0
Releases(0.19.2)
  • 0.19.2(Jul 11, 2022)

    Added

    • Added schema attribute to Model's Meta to specify exact schema to use with the model.

    Fixed

    • Mixin does not work. (#1133)
    • using_db wrong position in model shortcut methods. (#1150)
    • Fixed connection to Oracle database by adding database info to DBQ in connection string.
    • Fixed ORA-01435 error while using Oracle database (#1155)
    • Fixed processing of ssl option in MySQL connection string.
    • Fixed type hinting for QuerySetSingle.
    Source code(tar.gz)
    Source code(zip)
  • 0.19.1(May 20, 2022)

    Added

    • Added Postgres/SQLite partial indexes support. (#1103)
    • Added Microsoft SQL Server/Oracle support, powered by asyncodbc; note that it is not yet fully tested.
    • Added optional parameter to pydantic_model_creator. (#770)
    • Added using_db parameter to Model shortcut methods. (#1109)

    Fixed

    • TimeField for MySQL will return datetime.timedelta object instead of datetime.time object.
    • Fix on conflict do nothing. (#1122)
    • Fix _custom_generated_pk attribute not set in Model._init_from_db method. (#633)
    Source code(tar.gz)
    Source code(zip)
  • 0.19.0(Mar 27, 2022)

    Added

    • Added psycopg backend support.
    • Added a new unified and robust connection management interface to access DB connections which includes support for lazy connection creation and much more. For more details, check out this PR.
    • Added TimeField. (#1054).
    • Added ArrayField for postgres.

    Fixed

    • Fix bulk_create doesn't work correctly with more than 1 update_fields. (#1046)
    • Fix bulk_update errors when setting null for a smallint column on postgres. (#1086)

    Deprecated

    • Existing connection management interface and related public APIs which are deprecated:
    • Tortoise.get_connection
    • Tortoise.close_connections

    Changed

    • Refactored the tortoise.transactions.get_connection method to tortoise.transactions._get_connection. Note that this method is now marked private to that module and is not part of the public API.
    Source code(tar.gz)
    Source code(zip)
  • 0.18.1(Jan 10, 2022)

    Added

    • Add on conflict do update for bulk_create. (#1024)

    Fixed

    • Fix bulk_create error. (#1012)
    • Fix unittest invalid.
    • Fix bulk_update in postgres with some type. (#968) (#1022)
    Source code(tar.gz)
    Source code(zip)
  • 0.18.0(Dec 20, 2021)

    Added

    • Add Case-When support. (#943)
    • Add Rand/Random function in contrib. (#944)
    • Add ON CONFLICT support in INSERT statements. (#428)

    Fixed

    • Fix bulk_update error when pk is uuid. (#986)
    • Fix mutable default value. (#969)

    Changed

    • Move Function, Aggregate from functions.py to expressions.py. (#943)
    • Move Q from query_utils.py to expressions.py.
    • Replace python-rapidjson with orjson.

    Removed

    • Remove asynctest and use unittest.IsolatedAsyncioTestCase. (#416)
    • Remove py37 support in tests.
    • Remove green and nose2 test runner.
    Source code(tar.gz)
    Source code(zip)
  • 0.17.8(Oct 6, 2021)

    Added

    • Add Model.raw method to support the raw sql query.
    • Add QuerySet.bulk_update method. (#924)
    • Add QuerySet.in_bulk method.
    • Add MaxValueValidator and MinValueValidator (#927)

    Fixed

    • Fix QuerySet subclass being lost when _clone is run on the instance.
    • Fix bug in .values with source_field. (#844)
    • Fix contrib.blacksheep exception handlers, use builtin json response. (#914)
    • Fix indexes defined in the Meta class not making use of the exists parameter in their template. (#928)

    Changed

    • Allow negative values with IntEnumField. (#889)
    • Make .values() and .values_list() awaited return more consistent. (#899)
    Source code(tar.gz)
    Source code(zip)
  • 0.17.7(Aug 31, 2021)

    • Fix select_related behaviour for forward relation. (#825)
    • Fix bug in nested QuerySet and Manager. (#864)
    • Add Concat function for MySQL/PostgreSQL. (#873)
    • Patch for use_index/force_index mutable problem when making query. (#888)
    • Lift annotation field's priority in make query. (#883)
    • Make use/force index available in select type Query. (#893)
    • Fix all logging to use Tortoise's logger instead of root logger. (#879)
    • Rename db_client logger to tortoise.db_client.
    • Add indexes to Model.describe.
    Source code(tar.gz)
    Source code(zip)
  • 0.17.6(Jul 26, 2021)

    • Add RawSQL expression.
    • Fix columns count with annotations in _make_query. (#776)
    • Make functions nested. (#828)
    • Add db_constraint in field describe.
    Source code(tar.gz)
    Source code(zip)
  • 0.17.5(Jul 7, 2021)

    • Set field_type of FK and O2O fields to the same type as the related field. (#443)
    • Fix erroneous SQL when .sql() is called more than once. (#796)
    • Fix incorrect splitting of the import route when using Router (#798)
    • Fix filter error after annotate with F. (#806)
    • Fix select_related for reverse relation. (#808)
    Source code(tar.gz)
    Source code(zip)
  • 0.17.4(Jun 3, 2021)

  • 0.17.3(May 22, 2021)

    • Fix duplicates when using custom through association class on M2M relations
    • Fix update_or_create and get_or_create. (#721)
    • Fix refresh_from_db without fields pass. (#734)
    • Make update query work with limit and order_by. (#748)
    • Add Subquery expression. (#756) (#9) (#337)
    • Use JSON in JSONField.
    Source code(tar.gz)
    Source code(zip)
  • 0.17.2(Apr 9, 2021)

    • Add more index types.
    • Add force_index, use_index to queryset.
    • Fix F in update error with update_fields.
    • Make delete query work with limit and order_by. (#697)
    • Filter backward FK fields with IS NULL and NOT IS NULL filters (#700)
    • Add select_for_update in update_or_create. (#702)
    • Add Model.select_for_update.
    • Add __search full text search to queryset.
    Source code(tar.gz)
    Source code(zip)
  • 0.17.1(Mar 27, 2021)

    • Fix type for modules.
    • Fix select_related when related model specified more than once. (#679)
    • Add __iter__ to Model, so a model or list of models can be returned directly in a FastAPI response.
    • Fix in_transaction bug caused by router. (#677) (#678)
    Source code(tar.gz)
    Source code(zip)
  • 0.17.0(Mar 20, 2021)

  • 0.16.21(Feb 4, 2021)

    • Fixed validating JSON before decoding. (#623)
    • Add model method update_or_create.
    • Add batch_size parameter for bulk_create method.
    • Fix save with F expression and field with source_field.
    Source code(tar.gz)
    Source code(zip)
  • 0.16.20(Jan 23, 2021)

  • 0.16.19(Dec 23, 2020)

    • Replace setting the TZ environment variable with TIMEZONE to avoid affecting the global timezone.
    • Allow passing module objects to models_paths param of Tortoise.init_models(). (#561)
    • Implement PydanticMeta.backward_relations. (#536)
    • Allow overriding PydanticMeta in PydanticModelCreator. (#536)
    • Fixed make_native typo to make_naive in timezone module
    Source code(tar.gz)
    Source code(zip)
  • 0.16.18(Nov 16, 2020)

    • Support custom function in update. (#537)
    • Add Model.refresh_from_db. (#549)
    • Add timezone support, be careful to upgrade to this version, see docs for details. (#335)
    • Remove aerich in case of cyclic dependency. (#558)
    Source code(tar.gz)
    Source code(zip)
  • 0.16.17(Oct 23, 2020)

    • Add on_delete in ManyToManyField. (#508)
    • Support F expression in annotate. (#475)
    • Fix QuerySet.select_related in case of join same table twice. (#525)
    • Integrate Aerich into the install. (#530)
    Source code(tar.gz)
    Source code(zip)
  • 0.16.16(Sep 24, 2020)

  • 0.16.15(Sep 16, 2020)

    • Make DateField accept valid date str.
    • Add QuerySet.select_for_update().
    • Check default for not None on pydantic model creation
    • Propagate default to pydantic model
    • Add QuerySet.select_related().
    • Add custom attribute name for Prefetch instruction.
    • Add db_constraint for RelationalField family.
    Source code(tar.gz)
    Source code(zip)
  • 0.16.14(Jul 25, 2020)

    • We now do CI runs on a Windows VM as well, to try and prevent Windows specific regressions.
    • Make F expression work with QuerySet.filter().
    • Include py.typed in source distribution.
    • Added datetime parsing from int for fields.DatetimeField.
    • get_or_create passes the using_db= on if provided.
    • Allow custom loop and connection_class parameters to be passed on to asyncpg.
    Source code(tar.gz)
    Source code(zip)
  • 0.16.13(Jun 2, 2020)

    • Default install of tortoise-orm now installs with no C-dependencies, if you want to use the C accelerators, please do a pip install tortoise-orm[accel] instead.
    • Added <instance>.clone() method that will create a cloned instance in memory. To persist it you still need to call .save()
    • .clone() will raise a ParamsError if tortoise can't generate a primary key. In that case do a .clone(pk=<newval>)
    • If manually setting the primary key value to None and the primary key can be automatically generated, this will create a new record. We however still recommend the .clone() method instead.
    • .save() can be forced to do a create by setting force_create=True
    • .save() can be forced to do an update by setting force_update=True
    • Setting update_fields for a .save() operation will strongly prefer to do an update if possible
    Source code(tar.gz)
    Source code(zip)
  • 0.16.12(May 22, 2020)

    • Make Field.default take effect at the DB level when generating tables
    • Add converters instead of importing from pymysql
    • Fix PostgreSQL BooleanField default value conversion
    • Fix JSONField typed in pydantic_model_creator
    • Add .sql() method on QuerySet
    Source code(tar.gz)
    Source code(zip)
  • 0.16.11(May 14, 2020)

    • fix: sqlite://:memory: on Windows throws OSError: [WinError 123]
    • Support bulk_create() insertion of records with overridden primary key when the primary key is DB-generated
    • Add queryset.exists() and Model.exists().
    • Add model subscription lookup, Model[<pkval>] that will return the object or raise KeyError
    Source code(tar.gz)
    Source code(zip)
  • 0.16.10(Apr 30, 2020)

    • Fix bad import of basestring
    • Better handling of NULL characters in strings. Fixes SQLite, raises better error for PostgreSQL.
    • Support .group_by() with join now
    Source code(tar.gz)
    Source code(zip)
  • 0.16.9(Apr 26, 2020)

    • Support F expression in .save() now
    • IntEnumField accepts any valid int value and CharEnumField accepts any valid str value
    • Pydantic models get created with globally unique identifier
    • Leaf-detection to minimize duplicate Pydantic model creation
    • Pydantic models with a primary key that is also a raw field of a relation no longer hide that field when exclude_raw_fields=True, as it is a critically important field
    • Raise an informative error when a field is set as nullable and primary key at the same time
    • Foreign key IDs are now described to have the positive-integer range of the field they relate to
    • Fixed prefetching over OneToOne relations
    • Fixed __contains for non-text fields (e.g. JSONB)
    Source code(tar.gz)
    Source code(zip)
  • 0.16.8(Apr 22, 2020)

    • Allow Q expression to function with _filter parameter on aggregations
    • Add manual .group_by() support
    • Fixed regression where the GROUP BY clause is missing for an aggregate with a specified order.
    Source code(tar.gz)
    Source code(zip)
  • 0.15.24(Apr 22, 2020)

  • 0.16.7(Apr 19, 2020)

    • Added preliminary support for Python 3.9
    • TruncationTestCase now properly quotes table names when it clears them out.
    • Add model signals support
    • Added app_label to test initializer(...) and TORTOISE_TEST_APP as test environment variable.
    Source code(tar.gz)
    Source code(zip)
Owner
Tortoise
Familiar asyncio ORM for python, built with relations in mind
a small, expressive orm -- supports postgresql, mysql and sqlite

peewee Peewee is a simple and small ORM. It has few (but expressive) concepts, making it easy to learn and intuitive to use. a small, expressive ORM p

Charles Leifer 9.7k Jan 08, 2023
A Python Library for Simple Models and Containers Persisted in Redis

Redisco Python Containers and Simple Models for Redis Description Redisco allows you to store objects in Redis. It is inspired by the Ruby library Ohm

sebastien requiem 436 Nov 10, 2022
A dataclasses-based ORM framework

dcorm A dataclasses-based ORM framework. [WIP] - Work in progress This framework is currently under development. A first release will be announced in

HOMEINFO - Digitale Informationssysteme GmbH 1 Dec 24, 2021
Twisted wrapper for asynchronous PostgreSQL connections

This is txpostgres is a library for accessing a PostgreSQL database from the Twisted framework. It builds upon asynchronous features of the Psycopg da

Jan Urbański 104 Apr 22, 2022
A database migrations tool for TortoiseORM, ready to production.

Aerich Introduction Aerich is a database migrations tool for Tortoise-ORM, which is like alembic for SQLAlchemy, or like Django ORM with it's own migr

Tortoise 596 Jan 06, 2023
Rich Python data types for Redis

Created by Stephen McDonald Introduction HOT Redis is a wrapper library for the redis-py client. Rather than calling the Redis commands directly from

Stephen McDonald 281 Nov 10, 2022
Prisma Client Python is an auto-generated and fully type-safe database client

Prisma Client Python is an unofficial implementation of Prisma which is a next-generation ORM that comes bundled with tools, such as Prisma Migrate, which make working with databases as easy as possi

Robert Craigie 930 Jan 08, 2023
A single model for shaping, creating, accessing, storing data within a Database

'db' within pydantic - A single model for shaping, creating, accessing, storing data within a Database Key Features Integrated Redis Caching Support A

Joshua Jamison 178 Dec 16, 2022
Global base classes for Pyramid SQLAlchemy applications.

pyramid_basemodel pyramid_basemodel is a thin, low level package that provides an SQLAlchemy declarative Base and a thread local scoped Session that c

Grzegorz Śliwiński 15 Jan 03, 2023
A very simple CRUD class for SQLModel! ✨

Base SQLModel A very simple CRUD class for SQLModel! ✨ Inspired on: Full Stack FastAPI and PostgreSQL - Base Project Generator FastAPI Microservices I

Marcelo Trylesinski 40 Dec 14, 2022
Adds SQLAlchemy support to Flask

Flask-SQLAlchemy Flask-SQLAlchemy is an extension for Flask that adds support for SQLAlchemy to your application. It aims to simplify using SQLAlchemy

The Pallets Projects 3.9k Jan 09, 2023
A simple project to explore the number of GCs when doing basic ORM work.

Question: Does Python do extremely too many GCs for ORMs? YES, OMG YES. Check this out Python Default GC Settings: SQLAlchemy - 20,000 records in one

Michael Kennedy 26 Jun 05, 2022
An async ORM. 🗃

ORM The orm package is an async ORM for Python, with support for Postgres, MySQL, and SQLite. ORM is built with: SQLAlchemy core for query building. d

Encode 1.7k Dec 28, 2022
Object mapper for Amazon's DynamoDB

Flywheel Build: Documentation: http://flywheel.readthedocs.org/ Downloads: http://pypi.python.org/pypi/flywheel Source: https://github.com/stevearc/fl

Steven Arcangeli 128 Dec 31, 2022
SQLModel is a library for interacting with SQL databases from Python code, with Python objects.

SQLModel is a library for interacting with SQL databases from Python code, with Python objects. It is designed to be intuitive, easy to use, highly compatible, and robust.

Sebastián Ramírez 9.1k Dec 31, 2022
A pythonic interface to Amazon's DynamoDB

PynamoDB A Pythonic interface for Amazon's DynamoDB. DynamoDB is a great NoSQL service provided by Amazon, but the API is verbose. PynamoDB presents y

2.1k Dec 30, 2022
A PostgreSQL or SQLite orm for Python

Prom An opinionated lightweight orm for PostgreSQL or SQLite. Prom has been used in both single threaded and multi-threaded environments, including en

Jay Marcyes 18 Dec 01, 2022
A new ORM for Python specially for PostgreSQL

A new ORM for Python specially for PostgreSQL. Fully-typed for any query with Pydantic and auto-model generation, compatible with any sync or async driver

Yan Kurbatov 3 Apr 13, 2022
Piccolo - A fast, user friendly ORM and query builder which supports asyncio.

A fast, user friendly ORM and query builder which supports asyncio.

919 Jan 04, 2023