Overview


Redis OM

Object mapping, and more, for Redis and Python


Redis OM Python makes it easy to model Redis data in your Python applications.

Redis OM Python | Redis OM Node.js | Redis OM Spring | Redis OM .NET

💡 Why Redis OM?

Redis OM provides high-level abstractions that make it easy to model and query data in Redis with modern Python applications.

This preview release contains the following features:

  • Declarative object mapping for Redis objects
  • Declarative secondary-index generation
  • Fluent APIs for querying Redis

📇 Modeling Your Data

Redis OM contains powerful declarative models that give you data validation, serialization, and persistence to Redis.

Check out this example of modeling customer data with Redis OM. First, we create a Customer model:

import datetime
from typing import Optional

from pydantic import EmailStr

from redis_om import HashModel


class Customer(HashModel):
    first_name: str
    last_name: str
    email: EmailStr
    join_date: datetime.date
    age: int
    bio: Optional[str]

Now that we have a Customer model, let's use it to save customer data to Redis.

"01FJM6PH661HCNNRC884H6K30C" # We can save the model to Redis by calling `save()`: andrew.save() # To retrieve this customer with its primary key, we use `Customer.get()`: assert Customer.get(andrew.pk) == andrew ">
import datetime
from typing import Optional

from pydantic import EmailStr

from redis_om import HashModel


class Customer(HashModel):
    first_name: str
    last_name: str
    email: EmailStr
    join_date: datetime.date
    age: int
    bio: Optional[str]


# First, we create a new `Customer` object:
andrew = Customer(
    first_name="Andrew",
    last_name="Brookins",
    email="[email protected]",
    join_date=datetime.date.today(),
    age=38,
    bio="Python developer, works at Redis, Inc."
)

# The model generates a globally unique primary key automatically
# without needing to talk to Redis.
print(andrew.pk)
# > "01FJM6PH661HCNNRC884H6K30C"

# We can save the model to Redis by calling `save()`:
andrew.save()

# To retrieve this customer with its primary key, we use `Customer.get()`:
assert Customer.get(andrew.pk) == andrew
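
A couple of other common operations are worth a quick look. The following is a sketch rather than official documentation: it assumes the Customer model and the andrew instance from above, and that NotFoundError and the class-level delete() method behave as shown.

from redis_om import NotFoundError

# Fetching a primary key that doesn't exist raises NotFoundError:
try:
    Customer.get("a-primary-key-that-does-not-exist")
except NotFoundError:
    print("No customer with that primary key")

# Delete a customer from Redis by primary key:
Customer.delete(andrew.pk)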

Ready to learn more? Check out the getting started guide.

Or, continue reading to see how Redis OM makes data validation a snap.

✓ Validating Data With Your Model

Redis OM uses Pydantic to validate data based on the type annotations you assign to fields in a model class.

This validation ensures that fields like first_name, which the Customer model marked as a str, are always strings. But every Redis OM model is also a Pydantic model, so you can use Pydantic validators like EmailStr, Pattern, and many more for complex validations!

For example, because we used the EmailStr type for the email field, we'll get a validation error if we try to create a Customer with an invalid email address:

import datetime
from typing import Optional

from pydantic import EmailStr, ValidationError

from redis_om import HashModel


class Customer(HashModel):
    first_name: str
    last_name: str
    email: EmailStr
    join_date: datetime.date
    age: int
    bio: Optional[str]


try:
    Customer(
        first_name="Andrew",
        last_name="Brookins",
        email="Not an email address!",
        join_date=datetime.date.today(),
        age=38,
        bio="Python developer, works at Redis, Inc."
    )
except ValidationError as e:
    print(e)
    """
    pydantic.error_wrappers.ValidationError: 1 validation error for Customer
     email
       value is not a valid email address (type=value_error.email)
    """

Any existing Pydantic validator should work as a drop-in type annotation with a Redis OM model. You can also write arbitrarily complex custom validations!
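
For instance, you can attach a custom validator to enforce rules that a type annotation alone can't express. The sketch below uses a standard Pydantic validator on a Redis OM model; the model and its minimum-age rule are hypothetical and only here for illustration.

from pydantic import validator

from redis_om import HashModel


class Member(HashModel):
    first_name: str
    last_name: str
    age: int

    # Hypothetical business rule: members must be adults.
    @validator("age")
    def age_must_be_adult(cls, value):
        if value < 18:
            raise ValueError("age must be 18 or older")
        return value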

To learn more, see the documentation on data validation.

🔎 Rich Queries and Embedded Models

Data modeling, validation, and saving models to Redis all work regardless of how you run Redis.

Next, we'll show you the rich query expressions and embedded models Redis OM provides when the RediSearch and RedisJSON modules are installed in your Redis deployment, or you're using Redis Enterprise.

TIP: Wait, what's a Redis module? If you aren't familiar with Redis modules, review the So, How Do You Get RediSearch and RedisJSON? section of this README.

Querying

Redis OM comes with a rich query language that allows you to query Redis with Python expressions.

To show how this works, we'll make a small change to the Customer model we defined earlier. We'll add Field(index=True) to tell Redis OM that we want to index the last_name and age fields:

import datetime
from typing import Optional

from pydantic import EmailStr

from redis_om import (
    Field,
    HashModel,
    Migrator
)

                 
class Customer(HashModel):
    first_name: str
    last_name: str = Field(index=True)
    email: EmailStr
    join_date: datetime.date
    age: int = Field(index=True)
    bio: Optional[str]


# Now, if we use this model with a Redis deployment that has the
# RediSearch module installed, we can run queries like the following.

# Before running queries, we need to run migrations to set up the
# indexes that Redis OM will use. You can also use the `migrate`
# CLI tool for this!
Migrator().run()

# Find all customers with the last name "Brookins"
Customer.find(Customer.last_name == "Brookins").all()

# Find all customers that do NOT have the last name "Brookins"
Customer.find(Customer.last_name != "Brookins").all()

# Find all customers whose last name is "Brookins" OR whose age is 
# 100 AND whose last name is "Smith"
Customer.find((Customer.last_name == "Brookins") | (
        Customer.age == 100
) & (Customer.last_name == "Smith")).all()

These queries -- and more! -- are possible because Redis OM manages indexes for you automatically.

Querying with this index features a rich expression syntax inspired by the Django ORM, SQLAlchemy, and Peewee. We think you'll enjoy it!
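
Here are a few more query shapes, as a sketch against the same Customer model. Numeric comparisons work on the indexed age field; the sort_by() call assumes age was declared with Field(index=True, sortable=True), which differs from the definition above, so treat that part as an assumption.

# Numeric range query: customers older than 30
Customer.find(Customer.age > 30).all()

# Combine conditions: Brookins customers between 18 and 65
Customer.find(
    (Customer.last_name == "Brookins") &
    (Customer.age >= 18) &
    (Customer.age <= 65)
).all()

# Return just the first match instead of all of them
Customer.find(Customer.last_name == "Brookins").first()

# Sorting assumes: age: int = Field(index=True, sortable=True)
Customer.find(Customer.last_name == "Brookins").sort_by("age").all()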

Embedded Models

Redis OM can store and query nested models like any document database, with the speed and power you get from Redis. Let's see how this works.

In the next example, we'll define a new Address model and embed it within the Customer model.

import datetime
from typing import Optional

from redis_om import (
    EmbeddedJsonModel,
    JsonModel,
    Field,
    Migrator,
)


class Address(EmbeddedJsonModel):
    address_line_1: str
    address_line_2: Optional[str]
    city: str = Field(index=True)
    state: str = Field(index=True)
    country: str
    postal_code: str = Field(index=True)


class Customer(JsonModel):
    first_name: str = Field(index=True)
    last_name: str = Field(index=True)
    email: str = Field(index=True)
    join_date: datetime.date
    age: int = Field(index=True)
    bio: Optional[str] = Field(index=True, full_text_search=True,
                               default="")

    # Creates an embedded model.
    address: Address


# With these two models and a Redis deployment with the RedisJSON 
# module installed, we can run queries like the following.

# Before running queries, we need to run migrations to set up the
# indexes that Redis OM will use. You can also use the `migrate`
# CLI tool for this!
Migrator().run()

# Find all customers who live in San Antonio, TX
Customer.find(Customer.address.city == "San Antonio",
              Customer.address.state == "TX")
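
Because the bio field above is indexed with full_text_search=True, you can also run full-text queries against it. This is a sketch using the % operator, which Redis OM uses for full-text matching on fields indexed this way:

# Find all customers whose bio mentions "python" (full-text search)
Customer.find(Customer.bio % "python").all()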

💻 Installation

Installation is simple with pip, Poetry, or Pipenv.

# With pip
$ pip install redis-om

# Or, using Poetry
$ poetry add redis-om
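
# Or, with Pipenv (mentioned above; shown here for completeness)
$ pipenv install redis-om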

📚 Documentation

The Redis OM documentation is available here.

⛏️ Troubleshooting

If you run into trouble or have any questions, we're here to help!

Hit us up on the Redis Discord Server or open an issue on GitHub.

So How Do You Get RediSearch and RedisJSON?

Some advanced features of Redis OM rely on core features from two source-available Redis modules: RediSearch and RedisJSON.

You can run these modules in your self-hosted Redis deployment, or you can use Redis Enterprise, which includes both modules.
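
If you just want a local instance to experiment with, one option (an assumption on our part, not a recommendation made elsewhere in this README) is the Redis Stack Docker image, which bundles both modules:

# Run a local Redis with RediSearch and RedisJSON for development
$ docker run -d --name redis-stack -p 6379:6379 redis/redis-stack:latest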

To learn more, read our documentation.

❤️ Contributing

We'd love your contributions!

Bug reports are especially helpful at this stage of the project. You can open a bug report on GitHub.

You can also contribute documentation -- or just let us know if something needs more detail. Open an issue on GitHub to get started.

📝 License

Redis OM uses the BSD 3-Clause license.

Comments
  • redis.exceptions.ResponseError: unknown command 'module'

    Hi,

    So basically I'm trying to reproduce what @simonprickett did here, except that I'm doing it on Redis Cloud instead of locally, or in a venv or docker or whatever.

    So I've managed to successfully write to the Redis Cloud db after properly configuring REDIS_OM_URL to point at the proper endpoint and credentials. Now, when I perform a RediSearch query, e.g. Adoptable.find(Adoptable.name == "Poppy").all(), I get the error redis.exceptions.ResponseError: unknown command 'module'. Another way of reproducing the above is as follows:

    from redis import ResponseError
    try:
        Adoptable.find(Adoptable.name == "Poppy").all()
    except ResponseError as e:
        print(e)
    

    I've googled this type of error, but all I see are similar errors with the word 'module' swapped for different terms (e.g. JSON.GET, whose solution was to include RedisJSON). Any help is appreciated, thank you.

    PS: yes, I have created my database with the RediSearch and RedisJSON modules properly included.

    ahmad

    opened by therealbazzi 12
  • Query to fetch single record from 10K total keys is very slow (~12 seconds)

    import datetime
    from typing import Optional
    
    from redis_om import Field, HashModel, Migrator, get_redis_connection
    
    # This Redis instance is tuned for durability.
    REDIS_DATA_URL = "redis://localhost:6380"
    
    class Person(HashModel):
        first_name: str = Field(index=True)
        last_name: str = Field(index=True)
        emp_no: int =  Field(index=True)
    
    # set redis connection
    Person.Meta.database = get_redis_connection(url=REDIS_DATA_URL,
                                                      decode_responses=True)
    # apply migrations
    Migrator().run()
    
    for row_number in range(0,10000):
        person = Person(first_name="John" + str(row_number), last_name="Doe", emp_no=row_number)
        
        result = Person.find(Person.emp_no ==row_number).all()
        if (len(result) == 0):
            person.save()
    
     
        print(person.pk)
    
    # very slow to query a single record (~12 seconds)
    Person.find().sort_by('-emp_no').first()
    
    opened by msarm 9
  • no such index

    When I set a db number other than 0, I get an error.

    import datetime
    from typing import Optional
    
    from redis_om import Field, JsonModel, Migrator, get_redis_connection
    
    
    class Customer(JsonModel):
        first_name: str
        last_name: str = Field(index=True)
        email: str
        join_date: datetime.date
        age: int = Field(index=True)
        bio: Optional[str]
    
        class Meta:
            database = get_redis_connection(host="redis", port=6379, db=1)
    
    
    andrew = Customer(
        first_name="Andrew",
        last_name="Brookins",
        email="[email protected]",
        join_date=datetime.date.today(),
        age=38,
    )
    andrew.save()
    Migrator().run()
    
    # Find all customers with the last name "Brookins"
    result = Customer.find(Customer.last_name == "Brookins").all()
    

    error

    Traceback (most recent call last):
      File "/app/aaa.py", line 30, in <module>
        Customer.find(Customer.last_name == "Brookins").all()
      File "/opt/pysetup/.venv/lib/python3.10/site-packages/redis_om/model/model.py", line 761, in all
        return self.execute()
      File "/opt/pysetup/.venv/lib/python3.10/site-packages/redis_om/model/model.py", line 725, in execute
        raw_result = self.model.db().execute_command(*args)
      File "/opt/pysetup/.venv/lib/python3.10/site-packages/redis/client.py", line 1173, in execute_command
        return conn.retry.call_with_retry(
      File "/opt/pysetup/.venv/lib/python3.10/site-packages/redis/retry.py", line 41, in call_with_retry
        return do()
      File "/opt/pysetup/.venv/lib/python3.10/site-packages/redis/client.py", line 1174, in <lambda>
        lambda: self._send_command_parse_response(
      File "/opt/pysetup/.venv/lib/python3.10/site-packages/redis/client.py", line 1150, in _send_command_parse_response
        return self.parse_response(conn, command_name, **options)
      File "/opt/pysetup/.venv/lib/python3.10/site-packages/redis/client.py", line 1189, in parse_response
        response = connection.read_response()
      File "/opt/pysetup/.venv/lib/python3.10/site-packages/redis/connection.py", line 817, in read_response
        raise response
    redis.exceptions.ResponseError: :__main__.Customer:index: no such index
    

    With the db number set to 0, there is no error:

        class Meta:
    -        database = get_redis_connection(host="redis", port=6379, db=1)
    +        database = get_redis_connection(host="redis", port=6379, db=0)
    

    result

    [Customer(pk='01FTBDC0A8D47YQ516GQ84A8K8', first_name='Andrew', last_name='Brookins', email='[email protected]', join_date=datetime.date(2022, 1, 26), age=38, bio=None)]
    

    Python 3.10.2, docker-compose version 1.29.2, redis-om 0.0.17
    module: name=ReJSON, ver=999999, api=1, filters=0, usedby=[search], using=[], options=[handle-io-errors]
    module: name=search, ver=20206, api=1, filters=0, usedby=[], using=[ReJSON], options=[handle-io-errors]

    opened by Nishikoh 8
  • find method does not work

    What is the issue?

    I have created a FastAPI application, and I am trying to integrate aredis_om (but this fails with redis_om as well) with one of my rest endpoint modules.

    I created a model called Item:

    # app/models/redis/item.py
    from aredis_om import Field, HashModel
    
    from app.db.redis.session import redis_conn
    
    
    class Item(HashModel):
        id: int = Field(index=True)
        name: str = Field(index=True)
        timestamp: float = Field(index=True)
    
        class Meta:
            database = redis_conn
    
    # app/schemas/redis/item.py
    from pydantic import BaseModel
    
    
    class ItemCreate(BaseModel):
        id: int
        name: str
    
    
    # app/db/redis/session.py
    from aredis_om import get_redis_connection
    
    from app.core.config import settings
    
    
    redis_conn = get_redis_connection(
        url=f"redis://{settings.REDIS_HOST}:{settings.REDIS_PORT}",
        decode_responses=True
    )
    
    # app/api/api_v1/endpoints/redis_item.py
    import time
    from typing import Any, List, Optional
    
    from fastapi import APIRouter, HTTPException
    from aredis_om import NotFoundError
    
    from app.models.redis.item import Item
    from app.schemas.redis.item import ItemCreate
    
    
    router = APIRouter()
    
    
    @router.get("/", response_model=List[Item])
    async def list_redis_items(name: Optional[str] = None) -> Any:
        items = []
        pks = [pk async for pk in await Item.all_pks()]
        for pk in pks:
            item = await Item.get(pk)
            if name is None:
                items.append(item)
            else:
                if item.name == name:
                    items.append(item)
        return items
    
    
    @router.post("/", response_model=Item)
    async def post_redis_item(item: ItemCreate) -> Any:
        return await Item(id=item.id, name=item.name, timestamp=float(time.time())).save()
    
    
    @router.get("/{id}", response_model=Item)
    async def get_redis_item(id: int) -> Any:
        items = []
        pks = [pk async for pk in await Item.all_pks()]
        for pk in pks:
            item = await Item.get(pk)
            if item.id == id:
                return item
    
        raise HTTPException(status_code=404, detail=f"Item {id} not found")
    
    
    @router.put("/{id}", response_model=Item)
    async def update_redis_item(id: int, patch: Item) -> Any:
        try:
            item = await Item.get(id)
        except NotFoundError:
            raise HTTPException(status_code=404, detail=f"Item {id} not found")
        item.name = patch.name
        return await item.save()
    

    As you can see in my endpoints file, I had to make a workaround to be able to pull an individual item and to get a list of items from Redis. My last endpoint, I believe, is just wrong; I have to change id to a pk in order to get that item, so the last endpoint can be ignored.

    My attempt for the first endpoint was this:

    ...
    if name is None:
        items = await Item.find().all()
    else:
        items = await Item.find(Item.name == name).all()
    return items
    

    When I hit the endpoint with the .find() method I received a traceback of:

    INFO:     127.0.0.1:56128 - "GET /api/v1/redis_item/ HTTP/1.1" 500 Internal Server Error
    ERROR:    Exception in ASGI application
    Traceback (most recent call last):
      File "/home/user/.cache/pypoetry/virtualenvs/app-iUv8FE9o-py3.8/lib/python3.8/site-packages/uvicorn/protocols/http/h11_impl.py", line 403, in run_asgi
        result = await app(self.scope, self.receive, self.send)
      File "/home/user/.cache/pypoetry/virtualenvs/app-iUv8FE9o-py3.8/lib/python3.8/site-packages/uvicorn/middleware/proxy_headers.py", line 78, in __call__
        return await self.app(scope, receive, send)
      File "/home/user/.cache/pypoetry/virtualenvs/app-iUv8FE9o-py3.8/lib/python3.8/site-packages/fastapi/applications.py", line 269, in __call__
        await super().__call__(scope, receive, send)
      File "/home/user/.cache/pypoetry/virtualenvs/app-iUv8FE9o-py3.8/lib/python3.8/site-packages/starlette/applications.py", line 124, in __call__
        await self.middleware_stack(scope, receive, send)
      File "/home/user/.cache/pypoetry/virtualenvs/app-iUv8FE9o-py3.8/lib/python3.8/site-packages/starlette/middleware/errors.py", line 184, in __call__
        raise exc
      File "/home/user/.cache/pypoetry/virtualenvs/app-iUv8FE9o-py3.8/lib/python3.8/site-packages/starlette/middleware/errors.py", line 162, in __call__
        await self.app(scope, receive, _send)
      File "/home/user/.cache/pypoetry/virtualenvs/app-iUv8FE9o-py3.8/lib/python3.8/site-packages/starlette/middleware/cors.py", line 84, in __call__
        await self.app(scope, receive, send)
      File "/home/user/.cache/pypoetry/virtualenvs/app-iUv8FE9o-py3.8/lib/python3.8/site-packages/starlette/exceptions.py", line 93, in __call__
        raise exc
      File "/home/user/.cache/pypoetry/virtualenvs/app-iUv8FE9o-py3.8/lib/python3.8/site-packages/starlette/exceptions.py", line 82, in __call__
        await self.app(scope, receive, sender)
      File "/home/user/.cache/pypoetry/virtualenvs/app-iUv8FE9o-py3.8/lib/python3.8/site-packages/fastapi/middleware/asyncexitstack.py", line 21, in __call__
        raise e
      File "/home/user/.cache/pypoetry/virtualenvs/app-iUv8FE9o-py3.8/lib/python3.8/site-packages/fastapi/middleware/asyncexitstack.py", line 18, in __call__
        await self.app(scope, receive, send)
      File "/home/user/.cache/pypoetry/virtualenvs/app-iUv8FE9o-py3.8/lib/python3.8/site-packages/starlette/routing.py", line 670, in __call__
        await route.handle(scope, receive, send)
      File "/home/user/.cache/pypoetry/virtualenvs/app-iUv8FE9o-py3.8/lib/python3.8/site-packages/starlette/routing.py", line 266, in handle
        await self.app(scope, receive, send)
      File "/home/user/.cache/pypoetry/virtualenvs/app-iUv8FE9o-py3.8/lib/python3.8/site-packages/starlette/routing.py", line 65, in app
        response = await func(request)
      File "/home/user/.cache/pypoetry/virtualenvs/app-iUv8FE9o-py3.8/lib/python3.8/site-packages/fastapi/routing.py", line 227, in app
        raw_response = await run_endpoint_function(
      File "/home/user/.cache/pypoetry/virtualenvs/app-iUv8FE9o-py3.8/lib/python3.8/site-packages/fastapi/routing.py", line 160, in run_endpoint_function
        return await dependant.call(**values)
      File "/home/user/ApeWorx/Kerkopes/kerkopes/backend/app/./app/api/api_v1/endpoints/redis_item.py", line 17, in list_redis_items
        return await Item.find().all()
      File "/home/user/.cache/pypoetry/virtualenvs/app-iUv8FE9o-py3.8/lib/python3.8/site-packages/aredis_om/model/model.py", line 760, in all
        return await query.execute()
      File "/home/user/.cache/pypoetry/virtualenvs/app-iUv8FE9o-py3.8/lib/python3.8/site-packages/aredis_om/model/model.py", line 725, in execute
        raw_result = await self.model.db().execute_command(*args)
      File "/home/user/.cache/pypoetry/virtualenvs/app-iUv8FE9o-py3.8/lib/python3.8/site-packages/aioredis/client.py", line 1085, in execute_command
        return await self.parse_response(conn, command_name, **options)
      File "/home/user/.cache/pypoetry/virtualenvs/app-iUv8FE9o-py3.8/lib/python3.8/site-packages/aioredis/client.py", line 1101, in parse_response
        response = await connection.read_response()
      File "/home/user/.cache/pypoetry/virtualenvs/app-iUv8FE9o-py3.8/lib/python3.8/site-packages/aioredis/connection.py", line 919, in read_response
        raise response from None
    aioredis.exceptions.ResponseError: unknown command `ft.search`, with args beginning with: `:app.models.redis.item.Item:index`, `*`, `LIMIT`, `0`, `10`, 
    

    If you need more information from me, let me know! Thank you in advance.

    opened by johnson2427 7
  • Migrator using get_redis_connection

    Hello, I'm trying to use Migrator with get_redis_connection like in the docs: https://redis.com/blog/introducing-redis-om-for-python/ but I'm facing this error. Please help if you have an idea about it.

    In [11]: from redis_om import get_redis_connection
    
    In [12]: get_redis_connection()
    Out[12]: Redis<ConnectionPool<Connection<host=localhost,port=6379,db=0>>>
    
    In [13]: Migrator(get_redis_connection()).run()
    ---------------------------------------------------------------------------
    AttributeError                            Traceback (most recent call last)
    <ipython-input-13-00d45abb5072> in <module>
    ----> 1 Migrator(get_redis_connection()).run()
    
    ~/dev/redis-om-python-search-demo/venv/lib/python3.9/site-packages/redis_om/model/migrations/migrator.py in run(self)
        154         # TODO: Migration history
        155         # TODO: Dry run with output
    --> 156         self.detect_migrations()
        157         for migration in self.migrations:
        158             migration.run()
    
    ~/dev/redis-om-python-search-demo/venv/lib/python3.9/site-packages/redis_om/model/migrations/migrator.py in detect_migrations(self)
         92         # Try to load any modules found under the given path or module name.
         93         if self.module:
    ---> 94             import_submodules(self.module)
         95 
         96         # Import this at run-time to avoid triggering import-time side effects,
    
    ~/dev/redis-om-python-search-demo/venv/lib/python3.9/site-packages/redis_om/model/migrations/migrator.py in import_submodules(root_module_name)
         22     """Import all submodules of a module, recursively."""
         23     # TODO: Call this without specifying a module name, to import everything?
    ---> 24     root_module = importlib.import_module(root_module_name)
         25 
         26     if not hasattr(root_module, "__path__"):
    
    /usr/lib/python3.9/importlib/__init__.py in import_module(name, package)
        116     """
        117     level = 0
    --> 118     if name.startswith('.'):
        119         if not package:
        120             msg = ("the 'package' argument is required to perform a relative "
    
    AttributeError: 'Redis' object has no attribute 'startswith'
    

    Also, how can I set the host of Redis without using get_redis_connection, if that's possible?

    opened by th3happybit 7
  • Anyone interested in a hard fork?

    Hey everyone. Since this project is not actively maintained anymore (the last release was around 3 months ago), I am thinking of releasing a new package/version based on a fork I am currently working on that includes all the patches, additional features, docs... Is anyone interested? Is it OK, @simonprickett and everyone, to do so? How about licensing? Do I need to release it under the MIT license?

    opened by wiseaidev 6
  • Raises --> 'list index out of range' when keys loaded > ~605K

    This issue is very interesting and occurs consistently once we start loading more than ~605K keys.

    Run the code below; you may have to wait an hour or more, depending on your system configuration, to reproduce this issue.

    import datetime
    from typing import Optional
    
    from redis_om import Field, HashModel, Migrator, get_redis_connection
    
    # This Redis instance is tuned for durability.
    REDIS_DATA_URL = "redis://localhost:6379"
    
    class Person(HashModel):
        first_name: str = Field(index=True)
        last_name: str = Field(index=True)
        emp_no: int =  Field(index=True, sortable=True)
        
        class Meta:
            global_key_prefix = "person"
    
    # set redis connection
    Person.Meta.database = get_redis_connection(url=REDIS_DATA_URL,
                                                      decode_responses=True)
    # apply migrations
    Migrator().run()
    
    # initialize employee number
    emp_no = 0
    
    try:
        
        # find the last employee added
        result = Person.find().sort_by('-emp_no').first()
    
        # if we find a result, we get the last added employee number
        if (result is not None):
            emp_no = result.emp_no
        
    except Exception as ex:
        # when no record exists, pass
        pass
    
    
    # set a flag for the while loop
    able_to_find_record = True
    
    # add employees until redis fails to return the last added employee
    while (able_to_find_record):
        
        # add new employee
        emp_no += 1
        person = Person(first_name="John" + str(emp_no), last_name="Doe", emp_no=emp_no)
        person.save()
        
        # get the last added employee
        result = Person.find().sort_by('-emp_no').first()
        
        # when redis fails to find the last added employee, we exit the while loop
        if (result is None or result.emp_no!= emp_no):
            able_to_find_record = False
            print("Unable to find record" + str(emp_no))
    
    

    The error is raised from the method below: it expects a result here, but the Redis search library returns an empty array. I believe the issue is with the Redis library or Redis server when the keys number over ~605K.

        @classmethod
        def from_redis(cls, res: Any):
            # TODO: Parsing logic copied from redisearch-py. Evaluate.
            import six
            from six.moves import xrange
            from six.moves import zip as izip
    
            def to_string(s):
                if isinstance(s, six.string_types):
                    return s
                elif isinstance(s, six.binary_type):
                    return s.decode("utf-8", "ignore")
                else:
                    return s  # Not a string we care about
    
            docs = []
            step = 2  # Because the result has content
            offset = 1  # The first item is the count of total matches.
    
            for i in xrange(1, len(res), step):
                fields_offset = offset
    
                fields = dict(
                    dict(
                        izip(
                            map(to_string, res[i + fields_offset][::2]), <<-- Issue is raised here
    

    Here is the error raised

    list index out of range
      File "/Users/user/projects/dev/fxLib/.venv/lib/python3.9/site-packages/redis_om/model/model.py", line 1204, in from_redis
        map(to_string, res[i + fields_offset][::2]),
      File "/Users/user/projects/dev/fxLib/.venv/lib/python3.9/site-packages/redis_om/model/model.py", line 727, in execute
        results = self.model.from_redis(raw_result)
      File "/Users/user/projects/dev/fxLib/.venv/lib/python3.9/site-packages/redis_om/model/model.py", line 752, in first
        results = query.execute(exhaust_results=False)
      File "/Users/user/projects/dev/fxLib/tests/Test_redis2.py", line 34, in <module> (Current frame)
        result = Person.find().sort_by('-emp_no').first()
      File "/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/runpy.py", line 87, in _run_code
        exec(code, run_globals)
      File "/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/runpy.py", line 97, in _run_module_code
        _run_code(code, mod_globals, init_globals,
      File "/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/runpy.py", line 268, in run_path
        return _run_module_code(code, init_globals, run_name,
      File "/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/runpy.py", line 87, in _run_code
        exec(code, run_globals)
      File "/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/runpy.py", line 197, in _run_module_as_main
        return _run_code(code, main_globals, None,
    

    I'm using the stack below: docker image "redis/redis-stack:latest" (maybe 6.2.2-v1), DIGEST: sha256:27666e8e1b632cc02bfb926bf9cbbda650aed2b818444c58613379167e12369e

    redis --> 4.2.2, redis-om --> 0.0.26

    Let me know if you need more information.

    opened by msarm 6
  • Add a `page` function for `find`

    Allow the user to page through all results rather than returning them all in one go... .find().page(offset, limit). See Redis OM Node for a good example.

    enhancement 
    opened by simonprickett 5
  • Finding first match not working as expected?

    Reported by @msarm initially as part of #207 :

    I see the 'ft.search' round trips got reduced now after defaulting the page size to 1000, and this helps other queries run even faster, but my results are not coming back this time; it may be a different issue now.

    When I perform the Person query:

    Person.find().sort_by('-emp_no').first()
    

    Here is the ft.search performed.

    args: ['ft.search', ':__main__.Person:index', '*', 'LIMIT', 0, 1, 'SORTBY', 'emp_no', 'desc']
    args: ['ft.search', ':__main__.Person:index', '*', 'LIMIT', 1000, 1, 'SORTBY', 'emp_no', 'desc']
    args: ['ft.search', ':__main__.Person:index', '*', 'LIMIT', 2000, 1, 'SORTBY', 'emp_no', 'desc']
    args: ['ft.search', ':__main__.Person:index', '*', 'LIMIT', 3000, 1, 'SORTBY', 'emp_no', 'desc']
    args: ['ft.search', ':__main__.Person:index', '*', 'LIMIT', 4000, 1, 'SORTBY', 'emp_no', 'desc']
    args: ['ft.search', ':__main__.Person:index', '*', 'LIMIT', 5000, 1, 'SORTBY', 'emp_no', 'desc']
    args: ['ft.search', ':__main__.Person:index', '*', 'LIMIT', 6000, 1, 'SORTBY', 'emp_no', 'desc']
    args: ['ft.search', ':__main__.Person:index', '*', 'LIMIT', 7000, 1, 'SORTBY', 'emp_no', 'desc']
    args: ['ft.search', ':__main__.Person:index', '*', 'LIMIT', 8000, 1, 'SORTBY', 'emp_no', 'desc']
    args: ['ft.search', ':__main__.Person:index', '*', 'LIMIT', 9000, 1, 'SORTBY', 'emp_no', 'desc']
    args: ['ft.search', ':__main__.Person:index', '*', 'LIMIT', 10000, 1, 'SORTBY', 'emp_no', 'desc']  - Fired query hangs
    

    When the last ft.search is fired, it does not return any results; it retries forever.

    Since we are looking just for the first record, can we just fire the first ft.search and return the results?

    args: ['ft.search', ':__main__.Person:index', '*', 'LIMIT', 0, 1, 'SORTBY', 'emp_no', 'desc']
    

    When I explicitly set exhaust_results=False in the first() function, I see quick results with no extra queries. Is that the right fix?

        def first(self):
            query = self.copy(offset=0, limit=1, sort_fields=self.sort_fields)
            results = query.execute(exhaust_results=False)
            if not results:
                raise NotFoundError()
            return results[0]
    

    Original code:

    import datetime
    from typing import Optional
    
    from redis_om import Field, HashModel, Migrator, get_redis_connection
    
    # This Redis instance is tuned for durability.
    REDIS_DATA_URL = "redis://localhost:6380"
    
    class Person(HashModel):
        first_name: str = Field(index=True)
        last_name: str = Field(index=True)
        emp_no: int =  Field(index=True)
    
    # set redis connection
    Person.Meta.database = get_redis_connection(url=REDIS_DATA_URL,
                                                      decode_responses=True)
    # apply migrations
    Migrator().run()
    
    for row_number in range(0,10000):
        person = Person(first_name="John" + str(row_number), last_name="Doe", emp_no=row_number)
        
        result = Person.find(Person.emp_no ==row_number).all()
        if (len(result) == 0):
            person.save()
    
     
        print(person.pk)
    
    # very slow to query a single record (~12 seconds)
    Person.find().sort_by('-emp_no').first()
    
    opened by simonprickett 5
  • error on fetch or filtering

    Hello. I'm following the starter tutorial at https://www.youtube.com/watch?v=jJYHUp9ZeTY&t=56s.

    I'm getting an error when trying to fetch data from a HashModel. For example, result = Product.all_pks() gets: redis.exceptions.ResponseError: syntax error

    For creating: exists = Product.find(Product.name == product.name).first() gets: TypeError: 'NoneType' object is not subscriptable

    Creating is OK.

    Versions: Python 3.9.7, redis-om-python 0.0.20

    opened by alecvinent 5
  • Example in README is broken, raise exception wrong number of arguments for 'hset' command

    Below is the detail of the exception:

    File "/root/myproject/mystatement/src/deps/event_store.py", line 64, in main await andrew.save() File "/root/deps/miniconda3/envs/spps/lib/python3.8/site-packages/aredis_om/model/model.py", line 1303, in save await db.hset(self.key(), mapping=document) File "/root/deps/miniconda3/envs/spps/lib/python3.8/site-packages/aioredis/client.py", line 1064, in execute_command return await self.parse_response(conn, command_name, **options) File "/root/deps/miniconda3/envs/spps/lib/python3.8/site-packages/aioredis/client.py", line 1080, in parse_response response = await connection.read_response() File "/root/deps/miniconda3/envs/spps/lib/python3.8/site-packages/aioredis/connection.py", line 868, in read_response raise response from None aioredis.exceptions.ResponseError: wrong number of arguments for 'hset' command

    The above exception happened after running:

    class Customer(HashModel):
        first_name: str
        last_name: str
        email: str
        join_date: datetime.date
        age: int
        bio: Optional[str]
    
    # First, we create a new `Customer` object:
    andrew = Customer(
        first_name="Andrew",
        last_name="Brookins",
        email="[email protected]",
        join_date=datetime.date.today(),
        age=38,
        bio="Python developer, works at Redis, Inc.",
    )
    
    print(andrew.pk)
    await andrew.save()
    a = await Customer.get(andrew.pk)
    print(a)
    

    pip show aioredis: Name: aioredis, Version: 2.0.0

    pip show redis-om: Name: redis-om, Version: 0.0.20

    documentation 
    opened by raceychan 5
  • Can't install using Python 3.11

    pip basically does not find the latest versions when using Python 3.11... (it only goes up to 0.0.27)

    python3.10 -m pip index versions redis-om
    WARNING: pip index is currently an experimental command. It may be removed/changed in a future release without prior warning.
    redis-om (0.1.1)
    Available versions: 0.1.1, 0.1.0, 0.0.27, 0.0.26, 0.0.25, 0.0.24, 0.0.23, 0.0.22, 0.0.21, 0.0.20, 0.0.19, 0.0.18, 0.0.17, 0.0.16, 0.0.15, 0.0.14, 0.0.13, 0.0.12, 0.0.11, 0.0.10, 0.0.9, 0.0.8, 0.0.7, 0.0.6, 0.0.5, 0.0.4, 0.0.3
    
    python3.11 -m pip index versions redis-om
    WARNING: pip index is currently an experimental command. It may be removed/changed in a future release without prior warning.
    redis-om (0.0.27)
    Available versions: 0.0.27, 0.0.26, 0.0.25, 0.0.24, 0.0.23, 0.0.22, 0.0.21, 0.0.20, 0.0.19, 0.0.18, 0.0.17, 0.0.16, 0.0.15, 0.0.14, 0.0.13, 0.0.12, 0.0.11, 0.0.10, 0.0.9, 0.0.8, 0.0.7, 0.0.6, 0.0.5, 0.0.4, 0.0.3
    

    Note: I found that all unit tests are running under python3.9.16

    opened by bonastreyair 0
  • Bump coverage from 6.5.0 to 7.0.3

    Bumps coverage from 6.5.0 to 7.0.3.

    Changelog

    Sourced from coverage's changelog.

    Version 7.0.3 — 2023-01-03

    • Fix: when using pytest-cov or pytest-xdist, or perhaps both, the combining step could fail with assert row is not None using 7.0.2. This was due to a race condition that has always been possible and is still possible. In 7.0.1 and before, the error was silently swallowed by the combining code. Now it will produce a message "Couldn't combine data file" and ignore the data file as it used to do before 7.0.2. Closes issue 1522_.

    .. _issue 1522: nedbat/coveragepy#1522

    .. _changes_7-0-2:

    Version 7.0.2 — 2023-01-02

    • Fix: when using the [run] relative_files = True setting, a relative [paths] pattern was still being made absolute. This is now fixed, closing issue 1519_.

    • Fix: if Python doesn't provide tomllib, then TOML configuration files can only be read if coverage.py is installed with the [toml] extra. Coverage.py will raise an error if TOML support is not installed when it sees your settings are in a .toml file. But it didn't understand that [tools.coverage] was a valid section header, so the error wasn't reported if you used that header, and settings were silently ignored. This is now fixed, closing issue 1516_.

    • Fix: adjusted how decorators are traced on PyPy 7.3.10, fixing issue 1515_.

    • Fix: the coverage lcov report did not properly implement the --fail-under=MIN option. This has been fixed.

    • Refactor: added many type annotations, including a number of refactorings. This should not affect outward behavior, but they were a bit invasive in some places, so keep your eyes peeled for oddities.

    • Refactor: removed the vestigial and long untested support for Jython and IronPython.

    .. _issue 1515: nedbat/coveragepy#1515 .. _issue 1516: nedbat/coveragepy#1516 .. _issue 1519: nedbat/coveragepy#1519

    .. _changes_7-0-1:

    Version 7.0.1 — 2022-12-23

    ... (truncated)

    Commits
    • 2ff9098 docs: prep for 7.0.3
    • 1f34d8b fix: race condition on data file shouldn't break combining. #1522
    • 85170bf build: two-step combines for speed
    • 1605f07 mypy: misc.py, test_misc.py
    • 4f3ccf2 refactor: a better way to have maybe-importable third-party modules
    • 98301ed mypy: test_config.py, test_context.py
    • 9d2e1b0 mypy: test_concurrency.py, test_python.py
    • c3ee30c refactor(test): use tmp_path instead of tmpdir
    • 0b05b45 mypy: test_annotate.py test_arcs.py test_collector.py
    • 2090f79 style: better
    • Additional commits viewable in compare view

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


    Dependabot commands and options

    You can trigger Dependabot actions by commenting on this PR:

    • @dependabot rebase will rebase this PR
    • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
    • @dependabot merge will merge this PR after your CI passes on it
    • @dependabot squash and merge will squash and merge this PR after your CI passes on it
    • @dependabot cancel merge will cancel a previously requested merge and block automerging
    • @dependabot reopen will reopen this PR if it is closed
    • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
    • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
    opened by dependabot[bot] 1
  • Bump trailofbits/gh-action-pip-audit from 1.0.0 to 1.0.4

    Bumps trailofbits/gh-action-pip-audit from 1.0.0 to 1.0.4.

    Release notes

    Sourced from trailofbits/gh-action-pip-audit's releases.

    Release 1.0.4

    Full Changelog: https://github.com/pypa/gh-action-pip-audit/compare/v1.0.3...v1.0.4

    Release 1.0.3

    Full Changelog: https://github.com/pypa/gh-action-pip-audit/compare/v1.0.2...v1.0.3

    Release 1.0.2

    Full Changelog: https://github.com/pypa/gh-action-pip-audit/compare/v1.0.1...v1.0.2

    Release 1.0.1

    What's Changed

    New Contributors

    Full Changelog: https://github.com/pypa/gh-action-pip-audit/compare/v1.0.0...v1.0.1

    Commits

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


    Dependabot commands and options

    You can trigger Dependabot actions by commenting on this PR:

    • @dependabot rebase will rebase this PR
    • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
    • @dependabot merge will merge this PR after your CI passes on it
    • @dependabot squash and merge will squash and merge this PR after your CI passes on it
    • @dependabot cancel merge will cancel a previously requested merge and block automerging
    • @dependabot reopen will reopen this PR if it is closed
    • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
    • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
    opened by dependabot[bot] 2
  • Bump rojopolis/spellcheck-github-actions from 0.27.0 to 0.29.0

    Bumps rojopolis/spellcheck-github-actions from 0.27.0 to 0.29.0.

    Release notes

    Sourced from rojopolis/spellcheck-github-actions's releases.

    0.29.0, 2022-12-29, maintenance release, update not required

    • Docker image updated to Python 3.11.1 slim via PR #139 from @​dependabot. Release notes for Python 3.11.1

    • lxml bumped to version 4.9.1 from 4.9.1 to get the build to work, without jumping through too many hoops. We prefer relying on wheel instead of building from source, since lxml can become quite a time sink

    0.28.0, 2022-11-16, maintenance release, update not required

    Changelog

    Sourced from rojopolis/spellcheck-github-actions's changelog.

    0.29.0, 2022-12-29, maintenance release, update not required

    • Docker image updated to Python 3.11.1 slim via PR #139 from @​dependabot. Release notes for Python 3.11.1

    • lxml bumped to version 4.9.1 from 4.9.1 to get the build to work, without jumping through too many hoops. We prefer relying on wheel instead of building from source, since lxml can become quite a time sink

    0.28.0, 2022-11-16, maintenance release, update not required

    Commits
    • 58bd666 Merge pull request #140 from rojopolis/0.29.0-release-candidate
    • 51817b6 Release notes preparred and documentation updated to reflect upcoming release
    • 7a37f75 Bump python from 3.10.8-slim-bullseye to 3.11.1-slim-bullseye (#139)
    • c31ba2d Updated Markdown configuration example, so this is more inline with what is u...
    • 63ac3d7 Updated pre-commit hook configuration and ran it towards the files
    • 4f880ce Updated pre-commit hooks
    • 8e8d087 Minor clean up
    • 9615373 Updated configuration and local word list
    • a743260 Updated documentation with information on codefences (#134)
    • 02a7c59 Minor corrections to Markdown formatting
    • Additional commits viewable in compare view

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


    Dependabot commands and options

    You can trigger Dependabot actions by commenting on this PR:

    • @dependabot rebase will rebase this PR
    • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
    • @dependabot merge will merge this PR after your CI passes on it
    • @dependabot squash and merge will squash and merge this PR after your CI passes on it
    • @dependabot cancel merge will cancel a previously requested merge and block automerging
    • @dependabot reopen will reopen this PR if it is closed
    • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
    • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
    opened by dependabot[bot] 2
  • Cursors for search results

    Non-trivial searches should provide cursor support

    • do a ft.aggregate and then return an object to access the resulting cursor
    • ideally, that object would provide a URL-safe, and protected, means to serialize the cursor object for web apps
    opened by doc-hex 0
Releases(v0.1.1)
  • v0.1.1(Dec 6, 2022)

    Changes

    🧰 Maintenance

    • Updating poetry lock dependencies, and removed potential cleo vulnerability (#427)
    • Fixing invalid vulnerability report (#402)
    • Removing extra dependency used in release process (#399)
    • Bump redis from 4.3.5 to 4.4.0 (#424)

    Contributors

    We'd like to thank all the contributors who worked on this release!

  • v0.1.0(Oct 18, 2022)

    Changes

    🔥 Breaking Changes

    • Drop python 3.6 support (#374)
    • Updating default page size to 1000 from 10 (#282)

    🚀 New Features

    • Added count aggregation (#397)
    • Allow users to define a new primary key. (#347)
    • Add delete_many to support for bulk deletes (#305)
    • Implement page function (#339)

    🐛 Bug Fixes

    • Fix crash when trying to delete non-existent record (#372)
    • Fix add slash in DEFAULT_ESCAPED_CHARS (#312) (#376)
    • Fix None instance when querying (#368)
    • Fix a potential bug when deleting an object (#337)
    • Updating default page size to 1000 from 10 (#282)

    🧰 Maintenance

    • Adding dependency vulnerability scanning to the CI process (#345)
    • Updating the release drafter, tying to the main branch (#385)
    • Fix broken links (#348)
    • Fixed issue decode_responses is set to True regardless if REDIS_OM_UR… (#373)
    • Drop python 3.6 support (#374)
    • Replace execute_command with specific redis functions when possible (#346)
    • Updating troves for pypi (#341)
    • Support for Pypy 3.7 in CI (#300)
    • Migrate from aioredis to redis-py with asyncio support (#233)
    • Docs(Getting Started): Fix get_redis_connection() example (#262)
    • Various dependencies updates

    Contributors

    We'd like to thank all the contributors who worked on this release!

    @bonastreyair, @chayim, @dependabot, @dependabot[bot], @dvora-h, @melder, @moznuy, @ninoseki, @ryanrussell, @simonprickett, @tonibofarull and @wiseaidev

  • v0.0.27(May 13, 2022)

  • v0.0.26(Apr 25, 2022)

  • v0.0.25(Apr 22, 2022)

    Significantly improved find behaviour when returning lots of records, as the page size for RediSearch is now 1000, not 10. Saves a lot of round trips to Redis - see #207

  • v0.0.24(Apr 22, 2022)

  • v0.0.23(Apr 13, 2022)

  • v0.0.22(Apr 7, 2022)

    • The migrator will now error if you attempt to create a search index on database > 0 as this is not supported by RediSearch.
    • Some documentation improvements.
    • Various dependency updates.
  • v0.0.21(Mar 29, 2022)

    Adds ability to expire a model instance, updates the docs to show how to access the underlying Redis connection, and various dependency updates.

  • v0.0.20(Mar 7, 2022)

  • v0.0.19(Feb 15, 2022)

  • v0.0.18(Feb 11, 2022)

  • v0.0.17(Jan 7, 2022)

  • v0.0.16(Jan 7, 2022)

    This release addresses the following issues, and generally updates dependencies:

    • Connectivity issues with Redis Enterprise / Redis Cloud: https://github.com/redis/redis-om-python/issues/73
    • Full text indexing in Hash fields: https://github.com/redis/redis-om-python/issues/42