Beatserver, a periodic task scheduler for Django 🎵

Overview

Beat Server

Beatserver, a periodic task scheduler for Django Channels | beta software

How to install

Prerequisites:

Follow the Django Channels documentation on how to install Channels.

Install beatserver:

pip install -U beatserver

Configuration:

Add beatserver to INSTALLED_APPS in settings.py:

INSTALLED_APPS = [
    'beatserver',
    'channels',
    '...'
]

beatconfig.py

from datetime import timedelta

BEAT_SCHEDULE = {
    'testing-print': [
        {
            # will call test_print method of PrintConsumer
            'type': 'test.print',
            # message to pass to the consumer
            'message': {'testing': 'one'},
            # Every 5 seconds
            'schedule': timedelta(seconds=5)
        },
        {
            'type': 'test.print',
            'message': {'testing': 'two'},
            # Precisely at 3AM on Monday
            'schedule': '0 3 * * 1' 
        },
    ]
}

Schedules can be given either as timedelta objects, to run a task at a fixed interval, or as cron-syntax strings, to run it at exact times.
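
For illustration only, here is a minimal sketch of how these two schedule forms might be turned into a "next run" time. It assumes the third-party croniter package for parsing cron strings; beatserver's own internals may handle this differently.

from datetime import datetime, timedelta

from croniter import croniter  # assumption: croniter (or any cron parser) is installed

def next_run(schedule, now=None):
    """Return the next time a task with the given schedule should fire."""
    now = now or datetime.now()
    if isinstance(schedule, timedelta):
        # Interval schedule: fire again after the given delta.
        return now + schedule
    # Cron-style schedule: compute the next matching wall-clock time.
    return croniter(schedule, now).get_next(datetime)

# next_run(timedelta(seconds=5))  -> five seconds from now
# next_run('0 3 * * 1')           -> the following Monday at 03:00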

routing.py

from channels.routing import ProtocolTypeRouter, ChannelNameRouter
from yourapp.consumers import PrintConsumer  # replace yourapp with the app that defines PrintConsumer

application = ProtocolTypeRouter({
    "channel": ChannelNameRouter({
        # The channel name must match the key used in BEAT_SCHEDULE.
        "testing-print": PrintConsumer,
    }),
})

consumers.py

from channels.consumer import SyncConsumer

class PrintConsumer(SyncConsumer):
    def test_print(self, message):
        print(message)

How to run:

python manage.py beatserver
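
Note that beatserver only puts messages onto the channel layer; the consumer that handles them must also be running. With Channels 2 this is typically a separate worker process (standard Channels usage, not anything beatserver-specific), for example:

python manage.py runworker testing-print
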
Comments
  • Added cron syntax (absolute scheduling) and multi-beat per channel support

    I'm still testing this out locally but I figured I'd go ahead and throw up a PR.

    This would add support for cron syntax for scheduling tasks at certain times versus just on certain intervals.

    For example, I might want something to run specifically "at 3AM" versus "every 24 hours". This preserves the existing interval behaviour for timedelta objects in the config.

    enhancement 
    opened by kylebamos 7
  • Allow for several types/messages per channel

    Hello, first I'd like to thank you for this module. I would like to have several periodic tasks, and I've thought about grouping them in a specific channel so they would all be together. But as I see in the code, it's a single task per channel name. In order to support different message types in one channel, there should be two loops here: https://github.com/rajasimon/beatserver/blob/58feddc3dd2b20ba12a99f51c9a70d6cc7c49b5b/beatserver/server.py#L30 The outer one should cycle through channel name/config (key-value), and the inner one should cycle through message types. Alternatively there could be one loop, but the config should be a list instead of a dictionary (maybe a list of tuples?), so that there could be several items with the same channel name and maybe the same message type. (A sketch of this double loop appears after this comment list.)

    enhancement 
    opened by mikevlz 6
  • how can I run beatserver

    After installing with pip3 install -U beatserver, the OS cannot recognize the 'beatserver' command on OS X. By the way, am I right that beatserver is a WebSocket server that can continually send messages to the clients' browser front end?

    help wanted 
    opened by RealLau 6
  •  Channels 2 support

    These changes give beatserver Channels 2 support without dropping Channels 1 support. Admittedly the modifications are a bit hacky due to my inexperience and the difference between Channels 1 and Channels 2 APIs, so maybe it would make more sense to fork the project into a Channels-1 version and a Channels-2 version to keep the code clean...

    In addition to the changes in these files, the beatserver.py file requires additional lines to work correctly in Channels 2 (no changes needed if the user wants to keep working with Channels 1) which should be documented in the README file. This is an example beatserver.py file that works with Channels 2:

    import os
    from datetime import timedelta
    from channels.layers import get_channel_layer

    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "PROJECT_NAME.settings")

    BEAT_SCHEDULE = {
        'send-every-5-seconds': {
            'channel_name': 'CHANNEL_NAME',
            'schedule': timedelta(seconds=5),
            'message': {'type': 'type_of_message', 'message': 'message_text'}
        },
    }

    channel_layers = get_channel_layer()

    and the server needs to be initialized with the following command:

    beatserver PROJECT_NAME.beatconfig:application

    Putting all this information in the beatconfig.py file is necessary to get the channel_layer. In Channels 2, if you try to import the PROJECT_NAME.asgi file as was done for Channels 1, you get a conflict between the reactor imported in server.py and the asyncioreactor used by Daphne. Daphne notices the conflict and automatically uninstalls the synchronous beatserver reactor, which breaks beatserver, so I had to move the information needed to get the channel_layers into the beatconfig.py file... I know this is quite inelegant and you'll probably want to find a better solution before merging my changes, but I thought you'd prefer to at least have some "first working solution" you can improve upon :-).

    enhancement 
    opened by facucosta 5
  • won't work with more than 1 beat tasks

    Hey, I need to run 2 different tasks with beatserver. I use Django 2.x and Channels 2.x, and in consumers.py I write a BeatServer class as:

    from asgiref.sync import async_to_sync
    from channels.consumer import SyncConsumer

    class BeatServer(SyncConsumer):
        def task_one(self, message):
            print("task 1 #############")
            data = {"task_one": "ok"}
            room_group_name_one = "room_one"
            async_to_sync(self.channel_layer.group_send)(room_group_name_one, {
                "type": "chat.message",
                "message": data
            })
    
        def task_two(self, message):
            print("task 2 #############")
            data = {"task_two": "ok"}
            room_group_name_two = "room_two"
            async_to_sync(self.channel_layer.group_send)(room_group_name_two, {
                "type": "chat.message",
                "message": data
            })
    

    and beatconfig.py as below

    from datetime import timedelta
    
    BEAT_SCHEDULE = {
        'task-one': {
            'type': 'task_one',
            'message': {'message': 'send successed'},
            'schedule': timedelta(seconds=4)
        },
        'task-two': {
            'type': 'task_two',
            'message': {'message': 'send successed'},
            'schedule': timedelta(seconds=4)
        },
    }
    

    Then I run beatserver with python3 manage.py beatserver. I can see task 1 ############# and things go well on the client, but task two seems not to work: I can't see its tag, the clients don't receive any messages, and no errors are raised.

    Can more than one task be run with beatserver?

    opened by dazhi509 2
  • Static files are not provided

    Hey, thanks for the project! I cloned the repo and tried to run the example, but there are no static files; specifically, the websocketbridge.js file is missing.

    help wanted 
    opened by kyrgyz-nlp 2
  • Make the repeated task execute immediate after the server started

    Currently a task listed in the beatconfig first executes after its scheduled interval and then repeats. That's okay, but there should be an option to execute the task immediately. I think immediate execution should be the default, with the current behaviour available as an option.

    enhancement 
    opened by rajasimon 1
  • Instructions for using in production

    Hello, as I understand it, spinning up beatserver is done separately from spinning up my Django server. So how would I go about using this in production if my stack consists of Gunicorn + Daphne (Channels and the DB are already configured with a Redis channel layer)?

    For simplicity, let's say that my current deployment script runs this command: sudo systemctl restart gunicorn.service && sudo systemctl restart nginx && sudo systemctl restart daphne.service

    In my Gunicorn config the ExecStart is: gunicorn my_app.wsgi --bind 0.0.0.0:8000 --log-level error --log-file=- --workers 5 --preload

    The Daphne ExecStart is: ExecStart=/home/ubuntu/env/bin/daphne --bind 0.0.0.0 --port <my_port> --verbosity 2 my_app.asgi:application

    How can I also spin up beatserver so that it plays nicely with the others?

    question 
    opened by eladyaniv01 1
  • Improve formatting of README

    This converts the existing code snippets into fenced code blocks and adds language identifiers in order to render them with syntax highlighting.

    I've also used bold formatting on the file names above each code block to make them stand out more.

    opened by blackrobot 1
  • Support for all ASGI frameworks

    One of the cool things about ASGI is that we can hook into the asyncio loop easily, which allows us to implement background tasks not just for Channels but for any ASGI framework. I'm going to invest some time this year to change the code to support all ASGI applications.

    enhancement help wanted 
    opened by rajasimon 1
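
Relating to the "Allow for several types/messages per channel" and "won't work with more than 1 beat tasks" threads above, the sketch below is purely illustrative (not beatserver's actual code). It shows the double loop described there, assuming a schedule where each channel name maps to a list of task configs and a channel layer obtained from channels.layers.get_channel_layer(); the timing/waiting logic is omitted.

from channels.layers import get_channel_layer

async def send_all_beats(beat_schedule):
    channel_layer = get_channel_layer()
    # Outer loop: one entry per channel name in BEAT_SCHEDULE.
    for channel_name, tasks in beat_schedule.items():
        # Inner loop: every task config listed for that channel.
        for task in tasks:
            await channel_layer.send(channel_name, {
                'type': task['type'],
                **task['message'],
            })
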
Releases: v0.0.7